Category: Industry Insights

  • MySpace’s Tom Anderson gives Google+ a stamp of approval

    Okay, so MySpace may be “dead” now that it’s been sold to Specific Media for a paltry $35 million, but much of its years of success surely had something to do with its founder, Tom Anderson. Your first friend on MySpace now has a shiny Google+ account of his own, and he had plenty to share about the new social network in a guest post on TechCrunch, most of it positive.

    The main theme of Tom’s post is Facebook and how it is reacting to the debut of Google+. According to Tom, Google has the advantage. “When it comes to ‘monetization’ on the G+ ‘website,’ Google’s trump card against Facebook is that we may never even see an ad on G+,” Tom says. “Google has plenty to gain without ever showing an ad and, put simply, Google doesn’t need the money. Facebook’s got to know this, and it’s got to have them just a little bit concerned.”

    Tom also alludes to Facebook considering making its feed real-time rather than the current “Top News” setup because of complaints from advertisers and app developers. In fact, Facebook has taken some defensive actions since Google+ launched, which suggests it considers the new social network a threat.

    Tom also mentions that Facebook stumbled in the way it handled its relationships with developers, and that how Google handles this delicate issue will be crucial to the next step in Google+’s development. In fact, he stresses the balance between advertisers, users, and developers as crucial, and I absolutely agree — too much in one direction and social networks seem to suffer. We’ve seen plenty of that with past sites such as Friendster, Tribe, and even MySpace.

    In summary, the social network field is in an interesting place at the moment, with Facebook and all its massive popularity up against Google and its bright new ideas and excellent resources. Google has something of a clean slate on this playing field, while Facebook has made its mistakes. But can Google capture the user base that Facebook has? Without those numbers, the key to success may still be out of reach.

    What do you feel is key to the success of a social network?

  • PittPatt Puts Google’s Acquisitions At 100

    On Thursday, Google acquired Carnegie Mellon University spinoff PittPatt, which specialized in facial recognition software for businesses like GE. We’ll be a bit naïve here and take Google at their word when they claim they won’t implement full facial recognition without “strong privacy protections,” having intentionally excluded the feature from Google Goggles. It seems likely that Google will use the technology to enhance video quality; anything else would be crass doublespeak at a time when the company needs to play it straight.

    It is striking that Google seems utterly unfazed by two government commissions investigating it for its aggressive vertical integration into niche search and social media.

    PittPatt marks the hundredth acquisition by Google, the sixteenth this year. For comparison, the slightly older and smaller Yahoo has acquired 64 companies, and the more traditional technology company Microsoft has purchased 144; both have made only two acquisitions over the past seven months.

    Google seems to be almost thumbing its nose at both European and American investigators. Not only is it explosively expanding and refining its social tools, it is reinforcing the use of Google properties on Maps and de-emphasizing other sites’ role in local search. Google is even showing off its wide variety of technologies, giving users everything they could love (with a layout ripped from a Mahalo demo video) even as it takes away a line of customer interaction with the company.

    Of course companies change, and becoming mired in chasing the FTC’s algorithm is both more likely than, and just as damaging as, the inquiry actually finding wrongdoing. More important, these acquisitions have allowed Google to offer a wide range of products, most of them cobbled together from multiple targets’ technology. But will Larry Page’s aggressive business strategy endear the company to the FTC during the Google antitrust case?

    Check back tomorrow for an informative infographic from SI’s own Joe Luft on the situation!

  • Google Isn’t Bowing Down

    So, in case you haven’t heard, there was a major update to Google Places late last week. My Friday was filled with confusion, frustration and a feeling of something worse to come. It was sort of like a David Lynch movie with a Local twist. Now that the smoke has cleared, one question remains. What is Google up to?

    Google Places Update

    Before I get into the possible reasons for this update, here is a rundown of the most prominent changes that have been made to business listings on Google Places.

    • Google reviews are the only ones to include snippets now.
    • Third-party reviews have been relegated to a “Reviews from around the web” section at the bottom of the page.
    • Third-party citations have been removed completely.
    • A big red “Write a review” button has been added in two prominent positions.
    • The “More about this place” section is gone.
    • The “What people are saying” section has been replaced by “descriptive terms”.

    Now, back to the question at hand. What is Google up to? The Wall Street Journal’s Amir Efrati suggests that Google is bowing down to its competitors under growing pressure from the FTC’s antitrust investigation. He argues that, by removing third-party reviews from Places pages, Google is distancing itself from claims that it “steals” content from the likes of Yelp and Citysearch, posts it on its own Places pages, and gives those pages preferential rankings in search results. While I see where Amir is coming from, I don’t think that Google is bowing down at all. Quite the contrary, actually. I believe that Google is putting its attack plans into motion.

    Google was initially designed to index third-party content in an easily searchable, user-friendly format. By removing third-party citations and review snippets and promoting its own reviews in what is usually the #1 ranked result within its own #1 ranked search engine, Google has formatted its local search results in a way that obviously favors its own content over that of its rivals. This is the exact reason the FTC is investigating Google in the first place.

    In other words, Google isn’t interested in displaying reviews from Yelp, Citysearch, and other prominent sites; it wants to make those reviews obsolete. If you look at the changes to Google Places from this perspective, it doesn’t seem like Google is that worried about the FTC’s investigation. With all the money it is spending on federal lobbying, I guess Big G thinks it’ll come out on top like Microsoft did in the ’90s.

    Ultimately, not much has changed when it comes to what really matters… getting results. Google Places is still the holy grail of Local SEO. There haven’t been any reported drops in rankings due to this update, so no major algorithm changes are believed to be involved. This means that while Google may not be displaying third-party reviews and citations, they still matter when it comes to determining the rankings of local search results, for now.

  • Donde Esta Mexico, Google?

    The realm of Local SEO can be a complex, frustrating land. A fruitful, popular land, but a land where seemingly easy obstacles can quickly turn into uphill battles. And if your client happens to be in a country other than the U.S., that hilly jog transforms into a trek up Everest… without a Sherpa.

    For example, we are currently working with a Playa del Carmen hotel client that is interested in ramping up their local presence. This isn’t the easiest task, but it certainly isn’t impossible. A few Mexico-focused business directories will need to be sourced, Google Translate will be used like it’s going out of style, and a Mexican-formatted Google Places listing will need to be created. Again, not simple, but all in a day’s work around here.

    The most important thing to keep in mind when it comes to local SEO is that it is a newer industry and is constantly changing. This means that there can be quite a few kinks in the technology. Google is no exception to this.

    After creating the client’s Google Places listing, we had the misfortune of discovering that the wrong country code had been added to the listing. Typically, this would be a setback requiring another round of verification for the listing, but not a deal breaker. Unfortunately, this was not a typical case. The country code had been added incorrectly not by us but by Google, and our client wasn’t the only one affected.

    After some research, we discovered that listings in Playa del Carmen, Mexico are displayed with the USA country code (+1) rather than Mexico’s country code (+52) on the main search engine results page. However, the correct country code is used on the actual Place page. This is a pretty significant bug in Google Places, but it’s par for the course in local SEO. Surely the issue will eventually be resolved, but for the time being it seems there are going to be quite a few Mexican businesses with silent phone lines.
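    If you want to audit your own listings for this kind of mismatch, the check is mechanical: compare the calling-code prefix of the displayed number against the listing’s country. Here’s a minimal Python sketch of that idea — the helper function, the prefix table, and the sample numbers are all hypothetical, purely for illustration:

    ```python
    # Hypothetical audit helper: flag listings whose displayed phone number
    # carries the wrong E.164 country calling code for their country.
    EXPECTED_PREFIX = {"US": "+1", "MX": "+52"}  # illustrative subset

    def has_correct_prefix(display_number: str, country: str) -> bool:
        """True if the displayed number starts with the country's calling code."""
        digits = display_number.replace(" ", "").replace("-", "")
        return digits.startswith(EXPECTED_PREFIX[country])

    # A Playa del Carmen listing rendered with the US prefix would be flagged:
    has_correct_prefix("+1 984 873 1234", "MX")   # wrong prefix for Mexico
    has_correct_prefix("+52 984 873 1234", "MX")  # correct
    ```

    Running something like this over an export of your clients’ listings is a quick way to catch a silent-phone-line bug before your customers do.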

  • Lunch & Learn – Anthony Coleman on Facebook Tabs and Iframes

    Around here at the SI office we take our lunches seriously, and at the intersection of tasty food and a bunch of fun, hardworking SEO professionals comes our new “Lunch and Learn” series. This recently-debuted program is a biweekly tradition here at the office, where an Influencer will take the time to put together a humorous and educational presentation to embiggen the knowledge of even the most seasoned SEO veteran. Today, our very own Anthony Coleman kicked off the series with a superhero-tastic exploration of Facebook reveal tabs and newsletter sign-ups, as well as how iframes can be a boon to development, tracking and user interaction.

    Since Facebook’s native markup, FBML (a combination of HTML and XML), can be clunky and very limited (especially when it comes to more sophisticated elements like JavaScript), iframes offer a sleeker approach to Facebook brand display. They pull content directly from a business’s site, so traditional HTML and the aforementioned advanced elements can be used to create a much more multi-faceted approach to drawing in eyeballs both before and after the “like.” Additionally, multiple tabs can be created within the frame, allowing for a customized complexity not normally seen in the basic FB layout. These tabs can offer different aspects of the business, such as a “thank you for liking” display, locations, specials, and forms.
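    To make the before-and-after-the-“like” behavior concrete: at the time of writing, Facebook POSTs a signed_request parameter to iframe tabs — a base64url-encoded, HMAC-SHA256-signed JSON payload whose page.liked flag lets a tab serve different content to fans and non-fans. The following is a rough Python sketch of the decode step; the function names are ours, and it simplifies Facebook’s actual scheme:

    ```python
    import base64
    import hashlib
    import hmac
    import json

    def _b64url_decode(s: str) -> bytes:
        # signed_request strips base64 padding; restore it before decoding
        return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

    def parse_signed_request(signed_request: str, app_secret: str) -> dict:
        """Verify and decode the signed_request POSTed to an iframe tab."""
        sig_b64, payload_b64 = signed_request.split(".", 1)
        expected = hmac.new(app_secret.encode(), payload_b64.encode(),
                            hashlib.sha256).digest()
        if not hmac.compare_digest(_b64url_decode(sig_b64), expected):
            raise ValueError("signed_request signature mismatch")
        return json.loads(_b64url_decode(payload_b64))

    def tab_view(payload: dict) -> str:
        # The "reveal tab" pattern: fans see the thank-you content,
        # everyone else sees the like-gate.
        if payload.get("page", {}).get("liked"):
            return "thank-you-view"
        return "like-gate-view"
    ```

    Because the payload is signed with your app secret, the tab can trust the liked flag rather than asking the user to prove anything — which is exactly what makes reveal tabs feel seamless.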

    As most Facebook users simply “like and forget,” this kind of direct data can draw traffic to the business’s own page instead of its Facebook presence. Since the third-party content exists on a familiar platform, though, it can function as a preliminary landing page, easing users into getting to know the brand or service. The more dynamic aspect of the frame tabs also makes it easier to track page views and user behavior. In turn, targeting demographics by aspects such as location, sex, age, and interests is made simpler as well — something that anyone with an interest in Facebook marketing is surely invested in.

    On a more surface level, this approach is simply more engaging to the user than passive newsfeed updates. There are over 750 million users on Facebook (150 million+ in the US alone), and maintaining a strong presence is a must for any social media-savvy brand. The updates can be informational (such as special offers and exclusives shown directly from the website in the viewer’s feed), but they also allow for direct interaction in the form of contests, mailing list sign-ups, and other points of conversion from viewer to participator.

    It was an educational afternoon at Search Influence, and we thank Anthony for his hard work putting together his presentation! He’s got an upcoming blog post on more Facebook tactics, and you can bet that future Lunch & Learns will be written up right here. In the meantime, you can check out his heroic (and informative!) presentation here.

  • Google Antitrust Case – Google+ Defiant in the Shadow of Harvey Birdman

    "I'll take the case!"
    Over the past few years, Google has been investigated by the FTC over most aspects of their business, largely circling around their aggressive acquisition of properties to vertically integrate other services into their core search. As Google has added features such as Products, Maps, and Video searches and integrated them into the main search results, competitors have seen their traffic decrease as users stay on Google-owned sites.

    Making good on a somewhat threatening letter from the Senate Subcommittee on Antitrust, Competition Policy, and Consumer Rights two weeks earlier, on June 23, the FTC continued their impression of Harvey Birdman. Issuing a subpoena “relating to a review… of Google’s business practices, including search and advertising,” the FTC has opened an investigation to “address fundamental questions of business operations.” These questions, though immediately unclear to Google (if their admission and response on their blog isn’t just rhetoric), have been speculated on rather endlessly by pundits.

    Starting from their blog, Google lists a few ways they believe they put the user, i.e. their consumer in economic terms, first in their decisions. Search neutrality, vertical integration of search, the relationships involving paid search, clarity of function and policy, and consumer choice and freedom of movement are all core facets upon which Google prides itself. Similarly, SEO dominatrix Matt Cutts discussed the case with Bloomberg News, but managed to deflect most of the more interesting questions.

    On first glance, this investigation seems to parallel the Microsoft case that crippled the company’s ability to innovate on its OS, but not its market share or company size. The Findings of Fact from that case, which were not overturned in the successful appeal, offer insight into the kinds of inquiry Google will endure. However, it’s quickly becoming clear that substantive differences may doom the FTC’s case.

    Much of the argument for monopoly power in the Microsoft case stems from an inability for consumers to effectively and seamlessly switch from platform to platform. If you have a PC, you’ll either have to run Windows or put in serious effort to run your computer like Windows. But in Google’s case, they are more like the open-source alternatives in that there is a high level of customizability and a large number of competitors that are easy to switch to. Google and Microsoft both benefit from the positive feedback loop (or arguably vicious cycle, if you’re a competitor) caused by being the dominant firm in the industry. Consumers and business partners (read: surfers and webmasters) use and develop for Google first and primarily, as it is the best-known way of finding things on the Internet.

    Google’s close-to-first-mover advantage is, of course, a major influence on the competitiveness of the industry, outstripping Yahoo and, so far, holding off Bing with more panache. Furthermore, Google’s dominance isn’t just in search. Their maps and directions have all but relegated MapQuest to vintage-site status; YouTube is clearly the most-used video site on the web; their image search, though lacking some of the features of TinEye, is certainly used more; Google Docs is infinitely better known than any of its competitors, including offerings from Microsoft Office and Adobe; and their forays into social platforms, though rather unsuccessful so far, may have finally found a niche, cutting into Facebook shares per article across the top 100 tech sites. Google might even invite the same fear of pre-installation, as three of the four major browsers, one of which is owned by the company, search on Google by default.

    Sure, Google has the majority of search share, between 60% and 70%. But do the influences of Facebook’s social search, Twitter’s massive link-sharing search, the vertical search engines, and the distinctive qualities that truly separate Google’s broad-search competitors from the Big Panda really have no effect on the breadth of choices the consumer has for finding content on the web? Even more so, many of the features of Docs are better implemented outside of a browser. And ever lurking behind that success is the fact that technology, and especially non-physical products and services, has a habit of being fleeting or changing focus rapidly — case in point: Mahalo. Google isn’t invulnerable — their social networks so far have been rife with privacy issues deep enough to be all but shuttered by a class-action settlement.

    Even further complicating the antitrust case is the increasingly incestuous relationship between Google and the US government. “Only” $40 million (0.13%) of Google’s revenue comes from government contracts, but it’s extraordinarily clear that even having government contracts provides a major foothold in emerging markets, amplifying the first-mover advantage already in place. These contracts range from simple things like email (using Google Apps for Government) and Analytics to more bizarre agreements involving Google Earth for the Pentagon, FBI, and DEA, and aircraft parking deals with NASA. Stickier still is the enormous amount of lobbying Google has done over the past five years, increasing its budget from $800,000 in 2006 to over $5.1 million in 2010. Eric Schmidt, former Google CEO, is a close friend of President Obama, and the company ranked 5th among donors to his campaign.

    But the obvious monopoly power wasn’t even the focus of the Microsoft trial — it was, in fact, a bait-and-switch on the general public to garner support for much more complex antitrust issues. The trial was to “address fundamental questions of business operations” concerning how Microsoft licensed its APIs differently to different companies to limit competition. While on the one hand Google has relentlessly supported open-source coding, helping create a transparent and easy-to-enter market, they are also constantly boxing out competitors, though, to be fair, coalitions have gathered to do the same to the Big Panda.

    A slew of allegations regarding Google’s advertising business have cropped up in the wake of the announcement of the investigation. Two arguments are at the center of the debate: that Google unfairly tilts the AdWords bidding process in favor of Google-owned entities, and that Google unfairly skews organic rankings in favor of high-revenue AdSense partners. Both of these arguments fly in the face of official Google policy.

    Those claims must be on the radar of FTC officials, but judges and prosecutors will almost certainly either be too unfamiliar with the system to parse out the technical details, or be so baffled by the ubiquity of Google ads across the internet that they lose the forest for the trees — such relatively outlandish accusations drive attention away from more pressing issues in the advertising system, such as uneven application of content standards. These arguments break down even further on investigation, since social sharing likely drives more traffic to larger content farms, such as perennial slacker favorite and Demand Media flagship Cracked, which enjoys a vibrant social share rate. It would be extremely surprising if either of these accusations turned out to be true; if they are, however, Google will have to answer for significantly more than anyone expected.

    The other side of the allegation coin is significantly less conspiracy-minded: the massive vertical integration Google has pursued since moving beyond simple text search, starting with Images and extending to Maps and GIS systems, Books, shopping, and travel links. Such rapid expansion into every money-making part of the web leads some to ask, “How many industries is Google allowed to index under search, depriving the creators of the ability to monetize them?”

    That would be a valid question if two factors weren’t in play. First, modern economic theory finds vertical integration to be more economically efficient and better for consumers — the real victims of noncompetitive markets for antitrust theory, as opposed to the monopoly’s competitors. But this isn’t enough — if the only portal to websites were search engines, it would be a travesty to allow search engines to keep eating up website business models and including them as “search.” But why would a company solely rely on a third party to provide traffic? Certainly the reason why one would go to Kayak or Expedia is because of more traditional marketing they’ve done to draw attention to their site. Where the internet is concerned, many e-businesses seem to forget that their branding isn’t their top-ranking keyword — it’s actually having a business that people will want to use and recommend to their friends.

    Recommendations are not only through social media sites, either. +1 and Facebook and Twitter and whatever else is around the bend isn’t the only kind of endorsement e-business can enjoy — whatever happened to meatspace? Yelp, Foundem, and Kayak have all registered complaints against Google, saying that the vertical search integration has damaged their businesses irreparably. The short answer is “get over it;” the longer one involves some stern words to the companies’ marketing departments.

    But Google+ is different, and may be taken as a gigantic pyrotechnic middle finger to the FTC. Facebook and Google do compete, but the lines have always fallen between different business models, even as Facebook builds more and more Internet search into its platform. Google+ is the first real competitor to Facebook, as Orkut was pre-existing and woefully underpopulated in the US and Buzz was more of a Twitter competitor. This kind of vertical integration and attempt to eat up more of people’s time on the web is exactly the kind of publicity Google doesn’t need, even as it avoids the problems of stagnation that Microsoft faced when it was under investigation. It’s hard to believe the competition-themed sales pitch Google cooked up after receiving the FTC notification, when even the Economist sees Google+ as a direct competitor to FB meant to return Google to its position as the “main conduit via which people access the web.”

    Regardless of the intent or timing of Google+, the expansion and Internet buzz around the new service shows the power of the already-abandoned mantra “Don’t be evil.” The Economist likened the “addictiveness” of Google to chocolate instead of cigarettes; Internet analyst Greg Sterling mentions Google’s “capacity to evoke a certain kind of enthusiasm when it tries to do something that is difficult;” and everyone’s Facebook feed seems to have one friend who’s closing up shop and scooting over to feed El-Goog because they somehow trust Singhal, Schmidt, and Cutts over Zuckerberg.

    And that consumer trust, despite the wails of webmasters of various types and complaints of those who work closely with Google, might be the deciding factor in this case. In general, we choose to search, share, and produce with Google, despite the fact that we might be more than gently directed recursively back to other Google properties. While this FTC investigation could spell the end of Google dominance, the tipping point is a filing of a complaint, not the start of the review, and even former FTC official David Balto assumes that there is likely no explicit wrongdoing. This FTC case, however, may give Bing its golden ticket to significantly differentiate itself from Google and become a more significant competitor.

    The AP reported a 1.4% drop in stock value immediately following the FTC announcement, but the stock has since recovered.

    For edification, here are the Bing and Google organic results pages for a variety of broad Google Services.

  • For A Good Time Call…Why the Bathroom is the Perfect Place for Internet Marketers

    Recently I tweeted a statistic from Google that 39% of searchers use their mobile devices while going to the bathroom. While it served the intended purpose of getting people to @ me (I’m incredibly thirsty for Internet attention), I didn’t think about what this statistic really meant for SEO. Several days later a friend of mine posted the following on their Facebook wall:

     

    As comical as it is, my friend’s post forced me to think about the legitimacy of Google’s mobile-user stat: how many times have you sat down on the can and immediately reached for the phone in your pocket? There you are, in complete solitude with your device. Just checking /b/, posting trivial Facebook statuses, or finding the address for local adult entertainment, completely engrossed in the Internet.

     

    The isolation of the bathroom becomes the perfect environment for search marketers. While users are going, they are also staying focused on search engines or social media. Internet marketers can use this opportunity to make them brand-aware through mobile ads on Google* or promote their newest special offer on Facebook and Twitter.

    While I originally took this Google “fact” as something funny to tweet, it shows how much potential there is in mobile search. Users are more attentive to what is being presented to them when they are in the bathroom. With the right content and call-to-action, you can easily convert these users into leads, or at the very least have them remember your name.

    *I purposely didn’t mention Bing because NO ONE does mobile searches on Bing.
    For more search marketing musings, follow me and Search Influence on Twitter: @anthonyfcoleman & @searchinfluence

    Picture Source – http://windows7reports.com

  • Google Transparency – Is It Really That Bad?

    Could it hurt Google to be a little bit more clear? Mr. Kohn at Blind Five Year Old thinks not. Google’s public persona can be unclear and capricious, while its user base is so large that noise from its algorithm can have deep effects on the livelihood of those users. Are his principles of “real” engagement, transparency, and amplification already present across Google’s user interactions? Or do the SEO community and the SEO dominatrix take care of the rest?

     

    Support Forums – MUST GET FIX BEFORE SOMEONE GET KILL!!!

    The ecosystem of the Google Forums allows professionals, amateurs, and Google employees to come together and try to solve the problems that come up in the daily life of a website owner. However, the average website owner or business professional doesn’t have the faintest idea of how Google is organized or to whom to even address problems.

    When a Map error puts customers in danger, you’d like to see a fast response and some targeted information for your specific situation. The first response to the thread is from a Google employee, probably on the Maps team. However, the responder best situated to look directly at the problem, or at least explain what’s causing the error, simply posts a boilerplate answer.

    It takes a community member to explain in detail what’s going on. This could be Google awaiting a naturally crowdsourced solution to support problems, but that sounds like a euphemism for “we don’t have time for problems raised through inappropriate channels.” That’s fair, of course, but the appropriate channels are hard to find, and cold copypasta doesn’t reheat well, especially when the error is so far beyond the ken of most business owners.

     

    Blog Posts – “In short. This articles fails it’s own goals.”

    For those without a “MUST GET FIX,” researching Google through their corporate blogs would seem to be a worthy pursuit. But even when giving deep information about what factors enter into the search algorithm, Google isn’t immune to denouncements of “unmitigated prevarication.” Again, the complaint is that it’s PR instead of real help.

    To be fair, I can’t get behind this complaint, especially regarding the Panda algorithm-change posts. While sticking to their guns about not releasing too much algorithm data, Google has provided intuitive, i.e. non-technical, ways to check the perceived quality of a site. Here, despite claims of “misdirection” and “saying one thing and doing another,” Google might even be giving better information than they’re credited with: the algorithm uses techniques that are fuzzy, heuristic, or some other kind of higher statistics that somewhat accurately model real human interactions — basically, don’t bother chasing the algorithm; it chases you.

    Compounding this problem of transparency, webmasters only see the noise; that is, they don’t see how well the algorithm works overall and are concerned only with their own site, a little statistical blip that is hard to reconcile with the overall trends. Furthermore, well-educated webmasters also see where Google policy and practice diverge, whether it’s for Ads, Maps, or Snippets. In addition, the wheedling that can happen when a site owner talks directly to Google, and the mystique of getting the number for that red phone, undermine the transparency Google works to show.

     

    ThinkInsights – Data Beats Opinion

    When it comes to the state of search, Google offers a set of studies it has conducted and compiled at ThinkInsights, giving a glimpse of what Google is interested in right now. Since April, they’ve been pushing the mobile internet, which affects most areas of internet marketing: PPC, website design, and targeted marketing. These reports are published as slideshows in PowerPoint and PDF formats.

    These studies are Google’s way of keeping everyone up to date while pushing its products. Like the early videos for learning AdWords, this kind of infotising does a lot for Google, but may not give the kind of in-depth information some might hope for. Here, Google could improve in two areas. First, they need to amplify: these studies would reach the average person far better as one-shot infographics, or otherwise promoted for easy sharing and larger reach. But it’s not just that — format and distribution might not be a problem if the information were targeted, but it’s neither juicy enough for the common reader nor specific or new enough for the marketer. The information is often middle-ground, possibly good for the part-time marketer, but not for keeping up with new trends or finding the next big thing. But is that Google’s job?

     

    Matt Cutts – Yes, there is a Santa

    Regardless of whose job it is, Google has its own unofficial mouthpiece in its head of Search Quality, Matt Cutts. While he’s a bit of a god among mortals in the search community, his position allows him to engage with professionals and semi-professionals on forums, blogs, and other social arenas. Most recently, his role in shutting down incorrect speculation on ranking penalty factors puts him at the forefront of any transparency discussion. In this case, he seems to have followed the debunking flowchart Danny Sullivan created: he saw a fairly bizarre claim but didn’t respond until he saw the same claim repeated on Hacker News.

    Looking at the comments, you immediately see the issues with a one-man show. Not only is a real person capable of mistakes and poor wording, but the nature of a small industry gives people long memories. Furthermore, his direct connection to Google and his reluctance to discuss specifics of the algorithm give way to vagueness and expected corporate doublespeak. Finally, the high technical savvy of his audience lets them research well, marshaling supporting data that undermines his claims of transparency.

    I feel for Mr. Cutts: he doesn’t have to trawl message boards and blogs to help people understand his business better, but he does, often to a less than warm welcome. Still, his position as the most vocal Google Guy is somewhat self-made, and it leaves the door open for the last line of transparency, the SEO community, the grassroots alternative to the top-down approach.

     

    SEO Community – Publishing the Factors

    Google is a corporation, and therefore sometimes has to avoid certain topics or muddy the waters to dissuade people from gaming the system. The SEO community comes to the rescue and fills in the blanks. Anecdotal reports, such as case studies, forum posts, and SEO blogs, are invaluable sources, especially when you find yourself in the same situation. But it’s not these that provide the greatest transparency for those subject to Google’s will; it’s projects like Rand Fishkin’s Search Ranking Factors and David Mihm’s Local Search Ranking Factors that fill in the blanks Google leaves through its inability to discuss the algorithm or provide meaningful search data.

    These factors reports combine serious research with the collected gut feelings of knowledgeable industry professionals, a good mix of information. Better still, since they publish their methodologies and even raw data, you can double-check the findings or focus on a subset of the data, which is the ultimate kind of transparency.

    These information sources beyond Google’s reach are the real transparency for the search industry. They give information that is as unbiased as possible (minus high-level trade secrets, of course) and uncork much of the meaningful search truth.

     

    It’s hard for a large, looming company to give its users enough care and information to make them feel it’s being transparent. Giving credit where it’s due, Google often meets the criteria asked of it; however, meeting the criteria and meeting small business owners’ expectations of those criteria are two separate issues.

    The biggest issue for Google’s transparency, despite the multiple sources of information and a moderately high level of involvement, is amplification. Google isn’t making its information particularly easy to access, nor the answers it gives very public. But that’s not the worst thing: bloggers, researchers, and other SEO kings are more than happy to throw in their two cents, letting independent sources keep the information lines clear. Finally, the quest for transparency, especially on the internet, is a bit of a red herring: the Internet is not the democratic utopia we hoped it was.

     

    Is Google doing enough to make its products, services, and policies clear for you?

  • MonitorThis — Keep An Eye On Your Brand’s Online Reputation Without Lifting A Finger

    While Google may seem as inescapable as the Empire, it isn’t the primary search engine (nor English the primary language) in many parts of the world. As the Internet makes a cohesive impression on potential customers and business partners ever more important, managing the different pages those people will see when researching you becomes an intrinsic part of any online marketing campaign. Assessing your Internet reputation should be the first step in managing it. A Google search is a useful snapshot of the moment it’s performed, but it hardly covers every venue where users may have posted information about your brand, and it offers no means of ongoing monitoring. Fortunately, a more sustainable solution exists for the brand info-junkie.

    MonitorThis is a free service that collates results from 25 different global web resources into a single digest-style RSS feed. The list includes an impressive array of social media (from small-scale blog-focused projects like Blogdigger to the behemoth that is Twitter), news aggregators like Google and Yahoo News, and sharing/discovery engines like Metacafe and del.icio.us. Interestingly enough, Bing is also among the sites aggregated; with its recent decision to start weighing Facebook likes in its SERP rankings, you’re getting a double whammy of meta-social search. The service watches results from all of these sources, updating its feed so you see a mention of your chosen keyword as soon as it’s indexed by the various engines’ spiders.

    It’s unfeasible to track down every blog, review directory, social networking site, and article engine that may mention your brand or business; MonitorThis makes it easy to stay on top of every aspect of your online presence. Simply import the OPML file into your favorite RSS reader and rest easy knowing you’ve got a watchful eye on your name and reputation.

  • Microformats – Web 3.0? Really?

    Microformats, an extension of the “semantic web” concept, are getting renewed attention for SEO. Around since at least 2005, microformats build on existing XHTML standards so that content looks normal to human readers while search engines and other non-human visitors can parse its information, connecting disparate pages such as Google Places listings and review sites.

    Just Semantics

    Semantic web design is nothing new to the SEO community. We know the difference between a span styled with font-size:30px and an H1 tag, or between strong and b, and we’re clear that Google and other spiders read the paired tags differently. But with microformats, the intent is not (necessarily) to change the presentation along with the meaning, as most HTML tags do, but to integrate metadata for spiders and other programs.
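    As a quick sketch of the difference (the page fragment and wording here are made up for illustration), all of the markup below can be made to look identical in a browser, but only the semantic tags carry meaning for a spider:

    ```html
    <!-- Presentational: renders large, but tells spiders nothing about structure -->
    <span style="font-size:30px">Our Services</span>

    <!-- Semantic: the same look is achievable with CSS, and spiders read it
         as the page's main heading -->
    <h1>Our Services</h1>

    <!-- Likewise, <strong> conveys emphasis; <b> merely changes the font -->
    <p>Call <strong>today</strong> for a <b>free</b> quote.</p>
    ```
    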

    Three microformats are clearly worthwhile for the search marketing community: rel=nofollow, hCard, and hReview. Other microformats cover events, syndication, or denoting personal relationships between people, but these serve particular, niche uses. Most microformats can be tested with the Google Rich Snippet Tester, which will also alert you to some of Google’s peculiarities regarding microformats.

    rel=“nofollow”

    Nofollow is an “elemental microformat,” one that is a “minimal solution to a single problem,” and it is pervasive throughout the web thanks to Google’s 2005 endorsement. Every blog owner knows about nofollow: it’s the key to keeping comment spam from hurting your site. Put simply, nofollow-ing a link keeps your “link juice” from reaching that site, effectively withdrawing your endorsement of the link.
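    In practice, it’s a single rel attribute on the anchor tag (the URLs below are made up):

    ```html
    <!-- A normal link passes along your endorsement and "link juice" -->
    <a href="http://example.com/">A site I vouch for</a>

    <!-- A nofollow-ed link tells spiders not to count it as an endorsement;
         comment systems typically add this automatically -->
    <a href="http://example.com/spammy-page" rel="nofollow">Link left in a comment</a>
    ```
    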

    While it was once common practice to “sculpt” your link profile using nofollow, 2009 brought an announcement from none other than Matt Cutts himself, changing how nofollow is treated and effectively eliminating sculpting as a highly effective technique.

    Nevertheless, nofollow is a necessary part of any comment or signature system.

    hCard 1.0

    “hCard is a simple, open, distributed format for representing people, companies, organizations, and places, using a 1:1 representation of vCard … properties and values in semantic HTML or XHTML.” Uhhh… What?

    vCard is the electronic equivalent of a business card, saved in a .vcf file. It’s a file you send or have for download that includes information like name, company, telephone, or even logo and photo. hCard puts that kind of functionality onto a website using a “compound microformat,” allowing spiders and even some browsers to parse out the information.
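    A minimal hCard sketch (the company details here are invented) uses class names drawn from vCard properties on ordinary markup:

    ```html
    <div class="vcard">
      <!-- fn = formatted name; combined with org, this marks a company card -->
      <a class="fn org url" href="http://example.com/">Example Widgets, Inc.</a>
      <div class="adr">
        <span class="street-address">123 Main St.</span>
        <span class="locality">Springfield</span>, <span class="region">IL</span>
      </div>
      <span class="tel">555-123-4567</span>
    </div>
    ```

    To a human visitor this is just a styled address block; to a parser it maps one-to-one onto vCard fields.
    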

    hCards can be downloaded directly to a vCard using bookmarklets, on-page links, or browser plugins for Safari, Firefox, Chrome, or IE. This could be the new way to hand your card to someone you’re meeting for the first time.

    hReview 0.3

    hReview is similar to hCard, described as “a simple, open, distributed format, suitable for embedding reviews (of products, services, businesses, events, etc.) in HTML, XHTML, Atom, RSS, and arbitrary XML.” hReview works in the same way as hCard, using classes to mark text as metadata. However, fields in hReview can be hidden for better presentation, using a class of “value-title” and putting the content in a title attribute.
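    A short hReview sketch (reviewer and business invented for illustration), including a hidden rating via the value-title pattern described above:

    ```html
    <div class="hreview">
      <span class="item"><span class="fn">Example Widgets, Inc.</span></span>
      <span class="reviewer vcard"><span class="fn">Jane Doe</span></span>
      <!-- Hidden field: the rating lives in the title attribute,
           so nothing extra is rendered on the page -->
      <span class="rating"><span class="value-title" title="5"> </span></span>
      <blockquote class="description"><p>Fast service and a fair price.</p></blockquote>
    </div>
    ```
    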

    hReview isn’t as immediately applicable as hCard, though it can connect reviews with a places page. We’re currently testing how best to do this, and what it really takes to make that connection — let us know if you’ve found anything!

    Web 3.0

    Is this really the next step in the Internet’s evolution? Certainly, the term Web 2.0 has been bandied about, referring to the social sphere, where everyone networks with and talks to random people, trying to raise awareness of their pet ideas and projects. But with the “death” of the first round of social networks like Friendster and MySpace, and the utter ubiquity of others like Twitter and Facebook, could the web be going through its next iteration by making every page be “semantic” to better provide meta-information?

    Well, the social web was just the effective monetization of old ideas like BBSes, forums, and chatrooms; similarly, the semantic web simply extends the separation of presentation from content that began when CSS rose and the <font> tag was deprecated.

    Are microformats the future of the web? Are you using them right now?