Category: SEO

  • Google Places’ Hotel Booking Feature

    It’s been nearly four years since I left New Orleans on any kind of leisure trip, but this weekend, I’ll be out on the road to Austin for my first “I’m an adult” vacation. Since I lead, for the most part, a frugal and boring existence, planning a vacation is both foreign and daunting for me. This trip marks the first time I’ve ever booked a hotel on my own, so how did I go about booking?

    Really, how else would I have booked a hotel room for under $100 per night, within 20 minutes of my intended destination, only a couple of days before I left for a major cultural hub? It turns out that Google actually has an answer, one that would have been of some help had I found it before I booked.

    Google has quietly begun letting customers book a stay directly from hotels’ Google Places pages. The addition is similar to the “Shopping” section, where users can compare prices for a product across multiple retailers and ultimately deal with the one that meets their needs. Here, however, the customer is not dealing with a seemingly faceless online retailer shipping out goods but directly with a business whose quality of service is readily accessible in a few mouse clicks.

    As depicted above, potential guests have multiple options available for booking directly from Google, mostly through a number of popular travel sites, but also directly through the hotel. How exactly does this newish feature affect the involved parties?

    First, and the party I can actually identify with, is the Internet user who’s looking for a hotel room that will totally rule. Places pages have always been a great way to learn more about a business and decide whether you’d like to deal with it. With this feature, being able to see how well a hotel presents itself, as interpreted by user input, and book on the exact same page is pretty convenient. As mentioned above, your booking options at a particular hotel aren’t limited to any one site, so having all of this information in one place is more convenient than what the individual travel sites offer on their own.

    Realistically, however, this presentation of information won’t change search habits unless Google intends to push the service in a direction similar to that of the big travel sites. My method of booking was to look up “cheap hotels” in general, then go to one of a number of sites to find a place in Austin. This bypassed what Google had to offer and, honestly, netted me a better deal than what Google was advertising. Keep in mind that this is only an example of my personal search habits; searching for “hotel in Austin” or something similar, as a normal person might, will bring the user to the Places page, from which they can book however they please.

    Whichever entity a future guest decides to book through is up to that user, but this feature gives the independent hotel owner a foothold to net more profit. According to Mike Blumenthal, by buying pay-per-click ads directly through Google to place on their page, hotels can trim the hefty fees associated with listing on sites such as Expedia and Hotwire. By owning a Google Places page, a hotel owner can more effectively drive business directly to the hotel and create a positive connection with the customer base. Though according to Google, the ability to book through Google’s links “will not change the way that hotels are ranked in Google Maps,” thereby having no effect on SEO, an independent hotel still gets exposure at a better price from Google than from the big travel sites.

    What, then, of the big travel sites? Even if it seems that Google’s edging in on their territory, they still have distinct advantages. Services offered by the likes of Travelocity and friends extend beyond just booking hotels and into full travel packages. Google doesn’t offer links to car rental intermediaries if you visit a car rental location’s Places page, nor does it if you visit an airport’s location. They may have something to worry about if Google decides to extend the present booking capabilities to other areas of travel, though strictly in the realm of hotels, this change may shift the power more to the hotels than large booking agencies.

    In my opinion, this is not a bad thing. The ultimate winners of this feature are Google, hotel owners, and the individual user. Major travel sites may see a slight decrease in hotel bookings, but the advantages they hold over direct hotel bookings will keep their heads above water. The internet is constantly evolving, and these sites will have to adapt to whatever changes come or disappear. Meanwhile, the individual user will benefit from the additional information available through Google’s Places pages, and we, at Search Influence, won’t see any of our clients’ results negatively affected by this. I do wish I had known about this recent Google feature before I booked my mystery hotel room for this weekend, but at least it let me know that I saved around twenty bucks per night by booking through Hotwire! Thanks for everything, Google, and I can’t wait to see what cool new things you have coming in the future.

  • Optimizing Site Speed – Why Google Is Right

    With the most recent algorithm update, Google made it clear that a site’s user experience is now a factor for your search ranking. Though this may be a giant leap forward for assessment of site quality, the backlash against the farmer update shows that such metrics for SERP rank might not always be the best. But a user experience factor older than the Big Panda still gets some people’s goats.

    Site Load Time

    Site speed has been a significant factor since about this time last year. Reading some of the comments on Matt Cutts’s blog post can be an exercise in patience, but, as always, over-concerned webmasters bring up two good points: JavaScript and CMS design.

    The first is that things like JavaScript, Flash, and messy implementations of HTML and CSS can slow down a page’s load time. Standard wisdom says to avoid tricky solutions to simple problems, yet common analytics and advertising packages are among the last things to load on a page and are often the cause of frustratingly long load times.

    The Big Panda update may have added a factor of ad density to the list, though simply removing ads isn’t an option for many sites. Additionally, removing an analytics package most likely would do more harm than good if you’re tracking your internet business. And to change your entire layout for one factor is farcical at best.

    Yahoo’s Developer Network offers a whole host of solutions that don’t require tons of man-hours. Three major best practices are easy to implement and should solve a number of speed issues. The first is to keep scripts and CSS out of the page itself, in external files; this saves time by letting your user’s browser cache the component parts of your site. Second, keep all scripts in one file and all CSS in another, minimizing the number of files you have to call per page. Finally, if you’re sure you’re not going to be editing your scripts and CSS, you can minify your code, which removes white space and comments. This is especially effective for large files, though it can make reading and editing the code more difficult.
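    To make that last point concrete, here is a minimal sketch in Python of what minification does. It is not a substitute for a real minifier (tools like YUI Compressor handle far more edge cases); the regexes and the sample stylesheet are illustrative assumptions only.

```python
import re

def minify_css(css: str) -> str:
    """Naive CSS minifier: strips comments and collapses whitespace.

    Illustrative only -- a production minifier handles strings, media
    queries, and many edge cases this sketch ignores.
    """
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* ... */ comments
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # drop spaces around punctuation
    return css.strip()

source = """
/* main layout */
body {
    margin: 0;
    font-family: sans-serif;
}
"""
print(minify_css(source))  # body{margin:0;font-family:sans-serif;}
```

    Every byte removed this way is a byte the browser never has to download, which is why the technique pays off most on large files.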

    But really, optimizing site speed is just clean coding. You shouldn’t be surprised that serving multi-megapixel images as 50×50-pixel thumbnails or making a slew of HTTP requests on each page keeps things slow. What happens, though, when it’s not your fault?

    You use a CMS because you didn’t want to have to deal with all that. You expected that what “real” coders created would follow best practices. This is especially grating when you bought a CMS and paid good money for it.

    Here’s where there aren’t always good answers. To start, CMS’s almost invariably use databases to house their content and other components; this immediately adds steps to your request chain. And while plugins and hard-coding the CMS are always options, these can be either cost- or knowledge-prohibitive. For many CMS’s, simple things like being SEO friendly are hard enough to accomplish without fighting a losing battle against site speed.

    Some suggest using a caching plugin, but even a benchmarking report from a year ago shows a forest of options just for WordPress, none of which look different on the surface, but can offer extraordinary ranges of performance. Not to mention the added headache of working “with” these plugins as you write for your site. It almost seems like Google wants you to have a “flat” HTML site.

    Take a deep breath. First, remind yourself that this is one factor, and a small one. Linking and on-page optimization are the most important factors. After clearing your head, try to follow the same guidelines as much as you can — Minifying code, using simple solutions, and avoiding inline CSS and scripts will take you farther than you’d think. Once you’ve done all that, do practical tests: how fast is your site from a user’s perspective? Would you be turned off because of it?

    If the answer is “Yes,” then you have bigger concerns than just one factor.

  • Small Business SEO Basics: Rank for Your Own Name!

    Have you ever searched for a business by its own name and come up empty-handed? What about searching for its own name PLUS the city it’s located in and still no luck? It happens! Google definitely makes stupid choices at times, but most of the time this happens when we are not telling it the most basic information about ourselves — as redundant as it might seem.

    Alligator dogs from Dat Dog's Yelp page.

    As a stereotypical New Orleanian, I’m obsessed with great food. I have always noticed and been upset when a business doesn’t rank #1 for its own name, or even in the top 3. Recently I saw the phenomenon three times on the same weekend, so I had to blog about it.

    I was hungry for something inexpensive, but really good and meaty, which means I had approximately 1,298,450 locations to choose from in this city. Many of these will never have websites. Somehow it came down to Courtyard Grill, Cowbell and Dat Dog (I have been to only one of these restaurants but can tell from the reviews that I’m going to love the other two). Since I only trust hours of operation posted on official websites, I searched for the restaurant names with the city name after them. These were the results (from 3/28/11):

    Search Term                    Rank – Page
    dat dog new orleans            15 – About
    courtyard grill new orleans    05 – About
    cowbell new orleans            02 – Home

    Aesthetically, most would agree that these sites are beautiful, but since Google is a robot, it does not share this opinion. Of course I don’t expect a year-old restaurant to rank #1 for the term Courtyard Grill, but including the city name should be enough to make that happen. Same goes for the other two. Why are these restaurants not ranking first for their name?

    Who dat? Ain't Dat Dog.

    Yes, they are all getting outranked by Yelp and Urbanspoon pages with a bunch of reviews, but the big problem here is a lack of on-site optimization. Breaking the most fundamental rule of on-site SEO, Dat Dog and Courtyard Grill both fail to include any content on the home page. Furthermore, all three lack meta descriptions, and only one has a decent title tag. Cowbell performs the best in the results because “cowbell” and “new orleans” appear next to each other twice in the text of the home page, not to mention the address is in text at the top of the page.
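    Checking for these basics is easy to automate. The sketch below uses Python’s standard html.parser to pull the title tag and meta description out of a page; the sample markup is hypothetical, not taken from any of the restaurants’ actual sites.

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Extracts the on-site basics discussed above: title tag and meta description."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical markup for illustration -- note the city name in both fields.
page = """<html><head>
<title>Dat Dog | Hot Dogs in New Orleans</title>
<meta name="description" content="Gourmet hot dogs on Freret Street in New Orleans.">
</head><body></body></html>"""

checker = OnPageChecker()
checker.feed(page)
print("Title:", checker.title)
print("Meta description:", checker.meta_description)
```

    If either field comes back empty, you have found the same gap that is holding these home pages back.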

    Unless you have a unique name, or a URL that perfectly matches your business name — (Bouligny Tavern has both, as you can see here) — you will need on-site optimization to save you from these problems. On-site optimization is so important that even Google, which is secretive about their search formula, has written a guide to this which can be found here. I highly recommend it to web designers and webmasters because it expresses the minimum that you should do to be favored in search results.

    Who cares if my site isn’t first? I have 4 stars on Yelp!

    You should be concerned about making sure that when people are looking for YOU that they can find YOU, and not another site that is talking about you. Otherwise, why would you have a website in the first place? You want to be in control of your own destiny, and it helps to lead the conversation. Let’s say you have 4/5 stars overall on Yelp, but the most recent rating was a 1; do you want a potential customer to have this information branded in their head so early in the relationship? Even if a customer has a good experience, reading bad reviews can lead to Yelp-fulfilling prophecies. Yelp is a great site, and if you want people to see your Yelp page, link to it with a banner, just like Lock Busters did.

    I know I’m not first for my name and city, but x means the same thing as my city.

    Even though I am talking specifically about NOLA, it’s a lesson we can all learn: think about the words people actually use when they search for your business, and optimize based on those words. Locals often call New Orleans, Louisiana, “NOLA” (and it happens to be a component of the URLs of the three sites we are comparing), but chances are that when someone is looking for you, they are not using this word. The screenshot below is from Google Insights. Not only are people searching for New Orleans a lot more than NOLA, but as you can see at the bottom of the image, people are searching for “nola jobs” and “nola news,” which undoubtedly refer to our friends at www.nola.com, and the term “nola restaurant” probably refers to the Emeril’s establishment called NOLA Restaurant.

    Nola What? Nola Who?

    On-page optimization does not guarantee you will rank for your name, but you don’t have much of a shot at ranking for anything without content and metadata on your home page. There are other ways, like being linked to via anchor text from an awesome blog, which we are glad to do for 3 great local businesses, but small business owners should go for the low-hanging fruit first.

  • Why Flash Movies Don’t Help Web Marketing

    The number one rule in marketing is staying current. To show a customer that your product matters and to help make the sale, you need to keep track of trends, impress with technology, and give your target demographic enough reasons to buy your product. One effective way to wow people is Flash.

    Flash has long been used to provide visual and audio interactivity on websites. Great Flash movies engage users and trigger their curiosity: what happens when a user clicks a button or drags an icon on the page? Flash movies answer that question.

    While Flash is able to capture a user’s interest, the first step is driving users to the site, and Flash creates some problems for optimization. Here are a few:

    • Flash provides no beneficial information for crawlers
      To install a Flash movie on a webpage, the file can be put in an EMBED or OBJECT tag. Regardless of the choice, the code itself offers no support for search engine optimization.

      The lack of juice becomes a problem on sites that rely heavily on Flash and lack content. Sites with little else besides the movie file are missing out on additional ways to increase their web presence. While off-page optimization (social linking, articles, etc.) is an option, on-page optimization is more reliable than off-page optimization since the domain owner has control over what gets placed on the page.

    • Flash is not 100% cross-platform compatible
      While Flash is installable in a variety of browsers, it does not work in many mobile browsers. Since system memory is more limited on mobile devices, translating Flash would consume too many resources. Apple also blocks Flash’s programming interfaces on its products, preventing access to the code necessary to make the movies work on the extremely popular iPhone and iPad.
      Source: Reticulating — Flickr

      I’ll stop for a minute and provide a counterexample: YouTube. Normally, YouTube provides its videos in Flash format for viewing. Despite Flash’s limitations on mobile devices, YouTube still works on them. The reason? HTML5. While HTML5 video can deliver the same footage, it does not support the interactivity of Flash. That means Flash content delivered through HTML5 will be a strictly visual experience.

    • Some target demographics don’t know about Flash
      There are technologically challenged people in the world today. If you’ve ever worked in customer service or helped an older family member with a computer problem, you know what I’m talking about.

      As far as they know, Flash could be a computer virus. Or something good for their computer. Or some form of scam.

      Because these people are uneducated about the technology, they may decide not to install the plugin. Without it, they will not see the movie and will miss out on the marketing. While providing a non-Flash version of a page is an option, the downgraded version will lack the audiovisual and interactive capabilities of a full-fledged Flash version.

    Some tips on using Flash without hindering optimization:

    • Don’t make users click through to a non-Flash version; redirect them to one automatically. The user then has less clicking to do and is exposed to the marketing more quickly. On non-Flash pages, offer the option to enhance the viewing experience with Flash if the user so chooses.
    • Use Flash to do things regular HTML/CSS can’t. Good example – HTML and CSS can handle basic navigation. No need to make fancy menus when HTML/CSS is just as capable. Menus are also a good source of keywords on a page, so don’t deny the crawler the keywords it loves with a Flash menu. Reserve Flash to create unique experiences for the site viewer.
    • Don’t skimp on content. Site content is the fuel for search engine optimization. Flash movies provide no value to optimization, so use non-Flash content to help out.
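    The first tip, redirecting rather than linking, can be sketched server-side. The function below picks a page variant from the User-Agent string; the URLs and the token list are made up for illustration, and user-agent sniffing is crude compared to client-side feature detection, but it shows the idea of landing Flash-less visitors on the HTML version with zero extra clicks.

```python
# User-agent tokens treated as "probably can't run Flash" -- an illustrative
# list, not an exhaustive one.
MOBILE_TOKENS = ("iphone", "ipad", "android", "blackberry", "windows phone")

def landing_page(user_agent: str) -> str:
    """Choose which version of a hypothetical site to serve.

    Visitors who likely can't run Flash are sent straight to the plain
    HTML version (which also carries the crawlable content); everyone
    else gets the enhanced Flash version.
    """
    ua = user_agent.lower()
    if any(token in ua for token in MOBILE_TOKENS):
        return "/index.html"        # plain HTML version
    return "/index-flash.html"      # Flash-enhanced version

print(landing_page("Mozilla/5.0 (iPhone; CPU iPhone OS 4_3 like Mac OS X)"))
print(landing_page("Mozilla/5.0 (Windows NT 6.1) Firefox/4.0"))
```

    Because the HTML version is the default fallback, the crawlable content is always reachable even when the sniffing guesses wrong.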

    Remember, Flash should be used as a complement to a website and not as the website itself. All of the pretty effects and animations mean nothing if there is no one to view them.

  • Google “Farmer” Update: Big Panda Makes Mountains and Molehills

    Damn it feels good to be a Panda.

    Over two weeks ago, Google scared the SEO community by publicly releasing what even they called a “pretty big” change in their algorithm, “noticeably” affecting 11.8% of queries. This change, coming on the heels of competitor Blekko.com blacklisting certain content farms and the announcement of a Chrome plugin to block websites from search, is part of what analysts at Sistrix call a “quest for quality.” This is old news to anyone paying attention, but a wide variety of opinions are still circling about what this update really meant. Most people pointed at “content farms” as the target of the update, yet it has become clear that these sites were neither the primary target nor intentionally singled out as “low quality.”

    Pandas munch only on bamboo…

    The Big Panda update is the culmination of a long line of work; Matt Cutts, Google’s spam god himself, points to the Caffeine update from way back in 2009 as the beginning of his team really looking at low-quality results. The spiders were finding so many new pages that they had to do something about plagiarists, scrapers, aggregators, and meaningless fluff content. While Cutts was happy with the low amounts of straight V14gR4-style spam, he was worried about that fluff — what he called “shallow content.”

    Moving forward two years to the end of this January, Cutts announced a small change that attempted to make some programmer-centric search terms rank Stack Overflow and other providers of original user-created content above sites scraping that content. This affected a whopping 0.5% of queries, including such high-converting gems as “pass json body to spring mvc” and “aws s3 emr pig.” Obviously, ma and pa aren’t going to see these changes.

    But the ones made a month later, seen first on February 24th, affect the kinds of searches the average user would be making. Demand Media, owner of eHow and other alleged content farms, made an almost-snarky blog post that either thumbed its nose at Google’s change or reaffirmed the company’s legitimacy, depending on how much bile you smelled on the words.

    By the time Google officially announced the change on its blog on the 28th, webmasters, SEO experts, and even CNN had weighed in on the change. Website owners noted major changes in their rankings and traffic, and they were rarely happy with the results. Finally, on March 1st, a heart-to-heart with Cutts and algorithm genius Amit Singhal at TED 2011 explained the change in depth.

    …Authoritative, quality bamboo.

    This talk laid out how Google designed its algorithm. Instead of trying to find factors and hope that said factors are in line with users’ expectations, they asked questions of real people about a variety of websites, then built the algorithm to match the answers to those questions. These questions cut to the core of what’s important on web sites: Would I give this site my credit card information? Would I use this information for my sick child? These questions should be at the core of website development and content creation — unfortunately, Google has to step in and play Gatekeeper and Keymaster.

    Jason Calacanis tastes like Bamboo to Big Panda.

    The next day, Jason Calacanis, CEO of “education site” Mahalo.com and known for quipping “SEO is Bullshit,” laid off 10% of his workforce and halted all written content production in response to the site’s losses in search engine ranking, though a month earlier he had boasted of doubling the size of the company this year. While not talking to the press, he tweeted: “We were the right size for a certain market opportunity, and overnight that changed.” Somehow that sounds like AOL and Netscape, too…

    Later that day, Google set up a thread in their forum to try to assuage the growing litany of discontented site owners. Google knew people were watching, and they were quickly responding. Most posts complained of losing a majority of income from the algorithm switch, and pridefully stated nothing could have been wrong with their websites. SEO gearheads wagged their heads in frustration and more or less told everyone to chill out. Without copying what Ms. Fox said, not much has changed in SEO — don’t try to “win” at it by doing the bare minimum and don’t base your entire livelihood on search engine ranking. That’s not comforting to someone like the owner of Fonerbooks, who claimed to see everything but his original work in the rankings, or HealthCastle, who saw a 40% drop in traffic and wanted to know “why Google named us low quality.”

    Demand vs. “Thanks”

    Two companies represent diametrically opposed business plans and responses to the Algorithm change: Demand Media and Mahalo, the well-rounded shrug and the panicked crisis. I think that the differences in approach to content farming truly showed the reason why the sites were affected differently by the Big Panda algorithm change.

    Demand Media owns eHow, one of the most egregious of content farms. Yet, according to most data, it rose in the rankings. While this is enough to aggravate “legitimate” content creators, how eHow is supposed to be used sets it apart and mitigates much of the spamminess it’s known for. Adweek explains the concept well:

    The company’s evolving media strategy, according to Rosenblatt, is to attract users to eHow and other properties by answering highly specific search queries, and then drive them to other branded channels, hoping they become loyalists. “We are constantly trying to figure out ways of bringing people in [and] give them a much broader experience on our network,” he said.

    They use long-tail and specific search phrases to drive traffic to niche branded sites. These sites have meaningful authors and sponsors: Lance Armstrong, Tyra Banks, and Rachael Ray among them. I’ll even admit two Demand Media-owned sites are too often in my history: David Wong’s Cracked.com and Fancylad’s i-am-bored.com. What sets Demand Media apart from some of the other content farms is the high level of community loyalty and interaction these sites enjoy. Most stories and links on these sites have plenty of user comments, and their Facebook pages enjoy a level of curation and personality that rivals traditional content providers like bloggers. The links are strongly shared and enjoy some market saturation — an “article” on the flaws in alien takeover strategies has 1.3 million views, 10,000 likes, and 1,600 comments over the course of 32 days. That’s better than most SEO companies’ biggest clients.

    Branding for the Bored

    Compare this to Mahalo, which hardly enjoys the same kind of saturation. A much more broadly marketable subject, “Super Bowl Commercials 2011,” garnered only 37 tweets and no comments over the same number of days. Mahalo, though designed cleanly and conspicuously missing any ads, refuses to market itself to an audience, content to serve up something that doesn’t work, over and over. This piece of poetic gold shows the difference in style of laziness:

    DreamWorks forthcoming animated comedy “Kung Fu Panda 2,” featuring the voice talent of Jack Black, had a trailer debut during Super Bowl XLV. The ad features the film’s tag line: “We Will Wok You,” along with the lead character attacking enemies with a wok…

    While Demand might pump out three articles on the same fluff topic, there isn’t the lack of attention to meaningfulness seen on other farms. Don’t forget that Mahalo is trying to brand itself as a one-stop-shop for learning, while Demand revels in its content for content’s sake mission.

    Big Panda vs. Old Dinosaur

    But all this talk about content farms is a red herring. This was not Blekko; this was not a targeted swipe at a business plan. If it were, Fonerbooks, whose allegedly well-respected site holds large amounts of original and informative content, would have had no problems with the “Farmer” update. The Wall Street Journal took up the owner’s case, citing his innovative computer troubleshooting techniques and using it as a screed against the latest update’s failure to address the “real problem.” The article compares eHow’s 200-word summarization to the detailed article, ten times its length, that eHow supposedly “stole.”

    Here are the two pages in question: the first link on Google for “toshiba troubleshooting” and the first result for “toshiba troubleshooting fonerbooks.” The difference is striking.

    Sure, eHow has two other articles linked that have nearly the same title, though thankfully with different content. Sure, eHow has 7 ads in the first 900 or so pixels of the screen, and even more below the fold. Sure, eHow’s writer has written such crucial treatises on computer repair as “How to Take Crease Marks Out of Posters” and “Fitness Requirements for the Army Reserves.” But what page is more useful to the reader?

    Is this really quality?

    Is it the one with poorly laid out tables, walls of text, veiled insults to the reader, and links to static gif flowcharts, culminating with an ever-present plug to buy a $25 self-published book? Or is it the one that someone can look at immediately and see if the information is useful? Comparing site statistics shows that users spend the same amount of time on both sites — can you read 2000 words in under 3 minutes?

    This isn’t to indict the owner of Fonerbooks, but rather to show that this update was really about user experience rather than a content-based punishment. Are you providing a brand to your readers? Will they come back, recommend your site, and consider you a valuable source of information? Four people “Liked” the eHow link on Facebook — that kind of user interaction is absent on Fonerbooks, short of buying his prose.

    Playing numbers games with Big Panda

    At the heart of all SEO is trying to figure out what Google wants. How do we deliver what the algorithm deems important while still remaining relevant to the clients of our clients? Eight factors seemed relevant across the posts and stories about the “Farmer” update. Big Panda doesn’t want to play games; the factors suggested are a product, not a cause, of good marketing and website design. These factors were culled from a variety of sources, though Searchmetrics’s confidence in visitor statistics like bounce rates, which they say “have obviously been part of the quality guidelines of the update,” provided an excellent starting point for a way to examine them. Search Engine Land’s “Why Me?” post also provided important metrics to test, such as website authority and ad density.

    1. Duplication of Content
    2. Author Authority
    3. Ad Density
    4. Overzealously Using NoFollow Links
    5. On-site User Engagement
    6. Off-site User Engagement, i. e. Social Media
    7. Link Profile
    8. Visitor Statistics

    Using these eight metrics, I tested the top winners and losers from reports by searchmetrics.com, seoclarity.net, sistrix.com, the Google support thread, and other news stories. My method tried to discern if any of these suggested metrics truly held weight.

    Using a random article, page, or product linked from the home page of the site, I checked each metric. My method was hardly scientific, and limited by the process of doing it by hand and the subjective nature of some of the variables.

    Duplication of content was checked using articlechecker.com or Googling random paragraphs, depending on what gave any results. No duplication was 0; otherwise it was a count up to 20 duplicates on the web. The goal is always 0 duplication, though this method ignored any canonicalization.
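    For readers curious what a duplication check looks like under the hood, here is one common technique, word shingling, sketched in Python. This is not necessarily what articlechecker.com does; it is a standard way to score how much two texts overlap, shown with made-up sample sentences.

```python
def shingles(text: str, k: int = 5):
    """Return the set of overlapping k-word sequences ("shingles") in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def overlap(a: str, b: str, k: int = 5) -> float:
    """Jaccard similarity of the two texts' shingle sets:
    0 = no shared phrasing, 1 = identical wording."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "the quick brown fox jumps over the lazy dog near the river bank"
scraped  = "the quick brown fox jumps over the lazy dog near the old mill"
print(round(overlap(original, scraped), 2))
```

    A score near 1 flags a likely scrape; a score of 0 is the clean result the method above was hoping to find.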

    Author Authority was a subjective 0, .5, or 1 denoting the author’s specialization of his writing. A writer in a wide variety of fields or with no name received a 0; one who wrote in one general field, such as electronics, received a 1. Those that I felt could have gone either way got a .5. Clearly, more authority should be better.

    Ad density was a simple count “above the fold.” My screen is set to 1440×900, falling into the majority of web users according to W3Schools. This helps determine purpose of the page — too many advertisements could mean a low-quality site.

    Nofollow links were the percentage of outbound links with a nofollow tag. If a site doesn’t want to pass its link juice, how can we trust those links are helpful to the reader?
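    That percentage is easy to compute mechanically. The sketch below uses Python’s standard html.parser and, for simplicity, treats every link on the page as outbound; the sample markup is invented for illustration.

```python
from html.parser import HTMLParser

class NofollowCounter(HTMLParser):
    """Counts links and how many of them carry rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.total = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "href" in attrs:
            self.total += 1
            if "nofollow" in (attrs.get("rel") or "").lower():
                self.nofollow += 1

def nofollow_percentage(html: str) -> float:
    counter = NofollowCounter()
    counter.feed(html)
    return 100.0 * counter.nofollow / counter.total if counter.total else 0.0

# Invented sample page: two of four links withhold their juice.
page = """<p>
<a href="http://example.com/a">followed</a>
<a href="http://example.com/b" rel="nofollow">withheld</a>
<a href="http://example.com/c" rel="nofollow">withheld</a>
<a href="http://example.com/d">followed</a>
</p>"""
print(nofollow_percentage(page))  # 50.0
```

    Distinguishing truly outbound links from internal ones would only require comparing each href’s domain against the site’s own.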

    On-site User Engagement was a simple count of comments on the page. This usually means a site has repeat visitors and is building a brand.

    Off-site User Engagement was split into tweets and Facebook posts. I searched the titles of the pages, long and shortened URLs, and keywords in Tweetscan.com and youropenbook.org. Rarely did these numbers match what was on the site; I’m not sure if this is a weakness or not as Google might not index old tweets or private Facebook pages. Either way, more social involvement means people find your site useful, and should bring your rank up.

    The page’s Link Profile was checked via a random smattering of off-site backlinks from Yahoo Site Explorer. I rated the links on a 5-point scale, where 1 is least spammy (A-OK) and 5 is most spammy. A spammier link profile should mean a lower ranking, if backlink quality is taken into account.

    Finally, visitor statistics for the domain were checked for the most recent time period on Alexa. I looked at bounce rate and time-on-site. Ideally, longer is better, though you can’t beat a timewaster like Facebook, which reigns supreme at 32.7 minutes per visit.

    The data is embedded and linked at the bottom of the post; I invite anyone to seriously crunch the numbers, but a quick analysis can show “direction,” that is, whether a given factor positively or negatively affects ranking. At a 90% confidence level, we can see that four metrics definitely matter: ad density, social networking, link profile, and visitor statistics. All of them fall in line with the expected effects on ranking described above. In short, Big Panda works.

    So what does it all mean?

    Deeper study is always needed, but for a designer or marketer, Big Panda brought the web back to the user. The goal, more now than ever, is to create a site that people will want to interact with. Compelling visual designs, interesting and relatable content, and user loyalty both on and off the site will reign supreme.

    But that’s been the standard operating procedure of advertising and content production the whole time. In many ways, the more things change, the more they stay the same. Now, Google is whole-heartedly supporting your efforts and rewarding you.

    If you’re having trouble surviving with a Big Panda in the room, it’s time to change how you do business on the internet.



    Big Panda Data Spreadsheet

  • New Google website block feature: end-user advantage or an assault on SEO?

    Yesterday, Google (ever heard of ’em?) announced that they’ll allow users to start blocking whole websites that they don’t want to see in their search results. The internet’s long been a place where the individual user can define a personalized geography of websites and online communities where irrelevant content effectively does not exist unless brought in by some outsider. Google’s implementation of individualized domain blocking is in line with the largely tailored nature of the internet.

    Paired with its earlier announcement that they’ve changed their search algorithm in the name of “helping people find ever higher quality in [Google’s] results,” it seems that Google’s pushing hard to make the user’s web surfing experience totally bitchin’ by essentially cleaning out the trash from the potentially gnarly waves of the internet. (Sorry. “Surfin’ the ‘net” as a marketing phrase is hilarious to me.) Making the internet easier to use is a noble effort, for sure, but Google establishing itself as the judge of what matters most puts out some grody vibes to this surfer, man.

    In contrast to the top-down filtering of search results straight from Google, the new announcement affords the end user more freedom and power to decide what they actually want to see amid their results. This, of course, demands more vigilance on the user’s part to clean up the trash themselves so that their “waves,” or search results (I promise, I’m about to break out of this terrible analogy), are exactly what they want for the most tubular ride.

    While this decision by Google doesn’t directly affect SEO significantly, what some bloggers have been buzzing about is this one sentence, pictured below next to a blurry picture of my cat:

    As depicted above, Google mentions that it will look at data on blocked sites and consider using it in the future. What does this mean for SEO? Not necessarily anything. Users have to voluntarily block sites they don’t want in their search results. I won’t pull out any semi-relevant statistics on web-based participation, but the fact that users must be logged into their Google accounts to block sites means participation is limited to a slim segment of the internet’s population. Will such a minimal amount of input really affect a site’s traffic? Maybe, but it’s doubtful this would carry any more weight than simply clicking one site over a competitor.

    Good on Google for doing this, I suppose. Being able to decide which sites I want to view beats having someone else decide for me, though the latter’s going on regardless of my input. Personally, there have been exactly zero times I’ve wanted to block an entire domain for giving bad results, but know that I’m a proponent of imperfection and trash in general. So worry not, SEO friends: until we find specific data correlating users’ personally “blocked” sites with a decrease in traffic, there’s nothing here to lose sleep over.

  • Barnacle SEO – Local Search Engine Optimization for The Sam’s Club Crowd

    Barnacle SEO update for Pubcon New Orleans 2014 on slideshare: Barnacle SEO for Local Search at #Pubcon

    This post is republished here because the site on which it originally appeared is now offline. Many of the examples are no longer valid, but I hope the screenshots are still useful.

    Barnacle SEO for everyone.

    In the time since this was originally published, our New Orleans SEO agency has gone from working almost exclusively with local small businesses to practicing Barnacle SEO on a much larger scale. Our clients today include higher education and healthcare concerns for whom the barnacles are bigger and more plentiful.

    Every industry has an opportunity to leverage directories, high-authority sites, and more to catch those searchers floating by in the current.

     

    Small business owners are generally a frugal lot.

    Of course there are some who have the budget to invest heavily as long as they see a return, but what about those who don’t? A long time ago, someone coined the phrase “Parasite SEO” to describe the abuse of Blogger and WordPress.com to get better search term positioning. I love the idea, but “parasite” just sounds so ugly. I prefer “Barnacle SEO.”

    Barnacle SEO: attaching oneself to a large fixed object and waiting for the customers to float by in the current.
    As we know, there are lots of large trusted sites out there which by virtue of all those factors we know and love take very little coaxing to rank well for long-tail phrases.

    And if I’ve said it once I’ve said it 1000 times “local search IS long tail search”.

    So, here are a few of those large fixed objects:

    Local profiles offer a lot of benefit for Barnacle SEO. In many instances they play supporting roles, helping to increase the number of web references, or “citations” as David likes to call them.

    The following examples are from real, Main Street clients who pay less than $500 per month for SEO services. Obviously these are not our most competitive categories, but we’ve found that for many small businesses it doesn’t take a lot to move the needle.

    Clearly we could, and will over time, better optimize the profile pages themselves. Amazingly, in one of these cases the site uses such a bad CMS that we can’t even control the on-page content.

    Merchant Circle and Yahoo in supporting roles:

    Property Damage Rhode Island - Image from Google

    Notice that the client, Puroclean RI, has top billing in both Organic and on the Map for “Property Damage Rhode Island,” even with a very bad title tag, and is present with both Merchant Circle and Yahoo! Local listings. So, with a little Barnacle SEO we’ve got 30% of page 1 (and this is just one of many search terms) without breaking the bank.

    Merchant Circle in a leading role:
    Accounting Firm Louisiana - Image from Google

    As you can see, with the Merchant Circle listing we have 2 of the top 3 results for “Accounting Firm Louisiana” for ASU, LLC in Metairie, LA. And this one is very early in our process. Within 60 days or so we would expect the client’s own listing to supplant the Merchant Circle listing and hopefully knock out that top result as well.

    Picture of Sarah Palin with an M-16
    Sarah Palin Taking Aim at Those Barnacles

    Like all SEO, it’s not a case of “build it and they will come”; each of these listings needs to be cultivated and promoted around the web.

    But, as the above demonstrates, a little bit can go a long way when you first attach yourself to a big, heavy object.

    With Barnacle SEO, even a Sam’s Club shopper can compete with the Neiman Marcus set in local search engine optimization.

    Image Credits (Wikipedia):
    Gooseneck barnacles taken at Limekiln Point on San Juan Island
    Camp Buehring, Kuwait – Lt. Col. David Cogdell helps Alaska Governor Sarah Palin test out the Engagement Skills

    Will Scott is president and founder of Search Influence, a website promotion company in New Orleans, LA, focused on local search engine optimization for small business.

     

  • Google Places iPhone app – Not convinced yet…

    Yesterday, Google announced and released the brand new Places app, built on its vast (almost too vast) catalogue of data. Places can be a nightmare for anyone looking for specific businesses on the web, especially if you’re in SEO. The app is a fairly elegant solution to this problem.

    At first glance, it’s a great app. It’s easy to use and easy to find and review different businesses around your city. However, Yelp did it first and did it better. Also, I’m sure they’re going to roll out an update that allows you to check-in to locations, and who really needs another one of those? I still haven’t chosen my favorite one yet and I’m not going to complicate things by introducing another.

    There is some good to this app, though. Just like Google Places on the web, it pulls reviews from multiple sites across the ‘Net.

    Also, everybody is on Google. This can make for a great recommendations service, as long as it takes off. It also integrates their new Hotpot feature and it’s free, which is always a plus.

    As you can tell, I’m not entirely sold yet on this, which is insane. Usually I’m right behind Google when they roll out their latest venture. Not this time, though. Maybe Google should stop focusing on copying or trying to buy the latest craze and start creating them again.

  • Top 5 Search Marketing Bloggers You Should Be Stalking in 2011

    Why hasn't Matthew Lesko written a get rich with SEO book yet?

    I’m a search marketing guy, and have been since my first job out of college (if we’re not counting that liquor store on Ponce I worked at for two weeks in the summer of ’06; you can’t beat a 30% discount…). Most people can’t grasp what I do or even understand why the company I work for exists (my mother thinks I’m everything from a telemarketer to an Internet panhandler), but it is what I do and love. Recently I have become increasingly serious about doing my job better and becoming a more well-rounded internet guy (I mostly just hang out on the fringes of the internet with the /b/tards and tweeps, heckling e-celebs for my own amusement). Now I feel I have to pour myself into learning new search marketing strategies to keep up with the constant changes and competition.

    Finding good information is not always easy for search marketers.

    There are no continuing education programs for SEO backlinking strategies or for creating compelling, keyword-rich content. So I decided to put together a list of my favorite internet marketing bloggers and explain why you should be reading them if you are serious about SEO.

    #1) Matt Cutts is Better Than You

    If you are lazy like me you prefer to just listen to Matt Cutts talk.
    http://www.youtube.com/user/GoogleWebmasterHelp

    If you don’t know who Matt Cutts is and you work in search marketing, ctrl+alt+delete your resume. Best known as the smilin’ enforcer of the Google Webmaster Guidelines, Cutts provides guidance to search marketers through his YouTube videos and blog posts.

    Why You Should Care: Matt Cutts is Google’s Prometheus. He presents SEO guidance in several different formats (videos, blog posts, guest speaking, etc.), which makes the information he provides easier to digest. If you are like me, hitting the play button is a lot easier than reading three paragraphs of insights.

    What You Should Stalk: While you can always follow his blog, I find his information most useful on his YouTube channel – Google Webmaster Central

    #2) The eWhisper

    @bgtheory on Twitter

    bgTheory’s Brad Geddes is a world-class PPC geek and AdWords aficionado. Besides being the official AdWords Seminar Leader, he is also the author of Advanced Google AdWords.

    Why You Should Care: If you want to know what you should be doing with your AdWords account, Geddes is the man to follow. He is a well of information and ideas about how to better use your AdWords campaigns to generate and sustain ROI.

    What You Should Stalk: His Twitter account – http://twitter.com/#!/bgtheory

    #3) David and the Paid Goliath

    @Szetela on Twitter

    David Szetela, the occasionally snarky owner of Clix Marketing, is one of my go-to sources for PPC news and updates.

    Why You Should Care: His knowledge of cross-platform paid search is extensive, and a quick follow on Twitter will fill your timeline with tweets on paid search strategies and general SEO information.

    What You Should Stalk: Like Brad, David is most active on Twitter – http://twitter.com/#!/szetela

    #4) Moz-el tov

    http://www.seomoz.org/blog

    SEOmoz has some of the most informative blog posts on SEO benefits, keyword mining, and PPC tips.

    Why You Should Care: I’ve been faithfully reading SEOmoz’s blog for up-to-date information on SEO, PPC, and SMM from some of the best and brightest. If you are not following this blog, you are missing out.

    What You Should Stalk: I highly suggest daily readings of the blog – http://www.seomoz.org/blog

    #5) Mike Maps It All Out


    A student of life, political economy & local search, Mike Blumenthal is a Google Maps and local search guru. He frequently authors articles at Search Engine Land and his blog Understanding Google Maps and Local Search.

    Why You Should Care: Local search, local search, and local search. Oh and maps.

    What You Should Stalk: Mike’s blog is a warehouse of data points and Google Maps techniques.

  • The Effect of Google Instant on Long Tail Search Behaviors

    We have been watching some recent Google product changes over the last few months to monitor the effects on site traffic, and in November we saw most websites move along at a positive clip in their organic search numbers. Nothing startling. Everything on track. But for a significant few websites, digging into the November Google organic search data elicited responses of “Help!” or “What in the world!?”

    With double-digit drops in Google organic traffic (some down 13%, some 20%, and some in a 32% nose dive), something clearly is going on for these few sites. Of course, my first thoughts turn straight to the two Google product rollouts of the last two months and how they must be changing search behavior and traffic percentages.

    Trying to find an answer, or at minimum a clue, I am paying attention to three specific areas of organic decrease:

    1) geo specific searches

    2) branded searches

    3) photo organic search

    For example, Google organic search traffic for one website dropped 17% in November, or 57 visits, of which 27 were geo-specific phrases. His organic rankings are super strong, so that’s not the cause. He is in the SERPs in some great spots, so where is the traffic going?

    I have another website, in a different industry, which lost 156 Google organic visits in November; 74 of these were geo-specific.

    I believe these losses are largely a consequence of Google Instant, where the results change with each letter typed. I believe Instant is pulling searchers away from their intended search as these alternatives are dynamically generated while they type. I know that when Instant first went live, some SEO experts opined that it would be a non-event. I can’t agree.

    As I type “dentist Las Vegas,” Google Instant offers several options to divert my search. I didn’t think about searching by zip code, but maybe I live in 89113, and that alternative search might be better than the one I was going to type.

    dentist las vegas

    Instant is offering some pretty specific options that may mean a wider keyword universe for SEO.  The “best dentist?”  Well, I would like to see who is best…

    dentist boulder

    Branded search is not safe from these Google product changes either. From my first example site, of the 57 dropped searches, 30 were related to branded search (the doctor’s name).

    For the branded search drops, I wondered if people were clicking on his one-box instead of his website when searching his name.

    Or maybe searchers were finding a different results set for his name and were clicking on his profile or other directory listings, but I don’t see ANY increase in directory referrals.  Frankly, I’m a little stumped.

    A third trend I found this month is more specific to the plastic surgery industry. Each month there is a lot of organic search around “photos,” “pictures,” or “before and after.” These are not searches in Google Images. One surgeon based on the West Coast lost 37 visits from “photos” searches. And one practice on the East Coast lost 134 visits, cut in half, even after I filtered out all other factors that could affect it.

    google instant results

    I attribute these traffic losses partly to Google Instant; I can see where searchers can get distracted from their intent by suggestions like “34c breast size.”

    And partly to the SERPs themselves. I’m pretty sure the spots where our docs once ranked for those non-localized phrases have been given over to directories. Unfortunately, I don’t have historical data on where they once ranked for non-geo phrases in location-based search versus where they rank today. Bummer. That would have been cool to know.

    I haven’t figured it all out yet. Surely I am missing something in the data, even though I have been combing through Analytics; probably something right in front of my face. Someone may be kind enough to point it out to me, forcing me to say “doh!”

    Thanks to cytoon for the great nose dive image http://www.flickr.com/photos/cytoon/

    Thanks to LuChOeDu for the iconic Homer http://www.flickr.com/photos/luchoedu/