B2B SEO | Google Hummingbird and Beyond



What does Google’s new approach to search mean to you?

by Michael Andrew Westfall

To take advantage of this new environment, you must first put yourself in Google’s shoes. The monolithic search juggernaut is on a crusade to continue its core brand strategy as the only true innovator in its space. Ask what you would do if you were trying to maintain this lofty status as the world’s leading provider of information. With this in mind, we can better understand the motives and derive some context from the ever-so-vague and nebulous explanations we get from Google’s search team and leadership alike.

1. Wouldn’t you like search engines like Google to present users with the most personal and relevant results possible? Even if it means their keeping large amounts of personal data?




Guardian: SEO is dead. Long live social media optimisation

As Google search results throw up more and more ads, using SEO to reach your audience is becoming increasingly futile. Could social media optimisation be the answer?


The Social Media Command Center, powered by the Salesforce Marketing Cloud. Photograph: Steve Marcus/Reuters/Corbis

Search engine optimisation (SEO) was always a flawed concept. At its worst, it means making web content less engaging for the reader but supposedly better for search robots and for the mysterious algorithms that determine the order in which results appear for a Google search. At its best, it means no more than following best practice in creating clear, accessible websites with intelligible content, meaningful titles, descriptive “alt” attributes for images, no broken links, and the rest of what makes for a high-quality web destination.

Now SEO is dying. A striking post by Dan Graziano reveals that a Google search may display only 13% organic results; “the rest is ads and junk”. In addition, a recent Forrester report on how consumers found websites in 2012 shows that social media is catching up with search: among US respondents it accounted for 32% of discoveries versus 54% for search, up from 25% in 2011. The trend towards localised results delivered to mobile users, perhaps via an app rather than a web page, is another reason why traditional SEO is decreasingly important.

A better model for today’s businesses is to consider what it means to be social-media optimised, with a focus on customer-centric interaction rather than merely setting up a web property in the hope that Google will deliver hits. Recommendations from friends count for more than a search engine algorithm will ever achieve.

What then is social media optimisation? It is about inviting people into conversation rather than merely broadcasting a message. It is about listening to social media chatter and acting on the results. It is a hashtag or twitter handle on every ad, and a responsive team behind that social media presence for those who respond. It is integrating multiple touch points on multiple channels so that customers get a consistent experience across all of them. These things are not trivial to implement, though the technology to do so now exists, and they have the potential not only to drive sales but also to transform customer experience.

SEO will not be missed.

15 Must-Knows for SEO


Search algorithms seem to change more frequently than the latest fashion trends. Although the Panda and Penguin updates have received lots of the attention lately, experts say that Google alone makes adjustments to their algorithm several hundred times per year. Every time Google makes another update there’s a mini-panic from webmasters and small business owners – “oh no, now what?!?”

Take a deep breath and stay calm, because some things in SEO stay consistent. Here are five constant SEO concepts to consider and build upon:

1. Don’t cheat. I’m a firm believer in playing fair. If you don’t want to get de-listed or bumped down to page 50+, be a white hatter (aka, do the right thing). Don’t spam, keyword stuff, buy links, cookie stuff, create fake profiles, buy “likes” or use any other shady tactics. The last thing you want is to have your site removed from the Google index. Yes, this does happen!

2. Get links. Put your wallet away, you don’t need to buy links. The right way to get links is to create link-worthy material. You can create guides, infographics, interactive calculators, a unique article, comparison charts, an illustration, a storytelling photograph or dozens of other content types.  No matter what it is – the more thought and care you put into it the better. Another great way to gain new links is to guest-post on a blog relevant to your brand’s services or products.

3. Use keywords. Use relevant keywords in your copy – the terms users might type into a search to find you. Unsure what they are? Check the AdWords keyword tool for ideas. Don’t over-optimize by stuffing these terms throughout the copy (remember #1 above). Include them only where they make sense when reading: in the page title, in the description and a few times throughout the copy. You can also include them in the image file name if relevant, as well as in the alt attribute.

4. Build your site for people, not for ad dollars or search engines. When creating your site, remember who you’re building it for. Don’t bombard your visitors with too many ads or anything else that detracts from a positive user experience. Don’t make them click “next” 25 times to see 25 ways to make Ramen. Put your readers first and deliver the user experience they’re looking for. This improves bounce rate and repeat visits, and maybe you’ll gain a new newsletter subscriber (if you have an opt-in form on the page), which could lead to sales.

5. Be a resource. There’s no Candyland slide for good content. Create content that will reach your desired audience. Make useful pages that educate and inform, and ensure that it’s easy for viewers to navigate and find what they’re looking for. Teach, share and most importantly, don’t limit your site to just selling. Show some personality, your artsy side and/or spread knowledge through guides, articles and examples. Show Google what a good resource you are by including outbound links, plenty of images and even a video or two. Don’t forget to include social sharing icons so your website visitors can easily share all your great content with their own social networks!

These five simple concepts have been consistent in SEO for the last 10 years and will most likely hold true for many more. Although this generally means working harder than some folks, rest assured you won’t be banned, blocked or forced to duplicate your efforts to regain visibility. Learn more about webmaster best practices.

SEO Tips: Google Analytics real-time screenshot

Sourced from: Bonnie D’Amico Rogers



  • Create XML sitemaps and upload them to the root of my site
  • Ensure navigational structure can be accessed by search crawlers
  • Ensure content is not buried inside rich media (Adobe Flash Player, JavaScript, Ajax)
  • Ensure a clean down-level experience to expose content when rich media is used
  • Ensure a clean, keyword rich URL structure is in place
  • Create a robots.txt file and place at the root of my domain
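The XML sitemap item in the checklist above can be sketched in a few lines. Here is a minimal, hypothetical example that builds a sitemaps.org-style document with Python’s standard library; the URLs are placeholders, not part of the original text:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemaps.org urlset document from a list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical URLs; the result would be saved as sitemap.xml.
sitemap_xml = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about",
])
print(sitemap_xml)
```

The generated file would then be uploaded to the root of the site, alongside robots.txt, as the checklist suggests.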

Site Structure for SEO

  • Ensure content is laid out in a logical hierarchy
  • Ensure content is well linked to with internal links
  • Ensure a clean, keyword rich URL structure is in place
  • Ensure URLs contain no extraneous parameters where possible (sessions, tracking, etc.)
  • Create both HTML & XML sitemaps for humans and search crawlers respectively
  • Ensure JavaScript/Adobe Flash Player/rich media does not hide links from crawlers

On-Page SEO

Inside the <head> code

  • Titles should be unique, relevant, and approximately 60 characters long
  • Descriptions should be unique, relevant, grammatically correct, and 160 or fewer characters
  • Must have only one title and description per page on my site
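The two length rules above are easy to automate. A small sketch, using the 60- and 160-character limits from the checklist (the function name is made up for illustration):

```python
def check_meta(title, description):
    """Flag titles and meta descriptions that exceed the checklist limits."""
    issues = []
    if len(title) > 60:
        issues.append(f"title is {len(title)} chars (aim for ~60)")
    if len(description) > 160:
        issues.append(f"description is {len(description)} chars (max 160)")
    return issues

print(check_meta("Short title", "x" * 170))
# → ['description is 170 chars (max 160)']
```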

Inside the <body> code

  • Should have only one <H1> tag – content held in tag should be keyword rich
  • alt attributes should be provided for all content-related images. They should contain content that explains the image and uses targeted keywords
  • Ensure the keyword targeted by your page appears within the actual content of the page
  • Use targeted keywords in anchor text when pointing links to other internal pages
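A quick way to audit the first two rules above (a single H1, alt text on images) is a small parser. This sketch uses only the standard library and a made-up HTML fragment:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Count <h1> tags and <img> tags that lack alt text."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.imgs_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not dict(attrs).get("alt"):
            self.imgs_missing_alt += 1

audit = OnPageAudit()
audit.feed('<h1>Main</h1><h1>Second</h1><img src="a.png">')
print(audit.h1_count, audit.imgs_missing_alt)  # 2 1 (two H1s is one too many)
```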


  • Ensure content is built based on keyword research to match content to what users are searching for
  • Ensure a dedicated down-level experience exists to enhance discoverability of content held in rich media
  • Ensure as much written content as possible is kept out of rich media and images
  • Produce deep, content rich pages; be an authority for users by producing excellent content
  • Set a schedule and produce new content frequently
  • Be sure content is unique – don’t reuse content from other sources
  • Use 301 redirects to transfer value to new pages when content is moved or retired
  • Use the rel="canonical" link element to help engines understand which of a set of duplicated pages should be indexed and receive the attributed value
  • Use 404 error pages to help cleanse old pages from search engine indexes
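As a sketch of the 301/404 advice above, here is a toy redirect handler built on Python’s http.server. The paths are hypothetical, and a real site would normally configure redirects in its web server rather than in application code:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of retired URLs to their replacements.
MOVED = {"/old-guide": "/seo/new-guide"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in MOVED:
            self.send_response(301)  # permanent redirect: transfers value to the new page
            self.send_header("Location", MOVED[self.path])
            self.end_headers()
        else:
            self.send_response(404)  # helps engines cleanse the old page from their index
            self.end_headers()

# To run: HTTPServer(("", 8000), RedirectHandler).serve_forever()
```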


  • Plan for incoming & outgoing links – plan how to encourage users to link to content
  • Create a plan to manage internal links ensuring the engines see deeper content
  • Create a social media plan to promote content via social spaces
  • Be sure to manage anchor text to help define content inside the site
  • Avoid buying links in an attempt to influence search rankings

Convincing others of the importance and need for SEO

Convincing others of the importance of organic search and the need for SEO management has admittedly become easier over the years. Many have followed their competitors into SEO to keep pace, and many are coming to see that organic ROI online is often much better than that of traditional marketing: better targeted and easier to measure.

However, there remains a skeptical, more “old school” crowd, that still needs to be shown the need for organic search devotion. Whether you work for an agency trying to make a sale or on an in-house team tired of not drawing leads, taking a look at your Google Analytics data with this 10-minute drill down on what organic is doing for you can make a great impact.

Google Analytics has taken great strides in the last year or so by adding multi-channel attribution data to site reporting, in addition to maintaining the historical bare-bones data that can be quite convincing in itself. The additional data on referred visits shows that organic search may be playing more of a part in your site’s conversion success than previously thought.

What follows are some of the common questions and responses you might get as you’re painting the picture.

Why is Organic Search Important?

A very high-level look at analytics under Traffic Sources > Overview will show that, most of the time, sites retrieve the lion’s share of traffic from organic search. The only time this isn’t true is when you

  • Have the most established brand in the land and drive a lot of traffic through the direct channel.
  • Have an amazing social (or another type of) campaign rewarding you with a ton of traffic in the referring site channel.
  • Have robbed a bank to feed your paid search needs.

While you’re looking at the above diagram, take a look at last year. Has the percentage of traffic from organic search increased? Great, let’s keep it up. Has it gone down? Not so great, it’s time to reverse becoming a web dinosaur and revive your online presence.

Organic Doesn’t Convert, We Get Our Leads from Direct Traffic

Oh, really? This is why I so enjoy the advancement in multi-channel attribution.

With this data we can see how much impact the assistance of organic search has on other referring channels. Where the direct traffic channel was once thought to be the hero due to last-touch goal tracking, we can now see that many conversions that ultimately closed in the direct channel were initiated via organic search.

Yeah, Yeah, We’ve Built a Big Brand, That’s Why Organic Search Converts

With this in mind we can simply go into organic search reporting, assuming conversion goal or ecommerce tracking is in place, and see whether this response is really true.

Yes, branded search converts at a higher rate since the visitor was looking for you, but what power does non-branded organic search have beyond basic non-branded keyword filtered conversion reporting?

Once again, we step further into analytics and take a look at what wasn’t so apparent on the surface of keyword referral tracking. Look at the Multi-Channel Funnel section of Google Analytics. By assessing the top channel groupings assisting each other to conversions you may quickly see a picture such as this:

One of the top assisted paths is driven by organic search and revisited through the direct channel; here it contributed 200 conversions. See how much more convincing this becomes once we look at channel assistance?
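The gap between last-touch credit and assisted credit can be illustrated with a toy calculation over made-up conversion paths (each list is the ordered sequence of channel touches for one conversion; the numbers are invented for illustration, not taken from any report):

```python
# Hypothetical conversion paths: ordered channel touches, one list per conversion.
paths = [
    ["organic", "direct"],
    ["organic", "direct"],
    ["direct"],
    ["paid", "direct"],
]

last_touch = {}
assists = {}
for path in paths:
    last_touch[path[-1]] = last_touch.get(path[-1], 0) + 1
    for channel in set(path[:-1]):  # channels that assisted but didn't close
        assists[channel] = assists.get(channel, 0) + 1

print(last_touch)  # {'direct': 4}: direct looks like the hero...
print(assists)     # {'organic': 2, 'paid': 1}: ...but organic opened half the paths
```

Under last-touch counting, direct gets all four conversions; the assist view shows organic search initiated two of them, which is exactly the story multi-channel attribution tells.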

OK, But Branded Search is Probably What Helps Assist in Direct Traffic Conversions

Another great facet of digging deep into analytics to prove the case for organic search is customized channel grouping. We have the ability not only to see where organic search helps other referring channels convert, but also to create custom channels filtered by either branded or non-branded keywords.

The sub-headline above gets shot down pretty quickly when you see something like this:

OK, I’m Convinced!

Looks to me like the direct channel owes non-branded organic search a beer!

While trying to convince a skeptical party of the importance of organic search and SEO as a referrer of traffic, you can parse the data several different ways to prove your point.

The above is a 10-minute method to show from a high level to a very granular level what organic search brings to the table outright and also how it helps the site convert as a whole.

Top 10 ways to avoid Duplicate Content – Search engines can penalize a site’s ranking



Duplicate content

In a perfect world there would be only one version of every document, but in real life that is not the case. A great example is online versions of newspaper sites: in one form or another, they all publish exactly, or close to, the same information about events and facts.

This also applies to many other areas, say, recipes, fitness programs, diet programs, definitions, explanations and many others.

Search engines can penalize a site’s ranking if it looks as though content has been taken from another site. Yet such sites do sit in the index and even rank very well. How can this be? Does it mean there are no filters?

Filters exist, but they are in a primitive form

Search engines need lots of resources to check the entire internet. Therefore, engines use simple heuristics to uncover duplicate content.


The most common pattern is links. In general, engines check between two linked sites for duplicate pages or content. If duplicates exist, the engines try to get rid of one of them – usually the page that links to the source site.

How does this work?

If website B (a health news site) republishes an article from site A (a health research institute) and puts a link from B to A for reference, search engines understand that site A is the original source and that site B has copied it. Site B is seen as a duplicate-content website.

How do search engines penalize?

One or two articles like this won’t harm your site much if, for example, website B has lots of original content. However, if duplicate content fills a significant number of pages, the website can be penalized: moved down in the search result pages, moved to the supplemental index or even removed from the search index entirely.

Test Conducted

Website X had 5 pages of original content and ranked #1 for a very uncommon search term. The website had 100+ links pointing to it and a Google PageRank of PR4. After a few weeks, 30 pages of content from another site, which required a link back to it, were added. After Google indexed the website, it dropped from position #1 to roughly #67–#100.

The test continued with even more links being added, but even an extra 100 links (the site now had over 200) didn’t move it back to, or even above, those low #67–#100 positions.

After 3 months the duplicate pages were removed. Once the search engines re-indexed the website, it slowly moved back up to position #2.

What does it mean to you?

The idea is simple: don’t link to websites that have the same content as yours, especially if the duplicated material makes up a huge share of your total number of pages.

What should you do?

To put it simply: engines see each page as code, which includes the site-specific menu, HTML tags, layout, title, description and so on. All of this gets mixed together, so a republished article is less likely to be flagged as duplicate when it is only one part of a larger page. That’s the secret.
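To illustrate why only large-scale overlap tends to trip the filters, here is a toy near-duplicate check based on word “shingles” (overlapping 3-word sequences) and Jaccard similarity. Real engines use far more sophisticated methods, so this is purely an illustrative sketch with made-up sentences:

```python
def shingles(text, k=3):
    """Return the set of overlapping k-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

original = "search engines need lots of resources to check the entire internet"
copied = "search engines need lots of resources to scan the whole internet"
print(round(similarity(original, copied), 2))  # 0.38: partial, not total, overlap
```

Surrounding boilerplate such as menus and titles lowers this score further, which is one intuition for why an article embedded in a larger page is harder to flag.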

Some useful tips for Content Connection users

To gain even more protection from search engine filters, it is recommended to use one or more of these tips:

1) use articles on pages that have a solid size menu

2) try to have a unique intro and closing for used articles, change titles

3) use synonyms to replace words

4) insert comments in the middle of an article or links to detailed info

5) insert contextual short product reviews (1 picture plus one paragraph and link to full review will be just fine)

6) split longer articles into 2 pages or more, as long as 150–200 words remain on each page

7) rewrite articles into your own words

8) combine two or more articles into one

9) if there are bullet points – mix their order if possible

10) think outside the box…

Last tip

Don’t try to host the DMOZ directory, or part of it, just to add more pages to your website. Google can easily ban you for that and remove your entire site, as happened to a website about coffee that hosted 10,000 pages from DMOZ to drive an extra 500 visitors a day. Sad, but a fact. Duplicate content is evil. Be aware.

TimeStopping Social Media SEO Consulting

Google wants to transform the words that appear on a page into entities that mean something

Biological network analysis (Social Signals)

With the recent explosion of publicly available high throughput biological data, the analysis of molecular networks has gained significant interest. The type of analysis in this context is closely related to social network analysis, but often focusing on local patterns in the network. For example network motifs are small subgraphs that are over-represented in the network. Similarly, activity motifs are patterns in the attributes of nodes and edges in the network that are over-represented given the network structure.

PageRank is a link analysis algorithm, named after Larry Page and used by the Google Internet search engine, that assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of “measuring” its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is referred to as the PageRank of E and denoted by PR(E).

The name “PageRank” is a trademark of Google, and the PageRank process has been patented (U.S. Patent 6,285,999). However, the patent is assigned to Stanford University and not to Google. Google has exclusive license rights on the patent from Stanford University. The university received 1.8 million shares of Google in exchange for use of the patent; the shares were sold in 2005 for $336 million.


An anchor hyperlink is a link bound to a portion of a document—generally text, though not necessarily. For instance, it may also be a hot area in an image (image map in HTML), a designated, often irregular part of an image. One way to define it is by a list of coordinates that indicate its boundaries. For example, a political map of Africa may have each country hyperlinked to further information about that country. A separate invisible hot area interface allows for swapping skins or labels within the linked hot areas without repetitive embedding of links in the various skin elements.

Google Penguin is a code name for a Google algorithm update that was first announced on April 24, 2012. The update is aimed at decreasing search engine rankings of websites that violate Google’s Webmaster Guidelines  by using black-hat SEO techniques, such as keyword stuffing, cloaking, participating in link schemes, deliberate creation of duplicate content, and others.

Penguin’s effect on Google search results

By Google’s estimates, Penguin affects approximately 3.1% of search queries in English, about 3% of queries in languages like German, Chinese, and Arabic, and an even bigger percentage of them in “highly-spammed” languages. On May 25th, 2012, Google unveiled the latest Penguin update, called Penguin 1.1.  This update, according to Matt Cutts, was supposed to impact less than one-tenth of a percent of English searches. The guiding principle for the update was to penalise websites using manipulative techniques to achieve high rankings.


PageRank is a probability distribution used to represent the likelihood that a person randomly clicking on links will arrive at any particular page. PageRank can be calculated for collections of documents of any size. It is assumed in several research papers that the distribution is evenly divided among all documents in the collection at the beginning of the computational process. The PageRank computations require several passes, called “iterations”, through the collection to adjust approximate PageRank values to more closely reflect the theoretical true value.

A probability is expressed as a numeric value between 0 and 1. A 0.5 probability is commonly expressed as a “50% chance” of something happening. Hence, a PageRank of 0.5 means there is a 50% chance that a person clicking on a random link will be directed to the document with the 0.5 PageRank.

Simplified algorithm

Assume a small universe of four web pages: A, B, C and D. Links from a page to itself, or multiple outbound links from one single page to another single page, are ignored. PageRank is initialized to the same value for all pages. In the original form of PageRank, the sum of PageRank over all pages was the total number of pages on the web at that time, so each page in this example would have an initial PageRank of 1. However, later versions of PageRank, and the remainder of this section, assume a probability distribution between 0 and 1. Hence the initial value for each page is 0.25.

The PageRank transferred from a given page to the targets of its outbound links upon the next iteration is divided equally among all outbound links.

If the only links in the system were from pages B, C, and D to A, each link would transfer 0.25 PageRank to A upon the next iteration, for a total of 0.75.

PR(A) = PR(B) + PR(C) + PR(D)

Suppose instead that page B had a link to pages C and A, while page D had links to all three pages. Thus, upon the next iteration, page B would transfer half of its existing value, or 0.125, to page A and the other half, or 0.125, to page C. Since D had three outbound links, it would transfer one third of its existing value, or approximately 0.083, to A.

PR(A) = PR(B)/2 + PR(C)/1 + PR(D)/3

In other words, the PageRank conferred by an outbound link is equal to the document’s own PageRank score divided by the number of its outbound links, L(·).

PR(A) = PR(B)/L(B) + PR(C)/L(C) + PR(D)/L(D)

In the general case, the PageRank value for any page u can be expressed as:

PR(u) = Σ_{v ∈ B_u} PR(v)/L(v),

where B_u is the set of pages that link to u.
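The worked example above can be checked with one iteration of this rule. The sketch below uses the link structure from the text (B links to A and C, C links to A, D links to all three; A’s own outbound links are unspecified, so only a single step is shown):

```python
links = {"B": ["A", "C"], "C": ["A"], "D": ["A", "B", "C"]}
pr = {p: 0.25 for p in "ABCD"}  # initial PageRank: uniform over 4 pages

new_pr = {p: 0.0 for p in pr}
for page, outlinks in links.items():
    share = pr[page] / len(outlinks)  # PR split equally among outbound links
    for target in outlinks:
        new_pr[target] += share

print({p: round(v, 3) for p, v in new_pr.items()})
# → {'A': 0.458, 'B': 0.083, 'C': 0.208, 'D': 0.0}
# A receives 0.125 + 0.25 + 0.083, matching PR(B)/2 + PR(C)/1 + PR(D)/3.
```

The 0.125 from B and the roughly 0.083 from D are exactly the transfers described in the text.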