Responsive Design: Great for Web Sites, Hard for Mobile Devices

Responsive web design (RWD) is generally regarded, as CMSWire writer Tom Murphy noted today, as “just fine for now.” But a new survey from Trilibis, a provider of mobile development solutions, claims mobile performance on responsive design websites is often “unacceptable.” Four out of five RWD websites deliver suboptimal page weight and load time on mobile devices, according to the survey, “Responsive Web Design: Why Image Optimization Is Crucial for a Mobile-Friendly Customer Experience.”

The survey examined 155 responsive websites. Of these, Trilibis found only 21 percent loaded in less than four seconds on a smartphone. Nearly a third required between eight and 48 seconds to load, with equivalent connectivity.

The Culprit

The culprit: image size, the former bugaboo of websites before virtually everyone got high-speed access. Images were more than half of the overall page weight for 69 percent of the sampled sites. If the overall page weight was greater than one megabyte, the survey found, the load time became unacceptable. And 61 percent presented a home page that weighed at least one megabyte.

As might be expected from the company that conducted the survey, Trilibis claims it has a solution.

The remedy is its server-side processing solution called Snow, which performs image optimization with the targeted mobile device in mind. Average page weight among all sampled sites was 1.8 megabytes, which Snow reduced to 789 kilobytes for desktops, 470 kilobytes for tablets and 420 kilobytes for phones. Page load times were reduced by 28 percent to 54 percent.

Suntae Kim, Trilibis’ vice president of professional services, told CMSWire that Snow conducts a two-step image optimization process. First, there’s optimization of all images on a web page, creating and storing an optimized version for each of three device groups – desktop, tablet and mobile browser. Then, he said, “when a web page is requested by a device, Snow will detect the device and serve up” an image optimized for its device group.
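Trilibis has not published Snow’s internals, but the serve-time half of that two-step process can be sketched in a few lines of Node.js. Everything below is a conceptual stand-in, not Trilibis’ implementation: the Express framework, the regex-based detection and the file layout are all assumptions.

// A minimal sketch of server-side image selection (not Trilibis' actual code).
// It assumes optimized variants were generated ahead of time, one per device group.
const express = require('express');
const app = express();

// Hypothetical detection; real products use full device databases, not a regex.
function deviceGroup(userAgent) {
  if (/Mobi/i.test(userAgent)) return 'phone';
  if (/iPad|Tablet|Android/i.test(userAgent)) return 'tablet';
  return 'desktop';
}

app.get('/images/:name', (req, res) => {
  const group = deviceGroup(req.get('User-Agent') || '');
  // e.g. optimized/phone/hero.jpg, produced by the offline optimization pass
  res.sendFile(`${__dirname}/optimized/${group}/${req.params.name}`);
});

app.listen(3000);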

Trilibis: a 28 percent reduction in Starbucks load time with image optimization.

RWD provides one URL, one code base and one set of content, which is then rendered for a target device’s presentation and features through the use of fluid grids and cascading style sheets (CSS). Trilibis claims that CSS detects the screen width on the client side, but that Snow detects device characteristics before the content is sent. Additionally, Kim said, CSS reformats layout and scales the image on the device side – after the image has already been sent and caused any loading performance issues.
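The timing difference is easy to see from the client side. A check like the one below (window.matchMedia is the JavaScript interface to CSS media queries) can only run after the page, and its full-size images, have already been downloaded; the class name is hypothetical.

// Client-side detection: by the time this runs, the page and its full-weight
// images have already crossed the network, so the cost has already been paid.
if (window.matchMedia('(max-width: 480px)').matches) {
  document.documentElement.classList.add('phone-layout'); // CSS then rescales what was sent
}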

Adaptive Design

By comparison to responsive design, an adaptive design approach builds different sites for different devices – a smartphone site, for instance, in addition to a tablet site and a desktop/laptop site.

Responsive design has some notable downsides of its own. Because there is a single code base, any change ripples through every device version; building a responsive site is a time-consuming process; and older desktop browsers do not always support responsive design.

But adaptive design has a major SEO disadvantage. You need one URL for the different versions, which means a mobile device goes first to the desktop/laptop site and is then redirected to the mobile site – a likely performance hit. Otherwise, there would be separate URLs for essentially the same content, something that Google and other search engines are likely to treat as spam.

Ted Verani, senior vice president for sales and marketing at Trilibis, pointed out to us that his company’s survey “was only for responsive sites, because adaptive sites have already been optimized” for each of their targets.

He confirmed a growing sense among the web development community that there is “an explosive trend toward responsive web design,” because of the single code and content base, the consistent visual identity, and the SEO issue. Forrester Research has reported the same trend. But, Verani said, “the biggest issue is performance” on mobile devices.

‘Number One Offender’

“We’re saying do responsive design,” he added, “but add this [server-side optimization] to the mix.”

When asked how his company’s server-side processing differs from others’, he pointed to device detection, on-the-fly resizing and compression, and an all-in-one packaging, in contrast to the custom-programmed server-side processing many sites rely on.

Peter Sheldon, vice president and principal analyst at Forrester Research, has extensively followed the responsive and adaptive design space. He told us he fully agreed: “if you don’t do anything to plan for [too-large images on mobile devices when using responsive design], absolutely the download time can be horrifically slow.”



How To Rank Number 1 In Google Search [By Shamsudeen Adeshokan]

One of the most consistently popular search terms in Google is how to rank number 1 in Google search. I, too, carried out extensive research on how to get onto the first pages of the SERPs. I always thought that getting onto the first pages of search results was exclusively reserved for SEO experts. I never thought an average blogger like me could create a page that would hit the first pages of Google, Bing and Yahoo.

Many times over, I would run a Google search on how to rank number 1 in Google. The information presented was always complicated, and I could never figure out how the SEO experts did it.

I felt completely overwhelmed, confused and frustrated by my lack of free, targeted traffic. So began my quest to find out how Google works.

After much trial and error, I finally decoded some basic steps you can take to increase your chances of ranking higher in search result pages.

So I decided to write this article for beginning bloggers who want to understand the simple basics of building a blog that is friendly to both humans and search engines, so they can increase their chances of ranking high without having to know the nitty-gritty of search engine optimization.

What You Should Know.

There is a ton of information out there on how to rank #1 in Google, much of which will only turn you into an internet junkie: garbage that will leave you confused and completely paralyzed.

Much of what you’ll find online about ranking your pages on the first page of search results is meant for techies, not average bloggers like you and me.

In my opinion, there are only four important aspects you should focus all your energy and attention on when trying to hit the first page of Google or any other search engine.

Below is what I consider to matter most if you want your web pages to rank high in search results.

*Warning: There are lots of factors that determine how web pages are ranked, such as domain age, content quality, backlink quality, site speed, posting frequency and the search terms you’re targeting. What you’re reading now is meant to put your blog and content on the right track; it is not a guarantee that your web pages will hit the top spot after following my instructions.

#1. Submit Your Blog URL.

The first step towards search visibility is to submit your blog URL to Google and the other major search engines.

If your blog URL is not submitted to the search engines, you stand little chance of getting found.

Google will probably find you even if you don’t submit your blog URL, but it’s better to put your destiny in your own hands.

#2. Create a Sitemap for Your Blog.

Creating a sitemap is easy, and you don’t have to do it yourself. If you’re on WordPress, there are plugins that will create and submit your sitemap on your behalf each time you update your blog.

This is essential, as it allows Google’s crawlers to crawl your entire blog and index your pages properly. A sitemap tells Google how your site is structured, so the crawler knows its way in and out.

If you host videos, you’ll also need a video sitemap plugin that allows Google to index the videos on your blog.
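For reference, a sitemap is just an XML file in the standard sitemaps.org format, which those plugins generate for you. The URL and dates below are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/my-first-post/</loc>
    <lastmod>2014-01-15</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>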

#3. Blog Description.

This is one of the most important foundations of search visibility. If search engine crawlers can’t understand from one simple sentence what your blog is all about, you’re probably not going to appear anywhere in the results.

And if you do appear in a search results page by accident, it will be for the wrong keyword, which translates into a higher bounce rate on your blog.

The most important thing to do is to state, in one sentence and in a clearly visible space on your blog, exactly what problem your content solves.

In other words;

  • What are you writing about?
  • What category of audience are you targeting with your information?

You should be able to answer both questions in one good, concise statement. If you can’t, then you’re in serious trouble, and no amount of search engine optimization and keyword research combined will help you get onto the first page of the SERPs.

The downside is that if you don’t have your own blog description in place, search engines will pick a string of text from some page on your blog and attach it as your description.

That’s if you’re lucky and they find one; in the worst case, Google may choose not to display any meta description in your listing at all. Very bad for commercial purposes.

A good meta description should contain some of the keywords your company or blog is primarily targeting with your content. It makes it easier for people and search engines to quickly understand your content, services or products.
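In HTML terms, that one-sentence description lives in a meta tag in the head of your pages; the wording here is only an example:

<head>
  <meta name="description"
        content="Plain-English SEO tips that help beginner bloggers rank higher in Google, Bing and Yahoo.">
</head>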

#4. Write Quality Content.

Everything you’ve read above becomes completely useless, and a waste of time, energy and resources, if at the end of it all your content is nothing but gibberish.

The only factor that stands between you and success online is the value you deliver. Make sure that in every article you publish anywhere online, you give your best.

What is quality content?

I can’t categorically tell you what quality content is, but I assure you that when you create it, you’ll recognize it. Quality content brings these benefits along with it:

  1. It will spread like wildfire on social media sites.
  2. It will become a reference point.
  3. You will build backlinks in the process.
  4. Bloggers will start linking to it.
  5. It will rank higher in search engines due to the quality of links pointing to it.
  6. It will bring in new loyal readers.
  7. It will be relevant today, tomorrow and always.

So how do you create a quality article?

There are no right or wrong ways to produce a quality article. Don’t be surprised when an article you poured all your heart into doesn’t get a single social share, while something you wrote just to update your blog spreads like a virus.

It’s all about connections and effective promotion. But there are some things you can start with.

The thing you’ll need most in creating a quality article is the headline. Learn how to craft headlines that stand out from the crowd. I won’t dwell much on the writing itself, but never ignore the headline.

Nevertheless, here are seven time-tested and proven headline categories that compel users to take action and click the call-to-action button in your copy.

News Headline –

  • Italian airliner crashes at San Francisco.
  • Six win as WordPress ends guest blogging contest.
  • Just released – CommentLuv 3.1 WordPress plugin.

How to Headline –

  • How to write compelling articles.
  • How to rank #1 in Google search engine.
  • How to satisfy a woman in bed.

Question Asking Headline –

  • Short posts vs. long posts: which is better for SEO?
  • Do you commit these web design mistakes?
  • Who else wants to make $1,000 online working 2 hours weekly?

Direct headline –

  • Get 5% off GoDaddy hosting.
  • Get my money-making secret guide.
  • Download our 30-page SEO book now.

Command headline –

  • Click on the link now.
  • Subscribe to our newsletter.
  • 2014 Guest Blogging contest is out! Join now.

Benefits Headline –

  • Make $1,000 in two days working five minutes daily.
  • Lose 30 pounds in less than two weeks with Weight Loss Pro.
  • Rank your website number 1 with the Google Ranker 2.0 WordPress plugin.

Testimonial Headline –

  • How I ranked #1 in Google.
  • How a 20-year-old deaf and dumb man made $647,694 in 30 days.
  • How I lost 67 pounds in two weeks, and how you can too.

Start experimenting with each of the above headline types and see what does and doesn’t work for you. I trust you grasp the idea behind each one.

Optimizing your content for search visibility.

What you need to understand here is that it takes work, and lots of it, to rank your pages high in search results. I’m not saying you will rank #1 or even get to the first page of Google, but if you follow the steps outlined here, you will increase your chances of hitting the first page.

Why do I believe you can? Because these steps worked for me.

The key thing to take away here: before you write your next post, ask yourself these questions –

  • Who is the target audience for this information?
  • Why should I write for this audience?
  • What problem are they facing that my information needs to solve?
  • In what format should I present this information?

You’ll need to answer all of these questions in your content. Remember: write for humans first and search engines second.

When you’re done brainstorming on these criteria, it’s time to do some keyword research using Google Keyword Planner (formerly the Keyword Tool). That’s what I use.

Providing genuine answers to those questions will make it easier to build a list of keywords related to your content. For each keyword, you’ll want to find its global monthly search volume and market value.

You’ll want to build your content around keywords with a high global monthly search volume.

A Word of Advice Here.

Focus on low-competition keywords to increase your chances of ranking high for a term your target audience will actually use to find you. Many factors contribute to higher rankings; on-site SEO alone will not guarantee you get there.

Type your keyword phrase into Google search and see what it returns. If there are more than 100,000 results, the keyword is probably highly competitive.

To increase your chances, build your article around keywords that return fewer than 70,000 results.

Such a keyword may not send you massive amounts of traffic, but that is far better than a bunch of drive-by traffic that does nothing for your business.

What you want is free, targeted traffic; you only need to make the best of it. Quality matters most, not quantity.

Now it’s time to write your post and tweak it to include your target keyword. I suggest you write your post normally and then see where you can work your focus keyword into it.

Don’t stuff your article with the keyword in every paragraph; it doesn’t work that way anymore. I recommend including your focus keyword in the first paragraph of your post and, lastly, in the conclusion. Keyword density is no longer an important ranking factor.

Make sure you include your keyword phrase in the title tag and post title, and also in the meta description tag. This helps entice searchers to click through to your website.
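A minimal sketch of that placement, using a hypothetical focus keyword of “rank number 1 in Google”:

<head>
  <title>How to Rank Number 1 in Google: A Beginner's Guide</title>
  <meta name="description"
        content="Learn how to rank number 1 in Google with four simple steps any beginner blogger can follow.">
</head>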

The earlier you tell the spiders what your content is about, the better; that’s why you want your focus keyword in an early paragraph of your post. It also helps readers confirm they’re reading the right information.

All in all, write for humans first, then make sure the search spiders can find enough meat and potatoes on your blog when they come to crawl your pages.

Also remember that these spiders have no eyes yet, so it is best to give your images descriptive alt text. It helps the spiders know what type of information they’re crawling.
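Descriptive alt text is all it takes; the file name and description below are made up for illustration:

<img src="weight-loss-green-tea.jpg"
     alt="Cup of green tea beside a bathroom scale, illustrating green tea for weight loss">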

What next?

Now that you’ve created something worth telling family and friends about, it’s time to share it with your loved ones.

In fact, no one will ever notice your content exists if you don’t go out and tell them about it. Start by placing social sharing buttons for all the major social media sites (Facebook, Twitter, LinkedIn, Google+, Delicious, StumbleUpon, etc.) on your blog.

I assume you have already joined as many relevant blogger communities as possible. After hitting the publish button on your blog, visit the relevant Facebook groups and blogger communities and share your content there.

Promote, promote and promote until you have exhausted every marketing channel within your reach. This way your content will be seen by a larger targeted audience who can help spread the word about your blog.

I believe all I’ve said so far is just an intro. There is plenty left to be learned when it comes to search engine visibility and the theory of blog post optimization.

I will be grateful if you can share with me your invaluable experience through the comment box below.


Eating My (Key)Words: Changing The Way We Think About SEO [Jenny Halasz]


I have glimpsed the future of search, and it is not keyword-driven. While I have long been an advocate of using keywords as an indicator of a searcher’s intent, I am about to eat my words.

The truth is that search is heading in a direction that most of us could not have foreseen… a technically complex and varied amalgamation of platforms, devices, and inputs.

What I mean by this is that as the search engines become more focused on discovering user intent based on various elements that can be measured before a user even types anything — location, search history, mobility, circles, etc. — it becomes less important what that searcher actually typed in to access the SERP. And with the rise of conversational search touted in platforms like Google Now and Google Glass, the searcher may not even type at all.

Keywords In Real Life

Take a recent example of how Google Now works. You can ask, “Who is the President of the United States?” And the answer is displayed for you: it’s Barack Obama. You can then ask, “Who is his wife?” The answer is again displayed for you — it’s Michelle Obama — but let’s say you choose to click on or otherwise select a result on that page. Maybe it’s a listing of famous first ladies throughout history. As the site owner, the referring keyword you would see is [who is his wife]. That’s not useful to you, because you don’t know if [his] refers to Obama, Jefferson, or Washington.

This is the future of keyword-based referral, and it’s one reason of many that you shouldn’t be too upset about “Not Provided” going to 100%. (Unless, of course, you want to debate the politics of the issue, data sharing, and paying for data — then there’s plenty to get upset about.) But on the face of it, you’re not losing much in the way of useful customer data.

So Is SEO Finally Dead?

No, SEO is not dead. And neither are keywords. The truth is, we probably won’t see the standard search box disappear in our lifetime. But we will see a lot more variation in how people type keywords into the search box.

These changes make it absolutely essential to grow SEO into a slightly different concept. Gaming the system ended last year, for those of you who didn’t get the memo. Attempting to reverse-engineer the algorithm will get you in a lot of trouble.

But the future of SEO is all about optimization. In a sense, what’s old is new again — except that the way we think about the term [SEO] needs to be changed again. Matt Cutts said at SXSW last year that we should think of SEO as “Search Experience Optimization.” I’ll go one step further and suggest…

Subject Experience Optimization

Instead of thinking of “marketing,” which is defined as the act of promoting and selling products or services, SEOs need to be thinking about how they can deliver the best possible experience for their subject — the visitor.

What combination of elements do they need to present to make the user experience most optimal? What key actions do people want to take on the website, and how can we appeal to their base instincts and language with clear headlines and copy?

There’s a concept called “aboutness” that was developed by R. A. Fairthorne back in 1969, popularized by William John Hutchins in the mid-70s and more recently adopted by Shari Thurow, an industry expert whom you may be familiar with. Originally used for library and information science, in a marketing application, it refers to making what a page is “about” very clear to the user.

You can do that with well-chosen and labeled images, with keyword-based headlines, and with copy that clearly explains the purpose of the page. This is where keywords become essential, because until we can read minds, we have to be able to guess at the language that will compel users to take action — guess at first, and test to refine. This is not to be confused with “user experience optimization,” which goes deeper and is more detailed. But it does scratch the surface of what good SEOs should be thinking about.

Technical Optimization Is Here To Stay

The traditional definition of SEO, “Search Engine Optimization,” will always be relevant and necessary. This refers to how we optimize websites for search engines, and it includes everything from making sure search engine spiders can crawl pages to helping them understand complex content on pages with Schema markup. But there’s another, even more important, aspect of SEO that we will need to pay attention to.

The Rise Of Entity Search

Earlier this month at SMX East, there was a panel (the first of its kind, I believe) on Entity Search and its impact on the future of SEO. David Amerland began his part of the session by asking everyone to think of a tree. I immediately visualized a hierarchical diagram of nodes (the kind used in information architecture), but then realized he probably meant for people to visualize a leafy, living tree.


The rest of this exercise was particularly powerful for me because of what I had visualized at first. David said everyone probably saw something different based on where in the world they live, or what their understanding of a tree was. In my case, my mind was on entities and information architecture, which is why my “tree” was a diagram of nodes rather than a plant.

The power of this example is that it explains a universal truth in a very concrete way: language cannot indicate with certainty what someone is thinking.

If Google had known based on my location or the last photo I took or my search history that I was working on information architecture for a client during the last break or attending a search conference, then a search for [tree] might have resulted in something very different for me.

So as SEOs under my new definition (Subject Experience Optimization), we have a responsibility to clearly define and indicate our clients’ entities. Where are they located? What do they specialize in? How do those things form relationships to other entities? Some of this is familiar, like the concepts of “local search,” “authorship” and the “link graph,” but the whole is truly greater than the sum of its parts.
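One concrete way to declare those entity facts is Schema.org markup, shown here in JSON-LD form; the business details below are invented for illustration:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Marketing Co.",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Raleigh",
    "addressRegion": "NC"
  },
  "sameAs": [
    "https://plus.google.com/+ExampleMarketingCo",
    "https://www.facebook.com/ExampleMarketingCo"
  ]
}
</script>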

The future of SEO is not based on keywords, but rather on how those “keywords” form a relationship to an entity, a concept, or a target. For more on this, I strongly suggest reading Paul Bruemmer’s latest article on Entity Search.

As to what exact format that will take, or whether we’ll all soon be flocking to FreeBase in the way that we once killed DMOZ, I can’t say. But the future of search is coming quickly, and SEOs (whatever we define that as) will need to adapt rapidly to remain ahead of the pack.


synchronous vs. asynchronous tags – what’s the big deal?

If the web is a human body, JavaScript tags – the bits of code governing the execution of web pages – are like its nerve endings.  They are the means by which websites sense, respond, execute, measure, and remember.  They issue commands to load most page content and deliver nearly every advertising message.  Further, they serve as the gateways through which information is shared: across web pages; between different websites; and among the countless media companies, marketers, and service providers involved in creating a user’s web experience.
But all that activity comes with a cost.  Often, tag activity slows down pages, which can have a detrimental effect on user experience and the web operator’s bottom line.  Don’t believe me?  Ask these guys:

  • Amazon has reported that a 0.1 second delay in page load time can translate into a 1% drop in customer activity.
  • AOL observed a ~30% variance in page views/visit based on load time alone – slow pages drive users away.
  • Perhaps most importantly, page load time is an increasingly important factor in the Google and Bing search rankings, a critical consideration for any web-enabled business.

In this post, I aim to provide some valuable context and perspective to help you understand tag serving challenges, the dynamics at play, and potential solutions.  And if we’re lucky, you may even come away with some small talk fodder so you can look smart at your next tech or media industry cocktail party.

About The Problem

As any front-end performance guru will attest, JavaScript is the #1 cause of slow web pages.  Why?  Because it’s blocking.  That’s geek speak for “preventing everything else after it from happening.”  The impact of that blocking is illustrated in the waterfall graph below, showing the behind-the-scenes requests for all the assets needed on a given page.

The key takeaway is that while the JavaScript is loading, nothing else is happening.  This means that users are stuck waiting for that JavaScript to load, and other content won’t load until it’s downloaded and parsed.  Browsers render page elements, including tags, synchronously (meaning one page element can’t load until the one before it has).  In the case above, one JavaScript file held up the rendering of the page for nearly 1.5 seconds.  And as noted earlier, 1.5 seconds may not sound like much, but it can have an enormous impact on the bottom line.
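A stripped-down page makes the blocking behavior visible; slow.js here is a stand-in for any third-party tag:

<p>This paragraph renders immediately.</p>
<!-- The browser halts here until slow.js is downloaded, parsed and executed. -->
<script src="slow.js"></script>
<p>This paragraph does not appear until the script above has finished.</p>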

You’ve probably heard some chatter about the potential for asynchronous tags in mitigating the risk of JavaScript-related page slow down.  That’s basically a process by which JavaScript calls are parallelized, eliminating the blocking dynamic illustrated above.  There is potential there, particularly for JavaScript tags that call non-visible page elements, such as measurement or analytics tags.  For visible page elements, such as images, content, or ads, things get more complicated.  Let’s explore what synchronous and asynchronous really mean within the context of web browsers.

Pro tip: Free page-testing tools will analyze your own pages and generate snazzy waterfall graphs like this one, as well as give you other insight into what is slowing down your pages.
But wait, there’s more… The problem of synchronous calls creates challenges on the server as well.  For an introduction to the benefits of non-blocking, event-driven programming concepts, view Ryan Dahl’s introduction to node.js.

So, why do browsers rely on synchronous loading?  Shouldn’t they parallelize the calls?  If they loaded scripts asynchronously – making multiple requests simultaneously – wouldn’t they be able to prevent these kinds of roadblocks?

Oh, they’d love to.  The reason they can’t is an arcane JavaScript construct called document.write.  It’s a method for inserting something into a web page, such as a string of text or a tag that calls for some page element to be loaded (like an ad creative or an image).  It’s a handy tool and widely used, but there’s a catch:  document.write expects to alter the page inline, inserting content as the page is rendering.

This means that the use of document.write requires, almost without exception, that all page elements be loaded synchronously.

To illustrate, consider the following example of instructions given to the browser.  Here, we’re inserting the word “Nick” between two blocks of content on the page.

content that comes before,
<script>document.write("Nick");</script>
content that comes after

We’d expect the resulting output to look like:

[content that comes before], Nick, [content that comes after]

That’s precisely what document.write was designed for, the inline, ad hoc insertion of some content or page element.  But what if the document.write is in a remote script, one that calls an external source to provide an element that is being rendered?  Example:

content that comes before,
<script src="remote.js"></script>
content that comes after

And remote.js contains:

document.write("Remote Nick");

Still not a problem, as long as the browser blocks and waits for remote.js to come back.  If executed synchronously (or, sequentially), these instructions would result in the following:

[content that comes before], Remote Nick, [content that comes after]

However, if we execute asynchronously (or, non-sequentially), it will take a while for that remote.js round trip to complete, and we might end up with something that looks like this:

[content that comes before], [content that comes after], Remote Nick

This isn’t what we wanted at all.  The browser loaded elements as they were returned and, given the nature of the document.write construct, when remote.js finally responded, it inserted ‘Remote Nick’ in the wrong place on the page.

Now, imagine that ‘Remote Nick,’ instead of being a couple of random words, was actually an advertisement that got mangled as the page loaded.  That’s something with real dollars-and-cents implications for web publishers and the advertisers that help keep the lights on.  Ultimately, the potential risks web operators face are too great, and the use of document.write too pervasive, to permit browser-level asynchronous loading.  Unless the browser is guaranteed that a remote script does not contain document.write, it must block and wait for the return to ensure that the web page renders properly.

So, if document.write is so terrible, why don’t we just get rid of it altogether?  A lot of people would love to.  It slows down script execution, it complicates asynchronous loading, and it can even completely blank the page in older browsers.  While document.write is considered bad practice, and there are viable alternatives out there (e.g., element creation, innerHTML, and script-loading libraries), it’s deeply rooted in the Internet’s infrastructure, especially in the advertising technology ecosystem.  Until the entire ad stack – from ad servers, to RTB platforms, all the way to each and every ad creative – can guarantee that document.write won’t be used, browsers are stuck blocking.  In short, document.write is not going away anytime soon.
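The element-creation alternative is worth a quick look, since it is the pattern most analytics snippets use to sidestep blocking; remote.js is the same placeholder as above:

// Dynamic script injection: the file downloads in parallel and never blocks
// rendering, but any document.write inside remote.js would misbehave.
var s = document.createElement('script');
s.src = 'remote.js';
s.async = true;
document.getElementsByTagName('head')[0].appendChild(s);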

Pro tip:  When you load JavaScript synchronously from a third party, what happens to your site when that third party is down, or simply very, very slow?  You’ve probably guessed by now: it blocks the rest of your content from loading.  If you’re running a third party’s tag synchronously, you are well advised to ask questions about the quality and speed of their infrastructure.

When it comes to synchronous loading, their performance is your performance.



Use of the async attribute in a <script> tag.
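For scripts that are known to contain no document.write, HTML offers a declarative version of the same non-blocking idea; the file names below are placeholders:

<!-- async: fetch in parallel, execute as soon as it arrives (order not guaranteed) -->
<script async src="analytics.js"></script>

<!-- defer: fetch in parallel, execute in document order after parsing completes -->
<script defer src="widgets.js"></script>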

Google Hummingbird: The New Google Algorithm and What You Need To Know [By: Shell Robshaw-Bryan]

For the past few weeks I’d seen a pattern repeating itself across the board with most of the clients I do SEO and content marketing for: small drops in Page Authority and Domain Authority for every domain I was monitoring. I also observed some big leaps up the SERPs for content that was previously ranking poorly.

Despite suspicions that change was afoot, Google remained unusually quiet, and while speculation was rife amongst those of us involved in SEO on a daily basis, nothing had been confirmed. That was until 26th September 2013, when Google came clean and admitted that the new algorithm had been up and running for the past month.


Panda and Penguin were updates which changed part of the algorithm, but Hummingbird has replaced the old algorithm and it’s the biggest change in 3 years. It’s not just a major update or refresh, it’s an entirely new ranking algorithm.

This latest news comes hot on the heels of Google’s announcement that, in future, all searches will be secure and, as such, keyword data will no longer be available in Google Analytics. Not only this, but many website owners have spent the last few months dealing with the effects of the major Penguin refresh which hit earlier this year and had far-reaching effects, making ‘bad’ SEO not just unsuccessful, but ensuring guilty websites were actively penalised.

Hummingbird aims to deliver results which are precise and fast

Whilst specifics are still somewhat patchy, Google has confirmed that Hummingbird focuses on ranking information based on more intelligent and naturalistic search requests. In short, Google is getting smarter and is now better able to understand the relationships and relevance of words and phrases, instead of just considering a bunch of individual words.

Google Hummingbird At A Glance

  • Many of the existing rules and weightings still apply, so don’t stop doing what you are doing if your activities are based on Penguin pleasing, sustainable and ethical content focused techniques
  • A sizeable 90% of all searches are likely to be affected by Hummingbird, though the full extent and reach of its effects is currently unknown
  • This is known as semantic search: more naturalistic or ‘conversational’ search terms (which tend to be long-tail in nature) are now more important than ever
  • Google still wants to return the most relevant, accurate and useful search results to its users; Hummingbird provides a more sophisticated means for Google to deliver this
  • There is now less emphasis on individual keywords and more emphasis on their collective (semantic) meaning
  • PageRank remains an active ranking signal and Google claims that there is nothing massively different that SEOs need to be doing or worrying about


If you’ve not noticed any significant changes in the last month, then it looks like you’ve escaped unscathed. Some of the effects we’ve seen have been small, however, and could easily be missed, including small losses in Domain Authority and drops down the SERPs for some previously high-ranking content, while other, less obvious content has risen up.

For some time now, the emphasis has been on providing useful, high-quality content on websites and blogs and on optimising content towards long-tail keywords. Hummingbird simply means that future SEO activities will be more focused on longer, semantic search terms. In real terms, for those who have already adapted their content marketing and SEO following the Penguin update earlier this year, very little is likely to change.