Google's Penguin update grabbed the majority of online marketing headlines over the past two weeks with Matt Cutts, hyperbole, and fear motivating many SEO practitioners to expect the worst before the worst even materialized.
Now, post-Penguin 2.0 launch, even though there's sufficient evidence to justify some limited panic, we find that the majority of folks managed to survive the challenges of Penguin and, in fact, saw little or no impact.
So what's an SEO practitioner to fear given the current overkill on Penguin-focused headlines?
To provide a little fodder, and cause for discussion, here are my slightly warped predictions (and solutions) for possible SEO challenges to plan for or against!
1. The 1-2 Content Punch
Prediction: A focus on more facets of duplicate content beyond simple pattern or simhash matching. Search engines will look within sites to better identify similarities between content on algorithmically similar topics (not just the content itself) to assess unique value beyond simply unique text. This will especially affect sites with user-generated 'story' content, geo-based structures and/or large product datasets.
Solution: Site owners should conduct a comprehensive site content audit, consolidating anything that looks, feels or 'tastes' similar, and ensure the consolidated page is properly optimized and that 301 redirects point from the old URLs to the updated one (a rough sketch of simhash-style matching follows below).
Short term: Review and catalog site content around themes.
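To make the 'simhash matching' idea concrete, here's a minimal, illustrative Python sketch of how near-duplicate pages can be fingerprinted and compared during that content audit. The sample page text and the 64-bit fingerprint size are my own assumptions for illustration, not anything the search engines have published.

import hashlib
import re

def token_hash(token, bits=64):
    # Hash a token to a fixed-width integer using a truncated MD5 digest.
    digest = hashlib.md5(token.encode("utf-8")).hexdigest()
    return int(digest, 16) & ((1 << bits) - 1)

def simhash(text, bits=64):
    # Build a simhash fingerprint from the page's word tokens.
    tokens = re.findall(r"\w+", text.lower())
    vector = [0] * bits
    for token in tokens:
        h = token_hash(token, bits)
        for i in range(bits):
            vector[i] += 1 if (h >> i) & 1 else -1  # +1 if bit i is set, -1 otherwise
    fingerprint = 0
    for i, weight in enumerate(vector):
        if weight > 0:
            fingerprint |= 1 << i
    return fingerprint

def hamming_distance(a, b):
    # Number of differing bits between two fingerprints.
    return bin(a ^ b).count("1")

# Hypothetical near-duplicate geo/product pages (illustrative text only).
page_a = "Blue widgets for sale in Springfield. Free shipping on all widget orders."
page_b = "Blue widgets for sale in Shelbyville. Free shipping on all widget orders."
print(hamming_distance(simhash(page_a), simhash(page_b)))  # small distance suggests near-duplicates

Pages whose fingerprints sit within a few bits of each other are the consolidation (and 301 redirect) candidates the audit above is looking for.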
2. The SEO Bandaid
Prediction: Search engines will be able to recognize temporary fixes that are in place for search engines only (not ultimately for users). There will be no 'quick fixes' or placeholder bandaids, only permanent, user-satisfaction-focused design and development initiatives.
Solution: Site owners will no longer be able to 'kick the can down the road', forcing user-centric research, production and implementation of best-practice SEO (which becomes better-practice usability and engagement). Onsite content, platforms and delivery mechanisms will align with consumer intent and context to satisfy user needs, opportunities and personalized experiences.
Short term: Investigate responsive design.
3. The Broken Flipper Focus
Prediction: Sites with overly diverse subject matter will face visibility issues as search engines seek more focused expertise and thought leadership on specific (and more limited) topics. Sites that seek to be 'one-stop shops' will face challenges as smaller niche sites gain traction based on clearer messaging, authorship authority and user signals of satisfaction. Search queries will be guided towards topic experts, not aggregated, faceless sites.
Solution: Site owners will need to assess site topics and build silos of expertise driven by expert personalities. As search engines seek to associate known identities with known entities (such as articles, content, sites, topic authorities, learning institutions, etc.), an identity's digital footprint will support topic expertise and allow site owners to gain the 'flipper focus' needed to propel a site's visibility in search results.
Short term: Identify topic authorities, implement authorship, and plan content around their expertise.
4. The Penguin in the Headlights Paralysis
Prediction: Site owners, marketing executives and other stakeholders will get caught in the headlights of massive real-time datasets and attempt to blindly react to SEO on a daily basis without truly having enough relevant data to make the right decisions. Search engines will read too many, too-frequent changes as a manipulation signal and will penalize sites that make these reactionary updates, based on the search engine's understanding of the cause (visibility drop, warning, etc.).
Solution: Site owners should treat the greater availability and real-time immediacy of data as a means to an end, not as a guarantee of better decisions. Smart webmasters will leverage real-time data to support or inform historical and trend datasets, allowing them to consider all likely scenarios for a "least imperfect" SEO strategy.
Short term: Catalog data sources, identify overlap, assign value, and then define short, medium and longer term KPIs.
5. The Pain in the Backend
Prediction: Search engines will introduce tighter definitions of a "great" user experience around site performance, speed and site availability. Google Webmaster Tools and/or Google Analytics will expand their reporting to display server data such as calculated server load, server response times, page-element load times, consecutive requests and other page, site and server performance metrics (similar to Google's current site speed insights / plugin / service).
Solution: Site owners should use Google's current toolset to help identify and resolve site speed and performance issues. They should monitor Google Webmaster Tools and Google Analytics performance data frequently and leverage site pinging tools (a simple example follows the short-term note below) to ensure site availability and content delivery speeds are optimal for a great user experience. For large sites, load balancing, cloud services, geo-based or CDN delivery should be considered and constantly tested to confirm users' experience remains within acceptable (or better) parameters.
Short term: Use Google's PageSpeed Insights to analyze individual URLs. Dig into Google Analytics under Content > Site Speed and leverage data insights to identify performance challenges.
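While waiting for richer server-side reporting from Google, even a small script can approximate the 'site pinging' mentioned above. Here's a minimal Python sketch that fetches a handful of URLs and reports HTTP status and response time; the URL list is a hypothetical placeholder, and client-side wall-clock timing like this is only a rough proxy, not a substitute for proper server-side or real-user monitoring.

import time
import urllib.request

# Hypothetical pages to watch; swap in your own key URLs.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
]

def check(url, timeout=10.0):
    # Fetch the URL and report HTTP status plus wall-clock response time.
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            elapsed = time.time() - start
            print("%s -> HTTP %s in %.2fs" % (url, response.status, elapsed))
    except Exception as exc:
        print("%s -> FAILED (%s)" % (url, exc))

if __name__ == "__main__":
    for url in URLS:
        check(url)

Run on a schedule (cron, for instance), the output gives a simple availability and response-time trail to compare against the Site Speed data in Google Analytics.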
Penguin 2.0 frenzy may have (just) passed, but many more updates are pending or lurking behind the nearest ice floe.
"Recovery" is not an inevitable goal, rather it is both an avoidable scenario and a mitigable situation through planning and consideration of the possible SEO challenges born from future 'Penguin' style updates.