Google Social Search, Now With Google Buzz

Google Social Search, which shows content from those in your social circle, is now tapping into a new source of knowing what your friends are writing about, sharing and creating: Google Buzz.

Remind Me: What’s Google Social Search?

For those unfamiliar with Google Social Search, it launched across all of Google in January, after initially rolling out as an experiment last October.

When you’re signed in to Google, the service shows matching web pages and other content on the topic of your search created by your friends or by others Google has determined you’re “connected” to.

For example, on a search at Google for JetBlue, this is an example of the social search results I’ve personally seen:

The two items listed, one from our SMX search engine marketing conference series and the other from 10e20 (now Blueglass), appear for me because Google’s decided they’re connected to me.

If you don’t see Google social search results when signed in and doing a search:

- Google might not feel there are relevant social results to show to you
- You might not have much of a social circle, which can be fixed with some of the methods described below
- Google just might be having a glitch, which is hitting me today

You can also “force” Google to show social results by doing a search, then selecting the “More Search Tools” drop-down button (see Meet The New Google Look & Its Colorful, Useful “Search Options” Column about this) and choosing “Social” from the “All results” section.

And Google Finds My Social Circle How?

How does Google determine my social network? My previous article, Google Social Search Launches, Gives Results From Your Trusted Social Circle, is an in-depth look at this. In short, Google examines:

- Google Reader: if you have a Google Reader account, any content such as blogs that you subscribe to is considered part of your circle
- Google Chat: anyone you’ve enabled to chat with is considered part of your social circle
- Google Contacts: anyone you’ve classified as friends, family or coworkers is part of your circle
- Google Profile: anyone’s content you’ve associated with yourself via your profile is examined to locate people to add to your circle

The last part, how your Google Profile is used to find friends and social connections, is the hardest to understand. It’s also key to today’s change.

Google Profiles allow you to associate yourself with content you’ve created across the web. (To understand more about Google Profiles, see our previous article, Hoping To Improve People Search, Google Launches Profile Results).

For example, here’s how I’ve added links to my content to my own Google Profile:

From these links, Google can figure out who some of my friends are. For instance, take the link to my Twitter account:

- I link to my Twitter account from my Google Profile
- Google reads who my friends are from my Twitter account (this is public information)
- Now I might get tweets from my friends showing in my social search results

Consider further:

- One of my friends links to their blog from their Twitter profile (again, public information)
- Google now understands that my friend has that blog
- Now I might get blog posts from that friend in my social search results

And even further:

- The friend’s blog has a link to their Flickr account (once again, public info)
- Now I might get links to their Flickr content in my social search results

All this is possible simply because I’ve linked to my Twitter account from my Google Profile. From that, Google can follow “social links” to content my friends have created across the web.
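To make the mechanics concrete, here’s a minimal sketch of that kind of transitive “social link” discovery, modeled as a breadth-first walk over public profile links. This is purely illustrative, not Google’s actual implementation; the fetch_public_links function is a hypothetical stand-in for whatever reads the public links off a profile page:

```python
from collections import deque

def expand_social_circle(profile_links, fetch_public_links, max_hops=3):
    """Follow public 'social links' outward from a user's profile.

    profile_links: URLs listed on the Google Profile (the seed set).
    fetch_public_links: hypothetical callable returning the public
    outbound links for a URL (e.g. friends listed on a Twitter page,
    a blog linked from a bio, a Flickr account linked from a blog).
    """
    seen = set(profile_links)
    queue = deque((url, 0) for url in profile_links)
    circle = []
    while queue:
        url, hops = queue.popleft()
        circle.append(url)
        if hops >= max_hops:
            continue  # bound how far the chain of public links is followed
        for linked in fetch_public_links(url):
            if linked not in seen:
                seen.add(linked)
                queue.append((linked, hops + 1))
    return circle
```

Each extra hop corresponds to one step in the chain above: Twitter friends first, then their blogs, then the Flickr accounts those blogs link to.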

From Google Profiles To Google Buzz

The problem is, Google says, that not everyone has fully pimped their Google Profile pages to “connect” to content that in turn would connect them with friends. For example, maybe they haven’t linked their Twitter account to their profile. That means Google can’t “see” as well who their friends are.

This is where Google Buzz comes in. Since that service launched, people have had an entirely new way to connect content to themselves, through their Google Buzz accounts. Here’s an example from my account:

You can see that I’ve connected six sites to my Buzz account, and there’s an option to connect more.

When I spoke with them today, Google told me that many people have more sites connected to their Buzz account than to their Google Profile. So Google is now using Buzz’s “Connected Sites” feature to help determine who is in your social circle for social search results.

The stupid thing is that you have to depend on Google Buzz making a best guess about the sites you want connected with your Google Buzz profile, which in turn draws on what you’ve listed on your Google Profile page — the same page that Google says people don’t seem to use that much.

Want to add something that’s not listed in Google Buzz connected sites? You have to go to your Google Profile, add a link to it there, then hope that Buzz decides to allow you to make it a connected site (and in my experience, this is very hit and miss).

Google Buzz, of course, has struggled with privacy issues. Is this move a new cause for concern? It shouldn’t be. Only you will see your social search results, when you’re logged in to Google. No one else sees these. And everything you see is already out on the public web in the first place.

Each person also gets their own unique social results. You can also see exactly who is in your social circle through new features that rolled out in January (see Google Social Search Goes Live, Adds New Features for more about this).

Hey, what about the rumored “Google Me” social service from Google? Is that going to be used? Google had no comment, even on whether such a service exists. But if it turns out to be an overhaul of Google Profiles and/or Google Buzz, yes, I’d expect it eventually to get integrated into Google Social Search.

How To Find Your Most Popular Posts

It’s the end of the year and a lot of blogs are now putting out their “top posts of the year” posts. But how does one find out which are the top posts on their blog?

Here are a few ways you can find out which posts are tops on your blog.

- Number of Comments – One easy way to see if a post is popular or not is to look at the number of comments. The more comments, the more engaged people were with the post. The downside here is that a lot of people may read the post, but only a few may actually comment.
- Number of Trackbacks – How many other blogs are referencing your blog post? If you have a lot of trackbacks, that means your post could have a far-reaching impact across the blogosphere. However, the quality of those trackbacks is an important thing to check too.
- Social Traffic – How many Diggs did your post get? Does it have a lot of stumbles? How well the post is doing on social media sites can be another great indicator of whether the post is popular or not. Again, it all depends on the quality of traffic. StumbleUpon can send a quick 200+ people, but what if they all gave it a thumbs down?
- Analytics – Analytics is a great way to see which posts are getting the most traffic. Analytics encompasses traffic from other websites, blogs, social media and search engines. It’s probably the best indicator and the easiest one to pull results from.
- Post Plugins – You can also get plugins such as Post Ratings, where users can rate the quality of a post, or you can download one of the popular-posts plugins, which analyze things like comments, trackbacks and other indicators they feel are necessary to surface popular posts.
- Use a 3rd Party Service – Sites like PostRank work to analyze each post and come up with a popularity score. The nice thing here is the other service does all the work for you.

Any one of the items above can help a blogger find out which posts are the most popular on their blogs and help create one of those “Top posts of the year” posts.
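If you want to fold several of these signals into one ranking instead of eyeballing each list separately, a crude weighted score is plenty for a year-end roundup. Here’s a minimal sketch in Python; the field names and weights are my own arbitrary assumptions, not a standard formula:

```python
def popularity_score(post, w_comments=1.0, w_trackbacks=2.0,
                     w_social=0.5, w_pageviews=0.01):
    """Blend engagement signals into a single rough popularity score."""
    return (w_comments * post.get("comments", 0)
            + w_trackbacks * post.get("trackbacks", 0)
            + w_social * post.get("social_shares", 0)
            + w_pageviews * post.get("pageviews", 0))

# Per-post counts pulled from your blog platform and analytics (made up here).
posts = [
    {"title": "Post A", "comments": 42, "trackbacks": 5,
     "social_shares": 120, "pageviews": 9000},
    {"title": "Post B", "comments": 7, "trackbacks": 1,
     "social_shares": 300, "pageviews": 2500},
]
top_posts = sorted(posts, key=popularity_score, reverse=True)
print([p["title"] for p in top_posts])
```

Tune the weights toward whichever signal you trust most; if analytics is your best indicator, weight pageviews up.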

What other way have you found to find the most popular posts?

Bing Deep Links Issue to Blame for 'Mysterious' Yahoo Links

A Parkersburg News & Sentinel blog post featured a reader’s question that stumped the author: “Why does a link for a fiery accident on I-77 from 2011 pop up when I do a simple Yahoo search for ‘The Marietta Times’?” Turns out there is an easy answer.

Bing's Deep Links to Blame

Rather than settle for a dissatisfying conclusion of “I don’t know,” Search Engine Watch contacted Duane Forrester, Bing’s senior product manager, to find out why two of the eight featured Deep Links pointed users to rather odd locations – one to a story about a fatal fiery car crash from April 2011 and the other to an obituary from February.

“This is a known issue – essentially, there are instances when ‘spiky’ terms can appear in Deep Links,” Forrester said. “Believe it or not, those two items were very popular on that site. We’re looking at ways to sort this in the very near future.”

So what are Deep Links? Bing says it assigns Deep Links to established and “authoritative” websites, with the goal of “exposing the most popular deeper content for trusted sites.”

Basically, they are Bing’s equivalent of Google sitelinks – a group of eight links located beneath the search result featuring the homepage link, URL, and description (in this case The Marietta Times) that point users toward popular sections within the same website (e.g., Obituaries, Local News, Local Sports, etc.), as seen here:

As a comparison, here’s what Google shows:

This can also be seen in a search for [Parkersburg News and Sentinel], where Bullets Hit Car, House is one of the eight Deep Links featured along with Obituaries, News, Sports, Jobs, and so on:

As a comparison, here’s what Google shows:

Wait, Bing? Isn’t This a Yahoo Search Problem?

For those unfamiliar, Yahoo no longer serves its own organic search results. Bing began serving the algorithmic results that Yahoo searchers see starting in August 2010 in the U.S., as part of the Microsoft-Yahoo search deal signed in 2009.

So yes, Yahoo shows the same search results:

And note all the way at the bottom of Yahoo’s search results the “Powered by Bing” notation. Indeed, humans don’t control search engine links, but Bing does control Yahoo’s search results:

How to Edit Deep Links on Bing

Is Bing surfacing an odd Deep Link for your website? Would you rather another section or page be featured instead? There’s a solution for you (and these news sites) if you don’t want to wait on Bing's fix.

To manage your Deep Links, sign into your Bing Webmaster Tools account (create an account if you haven’t already). Select your site, click the Index tab, and click on Deep Links. From there, you can block (or unblock) a deep link, or change the order of Deep Links by using the Weighting Preference option, giving high or low preference to links.

Changes won’t take effect instantly. Bing notes it may take “some time” before you see any change.


CEMPER’s New Power*Trust & Daily Refreshed Trust & Power Metrics: What You Need to Know

Link Research Tools, probably the most advanced SEO and link building tool, built by CEMPER.COM, has scored again! With the new metrics CEMPER Trust and CEMPER Power, they make obsolete a lot of other link metrics you may know and still use. The new “Power*Trust keyword cloud” is again a whole new way to visualize backlinks, and the Power*Trust metric is said to become the only metric to base link decisions on.

Google Penguin Asks for Better Links and Better Tools

With the recent Google Penguin update shaking SEOs around the world, it became apparent that we need better tools to identify the links we should let go and those we should pursue. LRT (short for Link Research Tools) has been at the cutting edge so many times, and once again shows us new ways to do our SEO jobs better and faster.


You can watch the video below from Christoph C. Cemper, or just read on.


CEMPER Power & CEMPER Trust

They introduced no less than seven (7!) metrics to measure the Trust and Power of a page, a sub-domain and the top-level domain. As you can see below, these metrics have color bars that allow you to quickly get a feeling for every single link’s quality. As usual, these metrics can be sorted, cascade-sorted and of course filtered.

This means you can go in and find all those links with neither power nor trust (on the page level) and get rid of them. Obviously, these are often links from public hosting platforms, embedded in content that has no value, neither for users nor for SEOs.

CEMPER Power*Trust

The final fun is Power*Trust, the product of the two metrics Trust and Power (each numeric, 0-10), yielding a total strength metric that runs from 0 to 100 (in theory). Guess what: I had a hard time finding links stronger than 70 by Power*Trust.
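The arithmetic is as simple as it sounds; here is a tiny illustration in Python, with made-up scores:

```python
def power_trust(power, trust):
    """Power*Trust: the product of two 0-10 scores, giving a 0-100 scale."""
    if not (0 <= power <= 10 and 0 <= trust <= 10):
        raise ValueError("both scores are expected on a 0-10 scale")
    return power * trust

print(power_trust(8, 9))   # 72 - a rare, very strong link
print(power_trust(10, 2))  # 20 - plenty of power, little trust
```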

It appears they have found THE single metric to judge link quality by, as the diagram from the Competitive Landscape Analyzer (CLA), another tool in the toolkit, shows. It compares ten losers of the Penguin update (their links’ Power*Trust quality in green bars) with one winner (in orange).

I think that picture says it all: Power*Trust measures what a winner’s links appear to be made of!

All of these 7 powerful metrics for Link Trust, Link Power and Link Power*Trust are available in all (currently 16) tools of the LRT.

SEO Keyword-Cloud by Power*Trust

If that weren’t enough, they also just launched what they call “Power*Trust Keyword Clouds.”

Think of the keyword tag cloud as you know it from blogrolls and add high-tech SEO data to it: enter Power*Trust clouds.

That’s the difference between “wanting to rank” (keyword cloud picture one below) and “ranking” (keyword cloud picture two).

Keyword clouds as we know them draw the keyword with the biggest number of links biggest… ouch.

Weighted by Power*Trust, you see that the strongest links don’t pass any anchor relevancy!

PageRank, ACRank and SEOmoz mozRank: Obsolete Metrics?

According to their release information, all of the above metrics are obsolete now and were removed or hidden in a “legacy” package for “nostalgic users.”

Quick Mode

Another great new feature that helps everyone is the option to run all tools in “Quick Mode,” which reduces the set of metrics from 77 to a dozen important ones, giving you an execution time of only a few seconds compared to a couple of minutes. That is a huge time saver!

Final Thoughts

Voted #5 among the best link building tools a while ago, the Link Research Tools should advance further up with this change, I think. We have some previous reviews of the Link Research Tools covering several new features, like Link Velocity and the famous Quick Backlinks tool (QBL), that you may also want to check out.

Oh, and a pretty cool understatement: in a byline they also mentioned that they now support analysis of Pinterest data for your links. :-) This feature could have been a “major” release elsewhere.

If you haven’t seen or used the Link Research Tools before, now is probably the best time to do so.

Let me know what you think about these changes in the comments below!

Search Market Share: Google Up, Bing Flat, Yahoo Hits New Low

Financial analysts are starting to release search market share data for May, ahead of the official publication of those numbers by comScore tomorrow. If accurate, the data we saw reflect another monthly decline for Yahoo and AOL; Bing was flat, while Google gained.

Here are the numbers:

- Google: 66.7 percent in May (vs. 66.5 percent in April)
- Bing: 15.4 percent (vs. 15.4 percent in April and 14.1 percent a year ago)
- Yahoo: 13.4 percent (vs. 13.5 percent in April and 15.9 percent a year ago)
- Ask: 3 percent (vs. 3 percent in April)
- AOL: 1.5 percent (vs. 1.6 percent in April)

These figures don’t reflect mobile search queries or market share. In mobile, Google enjoys roughly a 95 percent share of the US market.

The numbers above represent the 9th consecutive monthly decline for Yahoo’s search market share and a new low.

Below is a graphical representation of US search market share comparing May 2012 vs May 2011.

Postscript: The official numbers, confirming the above, have just been released by comScore.

Source: comScore, May 2012

Poll: How Do You Measure Business Blogging Success?

I recently completed developing and recording 2 modules for a business blogging program with the DMA. One of the most interesting portions (in my opinion) of the material addressed measuring blog success. There’s no one right answer, because the purpose of business blogs can vary from branding to sales.

Since bloggers often read other blogs, I’m counting on Online Marketing Blog readers to take this quick poll to identify what the most common goals for business blogging are. There should be enough options to address whatever purpose you have for your blog. You can pick up to 3 choices.

Pick your top 3 measures of success for business blogging

- Engagement: comments, links (36%, 65 Votes)
- Improved brand recognition (31%, 56 Votes)
- Build thought leadership (31%, 55 Votes)
- Search engine rankings (31%, 55 Votes)
- Better communicate with customers (30%, 53 Votes)
- Traffic to the blog (27%, 49 Votes)
- Coverage by media and other blogs (18%, 32 Votes)
- Traffic to the corporate web site (16%, 28 Votes)
- Sales leads (16%, 28 Votes)
- Industry Recognition (13%, 23 Votes)
- Sell products (12%, 22 Votes)
- Improved customer satisfaction (11%, 20 Votes)
- Page views (9%, 16 Votes)
- Time on Site (6%, 10 Votes)
- Ad revenue on the blog (5%, 9 Votes)

Total Voters: 179


A bit of Online Marketing Blog trivia: This is our 51st poll!

Google Plans for 3D Cities, Trekker View & Offline Maps for Android

While Google's Street View service is undoubtedly useful and one most people online have used, enjoyed and benefitted from, for the firm itself it has caused many headaches, as it has faced numerous lawsuits and privacy scandals over its Wi-Fi sniffing capabilities.

However, this has done little to deter the firm after it unveiled a series of new plans to further increase the scope and coverage of its mapping tools, including flying planes over cities to create entire 3D maps of the buildings below.

"By the end of the year we aim to have 3D coverage for metropolitan areas with a combined population of 300 million people," it said in a blog post.

You have been warned.

As if photographing our homes and then taking to the skies wasn't enough, though, Google also announced a portable version of its photography equipment that can be worn when walking or skiing, so no terrain too remote or arduous is off-limits – perhaps inspired by the April Fool's Google Street Roo?

"There's a whole wilderness out there that is only accessible by foot. Trekker solves that problem [why is that a problem?] by enabling us to photograph beautiful places such as the Grand Canyon so anyone can explore them," it added.

In some more worthwhile news, the firm also said that it will make its Map tool available to download to Android devices so users can access information offline.

"Users will be able to take maps offline from more than 100 countries. This means that the next time you are on the subway, or don't have a data connection, you can still use our maps," it explained.

This could be particularly useful if you're trekking through a remote location with one of Google's cameras mounted on your back, perhaps.

This post originally appeared on V3.


How to Choose Web Hosting Service: Customer Reviews

Your choice of website hosting service is crucial: it’s the foundation of your website’s success. A web host delivers your web site to the world. If you make a bad choice, you risk getting into trouble:

- Uptime: Apart from obvious damage to your website’s performance, frequent crashes often negatively affect SEO. Google won’t rank your website high if it is often unavailable or down (see tip #2 in the list of ways to increase Google’s crawl rate).
- Security: Improperly managed web hosts are often attacked by hackers, and your website may end up labeled as one “that may harm your computer” (which dramatically decreases click-through as well as damaging your web resource’s reputation).

Now, there are plenty of web hosting services available and here’s a basic checklist I often turn to when choosing one for myself or my client:

- My project budget (how much money am I able to spend on web hosting?)
- Service reputation (I perform a few Google searches to check what people say about the service provider)
- Basic features (no more than I need for the project; extra features I will never need won’t sway me toward one particular provider)
- (Very important) 24/7 customer support (I usually contact them prior to subscribing with a few basic (stupid?) questions to see how responsive and helpful they are)

To decide whether a particular service provider complies with the very basic requirements listed above, I usually turn to customer reviews. To properly search for “real” customer testimonials, I use the following tricks:

1. Google “Reviews”, “Forums” and Date Search

Before Google introduced its sidebar search tools, I used search operators such as inurl:forums or intitle:review (and the like), in combination with a date range within advanced search, to find what people say about a service. Now, with the sidebar options, it is even easier.
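If you still want the operator approach, the queries look roughly like this (the host name is just a placeholder, not a recommendation):

```
examplehost review inurl:forums
examplehost intitle:review
```

Combine either query with the date range option to keep the reviews recent.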

2. Hosting Comparison Services

There are a few helpful services that let you compare various hosts and packages as well as look through user reviews. WebHostingGeeks.com is one of those: it compares web hosting services in multiple categories (free web hosting, dedicated server hosting, VPS hosting, etc.) in a handy table containing:

- Web hosting provider
- Basic features (space, traffic, price)
- Bonus features
- Reviews rating

If you click the link in the last column, you will be taken to the whole list of independent reviews about the chosen hosting provider and package:

3. Twitter Search

One cool hack that I often use to find negative reviews on Twitter is the :( search. Just add it to the search query and you will get a list of dissatisfied customers tweeting about their poor experience:
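The query itself is trivial; for example (again, the host name is a placeholder):

```
examplehost :(
examplehost down :(
```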

And how do you decide if the hosting provider is worth a try?

Google Does Away With “Sponsored Links” Label, Now Ads Are Labeled “Ads”

A month ago, Google began testing labeling AdWords ads as “ads.” Previously, Google had labeled those AdWords ads as “sponsored links,” which is how they had been named for as long as I can remember.

Today, it seems everyone should now be able to see the AdWords listings labeled as “ads” as opposed to “sponsored links.” Google has yet to confirm this, but based on my tests and asking around, the AdWords listings appear to be labeled “ads” for all searchers. Google did, however, confirm the test a month ago. I’ll update this post with additional details as I get them.

Here is how the ads look now:

Here is the old “sponsored links” label:

I should note that Gmail ads are still labeled “sponsored links,” as are ads on other properties.

Hat tip to Ben Edelman for spotting this.

Postscript: Google has confirmed this rollout. The rollout is only on English language domains for now, but will roll out to additional languages in the future. A Google spokesperson said:

Yes, I can confirm this rollout. We are always experimenting with the look and feel of our search result pages, including the delivery of relevant advertising. This is on English language domains now and rolling out to all languages and domains.

See Your Impact, The Non-Profit Rand Fishkin Supports is Amazing

We contacted Rand Fishkin (@Randfish), CEO of SEOmoz, about the non-profit he supports, and I personally found the organization very inspiring. Please read on and learn more about it.

Also, Rand is currently at SMX Advanced so if you get a chance try to meet him. He is an outstanding guy.

See Your Impact – What They Do

See Your Impact helps make recurring donations more scalable, accessible and rewarding for those who give. It’s a platform that’s been highly successful in helping donors feel connected to the people they help, and likewise, gives a voice and a record to those who receive and use the donations.

How did you get introduced to them?

Their CEO and founder, Digvijay, was connected to me through several other folks in the Seattle startup world.

Why are you passionate about this particular cause, and this organization specifically?

I love scalability. I love how technology can improve the impact that an organization or an idea will have by leveraging the power of the web’s reach.

Hence, SeeYourImpact is a great fit for me – it’s not specific to any one cause, but about making the web work to power communication by people who should be talking – those who donate and those who receive assistance.

How do you support them?

We’ve donated some corporately, and I have personally as well, but these contributions are relatively small. Much of our help has come through contributing on their advisory board, working on their website and helping them build better-performing channels, particularly with social media.

What’s the best way for others to get involved?

Just visit their site – www.seeyourimpact.org – find a story that speaks to you, and sign up. It’s powerful to see how fast you can make a difference for lives around the world.

I Took Rand’s Advice…

I checked out See Your Impact and learned some very interesting things about the organization that I really liked:

- 100% of donations go to the gift you select.
- The gift options are broad: put a child through school for a month, buy a girl a month of computer training, buy food, provide vitamins, buy mosquito nets, give 50 children the chance to see a doctor…
- Contributors get to see how their donations have helped others.
- There are many inspirational stories about the people who have been helped by the contributions.

I want to thank Rand for taking the time to answer our questions and also for introducing us to such a fantastic organization.

Matt Cutts On Penalties Vs. Algorithm Changes, A Disavow-This-Link Tool & More

Is it a penalty? Or is it just a change to Google’s algorithm? That’s been one of the hot topics in search marketing in recent months thanks to the Panda and Penguin updates, and it was one of the topics of discussion tonight at our SMX Advanced conference in Seattle.

During the annual “You & A with Matt Cutts” keynote session, Google’s web spam chief told Search Engine Land Editor-In-Chief Danny Sullivan that Google’s definition of a “penalty” is when manual action is taken against a site — and that Google doesn’t use the term “penalty” as much as they say “manual action.” Cutts went on to say that neither Panda nor Penguin are penalties; they’re both algorithm updates.

He also mentioned — and this will be good news to many search marketers — that Google is considering offering a tool that allows webmasters to disavow certain links, though that may be months away, if it happens at all.

Other topics included why some spam reports aren’t acted on, whether Google+ and +1 votes are a strong SEO signal right now and much more. We’ll have separate coverage of those topics in future articles, but for now you can read my full (and largely unedited) live blog below.

********

We’re just moments away from our annual “You & A with Matt Cutts” keynote at SMX Advanced in Seattle. The room is packed like sardines in a can and, with all of the recent Panda and Penguin news buzzing around the search marketing industry, this conversation should be interesting, to say the least.

Search Engine Land’s Editor-In-Chief Danny Sullivan will be handling host duties, and I’ll do my best to keep up with the discussion below. So, stay tuned, hit your Refresh button every few minutes if you want, and follow along with all of us here in Seattle.

So we’re actually starting out with that hysterical video by Sam Applegate in which Matt Cutts explains how to rank number one on Google:

Danny and Matt have arrived to a penguin-filled stage and we’re getting started. And Matt has just thrown one of the stuffed penguins right at me, nearly taking my head off. But he missed, which is proof that he’s better at fighting web spam than at throwing stuffed penguins.

Danny: What’s the deal with Penguin? Is it a penalty?

Matt: We look at it as something designed to tackle low-quality content. It started out with Panda, and then we noticed that there was still a lot of spam and Penguin was designed to tackle that. It’s an algorithmic change, but when we use a word like “penalty,” we’re talking about a manual action taken by the web spam team — it wasn’t that.

We don’t think of it as a penalty. We think of it as, “We have over 200 signals, and this is one of the signals.”

DS: So from now, does “penalty” mean it’s a human thing?

MC: That’s pretty much how we look at it. In fact, we don’t use the word “penalty” much, we refer to things as a “manual action.” Part of the reason why we do that breakdown is, how transparent can we be? We do monthly updates where we talk about changes, and in the past year, we’ve been more transparent about times when we take manual action. We send out alerts via Google Webmaster Tools.

DS: Did you just do another Penguin update?

MC: No.

Danny references the WPMU story and Matt says that the site recovered due to the data refreshes and algorithmic tweaks.

DS: Now we hear a lot of people talking about “negative SEO.”

MC: The story of this year has been more transparency, but we’re also trying to be better about enforcing our quality guidelines. People have asked questions about negative SEO for a long time. Our guidelines used to say it’s nearly impossible to do that, but there have been cases where that’s happened, so we changed the wording on that part of our guidelines.

Some have suggested that Google could disavow links. Even though we put in a lot of protection against negative SEO, there’s been so much talk about that that we’re talking about being able to enable that, maybe in a month or two or three.

DS: asks about different types of links

MC: We’ve done a good job of ignoring boilerplate, site wide links. In the last few months, we’ve been trying to make the point that not only is link buying like that not doing any good, we’re turning the dial up to let people know that certain link spam techniques are a waste of money.

DS: asks about messaging.

MC: If you roll out a new algorithm, it can affect millions of sites. It’s not practical to notify website owners when you have 500 algo changes every year, but we can notify when there’s been manual action against a specific site.

One thing I’d like to clear up — the news earlier this year about 700,000 warnings. The vast majority of those were because we started sending out messages even for cases of very obvious black hat techniques. So now we’re completely transparent with the warnings we send. Typically your website ranking will drop if you don’t take action after you get one of those warnings.

DS: Anything new related to paid links?

MC: We’re always working on improving our tools. Some of the tools that we built, for example, to spot blog networks, can also be used to spot link buying. People sometimes think they can buy links without a footprint, but you don’t know about the person on the other side. People need to realize that, as we build up new tools, paid links becomes a higher risk endeavor. We’ve said it for years, but we’re starting to enforce it more.

I believe, if you asked any SEO whether SEO is harder now than 5-6 years ago, they’d say it’s a little more challenging. You can expect that to increase. Google is getting more serious about buying and selling links. Penguin showed that some stuff that may work short term won’t work in the long term.

DS: Affiliate links. Do people need to run around and nofollow all that?

MC: If it’s a large enough affiliate network, we know about it and recognize it. But yes, I would recommend nofollowing affiliate links. (That’s a paraphrase! Not an exact quote – sorry.)

DS: Do links still work, or are social signals gonna replace them?

MC: Douglas Adams wrote “Space is big. You have no idea how big space is.” The web is like that. Library of Congress, the biggest library in the world, has 235 terabytes of data. That’s not very big compared to the way the web grows.

The actual percentage of nofollow links on the web is a single-digit percentage, and it’s a pretty small percentage. To say that links are a dead signal is wrong. I wouldn’t write the epitaph for links just yet.

DS: You do these 30-day challenges, like “I’m gonna use Bing for 30 days.”

MC: I have not done that one, and I’m afraid to try! (huge laughter from audience – Matt then says he’s joking and compliments Bing team)

Danny challenges Matt and Google to do something to see the web from an SEOs shoes, and says that SEOs should try to see things from Matt’s perspective, too.

DS: What’s up with your war on SEOs? (laughter) Or is it a war on spam?

MC: It’s a war on spam. If you go on the black hat forums, there’s a lot of people asking, How do I fake sincerity? How do I fake being awesome? Why not just be sincere and be awesome? We’re trying to stop spam so people can compete on a level playing field. I think our philosophy has been relatively consistent.

DS: What about tweets earlier today about using bounce rate? You don’t look at how quickly someone bounces from a search result and back to Google?

MC: Webspam doesn’t use Google Analytics. I asked again before this conference and was told, No, Google does not use analytics in its rankings.

And now we’re going to audience questions.

DS: What percent of organic queries are now secure?

MC: The launch was a little backwards, because we didn’t want to talk about being able to search over different corpi/corpuses. It was a single percentage of traffic in the US, and then we rolled it out internationally.

I think it’s still a minority of the traffic now, but there’s things like Firefox adding SSL search in the browser. There’s a lot of things aimed at helping users with privacy. I recognize that’s not good for marketers, but we have to put users first. We feel like moving toward SSL, moving toward encrypted, is the right long-term plan.

DS: (reading audience question) How come WordPress didn’t get penalized with all the blogs that have WordPress links in their footer?

MC: If you look at the volume of those links, most of them are from quality sites. WPMU had a pretty good number of links from lower quality sites.

DS: How come AdWords isn’t being blocked from keyword referrals?

MC: If we did that, every advertiser would do an exact match for every phrase and then the ad database would grow exponentially. He adds that he wishes Google had reconsidered that decision, though.

(I missed the next question.)

Matt explains that the web spam team has been working together with search quality people and other groups. He’s using this to further explain the difference between a penalty and an algorithm adjustment.

DS: So we have positive ranking factors and negative ranking factors?

MC: Yes.

DS: asks question about rich snippet spam

MC: Used to be that people wondered why it was so hard to get rich snippets; now it’s the other way around. We’re looking at ways to handle the abuse. (I missed the exact quote, but he said something about maybe removing the ability for a domain to have rich snippets if there’s abuse.)

DS: asks question about link removing after getting an alert in Webmaster Tools

MC: We want to see an earnest effort to remove the links. When you do a reconsideration request, we’ll look at some of the links and see “how much progress have they made?” We’ve talked about the idea of adding a disavow-this-link tool.

DS: What if you can’t get rid of bad links pointing to a page, should we get rid of the page?

MC: If it’s not an important page, you could. Or you could at least document the effort to remove the links and share it with us.

DS: What percent of spam reports does your team take action on?

MC: We have a good list of leads ourselves. We’ve shut down tens of thousands, maybe hundreds of thousands, of domains involved in link buying. When you get a spam report, you want to take action, but it may not be as high impact as doing something about one of our own leads. We use a factor of four: we multiply the potential impact by four, and if it still shows up near the bottom of the list, we may not take action on it.

DS: asks question about Google+ and SEO

MC: When we look at +1, we’ve found it’s not necessarily the best quality signal right now.

DS: You have to be on Google+ if you want to rank well in Google.

MC: No!!!! It’s still early days on how valuable the Google+ data will be.

DS: Why’d you call it Penguin, by the way?

MC: For Panda, there’s an engineer named Panda. For Penguin, we thought the codename might give away too much about how it works, so we let the engineer pick a name.

DS: If you were hit by Panda and Penguin, should we just give up? (audience roars with laughter)

MC: Sometimes you should. It’s possible to recover, but if you’re a fly-by-night spammer, it might be better to start over.

DS: What’s the deal on paid inclusion? Is it coming to web search?

MC: You call it paid inclusion, but it’s a separately labeled box and it’s not in web ranking. Google’s take on paid inclusion is when you take money and don’t disclose it. Google’s web rankings remain just as pure as they were 10 years ago. We have more stuff around the edges, that’s true, but that stuff is helpful. Matt mentions using Google Flight Search to book his trip here to Seattle. “You can’t buy higher rankings. That hasn’t changed. I don’t expect it to change.”

DS: Mentions that some people have been really mean to Matt recently.

MC: I’ve had a lot of people yell at me over the years. I’ve developed a thick skin. People aren’t striking out because they’re vicious, they’re striking out because they’re hurt or they believe Google isn’t doing the right thing. You want to listen to that. Some of our best launches have come from some of the most passionate criticism.

DS: What are you most excited about right now in search?

MC: I like some of the stuff we’re doing that hasn’t launched yet. I do like the Knowledge Graph a lot. I’m really excited that we’re pushing for more transparency. If you’d told me 10 years ago that we’re going to tell every spammer when we catch them, I would’ve said you were crazy.

And with that, we’re done. Thanks for tuning in!

Ten Search Marketing Awards You Should Know

“A person will work for a living, but they’ll die for recognition.” I’m not sure who I heard that from first, but it’s just as true for agencies and companies as it is for individuals. One common way to recognize excellence is through awards. There are awards for just about every industry from software to design to public relations. What about search marketing?

Awards are like lists. They’re valuable partly because of who they include, but mostly because of who they exclude. The motivation for organizations to run award programs varies greatly, from being a source of revenue via entry and sponsorship fees, to seeking to advance the industry by recognizing its finest, to something in between. Many SEM awards focus on paid search and attract large agencies in that space. Others offer a variety of categories.

Here are 10 awards opportunities for in-house and agency professionals in the Search Engine Marketing industry.


DMA International ECHO Awards > Search Marketing Category
The Search Marketing Award recognizes the most creative and strategic use of Internet search technology to achieve a direct marketing objective. Includes Search engine optimization (SEO) and pay-per-click (PPC) advertising.

- Entry fee: $225 to $350 depending on date of entry
- Note: (Disclosure: TopRank has been a judge for several years)
- Next call for entries: Approximately April 2010

OMMA Awards > Online Advertising Creativity Category > Search Marketing
OMMA Awards recognize the year’s best ads, promotions, campaigns and websites in online media, marketing and advertising, with 28 award categories for Online Advertising Creativity, one of which is Search Marketing.

- Entry fee: $195 per single ad/execution entry and $325 per campaign or website entry
- Note: Award site is in Flash; no dedicated page for SEM you can point to.
- Next call for entries: Approximately August 2010

IAB MIXX Awards > Search Marketing Category
As part of Advertising Week, the MIXX Awards program states that it is the only international interactive awards competition judged by an all-star panel representing the entire interactive advertising ecosystem: brand marketers with direct control over many of the largest advertising budgets in the country, major media company executives and advertising agency experts who create campaigns for the world’s most powerful brands.

- Entry fee: $295 per campaign, which includes entry into one category; the entry fee is $150 for each additional category.
- Note: There are two phases, screening and finals.
- Next call for entries: Approximately July 2010

ad:tech Awards > Search Marketing Category
For more than a decade, the ad:tech awards program has recognized talented visual and technology designers who demonstrate excellence in interactive marketing with submissions in the following categories: Interactive Ads, Interactive Campaigns, Optimization/Search Strategy and Web Sites.

- Entry fee: $255.00/category for each ad or campaign
- Note: Award site is in Flash; no dedicated page for SEM you can point to.
- Next call for entries: Approximately January 2010

PROMO Interactive Marketing Awards > Search Marketing Category
The program honors the best and brightest in effective interactive marketing and recognizes the valuable role that interactive tactics play in motivating consumer response and creating strong, exciting brands.

- Entry fee: $200 per entry
- Note: For this magazine-sponsored award program, you have to register to see the winners, and register again to see the webinar announcing the winners.
- Next call for entries: January 1, 2010

Econsultancy Innovation Awards > PPC & SEO Categories
New in 2009, the Innovation Awards are a natural progression from our commitment to recognizing innovation in the industry, as demonstrated by our regularly updated Innovation Report and a chance to receive acclaim as an innovator, be recognized by your industry peers and stand out from the crowd.

- Entry fee: $195
- Note: While this is a new awards program, Econsultancy has a community of 80,000 members worldwide.
- Next call for entries: Deadline 23 October 2009

Search Engine Watch Awards > Various Search Marketing Categories
The mission of the SEW Awards is to recognize excellence, as well as inspire innovation and encourage new ideas in search marketing. The SEW Awards honors 14 outstanding search marketers, search engines and technology providers.

- Entry fee: $145 per entry
- Note: (Disclosure: TopRank was a judge this year)
- Next call for entries: Approximately July 2010

Yahoo! Searchlight Award > Search Marketing
The Yahoo! Searchlight Award represents Yahoo!’s commitment to the best and most creative search advertising ideas and executions recognizing advertising agencies that develop search marketing applications outside of the tried and true direct response mindset.

- Entry fee: ?
- Note: It doesn’t say a Yahoo paid search campaign is required, but it’s probably a great idea to include one in your submission.
- Next call for entries: Early December 2009

ClickZ Marketing Excellence Awards > Search Ad Management
ClickZ Marketing Excellence Awards recognize the technologies and companies that made a positive difference in the online marketing industry.

- Entry fee: $49 per nomination
- Note: These awards are not for campaigns, but rather for the technologies that enable best-of-breed online marketing execution in the areas of paid search, analytics, email, mobile and social media.
- Next call for entries: Approximately March to April 2010

Business Marketing Association Pro-Comm > Search/Blog/Online Mindshare Campaign Category
Pro-Comm is ranked as one of the advertising industry’s premier award programs, drawing hundreds of entries annually from b-to-b marketing agencies and clients from all around the U.S.

- Entry fee: $150 – $225, depending on early-bird rate and member/non-member status
- Note: This award is specifically for BtoB marketing.
- Next call for entries: Approximately March 2010

There are also various regional awards programs from marketing-related associations, such as the EIMA (Excellence in Interactive Marketing Awards) run by the Dallas-Ft. Worth Interactive Marketing Association, plus other kinds of awards such as the SEMMYs (TopRank is a judge), which recognize the top search marketing blog posts each year, and the Marketing Pilgrim Search Engine Marketing Scholarship (TopRank is a judge), which is a contest to create and promote quality SEM content.

Promotion World and a few other similar sites promote top SEO/SEM awards, but judging isn't done by a panel of industry veterans as with the other awards programs listed above, and there's no third-party scrutiny or detailed judging information.

Have we missed any? What influential Search Marketing Awards should we add to this list? Do you have experience with any of the above awards programs? Good or bad, our readers would love to learn more.

Web Directory Submission Danger: Analysis of 2,678 Directories Shows 20% Penalized/Banned by Google

Hi, my name's Kurtis and I'm relatively new here at Moz. My official title is "Captain of Special Projects," which means I spend a lot of time browsing strange parts of the web, assembling metrics and inputting data in Google Docs/Excel. If you walk past my desk in the Mozplex, be warned: investigating webspam is on my task list, so you may come away slightly traumatized by what you see. I ward off the demons by taking care of two cats and fondly remembering my days as a semi-professional scoundrel in Minnesota.

Let's move on to my first public project, which came about after Google deindexed several directories a few weeks ago. This event left us wondering if there was a rhyme to their reason. So we decided to do some intensive data collection of our own and try to figure out what was really going on.

We gathered a total of 2,678 directories from lists like Val Web Design, SEOTIPSY.com, and SEOmoz's own directory list (just the web directories were used), plus a few others, and the search for clues began. Out of the 2,678 directories, only 94 were banned – not too shabby. However, there were 417 additional directories that had avoided being banned, but had been penalized.

We define banned as having no results in Google when a site:domain.com search is performed:

We define penalized as meaning the directory did not show up for highly obvious queries, such as its title tag or brand name, or showed up only deep in the results (and that this could be repeated for any internal pages on the site as well):

As you can see above, the directory itself is nowhere to be found despite the exact title query, yet it's clearly still indexed (as you can see below by performing a domain name match query):
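Putting the two definitions together, the classification logic we applied by hand looks roughly like the sketch below. The search function is a hypothetical stand-in for however you collect result URLs (a rank-tracking API, manual checks, and so on); note that scraping Google directly violates its terms of service:

```python
def classify_directory(domain, brand_query, search):
    """Rough banned/penalized/ok check per the definitions above.

    search(query) -> list of result URLs; a hypothetical stand-in.
    """
    if not search(f"site:{domain}"):
        return "banned"            # no results at all for site:domain.com
    branded = search(brand_query)  # e.g. the site's exact title tag text
    if not any(domain in url for url in branded):
        return "penalized"         # indexed, yet invisible for its own name
    return "ok"
```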

At first, the data for the banned directories had one common trait: none of them had visible toolbar PageRank. For the most part, this initial observation was fairly accurate, but as we pressed on, the results became more sporadic. This leads me to believe that it may have been a manual update rather than an algorithmic one, or at least that no particular public metrics/patterns are clear from the directories that suffered a penalization/ban.

That is not to say the ones left unharmed are safe from a future algorithmic update. In fact, I suspect this update was intended to serve as a warning; Google will be cracking down on directories. Why? In my own humble opinion, most of the classic, "built-for-SEO-and-links" directories do not provide any benefit to users, falling under the category of non-content spam.

Some directories and link resource lists are likely going to be valuable and useful long term (e.g. CSS Beauty's collection of great designs, the Craft Site Directory or Public Legal's list of legal resources). These are obviously not in the same world as those "SEO directories" and thus probably don't deserve the same classification despite the nomenclature overlap.

Updated Directory List!

In the midst of the panic, a concerned individual brought to my attention that “half of our directories were deindexed” and wanted to know when we would be updating our list. If by half he meant 4 of the 228 we listed were banned and an additional 4 just penalized, then I’d have to agree. ;-) In any case, our list is now updated. Thanks for being patient!

Let's look at the data

We've set up two spreadsheets that show which directories were banned and/or penalized, plus a bit of data about each one. Please feel free to check them out for yourself.

SEOmoz Directory List

Directory Maximizer, Val Web Design, & SEOTIPSY Directory List

Additional Data Analysis

Given the size and scope of the data available, we're hoping that lots of you can jump in and perform your own analysis on these directories, and possibly find some other interesting correlations. As the process for checking for banning/penalization is very tedious and cumbersome, we likely won't be doing an analysis on this scale again in the very near future. But we may revisit it again in 6-12 months to see if things have changed and Google's cracking down more, letting some of the penalties/bans be lifted or making any other notable moves.

Changes were made to the list on Friday, June 1, 2012.

I look forward to your feedback and suggestions in the comments!

p.s. The Mozscape metrics (PA, DA, mozRank, etc) are from index 51, which rolled out at the start of May. Our new index, which was just released earlier today, will have more updated and possibly more useful/interesting data. If I have the chance, I'll try to update the public spreadsheets using those numbers.

Google Places Is Over, Company Makes Google+ The Center Of Gravity For Local Search

When Google+ and Google+ Pages for business were introduced a little less than a year ago many people in the local search arena began anticipating the day when Google would merge or integrate Google Places and Google+ Pages. Well, today is that day.

Google Places pages have been entirely replaced by new Google+ Local pages. As of this morning, roughly 80 million Google Places pages worldwide have been automatically converted into 80 million Google+ Local pages, according to Google’s Marissa Mayer. It’s a dramatic change (for the better), though it will undoubtedly disorient some users and business owners.

(See our related Google+ specific coverage, New Google+ Local Tab Unveiled, Will Replace Google Places, at Marketing Land.)

A Range Of Changes Implemented

Here’s a brief overview of what’s new and what’s changing:

- The substitution of the new Google+ Local pages (as mentioned) for Google Places pages
- The appearance of a “Local” tab within Google+
- The integration and free availability of Zagat reviews (its entire archive across categories)
- The integration of Google+ Local pages across Google properties (search, Maps, mobile)
- Integration of a circles filter to find reviews/recommendations from friends/family/colleagues

Static Places now give way to more dynamic Google+ Local pages. Google’s star ratings are also being replaced by the Zagat 30-point rating scale (for user reviews as well).

Below is an example SERP for “burgers near Seattle.” The top screenshot reflects the “old” Places look and feel. The second is the new search results, sans stars.

Marissa Mayer argued to me that Zagat scores can express much more differentiation and nuance because they contain separate scores for food, service and atmosphere, versus a five-star scale, which is forced to factor all those considerations into a single rating (read: Yelp). The greater, 30-point spread also prevents everything from converging at 3.5 stars.

Consistent Experience, Several Doorways

Users will be able to discover the new Google+ Local pages in several ways: through a search on Google.com or Google Maps, in mobile apps or through a search on Google+. The image below is an example of a local search result within Google+.

As a result, Google+ becomes another local search destination within Google, arguably with richer content and more functionality than Google.com offers at the SERP level.

Not unlike some similar functionality offered in Foursquare, users will be able to sort and filter search results by several criteria, including “your circles,” which will reveal places “touched” by friends. Currently this means reviews and posts, but could extend to check-ins later.

Google had originally hoped to make Places into interactive content pages that merchants would use regularly to communicate with customers and prospects. However, that didn’t happen, in part because of the limitations of Places pages themselves. Google+ Local pages are much more versatile and “social.” Indeed, this gives Google a local vehicle with functionality equivalent to Facebook and Twitter.

Below is a Places/+ Local “before” and “after” comparison for a restaurant in the Washington DC area, “Mio.”

Google+ Local pages are much more visually interesting. They also enable the presentation of a wider variety of information types than Google Places allowed. They will permit local merchants to develop followers and message them, and to have the kinds of social interactions now available on Facebook and Twitter.

Google says there will be many more merchant features to come, in a post on the Google and Your Business Blog (formerly the Google Small Business Blog):

We know many of you have already created a Google+ Page for your business, and have been hosting hangouts and sharing photos, videos and posts. We’re excited that we’ll soon extend these social experiences to more Google+ Local pages in the weeks and months ahead.

Below is another example Google+ Local profile page. The design and functionality essentially match but seek to improve upon Facebook Pages.

Discovery . . . And Search

If you click the new “Local” tab in Google+, you’re taken to a personalized local home (discovery) page, which offers a mix of popular, social and recommended content. There are several variables that go into the content that appears on this page. Two people in Seattle won’t see the same page, though aspects of it may be the same.

What’s also interesting is that Google has returned to a two search-box approach for Google+ Local.

Users can browse this “home page” content or search as they normally would on Google or Google Maps. As I said, the integration of Zagat content, plus the other social filters and features make Google+ now an arguably better local search destination than Google.com or Google Maps.

Below is what the new experience looks like on Google Maps. It’s largely the same as what exists today except for the replacement of the star ratings by Zagat scores (and of course the underlying new Google+ Local pages).

Rather than being asked to rate businesses along a four- or five-point star continuum, users are now asked to fill out a more structured form (food, service, atmosphere/decor) and leave additional comments. Some of those online reviews may also make it back into Zagat proper, at the discretion of Zagat editors, I was told.

Mobile A Bit Less Straightforward

All these changes will show up almost immediately on Android handsets in what was the Places layer on Google Maps for Mobile and in the Google+ app. (The images below are Android shots from Google Maps for Mobile.) Google has submitted app updates to Apple for review and approval. They should be out very soon but will look and be accessed in a different way than on Android handsets.

It's quite likely that Apple will replace Google Maps in June with its own Maps app, so this experience will probably never show up on the iOS map. Instead, Apple users will be able to access the Google+ Local experience through the Google Places app and the Google+ app on the iPhone. There was no discussion of other smartphone platforms.

Overall this should present a stronger and more useful local-mobile search experience for consumers, in large measure because of the Zagat content and, to a lesser degree, because of the social and recommended content.

Google+ Local Pages Will Be Indexed!

The conversion of Places pages to Google+ Local pages is taking place regardless of whether Places pages were claimed by business owners or not. However, nothing on the back end will change immediately for merchants. Google says this in its Google and Your Business post:

If you are a business owner, you should continue to manage your information in Google Places for Business. You'll still be able to verify your basic listing data, make updates, and respond to reviews. For those who use AdWords Express, your ads will operate as normal as they'll automatically redirect people to the destination you selected, or your current listing.

Despite this temporary calm, business owners are effectively being dropped into the social fray, with more customer-interaction potential but also greater demands to learn how to use Google+ to its full advantage. Those who do will be rewarded. There's a ton of SEO potential here. Most notably, unlike Google Places pages, these new Google+ Local pages will be indexed.

We asked about management of multiple locations from a single page. Google said that there's no news for the time being, but that it's the ultimate goal:

A single page through which businesses can manage their online presence is a top priority, and we’re committed to ensuring business owners have a clear voice in how their business is represented on Google, via Google+.

In its SMB-focused blog post, Google provided example businesses that were invited in early to enhance their Google+ Local pages. I've reproduced only a partial list here:

- Oh! Sushi
- North Bowl
- Chicago Music Exchange
- Delfina Restaurant
- Mezze Restaurant
- Museum of Making Music
- Nick Strocchia Photography
- Mio Restaurant

A Few Preliminary Final Thoughts

These are major changes that Google is making in the fabric of local — for both consumers and marketers. They will enhance the consumer experience with a relatively small adjustment and learning curve. People will be able to go on using Google as they have but get the benefit of the richer pages and Zagat ratings. They won’t be forced to use Google+ to get the new content.

By the same token Google probably hopes that millions of local merchants creating and enhancing dynamic pages and content can bring additional usage and greater engagement to Google+. We’ll see how it plays out.

Business owners will probably have a somewhat more difficult transition than consumers, as they’re compelled now to pay attention to Google+ — in a big way. They now ignore Google+ at their own peril.

Overall, local search also just got a lot more social for Google, as it has recently in a different way for Bing. We'll explore the social dimensions as well as the SEO implications of Google+ Local pages in companion articles and during next week's SMX Advanced, especially in the Hardcore Local SEO Tactics session.

Google Searches Surge After Knowledge Graph Launch

If one of Google’s goals in launching the Knowledge Graph was to increase the number of searches being conducted, then it seems the mission has been accomplished. Google reports an unspecified surge in search queries since it launched May 16.

“Early indications are that people are interacting with it more, learning about more things…and doing more [search] queries,” Google’s Senior VP Amit Singhal told the Wall Street Journal. “It’s stoking people’s curiosity.”

A Google spokesperson backed up Singhal’s claims, saying people are “doing more searches as a result.” But the WSJ couldn’t pry out any specific numbers.

The Knowledge Graph shows a large box (or “panel”, as CEO Larry Page described it) to the right of Google’s organic search results on certain queries for noteworthy people, TV shows, sports teams, places, books, and other entities.

In addition to pulling some basic biographical or factual information from Wikipedia, Google links to related searches. Say you search for [aerosmith]: Google links you to searches for the individual band members, popular songs and albums, and other rock bands that other people search for.

A Wikimedia spokesperson said Google’s heavy use of the Wikipedia data is “suitable” but didn’t indicate whether the site is seeing more or less Google traffic since the Knowledge Graph debuted for all English language users. Considering Wikipedia data is featured in the box, in addition to its prominent visibility in Google’s search result pages, it’s pretty unlikely Wikipedia is hurting for Google traffic.

As Google continues building its “next generation of search,” Singhal noted the Knowledge Graph has some shortcomings, such as featuring inaccurate information (e.g., listing the wrong New York Knicks coach) and that it is “weak in many areas”, which the WSJ then notes means “products.”

Singhal also discussed the effect on PPC ads, noting that Google is "experimenting" with page designs for searches where numerous paid search ads appear for a query in addition to the Knowledge panel.

Currently when this happens – for example, a search for [lake tahoe] – users see a map with the option of clicking on a down arrow to see more Knowledge Graph information, which pushes the PPC ads further down the page.

This has been a big month of changes at search engines, with Bing revamping its results to highlight website snapshots and social data beside its organic results, while Yahoo just rolled out Axis, its attempt at visual search.


Google Hypocrisy: This Post NOT Sponsored by Google

Over the holidays, Google Chrome "sponsored" a campaign that paid hundreds of bloggers to write glowing reviews about how Chrome can help small businesses. The sponsored posts, which were primarily published by somewhat influential "mommy bloggers," included a promotional video for Chrome and neglected to mention factual details on how Chrome can actually help small businesses.

Aaron Wall of SEO Book, who initially discovered the marketing campaign, showed that a simple "This post is sponsored by Google" search query uncovers over 400 pages of content related to the "sponsored" marketing campaign. Although each of the posts clearly states that it was "sponsored by Google Chrome," some of the paid posts include followed links instead of the Google-required nofollow attribute, which violates Google's own rules.

The Google Webmaster Guidelines are clear that purchasing links that pass PageRank is a direct violation:

"…However, some SEOs and webmasters engage in the practice of buying and selling links that pass PageRank, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. Buying or selling links that pass PageRank is in violation of Google's Webmaster Guidelines and can negatively impact a site's ranking in search results."

Since Google requests that web users inform it of sites attempting to manipulate search engine results with paid links, maybe we should let it know about Google Chrome purchasing links that pass PageRank. Considering that just last year Google penalized JC Penney, Forbes, and Overstock for similar violations, it will be interesting to see if Google penalizes its own browser!

Since the story broke, some of the links have been removed and Google has not responded to requests for more information.

[Sources include: SEO Book; image by Search Engine Land]

A Powerful Analytics Tip Every Website Should Employ

How many presentations do you see that show traffic stats like these?

Or this:

Or this:

These charts aren't wrong, per se. They're not lying to you, but they are obscuring the truth, and they're making it impossible to know what's going right and wrong.

The problem isn't that the numbers are inaccurate; it's that no website is just ONE SITE. A website is a collection of pages, and oftentimes a collection of lots of different KINDS of pages. Even the simplest of sites, built on blog CMSs like WordPress or basic CMSs like Drupal, have unique sections within them - the homepage, individual posts, static pages (about, contact, et al.), categories, search pages, posts by month or author, etc. - and all of these have different formats, different functions and, almost certainly, different visitor stats.

Yet, for some reason, when we as marketers look at a site, we don't ask "how are the category pages doing this month?" or "how is the blog performing compared to the white paper articles?" We ask, "how's the site doing?"

The singular answer to that question often obscures a more nuanced, but valuable truth: Different website sections perform differently.

If your car starts having trouble accelerating up hills, you don't blame the entire car for the subpar performance, you start to examine potential causes (electrical system, engine, tires, etc.) and break these components down until you find the cause. Likewise, with a website, every piece should be performance tested, tuned and monitored on a regular basis.

Don't do this:

The total page views data is fine as an overview, but we need to monitor each individual section to really understand what's gaining vs. falling.

Do this:

By segmenting out traffic to URLs that include */blog/* and those that include */ugc/* (YOUmoz), we can see when/where/how each section is rising or falling in traffic and contributing to the overall site's performance.

Even better, we should do this:

How did I make that chart?

Step 1: Separate the areas of your website by the words/characters in their URL string (or other identifying factors like keywords in their titles). For example, on SEOmoz, we've got:

- The Blog - all URLs include /blog
- YOUmoz - all URLs have /UGC
- Guides - nearly all have /articles
- Tools - most URLs are different, but there's only around 20 so I can lump them together
- etc.

Once I have these segments, I'll use the URL structures to get data about pageviews (or any other metric I care about) separately through analytics.
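To make the bucketing concrete, here's a minimal Python sketch of that step. The section names, URL tokens, and the two-column (page path, pageviews) CSV export format are assumptions for illustration, not SEOmoz's actual setup:

```python
import csv
from collections import defaultdict

# Hypothetical section names and URL tokens mirroring the example above;
# swap in your own site's URL patterns.
SECTIONS = {
    "Blog": "/blog",
    "YOUmoz": "/ugc",
    "Guides": "/articles",
}

def pageviews_by_section(csv_path):
    """Roll a page-level export (one "page path, pageviews" pair per row,
    no header) up into per-section pageview totals."""
    totals = defaultdict(int)
    with open(csv_path, newline="") as f:
        for path, views in csv.reader(f):
            # First section whose token appears in the URL wins;
            # anything unmatched lands in "Other".
            section = next((name for name, token in SECTIONS.items()
                            if token in path.lower()), "Other")
            totals[section] += int(views)
    return dict(totals)

# e.g., print(pageviews_by_section("pageviews_week_21.csv"))
```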

Step 2: Use the content filter in Google Analytics to select only those pages that contain the URL string you're seeking:

By using the simple filter for URLs "containing" /article, I've got a segmented report I can now use to start seeing what's really happening on my site.

Pretty simple, right?

Step 3: Filter on each report and grab out the relevant pageviews number on a weekly basis:

I grab those numbers for each of the segments each week (well, actually, Joanna does - but she says it's less than an hour of work) and plug them into a spreadsheet.

Step 4: Create a spreadsheet and a stacked graph

This spreadsheet shows the number of pageviews to each section of the site

This stacked, area graph shows where traffic is shrinking (e.g. the Beginner's Guide) vs. growing (e.g. the Blog)
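If you'd rather build that stacked chart in code than in Excel, a matplotlib stackplot does the same job. A minimal sketch, with weekly numbers that are made up purely for illustration:

```python
import matplotlib.pyplot as plt

# Made-up weekly pageview counts per section, for illustration only
weeks = list(range(1, 9))
blog = [42000, 45000, 47000, 46000, 50000, 52000, 55000, 58000]
youmoz = [9000, 9500, 9200, 10000, 10400, 10100, 11000, 11500]
guides = [15000, 14000, 13500, 13000, 12500, 12000, 11800, 11500]

fig, ax = plt.subplots()
# Each band's thickness shows that section's contribution to total traffic
ax.stackplot(weeks, blog, youmoz, guides, labels=["Blog", "YOUmoz", "Guides"])
ax.set_xlabel("Week")
ax.set_ylabel("Pageviews")
ax.legend(loc="upper left")
plt.show()
```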

When you run these over long periods of time, you can really see the impact a new section is having, or where problems in traffic might exist. If you neglect to break things out in this fashion, you'll often find that traffic from one section's gain may overshadow the loss in another area. This over/under-compensation can hide the real issues for a site, especially in SEO (where indexation, rankings and keyword demand all play inter-connected roles).

Joanna, in her post on benchmarking, shared this chart:


Also see this larger, detailed version

This helped us to realize where things had gone awry and why (the problem stemmed from some poorly done redirects from Linkscape to Open Site Explorer). I can't recommend this practice enough - if more marketers managed their analytics in this fashion, we'd have a much easier time identifying potential problems, opportunities and understanding not just the quantity of traffic, but the "whys" behind it.

Anyone with some clever Google Analytics methodologies to build these faster/more efficiently than my Excel hack, please do share!

UPDATE: Some friends from Maki Car Rental put together a stacked pageviews PHP code that pulls from the Google Analytics API here. Thanks!
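Along the same lines, here's a rough Python sketch of pulling one segment's weekly pageviews straight from the Google Analytics Core Reporting API (v3), so no one has to copy numbers by hand. The view (profile) ID, dates, and key file below are placeholders, and it assumes a service account with read access to the profile:

```python
from googleapiclient.discovery import build
from google.oauth2 import service_account

# Placeholder key file; create a service account and grant it
# read access to your Analytics view.
creds = service_account.Credentials.from_service_account_file(
    "service-account-key.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analytics", "v3", credentials=creds)

response = analytics.data().ga().get(
    ids="ga:12345678",             # placeholder: your GA view (profile) ID
    start_date="2012-05-14",       # one week's window
    end_date="2012-05-20",
    metrics="ga:pageviews",
    filters="ga:pagePath=@/blog/"  # "=@" means "contains"
).execute()

# Total pageviews for the /blog/ segment that week
print(response["totalsForAllResults"]["ga:pageviews"])
```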

Live Blogging: Interview with Amit Singhal, Google Fellow

Danny Sullivan and Chris Sherman are on the stage at SMX London to interview Amit Singhal. Amit is a Google Fellow, an honorary title reserved for Google's most accomplished engineers, and he has spearheaded Google's core ranking team since 2000. He's also a key influencer of Search Plus Your World, Google's search experience centered around people, which lets you find personal results from your world - your photos, your friends, your stuff - in search.

Chris Sherman is on stage to introduce Amit Singhal, Google's Vice President and Google Fellow. Chris provides Amit's background, showing a dynamic visualization on Google Maps of the countries and cities Amit has lived in. Amit earned an M.Sc. from the University of Minnesota, worked for AT&T (Bell Labs), and from there headed to Google.

Amit gives thanks for the intro and talks about his childhood memories and how he grew up watching Star Trek. He dreamed about creating robots that we can talk to. As an academic, he worked hard for many years on language software that he believed would help him with his dream. In 2000 he went to Google and said to Sergey: "Your engine is excellent, but let me re-write it!" This became the new ranking system.

Amit then talks about how he got involved with the challenge of moving beyond keywords to tackle the problem of the same words having multiple meanings: how can you understand user intent? Apple (software) vs. Apple (fruit) was the first generation of search. The next leap in search technology will come when computers understand the difference between those Apples. That's what excites him.

In the last 5 years Amit feels he has come very close to building his childhood dream. Even though there are many things to be done before achieving it, he feels Google is headed in the right direction and will get there. "Computers don't understand things, they understand strings," and it is Google's job to teach computers how to differentiate between different intentions.

Danny speaks about Universal Search and how Google evolved from it to Search Plus Your World. How is Search Plus Your World impacting Google? Amit says the key motivation behind Search Plus Your World is to have a secure search; it is the first baby step toward achieving Google's dream, and data shows that Google users like the personal results. It also gives the user one-click removal of their personalized results. Google is currently analyzing and improving its personalization engine.

Chris mentions that personalization can be narrowing, as it gives people the same results and they do not discover new things. Amit answers that there should be different points of view in any search results, and that Google is aware of this and balances personalized and non-personalized results.

Danny mentions Pew research that concluded people do not want personalization. Amit says, "I am a scientist; when I look at research I look at how the question was asked." He discussed the specific study, and said that personalization is valuable for Google users. Danny asks: can you tell what percentage of personalized searches are clicked? Amit says people are clicking more than before, and that personalization is lifting CTR on search pages.

Chris mentions Bing's social efforts and how they differ from Google's. Amit says: "the key challenge with personalization is that no one can judge a personalized search for someone else." That's why Google looks at the data on how users like their results. Search Plus Your World takes the same approach as Universal Search: people have to find what they intend to find in their results.

Danny mentions the integration Bing did with Twitter and Facebook, and how this might be good for users. Will Google do that in the future? Amit said that Google's contract with Twitter expired. Google cannot add Twitter and Facebook right now, as their information is hidden behind a wall. It has been tough to build an integration on these terms.

Chris asks Amit how the evolution process works at Google with so many updates: how does Google decide which update goes live? Google has an internal system where every flawed search result is sent to Amit's team. Based on that, engineers are assigned to problems and solutions are tested in a sandbox. The engineer then shows how the results look before and after the update, and the update is tested using an A/B test. They discuss the results, and this loop runs several times until they find a change that is better in all aspects. After this process, the change is sent to a production environment for a very low percentage of real user traffic to see how the CTR changes. Based on this, an independent analyst (who works for Google) generates a report; from that report the group discusses and decides whether the change will launch. That's how scientific the process is. There are videos available of some of these sessions: check them at this post.

Danny talks about Penguin and asks how it is going from Google's standpoint: are search results better? Amit says that at the end of the day, users will stay with the search engine that provides the most relevant results. Google's objective was to reward high-quality sites, and Penguin succeeded at that. One of the beauties of running a search engine is that the engine that best measures how users feel is the one that will succeed most.

From Google's perspective, they use any signal that is available to them - more than 200 of them. They have to make sure the signals are accurate and good, and they will use any signal, whether it is organic or not. Chris notes that the link graph is common knowledge now, but what about the Knowledge Graph? Google wants to return the answers users are looking for, and that's what drives them. Google is increasingly investing in understanding the real meaning of each query so that it can return the right answer.

Danny asks about paid inclusion in vertical products, which was against Google's policy in the past. Amit says that a class of searches could not be answered organically, and Google realized it would have to establish relationships with data providers to get that data. To be super safe and honest with users, they make sure these results look different, and they started labeling them as sponsored to be even clearer about it.

Chris asks about the dream of creating a communicating machine and how this will change the way we relate to Google. Amit says these changes come in baby steps; it won't be an overnight change. He gives the example of spoken search, for which data is still scarce, and says Google will adapt according to the data.

Amit was asked whether search results are measured by Google's revenue or by relevance to users. Amit firmly states that revenue is not measured at all; only relevance is taken into account when defining search quality.

Amit says that if you build a great search engine, users get more curious because they expect great results, so they ask more questions. Giving relevant results frees up people's time and leads them to search more.

Postscript: Here’s video of his response, which came to a question about how publishers might potentially lose traffic if Google provides more direct answers:

Chris asks: with the scope Google has reached, is there anyone who still knows all of Google? Amit says there are senior executives who each understand their own "entities" very well, such as Search, Advertising, and other big groups, but no one understands everything.

Danny asks what funny searches Amit has come across. Amit says that once he read a query along the lines of "do my ears make me look fat?" Amit laughs: "Why are you asking Google that? Go figure it out alone!"

Amit concludes that he couldn't have a better job: he gets to influence search quality and also to improve the world in some ways.

In Wake Of Penguin, Could You Be Sued For Linking To Others?

Many webmasters have been desperately trying to fix poor SEO work done to a site, thanks to the recent Penguin update targeting webspam and the bad link warnings sent by Google. The only current way to discredit a link is to have it removed, as reverse-nofollow functionality for webmasters simply doesn't exist. One recent link removal request was particularly concerning, as it claimed that the webmaster was partaking in illegal action against the company.

A recent example from IT blog pskl.us brought to light a harsh tactic for link removal. A company had contacted pskl.us looking to have links removed via a DMCA (Digital Millennium Copyright Act) takedown notice. The DMCA specifically targets copyright infringement on the internet. The notice in question blamed the links for financial losses and search engine penalties. Here's the exact link removal request sent (note: the company name was removed from our copy below, as we have not seen the official emails):

It has come to our attention that your website or website hosted by your company contains links to <website> which results in financial losses by the company we represent, because of search engine penalties.

I request you to remove from following website (pskl.us) all links to <website> website as soon as possible. In order to find the links please do the following:
1) If this is an online website directory, use directory's search system to find "<company>" links.
2) If there are hidden links in the source code of website, open website's main page and view its source code. Search for "<website>" in the source code and you will see hidden links.

I have a good faith belief that use of the material in the manner complained of is not authorized by <company> its agents, or the law. Therefore, this letter is an official notification to effect removal of the detected infringement listed in this letter.

I further declare under penalty of perjury that I am authorized to act on behalf of copyright holder and that the information in this letter is accurate.

Please, inform me within 48 hours of the results of your actions. Otherwise we will be forced to contact your ISP. <company> will be perusing legal action if the webmaster does not remove the referenced link within 48 hours. <company> will be forced to include the hosting company in the suite for trademark infringement.

After pskl.us received this email, a lengthy back and forth ensued. It came out that someone at the company (or at a competing company) had purchased hundreds of thousands of links to the site:

"However, we had a site cloak <company> and generate over 700K back links to our site without our knowledge. Google stepped in and slapped us with a search ranking penalty, to which our business has suffered major losses."

So this drastic technique brings up the point: is linking to others' content illegal?

Linking Legality

In the United States, many courts have found that merely linking to someone else's public website is not illegal, as long as the link is not to illegal or infringing content. Note, however, that outright theft of content by copying, the framing of others' content, and linking to illegal or infringing content have all been litigated as separate questions.

Ford Motor Company v. 2600 Enterprises

The plaintiff, Ford Motor Company, was displeased that vulgar domains (such as F#ckgeneralmotors.com) linked directly to Ford's site. Ford lost the dispute, as the court found that the link did not create a cause of action for trademark dilution, infringement or unfair competition. The court also specifically stated:

"This court does not believe that Congress intended the [Federal Trademark Dilution Act] to be used by trademark holders as a tool for eliminating Internet links that, in the trademark holder's subjective view, somehow disparage its trademark. Trademark law does not permit Plaintiff to enjoin persons from linking to its homepage simply because it does not like the domain name or other content of the linking webpage."

Ticketmaster Corp. v. Tickets.com, Inc.

In 2000, Ticketmaster brought suit against Tickets.com for essentially "deep linking" to event pages where tickets could be purchased. Tickets.com was simply linking to public event pages where purchases could be made, and the lawsuit was dismissed. The court also made it clear that the act of linking was not against the law:

“hyperlinking [without framing] does not itself involve a violation of the Copyright Act … since no copying is involved … the customer is automatically transferred to the particular genuine web page of the original author. There is no deception in what is happening. This is analogous to using a library’s card index to get reference to particular items, albeit faster and more efficiently.”


In addition, those operating message boards, allowing user comments or hosting user-generated content have even more protection under Section 230 of the Communications Decency Act.

In conclusion, linking to others' (legal, non-infringing) content is perfectly legal. With that said, have you seen an uptick in link removal requests?

SEO at Turner Broadcasting: Dan Perry Interview


Spotlight on Search Interview with Dan Perry, SEO Director at Turner Broadcasting

Working on enterprise SEO projects compared to smaller company sites is as different as marketing to BtoC vs. BtoB customers. This interview with Dan Perry, SEO Director at Turner Broadcasting, covers his SEO dream job, in-house SEO career advice and skills, enterprise SEO, the future of outsourcing to agencies, being persuasive inside organizations and, of course, golf!

We met while you were with Cars.com and now you're with Turner Broadcasting. (Congrats!) How did you get into the SEO world and what is it that keeps you there?

I started building very basic websites in 1998, but didn't get into SEO until the summer of 2000. I built a site for a local golf course and a few months later, typed "Michigan golf" into a search engine. The site I built was on the first page! The light bulb went off immediately, and I've been promoting sites online ever since. The satisfaction of success is what keeps me in the industry. I've done enough SEO on sites of all sizes to know that it clearly works. Watching it work and seeing the baseline numbers for a site consistently increase over time is extremely satisfying.

What do you like best about your current position and company?

I'll answer that with an example of a semi-typical day for me: have an early conference call with London to discuss international SEO for Cartoon Network, have a meeting with PGA.com to discuss ongoing SEO initiatives, meet with Topher Kohan (SEO Coordinator at CNN) to discuss strategy, have a call with NBA.com and TNT.tv to discuss the playoffs, and end the day by providing some editorial SEO training to the team at Adult Swim. To have the opportunity to move the SEO needle on properties like these is truly a blessing. From an in-house SEO perspective, this job is as good as it gets.

You've spent a lot of time working on the client side with SEO. What advice do you have for individuals who would like to break into that kind of career path?

Doing in-house SEO in a large company is much different than doing it for yourself or at a small company. I haven't "done" SEO in years. My job is training others how to do it, and having them keep SEO top-of-mind. It requires an even temperament, the ability to explain why SEO should be prioritized to developers, executives, and everyone in between, and a love of PowerPoint and Excel.

What skills should a corporate marketer develop in order to be capable of handling in-house SEO duties?

The ability to sell SEO internally. You may have to convince a developer to change the way they've always done things. You may have to convince an executive that SEO is a good business decision, and be able to back it up with numbers. I don't believe that SEO starts at the top and works its way down, or vice versa. It has to happen at both ends (and in the middle), and then you need to keep it top-of-mind throughout the organization. To sum it up: a strong ability to sell internally, a logical approach, an understanding of the SEO potential, and the ability to put that potential into realistic forecasts.

Do you look for specific backgrounds, experience or skills when you hire in-house SEOs?

First of all, there has to be a base SEO skill-set; this cannot be overstated. There needs to be a level of SEO confidence that one can only gain with years of trial and error, dealing with algorithm changes, etc. Also, the ability to take a complex SEO element and describe it in a simple and easy-to-understand way is an under-rated skill. Finally, a diplomatic personality is key.

With enterprise SEO, you don't get to roll up your sleeves and jump in with a program in most cases. What do you see as some of the more common challenges with achieving end-goal results from search engine optimization in a large or complex organization?

Prioritization. You and I both know that SEO is valuable, and can produce impressive results. My job is to convince an executive that SEO should be prioritized above the dozens of other possible projects. I need to pull together an SEO plan, forecast potential gains in traffic, and explain why this should be prioritized over other projects. The funny thing is that once that happens and you get approval, THEN the real work starts.

I've seen you present many times on in-house SEO panels which, by the way, have been priceless for SEO agencies that work for large companies. Will companies still need to outsource SEO in the next 2-3 years?

I think so. There's a lot of value an agency can add, even when there's an internal team. For example, agency folks can see how an algorithm change affects many different companies and industries. Over time, the lessons learned from this broad collection of sites are invaluable.

What role do you see outside agencies playing?

It depends on the level of need within a given organization and the size/bandwidth of the internal employees.

Where are SEO agencies usually the most helpful?

Every property's needs are different, so it needs to be property-specific and driven by the unique goals and needs of each. It can vary from assisting with major initiatives like a redesign, to keyword research, to spillover work.

What's your best tip for getting other departments in an organization on "your side" when it comes to content creation, approval and promotion for advancing search marketing goals? Any examples?

Showing the opportunity lost in terms of traffic and revenue. For example, if one of our sites is on the second page of Google for a set of keyterms, and I can provide data that shows the potential gains they should receive (traffic gains, and revenue gains) by getting on page one, it makes the selling process much easier.

What are some of the common "low-hanging fruit" SEO suggestions you see most often with large-site SEO? The classic, of course, is updating one robots.txt file to stop blocking all bots.

The SEO maximization of publishing templates is a great place to start. Relatively small changes at the template level can have a big impact. Secondly, finding inbound links that produce 404 errors and converting them to 301 redirects.
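(To illustrate Dan's second tip, here's a minimal Python sketch of the auditing half: checking a list of inbound-link landing pages for 404s. The URLs are placeholders; in practice you'd feed in the link targets from your backlink data, then write 301 redirects for whatever comes back dead.)

```python
import requests

# Placeholder list: in practice, pull inbound-link targets from your
# backlink data (e.g., a Webmaster Tools export).
urls = [
    "http://www.example.com/old-guide",
    "http://www.example.com/blog/renamed-post",
]

for url in urls:
    try:
        resp = requests.head(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}\tERROR: {exc}")
        continue
    if resp.status_code == 404:
        # This inbound link is dead: a candidate for a 301 redirect
        # to the page's closest living equivalent.
        print(f"{url}\t404 - add a 301 redirect")
    else:
        print(f"{url}\t{resp.status_code}")
```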

Please share some of the SEO and Social Media tools that you like most:

Working with such big brands, a lot of the tools aren't as important as they used to be. Because of that, I spend more time in our analytics package than I ever have before.

How do you stay current with SEO and all the marketing, technology and communication channels that come with it? What are your favorite conferences, blogs, newsletters, organizations, books or networks that you rely on?

I'm a big fan of David Meerman Scott's book on the New Rules. He took a relatively complex subject and boiled it down into easy-to-understand language. My favorite book of all time is Don't Make Me Think by Steve Krug, one of the few books that made me look at a website in a completely different way. When I attend conferences, I usually choose the sessions I'll attend by speaker name rather than session description. Finally, the Planet Ocean SEO newsletter is one of the most consistent, well-written newsletters I've ever seen.

Since you're a huge golf fan, do you have any interesting golf metaphors for SEO?

Love them both; here's my top 10 list of similarities between golf and SEO:

1. Accept that you don't know everything.
2. Learn by doing.
3. Measure often and pay attention to the numbers.
4. Be prepared for the worst-case scenario.
5. Learn from your mistakes.
6. Stick with it, even during the bad times.
7. Seek out good advice.
8. Luck is just that.
9. Use the right tools.
10. Be patient and think long term.

Thanks Dan!

You can find Dan online on his Blog, Twitter or LinkedIn.

Spotlight on Search is an interview series that shines a light on search marketing professionals to learn more about the nature of their work, differences in SEO among categories of web sites and, of course, SEO tips, tactics and useful tools. We do not take PR firm pitches or solicitations for these interviews. They are by request from TopRank Online Marketing Blog editorial staff only.

Does The First Amendment Create A Complete Defense For Google Against Antitrust Regulation?

Google now faces antitrust investigations on multiple continents. The US FTC recently hired a prominent outside litigator in a sign that it may be preparing to bring an action against the company. But does Google have a “slam dunk” defense against such a case (at least in the US) under the First Amendment of the Constitution?

A Preview of Google’s Legal Arguments?

Yes, says UCLA Law Professor Eugene Volokh in a new paper-cum-legal brief. The document, which was commissioned by Google, also serves as a kind of template for the legal arguments Google might make in a US antitrust case. The release of the paper is no doubt designed to "remind" Congress and the FTC that this law exists and that Google could potentially win an antitrust case on these grounds.

The 27-page document (below), replete with case law citations, can be summarized in one sentence: search engine results are editorial judgments, like newspaper content, protected by the First Amendment, and as protected speech they cannot be regulated by antitrust law or the US government.

Professor: Nobody Can Dictate What Google Can “Say” in SERPs

Professor Volokh argues that Google may put whatever it likes in the SERP, in whatever order it deems worthy, including links to its own properties and services, and that nobody is entitled to intervene and dictate how Google may display search results. It will be a shocking (though not entirely novel) argument to those who've complained against Google.

Volokh says two cases, Search King, Inc. v. Google Technology, Inc. (2003) and Langdon v. Google, Inc. (2007), firmly and conclusively establish that search results are protected editorial speech. While the US Supreme Court has not ruled on the specific question of whether search results are protected speech under the First Amendment, Volokh cites numerous Supreme Court decisions that together stand for the idea that "the First Amendment fully protects Internet speech" and "fully protects Internet speakers' editorial judgments about selection and arrangement of content" (i.e., Google).

The First Amendment vs. Antitrust Law

The most interesting part of the paper concerns the application (or, as the argument goes, the non-application) of antitrust law to Google's organic SERPs. (The document doesn't discuss AdWords.) Volokh admits that the government has authority, in some cases, under antitrust law to regulate companies such as newspapers when their practices discriminate unjustly against competitors. However, he argues, this does not extend to matters of editorial discretion, even where the speaker has a "substantial monopoly."

Volokh cites cases that stand for the broad idea that the protected exercise of speech cannot be regulated by antitrust law: "search engines' selection and arrangement decisions reflect editorial judgments about what to say and how to say it, which are protected by the First Amendment." He adds, "[E]conomic regulations may not be used to require a speaker to include certain material in its speech product."

The bottom line argument is that even under the guise of antitrust enforcement the government cannot interfere with protected (and absolute) editorial discretion (i.e., the Google SERP).

Is Google More Like a Cable TV Company or a Newspaper?

One case that "goes the other way" and could potentially support regulation of the Google SERPs is Turner Broadcasting System, Inc. v. FCC (1994). That Supreme Court case held that cable TV operators can be forced to carry programming against their will under federal law. The cable companies had argued that the federal "must carry" law was impermissible content regulation barred by the First Amendment.

Volokh distinguishes the case, arguing that Google and search engines are not like cable TV companies in several respects. He says in the Turner case there was almost no other way for consumers to access the disputed programming. By contrast, he says, there are plenty of other ways online to access content besides Google. People can use Bing or Yahoo, for example.

The Turner court reportedly held that the cable companies were mere "conduits" of third-party speech and not editorial content producers themselves (like newspapers). Google is more like a newspaper than a cable TV company, says Volokh. Yet I'm not quite sure the Turner case is that easily brushed aside.

Google does not generate the contents of its own SERPs (except in selected cases such as Google Maps or Google+). Rather, like the cable companies, it conveys third-party speech and content (in the form of links). One could persuasively argue that Google is in fact more like a cable company than a newspaper.

Still, the government would face a formidable challenge in overcoming the weight of First Amendment case law that Volokh cites, which supports Google's "absolute" discretion over what shows up in SERPs. I agree that it appears a very tough case for the feds to make.

Volokh first amendment paper
