A Powerful Analytics Tip Every Website Should Employ

How many presentations do you see that show traffic stats like these?

Or this:

Or this:

These charts aren't wrong, per se. They're not lying to you, but they are obscuring the truth, and they're making it impossible to know what's going right and wrong.

The problem isn't that the numbers are inaccurate, it's that no website is just ONE SITE. A website is a collection of pages, and oftentimes, a collection of lots of different KINDS of pages. Even the simplest of sites, built on blog CMSes like WordPress or basic CMSes like Drupal, have unique sections within them - the homepage, individual posts, static pages (about, contact, et al.), categories, search pages, posts by month, author, etc. - and all of these have different formats, different functions and, almost certainly, different visitor stats.

Yet, for some reason, when we as marketers look at a site, we don't ask "how are the category pages doing this month?" or "how is the blog performing compared to the white paper articles?" We ask, "how's the site doing?"

The singular answer to that question often obscures a more nuanced, but valuable truth: Different website sections perform differently.

If your car starts having trouble accelerating up hills, you don't blame the entire car for the subpar performance; you examine potential causes (electrical system, engine, tires, etc.) and break these components down until you find the culprit. Likewise, with a website, every piece should be performance tested, tuned and monitored on a regular basis.

Don't do this:

The total page views data is fine as an overview, but we need to monitor each individual section to really understand what's gaining vs. falling.

Do this:

By segmenting out traffic to URLs that include */blog/* and those that include */ugc/* (YOUmoz), we can see when/where/how each section is rising or falling in traffic and contributing to the overall site's performance.

Even better, we should do this:

How did I make that chart?

Step 1: Separate the areas of your website by the words/characters in their URL string (or other identifying factors like keywords in their titles). For example, on SEOmoz, we've got:

The Blog - all URLs include /blog
YOUmoz - all URLs have /UGC
Guides - nearly all have /articles
Tools - most URLs are different, but there are only around 20, so I can lump them together
etc.

Once I have these segments, I'll use the URL structures to get data about pageviews (or any other metric I care about) separately through analytics.
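To make the segmentation concrete, here's a minimal Python sketch of that classification step. The segment names and URL fragments are placeholders that simply mirror the SEOmoz example above, so swap in your own site's patterns:

# Map each section of the site to the URL fragment that identifies it.
# These names/fragments mirror the example above - substitute your own.
SEGMENTS = {
    "Blog": "/blog",
    "YOUmoz": "/ugc",
    "Guides": "/article",
    "Tools": "/tools",  # hypothetical fragment; the post lumps ~20 tool URLs together
}

def classify(page_path: str) -> str:
    """Return the segment a pageview belongs to, or 'Other'."""
    path = page_path.lower()
    for segment, fragment in SEGMENTS.items():
        if fragment in path:
            return segment
    return "Other"

print(classify("/blog/a-powerful-analytics-tip"))  # -> Blog
print(classify("/ugc/some-youmoz-post"))           # -> YOUmoz
print(classify("/about"))                          # -> Other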

Step 2: Use the content filter in Google Analytics to select only those pages that contain the URL string you're seeking:

 

By using the simple filter for URLs "containing" /article, I've got a segmented report I can now use to start seeing what's really happening on my site.

Pretty simple, right?
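If you'd rather work from an export than click through the interface, the same "containing" filter is a one-liner in pandas. This sketch assumes a hypothetical CSV export of the pages report with "Page" and "Pageviews" columns - adjust the names to whatever your export actually contains:

# Replicate the "containing /article" content filter against a CSV export.
# Column names here are assumptions - match them to your real export.
import pandas as pd

pages = pd.read_csv("pages_export.csv")
article_pages = pages[pages["Page"].str.contains("/article", na=False)]
print("Guides pageviews:", article_pages["Pageviews"].sum())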

Step 3: Filter on each report and grab out the relevant pageviews number on a weekly basis:

I grab those numbers for each of the segments each week (well, actually, Joanna does - but she says it's less than an hour of work) and plug them into a spreadsheet.
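If you want to trim that hour down further, a small script can append each week's per-segment totals to a running CSV that plays the role of the spreadsheet. This is only an illustrative sketch; the file name, segment names and totals below are placeholders:

# Append one row per week of per-segment pageview totals to a running CSV.
import csv
import datetime
from pathlib import Path

OUTFILE = Path("weekly_pageviews.csv")
SEGMENTS = ["Blog", "YOUmoz", "Guides", "Tools"]

def record_week(totals):
    """Add this week's totals as a new row, writing a header the first time."""
    new_file = not OUTFILE.exists()
    with OUTFILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["week_of"] + SEGMENTS)
        writer.writerow([datetime.date.today().isoformat()] +
                        [totals.get(s, 0) for s in SEGMENTS])

# Example: numbers copied out of the four filtered reports for the week.
record_week({"Blog": 48210, "YOUmoz": 9120, "Guides": 15400, "Tools": 22050})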

Step 4: Create a spreadsheet and a stacked graph

This spreadsheet shows the number of pageviews to each section of the site

This stacked area graph shows where traffic is shrinking (e.g. the Beginner's Guide) vs. growing (e.g. the Blog).
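For reference, here's one way to draw that stacked area chart from the weekly CSV built in the previous step; this sketch assumes that file's layout and uses pandas plus matplotlib, though Excel or any other charting tool works just as well:

# Build a stacked area chart of weekly pageviews by site section.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("weekly_pageviews.csv", parse_dates=["week_of"])
segments = [c for c in df.columns if c != "week_of"]

plt.stackplot(df["week_of"], [df[s] for s in segments], labels=segments)
plt.legend(loc="upper left")
plt.ylabel("Weekly pageviews")
plt.title("Pageviews by site section")
plt.tight_layout()
plt.savefig("pageviews_by_section.png")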

When you run these over long periods of time, you can really see the impact a new section is having, or where problems in traffic might exist. If you neglect to break things out in this fashion, you'll often find that one section's gains overshadow another section's losses. This over/under-compensation can hide the real issues for a site, especially in SEO (where indexation, rankings and keyword demand all play inter-connected roles).

Joanna, in her post on benchmarking, shared this chart:


Also see this larger, detailed version

This helped us to realize where things had gone awry and why (the problem stemmed from some poorly done redirects from Linkscape to Open Site Explorer). I can't recommend this practice enough - if more marketers managed their analytics in this fashion, we'd have a much easier time identifying potential problems, opportunities and understanding not just the quantity of traffic, but the "whys" behind it.

Anyone with some clever Google Analytics methodologies to build these faster/more efficiently than my Excel hack, please do share!

UPDATE: Some friends from Maki Car Rental put together a stacked pageviews PHP code that pulls from the Google Analytics API here. Thanks!
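For anyone more comfortable in Python than PHP, here's a very rough sketch of the same idea against the Google Analytics Core Reporting API (v3). Authorization is left out entirely; the authorized service object and the profile ID are assumptions you'd supply yourself, and the filters simply reuse the segments from Step 1:

# Pull segmented pageview totals straight from the Core Reporting API (v3).
# "service" is an already-authorized Analytics client (google-api-python-client);
# building and authorizing it is omitted here.
PROFILE_ID = "ga:12345678"  # placeholder view/profile ID

SEGMENTS = {
    "Blog": "ga:pagePath=@/blog/",   # "=@" is the API's "contains" operator
    "YOUmoz": "ga:pagePath=@/ugc/",
}

def weekly_totals(service, start_date, end_date):
    totals = {}
    for name, ga_filter in SEGMENTS.items():
        result = service.data().ga().get(
            ids=PROFILE_ID,
            start_date=start_date,
            end_date=end_date,
            metrics="ga:pageviews",
            filters=ga_filter,
        ).execute()
        totals[name] = int(result.get("totalsForAllResults", {}).get("ga:pageviews", 0))
    return totals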

Live Blogging: Interview with Amit Singhal, Google Fellow

Danny Sullivan and Chris Sherman are on the stage at SMX London to interview Amit Singhal. Amit is a Google Fellow, an honorary title reserved for Google's most accomplished engineers, and he has spearheaded Google's core ranking team since 2000. He's also a key influencer of Search Plus Your World, Google's search experience centered around people, which lets you find personal results from your world - your photos, your friends, your stuff - in search.

Chris Sherman is on stage to introduce Amit Singhal, Google's Vice President and Google Fellow. Chris provides Amit's background, showing a dynamic visualization on Google Maps of the countries and cities Amit has lived in. Amit earned an M.Sc. from the University of Minnesota. He worked for AT&T (Bell Labs), from where he headed to Google.

Amit gives thanks for the intro and talks about his childhood memories and how he grew up watching Star Trek. He dreamed about creating robots that we can talk to. As an academic, he worked hard for many years on language software that he believed would help him with his dream. In 2000 he went to Google and said to Sergey: "Your engine is excellent, but let me re-write it!" This became the new ranking system.

Amit then talks about how he got involved with the challenge of moving beyond keywords to tackle the problem of the same words having multiple meanings: how can you understand user intent? Apple (software) vs. Apple (fruit) was the problem the first generation of search couldn't solve. The next leap in search technology will come when computers understand the difference between those Apples. That's what excites him.

Over the last 5 years, Amit feels he has come very close to building his childhood dream. Even though there are many things to be done before achieving this dream, he feels Google is heading in the right direction and will be able to achieve it. "Computers don't understand things, they understand strings," and it is Google's job to teach computers how to differentiate between different intentions.

Danny speaks about Universal Search and how Google evolved it into Search Plus Your World. How is Search Plus Your World impacting Google? Amit says the key motivation behind Search Plus Your World is to have secure search; it is the first baby step toward achieving Google's dream, and data shows that Google users like the personal results. It also gives users one-click removal of their personalized results. Google is currently analyzing and improving its personalization engine.

Chris mentions that personalization can be narrowing, as it gives people the same results and they do not discover new things. Amit answers that there should be different points of view in any search results, and that Google is aware of this and balances personalized and non-personalized results.

Danny mentions a Pew study that concluded that people do not want personalization. Amit says, "I am a scientist; when I look at research, I look at how the question was asked." He discussed the specific study and said that personalization is valuable for Google users. Danny asks: can you tell what percentage of personalized searches are clicked? Amit says people are clicking more than before, and personalization is lifting CTR on search pages.

Chris mentions Bing's social efforts and how they differ from Google's. Amit says: "the key challenge with personalization is that no one can judge a personalized search for someone else." That's why Google looks at data about how users like their results. Search Plus Your World is the same approach as Universal Search: people have to find what they intend to find in their results.

Danny mentions the integration Bing did with Twitter and Facebook, and how this might be good for users. Will Google do that in the future? Amit said that their contract with Twitter expired. Google cannot add Twitter and Facebook right now as their information is hidden behind a wall; it has been tough to build an integration on these terms.

Chris asks Amit what the evolution process at Google looks like with so many updates: how does Google decide which update goes live? Google has an internal system where every flawed search result is sent to Amit's team. Based on that, engineers are assigned to problems, and solutions are tested in a sandbox. The engineer then shows how the results look before and after the change, and the update is tested using an A/B test. They discuss the results, and this loop runs several times until they find a change that is better in all respects. After this process the change is sent to a production environment for a very low percentage of real user traffic to see how CTR changes. Based on this, an independent analyst (who works for Google) generates a report; based on that report, the group discusses and decides whether the change will launch. That's how scientific the process is. There are videos available of some of these sessions: check them at this post.

Danny talks about Penguin and asks how it is going from Google's standpoint: are search results better? Amit says that at the end of the day, users will stay with the search engine that provides the most relevant results. Google's objective was to reward high-quality sites, and that was a success with Penguin. One of the beauties of running a search engine is that the engine that can best measure what users feel is the one that will succeed most.

From Google's perspective, they use any signal that is available to them, more than 200 of them. They have to make sure the signals are accurate and good. They will use any signal, whether it is organic or not. Chris brings up the link graph and how it is common knowledge now, but what about the Knowledge Graph? Google wants to return to users the answers they are looking for, and that's what drives them. Google is increasingly investing in understanding the real meaning of each query so that it can return the right answer.

Danny asks about paid inclusion in vertical products, which was against Google's policy in the past. Amit says that a class of searches could not be answered organically, and Google realized it would have to establish relationships with data providers to get that data. To be super safe and honest with users, they make sure these results look different, and they also started calling them sponsored to be even clearer about it.

Chris asks about the dream of creating a communicating machine and how this will change the way we relate to Google. Amit says that these changes come in baby steps; it won't be an overnight change. Amit gives the example of spoken search, how the data there is still scarce, and how Google will adapt according to the data.

Amit was asked whether search results are measured by Google's revenue or by relevancy to users. Amit firmly states that revenue is not measured at all; only relevance is taken into account when defining search quality.

Amit says that if you build a great search engine for users, they get more curious because they expect to get great results, so they ask more questions. Giving relevant results lets people search more and frees up their time.

Postscript: Here's video of his response, which came in answer to a question about how publishers might potentially lose traffic if Google provides more direct answers:

Chris asks: with the scope Google has reached, is there anyone who still knows all of Google? Amit says that there are senior executives who each understand their own "entities" very well, such as Search, Advertising, and other big groups, but no one understands everything.

Danny asks what funny searches Amit has come across. Amit says that he once read a query along the lines of "do my ears make me look fat?" Amit laughs: "why are you asking Google that? Go figure it alone!"

Amit concludes that he couldn't have a better job: he gets to influence search quality and also to improve the world in some ways.

In Wake Of Penguin, Could You Be Sued For Linking To Others?

Many webmasters have been desperately trying to fix poor SEO work done to a site, thanks to the recent Penguin update targeting webspam and the bad link warnings sent from Google. The only current way to discredit a link is to have it removed, as reverse nofollow functionality for webmasters simply doesn't exist. One recent example of a link removal request was particularly concerning, as it claimed that the webmaster was partaking in illegal action against the company.

A recent example from IT blog pskl.us brought to light a harsh tactic for a link removal. A company had contacted pskl.us looking to have links removed via a DMCA (Digital Millennium Copyright Act) takedown notice. The DMCA specifically targets copyright infringement on the internet. The specific note in question blamed the links for financial losses and search engine penalties. Here's the exact link removal request sent (Note: the company name was removed from our copy below, as we have not seen the official emails):

It has come to our attention that your website or website hosted by your company contains links to <website> which results in financial losses by the company we represent, because of search engine penalties.

I request you to remove from following website (pskl.us) all links to <website> website as soon as possible. In order to find the links please do the following:

1) If this is an online website directory, use directory's search system to find "<company>" links.

2) If there are hidden links in the source code of website, open website's main page and view its source code. Search for "<website>" in the source code and you will see hidden links.

I have a good faith belief that use of the material in the manner complained of is not authorized by <company> its agents, or the law. Therefore, this letter is an official notification to effect removal of the detected infringement listed in this letter.

I further declare under penalty of perjury that I am authorized to act on behalf of copyright holder and that the information in this letter is accurate.

Please, inform me within 48 hours of the results of your actions. Otherwise we will be forced to contact your ISP. <company> will be perusing legal action if the webmaster does not remove the referenced link within 48 hours. <company> will be forced to include the hosting company in the suite for trademark infringement.

After pskl.us received this email, a lengthy back and forth followed. It came out that someone at the company (or at a competing company) had purchased hundreds of thousands of links to the site:

"However, we had a site cloak <company> and generate over 700K back links to our site without our knowledge. Google stepped in and slapped us with a search ranking penalty to which our business has suffered major losses."

So this drastic technique brings up the question: is linking to others' content illegal?

Linking Legality

In the United States, many courts have found that merely linking to someone else's public website is not illegal, as long as the link is not to illegal or infringing content. Be aware, though, that outright theft of content by copying, framing others' content, and linking to illegal or infringing content are different matters and have been the subject of litigation.

Ford Motor Company v. 2600 Enterprises

The plaintiff, Ford Motor Company, was displeased that vulgar domains (such as F#ckgeneralmotors.com) were linked directly to Ford. Ford lost the dispute, as the court found that the link did not create a cause of action for trademark dilution, infringement or unfair competition. The court also specifically stated:

"This court does not believe that Congress intended the [Federal Trademark Dilution Act] to be used by trademark holders as a tool for eliminating Internet links that, in the trademark holder's subjective view, somehow disparage its trademark. Trademark law does not permit Plaintiff to enjoin persons from linking to its homepage simply because it does not like the domain name or other content of the linking webpage."

Ticketmaster Corp. v. Tickets.com, Inc.

In 2000 Ticketmaster brought a suit against Tickets.com for essentially "deep linking" to event pages where tickets could be purchased. Tickets.com was simply linking to a public event page where purchases could be made, and the lawsuit was dismissed. The court also made it clear that the act of linking was not against the law:

“hyperlinking [without framing] does not itself involve a violation of the Copyright Act … since no copying is involved … the customer is automatically transferred to the particular genuine web page of the original author. There is no deception in what is happening. This is analogous to using a library’s card index to get reference to particular items, albeit faster and more efficiently.”

In addition, those operating message boards, allowing user comments or hosting user-generated content have even more protection under Section 230 of the Communications Decency Act.

In conclusion, linking to others' (legal, non-infringing) content is perfectly legal. With that said, have you seen an uptick in link removal requests?

SEO at Turner Broadcasting: Dan Perry Interview


Spotlight on Search Interview with Dan Perry, SEO Director at Turner Broadcasting

Working on enterprise SEO projects compared to smaller company sites is as different as marketing to B2C vs. B2B customers. This interview with Dan Perry, the SEO Director for Turner Broadcasting, covers his SEO dream job, in-house SEO career advice and skills, enterprise SEO, the future of outsourcing to agencies, being persuasive inside organizations and, of course, golf!

We met while you were with Cars.com and now you're with Turner Broadcasting. (Congrats) How did you get into the SEO world and what is it that keeps you there?

I started building very basic websites in 1998, but didn't get into SEO until the summer of 2000. I built a site for a local golf course and a few months later, typed "Michigan golf" into a search engine. The site I built was on the first page! The light bulb went off immediately, and I've been promoting sites online ever since. The satisfaction of success is what keeps me in the industry. I've done enough SEO on sites of all sizes to know that it clearly works. Watching it work and seeing the baseline numbers for a site consistently increase over time is extremely satisfying.

What do you like best about your current position and company?

I'll answer that with an example of a semi-typical day for me: Have an early conference call with London to discuss international SEO for Cartoon Network, have a meeting with PGA.com to discuss ongoing SEO initiatives, meet with Topher Kohan (SEO Coordinator at CNN) to discuss strategy, have a call with NBA.com and TNT.tv to discuss the playoffs, and end the day by providing some editorial SEO training to the team at Adult Swim. To have the opportunity to move the SEO needle on properties like these is truly a blessing. From an in-house SEO perspective, this job is as good as it gets.

You've spent a lot of time working on the client side with SEO. What advice do you have for individuals who would like to break into that kind of career path?

Doing in-house SEO in a large company is much different than doing it for yourself, or at a small company. I haven't "done" SEO in years. My job is training others how to do it, and having them keep SEO top-of-mind. It requires an even temperament, the ability to explain why SEO should be prioritized to developers, executives, and everyone in-between, and a love of PowerPoint and Excel.

What skills should a corporate marketer develop in order to be capable of handling in-house SEO duties?

The ability to sell SEO internally. You may have to convince a developer to change the way they've always done things. You may have to convince an executive that SEO is a good business decision, and be able to back it up with numbers. I don't believe that SEO starts at the top and works its way down, or vice versa. It has to happen at both ends (and in the middle), and then you need to keep it top-of-mind throughout the organization. To sum it up: a strong ability to sell internally, a logical approach, an understanding of the SEO potential, and the ability to put that potential into realistic forecasts.

Do you look for specific backgrounds, experience or skills when you hire in-house SEOs?

First of all, there has to be a base SEO skill-set; this cannot be overstated. There needs to be a level of SEO confidence that one can only gain with years of trial and error, dealing with algorithm changes, etc. Also, the ability to take a complex SEO element and describe it in a simple and easy-to-understand way is an under-rated skill. Finally, a diplomatic personality is key.

With enterprise SEO, you don't get to roll up your sleeves and jump in with a program in most cases. What do you see as some of the more common challenges with achieving end-goal results from search engine optimization in a large or complex organization?

Prioritization. You and I both know that SEO is valuable, and can produce impressive results. My job is to convince an executive that SEO should be prioritized above the dozens of other possible projects. I need to pull together an SEO plan, forecast potential gains in traffic, and explain why this should be prioritized over other projects. The funny thing is that once that happens and you get approval, THEN the real work starts.

I've seen you present many times on in-house SEO panels, which, btw, have been priceless for SEO agencies that work for large companies. Will companies still need to outsource SEO in the next 2-3 years?

I think so. There's a lot of value an agency can add, even when there's an internal team. For example, agency folks can see how an algorithm change affects many different companies and industries. Over time, the lessons learned from this broad collection of sites are invaluable.

What role do you see outside agencies playing?

Depends on the level of need within a given organization and the size/bandwidth of the internal employees.

Where are SEO agencies usually the most helpful?

Every property's needs are different, so it needs to be property-specific and driven by the unique goals and needs of each. It can vary from assisting with major initiatives like a redesign to keyword research to spillover work.

What's your best tip for getting other departments in an organization on "your side" when it comes to content creation, approval and promotion for advancing search marketing goals? Any examples?

Showing the opportunity lost in terms of traffic and revenue. For example, if one of our sites is on the second page of Google for a set of keyterms, and I can provide data that shows the potential gains they should receive (traffic gains, and revenue gains) by getting on page one, it makes the selling process much easier.

What are some of the common "low-hanging fruit" SEO suggestions you see most often with large-site SEO? The classic, of course, is updating a robots.txt file to stop blocking all bots.

The SEO maximization of publishing templates is a great place to start. Relatively small changes at the template level can have a big impact. Secondly, finding inbound links that produce 404 errors and converting them to 301 redirects.
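As a rough illustration of that second suggestion (a generic sketch, not Dan's actual process), a short script can flag inbound-link targets that currently return 404 so they can be 301-redirected to the closest live equivalent. The URLs are placeholders; the list would come from your link-research or analytics tool:

# Flag externally linked URLs that now 404 - candidates for 301 redirects.
import requests

def find_redirect_candidates(linked_urls):
    """Return the inbound-link targets that currently 404 or are unreachable."""
    broken = []
    for url in linked_urls:
        try:
            resp = requests.head(url, allow_redirects=True, timeout=10)
            if resp.status_code == 404:
                broken.append(url)
        except requests.RequestException:
            broken.append(url)
    return broken

for url in find_redirect_candidates([
    "https://www.example.com/old-page",
    "https://www.example.com/guides/renamed-guide",
]):
    print("301 this:", url)

Each flagged URL then gets a 301 in your server or CMS configuration pointing at the most relevant live page.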

Please share some of the SEO and Social Media tools that you like most:

Working with such big brands, a lot of the tools aren't as important as they used to be. Because of that, I spend more time in our analytics package than I ever have before.

How do you stay current with SEO and all the marketing, technology and communication channels that come with it? What are your favorite conferences, blogs, newsletters, organizations, books or networks that you rely on?

I'm a big fan of David Meerman Scott's book on the New Rules. He took a relatively complex subject and boiled it down into easy-to-understand language. My favorite book of all time is Don't Make Me Think by Steve Krug. One of the few books that made me look at a website in a completely different way. When I attend conferences, I usually choose the sessions I'll attend by speaker name rather than session description. Finally, the Planet Ocean SEO newsletter is one of the most consistent, well-written newsletters I've ever seen.

Since you're a huge golf fan, do you have any interesting golf metaphors for SEO?

Love them both; here's my top 10 list of similarities between golf and SEO:

1. Accept that you don't know everything.
2. Learn by doing.
3. Measure often and pay attention to the numbers.
4. Be prepared for the worst-case scenario.
5. Learn from your mistakes.
6. Stick with it, even during the bad times.
7. Seek out good advice.
8. Luck is just that.
9. Use the right tools.
10. Be patient and think long term.

Thanks Dan!

You can find Dan online on his Blog, Twitter or LinkedIn.

Spotlight on Search is an interview series that shines a light on search marketing professionals to learn more about the nature of their work, differences in SEO amongst categories of web sites and, of course, SEO tips, tactics and useful tools. We do not take PR firm pitches or solicitations for these interviews. They are by request from TopRank Online Marketing Blog editorial staff only.

Does The First Amendment Create A Complete Defense For Google Against Antitrust Regulation?

Google now faces antitrust investigations on multiple continents. The US FTC recently hired a prominent outside litigator in a sign that it may be preparing to bring an action against the company. But does Google have a “slam dunk” defense against such a case (at least in the US) under the First Amendment of the Constitution?

A Preview of Google’s Legal Arguments?

Yes, says UCLA Law Professor Eugene Volokh in a new paper-cum-legal brief. The document, which was commissioned by Google, also serves as a kind of template for the legal arguments Google might make in a US antitrust case. The release of this paper is no doubt designed to "remind" Congress and the FTC that this law exists and that Google could potentially win an antitrust case on these grounds.

The 27-page document (below), replete with case law citations, can be summarized in one sentence: search engine results are editorial judgments, like newspaper content, protected as speech by the First Amendment and thus beyond the reach of regulation by antitrust law and the US government.

Professor: Nobody Can Dictate What Google Can “Say” in SERPs

Professor Volokh argues that Google may put whatever it likes in the SERP, in whatever order it deems worthy, including links to its own properties and services, and that nobody is entitled to intervene and dictate how Google may display search results. It will be a shocking (though not entirely novel) argument to those who've complained against Google.

Volokh says two cases, Search King, Inc. v. Google Technology, Inc. (2003) and Langdon v. Google, Inc. (2007), firmly and conclusively establish that search results are protected editorial speech. While the US Supreme Court has not ruled on the specific question of whether search results are protected speech under the First Amendment, Volokh cites numerous Supreme Court decisions that together stand for the idea that "the First Amendment fully protects Internet speech" and "fully protects Internet speakers' editorial judgments about selection and arrangement of content" (i.e., Google).

The First Amendment vs. Antitrust Law

The most interesting part of the paper concerns the application — or lack of application, goes the argument — of antitrust law to Google's organic SERPs. (The document doesn't discuss AdWords.) Volokh admits that the government has authority, in some cases, under antitrust law to regulate companies such as newspapers when their practices discriminate unjustly against competitors. However, he argues, this does not extend to matters of editorial discretion, even where the speaker has a "substantial monopoly."

Volokh cites cases that stand for the broad idea that the protected exercise of speech cannot be regulated by antitrust law: "search engines' selection and arrangement decisions reflect editorial judgments about what to say and how to say it, which are protected by the First Amendment." He adds, "[E]conomic regulations may not be used to require a speaker to include certain material in its speech product."

The bottom line argument is that even under the guise of antitrust enforcement the government cannot interfere with protected (and absolute) editorial discretion (i.e., the Google SERP).

Is Google More Like a Cable TV Company or a Newspaper?

One case that "goes the other way" and could potentially support regulation of the Google SERPs is Turner Broadcasting System, Inc. v. FCC (1994). That Supreme Court case held that cable TV operators can be forced to carry programming against their will under federal law. The cable companies had argued that the federal "must carry" law was impermissible content regulation barred by the First Amendment.

Volokh distinguishes the case and argues that Google and search engines are not like cable TV companies in several respects. He says that in the Turner case there was almost no other way for consumers to access the disputed programming. By contrast, he says, there are plenty of other ways online to access content besides Google. People can use Bing or Yahoo, for example.

The Turner court reportedly held that the cable companies were mere “conduits” of third party speech and not editorial content producers themselves (like newspapers). Google is more like a newspaper than a cable TV company says Volokh. Yet I’m not quite sure the Turner case is that easily brushed aside.

Google is not generating the contents of its own SERPs (except in selected cases such as Google Maps or Google+). Rather it, like cable companies, is conveying third party speech and content (in the form of links). One could persuasively argue that Google is in fact more like a cable company than a newspaper.

Still, the government would face a formidable challenge in overcoming the weight of First Amendment case law that Volokh cites, which supports Google's "absolute" discretion over what shows up in SERPs. I agree that it appears a very tough case for the feds to make.

Volokh first amendment paper


Google-NSA Relationship Secrecy Continues Despite Court Efforts

The Federal Appeals Court has upheld the National Security Agency's decision not to release information confirming or denying whether it has a relationship with Google. This particular ruling deals with encryption and cybersecurity related to a 2010 cyberattack on Google users in China.

The Electronic Privacy Information Center (EPIC) wanted to know more about Google's relationship with the NSA. The NSA refused to confirm or deny a relationship between the two, arguing that doing so could make the U.S. government and its information systems potentially vulnerable to attacks.

A lower court sided with the NSA last year, and last Friday that ruling was upheld by the U.S. Court of Appeals for the District of Columbia.

This all started back in 2010, when Google was hit by a cyberattack from Chinese hackers. The attack focused on Gmail accounts, specifically the Gmail accounts of human rights activists.

It was suggested that the Chinese government may have instigated the attack, though it denies this and any involvement in the incident. Soon after, reports surfaced that Google and the NSA had teamed up to prevent similar attacks in the future.

Soon after this happened, EPIC sought all the documents and information that had passed between the two. In response, the NSA invoked the "Glomar" response, which refers to a 1970s case in which the agency would neither confirm nor deny the existence of records on the topic in question.

Wired reported that Judge Janice Rogers Brown, in a 3-0 opinion, sided with the government’s contention that acknowledging any records “might reveal whether the NSA investigated the threat,” or “deemed the threat a concern to the security of the U.S. government.”