Snowden leaked secret documents with a common web crawler

Pierluigi Paganini February 10, 2014

According to an internal investigation, Snowden leaked the secret NSA documents using web crawler software designed to search, index, and back up a website.

The latest revelation in the Snowden case concerns the way the whistleblower collected the huge quantity of secret documents from the National Security Agency (NSA).

The leaked documents have shown the world the astonishing capabilities of the US spy machine: everyone and everything was a target, with government agencies, politicians, allies, and enemies all in the sights of US intelligence.
Ironically, the super agency was undone by a single individual who beat it in the very area where it had just grown strongest: the confidentiality of information.

It has been estimated that Snowden stole more than 1.7 million confidential files from the NSA, but how did he do it?

The New York Times reports that Snowden used inexpensive and widely available software to “scrape” the NSA’s networks; the revelation came from intelligence officials involved in the investigation. Another disturbing detail of the discovery is that Snowden continued to gather internal documents even after he was briefly challenged by agency officials.

According to the officials, Snowden used a simple web crawler application to scan the network and scrape data out of the agency’s systems.

“We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said. The process was “quite automated.”
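To appreciate how unsophisticated such a tool can be, consider a minimal illustrative sketch in Python of a crawler that fetches a page, saves it, and follows every link it finds. This is emphatically not the tool Snowden used, whose details were never disclosed; the start URL, page limit, and output directory below are invented for the example.

```python
import os
import urllib.parse
from collections import deque

import requests
from bs4 import BeautifulSoup

# Hypothetical values -- the real tool, targets, and settings are unknown.
START_URL = "http://intranet.example/index.html"
OUT_DIR = "mirror"
MAX_PAGES = 1000


def crawl(start_url, out_dir, max_pages):
    """Breadth-first crawl: fetch a page, save it, queue the links it contains."""
    os.makedirs(out_dir, exist_ok=True)
    host = urllib.parse.urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    saved = 0
    while queue and saved < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
        except requests.RequestException:
            continue  # skip unreachable pages and keep crawling
        # Save the raw page under a filesystem-safe name.
        fname = urllib.parse.quote(url, safe="") + ".html"
        with open(os.path.join(out_dir, fname), "wb") as f:
            f.write(resp.content)
        saved += 1
        # Extract every link on the page and enqueue unseen same-host URLs.
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urllib.parse.urljoin(url, a["href"])
            if urllib.parse.urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)


if __name__ == "__main__":
    crawl(START_URL, OUT_DIR, MAX_PAGES)
```

Pointed at an internal wiki or document portal, a loop this simple will patiently mirror everything reachable from its starting page, which matches the officials’ description of a “quite automated” process.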

A recently published report, “The Federal Government’s Track Record,” painted a scary picture of the state of cybersecurity across US Government entities, and this circumstance seems to confirm it.


The NSA declined to comment on its investigation, and the officials contacted anonymously provided no further information on the web crawler used by Snowden, but it remains a mystery why the presence of a web crawler on a highly classified network went undetected.

“Agency officials insist that if Mr. Snowden had been working from N.S.A. headquarters at Fort Meade, Md., which was equipped with monitors designed to detect when a huge volume of data was being accessed and downloaded, he almost certainly would have been caught. But because he worked at an agency outpost that had not yet been upgraded with modern security measures, his copying of what the agency’s newly appointed No. 2 officer, Rick Ledgett, recently called “the keys to the kingdom” raised few alarms.” 
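The monitors the officials describe amount to volume-based anomaly detection. As a purely illustrative sketch (the NSA’s actual tooling has never been disclosed), a crude version of such a check takes only a few lines of Python; the log format and the 500 MB/hour threshold are invented for the example.

```python
from collections import defaultdict

# Hypothetical threshold -- the real monitors and their tuning are not public.
BYTES_PER_HOUR_LIMIT = 500 * 1024 * 1024  # flag anyone pulling >500 MB/hour


def flag_heavy_downloaders(access_log):
    """access_log: iterable of (user, hour_bucket, bytes_transferred) records.
    Returns the set of users whose hourly transfer volume exceeds the limit."""
    totals = defaultdict(int)
    for user, hour, nbytes in access_log:
        totals[(user, hour)] += nbytes
    return {user for (user, hour), total in totals.items()
            if total > BYTES_PER_HOUR_LIMIT}


# Toy example: an automated bulk download stands out against normal browsing.
log = [
    ("alice", "2013-05-01T09", 4_000_000),    # ordinary activity
    ("bob",   "2013-05-01T09", 900_000_000),  # crawler-scale volume
]
print(flag_heavy_downloaders(log))  # -> {'bob'}
```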

“When inserted with Mr. Snowden’s passwords, the web crawler became especially powerful. Investigators determined he probably had also made use of the passwords of some colleagues or supervisors.” reports the NYT.

The head of the Defense Intelligence Agency, Lt. Gen. Michael T. Flynn, told lawmakers that Mr. Snowden’s disclosures “could tip off adversaries to American military tactics and operations”; for this reason, the US Government is expending great effort and huge investments to restore a secure situation.

“Everything that he touched, we assume that he took,” said General Flynn, including details of how the military tracks terrorists, of enemies’ vulnerabilities and of American defenses against improvised explosive devices. “We assume the worst case.”

Also interesting is the comment of Richard Bejtlich, chief security strategist at FireEye, who is skeptical about the internal supervisory measures: “Once you are inside, the assumption is that you are supposed to be there, like in most organizations,” he said. “But that doesn’t explain why they weren’t more vigilant about excessive activity in the system.”

The only certainty is that the NSA failed to detect Snowden’s “insider attack.”

Pierluigi Paganini

(Security Affairs – Snowden, NSA)



