National Security Agency Whistle-blower William Binney on U.S. Government Efforts to Control American People — Binney Should Be Awarded The Presidential Medal of Freedom — Videos

Posted on May 3, 2015. Filed under: American History, Articles, Blogroll, Books, Business, Central Intelligence Agency (CIA), College, Communications, Computers, Constitution, Corruption, Crime, Crisis, Data Storage, Documentary, Drug Cartels, Economics, Education, Employment, Entertainment, External Hard Drives, Faith, Family, Federal Bureau of Investigation (FBI), Federal Communications Commission, Federal Government, Federal Government Budget, Films, Fiscal Policy, Foreign Policy, Freedom, government, government spending, history, liberty, Life, Links, Literacy, media, Movies, National Security Agency (NSA), Non-Fiction, People, Philosophy, Photos, Police, Politics, Press, Psychology, Radio, Rants, Raves, Security, Speech, Systems, Tax Policy, Terrorism, Video, War, Wealth, Welfare, Wisdom, Writing

Former NSA technical director Binney sits in the witness stand of a parliamentary inquiry in Berlin

NSA Whistle-blower William Binney: The Future of FREEDOM

A 36-year veteran of America’s Intelligence Community, William Binney resigned from his position as Director for Global Communications Intelligence (COMINT) at the National Security Agency (NSA) and blew the whistle, after discovering that his efforts to protect the privacy and security of Americans were being undermined by those above him in the chain of command.

The NSA data-monitoring program which Binney and his team had developed, codenamed ThinThread, was aimed not at foreign targets as intended but at Americans (under the program codenamed Stellar Wind), destroying privacy here and around the world. Binney voices his call to action for the billions of individuals whose rights are currently being violated.

William Binney speaks out in this feature-length interview with Tragedy and Hope’s Richard Grove, focused on the topic of the ever-growing Surveillance State in America.

January 22, 2015 (Berlin, Germany) – The Government Accountability Project (GAP) is proud to announce that retired NSA Technical Director and GAP client William “Bill” Binney will accept the Sam Adams Associates for Integrity in Intelligence Award today in Berlin, Germany. The award is presented annually by the Sam Adams Associates for Integrity in Intelligence (SAAII) to a professional who has taken a strong stand for ethics and integrity. http://whistleblower.org/press/nsa-wh…

NSA Whistle-blower: Everyone in US under virtual surveillance, all info stored, no matter the post

Former NSA Head Exposes Agency’s Real Crimes

Edward Snowden, v 1.0: NSA Whistleblower William Binney Tells All

“Where I see it going is toward a totalitarian state,” says William Binney. “You’ve got the NSA doing all this collecting of material on all of its citizens – that’s what the SS, the Gestapo, the Stasi, the KGB, and the NKVD did.”

Binney is talking about the collection of various forms of personal data on American citizens by the National Security Agency (NSA), where he worked for 30 years before quitting in 2001 from his high-placed post as technical leader for intelligence. A registered Republican for most of his life, Binney volunteered for military service during the Vietnam War, which led to his being hired by the NSA in the early ’70s.

In 2002 – long before the revelations of Edward Snowden rocked the world – Binney and several former colleagues went to Congress and the Department of Defense, asking that the NSA be investigated. Not only was the super-secretive agency wasting taxpayer dollars on ineffective programs, they argued, it was broadly violating constitutional guarantees to privacy and due process.

The government didn’t just turn a blind eye to the agency’s activities; it later accused the whistleblowers of leaking state secrets. A federal investigation of Binney – including an FBI search and seizure of his home and office computers that destroyed his consulting business – exonerated him on all charges.

“We are a clear example that [going through] the proper channels doesn’t work,” says Binney, who approves of Edward Snowden’s strategy of going straight to the media. At the same time, Binney criticizes Snowden’s leaking of documents not directly related to the NSA’s surveillance of American citizens and violation of constitutional rights. Binney believes that the NSA is vital to national security but has become unmoored due to technological advances that vastly extend its capabilities and leadership that has no use for limits on government power. “They took that program designed [to prevent terrorist attacks] and used it to spy on American citizens and everyone else in the world,” flatly declares Binney (33:30).

Binney sat down with Reason TV’s Nick Gillespie to discuss “Trailblazer”, a data-collection program which was used on American citizens (1:00), why he thinks the NSA had the capability to stop the 9/11 attacks (7:00), his experience being raided by the FBI in 2007 (12:50), and why former President Gerald Ford, usually regarded as a hapless time-server, is one of his personal villains (41:25).

NSA Whistle-Blower Tells All: The Program | Op-Docs | The New York Times

William Binney: NSA had 9/11 foreknowledge

NSA Whistleblower Supports 9/11 Truth – William Binney and Richard Gage on GRTV

“The NSA Is Lying”: U.S. Government Has Copies of Most of Your Emails Says NSA Whistleblower

William Binney (U.S. intelligence official)

From Wikipedia, the free encyclopedia
William Binney

Binney at the Congress on Privacy & Surveillance (2013) of the École polytechnique fédérale de Lausanne (EPFL).
Born William Edward Binney
Pennsylvania, US
Education Pennsylvania State University (B.S., 1970)
Occupation Cryptanalyst-mathematician
Employer National Security Agency (NSA)
Known for Cryptography, SIGINT analysis, whistleblowing
Awards Meritorious Civilian Service Award; Joe A. Callaway Award for Civic Courage (2012)[1]

William Edward Binney[2] is a former highly placed intelligence official with the United States National Security Agency (NSA)[3] turned whistleblower who resigned on October 31, 2001, after more than 30 years with the agency. He was a high-profile critic of his former employers during the George W. Bush administration.

Binney continues to speak out during Barack Obama’s presidency about the NSA’s data collection policies, and continues to give interviews in the media regarding his experiences and his views on communication intercepts of American citizens by governmental agencies. In a legal case, Binney has testified in an affidavit that the NSA is in deliberate violation of the U.S. Constitution.

Biography

Binney grew up in rural Pennsylvania and graduated with a Bachelor of Science degree in mathematics from the Pennsylvania State University in 1970. He said that he volunteered for the Army during the Vietnam era in order to select work that would interest him rather than be drafted and have no input. He was found to have strong aptitudes for mathematics, analysis, and code-breaking,[4] and served four years from 1965–1969 at the Army Security Agency before going to the NSA in 1970. Binney was a Russia specialist and worked in the operations side of intelligence, starting as an analyst and ending as Technical Director prior to becoming a geopolitical world Technical Director. In the 1990s, he co-founded a unit on automating signals intelligence with NSA research chief Dr. John Taggart.[5] Binney’s NSA career culminated as Technical Leader for intelligence in 2001. Having expertise in intelligence analysis, traffic analysis, systems analysis, knowledge management, and mathematics (including set theory, number theory, and probability),[6] Binney has been described as one of the best analysts in the NSA’s history.[7] After retiring from the NSA he founded “Entity Mapping, LLC”, a private intelligence agency, together with fellow NSA whistleblower J. Kirk Wiebe, to market their analysis program to government agencies. The NSA continued to retaliate against them, ultimately preventing them from getting work or causing contracts they had secured to be terminated abruptly.[8]

Whistleblowing

Binney sitting in the offices of Democracy Now! in New York City, prior to appearing with hosts Amy Goodman, Juan Gonzalez, and guest Jacob Appelbaum. Photo taken by Jacob Appelbaum.

In September 2002, he, along with J. Kirk Wiebe and Edward Loomis, asked the U.S. Defense Department to investigate the NSA for allegedly wasting “millions and millions of dollars” on Trailblazer, a system intended to analyze data carried on communications networks such as the Internet. Binney had been one of the inventors of an alternative system, ThinThread, which was shelved when Trailblazer was chosen instead. Binney has also been publicly critical of the NSA for spying on U.S. citizens, saying of its expanded surveillance after the September 11, 2001 attacks that “it’s better than anything that the KGB, the Stasi, or the Gestapo and SS ever had”,[9] as well as noting Trailblazer’s ineffectiveness and unjustified high cost compared to the far less intrusive ThinThread.[10] He was furious that the NSA hadn’t uncovered the 9/11 plot and stated that intercepts it had collected but not analyzed likely would have garnered timely attention with his leaner, more focused system.[7]

After he left the NSA in 2001, Binney was one of several people investigated as part of an inquiry into the 2005 New York Times exposé[11][12] on the agency’s warrantless eavesdropping program. Binney was cleared of wrongdoing after three interviews with FBI agents beginning in March 2007, but one morning in July 2007, a dozen agents armed with rifles appeared at his house, one of whom entered the bathroom and pointed his gun at Binney, still towelling off from a shower. In that raid, the FBI confiscated a desktop computer, disks, and personal and business records. The NSA revoked his security clearance, forcing him to close a business he ran with former colleagues at a loss of a reported $300,000 in annual income. In 2012, Binney and his co-plaintiffs went to federal court to get the items back. Binney spent more than $7,000 on legal fees.[13]

During interviews on Democracy Now! in April and May 2012,[14] with elaboration in July 2012 at 2600’s hacker conference HOPE[4] and at DEF CON a couple weeks later,[15] Binney repeated estimates that the NSA (particularly its Stellar Wind project[16]) had intercepted 20 trillion communications “transactions” of Americans such as phone calls, emails, and other forms of data (but not including financial data). This includes most of the emails of US citizens. Binney disclosed in an affidavit for Jewel v. NSA[17] that the agency was “purposefully violating the Constitution”.[6] Binney also notes that he found out after retiring that the NSA was pursuing collect-it-all rather than targeted surveillance even before the 9/11 attacks.

Binney was invited as a witness by the NSA commission of the German Bundestag. On July 3, 2014, Der Spiegel reported his testimony that the NSA wanted to have information about everything. In Binney’s view this is a totalitarian approach, previously seen only in dictatorships.[18] Binney stated that the goal was also to control people, and that it is now possible in principle to surveil the whole population, abroad and in the US, which in his view contradicts the United States Constitution. The NSA began its mass surveillance in October 2001, shortly after the 9/11 attacks, he said; that is why he left the agency shortly afterwards, after more than 30 years of employment. Binney mentioned that there were already about 6,000 analysts working on surveillance at the NSA during his tenure. According to him, everything changed after 9/11: the NSA used the attacks as a justification to start indiscriminate data collection. “This was a mistake. But they still do it”, he said. The agency saves the data as long as possible: “They do not discard anything. If they have anything they keep it.” Since then, the NSA has been saving collected data indefinitely. Binney said he deplored the NSA’s development of the past few years toward collecting data not only on groups suspected of criminal or terrorist activities: “We have moved away from the collection of these data to the collection of data of the 7 billion people on our planet.” Binney said he argued even then for pulling only relevant data from the cables. Access to the data was granted to departments of the government or the IRS.[18]

In August 2014 Binney was among the signatories of an open letter by the group Veteran Intelligence Professionals for Sanity to German chancellor Angela Merkel, in which they urged the Chancellor to be suspicious of U.S. intelligence regarding the alleged Russian invasion of Eastern Ukraine.[19][20]

See also

The Future of Freedom: A Feature Interview with NSA Whistleblower William Binney

http://en.wikipedia.org/wiki/William_Binney_%28U.S._intelligence_official%29

Background Articles and Videos

Presidential Medal of Freedom

From Wikipedia, the free encyclopedia
Presidential Medal of Freedom
Awarded by President of the United States
Type Medal
Awarded for “An especially meritorious contribution to the security or national interests of the United States, world peace, cultural or other significant public or private endeavors.”[1]
Status Active
Statistics
Established 1963
First awarded 1963
Distinct recipients unknown; an average of fewer than 11 per year since 1993[2]
Precedence
Next (lower) Presidential Citizens Medal
Service ribbon of the Presidential Medal of Freedom
(left: Medal with Distinction)

The Presidential Medal of Freedom is an award bestowed by the President of the United States and is—along with the comparable Congressional Gold Medal, bestowed by an act of U.S. Congress—the highest civilian award of the United States. It recognizes those individuals who have made “an especially meritorious contribution to the security or national interests of the United States, world peace, cultural or other significant public or private endeavors”.[3] The award is not limited to U.S. citizens and, while it is a civilian award, it can also be awarded to military personnel and worn on the uniform.

It was established in 1963 and replaced the earlier Medal of Freedom that was established by President Harry S. Truman in 1945 to honor civilian service during World War II.

History of the award

The Presidential Medal of Freedom is similar in name to the earlier Medal of Freedom,[3] but much closer in meaning and precedence to the Medal for Merit. The Presidential Medal of Freedom is currently the supreme civilian decoration in precedence, whereas the Medal of Freedom was inferior in precedence to the Medal for Merit; and the Medal of Freedom was awarded by any of three Cabinet secretaries, whereas the Medal for Merit was awarded by the president, as is the Presidential Medal of Freedom. Another measure of the difference between these two similarly named but very distinct awards is their per-capita frequency of award: from 1946 to 1961 the average annual incidence of award of the Medal of Freedom was approximately 1 per every 86,500 adult U.S. citizens; from 1996 to 2011 the average annual incidence of award of the Presidential Medal of Freedom was approximately 1 per every 20,500,000 adult U.S. citizens (so, on an annualized per-capita basis, roughly 240 Medals of Freedom have been awarded per one Presidential Medal of Freedom).[2][4]
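The roughly 240-to-1 ratio quoted above follows directly from the two per-capita incidence figures; a quick back-of-the-envelope check (using only the approximate numbers from the text, not exact award counts):

```python
# Rough check of the per-capita award frequencies quoted above.
# Both figures are the approximate annual incidences from the text.
medal_of_freedom = 1 / 86_500          # awards per adult citizen per year, 1946-1961
presidential_medal = 1 / 20_500_000    # awards per adult citizen per year, 1996-2011

ratio = medal_of_freedom / presidential_medal
print(round(ratio))  # ~237, consistent with the "roughly 240 to 1" figure
```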

President John F. Kennedy established the current decoration in 1963 through Executive Order 11085, with unique and distinctive insignia, vastly expanded purpose, and far higher prestige.[1] It was the first U.S. civilian neck decoration and, in the grade of Awarded With Distinction, is the only U.S. sash and star decoration (the Chief Commander degree of the Legion of Merit – which may only be awarded to foreign heads of state – is a star decoration, but without a sash). The Executive Order calls for the medal to be awarded annually on or around July 4, and at other convenient times as chosen by the president,[5] but it has not been awarded every year (e.g., 2001, 2010). Recipients are selected by the president, either on his own initiative or based on recommendations. The order establishing the medal also expanded the size and the responsibilities of the Distinguished Civilian Service Awards Board so it could serve as a major source of such recommendations.

The medal may be awarded to an individual more than once; John Kenneth Galbraith and Colin Powell each have received two awards; Ellsworth Bunker received both of his awards With Distinction. It may also be awarded posthumously; examples include Cesar Chavez, Paul “Bear” Bryant, Roberto Clemente, Jack Kemp, John F. Kennedy, Thurgood Marshall and Lyndon Johnson.

Insignia

Medal and accoutrements, including undress ribbon, miniature, and lapel badge.

Graphical representation of the Presidential Medal of Freedom with Distinction

The badge of the Presidential Medal of Freedom is in the form of a golden star with white enamel, with a red enamel pentagon behind it; the central disc bears thirteen gold stars on a blue enamel background (taken from the Great Seal of the United States) within a golden ring. Golden American bald eagles with spread wings stand between the points of the star. It is worn around the neck on a blue ribbon with white edge stripes.

A special grade of the medal, known as the Presidential Medal of Freedom with Distinction,[6] has a larger execution of the same medal design worn as a star on the left chest along with a sash over the right shoulder (similar to how the insignia of a Grand Cross is worn), with its rosette (blue with white edge, bearing the central disc of the medal at its center) resting on the left hip. When the medal With Distinction is awarded, the star may be presented depending from a neck ribbon and can be identified by its larger size than the standard medal (compare size of medals in pictures below; President Reagan’s was awarded With Distinction).

Both medals may also be worn in miniature form on a ribbon on the left chest, with a silver American bald eagle with spread wings on the ribbon, or a golden American bald eagle for a medal awarded With Distinction. In addition, the medal is accompanied by a service ribbon for wear on military service uniform, a miniature medal pendant for wear on mess dress or civilian formal wear, and a lapel badge for wear on civilian clothes (all shown in the accompanying photograph of the full presentation set).

Recipients

Gallery

See also

References

  1. Executive Order 11085, signed February 22, 1963; Federal Register 28 FR 1759, February 26, 1963
  2. Senate.gov
  3. Executive Order 9586, signed July 6, 1945; Federal Register 10 FR 8523, July 10, 1945
  4. Census.gov
  5. Presidential Medal of Freedom Award

http://en.wikipedia.org/wiki/Presidential_Medal_of_Freedom


Through a PRISM, Darkly – Everything we know about NSA spying [30c3]

Published on Dec 30, 2013

Through a PRISM, Darkly
Everything we know about NSA spying

From Stellar Wind to PRISM, Boundless Informant to EvilOlive, the NSA spying programs are shrouded in secrecy and rubber-stamped by secret opinions from a court that meets in a Faraday cage. The Electronic Frontier Foundation’s Kurt Opsahl explains the known facts about how the programs operate and the laws and regulations the U.S. government asserts allow the NSA to spy on you.
The Electronic Frontier Foundation, a non-profit civil society organization, has been litigating against the NSA spying program for the better part of a decade. EFF has collected and reviewed dozens of documents, from the original NY Times stories in 2005 and the first AT&T whistleblower in 2006, through the latest documents released in the Guardian or obtained through EFF’s Freedom of Information (government transparency) litigation. EFF attorney Kurt Opsahl’s lecture will describe how the NSA spying program works, the underlying technologies, the targeting procedures (how they decide whom to focus on), the minimization procedures (how they decide which information to discard), and help you make sense of the many code names and acronyms in the news. He will also discuss the legal and policy ramifications that have become part of the public debate following the recent disclosures, and what you can do about it. After summarizing the programs, technologies, and legal/policy framework in the lecture, the audience can ask questions.

Speaker: Kurt Opsahl
EventID: 5255
Event: 30th Chaos Communication Congress [30c3] by the Chaos Computer Club [CCC]
Location: Congress Centrum Hamburg (CCH); Am Dammtor; Marseiller Straße; 20355 Hamburg; Germany
Language: English

Glenn Beck’s “SURVEILLANCE STATE”

Inside the NSA

Ed Snowden, NSA, and Fairy Tales

AT&T Spying On Internet Traffic

For years the National Security Agency has been spying on each and every keystroke. The national headquarters of AT&T is in Missouri, where ex-employees describe a secret room. The program is called “Splitter Cut-In & Test Procedure.”

NSA Whistle-Blower Tells All – Op-Docs: The Program

The filmmaker Laura Poitras profiles William Binney, a 32-year veteran of the National Security Agency who helped design a top-secret program he says is broadly collecting Americans’ personal data.

NSA Whistleblower: Everyone in US under virtual surveillance, all info stored, no matter the post

He told you so: Bill Binney talks NSA leaks

William Binney – The Government is Profiling You (The NSA is Spying on You)

‘After 9/11 NSA had secret deal with White House’

The story of Whistleblower Thomas Drake

Whistleblowers, Part Two: Thomas Drake

NSA Whistleblower Thomas Drake speaks at National Press Club – March 15, 2013

Meet Edward Snowden: NSA PRISM Whistleblower

The Truth About Edward Snowden

N.S.A. Spying: Why Does It Matter?

Inside the NSA: America’s Cyber Secrets

NSA Whistleblower Exposes Obama’s Dragnet

AT&T whistleblower against immunity for Bush spy program-1/2

AT&T Whistleblower Urges Against Immunity for Telecoms in Bush Spy Program

The Senate is expected to vote on a controversial measure to amend the Foreign Intelligence Surveillance Act tomorrow. The legislation would rewrite the nation’s surveillance laws and authorize the National Security Agency’s secret program of warrantless wiretapping. We speak with Mark Klein, a technician with AT&T for over twenty-two years. In 2006 Klein leaked internal AT&T documents that revealed the company had set up a secret room in its San Francisco office to give the National Security Agency access to its fiber optic internet cables.

AT&T whistleblower against immunity for Bush spy program-2/2

Enemy Of The State 1998 (1080p) (Full movie)

Background Articles and Videos

Stellar Wind

Stellar Wind was the open secret code name for four surveillance programs by the United States National Security Agency (NSA) during the presidency of George W. Bush and revealed by Thomas Tamm to The New York Times reporters James Risen and Eric Lichtblau.[1] The operation was approved by President George W. Bush shortly after the September 11 attacks in 2001.[2] Stellar Wind was succeeded during the presidency of Barack Obama by four major lines of intelligence collection in the territorial United States, together capable of spanning the full range of modern telecommunications.[3]

The program’s activities involved data mining of a large database of the communications of American citizens, including e-mail communications, phone conversations, financial transactions, and Internet activity.[1] William Binney, a retired Technical Leader with the NSA, discussed some of the architectural and operational elements of the program at the 2012 Chaos Communication Congress.[4]

There were internal disputes within the Justice Department about the legality of the program, because data are collected for large numbers of people, not just the subjects of Foreign Intelligence Surveillance Act (FISA) warrants.[4]

During the Bush Administration, the Stellar Wind cases were referred to by FBI agents as “pizza cases” because many seemingly suspicious cases turned out to be food takeout orders. According to FBI Director Robert Mueller, approximately 99 percent of the cases led nowhere, but “it’s that other 1% that we’ve got to be concerned about”.[2] One of the known uses of these data was the creation of suspicious activity reports, or “SARs”, about people suspected of terrorist activities. It was one of these reports that revealed former New York governor Eliot Spitzer’s use of prostitutes, even though he was not suspected of terrorist activities.[1]

In March 2012 Wired magazine published “The NSA Is Building the Country’s Biggest Spy Center (Watch What You Say)”, describing a vast new NSA facility in Utah and stating that “For the first time, a former NSA official has gone on the record to describe the program, codenamed Stellar Wind, in detail,” naming the official as William Binney, a former NSA code breaker. Binney went on to say that the NSA had highly secured rooms that tap into major switches and satellite communications at both AT&T and Verizon.[5] The article suggested that Stellar Wind, though officially discontinued, is actually an active program.

http://en.wikipedia.org/wiki/Stellar_Wind_%28code_name%29

PRISM

PRISM is a clandestine national security electronic surveillance program operated by the United States National Security Agency (NSA) since 2007.[1][2][3][Notes 1] PRISM is a government codename for a data collection effort known officially as US-984XN.[8][9] It is operated under the supervision of the United States Foreign Intelligence Surveillance Court pursuant to the Foreign Intelligence Surveillance Act (FISA).[10] The existence of the program was leaked by NSA contractor Edward Snowden and published by The Guardian and The Washington Post on June 6, 2013.

A document included in the leak indicated that the PRISM SIGAD was “the number one source of raw intelligence used for NSA analytic reports.”[11] The President’s Daily Brief, an all-source intelligence product, cited PRISM data as a source in 1,477 items in 2012.[12] The leaked information came to light one day after the revelation that the United States Foreign Intelligence Surveillance Court had been requiring the telecommunications company Verizon to turn over to the NSA logs tracking all of its customers’ telephone calls on an ongoing daily basis.[13][14]

According to the Director of National Intelligence James Clapper, PRISM cannot be used to intentionally target any Americans or anyone in the United States. Clapper said a special court, Congress, and the executive branch oversee the program and extensive procedures ensure the acquisition, retention, and dissemination of data accidentally collected about Americans is kept to a minimum.[15] Clapper issued a statement and “fact sheet”[16] to correct what he characterized as “significant misimpressions” in articles by The Washington Post and The Guardian newspapers.[17]

History

Slide showing that much of the world’s communications flow through the US

Details of information collected via PRISM

PRISM is a “Special Source Operation” in the tradition of NSA’s intelligence alliances with as many as 100 trusted U.S. companies since the 1970s.[1] A prior program, the Terrorist Surveillance Program, was implemented in the wake of the September 11 attacks under the George W. Bush Administration but was widely criticized and had its legality questioned, because it was conducted without approval of the Foreign Intelligence Surveillance Court (FISC).[18][19][20][21] PRISM was authorized by an order of the FISC.[11] Its creation was enabled by the Protect America Act of 2007 under President Bush and the FISA Amendments Act of 2008, which legally immunized private companies that cooperated voluntarily with US intelligence collection and was renewed by Congress under President Obama in 2012 for five years until December 2017.[2][22] According to The Register, the FISA Amendments Act of 2008 “specifically authorizes intelligence agencies to monitor the phone, email, and other communications of U.S. citizens for up to a week without obtaining a warrant” when one of the parties is outside the U.S.[22]

PRISM was first publicly revealed on June 6, 2013, after classified documents about the program were leaked to The Washington Post and The Guardian by American Edward Snowden.[2][1] The leaked documents included 41 PowerPoint slides, four of which were published in news articles.[1][2] The documents identified several technology companies as participants in the PRISM program, including (date of joining PRISM in parentheses) Microsoft (2007), Yahoo! (2008), Google (2009), Facebook (2009), Paltalk (2009), YouTube (2010), AOL (2011), Skype (2011), and Apple (2012).[23] The speaker’s notes in the briefing document reviewed by The Washington Post indicated that “98 percent of PRISM production is based on Yahoo, Google and Microsoft.”[1]

The slide presentation stated that much of the world’s electronic communications pass through the United States, because electronic communications data tend to follow the least expensive route rather than the most physically direct route, and the bulk of the world’s internet infrastructure is based in the United States.[11] The presentation noted that these facts provide United States intelligence analysts with opportunities for intercepting the communications of foreign targets as their electronic data pass into or through the United States.[2][11]

According to The Washington Post, the intelligence analysts search PRISM data using terms intended to identify suspicious communications of targets whom the analysts suspect with at least 51 percent confidence to not be United States citizens, but in the process, communication data of some United States citizens are also collected unintentionally.[1] Training materials for analysts tell them that while they should periodically report such accidental collection of non-foreign United States data, “it’s nothing to worry about.”[1]

Response from companies

The original Washington Post and Guardian articles reporting on PRISM noted that one of the leaked briefing documents said PRISM involves collection of data “directly from the servers” of several major internet services providers.[2][1]

Initial Public Statements

Corporate executives of several companies identified in the leaked documents told The Guardian that they had no knowledge of the PRISM program in particular and also denied making information available to the government on the scale alleged by news reports.[2][24] Statements of several of the companies named in the leaked documents were reported by TechCrunch and The Washington Post as follows:[25][26]

Slide listing companies and the date that PRISM collection began

  • Microsoft: “We provide customer data only when we receive a legally binding order or subpoena to do so, and never on a voluntary basis. In addition we only ever comply with orders for requests about specific accounts or identifiers. If the government has a broader voluntary national security program to gather customer data we don’t participate in it.”[25]
  • Yahoo!: “Yahoo! takes users’ privacy very seriously. We do not provide the government with direct access to our servers, systems, or network.”[25] “Of the hundreds of millions of users we serve, an infinitesimal percentage will ever be the subject of a government data collection directive.”[26]
  • Facebook: “We do not provide any government organization with direct access to Facebook servers. When Facebook is asked for data or information about specific individuals, we carefully scrutinize any such request for compliance with all applicable laws, and provide information only to the extent required by law.”[25]
  • Google: “Google cares deeply about the security of our users’ data. We disclose user data to government in accordance with the law, and we review all such requests carefully. From time to time, people allege that we have created a government ‘back door’ into our systems, but Google does not have a backdoor for the government to access private user data.”[25] “[A]ny suggestion that Google is disclosing information about our users’ Internet activity on such a scale is completely false.”[26]
  • Apple: “We have never heard of PRISM. We do not provide any government agency with direct access to our servers, and any government agency requesting customer data must get a court order.”[27]
  • Dropbox: “We’ve seen reports that Dropbox might be asked to participate in a government program called PRISM. We are not part of any such program and remain committed to protecting our users’ privacy.”[25]

In response to the technology companies’ denials that the NSA could directly access their servers, The New York Times reported that sources had stated the NSA was gathering the surveillance data from the companies using other technical means in response to court orders for specific sets of data.[13] The Washington Post suggested, “It is possible that the conflict between the PRISM slides and the company spokesmen is the result of imprecision on the part of the NSA author. In another classified report obtained by The Post, the arrangement is described as allowing ‘collection managers [to send] content tasking instructions directly to equipment installed at company-controlled locations,’ rather than directly to company servers.”[1] “[I]n context, ‘direct’ is more likely to mean that the NSA is receiving data sent to them deliberately by the tech companies, as opposed to intercepting communications as they’re transmitted to some other destination.”[26]

“If these companies received an order under the FISA amendments act, they are forbidden by law from disclosing having received the order and disclosing any information about the order at all,” Mark Rumold, staff attorney at the Electronic Frontier Foundation, told ABC News.[28]

Slide showing two sources of NSA data collection: the fiber-optic cables of the internet, handled by the Upstream program, and the servers of major internet companies, handled by PRISM.[29]

On May 28, 2013, Google was ordered by United States District Court Judge Susan Illston to comply with a National Security Letter issued by the FBI to provide user data without a warrant.[30] Kurt Opsahl, a senior staff attorney at the Electronic Frontier Foundation, in an interview with VentureBeat said, “I certainly appreciate that Google put out a transparency report, but it appears that the transparency didn’t include this. I wouldn’t be surprised if they were subject to a gag order.”[31]

The New York Times reported on June 7, 2013, that “Twitter declined to make it easier for the government. But other companies were more compliant, according to people briefed on the negotiations.”[32] The other companies held discussions with national security personnel on how to make data available more efficiently and securely.[32] In some cases, these companies made modifications to their systems in support of the intelligence collection effort.[32] The dialogues have continued in recent months, as General Martin Dempsey, the chairman of the Joint Chiefs of Staff, has met with executives including those at Facebook, Microsoft, Google and Intel.[32] These details on the discussions provide insight into the disparity between initial descriptions of the government program including a training slide which states “Collection directly from the servers”[29] and the companies’ denials.[32]

While providing data in response to a legitimate FISA request approved by FISC is a legal requirement, modifying systems to make it easier for the government to collect the data is not. This is why Twitter could legally decline to provide an enhanced mechanism for data transmission.[32] Other than Twitter, the companies were effectively asked to construct a locked mailbox and provide the key to the government, people briefed on the negotiations said.[32] Facebook, for instance, built such a system for requesting and sharing the information.[32] Google does not provide a lockbox system, but instead transmits required data by hand delivery or secure FTP.[33]

Post-PRISM transparency reports

In response to the publicity surrounding media reports of data-sharing, several companies requested permission to reveal more public information about the nature and scope of information provided in response to National Security requests.

On June 14, 2013, Facebook reported that the U.S. government had authorized it to communicate “about these numbers in aggregate, and as a range.” In a press release posted to its web site, Facebook reported, “For the six months ending December 31, 2012, the total number of user-data requests Facebook received from any and all government entities in the U.S. (including local, state, and federal, and including criminal and national security-related requests) – was between 9,000 and 10,000.” Facebook further reported that the requests affected “between 18,000 and 19,000” user accounts, a “tiny fraction of one percent” of its more than 1.1 billion active user accounts.[34]

Microsoft reported that for the same period, it received “between 6,000 and 7,000 criminal and national security warrants, subpoenas and orders affecting between 31,000 and 32,000 consumer accounts from U.S. governmental entities (including local, state and federal)” which impacted “a tiny fraction of Microsoft’s global customer base”.[35]
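As a rough sanity check of the ranges reported above, the upper-bound Facebook figures can be plugged into a short calculation. This is only an illustrative sketch using the numbers quoted in the article, not part of any company’s report:

```python
# Check the "tiny fraction of one percent" claim using the figures above.
facebook_accounts_affected = 19_000       # upper bound of the reported range
facebook_active_accounts = 1_100_000_000  # "more than 1.1 billion"

share = facebook_accounts_affected / facebook_active_accounts
print(f"{share:.7f}")         # ≈ 0.0000173
print(f"{share * 100:.4f}%")  # ≈ 0.0017%, indeed far below one percent
```

Even at the top of the reported range, fewer than two accounts in every 100,000 were affected, consistent with the companies’ characterization.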

Google issued a statement criticizing the requirement that data be reported in aggregated form, stating that lumping national security requests with criminal request data would be “a step backwards” from its previous, more detailed practices on its site transparency report. The company said that it would continue to seek government permission to publish the number and extent of FISA requests.[36]

Response from United States government

Executive branch

Shortly after publication of the reports by The Guardian and The Washington Post, the United States Director of National Intelligence, James Clapper, on June 7 released a statement confirming that for nearly six years the government of the United States had been using large internet services companies such as Google and Facebook to collect information on foreigners outside the United States as a defense against national security threats.[13] The statement read in part, “The Guardian and The Washington Post articles refer to collection of communications pursuant to Section 702 of the Foreign Intelligence Surveillance Act. They contain numerous inaccuracies.”[37] He went on to say, “Section 702 is a provision of FISA that is designed to facilitate the acquisition of foreign intelligence information concerning non-U.S. persons located outside the United States. It cannot be used to intentionally target any U.S. citizen, any other U.S. person, or anyone located within the United States.”[37] Clapper concluded his statement by stating “The unauthorized disclosure of information about this important and entirely legal program is reprehensible and risks important protections for the security of Americans.”[37] On March 12, 2013, Clapper had told the United States Senate Select Committee on Intelligence that the NSA does “not wittingly” collect any type of data on millions or hundreds of millions of Americans.[38] In an NBC News interview, Clapper said he answered Senator Wyden’s question in the “least untruthful manner by saying no”.[39]

Clapper also stated that “the NSA collects the phone data in broad swaths, because collecting it (in) a narrow fashion would make it harder to identify terrorism-related communications. The information collected lets the government, over time, make connections about terrorist activities. The program doesn’t let the U.S. listen to people’s calls, but only includes information like call length and telephone numbers dialed.”[15]

On June 8, 2013, Clapper said “the surveillance activities published in The Guardian and The Washington Post are lawful and conducted under authorities widely known and discussed, and fully debated and authorized by Congress,”[40][10] and released a fact sheet describing PRISM as “an internal government computer system used to facilitate the government’s statutorily authorized collection of foreign intelligence information from electronic communication service providers under court supervision, as authorized by Section 702 of the Foreign Intelligence Surveillance Act (FISA) (50 U.S.C. § 1881a).”[10]

The National Intelligence fact sheet further stated that “the United States Government does not unilaterally obtain information from the servers of U.S. electronic communication service providers. All such information is obtained with FISA Court approval and with the knowledge of the provider based upon a written directive from the Attorney General and the Director of National Intelligence.” It said that the Attorney General provides FISA Court rulings and semi-annual reports about PRISM activities to Congress, “provid[ing] an unprecedented degree of accountability and transparency.”[10]

The President of the United States, Barack Obama, said on June 7 “What you’ve got is two programs that were originally authorized by Congress, have been repeatedly authorized by Congress. Bipartisan majorities have approved them. Congress is continually briefed on how these are conducted. There are a whole range of safeguards involved. And federal judges are overseeing the entire program throughout.”[41] He also said, “You can’t have 100 percent security and then also have 100 percent privacy and zero inconvenience. You know, we’re going to have to make some choices as a society.”[41]

In separate statements, senior Obama administration officials (not named in the source) said that Congress had been briefed 13 times on the programs since 2009.[42]

Legislative branch

In contrast to their swift and forceful reactions the previous day to allegations that the government had been conducting surveillance of United States citizens’ telephone records, Congressional leaders initially had little to say about the PRISM program the day after leaked information about the program was published. Several lawmakers declined to discuss PRISM, citing its top-secret classification,[43] and others said that they had not been aware of the program.[44] After statements had been released by the President and the Director of National Intelligence, some lawmakers began to comment:

Senator John McCain (R-AZ)

  • June 9 “We passed the Patriot Act. We passed specific provisions of the act that allowed for this program to take place, to be enacted in operation,”[45]

Senator Dianne Feinstein (D-CA), chair of the Senate Intelligence Committee

  • June 9 “These programs are within the law”, “part of our obligation is keeping Americans safe”, “Human intelligence isn’t going to do it”.[46]
  • June 9 “Here’s the rub: the instances where this has produced good — has disrupted plots, prevented terrorist attacks, is all classified, that’s what’s so hard about this.”[47]
  • June 11 “It went fine…we asked him [Keith Alexander] to declassify things because it would be helpful (for people and lawmakers to better understand the intelligence programs).” “I’ve just got to see if the information gets declassified. I’m sure people will find it very interesting.”[48]

Senator Susan Collins (R-ME), member of Senate Intelligence Committee and past member of Homeland Security Committee

  • June 11 “I had, along with Joe Lieberman, a monthly threat briefing, but I did not have access to this highly compartmentalized information” and “How can you ask when you don’t know the program exists?”[49]

Representative John Boehner (R-OH), Speaker of the House of Representatives

  • June 11 “He’s a traitor”[50] (referring to Edward Snowden)

Representative Jim Sensenbrenner (R-WI), principal sponsor of the Patriot Act

  • June 9, “This is well beyond what the Patriot Act allows.”[51] “President Obama’s claim that ‘this is the most transparent administration in history’ has once again proven false. In fact, it appears that no administration has ever peered more closely or intimately into the lives of innocent Americans.”[51]

Representative Mike Rogers (R-MI), Chairman of the House Permanent Select Committee on Intelligence

  • June 9 “One of the things that we’re charged with is keeping America safe and keeping our civil liberties and privacy intact. I think we have done both in this particular case,”[46]
  • June 9 “Within the last few years this program was used to stop a program, excuse me, to stop a terrorist attack in the United States we know that. It’s, it’s, it’s important, it fills in a little seam that we have and it’s used to make sure that there is not an international nexus to any terrorism event that they may believe is ongoing in the United States. So in that regard it is a very valuable thing,”[52]

Senator Mark Udall (D-CO)

  • June 9 “I don’t think the American public knows the extent or knew the extent to which they were being surveilled and their data was being collected.” “I think we ought to reopen the Patriot Act and put some limits on the amount of data that the National Security (Agency) is collecting,” “It ought to remain sacred, and there’s got to be a balance here. That is what I’m aiming for. Let’s have the debate, let’s be transparent, let’s open this up”.[46]

Representative Todd Rokita (R-IN)

  • June 10 “We have no idea when they [FISA] meet, we have no idea what their judgments are”,[53]

Senator Rand Paul (R-KY)

  • June 6 “When the Senate rushed through a last-minute extension of the FISA Amendments Act late last year, I insisted on a vote on my amendment (SA 3436) to require stronger protections on business records and prohibiting the kind of data-mining this case has revealed. Just last month, I introduced S.1037, the Fourth Amendment Preservation and Protection Act,”[54]
  • June 9 “I’m going to be seeing if I can challenge this at the Supreme Court level. I’m going to be asking the Internet providers and all of the phone companies: ask your customers to join me in a class-action lawsuit.”[45]

Representative Luis Gutierrez (D-IL)

  • June 9 “We will be receiving secret briefings and we will be asking, I know I’m going to be asking to get more information. I want to make sure that what they’re doing is harvesting information that is necessary to keep us safe and not simply going into everybody’s private telephone conversations and Facebook and communications. I mean one of the, you know the terrorists win when you debilitate freedom of expression and privacy.”[52]

Judicial branch

The Foreign Intelligence Surveillance Court (FISC) has neither confirmed nor denied any involvement in the PRISM program, and has issued no press statement or release on the matter.

Applicable law and practice

On June 8, 2013, the Director of National Intelligence issued a fact sheet stating that PRISM “is not an undisclosed collection or data mining program”, but rather computer software used to facilitate the collection of foreign intelligence information “under court supervision, as authorized by Section 702 of the Foreign Intelligence Surveillance Act (FISA) (50 U.S.C. § 1881a).”[10] Section 702 provides that “the Attorney General [A.G.] and the Director of National Intelligence [DNI] may authorize jointly, for a period of up to 1 year from the effective date of the authorization, the targeting of persons reasonably believed to be located outside the United States to acquire foreign intelligence information.”[55] To authorize the targeting, the A.G. and DNI need to obtain an order from the Foreign Intelligence Surveillance Court (FISC) pursuant to Section 702 or certify that “intelligence important to the national security of the United States may be lost or not timely acquired and time does not permit the issuance of an order.”[55] When asking for an order, the A.G. and DNI must certify to FISC that “a significant purpose of the acquisition is to obtain foreign intelligence information.”[55] They do not need to specify which facilities or property the targeting will be directed at.[55]

After getting a FISC order or determining that there are emergency circumstances, the A.G. and DNI can direct an electronic communication service provider to give them access to information or facilities to carry out the targeting and keep the targeting secret.[55] The provider then has the option to: (1) comply with the directive; (2) reject it; or (3) challenge it to FISC.

If the provider complies with the directive, it is released from liability to its users for providing the information and reimbursed for the cost of providing it.[55]

If the provider rejects the directive, the A.G. may request an order from FISC to enforce it.[55] A provider that fails to comply with FISC’s order can be punished with contempt of court.[55]

Finally, a provider can petition FISC to reject the directive.[55] In case FISC denies the petition and orders the provider to comply with the directive, the provider risks contempt of court if it refuses to comply with FISC’s order.[55] The provider can appeal FISC’s denial to the Foreign Intelligence Surveillance Court of Review and then appeal the Court of Review’s decision to the Supreme Court by a writ of certiorari for review under seal.[55]
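The directive process described in the preceding paragraphs can be summarized, purely for illustration, as a small decision function. Everything here is an invented sketch of the statutory flow as paraphrased above, with made-up names; it does not model any real system:

```python
# Illustrative sketch of a provider's three options under a Section 702
# directive, as described in the text above. Names are invented.

def provider_response(action, fisc_upholds_directive=True):
    """Return the outcome for each of the provider's three options."""
    if action == "comply":
        # Complying releases the provider from liability to its users
        # and entitles it to reimbursement of costs.
        return "complies: released from liability, costs reimbursed"
    if action == "reject":
        # The A.G. may seek a FISC order enforcing the directive;
        # ignoring that order risks contempt of court.
        return "rejected: A.G. may seek FISC order, contempt if ignored"
    if action == "challenge":
        # The provider petitions FISC; a denial can be appealed to the
        # Court of Review and then to the Supreme Court, under seal.
        if fisc_upholds_directive:
            return "challenge denied: comply or appeal under seal"
        return "challenge granted: directive set aside"
    raise ValueError("action must be 'comply', 'reject', or 'challenge'")

print(provider_response("challenge"))
```

The function simply encodes the branch structure of the statute as paraphrased above; the appeal chain (FISC, Court of Review, Supreme Court) is collapsed into a single outcome string for brevity.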

The Senate Select Committee on Intelligence and the FISA Courts had been put in place to oversee intelligence operations in the period after the death of J. Edgar Hoover. Beverly Gage of Slate said, “When they were created, these new mechanisms were supposed to stop the kinds of abuses that men like Hoover had engineered. Instead, it now looks as if they have come to function as rubber stamps for the expansive ambitions of the intelligence community. J. Edgar Hoover no longer rules Washington, but it turns out we didn’t need him anyway.”[56]

Involvement of other countries

Australia

The Australian government has said it will investigate the impact of the PRISM program and the use of the Pine Gap surveillance facility on the privacy of Australian citizens.[57]

Canada

Canada’s national cryptologic agency, the Communications Security Establishment, said that commenting on PRISM “would undermine CSE’s ability to carry out its mandate”. Privacy Commissioner Jennifer Stoddart lamented Canada’s standards for protecting personal online privacy: “We have fallen too far behind,” Stoddart wrote in her report. “While other nations’ data protection authorities have the legal power to make binding orders, levy hefty fines and take meaningful action in the event of serious data breaches, we are restricted to a ‘soft’ approach: persuasion, encouragement and, at the most, the potential to publish the names of transgressors in the public interest.” And, “when push comes to shove,” Stoddart wrote, “short of a costly and time-consuming court battle, we have no power to enforce our recommendations.”[58]

Germany

Germany did not receive any raw PRISM data, according to a Reuters report.[59]

Israel

Israeli newspaper Calcalist discussed[60] the Business Insider article[61] about the possible involvement of technologies from two secretive Israeli companies in the PRISM program – Verint Systems and Narus.

New Zealand

In New Zealand, University of Otago information science Associate Professor Hank Wolfe said that “under what was unofficially known as the Five Eyes Alliance, New Zealand and other governments, including the United States, Australia, Canada, and Britain, dealt with internal spying by saying they didn’t do it. But they have all the partners doing it for them and then they share all the information.”[62]

United Kingdom

In the United Kingdom, the Government Communications Headquarters (GCHQ) has had access to the PRISM program since at least June 2010 and wrote 197 reports with it in 2012 alone. PRISM may have allowed GCHQ to circumvent the formal legal process required to seek personal material.[63][64]

Domestic response


The New York Times editorial board charged that the Obama administration “has now lost all credibility on this issue,”[65] and lamented that “for years, members of Congress ignored evidence that domestic intelligence-gathering had grown beyond their control, and, even now, few seem disturbed to learn that every detail about the public’s calling and texting habits now reside in a N.S.A. database.”[66]

Republican and former member of Congress Ron Paul said, “We should be thankful for individuals like Edward Snowden and Glenn Greenwald who see injustice being carried out by their own government and speak out, despite the risk…. They have done a great service to the American people by exposing the truth about what our government is doing in secret.”[67] Paul denounced the government’s secret surveillance program: “The government does not need to know more about what we are doing…. We need to know more about what the government is doing.”[67] He called Congress “derelict in giving that much power to the government,” and said that had he been elected president, he would have ordered searches only when there was probable cause of a crime having been committed, which he said was not how the PRISM program was being operated.[68]

In response to Obama administration claims that surveillance had helped stop terrorism in the cases of Najibullah Zazi and David Headley, Ed Pilkington and Nicholas Watt of The Guardian reported that, regarding the role of PRISM and Boundless Informant, interviews with parties involved in the Zazi scheme and court documents lodged in the United States and the United Kingdom indicated that “conventional” surveillance methods, such as “old-fashioned tip-offs” to the British intelligence services, initiated the investigation into the Zazi case.[69] An anonymous former CIA agent said of the Headley case, “That’s nonsense. It played no role at all in the Headley case. That’s not the way it happened at all.”[69] Pilkington and Watt concluded that the data-mining programs “played a relatively minor role in the interception of the two plots.”[69] Michael Daly of The Daily Beast noted that even though Tamerlan Tsarnaev had visited Inspire, and even though Russian intelligence officials had alerted U.S. intelligence officials about Tsarnaev, PRISM did not prevent him from carrying out the Boston bombings; the initial evidence implicating him came from his brother Dzhokhar Tsarnaev, not from federal intelligence. Daly also pointed out that Faisal Shahzad had visited Inspire, yet federal authorities did not stop his attempted terrorist plot. Daly concluded, “The problem is not just what the National Security Agency is gathering at the risk of our privacy but what it is apparently unable to monitor at the risk of our safety.”[70] Political commentator Bill O’Reilly similarly criticized the government, saying that PRISM did not stop the Boston bombings.[71]

In a blog post, David Simon, the creator of The Wire, compared the NSA’s programs, including PRISM, to a 1980s effort by the City of Baltimore to attach dialed-number recorders to all pay phones in order to learn which numbers callers were dialing;[72] the city believed that drug traffickers were using pay phones and pagers, and a municipal judge allowed the city to place the recorders. The placement of the recorders formed the basis of the show’s first season. Simon argued that the media attention surrounding the NSA programs is a “faux scandal.”[72][73] George Takei, an actor who had experienced Japanese American internment, said that his memories of the internment made him concerned about the newly revealed NSA surveillance programs.[74]

The Electronic Frontier Foundation (EFF), an international non-profit digital-rights group based in the U.S., hosts a tool by which American residents can write to their government representatives to oppose mass spying.[75]

On June 11, 2013, the American Civil Liberties Union filed a lawsuit against the NSA citing that PRISM “violates Americans’ constitutional rights of free speech, association, and privacy”.[76]

International response

Reactions of Internet users in China were mixed between viewing a loss of freedom worldwide and seeing state surveillance coming out of secrecy. The story broke just before US President Barack Obama and Chinese President Xi Jinping met in California.[77][78] When asked about NSA hacking of China, the spokeswoman of the Ministry of Foreign Affairs of the People’s Republic of China said “China strongly advocates cybersecurity”.[79] The party-owned newspaper Liberation Daily described the surveillance as Nineteen Eighty-Four-style.[80] Hong Kong legislators Gary Fan and Claudia Mo wrote a letter to Obama, stating “the revelations of blanket surveillance of global communications by the world’s leading democracy have damaged the image of the U.S. among freedom-loving peoples around the world.”[81]

Sophie in ‘t Veld, a Dutch Member of the European Parliament, called PRISM “a violation of EU laws”.[82]

Protests at Checkpoint Charlie in Berlin

The German Federal Commissioner for Data Protection and Freedom of Information, Peter Schaar, condemned the program as “monstrous”.[83] He further added that White House claims do “not reassure me at all” and that “given the large number of German users of Google, Facebook, Apple or Microsoft services, I expect the German government […] is committed to clarification and limitation of surveillance.” Steffen Seibert, press secretary of the Chancellor’s office, announced that Angela Merkel will put these issues on the agenda of the talks with Barack Obama during his pending visit in Berlin.[84]

The Italian president of the Guarantor for the protection of personal data, Antonello Soro, said that the surveillance dragnet “would not be legal in Italy” and would be “contrary to the principles of our legislation and would represent a very serious violation”.[85]

William Hague, the foreign secretary of the United Kingdom, dismissed accusations that British security agencies had been circumventing British law by using information gathered on British citizens by PRISM,[86] saying, “Any data obtained by us from the United States involving UK nationals is subject to proper UK statutory controls and safeguards.”[86] David Cameron said Britain’s spy agencies that received data collected from PRISM acted within the law: “I’m satisfied that we have intelligence agencies that do a fantastically important job for this country to keep us safe, and they operate within the law.”[86][87] Malcolm Rifkind, the chairman of parliament’s Intelligence and Security Committee, said that if the British intelligence agencies were seeking to know the content of emails about people living in the UK, then they have to obtain lawful authority.[87] The UK’s Information Commissioner’s Office was more cautious, saying it would investigate PRISM alongside other European data agencies: “There are real issues about the extent to which U.S. law agencies can access personal data of UK and other European citizens. Aspects of U.S. law under which companies can be compelled to provide information to U.S. agencies potentially conflict with European data protection law, including the UK’s own Data Protection Act. The ICO has raised this with its European counterparts, and the issue is being considered by the European Commission, who are in discussions with the U.S. Government.”[82]

Ai Weiwei, a Chinese dissident, said “Even though we know governments do all kinds of things I was shocked by the information about the US surveillance operation, Prism. To me, it’s abusively using government powers to interfere in individuals’ privacy. This is an important moment for international society to reconsider and protect individual rights.”[88]

Kim Dotcom, a German-Finnish Internet entrepreneur who owned Megaupload, which was closed by the U.S. federal government, said “We should heed warnings from Snowden because the prospect of an Orwellian society outweighs whatever security benefits we derive from Prism or Five Eyes.”[89] The Hong Kong law firm representing Dotcom expressed a fear that the communication between Dotcom and the firm had been compromised by U.S. intelligence programs.[90]

Russia has offered to consider an asylum request from Edward Snowden.[91]

Taliban spokesperson Zabiullah Mujahid said “We knew about their past efforts to trace our system. We have used our technical resources to foil their efforts and have been able to stop them from succeeding so far.”[92][93]

Related government Internet surveillance programs

A parallel program, code-named BLARNEY, gathers up metadata as it streams past choke points along the backbone of the Internet. BLARNEY’s summary, set down in the slides alongside a cartoon insignia of a shamrock and a leprechaun hat, describes it as “an ongoing collection program that leverages IC [intelligence community] and commercial partnerships to gain access and exploit foreign intelligence obtained from global networks.”[94]

A related program, a big data visualization system based on cloud computing and free and open-source software (FOSS) technology known as “Boundless Informant”, was disclosed in documents leaked to The Guardian and reported on June 8, 2013. A leaked, top secret map allegedly produced by Boundless Informant revealed the extent of NSA surveillance in the U.S.[95]

http://en.wikipedia.org/wiki/PRISM_%28surveillance_program%29

ThinThread

ThinThread is the name of a project that the United States National Security Agency (NSA) pursued during the 1990s, according to a May 17, 2006 article in The Baltimore Sun.[1] The program involved wiretapping and sophisticated analysis of the resulting data, but according to the article, it was discontinued three weeks before the September 11, 2001 attacks due to changes in priorities and the consolidation of U.S. intelligence authority.[2] The “change in priority” was NSA director General Michael V. Hayden’s decision to pursue a concept called Trailblazer, despite the fact that ThinThread was a working prototype that protected the privacy of U.S. citizens.

ThinThread was dismissed and replaced by the Trailblazer Project, which lacked the privacy protections.[3] A consortium led by Science Applications International Corporation was awarded a $280 million contract to develop Trailblazer in 2002.[4]

http://en.wikipedia.org/wiki/ThinThread

Trailblazer

Trailblazer was a United States National Security Agency (NSA) program intended to develop a capability to analyze data carried on communications networks like the Internet. It was intended to track entities using communication methods such as cell phones and e-mail.[1][2] It ran over budget, failed to accomplish critical goals, and was cancelled.

NSA whistleblowers J. Kirk Wiebe, William Binney, Ed Loomis, and House Permanent Select Committee on Intelligence staffer Diane Roark complained to the Department of Defense’s Inspector General (IG) about waste, fraud, and abuse in the program, and about the fact that a successful operating prototype had existed but was ignored when the Trailblazer program was launched. The complaint was accepted by the IG, and an investigation began that lasted until mid-2005, when the final results were issued. The results were largely hidden: the version released to the public was roughly 90% redacted, while the original remained heavily classified and out of reach for most readers.

The people who filed the IG complaint were later raided by armed Federal Bureau of Investigation (FBI) agents. Although the government threatened to prosecute all who had signed the IG report, it ultimately pursued only NSA senior executive Thomas Andrews Drake, who had helped with the report internally at NSA and had spoken with a reporter about the project. Drake was charged under the Espionage Act of 1917; his defenders claimed the prosecution was retaliation.[3][4] The charges against him were later dropped, and he agreed to plead guilty to a misdemeanor under the Computer Fraud and Abuse Act, something that Jesselyn Radack of the Government Accountability Project (which helped represent him) called an “act of civil disobedience”.[5]

Background

Trailblazer was chosen over a similar program named ThinThread, a less costly project which had been designed with built-in privacy protections for United States citizens.[4][3] Trailblazer was later linked to the NSA electronic surveillance program and the NSA warrantless surveillance controversy.[3]

In 2002 a consortium led by Science Applications International Corporation was chosen by the NSA to produce a technology demonstration platform in a contract worth $280 million. Project participants included Boeing, Computer Sciences Corporation, and Booz Allen Hamilton. The project was overseen by NSA Deputy Director William B. Black, Jr., an NSA veteran who had left for SAIC and was then re-hired by NSA director Michael Hayden in 2000.[6][7][8] SAIC had also hired a former NSA director, Bobby Inman, into its management.[9] SAIC also participated in the concept definition phase of Trailblazer.[10][11]

Redacted version of the DoD Inspector General audit, obtained through the Freedom of Information Act by the Project on Government Oversight and others.[12][5]

The NSA Inspector General issued a report on Trailblazer that “discussed improperly based contract cost increases, non-conformance in the management of the Statement of Work, and excessive labor rates for contractor personnel.”[13]

In 2004 the DoD IG report criticized the program (see the Whistleblowing section below). It said that the “NSA ‘disregarded solutions to urgent national security needs'” and “that TRAILBLAZER was poorly executed and overly expensive …” Several contractors for the project were worried about cooperating with DoD’s audit for fear of “management reprisal.”[5] The Director of NSA “nonconcurred” with several statements in the IG audit, and the report contains a discussion of those disagreements.[14]

In 2005, NSA director Michael Hayden told a Senate hearing that the Trailblazer program was several hundred million dollars over budget and years behind schedule.[15] In 2006 the program was shut down,[3] after having cost billions of US dollars.[16] Several anonymous NSA sources later told Newsweek that the project was a “wasteful failure”.[17]

The new project replacing Trailblazer is called Turbulence.[3]

Whistleblowing

According to a 2011 New Yorker article, in the early days of the project several NSA employees met with Diane S. Roark, the NSA budget expert on the House Intelligence Committee, and aired their grievances about Trailblazer. In response, NSA director Michael Hayden sent out a memo saying that “individuals, in a session with our congressional overseers, took a position in direct opposition to one that we had corporately decided to follow … Actions contrary to our decisions will have a serious adverse effect on our efforts to transform N.S.A., and I cannot tolerate them.”[3]

In September 2002, several people filed a complaint with the Department of Defense IG’s office regarding problems with Trailblazer: Roark, ex-NSA senior analysts Bill Binney and Kirk Wiebe, and senior computer systems analyst Ed Loomis, who had quit the agency over concerns about its mismanagement of acquisition and allegedly illegal domestic spying.[3][18][19] A major source for the report was NSA senior officer Thomas Andrews Drake. Drake had been complaining to his superiors for some time about problems at the agency, and about the superiority of ThinThread over Trailblazer, for example at protecting privacy.[19] Drake gave information to the DoD during its investigation of the matter.[19] Roark also took the problems to her boss at the House committee, Porter Goss, but was rebuffed.[20] She also attempted to contact William Rehnquist, the Chief Justice of the Supreme Court at the time.[19]

Drake’s own boss, Maureen Baginski, the third-highest officer at NSA, quit partly over concerns about the legality of the agency’s behavior.[3]

In 2003, the NSA IG (not the DoD IG)[19] declared Trailblazer an expensive failure.[21] It had cost more than $1 billion.[8][22][23]

In 2005, the DoD IG produced a report on the result of its investigation of the complaint Roark and the others had filed in 2002. The report was not released to the public, but it has been described as very negative.[18] Mayer writes that it hastened the closure of Trailblazer, which was by then in trouble with Congress for being over budget.[3]

In November 2005, Drake contacted Siobhan Gorman, a reporter for The Baltimore Sun.[24][17][25] Gorman wrote several articles about problems at the NSA, including articles on Trailblazer. The series earned her an award from the Society of Professional Journalists.[17]

In 2005, President George W. Bush ordered the FBI to find whoever had disclosed the NSA electronic surveillance program to the New York Times. The investigation eventually led to the people who had filed the 2002 DoD IG complaint, even though they had nothing to do with the New York Times disclosure. In 2007, the houses of Roark, Binney, and Wiebe were raided by armed FBI agents. According to Mayer, Binney claims the FBI pointed guns at his head and at his wife’s; Wiebe said it reminded him of the Soviet Union.[3][18] None of these people was ever charged with any crime. Drake’s house was raided four months later, in November 2007, and his computers and documents were confiscated.

In 2010 Drake was indicted by the U.S. Department of Justice on charges of obstructing justice, providing false information, and violating the Espionage Act of 1917,[17][26][27] part of President Barack Obama’s crackdown on whistleblowers and “leakers”.[24][17][28][18] The government tried to get Roark to testify to a conspiracy, and made similar requests to Drake, offering him a plea bargain. They both refused.[3]

In June 2011, the ten original charges against Drake were dropped; instead, he pleaded guilty to a misdemeanor.[5]

http://www.youtube.com/watch?v=1AXwwSq_me4

Boundless Informant

Boundless Informant is a big data analysis and data visualization system used by the United States National Security Agency (NSA) to give NSA managers summaries of the NSA’s worldwide data collection activities.[1] It is described in an unclassified, For Official Use Only Frequently Asked Questions (FAQ) memo published by The Guardian.[2] According to a Top Secret heat-map display, also published by The Guardian and allegedly produced by the Boundless Informant program, almost 3 billion data elements from inside the United States were captured by the NSA over a 30-day period ending in March 2013.

Data analyzed by Boundless Informant includes electronic surveillance program records (DNI) and telephone call metadata records (DNR) stored in an NSA data archive called GM-PLACE. It does not include FISA data, according to the FAQ memo. PRISM, a government codename for a collection effort known officially as US-984XN, which was revealed at the same time as Boundless Informant, is one source of DNR data. According to the map, Boundless Informant summarizes data records from 504 separate DNR and DNI collection sources (SIGADs). In the map, countries under surveillance are assigned a color ranging from green (least coverage) to red (most intensive).[3][4]
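The green-to-red coloring described above amounts to bucketing each country’s record count onto a small color scale. Here is a purely illustrative sketch of that idea; it is not the NSA’s code, and the per-country counts below are invented for the example.

```python
# Hypothetical sketch of a heat-map color assignment: per-country record
# counts are bucketed onto a five-step green-to-red scale, with red for
# the most heavily surveilled countries. All numbers are invented.

def heat_color(count, max_count):
    """Map a record count onto a five-step green-to-red scale."""
    scale = ["green", "yellow-green", "yellow", "orange", "red"]
    if max_count == 0:
        return scale[0]
    # Proportional bucket index, clamped so the peak lands on "red".
    idx = min(int(5 * count / max_count), 4)
    return scale[idx]

counts = {"Country A": 14_000_000, "Country B": 13_500_000,
          "Country C": 7_600_000, "Country D": 1_200_000}
peak = max(counts.values())
for country, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{country:10s} {n:>12,d} {heat_color(n, peak)}")
```

A linear bucketing like this is the simplest choice; a real dashboard might use a logarithmic scale instead, since collection volumes can span several orders of magnitude.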

History

Slide showing that much of the world’s communications flow through the US.

Intelligence gathered by the United States government inside the United States or specifically targeting US citizens is legally required to be gathered in compliance with the Foreign Intelligence Surveillance Act of 1978 (FISA) and under the authority of the Foreign Intelligence Surveillance Court (FISA court).[5][6][7]

NSA global data mining projects have existed for decades, but recent programs of intelligence gathering and analysis that include data gathered from inside the United States such as PRISM were enabled by changes to US surveillance law introduced under President Bush and renewed under President Obama in December 2012.[8]

Boundless Informant was first publicly revealed on June 8, 2013, after classified documents about the program were leaked to The Guardian.[1][9] The newspaper identified its informant, at his request, as Edward Snowden, who worked at the NSA for the defense contractor Booz Allen Hamilton.[10]

Technology

According to published slides, Boundless Informant leverages Free and Open Source Software—and is therefore “available to all NSA developers”—and corporate services hosted in the cloud. The tool uses HDFS, MapReduce, and Cloudbase for data processing.[11]
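As a rough illustration of the MapReduce pattern the slides name (the real system runs on HDFS, MapReduce, and Cloudbase at vastly larger scale), here is a minimal map/reduce sketch that tallies metadata records per country. The record fields and SIGAD strings below are invented for the example, except US-984XN, which the published documents identify as PRISM.

```python
# Minimal, hypothetical MapReduce-style aggregation: emit (country, 1)
# per metadata record in the map phase, sum per country in the reduce
# phase. Record contents are invented for illustration.
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Emit one (key, value) pair per record, keyed by country.
    yield (record["country"], 1)

def reduce_phase(pairs):
    # Sum the values for each key.
    totals = defaultdict(int)
    for country, n in pairs:
        totals[country] += n
    return dict(totals)

records = [{"country": "DE", "sigad": "XX-000AA"},
           {"country": "DE", "sigad": "XX-000AB"},
           {"country": "BR", "sigad": "US-984XN"}]
summary = reduce_phase(chain.from_iterable(map_phase(r) for r in records))
print(summary)  # {'DE': 2, 'BR': 1}
```

In a real Hadoop deployment the map and reduce phases would run as distributed tasks over files in HDFS rather than an in-memory list, but the contract between the two phases is the same.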

Legality and FISA Amendments Act of 2008

Section 702 of the FISA Amendments Act (FAA) is referenced in PRISM documents detailing the electronic interception, capture, and analysis of metadata. Many reports and letters of concern written by members of Congress suggest that this section of the FAA in particular is legally and constitutionally problematic, notably because it can reach U.S. persons: published documents indicate that “Collections occur in U.S.”[12][13][14][15]

The ACLU has asserted the following regarding the FAA: “Regardless of abuses, the problem with the FAA is more fundamental: the statute itself is unconstitutional.”[16]

Senator Rand Paul is introducing new legislation called the Fourth Amendment Restoration Act of 2013 to stop the NSA or other agencies of the United States government from violating the Fourth Amendment to the U.S. Constitution using technology and big data information systems like PRISM and Boundless Informant.[17][18]

http://en.wikipedia.org/wiki/Boundless_Informant

ECHELON

ECHELON is a name used in global media and in popular culture to describe a signals intelligence (SIGINT) collection and analysis network operated on behalf of the five signatory states to the UKUSA Security Agreement[1] (Australia, Canada, New Zealand, the United Kingdom, and the United States, referred to by a number of abbreviations, including AUSCANNZUKUS[1] and Five Eyes).[2][3] It has also been described as the only software system which controls the download and dissemination of the intercept of commercial satellite trunk communications.[4]

According to the European Parliament document “On the existence of a global system for the interception of private and commercial communications (ECHELON interception system)”, ECHELON was created in the early 1960s to monitor the military and diplomatic communications of the Soviet Union and its Eastern Bloc allies during the Cold War.[5]

The system has been reported in a number of public sources.[6] Its capabilities and political implications were investigated by a committee of the European Parliament during 2000 and 2001 with a report published in 2001,[5] and by author James Bamford in his books on the National Security Agency of the United States.[4] The European Parliament stated in its report that the term ECHELON is used in a number of contexts, but that the evidence presented indicates that it was the name for a signals intelligence collection system. The report concludes that, on the basis of information presented, ECHELON was capable of interception and content inspection of telephone calls, fax, e-mail and other data traffic globally through the interception of communication bearers including satellite transmission, public switched telephone networks (which once carried most Internet traffic) and microwave links.[5]

Bamford describes the system as the software controlling the collection and distribution of civilian telecommunications traffic conveyed using communication satellites, with the collection being undertaken by ground stations located in the footprint of the downlink leg.

Organization

UKUSA Community
Map of UKUSA Community countries with Ireland

Australia
Canada
New Zealand
United Kingdom
United States of America

The UKUSA intelligence community was assessed by the European Parliament (EP) in 2000 to include the signals intelligence agencies of each of the member states:

  • the Government Communications Headquarters of the United Kingdom,
  • the National Security Agency of the United States,
  • the Communications Security Establishment of Canada,
  • the Defence Signals Directorate of Australia,
  • the Government Communications Security Bureau of New Zealand, and
  • the National SIGINT Organisation (NSO) of the Netherlands.

The EP report concluded that it seemed likely that ECHELON is a method of sorting captured signal traffic, rather than a comprehensive analysis tool.[5]

Capabilities

The ability to intercept communications depends on the medium used, be it radio, satellite, microwave, cellular or fiber-optic.[5] During World War II and through the 1950s, high frequency (“short wave”) radio was widely used for military and diplomatic communication,[7] and could be intercepted at great distances.[5] The rise of geostationary communications satellites in the 1960s presented new possibilities for intercepting international communications. The report to the European Parliament of 2001 states: “If UKUSA states operate listening stations in the relevant regions of the earth, in principle they can intercept all telephone, fax and data traffic transmitted via such satellites.”[5]

The role of satellites in point-to-point voice and data communications has largely been supplanted by fiber optics; in 2006, 99% of the world’s long-distance voice and data traffic was carried over optical fiber.[8] The proportion of international communications accounted for by satellite links is said to have decreased substantially in recent years, to between 0.4% and 5% in Central Europe.[5] Even in less-developed parts of the world, communications satellites are used largely for point-to-multipoint applications, such as video.[9] Thus, the majority of communications can no longer be intercepted by earth stations; they can only be collected by tapping cables and intercepting line-of-sight microwave signals, which is possible only to a limited extent.[5]

One method of interception is to place equipment at locations where fiber-optic communications are switched. For the Internet, much of the switching occurs at relatively few sites. There have been reports of one such intercept site, Room 641A, in the United States. Much Internet traffic was once routed through the U.S. and the UK, but this has changed: in 2000, for example, 95% of intra-German Internet communications was routed via the DE-CIX Internet exchange point in Frankfurt.[5] A comprehensive worldwide surveillance network is possible only if clandestine intercept sites are installed in the territory of friendly nations and/or local authorities cooperate. The report to the European Parliament points out that interception of private communications by foreign intelligence services is not necessarily limited to the U.S. or British foreign intelligence services.[5]

Most reports on ECHELON focus on satellite interception; testimony before the European Parliament indicated that separate but similar UK-US systems are in place to monitor communication through undersea cables, microwave transmissions and other lines.[10]

Controversy

See also: Industrial espionage

Intelligence monitoring of citizens, and of their communications, in the area covered by the AUSCANNZUKUS security agreement has caused concern. British journalist Duncan Campbell and New Zealand journalist Nicky Hager asserted in the 1990s that the United States was exploiting ECHELON traffic for industrial espionage rather than military and diplomatic purposes.[10] Examples alleged by the journalists include the gear-less wind turbine technology designed by the German firm Enercon[5][11] and the speech technology developed by the Belgian firm Lernout & Hauspie.[12] An article in the US newspaper The Baltimore Sun reported in 1995 that the European aerospace company Airbus lost a $6 billion contract with Saudi Arabia in 1994 after the US National Security Agency reported that Airbus officials had been bribing Saudi officials to secure the contract.[13][14]

In 2001, the Temporary Committee on the ECHELON Interception System recommended to the European Parliament that citizens of member states routinely use cryptography in their communications to protect their privacy, because economic espionage via ECHELON had been conducted by US intelligence agencies.[5]

Bamford provides an alternative view, highlighting that legislation prohibits the use of intercepted communications for commercial purposes, although he does not elaborate on how intercepted communications are used as part of an all-source intelligence process.

Hardware

According to its website, the U.S. National Security Agency (NSA) is “a high technology organization … on the frontiers of communications and data processing”. In 1999 the Australian Senate Joint Standing Committee on Treaties was told by Professor Desmond Ball that the Pine Gap facility was used as a ground station for a satellite-based interception network. The satellites were said to be large radio dishes between 20 and 100 meters in diameter in geostationary orbits. The original purpose of the network was to monitor the telemetry from 1970s Soviet weapons, air defence radar, communications satellites and ground-based microwave communications.[15]

Name

The European Parliament’s Temporary Committee on the ECHELON Interception System stated: “It seems likely, in view of the evidence and the consistent pattern of statements from a very wide range of individuals and organisations, including American sources, that its name is in fact ECHELON, although this is a relatively minor detail.”[5] The U.S. intelligence community uses many code names (see, for example, CIA cryptonym).

Former NSA employee Margaret Newsham claims that she worked on the configuration and installation of software that makes up the ECHELON system while employed at Lockheed Martin, for whom she worked from 1974 to 1984 in Sunnyvale, California, US, and in Menwith Hill, England, UK.[16] At that time, according to Newsham, the code name ECHELON was NSA’s term for the computer network itself. Lockheed called it P415. The software programs were called SILKWORTH and SIRE. A satellite named VORTEX intercepted communications. An image available on the internet of a fragment apparently torn from a job description shows Echelon listed along with several other code names.[17]

Ground stations

The 2001 European Parliamentary (EP) report[5] lists several ground stations as possibly belonging to, or participating in, the ECHELON network. These include:

Likely satellite intercept stations

The following stations are listed in the EP report (p. 54 ff) as likely to have, or to have had, a role in intercepting transmissions from telecommunications satellites:

  • Hong Kong (since closed)
  • Australian Defence Satellite Communications Station (Geraldton, Western Australia)
  • Menwith Hill (Yorkshire, U.K.) (reportedly the largest Echelon facility)[18]
  • Misawa Air Base (Japan)
  • GCHQ Bude, formerly known as GCHQ CSO Morwenstow (Cornwall, U.K.)
  • Pine Gap (Northern Territory, Australia – close to Alice Springs)
  • Sugar Grove (West Virginia, U.S.)
  • Yakima Training Center (Washington, U.S.)
  • GCSB Waihopai (New Zealand)
  • GCSB Tangimoana (New Zealand)
  • CFS Leitrim (Ontario, Canada)
  • Teufelsberg (Berlin, Germany) (closed 1992)

Other potentially related stations

The following stations are listed in the EP report (p. 57 ff) as ones whose roles “cannot be clearly established”:

  • Ayios Nikolaos (Cyprus – U.K.)
  • Bad Aibling Station (Bad Aibling, Germany – U.S.)
    • relocated to Griesheim in 2004[19]
    • deactivated in 2008[20]
  • Buckley Air Force Base (Aurora, Colorado)
  • Fort Gordon (Georgia, U.S.)
  • Gander (Newfoundland & Labrador, Canada)
  • Guam (Pacific Ocean, U.S.)
  • Kunia Regional SIGINT Operations Center (Hawaii, U.S.)
  • Lackland Air Force Base, Medina Annex (San Antonio, Texas)

http://en.wikipedia.org/wiki/ECHELON

Room 641A

Room 641A is a telecommunication interception facility operated by AT&T for the U.S. National Security Agency that commenced operations in 2003 and was exposed in 2006.[1][2]

Description

Room 641A is located in the SBC Communications building at 611 Folsom Street, San Francisco, three floors of which were occupied by AT&T before SBC purchased AT&T.[1] The room was referred to in internal AT&T documents as the SG3 [Study Group 3] Secure Room. It is fed by fiber-optic lines from beam splitters installed in fiber-optic trunks carrying Internet backbone traffic,[3] and, according to an analysis by J. Scott Marcus, a former CTO for GTE and a former adviser to the FCC, it therefore has access to all Internet traffic that passes through the building, and thus “the capability to enable surveillance and analysis of internet content on a massive scale, including both overseas and purely domestic traffic.”[4] William Binney, former director of the NSA’s World Geopolitical and Military Analysis Reporting Group, has estimated that 10 to 20 such facilities have been installed throughout the United States.[2]

The room measures about 24 by 48 feet (7.3 by 15 m) and contains several racks of equipment, including a Narus STA 6400, a device designed to intercept and analyze Internet communications at very high speeds.[1]

The very existence of the room was revealed by a former AT&T technician, Mark Klein, and was the subject of a 2006 class action lawsuit by the Electronic Frontier Foundation against AT&T.[5] Klein claims he was told that similar black rooms are operated at other facilities around the country.

Room 641A and the controversies surrounding it were subjects of an episode of Frontline, the current affairs documentary program on PBS. It was originally broadcast on May 15, 2007. It was also featured on PBS’s NOW on March 14, 2008. The room was also covered in the PBS Nova episode “The Spy Factory”.

Lawsuit

Basic diagram of how the alleged wiretapping was accomplished. From EFF court filings[4]

More complicated diagram of how it allegedly worked. From EFF court filings.[3]

Main article: Hepting v. AT&T

The Electronic Frontier Foundation (EFF) filed a class-action lawsuit against AT&T on January 31, 2006, accusing the telecommunications company of violating the law and the privacy of its customers by collaborating with the National Security Agency (NSA) in a massive, illegal program to wiretap and data-mine Americans’ communications. On July 20, 2006, a federal judge denied the government’s and AT&T’s motions to dismiss the case, which rested chiefly on the state secrets privilege, allowing the lawsuit to go forward. On August 15, 2007, the case was heard by the Ninth Circuit Court of Appeals; it was dismissed on December 29, 2011 based on a retroactive grant of immunity by Congress for telecommunications companies that had cooperated with the government. The U.S. Supreme Court declined to hear the case.[6] A separate EFF case, Jewel v. NSA, was filed on September 18, 2008.

http://en.wikipedia.org/wiki/Room_641A

List of government surveillance projects for the United States

United States

A top secret document leaked by Edward Snowden to The Guardian in 2013, originally due to be declassified on 12 April 2038.

http://en.wikipedia.org/wiki/List_of_government_surveillance_projects

Related Posts On Pronk Palisades

James Bamford — The National Security Agency (NSA) — Videos

National Security Agency (NSA) Wants To Build Supercomputer To Crack All Encryption — Videos

National Security Agency (NSA) Intercepts FedX and UPS Packages To Install Malware Software — Bugs iPhones and Laptops — Videos

No Such Agency — NSA — National Security Agency — Threat To The Liberty and Privacy of The American People — None Of Their Damn Business — Still Trust The Federal Government? — Videos

Enemy Of The State: Life Imitating Art – National Security Agency Targets American People — Videos

Big Brother Barack Targets All The American People As Enemies of The State and Democratic Party — National Security Agency’s PRISM Is The Secret Security Surveillance State (S4) Means of Invading Privacy and Limiting Liberty — Outrageous Overreach–Videos

National Security Agency (NSA) and Federal Bureau Investigation (FBI) Secret Security Surveillance State (S4) Uses Stellar Wind and PRISM To Create Secret Dossiers On All American Citizen Targets Similar To East Germany Stasi Files–Videos

NSA’s PRISM Political Payoff: 40 Million Plus Foreigners Are In USA As Illegal Aliens! — 75% Plus Lean Towards Democratic Party — Pathway To One Party Rule By 2025 If Senate Bill Becomes Law Giving Illegal Aliens Legal Status — 25 Million American Citizens Looking For Full Time Jobs! — Videos

Amnesty Before Enforcement — Congressional Gangsters’ Comprehensive Immigration “Reform” Bill Targets American Citizens For Unemployment — American Citizens Want All Illegal Aliens Deported Not Rewarded With Legal Status — Target The Amnesty Illegal Alien Gangsters For Defeat — Videos

U.S. Hacking China and Hong Kong — Videos

Digital Campaigns Using Microtargeting and Data Mining To Target Voters — Videos

Sasha Issenberg — The Victory Lab: The Secret Science of Winning Campaigns — Videos

Related Posts on Pronk Pops

Pronk Pops Show 112, June 7, 2013, Segment 0: Marxist-Leninists Go To The Wall With Holder — The Man Who Knows Where The Bodies Are Buried Enjoys President Obama’s Full Confidence Says Political Fixer Valerie Jarrett — Wall Street Wants Holder To Hang On — American People Say Hit The Road Jack — Videos

Pronk Pops Show 112, June 7, 2013: Segment 1: U.S. Real Gross Domestic Product Growth Still Stagnating At 2.4% in First Quarter of 2013 As Institute for Supply Management Factory Index Sinks to 49.0 Lowest Since June 2009 — Videos

Pronk Pops Show 112, June 7, 2013, Segment 2: Federal Advisory Council (FAC) May 17, 2013 Report — No Exit To A Bridge Over Troubled Waters — Keyboarding Money — We’re screwed! — Videos

Pronk Pops Show 112, June 7, 2013, Segment 3: Official Unemployment Rate Rises To 7.6% with 11.8 Million Americans Unemployed and Only 175,000 Jobs Created in May — Videos

Pronk Pops Show 112, June 7, 2013, Segment 4: No Such Agency — NSA — National Security Agency — Threat To The Liberty and Privacy of The American People — None Of Their Damn Business — Still Trust The Federal Government? — Videos

Read Full Post | Make a Comment ( None so far )

National Security Agency (NSA) Intercepts FedX and UPS Packages To Install Malware Software — Bugs iPhones and Laptops — TOR Network — Videos

Posted on December 31, 2013. Filed under: American History, Blogroll, Books, Business, College, Communications, Computers, Computers, Constitution, Crime, Economics, Education, Employment, Federal Government, Federal Government Budget, Fiscal Policy, Foreign Policy, government, government spending, history, Language, Law, liberty, Life, Links, media, People, Philosophy, Politics, Press, Programming, Psychology, Rants, Raves, Regulations, Resources, Security, Strategy, Talk Radio, Technology, Terrorism, Unemployment, Video, War, Wealth, Weapons, Wisdom, Writing | Tags: , , , , , , , , , , , , , , , , |

NSA Interception: Spy malware installed on laptops bought online

Glenn Greenwald Keynote on 30c3

The Tor Network [30c3] (with Jacob Applebaum)

NSA Spying Project Prism Glenn Greenwald Interview

Glenn Greenwald: The NSA Can “Literally Watch Every Keystroke You Make”

Der Spiegel has revealed new details about a secretive hacking unit inside the National Security Agency called the Office of Tailored Access Operations, or TAO. The unit was created in 1997 to hack into global communications traffic. Hackers inside the TAO have developed a way to break into computers running Microsoft Windows by gaining passive access to machines when users report program crashes to Microsoft. In addition, with help from the CIA and FBI, the NSA has the ability to intercept computers and other electronic accessories purchased online in order to secretly insert spyware and components that can provide backdoor access for the intelligence agencies. American Civil Liberties Union Deputy Legal Director Jameel Jaffer and journalist Glenn Greenwald join us to discuss the latest revelations, along with the future of Edward Snowden, who has recently offered to assist U.S. targets Germany and Brazil with their respective probes into NSA spying.

TAO Revealed: The NSA’s ‘top secret weapon’

‘NSA’s goal is elimination of privacy worldwide’ – Greenwald to EU (FULL SPEECH)

Glenn Greenwald and Ruth Marcus Get in Explosive Exchange over Snowden and ‘Horrible’ D.C. Media

How The NSA Hacks Your iPhone (Presenting DROPOUT JEEP)

by Tyler Durden

Following up on the latest stunning revelations released yesterday by Germany’s Der Spiegel, which exposed the spy agency’s 50-page catalog of “backdoor penetration techniques”, a new bombshell emerged today during a speech given by Jacob Applebaum (@ioerror) at the 30th Chaos Communication Congress: a complete and detailed description of how the NSA remotely bugs your iPhone. The NSA accomplishes this using software known as DROPOUT JEEP, which it describes as follows: “DROPOUT JEEP is a software implant for the Apple iPhone that utilizes modular mission applications to provide specific SIGINT functionality. This functionality includes the ability to remotely push/pull files from the device. SMS retrieval, contact list retrieval, voicemail, geolocation, hot mic, camera capture, cell tower location, etc. Command, control and data exfiltration can occur over SMS messaging or a GPRS data connection. All communications with the implant will be covert and encrypted.”

The flowchart of how the NSA makes your iPhone its iPhone is presented below:

  • NSA ROC operator
  • Load specified module
  • Send data request
  • iPhone accepts request
  • Retrieves required SIGINT data
  • Encrypt and send exfil data
  • Rinse repeat

And visually:

What is perhaps just as disturbing is the following rhetorical sequence from Applebaum:

“Do you think Apple helped them build that? I don’t know. I hope Apple will clarify that. Here’s the problem: I don’t really believe that Apple didn’t help them, I can’t really prove it but [the NSA] literally claim that anytime they target an iOS device that it will succeed for implantation. Either they have a huge collection of exploits that work against Apple products, meaning that they are hoarding information about critical systems that American companies produce and sabotaging them, or Apple sabotaged it themselves. Not sure which one it is. I’d like to believe that since Apple didn’t join the PRISM program until after Steve Jobs died, that maybe it’s just that they write shitty software. We know that’s true.”

Or perhaps Apple's software is hardly "shitty," even if it seems that way to the vast majority of experts (much like the Fed's various programs), and in fact achieves precisely what it was meant to achieve.

Either way, now everyone knows that their iPhone is nothing but a gateway for the NSA to peruse everyone's "private" data at will. Which, incidentally, is not news: it was revealed when we showed how the NSA mocks Apple's "zombie" customers and asks, "Your target is using a BlackBerry? Now what?"

How ironic would it be if BlackBerry, left for dead by virtually everyone, began marketing its products as the only smartphones that do not give the NSA access to one's data (and engineered them accordingly). Since pretty much everything else it has tried has failed, we see no downside to this Hail Mary attempt to strike back at Big Brother, and maybe make some money, by doing the right thing for once.

We urge readers to watch the full one-hour speech by Jacob Appelbaum to realize just how massive Big Brother truly is; those who want to hear only the section on Apple can begin at 44 minutes 30 seconds into the presentation below.

http://www.zerohedge.com/news/2013-12-30/how-nsa-hacks-your-iphone-presenting-dropout-jeep

Related Posts On Pronk Palisades

No Such Agency — NSA — National Security Agency — Threat To The Liberty and Privacy of The American People — None Of Their Damn Business — Still Trust The Federal Government? — Videos


Enemy Of The State: Life Imitating Art –National Security Agency Targets American People — Videos

Posted on June 25, 2013. Filed under: Art, Blogroll, Business, Comedy, Communications, Computers, Crime, Economics, External Hard Drives, Federal Government, Federal Government Budget, Fiscal Policy, government spending, Law, liberty, Life, Links, media, People, Philosophy, Politics, Rants, Raves, Regulations, Resources, Security, Strategy, Talk Radio, Tax Policy, Technology, Terrorism, Unemployment, Video, War, Wealth, Weapons, Wisdom


ENEMY OF THE STATE… (1998) MUST WATCH..TAKE SERIOUSLY..

Nova: The Spy Factory Full Video

INTERVIEW with NSA WHISTLEBLOWER: Confirm EVERYONE in US is under VIRTUAL SURVEILLANCE since 9/11

He told you so: Bill Binney talks NSA leaks

James Bamford on NSA Leaks – Charlie Rose 06/13/2013

Companies With Ties to Israel Wiretap the U.S. for the NSA

James Bamford: Inside the NSA’s Largest Secret Domestic Spy Center

James Bamford on NSA’s Undemocratic Surveillance

Enemy of the State

Enemy of the State is a 1998 American action-thriller about a group of rogue NSA agents who kill a US Congressman and try to cover up the murder. It was written by David Marconi, directed by Tony Scott, and produced by Jerry Bruckheimer. It stars Will Smith and Gene Hackman, with Jon Voight, Lisa Bonet, and Regina King in supporting roles.

The film grossed over $250,000,000 worldwide ($111,549,836 within the US).

Plot

As the U.S. Congress moves to pass new legislation that dramatically expands the surveillance powers of intelligence agencies, Congressman Phil Hammersley (Robards) remains firmly opposed to its passage. To ensure the bill’s passage, National Security Agency official Thomas Reynolds (Voight) kills Hammersley, but he is unaware of a video camera set up by wildlife researcher Daniel Zavitz (Lee) that has captured the entire incident. Zavitz discovers the murder, and alerts an underground journalist, at the same time transferring the video to an innocuous computer disc. Reynolds learns of Zavitz’s footage, and sends a team to recover the video. While fleeing, Zavitz runs into an old college friend, labor lawyer Robert Clayton Dean (Smith). Zavitz secretly passes the computer disc into Dean’s shopping bag without his knowledge. Zavitz flees and is killed when hit by a fire truck. Reynolds soon has the underground journalist killed.

When the NSA discovers that Dean may have the video, a team raids his house and plants surveillance devices. Unable to find the video, the NSA proceeds to falsely incriminate Dean of passing classified information to Rachel Banks (Bonet), a former girlfriend. The subterfuge destroys Dean’s life: he is fired from his job, his bank accounts are frozen, and his wife (King) throws him out of the house. Dean, trailed by the NSA, meets with Banks, who sets up a meeting with “Brill”, one of her secret contacts. After meeting an NSA agent posing as Brill (Byrne), Dean realizes his error, only to have the real Brill, retired NSA agent Edward Lyle (Hackman), ferry him to temporary safety and help rid Dean of most of the tracking devices he is unwittingly carrying. Dean ultimately rids himself of the final device and, fleeing his pursuers, escapes.

With Dean and Lyle in hiding, the NSA agents kill Banks and frame Dean for the murder. Lyle is able to find evidence that the NSA executed Hammersley’s murder, but it is destroyed during an escape from an NSA raid.

It is then revealed that Lyle was an expert in communications for the NSA; he was stationed in Iran before the Iranian Revolution. When the revolution occurred, Lyle made it out of the country, but his partner, Rachel’s father, was killed. Since then he has been in hiding. Lyle tries to coax Dean into trying to run away, but Dean is adamant about clearing his name.

Dean and Lyle blackmail another supporter of the surveillance bill, Congressman Sam Albert (Wilson), by videotaping him having an affair with his aide. Dean and Lyle “hide” bugs that Reynolds had used on Dean in Albert’s room so Albert will find them and have the NSA start an investigation. Lyle also deposits $140,000 into Reynolds’ bank account to make it appear that he is taking bribes.

Lyle contacts Reynolds to tell him he has the video of the Hammersley murder and asks to meet. Dean tells them that the Hammersley murder footage is in the hands of Mafia boss Joey Pintero (Sizemore), whose office is under FBI surveillance. Dean, Reynolds, and the NSA team head into Pintero’s restaurant, precipitating a gunfight that kills the mobsters, Reynolds, and several of his NSA team.

Dean and Lyle escape, with Lyle quickly disappearing from the authorities. The FBI discovers the plot behind the legislation, causing it to fail, though they cover up the NSA’s involvement. Dean is cleared of all charges and is reunited with his wife. Lyle escapes to a tropical location, but sends a “goodbye” message to Dean.

Cast

  • Will Smith as Robert Clayton Dean
  • Gene Hackman as Edward “Brill” Lyle
  • Jon Voight as Thomas Brian Reynolds
  • Barry Pepper as David Pratt
  • Regina King as Carla Dean
  • Ian Hart as John Bingham
  • Lisa Bonet as Rachel F. Banks
  • Jascha Washington as Eric Dean
  • James LeGros as Jerry Miller
  • Jake Busey as Krug
  • Scott Caan as Jones
  • Jamie Kennedy as Jamie Williams
  • Jason Lee as Daniel Leon Zavitz
  • Gabriel Byrne as Fake Brill
  • Stuart Wilson as Congressman Sam Albert
  • Jack Black as Fiedler
  • Anna Gunn as Emily Reynolds
  • Laura Cayouette as Christa Hawkins
  • Loren Dean as Loren Hicks
  • Bodhi Elfman as Van
  • Dan Butler as NSA Director Admiral Shaffer
  • Seth Green as Selby (uncredited)
  • Tom Sizemore as Boss Paulie Pintero (uncredited)
  • Jason Robards as Congressman Phil Hammersley (uncredited)
  • Philip Baker Hall as Attorney Mark Silverberg (uncredited)
  • Brian Markinson as Attorney Brian Blake (uncredited)
  • Larry King as Himself (uncredited)
  • Ivana Miličević as Ruby’s Sales Clerk

Production

Although the story is set in both Washington, D.C., and Baltimore, most of the filming was done in Baltimore. Location shooting began on a ferry in Fells Point. In mid-January, the company moved to Los Angeles to complete production in April 1998.[3]

Mel Gibson and Tom Cruise were considered for the part that went to Will Smith, who took the role largely because he wanted to work with Gene Hackman and had previously enjoyed working with producer Jerry Bruckheimer on Bad Boys. George Clooney was also considered for a role in the film. Sean Connery was considered for the role that went to Hackman. The film’s crew included a technical surveillance counter-measures consultant who also had a minor role as a spy shop merchant. Hackman had previously acted in a similar thriller about spying and surveillance, The Conversation (1974).

Reception

Enemy of the State was moderately well received by critics. Rotten Tomatoes presented a 71% “Fresh” rating for the movie, with 57 critics approving of the movie and 24 noting the film as “Rotten;”[4] similar results could be found at the website Metacritic, which displayed a normalized ranking of 67 out of 100 on the basis of the views of 22 critics.[5] Kenneth Turan of the Los Angeles Times expressed enjoyment in the movie, noting how its “pizazz [overcame] occasional lapses in moment-to-moment plausibility;”[6] Janet Maslin of the New York Times approved of the film’s action-packed sequences, but cited how it was similar in manner to the rest of the members of “Simpson’s and Bruckheimer’s school of empty but sensation-packed filming.”[7] Bridging those two views, Edvins Beitiks of the San Francisco Examiner praised many of the movie’s development aspects, but criticized the overall concept that drove the film from the beginning, the efficiency of government intelligence, as unrealistic.[8]

According to film critic Kim Newman, Enemy of the State could be construed as a “continuation of The Conversation,” the 1974 psychological thriller that starred Hackman as a paranoid, isolated surveillance expert.[9]

Box office

The film opened at #2, behind The Rugrats Movie, grossing $20,038,573 over its first weekend in 2,393 theatres and averaging about $8,374 per venue.[10][11]

Real life

An episode of PBS’ Nova titled “Spy Factory” reports that the film’s portrayal of the NSA’s capabilities is fiction: although the agency can intercept transmissions, connecting the dots is difficult.[12] However, in 2001, then-NSA director Gen. Michael Hayden, who was appointed to the position around the time of the film’s release, told CNN’s Kyra Phillips that “I made the judgment that we couldn’t survive with the popular impression of this agency being formed by the last Will Smith movie.”[13] James Risen wrote in his 2006 book State of War: The Secret History of the CIA and the Bush Administration that Hayden “was appalled” by the film’s depiction of the NSA, and sought to counter it with a PR campaign on behalf of the agency.[14]

In June 2013 the NSA’s PRISM and Boundless Informant programs for domestic and international surveillance were uncovered by the Guardian and Washington Post as the result of information provided by whistleblower Edward Snowden. This information revealed far more extensive capabilities than those depicted in the film, such as the collection of the internet browsing, email and telephone data not only of every American, but of citizens of other nations as well. The Guardian’s John Patterson opined that Hollywood depictions of NSA surveillance, including Enemy of the State and Echelon Conspiracy, had “softened up” the American public to “the notion that our spending habits, our location, our every movement and conversation, are visible to others whose motives we cannot know.”[15]

http://en.wikipedia.org/wiki/Enemy_of_the_State_%28film%29

Related Posts On Pronk Palisades

James Bamford — The National Security Agency (NSA) — Videos

Big Brother Barack Targets All The American People As Enemies of The State and Democratic Party — National Security Agency’s PRISM Is The Secret Security Surveillance State (S4) Means of Invading Privacy and Limiting Liberty — Outrageous Overreach–Videos

No Such Agency — NSA — National Security Agency — Threat To The Liberty and Privacy of The American People — None Of Their Damn Business — Still Trust The Federal Government? — Videos

National Security Agency (NSA) and Federal Bureau Investigation (FBI) Secret Security Surveillance State (S4) Uses Stellar Wind and PRISM To Create Secret Dossiers On All American Citizen Targets Similar To East Germany Stasi Files–Videos

NSA’s PRISM Political Payoff: 40 Million Plus Foreigners Are In USA As Illegal Aliens! — 75% Plus Lean Towards Democratic Party — Pathway To One Party Rule By 2025 If Senate Bill Becomes Law Giving Illegal Aliens Legal Status — 25 Million American Citizens Looking For Full Time Jobs! — Videos


Digital Campaigns Using Microtargeting and Data Mining To Target Voters — Videos

Posted on June 12, 2013. Filed under: American History, Blogroll, College, Communications, Computers, Demographics, Economics, Education, Employment, Federal Government, Federal Government Budget, Fiscal Policy, Foreign Policy, government, government spending, history, History of Economic Thought, Investments, IRS, Language, Law, liberty, Life, Links, Literacy, Macroeconomics, media, People, Philosophy, Politics, Programming, Psychology, Raves, Strategy, Tax Policy, Taxes, Unemployment, Video, War, Wisdom


Maxine Waters Confirms “Big Brother” Database 2013

Maxine Waters (D) Slip of the Tongue Reveals True Intentions (Socialism for America)

Obama’s secret microtargeting operation

Campaigns admit to data mining

During campaigning, candidates are going to great lengths to find out about residents. Both presidential campaigns admit to tracking everything you do online.

Obama’s win: data mining

How We Used Data to Win the Presidential Election

Dan Siroker, of the Obama campaign and CarrotSticks, describes how the campaign used data to win the presidential election. He shares the lessons his team learned along the way and how one can apply them to any data-driven decision, whether in development, design, or even marketing.

Can You Replicate the Obama Strategy? | The New School for Public Engagement

Political campaigns have revolutionized the way they target, contact and motivate supporters. Strategists are taking the insights of experimental social science and marrying them to the corporate world’s Big Data marketing tools. The Obama campaign won in large part by using statistical modeling techniques to identify persuadable voters and to fine-tune persuasive messages. This is politics today and in the future—not only for elections but on issue campaigns for education reform, health care, the environment, labor rights and beyond. Who are the pioneers? And how might you apply their strategies?

Strata 2013: Sasha Issenberg, “The Victory Lab”

“The Victory Lab: ‘Moneyball for Politics'” by Sasha Issenberg

A Conversation with Sasha Issenberg

Sasha Issenberg discusses the 2012 Obama campaign

Sasha Issenberg discusses the use of social science experiments in Rick Perry’s 2006 campaign

Political Checklist: Frontline Looks at Digital Campaigns

Frontline: The Digital Factor in Election 2012

Frontline: How Much Do Digital Campaigns Know About You?

FRONTLINE  The Digital Campaign

http://video.pbs.org/video/2295038658

RNC/DNC Collecting Your Info En Masse

Microtargeting

About Aristotle

Who Works for Aristotle?

Better Data

Precision of Information in a Campaign

Aristotle Testimonial – Paul Kilgore

Aristotle 360 – Dashboard and Home Page

Aristotle 360 – Dashboard

Aristotle 360 – Power Tools for Politics

Aristotle 360 – Creating Records

Webinar – Aristotle 360 General Training

Webinar – Political Campaign Fundraising with Aristotle 360

Use Voter Data for a Smart Political Campaign

‘Big Brother’ is watching, in sophisticated digital ways

By Gitte Laasby

Town of Mukwonago voter Priscilla Trulen is used to ignoring political solicitations. For weeks, she’s been receiving three political robocalls per day related to the presidential election. On Thursday, she got seven.

But one call she got on Halloween still haunts her. It was a recorded message read by a presidential candidate trying to get her to vote.

“It was Mitt Romney saying, ‘I know you have an absentee ballot and I know you haven’t sent it in yet,’ ” Trulen said in an interview. “That just sent me over the line. Not only is it like Big Brother. It is Big Brother. It’s down to where they know I have a ballot and I haven’t sent it in! I thought when I requested the ballot that the only other entity that would know was the Mukwonago clerk.”

Trulen isn’t the only voter among Wisconsin’s much-courted electorate who is getting creeped out by the political campaigns’ unprecedented, uncanny ability to micro-target voters who are likely to vote for their candidate.

In Brown County, residents are unnerved about “voter report cards” from Moveon.org that show the recipients how their voting participation compares to those of their neighbors.

The solicitations give only a small glimpse into how much digital information the campaigns are able to access about voters.

For years, campaigns have requested the statewide voter registration list, which is subject to public information requests.

The database contains the names and addresses of active voters who are registered and able to vote, as well as inactive voters who are ineligible to vote because they have passed away, moved out of state or committed a felony, or people who need to re-register to be eligible, said Reid Magney, public information officer with the Wisconsin Government Accountability Board.

The list also contains information that the state does not release, for instance people’s birth dates, driver’s license numbers and phone numbers.

“It’s typical for both parties, or individual candidates, to be making public records requests from the clerks. And it’s perfectly legal,” Magney said. “This information is public so there’s transparency in our elections. . . . Except for how you vote, there really are no secrets.”

The state database also contains information on absentee voters. The state’s 1,851 municipalities are required to account for military and overseas absentee ballots both before and after the election, Magney said. Municipalities don’t have to report to the state whether regular absentee ballots such as Trulen’s have been returned until the election is over. However, some municipalities, including the Town of Mukwonago where Trulen lives, report to the state database as they go whether those ballots have been returned. Most likely, that’s how the Republican campaign found out Trulen received an absentee ballot.

“There’s nothing confidential as far as, ‘Did so and so vote?’ ” said Kathy Karalewitz, administrative clerk treasurer with the town. “As far as how they vote, yes.”

Requesters can also request information related to absentee ballots directly from the municipalities, although that’s more cumbersome and labor intensive.

The cost of the entire state database is $12,500. Four requesters have been willing to pay that since Sept. 1, Magney said: Catalist (a progressive voter database organization), the Democratic National Committee, and data analysis firm Aristotle – all based in Washington, D.C. The last requester was Colorado-based Magellan Strategies, a firm that specializes in “micro-targeting” for Republican parties and candidates.

Another 200 requests have been made since Sept. 1 for smaller portions of the database, Magney said.

Crunching the numbers

But what really enables the campaigns to “slice and dice” the electorate down to individual voters is that the voter list is correlated with a slew of other information designed to predict voting behavior and issues that the voter would care about.

In an interview with PBS that aired in October, Aristotle’s chief executive officer, John Phillips, said the company keeps up to 500 data points on each voter – from the type of clothes they buy, the music they listen to, magazines they read and car they own, to whether they are a NASCAR fan, a smoker or a pet owner, or have a gold credit card. Some of that information comes from commercial marketing firms, product registration cards or surveys. Other information is obtained through Facebook, door-to-door canvassing, petitions and computer cookies – small data codes that register which websites the user has visited.

Through data modeling, analyzers can categorize voters based on how they feel about specific issues, values or candidates. They then try to predict voting behavior and figure out which issue ads voters are most likely to be susceptible to – for instance ads on education, gun control or immigration.
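As a rough sketch of what such a model might look like (the trait names and weights below are invented for illustration, not drawn from any real campaign), a simple weighted score can sort voters into likely supporters, likely opponents, and the persuadable middle:

```python
import math

# Hypothetical persuadability scoring: a weighted sum of binary traits,
# squashed through a logistic function to a 0-1 score. Weights are invented.

def support_score(voter: dict, weights: dict, bias: float = 0.0) -> float:
    z = bias + sum(w for trait, w in weights.items() if voter.get(trait))
    return 1 / (1 + math.exp(-z))  # logistic squash to (0, 1)

weights = {"gun_owner": -0.8, "union_member": 1.1, "newshound_subscriber": 0.4}
voter = {"gun_owner": True, "union_member": True, "newshound_subscriber": False}

score = support_score(voter, weights)   # ~0.574 for this voter
persuadable = 0.35 < score < 0.65       # the middle band a campaign targets
```

Voters scoring near the extremes get turnout (or suppression) messaging; the middle band gets the tailored persuasion ads described above.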

One of the companies that requested the full Wisconsin voter database, Magellan Strategies, explains on its website that it conducts surveys on people’s opinions and merges that with their political, consumer and census demographics.

“By correlating respondents’ demographics to the demographics of the whole voting district, we can make predictions about the voting preferences of each voter in the district,” the site states.

The company also states why the strategy is so popular.

“Microtargeting enables campaigns to send targeted messages to voters who are very receptive to those messages,” the website states. “Microtargeting allows for the most cost effective voter targeting programs, for voter persuasion or get-out-the-vote.”

According to its website, Magellan has conducted microtargeting since 2008.

A little extra effort is required to determine party affiliation in Wisconsin, which, unlike other states such as California, does not register voters by party.

The last piece of the puzzle is the phone number, which is not available through the government, but easily found in a phone book or located in online databases, sometimes free of charge.

Nathan Conrad, a spokesman for the Republican Party of Wisconsin, did not respond to a request for comment on how the campaign obtained Trulen’s digits. Graeme Zielinski, a spokesman for the Democratic Party of Wisconsin, did not respond to requests about how his party obtains phone numbers either.

As for Trulen, she just wishes she could find a way to make the calls stop.

“It’s alarming to me,” she said. “It’s just not right. . . . It’s like you can feel the tentacles creeping into your house under your door.”

The calls to Trulen were likely part of the GOP’s effort to get out the vote in what the party considers one of its strongest counties. Waukesha County is traditionally a Republican stronghold, just as Milwaukee tends to go for Democrats.

The irony is that the robocallers apparently haven’t figured out Trulen is actually a minority in her county: She has been voting Democratic.

Big Brother

Political campaigns can obtain nearly unlimited information about people through commercially available databases. Here’s what information they can, and can’t, learn about you from public records related to voting:

Public (obtainable)

Your name, address, gender and race

Which elections you voted in, going back to 2000

Whether you have requested an absentee ballot and whether you have sent it in.

Private (redacted)

Whom you voted for

Your date of birth

Your Social Security number, and any part of it

Your driver’s license number

Your phone number (if officials remember to redact it before they release your registration to anyone who asks).

Online

For more on the information that campaigns and others collect on you, watch this video from PBS.

http://www.jsonline.com/news/wisconsin/unprecedented-microtargeting-by-campaigns-creeps-out-voters-007f111-177062301.html

Microtargeting

Microtargeting is the use by political parties and election campaigns of direct-marketing data-mining techniques that involve predictive market segmentation (also known as cluster analysis). It is used by United States Republican and Democratic parties and candidates to track individual voters and identify potential supporters.

They then use various means of communication (direct mail, phone calls, home visits, television, radio, web advertising, email, text messaging, etc.) to communicate with voters, crafting messages to build support for fundraising, campaign events and volunteering, and eventually to turn them out to the polls on election day. Microtargeting’s tactics rely on transmitting a tailored message to a subgroup of the electorate on the basis of unique information about that subgroup.

History

Although some of the tactics of microtargeting had been used in California since 1992, it really started to be used nationally only in 2004.[1] In that year, Karl Rove, along with Blaise Hazelwood at the Republican National Committee, used it to reach voters in 18 states that George W. Bush’s reelection campaign was not able to reach by other means. The result was greater contact with likely Bush voters. For example, in Iowa the campaign was able to reach 92% of eventual Bush voters (compared to 50% in 2000), and in Florida it was able to reach 84% (compared to 50% in 2000).[2] Much of this pioneering work was done by Alex Gage and his firm, TargetPoint Consulting.

Democrats did only limited microtargeting in 2004, with some crediting microtargeting for Kerry’s win in Iowa in 2004.[3] Some news accounts credited Republican superiority in that area for victories in that election cycle.[4] Democrats later developed microtargeting capabilities for the 2006 election cycle.[1][2] “It’s no secret that the other side [Republicans] figured this out a little sooner”, said Josh Syrjamaki, director of the Minnesota chapter of America Votes in October 2006. “They’ve had four to six years’ jump on us on this stuff…but we feel like we can start to catch up.”[5]

Method

Microtargeting is a modification of a practice used by commercial direct marketers. It would not be possible on a large scale without the development of large and sophisticated databases that contain data about as many voters as possible. The database essentially tracks voter habits in the same ways that companies like Visa track consumer spending habits. The Republican National Committee’s database is called Voter Vault. The Democratic National Committee effort is called VoteBuilder.[6] A parallel Democratic effort is being developed by Catalist, a $9 million initiative headed by Harold Ickes,[2] while the leading non-partisan database is offered by Aristotle.[7]

The databases contain specific information about a particular voter (party affiliation, frequency of voting, contributions, volunteerism, etc.) with other activities and habits available from commercial marketing vendors such as Acxiom, Dun & Bradstreet, Experian Americas, and InfoUSA. Such personal information is a “product” sold to interested companies. These data are particularly illuminating when portrayed through a Geographic Information System (GIS), where trends based on location can be mapped alongside dozens or hundreds of other variables. This geographic depiction also makes it ideal for volunteers to visit potential voters (armed with lists in hand, laid out in the shortest route – much like how FedEx and UPS pre-determine delivery routes).
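A minimal sketch of that kind of route pre-planning (addresses and coordinates invented) is a greedy nearest-neighbor pass over the walk list:

```python
import math

# Greedy nearest-neighbor ordering of a canvassing walk list: repeatedly
# visit the closest unvisited address. Names and coordinates are invented.

def walk_order(homes):
    remaining = list(homes)
    route = [remaining.pop(0)]  # start from the first address on the list
    while remaining:
        _, last_xy = route[-1]
        nearest = min(remaining, key=lambda h: math.dist(last_xy, h[1]))
        remaining.remove(nearest)
        route.append(nearest)
    return [name for name, _ in route]

homes = [("A", (0, 0)), ("C", (5, 5)), ("B", (1, 0)), ("D", (5, 6))]
order = walk_order(homes)  # visits A, then nearby B, then C, then D
```

Greedy ordering is not optimal in general (real routing software uses stronger heuristics), but it captures the idea of laying out a volunteer's list in a short walking order.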

These databases are then mined to identify issues important to each voter and whether that voter is more likely to identify with one party or another. Political information is obviously important here, but consumer preferences can play a role as well. Individual voters are then put into groups on the basis of sophisticated computer modeling. Such groups have names like “Downscale Union Independents”, “Tax and Terrorism Moderates,” and “Older Suburban Newshounds.”[2][5]
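The grouping step itself is classic cluster analysis. The toy k-means pass below (trait vectors invented) shows how voters with similar profiles fall into the same segment, which a consultant would then label with a name like those above:

```python
import random

# Toy k-means: assign each point to its nearest center, recompute centers,
# repeat. Each "voter" is (turnout frequency, conservative-issue index).

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # pick k initial centers from the data
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [tuple(sum(v) / len(v) for v in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return clusters

voters = [(0.9, 0.8), (0.85, 0.75), (0.1, 0.2), (0.15, 0.25)]
clusters = kmeans(voters, k=2)
# Two clean segments emerge: frequent conservative voters vs. the rest.
```

Real systems model hundreds of variables rather than two, but the principle is the same: nearby trait vectors end up in the same addressable segment.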

Once a multitude of voting groups is established according to these criteria and their minute political differences, then the tailored messages can be sent via the appropriate means. While political parties and candidates once prepared a single television advertisement for general broadcast nationwide, it is now not at all uncommon to have several dozen variations on the one message, each with a unique and tailored message for that small demographic sliver of the voting public. This is the same for radio advertisement, direct mail, email, as well as stump speeches and fundraising events.


References

  1. ^ a b Chad Vander Veen, Zeroing In, www.govtech.net, Jan 2, 2006, accessed November 1, 2006.
  2. ^ a b c d Yochi J. Dreazen, Democrats, Playing Catch-Up, Tap Database to Woo Potential Voters, The Wall Street Journal, October 31, 2006, A1.
  3. ^ Schaller, T: New Math: How a trio of savvy Kerry campaign workers used a fresh voter equation to win Iowa., web only. American Prospect, 2004.
  4. ^ Martin Kettle, “How Democrats missed the vote”, The Guardian, November 3, 2006 [1], accessed February 2, 2007
  5. ^ a b Dan Balz, Democrats Aim to Regain Edge In Getting Voters to the Polls, Washington Post, October 8, 2006, accessed November 7, 2006. [2]
  6. ^ Aaron Blake (August 15, 2007). “DNC holds national training as it rolls out new voter file”. The Hill.
  7. ^ James Verini (December 3, 2007). “Big Brother Inc.”. Vanity Fair.

External links

http://en.wikipedia.org/wiki/Microtargeting

Data mining

Data mining (the analysis step of the “Knowledge Discovery in Databases” process, or KDD),[1] an interdisciplinary subfield of computer science,[2][3][4] is the computational process of discovering patterns in large data sets involving methods at the intersection of artificial intelligence, machine learning, statistics, and database systems.[2] The overall goal of the data mining process is to extract information from a data set and transform it into an understandable structure for further use.[2] Aside from the raw analysis step, it involves database and data management aspects, data preprocessing, model and inference considerations, interestingness metrics, complexity considerations, post-processing of discovered structures, visualization, and online updating.[2]

The term is a buzzword,[5] and is frequently misused to mean any form of large-scale data or information processing (collection, extraction, warehousing, analysis, and statistics) but is also generalized to any kind of computer decision support system, including artificial intelligence, machine learning, and business intelligence. In the proper use of the word, the key term is discovery[citation needed], commonly defined as “detecting something new”. Even the popular book “Data mining: Practical machine learning tools and techniques with Java”[6] (which covers mostly machine learning material) was originally to be named just “Practical machine learning”, and the term “data mining” was only added for marketing reasons.[7] Often the more general terms “(large scale) data analysis“, or “analytics” – or when referring to actual methods, artificial intelligence and machine learning – are more appropriate.

The actual data mining task is the automatic or semi-automatic analysis of large quantities of data to extract previously unknown interesting patterns such as groups of data records (cluster analysis), unusual records (anomaly detection) and dependencies (association rule mining). This usually involves using database techniques such as spatial indices. These patterns can then be seen as a kind of summary of the input data, and may be used in further analysis or, for example, in machine learning and predictive analytics. For example, the data mining step might identify multiple groups in the data, which can then be used to obtain more accurate prediction results by a decision support system. Neither the data collection, data preparation, nor result interpretation and reporting are part of the data mining step, but do belong to the overall KDD process as additional steps.
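Of the tasks just listed, anomaly detection is the easiest to show in miniature. A crude version (thresholding on standard deviations; the readings are invented) flags the "unusual records" in a batch:

```python
import statistics

# Crude anomaly detection: flag values more than `threshold` population
# standard deviations away from the mean. Readings are invented.

def anomalies(values, threshold=2.0):
    mean = statistics.mean(values)
    spread = statistics.pstdev(values)
    return [v for v in values if abs(v - mean) > threshold * spread]

readings = [10, 11, 9, 10, 12, 10, 11, 95]
outliers = anomalies(readings)  # only the 95 stands apart from the rest
```

Production systems use far more robust methods (the mean and deviation here are themselves skewed by the outlier), but the shape of the task is the same: summarize the bulk of the data, then surface the records that do not fit.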

The related terms data dredging, data fishing, and data snooping refer to the use of data mining methods to sample parts of a larger population data set that are (or may be) too small for reliable statistical inferences to be made about the validity of any patterns discovered. These methods can, however, be used in creating new hypotheses to test against the larger data populations.

Data mining uses information from past data to analyze the outcome of a particular problem or situation that may arise. The data being analyzed is typically stored in data warehouses and may come from all parts of a business, from production to management. Managers also use data mining to decide upon marketing strategies for their products and to compare and contrast themselves with competitors. Data mining turns its data into real-time analysis that can be used to increase sales, promote new products, or drop products that add no value for the company.

Etymology

In the 1960s, statisticians used terms like “data fishing” or “data dredging” to refer to what they considered the bad practice of analyzing data without an a priori hypothesis. The term “data mining” appeared around 1990 in the database community. At the beginning of the century, there was the phrase “database mining”™, trademarked by HNC, a San Diego-based company (now merged into FICO), to pitch their Data Mining Workstation;[8] researchers consequently turned to “data mining”. Other terms used include data archaeology, information harvesting, information discovery, and knowledge extraction. Gregory Piatetsky-Shapiro coined the term “Knowledge Discovery in Databases” for the first workshop on the topic (1989), and this term became more popular in the AI and machine learning community. However, the term data mining became more popular in the business and press communities.[9] Currently, the terms data mining and knowledge discovery are used interchangeably.

Background

The manual extraction of patterns from data has occurred for centuries. Early methods of identifying patterns in data include Bayes’ theorem (1700s) and regression analysis (1800s). The proliferation, ubiquity and increasing power of computer technology have dramatically increased data collection, storage, and manipulation ability. As data sets have grown in size and complexity, direct “hands-on” data analysis has increasingly been augmented with indirect, automated data processing, aided by other discoveries in computer science, such as neural networks, cluster analysis, genetic algorithms (1950s), decision trees (1960s), and support vector machines (1990s). Data mining is the process of applying these methods with the intention of uncovering hidden patterns[10] in large data sets. It bridges the gap from applied statistics and artificial intelligence (which usually provide the mathematical background) to database management by exploiting the way data is stored and indexed in databases to execute the actual learning and discovery algorithms more efficiently, allowing such methods to be applied to ever larger data sets.

Research and evolution

The premier professional body in the field is the Association for Computing Machinery‘s (ACM) Special Interest Group (SIG) on Knowledge Discovery and Data Mining (SIGKDD). Since 1989 this ACM SIG has hosted an annual international conference and published its proceedings,[11] and since 1999 it has published a biannual academic journal titled “SIGKDD Explorations”.[12]

There are a number of dedicated computer science conferences on data mining. Data mining topics are also present at many data management/database conferences, such as the ICDE Conference, the SIGMOD Conference, and the International Conference on Very Large Data Bases (VLDB).

Process

The Knowledge Discovery in Databases (KDD) process is commonly defined with the stages:

(1) Selection
(2) Pre-processing
(3) Transformation
(4) Data Mining
(5) Interpretation/Evaluation.[1]

Many variations on this theme exist, however, such as the Cross Industry Standard Process for Data Mining (CRISP-DM), which defines six phases:

(1) Business Understanding
(2) Data Understanding
(3) Data Preparation
(4) Modeling
(5) Evaluation
(6) Deployment

or a simplified process such as (1) pre-processing, (2) data mining, and (3) results validation.
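The simplified three-stage process can be sketched in Python. This is a minimal illustration only: the function names and the toy “frequent item” mining step are invented for this example and are not part of any standard.

```python
# A minimal sketch of the simplified process: (1) pre-processing,
# (2) data mining, (3) results validation. The frequent-item miner
# is a deliberately tiny stand-in for a real algorithm.
from collections import Counter

def preprocess(records):
    """Pre-processing: drop records containing missing (None) fields."""
    return [r for r in records if None not in r]

def mine(records, min_support=0.5):
    """Data mining: find items appearing in at least min_support of records."""
    counts = Counter(item for r in records for item in set(r))
    return {item for item, c in counts.items() if c / len(records) >= min_support}

def validate(patterns, holdout, min_support=0.5):
    """Results validation: keep patterns that also hold on held-out data."""
    return patterns & mine(holdout, min_support)
```

Running the three steps in order mirrors the KDD flow: clean first, mine on the cleaned data, then confirm the discovered patterns on data the miner never saw.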

Polls conducted in 2002, 2004, and 2007 show that the CRISP-DM methodology is the leading methodology used by data miners.[13][14][15] The only other data mining standard named in these polls was SEMMA. However, 3-4 times as many people reported using CRISP-DM. Several teams of researchers have published reviews of data mining process models,[16][17] and Azevedo and Santos conducted a comparison of CRISP-DM and SEMMA in 2008.[18]

Pre-processing

Before data mining algorithms can be used, a target data set must be assembled. As data mining can only uncover patterns actually present in the data, the target data set must be large enough to contain these patterns while remaining concise enough to be mined within an acceptable time limit. A common source for data is a data mart or data warehouse. Pre-processing is essential for analyzing multivariate data sets before data mining. The target set is then cleaned. Data cleaning removes the observations containing noise and those with missing data.

Data mining

Data mining involves six common classes of tasks:[1]

  • Anomaly detection (outlier/change/deviation detection) – the identification of unusual data records that might be interesting, or of data errors that require further investigation.
  • Association rule learning (dependency modeling) – searches for relationships between variables. For example, a supermarket might gather data on customer purchasing habits. Using association rule learning, the supermarket can determine which products are frequently bought together and use this information for marketing purposes. This is sometimes referred to as market basket analysis.
  • Clustering – the task of discovering groups and structures in the data that are in some way or another “similar”, without using known structures in the data.
  • Classification – the task of generalizing known structure to apply to new data. For example, an e-mail program might attempt to classify an e-mail as “legitimate” or as “spam”.
  • Regression – attempts to find a function which models the data with the least error.
  • Summarization – providing a more compact representation of the data set, including visualization and report generation.
  • Sequential pattern mining – finds sets of data items that occur together frequently in some sequences. Sequential pattern mining, which extracts frequent subsequences from a sequence database, has attracted a great deal of interest in recent data mining research because it is the basis of many applications, such as web user analysis, stock trend prediction, DNA sequence analysis, finding linguistic patterns in natural language texts, and using the history of symptoms to predict certain kinds of disease.
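One of these task classes, anomaly detection, can be illustrated with a simple z-score rule: flag values lying more than a chosen number of standard deviations from the mean. The threshold and the data below are illustrative assumptions; real outlier detectors are usually more robust than this sketch.

```python
# Hedged sketch of anomaly detection via z-scores: a record is flagged
# as an outlier when its distance from the mean exceeds `threshold`
# standard deviations. A threshold around 2-3 is a common rule of thumb.
import statistics

def zscore_outliers(values, threshold=3.0):
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > threshold]
```

On small samples a single extreme value inflates the standard deviation, which is why robust variants (e.g. median-based scores) are often preferred in practice.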

Results validation

The final step of knowledge discovery from data is to verify that the patterns produced by the data mining algorithms occur in the wider data set. Not all patterns found by the data mining algorithms are necessarily valid. It is common for the data mining algorithms to find patterns in the training set which are not present in the general data set. This is called overfitting. To overcome this, the evaluation uses a test set of data on which the data mining algorithm was not trained. The learned patterns are applied to this test set and the resulting output is compared to the desired output. For example, a data mining algorithm trying to distinguish “spam” from “legitimate” emails would be trained on a training set of sample e-mails. Once trained, the learned patterns would be applied to the test set of e-mails on which it had not been trained. The accuracy of the patterns can then be measured from how many e-mails they correctly classify. A number of statistical methods may be used to evaluate the algorithm, such as ROC curves.

If the learned patterns do not meet the desired standards, then it is necessary to re-evaluate and change the pre-processing and data mining steps. If the learned patterns do meet the desired standards, then the final step is to interpret the learned patterns and turn them into knowledge.
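The spam example above can be sketched as a holdout evaluation. Everything here is an illustrative assumption: the “learning” step is a trivial keyword rule, and the e-mails are invented; the point is only the separation between training data and test data.

```python
# Minimal sketch of train/test validation for a toy spam classifier:
# "train" collects words seen only in spam, and accuracy is measured
# on e-mails the training step never saw.
def train(emails):
    """Learn the words that appear in spam but never in legitimate mail."""
    spam_words = {w for text, label in emails if label == "spam" for w in text.split()}
    ham_words = {w for text, label in emails if label == "legitimate" for w in text.split()}
    return spam_words - ham_words

def accuracy(spam_markers, test_emails):
    """Fraction of held-out e-mails the learned rule classifies correctly."""
    correct = 0
    for text, label in test_emails:
        predicted = "spam" if set(text.split()) & spam_markers else "legitimate"
        correct += (predicted == label)
    return correct / len(test_emails)
```

A rule that scores well on the training e-mails but poorly under `accuracy` on the held-out set is exactly the overfitting the text describes.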

Standards

There have been some efforts to define standards for the data mining process, for example the 1999 European Cross Industry Standard Process for Data Mining (CRISP-DM 1.0) and the 2004 Java Data Mining standard (JDM 1.0). Development on successors to these processes (CRISP-DM 2.0 and JDM 2.0) was active in 2006, but has stalled since. JDM 2.0 was withdrawn without reaching a final draft.

For exchanging the extracted models – in particular for use in predictive analytics – the key standard is the Predictive Model Markup Language (PMML), which is an XML-based language developed by the Data Mining Group (DMG) and supported as exchange format by many data mining applications. As the name suggests, it only covers prediction models, a particular data mining task of high importance to business applications. However, extensions to cover (for example) subspace clustering have been proposed independently of the DMG.[19]

Notable uses

See also category: Applied data mining

Games

Since the early 1960s, with the availability of oracles for certain combinatorial games, also called tablebases (e.g. for 3×3 chess with any beginning configuration, small-board dots-and-boxes, small-board hex, and certain endgames in chess, dots-and-boxes, and hex), a new area for data mining has been opened: the extraction of human-usable strategies from these oracles. Current pattern recognition approaches do not seem to fully acquire the high level of abstraction required to be applied successfully. Instead, extensive experimentation with the tablebases, combined with an intensive study of tablebase answers to well-designed problems and with knowledge of prior art (i.e. pre-tablebase knowledge), is used to yield insightful patterns. Berlekamp (in dots-and-boxes, etc.) and John Nunn (in chess endgames) are notable examples of researchers doing this work, though they were not, and are not, involved in tablebase generation.

Business

Data mining is the analysis of historical business activities, stored as static data in data warehouse databases, to reveal hidden patterns and trends. Data mining software uses advanced pattern recognition algorithms to sift through large amounts of data to assist in discovering previously unknown strategic business information. Examples of what businesses use data mining for include performing market analysis to identify new product bundles, finding the root cause of manufacturing problems, preventing customer attrition, acquiring new customers, cross-selling to existing customers, and profiling customers with more accuracy.[20] In today’s world, raw data is being collected by companies at an exploding rate. For example, Walmart processes over 20 million point-of-sale transactions every day. This information is stored in a centralized database, but would be useless without some type of data mining software to analyze it. If Walmart analyzed their point-of-sale data with data mining techniques, they would be able to determine sales trends, develop marketing campaigns, and more accurately predict customer loyalty.[21] Every time we use a credit card or a store loyalty card, or fill out a warranty card, data is being collected about our purchasing behavior. Many people find the amount of information stored about us by companies such as Google, Facebook, and Amazon disturbing, and are concerned about privacy. Although there is the potential for our personal data to be used in harmful or unwanted ways, it is also being used to make our lives better. For example, Ford and Audi hope to one day collect information about customer driving patterns so they can recommend safer routes and warn drivers about dangerous road conditions.[22]

Data mining in customer relationship management applications can contribute significantly to the bottom line.[citation needed] Rather than randomly contacting a prospect or customer through a call center or sending mail, a company can concentrate its efforts on prospects that are predicted to have a high likelihood of responding to an offer. More sophisticated methods may be used to optimize resources across campaigns so that one may predict to which channel and to which offer an individual is most likely to respond (across all potential offers). Additionally, sophisticated applications could be used to automate mailing. Once the results from data mining (potential prospect/customer and channel/offer) are determined, this “sophisticated application” can either automatically send an e-mail or a regular mail. Finally, in cases where many people will take an action without an offer, “uplift modeling” can be used to determine which people have the greatest increase in response if given an offer. Uplift modeling thereby enables marketers to focus mailings and offers on persuadable people, and not to send offers to people who will buy the product without an offer. Data clustering can also be used to automatically discover the segments or groups within a customer data set.
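The uplift idea described above reduces, in its simplest form, to comparing response rates with and without the offer per customer segment. The segment names and response data below are invented for illustration; production uplift models are fitted statistically rather than tabulated like this.

```python
# Hedged sketch of uplift modeling: for each segment, uplift is the
# response rate of customers who received an offer minus the response
# rate of a control group that did not. Segments with near-zero uplift
# buy (or don't buy) regardless of the offer and need not be contacted.
def uplift(treated, control):
    """treated/control: lists of 0/1 responses for one segment."""
    return sum(treated) / len(treated) - sum(control) / len(control)

segments = {
    "loyal":       ([1, 1, 1, 1], [1, 1, 1, 0]),   # buys anyway: low uplift
    "persuadable": ([1, 1, 0, 1], [0, 0, 1, 0]),   # responds to the offer
}
best = max(segments, key=lambda s: uplift(*segments[s]))
```

Targeting the segment with the highest uplift, rather than the highest raw response rate, is what keeps offers away from customers who would have bought anyway.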

Businesses employing data mining may see a return on investment, but they also recognize that the number of predictive models can quickly become very large. Rather than using one model to predict how many customers will churn, a business could build a separate model for each region and customer type. Then, instead of sending an offer to all people that are likely to churn, it may only want to send offers to loyal customers. Finally, the business may want to determine which customers are going to be profitable over a certain window in time, and only send the offers to those that are likely to be profitable. In order to maintain this quantity of models, businesses need to manage model versions and move on to automated data mining.

Data mining can also be helpful to human resources (HR) departments in identifying the characteristics of their most successful employees. Information obtained – such as universities attended by highly successful employees – can help HR focus recruiting efforts accordingly. Additionally, Strategic Enterprise Management applications help a company translate corporate-level goals, such as profit and margin share targets, into operational decisions, such as production plans and workforce levels.[23]

Another example of data mining, often called market basket analysis, relates to its use in retail sales. If a clothing store records the purchases of customers, a data mining system could identify those customers who favor silk shirts over cotton ones. Although some explanations of such relationships may be difficult, taking advantage of them is easier. The example deals with association rules within transaction-based data. Not all data are transaction-based, however, and logical or inexact rules may also be present within a database.

Market basket analysis has also been used to identify the purchase patterns of the Alpha Consumer. Alpha Consumers are people that play a key role in connecting with the concept behind a product, then adopting that product, and finally validating it for the rest of society. Analyzing the data collected on this type of user has allowed companies to predict future buying trends and forecast supply demands.[citation needed]

Data mining is a highly effective tool in the catalog marketing industry.[citation needed] Catalogers have a rich database of history of their customer transactions for millions of customers dating back a number of years. Data mining tools can identify patterns among customers and help identify the most likely customers to respond to upcoming mailing campaigns.

Data mining for business applications is a component that needs to be integrated into a complex modeling and decision making process. Reactive business intelligence (RBI) advocates a “holistic” approach that integrates data mining, modeling, and interactive visualization into an end-to-end discovery and continuous innovation process powered by human and automated learning.[24]

In the area of decision making, the RBI approach has been used to mine knowledge that is progressively acquired from the decision maker, and then self-tune the decision method accordingly.[25]

An example of data mining related to an integrated-circuit (IC) production line is described in the paper “Mining IC Test Data to Optimize VLSI Testing.”[26] In this paper, the application of data mining and decision analysis to the problem of die-level functional testing is described. The experiments described demonstrate the ability of a system that mines historical die-test data to create a probabilistic model of patterns of die failure. These patterns are then utilized to decide, in real time, which die to test next and when to stop testing. This system has been shown, based on experiments with historical test data, to have the potential to improve profits on mature IC products.

Science and engineering

In recent years, data mining has been used widely in the areas of science and engineering, such as bioinformatics, genetics, medicine, education and electrical power engineering.

In the study of human genetics, sequence mining helps address the important goal of understanding the mapping relationship between the inter-individual variations in human DNA sequence and the variability in disease susceptibility. In simple terms, it aims to find out how changes in an individual’s DNA sequence affect the risk of developing common diseases such as cancer, which is of great importance to improving methods of diagnosing, preventing, and treating these diseases. The data mining method that is used to perform this task is known as multifactor dimensionality reduction.[27]

In the area of electrical power engineering, data mining methods have been widely used for condition monitoring of high-voltage electrical equipment. The purpose of condition monitoring is to obtain valuable information on, for example, the status of the insulation (or other important safety-related parameters). Data clustering techniques, such as the self-organizing map (SOM), have been applied to vibration monitoring and analysis of transformer on-load tap-changers (OLTCs). Using vibration monitoring, it can be observed that each tap change operation generates a signal that contains information about the condition of the tap changer contacts and the drive mechanisms. Different tap positions naturally generate different signals, but there was considerable variability amongst normal condition signals for exactly the same tap position. SOM has been applied to detect abnormal conditions and to hypothesize about the nature of the abnormalities.[28]

Data mining methods have also been applied to dissolved gas analysis (DGA) in power transformers. DGA, as a diagnostic for power transformers, has been available for many years. Methods such as SOM have been applied to analyze the generated data and to determine trends which are not obvious to the standard DGA ratio methods (such as the Duval Triangle).[28]

Another example of data mining in science and engineering is found in educational research, where data mining has been used to study the factors leading students to choose to engage in behaviors which reduce their learning,[29] and to understand factors influencing university student retention.[30] A similar example of social application of data mining is its use in expertise finding systems, whereby descriptors of human expertise are extracted, normalized, and classified so as to facilitate the finding of experts, particularly in scientific and technical fields. In this way, data mining can facilitate institutional memory.

Other examples of applying data mining methods include mining biomedical data facilitated by domain ontologies,[31] mining clinical trial data,[32] and traffic analysis using SOM.[33]

In adverse drug reaction surveillance, the Uppsala Monitoring Centre has, since 1998, used data mining methods to routinely screen for reporting patterns indicative of emerging drug safety issues in the WHO global database of 4.6 million suspected adverse drug reaction incidents.[34] Recently, similar methodology has been developed to mine large collections of electronic health records for temporal patterns associating drug prescriptions to medical diagnoses.[35]

Data mining has also been applied to software artifacts within the realm of software engineering: Mining Software Repositories.

Human rights

Data mining of government records – particularly records of the justice system (i.e. courts, prisons) – enables the discovery of systemic human rights violations in connection to generation and publication of invalid or fraudulent legal records by various government agencies.[36][37]

Medical data mining

In 2011, in the case of Sorrell v. IMS Health, Inc., the Supreme Court of the United States ruled that pharmacies may share information with outside companies, holding that the practice is protected under the First Amendment of the Constitution, which guarantees “freedom of speech.”[38]

Spatial data mining

Spatial data mining is the application of data mining methods to spatial data. The end objective of spatial data mining is to find patterns in data with respect to geography. So far, data mining and Geographic Information Systems (GIS) have existed as two separate technologies, each with its own methods, traditions, and approaches to visualization and data analysis. Particularly, most contemporary GIS have only very basic spatial analysis functionality. The immense explosion in geographically referenced data occasioned by developments in IT, digital mapping, remote sensing, and the global diffusion of GIS emphasizes the importance of developing data-driven inductive approaches to geographical analysis and modeling.

Data mining offers great potential benefits for GIS-based applied decision-making. Recently, the task of integrating these two technologies has become of critical importance, especially as various public and private sector organizations possessing huge databases with thematic and geographically referenced data begin to realize the huge potential of the information contained therein. Among those organizations are:

  • offices requiring analysis or dissemination of geo-referenced statistical data
  • public health services searching for explanations of disease clustering
  • environmental agencies assessing the impact of changing land-use patterns on climate change
  • geo-marketing companies doing customer segmentation based on spatial location.

Challenges in Spatial mining: Geospatial data repositories tend to be very large. Moreover, existing GIS datasets are often splintered into feature and attribute components that are conventionally archived in hybrid data management systems. Algorithmic requirements differ substantially for relational (attribute) data management and for topological (feature) data management.[39] Related to this is the range and diversity of geographic data formats, which present unique challenges. The digital geographic data revolution is creating new types of data formats beyond the traditional “vector” and “raster” formats. Geographic data repositories increasingly include ill-structured data, such as imagery and geo-referenced multi-media.[40]

There are several critical research challenges in geographic knowledge discovery and data mining. Miller and Han[41] offer the following list of emerging research topics in the field:

  • Developing and supporting geographic data warehouses (GDWs): Spatial properties are often reduced to simple aspatial attributes in mainstream data warehouses. Creating an integrated GDW requires solving issues of spatial and temporal data interoperability – including differences in semantics, referencing systems, geometry, accuracy, and position.
  • Better spatio-temporal representations in geographic knowledge discovery: Current geographic knowledge discovery (GKD) methods generally use very simple representations of geographic objects and spatial relationships. Geographic data mining methods should recognize more complex geographic objects (i.e. lines and polygons) and relationships (i.e. non-Euclidean distances, direction, connectivity, and interaction through attributed geographic space such as terrain). Furthermore, the time dimension needs to be more fully integrated into these geographic representations and relationships.
  • Geographic knowledge discovery using diverse data types: GKD methods should be developed that can handle diverse data types beyond the traditional raster and vector models, including imagery and geo-referenced multimedia, as well as dynamic data types (video streams, animation).

Sensor data mining

Wireless sensor networks can be used for facilitating the collection of data for spatial data mining for a variety of applications such as air pollution monitoring.[42] A characteristic of such networks is that nearby sensor nodes monitoring an environmental feature typically register similar values. This kind of data redundancy, due to the spatial correlation between sensor observations, inspires techniques for in-network data aggregation and mining. By measuring the spatial correlation between data sampled by different sensors, a wide class of specialized algorithms can be designed to perform spatial data mining more efficiently.[43]
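The in-network aggregation idea can be sketched as follows: readings from sensors within a chosen radius of each other are averaged, so only one summary value per neighborhood needs to be transmitted. The coordinates, readings, and greedy grouping strategy below are illustrative assumptions, not a published protocol.

```python
# Hedged sketch of in-network aggregation exploiting spatial
# correlation: spatially close sensors are assumed to report similar
# values, so each neighborhood is summarized by a single average.
import math

def aggregate(sensors, radius=1.5):
    """sensors: list of ((x, y), reading). Greedy neighborhood averaging."""
    remaining = list(sensors)
    summaries = []
    while remaining:
        (cx, cy), _ = remaining[0]
        near = [s for s in remaining if math.dist((cx, cy), s[0]) <= radius]
        summaries.append(sum(r for _, r in near) / len(near))
        remaining = [s for s in remaining if s not in near]
    return summaries
```

The compression comes from the redundancy the text describes: the more correlated nearby readings are, the less information is lost by transmitting only the neighborhood averages.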

Visual data mining

In the process of turning from analog to digital, large data sets have been generated, collected, and stored; discovering the statistical patterns, trends, and information hidden in this data makes it possible to build predictive models. Studies suggest visual data mining is faster and much more intuitive than traditional data mining.[44][45][46] See also Computer Vision.

Music data mining

Data mining techniques, and in particular co-occurrence analysis, have been used to discover relevant similarities among music corpora (radio playlists, CD databases) for the purpose of classifying music into genres in a more objective manner.[47]

Surveillance

Data mining has been used in counter-terrorism programs of the U.S. government, including the Total Information Awareness (TIA) program, Secure Flight (formerly known as the Computer-Assisted Passenger Prescreening System, CAPPS II), Analysis, Dissemination, Visualization, Insight, Semantic Enhancement (ADVISE),[48] and the Multi-state Anti-Terrorism Information Exchange (MATRIX).[49] These programs have been discontinued due to controversy over whether they violate the 4th Amendment to the United States Constitution, although many programs that were formed under them continue to be funded by different organizations or under different names.[50]

In the context of combating terrorism, two particularly plausible methods of data mining are “pattern mining” and “subject-based data mining”.

Pattern mining

“Pattern mining” is a data mining method that involves finding existing patterns in data. In this context, patterns often means association rules. The original motivation for searching for association rules came from the desire to analyze supermarket transaction data, that is, to examine customer behavior in terms of the purchased products. For example, an association rule “beer ⇒ potato chips (80%)” states that four out of five customers that bought beer also bought potato chips.
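The confidence figure in a rule like “beer ⇒ potato chips (80%)” can be computed directly: it is the share of beer-containing transactions that also contain chips. The transactions below are invented so that the rule comes out at exactly 80%.

```python
# Hedged sketch of association-rule confidence: of the transactions
# containing the antecedent, what fraction also contain the consequent?
def confidence(transactions, antecedent, consequent):
    with_a = [t for t in transactions if antecedent in t]
    with_both = [t for t in with_a if consequent in t]
    return len(with_both) / len(with_a)

transactions = [
    {"beer", "chips"}, {"beer", "chips"}, {"beer", "chips"},
    {"beer", "chips", "milk"}, {"beer", "bread"},
]
```

Real association rule miners (e.g. Apriori-style algorithms) additionally filter rules by support, so that rules are only reported when the antecedent itself occurs often enough to matter.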

In the context of pattern mining as a tool to identify terrorist activity, the National Research Council provides the following definition: “Pattern-based data mining looks for patterns (including anomalous data patterns) that might be associated with terrorist activity — these patterns might be regarded as small signals in a large ocean of noise.”[51][52][53] Pattern mining includes new areas such as Music Information Retrieval (MIR), where patterns seen in both the temporal and non-temporal domains are imported into classical knowledge discovery search methods.

Subject-based data mining

“Subject-based data mining” is a data mining method involving the search for associations between individuals in data. In the context of combating terrorism, the National Research Council provides the following definition: “Subject-based data mining uses an initiating individual or other datum that is considered, based on other information, to be of high interest, and the goal is to determine what other persons or financial transactions or movements, etc., are related to that initiating datum.”[52]

Knowledge grid

Knowledge discovery “On the Grid” generally refers to conducting knowledge discovery in an open environment using grid computing concepts, allowing users to integrate data from various online data sources, as well as make use of remote resources, for executing their data mining tasks. The earliest example was the Discovery Net,[54][55] developed at Imperial College London, which won the “Most Innovative Data-Intensive Application Award” at the ACM SC02 (Supercomputing 2002) conference and exhibition, based on a demonstration of a fully interactive distributed knowledge discovery application for a bioinformatics application. Other examples include work conducted by researchers at the University of Calabria, who developed a Knowledge Grid architecture for distributed knowledge discovery, based on grid computing.[56][57]

Reliability / Validity

Data mining can be misused, and can also unintentionally produce results which appear significant but which do not actually predict future behavior and cannot be reproduced on a new sample of data. See Data snooping, Data dredging.

Privacy concerns and ethics

Some people believe that data mining itself is ethically neutral.[58] While the term “data mining” has no ethical implications, it is often associated with the mining of information in relation to people’s behavior (ethical and otherwise). To be precise, data mining is a statistical method that is applied to a set of information (i.e. a data set). Associating these data sets with people is an extreme narrowing of the types of data that are available. Examples could range from a set of crash test data for passenger vehicles, to the performance of a group of stocks. These types of data sets make up a great proportion of the information available to be acted on by data mining methods, and rarely have ethical concerns associated with them. However, the ways in which data mining can be used can in some cases and contexts raise questions regarding privacy, legality, and ethics.[59] In particular, data mining government or commercial data sets for national security or law enforcement purposes, such as in the Total Information Awareness Program or in ADVISE, has raised privacy concerns.[60][61]

Data mining requires data preparation which can uncover information or patterns which may compromise confidentiality and privacy obligations. A common way for this to occur is through data aggregation. Data aggregation involves combining data together (possibly from various sources) in a way that facilitates analysis (but that also might make identification of private, individual-level data deducible or otherwise apparent).[62] This is not data mining per se, but a result of the preparation of data before – and for the purposes of – the analysis. The threat to an individual’s privacy comes into play when the data, once compiled, cause the data miner, or anyone who has access to the newly compiled data set, to be able to identify specific individuals, especially when the data were originally anonymous.

It is recommended that an individual is made aware of the following before data are collected:[62]

  • the purpose of the data collection and any (known) data mining projects
  • how the data will be used
  • who will be able to mine the data and use the data and their derivatives
  • the status of security surrounding access to the data
  • how collected data can be updated.

In America, privacy concerns have been addressed to some extent by the US Congress via the passage of regulatory controls such as the Health Insurance Portability and Accountability Act (HIPAA). HIPAA requires individuals to give their “informed consent” regarding information they provide and its intended present and future uses. According to an article in Biotech Business Week, “‘[i]n practice, HIPAA may not offer any greater protection than the longstanding regulations in the research arena,’ says the AAHC. More importantly, the rule’s goal of protection through informed consent is undermined by the complexity of consent forms that are required of patients and participants, which approach a level of incomprehensibility to average individuals.”[63] This underscores the necessity for data anonymity in data aggregation and mining practices.

Data may also be modified so as to become anonymous, so that individuals may not readily be identified.[62] However, even “de-identified”/”anonymized” data sets can potentially contain enough information to allow identification of individuals, as occurred when journalists were able to find several individuals based on a set of search histories that were inadvertently released by AOL.[64]

Software

Free open-source data mining software and applications

  • Carrot2: Text and search results clustering framework.
  • Chemicalize.org: A chemical structure miner and web search engine.
  • ELKI: A university research project with advanced cluster analysis and outlier detection methods written in the Java language.
  • GATE: a natural language processing and language engineering tool.
  • SCaViS: Java cross-platform data analysis framework developed at Argonne National Laboratory.
  • KNIME: The Konstanz Information Miner, a user friendly and comprehensive data analytics framework.
  • ML-Flex: A software package that enables users to integrate with third-party machine-learning packages written in any programming language, execute classification analyses in parallel across multiple computing nodes, and produce HTML reports of classification results.
  • NLTK (Natural Language Toolkit): A suite of libraries and programs for symbolic and statistical natural language processing (NLP) for the Python language.
  • SenticNet API: A semantic and affective resource for opinion mining and sentiment analysis.
  • Orange: A component-based data mining and machine learning software suite written in the Python language.
  • R: A programming language and software environment for statistical computing, data mining, and graphics. It is part of the GNU project.
  • RapidMiner: An environment for machine learning and data mining experiments.
  • UIMA: The UIMA (Unstructured Information Management Architecture) is a component framework for analyzing unstructured content such as text, audio and video – originally developed by IBM.
  • Weka: A suite of machine learning software applications written in the Java programming language.

Commercial data-mining software and applications

Marketplace surveys

Several researchers and organizations have conducted reviews of data mining tools and surveys of data miners. These identify some of the strengths and weaknesses of the software packages. They also provide an overview of the behaviors, preferences and views of data miners. Some of these reports include:

 

Related Posts On Pronk Palisades

Amnesty Before Enforcement — Congressional Gangsters’ Comprehensive Immigration “Reform” Bill Targets American Citizens For Unemployment — American Citizens Want All Illegal Aliens Deported Not Rewarded With Legal Status — Target The Amnesty Illegal Alien Gangsters For Defeat — Videos

NSC’s PRISM Political Payoff: 40 Million Plus Foreigners Are In USA As Illegal Aliens! — 75% Plus Lean Towards Democratic Party — Pathway To One Party Rule By 2025 If Senate Bill Becomes Law Giving Illegal Aliens Legal Status — 25 Million American Citizens Looking For Full Time Jobs! — Videos

No Such Agency — NSA — National Security Agency — Threat To The Liberty and Privacy of The American People — None Of Their Damn Business — Still Trust The Federal Government? — Videos

Big Brother Barack Targets All The American People As Enemies of The State and Democratic Party — National Security Agency’s PRISM Is The Secret Security Surveillance State (S4) Means of Invading Privacy and Limiting Liberty — Outrageous Overreach — Videos

U.S. Hacking China and Hong Kong — Videos

Sasha Issenberg — The Victory Lab: The Secret Science of Winning Campaigns — Videos

Related Posts on Pronk Pops

Pronk Pops Show 112, June 7, 2013, Segment 0: Marxist-Leninists Go To The Wall With Holder — The Man Who Knows Where The Bodies Are Buried Enjoys President Obama’s Full Confidence Says Political Fixer Valerie Jarrett — Wall Street Wants Holder To Hang On — American People Say Hit The Road Jack — Videos

Pronk Pops Show 112, June 7, 2013: Segment 1: U.S. Real Gross Domestic Product Growth Still Stagnating At 2.4% in First Quarter of 2013 As Institute for Supply Management Factory Index Sinks to 49.0 Lowest Since June 2009 — Videos

Pronk Pops Show 112, June 7, 2013, Segment 2: Federal Advisory Council (FAC) May 17, 2013 Report — No Exit To A Bridge Over Troubled Waters — Keyboarding Money — We’re screwed! — Videos

Pronk Pops Show 112, June 7, 2013, Segment 3: Official Unemployment Rate Rises To 7.6% with 11.8 Million Americans Unemployed and Only 175,000 Jobs Created in May — Videos

Pronk Pops Show 112, June 7, 2013, Segment 4: No Such Agency — NSA — National Security Agency — Threat To The Liberty and Privacy of The American People — None Of Their Damn Business — Still Trust The Federal Government? — Videos


Warning, Warning, Stock Market Crash Alert: Hindenburg Omen Signals A Stock Market Crash–Videos

Posted on August 26, 2010. Filed under: Blogroll, Communications, Investments, Language, Law, liberty, Life, Links, People, Philosophy, Politics, Raves, Video, War, Wisdom | Tags: , , , , , , |


Hindenburg Explodes 1937 Vintage Silent News Reel

1937 THE HINDENBURG – NEW OUTSTANDING COLOR FOOTAGE!!!

Beck: The Hindenburg Omen

Hindenburg Omen

Hindenburg

Financial Predictions Scarier than the Hindenburg Omen

Those who work in Wall Street or follow technical investment analysis are all abuzz over the Hindenburg Omen being triggered or called on Thursday, August 12, 2010 and subsequently confirmed on August 20, 2010.

The Hindenburg Omen is a technical analysis pattern that signals not just a stock market decline but a stock market crash within forty days of the appearance of the Hindenburg Omen pattern.

The Hindenburg Omen is named after the Hindenburg disaster of May 6, 1937, in which Germany’s zeppelin airship Hindenburg exploded and was destroyed as it was attempting to land in Lakehurst, New Jersey.

The creator of the Hindenburg Omen was a blind mathematician named Jim Miekka, who came up with the idea in 1995 as an indicator to predict major stock market movements using the New York Stock Exchange 52-week new Highs and Lows and moving average statistics.

In order for the Hindenburg Omen to be called the following criteria or requirements must be met:

  1. The daily number of New York Stock Exchange (NYSE) new 52 Week Highs and the daily number of new 52 Week Lows must both be greater than 2.2 percent of total NYSE issues traded that day.
  2. The New York Stock Exchange (NYSE) 10 Week moving average is rising.
  3. The new 52 Week Highs cannot be more than twice the new 52 Week Lows.
  4. The McClellan Oscillator is negative on that same day.

On August 12, 2010, 92 companies listed on the NYSE, or 2.9% of all companies, hit a new 52-week high and 81 companies, or 2.6%, hit a new 52-week low, and the McClellan Oscillator was indeed negative that same day.
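The four conditions above can be sketched as a simple check. This is an illustration, not a trading tool: the function name is invented, the issue count of 3,126 is taken from the 2.2%-of-issues figure quoted later in this post, and practitioners differ on exact thresholds and data sources.

```python
def hindenburg_omen(new_highs, new_lows, total_issues,
                    ma10_rising, mcclellan_oscillator):
    """Check the four traditional Hindenburg Omen conditions.

    new_highs / new_lows: daily counts of NYSE new 52-week highs and lows.
    total_issues: total NYSE issues traded that day.
    ma10_rising: True if the NYSE 10-week moving average is rising.
    mcclellan_oscillator: that day's McClellan Oscillator value.
    """
    threshold = 0.022 * total_issues           # 2.2% of issues traded
    cond1 = new_highs > threshold and new_lows > threshold
    cond2 = ma10_rising                        # 10-week MA is rising
    cond3 = new_highs <= 2 * new_lows          # highs not more than 2x lows
    cond4 = mcclellan_oscillator < 0           # oscillator negative
    return cond1 and cond2 and cond3 and cond4

# The figures cited above for August 12, 2010: 92 new highs and 81 new
# lows out of roughly 3,126 issues, with a rising 10-week average and a
# negative McClellan Oscillator (-10.0 stands in for the actual value).
print(hindenburg_omen(92, 81, 3126, True, -10.0))  # True
```

Note how condition 3 does the filtering the quoted research below calls “absolutely mandatory”: if new highs are more than double new lows, no signal fires no matter how extreme the other readings are.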

The McClellan Oscillator is a market breadth indicator that measures the extent to which the stock market is overbought or oversold by the amount of money entering or leaving the market.

A number above zero indicates the market is bullish or stock prices are increasing and a number below zero indicates the market is bearish or stock prices are decreasing.

Does that mean there will definitely be a stock market crash of 15% or more in September 2010?

No, it does not, for the Hindenburg Omen has been wrong before.

What it does mean is that the probability of a stock market crash is quite high, due to an extreme divergence of investor opinion, with large numbers of companies hitting either new 52-week highs or new 52-week lows.

When this happens it is very unlikely that there will be an upward movement in stock market prices.

If your investment portfolio or retirement plan is in equities or stocks, you can probably expect stock market prices to decline by at least 5%.

If you were counting on a significant increase in your retirement plan’s investment performance so you could retire, better put your plans on hold for the time being.

In a period of stock market uncertainty as well as economic uncertainty, cash, or better yet gold, is king.

Should the Hindenburg Omen again prove correct and the stock market significantly decline, this will definitely impact the elections, as it did in 2008 when the Hindenburg Omen correctly called the stock market crash.

Also, keep in mind that stock market performance is generally a leading indicator of where the economy is headed.

Looks like the Bush Obama Depression will last at least two more years.

The official unemployment rate will most likely exceed 9%, as measured by the Bureau of Labor Statistics U-3 unemployment statistical series, and the total unemployment rate will most likely exceed 14%, as measured by the U-6 unemployment statistical series, for another two years.

This means that more than 25 million Americans will be looking for full-time employment.

If you are graduating from high school or college finding a job will take considerably longer than in the past.


Background Articles and Videos

Doom. Doom! DOOM!!!

“…Here are the official Hindenburg Omen criteria that were met last week:

  • That the daily number of NYSE new 52 Week Highs and the daily number of new 52 Week Lows must both be greater than 2.2 percent of total NYSE issues traded that day.
  • That the smaller of these numbers is greater than or equal to 69 (68.772 is 2.2% of 3126). This is not a rule but more like a checksum. This condition is a function of the 2.2% of the total issues.
  • That the NYSE 10 Week moving average is rising.
  • That the McClellan Oscillator is negative on that same day.
  • That new 52 Week Highs cannot be more than twice the new 52 Week Lows (however it is fine for new 52 Week Lows to be more than double new 52 Week Highs). This condition is absolutely mandatory.

http://reason.com/blog/2010/08/15/doom-doom-doom

Hindenburg Omen

“..The Hindenburg Omen is a technical analysis pattern that is said to portend a stock market crash. It is named after the Hindenburg disaster of May 6, 1937, during which the German zeppelin Hindenburg was destroyed.

History

The Omen is largely based on Norman G. Fosback’s High Low Logic Index (HLLI).[1] The value of the HLLI is the lesser of the NYSE new highs or new lows divided by the number of NYSE issues traded, smoothed by an appropriate exponential moving average. The Omen itself is said to have originated with Jim Miekka[2], and the name was suggested by the late Kennedy Gammage.

Mechanics

The Hindenburg Omen is a combination of technical factors that attempt to measure the health of the NYSE, and by extension, the stock market as a whole. The goal of the indicator is to signal increased probability of a stock market crash.

The rationale is that under “normal conditions” either a substantial number of stocks may set new annual highs or annual lows, but not both at the same time. As a healthy market possesses a degree of uniformity, whether up or down, the simultaneous presence of many new highs and lows may signal trouble.

Criteria

These criteria are calculated daily using Wall Street Journal figures for consistency. (Other exchanges may be used as well.) Some have been recalibrated by Miekka to reduce statistical noise and make the indicator a more reliable predictor of a future decline.

  1. The daily number of NYSE new 52 week highs and the daily number of new 52 week lows are both greater than or equal to 2.8 percent (typically, 84) of the sum of NYSE issues that advance or decline that day (typically, around 3000)[3]. An older version of the indicator used a threshold of 2.5 percent of total issues traded (approximately 80 of 3200 in today’s market).
  2. The NYSE index is greater in value than it was 50 trading days ago. Originally, this was expressed as a rising 10 week moving average, but the new rule is more relevant to the daily data used to look at new highs and lows.
  3. The McClellan Oscillator is negative on the same day.
  4. New 52 week highs cannot be more than twice the new 52 week lows (though new 52 week lows may be more than double new highs).

The traditional definition requires each condition to occur on the same day. Once the signal has occurred, it is valid for 30 days, and any additional signals given during the 30-day period should be ignored. During the 30 days, the signal is activated whenever the McClellan Oscillator is negative, but deactivated whenever it is positive.[4]

…”

http://en.wikipedia.org/wiki/Hindenburg_Omen

The Past Performance of the Hindenburg Omen Stock Market Crash Signals 1985 – 2005

“…How has this signal performed over the past 21 years, since 1985? The traditional definition of a Hindenburg Omen is that the daily number of NYSE New 52 Week Highs and the Daily number of New 52 Week Lows must both be so high as to have the lesser of the two be greater than 2.2 percent of total NYSE issues traded that day. However, this is just condition number one. The traditional definition had two more filters: That the NYSE 10 Week Moving Average is also Rising (condition # 2), and that the McClellan Oscillator is negative on that same day (condition # 3). These measures are calculated each evening using Wall Street Journal figures for consistency. Critics have taken this definition and pointed rightly to several failed Omens (although the correlation was still quite good).

But if we add two more filters, the correlation to subsequent severe stock market declines is remarkable. Condition # 4 requires that New 52 Week NYSE Highs cannot be more than twice New 52 Week Lows, however it is okay for New 52 Week Lows to be more than double New 52 Week Highs. Our research found that there were two incidences where the first three conditions existed, but New Highs were more than double New Lows, and no market decline resulted. There were no instances noted where if 52 Week Highs were more than double New Lows, while the first three conditions were met, that a severe decline followed. So condition # 4 becomes a critical defining component. The fifth condition we found important for high correlation is that for a confirmed Hindenburg Omen, in other words for it to be “official,” there must be more than one signal within a 36 day period, i.e., there must be a cluster of Hindenburg Omens (defined as two or more) to substantially increase the probability of a coming stock market plunge. Our research noted seven instances over the past 21 years where – using the first four conditions – there was just one isolated Hindenburg Omen signal over a thirty-six day period. In six of the seven instances, no sharp declines followed. In only one instance did a sharp subsequent sell-off occur based upon a non-cluster single Omen, but in that case it was incredibly close to having a cluster of two Omens as the previous day’s McClellan Oscillator just missed being negative. We included this instance in our data below. …”

http://www.safehaven.com/article/3880/the-past-performance-of-the-hindenburg-omen-stock-market-crash-signals-1985-2005


http://www.onlinetradingconcepts.com/TechnicalAnalysis/McClellanOscillator.html

McClellan Oscillator

“…The McClellan Oscillator is a market breadth indicator used by financial analysts of the New York Stock Exchange to evaluate the rate of money entering or leaving the market and interpretively indicate overbought or oversold conditions of the market.[1]

History

Developed by Sherman and Marian McClellan in 1969, the Oscillator is computed using the exponential moving average (EMA) of the daily ordinal difference of advancing issues (stocks which gained in value) from declining issues (stocks which fell in value) over 39 trading day and 19 trading day periods.

How it works

The simplified formula for determining the Oscillator is:

Oscillator = (19 day EMA of Advances minus Declines) − (39 day EMA of Advances minus Declines)

The McClellan Summation Index (MSI) is calculated by adding each day’s McClellan Oscillator to the previous day’s Summation Index.

By using the Summation Index of the McClellan Oscillator, you can judge the market’s overall bullishness or bearishness.

MSI properties

  • above zero it is considered to be bullish (positive growth)
  • below zero it is considered to be bearish (negative growth)

The Summation Index is oversold at -1000 to -1250 or overbought at 1000 to 1250. [1]

The number of stocks in a stock market determine the dynamic range of the MSI. For the NZSX (one of the smallest exchanges in the English speaking world) the MSI would probably range between (-50 … +50), the 19 and 39 constants (used for the US exchanges) would have to be revised. For the NZSX a MSI moving average mechanism might be needed to smooth out the perturbations of such a small number of traded stocks. …”

http://en.wikipedia.org/wiki/McClellan_Oscillator
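The oscillator formula and Summation Index quoted above can be sketched as follows. This is a simplified illustration: the EMA is seeded with the first value, which is one common convention, and the 19- and 39-day periods correspond to the McClellans’ 10% and 5% smoothing constants (alpha = 2 / (period + 1)).

```python
def ema(values, period):
    """Exponential moving average, seeded with the first value."""
    alpha = 2.0 / (period + 1)  # 19-day -> 0.10, 39-day -> 0.05
    avg = values[0]
    for v in values[1:]:
        avg = alpha * v + (1 - alpha) * avg
    return avg

def mcclellan_oscillator(adv_minus_dec):
    """(19-day EMA of advances minus declines)
       - (39-day EMA of advances minus declines)."""
    return ema(adv_minus_dec, 19) - ema(adv_minus_dec, 39)

def summation_index(adv_minus_dec, start=0.0):
    """Running total: each day's oscillator added to the prior MSI."""
    msi = start
    for day in range(1, len(adv_minus_dec) + 1):
        msi += mcclellan_oscillator(adv_minus_dec[:day])
    return msi

# A flat market (advances equal declines every day) yields a zero
# oscillator; steadily improving breadth yields a positive one, because
# the faster 19-day EMA tracks the recent higher readings more closely.
print(round(mcclellan_oscillator([0.0] * 60), 6))               # 0.0
print(mcclellan_oscillator([float(i) for i in range(60)]) > 0)  # True
```

This also makes the bullish/bearish reading concrete: a positive oscillator means the short-term breadth average sits above the long-term one (money entering the market), and a negative one means the reverse.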

We Get An Official Confirmed Hindenburg Omen On August 20th, 2010

“…A reader asked: “Would you be willing to test or comment on the 8/14 Wall Street Journal article ‘Hindenberg Omen Flashes’?” The Hindenburg Omen is a complex technical signal that, including confirmation via clusters of signals, consists of simultaneous satisfaction of five rules for NYSE stocks. Different informal sources indicate some variation in the rules among practitioners. For the sake of consistency in rule application, we consider the “confirmed” Hindenburg Omens cited by Robert McHugh in his 8/21/10 article entitled “We Get An Official Confirmed Hindenburg Omen On August 20th, 2010″. This article states that, after Hindenburg Omens, “plunges can occur as soon as the next day, or as far into the future as four months.” Using the dates of the Hindenburg Omens reported in these articles and weekly closing levels of the S&P 500 Index during 1/3/86 through 8/13/10, we find that:

We make the following adjustments to the dates of confirmed Hindenburg Omens as listed in the cited article:

  • Exclude one of the duplicate listings for 6/20/02.
  • Replace the out-of-order listing of 2/22/98 with the date of 12/23/98 in the associated footnote (which would be in order).

Note that the maximum drawdown listed in the cited article after the Hindenburg Omen date of 6/6/08 falls outside the specified four-month horizon for omen effectiveness. In general, the post-omen maximum drawdowns appear to derive from intraday data for the Dow Jones Industrial Average over intervals ranging from one day to 276 days. Four of the omen dates have four-month horizons that overlap with four-month horizons of other omen dates.

 

http://www.cxoadvisory.com/technical-trading/hindenburg-omens/

Related Posts On Pronk Palisades

