The National Security Agency has implanted software in nearly 100,000 computers around the world that allows the United States to conduct surveillance on those machines and can also create a digital highway for launching cyberattacks.
While most of the software is inserted by gaining access to computer networks, the N.S.A. has increasingly made use of a secret technology that enables it to enter and alter data in computers even if they are not connected to the Internet, according to N.S.A. documents, computer experts and American officials.
The technology, which the agency has used since at least 2008, relies on a covert channel of radio waves that can be transmitted from tiny circuit boards and USB cards inserted surreptitiously into the computers. In some cases, they are sent to a briefcase-size relay station that intelligence agencies can set up miles away from the target.
The N.S.A. calls its efforts more an act of “active defense” against foreign cyberattacks than a tool to go on the offensive. But when Chinese attackers place similar software on the computer systems of American companies or government agencies, American officials have protested, often at the presidential level.
Among the most frequent targets of the N.S.A. and its Pentagon partner, United States Cyber Command, have been units of the Chinese Army, which the United States has accused of launching regular digital probes and attacks on American industrial and military targets, usually to steal secrets or intellectual property. But the program, code-named Quantum, has also been successful in inserting software into Russian military networks and systems used by the Mexican police and drug cartels, trade institutions inside the European Union, and sometime partners against terrorism like Saudi Arabia, India and Pakistan, according to officials and an N.S.A. map that indicates sites of what the agency calls “computer network exploitation.”
“What’s new here is the scale and the sophistication of the intelligence agency’s ability to get into computers and networks to which no one has ever had access before,” said James Andrew Lewis, a cybersecurity expert at the Center for Strategic and International Studies in Washington. “Some of these capabilities have been around for a while, but the combination of learning how to penetrate systems to insert software and learning how to do that using radio frequencies has given the U.S. a window it’s never had before.”
No Domestic Use Seen
There is no evidence that the N.S.A. has implanted its software or used its radio frequency technology inside the United States. While refusing to comment on the scope of the Quantum program, the N.S.A. said its actions were not comparable to China’s.
“N.S.A.’s activities are focused and specifically deployed against — and only against — valid foreign intelligence targets in response to intelligence requirements,” Vanee Vines, an agency spokeswoman, said in a statement. “We do not use foreign intelligence capabilities to steal the trade secrets of foreign companies on behalf of — or give intelligence we collect to — U.S. companies to enhance their international competitiveness or increase their bottom line.”
Over the past two months, parts of the program have been disclosed in documents from the trove leaked by Edward J. Snowden, the former N.S.A. contractor. A Dutch newspaper published the map of areas where the United States has inserted spy software, sometimes in cooperation with local authorities, often covertly. Der Spiegel, a German newsmagazine, published the N.S.A.’s catalog of hardware products that can secretly transmit and receive digital signals from computers, a program called ANT. The New York Times withheld some of those details, at the request of American intelligence officials, when it reported, in the summer of 2012, on American cyberattacks on Iran.
President Obama is scheduled to announce on Friday what recommendations he is accepting from an advisory panel on changing N.S.A. practices. The panel agreed with Silicon Valley executives that some of the techniques developed by the agency to find flaws in computer systems undermine global confidence in a range of American-made information products like laptop computers and cloud services.
Embracing Silicon Valley’s critique of the N.S.A., the panel has recommended banning, except in extreme cases, the N.S.A. practice of exploiting flaws in common software to aid in American surveillance and cyberattacks. It also called for an end to government efforts to weaken publicly available encryption systems, and said the government should never develop secret ways into computer systems to exploit them, which sometimes include software implants.
Richard A. Clarke, an official in the Clinton and Bush administrations who served as one of the five members of the advisory panel, explained the group’s reasoning in an email last week, saying that “it is more important that we defend ourselves than that we attack others.”
“Holes in encryption software would be more of a risk to us than a benefit,” he said, adding: “If we can find the vulnerability, so can others. It’s more important that we protect our power grid than that we get into China’s.”
From the earliest days of the Internet, the N.S.A. had little trouble monitoring traffic because a vast majority of messages and searches were moved through servers on American soil. As the Internet expanded, so did the N.S.A.’s efforts to understand its geography. A program named Treasure Map tried to identify nearly every node and corner of the web, so that any computer or mobile device that touched it could be located.
A 2008 map, part of the Snowden trove, notes 20 programs to gain access to big fiber-optic cables — it calls them “covert, clandestine or cooperative large accesses” — not only in the United States but also in places like Hong Kong, Indonesia and the Middle East. The same map indicates that the United States had already conducted “more than 50,000 worldwide implants,” and a more recent budget document said that by the end of last year that figure would rise to about 85,000. A senior official, who spoke on the condition of anonymity, said the actual figure was most likely closer to 100,000.
That map suggests how the United States was able to speed ahead with implanting malicious software on the computers around the world that it most wanted to monitor — or disable before they could be used to launch a cyberattack.
A Focus on Defense
In interviews, officials and experts said that a vast majority of such implants are intended only for surveillance and serve as an early warning system for cyberattacks directed at the United States.
“How do you ensure that Cyber Command people” are able to look at “those that are attacking us?” a senior official, who compared it to submarine warfare, asked in an interview several months ago.
“That is what the submarines do all the time,” said the official, speaking on the condition of anonymity to describe policy. “They track the adversary submarines.” In cyberspace, he said, the United States tries “to silently track the adversaries while they’re trying to silently track you.”
If tracking subs was a Cold War cat-and-mouse game with the Soviets, tracking malware is a pursuit played most aggressively with the Chinese.
The United States has targeted Unit 61398, the Shanghai-based Chinese Army unit believed to be responsible for many of the biggest cyberattacks on the United States, in an effort to see attacks being prepared. With Australia’s help, one N.S.A. document suggests, the United States has also focused on another specific Chinese Army unit.
Documents obtained by Mr. Snowden indicate that the United States has set up two data centers in China — perhaps through front companies — from which it can insert malware into computers. When the Chinese place surveillance software on American computer systems — and they have, on systems like those at the Pentagon and at The Times — the United States usually regards it as a potentially hostile act, a possible prelude to an attack. Mr. Obama laid out America’s complaints about those practices to President Xi Jinping of China in a long session at a summit meeting in California last June.
At that session, Mr. Obama tried to differentiate between conducting surveillance for national security — which the United States argues is legitimate — and conducting it to steal intellectual property.
TAKE IT TO THE LIMITS: Milton Friedman on Libertarianism
Giving Away Money Costs More Than You Think
Downsizing the Federal Government
Downsize the Department of Energy
Can We Eliminate the Department of Education? (Charles Murray)
$5 Billion Loan for Solar Energy — Department of Energy
Phil Kerpen on Neil Cavuto to discuss the DOE loan program
Our Ever Growing Dependence on Government
Obamanomics: A Legacy of Wasteful Spending
Why Does Big Business Love Big Government? (Domhoff, Rothbard, and Evers)
G. William Domhoff is a research professor in psychology and sociology at the University of California, Santa Cruz. He is the author of Who Rules America? (1967), Bohemian Grove and Other Retreats: A Study in Ruling-Class Cohesiveness (1974), and other books.
A prolific author and Austrian economist, Murray Rothbard promoted a form of free market anarchism he called “anarcho-capitalism.”
Bill Evers was a resident scholar at Stanford University’s Hoover Institution (and is currently a research fellow there) and also served as Assistant Secretary for the Office of Planning, Evaluation and Policy Development in the U.S. Department of Education from 2007 to 2009.
In this lecture Domhoff, Rothbard, and Evers talk about the “interlocking overlappers” that get together to influence the government, in California and in the country generally. They each spend some time describing what it is that draws businesspeople to market-capturing and rent-seeking behaviors, and take questions from the audience.
Walter Block – Free-Market Environmentalism [Australian Mises Seminar 2012]
How Murray Rothbard Became a Libertarian
The tide is rising for America’s libertarians
By Edward Luce
The new spirit in a rising climate of anti-politics has become an attitude, rather than a movement
Robert Nozick, the late US libertarian, smoked pot while he was writing Anarchy, State, and Utopia. He would applaud the growth of libertarianism among today’s young Americans. Whether it is their enthusiasm for legalised marijuana and gay marriage – both spreading across the US at remarkable speed – or their scepticism of government, US millennials no longer follow President Barack Obama’s cue. Most of America’s youth revile the Tea Party, particularly its south-dominated nativist core. But they are not big-government activists either. If there is a new spirit in America’s rising climate of anti-politics, it is libertarian.
On the face of it this ought to pose a bigger challenge to the Republican party – at least for its social conservative wing. Mr Obama may have disappointed America’s young, particularly the millions of graduates who have failed to find good jobs during his presidency. But he is no dinosaur. In contrast, Republicans such as Rick Santorum, the former presidential hopeful, who once likened gay sex to “man on dog”, elicit pure derision. Even moderate Republicans, such as Chris Christie, who until last week was the early frontrunner for the party’s 2016 nomination, are considered irrelevant. Whether Mr Christie was telling the truth last week, when he denied knowledge of his staff’s role in orchestrating a punitive local traffic jam, is beside the point. Mr Christie’s Sopranos brand of New Jersey politics is not tailored to the Apple generation.
The opposite is true of Rand Paul, the Kentucky senator, whose chances of taking the 2016 prize rose with Mr Christie’s dented fortunes last week. Unlike Ron Paul, the senator’s father, who still managed to garner a large slice of the youth vote in 2008, Rand Paul eschews the more outlandish fringes of libertarian thought. Rather than promising an isolationist US withdrawal from the world, he touts a more moderate “non-interventionism”. Instead of pledging to end fiat money, he promises to audit the US Federal Reserve – “mend the Fed”, rather than “end the Fed”. Both find an echo among Generation Y. So too does his alarmism about the US national debt. Far from being big spenders, millennials are more concerned about US debt than other generations, according to polls. They are also strongly in favour of free trade. More than a third of the Republican party now identifies as libertarian, according to the Cato Institute. Just under a quarter of Americans do so too, says Gallup.
All of which looks ominous for Ted Cruz, the Texan Republican whose lengthy filibuster against Obamacare last year lit the fuse for the US government shutdown. Mr Cruz, also a 2016 aspirant, leads the pugilistic wing of the Republican party that is prepared to burn the house down in order to save the ranch. Although also a Tea Partier, Mr Paul is cultivating a sunnier Reaganesque optimism that draws on the deep roots of US libertarianism. His brand of politics also strikes a chord with those who fear the growth of the US surveillance state – the types who view Edward Snowden (another millennial) as a hero rather than a traitor. Last year the US House of Representatives came within 12 votes of passing a bill to defund the National Security Agency. Mr Paul led the bill in the Senate. Next time they could succeed.
What does it mean for the Democrats? In terms of social values, libertarians are almost identical to liberals. Smoking pot and same-sex marriage both meet with big approval. The same is not necessarily true of guns. In spite of recent school massacres, 40 US states now have “concealed weapons” laws – many passed in the past 12 months. Again, millennials are surprisingly sceptical of gun control, say the polls. But it is on economic policy where they really part company with liberals. The Great Depression helped forge a generation of solid Democrats. The same does not appear to be true of the Great Recession. Franklin Roosevelt helped dig people out of misery in the 1930s by providing direct public employment. Mr Obama, on the other hand, has stuck largely to economic orthodoxy. He may have missed a golden opportunity to mould a generation of social democrats.
He has also inadvertently fuelled scepticism about the role of government. Mr Obama came to power in 2008 on a surge of voluntarism. He did so in part by appealing to youthful idealism about public service. That now feels like a long time ago. Distrust in public institutions has continued to rise during his presidency – most strongly among the youngest generation. The share of voters who identify as independents, rather than Democrats or Republicans, recently hit an all-time high of 42 per cent, according to Gallup. This is bad news for established figures in either party – and, indeed, in any walk of life. Hillary Clinton should beware. So should Jeb Bush.
On the minus side, libertarians have no real answer to many of America’s biggest problems – not least the challenges posed to US middle-class incomes by globalisation and technology. Nor are they coherent as a force. Libertarianism is an attitude, rather than an organisation. It is also potentially fickle. Young Americans disdain foreign entanglements. That could change overnight with a big terrorist attack on the homeland. They feel let down by Democrats and hostile to mainstream Republicans. Yet they could flock to an exciting new figure in either party. Theirs is a restless generation that disdains authority. Establishment figures should take note. Tomorrow belongs to them.
DOE says its energy-scoring software — called the Home Energy Scoring Tool — is like a vehicle’s mile-per-gallon rating because it allows homeowners to compare the energy performance of their homes to other homes nationwide. It also provides homeowners with suggestions for improving their homes’ efficiency.
The software is part of the government’s effort to reduce the nation’s energy consumption; but it’s also billed as a way to keep home-retrofitting going, at a time when stimulus funds for weather-proofing have run out.
The Home Energy Scoring Tool “can be a powerful motivator in getting homeowners to make energy efficiency improvements,” DOE says. “It’s also a great way to help trained workers enter the private sector energy improvement market as funding for weatherization efforts decline.”
DOE says its Home Energy Score is useful if you are a homeowner looking to renovate or remodel your home, lower your utility bills, improve the comfort of your home, or reduce your energy usage. Moreover, “the score serves as an official way to document these improvements and thereby enhance your home’s appeal when you’re ready to sell.”
Right now, getting your home scored is voluntary.
To produce a Home Energy Score, a trained, “qualified assessor” comes to your home — for a fee — and collects approximately 40 pieces of data about the home’s “envelope” (e.g., walls, windows, heating and cooling systems) during an hour-long walk-through.
Based on the home’s characteristics, the DOE software estimates the home’s annual energy use, assuming “typical homeowner behavior.” The software then converts the estimated energy use into a score, based on a 10-point scale (10 being the most energy-efficient). The 1-10 scale accounts for differences in weather conditions by using the zip code to assign the house to one of more than 1,000 weather stations.
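The conversion step can be pictured as a simple binning function: the estimated annual energy use is compared against a set of per-weather-station cutoffs that divide homes into ten bins. The sketch below is purely illustrative; the function name and threshold values are invented, not DOE's actual model.

```python
# Hypothetical sketch of a 1-10 energy scoring scale like the one DOE
# describes: estimated annual energy use is compared against score-bin
# thresholds calibrated for the home's assigned weather station.
# All names and threshold values here are illustrative, not DOE's model.

def assign_score(estimated_use_mmbtu, bin_thresholds):
    """Return a 1-10 score; lower energy use earns a higher score.

    bin_thresholds: nine ascending cutoffs separating the ten bins.
    """
    score = 10
    for cutoff in bin_thresholds:
        if estimated_use_mmbtu <= cutoff:
            return score
        score -= 1
    return 1  # uses more energy than the highest cutoff

# Illustrative thresholds (MMBtu/year) for one hypothetical weather station:
THRESHOLDS = [40, 55, 70, 85, 100, 120, 145, 175, 210]

print(assign_score(60, THRESHOLDS))  # mid-range home -> 8
print(assign_score(35, THRESHOLDS))  # very efficient home -> 10
```

A real implementation would derive the cutoffs from regional consumption data for the weather station matched to the home's zip code, which is how the scale can account for differences in weather conditions.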
In addition to showing the home’s current energy efficiency — or inefficiency — the score also shows where a home would rank if all of the energy-saving improvements identified during the home walk-through were made. That may prompt some homeowners to buy new windows or doors, for example, boosting the market for home retro-fitters.
DOE recommends getting a Home Energy Score “as soon as the program becomes available in your area.” The program launched in 2012, and at this time, only single-family homes and townhouses can be scored.
The scoring is available only through DOE’s participating partners, which include state and local governments, utilities, and non-profits. DOE does not determine how much an assessor charges to score a house. “It will depend on what the local market supports.” But DOE says its partners “have indicated plans to charge between $25 and $125 for the Home Energy Score.”
And yes, the size of the home matters because larger homes use more energy.
The Home Energy Score and the associated report are generated by DOE/Lawrence Berkeley National Laboratory software. The 2014 version of DOE’s Home Energy Scoring Tool will be introduced at a webinar on Tuesday.
The Home Energy Score is similar to a vehicle’s mile-per-gallon rating, says the U.S. Energy Department. (Graphic is from DOE website)
DOE says more than 8,500 homes have been scored by the Energy Department’s growing network of more than 25 partners and 175 qualified assessors.
The business and economic reporting of CNSNews.com is funded in part with a gift made in memory of Dr. Keith C. Wold.
Cathy Zoi on the new Home Energy Score pilot program
Acting Under Secretary Cathy Zoi talks about the new Home Energy Score pilot program that was announced today by Vice President Biden and U.S. Department of Energy Secretary Steven Chu. The Home Energy Score will offer homeowners straightforward, reliable information about their homes’ energy efficiency. A report provides consumers with a home energy score between 1 and 10, and shows them how their home compares to others in their region. The report also includes customized, cost-effective recommendations that will help to reduce their energy costs and improve the comfort of their homes.
200,000 homes weatherized under the Recovery Act
Home Energy Score Pilot Program Launched By DOE
Home Energy Score Qualified Assessor module 1 intro
Glenn Greenwald: The NSA Can “Literally Watch Every Keystroke You Make”
Der Spiegel has revealed new details about a secretive hacking unit inside the National Security Agency called the Office of Tailored Access Operations, or TAO. The unit was created in 1997 to hack into global communications traffic. Hackers inside the TAO have developed a way to break into computers running Microsoft Windows by gaining passive access to machines when users report program crashes to Microsoft. In addition, with help from the CIA and FBI, the NSA has the ability to intercept computers and other electronic accessories purchased online in order to secretly insert spyware and components that can provide backdoor access for the intelligence agencies. American Civil Liberties Union Deputy Legal Director Jameel Jaffer and journalist Glenn Greenwald join us to discuss the latest revelations, along with the future of Edward Snowden, who has recently offered to assist U.S. targets Germany and Brazil with their respective probes into NSA spying.
TAO Revealed: The NSA’s ‘top secret weapon’
‘NSA’s goal is elimination of privacy worldwide’ – Greenwald to EU (FULL SPEECH)
Glenn Greenwald and Ruth Marcus Get in Explosive Exchange over Snowden and ‘Horrible’ D.C. Media
How The NSA Hacks Your iPhone (Presenting DROPOUT JEEP)
Following up on the latest stunning revelations released yesterday by Germany’s Der Spiegel, which exposed the spy agency’s 50-page catalog of “backdoor penetration techniques,” a new bombshell emerged today during a speech given by Jacob Applebaum (@ioerror) at the 30th Chaos Communication Congress: the complete and detailed description of how the NSA remotely bugs your iPhone. The NSA accomplishes this using software known as DROPOUT JEEP, which it describes as follows: “DROPOUT JEEP is a software implant for the Apple iPhone that utilizes modular mission applications to provide specific SIGINT functionality. This functionality includes the ability to remotely push/pull files from the device. SMS retrieval, contact list retrieval, voicemail, geolocation, hot mic, camera capture, cell tower location, etc. Command, control and data exfiltration can occur over SMS messaging or a GPRS data connection. All communications with the implant will be covert and encrypted.”
The flowchart of how the NSA makes your iPhone its iPhone is presented below:

1. NSA ROC operator
2. Load specified module
3. Send data request
4. iPhone accepts request
5. Retrieves required SIGINT data
6. Encrypt and send exfil data
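The leaked flowchart amounts to a simple tasking loop: an operator directs the implant to load a collection module, the implant gathers the requested SIGINT, then encrypts and exfiltrates it. As a purely illustrative sketch of that loop (no real protocol details beyond the slide are public; every class, method, and the toy XOR "encryption" below are invented), the sequence could be modeled like this:

```python
# Purely illustrative model of the request/response loop shown in the
# leaked DROPOUT JEEP flowchart. Every name here is invented, and the
# XOR step is a toy stand-in for "encrypt and send exfil data".

class Implant:
    def __init__(self, key: int):
        self.key = key
        # "load specified module": a table of collection routines
        self.modules = {"geolocation": lambda: "lat=0.0,lon=0.0"}

    def handle(self, request: str) -> bytes:
        collector = self.modules[request]   # load the specified module
        sigint = collector()                # retrieve the requested data
        # encrypt (toy XOR) before exfiltrating
        return bytes(b ^ self.key for b in sigint.encode())

class Operator:
    def __init__(self, key: int):
        self.key = key

    def task(self, implant: Implant, request: str) -> str:
        exfil = implant.handle(request)     # send data request, get exfil
        # decrypt (XOR is its own inverse)
        return bytes(b ^ self.key for b in exfil).decode()

op = Operator(key=0x5A)
print(op.task(Implant(key=0x5A), "geolocation"))  # -> lat=0.0,lon=0.0
```

The point of the sketch is only the shape of the exchange described on the slide: tasking flows one way, encrypted data flows back, and everything rides over whatever covert channel (SMS or GPRS, per the catalog) is available.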
What is perhaps just as disturbing is the following rhetorical sequence from Applebaum:
“Do you think Apple helped them build that? I don’t know. I hope Apple will clarify that. Here’s the problem: I don’t really believe that Apple didn’t help them, I can’t really prove it but [the NSA] literally claim that anytime they target an iOS device that it will succeed for implantation. Either they have a huge collection of exploits that work against Apple products, meaning that they are hoarding information about critical systems that American companies produce and sabotaging them, or Apple sabotaged it themselves. Not sure which one it is. I’d like to believe that since Apple didn’t join the PRISM program until after Steve Jobs died, that maybe it’s just that they write shitty software. We know that’s true.”
Or, Apple’s software is hardly “shitty” even if it seems like that to the vast majority of experts (kinda like the Fed’s various programs), and in fact it achieves precisely what it is meant to achieve.
How ironic would it be if BlackBerry, left for dead by virtually everyone, began marketing its products as the only smartphones that do not allow the NSA access to one’s data (and did so accordingly). Since pretty much everything else it has tried has failed, we don’t see the downside to this Hail Mary attempt to strike back at Big Brother and maybe make some money by doing the right thing for once.
We urge readers to watch the full one hour speech by Jacob Applebaum to realize just how massive Big Brother truly is, but those who want to just listen to the section on Apple can do so beginning 44 minutes 30 seconds in the presentation below.
Segment 0: God Is Behind Going Duck Crazy — Duck Dynasty Phil Robertson Suspended From Show For Expressing Views On Gays — Will Not Inherit The Kingdom of God — I’m With Phil — Photos & Videos
I am Second® – The Robertsons
Duck Dynasty : Phil’s Way of Life
Duck Dynasty: Unknown Facts About The Robertsons
The Best of Uncle Si
Duck Dynasty : Si Struck
Duck Dynasty: Si’s New Toy
Duck Dynasty: Si’s Dating Tips
Duck Dynasty : Hey
Uncle Si Robertson “ICY STARE” HILARIOUS DUCK DYNASTY ( 720P HD )
Duck Commanders Phil and Willie Robertson Interview – CONAN on TBS
The Robertson’s of Duck Dynasty Talk About How Their Faith in Jesus Turned Around Their Lives!!
Duck Commander Phil Robertson Talks About Why This Country Needs More Jesus
Duck Commander Phil Robertson from Duck Dynasty spoke to the congregation of Saddleback church in July on why people need Jesus and why the founders would agree — and I gotta say it was awesome. I watched it last night and knew I had to post it for you guys. Duck Commander’s message is really simple, that people need to love God and love each other and he delivers it beautifully. He really is a fantastic preacher.
‘Duck Dynasty’ Star Makes Shocking ‘Gay is Sin’ Comment
Duck Dynasty dared to mention Jesus
‘Duck Dynasty’ star slammed over anti-gay rant
By Andrea Morabito
Phil Robertson, patriarch of the “Duck Dynasty” clan, is being slammed for controversial comments he made about homosexuality in an interview in the January issue of GQ.
“It seems like, to me, a vagina—as a man—would be more desirable than a man’s anus. That’s just me,” Robertson told the magazine. “I’m just thinking: There’s more there! She’s got more to offer. I mean, come on, dudes! You know what I’m saying? But hey, sin: It’s not logical, my man. It’s just not logical.”
When the reporter asked Robertson what he found sinful, he said “Start with homosexual behavior and just morph out from there. Bestiality, sleeping around with this woman and that woman and that woman and those men.”
The self-proclaimed Bible-thumper then went on to paraphrase Corinthians: “Don’t be deceived. Neither the adulterers, the idolaters, the male prostitutes, the homosexual offenders, the greedy, the drunkards, the slanderers, the swindlers—they won’t inherit the kingdom of God. Don’t deceive yourself. It’s not right.”
On Wednesday, GLAAD called Robertson’s statements “vile” and “littered with outdated stereotypes.”
“Phil and his family claim to be Christian, but Phil’s lies about an entire community fly in the face of what true Christians believe,” said GLAAD spokesperson Wilson Cruz. “He clearly knows nothing about gay people or the majority of Louisianans — and Americans — who support legal recognition for loving and committed gay and lesbian couples.
“Phil’s decision to push vile and extreme stereotypes is a stain on A&E and his sponsors who now need to reexamine their ties to someone with such public disdain for LGBT people and families.”
An A&E spokesman had no comment, but Robertson released his own statement responding to the controversy.
“I myself am a product of the 60s; I centered my life around sex, drugs and rock and roll until I hit rock bottom and accepted Jesus as my Savior,” he said. “My mission today is to go forth and tell people about why I follow Christ and also what the Bible teaches, and part of that teaching is that women and men are meant to be together.
“However, I would never treat anyone with disrespect just because they are different from me. We are all created by the Almighty and like Him, I love all of humanity. We would all be better off if we loved God and loved each other.”
“Duck Dynasty” has been a ratings phenomenon for A&E, drawing 11.8 million viewers to its fourth season premiere last August, the most-watched nonfiction series telecast in cable history.
Obamacare Delay – Critics – THE BIG QUESTION: Is Obama A President Or A King? – The Kelly File
ObamaCare: Three Years of Broken Promises
Henry Chao: 30-40% of HealthCare.gov Still Needs To Be Built
DHS Cannot Provide Answers Regarding the Security of Healthcare.gov
Obamacare Website Healthcare.gov Crashes During Secretary Kathleen Sebelius’ Visit
Pelosi taken apart by David Gregory on false Obamacare promises
Dennis Miller-special Nancy Pelosi
11-13-13 “ObamaCare Implementation: The Rollout of HealthCare.gov” Pt. I
11-13-13 “ObamaCare Implementation: The Rollout of HealthCare.gov” Pt. 2
11-13-13 “ObamaCare Implementation: The Rollout of HealthCare.gov” Pt. 3
Megyn Kelly Outraged Obama Lied about Americans being able to keep coverage, shows proof
Megyn Kelly Interviews Charles Krauthammer on Obamacare Outrage – Kelly File – 10/30/13
More Than a Website
Health Site Is Improving But Likely to Miss Saturday Deadline
Louise Radnofsky and Spencer E. Ante
Despite recent progress at HealthCare.gov, a raft of problems will remain beyond the Obama administration’s Saturday deadline to make the troubled federal insurance website work.
The news isn’t all bad: Users say the site looks better, pages load faster, and more people are getting through to sign up for health plans.
But technical problems still affect HealthCare.gov’s ability to verify users’ identities and transmit accurate enrollment data to insurers, officials say. The data center that supports the site faces continuing challenges, and tools for processing payments to insurers haven’t been built.
Technical staff in Washington have been racing up to the end-of-November deadline. In their last public pronouncement on the effort, three days before the deadline, officials said they had much to do to get the site into a condition where it functions smoothly for a majority of users.
The success of the White House’s signature domestic initiative is riding on the technicians’ ability to fix the site, as well as the rest of the federal technology supporting enrollment. Across the nation, that effort is being eyed hopefully by supporters of the law, since the site is the centerpiece of the effort to overhaul American health care and extend coverage to millions of people.
Those hopes were deflated by a series of blows for the administration right up until Nov. 30, and the site continued to experience outages, both planned and unplanned, in the week leading up to the deadline.
The Wall Street Journal reported on Wednesday that the administration was planning to change its Web-hosting provider from Terremark, a subsidiary of Verizon Communications Inc., to Hewlett-Packard Co. in the spring, a complex transition that could introduce new challenges and take months; the same day, the administration said it was shelving for a year any attempts to operate an online exchange for small businesses. On Wednesday, Verizon declined to comment on its clients.
Officials mixed optimism with caution. “November 30th does not represent a relaunch of HealthCare.gov,” said Julie Bataille, a spokeswoman for the government’s Centers for Medicare and Medicaid Services, which operates the site. “It is not a magical date. There will be times after November 30th when the site, like any website, does not perform optimally.”
For the fix-it drive that began in late October, the administration tapped former White House adviser Jeff Zients and QSSI, a unit of UnitedHealth Group, to act as the new lead contractor, establishing a 24-hour “war room” operations center to coordinate contractors who previously weren’t working well together. Since then, officials have focused on fixing the kinds of wrinkles that were most obvious to users.
They have reported success in speeding up the response time of the system, lowering it from an average of eight seconds at launch to less than one second for most users. They say they have eliminated a host of glitches in the software so that pages now load incorrectly less than 1% of the time. And they say they have added “visual cues” to help users navigate the system more easily.
Technicians have been racing to add new server, storage and database capacity to the website, hoping to get the site ready to withstand 50,000 simultaneous users by Sunday, as was originally intended, said people familiar with the work. “I think we are close,” said one.
Some people involved with enrollment say they have seen a notable uptick in recent weeks. Maine Community Health Options, a nonprofit plan based in Lewiston, Maine, now is getting “hundreds of enrollments” a day, rather than the dozens it saw trickling in earlier this month, said Chief Executive Kevin Lewis.
But problems with the performance of the site’s databases, storage and servers and their interaction with each other continue to slow the site or make it unavailable for short periods, according to government officials and contractors working on the project.
Karen Egozi, CEO of the Epilepsy Foundation of Florida, which has trained nearly 50 people to help others enroll, said the performance of the website has improved in recent weeks but suffers from unpredictable glitches. On Nov. 19, Secretary of Health and Human Services Kathleen Sebelius visited a medical center in Miami and watched a member of Ms. Egozi’s staff help a couple fill out an application. The website failed, in front of a local TV camera crew.
On the weekend of Nov. 23 and 24, Ms. Egozi said her navigators were able to sign up a few people. But on Nov. 25, she said the site was down for a little while. “Sometimes, similar to when the secretary was here, the site does not let us through to the next section,” she said. “It was not working today, but yesterday it worked well.”
One source of early problems: The government had bought web-hosting services from Verizon’s Terremark subsidiary that initially gave it a highly virtualized system of servers shared by other groups within the Medicare center, rather than a dedicated group of computer servers for HealthCare.gov. Plans are in place to replace the Verizon unit with H-P this spring.
HHS also didn’t initially contract for a backup website or monitoring tools like those used by sophisticated consumer sites, according to people familiar with the matter.
The website still has no separate backup copy, but it did replace the virtual database with dedicated hardware, and bought and installed monitoring software.
Meanwhile, the site has a backlog of users who encountered problems in its first weeks of operation. Some appear to be locked out from the early stages unless they can get their account deleted. Others are stuck at the next big stage, persuading the federal government of their identity and their income so their application for tax credits can be processed.
Yannette Castellano waits to talk to a navigator about health-care options available under the Affordable Care Act, at the North Shore Medical Center, on Nov. 19 in Miami. AP
Guy Dicharry of Los Lunas, N.M., said he had been in limbo at the identity-verification stage since Oct. 5, despite giving the site personal information several times so it can confirm his income. He hasn’t heard back about a paper application submitted Nov. 1.
“This has been botched and is not getting fixed. If it’s not fixed, I’ll be ringing in 2014 as a newly uninsured person. I suspect that is the opposite of what the ACA was supposed to achieve,” said Mr. Dicharry, who described himself as a supporter of the Affordable Care Act. Because of their age and income, Mr. Dicharry and his wife stand to gain valuable subsidies toward the cost of coverage, but only if he buys it through the website.
Ronald Gallagher of Paradise Valley, Ariz., said he had been helping his daughter shop for coverage. After 16 hours over four days starting Oct. 1, they were told her identity was verified and she could pick a plan. But when they logged in to the website, it said her application was “In Progress.”
After failing to get help from a call center, father and daughter filled out an application over the phone in early November, but they still haven’t received a letter telling what insurance plans she qualifies for. “So far, nothing the government has done has worked,” Mr. Gallagher said.
Even when people successfully enroll, insurers say they sometimes get incorrect data. Ms. Bataille, the government spokeswoman, said officials have seen “marked improvements” in the information transmitted to insurers but “we know there are still issues that remain.” An HHS official also said that there had been improvements in identity verification, but that the agency knew it wasn’t fully fixed.
Mr. Lewis of Maine Community Health Options also worried about a larger volume of applicants, especially since insurers have now been told to find ways to process applications that come in from people as late as Dec. 23 in time for their coverage to begin Jan. 1, rather than a previous Dec. 15 deadline.
If “there’s an avalanche on that last date, I don’t know if the system will be able to support all that,” he said.
Book TV: After Words: Harry Markopolos, “No One Would Listen”
Madoff Whistleblower Speaks with ABC News Radio’s Aaron Kate
Ackerman Scolds SEC for Not Stopping Bernie Madoff Scheme Despite Being Told About It 10yrs Ago
Markopolos: I gift wrapped and delivered the largest Ponzi scheme in history to the SEC
Rep. Maloney on Madoff Fraud
Congressman Spencer Bachus questions at Madoff Ponzi Fraud Hearing
Congressman Sherman questions Harry Markopoulos
Ron Paul – Madoff Fraud Hearing – Congress – Big Ponzi Scheme 01-05-09
DP/30: Chasing Madoff, subject Harry Markopolos (pt 1 of 2)
DP/30: Chasing Madoff, subject Harry Markopolos (pt 2 of 2)
Background Articles and Videos
Bernie Madoff on the modern stock market
Bernie Madoff Reveals to Barbara Walters He Is ‘Happier in Prison’
Bernie Madoff’s Jail Cell and His Future Life in Prison, Levine: “He’s Worse Than a Child Molester.”
The Madoff Hustle – Part 1
The Madoff Hustle – Part 2
The Madoff Hustle – Part 3
The Madoff Hustle – Part 4
Part 1: The Hunt for Madoff’s Money
Part 2: The Hunt for Madoff’s Money
Too Good to be True- The Rise and Fall of Bernie Madoff Part 1
Erin Arvedlund first wrote about Bernie Madoff for Barron’s in 2001 and published her book, Too Good to be True: The Rise and Fall of Bernie Madoff, in August of 2009. In this episode of The Massachusetts School of Law’s Books of our Time, Dean Velvel, himself a Madoff victim, and Arvedlund discuss the history of the brokerage industry, the possible culpability of the entire Madoff family, the difference between Madoff’s legitimate brokerage firm and his illegitimate hedge fund, and the steps that led up to the largest Ponzi scheme in American history. Arvedlund tells the story of Madoff’s infamous Ponzi scheme with the knowledge and detail of an insider, and sheds new light on the greatest financial enigma of American history.
The Massachusetts School of Law also presents information on important current affairs to the general public in television and radio broadcasts, an intellectual journal, conferences, author appearances, blogs and books. For more information visit mslaw.edu.
Too Good to be True- The Rise and Fall of Bernie Madoff Part 2
The Wall Street Journal broke the news Monday that fewer than 50,000 people have enrolled in the new health care exchanges, a figure that we confirmed at The Washington Post.
That seems like a pretty small number of enrollees. Yet we haven’t seen much public panic from health law supporters. “I think everybody anticipated the early months would have relatively low enrollment,” Ron Pollack, president of the nonprofit health-care advocacy network Families USA, told me Monday night. “Obviously, with the Web site malfunctioning, that made the likely conclusion inevitable.”
Some of this apparent calm could simply be the deliberate optimism of the health care law’s advocates. But, putting aside any such bias, we can still make a case that the health law’s debut is not a complete disaster.
First, we can compare the rollout to that of the Massachusetts health care law, which had 123 enrollees sign up during the first month of coverage. That ended up accounting for 0.3 percent of first-year enrollment. If we tally up 40,000 enrollees in the federal marketplace and another 49,000 in the state exchanges, as counted by consulting firm Avalere Health, that works out to about 1.2 percent of the 7 million people the Congressional Budget Office has projected will sign up on the exchanges in 2014.
Massachusetts eventually saw a really big spike in enrollment right before the individual mandate kicked in. You can see that in this chart from the New England Journal of Medicine (which Adrianna McIntyre discusses in an aptly titled post, “This chart should be getting more attention.”).
We can also look at Medicaid enrollment, which has outpaced some observers’ expectations. There have been at least 440,000 Medicaid enrollments so far, according to Avalere. That would put Medicaid about 5 percent toward a projected enrollment of 9 million in 2014.
Is it easier to enroll people into a program such as Medicaid that does not charge premiums? Definitely. Is that program a key part of the health care law, responsible for more than half of the health law’s coverage expansion? Yes. So, these high levels of Medicaid enrollment in the first few weeks do matter for the health law’s insurance expansion.
Most health policy experts I talk to aren’t as concerned about the number of people who sign up for the health care law as they are about who actually enrolled. Was it a wave of sick people with really high health care costs, or did that group of under 50,000 people include a good chunk of younger, healthier people who don’t visit the doctor all too often?
Even that ratio will be difficult to figure out before March, when open enrollment ends. As the chart above on Massachusetts shows, a lot of healthy people might wait until right before the individual mandate kicks in to sign up for a plan. If you’d like to pencil in some time for freaking out about the health law’s failure, it’s probably best to schedule it for early April, when we’ll have more definitive data on who is actually signing up for Obamacare.
The fight over how to define the new health law’s success is coming down to one question: Who counts as an Obamacare enrollee?
Health insurance plans count subscribers as enrolled in a health plan only once they’ve submitted a payment. That is when the carrier sends out a member card and begins paying doctor bills.
When the Obama administration releases health law enrollment figures later this week, though, it will use a more expansive definition. It will count people who have purchased a plan as well as those who have a plan sitting in their online shopping cart but have not yet paid.
“In the data that will be released this week, ‘enrollment’ will measure people who have filled out an application and selected a qualified health plan in the marketplace,” said an administration official, who requested anonymity to frankly describe the methodology.
The disparity in the numbers is likely to further inflame the political fight over the Affordable Care Act. Each side could choose a number to make the case that the health law is making progress or failing miserably.
On Monday, the Wall Street Journal, citing anonymous sources, said insurance companies have received about 50,000 private health plan enrollments through HealthCare.gov. Even combined with state tallies, the figure falls far short of the 500,000 sign-ups the administration initially predicted for both private sign-ups and those opting for the expansion of Medicaid.
In recent weeks, administration officials have warned that the enrollment figures for October would be low, given the tumultuous launch of the health Web site.
The administration plans to use this count of enrollees because that’s where their interaction with the healthcare.gov site ends, the administration official said. Insurance plans, rather than the federal government, are responsible for collecting the first month’s premium.
The shopping cart on healthcare.gov only contains space for one health plan, meaning the consumer must have gotten far enough to settle on a specific option.
Addressing the Wall Street Journal’s report, Health and Human Services spokeswoman Joanne Peters said: “We cannot confirm these numbers. More generally, we have always anticipated that initial enrollment numbers would be low and increase over time. . . . The problems with the Web site will cause the numbers to be lower than initially anticipated.”
States that have so far released enrollment data also tend to use this wider definition. The 14 states running their own insurance marketplaces have reported 49,000 enrollments in private health insurance plans, according to an analysis released Monday by consulting firm Avalere Health. They have also enrolled many thousands more into the Medicaid program, which the health-care law expanded.
“The idea that people are going to do layaway purchasing three months out goes against the American way,” Rhode Island exchange director Christine Ferguson said in late September, shortly before the health law’s rollout.
But the District of Columbia’s exchange, DC Health Link, estimates that 321 people in the District have dropped a specific health insurance plan into their shopping cart. Of those, 164 have requested an invoice for their first month’s premium from the insurance carrier.
“We recognize that most people do not have the luxury of paying for coverage in October, months before a bill is due,” exchange spokesman Richard Sorian said Friday. “I hope that all consumers here in the District remember that they have until Dec. 15 to finalize their selection by paying their first month’s premium in order to have coverage on Jan. 1, 2014.”
The number of poor people in America is 3 million higher than the official count, encompassing 1 in 6 residents, once out-of-pocket medical costs and work-related expenses are factored in, according to a revised census measure released Wednesday.
The new measure is aimed at providing a fuller picture of poverty, but does not replace the official government numbers. Put in place two years ago by the Obama administration, it generally is considered more reliable by social scientists because it factors in living expenses as well as the effects of government aid, such as food stamps and tax credits.
Administration officials have declined to say whether the new measure eventually could replace the official poverty formula, which is used to allocate federal dollars to states and localities and to determine eligibility for safety-net programs such as Medicaid.
Congress would have to agree to adopt the new measure, which generally would result in a higher poverty rate from year to year and thus higher government payouts for aid programs.
Based on the revised formula, the number of poor people in 2012 was 49.7 million, or 16 percent. That exceeds the record 46.5 million, or 15 percent, that was officially reported in September.
The latest numbers come as more working-age adults picked up low-wage jobs in the slowly improving economy but still struggled to pay living expenses. Americans 65 and older had the largest increases in poverty under the revised formula, from 9.1 percent to 14.8 percent, because of medical expenses such as Medicare premiums, deductibles and other costs not accounted for in the official rate.
There also were increases for Hispanics and Asian-Americans, partly due to lower participation among immigrants and non-English speakers in government aid programs such as housing aid and food stamps.
African-Americans and children, helped by government benefits, had declines in poverty compared with the official rate.
“This is a real incongruity, when 1 in 6 people face economic insecurity here in the richest country in the world,” said Joseph Stiglitz, a Columbia University economist and former chairman of the White House Council of Economic Advisers who has argued for more government action to alleviate income inequality.
“When so many citizens are worse off year after year, with food insecurity and health care insecurity, there’s no way you can say that’s a successful economy.”
Last week, more than 47 million Americans who receive food stamps saw their benefits go down, while Congress began negotiations on further cuts of up to $4 billion annually to the program.
Among states, California had the highest share of poor people, hurt in part by high housing costs and large numbers of immigrants, followed by the District of Columbia, Nevada and Florida. Under the official poverty rate, more rural states were more likely to be at the top of list, led by Mississippi, Louisiana and New Mexico.
Some other findings:
-Food stamps helped lift about 5 million people above the poverty line. Without such aid, the overall poverty rate would increase from 16 percent to 17.6 percent.
-Working-age adults ages 18-64 saw an increase in poverty from 13.7 percent based on the official calculation to 15.5 percent, due mostly to commuting and child care costs.
-Child poverty declined from 22.3 percent to 18 percent under the new measure. Under both measures, children still remained the age group most likely to be living in poverty.
-By race, Hispanics and Asians saw higher rates of poverty, 27.8 percent and 16.7 percent respectively, compared with rates of 25.8 percent and 11.8 percent under the official formula. In contrast, African-Americans saw a modest decrease, from 27.3 percent to 25.8 percent based on the revised numbers. Among non-Hispanic whites, poverty rose from 9.8 percent to 10.7 percent.
“The primary reason that poverty remains so high is that the benefits of a growing economy are no longer being shared by all workers as they were in the quarter-century following the end of World War II,” said Sheldon Danziger, a University of Michigan economist.
“Given current economic conditions, poverty will not be substantially reduced unless government does more to help the working poor.”
Economists long have criticized the official poverty rate as inadequate. Based on a half-century-old government formula, the official rate continues to assume the average family spends one-third of its income on food. Those costs have declined to a much smaller share, more like one-seventh.
In reaction to some of the criticism, the Obama administration in 2010 asked the Census Bureau to develop a new poverty measure, based partly on recommendations made by the National Academy of Sciences. The goal is to help lawmakers better gauge the effectiveness of anti-poverty programs.
For instance, the new measure finds that if it weren’t for Social Security payments, the poverty rate would rise to 54.7 percent for people 65 and older and 24.5 percent for all age groups.
Refundable tax credits such as the earned income tax credit helped lift 9 million people above the poverty line. Without the credits, child poverty would rise from 18 percent to 24.7 percent.
In recent years, New York City as well as California, Virginia and Wisconsin have sought to put in place a more accurate poverty measure. They were prompted in part by local officials such as New York Mayor Michael Bloomberg, who have argued that the official measure does not take into account urban costs of living and that larger cities may get less federal money as a result.
Associated Press writer Mary Clare Jalonick contributed to this report.
Years after the Great Recession ended, 46.5 million Americans are still living in poverty, according to a Census Bureau report released Tuesday.
Meanwhile, median household income fell slightly to $51,017 a year in 2012, down from $51,100 in 2011 — a change the Census Bureau does not consider statistically significant.
Poverty in the United States
Poverty is a state of privation, or a lack of the usual or socially acceptable amount of money or material possessions. The most common measure of poverty in the U.S. is the “poverty threshold” set by the U.S. government. This measure recognizes poverty as a lack of those goods and services commonly taken for granted by members of mainstream society. The official threshold is adjusted for inflation using the consumer price index. The government’s definition of poverty is based on total income received. For example, the poverty level for 2012 was set at $23,050 (total yearly income) for a family of four. Most Americans (58.5%) will spend at least one year below the poverty line at some point between ages 25 and 75. Poverty rates are persistently higher in rural and inner city parts of the country as compared to suburban areas.
In November 2012 the U.S. Census Bureau said more than 16% of the population lived in poverty in the United States, including almost 20% of American children, up from 14.3% (approximately 43.6 million people) in 2009 and the highest level since 1993. In 2008, 13.2% of Americans (39.8 million) lived in poverty. California has a poverty rate of 23.5%, the highest of any state in the country.
In 2011 extreme poverty in the United States, meaning households living on less than $2 per day before government benefits, was double 1996 levels at 1.5 million households, including 2.8 million children. This would be roughly 1.2% of the US population in 2011, presuming a mean household size of 2.55 people. In 2011, child poverty reached record high levels, with 16.7 million children living in food insecure households, about 35% more than 2007 levels. In 2009 the number of people who were in poverty was approaching 1960s levels that led to the national War on Poverty.
There were about 643,000 sheltered and unsheltered homeless people nationwide in January 2009. Almost two-thirds stayed in an emergency shelter or transitional housing program and the other third were living on the street, in an abandoned building, or another place not meant for human habitation. About 1.56 million people, or about 0.5% of the U.S. population, used an emergency shelter or a transitional housing program between October 1, 2008 and September 30, 2009. Around 44% of homeless people are employed.
Two official measures of poverty
Number in Poverty and Poverty Rate: 1959 to 2011. United States.
Poverty Rates by Age 1959 to 2011. United States.
There are two basic versions of the federal poverty measure: the poverty thresholds (which are the primary version) and the poverty guidelines. The Census Bureau issues the poverty thresholds, which are generally used for statistical purposes—for example, to estimate the number of people in poverty nationwide each year and classify them by type of residence, race, and other social, economic, and demographic characteristics. The Department of Health and Human Services issues the poverty guidelines for administrative purposes—for instance, to determine whether a person or family is eligible for assistance through various federal programs.
Since the 1960s, the United States government has defined poverty in absolute terms. When the Johnson administration declared “war on poverty” in 1964, it chose an absolute measure. The “absolute poverty line” is the threshold below which families or individuals are considered to be lacking the resources to meet the basic needs for healthy living: that is, having insufficient income to provide the food, shelter and clothing needed to preserve health.
The “Orshansky Poverty Thresholds” form the basis for the current measure of poverty in the U.S. Mollie Orshansky was an economist working for the Social Security Administration (SSA). Her work appeared at an opportune moment. Orshansky’s article was published later in the same year that Johnson declared war on poverty. Since her measure was absolute (i.e., did not depend on other events), it made it possible to objectively answer whether the U.S. government was “winning” this war. The newly formed United States Office of Economic Opportunity adopted the lower of the Orshansky poverty thresholds for statistical, planning, and budgetary purposes in May 1965.
The Bureau of the Budget (now the Office of Management and Budget) adopted Orshansky’s definition for statistical use in all Executive departments. The measure gave a range of income cutoffs, or thresholds, adjusted for factors such as family size, sex of the family head, number of children under 18 years old, and farm or non-farm residence. The economy food plan (the least costly of four nutritionally adequate food plans designed by the Department of Agriculture) was at the core of this definition of poverty.
The Department of Agriculture found that families of three or more persons spent about one third of their after-tax income on food. For these families, poverty thresholds were set at three times the cost of the economy food plan. Different procedures were used for calculating poverty thresholds for two-person households and persons living alone. Annual updates of the SSA poverty thresholds were based on price changes in the economy food plan.
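The multiplier logic described above can be sketched in a few lines. This is an illustrative reconstruction, not official SSA code, and the food-plan cost used in the example is hypothetical:

```python
def orshansky_threshold(annual_food_cost, multiplier=3):
    """SSA-style poverty threshold for families of three or more persons:
    three times the annual cost of the USDA economy food plan, since such
    families spent about one-third of after-tax income on food.
    (Function name and the example cost below are illustrative.)"""
    return multiplier * annual_food_cost

# Hypothetical economy food plan costing $1,100 per year for a family:
print(orshansky_threshold(1100))  # 3300
```

Annual updates then amounted to re-running this calculation with the food plan repriced, until the 1969 switch tied the thresholds to the Consumer Price Index instead.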
Two changes were made to the poverty definition in 1969. Thresholds for non-farm families were tied to annual changes in the Consumer Price Index rather than changes in the cost of the economy food plan. Farm thresholds were raised from 70 to 85% of the non-farm levels.
In 1981, further changes were made to the poverty definition. Separate thresholds for “farm” and “female-householder” families were eliminated. The largest family size category became “nine persons or more.”
Apart from these changes, the U.S. government’s approach to measuring poverty has remained static for the past forty years.
The poverty guideline figures are not the figures the Census Bureau uses to calculate the number of poor persons. The figures that the Census Bureau uses are the poverty thresholds. The Census Bureau provides an explanation of the difference between poverty thresholds and guidelines. The Census Bureau uses a set of money income thresholds that vary by family size and composition to determine who is in poverty. The 2010 figure for a family of 4 with no children under 18 years of age is $22,541, while the figure for a family of 4 with 2 children under 18 is $22,162. For comparison, the 2011 HHS poverty guideline for a family of 4 is $22,350.
Numbers in other countries
The official number of poor in the United States in 2008 was about 39.1 million people, greater in number but not in percentage than the officially poor in Indonesia, which has a far lower Human Development Index and the next-largest population after the United States. The poverty level in the United States, at 15% (46.2 million people in poverty out of a total of 308.5 million), is comparable to that in France, where 14% of the population live on less than 880 euros per month.
Numbers of poor are hard to compare across countries. Absolute income may be used, but it does not reflect the actual number of poor, which depends on relative income and the cost of living in each country. Among developed countries, each country has its own definition and threshold of what it means to be poor, but these are not adjusted for cost of living and social benefits. For instance, although France and the US have about the same dollar threshold for poverty, cost-of-living benefits differ: universal health care and highly subsidized post-secondary education exist in France. In general, it may be better to use the Human Poverty Index (HPI), Human Development Index (HDI) or another global measure to compare quality of living in different countries.
Relative measures of poverty
Another way of looking at poverty is in relative terms. “Relative poverty” can be defined as having significantly less access to income and wealth than other members of society. Therefore, the relative poverty rate is a measure of income inequality. When the standard of living among those in more financially advantageous positions rises while that of those considered poor stagnates, the relative poverty rate will reflect such growing income inequality and increase. Conversely, the relative poverty rate can decrease even as low-income people come to have less wealth and income, if wealthier people’s wealth is reduced by a larger percentage than theirs. In 1959, a family at the poverty line had an income that was 42.64% of the median income. If the poverty line in 1999 was less than 42.64% of the median income, then relative poverty would have increased.
In the European Union and for the OECD, “relative poverty” is defined as an income below 60% of the national median equivalised disposable income after social transfers for a comparable household. In Germany, for example, the official relative poverty line for a single adult person in 2003 was 938 euros per month (11,256 euros/year, $12,382 PPP; West Germany 974 euros/month, 11,688 euros/year, $12,857 PPP). For a family of four with two children below 14 years, the poverty line was 1,969.80 euros per month ($2,167 PPP) or 23,640 euros ($26,004 PPP) per year. According to Eurostat, the percentage of people in Germany living at risk of poverty (relative poverty) in 2004 was 16% (official national rate 13.5% in 2003). Additional definitions for poverty in Germany are “poverty” (50% of median) and “strict poverty” (40% of median, national rate 1.9% in 2003). Generally the rate for “relative poverty” is much higher than the rate for “strict poverty”. The U.S. concept is best comparable to “strict poverty”. By European standards, the official (relative) poverty rate in the United States would be significantly higher than it is by the U.S. measure. A research paper from the OECD calculates the relative poverty rate for the United States at 16% for 50% of median disposable income and nearly 24% for 60% of median disposable income (OECD average: 11% for 50% of median, 16% for 60% of median).
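The 60%-of-median convention can be illustrated with a minimal sketch. Real statistics use equivalised disposable income after social transfers from weighted survey data; the income list here is made up purely for illustration:

```python
import statistics

def relative_poverty_rate(incomes, fraction=0.6):
    """Share of people with income below `fraction` of the median,
    following the EU 'at risk of poverty' convention (60% of median).
    Simplified: unweighted incomes, no equivalisation."""
    line = fraction * statistics.median(incomes)
    return sum(1 for x in incomes if x < line) / len(incomes)

incomes = [8000, 12000, 20000, 25000, 30000, 35000, 40000, 60000]
# median = 27,500, so the 60% line is 16,500; 2 of 8 fall below it
print(relative_poverty_rate(incomes))  # 0.25
```

Changing `fraction` to 0.5 or 0.4 gives the stricter “poverty” and “strict poverty” variants mentioned above.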
Some critics argue that relying on income disparity to determine who is impoverished can be misleading. Bureau of Labor Statistics data suggest that consumer spending varies much less than income. In 2008, the “poorest” one-fifth of American households spent on average $12,955 per person for goods and services (other than taxes), the second quintile spent $14,168, the third $16,255, the fourth $19,695, while the “richest” fifth spent $26,644. The disparity of expenditures is much less than the disparity of income.
Income distribution and relative poverty
Although the relative approach theoretically differs largely from the Orshansky definition, crucial variables of both poverty definitions are more similar than often thought. First, the so-called standardization of income in both approaches is very similar. To make incomes comparable among households of different sizes, equivalence scales are used to standardize household income to the level of a single-person household. In Europe, the modified OECD equivalence scale is used, which assigns a value of 1 to the head of household, 0.5 to each additional household member older than 14 years and 0.3 to each child. When compared to the US Census poverty lines, which are based on a defined basket of goods, both standardization methods prove to be very similar for the most prevalent household types.
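The modified OECD scale described above is straightforward to compute. A minimal sketch (function names are illustrative, not an official API):

```python
def oecd_modified_scale(members_14_plus, children_under_14):
    """Modified OECD equivalence scale: 1.0 for the household head,
    0.5 for each additional member aged 14 or older, and 0.3 for
    each child under 14."""
    if members_14_plus < 1:
        raise ValueError("household needs at least one member aged 14+")
    return 1.0 + 0.5 * (members_14_plus - 1) + 0.3 * children_under_14

def equivalised_income(household_income, members_14_plus, children_under_14):
    """Standardize household income to a single-person-household level."""
    return household_income / oecd_modified_scale(members_14_plus, children_under_14)

# A couple with two children under 14: 1.0 + 0.5 + 0.3 + 0.3 = 2.1
print(oecd_modified_scale(2, 2))        # 2.1
print(equivalised_income(42000, 2, 2))  # 20000.0
```

The equivalised figure is what gets compared against a relative poverty line, whereas the US Census instead looks up a threshold for the specific family composition.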
Furthermore, the poverty threshold in Western European countries is not always higher than the Orshansky threshold for a single-person family. The actual Orshansky poverty line for single-person households in the US ($9,645 in 2004) is very comparable to the relative poverty line in many Western European countries (Belgium 2004: €9,315), while price levels are also similar. The reason relative poverty measurement yields high poverty levels in the US, as demonstrated by Förster, is distributional effects rather than real differences in well-being between EU countries and the US.
The median household income, from which the poverty line is derived, is much higher in the US than in Europe due to the wealth of the American middle classes. Although the paradigm of relative poverty is valuable, this comparison of poverty lines shows that the higher prevalence of relative poverty in the US is not an indicator of a more severe poverty problem but an indicator of larger inequality between the rich middle classes and low-income households. It is therefore not correct to state that the US income distribution is characterized by a large proportion of households in poverty; it is characterized by relatively large income inequality but also high levels of prosperity among the middle classes. The 2007 poverty threshold for a three-member family was $17,070.
Poverty and demographics
In addition to family status, race/ethnicity and age also correlate with high poverty rates in the United States. Although data regarding race and poverty are more extensively published and cross-tabulated, the family-status correlation is by far the strongest.
Poverty and family status
Homeless children in the United States: the number of homeless children reached record highs in 2011, 2012, and 2013, at about three times their number in 1983.
According to the US Census, in 2007 5.8% of all people in married families lived in poverty, as did 26.6% of all persons in single parent households and 19.1% of all persons living alone. More than 75% of all poor households are headed by women (2012).
By race/ethnicity and family status, based on data from 2007
Among married couple families: 5.8% lived in poverty. This number varied by race and ethnicity as follows:
5.4% of all white persons (which includes white Hispanics),
9.7% of all black persons (which includes black Hispanics), and
14.9% of all Hispanic persons (of any race) living in poverty.
Among single parent (male or female) families: 26.6% lived in poverty. This number varied by race and ethnicity as follows:
22.5% of all white persons (which includes white Hispanics),
44.0% of all black persons (which includes black Hispanics), and
33.4% of all Hispanic persons (of any race) living in poverty.
Among individuals living alone: 19.1% lived in poverty. This number varied by race and ethnicity as follows:
18% of white persons (which includes white Hispanics),
27.9% of black persons (which includes black Hispanics), and
27% of Hispanic persons (of any race) living in poverty.
Poverty and race/ethnicity
The US Census declared that in 2010 15.1% of the general population lived in poverty:
9.9% of all non-Hispanic white persons
12.1% of all Asian persons
26.6% of all Hispanic persons (of any race)
27.4% of all black persons.
About half of those living in poverty are non-Hispanic white (19.6 million in 2010), but poverty rates are much higher for blacks and Hispanics. Non-Hispanic white children comprised 57% of all poor rural children.
In FY 2009, black families comprised 33.3% of TANF families, non-Hispanic white families comprised 31.2%, and 28.8% were Hispanic.
Poverty and age
The US Census declared that in 2010 15.1% of the general population lived in poverty:
22% of all people under age 18
13.7% of all people 19–64, and
9% of all people ages 65 and older
The Organization for Economic Co-operation and Development (OECD) uses a different measure for poverty and declared in 2008 that child poverty in the US was 20% and poverty among the elderly 23%. The non-profit advocacy group Feeding America released a study (May 2009), based on 2005–2007 data from the U.S. Census Bureau and the Agriculture Department, which claims that 3.5 million children under the age of 5 are at risk of hunger in the United States. The study claims that in 11 states (Louisiana, which has the highest rate, followed by North Carolina, Ohio, Kentucky, Texas, New Mexico, Kansas, South Carolina, Tennessee, Idaho and Arkansas), more than 20 percent of children under 5 are allegedly at risk of going hungry, defined as receiving fewer than 1,800 calories per day. The study was paid for by ConAgra Foods, a large food company.
In 2013, child poverty reached record high levels in the U.S., with 16.7 million children living in food insecure households. 47 million Americans depend on food banks, more than 30% above 2007 levels. Households headed by single mothers are most likely to be affected. Worst affected are the District of Columbia, Oregon, Arizona, New Mexico and Florida, while North Dakota, New Hampshire, Virginia, Minnesota and Massachusetts are the least affected.
Poverty and education
Poverty affects individual access to quality education. Because the U.S. education system is funded largely by local communities, the quality of materials and teachers reflects the affluence of the community, and low-income communities cannot afford the quality of education that high-income communities can. Another important aspect of education in low-income communities is apathy among both students and teachers: the children of the poor are sometimes seen as mere copies of their parents, fated to live out the same poor or ignorant life. Such a perception can produce teachers who do not put forth the effort to teach and students who are opposed to learning; in both cases the underlying idea is that the poor student is incapable. Young women in poverty are also more likely to become pregnant at a young age, and with fewer resources to care for a child, they often drop out of school. For these and other reasons, the quality of education between the classes is not equal.
Eighty-nine percent of the American households were food secure throughout the entire year of 2002, meaning that they had access, at all times, to enough food for an active, healthy life for all of the household members. The remaining households were food insecure at least some time during that year. The prevalence of food insecurity rose from 10.7% in 2001 to 11.1% in 2002, and the prevalence of food insecurity with hunger rose from 3.3% to 3.5%.
In 2007, 88.9% of American households were food secure throughout the entire year.  The number of American households that were food secure throughout the entire year dropped to 85.4% in 2008. The prevalence of food insecurity has been essentially unchanged since 2008.
Factors in poverty
There are numerous factors related to poverty in the United States.
According to the American Enterprise Institute, research has shown that income and intelligence are related. Charles Murray compared the earnings of 733 full-sibling pairs with differing intelligence quotients (IQs). He referred to the sample as utopian, in that the sampled pairs were raised in families with virtually no illegitimacy, divorce or poverty. The average earnings of sampled individuals with an IQ under 75 were $11,000, compared to $16,000 for those with an IQ between 75 and 90, $23,000 for those with an IQ between 90 and 110, $27,000 for those with an IQ between 110 and 125, and $38,000 for those with an IQ above 125.
Income correlates strongly with educational attainment. In 2007, the median earnings of households headed by individuals with less than a 9th grade education were $20,805, while households headed by high school graduates earned $40,456, households headed by holders of a bachelor’s degree earned $77,605, and families headed by individuals with professional degrees earned $100,000.
In many cases poverty is caused by job loss. In 2007, the poverty rate was 21.5% for individuals who were unemployed, but only 2.5% for individuals who were employed full-time.
In 1991, 8.3% of children in two-parent families lived in poverty, compared with 19.6% of children living with a father in a single-parent family and 47.1% of those in a single-parent family headed by a mother.
Income levels vary with age. For example, the median 2009 income for households headed by individuals aged 15–24 was only $30,750, but it increased to $50,188 for households headed by individuals aged 25–34 and $61,083 for households headed by individuals aged 35–44. Although the reasons are unclear, work experience and additional education may be factors.
Income levels vary along racial/ethnic lines: 21% of all children in the United States live in poverty, but about 46% of black children and 40% of Latino children do. The poverty rate is 9.9% for black married couples, and only 30% of black children are born to married couples (see Marriage below). Citing the Pew Research Center, The Economist reports that in 2007, 11% of black women aged 30–44 without a high school diploma had a working spouse. The poverty rate for native-born and naturalized whites is identical (9.6%). On the other hand, the poverty rate for naturalized blacks is 11.8%, compared to 25.1% for native-born blacks, suggesting race alone does not explain income disparity. Not all minorities have low incomes; Asian families have higher incomes than all other ethnic groups. For example, the 2005 median income of Asian families was $68,957, compared to the median income of white families of $59,124. Asians, however, report discrimination more frequently than blacks: 31% of Asians reported employment discrimination in 2005, compared to 26% of blacks.
The relationship between tax rates and poverty is disputed. A study comparing high-tax Scandinavian countries with the U.S. suggests high tax rates are inversely correlated with poverty rates. The poverty rate, however, is low in some low-tax countries, such as Switzerland, and a comparison of poverty rates between states reveals that some low-tax states also have low poverty rates. For example, New Hampshire has the lowest poverty rate of any state in the U.S. and has very low taxes (46th among all states). In both of those instances, however, Switzerland and New Hampshire have very high household incomes and other measures that levy or offset the lack of taxation: Switzerland has universal health care and a free system of education for children as young as four years old, while New Hampshire has no state income tax or sales tax but does have the nation’s highest property taxes.
The Heritage Foundation speculates that illegal immigration increases job competition among low wage earners, both native and foreign born. Additionally many first generation immigrants, namely those without a high school diploma, are also living in poverty themselves.
Concerns regarding accuracy
In recent years, a number of concerns have been raised about the official U.S. poverty measure. In 1995, the National Research Council’s Committee on National Statistics convened a panel on measuring poverty. The panel found that “the official poverty measure in the United States is flawed and does not adequately inform policy-makers or the public about who is poor and who is not poor.”
The panel was chaired by Robert Michael, former Dean of the Harris School of the University of Chicago. According to Michael, the official U.S. poverty measure “has not kept pace with far-reaching changes in society and the economy.” The panel proposed a model based on disposable income:
According to the panel’s recommended measure, income would include, in addition to money received, the value of non-cash benefits such as food stamps, school lunches and public housing that can be used to satisfy basic needs. The new measure also would subtract from gross income certain expenses that cannot be used for these basic needs, such as income taxes, child-support payments, medical costs, health-insurance premiums and work-related expenses, including child care.
Many sociologists and government officials have argued that poverty in the United States is understated, meaning that there are more households living in actual poverty than there are households below the poverty threshold. A recent NPR report states that as many as 30% of Americans have trouble making ends meet, and other advocates have made supporting claims that the rate of actual poverty in the US is far higher than that calculated using the poverty threshold. A 2012 study estimated that roughly 38% of Americans live “paycheck to paycheck.”
According to William H. Chafe, if one used a relative standard for measuring poverty (a standard that took into account rising standards of living rather than an absolute dollar figure), then 18% of families were living in poverty in 1968, not 13% as officially estimated at that time.
As far back as 1969, the Bureau of Labor Statistics put forward suggested budgets for families to live adequately on. 60% of working-class Americans lived below one of these budgets, which suggested that a far higher proportion of Americans lived in poverty than the official poverty line suggested. These findings were also used by observers on the left when questioning the long-established view that most Americans had attained an affluent standard of living in the two decades following the end of the Second World War.
Using a definition of relative poverty (reflecting disposable income below half the median of adjusted national income), it was estimated that, between 1979 and 1982, 17.1% of Americans lived in poverty, compared with 12.6% of the population of Canada, 12.2% of the population of Australia, 9.7% of the population of Britain, 5.6% of the population of West Germany, 5.3% of the population of Sweden, and 5.2% of the population of Norway.
As noted above, the poverty thresholds used by the US government were originally developed during the Johnson administration’s War on Poverty initiative in 1963–1964. Mollie Orshansky, the government economist at the Social Security Administration who developed the thresholds, based them on the cost of purchasing what the US Department of Agriculture had determined in the mid-1950s to be the minimal nutritionally adequate amount of food necessary to feed a family. Orshansky multiplied the cost of the food basket by a factor of three, under the assumption that the average family spent one third of its income on food.
While the poverty threshold is updated for inflation every year, the basket of food used to determine what constitutes being deprived of a socially acceptable minimum standard of living has not been updated since 1955. As a result, the current poverty line only takes into account food purchases that were common more than 50 years ago, updating their cost using the Consumer Price Index. When methods similar to Orshansky’s were used to update the food basket using prices for the year 2000 instead of from nearly a half century earlier, it was found that the poverty line should actually be 200% higher than the official level being used by the government in that year.
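Orshansky’s multiplier and the subsequent inflation-only updating can be sketched in a few lines. The food-basket cost and index values below are placeholders for illustration, not official figures:

```python
def orshansky_threshold(annual_food_cost):
    """Orshansky's original rule: cost of the minimal food basket times
    three, assuming families spend one third of income on food."""
    return 3 * annual_food_cost

def cpi_updated_threshold(base_threshold, cpi_base, cpi_current):
    """The only update applied since: scale the old threshold by CPI growth.
    The basket itself is never revisited."""
    return base_threshold * (cpi_current / cpi_base)

# Hypothetical figures for illustration only:
base = orshansky_threshold(1_000)                   # 3,000 in the base year
today = cpi_updated_threshold(base, 100.0, 250.0)   # 7,500 after prices rise 150%
```

The sketch makes the critique above concrete: because only `cpi_updated_threshold` is ever applied, a change in what families actually buy (or in the food share of spending) never enters the calculation.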
Yet even that higher level could still be considered flawed, as it would be based almost entirely on food costs and on the assumption that families still spend a third of their income on food. In fact, Americans typically spent less than one tenth of their after-tax income on food in 2000. For many families, the costs of housing, health insurance and medical care, transportation, and access to basic telecommunications take a much larger bite out of the family’s income today than a half century ago; yet, as noted above, none of these costs are considered in determining the official poverty thresholds. According to John Schwarz, a political scientist at the University of Arizona:
The official poverty line today is essentially what it takes in today’s dollars, adjusted for inflation, to purchase the same poverty-line level of living that was appropriate to a half century ago, in 1955, for that year furnished the basic data for the formula for the very first poverty measure. Updated thereafter only for inflation, the poverty line lost all connection over time with current consumption patterns of the average family. Quite a few families then didn’t have their own private telephone, or a car, or even a mixer in their kitchen… The official poverty line has thus been allowed to fall substantially below a socially decent minimum, even though its intention was to measure such a minimum.
The issue of understating poverty is especially pressing in states with both a high cost of living and a high poverty rate, such as California, where the median home price in May 2006 was determined to be $564,430. With half of all homes priced above the half-million-dollar mark, and prices in urban areas such as San Francisco, San Jose or Los Angeles higher than the state average, it is almost impossible for not just the poor but also lower-middle-class workers to afford decent housing, with no possibility of home ownership. In the Monterey area, where the low-pay agriculture industry is the largest sector in the economy and the majority of the population lacks a college education, the median home price was determined to be $723,790, requiring an upper-middle-class income, which only roughly 20% of all households in the county boast.
Such fluctuations in local markets are, however, not considered in the Federal poverty threshold, and thus leave many who live in poverty-like conditions out of the total number of households classified as poor.
In 2011, the Census Bureau introduced a new supplemental poverty measure aimed at providing a more accurate picture of the true extent of poverty in the United States. According to this new measure, 16% of Americans lived in poverty in 2011, compared with 15.2% using the official figure. The new measure also estimated that nearly half of all Americans lived in or near poverty that year, defined as having income below 200% of the federal poverty line.
Duke University Professor of Public Policy and Economics Sandy Darity, Jr. says, “There is no exact way of measuring poverty. The measures are contingent on how we conceive of and define poverty. Efforts to develop more refined measures have been dominated by researchers who intentionally want to provide estimates that reduce the magnitude of poverty.”
Some critics assert that the official U.S. poverty definition is inconsistent with how it is defined by its own citizens and the rest of the world, because the U.S. government considers many citizens statistically impoverished despite their ability to sufficiently meet their basic needs. According to a 2011 paper by poverty expert Robert Rector, of the 43.6 million Americans deemed to be below the poverty level by the U.S. Census Bureau in 2009, the majority had adequate shelter, food, clothing and medical care. In addition, the paper stated that those assessed to be below the poverty line in 2011 had a much higher quality of living than those identified by the census 40 years earlier as being in poverty. Moreover, the Swedish libertarian think tank Timbro points out that lower-income households in the U.S. tend to own more appliances and larger houses than many middle-income Western Europeans.
The federal poverty measure also excludes non-cash income, especially welfare benefits. Thus, even if food stamps and public housing successfully raise the standard of living for poverty-stricken individuals, the poverty figures do not change, since they do not count the income equivalents of such entitlements.
A 1993 study of low income single mothers titled Making Ends Meet, by Kathryn Edin, a sociologist at the University of Pennsylvania, showed that the mothers spent more than their reported incomes because they could not “make ends meet” without such expenditures. According to Edin, they made up the difference through contributions from family members, absent boyfriends, off-the-book jobs, and church charity.
According to Edin: “No one avoided the unnecessary expenditures, such as the occasional trip to the Dairy Queen, or a pair of stylish new sneakers for the son who might otherwise sell drugs to get them, or the Cable TV subscription for the kids home alone and you are afraid they will be out on the street if they are not watching TV.” However many mothers skipped meals or did odd jobs to cover those expenses. According to Edin, for “most welfare-reliant mothers food and shelter alone cost almost as much as these mothers received from the government. For more than one-third, food and housing costs exceeded their cash benefits, leaving no extra money for uncovered medical care, clothing, and other household expenses.” 
In the age of inequality, such anti-poverty policies are more important than ever, as higher inequality creates both more poverty and steeper barriers to getting ahead, whether through the lack of early education, nutrition, adequate housing, or a host of other poverty-related conditions that dampen one’s chances in life.
There have been many governmental and nongovernmental efforts to reduce poverty and its effects. These range in scope from neighborhood efforts to campaigns with a national focus. They target specific groups affected by poverty such as children, people who are autistic, immigrants, or people who are homeless. Efforts to alleviate poverty use a disparate set of methods, such as advocacy, education, social work, legislation, direct service or charity, and community organizing.
Recent debates have centered on the need for policies that address both “income poverty” and “asset poverty.” Advocates for the approach argue that traditional governmental poverty policies focus solely on supplementing the income of the poor, through programs such as Aid to Families with Dependent Children (AFDC) and Food Stamps. According to the CFED 2012 Assets & Opportunity Scorecard, 27 percent of households – nearly double the percentage that are income poor – are living in “asset poverty.” These families do not have the savings or other assets to cover basic expenses (equivalent to what could be purchased with a poverty-level income) for three months if a layoff or other emergency leads to loss of income. Since 2009, the number of asset-poor families has increased by 21 percent, from about one in five families to one in four. To assist such asset-poor families, Congress appropriated $24 million to administer the Assets for Independence Program under the supervision of the US Department of Health and Human Services. The program enables community-based nonprofits and government agencies to implement Individual Development Account (IDA) programs, an asset-based development initiative. Every dollar accumulated in IDA savings is matched by federal and non-federal funds to enable households to add to their asset portfolios by buying a first home, acquiring post-secondary education, or starting or expanding a small business.
Additionally, the Earned Income Tax Credit (EITC or EIC) is a credit for people who earn low-to-moderate incomes. The credit is refundable: if a filer’s total tax liability is less than the total credit earned, the government pays out the difference, so the EITC is not just a reduction in tax paid but can also bring new income to the household. The EITC is viewed as the largest poverty-reduction program in the United States. There is an ongoing debate in the US about the most effective way to fight poverty: through the tax code with the EITC, or through minimum wage laws.
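The phase-in/plateau/phase-out shape of a refundable credit like the EITC, and what refundability means for a household, can be sketched as follows. Every parameter below is hypothetical, since the real schedules vary by tax year, filing status, and number of children:

```python
def stylized_eitc(earned_income, phase_in_rate, plateau_start,
                  plateau_end, phase_out_rate):
    """Stylized EITC schedule: the credit phases in with earnings,
    plateaus at its maximum, then phases out. Parameters hypothetical."""
    max_credit = phase_in_rate * plateau_start
    if earned_income <= plateau_start:
        return phase_in_rate * earned_income
    if earned_income <= plateau_end:
        return max_credit
    return max(0.0, max_credit - phase_out_rate * (earned_income - plateau_end))

def net_tax(tax_before_credit, credit):
    """Refundable credit: a negative result is money paid to the household."""
    return tax_before_credit - credit

credit = stylized_eitc(12_000, 0.40, 10_000, 18_000, 0.21)  # 4,000 at the plateau
owed = net_tax(1_000, credit)  # -3,000: the household receives money back
```

The `net_tax` step is the policy point in the paragraph above: because the result may go negative, the credit can add income rather than merely reduce a tax bill.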
President Obama’s aides went to extraordinary lengths to uncover the identity of a senior official who was using Twitter to make snarky comments about White House staffers. Suspicion gradually centered on Jofi Joseph, the point man on nuclear nonproliferation at the National Security Council. So at a meeting in which everyone was in on the scam, an inaccurate but innocuous news tidbit was revealed. When Joseph used his anonymous Twitter handle #natlsecwonk to broadcast the tidbit, he was caught and promptly fired. He was not fired for revealing any secrets, but for making disparaging comments about thin-skinned administration players ranging from Secretary of State Hillary Clinton to Secretary of Defense Chuck Hagel.
What apparently intensified the campaign to identify the “snarker” was a comment about Valerie Jarrett, the senior Obama adviser who has her own Secret Service detail and appears to exercise an inordinate amount of power behind the scenes. Joseph tweeted “I’m a fan of Obama, but his continuing reliance and dependence upon a vacuous cipher like Valerie Jarrett concerns me.”
Jarrett, an old Chicago friend of both Barack and Michelle Obama, appears to exercise such extraordinary influence she is sometimes quietly referred to as “Rasputin” on Capitol Hill, a reference to the mystical monk who held sway over Russia’s Czar Nicholas as he increasingly lost touch with reality during World War I.
Darrell Delamaide, a columnist for Dow Jones’s MarketWatch, says that “what has baffled many observers is how Jarrett, a former cog in the Chicago political machine and a real-estate executive, can exert such influence on policy despite her lack of qualifications in national security, foreign policy, economics, legislation or any of the other myriad specialties the president needs in an adviser.”
Delamaide believes the term “vacuous cipher” that was applied to Jarrett stung so much because it could be used as a metaphor for the administration in general. He writes that what “has remained consistent about the Obama administration is that vacuity — the slow response in a crisis, the hesitant and contradictory communication, a lack of conviction and engagement amid constant political calculation.” The stunning revelation that President Obama wasn’t kept properly apprised of problems with Obamacare’s website is just the latest example of how dysfunctional Obama World can be.
Whether Jarrett’s influence is all too real or exaggerated is unknowable. What is known is the extent to which she has long been a peerless enabler of Barack Obama’s inflated opinion of himself. Consider this quote from New Yorker editor David Remnick’s interview with her for his 2010 book The Bridge.
“I think Barack knew that he had God-given talents that were extraordinary. He knows exactly how smart he is. . . . He knows how perceptive he is. He knows what a good reader of people he is. And he knows that he has the ability — the extraordinary, uncanny ability — to take a thousand different perspectives, digest them and make sense out of them, and I think that he has never really been challenged intellectually. . . . So what I sensed in him was not just a restless spirit but somebody with such extraordinary talents that had to be really taxed in order for him to be happy. . . . He’s been bored to death his whole life. He’s just too talented to do what ordinary people do.”
Up against a court flatterer of that caliber it’s no surprise that Jarrett has outlasted almost everyone who was in Obama’s original White House team — from chief of staff Rahm Emanuel to political guru David Axelrod to Press Secretary Robert Gibbs. All are known to have crossed her, and all are gone. As one former Obama aide once told me: “Valerie is ‘She Who Must Not be Challenged.’”
When the revealing histories of the Obama White House are written it will be fascinating to learn just how extensive her role in the key decisions of the Obama years was.
Background Articles and Videos
People in Ohio, Michigan and 15 other states found themselves unable to use their food stamp debit-style cards on Saturday, after a routine test of backup systems by vendor Xerox Corp. resulted in a system failure.
At about 9 a.m. Saturday, reports from across the country began pouring in that customers’ EBT cards were not working in stores.
At 2 p.m., an EBT customer service representative told CBS Boston that the system was currently down for a computer system upgrade.
Xerox spokeswoman Jennifer Wasmer released further details later in the afternoon in an emailed statement.
“While the electronic benefits system is now up and running, beneficiaries in the 17 affected states continue to experience connectivity issues to access their benefits. Technical staff is addressing the issue and expect the system to be restored soon,” Wasmer said. “Beneficiaries requiring access to their benefits can work with their local retailers who can activate an emergency voucher system where available. We appreciate our clients’ patience while we work through this outage as quickly as possible.”
Wasmer said the affected states also included Alabama, California, Georgia, Iowa, Illinois, Louisiana, Massachusetts, Maryland, Mississippi, New Jersey, Oklahoma, Pennsylvania, Texas and Virginia.
U.S. Department of Agriculture spokeswoman Courtney Rowe said the outage is not related to the government shutdown.
Shoppers left carts of groceries behind at a packed Market Basket grocery store in Biddeford, Maine, because they couldn’t get their benefits, said fellow shopper Barbara Colman, of Saco, Maine. The manager put up a sign saying the EBT system was not in use. Colman, who receives the benefits, called an 800 telephone line for the program and it said the EBT system was down due to maintenance, she said.
“That’s a problem. There are a lot of families who are not going to be able to feed children because the system is being maintenanced,” Colman said. She planned to reach out to local officials. “You don’t want children going hungry tonight because of stupidity,” she said.
Colman said the store manager promised her that he would honor the day’s store flyer discounts next week.
Ohio’s cash and food assistance card payment systems went down at 11 a.m., said Benjamin Johnson, a spokesman for the Ohio Department of Job and Family Services. Ohio’s cash system has been fixed, but he said that its electronic benefits transfer card system is still down. Johnson said Xerox is notifying retailers to revert to the manual system, meaning customers can spend up to $50 until the system is back online. Recipients of the state’s supplemental nutrition assistance program, or SNAP, should call the 800 number on the back of their card, and Xerox will guide them through the purchase process.
Illinois residents began reporting problems with their cards — known as LINK in that state — on Saturday morning, said Januari Smith, spokeswoman for the Illinois Department of Human Services.
Smith said that typically when the cards aren’t working retailers can call a backup phone number to find out how much money customers have available in their account. But that information also was unavailable because of the outage, so customers weren’t able to use their cards.
“It really is a bad situation but they are working to get it fixed as soon as possible,” Smith said. “We hope it will be back up later today.”
In Clarksdale, Miss. — one of the poorest parts of one of the poorest states in the nation — cashier Eliza Shook said dozens of customers at Corner Grocery had to put back groceries when the cards failed Saturday because they couldn’t afford to pay for the food. After several hours, she put a sign on the front door to tell people about the problem.
“It’s been terrible,” Shook said in a phone interview. “It’s just been some angry folks. That’s what a lot of folks depend on.”
Mississippi Department of Human Services director Rickey Berry confirmed that Xerox, the state’s EBT vendor, had computer problems. He said he had been told by midafternoon that the problems were being fixed.
“I know there are a lot of mad people,” Berry said.
Sheree Powell, a spokeswoman for the Oklahoma Department of Human Services, started receiving calls around 11:30 a.m. about problems with the state’s card systems. More than 600,000 Oklahomans receive SNAP benefits, and money is disbursed to the cards on the first, fifth and 10th days of every month, so the disruption came at what is typically a high-use time for the cards.
Oklahoma also runs a separate debit card system for other state benefits like unemployment payments. Those cards can be used at ATMs to withdraw cash. Powell said Xerox administers both the EBT and debit card systems, and they both were down initially.
Like Ohio’s Johnson, Powell said that Oklahoma’s cash debit card system had since been restored, but the EBT cards for the SNAP program were still down. Powell said Oklahoma’s Xerox representative told them that the problems stemmed from a power failure at a data center, and that power had been restored quickly.
“It just takes a while to reboot these systems,” she said, adding that she did not know where the data center was located.
The federal EBT website was unavailable due to the government shutdown.
BUREAU OF THE FISCAL SERVICE
STAR - TREASURY FINANCIAL DATABASE
TABLE 1. SUMMARY OF RECEIPTS, OUTLAYS AND THE DEFICIT/SURPLUS BY MONTH OF THE U.S. GOVERNMENT (IN MILLIONS)
ACCOUNTING DATE: 08/13
PERIOD RECEIPTS OUTLAYS DEFICIT/SURPLUS (-)
____________________ _____________________ _____________________ _____________________
OCTOBER 163,072 261,539 98,466
NOVEMBER 152,402 289,704 137,302
DECEMBER 239,963 325,930 85,967
JANUARY 234,319 261,726 27,407
FEBRUARY 103,413 335,090 231,677
MARCH 171,215 369,372 198,157
APRIL 318,807 259,690 -59,117
MAY 180,713 305,348 124,636
JUNE 260,177 319,919 59,741
JULY 184,585 254,190 69,604
AUGUST 178,860 369,393 190,533
SEPTEMBER 261,566 186,386 -75,180
YEAR-TO-DATE 2,449,093 3,538,286 1,089,193

OCTOBER 184,316 304,311 119,995
NOVEMBER 161,730 333,841 172,112
DECEMBER 269,508 270,699 1,191
JANUARY 272,225 269,342 -2,883
FEBRUARY 122,815 326,354 203,539
MARCH 186,018 292,548 106,530
APRIL 406,723 293,834 -112,889
MAY 197,182 335,914 138,732
JUNE 286,627 170,126 -116,501
JULY 200,030 297,627 97,597
AUGUST 185,370 333,293 147,923
YEAR-TO-DATE 2,472,542 3,227,888 755,345
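The columns in Table 1 are self-consistent: each month’s deficit is simply outlays minus receipts (a negative value denotes a surplus, per the “(-)” in the header), and the year-to-date line is the column sums. A minimal Python sketch checking the October–August block above, assuming only that small discrepancies of a few million come from rounding to whole millions:

```python
# Figures transcribed from Table 1 above, in millions of dollars.
# Tuples are (month, receipts, outlays, reported deficit/surplus(-)).
months = [
    ("OCTOBER",  184316, 304311,  119995),
    ("NOVEMBER", 161730, 333841,  172112),
    ("DECEMBER", 269508, 270699,    1191),
    ("JANUARY",  272225, 269342,   -2883),
    ("FEBRUARY", 122815, 326354,  203539),
    ("MARCH",    186018, 292548,  106530),
    ("APRIL",    406723, 293834, -112889),
    ("MAY",      197182, 335914,  138732),
    ("JUNE",     286627, 170126, -116501),
    ("JULY",     200030, 297627,   97597),
    ("AUGUST",   185370, 333293,  147923),
]

for name, receipts, outlays, reported in months:
    # Monthly deficit = outlays - receipts, to within rounding.
    assert abs((outlays - receipts) - reported) <= 2, name

# Year-to-date line: 2,472,542 receipts / 3,227,888 outlays / 755,345 deficit.
ytd_receipts = sum(r for _, r, _, _ in months)
ytd_outlays = sum(o for _, _, o, _ in months)
assert abs(ytd_receipts - 2472542) <= 5
assert abs(ytd_outlays - 3227888) <= 5
assert abs((ytd_outlays - ytd_receipts) - 755345) <= 5
```

Run as written, the assertions all pass, so the published year-to-date deficit of $755.3 billion follows directly from the monthly figures.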
No Fed Taper: What Does It Mean for Your Money? (9/18/13)
Federal Reserve: No Taper (9/18/13)
Ron Paul: Fed Decision To Not Taper Is A Really Bad Sign
Ron Paul: Taper Fakeout Means Fed Is Worried
Breaking News: Federal Reserve Will Not Taper
Rick Santelli Reacts to Federal Reserve No Taper
Why The Fed. Will INCREASE, NOT DECREASE, It’s QE/Money Printing. By Gregory Mannarino
In Business – Fed Taper Pause Fuels Commodities Rally
To Taper, or Not to Taper
FED Says No Taper — We Need A War, Gun Confiscation And Control Of Internet First — Episode 166
JIM RICKARDS: Fed Will TAPER in September or Never, and the Looming MONETARY System COLLAPSE 
James Rickards on “Why The Fed Will NOT Taper Quantitative Easing”
Peter Schiff: “The party is coming to an end”.
JIM ROGERS – When the FED stops PRINTING FIAT CURRENCY the COLLAPSE will be here. PREPARE NOW
Fed decision Just idea of tapering caused huge ruckus
Background Articles and Videos
Milton Friedman – Abolish The Fed
Milton Friedman On John Maynard Keynes
Free to Choose Part 3: Anatomy of a Crisis (Featuring Milton Friedman)
Murray Rothbard – To Expand And Inflate
The Founding of the Federal Reserve | Murray N. Rothbard
The Origin of the Fed – Murray N. Rothbard
Murray Rothbard on Hyperinflation and Ending the Fed
Murray N. Rothbard on Milton Friedman (audio – removed noise) part 1/5
Keynes the Man: Hero or Villain? | Murray N. Rothbard
WASHINGTON (AP) — The Federal Reserve has decided against reducing its stimulus for the U.S. economy, saying it will continue to buy $85 billion a month in bonds because it thinks the economy still needs the support.
The Fed said in a statement Wednesday that it held off on tapering because it wants to see more conclusive evidence that the recovery will be sustained.
Stocks spiked after the Fed released the statement at the end of its two-day policy meeting.
In the statement, the Fed says that the economy is growing moderately and that some indicators of labor market conditions have shown improvement. But it noted that rising mortgage rates and government spending cuts are restraining growth.
The bond purchases are intended to keep long-term loan rates low to spur borrowing and spending.
The Fed also repeated that it plans to keep its key short-term interest rate near zero at least until unemployment falls to 6.5 percent, down from 7.3 percent last month. In the Fed’s most recent forecast, unemployment could reach that level as soon as late 2014.
Many observers had expected the Fed to scale back its purchases. Interest rates have jumped since May, when Fed Chairman Ben Bernanke first said the Fed might slow its bond buying later this year, though Bernanke cautioned that any reduction would hinge on the economy showing continued improvement.
In its statement, the Fed says that the rise in interest rates “could slow the pace of improvement in the economy and labor market” if they are sustained.
The Fed also lowered its economic growth forecasts for this year and next year slightly, likely reflecting its concerns about interest rates.
The statement was approved on a 9-1 vote. Esther George, president of the Federal Reserve Bank of Kansas City, dissented for the sixth time this year. She repeated her concerns that the bond purchases could fuel the risk of inflation and financial instability.
The decision to maintain its stimulus follows reports of sluggish economic growth. Employers slowed hiring this summer, and consumers spent more cautiously.
Super-low rates are credited with helping fuel a housing comeback, support economic growth, drive stocks to record highs and restore the wealth of many Americans. But the average rate on the 30-year mortgage has jumped more than a full percentage point since May and was 4.57 percent last week — just below the two-year high.
The unemployment rate is now 7.3 percent, the lowest since 2008. Yet the rate has dropped in large part because many people have stopped looking for work and are no longer counted as unemployed — not because hiring has accelerated. Inflation is running below the Fed’s 2 percent target.
The Fed meeting took place at a time of uncertainty about who will succeed Bernanke when his term ends in January. On Sunday, Lawrence Summers, who was considered the leading candidate, withdrew from consideration.
Summers’ withdrawal followed growing resistance from critics. His exit has opened the door for his chief rival, Janet Yellen, the Fed’s vice chair. If chosen by President Barack Obama and confirmed by the Senate, Yellen would become the first woman to lead the Fed.
04/12/2007 – Legendary TV presenter, interviewer, producer and author, Sir David Frost talks about his remarkable career in television.
Sir David Frost has been described as a “one man conglomerate”. He hosted and co-created That Was the Week That Was, has produced countless television programmes, has written 15 books and produced 8 films; he is a lecturer, a publisher and an impresario.
But he is perhaps best known as one of the best television interviewers in the world. His Nixon interviews achieved, according to The New York Times, “the largest audience for a news interview in history”. Peter Morgan’s play, Frost/Nixon, achieved great success in London and on Broadway this year.
He is the only person to have interviewed the last seven Presidents of the United States (Richard Nixon, Gerald Ford, Jimmy Carter, Ronald Reagan, George Bush Senior, Bill Clinton, George W. Bush) and the last seven Prime Ministers of the United Kingdom (Harold Wilson, James Callaghan, Edward Heath, Margaret Thatcher, John Major, Tony Blair and Gordon Brown).
Sir David now presents Frost Over The World weekly for Al Jazeera English, with a variety of newsmakers ranging from Hamid Karzai, President Lula of Brazil, Tony Blair, Mikhail Gorbachev and Benazir Bhutto (after the assassination attempt) to Gerry Adams, Madeleine Albright, Gen. Wesley Clark, Archbishop Desmond Tutu, Dame Helen Mirren and the first interview with Lewis Hamilton, and continues to make Frost Tonight weekly for ITV. He is taking Through The Keyhole into its 21st year on the BBC, has recorded The Frost Years for Radio 4, and is executive producing a remake of the film The Dam Busters with Universal and Peter Jackson.
50 Years of Frost – USA, February 2009
Look back at David Frost’s life
Remembering a TV Legend: Interviewer David Frost Dead at 74
China excerpt from: One on One with David Frost – George Bush: A President’s Story
Frost Over The World – George Clooney -18 Jan 08 – Hot Latest News
Frost over the World – George Clooney – 25 Jan 08 – Pt 3 – Hot Latest News
Frost Over The World – Henry Kissinger -18 Jan 08 – Hot Latest News
Frost over the World – Ron Howard – 17 Oct 08 – Hot Latest News
Frost over the World – Recep Tayyip Erdogan – 3 Apr 09 – Hot Latest News
Edward Lucas on ‘Frost over the World’ 2010
Sir David Frost Interviews Julian Assange- Wikileaks- AlJazeera Part 1of2
Sir David Frost Interviews Julian Assange- Wikileaks- AlJazeera Part 2of2
Sir David Frost Interview With Controversial Trader Alessio Rastani (Oct 2011)
The Frost Interview : Aishwarya Rai Bachchan (HD, 2012)
Paul McCartney - Entrevista a David Frost 2012 (Legendado) – Parte 1 de 3
Paul McCartney – Entrevista a David Frost 2012 (Legendado) – Parte 2 de 3
Paul McCartney – Entrevista a David Frost 2012 (Legendado) – Parte 3 de 3
Ron Paul Snr Advisor Doug Wead Interview with Frost – Mar 31 2012
David Frost – Commentator Piece from Last TW3 – ’63 – live
Frost On Satire 1-4
Frost On Satire 2-4
TW3 – That Was The Week That Was – shows up today’s UK TV dross
David Frost and Willie Rushton SHRED the then-Home Sec., on the Last TW3 – ’63 – live
That Was The Week That Was, also known as TW3, was a satirical television comedy programme that aired on BBC Television in 1962 and 1963.
Devised, produced and directed by Ned Sherrin, the programme was fronted by David Frost and cast members included improvising cartoonist Timothy Birdsall, political commentator Bernard Levin, and actors Lance Percival, who sidelined in topical calypsos, many improvised in response to suggestions from the audience, Kenneth Cope, Roy Kinnear, Willie Rushton (then known as ‘William’), Al Mancini, Robert Lang, David Kernan and Millicent Martin. The last two were also singers and the programme opened with a song – eponymously entitled That Was The Week That Was – sung by Martin to Ron Grainer’s theme tune and enumerating topics that had been in the past week’s news. Off-screen script-writers included John Albery, John Betjeman, John Bird, Graham Chapman, John Cleese, Peter Cook, Roald Dahl, Richard Ingrams, Gerald Kaufman, Frank Muir, Denis Norden, Bill Oddie, Dennis Potter, Eric Sykes, Kenneth Tynan, Keith Waterhouse and others.
The programme was groundbreaking in its lampooning of the establishment. Prime Minister Harold Macmillan was initially supportive of the programme, chastising the then Postmaster General Reginald Bevins (nominally in charge of broadcasting) for threatening to “do something about it”. During the Profumo affair, however, he became one of the programme’s chief targets for derision. After two successful seasons in 1962 and 1963, the programme did not return in 1964, as this was a General Election year and the BBC decided it would be unduly influential.
At the end of each episode, Frost would usually sign off with: “That was the week, that was.” At the end of the final programme he announced: “That was That Was The Week That Was…that was.”
David Frost interviews Frederick Forsyth on Al-Jazeera
Frost over the World – Gore Vidal – 23 May 08 – Hot Latest News
Shayan – Sir David Frost Interview
Sir David Frost in conversation with Chief Rabbi, Lord Sacks
British broadcaster David Frost dies aged 74
By Agence France-Presse Sunday, September 1, 2013 12:07 EDT
British TV giant David Frost, who interviewed the world’s great and good in a half-century broadcasting career, has died aged 74 of a heart attack on board the Queen Elizabeth cruise liner, his family said Sunday.
Frost, celebrated for his 1977 talks with Richard Nixon that extracted an unexpected apology from the disgraced US president over the Watergate scandal, died Saturday.
Operator Cunard said the ship left its British home port of Southampton on Saturday on a 10-day Mediterranean cruise.
“Sir David Frost died of a heart attack last night aboard the Queen Elizabeth where he was giving a speech,” his family said in a statement.
“His family are devastated and ask for privacy at this difficult time,” the statement said. “A family funeral will be held in the near future and details of a memorial service will be announced in due course.”
Frost’s interviewees read like a who’s who of the rich and famous, from big names in show business to world leaders, including South African anti-apartheid icon Nelson Mandela.
Frost was the only person to have interviewed the last eight British prime ministers and the last seven US presidents before Barack Obama, and the last person to have interviewed the last shah of Iran, Mohammed Reza Pahlavi.
Other subjects included Mikhail Gorbachev, Vladimir Putin, Yasser Arafat, F. W. de Klerk, Jacques Chirac and Benazir Bhutto.
“Hello, good evening and welcome” became his catchphrase, starting off interviews with a friendly veneer that belied a blunt determination to extract information.
“His scrupulous and disarming politeness hid a mind like a vice,” said Menzies Campbell, former leader of Britain’s Liberal Democrats. “David Frost could do you over without you realising it until it was too late.”
The lengthy interviews with Nixon were crucial for both men — Nixon was hoping to salvage his reputation for history, while Frost wanted to add another feather to his cap of famous interviews.
In the end, Frost wrung a mea culpa from Nixon over Watergate, the dirty tricks scandal which prompted his resignation in 1974 and left a lasting scar on the US political landscape.
“I let down my friends, I let down the country,” the former president said.
Frost told BBC television in 2009: “We knew what we were trying to do … and in the end his ‘mea culpa’ went further than even we had hoped.
“At the end of that I think we were aware that something sort of historic had happened and we’d gone further than expected.”
The encounter was turned into a play entitled “Frost/Nixon”, which was adapted into a 2008 film with Michael Sheen playing Frost and Frank Langella as Nixon. It was nominated for five Oscars.
Outside world affairs, Frost’s roster included Orson Welles, Tennessee Williams, Noel Coward, Elton John, Woody Allen, Muhammad Ali, the Beatles, Clint Eastwood, Anthony Hopkins, John Gielgud, Norman Mailer and Warren Beatty, among countless others.
British Prime Minister David Cameron hailed Frost as “an extraordinary man — with charm, wit, talent, intelligence and warmth in equal measure.
“He made a huge impact on television and politics. The Nixon interviews were among the great broadcast moments — but there were many other brilliant interviews,” Cameron said in a statement.
“He could be — and certainly was with me — both a friend and a fearsome interviewer.”
The son of a Methodist minister, David Paradine Frost was born in Kent, southeast England, on April 7, 1939.
Fresh out of Cambridge University, he presented the BBC’s groundbreaking “That Was The Week That Was”, which took an unprecedented satirical look at the week’s news between 1962 and 1963.
A globetrotter, Frost revelled in the Concorde jet-set high life, presenting five programmes a week in the United States and three in Britain.
In 1983, he married Lady Carina Fitzalan-Howard, second daughter of the Duke of Norfolk — the premier duke in the English nobility. They had three sons.
A successful businessman, Frost was knighted in 1993, becoming Sir David.
The broadcaster wrote 17 books, produced several films and started two British television networks, London Weekend Television and TV-am.
David Frost, Interviewer Who Got Nixon to Apologize for Watergate, Dies at 74
By BRIAN STELTER
David Frost, the British broadcaster whose interviews of historic figures like Henry Kissinger, John Lennon and, most famously, Richard M. Nixon often made history in their own right, died on Saturday aboard the ocean liner Queen Elizabeth, where he was scheduled to give a speech. He was 74.
The cause was a heart attack, his family said.
Mr. Frost’s highly varied television career mirrored the growth of the medium, from the black-and-white TV of the 1960s to the cable news of today.
He knew how to make his guests “make news,” as the television industry saying goes, either through a sequence of incisive questions or carefully placed silences. He showcased both techniques during his penetrating series of interviews with President Nixon, broadcast in 1977, three years after Mr. Nixon was driven from office by the Watergate scandal, resigning in the face of certain impeachment.
Mr. Frost not only persuaded Mr. Nixon to end a self-imposed silence, he also extracted an apology from the former president to the American people.
The sessions, described as the most-watched political interviews in history, were recalled 30 years later in a play and a film, both named “Frost/Nixon.” In the film, Mr. Frost was portrayed by Michael Sheen and Mr. Nixon by Frank Langella.
Since 2006, Mr. Frost’s television home had been Al Jazeera English, one of the BBC’s main competitors overseas. Mr. Frost brought prestige to the news network, while it empowered him to conduct the kind of newsmaker interviews he most enjoyed.
“No matter who he was interviewing, he was committed to getting the very best out of the discussion, but always doing so by getting to know his guest, engaging with them and entering into a proper conversation,” Al Anstey, the managing director of Al Jazeera English, said by e-mail.
He was “always a true gentleman,” Mr. Anstey added, alluding to the charm that others said made Mr. Frost so successful in securing such a wide array of guests.
Among those guests in recent years were Prime Minister Recep Tayyip Erdogan of Turkey, the actor George Clooney and the tennis star Martina Navratilova. A new season of Mr. Frost’s program, “The Frost Interview,” began in July with the astronaut Buzz Aldrin. The season was to continue through mid-September.
One of his first interviews for Al Jazeera made headlines when his guest, Tony Blair, agreed with Mr. Frost’s assessment that the war in Iraq had, up until that point in 2006, “been pretty much of a disaster.” In a statement on Sunday, Mr. Blair said, “Being interviewed by him was always a pleasure, but also you knew that there would be multiple stories the next day arising from it.”
David Paradine Frost was born April 7, 1939, in Tenterden, England, to Mona and W. J. Paradine Frost. His father was a Methodist minister.
While a student, Mr. Frost edited both a student newspaper and a literary publication at Cambridge University, where he showed a knack for satire — something on which the BBC soon capitalized. In 1962, Mr. Frost became the host of “That Was the Week That Was,” a satirical look at the news on Saturday nights. While it lasted only two seasons in Britain, “TW3,” as it was known, was reborn briefly as a program on NBC, and it is remembered as a forerunner to “The Daily Show” and the “Weekend Update” segment on NBC’s “Saturday Night Live.”
After “TW3,” Mr. Frost was the host of a succession of programs, from entertainment specials (“David Frost’s Night Out in London”) to more intellectually stimulating talk shows. While most of these were televised in Britain, Mr. Frost crossed the Atlantic constantly; he once said he had lost count of the number of times he had flown on the Concorde.
He filled in for Johnny Carson twice in 1968, and was subsequently offered a syndicated talk show, which premiered on a patchwork of stations across the United States a year later. That series came to an end in 1972.
His most memorable work happened several years later, when his interview with Mr. Nixon was broadcast around the world. At one point Mr. Frost asked about Mr. Nixon’s abuses of presidential power, prompting this answer: “Well, when the president does it, that means that it is not illegal.”
“Upon hearing that sentence, I could scarcely believe my ears,” Mr. Frost wrote in a 2007 book about the interview, published to coincide with the “Frost/Nixon” movie. Mr. Frost said his task then “was to keep him talking on this theme for as long as possible.”
By then, Mr. Frost and Mr. Nixon had already spoken on camera several times. And they continued to speak: the interviews, for which Mr. Nixon was paid $600,000 and a share of the profit for the broadcasts, were taped over four weeks for about two hours at a time and eventually totaled nearly 29 hours.
On the last day, Mr. Frost pressed Mr. Nixon to acknowledge the mistakes of the Watergate period. “Unless you say it, you’re going to be haunted for the rest of your life,” Mr. Frost said.
“That was totally ad-lib,” Mr. Frost recalled. “In fact, I threw my clipboard down just to indicate that it was not prepared in any way.” He added: “I just knew at that moment that Richard Nixon was more vulnerable than he’d ever be in his life. And I knew I had to get it right.”
Mr. Nixon apologized for putting “the American people through two years of needless agony,” adding, “I let the American people down and I have to carry that burden with me for the rest of my life.”
Mr. Frost, who was awarded a knighthood in 1993, had recently moved to a home close to Oxford, said Richard Brock, his executive producer at Al Jazeera. He also had a home in London.
Survivors include his second wife, Carina, and their three sons. His first wife, Lynne Frederick, a British actress, was the widow of Peter Sellers; they divorced in 1982. Mr. Frost was also once engaged to the American actress and singer Diahann Carroll.
In interviews, whenever Mr. Frost was asked about the highlight of his career, he cited the Nixon interview.
But Mr. Frost interviewed other presidents as well, including George H. W. Bush, whom he later praised as wise and determined.
“The Nixon interviews were among the great broadcast moments, but there were many other brilliant interviews,” Prime Minister David Cameron of Britain said in a statement on Sunday morning.
Barney Jones, a longtime colleague of Mr. Frost at the BBC, told the news organization that Mr. Frost had an interview with Mr. Cameron scheduled for September.
Mr. Jones marveled at Mr. Frost’s contacts, recounting a day when “he took me into my little office, scrabbled around in his contacts book, and five minutes later he was talking to George Bush. I couldn’t believe it.”
Frost died on 31 August 2013, aged 74, on board the cruise ship MS Queen Elizabeth, on which he had been engaged as a speaker.
David Paradine Frost was born in Tenterden, Kent, on 7 April 1939, the son of a Methodist minister of Huguenot descent, the Rev. Wilfred John “W. J.” Paradine Frost, and his wife, Mona (Aldrich); he had two elder sisters. While living in Gillingham, Kent, he was taught in the Bible class of the Sunday school at his father’s church (Byron Road Methodist) by David Gilmore Harvey, and subsequently started training as a Methodist local preacher, which he did not complete.
Frost studied at Gonville and Caius College, Cambridge University, from 1958, graduating from the university with a degree in English. He was editor of both the university’s student paper, Varsity, and the literary magazine Granta. He was also secretary of the Footlights Drama Society, which included actors such as Peter Cook and John Bird. During this period, Frost appeared on television for the first time in an edition of Anglia Television’s Town And Gown, performing several comic characters. “The first time I stepped into a television studio”, he once remembered, “it felt like home. It didn’t scare me. Talking to the camera seemed the most natural thing in the world.”
According to some accounts, Frost was the victim of snobbery from the group with which he associated at Cambridge, an account confirmed by Barry Humphries. Christopher Booker, while asserting that Frost’s one defining characteristic was ambition, commented that he was impossible to dislike. According to the satirist John Wells, the Old Etonian actor Jonathan Cecil congratulated Frost around this time for “that wonderfully silly voice” he used while performing, but then discovered that it was Frost’s real voice.
After leaving university, Frost became a trainee at Associated-Rediffusion. Meanwhile, having already gained an agent, Frost performed in cabaret at the Blue Angel nightclub in Berkeley Square, London during the evenings.
That Was the Week That Was (TW3)
Frost was chosen by writer and producer Ned Sherrin to host the satirical programme That Was the Week That Was, also known as TW3, after Frost’s flatmate John Bird suggested Sherrin should see his act at The Blue Angel. The series, which ran for less than 18 months during 1962–63, was part of the satire boom in early 1960s Britain and became a popular programme.
Frost’s involvement in TW3 intensified his rivalry with Peter Cook, who accused him of stealing material and dubbed Frost “the bubonic plagiarist”. The new satirical magazine Private Eye also mocked him at this time. Frost visited the United States during the break between the two series of TW3 in the summer of 1963 and stayed with the producer of the New York production of Beyond The Fringe. Frost, who was unable to swim, jumped into the pool and nearly drowned before Peter Cook saved him. At the memorial service for Cook in 1995, Alan Bennett recalled that rescuing Frost was the one regret Cook frequently expressed.
For the first three editions of the second series in 1963, the BBC attempted to limit the team by scheduling repeats of The Third Man television series after the programme, thus preventing overruns. Frost took to reading synopses of the episodes at the end of the programme as a means of sabotage. After the BBC’s Director General Hugh Greene instructed that the repeats should be abandoned, TW3 returned to being open-ended. More sombrely, on 23 November 1963, a tribute to the assassinated President John F. Kennedy, an event which had occurred the previous day, formed an entire edition of That Was the Week That Was.
An American version of TW3 ran after the original British series had ended. Following a pilot episode on 10 November 1963, the 30-minute US series, also featuring Frost, ran on NBC from 10 January 1964 to May 1965. In 1985, Frost produced and hosted a television special in the same format, That Was the Year That Was, on NBC.
Frost signed for Rediffusion, the ITV weekday contractor in London, to produce a “heavier” interview-based show called The Frost Programme. Guests included Sir Oswald Mosley and Rhodesian premier Ian Smith. His memorable dressing-down of insurance fraudster Emil Savundra, regarded as the first example of “trial by television” in the UK, led to concern from ITV executives that it might affect Savundra’s right to a fair trial. Frost’s introductory words for his television programmes during this period, “Hello, good evening and welcome”, became his catchphrase and were often mimicked.
Frost was a member of a successful consortium, including former executives from the BBC, which bid for an ITV franchise in 1967. This became London Weekend Television, which began broadcasting in July 1968. The station began with a programming policy which was considered ‘highbrow’ and suffered launch problems with low audience ratings and financial problems. A September 1968 meeting of the Network Programme Committee, which made decisions about the channel’s scheduling, was particularly fraught, with Lew Grade expressing hatred of Frost in his presence. Frost, according to Kitty Muggeridge in 1967, had “risen without a trace.”
In 1968 he signed a contract worth £125,000 to appear on American television in his own show on three evenings each week, the largest such arrangement for a British television personality at the time. From 1969 to 1972, Frost kept his London shows and fronted The David Frost Show on the Group W (U.S. Westinghouse Corporation) television stations in the United States. His 1970 TV special, Frost on America, featured guests such as Jack Benny and Tennessee Williams.
In 1977 The Nixon Interviews, a series of five 90-minute interviews with former US President Richard Nixon, were broadcast. Nixon was paid $600,000 plus a share of the profits for the interviews, which had to be funded by Frost himself after the US television networks turned down the programme, describing it as “checkbook journalism”. Frost’s company negotiated its own deals to syndicate the interviews with local stations across the US and internationally, creating what Ron Howard described as “the first fourth network.”
Frost taped around 29 hours of interviews with Nixon over a period of four weeks. Nixon, who had previously avoided discussing his role in the Watergate scandal which had led to his resignation as President in 1974, expressed contrition saying “I let the American people down and I have to carry that burden with me for the rest of my life”.
Frost was one of the “Famous Five” who launched TV-am in February 1983 but, like LWT in the late 1960s, the station began with an unsustainable “highbrow” approach. Frost remained a presenter after restructuring. Frost on Sunday began in September 1983 and continued until the station lost its franchise at the end of 1992. Frost had been part of an unsuccessful consortium, CPV-TV, with Richard Branson and other interests, which had attempted to acquire three ITV contractor franchises prior to the changes made by the Independent Television Commission in 1991. After transferring from ITV, his Sunday morning interview programme Breakfast with Frost ran on the BBC from January 1993 until 29 May 2005. For a time it ran on BSB before its later Sunday morning rebroadcast on BBC 1.
Frost hosted Through the Keyhole, which ran on several UK channels from 1987 until 2008 and also featured Loyd Grossman. Produced by his own production company, the programme was first shown in prime time and on daytime television in its later years.
After having been in television for 40 years, Frost was estimated to be worth £200 million by the Sunday Times Rich List in 2006, a figure he considered a significant over-estimate in 2011. The valuation included the assets of his main British company and subsidiaries, plus homes in London and the country.
Frost/Nixon was originally a play written by Peter Morgan, developed from The Nixon Interviews which Frost had conducted with Richard Nixon in 1977. Frost/Nixon was presented as a stage production in London in 2006, and on Broadway in 2007. The play was adapted into a Hollywood motion picture starring Michael Sheen as Frost and Frank Langella as Nixon, both reprising their stage roles. The film was directed by Ron Howard and released in 2008. It was nominated for five Golden Globe awards: Best Motion Picture Drama, Best Director, Best Actor, Best Screenplay and Best Original Score, and for five Academy Awards: Best Picture, Best Actor, Best Director, Best Adapted Screenplay and Best Editing.
Frost was known for several relationships with high profile women. In the mid-1960s, he dated British actress Janette Scott, between her marriages to songwriter Jackie Rae and singer Mel Tormé; in the early 1970s he was engaged to American actress Diahann Carroll; between 1972 and 1977 he had a relationship with British socialite Caroline Cushing; in 1981 he married Lynne Frederick, widow of Peter Sellers, but they divorced the following year. He also had an 18-year intermittent affair with American actress Carol Lynley.
On 19 March 1983, Frost married Lady Carina Fitzalan-Howard, daughter of the 17th Duke of Norfolk. Over the next five years, they had three sons, Miles, Wilfred and George, and for many years lived in Chelsea, with their weekend home at Michelmersh Court in Hampshire.
NSA broke privacy rules thousands of times per year, audit finds
By Barton Gellman,
The National Security Agency has broken privacy rules or overstepped its legal authority thousands of times each year since Congress granted the agency broad new powers in 2008, according to an internal audit and other top-secret documents.
Most of the infractions involve unauthorized surveillance of Americans or foreign intelligence targets in the United States, both of which are restricted by statute and executive order. They range from significant violations of law to typographical errors that resulted in unintended interception of U.S. e-mails and telephone calls.
The documents, provided earlier this summer to The Washington Post by former NSA contractor Edward Snowden, include a level of detail and analysis that is not routinely shared with Congress or the special court that oversees surveillance. In one of the documents, agency personnel are instructed to remove details and substitute more generic language in reports to the Justice Department and the Office of the Director of National Intelligence.
In one instance, the NSA decided that it need not report the unintended surveillance of Americans. A notable example in 2008 was the interception of a “large number” of calls placed from Washington when a programming error confused the U.S. area code 202 for 20, the international dialing code for Egypt, according to a “quality assurance” review that was not distributed to the NSA’s oversight staff.
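The prefix confusion described above can be sketched in a few lines. This is a hypothetical illustration, not the NSA's actual code; the function name and number formats are invented. A selector that assumes every number begins with a country code will match Washington's 202 area code against Egypt's country code 20:

```python
EGYPT_COUNTRY_CODE = "20"  # international dialing code for Egypt

def classify(number: str) -> str:
    """Naive selector that assumes every number starts with a country code."""
    digits = number.lstrip("+")
    if digits.startswith(EGYPT_COUNTRY_CODE):
        return "Egypt"
    return "other"

# A Washington, D.C. number written without its +1 country code:
print(classify("2025550100"))    # Egypt  (misclassified)
print(classify("+12025550100"))  # other  (correctly ignored)
```

The bug disappears only when numbers are normalized to a full international form before the prefix test, which is presumably the class of fix such an error would require.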
In another case, the Foreign Intelligence Surveillance Court, which has authority over some NSA operations, did not learn about a new collection method until it had been in operation for many months. The court ruled it unconstitutional.
The Obama administration has provided almost no public information about the NSA’s compliance record. In June, after promising to explain the NSA’s record in “as transparent a way as we possibly can,” Deputy Attorney General James Cole described extensive safeguards and oversight that keep the agency in check. “Every now and then, there may be a mistake,” Cole said in congressional testimony.
The NSA audit obtained by The Post, dated May 2012, counted 2,776 incidents in the preceding 12 months of unauthorized collection, storage, access to or distribution of legally protected communications. Most were unintended. Many involved failures of due diligence or violations of standard operating procedure. The most serious incidents included a violation of a court order and unauthorized use of data about more than 3,000 Americans and green-card holders.
In a statement in response to questions for this article, the NSA said it attempts to identify problems “at the earliest possible moment, implement mitigation measures wherever possible, and drive the numbers down.” The government was made aware of The Post’s intention to publish the documents that accompany this article online.
“We’re a human-run agency operating in a complex environment with a number of different regulatory regimes, so at times we find ourselves on the wrong side of the line,” a senior NSA official said in an interview, speaking with White House permission on the condition of anonymity.
This is the full executive summary, with names redacted by The Post, of a classified internal report on breaches of NSA privacy rules and legal restrictions.
The report covers the period from January through March 2012 and includes comparative data for the full preceding year. Its author is the director of oversight and compliance for the NSA’s Signals Intelligence Directorate, but the scope of the report is narrower. Incidents are counted only if they took place within “NSA-Washington,” a term encompassing the Ft. Meade headquarters and nearby facilities. The NSA declined to provide comparable figures for its operations as a whole. A senior intelligence official said only that if all offices and directorates were included, the number of violations would “more than double.”
NSA Scandal – Americans “Shut Up & Obey” – RPT NSA Broke Rules Thousands Of Times!
Ron Paul / Glenn Greenwald Interview
Snowden Leak Reveals NSA Broke Its Own Rules THOUSANDS OF TIMES
Internal NSA Audit Privacy Rules Broken Thousands of Times
NSA audit confirms abuse despite Obama’s claim
Report: NSA Spying Broke Privacy Rules Many Times – Edwards Snowden Documents Reveal
Edward Snowden NSA Reform Analysis
EVERYTHING You Do Online Is Recorded In XKeyscore The Young Turks with Cenk Uygur
XKeyscore A New Level of Invasive NSA Data Spying
‘Does the NSA collect any type of data at all on millions or hundreds of millions of Americans?’
Glenn Greenwald There Are Extremely Invasive Spying Programs the Public Still Does Not Know About
What you’re not being told about Booz Allen Hamilton and Edward Snowden
Justin Amash No Precedent In History For NSA Spying
Can Americans Change the Agenda of Extreme Spying?
FOX NEWS: NSA Tracking Of American People
“NSA Spying Now Protected From Any Challenges Under The Fourth Amendment”
Rand Paul on NSA Spying ‘An Utter, Frank Hypocrisy’ But will he do anything about it
Background Articles and Videos
NSA Whistle-blower Reveals “Stellar Wind” Spying on You – code named The Program
NSA Whistleblower: ‘Everyone In U.S. Under Virtual Surveillance’
NSA Collects ‘Word for Word’ Every Domestic Communication
NSA Whistleblowers: “All U.S. Citizens” Targeted By Surveillance Program, Not Just Verizon Customers
“Obama Is BIG BROTHER And He’s A LIAR!”
NSA Spying is Far Worse Than You Thought
UNBELIEVABLE NSA, FBI Secretly Mines Data from Major Internet Companies Google, Yahoo
NSA Spying on All Americans Part 1
NSA spying on All Americans Part 2
James Bamford: Inside the NSA’s Largest Secret Domestic Spy Center
NSA Whistleblower Speaks Live: “The Government Is Lying To You” Part 2
NSA Whistleblower Speaks Live: “The Government Is Lying To You” Part 3
NSA Whistleblower Speaks Live: “The Government Is Lying To You” Part 4
NSA whistleblower William Binney Keynote at HOPE Number Nine
NSA Whistleblower Thomas Drake speaks at National Press Club – March 15, 2013
Enemies of the State [29C3]
ENEMY OF THE STATE… (1998) MUST WATCH..TAKE SERIOUSLY..
NSA and the One Trillion Dollar scam [Empire]
Nova: The Spy Factory Full Video
Inside NSA – The National Security Agency – Documentary
Inside The NSA ~ America’s Cyber Secrets
Why Shouldn’t I Work for the NSA? (Good Will Hunting)
The NSA Is Building the Country’s Biggest Spy Center (Watch What You Say)
The spring air in the small, sand-dusted town has a soft haze to it, and clumps of green-gray sagebrush rustle in the breeze. Bluffdale sits in a bowl-shaped valley in the shadow of Utah’s Wasatch Range to the east and the Oquirrh Mountains to the west. It’s the heart of Mormon country, where religious pioneers first arrived more than 160 years ago. They came to escape the rest of the world, to understand the mysterious words sent down from their god as revealed on buried golden plates, and to practice what has become known as “the principle,” marriage to multiple wives.
Today Bluffdale is home to one of the nation’s largest sects of polygamists, the Apostolic United Brethren, with upwards of 9,000 members. The brethren’s complex includes a chapel, a school, a sports field, and an archive. Membership has doubled since 1978—and the number of plural marriages has tripled—so the sect has recently been looking for ways to purchase more land and expand throughout the town.
But new pioneers have quietly begun moving into the area, secretive outsiders who say little and keep to themselves. Like the pious polygamists, they are focused on deciphering cryptic messages that only they have the power to understand. Just off Beef Hollow Road, less than a mile from brethren headquarters, thousands of hard-hatted construction workers in sweat-soaked T-shirts are laying the groundwork for the newcomers’ own temple and archive, a massive complex so large that it necessitated expanding the town’s boundaries. Once built, it will be more than five times the size of the US Capitol.
Rather than Bibles, prophets, and worshippers, this temple will be filled with servers, computer intelligence experts, and armed guards. And instead of listening for words flowing down from heaven, these newcomers will be secretly capturing, storing, and analyzing vast quantities of words and images hurtling through the world’s telecommunications networks. In the little town of Bluffdale, Big Love and Big Brother have become uneasy neighbors.
The NSA has become the largest, most covert, and potentially most intrusive intelligence agency ever.
Under construction by contractors with top-secret clearances, the blandly named Utah Data Center is being built for the National Security Agency. A project of immense secrecy, it is the final piece in a complex puzzle assembled over the past decade. Its purpose: to intercept, decipher, analyze, and store vast swaths of the world’s communications as they zap down from satellites and zip through the underground and undersea cables of international, foreign, and domestic networks. The heavily fortified $2 billion center should be up and running in September 2013. Flowing through its servers and routers and stored in near-bottomless databases will be all forms of communication, including the complete contents of private emails, cell phone calls, and Google searches, as well as all sorts of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital “pocket litter.” It is, in some measure, the realization of the “total information awareness” program created during the first term of the Bush administration—an effort that was killed by Congress in 2003 after it caused an outcry over its potential for invading Americans’ privacy.
But “this is more than just a data center,” says one senior intelligence official who until recently was involved with the program. The mammoth Bluffdale center will have another important and far more secret role that until now has gone unrevealed. It is also critical, he says, for breaking codes. And code-breaking is crucial, because much of the data that the center will handle—financial information, stock transactions, business deals, foreign military and diplomatic secrets, legal documents, confidential personal communications—will be heavily encrypted. According to another top official also involved with the program, the NSA made an enormous breakthrough several years ago in its ability to cryptanalyze, or break, unfathomably complex encryption systems employed by not only governments around the world but also many average computer users in the US. The upshot, according to this official: “Everybody’s a target; everybody with communication is a target.”
For the NSA, overflowing with tens of billions of dollars in post-9/11 budget awards, the cryptanalysis breakthrough came at a time of explosive growth, in size as well as in power. Established as an arm of the Department of Defense following Pearl Harbor, with the primary purpose of preventing another surprise assault, the NSA suffered a series of humiliations in the post-Cold War years. Caught off guard by an escalating series of terrorist attacks—the first World Trade Center bombing, the blowing up of US embassies in East Africa, the attack on the USS Cole in Yemen, and finally the devastation of 9/11—some began questioning the agency’s very reason for being. In response, the NSA has quietly been reborn. And while there is little indication that its actual effectiveness has improved—after all, despite numerous pieces of evidence and intelligence-gathering opportunities, it missed the near-disastrous attempted attacks by the underwear bomber on a flight to Detroit in 2009 and by the car bomber in Times Square in 2010—there is no doubt that it has transformed itself into the largest, most covert, and potentially most intrusive intelligence agency ever created.
In the process—and for the first time since Watergate and the other scandals of the Nixon administration—the NSA has turned its surveillance apparatus on the US and its citizens. It has established listening posts throughout the nation to collect and sift through billions of email messages and phone calls, whether they originate within the country or overseas. It has created a supercomputer of almost unimaginable speed to look for patterns and unscramble codes. Finally, the agency has begun building a place to store all the trillions of words and thoughts and whispers captured in its electronic net. And, of course, it’s all being done in secret. To those on the inside, the old adage that NSA stands for Never Say Anything applies more than ever.
A swath of freezing fog blanketed Salt Lake City on the morning of January 6, 2011, mixing with a weeklong coating of heavy gray smog. Red air alerts, warning people to stay indoors unless absolutely necessary, had become almost daily occurrences, and the temperature was in the bone-chilling twenties. “What I smell and taste is like coal smoke,” complained one local blogger that day. At the city’s international airport, many inbound flights were delayed or diverted while outbound regional jets were grounded. But among those making it through the icy mist was a figure whose gray suit and tie made him almost disappear into the background. He was tall and thin, with the physique of an aging basketball player and dark caterpillar eyebrows beneath a shock of matching hair. Accompanied by a retinue of bodyguards, the man was NSA deputy director Chris Inglis, the agency’s highest-ranking civilian and the person who ran its worldwide day-to-day operations.
A short time later, Inglis arrived in Bluffdale at the site of the future data center, a flat, unpaved runway on a little-used part of Camp Williams, a National Guard training site. There, in a white tent set up for the occasion, Inglis joined Harvey Davis, the agency’s associate director for installations and logistics, and Utah senator Orrin Hatch, along with a few generals and politicians in a surreal ceremony. Standing in an odd wooden sandbox and holding gold-painted shovels, they made awkward jabs at the sand and thus officially broke ground on what the local media had simply dubbed “the spy center.” Hoping for some details on what was about to be built, reporters turned to one of the invited guests, Lane Beattie of the Salt Lake Chamber of Commerce. Did he have any idea of the purpose behind the new facility in his backyard? “Absolutely not,” he said with a self-conscious half laugh. “Nor do I want them spying on me.”
For his part, Inglis simply engaged in a bit of double-talk, emphasizing the least threatening aspect of the center: “It’s a state-of-the-art facility designed to support the intelligence community in its mission to, in turn, enable and protect the nation’s cybersecurity.” While cybersecurity will certainly be among the areas focused on in Bluffdale, what is collected, how it’s collected, and what is done with the material are far more important issues. Battling hackers makes for a nice cover—it’s easy to explain, and who could be against it? Then the reporters turned to Hatch, who proudly described the center as “a great tribute to Utah,” then added, “I can’t tell you a lot about what they’re going to be doing, because it’s highly classified.”
And then there was this anomaly: Although this was supposedly the official ground-breaking for the nation’s largest and most expensive cybersecurity project, no one from the Department of Homeland Security, the agency responsible for protecting civilian networks from cyberattack, spoke from the lectern. In fact, the official who’d originally introduced the data center, at a press conference in Salt Lake City in October 2009, had nothing to do with cybersecurity. It was Glenn A. Gaffney, deputy director of national intelligence for collection, a man who had spent almost his entire career at the CIA. As head of collection for the intelligence community, he managed the country’s human and electronic spies.
Within days, the tent and sandbox and gold shovels would be gone and Inglis and the generals would be replaced by some 10,000 construction workers. “We’ve been asked not to talk about the project,” Rob Moore, president of Big-D Construction, one of the three major contractors working on the project, told a local reporter. The plans for the center show an extensive security system: an elaborate $10 million antiterrorism protection program, including a fence designed to stop a 15,000-pound vehicle traveling 50 miles per hour, closed-circuit cameras, a biometric identification system, a vehicle inspection facility, and a visitor-control center.
Inside, the facility will consist of four 25,000-square-foot halls filled with servers, complete with raised floor space for cables and storage. In addition, there will be more than 900,000 square feet for technical support and administration. The entire site will be self-sustaining, with fuel tanks large enough to power the backup generators for three days in an emergency, water storage with the capability of pumping 1.7 million gallons of liquid per day, as well as a sewage system and massive air-conditioning system to keep all those servers cool. Electricity will come from the center’s own substation built by Rocky Mountain Power to satisfy the 65-megawatt power demand. Such a mammoth amount of energy comes with a mammoth price tag—about $40 million a year, according to one estimate.
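The power figures above are internally consistent. As a rough sanity check, 65 MW of continuous demand at a typical industrial electricity rate lands near the quoted $40 million a year; the $0.07/kWh rate is an assumption introduced here for illustration, not a number from the article:

```python
# Rough check: 65 MW running around the clock for a year.
demand_mw = 65
hours_per_year = 24 * 365
kwh_per_year = demand_mw * 1_000 * hours_per_year  # 569,400,000 kWh

# Assumed industrial rate of $0.07/kWh (illustrative only).
rate_per_kwh = 0.07
annual_cost = kwh_per_year * rate_per_kwh
print(round(annual_cost / 1e6, 1))  # ~39.9, i.e. about $40 million a year
```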
Given the facility’s scale and the fact that a terabyte of data can now be stored on a flash drive the size of a man’s pinky, the potential amount of information that could be housed in Bluffdale is truly staggering. But so is the exponential growth in the amount of intelligence data being produced every day by the eavesdropping sensors of the NSA and other intelligence agencies. As a result of this “expanding array of theater airborne and other sensor networks,” as a 2007 Department of Defense report puts it, the Pentagon is attempting to expand its worldwide communications network, known as the Global Information Grid, to handle yottabytes (10²⁴ bytes) of data. (A yottabyte is a septillion bytes—so large that no one has yet coined a term for the next higher magnitude.)
It needs that capacity because, according to a recent report by Cisco, global Internet traffic will quadruple from 2010 to 2015, reaching 966 exabytes per year. (A million exabytes equal a yottabyte.) In terms of scale, Eric Schmidt, Google’s former CEO, once estimated that the total of all human knowledge created from the dawn of man to 2003 totaled 5 exabytes. And the data flow shows no sign of slowing. In 2011 more than 2 billion of the world’s 6.9 billion people were connected to the Internet. By 2015, market research firm IDC estimates, there will be 2.7 billion users. Thus, the NSA’s need for a 1-million-square-foot data storehouse. Should the agency ever fill the Utah center with a yottabyte of information, it would be equal to about 500 quintillion (500,000,000,000,000,000,000) pages of text.
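The unit arithmetic in the passage above checks out. The 2,000-bytes-per-page figure below is an assumption introduced here to reproduce the cited page count, not a number from the article:

```python
EXABYTE = 10**18    # decimal (SI) definitions
YOTTABYTE = 10**24

# "A million exabytes equal a yottabyte."
assert YOTTABYTE // EXABYTE == 1_000_000

# Assuming roughly 2,000 bytes per page of plain text, a yottabyte
# comes out to the cited 500 quintillion pages.
BYTES_PER_PAGE = 2_000
pages = YOTTABYTE // BYTES_PER_PAGE
print(f"{pages:,}")  # 500,000,000,000,000,000,000
```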
The data stored in Bluffdale will naturally go far beyond the world’s billions of public web pages. The NSA is more interested in the so-called invisible web, also known as the deep web or deepnet—data beyond the reach of the public. This includes password-protected data, US and foreign government communications, and noncommercial file-sharing between trusted peers. “The deep web contains government reports, databases, and other sources of information of high value to DOD and the intelligence community,” according to a 2010 Defense Science Board report. “Alternative tools are needed to find and index data in the deep web … Stealing the classified secrets of a potential adversary is where the [intelligence] community is most comfortable.” With its new Utah Data Center, the NSA will at last have the technical capability to store, and rummage through, all those stolen secrets. The question, of course, is how the agency defines who is, and who is not, “a potential adversary.”
Before yottabytes of data from the deep web and elsewhere can begin piling up inside the servers of the NSA’s new center, they must be collected. To better accomplish that, the agency has undergone the largest building boom in its history, including installing secret electronic monitoring rooms in major US telecom facilities. Controlled by the NSA, these highly secured spaces are where the agency taps into the US communications networks, a practice that came to light during the Bush years but was never acknowledged by the agency. The broad outlines of the so-called warrantless-wiretapping program have long been exposed—how the NSA secretly and illegally bypassed the Foreign Intelligence Surveillance Court, which was supposed to oversee and authorize highly targeted domestic eavesdropping; how the program allowed wholesale monitoring of millions of American phone calls and email. In the wake of the program’s exposure, Congress passed the FISA Amendments Act of 2008, which largely made the practices legal. Telecoms that had agreed to participate in the illegal activity were granted immunity from prosecution and lawsuits. What wasn’t revealed until now, however, was the enormity of this ongoing domestic spying program.
For the first time, a former NSA official has gone on the record to describe the program, codenamed Stellar Wind, in detail. William Binney was a senior NSA crypto-mathematician largely responsible for automating the agency’s worldwide eavesdropping network. A tall man with strands of black hair across the front of his scalp and dark, determined eyes behind thick-rimmed glasses, the 68-year-old spent nearly four decades breaking codes and finding new ways to channel billions of private phone calls and email messages from around the world into the NSA’s bulging databases. As chief and one of the two cofounders of the agency’s Signals Intelligence Automation Research Center, Binney and his team designed much of the infrastructure that’s still likely used to intercept international and foreign communications.
He explains that the agency could have installed its tapping gear at the nation’s cable landing stations—the more than two dozen sites on the periphery of the US where fiber-optic cables come ashore. If it had taken that route, the NSA would have been able to limit its eavesdropping to just international communications, which at the time was all that was allowed under US law. Instead it chose to put the wiretapping rooms at key junction points throughout the country—large, windowless buildings known as switches—thus gaining access to not just international communications but also to most of the domestic traffic flowing through the US. The network of intercept stations goes far beyond the single room in an AT&T building in San Francisco exposed by a whistle-blower in 2006. “I think there’s 10 to 20 of them,” Binney says. “That’s not just San Francisco; they have them in the middle of the country and also on the East Coast.”
The eavesdropping on Americans doesn’t stop at the telecom switches. To capture satellite communications in and out of the US, the agency also monitors AT&T’s powerful earth stations, satellite receivers in locations that include Roaring Creek and Salt Creek. Tucked away on a back road in rural Catawissa, Pennsylvania, Roaring Creek’s three 105-foot dishes handle much of the country’s communications to and from Europe and the Middle East. And on an isolated stretch of land in remote Arbuckle, California, three similar dishes at the company’s Salt Creek station service the Pacific Rim and Asia.
The former NSA official held his thumb and forefinger close together: “We are that far from a turnkey totalitarian state.”
Binney left the NSA in late 2001, shortly after the agency launched its warrantless-wiretapping program. “They violated the Constitution setting it up,” he says bluntly. “But they didn’t care. They were going to do it anyway, and they were going to crucify anyone who stood in the way. When they started violating the Constitution, I couldn’t stay.” Binney says Stellar Wind was far larger than has been publicly disclosed and included not just eavesdropping on domestic phone calls but the inspection of domestic email. At the outset the program recorded 320 million calls a day, he says, which represented about 73 to 80 percent of the total volume of the agency’s worldwide intercepts. The haul only grew from there. According to Binney—who maintained close contact with agency employees until a few years ago—the taps in the secret rooms dotting the country are actually powered by highly sophisticated software programs that conduct “deep packet inspection,” examining Internet traffic as it passes through the 10-gigabit-per-second cables at the speed of light.
The software, created by a company called Narus that’s now part of Boeing, is controlled remotely from NSA headquarters at Fort Meade in Maryland and searches US sources for target addresses, locations, countries, and phone numbers, as well as watch-listed names, keywords, and phrases in email. Any communication that arouses suspicion, especially those to or from the million or so people on agency watch lists, are automatically copied or recorded and then transmitted to the NSA.
The scope of surveillance expands from there, Binney says. Once a name is entered into the Narus database, all phone calls and other communications to and from that person are automatically routed to the NSA’s recorders. “Anybody you want, route to a recorder,” Binney says. “If your number’s in there? Routed and gets recorded.” He adds, “The Narus device allows you to take it all.” And when Bluffdale is completed, whatever is collected will be routed there for storage and analysis.
According to Binney, one of the deepest secrets of the Stellar Wind program—again, never confirmed until now—was that the NSA gained warrantless access to AT&T’s vast trove of domestic and international billing records, detailed information about who called whom in the US and around the world. As of 2007, AT&T had more than 2.8 trillion records housed in a database at its Florham Park, New Jersey, complex.
Verizon was also part of the program, Binney says, and that greatly expanded the volume of calls subject to the agency’s domestic eavesdropping. “That multiplies the call rate by at least a factor of five,” he says. “So you’re over a billion and a half calls a day.” (Spokespeople for Verizon and AT&T said their companies would not comment on matters of national security.)
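Binney's estimate above multiplies out as quoted: the 320 million calls a day cited at the program's outset, times his "at least a factor of five" with Verizon included, gives his "over a billion and a half calls a day":

```python
initial_calls_per_day = 320_000_000  # Stellar Wind at the outset
verizon_multiplier = 5               # "at least a factor of five"

total = initial_calls_per_day * verizon_multiplier
print(f"{total:,}")  # 1,600,000,000 -- over a billion and a half a day
```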
“This amnesty will give citizenship to only 1.1 to 1.3 million illegal aliens. We will secure the borders henceforth. We will never again bring forward another amnesty bill like this.”
~Senator Edward “Ted” Kennedy, D-Mass, regarding an amnesty bill passed in 1986
Immigration by the Numbers — Off the Charts
Immigration, World Poverty and Gumballs – Updated 2010
1984 – Ronald Reagan on Amnesty
In this brief video-clip from the 1984 presidential debates Ronald Reagan discusses immigration, amnesty and the failure of the first attempt to pass the Simpson-Mazzoli Immigration Reform and Control Act. [When the act finally passed (1986) did we get reform? Did we get control?]
The Immigration Reform and Control Act of 1986
Illegal alien: a foreigner who has either entered a country illegally (e.g. without inspection or proper documents) or who has violated the terms of legal admission to the country (e.g. by overstaying the duration of a tourist or student visa).
8 USC § 1101 – Definitions
(3) The term “alien” means any person not a citizen or national of the United States.
How Many Illegal Aliens Are in the US? – Walsh – 1
How Many Illegal Aliens Are in the United States? Presentation by James H. Walsh, Associate General Counsel of the former INS – part 1.
How Many Illegal Aliens Are in the US? – Walsh – 2
How Many Illegal Aliens Are in the United States? Presentation by James H. Walsh, Associate General Counsel of the former INS – part 2.
Census Bureau estimates of the number of illegals in the U.S. are suspect and may represent significant undercounts. The studies presented by these authors show that the numbers of illegal aliens in the U.S. could range from 20 to 38 million.
US immigration system moves towards reform
Sen. Ted Cruz Speaks on the Senate Floor in Opposition to the Gang of Eight’s Immigration Bill
Glenn Beck to Release Name of 70 House Republicans for Showdown w John Boehner on Amnesty Bill
Glenn Beck: Interview with House Republicans Planning Revolt On Immigration Bill
Glenn Beck Program Immigration and Equal Opportunity 06132013
US Senate Votes to Consider Citizenship for Illegal Immigrants
News Wrap: Senate Votes to Begin Immigration Reform Debate
Border Insecurity Citizens Track Surge Of Illegal Immigration! – Wake Up America!!
Chris Pyle, Whistleblower on CIA Domestic Spying in 70s, Says Be Wary of Attacks on NSA’s Critics
NSA Chief Grilled at Senate Hearing on Surveillance Programs
He told you so: Bill Binney talks NSA leaks
“In the wake of multiple leaks regarding the data mining programs PRISM and Boundless Informant, whistleblowers are coming out in droves to talk about the unprecedented government surveillance on the American public. RT Correspondent Meghan Lopez had a chance to sit down with NSA whistleblower William Binney to talk about the latest developments coming out of the NSA case. Binney is a 32-year veteran of the NSA, where he helped design a top secret program he says helps collect data on foreign enemies. He is regarded as one of the best mathematicians and code breakers in NSA history. He became an NSA whistleblower in 2002 when he realized the program he helped create to spy on foreign enemies was being used on Americans.”
A Massive Surveillance State Glenn Greenwald Exposes Covert NSA Program Collecting Calls, Emails
What You Should Know About The New NSA Utah Data Center
Glenn Greenwald Vs Bush Press Sec. Ari Fleischer Over NSA’s PRISM
NSA Whistleblowers: “All U.S. Citizens” Targeted By Surveillance Program, Not Just Verizon Customers
Experts Say NSA Leak Damage Could be Significant
“SPY AND DENY” IS THE NEW NORMAL IN USA!
Era of Online Sharing Offers ‘Big Data,’ Privacy Trade-Offs
Rep King Drops Bombshell; Sen Lee To Talk Claim Chief Justice Roberts Blackmailed
How PRISM Easily Gives Your Private Data Over to Big Brother
“The National Security Agency has obtained direct access to the systems of Google, Facebook, Apple and other US internet giants, according to a top secret document obtained by the Guardian.
The NSA access is part of a previously undisclosed program called Prism, which allows officials to collect material including search history, the content of emails, file transfers and live chats, the document says.”
We’ve been assured by the president that the NSA’s PRISM program won’t affect “ordinary” U.S. citizens, but what are the criteria for deciding who gets their data mined and monitored? Cenk Uygur, Ben Mankiewicz, and John Iadarola (Host, TYT University) discuss the egregious reach of the Obama administration’s secret mass surveillance program.
NSA whistleblower Edward Snowden: ‘I don’t want to live in a society that does these sort of things’
RNC/DNC Collecting Your Info En Masse
ILLEGAL IMMIGRATION IS DESTROYING AMERICA
The Dangers of Unlimited Legal & Illegal Immigration
Immigration by the Numbers — Off the Charts
Immigration, World Poverty and Gumballs – Updated 2010
THEY COME TO AMERICA II. The Cost of Amnesty
They Come to America (Trailer 2)
2012: They Come to America. The Cost of Illegal Immigration.
Schumer Refuses To Estimate Future Immigration Flow Under Gang Of Eight Proposal
Obama To Stop Deporting Young Illegal Immigrants
“The Obama administration will stop deporting young illegal immigrants who came to the U.S. as children and who do not pose a security threat, senior administration officials said this morning, a move that could prove important in a presidential campaign that will turn in part on who wins over Latino voters.
Effective immediately, young immigrants who arrived in the U.S. illegally before they turned 16 will be allowed to apply for work permits as long as they have no criminal history and meet other criteria, officials said.”
Reality Check: President Obama’s Immigration Reform Rings Hollow
(Part I) A Day in the Life of an Arizona Rancher: Fences, Illegal Aliens, and One Man’s Watchtower
(Part II) A Day in the Life of an Arizona Rancher: Fences, Illegal Aliens, and One Man’s Watchtower
Background Articles and Videos
AP’s “Illegal Immigrant” Stand – Leno: Illegal Immigrants That is Out, Now “Undocumented Democrats”
Illegal immigration to the United States – Wiki Article
Illegal immigration to the United States is the act of foreign nationals entering the United States, without government permission and in violation of United States nationality law, or staying beyond the termination date of a visa, also in violation of the law.
The illegal immigrant population of the United States in 2008 was estimated by the Center for Immigration Studies to be about 11 million people, down from 12.5 million people in 2007. Other estimates range from 7 to 20 million. According to a Pew Hispanic Center report, in 2005, 56% of illegal immigrants were from Mexico; 22% were from other Latin American countries, primarily from Central America; 13% were from Asia; 6% were from Europe and Canada; and 3% were from Africa and the rest of the world.
Profile and demographics
The number of illegal immigrants arriving each year continues to outpace the number of legal immigrants—a trend that has held steady since the 1990s. While the majority of illegal immigrants continue to concentrate in places with existing large Hispanic communities, increasingly they are settling throughout the rest of the country.
An estimated 14 million people live in families in which the head of household or the spouse is in the United States illegally. Illegal immigrants who have arrived in recent years tend to be better educated than those who have been in the country a decade or more. A quarter of all immigrants who have arrived in recent years have at least some college education. Nonetheless, illegal immigrants as a group tend to be less educated than other sections of the U.S. population: 49 percent haven’t completed high school, compared with 9 percent of native-born Americans and 25 percent of legal immigrants.
Illegal immigrants work in many sectors of the U.S. economy. According to National Public Radio in 2005, about 3 percent work in agriculture; 33 percent have jobs in service industries; and substantial numbers can be found in construction and related occupations (16 percent), and in production, installation, and repair (17 percent). According to USA Today in 2006, about 4 percent work in farming; 21 percent have jobs in service industries; and substantial numbers can be found in construction and related occupations (19 percent), and in production, installation, and repair (15 percent), with 12% in sales, 10% in management, and 8% in transportation. Illegal immigrants have lower incomes than both legal immigrants and native-born Americans, but earnings do increase somewhat the longer an individual is in the country.
A percentage of illegal immigrants do not remain indefinitely but return to their country of origin; they are often referred to as “sojourners”: they come to the United States for several years but eventually return to their home country.
Breakdown by state
As of 2006, state-level data showed the distribution of locations where illegal immigrants reside, broken down by state.
Number of illegal immigrants
According to the Government Accountability Office (GAO), different estimates of the total number of illegal immigrants vary depending on how the term is defined. There are also questions about data reliability.
The GAO has stated that “it seems clear that the population of undocumented foreign-born persons is large and has increased rapidly.” On April 26, 2006, the Pew Hispanic Center (PHC) estimated that in March 2005 the number of illegal immigrants in the U.S. ranged from 11.5 to 12 million individuals. This number was derived by a statistical method known as the “residual method.” According to the General Accounting Office, the residual estimation (1) starts with a census count or survey estimate of the number of foreign-born residents who have not become U.S. citizens and (2) subtracts out estimated numbers of legally present individuals in various categories, based on administrative data and assumptions (because censuses and surveys do not ask about legal status). The remainder, or residual, represents an indirect estimate of the unauthorized population.
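Under those assumptions, the residual method reduces to simple arithmetic. A minimal sketch with invented, unofficial figures (the category names and counts below are illustrative only, not actual census or administrative data):

```python
# Hypothetical figures for illustration only -- not official estimates.
foreign_born_noncitizens = 21_500_000   # census/survey count

# Estimated numbers of legally present individuals, by category
# (categories and values invented for this sketch).
legal_categories = {
    "lawful permanent residents": 7_200_000,
    "refugees and asylees":       1_100_000,
    "temporary visa holders":     1_400_000,
}

# Residual = survey count minus everyone estimated to be legally present.
residual = foreign_born_noncitizens - sum(legal_categories.values())
print(f"Indirect estimate: {residual:,}")   # Indirect estimate: 11,800,000
```

The estimate is only as good as the subtracted category totals, which is why the GAO flags data-reliability questions.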
Senate Dismisses Any Pretense of Enforcement in the Gang of Eight Immigration Bill
Rubio Reneges on Promise to Fix Flaws in the Bill
(Washington, D.C. June 13, 2013) In the first important vote on amendments to the Gang of Eight immigration bill, S.744, the United States Senate quickly dismissed any pretense that they intend to deliver on promises of future immigration enforcement, declared the Federation for American Immigration Reform (FAIR). By a 57-43 vote, the Senate tabled an amendment by Sen. Chuck Grassley (R-Iowa) that would have required that the Department of Homeland Security (DHS) demonstrate effective control of U.S. borders for six months before illegal aliens could gain amnesty.
“Today’s vote makes it clear that a majority of senators place a higher priority on granting amnesty to illegal aliens than they do on fulfilling their promises to the American people that our borders will be secured and that our immigration laws will be enforced,” said Dan Stein, president of FAIR. “Tellingly, Gang of Eight member Marco Rubio (R-Fla.), who has repeatedly vowed to oppose the bill if border enforcement provisions are not strengthened, was among the majority of senators who voted to kill the Grassley amendment.”
Majority Leader Harry Reid (D-Nev.) described the amendment as a “poison pill” and used a parliamentary procedure to shut off debate on it. “In the Alice in Wonderland world of the United States Senate, securing our borders and fulfilling promises to the American people, before rewarding illegal aliens, is considered a ‘poison pill,’” observed Stein.
“The vote also undermines whatever credibility Sen. Rubio had left as an honest broker on behalf of the interests of the American people. The fix is in and Rubio is off the fence. The Gang of Eight and the Senate leadership will employ any tactic to prevent amendments that might upset special interest constituencies from supporting the bill,” Stein continued.
“Under this bill there will be no border security. There will be no immigration enforcement. The Gang of Eight bill is about delivering amnesty to illegal aliens and cheap labor to business interests, and nothing else,” Stein concluded.
Sasha Issenberg interviewed at Strata Santa Clara 2013
Sasha Issenberg Discusses His New Book, ‘The Victory Lab’
Sasha Issenberg | The Victory Lab: How Innovation Happens in Electioneering | PDF13
Algorithmic Trading to Algorithmic Campaigning, Behind the Political Scene w/Sasha Issenberg
The Anatomy of an Election: Technology with Sasha Issenberg
The Victory Lab: ‘Moneyball for Politics’ – Sasha Issenberg
A Conversation with Sasha Issenberg
Sasha Issenberg discusses the 2012 Obama campaign
Sasha Issenberg discusses the use of social science experiments in Rick Perry’s 2006 campaign
Sasha Issenberg speaks at NationBuilder
How They Did It: Political Tactics That Helped Obama Win
Can You Replicate the Obama Strategy? | The New School for Public Engagement
Political campaigns have revolutionized the way they target, contact and motivate supporters. Strategists are taking the insights of experimental social science and marrying them to the corporate world’s Big Data marketing tools. The Obama Campaign won in large part by using statistical modeling techniques to identify persuadable voters and to fine-tune persuasive messages. This is politics today and in the future—not only for elections but on issue campaigns for education reform, health care, the environment, labor rights and beyond. Who are the pioneers? And how might you apply their strategies?
Maxine Waters (D) Slip of the Tongue Reveals True Intentions (Socialism for America)
Obama’s secret microtargeting operation
Campaigns admit to data mining
During campaigning, candidates are going to great lengths to find out about residents. Both presidential campaigns admit to tracking everything you do online.
Obama’s win: data mining
How We Used Data to Win the Presidential Election
Dan Siroker, of the Obama Campaign and CarrotSticks, describes how the campaign used data to win the presidential election. He shares the lessons his team learned along the way and how one can apply them to any data-driven decision one needs to make — whether it be in developing, designing, or even marketing.
Strata 2013: Sasha Issenberg, “The Victory Lab”
Political Checklist: Frontline Looks at Digital Campaigns
Frontline: The Digital Factor in Election 2012
Frontline: How Much Do Digital Campaigns Know About You?
Webinar – Political Campaign Fundraising with Aristotle 360
Use Voter Data for a Smart Political Campaign
‘Big Brother’ is watching, in sophisticated digital ways
By Gitte Laasby
Town of Mukwonago voter Priscilla Trulen is used to ignoring political solicitations. For weeks, she’s been receiving three political robocalls per day related to the presidential election. On Thursday, she got seven.
But one call she got on Halloween still haunts her. It was a recorded message read by a presidential candidate trying to get her to vote.
“It was Mitt Romney saying, ‘I know you have an absentee ballot and I know you haven’t sent it in yet,’ ” Trulen said in an interview. “That just sent me over the line. Not only is it like Big Brother. It is Big Brother. It’s down to where they know I have a ballot and I haven’t sent it in! I thought when I requested the ballot that the only other entity that would know was the Mukwonago clerk.”
Trulen isn’t the only voter among Wisconsin’s much-courted electorate who is getting creeped out by the political campaigns’ unprecedented, uncanny ability to micro-target voters who are likely to vote for their candidate.
The solicitations give only a small glimpse into how much digital information the campaigns are able to access about voters.
For years, campaigns have requested the statewide voter registration list, which is subject to public information requests.
The database contains the names and addresses of active voters who are registered and able to vote, as well as inactive voters who are ineligible to vote because they have passed away, moved out of state or committed a felony, or people who need to re-register to be eligible, said Reid Magney, public information officer with the Wisconsin Government Accountability Board.
The list also contains information that the state does not release, for instance people’s birth dates, driver’s license numbers and phone numbers.
“It’s typical for both parties, or individual candidates, to be making public records requests from the clerks. And it’s perfectly legal,” Magney said. “This information is public so there’s transparency in our elections. . . . Except for how you vote, there really are no secrets.”
The state database also contains information on absentee voters. The state’s 1,851 municipalities are required to account for military and overseas absentee ballots both before and after the election, Magney said. Municipalities don’t have to report to the state whether regular absentee ballots such as Trulen’s have been returned until the election is over. However, some municipalities, including the Town of Mukwonago where Trulen lives, report to the state database as they go whether those ballots have been returned. Most likely, that’s how the Republican campaign found out Trulen received an absentee ballot.
“There’s nothing confidential as far as, ‘Did so and so vote?’ ” said Kathy Karalewitz, administrative clerk treasurer with the town. “As far as how they vote, yes.”
Requesters can also request information related to absentee ballots directly from the municipalities, although that’s more cumbersome and labor intensive.
The cost of the entire state database is $12,500. Four requesters have been willing to pay that since Sept. 1, Magney said: Catalist (a progressive voter database organization), the Democratic National Committee, and data analysis firm Aristotle – all based in Washington, D.C. The last requester was Colorado-based Magellan Strategies, a firm that specializes in “micro-targeting” for Republican parties and candidates.
Another 200 requests have been made since Sept. 1 for smaller portions of the database, Magney said.
Crunching the numbers
But what really enables the campaigns to “slice and dice” the electorate down to individual voters is that the voter list is correlated with a slew of other information designed to predict voting behavior and issues that the voter would care about.
In an interview with PBS that aired in October, Aristotle’s chief executive officer, John Phillips, said the company keeps up to 500 data points on each voter – from the type of clothes they buy, the music they listen to, magazines they read and car they own, to whether they are a NASCAR fan, a smoker or a pet owner, or have a gold credit card. Some of that information comes from commercial marketing firms, product registration cards or surveys. Other information is obtained through Facebook, door-to-door canvassing, petitions and computer cookies – small data codes that register which websites the user has visited.
Through data modeling, analyzers can categorize voters based on how they feel about specific issues, values or candidates. They then try to predict voting behavior and figure out which issue ads voters are most likely to be susceptible to – for instance ads on education, gun control or immigration.
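As a sketch of the kind of modeling described above, a crude linear score over consumer data points can flag voters predicted to be receptive to one particular issue ad. All traits, weights, and the cutoff here are invented for illustration; a real vendor's model would be fitted to survey responses over hundreds of data points:

```python
# Toy voter records with a few invented consumer data points
# (a real file would carry up to 500 per voter).
voters = [
    {"id": "WI-0001", "nascar_fan": 1, "pet_owner": 0, "outdoor_magazine": 1},
    {"id": "WI-0002", "nascar_fan": 0, "pet_owner": 1, "outdoor_magazine": 0},
]

# Invented weights standing in for a model fitted to survey data,
# scoring predicted receptiveness to one specific issue ad.
weights = {"nascar_fan": 0.4, "pet_owner": -0.1, "outdoor_magazine": 0.5}

def receptiveness(voter):
    """Linear score: higher means more likely to respond to the ad."""
    return sum(weight * voter[trait] for trait, weight in weights.items())

# Voters scoring above a chosen cutoff receive the tailored message.
targets = [v["id"] for v in voters if receptiveness(v) > 0.5]
```

The point of the exercise is cost efficiency: only the high-scoring slice of the electorate gets the mailer or robocall.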
One of the companies that requested the full Wisconsin voter database, Magellan Strategies, explains on its website that it conducts surveys on people’s opinions and merges that with their political, consumer and census demographics.
“By correlating respondents’ demographics to the demographics of the whole voting district, we can make predictions about the voting preferences of each voter in the district,” the site states.
The company also states why the strategy is so popular.
“Microtargeting enables campaigns to send targeted messages to voters who are very receptive to those messages,” the website states. “Microtargeting allows for the most cost effective voter targeting programs, for voter persuasion or get-out-the-vote.”
According to its website, Magellan has conducted microtargeting since 2008.
A little extra effort is required to determine party affiliation in Wisconsin, which, unlike other states such as California, does not register people to vote by party.
The last piece of the puzzle is the phone number, which is not available through the government, but easily found in a phone book or located in online databases, sometimes free of charge.
Nathan Conrad, a spokesman for the Republican Party of Wisconsin, did not respond to a request for comment on how the campaign obtained Trulen’s digits. Graeme Zielinski, a spokesman for the Democratic Party of Wisconsin, did not respond to requests about how his party obtains phone numbers either.
As for Trulen, she just wishes she could find a way to make the calls stop.
“It’s alarming to me,” she said. “It’s just not right. . . . It’s like you can feel the tentacles creeping into your house under your door.”
The calls to Trulen were likely part of the GOP’s effort to get out the vote in what the party considers one of its strongest counties. Waukesha County is traditionally a Republican stronghold, just as Milwaukee tends to go for Democrats.
The irony is that the robocallers apparently haven’t figured out Trulen is actually a minority in her county: She has been voting Democratic.
Political campaigns can obtain nearly unlimited information about people through commercially available databases. Here’s what information they can, and can’t, learn about you from public records related to voting:

What they can learn:
Your name, address, gender and race
Which elections you voted in, going back to 2000
Whether you have requested an absentee ballot and whether you have sent it in

What they can’t learn:
Whom you voted for
Your date of birth
Your Social Security number, or any part of it
Your driver’s license number
Your phone number (if officials remember to redact it before they release your registration to anyone who asks)
For more on the information that campaigns and others collect on you, watch this video from PBS.
They then use various means of communication—direct mail, phone calls, home visits, television, radio, web advertising, email, text messaging, etc.—to communicate with voters, crafting messages to build support for fundraising, campaign events, volunteering, and eventually to turn them out to the polls on election day. Microtargeting’s tactics rely on transmitting a tailored message to a subgroup of the electorate on the basis of unique information about that subgroup.
Although some of the tactics of microtargeting had been used in California since 1992, it really started to be used nationally only in 2004. In that year, Karl Rove, along with Blaise Hazelwood at the Republican National Committee, used it to reach voters in 18 states that George W. Bush’s reelection campaign was not able to reach by other means. The results were greater contacts with likely Bush voters. For example, in Iowa the campaign was able to reach 92% of eventual Bush voters (compared to 50% in 2000) and in Florida it was able to reach 84% (compared to 50% in 2000). Much of this pioneering work was done by Alex Gage and his firm, TargetPoint Consulting.
Democrats did only limited microtargeting in 2004, with some crediting microtargeting for Kerry’s win in Iowa in 2004. Some news accounts credited Republican superiority in that area for victories in that election cycle. Democrats later developed microtargeting capabilities for the 2006 election cycle. “It’s no secret that the other side [Republicans] figured this out a little sooner”, said Josh Syrjamaki, director of the Minnesota chapter of America Votes in October 2006. “They’ve had four to six years’ jump on us on this stuff…but we feel like we can start to catch up.”
Microtargeting is a modification of a practice used by commercial direct marketers. It would not be possible on a large scale without the development of large and sophisticated databases that contain data about as many voters as possible. The database essentially tracks voter habits in the same ways that companies like Visa track consumer spending habits. The Republican National Committee’s database is called Voter Vault. The Democratic National Committee effort is called VoteBuilder. A parallel Democratic effort is being developed by Catalist, a $9 million initiative headed by Harold Ickes, while the leading non-partisan database is offered by Aristotle.
The databases contain specific information about a particular voter (party affiliation, frequency of voting, contributions, volunteerism, etc.) with other activities and habits available from commercial marketing vendors such as Acxiom, Dun & Bradstreet, Experian Americas, and InfoUSA. Such personal information is a “product” sold to interested companies. These data are particularly illuminating when portrayed through a Geographic Information System (GIS), where trends based on location can be mapped alongside dozens or hundreds of other variables. This geographic depiction also makes it ideal for volunteers to visit potential voters (armed with lists in hand, laid out in the shortest route – much like how FedEx and UPS pre-determine delivery routes).
These databases are then mined to identify issues important to each voter and whether that voter is more likely to identify with one party or another. Political information is obviously important here, but consumer preferences can play a role as well. Individual voters are then put into groups on the basis of sophisticated computer modeling. Such groups have names like “Downscale Union Independents”, “Tax and Terrorism Moderates,” and “Older Suburban Newshounds.”
Once a multitude of voting groups is established according to these criteria and their minute political differences, then the tailored messages can be sent via the appropriate means. While political parties and candidates once prepared a single television advertisement for general broadcast nationwide, it is now not at all uncommon to have several dozen variations on the one message, each with a unique and tailored message for that small demographic sliver of the voting public. This is the same for radio advertisement, direct mail, email, as well as stump speeches and fundraising events.
The actual data mining task is the automatic or semi-automatic analysis of large quantities of data to extract previously unknown interesting patterns such as groups of data records (cluster analysis), unusual records (anomaly detection) and dependencies (association rule mining). This usually involves using database techniques such as spatial indices. These patterns can then be seen as a kind of summary of the input data, and may be used in further analysis or, for example, in machine learning and predictive analytics. For example, the data mining step might identify multiple groups in the data, which can then be used to obtain more accurate prediction results by a decision support system. Neither the data collection, data preparation, nor result interpretation and reporting are part of the data mining step, but they do belong to the overall knowledge discovery in databases (KDD) process as additional steps.
The related terms data dredging, data fishing, and data snooping refer to the use of data mining methods to sample parts of a larger population data set that are (or may be) too small for reliable statistical inferences to be made about the validity of any patterns discovered. These methods can, however, be used in creating new hypotheses to test against the larger data populations.
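The pitfall is easy to reproduce: scanning many small subsamples of pure noise will always turn up "patterns" that do not exist in the full population. A minimal sketch with simulated fair-coin data:

```python
import random

random.seed(4)
# 10,000 fair coin flips: by construction there is no real pattern.
population = [random.randint(0, 1) for _ in range(10_000)]

# "Data dredging": scan 1,000 tiny subsamples until something looks odd.
suspicious_starts = []
for start in range(0, 10_000, 10):
    sample = population[start:start + 10]
    if sum(sample) >= 9:          # 90%+ heads in a 10-flip sample
        suspicious_starts.append(start)

# A handful of such runs appear by pure chance, even though the
# full data set's mean stays close to the fair value of 0.5.
print(len(suspicious_starts), sum(population) / len(population))
```

Each "discovery" here would survive a hypothesis test run on its own tiny sample, which is exactly why dredged patterns must be re-tested against the larger population.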
Data mining uses information from past data to analyze the outcome of a particular problem or situation that may arise. It works on data stored in data warehouses, which may come from all parts of a business, from production to management. Managers also use data mining to decide on marketing strategies for their products and to compare and contrast themselves with competitors. Data mining turns stored records into real-time analysis that can be used to increase sales, promote new products, or drop products that add no value to the company.
In the 1960s, statisticians used terms like “data fishing” or “data dredging” to refer to what they considered the bad practice of analyzing data without an a priori hypothesis. The term “data mining” appeared around 1990 in the database community. At the beginning of the century there was a phrase, “database mining”™, trademarked by HNC, a San Diego-based company (now merged into FICO), to pitch their Data Mining Workstation; researchers consequently turned to “data mining”. Other terms used include data archaeology, information harvesting, information discovery, and knowledge extraction. Gregory Piatetsky-Shapiro coined the term “knowledge discovery in databases” for the first workshop on the topic (1989), and this term became more popular in the AI and machine learning community, while “data mining” became more popular in the business and press communities. Currently, data mining and knowledge discovery are used interchangeably.
The manual extraction of patterns from data has occurred for centuries. Early methods of identifying patterns in data include Bayes’ theorem (1700s) and regression analysis (1800s). The proliferation, ubiquity and increasing power of computer technology have dramatically increased data collection, storage, and manipulation ability. As data sets have grown in size and complexity, direct “hands-on” data analysis has increasingly been augmented with indirect, automated data processing, aided by other discoveries in computer science, such as neural networks, cluster analysis, genetic algorithms (1950s), decision trees (1960s), and support vector machines (1990s). Data mining is the process of applying these methods with the intention of uncovering hidden patterns in large data sets. It bridges the gap from applied statistics and artificial intelligence (which usually provide the mathematical background) to database management by exploiting the way data is stored and indexed in databases to execute the actual learning and discovery algorithms more efficiently, allowing such methods to be applied to ever larger data sets.
The knowledge discovery in databases (KDD) process is commonly defined with the stages (1) selection, (2) pre-processing, (3) transformation, (4) data mining, and (5) interpretation/evaluation, or a simplified process such as (1) pre-processing, (2) data mining, and (3) results validation.
Polls conducted in 2002, 2004, and 2007 show that the CRISP-DM methodology is the leading methodology used by data miners. The only other data mining standard named in these polls was SEMMA. However, 3-4 times as many people reported using CRISP-DM. Several teams of researchers have published reviews of data mining process models, and Azevedo and Santos conducted a comparison of CRISP-DM and SEMMA in 2008.
Before data mining algorithms can be used, a target data set must be assembled. As data mining can only uncover patterns actually present in the data, the target data set must be large enough to contain these patterns while remaining concise enough to be mined within an acceptable time limit. A common source for data is a data mart or data warehouse. Pre-processing is essential to analyze the multivariate data sets before data mining. The target set is then cleaned. Data cleaning removes the observations containing noise and those with missing data.
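A minimal sketch of such a cleaning pass on a toy target set. The field names and the income cutoff used to flag noise are invented; real cleaning rules depend on the domain:

```python
# Toy target set; one record has a missing value, one an implausible
# outlier (the 500,000 cutoff is an invented noise threshold).
raw = [
    {"age": 34, "income": 52_000},
    {"age": None, "income": 48_000},    # missing value -> dropped
    {"age": 29, "income": 9_999_999},   # noise/outlier -> dropped
    {"age": 41, "income": 61_000},
]

# Keep only complete records whose values pass the plausibility check.
cleaned = [
    record for record in raw
    if all(value is not None for value in record.values())
    and record["income"] < 500_000
]
# cleaned now holds only the two complete, plausible records
```

Dropping records is the bluntest option; depending on the task, missing values can instead be imputed from the rest of the data.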
Data mining involves six common classes of tasks:
Anomaly detection (Outlier/change/deviation detection) – The identification of unusual data records that might be interesting, or of data errors that require further investigation.
Association rule learning (Dependency modeling) – The search for relationships between variables. For example, a supermarket might gather data on customer purchasing habits. Using association rule learning, the supermarket can determine which products are frequently bought together and use this information for marketing purposes. This is sometimes referred to as market basket analysis.
Clustering – The task of discovering groups and structures in the data that are in some way or another “similar”, without using known structures in the data.
Classification – The task of generalizing known structure to apply to new data. For example, an e-mail program might attempt to classify an e-mail as “legitimate” or as “spam”.
Regression – The attempt to find a function which models the data with the least error.
Summarization – The provision of a more compact representation of the data set, including visualization and report generation.
Sequential pattern mining – The discovery of sets of data items that occur together frequently in some sequences. Sequential pattern mining, which extracts frequent subsequences from a sequence database, has attracted a great deal of interest in recent data mining research because it is the basis of many applications, such as web user analysis, stock trend prediction, DNA sequence analysis, finding language or linguistic patterns in natural language texts, and using the history of symptoms to predict certain kinds of disease.
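The association rule (market basket) task above can be sketched by counting how often each pair of products appears together across baskets; a pair with high co-occurrence count ("support") becomes a candidate rule. The baskets below are invented toy data, and real systems use more scalable algorithms such as Apriori or FP-growth:

```python
from collections import Counter
from itertools import combinations

# Toy supermarket baskets (invented data).
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"beer", "chips"},
    {"bread", "milk"},
    {"beer", "chips", "bread"},
]

# Count the support of every product pair across all baskets.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequently co-purchased pair is a candidate association rule.
top_pair, support = pair_counts.most_common(1)[0]
```

A marketer would then turn high-support pairs into rules like "customers who buy bread also buy butter" and act on them with shelf placement or coupons.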
The final step of knowledge discovery from data is to verify that the patterns produced by the data mining algorithms occur in the wider data set. Not all patterns found by the data mining algorithms are necessarily valid. It is common for the data mining algorithms to find patterns in the training set which are not present in the general data set. This is called overfitting. To overcome this, the evaluation uses a test set of data on which the data mining algorithm was not trained. The learned patterns are applied to this test set and the resulting output is compared to the desired output. For example, a data mining algorithm trying to distinguish “spam” from “legitimate” emails would be trained on a training set of sample e-mails. Once trained, the learned patterns would be applied to the test set of e-mails on which it had not been trained. The accuracy of the patterns can then be measured from how many e-mails they correctly classify. A number of statistical methods may be used to evaluate the algorithm, such as ROC curves.
If the learned patterns do not meet the desired standards, then it is necessary to re-evaluate and change the pre-processing and data mining steps. If the learned patterns do meet the desired standards, then the final step is to interpret the learned patterns and turn them into knowledge.
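A minimal sketch of this held-out evaluation, using the spam example with invented toy data and a deliberately naive keyword rule. The rule fits the training set perfectly, but a word like "now" that appeared only coincidentally in training spam misclassifies a held-out message, so measured test accuracy falls below training accuracy, which is the overfitting effect described above:

```python
# Toy labeled messages: 1 = spam, 0 = legitimate (all data invented).
train = [
    ("win money now", 1), ("cheap money offer", 1),
    ("lunch at noon", 0), ("project status update", 0),
]
test = [
    ("free money inside", 1), ("status of the project", 0),
    ("win a prize", 1), ("noon meeting moved", 0),
    ("now boarding at gate", 0),   # "now" fools the overfit rule
]

# "Train": treat any word seen in training spam but never in training
# ham as a spam indicator.
spam_words, ham_words = set(), set()
for text, label in train:
    (spam_words if label else ham_words).update(text.split())
spam_words -= ham_words

def predict(text):
    """Flag a message as spam if it contains any learned spam word."""
    return int(any(word in spam_words for word in text.split()))

# Perfect on the training set, worse on data it was not trained on:
# the pattern "now" does not generalize.
train_acc = sum(predict(t) == y for t, y in train) / len(train)
test_acc = sum(predict(t) == y for t, y in test) / len(test)
```

The gap between the two accuracies is what the held-out test set exists to expose; ROC curves and similar statistics refine the same comparison.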
There have been some efforts to define standards for the data mining process, for example the 1999 European Cross Industry Standard Process for Data Mining (CRISP-DM 1.0) and the 2004 Java Data Mining standard (JDM 1.0). Development on successors to these processes (CRISP-DM 2.0 and JDM 2.0) was active in 2006, but has stalled since. JDM 2.0 was withdrawn without reaching a final draft.
For exchanging the extracted models – in particular for use in predictive analytics – the key standard is the Predictive Model Markup Language (PMML), which is an XML-based language developed by the Data Mining Group (DMG) and supported as exchange format by many data mining applications. As the name suggests, it only covers prediction models, a particular data mining task of high importance to business applications. However, extensions to cover (for example) subspace clustering have been proposed independently of the DMG.
Since the early 1960s, with the availability of oracles for certain combinatorial games – also called tablebases – for any beginning configuration of small games such as 3×3 chess, small-board dots-and-boxes, and small-board hex, and for certain endgames in chess, dots-and-boxes, and hex, a new area for data mining has been opened: the extraction of human-usable strategies from these oracles. Current pattern recognition approaches do not seem to fully acquire the high level of abstraction required to be applied successfully. Instead, extensive experimentation with the tablebases – combined with an intensive study of tablebase answers to well-designed problems, and with knowledge of prior art (i.e. pre-tablebase knowledge) – is used to yield insightful patterns. Berlekamp (in dots-and-boxes, etc.) and John Nunn (in chess endgames) are notable examples of researchers doing this work, though they were not – and are not – involved in tablebase generation.
Data mining is the analysis of historical business activities, stored as static data in data warehouse databases, to reveal hidden patterns and trends. Data mining software uses advanced pattern recognition algorithms to sift through large amounts of data to assist in discovering previously unknown strategic business information. Examples of what businesses use data mining for include performing market analysis to identify new product bundles, finding the root cause of manufacturing problems, preventing customer attrition, acquiring new customers, cross-selling to existing customers, and profiling customers with more accuracy. In today’s world, raw data is being collected by companies at an exploding rate. For example, Walmart processes over 20 million point-of-sale transactions every day. This information is stored in a centralized database, but would be useless without some type of data mining software to analyze it. If Walmart analyzed its point-of-sale data with data mining techniques, it would be able to determine sales trends, develop marketing campaigns, and more accurately predict customer loyalty. Every time we use a credit card or a store loyalty card, or fill out a warranty card, data is being collected about our purchasing behavior. Many people find the amount of information that companies such as Google, Facebook, and Amazon store about us disturbing, and are concerned about privacy. Although there is the potential for our personal data to be used in harmful or unwanted ways, it is also being used to make our lives better. For example, Ford and Audi hope to one day collect information about customer driving patterns so they can recommend safer routes and warn drivers about dangerous road conditions.
Data mining in customer relationship management applications can contribute significantly to the bottom line. Rather than randomly contacting a prospect or customer through a call center or sending mail, a company can concentrate its efforts on prospects that are predicted to have a high likelihood of responding to an offer. More sophisticated methods may be used to optimize resources across campaigns so that one may predict to which channel and to which offer an individual is most likely to respond (across all potential offers). Additionally, sophisticated applications could be used to automate mailing. Once the results from data mining (potential prospect/customer and channel/offer) are determined, this “sophisticated application” can either automatically send an e-mail or a regular mail. Finally, in cases where many people will take an action without an offer, “uplift modeling” can be used to determine which people have the greatest increase in response if given an offer. Uplift modeling thereby enables marketers to focus mailings and offers on persuadable people, and not to send offers to people who will buy the product without an offer. Data clustering can also be used to automatically discover the segments or groups within a customer data set.
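As a rough illustration of the uplift idea, the sketch below compares response rates between customers who received an offer and a control group who did not, per segment. The segment names and all figures are hypothetical:

```python
# Sketch of uplift modeling: for each customer segment, compare the
# response rate of customers who received an offer (treatment) with
# those who did not (control). All figures below are invented.

def uplift(treated_responses, treated_total, control_responses, control_total):
    """Increase in response rate attributable to the offer."""
    return treated_responses / treated_total - control_responses / control_total

# segment -> (responses with offer, offers sent, responses without, control size)
segments = {
    "persuadable": (300, 1000, 50, 1000),   # big lift: target these
    "sure_thing":  (900, 1000, 880, 1000),  # would buy anyway: skip
}

for name, (tr, tt, cr, ct) in segments.items():
    print(name, round(uplift(tr, tt, cr, ct), 2))
# persuadable 0.25, sure_thing 0.02 -> mail only the persuadable segment
```

The "sure thing" segment shows almost no uplift, matching the text's point that offers are wasted on people who will buy without one.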
Businesses employing data mining may see a return on investment, but they also recognize that the number of predictive models can quickly become very large. Rather than using one model to predict how many customers will churn, a business could build a separate model for each region and customer type. Then, instead of sending an offer to all people that are likely to churn, it may only want to send offers to loyal customers. Finally, the business may want to determine which customers are going to be profitable over a certain window in time, and only send the offers to those that are likely to be profitable. To maintain this quantity of models, businesses need to manage model versions and move on to automated data mining.
Data mining can also be helpful to human resources (HR) departments in identifying the characteristics of their most successful employees. Information obtained – such as universities attended by highly successful employees – can help HR focus recruiting efforts accordingly. Additionally, Strategic Enterprise Management applications help a company translate corporate-level goals, such as profit and margin share targets, into operational decisions, such as production plans and workforce levels.
Another example of data mining, often called market basket analysis, relates to its use in retail sales. If a clothing store records the purchases of customers, a data mining system could identify those customers who favor silk shirts over cotton ones. Although explaining such relationships may be difficult, taking advantage of them is easier. The example deals with association rules within transaction-based data. Not all data are transaction based, and logical or inexact rules may also be present within a database.
Market basket analysis has also been used to identify the purchase patterns of the Alpha Consumer. Alpha Consumers are people who play a key role in connecting with the concept behind a product, then adopting that product, and finally validating it for the rest of society. Analyzing the data collected on this type of user has allowed companies to predict future buying trends and forecast supply demands.
Data mining is a highly effective tool in the catalog marketing industry. Catalogers have a rich database of history of their customer transactions for millions of customers dating back a number of years. Data mining tools can identify patterns among customers and help identify the most likely customers to respond to upcoming mailing campaigns.
Data mining for business applications is a component that needs to be integrated into a complex modeling and decision making process. Reactive business intelligence (RBI) advocates a “holistic” approach that integrates data mining, modeling, and interactive visualization into an end-to-end discovery and continuous innovation process powered by human and automated learning.
In the area of decision making, the RBI approach has been used to mine knowledge that is progressively acquired from the decision maker, and then self-tune the decision method accordingly.
An example of data mining related to an integrated-circuit (IC) production line is described in the paper “Mining IC Test Data to Optimize VLSI Testing.” In this paper, the application of data mining and decision analysis to the problem of die-level functional testing is described. Experiments mentioned demonstrate the ability to apply a system of mining historical die-test data to create a probabilistic model of patterns of die failure. These patterns are then utilized to decide, in real time, which die to test next and when to stop testing. This system has been shown, based on experiments with historical test data, to have the potential to improve profits on mature IC products.
In the study of human genetics, sequence mining helps address the important goal of understanding the mapping relationship between the inter-individual variations in human DNA sequence and the variability in disease susceptibility. In simple terms, it aims to find out how changes in an individual’s DNA sequence affect the risks of developing common diseases such as cancer, which is of great importance to improving methods of diagnosing, preventing, and treating these diseases. The data mining method that is used to perform this task is known as multifactor dimensionality reduction.
In the area of electrical power engineering, data mining methods have been widely used for condition monitoring of high voltage electrical equipment. The purpose of condition monitoring is to obtain valuable information on, for example, the status of the insulation (or other important safety-related parameters). Data clustering techniques – such as the self-organizing map (SOM) – have been applied to vibration monitoring and analysis of transformer on-load tap-changers (OLTCs). Using vibration monitoring, it can be observed that each tap change operation generates a signal that contains information about the condition of the tap changer contacts and the drive mechanisms. Obviously, different tap positions will generate different signals. However, there was considerable variability amongst normal condition signals for exactly the same tap position. SOM has been applied to detect abnormal conditions and to hypothesize about the nature of the abnormalities.
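A minimal sketch of the SOM idea follows, using one-dimensional synthetic "signal features" rather than real vibration data; production systems use multi-dimensional feature vectors and a 2-D node grid, so this is only the skeleton of the technique:

```python
import random

# Minimal 1-D self-organizing map (SOM) sketch. "Signals" are reduced
# to single synthetic feature values; a real condition-monitoring
# system would use multi-dimensional features from vibration signals.

def train_som(data, n_nodes=5, epochs=50, lr=0.3, radius=1):
    random.seed(0)
    weights = [random.uniform(min(data), max(data)) for _ in range(n_nodes)]
    for _ in range(epochs):
        for x in data:
            # best-matching unit (BMU): node whose weight is closest to x
            bmu = min(range(n_nodes), key=lambda i: abs(weights[i] - x))
            # pull the BMU and its neighbours towards the sample
            for i in range(n_nodes):
                if abs(i - bmu) <= radius:
                    weights[i] += lr * (x - weights[i])
    return weights

def anomaly_score(x, weights):
    """Distance to the best-matching unit; a large value means the
    signal is unlike any condition seen during training."""
    return min(abs(w - x) for w in weights)

normal = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]   # two normal operating modes
weights = train_som(normal)
print(anomaly_score(1.05, weights))  # small: near a learned mode
print(anomaly_score(9.0, weights))   # large: abnormal condition
```

Signals that land far from every learned node are flagged as abnormal, which is the detection role the text describes for SOM.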
Data mining methods have also been applied to dissolved gas analysis (DGA) in power transformers. DGA, as a diagnostic for power transformers, has been available for many years. Methods such as SOM have been applied to analyze the generated data and to determine trends which are not obvious to the standard DGA ratio methods (such as the Duval triangle).
Another example of data mining in science and engineering is found in educational research, where data mining has been used to study the factors leading students to choose to engage in behaviors which reduce their learning, and to understand factors influencing university student retention. A similar example of social application of data mining is its use in expertise finding systems, whereby descriptors of human expertise are extracted, normalized, and classified so as to facilitate the finding of experts, particularly in scientific and technical fields. In this way, data mining can facilitate institutional memory.
In adverse drug reaction surveillance, the Uppsala Monitoring Centre has, since 1998, used data mining methods to routinely screen for reporting patterns indicative of emerging drug safety issues in the WHO global database of 4.6 million suspected adverse drug reaction incidents. Recently, similar methodology has been developed to mine large collections of electronic health records for temporal patterns associating drug prescriptions to medical diagnoses.
Data mining of government records – particularly records of the justice system (i.e. courts, prisons) – enables the discovery of systemic human rights violations in connection to generation and publication of invalid or fraudulent legal records by various government agencies.
Spatial data mining is the application of data mining methods to spatial data. The end objective of spatial data mining is to find patterns in data with respect to geography. So far, data mining and Geographic Information Systems (GIS) have existed as two separate technologies, each with its own methods, traditions, and approaches to visualization and data analysis. Particularly, most contemporary GIS have only very basic spatial analysis functionality. The immense explosion in geographically referenced data occasioned by developments in IT, digital mapping, remote sensing, and the global diffusion of GIS emphasizes the importance of developing data-driven inductive approaches to geographical analysis and modeling.
Data mining offers great potential benefits for GIS-based applied decision-making. Recently, the task of integrating these two technologies has become of critical importance, especially as various public and private sector organizations possessing huge databases with thematic and geographically referenced data begin to realize the huge potential of the information contained therein. Among those organizations are:
offices requiring analysis or dissemination of geo-referenced statistical data
public health services searching for explanations of disease clustering
environmental agencies assessing the impact of changing land-use patterns on climate change
geo-marketing companies doing customer segmentation based on spatial location.
Challenges in spatial data mining
Geospatial data repositories tend to be very large. Moreover, existing GIS datasets are often splintered into feature and attribute components that are conventionally archived in hybrid data management systems. Algorithmic requirements differ substantially for relational (attribute) data management and for topological (feature) data management. Related to this is the range and diversity of geographic data formats, which present unique challenges. The digital geographic data revolution is creating new types of data formats beyond the traditional “vector” and “raster” formats. Geographic data repositories increasingly include ill-structured data, such as imagery and geo-referenced multi-media.
There are several critical research challenges in geographic knowledge discovery and data mining. Miller and Han offer the following list of emerging research topics in the field:
Developing and supporting geographic data warehouses (GDWs): Spatial properties are often reduced to simple aspatial attributes in mainstream data warehouses. Creating an integrated GDW requires solving issues of spatial and temporal data interoperability – including differences in semantics, referencing systems, geometry, accuracy, and position.
Better spatio-temporal representations in geographic knowledge discovery: Current geographic knowledge discovery (GKD) methods generally use very simple representations of geographic objects and spatial relationships. Geographic data mining methods should recognize more complex geographic objects (i.e. lines and polygons) and relationships (i.e. non-Euclidean distances, direction, connectivity, and interaction through attributed geographic space such as terrain). Furthermore, the time dimension needs to be more fully integrated into these geographic representations and relationships.
Geographic knowledge discovery using diverse data types: GKD methods should be developed that can handle diverse data types beyond the traditional raster and vector models, including imagery and geo-referenced multimedia, as well as dynamic data types (video streams, animation).
Sensor data mining
Wireless sensor networks can be used for facilitating the collection of data for spatial data mining for a variety of applications such as air pollution monitoring. A characteristic of such networks is that nearby sensor nodes monitoring an environmental feature typically register similar values. This kind of data redundancy, due to the spatial correlation between sensor observations, inspires techniques for in-network data aggregation and mining. By measuring the spatial correlation between data sampled by different sensors, a wide class of specialized algorithms can be developed for more efficient spatial data mining.
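The redundancy test described above can be sketched as a correlation check between neighboring nodes; the sensor readings and the 0.95 threshold below are made up for illustration:

```python
import math

# Sketch of exploiting spatial correlation in a sensor network:
# if two nearby nodes report strongly correlated readings, one of
# them can be aggregated away in-network. All readings are synthetic.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

node_a = [10.0, 10.5, 11.0, 11.4, 12.0]   # pollution readings, node A
node_b = [10.1, 10.6, 10.9, 11.5, 12.1]   # nearby node B: redundant
node_c = [3.0, 7.5, 2.0, 9.0, 4.0]        # distant node C: independent

REDUNDANT = 0.95  # hypothetical correlation threshold
print(pearson(node_a, node_b) > REDUNDANT)  # True: aggregate B into A
print(pearson(node_a, node_c) > REDUNDANT)  # False: keep C's readings
```

Nodes whose streams clear the threshold can be merged before transmission, saving the energy and bandwidth that in-network aggregation targets.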
Visual data mining
In the process of turning from analog into digital, large data sets have been generated, collected, and stored. Visual data mining aims to discover the statistical patterns, trends, and information hidden in these data in order to build predictive models. Studies suggest visual data mining is faster and much more intuitive than traditional data mining. See also computer vision.
Music data mining
Data mining techniques, and in particular co-occurrence analysis, have been used to discover relevant similarities among music corpora (radio lists, CD databases) for the purpose of classifying music into genres in a more objective manner.
Data mining has been used by the U.S. government in programs intended to stop terrorism, including the Total Information Awareness (TIA) program, Secure Flight (formerly known as the Computer-Assisted Passenger Prescreening System, CAPPS II), Analysis, Dissemination, Visualization, Insight, Semantic Enhancement (ADVISE), and the Multistate Anti-Terrorism Information Exchange (MATRIX). These programs have been discontinued due to controversy over whether they violate the Fourth Amendment to the United States Constitution, although many programs that were formed under them continue to be funded by different organizations or under different names.
In the context of combating terrorism, two particularly plausible methods of data mining are “pattern mining” and “subject-based data mining”.
“Pattern mining” is a data mining method that involves finding existing patterns in data. In this context patterns often means association rules. The original motivation for searching association rules came from the desire to analyze supermarket transaction data, that is, to examine customer behavior in terms of the purchased products. For example, an association rule “beer ⇒ potato chips (80%)” states that four out of five customers that bought beer also bought potato chips.
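The confidence figure in the beer example can be computed directly from transaction data; the baskets below are invented so that the rule comes out at exactly 80%:

```python
# Sketch of measuring an association rule from transaction data.
# The "beer => potato chips (80%)" figure in the text is a confidence
# value: of the baskets containing beer, the fraction that also
# contain potato chips. The baskets below are made up.

def confidence(transactions, antecedent, consequent):
    """Of the baskets containing the antecedent, the fraction that
    also contain the consequent."""
    with_ante = [t for t in transactions if antecedent in t]
    with_both = [t for t in with_ante if consequent in t]
    return len(with_both) / len(with_ante)

baskets = [
    {"beer", "potato chips"},
    {"beer", "potato chips", "salsa"},
    {"beer", "potato chips"},
    {"beer", "bread"},
    {"beer", "potato chips", "milk"},
    {"milk", "bread"},
]

# 4 of the 5 beer baskets also contain potato chips -> 0.8
print(confidence(baskets, "beer", "potato chips"))  # 0.8
```

Real association rule miners (e.g. Apriori) search for all rules whose support and confidence clear chosen thresholds rather than scoring a single known rule.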
In the context of pattern mining as a tool to identify terrorist activity, the National Research Council provides the following definition: “Pattern-based data mining looks for patterns (including anomalous data patterns) that might be associated with terrorist activity — these patterns might be regarded as small signals in a large ocean of noise.” Pattern mining includes new areas such as Music Information Retrieval (MIR), where patterns seen in both the temporal and non-temporal domains are imported to classical knowledge discovery search methods.
Subject-based data mining
“Subject-based data mining” is a data mining method involving the search for associations between individuals in data. In the context of combating terrorism, the National Research Council provides the following definition: “Subject-based data mining uses an initiating individual or other datum that is considered, based on other information, to be of high interest, and the goal is to determine what other persons or financial transactions or movements, etc., are related to that initiating datum.”
Knowledge discovery “On the Grid” generally refers to conducting knowledge discovery in an open environment using grid computing concepts, allowing users to integrate data from various online data sources, as well as make use of remote resources, for executing their data mining tasks. The earliest example was the Discovery Net, developed at Imperial College London, which won the “Most Innovative Data-Intensive Application Award” at the ACM SC02 (Supercomputing 2002) conference and exhibition, based on a demonstration of a fully interactive distributed knowledge discovery application for bioinformatics. Other examples include work conducted by researchers at the University of Calabria, who developed a Knowledge Grid architecture for distributed knowledge discovery, based on grid computing.
Reliability / Validity
Data mining can be misused, and can also unintentionally produce results which appear significant but which do not actually predict future behavior and cannot be reproduced on a new sample of data. See also data snooping and data dredging.
Privacy concerns and ethics
Some people believe that data mining itself is ethically neutral. While the term “data mining” has no ethical implications in itself, it is often associated with the mining of information about people’s behavior (ethical and otherwise). To be precise, data mining is a statistical method that is applied to a set of information (i.e. a data set). Associating these data sets with people is an extreme narrowing of the types of data that are available: examples range from a set of crash test data for passenger vehicles to the performance of a group of stocks. These types of data sets make up a great proportion of the information available to be acted on by data mining methods, and rarely have ethical concerns associated with them. However, the ways in which data mining can be used can, in some cases and contexts, raise questions regarding privacy, legality, and ethics. In particular, data mining government or commercial data sets for national security or law enforcement purposes, such as in the Total Information Awareness Program or in ADVISE, has raised privacy concerns.
Data mining requires data preparation which can uncover information or patterns which may compromise confidentiality and privacy obligations. A common way for this to occur is through data aggregation. Data aggregation involves combining data together (possibly from various sources) in a way that facilitates analysis (but that also might make identification of private, individual-level data deducible or otherwise apparent). This is not data mining per se, but a result of the preparation of data before – and for the purposes of – the analysis. The threat to an individual’s privacy comes into play when the data, once compiled, cause the data miner, or anyone who has access to the newly compiled data set, to be able to identify specific individuals, especially when the data were originally anonymous.
It is recommended that an individual is made aware of the following before data are collected:
the purpose of the data collection and any (known) data mining projects
how the data will be used
who will be able to mine the data and use the data and their derivatives
the status of security surrounding access to the data
how collected data can be updated.
In America, privacy concerns have been addressed to some extent by the US Congress via the passage of regulatory controls such as the Health Insurance Portability and Accountability Act (HIPAA). HIPAA requires individuals to give their “informed consent” regarding information they provide and its intended present and future uses. According to an article in Biotech Business Week, “‘[i]n practice, HIPAA may not offer any greater protection than the longstanding regulations in the research arena,’ says the AAHC. More importantly, the rule’s goal of protection through informed consent is undermined by the complexity of consent forms that are required of patients and participants, which approach a level of incomprehensibility to average individuals.” This underscores the necessity for data anonymity in data aggregation and mining practices.
Data may also be modified so as to become anonymous, so that individuals may not readily be identified. However, even “de-identified”/”anonymized” data sets can potentially contain enough information to allow identification of individuals, as occurred when journalists were able to find several individuals based on a set of search histories that were inadvertently released by AOL.
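A minimal sketch of how such re-identification works: joining a “de-identified” data set with a public one on quasi-identifiers (here ZIP code, birth date, and sex, the combination used in well-known linkage attacks). All records below are fictional:

```python
# Sketch of re-identification by linkage: an "anonymized" data set is
# joined with a public data set on quasi-identifiers. Every record,
# name, and field value below is fictional.

# Anonymized medical records: names removed, quasi-identifiers kept.
anonymized = [
    {"zip": "02138", "birth": "1960-07-31", "sex": "F", "diagnosis": "flu"},
    {"zip": "02139", "birth": "1975-01-02", "sex": "M", "diagnosis": "asthma"},
]

# Public voter roll: names attached to the same quasi-identifiers.
voters = [
    {"name": "Alice Example", "zip": "02138", "birth": "1960-07-31", "sex": "F"},
    {"name": "Bob Sample", "zip": "02139", "birth": "1975-01-02", "sex": "M"},
]

def reidentify(anon_rows, public_rows, keys=("zip", "birth", "sex")):
    """Link records whose quasi-identifiers match exactly."""
    index = {tuple(r[k] for k in keys): r["name"] for r in public_rows}
    return {index[tuple(r[k] for k in keys)]: r["diagnosis"]
            for r in anon_rows if tuple(r[k] for k in keys) in index}

print(reidentify(anonymized, voters))
# {'Alice Example': 'flu', 'Bob Sample': 'asthma'}
```

Defenses such as k-anonymity work by coarsening the quasi-identifiers (e.g. truncating ZIP codes) until this join no longer yields unique matches.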
KNIME: The Konstanz Information Miner, a user-friendly and comprehensive data analytics framework.
ML-Flex: A software package that enables users to integrate with third-party machine-learning packages written in any programming language, execute classification analyses in parallel across multiple computing nodes, and produce HTML reports of classification results.
Holsys One: Tool for the analysis of complex systems (sensors network, industrial plant) based on a reinterpretation of the IF-THEN clause in the sense of the theory of holons.
Several researchers and organizations have conducted reviews of data mining tools and surveys of data miners. These identify some of the strengths and weaknesses of the software packages. They also provide an overview of the behaviors, preferences and views of data miners. Some of these reports include:
2011 Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
~United States Constitution, Amendment IV
“He who controls the past controls the future. He who controls the present controls the past.”
“Now I will tell you the answer to my question. It is this. The Party seeks power entirely for its own sake. We are not interested in the good of others; we are interested solely in power, pure power. What pure power means you will understand presently. We are different from the oligarchies of the past in that we know what we are doing. All the others, even those who resembled ourselves, were cowards and hypocrites. The German Nazis and the Russian Communists came very close to us in their methods, but they never had the courage to recognize their own motives. They pretended, perhaps they even believed, that they had seized power unwillingly and for a limited time, and that just around the corner there lay a paradise where human beings would be free and equal. We are not like that. We know that no one ever seizes power with the intention of relinquishing it. Power is not a means; it is an end. One does not establish a dictatorship in order to safeguard a revolution; one makes the revolution in order to establish the dictatorship. The object of persecution is persecution. The object of torture is torture. The object of power is power. Now you begin to understand me.”
“Big Brother is Watching You.”
~George Orwell’s 1984
POWER IS NOT A MEANS, IT IS AN END
Maxine Waters Confirms Big Brother Database 2013 Foretells NSA Phone & Internet Spying
Glenn Beck: Govt Storing Citizen Cellphone & Internet Activity
Digital Blackwater: How the NSA Gives Private Contractors Control of the Surveillance State
Glenn Greenwald on How NSA Leaker Edward Snowden Helped Expose a “Massive Surveillance Apparatus”
What You Should Know About The New NSA Utah Data Center
Is Edward Snowden a Hero? A Debate With Journalist Chris Hedges & Law Scholar Geoffrey Stone
Spying On Americans By NSA Prism Collection Details – Rand Paul On Hannity
NSA is Spying on EVERYTHING you do. Phone calls and internet activity are being stored and monitored.
PRISM: Why the NSA is Mining Internet Data
Total Surveillance : N.S.A. data mining all computers, phone calls, internet, emails
CNET Update – Uproar over PRISM government surveillance
NSA Caught Spying on Americans’ Internet Use
Columnist exposes Obama surveillance
Meet Edward Snowden: NSA PRISM Whistleblower
Sky News interview w/ Julian Assange and JP Barlow RE: Prism and Edward Snowden
Complete News – Snowden leaks show NSA ‘routinely lies’ to Congress
Judge Napolitano On NSA Spying: Most Extraordinarily Broad Search Warrant Ever Issued In US History
Rand Paul Discusses The NSA’s Violation Of The Bill Of Rights On Yahoo News (6-6-13)
Rand Paul On NSA Spying: ‘I’m Going To Challenge This At The Supreme Court’ -
Ron Paul: NSA Seizing Phone Records Symptom of Failure of The State
NSA Constitutional Violations? – Judge Andrew Napolitano – Geraldo
Clever Denials Surrounding the NSA PRISM Piracy Scandal
Peter Eckersley from the Electronic Frontier Foundation stopped by to explain why Silicon Valley’s top tech companies are dancing around PRISM allegations. Interview recorded Friday June 7, 2013
NSA Surveillance – Does Obama Have ANY Credibility Left?
“In his remarks today defending the NSA programs gathering telephone records and mining Internet companies, Obama sounded a familiar refrain, saying he welcomes the “debate” over the proper balance between civil liberties and national security.”*
Obama gave a speech in defense of recently uncovered secret programs to wiretap and data-mine U.S. citizens almost indiscriminately, and Congress agrees. Do you believe his remarks that we NEED these programs? Would Obama agree with himself campaigning about his stance on civil rights? Cenk Uygur, Ben Mankiewicz, and John Iadarola discuss.
How PRISM Easily Gives Your Private Data Over to Big Brother
“The National Security Agency has obtained direct access to the systems of Google, Facebook, Apple and other US internet giants, according to a top secret document obtained by the Guardian.
The NSA access is part of a previously undisclosed program called Prism, which allows officials to collect material including search history, the content of emails, file transfers and live chats, the document says.”*
We’ve been assured by the president that the NSA’s PRISM program won’t affect “ordinary” U.S. citizens, but what is the criteria for deciding who gets their data mined and monitored? Cenk Uygur, Ben Mankiewicz, and John Iadarola (Host, TYT University) discuss the egregious reach of the Obama administration’s secret mass surveillance program.
The federal government is launching an expansive program dubbed “Perfect Citizen” to detect cyber assaults on private companies and government agencies running such critical infrastructure as the electricity grid and nuclear-power plants, according to people familiar with the program. The surveillance by the National Security Agency, the government’s chief eavesdropping agency, would rely on a set of sensors deployed in computer networks for critical infrastructure that would be triggered by unusual activity suggesting an impending cyber attack, though it wouldn’t persistently monitor the whole system, these people said.
Defense contractor Raytheon Corp. recently won a classified contract for the initial phase of the surveillance effort valued at up to $100 million, said a person familiar with the project.
An NSA spokeswoman said the agency had no information to provide on the program. A Raytheon spokesman declined to comment.
Some industry and government officials familiar with the program see Perfect Citizen as an intrusion by the NSA into domestic affairs, while others say it is an important program to combat an emerging security threat that only the NSA is equipped to provide.
“The overall purpose of the [program] is our Government…feel[s] that they need to insure the Public Sector is doing all they can to secure Infrastructure critical to our National Security,” said one internal Raytheon email, the text of which was seen by The Wall Street Journal. “Perfect Citizen is Big Brother.”
Glenn Beck’s “SURVEILLANCE STATE” (Must Viewing)
NSA spying on All Americans Part 1
NSA spying on All Americans Part 2
How to Protect Yourself from The NSA
NSA Whistleblower Seeks Asylum in Iceland
Former CIA Officer: Officials Considering NSA Whistleblower’s Case Potential Chinese Espionage
Judge Jeanine Slams IRS, NSA and Obama for Expanding Surveillance Program – Opening Statement
James Bamford: Inside the NSA’s Largest and Most Expansive Secret Domestic Spy Center 1 of 2
James Bamford: Inside the NSA’s Largest and Most Expansive Secret Domestic Spy Center 2 of 2
NSA Whistleblower: Everyone in US under virtual surveillance, all info stored, no matter the post
“The NSA Is Lying”: U.S. Government Has Copies of Most of Your Emails Says NSA Whistleblower
NSA whistleblower William Binney Keynote at HOPE Number Nine
U.S. v. Whistleblower Tom Drake
Tom Drake, a former NSA senior executive indicted last year for espionage after leaking to the media allegations that the nation’s largest intelligence organization had committed fraud, waste and abuse will appear in his first television interview. Scott Pelley reports.
Whistle Blower Threatened with 35 Years in Prison, Warns of Developing Tyranny
NSA Whistleblower Thomas Drake Prevails in Unprecedented Obama Admin Crackdown
NSA Whistleblower Thomas Drake speaks at National Press Club – March 15, 2013
Part 2: Former NSA Employee Thomas Drake and Jesselyn Radack on Whistleblower Crackdown
The Police – Every Breath You Take
“When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.
We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.–That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, –That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Governments long established should not be changed for light and transient causes; and accordingly all experience hath shewn, that mankind are more disposed to suffer, while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed. But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security.”