1AC
Plan Text
The United States federal government should substantially
curtail its domestic surveillance through backdoors that
bypass encryption technology.
Surveillance at the scale they want requires insecurity. The leaks from
Edward Snowden have revealed a variety of efforts by the NSA to weaken cybersecurity
and hack into networks. Critics say those programs, while helping NSA spying, have
made U.S. networks less secure. According to the leaked documents, the NSA
inserted a so-called back door into at least one encryption standard that
was developed by the National Institute of Standards and Technology. The
NSA could use that back door to spy on suspected terrorists, but the vulnerability
was also available to any other hacker who discovered it. NIST, a Commerce
Department agency, sets scientific and technical standards that are widely
used by both the government and the private sector. The agency has said it would
never deliberately weaken a cryptographic standard, but it remains unclear whether the agency was aware of the
back door or whether the NSA tricked NIST into adopting the compromised standard. NIST is required by law to consult with the NSA for its technical expertise on cybersecurity. The Snowden leaks weren't the first to indicate that the NSA is involved in exploiting commercial security. According to a 2012 New York Times report, the NSA developed a worm,
dubbed Stuxnet, to cripple Iranian nuclear centrifuges. But the worm, which
exploited four previously unknown flaws in Microsoft Windows, escaped the Iranian
nuclear plant and quickly began damaging computers around the world. The NSA and
Israeli officials have also been tied to Flame, a virus that impersonated a Microsoft update to spy on Iranian
computers. Vanee Vines, an NSA spokeswoman, said the U.S. government is as concerned as the public is with the
security of these products. The United States pursues its intelligence mission with care, and the NSA relies on the standards it recommends to the public to protect its own classified networks. We do not make recommendations that we cannot stand behind for protecting national security systems and data, she said. The activity of NSA in setting standards has made the Internet a far safer place to
communicate and do business.
The government also says it does not hack into foreign networks to give domestic companies a competitive edge (as China is accused of doing). We do not use foreign intelligence capabilities to steal the trade secrets of foreign companies on behalf of, or give intelligence we collect to, U.S. companies to enhance their international competitiveness or increase their bottom
line, she said. Jim Lewis, a senior fellow with the Center for Strategic and International Studies, agreed that NSA
spying to stop terrorist attacks is fundamentally different from China stealing business secrets to boost its own
economy. He also said there is widespread misunderstanding of how the NSA works. Cyber attacks are increasingly sophisticated and very difficult to detect. Particularly worrisome are attacks by tremendously skilled threat actors that attempt to steal highly sensitive, and often very valuable, intellectual property, private communications, and other strategic assets and information. It is a threat that is nothing short of formidable. In fact, the US Director of National Intelligence has ranked cybercrime as the top national security threat, higher than that of terrorism, espionage, and weapons of mass destruction.1 Underscoring the threat, the FBI last year notified 3,000 US companies, ranging from small banks to major defense contractors, that they had been victims of cyber intrusions. Meanwhile,
hackers engineered a new round of distributed denial of service (DDoS) attacks that can generate traffic rated at a
staggering 400 gigabits per second, the most powerful DDoS assaults to date.
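For scale, the 400 gigabit-per-second figure in the card can be converted into bytes moved per unit of time. This is a simple unit-conversion sketch, not data from the card itself:

```python
# Unit conversion: what a 400 Gbit/s DDoS flood moves per second and per minute.
GBIT_PER_SEC = 400
bits_per_sec = GBIT_PER_SEC * 10**9
bytes_per_sec = bits_per_sec / 8      # 8 bits per byte -> 50 GB each second
bytes_per_min = bytes_per_sec * 60    # -> 3 TB each minute

print(f"{bytes_per_sec / 10**9:.0f} GB/s, {bytes_per_min / 10**12:.1f} TB/min")
```

At that rate an attack delivers roughly fifty gigabytes of traffic every second, which is why a flood of this size overwhelms all but the largest networks.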
The power grid goes down, and the rest of the critical infrastructure goes with it ... If we are plunged into chaos and suffer more
physical destruction than 50 monster hurricanes and economic damage that dwarfs the Great Depression ... Then
we will wonder why we failed to guard against what outgoing Defense Secretary Leon
Panetta has termed a cyber Pearl Harbor. An aggressor nation or extremist group
could use these kinds of cybertools to gain control of critical switches, Panetta said in a
speech in October. They could derail passenger trains or, even more dangerous, derail passenger
trains loaded with lethal chemicals. They could contaminate the water supply in
major cities or shut down the power grid across large parts of the country.
And Panetta was hardly being an alarmist. He could have added that cybersecurity experts
such as Joe Weiss of Applied Control Solutions suggest a full-on cyberattack
would seek not simply to shut down systems, but wreck them, using
software to destroy hardware. Some believe we could then be sent into chaos not
just for days or even weeks, but for months. The mother of all nightmare
scenarios would see electric, oil, gas, water, chemical, and transit, our entire
essential infrastructure, knocked out as we sought to replace equipment
that can take more than a year to manufacture and is in many cases no longer made in
the U.S. Lights would stay out. Gas stations would be unable to pump and would have nothing to pump anyway.
There would be no heat, no fuel, in many places no running water, no sewage treatment, no garbage, no traffic
lights, no air-traffic control, minimal communication, and of course, no Wi-Fi. Neighborhoods around chemical plants
could become Bhopals.
North America could see a total of 124 Fukushima events if
the necessary conditions were present. A Festering Problem Long before Fukushima,
American regulators knew that a power failure lasting for days involving the power
grid connected to a nuclear plant, regardless of the cause, would most likely lead to a
dangerous radioactive leak in at least several nuclear power plants. A complete
loss of electrical power poses a major problem for nuclear power plants
because the reactor core must be kept cool, and the back-up cooling systems
all require massive amounts of power to work. Heretofore, all the NERC drills
which test the readiness of a nuclear power plant are predicated on the notion that
a blackout will only last 24 hours or less. Amazingly, this is the sum total of a NERC litmus test.
Although we have the technology needed to harden and protect our grid from an EMP event, whether natural or
man-made, we have failed to do so. The cost for protecting the entire grid is placed at about the cost for one B-1
Stealth Bomber. Yet, as a nation, we have done nothing. This is inexplicable and inexcusable. Our collective inaction
against protecting the grid prompted Congressman Franks to write a scathing letter to the top officials of NERC.
The problem is
entirely fixable and NERC and the US government are leaving the American people
and its infrastructure totally unprotected from a total meltdown of nuclear power
plants as a result of a prolonged power failure. However, the good Congressman failed to mention the most important aspect of this problem. Critical Analyses According to Judy Haar, a recognized expert in nuclear plant failure analyses, when a nuclear power plant loses access to off-grid electricity, the event is referred to as a station blackout.
Haar states that all 104 US nuclear power plants are built to withstand electrical outages without experiencing any
core damage, through the activation of an automatic start up of emergency generators powered by diesel. Further,
when emergency power kicks in, an automatic shutdown of the nuclear power plant
commences. The dangerous control rods are dropped into the core, while water is
pumped by the diesel power generators into the reactor to reduce the heat and
thus prevent a meltdown. Here is the catch in this process: the spent fuel rods are
encased in both a primary and secondary containment structure which is designed
to withstand a core meltdown. However, should the pumps stop because either the
generators fail or diesel fuel is not available, the fuel rods are subsequently
uncovered and a Fukushima type of core meltdown commences immediately. At this
point, I took Judy Haar's comments to a source of mine at the Palo Verde Nuclear power plant. My source informed
me that as per NERC policy, nuclear power plants are required to have enough diesel fuel to run for a period of
seven days. Some plants have thirty days of diesel. This is the good news, but it is all downhill from here. The bad news is that one of my Palo Verde nuclear power plant sources informed me that there is no long term solution to a power blackout and that all bets are off if the blackout is due to an EMP attack. A more detailed analysis reveals that the spent fuel pools carry depleted fuel for the reactor. Normally, this spent fuel has had time to considerably decay, thereby reducing radioactivity and heat. However, the newer discharged fuel still produces heat and needs cooling. Housed in high density storage racks, contained in buildings that vent directly into the atmosphere, radiation containment is not accounted for with regard to the spent fuel racks. In other words, there is no capture mechanism. In this scenario, accompanied by a lengthy electrical outage, and with the emergency power waning due to either generator failure or a lack of diesel needed to power the generators, the plant could lose the ability to provide cooling. The water will subsequently heat up, boil away and uncover the spent fuel rods, which must be covered by at least 25 feet of water to remain benign. Ultimately, this would lead to fires as well and the release of radioactivity into the atmosphere. This would be the beginning of another Fukushima event right here on American soil. Both my source and Haar shared exactly the same scenario about how a meltdown would occur.
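The failure sequence described above (cooling lost, water heats, boils away, rods uncover) can be made concrete with a back-of-envelope energy balance. All parameters below — pool water mass, decay-heat power, starting temperature — are illustrative assumptions for a generic pool, not figures from the card or from any specific plant:

```python
# Rough energy balance for a spent-fuel pool that has lost active cooling.
# ALL parameters are illustrative assumptions for a generic pool.
water_mass_kg = 1.5e6     # assumed pool inventory (~400,000 US gallons)
decay_heat_w = 2.0e6      # assumed 2 MW of decay heat from stored fuel
start_temp_c = 40.0       # assumed starting pool temperature

c_water = 4186.0          # specific heat of water, J/(kg*K)
latent_heat = 2.26e6      # heat of vaporization of water, J/kg

# Energy to bring the pool to boiling, then to boil all of it away.
heat_up_j = water_mass_kg * c_water * (100.0 - start_temp_c)
boil_off_j = water_mass_kg * latent_heat
total_days = (heat_up_j + boil_off_j) / decay_heat_w / 86400

print(f"~{total_days:.0f} days to boil the pool dry at these assumptions")
```

Under these assumed numbers the pool takes on the order of weeks to boil dry; hotter, freshly discharged fuel or a smaller pool shortens that window considerably, which is why the diesel-fuel supply question above matters so much.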
Subsequently, I spoke with Roger Landry who worked for Raytheon in various Department of Defense projects for 28
years, many of them in this arena, and Roger confirmed that this information is well known in the industry. When I examine Congressman Franks' letter to NERC and read between the lines, it is clear that Franks knows of this risk as well; he just stops short of specifically mentioning it in his letter. Placing Odds On a
Failure Is a Fool's Errand An analysis of individual plant risks released in 2003 by the Nuclear Regulatory Commission shows that for 39 of the 104 nuclear reactors, the risk of core damage from a blackout was greater than 1 in 100,000. At 45 other plants the risk is greater than 1 in 1 million, the threshold the NRC is using to determine which severe accidents should be evaluated in its latest analysis. According to the Nuclear Regulatory Commission, the Beaver Valley Power Station, Unit 1, in Pennsylvania has the greatest risk of a blackout-induced core meltdown. In the event of an EMP attack, can tanker trucks with diesel fuel get to all of the nuclear power plants in the US in time to
re-fuel them before they stop running? Will tanker trucks even be running themselves in the aftermath of an EMP
attack? And in the event of an EMP attack, it is not likely that any plant which runs low on fuel, or has a generator
malfunction, will ever get any help to mitigate the crisis prior to a plethora of meltdowns occurring. Thus, every nuclear power plant in the country has the potential to cause a Chernobyl or Fukushima type accident if our country suffers a prolonged power failure. Meanwhile, large numbers of transformers and generators have been ordered by China. This makes one wonder what the Chinese are preparing for with these multiple orders for both transformers and generators. In short, our unpreparedness is a prescription for disaster. As a
byproduct of my investigation, I have discovered that most, if not all, of the nuclear power plants are on known
earthquake fault lines. All of California's nuclear power plants are located on an earthquake fault line. Can anyone tell me why anyone in their right mind would build a nuclear power plant on a fault line? To see the depth of this negligence, ask why a government would be so negligent as to not provide its nuclear plants a foolproof method to cool the secondary processes of its nuclear materials at all of its plants. Why would ANY nuclear power plant be built on an earthquake fault line? Why are we
even using nuclear energy under these circumstances? And why are we allowing the Chinese to park right next door
to so many nuclear power plants?
In addition to
highlighting the NSA's massive institutional overreach and global privacy invasion,
Snowden's disclosures also highlight the many points at which our data is
insecure, and the vast numbers of vulnerabilities to surveillance that exist
throughout our digital world. However, while the NSA is the largest threat in the
surveillance game, it is not the only threat. Governments all around the world are
using the internet to surveil their citizens. Considering the rate of technological
change, it is not unforeseeable that the methods, tools and vulnerabilities
used by the NSA will be the tools of states, cyber criminals and low-skilled
hackers of the future. Regardless of who the perceived attacker or surveillance
operative may be, and whether it is the NSA or not, large-scale, mass
surveillance is a growing cyber security threat. It has also been disclosed that the NSA and GCHQ have actively worked to make internet and technology users around the world less secure. The NSA has targeted Google and Yahoo34 for information that it could mostly have gotten through the PRISM programme. The NSA has placed backdoors in routers running vital internet infrastructures.35 The GCHQ has impersonated social networking websites like LinkedIn in order to target system administrators of internet service providers.36 The NSA has been working with the GCHQ to hack into Google and Yahoo data centres.37 The NSA also works to undermine encryption technologies, by covertly influencing the use of weak algorithms and random number generators in encryption products and standards.38 The NSA in its own words is working under the BULLRUN programme to "insert vulnerabilities into commercial encryption systems, IT systems, networks, and endpoint communications devices used by targets."
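The card's point about weak random number generators can be illustrated with a toy sketch: if a key generator draws from a predictable RNG with a small seed space, anyone who knows the design can enumerate the seeds and recover the key. This is a deliberately simplified illustration, not the actual construction the leaked documents describe:

```python
import random

# Toy "weakened" key generator: keys depend only on a 16-bit seed,
# so the effective key space is 65,536 regardless of key length.
def weak_keygen(seed: int, nbytes: int = 16) -> bytes:
    rng = random.Random(seed)  # predictable, seed-determined generator
    return bytes(rng.randrange(256) for _ in range(nbytes))

def recover_key(observed_key: bytes) -> int:
    # Anyone who knows the weakness simply brute-forces the seed space.
    for seed in range(2**16):
        if weak_keygen(seed, len(observed_key)) == observed_key:
            return seed
    raise ValueError("seed not found")

victim_seed = 31337
key = weak_keygen(victim_seed)
assert recover_key(key) == victim_seed  # attacker recovers the secret
```

The "backdoor" here is not a flaw in the cipher itself but in the randomness feeding it, which is exactly why influencing random number generators in standards is such an effective, and hard to detect, form of sabotage.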
The NSA has a dual mission. On one hand, the agency is tasked with securing U.S. networks and information. On the other hand, the agency must gather intelligence on foreign threats to national security. Gathering that intelligence often means hacking encrypted communications. That's nothing new for the NSA; the agency traces its roots back to code-breakers deciphering Nazi messages during World War II. So in many ways, strong Internet security actually makes the NSA's job harder. This is an administration that is a vigorous defender of surveillance,
said Christopher Soghoian, the head technologist for the American Civil Liberties Union. Surveillance at the scale
they want requires insecurity. The leaks from Edward Snowden have revealed a variety of efforts by the NSA to
weaken cybersecurity and hack into networks. Critics say those programs, while helping NSA spying, have made
U.S. networks less secure. According to the leaked documents, the NSA inserted a so-called back door into at least one encryption standard that was developed by the National Institute of Standards and Technology. The NSA could use that back door to spy on suspected terrorists, but the vulnerability was also available to any other hacker who discovered it. NIST, a Commerce Department agency, sets scientific and technical standards that are widely used by both the government and the private sector. The
agency has said it would never deliberately weaken a cryptographic standard, but it remains unclear whether the agency was aware of the back door or whether the NSA tricked NIST into adopting the compromised standard. NIST is required by law to consult with the NSA for its technical expertise on cybersecurity. The revelation that the NSA somehow got NIST to build a back door into an encryption standard has seriously damaged NIST's reputation with security experts. NIST is operating with a trust deficit right now, Soghoian said. Anything that NIST has touched is now tainted. It's a particularly bad time for NIST to have lost the trust of the security community: under an executive order, NIST is drafting cybersecurity standards for critical infrastructure such as power plants and phone companies. Because it's an executive order instead of a law, the cybersecurity standards are entirely voluntary, and the U.S. government will have to convince the private sector to comply. The Snowden leaks weren't the first to indicate that the NSA is involved in exploiting commercial security. According to a 2012 New York Times report, the NSA developed a worm, dubbed Stuxnet, to cripple Iranian nuclear centrifuges. But the worm, which exploited four previously unknown flaws in Microsoft Windows, escaped the Iranian nuclear plant and quickly began damaging computers around the world. Vanee Vines, an NSA spokeswoman, said the U.S. government is as concerned as the public is with the security of these products, and that the NSA relies on the standards it recommends to the public to protect its own classified networks. We do not make recommendations that we cannot stand behind for protecting national security systems and data, she said. The administration says it leans toward publicly disclosing flaws in security unless there is a clear national security or law enforcement need. In a
blog post Monday, Michael Daniel, the White Houses cybersecurity coordinator, said that disclosing security flaws
usually makes sense. Building up a huge stockpile of undisclosed vulnerabilities while leaving the Internet
vulnerable and the American people unprotected would not be in our national security interest, he said. But Daniel
added that, in some cases, disclosing a vulnerability means that the U.S. would forego an opportunity to collect
crucial intelligence that could thwart a terrorist attack, stop the theft of our nations intellectual property, or even
discover more dangerous vulnerabilities. He said that the government weighs a variety of factors, such as the risk
of leaving the vulnerability un-patched, the likelihood that anyone else would discover it, and how important the
potential intelligence is. But privacy advocates and many business groups are still uncomfortable with the U.S.
keeping security flaws secret. And many don't trust that the NSA will only exploit the vulnerabilities with the most
potential for intelligence and least opportunity for other hackers. The surveillance bureaucracy really doesn't have
a lot of self-imposed limits. They want to get everything, said Ed Black, the CEO of the Computer &
Communications Industry Association, which represents companies including Google, Microsoft, Yahoo, and Sprint.
Now I think people dealing with that bureaucracy have to understand they can't take anything for granted. Most
computer networks are run by private companies, and the government must work closely with the private sector to
improve cybersecurity. But companies have become reluctant to share security information with the U.S.
government, fearing the NSA could use any information to hack into their systems. When you want to go into
partnership with somebody and work on serious issues, such as cybersecurity, you want to know you're being told the truth, Black said. Google and one other cybersecurity firm discovered Heartbleed, a critical flaw in a widely used Internet encryption tool, in March. The companies notified a few other private-sector groups about the
problem, but no one told the U.S. government until April. Information you share with the NSA might be used to hurt
you as a company, warned Ashkan Soltani, a technical consultant who has worked with tech companies and helped
The Washington Post with its coverage of the Snowden documents. He said that company officials have historically
discussed cybersecurity issues with the NSA, but that he wouldn't be surprised if those relationships are now
strained. He pointed to news that the NSA posed as Facebook to infect computers with malware. That does a lot of damage to the NSA's standing with, and costs it cooperation from, U.S. businesses. Jason Healey, the director of the Cyber Statecraft Initiative at the Atlantic Council, said the U.S. has militarized cyber policy. Repairing relations with the private sector will be among the top challenges for Mike Rogers, who was sworn in as the new NSA director earlier this month. All the tech companies are in varying degrees unhappy and not eager to have a close relationship with NSA, Lewis said.
Obama spoke at the White House Summit on Cybersecurity and Consumer Protection at Stanford University. The summit, which focused on public-private
partnerships and consumer protection, is part of a recent White House push to focus on cybersecurity. Obama said
the prospect of cyberattacks are one of the nation's most pressing national security, economic and safety issues.
The specter of a cyberattack crippling the nation's air traffic control system or a city with a blackout is real, and
hacks such as the one on Sony Pictures last year are "hurting America's companies and costing American jobs." He
also said they are a threat to the security and well-being of children who are online. It's one of the great paradoxes
of our time that the very technologies that empower us to do great good can also be used to undermine us and
inflict great harm," Obama said before a cheering, friendly audience here at Stanford's Memorial Auditorium.
Obama was also asked about encryption and said he likely leans more toward strong data encryption than law enforcement, but is sympathetic to law enforcement because of the pressure they are under to keep people safe. U.S. government surveillance activities have been seen as a potential liability for tech companies that operate globally. Seventy to 80 percent of the user bases for a lot of these companies are the foreigners who get very little protection under our system, explained Julian Sanchez, a senior fellow focused on technology and civil liberties at the Cato Institute. If they don't display some push back, they know they won't do very well with legislative efforts, said Andrew Crocker, a legal fellow at civil liberties group the Electronic Frontier Foundation.
Under the stated justification of "fighting terrorism," they've undermined encryption technology, endangering the security of financial transactions; and they've compelled countless US-based technology companies to violate the privacy of their own customers and to build backdoors into their products to enable NSA snooping. In
order to compress what they've been doing into a single paragraph, I've obviously left out a lot of the nefarious
activities orchestrated by the NSA and carried out by their mercenary army of hundreds of thousands of private
sector spooks and their Five Eyes collaborators. But even so, the above paragraph is more than enough to
demonstrate that the security services in the US, and the other Five Eyes collaborator states, are running
dangerously out of control. The fact that the NSA and their Five Eyes collaborators feel entitled to trawl the Internet
for whatever they can find, which is then stored in vast data centres and subjected to algorithmic analysis without
the need for any kind of judicial warrant, demonstrates that something fundamental has changed in the relationship
between the state and the citizen. Due process has been abandoned, and as far as the security services are
concerned, we are all assumed to be guilty. They don't have to be able to show probable cause, they don't have to
apply for a warrant from a judge, they just steal our data and use it as they see fit, with no democratic oversight at
all over many of their data stealing operations. The fact that the US state employs a staggering 850,000 NSA staff
and private sector contractors to trawl this ocean of stolen data should be alarming to anyone with the brains to
think through the logical implications of such a vast mercenary army. You would have to be a hopeless idealist to
imagine that there are no "bad apples" at all amongst all these hundreds of thousands. If we assume that just 4% of
them (one in every 25) are the kind of people that would use their access to enormous surveillance powers to do
things like steal commercially confidential information to order, blackmail people, cyber stalk people, wage petty
vendettas against old adversaries ... that would mean a rogue army of some 34,000 thieves, stalkers and
blackmailers with access to the NSA's vast caches of stolen data and their extraordinary surveillance capabilities.
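The arithmetic behind the card's 34,000 figure is straightforward; the 4% "bad apple" rate is the author's own assumption, not an established statistic:

```python
# The card's back-of-envelope estimate: staff count times an assumed
# 4% "bad apple" rate (the 4% figure is the author's assumption).
staff = 850_000
bad_apple_rate = 0.04               # one in every 25
rogue_insiders = int(staff * bad_apple_rate)
print(rogue_insiders)               # 34000, matching the card's figure
```

Even halving or quartering the assumed rate still leaves a rogue population in the thousands, which is the card's underlying point about scale.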
The fact that the NSA have been using their powers to engage in industrial espionage against various countries
such as Germany, Russia, China and Brazil illustrates that "the few bad apples" narrative, although useful from an
illustrative point of view, isn't actually the main concern. The main concern is that the NSA itself is corrupt to the
core. Instead of using their powers to maintain the rule of law and to "fight terrorism" they're actually intent on
using their unprecedented espionage capabilities in order to undermine global competition for the benefit of US corporations, and in order to build themselves snooping capabilities that would have blown the minds of the East German Stasi. From the damage inflicted on American companies to the staggering cost of running the NSA and employing an army of 850,000 spooks, the price has been enormous. Anyone who understands how the technology sector is structured must be absolutely aghast at the damage inflicted on it by the power crazed spooks that considered their mission to infect everything they could with spyware as far more important than the long term health of the sector. The NSA have been using their surveillance powers to engage in industrial espionage in order to benefit US corporations. This is a clear demonstration that they see it as their mission to help US corporations by fair means or foul, yet their data-stealing mania will go down as one of the most spectacular own goals in history. They built a vast data stealing operation in order to
help US corporations, but in doing so inflicted more damage on the US economy than Osama Bin Laden could ever
have dreamed of. The NSA have used their scaremongering narratives about the threat of terrorism to justify the
slaughter of their own golden goose, yet they would have us believe that they are not responsible. They would have
everyone believe that Edward Snowden is the guilty party; that he alone is responsible for the damage to the US
technology sector. But their case is a ludicrous one. There is clearly something dreadfully wrong with the way things
are set up if just one man (out of some 850,000 spooks) can single-handedly wipe an estimated $180 billion off the
value of the US technology sector simply by telling the truth.
In traditional networks, the instructions controlling data traffic are embedded in hardware and must be manually altered or upgraded; in SDNs, by contrast, the instructions controlling traffic can be altered through software administered from a remote location. A parallel to the SDN is the software-defined radio (SDR), which performs a similar function for radio transceivers; remotely delivered software instructions can set, or reset, its operating parameters. In such networks, security will be determined not just during the physical construction of the infrastructure, but will be managed throughout its life via software. This shift has implications for established makers of network gear, such as Cisco and Huawei. Many industry commentators now expect the commoditisation of network gear and increased use of generic, standardised hardware, while value increasingly accrues to software and services. It is little surprise, then, that Huawei showcased its own SDN technology in October 2013: Net Matrix, part of the SoftCom network architecture. Meanwhile, Cisco has announced its Open Network Environment (ONE) SDN strategy. The industry has not yet developed a full set of standardised protocols, so proprietary protocols are still in use in these SDN trial runs. However, it is very likely that such standards will be developed over the next few years, representing a significant shift in the telecoms sector. The advent of software-defined networks will bring new cybersecurity challenges. There are advantages, including the possibility of responding faster and more flexibly to software-based attacks. Yet the standardisation of network gear will also make it easier for malware, such as worms, to navigate across multiple networks, facing fewer barriers like those currently posed by the differing proprietary systems in use today; countering this will require co-ordination among industry and government stakeholders. In terms of hardware-based threats,
SDNs will not fundamentally change the risks: hardware could be compromised during design or at various points in
the manufacturing process, and those in charge of final testing, as well as procurement of gear, will need to
improve their ability to detect vulnerabilities and defend against attacks. Policy responses to SDNs are likely to
focus on similar issues faced in cloud computing: data and communications privacy, standards and interoperability,
and rules on cross-border data flows.
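The difference the card describes, traffic rules baked into each hardware box versus rules pushed from a central software controller, can be sketched in a few lines. The class names and rule format below are invented for illustration; real SDN deployments use standardised protocols such as OpenFlow:

```python
# Minimal sketch of the SDN idea: a central controller holds the forwarding
# policy in software and pushes it to many switches at once, instead of each
# box being manually reconfigured. All names here are illustrative.
class Switch:
    def __init__(self, name: str):
        self.name = name
        self.flow_table: dict[str, str] = {}   # match -> action

    def install_rule(self, match: str, action: str) -> None:
        self.flow_table[match] = action

class Controller:
    def __init__(self, switches: list[Switch]):
        self.switches = switches

    def push_policy(self, match: str, action: str) -> None:
        # One remote software change updates the whole network at once.
        for sw in self.switches:
            sw.install_rule(match, action)

net = [Switch("edge-1"), Switch("edge-2"), Switch("core-1")]
ctrl = Controller(net)
ctrl.push_policy("dst=10.0.0.0/8", "forward:port2")
ctrl.push_policy("src=attacker", "drop")   # e.g. rapid DDoS mitigation

assert all(sw.flow_table["src=attacker"] == "drop" for sw in net)
```

The same centralisation that enables fast, network-wide mitigation is also the new risk the card identifies: a compromised controller, or standardised gear with a common flaw, exposes every switch at once.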
Scenario 1 Econ
Tech sector innovation is necessary for economic growth
cloud-computing affects all organizations
Coviello, Executive Vice President, EMC Corporation, 11
Art, "Can Cloud Computing Save The American Economy?", March 13 2011, Forbes,
www.forbes.com/sites/ciocentral/2011/03/13/can-cloud-computing-save-the-american-economy/
The American dream is in peril from the confluence of sky rocketing deficits, high
unemployment, and the ticking time bomb of an aging baby boomer generation,
with its coincident increase in the burden of entitlements as a percentage of GDP. For
the first time, the next generation of Americans, our grandchildren, risk having a lower standard of living than we do. Yet once again, we are well positioned to lead the world out of this one. Want proof? American businesses systemically and
culturally react fast. Two years after the economic downturn began the United States was generating 97% of its
economic output with only 90% of the labor. This sort of gain in productivity ultimately translates into increased
economic activity, the ability to pay down debt and a higher standard of living for those of us who are employed.
But productivity gains
from working harder can only take us so far. Innovation and technology can and
must take us the rest of the way, creating new jobs and new industries. Our so
called information economy, for example, is ripe for innovation. Today, all
organizations are dependent on information technology. What makes me
optimistic about the future is that we have not even begun to scratch the surface of
all that can be accomplished by actually applying information technology
pervasively. We have spent trillions of dollars worldwide for the computers to create
and process information, networks to move it around and the hardware to store it.
But we are at a point where we spend 60 to 70% of IT budgets just to maintain
those systems and infrastructures. No wonder progress in applying IT is so slow. This is the
technology equivalent of every organization in the world, big or small, investing the
capital and human resources to build and operate their own electricity producing
power plants. But instead, picture a world where software platforms are available
online and easily customizable. Picture a world where compute power is generated
off site, available in quantities when and where you need it. And picture a world
where information is safely stored, efficiently managed and accessible, when and
where you need it. These are cloud infrastructures. The economies of scale,
flexibility and efficiency they offer will not only save organizations massive amounts
of capital and maintenance costs but emancipate them to apply and use information technology far more productively. Unfortunately, this does not directly address the issue of unemployment. The fact is that it is the nature of our society: egalitarian, free, open and competitive, that makes us the most adaptive, inventive and resilient country in the world.
Many around the world are finding more than a little satisfaction in America's difficulties. Such a response should not be surprising. The US and those representing it have been guilty of hubris (the US may often be the indispensable nation, but it would be better if others pointed this out), and examples of inconsistency between America's practices and its principles understandably provoke charges of hypocrisy. When America does not adhere to the principles that it preaches to others, it breeds resentment. But, like most temptations, the urge to gloat at America's imperfections and struggles ought to be resisted. People around the globe should be careful what they
wish for.
America's failure to deal with its internal challenges would come at a steep
price. Indeed, the rest of the world's stake in American success is nearly as large as that of the US itself. Part of the reason is
economic. The US economy still accounts for about one-quarter of global output. If US growth accelerates,
Americas capacity to consume other countries goods and services will increase,
thereby boosting growth around the world . At a time when Europe is drifting and Asia is
slowing, only the US (or, more broadly, North America) has the potential to drive global economic
recovery . The US remains a unique source of innovation. Most of the worlds citizens communicate with mobile devices based
on technology developed in Silicon Valley; likewise, the Internet was made in America. More recently, new technologies developed in
the US greatly increase the ability to extract oil and natural gas from underground formations. This technology is now making its
way around the globe, allowing other societies to increase their energy production and decrease both their reliance on costly
imports and their carbon emissions. The US is also an invaluable source of ideas. Its world-class universities educate a significant
share of the world's future leaders. Yet not even the US can deal effectively with the world's problems on its own. Unilateralism rarely works. It is not just that the US lacks the means; the
very nature of contemporary global problems suggests that only collective responses stand a good chance of succeeding. But
multilateralism is much easier to advocate than to design and implement. Right now there
is only one candidate for this role: the US. No other country has the necessary
combination of capability and outlook. This brings me back to the argument that the US must put its
house in order economically, physically, socially, and politically if it is to have the resources
needed to promote order in the world. Everyone should hope that it does: The alternative to a
world led by the US is not a world led by China, Europe, Russia, Japan, India, or any other
country, but rather a world that is not led at all. Such a world would almost certainly be characterized
by chronic crisis and conflict. That would be bad not just for Americans, but for the vast majority of the
planet's inhabitants.
Scenario 2 Warming
US broadband leadership is fine now, but eroding
competitiveness will stagnate innovation that collapses the
economy and the internet of things
Wilson et al 14
Phil, Director, Technology, Media and Telecommunications @ Deloitte, "United
States expands global lead in mobile broadband: How policy actions could enhance
or imperil America's mobile broadband competitiveness,"
http://www.deloitte.com/assets/Dcom-UnitedStates/Local
%20Assets/Documents/TMT_us_tmt/us_tmt_mobile_index%20_090214.pdf
Thus far, the United States has benefited
tremendously from its leadership position in mobile broadband. However, the future
could be different from the past, for better or worse. Past performance is no guarantee of future success.
Deloitte's Mobile Communications National Achievement Index can serve as a useful tool for estimating future rankings under different policy scenarios.
Although the Mobile Communications National Achievement Index demonstrates a nations past and present
positioning in mobile broadband, it can also be used on a forward-looking basis to help policymakers understand
how mobile broadband performance might evolve under different policy scenarios. As illustrated in Exhibit 4,
different policies are likely to affect U.S. performance in distinct ways, resulting in shifting key performance
indicator (KPI) values that collectively result in a different index score. For example, policies that affect broadband
supply will indirectly have an impact on pricing, affordability, and usage. Constrained supply, for instance, leads to higher
equilibrium pricing per unit of use and lower equilibrium usage. This in turn
affects the potential for industry returns, reducing investment levels and innovation. Two contrasting index
scenarios provide a useful picture of how U.S. supply policy can affect America's ability to defend its mobile
broadband leadership position. By considering two supply scenarios, a favorable scenario in which U.S. policy
actions and timing are sufficient and supportive to meet America's supply needs and an unfavorable scenario in
which policy actions are insufficient or unsupportive, we can project plausible outcomes in U.S. mobile broadband
global competitiveness over the coming decade.29
A growing index score edge for the United States over other countries would indicate the United States is achieving exceptional global
performance growth by innovating and capturing value from the growing consumer and business uses
of mobile broadband. A declining edge would indicate just the opposite, reducing the
motivation and incentive for mobile broadband investment to flow into the United
States and into the mobile broadband industry. Even if America's lead is reduced but not lost,
other countries might be able to overtake the United States in certain mobile
broadband segments by capitalizing on their inherent advantages. For example, a country
with a strong public health system could use that advantage to lead innovation in the area of mHealth and capture
a disproportionate share of performance growth in that segment. These two divergent policy directions would have
substantial but opposite effects on the industry's ability to meet demand, effects that can be quantified using the
mobile broadband key performance indicators. By assuming other countries continue their current mobile
broadband actions and trends, we can isolate the influence of U.S. policy actions and estimate America's ability to
fend off global competitors in mobile broadband under both scenarios. Scenario 1: A favorable and supportive policy
approach enables the mobile broadband market to meet demand. Available spectrum is auctioned off within the next three to five years, helping to ensure that the total amount of
spectrum is sufficient to consistently meet demand over the decade. Spectrum bands are of sufficient quality,
accounting for factors such as frequency range, block size, national coverage, and international alignment. Shared
spectrum contribution to supply is properly accounted for, factoring in impairments from sharing constraints.
Terms and conditions for spectrum access and use are market-oriented, with limited regulatory restrictions.
Outcome: Mobile broadband spectrum supply matches demand, enabling the United States to strengthen and
extend its leadership position in mobile broadband. The additional supply allows wireless carriers to continue offering robust and compelling service to
consumers and businesses. Available performance levels (e.g., data speeds, capacity, reliability, coverage, and
latency) spur additional investment and innovation by wireless carriers and across the broader mobile ecosystem.
Many mobile ecosystem companies participate in and contribute to this growth opportunity, spanning areas such as
network infrastructure and operational support systems, devices, operating systems, and applications.
Consumer and business uses grow as service performance remains strong and
ecosystem innovation creates compelling new offerings, manifested in new devices,
applications, and services. The wireless ecosystem and industries with embedded
wireless solutions (e.g., automotive telematics, mHealth, and mCommerce) gain from
the strong home-field advantage of U.S. mobile broadband leadership. The United
States enjoys export advantages from the resulting innovations and new business
models and is able to capture a growing share of global value in the respective market
segments. As illustrated in Exhibit 5, these combined performance improvements would likely position the United
States to sustain its lead in the near term and within a decade even extend its lead to nearly match the levels of the
early 2000s. Scenario 2: An unfavorable or insufficient policy approach leaves the mobile broadband market
undersupplied. Spectrum auctions are further delayed and allocated spectrum is insufficient. Spectrum bands are of lesser quality (e.g.,
higher frequencies, smaller blocks, limited national coverage, and less international alignment than in scenario 1).
Terms and conditions for spectrum access and use are restrictive and prescriptive, limiting the ability of market mechanisms to alleviate supply shortages. Outcome:
Demand exceeds supply, mobile broadband performance suffers, and the U.S. leadership
position erodes. In this scenario, the policies enacted and executed in the United States are not sufficient to meet
mobile broadband supply needs, leading to a shortfall relative to rising demand. Mobile broadband becomes less
robust and reliable as localized performance issues (e.g., reduced speeds, increased latency and outages) increase
in both geography and time. Wireless carriers must focus their efforts and resources on alleviating the spectrum
and supply shortfalls, siphoning investments away from new products and services. Prices rise in order to keep
demand from exceeding supply, dampening consumer and business use of mobile broadband and limiting
purchases of the latest generation devices and applications. Ecosystem investment and innovation in the United
States are reduced as investors pursue opportunities in more attractive countries and industries. Wireless
ecosystems and mobile-enabled industries in other countries gain advantages over U.S. companies in innovation
and exports. U.S. exports suffer. As shown in Exhibit 6, the United States would likely maintain its leadership
position for the next few years thanks to momentum from current capabilities and performance levels. However,
mobile broadband supply and performance shortfalls would soon begin to take their toll, causing the U.S. lead to
shrink over the latter half of this decade. By 2020 the U.S. lead would be modest at best, with increasing challenges
from competing countries that are gaining ground with positive trends in their mobile broadband performance. The
United States would become just one of several targets for global investment in mobile broadband and would risk
losing its leadership position. Progress has lagged on the spectrum goals established in the 2010 National Broadband Plan.32 The FCC and NTIA continue to pursue a variety of initiatives to
re-allocate commercial or federal spectrum on an exclusive or shared basis, with the most notable action being the
pending auction to re-allocate broadcast TV spectrum. However, the United States appears to be substantially
behind schedule in achieving the stated 2015 objective. As illustrated in Exhibit 7, current plans indicate that
approximately 225–265 MHz of spectrum will be newly classified, auctioned, or planned for auction for mobile
broadband use through 2015, which at the lower range estimate is roughly three-fourths of the National Broadband
Plan 2015 goal. As important, 100 MHz of spectrum, or approximately one-fourth of the total 395–435 MHz planned
or identified to date, is stipulated as shared-use, which due to the inherent nature of sharing will not be equivalent in
supply value to exclusively licensed spectrum. It is also worth emphasizing the complexity of U.S. spectrum
allocations, most of which reflect decisions made during the twentieth century, and the time delays that can result
in making spectrum available after it has been designated for commercial use. Reclaiming or sharing spectrum for
mobile use is especially difficult and time-consuming, with some federal agencies indicating that it will take up to a
decade to move their operations out of the designated bands.37 Developing economies have fewer challenges in
this regard, and thus have the potential to make progress at the expense of the United States. Whether viewed
from the U.S. or global perspective, Scenario 2 is a disconcerting yet plausible outlook. U.S. mobile broadband
leadership is by no means assured. Industry forecasts include an estimate of 26 billion installed Internet of Things units by 2020, impacting the global supply
chain, and a prediction of 24 billion connected devices globally by 2016, resulting in a $1.2 trillion impact to North
American economies from revenues, cost reductions, or service improvements.39 If the United States mobile
broadband position becomes diminished and weak, it could create opportunities for other countries to gain traction
in areas of the mobile-enabled Internet of Things. Other countries, by tapping into their unique assets or
characteristics, such as a higher population density or a leading public infrastructure, could overtake the United
States in specific industries or applications such as telematics, mHealth, mCommerce, or mLogistics. Scale this up globally and the effect compounds.
So today let's focus on one place where machine-to-machine communication could have an immense
impact: energy consumption. Not only could this technology make turning the lights
on easier, but it could be the key to us effectively managing anthropogenic
carbon emissions.
Regardless of your thoughts and opinions on climate change and the scope of how much carbon emissions affects
the global atmosphere, we all can agree on one thing: Emitting less carbon is a good thing, especially if it can be
done without impeding economic growth. For years, the battleground for the climate change debate has been on
the energy generation side, pitting alternative energy options like wind and solar against fossil fuels. The problem
is that this fight over generation has produced only slow progress. Does that mean there's no shot at significantly reducing carbon emissions? No -- we're just
focusing on the wrong side of the energy equation, and that is where machine-to-machine
communications comes into play. Let's look at what the internet of things can mean for carbon emissions, and how
investors could make some hefty profits from it.
Energy consumption's overdue evolution
We humans are a fascinating study in inefficiency. We will sit in traffic on the freeway rather than take an alternate route. It's not that we want to do
things less efficiently; we just don't always have the adequate information to make the most efficient decision.
When you add all of these little inefficiencies up, it amounts to massive amounts of
wasted energy and, in turn, unnecessary carbon emissions. In the U.S. alone, 1.9 billion
gallons of fuel are consumed every year by drivers sitting in traffic. That's 186 million tons of unnecessary CO2
emissions each year just in the U.S.
Now, imagine a world where every automobile was able to communicate with the
others, giving instant feedback on traffic conditions and providing alternative routes
to avoid traffic jams. This is the fundamental concept of machine-to-machine communications, and it goes
way beyond the scope of just automobiles and household conveniences.
One of the added benefits of this technology is the impact it could have on our everyday energy consumption and
the ultimate reduction in total carbon emissions.
Think of power generators, automobiles, and
fully integrated heating, cooling, and lighting systems that can adjust for human occupancy.
There are lots of projections and estimates related to carbon emissions and climate change, but the one that has
emerged as the standard bearer is the amount of carbon emissions it would take to increase global temperatures by
2 degrees Centigrade. According to the UN's Environment Programme, annual anthropogenic greenhouse gas
emissions would need to decrease by 15% from recent levels to keep us under that atmospheric carbon threshold.
Based on current emissions, the 9.1 gigaton savings estimate from Carbon War Room's
report would be enough to reduce global emissions by 18.6%, well within
the range of the UN's projections.
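The percentage claim above can be checked with simple arithmetic. A minimal sketch, assuming a global annual emissions baseline of roughly 49 gigatons of CO2-equivalent (a commonly cited early-2010s figure; this baseline is not stated in the card):

```python
# Sanity check on the card's 18.6% figure.
# ASSUMPTION (not in the source): global annual GHG emissions of
# ~49 gigatons CO2-equivalent, roughly the early-2010s level.
GLOBAL_EMISSIONS_GT = 49.0
M2M_SAVINGS_GT = 9.1   # Carbon War Room estimate cited above
UN_TARGET_CUT = 0.15   # UNEP's required ~15% reduction

savings_share = M2M_SAVINGS_GT / GLOBAL_EMISSIONS_GT
print(f"estimated reduction: {savings_share:.1%}")            # -> 18.6%
print(f"meets the UN 15% target: {savings_share >= UN_TARGET_CUT}")
```

Under that assumed baseline, the 9.1 gigaton machine-to-machine savings estimate reproduces the card's 18.6% figure and clears the UN's 15% threshold.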
Warming is real, anthropogenic, and threatens extinction; prefer new evidence that represents consensus
Richard Schiffman 9/27/13, environmental writer @ The Atlantic, citing the
Intergovernmental Panel on Climate Change's Fifth Assessment Report, "What Leading Scientists Want You to
Know About Today's Frightening Climate Report," The Atlantic,
http://www.theatlantic.com/technology/archive/2013/09/leading-scientists-weigh-inon-the-mother-of-all-climate-reports/280045/
The polar icecaps are melting faster than we thought they would; seas are rising
faster than we thought they would; extreme weather events are increasing. Have a nice
day! That's a less than scientifically rigorous summary of the findings of the Fifth Intergovernmental Panel on Climate
Change (IPCC) report released this morning in Stockholm. Appearing exhausted after nearly two sleepless days fine-tuning
the language of the report, co-chair Thomas Stocker called climate change "the greatest
challenge of our time," adding that each of the last three decades has been
successively warmer than the past, and that this trend is likely to continue into
the foreseeable future. Pledging further action to cut carbon dioxide (CO2) emissions, U.S. Secretary of State John Kerry said, "This
isn't a run-of-the-mill report to be dumped in a filing cabinet. This isn't a political document produced by
politicians... It's science." And that science needs to be communicated to the public, loudly and clearly. I canvassed leading climate
researchers for their take on the findings of the vastly influential IPCC report. What headline would they put on the news? What do they hope people will take away? "If one compares the projections
from 1990 and what has taken place since, climate change is proceeding faster than we expected," Michael Mann told me by email. Mann helped develop the
famous hockey-stick graph, which Al Gore used in his film An Inconvenient Truth to dramatize the sharp rise in temperatures in recent times. Mann
pointed in particular to the continental ice sheets, which are losing ice and contributing to sea level rise at a faster rate than the [earlier IPCC] models had predicted. But there
is a lot that we still don't understand. Reuters noted in a sneak preview of the IPCC draft, which was leaked in August, that, while the broad global trends are
clear, climate scientists were finding it harder than expected to predict the impact in specific regions in coming decades. From year to year, the world's
hotspots are not consistent, but move erratically around the globe. The same has been true of heat waves,
mega-storms and catastrophic floods, like the recent ones that ravaged the Colorado Front Range. There is broad agreement that
climate change is increasing the severity of extreme weather events, but were not yet able to
predict where and when these will show up. "It is like watching a pot boil," Danish astrophysicist and climate scientist Peter
Thejll told me. "We understand why it boils but cannot predict where the next bubble will be." There is also
uncertainty about an apparent slowdown over the last decade in the rate of air temperature increase. While some critics claim that
global warming has stalled, others point out that, when rising ocean temperatures are
factored in, the Earth is actually gaining heat faster than previously
anticipated. "Temperatures measured over the short term are just one parameter," said Dr. Tim
Barnett of the Scripps Institution of Oceanography in an interview. "There are far more critical things going on; the
acidification of the ocean is happening a lot faster than anybody thought that it
would; it's sucking up more CO2; plankton, the basic food chain of the planet,
are dying; it's such a hugely important signal. Why aren't people using that as a measure of what is going on?"
Barnett thinks that recent increases in volcanic activity, which spews smog-forming aerosols into the air that deflect solar radiation
and cool the atmosphere, might help account for the temporary slowing of global temperature
rise. But he says we shouldn't let short-term fluctuations cause us to lose sight of the big picture. The dispute over temperatures underscores just
how formidable the IPCC's task of modeling the complexity of climate change is. Issued in three parts (the next two installments are due out in the spring),
the full version of the IPCC report will end up several times the length of Leo Tolstoy's epic War and Peace. Yet every last word of the U.N. document needs to be
signed off on by all of the nations on earth. "I do not know of any other area of any complexity and importance at all
where there is unanimous agreement... and the statements so strong," Mike
MacCracken, Chief Scientist for Climate Change Programs at the Climate Institute in Washington, D.C., told me in an email. "What IPCC has achieved is
remarkable (and why it merited the Nobel Peace Prize granted in 2007)." Not surprisingly, the report's findings are
conservative by design, Ken Caldeira, an atmospheric scientist with the Carnegie Institution's Department of Global Ecology, told
me: "The IPCC is not supposed to represent the controversial forefront of climate
science. It is supposed to represent what nearly all scientists agree on, and it does
that quite effectively." Nevertheless, even these understated findings are inevitably controversial. Roger Pielke Jr., the Director of the
Center for Science and Technology Policy Research at the University of Colorado, Boulder, suggested a headline that focuses on the cat fight that today's
report is sure to revive: "Fresh Red Meat Offered Up in the Climate Debate; Activists and Skeptics Continue Fighting Over It." Pielke should know. A critic of
Al Gore, who has called his own detractors "climate McCarthyists," Pielke has been a lightning rod for the political controversy which continues to swirl
around the question of global warming, and what, if anything, we should do about it. The public's skepticism of climate change took a dive after
Hurricane Sandy. Fifty-four percent of Americans are now saying that the effects of global warming have already begun. But 41 percent surveyed in the
same Gallup poll believe news about global warming is generally exaggerated, and there is a smaller but highly passionate minority that continues to
believe the whole thing is a hoax. For most climate experts, however, the battle is long over, at least
when it comes to the science. What remains in dispute is not whether climate change is happening, but how fast things are going
to get worse. There are some possibilities that are deliberately left out of the IPCC projections, because we simply don't have enough data yet to model
them. Jason Box, a visiting scholar at the Byrd Polar Research Center, told me in an email interview that the scary elephant in the
closet is terrestrial and oceanic methane release triggered by warming. The IPCC projections don't include the possibility,
some scientists say likelihood, that huge quantities of methane (a greenhouse gas thirty times as potent as CO2) will eventually be released from
thawing permafrost and undersea methane hydrate reserves. Box said that this tipping point, in terms
of potential management of the problem, may be sooner than expected. Box, whose work
has been instrumental in documenting the rapid deterioration of the Greenland ice sheet, also believes that the latest IPCC
predictions (of a maximum just under three-foot ocean rise by the end of the century) may turn out to be wildly
optimistic, if the Greenland ice sheet breaks up. "We are heading into uncharted territory," he said. "We
are creating a different climate than the Earth has ever seen." The head of the IPCC,
Rajendra Pachauri, speaks for the scientific consensus when he says that time is fast running out to avoid the
catastrophic collapse of the natural systems on which human life depends. What
he recently told a group of climate scientists could be the most chilling headline of all for the U.N. report: "We have five minutes
before midnight."
aim to
An amendment to the National Defense Authorization Act (NDAA) for Fiscal Year 2015 (H.R. 4435) by Representatives Zoe Lofgren (D-CA) and Rush Holt (D-NJ) would have prohibited
request.361 Although that measure was not adopted as part of the NDAA, a similar amendment sponsored by
Lofgren along with Representatives Jim Sensenbrenner (R-WI) and Thomas Massie (R-KY), did make it into the
House-approved version of the NDAA, with the support of Internet companies and privacy organizations,362
passing on an overwhelming vote of 293 to 123.363 Like Representative Grayson's amendment on NSA's
consultations with NIST around encryption, it remains to be seen whether this amendment will end up in the final