
1nc round 6

Deterrence
NFU destroys deterrence, which encourages conventional, biological, and cyber-attacks against allies
Ankit Panda, 18, 7-17-2018, ‘No First Use’ and Nuclear Weapons, https://www.cfr.org/backgrounder/no-first-
use-and-nuclear-weapons, Ankit Panda is an award-winning writer, analyst, and researcher focusing on
international security, geopolitics, and economics, He is currently a senior editor at The Diplomat, where he writes
daily on security, geopolitics, and economics in the Asia-Pacific region and hosts a popular podcast. AVD
Most states with nuclear weapons maintain policies that would permit their first use in a conflict. Pledges to only use these weapons in retaliation for a nuclear attack—or a no-first-use (NFU) policy—are rare. Where these pledges have been made by nuclear states, their adversaries generally consider them not credible. Strategic planners for nuclear weapons powers see the credible threat of the

first use of nuclear weapons as a powerful deterrent against a range of significant


nonnuclear threats, including major conventional, chemical, and biological attacks, as
well as cyberattacks. Even states with significant conventional military forces, such as
the United States, consider it necessary to retain nuclear first use as an option. The 2018 Nuclear
Posture Review, under the administration of President Donald J. Trump, retains the option of nuclear first use. A so-called NFU pledge, first publicly made by China in 1964, refers to any authoritative statement by a nuclear weapon state to
never be the first to use these weapons in a conflict, reserving them strictly to retaliate in the aftermath of a nuclear attack against its territory or military personnel. These pledges are a
component of nuclear declaratory policies. As such, there can be no diplomatic arrangement to verify or enforce a declaratory NFU pledge, and such pledges alone do not affect capabilities.

States with such pledges would still technically be able to use nuclear weapons first in
a conflict, and their adversaries have generally not trusted NFU assurances. Today, China is the only
nuclear weapon state to maintain an unconditional NFU pledge. What is the U.S. declaratory nuclear use policy? During the Cold War and even today, the credible threat

of the United States using its nuclear weapons first against an adversary has been an important
component of reassuring allies. At the height of the Cold War, the threat of U.S.
tactical nuclear use was conceived of as a critical bulwark against a conventional
Soviet offensive through the Fulda Gap, a strategically significant lowland corridor in
Germany that would allow Warsaw Pact forces to enter Western Europe. A nuclear
first-use policy was thought to be a cornerstone of the defensive posture of the North
Atlantic Treaty Organization (NATO), given the large number of bases of Warsaw Pact conventional military forces. Accordingly, NATO has always
opposed a U.S. NFU declaration and has never ruled out U.S. first use under its “flexible response” posture since 1967. Today, U.S. allies in East Asia

and Europe alike rely on credible commitments from the United States to use nuclear
weapons first to deter major nonnuclear threats against them. The United States has considered but has never
declared an NFU policy and remains the only country to have ever used nuclear weapons in war—twice against Japan, in 1945. The Trump administration’s

2018 Nuclear Posture Review expands the range of significant nonnuclear strategic scenarios in which the United States may contemplate nuclear weapons
use. Notably, it does not rule out the first use of nuclear weapons in response to cyberattacks.
The 2010 Nuclear Posture Review, under the administration of President Barack Obama, reiterated an assurance in place since 1978 that the United States would not use nuclear weapons
against compliant members of the Nuclear Nonproliferation Treaty (NPT). The Obama administration still maintained the option to use nuclear weapons first while stating that the role of these

weapons to deter and respond to nonnuclear attacks had declined and that it would continue to reduce that role. It additionally emphasized that the “fundamental”

role of U.S. nuclear weapons was to deter nuclear use against the United States and
its allies. In 2002, during the administration of President George W. Bush, the classified Nuclear Posture Review emphasized the role of U.S. nuclear weapons in deterring nonnuclear
threats, including weapons of mass destruction (WMD) and large conventional military forces, ostensibly through nuclear first use. What is the debate in the United States on NFU? Though the
2010 Nuclear Posture Review did not include an NFU pledge, the Obama administration considered the idea during its second term. It ultimately left U.S. nuclear declaratory policy unchanged
from its 2010 iteration, which stated that the United States reserved the right to use nuclear weapons to deter nonnuclear attacks while strengthening conventional capabilities to gradually
reduce the role of nuclear weapons to that of solely deterring nuclear attacks. Nevertheless, the Obama administration’s final year in office saw animated debate among proponents and
opponents of an NFU declaration. Arguments in favor of a U.S. NFU pledge. Proponents of a U.S. NFU declaration have argued that not only does the United States already maintain a de facto
NFU policy but that U.S. superiority in conventional weapons is sufficient to deter significant nuclear, biological, chemical, and conventional threats. Additionally, as Kingston Reif of the Arms
Control Association has argued, “a clear U.S. no-first-use policy would reduce the risk of Russian or Chinese nuclear miscalculation during a crisis by alleviating concerns about a devastating
U.S. nuclear first-strike.” In nuclear strategy, a first strike refers to a nuclear attack that seeks to disarm a nuclear-armed enemy before it can employ its weapons. Other proponents pointed to
an NFU policy declaration being a necessary step on the road to global nuclear disarmament, an aspirational goal of the Obama administration and a requirement for all recognized nuclear
weapon states under Article VI of the NPT. Proponents also argue that U.S. resistance to an NFU declaration has harmed U.S. nonproliferation efforts. Arguments against a U.S. NFU pledge.
Critics, meanwhile, have suggested that U.S. allies in East Asia and Europe alike would not accept a unilateral U.S. NFU declaration, because it could encourage adversaries to attack with conventional


weapons or to use chemical, biological, or cyber weapons. Russian conventional military advantages over U.S. allies in
Europe have amplified these concerns. Critics argue that such a declaration could undercut allied commitments and encourage U.S. allies to develop their own nuclear weapons. Within the
Obama administration in 2016, Secretary of State John Kerry, Secretary of Defense Ash Carter, and Secretary of Energy Ernest Moniz opposed an NFU declaration, primarily along these lines.
These officials shared the view of NFU skeptics that a U.S. declaration would embolden adversaries, weaken allied commitments, and invite brinkmanship. How does the president’s authority

to use nuclear weapons relate to questions of first use? Since the Trump administration’s inauguration, the issue of presidential launch
authorization has been of interest to lawmakers, precipitated by the president’s calls to expand the U.S. arsenal and threats against North Korea. In 2017, for the first time in forty-one years,
the Senate Foreign Relations Committee held a hearing on the president’s ability to use nuclear weapons. Also in 2017, Representative Ted W. Lieu of California and Senator Edward J. Markey

of Massachusetts, both Democrats, introduced bills to restrict the first use of nuclear weapons by
the president without a congressional declaration of war, but some experts say this
would not have a meaningful effect on Trump’s ability to use nuclear weapons first. What are
the nuclear use policies of other nuclear weapon states? Today, eight states acknowledge that nuclear weapons play

a role in their national defense policies. Each of these states—China, France, India,
North Korea, Pakistan, Russia, the United Kingdom, and the United States—has
conveyed through official statements and documents a certain declaratory nuclear
policy, detailing the conditions under which they might use these weapons. Another
state, Israel, has not publicly acknowledged that it possesses nuclear weapons but is
widely considered a nuclear state. Russia. In 1993, Russia released a military doctrine that formally abandoned a 1982 pledge by Soviet leader Leonid
Brezhnev not to use nuclear weapons first in a conflict. This pledge was never seen as credible by NATO leaders in the final years of the Cold War. A French diplomat, writing in 1999, observed
[PDF] that even after Brezhnev’s declaration, “military records of the Warsaw Pact that fell into German hands demonstrated beyond doubt that Russian operational plans called for the use of
nuclear and chemical weapons in Germany at the onset of hostilities, even if NATO forces were using only conventional weapons.” The 1993 military doctrine said that the country’s nuclear
weapons would never be used against nonnuclear states that were members of the NPT, except those that were allied with a nuclear state. Today, Russia’s military doctrine says [PDF] the
country will use nuclear weapons against attacks by conventional forces that represent an existential threat to the country or in retaliation for a nuclear or WMD attack. China. Under stated
Chinese posture, the country would expect to first absorb a nuclear attack before using its own nuclear forces to retaliate. While this has held constant since China’s first nuclear test, there is a
debate today in the country over the continuing advisability of an NFU posture. For decades, China sought to make its NFU pledge appear credible by separating its ballistic missile and
warhead units; under these circumstances, China’s intention to use nuclear weapons before first suffering a nuclear attack would ostensibly be easily detectable. So far, there have been no
public caveats to China’s NFU policy, but some U.S. and Indian strategists doubt the credibility of China’s pledge. China has been able to maintain its NFU pledge because it has invested so
heavily in conventional military modernization, making it unlikely that it would consider nuclear escalation in a conventional war. China has publicly called on nuclear weapon states to create
and join a multilateral NFU treaty—what it has called [PDF] a Treaty on Mutual No-First-Use of Nuclear Weapons. UK. The country maintains an ambiguous nuclear posture that does “not rule
in or out the first use of nuclear weapons,” according to the UK Ministry of Defense’s 2010–2015 policy paper on the country’s nuclear deterrent. In 1978 and 1995, the UK reiterated a
commitment to not use nuclear weapons against nonnuclear states in the NPT. France. France has maintained a first-use nuclear posture since it first developed and tested nuclear weapons
during the Cold War. France’s posture emerged from its Cold War–era fears of abandonment by the United States, which led to the country’s withdrawal from NATO in 1966 to pursue an
independent nuclear capability, giving France the sovereign ability to determine how and when it would use its nuclear weapons. France pioneered the concept of a prestrategic strike for a
conventional invasion, threatening limited nuclear first use as a way to signal that it was contemplating escalation to the strategic nuclear level. France rejoined NATO in 2009 but kept its
nuclear forces outside of NATO’s defense coordination mechanisms. French forces today have inherited that legacy of independence and maintain a first-use posture to deter any type of
attack on or invasion of France. India. India maintains a declared NFU posture, with exceptions for chemical and biological weapons attacks. In its 1999 draft nuclear doctrine, India announced
that it “will not be the first to initiate a nuclear strike, but will respond with punitive retaliation should deterrence fail.” The public summary of India’s final nuclear doctrine, released in 2003,
says that “in the event of a major attack against India, or Indian forces anywhere, by biological or chemical weapons, India will retain the option of retaliating with nuclear weapons.” Indian
public statements on nuclear weapons continue to emphasize the NFU policy, without acknowledging the exceptions carved out explicitly in the official doctrine. Pakistan. Pakistan has not
ruled out nuclear first use to deter what it sees as an overwhelming Indian quantitative advantage in conventional forces. Islamabad has left the exact threshold for its nuclear use ambiguous.
Pakistani officials and strategists have been consistent in their support of a first-use posture, with the exception of former President Asif Ali Zardari, who voiced support for an NFU posture
early in his term, in 2008. Today, there is no serious push in Pakistan to reconsider the country’s first-use posture. North Korea. North Korea has not ruled out nuclear first use to deter a
preemptive strike or invasion by the United States and its allies. If the country were to detect an imminent U.S. or allied attack, it would use nuclear weapons on military installations in East
Asia and in Guam. North Korea’s intercontinental-range ballistic missiles would not be used first but would deter retaliatory nuclear use or an invasion by the United States against its territory.
The exception to this might be a scenario in which North Korea fears a first strike by the United States to eliminate the country’s leadership. Israel. Israel has neither confirmed nor denied its
possession of nuclear weapons but is thought to have developed a limited arsenal more than fifty years ago, effectively becoming the world’s sixth nuclear weapon state. In line with this policy
of nuclear opacity, Israel has made no authoritative declarations on how it would use nuclear weapons. In the late 1960s, Prime Minister Golda Meir and President Richard Nixon came to an
understanding, with Meir offering assurances that Israel would “not be the first to introduce nuclear weapons to the Middle East” but that it would also “not be the second to introduce this
weapon.”

Conventional deterrence doesn’t solve---causes an arms race and worsens stability – Turns the Aff
Lianne de Vries, 10/5/16
– Board Member for the Dutch Atlantic Youth Association and an Atlantic Council Future NATO Fellow, “No-First
Use: A Categorical Mistake,” http://futurenato.org/articles/no-first-use-a-categorical-mistake/
Donald Trump’s statements regarding the use of the United States’ nuclear capability have fired up
the debate surrounding American nuclear policy. Additionally, recent reports suggest that President Obama is keen to
leave a lasting legacy on nuclear nonproliferation by pledging a ‘no first-use’ policy. This policy is a promise that the

United States will never be the first to launch a nuclear strike, thereby limiting the
country to nuclear retaliation alone. It is no challenge to form a critical opinion on nuclear weapons and the repercussions of their
use. They are destructive in nature and bring long-term detrimental effects to all life within an affected region. The world would be better off without the looming
threat of another nuclear bomb ever detonating. However, paradoxically enough, nuclear weapons have contributed to an

overall reduction in the number of wars in recent decades by facilitating a balancing act between
great powers. They decrease the chance of conventional war, prevent escalation, and, in
their own way, improve international cooperation. In 2009 President Obama announced his intention to reduce America’s stockpile
of nuclear weapons and aim for a nuclear-free world. Nine countries possess a nuclear capability: the United States, Russia, China, France, the United Kingdom, Israel,
India, Pakistan, and North Korea. For some countries, particularly weaker states, nuclear capability is vital to national security. The status and insurance provided by
a nuclear weapon counterbalances deficits in conventional forces, technology and growing lists of adversaries. No states have been willing to relinquish this

insurance mechanism once obtained. And the United States ought not to either. Nuclear weapons offer a preventative solution to the security dilemma. A security dilemma is an escalating situation


whereby one state increases its security measures and another state responds with
similar measures, leading to increased tension and possible conflict, despite a lack of hostile intent.
Nuclear weapons quell the need to participate in a conventional arms race by effectively
deterring external attacks. Thus, they fulfill the role of force equalizer between
otherwise unequal states. This deterrent effect encourages states to approach
potential conflict with far more caution. As Bernard Brodie, an expert on nuclear strategy, suggested: “deterrence succeeds not
by fighting wars, but by preventing them from occurring”. To an extent, nuclear capability thus introduces a status quo. The

stakes for initiating conflict are much higher and so the threshold for potential war rises
also, making effective diplomacy a preferable tool of statecraft. Nuclear weapons level the
playing field between otherwise unequal states, acting as a force stabilizer, fostering stability, and
actually preserving peace. Nuclear weapons also maintain international cooperation through

the status quo effect. Were it not for the nuclear threat, the distribution of power across the

international system would be up for grabs. This would increase multipolar


competition, impede international trust and good relations and would be cause for tension. This
further serves as an argument against complete disarmament, where reliable relations and close cooperation are key, yet undermined by the elements brought in by the process and final result of disarmament. Thirdly, without the deterrent threat of nuclear weapons the
risk of conventional war increases. Were nukes to be abolished or the threat of use revoked, as Obama is
thought to desire, nations will rely once more on conventional forces. States will measure

readiness through the level of conventional forces, leading to a return of the


conventional security dilemma on a large scale. Elaborate conventional forces would
militarize nations, increase the possibility for escalation, and thereby the chance of conventional war.
Restraining nuclear weapons, theory shows, would not facilitate peace, but lower the threshold to war.
DA—TNW
No-first-use causes TNW withdrawal from Europe
Chamberlain 16 (Dianne Pfundstein - associate research fellow with the Arnold A. Saltzman Institute of War
and Peace Studies, Columbia University, “The Case for Retaining the First-Use Doctrine for Nuclear Weapons,”
9/28/16, https://nationalinterest.org/feature/the-case-retaining-the-first-use-doctrine-nuclear-weapons-
17865?page=0%2C1)
Adopting a no-first-use doctrine would allow the United States to remove the tactical
nuclear weapons that are currently deployed in Europe. This seems to provide an
opportunity for cost-saving, but in fact reliance on first use was motivated in part by the desire to minimize U.S. costs for
defending Western Europe. President Eisenhower did not want to station huge numbers of American troops in Europe on a permanent basis, so
relying on the U.S. nuclear arsenal seemed to be a cheaper way to defend NATO. If the United States has decided that defending the Baltics
from a Russian attack and South Korea from its northern neighbor are core strategic interests, then taking nuclear weapons off the table would
probably prompt these allies to demand a much larger commitment of U.S. conventional forces for their defense. The cost savings of
abandoning first use are not as straightforward as they might seem. Even if removing tactical weapons from Europe resulted in a net savings for
the United States, it is not clear that making U.S. commitments cheaper would produce the
desired policy outcome, i.e. the deterrence of U.S. adversaries. To the extent that the
use of a nuclear weapon against a nuclear-armed opponent could be extremely costly
for the United States, the nuclear threat may be more effective in deterring the
adversary than a low-risk commitment of conventional forces. Removing tactical
nuclear weapons and placing the U.S. nuclear force on a lower state of alert could also
prove extremely dangerous if the United States subsequently reversed these decisions
for any reason. An opponent may interpret such reversals as signs of an imminent
attack and choose to strike the United States before it has a chance to act.

Key to deter Russian aggression


Thayer 14 (Bradley A. - associate professor of political science at the University of Minnesota-Duluth & Petr
Suchy - Head of Department of International Relations and European Studies at the Faculty of Social Studies at
Masaryk University, “Weapons as political symbolism: the role of US tactical nuclear weapons in Europe,” 7/3/14,
European Security, Volume 23, Issue 4, Taylor & Francis Online)
The value of US TNWs remains significant today, but with a different emphasis. In this section, we will
consider the contemporary roles of TNWs in Europe and argue that the political aspect
of the weapons is more important than their military role. The first role is less significant today due to
high accuracy of conventional weapons. But even the first role cannot and should not be taken off the table totally.7 The reason is that the
future is uncertain, and NATO confronts the possibility of a resurgent Russia, a threat not
equal to the Soviet Union to be sure, but Russia's actions in 2014 reveal that it possesses irredentist or
revisionist political objectives in Europe. Put directly, before the seizure of Crimea, Russia under Putin generated
cause for concern (RIA Novosti, May 3, 2012). The change of tack toward greater bellicosity in Putin's
Russia underscores our point, countries can change. Accordingly, strategic thinking about
the threats NATO may face, and the role TNWs may play, should govern NATO's
decision-making with respect to TNWs. Moreover, new threats may arise on NATO's flanks,
particularly its southeastern along the Turkish border with Iran, for which TNWs might
be necessary or useful (The Baltic Times 2009, Johnson 2010, Kibaroglu 2011). In sum, deterrence is a
complicated concept and its application even more so. It is strategically imprudent to
withdraw US TNWs from Europe, given that they may have an important future role
for deterrence by denial, although we recognize that such a role might be more likely in other theaters. While superiority of US
conventional weapon systems is a major advantage for NATO and greatly strengthens its deterrence by denial capabilities, our concern is that
in the longer term, US conventional superiority is undermined through the
proliferation of capabilities on which the United States currently has a monopoly or
near monopoly. Although it may be difficult to perceive fully at present, the rapidity with which the Chinese are closing the gap in
some conventional capabilities was alarming to US Secretary of Defense Robert Gates (The Guardian 2011). In addition, cyber complicates this
issue due to the ability of the Chinese to steal military and commercial technology from the United States, Japan and NATO allies. Due to
cyberespionage, as well as old-fashioned espionage and commercial exchange, the Chinese are able to acquire the technology and use it to
improve their own military and economy and disseminate the technology to other states, allowing states to better understand US military
capabilities and reduce the gap in some categories of conventional weapons with the United States sooner than they would be able to do
independently. With respect to the second role of TNWs, deterrence of TNWs use by the
adversary is even more relevant. Unlike the United States, the elimination of TNWs is
not entertained by other present possessors of TNWs, most importantly, Russia, in the
European context (Interfax 2012). Some attempts at gradual threat reduction, especially at the beginning of the post–cold war era –
including unilateral reductions and withdrawals of various types of TNWs, such as the 8-inch howitzer round, and the termination or
cancellation of modernization, such as the follow on to the Lance short range missile – were definitely critical decisions that helped to
strengthen trust between former adversaries. Unfortunately, uncertainty prevails about whether these initiatives, such as the Presidential
Nuclear Initiatives of early 1990s, have been really implemented by Russia (Larsen 2006, p. 34, Perry et al. 2009, Payne et al. 2013).
Nonetheless, political conditions have changed dramatically since 1989 for NATO and for Russia and given that they appear to be worsening,
unilateral measures by NATO are no longer necessary or positive for stability. Russia and the United States have reduced their strategic arsenals
significantly. The USA has done so unilaterally, as well as through treaties such as the New START Treaty.8 Both states are now at comparable
levels of launchers, missiles and warheads for strategic systems. However, the situation with tactical weapons is different. The USA and Russia
steeply reduced their TNW stockpiles, but due to the fact that their numbers heavily favored Russia, TNW arsenals still are in Moscow's favor.9
As a consequence, NATO does not have the ability to depend on equal numbers of TNWs to deter their use by Russia. While the US Air Force
deploys currently between 160 and 200 tactical nuclear bombs at European air bases, Russia has at its disposal 2000 operational TNWs
(Kristensen 2012, pp. 16, 46). We recognize, and as is discussed in detail below, that the Russians have a greater need for TNWs due to their
difficult strategic situation. Moscow faces technologically more advanced NATO forces in the West, and in the East they face Chinese numerical
superiority. We also acknowledge that some scholars doubt their deterrent role even for the Russians (Pomper et al. 2009, pp. 14–15, Yost
2011b, p. 1408). There are scholars who attribute the hawkish Russian position on TNWs to the Russian military and thus a matter of domestic
politics (Pomper et al. 2009, pp. 16–17). Other analysts, David Yost most significantly, see their role as strengthening Russian national
confidence (Yost 2012, p. 21) or as aiding the demonization of NATO (Sutyagin 2012, p. 56). Unfortunately, there is more to Russian tactical
nuclear capabilities than domestics or a dire strategic situation. A major story by preeminent security journalist Bill Sweetman revealed that the
Russian doctrine calls for TNWs use to compel deescalation by the opponent (Sweetman
2013). According to Sweetman's reporting, the use of TNWs for purposes of deescalation appeared after the 1999 war between NATO and
Yugoslavia over Kosovo (Sweetman 2013).10 Russia's conception that a tactical nuclear strike against a
high-value target would compel NATO to deescalate is worrisome. No doubt, NATO's
tactical nuclear capabilities have a direct role to play to deter such strikes.
Extinction
Fisher 15 (Max, Foreign affairs columnist at VOX, "How World War III became possible,"
6/29, http://www.vox.com/2015/6/29/8845913/russia-war)
That is why, analysts will tell you, today's tensions bear far more similarity to the period before World War
I: an unstable power balance, belligerence over peripheral conflicts, entangling
military commitments, disputes over the future of the European order, and dangerous
uncertainty about what actions will and will not force the other party into conflict. Today's Russia, once more the strongest
nation in Europe and yet weaker than its collective enemies, calls to mind the turn-of-the-century German
Empire, which Henry Kissinger described as "too big for Europe, but too small for the world." Now, as then, a rising power,
propelled by nationalism, is seeking to revise the European order. Now, as then, it believes that
through superior cunning, and perhaps even by proving its might, it can force a larger role for itself. Now, as then, the drift toward war is
gradual and easy to miss — which is exactly what makes it so dangerous. But there is one way in which today's
dangers are less like those before World War I, and more similar to those of the Cold War: the
apocalyptic logic of nuclear weapons. Mutual suspicion, fear of an existential threat,
armies parked across borders from one another, and hair-trigger nuclear weapons all
make any small skirmish a potential armageddon. In some ways, that logic has grown even more dangerous.
Russia, hoping to compensate for its conventional military forces' relative weakness, has dramatically relaxed its rules
for using nuclear weapons. Whereas Soviet leaders saw their nuclear weapons as pure deterrents, something that existed
precisely so they would never be used, Putin's view appears to be radically different. Russia's official nuclear doctrine
calls on the country to launch a battlefield nuclear strike in case of a conventional war
that could pose an existential threat. These are more than just words: Moscow has repeatedly signaled its willingness
and preparations to use nuclear weapons even in a more limited war. This is a terrifyingly low bar for nuclear
weapons use, particularly given that any war would likely occur along Russia's borders and thus not far from Moscow. And it suggests
Putin has adopted an idea that Cold War leaders considered unthinkable: that a "limited" nuclear war, of small warheads dropped on the
battlefield, could be not only survivable but winnable. "It’s not just a difference in rhetoric. It’s a whole different world," Bruce G. Blair, a
nuclear weapons scholar at Princeton, told the Wall Street Journal. He called Putin's decisions more dangerous than those of any Soviet leader
since 1962. "There’s a low nuclear threshold now that didn’t exist during the Cold War." Nuclear theory is complex and disputable; maybe Putin
is right. But many theorists would say he is wrong, that the logic of nuclear warfare means a "limited" nuclear strike is in
fact likely to trigger a larger nuclear war — a doomsday scenario in which major
American, Russian, and European cities would be targets for attacks many times more powerful than the
bombs that leveled Hiroshima and Nagasaki. Even if a nuclear war did somehow remain limited and contained,
recent studies suggest that environmental and atmospheric damage would cause a
"decade of winter" and mass crop die-outs that could kill up to 1 billion people in a global famine.
Case

Existential threats outweigh – all life has infinite value and extinction eliminates
the possibility for future generations – err against threats, because of innate
cognitive biases
-- must preserve infinite lives and generations
-- question of intergenerational equity
-- existential threats are underestimated: global public good, intergenerational, unprecedented, scope neglect
GPP 17 (Global Priorities Project, Future of Humanity Institute at the University of Oxford, Ministry for Foreign Affairs of Finland, “Existential Risk: Diplomacy
and Governance,” Global Priorities Project, 2017, https://www.fhi.ox.ac.uk/wp-content/uploads/Existential-Risks-2017-01-23.pdf) 1.2. THE ETHICS OF EXISTENTIAL
RISK In his book Reasons and Persons, Oxford philosopher Derek Parfit advanced an influential argument about the importance of avoiding extinction: I believe that
if we destroy mankind, as we now can, this outcome will be much worse than most people think. Compare three outcomes: (1)
Peace. (2) A nuclear war that kills 99% of the world’s existing population. (3) A nuclear
war that kills 100%. (2) would be worse than (1), and (3) would be worse than (2). Which is the greater of these two differences? Most
people believe that the greater difference is between (1) and (2). I believe that the
difference between (2) and (3) is very much greater. ... The Earth will remain habitable
for at least another billion years. Civilization began only a few thousand years ago. If
we do not destroy mankind, these few thousand years may be only a tiny fraction of
the whole of civilized human history. The difference between (2) and (3) may thus be
the difference between this tiny fraction and all of the rest of this history. If we
compare this possible history to a day, what has occurred so far is only a fraction of a
second.65 In this argument, it seems that Parfit is assuming that the survivors of a nuclear war that kills 99% of the population would eventually be able to
recover civilisation without long-term effect. As we have seen, this may not be a safe assumption – but for the purposes of this thought experiment, the point
stands. What makes existential catastrophes especially bad is that they would “destroy
the future,” as another Oxford philosopher, Nick Bostrom, puts it.66 This future could potentially be
extremely long and full of flourishing, and would therefore have extremely large
value. In standard risk analysis, when working out how to respond to risk, we work out the expected value of risk reduction, by weighing the probability that
an action will prevent an adverse event against the severity of the event. Because the value of preventing existential
catastrophe is so vast, even a tiny probability of prevention has huge expected value.67
Of course, there is persisting reasonable disagreement about ethics and there are a number of ways one might resist this conclusion.68 Therefore, it would be unjustified to be overconfident in Parfit and Bostrom’s argument. In some areas, government policy does give
significant weight to future generations. For example, in assessing the risks of nuclear waste storage, governments have
considered timeframes of thousands, hundreds of thousands, and even a million years.69 Justifications for this policy usually appeal to principles of
intergenerational equity according to which future generations ought to get as much protection as current generations.70 Similarly, widely accepted norms of
sustainable development require development that meets the needs of the current generation without compromising the ability of future generations to meet their
own needs.71 However, when it comes to existential risk, it would seem that we fail to live
up to principles of intergenerational equity. Existential catastrophe would not only
give future generations less than the current generations; it would give them nothing.
Indeed, reducing existential risk plausibly has a quite low cost for us in comparison with
the huge expected value it has for future generations. In spite of this, relatively little is done to reduce existential
risk. Unless we give up on norms of intergenerational equity, they give us a strong case
for significantly increasing our efforts to reduce existential risks. 1.3. WHY EXISTENTIAL RISKS MAY BE
SYSTEMATICALLY UNDERINVESTED IN, AND THE ROLE OF THE INTERNATIONAL COMMUNITY In spite of the importance of
existential risk reduction, it probably receives less attention than is warranted. As a result,
concerted international cooperation is required if we are to receive adequate protection from existential risks. 1.3.1. Why existential risks are likely to be underinvested in
There are several reasons why existential risk reduction is likely to be underinvested in. Firstly, it is a global public good. Economic theory predicts that such
goods tend to be underprovided. The benefits of existential risk reduction are widely
and indivisibly dispersed around the globe from the countries responsible for taking
action. Consequently, a country which reduces existential risk gains only a small portion of the benefits but bears the full brunt of the costs. Countries thus
have strong incentives to free ride, receiving the benefits of risk reduction without contributing. As a result, too few do what is in the common interest.

Secondly, as already suggested above, existential risk reduction is an intergenerational public good:
most of the benefits are enjoyed by future generations who have no say in the
political process. For these goods, the problem is temporal free riding: the current
generation enjoys the benefits of inaction while future generations bear the costs.
Thirdly, many existential risks, such as machine superintelligence, engineered pandemics, and solar geoengineering, pose an
unprecedented and uncertain future threat. Consequently, it is hard to develop a satisfactory governance regime for them:
there are few existing governance instruments which can be applied to these risks, and it is unclear what shape new instruments should take. In this way, our
position with regard to these emerging risks is comparable to the one we faced when nuclear weapons first became available. Cognitive biases
also lead people to underestimate existential risks. Since there have not been any
catastrophes of this magnitude, these risks are not salient to politicians and the
public.72 This is an example of the misapplication of the availability heuristic, a mental shortcut which assumes that something is important only if it can be
readily recalled. Another cognitive bias affecting perceptions of existential risk is scope
neglect. In a seminal 1992 study, three groups were asked how much they would be willing to pay to save 2,000, 20,000 or 200,000 birds from drowning in
uncovered oil ponds. The groups answered $80, $78, and $88, respectively.73 In this case, the size of the benefits had little effect on the scale of the preferred
response. People become numbed to the effect of saving lives when the numbers get too
large.74 Scope neglect is a particularly acute problem for existential risk because the
numbers at stake are so large. Due to scope neglect, decision-makers are prone to
treat existential risks in a similar way to problems which are less severe by many
orders of magnitude. A wide range of other cognitive biases are likely to affect the evaluation of existential risks.75
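The expected-value calculus the GPP card describes can be sketched numerically. Every figure below is a hypothetical assumption chosen for illustration; none of them comes from the evidence itself.

```python
# Sketch of the expected-value logic from the GPP card above.
# All numbers are illustrative assumptions, not figures from the source.

future_lives = 10 ** 16      # assumed potential future lives over Earth's habitable span
prob_catastrophe = 0.01      # assumed chance of existential catastrophe this century
risk_reduction = 0.0001      # assumed tiny cut in that risk from one intervention

# Standard risk analysis: expected value = probability averted x stakes at risk
expected_value = prob_catastrophe * risk_reduction * future_lives

print(f"Expected lives saved: {expected_value:.3g}")
```

Even a one-in-ten-thousand reduction of a one-percent risk yields an expected value on the order of ten billion lives under these assumptions, which is the card’s point that “even a tiny probability of prevention has huge expected value.”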

Indicts of our epistemology/knowledge production don’t come before or disprove the case
Jackson 2010 (Patrick Thaddeus Jackson, Associate Professor of International Relations in the School of
International Service at the American University in Washington, DC, 2010, “The Conduct of Inquiry in International
Relations: Philosophy of Science and its Implications for the Study of World Politics,” ebook)
Faced with the impossibility of putting an end to the science question within IR by turning to the philosophy of
science, what should we do? Since we cannot resolve the question of what science is by appealing to a consensus
in philosophy, one option is to become philosophers of science ourselves, and to spend
our time and our scholarly efforts trying to resolve thorny and abstract issues about
the status of theory and evidence and the limits of epistemic certainty. But this is an
unappealing option for a scholarly field defined, if loosely, by its empirical focus (world
politics), and it would be roughly akin to advising physicists to become philosophers of
physics in order to resolve the question of what physics was and whether it was a
science. This also mis-states the relationship between philosophical debates and
scientific practice; practicing scientists have a pretty good working definition of what
it means for something to be “scientific,” but this “is less a matter of strategy than of
ongoing evaluative practice,” conducted in the course of everyday knowledge-
producing activities (Taylor 1996, 133). We do not expect physicists to give philosophical answers to
questions about the scientific status of their scholarship; we expect them to produce knowledge of the physical
world. Similarly,
we should not expect IR scholars to engage in “philosophy of IR” to the
detriment of generating knowledge about world politics; the latter, not the former, is
our main vocational task.

Trump can always circumvent using war powers – Syria and the War Powers Resolution prove
Matthew Dickinson 4-14-2018. (Matthew Dickinson previously taught at Harvard University, where he also received his
Ph.D., working under the supervision of presidential scholar Richard Neustadt, and was a Fellow in the Governmental Studies Program at the Brookings Institution.
Matthew Dickinson is the author of Bitter Harvest: FDR, Presidential Power, and the Growth of the Presidential Branch (Cambridge University Press) and co-editor of
Guardian of the Presidency: The Legacy of Richard E. Neustadt (The Brookings Institution). He has also published numerous articles on the presidency, Congress,
presidential decision making, and presidential advisers. His current book project, titled The President and the White House Staff: People, Positions and Processes,
1945-2016, examines the growth of presidential staff in the post-World War II era."Presidential Power." Presidential Power. Web. 4-14-2018. accessed 9-10-2018.
< https://sites.middlebury.edu/presidentialpower/ >. Phops)

By now, most of you have heard that Great Britain, France and U.S military forces combined to strike three Syrian chemical
weapons facilities earlier this morning (about 4 a.m. Syrian time). The strikes were in response to the reported use of chemical weapons, most likely chlorine gas and the
sarin nerve agent, by the Assad regime against its own citizens in rebel-held areas. According to the just-concluded joint State Department and Pentagon press briefing, coalition forces,
including surface ships, submarines and aircraft, launched a total of 105 weapon strikes, largely destroying the targets. At this point it is unclear how many, if any, casualties (military or civilian)
the strikes inflicted. Pentagon officials say that prior to the attacks, “deconfliction” channels were used to warn Syria’s ally Russia that the strikes were imminent, although no specific logistical
details regarding the timing or the nature of the attacks were conveyed. Syrian air defenses were deployed but – again, according to Pentagon briefing – the Syrian efforts were ineffective,
with some (most?) of their anti-missile launches coming after the targets had been hit. To this point, there has been no sign of Russian involvement in the Syrian response. The
decision by President Trump to launch the retaliatory strikes immediately raises two
important questions. First, under what authority did he order the strikes? Second, what was their objective, and
was it accomplished? In what can be considered either a blatant “eff-you” to his critics (and to the gods), or a demonstration of historical ignorance, Trump issued the following tweet this
morning: “@realDonaldTrump: A perfectly executed strike last night. Thank you to France and the United Kingdom for their wisdom and the
power of their fine Military. Could not have had a better result. Mission Accomplished!” For most of us, of course, the “Mission Accomplished” tagline immediately invokes memories of the
banner placed by sailors in May, 2003, on the U.S.S. Abraham Lincoln. That banner served as the backdrop to President Bush’s announcement from the ship’s deck that we had reached the
end of “major combat operations” in Iraq. As we now know, within a couple of years a full-scale insurgency broke out in Iraq which led to the U.S. recommitting military forces. The “Mission
Accomplished” banner became a symbol of a military intervention gone wrong. While social media had a field day with Trump’s brazen proclamation, it is worth considering what mission he
believes the strikes accomplished. In the days before the Assad regime (allegedly) gassed its own people, Trump had publicly voiced his desire to get U.S. military forces out of Syria – echoing
promises he had made (to generally favorable audience responses) during the 2015-16 presidential campaign. The primary reason for military intervention in the region, he argued, was to
defeat ISIS and that goal was, essentially, achieved. However, although the talking heads on cable and social media made much of the seeming inconsistency between Trump’s professed desire
to leave Syria and the subsequent air strikes, the reality is that a suspected chemical attack took place between those two events. It’s worth recalling that Trump’s predecessor Barack Obama,
in a 2013 press conference, stated, “We have been very clear to the Assad regime, but also to other players on the ground, that a red line for us is we start seeing a whole bunch of chemical
weapons moving around or being utilized. That would change my calculus. That would change my equation.” To be sure, a full read of Obama’s remarks suggests the “red line” might have
referred to evidence that the Assad regime had lost control over its chemical weapons, instead of marking their use as unacceptable. Ultimately, when Assad did use chemical weapons, Obama
argued that the U.S. should take military action, but he also chose to let Congress decide whether to authorize a military response – a choice undoubtedly made with the realization that
Congress was unlikely to agree on how to act. That type of lawyerly reasoning did not endear Obama to everyone, but it did effectively preclude a potential debate over the extent of his war-
making powers. No matter how one interprets Obama’s words, it is apparent that for Trump, Syria’s use of chemical weapons constitutes crossing his own red line, as he made clear a year ago
in ordering a limited missile strike against a Syrian airfield used by Assad to launch a previous chemical strike. In this way Trump has decided to
push the expanse of presidential power, whereas Obama held back. So what did Trump hope to accomplish
with these latest strikes? The joint State-Pentagon briefing this morning was instructive. Both State Department Assistant Secretary for Public Affairs Dana White and the Joint Chiefs Director
Kenneth McKenzie made clear in response to persistent questioning that the goal was to deter further use of chemical weapons by the Syrian government. They consistently refused to engage
in any discussion about the impact of the strikes on the Syrian civil war. Did Trump accomplish this mission? Will the strikes deter Assad from further use of chemical weapons? The answer to
that question will go a long way to determining how one should respond to the second issue I raised above: does Trump have the authority to
launch these strikes on his own? In the aftermath of the latest strikes, as they did after the previous strikes, journalists and pundits have been eagerly
parsing the relevant statutes and constitutional provisions to explain why Trump does, or does not, have this authority. The reality, however, as the late, great presidential scholar Richard
Neustadt reminded us in his classic study of the Presidency: “The probabilities of [presidential] power do not derive
from a literary theory of the Constitution.” What Neustadt meant is that a close textual reading of the
Constitution, and related statute cannot – by itself – determine the answer to this
question, because these texts only provide formal vantage points from which the
relevant actors – in this case, the President, Congress, and the Courts, will do battle.
The relevant documents do not, by themselves, determine the victor. In this regard, Neustadt was a
Hamiltonian; he believed the Constitution, and the nation, was best served by a president who constantly
sought to expand the boundaries of his formal powers until he bumped up against
determined opposition. As Neustadt put it, “The more determinedly a President seeks power, the
more he will be likely to bring vigor to his clerkship. As he does so he contributes to
the energy of government.” To be sure, the Framers of the Constitution could not have anticipated the demands placed on our political institutions by modern
warfare. But the Trump administration’s evoking of its Article II “executive power” to protect
national interests as a justification for the missile strikes is a reminder that the
Constitution has proved to be a remarkably adaptable document. Presidents are
constantly pushing its boundaries, trying to expand their implied powers, until someone – Congress?
the Courts? – pushes back. Some see this as a weakness, and would prefer more clearly stated restrictions on a president’s war-making powers. However, as we can see
with the limited impact of the War Powers Resolution, which has never been effectively invoked, beyond compliance with its
reporting requirements, it is difficult to legislate limits to presidential power in this area . In Madison’s words,

“parchment barriers” (Federalist 48) haven’t proved to be a very useful limit on presidential war

making. (If press reports are accurate, Trump complied with the reporting requirements of the War Powers Resolution, at least in principle, by informing congressional leaders, of his
intentions to launch military strikes.) So what has worked? Politics. History suggests that limits on presidents’ capacity to engage in military action are a function of how well Congress is able to
push back against presidential war making, and whether the public, broadly speaking, sides with the legislative body. And, for better or for worse, that calculus will depend in part on
assessments regarding whether the missile strikes were justified, and whether they achieved their goal. In this regard, Vermont’s Democratic congressional contingent condemned Assad’s use
of chemical weapons, but also questioned Trump’s authority to launch air strikes, while not fully repudiating the strikes themselves. This is the type of careful political calculus that the nation’s
lawmakers must make in the next hours and days. Their collective decision, in turn, will provide cues to how the public ultimately responds to Trump’s actions. But the public’s response will
also be conditioned in part by what impact the strikes have on Assad’s use of chemical weapons, as reported by the media. Trump has acted, citing his Article II “executive power” to protect
the national interest. Undoubtedly, some will claim that this is an unprecedented, and thus unconstitutional, increase in the president’s war-making capacity based on an expansive and
unjustified reading of Article II. Others will argue it is entirely consistent with Hamilton’s conception of an “energetic” president leading the War against Terror, based on prerogative powers
implied by Article II. Who is right? It is important to realize that there is no “correct” answer to this as determined by a careful reading of relevant statutes. Instead, the Framers expected these
questions to be resolved through the political process. Historically, presidents have often sought to expand their
prerogative powers to protect the nation, as implied by Article II. Congress has not always agreed with these efforts, although
they have frequently been content to let presidents assert their expanded power, and have waited to see
how that power is used, and whether the outcome is ratified by the public. As Neustadt put it, “The need of others for a President’s initiatives creates dependence on him. Their dependence
becomes his advantage. Yet he can only capture the advantage as he meets the need.”

Rejection of securitization leads to instability and international intervention – turns their impact
McCormack 10 – Lecturer in International Politics
Tara McCormack, is Lecturer in International Politics at the University of Leicester and has a PhD in International
Relations from the University of Westminster. 2010, Critique, Security and Power: The political limits to
emancipatory approaches, pg. 127-129
The following section will briefly raise some questions about the rejection of the old security framework as it has
been taken up by the most powerful institutions and states. Here we can begin to see the political limits to
critical and emancipatory frameworks. In an international system which is marked by great power inequalities
between states, the rejection of the old narrow national interest-based security framework by major international
institutions, and the adoption of ostensibly emancipatory policies and policy rhetoric, has the
consequence of problematising weak or unstable states and allowing international
institutions or major states a more interventionary role, yet without establishing
mechanisms by which the citizens of states being intervened in might have any
control over the agents or agencies of their emancipation. Whatever the problems
associated with the pluralist security framework there were at least formal and
clear demarcations. This has the consequence of entrenching international power
inequalities and allowing for a shift towards a hierarchical international order in
which the citizens in weak or unstable states may arguably have even less freedom
or power than before. Radical critics of contemporary security policies, such as human security and humanitarian
intervention, argue that we see an assertion of Western power and the creation of liberal subjectivities in the developing world. For
example, see Mark Duffield’s important and insightful contribution to the ongoing debates about contemporary international security and
development. Duffield attempts to provide a coherent empirical engagement with, and theoretical explanation of, these shifts. Whilst
these shifts, away from a focus on state security, and the so-called merging of security and development are often portrayed as positive
and progressive shifts that have come about because of the end of the Cold War, Duffield argues convincingly that these shifts are highly
problematic and unprogressive. For example, the rejection of sovereignty as formal international equality and a presumption of
nonintervention has eroded the division between the international and domestic spheres and led to an international environment in which
Western NGOs and powerful states have a major role in the governance of third world states. Whilst for supporters of humanitarian
intervention this is a good development, Duffield points out the depoliticising implications, drawing on examples in Mozambique and
Afghanistan. Duffield also draws out the problems of the retreat from modernisation that is represented by sustainable development. The
Western world has moved away from the development policies of the Cold War, which aimed to develop third world states industrially.
Duffield describes this in terms of a new division of human life into uninsured and insured life. Whilst we in the West are ‘insured’ – that is
we no longer have to be entirely self-reliant, we have welfare systems, a modern division of labour and so on – sustainable development
aims to teach populations in poor states how to survive in the absence of any of this. Third world populations must be
taught to be self-reliant, they will remain uninsured. Self-reliance of course means
the condemnation of millions to a barbarous life of inhuman bare survival . Ironically,
although sustainable development is celebrated by many on the left today, by leaving people to fend for themselves rather than
developing a society wide system which can support people, sustainable development actually leads to a less human and humane system
than that developed in modern capitalist states. Duffield also describes how many of these problematic shifts are embodied in the
contemporary concept of human security. For Duffield, we can understand these shifts in terms of Foucauldian biopolitical framework,
which can be understood as a regulatory power that seeks to support life through intervening in the biological, social and economic
processes that constitute a human population (2007: 16). Sustainable development and human security are for Duffield technologies of
security which aim to create self-managing and self-reliant subjectivities in the third world, which can then survive in a situation of serious
underdevelopment (or being uninsured as Duffield terms it) without causing security problems for the developed world. For Duffield this is
all driven by a neoliberal project which seeks to control and manage uninsured populations globally. Radical critic Costas Douzinas (2007)
also criticises new forms of cosmopolitanism such as human rights and interventions for human rights as a triumph of American hegemony.
Whilst we are in agreement with critics such as Douzinas and Duffield that these new security frameworks cannot
be empowering, and ultimately lead to more power for powerful states, we need to
understand why these frameworks have the effect that they do. We can understand that these frameworks have political limitations
without having to look for a specific plan on the part of current powerful states. In new security frameworks such as human security we can
see the political limits of the framework proposed by critical and

Juggernaut K – Reject the privileged framing of the aff’s NFU policy – they are complicit in the logic of Syrian sacrifice – the positionality of the affirmative undermines their impact and solvency constructions – voting affirmative is ultimately a gesture toward the position of the juggernaut.
Dominic Tierney 9-14-2016, associate professor of political science at Swarthmore College,
https://www.theatlantic.com/international/archive/2016/09/nuclear-obama-north-korea-pakistan/499676/,
"Refusing to Nuke First Is for the Powerful," Atlantic, Acc:9-22-2018 (ermo/sms)
On September 5, The New York Times reported that the Obama administration is weighing whether to adopt a so-called “no-first-use” nuclear doctrine. This would allow the
United States to launch nuclear weapons only if the enemy deployed them first. Such a change would be a dramatic policy shift: Washington has always kept the option of a
preemptive strike on the table. Under President Obama, a no-first-use doctrine has been widely regarded as an idealistic policy for the United States—a
noble, if controversial, step toward achieving his goal of “a world without nuclear weapons.” Through self-restraint, and the disavowal of a first strike, America could “escape the logic of fear,”
as Obama said at the Hiroshima Peace Memorial last May. Indeed, conservatives have condemned the no-first-use pledge as another instance of typical liberal naiveté on defense matters, or
of “ticking the boxes the far-Left long wanted ticked.” By removing the first-strike option, the argument goes, Washington will weaken America’s nuclear deterrent, embolden its enemies, and
undermine allies like Japan that rely on the U.S. nuclear umbrella. Even many of Obama’s top foreign-policy advisors are concerned by the potential security implications of this idea. Under a
storm of pressure, the president may very well decide that no-first-use is a bridge too far. But many of the arguments both for and against no-
first-use misunderstand it: The policy reflects the power to set the rules of war, rather than
some wayward pacifist ideal to end all war. Countries that issue no-first-use pledges boast strong conventional militaries. These states want to encourage a model of war where their army
meets the enemy on a conventional battlefield with clearly defined rules—the kind of war, in other words, that they usually win. Nuclear weapons upend this model, because they help weaker
actors, the North Koreas and Pakistans of the world, produce extraordinary destruction, level the playing field, and cast victory into doubt. Therefore, a no-first-use pledge could potentially
reinforce a powerful state’s strategic advantage by discouraging other countries from developing nuclear arsenals, and by dissuading nuclear-armed countries from pushing the button. This
would happen with the assurance that America would not fire first—thereby keeping war safely bound and safely winnable, on the powerful state’s terms. The same logic
helps explain why the United States is far more concerned if 1,000 Syrians die from
chemical weapons than if 100,000 Syrians die from guns and explosives. Normalizing the use of chemical weapons around the world is not in
the U.S. strategic interest because Washington wants to keep conflict in its comfort zone of conventional warfare. If U.S. officials concluded that chemical weapons were, in fact, of critical
strategic value, they would likely soon abandon their moral reservations over their use. Countries that contemplate or introduce a no-first-use policy are almost always strong states that enjoy
a conventional-weapons edge. Since its first nuclear test in 1964, China has repeatedly declared that it “undertakes not to be the first to use nuclear weapons at any time or under any
circumstances.” It’s no coincidence that China is the most powerful East Asian country, and would hold the advantage in any conventional war with South Korea, Vietnam, Japan, or Taiwan
(assuming, of course, that the United States stayed out). The spread of nuclear weapons in East Asia would diminish China’s strategic advantage; therefore, Beijing seeks to prevent this
outcome with a no-first-use policy. Meanwhile, India announced in 1999 that it “will not be the first to initiate a nuclear strike, but will respond with punitive retaliation should deterrence fail.”
In 2003, India qualified its no-first-use pledge by stating, “in the event of a major attack against India, or Indian forces anywhere, by biological or chemical weapons, India will retain the option
of retaliating with nuclear weapons.” Again, it’s no coincidence that India is very likely to prevail over Pakistan in a future conventional war. India has a history of winning previous contests,
and currently spends about $50 billion per year on defense compared to Pakistan’s $9.5 billion. New Delhi can safely issue a no-first-use pledge in the hope of keeping the strategic terrain
favorable. Last month, General James E. Cartwright, former head of the U.S. Strategic Command, and Bruce G. Blair, former Minuteman launch officer, co-authored an op-ed in The New York
Times in favor of a U.S. no-first-use policy. They showed, explicitly, how power undergirds the proposed doctrine. “Our
nonnuclear strength, including economic and diplomatic power, our alliances, our conventional and cyber weaponry and our technological advantages,
constitute a global military juggernaut unmatched in history. The United States simply does not need nuclear weapons
to defend its own and its allies’ vital interests, as long as our adversaries refrain from their use.” By contrast, weak states don’t even think about a
no-first-use policy. Indeed, threatening to push the button early in a conflict is the basis of their deterrent plan. During the Cold War, when the Soviet Union had
conventional superiority in Europe, the United States and its NATO allies intended to escalate to nuclear war if the Red Army launched an invasion. Similarly, today, Pakistan explicitly threatens

to retaliate with nuclear weapons if it is ever attacked—even through a conventional invasion. Viewed through a strategic—and perhaps more
cynical—lens, the no-first-use doctrine also has a huge credibility problem. For the U.S. pledge to truly
matter, a president who otherwise favors a nuclear first strike would have to decide not to press the button because of this policy. But in an extreme national
crisis—one involving, say, North Korean nuclear missiles—a president is unlikely to feel bound by America’s former
assurance. After all, if a country is willing to use nuclear weapons, it’s also willing to break a

promise. Champions and critics of no-first-use often cast it as a principled policy and a revolutionary step, for
good or for ill. But the idealistic symbolism of no-first-use betrays an underlying reality. Disavowing a first strike is a luxury afforded

to the strong, and they play this card in the hope of strategic benefit. If Obama made a dramatic announcement of no-first-use, it would
probably have less impact than people think because other countries wouldn’t follow suit, especially if they’re weak. And, in any case, the

promise may be meaningless because no one can predict a president’s calculus when staring down a nuclear
holocaust. No-first-use is the policy of Goliath, not Gandhi.

Presidential power key to credible multilat best way to solve peace studies
Stewart Patrick 12, Research Associate @ the Center on International Cooperation, NYU,
Multilateralism and US Foreign Policy edited by Patrick and Forman, p. 18-19, AVD
Still, the deflation of presidential power can complicate U.S. commitment to credible multilateralism. For one thing, it increases the chance that the executive branch will agree to assume international obligations that the legislature either opposes or has no intention of fulfilling. This predicament is likely to be most acute when different parties control the two branches, as was the case for most of the past decade. During the 1990s, Congress used the power of the purse to reduce foreign aid, cut IMF and World Bank funding, withhold UN assessments, and impose budgetary retrenchment and bureaucratic consolidation on the State Department complex. Following the 1994 elections, a prolonged interbranch struggle produced disarray in U.S. foreign policy, including the stalling of a dozen multilateral conventions. During the Cold War, the United States’ foreign partners could take comfort in the bipartisan U.S. consensus on international engagement and the relative orderliness and transparency of policymaking in Washington. Changing strategic and domestic circumstances, however, have cast doubt on the credibility of U.S. commitments to multilateral institutions. To begin with, there is no domestic agreement today about the composition, scope, and ranking of U.S. national interests, the resources needed to pursue them, or the global commitments they warrant. Moreover, the first post-Cold War decade saw a marked erosion of the longstanding bipartisan internationalist consensus in Congress. Many stalwarts of constructive internationalism in both parties were replaced by colleagues preoccupied with domestic concerns or suspicious that global regimes and organizations infringe on U.S. sovereignty, thwart U.S. interests and values, or place unacceptable checks on U.S. options. Foreign policy was increasingly the subject of partisan squabbling and ideological disagreement. In an inversion of the old adage, as Sarah Sewall observes, “partisanship seemed to grow stronger at the water’s edge.” The politicization of foreign affairs has been reinforced by the growing salience of “inter-mestic” issues, like trade and immigration, which blur the boundary between foreign and domestic policy and tend to divide rather than unite Americans. Within Congress, moreover, the making of foreign policy has become increasingly decentralized and atomized with the decline in party discipline and the proliferation of committees touching on foreign policy matters. This development allows individual legislators to establish independent foreign policy platforms. This activism is troubling, argues Lee Hamilton, former chair of the House Foreign (now International) Affairs Committee, because members have little incentive and are poorly organized to engage in multilateral diplomacy. It has resulted in “a bias toward unilateralism in foreign policy,” making it “harder to manage alliances, institutions, and long-term policies across regions and topics in a highly interconnected and complex world.”

Pres powers k2 combat nuclear terrorism


Yoo 7
John Yoo is currently the Emanuel S. Heller Professor of Law at the University of California, Berkeley., “Exercising
Wartime Powers”, Harvard International Review, Published 4-18-2007, Accessed 6-27-2018,
[http://hir.harvard.edu/article/?a=1369]/mnw
The Iraq War is beginning to look like a rerun of the Vietnam War, and not just because critics are crying out that the United States has again fallen into a quagmire. War opponents argue that a wartime president has overstepped his constitutional bounds and that if Congress’ constitutional role in deciding on war had been respected, the United States could have avoided trouble or at least have entered the war with broader popular support. According to Democratic Senator Edward Kennedy, the White House is improperly “abusing power, is excusing and authorizing torture, and is spying on American citizens.” The terrorists would never have been harshly interrogated and intelligence surveillance never domestically expanded if only President George W. Bush had looked to Congress. These war critics misread the Constitution’s allocation of warmaking powers between the executive and legislative branches. Their interpretation is weakest where their case should be its strongest: who gets to decide whether to start a war. For much of the history of the nation, presidents and congresses have understood that the executive branch’s constitutional authority includes the power to begin military hostilities abroad. Energy in the Executive
During the last two centuries, neither the president nor Congress has ever acted under the belief that the Constitution requires a declaration of war before engaging in military hostilities abroad. Although the United States has used force abroad more than 100 times, it has declared war only five times: the War of 1812, the Mexican-American and Spanish-American wars, and World War I and II. Without declarations of war or any other congressional authorization, presidents have sent troops to oppose the Russian Revolution, intervene in Mexico, fight Chinese Communists in Korea, remove Manuel Noriega from power in Panama, and prevent human rights disasters in the Balkans. Other conflicts, such as both Persian Gulf Wars, received “authorization” from Congress but not declarations of war. Critics of these conflicts want to upend long practice by appealing to an “original understanding” of the Constitution. But the text and structure of the Constitution, as well as its application over the last two centuries, confirm that the president can begin military hostilities without the approval of Congress. The Constitution does not establish a strict warmaking process because the Framers understood that war would require the speed, decisiveness, and secrecy that only the presidency could bring. “Energy in the executive,” Alexander Hamilton argued in the Federalist Papers, “...is essential to the protection of the community against foreign attacks.” He continued, “the direction of war most peculiarly demands those qualities which distinguish the exercise of power by a single hand.” Rather than imposing a fixed, step-by-step method for going to war, the Constitution allows the executive and legislative branches substantial flexibility in shaping the decision-making process for engaging in military hostilities. Given the increasing ability of rogue states to procure weapons of mass destruction (WMDs) and the rise of international terrorism, maintaining this flexibility is critical to preserving US national security. The Declare War Clause Critics of the President’s war powers appeal to an understanding of declarations probably taught in most high school civics classes. It is perhaps common sense to equate the power to “declare” war with the power to “begin” or “commence” war. The Constitution’s Declare War Clause, however, should not be considered in isolation. In fact, the Constitution does not consistently use the word “declare” to mean “begin” or “initiate.” For instance, one constitutional provision withdraws from states the power to “engage” in war. If “declare” meant “begin” or “make,” the provision should have prohibited states from “declaring” war. Similarly, another provision defines treason as “levying War” against the United States. Again, if “declare” had the clear meaning of “begin” or “wage,” then the Constitution should have made treason the crime of “declaring war” against the United States. The evidence suggests that eighteenth-century English speakers used “engage” and “levy” broadly to include beginning or waging warfare, but not “declare,” which instead carried the connotation of the recognition
of a legal status rather than of an authorization. Aside from the constitutional text itself, the structure of several constitutional provisions suggests that declaring war does not
mean the same thing as beginning, conducting, or waging war. As just mentioned, one provision generally prohibits the states from engaging in war, but it allows states to conduct hostilities if Congress approves: “No States shall, without the Consent of Congress, . . . engage in War, unless
actually invaded, or in such imminent Danger as will not admit of delay.” This provision is significant because it creates the exact process that many scholars believe should govern warmaking, namely congressional preapproval with an exception for unilateral presidential response to
actual attacks. If one believes that the Framers were consistent throughout the Constitution, the Framers should have written that “the President may not, without the Consent of Congress , engage in War, unless actually invaded, or in such imminent Danger as will not admit of delay.”
Instead, the Constitution merely gives Congress the power to declare war and the President the power to act as Commander-in-Chief, all without any description of process. The absence of a defined process for going to war is telling because the Constitution usually makes it very clear
when it requires a specific process before the government can act. This is particularly the case when the Constitution imposes shared power on the executive and legislative branches. Article I, for example, establishes a finely tuned system of bicameralism and presentment necessary to
enact federal laws. Article II, Section 2 declares that the President can make treaties subject to the advice and consent of two-thirds of the Senate while appointments can be made subject to consent of a bare majority of the Senate. Both provisions establish a process, the order in which
each institution acts, and the minimum votes required. By contrast, the Constitution does not define a process for warmaking. This suggests that the absence of a defined warmaking process is an intentional element of constitutional design. The Constitution is not merely a list of
unassociated ideas; articles, sections, and even clauses often have specific functions or themes. The Declare War Clause is housed in Article I, Section 8, Clause 11. In addition to the power to declare war, that provision also vests in Congress the now-obscure powers to grant letters of
marque and reprisal and to make rules concerning captures. Significantly, both of these powers relate to the recognition or declaration of a legal status rather than the authorization to carry out a specific activity. Rules on capture do not authorize captures in wartime but only determine
their ownership, while letters of marque and reprisal extend the benefits of combat immunity to private forces. Reading these grouped clauses to share a common nature suggests that the Declare War Clause vested Congress with a power devoted to declarations of the international legal
status of certain actions. Constitutional History and Context Historical context provides further support to this understanding of the Constitution. The Framers would have understood the distribution of war powers between the executive and legislative branches in the context of the
British Constitution, the source of many legal concepts found in the US Constitution. Under the formal British system, the King exercised all war powers, including the power to declare war. A declaration of war was not needed either to begin or to wage a war; instead, it served as a
courtesy to the enemy and as a definition of the status of their relations under international law. It notified the enemy that a state of war existed so as to formally invoke the protections of international law. It also played a domestic legal role by informing citizens of an alteration in their
legal rights and status: during periods of formal war, citizens of the contending nations could “annoy” the persons or property of the enemy and lawfully keep captured vessels. British governmental practice in the eighteenth century indicates that Parliament’s control over funding, rather
than the role of declaring war, provided a sufficient check over executive warmaking. In the 100 years before enactment of the US Constitution, Britain engaged in eight significant military conflicts but only once “declared” war at the start of a conflict. Parliament’s true check on executive
power came through control over the raising of armies and the power of the purse. This allocation of war powers under the British Constitution was not mere happenstance. Rather, the distinction between war powers and powers to fund and legislate was a core element of the
separation of powers and the rise of parliamentary democracy. Critics sometimes claim that the Constitution would not have created a powerful presidency because the Framers had rebelled against monarchy. This view misses the nuances of US political history in the period between the
Revolution and the Constitution. After independence, the revolutionaries did indeed turn against executive authority. The new state constitutions sought to weaken the executive by placing explicit restrictions on its power and by diluting its i ndependence and structural unity. In fact, in
all but one state, governors were actually elected by the legislature. But what modern-day critics of the presidency forget is that the Constitution rejected many of these innovations. Many of the drafters and ratifiers of the Constitution believed that the revolutionary state constitutions
had gone too far and had led to abuses by unchecked state legislatures. Experiments in fragmenting executive power and frustration with the limited powers of the Continental Congress led reformers to seek the restoration of authority in a unified presidency. Reading the Framers’
treatment of war powers as vesting the power over war in Congress would run directly counter to this larger historical trend. Details from the Framing debates themselves provide evidence that some of the Constitution’s supporters believed that it replicated the British system. While the
Constitutional Convention transferred the power to declare war from the British King to the Congress, an earlier draft of the Constitution had given Congress the power to “make” war, only to have it subsequently changed to the lesser power of “declare” war. When, in the all-important
state ratifying conventions, opponents of the Constitution criticized the presidency as a potential monarch, its defenders never trumpeted—although they had every incentive to do so—Congress’s power to declare war. Rather, when pressed during the Virginia ratifying convention with
the charge that the President’s powers could lead to a military dictatorship, James Madison argued that Congress’s control over funding would provide enough check to control the executive. National Security and Congressional Inadequacy Even if the constitutional text and history do not provide a definite answer, we should then ask whether requiring congressional approval for war would provide significant functional benefits to US national security. Proponents of congressional war power often argue that the executive branch is unduly prone to war. In this view, if the president and Congress have to agree on warmaking, the nation will enter fewer wars and wars that do occur will arise only after sufficient deliberation. But it is far from clear that outcomes would be better if Congress alone had the power to begin wars. First, congressional deliberation does not necessarily ensure consensus. Congressional authorization may represent only a bare majority of Congress or an unwillingness to challenge the President’s institutional and political strengths, regardless of the merits of the war. And even if
it does represent consensus, it is no guarantee of consensus after combat begins. The Vietnam War, which was initially approved by Congress, did not meet with a consensus over the long term but instead provoked some of the most divisive politics in US history. It is also difficult to claim
that congressional authorizations to use force in Iraq, either in 1991 or 2002, reflected a deep consensus over the merits of the wars there. The 1991 authorization barely survived the Senate, and the 2002 authorization received significant negative votes and has become a deeply divisive
issue in national politics. It is also not clear that the absence of congressional approval has led the nation into wars it should not have waged. The experience of the Cold War, which provides the best examples of military hostilities conducted without congressional support, does not
clearly come down on the side of a link between institutional deliberation and better conflict selection. Wars were fought throughout the world by the two superpowers and their proxies, such as in Korea, Vietnam, and Afghanistan, during this period. Yet the only war arguably authorized
by Congress—and this point is debatable—was the Vietnam War. Aside from bitter controversy over Vietnam, there appeared to be significant bipartisan consensus on the overall strategy of containment, as well as the overarching goal of defeating the Soviet Union. The United States did

not win the four-decade Cold War by declarations of war; rather, it prevailed through the steady presidential application of the strategy of containment, supported by congressional funding of the necessary military forces. On the other hand, congressional action has led to undesirable outcomes. Congress led the United States into two “bad” wars, the 1798 quasi-war with France and the War of 1812. Excessive congressional control can
also prevent the United States from entering into conflicts that are in the national interest. Most would agree now that congressional isolationism before World War II harmed US interests and that the United States and the world would have been far better off if President Franklin
Roosevelt could have brought the United States into the conflict much earlier. Congressional participation does not automatically or even consistently produce desirable results in war decision making. Critics of presidential war powers exaggerate the benefits of declarations or authorizations of war. What also often goes unexamined are the potential costs of congressional participation: delay,
inflexibility, and lack of secrecy. In the post-Cold War era, the United States is
confronting the growth in proliferation of WMDs, the emergence of rogue nations,
and the rise of international terrorism. Each of these threats may require pre-emptive action best undertaken by the President and approved by Congress only afterward. Take the threat posed by the Al Qaeda terrorist organization. Terrorist attacks are more
difficult to detect and prevent than conventional ones. Terrorists blend into civilian
populations and use the channels of open societies to transport personnel, material,
and money. Although terrorists generally have no territory or regular armed forces
from which to detect signs of an impending attack, WMDs allow them to inflict
devastation that once could have been achievable only by a nation-state. To defend
itself from this threat, the United States may have to use force earlier and more often
than when nation-states generated the primary threats to US national security. The
executive branch needs the flexibility to act quickly, possibly in situations wherein
congressional consent cannot be obtained in time to act on the intelligence. By acting earlier, the executive
branch might also be able to engage in a more limited, more precisely targeted, use of force. Similarly, the least dangerous way to prevent rogue nations from acquiring WMDs may depend on secret intelligence gathering and covert action rather than open military intervention. Delay for a congressional debate could render useless any time-critical intelligence or windows of opportunity. The Constitution creates a presidency that is uniquely structured to act forcefully and independently to repel serious threats to the nation. Instead of specifying a legalistic process to begin war, the Framers wisely created a fluid political process in which legislators would use their appropriations power to control war. As the United States confronts terrorism, rogue nations, and WMD proliferation, we should look skeptically at claims that radical changes in the way we make war would solve our problems, even those stemming from poor judgment, unforeseen circumstances, and bad luck.

High presidential powers k2 solve 4GW also turns case


Li 2009 [Zheyao Li Winter, 2009 The Georgetown Journal of Law Public Policy 7 Geo. J.L. & Pub. Pol'y 373 “War
Powers for the Fourth Generation: Constitutional Interpretation in the Age of Asymmetric Warfare” lexis]
Even as the quantity of nation-states in the world has increased dramatically since the end of
World War II, the institution of the nation-state has been in decline over the past few decades.
Much of this decline is the direct result of the waning of major interstate war, which
primarily resulted from the introduction of nuclear weapons. 122 The proliferation of nuclear
weapons, and their immense capacity for absolute destruction, has ensured that conventional wars remain
limited in scope and duration. Hence, "both the size of the armed forces and the quantity of weapons at their disposal has declined
quite sharply" since 1945. 123 At the same time, concurrent with the decline of the nation-state in the
second half of the twentieth century, non-state actors have increasingly been willing
and able to use force to advance their causes. In contrast to nation-states, who adhere
to the Clausewitzian distinction between the ends of policy and the means of war to
achieve those ends, non-state actors do not necessarily fight as a mere means of advancing
any coherent policy. Rather, they see their fight as a life-and-death struggle, wherein
the ordinary terminology of war as an instrument of policy breaks down because of
this blending of means and ends. 124¶ It is the existential nature of this struggle and the
disappearance of the Clausewitzian distinction between war and policy that has given
rise to a new generation of warfare. The concept of fourth-generational warfare was first articulated in an influential
article in the Marine Corps Gazette in 1989, which has proven highly prescient. In describing what they saw as the modern trend toward a new
phase of warfighting, the authors argued that:¶ [*395] In broad terms, fourth generation warfare seems likely to
be widely dispersed and largely undefined; the distinction between war and peace will be blurred to the vanishing
point. It will be nonlinear, possibly to the point of having no definable battlefields or
fronts. The distinction between "civilian" and "military" may disappear. Actions will occur
concurrently throughout all participants' depth, including their society as a cultural, not just a physical, entity. Major military facilities, such as
airfields, fixed communications sites, and large headquarters will become rarities because of their vulnerability; the same may be true of civilian
equivalents, such as seats of government, power plants, and industrial sites (including knowledge as well as manufacturing industries). 125 It
is precisely this blurring of peace and war and the demise of traditionally definable
battlefields that provides the impetus for the formulation of a new theory of war
powers. As evidenced by Part III, supra, the constitutional allocation of war powers, and the
Framers' commitment of the war power to two co-equal branches, was not designed
to cope with the current international system, one that is characterized by the
persistent machinations of international terrorist organizations, the rise of
multilateral alliances, the emergence of rogue states, and the potentially wide
proliferation of easily deployable weapons of mass destruction, nuclear and otherwise.
The pursuit of hegemony is inevitable – Any alternative to US primacy results in
Nuclear Prolif and Global Instability
Tooley 2015– [Mark Tooley, Graduate from Georgetown University, Work at the CIA, 3-19-2015, Christianity and Nukes American nuclear
disarmament will not leave the world safer or holier, The American Spectator, http://spectator.org/articles/62090/christianity-and-nukes]
Jeong
Much of the security of the world relies on the U.S. nuclear umbrella, which continues to deter,
protect, and intimidate. Doubtless China would vastly expand its own relatively minimal nuclear arsenal and seek parity at least with Russia
absent overwhelming U.S. power. Russia’s nuclear arsenal is engorged far beyond its strategic needs, and that arsenal has in fact been blessed
by the Russian Orthodox Church, which evidently also falls outside the “ecumenical consensus.” Some religious idealists imagine that disarming
the West, mainly the U.S., will inspire and motivate the world to follow suit. Such expectation
is based on a fundamentally and dangerously false view of global statecraft and
human nature. The power vacuum that American disarmament would create would
inexorably lead to a far more dangerous and unstable world where nuclear weapons
and other weapons of mass destruction would exponentially proliferate. American
military and nuclear hegemony for the last 70 years has sustained an historically
unprecedented approximate global peace and facilitated an even more
unprecedented global prosperity. There is indeed a moral and strategic imperative for
America today, which is to deploy its power against further nuclear proliferation and
to deter aggression by current nuclear actors, while also developing technologies and defensive weapons that
neutralize nuclear armaments. If Iran’s genocidally ambitious regime is in the end prevented from nuclearizing, it will only be thanks to
American power. And if it does nuclearize, only American and Israeli nukes, perhaps joined by Saudi
nukes, will deter its murderous designs. Christian teaching and humanity should demand no less.

American hegemony is an irreplaceable boon to stability, peace, and development – draw-in inevitable, but hard power mitigates impacts
Carl Gershman et al 14, Tom Gjelten, Sarah Grebowski, Michael V. Hayden, Joshua Muravchik, David Rieff and Michael Zantovsky
“America's Purpose and Role in a Changed World: A Symposium” World Affairs, Vol. 177, No. 1 (MAY / JUNE 2014), pp 18-49 Sage Publications,
Inc. http://www.jstor.org/stable/43555062
Almost every war that America has fought since the beginning of the twentieth
century was a war America had determined to avoid. We were neutral in World War I . . . until unlimited
submarine warfare against our trans-Atlantic shipping became intolerable. We resisted entering World War II until Pearl Harbor. We defined the Korean peninsula as lying outside our "defense perimeter," as our secretary of state declared in 1950, a few months before North Korea attacked South Korea and we leapt into the fray. A few years later, we rebuffed French appeals for support in Vietnam in order to avoid involving ourselves in that distant country which was soon to become the venue of our longest war and greatest defeat. In 1990, our ambassador to Iraq explained to Saddam Hussein that Washington had "no opinion on . . . your border disagreement with Kuwait," which he took as encouragement to swallow his small neighbor, forcing a half million Americans to travel around the world to force him to disgorge it. A year after that, our secretary of state quipped about the violent disintegration of Yugoslavia that "we have no dog in that fight," a sentiment echoed by his successor, of the opposite party, who, demonstrating his virtuosity at geography, observed that that country was "a long way from home" in a place where we lacked "vital interests" - all this not long before we sent our air force to bomb Serbia into ceasing its attacks on Bosnia and then bombed it again a few years later until it coughed up Kosovo. Yes, there is a pattern here. When international
conflicts devolve into serious violence, they often land on America's doorstep
regardless of our wishes because ours is a big, influential, powerful country with far-
flung interests. Ours is also arguably history's most successful country, so it is natural that those who have the good fortune to be Americans would
rather go about their lives than entangle themselves overseas. But time and again we have seen that the choice is not up to us. Switzerland can remain neutral as wars come and go. America, not so easily. On my list of wars above, several might have been averted or at least fought on more favorable terms had we been more ready to fight. By the same token, our most brilliant foreign policy success, namely, averting a hot World War III and bringing the Cold War to a conclusion on our own terms, was the fruit of the most energetically internationalist policy that the US had ever adopted, indeed that any
non-imperial power had ever pursued. This experience, as well as our dismal record at staying out of wars, should teach us some
strong lessons about the deepest and most important question in US foreign policy, which is not about tactics and places but about how intensely engaged we
should be. The shorthand for this debate is internationalism versus isolationism. The latter term may be unfair since no one preaches true isolation, but I use it to
characterize the impulse toward a modest or restrained approach. The case for this "isolationism" is weak, but not nonexistent. If we have sometimes erred in the
direction of being too reluctant to engage, there may also have been instances when we were too hasty. We plunged into a second Iraq War that many Americans
came to believe was too costly, too difficult to win, and of uncertain necessity. But even if this judgment is right, the Iraq War was a part of President George W.
Bush's Global War on Terror, and possibly that war might have been averted had we fought terrorism more energetically before September 11, 2001. This includes
opportunities to take Osama bin Laden out of action that we failed to seize. Had we been bolder, too, in 1991 and forced the ouster of Saddam Hussein at the end
of the first Gulf War, we might have found no reason to go back into Iraq twelve years later. My point is not that America should resort to force lightly, heedless of
the risk of embarking on a war that is either unnecessary or unwinnable, but rather that rarely is it realistic to imagine that we can "mind our own business and let other countries get along as best they can,"
however much Americans may wish to do just that. American power is the ballast that
keeps the world relatively stable. Because it is both very powerful and devoid of
imperial ambition, there is no other state, consortium, or international institution
that can replace it. This is what President Bill Clinton meant when he called America the "indispensable nation." The locution sounded
embarrassingly self-glorifying but in fact merely reflected his wonderment at discovering that he could not successfully offload America's burdens onto the UN, as
he attempted during his first term in office, when he sought to "focus like a laser" on the domestic economy. Despite Clinton's epiphany, President Obama has
sharply downsized America's global role. It is said that the president is merely reflecting the national mood, but it is scarcely surprising that popular opinion is
trending isolationist in the absence of a presidential summons to international action. Since the end of the Cold War, isolationism has been fed by two contradictory ideas: on the one hand, that America should not sully itself in a messy world; on the other, that America, itself, with its capitalism and its "hyper" power,
is a bane to others. President Obama, whose intellectual origins were on the Left and who came to office criticizing his predecessors for insensitivity and
arrogance, seems to embody the second of these two ideas, while his policy of disengagement draws support from a public more apt to believe the first of them. Of
course, the important question, as always, is not an absolute choice between internationalism and isolationism but whether America can be active enough to
forestall another cataclysmic event like 9/11 or Pearl Harbor. And if it is to serve our security in this respect, activism cannot be a goal in itself nor can it be random.
It must be guided by an assessment of potential threats.

Heg is key to spread democracy


Thayer 06 [Bradley A., Prof. Poli. Sci. @ Mo State U, In Defense of Primacy, The National Interest, November]
Everything we think of when we consider the current international order --free trade, a robust
monetary regime, increasing respect for human rights, growing democratization is directly linked to U.S. power.

Retrenchment proponents seem to think that the current system can be maintained
without the current amount of U.S. power behind it. In that they are dead wrong and
need to be reminded of one of history's most significant lessons: Appalling things
happen when international orders collapse. The Dark Ages followed Rome's collapse. Hitler succeeded the order established
at Versailles. Without U.S. power, the liberal order created by the United States will end just

as assuredly. As country and western great Ral Donner sang: "You don't know what you've got (until you lose it)."

Democracy prevents extinction


Diamond 95 [Larry, Senior Fellow @ the Hoover Institution, Promoting Democracy in the 1990s,
wwics.si.edu/subsites/ccpdc/pubs/di/1.htm]
This hardly exhausts the lists of threats to our security and well-being in the coming years and decades. In the former Yugoslavia nationalist
aggression tears at the stability of Europe and could easily spread. The flow of illegal drugs intensifies through increasingly powerful
international crime syndicates that have made common cause with authoritarian regimes and have utterly corrupted the institutions of
tenuous, democratic ones. Nuclear, chemical, and biological weapons continue to proliferate.
The very source of life on Earth, the global ecosystem, appears increasingly
endangered. Most of these new and unconventional threats to security are associated
with or aggravated by the weakness or absence of democracy, with its provisions for
legality, accountability, popular sovereignty, and openness. LESSONS OF THE TWENTIETH CENTURY The experience of this
century offers important lessons. Countries that govern themselves in a truly democratic fashion do
not go to war with one another. They do not aggress against their neighbors to aggrandize themselves or glorify their
leaders. Democratic governments do not ethnically "cleanse" their own populations, and
they are much less likely to face ethnic insurgency. Democracies do not sponsor terrorism against one another. They do not build
weapons of mass destruction to use on or to threaten one another. Democratic countries form
more reliable, open, and enduring trading partnerships. In the long run they offer better and more stable climates for investment. They
are more environmentally responsible because they must answer to their own
citizens, who organize to protest the destruction of their environments. They are better bets to honor international treaties since they
value legal obligations and because their openness makes it much more difficult to breach agreements in secret. Precisely because, within their
own borders, they respect competition, civil liberties, property rights, and the rule of law, democracies are the only reliable foundation on
which a new world order of international security and prosperity can be built.

Decline undermines the alliance networks that facilitate international cooperation.


This link turns all of their impacts.
Stephen Brooks, G. John Ikenberry, and William C. Wohlforth, 2012, "Don't Come Home, America: The
Case against Retrenchment” http://www.carlanorrlof.com/wp-
content/uploads/2013/03/DontComeHomeAmerica.pdf, Brooks is Associate Professor of Government at
Dartmouth. He received his Ph.D. from Yale University. Ikenberry is the Albert G. Milbank Professor of Politics and
International Affairs at Princeton University in the Department of Politics and the Woodrow Wilson School of
Public and International Affairs. He is also a Global Eminence Scholar at Kyung Hee University. Wohlforth is the
Daniel Webster Professor in the Department of Government at Dartmouth College.
First, benefits flow to the United States from institutionalized cooperation to address a

wide range of problems. There is general agreement that a stable, open, and loosely rule-based international order serves the interests of the United States.
Indeed, we are aware of no serious studies suggesting that U.S. interests would be better advanced in a world that is closed (i.e., built around blocs and spheres of influence) and devoid of
basic, agreed-upon rules and institutions. As scholars have long argued, under conditions of rising complex interdependence, states often can benefit from institutionalized cooperation.109

In the security realm, newly emerging threats arguably are producing a rapid rise in
the benefits of such cooperation for the United States. Some of these threats are
transnational and emerge from environmental, health, and resource vulnerabilities,
such as those concerning pandemics. Transnational nonstate groups with various
capacities for violence have also become salient in recent decades, including groups involved in
terrorism, piracy, and organized crime. As is widely argued, these sorts of nontraditional,
transnational threats can be realistically addressed only through various types of
collective action.111 Unless countries are prepared to radically restrict their integration into an increasingly globalized world system, the problems must be solved through
coordinated action.112 In the face of these diffuse and shifting threats, the United States is going to find itself needing to work with other states to an increasing degree, sharing information,

building capacities, and responding to crises.113 Second, U.S. leadership increases the prospects that such cooperation will emerge
in a manner relatively favorable to U.S. interests. Of course, the prospects for cooperation are partly a function of compatible
interests. Yet even when interests overlap, scholars of all theoretical stripes have established

that institutionalized cooperation does not emerge effortlessly: generating agreement


on the particular cooperative solution can often be elusive. And when interests do not
overlap, the bargaining becomes tougher yet: not just how, but whether cooperation will occur is on the table. Many factors affect the
initiation of cooperation, and under various conditions states can and have cooperated without hegemonic leadership.114 As noted above, however, scholars acknowledge that the

likelihood of cooperation drops in the absence of leadership. Finally, U.S. security


commitments are an integral component of this leadership. Historically, as Gilpin and other theorists of hegemonic
order have shown, the background security and stability that the United States provided

facilitated the creation of multilateral institutions for ongoing cooperation across


policy areas.115 As in the case of the global economy, U.S. security provision plays a role in fostering stability within
and across regions, and this has an impact on the ability of states to engage in institutional cooperation.
Institutional cooperation is least likely in areas of the world where instability is pervasive. It is more likely to flourish in areas where states are secure and leaders can anticipate stable and

continuous relations—where the “shadow of the future” is most evident. And because of the key security role it plays in fostering
this institutional cooperation, the United States is in a stronger position to help shape
the contours of these cooperative efforts. The United States’ extended system of
security commitments creates a set of institutional relationships that foster political
communication. Alliance institutions are in the first instance about security
protection, but they are also mechanisms that provide a kind of “political
architecture” that is useful beyond narrow issues of military affairs. Alliances bind
states together and create institutional channels of communication. NATO has
facilitated ties and associated institutions—such as the Atlantic Council—that increase the ability of the
United States and Europe to talk to each other and do business.116 Likewise, the bilateral
alliances in East Asia also play a communication role beyond narrow security issues.
Consultations and exchanges spill over into other policy areas.117 For example, when U.S. officials travel to Seoul to

consult on alliance issues, they also routinely talk about other pending issues, such as, recently, the Korea–United States Free Trade Agreement and the Trans-Pacific Partnership.

This gives the United States the capacity to work across issue areas, using assets and bargaining chips in one area to make progress in another. It also provides more

diffuse political benefits to cooperation that flow from the “voice opportunities”
created by the security alliance architecture. 118 The alliances provide channels and
access points for wider flows of communication—and the benefits of greater political
solidarity and institutional cooperation that follow. The benefits of these communication flows
cut across all international issues, but are arguably enhanced with respect to generating
security cooperation to deal with new kinds of threats— such as terrorism and health
pandemics —that require a multitude of novel bargains and newly established
procedures of shared responsibilities among a wide range of countries. With the
existing U.S.-led security system in place, the United States is in a stronger position
than it otherwise would be to strike bargains and share burdens of security
cooperation in such areas. The challenge of rising security interdependence is greater security cooperation. That is, when countries are increasingly mutually
vulnerable to nontraditional, diffuse, transnational threats, they need to work together to eradicate the conditions that allow for these threats and limit the damage. The U.S.-

led alliance system is a platform with already existing capacities and routines for
security cooperation. These assets can be used or adapted, saving the cost of generating security cooperation from scratch. In short, having an institution in place to
facilitate cooperation on one issue makes it easier, and more likely, that the participating states will be able to achieve cooperation rapidly on a related issue.119 The usefulness of the U.S.
alliance system for generating enhanced nonsecurity cooperation is confirmed in interviews with former State Department and National Security Council officials. One former administration

official noted, using the examples of Australia and South Korea, that the security ties “create nonsecurity benefits in terms of support for
global agenda issues,” such as Afghanistan, Copenhagen, disaster relief, and the financial crisis. “This is not

security leverage per se, but it is an indication of how the deepness of the security
relationship creates working relationships [and] interoperability that can then be
leveraged to address other regional issues.” This official notes, “We could not have organized the
Core Group (India, U.S., Australia, Japan) in response to the 2004 tsunami without the deep bilateral
military relationships that had already been in place. It was much easier for us to
organize with these countries almost immediately (within forty-eight hours) than anyone else for a
large-scale humanitarian operation because our militaries were accustomed to each
other.”120 The United States’ role as security provider also has a more direct effect of
enhancing its authority and capacity to initiate institutional cooperation in various
policy areas. The fact that the United States is a security patron of Japan, South Korea,
and other countries in East Asia, for example, gives it a weight and presence in
regional diplomacy over the shape and scope of multilateral cooperation not just
within the region but also elsewhere. This does not mean that the United States always wins these diplomatic encounters, but its leverage is
greater than it would be if the United States were purely an offshore great power without institutionalized security ties to the region. In sum, the deep engagement
strategy enables U.S. leadership, which results in more cooperation on matters of importance than would occur if the United States disengaged—even as it pushes cooperation toward U.S.
preferences.

Primacy is key to contain Chinese expansion


Thayer 06 [Bradley A., Prof. Poli. Sci. @ Mo State U, In Defense of Primacy, The National Interest, November]
China is clearly the most important of these states because it is a rising great power. But even Beijing is intimidated by
the United States and refrains from openly challenging U.S. power. China proclaims that
it will, if necessary, resort to other mechanisms of challenging the United States,
including asymmetric strategies such as targeting communication and intelligence satellites upon which the United States
depends. But China may not be confident those strategies would work, and so it is likely
to refrain from testing the United States directly for the foreseeable future because
China's power benefits, as we shall see, from the international order U.S. primacy
creates.

*Rejection of securitization leads to instability and international intervention – turns their impact
McCormack 10 – Lecturer in International Politics
Tara McCormack, is Lecturer in International Politics at the University of Leicester and has a PhD in International
Relations from the University of Westminster. 2010, Critique, Security and Power: The political limits to
emancipatory approaches, pg. 127-129
The following section will briefly raise some questions about the rejection of the old security
framework as it has been taken up by the most powerful institutions and states. Here we can begin
to see the political limits to critical and emancipatory frameworks. In an international
system which is marked by great power inequalities between states, the rejection of the old narrow
national interest-based security framework by major international institutions, and the adoption
of ostensibly emancipatory policies and policy rhetoric, has the consequence of
problematising weak or unstable states and allowing international institutions or
major states a more interventionary role, yet without establishing mechanisms by
which the citizens of states being intervened in might have any control over the
agents or agencies of their emancipation. Whatever the problems associated with
the pluralist security framework there were at least formal and clear demarcations.
This has the consequence of entrenching international power inequalities and
allowing for a shift towards a hierarchical international order in which the citizens
in weak or unstable states may arguably have even less freedom or power than
before. Radical critics of contemporary security policies, such as human security and humanitarian
intervention, argue that we see an assertion of Western power and the creation of liberal subjectivities in the
developing world. For example, see Mark Duffield’s important and insightful contribution to the ongoing
debates about contemporary international security and development. Duffield attempts to provide a coherent
empirical engagement with, and theoretical explanation of, these shifts. Whilst these shifts, away from a focus
on state security, and the so-called merging of security and development are often portrayed as positive and
progressive shifts that have come about because of the end of the Cold War, Duffield argues convincingly that
these shifts are highly problematic and unprogressive. For example, the rejection of sovereignty as formal
international equality and a presumption of nonintervention has eroded the division between the international
and domestic spheres and led to an international environment in which Western NGOs and powerful states
have a major role in the governance of third world states. Whilst for supporters of humanitarian intervention
this is a good development, Duffield points out the depoliticising implications, drawing on examples in
Mozambique and Afghanistan. Duffield also draws out the problems of the retreat from modernisation that is
represented by sustainable development. The Western world has moved away from the development policies
of the Cold War, which aimed to develop third world states industrially. Duffield describes this in terms of a
new division of human life into uninsured and insured life. Whilst we in the West are ‘insured’ – that is we no
longer have to be entirely self-reliant, we have welfare systems, a modern division of labour and so on –
sustainable development aims to teach populations in poor states how to survive in the absence of any of this.
Third world populations must be taught to be self-reliant, they will remain
uninsured. Self-reliance of course means the condemnation of millions to a
barbarous life of inhuman bare survival. Ironically, although sustainable development is
celebrated by many on the left today, by leaving people to fend for themselves rather than developing a
society wide system which can support people, sustainable development actually leads to a less human and
humane system than that developed in modern capitalist states. Duffield also describes how many of these
problematic shifts are embodied in the contemporary concept of human security. For Duffield, we can
understand these shifts in terms of Foucauldian biopolitical framework, which can be understood as a
regulatory power that seeks to support life through intervening in the biological, social and economic processes
that constitute a human population (2007: 16). Sustainable development and human security are for Duffield
technologies of security which aim to create self-managing and self-reliant subjectivities in the third world,
which can then survive in a situation of serious underdevelopment (or being uninsured as Duffield terms it)
without causing security problems for the developed world. For Duffield this is all driven by a neoliberal
project which seeks to control and manage uninsured populations globally. Radical critic Costas Douzinas
(2007) also criticises new forms of cosmopolitanism such as human rights and interventions for human rights as
a triumph of American hegemony. Whilst we are in agreement with critics such as Douzinas and Duffield that
these new security frameworks cannot be empowering, and ultimately lead to
more power for powerful states, we need to understand why these frameworks have the effect that
they do. We can understand that these frameworks have political limitations without having to look for a
specific plan on the part of current powerful states. In new security frameworks such as human security we can
see the political limits of the framework proposed by critical and emancipatory approaches.

And, Chinese rise causes global nuclear war.
Walton 07 [C. Dale, Lecturer in IR and Strat. Studies @ U of Reading, “Geopolitics and the Great Powers in the 21st Century,” p. 49]
Obviously, it is of vital importance to the United States that the PRC does not become the hegemon
of Eastern Eurasia. As noted above, however, regardless of what Washington does, China's success in such an endeavor is not as
easily attainable as pessimists might assume. The PRC appears to be on track to be a very great power indeed, but geopolitical conditions are
not favorable for any Chinese effort to establish sole hegemony; a robust multipolar system should suffice to keep China in check, even with
only minimal American intervention in local squabbles. The more worrisome danger is that Beijing will
cooperate with a great power partner, establishing a very muscular axis. Such an
entity would present a critical danger to the balance of power, thus both necessitating very
active American intervention in Eastern Eurasia and creating the underlying conditions
for a massive, and probably nuclear, great power war. Absent such a "super-threat," however, the demands
on American leaders will be far more subtle: creating the conditions for Washington's gentle decline from playing the role of unipolar quasi-
hegemon to being "merely" the greatest of the world's powers, while aiding in the creation of a healthy multipolar system that is not marked by
close great power alliances.

Epistemology DA - State behavior is driven by security competition according to the dictates of offensive realism. Critical theory does not have the ability to unseat realism as the dominant discourse of IR - your impact is inevitable
John J. Mearsheimer, realism heavyweight champion, 1995 International Security, Vol. 20, No. 1. (Summer
1995), pp. 82-93.
Realists believe that state behavior is largely shaped by the material structure of the
international system. The distribution of material capabilities among states is the key factor for understanding world politics. For realists, some
level of security competition among great powers is inevitable because of the material

structure of the international system. Individuals are free to adopt non-realist


discourses, but in the final analysis, the system forces states to behave according to the
dictates of realism, or risk destruction. Critical theorists, on the other hand, focus on the social structure of the
international system. They believe that "world politics is socially constructed," which is another way of saying that shared
discourse, or how communities of individuals think and talk about the world, largely shapes the world. Wendt recognizes that "material resources like gold and
tanks exist," but he argues that "such capabilities . . . only acquire meaning for human action through the structure of shared knowledge in which they are

embedded." Significantly for critical theorists, discourse can change, which means that realism is

not forever, and that therefore it might be possible to move beyond realism to a world where
institutionalized norms cause states to behave in more communitarian and peaceful ways. The most revealing aspect of Wendt's discussion is that he did not respond

to the two main charges leveled against critical theory in "False Promise." The first problem with critical theory is that

although the theory is deeply concerned with radically changing state behavior, it
says little about how change comes about. The theory does not tell us why particular
discourses become dominant, and others fall by the wayside. Specifically, Wendt does not
explain why realism has been the dominant discourse in world politics for well over a
thousand years, although I explicitly raised this question in "False Promise" (p. 42). Moreover, he sheds no light on why
the time is ripe for unseating realism, nor on why realism is likely to be replaced by a
more peaceful, communitarian discourse, although I explicitly raised both questions. Wendt's failure to answer these
questions has important ramifications for his own arguments. For example, he maintains that if it is possible to change international political discourse and alter
state behavior, "then it is irresponsible to pursue policies that perpetuate destructive old orders [i.e., realism], especially if we care about the well-being of future

generations." The clear implication here is that realists like me are irresponsible and do not care much about the welfare of future generations. However, even
if we change discourses and move beyond realism, a fundamental problem with Wendt's
argument remains: because his theory cannot predict the future, he cannot know whether

the discourse that ultimately replaces realism will be more benign than realism. He
has no way of knowing whether a fascistic discourse more violent than realism will
emerge as the hegemonic discourse. For example, he obviously would like another Gorbachev to come to power in Russia, but he
cannot be sure we will not get a Zhirinovsky instead. So even from a critical theory perspective, defending realism might very well be the more responsible policy

choice. The second major problem with critical theory is that its proponents have

offered little empirical support for their theory. For example, I noted in "False Promise" that critical
theorists concede that realism has been the dominant discourse in international
politics from about 1300 to 1989, a remarkably long period of time. Wendt does not challenge this description of the historical
record by pointing to alternative discourses that influenced state behavior during this period. In fact, Wendt's discussion of history is obscure. I also noted in "False
Promise" that although critical theorists largely concede the past to realism, many believe that the end of the Cold War presents an excellent opportunity to replace
realism as the hegemonic discourse, and thus fundamentally change state behavior. I directly challenged this assertion in my article, but Wendt responds with only a
few vague words about this issue. Wendt writes in his response that "if critical theories fail, this will be because they do not explain how the world works, not

because of their values." I agree completely, but critical theorists have yet to provide evidence that their
theory can explain very much. In fact, the distinguishing feature of the critical theory literature, Wendt's work included, is its lack of
empirical content. Possibly that situation will change over time, but until it does, critical theory will not topple realism from

its commanding position in the international relations literature.


Focus on crisis-based politics is key to solving suffering – otherwise issues get
blocked from discussion, locking in status quo oppression
Robinson, Politics of Crisis Escalation: Decision Making under Pressure, 1996, p. 1, p. 27
In other cases it may be hard-ball politics that keeps future crises off the agenda. Opposing
interests may form a coalition, which effectively blocks an issue from discussion. If those
in power do not require coalitions to govern and more or less control the policy agenda, they may simply refuse to award urgency status to a
certain topic. For instance, President George Bush announced that the United States would not sign the Kyoto treaty, which effectively "killed"
the treaty. When the oil tanker Prestige sank in 2003, the Spanish government initially denied that something terrible had happened and
subsequently refused to allocate national resources to deal with the crisis. Some crises are not acted upon because
the stakeholders involved cannot attract attention to their plight. They fail to frame
the issue of their concern in such terms that others understand and share the nature
of the threat, sense the urgency, and act accordingly. In October 2003, the UN Food and Agriculture Organization (FAO) warned that
abnormal rains had created ideal breeding grounds for locusts in the Atlas mountain ridge of Northern Africa. Nothing happened and the crisis unfolded as predicted. In the summer of 2004, the locust swarms invaded the African
north-west, destroying the livelihood of small farmers. In spite of clearly communicated and accurate warnings, a preventable crisis was allowed to wreak havoc in the region. Similarly, it wasn't until the December 2004 tsunami
had triggered the greatest natural disaster in recorded history that policy makers in the Indian Ocean region decided to set up an early-warning system similar to the one already operating in the Pacific, which presumably would
have saved many thousands of lives.

Positive peace comes before negative peace – it's a prerequisite / counter-perceptions are grounded in cognitive biases, not empirical data
Pinker 2011 (Steven [Professor of Psychology @ Harvard; two-time Pulitzer finalist]; The Better Angels of Our
Nature: Why Violence Has Declined; pp. xxi-xxiii; kdf)
This book is about what may be the most important thing that has ever happened in human history. Believe it or not-and I know that most
people do not-violence has declined over long stretches of time, and today we may be living in the most peaceable
era in our species' existence. The decline, to be sure, has not been smooth; it has not brought
violence down to zero; and it is not guaranteed to continue. But it is an unmistakable development, visible on scales from
millennia to years, from the waging of wars to the spanking of children. No aspect of life is untouched by the retreat from
violence. Daily existence is very different if you always have to worry about being abducted, raped, or killed, and it's hard to
develop sophisticated arts, learning, or commerce if the institutions that support them are looted and
burned as quickly as they are built. The historical trajectory of violence affects not only how life is lived but how it is
understood. What could be more fundamental to our sense of meaning and purpose than a conception of whether the strivings of the
human race over long stretches of time have left us better or worse off? How, in particular, are we to make sense of modernity-of the
erosion of family, tribe, tradition, and religion by the forces of individualism, cosmopolitanism, reason, and science? So much depends on
how we understand the legacy of this transition: whether we see our world as a nightmare of crime, terrorism, genocide, and war, or as a
period that, by the standards of history, is blessed by unprecedented levels of peaceful coexistence. The question of whether the arithmetic
sign of trends in violence is positive or negative also bears on our conception of human nature. Though theories of human nature rooted in
biology are often associated with fatalism about violence, and the theory that the mind is a blank slate is associated with progress, in my view it
is the other way around. How are we to understand the natural state of life when our species first emerged and the processes of history
began? The belief that violence has increased suggests that the world we made has contaminated us, perhaps irretrievably. The belief that it
has decreased suggests that we started off nasty and that the artifices of civilization have moved us in a noble direction, one in which we can
hope to continue. This is a big book, but it has to be. First I have to convince you that violence really has gone down over the course of
history, knowing that the very idea invites skepticism, incredulity, and sometimes anger. Our cognitive faculties
predispose us to believe that we live in violent times, especially when they are stoked by media that follow the watchword "If it
bleeds, it leads." The human mind tends to estimate the probability of an event from the ease with which it can recall examples,
and scenes of carnage are more likely to be beamed into our homes and burned into our memories than footage of people dying of old age.'
No matter how small the percentage of violent deaths may be, in absolute numbers there will always be enough of them to fill the evening
news, so people's impressions of violence will be disconnected from the actual proportions. Also distorting our sense of danger is our moral
psychology. No one has ever recruited activists to a cause by announcing that things are getting better, and bearers of good news are often
advised to keep their mouths shut lest they lull people into complacency. Also, a large swath of our intellectual culture is
loath to admit that there could be anything good about civilization, modernity, and Western
society. But perhaps the main cause of the illusion of ever-present violence springs from one of the forces that drove violence down in the
first place. The decline of violent behavior has been paralleled by a decline in attitudes that tolerate or glorify violence, and often the
attitudes are in the lead. By the standards of the mass atrocities of human history, the lethal injection of a murderer in Texas, or an
occasional hate crime in which a member of an ethnic minority is intimidated by hooligans, is pretty mild stuff. But from a contemporary
vantage point, we see them as signs of how low our behavior can sink, not of how high our standards have risen. In the teeth of these
preconceptions, I will have to persuade you with numbers, which I will glean from datasets and depict in graphs. In each case I'll explain
where the numbers came from and do my best to interpret the ways they fall into place. The problem I have set out to understand is the
reduction in violence at many scales-in the family, in the neighborhood, between tribes and other armed factions, and among major nations
and states. If the history of violence at each level of granularity had an idiosyncratic trajectory, each would belong in a separate book. But to
my repeated astonishment, the global trends in almost all of them, viewed from the vantage point of the present, point downward. That calls
for documenting the various trends between a single pair of covers, and seeking commonalities in when, how, and why they have occurred.
Too many kinds of violence, I hope to convince you, have moved in the same direction for it all to be a coincidence, and that calls for an
explanation. It is natural to recount the history of violence as a moral saga-a heroic struggle of justice against evil-but that is not my starting
point. My approach is scientific in the broad sense of seeking explanations for why things happen. We may discover that a particular
advance in peacefulness was brought about by moral entrepreneurs and their movements. But we may also discover that the
explanation is more prosaic, like a change in technology, governance, commerce, or knowledge. Nor can we
understand the decline of violence as an unstoppable force for progress that is carrying us toward an omega point of perfect peace. It is a
collection of statistical trends in the behavior of groups of humans in various epochs, and as such it calls for an explanation in terms of
psychology and history: how human minds deal with changing circumstances. A large part of the book will explore the psychology of
violence and nonviolence. The theory of mind that I will invoke is the synthesis of cognitive science, affective and
cognitive neuroscience, social and evolutionary psychology, and other sciences of human nature that I
explored in How the Mind Works, The Blank Slate, and The Stuff of Thought. According to this understanding, the mind is a complex system
of cognitive and emotional faculties implemented in the brain which owe their basic design to the processes of evolution. Some of these
faculties incline us toward various kinds of violence. Others-"the better angels of our nature," in Abraham Lincoln's words-incline us toward
cooperation and peace. The way to explain the decline of violence is to identify the changes in our cultural and material milieu that have
given our peaceable motives the upper hand. Finally, I need to show how our history has engaged our psychology. Everything in human
affairs is connected to everything else, and that is especially true of violence. Across time and space, the more peaceable
societies also tend to be richer, healthier, better educated, better governed, more
respectful of their women, and more likely to engage in trade. It's not easy to tell which of these happy
traits got the virtuous circle started and which went along for the ride, and it's tempting to resign oneself to unsatisfying circularities, such as
that violence declined because the culture got less violent. Social scientists distinguish "endogenous" variables-those that are inside the
system, where they may be affected by the very phenomenon they are trying to explain-from the "exogenous" ones-those that are set in
motion by forces from the outside. Exogenous forces can originate in the practical realm, such as changes in technology, demographics, and
the mechanisms of commerce and governance. But they can also originate in the intellectual realm, as new ideas are conceived and
disseminated and take on a life of their own. The most satisfying explanation of a historical change is one that identifies an exogenous
trigger. To the best that the data allow it, I will try to identify exogenous forces that have engaged our mental faculties in different ways at
different times and that thereby can be said to have caused the declines in violence.

Raising the threat level is important as well---the aff’s political goals can’t be
accomplished absent securitization
Brian Mastroianni 17, covers science and technology for CBSNews.com, 3-16-2017, ""We are not ready": Experts
warn world is unprepared for next Ebola-size outbreak," http://www.cbsnews.com/news/study-says-world-
underprepared-ebola-level-outbreaks/
Pandemics as global security threats
What happens next time a health crisis threatens to spiral out of control? Moon said an “ideal system”
would “see all countries of the world have some basic level of preparedness” when
there seems to be a “suspicious pattern of infectious disease.” But it’s not just about
medical practices — some experts say governments need to view pandemics as
security threats. “The Neglected Dimension of Global Security,” a 2016 report from public health officials published by the National Academy of
Medicine, looks at how the wave of large-scale infectious disease outbreaks over the past few
decades — not just Ebola, but others like HIV/AIDS and SARS — exposed how economically and politically
vulnerable nations are in the face of the ravages of future pandemics. The report finds
that a range of factors, from growing population numbers to environmental
degradation to increasing economic globalization, have shifted the dynamics of how
disease outbreaks can affect countries. “We have not done nearly enough to prevent or prepare
for such potential pandemics,” Peter Sands, the commission’s chair, wrote in the preface. “While there are certainly gaps in our scientific defenses, the bigger
problem is that leaders at all levels have not been giving these threats anything close to the
priority they demand.” Sands called this the “neglected dimension of global security.” This report
essentially places global pandemics on the same level of seriousness as a military
assault on a country. Since pandemics are generally viewed as “health problems” rather than
“security risks,” the study argues that public health departments tend to put outbreak
preparedness on the back burner. Rather than building up defenses as one would for a
war or a terrorist attack, potential pandemics are relatively ignored. The commission issued 10
recommendations for building more effective public health resources in countries that are particularly prone to being decimated by an Ebola-level pandemic, such
as developing universal benchmarks for preparedness that nations have to meet. Economic assistance for at-risk countries is also needed —and the report argues
that money spent on preparedness would more than pay for itself. For instance, the study contends that if nations invested $4.5 billion a year to safeguard against
the next major outbreak, $60 billion a year in losses from future pandemics could be avoided.

First use capability key to deter biochemical weapon use


Steven E. Miller, 11-15-17, Director, International Security Program; Editor-in-Chief, International Security; Co-
Principal Investigator, Project on Managing the Atom. Member of the Board, Belfer Center for Science and
International Affairs "The Utility of Nuclear Weapons and the Strategy of No-First-Use," Belfer Center for Science
and International Affairs, https://www.belfercenter.org/publication/utility-nuclear-weapons-and-strategy-no-first-
use,
Deterrence of or retaliation against use of chemical or biological weapons. States that have forsaken chemical and
biological weapons (CBW) cannot deter CBW use by threat of symmetrical reply. As
concerns about CBW proliferation have mounted, there has been growing interest in
using threats of nuclear retaliation to deter CBW use. During the Clinton Administration, the United States came
close to explicitly articulating this as its policy. The Bush Administration is similarly inclined in this direction. "Official defense policy," says one worried analysis of
this 'deterrence gap', "declares that nuclear weapons act as a deterrent against the entire spectrum of potential nuclear, biological, and chemical attacks."16
Similarly, there have been suggestions that the Israeli government might use nuclear weapons in response to an Iraqi CBW attack on its territory. Whether a state as
comprehensively powerful as the United States needs to rely on nuclear threats to deter CBW attacks is certainly questionable.17 But there is no
question that there is wide interest in the United States in using nuclear weapons to
cope with this threat. The Bush Administration's Nuclear Posture Review, for example, says that American nuclear
weapons provide "assurance" against "known or suspected threats of nuclear,
biological, or chemical attacks," and specifically singles out the "defeat of chemical
and biological agents" as one of the missions on which the United States is working.18 Here, then, is a justification for the retention of first-use
options that has not merely persisted into the post–Cold War era; CBW are almost universally viewed as a growing
problem. So long as the United States (not only the Bush Administration, but also the Clinton Administration as well, albeit more
quietly) believes that nuclear weapons are necessary to deter CBW threats, there will exist
a potent reason for resisting calls for NFU. And resist the United States has done, unwaveringly, throughout the dozen years of
the post–Cold War era. It is conceivable that other nuclear-armed states will come to the same position for the same reason.

Bioweapons cause extinction


Piers Millett 17, PhD, is a Senior Research Fellow @ Oxford, “Existential Risk and Cost-Effective Biosecurity”,
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5576214/
In the decades to come, advanced bioweapons could threaten human existence. Although the
probability of human extinction from bioweapons may be low, the expected value of
reducing the risk could still be large, since such risks jeopardize the existence of all
future generations. We provide an overview of biotechnological extinction risk, make some rough initial estimates for how severe the risks might be, and compare the
cost-effectiveness of reducing these extinction-level risks with existing biosecurity work. We find that reducing human extinction risk can be more cost-effective than reducing smaller-scale
risks, even when using conservative estimates. This suggests that the risks are not low enough to ignore and that more ought to be done to prevent the worst-case scenarios. How worthwhile
is it spending resources to study and mitigate the chance of human extinction from biological risks? The risks of such a catastrophe are presumably low, so a skeptic might argue that
addressing such risks would be a waste of scarce resources. In this article, we investigate this position using a cost-effectiveness approach and ultimately conclude that the expected value of
reducing these risks is large, especially since such risks jeopardize the existence of all future human lives. Historically, disease events have been
responsible for the greatest death tolls on humanity. The 1918 flu was responsible for
more than 50 million deaths,1 while smallpox killed perhaps 10 times that many in the
20th century alone.2 The Black Death was responsible for killing over 25% of the
European population,3 while other pandemics, such as the plague of Justinian, are thought to have killed
25 million in the 6th century—constituting over 10% of the world's population at the time.4 It is an open question whether a future pandemic could result in
outright human extinction or the irreversible collapse of civilization. A skeptic would have many good reasons to think that
existential risk from disease is unlikely. Such a disease would need to spread
worldwide to remote populations, overcome rare genetic resistances, and evade
detection, cures, and countermeasures. Even evolution itself may work in humanity's favor: Virulence and
transmission is often a trade-off, and so evolutionary pressures could push against maximally lethal wild-type pathogens.5,6 While these
arguments point to a very small risk of human extinction, they do not rule the possibility out entirely.
Although rare, there are recorded instances of species going extinct due to disease —primarily in amphibians,
but also in 1 mammalian species of rat on Christmas Island.7,8 There are also historical examples of large human populations being almost entirely wiped out by disease, especially
when multiple diseases were simultaneously introduced into a population without
immunity. The most striking examples of total population collapse include native
American tribes exposed to European diseases, such as the Massachusett (86% loss of population), Quiripi-Unquachog (95% loss of
population), and the Western Abenaki (which suffered a staggering 98% loss of population).9 In the modern context, no single disease
currently exists that combines the worst-case levels of transmissibility, lethality,
resistance to countermeasures, and global reach. But many diseases are proof of principle that each worst-case attribute can be
realized independently. For example, some diseases exhibit nearly a 100% case fatality ratio in the absence of treatment, such as rabies or septicemic plague. Other diseases have a track
record of spreading to virtually every human community worldwide, such as the 1918 flu,10 and seroprevalence studies indicate that other pathogens, such as chickenpox and HSV-1, can
successfully reach over 95% of a population.11,12 Under optimal virulence theory, natural evolution would be an unlikely source for pathogens with the highest possible levels of
transmissibility, virulence, and global reach. But advances in biotechnology might allow the creation of
diseases that combine such traits. Recent controversy has already emerged over a number
of scientific experiments that resulted in viruses with enhanced transmissibility,
lethality, and/or the ability to overcome therapeutics.13-17 Other experiments
demonstrated that mousepox could be modified to have a 100% case fatality rate and
render a vaccine ineffective.18 In addition to transmissibility and lethality, studies have shown that other
disease traits, such as incubation time, environmental survival, and available vectors, could be modified as well.19-21 Although these
experiments had scientific merit and were not conducted with malicious intent, their implications are still worrying. This is especially true given that there is also a long historical track record of
state-run bioweapon research applying cutting-edge science and technology to design agents not previously seen in nature. The Soviet bioweapons program developed agents with traits such
as enhanced virulence, resistance to therapies, greater environmental resilience, increased difficulty to diagnose or treat, and which caused unexpected disease presentations and
outcomes.22 Delivery capabilities have also been subject to the cutting edge of technical
development, with Canadian, US, and UK bioweapon efforts playing a critical role in
developing the discipline of aerobiology.23,24 While there is no evidence of state-run bioweapons
programs directly attempting to develop or deploy bioweapons that would pose an existential risk, the logic of
deterrence and mutually assured destruction could create such incentives in more
unstable political environments or following a breakdown of the Biological Weapons
Convention.25 The possibility of a war between great powers could also increase the
pressure to use such weapons—during the World Wars, bioweapons were used across multiple continents, with Germany targeting animals in WWI,26 and
Japan using plague to cause an epidemic in China during WWII.27
Rejecting strategic predictions of threats makes them inevitable -
decisionmakers will rely on preconceived conceptions of threat rather than the
more qualified predictions of analysts.
Michael Fitzsimmons, Winter 2007, “The Problem of Uncertainty in Strategic Planning,” Survival
But handling even this weaker form of uncertainty is still quite challeng- ing. If not sufficiently bounded, a high degree of variability in planning factors can exact a significant price on planning.
The complexity presented by great variability strains the cognitive abilities of even the most sophisticated decision- makers.15 And even a robust decision-making process sensitive to
cognitive limitations necessarily sacrifices depth of analysis for breadth as variability and complexity grows. It should follow, then, that in planning under
conditions of risk, variability in strategic calculation should be carefully tailored to
available analytic and decision processes. Why is this important? What harm can an imbalance between complexity and cognitive or
analytic capacity in strategic planning bring? Stated simply, where analysis is silent or inadequate, the personal beliefs
of decision-makers fill the void. As political scientist Richard Betts found in a study of strategic sur- prise, in ‘an environment that lacks clarity, abounds
with conflicting data, and allows no time for rigorous assessment of sources and validity, ambiguity allows intuition or wishfulness to drive interpretation ... The greater the ambiguity, the
greater the impact of preconceptions.'16 The decision-making environment that Betts describes here is one of political-military crisis, not long-term strategic planning. But a
strategist who sees uncertainty as the central fact of his environ- ment brings upon himself some of
the pathologies of crisis decision-making. He invites ambiguity, takes conflicting data for
granted and substitutes a priori scepticism about the validity of prediction for time pressure as
a rationale for discounting the importance of analytic rigour. It is important not to exaggerate the extent to which
data and ‘rigorous assessment’ can illuminate strategic choices. Ambiguity is a fact of life, and scepticism of analysis is necessary. Accordingly, the intuition and judgement of decision-makers
will always be vital to strategy, and attempting to subordinate those factors to some formulaic, deterministic decision-making model would be both undesirable and unrealistic. All the same,
there is danger in the opposite extreme as well. Without careful analysis of what is relatively likely and what is
relatively unlikely, what will be the possible bases for strategic choices? A decision-
maker with no faith in prediction is left with little more than a set of worst-case scenarios and his existing
beliefs about the world to confront the choices before him. Those beliefs may be more or less well founded, but if they are not made explicit and subject to
analysis and debate regarding their application to particular strategic contexts, they remain only beliefs and premises, rather than rational judgements.

Even at their best, such decisions are likely to be poorly understood by the
organisations charged with their implementation. At their worst, such decisions may be poorly understood by the decision-makers
themselves.
2nc round 6
Heg
Heg is key to solve East Asian proliferation
Lieber 05 [Robert, Prof. Gov’t and Int’l Affairs @ Georgetown U, The American Era: Power and Strategy for the
21st Century]
Taken together, these Asian involvements are not
without risk, especially vis-a-vis North
Korea, China-Taiwan, and the uncertain future of a nuclear-armed Pakistan. Nonetheless,
the American engagement provides both reassurance and deterrence and thus eases
the security dilemmas of the key states there, including countries that are America's
allies but remain suspicious of each other. Given the history of the region, an American
withdrawal would be likely to trigger arms races and the accelerated proliferation of
nuclear weapons. It is thus no exaggeration to describe the American presence as
providing the "oxygen" crucial for the region's stability and economic prosperity.37

Heg solves genocide and mass violence globally


Lieber 5 – PhD from Harvard, Professor of Government and International Affairs at Georgetown, former
consultant to the State Department and for National Intelligence Estimates (Robert, “The American Era”, pages 51-
52, WEA)
The United States possesses the military and economic means to act assertively on a
global basis, but should it do so, and if so, how? In short, if the United States conducts itself in this way,
will the world be safer and more stable, and is such a role in America’s national
interest? Here, the anarchy problem is especially pertinent. The capacity of the United Nations to act, especially
in coping with the most urgent and deadly problems, is severely limited, and in this sense, the demand for “global
governance” far exceeds the supply. Since its inception in 1945, there have only been two occasions (Korea in 1950
and Kuwait in 1991) when the U.N. Security Council authorized the use of force, and in both instances the bulk of
the forces were provided by the United States. In
the most serious cases, especially those
involving international terrorism, the proliferation of weapons of mass destruction,
ethnic cleansing, civil war, and mass murder, if America does not take the lead, no
other country or organization is willing or able to respond effectively. The deadly cases of
Bosnia (1991–95) and Rwanda (1994) make this clear. In their own way, so did the demonstrations by the people
of Liberia calling for American intervention to save them from the ravages of predatory militias in a failed state.
And the weakness of the international reaction to ethnic cleansing, rape, and
widespread killing in the Darfur region of Western Sudan provides a more recent
example.

US presence is key to India-Pakistan peace


KHALILZAD AND LESSER,1998 (Zalmay and Ian, Ambassador to Afghanistan and Sr. analyst at RAND,
Sources of Conflict in the 21st Century, page 161)
The fifth driver is Indian, Pakistani, and Chinese perceptions of the role of extraregional powers in any future
conflict. Although extraregional powers such as the United States will remain critical and
influential actors in South Asia, the nature of their presence and the way their influence
is exercised will remain important factors for stability in South Asia. The United States,
in particular, contributes to stability insofar as it can creatively use both its regional
policy and its antiproliferation strategies to influence the forms of security
competition on the subcontinent, the shape and evolution of Indian and Pakistani
nuclear programs, and the general patterns of political interaction between India and
Pakistan. The nominally extraregional power, China, also plays a critical role here both because of its presumed
competition with India and because Beijing has evolved into a vital supplier of conventional and nuclear
technologies to Pakistan.

Hegemony is sustainable and resilient – Multiple warrants


Babones 2015- [Salvatore Babones, Associate Professor at the University of Sydney, PhD in Sociology and Social
Policy, Fellow at the Institute for Policy Studies, Washington, 6-11-2015, American Hegemony Is Here to Stay: U.S.
hegemony is now as firm as or firmer than it has ever been, and will remain so for a long time to come, The
National Interest, http://nationalinterest.org/feature/american-hegemony-here-stay-13089?page=2] Jeong
IS RETREAT from global hegemony in America’s national interest? No idea has percolated more widely over the
past decade—and none is more bogus. The United States is not headed for the skids and there
is no reason it should be. The truth is that America can and should seek to remain the
world’s top dog. The idea of American hegemony is as old as Benjamin Franklin, but has its
practical roots in World War II. The United States emerged from that war as the dominant economic,
political and technological power. The only major combatant to avoid serious damage to its infrastructure, its
housing stock or its demographic profile, the United States ended
the war with the greatest
naval order of battle ever seen in the history of the world . It became the postwar home of the
United Nations, the International Monetary Fund and the World Bank. And, of course, the United States had the
bomb. America was, in every sense of the word, a hegemon. “Hegemony” is a word used by social scientists to
describe leadership within a system of competing states. The Greek historian Thucydides used the term to
characterize the position of Athens in the Greek world in the middle of the fifth century BC. Athens had the
greatest fleet in the Mediterranean; it was the home of Socrates and Plato, Sophocles and Aeschylus; it crowned
its central Acropolis with the solid-marble temple to Athena known to history as the Parthenon. Athens had a
powerful rival in Sparta, but no one doubted that Athens was the hegemon of the time until Sparta defeated it in a
bitter twenty-seven-year war. America’s only global rival in the twentieth century was the Soviet Union. The
Soviet Union never produced more than about half of America’s total national output.
Its nominal allies in Eastern Europe were in fact restive occupied countries, as were
many of its constituent republics. Its client states overseas were at best partners of convenience, and at
worst expensive drains on its limited resources. The Soviet Union had the power to resist
American hegemony, but not to displace it. It had the bomb and an impressive space program, but
little else. When the Soviet Union finally disintegrated in 1991, American hegemony was complete. The United
States sat at the top of the international system, facing no serious rivals for global leadership. This “unipolar
moment” lasted a mere decade. September 11, 2001, signaled the emergence of a new kind of threat to global
stability, and the ensuing rise of China and reemergence of Russia put paid to the era of unchallenged American
leadership. Now, America’s internal politics have deadlocked and the U.S. government shrinks from playing the
role of global policeman. In the second decade of the twenty-first century, American hegemony is widely perceived
to be in terminal decline. Or so the story goes. In fact, reports of the passing of U.S. hegemony are
greatly exaggerated. America’s costly wars in Iraq and Afghanistan were relatively
minor affairs considered in long-term perspective. The strategic challenge posed by China has also
been exaggerated. Together with its inner circle of unshakable English-speaking allies, the United States
possesses near-total control of the world’s seas, skies, airwaves and cyberspace, while
American universities, think tanks and journals dominate the world of ideas. Put aside all
the alarmist punditry. American hegemony is now as firm as or firmer than it has ever been, and will remain so for
a long time to come. THE
MASSIVE federal deficit, negative credit-agency reports, repeated
debt-ceiling crises and the 2013 government shutdown all created the impression that
the U.S. government is bankrupt, or close to it. The U.S. economy imports half a trillion dollars a
year more than it exports. Among the American population, poverty rates are high and ordinary workers’ wages
have been stagnant (in real terms) for decades. Washington seems to be paralyzed by perpetual gridlock. On top of
all this, strategic exhaustion after two costly wars in Afghanistan and Iraq has substantially degraded U.S. military
capabilities. Then, at the very moment the military needed to regroup, rebuild and rearm, its budget was hit by
sequestration. If economic power forms the long-term foundation for political and military power, it would seem
that America is in terminal decline. But policy analysts tend to have short memories. Cycles of
hegemony run in centuries, not decades (or seasons). When the United Kingdom finally defeated
Napoleon at Waterloo in 1815, its national resources were completely exhausted. Britain’s public-debt-to-GDP
ratio was over 250 percent, and early nineteenth-century governments lacked access to the full range of fiscal and
financial tools that are available today. Yet the British Century was only just beginning. The Pax Britannica and the
elevation of Queen Victoria to become empress of India were just around the corner. This is not to argue that the
U.S. government should ramp up taxes and spending, but it does illustrate the fact that it has enormous potential
fiscal resources available to it, should it choose to use them. Deficits come and go. America’s fiscal capacity
in 2015 is stupendously greater than Great Britain’s was in 1815. Financially, there is
every reason to think that America’s century lies in the future, not in the past. The same
is true of the supposed exhaustion of the U.S. military. On the one hand, thirteen years of continuous warfare have
reduced the readiness of many U.S. combat units, particularly in the army. On the other hand, U.S. troops are
now far more experienced in actual combat than the forces of any other major
military in the world. In any future conflict, the advantage given by this experience would
likely outweigh any decline in effectiveness due to deferred maintenance and training.
Constant deployment may place an unpleasant and unfair burden on U.S. service personnel and their
families, but it does not necessarily diminish the capability of the U.S. military. On the
contrary, it may enhance it. America’s limited wars in Afghanistan and Iraq were hardly
the final throes of a passing hegemon. They are more akin to Britain’s bloody but relatively
inconsequential conflicts in Afghanistan and Crimea in the middle of the nineteenth century. Brutal wars like these
repeatedly punctured, but never burst, British hegemony. In fact, Britain engaged in costly and sometimes
disastrous conflicts throughout the century-long Pax Britannica. British hegemony did not come to an end until the
country faced Germany head-on in World War I. Even then, Britain ultimately prevailed (with American help). Its
empire reached its maximum extent not before World War I but immediately after, in 1922.
1NR
Positive and Negative peace are distinct and your contextualization is wrong
Claske Dijkema 5-1-2007. (Claske Dijkema. After obtaining a Master in Sociology and Peace and Conflict Studies in Amsterdam and Berkeley, Claske
worked in South Africa for the African Center for the Constructive Resolution of Disputes (ACCORD), a training and research center on conflict resolution in Africa. Of
Dutch origin, she developed, with the Network University of Amsterdam, online courses on the transformation of conflicts for a global audience ( www.netuni.nl ).
She currently works for Modus Operandi studying the political transformation of the crisis, the dynamics of the breakdown of states, and the perspectives of the
reorganization and reform of the state in sub-Saharan Africa, in particular the Horn of Africa. "Negative versus Positive Peace." No Publication. Web. 5-1-2007.
accessed 9-30-2018. < http://www.irenees.net/bdf_fiche-notions-186_en.html >. Phops)

Johan Galtung, the father of peace studies, often refers to the distinction between ‘negative peace’ and ‘positive peace’ (e.g. Galtung 1996). Negative peace refers to the
absence of violence. When, for example, a ceasefire is enacted, a negative peace will
ensue. It is negative because something undesirable stopped happening (e.g. the
violence stopped, the oppression ended). Positive peace is filled with positive content
such as restoration of relationships, the creation of social systems that serve the
needs of the whole population and the constructive resolution of conflict. Peace does
not mean the total absence of any conflict. It means the absence of violence in all
forms and the unfolding of conflict in a constructive way. Peace therefore exists where
people are interacting non-violently and are managing their conflict positively – with
respectful attention to the legitimate needs and interest of all concerned. The authors of this
dossier consider peace as well-managed social conflict. This definition was decided on during Irenees’ Peace workshop held in South Africa in May 2007.

This means their epistemology-first claims really are bankrupt, because they are
just straight up wrong about what they are endorsing

Vote affirmative to reduce the President’s authority to conduct first-use nuclear strikes by restricting negative peace.

Their advocacy statement is literally in the wrong direction of the theory

This evidence means you should frame your ballot through the lens of
“restoration of relationships, the creation of social systems that serve the needs of the
whole population and the constructive resolution of conflict.” –

We believe that the global hegemonic order is better at achieving this


Extend the Jackson evidence – the internal warrant is that Ks of epistemology are
unable to change the political opinions of realist actors – we are controlling
the UQ of this question

Mearsheimer – There is also an epistemology disad to the aff – the dominant forms of IR
theory are so ingrained that it’s impossible for people to divorce themselves
from those commitments
Second is the theory disad – the aff fails because we lack knowledge of what
change is to come and whether the new form of discourse is worse
Weigh our specific evidence to get closer to the truth
Kratochwil 2008 (Friedrich Kratochwil, professor of international relations at European University Institute,
2008(Friedrich, “The Puzzles of Politics,” pg. 200-213)
In what follows, I claim that the shift in focus from “demonstration” to science as practice provides strong prima
facie reasons to choose pragmatic rather than traditional epistemological criteria in social analysis.
Irrespective of its various forms, the epistemological project includes an argument that all warranted knowledge has to satisfy certain field-independent criteria that are specified by philosophy (a “theory of knowledge”). The real issue of how our concepts and the world relate to each other, and on which non-idiosyncratic grounds we are justified to hold on to our beliefs about the world, is “answered” by two metaphors. The first is that of an inconvertible ground, be it the nature of things, certain intuitions (Descartes’ “clear and distinct ideas”)
or methods and inferences; the second is that of a “mirror” that shows what is the case.¶ There is no need to rehearse the arguments demonstrating that these under- lying beliefs and
metaphors could not sustain the weight placed upon them. A “method” à la Descartes could not make good on its claims, as it depended ultimately on the guarantee of God that concepts and
things in the outer world match. On the other hand, the empiricist belief in direct observation forgot that “facts” which become “data” are – as the term suggests – “made”. They are based on the judgements of the observer using cultural criteria, even if they appear to be based on direct perception, as is the case with colours.4¶ Besides, there had always been a sneaking suspicion that the epistemological ideal of certainty and rigour did
not quite fit the social world, an objection voiced first by humanists such as Vico, and later rehearsed in the continuing controversies about erklären and
verstehen (Weber 1991; for a more recent treatment see Hollis 1994). In short, both the constitutive nature of our concepts, and the value interest in which they are embedded, raise peculiar
issues of meaning and contestation that are quite different from those of description. As Vico (1947) suggested, we “understand” the social world because we have “made it”, a point raised
again by Searle concerning both the crucial role played by ascriptions of meaning (x counts for y) in the social world and the distinction between institutional “facts” from “brute” or natural
facts (Searle 1995). Similarly, since values are constitutive for our “interests”, the concepts we use always portray an action from a certain point of view; this involves appraisals and prevents
us from accepting allegedly “neutral” descriptions that would be meaningless. Thus, when we say that someone “abandoned” another person and hence communicate a (contestable)

appraisal, we want to call attention to certain important moral implications of an act. Attempting to eliminate the value-tinge in the description and insisting that everything has to be cast in neutral, “objective”, observational language – such as “he opened the door and went through it” – would indeed make the statement
“pointless”, even if it is (trivially) “true” (for a powerful statement of this point, see Connolly 1983).¶ The most devastating attack on the
epistemological project, however, came from the history of science itself. It not only
corrected the naive view of knowledge generation as mere accumulation of data, but
it also cast increasing doubt on the viability of various field-independent “demarcation
criteria”. This was, for the most part, derived from the old Humean argument that only sentences with empirical content were “meaningful”, while value statements had to be taken
either as statements about individual preferences or as meaningless, since de gustibus non est disputandum. As the later discussion in the Vienna circle showed, this distinction was utterly
unhelpful (Popper 1965: ch. 2). It did not solve the problem of induction, and failed to acknowledge that not all meaningful theoretical sentences must correspond with natural facts.¶ Karl

Popper’s ingenious solution of making “refutability” the logical criterion and
interpreting empirical “tests” as a special mode of deduction (rather than as a way of increasing supporting evidence)
seemed to respond to this epistemological quandary for a while. An “historical reconstruction” of science as a progressive development thus seemed possible, as did the specification of a
pragmatic criterion for conducting research.¶ Yet again, studies in the history of science undermined both hopes. The different stages in Popper’s own intellectual development are, in fact,
rather telling. He started out with a version of conjectures and refutations that was based on the notion of a more or less self-correcting demonstration. Confronted with the findings that
scientists did not use the refutation criterion in their research, he emphasised then the role of the scientific community on which the task of “refutation” devolved. Since the individual scientist
might not be ready to bite the bullet and admit that she or he might have been wrong, colleagues had to keep him or her honest. Finally, towards the end of his life, Popper began to rely less
and less on the stock of knowledge or on the scientists’ shared theoretical understandings – simply devalued as the “myth of the framework” – and emphasised instead the processes of
communication and of “translation” among different schools of thought within a scientific community (Popper 1994). He still argued that these processes follow the pattern of “conjecture
and refutation”, but the model was clearly no longer that of logic or of scientific demonstration, but one that he derived from his social theory – from his advocacy of an “open society” (Popper
1966). Thus a near total reversal of the ideal of knowledge had occurred. While formerly everything was measured in terms of the epistemological ideal derived from logic and physics,
“knowledge” was now the result of deliberation and of certain procedural notions for assessing competing knowledge claims. Politics and law, rather than physics, now provided the

template.¶ Thus the history of science has gradually moved away from the epistemological ideal to
focus increasingly on the actual practices of various scientific communities engaged in
knowledge production, particularly on how they handle problems of scientific
disagreement.5 This reorientation implied a move away from field-independent criteria
and from the demonstrative ideal to one in which “arguments” and the “weight” of
evidence had to be appraised. This, in turn, not only generated a bourgeoning field of “science studies” and their “social” epistemologies (see Fuller
1991), but also suggested more generally that the traditional understandings of knowledge production

based on the model of “theory” were in need of revision.¶ If the history of science therefore provides
strong reasons for a pragmatic turn, as the discussion above illustrates, what remains to be shown is how this turn relates to the historical, linguistic
and constructivist turns that preceded it. To start with, from the above it should be clear that, in the social world, we are not

dealing with natural kinds that exist and are awaiting, so to speak, prepackaged, their
placement in the appropriate box. The objects we investigate are rather conceptual
creations and they are intrinsically linked to the language through which the social
world is constituted. Here “constructivists”, particularly those influenced by Wittgenstein and language philosophy, easily link up
with “pragmatists” such as Rorty, who emphasises the productive and pragmatic role of
“vocabularies” rather than conceiving of language as a “mirror of nature” (Rorty 1979).¶ Furthermore,
precisely because social facts are not natural, but have to be reproduced through the actions of agents, any attempt to treat them like “brute” facts becomes doubly problematic. For one,
even “natural” facts are not simply “there”; they are interpretations based on our theories. Secondly, different from the observation of natural facts, in which perceptions address a “thing”

through a conceptually mediated form, social reality is entirely “artificial” in the sense that it is dependent on the beliefs and practices of the actors themselves. This reproductive process, directed by norms, always engenders change
either interstitially, when change is small-scale or adaptive – or more dramatically, when it becomes “transformative” – for instance when it produces a new system configuration, as after the

advent of nationalism (Lapid and Kratochwil 1995) or after the demise of the Soviet Union (Koslowski and Kratochwil 1994). Consequently, any examination of the social world has to become in a way “historical” even if some
“structuralist” theories attempt to minimise this dimension. [. . .]¶ Therefore a pragmatic
approach to social science and IR seems both necessary and promising. ¶ On the one
hand, it is substantiated by the failure of the epistemological project that has long
dominated the field. On the other, it offers a different positive heuristics that
challenges IR’s traditional disciplinary boundaries and methodological assumptions.
Interest in pragmatism therefore does not seem to be just a passing fad – even if such an interpretation cannot entirely be discounted, given the incentives of academia to find, just like
advertising agencies, “new and improved” versions of familiar products.

Our contingent knowledge is good enough- their ev uses the same standard
Sandberg 2005 (Jorgen Sandberg, Chair of Management and Organisation at the University of Queensland's
Business School, January 2005, “How Do We Justify Knowledge Produced Within Interpretive Approaches?,”
Organizational Research Methods, Vol. 8 No. 1, 41-68)
The dilemma interpretive researchers face can be stated in the following way: At the
same time as advocates of interpretive research deny the possibility of producing
objective knowledge, they want to claim that the knowledge they generate is true in
some way or another. But how can they justify their knowledge as true if they deny the idea of objective
truth? Does not the rejection of objective truth mean that advocates of interpretive
research approaches are forever condemned to produce arbitrary and relativist
knowledge? This is unlikely because it does not follow from the rejection of objective truth
that we cannot produce valid and reliable knowledge about reality. Despite the
rejection of objective truth, as Wachterhouser (2002) proposed, we can still “develop, apply,
and retest criteria of knowledge that give us enough reliable evidence or rational
assurance to claim in multiple cases that we in fact know something and do not just
surmise or opine that it is the case” (p. 71).
Extinction outweighs
Extinction outweighs – extend GPP 17 – they underestimate the infinite value of
potentially infinite future generations

Their dismissal is shaped by cognitive bias since extinction hasn’t occurred yet – the
factors that make it possible for the president to have overarching and dangerous
power are a new phenomenon

Scope neglect also explains their aversion to recognizing the difference between saving ten thousand people and 7.6 billion – it’s statistically proven
that people are numbed to the scope of potential benefits when the numbers
get too big
GPP 17 (Global Priorities Project, Future of Humanity Institute at the University of Oxford, Ministry for Foreign Affairs of Finland, “Existential Risk: Diplomacy and Governance,” Global Priorities Project, 2017, https://www.fhi.ox.ac.uk/wp-content/uploads/Existential-Risks-
2017-01-23.pdf) 1.2. THE ETHICS OF EXISTENTIAL RISK In his book Reasons and Persons, Oxford philosopher Derek Parfit advanced an influential argument about the importance of avoiding extinction: I believe that if we destroy mankind, as we now can, this outcome will be much worse
than most people think. Compare three outcomes: (1) Peace. (2) A nuclear war that kills 99% of the world’s existing population. (3) A nuclear war that kills 100%. (2) would be worse than (1), and (3) would be worse than (2). Which is the greater of these two differences? Most people
believe that the greater difference is between (1) and (2). I believe that the difference between (2) and (3) is very much greater. ... The Earth will remain habitable for at least another billion years. Civilization began only a few thousand years ago. If we do not destroy mankind, these few
thousand years may be only a tiny fraction of the whole of civilized human history. The difference between (2) and (3) may thus be the difference between this tiny fraction and all of the rest of this history. If we compare this possible history to a day, what has occurred so far is only a
fraction of a second.65 In this argument, it seems that Parfit is assuming that the survivors of a nuclear war that kills 99% of the population would eventually be able to recover civilisation without long-term effect. As we have seen, this may not be a safe assumption – but for the purposes
of this thought experiment, the point stands. What makes existential catastrophes especially bad is that they would “destroy the future,” as another Oxford philosopher, Nick Bostrom, puts it.66 This future could potentially be extremely long and full of flourishing, and would therefore
have extremely large value. In standard risk analysis, when working out how to respond to risk, we work out the expected value of risk reduction, by weighing the probability that an action will prevent an adverse event against the severity of the event. Because the value of preventing
existential catastrophe is so vast, even a tiny probability of prevention has huge expected value.67 Of course, there is persisting reasonable disagreement about ethics and there are a number of ways one might resist this conclusion.68 Therefore, it would be unjustified to be
overconfident in Parfit and Bostrom’s argument. In some areas, government policy does give significant weight to future generations. For example, in assessing the risks of nuclear waste storage, governments have considered timeframes of thousands, hundreds of thousands, and even a
million years.69 Justifications for this policy usually appeal to principles of intergenerational equity according to which future generations ought to get as much protection as current generations.70 Similarly, widely accepted norms of sustainable development require development that
meets the needs of the current generation without compromising the ability of future generations to meet their own needs.71 However, when it comes to existential risk, it would seem that we fail to live up to principles of intergenerational equity. Existential catastrophe would not only
give future generations less than the current generations; it would give them nothing. Indeed, reducing existential risk plausibly has a quite low cost for us in comparison with the huge expected value it has for future generations. In spite of this, relatively little is done to reduce existential
risk. Unless we give up on norms of intergenerational equity, they give us a strong case for significantly increasing our efforts to reduce existential risks. 1.3. WHY EXISTENTIAL RISKS MAY BE SYSTEMATICALLY UNDERINVESTED IN, AND THE ROLE OF THE INTERNATIONAL COMMUNITY In
spite of the importance of existential risk reduction, it probably receives less attention than is warranted. As a result, concerted international cooperation is required if we are to receive adequate protection from existential risks. 1.3.1. Why existential risks are likely to b e underinvested in
There are several reasons why existential risk reduction is likely to be underinvested in. Firstly, it is a global public good. Economic theory predicts that such goods tend to be underprovided. The benefits of existential risk reduction are widely and indivisibly dispersed around the globe
from the countries responsible for taking action. Consequently, a country which reduces existential risk gains only a small portion of the benefits but bears the full brunt of the costs. Countries thus have strong incentives to free ride, receiving the benefits of risk reduction without contributing. As a result, too few do what is in the common interest. Secondly, as already suggested above, existential risk reduction is an intergenerational public good: most of the benefits

are enjoyed by future generations who have no say in the political process. For these
goods, the problem is temporal free riding: the current generation enjoys the benefits
of inaction while future generations bear the costs. Thirdly, many existential risks, such as machine superintelligence, engineered pandemics, and solar geoengineering, pose an unprecedented and uncertain future threat. Consequently, it is hard to develop a
satisfactory governance regime for them: there are few existing governance instruments which can be applied to these risks, and it is unclear what shape new instruments should take. In this way, our position with regard to these emerging risks is comparable to the one we faced when nuclear weapons first became available. Cognitive biases also lead people to underestimate existential risks. Since
there have not been any catastrophes of this magnitude, these risks are not salient to
politicians and the public.72 This is an example of the misapplication of the availability heuristic, a mental shortcut which assumes that something is important only if it can be readily recalled. Another cognitive bias affecting perceptions of existential risk is scope neglect. In a seminal 1992 study, three

groups were asked how much they would be willing to pay to save 2,000, 20,000 or
200,000 birds from drowning in uncovered oil ponds. The groups answered $80, $78,
and $88, respectively.73 In this case, the size of the benefits had little effect on the scale of the preferred response. People become numbed to the effect of saving lives when the numbers get too large.74 Scope neglect is a particularly acute

problem for existential risk because the numbers at stake are so large. Due to scope
neglect, decision-makers are prone to treat existential risks in a similar way to
problems which are less severe by many orders of magnitude. A wide range of other cognitive biases are likely to affect the evaluation of
existential risks.75
Even if they win that some form of violence is inevitable, util still says prioritize access to
subjective pleasure

Life has intrinsic and objective value achieved through subjective pleasures---its preservation
should be an a priori goal
Amien Kacou 8 WHY EVEN MIND? On The A Priori Value Of “Life”, Cosmos and History: The Journal of Natural and Social Philosophy, Vol 4, No 1-2 (2008)
cosmosandhistory.org/index.php/journal/article/view/92/184
Furthermore, that manner of finding things good that is in pleasure can certainly not exist in any world without
consciousness (i.e., without “life,” as we now understand the word)—slight analogies put aside. In fact, we can begin to develop a more
sophisticated definition of the concept of “pleasure,” in the broadest possible sense of the word, as follows: it is the common psychological element in

all psychological experience of goodness (be it in joy, admiration, or whatever else). In this sense, pleasure can always be pictured to “mediate” all awareness
or perception or judgment of goodness: there is pleasure in all consciousness of things good; pleasure is the common element

of all conscious satisfaction. In short, it is simply the very experience of liking things, or the liking of experience, in general. In this
sense, pleasure is, not only uniquely characteristic of life but also, the core expression of goodness in life—the most general sign or

phenomenon for favorable conscious valuation, in other words. This does not mean that “good” is absolutely
synonymous with “pleasant”—what we value may well go beyond pleasure. (The fact that we value things needs not be reduced to the
experience of liking things.) However, what we value beyond pleasure remains a matter of speculation or theory. Moreover, we

note that a variety of things that may seem otherwise unrelated are correlated with pleasure—some more strongly than others. In other words, there are many things

the experience of which we like. For example: the admiration of others; sex; or rock-paper-scissors. But, again, what
they are is irrelevant in an inquiry on a priori value —what gives us pleasure is a matter for empirical investigation. Thus, we can see
now that, in general, something primitively valuable is attainable in living—that is, pleasure itself. And it seems
equally clear that we have a priori logical reason to pay attention to the world in any world where pleasure
exists. Moreover, we can now also articulate a foundation for a security interest in our life: since the good of
pleasure can be found in living (to the extent pleasure remains attainable),[17] and only in living, therefore, a priori, life ought to
be continuously (and indefinitely) pursued at least for the sake of preserving the possibility of finding that
good. However, this platitude about the value that can be found in life turns out to be, at this point, insufficient for our purposes. It seems to amount to very little more than
recognizing that our subjective desire for life in and of itself shows that life has some objective value. For what difference is there between

saying, “living is unique in benefiting something I value (namely, my pleasure); therefore, I should desire to go
on living,” and saying, “I have a unique desire to go on living; therefore I should have a desire to go on living,”
whereas the latter proposition immediately seems senseless? In other words, “life gives me pleasure,” says little more than, “I like life.” Thus, we seem to have arrived

at the conclusion that the fact that we already have some (subjective) desire for life shows life to have some
(objective) value. But, if that is the most we can say, then it seems our enterprise of justification was quite superficial, and the subjective/objective distinction was useless—
for all we have really done is highlight the correspondence between value and desire. Perhaps, our inquiry should be a bit more complex.
A2 Hierarchy – Util is for everyone – util on balance saves more lives – no answer
to future generations means they still matter – it doesn’t justify oppression – all of
our turns show that
A2 Invisible / Structural Violence – Juggernaut answers
Extend Tierney 16, the aff puts itself in the position of the juggernaut. They
claim that by establishing an NFU they will be able to get rid of the credible
threat of US first use. However, that comes from a position of power. It is
exceptionally easy for the US, a country with a conventional army larger than
anything seen in history, to say they won’t use nukes because they just don’t
have to. That doesn’t make other countries less afraid nor does it make them
less likely to build their own nuclear arsenal. This obsession with things like
nuclear weapons papers over the violence caused by US conventional
warfare – rendering Syrians a sacrifice – and is a reason to reject their aff or at
least question the logic of their impacts.
