
Wind PTC Georgia

Negative Novice Packet


Index
Index............................................................................................................................................................................................................1
Econ frontline...............................................................................................................................................................................................2
Econ – frontline............................................................................................................................................................................................3
Econ – ext 3 – blackouts answers................................................................................................................................................................4
Econ – ext 3 – blackouts answers................................................................................................................................................................5
Econ – ext 4 – employment answers............................................................................................................................................................6
Economy IL answer.....................................................................................................................................................................................7
Warming frontline........................................................................................................................................................................................8
Warming Frontline.......................................................................................................................................................................................9
Warming frontline......................................................................................................................................................................................10
Warming – ext 1 – slow.............................................................................................................................................................................11
Warming – ext 3 – not human caused........................................................................................................................................................12
Warming – ext 3 – not human caused........................................................................................................................................................13
Warming – ext 4 – inevitable.....................................................................................................................................................................14
Warming – A2 hurts Oceans......................................................................................................................................................................15
Warming – A2 hurts agriculture.................................................................................................................................................................16
Warming – A2 hurts agriculture.................................................................................................................................................................17
Solvency frontline......................................................................................................................................................................................18
Solvency frontline......................................................................................................................................................................................19
Solvency – ext 1 – wind inevitable............................................................................................................................................................20
Solvency ext 2 – wind is growing..............................................................................................................................................................21
Solvency – ext 3 – PTC fails......................................................................................................................................................................22
Solvency – ext 4 – PTC not enough...........................................................................................................................................................23

Econ frontline
1. PTC alone can’t solve—lack of skilled workers causes manufacturing outsourcing and loss of jobs, destroying
the renewable energy market
USA Today in 8 (“Taxes, Worker Shortage Worry Renewable Energy Firms,” 02-02-08,
http://www.usatoday.com/money/industries/energy/2008-02-02-alternativeenergy_N.htm)

The federal government not only must extend the tax credits, but provide more money for training workers, said George
Sterzinger, executive director of the Washington-based Renewable Energy Policy Project.
If not, manufacturing will go overseas and the jobs will be lost, he said. It makes no sense, he added, to wean America off its
dependence on foreign oil only to become dependent on other countries for products in sustainable energy production.
"You look at a wind turbine. It's got a whole bunch of parts. Somebody makes the blades, somebody makes the tower, somebody
makes the gearboxes, the electronic controls," Sterzinger said. "Those parts can come from China, India -- or from Buffalo."
The wind-energy industry currently employs about 45,000 people in the U.S. and had $9 billion worth of investment last year, a 45
percent increase from 2006, Swisher said.
"Given that growth, we're already seeing constraints in terms of workers," he said.
Swisher estimates that by 2030, nearly a half-million jobs could be created in the wind industry, in manufacturing, construction
and operation.
The solar industry, too, is growing. Last year set a record with 314 megawatts of new solar capacity installed in the United States,
said Rhone Resch, president of the Solar Energy Industries Association. That's enough to power about 80,000 homes, he said.
The market was worth just about $200 million five years ago. Last year, it topped $2 billion, Resch said.
"These are jobs that are really the backbone of the economy, jobs like roofers, carpenters, electricians and plumbers," he said. "But
the federal government is completely asleep at the switch here."

2. Renewable energy drives up electricity prices due to new transmission lines and intermittency

Kranenburg in 7—CFA, Director, Business Development, Edison Electric Institute (Roger, Insight Magazine, “Charting a Course
for Renewables”, October 2007, http://www.platts.com/Magazines/Insight/2007/oct/200710B25150Eh3C0sQw3H_1.xml)

New, high-voltage transmission lines must also be built to accommodate the new renewable energy facilities. The siting of
renewable generation, especially wind power, is generally in remote, rural areas, far from existing transmission wires. These
transmission expansions can cost approximately $1 million to $3 million per mile to build. And these lines can also take years to
complete due to red-tape delays (especially when involving federal lands, which are particularly common in western states) and
public concerns.
The intermittent nature of wind and solar resources also can have costly impacts on the electric grid related to generation
interconnection and integration, transmission planning, and system and market operations, all of which must be taken into account
by utility planners. And because many renewable energy resources are intermittent, in that they do not operate consistently, electric
utilities will still need to build generating facilities using conventional fuels—most likely natural gas, which has experienced
volatile price swings of late—to support the grid reliability.

Econ – frontline

3. Problems with the grid are being solved in the status quo
Fox News, 8
(Allison Barrie, “Project Hydra: Keeping Power Out of the Hands of Terrorists,” 6-6-2008, http://www.foxnews.com/story/0,2933,364104,00.html) // SM

The closer the grid gets to hitting capacity and buckling from consumer demand, the more and more vulnerable it becomes to natural disasters and terrorist attacks causing blackouts, rolling outages and cascading
failures.
The Department of Energy has taken the lead on countering this threat and has come up with a plan to be rolled out in 2020. But will
terrorists conveniently wait for the next 12 years to exploit this vulnerability?
Fortunately, some groups are stepping in to fill the gap. ConEdison, American Superconductor and the Department of Homeland Security are determined
to keep the lights on in New York no matter what terrorists throw at the grid.
In less than two years, the three organizations plan to launch a program they’re calling the Resilient Electric Grid, which provides a new
superconductor cable that can link up stations and ensure the steady flow of juice to all parts of the city.
Right now, if an area like the financial district is targeted and goes down, the grid will not allow any other stations to assist by donating electricity to keep the lights on in that area.
But when this superconducting cable is integrated with the existing electrical grid, it will link up substations and allow them to share
excess capacity in case of an emergency.
In the event of a deliberate attempt to cause a cascading failure similar to the blackout of 2003, it also will be able to limit the
current flow between substations during fault conditions.
The effort was dubbed "Project Hydra" after the mythical beast that grew a new head each time any was chopped off.
Once the capability for multi-path electrical resilience goes live in the New York City electric grid in 2010, the plan is to roll out
Hydra to protect other national critical infrastructure.
Project Hydra also has plans to install micro wind and water turbine generators on rooftops to ensure ongoing power generation for neighborhoods in the event of a crisis.
Guarding every power line from a terrorist attack is an impossibility for forces that already lack resources, but Project Hydra will allow our guns, guards and badges to be focused on nuclear plants and other places
where they are critical to stopping terrorist attacks.

4. Renewable energy won’t produce more jobs in the short-term


Johnson, 8 – has spent the past decade reporting from Europe, increasingly on energy issues
(Keith, “Any Given Wednesday: Green Jobs on the Hill,” 2-5-2008, http://blogs.wsj.com/environmentalcapital/2008/02/05/)

Nobody can really argue against more, and higher-paying jobs. But at a time of historically low unemployment, in a world where
fossil fuels are expected to provide nearly 80% of energy for the foreseeable future, many economists ask: how many net jobs can
“greening” really create?
“In the short run, there’s no way net jobs are going to be positive” from renewable energy alone, says John Whitehead, an
economics professor at Appalachian State University and half of Environmental Economics. “More brown-energy jobs will be
lost.”

Econ – ext 3 – blackouts answers
The plan places more stress on the electric grid by requiring more transmission capability
Executive Intelligence Review, 6 (Marsha Freeman, “NERC Forecast: 22 Necessary Actions Required to Save U.S. Electric
Grid,” http://www.larouchepub.com/other/2006/3343power_shortages.html)

'Renewable' Resources Hoax


Another craze with the potential to destabilize the fragile electric grid is the promotion of "renewable" energy sources. Currently, a
total of 21 states and the District of Columbia have adopted requirements for the purchase of renewable energy by utilities,
sometimes for as much as 25% of their total supply. Wind generation is expected to provide the bulk of this "renewable" energy.
However, NERC points out, "wind generation is often located in remote areas, which requires new transmission construction to
deliver its energy" to where it is needed. In addition, because wind and other "renewable" resources are intermittent in nature,
generating capacity is unpredictable, requiring the installation of additional reliable generating capacity, usually fossil-fueled, to
ensure the ability to serve customers.

Our evidence assumes yours – the system is inherently stable – down areas can just be bypassed; only a large
problem will cause major issues
Kaplan, 7 – Associate Editor at the Council on Foreign Relations (Eben, “America’s Vulnerable Energy Grid,” 4-27-2007,
http://www.cfr.org/publication/13153/americas_vulnerable_energy_grid.html) // SM

With some 160,000 miles of high voltage lines and 250,000 substations, the U.S. power grid remains open to a host of threats. “It’s
extremely difficult to harden,” says Gellings.
The system was built at a time when its vulnerabilities had little impact; even today, under normal conditions a downed line or a
substation can easily be bypassed. “It’s like a web, you can go around issues,” explains Ed Legge, a spokesman for the Edison
Electric Institute, an association of publicly owned electric companies. Of course, when a grid becomes overloaded, losing a line
increases the strain and could cause failure. Circumventing one or two disruptions is one matter, but a host of simultaneous
interruptions will still cause widespread outages. This was the scenario in St. Louis in 2006, when a powerful storm knocked out
so many lines at once that the grid could no longer function.

Econ – ext 3 – blackouts answers

The root cause of massive failures has been fixed – NERC regulates transmission now

Kaplan, 7 – Associate Editor at the Council on Foreign Relations (Eben, “America’s Vulnerable Energy Grid,” 4-27-2007,
http://www.cfr.org/publication/13153/americas_vulnerable_energy_grid.html) // SM

Managing Risk
Overgrown trees alone did not precipitate the massive 2003 blackout. The greater cause was a grid so overloaded it had become
unstable. The trees merely provided the catalyst for a chain reaction that, had the system been operating stably, never would have
been possible. The North American Electrical Reliability Corporation (NERC) is responsible for ensuring such conditions are not
repeated. NERC’s president and CEO, Richard P. Sergel, explains that three of his agency’s standards were not being met when the
lights went out across the Northeast that summer: Trees went untrimmed, operators lacked the proper training, and monitoring
systems showing the grid’s condition in real-time were not in place.
Since its inception in 1968, NERC’s regulations for operating power grids have been voluntary. But in 2005, Congress asked the
Federal Energy Regulatory Commission (FERC) to designate an organization to establish and enforce rules of operation for the
nation’s electrical grid; it settled on NERC, which assumes the responsibility on June 4, 2007.
When this happens, NERC guidelines for safe operation of the electrical grid, which are currently voluntary, will become
mandatory. Under NERC’s oversight, Sergel says, consumers can rest assured the conditions that made the 2003 blackout possible
will not be replicated. “No matter how stressed the system is,” he says, “We still insist it operate stably.” At times this could mean
less reliable service; brief, managed outages could occur in order to avoid overburdening the system and risking massive failure.
Such was the case in Texas in April 2006, when hundred-degree temperatures pushed energy demands beyond the capability of the
transmission infrastructure. Though not everyone had power all the time, the relatively brief service interruptions helped allay a
massive system failure.

The 2005 Energy Bill solves for blackouts – multiple warrants

IEEE, 7 (Institute of Electrical and Electronics Engineers, Inc, “Reliability and Blackouts,” 4-25-2007+,
http://www.electripedia.info/reliability.asp) // SM

After the major blackouts of 1965, the North American Electric Reliability Council (NERC) was created to provide guidelines to
prevent a recurrence of such a blackout [6]. The main purpose of NERC was to ensure that every region had sufficient "reserves"
(sufficient "extra generation" instantly available) to make sure the system could tolerate the loss of any single piece of equipment
without disruption at any time, and in many cases to sustain the loss of more than one piece of equipment. It also adopted rules that
diversified these reserves sufficiently throughout the system to make sure that no major transmission problems would occur [9]. In
the wake of the Blackout of 2003, NERC’s powers were significantly expanded by the Energy Policy Act of 2005. The Energy Policy Act of
2005 (EPAct) (Full Act: Pub.L. 109-058; Summary: Research Service) addressed reliability issues in a number of ways. First, EPAct mandated the creation of a self-regulatory Electric Reliability Organization (ERO)
that spans North America, with oversight by the Federal Energy Regulatory Commission (FERC). In 2006, FERC certified NERC as the ERO for the United States [12]. As the ERO, NERC is
responsible for establishing and enforcing FERC-approved electric reliability standards [13]. Furthermore, Mexican and Canadian
authorities have promised to back NERC’s regulations with the force of law [14].
Second, EPAct mandates that the Department of Energy (DOE) conduct a study of electric transmission congestion every three
years and gives the DOE authority to designate “national interest electric transmission corridors,” which will receive special
attention and funding to ensure reliability is upheld [13, 15]. The corridors are established based on the level of electric congestion
in the area, the economic vitality of development of the corridor, end markets served by the corridor, and prices of electricity
resulting from any electricity congestion. FERC then has the authority to issue permits for the construction of transmission
facilities in the corridor (if it determines that the host state does not have the authority to approve the siting). Recognizing the
importance of electric reliability, EPAct grants the ERO and FERC power to acquire right-of-way by the exercise of right of
eminent domain for siting transmission expansion [13, 15].
Furthermore, EPAct mandates the adoption of IEEE 1574 Standard for Interconnecting Distributed Resources with Electric Power
Systems. IEEE 1574 is a “technology-neutral” standard which does not specify specific types of equipment needed to meet interconnection requirements. Instead, the standard focuses on ensuring the ability for
interconnection of any on-site facility. In doing so, it addresses both operational and safety issues while focusing on the functional requirements of the interconnection and not on the specific types of equipment to meet
the functional requirements. The standard will increase the diversity of the electricity supply by facilitating development of fuel cells,
photovoltaics and other distributed energy generation technologies, and help ensure the reliability and safety of the nation’s electric
power system for decades to come [16].

Econ – ext 4 – employment answers
Their employment evidence assumes investment in inefficient renewables that takes workers away from more
productive jobs
Michaels, 8 – Professor of Economics at CSU Fullerton
(Robert J., Energy Law Journal, “National Renewable Portfolio Standard: Smart Policy or Misguided Gesture?” 29 Energy L. J. 79, Lexis-Nexis Academic)

Some advocates take job creation to such lengths that they endorse inefficient renewables over more efficient ones. One study
notes that "the [*89] renewable energy sector generates more jobs per megawatt of power installed, per unit of energy produced,
and per dollar of investment, than the fossil fuel-based energy sector." n31 It compares coal and solar units under an assumption
that four megawatts of intermittent solar capacity are equivalent to one megawatt of coal-fired capacity. Building either takes the
same labor input per megawatt, as would building a base-loadable MW of biomass capacity. The study's authors conclude in favor
of solar because it creates four times as many construction jobs per effective megawatt as coal or biomass, and also requires more
labor to operate. In effect the solar project attracts an unnecessarily large number of workers from more productive jobs and pays
them from the higher bills of captive consumers. n32

Economy IL answer
The U.S. and global economies are resilient – new macroeconomic policies help the economy absorb shocks
Behravesh, 6 (Nariman, most accurate economist tracked by USA Today and chief global economist and executive vice president
for Global Insight, Newsweek, “The Great Shock Absorber; Good macroeconomic policies and improved microeconomic
flexibility have strengthened the global economy's 'immune system.'” 10-15-2006, www.newsweek.com/id/47483)

The U.S. and global economies were able to withstand three body blows in 2005--one of the worst tsunamis on record (which struck at
the very end of 2004), one of the worst hurricanes on record and the highest energy prices after Hurricane Katrina--without missing a
beat. This resilience was especially remarkable in the case of the United States, which since 2000 has been able to shrug off the
biggest stock-market drop since the 1930s, a major terrorist attack, corporate scandals and war.
Does this mean that recessions are a relic of the past? No, but recent events do suggest that the global economy's "immune system" is now strong
enough to absorb shocks that 25 years ago would probably have triggered a downturn. In fact, over the past two decades, recessions have not
disappeared, but have become considerably milder in many parts of the world. What explains this enhanced recession resistance? The answer: a
combination of good macroeconomic policies and improved microeconomic flexibility.
Since the mid-1980s, central banks worldwide have had great success in taming inflation. This has meant that long-term interest rates are at levels not seen in more than 40 years. A low-
inflation and low-interest-rate environment is especially conducive to sustained, robust growth. Moreover, central bankers have avoided some of the policy mistakes of the earlier oil
shocks (in the mid-1970s and early 1980s), during which they typically did too much too late, and exacerbated the ensuing recessions. Even more important, in
recent years the
Fed has been particularly adept at crisis management, aggressively cutting interest rates in response to stock-market crashes,
terrorist attacks and weakness in the economy.
The benign inflationary picture has also benefited from increasing competitive pressures, both worldwide (thanks to globalization and the rise of Asia as a manufacturing juggernaut) and
domestically (thanks to technology and deregulation). Since the late 1970s, the United States, the United Kingdom and a handful of other countries have been especially aggressive in
deregulating their financial and industrial sectors. This has greatly increased the flexibility of their economies and reduced their vulnerability to inflationary shocks. Looking ahead, what
all this means is that a global or U.S. recession will likely be avoided in 2006, and probably in 2007 as well. Whether the current expansion will be able to break the record set in the 1990s
for longevity will depend on the ability of central banks to keep the inflation dragon at bay and to avoid policy mistakes. The prospects look good. Inflation is likely to remain a low-level
threat for some time, and Ben Bernanke, the incoming chairman of the Federal Reserve Board, spent
much of his academic career studying the past mistakes
of the Fed and has vowed not to repeat them.
At the same time, no single shock will likely be big enough to derail the expansion. What if oil prices rise to $80 or $90 a barrel? Most estimates suggest that
growth would be cut by about 1 percent--not good, but no recession. What if U.S. house prices fall by 5 percent in 2006 (an extreme assumption, given that house prices haven't fallen
nationally in any given year during the past four decades)? Economic growth would slow by about 0.5 percent to 1 percent. What about another terrorist attack? Here the scenarios can be
pretty scary, but an
attack on the order of 9/11 or the Madrid or London bombings would probably have an even smaller impact on
overall GDP growth.
So what would it take to trigger a recession in the U.S. or world economies over the next couple of years? Two or more big shocks occurring more or less simultaneously. Global Insight
recently ran a scenario showing that a world recession could happen if the following combination of events were to take place: oil prices above $100 per barrel, inflation and interest rates
running 3 percentage points above current levels and a 10 percent drop in home prices across many industrial nations (e.g., the United States, the United Kingdom, Spain, Australia,
Sweden). The likely timing of such a recession would be 2007. However, given the extremeness of these assumptions, the probability of such a scenario is less than 20 percent.
The good news is that the chances of a recession occurring in the next couple of years are low. The not-so-good news is that assertions about recessions being relegated to history's trash
heap are still premature.

Warming frontline
1. No impact to climate change – the rate of warming is slowing down now.
Science Daily, 5/5
[Science Daily, May 5 2008, “Will Global Warming Take A Short Break? Improved Climate Predictions Suggest A Reduced Warming Trend During The Next 10 Years”,
<http://www.sciencedaily.com/releases/2008/05/080502113749.htm>]

To date climate change projections, as published in the last IPCC report, only considered changes in future atmospheric
composition. This strategy is appropriate for long-term changes in climate such as predictions for the end of the century. However, in order to predict
short-term developments over the next decade, models need additional information on natural climate variations, in particular
associated with ocean currents. Lack of sufficient data has hampered such predictions in the past. Scientists at IFM-GEOMAR and from the MPI for
Meteorology have developed a method to derive ocean currents from measurements of sea surface temperature (SST). The latter are available in good quality and
global coverage at least for the past 50 years. With this additional information, natural decadal climate variations, which are superimposed
on the long-term anthropogenic warming trend, can be predicted. The improved predictions suggest that global warming will
weaken slightly during the following 10 years. “Just to make things clear: we are not stating that anthropogenic climate change won’t be as bad as
previously thought”, explains Prof. Mojib Latif from IFM-GEOMAR. “What we are saying is that on top of the warming trend there is a long-
periodic oscillation that will probably lead to a lower temperature increase than we would expect from the current trend
during the next years”, adds Latif. “That is like driving from the coast to a mountainous area and crossing some hills and valleys
before you reach the top”, explains Dr. Johann Jungclaus from the MPI for Meteorology. “In some years trends of both phenomena, the anthropogenic
climate change and the natural decadal variation will add leading to a much stronger temperature rise.”

2. Warming rate and impacts overstated – scientists mint money off of warming hype and media exaggeration.
Kelly, Real Clear Politics, 2008
[Jack Kelly, January 8 2008, Real Clear Politics, “Media Promotes Global Warming Alarmism”, <http://www.realclearpolitics.com/articles/2008/01/temperatures_trending_cooler.html>]

About this time last year, Dr. Phil Jones, head of the Climatic Research Unit of East Anglia University in Britain, predicted 2007 would be the
warmest year on record. It didn't turn out that way. 2007 was only the 9th warmest year since global temperature readings were first made in
1861. 2007 was also the coldest year of this century, noted Czech physicist Lubos Motl. Both global warming alarmists like Dr. Jones and
skeptics like Dr. Motl forecast that this year will be slightly cooler than last year. If so, that means it will be a decade since the high water
mark in global temperature was set in 1998. And the trend line is down. Average global temperature in 2007 was lower than for 2006, 2005, 2004,
2003, 2002 and 2001. November of last year was the coldest month since January of 2000, and December was colder still. "Global warming has stopped,"
said David Whitehouse, former science editor for the BBC. "It's not a viewpoint or a skeptic's inaccuracy. It's an observational
fact." But observational fact matters little to global warming alarmists, particularly to those in the news media. "In 2008, your
television will bring you image after image of natural havoc linked to global warming," said John Tierney, who writes a science
column for the New York Times. "You will be told that such bizarre weather must be a sign of dangerous climate change -- and
that these images are a mere preview of what's in store unless we act quickly to cool the planet." "Global warming can mean colder, it can
mean drier, it can mean wetter," said Steven Guibeault of Greenpeace. There is no dispute among scientists that the planet warmed about 0.3 degrees Celsius
between 1980 and 1998. What is in dispute is what caused the warming, and whether it will continue. The alarmists say the warming
was caused chiefly by emissions of carbon dioxide from our automobiles and factories, and that, consequently, it will continue at an ever increasing
rate unless we humans change our behavior. The skeptics say the warming trend was caused chiefly by natural cycles, and that it is at or near its end.
"The earth is at the peak of one of its passing warm spells," said Dr. Oleg Sorokhtin of the Russian Academy of Natural Sciences.
It'll start getting cold by 2012, and really, really cold around 2041, he predicts. The news media promote global warming
alarmism through selective reporting. Dr. Roger Pielke of the University of Colorado noted that a paper published in an obscure
scientific journal that argued there was a link between hurricanes and global warming generated 79 news articles, while a paper
that debunked the connection published in a far more prestigious journal generated only three. "When the Arctic sea ice last year
hit the lowest level ever recorded by satellites, it was big news and heralded as a sign the planet was warming," Mr. Tierney wrote. "When the
Antarctic sea ice last year reached the highest level ever recorded by satellites, it was pretty much ignored."Two studies published
last year which indicated the melting of Arctic sea ice was due more to cyclical changes in ocean currents and winds than to planetary warming also attracted little
attention, Mr. Tierney noted. And though the record melting of Arctic sea ice this summer was widely reported, the record growth of Arctic sea ice this fall (58,000
square miles of ice each day for 10 straight days) was not. More than 400 scientists -- many of them members of the UN's Intergovernmental Panel
on Climate Change -- challenge the claims of the leading global warming alarmist, former Vice President and now Nobel laureate Al Gore, said a
report issued by the Republicans on the U.S. Senate's Environment and Public Works committee last month. Kailee Kreider, a spokeswoman for Mr. Gore, said
their criticisms should be discounted because 25 or 30 of the scientists may have received funding from the Exxon Mobil Corp. It's Mr. Gore who is the
crook, says French physicist Claude Allegre in a new book. He's made millions in an eco-business based on phony science, Dr.
Allegre charges. Mr. Gore isn't alone, says Weather Channel founder John Coleman: "Some dastardly scientists with environmental and political
motives manipulated long term scientific data to create an illusion of rapid global warming," Mr. Coleman wrote. "Their friends in
government steered huge research grants their way to keep the movement going...In time, in a decade or two, the outrageous
scam will be obvious."

Warming Frontline
3. No credible evidence proves that warming is anthropogenic [anthropogenic = human caused]
Singer 2007 distinguished research professor at George Mason and Avery, director of the Center for Global Food Issues
at the Hudson Institute
(S. Fred, Dennis T, “Unstoppable Global Warming: Every 1,500 Years” Pages 7-8)

The Earth has recently been warming. This is beyond doubt. It has warmed slowly and erratically-for a total of about 0.8 degrees Celsius-since 1850.
It had one surge of warming from 1850 to 1870 and another from 1920 to 1940. However, when we correct the thermometer records for the effects of
growing urban heat islands and widespread intensification of land use, and for the recently documented cooling of the Antarctic
continent over the past thirty years, overall world temperatures today are only modestly warmer than they were in 1940, despite a
major increase in human CO2 emissions.
The real question is not whether the Earth is warming but why and by how much. We have a large faction of intensely interested
persons who say the warming is man-made, and dangerous. They say it is driven by releases of greenhouse gases such as CO2 from
power plants and autos, and methane from rice paddies and cattle herds. The activists tell us that modern society will destroy the planet; that unless we radically
change human energy production and consumption, the globe will become too warm for farming and the survival of wild species. They warn that the polar ice caps
could melt, raising sea levels and flooding many of the world's most important cities and farming regions. However, they don't have much evidence to
support their position-only (1) the fact that the Earth is warming, (2) a theory that doesn't explain the warming of the past 150
years very well, and (3) some unverified computer models. Moreover, their credibility is seriously weakened by the fact that many
of them have long believed modern technology should be discarded whether the Earth is warming too fast or not at all.
Many scientists - though by no means all - agree that increased CO2 emissions could be dangerous. However, polls of climate-qualified scientists show that many
doubt the scary predictions of the global computer models. This book cites the work of many hundreds of researchers, authors, and coauthors whose work testifies
to the 1,500-year cycle. There is no "scientific consensus," as global warming advocates often claim. Nor is consensus important to
science. Galileo may have been the only man of his day who believed the Earth revolved around the sun, but he was right! Science is the process of developing
theories and testing them against observations until they are proven true or false.
If we can find proof, not just that the Earth is warming, but that it is warming to dangerous levels due to human-emitted
greenhouse gases, public policy will then have to evaluate such potential remedies as banning autos and air conditioners. So far,
we have no such evidence. If the warming is natural and unstoppable, then public policy must focus instead on adaptations-such as
more efficient air conditioning and building dikes around low-lying areas like Bangladesh. We have the warming. Now we must
ascertain its cause.

Warming frontline
4. Best Data Proves Global Warming happens every 1500 years regardless of human activity
SINGER 07 distinguished research professor at George Mason and Avery, director of the Center for Global Food Issues
at the Hudson Institute
(S. Fred, Dennis T, “Unstoppable Global Warming: Every 1,500 Years” Pages 1-5)

The Earth is warming but physical evidence from around the world tells us that human-emitted CO2 (carbon dioxide) has played only
a minor role in it. Instead, the mild warming seems to be part of a natural 1,500-year climate cycle (plus or minus 500 years) that goes
back at least one million years. The cycle has been too long and too moderate for primitive peoples lacking thermometers to recount in
their oral histories. But written evidence of climatic change does exist. The Romans had recorded a warming from about 200 B.C.
to A.D. 600, registered mainly in the northward advance of grape growing in both Italy and Britain. Histories from both Europe and Asia tell us there
was a Medieval Warming that lasted from about 900 to 1300; this period was also known as the Medieval Climate Optimum because of its mild winters,
stable seasons, and lack of severe storms. Human histories also record the Little Ice Age, which lasted from about 1300 to 1850. But people
thought each of these climatic shifts was a distinct event and not part of a continuing pattern.
This began to change in 1984 when Willi Dansgaard of Denmark and Hans Oeschger of Switzerland published their analysis of the oxygen isotopes in the first
ice cores extracted from Greenland.' These cores provided 250,000 years of the Earth's climate history in one set of "documents." The
scientists compared the ratio of "heavy" oxygen-18 isotopes to the "lighter" oxygen-16 isotopes, which indicated the temperature
at the time the snow had fallen. They expected to find evidence of the known 90,000-year ice ages and the mild interglacial periods
recorded in the ice, and they did. However, they did not expect to find anything in between. To their surprise, they found a clear
cycle—moderate, albeit abrupt—occurring about every 2,550 years running persistently through both. (This period would soon be reassessed at 1,500 years, plus
or minus 500 years).
By the mid-1980s, however, the First World had already convinced itself of the Greenhouse Theory and believed that puny human
industries had grown powerful enough to change the planet's climate. There was little media interest in the frozen findings of obscure, parka-clad
Ph.D.s in far-off Greenland. A wealth of other evidence has emerged since 1984, however, corroborating Dansgaard and Oeschger's natural
1,500-year climate cycle: • An ice core from the Antarctic's Vostok Glacier-at the other end of the world from Iceland-was brought up in 1987
and showed the same 1,500-year climate cycle throughout its 400,000-year length. • The ice-core findings correlate with known advances and retreats in the
glaciers of the Arctic, Europe, Asia, North America, Latin America, New Zealand, and the Antarctic. • The 1,500-year cycle has been
revealed in seabed sediment cores brought up from the floors of such far-flung waters as the North Atlantic Ocean and the Sargasso Sea, the South Atlantic
Ocean and the Arabian Sea. • Cave stalagmites from Ireland and Germany in the Northern Hemisphere to South Africa and New Zealand
in the Southern Hemisphere show evidence of the Modern Warming, the Little Ice Age, the Medieval Warming, the Dark Ages, the Roman Warming, and the
unnamed cold period before the Roman Warming. • Fossilized pollen from across North America shows nine complete reorganizations of our trees and
plants in the last 14,000 years, or one every
1,650 years. • In both Europe and South America, archaeologists have evidence that prehistoric humans moved their homes and farms
up mountainsides during the warming centuries and retreated back down during the cold ones. The Earth continually warms and
cools. The cycle is undeniable, ancient, often abrupt, and global. It is also unstoppable. Isotopes in the ice and sediment cores,
ancient tree rings, and stalagmites tell us it is linked to small changes in the irradiance of the sun. The temperature change is moderate.
Temperatures at the latitude of New York and Paris moved about 2 degrees Celsius above the long-term mean during warmings, with increases of 3 degrees or more
in the polar latitudes. During the cold phases of the cycle, temperatures dropped by similar amounts below the mean. Temperatures change little in lands at the
equator, but rainfall often does.
The cycle shifts have occurred roughly on schedule whether CO2 levels were high or low. Based on this 1,500 year-cycle, the
Earth is about 150 years into a moderate Modern Warming that will last for centuries longer. It will essentially restore the fine climate of the
Medieval Climate Optimum.
The climate has been most stable during the warming phases. The "little ice ages" have been beset by more floods, droughts,
famines, and storminess. Yet, despite all of this evidence, millions of well-educated people, many scientists, many respected
organizations-even the national governments of major First World nations-are telling us that the Earth's current warming phase is
caused by human-emitted CO2 and deadly dangerous, They ask society to renounce most of its use of fossil fuel-generated energy
and accept radical reductions in food production, health technologies, and standards of living to "save the planet."
We have missed the predictive power of the 1,500-year climate cycle. Will the fear of dangerous global warming lead society to accept draconian restrictions on the
use of fertilizers, cars, and air conditioners?

Warming – ext 1 – slow
Runaway warming impossible—oceans prevent excess warming
Junk Science, 2008
(“The curious incident of the added heat at the surface.” http://junkscience.com/Greenhouse/forcing.html)

Additionally, this form introduces another layer of complexity, that of oceanic absorption. Bear in mind that every 10 meters of
water column is equivalent to one entire atmosphere (10 cubic meters of water has a mass of 10,000 Kg), meaning that the oceans
are an enormous heat sink. There is a theory that we can not find atmospheric warming because the oceans are absorbing it and
300 atmosphere's worth of oceans make the temperature change far too small to measure. Now, we have no specific problem with
the possibility that Earth's warmth is distributed through the oceans as well as the atmosphere. Our response, however, remains the
same. If additional or "excess" warmth is being spread over so many more atmospheres, at least 300 atmospheres' worth of oceans,
then we are looking at as little as one-third of one percent of estimated warming to achieve equilibrium temperature with enhanced
greenhouse forcing. This would make the IPCC's touted 1.5-6 °C atmospheric warming an immeasurably small 0.005-0.02 °C for a
doubling of pre-Industrial atmospheric carbon dioxide -- not a particularly worrisome prospect. So, recent data acquisition fails to
show warming in the top 750 meters of the oceans (equivalent to 75 atmospheres) but there is a suggestion of warming in the deep ocean (below
1,000 meters, although historic data is sparse, to say the least -- the warming of so much of the ocean would be so small from enhanced
greenhouse that the figures are of little relevance here). We are providing a field for you to select ocean depth to disperse additional forcing so
you can see the effect ocean absorption has. As an exercise try maxing out the atmospheric carbon dioxide at 1200 ppmv (four times pre-IR
levels) and share the additional Joules through the full allowable 3,000 meters of ocean depth and see that it would take more than 100 years to
raise the temperature of the system just 1 °C. If the assertions that heat is being added to the system at the claimed rate but we can not
detect it because it is being "hidden" by dispersal in the oceans then again we are unconcerned -- distributing the additional heat
through so many more atmospheres' worth of heat sink makes mean warming trivial.
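
For reference, here is a rough sketch of the arithmetic this card relies on. It assumes, as the card does, that about 3,000 meters of ocean has the heat capacity of roughly 300 atmospheres (10 meters of water per atmosphere-equivalent) and that projected warming is simply diluted across that larger heat sink:

\[
\frac{3000\ \text{m of ocean}}{10\ \text{m per atmosphere-equivalent}} \approx 300
\quad\Rightarrow\quad
\frac{1.5\ ^{\circ}\text{C}}{300} = 0.005\ ^{\circ}\text{C},
\qquad
\frac{6\ ^{\circ}\text{C}}{300} = 0.02\ ^{\circ}\text{C}
\]

On those assumptions, the IPCC's projected 1.5-6 °C of atmospheric warming becomes the 0.005-0.02 °C figure the card cites; this only restates the card's own reasoning, not an independent estimate.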

Warming – ext 3 – not human caused
Mars proves solar changes are responsible for global warming.
National Post, 2007
(Lawrence Solomon, staff writer, February 7, “Look to Mars for the Truth on Global Warming” http://www.nationalpost.com/story.html?id=edae9952-3c3e-47ba-913f-
7359a5c7f723&k=0/)

Climate change is a much, much bigger issue than the public, politicians, and even the most alarmed environmentalists realize. Global
warming extends to
Mars, where the polar ice cap is shrinking, where deep gullies in the landscape are now laid bare, and where the climate is the
warmest it has been in decades or centuries.
"One explanation could be that Mars is just coming out of an ice age," NASA scientist William Feldman speculated after the agency's Mars Odyssey completed its
first Martian year of data collection. "In some low-latitude areas, the ice has already dissipated." With each passing year more and more evidence arises of the
dramatic changes occurring on the only planet in the solar system, apart from Earth, to give up its climate secrets.
NASA's findings in space come as no surprise to Dr. Habibullo Abdussamatov at Saint Petersburg's Pulkovo Astronomical
Observatory. Pulkovo -- at the pinnacle of Russia's space-oriented scientific establishment -- is one of the world's best equipped
observatories and has been since its founding in 1839. Heading Pulkovo's space research laboratory is Dr. Abdussamatov, one of the
world's chief critics of the theory that man-made carbon dioxide emissions create a greenhouse effect, leading to global warming.
"Mars has global warming, but without a greenhouse and without the participation of Martians," he told me. "These parallel global
warmings -- observed simultaneously on Mars and on Earth -- can only be a straightline consequence of the effect of the one same
factor: a long-time change in solar irradiance."
The sun's increased irradiance over the last century, not CO2 emissions, is responsible for the global warming we're seeing, says the
celebrated scientist, and this solar irradiance also explains the great volume of CO2 emissions.
"It is no secret that increased solar irradiance warms Earth's oceans, which then triggers the emission of large amounts of carbon
dioxide into the atmosphere. So the common view that man's industrial activity is a deciding factor in global warming has emerged
from a misinterpretation of cause and effect relations."
Dr. Abdussamatov goes further, debunking the very notion of a greenhouse effect. "Ascribing 'greenhouse' effect properties to the
Earth's atmosphere is not scientifically substantiated," he maintains. "Heated greenhouse gases, which become lighter as a result of
expansion, ascend to the atmosphere only to give the absorbed heat away."

Climate record proves that CO2 does not cause warming


Lewis, Institute of Economic Affairs, 2007
(Richard, Global Warming False Alarms, www.globalwarminghype.com/upld-book403pdf_.pdf)

The cornerstone of the global warming theory is that the CO2 content of the atmosphere in the pre-industrial period at 280 parts
per million by volume (ppmv) was over 25 per cent lower than the 370 ppmv of today. It has however been claimed by Professor
Zbigniew Jaworowski of Warsaw University, who has been involved in glacier studies for 40 years, that the figure for the 19th
century is wrong. It is based on the analysis of greenhouse gases in ice cores from Greenland and Antarctica. The flaws in this
evidence, he says, are as follows: First there are chemical and physical processes, which have taken place within the ice cores
which decrease the concentrations of all greenhouse gases they contain. It appears that there are leaks of these gases from the ice
cores into the drilling liquid used in the boreholes and through cracks in the ice sheeting into the atmosphere. Second, there has
been manipulation of the data and biased interpretation of it. In any case meticulous analysis of the abundant 19th century
measurements of CO2 shows that its average atmospheric concentration before 1900 was 335 ppmv. Further recent work on tree
leaves, the frequency of the pores in the skin of which provide an accurate means of measuring CO2 density in the atmosphere on
a scale of centuries, show that the concentration nearly 10,000 years ago was 348 ppmv, or about the same as in 1987. A study by
Dutch scientists of Holocene era deposits in Denmark, (to which Professor Jaworowski referred in his statement to the US Senate
Committee on Commerce, Science and Transport) thus discredited the much–touted ice core estimates. The authors of it stated
bluntly “Our results contradict the concept of relatively stabilised Holocene CO2 concentrations of 270 to 280 ppmv until the
industrial revolution”. . Their tree leaf studies confirm earlier criticism of the ice core research and demolish the very basis of the
global warming case. To put the whole matter in a long-term context it is worth pointing out that fifty
million years ago the CO2 concentration of 2000 ppmv was almost six times higher than it is today but
the air temperature was only 1.5 degrees higher.

Warming – ext 3 – not human caused
Cow Farts Are the True Cause of Global Warming
The Los Angeles Times 2006
(http://www.latimes.com/news/opinion/editorials/la-ed-methane15oct15,0,7911841.story)

It's a silent but deadly source of greenhouse gases that contributes more to global warming than the entire world transportation
sector, yet politicians almost never discuss it, and environmental lobbyists and other green activist groups seem unaware of its
existence.
That may be because it's tough to take cow flatulence seriously. But livestock emissions are no joke.
Most of the national debate about global warming centers on carbon dioxide, the world's most abundant greenhouse gas, and its major sources -- fossil fuels.
Seldom mentioned is that cows and other ruminants, such as sheep and goats, are walking gas factories that take in fodder and put out
methane and nitrous oxide, two greenhouse gases that are far more efficient at trapping heat than carbon dioxide. Methane, with 21
times the warming potential of CO2, comes from both ends of a cow, but mostly the front. Frat boys have nothing on bovines, as it's estimated
that a single cow can belch out anywhere from 25 to 130 gallons of methane a day.
It isn't just the gas they pass that makes livestock troublesome. A report from the United Nations Food and Agriculture Organization identified livestock as one of
the two or three top contributors to the world's most serious environmental problems, including water pollution and species loss. In terms of climate change,
livestock are a threat not only because of the gases coming from their stomachs and manure but because of deforestation, as land is
cleared to make way for pastures, and the amount of energy needed to produce the crops that feed the animals.
All told, livestock are responsible for 18% of greenhouse-gas emissions worldwide, according to the U.N. -- more than all
the planes, trains and automobiles on the planet. And it's going to get a lot worse. As living standards rise in the developing
world, so does its fondness for meat and dairy. Annual per-capita meat consumption in developing countries doubled from 31
pounds in 1980 to 62 pounds in 2002, according to the Food and Agriculture Organization, which expects global meat production to more than double by 2050.
That means the environmental damage of ranching would have to be cut in half just to keep emissions at their current,
dangerous level.

Volcanoes and El Nino are the true cause of warming


Carter 07 paleontologist, stratigrapher, marine geologist and environmental scientist, PhD from Cambridge
University
(Robert M., “The Myth of Dangerous Human-Caused Climate Change”, the AusIMM New Leaders’ Conference, May 2-3)

Both the eight and 100 year-long intervals of temperature change are too short to carry statistical significance regarding long-term climate change. However,
though the last 100 years of temperature record has only limited climatic significance (for instance, representing only three climate normal datapoints), it
is nonetheless important because it corresponds to the span of instrumental meteorological records from the earth’s surface. Accepting the 1860 - 2006
temperature record used by the IPCC (2007; Climate Research Unit, University of East Anglia) as a best measure, we find that there has been
no significant increase in surface global temperature since the peak El Nino year of 1998 (Figure 8). This result is confirmed by the
two most reliable records of average tropospheric temperature, drawn from weather balloon radiosondes (since 1958) and satellite-
mounted microwave sounding units (MSU; since 1979). Of all these datasets, the MSU record is accepted to be the most accurate
and globally representative. Once the effects of El Nino warmings and volcanic coolings are allowed for, this record shows
no significant warming since its inception in 1979 (Gray, 2006) (Figure 9). This conclusion is robust. Though several other global temperature datasets
exist, and though the MSU record has been subject to repeated corrections in interpretation, none of the available datasets document significant
recent greenhouse warming.
The global temperature stasis between 1998 and 2006 occurred despite continuing rises in atmospheric carbon dioxide over that
period. Consistent with this, Karner (2002) showed from an analysis of global temperature series that:
… antipersistence in the lower tropospheric temperature increments does not support the science of global warming developed by IPCC. Negative long-range
correlation of increments during the last 22 years means that negative feedback has been dominating in the Earth climate system during the period.
These facts, and the lack of a discernable human greenhouse effect in late 20th century temperature records, are consistent with Khilyuk and Chilingar’s (2006)
estimate that the human greenhouse forcing is four to five orders of magnitude less than the major natural forcing agents.
In summary, the slope and magnitude of temperature trends inferred from time-series data depend upon the choice of data end points. Drawing trend lines through
highly variable, cyclic temperature data or proxy data is therefore a dubious exercise. Accurate direct measurements of tropospheric global average temperature
have only been available since 1979, and they show no evidence for greenhouse warming. Surface thermometer data, though flawed, also show temperature stasis
since 1998. This pattern is not what is portrayed in the daily news media.

Warming – ext 4 – inevitable
Even if all global emissions stopped today – past emissions make warming inevitable
Nicholas Stern—Head of the British Government Economic Service—2007
(Former Head Economist for the World Bank, I.G. Patel Chair at the London School of Economics and Political Science, “The Economics of Climate Change: The Stern Review”,
The report of a team commissioned by the British Government to study the economics of climate change led by Siobhan Peters, Head of G8 and International Climate Change
Policy Unit, Cambridge University Press, p. 11-13)

Additional warming is already in the pipeline due to past and present emissions. The full warming effect of past emissions is yet to be realised.
Observations show that the oceans have taken up around 84% of the total heating of the Earth’s system over the last 40 years36. If global
emissions were stopped today, some of this heat would be exchanged with the atmosphere as the system came back into
equilibrium, causing an additional warming. Climate models project that the world is committed to a further warming of 0.5° - 1 °C over
several decades due to past emissions37. This warming is smaller than the warming expected if concentrations were stabilised at 430 ppm CO2e, because
atmospheric aerosols mask a proportion of the current warming effect of greenhouse gases. Aerosols remain in the atmosphere for only a few weeks and are not
expected to be present in significant levels at stabilisation38. If annual emissions continued at today’s levels, greenhouse gas levels would be close to double pre-
industrial levels by the middle of the century. If this concentration were sustained, temperatures are projected to eventually rise by 2 – 5ºC or even
higher. Projections of future warming depend on projections of global emissions (discussed in chapter 7). If annual emissions were to remain at today’s levels,
greenhouse gas levels would reach close to 550 ppm CO2e by 2050.39 Using the lower and upper 90% confidence bounds based on the IPCC TAR range and
recent research from the Hadley Centre, this would commit the world to a warming of around 2 – 5°C (Table 1.1). As demonstrated in Box 1.2, these two climate
sensitivity distributions lie close to the centre of recent projections and are used throughout this Review to give illustrative temperature projections. Positive
feedbacks, such as methane emissions from permafrost, could drive temperatures even higher. Near the middle of this range of warming (around 2 – 3°C above
today), the Earth would reach a temperature not seen since the middle Pliocene, around 3 million years ago. This level of warming on a global scale is far outside the
experience of human civilisation. However, these are conservative estimates of the expected warming, because in the absence of an effective climate policy,
changes in land use and the growth in population and energy consumption around the world will drive greenhouse gas emissions far higher than today. This would
lead greenhouse gas levels to attain higher levels than suggested above. The IPCC projects that without intervention greenhouse gas levels will rise to 550 – 700
ppm CO2e by 2050 and 650 – 1200 ppm CO2e by 2100. These projections and others are discussed in Chapter 7, which concludes that, without mitigation,
greenhouse gas levels are likely to be towards the upper end of these ranges. If greenhouse gas levels were to reach 1000 ppm, more than treble pre-industrial
levels, the Earth would be committed to around 3 – 10°C of warming or more, even without considering the risk of positive feedbacks (Table 1.1).

Warming – A2 hurts Oceans
1) Human population growth makes ocean biodiversity loss irreversible
The Advertiser, March 23, 1999 (Lexis)
By far the greatest pressure on biodiversity is the demand the growing human population places on the oceans. Marine ecosystems have
been modified and biodiversity lost through the clearing of native vegetation, the introduction of exotic species, pollution and climate change. For example, 5000
million litres of Sydney sewage which has only received primary treatment is discharged into the ocean each day. This is the
equivalent of 2000 Olympic swimming pools full of sewage being pumped into the ocean 365 days of the year.

2) Russian and Chinese pollution kills ocean species


The Times Union, March 3, 1996 (E1)
Nowhere in the world have seas been subjected to abuse as they were in the former Soviet Union. The once-bountiful Black Sea has
been badly contaminated by sewage, fertilizer and chemical wastes pouring into its waters from rivers and by oil spills and direct dumping. The sea is, in
effect, choking to death. It is, says the Russian news service Tass, on the brink of extinction. Another disaster area is the Pacific coast of China
where the rapid growth of cities and factories with little or no concern for the environment has caused alarming levels of contamination in
the Yellow, East China and South China seas.

3) Damage to forest watersheds makes damage to ocean species inevitable


Science, September 21, 2001 (No. 5538, Vol. 293, p. 2207)
The Amazon, the Congo, and rivers in Southeast Asia hold almost half the world's freshwater fish species. Their fates depend on the
surrounding forest watersheds. Elsewhere, most accessible rivers are dammed and channeled (9), causing their faunas to be more threatened than terrestrial
ones (10). Diversion of water for irrigation threatens ecosystems, such as the Mesa Central (Mexico) and the Aral Sea and its rivers (Central Asia).
Irrigation projects are often economic disasters (11, 12), as salt accumulation quickly destroys soil fertility (13).

4) Developing countries will continue to exploit ecosystems


Science, September 21, 2001 (No. 5538, Vol. 293, p. 2207)
The pressures to destroy ecosystems are often external (20). For example, the World Bank and the International Monetary Fund have indirectly
encouraged governments to deplete their natural resources to pay off debt (21). Even when available, some countries may view foreign purchase
of conservation concessions as imperialism in a 21st-century guise. Almost all the hotspots were European colonies; one is still French territory (3).
Some countries have unstable government, and others are at war.

5) Coastal eutrophication will cause ocean species loss


Bioscience, December 1, 2000 (No. 12, Vol. 50, p. 1108)
Diaz and Rosenberg (1995) list at least 23
areas documented in the scientific literature as suffering increasingly from severe oxygen stress resulting
from coastal eutrophication. Areas periodically oxygen-stressed are often quite large, covering hundreds to thousands of square kilometers (e.g.,
250 [km.sup.2] in the Gulf of Trieste, approximately 3000 [km.sup.2] in the Kattegat between Denmark and Sweden, and 8000-9500 [km.sup.2] on the Louisiana
shelf). Many of these areas appear to be near a threshold at which further oxygen depletion will yield catastrophic benthic mortality, loss of seafloor
biodiversity (Figure 1), and alteration of food webs leading from the sediments to crustacean and finfish fisheries above the SWI (de Jonge et al. 1994). It
is important to note that because Diaz and Rosenberg (1995) have focused on oxygen-stressed sites reported in scientific journals, their list clearly represents
the tip of the hypoxic/anoxic iceberg. To quote these authors: "There is no other environmental parameter of such ecological importance to coastal marine
ecosystems that has changed so drastically in such a short period of time as dissolved oxygen.... If we do not move quickly to reduce or stop the primary cause of
low oxygen, the decomposition of excess primary production associated with eutrophication, then the productivity structure of our major estuarine and coastal areas
will be permanently altered."

6) Ozone depletion will damage ocean ecosystems


Kieran Mulvaney, Editor of Ocean Update, January 11, 1998 (E, No. 1, Vol. 9, p. 28)
In addition, there is growing evidence that increased levels of UV-B radiation as a result of ozone depletion may be harming marine
species, particularly those in the upper layers of the sea. Numerous studies have shown, for example, that increased UV-B can cause death, decreased
reproductive capacity, reduced survival and impaired larval development in some of the plankton species that form the basis of the marine food chain.

Warming – A2 hurts agriculture
Warming will not result in famine: 5 reasons (long card)
Singer, distinguished research professor at George Mason and Avery, director of the Center for Global Food Issues
at the Hudson Institute, 2007
(S. Fred, Dennis T, “Unstoppable Global Warming: Every 1,500 Years” Pages 120-124)

FIVE REASONS NOT TO FEAR FAMINE DURING GLOBAL WARMING


First: Lessons of History
Human food production, historically, has prospered during the global warmings. We have seen in the earlier chapters the
flourishing of human society during the Roman Warming and the Medieval Warming. Food production increased during previous
historic warmings primarily because warming climates provided more of the things plants love: sunlight, rainfall, and longer
growing seasons. During warmings there are also less of the things plants hate: late spring frosts and early fall frosts that shorten
the growing season, and hailstorms that destroy fields of crops. Jorgen Olesen of the Danish Institute of Agricultural Sciences predicts that Europe's
overall food production will increase with warming, even though some southern European regions will have crops reduced by aridity.

Second: What Science Says about Food and the Modern Warming
Sunshine: Richard Willson of Columbia University (and NASA) has measured an increase in the sun's radiance of 0.05 percent per decade for the past two decades. He says the upward trend in sunlight may well have been going on longer than that. Earlier, we didn't have the precision instruments to measure that small but vital trend, but every bit of it encourages the growth of food crops. The increased temperatures of the Modern Warming may have some
negative impact on crops in the southern mid-latitudes-through drier summers, for example-in places such as southern Romania, Spain, and
Texas. At the same time, however, stronger sunlight will importantly increase the productivity of farmland in the northern mid-latitudes,
such as Germany, Canada, and Russia. The increased food production in the very extensive northern plains would far outweigh the negative impact of slightly more
arid conditions in the southern mid-latitudes.
Rainfall: Increased heat means more precipitation, as more moisture evaporates from the oceans and then falls as rain or snow.
NASA says global rainfall increased 2 percent in the twentieth century compared with the tail end of the Little Ice Age in the nineteenth century. Most of the increased moisture fell in the mid- and high-latitudes where much of the world's most productive cropland is located. We can expect this to continue through the Modern Warming.
Higher CO2 Levels: Another reason food production has tended to increase during the past 150 years is that CO2 levels in the
atmosphere have increased. The oceans give up CO2 when they warm. The increased CO2 not only fertilizes the plants, but enables
them to use water more efficiently.
Researchers at the U.S. Department of Agriculture in 1997 grew wheat in a long plastic tunnel, varying the CO2 levels for the grain plants from the Ice Age CO2
level of about 200 parts per million (ppm) at one end of the tunnel to the late-1980s level of 350 ppm at the other.
The findings? An extra 100 ppm of CO2 increased the wheat production by 72 percent under well-watered conditions, and by 48
percent under semidrought conditions. That meant an average crop yield gain of 60 percent. These results are consistent with a
wide variety of CO2 enrichment studies done in more than a dozen countries on many different crops.

Third: Farming Technology


Human food production today depends far more on farming technology than on modest climate changes. We are no more doomed
to famine by the Modern Warming than we are doomed to malaria in the era of pesticides and window screens. In fact, the food
abundance the world has increasingly enjoyed since the eighteenth century is primarily due to scientific and technological
advances.
In 1500, Britain could feed less than one million people. By 1850, thanks to knowledge of crop rotations and improved farm machines such as the seeder and
reaper, Britain fed more than 16 million people. Today, Britain has nearly 60 million people, fed mainly from its own fields.
Today's "Climate-Secure" Agriculture
Industrial nitrogen fertilizer is one of the biggest farming advances in human history. Before 1908, farmers could only maintain their soil nutrient levels by adding
livestock manure or by growing more green-manure crops, such as clover. Both of those strategies require lots of land. In 1908, however, the Haber-Bosch Process
began taking nitrogen from the air, which is 78 percent nitrogen. Today's farmers apply about 80 million tons of industrial nitrogen per year to maintain their soils'
fertility, and it doesn't cost a single acre of land.
To get 80 million tons per year of nitrogen from cattle manure, the world would require nearly eight billion additional cattle, plus five acres or so of forage land per
beast. We'd thus have to eliminate half the people, clear all the forests, or use some combination of those strategies.
The Green Revolution of the 1960s tripled the crop yields across Europe and much of the Third World.
• More powerful seeds, many of them with resistance to drought and pests, made better use of the complete roster of plant nutrients
(nitrogen, phosphate, and potash-plus twenty-six trace mineral elements) that soil-testing modern farmers apply to their fields.
• Irrigation assures ample moisture, often even in semiarid areas.
• Insecticides and fungicides protect the high yields of the crops both during the growing season and in storage.
In America, where high-yield farming started earlier, diaries of early settlers in Virginia's Shenandoah Valley indicate that wheat yields around 1800 were only six
to seven bushels per acre. The valley's farmers today often get ten times that yield. U.S. corn yields by the 1920s had risen to about twenty-five bushels per acre. Today, the national
average is more than 140 bushels, and still rising. The same story of soaring yields and more certain harvests is playing out today over most of the world.
The African Exception
Africa is the only place in the world where per capita food production has not been increasing in recent decades. Africa's food production has been severely hampered by its ancient soils,
frequent droughts, and abundant insects and diseases. There has also been a lack of adequate research for its specific soils, microclimates, and pests-and an equally damaging lack of stable
governance and infrastructure on that continent.
Two recent research developments are now particularly helpful for Africa.
• Quality-protein (QP) maize, bred in Mexico's International Maize and Wheat Improvement Center, not only has higher yields but also provides more lysine and tryptophan, two amino acids
that are critical for human nutrition but are lacking in most corn varieties. The QP maize is able by itself to cure many African children of malnutrition.
• Rice breeders have successfully wide-crossed the African native rice species with Asian rice varieties, to create a family of more vigorous and higher-yield new rice varieties.
More such breakthroughs for Africa's farmers can be expected if more research investments are made for and in that continent. Better roads and bridges (and better national security)
would also make farm inputs less expensive and higher crop yields more marketable no matter what happens to its climate. Today's high-yield agriculture is also the most sustainable in history, thanks to fertilizers, soil testing, and a twentieth-century farming system called "conservation tillage." Conservation tillage controls weeds with cover crops and chemical herbicides instead of by plowing, which invites soil erosion. The conservation farmer just discs up the top two or three inches of topsoil along with the stalks and residue from the previous crop. This process creates trillions of tiny dams that prevent wind or water erosion. The little dams also encourage water to infiltrate the root zone of the field, instead of running off into the nearest stream.
Conservation tillage cuts soil erosion by 65 to 95 percent and often doubles the soil moisture in the field. It encourages far more soil bacteria and earthworms, both because of the constant
heavy supply of crop residues for them to eat and because they hate being plowed, as they are in conventional and organic farmers' fields.
Through the expanded use of conservation tillage across the United States, Canada, South America, Australia and, most recently, South Asia, hundreds of millions of acres are now
sustainably more productive than ever before in history.
Another fruitful use of technology and increased sustainability will be more efficient irrigation. Primitive flood irrigation systems in the Third World use water at less than 40 percent
efficiency. Center-pivot irrigation systems with trailing plastic tubes to deliver water right at the roots (minimizing evaporation) and computer-controlled to apply just the right amount of
moisture to each part of the field, can approach 90 percent water efficiency. World farmers currently use about 70 percent of the fresh water humanity "uses up." As water becomes more
valuable, the capital investments in high-efficiency irrigation systems will be justified.
Fourth: The Future and Biotechnology
Today's crop yields are the product of more than two hundred years of conventional trial-and-error science. But, by 2050 the world
will have some seven billion affluent humans demanding the high-quality diets that only about one billion people are able to afford
today. We'll also have to feed far more pets.
That means world food demand will more than double, and we're already farming half of the Earth's available land. Additional
sources of higher crop and livestock yields will be needed. The world is already using plant breeding, fertilizers, irrigation, and
pesticides. However, the world is only beginning to use biotechnology, our new-found understanding of Nature's genetic codes.
The first broad application of biotechnology in agriculture has been to make plants tolerant of synthetic herbicides, so we could
use the environmentally safest herbicides to protect our crops more effectively from weed competition. As a result we have
somewhat raised crop yields and lowered food costs in many countries.
It also happens that one of Africa's worst endemic pests is a parasitic weed called witchweed. It invades corn and sorghum plants through their roots, and the farmer
never knows it's there until his crop stalk suddenly sprouts a bright red witchweed flower instead of an ear of grain.
Genetically engineered herbicide-tolerant seeds could have solved the problem. With the seed soaked in herbicide, the witchweed would have been killed as it
invaded the plant roots, and the grain would have thrived. Unfortunately, activists and European governments threatened retaliation against any African government
that allowed the planting of biotech-modified crops.
Now, researchers have done a genetically researched end run around the biotech Luddites. Pioneer Hi-Bred identified corn seeds with a natural tolerance for the
herbicide imazopyr, and donated the germ plasm to the International Maize and Wheat Improvement Center (CIMMYT) in Mexico. CIMMYT, in turn, has bred the
herbicide tolerance into African corn varieties. Corn yields are four times as high. The technology is low cost and easy for even Africa's small farms to use.
Biotechnology (BT) has also allowed plant researchers to put an ultra-safe natural insecticide found in soils into such crop plants as corn and cotton. Because of
these pest-resistant plants, millions of pounds of pesticide no longer have to be sprayed into the environment or pose hazards to beneficial insects. BT cotton and
corn are being planted by millions of small farmers, especially in China and India.
An important second-generation benefit of biotechnology is finding wild natural genes that can improve our crop plants. We
already have one such important breakthrough. Plant explorers nearly fifty years ago found a relative of the wild potato that was
resistant to the infamous late blight virus that caused the Irish potato famine in the 1840s. Unfortunately, plant breeders were never able to
successfully cross that blight resistance gene into an edible, productive domestic potato. Now, three different universities have spliced the blight resistance gene
into new potato varieties. This will be especially important for densely populated parts of Asia and Africa (such as Rwanda and Bangladesh) that have become more
dependent on the potato's ability to produce more food per acre than any other crop.
Black Sigatoka, a new bacterial disease of bananas and plantains (important staples in much of Africa) has been spreading worldwide. Unfortunately, bananas are
especially difficult to cross-breed. Fortunately, biotechnology has now produced plants resistant to Black Sigatoka, protecting the tenuous food security of tropical
and subtropical Africa. Plant researchers also believe that biotechnology is the most likely path toward drought-tolerant crops, which would be hugely important in dealing with any long-term drought problems brought by the Modern Warming. Egypt has already inserted a drought-tolerance gene from the barley plant into wheat, producing varieties that need only a single irrigation per crop instead of eight. The drought-tolerant wheat will not only take less water, but will sharply reduce salinization of the irrigated land on which it's grown. It should also be a boon on large areas of good quality land where rainfall is scarce.
Fifth: Modern Transportation
The biggest technical advantage of the modern world in dealing with weather famines is modern transportation. In the coming warming centuries, we will undoubtedly be able to produce enough food from the land that gets good weather in any given year to supply all of the world's food needs. Equally important, we will be able to store food safely from years of plenty to ensure food abundance in lean years. All it takes are inexpensive concrete silos and modern pesticides to keep the rats and bugs from feasting on our food reserves before we need to draw on them.

Solvency frontline

1. Wind expansion is inevitable – prices and value of the dollar made it competitive – our ev is comparative to a world
without the PTC
Salon.com 08 (“Winds of change”, May 17, http://www.salon.com/news/feature/2008/05/17/wind_power/index.html)

Why the explosive growth? The short answer is price. New wind farms are currently offering power at 4 to 8 cents a kilowatt hour,
including the federal wind tax credit. Even without the credit, and with the recent price rise that most power sources have seen,
wind power is delivering power at 7 to 10 cents/kWh. The price of new wind farms has risen 30 percent to 40 percent in the last
few years for two reasons. First, commodity prices have soared. Second, most wind turbine manufacturing is in Europe, and the
dollar has plummeted compared to the euro. As of 2007, America had about 18 percent of total global installed capacity and about
the same fraction of the wind manufacturing business.
Ironically, the plunging dollar has done for the domestic industry what conservatives refused to do -- make this country the place to
build new wind manufacturing capacity. In the last few years, the percentage of U.S. wind equipment installed here but
manufactured abroad has dropped from 70 percent to 50 percent, and that drop is projected to continue, which should help stabilize
wind costs.
The one remaining big U.S. manufacturer of wind turbines was bought by General Electric in 2003 from the now-defunct Enron, a highly profitable move that
preserved America's role in large wind turbines. From 2004 to 2007, the company's wind turbine production has grown 500 percent, and the division brought GE
revenues exceeding $4 billion in 2007.
While the multi-decade drop in wind prices has stalled temporarily, prices for the competition have gone up the smokestack. New nuclear plants,
for instance, have tripled in price. Analysis for the California Public Utility Commission puts the cost of power from new nuclear plants at 15 cents per
kWh. It also puts the cost of coal (without carbon capture and storage) at more than 10 cents/kWh. That's a major reason why, since 2000,
Europe has added 47 GW of new wind energy, but only 9.6 GW of coal and a mere 1.2 GW of nuclear.
Yes, wind power is a variable resource, but this country has a great deal of power that runs around the clock, and many sources of flexible generation that can
complement wind's variability such as hydro power, natural gas, demand response and, soon, concentrated solar thermal power. Many regions in Europe integrate
well beyond 20 percent wind power successfully. Iowa, Minnesota, Colorado and Oregon already get 5 to 8 percent of their power from wind. Moreover, as we
electrify transportation over the next two decades with plug-in hybrids, the grid will be able to make use of far larger amounts of variable, largely nighttime zero-
carbon electricity from wind. So post 2030, wind power should be able to grow even further.

2. Wind is growing at record levels now


UPI, 8 (Megan Harris, United Press International, “Analysis: U.S. wind market's mixed signals,” 5-6-2008,
http://www.upi.com/International_Security/Energy/Analysis/2008/05/06/analysis_us_wind_markets_mixed_signals/3295/) // JMP

Annual wind energy growth in the United States topped previous records at about 45 percent in 2007 bringing total installed wind capacity to 16,818
megawatts. The U.S. regions with the biggest wind potential as measured by annual energy output are the Midwest and West -- with North Dakota and Texas on top. Texas had the largest
growth in 2007 and now leads in installed wind power capacity at 4,356 megawatts.
Worldwide, wind power grew at a record level in 2007 -- adding 20,000 megawatts of wind capacity and bringing global installed wind capacity to 94,000
megawatts.

3. The PTC hurts wind market growth – empirical proof
Refocus 03 (“Boom or Bust? Which way are U.S. winds blowing?”, Renewable Energy Focus, August, ScienceDirect)

In this year's Renewable Power Outlook, which presents a 12-year forecast of renewable power generation and capacity based on
an underlying market model that assesses supply and demand on a state-level, we project U.S. wind capacity to expand by nearly
2,800 megawatts (MW) through the end of 2005 as a result of the PTC's recent reinstatement. On the surface, this forecast seems to
support the conventional wisdom that the PTC is a significant driver of wind market growth. However, despite the credit's apparent
effectiveness over the past several years, both our assessment of the historical patterns of U.S wind market growth, and our
expectations regarding future growth trends, indicate that the PTC may actually inhibit U.S. wind market growth moving forward.
Wind development
As illustrated in Figure 1, in the first 5 years after it was originally made available in 1993, the PTC was ineffective in stimulating
wind market growth. During this period, low natural gas prices - as a result of the deregulation of the natural gas industry that
began with the passage of the Natural Gas Policy Act of 1978 and culminated in Federal Energy Regulatory Commission Order
636 - and utility reluctance to invest in generation assets - as a result of piecemeal state-level deregulatory processes that swept
through the nation in the mid-1990s - worked together to create a period of dormancy for all renewable power technologies
including wind, despite the presence of the PTC.

4. Tax credits can’t overcome market barriers without a renewable standard


UCS 7 (Union of concerned Scientists, “Renewable Electricity Standard FAQ”
http://www.ucsusa.org/clean_energy/clean_energy_policies/the-renewable-electricity-standard.html#6) AMK

Why not rely just on incentive-based approaches, such as tax credits?


Setting minimum standards has been an effective and essential policy for achieving many critical societal goals, such as increasing vehicle
and appliance efficiency, company environmental performance, and building and product safety. Renewable energy production tax credits are vital for leveling the tax
playing field with fuel-intensive technologies that pay lower property taxes and can deduct fuel expenses, but do not necessarily overcome other critical market
barriers. In order to ensure the tax credits are effective, there needs to be a policy that creates a market for the technologies. For
example, the production tax credit for wind has produced most new wind capacity in states that also have a state RES. The RES creates a
market for renewable technologies that are commercially viable or close to viable and helps reduce their costs (see below). Complementary
policies, including net metering and other financial incentives, are also needed to encourage the development of higher cost renewable emerging technologies with significant long term potential such as customer-sited solar photovoltaics. The combination of setting minimum standards, while providing incentives for exceeding standards, has proven to be a cost-effective approach to improving performance.
[Note – RES is a Renewable Electricity Standard]

Solvency – ext 1 – wind inevitable
Wind will expand even without a tax credit – cost competitive

Tech Confidential 08 (“Wind industry celebrates growth in face of tax credit loss”, May 7,
http://www.thedeal.com/techconfidential/vc-ratings/windward-ho/wind-industry-celebrates-growt.php)

The American Wind Energy Association on Wednesday announced that 2008 was shaping up to be another strong year of growth
for the U.S. wind power industry, with 1,400 new megawatts -- enough to serve 400,000 homes -- added during the first quarter
and investments underway in 17 new power plants. The most compelling sign of how this once marginalized industry is now
booming can be seen on a graph from AWEA, showing installed wind capacity doubling between 2006 and 2007.
As Tech Confidential recently reported, the wind industry, long a source of promise, has recently begun to thrive thanks to
increased interest in cleaner fuel sources, coupled with improved wind turbine technology. This has helped not just the plants that
generate wind power, but all of the companies that manufacture turbines and the venture backed startup companies working on
new and improved turbine technology.
Despite this runaway growth that would seem to be an investor's dream, many remain cautious about the wind sector. IPOs have
been few and far between. AWEA links much of this caution to the production tax credit in the U.S., which is due to expire later
this year. As AWEA data clearly shows, the tax credit in the past has been critical to the success of the wind power industry, which
has thrived during the years when the credit was in place, and collapsed when it was taken away.
Although the industry is actively lobbying for an extension of the tax credit this year, when the continued support of wind power is
so critical to its mass adoption, they quietly concede that things are a little different now. Wind power economics have improved to
the point that, at least in many parts of the country, it is competitive with other fuel sources, even without a tax credit.

Wind power will continue to expand for several reasons – a fragmented regulatory framework is not blocking wind
development
Datamonitor, 8 (“The US Wind Market Has Outgrown the Drip Feed of Supportive Federal Legislation,” 5-27-2008,
www.redorbit.com/news/business/1403646/the_us_wind_market_has_outgrown_the_drip_feed_of/) // JMP

However, with the fate of a key Federal tax incentive in the balance, the US wind energy industry increased new installation output
at a record pace in the first quarter of 2008, adding 1,400MW of new generating capacity. The prospect of a new 'greener' US
administration, coupled with the fact that many state and city administrations have recently introduced standards and tax
incentives, is driving considerable investment and development momentum. While the US regulatory environment is perhaps less
sophisticated and more fragmented than in Europe, the country's well-established culture of innovation means that US capacity
could well catch up with, and even surpass, Europe's capacity, by leveraging a well-developed venture capital industry.

Solvency – ext 2 – wind is growing
Wind power is expanding now
IHT, 8 (Carey Gillam, International Herald Tribune, “Wind power gains adherents in United States,” 5-19-2008,
www.iht.com/articles/2008/05/19/business/wind.php) // JMP

While growth in ethanol use as an alternative fuel has had a big impact on rural America, wind power has also been growing steadily for the past
three years, with wind farms like this one springing up all over the windy expanse of the Great Plains and beyond.
While only 1 percent of U.S. electricity comes from wind, it is attracting so much support these days that many in the industry believe it is poised for
growth.
"These are pretty heady times," said Randall Swisher, executive director of the American Wind Energy Association, which held an investment conference in April
in Iowa that drew more than 600 attendees.
"People are finally starting to see the data about what is happening to the world's climate, and that is really having an impact," Swisher said.
Last year, a record 3,100 turbines were installed across 34 U.S. states, and another 2,000 turbines are now under construction from
California to Massachusetts.

A big expansion of wind power is planned for the U.S. – it is already cost competitive
Madrigal, 8 (Alexis, “DOE Report Says More Wind Than Coal Planned for US Grid,” 6-2-2008,
http://blog.wired.com/wiredscience/2008/06/new-doe-report.html) // JMP

A new report from the Department of Energy details that 225 gigawatts of wind power are in the planning phases, thirteen times
more than currently installed and far more than the natural gas and coal plants on the drawing board.
Ryan Wiser, a researcher at Lawrence Berkeley National Laboratory and a co-author of the report, called the increase in wind projects "extraordinary," including the 5.3 gigawatts of wind
installed during 2007, which represented 35 percent of the total new capacity added to the grid last year.
What's driving the surge? Wind
is cost-competitive with fossil fuels and comes without the risk of climate change legislation making its
fuel more expensive to use.
"Wind costs and prices are on the rise, but fossil generation costs are also increasing," Wiser emailed Wired.com. "The end result is that wind remains competitive with fossil [fuel]
generation."
Lots of new power sources will have to come online over the next few decades to replace the coal plants that were built after World
War II and that will reach the end of their lifespans over the next couple decades. Wind is looking like it will be a major part of that mix. A separate DOE report
released last month declared that wind could power 20 percent of the US grid by 2030.

Wind expansion is inevitable – oil prices


Southern Maryland News 08 (“EARTH TALK: Status of Wind Power in the U.S.”, May 9,
http://somd.com/news/headlines/2008/7645.shtml)

Clean and green wind energy is the new darling of alternative energy developers, and the U.S. industry has been surging the past
three years, especially as developers take advantage of government incentives-in the form of the so-called Production Tax Credit
(PTC)-for erecting turbines and connecting them to the grid.
The non-profit American Wind Energy Association (AWEA) reports that, in 2007 alone, total U.S. wind power capacity grew by a
new record of 45 percent, injecting some $9 billion into the economy. These new installations provide enough electricity to power
1.5 million typical American homes while strengthening the nation's energy supply with clean, homegrown electricity.
According to AWEA, utility-grade wind power installations are now in operation across 34 U.S. states, generating more than
16,000 megawatts (MW) of electricity cumulatively-enough to power upwards of 4.5 million homes and to generate 45,000 new
domestic jobs. But even with this growth, wind energy still accounts for just one percent of U.S. electricity supply. Continued
growth apace with of recent years, though, should make it a major player in the American energy scene within a decade. President
Bush himself recently suggested that wind has the potential to supply up to 20 percent of the nation's electricity.
Of course, the volatility of oil prices has helped wind energy gain its foothold. Once a wind farm is built, the fuel cost is essentially
zero (as long as the wind blows), whereas fluctuating fossil fuel prices have made traditional power sources more costly and risky.
Upping our reliance on wind power has also allowed us to lower our overall carbon footprint. If coal or natural gas were to be
substituted to generate the electricity we now get from wind, it would put 28 million additional tons of carbon dioxide into the
atmosphere every year. Wind power also saves water by not requiring the billions of gallons of water used to cool coal-fired power
plants, an increasingly contentious issue in arid areas with limited access to fresh water.

Solvency – ext 3 – PTC fails
A world without the PTC would only slow wind power growth for a few years – but not completely undermine it
Refocus 03 (“Boom or Bust? Which way are U.S. winds blowing?”, Renewable Energy Focus, August, ScienceDirect)

The bottom line for the U.S. market is that while its boom and bust market cycle will likely remain in the near term, a significant
level of market activity will exist as a result of state level RPS policies and wind power’s position as the most cost effective
renewable energy technology. Policy stability, for the PTC, may eventually emerge as a result of the increasing credibility of wind power and the entry of industry
heavyweights, such as GE, who bring with them influence amongst the Washington D.C. policy makers. However, as many industry participants insist, even without
a PTC the potential for wind energy would remain. Removing the PTC would, naturally, have significant negative effects on the industry in the short term. However,
in a world without the PTC, industry players would adapt their business plans to the new environment and after a two to three year
period of adjustment, resume a smaller, albeit stable, growth path driven by state level RPS policies.

The PTC only has a limited role in expanding wind power


Refocus 03 (“Boom or Bust? Which way are U.S. winds blowing?”, Renewable Energy Focus, August, ScienceDirect)

Given the low price of natural gas through 2000, we believe that the PTC played an important role by reducing the cost of state-level policies to an acceptable cost that allowed state
renewable power policies to be successful without consumer or regulatory push back. Essentially, the PTC provided a mechanism by which federal taxpayers could subsidise the
renewable power policies of a handful of states with progressive energy policies. Moving forward, we believe that the
role of the PTC will be diminished. As illustrated in
Figure 3, our forecast indicates that natural gas prices will average $5.75 through the end of the decade. Assuming even the most optimistic heat rate of 7,000 Btu/kWh,
the cost of gas-fired generation will be approximately 4 ¢/kWh (excluding O&M and capital costs). For comparison purposes, without the PTC we expect wind energy
to cost approximately 4-5 ¢/kWh at premium locations even in the absence of technological progress. This is why we believe that
wind will remain competitive with conventional generation options over the next decade and the PTC will therefore play a
diminished role in guaranteeing the success of state-level policies.
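As a rough check on the card's arithmetic, and assuming the $5.75 gas price is quoted per million Btu (the excerpt does not state the unit), the roughly 4 ¢/kWh fuel-only cost follows directly from the stated heat rate:

$$
\text{fuel cost} \approx 7{,}000\ \tfrac{\text{Btu}}{\text{kWh}} \times \frac{\$5.75}{1{,}000{,}000\ \text{Btu}} \approx \$0.040/\text{kWh} \approx 4\ \text{¢/kWh}
$$

Wind at the card's 4-5 ¢/kWh for premium sites is therefore directly comparable to this fuel-only figure, which is the basis of the claim that wind stays competitive even without the PTC.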

Solvency – ext 4 – PTC not enough

PTC extension would still be ineffective at promoting a strong wind market – multiple warrants
Gipe, 2003 – International Wind Energy Expert, World Renewable Energy Congress Pioneer in Renewable Energy
(Paul, “Why I Oppose the Production Tax Credit,” Wind-Works.org, February 12, http://www.wind-works.org/articles/lg_ProductionTaxCreditNo.html)

Since the first National Energy Act, tax credits have been the mechanism used in the United States to subsidize or stimulate wind energy. Through 1985 the tax credits were based on
installed capital costs. Beginning in the 1990s, the tax credits were based on the sales of wind-generated electricity. This eliminated some of the more egregious abuses of the earlier
program. Up to the present, Production Tax Credits have been the favored policy of both the U.S. wind industry and renewable energy advocates. This mechanism is only used in the
United States.
Rationale of Opposition
1. Most simply, there are better ways to promote responsible wind energy development. Electricity Feed Laws are a far superior mechanism for spurring renewable energy development.
The success of Feed Laws can be seen in Germany, Spain, and now France.
2. Production Tax Credits can only be used by those who sell bulk electricity. They cannot be used by renewables in distributed
applications on the customer side of meter. For example, a pig farmer who wishes to off load consumption on the farm cannot use the tax credits because there are no
sales of wind-generated electricity.
3. Production Tax Credits are of benefit only to those with a tax burden sufficient to use all of the credits. Those most likely to
benefit are non-regulated utility subsidiaries.
4. Production Tax Credits lead to an unfortunate American phenomenon of a boom and bust or "gold rush" form of development. This results in part because Congress reauthorizes the tax
credits for only a few years at a time. Thus development is concentrated at the end of the last year. Boom and bust development is no way to run a business and is certainly no way to
create a dynamic and healthy renewable energy industry. Companies expand then contract, hiring and then firing in an oft-repeated cycle. When developers have been bitten by the gold
fever, impacts on the environment and on nearby communities take a back seat to getting projects "in the ground".
5. Production Tax Credits lead to concentration of the technology in the hands of a few. One-half of all wind capacity in the USA is owned or operated
by Florida Power & Light's unregulated subsidiary. The
trend in the U.S. industry is toward monopolistic power and control of the political
process. This concentration of power is not seen in markets where Electricity Feed Laws are used.
6. Production Tax Credits, by their boom and bust nature in the United States, cannot sustain a healthy manufacturing sector. There is only one U.S.
manufacturer of commercial wind turbines, GE Wind's Tehachapi plant. However, much of GE Wind's revenue comes from their German operations with sales to Germany's Feed Law
market. (GE Wind has demanded that the bankruptcy court refund much of its payment for Enron Wind's assets because GE Wind concluded they overpaid.)
7. Production Tax Credits encourage obscure and non-transparent forms of ownership structures. Part of the ongoing criminal investigation of
Enron partnerships involves wind deals designed to maximize use of the Production Tax Credits. It's nearly impossible to follow the money in these transactions. Of course, that was the
intent. For the environmental community, this complexity leads to an absence of accountability. Who is responsible to clean up environmental damage from
improperly developed wind projects? The operator? The owner? If so, who is the owner?
It's time to jump off the Production Tax Credit treadmill and work toward a more open, transparent support mechanism such as the
Electricity Feed Law.

Aff can’t solve transmission costs—this is comparatively a bigger threat than the expiration of the PTC
Shoock in 7--J.D., expected, Fordham University School of Law, 2008; B.A. summa cum laude in History & Political Science, The State University of New York at Buffalo
(Corey Stephen, Fordham Journal of Corporate and Financial Law, “BLOWING IN THE WIND: HOW A TWO-TIERED NATIONAL RENEWABLE PORTFOLIO
STANDARD, A SYSTEM BENEFITS FUND, AND OTHER PROGRAMS WILL RESHAPE AMERICAN ENERGY INVESTMENT AND REDUCE FOSSIL FUEL
EXTERNALITIES”, Lexis Nexis)

The news for the wind industry is not all good. Despite recent [*1037] gains, long-term growth is still questionable. 207 For one,
the current production tax credit is again scheduled to expire (this time on December 31, 2008), 208 but a tougher obstacle remains
- wind power's greatest potential lies in relatively geographically remote regions. 209 Therefore, issues involving transmission
costs threaten to put the brakes on the industry's growth. 210 The national power grid, as it stands, is not conducive to carrying
massive amounts of current, for example, from the wind-rich prairies of the Dakotas to larger population centers near the Great
Lakes or the Pacific Northwest. 211 As wind energy production reaches the maximum competitive utility transmission cost,
supply-side policies will drive consumption and investment potential upward into a veritable glass ceiling. 212
