Olduvaiblog: Musings on the coming collapse

New Study Shows Total North American Methane Leaks Far Worse than EPA Estimates | DeSmogBlog

Sharon Kelly, Fri, 2014-02-14 12:40

Just how bad is natural gas for the climate?

A lot worse than previously thought, new research on methane leaks concludes.

Far more natural gas is leaking into the atmosphere nationwide than the Environmental Protection Agency currently estimates, researchers concluded after reviewing more than 200 different studies of natural gas leaks across North America.

The ground-breaking study, published today in the prestigious journal Science, reports that the Environmental Protection Agency has understated how much methane leaks into the atmosphere nationwide by between 25 and 75 percent — meaning that the fuel is far more dangerous for the climate than the Obama administration asserts.

The study, titled “Methane Leakage from North American Natural Gas Systems,” was conducted by a team of 16 researchers from institutions including Stanford University, the Massachusetts Institute of Technology and the Department of Energy’s National Renewable Energy Laboratory, and is making headlines because it shows definitively that leaks during natural gas production and development can make the fuel worse than other fossil fuels for the climate.

The research, which was reported in The Washington Post, Bloomberg and The New York Times, was funded by a foundation created by the late George P. Mitchell, the wildcatter who first successfully drilled shale gas, so it would be hard to dismiss it as the work of environmentalists hell-bent on discrediting the oil and gas industry.

The debate over the natural gas industry’s climate change effects has raged for several years, ever since researchers from Cornell University stunned policy-makers and environmentalists by warning that if enough methane seeps out between the gas well and the burner, relying on natural gas could be even more dangerous for the climate than burning coal.

Natural gas is composed mostly of methane, an extraordinarily powerful greenhouse gas, which traps heat 86 times more effectively than carbon dioxide during the two decades after it enters the atmosphere, according to the Intergovernmental Panel on Climate Change, so even small leaks can have major climate impacts.

The team of researchers echoed many of the findings of the Cornell researchers and described how the federal government’s official estimate proved far too low.

“Atmospheric tests covering the entire country indicate emissions around 50 percent more than EPA estimates,” said Adam Brandt, the lead author of the new report and an assistant professor of energy resources engineering at Stanford University. “And that’s a moderate estimate.”

The new paper drew some praise from Dr. Robert Howarth, one of the Cornell scientists.

“This study is one of many that confirms that EPA has been underestimating the extent of methane leakage from the natural gas industry, and substantially so,” Dr. Howarth wrote, adding that the estimates for methane leaks in his 2011 paper and the new report are “in excellent agreement.”

In November, research led by Harvard University found that the leaks from the natural gas industry have been especially underestimated. That study, published in the Proceedings of the National Academy of Sciences, reported that methane emissions from fossil fuel extraction and oil refineries in some regions are nearly five times higher than previous estimates, and was one of the 200 included in Thursday’s Science study.

EPA Estimates Far Off-Target

So how did the EPA miss the mark by such a high margin?

The EPA’s estimate depends in large part on calculations — take the amount of methane released by an average cow, and multiply it by the number of cattle nationwide. Make a similar guess for how much methane leaks from an average gas well. But this leaves out a broad variety of sources — leaking abandoned natural gas wells, broken valves and the like.
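The bottom-up method described above reduces to a per-source emission factor multiplied by an activity count, summed over whichever sources the inventory happens to include. A minimal sketch in Python (every factor, count and source name here is an illustrative placeholder, not an EPA figure):

```python
# Bottom-up inventory: sum of (emission factor x activity count) per source.
# Any source missing from the dictionaries -- abandoned wells, broken valves --
# simply never enters the total, which is how leaks get undercounted.
EMISSION_FACTORS = {      # tonnes of methane per unit per year (placeholders)
    "cattle": 0.1,        # per head
    "gas_well": 2.0,      # per producing well
}
ACTIVITY_COUNTS = {       # nationwide unit counts (placeholders)
    "cattle": 90_000_000,
    "gas_well": 500_000,
}

def bottom_up_inventory(factors, counts):
    """Total annual methane in tonnes for the sources listed in `factors`."""
    return sum(factors[source] * counts[source] for source in factors)

total = bottom_up_inventory(EMISSION_FACTORS, ACTIVITY_COUNTS)
print(f"Inventory estimate: {total:,.0f} tonnes of methane per year")
```

Atmospheric (top-down) measurements, by contrast, capture every source at once, which is one reason the two approaches disagree.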

Their numbers never jibed with findings from the National Oceanic and Atmospheric Administration and the U.S. Department of Energy, which approached the problem by taking measurements of methane and other gas levels from research flights and the tops of telecommunications towers.

But while these types of measurements show how much methane is in the atmosphere, they don’t explain where that methane came from. So it was still difficult to figure out how much of that methane originated from the oil and gas industry.

At times, EPA researchers went to oil and gas drilling sites to take measurements. But they relied on drillers’ voluntary participation. For instance, one EPA study requested cooperation from 30 gas companies so they could measure emissions, but only six companies allowed the EPA on site.

“It’s impossible to take direct measurements of emissions from sources without site access,” said Garvin Heath, a senior scientist with the National Renewable Energy Laboratory and a co-author of the new analysis, in a press release. “Self-selection bias may be contributing to why inventories suggest emission levels that are systematically lower than what we sense in the atmosphere.” (DeSmog has previously reported on the problem of industry-selected well sites in similar research funded by the Environmental Defense Fund.)

Worse than Coal?

There was, however, one important point that the news coverage so far missed and that deserves attention — a crucial point that could undermine entirely the notion that natural gas can serve as a “bridge fuel” to help the nation transition away from other, dirtier fossil fuels.

In their press release, the team of researchers compared the climate effects of different fuels, like diesel and coal, against those of natural gas.

They found that powering trucks or buses with natural gas made things worse.

“Switching from diesel to natural gas, that’s not a good policy from a climate perspective,” explained the study’s lead author, Adam R. Brandt, an assistant professor in the Department of Energy Resources at Stanford, calling into question a policy backed by President Obama in his recent State of the Union address.

The researchers also described the effects of switching from coal to natural gas for electricity — concluding that coal is worse for the climate in some cases. “Even though the gas system is almost certainly leakier than previously thought, generating electricity by burning gas rather than coal still reduces the total greenhouse effect over 100 years, the new analysis shows,” the team wrote in a press release.

But they failed to address the climate impacts of natural gas over a shorter period — the decades when the effects of methane are at their most potent.

“What is strange about this paper is how they interpret methane emissions:  they only look at electricity, and they only consider the global warming potential of methane at the 100-year time frame,” said Dr. Howarth. Howarth’s 2011 Cornell study reviewed all uses of gas, noting that electricity is only roughly 30% of use in the US, and describing both a 20- and a 100-year time frame.

The choice of time-frame is vital because methane does not last as long in the atmosphere as carbon dioxide, so impact shifts over time. “The new Intergovernmental Panel on Climate Change (IPCC) report from last fall — their first update on the global situation since 2007 — clearly states that looking only at the 100 year time frame is arbitrary, and one should also consider shorter time frames, including a 10-year time frame,” Dr. Howarth pointed out.
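The effect of the horizon choice can be made concrete using the two multipliers cited in this article: 86 times CO2 over 20 years, and roughly 30 times over 100 years. A toy calculation for a hypothetical leak (the leak size is invented purely for illustration):

```python
# CO2-equivalent of a methane leak under the two time horizons cited above.
GWP_20 = 86     # 20-year multiplier (IPCC figure quoted earlier in the piece)
GWP_100 = 30    # ~100-year multiplier quoted by Dr. Ingraffea

def co2_equivalent(methane_tonnes, gwp):
    """Tonnes of CO2-equivalent for a given methane mass and GWP."""
    return methane_tonnes * gwp

leak = 1_000  # tonnes of methane, purely illustrative
print(co2_equivalent(leak, GWP_20))   # 86000 tonnes CO2e on a 20-year view
print(co2_equivalent(leak, GWP_100))  # 30000 tonnes CO2e on a 100-year view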

Another paper, published in Science in 2012, explains why it’s so important to look at the shorter time frames.

Unless methane is controlled, the planet will warm by 1.5 to 2 degrees Celsius over the next 17 to 35 years, and that’s even if carbon dioxide emissions are controlled. That kind of a temperature rise could potentially shift the climate of our planet into runaway feedback of further global warming.

“[B]y only looking at the 100 year time frame and only looking at electricity production, this new paper is biasing the analysis of greenhouse gas emissions between natural gas and coal in favor of natural gas being low,” said Dr. Howarth, “and by a huge amount, three to four to perhaps five fold.”

Dr. Howarth’s colleague, Prof. Anthony Ingraffea, raised a similar complaint.

“Once again, there is a stubborn use of the 100-year impact of methane on global warming, a factor about 30 times that of CO2,” Dr. Ingraffea told Climate Central, adding that there is no scientific justification to use the 100-year time window.

“That is a policy decision, perhaps based on faulty understanding of the climate change situation in which we find ourselves, perhaps based on wishful thinking,” he said.

For its part, the oil and gas industry seems very aware of the policy implications of this major new research and is already pushing back against any increased oversight of its operations.

“Given that producers are voluntarily reducing methane emissions,” Carlton Carroll, a spokesman for the American Petroleum Institute, told The New York Times in an interview about the new study, “additional regulations are not necessary.”
Photo Credit: “White Smoke from Coal-Fired Power Plant,” via Shutterstock.

Sochi Olympics Going For Gold, But Not Green | A\J – Canada’s Environmental Voice

Illegal logging during Sochi Olympics road construction. Activists with Environmental Watch on North Caucasus documented illegal logging in 2009 during construction of the combined (rail and motor) road from the Adler resort district (home to the Olympic stadium and athlete villages) to the mountain resort where alpine sports are taking place. Check out the EWNC website (in Russian) and Facebook page (English) for more photos.

When Russia bid to host the 2014 Winter Olympic Games, they committed to green building standards and a “zero waste” policy that promised not to add to landfills. The $51-billion Sochi Olympics – the most expensive in history – will truly have costly consequences for the environment. The area of development includes a UNESCO World Heritage site and a national park – the most biodiverse location in Russia. Eight thousand acres of preserved forests have been damaged and wetlands important for migrating birds have been buried under two metres of crushed rock. Suren Gazaryan, a zoologist with Environmental Watch on North Caucasus (EWNC) who is living in exile due to criminal charges stemming from his human rights work, says that parts of the park have been totally destroyed. He adds that much of the government’s vaunted reforestation effort has been “pointless.” The planting of 1.5 million new trees was often done by unqualified personnel who violated conventional methodology.

The Associated Press reports that Russia’s state-owned rail monopoly has been using illegal landfills to dump construction waste from an $8.2-billion, 48-kilometre highway and railroad link between the airport and alpine venues. These illegal landfills are in a water protection zone, and could potentially lead to the contamination of Sochi’s groundwater. Some IOC members have reportedly admitted to making a poor choice when they selected Sochi. Former IOC member Els van Breda Vriesman told Dutch broadcaster NOS that many members would vote differently today.

The Russian government stepped up law enforcement activity against local environmentalists during Olympic construction. Activists have been detained and criminally charged, and some have lost their jobs. The government plans to illegally shut down EWNC due to the group’s insistence on legal compliance during Olympic preparations.

Before we get too up-in-arms about Russia’s environmental misconduct, let’s not forget our own here in Canada.

More reason for concern: Environmental destruction and Indigenous rights abuses often go hand-in-hand. We saw this play out at the 2010 Olympics in Vancouver, where protestors linked environmental degradation and Indigenous sovereignty, and we’re seeing it again now with the Circassian community calling Sochi “the genocide Olympics.” The Circassians are indigenous to the North Caucasus region but were driven from the area in the 19th century. Historian Walter Richmond is calling Sochi the site of Europe’s first genocide in a new book.

‘High Risk’ Drugs Used in Livestock » The Epoch Times

Freedom of Information Act docs show FDA ignored dangers

By Epoch Times | February 4, 2014

Last Updated: February 4, 2014 6:31 am

Cattle eat on a farm near Cuba, Illinois, Aug. 3, 2012 near Cuba, Illinois. (Scott Olson/Getty Images)

The U.S. Food and Drug Administration (FDA) has vowed to cut antibiotics use for livestock, but a new report finds that the agency isn’t taking the problem as seriously as its own research suggests.

“Playing Chicken with Antibiotics” is a report by the Natural Resources Defense Council (NRDC) that looks at FDA documents obtained by a Freedom of Information Act request. Documents show that between 2001 and 2010, the FDA reviewed the safety of 30 penicillin and tetracycline feed additives approved for “nontherapeutic use” in livestock and poultry. 

The documents show that the FDA rated 18 of the drugs as posing a high risk of exposing humans to antibiotic-resistant bacteria through the food supply and of adversely affecting human health.

For the 12 remaining drugs in the FDA review, drug manufacturers did not provide proof of safety for humans.

According to Carmen Cordova, NRDC microbiologist and lead author of the NRDC analysis, the FDA knowingly allowed the use of drugs in animal feed even though they failed to meet approval standards the agency set in 1973.

“That’s a breach of their responsibility and the public trust,” Cordova said in a statement.

According to the NRDC report, the significance of these previously unreleased documents extends far beyond the 30 drugs reviewed. The tetracycline and penicillin drugs the FDA examined make up only about half of all the antibiotics used in animal agriculture. The NRDC said generics and other feed additives approved for similar uses may also contribute to antibiotic resistance risks.

“This discovery is disturbing but not surprising given FDA’s poor track record on dealing with this issue. It’s just more overwhelming evidence that FDA–in the face of a mounting antibiotic resistance health crisis—is turning a blind eye to industry’s misuse of these miracle drugs,” Cordova added.

Resistant to Change

In the 1950s, mixing low doses of antibiotics into animal feed was hailed as a miracle of modern science. The practice promoted faster growth and prevented the infections that often plague animals in the crowded, unsanitary conditions of factory farms.

Regulators first detected problems in antibiotic resistance linked to livestock feed additives in the 1970s. Yet despite FDA’s own research and warnings from the public health community about drug resistant superbugs, efforts to curb antibiotic use on the farm have been slow.

In December 2013, the FDA gave final guidance on a policy to address the issue. However, instead of strict regulations, the agency called for the industry to voluntarily withdraw from antibiotic use, “because it is the fastest, most efficient way to make these changes.” 

“Based on our outreach, we have every reason to believe that animal pharmaceutical companies will support us in this effort,” said Michael Taylor, FDA’s deputy commissioner for foods and veterinary medicine, in a statement. 

Representatives from the drug and livestock industries assert that most antibiotics found in factory farming are used responsibly—for the treatment of diseased animals. But the claim is hard to verify because regulators insist that details of livestock antibiotic use remain a trade secret. According to NRDC, 70 percent of all medically important antibiotics sold in the United States go to livestock production.

The NRDC won a lawsuit against the FDA for failing to address the threat posed by the misuse of antibiotics in the livestock industry. However, the FDA has appealed the ruling, and a decision is now pending in the U.S. Court of Appeals for the Second Circuit, in New York.

Activist Post: Why the FCC Can’t Actually Save Net Neutrality

April Glaser
EFF

Network neutrality—the idea that Internet service providers (ISPs) should treat all data that travels over their networks equally—is a principle that EFF strongly supports. However, the power to enforce equal treatment on the Internet can easily become the power to control the Internet in less beneficent ways. Some people have condemned last week’s court decision to reject the bulk of the Federal Communications Commission’s (FCC) Open Internet Order as a threat to Internet innovation and openness. Others hailed it as a victory against dangerous government regulation of the Internet. Paradoxically, there is a lot of truth to both of these claims.

Violations of network neutrality are a real and serious problem: in recent years we have seen dozens of ISPs in the U.S. and around the world interfere with and discriminate against traffic on their networks in ways that threaten the innovative fabric of the Internet.

At the same time, we’ve long doubted that the FCC had the authority to issue the Open Internet rules in the first place, and we worried that the rules would lead to the FCC gaining broad control over the Internet. The FCC in particular has a poor track record of regulating our communications services. We are not confident that Internet users can trust the FCC, or any government agency, with open-ended regulatory authority of the Internet.

Look at what happened with radio and television. Though it’s charged to regulate our media landscape in the best interest of the public, the FCC opened the doors to unforeseen levels of media consolidation. That consolidation has contributed to the gutting of newsrooms and a steep decline in diversity of viewpoints and local voices on the air, as independent broadcasters across the country shut down, unable to compete with big media monopolies. One of the best protections for the open Internet is probably more competition among ISPs, but the FCC’s history doesn’t leave us hopeful that it is the right entity to help create and defend a competitive Internet marketplace.

And the FCC sometimes makes rules that narrow our freedom to communicate and innovate, like the Broadcast Flag Rule—the bit of DRM that broadcasters wanted to use to prevent the home recording of television—which EFF fought so hard to defeat in 2007.

So while we are hesitant to task any government agency with the job of regulating the Internet, we aren’t thrilled about giving that power to the FCC.

The many faces of network discrimination

However this plays out, we think it’s important that the public understands what network discrimination actually looks like. The recent debate about network neutrality has involved a lot of speculation and “what-if” hypotheticals. This is strange because we have a clear, documented history of the kinds of non-neutral, discriminatory practices that ISPs have actually deployed in recent years. Here are a few ways ISPs have throttled or blocked content in the past. We stand firm in our opposition to this kind of behavior:

  • Packet forgery: in 2007 Comcast was caught interfering with their customers’ use of BitTorrent and other peer-to-peer file sharing;
  • Discriminatory traffic shaping that prioritizes some protocols over others: a Canadian ISP slowed down all encrypted file transfers for five years;
  • Prohibitions on tethering: the FCC fined Verizon for charging consumers for using their phone as a mobile hotspot;
  • Overreaching clauses in ISP terms of service, such as prohibitions on sharing your home Wi-Fi network;
  • Hindering innovation with “fast lane” discrimination that allows wireless customers without data plans to access certain sites but not the whole Internet;
  • Hijacking and interference with DNS, search engines, HTTP transmission, and other basic Internet functionality to inject ads and raise revenue from affiliate marketing schemes, from companies like Paxfire, FairEagle, and others.

Individually and collectively, these practices pose a dire threat to the engine of innovation that has allowed hackers, startup companies, and kids in their college dorm rooms to make the Internet that we know and love today.

How can we prevent these practices and ensure neutrality over our networks? The FCC tried, and while the agency was somewhat successful in putting the brakes on the kind of network discrimination ISPs would rather see[1], the FCC used poor legal reasoning to enact weak rules.

What was wrong with the FCC’s network neutrality approach

The Open Internet rules of 2010 that were rejected by the court last week were deeply flawed and confirmed our fears about heavy-handed Internet regulation. The FCC initially claimed that it had “ancillary” authority under the 1996 Telecommunications Act to enact the Open Internet rules. That means that although the FCC did not have explicit authority from Congress to issue network neutrality rules, especially after classifying Internet service as an “information service” and not a telephone-like “common carrier” in 2002, they still professed a broad authority to regulate the Internet.

That claim of ancillary jurisdiction, if accepted, would have given the FCC pretty much boundless clearance to regulate the Internet, and to claim other ancillary powers in the future. Even if you happen to like the FCC’s current goals, who’s to say we will still like whatever goals the agency has next year and the year after that?

We had serious issues with the initial Open Internet Order, as we explained in our comments to the FCC. For one, the Order allowed ISPs free rein to discriminate as long as it was part of “reasonable efforts to… address copyright infringement.” This broad language could lead to more bogus copyright policing from the ISPs. We’ve already seen companies use inaccurate filters to block non-infringing fair use content online, a practice we continue to fight.

The FCC’s rules also had troubling exceptions for law enforcement, permitting ISPs to engage in voluntary, non-neutral network management practices to fulfill any law enforcement requests. We opposed this exception when the rules were being considered, but the FCC did not adopt our recommendations. And by now we all know how overbroad law enforcement exceptions to gather user data can be. If you have any doubt, pick up a newspaper and read about how the U.S. government unconstitutionally collaborates with Internet companies for law enforcement purposes.

There are no easy solutions

In light of these threats it is tempting to reach for easy solutions. But handing the very real problem of network neutrality to a government agency with strong industry ties and poor mechanisms for public accountability is unsatisfying. There’s a real danger that we would just be creating more problems than we’d solve.

One alternative that would go a long way would be to foster a genuinely competitive market for Internet access. If subscribers and customers had adequate information about their options and could vote with their feet, ISPs would have strong incentives to treat all network traffic fairly. The court agreed with us on this point:

“a broadband provider like Comcast would be unable to threaten Netflix that it would slow Netflix traffic if all Comcast subscribers would then immediately switch to a competing broadband provider.”

Another scenario would be for Congress to step in and pass network neutrality legislation that outlines what the ISPs are not allowed to do. But fighting giant mega-corporations like AT&T and Verizon (and their army of lobbyists) in Congress promises to be a tough battle.

Yet another option: empower subscribers to not just test their ISP but challenge it in court if they detect harmful non-neutral practices. That gives all of us the chance to be watchdogs of the public interest but it, too, is likely to face powerful ISP opposition.

These are not the only options. Internet users should be wary of any suggestion that there is an easy path to network neutrality. It’s a hard problem, and building solutions to resolve it is going to remain challenging. But here is one guiding principle: any effort to defend net neutrality should use the lightest touch possible, encourage a competitive marketplace, and focus on preventing discriminatory conduct by ISPs, rather than issuing broad mandatory obligations that are vulnerable to perverse consequences and likely to be outdated as soon as they take effect.

EFF is watching this issue closely, and we’ll continue to share our thoughts on how best to defend the free and open Internet on which we all depend.
