Golden eagle | Photo: Tambako the Jaguar/Flickr/Creative Commons License
A new study of eagle mortality at a wind facility near Palm Springs may well prove frustrating to both supporters of wind energy and those concerned about the technology’s effect on wildlife.
But if you look beneath the surface, the paper underscores a big problem with the issue of wind energy and wildlife: we just don’t have the data we need to make smart decisions.
The paper by USGS research ecologist Jeffrey Lovich, to be published this month in the journal Western Birds, describes eagle mortalities at the Mesa Wind Project Site, which is part of the larger San Gorgonio Pass wind area near Palm Springs. Wind industry critics won’t find a smoking gun in the study, which documents just two eagle mortalities in the last 20 years, the most recent in 1997. And while wind partisans may try to find validation in Lovich’s study, that’s going to be difficult: Lovich carefully details a number of reasons why eagle deaths may be ongoing but undetected.
The big story here, though, is the existence of the paper itself. Why would a study that details two eagle deaths during the Clinton administration find its way into a peer-reviewed journal in 2015? The surprising answer emerges when you think about the way science is actually done.
Lovich didn’t set out to study eagles at the Mesa Wind Project. Instead, he was on site at regular intervals over the last two decades studying the effects of wind development on the Threatened desert tortoise. As Lovich writes:
I did not survey the site systematically for avian mortality; avian observations were incidental to long-term studies of the ecology and behavior of Agassiz’s desert tortoise (Gopherus agassizii), primarily 1997-2000 and 2009-2014. The research generally involved visiting the site once every 7-10 days from April to July for one or two days. Crews of 1-8 people walked long, meandering transects looking for tortoises to equip with transmitters or radio-tracking tortoises already outfitted for telemetry. Some aspects of the study required daily field work with one or two people. When dead birds were observed, field notes and photographs were taken opportunistically.
“Since research was not focused on birds,” Lovich adds, “and bird carcasses were found incidentally, it is likely that avian mortality and injury were underestimated at the site.” Lovich notes that scavenging of carcasses is prevalent in the area; a National Renewable Energy Laboratory (NREL) study of bird mortality at San Gorgonio placed dead chickens at intervals throughout the area to determine local scavengers’ efficiency, and found that only one in ten remained after ten days. With that high “scavenger bias,” as the biologists call it, the fact that Lovich’s crew was on site only every seven to ten days might well have meant missing other dead birds that were removed and eaten before the crews came back. And then there are the eight months of each typical year in which Lovich and colleagues weren’t on site at all.
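To get a feel for how much that matters, here is a minimal back-of-the-envelope sketch in Python of the kind of correction wildlife biologists commonly apply: divide the carcasses actually found by the probability that a carcass persisted long enough, and was conspicuous enough, to be found. The rates below are illustrative placeholders, not figures from Lovich’s paper or the NREL study.

    # Illustrative sketch only: scale up observed carcass counts by the chance
    # that a carcass was still present (not scavenged) and was spotted by a
    # searcher. All rates here are assumed, not measured.

    def estimated_fatalities(observed, persistence_prob, searcher_efficiency):
        """Observed carcasses divided by the probability one is detectable."""
        detectability = persistence_prob * searcher_efficiency
        return observed / detectability

    # If only 1 carcass in 10 outlasts the scavengers between visits, and
    # searchers find 80 percent of those that do remain, two carcasses found
    # would correspond to roughly 25 actual deaths under these assumptions.
    print(estimated_fatalities(observed=2, persistence_prob=0.1,
                               searcher_efficiency=0.8))  # prints 25.0

The specific multiplier isn’t the point; the point is that with scavenging rates that high and survey intervals that long, a raw carcass count is a floor, not an estimate.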
Still, two eagle deaths in 20 years don’t seem like much. Both deaths were admittedly horrible: the first mortality Lovich notes, in August 1995, involved an immature golden eagle that was cut in half mid-flight by a turbine blade. The second, a juvenile female injured by a turbine blade impact in April 1997, was euthanized after a few hours of suffering. But then Lovich reports no new eagle mortalities since 1997, 18 years ago this month. He reports sightings of eagles in the area during his later field work, including some flying through Mesa at altitudes that could expose them to turbine blade strikes.
Lovich also reports that those sightings have declined over the years; it’s getting less and less likely that you’ll see eagles in the San Gorgonio Pass area. But no bodies.
It won’t be surprising if some wind energy advocates point to Lovich’s paper as vindication, characterizing his work as something along the lines of “peer-reviewed study finds no eagle mortalities since 1997.” Of course, that summary would omit all the caveats: the mortalities Lovich and his crews might have missed, the Mesa Wind Project being one development in a sea of larger turbines, the declining number of eagle sightings each year, and the fact that absence of evidence is not evidence of absence. Wind fans who pride themselves on their science literacy might therefore have second thoughts.
Regardless of how one side or the other in the windmills versus eagles debate spins Lovich’s paper, though, the very fact that it was published at all indicates something disquieting about the state of our knowledge of wind power’s risks to wildlife. And the reason has to do with the way we do science.
Doing Science Wrong
I don’t mean to pick on wind partisans as solely responsible for hypothetical spin of scientific papers. Opponents of wind development are just as liable to take one preliminary study that supports their position and run with it for PR purposes. It may well be part of human nature; it’s certainly part of American culture. We Americans like our science simple and conclusive. None of this nuance stuff for us, with mixed percentages of likely outcomes and possible confounding factors and further study needed. We want science to prove or disprove things unambiguously.
That’s not easy even when the topic is something seemingly straightforward, like learning whether it’s good to eat chicken eggs or bad to drink Scotch. We’d like to have the matter settled once and for all, but science doesn’t always work the way we’d like it to. Life is complicated, and the more we learn about a topic the more complicated it can become.
Of those Americans who actually pay attention to the scientific literature more than once or twice in their lives, a majority likely don’t enjoy the process in and of itself. We want a straightforward answer. In fact, if we were honest with ourselves, we’d admit that we usually want a straightforward answer that supports what we wanted to believe in the first place.
That’s why you still see people citing Andrew Wakefield’s discredited work on the alleged link between vaccines and autism. Wakefield’s paper was retracted by The Lancet, and Wakefield and his colleagues were found to have acted unethically in conducting their research. Nonetheless, people find their beliefs about vaccines bolstered by Wakefield’s work and still tout it, ignoring several papers published since that refute any link between vaccines and autism.
That’s not just a problem with improperly performed research. It’s a problem with perfectly good science as well. To take an example that’s closer to the topic of wind power and wildlife, consider a 2014 study published in the journal Scottish Marine and Freshwater Science that looked at whether gulls and other sea birds avoid offshore wind turbines off the coast of the British Isles.
That 2014 paper showed that northern gannets seem to give offshore wind installations a wide berth, while a number of gull species show no evidence of either avoiding or being attracted to those installations. The authors couldn’t determine whether birds that entered a wind turbine field actually avoided coming close to individual turbines.
It was a bit of interesting science, and potentially useful for guiding placement of offshore wind turbines to avoid interfering with northern gannets’ migration patterns. But it was a limited study: it looked only at gannets and four gull species; it looked at those birds only in one part of the world, and only offshore; and the only species for which it found evidence of avoidance was the northern gannet.
And yet, I’ve had more than one conversation on social media with people who’ve pointed to that Scottish study to claim that “birds learn to avoid wind turbines,” regardless of whether those birds are gannets, eagles, or condors, and even if the topic isn’t offshore wind in the U.K. but onshore wind in the California desert.
It’s a disturbing habit. We look for papers that seem to be a slam dunk in favor of the viewpoint we want defended, and then we stick to them as proof of that viewpoint, even if they don’t actually say that.
And of course, that’s not how science really works.
A paper in a peer-reviewed journal can generally be taken seriously, if you read it carefully. But even the best paper describing the most conclusive, well-designed bit of research isn’t the end of the story. Science is a rock wall built by increments, each bit of research a new stone in that wall — or a well-placed tap of the mallet knocking out a stone someone else laid.
When it comes to what science knows about how many eagles are being killed by wind turbines in the United States, that wall doesn’t have a whole lot of rock in it. A few studies, such as one published in 2010 by Shawn Smallwood and colleagues, have focused on the controversial Altamont Pass Wind Area. We reported in October 2013 on a paper by Joel Pagel and several other U.S. Fish and Wildlife Service biologists that compiled third-party reports of eagle mortalities at U.S. wind facilities since 1997.
Pagel and his colleagues collected hard data on actual eagle mortalities at wind facilities. Smallwood and his colleagues surveyed the Altamont Pass area during their studies, documenting mortalities of eagles and other birds first-hand.
And aside from those two papers, and a bit of other work by Smallwood, that’s been pretty much it for actual hard data on eagle mortality at wind installations. There are plenty of other papers that have been published on the topic of eagles and wind turbines, but they mainly base their calculations of potential injury and mortality on projections of eagle abundance, likely dispersal patterns, and other such mathematical models. (The NREL study cited above, the one that distributed dead chickens to local scavengers, did do first-hand counting of bird carcasses at San Gorgonio Pass, but its surveys took place only every 90 days during the study period.)

Lovich’s two eagle mortalities in 20 years of observations may not seem like a lot, but they are actual hard data in a field in which hard data is awfully thin on the ground. Lovich says it himself:
Although two eagle deaths may seem insignificant, my incidental observations represent over 2 percent of all Golden Eagle mortalities reported nationally (Altamont Pass excluded) by Pagel et al from 1997 to June 2012.
Look at it this way. Scientists can work for years to get their research published. There are only so many journals, and access to their pages is competitive. Reviewers considering papers for publication in peer-reviewed journals apply a lot of different standards to gauge whether a paper makes the cut, from the soundness of its methodology to its general appropriateness for the journal, but one of the most important criteria for most reviewers is “conceptual novelty” — in other words, asking whether the work has been done before.
In 1995, when Lovich observed that first eagle mortality at Mesa Wind Project, the United States had fewer than 10,000 megawatts’ worth of operating wind turbines. As of the end of 2014, that’s up to just under 66,000 megawatts, with more new capacity built in 2012 alone than was operating nationwide in 1995. As a result of that immense buildout, and concerns for the new turbines’ effect on eagles, the U.S. Fish and Wildlife Service has been crafting rules to allow wind turbine operators to harm eagles accidentally without running afoul of the Bald and Golden Eagle Protection Act.
The issue of eagles being harmed by wind turbines in the U.S. is a huge topic, to put it mildly. And yet a paper documenting two eagle mortalities at a wind turbine facility in the last 20 years is “conceptually novel” enough to merit publication in a prestigious wildlife science journal.
Put it this way: The scientific community has more information on deaths among marine mammals, which spend much of their time in places it’s hard for us to get to, than it does about injuries and deaths to rather conspicuous birds in industrial facilities. Hell, we have better, more solid data on planets outside our solar system than we do on eagle mortalities at wind energy plants in California.
One could ask the rhetorical question of why that is, but it’s almost a waste of time: it’s because wind energy companies would strongly prefer that such data never be released to the public.
And that’s what peer-reviewed journals are, for all their abstruse language and incomprehensible math and absurd paywalls: public information. Once that data gets analyzed and put in context by independent biologists, it becomes available to us all.
Lovich puts it this way:
Minimizing wildlife mortality at wind farms is a major goal of conservation, although research on how best to do that is in short supply. Compiling and publishing accurate data on mortality of Golden Eagles over time is an important first step in efforts to protect these iconic birds.
And doing so in the clear light of day is crucial if we in the public are ever to make scientifically sound decisions about our energy policy, regardless of whether we put wind power or wildlife first.