If you have ever dealt with scientific data, you’ve probably encountered one of the shadier sides of science: academic publishing. While journals have stood, in some cases for centuries, as the official record of scientific advancement, safeguarded under the watchful eye of peers, they now live in a modern world. Millions of words have already been spilled on the subject, so that’s not what this article is about. Instead, I’m left asking: is academic publishing the only means of getting the stamp of peer review these days?
The reasons leading me to ask this question are many, but most stem from my recent work in a management arena. One example, in particular, highlighted many of the disconnects between the need for verified scientific data and the incentives of journals. That moment came at a Chesapeake Bay Program Sustainable Fisheries Goal Implementation Team meeting (for those of you not in the Chesapeake region, that’s a consortium of regional fisheries managers), where a room full of decision-makers needed a verified stock assessment of blue crabs to move forward with their management planning. Peer review is the time-tested, well-understood, and arguably easiest means of verifying data. Read More
The Ocean Cleanup is back in the news, with their first test deployment happening imminently off the coast of Japan. From reviews of their current material, it seems clear that they have not taken the critical assessment of their feasibility study, graciously provided pro-bono by Drs. Martini and Goldstein, to heart. This is unfortunate. As the project has progressed, many in the ocean science and conservation community have not only grown more skeptical of its effectiveness, but are increasingly wary of the potential this project has to cause significant environmental harm. As of yet, The Ocean Cleanup has presented no formal Environmental Impact Assessment, a critical document which would provide the data necessary to properly gauge the potential for environmental harm from large-scale engineering projects.
Image produced by The Ocean Cleanup.
Goldstein and Martini’s technical review is essential reading for anyone tracking the progress of The Ocean Cleanup, but there are many additional issues that the Ocean Cleanup has not yet addressed. Here, I present three issues related to the construction and operation of The Ocean Cleanup and the information that, were I in charge of regulating the high seas, I would need to know before such a project could be approved.
1. The Ocean Cleanup will be the largest offshore structure ever assembled.
When completed, The Ocean Cleanup will span 100 km of open sea with a massive array of booms and moored platforms. If the array is deployed in the proposed region, its mooring will be the deepest ever installed. The booms will stretch across a major oceanic current, interacting with plankton transport and pelagic migrations.
What I want to know: How will The Ocean Cleanup monitor changes in ocean-wide population structure? What community baselines have been established from which ecosystem impact can be assessed? What contingencies are in place should catastrophic failure occur? Ultimately, what chronic threshold will be used to trigger a shutdown of the Ocean Cleanup, should major environmental impacts be detected as a result of standard operation, who will have access to the data necessary to monitor those impacts, and who will have the authority to trigger a shutdown? Read More
I first heard about the new Wyoming law #SF0012 through the Slate article summarizing it as a criminalization of citizen science. There’s a real danger that it could be interpreted and implemented that way, but let’s try to give Wyoming the benefit of the doubt for a minute. The text of the law only requires that scientists (citizen or otherwise) acquire written or verbal permission from landowners before collecting data on their land. It goes on to define what “data” means, including photographs in a fairly wide definition, and “collecting” as taking data with the intention of turning it over to a state or federal agency. It also defines trespassing and outlines the consequences for those who fail to receive permission. In short: the data collector could go to jail, and their data would not be admissible in legal or policy proceedings.
At the core, the law re-hashes a fairly common definition of trespassing. The key part of the law that’s new is that the data won’t be admissible in court and the act of turning them over to federal or state agencies will make you an outlaw. Part of me thinks that data collectors, including citizen science groups, should be asking permission to go on someone’s land. This is both to keep ethics at the forefront of our scientific endeavors and for the personal safety of scientists (ranchers are known to carry shotguns, after all). Read More
Ansel Adams helped create what we now call American wilderness through his skillful photography – both his photographs and the places he used them to protect are national treasures. Recently, many of us were reminded of our country’s wilderness legacy through celebrations of the 50th anniversary of the Wilderness Act. For a quick reminder, the Act designated some of our federally-held lands as wilderness:
For this purpose there is hereby established a National Wilderness Preservation System to be composed of federally owned areas designated by Congress as “wilderness areas”, and these shall be administered for the use and enjoyment of the American people in such manner as will leave them unimpaired for future use as wilderness, and so as to provide for the protection of these areas, the preservation of their wilderness character, and for the gathering and dissemination of information regarding their use and enjoyment as wilderness.
Ansel Adams: The Tetons and the Snake River
Yet, along with this celebrated history, these recent discussions have also provoked a number of managers to utilize this strong piece of legislation to their political advantage – and dare I say, not in keeping with the spirit of the law. Read More
After years of scaring pregnant women away from fishy nutrition, the FDA is finally updating its recommendations to encourage them to eat 8-12 ounces of low-mercury fish a week. That’s 2 or 3 meals per week in order to support fetal growth and development. Curious about which fish are low in mercury? Stay away from tilefish from the Gulf of Mexico, swordfish, shark, and king mackerel, and limit albacore tuna to 6 ounces a week. Better options include “some of the most commonly eaten fish such as shrimp, pollock, salmon, canned light tuna, tilapia, catfish, and cod”. For locally caught fish, you should check with your local authorities. The new recommendations aren’t final – read the draft and write in if you want more information that would help you make safe and healthy seafood choices. Here are some things you should consider. Read More
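The draft numbers above lend themselves to a quick sanity check. Below is a minimal sketch of a weekly tally against those figures; the function name, species strings, and example meal log are mine (purely illustrative, not from any official FDA tool), while the ounce thresholds are the draft recommendations quoted above.

```python
# Weekly seafood check based on the FDA draft figures quoted above:
# 8-12 oz of low-mercury fish per week, albacore tuna capped at 6 oz,
# and four species to avoid entirely. All names here are illustrative.

AVOID = {"tilefish (Gulf of Mexico)", "swordfish", "shark", "king mackerel"}
ALBACORE_CAP_OZ = 6
LOW_MERCURY_TARGET_OZ = (8, 12)

def check_week(meals):
    """meals: list of (species, ounces) tuples eaten in one week.
    Returns a list of warning strings (empty if the week looks fine)."""
    warnings = []
    total_oz = 0
    albacore_oz = 0
    for species, oz in meals:
        if species in AVOID:
            warnings.append(f"avoid {species} entirely")
            continue  # high-mercury fish don't count toward the target
        total_oz += oz
        if species == "albacore tuna":
            albacore_oz += oz
    if albacore_oz > ALBACORE_CAP_OZ:
        warnings.append(f"albacore over the {ALBACORE_CAP_OZ} oz weekly cap")
    lo, hi = LOW_MERCURY_TARGET_OZ
    if total_oz < lo:
        warnings.append(f"under the {lo}-{hi} oz weekly target")
    return warnings

# Example week: two salmon meals plus an oversized albacore serving
print(check_week([("salmon", 4), ("salmon", 4), ("albacore tuna", 8)]))
# → ['albacore over the 6 oz weekly cap']
```

Note the simplification: species are matched by exact string, and the draft’s advice on locally caught fish (check with your local authorities) isn’t modeled at all.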
They rise from the deep with gnashing teeth and hissing blowholes. They stagger through the shallows, hunting for human flesh, piercing the air with their high pitched moan. They are dead but not dead. They are Zombie Dolphins.
And you can’t fight them, because they are protected.
This past Tuesday, the draft bill to reauthorize the Magnuson-Stevens Act was released by the U.S. House. The Magnuson-Stevens Act is a big deal because this is the law that lays out how fisheries management works in the United States. This time, a number of changes have been proposed by Representative Doc Hastings, some of which could fundamentally change fisheries management and fisheries science in U.S. waters. The proposed changes immediately proved controversial: they garnered overwhelming support from witnesses at the House Natural Resources Committee hearing on the bill (witnesses included representatives from the recreational and commercial fishing industries as well as the Mid-Atlantic Fishery Management Council), while the Pew Charitable Trusts strongly opposed the bill, calling it the “Empty Oceans Act” (translated into GIFs by Upwell for your viewing pleasure).
How might the Hastings bill affect your favorite marine species (both in the water and on your dinner plate)? Read on to see the good, bad, and ugly aspects of these proposed changes, at least according to this particular fisheries scientist.
Happy Fun Science Friday!
Though this post does not present such a happy story, given the recent discussion about dolphin photobombing, this week’s FSF is topically related. In the spring of 2010 the Deepwater Horizon oil rig experienced catastrophic failure resulting in the worst oil spill in human history. The Gulf of Mexico (GoM) was the unfortunate host of this catastrophe and the GoM community is still feeling the ecological, social, and economic consequences of this disaster.
Pod of bottlenose dolphins swimming underneath oily water of Chandeleur Sound, La., May 6, 2010.
Photo Credit: Alex Brandon/AP
One such impact that received little TV coverage during the spill was the uncharacteristic spike in dolphin deaths. In the few months following the BP spill there was an unprecedented spike in dead dolphins washing ashore along the Gulf Coast; 67 dead dolphins by February of 2011, with more than half (35) of them calves. This is in stark contrast to years before the spill, when only one or two dead dolphins per year were documented washing ashore. Despite the spike in dolphin deaths, there was no definitive evidence linking the dead cetaceans to the oil spill, as a number of other factors could have been responsible for the deaths, including infectious disease or the abnormally cold winter preceding the spill.
A lot of debate among conservationists centers on the conflict between the desire to see a species totally protected from human exploitation and the reality that market forces will continue to exist (see the latest on shark fin bans for a very good example). Ideally, a conservation plan should strike a balance, ensuring the continued existence of the species while still allowing people to profit from it in some way. This also requires a clear idea of the limitations of conservation policies. For example, US policies (even the mighty Endangered Species Act) only directly affect populations within the territorial waters of the United States, while international agreements like CITES restrict trade of the species without telling any particular country what to do domestically. However, there are ways to track the interaction between conservation policies and the market, making it possible to make some predictions on how things like fishery management plans and CITES listings might affect trade. Then it gets interesting. Armed with this knowledge, can the market be pushed towards species conservation?