Study Finds Midwestern US Soil is Eroding 10 to 1,000 Times Faster Than It Forms
In a discovery that has repercussions for everything from domestic agricultural policy to global food security and plans to mitigate climate change, researchers at the University of Massachusetts Amherst recently announced that the rate of soil erosion in the Midwestern US is 10 to 1,000 times greater than pre-agricultural erosion rates. These newly measured pre-agricultural rates, which reflect the rate at which soil forms, are orders of magnitude lower than the upper allowable limit of erosion set by the U.S. Department of Agriculture (USDA).
The study, which appears in the journal Geology, makes use of a rare isotope, beryllium-10, or 10Be, produced when stars in the Milky Way explode and send high-energy particles, called cosmic rays, rocketing toward Earth. When this galactic shrapnel slams into the Earth's crust, it splits apart oxygen atoms in the soil, leaving tiny trace amounts of 10Be, which can be used to precisely determine average erosion rates over the span of thousands to millions of years.
“We went to fourteen small patches of remnant native prairie that still exist in Iowa, Minnesota, South Dakota, Nebraska and Kansas, and used a hand auger to collect deep soil cores, in material that dates back to the last Ice Age,” says Isaac Larsen, professor of geosciences at UMass Amherst and the paper’s senior author. “We brought this soil back to our lab at UMass, sifted it to isolate individual sand grains, removed everything that wasn’t quartz, and then ran these few spoonfuls through a chemical purification process to separate out the 10Be —which was just enough to fit on the head of a pin.”
This sample was then sent to a lab which counted the individual 10Be atoms, from which Larsen and his colleagues calculated a precise rate of erosion, stretching from the present day all the way back to the last Ice Age, about 12,000 years ago.
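The paper itself uses carefully calibrated site parameters, but the basic logic of turning a 10Be count into an erosion rate can be sketched in a few lines of Python. Every numerical value below (production rate, soil density, attenuation length, and the sample concentration) is an illustrative assumption chosen only to show the shape of the calculation, not a figure from the study:

```python
import math

# 10Be radioactive-decay constant (per year), from its ~1.39 Myr half-life
HALF_LIFE_YR = 1.39e6
DECAY = math.log(2) / HALF_LIFE_YR

def erosion_rate_mm_per_yr(conc, prod_rate, density=1.6, atten=160.0):
    """Steady-state erosion rate from an in-situ 10Be concentration.

    conc      -- measured 10Be concentration (atoms per gram of quartz)
    prod_rate -- local surface production rate (atoms/g/yr), site-dependent
    density   -- soil density (g/cm^3), assumed
    atten     -- cosmic-ray attenuation length (g/cm^2), assumed

    Solves the standard steady-state balance N = P / (lambda + rho*eps/Lambda)
    for the erosion rate eps, then converts cm/yr to mm/yr.
    """
    eps_cm_per_yr = (atten / density) * (prod_rate / conc - DECAY)
    return eps_cm_per_yr * 10.0  # cm -> mm

# Hypothetical sample: ~1.2e5 atoms/g at a site producing 5 atoms/g/yr
print(round(erosion_rate_mm_per_yr(1.23e5, 5.0), 3))
```

The key intuition: the slower a surface erodes, the longer each sand grain lingers near the surface under cosmic-ray bombardment, so higher 10Be concentrations correspond to lower erosion rates.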
“For the first time, we know what the natural rates of erosion are in the Midwest,” says Caroline Quarrier, the paper’s lead author, who completed this research as part of her master’s thesis at UMass Amherst. “And because we now know the rate of erosion before Euro-American settlement, we can see exactly how much modern agriculture has accelerated the process.”
The numbers are not encouraging. “Our median pre-agricultural erosion rate across all the sites we sampled is 0.04 mm per year,” says Larsen. Any modern-day erosion rate higher than that number means that soil is disappearing faster than it is accumulating.
Unfortunately, the USDA’s current limit for erosion is 1 mm per year—twenty-five times greater than the median rate Larsen’s team found. And some sites are experiencing far greater erosion, disappearing at 1,000 times the natural rate. This means that the USDA’s current guidelines will inevitably lead to rapid loss of topsoil.
Topsoil is crucial not only for U.S. agriculture (the annual cost of diminished agricultural productivity and environmental degradation due to erosion is estimated at tens of billions of dollars) and for worldwide food security; most climate-mitigation plans also rely heavily on storing carbon in the soil.
Yet there’s no reason to despair. “There are agricultural practices, such as no-till farming, that we know greatly reduce erosion,” says Quarrier. “The key is to reduce our current erosion rates to natural levels,” adds Larsen.
This research was supported by the National Science Foundation.