Trends can be difficult to detect in real-world data, and the noisier the data, the tougher the task becomes. A longer time series can help limit the impact of noise, but long records can be hard to come by. Verifying the human alteration of ocean chemistry requires tackling challenges like these.
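To see why a longer record helps, consider fitting a straight-line trend to noisy annual data. The standard error of the fitted slope shrinks roughly as the series length to the 3/2 power, so modest extensions of the record sharpen the trend estimate considerably. This is a generic statistics sketch, not a calculation from the study; the noise level and series lengths below are illustrative assumptions.

```python
import math

def slope_std_error(n_years, noise_sd):
    """Standard error of a least-squares trend (slope per year) fitted to
    n_years of annual data with independent noise of standard deviation
    noise_sd: se = noise_sd / sqrt(sum((t - mean(t))^2)) for t = 0..n-1.
    """
    # For t = 0..n-1, the sum of squared deviations is n*(n^2 - 1)/12.
    ss_t = n_years * (n_years**2 - 1) / 12.0
    return noise_sd / math.sqrt(ss_t)

# With the same year-to-year noise, a 50-year record pins down the
# trend about 11 times more tightly than a 10-year record.
short = slope_std_error(10, 1.0)
long = slope_std_error(50, 1.0)
print(short, long)
```

The key point is that the uncertainty falls faster than the familiar 1/√n of a simple average, because later observations gain leverage on the slope.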
Ocean acidification is a drop in the pH of seawater that occurs as the carbonate that buffers it is consumed. That carbonate does more than maintain pH, though: many marine organisms, from plankton to mollusks to corals, use it to build shells and skeletons. As the buffer is depleted, the saturation state of carbonate minerals like calcite (and its polymorph aragonite) decreases, making it harder for organisms to incorporate them. In most of the surface ocean, calcite and aragonite are supersaturated, so building shells and skeletons is easy. In undersaturated water, the equilibrium tilts the other way, and those structures can begin to dissolve.
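The saturation state chemists use here is usually written Ω = [Ca²⁺][CO₃²⁻]/Ksp, with Ω > 1 meaning supersaturated and Ω < 1 undersaturated. A minimal sketch of that calculation, using rough textbook values for surface seawater (the Ksp and concentrations below are illustrative assumptions, not figures from the study):

```python
# Approximate surface-seawater values (assumptions for illustration):
ARAGONITE_KSP = 6.65e-7   # mol^2/kg^2, apparent solubility product
CA = 0.0103               # mol/kg, seawater calcium concentration

def omega(carbonate_mol_per_kg, ksp=ARAGONITE_KSP, ca=CA):
    """Saturation state: Omega = [Ca2+][CO3^2-] / Ksp.
    Omega > 1 -> supersaturated (shell-building favored);
    Omega < 1 -> undersaturated (dissolution becomes possible)."""
    return ca * carbonate_mol_per_kg / ksp

# A typical modern surface-ocean carbonate ion concentration of
# roughly 200 umol/kg gives Omega comfortably above 1; acidification
# works by drawing down [CO3^2-], and Omega falls with it.
print(omega(200e-6))
```

Since calcium is abundant and nearly constant in seawater, the carbonate ion concentration is the lever that acidification pulls.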
Calcite and aragonite saturation states vary regionally and seasonally, so how can we make sure the acidification trend we're measuring is real and human-caused? One way to address this question is to take the measurements we have and model the whole ocean to see what natural variation would have looked like before humans started emitting CO2. A recent study in Nature Climate Change does just that.