Sunday, May 17, 2015

Emissions hiatus?

John Baez has a post on the latest preliminary data from the EIA for 2014. Global CO2 emissions were the same as in 2013, at 32.3 Gtons. The EIA says it is the first time in 40 years that a non-increase was not tied to an economic downturn.

They attribute the pause to greater use of renewables, mentioning China. Greenpeace expands on this, saying that China's use of coal dropped by 8%, with a consequent 5% drop in CO2 emissions. They give the calculation, with sources, here. This source says coal mined in China in April was down 7.4% on last year, which they do partly attribute to the economic slowdown there.

It's just one year, and may be influenced by China's economy. We'll see.



28 comments:

  1. Thanks for the post and links! One question. As pointed out in Mr. Baez's article, we are still measuring increases in CO2. If this slowing down or leveling off in emissions were to continue, when would we see the effects in measured CO2? Is there any idea of the time lag that exists between the two? Thanks for any info.

    1. It will take a while to see effects on measured CO2. The reason is that emissions correspond to the rate of change of CO2. Change in emissions corresponds to the second derivative. We're still emitting 32.3 Gtons CO2 per year, on those figures. IIRC, that would be about 4 ppm CO2/year if it all stayed in the air. In fact about half does. That means an ongoing rise of about 2 ppm/year.

      If emissions dropped 10%, that means CO2 would go up only 1.8 ppm/year. That change would take a while to notice, though it would be important long-term.
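      A back-of-envelope check of those figures (the conversion 1 ppm of CO2 ≈ 2.13 GtC ≈ 7.81 Gt CO2 is a standard value, an assumption not stated in the thread):

```python
# Back-of-envelope check of the ppm figures above. Assumption (standard
# conversion, not from the post): 1 ppm CO2 ~ 2.13 GtC * 44/12 ~ 7.81 Gt CO2.
GT_CO2_PER_PPM = 2.13 * 44.0 / 12.0       # ~7.81 Gt CO2 per ppm

emissions = 32.3                          # Gt CO2/year (the EIA figure)
rise_if_all_stayed = emissions / GT_CO2_PER_PPM        # ~4.1 ppm/year
rise_observed = 0.5 * rise_if_all_stayed               # "about half stays" -> ~2.1 ppm/year
rise_after_10pct_cut = 0.5 * 0.9 * emissions / GT_CO2_PER_PPM   # ~1.9 ppm/year
```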

  2. You will still have population growth, so a sustained pause is somewhat hard to imagine.

  3. I'd argue that if emissions dropped 10% (0.4 ppm/year worth), then the CO2 increase would be approximately 1.6 ppm/year: the "about half stays in the air" is a rule of thumb, not a result of some physical principle. The size of the sink is, as I understand it, more a function of the longer-term disequilibrium between the atmospheric concentration and the sinks than it is a function of the last year's emissions, and it is just coincidence that the emissions and sink growth have been such that the emissions to sink ratio has remained close to a factor of 2 over several decades.

    E.g., if CO2 emissions dropped to 0.2 ppm/year worth, you wouldn't expect concentrations to still rise at 0.1 ppm/year, you'd actually expect them to start dropping. (though eventually the sinks would catch up, and then concentrations would start growing again, because as long as CO2 emissions are greater than zero, then total net carbon in the atmosphere-ocean-ecosystem pool will grow. Well, up until the long-term sedimentation sink starts growing because of higher carbon concentrations, but that will take a LONG time)

    -MMM

    ps. Apologies if this posts multiple times, I've been having problems.
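    The corner case is easy to see in a toy two-box model (all numbers here are invented for illustration; eq and tau are made-up parameters, not fitted values):

```python
def run(years, emissions, atm=850.0, eq=600.0, tau=50.0):
    """Toy model: atmospheric carbon (GtC) relaxes toward a fixed
    equilibrium eq with timescale tau, so the sink is driven by the
    accumulated excess over equilibrium, not by this year's emissions."""
    for _ in range(years):
        sink = (atm - eq) / tau          # ~5 GtC/yr at today's excess
        atm += emissions - sink
    return atm

business_as_usual = run(5, emissions=10.0)   # atmosphere keeps rising
near_zero = run(5, emissions=0.4)            # atmosphere falls despite nonzero emissions
```

In reality the equilibrium itself drifts upward as the ocean fills (the long tail in the Archer paper cited below), so this only illustrates the sign of the short-term response.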

    1. "the "about half stays in the air" is a rule of thumb, not a result of some physical principle"

      I thought that too. But it's surprisingly persistent.

      ps re multiple posting - you can erase your own comments - even totally.

    2. I know it is surprisingly persistent, but I think it is still somewhat coincidence (maybe somehow an exponential emissions growth rate turns into a constant ratio of emissions to sink in a multi-box system? One day I'll write myself a carbon cycle program to play around with these things. Also, to educate all the people who keep getting confused about the anthropogenic nature of our emissions increase.)

      -MMM

    3. The 1/2 factor is very easy to understand if you understand the random walk behind diffusion. CO2 does not want to sequester permanently so it undergoes essentially a continuous random walk, going downward half the time and upward half the time. This generates a long fat-tail where it looks like about half remains in the atmosphere.

    4. "The 1/2 factor is very easy to understand"

      It's not exactly 1/2. And it's a lot more complicated than that. For one thing, most of the net air-sea exchange occurs because of seasonal warming and cooling. And once absorbed, it is subject to currents. And of course, CO2 engages in acid-base reactions in the sea.

    5. Of course realistically it is more complicated; that is why I use a "dispersive" diffusion formulation. There is not one diffusion coefficient but a range of diffusion coefficients. The formulation I use assumes a MaxEntropy spread of diffusion constants, and that can be solved to show the slow sequestration and the long time spent at around 1/2 of the initial concentration in the atmosphere.

      Complexity is tamed by using statistical physics.
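      For what it's worth, the superstatistics averaging is easy to check numerically. This is a minimal sketch, not the dispersive-diffusion formulation itself: it averages plain exponential decays exp(-k*t) over a maximum-entropy (exponential) distribution of rates k with a known mean kbar, which gives the fat-tailed 1/(1 + kbar*t) exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
kbar = 0.05                                  # mean decay rate (illustrative, 1/years)
rates = rng.exponential(kbar, 100_000)       # MaxEnt distribution given only a mean

t = np.array([1.0, 10.0, 100.0, 500.0])
numeric = np.exp(-np.outer(t, rates)).mean(axis=1)   # average of simple decays
closed = 1.0 / (1.0 + kbar * t)                      # hyperbolic fat tail
assert np.allclose(numeric, closed, rtol=0.05)
```

The averaged response decays far more slowly than any single exponential, which is the qualitative point about fat tails.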

    6. WHT: I am unconvinced that a "dispersive" diffusion formulation is a good way to model the carbon cycle for the purposes we care about here (i.e., whether, over a several-year period, the increase in atmospheric loading will always be about half - 44 percent, according to Nick's graph - of the emissions in that period). I feel like dispersive diffusion might be better suited to, say, tracking labeled atoms than to calculating atmospheric perturbations, but even there I'd be a bit dubious. Various simple approximations of the carbon cycle have proven to be embarrassingly flawed historically (e.g., Essenhigh's well-stirred reactor, because he thought complexity was easily tamed using standard chemical engineering approaches).

      And I return to my intuition about the carbon cycle: the atmosphere and the ecosystem/ocean carbon pools are not currently in equilibrium. If we were to shut off fossil fuel emissions tomorrow, we would see CO2 concentrations drop over time, approaching, eventually, a level equal to preindustrial plus about 20-35% of the total cumulative emissions (according to https://geosci.uchicago.edu/~archer/reprints/archer.2009.ann_rev_tail.pdf ). So, therefore, as emissions decreases become large, the "about half" rule of thumb can't hold anymore, and any approach that can't capture that dynamic isn't really appropriate for this question.

      I think this might be appropriate here: https://xkcd.com/793/

  4. anony-moose,

    MaxEntropy priors applied to these situations invariably work, and they work across the board for various natural phenomena. They work because they assume the least. It looks as if you are assuming too much :)

  5. WHT: Using your method, what would the prediction be for whether CO2 concentrations in 2020 would be greater than or smaller than concentrations in 2015 if we dropped from ~40 Gt CO2/year today to 0.4 Gt CO2/year starting on January 1st, 2016?

    If the answer is "concentrations would be higher in 2020 than in 2015" (which, based on your brief description, is the answer that I think your method would give: if I've misunderstood, then please enlighten me), then your method is not appropriate for this problem. No method can "invariably work": methods only work when they are applied to the right kind of problems, and applied properly. And generally, it helps to actually understand the physical system you are applying the method to, so you don't make improper assumptions.

    -MMM

    1. I am saying these methods invariably work in much the same way that a Boltzmann distribution works. You really can't go wrong by assuming a mean value for an observation and then placing an uncertainty of one standard deviation, equal to the mean, about that value. That gives the MaxEntropy estimate of an observable if all you can assume is the mean. So there has to be a mean diffusion coefficient, right? Or do you think the mean diverges, such that there is no average value for diffusion?

      This gets to the heart of what people like Jaynes, and more recently Christian Beck with his superstatistics approach (http://en.wikipedia.org/wiki/Superstatistics), have tried to bring to traditional statistical physics.

  6. I'm not saying that "there is no average value for diffusion"; I'm saying that you aren't properly modeling the system. The carbon cycle is not currently in equilibrium: because of this, it is impossible that if X gigatons of CO2 are emitted, the atmospheric loading will always increase by X/2 (or whatever the appropriate fraction is) regardless of the value of X. If X is small enough, because of the initial disequilibrium, atmospheric loading will actually decrease. As X gets very large, the ecosystem carbon sink will saturate. The oceans may not have the same kind of saturation limit (Henry's law, as modified by carbonate chemistry and the Revelle factor), though even for the oceans various living organisms play a role and presumably will also saturate. Which means that as X gets very large, the fraction of X which remains in the atmosphere should grow.

    If the carbon cycle problem were as easy as throwing a MaxEntropy estimate of diffusion at it, why would carbon cycle modelers bother with complex 3D ocean models, ecosystem models, and such? Heck, you have a system where concentrations increase every fall and decrease every spring - is that a simple diffusive system?

    -MMM

    1. It's an impulse response function. That's how diffusion problems are modelled -- as a transient response, not an equilibrium formulation. No one has ever made a requirement for a system to be in "equilibrium" before one can apply statistical physics à la a simplified differential equation.

      Actually I don't know why they bother with all those seasonal complexities. I certainly would not.

  7. But the question isn't "how much of this year's emissions of carbon will remain in the atmosphere next year" but rather "how much will the atmospheric concentration of CO2 change next year". Your approach would work for the former, but not the latter. Well, assuming you used the right data to estimate your diffusion coefficients. The residence time of CO2 is known to be about 5 years, but that's only the e-folding time to first absorption in the ocean/ecosystem, and if we're talking about a tagged group of CO2 molecules where a molecule still counts if it comes back out of the ocean, then the effective residence time would presumably be somewhat longer than that. Which means, for that calculation, less than 20% of your tagged CO2 will be gone at the end of the first year after your emissions pulse. And in the very long run, you'll asymptote to a percent remaining equal to the percent of carbon that is in the atmosphere compared to the rest of the system.
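    The arithmetic behind that last estimate, as a quick sketch (the 8-year figure is a hypothetical longer "effective" residence time, purely for illustration):

```python
import math

def fraction_gone(t_years, tau):
    """Fraction of a tagged pulse removed after t_years, for e-folding time tau."""
    return 1.0 - math.exp(-t_years / tau)

year_one_bare = fraction_gone(1, 5.0)   # ~0.18 with the quoted 5-year residence time
year_one_eff = fraction_gone(1, 8.0)    # smaller still if re-emerging molecules count
```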

    -MMM

    ReplyDelete
    Replies
    1. I don't think you understand that diffusion is just a random walk, whereby the CO2 can enter and leave the organic carbon cycle many times. The fact that this occurs goes into the representation of dispersed diffusion. There are so many different time scales of diffusion that MaxEntropy is the most effective way to account for them all with the minimum amount of bias.

      I recommend the work by James Sethna where he has the concept of "sloppy" models and finding the simplest representation possible. This is on my mind as we are discussing simplified modeling on the Azimuth Forum. I will use this diffusion example over there and see what they think.

    2. I have a PhD from MIT; I think I understand what diffusion is. I also have published several peer-reviewed papers on climate change, and while I am not a carbon cycle expert myself, I have worked with carbon cycle experts and run earth system models. I recommend maybe reading some carbon cycle papers (like the Archer paper I linked to earlier), rather than continuing to illustrate https://xkcd.com/793/.

      -MMM

    3. My earliest background is in diffusion related to semiconductor processing. Definitely a case of climate science lagging behind fundamental science and technology. The xkcd gambit is weak.

    4. So, given the choice between, "maybe I'm wrong" and "the entire field of carbon cycle science is flawed" you go with the latter*? That confirms that the xkcd comic is pretty much the perfect encapsulation of this discussion.

      -MMM

      *There was potential for a third path, which would have been that you and climate science were consistent, and that I was the one who was confused about either what you were saying or about what the state of climate science was, but you haven't even bothered to engage with my key hypothetical corner case (e.g., carbon emissions drop to near zero) which might have served to illuminate that question.

    5. I never said the science was wrong. I am simply pointing out that the BERN model is a heuristic representation of a dispersed diffusional process. This is my model of diffusional sequestration laid on top of the BERN model:
      http://imagizer.imageshack.us/a/img18/8127/normalizeddecayofco2.gif
    6. 1) The equation you are comparing to is not actually the BERN model. It is an analytic approximation to the BERN model.
      2) The analytic approximation assumes that the system is in equilibrium at 378 ppm. It also includes assumptions about the climate response to a change in forcing, because when the ocean warms, CO2 becomes less soluble and there is increased stratification, so it becomes a less effective sink.
      3) The point of creating an analytic approximation to a complex model, with an assumption of an equilibrium background, is that the authors are not answering the question "how much will the atmospheric concentration of CO2 change next year", but rather "what is the CO2 perturbation that will result from a single impulse of CO2" - very useful for calculating global warming potentials, which is what the IPCC wanted.
      4) One could apply a simplified response function like the BERN cycle approximation to the question of "how much will the concentration change next year" by assuming a preindustrial equilibrium background in 1750 and using historical emissions estimates for each year between 1750 and the present. That would allow the calculation of the amount by which CO2 concentrations would drop in the next year in the absence of additional emissions; one would then add in one more year's worth of emissions along with its own response function. But, as my point has been from the beginning, applying even this simplified approach will make it obvious that the "about half" rule of thumb doesn't actually work when you have large changes in emissions, because it is being driven in large part by the excess sink resulting from the existing disequilibrium.
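      That convolution can be sketched in a few lines. A hedged illustration: the impulse-response coefficients below are commonly quoted Bern-approximation values (they may not match the exact parameterization discussed here), and the emissions history is invented purely for illustration:

```python
import math

# Bern-style impulse response: a0 + sum of a_i * exp(-t / tau_i).
# Coefficients are commonly quoted approximation values (assumption).
A0 = 0.217
MODES = [(0.259, 172.9), (0.338, 18.51), (0.186, 1.186)]

def airborne(t):
    """Fraction of a 1-unit emissions pulse still airborne after t years."""
    return A0 + sum(a * math.exp(-t / tau) for a, tau in MODES)

def perturbation(emissions):
    """Atmospheric perturbation after the last year of a yearly emissions list."""
    n = len(emissions)
    return sum(e * airborne(n - 1 - i) for i, e in enumerate(emissions))

# Invented ramp of emissions, then a sudden cut to near zero: the
# perturbation actually falls the next year, even with nonzero emissions,
# because decay of the accumulated excess outweighs the new pulse.
history = [0.1 * year for year in range(100)]
after_cut = perturbation(history + [0.4])
before_cut = perturbation(history)
```

Convolving real historical emissions instead of this ramp is the calculation described above; the qualitative conclusion, that the "about half" rule fails under large emissions changes, is the same.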

      You can read about the development of the simplified response function here:
      http://www.gfdl.noaa.gov/bibliography/related_files/fj9601.pdf

      -MMM

    7. A diffusional model will generate the same fat-tail response that the BERN model generates. Find another model that will do this with equivalent conciseness. Oh sure, you can create a multi-compartment model, but all that represents is diffusion.

  8. WHT: How does your simple diffusion approach account for saturation of the solubility of CO2 in the ocean? This is not a small perturbation but serves as the rate-limiting step.

    Specifically a CO2 molecule does not simply randomly walk across the air-water interface. There is a chemical potential, which can act as a significant barrier if the surface waters are depleted of carbonate. The carbonate buffering reactions are responsible for roughly 90% of the solubility of CO2. And reaction with rising atmospheric concentrations is indeed reducing the carbonate concentration of surface waters (aka ocean acidification).

    Archer treats this all very simply in a zero-dimensional box model in his adaptation of Berner's GEOCARB III model at http://climatemodels.uchicago.edu/geocarb/geocarb.doc.html

    1. A random walk is inherent in the process. Consider the idea of vertical eddy diffusion. In eddy diffusion the random walk is not of individual water molecules across an interface; it is of a general process of eddies randomly moving up and down a vertical column.

      If people don't get out of this box of mistaking the micro for the macro, there is little hope of modeling large-scale processes.

  9. If it's all about Boltzmann statistics and diffusion, then the fraction of emissions that remains in the atmosphere should be about the same for all gases that don't decompose in atmospheric chemical reactions.

    But around 95% of SF6 emissions remain in the atmosphere, compared to around 50% of CO2 emissions (see, e.g., table 2 of I. Levin et al., Atmos. Chem. Phys. 10, 2655–2662 (2010)). How does your model explain the order-of-magnitude difference between the sinks for the two gases?

    1. SF6 doesn't want to enter the ocean, does it? You're talking to someone who worked on dopant diffusion from the gas phase years ago. Not all dopants incorporate the same.

      I really don't get the resistance to these basic physics models.

    2. Jonathan, why are you only making an exception for "decomposition in atmospheric chemical reactions"? The far higher solubility of CO2 compared with SF6 is the result of liquid phase interactions of these molecules with water. In particular chemical reaction of CO2 with water to form the highly soluble HCO3(-) ion.
