zlacker

[return to "Climate Change Tracker"]
1. oceanp+Hh[view] [source] 2023-09-03 17:04:15
>>Brajes+(OP)
If you pull up the last 2,000 years in the Yearly Average Observed Temperature anomaly, the years 536-537 should show a global average temperature anomaly of -2C to -5C from the Volcanic Winter of 536 (an 18-month period during which the sun was dimmed by volcanic ash), but the graph shows <1C. There's tree-ring evidence of it from all over the world.
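
For anyone who wants to reproduce the check, here's a minimal sketch. It assumes a hypothetical CSV export of the tracker's yearly anomaly series with "year" and "anomaly_c" columns; that file name and schema are my guess, not the site's actual format.

  # Sketch of the 536-537 check against a hypothetical CSV export
  # ("yearly_anomaly.csv" with columns year, anomaly_c) -- not the
  # tracker's actual file name or schema.
  import pandas as pd

  df = pd.read_csv("yearly_anomaly.csv")
  window = df[(df["year"] >= 536) & (df["year"] <= 537)]
  print(window)

  # Tree-ring reconstructions point to roughly -2C or more of cooling;
  # if the plotted series never gets anywhere near that, this is the
  # discrepancy being described above.
  if window["anomaly_c"].min() > -1.0:
      print("536-537 cooling is much smaller than reconstructions suggest")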

If they missed this, it calls the rest of the data into question, IMO.

2. fwungy+qs[view] [source] 2023-09-03 18:07:29
>>oceanp+Hh
Large models, and climate models are among the largest, are highly vulnerable to high variance because of the high dimensionality of their parameter sets.

Say you have n continuous parameters in your model. This equates to an n-dimensional parameter space. Unless you use a high-iteration Monte Carlo technique, the output of your model is going to depend on where exactly your estimator point in that n-dimensional space lands, and its accuracy will depend on its distance from the actual (unknown) point in the set.
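
A toy sketch of that point (not a climate model; every parameter and number below is invented for illustration): a single run at the "best guess" parameter point gives one answer, while Monte Carlo sampling over the same stated uncertainty shows how wide the plausible outputs actually are.

  # Toy illustration of parameter uncertainty in a high-dimensional model.
  # All values are made up; model() is a stand-in for an expensive simulation.
  import numpy as np

  rng = np.random.default_rng(0)
  n = 20                                   # number of uncertain parameters
  true_p = rng.normal(1.0, 0.3, size=n)    # the real (unknown) parameter point
  mean_p = np.ones(n)                      # literature averages / guesses
  spread = np.full(n, 0.3)                 # assumed uncertainty per parameter

  def model(p):
      # Nonlinear combination of inputs, standing in for the real simulation.
      return np.sum(p ** 2) / n

  point_estimate = model(mean_p)           # one run at the guessed point

  samples = rng.normal(mean_p, spread, size=(10_000, n))
  outputs = np.array([model(s) for s in samples])

  print(f"true output:    {model(true_p):.3f}")
  print(f"point estimate: {point_estimate:.3f}")
  print(f"MC mean +/- sd: {outputs.mean():.3f} +/- {outputs.std():.3f}")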

Now, many of the parameters in large models have never been measured. They are averages from the literature or, in cases where there is no literature, which is common in cutting-edge science, the investigators' guesses.

If you look at the meta-studies of climate models, which are what the IPCC uses to make projections, they come out all over the place. These models really aren't great prediction tools. They are best thought of as tools for understanding the components of a complex system.

Covid was a perfect example: modeling suggested devastating impacts, and localities responded differently, some aggressively, some laxly. It didn't seem to matter. Yes, one can find statistically significant instances where different covid responses led to higher mortality rates, but nothing substantial enough to make any group want to change what it did.

CO2 makes up 0.04% of the atmosphere, and humanity is responsible for only 3% of its emissions. We are making very fine-grained estimates with a macro model. It's a bit like carving toothpicks with a chainsaw.
