**Er, 1978–2010 is not 600,000 years. Not even close. However, this graph
may provide a little more information:
Not quite 600,000 years, but considerably more than 30. Here's some more
**Again, a 30-year trend merely backs my claim.
**What are you trying to say? That the temperature of the planet is rising?
That CO2 levels are rising? No argument from me.
**The data presented shows:
* That CO2 levels are rising.
* That average temperatures are trending upwards.
I have no issue with that data.
**I am satisfied that AGW has been shown to be the most likely explanation
for the temperature rise that has been noted, with around 95% confidence.
That is not 100% confidence and would likely not qualify for the money. It
is likely that, by the time 100% confidence has been reached, several things
will have occurred:
* Milloy will be dead.
* VERY serious problems associated with global warming will be occurring and
the planet will have descended into a state of anarchy. US Dollars will
likely be virtually worthless. Food will be the only currency of value.
Maybe. The problem is that none of the satellites are able to measure
planetary albedo with sufficient accuracy to make a definitive
determination.
**Which is why measuring the rate of heat retention by the oceans is so
important.
Examine the graph on page 4. The planet's oceans store vastly more heat than
the troposphere does. The oceans are warming.
We can do almost nothing in the way of measuring
albedo from the ground. The plan is for the satellite to measure how
much energy is reflected by the planet (which includes atmospheric,
ocean, ice, land, etc) and also solar output. The energy difference
is presumed to be what the planet absorbs. Note that not all of the
energy is necessarily in the IR (heating) band. Apparently it's
sufficiently important that NASA burned $424 million on the failed
Glory launch, and other global-warming-related birds. The current
assumption that solar variations do not account for the alleged rise
in average temperatures is based on computer models with some rather
serious errors.
**Really? Which errors? We know that the Sun's output has diminished
(slightly) over the past couple of decades, and yet the temperature trend of
the Earth is still up.
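The energy-difference method described above can be sketched in a few
lines. The figures and function names here are illustrative assumptions
(round textbook numbers, not Glory mission data):

```python
# Back-of-the-envelope sketch of the energy-difference method: measure
# incoming solar flux and reflected flux, and presume the difference
# is what the planet absorbs. All figures are round illustrative values.

SOLAR_CONSTANT = 1361.0  # W/m^2, approximate solar flux at top of atmosphere

def albedo(incoming, reflected):
    """Fraction of incoming energy reflected back to space."""
    return reflected / incoming

def absorbed_flux(incoming, reflected):
    """Energy not reflected is presumed absorbed by the planet."""
    return incoming - reflected

# With a planetary albedo near 0.30, roughly 30% of sunlight is
# reflected and the remaining ~70% is absorbed (not all of it as IR).
reflected = 0.30 * SOLAR_CONSTANT
print(albedo(SOLAR_CONSTANT, reflected))        # approx 0.30
print(absorbed_flux(SOLAR_CONSTANT, reflected)) # approx 952.7 W/m^2
```

The point of flying a satellite for this is that both fluxes have to be
measured above the atmosphere, which is why it can't be done from the
ground.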
There's also a rather odd problem of just what the satellites are
actually measuring. Temperature varies with altitude. Satellite IR
imagers measure through all the various layers of the atmosphere. If
there are clouds covering a land mass, the IR imager gets the
temperature of the clouds, not the ground. So, to prevent this
obvious anomaly, the computers are set to only read numbers where
there are no clouds. However, that discounts the effects of aerosols
and particulates (i.e. dust) in the upper atmosphere, which does a
marvelous job of reflecting sunlight into the IR imager. Volcanoes
make it really difficult to get accurate readings. Plenty of other
complications requiring the usual tweaks, adjustments, compensations,
normalization, and cherry picking. Oh well.
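The cloud-masking step described above amounts to a simple filter:
cloudy scenes never enter the average at all. A toy sketch, with
hypothetical field names and made-up readings:

```python
# Toy sketch of cloud masking: keep only scenes flagged cloud-free,
# which silently drops cloudy regions (and their cloud-top readings)
# from the average. Field names and values are hypothetical.

scenes = [
    {"temp_k": 288.1, "cloudy": False},
    {"temp_k": 262.4, "cloudy": True},   # cloud-top temperature, not ground
    {"temp_k": 290.3, "cloudy": False},
    {"temp_k": 255.0, "cloudy": True},
]

clear = [s["temp_k"] for s in scenes if not s["cloudy"]]
mean_clear = sum(clear) / len(clear)
print(round(mean_clear, 1))  # 289.2 -- the cloudy scenes are never counted
```

Note that the filter says nothing about aerosols: a dust-laden but
cloud-free scene passes the mask and still skews the reading.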
**Which is why ocean temperature measurements are so important. It is the
planet's oceans that contain the most heat. By a considerable margin.
What Milloy has done with the "global thermometer" mentioned above is
to take as much of the METAR and NOAA temperature data as possible and
average all of it. The theory is that if you're faced with a large
number of potentially erroneous data points, and don't have the means
to reduce the errors, averaging all the bad data together will somehow
result in good data. That's because the errors will tend to be in
random directions and hopefully cancel. Since the IPCC uses the same
method, one can presume it to be valid. However, I have my doubts.
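The averaging theory above is easy to demonstrate with synthetic data:
random errors shrink as you average more readings, but a shared
systematic bias survives no matter how many readings you average. The
numbers below are made up for illustration, not METAR/NOAA data:

```python
# Random station noise averages out; a shared systematic bias does not.
# All values here are synthetic, chosen only to illustrate the point.
import random

random.seed(42)

TRUE_TEMP = 15.0   # hypothetical true average temperature, deg C
BIAS = 0.5         # a systematic offset shared by every station

def station_reading(bias=0.0):
    # each reading carries random noise (std dev 2 deg) plus any shared bias
    return TRUE_TEMP + random.gauss(0.0, 2.0) + bias

n = 100_000
unbiased = sum(station_reading() for _ in range(n)) / n
biased = sum(station_reading(BIAS) for _ in range(n)) / n

print(round(unbiased, 2))  # close to 15.0: the random noise cancels
print(round(biased, 2))    # close to 15.5: the bias survives averaging
```

Which is the catch with the method: averaging only rescues you if the
errors really are random and independent, not if the stations share a
common flaw.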
Anyway, I have not attempted to debunk anything that you've offered.
What I've done is attempt to undermine your apparently unshakable
certainty in AGW and the IPCC. If I've set you on the path of
critical thinking and academic skepticism, then I haven't wasted my time.
**I do not have an "unshakable certainty in AGW and the IPCC". I accept that
the 95% certainty of AGW is a reasonable figure. What I find irrational is
the fact that many people seem to be clinging to the 5% uncertainty and
hoping that a very large number of very smart scientists are wrong.
Fundamentally, the way I see it is like this:
* If we spend a few bucks today to mitigate CO2 emissions, we may be able to
avert the 95% probability of disaster.
* If we don't spend the money today, then it is highly probable (95%
certainty) that the cost will escalate with each passing year, to a point
where we will be unable to fund mitigation.
* If the scientists are wrong and we spend a few bucks now, then it has cost
us some money.
* If the scientists are right and we don't spend the money, our civilisation
will not likely survive.
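The four cases above can be framed as a toy expected-cost comparison.
The probability comes from the stated 95% figure; the cost figures are
arbitrary placeholder units, not estimates from any study:

```python
# Toy expected-cost framing of the act-now vs. wait decision.
# The cost figures are illustrative placeholders in arbitrary units.

P_AGW = 0.95            # stated confidence that AGW is real
COST_MITIGATE = 1.0     # "a few bucks now"
COST_DISASTER = 100.0   # cost of unmitigated disaster

# If we act now (in this simple framing) we pay the mitigation cost
# regardless of whether the scientists are right.
expected_act = COST_MITIGATE

# If we wait, the disaster cost lands with probability P_AGW and
# nothing happens with probability 1 - P_AGW.
expected_wait = P_AGW * COST_DISASTER

print(round(expected_act, 1))   # 1.0
print(round(expected_wait, 1))  # 95.0
```

Under these toy numbers, waiting carries nearly a hundred times the
expected cost of acting, which is the asymmetry the bullets describe.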
Make no mistake: I did not say that humans will be wiped out. Many will
survive. Anarchy is looking like a real probability.