Humanity has always built models to better understand complicated
observations and to serve as teaching tools. Building a working model suggests "you
are in possession of all the facts", while deviation from observation proves that your model
needs to be improved or replaced.
Early Mechanical Models of the Solar System
In ancient times people built mechanical models of
our solar system to predict the current location of the planets and stars. These models
were based upon naked-eye observations and worked well until better observations
indicated improvements were required. Model builders responded by adding
epicycles to the planetary positions (perhaps this was just the easiest fix for a mechanical
model).
In the 16th century, Nicolaus Copernicus
suggested that a heliocentric system would simplify the models, but this
idea seemed too revolutionary to most people. Improved (but non-magnified) observations by
Tycho Brahe coupled with mathematical analysis by
Johannes Kepler proved that
planetary orbits needed to be changed from circular to elliptical (how would the
mechanical model builders accomplish this?). Telescopic observations by
Galileo proved the heliocentric system
was correct but now the Roman Catholic Church stood in the way of scientific
progress. Many scientists avoided controversy (including burning at the stake) when they stated "we don't really believe in the heliocentric system, it just
simplifies our models".
Although the work done by people such as
Urbain Le Verrier (who used pen and paper to predict the location of an
unknown planet eventually named Neptune) is
unbelievably impressive, a computer is required to model known planetary and
stellar motion. In short, a mechanical model is no longer possible or practical.
Furthermore, if the Earth were the only planet
orbiting the Sun, the shape of Earth's orbit would eventually settle into a
circle. However, the gravitational tugs of other planets cause the shape of Earth's orbit
to cycle from circular to elliptical and back over a period of roughly 100,000 years.
This theory was also worked out using only pen and paper by the Serbian
scientist Milutin Milankovitch but remained a mathematical abstraction until computer
modeling was applied. BTW, other aspects of Earth's orbit and axial tilt
combine in such a way to enable glaciations (ice ages) every 120,000 years or
so. Our current interglacial period (known as the
Holocene) started 11,700 years ago.
Computer Limitations (and Science Limitations)
Because computers can be used to
calculate equations many millions of times faster than any human, modern
life would be impossible without them. For example, who could imagine
any government manually processing our income tax claims? However, computers
do have limitations that most people are not aware of.
Experiment: the next time you pour cream into your coffee, carefully watch the swirling clouds as the two fluids (liquids in this case)
mix. This turbulent behavior is
partly described by a combination of
chaos theory and fluid dynamics.
Unfortunately no computer on the planet now (2010), or any time soon, is able to
accurately model this. Think about it: you are only mixing two liquids so why is
the resulting action so complicated? To make matters worse, every time you
perform the coffee-cream experiment you will observe a slightly different
result. So maybe we need to consider more details like: exact volumes and
temperatures of each liquid, height the cream is poured from, place where it has
been poured into, exact components of the cream, exact components of the coffee,
viscosities of both liquids, smoothness and shape of the container, swirling speed of the coffee from the initial filling event, etc.
It turns out that an accurate computer model will require us to mathematically
compute the properties and trajectory of every molecule. Since
computers won't be
doing this anytime soon, perhaps we can cut corners by only computing the
average action of each deciliter (one tenth of a liter) at ten second
intervals. As long as the simulation gives us a homogeneous mixture
after 2 minutes and possibly allows the cream to settle to the bottom after a
couple of hours then our computer simulation might be good enough.
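The corner-cutting idea above (averaging over deciliters instead of tracking molecules) is known as coarse-graining. Here is a minimal sketch, with invented numbers, assuming the cream concentration is stored in a 2-D grid of fine cells that we average into larger blocks:

```python
# Hypothetical sketch: coarse-graining a fine simulation grid.
# Instead of tracking every molecule, average non-overlapping blocks
# of fine cells into one coarse value (analogous to averaging one
# deciliter of coffee every ten seconds).

def coarse_grain(fine_grid, block):
    """Average non-overlapping `block x block` tiles of a square 2-D grid."""
    n = len(fine_grid)
    coarse = []
    for i in range(0, n, block):
        row = []
        for j in range(0, n, block):
            tile = [fine_grid[x][y]
                    for x in range(i, i + block)
                    for y in range(j, j + block)]
            row.append(sum(tile) / len(tile))
        coarse.append(row)
    return coarse

# a 4x4 "cream concentration" field reduced to 2x2 averages
fine = [[1.0, 1.0, 0.0, 0.0],
        [1.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 0.5, 0.5],
        [0.0, 0.0, 0.5, 0.5]]
print(coarse_grain(fine, 2))   # [[1.0, 0.0], [0.0, 0.5]]
```

Shrinking `block` (and the time step) recovers detail at the cost of more computation, which is exactly the trade-off described above.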
Future improvements in computer technology along with advances in computer
programming techniques might allow us to slowly reduce the average volume modeled
along with the average time-period being simulated. And yet we are still
only talking about a cup of coffee!
Modeling Earth's Climate
Simulating Earth's Weather with Pencil and Paper
The first attempt to model Earth's weather was done with pencil and paper using
something called the two-box model. This scheme (which is still
used today to teach science students) uses two boxes to model the whole earth.
The top box represents Earth's atmosphere while the bottom box represents
Earth's surface. It is obviously very simplistic but a little tinkering provides
a good starting point to other more complicated models.
The two-box model was replaced with two-dimensional
models, three-dimensional models, then finally
cell models. A complete description of these models is beyond the scope
of this introduction but you can Google these phrases
to investigate further.
I recently stumbled upon the first serious attempt at a cell model, which
was carried out in 1922, obviously without the aid of a computer. Excerpt from:
www.aip.org/history/climate/GCM.htm (please read this constantly updated page):
In 1922, the British mathematician and physicist
Lewis Fry Richardson published
a more complete numerical system for weather prediction. His idea was to divide
up a territory into a grid of cells, each with its own set of numbers describing
its air pressure, temperature, and the like, as measured at a given hour. He
would then solve the equations that told how air behaved (using a method that
mathematicians called finite difference solutions of differential equations). He
could calculate wind speed and direction, for example, from the difference in
pressure between two adjacent cells. These techniques were basically what
computer modelers would eventually employ. Richardson used simplified versions
of Bjerknes's "primitive equations," reducing the necessary arithmetic
computations to a level where working out solutions by hand seemed feasible.
Even so, "the scheme is complicated," he admitted, "because the atmosphere
itself is complicated". The number of required computations was so great that
Richardson scarcely hoped his idea could lead to practical weather forecasting.
Even if someone assembled a "forecast-factory" employing tens of thousands of
clerks with mechanical calculators, he doubted they would be able to compute
weather faster than it actually happens. But if he could make a model of a
typical weather pattern, it could show meteorologists how the weather worked. So Richardson attempted to compute how the
weather over Western Europe had
developed during a single eight-hour period, starting with the data for a day
when scientists had coordinated balloon-launchings to measure the atmosphere
simultaneously at various levels. The effort cost him six weeks of pencil-work.
Perhaps never has such a large and significant set of calculations been carried
out under more arduous conditions: a convinced pacifist, Richardson had
volunteered to serve as an ambulance-driver on the Western Front. He did his
arithmetic as a relief from the surroundings of battle chaos and dreadful
wounds. The work ended in complete failure. At the center of Richardson's
simulacrum of Europe, the computed barometric pressure climbed far above
anything ever observed in the real world. "Perhaps some day in the dim future it
will be possible to advance the calculations faster than the weather advances,"
he wrote wistfully. "But that is a dream." Taking the warning to heart,
meteorologists gave up any hope of numerical modeling.
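Richardson's cell scheme can be illustrated with a toy finite-difference calculation: estimating wind from the pressure difference between adjacent grid cells. This is only a sketch, not his primitive equations; the proportionality constant `k`, the grid spacing, and the pressure values are all invented for illustration:

```python
# Toy illustration of a finite-difference step: wind estimated from
# the pressure gradient between adjacent cells (hypothetical constant k).

def wind_from_pressure(pressure, dx=100_000.0, k=0.01):
    """For each interior cell, wind ~ -k * dP/dx.
    `pressure` is a 1-D list of cell pressures in pascals,
    `dx` is the cell width in metres."""
    winds = []
    for i in range(1, len(pressure) - 1):
        # central difference: use the two neighboring cells
        dpdx = (pressure[i + 1] - pressure[i - 1]) / (2 * dx)
        winds.append(-k * dpdx)   # air accelerates from high to low pressure
    return winds

# pressure falling west to east drives an eastward (positive) wind
p = [101_500.0, 101_300.0, 101_100.0, 100_900.0]
print(wind_from_pressure(p))
```

Repeating this kind of arithmetic for every cell, every variable, and every time step is exactly the workload that overwhelmed Richardson's pencil-and-paper effort.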
Weather vs. Climate
We all know that weather reports today are not very accurate, and yet, they
have improved considerably since the 1950s. In certain instances, such as
tropical depressions which can develop into hurricanes, weather reports may be
reasonably accurate over a period of 7-10 days. But just like our cup-of-coffee
example described previously, skipping over the short-term details will allow us to predict long-term trends. This is the major difference between Weather and Climate and I should point out
that "climate models" are much better than "weather models".
"Climate vs. Weather"
modeling of the environment in a period of 1 to 100 years
modeling of the environment in a period of days to weeks
Even if it were possible to accurately model climate or weather, you cannot mathematically model
all the inputs. For example, here are two (of many) events which appear to act
randomly. There are 500 active
volcanoes on Earth today with as many as 1,500 potentially active
volcanoes. However, there does
not appear to be any mathematical pattern which would describe their frequency
or intensity. To make matters worse, all active volcanoes release a variable (random)
volume of CO2 which will increase the greenhouse effect.
To make things more complicated, while some volcanoes release larger volumes of dark-colored particulate matter
which absorbs sunlight, other volcanoes release light-colored particulate matter
which directly reflects sunlight back into space. Some volcanoes also release sulphur
dioxide compounds which stimulate cloud formation (silvery clouds are the
perfect sunlight reflectors). Inspect the chart to the right and notice how
the black line drops in 1991 due to the effects of Mount Pinatubo.
Added Complication: CO2
can remain in the atmosphere for 100 years or more. The effects of white
particulate matter and/or sulphur dioxide will only last one to two years.
So what might initially appear to be a short-term cooling event may eventually become a long-term warming event.
One cigarette improperly discarded in a National
Park will occasionally start a forest fire resulting in a massive
release of heat, smoke, and CO2 into the atmosphere.
These seemingly random events (along with the previously mentioned turbulent
behavior of fluids) need to be manually inserted into our climate models.
Simulating Earth's Climate (a very simple starting model)
"a 16-cell model"
Imagine for a moment, a spinning Earth which is cut vertically into 4 columns
and horizontally into 4 rows which results in 16 zones. We now need to write
a single equation for each zone which would simulate:
the quantity of solar energy entering
each zone over the course of a day
the quantity of energy temporarily absorbed by: soil, melting ice,
warming water, and evaporation
the quantity of energy being radiated back into space; especially at night
the quantity of energy temporarily released by: freezing water, and condensing water vapor
Because there is more sunlight at the equator, a greater amount of sunlight
will be absorbed in rows A + B than rows N + S. In fact, you may wish to
visualize an oval of light stretched from North to South and wide enough to
cover two columns at the equator. Because the surface of the globe is spinning
west-to-east (left to right), our view of the solar oval will be seen to move from east to west across the surface.
Because the surface of the globe is spinning west-to-east while the
atmosphere wants to stay put, an apparent east-to-west wind will be blowing over
the equator so we'll need equations to describe that as well. Depending upon how
you handle parameter communication between zone boundaries, you will probably
need at least 28 (12v+16h) inter-zone
calculations. (column 4 zones are connected to column 1 zones; there are
no zones above row N or below row S because these are actually triangles)
This means that each simulated tick of the clock will require at least
44 (16 zone + 28 inter-zone) calculations. You might be able to try this with pen and paper but it
will be time consuming and error prone. Moving the simulation into a computer
will allow you to introduce larger (more accurate) equations into each location.
If you are brave then you'll need to introduce seasonal changes. This means
that zone A1 would receive peak daylight in June while B3 would
receive peak daylight in December. It might be easier to visualize a single sine
wave superimposed upon our model where the phase shifts one full cycle over the course of a year.
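The 16-zone bookkeeping described above can be sketched in a few lines of code. Every coefficient here (solar weights, radiation and diffusion constants) is invented purely for illustration; rows are labeled N, A, B, S from pole to pole and the four columns wrap west-to-east:

```python
import math

# A minimal sketch of the 16-zone model: per-zone solar gain with a
# seasonal sine wave, radiation to space, and diffusion across the
# inter-zone boundaries. All constants are made up for illustration.

ROWS, COLS = 4, 4
SOLAR = {"N": 0.5, "A": 1.0, "B": 1.0, "S": 0.5}   # more sunlight near the equator

def neighbors(r, c):
    """Zones touching zone (r, c): columns wrap around,
    but there is nothing above row N or below row S."""
    yield r, (c + 1) % COLS            # east
    yield r, (c - 1) % COLS            # west
    if r > 0:
        yield r - 1, c                 # north
    if r < ROWS - 1:
        yield r + 1, c                 # south

def tick(temps, day_of_year):
    """One simulated clock tick over all 16 zones."""
    season = math.sin(2 * math.pi * day_of_year / 365)   # one cycle per year
    new = [row[:] for row in temps]
    for r, name in enumerate("NABS"):
        for c in range(COLS):
            absorbed = SOLAR[name] * (1 + 0.1 * season)
            radiated = 0.01 * temps[r][c]
            # energy exchanged across this zone's inter-zone boundaries
            flux = sum(temps[nr][nc] - temps[r][c] for nr, nc in neighbors(r, c))
            new[r][c] += absorbed - radiated + 0.05 * flux
    return new

temps = [[10.0] * COLS for _ in range(ROWS)]
temps = tick(temps, day_of_year=172)   # near the June solstice
print(temps[1][0], temps[0][0])        # equatorial zones warm faster
```

Running `tick` in a loop, one simulated hour or day at a time, is the whole game; replacing the made-up constants with physically measured ones is where the real work lies.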
Simulating Earth's Climate (more layers)
Atmospheric Layer prefix: a
Surface Layer prefix: s
Although solar energy directly heats the ground wherever it falls on land,
heated ocean water tends to redistribute energy via events as small as
evaporation and as large as hurricanes, which all occur in the atmosphere. Ocean
energy is also responsible for water currents as small as the
Gulf Stream and as large as the
Thermohaline Circulation which all occur at, or below, the surface. This
means we might want to introduce a second layer so atmospheric events could be
simulated in the upper layer while ocean events would be simulated in the lower layer.
Doubling the zones from 16 to 32 means the number of zone calculations must double
from 44 to 88 but we need to add 16 additional calculations to handle the
flow of energy between adjacent zones in each layer. We now require
104 (88 + 16 interlayer) calculations for each tick of the simulation clock. Yikes!
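The calculation counts quoted above are easy to verify with a few lines of arithmetic:

```python
# Sanity-checking the per-tick calculation counts for the toy model.
zones = 16
inter_zone = 12 + 16              # 12 vertical + 16 horizontal boundaries
one_layer = zones + inter_zone    # single-layer model
two_layer = 2 * one_layer + 16    # doubled, plus 16 inter-layer calculations
print(one_layer, two_layer)       # 44 104
```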
But do we have enough squares? More land exists in the Northern Hemisphere so
more squares would allow us to code for that. Also, since uplift formed the
Panamanian Land Bridge
3 million years ago (cutting off the Atlantic from the Pacific in Panama), ocean currents
are blocked in certain areas, making glaciations
more common. If we want to model ocean currents then we will need a lot more cells.
JASON - Climate Models Commissioned by the U.S. Government
It is an historical fact that the US government commissioned a climate study
in 1978 by a group of scientists associated with
JASON. This group
created a computer model with the audacious name "The JASON Model of the
World" which produced a report in 1979 titled:
JASON April 1979 Technical Report JSR-78-07
preindustrial CO2 concentrations in the atmosphere were expected
to double by 2035 (today's models indicate 2050-2100 depending upon human activity)
temperatures would rise by 2-3 C by the end of the 21st century (current models agree)
temperatures at polar caps would rise much faster; perhaps by 10-12 C
(current models agree)
caveat: some temperatures in this report are given in degree
changes Celsius while others are given in degree changes
Kelvin. Multiply either of these numbers by 9/5 to get
degree changes Fahrenheit.
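The caveat above amounts to this: a temperature change is the same number in Celsius and Kelvin (the degree steps are the same size), and converting a change to Fahrenheit needs only the 9/5 factor because the +32 offset cancels when taking a difference:

```python
def delta_c_to_f(delta_c):
    """Convert a temperature *change* from Celsius (or Kelvin) to
    Fahrenheit. No +32 offset: offsets cancel when differencing."""
    return delta_c * 9 / 5

print(delta_c_to_f(3))    # 5.4  (a 3 C rise is a 5.4 F rise)
```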
Simulating Earth's Climate in Large Data Centers
The associated image and text in this section were borrowed from a NOAA (National
Oceanic and Atmospheric Administration) web site. It appears to be using at
least 3 layers and thousands of zones.
Climate models are systems of differential equations based on the basic laws of
physics, fluid motion, and chemistry. To "run" a model, scientists divide the
planet into a 3-dimensional grid, apply the basic equations, and evaluate the
results. Atmospheric models calculate winds, heat transfer, radiation, relative
humidity, and surface hydrology within each grid and evaluate interactions with neighboring grid cells.
This is a data-derived animation of ocean surface currents from June 2005 to
December 2007 from NASA satellites. Watch how bigger currents like
the Gulf Stream in the Atlantic Ocean and the Kuroshio in the
Pacific carry warm waters across thousands of miles at speeds
greater than four miles per hour (six kilometers per hour); how
coastal currents like the Agulhas in the Southern Hemisphere move
equatorial waters toward Earth's poles; and how thousands of other
ocean currents are confined to particular regions and form
slow-moving, circular pools called eddies.
covers the period June 2005 to December 2007 and is based on a
synthesis of a numerical model with observational data, created by a
NASA project called Estimating the Circulation and Climate of the
Ocean, or ECCO for short. ECCO is a joint project between the
Massachusetts Institute of Technology and NASA's Jet Propulsion
Laboratory in Pasadena, California. ECCO uses advanced mathematical tools
to combine observations with the MIT numerical ocean model to obtain
realistic descriptions of how ocean circulation evolves over time.
Departing Thoughts on Modeling
What climate professionals say about "climate models":
Climate models are mathematical representations of the interactions between
the atmosphere, oceans, land surface, ice – and the sun. This is clearly a very
complex task, so models are built to estimate trends rather than
events. For example, a climate model can tell you it will be cold in winter, but
it can’t tell you what the temperature will be on a specific day – that’s
weather forecasting. Climate trends are weather, averaged out over time
- usually 30 years. Trends are important because they eliminate - or "smooth
out" - single events that may be extreme, but quite rare.
Climate models have to be tested to find out if they work. We can’t wait for
30 years to see if a model is any good or not; models are tested against the
past, against what we know happened. If a model can correctly predict trends
from a starting point somewhere in the past, we could expect it to predict with
reasonable certainty what might happen in the future.
So all models are first tested in a process called Hindcasting. The
models used to predict future global warming can accurately map past climate
changes. If they get the past right, there is no reason to think their
predictions would be wrong. Testing models against the existing instrumental
record suggested CO2 must cause global warming, because the models could not
simulate what had already happened unless the extra CO2 was added to
the model. All other known forcings are adequate in explaining temperature
variations prior to the rise in temperature over the last thirty years, while
none of them are capable of explaining the rise in the past thirty years.
CO2 does explain that rise, and explains it completely without any
need for additional, as yet unknown forcings.
Where models have been running for sufficient time, they have also been
proved to make accurate predictions. For example, the eruption of Mt. Pinatubo
allowed modelers to test the accuracy of models by feeding in the data about
the eruption. The models successfully predicted the climatic response after the
eruption. Models also correctly predicted other effects subsequently confirmed
by observation, including greater warming in the Arctic and over land, greater
warming at night, and stratospheric cooling.
the text above is an excerpt from this web page.
Facts based on modern (direct-measured) data:
All 25 accepted climate models are able to accurately model Earth's past
climate from 1860 forward (1860 provided humanity with accurate world-wide
temperature measurements due to the
manufacture and sale of inexpensive thermometers).
Scientists do not pick one model while discarding others; they publish the results from all
models then plot them all on a single chart.
In the case of unexpected natural events like the volcanic explosion
of Mount Pinatubo,
climate models must produce similar results when the unexpected natural event is
inserted in the simulation.
The models all agree that:
Earth's climate is getting
warmer, and this is mainly due to a combination of anthropogenic (human-made) warming
COMBINED WITH natural warming (the previous glaciation
ended 11,700 years ago with no help from humans).
Human population increases by one billion every 12 years. Anthropogenic warming will
increase faster as humanity releases evermore greenhouse gases.
Not one climate model shows our environment getting cooler.
The models do disagree on point-of-no-return dates, which range
anywhere from 2015 to 2050.
There are more weather satellites in orbit
now than at any previous time and their measurements are being used to fine-tune
climate models. Some satellites continuously measure Earth's surface (land and
water) temperature over every 16 km (10 mile) square. Others measure glacial ice.
Caveat: Humans cannot "directly"
measure temperatures from space. Why? Temperature is inferred by:
Measuring the microwave radiation emitted by atmospheric gases
(works like a microwave oven in reverse)
Measuring infrared radiation, which is partially absorbed by ever-changing atmospheric
gases between the heat-source and the measuring instrument.
These readings are
mathematically translated (the subject of some debate) into atmospheric temperatures. It is
difficult to use satellites to measure temperature (with certainty) at various atmospheric altitudes.
The Arctic and Greenland are melting at an unprecedented rate. The resulting
influx of fresh water could affect the ocean currents like the
Gulf Stream and the
Thermohaline Circulation which distribute equatorial heat to places like
Northern Europe. This means the current effects of global warming will cool some
locations which will convince many people that global warming is not real.
Increased heat is causing increased evaporation which causes rain to fall
sooner. Many locations are now getting too much rain while locations down-wind are getting little or none.
Therefore, global warming brings shifting precipitation patterns.
Facts based upon pre-modern proxy data
Models based upon data before 1860 can be useful but are less reliable. Why?
First off, while temperature measurements before 1860 exist,
they are usually spotty. For example, the first known temperature
measuring device is the thermoscope
produced by Galileo around 1592, but its limited temperature
range confined it to [mostly] indoor use.
Development continued with various people creating a
sealed tube then adding an external scale, but it was
Huygens in 1665 who first proposed the idea of a standard scale
based upon the melting and freezing points of water
Daniel Fahrenheit created a thermometer scheme based upon a
scale with a freezing point of 32 F and a boiling point of 212 F
Anders Celsius created a thermometer scheme based upon a scale
with a freezing point of 0 C and a boiling point of 100 C
There are numerous accounts of people keeping local records in
England and Ireland, some of which are now held by Britain's Royal Society
of London. One very notable record was created by Thomas Hughes for
Stroud (Gloucestershire) between 1775 and 1795.
Tree Rings
since the majority of trees which contain growth rings are found
at mid latitudes (trees don't grow at either pole; trees at the
equator do not have annual growth rings) this proxy measurement is
only of use in the northern hemisphere
Coral
since coral only grows in warm water, this proxy measurement is
only of value in a thick band around the equator.
Pollen, Dust, and CO2 in Ice Cores
While CO2 in ice cores provides a fairly accurate
measurement of atmospheric carbon dioxide at the time the ice was
formed, trapped pollen and volcanic dust only indicate those
particles were in the vicinity of the freezing water. However, Ice
Cores are available from Antarctica which are over 800,000 years old
and contain trapped CO2 which can be measured and used as a proxy.
Okay, so what do the climate models based upon proxies tell us?
Sunlight seems to vary by no more than one half of one percent. On
its own, changes in sunlight do not do much but can add to other anomalous
effects. Sunlight is somehow involved in small changes in the jet stream
(which can have large effects upon Northern Hemisphere weather)
Global climate (and weather) are continually affected by both
El Nino and
La Nina. This should be
of little surprise since almost 50% of the Earth (the Pacific side) is
covered by water.
Volcanoes affect global climate and weather more than anyone ever
realized. On average, they tend to cool more than heat.
Some heating effects in one area may temporarily cause cooling in other
areas. Remembering that the British Isles are warmed by the Gulf Stream
(London and Moscow are almost at the same latitude so London should be a lot
colder than it is), then anything which affects the Gulf Stream could have
devastating effects on North-Western Europe.
Facts and derivations:
The Medieval Warm Period was 2-3 degrees warmer in the UK but only a
degree warmer in both Labrador and Northern Europe (based upon various
proxies including tree rings). Proxies indicated lower warming elsewhere.
This warming coincides with a period of very low volcanic dust (observed in
ice cores) combined with slightly higher solar output (C14 proxy data).
Since volcanoes stimulate cloud formation by releasing sulphur
dioxide, a reduction in volcanic activity will result in more solar
energy reaching the Earth
More solar energy reaching the Earth drove the Gulf Stream a little
harder which primarily warmed: Greenland, Iceland, the British Isles,
and Northern Europe
This warming anomaly appears to have lasted at least 300 years
Three hundred years of heating resulted in an acceleration of ice
melt (as compared to the rate since the end of the ice age 11,700 years ago)
Any fresh water, including water from ice melt, can poison the
descending (northern) end of the Gulf Stream
Once the Gulf Stream was partially poisoned with fresh water, any
decrease in solar energy (increased volcanic activity producing more
clouds, decreased solar output, etc.) would flip warming in the other
direction. This happened, and we now know it as the
Little Ice Age
which is thought to have started between 1250 and 1275 and ended ~1700. Notes:
Starting and ending dates vary greatly depending upon which data set is used
Some documents based upon human historical
records use a starting date of 1350
Data based upon ice pack and glacial ice use 1250
Although solar energy was slightly reduced during this time, the
effect was tiny compared to the increase in cloud cover
Like the medieval warm period, places affected the most by warming
were similarly affected by cooling
Energy absorbed by Earth but not transported north by the Gulf
Stream goes elsewhere and this is noticed in proxy records from other regions
Emission Scenarios (Using Computers to Model Alternative Futures)
Many citizens are unaware of a related activity done by scientists who coupled
computer-based climate models to computer-based economic models to project
humanity's possible actions 100 years into the future.
This is done by
restarting each climate simulation (computer run) using one of four different emission scenarios
labeled: A1, A2, B1, B2. The outputs are based upon ~40 different outcomes
(averaged across more than 25 climate models).
The "ones" column continues our trend to
globalization while the "twos" column is a shift back to a more
The "A" row places more emphasis on a economic health
(continuing to burn fossil fuels) while the "B" row places
more emphasis on environmental health
Since globalization exports education and also promotes the education of
women, birth rates should slow and family sizes should drop. Therefore, column 1 showed human population
hitting a 9 billion peak in 2050 then dropping off while column 2 hit a 9 billion
peak in the year 2100 (no modeling was done after 2100 so we do not know if that
population will plateau or drop).
The "A1" scenario is subdivided into 3 basic subcategories:
A1F1 (a.k.a. A1FI)
Fossil Intensive (same as "make no changes to current activity")
The first official chart published by the IPCC contained the typo "A1F1"
where they meant to publish "A1FI". All subsequent publications contain this
typo (I guess they just want to remain consistent)
The USA and Canada are firmly in A1
Despite what has been reported in the Western press, China is shifting
from A1F1 toward A1B and may end up at A1T just by inertia. Once they begin
running their factories with zero-cost energy (e.g. no fossil fuels) they
will permanently dominate the world economy.
There are too many humans on Earth to believe we would ever go extinct but I
am convinced that current human culture will not survive unless humanity switches
to scenarios A1T or B1.
Humanity is expected to hit 7 billion sometime in 2012, and many
scientists are now coming around to the idea that this number needs to drop
back to 6 billion or perhaps even 5. It is one thing to have human beings
living off the land in a pre-industrial fashion; it is quite something else
to have 7 billion people living like Americans (a.k.a. homo technologicus).
Since humanity has had no success in voluntarily limiting our population in
the past, I fear that continued climate change will lead to food shortages.
Hunger and compromised immune systems will lead to disease, war, pestilence, and death.
When I first stumbled onto IPCC Emission Scenarios, I couldn't help
but think of the Isaac Asimov short story
written in the 1940s titled The Evitable
Conflict, which happens to be Chapter 9 of
the 1950 book I, Robot. In this story,
"The Machines", powerful
positronic computers which are used to optimize the world's economy and
production, start giving instructions that appear to go against their
function. Although each glitch is minor when taken by itself, the fact that
they exist at all is alarming.
Stephen Byerley, now elected World Coordinator, consults the four other
Regional Coordinators and then asks
Susan Calvin for her opinion.
She discovers that the machines have generalized the
First Law of Robotics to now mean: No machine may harm humanity; or, through inaction,
allow humanity to come to harm (similar to the
Zeroth Law which Asimov developed in later novels). Dr. Calvin concludes
that the "glitches" are deliberate acts by the Machines, allowing a small
amount of harm to come to selected individuals in order to prevent a large
amount of harm coming to humanity as a whole.
In effect, the Machines have decided that the only way to follow the
First Law is to take control of humanity, which is one of the events that
the three Laws are supposed to prevent. Asimov returned to this theme in
The Naked Sun and
The Robots of Dawn, in which the controlling influence is not a small
conspiracy of Machines but instead the aggregate influence of many robots,
each individually tasked to prevent harm to humanity.
Kitchener - Waterloo - Cambridge, Ontario, Canada.