
Environmental Encyclopedia 3
Modeling (computer applications)
Deterministic and Stochastic Models
A deterministic model produces a single answer for each set of variables; that is, the input variables determine the outcome. Different results can be produced only by entering different values for the input variables. Deterministic models are useful where the values of input variables are reliably known. For example, if an engineer knows with reasonable certainty the strength of a concrete dam and the force applied by a full reservoir, a deterministic model should be used to calculate the water level that would cause the dam to break. A stochastic model, in contrast, includes an element of randomness, so that there will probably be a different result each time the model is run. For example, a population model might include a slightly randomized birth rate, or a reservoir model might include a randomized variable for rainfall, which in fact varies unpredictably from year to year. An advantage of using a stochastic (random) variable is that it helps the model approximate the fluctuations that can occur in a real system: there is no way to predict the actual number of pups a wolf pack will have each year, but it is possible to predict with some precision the minimum and maximum number that is likely. Running a stochastic model repeatedly can produce a statistically significant sample of experimental results. Statistical analysis can then be used to assess the probability of various outcomes, such as a 25% probability that a wolf population will rise over the next 100 years.
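The repeated-run procedure described above can be sketched in a few lines of Python. The birth and death rates, starting pack size, and threshold below are invented for illustration and are not taken from the encyclopedia text.

```python
import random
import statistics

def simulate_population(start=20, years=100, seed=None):
    """Project a wolf population with a randomized annual birth rate."""
    rng = random.Random(seed)
    pop = start
    for _ in range(years):
        birth_rate = rng.uniform(0.15, 0.35)  # stochastic variable
        death_rate = 0.20                     # fixed for simplicity
        pop = max(0.0, pop + pop * (birth_rate - death_rate))
    return pop

# Run the stochastic model repeatedly to build a sample of outcomes,
# then use simple statistics to assess the probability of an increase.
runs = [simulate_population(seed=i) for i in range(1000)]
prob_increase = sum(p > 20 for p in runs) / len(runs)
print(f"median final population: {statistics.median(runs):.1f}")
print(f"probability population rises over 100 years: {prob_increase:.0%}")
```

Each run differs because of the randomized birth rate; only the ensemble of runs, not any single run, is meaningful.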
Development of Modeling
Although mathematical models and analog models (physical structures such as clay models of landforms or river basins) have long been used in environmental problem solving, the growth of modern computer modeling began in the 1960s with the development of digital computers. One of the first publicized computer models was a climate model that calculated climate conditions based on three input parameters: temperature, air pressure, and wind speed. Since then models have been used to predict weather, stream discharge, soil erosion, population growth, pollution impacts, and many other phenomena. The ability to use a computer greatly increased the complexity of models that could be developed, and the widespread availability of computers increased the number of people developing models for a growing number of purposes. Models are used in environmental monitoring and management, physics, engineering, hydrology, economics, demographics, and many other fields. One reason modeling developed more or less simultaneously in many disciplines is that the techniques of programming a computer to simulate real-world variables are highly portable: the structure of a model used to interpret water flow might be modified and applied to magma movement in the earth or to nutrient flows through a landscape. Modeling is now a widespread technique that is used to help explain how natural systems function and to inform public policy concerning the environment, the economy, and many other issues.
Sensitivity Analysis, Calibration, and Validation
Once a model is developed it is usually tested against
observed events or data to indicate how reliable it is or
where its weaknesses lie. One of the first tests usually run
is sensitivity analysis. A model’s outcome may be more sensitive to changes in one variable or process than another: in
a beach erosion model, for example, the predicted erosion
rate is likely to be influenced more by wave energy than
by sand particle size, even though both factors need to be
considered. It is essential to know to which factors the model
is most sensitive, because error in estimating those factors
could introduce significant error into the results. Sensitivity
analysis involves adjusting variables or processes and then
running the model to see which factors cause the greatest
variability in outcomes.
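A minimal sketch of this perturb-and-rerun procedure follows, using the beach erosion example. The erosion formula, baseline values, and perturbation size are all hypothetical, chosen only to make the technique concrete.

```python
# Hypothetical erosion model: the functional form is invented for
# illustration, with wave energy deliberately weighted more heavily.
def erosion_rate(wave_energy, grain_size):
    return 0.8 * wave_energy ** 1.5 / (1.0 + 2.0 * grain_size)

baseline = {"wave_energy": 10.0, "grain_size": 0.5}

def sensitivity(model, params, delta=0.10):
    """Perturb each input by +/-delta and report the relative swing in output."""
    base = model(**params)
    swings = {}
    for name in params:
        lo = dict(params); lo[name] = params[name] * (1 - delta)
        hi = dict(params); hi[name] = params[name] * (1 + delta)
        swings[name] = abs(model(**hi) - model(**lo)) / base
    return swings

print(sensitivity(erosion_rate, baseline))
```

For this particular formula a 10% perturbation in wave energy swings the output about three times as much as the same perturbation in grain size, so estimation error in wave energy would matter most.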
Once the input variables and relationships in a model
are established, the model can be calibrated, or tuned, to make
it better represent reality. Calibration usually entails adjusting parameters until the model produces results similar to those observed in the real world. Calibrating a model
for forest growth, for example, might involve running the
model repeatedly with different growth rates until the model
approximates observed historic growth rates and densities.
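One way to implement such a calibration loop is a simple parameter sweep, as sketched below. The growth model, the observed target density, and the range of rates tried are all invented for illustration.

```python
def forest_model(growth_rate, years=50, initial=100.0):
    """Toy exponential stand-density model (illustrative, not from the source)."""
    density = initial
    for _ in range(years):
        density *= (1 + growth_rate)
    return density

observed_density = 450.0  # hypothetical historic observation

# Calibrate: run the model repeatedly with different growth rates
# and keep the rate whose output best matches the observation.
best_rate, best_err = None, float("inf")
for i in range(1, 100):
    rate = i / 1000.0  # try rates from 0.001 to 0.099
    err = abs(forest_model(rate) - observed_density)
    if err < best_err:
        best_rate, best_err = rate, err

print(f"calibrated growth rate: {best_rate:.3f}")
```

A real calibration would typically use an optimization routine rather than a grid sweep, but the logic is the same: adjust the parameter, rerun, compare against observation.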
Validation is like calibration, in that it involves comparing a model’s results to observed data. The objective of validation, though, is to demonstrate that the model is reliable. The assumption of model validation is that if the model
predicts the correct results in a known situation, then it
will probably predict correctly in an unknown, experimental
situation.
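The key difference from calibration is that validation compares the model against observations it was not tuned on. A toy illustration, with invented observations and an arbitrary acceptance threshold:

```python
# Hypothetical (year, value) observations: the first set would be used for
# calibration, the second is held out purely for validation.
calibration_obs = [(1, 102.0), (2, 104.1), (3, 106.2)]
validation_obs = [(4, 108.4), (5, 110.5)]

def model(year, rate=0.0205, initial=100.0):
    """Growth model with a rate assumed already calibrated on the first set."""
    return initial * (1 + rate) ** year

# Validate: compare predictions against the held-out observations only.
errors = [abs(model(y) - v) / v for y, v in validation_obs]
mean_error = sum(errors) / len(errors)
is_valid = mean_error < 0.05  # acceptance threshold chosen for illustration
print(f"mean relative error on held-out data: {mean_error:.2%}")
```

If the model reproduces the held-out data within the chosen tolerance, that supports (though never proves) its reliability in new situations.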
General Circulation Models
A type of model that has received much attention
since the mid-1980s is the general circulation model, or
GCM. This is a class of highly complex models designed
to predict with reasonable accuracy the behavior of the earth’s atmosphere (climate) or oceans. GCMs have been used to predict global warming as a result of increased concentrations of carbon dioxide in the atmosphere. GCMs are complex because they simulate the behavior of the atmosphere and the oceans, which have complex three-dimensional movements or flows of air masses or water. Predicting how changes in heat input at one location affect temperatures and flow rates at another location requires thousands of calculations performed on many interrelated variables at thousands of points in space. Because GCMs must keep track of so many variables and so many points in space, they are usually run on supercomputers that can perform billions of calculations per second.