
Environmental Risk and Uncertainty: Eliminating Uncertainty

  • Writer: JD Solomon
Leaders need to know whether uncertainty is a knowledge gap (fixable) or a natural variability (not fixable).

In large, complex systems, uncertainty is a technical reality that must be classified, measured, and managed. The most useful distinction is between epistemic uncertainty, which reflects limits in our knowledge, and aleatoric uncertainty, which reflects inherent variability in the world itself. Serious quantitative modeling depends on knowing the difference, because what can be reduced must be pursued and what cannot be reduced must be designed for. The uncertainty that cannot be eliminated must be effectively managed and communicated.


Epistemic Uncertainty

Epistemic uncertainty comprises those uncertainties due to simplifying model assumptions, missing physical data, or our basic lack of knowledge. Examples include the way we express inputs or relationships to describe natural phenomena, the inputs we choose to include (or leave out), and certain types of numerical error (such as those related to precision or significant figures).

 

Epistemic uncertainty is limited by our understanding of what we know (knowledge) and by the choices we make in applying that knowledge (judgment).

 

Aleatoric Uncertainty

Aleatoric uncertainty comprises those uncertainties that are inherent to a problem or an event and cannot be reduced by additional knowledge. Additional runs (trials) of an experiment or additional observations may help narrow the estimate, but a natural variability remains that no amount of further study can remove. Aleatoric uncertainty is also known as statistical uncertainty or irreducible uncertainty.
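As a toy illustration (mine, not from the article; the distribution and its parameters are assumptions), suppose each measurement of a quantity is a draw from a fixed noisy process. Running more trials narrows the estimate of the mean, an epistemic gain, but the spread of individual outcomes, the aleatoric part, does not shrink:

```python
import random
import statistics

random.seed(42)

def run_trials(n, mu=10.0, sigma=2.0):
    """Simulate n noisy measurements of a quantity whose true mean is mu."""
    return [random.gauss(mu, sigma) for _ in range(n)]

for n in (10, 1000, 100000):
    samples = run_trials(n)
    # The standard error of the mean shrinks as n grows (our knowledge improves)...
    sem = statistics.stdev(samples) / n ** 0.5
    # ...but the sample standard deviation stays near sigma = 2.0 (irreducible).
    print(f"n={n:6d}  mean≈{statistics.mean(samples):5.2f}  "
          f"std≈{statistics.stdev(samples):4.2f}  std-error≈{sem:.4f}")
```

No amount of extra sampling drives the per-measurement spread below sigma; only the uncertainty about the mean is reducible.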

 

Modeling Example

Both kinds of uncertainty are present in Large Worlds, and they usually overlap.

 

For example, I began my career developing quantitative groundwater (hydrogeologic) models. The leading-edge quantitative models were developed with early generations of the finite-difference flow software MODFLOW and the solute transport and reactive solute transport software MT3D. Both software applications are still in use 20 to 30 years later, albeit with several generations of improvement, and both are now supported by the United States Geological Survey (USGS).

 

Some of the issues in quantitatively assessing the uncertainty associated with some nasty chemicals and chemical compounds included:

  • the assumed boundary conditions at the edges of the model;

  • grid spacing (both model cells and field sampling points);

  • relationships and interactions between known, and possibly unknown, compounds;

  • geochemical conditions, such as aerobic or anaerobic environments, and their effect on chemical fate and transport;

  • the type and precision of the groundwater sampling that had been performed;

  • the accuracy and reliability of analytical laboratories and field testing; and

  • the accuracy and reliability of the models themselves.

 

All of this is to say that there were many sources of uncertainty: some rooted in our then-current knowledge of the world, and others in the assessment approaches we had chosen, or were limited to using.

 

Uncertainty is Everywhere

Similar Large World examples can be found related to air quality assessments, atmospheric modeling, weather forecasting, climate change models, predicting wildfires, disease and epidemic modeling, biological assessments, nuclear engineering, and others.

 

The good news is that we have become much better at quantitative prediction where variables behave independently, as in many physical sciences.

 

The bad news is that we still have a long way to go when it comes to accurately predicting outcomes in which variables depend on one another and interact nonlinearly, as in biological processes and human behavior.


Uncertainty in Practice

The distinction between epistemic and aleatoric uncertainty endures because it clarifies what can be reduced versus what must be managed.


  • Epistemic uncertainty - reducible with more data, monitoring, research, or model refinement.

  • Aleatoric uncertainty - inherent variability that must be accommodated through design margins, resilience, or probabilistic methods.
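The split between the two can be sketched with a minimal Monte Carlo (a hypothetical illustration with assumed numbers, not from the article): treat an imperfectly known model parameter as the epistemic part and per-outcome noise as the aleatoric part, then separate their contributions using the law of total variance.

```python
import random
import statistics

random.seed(7)

N_PARAM = 2000   # draws over the uncertain parameter (epistemic)
N_NOISE = 200    # noisy outcomes per parameter draw (aleatoric)

def predict(rate):
    """Hypothetical model: outcome = rate plus irreducible noise (sigma = 0.5)."""
    return rate + random.gauss(0.0, 0.5)

cond_means, cond_vars = [], []
for _ in range(N_PARAM):
    rate = random.gauss(3.0, 1.0)  # epistemic: the parameter is only roughly known
    outcomes = [predict(rate) for _ in range(N_NOISE)]
    cond_means.append(statistics.mean(outcomes))
    cond_vars.append(statistics.pvariance(outcomes))

# Law of total variance: Var(Y) = Var(E[Y | rate]) + E[Var(Y | rate)]
epistemic = statistics.pvariance(cond_means)  # shrinks with better data on the parameter
aleatoric = statistics.mean(cond_vars)        # stays near 0.5**2 regardless
print(f"epistemic≈{epistemic:.2f}  aleatoric≈{aleatoric:.2f}")
```

Better field data tightens the distribution on the parameter and drives the first term down; the second term is what design margins and probabilistic methods must absorb.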

 

Environmental agencies still rely on the distinction between epistemic and aleatoric uncertainty to justify monitoring programs, adaptive management, probabilistic risk assessments, and funding for data collection.

 

Uncertainty in Environmental Communication

Federal and state environmental agencies implicitly or explicitly use epistemic and aleatoric uncertainty logic in risk communication and hazard modeling.


Leaders need to know whether uncertainty is a knowledge gap (fixable) or natural variability (not fixable). There are competing frameworks, but framing environmental issues in terms of epistemic and aleatoric uncertainty remains powerful and intuitive.

 

The Limits of Eliminating Uncertainty

Reducing uncertainty and being more objective are certainly the right, noble things to do. However, the reality is that our knowledge of the future is not perfect, and even the most quantitative models require subjectivity. Only statistical Frequentists, working in Small Worlds, believe or advocate otherwise.


Reduce uncertainty and subjectivity – yes.

Eliminate uncertainty and subjectivity – never.

Embrace uncertainty and subjectivity – always.

 

 

North Carolina State University’s Ralph Smith is an excellent source in the field of uncertainty quantification. For me, he is also a reminder that, although you may travel far for expert advice and guidance, sometimes one of the best sources is in your own backyard. See G.L.S. Shackle for more on the nature of our knowledge. The US National Weather Service is an excellent reference for more detail on quantitative modeling for weather forecasting, the USGS for quantitative hydrogeologic and geologic modeling, and the Centers for Disease Control and Prevention (CDC) for quantitative and qualitative modeling related to diseases and epidemics.

 


This article was first published by JD Solomon on LinkedIn.

Solomon, J. D. (2018, October 29). Risk and uncertainty: Eliminating uncertainty. LinkedIn. https://www.linkedin.com/pulse/risk-uncertainty-eliminating-jd-solomon



JD Solomon writes and consults on decision-making, reliability, risk, and communication for leaders and technical professionals. His work connects technical disciplines with human understanding to help people make better decisions and build stronger systems. Learn more at www.jdsolomonsolutions.com and www.communicatingwithfinesse.com.
