
Extended Summary

Chapter 9. Model Fidelity Analysis for Sequential Decision-Making Systems Using Simulation Decomposition

Case Study of Critical Mineral Exploration

Authors: Robert J. Moss, Mariia Kozlova, Anthony Corso, and Jef Caers
Source: Kozlova, M., & Yeomans, J. S. (Eds.). (2024). Sensitivity Analysis for Business, Technology, and Policymaking: Made Easy with Simulation Decomposition (SimDec). Taylor & Francis. https://doi.org/10.4324/9781003453789
License: CC BY-NC-ND 4.0

📖 Read full Chapter 9: Ch9.pdf


How complex is too complex?

In this chapter, the authors explore a core question in AI and simulation:
How much model complexity do we actually need to make good decisions?

They use Simulation Decomposition (SimDec) to analyze a sequential decision-making model based on a Partially Observable Markov Decision Process (POMDP) — a framework widely used in AI, robotics, and natural resource planning.

The case focuses on critical mineral exploration, where planners must decide where to drill, when to stop, and whether to mine, all under uncertainty.


What is model fidelity and why does it matter?

Building models often involves trade-offs:

  • Higher fidelity = more detail, more realism… and more computing time
  • Lower fidelity = faster results, but possibly less accuracy

This chapter introduces a Model Fidelity Framework to test how four types of model fidelity affect planning outcomes:

  1. State fidelity – how detailed is the geological representation?
  2. Environment fidelity – how fine is the simulation grid?
  3. Planning fidelity – how much computational effort goes into planning?
  4. Inference fidelity – how does the system update its beliefs? (held constant in this study)
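With three varied fidelity types at three levels each (and inference fidelity held constant), the design yields 3 × 3 × 3 = 27 configurations. A minimal Python sketch of such a grid, using hypothetical level names not taken from the chapter:

```python
from itertools import product

# Hypothetical level names; the chapter varies three fidelity types
# at three levels each, with inference fidelity held constant.
LEVELS = ["low", "medium", "high"]

configs = [
    {"state": s, "environment": e, "planning": p}
    for s, e, p in product(LEVELS, LEVELS, LEVELS)
]

print(len(configs))  # 3 * 3 * 3 = 27 configurations
```

Running 500 decision episodes per configuration then gives the 27 × 500 = 13,500 episodes analyzed below.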

How SimDec was used

The team ran 13,500 decision episodes (500 per configuration) across 27 model configurations and applied SimDec to decompose outcome metrics such as:

  • Return on investment
  • Regret
  • Runtime
  • Bias
  • Accuracy
  • Number of actions taken
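At its core, SimDec groups Monte Carlo outputs into scenarios defined by the most influential inputs and compares the resulting conditional distributions. A minimal sketch of that grouping step on synthetic data (not the chapter's actual code or numbers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: each simulated episode records its fidelity
# setting and an output metric (here, return on investment).
n = 1000
state_fid = rng.choice(["low", "mid", "high"], size=n)
roi = rng.normal(0.12, 0.05, size=n)  # ROI barely depends on fidelity,
                                      # echoing the chapter's finding

# SimDec-style decomposition: slice the output distribution by
# input scenario and summarize each slice.
for level in ["low", "mid", "high"]:
    slice_ = roi[state_fid == level]
    print(f"state fidelity {level:>4}: n={slice_.size:4d}, "
          f"mean ROI {slice_.mean():.3f}")
```

In the full method these scenario slices are stacked into a single color-coded histogram, so one plot shows both the overall output distribution and each scenario's contribution to it.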

They visualized how different combinations of fidelity levels shape performance, helping answer:

“Can we get good results without maxing out every setting?”


What they found

🎯 Good news: Most model outcomes (e.g. return, regret, accuracy) were not very sensitive to fidelity.
You don’t always need the highest-fidelity model to get a good answer.

💡 Where fidelity does matter:

  • Runtime depends heavily on planning iterations and grid resolution
  • Number of actions taken also increases with higher fidelity
  • SimDec exposed interactions: for example, higher environment fidelity only matters when planning fidelity is also high
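One way to see such an interaction numerically is to compare the effect of raising one fidelity setting at each level of the other. A self-contained sketch on synthetic data (illustrative values only, not the chapter's results):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic illustration: environment fidelity only affects runtime
# when planning fidelity is also high.
n = 2000
plan_hi = rng.integers(0, 2, size=n).astype(bool)
env_hi = rng.integers(0, 2, size=n).astype(bool)
runtime = 1.0 + 4.0 * (plan_hi & env_hi) + rng.normal(0, 0.1, size=n)

def mean(mask):
    return runtime[mask].mean()

# Effect of raising environment fidelity at each planning level:
env_effect_low_plan = mean(~plan_hi & env_hi) - mean(~plan_hi & ~env_hi)
env_effect_high_plan = mean(plan_hi & env_hi) - mean(plan_hi & ~env_hi)

# A large gap between the two effects signals an interaction.
print(f"env effect | low planning:  {env_effect_low_plan:.2f}")
print(f"env effect | high planning: {env_effect_high_plan:.2f}")
```

SimDec makes the same pattern visible graphically: the scenario slices for high environment fidelity only separate from the rest when planning fidelity is also high.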

🔍 Surprise insight: Even when models differ in structure, they often lead to very similar outcomes, as long as the key dynamics are captured.


Why this is useful

  • Helps developers optimize model performance without overbuilding
  • Supports efficient resource use in computationally expensive simulations
  • Useful for any application of sequential decision-making under uncertainty, including:
    • Mineral exploration
    • Autonomous vehicles
    • Emergency planning
    • Carbon storage

Rather than debating "which model is best," this approach asks:

“Which simplifications can we afford—and when do they start to hurt us?”


Want to try the framework?

🔗 Open-source tools available at:
github.com/sisl/POMDPModelFidelityFramework.jl
🧪 SimDec tools: Simulation-Decomposition GitHub


Attribution

Based on Chapter 9 of Sensitivity Analysis for Business, Technology, and Policymaking
© Robert J. Moss, Mariia Kozlova, Anthony Corso, and Jef Caers, 2024 — CC BY-NC-ND 4.0.
This summary is an independent derivative work created for educational and indexing purposes, not affiliated with the original publisher.