LLMpedia: The first transparent, open encyclopedia generated by LLMs

Agent-based modeling

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: Pattie Maes Hop 4
Expansion Funnel: Raw 61 → Dedup 0 → NER 0 → Enqueued 0
Agent-based modeling
Name: Agent-based modeling
Field: Computational science
Introduced: 1970s
Notable examples: Sugarscape, NetLogo, Repast
Related: Complex systems, Cellular automata, Multi-agent systems

Agent-based modeling is a computational approach that simulates the actions and interactions of autonomous agents in order to study emergent phenomena in complex systems. It represents individual actors as discrete entities with rules for behavior and interaction, allowing macro-level patterns to be explored as outcomes of micro-level processes. Practitioners draw on methods and case studies from fields such as physics, biology, and social science to test hypotheses about system dynamics.
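
The micro-to-macro idea above can be illustrated with a minimal sketch (a hypothetical, simplified 1D variant of Schelling's segregation model; all parameter names and values here are illustrative, not drawn from any particular implementation): agents follow a simple local relocation rule, and neighborhood-level segregation emerges at the macro level.

```python
import random

def schelling_ring(n=60, n_empty=6, threshold=0.5, moves=300, seed=3):
    """Toy 1D Schelling model: two agent types on a ring of cells. An agent
    is unhappy when fewer than `threshold` of its four nearest occupied
    neighbours share its type; unhappy agents relocate to empty cells."""
    rng = random.Random(seed)
    n_agents = n - n_empty
    cells = [0] * (n_agents // 2) + [1] * (n_agents - n_agents // 2) + [None] * n_empty
    rng.shuffle(cells)

    def similarity(i):
        # Fraction of occupied neighbours (two on each side) of cell i's type.
        same = total = 0
        for d in (-2, -1, 1, 2):
            t = cells[(i + d) % n]
            if t is not None:
                total += 1
                same += (t == cells[i])
        return same / total if total else 1.0

    for _ in range(moves):  # micro-level rule: one unhappy agent moves per tick
        occupied = [i for i, t in enumerate(cells) if t is not None]
        unhappy = [i for i in occupied if similarity(i) < threshold]
        if not unhappy:
            break
        i = rng.choice(unhappy)
        j = rng.choice([k for k, t in enumerate(cells) if t is None])
        cells[j], cells[i] = cells[i], None

    # Macro-level outcome: mean neighbourhood similarity (a segregation index).
    occupied = [i for i, t in enumerate(cells) if t is not None]
    return cells, sum(similarity(i) for i in occupied) / len(occupied)

cells, segregation = schelling_ring()
```

Even with a mild tolerance threshold, repeated local relocations tend to push the segregation index well above what a random arrangement produces.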

Overview

Agent-based approaches model populations of heterogeneous agents that are situated in environments and interact over time to produce system-level outcomes. Influential implementations include Sugarscape, NetLogo, and Repast, which reflect links to research at institutions such as the Santa Fe Institute and Los Alamos National Laboratory. Key concepts connect to work by researchers affiliated with MIT, the University of Michigan, Indiana University, and Northwestern University, and to projects funded by agencies such as the National Science Foundation and the Defense Advanced Research Projects Agency.

History and development

Origins trace to early computational experiments in the 1970s and 1980s that extended ideas from John von Neumann's studies of automata and implementations of Conway's Game of Life. Seminal contributions emerged from scholars at the Santa Fe Institute, alongside developments at the RAND Corporation and Brookhaven National Laboratory. Academic diffusion accelerated through conferences such as the International Conference on Complex Systems and publications in outlets connected to Oxford University Press and Cambridge University Press. Later, institutional centers such as the University of Chicago and the London School of Economics fostered interdisciplinary applications.

Methodology and components

Models comprise explicit agent definitions, environment representations, interaction protocols, and scheduling mechanisms. Typical components draw on theoretical foundations from work at the RAND Corporation, algorithmic strategies discussed in MIT Press texts, and empirical case designs employed at Yale University and Columbia University. Core steps include specification, parameterization, initialization, execution, and analysis; practitioners often use design patterns promoted by research groups at Los Alamos National Laboratory, Argonne National Laboratory, and Sandia National Laboratories.
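
The components and core steps listed above can be sketched in a minimal skeleton (a hypothetical wealth-exchange toy model; the class names and parameters are illustrative, not tied to any specific platform):

```python
import random

class Agent:
    """Agent definition: state (wealth) plus a behavioural rule."""
    def __init__(self, uid):
        self.uid = uid
        self.wealth = 1

    def step(self, model):
        # Interaction protocol: hand one unit to a randomly chosen agent.
        if self.wealth > 0:
            other = model.rng.choice(model.agents)
            other.wealth += 1
            self.wealth -= 1

class Model:
    """Specification of the overall model."""
    def __init__(self, n_agents=50, seed=42):
        self.rng = random.Random(seed)                     # parameterization
        self.agents = [Agent(i) for i in range(n_agents)]  # initialization

    def step(self):
        # Scheduling mechanism: random activation order each tick.
        order = list(self.agents)
        self.rng.shuffle(order)
        for agent in order:
            agent.step(self)

# Execution: advance the model for a fixed number of ticks.
model = Model()
for _ in range(100):
    model.step()

# Analysis: inspect the resulting wealth distribution.
wealths = sorted(agent.wealth for agent in model.agents)
```

Random activation is only one scheduling choice; platforms commonly also offer simultaneous or staged activation, and the choice can materially change model outcomes.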

Applications

Agent-based techniques have been applied across many domains: urban modeling in studies by Harvard University and University College London; epidemiology explored by teams at the Centers for Disease Control and Prevention and Johns Hopkins University; market simulation researched at London Business School and the Wharton School; and ecological modeling advanced by the Smithsonian Institution and the Max Planck Society. Other notable uses appear in transportation studies, with contributions from the Massachusetts Institute of Technology and Delft University of Technology; in conflict and peace research at the Stockholm International Peace Research Institute and the United Nations; and in policy analysis undertaken by the Brookings Institution and the OECD.
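
As a sketch of the epidemiological use case, the following is a hypothetical, heavily simplified agent-based SIR model with random mixing (all parameter values are illustrative assumptions, not taken from any cited study):

```python
import random

def run_sir(n=200, beta=0.3, gamma=0.1, contacts=5, steps=100, seed=1):
    """Toy agent-based SIR: each step, every infectious agent contacts a few
    random others; susceptible contacts become infected with probability
    beta, and each infectious agent recovers with probability gamma."""
    rng = random.Random(seed)
    state = ["S"] * n
    state[0] = "I"  # seed one initial infection
    for _ in range(steps):
        newly_infected, recovered = [], []
        for i, s in enumerate(state):
            if s != "I":
                continue
            for _ in range(contacts):
                j = rng.randrange(n)
                if state[j] == "S" and rng.random() < beta:
                    newly_infected.append(j)
            if rng.random() < gamma:
                recovered.append(i)
        for j in newly_infected:
            state[j] = "I"
        for i in recovered:
            state[i] = "R"
    return {s: state.count(s) for s in "SIR"}

final = run_sir()
```

Unlike compartmental differential-equation models, the agent-based version makes it easy to swap random mixing for an explicit contact network or heterogeneous agent behavior.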

Validation and verification

Verification and validation practices draw on methodologies advocated in US Department of Defense modeling standards and on academic protocols from workshops of the Institute of Electrical and Electronics Engineers and the Association for Computing Machinery. Techniques include code verification, sensitivity analysis, calibration against data from agencies such as the National Institutes of Health and Eurostat, and pattern-oriented modeling popularized by groups at the University of Warwick and the University of Stirling.
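
A basic form of sensitivity analysis can be sketched as a one-at-a-time parameter sweep with replicate runs (the toy model and all names here are hypothetical stand-ins for a real ABM and its output metric):

```python
import random
import statistics

def toy_model(p, steps=50, seed=0):
    """Hypothetical stochastic model: counts how often an event with
    probability p occurs over `steps` ticks; stands in for any scalar
    output metric of an agent-based model."""
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(steps))

def sweep(param_values, replicates=20):
    """One-at-a-time sensitivity sweep: vary a single parameter and
    average the output metric over replicate runs with distinct seeds."""
    results = {}
    for p in param_values:
        outputs = [toy_model(p, seed=r) for r in range(replicates)]
        results[p] = statistics.mean(outputs)
    return results

table = sweep([0.1, 0.3, 0.5])
```

Averaging over replicates separates genuine parameter sensitivity from run-to-run stochastic noise; calibration then amounts to searching this parameter space for values whose outputs match empirical data.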

Advantages and limitations

Advantages include the capacity to represent heterogeneity and nonlinear interactions, showcased in studies at the Santa Fe Institute and the University of California, Berkeley, and flexibility for scenario exploration, as used by the RAND Corporation and the Brookings Institution. Limitations include issues of parameter identifiability, the computational cost emphasized in work at Argonne National Laboratory and Oak Ridge National Laboratory, and challenges in empirical validation flagged by scholars at the London School of Economics and the University of Oxford.

Software and implementations

Widely used platforms include NetLogo, Repast, MASON, and the GAMA Platform, with community contributions from developers associated with Tufts University, the University of Chicago, the University of Southampton, and Leipzig University. Commercial and domain-specific implementations have been produced for organizations such as Siemens and IBM. Tooling ecosystems integrate with data sources maintained by the US Census Bureau, the European Environment Agency, and the World Bank for empirical model calibration.

Category:Computational modeling