Computing in Science & Engineering

Title: Computing in Science & Engineering
Discipline: Computer science; engineering; physical sciences
Abbreviation: CiSE
Publisher: Institute of Electrical and Electronics Engineers; American Institute of Physics
Country: United States
History: 1999–present

Computing in Science & Engineering describes the integration of high-performance computing resources, algorithmic innovation, and domain-specific software to solve problems in physics, chemistry, biology, earth science, astronomy, engineering, and related fields. It encompasses numerical methods, parallel programming, data analysis, and visualization as practiced at institutions such as Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, CERN, NASA, and the National Institute of Standards and Technology. Practitioners draw on communities around venues such as the SC Conference and professional societies such as the ACM and the IEEE.

Overview

Computing in Science & Engineering connects research groups at the Massachusetts Institute of Technology, Stanford University, the University of California, Berkeley, the University of Cambridge, and ETH Zurich with facilities such as Oak Ridge National Laboratory, Argonne National Laboratory, the Pittsburgh Supercomputing Center, Fermilab, and CERN. It builds on contributions from pioneers such as John von Neumann, Alan Turing, Grace Hopper, Donald Knuth, Tim Berners-Lee, Leslie Valiant, and Shafi Goldwasser, and relies on projects and standards including LINPACK, BLAS, MPI, OpenMP, and CUDA. Funding and policy inputs often come from agencies such as the National Science Foundation, the Department of Energy, the European Research Council, DARPA, and the Wellcome Trust.
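
The parallel programming standards named above are usually exercised through language bindings. The following is a minimal sketch of the MPI model using the mpi4py bindings; the midpoint-rule integrand, the sample count, and the script name pi_mpi.py are illustrative assumptions, not drawn from this article.

```python
from mpi4py import MPI
import numpy as np

# Minimal MPI sketch: each rank integrates part of [0, 1] with the
# midpoint rule, and the partial sums are combined with a reduction.
# Run with: mpiexec -n 4 python pi_mpi.py
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 1_000_000                              # total sample points (illustrative)
x = (np.arange(rank, n, size) + 0.5) / n   # this rank's strided share of points
local = np.sum(4.0 / (1.0 + x**2)) / n     # partial estimate of pi

pi = comm.allreduce(local, op=MPI.SUM)     # combine partial sums across ranks
if rank == 0:
    print(f"pi ~ {pi:.8f}")
```

Each rank owns a strided share of the sample points, so no communication is needed until the final reduction; this owner-computes pattern underlies many of the simulation codes discussed below.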

History and Evolution

The field evolved alongside machines such as the ENIAC, IBM 701, CDC 6600, and Cray-1, and continues on modern exascale systems built with hardware from Hewlett Packard Enterprise, Intel, NVIDIA, and AMD. Milestones include the development of Fortran under John Backus and of later languages and standards such as C, C++, Python, MATLAB, and R. Collaborative projects such as the Human Genome Project, the Large Hadron Collider, and the IPCC assessments drove demands for scalable simulation, data assimilation, and reproducible workflows. The transition from vector to parallel to heterogeneous architectures involved researchers at Los Alamos National Laboratory, Sandia National Laboratories, the National Center for Supercomputing Applications, and industry labs including Bell Labs and Microsoft Research.

Computational Methods and Algorithms

Core methods include linear algebra libraries distributed through Netlib, eigensolvers such as ARPACK, and optimization frameworks influenced by work at the Courant Institute, INRIA, and Caltech. Numerical methods, including finite element methods whose early formulation traces to Richard Courant, finite difference schemes constrained by the Courant–Friedrichs–Lewy stability condition, spectral methods, and Monte Carlo techniques popularized through postwar work at Los Alamos National Laboratory, are complemented by machine learning advances from Google DeepMind, Meta AI (formerly Facebook AI Research), OpenAI, and academic groups at the University of Toronto and Carnegie Mellon University. Parallel programming models trace to projects at Argonne National Laboratory and to standards committees within the IEEE, while uncertainty quantification owes much of its lineage to work at Sandia National Laboratories.
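
To make the finite difference family concrete, here is a minimal sketch in Python with NumPy: an explicit scheme for the 1-D heat equation u_t = alpha * u_xx with homogeneous Dirichlet boundaries. The diffusivity, grid sizes, and initial condition are illustrative assumptions, not taken from any specific code named above.

```python
import numpy as np

# Explicit finite-difference scheme for u_t = alpha * u_xx on (0, 1)
# with u(0, t) = u(1, t) = 0; parameter values are illustrative.
alpha = 1.0                 # diffusivity (assumed value)
nx, nt = 101, 500           # spatial grid points and time steps
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2 / alpha    # within the stability bound dt <= dx^2 / (2 alpha)

x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)       # initial condition with a known exact solution

for _ in range(nt):
    # Update interior points; boundary values stay at zero (Dirichlet).
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

# The exact solution decays as sin(pi x) * exp(-pi^2 alpha t).
exact = np.sin(np.pi * x) * np.exp(-np.pi**2 * alpha * nt * dt)
print("max error vs. exact solution:", np.abs(u - exact).max())
```

The chosen time step respects the explicit scheme's stability bound dt <= dx^2 / (2 alpha), the same kind of constraint formalized by the Courant–Friedrichs–Lewy condition mentioned above.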

Software and Tools

A rich ecosystem includes provenance and workflow systems from Apache Software Foundation projects, containerization through Docker and Kubernetes, numerical suites such as PETSc, Trilinos, SciPy, and NumPy, and domain tools like GROMACS (molecular dynamics), LAMMPS (molecular dynamics), CESM (climate), WRF (weather), ENZO (astrophysics), and ANSYS (engineering). Visualization and analysis leverage packages such as ParaView, VisIt, Matplotlib, and VTK, with collaboration hosted on GitHub, GitLab, and Bitbucket and archiving via Zenodo and Figshare. Reproducibility and provenance initiatives involve standards bodies such as the W3C and repositories like arXiv.
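
As a small illustration of how suites such as NumPy and SciPy are used in practice, the sketch below assembles the standard tridiagonal discretization of the 1-D Poisson problem -u'' = f with zero boundary values and solves it with SciPy's sparse direct solver; the grid size and right-hand side are illustrative assumptions.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Solve -u'' = f on (0, 1) with u(0) = u(1) = 0 on a uniform grid.
n = 99                      # interior grid points (illustrative)
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

# Standard second-difference matrix: tridiag(-1, 2, -1) / h^2.
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc") / h**2
f = np.pi**2 * np.sin(np.pi * x)   # chosen so the exact solution is sin(pi x)

u = spla.spsolve(A, f)             # sparse direct solve
print("max error:", np.abs(u - np.sin(np.pi * x)).max())
```

Swapping the direct spsolve call for an iterative Krylov method (for example scipy.sparse.linalg.cg, which applies here because the matrix is symmetric positive definite) mirrors the design of larger frameworks such as PETSc, which expose the same problem through preconditioned iterative solvers.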

Applications by Discipline

In astronomy, pipelines at the European Southern Observatory and the Space Telescope Science Institute handle petabyte-scale surveys from instruments such as the Hubble Space Telescope and the Vera C. Rubin Observatory. In geoscience, agencies such as NOAA and the USGS run coupled models for seismic and climate risk. Biology and medicine employ high-throughput sequencing analyses from the Broad Institute and structural modeling at EMBL and through the Protein Data Bank. Chemistry leverages electronic structure codes developed at Argonne and Oak Ridge, while materials science uses high-throughput screening initiatives such as the Materials Project. Engineering disciplines integrate multiphysics simulations in aerospace work at NASA Glenn Research Center and in automotive design at the General Motors Research Laboratory.

Education and Workforce Development

Training pathways are offered by universities including Imperial College London, the University of Oxford, Princeton University, and the University of Michigan, and by MOOC platforms such as edX and Coursera. Professional societies such as the ACM, the IEEE Computer Society, and SIAM provide conferences and professional development. National initiatives such as the National Science Foundation's workforce programs, internships at Lawrence Livermore National Laboratory, and graduate programs at institutions such as Rensselaer Polytechnic Institute and Carnegie Mellon University cultivate computational scientists, data engineers, and research software engineers.

Challenges and Future Directions

Key challenges include energy efficiency in exascale systems such as those pioneered at Oak Ridge National Laboratory, and reliability concerns raised by hardware vendors such as Intel and NVIDIA. Ethical and governance questions surface in contexts involving World Health Organization data and international collaborations coordinated through UNESCO. Future directions point to quantum computing efforts at IBM, Google, Rigetti, and D-Wave; cryogenic and neuromorphic architectures from Intel Labs and the Human Brain Project; and tighter integration of AI from DeepMind and academic labs to accelerate discovery while adhering to standards advocated by the National Academies of Sciences, Engineering, and Medicine.

Category:Computational science