| Nordic Data Grid Facility | |
|---|---|
| Name | Nordic Data Grid Facility |
| Formation | 2006 |
| Dissolution | 2011 |
| Headquarters | Copenhagen |
| Region served | Denmark, Norway, Sweden, Finland, Iceland |
| Leader title | Director |
# Nordic Data Grid Facility
The Nordic Data Grid Facility (NDGF) was a collaborative e-infrastructure initiative that provided distributed computing resources and data services across the Nordic region. It connected research communities in Denmark, Norway, Sweden, Finland, and Iceland with pan-European projects such as the European Grid Infrastructure, EGEE, and PRACE, and interfaced with major research centers including CERN, the European Molecular Biology Laboratory, and the European Space Agency.
The project aimed to offer grid middleware, identity management, storage elements, and compute resources to support scientists in fields associated with the Large Hadron Collider, the Human Genome Project, the Square Kilometre Array, the Copernicus Programme, and Nordic joint research initiatives. It collaborated with institutions such as the Nordic Council of Ministers, NordForsk, STFC, INFN, CNRS, the Max Planck Society, Deutsches Elektronen-Synchrotron, and the LHCb, ATLAS, and CMS experiments to integrate resources and standards. The NDGF provided services compatible with the Globus Toolkit, gLite, the ARC middleware, UNIX, and identity federations such as eduGAIN and Shibboleth.
Established through regional agreements shaped by European Commission policies and programmes such as FP6 and FP7, and later by interactions with Horizon 2020, the facility grew out of national grid projects including NorGrid, SweGrid, the Danish Centre for Scientific Computing, CSC – IT Center for Science, and RHnet. Early technical work drew on experience from NorduGrid, Enabling Grids for E-sciencE, and collaborations with NERSC, the PRACE Preparatory Phase, and EGI. Pilot deployments, production rollouts, and transition activities were aligned with roadmaps from ESFRI (the European Strategy Forum on Research Infrastructures) and national funding agencies across the Nordic capitals: Helsinki, Stockholm, Oslo, Copenhagen, and Reykjavik.
The architecture combined compute clusters, storage elements, workload management systems, and data management services using components such as the Storage Resource Manager, dCache, iRODS, Lustre, and GridFTP. Authentication and authorization were enabled by X.509 certificates, MyProxy, and federated identity systems integrating with eduGAIN and national academic identity providers via SAML. Interoperability was achieved through standards from OGF, W3C, and IETF, and through software stacks such as gLite, ARC, Globus, and HTCondor. Services offered included job submission, replica catalogs, metadata catalogs, virtual organization management, and support for science gateways linked to projects such as ALICE, the Nordic EMBL Partnership, the NordForsk e-Infrastructure, and NeIC.
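The job-submission service mentioned above can be illustrated with the xRSL job-description language used by the ARC middleware. The sketch below builds such a description as a plain string; the helper function, file names, and attribute values are hypothetical examples for illustration, and a real deployment would submit jobs through the ARC client tools rather than hand-built strings.

```python
# Illustrative sketch: composing an xRSL job description of the kind
# accepted by ARC middleware job-submission tools. All names and values
# here are hypothetical, not an actual NDGF workload.

def make_xrsl(executable, arguments, cpu_minutes, output_files):
    """Return an xRSL job-description string for a simple batch job."""
    args = " ".join(f'"{a}"' for a in arguments)
    outputs = "".join(f'("{f}" "")' for f in output_files)
    return (
        "&"
        f'(executable="{executable}")'
        f"(arguments={args})"
        f"(cputime={cpu_minutes})"
        f"(outputfiles={outputs})"
        '(stdout="job.out")'
        '(stderr="job.err")'
    )

job = make_xrsl("run_analysis.sh", ["--events", "1000"], 120, ["result.root"])
print(job)
```

The `&` prefix marks a conjunction of attribute relations, which is how xRSL groups the requirements that the workload management system matches against available compute resources.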
Primary participants were national research centers in Denmark, Norway, Sweden, Finland, and Iceland, with institutional partners such as the University of Copenhagen, the University of Oslo, Uppsala University, the University of Helsinki, the University of Iceland, Aalto University, Chalmers University of Technology, the Technical University of Denmark, the Norwegian University of Science and Technology, and Karolinska Institutet. International collaborations involved CERN, EMBL, the European Space Agency, the European Southern Observatory, the Nordic Council of Ministers, NordForsk, the European Grid Infrastructure, EUDAT, PRACE, EGEE, NeIC, SURFnet, NORDUnet, EGI.eu, the Computational Infrastructure for Geodynamics, and national agencies such as the Swedish Research Council and the Research Council of Norway.
Use cases spanned high-energy physics, bioinformatics, climate modeling, and astronomy, supporting experiments and programs such as the ATLAS, CMS, and LHCb experiments, the Human Genome Project, the EU Copernicus Programme, the IPCC, paleoclimate reconstructions, the SKA, LOFAR, ALMA, the HST, Gaia, Met Office Hadley Centre collaborations, and ECMWF data workflows. Scientific software stacks and workflows integrated tools such as ROOT, GROMACS, BLAST, Cytoscape, ENVI, IDL, TensorFlow, and MPI, serving research groups at Karolinska Institutet, Umeå University, the Norwegian University of Science and Technology, Aarhus University, and the University of Bergen.
Governance involved consortia of national centers and ministries, with oversight from bodies such as the Nordic Council and NordForsk and linkages to European Commission initiatives. Funding came from national research councils, including the Swedish Research Council, the Research Council of Norway, and the Academy of Finland, and from EU programmatic instruments such as FP6 and FP7. Coordination was performed with agencies and organizations such as CSC – IT Center for Science, NORDUnet, SURFnet, NeIC, EGI.eu, and various university computing centers.
The facility contributed to the evolution of pan-European infrastructures by feeding expertise, middleware integration, and operational practices into initiatives such as the European Grid Infrastructure (EGI), PRACE, and EUDAT, and into national services such as CSC. Its work influenced standards efforts with OGF and eduGAIN and supported the maturation of identity federations, data preservation approaches tied to the FAIR principles, and collaborative models used by NeIC and NordForsk. Many operations, tools, and partnerships transitioned into successor organizations and infrastructures, benefiting projects under Horizon 2020 and ongoing collaborations with CERN, EMBL-EBI, ESA, and regional research networks.
Category:Nordic research infrastructure