LLMpedia: The first transparent, open encyclopedia generated by LLMs

National Grid Service

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Expansion funnel: Raw 107 → Dedup 0 → NER 0 → Enqueued 0
National Grid Service
Name: National Grid Service
Type: Consortium
Founded: 2004
Dissolved: 2011
Location: United Kingdom
Area served: United Kingdom, Europe
Focus: Grid computing, e-research, high-performance computing

The National Grid Service was a United Kingdom distributed computing consortium that provided federated high-performance computing resources for research across multiple institutions. It connected university clusters, national laboratories, and service providers to enable computational science, digital preservation, and data-intensive projects. The initiative interfaced with international efforts in e-infrastructures, collaborating with projects from the European Commission, the Open Science Grid, and research councils.

Overview

The National Grid Service brought together partners such as the University of Oxford, University of Cambridge, University of Edinburgh, University of Manchester, and Queen Mary University of London, alongside facilities like the STFC Rutherford Appleton Laboratory and commercial providers. It operated middleware stacks interoperable with Globus Toolkit, gLite, UNICORE, Condor, and PBS Professional, and aligned with standards such as OGSA from the Open Grid Forum (OGF) as well as relevant IEEE and IETF specifications. The service supported domains represented by institutions including Imperial College London, University College London, University of Glasgow, and King's College London, while coordinating with funders such as EPSRC, Jisc, and HEFCE. International collaborations included links to the European Grid Infrastructure, DEISA, PRACE, and the Open Science Grid.

History

The consortium emerged from UK e-science initiatives inspired by projects at CERN, SLAC National Accelerator Laboratory, and Lawrence Berkeley National Laboratory. Early pilots referenced architectures demonstrated by the UK e-Science Core Programme, GridPP, and NERC programmes. Key milestones included collaborations with JISC programmes, engagements with Tate Modern digital projects, and integration into European activities such as EGEE and the FP6 and FP7 framework programmes. Leadership and advisory input came from figures associated with the University of Southampton, University of Leeds, University of Sheffield, University of Bristol, and University of York.

Services and Infrastructure

Operational services included batch compute access, data staging, authentication, and virtual organization management supporting research communities at University of Birmingham, University of Nottingham, University of Liverpool, Newcastle University, and Durham University. Storage and data services interfaced with systems used by British Library, Natural History Museum, London, National Archives (UK), and domain repositories like UK Data Archive. Portals and science gateways were developed with groups at Cardiff University, University of Exeter, University of Southampton, University of Bath, and Lancaster University for fields such as climate science, bioinformatics, astrophysics, and engineering. Support and training were provided in collaboration with centres such as e-Science Institute (Edinburgh), Manchester e-Science Centre, and Oxford e-Research Centre.
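Batch compute access on grids of this era typically meant submitting job scripts to schedulers such as Torque/PBS. As a minimal sketch, the helper below generates the text of a PBS-style batch script with common `#PBS` directives; the job name, command, and queue name are hypothetical examples, not documented NGS configuration.

```python
"""Sketch: generating a Torque/PBS-style batch job script.

The #PBS directives (-N, -l nodes/walltime, -q) are standard for this
scheduler family; the queue name "ngs" and the sample command are invented."""

def make_pbs_script(job_name, command, nodes=1, ppn=4,
                    walltime="01:00:00", queue="ngs"):
    """Return the text of a PBS batch script with common directives."""
    lines = [
        "#!/bin/bash",
        f"#PBS -N {job_name}",                # job name shown in qstat
        f"#PBS -l nodes={nodes}:ppn={ppn}",   # nodes and processors per node
        f"#PBS -l walltime={walltime}",       # wall-clock time limit
        f"#PBS -q {queue}",                   # target queue (hypothetical)
        "cd $PBS_O_WORKDIR",                  # run from the submission directory
        command,
    ]
    return "\n".join(lines) + "\n"

script = make_pbs_script("climate-sim", "./run_model --config model.cfg")
print(script)
```

On a live cluster the resulting file would be submitted with `qsub`; actual resource limits and queue names varied by site.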

Governance and Funding

Governance structures drew on university consortium models similar to the boards of Research Councils UK-funded projects. Funding streams included competitive awards from EPSRC, infrastructure funding from Jisc, and project grants under European Commission frameworks, including FP6 and FP7. Partner institutions such as the University of Aberdeen, Heriot-Watt University, University of Stirling, University of Strathclyde, and University of Ulster contributed resources under memoranda of understanding. Oversight and audits followed policies referenced in documents from the National Audit Office, Cabinet Office digital strategy discussions, and guidance from Academy of Medical Sciences and Royal Society initiatives on research infrastructure.

Users and Applications

User communities spanned researchers on Medical Research Council and Wellcome Trust projects, and cultural heritage teams at the British Museum and the Victoria and Albert Museum. Scientific applications included simulation codes from collaborators of the Max Planck Institute for Astrophysics, climate models interfacing with Met Office datasets, and workflows used by bioinformatics teams linked to the Human Genome Project and by European Bioinformatics Institute researchers. Other domains served included digital humanities projects with partners such as the Oxford Internet Institute and the School of Oriental and African Studies, and archaeological computing teams associated with the British School at Rome.

Technical Architecture and Standards

The architecture combined compute clusters, storage systems, and identity federations, using technology stacks such as Globus Toolkit, LCG, gLite, and UNICORE, with job schedulers such as Torque, SLURM, Grid Engine, and PBS Professional. Authentication and authorization frameworks used technologies compatible with Shibboleth, SAML, and X.509 certificates, with integration into identity federations such as EDGI and eduGAIN. Data transfer and management followed protocols such as GridFTP, HTTP, and SRM, with metadata standards influenced by Dublin Core and domain ontologies developed in projects linked to RDF and W3C recommendations.
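On the metadata side, a Dublin Core record of the kind such data services attached to archived datasets can be sketched as a small XML document. The namespace below is the real DCMI element set; the record values are invented examples, not NGS data.

```python
"""Sketch: building a minimal Dublin Core metadata record.

The namespace is the standard DC 1.1 element set; the field values are
invented sample data for illustration."""
import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"

def dc_record(fields):
    """Build an XML element with one Dublin Core child element per entry."""
    ET.register_namespace("dc", DC)       # serialize with the "dc:" prefix
    root = ET.Element("metadata")
    for name, value in fields.items():
        el = ET.SubElement(root, f"{{{DC}}}{name}")
        el.text = value
    return root

record = dc_record({
    "title": "Climate simulation output, run 42",  # invented example values
    "creator": "Example Research Group",
    "date": "2008-06-01",
    "format": "application/x-netcdf",
})
print(ET.tostring(record, encoding="unicode"))
```

Real repositories wrapped such records in container schemas (e.g. OAI-PMH envelopes), but the element set itself is what Dublin Core standardizes.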

Impact and Legacy

The National Grid Service influenced later UK e-infrastructure projects and informed successor and related programmes funded by Jisc, UKRI, and EPSRC, contributing lessons to initiatives such as UK Cloud Strategy pilots and the European Grid Infrastructure. Its work shaped practices adopted by institutions such as the National Physical Laboratory, Royal Botanic Gardens, Kew, and the Met Office Hadley Centre, and influenced collaborations with CERN experiments and LHCb computing models. Its legacy includes trained personnel who moved on to PRACE, STFC, DiRAC, and national research computing centres, and a body of technical guidance used by the Digital Curation Centre, the UK Data Service, and university research IT departments.

Category:Computing in the United Kingdom
Category:Distributed computing