LLMpedia: The first transparent, open encyclopedia generated by LLMs

Torch (machine learning)

Generated by GPT-5-mini
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: PyTorch (hop 4)
Expansion Funnel: Raw 58 → Dedup 0 → NER 0 → Enqueued 0
Torch (machine learning)
Name: Torch
Original authors: Ronan Collobert, Koray Kavukcuoglu, Clément Farabet
Developers: contributors at NYU, Facebook, Inc., and Twitter, Inc.
Initial release: 2002
Latest release: Torch7 (no longer actively developed)
Written in: Lua, C
Operating system: Linux, macOS, Microsoft Windows
License: BSD license

Torch is an open-source scientific computing framework for machine learning and deep learning research, originally developed in the early 2000s and popularized in the 2010s through its Lua-based incarnation, Torch7. It provided a flexible tensor library and a scripting environment based on the Lua programming language, enabling rapid prototyping in academic and industrial settings such as Facebook, Inc., NYU, and Twitter, Inc. Torch influenced subsequent frameworks, most directly its successor PyTorch, and supported research presented at venues such as NeurIPS, ICML, and CVPR.

History

Torch originated in the early 2000s at the IDIAP Research Institute in Switzerland, where Ronan Collobert and colleagues released it as a modular machine learning library. Its best-known incarnation, Torch7, appeared around 2011, developed by Ronan Collobert, Koray Kavukcuoglu, and Clément Farabet, and became closely associated with Yann LeCun's group at NYU and, later, with engineering teams at Facebook AI Research and Twitter, Inc. Torch was widely used in deep learning research presented at venues such as NeurIPS, ICML, CVPR, and ECCV, and its design influenced later frameworks, most directly PyTorch, whose tensor backend initially shared code with Torch. As the ecosystem evolved, active development of Torch wound down in the late 2010s and most users migrated to PyTorch, stewarded by Meta Platforms, Inc.

Architecture and Design

Torch's architecture centers on a core tensor library written in C, with Lua bindings providing a lightweight scripting layer for model definition. The design emphasized modularity: functionality was split across independently loadable packages such as the core tensor library, neural network modules, and optimization routines. Its tensors supported multiple numeric types and views over shared storage, and low-level kernels dispatched to vendor libraries such as BLAS on CPUs and, through the CUDA backend, GPU acceleration from NVIDIA Corporation. The framework exposed an imperative, define-by-run programming model in which operations execute eagerly as the script runs, in contrast to the static computational-graph approach of frameworks such as Google LLC's TensorFlow.
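The imperative, define-by-run style described above can be illustrated with a short sketch. Torch itself was scripted in Lua; this NumPy translation is purely illustrative, and the `forward` function and its shapes are hypothetical, chosen only to show that ordinary host-language control flow drives the computation with no separate graph-compilation step.

```python
import numpy as np

# Imperative (define-by-run) style, as in Torch: each line executes
# immediately, so a plain host-language loop steers the model.
def forward(x, weights):
    h = x
    for W in weights:                    # ordinary loop, no graph to compile
        h = np.maximum(h @ W, 0.0)       # linear layer + ReLU, run eagerly
    return h

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4))
weights = [rng.standard_normal((4, 4)) for _ in range(3)]
out = forward(x, weights)
print(out.shape)  # (2, 4)
```

Because each operation returns a concrete array, intermediate values can be printed or branched on mid-model, which is what made the style attractive for research prototyping.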

Core Components and Features

Torch provided a tensor library, neural network modules (the nn package), and utilities for optimization (the optim package) and data loading. The nn package included implementations of common layers such as linear, convolutional, and recurrent modules, composable through containers like Sequential and paired with loss criteria; the optim package supplied algorithms such as stochastic gradient descent, Adagrad, and L-BFGS. Built-in support for seeding random number generation and for multi-threaded and GPU execution aided reproducibility, and the API facilitated the experimentation behind architectures discussed at ICML and NeurIPS and entries to competitions such as the ImageNet Large Scale Visual Recognition Challenge and challenges hosted on Kaggle.
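The layer-plus-optimizer pattern that the nn and optim packages popularized can be sketched in a few lines. This is a hypothetical NumPy miniature, not Torch's actual Lua API: the `Linear` class and `sgd_step` function are stand-ins showing how a module bundles parameters with forward/backward methods while a separate optimizer updates them.

```python
import numpy as np

class Linear:
    """Toy fully connected layer in the spirit of a Torch nn module."""
    def __init__(self, n_in, n_out, rng):
        self.W = rng.standard_normal((n_in, n_out)) * 0.1
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                       # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.gW = self.x.T @ grad_out    # gradient w.r.t. weights
        self.gb = grad_out.sum(axis=0)   # gradient w.r.t. bias
        return grad_out @ self.W.T       # gradient w.r.t. input

def sgd_step(layer, lr=0.1):
    """Plain SGD update, analogous to what an optim routine would do."""
    layer.W -= lr * layer.gW
    layer.b -= lr * layer.gb

rng = np.random.default_rng(1)
layer = Linear(3, 1, rng)
x = rng.standard_normal((8, 3))
y = x @ np.array([[1.0], [-2.0], [0.5]])     # synthetic linear target

for _ in range(200):                          # mean-squared-error regression
    pred = layer.forward(x)
    grad = 2.0 * (pred - y) / len(x)          # d(MSE)/d(pred)
    layer.backward(grad)
    sgd_step(layer)

loss = float(((layer.forward(x) - y) ** 2).mean())
print(loss)
```

In Torch the same separation held at scale: modules owned their parameters and gradients, and optim functions consumed a flattened view of both.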

Ecosystem and Libraries

A rich ecosystem grew around Torch, with community packages for vision, audio, and natural language tasks distributed largely through GitHub and the LuaRocks package manager. Notable companion projects included image-processing and dataset-loading utilities referenced in papers at CVPR and ACL, recurrent network extensions, and pretrained "model zoo" collections maintained by research groups. Hardware acceleration came through the cutorch and cunn packages built on NVIDIA Corporation's CUDA stack, later augmented by bindings to the cuDNN library, with additional CPU optimizations drawing on vendor libraries from Intel Corporation and others.

Applications and Use Cases

Torch was applied in computer vision, speech recognition, and natural language processing projects, notably at Facebook AI Research, DeepMind, and academic groups such as Yann LeCun's lab at NYU. Use cases included image classification benchmarks on ImageNet, object detection and segmentation systems, and sequence models for language and speech. Industrial adoption included media-understanding and recommendation work at Twitter, Inc., which acquired the Torch-focused startup Madbits in 2014, alongside research prototypes at startups and at laboratories funded by organizations such as the National Science Foundation.

Performance and Hardware Support

Torch leveraged native C code and vendor libraries to accelerate computation on CPUs and GPUs: CPU tensor operations dispatched to optimized BLAS implementations, while the cutorch and cunn packages provided CUDA-based tensors and layers, later augmented by bindings to NVIDIA Corporation's cuDNN library. GPU support enabled the high-throughput training behind many results reported at venues such as NeurIPS, and the same model code could typically run on CPU or GPU by switching tensor types. Multi-GPU and distributed training were possible through community packages, with large-scale experiments run on in-house clusters and cloud infrastructure such as Amazon Web Services.
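The gap between interpreted loops and native BLAS kernels, which motivated Torch's C core, can be demonstrated with a hedged comparison. This is illustrative only (Torch's own backends were C and CUDA, not NumPy): the same matrix product is computed with pure Python loops and with a BLAS-backed routine, and the vectorized call is faster by orders of magnitude.

```python
import time
import numpy as np

def matmul_naive(A, B):
    """Triple-loop matrix product in pure Python, for comparison."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for t in range(k):
                s += A[i, t] * B[t, j]
            C[i][j] = s
    return np.array(C)

rng = np.random.default_rng(2)
A = rng.standard_normal((120, 120))
B = rng.standard_normal((120, 120))

t0 = time.perf_counter()
C_slow = matmul_naive(A, B)            # interpreted inner loops
t_slow = time.perf_counter() - t0

t0 = time.perf_counter()
C_fast = A @ B                         # dispatches to an optimized BLAS kernel
t_fast = time.perf_counter() - t0

print(np.allclose(C_slow, C_fast))     # both paths agree numerically
print(t_slow > t_fast)                 # the native kernel wins
```

The same principle, applied through CUDA kernels, is what made Torch's GPU tensors practical for large-scale training.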

Community and Development Model

Development of Torch followed an open-source model under a BSD license, with contributions from researchers at NYU, engineers at Facebook, Inc. and Twitter, Inc., and a wider volunteer community coordinating on GitHub and public mailing lists. The community organized workshops and tutorials at conferences such as NeurIPS, ICML, and CVPR, and maintained numerous forks and extension packages. As the deep learning landscape evolved, active development shifted to PyTorch in the late 2010s, and most of the community migrated to that successor framework, stewarded by Meta Platforms, Inc. (then Facebook, Inc.).

Category:Machine learning software