| IEEE Transactions on Neural Networks | |
|---|---|
| Title | IEEE Transactions on Neural Networks |
| Discipline | Neural networks, machine learning |
| Abbreviation | IEEE Trans. Neural Netw. |
| Publisher | Institute of Electrical and Electronics Engineers |
| Country | United States |
| History | 1990–2011 (continued as IEEE Transactions on Neural Networks and Learning Systems) |
| Frequency | Monthly |
| ISSN | 1045-9227 |
IEEE Transactions on Neural Networks is a peer-reviewed scientific journal published by the Institute of Electrical and Electronics Engineers (IEEE) that covers research in neural networks, learning algorithms, and related computational intelligence fields. The journal serves as a venue for work spanning theoretical foundations, algorithm design, and applications, attracting submissions from universities, research institutes, and industrial laboratories worldwide. Its authorship and readership extend across artificial intelligence, signal processing, robotics, and cognitive modeling.
The journal emerged during a period of renewed interest in connectionist models and computational learning theory associated with researchers such as Geoffrey Hinton, Yann LeCun, and John Hopfield, and with institutions including Stanford University, the Massachusetts Institute of Technology, Carnegie Mellon University, Bell Labs, the University of Toronto, and the University of California, Berkeley. Early editorial leadership was tied to IEEE professional societies, most recently the IEEE Computational Intelligence Society, and overlap with conferences such as NeurIPS, ICML, IJCAI, AAAI, and ICASSP helped establish its profile. Over successive decades the journal recorded the field's shift from early connectionism to deep learning and representation learning, a trajectory later recognized by honors such as the Turing Award and the IEEE Medal of Honor.
The stated editorial scope covers theoretical, methodological, and applied studies of artificial neural networks and learning systems, encompassing topics pursued by university groups worldwide and by industrial laboratories such as Microsoft Research, IBM Research, NVIDIA Research, Baidu Research, and Huawei Noah's Ark Lab. The editorial board traditionally includes experts in adaptive systems, stochastic optimization, neural architectures, spiking networks, reinforcement learning, and statistical learning theory, many of them recognized through honors such as IEEE and ACM fellowships. Manuscript types include original research articles, surveys, and technical correspondences, reflecting methodologies drawn from both scientific projects (for example at CERN and the NASA Jet Propulsion Laboratory) and industrial applications such as autonomous driving and robotics.
As an IEEE journal, its publication processes follow editorial workflows common to IEEE periodicals, with editors drawn from bodies such as the IEEE Computational Intelligence Society, the IEEE Signal Processing Society, and regional IEEE sections. The journal uses online submission and review platforms similar to those of other major publishers, and its articles are hosted in the IEEE Xplore Digital Library. Access models have included subscription-based access through institutional and national research libraries, alongside open access options aligned with funder policies from agencies such as the National Science Foundation, the European Research Council, the Wellcome Trust, the NIH, DARPA, and the Japan Society for the Promotion of Science.
The journal is abstracted and indexed in major bibliographic databases and citation indices, including Web of Science (Clarivate), Scopus (Elsevier), and Google Scholar, as well as national and regional services such as the China National Knowledge Infrastructure (CNKI), CiNii, and SciELO. Inclusion in these services supports literature discovery alongside the proceedings of conferences such as NeurIPS, ICML, ICLR, CVPR, and ECCV that its authors frequently cite.
The journal's impact has been measured by citation metrics from services such as Web of Science and Scopus, and its articles have influenced technologies commercialized by corporations including Google, Microsoft, Amazon, Meta Platforms, Apple, and NVIDIA. Published research has contributed to milestones recognized by learned academies such as the National Academy of Engineering, the Royal Society, and the Chinese Academy of Sciences, and has been referenced in policy and technical roadmaps from the European Commission and national research funders including the NSF and UK Research and Innovation.
Notable contributions have included foundational work on learning algorithms, architecture design, and application case studies from laboratories at institutions such as the University of Toronto, Stanford University, MIT, Carnegie Mellon University, Google DeepMind, OpenAI, Microsoft Research, and IBM Research, along with theoretical advances associated with Vladimir Vapnik, Yann LeCun, Geoffrey Hinton, Yoshua Bengio, David Rumelhart, Terrence Sejnowski, Michael I. Jordan, Stephen Grossberg, Judea Pearl, Leslie Valiant, Andrew Ng, Ian Goodfellow, Jürgen Schmidhuber, Sepp Hochreiter, and Corinna Cortes. Work published in the journal has fed into speech and vision systems such as commercial voice assistants and medical imaging tools used in clinical research settings, including collaborations with the NIH.
Category:IEEE academic journals