| coremltools | |
|---|---|
| Name | coremltools |
| Developer | Apple Inc. |
| Initial release | 2017 |
| Latest release | 6.x |
| Programming language | Python, C++ |
| Operating system | macOS, iOS, iPadOS, tvOS, watchOS |
| License | Apache License 2.0 |
coremltools
coremltools is an open-source model conversion and optimization toolkit developed by Apple Inc. It translates machine learning models from frameworks such as TensorFlow, PyTorch, Keras, and scikit-learn into the Core ML format consumed by Apple's on-device inference engine, letting developers deploy trained models to iPhone, iPad, Apple Watch, Apple TV, and Mac.
coremltools provides conversion utilities, model specification handling, and optimization passes that produce models compatible with the Core ML runtime on iOS, iPadOS, macOS, tvOS, and watchOS. It accepts models from TensorFlow, PyTorch, ONNX, Keras, and scikit-learn, and integrates with developer tools such as Xcode and the Apple Developer program. Development takes place in the open on GitHub, with contributions from Apple engineers and the wider community.
coremltools features model graph translation, operator mapping, quantization, pruning, and editing of the Core ML model specification. Its architecture separates framework-specific frontends (for TensorFlow, PyTorch, and ONNX) from optimization and serialization layers that emit a Core ML specification consumable by the runtime on Apple devices. Converted models can be profiled with Xcode and Instruments, and at inference time Core ML schedules work across the CPU, GPU, and Neural Engine of Apple silicon such as the A14 and M1.
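The operator-mapping stage can be illustrated with a toy pass in plain Python (this is a conceptual sketch, not coremltools' actual code): a lookup table rewrites source-framework op names into target op names, and unmapped ops are collected so the converter can fail loudly instead of silently dropping them.

```python
# Toy operator-mapping pass (conceptual sketch, not coremltools internals).
# The op names on both sides are hypothetical.
OP_TABLE = {
    "Conv2D": "convolution",
    "Relu": "activation_relu",
    "MatMul": "inner_product",
}

def map_ops(source_graph):
    """Rewrite each op via the table; report ops with no mapping."""
    mapped, unsupported = [], []
    for op in source_graph:
        target = OP_TABLE.get(op)
        if target is not None:
            mapped.append(target)
        else:
            unsupported.append(op)
    return mapped, unsupported

mapped, missing = map_ops(["Conv2D", "Relu", "FancyOp"])
print(mapped)   # ['convolution', 'activation_relu']
print(missing)  # ['FancyOp']
```

Real converters carry attributes, weights, and shapes along with each op, but the shape of the pass, translate what the table knows and surface what it does not, is the same.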
The toolkit includes converters for the major source formats: TensorFlow models (including Keras and SavedModel exports), PyTorch models via TorchScript tracing or scripting, the ONNX intermediate representation in earlier releases, and classical pipelines from scikit-learn. The Core ML specification itself is serialized with Protocol Buffers, while source models may arrive as HDF5 (Keras) or protobuf graphs; ONNX is an interchange standard maintained by the Open Neural Network Exchange community.
coremltools exposes Python APIs for converting and inspecting models, modifying specification fields, and applying optimizations, and is commonly used from environments such as Jupyter notebooks, PyCharm, or Visual Studio Code. A typical workflow traces or exports a model with the source framework's own APIs (for example TorchScript in PyTorch) and then calls the conversion routines. Teams that ship frequent model updates to Apple devices often run conversion in continuous integration systems such as GitHub Actions, Jenkins, or Travis CI.
Optimization features include 8-bit and 16-bit quantization, operator fusion, layer folding, and graph pruning, which reduce model size and latency on mobile hardware such as the Apple A12 Bionic, A13, and M1. Profiling and benchmarking typically rely on Xcode Instruments to measure throughput and energy use on target devices such as iPhone and iPad; Core ML's GPU execution path builds on Metal Performance Shaders.
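The idea behind 8-bit quantization can be sketched in NumPy (this mirrors the general linear-quantization scheme, not coremltools' exact implementation): map the float range of a weight tensor onto the integers 0..255 with a scale and zero point, then reconstruct and check the error.

```python
import numpy as np

# Linear (affine) 8-bit quantization sketch; values are illustrative.
rng = np.random.default_rng(0)
weights = rng.normal(size=(64, 64)).astype(np.float32)

lo, hi = float(weights.min()), float(weights.max())
scale = (hi - lo) / 255.0              # step size mapping [lo, hi] -> [0, 255]
zero_point = np.round(-lo / scale)     # integer that represents 0.0

# Quantize to uint8, then dequantize back to float for comparison.
quantized = np.clip(np.round(weights / scale) + zero_point, 0, 255).astype(np.uint8)
dequantized = (quantized.astype(np.float32) - zero_point) * scale

max_err = float(np.abs(weights - dequantized).max())
print(quantized.dtype, max_err)  # per-element error stays on the order of the step size
```

The storage cost drops from 4 bytes to 1 byte per weight, which is the main lever behind the size reductions described above; the accuracy impact depends on how sensitive the model is to the rounding error.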
coremltools is integrated into Apple's developer ecosystem: converted models are bundled into Xcode projects, invoked at runtime from Swift or Objective-C through the Core ML framework, and shipped through the App Store. It works alongside related Apple frameworks such as Vision, Create ML, and ARKit, and third-party model hubs such as Hugging Face and TensorFlow Hub host Core ML exports of popular models.
Apple introduced coremltools in 2017 alongside the release of Core ML. Successive releases added and later consolidated converters: version 4 introduced the unified conversion API built on the Model Intermediate Language (MIL), direct PyTorch conversion via TorchScript reduced the need for the ONNX path, and the standalone ONNX converter was eventually deprecated. Development continues on GitHub with contributions from Apple and the wider community.
Category:Machine learning software