The CPU (central processing unit) is the heart of any modern computer. It executes a stream of instructions across a small number of cores (often fewer than ten in consumer machines).

CPUs can also be used for machine learning inference, but their architecture is less well suited than that of GPUs to the massively parallel computations of modern machine learning models. CPUs are, however, significantly cheaper and easier to obtain than GPUs, and can be effective for machine learning, especially at inference time and for small batch sizes.
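One common way to make the most of a CPU at inference time is to spread independent requests across its cores. The sketch below illustrates the idea using only the Python standard library; `predict` is a hypothetical stand-in for a real model's forward pass, not any particular framework's API.

```python
import os
from multiprocessing import Pool

def predict(x):
    # Hypothetical stand-in for a model's forward pass on one input.
    return x * 2 + 1

def batch_inference(inputs):
    # Fan independent inference requests out across the available CPU cores.
    workers = os.cpu_count() or 1
    with Pool(processes=workers) as pool:
        return pool.map(predict, inputs)

if __name__ == "__main__":
    results = batch_inference([1, 2, 3])
    print(results)
```

For small batches like this, process-level parallelism is often enough to keep latency low without any GPU at all.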

At TitanML, we offer enterprises the option to utilise CPUs in their AI deployments, following our success in real-time deployment of a state-of-the-art Falcon LLM on a commodity CPU [1].
