Artificial Intelligence is poised to affect virtually everything, and it is expected to be embedded in nearly all electronic equipment.

AI processing will take place both in the cloud and on intelligent devices themselves: on-device processing reduces the volume of data sent over the network to the cloud, lowers latency for latency-sensitive applications, and provides better autonomy when network performance is poor.

This means more and more devices will embed AI, including battery-operated devices.

NVMEngines’ Intelligent Memory products consume less power than conventional memory and are optimized for the higher performance that AI applications require. At a fraction of the size of SRAM, they provide a path to both lower power and lower system cost.

In the future, NVMEngines’ memory products will enable the efficient implementation of neuromorphic chips, in which storage and processing are combined within “neurons” that communicate and learn together.