A new chip frees large AI models from the cloud and runs them on-device at low power, enabling full conversational dialog and complex AI inference inexpensively.
A New Spin on Embedded Memory