2021+ Alexa models can perform voice processing locally; this is configured via Device Settings in the Alexa app. Speech recognition accuracy is slightly degraded when you opt in to local processing: https://www.slashgear.com/amazon-skips-the-cloud-with-local-...
> Amazon is switching on local voice recognition processing, promising users of some of its latest Echo smart speakers and smart displays that they can have their Alexa commands avoid the cloud completely... taps into the retail giant's homegrown AZ1 Neural Edge chipset.. followed Google, Apple, and others in creating its own custom silicon. While the AZ1 may not have been able to power the whole Alexa experience, it was focused instead on specific voice recognition features.
That's cool, I didn't know that was an option. But what happens after that? Once it knows what you want to do, a request is usually sent somewhere to fulfil it, right?
A small minority of functions can be done without internet, e.g. control of local Zigbee devices connected to the Echo's Zigbee hub. Most other functions, including control of local devices connected via WiFi, go through the AWS/Alexa cloud. When local voice processing is enabled, only a text transcription of the voice request is sent to the cloud, not the captured audio.
(Echo 4 is one of the few Zigbee hub options with US firmware)
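The split described above (on-device transcription, with only Zigbee targets fulfilled locally and everything else going to the cloud as text) can be sketched roughly as follows. This is a hypothetical illustration, not Amazon's actual code; the device registry and function names are invented for the example.

```python
# Hypothetical sketch of the routing described above: audio is
# transcribed on-device, only the text transcript ever leaves the
# device, and requests targeting local Zigbee devices are fulfilled
# without internet. Device names here are invented examples.

LOCAL_ZIGBEE_DEVICES = {"bedroom lamp", "hallway plug"}  # assumed local hub registry

def route_transcript(transcript: str) -> str:
    """Return where a transcribed request would be fulfilled."""
    # Strip a couple of example command prefixes to find the target device.
    target = transcript.removeprefix("turn on ").removeprefix("turn off ")
    if target in LOCAL_ZIGBEE_DEVICES:
        return "local-zigbee"  # fulfilled entirely on the local hub
    return "cloud"             # text transcript (not audio) sent to AWS/Alexa

print(route_transcript("turn on bedroom lamp"))  # local-zigbee
print(route_transcript("what's the weather"))    # cloud
```

The point of the sketch is simply that the cloud round-trip only disappears for the narrow set of requests the hub can resolve itself; everything else still produces outbound traffic, just text rather than audio.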
Is use of Alexa+ actually mandatory for an Echo moving forward? Can users choose to stay on their current semi-local system without AI like you can now?
That's the multi-million-dollar question. Can they alter the functionality of purchased devices so drastically without exposure to a class-action lawsuit over fitness for purpose? If an Echo Zigbee device is disconnected from the internet and currently working, how long will it continue to work?
If Echo Zigbee devices are effectively bricked for their current offline purpose and use cases, it could motivate attempts to re-purpose the hardware. Has nothing been learned from the recent Sonos debacle?