AI at the edge: are you the "real-time" key master?
Published on 21 January 2020
To quote Daft Punk, we want AI "faster, better, stronger", but what the hell does that mean? If anyone is wondering what AI's next move to go faster is, it is actually to go smaller, to go wiser and to reduce the weight of the algorithms, and that is what we get with edge AI: fast and immediately actionable outcomes. What does immediately actionable mean? Guess what, it actually means immediately. If a life depends on it, it's now, not in 10 seconds nor in 5, it's right now! Patience is not a virtue of edge AI.
In order to avoid adding more noise to the current cacophony around AI, in this post I will only discuss what I believe is likely to bring the most value out of it. What does AI at the edge mean? Or more specifically, what does edge mean? Simple: it means on the spot. It means doing it on the go. Basically, we can define it in "opposition" to remote cloud computing. However, this is not to undermine cloud computing: edge AI is only valid in specific situations, and if the volume of data is too large or if tremendous processing power is required, then edge AI might not be made for you, laddie.
In a nutshell, AI-enabled IoT devices raise the bar for smart real-time decisions. Indeed, with edge computing, the collection, storage, analysis and modelling of data gathered by IoT devices are not performed in the cloud. Contrary to the cloud, where AI is performed by one large processing unit (I concur, that's a bit of a shortcut!), AI at the edge relies on a distributed set of small but powerful computing devices working in concert for local decision-making purposes.
By eliminating data transfers, latency is no longer an issue, and as such the validity of real-time decisions is likely to be enhanced. Let's face it: for applications in predictive maintenance, medical imaging, autonomous driving, fraud prevention, etc., real-time decision making is a must, otherwise one may question the added value of AI. Can you imagine the situation? A machine is overheating, the sensors get the data, the data are sent to the cloud, the cloud processes them, and by the time the sensible decision or commanded action comes back, damn it, BANG, the machine has exploded and the man standing next to it has been barbecued… NOW MEANS NOW!
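To make the overheating scenario concrete, here is a toy sketch in pure Python contrasting the two deployments. The decision logic is identical in both cases; what changes is how long the reading travels before anyone acts on it. Every number below (threshold, latencies, reading) is invented for illustration.

```python
# Toy model of the overheating machine: the same shutdown rule, evaluated
# either next to the sensor (edge) or after a cloud round trip.
# All numbers are invented for illustration.

CRITICAL_TEMP_C = 90.0     # hypothetical shutdown threshold
CLOUD_ROUND_TRIP_S = 2.5   # hypothetical network + queueing delay
LOCAL_COMPUTE_S = 0.01     # hypothetical on-device decision time

def decide(temp_c):
    """The decision itself is the same everywhere: shut down if too hot."""
    return "SHUTDOWN" if temp_c >= CRITICAL_TEMP_C else "OK"

def edge_response_time():
    # The reading never leaves the device: decide immediately.
    return LOCAL_COMPUTE_S

def cloud_response_time():
    # Same decision, but the reading travels to the cloud and back first.
    return CLOUD_ROUND_TRIP_S + LOCAL_COMPUTE_S

reading = 93.2  # degrees Celsius, invented
print(decide(reading))  # → SHUTDOWN
print(f"edge: {edge_response_time()}s vs cloud: {cloud_response_time()}s")
```

The point is not the arithmetic, it is the structure: at the edge the decision latency is bounded by local compute alone, while the cloud path adds a network term you do not control.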
Also, predictive maintenance is fully integrated, and here's some awesome news: connectivity is no longer a bottleneck, as data are dealt with locally, making the solution much cheaper than the cloud alternative! Let's talk about security: since edge computing keeps sensitive data locally, the attack surface shrinks as well. If a hacker tries to access a network through IoT devices, the AI engine can detect the anomaly, trigger the appropriate action and stop the "Bad Hombre".
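The intrusion-detection idea can be sketched with a simple statistical anomaly detector running on the device itself. A real deployment would use a trained model; here a 3-sigma rule over a sliding window stands in for it, and every number (window size, traffic values) is invented.

```python
import statistics

# Toy on-device anomaly detector: flag a reading that sits far outside
# the recent baseline (here, requests per second seen by an IoT device).
# The window size and the 3-sigma rule are illustrative choices.

class AnomalyDetector:
    def __init__(self, window=20, n_sigmas=3.0):
        self.window = window
        self.n_sigmas = n_sigmas
        self.history = []

    def observe(self, value):
        """Return True if `value` looks anomalous vs. the recent baseline."""
        anomalous = False
        if len(self.history) >= self.window:
            mean = statistics.fmean(self.history)
            stdev = statistics.pstdev(self.history)
            anomalous = stdev > 0 and abs(value - mean) > self.n_sigmas * stdev
        self.history.append(value)
        self.history = self.history[-self.window:]  # keep a sliding window
        return anomalous

detector = AnomalyDetector()
for v in [10, 11, 9, 10, 12, 11, 10, 9, 11, 10] * 2:  # normal traffic
    detector.observe(v)

print(detector.observe(11))   # → False (ordinary reading)
print(detector.observe(500))  # → True (sudden burst, e.g. a scan or flood)
```

Because the whole loop runs on the device, the "trigger the appropriate action" step (dropping connections, raising an alert) can fire without waiting for any backend.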
Now, you may wonder where we stand regarding chips and hardware. Once again, I would say that the future is already past… Indeed, check out NVIDIA's Jetson boards, Intel's Movidius, or GreenWaves' GAP8, among others. These chips, combined with efficient AI programming libraries such as TensorFlow Lite, X-CUBE-AI or NNoM, are opening up new horizons. To get the most out of deployed IoT devices, AI-enabled solutions supported by edge computing have to be available. As mentioned previously, you cannot rely on the cloud to drive real-time decision making, or at least it is not the most efficient way.
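As a taste of how those libraries squeeze models onto such chips, here is a toy sketch of post-training int8 quantization in pure Python: store a single float scale per tensor and 1-byte integers instead of 4-byte floats, cutting the weight footprint roughly 4x. The weights are invented and the scheme is deliberately simplified (no zero-points, no per-channel scales); TensorFlow Lite and NNoM implement far more refined versions of the same idea.

```python
# Toy post-training quantization: map float32 weights to int8 plus one
# scale factor, the core trick edge libraries use to shrink models.
# Weights are invented for illustration.

def quantize(weights):
    """Return (int8 values, scale) such that each w ≈ q * scale."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.82, -1.27, 0.05, 0.61, -0.33]  # pretend layer weights
q, scale = quantize(weights)
restored = dequantize(q, scale)

# 4 bytes per float32 weight vs. 1 byte per int8 weight (plus one scale).
print(f"int8 values: {q}")
print(f"max error:   {max(abs(a - b) for a, b in zip(weights, restored)):.4f}")
print(f"size:        {4 * len(weights)} bytes -> {len(q)} bytes")
```

The rounding error is bounded by half a quantization step, which is why, in practice, a well-quantized network loses very little accuracy while fitting into the few hundred kilobytes an MCU like the GAP8 offers.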
To summarise, edge AI is a key driver of effectiveness, efficiency and increased productivity. Though today the cloud is generally seen as the panacea, considering what's available you may wonder whether it is the right solution for what you are trying to achieve, and here the new kid on the block might just be the best way forward…