AIoT. Why it’s not just another tech buzzword.


When it comes to technology there is usually hype. How many times has ‘the next big thing’ turned out to be ‘the next big fail’, with the hype outpacing the evolution of the technology? High-definition TVs? Chatbots? Even augmented reality (AR)? In today’s fast-evolving, technology-driven world it’s crucial to be able to distinguish between what is possible now and what will come in the not-too-distant future.

Artificial intelligence (AI) is probably one of the most overused buzzwords currently in circulation. Everyone wants to claim that their product has AI in it, even if it doesn’t. And if it does contain ‘AI’, what type are we talking about here? Narrow AI, where the device performs a set of specific tasks without any human interaction, or general AI, where the device is close to making decisions for itself? The reality is that it’s narrow AI, but the person on the street is unlikely to know the difference – they’ll just think “AI”.

This over-liberal use of the term is not only causing confusion (and in some instances concern) for consumers, but it’s also having a negative impact on devices that really are getting smarter, downplaying their advancements and benefits.


Step forward AIoT, a possible future victim of buzzword association.

In layman’s terms, the AIoT is where AI and the IoT meet, bringing intelligence to the edge. As AI moves closer to the edge and into devices, such as sensors, cameras and mobiles, in many cases it eliminates the need for racks of cloud-based computing and instead moves the analysis to the IoT device itself, removing any delay in the processing. Ultimately, it’s about transforming the IoT data into useful information for an improved decision-making process, with processing done in a location where it is most needed.

Take an autonomous vehicle as an example. It has multiple cameras for computer vision, object recognition, lane warning and driver monitoring for fatigue, as well as other sensors (e.g. thermal imaging, RADAR and LiDAR) for sensor fusion. By processing at the edge, it minimises the bandwidth required to move data to and from the vehicle and avoids delays to that analysis. In connectivity black spots, or when latency is critical – such as when you’re travelling at 100 mph (on the Autobahn, of course) – edge processing could literally be the difference between life and death.
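A quick back-of-the-envelope calculation shows why that latency matters. The sketch below compares how far a car travels while waiting for a result from the cloud versus an on-device inference; the latency figures are illustrative assumptions, not measurements of any particular network or accelerator.

```python
MPH_TO_MPS = 1609.344 / 3600  # metres per second in one mile per hour

def distance_travelled(speed_mph: float, latency_s: float) -> float:
    """Metres the vehicle covers while the result is still in flight."""
    return speed_mph * MPH_TO_MPS * latency_s

cloud_round_trip_s = 0.200  # assumed cellular round trip plus server time
edge_inference_s = 0.020    # assumed on-device neural-network inference

print(f"Cloud: {distance_travelled(100, cloud_round_trip_s):.1f} m")  # ~8.9 m
print(f"Edge:  {distance_travelled(100, edge_inference_s):.1f} m")    # ~0.9 m
```

At 100 mph (about 44.7 m/s), a 200 ms cloud round trip means the car has moved roughly nine metres before any decision can be acted on; keeping the inference on the device cuts that by an order of magnitude.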

What does AIoT mean for Imagination?

We believe that AIoT is the natural evolution for both AI and the IoT as they are mutually beneficial. AI adds value to the IoT through machine learning capabilities, turning the data into useful information, while the IoT adds value to AI through connectivity and the exchange of data.

Our IP already goes into billions of devices, such as smartphones, tablets, set-top boxes and vehicles. Enabling these to become smarter by sharing data will help transform the world we live in.


In the smart city, the AIoT will enable ever-smarter edge devices to not only be data generators but data aggregators, data exchanges and data-driven decision-making brains. In the city, this means reducing or eliminating traffic jams by enabling cars to be constantly updated by street infrastructure and other vehicles, with the sharing of data enabling better decision-making for routing and safety as well as clearing the path for emergency vehicles to get through. Cars ‘talking’ to lampposts, traffic lights and street signs may seem a crazy notion now but, in the future, this will become a natural occurrence.

The AIoT will enable informed choices to be made based on real-time and predicted information. For instance, how frustrating is it to see motorway signs displaying out-of-date information because a human controller hasn’t updated them? Or not to be advised to take the next exit, only to become part of a three-mile tailback? Additionally, vehicle-to-vehicle (V2V) and vehicle-to-everything (V2X) data-sharing – and, importantly, sense-checking – will ensure there are no obstacles to progress. Soon V2X will become a basic requirement – but one requiring AIoT on trillions of sensors.

In the workplace, the factory of the future will become safer because previously ‘dumb’ industrial robots and robotic vehicles will use the AIoT to become ‘aware’ of their surroundings and of the presence of a human. If a person enters the ‘envelope’ of a robot’s movements, the robot will immediately understand what is happening and revert to a safe mode.
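The safety logic described above can be sketched in a few lines. This is a hypothetical illustration, not Imagination’s implementation: the `Detection` type, the envelope radius and the idea of feeding perception output into a mode decision are all assumptions made for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Detection:
    x: float      # metres from the robot's base
    y: float
    label: str    # e.g. "person", "pallet"

ENVELOPE_RADIUS_M = 2.0  # assumed reach of the robot arm plus a safety margin

def required_mode(detections: list[Detection]) -> str:
    """Return 'safe' if any detected person is inside the envelope."""
    for d in detections:
        if d.label == "person" and math.hypot(d.x, d.y) <= ENVELOPE_RADIUS_M:
            return "safe"
    return "normal"
```

A person detected 1.3 m away (`Detection(1.2, 0.5, "person")`) puts the robot into safe mode, while a pallet at the same spot does not; the point is that the camera’s neural-network output is evaluated on the device itself, so the mode switch doesn’t wait on a network.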


Likewise, we will have smart-to-go stores where you select your retail item, a drink perhaps, and when you leave the store it will be debited from your account, your loyalty points will be updated, and the shelf replenished, all from the actions of the sensors and cameras, and all without human interaction.

These scenarios will all require security and privacy to be designed in from the start. Here at Imagination, we are working with silicon and software teams to make the AIoT a reality and to ensure that it works well for all of us.

Is AIoT the tech buzzword of 2019?

The concept of AIoT is still relatively new and, while the hype is growing, the key thing is to distinguish between what is currently possible and what is still a way off.

Is AIoT ‘the next big thing?’

Well, we think it’s a collection of trillions of little things that together add up to a major opportunity. It will require ongoing development, new forms of connectivity, such as 5G and Wi-Fi, and major efforts in software development including advances from narrow AI to more generalised AI.

Is it all hype?

No, we don’t think so, as this is a process that has been underway for some time. In terms of smart sensors, we are only just now moving from prolific, to pervasive, to productive. At Imagination we are enabling the edge and, as with many technologies, the moment they become invisible is the moment they become most productive. Watch this space – or should that be lamppost, traffic light, street sign…

Andrew Grant

Andrew Grant joined Imagination in 2018 as a senior director, responsible for strategic business development in AI and building the wider ecosystem of AI partnerships. He has a particular interest in autonomous vehicles and ADAS and how AI can be used to create smarter IoT devices for vision, home and robotics use cases. Previously, he was involved with start-ups from UCL and CERN, chairing Satalia, an AI company based in London and has worked with UCL School of Management, WPP and SABMiller. He has also led innovation projects on the future of automotive, aviation, retail and the IoT. At British Telecom he was a CIO and Marketing leader and he has also worked with Intel, HP and EY.
