A few years ago, no one knew if the IoT was real or a pipe dream. Now, the approach has diversified into multiple variations that target specific application niches. The concept has matured and the industry around it is growing up.
Whenever a new design approach arises, there is a period of uncertainty about its fate. It might never pan out, it might become a niche solution, or it might become a game-changer that opens a whole new design discipline. That uncertainty is captured nicely in the Gartner Hype Cycle, originally defined by technology analysts at the Gartner Group. According to that cycle, shown in Figure 1, technologies go through a period of widespread interest as the potential gets explored, followed by disillusionment as implementation challenges are discovered. If a technology survives those stages, it eventually matures into rising productivity.
Figure 1 The Gartner Hype Cycle shows how the market acceptance of new technologies typically evolves. Source: Wikipedia
I have seen this cycle first-hand with microprocessors. When I started my engineering career, microprocessors only offered 8-bit buses and clock speeds had just entered the 1 MHz range. They were being touted as the solution to all manner of control problems and set the industry’s imagination on fire.
Unlike some contemporary new technologies, such as ECL, microprocessors exploded in popularity. Countless companies arose offering chips, development tools, test tools, and software libraries. Trade shows and publications dedicated to problem-solving with the technology appeared and grew, drawing hundreds of thousands of engineers together to trade insights into what was being called "embedded systems."
More and more development teams turned to the microprocessor as a core element in their designs as applications expanded from simple control logic replacement to complex, data-driven intelligence for machines. Soon devices branched out into 16-bit, 32-bit, and wider operation, and clocks climbed toward the GHz range. A variation on the approach based on greater integration turned microprocessors into self-contained mini-systems, spawning the new category of “microcontrollers.”
Microprocessors as a design element have since reached a high level of maturity. Developers no longer need to gather in droves to learn about the latest developments. Microprocessors have become an essential element of virtually every system designer’s toolkit. They don’t generate nearly the levels of excitement seen in their early days as the focus is now on refining, not exploring, the design approach.
The IoT has been following the same kind of trajectory. When I started with the website IoT World, most designers didn’t really understand what the term meant. Initial applications involved sensors and controllers communicating their readings and receiving commands across the internet, working with data storage and control algorithms running on large computers located remotely. The potential to do even more was clear, but the challenges remained to be discovered.
Fast forward a decade and the IoT has become an important design approach. One indicator is the diversity that has arisen in approaches to the IoT. There are the historical applications of network-connected sensors and controllers, which are seeing increasing adoption. Devices like the Nest Thermostat, for instance, still rely on remote computers to deliver their full functionality. More importantly, however, the IoT has branched out into applications that were not initially envisioned.
These expanded applications range considerably in cost and complexity. Small, receive-only IoT devices are seeing use as display devices for prices on store shelves. At the other end of the spectrum are IoT devices with built-in artificial intelligence (AI) to provide complex control interactions while minimizing the need for communications bandwidth and remote computational support. This diversity has given rise to new terms, such as the industrial IoT (IIoT) and the artificial intelligence of things (AIoT), to describe the variations.
If the trajectory defined by the Gartner Hype Cycle and the experience with microprocessors are reasonable predictors of the IoT's future, the technology is now entering the phase of growing maturity. It appears that the common challenges to the IoT approach have all been identified: security, privacy, connectivity, bandwidth, cost, device management, and the like. Solutions to these common challenges have become less about innovation and more about refinement, another sign of a technology's maturity.
More importantly, the IoT approach has diversified into multiple categories, each facing challenges unique to its application space and pursuing refinements tailored to its needs. Basic sensors and displays will focus on low-cost, low-energy, and energy-harvesting challenges. Inventory condition monitoring will concentrate on mobile connectivity issues as goods move across towns, states, and continents. The IIoT will focus on issues such as reliability, connectivity, and fail-safe operation. Consumer devices will have to wrestle with privacy issues, and complex systems in the AIoT will work on the cost-effectiveness of providing AI at the edge. As these application spaces develop solutions to their challenges, the new capabilities those solutions enable will open even more application spaces to the IoT design approach.
The IoT has gone from being a novel design concept to becoming a mainstream approach with an increasing number of applications. Like the microprocessor before it, the IoT approach will become an essential tool in the developer’s design kit and excitement about the approach will fade. For now, though, there is still a need to keep your Eye on IoT.
Rich Quinnell is a retired engineer and writer, and former Editor-in-Chief at EDN.