How infrastructure convergence drives IoT
For those of us who have spent our careers in the technology space, a long-term perspective helps in recognizing cyclical shifts in the introduction of disruptive technologies. In the early days it was ARPANET (1980s), which led to the World Wide Web (1990s), then mobility (late 1990s/early 2000s), then social media (2004) and now IoT, which in sheer numbers dwarfs everything that preceded it.
All of these technologies are incremental and interdependent (mobility without the internet is just a phone; social media without a mobile device keeps us in our rooms). Every time a new capability is layered on, the potential for innovative solutions expands exponentially (look at the vast cottage industry of mobile apps that depends on mobility layered on top of internet infrastructure). As has happened several times over the last few decades, we're experiencing another seismic shift, and this one comes with a fundamental difference.
IoT (as its name implies) is about things. There may be billions of people, but there are literally trillions of things. Why does this matter? Think about the socialization of things, or what is also referred to as the network effect. The more people there are on Facebook, the more useful and compelling it becomes; a bunch of kids chatting online in their dorm rooms is a completely different technology experience from one billion people on the app at the same time.
Now instead of thinking of a billion people on an app at the same time (which is pretty cool), think of a trillion-plus devices communicating with each other in a common framework across the globe. What is needed for something like this to come to pass?
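One way to put rough numbers on that network effect is Metcalfe's law, which values a network by its count of possible pairwise connections, n(n-1)/2. A minimal sketch in Python (the device count is purely illustrative, not a forecast):

```python
# Rough illustration of Metcalfe's law: a network's potential value
# scales with the number of possible pairwise connections, n*(n-1)/2.

def pairwise_connections(n: int) -> int:
    """Number of distinct pairs in a network of n nodes."""
    return n * (n - 1) // 2

people = 10**9   # ~1 billion users on a social app
things = 10**12  # ~1 trillion connected devices (illustrative)

print(f"People: {pairwise_connections(people):.3e} possible connections")
print(f"Things: {pairwise_connections(things):.3e} possible connections")
# A 1,000x jump in nodes means a ~1,000,000x jump in potential
# connections, which is why trillions of things are a different animal.
```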
IoT has device dependencies. You can enable an IP address on a device, but how that device uses this capability depends on its original purpose: is it a fitness device on your wrist, or a sensor node on an oil pipeline? Both report data, but in completely different process contexts. The breakdown looks like this:
Process – All business depends on process; whether you’re tracking patient data, refining oil, or building a car, information flows from one point to another, with each step dependent on the data in the preceding step. In many cases IoT is a series of connected things with a common end-point objective. By adding an accessible IP component to an already established process, the level of visibility (and hopefully insight) that can be gleaned will have a massive effect on companies’ ability to fine-tune their processes in real time, as the sketch below illustrates.
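To make the step-by-step dependency concrete, here is a minimal sketch of such a process, assuming a hypothetical pipeline-pressure sensor; every name and threshold below is invented for illustration:

```python
# Hypothetical sketch: an established process as a chain of steps,
# where each step consumes the output of the preceding one.
from typing import Callable

def ingest(raw: dict) -> dict:
    # Step 1: an IP-enabled sensor posts a raw reading into the process.
    return {"device_id": raw["device_id"], "psi": float(raw["psi"])}

def validate(reading: dict) -> dict:
    # Step 2: depends on step 1's output; flags out-of-range values.
    reading["in_range"] = 0.0 <= reading["psi"] <= 1500.0
    return reading

def decide(reading: dict) -> str:
    # Step 3: depends on step 2's flag; real-time visibility is what
    # lets the company fine-tune the process instead of finding out later.
    return "adjust valve" if not reading["in_range"] else "no action"

pipeline: list[Callable] = [ingest, validate]
data = {"device_id": "pipeline-sensor-42", "psi": "1712.5"}
for step in pipeline:
    data = step(data)
print(decide(data))  # -> "adjust valve"
```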
Context – The context of data defines its use. While all data is potentially useful, some data is clearly more useful than the rest, and context is what makes the difference. Who is the end user, and how important is the data to helping them achieve strategic objectives? Operational data that improves production and profitability is a superset of the data that, say, tells a line worker to adjust a valve. Same basic data, different context; tying process to the end user is what defines context (a small sketch follows).
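Here is one way to picture "same basic data, different context": the reading itself never changes, but the end user it is tied to determines what it becomes. All names and figures are hypothetical:

```python
# Hypothetical sketch: one reading, routed differently by context.
reading = {"sensor": "valve-7", "psi": 1712.5}

def for_line_worker(r: dict) -> str:
    # Tactical context: an immediate, actionable instruction.
    return f"ALERT {r['sensor']}: pressure {r['psi']} psi, adjust valve now"

def for_operations(r: dict) -> dict:
    # Strategic context: the same data folded into production metrics.
    return {"site_pressure_events": 1, "impacted_asset": r["sensor"],
            "est_throughput_loss_pct": 0.4}  # illustrative figure

print(for_line_worker(reading))
print(for_operations(reading))
```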
Tracking and reporting – The good news is we now have access to vast amounts of data. That’s also the bad news: the signal-to-noise ratio is about to go off the charts (in favor of noise). New analytics systems that are adept at pattern recognition on a fundamentally bigger scale are going to be critical to any long-term success in harnessing what IoT can deliver. The flip side is: what do you do with the data once you have it? Who is the end user, and what do they need to accomplish? Data visualization has always been a compelling way to make the complex understandable. While technical people may intuitively understand complex data sets, the business folks who can write big checks are the ones who ultimately need to feel comfortable with what they see. Tracking the right data, and reporting on it in a way that makes sense, is critical to the successful use of IoT (see the toy filter below).
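As a toy illustration of the signal-to-noise problem, a first-pass filter can suppress readings that sit inside normal variation and surface only the outliers worth reporting. This is a deliberately naive z-score-style filter, not a stand-in for real pattern-recognition analytics:

```python
# Naive anomaly filter: surface only readings far from the mean.
import statistics

readings = [101.2, 99.8, 100.4, 100.1, 137.6, 99.9, 100.3, 62.1, 100.0]

mean = statistics.mean(readings)
stdev = statistics.pstdev(readings)

# Keep only readings more than two standard deviations from the mean.
signal = [r for r in readings if abs(r - mean) > 2 * stdev]
print(f"{len(readings)} readings in, {len(signal)} worth reporting: {signal}")
```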
Quality management – Data that does not factor in its own quality is useless. IoT is not only extending the sources of data entering any system, it is also introducing ample opportunities for redundancy, overlap, errors, and the like. Because of the compounding nature of the network effect, any inconsistencies that result from data quality issues are going to be amplified into a much bigger mess. Check your news feed any day for a great example (Equifax, Target, Yahoo; the list is sadly endless). If companies can screw up on this level with their existing data sources, imagine what they can do when those sources are exponentially bigger. The sketch below shows the kind of basic gates that help.
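A tiny sketch of what basic quality gates look like in practice: the same event arriving twice, an out-of-range value, and a malformed record all get stopped before they can compound downstream. The ranges and shapes here are placeholders:

```python
# Hypothetical sketch: basic quality gates before data enters the system.
raw_events = [
    {"id": "e1", "sensor": "temp-3", "value": 21.4},
    {"id": "e1", "sensor": "temp-3", "value": 21.4},    # duplicate
    {"id": "e2", "sensor": "temp-3", "value": -999.0},  # out of range
    {"id": "e3", "sensor": "temp-3", "value": None},    # malformed
    {"id": "e4", "sensor": "temp-3", "value": 22.1},
]

seen: set[str] = set()
clean = []
for e in raw_events:
    if e["id"] in seen:
        continue                            # redundancy/overlap
    seen.add(e["id"])
    if e["value"] is None:
        continue                            # malformed record
    if not (-40.0 <= e["value"] <= 125.0):  # plausible sensor range
        continue                            # sensor error
    clean.append(e)

print(f"{len(raw_events)} raw events -> {len(clean)} trusted events")
```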
Enabling infrastructure – Any disruptive technology tends to enter the market organically, with multiple vendors and multiple standards vying to be the one ring to rule them all. The challenge with IoT is that the coverage model is so broad that it’s going to force systems that were not designed to work together from an information perspective to begin to collaborate. Want a simple example? BMC and ServiceNow (both massive IT infrastructure providers) have been at each other’s throats for years. Some companies are BMC shops, others like ServiceNow, and many actually use both, but for complementary functions. As the range of data sources delivered by IoT enters the IT ecosystem, it’s going to create a forcing function across the industry. The question is: can big companies, or the channels that enable them (which are also big companies), force everyone to just get along?
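One pragmatic pattern for getting incompatible systems to collaborate is a normalization layer that maps each vendor's events into a single common schema before anything downstream consumes them. The payload shapes below are invented for illustration and do not reflect either vendor's actual API:

```python
# Hypothetical sketch: normalize events from two incompatible systems
# into one common schema. Payload shapes are invented, not vendor APIs.
def from_system_a(evt: dict) -> dict:
    return {"asset": evt["ci_name"], "severity": evt["sev"],
            "source": "system-a"}

def from_system_b(evt: dict) -> dict:
    return {"asset": evt["device"], "severity": evt["priority"],
            "source": "system-b"}

events = [
    from_system_a({"ci_name": "router-12", "sev": 2}),
    from_system_b({"device": "router-12", "priority": 2}),
]
# Downstream tooling now sees one shape, whichever system produced it.
for e in events:
    print(e)
```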
I have zero doubt this is going to work. The industry has consistently proven capable of adopting wildly disruptive technology and eventually rolling it into something that works for the end user. It will require companies to adapt (to paraphrase Darwin, it’s not the strongest or smartest that survive; it’s the most adaptable). Some companies that rode the last wave will disappear, some will get gobbled up, and some will adapt and drive the system forward for years, creating a new crop of billionaires in the process.