The history of computing can be seen as mankind’s journey toward making a machine imitate the human mind. Our brains process multiple streams and many types of data, simultaneously and in real time. We are able to focus on what’s important to us at any given time while we ignore the extraneous stuff. And now we are building machines that mimic this ability, but with far more computational power than we were born with.
The reach toward real time computational awareness by IT vendors is captured most obviously in IBM's Cognitive Computing tagline, which now serves as a moniker for everything IBM. But it can also be seen in the significant R&D efforts by Cisco and HPE currently directed at what is now known as the Internet of Things (IoT), a place where multiple sensory data streams need real time processing to extract maximum value. While machines that tell us what happened yesterday are nice to have, those that tell us what is happening right now, or better yet what will happen next, are far more valuable and come closest to the way we think.
In my view, the biggest obstacle on the path toward real time thinking machines is computational latency. In the movie The Imitation Game (see it if you haven't already), Benedict Cumberbatch, playing Alan Turing, sets a wall of electro-mechanical counters in motion. His team then waits for results that can sometimes come days later. That's computational latency with a capital L. And while computers are now exponentially faster, latency is still holding us back.
The good news is that the journey to near zero latency has multiple pathways. The rapid rise in popularity of solid state persistent storage devices is but one. The many all-flash arrays now replacing hard disk systems in increasing numbers attest to the viability of this quick and easy way to make applications run faster. But there are others. Michael Stonebraker, a recent winner of the Turing Award, decided to reduce latency by launching a project that developed stream processing using StreamSQL, in which computations are performed before the data ever touches a storage device. In-memory computing, which I've featured here, is another. Of course, there have traditionally been brute force approaches that attack latency by building bigger, faster boxes; think high performance computing (HPC). There is the tear-it-up-and-start-over approach that HPE is currently pursuing with The Machine. And there is yet another approach, now proposed by DataCore Software and also discussed here, that applies parallel processing concepts to the commodity multicore processors common in inexpensive laptops.
As each of these technological approaches advances, the resistance it is likely to encounter along the way will be a critical factor in predicting winners and losers. Resistance in this context can be expressed in multiple ways. Sheer cost is one: if an inexpensive solution gets the same ultimate result as an expensive one, the one with the lower price tag wins. Resistance can also be expressed as degree of difficulty. Assume you have an application that you want to accelerate and you are considering two of the technologies above, but one requires a complete rewrite of the application while the other doesn't. All other things being equal, you would choose the simpler solution. A third factor is risk, i.e., how much risk exposure the solution creates for the developer and the consumer. Yet a fourth resistance factor is market acceptance. In this case, assume you are a developer writing a real time computing application. You will likely choose the solution that offers the fastest time to market, the broadest appeal, the least risk, and the lowest cost.
I believe that as we experience real time computing we will want more of it, and we are now more likely to get that experience on smart phones than on laptops. Consider Google Maps, for example. It's hard to duplicate on a laptop the experience of knowing exactly where you are at any given moment. And why would you, when your smart phone is easier to use and now a significant part of your life? There is a waiting market for real time applications, and the vendors who know the path of least resistance to near zero latency will win.