Cloud-based AI has enabled on-demand business intelligence for enterprises. But effective enterprise AI will have to move to the intelligent edge.
Call it what you will—artificial intelligence, machine learning, advanced analytics—it's the new indispensable thing in IT.
Artificial intelligence (AI) and machine learning are all around us now, and soon they will be everywhere. AI won’t just be in our phones and household gadgets and business software platforms, but in every car, every building, every medical device—not to mention every conference call and board meeting. We will rapidly reach a point where we cannot do business or get through the day without intelligence in everything we do.
We can already see growth patterns emerging: Smart clouds are growing in prominence; smart apps are becoming ubiquitous. Predictive application programming interfaces (APIs) have sprung up in healthcare and logistics environments—not to mention the vast personalized marketing we all experience on the Internet.
But just as we can see emerging patterns, we're beginning to see bottlenecks. The Internet wasn't built to do what we're asking it to do, and even as new infrastructure makes its way into our thinking about networking, we're already falling behind.
That's why we need the intelligent edge: to support AI-driven devices and analytics without forcing data to make onerous round trips to the cloud. With the intelligent edge, these intelligent processes can happen natively, on the device itself.
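To make the idea concrete, here is a minimal, hypothetical sketch of what "natively, on the device itself" means in code. The threshold, function names and readings are invented for illustration; the point is simply that every reading is classified locally, with no network call:

```python
# Hypothetical sketch: an edge device classifies sensor readings locally
# instead of shipping every reading to a cloud AI service.

LOCAL_THRESHOLD = 75.0  # assumed calibration value, purely illustrative


def classify_locally(reading: float) -> str:
    """A tiny stand-in for an on-device model: no network round trip."""
    return "alert" if reading > LOCAL_THRESHOLD else "normal"


def process(readings):
    # Every reading is handled natively on the device; only a
    # summarized result would ever need to travel to the cloud.
    return [classify_locally(r) for r in readings]


print(process([42.0, 80.5, 71.3]))  # -> ['normal', 'alert', 'normal']
```

A real on-device model would of course be trained in the cloud and deployed to the device, but the inference step itself stays local.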
The increasing presence of automated decision making and prediction in our apps, devices and business processes has made AI an essential enterprise commodity. Organizations of all kinds need to embrace it if they want to remain competitive.
Many major vendors have also integrated AI into their applications and platforms, which brought many companies on board with AI technologies quickly. But plugging in AI this way may not always be sustainable.
AI on demand in the cloud is just that: in the cloud. And the enterprise is drifting steadily away from the cloud. Every enterprise now has an edge, exemplified by edge computing, which brings computation closer to the devices that need it. That edge is growing every day: broader in scope and more distant from the enterprise's centralized resources, to say nothing of any hands-off, third-party AI technology.
The challenge has been to enable AI-driven processes natively in applications and devices, without requiring that those processes make a round trip to the cloud.
For several years, enterprises have extended problem solving and data-processing activities into the field, away from headquarters. In this, the cloud has been a divine blessing, allowing stable, secure points of contact between those field operations (and the devices that perform them) and centralized data vaults within the enterprise.
But in the midst of this migration, entirely new kinds of devices have proliferated everywhere – Internet of Things (IoT)-enabled computing devices meant for data collection to feed those very data vaults, and thus generate increasingly refined analytics to support field work.
The problem is, there will be more than 30 billion such devices in the world, just 18 months from now – and that's orders of magnitude more than the number of smart clouds to support them.
The intelligent edge will become increasingly important to devices that operate in the field, as they become more sophisticated and autonomous. IoT-connected devices need a service layer in networking that can provide AI when it's needed – a resource external to the device that can be on-demand, dynamic, relying on agnostic interfaces to add smarts to a process only when and where they're needed, so that the apps themselves (and the devices running them) don't get bogged down. And this makes cloud-based AI only more problematic.
When an enterprise application environment lives in a cloud and needs AI from an IBM Watson or Salesforce Einstein – cloud-to-cloud – that's ideal. Clouds are increasingly well-engineered for this kind of one-on-one exchange.
But when the AI is needed in IoT-based apps and processes, that's a much less practical exchange. It puts increasing strain on the cloud resource, and clutters the network infrastructure between the edge and the supporting cloud.
The solution isn't expanded cloud resources, because clouds are too large and too costly to proliferate at the rate of IoT device proliferation.
Edge computing is already flourishing as a solution to the mismatched rates of growth between IoT and cloud platforms. So far, this new way of harnessing resources works well: More processing is happening on the edge, rather than in enterprise server rooms.
Edge computing presents itself as an obvious solution to the problem of bringing AI to IoT; it simplifies networking (and thereby response time, when apps make real-time calls), lessens burdens on cloud resources, and makes decision support analytics immediately available to the full gamut of enterprise field processes.
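A rough back-of-the-envelope illustration shows why the round trip matters for real-time calls. The latency figures below are assumed for the sake of the sketch, not measured values from any particular deployment:

```python
# Assumed, illustrative latencies in milliseconds -- not measured values.
CLOUD_RTT_MS = 120.0  # device -> distant cloud AI service -> device
EDGE_RTT_MS = 15.0    # device -> nearby edge node -> device


def waiting_time_ms(calls_per_second: int, rtt_ms: float) -> float:
    """Cumulative time spent waiting on AI calls over one second."""
    return calls_per_second * rtt_ms


calls = 50  # a hypothetical real-time app making 50 AI calls per second
print(waiting_time_ms(calls, CLOUD_RTT_MS))  # -> 6000.0
print(waiting_time_ms(calls, EDGE_RTT_MS))   # -> 750.0
```

Under these assumed numbers, the cloud-bound app spends six seconds of cumulative waiting for every second of wall-clock time, which is untenable; the edge-bound app stays within budget. The exact figures will vary, but the structural advantage of the shorter path does not.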
But even this doesn't fully solve our problem.
Even on-demand AI is based on machine learning, and for a machine to learn, it needs something to learn from: our old friend, big data. We source AI in clouds because AI comes from our vast oceans of data – and that data lives in clouds, not on the edge.
What's the last piece of the puzzle, then? The fog node – an edge computing component that does more than just streamline data networking and add computational oomph to field apps – can handle the big data problem.
When an enterprise creates a fog network to support field devices and processes, it is building a data archive in the field. This drives most security folks berserk, because those nodes don't live behind the corporate firewall and, thus, must be encrypted, top to bottom. It places these nodes and data beyond the scope of enterprise search, which in turn drives corporate data admins berserk.
But it doesn't matter if big data deployed in fog nodes to support field AI is excluded from enterprise search; it's standalone data, positioned for a single purpose, and doesn't need to be searchable by the apps and processes it's supporting – it's only there as an analytics resource. Both the edge-node-as-smart-node and big-data-on-the-edge are new ideas that ask us to think about both AI servers and big data itself in new ways.
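What a "standalone, single-purpose" data store on a fog node might look like can be sketched in a few lines. Everything here is hypothetical (the class, the window size, the z-score test are illustrative choices, not a vendor's API); the point is that the node keeps its own bounded archive of field readings and answers analytics questions locally, without enterprise search or a cloud call:

```python
from statistics import mean, stdev


class FogNodeStore:
    """Hypothetical single-purpose analytics archive on a fog node."""

    def __init__(self, window: int = 100):
        self.window = window
        self.readings = []

    def ingest(self, value: float) -> None:
        # Keep a bounded local archive; it is never exposed to
        # enterprise search -- it exists only as an analytics resource.
        self.readings.append(value)
        self.readings = self.readings[-self.window:]

    def is_anomalous(self, value: float, z: float = 3.0) -> bool:
        # Answered entirely on the node: no round trip to the cloud.
        if len(self.readings) < 2:
            return False
        mu, sigma = mean(self.readings), stdev(self.readings)
        return sigma > 0 and abs(value - mu) > z * sigma


node = FogNodeStore()
for v in [10.0, 10.2, 9.9, 10.1, 10.0, 9.8]:
    node.ingest(v)

print(node.is_anomalous(10.1))  # -> False: within the local baseline
print(node.is_anomalous(50.0))  # -> True: far outside the baseline
```

The security and encryption concerns raised above still apply to such a store; the sketch only shows why it need not be searchable to do its job.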
Many industries can exploit the possibility of AI at the intelligent edge.
Scott Robinson is director of business intelligence at Lucina Health in Louisville, Ky.