What Real-Time Computing Means for Local Commerce


What’s the future of big data? It might be speed, not size.

Earlier this month, SAP, the German business software giant, announced that it plans to put a version of its real-time analytics software Hana online. The computing platform is one of a number of new analytics products that use so-called in-memory processing — a reference to a computer’s temporary, or working, memory — to analyze vast amounts of data in an instant, allowing businesses to draw insights and make decisions in milliseconds rather than minutes or hours.

The rise of real-time computing comes as technology companies look to bring big data learnings out of the back office and into their consumer-facing products. For years, large businesses have used tools like Hadoop, the open source software developed at Yahoo, to scan sprawling datasets for patterns in hopes of unearthing insights that could make a company more efficient. Bill Bain, chief executive at ScaleOut Software, which provides real-time analytics software, explained to me earlier this week that big data processing has largely been used to improve the cerebral functioning of a business, giving executives and managers a more effective tool for making decisions.

What Bain wants to build is a peripheral nervous system of sorts, a more responsive complement to today’s business intelligence-focused big data tools. Where traditional big data systems offer strategic insight, Bain says the new processing models can analyze and act on information on the fly, giving these systems the equivalent of instincts as well as insights.

“Picture a high-end department store, in which the salespeople all have tablets that are connected to a centralized system,” said Bain, describing a possible scenario in which in-memory processing could help improve a business. “Once you can identify that person, the system can automatically tell the salesperson to be sure the customer’s sizes are on the rack or the right brands are front and center. It all hinges on the ability to compute in an in-memory state.”
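To make that scenario concrete, here is a rough sketch of the kind of lookup such a system might perform when a customer is recognized. The profile store, customer IDs, and alert wording are all illustrative — a stand-in for a distributed in-memory cluster, not any vendor’s actual product.

```python
# A minimal, hypothetical sketch of the department-store scenario above.
# The profile store here is a single Python dict standing in for a
# distributed in-memory grid; field names and IDs are made up.
from dataclasses import dataclass

@dataclass
class CustomerProfile:
    name: str
    sizes: dict            # e.g. {"shoes": 9, "jackets": "M"}
    preferred_brands: list

# Profiles held in working memory, keyed by a loyalty-card or phone ID,
# so a lookup takes microseconds instead of a round trip to disk.
profiles = {
    "cust-1017": CustomerProfile("A. Shopper", {"shoes": 9, "jackets": "M"},
                                 ["Acme", "Northwind"]),
}

def on_customer_identified(customer_id: str) -> str:
    """Build the alert a salesperson's tablet would receive."""
    profile = profiles.get(customer_id)
    if profile is None:
        return "New customer: no profile on file."
    return (f"{profile.name} just walked in: put sizes {profile.sizes} on the rack "
            f"and move {', '.join(profile.preferred_brands)} up front.")

print(on_customer_identified("cust-1017"))
```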

Here’s how it works. Much as we might keep our summer clothes in a closet during the winter, computers tend to store data that isn’t being used at the moment on disk drives. If a company wanted to run an analysis on last week’s sales data, for instance, software like Hadoop MapReduce would call up the needed data from those drives, pulling the information out of storage and into working memory.

That’s where in-memory databases come in. Instead of calling the data up from a separate disk, the system spreads it across the working memory of multiple computers, allowing them to process the data in an instant rather than in minutes or hours.
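A toy sketch helps show the difference. The “nodes” below are just Python dictionaries standing in for the memory of separate machines, and the partitioning scheme is illustrative rather than any vendor’s design, but the idea is the same: the data lives in memory, spread across machines, so a query never has to wait on a disk.

```python
# Toy illustration only: records are partitioned across the working memory
# of several "nodes" (plain dicts here), so a query reads one node's
# in-memory slice instead of pulling everything off disk first.
from collections import defaultdict

NUM_NODES = 4
nodes = [defaultdict(list) for _ in range(NUM_NODES)]

def node_for(key: str) -> int:
    """Pick which node's memory holds records for a given key."""
    return hash(key) % NUM_NODES

def load(sale: dict) -> None:
    """Place a sales record in the memory of its home node."""
    nodes[node_for(sale["store"])][sale["store"]].append(sale)

def total_sales(store: str) -> float:
    """Answer a query from a single node's in-memory slice."""
    return sum(s["amount"] for s in nodes[node_for(store)][store])

for record in [{"store": "downtown", "amount": 40.0},
               {"store": "downtown", "amount": 25.5},
               {"store": "uptown", "amount": 10.0}]:
    load(record)

print(total_sales("downtown"))  # 65.5, computed without touching a disk
```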

It’s big data, but faster. And companies like SAP, Oracle and a host of smaller vendors like ScaleOut are battling over the emerging market. When Larry Ellison, chief executive at Oracle, introduced the company’s in-memory product during an industry event last September, the billionaire told the crowd that the system provided “ungodly speeds,” saying that it was 100 times faster than its predecessor.

“Time has become an interesting vector of competition in computing,” ScaleOut Software’s COO David Brinker told me earlier this week. “There’s competitive pressure to drive companies to do something faster and faster, but there’s also a set of enabling technologies that have made it feasible to do that.”

Part of what’s driving that shift is that the systems have simply become more affordable. Over the past decade, the costs of both RAM, which provides the working memory, and the commodity hardware it runs on have plummeted, making in-memory storage a more viable option for businesses.

While decreasing costs have made the technology more accessible, it’s rising demand that has created a growing market for these capabilities. Mobility, in large part, has transformed the web from a service we visit to complete a given task into an extension of ourselves, a prosthetic that we expect to stay in step with our daily lives.

As we create more data about ourselves — whether it’s what we buy, where we go in the real world, or what we say on Facebook — there’s a need for systems that can reconcile multiple streams of data in an instant. In a local marketplace that is deeply fragmented, interoperability, or the ability to work across systems, is perhaps the defining challenge for a given technology.

In retail, for instance, an in-memory system could help reconcile offline and online channels, making sure that purchases made online are immediately reflected in the in-store system. In one example, Bain says the company used the technology to help an unnamed retailer of perishable goods — say, flowers — merge and reconcile its online and in-store orders in real time, ensuring there was enough of each product left to fulfill every order.
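A bare-bones sketch of that reconciliation logic might look like the following. The flower shop, SKU, and channels are hypothetical, and a real deployment would replicate this state across a cluster rather than hold it in one process, but the principle holds: both channels draw down a single live inventory count, so neither can promise stock the other has already sold.

```python
# Hypothetical sketch: online and in-store orders reserve stock from one
# shared in-memory count, so oversells are caught the moment they happen.
import threading

inventory = {"roses-dozen": 12}
lock = threading.Lock()

def reserve(sku: str, qty: int, channel: str) -> bool:
    """Attempt to reserve stock for an online or in-store order."""
    with lock:  # one channel at a time touches the live count
        if inventory.get(sku, 0) >= qty:
            inventory[sku] -= qty
            print(f"{channel}: reserved {qty} x {sku}, {inventory[sku]} left")
            return True
        print(f"{channel}: declined, only {inventory.get(sku, 0)} x {sku} left")
        return False

reserve("roses-dozen", 8, "online")    # succeeds, 4 remain
reserve("roses-dozen", 8, "in-store")  # declined: not enough left
```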

Real-time computing could also open the door to local marketing automation. As consumer-facing services like Yelp tie into the back-end systems of local businesses, there’s a unique opportunity to programmatically connect supply and demand. These technologies, running in the cloud, could effectively replicate the role of an agency, helping small businesses, which have largely lacked such an intermediary, begin using some of the more advanced targeting capabilities that large brands have used for years.

Steven Jacobs is Street Fight’s deputy editor.
