The evolution of in-memory computing technology

Although it might appear to be an emerging technology because of all the recent hype about big data, in-memory computing has already been in use by large organizations for several years. For example, financial institutions have been using it for credit card fraud detection and algorithmic trading, and Google has been using it to support searches across huge quantities of data.

The need for in-memory technology is growing rapidly due to the explosion in the sheer quantity of data being collected, the addition of unstructured data such as pictures, video, and sound, and the abundance of metadata such as descriptions and keywords. In addition, vendors are pushing predictive analytics as an important competitive advantage, and delivering it effectively makes in-memory technology a must.

The reduced cost of memory (RAM) hardware means that smaller organizations, with annual revenues as low as one million dollars, now also have access to in-memory technology and are getting into the game. The pace of adoption will continue to speed up as packaged software vendors incorporate in-memory computing into industry-leading solutions.

In-Memory Computing in the Enterprise Software Market

SAP took an all-or-nothing approach, deciding to embed in-memory computing across their entire ERP line with their SAP HANA solution. Being first to market among their competitors with an in-memory computing product, SAP took on the role of market educator. They have aggressively marketed HANA as a differentiator and also benefit from the fact that the overlapping data layer helps prevent modules of their solution from being replaced by other leading industry players such as Oracle, Salesforce, and Microsoft. SAP bet on the idea that customer upgrades to HANA would not be much more costly or complex than other major SAP upgrades.

Other database vendors – Oracle, IBM, and Microsoft – are adding in-memory features to conventional databases one module at a time. Although this approach is less disruptive, quicker, and less expensive to implement, it can create bottlenecks, since high-speed processing is limited to a single function and the full benefits can't be felt across all parts of the application.

Enterprises still have many options when it comes to implementing in-memory technology. In addition to the traditional database vendors now offering it, there are in-memory-first vendors such as GigaSpaces, which has been providing in-memory functionality for several years. An application-agnostic vendor like GigaSpaces also offers the advantage of allowing multiple vendors' data to be incorporated into a single data grid. Still other options that enterprises can consider are integration solutions that embed in-memory computing technology, with a focus on scenarios that combine data from multiple systems.
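
To make the idea of a single, application-agnostic data grid a little more concrete, here is a minimal sketch in Python. It is purely illustrative and not tied to any vendor's API; the DataGrid class and the load_from_crm and load_from_erp helpers are hypothetical stand-ins for whatever a real product would provide.

```python
# Illustrative sketch only: a plain in-memory dictionary stands in for a
# vendor data grid; the source-system loaders are hypothetical placeholders.

class DataGrid:
    """A toy in-memory store keyed by (source_system, record_id)."""

    def __init__(self):
        self._store = {}

    def put(self, source, record_id, record):
        self._store[(source, record_id)] = record

    def query(self, predicate):
        """Return all records matching an arbitrary predicate function."""
        return [r for r in self._store.values() if predicate(r)]


def load_from_crm():
    # Hypothetical extract from a CRM system.
    return [{"id": 1, "customer": "Acme", "open_orders": 3}]


def load_from_erp():
    # Hypothetical extract from an ERP system.
    return [{"id": 1, "customer": "Acme", "credit_limit": 50_000}]


grid = DataGrid()
for rec in load_from_crm():
    grid.put("crm", rec["id"], rec)
for rec in load_from_erp():
    grid.put("erp", rec["id"], rec)

# Data originating in different vendors' systems can now be queried together,
# entirely in memory.
acme_records = grid.query(lambda r: r["customer"] == "Acme")
print(acme_records)
```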

Implementation Strategies

In general, CIOs shouldn't limit their choice of in-memory technology suppliers to their incumbent vendors, but should instead pick a solution based on their organization's specific objectives and priorities. CIOs should look at the scenarios they want to enable, for example identifying potential fraud for insurance companies or predicting crime for law enforcement, and then determine the most cost-effective in-memory technology that will let them achieve those goals.

Once they have decided which data they want to keep in memory, they should do an ROI analysis based on the full cost of the solution, including consultancy, software licensing, the amount of work required to modify the applications, and how efficiently the solution uses the hardware.
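
As a rough illustration of what such an analysis might look like, the sketch below totals the main cost components named above and compares them against an estimated annual benefit to produce a simple payback period. Every figure is invented for the example.

```python
# Back-of-the-envelope ROI sketch; all figures are invented for illustration.

costs = {
    "consultancy": 120_000,         # implementation services
    "software_licenses": 200_000,   # in-memory platform licensing
    "application_changes": 80_000,  # work required to modify applications
    "hardware": 60_000,             # additional RAM / servers
}

total_cost = sum(costs.values())

# Estimated annual benefit, e.g. from faster fraud detection or better offers.
annual_benefit = 180_000

payback_years = total_cost / annual_benefit
three_year_roi = (annual_benefit * 3 - total_cost) / total_cost

print(f"Total cost: ${total_cost:,}")
print(f"Payback period: {payback_years:.1f} years")
print(f"3-year ROI: {three_year_roi:.0%}")
```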

In some cases it may be wiser to use in-memory technology only for certain parts of applications. For example, retailers might see the value in using in-memory computing to call up previous purchases and customer profiles so they can present targeted offers to customers while they shop, but decide to store employee work hours using more traditional methods, since that data is less time-sensitive.
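
One common way to apply in-memory technology selectively is to keep only the latency-sensitive data in memory and leave the rest in a conventional database. The sketch below illustrates that split for the retail example; the in-memory cache is just a Python dictionary, and the database helpers are hypothetical placeholders rather than any particular product's API.

```python
# Illustrative split between "hot" in-memory data and "cold" database data.
# The database helpers below are hypothetical placeholders.

profile_cache = {}  # in-memory store for latency-sensitive customer data


def fetch_profile_from_db(customer_id):
    # Placeholder for a conventional database lookup.
    return {"customer_id": customer_id, "recent_purchases": ["shoes", "socks"]}


def get_customer_profile(customer_id):
    """Serve targeted offers from memory; fall back to the database on a miss."""
    profile = profile_cache.get(customer_id)
    if profile is None:
        profile = fetch_profile_from_db(customer_id)
        profile_cache[customer_id] = profile
    return profile


def record_employee_hours(employee_id, hours):
    # Less time-sensitive data goes straight to traditional storage.
    print(f"Writing {hours}h for employee {employee_id} to the relational database")


offer_basis = get_customer_profile(42)   # fast path once the profile is cached
record_employee_hours(7, 8)              # no need for in-memory treatment
```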
