



Why In-Memory Computing Is a Big Deal for Business Analytics



In-memory computing is being hailed as a revolutionary advance, and with good reason. In-memory computing dramatically improves the performance of analytical applications and opens the doors to significant computing and business innovations.

Deloitte calls in-memory computing the “holy grail of analytics” and “the future of computing,” while a report in Forbes describes how in-memory computing “changes everything.”

Why all the hoopla? In-memory computing overcomes the data-crunching obstacles that have impeded the performance of business intelligence (BI) and corporate performance management (CPM) systems. These systems have buckled under the weight of the massive data volumes required for accurate business forecasting and planning in today’s business world.

In-Memory Computing Changes the Game

How does in-memory computing work? Traditionally, data has been stored on spinning disks, which impose a lag-time penalty every time a computing system accesses it. In-memory computing instead holds data in a computer’s random access memory (RAM), eliminating that penalty. In-memory analytical systems also typically store data in a column-based architecture rather than the traditional row-based one, which further speeds up processing.
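To make the storage-layout point concrete, here is a minimal Python sketch of the idea. It is not deFacto or Microsoft code, and the table size, column names, and timing approach are illustrative assumptions only; it simply contrasts aggregating over row-oriented records with scanning a single column that is already held in memory.

import random
import time

NUM_ROWS = 1_000_000  # illustrative size, not a real workload

# Row-oriented layout: each record is a small dict, the way a traditional row store holds it.
rows = [{"region": random.randint(0, 9), "revenue": random.random()}
        for _ in range(NUM_ROWS)]

# Column-oriented layout: one contiguous in-memory list per column.
revenue_column = [r["revenue"] for r in rows]

def total_revenue_from_rows():
    # Walks every record object and performs a key lookup per row.
    return sum(r["revenue"] for r in rows)

def total_revenue_from_column():
    # Scans one flat, in-memory column and nothing else.
    return sum(revenue_column)

start = time.perf_counter()
total_revenue_from_rows()
row_seconds = time.perf_counter() - start

start = time.perf_counter()
total_revenue_from_column()
column_seconds = time.perf_counter() - start

print(f"Row-oriented scan:    {row_seconds:.3f} s")
print(f"Column-oriented scan: {column_seconds:.3f} s")

On typical hardware the column scan completes several times faster; production in-memory engines exploit the same locality advantage at far larger scale.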

In-memory computing handles large volumes of data far more rapidly than the previous generation of BI and CPM systems could, which enables more powerful, real-time analytical operations. This means faster and more reliable planning, forecasting, and decision making.

As IBM BI expert Nancy Hensley notes, because in-memory computing speeds up the analytical process, it has become “the new go-to technology for analytics.”

Also bullish on in-memory computing is Gartner, which reports that in-memory computing “promises to open a floodgate of innovations” and “enable new, previously unthinkable breakthrough applications for competitive advantage.” Examples, says Gartner, include dynamic pricing, large-scale e-commerce, real-time supply chain planning, and real-time profitability analysis.

deFacto’s Secret Sauce

In-memory computing is one of the technological advances that enable the deFacto platform to perform at levels that were previously unattainable. deFacto is further distinguished by a unified BI, CPM, and analytics product architecture focused on financial analysis and planning, and by the extent to which it employs the Microsoft analytical stack. This allows CFOs and their organizations to benefit from all the power and capabilities that traditional line-of-business analytic applications have long enjoyed.

Altogether, these technical and architectural improvements enable the deFacto system to create a more comprehensive and accurate model of a business, which allows companies to perform analyses and planning at a level not previously possible. As many deFacto customer case histories show, companies have switched to deFacto because of the performance problems they experienced with older platforms, and to gain the speed, scalability, and planning advantages deFacto provides.

The benefits of in-memory computing will only grow. As Don Basile notes in Forbes, the metrics around in-memory computing will continue to improve, allowing analytical systems to handle vast quantities of data and support faster, more accurate decision making.

To learn more about the technical innovations that drive the deFacto platform, read the whitepaper “The New World of Financial Analytics.”

By Michael Neubarth, deFacto Global
