December 21, 2012

Ever wondered why even supercomputers take a long time to process huge amounts of data? After all, if they have some of the fastest processors available, shouldn't the job be done in a few seconds? What many people don't realize is that a bottleneck causes this delay. All computing devices have a certain amount of RAM that temporarily holds the data the CPU performs computations on. When computers are asked to carry out extensive computations, they have to fetch data in batches from storage into memory, because the size of memory is very limited compared to the amount of storage available.

The delay in processing large datasets arises because the system has to transfer data from storage to memory, process it, and then commit it back to storage. Hence, even with a 128-core processor, processing a huge dataset on a system with just 1GB of RAM will take forever!
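A toy benchmark makes the gap concrete. The sketch below (illustrative only; the file and numbers are made up) sums one million integers twice: once by re-reading them from a file standing in for "storage", and once from a list already held in RAM.

```python
import os
import tempfile
import time

# One million integers: our stand-in "huge dataset".
values = list(range(1_000_000))

# Write the data to a temporary file to simulate storage on disk.
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as f:
    f.write("\n".join(map(str, values)))
    path = f.name

# Pass 1: fetch every value from the file, then sum.
start = time.perf_counter()
with open(path) as f:
    disk_total = sum(int(line) for line in f)
disk_time = time.perf_counter() - start

# Pass 2: the data is already in memory, so just sum it.
start = time.perf_counter()
ram_total = sum(values)
ram_time = time.perf_counter() - start

os.unlink(path)
print(f"disk: {disk_time:.4f}s  ram: {ram_time:.4f}s")
```

On a typical machine the in-memory pass is far quicker, because the disk pass pays for I/O and parsing on every read; the exact ratio depends on the hardware.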

However, imagine a system with enough memory to hold all of the data needed for processing. Why would it even need to touch the storage? All the data it needs can be fetched from memory, which is almost instantaneous. This is the core idea of in-memory computing: with this kind of implementation, there is no processing delay caused by slow interactions between memory and storage. With memory prices constantly falling, it is also becoming cost-effective to purchase solutions that rely entirely on RAM. However, in-memory computing is pointless without a useful business application.

Many applications in the future will be powered by IMDBs (in-memory databases), which primarily rely on memory for data storage. By optimizing data-access algorithms within the database software, the number of CPU cycles needed to read data is reduced dramatically. According to Gartner, an in-memory DBMS can work "10 to 20 times faster than traditional 'on disk' DBMS". Here are some applications that can use in-memory technologies to great effect:

1) Rapid financial transactions
In-memory computing is a dream for high-frequency trading firms, which rely on extremely quick asset trading for a living. For example, a firm may need to run its algorithms over the entire stock market to identify promising long or short positions. Traditionally, this takes longer because data must be fetched from disk on every iteration, but with in-memory systems the same task can be completed in a matter of seconds. This can also boost performance for e-commerce sites, such as Amazon, which rely on traditional systems to process customers' financial transactions online. With an in-memory database and algorithms in place, such a site can handle a far greater load than ever before with improved performance.
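As a small illustration of the idea, Python's built-in sqlite3 module can create a database that lives entirely in RAM by connecting to ":memory:". The table and prices below are purely hypothetical; a real trading system would use a dedicated IMDB, but the principle is the same: queries never touch the disk.

```python
import sqlite3

# ":memory:" keeps the whole database in RAM; nothing is written to disk.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE trades (symbol TEXT, price REAL)")
cur.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("AAPL", 512.00), ("GOOG", 707.50), ("AAPL", 515.25)],
)
# Aggregate query served entirely from memory.
cur.execute(
    "SELECT symbol, AVG(price) FROM trades GROUP BY symbol ORDER BY symbol"
)
rows = cur.fetchall()
print(rows)  # [('AAPL', 513.625), ('GOOG', 707.5)]
conn.close()
```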

2) Business analytics
In our previous issue, we discussed the benefits of analytics at length. In the future, in-memory systems will be the primary driver of high-performance analytics. To improve decision making, data has to be analyzed frequently. With traditional analytics, terabytes of data would take days to yield useful results and trends. With in-memory analytics, the same datasets can be processed in hours. This performance leap will bring a huge competitive advantage for businesses in industries such as e-commerce, as they will be able to make decisions based on "real-time" trends rather than on last week's data.

3) High security systems
Fraud detection requires advanced systems that can analyze terabytes of financial transactions and records within seconds. In such a scenario, delay can be a critical mistake, especially when large sums of money are involved. Banks can build customized in-memory anti-fraud systems that raise a red flag in a matter of seconds rather than hours. For time-critical tasks such as fraud detection, in-memory systems are a great boon.
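The shape of such a check can be sketched in a few lines. This is not any bank's actual system; the threshold, window rule, and account names are invented for illustration. The point is that per-account history lives in an in-memory structure, so each transaction is screened with no disk access at all.

```python
from collections import defaultdict

LIMIT = 10_000.0  # hypothetical single-transaction threshold

# Per-account transaction history, held entirely in memory.
history = defaultdict(list)

def check(account, amount):
    """Return True if the transaction should be flagged for review."""
    history[account].append(amount)
    if amount > LIMIT:
        return True  # unusually large single transaction
    # A burst of more than three transactions is also suspicious here.
    return len(history[account]) > 3

print(check("acct-1", 250.0))     # False: small, first transaction
print(check("acct-1", 12_000.0))  # True: over the limit
```

A production system would add time windows, richer features, and eviction of stale history, but because every lookup is a RAM operation, millions of transactions per second become feasible.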

These are just a few of the countless business scenarios that in-memory systems will aid. However, will smaller businesses ever face such scenarios, given that they don't usually deal with huge amounts of data? This is a germane question that can only be answered from an organization's perspective. In our next section, we speak to an industry expert to find out what considerations SMEs must weigh when deciding on this solution.
