November 2012
The high-performance computing market is expected to reach $220 billion by 2020, according to a study by Market Research Media, and in-memory computing is the fastest growing segment of that market. In-memory computing is an architecture in which data is kept in a computer's main memory rather than on disk. By holding the detailed data in main memory, this model speeds up data crunching and meets diverse information and analytics requirements faster. In-memory computing is not new, but it has evolved over the years into relatively inexpensive models that make the technology viable for mainstream adoption. With Big Data projected as the next big thing, this change couldn't have come at a better time.
Disk-based databases, with their I/O (Input/Output) bottlenecks, are major performance killers when solving problems involving massive amounts of data. Organizations have been able to reduce the delays by using distributed caching systems, but caching is still nowhere near good enough to mine Big Data in real time. This is where in-memory computing can make a difference. With in-memory databases, the I/O bottleneck is eliminated or moved to DRAM: the database resides in RAM, within the application's address space, so the application reads and writes data directly instead of going through the disk. This in turn enables throughput of hundreds of thousands of transactions per second with sub-second response times.
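To make the contrast concrete, here is a minimal sketch using SQLite, which supports both an on-disk file and a purely in-memory database. The table name, row count, and file name are illustrative assumptions, not part of any particular product; the point is simply that the in-memory path never touches the file system.

import os
import sqlite3
import time

def time_inserts(conn, n=100_000):
    # Create a throwaway table and time n inserts.
    cur = conn.cursor()
    cur.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
    start = time.perf_counter()
    cur.executemany("INSERT INTO t (val) VALUES (?)",
                    (("row-%d" % i,) for i in range(n)))
    conn.commit()
    return time.perf_counter() - start

# In-memory database: all pages live in RAM; no disk I/O on the data path.
mem_elapsed = time_inserts(sqlite3.connect(":memory:"))

# On-disk database: commits must reach the file system.
if os.path.exists("ondisk_demo.db"):
    os.remove("ondisk_demo.db")  # start from a clean file each run
disk_elapsed = time_inserts(sqlite3.connect("ondisk_demo.db"))

print(f"in-memory: {mem_elapsed:.3f}s  on-disk: {disk_elapsed:.3f}s")

The exact speedup depends on hardware and configuration, but the in-memory run consistently avoids the disk round trips that dominate the on-disk timing.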
The challenge for industries is to use in-memory computing to get the best results from Big Data. Risk and Asset Management firms are already using the technology to understand the range and probabilities of possible investment outcomes by building complex simulations; because the calculations run much faster in memory, the simulations can inform decisions as they are made rather than after the fact. Similar methods are used by the Oil and Gas industry to crunch exploration data to find and manage oil fields. Insurance companies examine the potential costs of natural calamities to make sure that premiums are set at levels that cover the risk. What in-memory computing has done is help these industries leverage Big Data to arrive at the best possible solutions, much faster.
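The investment-outcome simulations mentioned above are typically Monte Carlo runs: draw many random market scenarios and look at the distribution of results. The sketch below is a generic illustration with made-up parameters (a normal annual-return model, a 10-year horizon), not the method of any particular firm; the trial count is exactly the kind of workload that in-memory systems make fast enough to run interactively.

import random
import statistics

def simulate_portfolio(initial=1_000_000, years=10,
                       mean_return=0.07, volatility=0.15,
                       trials=100_000):
    # Draw annual returns from a normal distribution and compound them.
    outcomes = []
    for _ in range(trials):
        value = initial
        for _ in range(years):
            value *= 1 + random.gauss(mean_return, volatility)
        outcomes.append(value)
    return outcomes

outcomes = sorted(simulate_portfolio())
print(f"median outcome : {statistics.median(outcomes):,.0f}")
print(f"5th percentile : {outcomes[len(outcomes) // 20]:,.0f}")
print(f"P(loss)        : {sum(v < 1_000_000 for v in outcomes) / len(outcomes):.1%}")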
Going forward, it's quite clear that organizations or departments using in-memory computing to harness Big Data will have a big advantage over those that don't. With in-memory computing, there is no need for pre-computed aggregates: calculations can be made on the fly, and problems for which solutions did not exist earlier can now be solved. It also means less expenditure on the infrastructure needed to run complex analytics systems.
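"No need for aggregates" means that summaries which would traditionally be maintained as pre-computed rollup tables can instead be computed at query time over the raw records in RAM. A minimal sketch, with an invented transactions dataset:

from collections import defaultdict

# Illustrative in-memory dataset: (region, product, revenue) records.
transactions = [
    ("EMEA", "licenses", 120.0),
    ("EMEA", "services", 80.0),
    ("APAC", "licenses", 200.0),
    ("APAC", "services", 50.0),
]

def aggregate(records, key_index):
    # Group-by sum computed on the fly; no rollup table is maintained.
    totals = defaultdict(float)
    for record in records:
        totals[record[key_index]] += record[-1]
    return dict(totals)

print(aggregate(transactions, 0))  # totals by region
print(aggregate(transactions, 1))  # totals by product

Because nothing is pre-computed, any new grouping or filter is just another pass over the in-memory records rather than a change to the data pipeline.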
In-memory computing aids companies in their goal of organizing Big Data into a fabric that can be searched, browsed, analyzed and visualized in real time, something that looked improbable only a short while ago.
Wipro Insights set up the Council for Industry Research, comprising domain and technology experts from the organization, to address the needs of customers. The Council specifically looks at innovative strategies that will help customers gain competitive advantage in the market and, in collaboration with leading academic institutions and industry bodies, studies market trends to equip organizations with insights that inform their IT and business strategies. http://www.wipro.com/insights/
Email us at: wipro.insights@wipro.com