In-Memory Technology Speeds Up Data Analytics
Another issue: The speed of in-memory technology places heavier demands on processors. As a consequence, organizations must parallelize the code that will access the data and deploy load balancing across the cluster, Lindquist says. "Load balancing becomes a critical piece of your ability to take advantage of the in-memory database."
AdJuggler has created a pull-based load balancing system using commodity hardware and software developed in-house. Each instance of AdJuggler's transaction processing engine pulls work from the load balancing component, completes the task and then returns for more work, Lindquist says. The system brings up more instances if additional capacity is needed.
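AdJuggler's software is proprietary, but the pull model itself is straightforward. The sketch below, written in Python with hypothetical task and handler names, shows workers that pull jobs from a shared queue, finish them and come back for more; a supervisor could add workers when the queue backs up.

```python
import queue
import threading
import time

def process_transaction(job):
    """Placeholder for the real transaction-processing work."""
    time.sleep(0.01)  # simulate work
    return f"processed {job}"

def worker(jobs: queue.Queue, results: list, stop: threading.Event):
    """Pull-based worker: fetch a job, complete it, then ask for more."""
    while not stop.is_set():
        try:
            job = jobs.get(timeout=0.1)  # pull work instead of having it pushed
        except queue.Empty:
            continue
        results.append(process_transaction(job))
        jobs.task_done()

jobs: queue.Queue = queue.Queue()
results: list = []
stop = threading.Event()

# Start a small pool; more workers could be started if jobs.qsize() keeps growing.
workers = [threading.Thread(target=worker, args=(jobs, results, stop), daemon=True)
           for _ in range(4)]
for w in workers:
    w.start()

for i in range(100):
    jobs.put(f"ad-request-{i}")

jobs.join()   # wait until every queued job has been processed
stop.set()
print(len(results), "jobs completed")
```

Because each worker asks for work only when it is idle, load spreads itself across the pool without a central scheduler deciding where each job goes.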
Organizations with in-memory products must also take care when it comes to database indexes. Businesses using a traditional database can afford to devote a large amount of disk space to indexes. But in-memory databases call for greater precision.
"If you're using the in-memory store like a database-with searches-you have to index for performance," Lindquist says. "You have to be more precise with it, because RAM is more expensive and limited."
The volatile nature of RAM presents another issue for in-memory adopters. Should a system fail, the data must be reloaded. This can prove time-consuming.
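Products handle that reload differently, through snapshots, write-ahead logs or replication, but the basic trade-off can be sketched in a few lines of Python: periodically persisting the in-memory store to disk limits how much is lost in a crash, while the reload time at startup still grows with the size of the data set. The file name and data layout below are illustrative.

```python
import json
import os
import time

SNAPSHOT_FILE = "store_snapshot.json"  # illustrative path

def save_snapshot(store: dict) -> None:
    """Write the in-memory store to disk so it survives a crash."""
    tmp = SNAPSHOT_FILE + ".tmp"
    with open(tmp, "w") as f:
        json.dump(store, f)
    os.replace(tmp, SNAPSHOT_FILE)  # atomic swap avoids half-written snapshots

def load_snapshot() -> dict:
    """Rebuild the in-memory store after a restart; cost grows with data size."""
    if not os.path.exists(SNAPSHOT_FILE):
        return {}
    start = time.perf_counter()
    with open(SNAPSHOT_FILE) as f:
        store = json.load(f)
    print(f"reloaded {len(store)} keys in {time.perf_counter() - start:.2f}s")
    return store

store = load_snapshot()
store.update({f"key-{i}": i for i in range(100_000)})
save_snapshot(store)
```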