Rosetta Stone embraces storage virtualization in DevOps shift
As the team cloned and backed up databases for test-development environments, it made duplicate copies of the same files and database records. Generating these copies, known as "copy data," consumed significant storage capacity and CPU cycles, driving up computing costs. Beyond the financial impact, data operations such as refreshes and rollbacks could take days to complete.
Even so, creating backups that enable disaster recovery is a necessary evil of testing new software, the price of running an agile shop. "When it gets to testing, you've got to have data on the back end," Moseley says. He began searching for a solution.
Rosetta Stone's duplicate-data dilemma is not uncommon among companies trying to build, test and refine software faster. IDC estimates that the average organization holds as many as 375 data copies, including backups of backups and multiple versions of files. The research firm says copy data accounts for as much as 60 percent of what is stored in data centers, costing companies as much as $44 billion to manage worldwide.
To reduce duplicate application data while keeping it protected, Moseley settled on "copy data virtualization" software from Actifio. When programmers deploy a quality assurance environment, Actifio lets them take a snapshot of one of the environment's databases in a couple of minutes. The software enables Rosetta Stone to create a single "golden master" of production data and track changes against that master copy so that programmers can rapidly recover large data sets.
Since adopting the software, Rosetta Stone has cut the cost of creating and storing backups, which previously required the IT team to keep adding disks to storage arrays from EMC, HP and other vendors.
Moseley says he can't currently quantify the storage savings. However, when the data copies that originate from the golden image are virtual rather than physical, copy-data storage can be reduced by 80 to 90 percent, Gartner analyst Pushan Rinnen wrote in a July 12 research note.
In Rosetta Stone's case, the flexibility gained in backup and recovery operations outweighs any cost savings as the company continues its DevOps journey. "Being able to create those test-dev environments is near the top of the list of the things that we've got to be able to accomplish to keep our development teams fast and able to test what they've got going on before it hits production," Moseley tells CIO.com. "[Actifio] gets us out of the backup and recovery [cycle]."
Agility on the front end requires agility on the back end, which led Rosetta Stone to overhaul its data center operations. The company is switching from content delivery network (CDN) market leader Akamai to GlobalDots, a CDN aggregator that, Moseley says, should deliver better performance for its digital downloads, which consumers worldwide cache locally on their laptops, tablets and smartphones.
Rosetta Stone has also begun running its VMware virtual machines in a data center operated by AT&T, which provides Internet, power and cooling services. And the company runs some of its VMs in Amazon Web Services (AWS). In the near future, Moseley hopes to move more compute capabilities into the cloud, which could be AWS, Microsoft, Google or some other provider.
"We're in the process of evaluating cloud vendors to find who we want to stick with long term," Moseley says.
Actifio will play a key part in Rosetta Stone's cloud strategy.
Ideally, Moseley says, Actifio's technology will enable Rosetta Stone to fail over and fail back between its colocation systems and its cloud provider. That flexibility will let the company solidify its disaster recovery strategy while accommodating traffic spikes. "It gave me the ability to move functions I need out of the data center and into a public cloud where it makes sense," Moseley says.