The cloud storage sector may be in for major changes in 2014, as enterprises and vendors adjust to rising data volumes, budgetary uncertainty and new regulations governing information retention. Overall, the amount of unstructured data will grow even as budgets for storage technology remain tight at many organizations. Accordingly, vendors may shift to simpler, more interoperable solutions that are both scalable and affordable.

In 2014, businesses will need storage setups that can preserve data for the long haul while also providing IT workers with rapid access and performance. Writing for eWeek, Chris Preimesberger compiled forecasts from the Active Archive Alliance and Basho, a Seagate Cloud Builder Alliance partner, and both organizations predicted that enterprises would come under additional pressure to provide constant availability of Web services while also adhering to sound archiving strategies.

Basho, for example, stated that companies that manage mobile games would need to ensure that real-time multiplayer functionality would work anywhere in the world, despite server outages in specific regions. On top of that, the rise of the Internet of Things will create many new connections, services and devices for companies to manage.

Maintaining legacy systems to handle new data demands could be costly
Even as cloud infrastructure becomes more complex, however, spending more on proprietary solutions and/or upgrading legacy systems may not be an option for some IT departments. Finding smart, open solutions will be key to taming the data deluge in a cost-effective way.

“[O]perations teams will need to optimize to handle exponential growth in data and traditional systems simply won’t fit into these budgets,” explained Basho. “To optimize the operations’ budget they will value operational simplicity and cost-effective scaling above all else. The same budget spent on licensing software at tens of thousands per processor core will be leveraged to achieve massive scale by adopting new technologies.”

More specifically, Basho predicted that "commodity storage and scalable software technologies" would be the key to scaling data projects while staying within budget, especially in the context of the massive amounts of information generated by IoT appliances. The stakes for coming up with an effective storage strategy are high – IDC forecast that enterprises would need 80 exabytes of storage in 2014, and that 90 percent of it would be unstructured data.

Still, many organizations have stuck with patchwork storage solutions that worked in the past but may buckle under pressure from the scope of new data projects. Writing for The Information Daily, Basho EMEA marketing manager Jeremy Hill explained that this issue was particularly problematic in the public sector, examining the U.K. government’s struggles to modernize IT operations.

“With the rapid expansion in volumes of data being read and written, government departments require a scalable solution to process fluctuating amounts of information,” argued Hill. “The costs involved in making a proprietary system totally and continually responsive to such developments in data would be massive.”

Rather than shell out more to upgrade these old systems, agencies have been pushed to invest in cost-effective cloud solutions. Hill noted that British government investment in public cloud technologies exceeded $98 million in 2013, providing momentum to the open source projects pursued by many small and midsize businesses.

Open source solutions provide both operational and financial improvement over legacy systems. Since platforms such as OpenStack are driven by communities of developers and vendors, they evolve rapidly to meet new requirements. At the same time, the combination of open source software and commodity hardware means that businesses of all sizes can save money and simplify procurement processes.
