Today, organizations are challenged by massive growth in the volume of enterprise data. More data is being generated, processed, and stored than at any other time in history, and volumes will only continue to increase.
International Data Corporation (IDC) projects the “global data sphere” to grow from 33 zettabytes (ZB) in 2018 to a staggering 175ZB by 2025, a compound annual growth rate (CAGR) of 61%.
Looking more closely, analysts at Gartner estimate that 80% of enterprise data today is “unstructured”. This matters because much of the new growth is in unstructured data, which includes email, video, voice recordings, media files, life science and healthcare data, social media, and Internet of Things (IoT) and sensor data, among others. It also matters because unstructured data is an ideal fit for object storage, so rapidly growing volumes of unstructured data are creating a corresponding increase in the need for object storage.
With this data growth putting pressure on organizations in every industry, technology leaders face a tough challenge: they must find a cost-effective way to store and manage growing amounts of data without sacrificing performance, security, or service delivery.
As a result, many have turned to the public cloud. The flexible nature of cloud computing, coupled with consumption-based pricing, has made the public cloud a tempting option. But with time and experience – and enhancements to on-premises object storage solutions – many organizations are evaluating the public cloud in a new light. Some are turning to hybrid cloud approaches that employ both on-premises and cloud infrastructure, while others are repatriating applications and data to on-premises environments. According to a survey of IT managers by 451 Research, 58% of respondents said they are moving to a hybrid IT strategy.