In a blog published last year, we declared that legacy approaches to data management had become outdated, complex, inefficient, and unsafe, and that a whole new approach was needed to support digital business in the multicloud era.
Nothing we have seen since has changed our minds.
That’s quite an indictment of the legacy generation of data management tools, which IT teams still grapple with to perform routine functions including backup, file and object services, disaster recovery (DR), security, governance, dev/test, and analytics.
Cloud adoption has only increased the complexity: 90% of enterprises expect their cloud usage to exceed the plans they made before COVID-19, and 92% have a multicloud strategy.
More importantly, these tools are not just failing IT — they are also failing the business in terms of its online reputation, its operating efficiency, and its ability to use data as a strategic asset.
There has been almost no fundamental innovation from the legacy vendors in 25 years. From the outset, their approach has been to produce specialist tools, each optimized for a single function such as backup, file services, or DR, and deployed in silos that don’t interoperate.
And while a wave of newer entrants brought SaaS-based versions of these tools to market, they didn’t address the underlying mass data fragmentation problem; instead, they added dozens more single-purpose ‘service silos’ to the mix.
In general, the current generation of data management tools can be characterized as:
This pattern looks set to continue, as legacy vendors make only incremental improvements within their own silos rather than stepping back to reconsider the overall problem. As we have seen, this insular and archaic thinking has serious consequences, but it also opens up an enormous opportunity to change things for the better.
We believe a true solution can only be achieved with a fundamental redesign of the underlying architecture. Instead of an arcane collection of siloed components, next-gen data management should be a pervasive, invisible service, available wherever the data is located across the multicloud and to whoever is accessing it, helping to ensure the data is available, protected, visible, compliant, movable, and accessible to third-party apps. Next-gen data management should also play a key role in helping businesses improve their security postures — and all of this should take place with minimal management overhead for IT.
In the previous blog we outlined how our founder and CEO, Mohit Aron, adapted the architectural principles used by the largest hyperscalers to the unique demands of managing enterprise data. These breakthrough ideas, incorporated into the Cohesity Helios Multicloud Data Platform, tackle the central problem of fragmentation that has plagued the industry for so long, put IT back in control, and finally free the business to extract the value it needs.
Next-gen data management directly addresses the limitations listed earlier:
Simplicity at Scale
Zero Trust Security
Powered by AI
Third-Party Extensibility
We believe the legacy data management industry has failed to deliver the solution today’s digital businesses need, and that IT is in an increasingly untenable position given relentless data growth, flat or shrinking budgets, and pressure to deliver more business value.
Incremental improvements won’t cut it. Instead, IT needs a modern approach: a simpler, smarter, and more secure way to take back control of data across a dynamic multicloud landscape with minimal effort, and one that can also help unlock latent value for competitive advantage.
The good news is that such a solution is widely available today, and is already being used by thousands of organizations worldwide to save money, reduce risk, and become more competitive. We invite you to explore what next-gen data management can do for your business.