Oct 14, 2020 | 5 min | Technology

Why We Redefined Data Management

IT infrastructure has evolved rapidly: virtualization, hyperconvergence on commodity hardware, and the embrace of public cloud and SaaS. Data management, however, has not kept pace. The daily task of protecting, storing, identifying and provisioning an organization’s data – arguably its most valuable business asset – remains, for many, a ridiculously complex, inefficient and expensive chore.

IT teams are still struggling to meet basic SLAs for data availability and access, let alone ensure regulatory compliance, optimize storage costs, prevent disruption from ransomware and other cyber attacks, or mine the data for useful insights. That makes it much more than an IT issue – it’s a serious inhibitor to digital business success. So what happened?

Innovation Stalled

From an industry point of view, very little. For years, data management vendors perpetuated a siloed, fragmented model that treated each function (such as backup, analytics, file sharing, disaster recovery and dev/test provisioning) as a separate discipline with its own proprietary infrastructure and operational knowledge. This highly inefficient model even spawned ‘silos within silos’: for example, separate backup solutions optimized for specific workloads such as virtual, physical, databases and containers. And none of these silos share a common view of the underlying data—including vast volumes of dark data. Managing these silos over time has become nearly impossible. 

But what about the cloud? Surely that made things easier? While it’s true that a newer generation of cloud-centric solutions offered the promise of a simpler, hardware-less alternative, the silos didn’t go away. Many of these solutions were optimized for a single function such as Backup as a Service (BaaS) rather than the full range of data management use cases – meaning IT had to once again assemble a collection of ‘service silos’, in much the same way as they had on-premises. Another consideration with ‘cloud-centricity’ is the degree of native support for traditional on-premises environments, since not all data can be moved to the cloud. 

Breaking the Stalemate

So, the state of the art after 25 years is this: IT is forced to piece together solutions from parts that weren’t designed to work together, the business is unable to harness the value of its most important digital asset, and the risks of a damaging compliance or security breach go undetected – all accompanied by massive inefficiency, duplication and cost. We even coined a term for the chaos: mass data fragmentation.

That is why we decided that data management needed to be not just improved but completely redefined: from a tolerated lowest common denominator to a valued strategic asset, and from a loose collection of parts to a fully integrated, consolidated hybrid solution – one that brings a step-function improvement in simplicity, efficiency, cost and value, and adds capabilities that were simply not possible before. That may sound like an ambitious goal, but the answer was clear: take the principles of modern software engineering that enabled the software-defined data center and hyperscalers like Amazon and Google to revolutionize the industry, and apply them to the unique challenges of enterprise data management for the first time.

Figure 1: Before Cohesity

Figure 2: After Cohesity

Redefining Data Management

Our founder and CEO, Dr. Mohit Aron, had strong credentials in that regard: he was the lead developer of the Google File System, as well as the co-founder and CTO of Nutanix, where he pioneered hyperconverged infrastructure (HCI). Realizing that incremental improvements weren’t going to cut it, he took a clean-sheet approach to the design, embracing principles such as:

  • Fully Software-Defined: Fundamental to the redesign was a software-defined abstraction of all data management functions previously conducted in physical infrastructure, capable of running on commodity hardware as well as in multiple public clouds. This extends the platform’s reach to anywhere across a hybrid landscape with a consistent set of services and controls, while dramatically simplifying infrastructure and operations.
  • Hyperconverged: Another key to eliminating silos is the ability to run widely varying workloads and use cases (such as backup, archiving, disaster recovery, file & object storage, dev/test provisioning and analytics) anywhere on the same software platform, rather than needing dedicated infrastructure optimized for each one. This achievement transforms the traditional ‘specialist’ view of data management functions that helped propagate mass data fragmentation in the past.
  • Web Scale: The ability of hyperscalers such as Amazon and Google to operate at almost infinite scale is in part due to a sophisticated distributed file system, designed to efficiently handle data-intensive operations with dynamically allocated resources. By adapting that principle to the unique needs of data management, the Cohesity platform can handle massive amounts of distributed business data as well as multiple workloads, apps and services, simultaneously and efficiently, at web scale.
  • Global Management: The continued growth and fragmentation of enterprise data makes management a business-critical function. Cohesity Helios, modeled after hyperscaler principles, delivers global management that provides a single actionable view into all distributed data, apps, and workloads no matter where they are located, together with the ability to control them simply via UI-based policies.
  • Hybrid Cloud Ready: Most analysts agree that the de facto deployment model for most enterprises will be hybrid. The Cohesity data platform provides the exact same functionality whether running natively in public clouds, data centers or edge locations, removing the barriers to seamless data and app mobility, and enabling flexible choice of location for any workload or data as needed with no penalty. This overcomes inconsistencies between environments and facilitates end-to-end SLAs.
  • Ability to Run Apps: Based on the principle of moving compute to the data rather than the data to the compute, this unique innovation allows Cohesity and third-party apps to run in the same environment as the managed data rather than in a separate system. This greatly improves the efficiency of extracting value from data, for example when running a trend analysis or checking compliance, and gives the application global purview across any part of the organization’s data rather than a curated subset.
  • Developer Orientation: Developers are an increasingly vital factor in business innovation, so it was important to ensure integration with agile/DevOps practices, enable secure self-service access to production dev/test data, and offer a fully programmable, API-first design so that powerful platform features can be leveraged in new apps and services (see the sketch after this list).
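To make the API-first principle concrete, here is a minimal sketch of what policy-driven automation might look like against a REST-style data management API. The endpoint paths, payload fields, and token handling below are illustrative assumptions for this post, not the documented Cohesity API; the point is that a protection SLA can be defined once, in code, and applied anywhere.

```python
# A minimal sketch of policy-driven automation against a REST-style data
# management API. Endpoints, fields, and auth are hypothetical placeholders,
# not the documented Cohesity API.
import requests

BASE_URL = "https://cluster.example.com/api"  # hypothetical management endpoint
HEADERS = {
    "Authorization": "Bearer REPLACE_WITH_ACCESS_TOKEN",  # obtained out of band
    "Content-Type": "application/json",
}

# 1. Define a protection policy once, as data, instead of configuring each
#    backup silo by hand.
policy = {
    "name": "gold-tier",
    "backupSchedule": {"periodicity": "daily", "startTime": "01:00"},
    "retentionDays": 90,
    "replicationTarget": "aws-us-east-1",  # hypothetical hybrid-cloud target
}
resp = requests.post(f"{BASE_URL}/protection-policies", json=policy,
                     headers=HEADERS, timeout=30)
resp.raise_for_status()
policy_id = resp.json()["id"]

# 2. Attach the policy to a group of workloads; the platform enforces the
#    SLA wherever those workloads live (on-premises, cloud, or edge).
group = {"policyId": policy_id, "sources": ["vmware-prod-vms"]}
requests.post(f"{BASE_URL}/protection-groups", json=group,
              headers=HEADERS, timeout=30).raise_for_status()

print(f"Policy {policy_id} created and applied to vmware-prod-vms")
```

Expressing policies as data in this way is what lets the same SLA follow a workload across data center, cloud, and edge locations, rather than being re-implemented in each silo’s tooling.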

The Next Chapter is Coming — Soon!

Taken together, we believe these leading-edge technologies, applied to the age-old problems of data management, deliver the breakthrough the industry needed and the step-function improvement in business value that long-suffering customers deserve.

But we haven’t finished yet! In fact, we’re just getting started. Join us for an exclusive webcast to hear how we are continuing to radically simplify data management in the cloud era. We’ll also take a closer look at the future of data management in my next blog. Stay tuned!

For more information, visit www.cohesity.com

Written by

Vineet Abraham

Chief Product and Development Officer
