Enterprises today face a constant requirement for secure, efficient access to information; meeting it is how they remain relevant and maintain a competitive edge. The reality is that every business-critical function of an organization depends on the use of, and access to, information.
However, depending on the organization and its business, it has become almost unrealistic from a cost perspective (both CapEx and OpEx) to retain all of the data a company generates and requires locally.
Demand for the use of and access to data is often balanced against cost, weighing characteristics such as usability, compliance, and availability. As a result of this balancing act, organizations make compromises, relocating valuable data assets to cost-effective locations, infrastructures, and public cloud platforms.
While the approach described above may temporarily solve some of the challenges related to cost, a different set of problems presents itself: data portability risks (physical or electronic), compromised data integrity and security, loss of data visibility, and increased RPOs/RTOs, just to name a few.
Information is at the core of any business's prosperity, and there should be very little compromise when it comes to satisfying business requirements for data access and visibility. There are also impending data-compliance regulations and laws, such as the GDPR (General Data Protection Regulation), that carry hefty financial penalties. Issues arise when a lack of visibility results in an organization's failure to manage data according to those regulations.
Moving or transporting data to off-site locations such as distant mountains may expose an organization's data to the risks I mentioned above. The same applies to the use of legacy gateways and software solutions for pushing data to the cloud.
It furthermore becomes challenging to use your data when you lose visibility into it. For example, how efficient can a process be when it doesn't know the type of data, the point in time it was captured, or the different data sets that may be available for processing?
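To make that concrete, here is a minimal sketch of the kind of metadata catalog such a process needs. Everything in it (the `CatalogEntry` record, the `find` helper, the field names) is hypothetical and for illustration only, not any vendor's actual data model:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CatalogEntry:
    """Hypothetical metadata record for one archived object."""
    path: str                # logical name of the object
    data_type: str           # e.g. "file", "database", "vm"
    snapshot_time: datetime  # point in time the copy represents
    location: str            # where the copy lives, e.g. "s3://archive/..."

def find(catalog, data_type, as_of):
    """Return the newest entry of the given type at or before `as_of`,
    or None if no such copy exists."""
    candidates = [e for e in catalog
                  if e.data_type == data_type and e.snapshot_time <= as_of]
    return max(candidates, key=lambda e: e.snapshot_time, default=None)

catalog = [
    CatalogEntry("db/sales.bak", "database", datetime(2018, 1, 1), "s3://archive/db1"),
    CatalogEntry("db/sales.bak", "database", datetime(2018, 3, 1), "s3://archive/db2"),
    CatalogEntry("media/ad.mp4", "file", datetime(2018, 2, 1), "s3://archive/f1"),
]
# Ask for the sales database as it existed on Feb 1: the Jan 1 copy wins.
hit = find(catalog, "database", datetime(2018, 2, 1))
```

Without the type and point-in-time fields, a process could only restore blindly; with them, it can answer exactly the questions posed above.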
Now expand on this concept and apply it to a BCDR (business continuity and disaster recovery) scenario where the requirement is to restore data such as files, databases, and media files, not just virtual machines. The deeper we go down the rabbit hole, the more issues surface.
Enterprises should be able to take advantage of the benefits provided by public clouds and modern data management solutions without compromise. They should never have to lose visibility into their data, and they should retain ways to manage it efficiently and effectively.
What does this boil down to? Enterprises are on the lookout for modern data platforms and public and private cloud solutions.
One platform worth looking into is Cohesity's DataPlatform, which is recognized as a data protection and recovery product but has proven to be much more than just backup.
Due to its native cloud integration, Cohesity DataPlatform provides enterprises with the ability to maintain visibility of their data. It also allows organizations to catalog, inventory, and download data from any supported repository target or location where data has been stored, whether for long-term retention or for business continuity and disaster recovery purposes.
With Cohesity, organizations can access, retrieve, and catalog their data from any supported internal (on-premises) and external (public cloud) repository, regardless of the accessibility of the originating data source (cluster).
For example, an on-premises cluster that archives data for long-term retention by replicating it to an external cloud target (S3, Blob, NFS) is illustrated below:
In combination and integration with public clouds, the Cohesity DataPlatform allows organizations to maintain visibility of their data and enables secure access and retrieval of data from any of the available Cohesity cluster deployment models (on-premises, Virtual Edition, Cloud Edition).
Accessibility also includes search, metadata and data download capabilities, and optimized data egress from public clouds for selective BCDR (business continuity and disaster recovery) scenarios.
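Why does selective egress matter? Public clouds charge for data leaving them, so pulling back only the objects you actually need, rather than a full restore, can avoid most of that cost. The sketch below is a generic illustration of that idea under assumed names (`plan_selective_restore`, a catalog of plain dicts with `path` and `size_bytes` keys); it is not Cohesity's API:

```python
def plan_selective_restore(catalog, wanted_paths):
    """Select only the requested archived objects and report how many
    bytes of cloud egress a full restore would have incurred beyond
    the selective one."""
    selected = [obj for obj in catalog if obj["path"] in wanted_paths]
    full_bytes = sum(obj["size_bytes"] for obj in catalog)
    egress_bytes = sum(obj["size_bytes"] for obj in selected)
    return selected, full_bytes - egress_bytes

# Hypothetical archive contents: one large VM image, one small document,
# one database backup.
catalog = [
    {"path": "vm/app01.vmdk", "size_bytes": 40_000_000_000},
    {"path": "files/report.docx", "size_bytes": 250_000},
    {"path": "db/orders.bak", "size_bytes": 5_000_000_000},
]
# Recovering just the one document avoids ~45 GB of egress.
selected, bytes_saved = plan_selective_restore(catalog, {"files/report.docx"})
```

The design point is simply that restore granularity drives egress cost: the finer the catalog, the less you pay to get your data back.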
The demonstration below showcases the scenario and capabilities discussed above.
I’ll say it once again: one of the most valuable assets of any business is its information. As has been said before, “data is the new oil,” and it’s time for everyone to step up their game and treat it that way!
For future updates about Hyperconvergence, Cloud Computing, Networking, Storage, and anything in our wonderful world of technology, be sure to follow me on Twitter: @PunchingClouds.