Why Data Protection Strategy is Critical for Federal Government Cybersecurity
With a new administration in place and the SolarWinds hack still fresh in our minds, I have been thinking a lot about the issues and opportunities the federal government faces with new and old technology, especially when it comes to cybersecurity. The government is huge, and its systems are sprawling, distributed, dispersed, and not at all homogeneous. Because of its size and the nature of what all its different organizations and agencies do, there are big and unique challenges in the years ahead. As an advisor for Cohesity, I thought I could share some of my experiences and perspectives on the unique challenges of securing our government’s systems, what can help organizations and agencies going forward, and why I chose to work with Cohesity.
Government’s Big Tech Challenges
When I was at the DoD and NSA, technical debt was one of the biggest challenges we faced, and it was so big that it was difficult for the tech industry to wrap its arms around it. A 10-year-old system was not out of the norm. Adding new technologies created incredible complexity, especially since there is so much technology out there. I received an enormous number of requests to evaluate technology, and only a fraction of it was useful because each product solved only a fraction of our issues. Additionally, the DoD would often replace an entire system because modernizing the existing one was too difficult. When I was in the Pentagon, I sought and received funding to replace existing cyber systems that had been selected 10 years earlier and, in some parts of the DoD, were just being implemented. By the time they were up and running, they were addressing a 10-year-old problem. Now we were asking for money to select a new system to address today’s problems. It was nearly impossible to implement at a scale where it would be effective. Everyone understood it was time to replace it; it just took so long.
Another challenge I dealt with every single day was interoperability. There are so many different types of technology products available, and each is developed and adopted to solve a specific problem. They don’t interoperate, even though they would have much more value if they did. Instead, a huge organization is left trying to manage all that disparate technology and train its tech teams on each new piece of it. As a result, this incredible technology never gets fully implemented, or those trained on one solution are never trained on another. Vendors would often tell me that the organization would realize significant security improvements if their technology were fully implemented, but we only implemented a fraction of it. Instead of fully implementing what they have, government organizations often acquire new technology to address the gap. It is a vicious cycle that is difficult to break.
The technical debt and interoperability challenges were exacerbated by the fact that we needed a significant number of people who knew the technology well enough to ensure we were capitalizing on its capabilities. The diverse, complex, and disparate nature of these systems, along with the excruciating lead times, required people who had mastered many different types of technology, from application development to systems integration and enterprise database administration. Even when we did find people who could span the depth of skills needed to properly implement solutions across the department, it was never enough.
Last but not least is data. In the federal government, few organizations know where all their data resides. Data is typically developed in silos for very specific mission needs. Most federal government organizations aren’t realizing the benefits of all that data because they can’t get to it. Not only that, but data is one of the world’s most valuable resources. Sophisticated cyber attackers are now using their skills to steal data rather than attack a company’s security perimeter. Look at what happened with SolarWinds. Data really is intellectual property in almost every environment, from financial information to healthcare to intelligence.
Compounding this data problem is the fact that our systems are getting more and more distributed with the cloud. Yet we still have mission requirements that demand on-premises data centers. We have on-premises and off-premises cloud, and our data is distributed among all those environments. When I was at the Pentagon, for example, we thought we were doing some very innovative data analytics with a SaaS company. We thought the data was staying in our cloud the whole time the company was doing analytics for us. Little did we know it was going all over the place, even though that was the last thing we wanted. The SaaS company was working with another company, and that company was working with yet another. We found that tracing our data was nearly impossible.
I realize that I have painted a gloomy picture of what the federal government faces. But, there is light at the end of the tunnel. These challenges can be overcome. It starts with well-thought-out, well-funded, and well-executed strategies. In the case of technical debt and interoperability, these strategies should address how legacy and modern technology should coexist and how to work with technical vendors to develop systems that interoperate. Some progress is happening in that area as tech shifts to an API-driven economy, and more technology libraries and frameworks are shared among communities.
Meanwhile, as technology companies and secondary and higher education increasingly cooperate to offer more opportunities for young people to acquire technical skills, some of the hiring issues will ease, especially if more funding materializes. The rise of innovative automation will also play a big role in closing the gap, perhaps even more than growing the talent pool.
Federal government organizations and agencies can also address the siloed, inaccessible, unsharable, and vulnerable data problem. The solution is having a dedicated data officer architect data strategies. For example, once the NSA pulled all of its data governance under one organization, finding the data, looking at the data, protecting the data, and developing a cohesive strategy for it became much easier. Nothing was instantaneously and magically revealed, of course, but by eliminating those stovepiped systems, the agency gained a more strategic view. The NSA also had a huge research organization, and we were investigating AI, ML, and data analytics, and exposing existing data for detailed analysis.
Leveraging Innovative Cyber Technology
The government’s data problem is not necessarily one of policy. There are many policies on paper, but they are not always funded, so it is difficult to act on them. Our ecosystem is also so interconnected that the problem impacts a huge portion of the government. The DoD, for example, is trying to address this because of how much IP it has and because it loses billions of dollars every year from leaks and hacks in the industry that supports it, the Defense Industrial Base (DIB). Many components of the DIB are small companies that cannot withstand a nation-state attack simply because they do not have the resources. I often encountered small companies that developed pieces of a weapon system yet would be defenseless against a nation-state attack because their IT department was a part-time worker or, at best, three people.
SolarWinds was a wakeup call, telling us that we have not worked out how to protect our data. We do not have it figured out yet. The good news is that the information sharing between organizations like Cyber Command, the NSA, the FBI, and the DHS is exceptional. I watched it mature over many years, and I think it is going to get even better. Anne Neuberger, for example, is heading up President Biden’s response to the SolarWinds hack, and she is a very, very strategic thinker. She is also a tremendous collaborator across government, and I definitely expect to see good things happen there. I expect to see other major muscle movements of this caliber under this new administration.
I believe innovative cyber technology will allow us to address this issue of data and cybersecurity. It can help us answer the kinds of questions that the SolarWinds hack raised. What did they want? What were they trying to get? After they got in, escalated privilege, and maneuvered through the network, did they get what they were after? Innovative cyber technology can also help us do a better job of protecting the data.
Innovative technology that delivers automation is especially key to protecting our data. Automation is going to save the day for a lot of things, such as data management and cybersecurity. Take compliance automation, for example. Compliance with the growing body of data protection laws and regulations is a challenge for every organization, but those rules exist for very good reasons. Compliance automation enables you to keep up as technology, people, and configurations change, making it much easier to stay in compliance. There was a program in the Department of Defense called Comply-to-Connect that I thought was one of the best things we implemented. Every single endpoint, every single user’s laptop or workstation, was checked for compliance the minute it attempted to connect to the network. If it was not in compliance, it was not allowed to connect.
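The Comply-to-Connect idea described above can be sketched as a simple automated gate: run a set of policy checks against an endpoint, and admit it to the network only if every check passes. This is a minimal illustrative sketch, not the DoD’s actual implementation; the specific checks (patch baseline, antivirus signatures, disk encryption) and the names used here are assumptions chosen for the example.

```python
# Hypothetical sketch of a Comply-to-Connect style admission check.
# The checks below (patch level, AV signatures, disk encryption) are
# illustrative assumptions, not an actual DoD policy set.

from dataclasses import dataclass

REQUIRED_PATCH_LEVEL = 42   # assumed security-patch baseline for this sketch

@dataclass
class Endpoint:
    hostname: str
    patch_level: int            # installed security-patch baseline number
    av_signatures_current: bool # antivirus definitions up to date?
    disk_encrypted: bool        # full-disk encryption enabled?

def compliance_failures(ep: Endpoint) -> list:
    """Return the list of policy checks this endpoint fails."""
    failures = []
    if ep.patch_level < REQUIRED_PATCH_LEVEL:
        failures.append("patch level below baseline")
    if not ep.av_signatures_current:
        failures.append("antivirus signatures out of date")
    if not ep.disk_encrypted:
        failures.append("disk not encrypted")
    return failures

def may_connect(ep: Endpoint) -> bool:
    """Automated gate: an endpoint joins the network only if fully compliant."""
    return not compliance_failures(ep)

# Example: a laptop one patch behind the baseline is refused.
laptop = Endpoint("user-laptop", patch_level=41,
                  av_signatures_current=True, disk_encrypted=True)
print(may_connect(laptop))  # False: patch level below baseline
```

The value of automating this gate is that the policy lives in one place: when the baseline changes, only the check list changes, and every endpoint is re-evaluated on its next connection attempt with no human in the loop.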
Automation, especially in the cloud, is the way of the future. The possibilities are endless with what you can do with machine learning and AI in the cloud. We are also going to continue to see hybrid approaches that include a mix of public cloud and on-premises data centers. The economies of the public cloud cannot be ignored, but due to the constraints of certain missions, not everything can live in a public cloud. Cloud companies do a great job of protecting their infrastructure. A user needs to understand what the cloud company is protecting and what their organization’s responsibility is. Often the data of individual organizations and agencies needs additional protection. The intelligence community, for example, will probably never accept the risk of moving to a purely public cloud.
There are certain technologies that I feel are the cornerstone of great cybersecurity. I became an advisor to Cohesity because its solutions for data management and cybersecurity are the kind of innovative technology I think the federal government needs. As I’ve explained, data is a precious commodity in the federal government, as it is elsewhere, and Cohesity demonstrates that it understands that fact. Its laser focus on data as an organization’s last line of defense, and the innovations it is developing to keep that data as safe as it can be, are the kind of technology and foresight that can best serve the defense and intelligence communities within the federal government.
As a former NSA Executive and DOD CIO/CISO, Marianne brings over 35 years of experience across the Department of Defense (DoD), Intelligence community, and civil government sectors. She served as Deputy National Manager for National Security Systems (NSS) and Senior Cybersecurity Executive for the National Security Agency where she was directly responsible for systems across the government containing classified and/or sensitive information. She also served as both Principal Deputy for Cybersecurity and Deputy Chief Information Security Officer (DCISO), Department of Defense, CIO. She received the Distinguished Executive Presidential Rank Award, the highest government civilian recognition, for her contributions to national security. Marianne Bailey currently leads Guidehouse’s Advanced Solutions Cybersecurity practice. Ms. Bailey earned a Bachelor’s degree in Engineering from the University of Maryland, College Park, and a Master’s degree from Industrial College of the Armed Forces, National Defense University.