
Cloud Service Providers have no problem sharing with you the number of data centers they own, the flexibility of their options, and how easy it is to get started in the cloud. However, what is never overtly stated is that the federal technology manager is responsible for the security of their data whether it sits on the server down the hall or in the cloud.
The conversational phrase is, “they are not on the hook for the security of your data.”
Today, we have several perspectives on understanding how to protect federal data in the cloud. Experts from three areas provide their views on data protection, standards, and working in a cloud environment.
When it comes to protecting data in the cloud, Skip Bailey from the U.S. Census Bureau thinks that one needs to approach it strategically first. Each of the three main Cloud Service Providers has proprietary ways of handling aspects of data control. If you think you are going into a multi-cloud environment and plan on relying on one set of rules, you are mistaken. You will need staffing to support each of these clouds.
As in other endeavors, standards bodies can provide guidance for coming to terms with heterogeneous environments, in this case, a mix of cloud providers. Craig Hurter from the State of Colorado suggests getting comfortable with ISO specifications such as ISO/IEC 17789, as well as the general guidelines from the Cloud Security Alliance. That way, you can measure each Cloud Service Provider's terms of service against whatever standards you choose.
It seems likely that a multi-cloud world is where federal data will live. If that is the case, then it would behoove managers to be able to evaluate each Cloud Service Provider's capabilities. Each cloud may offer options for control; the key is to understand how each provider's proprietary offerings compare to commercial ones.
Sterling Wilson suggests that you start with three questions. What happens when you delete data? How easy is it to deploy Multi-Factor Authentication? How is data secured in transit?
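Those three questions can be turned into a repeatable checklist applied to every provider under evaluation. A minimal sketch follows; the provider name and answers are hypothetical placeholders, not real CSP capabilities.

```python
# The three questions, encoded as a reusable evaluation checklist.
QUESTIONS = [
    "What happens when data is deleted?",
    "How easy is it to deploy Multi-Factor Authentication?",
    "Is data encrypted in transit by default?",
]

def evaluate(provider: str, answers: dict) -> list:
    """Return the questions a provider has not answered satisfactorily."""
    return [q for q in QUESTIONS if not answers.get(q, False)]

# Hypothetical answers, as gathered from a provider's terms of service.
answers = {
    QUESTIONS[0]: True,   # deleted data is unrecoverable after a fixed window
    QUESTIONS[1]: True,   # MFA can be enforced on all accounts
    QUESTIONS[2]: False,  # encryption in transit is opt-in, not the default
}

gaps = evaluate("example-cloud", answers)  # the open items to negotiate
```

Running the same checklist against each cloud makes the proprietary differences between providers directly comparable.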
One concept that Craig Hurter brings up is the idea of architecting data storage in depth. The idea is that the initial system is solid, but over time something called “drift” takes place. Updates may not all be installed promptly; other maintenance can be delayed. Security can erode even while the system still appears to hold to its initial design specifications. You may have “drifted” without knowing it.
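One way to make drift visible is to record the design-time baseline and periodically diff it against the live configuration. A minimal sketch, with illustrative setting names not tied to any particular cloud provider:

```python
def find_drift(baseline: dict, current: dict) -> dict:
    """Return settings whose current value differs from the baseline."""
    drifted = {}
    for key, expected in baseline.items():
        actual = current.get(key)
        if actual != expected:
            drifted[key] = {"expected": expected, "actual": actual}
    return drifted

# Hypothetical baseline captured when the system was first approved.
baseline = {"tls_min_version": "1.2", "mfa_required": True, "patch_level": 42}

# Hypothetical live state some months later: maintenance has slipped.
current = {"tls_min_version": "1.2", "mfa_required": False, "patch_level": 37}

drift = find_drift(baseline, current)  # flags mfa_required and patch_level
```

Scheduling a check like this turns drift from an invisible erosion into a concrete, reviewable report.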