Using leased infrastructure to store large amounts of data has been common practice for several years. Leased capacity lets businesses greatly reduce the cost of running their own IT infrastructure, since there is no need to purchase and maintain a server farm. On the other hand, serious security concerns arise; among them is the question of how much control over the data the client company retains once it is handed to the cloud service provider.
The issue was taken seriously before, but the Edward Snowden revelations drastically raised its profile and sparked wide debate about the safety of cloud storage systems. As a result, the credibility of US-based cloud providers has significantly deteriorated. According to a survey by the Cloud Security Alliance, 56% of respondents outside the U.S. said they were unlikely to use US-based cloud services.
The possibility of an unauthorized third party gaining access to outsourced corporate data is a substantial and persistent concern for every user of a cloud service. Unfortunately, malware has repeatedly infected cloud environments despite all the assurances that the cloud infrastructure was adequately protected.
Both sides – the client and the cloud service provider – are responsible for the safe transfer of data between local and leased resources. The client certainly has the right to demand that the cloud provider use a secure connection (SSL/TLS or a VPN), but the customer must also take every possible measure to protect the information, especially when it is critical. The first such measure is data encryption.
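The client-side measure described above can be sketched as follows. This is a minimal, illustrative example using only the Python standard library: the key is generated and kept on the client, and only ciphertext is handed to the provider. The toy XOR keystream stands in for a real cipher (in practice, use a vetted library such as `cryptography`'s Fernet or AES-GCM), and the upload call is a hypothetical placeholder for your provider's SDK.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from key+nonce using SHA-256 blocks."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)              # fresh nonce per message
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))

key = secrets.token_bytes(32)    # generated and stored on the client side only
blob = encrypt(key, b"confidential report")

# Only the ciphertext leaves the company's perimeter.
# upload_to_cloud("reports/q3.bin", blob)   # hypothetical provider SDK call

assert decrypt(key, blob) == b"confidential report"
```

The point of the sketch is the division of responsibility: even if the provider's storage is compromised, the attacker obtains only ciphertext, because the key never left the client.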
According to B2B Interactive research, 44% of companies use encryption to secure critical data, while 36% encrypt all of their data. Given the complexity of the global data security situation, the level of adoption of encryption technologies is much lower than it should be.
Last spring, Gartner published a report stating that corporate customers will be more willing to use data encryption in the cloud (and will actively seek such services) once customers and cloud service providers together resolve several issues of data processing and data security.
It is important to notify customers of the potential for information leaks, the physical location where the data is stored, how the data is managed and protected in storage and in transit, as well as access rights and the management of encryption keys.
The Internet is a global phenomenon, but laws (including data protection laws) differ from country to country, and one must always keep that in mind. The physical location of the servers where the data is stored is therefore extremely important. Local legislation defines how information about data leaks must be disclosed if they occur, as well as the procedures for handing over information at the request of government or law enforcement agencies.
Another serious issue is the provider’s policy regarding its multi-tenant architecture. Clients should know how their data and traffic are isolated from other clients’ data and traffic, and what mechanisms prevent replication of data to servers physically located in countries where the information should not reside for whatever reason.
As mentioned above, encryption is a very important element of the corporate “defense system,” but it makes the issue of retaining control over information even more pressing.
Perhaps the best option is for the data (especially critical data) to be encrypted before it is transferred to the cloud. This gives the owners of the information more confidence in its security. If the service provider applies additional encryption, it is desirable that those keys also be stored on the client’s side.
If the provider manages the cryptographic keys, the client has the right to insist on hardware encryption, and in that case the customer must be given full information about how the data is managed. Gartner recommends that control of the encryption keys used to protect critical data belong to the client. If the service provider cannot guarantee that, it is not worth dealing with.
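The “keys belong to the client” principle is commonly realized through envelope encryption: each stored object is encrypted with its own data key, and that data key is wrapped with a master key that never leaves the client. Below is a toy standard-library sketch of the idea; the XOR operation stands in for a real key-wrap algorithm such as AES-KW, and all names are illustrative.

```python
import secrets

master_key = secrets.token_bytes(32)   # lives only in the client's key store

def wrap(master: bytes, data_key: bytes) -> bytes:
    """Toy key-wrap: XOR the data key with the master key (AES-KW in practice)."""
    return bytes(a ^ b for a, b in zip(data_key, master))

def unwrap(master: bytes, wrapped: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(wrapped, master))

data_key = secrets.token_bytes(32)     # per-object key used to encrypt the payload
wrapped_key = wrap(master_key, data_key)

# The provider may store the ciphertext and the *wrapped* key, but without the
# client-held master key the data key cannot be recovered.
assert unwrap(master_key, wrapped_key) == data_key
```

The design point matches Gartner’s recommendation: the provider can store and serve encrypted objects, yet can never decrypt them on its own, because the master key stays under the client’s control.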
The provider is also required to apply the same level of protection to database snapshots as to the customer’s confidential data, so that potential attackers have no way to analyze memory contents for encryption keys.
Access to data must be kept to a minimum. Gartner advises restricting access to customer data to a specific range of IP addresses. The provider should offer two-factor authentication as well as adequate means of controlling and separating client and administrative access to the data in the cloud. In addition, Gartner insists on mandatory logging of all data queries by clients and administrators, so that records are available in case of an incident.
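Two of these recommendations — an IP-address allowlist and an audit trail of every data query — can be sketched in a few lines. This is an illustrative example only; the network range, user names, and function names are hypothetical, not any provider’s actual API.

```python
import ipaddress
import logging
from datetime import datetime, timezone

# Hypothetical allowlist: only the client's office network may access the data.
ALLOWED_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]

audit_log = logging.getLogger("audit")

def is_allowed(client_ip: str) -> bool:
    """Check whether the request originates from an approved IP range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)

def handle_query(client_ip: str, user: str, object_key: str) -> bool:
    allowed = is_allowed(client_ip)
    # Every query -- permitted or denied -- is recorded for later audit.
    audit_log.info("%s user=%s ip=%s key=%s allowed=%s",
                   datetime.now(timezone.utc).isoformat(),
                   user, client_ip, object_key, allowed)
    return allowed

assert handle_query("203.0.113.7", "alice", "reports/q3.bin") is True
assert handle_query("198.51.100.9", "mallory", "reports/q3.bin") is False
```

In a real deployment the allowlist check would sit in front of the storage API (for example, as a bucket policy), and the audit records would be shipped to immutable storage so administrators cannot alter their own trail.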
All in all, “transparent opacity” is a good recipe for preserving and increasing confidence in cloud service providers. The client needs to know what is happening to its data at any time and must always have tools to control access to it. At the same time, the stored data must be shielded from any prying eyes. The optimal solution is “defense-in-depth”: secure connection protocols combined with encryption and restricted access to the data stored in the cloud.