Clearing the Cloud Part II | A Ray of Sunshine On A Cloudy Day | Cloud Computing Security

This entry is part of a series. Entries in this series:
  1. Cloud Security Article – 1st in a Series
  2. Cloud Security: Danger (and Opportunity) Ahead
  3. Clearing the Cloud Part II | A Ray of Sunshine On A Cloudy Day
  4. Clearing the Cloud Part III | How Do You Solve A Problem Like “A Cloud”?
In the first of this series of “Clearing the Cloud” columns, I explored the dangers of jumping too soon into cloud computing. In this article, the second in the series, I continue laying out my vision of how to manage and secure cloud-computing solutions.

Clearing the Cloud Part II – A Ray of Sunshine On A Cloudy Day

In the weeks since the publication of my first article, new information and new “Cloud Solutions” seemed to pop up every day, sometimes every hour. I am gratified to see that NIST, the National Institute of Standards and Technology, has published the 15th draft of its definition of cloud computing and, with it, agreed with much of the definition I proposed in Part I of this series – “Service based data processing and storage capability which is flexible, extensible and virtual”.
NIST suggests that cloud computing has the following salient characteristics: that it would be:
  • On-demand self-service, based upon
  • Ubiquitous network access, using
  • Location-independent resource pooling, featuring
  • Rapid elasticity, and providing
  • A measured service.
It is interesting to note that NIST specifically called out the requirement that the service be measured. I wholeheartedly agree, and take this to be a step forward in the maturity of Cloud Computing.

Security Models

The Jericho Forum proposed an interesting approach to cloud computing security. Starting with its description of Cloud Layers, shown below, allows us to envision the problem:

Figure 1: Jericho Forum’s Cloud Layers

Here, the Forum proposed that security (and identity management) are elements that cross all layers, in effect providing a design they call Collaboration Oriented Architecture (COA). Once this foundation has been laid, they defined cloud security as a cube-shaped model that highlights the various architectural possibilities; the one addressed here is, of course, the outsourced / external / de-perimeterized option.
Figure 2: Jericho Forum’s Cloud Security Model
At about the same time, the Cloud Security Alliance, of which I am a member, designed a not-too-different view.
The CSA broke down Cloud computing into three delivery types:
  1. Infrastructure as a Service (IaaS)
  2. Platform as a Service (PaaS)
  3. Software as a Service (SaaS)
And then proceeded to define the Cloud consumption models:
  1. Private
  2. Public
  3. Managed
  4. Hybrid
If we arrange those elements in a matrix, we get a cube similar to Jericho’s work (a small enumeration sketch follows the figure):
Figure 3: Cloud Computing Cube
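To make the matrix concrete, here is a minimal sketch (plain Python; the structure is my own illustration, not the CSA's official taxonomy) that enumerates one face of such a cube:

```python
# Enumerate the delivery-type x consumption-model face of the cube.
# A third axis (e.g., Jericho's internal/external dimension) would
# turn this 3x4 matrix into a true cube.
DELIVERY_TYPES = ("IaaS", "PaaS", "SaaS")
CONSUMPTION_MODELS = ("Private", "Public", "Managed", "Hybrid")

for delivery in DELIVERY_TYPES:
    for consumption in CONSUMPTION_MODELS:
        print(f"{consumption} {delivery}")  # e.g., "Public SaaS"
```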
While I do not disagree with the CSA’s reference model of service-delivery stacks, however, I find it exceedingly complex.
So, allow me here to define the problem statement a bit differently than above. Because these are the formative years of the cloud discussion, let’s expand the basic three tenets of security, which are:
  1. Confidentiality
  2. Availability and
  3. Integrity
And add additional controls.   We will borrow from Donn Parker’s Hexad, and add:
  1. Possession (or Control)
  2. Authenticity and
  3. Utility
Clearly, in the case of Cloud computing, and especially in the Public/External case, we no longer have any control. Once the bits “leave our network,” control passes elsewhere.   And then there were five.
Losing one control typically mandates an increase in the other controls. Here, we have another set of problems.    Let us explore the remaining controls:
  • Confidentiality: Typically, we handle confidentiality through technologies such as encryption and access control. We can still encrypt, but imagine what happens to a large data set. It has to be sent, or assembled, in the Cloud, remain there in encrypted form, and be transferred back to us for processing. Once the data is at our location, we have to decrypt it, perform the operations needed, then re-encrypt and resend it to the Cloud. Doable – yes. But the performance tax here is huge. While today’s routers and servers no longer have their throughput cut to one-sixth by encryption (a loss of roughly 83%), we still pay a heavy price.
Figure 4: Lifecycle of Encrypted Data
Let us state it once more: having the data unencrypted at any point in the storage or transfer process exposes it to unauthorized disclosure. Unauthorized disclosure, of course, runs counter to any good security or compliance requirement, such as PCI or HIPAA. Even Amazon, with an inherent interest in providing such services, announced that its Cloud is neither PCI compliant nor intended for such work:
Hi, Thank you for contacting Amazon Web Services. Our payment system is PCI compliant and it is an “alternative payment processing service” meaning your users re-direct to our platform to conduct the payment event using their credit cards or bank accounts. The benefit for you is that we handle all the sensitive customer data so you don’t have to. If you haven’t looked at it, I highly suggest you check out the features and functions of our Flexible Payment Service and our Payment Widgets ( http://aws.amazon.com/fps).

As for PCI level 2 compliance, that requires external scanning via a 3rd party, PCI-approved vendor. It is possible for you to build a PCI level 2 compliant app in our AWS cloud using EC2 and S3, but you cannot achieve level 1 compliance. And you have to provide the appropriate encryption mechanisms and key management processes. If you have a data breach, you automatically need to become level 1 compliant which requires on-site auditing; that is something we cannot extend to our customers. This seems like a risk that could challenge your business; as a best practice, I recommend businesses always plan for level 1 compliance. So, from a compliance and risk management perspective, we recommend that you do not store sensitive credit card payment information in our EC2/S3 system because it is not inherently PCI level 1 compliant. It is quite feasible for you to run your entire app in our cloud but keep the credit card data stored on your own local servers which are available for auditing, scanning, and on-site review at any time.

Regards,

Cindy S.
Amazon Web Services

Figure 5: Amazon Web Services’ response regarding PCI compliance
Figure 6: Encryption in Cloud Case Study
Try the following as an example: suppose you have a volume of credit-card-bearing transactions that you must preserve for a period of one year, and let’s assume that the data is in SQL form. If so, the steps needed would include (a code sketch follows the list):
  1. Exporting the relevant tables
  2. Encrypting these files with suitable encryption
  3. Uploading the encrypted files to your cloud “bucket”
  4. Storing the data in the cloud, in an encrypted form
  5. Downloading it, while encrypted
  6. Decrypting the data
  7. Importing the data, and finally
  8. Processing it
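A minimal sketch of that round-trip, assuming Python's third-party cryptography package; upload_to_bucket and download_from_bucket are hypothetical stand-ins for whatever storage API your provider exposes:

```python
# Round-trip sketch: encrypt locally, store only ciphertext in the cloud,
# pull it back, and decrypt locally for processing (steps 2-7 above).
# Requires: pip install cryptography
from cryptography.fernet import Fernet

def store_export(sql_dump: bytes, bucket: str) -> bytes:
    """Encrypt an exported table dump and push it to the cloud.
    The key stays on OUR side; the provider only ever sees ciphertext."""
    key = Fernet.generate_key()
    ciphertext = Fernet(key).encrypt(sql_dump)
    upload_to_bucket(bucket, ciphertext)        # hypothetical provider call
    return key

def retrieve_export(bucket: str, key: bytes) -> bytes:
    """Download the ciphertext and decrypt it locally (steps 5-6)."""
    ciphertext = download_from_bucket(bucket)   # hypothetical provider call
    return Fernet(key).decrypt(ciphertext)
```

Note that every byte crosses the wire twice and is transformed twice: that is the performance tax described above.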
One other element within Confidentiality is the ability to destroy data. In a cloud that we do not own, and on storage media that we do not control, there is a high probability that the same media will be used for other purposes. These storage buckets are dynamic, and the service/platform/application provider might allocate them to other users.

This sharing, and in many cases repeated sharing, of storage media leads to the need for assured destruction. We must follow a strict regime that states how long data is to be kept, when and by whom it is destroyed, and how such destruction is verified. Since degaussing tapes and shredding CDs is out of the question, we must employ more agile software- (or, dare we say, hardware?) based methods to assure that destruction.
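One software-based method often discussed for exactly this problem is crypto-shredding: keep the remote data encrypted at all times, and "destroy" it by destroying the only key. A minimal sketch, again assuming the cryptography package; the dictionary is a stand-in for a real key-management system or HSM:

```python
# Crypto-shredding sketch: the cloud bucket only ever holds ciphertext,
# so securely discarding OUR copy of the key renders every remote copy
# unreadable -- no degaussing of media we don't own required.
from cryptography.fernet import Fernet

key_store = {}  # stand-in for a real key-management system / HSM

def protect(record_id: str, plaintext: bytes) -> bytes:
    """Encrypt a record under its own key; the result goes to the cloud."""
    key = Fernet.generate_key()
    key_store[record_id] = key
    return Fernet(key).encrypt(plaintext)

def shred(record_id: str) -> None:
    """'Destroy' the cloud copies by discarding the only key."""
    del key_store[record_id]  # the ciphertext is now unrecoverable
```

Whether such key destruction satisfies a given compliance regime is, of course, a question for your auditors, not your code.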

Figure 7: Chercher les… données?

This question becomes infinitely more complicated when we consider that data at rest does not necessarily “rest” on a particular platter of a particular hard drive. The data can, and usually does, move between storage locations on the drives. The onus is still on us to assure confidentiality, but… we don’t manage the drives. The only practical solution here is to demand regular scouring of storage media from the service providers. Do we think such a requirement is feasible?

Figure 8: Attention to Storage Media
Finally, lest someone think I am only talking about the storage aspect of Cloud Computing, the above discussion is easily applicable to processing in a Cloud as well.
  • The next control we will deal with is Availability.   When dealing with a Cloud-computing resource, we are at the mercy of the network, the remote server, and whatever controls are applicable along the way, be they host- or network-related.

    Yes, we have always been at the mercy of such risks, but we owned them before. When we multiply one component’s 99.5% availability by another’s, our effective SLA quickly falls below 99%, and it keeps falling with every added dependency. At what point does the enterprise take notice? As we can see from recent, published outages at Google and elsewhere, users are very sensitive to the availability of the information they require, and rightly so.

    Figure 9: Rapidly Decreasing SLAs
    (98.5% availability = five and a half days of downtime a year)
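The arithmetic behind those numbers, as a short sketch (the 99.5% figures are illustrative, not any particular vendor's SLA):

```python
# Compound availability: independent components multiply,
# so the effective SLA erodes with every added dependency.
def downtime_days(availability: float) -> float:
    """Expected days of downtime per year at a given availability."""
    return (1 - availability) * 365

network  = 0.995                  # our link to the provider (illustrative)
provider = 0.995                  # the provider's own SLA (illustrative)
effective = network * provider    # ~0.9900 -- already below 99%

print(f"effective SLA : {effective:.3%}")
print(f"downtime/year : {downtime_days(effective):.1f} days")
print(f"at 98.5%      : {downtime_days(0.985):.1f} days")  # ~5.5, as above
```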
    Even when taking steps to “assure” access, which in reality translates into reducing exposure to this particular risk, we have typically resorted to building redundancy into the system. Here, that would presumably add lines, servers, networking equipment and personnel. Doable, but at what cost? What does the complexity of redundancy mean to an organization? What is the true cost of operations?
Figure 10: Availability in Cloud Case Study
Let’s look at an example: we have a volume of data which at times stretches by a factor of ten, so cloud computing seems like the perfect solution. Here is what may happen:
  1. We ask the Cloud service provider for burst capacity in data storage. We will estimate this payment at 10% of our regular Cloud computing cost.
  2. We ask our network services provider to create another redundant, highly available path to the Cloud service provider. We will estimate that cost at 25% of our regular data-communications cost.
  3. And now we must consider what we are to do if such a data burst occurs when we have no connectivity to the cloud. Are we going to dispose of the data? Cease operations? No and no. So we must plan for (at least) the storage of such data regardless of whether we use cloud computing services. (Rough numbers are sketched below.)
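Putting rough figures to those percentages: the dollar baselines below are purely hypothetical; only the 10% and 25% premiums come from the estimates in the list above.

```python
# Hypothetical baselines for the two premiums estimated above.
cloud_cost = 100_000   # regular annual cloud spend (assumed)
comms_cost =  40_000   # regular annual data-communications spend (assumed)

burst_premium  = 0.10 * cloud_cost   # item 1: storage bursting
redundant_path = 0.25 * comms_cost   # item 2: second network path

premium = burst_premium + redundant_path
print(f"availability premium: ${premium:,.0f}/yr "
      f"({premium / (cloud_cost + comms_cost):.0%} of baseline)")
```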
  • As for Integrity: we can detect changes after they are made. From hashing to redundancy checks, from digital signatures to trip-wiring, we are able to ascertain that a change occurred. But… we can no longer prevent changes.

    The bastion of defense in depth crumbles when we talk about Cloud computing. We do not own the moats, the walls, or the doors (see my paper about the Evolution of Defense in Depth). Accepting data without verification should be unthinkable, yet verifying all inbound data will be complex and costly, adding yet another layer to the mix of technologies and methodologies we must wrangle (a minimal detection sketch appears below).

    Indeed, the Cloud unchecked could lead to a wave of new attacks aimed directly at data whose guardians (by virtue of possession) are not incentivized to protect it from change, only to speed it on its way.
    Cloud computing could be a gold rush for people designing man-in-the-middle attacks, too. While most hosting companies will boast of their monitoring and security, few, if any, can assure you that they have never been compromised. In fact, a Cloud data store, with its already-built-in doorway (or tunnel) to you, makes their life easier: they can now both alter the data and assure that it, and associated payloads, make their way to the intended destination.
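The promised detection sketch, using only Python's standard library: record a digest before the data leaves our network, and refuse anything that returns with a different one. This detects tampering; it still cannot prevent it.

```python
# Integrity by detection: hash on the way out, verify on the way back.
import hashlib

def fingerprint(data: bytes) -> str:
    """Digest recorded locally, BEFORE the data leaves our network."""
    return hashlib.sha256(data).hexdigest()

def accept_inbound(data: bytes, expected: str) -> bytes:
    """Accept cloud-returned data only if its digest matches the record."""
    if fingerprint(data) != expected:
        raise ValueError("integrity check failed: data changed in the cloud")
    return data
```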
Figure 11: Integrity in Cloud Case Study
Here is one scenario. We have a university system that stores grade data in the cloud. “Pranksters” attack the datacenter and access student data, changing some grades and deleting some students.
Unless we monitor very carefully, either by using as-occurred instances or by comparing the data to a “master” copy, such modified data runs a high probability of being accepted as the authoritative copy.
Worse yet, the data that was modified here has a known ownership. These malefactors can well insert, install, and even custom-design a payload to affect the specific environment and systems found at that university – while the data itself is treated by the university as owned, and therefore “blessed,” data. This is a whole new definition of insider threat, isn’t it?


Figure 12: Know What Data You Are Getting

  • Perhaps moving a tad away from the pure security elements of C, I and A to the more “business” ones, the first of the Hexad we will discuss is Possession (or Control). As recent developments in the realm of data-breach-notification laws have shown us (see my article about the “new” version of Mass.’ 201 CMR), the United States, albeit one state at a time, is moving closer to the European, and indeed worldwide, model of “Data Ownership”.


Figure 13: Definition of Data Ownership
Data ownership is a time-tested term and function that has been used, typically in the military realm, for over fifty years. The term has slowly been filtering into the corporate world, and now into realpolitik. The concept that every element of data has an owner is a simple one, really. Do allow me to explain here, for the sake of completeness, that most elements of data have at least a few owners, and perhaps many, many more.
The data, for example, can be owned by a person designated by the Enterprise, by the system administrator, and/or by the individual it is about (in the case of PII – personally identifiable information – for example).
Data ownership can specify who is responsible for the data, who can sell it, whom the data is about, or what its legal status is – and frequently more than one of these at once.
Most professionals would agree that data ownership is far easier to define and maintain when the data is at a known location. Even a relatively well-understood event, such as the data being transmitted from a server to a data center, can affect our concept of data ownership. As we can see from breach-notification loopholes, selling the data, as if it were a fungible item, sometimes releases a company from responsibility for it, leading some companies to create wholly-owned subsidiaries, often with very different names, whose sole raison d’être is to shield the parent organization from disclosure rules.
Now comes the Cloud.   Even if we admit to owning the data, do we know where it is? Allow me to make an observation:
Figure 14: Ariel’s 7th Law of Cloud Computing
“Most laws are geo-political, and therefore lose their efficacy and meaning when involving trans-border clouds”.
So even if we are the best-intentioned of CIOs and the furthest thing from our mind is flouting the law, we face a few obstacles. Let’s state some, in no particular order:
  1. How do we comply with Breach notification laws?
  2. What happens if we have data regarding an EU national?
  3. What must we do when we disclose risk information to Auditors? To the SEC?
  4. How do we comply with rules relating to CALEA? E-Discovery? Data Forensics?
and the list goes on and on.
Lastly, we remember that data has a lifecycle. Such a data lifecycle (DPLC) ultimately mandates that the data be disposed of in a secure manner. Remember those Cloud buckets? Well, they must be certifiably erased when we are done with them. How do we do that in a Cloud?
  • If we recall the university example above, Authenticity of data is a problem that must be addressed. Sometimes seen as a combination of non-repudiation, integrity and accountability, Authenticity is a super-set that defines the reliability we assign to, and the trust we place in, our data.

Should data in, or from, a Cloud be seen as less-trusted data? If so, is there any worth to it? Would the Cloud end up being used only for data we couldn’t care less about? Only time will tell, but the following article will offer some solution ideas.

Figure 15: Authenticity in Cloud Case Study
Would you, for example, require a CRC-type check before you accept data? Would that make your life easier or harder? And what about the following, admittedly oversimplified, scenario (a verification sketch follows the list of problems below):
  1. A man walks into a bar (really);
  2. The man orders some drinks;
  3. One of the drinks ordered is Tsing Tao – a brand of Chinese beer;
  4. The drinks’ prices are tracked, together with the bar’s inventory in a Cloud;
  5. The Cloud’s buckets are: one in China, one in Nepal, and one in… Oman;
  6. The bartender wants to give the man his bill…
Just how many problems do you see here?   I see at least three:
  1. What if the data buckets arrive back at their controlling application in a confused manner, in the wrong order, or with super-delayed timing?
  2. What if data about alcohol is being processed in Oman?
  3. And finally (and most related to our point), what if the Tsing Tao family of brewers wants to create an artificial, and large, order from this distributor, and as such bribes/hacks the Cloud provider to modify a piece of the data bucket?
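Back to the CRC-type-check question above: a plain CRC detects accidents, not bribery. Here is a sketch of something stronger, using only Python's standard library: tag each record with a sequence number and an HMAC under a key the Cloud provider never sees, so wrong ordering, confused timing, and deliberate modification are all detectable.

```python
# Tag-and-verify sketch for the bar's inventory records.
import hashlib
import hmac

SECRET = b"key-held-only-by-the-bar"   # hypothetical; never sent to the cloud

def tag(seq: int, record: bytes) -> bytes:
    """MAC over sequence number + record: binds content AND position."""
    return hmac.new(SECRET, seq.to_bytes(8, "big") + record,
                    hashlib.sha256).digest()

def verify(seq: int, record: bytes, mac: bytes) -> bool:
    """True only if this record, in this position, is unmodified."""
    return hmac.compare_digest(mac, tag(seq, record))
```

Note that this, too, only detects problem 3; preventing it would require the provider to never hold modifiable plaintext at all.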
  • The sixth element of our Hexad is Utility. Utility is where Cloud Computing excels. If we can figure out the other five elements, we can be the Bruce Willis of this story. Cloud Computing is clearly an idea whose time is near. We cannot argue against the flexible, MIPS-saving, just-in-time, CapEx-efficient model of a Cloud. The elasticity and low(er) cost attached to such an incredible advance in utility mandate that we solve the issues stated above. Not solving them will put data at amazingly higher risk: companies will put data in the Cloud, use the Cloud, and expand the Cloud at a tremendously accelerating rate, regardless of data security and privacy.

In the next article I will put forward some ideas on how to resolve the issues defined in this and the previous articles. I will also attempt to show some of the security-related benefits we can garner from the use of Cloud Computing, especially those we could not achieve, or could not achieve easily, before the advent of the Cloud.