Evolution of Defense in Depth


As security professionals will tell you, one of the basic principles of a good security program is the concept of Defense in Depth.  Defense in Depth is arguably the most time-tested principle in security, and it applies to physical security as well as information security.  Defense in Depth builds on the concept of a hardened “core”, where one places the “crown jewels”.  That core is then surrounded by castle walls and moats, with each successive layer defending against ever more general threats.

Defense in Depth is a great concept, but it comes at a price.  Just as the area covered grows wider from layer to layer, so does the cost of protecting against more plentiful and less specific threats.  A firewall, for example, which typically sits at the outermost layer on the enterprise perimeter, has to protect against a great many varieties of threats, while a server-room door has to be concerned “only” with physical access.

[Image: The Server Room in the Center of the Castle]

Another flaw in the Defense in Depth design is how difficult it is to implement vis-à-vis the three basic tenets of security: Confidentiality, Integrity, and Availability.   Why?   Because most forms of defense increase Confidentiality, but make Integrity more difficult to implement and manage.  And any increase in defense, of course, makes Availability that much harder to provide to the users.

A difficulty that I myself have encountered many times is the applicability of Defense in Depth to my “layer 8” problem – the users. If users are not trained properly, if they are not aware of information-protection needs, methods, and the “why?” of it, they become a liability, rather than an asset, to data security.  If you are like me, you find the need to increase our moat-to-user ratio on an ongoing basis ever harder to design, implement, manage, and pay for.   Many of us resign ourselves to the proverbial “this is reality” and define our demarcation line as a physical device, such as a router, an access point, a firewall, or a web server.  There are potentially two things “wrong” with doing so:

  1. We are basically saying, “we are a target just waiting to be attacked,” and
  2. We allow most barbarians (in the form of rogue traffic, networks, and devices) to reach our gates

If we continue to do so, we approach a mathematical certainty of being hacked, or at least DDoS’ed off the Net.   I really prefer NOT to draw analogies here to the real world, and we all know what those would be.

Not only is the problem above big enough to cause some to lose sleep, but imagine what happens when we move to a Cloud topology… there we have nothing but moats and walls and front doors.   These front doors can be any browser, on any device, anywhere in the world.   How do you protect yourself against that?   Speaking of losing sleep – I love coffee, but this is ridiculous.

[Image: Clouds, Doors, and Windows. Source: desktopnexus.com (heavily edited)]

 

Because the solution may have to cover our entire user set, which can include Internet users rather than purely corporate users, any solution must be:

  1. Easy to teach (i.e. close-to-zero learning curve)
  2. Easy to implement
  3. Applicable to the widest range of platforms possible
  4. Have a small delivery and storage footprint
  5. Easy to manage and maintain

Not asking for much, am I? 

Knowing how rapidly threats evolve “in the wild”, I also want a tool that does not go the normal “black listing” route.  I am more and more convinced that, in our world of security, we need tools that no longer compare bad signatures or behavior against a database (which is how most antivirus products and firewalls act, for example); we need to go the “white-list” route.   I will write about that in the future.   Yes, I want a tool that is controlled by me and lets me choose which domains can be accessed, and under what (time or other) conditions such access can occur.  Let’s add those to my “dream list” (a minimal sketch of the idea follows the short list below):

  1. White list based
  2. Conditional access
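
To make the idea concrete, here is a minimal sketch of what a white-list-based, conditional-access check could look like. The domain names, time windows, and function names are my own illustrative assumptions, not a description of any particular product:

```python
from datetime import datetime, time
from typing import Optional

# Hypothetical whitelist: only these domains may be reached, and only
# within each domain's allowed time window (illustrative values only).
ALLOWED_DOMAINS = {
    "portal.example.com": (time(7, 0), time(19, 0)),   # business hours only
    "mail.example.com":   (time(0, 0), time(23, 59)),  # around the clock
}

def access_permitted(domain: str, when: Optional[datetime] = None) -> bool:
    """Allow access only if the domain is whitelisted AND the request
    falls inside that domain's permitted time window."""
    when = when or datetime.now()
    window = ALLOWED_DOMAINS.get(domain)
    if window is None:          # not on the whitelist -> deny by default
        return False
    start, end = window
    return start <= when.time() <= end

# Example: an unlisted domain is denied outright; a listed one depends on the hour.
print(access_permitted("payroll.example.com"))   # False
print(access_permitted("portal.example.com"))    # True or False, by time of day
```

The key design choice is deny-by-default: anything not explicitly on the list is refused, which is the opposite of the black-list approach described above.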

To make matters even more interesting, I want control over certain user functions.  (We want, after all, to reduce the number of barbarians and the number of roads leading to our castle, don’t we?)  We want to make sure that the people who request a resource are authorized even to request it.

For example, I would like some files to be readable and writable, but not printable.  Or I would like to be able to control the launching of certain tools, such as IM clients or browsers, from within the session.   And finally (?), I want a bulletproof audit trail.  Why?  SOX, GLBA, and HIPAA, to name but a few.  Two more items for the list, then (a sketch of how they might fit together follows the list):

  1. Selective access to file functions
  2. Audit trail
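
Here is a minimal sketch of how those last two items could work together: a per-session policy that allows or denies individual file functions, with an audit record written for every decision, allowed or denied. The policy fields, user name, and file name are hypothetical, for illustration only:

```python
import json
import logging
from datetime import datetime, timezone

# Illustrative per-session policy: which functions the user may invoke.
SESSION_POLICY = {
    "read": True,
    "write": True,
    "print": False,        # readable and writable, but not printable
    "launch_im": False,    # no instant-messaging client from the session
}

logging.basicConfig(level=logging.INFO, format="%(message)s")
audit = logging.getLogger("session.audit")

def request_action(user: str, action: str, resource: str) -> bool:
    """Check the action against the session policy and write an audit
    record either way, so every allowed and denied request leaves a trail."""
    allowed = SESSION_POLICY.get(action, False)   # deny anything unlisted
    audit.info(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,
        "resource": resource,
        "allowed": allowed,
    }))
    return allowed

# Example: reading succeeds, printing is refused, and both are logged.
request_action("asilverstone", "read", "q3-forecast.xlsx")
request_action("asilverstone", "print", "q3-forecast.xlsx")
```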

 

What should we do?

Until now, I had not seen any solution to this quandary.  Other than awareness and training, there was not a whole lot that could be done.   Even MSSPs will tell you: they are there for a reason, and that reason is “people will attack you”.

Thanks to my friend Andreas Wuchner, the CISO of Novartis, I ran (head first, mind you!) into a newly launched company called Quaresso.  Founded by a group of smart people with backgrounds in networking and security, Quaresso has created, in “Protect OnQ”, both a new product and a service.  Working together, these allow us to do a few things:

  • Firstly, they allow us, the people responsible for protecting the data, to select who will be allowed to knock on our doors, and with what.  Simply put, if you so choose, people without the proper tool will not even be allowed access to your castle.  “Not allowed on the island”, if you will.  And this permission is manageable in real time.  (A rough sketch of this gating idea appears after this list.)
  • Then, you can select not only which browser is allowed to knock at your door, but also what that browser is (and is NOT) allowed to contain: add-ins, plug-ins, encryption settings, printing ability (or not), security-zone settings, and the list goes on.  This effectively extends defense in depth to the actual browser session!
  • If that were not enough, you are able to control THE ROUTE that your users take to reach you.  While it may seem unimportant or even impossible, controlling a browser’s allowed connections can protect against man-in-the-middle attacks and DNS hijacking, to name just two examples; knowledge and selection of the route are critical to preventing both.
  • Zero-day (zero minute, really) malware protection – if it is not known, it does not get transported.  Simple and neat! 
  • And the final cherry on the icing?  Remember all those viruses, trojans, key loggers, and co.?  Thanks to the implementation of the “armored” browser, data can no longer leak from it to the rest of the operating system.   All passwords and personal information typed into a protected browser session remain confidential and un-recordable.   I know I will sleep better.
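
The wish-list comparison below mentions a simple “check-if-present” mechanism that lets a web application become On-Q aware. I do not know how the vendor actually implements it; the sketch below only illustrates the general idea of refusing to serve requests that arrive without evidence of the approved client, using a hypothetical request header and a small Flask application as the stand-in:

```python
# A rough sketch of gating an application on evidence that the approved
# client is present. The header name and token check are hypothetical;
# they stand in for whatever "check-if-present" handshake a real
# deployment would use.
from flask import Flask, abort, request

app = Flask(__name__)
KNOWN_CLIENT_TOKENS = {"demo-token-123"}   # illustrative value only

@app.before_request
def require_protected_client():
    # Refuse to serve anyone who does not present the expected marker:
    # "not allowed on the island" unless the right client knocks.
    token = request.headers.get("X-Protected-Client")
    if token not in KNOWN_CLIENT_TOKENS:
        abort(403)

@app.route("/payroll")
def payroll():
    return "Sensitive page, reachable only through the approved client."

if __name__ == "__main__":
    app.run()
```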

I tested the tool in several scenarios.    The only drawback seems to be the need to install another icon on the users’ screens.  I particularly loved that, while running the tool with a sniffer on, I could see no data passing from the browser unencrypted.  So much for data leakage via this route!

So…. Let’s compare this tool to my wish list.

  1. Easy to teach (close-to-zero learning curve): Yes.  Being browser based, it basically requires a “click”.
  2. Easy to implement: Yes.  The user downloads an add-on or a link to their desktop and allows it to run; the tool does NOT require admin rights on the installing system.  For web applications, a simple check-if-present mechanism allows the application to be On-Q aware.
  3. Applicable to the widest range of platforms possible: Yes.  The tool, being browser and Java/ActiveX based, can be implemented on most publicly available browsers.  And since those ship with virtually every computer nowadays, the building blocks are already in place.
  4. A small delivery and storage footprint: Yes.  The package I tested was less than 450KB in size.
  5. Easy to manage and maintain: Yes.  They offer a partnering console as a tool to monitor, manage, and update the remote pieces.
  6. White list based: Yes.  It is not only a design philosophy: an administrator from The Bank of Atlantis, for example, can allow use of the tool only for selected systems within a selected domain, if he so chooses.  Nifty.  Imagine allowing remote users to access ONLY a certain system, but not payroll, for example.
  7. Conditional access: Almost.  The domain selectivity is in place and working.  The time/location conditions are not yet implemented and may be, depending on industry demand; for now, this variable is relegated to the accessed system.
  8. Selective access to file functions: Yes, and by two separate mechanisms.  Firstly, the control over which browser add-ons are present allows tools like PDF viewers and key loggers to be excluded.  Secondly, the tool can control the browser’s file operations, so you can, for example, choose whether or not to allow printing remotely (as in, at the user’s site).
  9. Audit trail: Yes.  Extensive auditing is available and, because what I saw was an early product, new reports are being developed continuously.

 

The tool does all of this while requiring a zero learning curve of the users: they use the same browser they are used to and click as they normally would.   No new software, no new directions, nothing.   We have now protected another layer of Defense in Depth and greatly increased our control over who comes knocking at our doors.

Try it and let me know what you think.  

 


Comments on 201 CMR 17.00

 

Readers of my blog know that I was a big supporter of Massachusetts’ proposed breach notification law, 201 CMR 17.00.     You may also know that I authored an article about the rule, together with Ken Mortensen, Esq., for CSO Magazine.  You can see the article at "201 CMR 17: A New Tea Party!"

At the time, we thought the proposed rule was a very good development.  But then the Massachusetts Office of Consumer Affairs and Business Regulation, OCABR, changed 201 CMR 17.00 into something a lot more… "watered down".

So, doing what any thoughtful consultant would do, I sent the following letter, setting out my opinions on the latest version of 201 CMR 17.00 and adding my suggestions.

So here is the letter.  Tell me what you think:

 

Office of Consumer Affairs and Business Regulation,
10 Park Plaza, Suite 5170,
Boston, MA 02116,
(by postal mail and email)

 

Sunday, September 7, 2009

 

 

Re: 201 CMR 17.00, Standards for the Protection of Personal Information of Residents of the Commonwealth

Dear Madame Under Secretary, Mr. Egan, and Commission members,

Upon review of the latest (August 24, 2009) revision of the proposed rule, I find that quite a few elements of the regulation have been substantially relaxed. Believing that the intent of the legislature was to create a more effective regulation protecting the privacy and security of the residents of the Commonwealth, I find that the current revision effectively renders such protection non-existent.

Having been in the information security and data privacy industry for over 21 years, and having had my own identity stolen and misused, I find that the realities of the business world are such that certain elements ought to be considered carefully.

That said, the change from the previous version is so massive as to represent an example of how not to regulate, rather than the piece of straightforward and exemplary control that I so enthusiastically wrote about for CSO (Chief Security Officer) magazine.

In the paragraphs that follow, please find the changes I identified, my objections to them, and my suggestions on how to improve the regulation. I would be delighted to testify before your committee and to help update these provisions to enhance the benefit to all residents of the Commonwealth, persons and businesses alike.

 

Comment Number One:

Starting with section 17.01(1), the change removing the emboldened words (… by persons who own, license, store or maintain personal information…) represents a major shift in policy. In the real world, the protection sought by this entire effort is needed at least as much from mid-sized and small companies as from large companies. While smaller companies are more likely to use a third-party provider for their storage and hosting needs, there is no reason why these hosting environments should not be compliant with the demands of this regulation. Conversely, these hosting companies ought to share in the responsibility for protecting personally identifying information (PII).

Further, the current revision actually encourages holders of data sets to use third parties to host the data. What is to stop a company from using this approach and creating a fictitious sub-company, a separate legal entity, whose entire raison d’être would be to hold the data? Doing so would clearly circumvent any protection intended by this regulation. This situation occurs again in the following sections: 17.01(2),

 

Comment Number Two:

Section 17.03 (definitions), in its previous incarnation contained the following verbiage:

Encrypted," transformation of data through the use of a 128-bit or higher algorithmic process, or other means or process approved by the office of consumer affairs and business regulation that is at least as secure as such algorithmic process, into a form in which there is a low probability of assigning meaning without use of a confidential process or key.

In its current form, the verbiage was changed to:

Encrypted, the transformation of data into a form in which meaning cannot be assigned without the use of a confidential process or key.

My comments on these changes are several. First, I applaud taking OCABR out of the business of approving or disapproving encryption; you are right not to be in that particular storm. However, there is a need for encryption, nearly always. There are two choices here – the first is to require encryption no matter what; the second is to define certain data elements that will require encryption. While 128-bit is no longer “good enough” and 256-bit is the current “gold standard”, there are certain standards of encryption that are accepted more or less universally.

 

Comment Number Three:

Please specify that encryption is required.

 

Comment Number Four:

Please specify that “a minimum of 256-bit” is needed AND select from certain industry standards. For example, AES is appropriate for today’s day-and-age, and perhaps for the next four or five years.  

 

Comment Number Five:

Please do not use the term “a low probability”. This term is understood by practitioners in the art, but here it serves to confuse, rather than clarify.

Comment Number Six:

Still within this section, the following appears:

Owns or licenses, receives, maintains, processes, or otherwise has access to personal information in connection with the provision of goods or services or in connection with employment.

This section, and especially the highlighted text, seems to conflict with the first change noted above. Let me request a clarification of this text, as it would seem to include third parties.

 

Comment Number Seven:

Similar to Comment Number Six, above, is the text stating:

Service provider, any person that receives, maintains, processes, or otherwise is permitted access to personal information through its provision of services directly to a person that is subject to this regulation; provided, however, that “Service provider” shall not include the U.S. Postal Service.

This text, and especially the emboldened portion, also seems to conflict with the changes noted in my first comment. What is your true intent?

 

Comment Number Eight:

In addition to Comment Number Seven above, and in full understanding of the need to carve federal agencies out of compliance requirements, I question the naming of the USPS and its subsequent removal. Let me play devil’s advocate here by giving two examples, for two different scenarios that are likely to butt up against this regulation:

  1. Since any state law attempting to regulate a Federal agency is automatically void, is there a need to call out the USPS? Imagine that the US Department of Health and Human Services discovers that it is not excluded from this regulation du jour. What would happen then?
  2. Imagine that UPS or Federal Express, both of which are de facto common carriers, chose to contest this regulation based on the concept of “like service”. How would you defend against such an action?

My suggestion: remove the reference to any Federal agencies. Then decide what you want the data owners to do. It would be perfectly acceptable, in a vein similar to requiring encryption, to require (as one example only) that sensitive data be physically transported only while encrypted. This responsibility should belong to the data owners. I expect many questions regarding this point, and I will be thrilled to assist in any way I can.

 

Comment Number Nine:

Section 17.03 (remainder) drops the requirement related to third parties (in a manner similar to my Comment Number One). It then proceeds to make a bad situation worse by removing the need for organizations that monitor data to be compliant.

The situation becomes much worse, however, if the intent is to drop the requirement to monitor the performance of such a compliance program.

In the real world, there is no substitute for ongoing monitoring of the performance of compliance programs. This should not present an undue burden to small businesses, as long as pragmatism prevails. I recommend clearly demanding a written review and monitoring of such compliance programs.

 

Comment Number Ten:

The changes to section 17.03(2) are very significant and leave the regulation completely without purpose or merit. They run contrary to the national policy of tightening controls, as exemplified by the changes to HIPAA through the addition of the HITECH amendment.

By removing the need for a formal risk assessment, this regulation has just become a best-intent regulation. Risk assessments are the only mechanism, short of guessing, for understanding data privacy, security, and compliance issues. Nowhere in the original document is there a call for a risk assessment to be done just for the sake of compliance with 201 CMR 17.00 – so do not call for one now, but please DO demand that the very important issues brought forth within this regulation be included in such a formal risk assessment.

Even small businesses can conduct (inexpensive) formal risk assessments. Size of the business does not need to affect this requirement.

 

Comment Number Eleven:

By not requiring a person to have overarching responsibility for performance under this regulation, the OCABR risks the “failure has no father” syndrome. In line with most European and US laws and regulations, assigning ownership of the data and of the privacy-protection efforts is of major import. Please reconsider.

 

Comment Number Twelve:

The removal of the call to handle terminated employees’ access, while seemingly in line with the high-level intent of this regulation, is not a good choice. As can be seen from many occurrences this year alone, organizations often “forget” to do so, resulting in major security breaches and incidents – a subject I would be glad to discuss with you, if you wish. Please call out that specific requirement in the next version.

 

Comment Number Thirteen:

Moving the compliance date to March 2012 is not a good move. In essence, a “hunting season” has just been declared, allowing organizations to sign contracts moving the data (as described in my Comment Number One) to third parties. Doing so will leave no one as the responsible party.

 

Comment Number Fourteen:

Further, as far as I can tell, the language today is in effect a license, in perpetuity, never to be compliant, as long as the contract is signed in time. This wide loophole must be addressed.

 

Comment Number Fifteen:

The dual dating of March 1, 2010 and 2012 here is quite confusing.  I asked several legal professionals to read this paragraph, and they all indicated that the writing is confusing. Can this be clarified?

 

Comment Number Sixteen:

Another change emerges from this version. Briefly, the data owner no longer has to take reasonable steps.  As long as the owner has a contract, he or she appears to be in the clear. Surely, that is not the intent here.  

 

Comment Number Seventeen:

The removal of the need for a formal comprehensive information security program (CISP) leaves the regulation open to interpretations of the sort “I had one, in my mind; we all knew of it.” CISPs are a requirement of today’s reality. Please consider adding the requirement for such a written program.

Again, even small businesses can create a formal CISP. The terms “formal”, “comprehensive”, and “written” should not be seen as equivalent to “expensive”. A sample guideline, one that is shared with the small-business owner, can easily be developed.

 

Comment Number Eighteen:

With regard to item (g) herein, the removal of the requirements to keep the fewest records, for the least amount of time, with the least access, goes against the grain of any good privacy program. As can be seen in the AICPA’s Generally Accepted Privacy Principles, and elsewhere, these requirements are essential to the program. They must be called out, explained, and demanded.

 

Comment Number Nineteen:

In my view, the changes called out in Comment Number Eighteen, taken together with the changes that no longer require a risk assessment, make this entire regulation pointless.

No risk assessment + no knowledge + no need for knowledge + no rules of least access = no protection

 

Comment Number Twenty:

The new language, still in section (g), which reads:

(g) Reasonable restrictions upon physical access to records containing personal information, and storage of such records and data in locked facilities, storage areas or containers.

virtually guarantees that this Rule will be visited repeatedly by the courts. The call for “reasonable”-ness has proven unwieldy in every instance in which it has been used. This must be clarified. Other organizations have defined and validated such access rules, and I suggest we use some of the language called out by certain of them to define this paragraph more meaningfully.

 

Comment Number Twenty-One:

The removal from section (j) (old section (k)) of the words “or the potential therefore” has a chilling effect on the regulation. Near-breaches, together with known vulnerabilities, are no longer required to be investigated. This WILL result in substantially less protection for the data. I respectfully request that the words be re-introduced, and perhaps expanded upon.

 

Comment Number Twenty-Two:

Section 17.04’s header has two changes in it. The first was referenced in my Comment Number One. The second change, “…and to the extent technically feasible”, appears redundant and, frankly, nonsensical to me. If it is not feasible, it cannot be done, right? What is risked here is that a business will say, “My experts believed it was not technically feasible.” This is another loophole that must be sealed.

 

Comment Number Twenty-Three:

Section 17.04(1) again re-introduces the word “reasonable”. Please see my Comment Number Twenty for a discussion why this should be changed and clarified.

 

Comment Number Twenty-Four:

In section 17.04(1)(b), I applaud the change from “seven….” to “or use of unique identifier technologies, such as biometrics or token devices;”. OCABR has no need to define secure vs. non-secure.

 

Comment Number Twenty-Five:

In section 17.04(1)(c), the new language, “…and/or format that does not compromise the security of the data they protect;”, presents two difficulties. Firstly, the previous version was correct to demand that passwords not be kept with the data they protect. While that might seem obvious to some, it is not obvious to all.

 

Comment Number Twenty-Six:

Continuing from the above, in that self-same section, the term “format” seems not to include other-than-password methods, such as physical keys or tokens. Let us revise this section to enhance its usability.

 

Comment Number Twenty-Seven:

In section 17.04(2)(b), the newly introduced phrase “that are reasonably designed to maintain the integrity of the security of the access controls;” suffers from the same malady I described in Comment Number Twenty and others. There are well-ratified standards for this; let us use one of them.

 

Comment Number Twenty-Eight:

Section 17.04(4) has several changes. Again, the introduction of “reasonable” is objectionable as detailed several times above.  

 

Comments Number Twenty-Nine and Thirty:

Continuing from the above, and adding 17.04(5): the removal of a single word, “recording”, negates anything good that this Regulation intended. From the trenches: the requirements of recording and regular review are two of the keystones of any security and privacy program. There can be no protection without an audit trail. There can be no protection without a regular audit.

 

Comment Number Thirty-One:

I applaud the addition of the demand for encryption on mobile devices. I suggest a further, in-depth review of what this change means for the use of extremely popular devices such as the iPhone®.

 

Comment Number Thirty-Two:

In section 17.04(6), the word “reasonably” has appeared again. Twice. This is not an actionable standard.  

 

Comment Number Thirty-Three:

Reading on, the phrase “reasonably designed to maintain the integrity of the personal information” is non-implementable, in the sense that only secure operating systems, such as some employed in unique instances by the Department of Defense, can truly be said to be “designed to…”.

 

Comment Number Thirty-Three (continued):

I suggest that the plug-and-patch cycle, which unfortunately is a part of life for system administrators, be handled by calling for the inclusion of a written change-control procedure in the CISP I suggest in Comment Number Seventeen.

 

Comment Number Thirty-Four:

In section 17.04(7), the word “reasonably” has appeared again.  

 

Comment Number Thirty-Five:

The removal of all reference to physical security in section 17.04(9) is wrong. There can be no information protection without physical protection. I request that this be added back into the Regulation and enhanced. There are quite a few guidelines available from well-tested regulations.

Thank you very much for your attention and review.

Sincerely,

Ariel Silverstone, CISSP

 

