Wednesday, December 21, 2011

Leveraging Security Metrics To Protect Your Network


Maybe we should just give up trying to maintain secure enterprise networks; it’s just too hard. Fully 71% of respondents admitted that their networks are exposed to external threats due to misconfiguration issues present in their security device infrastructure. Verizon reports that 79% of organizations fail to maintain their PCI compliance from their prior year’s assessment to the next year’s Initial Report on Compliance. More than 50 percent had no idea how many of their organizations’ internal hosts were actually exposed to the Internet. 

We know that even in this era of constrained budgets, enterprises are spending more and more on network security—and yet 75% of network and security pros agree that the advantage is still on the side of the attacker. Verizon reports that security “erosion” over the course of the year between PCI audits occurs at the vast majority of enterprises, despite the fact that we know there's a correlation between data breaches and lack of PCI compliance.

Maybe it’s time to re-evaluate our priorities. As Dr. Mike points out, there’s a general consensus that much can be gained by focusing on the basics—the core controls. If you’re covering 90% of the core controls, security pros agree it’s better to put effort into getting to 100% rather than expanding the number of controls.

But if you're focused on the core controls, how do you know what percentage you've actually covered, and where the areas of exposure are? That's where security metrics come in.

In this case, we're referring to actionable security metrics—metrics that provide proactive security intelligence. Many metrics are available to security pros: the number of patches applied, the number of vulnerabilities found, and the number of firewall and router config changes are typical examples. But most of these data points lack context, or simply serve as busyness measures. They don't characterize risk in a meaningful way, nor do they point toward a specific area that needs attention.
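
To make the distinction concrete, here is a minimal sketch (using an invented, hypothetical data model, not any particular product's schema) that contrasts a busyness measure with a metric that actually characterizes exposure:

    from dataclasses import dataclass

    @dataclass
    class Host:
        name: str
        internet_exposed: bool   # reachable from untrusted networks
        critical_vulns: int      # unpatched critical vulnerabilities
        patches_applied: int     # patches applied this month

    def busyness_measure(hosts):
        """Raw activity count: says nothing about remaining risk."""
        return sum(h.patches_applied for h in hosts)

    def exposed_critical_hosts(hosts):
        """Hosts both reachable from the Internet and carrying critical
        vulnerabilities: a number someone can act on today."""
        return sum(1 for h in hosts if h.internet_exposed and h.critical_vulns > 0)

    hosts = [Host("web01", True, 2, 14), Host("db01", False, 5, 3)]
    print(busyness_measure(hosts))        # 17 patches applied, but so what?
    print(exposed_critical_hosts(hosts))  # 1 host needs attention now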

Andrew Jaquith, in his book Security Metrics: Replacing Fear, Uncertainty and Doubt, describes the value of security metrics by contrasting security with other business disciplines. For example, freight companies know their freight cost per mile and loading factors, as well as those of their competitors. Management can therefore set meaningful objectives and measure themselves against comparable companies. Choosing to be above, on, or below an industry average is a question of strategy as well as operational efficiency. For example, a freight company may be willing to have a lower load factor than its peers if that's the tradeoff required to offer faster delivery times (for which it presumably charges a premium).

Similarly, warehousing firms measure and compare their cost/square foot and inventory turns, and e-commerce companies measure their website conversion rates. And of course financial metrics have been standardized and reported on for years. Companies can therefore compare relevant metrics to those of their peers in order to better evaluate their internal performance.

Could such a use of metrics apply to security? Yes, but only if consistently generated within the context of a security framework.

The three pillars of security are visualize, comply and protect. If we build a framework on those pillars we’ll be able to generate meaningful security metrics.

Visualize: There is wisdom in Requirement 1 of the PCI DSS, in the section entitled “Build and Maintain a Secure Network”: the requirement is to create a network diagram, and keep it current. Why? You can't secure what you can't see. And yet, according to Verizon, Requirement 1 has the second-highest erosion factor of the nine requirements not specific to planning and checking. When security pros can visualize the network topology—including groups that clearly identify zones (such as the DMZ) and untrusted sources—they become much more effective at creating segmentation strategies and policies, and at maintaining compliance.

Comply: Compliance refers to PCI, FINRA, FFIEC, SOX and other regulatory frameworks, of course, but also to internal policies and to best practices from sources such as SANS' 20 Critical Security Controls, Version 3.0. However, complying with regulatory and internal policies is in most cases open loop: we perform security measures in an effort to comply, but apart from periodic audits we're mostly in the dark as to how effective our security controls are. We need to get from open loop security frameworks to closed loop frameworks with feedback controls that allow us to make continuous adjustments in the presence of security erosion, as shown in the diagram below:


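In code terms, such a feedback loop might look something like the following sketch; every function name here is a hypothetical stand-in, not a real product API:

    def check_controls():
        """Return {control_name: passing?} for the core controls.
        Hypothetical stand-in for scans, config audits, and device checks."""
        return {"firewall_rule_review": True,
                "network_diagram_current": False,
                "default_passwords_removed": True}

    def remediate(failing):
        """Placeholder remediation step: in practice, open tickets or push changes."""
        for control in failing:
            print("remediation needed:", control)

    def closed_loop_cycle():
        results = check_controls()                  # measure
        failing = [c for c, ok in results.items() if not ok]
        if failing:
            remediate(failing)                      # adjust
        return 1 - len(failing) / len(results)      # feedback for the next cycle

    print("control coverage: {:.0%}".format(closed_loop_cycle()))
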
Protect: The fundamental security question is whether the network is protected. How can we know what's working, and where additional focus is required? By developing a security framework that provides security metrics: feedback controls from which effective remediation strategies for security erosion can be devised. Security metrics enable enterprises to answer questions such as:
  • What is my overall level of risk, and how does it compare to yesterday, last week, last month and last year?
  • How easily can attackers get in?
  • How big is my attack surface?
  • How much of my infrastructure is undocumented?
  • Are investments and actions paying off?
  • Where do we need to improve?
  • Are we ready for our next audit?
Note that the questions above relate to actual network security, unlike, say, how many hosts were patched in the last month (busyness measure) or how many vulnerabilities are being scanned for (no context).
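
As an illustration, trend reporting for one such metric (attack surface expressed as the number of hosts directly exposed) could be as simple as the sketch below; the snapshot data is invented:

    # Metric snapshots by date (invented data): hosts directly exposed.
    history = {
        "2011-12-19": 42,
        "2011-12-20": 38,
        "2011-12-21": 31,
    }

    dates = sorted(history)
    today, yesterday = history[dates[-1]], history[dates[-2]]
    print("hosts directly exposed: {} ({:+d} vs. yesterday)".format(
        today, today - yesterday))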

Are these good security metrics? Let's look at Andrew Jaquith's definition of a good metric:
  1. consistently measured, without subjective criteria;
  2. cheap to gather, preferably in an automated way;
  3. expressed as a cardinal number or percentage, not with qualitative labels such as high, medium and low;
  4. expressed using at least one unit of measure, such as "number of hosts directly exposed"; and
  5. contextually specific—relevant enough to decision-makers so that they can take action.
The security metrics provided in RedSeal 5 satisfy all of Jaquith's criteria for good metrics, enabling RedSeal's customers to continuously monitor their network through a closed loop process and address problem areas—and in doing so protect their organization's hosts and other sensitive assets.

Friday, February 11, 2011

"Night Dragon" Latest Reported Advanced Persistent Threat

Advanced persistent threats, when detected, are rarely publicly reported. Government agencies and enterprises that may have had sensitive data exfiltrated are reluctant to admit it, all the more so because they are unlikely to know precisely what assets were stolen. That's what makes McAfee's announcement of the so-called Night Dragon exploit noteworthy.

It's been a year since McAfee aired details of Operation Aurora, an advanced persistent threat (APT) that targeted at least 30 companies and organizations -- notably including Google, who publicly linked the exploit to China.

George Kurtz, CTO at McAfee, writes in his blog:
Starting in November 2009, covert cyberattacks were launched against several global oil, energy, and petrochemical companies. The attackers targeted proprietary operations and project-financing information on oil and gas field bids and operations. This information is highly sensitive and can make or break multibillion dollar deals in this extremely competitive industry.
McAfee has identified the tools, techniques, and network activities used in these attacks, which continue on to this day. These attacks have involved an elaborate mix of hacking techniques including social engineering, spear-phishing, Windows exploits, Active Directory compromises, and the use of remote administration tools (RATs).
 McAfee provided the graphic below to outline the stages of the attack:


The data accessed by the attackers included operational oil and gas field production systems, financial documents related to field exploration and bidding, and data from SCADA systems.

No one knows how many additional exploits are silently underway, exfiltrating sensitive data, intellectual property and state secrets. What's clear is that the current generation of tools to detect and defend against such attacks is inadequate for preventing such breaches.

Tuesday, February 8, 2011

Controlling Excessive Entitlements

Deloitte, in their 2010 Financial Services Global Security Study, reports that excessive entitlements, also known as excessive access rights, was the top audit finding over the past year -- for the third year in a row. It's not an isolated issue: according to Deloitte, excessive entitlements was the top audit finding  in retail and commercial banking, insurance, investment banking, and globally across all financial service segments.

Since all major regulatory frameworks, including SOX, PCI DSS, GLBA, NERC and HIPAA, require entitlement controls, many thousands of companies globally are obligated to prevent excessive entitlements and yet, according to the Deloitte survey, have failed to effectively do so.

IDC states that up to 60% of entitlements on most systems are expired and therefore dormant. It's no wonder that auditors can readily uncover excessive entitlements.

Contrast that with entitlements managed by online billing systems, where typically 0% of entitlements are dormant. What's the difference? Why are billing systems able to manage their entitlements effectively, while enterprise IT departments cannot?

The answer? Money.

Billing systems turn entitlements on or off based on payment activity. If an end user stops paying for any reason, the billing system notifies the client company and the associated product or service is no longer made available. If it were not so, the company would lose money by providing products or services for which there is no associated revenue -- in other words, operating at a loss. Because they have a financial incentive to get it right, these companies manage entitlements effectively.

Now consider financial services enterprises. When users are transferred from one department to another, or are assigned new roles in the company, they often retain their legacy entitlements through a transition period for support and training purposes. It's safer to keep these entitlements in case questions come up regarding the prior role. But no real incentive exists for end users to later relinquish their now-excessive entitlements, and these entitlements often fall through the cracks of IT and compliance tracking systems. An enterprise may spend hundreds of thousands if not millions of dollars on entitlement management systems. But with up to 60% of accounts in a dormant state, the challenge is simply too great without having line-of-business managers and IT staff spend an unreasonable amount of time trying to stay on top of the issue. As a result, most enterprises have found that effectively managing entitlements and access controls is simply not possible.

Financial incentives work, as demonstrated by online billing systems. So why not try that approach in large enterprises? Considering the risk to the business from failed audits, it's time to think outside of the box. So here's an idea:

What if every user received a payroll deduction for every entitlement that goes unused for a certain period, say 60 days? The "fine" amount goes into a reserved account, and is refunded once the entitlement is relinquished. This establishes a gentle but real incentive for end users -- not IT, not the compliance group, and certainly not HR -- to manage entitlements. By putting the issue into the hands of the only people who know whether their entitlements are required to perform their job functions, and underpinning it with a mechanism that ensures visibility and remediation, the problem of excessive entitlements could be solved once and for all.
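
Here's a rough sketch of how such a holdback might be computed; the dormancy window and fine amount are purely illustrative values, not recommendations:

    from datetime import date, timedelta

    DORMANCY_LIMIT = timedelta(days=60)    # illustrative threshold
    FINE_PER_ENTITLEMENT = 25.00           # held in reserve, refunded on cleanup

    def is_dormant(entitlement, today):
        return today - entitlement["last_used"] > DORMANCY_LIMIT

    def payroll_holdback(user_entitlements, today):
        """Amount to hold from payroll for this user's dormant entitlements;
        each 'fine' is refunded once the entitlement is relinquished."""
        return sum(FINE_PER_ENTITLEMENT
                   for e in user_entitlements if is_dormant(e, today))

    entitlements = [
        {"name": "trading-app-admin", "last_used": date(2010, 10, 1)},  # dormant
        {"name": "email",             "last_used": date(2011, 2, 7)},   # in use
    ]
    print(payroll_holdback(entitlements, date(2011, 2, 8)))  # 25.0 held until cleanup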

Monday, February 7, 2011

Advanced Persistent Threats

Security trends tend to focus on technology: terminology such as malware (on the exploit side) and data leakage protection (on the security solutions side) describe the issue in terms of their most salient technical characteristics. Botnets, drive-by downloads, and Trojan horses add further color to the technical aspects of key security threats.

The “who” behind these security threats is generally thought to be less interesting. Yes, we think we know that certain botnets are controlled by the Russian mafia, and certain exploits tend to be perpetrated by insiders. But it's the technology behind these threats that we in the high-tech security business use to identify them and to devise their remediation.

Advanced Persistent Threats, recently made trendy by security vendors' marketing departments, seem fundamentally different. If you look at technical descriptions of advanced persistent threats (APTs), you will have trouble distinguishing them from botnets. FireEye, for example, describes various command and control systems that bots and APTs have in common. Shared characteristics between botnets and APTs include stealth, polymorphism (continuously altering malware as it goes from host to host), and automatic updating (including new malware and even patches to protect against rival botnets).

What differentiates APT from most botnets and other security threats is the “who”: the APT exploit tends to be targeted, and brings to bear resources (and patience) indicative of a well-funded actor – most often a nation state. In fact, Greg Hoglund, CEO of HBGary, says APT is a nice way to not have to say "Chinese state-sponsored threat." Attacks against Google and the U.S. DoD thought to have originated in China would seem to support this definition.

Michael K. Daly of Raytheon, speaking at LISA’09, defines APT more broadly, as increasingly sophisticated cyber attacks by hostile organizations with the goal of:
  1. Gaining access to defense, financial and other targeted information from governments, corporations and individuals.
  2. Maintaining a foothold in these environments to enable future use and control.
  3. Modifying data to disrupt performance in their targets.
But Eddie Schwartz, chief security officer at NetWitness, disagrees that modifying data to disrupt their targets is a universal APT trait: "A real APT never really damages anything. They tweak a log file here and there ... They are stealing stuff, but you still have your copy. You never see them taint it," he says.

There is no question as to the level of sophistication involved, nor as to the value of the assets under siege. Raytheon presents a hypothetical but representative scenario in the diagram below, showing multiple stages, multiple teams, extraordinary stealth and patience, and the exfiltration of well-protected and valuable information assets:


Stage 0 in the diagram above is the "Infection" that gains an initial foothold. How do these infections occur? Damballa points out that APTs can breach target organizations through a wide variety of vectors -- even in the presence of properly designed and maintained defense-in-depth strategies, as shown in the diagram below:


Well-funded APT perpetrators also have the means to recruit or compromise insiders, in addition to the external vectors shown above. Additional "insider threat" and "trusted connection" vectors are shown below:


Advanced persistent threats are in the news these days, and many security vendors are going to great pains to explain how their product (or more likely, the next greatest release of their product) is the ideal solution. But most experts agree that the organizations perpetrating APTs are well-funded, determined, and willing to take as long as necessary to preserve their covert activities. Is it likely that such unique security threats can be adequately addressed by the same technology that was originally developed to solve a different problem? Stay tuned for emerging start-ups such as Cyphort that bring a radically different approach to detecting and remediating advanced persistent threats.

Thursday, January 27, 2011

Access Controls, Then and Now


For the past two years I've been telling anyone who will listen that ineffective IT access controls represent an ongoing security vulnerability as well as a compliance liability for many regulated firms. The Ponemon Institute has published a survey that not only confirms what I've been saying, but shows that it's getting worse. What a surprise.

Here's how Ponemon summarizes the problem:

When employees, temporary employees, contractors and partners have inappropriate access to information resources -- that is, access that violates security policies and regulations or that is inappropriate for their current jobs -- companies are subject to serious compliance and business risks.


Fair enough. But many enterprises and security-conscious organizations have a "least privilege" policy to ensure that, as regulations and best practices require, users are provided access to ONLY those resources for which they have a legitimate business need. Doesn't that prevent the inappropriate access referred to above?

Not really. Although least privilege sounds simple enough, in practice it has proven extraordinarily difficult to achieve. This is especially true in dynamic enterprise environments, where activities related to onboarding, offboarding, outsourcing, partnering, and use of contractors threaten to overwhelm whatever business processes exist. These challenges are exacerbated by the coordination required between line-of-business managers, IT staff, HR, security, and compliance staff to manage access controls. In fact, Bruce Schneier, a prominent security guru, states unequivocally that perfect access control just isn't possible.
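
To see how excess access creeps in, consider a toy least-privilege check; the role definitions below are hypothetical, and real enterprises have thousands of users and far messier role models:

    # Hypothetical role definitions for illustration only.
    ROLE_ENTITLEMENTS = {
        "teller":       {"branch-app", "email"},
        "loan-officer": {"loan-system", "credit-reports", "email"},
    }

    def excess_entitlements(role, actual_entitlements):
        """Entitlements the user holds beyond what the current role requires."""
        return set(actual_entitlements) - ROLE_ENTITLEMENTS[role]

    # A teller who transferred from the loan desk and kept legacy access:
    print(excess_entitlements("teller", {"branch-app", "email", "loan-system"}))
    # -> {'loan-system'}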

Schneier must be on to something. The Ponemon survey, sponsored by Aveksa, found that most relevant metrics for access management are trending down. Here are the top two findings:
  • User access rights continue to be poorly managed. Eighty-seven percent of respondents believe that individuals have too much access to information resources that are not pertinent to their job description - up nine percent from the 2008 study.
  • Organizations are not able to keep pace with changes to users' job responsibilities and they face serious noncompliance and business risk as a result. Nearly three out of four organizations (72 percent) said they cannot quickly respond to changes in employee access requirements; and more than half (52 percent) reported that they are unable to keep pace with the number of access change requests that come in on a regular basis.
What's at risk when access controls are ineffective? Survey respondents' concern was highest for company applications, intellectual property and general business information. Not to mention audit findings.

So what's the primary cause of poor performance in IT access management? A plurality of respondents say "We cannot keep up with our organization's information resources."  This is consistent with Schneier's observation that organizations are simply too chaotic to make it work. So what should be done?

According to the IAM experts, this is where access certification comes in. Here's what Aveksa has to say about access certification:

Good access governance requires the regular review and certification of user entitlements and roles to ensure that access rights to enterprise information assets are appropriate and meet regulatory mandates and guidelines for Sarbanes Oxley, PCI, GLBA, MAR, FERC/NERC, Basel II and HIPAA compliance.  


Many IAM solution providers have integrated modules to help you with your access certification. The problem is, this level of certification -- while important -- involves a review of the rather complicated matrix of staff and roles/entitlement assignments that have overwhelmed organizations in the first place. 

It's not as if organizations don't know they have probable vulnerabilities: the vast majority say it's "likely" that users are over-entitled.

Here's what we can conclude: Organizations suspect that their users have more access than is required, a clear violation of compliance regulations as well as a security risk. And auditors have confirmed their worst fears, as excessive access rights have remained the top audit finding for years. So we know that organizations are motivated to solve this problem. But despite the availability of comprehensive role-based access control IAM systems, regulated enterprises apparently still do not have the right tools to manage access controls. What they are missing is feedback that quantifies the effectiveness of their access controls.
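
Such a feedback signal could be as simple as a single rate, trended over time. The sketch below uses invented entitlement data purely for illustration:

    def excess_access_rate(users):
        """Fraction of all granted entitlements not justified by the holder's role."""
        granted = sum(len(u["actual"]) for u in users)
        excess = sum(len(set(u["actual"]) - set(u["required"])) for u in users)
        return excess / granted if granted else 0.0

    users = [
        {"actual": ["email", "crm", "payroll-admin"], "required": ["email", "crm"]},
        {"actual": ["email"],                         "required": ["email"]},
    ]
    print("excess access rate: {:.0%}".format(excess_access_rate(users)))  # 25%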

Current approaches have obviously failed to achieve the desired and necessary level of security and compliance. That's why Cloud Compliance, my prior company, was formed -- to address this and related access audit issues through an innovative SaaS-based capability called Identity and Access Assessment (IdAA). Cloud Compliance provided visibility into not just who is accessing what, but who should have access to what. And when excessive access rights inevitably occurred, Cloud Compliance analytics would help determine the root cause and effective remediation strategies.

Saturday, January 1, 2011

Schubert

My favorite Schubert piano sonata is # 14 in A minor, D.784 (played by Mitsuko Uchida, a piano goddess). It starts by gently probing in the far reaches of our soul, asking ineffable questions that are of the sort one might ponder between dreams. Gradually we are drawn into the A minor universe, rising and falling on the swells of Schubert’s growing tempest. Through the first two movements the dialog progresses as a series of rising storms, sublime wind and currents that dance around themes noble and eternal—separated by interludes of sunlight, not just illumination but light that warms our hearts and enlightens our heads. Urgently and inexorably the melody pushes forward, increasing tension until it can increase no more and then, like a crossbow pulled back one more notch—is it possible?—and then another, and yet another! Finally the third and final movement (allegro vivace) resolves all the built-up tension, thunder and crossbow bolts filling the air with color, pulsing in strict accordance with the inexorable rhythm of the universe, and just as we bring ourselves into confident sync there’s the briefest pause—almost imperceptible—where the force behind the tides of the oceans and orbits of the planets gathers itself for the ecstatic finale. Somehow we’ve journeyed to the far reaches in just under 24 minutes, returning cleansed, fulfilled. I love Schubert's music.

I didn't really know much about Schubert until a few years ago. And I wasn't really attracted to classical piano music other than the odd concerto. It seemed too boring compared to instruments that appeal directly to the ear, such as the violin, which when expertly played can bring an audience to tears with a single note. The plaintive tone of an oboe, the rich warmth of the cello, the energy and passion of the brass all strike deeply within, whereas the piano seemed to just offer notes. But, inspired by Thomas Mann (Doctor Faustus, chapter VIII), I decided to try again to appreciate the piano—the one instrument that, unlike all others, reaches beyond the senses, where what is heard is the noble, intellectual content of the music. Soon I had 10 hours of Beethoven and 9 hours of Schubert piano sonatas on my iPod.

How to deal with so much new music? With Beethoven, it was easy. Of his 32 piano sonatas, 8 or 9 of them became popular enough to have been named (Moonlight, Waldstein, Appassionata, etc.). So I focused on listening to and understanding the named Beethoven piano sonatas as a start.

Schubert was more difficult. I didn't know where to start, and he didn't have a list of named sonatas to work with. And so, one Saturday while Jo was in PA and I was working at home all day, I listened to all 9 hours of Schubert piano sonatas. When I heard a theme or phrase I particularly liked, I wrote down the sonata that was playing. At the end of the day I had four Schubert piano sonatas to start with.

How do we learn to like pieces of music? For me, the only way is repetition. It takes at least 3 and sometimes 5 or more hearings before I reach any level of familiarity with any but the simplest tunes. And while we're at it, what is it about some music that we like and other music that we're not attracted to? In “This Is Your Brain On Music” the author (Daniel J. Levitin) makes the case that one of the attributes of music that sophisticated listeners find pleasing is its complexity (within the constraints that make it music rather than noise, such as timbre, tempo, etc.). While it's true that such a theory explains why repeated hearings are required to fully embrace a piece of music, on the whole I found that explanation unsatisfying. The opening bars of Beethoven's Moonlight Sonata are anything but complex, yet we're attracted to them nonetheless.

It seems to me there are at least two elements of satisfying music: its beauty, and how deeply it touches us, or moves us. And I would think that individuals with different tastes are more likely to agree on the beauty of a piece of music based on its having a pleasing melody along with well-regulated harmony, structure and tempo as per prevailing forms.

But what is it in music that moves us? Personally, for example, I find overwhelming beauty in Bach. I love the St Matthew Passion, the Mass in B minor, the Goldberg Variations, the Musical Offering, the Cello Suites, and others—and listen to them often. But Bach rarely moves me. Same with Mozart; there's beauty, but not much in the way of passion. But Beethoven, Brahms and Schubert do indeed move me with their beautiful music. Why is that? And why is it that someone else might be moved by Bach and Mozart, but not Schubert? Dr. Oliver Sacks researches this very topic from a neurological point of view, and shows various portions of the brain “lighting up” more when listening to music that moves us (in Dr. Sacks' case, that's Bach). But I suspect the neurological view is more of the “what” than the “why”. Sacks touches on this when he suggests that music is able to reach the oldest, pre-verbal portions of our brain and thus elicit a primal response.

I started playing the four Schubert piano sonatas that somehow made an impression the first time I heard them—sonata #20 in A, sonata #7 in E-flat, and sonata #142 (which, published posthumously, is actually a collection of four impromptus) along with sonata #14 in A minor referred to above. And after listening to them a few times, I found myself drawn to them more and more strongly. I discovered that Schubert’s piano sonatas had the ability to transport me in a way that other pieces could not. I went back and selected other Schubert sonatas to listen to, and my collection of “moving” Schubert piano sonatas began to grow: I’ve now got about 7 or 8 that I listen to on a regular basis.

The following year I got some Schubert chamber music. Now I have added to my collection of Schubert favorites his “Trout” piano quintet, several string quartets (including “Rosamunde” and “Death and the Maiden”) and the famous Cello Quintet (also published posthumously—he died young—and cited by Wikipedia as deeply sublime, with moments of unique transcendental beauty, and the “high point in the entire chamber repertoire”). In the documentary "Music From the Inside Out", Philadelphia Orchestra concertmaster David Kim says the best thing about his career now that he's no longer performing by himself as a traveling violin virtuoso is that he gets to play the Schubert Cello Quintet in a chamber group, which he could never do before.

Schubert’s liturgical music is beautiful, especially his masses; my favorite mass is Schubert’s Mass in E-flat major, although Beethoven's Missa Solemnis, Kodály's Missa Brevis, and of course Bach’s Mass in B minor are favorites as well.

While Schubert in general seems to move me the most, I have found other pieces that do as well: Brahms' cello sonata #1, piano quintet in F minor, string quintet in G, and his sacred choral music; Beethoven's piano sonata favorites include Moonlight, Waldstein, Appassionata, Tempest and Hammerklavier; I also like his violin sonatas, especially the Frühlingssonate and Kreutzer, and his string quartet in F, op. 135. And among the Russians I am especially moved by Tchaikovsky's Rembrandt Trio and Rachmaninov's cello sonata and piano concerto #2.

But mostly it's Schubert. He left a fairly large body of work considering the fact that he died young (at age 31). He was buried next to Beethoven, whom he greatly admired and who had died the previous year. Many of his manuscripts weren't found until after he died, and his popularity increased gradually as Robert Schumann and Franz Liszt, among others, transcribed, arranged and promoted his work. On the 100th anniversary of Schubert's birth in 1897, Vienna celebrated with ten days of Schubert concerts. Imagine that!