The CenturyLink Technology Solutions Blog - Trends in IT Infrastructure

Security: March 2012 Archives

Identity management is a hard problem. It's hard technically, it's hard to get executives and business partners to care about it, and it gets harder when complexities like cloud and mobile are introduced to the environment. In fact, if you look at identity from an economic point of view - and systematically analyze the costs of identity in your environment (including provisioning, access management, access review, etc.) - chances are good that you're spending more on those things, organization-wide, than on just about anything else IT-related.

 

If that doesn't resonate with you, it could be because the costs are hidden - or at least don't show up as a line item in the IT budget. Consider, for example, the cost of provisioning across the enterprise as a whole. The IT side of the issue is probably obvious: you see the impact on staff time, on helpdesk overhead, and so on. But there are other aspects of the problem that don't hit IT and may be far less visible.

 

Consider provisioning of access to specialized or one-off business apps that aren't maintained or supported by the IT department. If the individuals who actually create user accounts are specialists on the business side, their time represents a cost (in many cases a large one) to the organization; but because IT doesn't directly feel that cost, it can go unnoticed. Even when it is noticed, it may not be at the top of the priority list to address.

 

The same is true for access review. If, like many organizations, your access review consists of a periodic report sent to managers along with a request that they evaluate the appropriateness of accounts and user roles, their time costs money - not your money, but someone's. The hours those managers spend on that task don't hit the IT budget, yet they cost the organization significant dollars every year.

 

Lack of ROI Means Money Out of Your Pocket

The point is that identity isn't cheap. Because of that - and because the economic impact may not be directly visible to management - it's often hard to build support for improvements (e.g., automated provisioning, access management, monitoring and other identity-related systems and tools). It's a dilemma: from a security point of view, it's in our best interest to see those tools deployed and improvements made, but fighting the myopia is hard.

 

It's hard to have a conversation like, "We spend X on identity now, but we want to reduce operational costs to X-Y by purchasing a tool with an initial capital outlay of Z. Although the tool will set us back by Z this year, we anticipate it will pay for itself in N years and deliver cumulative savings of Q over the lifetime of the product, which we estimate to be L years."

 

It's hard to argue with reasoning like that (assuming your numbers aren't bogus). But you can't make that case for identity when the relevant variables are spread around the organization.
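
To make the arithmetic behind that pitch concrete, here is a minimal sketch of the payback math. All of the figures are hypothetical placeholders standing in for the X, Y, Z, N, Q and L above - not benchmarks:

```python
# Hypothetical figures -- stand-ins for X, Y, Z, N, Q and L, not benchmarks.
current_annual_spend = 400_000      # X: what identity costs the organization today, per year
annual_savings = 100_000            # Y: operational cost the tool is expected to eliminate, per year
capital_outlay = 250_000            # Z: up-front cost of the tool
product_lifetime_years = 7          # L: expected life of the product, in years

new_annual_spend = current_annual_spend - annual_savings                      # X - Y
payback_years = capital_outlay / annual_savings                               # N
lifetime_savings = annual_savings * product_lifetime_years - capital_outlay   # Q

print(f"Annual spend drops to ${new_annual_spend:,.0f}; "
      f"payback in {payback_years:.1f} years; "
      f"net savings of ${lifetime_savings:,.0f} over {product_lifetime_years} years")
```

The math itself is trivial; the hard part, as the rest of this post argues, is filling in Y with a number anyone will believe.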

 

For security pros it's frustrating, because in almost every case replacing legacy manual processes with an automated system is cheaper, more effective from a security standpoint (reducing human error and ensuring coverage), and better suited to a cloud context. Cloud vendors can (or at least should be able to) consume the output of these tools directly - for example, through support for technologies like SAML, SPML and XACML (to mention a few). But proving it's better without the numbers is like trying to catch the wind.

 

Can You Get Visibility? Maybe Now's a Good Time

So for the pragmatists out there - what can you do?

 

Say you want to build support for an automated identity approach, but you're finding the business case hard to make because of the invisible costs described above. Some will tell you to make stuff up - i.e., "estimate based on the number of accounts created, average review time, etc." As a last resort, this can be a better-than-nothing Hail Mary, but it can also blow up in your face. If the person you're presenting your calculations to spends five minutes reviewing his or her access list and the average you estimated is two hours, it's hard to recover once they've lost confidence in your numbers from the get-go.

 

One strategy to try before resorting to guessing is leveraging existing IT metrics initiatives to collect data organization-wide (that is, beyond IT). Look for current high-profile initiatives to piggyback on. If you're in the federal government, you could include these metrics in your required continuous monitoring strategy - it's a requirement and it's top of everyone's mind, so now is a time when you might get farther than you otherwise would. In the private sector, you could leverage your cloud initiatives themselves by making these metrics part of the questionnaires or data gathering already being done to support a cloud move. In fact, anything that involves data gathering from the business (a BIA, vendor review, auditing, etc.) can be a good place to find data sources to help drive a realistic cost model.

 

The point is that getting good data is a fundamental part of building a useful cost model that accounts for both visible and "hidden" costs. And if you really want to get traction with investments in your identity management program, chances are good you'll need this kind of hard evidence to make progress.
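
As an illustration of where that evidence ends up, here is a minimal sketch of an organization-wide cost model that separates the costs IT already sees from the "hidden" business-side costs described above. The line items, volumes and loaded hourly rate are hypothetical placeholders to be replaced with the data you gather:

```python
# A minimal sketch of an organization-wide identity cost model. The split
# between visible and hidden costs follows the post; every number below is a
# hypothetical placeholder, not a benchmark.

LOADED_HOURLY_RATE = 65  # hypothetical blended cost of one hour of staff time

visible_costs = {
    # costs that already show up in the IT budget
    "helpdesk_password_resets": 5_000 * 0.25 * LOADED_HOURLY_RATE,   # tickets/yr * hours each
    "it_provisioning_staff":    2_000 * 0.50 * LOADED_HOURLY_RATE,   # requests/yr * hours each
}

hidden_costs = {
    # costs borne outside IT and rarely itemized
    "business_side_provisioning": 1_500 * 0.75 * LOADED_HOURLY_RATE,        # one-off apps
    "manager_access_reviews":     400 * 4 * 1.0 * LOADED_HOURLY_RATE,       # managers * reviews/yr * hours
}

total = sum(visible_costs.values()) + sum(hidden_costs.values())
print(f"Visible: ${sum(visible_costs.values()):>10,.0f} per year")
print(f"Hidden:  ${sum(hidden_costs.values()):>10,.0f} per year")
print(f"Total:   ${total:>10,.0f} per year")
```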

 

Ed Moyle is senior security strategist at Savvis, a CenturyLink company.

Three steps for a successful data protection strategy

Data creates stress on organizations like never before. From exploding data growth and multiple legacy tools to regulation, virtualization and ever-present resource constraints, storing and protecting data remains a significant challenge.

 

The terms "data protection" and "backup" are sometimes used as synonyms, but there are distinct differences between them.

 

In its simplest form, "backup" refers to the practice of replicating data and making sure a copy is available for restoration should an unforeseen event make the primary data set unavailable. "Data protection" describes a farther-reaching approach that ensures data is not only available when needed but also secured at every point in its lifecycle. Further, data protection adds the requirement of proving that the data is backed up, secure and available, generally through advanced portals and reporting.
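
To illustrate the "prove it" piece that separates data protection from plain backup, here is a minimal sketch of a report that checks whether each data set has a recent backup, an encrypted copy and a recent restore test. The record fields, thresholds and sample entries are illustrative assumptions, not any particular vendor's portal or API:

```python
# A minimal sketch of a data protection status report over a backup catalog.
# Field names, thresholds and sample records are illustrative assumptions.
from datetime import datetime, timedelta

catalog = [
    {"dataset": "crm-db",      "last_backup": datetime(2012, 3, 30),
     "encrypted": True,        "last_restore_test": datetime(2012, 3, 1)},
    {"dataset": "file-shares", "last_backup": datetime(2012, 2, 10),
     "encrypted": False,       "last_restore_test": None},
]

now = datetime(2012, 3, 30)
MAX_BACKUP_AGE = timedelta(days=1)        # assumed daily backup requirement
MAX_RESTORE_TEST_AGE = timedelta(days=90) # assumed quarterly restore-test requirement

for job in catalog:
    issues = []
    if now - job["last_backup"] > MAX_BACKUP_AGE:
        issues.append("backup is stale")
    if not job["encrypted"]:
        issues.append("copy is not encrypted")
    if job["last_restore_test"] is None or now - job["last_restore_test"] > MAX_RESTORE_TEST_AGE:
        issues.append("no recent restore test")
    status = "OK" if not issues else "; ".join(issues)
    print(f"{job['dataset']:12s} {status}")
```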

 

When our customers approach Savvis about assisting with a data protection strategy, we typically spend as much time defining the goals and requirements as we do designing the solution. With a consultative approach, we are able to offer standard solutions or design fully custom solutions to meet the business objectives.

 

Some common themes have arisen during the many conversations I've had with clients. Generally speaking, I see three steps that have led to many successful data protection strategies:

 

1. Efficient use of technology

The good news is that a wide range of technologies and vendors offer components of data protection. The bad news? No hardware or software vendor offers all of the best facets under a single umbrella. For example, many vendors can offer disk-based backup with snapshots and deduplication but cannot provide portal visibility or long-term tape archiving to drive down costs.

 

Reviewing the options available - disk-based backup, virtual tape libraries, snapshots, clones, geographically dispersed replication, deduplication, compression, client-based, server-based, etc. - can be confusing. As part of a successful strategy, then, it is important to select a provider with deep experience in data protection solutions and an understanding of how to assemble best-of-breed products that scale up functionality while scaling back cost.

 

2. Data segregation

Not all data is created equal. While that statement is slightly tongue in cheek, it's clear that some data is far more critical than other data.

 

What we find in speaking with prospective customers is that they are executing a strategy driven by technology selections made years ago. For example, many organizations place all of their data into a single category and retain it for the period mandated for only a small subset of that data, because their technology of choice dictates the strategy.

 

The "one-size-fits-all" strategy is easy to manage but is inefficient and expensive. Data can be organized into a number of categories, but we find it generally falls into three:

 

  • Regulated - The name says it all: this is data that regulations require to be protected in a certain manner. Data subject to regulations such as PCI, FISMA and HIPAA, to name a few, needs to be protected with the maximum assets available to meet the requirements. This data is nearly always encrypted at rest and in transit and is retained for long periods of time, typically in off-site facilities.
  • Sensitive - This is data that contains organizational intellectual property but isn't necessarily required by law to be protected in a certain manner. It is often an organization's "secret sauce" and should follow the same protection profile as regulated data, but it generally is not kept for the same duration. For example, while regulated data may be kept for seven years, sensitive data is often kept for less than a year.
  • Typical - This data is generated by a company during business-as-usual projects. Often classified as test-and-development data, it can encompass much more. The bottom line is that this data should be retained for the minimum amount of time needed to maintain business continuity. Industry studies typically describe this data as being kept for no more than two weeks.
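
Here is a minimal sketch of how that three-tier segregation might be expressed as a protection policy table. The retention durations echo the examples above; the field names, specific values and lookup helper are purely illustrative:

```python
# A minimal sketch of a three-tier retention/protection policy. Durations echo
# the examples in the post; field names and values are illustrative assumptions.

RETENTION_POLICY = {
    "regulated": {   # PCI, FISMA, HIPAA, etc.
        "retention_days": 7 * 365,   # "kept for seven years"
        "encrypt_at_rest": True,
        "encrypt_in_transit": True,
        "offsite_copy": True,
    },
    "sensitive": {   # intellectual property, the "secret sauce"
        "retention_days": 180,       # "often kept for less than a year"
        "encrypt_at_rest": True,
        "encrypt_in_transit": True,
        "offsite_copy": False,
    },
    "typical": {     # test-and-development, business-as-usual data
        "retention_days": 14,        # "no more than two weeks"
        "encrypt_at_rest": False,
        "encrypt_in_transit": False,
        "offsite_copy": False,
    },
}

def policy_for(classification: str) -> dict:
    """Look up the protection profile for a data set's classification."""
    return RETENTION_POLICY[classification.lower()]

print(policy_for("Regulated")["retention_days"])  # 2555 days
```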

 

3. Simplicity in execution

While there are many high-tech features and functions in any enterprise-level data protection strategy, management and implementation must remain as simple as possible. For a data protection strategy to be successful, it must not force an overhaul of existing processes just for the sake of protection.

 

Of course, any archaic process could use some change, but most processes get the job done; what they need is not a new process but a new toolkit. The most elegant solutions are flexible enough to meet business requirements while providing visibility and confirmation of execution, generally through portal-based functionality.

 

Of course, there are more components to a good data protection strategy, but the themes above come up in every data protection conversation we have with customers. Talking through your unique situation with experienced experts like those at Savvis is a good first step on the path to a successful data protection strategy.

 

Matt Brickey is director, storage and data protection product management, at Savvis, a CenturyLink company.

When talking to enterprise and government customers, we find that the security of their hosted applications is never far from the top of their list of priorities. It doesn't make sense to run a workload in a hosted or cloud model if you can't trust the service provider to reliably mitigate risk.

 

Many customers face regulatory requirements that tie their operational models to one or more compliance standards, such as the Payment Card Industry Data Security Standard (PCI DSS) or the Federal Information Processing Standards (FIPS). However, as any chief security officer would point out, there is more to security than a standardized checklist, and there is more to risk management than industry-wide consistency. Each application has a different risk profile, and the manner in which risks are mitigated may depend on a wide range of variables. Of paramount importance in any information security architecture are the quality of information, the depth of control and the ability to respond to changes in the environment.

 

It is possible to be "compliant" while not necessarily being "secure." The reason for this is that threats and risks are continually changing, and any interval-based compliance regime has an ambient level of bureaucracy that can only respond so quickly. There is an emerging debate across the ICT landscape, especially when focused on critical industries such as government, finance and energy, around whether regulatory standards are the best answer for mitigating risk.

 

On one side is the desire for clarity and "industry standards," which give all players a sense of order and consistency; procedural discipline, documentation and knowledge of the standards become vitally important. On the other side, where rapidly emerging threats require supreme adaptability, the most important currency is information. Knowing where attacks are coming from, as they say, is half the battle.

 

In some respects, this is a false debate - enterprise-class organizations require equal measures of discipline and agility to successfully negotiate the risk landscape. However, when considering broad cybersecurity legislation, politicians and regulators tend to rely more on the former than the latter.

 

This debate was highlighted during the March 7 hearing of the House Energy and Commerce Committee's Subcommittee on Communications and Technology on Capitol Hill in Washington, D.C. CenturyLink senior leaders were vital contributors to the discourse, highlighting the need for government not simply to pass down bureaucratic regulation, but also to partner with key industry stakeholders to pass along threat intelligence in an organized and actionable format. Our chief security officer, David Mahon, testified that while the global cybersecurity threat is "real and serious," integrated communications providers like Savvis and CenturyLink play an important role in the cybersecurity ecosystem.

 


It's interesting to consider how traditional telecommunications companies are well positioned to address both sides of this debate. We have the operational discipline to address compliance obligations, and we have end-to-end visibility into network traffic that allows us to act on late-breaking threat information before it reaches a data center or network endpoint.

 

David Shacochis is vice president, global public sector, at Savvis.