Cool Solutions

Guest Post: Does the Increasing Complexity of Identity Management Make “Intelligence” a Necessity? (part 2)


March 23, 2010 8:22 am





Guest post by Dipto Chakravarty, General Manager, Cloud Security & VP Engineering, Security Management Operating Platforms at Novell

As I indicated in my last post, operational changes, technical innovation, and the evolving regulatory environment have introduced unprecedented complexity into the world of identity and access management. Over time, identity management has grown from a set of tools to a product, to a suite, to a platform. The convergence and consolidation of identity management tools under a common framework led to the emergence of platforms that let you authenticate, authorize, provision and audit in an automated fashion.

What happened next was a very disruptive phenomenon. The advent of the cloud has given rise to “ubiquitous utility computing without walls,” and in this new world many of the old rules and enforcement models become insufficient; they simply fall short.

In the beginning, everyone managed their own IT infrastructure on premises, and the physical walls of the enterprise gave administrators a definite perimeter. The process of “de-perimeterization” started as soon as people began moving data centers off-site into colocation facilities and the like.

What happened next was virtualization. Thanks to virtualization, the density of data centers increased, because you could fit more virtual machines into a given space than you could physically fit in a room. This brought with it a “service” model in which you were charged for energy usage, for example, rather than for the physical space your dedicated machines occupied.

Of course, there were times when you needed more computing power than you were subscribed for and one way to provide that was by bursting into the cloud.

This last stage is just the natural extension of the previous stages. First you had a traditional program running inside a physical computer. That program got virtualized, so that multiple programs were running within a computer on a single premises. Finally, you have multiple programs running across computers and across premises, and in that most abstracted scenario you no longer have an option but rather a mandate to run workloads intelligently by making them “identity aware.” Here is how “identity awareness” can be thought of. In a server’s run-time environment, every running process has a process identifier, every thread inside a process has a thread identifier, and every fiber within a thread has a fiber identifier. The same “unique identifier” concept can be extended to intelligently running workloads with the notion of a unique workload identifier.
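
The identifier hierarchy described above can be sketched in a few lines of code. This is only an illustration of the concept, not Novell's implementation; the `WorkloadIdentity` class and its fields are hypothetical names chosen for this example.

```python
import uuid
from dataclasses import dataclass, field

@dataclass(frozen=True)
class WorkloadIdentity:
    """Hypothetical unique identifier for a workload, one level up
    from the process/thread/fiber identifiers a server already keeps.
    """
    workload_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    owner: str = "unassigned"
    allowed_locations: tuple = ()  # e.g. ("on-premises", "private-cloud")

    def may_run_in(self, location: str) -> bool:
        # An identity-aware workload can check where it is allowed to run.
        return location in self.allowed_locations

wl = WorkloadIdentity(owner="finance-dept",
                      allowed_locations=("on-premises", "private-cloud"))
print(wl.may_run_in("public-cloud"))   # False: not an allowed location
print(wl.may_run_in("on-premises"))    # True
```

Just as two processes never share a process ID, each workload here gets a globally unique ID, plus enough metadata to answer questions about where it may run.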

In the cloud model, competitors are colocated on the same infrastructure, so we have to ensure, from the bottom up, that their data stays partitioned. The buzzword is “multi-tenant,” but that essentially means partitioning the data in a way that keeps us continuously compliant, with highly available systems that can be audited, logged, charged back and billed correctly.
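
A minimal sketch of what that partitioning means in practice follows. The class and method names are invented for this illustration; real multi-tenant platforms enforce the same separation at the database, network and hypervisor layers as well.

```python
from collections import defaultdict

class MultiTenantStore:
    """Toy tenant-partitioned store with an audit trail for compliance."""

    def __init__(self):
        self._data = defaultdict(dict)  # tenant_id -> {key: value}
        self.audit_log = []             # who touched what, for auditing

    def put(self, tenant_id, key, value):
        self._data[tenant_id][key] = value
        self.audit_log.append(("put", tenant_id, key))

    def get(self, tenant_id, key):
        # A tenant can only ever see its own partition; a competitor
        # colocated on the same hardware gets a KeyError, not your data.
        self.audit_log.append(("get", tenant_id, key))
        return self._data[tenant_id][key]

store = MultiTenantStore()
store.put("acme", "secret", "formula-x")
print(store.get("acme", "secret"))      # formula-x
# store.get("rival", "secret")  would raise KeyError: empty partition
```

The audit log is what makes the chargeback and billing side possible: every access is attributed to a tenant identity.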

The way to ensure that is by injecting identity into the workload itself, thus making it intelligent and aware enough to know where it’s running, where it is allowed to run, who is allowed to access it, and what to do when there is a problem.
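
Those four questions translate naturally into a policy check at the workload's front door. The function and the allow-lists below are hypothetical, a sketch of the idea rather than any particular product's API:

```python
# Illustrative allow-lists; in practice these would come from the
# identity management platform, not be hard-coded.
ALLOWED_USERS = {"alice", "bob"}
ALLOWED_SITES = {"on-premises", "private-cloud"}

def handle_request(user: str, site: str) -> str:
    """An identity-aware workload answers: where am I running, am I
    allowed to run here, and is this user allowed to access me?"""
    if site not in ALLOWED_SITES:
        # "What to do when there is a problem": refuse, don't guess.
        return "refuse: workload moved outside its allowed locations"
    if user not in ALLOWED_USERS:
        return "refuse: user is not authorized for this workload"
    return "serve request"

print(handle_request("alice", "on-premises"))    # serve request
print(handle_request("mallory", "on-premises"))  # refuse: user is not authorized
```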

Our contemporary computing environment transcends the physical, virtual and cloud dimensions. A few noteworthy points are worth calling out here.

First, almost all environments are a hybrid of physical, virtual and cloud. At present roughly 70% is physical, roughly 25% is virtualized, and less than 5% is cloud, although these numbers will shift radically over the next couple of years.

Second, all three worlds co-exist in commercial, non-academic environments. There is seldom an “all virtual” or “all cloud” environment. So the trick is to make these environments seamless and interoperable despite their heterogeneous stacks, reducing complexity for end-users and improving efficiency for stakeholders.

Third, your workloads on hybrid physical, virtual and cloud infrastructure have to perform consistently whether “on premises” or “off premises,” and that infrastructure is connected by pipes you no longer fully own or control.

Finally, in a world without boundaries, unless you intelligently manage your workloads by making them “identity-aware,” you are out of luck.

Intelligently executing computing workloads with “identity awareness” is no longer an option; it’s a necessity. It’s necessary to have workloads that are identity aware; it’s necessary to intelligently manage those workloads; and it’s necessary to be frugal and prudent about the way we execute and pay for them in a consumption-based or utilization-based chargeback model.
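
To make the chargeback idea concrete, here is a minimal sketch of utilization-based billing. The rate and tenant names are made up for this example; real chargeback models meter many dimensions (CPU, storage, bandwidth), not just compute hours.

```python
RATE_PER_CPU_HOUR = 0.12  # dollars; a hypothetical rate

def chargeback(cpu_hours_by_tenant: dict) -> dict:
    """Bill each tenant only for the compute it actually consumed."""
    return {tenant: round(hours * RATE_PER_CPU_HOUR, 2)
            for tenant, hours in cpu_hours_by_tenant.items()}

print(chargeback({"acme": 100, "globex": 250}))
# {'acme': 12.0, 'globex': 30.0}
```

This is only possible because each unit of consumption can be attributed to a tenant identity in the first place, which is the whole point of identity-aware workloads.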

What previously was a cool thing to do is now the norm. I submit to you that identity-awareness is one of the cornerstones that make it possible to intelligently build, secure, manage and measure workloads across the physical, virtual and cloud computing environments. What do you believe?


Categories: Expert Views, General, PR Blog


Disclaimer: This content is not supported by Micro Focus. It was contributed by a community member and is published "as is." It seems to have worked for at least one person, and might work for you. But please be sure to test it thoroughly before using it in a production environment.