Saturday, August 10, 2013

Nothing is scarier than a business user with a credit card - collaboration

Over the last couple of months there have been some interesting discussions about the ability of the NSA to get access to all kinds of data that enterprises put in the hands of US based IT firms. The discussion itself is not new, as the same kinds of worries have been raised about data entrusted to entities in countries such as China. Most companies outside the sectors that work with technology or other information considered vital for national security are usually more concerned that the Chinese authorities will leak the information to Chinese competitors than about the fact that the Chinese authorities have the info.

Could the same thing happen in the US? I would say perhaps so, and perhaps it is time to consider alternatives to sending all your docs to a US based company if you yourself are non US based.

Back in the old days, when IT actually was in control of IT systems, it was possible to stop the business from utilizing useful services such as Dropbox to share documents. Today this is largely no longer true, as any middle manager in the business who believes that the lack of a good document sharing solution is holding his team back can simply use his credit card to buy some space at Dropbox or Google Drive and then expense the cost just as he would expense a hotel room or a flight. This means that IT has to offer a collaboration platform that is as good, will be ready in days or at worst weeks rather than months, and shouldn't cost much more than the SaaS option. If the business perceives that you can't deliver fast enough, the credit card comes out and there is usually very little IT can do to stop them.

There are a number of pages, such as Prism Break, that list different open source alternatives, but sometimes you have a specific need and not the time and resources to find, customize and implement an open source solution.

When it comes to collaboration the classical choices are Sharepoint or an enterprise file sharing portal such as Axway with web and SFTP interfaces. Either of these options can be used to build very competent solutions, but neither is really fast or cheap. There are some quite interesting new products in the marketplace that fit between the pure SaaS offerings and the traditional enterprise collaboration platforms. One such product is called Anchor and is marketed by Cloud Distribution in the Nordic and UK regions.

I have been playing around with an installation for a week and overall it looks like a very useful product for the SMB segment. Anchor gives you Dropbox/Google Drive like web file management capability as well as an app that enables automated file system level sync on most platforms including smartphones (not on Chromebook, sigh). It also offers automated backups.

Anchor has a quite nice user interface that makes it easy to set up user accounts as well as guest accounts. There is also good out of the box reporting and a lot of advanced features. There is even an AD sync option and remote wipe functions for the smartphones. The product does not have support for things like UMA, OAuth or OpenID Connect, which may be useful for companies that like to live close to the edge of what is technologically possible, but for 98% of the SMB market that just isn't needed at this point.

If you don't have an agile collaboration solution in place and you are feeling that your business would benefit from one, or at least the business perceives that they need one, Anchor may be a good option.

Thursday, May 30, 2013

IT Service Catalog in OIM11G R2 - filtering objects

In the post "IT service catalog evolution" I discussed how the set of entitlements that IT offers to the business has been presented in the various request interfaces that have been available in various provisioning products. A major and ongoing issue has been how to expose the entitlements that the business cares about. Traditionally there were a couple of different ways to solve that problem if you are using OIM, but if you wanted to solve it within the confines of the standard web interface you often ended up with a very large number of resource objects (ROs, aka application instances). Another reason for ending up with "too many ROs" could be that you have large numbers of independent target systems and each system has been modeled as an RO.

A large number of ROs comes with a number of issues but the biggest is usually that it can make it hard for the business to pick the right entitlement in the request interface.

In this post I will take a closer look at how you can resolve this problem in OIM11G R2 by utilizing the catalog concept.

The catalog offers the ability to present not only resource objects (application instances) but also enterprise roles and entitlements. This gives you a very rich tool chest when it comes to displaying options, but sometimes what you need to do is to selectively not show certain options based on the attributes of the user that is using the request interface.

Daniel Gralewski has written an excellent introduction to the Catalog concept that is a very good starting point if you are unfamiliar with the feature. A more in depth discussion can be found in the OIM manual.

The object filtering approach requires that it is possible from a business standpoint to divide the objects in the request interface into a number of different buckets and then map these buckets to different groups of users. A typical situation may be that the following "buckets" exist:

  • Birthright objects such as a base AD account
  • Enterprise applications
  • Applications that are used by a specific department such as HR or Finance apps
  • Applications used in a specific geographic region, e.g. EMEA

The discovery and categorization exercise is very similar to role mining, and if you drive it too far you will run into the same issues that plagued role mining projects. That said, it is usually reasonably easy to perform some form of coarse-grained sorting of the apps.

Once you have the apps sorted you can map the users through their cost centers or departments so that the users only see the objects that are interesting for them.
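
The bucket-to-user mapping described above can be sketched in a few lines. This is a hypothetical illustration of the filtering logic only; in a real OIM deployment the visibility rules are expressed through catalog metadata and ADF UI customization, and all the bucket and application names below are made up.

```python
# Hypothetical catalog with each requestable object tagged with a bucket.
CATALOG = [
    {"name": "Base AD account", "bucket": "birthright"},
    {"name": "Intranet portal", "bucket": "enterprise"},
    {"name": "SAP FI", "bucket": "dept:finance"},
    {"name": "Workday HR", "bucket": "dept:hr"},
    {"name": "EMEA file share", "bucket": "region:emea"},
]

def visible_buckets(user):
    """Derive the buckets a user may request from, based on attributes."""
    buckets = {"birthright", "enterprise"}  # everyone sees these
    buckets.add("dept:" + user["department"].lower())
    buckets.add("region:" + user["region"].lower())
    return buckets

def filter_catalog(user, catalog=CATALOG):
    """Return only the catalog entries relevant to this user."""
    allowed = visible_buckets(user)
    return [item["name"] for item in catalog if item["bucket"] in allowed]

# A finance user in EMEA sees birthright, enterprise, finance and EMEA apps
print(filter_catalog({"department": "Finance", "region": "EMEA"}))
```

The point of the sketch is that the per-user visibility decision is driven entirely by existing HR attributes (department, region), which is what keeps this approach much cheaper than full role mining.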

Daniel Gralewski has written a detailed howto that shows how to change the shopping cart icon based on whether the user is already associated with an object or not. The same approach can, with some modifications, be used to hide objects that the users really don't need to be able to request.

Alex Lopez has written a more advanced example that also uses multi-step drop-downs, where the content of the first drop-down is determined by the requesting user's attributes and the content of the second is determined by the pick in the first. It is a very nice example that shows the versatility of the interface.

The catalog does offer a number of advanced capabilities and really gives the implementation team the ability to create a very customized user interface within the core product. This means that you don't have to make the very large base investment that a "ground up" user interface requires, and the implementation is also decently upgrade safe.

The downside of the catalog approach is that you do need to do some business analysis work up front to understand who should be able to request what. The implementation team also needs quite deep WebCenter/ADF skills to be able to perform the customization.

Overall the catalog is a very nice feature and puts OIM clearly ahead of some of its competition, e.g. IBM SIM/TIM 6.0.

Wednesday, May 29, 2013

IT Service Catalog evolution

One of the eternal problems in Enterprise IAM is to bridge the gap between how the business looks at an entitlement and how this entitlement is actually provisioned and managed by IT into the target systems.

If you look back to the 2001-2005 era, when the first provisioning tools like Thor Xellerate (later OIM) and Access 360 (later TIM and now ISIM) entered the market, the tools basically offered a capability to automate the creation of core identities and the associated target system identities. The approach that these systems took closely followed how the enterprise access administrators within IT looked at the world. Most of these systems came with a request interface of variable level of usability (they were user friendly, they were just very picky about who they wanted to be friends with).

Move forward a bit to the 2005-2006 era and the request interfaces became more understandable, but as they were metadata driven they were hampered by the basic data structure, which basically dictated that a line item in the request interface should be a resource object, which out of the box usually mapped to a user account in a target system. This is not very useful if what the business wants is to manage business entitlements that map to one or more attributes, such as a group membership, on the target system.

One way to solve this problem was to get yourself one of the new role management tools such as Vaau Role Manager (later Sun Role Manager and now Oracle Role Manager) or Bridgestream (later Oracle Role Manager and now dead). This approach worked if you could get the business to consolidate their access profiles into a set of distinct business roles that could then be mapped to IT roles that contained the actual entitlements.

Sometimes you could even map the users to business roles through user attributes such as physical location, cost center or reporting chain. If this was possible you had reached the nirvana of totally automated provisioning.

In practice this approach turned out to be hard to implement, as it was very hard to capture the very complex nature of enterprise entitlement management in a set of discrete rules. By the time you had finished one round of role mining the business had reorganized and you needed to start more or less from scratch. It was also very hard to get the business to spend time on defining and maintaining business roles.

If you couldn't get the money to buy a role mining and management tool what did you do then?

One option was to write your own, which I did for an IBM IAM stack implementation that ran 2010-2011. This worked decently well for generating a base set of entitlements to be given to each user upon user creation, but the challenge was to keep the configuration files updated as the business evolved.

You could of course create a custom user interface that encapsulated all the complexities, but that was a very expensive approach both from a time and a cost perspective. I followed that approach in an Oracle eBusiness provisioning enablement project using IBM TIM in 2008 and it worked great, but the project cost was substantial and the scope was limited to a single target system. Sena Systems as well as other system integrators have done a number of very successful implementations using this approach, and if you have the time and funds and a very complex IT landscape this is clearly the approach that gives you the best result.

Another approach was to encapsulate the atomic entitlements into requestable objects and then present the business with a very long list of objects that they had to choose from. This approach was favored by the major analyst houses back in the 2010-2011 time frame and certainly works.

If you want to follow this approach using OIM, I wrote some articles back in 2010 on how to do it that may be of interest.

As always there are downsides with taking this approach. One major downside is that the list of entitlements tends to get rather large in a major enterprise, which makes it very hard for the business to pick the right entitlement, which in turn makes the business very unhappy.

In OIM11G R2 there is a new concept called the catalog that gives you a new tool to address this particular issue. I will take a deeper look at the catalog in a later post, but it is a really nice addition to OIM and gives you a low-cost alternative to the custom interface.

Tuesday, April 16, 2013

Persona based access control

One of the design goals of an access control system, be it RBAC, ABAC or raw entitlement based, is that the user interface should only display options that are actually available to the user. The purpose of this is of course to create a smoother user experience, especially for occasional users that don't need complex functionality to perform their tasks. In an RBAC-based system the classical example is to hide or grey out buttons based on what roles the user has.

This works well if there is a base UI that all users of a specific category needs and a specific user does not belong to more than one category. These assumptions are unfortunately not always correct.

If you for example look at the health payer space, a single physical user can take on a number of different personas. The same person could be a consumer of health services ("member"), work as a health insurance broker ("broker") or work for a health service provider doing billing or similar work ("provider"). It is even common that the same person works for multiple providers but needs a completely different interface for each provider, as the lines of business are sufficiently different. The same person may code dental claims on Monday and GP claims for her other employer on Tuesday through Friday.

You could of course implement this use case by building a different portal for each type of user with different URLs and then federating the identities between the portals, but that gets expensive and very complex from both a developer and a user standpoint. You could also implement the logic in custom code, but that also gets expensive.

Another option is to use some form of persona based access control. There are a couple of different approaches, one being ABAC with XACML and the other establishing the persona as a concept in the authorization model.
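
The second approach, making the persona a first-class concept, can be sketched roughly as follows. This is a hypothetical illustration, not any vendor's actual model: the key idea is that entitlements attach to the active persona, not to the physical user, and all names below are invented.

```python
# Entitlements are keyed by persona, not by user. A persona here is a
# (role, organization) pair, e.g. the same coder working for two providers.
PERSONA_ENTITLEMENTS = {
    ("provider", "hospital-a"): {"code_gp_claims"},
    ("provider", "practice-b"): {"code_oncology_claims"},
    ("broker", None): {"quote_policies"},
    ("member", None): {"view_own_claims"},
}

class Session:
    def __init__(self, user, personas):
        self.user = user
        self.personas = personas  # personas this physical user may assume
        self.active = None

    def assume(self, persona):
        """Switch context; only pre-approved personas may be assumed."""
        if persona not in self.personas:
            raise PermissionError(f"{self.user} cannot assume {persona}")
        self.active = persona

    def can(self, action):
        """Authorization is evaluated against the active persona only."""
        if self.active is None:
            return False
        return action in PERSONA_ENTITLEMENTS.get(self.active, set())

s = Session("alice", [("provider", "hospital-a"), ("member", None)])
s.assume(("provider", "hospital-a"))
assert s.can("code_gp_claims")
assert not s.can("code_oncology_claims")  # different provider, different persona
```

Note how the Monday/Tuesday coder scenario from above falls out naturally: the same user simply assumes a different persona, and the interface and entitlements follow the persona rather than the person.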

SecurIT is hosting a webinar on using their Trustbuilder product to implement persona based access control and if you are interested in PBAC I strongly recommend registering.

Saturday, March 2, 2013

SecurIt Trustbuilder

One of the sessions I am looking forward to at Pulse is "IAM-1390 : A Cloud Security Platform Offering Simple, Secure and Fast Digital Access to Flemish Authorities Resources" which is presented by Marc Vanmaele from SecurIT.

This implementation was done leveraging a product from SecurIT called Trustbuilder which looks like a good way to implement a couple of quite thorny but very interesting use cases.

Healthcare payers and providers

In the US healthcare system there are a lot of billing transactions happening between the providers of services and the payers of services, as well as intermediaries in the chain. The actual entering of claims is usually done by specialized professionals called medical coders. It is very common that a specific medical coding professional works for multiple provider organizations, which requires that a single physical user can take on multiple roles (i.e. on Monday GP coding for hospital A, on Tuesday to Thursday oncology coding for practitioner group B, on Fridays dental coding for doctor Z). This requirement is hard to meet in OOTB TAM, but it seems like TrustBuilder adds an abstraction layer to better support context-aware identification.
This capability is also very useful in medical health record systems such as drug manufacturers' patient registries. The conventional solution would be custom code or XACML, but it would be interesting to see how TrustBuilder stacks up. A little more detail on patient registries.

FDA regulated organizations and digital signatures 

The step-up authentication capability could be used to meet the regulations from the Food and Drug Administration on digital signatures (21 CFR Part 11). Standard TAM lacks the ability to enforce an additional atomic authentication event triggered by access to certain sets of URLs. You could of course implement this with a callback from the application code to the TAM authorization server, but this requires modification of the application code, which may be impractical or impossible if you're using a commercial off-the-shelf application.
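
To make the missing capability concrete, here is a minimal sketch of URL-triggered step-up authentication done at the gateway instead of in application code. The URL patterns, the session shape and the freshness window are all hypothetical; the point is only the pattern: match the request path against a protected set, and demand a recent strong authentication event before letting it through.

```python
import fnmatch
import time

# Hypothetical URL sets that require a fresh, stronger authentication.
STEP_UP_PATTERNS = ["/clinical/sign/*", "/batch/release/*"]
MAX_AUTH_AGE = 300  # seconds; signature must follow a recent re-auth

def needs_step_up(path):
    """True if the path falls inside a protected URL set."""
    return any(fnmatch.fnmatch(path, p) for p in STEP_UP_PATTERNS)

def authorize(path, session, now=None):
    """Return 'allow', or 'step-up' if fresh strong auth is required."""
    now = now if now is not None else time.time()
    if not needs_step_up(path):
        return "allow"
    fresh = now - session.get("strong_auth_at", 0) <= MAX_AUTH_AGE
    return "allow" if fresh else "step-up"

# A user who re-authenticated 10 seconds ago may sign; a stale session
# gets bounced to re-authentication; unprotected URLs pass untouched.
session = {"strong_auth_at": time.time() - 10}
assert authorize("/clinical/sign/batch42", session) == "allow"
assert authorize("/clinical/sign/batch42", {"strong_auth_at": 0}) == "step-up"
assert authorize("/home", {}) == "allow"
```

Because the check lives in the gateway, the commercial off-the-shelf application behind it never needs to be modified, which is exactly what makes this attractive for 21 CFR Part 11 scenarios.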

Pre-Pulse 2013

Pulse 2013 is coming up fast and furious and I am packing my bags.

If any of my readers are interested in meeting me in person and are attending Pulse, I will be part of the panel for "IAM-2297: Best Practices in Adopting Identity and Access Management" which will take place in MGM Grand Room 122 on Tuesday 5-6 pm.

The talk will be centered around two questions:
  • How do you run a successful IAM program?
  • How do you mature the IAM program to a business service?
If this sounds interesting then please come and listen to me and hopefully I won't bore you too much.

This year's program contains a long list of very interesting sessions, but as I have been unable to figure out how to make the Pulse homepage show when and where a specific session happens, I still haven't got a real plan for which sessions to attend.

One session I will make sure to attend is "IAM-1390: A Cloud Security Platform Offering Simple, Secure and Fast Digital Access to Flemish Authorities Resources", which is presented by Marc Vanmaele from SecurIT. SecurIT won the IBM 2012 Beacon award for TrustBuilder, which is an extension of IAM/TAM that adds some very interesting capabilities in some areas that are quite far away from the state and federal government sectors.

Thursday, February 21, 2013

Access recertification in the cloud with CA and Nördic Döts

Being born and raised in Sweden I keep a little eye on the Nordic IDM Market and CGI (formerly Logica, formerly WM Data) just closed a deal on building a cloud based access recertification platform for ABB.

The funniest part of reading the Swedish press release was to see what Swedish words they were using for the English concepts. I actually think that the copywriter did a really good job.

For my readers who don't speak Swedish the CA English press release about the technical platform may be more interesting. It is good to see that CA continues to invest in their IAM platform. It is also very interesting to see that the IAM as a Service delivery model now clearly has become mainstream.

Tuesday, January 15, 2013

Pulse 2013 - Best practices in IAM

Pulse 2013 is coming up soon and I am planning my yearly trip to lovely Las Vegas.

This year I will be on a panel titled "Best Practices in adopting Identity and Access Management - a panel discussion". The Pulse panels are always a lot of fun so I am really looking forward to it.

Thursday, December 6, 2012

To encrypt or not to encrypt, that is the question

In my last post I wrote about the South Carolina breach, and this attack has now made its way to even venerable publications like the Economist.

The interesting question here is when does encryption of PII or PHI at rest go from "advanced technology that really goes above and beyond" to "industry standard"?

If you look at PII or PHI on laptops or other physical devices, encryption has been standard for close to ten years now. Very few organizations allow their employees to have sensitive data in clear text today. The main reason for that is simple economics. It is much cheaper to pay for licenses for encryption software plus some extra juice in the laptops than to shell out for identity theft protection. In any organization of significant size, theft or misplacement of laptops is simply something that occurs on a monthly or even weekly basis. The other reason is that it is embarrassing and potentially career limiting for a CIO to have to explain why they lost the data when protection was easily and cheaply available.

Are we getting to the point where the corporate networks simply are considered being breachable just like laptops are "stealable"? Will encryption continue to be perceived as the silver bullet that automatically provides safe harbor for the information?

I am the first to admit that I really don't have any strong answers to these questions.

Wednesday, November 21, 2012

HIPAA and reverse incentives

Last week I visited ISA New England's November member meeting and one of the talks was by Karen Borton about HIPAA. It was a really good walk-through of a complex subject.

Major HIPAA breaches (500+ affected users) are published by the HHS on their website. If you look at the list you can see that most breaches still are of the "forgot laptop on the bus" kind. Is this because that is the most common attack vector or just because that issue is easy to discover?

Most companies that are covered entities do encrypt all laptops in order to ensure that a lost laptop does not result in a potential breach and notification, but it is still quite uncommon that all PII and PHI is encrypted at rest. In contrast, it is quite common that the same companies encrypt credit card data, as PCI more or less requires encryption.

In one recent breach, the state of South Carolina had PII stolen as well as credit card data. The end result was that PII will now be encrypted within the organization.

Given the current state of regulation, perhaps it takes an attack on a specific organization to raise executive awareness enough to actually start moving on this issue? On the other hand, the laptop encryption issue clearly did somehow get noticed and acted on throughout the industry, so perhaps there is still hope.

Saturday, November 3, 2012

Compliance and security in different verticals

I attended a very interesting healthcare round table last week organized by RSA. One of the things that we discussed was how different verticals handle their respective compliance legislation.

There was a lot of consensus around the fact that finance has a very mature security stance. This is probably partly due to the fact that the compliance legislation (SOX) is rather mature and well defined. The consequences of non-compliance can be very painful for very important people in the company, which usually results in a lot of attention. There have also been a number of very big events in finance where insiders have gained elevated privileges in order to hide trading losses.

If you move on to retail you have PCI, which is very detailed and specifies exactly what you need to do and what happens if you don't. Again, there have also been a number of cases of insiders stealing credit card data where improved security systems clearly could have led to at least earlier detection and smaller losses.

In healthcare, on the other hand, we have HIPAA and HITECH. HIPAA is very high level and doesn't have any real teeth. There was some talk about HITECH fixing this, but thus far there have been very few cases where HITECH has really been brought to bear against healthcare or payer organizations.

Another aspect is that the information handled by healthcare organizations is harder to fence. Financial organizations tend to handle cash, which certainly is very easy to fence. Credit card data isn't pure cash, but thanks to organized crime there are essentially underground exchanges where credit card data can be sold in wholesale quantities.

Healthcare and healthcare insurance information is harder to capitalize on, as it is much easier to pose as a person with a credit card to a merchant than to pose as a healthcare provider to a healthcare payer. On the other hand, the potential payouts could be very substantial, so I wouldn't be surprised if we see a "TJ Maxx" like case very soon in the healthcare sector. The PII aspect of the PHI is another potentially interesting attack target, as it often is possible to perform identity theft using the data in the insurance information.

The consensus in the meeting was that healthcare in general is relatively immature, but there are a lot of things that point to a change in culture over the next few years. One clear possibility is that the enforcement of HITECH may be stepped up, and there may also be enough embarrassment generated by a couple of big data loss cases that actual change will happen in the industry.

BCBS Tennessee is one very famous example of what can be done if an organization decides to tackle a specific aspect of security (data at rest in this case). It should be noted that the cost of just providing data at rest encryption for a mid-size payer organization is $6 million, so lifting the overall security posture of the industry clearly won't be cheap.

Monday, October 8, 2012

IDM in the cloud

Recently there was a very interesting discussion around cloud based IDM vendors at Wisegate. The question that was posed was: what are the current cloud based IDM alternatives? There are a number of vendors out there, but my three favorites for a US based corporation would be:

Lighthouse security
A quite mature solution based on the IBM security stack (formerly Tivoli Identity Manager and friends) with a quite nice custom GUI.

Sena systems
My old employer Sena offers a hosted turnkey IDM solution based on the Oracle stack. This solution is built on top of a very mature service offering that Sena started working on way back in 2006.

Very interesting solution architected by Nishant Kaushik (ex Thor, ex Oracle).

A couple of years ago cloud based offerings were quite a new thing and most vendors did not offer them. Today you can get this offering for most of the major products, and most of the bigger implementation players will have a cloud offering in their portfolio.

The main draw of cloud based implementations for most customers is that you avoid having to install and maintain an often complex, temperamental and finicky IAM installation. There is a significant element of economy of scale in running a number of very similar IAM installations in a data center compared to running just one, so this makes a lot of sense.

The second advantage a cloud based implementation offers is that it usually supports a standardized set of requirements instead of the basically unlimited configurability that most major IAM stacks offer. This means that the implementation time is much shorter and the implementation cost as well as the maintenance cost is much smaller. The downside is of course that you have to accept the use cases and requirements that are already in place. Sometimes you have the ability to do some limited configuration, but you definitely aren't able to write custom code.

As a general rule it is easy to burn three times the license cost on professional services in an IAM engagement so there is a huge financial incentive to go with a standard cloud implementation rather than a custom local installation.

IAM as a service is clearly taking a bigger and bigger mind share, and as corporations become more and more comfortable with running things in the cloud it is likely that the mind share will result in a bigger and bigger market share as well.

Wednesday, March 7, 2012

Datapower - a blast from the past

Back in 2005 I was living in London and working for Sena Systems. Sales were slow in Europe and I was subcontracted out to a partner for a three month engagement in the US. The partner was called Datapower and was headquartered in Boston. Little did I know that this project would result in one of the big turning points in my life.

While working for Datapower I not only met my now wife but also got some experience with the product, and got to experience what happens when IBM buys your employer. One result of that acquisition was that Datapower cut its relationships with non-IBM partners, so I had to leave the DP area and moved over to the provisioning practice.

During the next five years I didn't do any Datapower work, so when I got the chance to take part in a hands-on DP lab day I took it.

The most striking part of the experience was how little DP had changed. Most of the user interface was almost identical. DP has a wizard oriented user interface where you basically are guided through a setup process. The end product of the process is a functional unit such as a multi protocol gateway or a web proxy.

The advantage of this setup is that you can build quite complex entities without any programming or in-depth training. The disadvantage is that you sometimes are a bit limited. If the capability you need isn't available as an option you are usually toast. The supporting entities, such as encapsulations of certificates or SSL protection of connections, can also be a bit hard to figure out, as they can either be predefined and reused or be defined as part of the workflow.

One really nice function in the new OS is the addition of an XACML interpreter, which makes it possible to run the DP device as a PDP. You can also link the DP box to a TSPM server and use the DP box as a PEP. The PDP functionality seems a bit shallow and there is no support for policy authoring or distribution, so it is really not a fully fledged XACML solution. Despite this it is good to see XACML support in yet another well established security appliance.
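
For readers unfamiliar with the PDP/PEP split mentioned above, here is a toy illustration of the division of labor. This is not DataPower's or TSPM's actual API; real XACML requests are XML (or JSON) documents, and the policy set and attribute names below are invented purely to show the pattern.

```python
# Toy policy set: (subject-role, action, resource-prefix) -> effect.
POLICIES = [
    (("coder", "write", "/claims/"), "Permit"),
    (("auditor", "read", "/claims/"), "Permit"),
]

def pdp_decide(request):
    """The decision point evaluates policies; deny by default."""
    for (role, action, prefix), effect in POLICIES:
        if (request["role"] == role and request["action"] == action
                and request["resource"].startswith(prefix)):
            return effect
    return "Deny"

def pep_enforce(request):
    """The enforcement point only forwards the request and obeys
    the decision; it holds no policy logic of its own."""
    return pdp_decide(request) == "Permit"

assert pep_enforce({"role": "coder", "action": "write",
                    "resource": "/claims/123"})
assert not pep_enforce({"role": "coder", "action": "read",
                        "resource": "/hr/42"})
```

The appeal of running an appliance like DP as PDP or PEP is exactly this separation: the gateway enforces, and policy can in principle be managed centrally, which is also why the lack of policy authoring and distribution noted above is the real gap.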

Thursday, February 2, 2012

Stockholm syndrome and Gartner provisioning quadrant

I will be speaking at the Identity and access management seminar in Stockholm Sweden on May 3 this spring.

My talk will primarily cover how to upgrade IAM systems, including how to integrate "newer" functionality from the IAG space. I think the seminar will be very interesting, so if you happen to be in the area I would recommend attending. Even if you are not in the area, you know that you want to visit the land of the socialist nightmare (or statuesque blonds, take your pick).

When you upgrade your IAM system you have to make two major choices:
  1. What software package will I use?
  2. Who will perform the upgrade?
The answer to the first question can be determined in many different ways (i.e. who plays golf with whom) but let's take a look at the new Gartner user provisioning magic quadrant and see if that provides any answers.

This year's magic quadrant is rather boring, which probably reflects the maturity of the market. The big three (IBM, CA and Oracle) are hanging out in the upper right corner with Oracle having a slight edge. Courion pulls off another strong showing, and the cat with at least 99 lives (Novell) seems to be alive and well.

IBM, CA and Oracle are also the only players with a more or less fully featured and more or less integrated IAM stack. The downside is that the packages from the big three tend to have high license costs and are also quite complex to install and configure.

Thanks to the acquisition of the provisioning module from BMC, Sailpoint now has a decent provisioning offering. It is not as fully featured as the leaders' but it is definitely a competitive offering.

The challengers section of the report contains a number of interesting vendors such as Forgerock and Lighthouse.

Overall there were very few surprises in the report, and you can almost read between the lines that the Gartner analysts are much more excited about the new IAG quadrant (courtesy of Sailpoint).

Friday, January 27, 2012

Useful TIM tips and tricks

I ran into a page full of useful TIM tips and tricks that I thought I should share: IBM Tivoli Identity Manager How To