Wednesday, March 28, 2007

Adapter Hangs

Every TIM class seems to spend some time working with the standard Linux Adapter. It is one of the simplest to work with for basic provisioning and testing, so wouldn't you know it happens to be the one that gives me trouble. I'm using what I think is the latest Linux Adapter, v4.6.3, which I downloaded from the PartnerWorld web site. For some reason this thing is a little flaky for me. The doc says it is supported on SLES 9, which is what I'm using, and I'm running it on the same server that runs TIM. Initially it starts up fine. I can run the agentCfg program to configure it, and in TIM the connectivity test to it passes just fine. But when I try a recon, it just hangs indefinitely, with no log output about the recon in either the trace.log or the adapter log. I re-installed the adapter a couple of times with failed tests in between. For my latest attempt, I restarted Linux.

Before starting TIM this time, I started the agent and bumped up its logging level. Then I started up LDAP and TIM. Again I tested the TIM Linux Service and it connected to the adapter just fine. I ran a recon and this time the stupid thing worked; the recon completed successfully. But then I tried to get into the agent config:

./agentCfg -agent LinuxAgent

And the tool hangs. Even if I stop and start the agent, I still cannot get into agent configuration unless I do something more dramatic, like reboot the server. Sometimes if I stop the agent, leave it down for a while, and then start it back up, it seems OK. Voodoo.

Whew! It's a Brain Melt Down!

The Extending TIM 4.6 class is definitely a worthwhile class to take. At first I was a bit skeptical, but days 2 and 3 got pretty involved. In fact, each day the material gets more complex, and the labs are pretty much all JavaScript. You should be very comfortable with TIM basics before you take the class, because when things don't work in the labs you will not want to fall behind just because you don't know where to go to troubleshoot and debug. The amount of time allocated for this class (4 days) is pretty good. We could have made use of a 5th day, I think, but by then you will understand the concepts. Now that I've been in the "weeds" a bit with TIM, I can see that it must be very difficult to develop a course for it. So much of what you do with the product depends on so many different factors, number one being the condition of your data. In the lab, all the sources of identity information (CSV files, DB2 databases, etc.) have been pre-configured with perfectly clean information (no duplicates, all attributes populated), and still things fail to work properly.

If you take the class you'll have lots of workflows to do. I think if you can wrap your brain around ACIs and workflows, the rest is easy. I would definitely include Extending Tivoli Identity Manager 4.6 in your training plan.

Monday, March 26, 2007

Another thing from class today...

TIM is a little strange when it comes to feeding supervisors into the system. It all boils down to the fact that objects in TIM get assigned a random ID as part of their DN called the erglobalid. So in order to specify who a supervisor is on a person's record, you must first create a record for that supervisor in TIM. It's sort of a chicken-or-egg problem. To solve it, users are fed into TIM in two phases. The first phase brings all users into TIM regardless of whether they are management or not. The next phase updates the supervisor attribute for all the users. To do that, your second feed into TIM must do a lookup against the TIM LDAP to resolve what the manager's DN actually is.


So the assembly line must get an attribute about the manager that it can then do a lookup on to get this DN, which then gets populated into the manager attribute in the TIM LDAP. Keep in mind that in TIM it is referred to as ersupervisor, but it's really manager on the back end. In my sandbox system, my employee feed source contains the manager's employee ID in each user record, so when I run an assembly line to update the manager attribute in TIM, I simply look up the DN where the manager attribute in Work matches the employeenumber attribute in TIM. For that person I get the DN, which is what the feed populates into TIM. This is done using the DSMLv2 event handler.
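The phase-two lookup logic can be sketched in plain Python. All DNs, attribute names, and values below are made up for illustration; in the real feed, a TDI LDAP connector would search the TIM LDAP instead of scanning a dict.

```python
# After phase one, every person exists in TIM with an erglobalid-based
# DN and an employeenumber, but no manager DN yet. This dict stands in
# for the TIM LDAP.
tim_ldap = {
    "erglobalid=111,ou=people,dc=sandbox": {"employeenumber": "1001", "cn": "Alice"},
    "erglobalid=222,ou=people,dc=sandbox": {"employeenumber": "1002", "cn": "Bob"},
}

# The feed source carries the manager's employee ID on each user record.
feed = [
    {"employeenumber": "1002", "manager_empid": "1001"},  # Bob reports to Alice
]

def find_dn_by_empid(empid):
    """Stand-in for the LDAP search: return the DN whose
    employeenumber matches the given employee ID."""
    for dn, attrs in tim_ldap.items():
        if attrs["employeenumber"] == empid:
            return dn
    return None  # not loaded yet -- phase one must run first

# Phase two: resolve the manager's DN and write it to the user's
# manager attribute (what TIM calls ersupervisor).
for record in feed:
    manager_dn = find_dn_by_empid(record["manager_empid"])
    user_dn = find_dn_by_empid(record["employeenumber"])
    if manager_dn and user_dn:
        tim_ldap[user_dn]["manager"] = manager_dn
```

If the manager hasn't been loaded yet, the lookup comes up empty, which is exactly why the feed has to run in two phases.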

In class today we did the feed a bit differently. While the initial feed used the event handler, the second feed for the managers was done using TDI to push people right into LDAP. There were a couple of lookups to LDAP to resolve the DNs of the user and then the manager, then one final update connector to the LDAP to update the record.

It seems like none of these training classes really goes into which solutions work best in which scenarios. So far, none of the classes I've been to talks much about the best way to develop a solution for a particular case. Supposedly the Advanced ITIM 4.6 Implementation Workshop gets into that level of work. We shall see, because I plan on being there.

Day one at Extending TIM 4.6

I'm on the fence so far about the Extending TIM class. The first day was pretty much all review of LDAP and basic TIM stuff, and I really did not learn much yet. The latter half of the day we did labs, which was OK because we worked on ITIM data feeds using TDI. One good thing was that we configured a different way to feed manager attributes into TIM than the way I've seen before. I also had problems with my provisioning policy due to a typo in some JavaScript in TDI. It didn't take long to find that, but my TDI feed still didn't work right. The feed connects to a CSV file to pull the most typical user attributes, like cn, department, sn, title, and hire date. Then the AL connects to a DB2 table to look up the person's security clearance. The solution implements a typical TIM event handler, so that when you recon the TDI feed service in TIM, the users get pulled into the TIM org tree. There was a sample placement rule for the lab that placed users in the org tree based upon their departmentnumber attribute.

Problem: In addition to the typo in the JavaScript, I also had a typo in the name of one of the OUs. So when I ran the assembly line, most of the users were placed into TIM correctly, except for the users who belonged in the OU that was spelled wrong. So I fixed the misspelling and re-ran the AL. No go. The users still did not get placed properly. I restarted TIM and tried again, and it still failed. Then I restarted TDS and TIM and tried again. Still failed. I checked the TDI assembly line, restarted the event handler, and still no go. So finally I decided to delete all the users in TIM who had been placed at the top level. When I ran the recon again, this time it worked.

Solution: Sometimes you just have to delete the people who are in the tree incorrectly and re-run the reconciliation. Go figure. So this is definitely something I learned today which I would never have learned without first having a typo in an org unit name!
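A TIM placement rule is really a JavaScript snippet that returns a relative DN, but the idea behind the lab's departmentnumber-based rule is simple enough to sketch in plain Python. The department codes and OU names here are made up for illustration:

```python
# Sketch of a placement rule keyed off departmentnumber. A dict plays
# the role of the rule's department-to-container mapping.
dept_to_ou = {
    "100": "ou=Engineering",
    "200": "ou=Sales",
}

def placement(person):
    """Return the container a person should land under. An unknown
    (or mis-spelled) department falls through to the top of the org
    tree -- which is where my typo'd users ended up."""
    return dept_to_ou.get(person.get("departmentnumber"), "o=Org")
```

The fall-through default is the part worth noticing: a single misspelled OU name silently dumps those users at the top level instead of failing loudly.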

Sunday, March 25, 2007

Welcome to my Blogs New Home

Since I changed jobs last month I've been looking for a new home for my blog. I used to host this at my former employer on a Domino server. I used to be a Lotus Domino Administrator so running my blog on a Domino box made sense to me. And Domino is a great platform for collaboration apps like blogs so it was a "no brainer". As gracious as my former employer was in continuing to host my blog for a while after my leaving, it was still necessary to move it.

I had something all lined up to host it on another Domino server run by a different company and then there was the option of hosting it at my new employer's Domino server as well. All this indecision is why there have been so few postings to this blog lately. This content resides at my old employer, my new employer and now here not because I like to copy this content multiple times, but because I was hoping to continue running this on Domino.

I switched to blogger because Google offers some pretty cool functionality along with the new googlepages all for free. So I can create regular web pages and store my attachments like documentation I compile, cheat sheets, and code snippets on googlepages and link to this content from the blog. These tools are actually faster than the experience I was having with the Domino blog using my Notes client as the blog admin tool. Still it was nice having the blog self contained in a single NSF file, but I decided to give blogger a whirl. I'm looking into creating a twiki that I can link to the blog and my googlepages as well.

Now that the blog has finally moved I look forward to posting more again.

Tuesday, March 20, 2007

Update to SSO for WPM and WebSEAL

I re-visited some of my past posts on this topic and I suspect that I may have failed to mention some of the steps I took to make this work. Part of the issue is that IBM has a document on how to do SSO for WPM and WebSEAL, but it involves TAMeB 5.1. I'm using the newer stuff, so I needed to use some documentation on creating a junction to WAS 6 using TAI or TAI++, as well as figure out how to do SSO with WPM. When I first tried this I attempted to use TAI. It almost worked, but I kept having problems logging in as sec_master; I could log in as other TAM users just fine, though. I then tried TAI++ and everything worked great. Now, in my customer environments we will be using a TAM authorization server, so I made sure to set up an auth server in my sandbox (development) environment as well, which means I'm not sure you will be able to follow the new instructions if you don't have an authorization server. The jury is still out on that. Anyhow, I uploaded a PDF to my Googlepages here if you're interested.

Wednesday, March 14, 2007

Day one and two at ITIM 4.6 Basic Implementation Workshop

In the first two days of the Basic Implementation Workshop we have already covered a lot of ground. It's moving almost too fast, so for the last two nights I have brought my book back to the hotel and gone over the exercises again using the sandbox environment on my laptop. Here are some bullet points I've picked up in the first two days:

  • Do not touch the TIM LDAP or the TIM database directly for any reason. In other words, these two things are off limits to any system besides TIM. While it may be tempting to let some other system connect to the LDAP to make a query or something, it is highly discouraged by the Tivoli folks.
  • The default page in TIM when any user logs in is the change password screen. This may be confusing to people and make them think that they must change their password. This screen can be changed.
  • Changing UIDs is very difficult, if not nearly impossible, so avoid doing it. An ideal UID is something that will not change, like employee number. Do not use first initial + last name.
  • Referral attributes like Supervisor or Manager will require two loads of TIM. The supervisors need to exist in TIM before you can populate the supervisor (ersupervisor) attribute.
  • The password policy you choose for TIM should be the same as the password policy you are using on target systems. It is possible to make it different, but be careful: when a user is required to change their password, a password that TIM allows may not be allowed on the target resource, or vice versa.
  • In TIM you can configure how to handle non-compliant accounts. The choices are Notify, Mark, or Correct. Be very careful when configuring TIM to correct non-compliant accounts. By simply removing a role from someone, you may cause TIM to delete many accounts on some target resource, and de-provisioning accounts may mean different things on different target resources. You should choose to Mark non-compliant accounts instead of correcting them, at least until you are completely comfortable with TIM. And that may never happen. :-)
  • When you don't want a user to have an account, but you cannot change the Role and you cannot change the Provisioning Policy, simply suspend the account.
  • In TIM 4.6 some restrictions in provisioning policies from past versions have been removed, so you really no longer need to use Locations or Business Partner Locations in your organization tree. Keep this in mind when designing the tree, because when you have to search for people you often have to choose the category of people you are looking for; it's sometimes easier if they are all the same.
  • Static vs. dynamic roles - Static roles are simple to set up, but you must maintain them manually. Dynamic roles are automatic (they use an LDAP search to build membership), but they can cause slower performance because they are constantly being re-evaluated.
  • If you have customized any Service Profile Forms, make sure to back up the service profile before you reload the form in TIM. Not sure how often you would encounter a need to reload the Service Profile, but if you did have to for some reason you would blow away any customization you made to the form earlier.
  • For any target systems you will manage with TIM designate a Service Owner. This way when setting up workflows you can have requests routed automatically to those people who actually manage that resource. Obviously then those people would require a TIM account.
  • I still have to verify this one yet, but a Provisioning Policy has to exist in the same container as the service it pertains to.
  • Service Selection Policies get evaluated anytime anything in TIM changes. This will result in poorer performance. Avoid using these.
  • When two policies for the exact same service apply to you, the one with the higher priority wins (a lower number means higher priority).
  • There is a recycle bin in TIM. Anytime you delete an account, it goes into the recycle bin. This is used internally by TIM and accounts are kept there for 62 days.
  • When doing a recon your goal is to have 0 orphaned accounts. To help with that TIM by default will match up the account name on the target system with what is in the alias field in TIM. This is a nice feature to help minimize orphans. When you are feeding people into TIM use the alias field and populate it with what is likely to be the account names for your target systems. TIM will try to match these up during a recon and those accounts that match will be adopted.
  • There are two different kinds of workflows: provisioning and operational. Workflows can also be global or profile-specific. There is a workflow element called "work order", which is only useful if you want to send something to someone but not receive anything back. Technically, with a lot of custom coding, you can get something back, but there are other ways to do this.
  • Users in a TIM environment can begin to receive a lot of email, especially when there are approvals required and things to do. You can use Post Office Aggregation, which groups email notifications so that users do not get bombarded with email.
  • is where you will store strings that can be used in your workflows.
  • Delegate Authority is a feature in TIM which allows you to transfer your To Do list to someone else for a specified amount of time. The To Do list resides in only one place, so if you delegate authority to someone else, say while you are on vacation, you will not get copies of that To Do list; when you return, items the delegate was seeing will not come back to you, so the delegate must process them. Likewise, before you delegate to someone else, you must process all the To Dos on your list, because items that have not been processed yet will not go to the delegate. Delegation only directs new requests to the delegate.
  • Make sure to have another account besides ITIM Manager in the event someone locks the ITIM Manager account.
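The recon/adoption bullet above is worth a small sketch. This plain-Python snippet (hypothetical account names and aliases; the real matching happens inside TIM during a recon) shows the basic idea of adopting accounts by alias and leaving the rest as orphans:

```python
def reconcile(target_accounts, tim_people):
    """Match account names from the target system against people's
    alias values in TIM. Matches get adopted; the rest are orphans."""
    alias_index = {alias: person
                   for person, aliases in tim_people.items()
                   for alias in aliases}
    adopted, orphans = {}, []
    for account in target_accounts:
        if account in alias_index:
            adopted[account] = alias_index[account]
        else:
            orphans.append(account)
    return adopted, orphans

accounts = ["jsmith", "bjones", "tmp_admin"]
people = {"John Smith": ["jsmith"], "Betty Jones": ["bjones"]}
adopted, orphans = reconcile(accounts, people)
# tmp_admin has no matching alias, so it remains an orphan
```

This is why populating the alias field during the identity feed matters: every alias you pre-load is one less orphan to clean up by hand after the recon.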

As I learn new things or if I find corrections to any of the above I will post again, but that's not bad for the first two days of class. I still have several hours of work on my sandbox to catch up to what we have done in class, but repetition and constant exposure to this product is how you will learn it. I venture to guess it could take at least a few years to learn TIM so I am doing everything I can to spend as many hours as possible with it so that I can learn it faster.

Tuesday, March 13, 2007

TIM 4.6 Basic Implementation Workshop a good bet

Anyone looking to take training on Tivoli Identity Manager 4.6 would be wise to get into these workshops available out here in Costa Mesa. Before attending the Basic Implementation Workshop {Link}, I thought that maybe it wouldn't be a great use of my time, since I've already implemented TIM in a sandbox capacity and participated in its implementation in at least one development environment. I figured, now that I've actually done it, how much more could I learn in a basic course? Furthermore, at a customer site we were already getting into some heavier lifting, like customization of the LDAP adapter. Being at a training would mean I'd miss a week of that really good stuff.

My first day in Costa Mesa convinced me that taking the course was a good idea. First of all, I should explain something. These workshop courses are not done by Tivoli Education; they are done by the Tivoli enablement team. These are the people who get called out to a customer when either the customer themselves or a business partner screwed up. They are sort of rescue and recovery. This team is also as close as you can get to the people who actually write the code, so the folks teaching and developing the course material are actually implementing it in some pretty complex cases. Another interesting point: if the class says it is a workshop, then it is this Tivoli enablement team doing the class, and it is only available in Costa Mesa, CA. That's not to say the other courses are not any good; even the enablement team will say that courses like Extending TIM are good ones to get. It's just nice to get training from people of the caliber of our friend Ram Sreerangam.

The guy teaching this class is Brad Olive. He is the Workshop Manager and has been involved with Tivoli Identity Manager since almost the very beginning. Before IBM acquired the product it was called Access360, and there were even two other companies before Access360. Brad goes back that far.

This class is hands-on. Brad talks about how the product gets deployed in the real world using some slides, then you quickly get to actual exercises. The cool thing is that you don't spend any time installing the products; they are already installed for you. The classroom time is spent configuring the products to work. The first day we built the org tree, fed users into the tree, and configured a simple placement rule, provisioning policy, service, etc. What was cool about Brad teaching this class is that if a feature of the system does not make sense to use, he will tell you straight up not to use it. Here were some of his points:

1.) Service Selection Policy - The web courses talked about how great a feature this was. Brad admitted that yes the idea was really great, but unfortunately this feature has a slight problem. These Service Selection Policies get evaluated every time anything in the system changes. This could be a performance killer. Conclusion... avoid using them.

2.) Org Tree objects like Location and Business Partner are nice if you like the cute little pictures in the org tree to differentiate what they are, but in reality these can complicate your tree design and make things a little harder to find. Conclusion... use Organizational Units or Admin Domains instead.

3.) Static vs. dynamic roles - It's fine to use both, but if you have many dynamic roles you can suffer some performance loss, since these get re-evaluated a lot.
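To make point 3 concrete, here is a plain-Python sketch of the difference. The attribute names and the filter are made up for illustration; a real dynamic role is defined by an LDAP filter such as (departmentnumber=100):

```python
# A static role is just a stored member list, maintained by hand.
static_role = {"Alice", "Bob"}

def dynamic_role(people):
    """Stand-in for a dynamic role's LDAP filter. Membership is
    re-computed from person data every time it is evaluated -- and in
    TIM that re-evaluation happens whenever the data changes, which is
    the performance cost."""
    return {name for name, attrs in people.items()
            if attrs.get("departmentnumber") == "100"}

people = {
    "Alice": {"departmentnumber": "100"},
    "Bob":   {"departmentnumber": "200"},
}
```

The static set never changes until someone edits it; the dynamic set changes on its own as soon as a person's departmentnumber does, which is both its appeal and its cost.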

So even if you have installed TIM and configured it to some degree this is a good class to take. Obviously if you have already taken customers from development to production then this class may be a bit simple, however you would be surprised what you can learn from a basic class. Maybe some things you have been doing all along are now considered bad practice. Brad tells me these classes are constantly being updated to reflect real world practices, so if the enablement team has learned something new about the product along the way, they incorporate this into the class.

Saturday, March 3, 2007

Handy tool for your bag of tricks

My friend Thom Anderson turned me on to this tool a long time ago. I can't believe it took me this long to actually try it out. We are building TIM and TAM on a pSeries (AIX) box, so there is no GUI like you get in a Linux environment. Some of the internal people use other products to help manage their AIX systems. I really have little experience with AIX, but since we are installing on that platform we need an X Windows environment to run the installs of TIM, WAS, TDS, TAM, etc. Cygwin {link} is a pretty easy tool to install and configure. It looked a lot more complicated when Thom showed it to me a while back, but it's not a big deal at all. There is another good web site that describes how to install it here {link} (thanks Andy), so it's pretty straightforward.

What I like about Cygwin is that I don't have to start up KDE or Gnome on my Linux system (saves some horsepower on my VMs) and I can still run all the software that requires X Windows. For SLES 9 you will need to install the OpenSSH packages from Cygwin, because Telnet is not enabled by default. Once you have installed the packages, follow these steps to connect to your SLES 9 machine:

1.) Launch Cygwin

2.) Type startx

3.) To ssh to your linux machine, type:

ssh -Y -l <username> <hostname>

4.) From here you can launch any program that requires X Windows (e.g. the TIM, TAM, TDS, and WAS graphical installations)

Things have been crazy...

Obviously there are these large gaps in time between my postings. Well, let's just say that things have been crazy my first two weeks on the new job. The customer I'm working with right now has security policies in place which prevent me from posting to my blog during the day (and rightfully so), not to mention that when we are deep into configuration of TIM and TAM, the last thing I'm thinking about is this blog. So the only time I can post is after hours. Up until last Friday I didn't have a machine other than my home computer, and since receiving the new computer I've spent most of my nights just installing all the software needed to build out my new sandbox for TIM/TAM. Whew!

I'm still looking for a new home for my blog. I thought I had something all lined up, but that has not materialized yet. I may actually have to switch from a Domino based blog to some free service like Blogger or something. In the meantime, I finally installed VMWare on my new machine and have built a new TIM server (Only took me 2 days this time) and now I'm just getting going on preparing a TAM server.

I was thinking (I should have thought about this sooner) that if I hadn't already built my TIM server, maybe it would have been better to build a DB2/TDS server to hold a single instance with all the databases for TIM and TAM. In other words, the TIM LDAP DB, TIM transaction DB, and TAM user registry DB all on one DB2 server in the same instance, with TDS installed there to run both LDAPs. Then on a separate VM I could install just WAS, TIM, and the DB2 client to connect to the necessary DB2 databases. And then for TAM I would just install the TAM code pointing to the DB2/TDS server for its LDAP. I'm not sure if this would have been a more efficient way to run a TIM/TAM environment on my laptop or not. I have 80GB of disk and 3GB of RAM, so I'm just trying to maximize the horsepower I have. Either way, my TIM server is already done at this point:

host name: tim
Disk: 10GB
TIM v4.6 FP33, IF38
TDS 6 FP3, IF2
DB2 v8.1 (Included with TIM TDS)
WAS 5.1 (with fixpacks included from TIM Suppl)

One thing I noticed when installing this was how fast my laptop performed. When I installed this on a Dell PowerEdge 2650 server (loaded), I remember the WAS 5.1 install took soooo long (hours even), yet on a ThinkPad T60p Core 2 Duo {Review} I blew through the install in under 15 minutes. Not sure how that could be, but I was pleasantly surprised. One thing I did differently this time than the last time I built a sandbox: I used only one DB2 instance for both the TIM LDAP DB and the TIM transaction DB. I wanted to limit the overhead as much as possible, so instead of loading up two separate DB2 instances I figured I would try one instance containing both databases. You might or might not do this in production, depending on the desires of your DB2 admins and the hardware you are running on.

Next job is to build out my TAM sandbox. That's what I'm working on this weekend, in between other household duties. :-)