Wednesday, December 12, 2007

TAM ESSO in a Multi-Domain world

The typical enterprise organization today has multiple domains. It's not unusual, then, to find corporate Windows AD environments with two, ten, or even 20 AD domains in a forest, especially at global companies that have been through several acquisitions. I have even seen one company with upwards of 60 AD domains in their forest.

So, how does one best implement TAM ESSO in this type of environment?

One of my goals is to journal some of these best practices in the coming months. It's unfortunate, but there really is no white paper, red paper, or Redbook describing best practices in a multi-domain environment with this product. It's fairly new to the IBM fold, so I believe the documentation is "on its way". IBM has a pretty good track record for integrating its products, and in my experience support from IBM is first class, but I will be the first to complain when stuff ain't right.

Some things you have to consider:

If you will use AD for your TAM ESSO Repository, where is the schema master?

Is the schema master in the same domain as the user domain?

Where in the organizational structure do you intend to store the SSOConfig container for templates? Will users have access to read this container?

If you are deploying DPRA, where will you choose to store the service account for the DPRA web service? How about the Password Reset account? Keep in mind the reset account must be able to see all the users. What if you have users in different domains?

Every company is set up a little differently when it comes to directory design, and I've seen many variations in security policies as well, so there is no one-size-fits-all approach to deploying any of the IBM security products, let alone TAM ESSO. But there are some common approaches we can take.

First of all, the Windows AD design will be somewhat of a guide. Users need read access to the OU where you are storing their templates. So this pretty much guarantees that you will need to at least create an OU in each user Domain in which you will store templates.

Next, consider whether to use one solution file (the xml file which stores all your template objects in the console) or multiple solution files. It may help to use a different solution file for each domain. This may depend upon how many admins will be administering these templates and whether people in different domains will be using different applications. This can sometimes be the case in some global companies or companies that run different divisions as different AD domains.

Remember that when doing the steps to extend the AD schema, you must choose the schema master. This is often in the forest root domain. Some companies have no users in that domain except for the forest admins.

Replication schedule is a factor. If you extend the schema in the forest root domain, you may have to wait some time before the schema change reaches all the DCs in the child domains. Furthermore, as you start building templates and synchronizing them to various AD domains, keep in mind that it may be a while before these templates replicate to all the other DCs in that domain. So users will not see those templates in the TAM agent for a while.

I'll follow this post up later as I flesh out more detail. Maybe over time I will have journaled enough for somewhat of a best practices document.

Cheers!

Just a follow-up from my last post regarding Win2K schema extension

So, after running through this again, the process of extending the Windows 2000 AD schema for TAM ESSO is a bit quirkier than I thought. It certainly works, and the TAM ESSO product functions as designed afterwards, but you have to jump through one hoop first:

From the TAM ESSO Admin Console click Repository -> Extend Schema

Next, choose Microsoft Active Directory as the Repository Type

Click OK

You will see an error message. Notice that the error occurs after the line "Adding Attributes". If you click Close and then go view the AD Schema with MMC, you will notice that only the 5 new v-Go attributes were created. The classes were not.

Note: If you choose "Microsoft ADAM" in this step instead of "Microsoft Active Directory" the operation will fail with an entirely different error and neither attributes nor classes will be added to the AD Schema.

Now, go back again and click Repository -> Extend Schema

This time choose Microsoft ADAM as the type.

Click OK

This time you will see that the operation was successful. Now if you view the AD Schema with MMC, you will see that not only do the 5 v-Go attributes exist, but also the 4 v-Go classes. Any further operation proceeds as normal.

There are not too many organizations out there still running Windows 2000, but there are still some. I'm not sure what IBM's road map is for continuing to support Windows 2000; currently they claim to support it with SP4.

Thursday, December 6, 2007

When extending Win2K AD Schema choose ADAM

Believe it or not, some companies out there are still using Windows 2000. I had an issue the other day trying to extend the AD Schema for TAM ESSO: it failed trying to add the v-Go classes. Attributes were added just fine, but the classes would not get added to AD, and all I got was a meaningless error message. Choosing ADAM as the Repository Type seems to work fine. I know that choosing Microsoft Active Directory as the Repository Type works for Windows 2003, so maybe it's just a Win2K thing. Hey, whatever works.

Slight miss with Rollup E Fixpack 2

I noticed the other day, while troubleshooting a sync problem between the TAM ESSO Agent and AD, that Fixpack 2 does not update syncmgr.dll. In other words, if you run the MSP to install this fix pack, it will miss syncmgr.dll. I guess it's probably best to apply these fix packs manually. In case you don't already know, Rollup E is version 6.0.05.
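
A quick way to verify whether a fix pack actually touched a given file is to hash it before and after applying the patch. Here's a minimal sketch (plain Python, nothing TAM-specific; the DLL path in the usage comment is just an example install location, not a documented one):

```python
import hashlib

def file_sha256(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def changed_by_patch(before_digest, path):
    """True if the file's current digest differs from its pre-patch digest."""
    return file_sha256(path) != before_digest

# Usage (example path):
# before = file_sha256(r"C:\Program Files\Passlogix\v-GO SSO\syncmgr.dll")
# ...apply the fix pack...
# print(changed_by_patch(before, r"C:\Program Files\Passlogix\v-GO SSO\syncmgr.dll"))
```

If `changed_by_patch` comes back False for syncmgr.dll after running the MSP, you know the fix pack skipped it.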

Tuesday, November 20, 2007

Another cool Tivoli dude!

I always try to make a point of mentioning some of the people I meet every now and then, because the network of Tivoli security folks does not appear to be very large. And in this case it just goes to show it really is a small world out there. I recently created a few PMRs to troubleshoot some issues I was having with TAM eSSO. By luck, the level 2 support guy assigned to the call was a former Buffalonian!

Sey Gan lived right in North Amherst for about 12 years and landed a job with IBM Tivoli some time ago. And lucky for me he is a darn good TAM eSSO guy working level 2 tech support. Sey gave me some great tips and helped me work through a few problems. It's always nice to have good tech support from IBM, but it's even nicer when you discover they are from your own neck of the woods too. Of course I'm sure Sey is very much enjoying the nicer weather out in Costa Mesa, CA. But hey, we in Buffalo don't have forest fires and we certainly don't have any shortage of fresh water! Even better you can still get a pint of beer here for under $3.00.

DPRA url's

I downloaded the latest version of the Desktop Password Reset Adapter from IBM just the other day. It's version 6.00 Rollup D. First of all, since when did they start using this "rollup" terminology? When you install this code, the web pages for the DPRA Management console say 6.0.04 or something to that effect. Even the base code, which today is 6.0.05, is now suddenly referred to as Rollup E. WTF?

Well, let's get to the real point of this post: the documentation. I noticed that the DPRA Client Install Guide still has one of the URLs wrong on page 9. The first time I installed this a while back, this drove me crazy. I finally noticed on the actual web service in IIS that there was no such file as enrollments.aspx. The file name is enrolluser.aspx. Come on guys, can you at least fix the inaccuracies in this documentation already??? And why would they list the URLs in a different order than the one in which the client software asks for them? Luckily, after I figured this out the first time, I made notes for later.

Anyhow, page 9 of the guide lists the URLs (screenshot omitted) in a different order, with the enrollment URL given incorrectly as enrollments.aspx.

Here is what the URLs need to be in the order the client asks for them:

http://host/vgoselfservicereset/resetclient/checkenrollment.aspx
http://host/vgoselfservicereset/resetclient/checkforceenrollment.aspx
http://host/vgoselfservicereset/enrollmentclient/enrolluser.aspx
http://host/vgoselfservicereset/resetclient/default.aspx
http://host/vgoselfservicereset/resetclient/checkstatus.aspx
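
Since the order matters and the file name is easy to get wrong, here's a trivial helper that builds the five URLs in the order the client prompts for them (a sketch; the host name is whatever your DPRA web server happens to be):

```python
def dpra_urls(host):
    """Build the five DPRA client URLs in the order the client asks for them."""
    base = "http://" + host + "/vgoselfservicereset"
    return [
        base + "/resetclient/checkenrollment.aspx",
        base + "/resetclient/checkforceenrollment.aspx",
        base + "/enrollmentclient/enrolluser.aspx",  # enrolluser, NOT enrollments
        base + "/resetclient/default.aspx",
        base + "/resetclient/checkstatus.aspx",
    ]

for url in dpra_urls("host"):
    print(url)
```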

Synchronizing Global Agent Settings can break synchronization???

Just because you can do something doesn’t always mean you should. One of the Tivoli Level 2 support guys from IBM warned me that synchronizing your Global Agent Settings to the repository can lead to issues such as synchronization problems. So synchronizing can cause synchronization problems. Go figure. I guess I won’t do that then.

· If you already synchronized your Global Agent Settings and want to back them out, just right-click the object in the repository (vgoadminoverride in my case) and click Delete


Corrupt TAM eSSO User...

"There was an error trying to load your chosen primary logon method. You may want to reinstall it in order to correct this."

I have a workstation that 4 or 5 different users log on to. All the users except one worked fine. The problem user would log in to the Windows domain, and the above error would be displayed as soon as the TAM agent started to load. To fix this problem I followed these steps.

1.) Log on to the Admin workstation and clean up the server side by deleting the user's credential objects

· TAM eSSO Admin Console -> AD Repository -> Twist Open Users Container -> Twist Open User

· Right-click TAM eSSO object -> Delete


2.) Next, log on to the user's workstation and clean up the agent side

· Start -> Run -> %appdata%

· Delete the Passlogix folder

· Start -> Run -> Regedit

· Delete the Passlogix folder from HKEY_CURRENT_USER\Software\

· In this case it was also necessary to delete the folder under %appdata%\Microsoft\Crypto\RSA. There is normally a single file in this folder, but since this problem had something to do with the logon method, it was necessary to delete this folder as well.

3.) Finally, log off Windows and log back on. TAM eSSO should start correctly.
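
The client-side cleanup in step 2 can be scripted if you end up doing it for more than one user. A rough sketch (Python; the paths come from the steps above, only wipe the Crypto\RSA folder when you are actually hitting this logon-method error, and the registry cleanup still has to be done separately):

```python
import os
import shutil

def clean_agent_side(appdata, wipe_rsa=False):
    """Remove the local TAM eSSO agent data for the current user.

    appdata  -- path to the user's %APPDATA% folder
    wipe_rsa -- also remove Microsoft\\Crypto\\RSA (only needed for the
                'primary logon method' error described in this post)
    Returns the list of folders that were actually removed.
    """
    targets = [os.path.join(appdata, "Passlogix")]
    if wipe_rsa:
        targets.append(os.path.join(appdata, "Microsoft", "Crypto", "RSA"))
    removed = []
    for folder in targets:
        if os.path.isdir(folder):
            shutil.rmtree(folder)
            removed.append(folder)
    return removed

# Usage on the affected workstation (HKCU\Software\Passlogix stays manual):
# clean_agent_side(os.environ["APPDATA"], wipe_rsa=True)
```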

Tuesday, November 13, 2007

LDAP Adapter Gotcha...

The Problem: CTGIMD014I 1 reconciliation entries were not processed for the following accounts: eruid=users.

I had this problem on one of my TIM servers. It was relatively early in the build process for this environment, and I had created a service for an LDAP to which I was provisioning user accounts. Really straightforward stuff here. But I kept getting these recon warnings. The reconciliation would complete with the warning noted above. The trace.log showed several errors like this one:

*********** ENTRY **********
$dn: DN:eruid=users
erLdapContainerName: dc=mydomain,dc=com
objectclass: OBJ:erLDAPUserAccount
cn: users
eruid: users
erAccountStatus: 1
*********** ENTRY END ******

![CDATA[Unable to create orphaned account]]

![CDATA[Thread 2 Encountered an exception processing eruid=users: CTGIMS001E At least one required attribute is missing.]]

So ITIM was trying to orphan a bunch of accounts, and it came across an account called "users". The reason for this, apparently, is that I had created a container in my target LDAP called cn=users and had provisioned my LDAP users into that container. The LDAP Adapter apparently has a problem with the search base being cn=users,dc=mydomain,dc=com.

The Fix:

To fix this I simply re-created the users container as an OU. I deleted the users from the cn=users container, deleted the container itself, and re-created it as an OU. Then I changed the search base on the service, and now everything works fine.
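
For reference, the replacement container is just a plain organizationalUnit entry. In LDIF (using the example domain from above), the new container looks something like this:

```
dn: ou=users,dc=mydomain,dc=com
objectclass: top
objectclass: organizationalUnit
ou: users
```

After adding it, point the service's search base at ou=users,dc=mydomain,dc=com instead of the old cn=users container.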

I know that containers such as cn=something are typically seen more often in AD, but ITDS had no problem letting me create them. Technically, for my solution it really makes no difference, so it was easy enough to just blow away the containers and re-create them as OUs. Just a little gotcha with the LDAP Adapter, I guess. BTW, IBM tech support caught this very quickly where I just didn't see it. Thanks Melvin!

Wednesday, October 31, 2007

Tivoli Directory Integrator - One very cool software product

One of the favorite products of many Tivoli security professionals is Tivoli Directory Integrator. I have to say it's one of the coolest software products I've ever worked with. IBM currently ships it with a number of other products including one of my past passions Lotus Domino.

Anyhow if you are new to Tivoli there's a great wiki here: http://www.tdi-users.org/twiki/bin/view/Integrator/WebHome


Also, I'm finally not the only one blogging about Tivoli, thanks to Eddie's new ITDI blog. At least now there's someone who really knows what they're blogging about:

http://tdiingoutloud.blogspot.com/

Wednesday, October 3, 2007

Helpful Tivoli Webcasts

There are some pretty good technical webcasts available on IBM's Tivoli Support site:

http://www-306.ibm.com/software/sysmgmt/products/support/supp_tech_exch.html

If you haven't already found these, they are quite helpful for those Tivoli topics you are new to. Some are very broad in scope and others are fairly granular. Check them out if you haven't already. I try to visit this site a couple of times per month to register for the upcoming webcasts. What's nice about joining a live webcast is that you can ask relevant questions at the time. All the old ones are posted as well.

Friday, July 20, 2007

Troubleshooting AD Agent - Unable to set some attributes

When you provision users to AD and you have chosen to set attributes from the Terminal Server tab, you may encounter warnings instead of success when viewing completed requests.

The audit log may say it could not set some attributes such as Allow Logon. In my testing it would occasionally complain about not being able to set other attributes related to Terminal Services as well.

If you go to your AD Agent on the AD server and enable detailed logging you may see something like this:

errorMessage="Error setting attribute eradwtsallowlogon. Agent Terminal Server support disabled.">

This is the first clue to the problem. Simply run the agentCfg tool and choose option F (Registry Settings), then option A (Modify non-encrypted registry settings), page down using option D, then set WtsEnabled to 'TRUE'.


Try provisioning your users again, and this time I think you will have more success.

Monday, July 16, 2007

The InstallScript engine is missing from this machine...

If available, please run ISScript.msi, or contact your support personnel for further assistance.


If you are installing the TAM ESSO agent and encounter this message, then it is likely that you are using an older version of ISScript.msi than is required.

I ran into this problem recently and spent a few hours trying to install various versions of ISScript.msi. There is an informative tech note from InstallShield here:

http://consumer.installshield.com/kb.asp?id=Q108158

The problem is that even if you install any or all of the versions mentioned in the tech note, you will still get the error message, because InstallShield is already up to version 12 or thereabouts.

You can debug your msi by running this from the command line:

msiexec /i yourmsi.msi /l*v C:\Temp\MyMSILog.log

In the log you will see which version of isscript.msi it's trying to use. That would be the version you must install.
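
If you don't feel like scrolling through a huge verbose log, you can filter it for the isscript reference. A quick sketch (the exact wording of the log line varies, so treat the pattern as an assumption and adjust it to what your log actually contains):

```python
import re

def find_isscript_lines(log_text):
    """Return all log lines mentioning isscript, in any capitalization."""
    return [line for line in log_text.splitlines()
            if re.search(r"isscript", line, re.IGNORECASE)]

# Usage (MSI verbose logs are often UTF-16, hence the encoding guess):
# with open(r"C:\Temp\MyMSILog.log", encoding="utf-16", errors="ignore") as f:
#     for line in find_isscript_lines(f.read()):
#         print(line)
```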

Since I'm using TAM ESSO 6.0.04, it requires version 11.5 of isscript.msi. I found the correct version of isscript.msi bundled with the TAM ESSO 6.0.04 product, in the Utility folder. The file name is isscript1150.msi.

Thursday, June 28, 2007

How to Change the ITIM Home Page

A few people have sent me e-mails asking how to change the home page that users see when logging in to the ITIM Management screen.

This is pretty easy, although not that obvious. A colleague of mine, Scott Hammons, and I were talking about it this morning, and we decided to throw up a few screen shots with instructions on Scott's blog. If you're interested, here is the link to the instructions:

http://scottswebblog.blogspot.com/2007/06/how-to-change-default-page-in-itim-46.html

Tuesday, June 26, 2007

Anybody searching for Passlogix?

OK, I have not worked with the TAM ESSO product for very long, but one thing that seems quite common when deploying it is having to install, uninstall, and re-install the product a few times as you blow up your test systems trying out templates and MSI files. What is the deal with software that does not uninstall completely?

Every time you uninstall TAM ESSO, you have to go into the Registry and search all over for "passlogix" and delete all these keys manually. You would think by now that someone would get something right with this Windows OS.

Stand-Alone configuration for TAM ESSO

I recently have been doing some work with TAM ESSO, a pretty cool product from Passlogix. The idea is that if you have many applications that users have to remember names and passwords for (client/server, web apps, and even terminal applications), you can use TAM ESSO to manage the user credentials for each of those apps. When you log in to Windows, you also log in to TAM ESSO. TAM will detect when you attempt to access an application that requires a name and password and will log in for you so that you are not prompted. Administrators can build templates for all the corporate apps and deploy these templates on an Active Directory server. AD group membership can then be used to control who has access to which templates.

One of the problems I encountered early this week was in trying to deploy the TAM ESSO client stand alone while including the templates for applications in the stand alone client. The documentation is not very clear on a lot of things including this one so I figured I would go ahead and mention what I had to do.

First, I had created templates for 3 applications (Web, Client/Server, and Terminal). Next, click on Global Agent Settings and, if you haven't already done so, Import from Live HKLM. Then open the End-User Experience -> Environment key and select the check box for "Location of entlist.ini file". Specify the path to the file, including the file name. The default is:

C:\Program Files\Passlogix\v-GO SSO\Plugin\LogonMgr\entlist.ini

There are numerous settings that can be controlled here, so make any other settings you want packaged into the distribution MSI.

Next, to create the MSI click Tools -> Generate Customized MSI. Complete the fields similar to the following screen shot:

The base MSI file should be stored in the location where you extracted the TAM ESSO binaries. You can store your target MSI wherever you prefer. Obviously, if you want application templates to be included in the MSI, then you will want to add the ones necessary for your purposes. The one part that I was not aware of, until working with some other TAM folks, was including the Global Agent Settings. This must be done in order for the application templates to show up in the client agent. Once you click OK, your distribution MSI is complete.

This is definitely not one of the more complex areas of working with TAM ESSO, but it was a bit annoying that the documentation did not seem to spend much time on a stand-alone implementation. For folks who simply want to try it out to get an idea of how it works, there will not necessarily be an AD server, or even ADAM for that matter.

Monday, June 11, 2007

Time Flies when you're busy

The last time I posted here was April 20. So what the heck have I been doing since then? I guess piling up a long list of things to blog about. So here's what I've been up to:

1.) Implementing the Notes Adapter for ITIM
2.) Implement new ITIM Self Care App and configure SSO with TAMeB
3.) Implementing the AD Adapter for ITIM
4.) SSO Between TAM WebSEAL, WebSphere Portal, and Domino (TAI++ vs LTPA)
5.) Attended a handful of sales engagements between NY and Boston
6.) Install and Test TAM eSSO
7.) Implement TIM in a High Availability configuration using WAS Network Deployment and HACMP clustering with AIX.
8.) Work on TAMeB WebSEAL Clustering and HACMP clustering for TAMeB
9.) Implement TIM Self Registration app from ITIM Examples.
10.) Pass the ITIM 4.6 Certification Exam

What I aim to do is post some more detail about these experiences in the coming weeks. It's tough when you spend most evenings actually working. The blog always takes a back seat. Also, now that the weather is nice here in Buffalo, I have to take advantage of that and cross off all the items on the "honey do" list. I just had a large patio poured behind my house. It's nice, but I've had to spend the last two weekends landscaping it. I figure maybe two more weekends before I'm done. I can't seem to convince my wife that I should be keeping my blog up to date instead of landscaping.

Anyhow, more on these topics later!

Friday, April 20, 2007

TIM AD Password Sync Plug-in

If you're considering using the PwdSync plug-in to sync passwords from AD into TIM, there are a few things you need to consider. For one thing, this is an all-or-nothing option: all accounts that ITIM manages will be included when you use the AD Password Sync plug-in. So it is important that there are no systems being managed by ITIM with more restrictive password policies than your AD password policy. Otherwise, users in AD will set their password to something that passes the AD checks but fails when ITIM tries to sync that password to other systems. The problem is that the user will not know about the failure, which will cause more help desk calls.

Also, an important point: SSL is required for this to work. You will need to export a cert from your TIM server and import it into the PwdSync plug-in. My downloads page contains a detailed document that I typed up, with screen shots, explaining a bit more about these issues that I found not very clear in the install guide. Feel free to check it out. Hopefully it will save someone some time.

Downloads {Link}

Tuesday, April 17, 2007

TIM and AD Integration - Group Membership in provisioning

Just as a follow-up to my earlier post about the AD Adapter, I figured out how to provision users into the proper groups in AD. What I eventually figured out is that you can't simply type the names of the groups into the advanced provisioning parameters. The adapter expects to pass the GUID of those groups through to AD, and this GUID needs to be looked up in TIM. So let me take a step back here.

When you get the AD adapter installed and configured, the first time you recon the AD, all the groups from your AD will be imported into the TIM LDAP in the container defined for the AD service you created. If you export one of these group objects from the TIM LDAP, it will look something like my AD Domain Users group here:

dn: eradgroupguid=5fcbe38c66d1f343b7572848a642a8e9, erglobalid=77847836466036
35175, ou=services, erglobalid=00000000000000000000, ou=CA, dc=ca,dc=com
eradgroupguid: 5fcbe38c66d1f343b7572848a642a8e9
objectclass: erADGroup
objectclass: erManagedItem
objectclass: top
description: All domain users
eradprimarygrptkn: 513
eradgroupcn: Domain Users
eradgroupdn: CN=Domain Users,CN=Users,DC=CA,DC=local


So on the entitlement -> Advanced Provisioning Parameters you will need to add some JavaScript which will look up the GUID for the group or groups you want your user to belong to when provisioned to AD. Also, the Primary Group uses a token, represented in TIM by the eradprimarygrptkn attribute. One way to cheat here is to use the standard view of the provisioning parameters and use the search button to find the groups you want. Then switch to the advanced provisioning parameters view and you will see the GUIDs for the groups you chose.
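
To illustrate the cn-to-GUID relationship, here's a small sketch (Python rather than the ITIM provisioning JavaScript, purely for illustration) that pulls the eradgroupcn -> eradgroupguid pairs out of an LDIF export like the one above:

```python
def group_guids_from_ldif(ldif_text):
    """Map eradgroupcn values to eradgroupguid values in an LDIF export."""
    guids = {}
    cn = guid = None
    for raw in ldif_text.splitlines():
        line = raw.strip()
        if not line:  # a blank line ends the current entry
            if cn and guid:
                guids[cn] = guid
            cn = guid = None
        elif line.startswith("eradgroupcn:"):
            cn = line.split(":", 1)[1].strip()
        elif line.startswith("eradgroupguid:"):
            guid = line.split(":", 1)[1].strip()
    if cn and guid:  # the last entry may not be followed by a blank line
        guids[cn] = guid
    return guids
```

Running it over the Domain Users export above would give you the GUID to plug into the advanced provisioning parameters.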



The installation and configuration guide mentions that you can set certain Windows registry keys to change the behavior of the adapter. One of these options is the useGroupCN setting. If you set this to true, then you can reference the common name of the group in your provisioning parameters. This option may make scripting a bit easier.

I'm still having some issues with the Home Directory behavior, but I think the key to that is also in part how I set these registry configurations in the AD adapter. So far though, I have the AD adapter working pretty well in my sandbox system.

Monday, April 16, 2007

Anyone out there ever Integrated Novell with TAM?

Novell has long been used for file and print operations in organizations, and many organizations still use Novell to provide users with their home directories. Somewhere along the way, Novell developed an application which serves up these home directories and files via a web browser. Novell calls this application NetStorage. A fairly basic application like this allows the user to log in with a name and password, view their home directory, and drag and drop files to and from it. Novell also has a product called iFolder. This application allows you to sync your files from the desktop and is also apparently a web-based application. I would like to attempt to protect these applications behind TAM WebSEAL. I'll start with a Windows server, and I have already downloaded the eDirectory 8.7.3 code. Before you install Novell you have to acquire a license, so I have completed the form for the license and am waiting for a response from Novell so that I can try this out.

If anyone out there has ever attempted this, please let me know your results.

TIM 4.6 AD Adapter

I've spent some time with the TIM AD Adapter this weekend. It's pretty easy to set up and get users provisioned with when you want some simple functionality. However, I'm having problems in a few areas. Some of the AD attributes, such as 'Group' and 'Primary Group', appear to be search types in TIM. So if you were provisioning someone manually from TIM to have an AD account, you would click the search button on the AD account form and choose the Primary Group as well as any other groups you want the user to belong to. My problem is that when I try to set these in the Advanced Parameters section of the entitlement form, I always get warnings when the users are provisioned that these attributes cannot be set. The AD installation guide gives no clues as to how these attributes should best be set from TIM.

The other thing I haven't figured out is setting the person's home directory. There are a couple of ways to do this. On the user profile tab in the Active Directory Users and Computers UI you can choose a static home directory, something like c:\users\cahart. So in my TIM provisioning policy, on the advanced parameter list, I place c:\users\%username% for the home directory, and this works fine. However, if I want to use a UNC instead, there are different attrs to set: one for the drive letter you wish to map and one for the UNC path of the share. So on my AD server (ad1) I create a share called users. In my AD user profile it might look like \\ad1\users\cahart, and the drive letter mapped would be H:. When I try to set these attributes in TIM, they do not get set at all when the user is created in AD. I don't get any errors, but the attributes in the user profile just end up blank.
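
Just to pin down the values involved, here's a sketch of the string handling only (the TIM attribute names vary, and the host, share, and drive letter are the examples from this post, not defaults the adapter knows about):

```python
def home_dir_values(username, share_host="ad1", share="users", drive="H:"):
    """Build the static-path and UNC-style home directory values for a user."""
    local_path = rf"c:\users\{username}"              # static home directory
    unc_path = rf"\\{share_host}\{share}\{username}"  # UNC path of the share
    return {"local": local_path, "unc": unc_path, "drive": drive}

print(home_dir_values("cahart"))
```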

In the TIM AD Adapter there are some registry options you can set to TRUE. I've set 3 or 4 of these and so far I haven't seen any difference. As I work with this some more I'll follow up this post.

Tuesday, April 10, 2007

Deploying an Integrated IBM Tivoli Security Solution -- Average at best

So far, I would grade the quality of the IBM training classes for Tivoli Identity Manager at about a C+ on average.

This latest class Deploying an Integrated IBM Tivoli Security Solution was not the class it was advertised to be. Go ahead and click the link and read the course description yourself. I'll tell you below what's wrong with the description:

1.) The first sentence says "Presented as a case study". NOT. There is virtually no lecture whatsoever, which would be a good thing if the labs better explained what the company objectives were. On several occasions I had to ask what the point was of some of the things we were doing in the labs.

2.) The second sentence says "using TIM 4.6, TAMeb 6.0 and TAM for ESSO". NOT. None of these versions of software are included in this course. In this course they use TIM 4.5, TAM 5.1 and there is no integration or mention of TAM for Enterprise Single Sign On (ESSO).

3.) The last sentence of the initial description says "this course shows how to integrate these products to provide services to disparate business units while maintaining security policies." NOT. There was no such discussion illustrating how services are provided to disparate business units.

In the Topics section, where it lists bullet points, the first 3 are covered in this class using the old TIM and TAM software (TIM 4.5 and TAM 5.1). The 4th and 5th bullet points are not covered at all. The last bullet point is maybe partially covered.


So if the inaccurate course description was not enough, the next problem seems to be common with the Tivoli training classes: the machines being used are simply not powerful enough to do this work. In this class each student was issued two desktop PCs, each allegedly with a 2 GHz processor and 1.7 GB of RAM. This should be OK for most of what we needed to do, but the reality is that the machines performed so poorly that it disrupted the learning process. The first day we ran into enough problems that the instructor was forced to re-build new images of the machines after class so that we could re-group and try again the second day. The class ended up finishing early the first day because the machines were reduced to uselessness: the TAM Policy server kept hanging the machine for almost every student in the class.

Day 2 was better, but the machines were still plagued with performance issues, causing us to reset the VMs and, in some cases, leaving our TAM Policy server databases corrupt. We did manage to get through most of the day, completing the required labs by the end of it.

The problem is that we just don't seem to be even close to a real-world use of these products. I have no problem taking the class using old software; I may run into this older software in the field, so I'm fine with working with the older product. But there simply is not enough real-world training in this class, and it is simply being incorrectly advertised as one thing while delivering another.

In one lab we deploy a web application called Mantis (a help desk application) that runs on WebSphere. That's fine; we need some kind of app to put behind TAM. One of the exercises in this deployment has you running migrateEAR5 to externalize the roles and security info from this web application to TAM. There was no lecture explaining that. The labs did not tell you why you were doing what you were doing; like robots, we are supposed to just do what the lab says. Luckily, the instructor was very informative on this topic when I asked him to explain why we were doing this, but he stopped short of really getting into what kinds of things we need to look for in a web application that might make it compatible with TAM. Why couldn't the class demonstrate an ASP/.NET app, do a comparison to the Java app, and show us what we need to look for when considering integrating any web application behind WebSEAL? Not everyone uses WebSphere applications.

I have to say if you are considering taking this course, don't bother. There has got to be a better one.

Sunday, April 1, 2007

If you're looking for a good test application for your sandbox...


IBM has a J2EE benchmarking application for WebSphere that is occasionally used in some of the Tivoli training classes. I noticed this past week in the Extending TIM 4.6 class that some of our labs involved provisioning users to an application called Trade 3. The goal of one lab was to provision the Trade accounts using a customized assembly line in TDI. The Trade application was used in a couple of different labs in the course, and I thought it would be a great application to add to my sandbox.

The Trade application is a simulated stock trading application designed to demonstrate J2EE and to provide a benchmarking tool for your WebSphere Application Server. It requires either DB2 or Oracle for its data and user repository. Trade 3 requires WAS 5, so if your sandbox consists of a single TIM server you could install Trade 3 on that same box using your existing DB2 instance and the WAS that is hosting your TIM application. You can find instructions on setting up Trade 3 here:

http://www-128.ibm.com/developerworks/db2/library/techarticle/0303lau/0303lau.html

To set this application up you will need to download the install files and scripts. There is a readme.html included in the download, but I would recommend you follow the instructions in the technical article instead because they are more complete. Get the Trade 3 install kit here:

http://www-306.ibm.com/software/webservers/appserv/benchmark3.html

Now I'll admit I spent at least 4 or 5 hours Saturday trying to install the Trade 3 application on my TIM server, where I'm running WAS 5. This did not go well, probably (now that I look back) because I was following the readme file instead of the tutorial instructions. One of the steps in the process runs a JACL script that installs resources, setting up connectivity to DB2 and so on. I kept getting errors trying to execute some of the lines from the resource JACL; there was something about a provider1 variable being non-existent.

There is also a Trade 6 application designed to run on WAS 6. Luckily I have a VM in my sandbox running WAS 6, since that is already required for TAM WPM, so I decided to try installing the Trade 6 application on my WAS 6 server. This meant installing yet another DB2 server, which I hadn't planned on, but that is OK because having DB2 on my web box is not a bad idea anyhow if I want to keep my VMs capable of running standalone. I should get the same experience from my ITIM lab exercises this way as I would running the Trade 3 app on my TIM server, although this way I will need to run both my tim VM and my web1 VM at the same time. That's OK.

So you can find Trade 6 here:

http://www-306.ibm.com/software/webservers/appserv/was/performance.html


Definitely follow the tutorial to set this up; it is much better than the readme file included with the download. Again I had some problems with connectivity to DB2. Follow the tutorial to the point of testing the DB2 connection from the WAS Admin console. If you have a problem, it may be the DB2 configuration and the TCP port being used; pay close attention to what you use to name the service/port. The tutorial can be found here:

https://www6.software.ibm.com/dw/education/dm/dm0506lau/index.html

I also saved these files on my downloads page as well:

http://cahart3367.googlepages.com/downloads

So I plan to use the Trade 6 application as a place to test provisioning users as well as a good application I can protect behind TAM WebSEAL. Sounds like Fun!

My TIM & TAM Sandbox design

I consider my professional life with TIM and TAM to be a constant learning experience. With all the middleware included inside the Tivoli Security software you may never completely learn it all. Some people are good at writing code. Others are good at recognizing how to fit business processes into Tivoli Security solutions. Yet other people are good at communicating the value of Tivoli software to customers. Within the Tivoli Security suite of software there are many complex areas to develop skills. I find that in the field some people lean more towards TIM skills and others lean more towards TAM. In either case you will get exposed to TDS, TDI, WebSphere App Server, and DB2. It's quite a challenge. If you are like me you may be involved in both TIM and TAM at the same time and this is fine, but you may find your skills developing more in one over the other simply because of time and personal preference.

I've been in the process over the last several weeks of building a sandbox on my laptop. This is a place where I can prototype things I need to do in the field, and it serves as a place to learn. Since this software requires many hours of hands-on experience to learn, there is no better place to have all the components installed than on your personal machine. Obviously you will need a smoking fast machine to do this, but it can be done.

I have a Lenovo ThinkPad T60p (Core 2 Duo and 3GB RAM). So far I'm using three virtual machines in VMware Workstation 5.5.3. The VMs:

1.) Host Name: tim
OS: SUSE Linux Enterprise 9 SP3
RAM: 1GB

Note: Obviously you would not put all this on one server in production, but this works fine for testing.

2.) Host Name: tam
OS: SUSE Linux Enterprise 9 SP3
RAM: 512MB - 768MB

* I'm running TDI on each of these servers because of some adapter testing. TDI provides the feeds to TIM, which is why I'm running TDI there. However, I plan to test the new TAM connector for TDI, which must be installed where there is a TAM Java Runtime, so I also installed TDI on the TAM server.

3.) Host Name: web1
OS: SUSE Linux Enterprise 9 SP3
RAM: 512MB - 1GB

* To test TAM WebSEAL I wanted a web server with some applications to protect behind TAM. This server is a good place for Web Portal Manager (PDAdmin) and the IBM Trade 6 app, as well as possibly Portal (though I'm not sure I really have enough RAM for that).

I don't always run all 3 VMs at the same time. If I'm working on feed-type stuff I may only run the TIM server. If I'm dealing with provisioning to TAM, I need both TIM and TAM up and running. If I'm testing security between WebSEAL and some web applications, I need TAM and web1 up and running. I have run all 3 VMs at the same time, and as long as I don't have to be reading PDFs or email or anything else at the same time this is no problem. This design works well for me because it's flexible and covers a lot of testing options. Some future VMs I will be working on include Active Directory, for provisioning users to AD, as well as Lotus Notes. I will probably run a Domino server on the same box as AD so that I can test both using the same VM. At 10GB of disk space per VM I may soon run into disk space limitations, so it may be near time to purchase a portable USB drive. We'll see.

Wednesday, March 28, 2007

Adapter Hangs

Every TIM class seems to spend some time working with the standard Linux adapter. It is one of the simplest to work with for basic provisioning and testing, so wouldn't you know it happens to be one that gives me trouble. I'm using what I think is the latest Linux adapter, v4.6.3, which I downloaded from the PartnerWorld web site. For some reason this thing is a little flaky for me. The doc says it is supported on SLES 9, which is what I'm using; I'm running it on the server that happens to be running TIM. Initially it starts up fine: I can run the agentCfg program to configure it, and in TIM I can test connectivity to it just fine. I then tried a recon and it just hangs indefinitely, with no log output regarding the recon in either trace.log or the adapter log. I re-installed the adapter a couple of times with failed tests in between. On the latest attempt I restarted Linux, then before starting TIM I bumped up the logging level in enRoleLogging.properties:

logger.trace.com.ibm.itim.remoteservices.level=DEBUG_MAX

I started the agent and bumped up the logging for the adapter. Then I started up LDAP and TIM. Again I tested the TIM Linux service, and it connects to the adapter just fine. I did a recon, and this time the stupid thing started to work; I completed the recon successfully. But then I tried to get into the agent config:

./agentCfg -agent LinuxAgent

And the tool hangs. Even if I stop and start the agent, I still cannot get into the agent configuration unless I do something more dramatic like rebooting the server. Sometimes if I stop the agent, leave it down for a while, and then start it back up, it seems OK. Voodoo.

Whew! It's a Brain Meltdown!

The Extending TIM 4.6 class is definitely a worthwhile class to take. At first I was a bit skeptical, but days 2 and 3 got pretty involved. In fact, each day the material gets more complex, and the labs are pretty much all JavaScript. You should be very comfortable with TIM basics before you take the class, because when things don't work you do not want to fall behind simply because you don't know where to go to troubleshoot and debug. The amount of time allocated for this class (4 days) is pretty good. We could have made use of a 5th day, I think, but by then you will understand the concepts. Now that I've been in the "weeds" a bit with TIM, I can see that it must be very difficult to develop a course for it. So much of what you do with the product depends on so many different factors, number one being the condition of your data. In the lab, all the sources of identity information (CSV files, DB2 databases, etc.) have been preconfigured with perfectly clean information (no duplicates, all attributes populated), and still things fail to work properly.

If you take the class you'll have lots of workflows to do. I think if you can wrap your brain around ACIs and workflows, the rest is easy. I would definitely include Extending Tivoli Identity Manager 4.6 in your training plan.

Monday, March 26, 2007

Another thing from class today...

TIM is a little strange when it comes to feeding supervisors into the system. It all boils down to the fact that objects in TIM get assigned a random ID as part of their DN, called the erglobalid. So in order to specify who a supervisor is on a person's record, you must first create a record for that supervisor in TIM. It's sort of a chicken-or-egg problem. To solve it, users are fed into TIM in two phases. The first round brings all users into TIM regardless of whether they are management or not. The next phase updates the supervisor attribute for all the users. To do that, your second feed into TIM must do a lookup against the TIM LDAP to resolve what the manager DN actually is. It might look something like:

erglobalid=8587940495056130390,ou=0,ou=people,erglobalid=00000000000000000000,ou=CA,dc=ca,dc=com

So the assembly line must get an attribute about the manager that it can use in a lookup to retrieve this DN, which then gets populated in the manager attribute in the TIM LDAP. Keep in mind that in TIM it is referred to as ersupervisor, but it's really manager on the back end. In my sandbox system, my employee feed source contains the employee ID of the manager in the user record, so when I run an assembly line to update the manager attribute in TIM, I simply look up the DN of the person whose employeenumber attribute in TIM matches the manager attribute in Work. The DN I get back is what the feed populates into TIM. This is done using the DSMLv2 event handler.
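Since TDI assembly line scripting is JavaScript anyway, the two-phase lookup above can be sketched as a toy simulation. Everything here is illustrative: the array stands in for the TIM LDAP, and the DNs and employee numbers are made up; a real assembly line would use an LDAP connector lookup instead.

```javascript
// Simulated two-phase manager-DN resolution, in the spirit of the TDI
// assembly line described above. All names and DNs are made-up
// illustrations, not real TDI connector calls.

// Phase 1 has already loaded everyone into TIM. This array stands in
// for the TIM LDAP, keyed by employeenumber.
var timLdap = [
  { employeenumber: "1001",
    dn: "erglobalid=111,ou=0,ou=people,erglobalid=000,ou=CA,dc=ca,dc=com" },
  { employeenumber: "2002",
    dn: "erglobalid=222,ou=0,ou=people,erglobalid=000,ou=CA,dc=ca,dc=com" }
];

// Phase 2: the feed carries the manager's employee ID on each user
// record; resolve it to the manager's DN in TIM.
function resolveManagerDN(managerEmployeeNumber) {
  for (var i = 0; i < timLdap.length; i++) {
    if (timLdap[i].employeenumber === managerEmployeeNumber) {
      // This DN is what gets written to manager (ersupervisor in TIM).
      return timLdap[i].dn;
    }
  }
  return null; // manager not loaded yet, so phase 1 is incomplete
}
```

The null case is exactly why the feed has to run in two rounds: if the manager's record is not in TIM yet, there is no DN to write.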

In class today we did the feed a bit differently. While the initial feed used the event handler, the second feed, for the managers, was done using TDI to push people right into LDAP. There were a couple of lookups to LDAP to resolve the DNs of the user and then the manager, then one final update connector to the LDAP to update the record.

It seems like none of these training classes really go into which solutions work best in which scenarios. So far none of the classes I've been to talk much about the best way to develop a solution for a particular case. Supposedly the Advanced ITIM 4.6 Implementation Workshop gets into that level of work. We shall see, because I plan on being there.

Day one at Extending TIM 4.6

I'm on the fence so far about the Extending TIM class. The first day was pretty much all review of LDAP and basic TIM material, so I have not learned much yet. The latter half of the day we did labs, which was OK because we worked on ITIM data feeds using TDI. One good thing was that we configured a different way to feed manager attributes into TIM than the way I've seen before. I also had problems with my provisioning policy due to a typo in some JavaScript in TDI. It didn't take long to find that, but my TDI feed still didn't work right. The feed connects to a CSV file to pull the typical user attributes like cn, department, sn, title, and hire date. Then the AL connects to a DB2 table to look up the person's security clearance. The solution implements a typical TIM event handler, so that when you recon the TDI feed service in TIM, the users get pulled into the TIM org tree. There was a sample placement rule for the lab that placed users in the org tree based upon their departmentnumber attribute.

Problem: In addition to the typo in the JavaScript, I also had a typo in the name of one of the OUs. So when I ran the assembly line, most of the users were placed into TIM correctly, except for the users who belonged in the OU that was spelled wrong. I fixed the misspelling and re-ran the AL. No go; the users still did not get placed properly. I restarted TIM and tried again, and it still failed. Then I restarted TDS and TIM and tried again. Still failed. I checked the TDI assembly line, restarted the event handler, and still no go. So finally I decided to delete all the users in TIM who had been placed at the top level. When I ran the recon again, this time it worked.

Solution: Sometimes you just have to delete the people who are in the tree incorrectly and re-run the reconciliation. Go figure. This is definitely something I learned today that I would never have learned without first having a typo in an org unit!
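The failure mode above can be sketched in a few lines. This is not the lab's actual placement rule; the OU names and the fallback behavior are illustrative, based on what happened in the lab (users whose OU did not resolve ended up at the top of the tree).

```javascript
// Illustrative sketch of placement-rule behavior. The OU names and
// the function itself are made up; a real TIM placement rule is a
// small JavaScript expression evaluated during the feed.
var orgTree = ["ou=Engineering", "ou=Marketing", "ou=Finance"];

// If the rule resolves to an OU that exists, the user lands there;
// otherwise the user ends up at the top level of the tree, which is
// what the misspelled OU caused in the lab.
function placeUser(targetOu) {
  return orgTree.indexOf(targetOu) >= 0 ? targetOu : "TOP_LEVEL";
}
```

One misspelled OU name silently drops a whole department at the top level, and fixing the spelling alone does not move the already-misplaced users; they have to be deleted and re-reconciled.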

Sunday, March 25, 2007

Welcome to my Blogs New Home

Since I changed jobs last month I've been looking for a new home for my blog. I used to host this at my former employer on a Domino server. I used to be a Lotus Domino Administrator so running my blog on a Domino box made sense to me. And Domino is a great platform for collaboration apps like blogs so it was a "no brainer". As gracious as my former employer was in continuing to host my blog for a while after my leaving, it was still necessary to move it.

I had something all lined up to host it on another Domino server run by a different company and then there was the option of hosting it at my new employer's Domino server as well. All this indecision is why there have been so few postings to this blog lately. This content resides at my old employer, my new employer and now here not because I like to copy this content multiple times, but because I was hoping to continue running this on Domino.

I switched to Blogger because Google offers some pretty cool functionality, along with the new Googlepages, all for free. So I can create regular web pages, store attachments like documentation I compile, cheat sheets, and code snippets on Googlepages, and link to that content from the blog. These tools are actually faster than the experience I was having with the Domino blog using my Notes client as the blog admin tool. Still, it was nice having the blog self-contained in a single NSF file, but I decided to give Blogger a whirl. I'm also looking into creating a TWiki that I can link to the blog and my Googlepages.

Now that the blog has finally moved I look forward to posting more again.

Tuesday, March 20, 2007

Update to SSO for WPM and WebSEAL

I revisited some of my past posts on this topic, and I suspect I may have failed to mention some of the steps I took to make this work. Part of the issue is that IBM has a document on how to do SSO for WPM and WebSEAL, but it covers TAMeB 5.1. I'm using the newer stuff, so I needed documentation on creating a junction to WAS 6 using TAI or TAI++, plus I had to figure out how to do SSO with WPM. When I first tried this I attempted to use TAI. It almost worked, but I kept having problems logging in as sec_master; I could log in as other TAM users just fine. I then tried TAI++ and everything worked great. In my customer environments we will be using a TAM authorization server, so I made sure to set up an auth server in my sandbox (development) environment as well; I'm not sure you will be able to follow the new instructions if you don't have an authorization server. The jury is still out on that. Anyhow, I uploaded a PDF to my Googlepages here if you're interested.

Wednesday, March 14, 2007

Day one and two at ITIM 4.6 Basic Implementation Workshop

In the first two days of the basic implementation workshop we have already covered a lot of ground. It's almost moving too fast, so for the last two nights I have brought my book back to the hotel and gone over the exercises again using the sandbox environment on my laptop. Here are some bullet points I've picked up in the first two days:

  • Do not touch the TIM LDAP or the TIM database directly for any reason. In other words these two things are off limits to any other system besides TIM. While it may be tempting to let some system connect up to the LDAP make a query or something, it is highly discouraged by the Tivoli folks.
  • The default page in TIM when any user logs in is the change password screen. This may be confusing to people and make them think that they must change their password. This screen can be changed.
  • Changing UIDs is very difficult if not nearly impossible, so avoid doing it. An ideal choice for a UID is something that will not change, like employee number. Do not use first initial + last name (FI LastName).
  • Referral attributes like Supervisor or Manager will require two loads into TIM. The supervisors need to exist in TIM before you can populate the supervisor (ersupervisor) attribute.
  • The password policy you choose for TIM should be the same as the password policy on your target systems. It is possible to make them different, but be careful: when a user is required to change their password, a password that is allowed in TIM may not be allowed on the target resource, or vice versa.
  • In TIM you can configure how to handle non-compliant accounts. The choices are Notify, Mark, or Correct. Be very careful when configuring TIM to correct non-compliant accounts: by simply removing a role from someone you may cause TIM to delete many accounts on some target resource, and de-provisioning accounts may mean different things on different target resources. You should choose to Mark non-compliant accounts instead of correcting them, at least until you are completely comfortable with TIM. And that may never happen. :-)
  • When you don't want a user to have an account, but you cannot change the Role and you cannot change the Provisioning Policy, simply suspend the account.
  • In TIM 4.6 some restrictions in provisioning policies from past versions have been removed, so you really no longer need to use Locations or Business Partner Locations in your organization tree. Keep this in mind when designing the tree, because when you have to search for people you often have to choose the category of people you are looking for; it's sometimes easier if they are all the same.
  • Static vs. dynamic roles: static roles are simple to set up, but you must maintain them manually. Dynamic roles are automatic (they use an LDAP search to build membership), but they can cause slower performance because they are constantly being re-evaluated.
  • If you have customized any service profile forms, make sure to back up the service profile before you reload the form in TIM. I'm not sure how often you would need to reload a service profile, but if you did, you would blow away any customization you made to the form earlier.
  • For any target systems you will manage with TIM designate a Service Owner. This way when setting up workflows you can have requests routed automatically to those people who actually manage that resource. Obviously then those people would require a TIM account.
  • I still have to verify this one yet, but a Provisioning Policy has to exist in the same container as the service it pertains to.
  • Service Selection Policies get evaluated anytime anything in TIM changes. This will result in poorer performance. Avoid using these.
  • When two policies for the exact same service apply to you, the one with the higher priority wins. (A lower number means higher priority.)
  • There is a recycle bin in TIM. Anytime you delete an account, it goes into the recycle bin. This is used internally by TIM and accounts are kept there for 62 days.
  • When doing a recon your goal is to have 0 orphaned accounts. To help with that TIM by default will match up the account name on the target system with what is in the alias field in TIM. This is a nice feature to help minimize orphans. When you are feeding people into TIM use the alias field and populate it with what is likely to be the account names for your target systems. TIM will try to match these up during a recon and those accounts that match will be adopted.
  • There are two different kinds of workflows: provisioning and operational. Workflows can also be global or profile-specific. There is a workflow element called "work order", which is only useful if you want to send something to someone but not receive anything back. Technically, with a lot of custom coding, you can get something back, but there are other ways to do that.
  • Users in a TIM environment can begin to receive a lot of email, especially when approvals are required and there are things to do. You can use Post Office aggregation, which groups email notifications so that users do not get bombarded with email.
  • CustomLabels.properties is where you store strings that can be used in your workflows.
  • Delegate Authority is a feature in TIM that lets you transfer your To Do list to someone else for a specified amount of time. The To Do list resides in only one place, so if you delegate authority to someone else, say while you are on vacation, you will not get copied on that To Do list. When you return, items that the delegate was seeing will not come back to you, so the delegate must process those To Dos. Likewise, before you delegate to someone else, you must process all the To Dos on your list, because items that have not been processed yet will not go to the delegate. Delegation only directs new requests to that person.
  • Make sure to have another account besides ITIM Manager in the event someone locks the ITIM Manager account.
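Two of the notes above, policy priority (lower number wins) and alias-based adoption during a recon, lend themselves to a quick sketch. This is a toy simulation with made-up data and function names, not TIM code:

```javascript
// Toy simulation of two TIM behaviors noted above. All data and
// names are illustrative only.

// When two provisioning policies cover the same service, the one
// with the lower priority number wins.
function winningPolicy(policies) {
  return policies.reduce(function (best, p) {
    return p.priority < best.priority ? p : best;
  });
}

// During a recon, an account whose name matches someone's alias is
// adopted by that person; anything unmatched becomes an orphan.
function findOrphans(accountNames, people) {
  return accountNames.filter(function (name) {
    return !people.some(function (person) {
      return person.aliases.indexOf(name) >= 0;
    });
  });
}
```

The second function is why populating the alias field during the people feed matters: every account name that matches an alias gets adopted automatically, and only the leftovers show up as orphans.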

As I learn new things or if I find corrections to any of the above I will post again, but that's not bad for the first two days of class. I still have several hours of work on my sandbox to catch up to what we have done in class, but repetition and constant exposure to this product is how you will learn it. I venture to guess it could take at least a few years to learn TIM so I am doing everything I can to spend as many hours as possible with it so that I can learn it faster.

Tuesday, March 13, 2007

TIM 4.6 Basic Implementation Workshop a good bet

Anyone looking to take training on Tivoli Identity Manager 4.6 would be wise to get into these workshops available out here in Costa Mesa. Before attending the Basic Implementation Workshop {Link}, I thought that maybe it wouldn't be a great use of my time, since I've already implemented TIM in some sandbox capacity and participated in its implementation in at least one development environment. I figured that now that I've actually done it, how much more could I learn in a basic course? Furthermore, at a customer site we were already getting into some heavier lifting, such as customization of the LDAP adapter. Being in training would mean I'd miss a week of that really good stuff.

My first day in Costa Mesa convinced me that taking the course was a good idea. First I should explain something. These workshop courses are not run by Tivoli Education; they are run by the Tivoli enablement team. These are the people who get called out to a customer when either the customer or a business partner has screwed up; they are sort of rescue and recovery. This team is also as close as you can get to the people who actually write the code. So the folks teaching and developing the course material are people actually implementing the product in some pretty complex cases. Another interesting point: if a class is billed as a workshop, it is this Tivoli enablement team that runs it, and it is only available in Costa Mesa, CA. That's not to say the other courses are not any good; even the enablement team will say that courses like Extending TIM are good ones to take. It's just nice to get training from people of the caliber of our friend Ram Sreerangam.

The guy teaching this class is Brad Olive. He is the Workshop Manager and has been involved with Tivoli Identity Manager almost from the very beginning. Before IBM acquired the product it was called Access360, and there were even two other companies before Access360. Brad goes back that far.

This class is hands-on. Brad talks about how the product gets deployed in the real world using some slides, then you quickly get to actual exercises. The cool thing is that you don't spend any time installing the products; they are already installed for you. The classroom time is spent configuring the products. The first day we built the org tree, fed users into the tree, configured a simple placement rule, a provisioning policy, a service, and so on. What was cool about Brad teaching this class is that if a feature of the system does not make sense to use, he will tell you straight up not to use it. Here were some of his points:

1.) Service Selection Policy - The web courses talked about what a great feature this was. Brad admitted that yes, the idea is great, but unfortunately the feature has a problem: service selection policies get evaluated every time anything in the system changes, which can be a performance killer. Conclusion... avoid using them.

2.) Org Tree objects like Location and Business Partner are nice if you like the cute little pictures in the org tree to differentiate what they are, but in reality these can complicate your tree design and make things a little harder to find. Conclusion... use Organizational Units or Admin Domains instead.

3.) Static vs Dynamic Roles - It's fine to use both, but if you have many dynamic roles you can sometimes suffer a performance hit, since these get re-evaluated a lot.

So even if you have installed TIM and configured it to some degree, this is a good class to take. Obviously if you have already taken customers from development to production then this class may be a bit simple; however, you would be surprised what you can learn from a basic class. Maybe some things you have been doing all along are now considered bad practice. Brad tells me these classes are constantly updated to reflect real-world practices, so if the enablement team has learned something new about the product along the way, they incorporate it into the class.

Saturday, March 3, 2007

Handy tool for your bag of tricks

My friend Thom Anderson turned me on to this tool a long time ago; I can't believe it took me this long to actually try it out. We are building TIM and TAM on a pSeries (AIX) box, so there is no GUI like you get in a Linux environment. Some of the internal people use other products to help manage their AIX systems. I have little experience with AIX, but since we are installing on that platform we need an X Windows environment to run the installers for TIM, WAS, TDS, TAM, etc. Cygwin {link} is a pretty easy tool to install and configure. It looked a lot more complicated when Thom showed it to me a while back, but it's not a big deal at all. There is another good web site that describes how to install it here {link} (thanks Andy), so it's pretty straightforward.

What I like about Cygwin is that I don't have to start up KDE or GNOME on my Linux system (which saves some horsepower on my VMs) and I can still run all the software that requires X Windows. For SLES 9 you will need to install the OpenSSH packages from Cygwin, because Telnet is not enabled by default. Once you have installed the packages, follow these steps to connect to your SLES 9 machine:

1.) Launch Cygwin

2.) Type startx

3.) To ssh to your Linux machine, type:

ssh -Y -l <username> <hostname>

4.) From here you can launch any program that requires X Windows (i.e. TIM, TAM, TDS, WAS graphical installations, etc.)

Things have been crazy...

Obviously there are large gaps in time between my postings. Let's just say things have been crazy my first two weeks on the new job. The customer I'm working with has security policies in place that prevent me from posting to my blog during the day (and rightfully so), not to mention that when we are deep into configuring TIM and TAM, the last thing I'm thinking about is this blog. So the only time I can post is after hours. Up until last Friday I didn't have a machine other than my home computer, and since receiving the new computer I've spent most of my nights just installing all the software needed to build out my new TIM/TAM sandbox. Whew!

I'm still looking for a new home for my blog. I thought I had something all lined up, but that has not materialized yet. I may actually have to switch from a Domino based blog to some free service like Blogger or something. In the meantime, I finally installed VMWare on my new machine and have built a new TIM server (Only took me 2 days this time) and now I'm just getting going on preparing a TAM server.

I was thinking (I should have thought about this sooner) that if I hadn't already built my TIM server, it might have been better to build a DB2/TDS server holding a single instance with all the databases for TIM and TAM: the TIM LDAP DB, the TIM transaction DB, and the TAM user registry DB all on one DB2 server in the same instance, with TDS installed there to run both LDAPs. Then on a separate VM I could install just WAS, TIM, and the DB2 client to connect to the necessary DB2 databases. And for TAM I would just install the TAM code, pointing to the DB2/TDS server for its LDAP. I'm not sure whether this would have been a more efficient way to run a TIM/TAM environment on my laptop. I have 80GB of disk and 3GB of RAM, so I'm just trying to maximize the horsepower I have. Either way, my TIM server is already done at this point:

host name: tim
OS: SLES 9
Disk: 10GB
Components:
TIM v4.6 FP33, IF38
TDS 6 FP3, IF2
DB2 v8.1 (Included with TIM TDS)
WAS 5.1 (with fixpacks included from TIM Suppl)

One thing I noticed when installing this was how fast my laptop performed. When I installed this on a Dell PowerEdge 2650 server (loaded), I remember the WAS 5.1 install taking soooo long (hours even), yet on the ThinkPad T60p Core 2 Duo {Review} I blew through the install in under 15 minutes. Not sure how that could be, but I was pleasantly surprised. One thing I did differently this time is that I used only one DB2 instance for both the TIM LDAP DB and the TIM transaction DB. I wanted to limit the overhead as much as possible, so instead of loading two separate DB2 instances I tried a single instance containing both databases. You may or may not do this in production, depending on the preferences of your DB2 admins and the hardware you are running on.

Next job is to build out my TAM sandbox. That's what I'm working on this weekend, in between other household duties. :-)

Tuesday, February 20, 2007

DuPont Data Theft

I always say if you can't trust your IT staff then you've got a problem. Then again, it's not about trusting the people, it's more about trusting that the controls in place are enough to keep the honest people honest. Just like the locks on your doors. If someone really wants to break in badly enough then they will likely find a way, but if you leave the doors unlocked then maybe even an honest person will be tempted.

The DuPont thing {Link} is so sleazy a company really would have to be desperate to hire someone who stole so much data from his prior employer. I'm not saying that this guy Min's new employer knew what he had done, but it's pretty strange that four months after signing on with the new employer he was still working for the competition. This is why Identity and Access Management is so important. Give people just enough access to the applications they need to do their jobs and nothing more. But beyond Identity and Access Management, monitoring is critical for detecting anomalies like the ones mentioned in the DuPont story.

Crossed over to the other side...

So last week I was an IBM customer. This week I am an IBM Business Partner. Or another way of putting it I was a customer, now I'm a consultant. New role, new hat, new perspective on many things.

Day1:
My first day on the new job was your usual orientation type of thing. How to access the various HR and housekeeping applications, how time is managed, who the important people are when you need help, etc. It was a very productive day. One of the highlights was a fabulous Chinese restaurant, P.F. Chang's China Bistro. Now usually I would prefer a local restaurant to a chain, but this was outstanding. Chang's Spicy Chicken was sooo good. Just melted in your mouth. They have these lettuce wraps with, I think, chicken or pork, and you literally place the mixture in these little lettuce "cups" and wrap it up like a tortilla. Then we had the Pin Rice Noodle Soup, which was awesome. It had a good kick to it, but not too much. The best part of P.F. Chang's is that you leave feeling satisfied, not sick like I often do after eating from a typical fast-food Chinese restaurant. I wish we had a P.F. Chang's here in the Buffalo area. It's well worth it if you ever find one nearby.

Day2:
Right into the fire, my second day was at a customer site. This is where I will likely be for quite a while aside from a bunch of scheduled training I have between now and July 1. So I hooked up with most of the project team from my company and partner companies along with some folks employed by the customer. Quite a team of people here. So far we are well into architecture with a Tivoli architect (ITIM) guy, a TAM guy (IBM), some project managers (our side, customer side) and an army of various technical and managerial folks. I'm shadowing for now to build on the experience and knowledge I already have and hopefully I'll be contributing during the implementation phase.

So off I go learning as much as I can about the needs of my new customer and how we will solve those needs with Tivoli software.

Thursday, February 15, 2007

Career move just a few days away...

Changing jobs is no small feat when you are leaving on good terms. My last day is tomorrow, 2/16/07, and I've spent the last 2 weeks wrapping up the architecture phase of the Identity Management project here, trying to transfer knowledge, files, company-owned gadgets, etc., to the people who need them. This is certainly a big job when you have accumulated 10 years of data and whatnot. Just updating everyone I know with my new contact information is a big job. I think I've worked 8:00am until midnight for the last 2 weeks straight. I did take some time off this past Saturday to take the kids sledding, though.

Some time ago I committed myself (no, not to an asylum) to Identity and Access Management. I took the plunge about 1.5 years ago when our organization wanted to build an enterprise LDAP, which included aggregating identities from over 100 customer organizations, each with slightly different technologies and standards for maintaining their user accounts. This was a big jump for me because I had spent the prior 7 or 8 years dedicated to Domino and a few before that to Novell. But it sounded like a great challenge, so I started with Tivoli Directory Integrator (a truly remarkable product, in my opinion) to build assembly lines connecting to Novell, Active Directory, and Domino Directory servers and pulling user accounts into Tivoli Directory Server.

But, I said, "What about the users' passwords?" We can pull all these identities into an enterprise LDAP just fine, but we will ultimately be talking about over 200,000 users. How will we maintain a password policy? How will we convey to these users what their initial user name and password is and prompt them to change it after their first login? That's a job for Tivoli Identity Manager.

Now, to take a step back, why would we build this enterprise LDAP in the first place? WebSphere Portal and any portal-delivered applications are the answer. The primary driver for building the LDAP was to allow all of these users to log in and access the applications we intend to roll out. Early on we knew that TIM and TAM would be required, so when we finally purchased that software and started to install and play around with it, a few things became clear to me:

1.) I had a lot more to learn about the Tivoli software
2.) This project was going to be a long and hard one

Both of these things spell challenge to me, and I really love a challenge. So I committed to learning all I can about TIM, TAM, TDS, and the various ways to deal with Identity Management. But becoming an expert in this field will require something more than I can get by staying here in this job. As much as I loved my current position, it's going to take a different type of job for me to really learn this stuff. Alas, I am leaving my role in K-12 education and joining IBM Premium Business Partner Strategic Computer Solutions, Inc. out of Syracuse, NY. With this new position, I'll be on an accelerated pace of learning and I'll be exposed to more experts in the field than if I stayed here. I anticipate the change to be a positive career move which should enable me to make a difference in many more ways down the road. I'm psyched about the new opportunity and look forward to adding value to the Tivoli brand and of course the SCS team.
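The identity-aggregation idea I described above can be sketched in a few lines of plain Python. To be clear, this is not how TDI works (TDI builds graphical AssemblyLines with connectors), and the source data, column names, and merge rule here are all invented for illustration. It's just the core pattern: normalize records from differently-shaped sources into a common schema, then merge them into one registry keyed on a shared attribute.

```python
import csv
import io

# Hypothetical exports from two source directories. In the real project
# these would be pulled live over LDAP by TDI AssemblyLines, not from CSVs.
NOVELL = "uid,fullName,mail\njdoe,John Doe,jdoe@example.org\n"
AD = ("sAMAccountName,displayName,mail\n"
      "jdoe,John Doe,jdoe@example.org\n"
      "asmith,Ann Smith,asmith@example.org\n")

def load(src, uid_col, name_col):
    """Normalize one source's rows to a common schema, keyed on mail."""
    out = {}
    for row in csv.DictReader(io.StringIO(src)):
        mail = row["mail"].lower()
        out[mail] = {"uid": row[uid_col], "cn": row[name_col], "mail": mail}
    return out

def aggregate(*sources):
    """Merge the per-source dicts; the first source seen wins on conflict."""
    registry = {}
    for source in sources:
        for mail, entry in source.items():
            registry.setdefault(mail, entry)
    return registry

registry = aggregate(load(NOVELL, "uid", "fullName"),
                     load(AD, "sAMAccountName", "displayName"))
print(len(registry))  # jdoe exists in both sources but is merged once
```

The merge key and conflict rule ("first source wins") are the interesting design choices; with 100+ source organizations you'd spend most of your time deciding exactly those two things.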

So, I'm working on moving this blog so that I can continue posting my experiences with the Tivoli security software along the way. Hopefully my postings will somehow help someone else who has to go through the same learning curve I am going through. And since there aren't really any bloggers out there covering TIM and TAM, hey, it's something new to talk about. OK, not exactly as exciting as Notes and Domino, but middleware is not all that glamorous. So what.

Monday, February 5, 2007

VA déjà vu

It happens again and again. People's identity data is compromised by the loss of a computer or hard drive, or some system gets hacked. In fact, for the VA this is tragically the second time in only months. See Computerworld {Link}. It's just unbelievable to me that these employees need to work with thousands of records on a local drive for any reason. First of all, the article isn't clear on how many records were compromised. Is it 48,000 or 20,000? I wish these articles would explain why the records were on an external hard drive in the first place. What project could this person have been working on where he or she could not access the records as needed from a secured database? Why aren't the records on a server in a locked data center? And going back to May 2006, why was an employee carrying around 26 million records on a laptop? Now that would be helpful information in the article.

When I hear about these breaches, it makes me stop and think about the project I've been involved in for the last 2 years and the upcoming projects where I'll be dealing with employee identity data. When developing a TDI assembly line to pull user attributes from one system to another, it's very common to test it with simple CSV or flat files as the source or destination of that data. I remember developing an assembly line to read thousands of users from one system and write the records to an LDAP. To test this I would first output the data to a file. This testing might occur many times over, and those files may end up in various directories of my computer (a laptop), which undoubtedly would go home with me at night. Maybe I'll stop at the gym on the way home, or the grocery store. Next thing you know my car gets broken into and I'm the next cause of a security breach at my company.

Well, luckily I don't actually keep these files on my laptop. Also lucky for me, I don't happen to be dealing with personal information. But these security breaches have to make you stop and think if you are like me and deal with identity data from time to time. I guess the simple lesson is: don't keep information like this on your machine. Make up a pile of bogus users if you have to test your assembly lines. If you need to test with real users, do it on a secured machine that won't be sitting on the back seat of your car after 5:00pm.
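Making up that pile of bogus users is easy to script. Here's a minimal sketch in Python; the field names and file name are invented for illustration, not tied to any real schema, and the "passwords" are throwaway strings, never real credentials.

```python
import csv
import random
import string

def bogus_users(n, seed=42):
    """Generate n fake user records -- safe to leave on a laptop."""
    rnd = random.Random(seed)  # fixed seed so runs are repeatable
    for i in range(n):
        uid = "testuser%04d" % i
        yield {
            "uid": uid,
            "cn": "Test User %d" % i,
            "mail": "%s@example.org" % uid,
            # throwaway initial password, not a real credential
            "password": "".join(rnd.choice(string.ascii_letters)
                                for _ in range(10)),
        }

def write_csv(path, n):
    """Write the fake records to a CSV file for assembly-line testing."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["uid", "cn", "mail", "password"])
        writer.writeheader()
        writer.writerows(bogus_users(n))

write_csv("bogus_users.csv", 1000)
```

Point your test assembly line at a file like this instead of a real extract, and a stolen laptop costs you some hardware instead of a breach disclosure.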