Is Your Gym Like Your Dev Team?

I recently managed to drag myself out of the office and into the gym, but unfortunately my mind was still on the office and my observations of what makes a dev team tick. In between sets I watched my fellow gym-goers and noticed similarities with my experiences of IT development teams. Parallels between your developer teams and the local gym can be made both in terms of personas and in the approaches to training:

Exercise Machines vs. Free Weights:

When the Nautilus training machines (variants of which now fill every gym) appeared in the early 1970s they facilitated a revolution in exercise training and professional gyms. In contrast to free weights (barbells, dumbbells, etc.) exercise machines provide a convenient and safe way of training. They don’t require a spotter and they force the user through the ‘correct’ range of motion to avoid injury, also enabling forced reps and the like. Like software frameworks, these machines are built by expert engineers using solid (but also opinionated) ideas. Both enable new starters to get going easily and safely, but they also share the same trade-offs. Machines/frameworks can lack fluidity and shelter the user from needing to understand the underlying principles at work. If the machine is out of service the gym user may not appreciate that they can achieve the same results via other methods; remove the abstraction that the framework provides to the software developer and it may expose a lack of underlying skills (e.g. an ASP.NET Web Forms developer not appreciating HTTP). Of course the ‘correct’ way of doing something is always debatable and may not suit your needs for every project. Interestingly, some pure "bodybuilders" refuse to use machines for snobbish reasons even when they would prove useful, whilst the majority of other gym users only use machines. The same can be said for developers and frameworks. An experienced all-rounder will happily use machines/frameworks where they are useful for productivity but will also utilise free weights/alternative methods to achieve specific requirements.

Metrics:

Ask any professional athlete or bodybuilder how many calories they consume, or the weights/reps/sets in their last gym session, and they’ll tell you in detail. This is because they know the value of recording metrics and how to use them to track progress. The same principles can be applied to software development teams. What’s your current burn-down rate? What’s the average code churn figure for a nightly build? How many hours of effort really went into building that MVC view compared to the estimate? A productive team that is continuously improving will be using these metrics to drive progress.

Agility:

Whilst solid athletes measure and plan, they are also agile in their training, because they have to be. They have to adapt to changing training environments and to the subtle messages from their bodies to avoid injury and maintain productivity, all the while focusing on the end goal. You wouldn’t expect them to stick rigidly to a plan defined months before despite changing circumstances (e.g. injuries or soreness); things change, and so the journey towards the goal must be managed with flexibility.

Gym Buddies:

The benefits of having a gym buddy are clearly documented in the fitness world, and for obvious reasons (shared motivation towards goals, etc.), yet these benefits are so often overlooked in the development team. Pair Programming is one technique that springs to mind and is a step in the right direction, but it is just as important to foster a shared vision within the team and to promote discussion and peer learning. A performing team is usually greater than the sum of its parts because people’s performance feeds off the ideas and motivation of their peers.

The Miracle Widget:

For those who don’t want the sweat and pain there’s always the miracle widget that will yield amazing results with little effort. Whether it’s a new machine, a wonder drug, electronic shock training and sofa gyms, or SOA, Cloud Computing and BPM, these need to be viewed with some apprehension. That’s not to say they aren’t the next big thing, but rather that they are not silver bullets and are best used within a cohesive, thought-out strategy.

Doping:

Taking steroids can rapidly improve an athlete’s performance, but that improvement comes at the cost of unwanted side effects. The end goal may be achieved more rapidly, but at the price of internal physical or mental damage. This is a form of extreme technical debt: taking a shortcut here and there may be acceptable to ship the product, but reliance on those shortcuts can build, making the debt ever harder to reverse.

 

Below I’ve noted some general stereotypical personas from the gym and how they mirror development team personas. Do you recognise these roles in your gym/dev team? Warning: these are fun generalisations, so don’t get upset or take them too seriously!

 

The Bodybuilder:

This guy has one goal in mind: to get ‘big’. All his exercises are anaerobic, aimed at building muscle and developing his physique. He doesn’t do aerobic training as it detracts energy from his primary goal. He shows a strong ‘engineer-like’ expertise in one discipline; he probably has excellent in-depth knowledge of that area and is very focused on learning more about it. He can be slightly intimidating to approach but is generally happy to share his knowledge and experience, and enjoys being able to show off his skills. This persona fits well with many traditional experienced software developers, who are experts in their chosen disciplines and increasingly seek to learn more about that technology area, often ignoring the benefits of others. They are dedicated and seen as experts in their field, but outside their field they struggle, and sometimes the imbalance with other disciplines has a negative effect.

The Endless Runner:

Similar to ‘the Bodybuilder’ above, but this time in a different discipline. These guys want to run faster/longer and focus on aerobic exercises and building endurance. Again a solid, expert software engineer, but this guy is not in it for the showy technology; he’s there to build the plumbing infrastructure required to keep systems operational.

The All-Rounder:

He is not the biggest or fastest guy in the gym, but he is the typical all-rounder. He probably has experience of working in the various disciplines above (maybe mastering both) but prefers breadth over depth. The All-Rounder is able to speak everyone’s language and can compete admirably with anyone, but has to submit to the overwhelming expertise of the guys listed above. Often this guy is a bridge between the different disciplines and chats in the corner with both. He gains the benefits that variation and breadth of knowledge provide, but is often at risk of not keeping up with the pace of change in either. His nearest IT persona is the architect, due to his all-round skills and his comfort liaising with all the required disciplines. He is happy to share his experiences when asked, or when he sees someone really struggling, but is often less opinionated about one approach or another as he sees all sides of the technology argument.

One Routine Guy:

A consistent gym attendee, but one who has done the same routine for years. We all know developers like this. They lack true ambition for the vocation and hence don’t build up a true understanding of the changing world around them. They are happy to use what they know and feel works, but their lack of willingness to learn new things puts them at risk of hitting a progress wall, finding themselves obsolete, and eventually quitting.

Bored Stiff Guy:

He has decided to go to the gym but has no real desire to do the workout. He runs through the motions, moving from task to task with little effort or intensity. We have all no doubt worked with developers who are going through the motions without any passion for the art of software development. Similar to ‘One Routine Guy’, they know what they need to know and lack any enthusiasm to learn new skills. I often refer to these as ‘Part-Time Programmers’ as they see the job as 9 to 5, and the thought of picking up a new skill without a company-sponsored training course is alien to them.

The Biceps Only Guy:

He only focuses his energy on what can be shown off. He is just playing with the flashy stuff without building a strong foundation to balance it. For this guy ‘Gloss is Boss’. Some developers are happy playing with new technologies and building hundreds of "Hello World" apps, yet rarely innovate for the team as they fail to see the bigger picture.

The Poor Form Guy:

He is energetic and enthusiastic about training with heavy weights but inadvertently uses dangerously bad form in his exercises. Sometimes developers/architects can become so absorbed in delivering big solutions that they fail to assess their own actions. They design complicated solutions using patterns they often don’t understand, regardless of the project risks and the potential long-term problems around stability, maintainability, etc. Like ‘Poor Form Guy’ this is often a case of poor teaching or poor controls. These guys need a coach or community to check their form.

The Personal Training Guru:

An expert in his field who takes on many disciples and guides people towards their goals. He is a sort of more senior, experienced all-rounder who is now dedicated to helping others. He is respected in his community and his advice carries weight. These are the gurus of the tech world: experienced consultants/authors (e.g. Martin Fowler).

The Impatient For Results Guy:

He wants results, and fast! He’s usually the customer for the ‘Miracle Widget’ (see above). He’s happy to take the easy option and cut corners on quality if he can. No doubt we’ve all worked with some bad development/project managers like this.

The Newbie:

He’s new to the gym and very intimidated. He’s still finding his feet with the machines and the social etiquette. Just like new developers, these are the lifeblood of the community as they bring enthusiasm and new ideas, but they need to be guided. They need assistance to get up the steep learning curve and to be shown the right way to behave. If we make it hard for them to add value quickly then we risk them giving up and going elsewhere, or at least becoming a Bored Stiff Guy.

The Non-Conformist:

This guy is in the corner of the gym doing his own thing. He’s probably using the equipment in a unique way, or using a lesser-known training technique. He is innovative and might be capable of producing amazing results using non-standard approaches. He can be found in your development team too, bashing out productivity tools and reviewing the latest open-source offerings. Regardless of his personal success he provides a fresh approach and generates new ways of working. He needs keeping in check, though, to ensure that his solutions are viable in the longer term.

The Non-Committed Local Gym Supervisor:

Whilst many gym supervisors are like the Personal Trainers, some can be more focused on numbers (subscriptions, machine usage rules) than on real results. Once the new recruit is signed up they are handed the user guide and left to it, with poor form ignored as long as the basic safety rules are adhered to. The same lack of evangelism of techniques and ideas, or of facilitating a real community, can lead dev teams to fail too. A lot of the success or failure of a team can result from the performance of the development manager/technical lead and their willingness to support the team and keep them productive.

Summary:

Of course the conclusion is that I should have been working out instead of ‘people watching’, but the fact remains that there are parallels to be drawn between our work communities and many other walks of life. This opens up the ability to view situations from different perspectives, which can then help us improve our understanding.

Hello Linux – Again

I’ve been increasingly interested in the Linux OS again recently and have re-discovered its power and flexibility. I’m a Windows guy primarily, always have been, but I’m writing this on the new Ubuntu 11.04 (Natty Narwhal) release and I must say I am enjoying the experience.

This is by no means my first foray into the world of Linux; in fact my first exposure was way back in 1999 (wow, was it that long ago?) when I heard about this magical version of Unix that ran on PC hardware. At that point my home PC was running Windows NT on a Pentium 166 with a heady 64MB of RAM. I purchased a Linux book that came with a Linux distribution on CD (downloading was not really an option for me back then with a slow dial-up connection). It was Caldera OpenLinux 2.2, and installing it was a nerve-racking and sadly enjoyable journey of re-sizing partitions, configuring drivers and general frustration. Eventually I had a dual-boot NT and Linux PC and a sense of self-satisfaction, but Linux didn’t capture my imagination much further at that point. Much like playing with Lego as a child (or an adult), the fun is in the building and configuring it how you want, not in playing with it afterwards, and so my fun with Linux was over. The UI felt like a poor relative to Windows and I booted into it less and less. As I moved further into Windows development there was less desire for me to explore Linux. Over the last 12 years I have played with a few distributions on and off, but never really for any reason other than curiosity, and none of them really stuck. Lately though I’ve started to get the Linux itch again, so I got downloading, and I’m very pleased with what I’ve found.

Firstly, it is now incredibly quick and easy to get a Linux distribution installed and set up on your machine of choice. There are many options to get you up and running, and with most drivers automatically detected you should hit few issues. Of course virtualisation has made it easy to try out various OSs and Linux is no different: VirtualBox is a joy to use if you want to try out a Linux installation. There is also the LiveCD option that enables you to run your Linux distribution of choice off a USB or DVD drive without touching your hard drive. Whilst this is not very practical for everyday use, it’s good for getting a real feel for what a Linux distribution is like and how it will run on your hardware. The best option for me, with its ease and practicality, is Wubi, whereby you install Ubuntu from within your Windows OS and it creates a virtual disk on your Windows drive for the installation. This means all your files sit isolated in a Linux virtual disk file within Windows. You boot into Linux normally (via the Windows dual-boot loader screen) and it is completely transparent to your Linux OS that its disk is actually virtual. Of course there is a performance overhead around disk access, but it’s not noticeable even on my netbook. This approach allows you to get all the benefits of a Linux installation without having to go the whole hog straight away and partition your disk. I’ve been running Ubuntu for a while using this approach and it’s working like a charm. I do intend to do a fresh install of Ubuntu onto its own partition when I get around to it, but as it works so well I’m in no rush. In the meantime I can try out some other Linux distros to find the one for me (Linux Mint and Tiny Core are on my list to check out).

Ubuntu is certainly a relatively user-friendly Linux distribution, and whilst the switch to the new Unity desktop for the latest 11.04 version is very controversial within the Linux community, there is little doubt that it is a friendly experience for new Linux users. It is also a very good desktop experience for netbooks, which is probably why it’s working so well for me on mine. I’m not sure how it will scale up to my desktop yet, and I do understand the negative comments that have been pitched at it. Regardless, it is a way for Ubuntu to differentiate itself from Windows and Mac whilst making itself as user-friendly as possible. It does appear as though Unity is a first step towards Ubuntu becoming a touch-friendly, device-like Linux distribution. I’ve commented before on this blog how I see devices replacing the PC for all but “power user” type consumers, and yet I still feel that a light, friendly Ubuntu (with the power still underneath for those that want it) can do well. My wife is happy using Ubuntu on my netbook because, like many consumers, she does almost everything in the browser, and this is where Linux on a netbook seems to shine: it’s capable, light and fast. A key advantage for Ubuntu is the Ubuntu Software Centre, which is ideal for new users to get up to speed quickly. Overall I have been pleased with how Ubuntu plays with my Windows network shares, and I haven’t even needed to install a Remote Desktop client to log on to my Windows servers as there is one included (type rdesktop servername at the terminal).

It’s still early days with my latest affair with Linux but so far it’s going well and this time I might get totally hooked.

The Case Of The Missing TFS Office Add-In

This week my Team Toolbar Office add-in for Team Explorer disappeared, preventing me from uploading any new work items to Team Foundation Server. After some searching in Excel (and MS Project) I accepted it wasn’t just hiding and turned to Google for help. Luckily I found this helpful post on the Team Foundation Server team’s MSDN blog:

http://blogs.msdn.com/b/team_foundation/archive/2010/04/24/vs-2010-and-tfs-with-office-2003.aspx

In summary, it tells you where on your machine the TFS Office add-in is located for VS 2010 and VS 2008, e.g. “Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\TFSOfficeAdd-in.dll” or “Program Files\Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies\TFSOfficeAdd-in.dll”, and reminds you how to register/unregister these COM components (i.e. regsvr32 [/u] TFSOfficeAdd-in.dll).

I unregistered the VS 2010 add-in and registered the VS 2008 one (as I needed the 2008-based add-in) and “hey presto”, the Team Toolbar was back in Excel.
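For anyone needing to do the same swap, it boils down to a couple of commands from an elevated console. A minimal sketch, assuming default install locations on the C: drive (adjust the paths for your machine):

# Unregister the VS 2010 add-in, then register the VS 2008 one (run elevated)
cd "C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies"
regsvr32 /u TFSOfficeAdd-in.dll
cd "C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies"
regsvr32 TFSOfficeAdd-in.dll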

Android Remote Desktop Client

I find that I am increasingly relying on the computing power of my Android smartphone (an HTC Desire) and finding novel ways of using it to make my IT life easier. Sometimes I just want to connect to my PC in another room, or more often to my headless Windows Home Server, so I scouted for a Remote Desktop client that I could run on my phone. The key requirement was for it to use the native Windows Remote Desktop protocol and therefore not require any software to be installed on my PC or server, which ruled out a lot of the VNC-based apps. Luckily 2x.com have released an excellent FREE app that ticks all the boxes.

2XClient for Android can be found here or on Android Market here. It is dead easy to set up the target machines and there are several display optimisation options. The key thing, though, is that it’s actually very easy to navigate the target machine’s desktop, via a custom keyboard and a nifty mouse icon that can be dragged around with a left and right mouse button attached (left image below). In these images I’m logging onto my Windows Home Server (a Windows 2003 based OS), but I also use it with my Windows 7 PC. One thing to note for Windows 7 is that I needed to set my Remote Desktop settings (via My Computer > System Properties > Remote Settings) to “Allow connections from computers running any version of Remote Desktop” as opposed to the default setting of enforcing Network Level Authentication.

[Screenshots: 2XClient mouse control, custom keyboard, and Start menu]

It is surprisingly easy to do simple tasks on the target machine, especially after a bit of practice. Here I am using PowerShell and checking my Home Server Console.

[Screenshots: Windows Home Server console and a PowerShell session in 2XClient]

A very powerful tool to have on your phone and ideal for those quick techy tasks when you can’t be bothered to get off the sofa.

Software KVMs

Recently I acquired an additional desktop machine for my desk and quickly saw the potential to expand the amount of screen real estate at my disposal (you can never have enough screens). So imagine I have a laptop physically connected to two screens, and a desktop PC with one screen connected, on the same desk. I want to be able to seamlessly control that desktop PC with the main keyboard and mouse that are physically connected to my laptop. That way I get three screens and twice the processing power. Remote Desktop tools are of course no use here, as we can already see the desktop PC’s monitor and we don’t want to control the PC through a window on the laptop. Instead we need software KVM applications (in fact without the V for video, as we can already see the screen). These work by sending your keyboard and mouse movements over a network connection to the additional PC. They also transfer your clipboard contents so you can copy and paste easily between the machines.

First I tried Synergy from http://synergy-foss.org/, which is fairly unique as it’s a cross-platform offering that runs on Windows, Mac and Linux, making it incredibly powerful if you have a mixed environment. I tried the latest stable build, which was 1.3.6. I found it functional but basic, and not that robust: the Synergy client would stop running on several occasions (usually after locking/unlocking the PC). The UI is also very basic. That said, it did the job, and I have since found that the UI has been completely overhauled for the current BETA version. Whilst I don’t think it is quite solid enough yet, it looks to be a big player in this space, and there is no doubting that for those with a mixed environment it is great.

In the end I decided on InputDirector, found at http://www.inputdirector.com/. This is a Windows-only offering but it is more mature than Synergy, with a host of additional options. It is easy to configure, with one PC being the master and one being the slave. You can right-click the icon in the system tray and choose to enable/disable it, and also to lock/shutdown the master and slave PCs, which I find useful when I want to lock both PCs in one go. The best feature, though, is its stability: I have not found one issue with it yet and am surprised how effortlessly it handles the docking/undocking of my laptop, which is acting as master. Once the laptop is docked, a message pops up in the system tray to notify me that master and slave are in communication again and all is well.

InputDirector screenshots below:

[Screenshots: InputDirector configuration]

Windows PowerShell Console Fun


Whilst watching Die Hard 4 the other day I noticed the funky transparent console windows that were being used to battle out cyber warfare, and being a traditional geek I immediately liked the idea of doing the same for my PowerShell console. Sure, I know these guys were using Linux and transparent consoles have been around for years, but still I fancied some of that slickness on my Windows machine.

It didn’t take long to find PSGlass on CodePlex (http://powershellglass.codeplex.com/), which is a neat little exe that runs in your system tray and hunts out any PowerShell console windows, converting any it finds to transparent using the Windows Aero effects. A peek into the source code shows that it checks for a window with a process name of “powershell” or “cmd” and then uses the DwmEnableBlurBehindWindow API to make it transparent. It’s simple and effective and could be extended easily to do more. Not content with this, I wanted to achieve the same result from within PowerShell itself, so I set about using the API in a script. Luckily for me Oisin Grehan had already written a script to achieve the same result and posted it on PoshCode.org; check it out at http://poshcode.org/2052. This script uses the DwmExtendFrameIntoClientArea API to create a sheet-of-glass effect with no borders, and the effect is much more striking. The fact that you don’t have to have an application running in the background is of course much better, and as it’s a script you can add it to your profile so it always takes effect on PowerShell start-up. I have found it useful to create a function in my profile to toggle the glass effect on/off depending on my mood and what actions I am trying to perform.
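Oisin’s script does considerably more than this, but the core of the technique is a single P/Invoke call. Here is a minimal sketch of that call from PowerShell, assuming Aero is enabled; it is an illustration only, not a substitute for the full script linked above:

# Minimal sketch of the "sheet of glass" effect (assumes Windows Aero is enabled)
Add-Type -Namespace Win32 -Name DwmApi -MemberDefinition @'
[StructLayout(LayoutKind.Sequential)]
public struct MARGINS
{
    public int Left, Right, Top, Bottom;
    public MARGINS(int all) { Left = Right = Top = Bottom = all; }
}

[DllImport("dwmapi.dll")]
public static extern int DwmExtendFrameIntoClientArea(IntPtr hWnd, ref MARGINS pMarInset);
'@

# Margins of -1 extend the glass over the entire window
$margins = New-Object 'Win32.DwmApi+MARGINS' -ArgumentList (-1)

# Apply the effect to this console window (the current process's main window)
$hwnd = (Get-Process -Id $pid).MainWindowHandle
[Win32.DwmApi]::DwmExtendFrameIntoClientArea($hwnd, [ref]$margins) | Out-Null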

As the console was now transparent I quickly found myself wanting it to stay on top of other windows, so I set about looking for a script for that too. Again the excellent PoshCode.org came to the rescue, with this script from Shay Levy at http://poshcode.org/1837. I downloaded the script and added a function to my profile to be able to toggle it on/off at will from within the console itself. In order to access the Set-TopMost function that sits within the TopMost.ps1 script I used dot sourcing (described here by Jesse Hamrick), and my functions are shown below:

function OnTop
{
    # Dot-source TopMost.ps1 so its Set-TopMost function is available in this scope
    . TopMost.ps1
    # Pin this console window (the current process's main window) above other windows
    Set-TopMost (Get-Process -Id $pid).MainWindowHandle
}

function NoOnTop
{
    . TopMost.ps1
    # Remove the always-on-top flag from the console window
    Set-TopMost (Get-Process -Id $pid).MainWindowHandle -Disable
}
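With those in my profile I can simply type OnTop or NoOnTop at the prompt to toggle the behaviour. Note that the dot sourcing uses a bare script name, so TopMost.ps1 needs to live somewhere PowerShell can resolve it (e.g. a folder on your path).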

For total geek heaven why not go the extra mile and put a Matrix-style screensaver within the console itself? If you haven’t seen it I would recommend checking it out at http://www.nivot.org…. The video on the site shows how it works, but it essentially runs a Matrix code screensaver inside the console (not the whole desktop). Now that’s one clever PowerShell script!

2017 UPDATE:

As the PoshCode.org site has been down for a while, I’ve added alternative links so the scripts mentioned above can be found via the awesome web.archive.org instead of PoshCode.org:

Live Writer Syntax Highlighting Plug-in v1.3.0 Released

I’m pleased to announce the release of version 1.3.0 of my Windows Live Writer (WLW) Source Code Highlighter plugin for WordPress.com-hosted blogs. For those not aware of this plugin, it enables you to quickly insert a source code snippet into your blog posts using the WordPress.com source code short codes. For more information and a rundown of features check out this page.

The key feature included in this version is support for Live Writer 2011. As I posted here, the new 2011 version has implemented additional security restrictions which have the effect of preventing JavaScript from running within the editor window. More security is always good, right? Well, not this time, as many plugins that rely on script to function have been affected, including mine. The end result is that the ‘in editor’ preview feature has been removed when using the plugin inside WLW 2011, but don’t worry, it’s still there if you’re running an older version of Live Writer. To go some way towards making up for this lack of preview I have implemented a new "Preview In Template" feature for 2011 users that provides an external preview in your browser. It works by taking your locally downloaded blog theme and inserting the code snippet into it (with syntax highlighting), thus providing a preview of what the end result will look like. It is just for previewing the formatting of the selected snippet, though, and won’t display a full preview of the whole post – use the Preview view in WLW for that still.

WLW 2011 ‘Preview In Template’:

[Screenshot: Preview In Template in WLW 2011]

Preview in Editor in older versions of WLW, and the new Mini Preview screen:

[Screenshots: editor preview in older WLW versions, and the new Mini Preview code entry screen]

Regardless of which version of Live Writer you are running, there is also a new ‘Mini Preview’ feature. The code entry screen has been extended with a preview tab where you can instantly see the results of your code changes in full syntax-highlighted glory. The code entry screen is now fully scrollable and resizable too.

I’ve also included a few small tweaks and bug fixes in this release as usual. For example you can now Alt-Tab correctly back to the Code Entry screen and all languages should now display correctly when previewing.

To download (it’s FREE) just go here: https://richhewlett.com/wlwsourcecodeplugin/ and click Download Now.

Getting A Users Username in ASP.NET

When building an ASP.NET application it’s important to understand the authentication solution that you are planning to implement, and then to ensure that all your developers are aware of it. On a few projects I have noted that some developers lack this knowledge, and it can end up causing issues later in the project once the code is first deployed to a test environment. These problems are usually a result of the differences between running a web project locally and remotely. One problem I found on a recent project was where developers were trying to retrieve the logged-on user’s Windows username (in an intranet scenario) for display on screen. Unfortunately the code to retrieve the username from the client request had been duplicated, and a different solution used in each place; worse still, neither worked. Sure, they worked on the local machine, but not when deployed. It was clear immediately that the developers had not quite grasped the way ASP.NET works in this regard. There are several ways of retrieving usernames, and admittedly it’s not always clear which is best in each scenario, so here is a very, very, very quick guide. This post is not a deep dive into this huge subject (I might do a follow-up post on that) but merely a quick guide to what user details the framework members below give you in each scenario.

The members we’re looking at are:
1. HttpContext.Current.Request.LogonUserIdentity
2. HttpContext.Current.Request.IsAuthenticated
3. Security.Principal.WindowsIdentity.GetCurrent().Name
4. System.Environment.UserName
5. HttpContext.Current.User.Identity.Name (same as just User.Identity.Name)

To test this we create a basic ASPX page and host it in IIS so we can see what values these properties return under a set of authentication scenarios. The page just calls the various username properties and writes the values out in the response via Response.Write().
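(As a quick aside, the “works on my machine” trap is easy to see by evaluating two of these members interactively, say from a PowerShell console, where they simply reflect the logged-on user. The output shown is for a hypothetical MYDOMAIN\USER1 logon.)

# Outside IIS these members just reflect the interactive user running the process
[Security.Principal.WindowsIdentity]::GetCurrent().Name   # MYDOMAIN\USER1
[System.Environment]::UserName                            # USER1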

Scenario 1: Anonymous Authentication in IIS with impersonation off.

HttpContext.Current.Request.LogonUserIdentity.Name COMPUTER1\IUSR_COMPUTER1
HttpContext.Current.Request.IsAuthenticated False
HttpContext.Current.User.Identity.Name (empty)
System.Environment.UserName ASPNET
Security.Principal.WindowsIdentity.GetCurrent().Name COMPUTER1\ASPNET

As you can see, when running with Anonymous Authentication, HttpContext.Current.Request.LogonUserIdentity is the anonymous guest user defined in IIS (IUSR_COMPUTER1 in this example), and as the user is not authenticated the WindowsIdentity is set to that of the running process (ASPNET) and the HttpContext.Current.User.Identity is not set.

Scenario 2: Windows Authentication in IIS, impersonation off.

HttpContext.Current.Request.LogonUserIdentity.Name MYDOMAIN\USER1
HttpContext.Current.Request.IsAuthenticated True
HttpContext.Current.User.Identity.Name MYDOMAIN\USER1
System.Environment.UserName ASPNET
Security.Principal.WindowsIdentity.GetCurrent().Name COMPUTER1\ASPNET

Using Windows Authentication, however, enables the remote user to be authenticated automatically via their domain account (i.e. IsAuthenticated is true), and therefore the HttpContext.Current.Request user is set to the remote client’s user account, including the Identity object.

Scenario 3: Anonymous Authentication in IIS, impersonation on

HttpContext.Current.Request.LogonUserIdentity.Name COMPUTER1\IUSR_COMPUTER1
HttpContext.Current.Request.IsAuthenticated False
HttpContext.Current.User.Identity.Name (empty)
System.Environment.UserName IUSR_COMPUTER1
Security.Principal.WindowsIdentity.GetCurrent().Name COMPUTER1\IUSR_COMPUTER1

This time we’re using Anonymous Authentication but now with ASP.NET impersonation turned on in web.config. The only difference from the first scenario is that now the anonymous guest user IUSR_COMPUTER1 is being impersonated, and therefore System.Environment and Security.Principal are running under that account’s privileges.
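For reference, this is the shape of the web.config settings in play here; a minimal sketch, with the authentication mode set to match the IIS configuration of each scenario:

<!-- Impersonation on: the worker thread runs as the IIS-authenticated user,
     or as the anonymous guest account (IUSR_*) under Anonymous Authentication -->
<system.web>
  <authentication mode="Windows" />
  <identity impersonate="true" />
</system.web>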

Scenario 4: Windows Authentication in IIS, impersonation on

HttpContext.Current.Request.LogonUserIdentity.Name MYDOMAIN\USER1
HttpContext.Current.Request.IsAuthenticated True
HttpContext.Current.User.Identity.Name MYDOMAIN\USER1
System.Environment.UserName USER1
Security.Principal.WindowsIdentity.GetCurrent().Name MYDOMAIN\USER1

Now, with Windows Authentication and impersonation on, everything is running as the calling user’s domain account. This means that the ASP.NET worker process will share the privileges of that user.

As you can see, each scenario provides a slightly different spin on the results, which is what you would expect. It also shows how important it is to get the configuration right in your design and to implement it early in the build to avoid confusion. For more information see ASP.NET Web Application Security on MSDN.

XML Error in TFS Power Tools Backup Plan

Whilst checking the Windows Event Log on my server I found this worrying error being reported by the Team Foundation Server Power Tools Backup & Restore tool:

Error: Tfpt Backup Restore : There is an error in XML document (499, 100).

I posted an article about using this tool to back up your TFS server in September, and it has been successfully backing up daily since that time. The above error has, it seems, been reported after every daily backup for the past few weeks. It immediately follows an information event from MSSQL$SQLEXPRESS stating that the “Database has been backed up”. To investigate further I opened the TFS Admin Console on the server and clicked “Take Full Backup Now”, whereupon I got the same error interactively – well, at least it’s consistent.

[Screenshot: TFS backup reporting "There is an error in XML document"]

From here you can open the log, in which I could see it reporting that the BackupSets.xml file in the root of the backup target location is invalid:

[Info   @20:30:15.977] Backup Set Name Tfs_DefaultCollection database Backup
[Info   @20:30:15.977] Backup Set Description Tfs_DefaultCollection database – Full Backup
[Info   @20:30:15.977] Adding database Tfs_DefaultCollection to the backupset
[Error  @20:30:16.352] System.InvalidOperationException: There is an error in XML document (499, 100). —> System.Xml.XmlException: There is an unclosed literal string. Line 499, position 100.

On opening the file there appeared to be nothing wrong with the XML as such, but in my case it didn’t seem complete, as the last successful backup detailed only one entry, not the usual two (*.bak and *.trn files). This file seems to be a record of the backups rather than being core to the actual SQL backup process, so I deleted/renamed the BackupSets.xml file and ran another full backup. This time it worked fine and a new BackupSets.xml file was created. It’s now running fine again.
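If you want to check the file before touching it, something along these lines from a PowerShell console will do; a minimal sketch, with a hypothetical backup target path (substitute your own):

# Check whether BackupSets.xml parses as XML; if not, move it aside so the
# next full backup can recreate it ($path below is hypothetical)
$path = 'D:\TfsBackups\BackupSets.xml'
try   { [xml](Get-Content $path) | Out-Null; Write-Host 'BackupSets.xml parses OK' }
catch { Rename-Item $path 'BackupSets.invalid.xml'; Write-Warning "Invalid XML, file moved aside: $_" }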

The Future Of The IT Department

Over the last few years I have been witness to rapid, often painful, change within my own internal IT division, and have observed the on-going developments in the industry. It is clear that IT departments have changed dramatically in a short space of time and the pace is not relenting. This has led me to try to picture what IT will look like within large institutions in the future. It is becoming more and more apparent that the structure of our internal IT organisations is very often based on the traditional legacy models that served enterprises well in the past: big IT investments and centralised systems are best managed and maintained by a rigid organisational structure. The IT department and the business units are today usually far more disconnected than many CIOs would care to admit. IT used to be something that was done by the IT department based on fairly static business processes. However we’re now in a different world, where IT is increasingly seen as just a commodity and business processes need to be able to react quickly to changing economic conditions. No longer is the IT department responsible only for big monolithic systems (e.g. payroll); IT is now embedded in every business process, so in some sense every department is an IT department. Surely if the IT organisation doesn’t aid the business then it will eventually be pushed aside and replaced.

The Journey From Past to Present

This excellent post by PEG covers the subject well. PEG paints a picture of the traditional IT organisation as it was in many enterprises, and then slices it up to represent the current model once outsourcing/off-shoring is factored in. The left-hand diagram shows the more traditional split, the right the emerging norm:

Factoring in the effort required to manage out-sourced projects


Diagrams from PEG: The IT department we have today is not the IT department we’ll need tomorrow

It surprises me how many people consider their jobs safe from outsourcing because their role sits above the bottom tier on this sort of diagram, but as you can see it is inevitable that the line between permanent staff and outsource partner staff will continue to rise to the point represented in the triangle on the right, with a good cross-section of IT roles being fulfilled by partner organisations. This represents where many large enterprises are at present, whereby some “doing” roles are maintained in-house but the management and planning layers are also supplemented by outsource/offshore partners. The bulge in the middle represents the extra permanent resources required to cover the additional overhead of managing partner resources. Taking a bank to be the textbook example of a large enterprise with a significant-scale IT organisation, this research into European banks’ activities provides some insight into the strategy driving these changes. Unsurprisingly cost reduction is key, but it’s not the only factor…

“Survey participants cited cost reduction as the primary reason to outsource IT functions, followed by cost variability (for example, the flexibility to respond to peak demand without ramping up internal resources) and access to know-how or skilled personnel. The main benefits of outsourcing were access to know-how or skilled personnel and a guaranteed level of service. (The cost benefits associated with outsourcing often fell short of expectations.) The biggest disadvantages of outsourcing were high switching costs and limited control over critical elements of the IT environment. On the whole, however, the survey shows that banks have embraced outsourcing. Only 3 percent of the banks surveyed were planning to decrease their outsourcing activities. The case for offshoring was slightly different. Although banks used offshoring primarily for the same reason they used outsourcing—to reduce costs—the main benefit of offshoring was less stringent foreign labour laws. The biggest disadvantages of offshoring were opposition among domestic personnel, large overhead, and loss of control.”

Both partner strategy models are therefore seen as suffering from elements of losing control over assets or deliverables, and as adding somewhat to management overheads, but as providing some agility via a mechanism to ramp resources up or down as required.

PEG extends his model to show that in the future there will be an increased reliance on SaaS and automation tools and therefore a chunk of the IT organisation structure will be replaced by these as well as outsourcing/offshoring roles.

A skills/roles triangle for the new normal

Diagram from PEG: The IT department we have today is not the IT department we’ll need tomorrow

Within the current model, management layers have often become too complex and unwieldy. With the IT organisation being a business entity itself within the enterprise, and with 65% of IT spend just being used to maintain current service, business functions and IT often clash over priorities and the allocation of funding. In many instances this results in the business going outside of the IT organisation to secure services, or growing its own ‘black ops’ internal capability just to get things done. This again challenges the traditional IT organisational model where IT keeps tight control.

Changing Objectives

Tighter financial conditions, increasingly competitive environments and a desire to maximise returns are leading to a model of pay-per-use and greater use of partners and outsourcing models. Technology advances are making this transition possible (e.g. Cloud Computing, SaaS). Future IT departments will increasingly utilise these external services, resulting in them adopting a very different structure. Whilst the traditional IT organisation has been geared to building and maintaining large complex systems and is staffed with technical people, the rapidly emerging model is one where IT skills are outsourced to numerous vendors and IT staff become the negotiators and orchestrators of these relationships and contracts. Instead of managing system changes internally, the IT organisation is increasingly just the middleman between the business and the outsource/offshore partners. The role becomes one of managing projects more than technically implementing them. Reports can be found of in-house IT departments cutting 90% of headcount in a rapid shift to offshore/outsourcing, with the remaining staff focusing on planning and relationship management tasks. This Boston Consulting Group paper suggests there is an essential move from “doer” to “orchestrator”, with the IT organisation “doing fewer of the traditional ‘run the business’ activities”, instead leaving them to external providers and doing more coordinating of (one or many) providers’ activities to meet the design. This “network of external providers and integrators” needs monitoring and tuning, and the structure of the IT organisation will need to centre around these activities.

A quote from Reinventing The IT Organisation by Antoine Gourevitch, Stuart Scantlebury & Wolfgang Thiel…

“Unless CIOs take swift action, the IT organisation will be at risk of being reduced to a thin layer between the business and the specialist outsourcing firms.”

The outcome will presumably be either a slim organisation staffed with change managers and project managers responsible for liaising with the partners to satisfy business requirements, or alternatively these changes could prove the catalyst required to move to true business-driven IT, where IT skills are integrated with the business units to enable them to react rapidly to changing business needs. Larry Dignan in his post welcomes the idea of breaking up the traditional IT organisation, seeing it as an anachronism. He classes CIOs as often “out of their league”, “process jockeys” who would “rather be scouting new technologies” than innovating. I would agree that this appears to be the case in many large organisations where IT, some would argue, has frustratingly become detached from the goal of driving business value through technology, losing itself in bureaucratic processes. These organisations can seem a long way from delivering core bottom-line business value. PEG discusses the detachment of Enterprise Architecture and the business, together with a description of little ‘a’ and big ‘A’ architects, here, and it’s well worth a read. Even where IT organisations do deliver real value it’s often to timescales that seem painfully long to the business customer but painfully short to the IT guy wrapped up in bureaucratic red tape. Perhaps this isn’t IT’s fault as such but more the arcane structure of the IT organisation that we have come to accept.

One way suggested for IT organisations to remain relevant and address future challenges is for the business and IT to move closer together than ever. This has been talked about for many years but with the demise of the monolithic IT organisation the next few years could see this model mature. Perhaps decentralised pockets of business IT shops closely aligned to the business units will be the norm, introducing new challenges around how to control these pockets.

This shift towards IT/business integration could be very rewarding for an enterprise: in reality modern business processes are often tightly intertwined with the LOB applications in use, so anything that can be done to ensure that those LOB applications support the business processes, instead of restricting the pace of business change, will be welcomed. Dreischmeier & Thiel suggest new ways of working may be required as IT organisations are forced to adjust their operating model to become faster and more agile and to embrace rapid-development approaches. The business can’t afford to be held back by a slow and unwieldy IT organisation.

One idea I particularly like is that of “introducing Product or Solution Managers” to address the “lack of end to end ownership within IT Orgs”. The person would “own the IT product/solution across all technical layers”. This role should improve TCO and aid business and IT priority alignment. Dreischmeier & Thiel also see the CIO as a key player in ensuring that the IT organisation is “Proactively Engaging in Business Transformation Activities”, and believe the IT organisation is very well positioned to be a key player in this transformation as it is (in theory) aware of the end-to-end business processes. They suggest:

“Creating, together with the business, a new-business-model team that seeks out and addresses the changes in economics of the relevant industry as it changes through increased competition and environmental forces”. 

The growth of agile development practices has a part to play here too. Having innovative IT teams that ‘fail fast and often’ and use lean agile techniques to maximise business value could replace traditional models. Smaller, focused development teams under the direct control of the business units, using agile practices and supported by a central infrastructure function (probably outsourced), could prove a very effective way of actually building what the business really needs. The evolution of Cloud Computing technologies provides real opportunities to make these teams very capable. A business-unit-based developer could ‘mashup’ cloud services together with core on-premise web services to produce a powerful line-of-business application that is then deployed to PaaS cloud-based infrastructure. Forrester analyst Alex Cullan sells the benefits of this model with the term “Empowered BT (Business Technology)”, where IT’s role is to empower the business to utilise the technology that they need in order to remain competitive. The traditional arguments against this approach, such as the expected system proliferation and business technology decisions being driven by hype, are dismissed as actually not as bad as we in IT would believe. He argues successfully that some proliferation is acceptable if it empowers the business, but there would have to be trust in business leaders to choose the right path for this to work.

Is that trust there at this moment in time? Not according to this MIT & Boston Consulting Group survey, which shows that current CIOs believe business leaders are not positioned to lead IT-enabled business transformation. Only 33% of CIOs consider their company’s senior execs effective at driving business value with IT, and 40% consider them effective at prioritising IT investments. Perhaps, however, this reflects the differing priorities of traditional IT organisations and the business units, with IT asserting its traditional maintenance role (“keeping the lights on”) and its application development/innovation role, more than any real distrust. The paper does however highlight the benefits that can be achieved when the IT organisation avoids the simple “middle man” role and takes the lead in driving business change (such as lower maintenance costs, faster realisation of business benefits from new systems, and higher employee satisfaction). Perhaps the future of the IT organisation is that of a business in its own right: an internal consulting firm offering assistance in business process design, innovation and development management.

Procter & Gamble run their IT organisation as a business within the enterprise, alongside other business services (e.g. accounting). Their services are branded and marketed to the enterprise and billed on a usage basis, with business units empowered to choose to consume these services or go elsewhere. The emphasis is on running this as a viable, competitive internal business that is in tune with its customers’ (in this case the internal business units’) needs. They have brand managers responsible for “the innovation, pricing and commercialization of the services”, ensuring that the total end-to-end offerings can match those of 3rd parties. Underpinning this, though, is a collection of external partner relationships that still need to be managed, and so in essence this is still heading towards becoming an integrator, orchestrating these partner services into a clear, cohesive, branded, and hopefully relevant, service. The key here is the added value provided by an internal IT business service that crucially understands the business and offers competitive services that are completely relevant to it. This is supported by the BCG research, which found that where IT organisations really drove business change they often delivered their IT services as shared services and placed more emphasis on relevant prices and alternative service levels. They tended to centralise IT, with lower levels of recorded “shadow” IT being instigated by the business, which could perhaps suggest that these business units felt they were getting sufficient value from their shared IT services, even though they were under central control.

Future Skills

All these changes have massive implications for the skills required within the IT organisation of the future. In the current model maintaining a relevantly skilled workforce can be tricky, with many key staff feeling demotivated by the outsourcing/offshoring partner model and the subsequent removal of technical roles from their organisation. The loss of junior IT roles to partner resources destroys future progression opportunities and shows that this model is unsustainable moving forward. Engaging technical people will be increasingly difficult in the current model, but perhaps a move to more business-aligned IT can help skilled staff remain technical if they wish, and also benefit the business through enhanced IT innovation and passion for their roles, instead of forcing good techies to oversee offshore/outsource relationships.

It seems certain now that IT staff of the near future will be expected to have an enhanced level of business acumen and market knowledge to fulfil their roles. Will this come at the expense of excellent technical skills? Maybe! Perhaps the deep technical skills will be embedded within the offshore/outsource partners, and the relevant ‘technical’ skills required in the IT organisation will be those around technical process design and systems analysis. Knowledge of the business will perhaps be more important than any technical skill (for the majority of roles), and therefore it makes more sense to recruit IT staff from within the business units themselves. This is evident in a number of studies with CIOs, such as this BCG study:

“In general, CIOs told us that Internal IT staff roles are shifting away from application development and towards process analysis and engineering, business relationship management, project management and architecture design and implementation.”

Within the previously mentioned Procter & Gamble organisation the same theme emerges, as the skills reflect the role of IT within the organisation:

“..traditional IT is just 30% of what we do. If traditional IT is all a person masters, he or she will never be a leader here. The rest is about business knowledge. Those who embrace that approach will certainly increase their value…” 

This view was supported by the previously mentioned study into European Banking, but it also went further, pointing out that technical skills were being neglected …

“…many banks appear to be underestimating the value of technical tools and skills, which are critical to developing high-impact applications, maintaining an efficient infrastructure, and managing outsourcing partners.”

So where does this leave you and me? Well, I expect the number of deeply technical IT professionals will decline in Western countries, but this decline will be dwarfed by the increase in semi-professional developers working in the business and using end-user computing tools to develop systems that are meant to be rapid, easy and throw-away. Where more complex solutions are sought, outsource partners will happily fill that gap. Escaping the large enterprises and fleeing to the small and medium enterprises will not be sustainable longer term either, as the partner model will eventually win there too. It is entirely possible that the partner model will lose some of its lustre (it’s already happening in places) and there may be some swing back to in-house technical teams. If that happens then the IT community needs to be ready to promote a new ‘agile’ alternative that understands and drives true business benefits.

This evolution of the IT organisation is natural in such an immature industry as ours, but one thing is definite: the future is different and we need to adapt. Whichever direction the future takes for you, spend some effort in the meantime trying to understand your business customers’ needs better, and keep innovating for them!