Installing Team Foundation Server on Windows Home Server 2011

Twelve months ago I wrote a post documenting “Installing Team Foundation Server 2010 on Windows Home Server” which has proved very popular. Well, things move on, and since then Microsoft have released a new version of Windows Home Server (WHS 2011). There are many differences between V2 of WHS and V1, but the main points for the purpose of this post are that WHS 2011 is built on top of Windows Server 2008 R2 (compared to Windows Server 2003 for V1) and, controversially, the Drive Extender technology has been removed. Whilst Drive Extender was no doubt useful for storage pooling, it did make installing applications like TFS a little concerning. As described in my original article, I wouldn’t install an application or a SQL Server database into the Drive Pool (it just feels wrong to me and I wouldn’t trust it), and I stick by this, especially as it’s been suggested that one of the reasons for Microsoft removing DE was that it didn’t play nicely with enterprise applications targeted at the new Small Business Server Essentials product range, with which WHS 2011 shares its code. No DE means you can now install TFS to whichever drive you like in WHS 2011, and the fact that it’s built on the excellent Server 2008 R2 base means it benefits from the stability and performance improvements that brings. I’ve not found any issues with TFS on WHS 2011 and don’t expect to (although it’s not supported, so you install it at your own risk). I think that WHS 2011 will make an even better TFS server than WHS V1.

Other than the decision of where to install TFS due to DE, the installation instructions are the same as in my original post. After installation I recommend installing the TFS Power Tools and then configuring TFS backups as described in these posts: Backing Up TFS 2010 Using PowerShell: Part 1, Backing Up TFS 2010 Using PowerShell: Part 2 and Backing up TFS 2010 with new Power Tools Backup Plan.


Android Remote Desktop Client

I find that I am increasingly relying on the computing power of my Android smartphone (an HTC Desire) and finding novel ways of using it to make my IT life easier. Sometimes I just want to connect to my PC that is in another room, or more often, for me, it’s my headless Windows Home Server, and so I scouted for a Remote Desktop client that I could run on my phone. The key requirement was for it to use the Windows native Remote Desktop protocol and therefore not require any software to be installed on my PC or Server, which ruled out a lot of the VNC-based apps. Luckily 2x.com have released an excellent FREE app that ticks all the boxes.

2XClient for Android can be found here or on Android Market here. It is dead easy to set up the target machines and there are several display optimisation options. The key thing though is that it’s actually very easy to navigate the target machine’s desktop via a custom keyboard and a nifty mouse icon that can be dragged around with a left and right mouse button attached (left image below). In these images I’m logging onto my Windows Home Server (a Windows 2003 based OS) but I also use it with my Windows 7 PC too. One thing to note for Windows 7 though is that I needed to set my Remote Desktop settings (via My Computer > System Properties > Remote Settings) to “Allow connections from computers running any version of Remote Desktop” as opposed to the default setting of enforcing Network Level Authentication.

 2XClient_Mouse  2XClient_Keyboard  2XClient_StartMenu
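
If you prefer to script that Windows 7 change rather than clicking through System Properties, the equivalent registry tweak is roughly as follows (a sketch only; the key and value name are the standard Terminal Server settings, so verify them on your own machine, and note it needs an elevated PowerShell session):

# Allow connections from any version of Remote Desktop by turning off the NLA requirement
# (0 = do not require Network Level Authentication; 1 = require it)
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp" -Name "UserAuthentication" -Value 0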

It is surprisingly easy to do simple tasks on the target machine, especially after a bit of practice. Here I am using PowerShell and checking my Home Server Console.

2XClient_WHS1  2XClient_POSH

A very powerful tool to have on your phone and ideal for those quick techy tasks when you can’t be bothered to get off the sofa.

Software KVMs

Recently I have acquired an additional desktop machine on my desk and quickly saw the potential to expand the amount of screen real estate at my disposal (you can never have enough screens). So imagine I have a laptop physically connected to two screens, and a desktop PC with one screen connected, on the same desk. I want to be able to seamlessly control that desktop PC with the main keyboard and mouse that are physically connected to my laptop. That way I get to have three screens and twice the processing power. Remote Desktop tools are of course not useful here, as we can already see the desktop PC’s monitor and we don’t want to control the PC through a window on the laptop. Instead we need software KVM applications (in fact without the V for Video, as we can see the screen). These work by sending your keyboard and mouse movements over a network connection to the additional PC. They also transfer your clipboard contents so you can copy and paste easily between the machines.

First I tried Synergy from http://synergy-foss.org/, which is fairly unique as it’s a cross-platform offering that runs on Windows, Mac and Linux, which is incredibly powerful if you have a mixed environment. I tried the latest stable build, which was 1.3.6. I found it functional but basic and not that robust. The Synergy client would stop running on several occasions (usually after locking/unlocking the PC). The UI is also very basic. That said, it did the job, and I have since found that the UI has been completely overhauled for the current BETA version. Whilst I don’t think it is quite solid enough yet, it looks to be a big player in this space and there is no doubting that for those with a mixed environment it is great.

In the end I decided on InputDirector, found at http://www.inputdirector.com/. This is a Windows-only offering but is more mature than Synergy, with a host of additional options. It is easy to configure, with one PC being the master and one being the slave. You can right-click the icon in the system tray and choose to enable/disable it and also to lock/shutdown the master and slave PCs, which I find useful when I want to lock both PCs in one go. The best feature though is its stability, as I have not found one issue with it yet and am surprised how effortlessly it handles the docking/undocking of my laptop, which is acting as master. Once the laptop is docked, a message pops up in the system tray to notify me that master and slave are in communication again and all is well.

InputDirector screenshots below:

ID_1  ID1  ID3

The Future of Windows Home Server

Microsoft’s recent announcement that the key Drive Extender feature is to be removed from the new version of Windows Home Server codenamed ‘Vail’ has resulted in much dismay within the community. Many commentators, including the vocal WHS user community itself, have started to question the future of this product. In this post I give my take on where I see WHS in the medium term and consider how it can fit alongside the “new dawn” of a Cloud Computing era.

How big is the Drive Extender issue?

Firstly, what’s all this about Drive Extender (DE)? Well DE is a really neat feature of WHS that pools all the hard drives in the system into one logical data drive. This means that you can throw in a mixed selection of hard drives of any type (USB, SATA etc) or capacity and the system enables you to see them as one. It also provides fault tolerance through data duplication which protects your data from drive failure. It is one of the major features of Windows Home Server (WHS). I would argue one of three, with the others being the client backups and remote access. Sure the product does much more than just that but it’s fair to say that all of WHS’s features are available in other products in some shape or form and the combination of these three features into one customisable platform made WHS stand out for me.

Microsoft’s announcement that DE will be removed from the next version of WHS, code-named ‘Vail’, immediately removes a major reason to buy into the new version, and this has been evidenced in the recent Twitter comments on the subject, where a lot of people have stated their intention not to use ‘Vail’. Of course some of this is just anger at the fact that the feature has been removed (and the way in which it was announced), but still the fact remains that the product is a weaker proposition than it was before.

Personally I see this decision in both a negative and a positive light. Firstly, I see this as a major blow to the uniqueness of the product and feel that it will suffer without this USP (Unique Selling Point). Also, it’s important to remember that this is positioned as a product for the average PC user, and DE made extending the storage capacity easy. The user doesn’t need to buy matching disks or configure RAID; they just pop in a new disk and it gets added to the pool. Without DE, adding extra storage will presumably be a more complex task. In reality though, how many “average” PC users would feel happy upgrading the hard drive in their WHS anyway? Whilst enthusiasts relish the chance to pop open the case, many casual users would actually see their OEM-produced WHS as just an appliance, and one probably already stuffed with several 2 or 3 TB drives providing a good chunk of storage capacity right out of the box. They would not consider any upgrades to it other than replacing it when it gets full. In addition, whilst the shared drive pool concept makes adding storage easy, the ability to add extra storage as additional drives will still be there in the product, as it is in any Windows OS. I don’t see this as a huge blocker to WHS adoption.

Folder duplication utilises the DE feature to ensure that the data is duplicated onto different physical drives within the logical storage pool. This is in effect ‘RAID like’, except that the data is duplicated over time and not immediately (although there is no way of retrieving previous versions of files). This provides an easy form of fault tolerance that, whilst being fairly easy to replicate yourself using other means, will probably never be as easy as ticking a check box. This is again more of an issue for the “average” guy than the PC enthusiast who is at home configuring RAID, although a simple file copy add-in or batch job is my preferred solution. I already run daily automated RoboCopy jobs to copy ‘snapshots’ of my data drives to another drive, providing both fault tolerance and versioned snapshots that I can restore if required. I have had to dive into my snapshots on several occasions to restore a previous version of a file that has accidentally been deleted or modified. I prefer this solution over RAID, as a disk write to a drive in RAID is duplicated immediately even if it’s not what you wanted.
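
For anyone wanting something similar, a minimal sketch of that kind of daily snapshot job is shown below (the paths are purely illustrative, not my actual shares):

# A rough sketch of a daily RoboCopy snapshot job (illustrative paths only)
$stamp  = Get-Date -Format yyyyMMdd
$source = "D:\Shares\Documents"
$target = "E:\Snapshots\Documents_$stamp"
# /E copies subfolders (including empty ones), /NP suppresses per-file progress in the log
robocopy $source $target /E /NP /LOG:"E:\Snapshots\Documents_$stamp.log"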

So, what’s the positive? Well, let’s consider why Microsoft are removing it. They have said that it causes conflicts with applications installed on the Small Business Server sister OS code-named ‘Aurora’. These software applications don’t play nicely with having a logical drive pool. I, as have many other WHS enthusiasts, have over time installed numerous applications onto my WHS (e.g. Microsoft Team Foundation Server) and I always do so with caution due to DE. I am careful to ensure that nothing I install utilises the DATA drive and I often refrain from installing software that I think might conflict. With DE removed this worry is taken care of, which is definitely a positive for me.

Does WHS fit in the Cloud Computing Landscape?

If we look to the future and assume that the Cloud Computing paradigm is here to stay, the bigger question arises of what role WHS would play. I admit to being a Cloud advocate and I do share Ray Ozzie’s view of a “New Dawn” where devices (not PCs) connect to continuous services hosted on the internet. In this vision the majority of people only use devices to connect to the internet (smart phones, tablets, TVs etc) and they are continuously connected to the web, where their data is stored, analysed, processed and shared. The concept of having a local home server is almost alien as your storage will all be in the cloud. Backups won’t be required as data will be automatically synched, and devices won’t need to be imaged for restoration as they will only be simple devices with sophisticated browsers. Sure, PCs will remain for advanced users, but not for the user majority. This vision of the future is not that revolutionary; it’s already happening, so fast in fact that the next version of WHS after Vail will need to be positioned within this connected world. People may cry that users will always want their data close by and local, but that’s not true, as over time they won’t even think about it, as evidenced by early cloud services like Hotmail, Exchange Online etc.

This vision of the future relies heavily on a fast internet connection and related infrastructure, which is slowly being rolled out across the developed world, but this weakness perhaps provides an opportunity for the WHS’s of the future. The ability to synch to your local “private cloud” and use that as the hub for your home is probably a requirement of the future, and a ‘server’ device could fill this space. Unfortunately so could other home-based devices, such as the Xboxes, Google TVs and Media Centers of the future, and the single home device is the ‘holy grail’ of consumer electronics. The battle for the position as sole ‘provider’ and gateway to the continuous services of the future will be intense, and whilst the current WHS offerings (V1 and ‘Vail’) are too weak to survive the battle, maybe, just maybe, their future off-spring will fit that gap perfectly.

Summary:

WHS has, unfortunately, always been a niche product, which is a real shame as it is one of the best products ever to have come out of Redmond and one that deserves more credit. Microsoft have never promoted it and seem instead to be happy to use it as an experiment for newer technologies (like DE). This is obviously a dark period for the WHS product, but the community’s reaction to the DE news and the growing popularity of the platform mean that I believe it will survive in the short term.

If I were Microsoft I would look to extract the key features of WHS (i.e. the client backup and remote access services) and convert them into add-on applications for Windows. With DE gone there is little point in having a ‘Home’ SKU of Windows Server. Sell Windows Server 2008 Foundation to OEMs with these WHS feature applications installed for them to put on their consumer devices. This would also enable these features to be supported on Windows client OSs in the future, when it was profitable to do so. I would be happy to run a fully fledged, supported version of Windows Server that comfortably ran all server-based software but to which I could also add client backup and remote access services if I required them.

Will I upgrade to Vail? Good question. Currently I’m undecided. I will review it against other products when the time comes (Amahi on Linux, Aurora, Windows Server 2008) but one thing is for sure – the removal of DE will not affect my decision, but the strength of Microsoft’s commitment to the product will.

BigTrak Is Back

The 1980s iconic “programmable electronic vehicle” BigTrak is back and in the shops now. I always wanted one of these when I was a boy and now I finally get to buy one, and I can also justify the purchase by buying it as a Christmas present for my son. Having boys is great!

Below is the classic 80’s advert in case you’ve forgotten how useful BigTrak is for delivering an apple to your resting dad:

Backing up TFS 2010 with new Power Tools Backup Plan

At last – backup is built into the TFS product, well, via the Power Tools at least. Backing up TFS has always been difficult and non-intuitive without a SQL DBA in your pocket; even the documentation is at best extensive but at worst confusing. But all that’s history, as the September 2010 TFS Power Tools now include a Backup Plan feature.

Recently I have posted a few articles on backing up TFS 2010 using Windows PowerShell. The first involved backing up all the TFS SQL databases using PowerShell and SMO and can be found here: ‘Backing Up TFS Part-1’. The second covered an alternative strategy of backing up the latest content of your TFS repository and can be found here: ‘Backing Up TFS Part-2’.

Brian Harry recently posted the article ‘Backing up and restoring your TFS server’ describing the Backup Plan feature in the new release of the TFS Power Tools, and this week the September 2010 TFS Power Tools were released. Brian’s posts provide all the details you need on installing and configuring a backup plan, together with some screenshots.

I installed the updated Power Tools on my Windows Home Server running TFS (you must remove any previous versions manually first) and easily set up a backup plan:

TFSBackupInstallOption

I did hit one problem with it though. It seems that the feature is a little fussy about the target location of the backup files. The target must be a network share, and the tool will attempt to apply the relevant access privileges to the folder so that the scheduled job can write the backups successfully. The tool verifies that the information you have supplied is accurate and checks that a backup can run. My initial attempts to verify my backup plan failed at this point, with the following error in the log:

Microsoft.SqlServer.Management.Smo.FailedOperationException: Backup failed for Server ‘servername\SqlExpress’.  —> Microsoft.SqlServer.Management.Common.ExecutionFailureException: An exception occurred while executing a Transact-SQL statement or batch. —> System.Data.SqlClient.SqlException: Cannot open backup device ‘\\servername\backup\tfsbackups\Tfs_Configuration_20100911231343.bak’. Operating system error 5(failed to retrieve text for this error. Reason: 1815).

After some investigation and a helpful post from Dave Hunter I concluded that this was due to the permissions on the backup share on my Home Server. As that share is a WHS managed share it has specific security permissions that prevented the backup tool from asserting its authority and granting the relevant privileges. To circumvent the problem I created a new standard non-WHS share on my Home Server with minimal restrictions. Once I entered the new share’s details the backup tool verified the backup plan successfully and ran a backup fine. I then knocked up a simple RoboCopy script to copy the contents of the new backup share to my original intended WHS share target location on a daily basis via a scheduled task.
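
For reference, the daily copy job I set up is essentially a one-liner along these lines (the source share name is hypothetical; the destination is the WHS share I originally intended to use):

# Mirror the plain non-WHS share that the Backup Plan tool writes to into the WHS-managed share,
# run via a daily scheduled task. Note /MIR removes files from the destination that no longer
# exist in the source, so point it at a dedicated folder.
robocopy "\\servername\TfsBackupStaging" "\\servername\backup\tfsbackups" /MIR /NP /LOG:"C:\Scripts\Logs\TfsBackupMirror.log"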

In summary I believe that this is a major step forward for TFS and will benefit many of the new users picked up since the introduction of the TFS Basic Configuration in TFS 2010, proving to be another nail in the coffin for Visual SourceSafe. I’d recommend any team still using SourceSafe or any other tool to take another look at TFS as it is definitely getting easier to manage than previously.

Backing Up TFS 2010 Using PowerShell: Part 2

In my previous post (“Backing Up TFS 2010 Using PowerShell: Part 1”) I covered how to back up the TFS SQL Server databases using Windows PowerShell and the SQL Server Management Objects (SMO) API. In this post I’m going to take it one stage further and add another layer of backup for the paranoid amongst you, like me.

Backing up the databases is all well and good, but I like to be able to see what I’m backing up too, and there’s nothing quite like being able to see all your data on disk in its native format. So to give me this extra warm fuzzy feeling I have written a PowerShell script to perform a ‘Get-Latest’ of all the source code in my ‘Team Project Collection’ in TFS and then copy those latest file versions to a backup location. By scheduling this activity to occur periodically through the week I see two key benefits. Firstly, I can see that I have the data in a second place and that it’s in its native format (i.e. just plain files of source code, documents, images etc), which I can then back up, knowing that should I not be able to restore the SQL databases I will at least have the latest source files. The second benefit is that I have the ability to quickly access my source code (on the local machine or remotely) from another machine that may not have Visual Studio installed. This is useful if I just want to check something out and don’t need to modify the files.

I wanted this script to run on the server (in my case a Windows Home Server), and so to be able to perform a Get-Latest on TFS I needed to install ‘Team Explorer’ and set up a ‘Workspace’. Team Explorer is the standalone client required to interact with TFS source code without Visual Studio. The Team Explorer setup package is included on the TFS installation media in the TeamExplorer folder. Browse to this, run Setup.exe and keep the default values. This installs the Visual Studio 2010 shell, the TFS object model, Visual Studio Team Explorer 2010 and MS Help Viewer.

Next you need to set up a Workspace for the server using Team Explorer (via Start > All Programs > Microsoft Visual Studio 2010 > Microsoft Visual Studio 2010). Access the TFS server by adding the server in the normal way and go to Source Control. Select the top item in Source Control (in most cases ‘SERVERNAME\DefaultCollection’), right-click and choose “Map to Local Folder”. Enter the path of a local folder on the server and it will then ask if you want to get latest; say yes. You now have a workspace set up and will be able to perform a manual or automated ‘Get-Latest’.

To automate the interaction with TFS and perform the ‘get’ you could use the TF.exe command line tool that installs with the Team Explorer setup. Personally, as a PowerShell fan, I chose to use the PowerShell cmdlets that come with the Team Foundation 2010 Power Tools. To use these you’ll need to download the Power Tools and run the tfpt.msi installation package. Choose a custom install and select just the server-specific items, in this case the PowerShell cmdlets and the command line interface options (see screenshot).

powertolls
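
For completeness, the TF.exe route mentioned above is roughly a two-liner like this (the install path shown is the default Visual Studio 2010 location and may differ on your machine):

# Run a recursive get-latest over the mapped workspace using TF.exe instead of the cmdlets
Set-Location "c:\tempcode\tfsbackups\latestdata"   # the local workspace folder mapped earlier
& "C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\TF.exe" get /recursive /noprompt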

The basic flow of the script is to configure the file paths required, import the Microsoft.TeamFoundation.PowerShell snap-in, and then call the ‘Update-TfsWorkspace’ cmdlet. This is the command that performs a Get-Latest on the workspace previously configured. The workspace folder you specify must match the local folder you specified earlier in Team Explorer. The next step is to use Robocopy to copy the files to the backup location. Robocopy is a superb file copy tool as it only copies changed files, which significantly improves the speed of this procedure compared to using a standard file copy cmdlet. Once complete, the log is passed to RoboCopyLogger (more about this in a future blog post), which scans the log for errors, and then we add an Event Log entry for success or failure.

Finally, set up a Scheduled Task to run this script automatically each week (an example of registering such a task from the command line follows the script). If you were to use this script as-is you would need to change the file paths and use an Event Log source that was relevant to your system (or just remove the Write-EventLog calls).


clear-host
write-output "------------------------------------Script Start------------------------------------"
write-output " TFS Data Get-Latest Backup "
write-output "------------------------------------------------------------------------------------" 

# set file paths
$LocalWorkspaceFolder="c:\tempcode\tfsbackups\latestdata"
$RemoteBackupFolder="\\WinHomeSvr\Backup2\TFSBackups\LatestData"
$Robocopy="c:\scripts\Robocopy.exe"
$timestamp = Get-Date -format yyyyMMddHHmmss
$Log="c:\Scripts\Logs\TFSData\TFSLatestDataBackup_$timestamp.txt"
$RobocopyLogger="c:\Scripts\RoboCopyLoggerTweet_BETA\RobocopyLoggerTweet"

# set error action preference so errors stop and the try/catch kicks in
$erroractionpreference = "Stop"

try
{
    # add the TFS snapin to the session
    add-pssnapin Microsoft.TeamFoundation.Powershell

    # get latest on the top level workspace level
    Update-TfsWorkspace $LocalWorkspaceFolder -Force -Recurse

    # now robocopy the local cache to the final backup folder
    invoke-expression "$Robocopy $LocalWorkspaceFolder
                                        $RemoteBackupFolder /MIR /LOG:$Log /NP"

    # Copy the log file too (Robocopy expects source folder, destination folder, file name)
    $LogFolder = Split-Path $Log
    $LogFile = Split-Path $Log -Leaf
    invoke-expression "$Robocopy $LogFolder $RemoteBackupFolder $LogFile"

    # Check the log for errors etc and log to eventlog/twitter
    invoke-expression "$RobocopyLogger $Log 1"

    # write success to the event log
    write-output "Writing SUCCESS to EventLog"
    write-eventlog -LogName "HomeNetwork" -Source "TFS Backups" -EventId 1 -Message "TFS Data BackUp Script ran"
}
catch
{
    # error occurred so lets report it
    write-output "ERROR OCCURRED" $error 

    # write an event to the event log
    write-output "Writing FAIL to EventLog"
    write-eventlog -LogName "HomeNetwork" -Source "TFS Backups" -EventId 1 -Message "TFS Data BackUp Script failed" -EntryType Error
}

write-output "------------------------------------Script end------------------------------------" 

Backing Up TFS 2010 Using PowerShell: Part 1

In a previous post I covered how to install Team Foundation Server 2010 onto a Windows Home Server. The installation was a TFS Basic Configuration installation, and whilst it was geared towards Windows Home Server the concepts are the same if you are installing it on other servers or workstations. This post will cover how to back up the TFS databases. For backing up just the raw source files too, check out Part 2 of this post.

Now, I am a healthily paranoid kind of guy, and after installing TFS the first thing I decided to get right was a method to back up my new TFS installation to protect against data loss. I’m not going to sleep easy until I know that my source code is backed up in a solid, repeatable manner. The backup tool of choice is Windows PowerShell, due to the sheer power that this scripting shell provides.

Firstly, it’s important to understand that Microsoft Team Foundation Server relies completely on Microsoft SQL Server for its data persistence. Therefore backing up TFS is just a matter of backing up the SQL Server databases in the TFS data tier. Usually, unless you are installing an enterprise TFS solution, the databases will reside on the same server as the rest of the TFS installation. The number of databases created by TFS will vary depending on the number of ‘Project Collections’ you create in TFS. Therefore, to avoid having to update your backup scripts each time you add or remove a collection, and provided your SQL Server instance is only used for TFS, it’s safer to just back up ALL the databases.

I strongly recommend that you read the TFS documentation on how to back up TFS and use the information in this post only as supplementary material, as backing up your data is a serious business and I’d hate for something to go wrong.

I used this excellent post from Donabel Santos as the inspiration for my PowerShell script and modified it to customise it for TFS and to provide additional functionality. The SQL Server interaction is through the SQL Server Management Objects (SMO) API, which provides a rich collection of objects through which you can interact with your databases. PowerShell makes interacting with these objects easy.

The basic flow of the script is to connect to the SQL Server instance using the SMO objects and then loop through the collection of databases within that instance. We ignore the ‘tempdb’ database as backing it up is neither required nor possible. We then back up each database to file, again using SMO. Once all databases have been backed up to files we zip them up (using the excellent 7-Zip) and copy the zip file to a backup location. You don’t need to install the full 7-Zip package on your server as you can download a command-line friendly version that just needs to be copied across to your server. The end of the script then records the success or failure of the run to the Event Log.

Obviously, if you were to use this script you would need to change the file paths and SQL Server instance name, and use an Event Log source that was relevant to your system (or just remove the Write-EventLog calls).

clear-host
write-output "-------------------------------Script Start----------------------------------"
write-output " TFS SQL Database Backup "
write-output "------------------------------------------------------------------------------------"

# load modules used in this script
import-module -name C:\scripts\support\SupportModule -verbose

# load assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null

# create a new server object, and set backup path and timestamp info (they will share same timestamp)
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "winhomesvr\SQLEXPRESS"
$timestamp = Get-Date -format yyyyMMddHHmmss
$SQLDataFolder = "C:\Program Files\Microsoft SQL Server\MSSQL10.SQLEXPRESS\MSSQL\DATA"
$backupDirectory = "c:\tempcode\tfsbackups\sql\bkupdir"
$backupZipStore = "\\winhomesvr\backup2\tfsbackups\sql\Zipped\"
$backupRawDataZipStore = "\\winhomesvr\backup2\tfsbackups\sql\ZippedRAW\"
$7ZipExePath = "c:\Scripts\Support\7-Zip\7za"
$7ZipCmdLineForBkUps = $7ZipExePath + " a " + $backupZipStore + "TFSSQLBkup_" + $timestamp + ".zip " + $backupDirectory
$7ZipCmdLineForRawDataFileBkUps = $7ZipExePath + " a " + $backupRawDataZipStore + "TFSSQLBkupRAW_" + $timestamp + ".zip " + $backupDirectory

# display settings
write-output "Backup Directory: " $backupDirectory
write-output "Backup Zip Store: " $backupZipStore
write-output "Timestamp: " $timestamp

# set error action preference so errors stop and the try/catch kicks in
$erroractionpreference = "Stop"

try
{
    write-output "Deleting old backup files"
    remove-item -Path ($backupDirectory + "\*.*") -force

    # loop all databases in server, and backup each one using SQL Backup
    foreach($db in $server.Databases)
    {
        # set database name
        $dbName = $db.Name

        # exclude the tempdb as you can't back that one up
        if ($dbName -ne "tempdb")
        {
            write-output "Processing database: " $dbName
            $smoBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
            $smoBackup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
            $smoBackup.BackupSetDescription = "Full backup of " + $dbName
            $smoBackup.BackupSetName = $dbName + " Backup"
            $smoBackup.Database = $dbName
            $smoBackup.MediaDescription = "Disk"
            $smoBackup.Devices.AddDevice(
                          $backupDirectory + "\" + $dbName + "_" + $timestamp + ".bak", "File")
            $smoBackup.Initialize = $TRUE
            $smoBackup.SqlBackup($server)
            write-output "Processed database: " $dbName
        }
    }

    write-output "Processed all databases, listing outputs..."

    #let's confirm, let's list all backup files
    $directory = Get-ChildItem $backupDirectory

    #list only files that end in .bak
    $backupFilesList = $directory | where {$_.extension -eq ".bak"}
    $backupFilesList | Format-Table Name, LastWriteTime

    write-output "Clear out old zipped file from the zip storage folder..."
    Remove-MostFiles $backupZipStore *.zip 2

    write-output "Zipping up the files..."
    invoke-expression $7ZipCmdLineForBkUps

    # write success to the event log
    write-output "Writing SUCCESS to EventLog"
    write-eventlog -LogName "HomeNetwork" -Source "TFS Backups" -EventId 1 -Message "TFS SQL BackUp Script ran"
}
catch
{
    # error occurred so lets report it
    write-output "ERROR OCCURRED" $error

    # write an event to the event log
    write-output "Writing FAIL to EventLog"
    write-eventlog -LogName "HomeNetwork" -Source "TFS Backups" -EventId 1 -Message "TFS SQL BackUp Script failed" -EntryType Error
}

write-output "------------------------------------Script end------------------------------------"

Update: Since writing this post Microsoft have updated the TFS Power Tools to include a Backup Plan feature, which enables you to schedule full backups from the TFS Admin Console. Check out this post.

Installing Team Foundation Server 2010 on Windows Home Server

Last October I posted about the fact that Microsoft’s Team Foundation Server 2010 was going to ship with a “Basic” configuration that is more lightweight and seen as more of a Visual SourceSafe replacement. In that post I also pointed out that it seemed possible to install this version on a Windows Home Server (WHS) and that it would be something I’d try. Well, eventually I have got round to doing it and of course I’ve documented my approach. The benefits of TFS are immense but outside the scope of this article; instead I’m covering the installation process.

It may have been technically possible to jump through some hoops and install TFS 2008 on your WHS box but the pre-requisites were heavy (including SharePoint and full SQL Server) and it was always seen as a complicated process. With the ‘basic’ configuration of TFS 2010 you can ignore SharePoint and SQL Reporting Services and it installs onto SQL Server 2008 Express (which it also installs for you).

A point worth mentioning is that TFS 2010 will not trash your existing web sites (important for Windows Home Servers) but will instead install its own site alongside any existing sites on the server. It is possible to then connect to TFS remotely via your WHS Remote Access domain name, which is a very nifty feature.

Pre-Installation Considerations:

Basic Configuration: Whilst it’s possible to install the non-basic configurations I don’t see the point unless there is something specific that you want to take advantage of that’s not included in the basic configuration. For a light, easy installation and a nice minimal processing overhead on your WHS the ‘basic’ configuration is ideal. It also contains all the core features such as build automation, work item tracking and source control.

Location of SQL Server Data Files: By default the SQL Server data files are installed to the system drive, under the program files location for SQL Server Express. As you add more and more content to TFS these files will naturally grow, so you need to consider up front whether you have the disk space to support this. Windows Home Server by default installs only a 20GB system partition, which should have adequate space for the data files to grow, but if you have installed a lot of applications to your system drive this could be an issue. To store the data files at an alternative location it is easier to install SQL Server 2008 Express manually prior to installing TFS 2010 (and configure the data file locations via the SQL setup); the TFS installation will then just reuse the SQL Express instance already installed. Alternatively you could install the data files on the WHS ‘Data’ drive (D drive), but personally I prefer to leave that drive well alone and let WHS’s ‘Drive Extender’ manage all the data in its Storage Pool. I did consider installing a new drive to my server, not added to the storage pool, which would then be used for my data files. In the end I decided I had adequate space on the system drive and I could move the data files at a future point in time if required.
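
If you are unsure whether your system partition has room to spare, a quick free-space check along these lines can help with the decision (a generic WMI query, nothing TFS-specific, assuming PowerShell is available on the box):

# Report free and total space on the system drive before committing the SQL data files to it
Get-WmiObject Win32_LogicalDisk -Filter "DeviceID='C:'" | Select-Object DeviceID, @{Name="FreeGB";Expression={[math]::Round($_.FreeSpace / 1GB, 1)}}, @{Name="SizeGB";Expression={[math]::Round($_.Size / 1GB, 1)}}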

Build Controller: Part of the TFS installation/configuration includes the decision to install a Build Controller on the server. If you plan on running automated TFS builds then you’ll need to install this. I would strongly recommend you use the Team Build features of TFS as they are one of its key features but if you don’t want to automate builds or want to minimise the services running on your server then just skip that part.

Support: Whilst WHS is really Windows Server 2003 under the covers, and TFS 2010 clearly supports Windows Server 2003, installing it on a Windows Home Server is not supported by Microsoft (or anyone else) and you do so at your own risk.

Steps:

You’ll need to Remote Desktop onto your server to run the installation interactively. The below screenshots show the key stages of the installation process. Luckily it’s mostly a matter of clicking ‘Next’:

whs_tfs_01 whs_tfs_02

Choose to install Team Foundation Server and the Build Service (if you plan to run automated builds on this server too). Don’t install the Server Proxy:

whs_tfs_04  whs_tfs_06

After much processing and a reboot it’s installed:

whs_tfs_07 whs_tfs_08

Once installed, it then launches the Configuration Centre and it’s at this point that you can choose the ‘Basic’ configuration. It will then install SQL Server Express or point to an existing SQL instance:

whs_tfs_10 whs_tfs_11 whs_tfs_12    whs_tfs_16 whs_tfs_17

Next you’re asked to configure the Build Service details:

whs_tfs_18 whs_tfs_19 whs_tfs_20 whs_tfs_21 whs_tfs_22    whs_tfs_26 whs_tfs_27

That’s it. You can launch the ‘Team Foundation Server Administration Console’ from the Start Menu which is a neat new tool enabling you to manage TFS from the server itself:

whs_tfs_29

Next Steps:

Connect via Visual Studio: Now it’s all up and running you can fire up Visual Studio and test that you can connect to the new Team Foundation Server by adding it to the list of servers in Team Explorer. You should be able to create new Team Projects, check in code and run builds just as you would on any other TFS server.

Enable Remote Access: To enable remote access to TFS so you can access your source code from remote locations you need to copy your client certificate from the Remote Access site and then add it to the TFS site. For instructions on how to do this see this excellent post by Jason Neave.

Backups: Call me paranoid but before I add anything important to this TFS server I want an automated backup procedure in place. Backing up TFS is really ‘just’ a matter of backing up the SQL Server databases on which TFS sits as that is the only source of its data. I say ‘just’ because it’s not as easy as it should be. A key point to note is that as TFS uses numerous databases it is critical to backup all of them at the same time so you don’t end up restoring unsynchronised databases. I have created an automated backup procedure using Windows PowerShell that I will share in a future post.

Conclusion:

It’s exciting to see a combination of two excellent products working together and I think this is another great way to add value to your Windows Home Server whilst enabling you to experiment with a quality ALM tool.

UPDATE: For WHS 2011 users check out my post on Installing TFS 2010 on Windows Home Server 2011.

Installing PowerShell on Windows Home Server

Whether you manage thousands of Windows boxes in an enterprise environment or you just want a more powerful shell environment with which to manage your backup scripts on your PC, Windows PowerShell is an excellent free tool at your disposal. PowerShell V2 comes pre-installed in Windows 7 but how do you get it up and running on your Windows Home Server?

You can install V1 of PowerShell via Windows Update as it is an optional update on Windows 2003 Servers (SP2) which is what a Windows Home Server is underneath. The instructions to do this are detailed here. Basically you go to Automatic Updates on the Control Panel, view the list of ‘optional’ updates and it should be there for you to choose.

What about PowerShell v2? Well a recent post on the PowerShell Team’s blog suggests that V2 will also be available via an optional update to Windows 2003 Servers, replacing the V1 option. The same instructions as above should apply.

If the update isn’t available to you yet or you want to install V2 the manual way then you need to download the Windows Management Framework Core (KB968930) package which is detailed here. The Windows 2003 download relevant for Home Servers is here. It’s a very simple install and once complete you can access PowerShell V2 from the Start > Programs > Accessories > Windows PowerShell menu item.

POSH_WHS_01  POSH_WHS_03
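
Once it’s installed, a quick way to confirm which version you ended up with is shown below ($PSVersionTable only exists from V2 onwards, so its absence means you are still on V1):

# Print the PowerShell engine version; $PSVersionTable was introduced in V2
if (Test-Path Variable:PSVersionTable) { $PSVersionTable.PSVersion } else { $Host.Version }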

For more information on Windows PowerShell check out “Getting Started with PowerShell” on MSDN or launch the PowerShell ISE environment from the PowerShell Start Menu folder (detailed above) and press F1. The bundled help files are surprisingly good and will soon get you up to speed. I intend to be posting more on PowerShell as I migrate all my server backup scripts over from standard batch file format to PowerShell.