How to Backup To USB Drive Only If It’s Connected

A key part of most personal data backup strategies involves backing up data to an external USB drive, but I don’t want to leave mine constantly connected. In this post I cover how to back up to an external drive using a scheduled, automated process, but only if the external drive is connected at the time.

I don’t believe in leaving an external backup USB drive permanently connected to my system (PC or server), to avoid the data being corrupted or deleted. Also, if the backup drive is destined for off-site storage then it can’t be permanently connected anyway. Using the steps below I can perform a full data backup by simply physically connecting the drive (plugging in a USB drive or docking a SATA drive into a USB dock, for example) and leaving it overnight. In the morning I can safely disconnect it and store it away without even having to log onto the machine.

The basic flow:

1) Assuming that the machine is always on, which my server is, a scheduled task runs every night and executes a backup script (regardless of the backup drive being connected or not). If your machine is not always on then vary this by setting the scheduled task at a time when the machine is usually on.
2) The script checks for the existence of a specific folder on the connected drive (e.g. U:\backup\). If the drive isn’t connected then the folder path won’t exist and the script just exits happily. However, if the drive has been connected then the folder path will exist and the backup script will continue and copy over the data.
3) After a successful backup the script safely disconnects the USB drive. This step is technically optional, as Windows supports pulling a USB drive out without a soft eject, but it’s highly recommended to tell Windows first to avoid data corruption.
4) Optionally you can also output a backup log somewhere to enable you to check the logs periodically without having to reconnect the USB drive to verify the job worked.

More detailed steps:

First, connect the external USB drive and make a note of the drive letter it uses. Changing the drive letter to something memorable might help (B for Backup, U for USB, O for Offsite etc.); we’ll use U for this example. Next we need to write a DOS command script, a simple program or a PowerShell script to perform the backup of the data using the backup tool of your choice. I use Robocopy to copy the files via a PowerShell script, and below is a simplified version of my script.

# Checks for the presence of the offsite backup USB drive and backs up the relevant data to
# the drive if present; exits gracefully if not. Run the script every night and it will only
# back up data when the offsite USB external drive is connected.

# set file paths and log file names (example values - adjust to suit your environment)
$timestamp = Get-Date -format yyyyMMdd_HHmmss
$USBDriveLetter = "U"
$USBDriveBackupPath = "U:\Backup"
$LogBasePath = "C:\Logs\Backups"
$LogFile = "$LogBasePath\USBBackup_$timestamp.log"

# set error action preference so errors become terminating and the try/catch kicks in to handle them gracefully
$erroractionpreference = "Stop"

try
{
    # Check the USB drive is connected by verifying the path
    if (Test-Path $USBDriveBackupPath)
    {
        # now copy the data folders to the backup drive
        invoke-expression "Robocopy C:\Docs $USBDriveBackupPath\Docs /MIR /LOG:$LogFile /NP"
        invoke-expression "Robocopy C:\Stuff $USBDriveBackupPath\Stuff /MIR /LOG+:$LogFile /NP"

        # Copy the log file too
        invoke-expression "Robocopy $LogBasePath $USBDriveBackupPath\Logs /MIR /NP"

        # Sleep for 60 seconds to ensure all transactions complete, then disconnect the USB drive
        Start-Sleep -Seconds 60
        Invoke-Expression "c:\DevCon\USB_Disk_Eject /removeletter $USBDriveLetter"
    }
}
catch
{
    # Catch the error, log it somewhere, but make sure you still eject the drive
    Start-Sleep -Seconds 60
    Invoke-Expression "c:\DevCon\USB_Disk_Eject /removeletter $USBDriveLetter"
}

It is key to include in the script a check for the existence of a specific folder on the drive letter belonging to the external drive (U in our example). Only if it’s present do we continue with the backup.

I make sure that Robocopy logs its output to a file, and that file lives on the server and is also copied to the USB drive (as a record of the last backup date etc.). I also report the running of the PowerShell script to the Event Log for reporting purposes, but that is outside the scope of this post.
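If you want to do something similar, the built-in event log cmdlets are all you need. Here’s a minimal sketch, assuming a custom log and source; the "HomeNetwork" log and "USB Backups" source names are just examples that you’d create once up front:

# One-off setup: create the custom event log and source (run elevated; names are examples)
New-EventLog -LogName "HomeNetwork" -Source "USB Backups"

# In the backup script: record the outcome so it can be checked later in Event Viewer
Write-EventLog -LogName "HomeNetwork" -Source "USB Backups" -EventId 1 -EntryType Information -Message "USB backup script completed successfully"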

It’s safer to tell Windows that you’re about to pull the drive out, so I call USB_Disk_Eject from within my script, passing in the drive letter. I then wait 60 seconds to ensure the drive has had sufficient time to disconnect before the script exits. There are a few tools available for ejecting USB drives, such as Microsoft’s Device Console (DevCon.exe), but I use USB Disk Ejector.

Now set up a Scheduled Task in Windows to run the script every night at a set time. As the script is scheduled to run every night, all I have to do if I want to perform a backup is connect my backup drive and leave it until the morning. The script will run overnight, find the drive, back up the data and disconnect the drive. In the morning I can just physically disconnect the drive safely without having to log onto the machine. Periodically I’ll check the backup logs and the backup drive to make sure all is well and to check the remaining drive space.
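If you prefer to create the task from the command line rather than through the Task Scheduler UI, something along these lines should do it; the task name, script path and start time are purely illustrative examples and assume PowerShell is installed:

schtasks /create /tn "Nightly USB Backup" /tr "powershell.exe -NoProfile -File C:\Scripts\UsbBackup.ps1" /sc daily /st 02:00 /ru SYSTEM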

UPDATE: A working PowerShell script can be found on my GitHub site.

Do you like this approach? Got a better idea? Let me know via the comments.

Installing Team Foundation Server on Windows Home Server 2011

Twelve months ago I wrote a post documenting “Installing Team Foundation Server 2010 on Windows Home Server”, which has proved very popular. Well, things move on and since then Microsoft have released a new version of Windows Home Server (WHS 2011). There are many differences between V2 of WHS and V1, but the main points for the purpose of this post are that WHS 2011 is built on top of Windows Server 2008 R2 (compared to Windows Server 2003 for V1) and, controversially, the Drive Extender technology has been removed. Whilst Drive Extender was no doubt useful for storage pooling, it did make installing applications like TFS a little concerning. As described in my original article, I wouldn’t install an application or a SQL Server database into the drive pool (it just feels wrong to me and I wouldn’t trust it), and I stick by this, especially as it’s been suggested that one of the reasons for Microsoft removing DE was that it didn’t play nicely with the enterprise applications targeted at the new Small Business Server Essentials product range with which WHS 2011 shares its code. No DE means you can now install TFS to whichever drive you like in WHS 2011, and the fact that it’s built on the excellent Server 2008 base means it benefits from the stability and performance improvements this brings. I’ve not found any issues with TFS on WHS 2011 and don’t expect to (although it’s not supported, so you install it at your own risk). I think that WHS 2011 will make an even better TFS server than WHS V1.

Other than the decision of where to install TFS due to DE, the installation instructions are the same as in my original post. After installation I recommend installing the TFS Power Tools and then configuring TFS backups as described in these posts: Backing Up TFS 2010 Using PowerShell: Part 1, Backing Up TFS 2010 Using PowerShell: Part 2 and Backing up TFS 2010 with new Power Tools Backup Plan.

Android Remote Desktop Client

I find that I am increasingly relying on the computing power of my Android smartphone (an HTC Desire) and finding novel ways of using it to make my IT life easier. Sometimes I just want to connect to my PC that is in another room, or more often for me it’s my headless Windows Home Server, and so I scouted for a Remote Desktop client that I could run on my phone. The key requirement was for it to use the Windows native Remote Desktop protocol and therefore not require any software to be installed on my PC or server, which ruled out a lot of the VNC-based apps. Luckily 2X have released an excellent free app that ticks all the boxes.

2XClient for Android can be found here or on Android Market here. It is dead easy to set up the target machines and there are several display optimisation options. The key thing though is that it’s actually very easy to navigate the target machine’s desktop via a custom keyboard and a nifty mouse icon that can be dragged around, with a left and right mouse button attached (left image below). In these images I’m logging onto my Windows Home Server (a Windows 2003 based OS) but I also use it with my Windows 7 PC too. One thing to note for Windows 7 though is that I needed to set my Remote Desktop settings (via My Computer > System Properties > Remote Settings) to “Allow connections from computers running any version of Remote Desktop” as opposed to the default setting of enforcing Network Level Authentication.

 2XClient_Mouse  2XClient_Keyboard  2XClient_StartMenu

It is surprisingly easy to do simple tasks on the target machine, especially after a bit of practice. Here I am using PowerShell and checking my Home Server Console.


A very powerful tool to have on your phone and ideal for those quick techy tasks when you can’t be bothered to get off the sofa.

The Future of Windows Home Server

Microsoft’s recent announcement that the key Drive Extender feature is to be removed from the new version of Windows Home Server codenamed ‘Vail’ has resulted in much dismay within the community. Many commentators, including the vocal WHS user community itself, have started to question the future of this product. In this post I give my take on where I see WHS in the medium term and consider how it can fit alongside the “new dawn” of a Cloud Computing era.

How big is the Drive Extender issue?

Firstly, what’s all this about Drive Extender (DE)? Well, DE is a really neat feature of WHS that pools all the hard drives in the system into one logical data drive. This means that you can throw in a mixed selection of hard drives of any type (USB, SATA etc.) or capacity and the system enables you to see them as one. It also provides fault tolerance through data duplication, which protects your data from drive failure. It is one of the major features of Windows Home Server; I would argue one of three, the others being the client backups and remote access. Sure, the product does much more than just that, but it’s fair to say that all of WHS’s features are available in other products in some shape or form, and it’s the combination of these three features into one customisable platform that made WHS stand out for me.

Microsoft’s announcement that DE is to be removed from the next version of WHS, code named ‘Vail’, immediately removes a major reason to buy into the new version, and this has been evident in the recent Twitter comments on the subject, where a lot of people have stated their intention not to use ‘Vail’. Of course some of this is just anger at the fact that the feature has been removed (and at the way in which it was announced), but still the fact remains that the product is a weaker proposition than it was before.

Personally I see this decision in both a negative and a positive light. Firstly, I see it as a major blow to the uniqueness of the product and feel that it will suffer without this USP (Unique Selling Point). It’s also important to remember that this is positioned as a product for the average PC user, and DE made extending the storage capacity easy. The user doesn’t need to buy matching disks or configure RAID; they just pop in a new disk and it gets added to the pool. Without DE, adding extra storage will presumably be a more complex task. In reality, though, how many “average” PC users would feel happy upgrading the hard drive in their WHS anyway? Whilst enthusiasts relish the chance to pop open the case, many casual users would actually see their OEM-produced WHS as just an appliance, and one probably already stuffed with several 2 or 3 TB drives providing a good chunk of storage capacity right out of the box. They would not consider any upgrades to it other than replacing it when it gets full. In addition, whilst the shared drive pool concept makes adding storage easy, the ability to add additional storage as additional drives will still be there in the product, as it is in any Windows OS. I don’t see this as a huge blocker to WHS adoption.

Folder duplication utilises the DE feature to ensure that data is duplicated onto different physical drives within the logical storage pool. This is in effect ‘RAID like’, except that the data is duplicated over time and not immediately (and there is no way of retrieving previous versions of files). It provides an easy form of fault tolerance that, whilst fairly easy to replicate yourself using other means, will probably never be as easy as ticking a check box. This is again more of an issue for the “average” guy than for the PC enthusiast who is at home configuring RAID, although a simple file copy add-in or batch job is my preferred solution. I already run daily automated RoboCopy jobs to copy ‘snapshots’ of my data drives to another drive, providing both fault tolerance and versioned snapshots that I can restore if required. I have had to dive into my snapshots on several occasions to restore a previous version of a file that had accidentally been deleted or modified. I prefer this solution over RAID, as a write to a drive in a RAID array is duplicated immediately even if it’s not what you wanted.
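As a rough sketch of that kind of scheduled snapshot job (the paths below are examples, not my actual folder layout), a dated Robocopy copy run nightly gives you a restorable point-in-time copy:

# Example paths - adjust to suit your own drives and shares
$stamp = Get-Date -Format yyyyMMdd
Invoke-Expression "Robocopy D:\Shares\Data E:\Snapshots\Data_$stamp /E /NP /LOG:E:\Snapshots\Data_$stamp.log"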

So, what’s the positive? Well, let’s consider why Microsoft are removing it. They have said that it causes conflicts with applications installed on the Small Business Server sister OS, code named ‘Aurora’. These software applications don’t play nicely with having a logical drive pool. I, like many other WHS enthusiasts, have over time installed numerous applications onto my WHS (e.g. Microsoft Team Foundation Server) and I always do so with caution because of DE. I am careful to ensure that nothing I install utilises the DATA drive, and I often refrain from installing software that I think might conflict. With DE removed this worry is taken care of, which is definitely a positive for me.

Does WHS fit in the Cloud Computing Landscape?

If we look to the future and assume that the Cloud Computing paradigm is here to stay, the bigger question arises of what role WHS would play. I admit to being a Cloud advocate and I do share Ray Ozzie’s view of a “New Dawn” where devices (not PCs) connect to continuous services hosted on the internet. In this vision the majority of people only use devices to connect to the internet (smartphones, tablets, TVs etc.) and they are continuously connected to the web, where their data is stored, analysed, processed and shared. The concept of having a local home server is almost alien as your storage will all be in the cloud. Backups won’t be required as data will be automatically synched, and devices won’t need to be imaged for restoration as they will only be simple devices with sophisticated browsers. Sure, PCs will remain for advanced users, but not for the user majority. This vision of the future is not that revolutionary; it’s already happening, so fast in fact that the next version of WHS after Vail will need to be positioned within this connected world. People may cry that users will always want their data close by and local, but that’s not true, as over time they won’t even think about it, as evidenced by early cloud services like Hotmail, Exchange Online etc.

This vision of the future relies heavily on a fast internet connection and related infrastructure, which is slowly being rolled out across the developed world, but this weakness perhaps provides an opportunity for the WHSs of the future. The ability to synch to your local “private cloud” and use that as the hub for your home is probably a requirement of the future, and a ‘server’ device could fill this space. Unfortunately, so could other home-based devices, such as the Xboxes, Google TVs and Media Centers of the future, and the single home device is the ‘holy grail’ of consumer electronics. The battle for the position as sole ‘provider’ and gateway to the continuous services of the future will be intense, and whilst the current WHS offerings (V1 and Vail) are too weak to survive the battle, maybe, just maybe, their future offspring will fit that gap perfectly.


WHS has, unfortunately, always been a niche product, which is a real shame as it is one of the best products ever to have come out of Redmond and one that deserves more credit. Microsoft have never promoted it and seem instead to be happy to use it as an experiment for newer technologies (like DE). This is obviously a dark period for the WHS product, but the community’s reaction to the DE news and the growing popularity of the platform mean that I believe it will survive in the short term.

If I were Microsoft I would look to extract the key features of WHS (i.e. the client backup and remote access services) and convert them into add-on applications for Windows. With DE gone there is little point in having a ‘Home’ SKU of Windows Server. Sell Windows Server 2008 Foundation to OEMs with these WHS feature applications installed, for them to put on their consumer devices. This would also enable these features to be supported on Windows client OSs in the future, when it was profitable to do so. I would be happy to run a fully fledged, supported version of Windows Server that comfortably ran all server-based software but on which I could also install client backup and Remote Access services if I required them.

Will I upgrade to Vail? Good question. Currently I’m undecided. I will review it against other products when the time comes (Amahi on Linux, Aurora, Win Server 2008) but one thing is for sure – the removal of DE will not affect my decision but the strength of Microsoft commitment to the product will.

Backing up TFS 2010 with new Power Tools Backup Plan

At last, backup is built into the TFS product (well, via the Power Tools at least). Backing up TFS has always been difficult and non-intuitive without a SQL DBA in your pocket; even the documentation is at best extensive and at worst confusing. But all that’s history, as the September 2010 release of the TFS Power Tools now includes a Backup Plan feature.

Recently I have posted a few articles on backing up TFS 2010 using Windows PowerShell. The first involved backing up all the TFS SQL databases using PowerShell and SMO and can be found here: ‘Backing Up TFS Part 1’. The second covered an alternative strategy of backing up the latest content of your TFS repository and can be found here: ‘Backing Up TFS Part 2’.

Brian Harry recently posted an article, ‘Backing up and restoring your TFS server’, describing the new Backup Plan feature of the latest release of the TFS Power Tools, and this week the September 2010 TFS Power Tools were released. Brian’s posts provide all the details you need on installing and configuring a backup plan, together with some screenshots.

I installed the updated Power Tools on my Windows Home Server running TFS (you must remove any previous versions manually first) and easily set up a backup plan:


I did hit one problem with it though. It seems that the feature is a little fussy about the target location of the backup files. The target must be a network share, and the tool will attempt to apply the relevant access privileges to the folder so that the scheduled job can write the backups successfully. The tool verifies that the information you have supplied is accurate and checks that a backup can run. My initial attempts to verify my backup plan failed at this point, with the following error in the log:

Microsoft.SqlServer.Management.Smo.FailedOperationException: Backup failed for Server ‘servername\SqlExpress’.  —> Microsoft.SqlServer.Management.Common.ExecutionFailureException: An exception occurred while executing a Transact-SQL statement or batch. —> System.Data.SqlClient.SqlException: Cannot open backup device ‘\\servername\backup\tfsbackups\Tfs_Configuration_20100911231343.bak’. Operating system error 5(failed to retrieve text for this error. Reason: 1815).

After some investigation, and a helpful post from Dave Hunter, I concluded that this was due to the permissions on the backup share on my Home Server. As that share is a WHS-managed share it has specific security permissions that prevented the backup tool from asserting its authority and granting the relevant privileges. To circumvent the problem I created a new, standard, non-WHS share on my Home Server with minimal restrictions. Once I entered the new share’s details the backup tool verified the backup plan successfully and ran a backup fine. I then knocked up a simple RoboCopy script to copy the contents of the new backup share to my originally intended WHS share on a daily basis via a scheduled task.
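For reference, the daily copy job really can be that simple; a one-line Robocopy mirror scheduled overnight does it (the share names below are illustrative rather than my actual paths):

Robocopy \\servername\tfsbackupstaging \\servername\backup\tfsbackups /MIR /NP /LOG:C:\Logs\TfsBackupCopy.log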

In summary I believe that this is a major step forward for TFS and will benefit many of the new users picked up since the introduction of the TFS Basic Configuration in TFS 2010, proving to be another nail in the coffin for Visual SourceSafe. I’d recommend any team still using SourceSafe or any other tool to take another look at TFS as it is definitely getting easier to manage than previously.

Backing Up TFS 2010 Using PowerShell: Part 2

In my previous post (“Backing Up TFS 2010 Using PowerShell: Part 1”) I covered how to back up the TFS SQL Server databases using Windows PowerShell and the SQL Server Management Objects (SMO) API. In this post I’m going to take it one stage further and add another layer of backup for the paranoid amongst you, like me.

Backing up the databases is all well and good, but I like to be able to see what I’m backing up too, and there’s nothing quite like being able to see all your data on disk in its native format. So, to give me this extra warm fuzzy feeling, I have written a PowerShell script to perform a ‘Get Latest’ of all the source code in my Team Project Collection in TFS and then copy those latest file versions to a backup location. By scheduling this activity to occur periodically through the week I see two key benefits. Firstly, I have the data in a second place and in its native format (i.e. just plain files of source code, documents, images etc.), which I can then back up, knowing that should I not be able to restore the SQL databases I will at least have the latest source files. The second benefit is that I can quickly access my source code (on the local machine or remotely) from another machine that may not have Visual Studio installed. This is useful if I just want to check something and don’t need to modify the files.

I wanted this script to run on the server (in my case a Windows Home Server), and so to be able to perform a ‘Get Latest’ against TFS I needed to install Team Explorer and set up a workspace. Team Explorer is the standalone client required to interact with TFS source control without Visual Studio. The Team Explorer setup package is included on the TFS installation media in the TeamExplorer folder; browse to this, run Setup.exe and keep the default values. This installs the Visual Studio 2010 shell, the TFS object model, Visual Studio Team Explorer 2010 and MS Help Viewer.

Next you need to set up a workspace for the server using Team Explorer (via Start > All Programs > Microsoft Visual Studio 2010 > Microsoft Visual Studio 2010). Add the TFS server in the normal way and go to Source Control. Select the top item in Source Control (in most cases ‘SERVERNAME\DefaultCollection’), right-click and choose ‘Map to Local Folder’. Enter the path of a local folder on the server and, when it asks if you want to get the latest version, say yes. You now have a workspace set up and will be able to perform a manual or automated ‘Get Latest’.

To automate the interaction with TFS and perform the ‘get’ you could use the TF.exe command-line tool that installs with Team Explorer. Personally, as a PowerShell fan, I chose to use the PowerShell cmdlets that come with the Team Foundation Server 2010 Power Tools. To use these you’ll need to download the Power Tools and run the tfpt.msi installation package. Choose Custom and select just the server-specific items, in this case the PowerShell cmdlets and the command-line interface options (see screenshot).
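For completeness, if you did want to go the TF.exe route instead, a get of the whole mapped workspace looks roughly like this when run from inside the workspace folder (the install path shown is the default for Team Explorer 2010 and may differ on your machine):

& "C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\tf.exe" get /recursive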


The basic flow of the script is: configure the file paths required, import the Microsoft.TeamFoundation.PowerShell snap-in, and then call the ‘Update-TfsWorkspace’ cmdlet, which performs a ‘Get Latest’ on the workspace previously configured. The workspace folder you specify must match the local folder you specified earlier in Team Explorer. The next step is to use Robocopy to copy the files to the backup location. Robocopy is a superb file copy tool as it only copies changed files, which significantly improves the speed of this procedure compared to using a standard file copy cmdlet. Once complete, the log is passed to RoboCopyLogger (more about this in a future blog post), which scans the log for errors, and then we write an Event Log entry for success or failure.

Finally, set up a Scheduled Task to run this script automatically each week. If you were to use this script as-is you would need to change the file paths and use an Event Log source relevant to your system (or just remove the Write-EventLog calls).

write-output "------------------------------------Script Start------------------------------------"
write-output " TFS Data Get-Latest Backup "
write-output "------------------------------------------------------------------------------------" 

# set file paths
$timestamp = Get-Date -format yyyyMMddHHmmss

# set error action preference so errors stop and the trycatch kicks in
$erroractionpreference = "Continue"

    # add the TFS snapin to the session
    add-pssnapin Microsoft.TeamFoundation.Powershell

    # get latest on the top level workspace level
    Update-TfsWorkspace $LocalWorkspaceFolder -Force -Recurse

    # now robocopy the local cache to the final backup folder
    invoke-expression "$Robocopy $LocalWorkspaceFolder
                                        $RemoteBackupFolder /MIR /LOG:$Log /NP"

    # Copy the log files too
    invoke-expression "$Robocopy $Log $RemoteBackupFolder /MIR"

    # Check the log for errors etc and log to eventlog/twitter
    invoke-expression "$RobocopyLogger $Log 1"

    # write all events to the logs
    write-output "Writting SUCCESS to EventLog"
    write-eventlog -LogName "HomeNetwork" -Source "TFS Backups"
                              -EventId 1 -Message "TFS Data BackUp Script ran"
    # error occurred so lets report it
    write-output "ERROR OCCURRED" $error 

    # write an event to the event log
    write-output "Writting FAIL to EventLog"
    write-eventlog -LogName "HomeNetwork" -Source "TFS Backups"
                               -EventId 1 -Message "TFS Data BackUp Script failed" -EntryType Error

write-output "------------------------------------Script end------------------------------------" 

Backing Up TFS 2010 Using PowerShell: Part 1

In a previous post I covered how to install Team Foundation Server 2010 onto a Windows Home Server. The installation was a TFS Basic configuration installation and, whilst it was geared towards Windows Home Server, the concepts are the same if you are installing it on other servers or workstations. This post covers how to back up the TFS databases. For backing up just the raw source files too, check out Part 2 of this post.

Now, I am a healthily paranoid kind of guy and after installing TFS the first thing I decided to get right was a method to backup my new TFS installation to protect against data loss. I’m not going to sleep easy until I know that my source code is backed up in a solid repeatable manner. The backup tool of choice is Windows PowerShell due to the sheer power that this scripting shell provides.

Firstly, it’s important to understand that Microsoft Team Foundation Server relies completely on Microsoft SQL Server for its data persistence. Therefore backing up TFS is just a matter of backing up the SQL Server databases in the TFS data tier. Usually, unless you are installing an enterprise TFS solution, the databases will reside on the same server as the rest of the TFS installation. The number of databases created by TFS will vary depending on the number of ‘Project Collections’ you create. Therefore, to avoid having to update your backup scripts each time you add or remove a collection, and assuming your SQL Server instance is only used for TFS, it’s safer to just back up ALL the databases.

I strongly recommend that you read the TFS documentation on how to backup TFS and only use the information in this post as supplementary information, as backing up your data is a serious business and I’d hate for something to go wrong.

I used this excellent post from Donabel Santos as the inspiration for my PowerShell script and modified it to customise it for TFS and to provide additional functionality. The SQL Server interaction is through the SQL Server Management Objects (SMO) API, which provides a rich collection of objects through which you can interact with your databases. PowerShell makes interacting with these objects easy.

The basic flow of the script is to connect to the SQL Server instance using the SMO objects and then loop through the collection of databases within that instance. We ignore the ‘tempdb’ database, as backing it up is neither required nor possible. We then back up each database to file, again using SMO. Once all databases have been backed up to files we zip them up (using the excellent 7-Zip) and copy the zip file to a backup location. You don’t need to install the full 7-Zip package on your server, as you can download a command-line friendly version that just needs to be copied across. The end of the script then records the success or failure of the run in the Event Log.

Obviously if you were to use this script you would need to change the file paths and SQL Server instance name, and use an Event Log source relevant to your system (or just remove the Write-EventLog calls).

write-output "-------------------------------Script Start----------------------------------"
write-output " TFS SQL Database Backup "
write-output "------------------------------------------------------------------------------------"

# load modules used in this script
import-module -name C:\scripts\support\SupportModule -verbose

# load assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null

# create a new server object, and set backup path and timestamp info (they will share same timestamp)
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "winhomesvr\SQLEXPRESS"
$timestamp = Get-Date -format yyyyMMddHHmmss
$SQLDataFolder = "C:\Program Files\Microsoft SQL Server\MSSQL10.SQLEXPRESS\MSSQL\DATA"
$backupDirectory = "c:\tempcode\tfsbackups\sql\bkupdir"
$backupZipStore = "\\winhomesvr\backup2\tfsbackups\sql\Zipped\"
$backupRawDataZipStore = "\\winhomesvr\backup2\tfsbackups\sql\ZippedRAW\"
$7ZipExePath = "c:\Scripts\Support\7-Zip\7za"
$7ZipCmdLineForBkUps = $7ZipExePath + " a " + $backupZipStore + "TFSSQLBkup_" + $timestamp
                                 + ".zip " + $backupDirectory
$7ZipCmdLineForRawDataFileBkUps = $7ZipExePath + " a " + $backupRawDataZipStore
                                 + "TFSSQLBkupRAW_" + $timestamp + ".zip " + $backupDirectory

# display settings
write-output "Backup Directory: " $backupDirectory
write-output "Backup Zip Store: " $backupZipStore
write-output "Timestamp: " $timestamp

# set error action preference so errors stop and the trycatch kicks in
$erroractionpreference = "Continue"

    write-output "Deleting old backup files"
    remove-item -Path ($backupDirectory + "\*.*") -force

    # loop all databases in server, and backup each one using SQL Backup
    foreach($db in $server.Databases)
        # set database name
        $dbName = $db.Name

        # exclude the tempdb as you can't back that one up
        if ($dbName -ne "tempdb")
            write-output "Processing database: " $dbName
            $smoBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
            $smoBackup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
            $smoBackup.BackupSetDescription = "Full backup of " + $dbName
            $smoBackup.BackupSetName = $dbName + " Backup"
            $smoBackup.Database = $dbName
            $smoBackup.MediaDescription = "Disk"
                          $backupDirectory + "\" + $dbName + "_" + $timestamp + ".bak", "File")
            $smoBackup.Initialize = $TRUE
            write-output "Processed database: " $dbName

    write-output "Processed all databases, listing outputs..."

    #let's confirm, let's list all backup files
    $directory = Get-ChildItem $backupDirectory

    #list only files that end in .bak
    $backupFilesList = $directory | where {$_.extension -eq ".bak"}
    $backupFilesList | Format-Table Name, LastWriteTime

    write-output "Clear out old zipped file from the zip storage folder..."
    Remove-MostFiles $backupZipStore *.zip 2

    write-output "Zipping up the files..."
    invoke-expression $7ZipCmdLineForBkUps

    # write all events to the logs
    write-output "Writting SUCCESS to EventLog"
    write-eventlog -LogName "HomeNetwork" -Source "TFS Backups"
                              -EventId 1 -Message "TFS SQL BackUp Script ran"
    # error occurred so lets report it
    write-output "ERROR OCCURRED" $error

    # write an event to the event log
    write-output "Writting FAIL to EventLog"
    write-eventlog -LogName "HomeNetwork" -Source "TFS Backups"
                              -EventId 1 -Message "TFS SQL BackUp Script failed" -EntryType Error

write-output "------------------------------------Script end------------------------------------"

Update: Since writing this post Microsoft have updated the TFS Power Tools to include a Backup Plan feature which enables you to schedule full backups from the TFS Admin Console. Check out this post.

Installing Team Foundation Server 2010 on Windows Home Server

Last October I posted about the fact that Microsoft’s Team Foundation Server 2010 was going to ship with a “Basic” configuration that is more lightweight and seen as more of a Visual SourceSafe replacement. In that post I also pointed out that it seemed possible to install this version on a Windows Home Server (WHS) and that it was something I’d try. Well, eventually I have got round to doing it and of course I’ve documented my approach. The benefits of TFS are immense but out of the scope of this article; instead I’m covering the installation process.

It may have been technically possible to jump through some hoops and install TFS 2008 on your WHS box, but the prerequisites were heavy (including SharePoint and full SQL Server) and it was always seen as a complicated process. With the ‘Basic’ configuration of TFS 2010 you can ignore SharePoint and SQL Reporting Services, and it installs onto SQL Server 2008 Express (which it also installs for you).

A point worth mentioning is that TFS 2010 will not trash your existing web sites (important for Windows Home Servers) but will instead install its own site alongside any existing sites on the server. It is then possible to connect to TFS remotely via your WHS Remote Access domain name, which is a very nifty feature.

Pre-Installation Considerations:

Basic Configuration: Whilst it’s possible to install the non-Basic configurations, I don’t see the point unless there is something specific you want to take advantage of that’s not included in the Basic configuration. For a light, easy installation and minimal processing overhead on your WHS, the ‘Basic’ configuration is ideal. It also contains all the core features such as build automation, work item tracking and source control.

Location of SQL Server Data Files: By default the SQL Server data files are installed to the system drive under the Program Files location for SQL Server Express. As you add more and more content to TFS these files will naturally grow, so you need to consider up front whether you have the disk space to support this. Windows Home Server by default creates only a 20GB system partition, which should have adequate space for the data files to grow, but if you have installed a lot of applications to your system drive this could be an issue. To store the data files at an alternative location it is easier to install SQL Express 2008 manually prior to installing TFS 2010 (and configure the data file locations via the SQL setup); the TFS installation will then just reuse the SQL Express instance already installed. Alternatively you could install the data files on the WHS ‘Data’ drive (D drive), but personally I prefer to leave that drive well alone and let WHS’s Drive Extender manage all the data in its storage pool. I did consider adding a new drive to my server, not adding it to the storage pool, and using it for the data files. In the end I decided I had adequate space on the system drive and could move the data files at a future point in time if required.
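If you want to check where the database data files have actually ended up once TFS and SQL Express are installed, a quick bit of PowerShell against the SMO objects will list them; the instance name below assumes the default SQLEXPRESS instance that a Basic install creates:

# List each database and the physical path of its data files (instance name is an example)
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "SERVERNAME\SQLEXPRESS"
foreach ($db in $server.Databases) {
    foreach ($fg in $db.FileGroups) {
        $fg.Files | ForEach-Object { write-output ($db.Name + ": " + $_.FileName) }
    }
}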

Build Controller: Part of the TFS installation/configuration is deciding whether to install a Build Controller on the server. If you plan on running automated TFS builds then you’ll need to install this. I would strongly recommend you use the Team Build features of TFS as they are one of its key features, but if you don’t want to automate builds, or want to minimise the services running on your server, then just skip that part.

Support: Whilst WHS is really Windows Server 2003 under the covers and TFS 2010 clearly supports Windows Server 2003, installing it on a Windows Home Server is not supported by Microsoft (or anyone else) and you do so at your own risk.


You’ll need to Remote Desktop onto your server to run the installation interactively. The below screenshots show the key stages of the installation process. Luckily it’s mostly a matter of clicking ‘Next’:

whs_tfs_01 whs_tfs_02

Choose to install Team Foundation Server and the Build Service (if you plan to run automated builds on this server too). Don’t install the Server Proxy:

whs_tfs_04  whs_tfs_06

After much processing and a reboot it’s installed:

whs_tfs_07 whs_tfs_08

Once installed, it then launches the Configuration Centre and it’s at this point that you can choose the ‘Basic’ configuration. It will then install SQL Server Express or point to an existing SQL instance:

whs_tfs_10 whs_tfs_11 whs_tfs_12    whs_tfs_16 whs_tfs_17

Next you’re asked to configure the Build Service details:

whs_tfs_18 whs_tfs_19 whs_tfs_20 whs_tfs_21 whs_tfs_22    whs_tfs_26 whs_tfs_27

That’s it. You can launch the ‘Team Foundation Server Administration Console’ from the Start Menu which is a neat new tool enabling you to manage TFS from the server itself:


Next Steps:

Connect via Visual Studio: Now it’s all up and running you can fire up Visual Studio and test that you can connect to the new Team Foundation Server by adding it to the list of servers in Team Explorer. You should be able to create new Team Projects, check in code and run builds just as you would on any other TFS server.

Enable Remote Access: To enable remote access to TFS so you can access your source code from remote locations you need to copy your client certificate from the Remote Access site and then add it to the TFS site. For instructions on how to do this see this excellent post by Jason Neave.

Backups: Call me paranoid, but before I add anything important to this TFS server I want an automated backup procedure in place. Backing up TFS is really ‘just’ a matter of backing up the SQL Server databases on which TFS sits, as they are the only source of its data. I say ‘just’ because it’s not as easy as it should be. A key point to note is that as TFS uses numerous databases it is critical to back up all of them at the same time, so you don’t end up restoring unsynchronised databases. I have created an automated backup procedure using Windows PowerShell that I will share in a future post.


It’s exciting to see a combination of two excellent products working together and I think this is another great way to add value to your Windows Home Server whilst enabling you to experiment with a quality ALM tool.

UPDATE: For WHS 2011 users check out my post on Installing TFS 2010 on Windows Home Server 2011.

Installing PowerShell on Windows Home Server

Whether you manage thousands of Windows boxes in an enterprise environment or you just want a more powerful shell environment with which to manage your backup scripts on your PC, Windows PowerShell is an excellent free tool at your disposal. PowerShell V2 comes pre-installed in Windows 7 but how do you get it up and running on your Windows Home Server?

You can install V1 of PowerShell via Windows Update as it is an optional update on Windows Server 2003 (SP2), which is what a Windows Home Server is underneath. The instructions to do this are detailed here. Basically you go to Automatic Updates in the Control Panel, view the list of ‘optional’ updates and it should be there for you to choose.

What about PowerShell v2? Well a recent post on the PowerShell Team’s blog suggests that V2 will also be available via an optional update to Windows 2003 Servers, replacing the V1 option. The same instructions as above should apply.

If the update isn’t available to you yet, or you want to install V2 the manual way, then you need to download the Windows Management Framework Core (KB968930) package, which is detailed here. The Windows Server 2003 download relevant for Home Servers is here. It’s a very simple install and once complete you can access PowerShell V2 from the Start > Programs > Accessories > Windows PowerShell menu item.
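Once it’s installed, a quick way to confirm you really are running V2 is to check the $PSVersionTable automatic variable from the PowerShell prompt (it doesn’t exist in V1, so an error here tells you V1 is still in play):

# Displays the PowerShell engine version details (V2 and later only)
$PSVersionTable.PSVersion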


For more information on Windows PowerShell check out “Getting Started with PowerShell” on MSDN, or launch the PowerShell ISE environment from the PowerShell Start Menu folder (detailed above) and press F1. The bundled help files are surprisingly good and will soon get you up to speed. I intend to post more on PowerShell as I migrate all my server backup scripts over from standard batch files to PowerShell.

‘Windows Home Server’ Build & Setup

I recently setup a new Windows Home Server and this post covers why I chose this operating system and how I setup my server.


My requirement was for an extendable, ‘always on’, network-attached file storage solution that would allow me to access my files from any machine in the house (and ideally remotely via the Internet when required) whilst providing some fault-tolerant data protection. Having all my data in one place makes it easier to manage (less duplication of files across machines) and easier to back up. This centralisation of data, however, means being more susceptible to hardware failure (e.g. hard disk failure), so a solution with either a RAID configuration or something similar was required, which ruled out most budget NAS storage devices. After investigating the options I decided to build a Windows Home Server (WHS). This meets all the requirements above and also adds other neat features such as the extensible Add-In model (a huge bonus for a .NET developer like me).

Buy vs Build:

Having decided on Windows Home Server as the solution the next step was to decide whether to buy or build. There are several very smart WHS devices available from manufacturers like HP and Acer. Whilst these are the easy option they are not the cheapest or the easiest to extend. Also the availability of these devices varies depending on your geographical location. Based on these factors I decided to build.

Build Option:

The fact that WHS has such modest hardware requirements means that building a server is a very economical option. As my server will be ‘always on’, I made power efficiency a key requirement of my build. To this end I considered the Intel Atom processor found in most netbooks. These consume little power and pack enough punch to run WHS comfortably. The Atom CPU comes pre-attached to an Intel motherboard (you can’t buy them separately yet) for under £50. However, as I wanted the storage in my server to be extendable and to grow over the next few years, I needed space for at least four hard drives, and the majority of Intel Atom boards only come with two SATA ports. Some boards do exist with four SATA ports but they are hard to source. Another possible Atom drawback is that it may be difficult to source Windows 2003 drivers (required for WHS) for netbook-targeted Atom motherboards.

Buy Option:

Eventually, after some investigation, I had a list of parts to build into my shiny new server, but also a few reservations. Firstly, would all these components play nicely together, and would the build be solid enough to meet my ‘always on’ requirement? After discussions with a colleague he suggested I look for pre-built end-of-line servers, which is what I did. I quickly found the HP ProLiant ML110 G5 going for £170, a bargain. With 1GB RAM, a dual-core Pentium 1.8GHz CPU, on-board video, a Gigabit NIC (Network Interface Card), a 160GB HDD, a DVD-ROM drive and a multitude of SATA ports, it was ideal.

Sure, it lacked the power-saving benefits of an Atom-based server, but its solid enterprise-level build quality more than makes up for that. As the server is designed to run Windows 2003, drivers would also not be a problem.



For storage I purchased two Western Digital Caviar Green 750GB SATA drives to sit alongside the HP’s 160GB disk. By buying two I can make full use of WHS’s data duplication features to protect my data. Whilst the ‘Green’ branded disks are not as fast as traditional drives, they are packed with energy-saving features, which I value in an ‘always on’ server.



After much deliberation on whether to use the faster 160GB drive or the larger 750GB drive as the system drive, I decided to install a 750GB drive as the system drive, mainly to ensure maximum room for expansion. Whichever drive I installed as the system drive I would be stuck with (without reinstalling the operating system), and I didn’t want to be limited to the smaller 160GB drive. To make installation of the OS easier I only connected the first hard drive, and then connected the other two later once the OS was up and running.


Software Installation:

Once the hardware was sorted I put in the WHS DVD and followed the instructions. The installation went quicker than expected, surprisingly not spending long on performing the ‘Microsoft Updates’. Once installed, I logged on to find that WHS didn’t have the right NIC (Network Interface Card) drivers and therefore the NIC hadn’t been installed. This of course explains why I didn’t have to wait for the install to download the updates, as it couldn’t get onto the web to find them. I installed the NIC drivers from the HP CD and rebooted to find that I could now access the internet via Internet Explorer, but neither ‘Windows Update’ nor ‘Product Activation’ would connect. After further investigation (and much head scratching) I found this error in the Windows Event Log:

Type: Error.  Source: W32Time.
Description: Time Provider NtpClient: An error occurred during DNS lookup of the manually configured peer ‘,0x1’. NtpClient will try the DNS lookup again in 15 minutes. The error was: A socket operation was attempted to an unreachable host. (0x80072751)

Checking the system time revealed I was two years in the past (2007) for some unknown reason. After correcting the date I could connect to Windows Update fine. After a mammoth 70 updates and a reboot I was presented with a strict ‘Activate Now’ prompt on logon. I presume that since my WHS install believed it had been running for two years without activation it thought it was time to get serious. After activating, I ran Windows Update again and this time it installed 5 more updates. Once the OS was stable I connected up the extra hard drives and added them to the storage pool via the WHS Console’s Server Storage tab.


In order to connect to your client PCs, the server and clients need to be in the same Workgroup, so aligning that was the next task, along with checking for useful machine names and descriptions. Once all the clients were ready I installed the Client Connector software on each one (all Windows 7 clients) and configured their backup schedules. All clients connected and performed a successful backup first time.

Before copying all my data across to the Windows Home Server shared folders I made sure that ‘Folder Duplication’ was turned off. This was purely to maximise the transfer speed (as WHS didn’t have to perform any duplication during the copy process), but I made sure I turned ‘Folder Duplication’ back on for all folders after the data was in place.

Setting up a Print Server:

Next I wanted to set up my server as a print server, ensuring that I could print from any machine without having to turn on the desktop PC hosting the printer first. The printer is a basic Lexmark Z615 and there are some unsupported Windows 2003 drivers on the Lexmark site. After trial and error with these, though, I abandoned them and reverted to the XP drivers, which worked OK. I did have to reboot several times to completely remove the failed printer installed with the Windows 2003 drivers.

An annoying feature of Windows is that it searches the local network for other printers and adds them to the server. I don’t want ‘Print to OneNote’ and ‘XPS Document Printer’ printers on my server, but deleting them is pointless as they just reappear. To prevent Windows from performing this automatic search you need to turn it off by deselecting the option at: My Computer > Tools > Folder Options > View > “Automatically Search for Network Folders & Printers”.

With my print server set up, I attempted to add the printer to my Windows 7 client, but this proved difficult too. I couldn’t find an option to specify the correct drivers to use for the printer, and the Vista drivers (needed for Windows 7) weren’t installed on my server. In the end I found a blog post explaining how to use the Print Management tool (new to Vista SP1) to add additional drivers to your print server. This worked perfectly, and on the next attempt the client downloaded the Vista drivers correctly and installed the printer successfully.

WHS Add Ins:

I intend to install and (time permitting) write plenty of Add-Ins for use with WHS, as I think they are an excellent way to add functionality to your server. So far I have installed the Microsoft WHS Toolkit v1.1 and Andreas M’s Advanced Admin Console. I find the Advanced Admin Console useful for accessing admin tools via the Console without having to Remote Desktop into the server each time. Over the next few weeks I hope to review the power management Add-Ins and install one to help my server get a few hours’ sleep overnight when it’s not required, thus saving power and money.


So that’s my build story. My home server is up and running and I’m so far very impressed with it. I aim to post some more articles about Windows Home Server over the coming months.