Backing Up TFS 2010 Using PowerShell: Part 1

In a previous post I covered how to install Team Foundation Server 2010 onto a Windows Home Server. The installation used the TFS Basic configuration and, whilst it was geared towards Windows Home Server, the concepts are the same if you are installing it on other servers or workstations. This post covers how to back up the TFS databases. For backing up just the raw source files too, check out Part 2 of this post.

Now, I am a healthily paranoid kind of guy, and after installing TFS the first thing I decided to get right was a method to back up my new TFS installation to protect against data loss. I’m not going to sleep easy until I know that my source code is backed up in a solid, repeatable manner. The backup tool of choice is Windows PowerShell due to the sheer power that this scripting shell provides.

Firstly, it’s important to understand that Microsoft Team Foundation Server relies completely on Microsoft SQL Server for its data persistence. Therefore backing up TFS is just a matter of backing up the SQL Server databases in the TFS Data Tier. Usually, unless you are installing an enterprise TFS solution, the databases will reside on the same server as the rest of the TFS installation. The number of databases created by TFS will vary depending on the number of ‘Project Collections’ you create. Therefore, to avoid having to update your backup scripts each time you add or remove a collection, and assuming your SQL Server instance is only used for TFS, it’s safer to just back up ALL the databases.

I strongly recommend that you read the TFS documentation on how to backup TFS and only use the information in this post as supplementary information, as backing up your data is a serious business and I’d hate for something to go wrong.

I used this excellent post from Donabel Santos as the inspiration for my PowerShell script and modified it to customise for TFS and to provide additional functionality. The SQL Server interaction is through the use of the SQL Server Management Objects (SMO) API which provides a rich collection of objects through which you can interact with your databases. PowerShell makes interacting with these objects easy.
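As a quick taster (and a handy sanity check of exactly which databases the backup will pick up), listing every database on the instance takes only a few lines. This is a minimal sketch, assuming the local ‘SQLEXPRESS’ instance that a TFS Basic installation creates (adjust the server\instance name to suit):

# load SMO and connect to the TFS SQL Server instance
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") ".\SQLEXPRESS"

# name, size (MB) and last backup date of every database on the instance
$server.Databases | Select-Object Name, Size, LastBackupDate | Format-Table -AutoSize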

The basic flow of the script is to connect to the SQL Server instance using the SMO objects and then loop through the collection of databases within that instance. We ignore the ‘tempdb’ database as backing this up is neither required nor possible. We then back up each database to file, again using SMO. Once all databases have been backed up to files we zip them up (using the excellent 7-Zip) and copy the zip file to a backup location. You don’t need to install the full 7-Zip package on your server as you can download a command-line friendly version that just needs to be copied across to your server. The end of the script then records the success or failure of the run to the Event Log.

Obviously if you were to use this script you would need to change the file paths, the SQL Server instance name, and use an Event Log source that is relevant to your system (or just remove the Write-Eventlog call).
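The script also imports a custom ‘SupportModule’, which provides the Remove-MostFiles function used below to prune old zip files from the backup store. That module isn’t shown here, so here is a minimal sketch of what such a helper might look like; the parameter names and exact behaviour (keep the newest N matching files, delete the rest) are my assumptions based on how the script calls it:

function Remove-MostFiles
{
    # delete all files matching $filter in $path, keeping only the newest $keep
    param([string]$path, [string]$filter, [int]$keep)

    $files = @(Get-ChildItem -Path $path -Filter $filter | Sort-Object LastWriteTime -Descending)
    if ($files.Count -gt $keep)
    {
        $files[$keep..($files.Count - 1)] | Remove-Item -Force
    }
}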

clear-host
write-output "-------------------------------Script Start----------------------------------"
write-output " TFS SQL Database Backup "
write-output "------------------------------------------------------------------------------------"

# load modules used in this script
import-module -name C:\scripts\support\SupportModule -verbose

# load assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null

# create a new server object, and set backup path and timestamp info (they will share same timestamp)
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "winhomesvr\SQLEXPRESS"
$timestamp = Get-Date -format yyyyMMddHHmmss
$SQLDataFolder = "C:\Program Files\Microsoft SQL Server\MSSQL10.SQLEXPRESS\MSSQL\DATA"
$backupDirectory = "c:\tempcode\tfsbackups\sql\bkupdir"
$backupZipStore = "\\winhomesvr\backup2\tfsbackups\sql\Zipped\"
$backupRawDataZipStore = "\\winhomesvr\backup2\tfsbackups\sql\ZippedRAW\"
$7ZipExePath = "c:\Scripts\Support\7-Zip\7za"
$7ZipCmdLineForBkUps = $7ZipExePath + " a " + $backupZipStore + "TFSSQLBkup_" + $timestamp + ".zip " + $backupDirectory
$7ZipCmdLineForRawDataFileBkUps = $7ZipExePath + " a " + $backupRawDataZipStore + "TFSSQLBkupRAW_" + $timestamp + ".zip " + $backupDirectory

# display settings
write-output "Backup Directory: " $backupDirectory
write-output "Backup Zip Store: " $backupZipStore
write-output "Timestamp: " $timestamp

# set error action preference so errors stop and the try/catch kicks in
$erroractionpreference = "Stop"

try
{
    write-output "Deleting old backup files"
    remove-item -Path ($backupDirectory + "\*.*") -force

    # loop all databases in server, and backup each one using SQL Backup
    foreach($db in $server.Databases)
    {
        # set database name
        $dbName = $db.Name

        # exclude the tempdb as you can't back that one up
        if ($dbName -ne "tempdb")
        {
            write-output "Processing database: " $dbName
            $smoBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
            $smoBackup.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database
            $smoBackup.BackupSetDescription = "Full backup of " + $dbName
            $smoBackup.BackupSetName = $dbName + " Backup"
            $smoBackup.Database = $dbName
            $smoBackup.MediaDescription = "Disk"
            $smoBackup.Devices.AddDevice(
                          $backupDirectory + "\" + $dbName + "_" + $timestamp + ".bak", "File")
            $smoBackup.Initialize = $TRUE
            $smoBackup.SqlBackup($server)
            write-output "Processed database: " $dbName
        }
    }

    write-output "Processed all databases, listing outputs..."

    #let's confirm, let's list all backup files
    $directory = Get-ChildItem $backupDirectory

    #list only files that end in .bak
    $backupFilesList = $directory | where {$_.extension -eq ".bak"}
    $backupFilesList | Format-Table Name, LastWriteTime

    write-output "Clear out old zipped file from the zip storage folder..."
    Remove-MostFiles $backupZipStore *.zip 2

    write-output "Zipping up the files..."
    invoke-expression $7ZipCmdLineForBkUps

    # write success to the event log
    write-output "Writing SUCCESS to EventLog"
    write-eventlog -LogName "HomeNetwork" -Source "TFS Backups" -EventId 1 -Message "TFS SQL BackUp Script ran"
}
catch
{
    # error occurred so let's report it
    write-output "ERROR OCCURRED" $error

    # write an event to the event log
    write-output "Writting FAIL to EventLog"
    write-eventlog -LogName "HomeNetwork" -Source "TFS Backups"
                              -EventId 1 -Message "TFS SQL BackUp Script failed" -EntryType Error
}

write-output "------------------------------------Script end------------------------------------"

Update: Since writing this post Microsoft have updated the TFS Power Tools to include a Backup Plan which enables you to schedule full backups from the TFS Admin Console. Check out this post.

Installing Team Foundation Server 2010 on Windows Home Server


Last October I posted about the fact that Microsoft’s Team Foundation Server 2010 was going to ship with a ‘Basic’ configuration that is more lightweight and seen as more of a Visual SourceSafe replacement. In that post I also pointed out that it seemed possible to install this version on a Windows Home Server (WHS) and that it would be something I’d try. Well, eventually I have got round to doing it and of course I’ve documented my approach. The benefits of TFS are immense but out of the scope of this article; instead I’m covering the installation process.

It may have been technically possible to jump through some hoops and install TFS 2008 on your WHS box but the pre-requisites were heavy (including SharePoint and full SQL Server) and it was always seen as a complicated process. With the ‘basic’ configuration of TFS 2010 you can ignore SharePoint and SQL Reporting Services and it installs onto SQL Server 2008 Express (which it also installs for you).

A point worth mentioning is that TFS 2010 will not trash your existing web sites (important for Windows Home Servers) but will instead install its own site alongside any existing sites on the server. It is possible to then connect to TFS remotely via your WHS Remote Access domain name, which is a very nifty feature.

Pre-Installation Considerations:

Basic Configuration: Whilst it’s possible to install the non-basic configurations I don’t see the point unless there is something specific that you want to take advantage of that’s not included in the basic configuration. For a light, easy installation and a nice minimal processing overhead on your WHS the ‘basic’ configuration is ideal. It also contains all the core features such as build automation, work item tracking and source control.

Location of SQL Server Data Files: By default the SQL Server data files are installed to the system drive under the program files location for SQL Server Express. As you add more and more content to TFS these files will naturally grow, so you need to consider up front whether you have the disk space to support this. Windows Home Server by default installs only a 20GB system partition which should have adequate space for the data files to grow, but if you have installed a lot of applications to your system drive this could be an issue. To store the data files at an alternative location it is easier to install SQL Express 2008 manually prior to installing TFS 2010 (and configure the data file locations via the SQL setup); the TFS installation will then just reuse the SQL Express instance already installed. Alternatively you could install the data files on the WHS’s ‘Data’ drive (D drive) but personally I prefer to leave that drive well alone and let WHS’s ‘Drive Extender’ manage all the data in its Storage Pool. I did consider installing a new drive to my server, not added to the storage pool, which would then be used for my data files. In the end I decided I had adequate space on the system drive and I could move the data files at a future point in time if required.
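If you’re not sure where the data files currently live, or how large they have grown, SMO can tell you. A rough sketch, again assuming the default ‘SQLEXPRESS’ instance:

# list the physical path and size (KB) of every data file on the instance
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") ".\SQLEXPRESS"
foreach ($db in $server.Databases)
{
    foreach ($fileGroup in $db.FileGroups)
    {
        foreach ($file in $fileGroup.Files)
        {
            write-output ("{0}: {1} ({2} KB)" -f $db.Name, $file.FileName, $file.Size)
        }
    }
}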

Build Controller: Part of the TFS installation/configuration includes the decision to install a Build Controller on the server. If you plan on running automated TFS builds then you’ll need to install this. I would strongly recommend you use the Team Build features of TFS as they are one of its key features but if you don’t want to automate builds or want to minimise the services running on your server then just skip that part.

Support: Whilst WHS is really Windows Server 2003 under the covers, and TFS 2010 clearly supports Windows Server 2003, installing it on a Windows Home Server is not supported by Microsoft (or anyone else) and you do so at your own risk.

Steps:

You’ll need to Remote Desktop onto your server to run the installation interactively. The below screenshots show the key stages of the installation process. Luckily it’s mostly a matter of clicking ‘Next’:

whs_tfs_01 whs_tfs_02

Choose to install Team Foundation Server and the Build Service (if you plan to run automated builds on this server too). Don’t install the Server Proxy:

whs_tfs_04  whs_tfs_06

After much processing and a reboot it’s installed:

whs_tfs_07 whs_tfs_08

Once installed, it then launches the Configuration Centre and it’s at this point that you can choose the ‘Basic’ configuration. It will then install SQL Server Express or point to an existing SQL instance:

whs_tfs_10 whs_tfs_11 whs_tfs_12    whs_tfs_16 whs_tfs_17

Next you’re asked to configure the Build Service details:

whs_tfs_18 whs_tfs_19 whs_tfs_20 whs_tfs_21 whs_tfs_22    whs_tfs_26 whs_tfs_27

That’s it. You can launch the ‘Team Foundation Server Administration Console’ from the Start Menu which is a neat new tool enabling you to manage TFS from the server itself:

whs_tfs_29

Next Steps:

Connect via Visual Studio: Now it’s all up and running, you can fire up Visual Studio and test that you can connect to the new Team Foundation Server by adding it to the list of servers in Team Explorer. You should be able to create new Team Projects, check in code and run builds just as you would on any other TFS server.
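If you prefer the command line, a quick connectivity check from a Visual Studio 2010 command prompt might look like the following, which lists any workspaces registered against the new server. This assumes the default ‘Basic’ configuration port and collection name; adjust the server name to suit your own:

tf workspaces /collection:http://winhomesvr:8080/tfs/DefaultCollection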

Enable Remote Access: To enable remote access to TFS so you can access your source code from remote locations you need to copy your client certificate from the Remote Access site and then add it to the TFS site. For instructions on how to do this see this excellent post by Jason Neave.

Backups: Call me paranoid but before I add anything important to this TFS server I want an automated backup procedure in place. Backing up TFS is really ‘just’ a matter of backing up the SQL Server databases on which TFS sits as that is the only source of its data. I say ‘just’ because it’s not as easy as it should be. A key point to note is that as TFS uses numerous databases it is critical to backup all of them at the same time so you don’t end up restoring unsynchronised databases. I have created an automated backup procedure using Windows PowerShell that I will share in a future post.

Conclusion:

It’s exciting to see a combination of two excellent products working together and I think this is another great way to add value to your Windows Home Server whilst enabling you to experiment with a quality ALM tool.

UPDATE: For WHS 2011 users check out my post on Installing TFS 2010 on Windows Home Server 2011.

Installing PowerShell on Windows Home Server

Whether you manage thousands of Windows boxes in an enterprise environment or you just want a more powerful shell environment with which to manage your backup scripts on your PC, Windows PowerShell is an excellent free tool at your disposal. PowerShell V2 comes pre-installed in Windows 7 but how do you get it up and running on your Windows Home Server?

You can install V1 of PowerShell via Windows Update as it is an optional update on Windows 2003 Servers (SP2) which is what a Windows Home Server is underneath. The instructions to do this are detailed here. Basically you go to Automatic Updates on the Control Panel, view the list of ‘optional’ updates and it should be there for you to choose.

What about PowerShell v2? Well a recent post on the PowerShell Team’s blog suggests that V2 will also be available via an optional update to Windows 2003 Servers, replacing the V1 option. The same instructions as above should apply.

If the update isn’t available to you yet or you want to install V2 the manual way then you need to download the Windows Management Framework Core (KB968930) package which is detailed here. The Windows 2003 download relevant for Home Servers is here. It’s a very simple install and once complete you can access PowerShell V2 from the Start > Programs > Accessories > Windows PowerShell menu item.

POSH_WHS_01  POSH_WHS_03

For more information on Windows PowerShell check out “Getting Started with PowerShell” on MSDN or launch the PowerShell ISE environment from the PowerShell Start Menu folder (detailed above) and press F1. The bundled help files are surprisingly good and will soon get you up to speed. I intend to be posting more on PowerShell as I migrate all my server backup scripts over from standard batch file format to PowerShell.

Team Foundation Server ‘Basic’ Edition


Many development teams still regularly use Visual SourceSafe for their source control, which can stimulate heated debates between those that have used it for many years without problems and those that have suffered some pain with it. Regardless of this debate there is no denying that SourceSafe is coming to the end of its useful life. It’s old technology and will come out of support in 2011, although a compatibility update is expected with Visual Studio 2010.

When Microsoft developed its replacement, Team Foundation Server (TFS), it focused on providing not just a source control product but a whole development lifecycle management system. Regardless of the benefits of TFS (and there are many) it has been avoided by many small development teams due to its high costs and complex installation/management. Many have instead moved to alternative source control products such as the free Subversion, leading to a decline in Microsoft’s market share in this area.

So, what’s changed? Microsoft now plan to provide a ‘Basic’ version of TFS 2010 when it ships next year. I think that this is a huge step forward for TFS and its take-up across the development community. Brian Harry details the ‘Basic’ version in this blog post. This version of TFS will have a fast and easy installation and provides many more implementation options for the product. It will install on SQL Express 2008 and can even be installed on client Windows operating systems. This really is targeting the current SourceSafe users and provides a low cost (perhaps even free) entry to the benefits of TFS. You might think that this would only provide basic TFS functionality, but not so. Included in the basic version are Source Control, Bug Tracking and Build Automation, which provide the bulk of the key TFS features. The screenshots suggest that Web Access is also included. What’s not included are Reporting Services and SharePoint, which are arguably more geared towards larger development teams anyway. The key benefits of TFS come from the Work Item interaction and ‘Continuous Integration’ friendly automated build features, and these are included.

The move to TFS for a SourceSafe (or any other simple source control system) team will provide many benefits and this version should enable those benefits at a minimal cost. There are no details on pricing but personally I would expect it to be included in the Team Developer MSDN subscription.

SourceSafe is also used by hobbyist and professional developers to manage their own personal source code and I see this version of TFS being ideal for this. The ability to install on a client OS is a major factor to these users. There is also a comment on Brian’s blog post about running TFS basic on Windows Home Server which is something I am keen to try out.

Allowing more people to access this great product will greatly contribute to the TFS community and its take-up globally. If you can’t wait until TFS 2010 is released and would like to know more about TFS versus SourceSafe in terms of pricing then check out Eric Nelson’s post here.

‘Windows Home Server’ Build & Setup

I recently setup a new Windows Home Server and this post covers why I chose this operating system and how I setup my server.

Requirements:

My requirement was for an extendable ‘always on’ network attached file storage solution that would allow me to access my files from any machine in the house (and ideally remotely via the Internet when required) whilst providing some fault tolerance data protection. Having all my data in one place makes it easier to manage (less duplication of files across machines) and easier to back-up. This centralisation of data however means being more susceptible to hardware failure (e.g. hard disk failure) and so a solution with either a RAID configuration or something similar was required which ruled out most budget NAS Storage devices. After investigating the options I decided to build a Windows Home Server (WHS). This meets all the requirements above and also adds other neat features such as the extensible Add-In model (a huge bonus for a .Net developer like me).

Buy vs Build:

Having decided on Windows Home Server as the solution the next step was to decide whether to buy or build. There are several very smart WHS devices available from manufacturers like HP and Acer. Whilst these are the easy option they are not the cheapest or the easiest to extend. Also the availability of these devices varies depending on your geographical location. Based on these factors I decided to build.

Build Option:

The fact that WHS has such modest hardware requirements means that building a server is a very economical option. As my server will be ‘always on’ I put power efficiency as a key requirement in my build. To this end I considered the Intel Atom processor found in most ‘NetBooks’. These consume little power and pack enough punch to run WHS comfortably. The Atom CPU comes pre-attached to an Intel motherboard (you can’t buy them separately yet) for under £50. However as I wanted the storage in my server to be extendable and grow over the next few  years I needed the space for at least 4 hard drives but the majority of Intel Atom boards only come with 2 SATA ports. Some boards do exist with four SATA ports but they are hard to source. Another possible Atom drawback is that it may be difficult to source Windows 2003 drivers (required for WHS) for ‘NetBook’ targeted Atom motherboards.

Buy Option:

Eventually, after some investigations, I had a list of parts to build into my shiny new server, but also a few reservations: would all these components play nicely together, and would the build be solid enough to meet my ‘always on’ requirement? After discussions with a colleague he suggested I look for pre-built end of line servers, which is what I did. I quickly found the HP ProLiant ML110 G5 going for £170, a bargain. With 1GB RAM, a dual-core Pentium 1.8GHz CPU, on-board video, Gigabit NIC (Network Interface Card), 160GB HDD, DVD ROM and a multitude of SATA ports it was ideal.

Sure, it lacked the power-saving benefits of an Atom processor based server but its solid enterprise-level build quality more than makes up for it. As the server is designed to run Windows 2003, drivers would also not be a problem.

  BiosInfo

 

For storage I purchased two Western Digital Caviar Green 750GB SATA drives to sit alongside the HP’s 160GB disk. By buying two I can make full use of WHS’s data duplication features to protect my data. Whilst the ‘Green’ branded disks are not as fast as traditional drives they are packed with energy saving features which I value in an ‘always on’ server. 

internals

Hardware:

After much deliberation on whether to use the faster 160GB drive or the larger 750GB drive for the system drive I decided to install a 750GB drive as the system drive, mainly to ensure maximum extendibility. Whichever I installed as the system drive I would be stuck with (without reinstalling the Operating System) and I didn’t want to be limited to the smaller 160GB drive. To make installation of the OS easier I only connected up the first hard drive, and then connected the other two later once the OS was up and running.

 drives

Software Installation:

Once the hardware was sorted I put in the WHS DVD and followed the instructions. The installation went quicker than expected, surprisingly not spending long on performing the ‘Microsoft Updates’. Once installed I logged on to find that WHS didn’t have the right NIC (Network Interface Card) drivers and therefore the NIC hadn’t been installed. This of course explains why I didn’t have to wait for the install to download the updates as it couldn’t get on the web to find them. I installed the NIC drivers from the HP CD and rebooted to find that I could now access the internet via Internet Explorer but neither ‘Windows Update’ nor ‘Product Activation’ would connect. After further investigation (and much head scratching) I found this error in the Windows Event Log:

Type: Error.  Source: W32Time.
Description: Time Provider NtpClient: An error occurred during DNS lookup of the manually configured peer ‘time.windows.com,0x1’. NtpClient will try the DNS lookup again in 15 minutes. The error was: A socket operation was attempted to an unreachable host. (0x80072751)

Checking the system time revealed I was two years in the past (2007) for some unknown reason. After correcting the date I could connect to Windows Update fine. After a mammoth 70 updates and a reboot I was presented with a strict ‘Activate Now’ prompt on logon. I presume that since my WHS install believed it had been installed for two years without activation it thought it was time to get serious. After activating I ran Windows Update again and this time it installed 5 more updates. Once the OS was stable I connected up the extra hard drives and added them to the storage pool via the WHS Console Server Storage tab.

Clients:

In order to connect to your client PCs the server and clients need to be in the same Workgroup, so aligning this was the next task, along with checking for useful machine names/descriptions. Once all the clients were ready I installed the Client Connector software on each one (all Windows 7 clients) and configured their backup schedules. All clients connected and performed a successful backup first time.

Before copying across all my data onto the Windows Home Server Shared Folders I made sure that ‘Folder Duplication’ was turned off. This was purely to maximise the transfer speed (as WHS didn’t have to perform any duplication during the copy process) but I made sure I turned ‘Folder Duplication’ on for all folders after the data was in place.

Setting up a Print Server:

Next I wanted to set up my server as a print server, ensuring that I could print from any machine without having to turn on the desktop hosting the printer first. The printer is a basic Lexmark Z615 but there are some unsupported Windows 2003 drivers on the Lexmark site. After trial and error with these I abandoned them and reverted to the XP drivers, which worked OK. I did have to reboot several times though to completely remove the failed printer installed with the Windows 2003 drivers.

An annoying feature of Windows is that it searches the local network for other printers and adds them to the server. I don’t want ‘Print to OneNote’ and ‘XPS Document Printer’ printers on my server but deleting them is pointless as they will reappear. To prevent Windows from performing this auto search you need to turn it off by deselecting the option in: My Computer > Tools > Folder Options > View > ‘Automatically Search for Network Folders & Printers’.

With my print server set up I attempted to add the printer to my Windows 7 client, but this was to prove difficult too. I couldn’t find an option to specify the correct drivers to use for the printer, and the Vista drivers (needed for Windows 7) weren’t installed on my server. In the end I found this blog post which explains how to use the Print Manager tool (new to Vista SP1) to add additional drivers to your print server. This worked perfectly and on the next attempt it downloaded the Vista drivers correctly and installed the printer successfully.

WHS Add Ins:

I intend to install and (time permitting) write plenty of Add-Ins for use with WHS as I think they are an excellent way to add functionality to your server. So far I have installed the Microsoft WHS Toolkit v1.1 and Andreas M’s Advanced Admin Console. I find the Advanced Admin Console useful for accessing admin tools via the Console without having to Remote Desktop into the server each time. Over the next few weeks I hope to review the Power Management Add-Ins and install one to help my server get a few hours’ sleep overnight when it’s not required, thus saving power and money.

Summary:

So that’s my build story. My home server is up and running and I’m so far very impressed with it. I aim to post some more articles about Windows Home Server over the coming months.

Windows Home Server

I have recently set-up a home server using the Windows Home Server Operating System. The details of the set-up will follow in a future post but firstly I thought I would quickly introduce the Windows Home Server (WHS) product and provide some useful links.

Windows Home Server was released by Microsoft in 2007 and is built on top of Windows Server 2003. Its role is to sit quietly in your home and automatically back up all your PCs, provide NAS (Network Attached Storage) file sharing features, media streaming and remote access. It protects your data from hard drive failure by duplicating your data over multiple drives where you have a multi-drive system.

WHS can be bought pre-installed on custom devices from companies like HP and Acer, or you can install it yourself on your own kit. As the hardware requirements are so light it’s possible to get it running on an old PC you might have lying around. Alternatively, build or purchase a cheap low-end PC for the purpose.

Particularly of interest to developers is the WHS Add-In model. WHS can be extended through the use of ‘Add-Ins’ from various ISVs (Independent Software Vendors) and enthusiast developers. Microsoft provides a Windows Home Server Add-In SDK for .Net Developers wanting to write Add-Ins for WHS and Brendan Grant has Visual Studio project templates on his blog.

Here’s a selection of links for more information: