Linux Home Server Build

On this blog I have posted many times about my home server configuration, and since I’ve recently updated it I thought I’d give a quick overview of the changes and provide some tips for setting up a Linux home server.

My home server (an HP MicroServer) is used for NAS file storage, running Plex Media Server and a few other activities including client backups. Previously my server was running Windows Home Server 2011, an excellent home server OS from Microsoft based on Windows Server 2008 R2. In addition to file sharing it also allowed for easy server administration and client PC backups: client PCs would back up images to the server, allowing individual files or whole systems to be restored. Unfortunately Microsoft discontinued Windows Home Server, it is no longer supported, and Windows Server 2008 R2 updates will stop in July 2019. There were several replacement options: all the Windows offerings are too expensive and seem overkill for a home server, but serious Linux and BSD options include FreeNAS, Amahi, Ubuntu Server and numerous Linux desktop distros. I would also recommend looking at a Synology if you have the budget.

In the end I chose the Lubuntu 18.04 (LTS) desktop distro for my needs. Why a desktop distro for a server? Well, I don’t need to squeeze every ounce of performance from the server, and the Lubuntu desktop is so lightweight and efficient that I can have a graphical desktop environment as well as great server performance. It is handy to have the option to RDP into the box and use the Lubuntu desktop as an alternative to SSH when required.

I installed the OS, and checked for updates:

sudo apt-get update
sudo apt-get dist-upgrade

After installing the OS I inserted the data drives and mounted them under a /mydata mount point so that I can easily access all the files on those drives. To make these mount points persistent I edited /etc/fstab to add each one using the UUID of the partition (which can be found in the Disks or GParted apps):

sudo nano /etc/fstab

Then add an entry for each partition to mount …

UUID=YOUR_OWN_PARTITION_UUID /mydata/d1/ ext4 defaults 0 0
UUID=YOUR_OWN_PARTITION_UUID /mydata/d2/ ext4 defaults 0 0
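
If you prefer the command line, the blkid tool will also list each partition’s UUID:

sudo blkid

Before rebooting it’s worth creating the mount points and testing the new entries; mount -a attempts to mount everything listed in /etc/fstab, so any mistake shows up immediately:

sudo mkdir -p /mydata/d1 /mydata/d2
sudo mount -a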

Configure the Firewall

The ufw (Uncomplicated Firewall) firewall is installed on Lubuntu by default but is turned off. First check its status:

sudo ufw status 

If inactive then activate it with:

sudo ufw enable

To see its status and current rules:

sudo ufw status verbose

Add new rules with:

sudo ufw allow PORTNUMBERHERE 
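
For example, if you connect over SSH it’s wise to allow port 22 before enabling the firewall on a remote box, so you don’t lock yourself out:

sudo ufw allow 22/tcp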

Install XRDP Remote Desktop Service

Next I set up XRDP for remote access to this headless server.

sudo apt-get -y install xrdp
sudo ufw allow 3389/tcp
sudo systemctl enable xrdp
sudo systemctl restart xrdp

At this point I hit many issues, but the hack below is the one that led to a working XRDP session: use this command to create a .xsession file in the home directory of the connecting user:

 echo "lxsession -s Lubuntu -e LXDE" > ~/.xsession 

For more information see this article.

Setup Cockpit Web Interface

For more remote administration and monitoring goodness I installed Cockpit, a web-based interface for servers with lots of useful features.

sudo apt-get install cockpit
sudo ufw allow 9090

Then browse to https://yourserverip:9090
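
Cockpit is socket-activated under systemd (on Ubuntu’s packaging at least), so if the page doesn’t load, making sure the socket unit is enabled and running is the first thing I’d check:

sudo systemctl enable --now cockpit.socket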

For a useful Cockpit install guide check out this link.

Setup Samba for file sharing

The whole point of a file server is to share files, and in order to support file sharing with Windows devices on the network you’ll need to set up Samba. A good link for setting up Samba can be found here.

sudo apt-get install samba

Set a password for your user in Samba:

sudo smbpasswd -a <user_name>

All the folder shares and their configuration are stored in the smb.conf file, which can be edited by opening it in a text editor like nano:

sudo nano /etc/samba/smb.conf

In the smb.conf file you may want to change the workgroup name to match the one used by your Windows PCs, and then add each of your folder shares:

[<folder_name>]
    path = /folder/path
    valid users = <user_name>
    read only = no

Once you have made the required changes, restart the smbd service:

sudo service smbd restart

You’ll also likely need to punch a hole in the ufw firewall for Samba:

sudo ufw allow Samba

Once Samba has restarted, check your smb.conf for any syntax errors with testparm:

testparm
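
As a further check that the shares are actually being offered, smbclient (available via the smbclient package if not already installed) can list them from the server itself:

smbclient -L localhost -U <user_name>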

For a good guide on more complex permissions check out this guide here.

It’s worth noting that users need to have Unix perms on the underlying folders in order to be able to access them. Amend Linux file permissions as required.
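
For example, to give a user ownership of, and read/write access to, a shared folder, something along these lines works (the path here is illustrative):

sudo chown -R <user_name>:<user_name> /mydata/d1/share
sudo chmod -R u+rwX /mydata/d1/share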

Scheduled Tasks with Cron

For scheduled tasks (backup jobs etc.) I have configured cron jobs to run bash scripts. I could have kept my existing PowerShell scripts, as PowerShell now runs on Linux too, but they needed a rewrite anyway.

Open your cron job file with…

sudo crontab -e 

then add entries like this example:

# run test job at 11:15 every day 
15 11 * * * . /etc/profile; /bin/bash /home/me/testjob.sh > /tmp/cron.out
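
As an illustration, a minimal backup script along the lines of the testjob.sh above might use rsync (my choice here; the paths are placeholders):

#!/bin/bash
# Mirror the documents share to a second drive, removing files from
# the destination that no longer exist at the source.
rsync -a --delete /mydata/d1/documents/ /mydata/d2/backups/documents/
echo "Backup completed at $(date)"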

For more info on Cron check out this guide and for an awesome helper tool for building the Cron schedule times check out corntab.com.

Summary

So I’ve covered the basics of how I’ve set up my Home Server using Lubuntu which others may find useful. I’ve been running this setup for a few months and so far I am very pleased with its performance and stability.

Future steps are to install Plex Media Server and configure client PC backups. As I want to use Snaps for Plex I am waiting for the official Plex Snap package to come out of beta, as I’m not in a rush. Alternatively I may use Docker to run Plex. To replace the client PC backup feature I previously had with Windows Home Server I will soon be moving to a client imaging tool such as CloneZilla, Acronis or Windows Backup, and then copying the images to the server.

Android Remote Desktop Client

2XClient_Logon

I find that I am increasingly relying on the computing power of my Android smartphone (an HTC Desire) and finding novel ways of using it to make my IT life easier. Sometimes I just want to connect to my PC in another room, or more often my headless Windows Home Server, and so I scouted for a Remote Desktop client that I could run on my phone. The key requirement was for it to use the Windows native Remote Desktop protocol and therefore not require any software to be installed on my PC or server, which ruled out a lot of the VNC-based apps. Luckily 2x.com have released an excellent free app that ticks all the boxes.

2XClient for Android can be found here or on Android Market here. It is dead easy to set up the target machines and there are several display optimisation options. The key thing though is that it’s actually very easy to navigate the target machine’s desktop via a custom keyboard and a nifty mouse icon that can be dragged around with a left and right mouse button attached (left image below). In these images I’m logging onto my Windows Home Server (a Windows 2003 based OS) but I also use it with my Windows 7 PC too. One thing to note for Windows 7 though is that I needed to set my Remote Desktop settings (via My Computer > System Properties > Remote Settings) to “Allow connections from computers running any version of Remote Desktop” as opposed to the default setting of enforcing Network Level Authentication.

 2XClient_Mouse  2XClient_Keyboard  2XClient_StartMenu

It is surprisingly easy to do simple tasks on the target machine, especially after a bit of practice. Here I am using PowerShell and checking my Home Server Console.

2XClient_WHS1 2XClient_POSH

A very powerful tool to have on your phone and ideal for those quick techy tasks when you can’t be bothered to get off the sofa.

The Future of Windows Home Server

Microsoft’s recent announcement that the key Drive Extender feature is to be removed from the new version of Windows Home Server codenamed ‘Vail’ has resulted in much dismay within the community. Many commentators, including the vocal WHS user community itself, have started to question the future of this product. In this post I give my take on where I see WHS in the medium term and consider how it can fit alongside the “new dawn” of a Cloud Computing era.

How big is the Drive Extender issue?

Firstly, what’s all this about Drive Extender (DE)? Well DE is a really neat feature of WHS that pools all the hard drives in the system into one logical data drive. This means that you can throw in a mixed selection of hard drives of any type (USB, SATA etc) or capacity and the system enables you to see them as one. It also provides fault tolerance through data duplication which protects your data from drive failure. It is one of the major features of Windows Home Server (WHS). I would argue one of three, with the others being the client backups and remote access. Sure the product does much more than just that but it’s fair to say that all of WHS’s features are available in other products in some shape or form and the combination of these three features into one customisable platform made WHS stand out for me.

Microsoft’s announcement to remove DE from the next version of WHS, codenamed ‘Vail’, immediately removes a major reason to buy into the new version, and this has been evidenced in the recent Twitter comments on the subject, where a lot of people have stated their intention not to use ‘Vail’. Of course some of this is just anger at the fact that the feature has been removed (and the way in which it was announced) but still the fact remains that the product is a weaker proposition than it was before.

Personally I see this decision in both a negative and a positive light. Firstly I see this as a major blow to the uniqueness of the product and feel that it will suffer without this USP (Unique Selling Point). Also it’s important to remember that this is positioned as a product for the average PC user, and DE made extending the storage capacity easy. The user doesn’t need to buy matching disks or configure RAID; they just pop in a new disk and it gets added to the pool. Without DE, adding extra storage will presumably be a more complex task. In reality though, how many “average” PC users would feel happy upgrading the hard drive on their WHS anyway? Whilst enthusiasts relish the chance to pop open the case, many casual users would actually see their OEM-produced WHS as just an appliance, and one probably already stuffed with several 2 or 3 TB drives providing a good chunk of storage capacity right out of the box. They would not consider any upgrades to it other than replacing it when it gets full. In addition, whilst the shared drive pool concept makes adding storage easy, the ability to add additional storage as additional drives will still be there in the product, as it is in any Windows OS. I don’t see this as a huge blocker to WHS adoption.

Folder duplication utilises the DE feature to ensure that data is duplicated onto different physical drives within the logical storage pool. This is in effect ‘RAID like’, except that the data is duplicated over time and not immediately (although there is no way of retrieving previous versions of files). This provides an easy form of fault tolerance that, whilst being fairly easy to replicate yourself using other means, will probably never be as easy as ticking a check box. This is again more of an issue for the “average” guy than the PC enthusiast who is at home configuring RAID, although a simple file copy add-in or batch job is my preferred solution. I already run daily automated RoboCopy jobs to copy ‘snapshots’ of my data drives to another drive, providing both fault tolerance and versioned snapshots that I can restore if required. I have had to dive into my snapshots on several occasions to restore a previous version of a file that has accidentally been deleted or modified. I prefer this solution over RAID, as a disk write to a RAID array is duplicated immediately even if it’s not what you wanted.

So, what’s the positive? Well, let’s consider why Microsoft are removing it. They have said that it causes conflicts with applications installed on the Small Business Server sister OS, codenamed ‘Aurora’. These software applications don’t play nicely with having a logical drive pool. I, as have many other WHS enthusiasts, have over time installed numerous applications onto my WHS (e.g. Microsoft Team Foundation Server) and I always do so with caution due to DE. I am careful to ensure that nothing I install utilises the DATA drive and I often refrain from installing software that I think might conflict. With DE removed this worry is taken care of, which is definitely a positive for me.

Does WHS fit in the Cloud Computing Landscape?

If we look to the future and assume that the Cloud Computing paradigm is here to stay, the bigger question arises of what role WHS would play. I admit to being a Cloud advocate and I do share Ray Ozzie’s view of a “New Dawn” where devices (not PCs) connect to continuous services hosted on the internet. In this vision the majority of people only use devices to connect to the internet (smartphones, tablets, TVs etc.) and they are continuously connected to the web where their data is stored, analysed, processed and shared. The concept of having a local home server is almost alien as your storage will all be in the cloud. Backups won’t be required as data will be automatically synched, and devices won’t need to be imaged for restoration as they will only be simple devices with sophisticated browsers. Sure, PCs will remain for advanced users, but not the user majority. This vision of the future is not that revolutionary; it’s already happening, so fast in fact that the next version of WHS after Vail will need to be positioned within this connected world. People may cry that users will always want their data close by and local, but that’s not true: over time they won’t even think about it, as evidenced by early cloud services like Hotmail, Exchange Online etc.

This vision of the future relies heavily on a fast internet connection and related infrastructure, which is slowly being rolled out across the developed world, but this weakness perhaps provides an opportunity for the WHS’s of the future. The ability to synch to your local “private cloud” and use that as the hub for your home is probably a requirement of the future, and a ‘server’ device could fulfil this space. Unfortunately so could other home based devices, such as the Xboxes, Google TVs and Media Centers of the future, and the single home device is the ‘holy grail’ of consumer electronics. The battle for the position as sole ‘provider’ and gateway to the continuous services of the future will be intense, and whilst the current WHS offerings (V1 and ‘Vail’) are too weak to survive the battle, maybe, just maybe, their future offspring will fit that gap perfectly.

Summary:

WHS has, unfortunately, always been a niche product, which is a real shame as it is one of the best products ever to have come out of Redmond and one that deserves more credit. Microsoft have never promoted it and seem instead to be happy to use it as an experiment for newer technologies (like DE). This is obviously a dark period for the WHS product, but the community’s reaction to the DE news and the growing popularity of the platform mean that I believe it will survive in the short term.

If I were Microsoft I would look to extract the key features of WHS (i.e. client backup and remote access services) and convert them into add-on applications for Windows. With DE gone there is little point in having a ‘Home’ SKU of Windows Server. Sell Windows 2008 Foundation to OEMs with these WHS feature applications installed for them to put on their consumer devices. This would enable these features to be supported on Windows client OSs in the future too, when it was profitable to do so. I would be happy to run a fully fledged, supported version of Windows Server that comfortably ran all server-based software but to which I could also install Client Backup and Remote Access services if I required them.

Will I upgrade to Vail? Good question. Currently I’m undecided. I will review it against other products when the time comes (Amahi on Linux, Aurora, Win Server 2008) but one thing is for sure – the removal of DE will not affect my decision but the strength of Microsoft commitment to the product will.

Backing up TFS 2010 with new Power Tools Backup Plan

At last – backup is built into the TFS product, well, via the Power Tools at least. Backing up TFS has always been difficult and non-intuitive without a SQL DBA in your pocket; even the documentation is at best extensive and at worst confusing. But all that’s history, as the September 2010 TFS Power Tools now include a Backup Plan feature.

Recently I have posted a few articles on backing up TFS 2010 using Windows PowerShell. The first involved backing up all the TFS SQL databases using PowerShell and SMO and can be found here: ‘Backing Up TFS Part-1’. The second covered an alternative strategy of backing up the latest content of your TFS repository and can be found here: ‘Backing Up TFS Part-2’.

Brian Harry recently posted this article ‘Backing up and restoring your TFS server’ describing the new Backup Plan feature of the new release of the TFS Power Tools and this week the September 2010 TFS Power Tools were released. Brian’s posts provide all the details you need on installing and configuring a backup plan together with some screenshots.

I installed the updated Power Tools on my Windows Home Server (you must remove any previous versions manually first) running TFS and easily setup a backup plan:

TFSBackupInstallOption

I did hit one problem with it though. It seems that the feature is a little fussy about the target location of the backup files. The target must be a network share, and the tool will attempt to apply the relevant access privileges to the folder so that the scheduled job can write the backups successfully. The tool verifies that the information you have supplied is accurate and checks that a backup can run. My initial attempts to verify my backup plan failed at this point with the following error in the log:

Microsoft.SqlServer.Management.Smo.FailedOperationException: Backup failed for Server 'servername\SqlExpress'. ---> Microsoft.SqlServer.Management.Common.ExecutionFailureException: An exception occurred while executing a Transact-SQL statement or batch. ---> System.Data.SqlClient.SqlException: Cannot open backup device '\\servername\backup\tfsbackups\Tfs_Configuration_20100911231343.bak'. Operating system error 5(failed to retrieve text for this error. Reason: 1815).

After some investigation and a helpful post from Dave Hunter I concluded that this was due to the permissions on the backup share on my Home Server. As that share is a WHS managed share it has specific security permissions that prevented the backup tool from asserting its authority and granting the relevant privileges. To circumvent the problem I created a new standard non-WHS share on my Home Server with minimal restrictions. Once I entered the new share’s details the backup tool verified the backup plan successfully and ran a backup fine. I then knocked up a simple RoboCopy script to copy the contents of the new backup share to my original intended WHS share target location on a daily basis via a scheduled task.
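
For reference, the daily copy job can be a single RoboCopy command in a scheduled batch file; the share names and log path below are illustrative, not my actual setup:

robocopy \\servername\tfsbackup \\servername\backup\tfsbackups /MIR /R:2 /W:5 /LOG:C:\logs\tfsbackupcopy.log

The /MIR switch mirrors the source to the destination, so anything deleted from the source share will also be removed from the target.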

In summary I believe that this is a major step forward for TFS and will benefit many of the new users picked up since the introduction of the TFS Basic Configuration in TFS 2010, proving to be another nail in the coffin for Visual SourceSafe. I’d recommend any team still using SourceSafe or any other tool to take another look at TFS as it is definitely getting easier to manage than previously.

Installing Team Foundation Server 2010 on Windows Home Server

Last October I posted about the fact that Microsoft’s Team Foundation Server 2010 was going to ship with a “Basic” configuration that was more lightweight and seen as more of a Visual SourceSafe replacement. In that post I also pointed out that it seemed possible to install this version on a Windows Home Server (WHS) and that it would be something I’d try. Well, eventually I have got round to doing it and of course I’ve documented my approach. The benefits of TFS are immense but out of the scope of this article; instead I’m covering the installation process.

It may have been technically possible to jump through some hoops and install TFS 2008 on your WHS box but the pre-requisites were heavy (including SharePoint and full SQL Server) and it was always seen as a complicated process. With the ‘basic’ configuration of TFS 2010 you can ignore SharePoint and SQL Reporting Services and it installs onto SQL Server 2008 Express (which it also installs for you).

A point worth mentioning is that TFS 2010 will not trash your existing web sites (important for Windows Home Servers) but will instead install its own site alongside any existing sites on the server. It is possible to then connect to TFS remotely via your WHS Remote Access domain name, which is a very nifty feature.

Pre-Installation Considerations:

Basic Configuration: Whilst it’s possible to install the non-basic configurations I don’t see the point unless there is something specific that you want to take advantage of that’s not included in the basic configuration. For a light, easy installation and a nice minimal processing overhead on your WHS the ‘basic’ configuration is ideal. It also contains all the core features such as build automation, work item tracking and source control.

Location of SQL Server Data Files: By default the SQL Server data files are installed to the system drive under the program files location for SQL Server Express. As you add more and more content to TFS these files will naturally grow, so you need to consider up front whether you have the disk space to support this. Windows Home Server by default installs only a 20GB system partition, which should have adequate space for the data files to grow, but if you have installed a lot of applications to your system drive this could be an issue. To store the data files at an alternative location it is easier to install SQL Express 2008 manually prior to installing TFS 2010 (and configure the data file locations via the SQL setup). The TFS installation will then just reuse the SQL Express instance already installed. Alternatively you could install the data files on the WHS’s ‘Data’ drive (D drive) but personally I prefer to leave that drive well alone and let WHS’s ‘Drive Extender’ manage all the data in its storage pool. I did consider installing a new drive to my server, not adding it to the storage pool, which would then be used for my data files. In the end I decided I had adequate space on the system drive and I could move the data files at a future point in time if required.

Build Controller: Part of the TFS installation/configuration includes the decision to install a Build Controller on the server. If you plan on running automated TFS builds then you’ll need to install this. I would strongly recommend you use the Team Build features of TFS as they are one of its key features but if you don’t want to automate builds or want to minimise the services running on your server then just skip that part.

Support: Whilst WHS is really Windows Server 2003 under the covers, and TFS 2010 clearly supports Windows Server 2003, installing it on a Windows Home Server is not supported by Microsoft (or anyone else) and you do so at your own risk.

Steps:

You’ll need to Remote Desktop onto your server to run the installation interactively. The below screenshots show the key stages of the installation process. Luckily it’s mostly a matter of clicking ‘Next’:

whs_tfs_01 whs_tfs_02

Choose to install Team Foundation Server and the Build Service (if you plan to run automated builds on this server too). Don’t install the Server Proxy:

whs_tfs_04  whs_tfs_06

After much processing and a reboot it’s installed:

whs_tfs_07 whs_tfs_08

Once installed, it then launches the Configuration Centre and it’s at this point that you can choose the ‘Basic’ configuration. It will then install SQL Server Express or point to an existing SQL instance:

whs_tfs_10 whs_tfs_11 whs_tfs_12    whs_tfs_16 whs_tfs_17

Next you’re asked to configure the Build Service details:

whs_tfs_18 whs_tfs_19 whs_tfs_20 whs_tfs_21 whs_tfs_22    whs_tfs_26 whs_tfs_27

That’s it. You can launch the ‘Team Foundation Server Administration Console’ from the Start Menu which is a neat new tool enabling you to manage TFS from the server itself:

whs_tfs_29

Next Steps:

Connect via Visual Studio: Now it’s all up and running you can fire up Visual Studio and test that you can connect to the new Team Foundation Server by adding it to the list of servers in Team Explorer. You should be able to create new Team Projects, check in code and run builds just as you would on any other TFS server.

Enable Remote Access: To enable remote access to TFS so you can access your source code from remote locations you need to copy your client certificate from the Remote Access site and then add it to the TFS site. For instructions on how to do this see this excellent post by Jason Neave.

Backups: Call me paranoid, but before I add anything important to this TFS server I want an automated backup procedure in place. Backing up TFS is really ‘just’ a matter of backing up the SQL Server databases on which TFS sits, as they are the only source of its data. I say ‘just’ because it’s not as easy as it should be. A key point to note is that as TFS uses numerous databases it is critical to back up all of them at the same time, so you don’t end up restoring unsynchronised databases. I have created an automated backup procedure using Windows PowerShell that I will share in a future post.
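
To give a flavour of what’s involved (this is not my full procedure, and the instance name and path are illustrative), each TFS database can be backed up with a T-SQL BACKUP DATABASE statement run via sqlcmd:

sqlcmd -S servername\SqlExpress -Q "BACKUP DATABASE [Tfs_Configuration] TO DISK='D:\backups\Tfs_Configuration.bak' WITH INIT"

The same command would need repeating for each of the TFS databases, ideally wrapped in one script so they are all backed up together.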

Conclusion:

It’s exciting to see a combination of two excellent products working together and I think this is another great way to add value to your Windows Home Server whilst enabling you to experiment with a quality ALM tool.

UPDATE: For WHS 2011 users check out my post on Installing TFS 2010 on Windows Home Server 2011.

Installing PowerShell on Windows Home Server

Whether you manage thousands of Windows boxes in an enterprise environment or you just want a more powerful shell environment with which to manage your backup scripts on your PC, Windows PowerShell is an excellent free tool at your disposal. PowerShell V2 comes pre-installed in Windows 7 but how do you get it up and running on your Windows Home Server?

You can install V1 of PowerShell via Windows Update as it is an optional update on Windows 2003 Servers (SP2) which is what a Windows Home Server is underneath. The instructions to do this are detailed here. Basically you go to Automatic Updates on the Control Panel, view the list of ‘optional’ updates and it should be there for you to choose.

What about PowerShell v2? Well a recent post on the PowerShell Team’s blog suggests that V2 will also be available via an optional update to Windows 2003 Servers, replacing the V1 option. The same instructions as above should apply.

If the update isn’t available to you yet or you want to install V2 the manual way then you need to download the Windows Management Framework Core (KB968930) package which is detailed here. The Windows 2003 download relevant for Home Servers is here. It’s a very simple install and once complete you can access PowerShell V2 from the Start > Programs > Accessories > Windows PowerShell menu item.

POSH_WHS_01  POSH_WHS_03

For more information on Windows PowerShell check out “Getting Started with PowerShell” on MSDN or launch the PowerShell ISE environment from the PowerShell Start Menu folder (detailed above) and press F1. The bundled help files are surprisingly good and will soon get you up to speed. I intend to be posting more on PowerShell as I migrate all my server backup scripts over from standard batch file format to PowerShell.

‘Windows Home Server’ Build & Setup

I recently setup a new Windows Home Server and this post covers why I chose this operating system and how I setup my server.

Requirements:

My requirement was for an extendable ‘always on’ network-attached file storage solution that would allow me to access my files from any machine in the house (and ideally remotely via the Internet when required) whilst providing some fault-tolerant data protection. Having all my data in one place makes it easier to manage (less duplication of files across machines) and easier to back up. This centralisation of data, however, means being more susceptible to hardware failure (e.g. hard disk failure), so a solution with either a RAID configuration or something similar was required, which ruled out most budget NAS storage devices. After investigating the options I decided to build a Windows Home Server (WHS). This meets all the requirements above and also adds other neat features such as the extensible Add-In model (a huge bonus for a .Net developer like me).

Buy vs Build:

Having decided on Windows Home Server as the solution the next step was to decide whether to buy or build. There are several very smart WHS devices available from manufacturers like HP and Acer. Whilst these are the easy option they are not the cheapest or the easiest to extend. Also the availability of these devices varies depending on your geographical location. Based on these factors I decided to build.

Build Option:

The fact that WHS has such modest hardware requirements means that building a server is a very economical option. As my server will be ‘always on’ I made power efficiency a key requirement in my build. To this end I considered the Intel Atom processor found in most ‘NetBooks’. These consume little power and pack enough punch to run WHS comfortably. The Atom CPU comes pre-attached to an Intel motherboard (you can’t buy them separately yet) for under £50. However, as I wanted the storage in my server to be extendable and grow over the next few years I needed space for at least four hard drives, but the majority of Intel Atom boards only come with two SATA ports. Some boards do exist with four SATA ports but they are hard to source. Another possible Atom drawback is that it may be difficult to source Windows 2003 drivers (required for WHS) for ‘NetBook’ targeted Atom motherboards.

Buy Option:

image24

Eventually after some investigations I had a list of parts to build into my shiny new server, but also a few reservations. Firstly, would all these components play nicely together, and would the build be solid enough to meet my ‘always on’ requirement? After discussions with a colleague he suggested I look for pre-built end-of-line servers, which is what I did. I quickly found the HP ProLiant ML110 G5 going for £170, bargain. With 1GB RAM, a dual core Pentium 1.8GHz CPU, on-board video, Gigabit NIC (Network Interface Card), 160GB HDD, DVD ROM and a multitude of SATA ports it was ideal.

Sure, it lacked the power saving benefits of an Atom processor based server, but its solid enterprise-level build quality more than makes up for it. As the server is designed to run Windows 2003, drivers would also not be a problem.

BiosInfo

For storage I purchased two Western Digital Caviar Green 750GB SATA drives to sit alongside the HP’s 160GB disk. By buying two I can make full use of WHS’s data duplication features to protect my data. Whilst the ‘Green’ branded disks are not as fast as traditional drives they are packed with energy saving features which I value in an ‘always on’ server. 

internals

Hardware:

After much deliberation on whether to use the faster 160GB drive or the larger 750GB drive for the system drive, I decided to install a 750GB drive as the system drive, mainly to ensure maximum extensibility. Whichever drive I installed as the system drive I would be stuck with (without reinstalling the operating system) and I didn’t want to be limited to the smaller 160GB drive. To make installation of the OS easier I only connected up the first hard drive, then connected the other two later once the OS was up and running.

 drives

Software Installation:

Once the hardware was sorted I put in the WHS DVD and followed the instructions. The installation went quicker than expected, surprisingly not spending long on performing the ‘Microsoft Updates’. Once installed I logged on to find that WHS didn’t have the right NIC (Network Interface Card) drivers and therefore the NIC hadn’t been installed. This of course explains why I didn’t have to wait for the install to download the updates as it couldn’t get on the web to find them. I installed the NIC drivers from the HP CD and rebooted to find that I could now access the internet via Internet Explorer but neither ‘Windows Update’ nor ‘Product Activation’ would connect. After further investigation (and much head scratching) I found this error in the Windows Event Log:

Type: Error.  Source: W32Time.
Description: Time Provider NtpClient: An error occurred during DNS lookup of the manually configured peer ‘time.windows.com,0x1’. NtpClient will try the DNS lookup again in 15 minutes. The error was: A socket operation was attempted to an unreachable host. (0x80072751)

Checking the system time revealed I was two years in the past (2007) for some unknown reason. After correcting the date I could connect to Windows Update fine. After a mammoth 70 updates and a reboot I was presented with a strict ‘Activate Now’ prompt on logon. I presume that since my WHS install believed it had been installed for two years without activation it thought it was time to get serious. After activating I ran Windows Update again and this time it installed 5 more updates. Once the OS was stable I connected up the extra hard drives and added them to the storage pool via the WHS Console Server Storage tab.

Clients:

In order to connect to your client PCs, the server and clients need to be in the same Workgroup, making this alignment the next task, along with checking for useful machine names/descriptions. Once all the clients were ready I installed the Client Connector software on each one (all Windows 7 clients) and configured their backup schedules. All clients connected and performed a successful backup first time.

Before copying across all my data onto the Windows Home Server Shared Folders I made sure that ‘Folder Duplication’ was turned off. This was purely to maximise the transfer speed (as WHS didn’t have to perform any duplication during the copy process) but I made sure I turned ‘Folder Duplication’ on for all folders after the data was in place.

Setting up a Print Server:

Next I wanted to set up my server as a print server, ensuring that I could print from any machine without having to turn on the desktop hosting the printer first. The printer is a basic Lexmark Z615, and there are some unsupported Windows 2003 drivers on the Lexmark site. After trial and error with these, though, I abandoned them and reverted to the XP drivers, which worked OK. I did have to reboot several times to completely remove the failed printer installed with the Windows 2003 drivers.

An annoying feature of Windows is that it searches the local network for other printers and adds them to the server. I don’t want ‘Print to OneNote’ and ‘XPS Document Printer’ printers on my server, but deleting them is pointless as they will reappear. To prevent Windows from performing this auto search you need to turn it off by deselecting the option in My Computer > Tools > Folder Options > View > “Automatically Search for Network Folders & Printers”.

With my print server set up I attempted to add the printer to my Windows 7 client, but this proved difficult too. I couldn’t find an option to specify the correct drivers to use for the printer, and the Vista drivers (needed for Windows 7) weren’t installed on my server. In the end I found this blog post which explains how to use the Print Management tool (new to Vista SP1) to add additional drivers to your print server. This worked perfectly and on the next attempt it downloaded the Vista drivers correctly and installed the printer successfully.

WHS Add Ins:

I intend to install and (time permitting) write plenty of Add-Ins for use with WHS as I think they are an excellent way to add functionality to your server. So far I have installed the Microsoft WHS Toolkit v1.1 and Andreas M’s Advanced Admin Console. I find the Advanced Admin Console useful for accessing admin tools via the WHS Console without having to Remote Desktop into the server each time. Over the next few weeks I hope to review the power management Add-Ins and install one to help my server get a few hours’ sleep overnight when it’s not required, thus saving power and money.

Summary:

So that’s my build story. My home server is up and running and I’m so far very impressed with it. I aim to post some more articles about Windows Home Server over the coming months.

Windows Home Server

I have recently set-up a home server using the Windows Home Server Operating System. The details of the set-up will follow in a future post but firstly I thought I would quickly introduce the Windows Home Server (WHS) product and provide some useful links.

Windows Home Server was released by Microsoft in 2007 and is built on top of Windows Server 2003. Its role is to sit quietly in your home and automatically back up all your PCs, provide NAS (Network Attached Storage) file sharing features, media streaming and remote access. It protects your data from hard drive failure by duplicating it over multiple drives where you have a multi-drive system.

WHS can be bought pre-installed on custom devices from companies like HP and Acer, or you can install it yourself on your own kit. As the hardware requirements are so light it’s possible to get it running on an old PC you might have lying around. Alternatively, build or purchase a cheap low-end PC for the purpose.

Particularly of interest to developers is the WHS Add-In model. WHS can be extended through the use of ‘Add-Ins’ from various ISVs (Independent Software Vendors) and enthusiast developers. Microsoft provides a Windows Home Server Add-In SDK for .Net Developers wanting to write Add-Ins for WHS and Brendan Grant has Visual Studio project templates on his blog.

Here’s a selection of links for more information: