I'm pretty strict about making sure my data is backed up in numerous places, and my blog content is no different. I would hate to lose all these years of babbling. In this post I cover how I back up this blog; the approach will apply to any blog engine or indeed any website.

This blog is hosted on WordPress.com and I trust the folks at Automattic to keep my data safe, but accidents do happen. Ideally I want an up-to-date backup of this blog together with any images used. Personally I'm not too concerned about having it in a WordPress format; I actually prefer having the raw content that I could use to recreate the blog elsewhere.

The HTTrack Tool

The tool I use is HTTrack (http://www.httrack.com), a website copying tool. It essentially re-creates a working copy of the site on the local disk (which you can then navigate in a browser). The tool has many features and includes a command-line interface, which makes it easy to run via a scheduled task. The various command-line switches are documented at http://www.httrack.com/html/fcguide.html, but I use the simple command below:

C:\HTTrack\httrack.exe http://richhewlett.com -O c:\TargetFolderPathHere

For interest, the -O switch tells HTTrack to output the site to disk and hence produce a copy.

You can create a Windows Scheduled Task that periodically runs this command line, and you then have an automated backup. I personally go a bit further and wrap this command in a Windows PowerShell script. The script creates a new folder each time, named with the current date, and implements error handling which writes to the system event log.

This script is an example only and comes with no guarantees that it will work for you without modification:

#===============================================
# Backup blog to disk using HTTRACK 
# (Created by Rich Hewlett, see blog at RichHewlett.com) 
#==============================================
clear-host
write-output "---------------------Script Start--------------------"
write-output " HTTrack Site Backup Script"
write-output "-------------------------------------------------------"

# set file paths
$timestamp = Get-Date -format yyyy_MMM_dd_HHmmss
$TargetFolderPath="F:\MyBlogBackUp\$timestamp"
$HTTrackPath="C:\HTTrack\httrack.exe"

write-output "Backup target path is $TargetFolderPath"
write-output "HTTrack is at $HTTrackPath"

# set error action preference so errors become terminating and the
# try/catch below can handle them gracefully
$erroractionpreference = "Stop"

try
{
    write-output "Creating output folder $TargetFolderPath ..."
    New-Item $TargetFolderPath -type directory
    
    write-output "Download data ..."
    invoke-expression "$HTTrackPath http://MyBlog.com -O $TargetFolderPath"
    write-output "Done with downloading."    
    
    write-eventlog -LogName "Network" -Source "HTTrack" -EventId 1 -Message "Downloaded blog for backup"
    
}
catch
{
    # an error occurred so let's report it
    write-output "ERROR OCCURRED DURING SCRIPT: $error"

    # write an event to the event log
    write-output "Writing FAIL to EventLog"
    write-eventlog -LogName "Network" -Source "HTTrack" -EventId 1 -Message "Download blog for backup FAILED during execution. $error" -EntryType Error
}

write-output "------------------------------------Script end------------------------------------"

I run this job monthly via a Windows scheduled task.
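For convenience, the task can also be created from the command line rather than the Task Scheduler UI. The following is a minimal sketch only; the task name, script path and schedule are assumptions you should adjust for your own setup:

schtasks /create /tn "Blog HTTrack Backup" /sc monthly /d 1 /st 02:00 /tr "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\BlogBackup.ps1"

This creates a task that runs the script on the first day of each month at 2am; run schtasks /create /? to see the full list of switches.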

 

Using WordPress Export Feature

If you run a WordPress blog you can also do an export via the Dashboard, which will export all site content (including comments). In addition to the raw HTML backup described above, I also run the export tool manually from time to time (and therefore infrequently). For more information on this feature check out http://en.support.wordpress.com/export/


Here’s a quick tip that I found useful last week.

If you’re using IBM Rational Software Architect to produce a UML sequence diagram and you add a new Synchronous Message activity, the tool automatically inserts a return message for you (this can be turned off in the preferences). Last week I discovered that if you happen to delete that return message (or it disappears by itself somehow), it is not at all intuitive how to insert it back again. After much head scratching my colleague found out how to do it, and of course it's easy once you know how (thanks Si).

Here is an example sequence diagram in RSA, but it's missing a return message:

[Screenshot RSA1: sequence diagram with the return message missing]

Now to add the return message, right click, select ‘Add UML’ > ‘Return Message’ (as shown below):

[Screenshot RSA2: the 'Add UML' > 'Return Message' context menu]

A return message is inserted:

[Screenshot RSA3: the return message re-inserted]

As I said it’s easy once you know how, but I know I’ll forget :-)


A key part of most personal data backup strategies involves backing up data to an external USB drive, but I don’t want to leave mine constantly connected. In this post I cover how to back up to an external drive using a scheduled, automated process, but only if the external drive is connected at the time.

I don’t believe in leaving an external backup USB drive permanently connected to my system (PC or server), to avoid the data being corrupted or deleted. Also, if the backup drive is for off-site storage then it can't be permanently connected anyway. Using the steps below I can perform a full data backup by simply physically connecting the drive (plugging in a USB drive or docking a SATA drive into a USB dock, for example) and then leaving it overnight. In the morning I can safely physically disconnect it and store it without even having to log onto the machine.

The basic flow:

1) Assuming that the machine is always on, which my server is, a scheduled task runs every night and executes a backup script (regardless of whether the backup drive is connected or not). If your machine is not always on then vary this by scheduling the task for a time when the machine is usually on.
2) The script checks for the existence of a specific folder on the connected drive (e.g. U:\Backup\). If the drive isn’t connected then the folder path won’t exist and the script just exits happily. However, if the drive has been connected then the folder path will exist and the backup script will continue and copy over the data.
3) After a successful backup the script safely disconnects the USB drive. This step is technically optional as Windows supports pulling a USB drive out without doing a soft eject, but it's highly recommended to tell Windows first to avoid data corruption.
4) Optionally you can also output a backup log somewhere to enable you to check the logs periodically without having to reconnect the USB drive to verify the job worked.

More detailed steps:

Firstly, connect the external USB drive and make a note of the drive letter it uses. Changing the drive letter to something memorable might help (B for backup, U for USB, O for off-site etc.). We’ll use U for this example. Next we need to write a batch file, a simple program or a PowerShell script to perform the backup of the data using the backup tool of your choice. I use Robocopy to copy the files via a PowerShell script, and below is a simplified version of my script.

#==========================================================================================================
# Checks for presence of offsite backup USB drive, and backs up relevant data to drive if present, 
# exits gracefully if not present. Run the script every night and it only backs up data when the offsite 
# USB external drive is turned on. 
#==========================================================================================================
clear-host

# set file paths and log file names 
$timestamp = Get-Date -format yyyyMMdd_HHmmss
$LogBasePath="D:\Logs\OffSiteUSBBackup"
$LogFile="$LogBasePath\USBBkUp_$timestamp.txt"
$USBDriveLetter="U"
$USBDriveBackupPath="U:\Backup"

# set error action preference so errors become terminating and the try/catch below handles them gracefully
$erroractionpreference = "Stop"

try
{
	# Check the USB drive is connected by verifying the path exists
	if(Test-Path $USBDriveBackupPath)
	{	
		# now copy the data folders to backup drive
		invoke-expression "Robocopy C:\Docs $USBDriveBackupPath\Docs /MIR /LOG:$LogFile /NP"
		invoke-expression "Robocopy C:\Stuff $USBDriveBackupPath\Stuff /MIR /LOG+:$LogFile /NP"
		
		# Copy the log file too
		invoke-expression "Robocopy $LogBasePath $USBDriveBackupPath\Logs /MIR /NP"
					
		# Sleep for 60 seconds to ensure all transactions complete, then disconnect USB drive
		Start-Sleep -Seconds 60
		Invoke-Expression "c:\DevCon\USB_Disk_Eject /removeletter $USBDriveLetter"        
	}
}
catch
{
	# Catch the error, log it somewhere, but make sure you still eject the drive using below
	Start-Sleep -Seconds 60
	Invoke-Expression "c:\DevCon\USB_Disk_Eject /removeletter $USBDriveLetter"
}

It is key to include in the script a check for the existence of a specific folder on the drive letter belonging to the external drive (U in our example). Only if it's present do we continue with the backup.

I make sure that Robocopy logs its output to a file, and that file lives on the server and is also copied to the USB drive (as a record of the last backup date etc.). I also report the running of the PowerShell script to the event log for reporting purposes, but that is outside the scope of this post.

It's safer to tell Windows that you're about to pull the drive out, so I call USB_Disk_Eject from within my script, passing in the drive letter. The script waits 60 seconds to ensure all transactions have completed and then ejects the drive before exiting. There are a few tools available for ejecting USB drives, such as Microsoft’s Device Console (DevCon.exe), but I use USB Disk Ejector (https://github.com/bgbennyboy/USB-Disk-Ejector).

Now set up a Scheduled Task in Windows to run the script every night at a set time (see the example below). As the script is scheduled to run every night, all I have to do if I want to perform a backup is connect my backup drive and leave it until the morning. The script will run overnight, find the drive, back up and disconnect. In the morning I can just physically disconnect the drive safely without having to log onto the machine. Periodically I'll check the backup logs and the backup drive to make sure all is well and to check the remaining drive space etc.
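If you are on Windows 8 / Server 2012 or later you can also create the task from PowerShell rather than the Task Scheduler UI. This is only a rough sketch; the task name, script path and trigger time are assumptions to adjust for your environment:

$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-ExecutionPolicy Bypass -File C:\Scripts\OffsiteUSBBackup.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 1am
Register-ScheduledTask -TaskName "Offsite USB Backup" -Action $action -Trigger $trigger -User "SYSTEM"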

Do you like this approach? Got a better idea? Let me know via the comments.



Just a quick note to announce that a new version of my Windows Live Writer plug-in, for source code syntax highlighting in WordPress.com posts, has been released.

CaptainKernel posted a comment on this blog to say he was having issues with the preview feature not formatting the code correctly. After some investigation it turned out this was caused by a change on the WordPress.com side.

Anyway, a new fixed version, v1.4.2, is now available to download here, which resolves this issue.


My laptop running Windows 8.1 decided not to boot this week, instead giving me a blue screen with the error "System Thread Exception Not Handled". As I’d not installed anything new recently I guessed it could be related to a video driver issue, so I tried to Safe Boot. But wait, where is Safe Boot in Windows 8? Google and the Tom's Hardware site came to the rescue with this excellent article for resolving the issue. Note the article's use of BCDEDIT from the Command Prompt to turn on the legacy Windows boot menu (accessed by pressing F8 during boot).

At the C:\ command prompt:  BCDEDIT /SET {DEFAULT} BOOTMENUPOLICY LEGACY

You can dig a bit more into this command on the ‘Windows Developer Center’ site and check out the various options you can specify, including a useful ‘onetimeadvancedoptions’ option that turns on the F8 menu for a single use on the next boot only. For more detail on the Windows 8 start-up settings, including how to restart in Safe Mode from within Windows, check out this page on the Windows site. Also note that you can use MSConfig (Start > Run > "msconfig.exe") to restart Windows in Safe Mode too.
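For example, I believe the one-time option can be set with something like the command below, but do check the Windows Developer Center page for the exact syntax before relying on it:

At the C:\ command prompt:  BCDEDIT /SET {DEFAULT} ONETIMEADVANCEDOPTIONS ON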

To return to the standard Windows 8 boot menu (for faster boot times), once you have resolved your issue, you can run the BCDEDIT command again but this time set the BOOTMENUPOLICY to STANDARD:

At the C:\ command prompt:  BCDEDIT /SET {DEFAULT} BOOTMENUPOLICY STANDARD

If you get an ‘Access Denied’ message, make sure the command prompt window is running as Administrator (right-click the shortcut > Run as Administrator).
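As an aside, you can also launch an elevated command prompt from an existing PowerShell window with the one-liner below (UAC will prompt you for consent):

Start-Process cmd.exe -Verb RunAs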

As for my laptop issue, I used Safe Mode to uninstall my video drivers, enabling me to boot normally and then successfully update the drivers.


If you need to host a static HTML page within an ASP.NET MVC website, or you need to mix ASP.NET WebForms pages with an MVC website, then you need to configure the routing in MVC to ignore requests for those pages.

Recently I wanted to host a static HTML welcome page (e.g. hello.htm) on an MVC website. I added the HTML page to my MVC solution (setting it as the Visual Studio project’s start page) and configured my website’s default page to be the HTML page (hello.htm). It tested OK at first, but then I realised it was only displaying the hello page on debug because I’d set it as the Visual Studio project’s start-up page; I hadn’t actually configured the MVC routes correctly, so it wouldn’t work once deployed.

For this to work you need to tell MVC to ignore the route if it's for the HTML page (or the ASPX page in the case of mixing WebForms and MVC). Find your routing configuration section (for MVC4 it’s in RouteConfig.cs under the App_Start folder; for MVC 1, 2 and 3 it’s in Global.asax). Once found, use the IgnoreRoute() method, placed before the MapRoute() calls, to tell routing to ignore the specific paths. I used this:

routes.IgnoreRoute("hello.htm"); //ignore the specific HTML start page
routes.IgnoreRoute(""); //to ignore any default root requests

Now MVC ignores a request to load the hello HTML page and leaves IIS to handle returning the resource and hence the page displays correctly. 


Sometimes if you are on a new machine, or using Remote Desktop for the first time, you might find that the display size is not correct when you connect to a remote machine. If the remote session won’t go full screen it can be annoying. To resolve this, launch Remote Desktop (tip: Start > Run > mstsc is the easiest way) or find it via the Start Menu (Start > Programs or All Programs > Accessories > Remote Desktop Connection). Once launched, click ‘Options’ or ‘Show Options’ and then on the ‘Display’ tab adjust the size of your remote desktop screen. Move the slider all the way to the right for full screen.


Once you connect, the settings become the default for all Remote Desktop connections, so you’ll only need to do this once. The settings are saved in a Default.rdp file, usually stored in ‘My Documents’ or ‘Users/<UserName>/Documents’. It is possible to save multiple versions of *.rdp files and pass them to MSTSC as a command line parameter if you need to connect to different machines with different settings.
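For example (the file path and server name below are just placeholders):

mstsc "C:\RDPFiles\Server01.rdp"
mstsc /v:Server01 /f

The first command opens a connection using a saved .rdp file; the second connects to a named machine and forces full screen.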


I’m keen on fostering a learning culture within teams and was drawn to this article on InfoQ, Creating a Culture of Learning and Innovation by Jeff Plummer, which shows what can be achieved through community learning. In the article Jeff outlines how a learning culture was developed within his organisation using simple yet effective crowd-sourcing methods.

I have implemented a community learning approach on a smaller scale using informal Lunch & Learns, where devs give up their lunch break routine to eat together whilst learning something new, with the presenter/teacher being one of the team who has volunteered to share their knowledge on a particular subject. Sometimes the presenter already has the knowledge they are sharing, but other times they have volunteered to go and learn a subject first and then present it back to the group. Lunch & Learns work even better if you can convince your company to buy the lunch (it’s much cheaper per head than most other training options).

It’s hard to justify expensive training courses these days, but it’s also never been easier to find free or low-cost training online. As Jeff points out, innovation often comes from learning subjects not directly relevant to your day job. In my approach to learning with a team I have always tried to mix specific job-relevant subjects with seemingly less relevant ones. For example, a session on Node.js for a team of .NET developers would be hard to justify in monetary terms, however I’ve no doubt the developers took away important points around new web paradigms, non-blocking threads, web server models, and much more. Developers like to learn something new, and innovation often comes from taking ideas that already exist elsewhere in a different domain and applying them to the current problem.

I agree with Jeff’s point that the champions are key to the success of this sort of initiative. It is likely that the first few subjects will be taught by the champion(s), and they will need to promote the process to others. One tip to take some of the load off the champions is to mix in video sessions as well as presenter-led sessions. There are a lot of excellent conference session videos available and these can make a good Lunch & Learn session. Once the momentum builds it becomes the norm for everyone to be involved, and this crucially triggers a general sense of learning and of sharing that learning experience with others.


I recently decided to add a custom domain name to a free Azure website that I use for development purposes. As the FREE Azure website model doesn’t support custom domains (a shame, but hard to complain as it’s free) I needed to upgrade the site to the ‘Shared’ tier. This is easily done via the Scale option in the Azure portal.

Firstly, however, I needed to move my current Azure website to sit under a different subscription to the one I used to set it up. The problem is that you cannot move sites between subscriptions yet (please fix this, Microsoft). To get around this I needed to create a new website under the correct subscription and then publish my website code to it. Luckily this is easy to do as it’s just a basic website, but I can imagine that this could be painful if you have a bunch of storage accounts or a database to re-create.

Using the Azure portal, creating a new site is a simple process: click +NEW at the bottom of the portal for the menu shown below:

[Screenshot: the Azure portal +NEW menu]

Once created, all I needed to do was download a publish profile for the new site (see this tutorial link for how to publish to Azure) for Visual Studio to use. Once downloaded, I opened my VS2012 solution, brought up the Publish dialog, pointed it at the new publish profile file and clicked Publish. In just a few seconds I had a new Azure website up and running with my existing MVC web application. This was very smooth, with no change to config or code required; the sheer simplicity of it impressed me as I was short on time.

Next I needed to set up my custom domain which, as previously mentioned, is not available for FREE websites, so I needed to upgrade to SHARED mode. From the Azure portal go to the website's configuration > Scale > click SHARED (remember this mode incurs a cost).


Once upgraded I could then immediately select DOMAINS and set up my CNAME and A record references. For more information see this useful link (configuring a custom domain name for a Windows Azure web site). It’s worth reading the comments on that post too, as they cover issues with registering the domain without the WWW subdomain.
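As a rough illustration only (the host names below are hypothetical; use the exact values shown in the Azure portal's domain management dialog):

www    CNAME    mysite.azurewebsites.net
@      A        [the IP address displayed in the portal]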

Once the DNS entries had propagated I had my existing site up and running under a custom domain on a shared Azure instance, all with very little effort.


Microsoft unfortunately recently announced the demise of the TechNet Subscription. Whilst I appreciate that TechNet download abuse must contribute towards the availability of pirated products, I still think that this is a short-sighted move by Microsoft. The MSDN subscription will continue (for now), and anyone making money from piracy will be able to cover the extra cost of an MSDN subscription. Few individuals, however, are able to afford an MSDN subscription to feed their enthusiasm for Microsoft products. Nor would they want to, with attractive alternatives available from other vendors.

My concern is that the barrier to entry for being a Microsoft technology IT Pro or developer was just raised significantly. In my 2009 post on Microsoft making it too expensive for developers to experiment with Azure, I outlined how critical it is to make your products available to both current and upcoming developers. Microsoft responded over the following years by offering free Azure websites, reducing prices and improving the MSDN offers. This reduced the barrier to entry for Azure for developers, but Microsoft has now raised it for IT Pros and the enthusiast market.

According to Microsoft, evaluation versions of operating systems will be available for download. I think that 90-180 day trials are very valuable, but historically they have only been available for the latest products. Great if you want to try out Windows Server 2012, but not if you need to experiment with Windows Server 2008, which is a major flaw in this approach. Also, short trial periods such as those found with client OSs are a real frustration. Virtual Labs are excellent for targeted training on specific features but are not a replacement for the real-world experience of running a real instance.

But surely it’s all running in the cloud now anyway? Well, perhaps in the future the idea of running servers locally will be a strange concept, but we are some way yet from that being the norm. The enterprise IT Pros and developers of today, and more importantly of the near future, will need to be skilled in running servers locally for some time to come. Running virtual servers in the cloud might be an option for some and may be the future, but it’s currently expensive to do, and techies will not be exposed to those server maintenance activities that are abstracted away by cloud providers.

There is a large home server enthusiast community that relies on TechNet to evaluate and run Windows Server products. This is a vibrant, active community, and one that happily shares detailed technical knowledge with the wider world and feeds the Microsoft technology communities. With the death of Windows Home Server, and now TechNet, these enthusiasts will start to look for alternatives. There are, by comparison, plenty of non-Windows choices in this space (Linux/BSD).

The cost of a TechNet subscription has dropped to a bargain price over the last few years, perhaps too low. Microsoft could instead have gradually increased the price over the next few years to make it less attractive to those looking to avoid buying retail versions, yet keep it as a mechanism for enthusiastic Microsoft techies to access Microsoft operating systems.

In summary, I think that Microsoft has needlessly raised the barrier to entry for experimenting with and learning Microsoft technologies, and made alternative platforms more attractive. In the long run this move will surely push enthusiasts and young upcoming techies into the arms of Linux/BSD.



