Platform Targeting in .Net

If you see one of the errors below in your .Net application it is likely the result of your assembly having been compiled with the wrong target architecture flag set.

1) In this instance the assembly was accidentally compiled for x86 only and run within IIS in a 64-bit process:

Unhandled Exception: System.BadImageFormatException: Could not load file or assembly “AssemblyName Version=1.0.0.0, Culture=neutral” or one of its dependencies. An attempt was made to load a program with an incorrect format.

(For info, IIS has an Application Pool setting under Advanced Settings that enables you to run the App Pool as a 32-bit process instead of 64-bit. This is off by default.)
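
If you need to flip this setting from a script rather than the IIS Manager UI, something along these lines should work from PowerShell (the app pool name "MyAppPool" is just an example):

& "$env:windir\System32\inetsrv\appcmd.exe" set apppool "MyAppPool" /enable32BitAppOnWin64:true   # run the pool as a 32-bit (WOW64) process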

2) Here a .Net application targeting a 64-bit processor was run on a 32-bit system:

This version of ConsoleApplication1.exe is not compatible with the version of Windows you’re running. Check your computer’s system information to see whether you need a x86 (32-bit) or x64 (64-bit) version of the program, and then contact the software publisher.

If you are looking for a quick fix, then just recompile your assembly with the “Any CPU” option as your platform target (see the Build tab in the project properties for this setting). If you want more information and explanation then read on.

[Screenshot: Project Properties Build tab showing the Platform Target setting]

When compiling .Net assemblies in Visual Studio (or via the command line) there are a few options for optimising for certain target platforms. If you know that your assembly needs to be deployed only to a 32-bit environment (or that it needs to target a specific processor instruction set) then you can optimise the MSIL output (what the compiler produces ready for JIT compilation at run time) for that platform. This is not normally required, but perhaps, for example, you have to reference unmanaged code that targets a specific processor architecture.

Essentially this sets a flag inside the assembly metadata which is read by the CLR. If it is set incorrectly this can result in the errors above, but also in other odd situations. For example, if you compile your app to target “x86” and then reference an assembly that targets the “x64” platform, you will see a BadImageFormatException at runtime due to the mismatch. Running an “x86” application on 64-bit Windows will still work, but it will not run natively as 64-bit; instead it runs under the WOW64 emulation layer, which enables x86 execution under 64-bit (with a performance overhead). That may or may not be a valid scenario in your case.
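
As a quick sanity check you can confirm at runtime whether you are actually running as a 64-bit process and/or on a 64-bit OS. For example, from a PowerShell prompt running on .Net 4 or later (the same Environment properties can be used in your own code):

[Environment]::Is64BitOperatingSystem   # is the OS 64-bit?
[Environment]::Is64BitProcess           # is this process 64-bit, or 32-bit (e.g. under WOW64)?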

If you want to reproduce the situation, try creating a new console application and in the Build properties tab set Platform Target to “x86”. Then create a new Class Library project, reference it from the console application, and in its Build properties tab set it to target the “x64” platform. Build and run the application and it will throw the BadImageFormatException above.

The target platform for your project is set in the Project Properties tab in Visual Studio, under Build (see screenshot above). If you are compiling via the command line you use the /platform switch.
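
For reference, a command line compile with an explicit platform target looks something like this (the source file name is illustrative):

csc.exe /platform:x86 Program.cs
csc.exe /platform:anycpu Program.cs

The .Net 4.5 compiler also accepts /platform:anycpu32bitpreferred, which corresponds to the “Prefer 32-bit” option discussed below.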

“AnyCPU” became the default value from VS2010 onwards. Up to .Net 4.0, “AnyCPU” means that if the process runs on a 32-bit system it runs as a 32-bit process and its MSIL is compiled to x86 machine code, and if it runs on a 64-bit system it runs as a 64-bit process and its MSIL is compiled to x64 machine code. Whilst this enables compatibility with a wider range of target machines, it can lead to confusion or unexpected results when you consider that Windows maintains separate system DLLs, configuration and registry views for 32-bit and 64-bit processes. Since .Net 4.5 (VS2012) there is a new default subtype of “AnyCPU” called “Any CPU 32-bit preferred”, which follows the above rules except that if the process runs on a 64-bit system it runs as a 32-bit process (not 64-bit as before) and its MSIL is compiled to x86 code, not x64. This essentially forces your process to run as 32-bit on a 64-bit machine unless you untick the option and turn off the default setting. The setting appears on the Project Properties Build tab as “Prefer 32-bit”.

[Screenshot: Project Properties Build tab showing the “Prefer 32-bit” option]

It is worth noting that you may see a confusing “Multiple Environments” option in Visual Studio, which can be added automatically after migrating solutions from older versions of Visual Studio (I believe it has been removed as an option in VS2015 onwards but it can hang around in older solutions). Use the Configuration Manager tab to check the setting for each assembly. Most developers will want to target “Any CPU”, which supports multiple target environments. If you are getting the above errors then use the steps below to check the assembly, and if it is incorrect try recompiling with “Any CPU” instead.

How to confirm a target platform for a given assembly:

So how do you confirm which processor architecture an assembly was built for? Well there are a few ways:

1) Using .Net reflection via a simple PowerShell command:

[reflection.assemblyname]::GetAssemblyName("${pwd}\MyClassLibrary1.dll") | format-list

Running this command pointing to the assembly you want to check will produce output like this below (results truncated):

An assembly compiled to target AnyCPU:

Name                  : ClassLibrary2
Version               : 1.0.0.0
CodeBase              : file:///C:/TempCode/crap/pttest/x86/ClassLibrary2.dll
ProcessorArchitecture : MSIL
FullName              : ClassLibrary2, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null

Same assembly but now compiled to target x86 :

Name                  : ClassLibrary2
Version               : 1.0.0.0
CodeBase              : file:///C:/TempCode/crap/pttest/x86/ClassLibrary2.dll
ProcessorArchitecture : X86
FullName              : ClassLibrary2, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null

Same assembly again but now compiled to target x64 :

Name                  : ClassLibrary2
Version               : 1.0.0.0
CodeBase              : file:///C:/TempCode/crap/pttest/x86/ClassLibrary2.dll
ProcessorArchitecture : Amd64
FullName              : ClassLibrary2, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null

 

2) You can also see this information in a decompiler tool such as dotPeek. The screenshot below shows x86, x64 and AnyCPU targeted assemblies.
[Screenshot: dotPeek showing the target platform of x86, x64 and AnyCPU assemblies]

3) Use the CorFlags Conversion Tool

The CorFlags Conversion Tool (CorFlags.exe) is installed with Visual Studio and is easily accessed from the Visual Studio Command Prompt. It enables reading and editing of flags for assemblies.

CorFlags  assemblyName

Assuming you have the .Net 4 (or later) version of the tool installed you’ll see something like this. Older versions do not report the 32BITREQ/32BITPREF flags, as per the .Net 4.5 change discussed above:

Microsoft (R) .NET Framework CorFlags Conversion Tool.  Version  4.6.1055.0
Copyright (c) Microsoft Corporation.  All rights reserved.

Version   : v4.0.30319
CLR Header: 2.5
PE        : PE32
CorFlags  : 0x1
ILONLY    : 1
32BITREQ  : 0
32BITPREF : 0
Signed    : 0

To interpret this output, see the table below and check the PE value (PE32+ applies only to 64-bit assemblies) and the 32BITREQ/32BITPREF flags. It is also possible to update these flags using this tool.
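
For example, with the .Net 4.5 SDK version of the tool the flags can be toggled roughly like this (switch names may differ on older versions, and re-flagging a strong-named assembly requires /Force and invalidates its signature):

CorFlags MyClassLibrary1.dll /32BITREQ+
CorFlags MyClassLibrary1.dll /32BITPREF-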

Summary

Below is a simple table of results showing the impact of Platform Target on the compiled assembly and how it runs on a 32/64-bit OS. As you can see, the 32-bit preferred flag results in an AnyCPU assembly being run as 32-bit on a 64-bit OS. The table also shows the values you get when you use the techniques above for determining the target platform of a given assembly.

 

Platform Target in Visual Studio             | PowerShell | dotPeek | 32BITREQ | 32BITPREF | PE    | Runs on 32bit OS as | Runs on 64bit OS as
AnyCPU (pre .Net 4)                          | MSIL       | MSIL    | 0        | 0         | PE32  | 32 bit process      | 64 bit process
AnyCPU (.Net 4+), 32 bit NOT preferred       | MSIL       | MSIL    | 0        | 0         | PE32  | 32 bit process      | 64 bit process
AnyCPU (.Net 4+), 32 bit preferred (default) | MSIL       | x86     | 0        | 1         | PE32  | 32 bit process      | 32 bit process (under WOW64)
x86                                          | x86        | x86     | 1        | 0         | PE32  | 32 bit process      | 32 bit process (under WOW64)
x64                                          | Amd64      | x64     | 0        | 0         | PE32+ | ERROR               | 64 bit process

(The PowerShell and dotPeek columns show the reported target platform; 32BITREQ, 32BITPREF and PE are the CorFlags values.)

In summary, there are a few simple rules to be aware of when using AnyCPU and the 32-bit preferred flag, but essentially AnyCPU will give you the most compatibility in most cases.

Renaming your Team Foundation Service Account

If you haven’t checked out the TFS Service from Microsoft then I thoroughly recommend trying it out. It is basically a TFS instance in the cloud; at the moment it’s free for everyone to use, and there’s an official free plan for small projects of up to 5 users. I run a small TFS instance on my WHS but I’m planning on switching completely to the TFS Service offering to avoid having to manage a TFS installation locally.

One thing that caught me out was the naming of your TFS Service account. I didn’t give it much thought initially, but once I found how good the service was I began to wish I’d named it something more appropriate. Well, as the TFS team are releasing service updates at such a rapid cadence, I didn’t have to wait long for a rename feature. The process to rename your service account is detailed in Brian Harry’s blog here.

Even if you are not enthusiastic about a cloud based centralised ALM tool I recommend checking out Brian Harry’s blog anyway for informative posts on how his team is managing three week sprints and continuous delivery within Microsoft. What this team is delivering for their customers is impressive.

Exporting an HTML Report From your VS/TFS Test Results

Have you ever wanted to export your unit test results from Visual Studio or your TFS Build? Sometimes you may need to provide evidence of your unit testing position to a project stakeholder. This may be for an internal review or as part of a gateway check in a waterfall project. Alternatively you may just want to export your test results for storage later. Out of the box Visual Studio provides the Test Results view, which is a great tool for seeing the tests that passed/failed in a build, but it uses a custom file format (*.trx) and requires Visual Studio to view.

However, I recently found a command line tool on CodePlex by ‘ridomin’ called trx2html which ‘does what it says on the tin’: it converts trx files to HTML. Just pass the path to the test results file for your solution or TFS Build (*.trx) to trx2html.exe and out pops a fancy HTML report (example below).
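
Usage is along these lines (the file name is just an example and the exact switches may vary between versions):

trx2html.exe TestResults\MyTestRun.trx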

[Screenshot: example HTML report produced by trx2html]

The tool is open source on CodePlex and so you can download the latest version from: http://trx2html.codeplex.com.

No VS with Notepad++ and Programmers Notepad

Sometimes, despite Visual Studio being an excellent IDE, you want to go the no-VS route and hack your code in a plain editor. Perhaps you just want to make a quick change and it’s not worth firing up the full VS experience. Maybe you are only checking the code as a background time-filler in between other tasks and don’t want VS eating up your workstation’s resources all day. Perhaps most likely, you need to edit your code on a PC without VS (e.g. a netbook). A while ago I came across this excellent post by SecretGeek where he uses a batch script to add compilation to the excellent Notepad++. I recommend taking 5 minutes to follow his instructions and get this working as it makes Notepad++ even more useful. Don’t forget you will need to change the .Net framework path in the batch file if you’re building apps older than .Net 4.0.

I’ve also copied the batch file and modified it to work for solution files too, so I can compile the whole solution if required instead of just the project file. I then added a second entry to the Notepad++ file (as per SecretGeek’s instructions) and a second shortcut key for this batch file. The only change required is to amend the “*.*proj” pattern to “*.sln”. This change assumes that the project and solution files are NOT in the same folder though, as this confuses the .Net compiler.
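
If you prefer PowerShell to batch files, the core of what the modified script does boils down to finding the solution file and handing it to MSBuild, roughly like this (the .Net 4 framework path is an assumption):

# locate the first solution file in the current folder and build it
$msbuild = "$env:windir\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe"
$sln = Get-ChildItem -Filter *.sln | Select-Object -First 1
& $msbuild $sln.FullName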

As good as Notepad++ is as a notepad tool, if you need something a little more like a mini IDE then I recommend “Programmers Notepad”. Not only is it an excellent lightweight Notepad-like tool but it also supports “projects” and external tools. By far the best feature though is that it parses the output from the compiler and highlights the results; clicking on an error will take you straight to the offending line of code (see screenshots below).

[Screenshots: Programmers Notepad compiling a project, highlighting an error and jumping to the offending line]

It is easy to integrate SecretGeek’s batch files above into Programmers Notepad. Go to Tools > Add Tool and set it up like this:

[Screenshots: Programmers Notepad Add Tool dialog configured for the compile project and compile solution batch files]

Now we’re cooking! So what’s left? Well, what about being able to run your unit tests from this mini IDE? Let’s assume Visual Studio is installed on the machine in question (which perhaps negates the point of the exercise, but not always: sometimes you have it installed but just don’t want to run the full-blown VS IDE), so we can run MSTest via a batch file. For this batch file to work we have to make a few more assumptions:

1) You will always run the tests whilst the unit test project code is the open active document in Programmers Notepad.
2) The test assembly name is the same name as the test project file.

Using these assumptions we can code the RunTests.bat batch file to find the active document’s VS project file (*.*proj) and use its name to find the output assembly in the bin\Debug folder. Once we have the test assembly we can throw it at MSTest to run the tests. Assigning this new batch file to a Tools menu item in Programmers Notepad (as we did with the other batch files above), we can run the tests from within Programmers Notepad and the test results are output to the output window.

[Screenshot: test results output in Programmers Notepad]

The code for the RunTests.bat batch file is:

@ECHO OFF

REM Path to MSTest (VS 2010 default install location)
Set MSTestPath="C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\MSTest.exe"

:findproj

Set current=%cd%

If not exist *.*proj goto tryFindProjAgain
REM Get the project file name (assumes a C# project)
FOR %%f IN (*.*proj) DO Set dllname=%%f
REM Convert the project file name to the test assembly name
set dllname=%dllname:csproj=dll%
Set fullpath=%cd%\bin\Debug\%dllname%

REM Run the tests in the output assembly
%MSTestPath% /testcontainer:"%fullpath%" /detail:errormessage

goto end

:tryFindProjAgain

ECHO No project file found in:
ECHO %cd%

REM Move up a folder and try again
cd ..

REM Stop if we have reached the root level
if "%current%"=="%cd%" goto end

goto findproj

:end

Installing Team Foundation Server on Windows Home Server 2011

Twelve months ago I wrote a post documenting “installing Team Foundation Server 2010 on Windows Home Server” which has proved very popular. Well, things move on, and since then Microsoft have released a new version of Windows Home Server (WHS 2011). There are many differences between V2 of WHS and V1, but the main points for the purpose of this post are that WHS 2011 is built on top of Windows Server 2008 R2 (compared to Windows Server 2003 for V1) and that, controversially, the Drive Extender technology has been removed. Whilst Drive Extender was no doubt useful for storage pooling, it did make installing applications like TFS a little concerning. As described in my original article, I wouldn’t install an application or a SQL Server database into the Drive Pool (it just feels wrong to me and I wouldn’t trust it), and I stick by this, especially as it’s been suggested that one of the reasons for Microsoft removing DE was that it didn’t play nicely with the enterprise applications targeted at the new Small Business Server Essentials product range with which WHS 2011 shares its code. No DE means you can now install TFS to whichever drive you like in WHS 2011, and the fact that it’s built on the excellent Server 2008 R2 base means it benefits from the stability and performance improvements this brings. I’ve not found any issues with TFS on WHS 2011 and don’t expect to (although it’s not supported, so you install it at your own risk). I think that WHS 2011 will make an even better TFS server than WHS V1.

Other than the decision of where to install TFS in the absence of DE, the installation instructions are the same as in my original post. After installation I recommend installing the TFS Power Tools and then configuring TFS backups as described in these posts: Backing Up TFS 2010 Using PowerShell: Part 1, Backing Up TFS 2010 Using PowerShell: Part 2 and Backing up TFS 2010 with new Power Tools Backup Plan.

XML Error in TFS Power Tools Backup Plan

Whilst checking the Windows Event Log on my server I found this worrying error being reported by the Team Foundation Server Power Tools Backup & Restore Tool:

Error: Tfpt Backup Restore : There is an error in XML document (499, 100).

I posted an article about using this tool to back up your TFS server in September, and it has been successfully backing up daily since then. The above error has, it seems, been reported after every daily backup for the past few weeks. It immediately follows an information event from MSSQL$SQLEXPRESS stating that the “Database has been backed up”. To investigate further I opened the TFS Admin Console on the server and clicked “Take Full Backup Now”, whereupon I got the same error interactively – well, at least it’s consistent.

[Screenshot: TFS backup reporting “There is an error in XML document”]

From here you can open the log, and in it I can see it’s reporting that the BackupSets.xml file in the root of the backup target location is invalid:

[Info   @20:30:15.977] Backup Set Name Tfs_DefaultCollection database Backup
[Info   @20:30:15.977] Backup Set Description Tfs_DefaultCollection database – Full Backup
[Info   @20:30:15.977] Adding database Tfs_DefaultCollection to the backupset
[Error  @20:30:16.352] System.InvalidOperationException: There is an error in XML document (499, 100). —> System.Xml.XmlException: There is an unclosed literal string. Line 499, position 100.

On opening the file there appears to be nothing wrong with the XML as such, but in my case it didn’t seem complete, as the last successful backup only detailed one entry, not the usual two (*.bak and *.trn files). This file seems to be a record of the backups rather than being core to the actual SQL backup process, so I deleted/renamed the BackupSets.xml file and ran another full backup. This time it worked fine and a new BackupSets.xml file was created. It’s now running fine again.
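
If you want to script the workaround, renaming the file is a one-liner in PowerShell (the backup target path below is just an example):

Rename-Item "D:\TfsBackups\BackupSets.xml" "BackupSets.xml.old"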

Installing Team Foundation Server 2010 on Windows Home Server

Last October I posted about the fact that Microsoft’s Team Foundation Server 2010 was going to ship with a “Basic” configuration that is more lightweight and seen as more of a Visual SourceSafe replacement. In that post I also pointed out that it seemed possible to install this version on a Windows Home Server (WHS) and that it would be something I’d try. Well, eventually I have got round to doing it and of course I’ve documented my approach. The benefits of TFS are immense but out of the scope of this article; instead I’m covering the installation process.

It may have been technically possible to jump through some hoops and install TFS 2008 on your WHS box, but the pre-requisites were heavy (including SharePoint and full SQL Server) and it was always seen as a complicated process. With the ‘Basic’ configuration of TFS 2010 you can ignore SharePoint and SQL Reporting Services, and it installs onto SQL Server 2008 Express (which it also installs for you).

A point worth mentioning is that TFS 2010 will not trash your existing web sites (important for Windows Home Servers) but will instead install its own site alongside any existing sites on the server. It is also possible to connect to TFS remotely via your WHS Remote Access domain name, which is a very nifty feature.

Pre-Installation Considerations:

Basic Configuration: Whilst it’s possible to install the non-basic configurations I don’t see the point unless there is something specific that you want to take advantage of that’s not included in the basic configuration. For a light, easy installation and a nice minimal processing overhead on your WHS the ‘basic’ configuration is ideal. It also contains all the core features such as build automation, work item tracking and source control.

Location of SQL Server Data Files: By default the SQL Server data files are installed to the system drive under the Program Files location for SQL Server Express. As you add more and more content to TFS these files will naturally grow, so you need to consider up front whether you have the disk space to support this. Windows Home Server by default creates only a 20GB system partition, which should have adequate space for the data files to grow, but if you have installed a lot of applications to your system drive this could be an issue. To store the data files in an alternative location it is easier to install SQL Express 2008 manually prior to installing TFS 2010 (and configure the data file locations via the SQL setup); the TFS installation will then just reuse the SQL Express instance already installed. Alternatively you could install the data files on the WHS ‘Data’ drive (D drive), but personally I prefer to leave that drive well alone and let WHS’s ‘Drive Extender’ manage all the data in its Storage Pool. I did consider installing a new drive in my server, not added to the storage pool, which would then be used for the data files. In the end I decided I had adequate space on the system drive and I could move the data files at a future point if required.

Build Controller: Part of the TFS installation/configuration includes the decision to install a Build Controller on the server. If you plan on running automated TFS builds then you’ll need to install this. I would strongly recommend you use the Team Build features of TFS as they are one of its key features but if you don’t want to automate builds or want to minimise the services running on your server then just skip that part.

Support: Whilst WHS is really Windows Server 2003 under the covers and TFS 2010 clearly supports Windows Server 2003, installing it on a Windows Home Server is not supported by Microsoft (or anyone else) and you do so at your own risk.

Steps:

You’ll need to Remote Desktop onto your server to run the installation interactively. The below screenshots show the key stages of the installation process. Luckily it’s mostly a matter of clicking ‘Next’:

[Screenshots: TFS 2010 setup wizard – initial steps]

Choose to install Team Foundation Server and the Build Service (if you plan to run automated builds on this server too). Don’t install the Server Proxy:

[Screenshots: feature selection – Team Foundation Server and Build Service]

After much processing and a reboot it’s installed:

[Screenshots: installation progress and completion]

Once installed, it then launches the Configuration Centre and it’s at this point that you can choose the ‘Basic’ configuration. It will then install SQL Server Express or point to an existing SQL instance:

[Screenshots: Configuration Centre – ‘Basic’ configuration and SQL Server Express setup]

Next you’re asked to configure the Build Service details:

[Screenshots: Build Service configuration]

That’s it. You can launch the ‘Team Foundation Server Administration Console’ from the Start Menu which is a neat new tool enabling you to manage TFS from the server itself:

[Screenshot: Team Foundation Server Administration Console]

Next Steps:

Connect via Visual Studio: Now it’s all up and running you can fire up Visual Studio and test that you can connect to the new Team Foundation Server by adding it to the list of servers in Team Explorer. You should be able to create new Team Projects, check in code and run builds just as you would on any other TFS server.

Enable Remote Access: To enable remote access to TFS so you can access your source code from remote locations you need to copy your client certificate from the Remote Access site and then add it to the TFS site. For instructions on how to do this see this excellent post by Jason Neave.

Backups: Call me paranoid, but before I add anything important to this TFS server I want an automated backup procedure in place. Backing up TFS is really ‘just’ a matter of backing up the SQL Server databases on which TFS sits, as they are the only source of its data. I say ‘just’ because it’s not as easy as it should be. A key point to note is that, as TFS uses numerous databases, it is critical to back them all up at the same time so you don’t end up restoring unsynchronised databases. I have created an automated backup procedure using Windows PowerShell that I will share in a future post.
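
As a flavour of what that looks like, here is a minimal sketch (not my full procedure) that assumes the default SQLEXPRESS instance and the standard TFS 2010 ‘basic’ database names; the destination path is an example:

# back up each TFS database to the same folder - database names and path are assumptions
$databases = "Tfs_Configuration", "Tfs_DefaultCollection"
foreach ($db in $databases) {
    sqlcmd -S ".\SQLEXPRESS" -E -Q "BACKUP DATABASE [$db] TO DISK = 'D:\TfsBackups\$db.bak' WITH INIT"
}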

Conclusion:

It’s exciting to see a combination of two excellent products working together and I think this is another great way to add value to your Windows Home Server whilst enabling you to experiment with a quality ALM tool.

UPDATE: For WHS 2011 users check out my post on Installing TFS 2010 on Windows Home Server 2011.

Free Icons For Your Application Within Visual Studio

If, like me, you’re always on the lookout for neat little icons and images to add to your shiny .Net applications, then you might like to know that Visual Studio includes an image library for you to use. The library is located on your hard drive within the Visual Studio installation at:

For Visual Studio 2010: 
C:\Program Files\Microsoft Visual Studio 10.0\Common7\VS2010ImageLibrary\1033\VS2010ImageLibrary.zip

For Visual Studio 2008:
C:\Program Files\Microsoft Visual Studio 9.0\Common7\VS2008ImageLibrary\1033\VS2008ImageLibrary.zip
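
Once you have extracted the zip somewhere (e.g. with Windows Explorer), a quick way to hunt for a suitable icon is to search the extracted folder by keyword from PowerShell (the path and keyword are examples):

Get-ChildItem "C:\Temp\VS2010ImageLibrary" -Recurse -Include *.png, *.ico | Where-Object { $_.Name -match "save" }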

For more information check out Weston Hutchins’ blog post on the Visual Studio Blog.

Team Foundation Server ‘Basic’ Edition

Many development teams still regularly use Visual SourceSafe for their source control, which can stimulate heated debates between those that have used it for many years without problems and those that have suffered some pain with it. Regardless of this debate there is no denying that SourceSafe is coming to the end of its useful life. It’s old technology and will come out of support in 2011, although a compatibility update is expected with Visual Studio 2010.

When Microsoft developed its replacement, Team Foundation Server (TFS), it focused on providing more than just a source control product: a whole development lifecycle management system. Regardless of the benefits of TFS (and there are many), it has been avoided by many small development teams due to its high costs and complex installation/management. Many have instead moved to alternative source control products such as the free Subversion, leading to a decline in Microsoft’s market share in this area.

So, what’s changed? Microsoft now plan to provide a ‘Basic’ version of TFS 2010 when it ships next year. I think that this is a huge step forward for TFS and its take-up across the development community. Brian Harry details the ‘Basic’ version in this blog post. This version of TFS will have a fast and easy installation and provides many more implementation options for the product. It will install on SQL Express 2008 and can even be installed on client Windows operating systems. This really is targeting the current SourceSafe users and provides a low-cost (perhaps even free) entry to the benefits of TFS. You might think that this would only provide basic TFS functionality, but not so. Included in the basic version are source control, bug tracking and build automation, which provide the bulk of the key TFS features. The screenshots also suggest that Web Access is included. What’s not included is Reporting Services and SharePoint, which are arguably more geared towards the larger development teams anyway. The key benefits of TFS come from the work item interaction and ‘continuous integration’ friendly automated build features, and these are included.

The move to TFS for a SourceSafe (or any other simple source control system) team will provide many benefits and this version should enable those benefits at a minimal cost. There are no details on pricing but personally I would expect it to be included in the Team Developer MSDN subscription.

SourceSafe is also used by hobbyist and professional developers to manage their own personal source code and I see this version of TFS being ideal for this. The ability to install on a client OS is a major factor to these users. There is also a comment on Brian’s blog post about running TFS basic on Windows Home Server which is something I am keen to try out.

By allowing more people to access this great product it will greatly contribute to the TFS community and its take-up globally. If you can’t wait until TFS 2010 is released and would like to know more about TFS versus SourceSafe in terms of pricing then check out Eric Nelson’s post here.

SCSF Recipes Not Working – Check Solution File

I recently opened an SCSF project that was having build problems, and I found that once the build problems had been corrected the GAT recipes were not working. An error was reported when I tried to run the Add Module recipe:

Microsoft.Practices.RecipeFramework.RecipeExecutionException: An exception occurred during the binding of reference or execution of recipe CreateBusinessModuleCS. Error was: The following arguments are required and don’t have values: CommonProject. Can’t continue execution..

Eventually I found that the required attributes were not set in the solution file (.sln), so the solution was unaware of the SCSF responsibilities it had. I checked this link…

http://www.ideablade.com/forum/forum_posts.asp?TID=357

It details the solution data required (example below), which I confirmed with other SCSF solutions I had.

GlobalSection(ExtensibilityGlobals) = postSolution
	RootNamespace = MySolutionName.MyProjectName
	CommonProjectGuid = 7432c860-3226-49fa-a9f4-2dd27d1229b8
	ShellProjectGuid = 6ee16e85-57a7-4a00-9018-43eca17194cb
EndGlobalSection

I updated the GUIDs to be the ones from my solution (i.e. the GUIDs of the Infrastructure.Interface and Shell projects) and that sorted the problem.
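
To find the right GUIDs you can pull the ProjectGuid values straight out of the relevant project files, for example with PowerShell (the project file paths are assumptions based on a typical SCSF layout):

Select-String -Path ".\Infrastructure.Interface\Infrastructure.Interface.csproj", ".\Shell\Shell.csproj" -Pattern "<ProjectGuid>"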

This could be due to a new solution file being checked in by the developer who created the original SCSF project, rather than the original solution file that was generated by the SCSF.