Java keystore certificate missing after Jenkins upgrade

Following a tricky upgrade to a Jenkins server it was found that the server was no longer able to communicate with the Git server, although SSH access with private keys was still working correctly. On investigating the Jenkins logs (found at http://YOUR-JENKINS-SERVER-URL/log/all) this error was found to be the problem:

Failed to login with creds MYCREDENTIALNAMEHERE
sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.provider.certpath.SunCertPathBuilder.build(Unknown Source)
at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(Unknown Source)
at java.security.cert.CertPathBuilder.build(Unknown Source)
Caused: sun.security.validator.ValidatorException: PKIX path building failed
at sun.security.validator.PKIXValidator.doBuild(Unknown Source)
at sun.security.validator.PKIXValidator.engineValidate(Unknown Source)
at sun.security.validator.Validator.validate(Unknown Source)
at sun.security.ssl.X509TrustManagerImpl.validate(Unknown Source)
at sun.security.ssl.X509TrustManagerImpl.checkTrusted(Unknown Source)
at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(Unknown Source)
Caused: javax.net.ssl.SSLHandshakeException

The problem here is that the Java runtime is not able to validate the connection as it doesn't have the relevant SSL certificate. To resolve this I needed to download the certificate from the site and add it to the Java keystore.

On the server, download the web certificate for the site you are trying to connect to (a GitHub server in my instance). Next, find which Java JRE your Jenkins is using. This can be found in your Jenkins.xml file under the executable element; for example %BASE%\jre\bin\java means C:\Jenkins\jre\bin\java if you have installed Jenkins to C:\Jenkins on Windows. Now find the cacerts file inside that Java runtime (for example C:\Jenkins\jre\lib\security\cacerts), as this is the keystore where the certificate needs to be installed. To add the certificate to the Java keystore for that Java install, run the below in a terminal:

keytool -import -alias INSERT_ALIAS_NAME_HERE -keystore PATH_TO_JRE_HERE\security\cacerts -file CERTIFICATE_FILE_NAME

for example…

keytool -import -alias ExampleName -keystore C:\Jenkins\jre\lib\security\cacerts -file exampleCertFile.cer

If you are asked for a password then it will be 'changeit' by default.

Now that the Java runtime Jenkins is using has the certificate installed, it will be able to make an SSL connection to the Git server.
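If you don't already have the certificate file, one way to grab it from the server (assuming you have OpenSSL available, and substituting your own Git server hostname) is:

openssl s_client -connect YOUR-GIT-SERVER:443 -showcerts

As a quick sanity check afterwards you can confirm the entry is now present in the keystore (using the example alias and keystore path from above):

keytool -list -keystore C:\Jenkins\jre\lib\security\cacerts -alias ExampleName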

Build warnings & .Net Standard dependencies

Having recently migrated a Windows Azure hosted .Net Framework ASP.net MVC 5 web application over to ASP.net Core (v2.2) I hit a few issues, so I'm documenting them here for my future self and anyone else who might find it useful. The application was a multi-functional ASP.net MVC site with an Azure Blob Storage Service back-end that I re-wrote in ASP.Net Core with a mixture of both MVC and Razor Pages. The new site was .Net Core 2.2 and referenced supporting projects built to .Net Standard 2.0.

The Problem

The application ran fine locally, and as I was re-purposing the same Windows Azure Web Service instance I re-configured it to support .Net Core via the Azure Portal. It's not clear if this step is actually required but it makes sense to set it correctly (see screenshot below).

Next I deployed my Web App via Web Deploy, and then hit the homepage but immediately received a long list of errors such as the one below:

The type 'Attribute' is defined in an assembly that is not referenced. 
You must add a reference to assembly
'System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e'. 
[assembly: global::Microsoft.AspNetCore.Razor.Hosting.RazorCompiledItemAttribute(typeof(AspNetCore.Views_Home_Index), @"mvc.1.0.view", @"/Views/Home/Index.cshtml")] ……etc……

It looked like .Net itself was not being found on the target machine, and so after checking the server settings and server logs in the Azure Portal I double checked my project references to confirm I was targeting .Net Core and .Net Standard 2.0 and that’s when I found a build Warning that I had previously missed:

[NU1701] Package 'Microsoft.Azure.KeyVault.Core 1.0.0' was restored using '.NETFramework,Version=v4.6.1' instead of the project target framework '.NETStandard,Version=v2.0'. 
This package may not be fully compatible with your project

One of my dependent projects in the solution was referencing the Azure SDK assembly Microsoft.Azure.Storage.Blob (via NuGet package) which in turn was referencing a version of Microsoft.Azure.KeyVault.Core that was not compatible with .NetStandard 2.0. This solution ran fine locally (as the dependent framework is installed on my machine) but when the application was deployed to a server without the .Net Framework Version=v4.6.1 binaries the application failed to load. More on this problem can be found in this GitHub issue.

The Solution

The solution was to reference a later version of the Microsoft.Azure.KeyVault.Core assembly that does support .NetStandard. As my project had no direct reference to Microsoft.Azure.KeyVault.Core (it was a dependency of Microsoft.Azure.Storage.Blob) there was no package reference to update, so instead I added a direct dependency on Microsoft.Azure.KeyVault.Core version 3.0.3 in my project file. Despite reportedly working for other people, this didn't directly resolve my issue, so at this point I decided to update all my Azure SDK references, which took Microsoft.Azure.Storage.Blob to v11.1.3, a version that references a .NetStandard 2.0 compliant version of the KeyVault.Core dependency:

  <ItemGroup>
    <PackageReference Include="Microsoft.Azure.Storage.Blob" Version="11.1.3" />
  </ItemGroup>
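For reference, the earlier workaround of pinning the transitive dependency directly (which reportedly works for some people) would simply be an extra package reference in the project file:

  <ItemGroup>
    <PackageReference Include="Microsoft.Azure.KeyVault.Core" Version="3.0.3" />
  </ItemGroup>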

After upgrading (and tweaking a few namespaces that had changed in the newer package versions) I deployed the project successfully without error. Issue resolved!

The lesson here, I think, is firstly don't ignore build warnings (we all do sometimes – right?) and secondly be careful with dependencies, and their children, not being .NetStandard compliant. This may be less of a problem in the future as .Net Core and .Net Standard get more widespread adoption, but for now it's one to watch out for.

RunAs Issue? Check Secondary Logon Service.

On Windows, if you are having problems trying to perform an action as a different user via the RunAs command then it might be due to the 'Secondary Logon Service' not running. I recently had this problem on Windows Server and after some investigation found that the 'Secondary Logon Service' had been disabled; starting the service resolved the issue. By default it is set to 'Manual'.

The error you get from the RunAs command may vary depending on OS version but will report a problem running a process or service. This is the error I get on Windows 10:

1058: The service cannot be started, either because it is disabled or because it has no enabled devices associated with it.
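To check and start the service from an elevated PowerShell prompt you could use something like the below (a quick sketch; the underlying service name for 'Secondary Logon' is seclogon):

# Check the current state of the Secondary Logon service
Get-Service seclogon

# Set it back to Manual and start it if it has been disabled
Set-Service seclogon -StartupType Manual
Start-Service seclogon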

Vertical Monitors

Whilst I've been using a multiple monitor setup at home and work for many years, only for the past year have I been using one of the two in vertical/portrait mode, and I'm hooked.

The benefits of using multiple displays over a single display have been researched numerous times over the years. Estimates of the productivity gain vary from 20% to 42% in the research I have seen, but even at 20% the second monitor soon pays for itself. Jon Peddie Research (JPR) in their 2017 research found a 42% gain in productivity based on the increased number of pixels available via multiple screens. And whilst some of this gain can be had from just larger screens as opposed to more screens, there are certainly ease of use advantages to multiple screens, and several studies sponsored by Dell have found the user experience to be more pleasurable with multiple screens. I find using several screens an easier way of arranging windows than trying to arrange them on one large monitor.

“We found that users of multiple monitors have an average expected productivity increase of 42%,” said Dr. Jon Peddie, President of Jon Peddie Research.

A University of Utah study showed that a second monitor can save an employee 2.5 hours a day if they use it all day for data entry.

“The more you can see, the more you can do.” Jon Peddie, 1998

But what about having monitors in portrait mode? Well, the Jon Peddie research also stated that additional benefit could be had by reducing scrolling for information.

“Having one of the monitors in portrait mode adds to productivity by eliminating the need to scroll.”

After taking a week or so to get used to the portrait mode I quickly found its benefits. It's ideal for viewing anything that requires vertical scrolling, e.g. web pages, code, log files, Jira backlogs etc. In a quick test of a code file in VS Code I could see 62 lines of code on my 24-inch horizontal monitor, but 120 lines when the file was moved to the vertical monitor. Not only do you reduce scrolling but there are advantages in being able to see more of a document in one go, helping you visualise the problem.

On a side note the Xerox Alto, which was the first computer designed to support a GUI based OS, had a vertical screen to represent a document layout.

twomonitors_small

Over the past decade or more displays have moved to widescreen format and many applications have tailored their UIs to fit this trend. This means that having all portrait monitors can quickly lead to some frustrations and the need to horizontally scroll, thus reducing the additional productivity gains. This is why I have found that (for me) the ideal setup is one landscape display and one portrait display (as in the image above).

With this formation you can use the display that is best suited to the task at hand, and so you have the best of both worlds. Combine this with a few simple keyboard shortcuts to move things between monitors and you'll soon find yourself loving the layout. In Windows, using the Windows key and the cursor keys together can quickly move and dock windows. Linux distros vary.

Since I made the switch to one vertical at work I have had numerous colleagues see the benefits and be converted to this layout, so if you have a monitor that is capable of being turned vertically then I recommend you give it a go and see how you get on. You might get hooked!

 

Create New MSTest Projects for Pre .Net 4.5 in Visual Studio 2017

This post outlines the steps to create a new unit test project in Visual Studio 2017 using MS Test V1 and that targets .Net Frameworks prior to .Net 4.5.

Visual Studio 2017 onwards only provides new unit test project templates for MS Test V2 and .Net 4.5 onwards. This is fine for new applications or ones targeting a recent .Net Framework version, but what if you have an existing solution targeting an older .Net version? Below shows the Unit Test Project template available for .Net 4.5, but as you can see in the second screenshot, for .Net 3.5 it's not available.

NewTestProj_45

NewTestProj_35

If you want to create a new unit test project targeting .Net 3.5/4 for example then follow the steps below:

Create a new MS Test V2 project targeting .Net Framework 4.5 as in the first screenshot above (i.e. File > New Project > Test > Unit Test Project targeting .Net 4.5).

Once it's created, change the project to target your earlier .Net Framework (e.g. .Net 3.5). This is done via the Project Properties page. Click Yes when prompted and Visual Studio will reload the project.

NewTestProj_ChgFmk

Once it reloads the project will complain about some references which is fine as we’re now going to remove the MS Test V2 assemblies.

NewTestProj_Errors

Now remove the two project references to test assemblies.

NewTestProj_RemoveRefs

Then add a reference to the MSTest v1 assembly, Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll.  This should be under Extensions in the Add Reference dialog. Alternatively you can browse to them on your hard drive at the following locations:

For pre .Net 4 projects : C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\ReferenceAssemblies\v2.0

For post .Net 4 projects: C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\ReferenceAssemblies\v4.0

If you are not running Visual Studio Enterprise, then swap Enterprise in the path for Community etc.
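If you prefer to edit the (old-style) project file directly, the added reference ends up looking roughly like this, assuming the post-.Net 4 path above (adjust the HintPath for your edition and framework version):

<Reference Include="Microsoft.VisualStudio.QualityTools.UnitTestFramework">
  <HintPath>C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\ReferenceAssemblies\v4.0\Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll</HintPath>
</Reference>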

NewTestProj_AddRefs

Now rebuild the project and you’re all done.

Find assemblies loaded during debugging in Visual Studio

Sometimes you may get the following error when you are debugging a .Net app in Visual Studio:

“The breakpoint will not currently be hit. No symbols have been loaded for this document.”

Or you may have issues whereby the wrong code version appears to be loading at run time or perhaps when debugging you get an error saying a referenced component cannot be located.

All these issues stem from not being able to see which components are actually being loaded during debugging. If only there was a view in Visual Studio that gave you that info… well, this is Visual Studio, so they've already thought of that, and it's called the Modules view.

During debugging of your application, from the menu go to: Debug > Windows > Modules

Mod1

From this really useful view you can see each component that’s been loaded, the file path, the symbol file location, version information and more. This will show you if a component from the GAC has been loaded instead of your local file version, for example. It also enables you to find and load Symbol files for components where they have not been loaded automatically.

For information on the full functionality of this view check out the documentation here.

Platform Targeting in .Net

If you see one of the errors below within your .Net application it is likely as a result of your assembly having been compiled with the wrong target architecture flag set.

1) In this instance the assembly was accidentally compiled for x86 only and run within IIS in a 64bit process:

Unhandled Exception: System.BadImageFormatException: Could not load file or assembly “AssemblyName Version=1.0.0.0, Culture=neutral” or one of its dependencies. An attempt was made to load a program with an incorrect format.

(For info IIS has an Application Pool setting under Advanced  Settings that enables you to run the App Pool as a 32 bit process instead of 64bit. This is off by default.)

2) Here a .Net application targeting a 64 bit processor was run on a 32bit system:

This version of ConsoleApplication1.exe is not compatible with the version of Windows you’re running. Check your computer’s system information to see whether you need a x86 (32-bit) or x64 (64-bit) version of the program, and then contact the software publisher.

If you are looking for a quick fix, then just recompile your assembly with the “Any CPU” option as your platform target. See project properties, build tab for this setting.  If you want more information and explanation then read on.

BlgPlatformTarget1

When compiling .Net assemblies in Visual Studio (or via the command line) there are a few options for optimising for certain target platforms. If you know that your assembly needs to be deployed only to a 32bit environment (or it needs to target a specific processor instruction set) then you can optimise the output of the MSIL code (what the compiler produces ready for JIT compilation at run time) for that platform. This is not normally required but perhaps, for example, you have to reference unmanaged code that targets a specific processor architecture.

Essentially this sets a flag inside the assembly metadata that is read by the CLR. If this is set incorrectly then it can result in the above errors, but also other odd situations. For example, if you compile your app to target "x86" and then reference an assembly that is targeting the "x64" platform then you will see an error at runtime due to this mismatch (BadImageFormatException). Running an "x86" application will still work on 64 bit Windows, but it will not run natively as 64bit; instead it will run under the WOW64 emulation mode, which enables x86 execution under 64 bit (with a performance overhead), which may or may not be a valid scenario in your case.

If you want to reproduce the situation try creating a new console application and in the build properties tab set Platform Target to “x86”. Then create a new Class Library project, set a reference to it in the Console Application, and then in the build properties tab set it to target the “x64” platform. Build and run the application which will show the above BadImageFormatException.

The target platform for your project is set in the Project Properties tab in Visual Studio, under Build (see screenshot above). If you are compiling via the command line you use the /platform switch.

“AnyCPU” became the default value from VS2010 onwards. “AnyCPU” used up to .Net 4.0 means that if the process runs on a 32-bit system, it runs as a 32-bit process and the MSIL is compiled to x86 machine code; if the process runs on a 64-bit system, it runs as a 64-bit process and the MSIL is compiled to x64 machine code. Whilst this enables more compatibility with multiple target machines, it can lead to confusion or unexpected results when you consider that Windows duplicates system DLLs, configuration and registry views for 32bit and 64bit processes. So since .Net 4.5 (VS2012) there is a new default subtype of “AnyCPU” called “Any CPU 32-bit preferred”, which basically follows the above rules except that when the process runs on a 64-bit system it will run as a 32 bit process (not a 64 bit one as before) and its MSIL will be compiled to x86 code, not x64. This change essentially forces your process to run under 32bit on a 64bit machine unless you untick the option and turn off the default setting. This setting can be seen on the Project properties Build tab as “Prefer 32-bit”.

BlgPlatformTarget2
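If you prefer setting this outside of the Visual Studio UI, the same options map to MSBuild properties in the project file; a minimal sketch (these are standard MSBuild property names, but check them against your own project) would be:

<PropertyGroup>
  <PlatformTarget>AnyCPU</PlatformTarget>
  <!-- Turns off the .Net 4.5+ "Prefer 32-bit" default for AnyCPU executables -->
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>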

It is worth noting that you may see a confusing “Multiple Environments” option in Visual Studio which can be automatically added  after migrating solutions from older versions of Visual Studio (I believe it has been removed as a new option in VS2015 onwards but can hang around in older solutions). Use the Configuration Manager tab to check the setting for each assembly. Most developers will want to target  “Any CPU” which supports multiple target environments. If you are getting the above errors then use the steps below to check the assembly and if incorrect then try recompiling with “Any CPU” instead.

How to confirm a target platform for a given assembly:

So how do you confirm which processor architecture an assembly was built for? Well there are a few ways:

1) Using .Net reflection via a simple Powershell command:

[reflection.assemblyname]::GetAssemblyName("${pwd}\MyClassLibrary1.dll") | format-list

Running this command pointing to the assembly you want to check will produce output like this below (results truncated):

An assembly compiled to target AnyCPU:

Name                  : ClassLibrary2
Version               : 1.0.0.0
CodeBase              : file:///C:/TempCode/crap/pttest/x86/ClassLibrary2.dll
ProcessorArchitecture : MSIL
FullName              : ClassLibrary2, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null

Same assembly but now compiled to target x86 :

Name                  : ClassLibrary2
Version               : 1.0.0.0
CodeBase              : file:///C:/TempCode/crap/pttest/x86/ClassLibrary2.dll
ProcessorArchitecture : X86
FullName              : ClassLibrary2, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null

Same assembly again but now compiled to target x64 :

Name                  : ClassLibrary2
Version               : 1.0.0.0
CodeBase              : file:///C:/TempCode/crap/pttest/x86/ClassLibrary2.dll
ProcessorArchitecture : Amd64
FullName              : ClassLibrary2, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null

 

2) You can also see this information from a de-compiler tool such as dotPeek. The screenshot below shows x86, x64 and AnyCPU targeted assemblies.
BlgPlatformTarget4_dotPeek

3) Use the CorFlags Conversion Tool

The CorFlags Conversion Tool (CorFlags.exe) is installed with Visual Studio and easily accessed by the VS Command Line. It enables reading and editing of flags for assemblies.

CorFlags  assemblyName

Assuming you have the .Net 4 version of the tool and are inspecting a .Net 4 or above assembly, you'll see something like this. Older versions of the tool do not have the 32BITREQ/32BITPREF flags, as per the change for .Net 4 discussed above:

Microsoft (R) .NET Framework CorFlags Conversion Tool.  Version  4.6.1055.0
Copyright (c) Microsoft Corporation.  All rights reserved.

Version   : v4.0.30319
CLR Header: 2.5
PE        : PE32
CorFlags  : 0x1
ILONLY    : 1
32BITREQ  : 0
32BITPREF : 0
Signed    : 0

To interpret this output see the table below and check the PE value (PE32+ only on 64bit) and the 32bit Required/Preferred flags. It is also possible to update these flags using this tool.
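For example (using a hypothetical assembly name, and noting that a strong-named assembly would need re-signing or the /Force switch), the 32-bit required flag can be toggled like this with the .Net 4 version of the tool:

CorFlags MyAssembly.exe /32BITREQ+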

Summary

Below is a simple table of results to show the impact of Platform Target on the compiled assembly and how it is run on a 32/64bit OS. As you can see, the 32 bit preferred flag has resulted in an AnyCPU assembly being run as 32bit on a 64bit OS. The table also shows the values you get when you use the techniques above for determining the target platform for a given assembly.

 

| Platform Target in Visual Studio | PowerShell Result | dotPeek Result | CorFlags 32BITREQ | CorFlags 32BITPREF | CorFlags PE | Runs on 32bit OS as | Runs on 64bit OS as |
| --- | --- | --- | --- | --- | --- | --- | --- |
| AnyCPU (pre .Net 4) | MSIL | MSIL | 0 | 0 | PE32 | 32 bit process | 64 bit process |
| AnyCPU (.Net 4+, 32 bit NOT preferred) | MSIL | MSIL | 0 | 0 | PE32 | 32 bit process | 64 bit process |
| AnyCPU (.Net 4+, 32 bit preferred - the default) | MSIL | x86 | 0 | 1 | PE32 | 32 bit process | 32 bit process (under WOW64) |
| x86 | x86 | x86 | 1 | 0 | PE32 | 32 bit process | 32 bit process (under WOW64) |
| x64 | Amd64 | x64 | 0 | 0 | PE32+ | ERROR | 64 bit process |

In summary, there are a few simple rules to be aware of when using AnyCPU and the 32bit preferred flag, but essentially AnyCPU will enable the most compatibility in most cases.

Speed up a slow JSF XHTML editing experience in Eclipse or IBM RAD/RSA.

If you find yourself doing some JSF (Java Server Faces) development within the Eclipse, IBM RAD (Rapid Application Developer) or IBM RSA (Rational Software Architect) IDEs you may find that the JSF editor can run slowly with some lag. This seems to be a particular problem on RAM-starved machines and/or older versions of the Eclipse/RAD IDEs. The problem (which can be intermittent) is very frustrating and can result in whole seconds going by after typing before your changes appear in the editor. It seems that the JSF code validator takes too long to re-validate the edited JSF code file. At one point this got so bad for our team that many would revert to making JSF changes in a text editor and then copy/paste the final code into the IDE.

java_small

Thankfully there is a workaround and in order that I don’t forget if I hit this problem again I’m posting it here. The workaround (although sadly not a fix) is to use a different “editor” within the same IDE. If you right click the JSF file you want to edit and use the pop-up menu to choose to open it with the XML Editor instead of the XHTML Editor then you will find a much faster experience. Whilst this does remove some of the JSF/XHTML specific validations it provides support for tags etc and will perform faster.

Should you wish to always use the XML Editor to edit XHTML files you can make this global change via the preferences. Go to General > Editors > File Associations > File Types list > select XHTML extension > click Add > Add XML Editor. Then in the associated editors list select the XML Editor and click the ‘Default’ button – thus making XML Editor the default for all XHTML files. Of course once this is done you can still click on individual XHTML files and right click to open in the original XHTML editor should you want to temporarily switch back for an individual file.

Hopefully this will prevent you pulling your hair out in frustration when editing XHTML files.

Using PowerShell for your VS Code Integrated Terminal

Microsoft's superb Visual Studio Code editor has an integrated terminal which is accessed via the 'View' menu or via the Ctrl+` shortcut. On Windows, by default the terminal used is the Windows Command Prompt (cmd.exe), however you can easily configure VS Code to use a different terminal such as Windows PowerShell.

Open the User Settings config file (the ‘settings.json’ file accessed via File > Preferences > User Settings) and modify the setting for which terminal to run on Windows:

The default setting is:

"terminal.integrated.shell.windows": "C:\\WINDOWS\\system32\\cmd.exe",

To use the PowerShell terminal instead add this to your settings.json user settings file:

"terminal.integrated.shell.windows": "C:\\Windows\\sysnative\\WindowsPowerShell\\v1.0\\Powershell.exe",
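For context, the relevant fragment of the settings.json file would then look something like this:

{
    "terminal.integrated.shell.windows": "C:\\Windows\\sysnative\\WindowsPowerShell\\v1.0\\Powershell.exe"
}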

Now PowerShell will be used instead of cmd.exe. Currently only one terminal can be configured in VS Code, so you can't have both PowerShell and cmd.exe; you'll have to choose your favourite for now. You can however open multiple instances of the terminal via the drop down on the terminal window.

vscodeposhterminal2

Finally whilst on the subject of VS Code and PowerShell I recommend installing Microsoft’s PowerShell Extension which lets you code and debug PowerShell scripts directly within VS Code (and benefit from its features, e.g. git integration etc).

Interactive file reading with Powershell

Sometimes you want to see the contents of a text file whilst it is still being updated; a common example is where you are outputting to a log file and need to see the output interactively without having to keep opening the file to check on progress, or to see if a job has completed.

Luckily there is the very useful Get-Content Powershell Command.

This can take a "-wait" parameter that will re-read the file every second or so and check for updates, displaying them in the console (until you end the command with Ctrl+C as usual).

So for example the command below will read the file constantly until you tell it to stop:

Get-Content C:\Logs\Log.txt -wait

In addition there is the "-Tail" parameter, which is the PowerShell equivalent of the Unix tail command. This will read only the last few lines of the file and not the whole file. It can be used on its own like this:

Get-Content C:\Logs\Log.txt -tail 5

…which displays the last 5 lines of the file. Or you can combine -wait and -tail together to constantly read the last (specified) number of lines:

Get-Content C:\Logs\Log.txt -wait -tail 5
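Get-Content also plays nicely with the rest of the PowerShell pipeline; for example (the log path and search term here are just placeholders) you could filter the live output for error lines:

Get-Content C:\Logs\Log.txt -wait -tail 20 | Select-String "ERROR"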

For more information, see Get-Content Powershell Command on MSDN.