Enabling Hibernate on Ubuntu

Personally I prefer to hibernate all my machines instead of shutting them down, as I find it more efficient to carry on from where I left off than to start afresh. One annoyance with my Ubuntu installation is that Hibernate is not a first-class function and needs some tweaking to get it to work. For my own record this post documents what I did to enable Hibernate on my Ubuntu 20.04 install. All credit to the original post on AskUbuntu.com.

For the actual steps check out the AskUbuntu post, but essentially you need to find the swap partition (assuming you have one), grab its UUID, and then edit the GRUB config to add resume=UUID=XXXXX-XXX-XXXX-XXXX-YYYYYYYYYY to the GRUB_CMDLINE_LINUX_DEFAULT entry. Finally apply the change with "sudo update-grub".
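
As a rough sketch of those steps in the terminal (the UUID below is a placeholder for your own swap partition's UUID; on Ubuntu the GRUB config lives in /etc/default/grub):

# find the UUID of the swap partition
sudo blkid | grep -i swap

# edit the GRUB config and add the resume parameter to GRUB_CMDLINE_LINUX_DEFAULT, e.g.
# GRUB_CMDLINE_LINUX_DEFAULT="quiet splash resume=UUID=XXXXX-XXX-XXXX-XXXX-YYYYYYYYYY"
sudo nano /etc/default/grub

# apply the change
sudo update-grub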

Once set up you can enter hibernation from the terminal with:

sudo systemctl hibernate

Now you may want to add a shortcut to your desktop or to a hot key to run this command without entering the terminal.

For a desktop shortcut check out this post, which explains how to create a *.desktop file that will launch the script. Create a text file named e.g. HibernateNow.desktop and add content like the below, assuming you have created a bash script called "hibernatescript.sh" containing the above "sudo systemctl hibernate" command:

[Desktop Entry]
Type=Application
Terminal=true
Name=HibernateNow
Icon=utilities-terminal
Exec=gnome-terminal -e "bash -c './hibernatescript.sh;$SHELL'"
Categories=Application;
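
For completeness, a minimal sketch of what hibernatescript.sh could contain (note that sudo may prompt for a password in the terminal window unless you permit this command in your sudoers configuration):

#!/bin/bash
# wrapper script launched by the HibernateNow.desktop entry
sudo systemctl hibernate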

Then you can double-click the HibernateNow.desktop file to hibernate the machine, or set up a hotkey for it instead.

Happy hibernating.

Ubuntu Intermittent Freezing Fixed with Swappiness

Having run Ubuntu on my Dell XPS 14z for years I have been increasingly plagued by an intermittent freezing problem which causes the UI to freeze (mouse still movable) for anything from 5 to 30 seconds. I’m not sure when this started (maybe around Ubuntu 19.04) but it has gotten worse with each subsequent Ubuntu release, to the point that Ubuntu 20.04 was almost unusable, which prompted me to search harder for a solution. The solution I found resolved the problem completely: Swappiness.

Mine is capable but aging hardware, and it appears the OS was swapping memory to disk too aggressively for my system, which is what was causing the temporary freezes.

The Linux swappiness setting controls how aggressively the kernel swaps memory pages to disk. Here is the official definition:

“This control is used to define how aggressive (sic) the kernel will swap memory pages. Higher values will increase aggressiveness, lower values decrease the amount of swap. A value of 0 instructs the kernel not to initiate swap until the amount of free and file-backed pages is less than the high water mark in a zone. The default value is 60.”

Linux documentation on GitHub

For a fuller explanation check out this HowToGeek article.

Whilst the default is 60, there are differing opinions on what to set it to for certain types of hardware and OS usage. I went with a value of 20 and it’s been great: no more freezing so far.

Find your system’s current setting with this command:

cat /proc/sys/vm/swappiness

Then update it with this:

sudo sysctl vm.swappiness=20

No reboot is required for testing. When you have decided on the final value to use, persist it by updating the sysctl.conf file, e.g.:

sudo nano /etc/sysctl.conf

and set the vm.swappiness=20 setting in the file.
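
Alternatively, a quick way to persist and verify it from the terminal (appending assumes the setting isn't already present in the file):

# append the setting to sysctl.conf
echo 'vm.swappiness=20' | sudo tee -a /etc/sysctl.conf

# reload settings from the file to check it takes effect
sudo sysctl -p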

This solution worked for me and Ubuntu 20.04 is now running really well.

Moving Sonar Rules closer to the Developer with ESLint

Shift code quality analysis to the left by moving your static analysis from the CI/CD pipeline to the developer’s IDE where possible. In this post I cover what we did, how to set up SonarLint, and how we ultimately moved the Sonar rules into ESLint instead.

Problem

As a development team working on a JavaScript application we sometimes had issues where a difference between the rules enforced in our CI/CD pipeline, using SonarQube, and the local rules, using ESLint, led to discrepancies in standards and ultimately, at times, broken builds. If you are enforcing rules in your project and in the CI/CD pipeline with SonarQube then ideally you need them to mostly match, but this is not as easy as it sounds. Any failures in the pipeline are more time-consuming to resolve than if they had happened locally before the developer pushed their code to the repository.

Solution

Import the Sonar rules into ESLint and enforce ESLint in both the IDE and the CI/CD pipeline.

In order to strike a balance between quality assurance and flexibility in the implementation of rules, we introduced an approach that combines ESLint and Sonar rules with an emphasis on shift-left: rules are enforced in the IDE as code is written and then re-enforced later in the CI/CD pipeline.

Firstly, SonarSource (the developers of SonarQube) provide a plugin for several IDEs (including VS Code) called SonarLint that helps address the issue of running Sonar rules in an IDE. Once installed it will analyse your code against the default Sonar rules and report issues. If you’re not running the default set of rules on your Sonar server, no problem: the plugin can be set to connect to your server and download the right quality profile and rules.

To do this install the SonarLint extension into your IDE (many are supported, e.g. VS Code, Visual Studio etc) and then set the extension properties as per the instructions for your particular IDE. For VS Code it goes like this:

To link a server, set the "sonarlint.connectedMode.connections.sonarqube" setting, which has to be a USER setting (oddly). Then in the workspace settings for the project you can configure the projectKey for your project. Workspace setting files are created in the .vscode folder in a settings.json file which can be added to source control, so this only needs to be set up once per project. Once done, press F1 > type sonar > select "SonarLint: Update all project bindings to SonarQube", which will refresh the plugin’s cache and force it to download the rules from your Sonar server.
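
As a rough sketch, the two settings can look something like this (the connection name, server URL and project key are placeholders; check the SonarLint documentation for the exact schema in your plugin version):

// user settings.json
"sonarlint.connectedMode.connections.sonarqube": [
    { "connectionId": "mySonarServer", "serverUrl": "https://sonar.example.com" }
]

// .vscode/settings.json (workspace, can be committed to source control)
"sonarlint.connectedMode.project": {
    "connectionId": "mySonarServer",
    "projectKey": "my-project-key"
}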

Now whilst SonarLint is a useful tool, it is not as powerful as ESLint for linting in the IDE (in my opinion). For example, ignoring a rule (for a genuine reason) at file or line level is not possible in a satisfactory way (you can only exclude a line/file from ALL Sonar rules). Also, ESLint provides more power and flexibility, especially where you have a centrally managed Sonar server with shared rule profiles and quality gates that are not easy to change (which may be a good thing for your organisation).

So instead of, or even in addition to, SonarLint checking, you can actually import the Sonar JavaScript scanner rules into ESLint. To do this install the npm package eslint-plugin-sonarjs, then configure your ESLint config to extend the plugin’s recommended JS rules:

  1. npm install eslint-plugin-sonarjs --save-dev
  2. Add it to your eslint config file as an extends:
    "extends": [
      "plugin:sonarjs/recommended"
    ]
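
For context, a minimal .eslintrc.json using the plugin might look like this (a sketch to merge into your own config; the env values are illustrative):

{
  "root": true,
  "env": { "browser": true, "es2021": true },
  "plugins": ["sonarjs"],
  "extends": ["eslint:recommended", "plugin:sonarjs/recommended"]
}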

Now ESLint will report quality errors that would previously only have been highlighted by Sonar during a CI/CD pipeline build. This immediate feedback is more useful to the dev team and reduces the time wasted on broken builds. For us this ensured that, apart from a few rules, the majority are now in ESLint where developers can see and resolve them, removing the need for a CI/CD pipeline run to highlight the problem (and the broken builds that come with it).

To enforce the ESLint rules at build time we run the ESLint analysis during the CI/CD pipeline by calling ESLint as a build step:

eslint --config .eslintrc.json src

This means no ESLint errors will be let through. We also have a Sonar Quality Gate check configured in the pipeline, but as the majority of Sonar rules are now in ESLint we should only get a failure where a breached rule exists only in the server-side Sonar profile.

As an additional step we can also import all the ESLint issues found into Sonar so that we can see them in the Sonar dashboard. You can export the ESLint results as JSON for Sonar to import (during the build). To do this, run this command in the build (ideally create a new npm script for it), assuming your src folder contains the source:

eslint --config .eslintrc.json --output-file ./eslint-report.json --format json src

Next set this Sonar property in your sonar-project.properties file or via command line argument (where eslint-report.json is the output report produced above).

sonar.eslint.reportPaths=eslint-report.json

Any issues from the ESLint report will then appear in Sonar issues marked with an ESLint badge. It appears warnings are added as Majors and errors as Criticals, and unfortunately I’ve not yet found a way to change this.

As a side note, this command is also useful for getting ESLint to output an HTML report of any errors, which is great for reviewing or sharing:

eslint --config .eslintrc.json --output-file ./eslint-report.html --format html src

Summary

In summary, quality is being maintained as the same rules are enforced, but now developers only need to ensure that ESLint is happy before committing their changes to source control to ensure a smooth server build. To make this easier you can add a new npm run script that runs all pre-commit checks, triggered automatically (e.g. via git hooks) or manually by the developer, as sketched below.
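
For example, a sketch of what such npm scripts could look like (script names are illustrative; the hook wiring itself would be done with a tool such as husky):

"scripts": {
  "lint": "eslint --config .eslintrc.json src",
  "pre-commit-checks": "npm run lint && npm test"
}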

Auto increment build number in a JavaScript app

Whilst looking for a simple way to auto-increment an application build version number for a small React JavaScript application I found a neat solution, so I’m sharing it here for others to use (and for me to refer back to on my next project).

Problem:

For a small React application using NPM I needed to automatically increment a build number on every production build without having to remember to do it, and I then needed to refer to that version number within the application in order to display it in the site’s footer.

Solution:

After a short Google I found this really neat solution posted by Jan Lübeck on the Create-React-App GitHub site here, which I then very slightly tweaked for my requirements.

The solution is essentially to call a NodeJS script from the npm build script definition which updates a metadata JSON file within the project, incrementing the build version number by 1. This file can then be read within the React code as required to display the current build version.

First we need to store the build version numbers in a JSON file:

{
  "buildMajor": 1,
  "buildMinor": 0,
  "buildRevision": 3,
  "buildTag": "BETA"
}

Only the buildRevision will be updated automatically as we will want manual control over the major/minor numbers as we implement semantic versioning.

The NodeJS script (generate-buildno.js) is below. This opens the src/metadata.json file, reads the buildRevision value, increments it by 1 and saves the file:

// generate-buildno.js: bump the build revision in src/metadata.json
var fs = require('fs');

console.log('Incrementing build number...');
fs.readFile('src/metadata.json', function (err, content) {
    if (err) throw err;
    var metadata = JSON.parse(content);
    // only the revision is auto-incremented; major/minor stay under manual control
    metadata.buildRevision = metadata.buildRevision + 1;
    fs.writeFile('src/metadata.json', JSON.stringify(metadata), function (err) {
        if (err) throw err;
        console.log(`Current build number: ${metadata.buildMajor}.${metadata.buildMinor}.${metadata.buildRevision} ${metadata.buildTag}`);
    });
});

To call this script at build time we update the package.json file and change the build command to call our script first, before the normal build command:

 "scripts": {
    "start": "react-scripts start",
    "build": "node generate-buildno.js && react-scripts build",
    "test": "react-scripts test",
  },

Now when we call npm run build the JSON file will be updated and the build revision number incremented. The React code that needs to display the app version number just has to read the metadata.json file, like the example React component below:

import React from 'react';
import metadata from './metadata.json';

function FooterComponent() {
  return (
    <div className="sf-footer">
      &copy; 2020 RichHewlett.com
      <div className="sf-footer-version">
        {`Version ${metadata.buildMajor}.${metadata.buildMinor}.${metadata.buildRevision} ${metadata.buildTag}`}
      </div>
    </div>
  );
}
export default FooterComponent;

Remember after building the application to commit your changed metadata.json file into source control!

Java keystore certificate missing after Jenkins upgrade

Following a tricky upgrade to a Jenkins server it was found that the server was no longer able to communicate with the Git server, although SSH access with private keys was still working correctly. On investigating the Jenkins logs (found at http://YOUR-JENKINS-SERVER-URL/log/all) this error was found to be the problem:

Failed to login with creds MYCREDENTIALNAMEHERE
sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.provider.certpath.SunCertPathBuilder.build(Unknown Source)
at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(Unknown Source)
at java.security.cert.CertPathBuilder.build(Unknown Source)
Caused: sun.security.validator.ValidatorException: PKIX path building failed
at sun.security.validator.PKIXValidator.doBuild(Unknown Source)
at sun.security.validator.PKIXValidator.engineValidate(Unknown Source)
at sun.security.validator.Validator.validate(Unknown Source)
at sun.security.ssl.X509TrustManagerImpl.validate(Unknown Source)
at sun.security.ssl.X509TrustManagerImpl.checkTrusted(Unknown Source)
at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(Unknown Source)
Caused: javax.net.ssl.SSLHandshakeException

The problem here is that the Java runtime is not able to validate the connection as it doesn’t have the relevant SSL certificate. To resolve this I needed to download the certificate from the site and add it to the Java keystore.
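
If you need a way to grab the certificate from the command line and have openssl available, something like this works (a sketch; the hostname is a placeholder for your own Git server):

openssl s_client -connect github.example.com:443 -servername github.example.com </dev/null | openssl x509 -outform PEM > exampleCertFile.cer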

On the server, once you have the relevant web certificate for the site you are trying to access (a GitHub server in my instance), find which Java JRE your Jenkins is using. This can be found in your Jenkins.xml file under the executable element; for example %BASE%\jre\bin\java means C:\Jenkins\jre\bin\java if you have installed Jenkins to C:\Jenkins on Windows. Next find the cacerts file inside that Java runtime (for example C:\Jenkins\jre\lib\security\cacerts), which is where the certificate needs to be installed. To add the cert to the Java keystore for that Java install, run the below in a terminal:

keytool -import -alias INSERT_ALIAS_NAME_HERE -keystore PATH_TO_JRE_HERE\security\cacerts -file CERTIFICATE_FILE_NAME

for example…

keytool -import -alias ExampleName -keystore C:\Jenkins\jre\lib\security\cacerts -file exampleCertFile.cer

If you are asked for a password then it will be 'changeit' by default.
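
To check the certificate was imported you can list it by its alias (again 'changeit' is the default password):

keytool -list -keystore C:\Jenkins\jre\lib\security\cacerts -alias ExampleName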

Now that the JRE Jenkins is using has the certificate installed, it will be able to make an SSL connection to the Git server.

Current Podcast Recommendations

As someone with a long commute I find podcasts a great way of keeping up to date with technology (as well as other things), and it helps me feel that my commute time is not wasted. Over the years I think I have benefited from this approach as it’s increased my level of general knowledge and helped me keep up to date with the latest developments in the industry.

Over the last ten years I have occasionally posted my top podcast picks and here are the links to my previous posts in 2010, 2012 and 2016.

Below is a list of my current podcast recommendations for technical and non-technical subjects.

Podcasts on Software Development:
Podcasts on Technology:
Other Recommended Podcasts:

Jenkins: Script not permitted to use method signature

I’ve recently run into an error in a Jenkins CI/CD pipeline when using java.util.Date objects in my Groovy pipeline script, so I thought I’d share the problem and workaround.

Problem

In order to stamp a date/time and build number into an output build package as part of a Jenkins build, I made use of the Java Date object in Groovy, something like this:

def date1 = new Date(1456533546)
def today = new Date()
def diff = (today-date1)

On running the build I get this error in the console log:

org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException:
Scripts not permitted to use staticMethod 
org.codehaus.groovy.runtime.DateGroovyMethods minus java.util.Date

It turns out that as the Groovy Jenkinsfile script runs in the Groovy sandbox in Jenkins, there is a white-list of approved method signatures out of the box, and this java.util.Date usage is not one of them.

Solution

If you add the Script Security Plugin to your Jenkins install, if it’s not there already, then you’ll be able to whitelist new methods and APIs that can be used by the Jenkins pipeline scripts.

Log onto Jenkins as an Administrator and go to Manage Jenkins > In-process Script Approval and scroll to the bottom where the Script Security Plugin will have put new entries. Here you’ll find recent script permission failures and details.

options in Script Approval UI

Choosing to "Approve" the signature will then allow pipeline scripts to use this Java signature (in this example the java.util.Date methods) and the error should go away.

For more information see the official Jenkins docs here.

Free SSL encryption on Azure with Cloudflare

I have a small web application site (running ASP.net Core) hosted on Microsoft Azure under a shared plan. Azure shared plans are budget friendly whilst providing features like custom domain names. Until recently the site was HTTP-only, which was initially fine for my use case, but as HTTPS is increasingly a minimum standard on most sites, and even browsers now warn users when navigating to insecure sites, it was time to move to HTTPS.

The Problem

Whilst setting up HTTPS certificates has become easier and cheaper, there is still usually a trade-off between effort and cost. I could have added SSL to my Azure subscription but I couldn’t justify that cost for the site in question. So I needed an easy and cheap solution, and I definitely found it with Cloudflare!

The Solution

I am amazed how easy it was to set up SSL fronting of my existing site using Cloudflare’s SSL.

  1. Create an account at Cloudflare.com
  2. Enter your domain name where the site is hosted.
  3. Confirm the pricing plan (choose FREE)
  4. Cloudflare scans the domain’s DNS entries and asks you to confirm which entries you want proxying. Once done you’ll be given the new NameServers which you’ll need to update on your domain registrar’s site.
  5. Once the NameServers have synced (could take minutes to 48 hours) you are done!

There are a few configuration modes for their SSL offering, e.g. Full SSL (SSL from end users to Cloudflare and on to the origin server) or Flexible (where the SSL stops at Cloudflare and traffic is not encrypted from your origin server to Cloudflare). The option you choose will depend on your setup and your requirements. As my site is hosted on Azure I got Full SSL encryption by default (using Azure’s SSL keys to encrypt traffic from my web server to Cloudflare).

You don’t need an Azure hosted site for this SSL goodness to work, as Cloudflare will front your service wherever it’s hosted.

This process was so easy I should have done it months ago. For more info check out these resources:

Build warnings & .Net Standard dependencies

Having recently migrated a Windows Azure hosted .Net Framework ASP.net MVC 5 web application over to ASP.net Core (v2.2), I hit a few issues, so I’m documenting them here for my future self and anyone else who might find them useful. The application was a multi-functional ASP.net MVC site with an Azure Blob Storage service back-end that I re-wrote in ASP.net Core with a mixture of both MVC and Razor Pages. The new site was .Net Core 2.2, which referenced supporting projects built to .Net Standard 2.0.

The Problem

The application ran fine locally, and as I was re-purposing the same Windows Azure Web Service instance I re-configured it to support .Net Core via the Azure Portal. It’s not clear if this step is actually required, but it makes sense to set it correctly.

Next I deployed my Web App via Web Deploy, and then hit the homepage but immediately received a long list of errors such as the one below:

The type 'Attribute' is defined in an assembly that is not referenced. 
You must add a reference to assembly
'System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e'. 
[assembly: global::Microsoft.AspNetCore.Razor.Hosting.RazorCompiledItemAttribute(typeof(AspNetCore.Views_Home_Index), @"mvc.1.0.view", @"/Views/Home/Index.cshtml")] ……etc……

It looked like .Net itself was not being found on the target machine, and so after checking the server settings and server logs in the Azure Portal I double-checked my project references to confirm I was targeting .Net Core and .Net Standard 2.0, and that’s when I found a build warning that I had previously missed:

[NU1701] Package 'Microsoft.Azure.KeyVault.Core 1.0.0' was restored using '.NETFramework,Version=v4.6.1' instead of the project target framework '.NETStandard,Version=v2.0'. 
This package may not be fully compatible with your project

One of my dependent projects in the solution was referencing the Azure SDK assembly Microsoft.Azure.Storage.Blob (via NuGet package), which in turn was referencing a version of Microsoft.Azure.KeyVault.Core that was not compatible with .NetStandard 2.0. The solution ran fine locally (as the dependent framework is installed on my machine), but when the application was deployed to a server without the .Net Framework v4.6.1 binaries the application failed to load. More on this problem can be found in this GitHub issue.

The Solution

The solution was to reference a later version of the Microsoft.Azure.KeyVault.Core assembly that does support .NetStandard. As my project had no direct reference to Microsoft.Azure.KeyVault.Core (it was a dependency of Microsoft.Azure.Storage.Blob), I had no dependency reference to update, so I instead added a direct dependency on Microsoft.Azure.KeyVault.Core version 3.0.3 in my project file. Despite reportedly working for other people, this didn’t directly resolve my issue, so at this point I decided to update all my Azure SDK references, which took Microsoft.Azure.Storage.Blob to v11.1.3, a version that references a .NetStandard 2.0 compliant version of the KeyVault.Core dependency:

  <ItemGroup>
    <PackageReference Include="Microsoft.Azure.Storage.Blob" Version="11.1.3" />
  </ItemGroup>

After upgrading (and tweaking a few namespaces that had changed in the newer package versions) I deployed the project successfully without error. Issue resolved!

The lesson here, I think, is firstly don’t ignore build warnings (we all do sometimes, right?) and secondly be careful with dependencies and their children not being .NetStandard compliant. This may become less of a problem as .Net Core and .Net Standard get more widespread adoption, but for now it’s one to watch for.

IIS Express Launch Script

Usually during web development you want to run your web code locally via a local development web server, and there are many options for this. In fact most development workflows provide this functionality. For example Visual Studio provides a local web server to run your code, and React/Webpack toolchains usually use NodeJS-based solutions. Sometimes though you want to fire up something simple to run your code outside your development workflow.

The Problem

When developing ReactJS projects I often prefer to run my transpiled, bundled code (produced via "npm run build") differently to my development code, on a separate local web server. As I usually have IIS Express installed already, I prefer to use that rather than install new global Node-based web server tools. Whilst the command line parameters for IIS Express are simple, I have added them to this post to prevent me having to remember them in the future 🙂

The Solution

To run IIS Express from the command line use:

iisexpress /path:<path_to_files>

You can also specify a port number and other options if you require (see the documentation here).

I prefer to wrap this into a RunOutputBuild.cmd batch file that I can store with the code in the repo and fire up when I want to host my transpiled build output files.

REM Script to launch IIS Express and host the build folder
echo "Launching IIS Express"
cmd /K "c:\program files (x86)\IIS Express\iisexpress.exe" /path:%~dp0build

This assumes the IIS Express installation location is the default one and that the script lives in root of the repo as does the \build folder. The useful %~dp0 key resolves to the local directory where the command script was run from.

I can then just run the command file and browse to the compiled site. It’s also possible to add compilation steps, and one to fire up the browser automatically if required, as sketched below. Alternatively you could write an NPM script or Gulp/Grunt step that replicates this functionality to launch IIS Express.
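
For example, a variant of the batch file that pins the port and opens the browser automatically (a sketch; the port number is arbitrary and the IIS Express path assumes the default install location):

REM Launch the browser, then host the build folder on a fixed port
echo "Launching IIS Express on port 8080"
start http://localhost:8080
cmd /K "c:\program files (x86)\IIS Express\iisexpress.exe" /path:%~dp0build /port:8080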