Recovered

It has come to my attention that this site was exploited and had several malicious script tags inserted. These could potentially harm your system, or just be very annoying (redirecting you to some dodgy site).

I did not really notice this myself, since I use several script-blocking extensions and, given the sheer amount of script tags on all kinds of websites, I don’t pay much attention to what they block.

I spent some hours recovering from the hack: I upgraded all plugins, removed all unused themes and went through the database to make sure there are no malicious scripts left.

If you find anything that could still point to some weird script or behaviour on this site, please contact me.

Deserialise JSON to a typed Map with Jackson in Kotlin

There are multiple resources on how to deserialise JSON to a typed object, and that is fairly straightforward. But what if you have a blob of JSON like this:

{
"FOO": 1,
"BAR": 2,
"BAZ": 3
}

And suppose the keys of this map are defined in an enum.

Surely you could deserialise this into a Map<String, String> first, or simply a raw Map, and *after* that map your keys to the enum you use.

But what if you want to do something like this?

Map<MyEnum, Integer> map = deserialiseJsonAsMap(json);

Well, you’d need something with generics? Too bad: you can’t write such a generic method in Java due to type erasure.

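For comparison, in Java the best you can do with Jackson is spell out the full type at every call site through an anonymous TypeReference subclass; a generic helper method would lose the type arguments:

Map<MyEnum, Integer> map = objectMapper.readValue(
    json, new TypeReference<Map<MyEnum, Integer>>() {});
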
In Kotlin, however, you *can* do this. Your deserialiseJsonAsMap function should use inline and reified to make it happen.

In short, your function could look like this:

    import com.fasterxml.jackson.core.JsonProcessingException
    import com.fasterxml.jackson.core.type.TypeReference
    import com.fasterxml.jackson.databind.JsonMappingException
    import org.apache.commons.lang3.StringUtils

    // objectMapper is assumed to be a Jackson ObjectMapper available in scope
    @Throws(JsonProcessingException::class, JsonMappingException::class)
    inline fun <reified K, reified V> deserialiseJsonAsMap(jsonData: String?): Map<K, V> {
        if (StringUtils.isEmpty(jsonData)) {
            return HashMap<K, V>()
        }
        // the anonymous subclass captures the reified type arguments for Jackson
        val typeRef: TypeReference<Map<K, V>> = object : TypeReference<Map<K, V>>() {}
        return objectMapper.readValue(jsonData, typeRef)
    }

You could omit the StringUtils.isEmpty() check if you want to. Do note that both K and V are reified; otherwise the TypeReference could not capture the actual key and value types.

Because of inline the method body gets inlined everywhere it is called, and by using reified we make sure the actual type arguments survive erasure. That is, if your calling code is:

val map = deserialiseJsonAsMap<MyEnum, Int>(json)

the compiler basically turns it into this:

val map = if (StringUtils.isEmpty(json)) {
      HashMap<MyEnum, Int>()
    } else {
      val typeRef: TypeReference<Map<MyEnum, Int>> = object : TypeReference<Map<MyEnum, Int>>() {}
      objectMapper.readValue(json, typeRef)
    }

This means your compiled code gets bigger (due to the inlining), but you do have a single function now!


Setting up Jenkins with a ReverseProxy and with github-webhook

Recently I wanted to set up Jenkins and have it work with Github webhooks. It went fairly smoothly, but I ran into some edge cases which I wanted to elaborate on:

In this case there is a dedicated Debian server, using Apache as the webserver. It uses Letsencrypt and runs several apps, each with their own subdomain.

One of those applications is Jenkins which is available through ci.<hostname>.com.

In order to achieve this I basically followed the instructions at Jenkins.

In this case, Jenkins runs (configure it at /etc/default/jenkins) at localhost:8102 without a context path (i.e. it serves from /). Then, in Apache we configure this like so:

  ProxyPass / http://localhost:8102/ nocanon
  ProxyPassReverse / http://localhost:8102/

Make sure the trailing slash is there. But still, this is standard stuff if you read the instructions at Jenkins.

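For reference, the relevant part of the Apache virtual host could look roughly like this (a minimal sketch; ci.example.com, port 8102 and the omitted SSL directives are placeholders for your own setup):

<VirtualHost *:443>
  ServerName ci.example.com
  # SSL / Letsencrypt directives omitted here

  ProxyRequests Off
  ProxyPreserveHost On
  AllowEncodedSlashes NoDecode

  ProxyPass / http://localhost:8102/ nocanon
  ProxyPassReverse / http://localhost:8102/
</VirtualHost>
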
Then, I still ran into problems using the github-webhook. I tested first using Postman: I sent a POST to https://ci.<domain>.com/github-webhook and to my surprise it said:

java.lang.Exception: Method POST required
	at org.kohsuke.stapler.HttpResponses.error(HttpResponses.java:83)
	at org.jenkinsci.plugins.github.webhook.RequirePostWithGHHookPayload$Processor.shouldBePostMethod(RequirePostWithGHHookPayload.java:92)

Consulting the Apache logs shows:

"POST /github-webhook HTTP/1.1" 302 3613 "-" "PostmanRuntime/7.26.1"
"GET /github-webhook/ HTTP/1.1" 405 7097 "https://ci.<domain>.com/github-webhook" "PostmanRuntime/7.26.1"

This is weird, because I can log in to Jenkins (which uses a POST), so it can’t be that all POST requests become GETs. Something else must be wrong?

One way to test this is to just use curl on the machine itself. First, let’s eliminate Apache’s role and see if we can access the URL directly. That is, since Jenkins runs on port 8102 on the machine, we should be able to access it like so:

curl -H "Content-Type: application/json" -i -X POST http://localhost:8102/github-webhook

and the response…

$ curl -H "Content-Type: application/json" -i -X POST http://localhost:8102/github-webhook
HTTP/1.1 302 Found
Date: Thu, 30 Jul 2020 10:09:36 GMT
X-Content-Type-Options: nosniff
Location: http://localhost:8102/github-webhook/
Content-Length: 0

Ah! So it looks like Jenkins itself is giving us a 302.

So what gives?

After a while I figured out the URL needs a trailing slash (/facepalm); sadly I only figured that out after searching the web for quite a while. I hope this post makes it easier for others to figure this out.

So, in the end the URL should become /github-webhook/. Now, testing again with curl:

$ curl -H "Content-Type: application/json" -i -X POST http://localhost:8102/github-webhook/
HTTP/1.1 400 Hook should contain event type
Date: Thu, 30 Jul 2020 10:09:00 GMT
X-Content-Type-Options: nosniff
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html;charset=iso-8859-1
Content-Length: 372

And so we get a 400 now: the request actually reaches the webhook endpoint, and Jenkins merely complains that we did not send a GitHub event type along. This is good.

Retesting with Postman gave the same results.

Once set up in Github’s webhook settings, it also worked fine!

So, remember, make sure there is a trailing slash on that github webhook URL! -> /github-webhook/

Automating your SQL migrations in your Java (web) Application with ActiveJDBC’s DB Migrator

Regardless of your language or framework – if you write a (web) app and read/write data to a SQL database (that you manage), you’re bound to have changes to that database for whatever reason.

Usually this is tackled by writing SQL migration scripts: for instance, SQL files you execute as part of your release procedure. You might pass these SQL files, along with a ‘release document’, to a separate (OPS) department.

But what if we could automate this? Wouldn’t it be much nicer if your application could detect which SQL files to execute, and then execute them? And when things break, it stops.

This blog post covers one of the many ways to do this. I start from a more general process and move to a more concrete Java implementation.

An automated deploy process

In this case I assume the following:

  1. You are not allowed to run any build process on the environment you deploy to. That is, there is no Maven on your prod environment (nor on acceptance or test).
  2. Up until now you execute SQL migrations manually. These are SQL snippets you have and you use a SQL editor (or command line) to execute them.
  3. You want to automate these steps. (of course you do!)

So let’s make the manual process a bit more formal:

  1. You write your SQL script (you execute the bits you need, or you write it in such a way that you can always execute it fully)
  2. You stop your (web)app
  3. You open your SQL editor / prompt and execute your SQL script
  4. On failure, you roll back and reboot your app (and figure out what went wrong)
  5. On success you boot your (web)app

And what would an automated SQL migration process look like?

  1. It should use SQL files that can be executed multiple times with the same result (i.e. running them a 2nd or 3rd time should not result in different behaviour: they are idempotent; see the example below the lists)
  2. When your (web) app boots it should check if there are any SQL files left to execute
  3. The SQL files are executed in a specific order (from oldest to newest)
  4. It remembers which migrations have been executed
  5. Upon failure, stop executing migrations and stop booting


Notice the similarities!

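To illustrate the idempotency requirement from the first item of the automated list: a migration written so it can safely run more than once could look like this (a hypothetical example assuming MySQL; the table is made up):

CREATE TABLE IF NOT EXISTS Books (
  id BIGINT AUTO_INCREMENT PRIMARY KEY,
  title VARCHAR(255) NOT NULL
);
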
Of course, you should have created a backup before doing any migration, though that is out of scope for this post. It usually boils down to:

  • create backup
  • execute release procedure
    • upon fail, report back. Put back backup, and boot old app.
    • upon success, good for you!

Using ActiveJDBC’s DB Migrator

I have worked on a few projects using Ruby On Rails (abbreviated as RoR), which uses rake as build tool (kinda like maven). With that you could easily create new database migrations, execute migrations, etc. I was charmed by the simplicity.

In the land of Java, where I have mostly used Spring and Hibernate, I found ActiveJDBC (which is part of Javalite), which seems to be inspired by the approach RoR has taken.

ActiveJDBC also comes with a migrator, which reminds me of RoR’s rake db:migrate approach. It is a Maven plugin, so you call mvn db-migrator instead. It is very convenient for creating new migrations or running them.

Now the running part is the problem. While on your local machine you can run mvn db-migrator, I would not suggest doing that on a production (or any other) environment. So we need a way to run the migrator without using Maven.

Fortunately this is easy to fix.

Let’s get into some code.


Setting up ActiveJDBC in your Maven project

If you follow ActiveJDBC’s documentation it is easy to set up within your project. So I take that as a starting point here, and expand upon it.

In your pom.xml you add the following as a plugin within the plugins section:

<!-- for creating db migrations with maven -->
<plugin>
  <groupId>org.javalite</groupId>
  <artifactId>db-migrator-maven-plugin</artifactId>
  <version>${activejdbc.version}</version>
  <configuration>
    <migrationsPath>src/main/resources/sql/db-migrations</migrationsPath>
    <environments>MY ENVIRONMENT</environments>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>${jdbc.groupId}</groupId>
      <artifactId>${jdbc.artifactId}</artifactId>
      <version>${jdbc.version}</version>
    </dependency>
  </dependencies>
</plugin>

There are a few details to note:

jdbc.groupId, jdbc.artifactId and jdbc.version are the coordinates of your DB connector; it could be mysql for instance. You probably have this dependency defined in your project already, so take those values and put them there (an example follows below).

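For example, if you use MySQL, the corresponding properties in your pom.xml could look like this (a sketch; use the version you actually depend on):

<properties>
  <jdbc.groupId>mysql</jdbc.groupId>
  <jdbc.artifactId>mysql-connector-java</jdbc.artifactId>
  <jdbc.version>8.0.21</jdbc.version>
</properties>
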
For activejdbc.version I currently use 2.0.

One other thing to note is that I have deviated a bit from the default migrationsPath. In my case I put the sql files into src/main/resources/sql/db-migrations.

If you set this up, you can now create a migration. Test it by executing:

mvn db-migrator:new -Dname=<your name for your migration script>

The naming of migration scripts does not really matter – although it is good to have some structure in it. Usually I write them as <action>_<type>_<name of column/table>_extra_info, which results in something like: create_table_Books.

Suppose you took the last example as input:

mvn db-migrator:new -Dname=create_table_Books

It will result in:

/src/main/resources/sql/db-migrations/20180711170817_create_table_Books.sql

As you can see, a timestamp is prepended to the name you have given.

The timestamp is used by ActiveJDBC to determine whether the SQL script should be executed or not. It basically keeps a record of the last time it executed any scripts and diffs that against the SQL scripts it finds. Any newer scripts are executed in the correct order (oldest to newest). In other words: do not touch the timestamp.

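Running all pending migrations locally is then a matter of calling the plugin’s migrate goal:

mvn db-migrator:migrate
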
I’d suggest you play a bit with the plugin first, before going to the next step.


Executing the SQL Migrations from your (web) application

Now the real fun begins.

Include db-migrator as dependency in your project

So not only as a plugin, but also as a dependency. Like so:

<!-- DB Migrations -->
<!-- yes we use it as a plugin but also as a dependency! -->
<dependency>
  <groupId>org.javalite</groupId>
  <artifactId>db-migrator-maven-plugin</artifactId>
  <version>${activejdbc.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.sonatype.sisu</groupId>
      <artifactId>sisu-guava</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- DB Migrations -->

NOTE: I exclude sisu-guava because it is an older Guava version (older than 20.0) and it will cause RuntimeExceptions when using Preconditions. Since I use version 25.x or newer in my own projects, it clashes with the old version. So if you use a newer Guava version yourself, make sure you exclude the old one like above.

Using it as a dependency makes it reachable from our code, while at the same time we can still use the mvn db-migrator commands alongside it.


Making sure your migrations run before your ORM is loaded

Usually web apps use an ORM (like Hibernate) and it requires the state of the DB to match the objects within the code. So your migrations (which alter state of the DB) should be executed before the ORM is booted.

So let’s start with that. You do this by adding a listener to your web.xml and placing it above all other listeners, making sure it is executed first. If you do not have any listeners yet: in my case they are positioned right after the filters, but before the servlet definitions.

<!-- order is important, the database migrator must run before everything else -->
<listener>
  <listener-class>com.fundynamic.webapp.listener.DatabaseMigratorListener</listener-class>
</listener>

Now we have to make sure the class exists. Create the class, and let it implement ServletContextListener like so:

package com.fundynamic.webapp.listener;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class DatabaseMigratorListener implements ServletContextListener {

  // Public constructor is required by servlet spec
  public DatabaseMigratorListener() {
  }

  // -------------------------------------------------------
  // ServletContextListener implementation
  // -------------------------------------------------------
  public void contextInitialized(ServletContextEvent sce) {
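    // empty for now; put some logging or a System.out here to verify it runs first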
  }

  public void contextDestroyed(ServletContextEvent sce) {
      /* This method is invoked when the Servlet Context 
         (the Web application) is undeployed or 
         Application Server shuts down.
      */
  }
}

I’d suggest you put some System.out or logging there and see if it is executed in your application in the correct order before going to the next step.

Executing the migrations

Here we get into the meat of the problem.

Let’s start with the code:

package com.fundynamic.webapp.listener;

import org.javalite.db_migrator.maven.MigrateMojo;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class DatabaseMigratorListener implements ServletContextListener {

  private static final Logger log = LoggerFactory.getLogger(DatabaseMigratorListener.class);

  // Public constructor is required by servlet spec
  public DatabaseMigratorListener() {
  }

  // -------------------------------------------------------
  // ServletContextListener implementation
  // -------------------------------------------------------
  public void contextInitialized(ServletContextEvent sce) {
    String configFilePath = sce.getServletContext().getRealPath("/WEB-INF/classes/dbmigrator.properties");
    String migrationsPath = sce.getServletContext().getRealPath("/WEB-INF/classes/sql/db-migrations");

    try {
      MigrateMojo mojo = new MigrateMojo();

      mojo.setConfigFile(configFilePath);
      mojo.setMigrationsPath(migrationsPath);

      // we do not use environments like this, but we process the properties file in our build process with a maven profile
      mojo.setEnvironments("myenv");

      mojo.execute();
    } catch (Exception e) {
      log.error("Exception when executing migrator: ", e);
      System.exit(-1); // stop further execution of the webapp with a non zero error code
    }
  }

  public void contextDestroyed(ServletContextEvent sce) {
      /* This method is invoked when the Servlet Context
         (the Web application) is undeployed or
         Application Server shuts down.
      */
  }
}


Basically we do 2 things:

  1. We point to a dbmigrator.properties file which provides the properties for ActiveJDBC’s database migrator
  2. We set the directory where the SQL migration files can be found (which is the same folder as we used in the plugin)

You can also see I set up an environment named myenv. In this case I can use a single environment because I use Maven profiles, which interpolate the environment-specific variables for me. Another way would be to provide an environment variable, so you can have one file but different property names.

Last but not least, upon any exception we do a System.exit(-1) to signify we should stop whatever we are doing.

Within the dbmigrator.properties file you will need at least the credentials of the user which executes the migrations (I suggest you create a separate account for that on each environment). The file might look something like this:

myenv.driver=${jdbc.driverClassName}
myenv.username=${db.migrator.username}
myenv.password=${db.migrator.password}
myenv.url=${jdbc.url}

Here I also use properties interpolated by Maven; do note the variables all start with myenv, which is the environment configured in the listener. The ${jdbc.driverClassName} and such are defined in my pom.xml.

Making execution optional

There are cases where you do not want to execute the migrations. For example, you want to execute migrations automatically only on specific environments (i.e. up until acceptance, but not yet on production). In that case you can make this behaviour configurable.

Add this to your web.xml:

<!-- put this here somewhere on top of your web.xml file -->
<context-param>
  <param-name>executeDatabaseMigrationsOnBoot</param-name>
  <param-value>${db.migrator.execute}</param-value>
</context-param>

Again, the param-value is a property filtered by Maven. If you want to know more about that, see the maven-war-plugin filtering options.

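A minimal sketch of what enabling that filtering could look like in your pom.xml (filteringDeploymentDescriptors tells the maven-war-plugin to filter deployment descriptors such as web.xml):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <!-- replaces ${db.migrator.execute} in web.xml at build time -->
    <filteringDeploymentDescriptors>true</filteringDeploymentDescriptors>
  </configuration>
</plugin>
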
Either way, you want to have the context-param so you can read the value from your listener and apply some logic to it.

I use a BoxingUtil class, which is nothing but a few simple helper methods to make parsing a String to a boolean easier. The implementation of canParseAsBoolean is:

public static boolean canParseAsBoolean(String value) {
  return Boolean.TRUE.toString().equalsIgnoreCase(value) ||
      Boolean.FALSE.toString().equalsIgnoreCase(value);
}

and parseBoolean does exactly what its name says, but it makes sure the value is not null first. It is not of much importance here.

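For completeness, a minimal parseBoolean could look like this (my sketch; the actual helper may differ):

public static boolean parseBoolean(String value) {
  // null-safe: anything that is not "true" (ignoring case), including null, becomes false
  return value != null && Boolean.parseBoolean(value);
}
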
Reading the property and acting on it then becomes:

String value = sce.getServletContext().getInitParameter("executeDatabaseMigrationsOnBoot");
if (!BoxingUtil.canParseAsBoolean(value)) {
  log.error("Cannot parse value given for context-param (in web.xml) 'executeDatabaseMigrationsOnBoot', value is [" + value + "], but expected either [true] or [false]");
  System.exit(-1); // stop further execution of the webapp with a non zero error code
}

boolean executeMigrations = BoxingUtil.parseBoolean(value);

Now you can execute the rest of the listener code only when this value is true.


The listener, complete code example

For completeness’ sake, here is the full listener, with a bit of logging and with the flag to execute or not.

package com.fundynamic.webapp.listener;

import com.fundynamic.util.BoxingUtil;
import org.javalite.db_migrator.maven.MigrateMojo;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class DatabaseMigratorListener implements ServletContextListener {

  private static final Logger log = LoggerFactory.getLogger(DatabaseMigratorListener.class);

  // Public constructor is required by servlet spec
  public DatabaseMigratorListener() {
  }

  // -------------------------------------------------------
  // ServletContextListener implementation
  // -------------------------------------------------------
  public void contextInitialized(ServletContextEvent sce) {
    log.info("--------------------------------------------------------------");
    log.info("--------------------------------------------------------------");
    log.info("---              DATABASE MIGRATOR STARTING                 --");
    log.info("--------------------------------------------------------------");
    log.info("--------------------------------------------------------------");
    String value = sce.getServletContext().getInitParameter("executeDatabaseMigrationsOnBoot");
    if (!BoxingUtil.canParseAsBoolean(value)) {
      log.error("Cannot parse value given for context-param (in web.xml) 'executeDatabaseMigrationsOnBoot', value is [" + value + "], but expected either [true] or [false]");
      System.exit(-1); // stop further execution of the webapp with a non zero error code
    }

    boolean executeMigrations = BoxingUtil.parseBoolean(value);

    log.info("executeMigrations flag is [" + executeMigrations + "]");

    if (executeMigrations) {
      executeMigrations(sce);
    } else {
      log.info("Execute migrations was disabled, skipping executing migrations automatically!");
    }

    log.info("--------------------------------------------------------------");
    log.info("--------------------------------------------------------------");
    log.info("---              DATABASE MIGRATOR FINISHED                 --");
    log.info("--------------------------------------------------------------");
    log.info("--------------------------------------------------------------");
  }

  private void executeMigrations(ServletContextEvent sce) {
    String configFilePath = sce.getServletContext().getRealPath("/WEB-INF/classes/dbmigrator.properties");
    String migrationsPath = sce.getServletContext().getRealPath("/WEB-INF/classes/sql/db-migrations");

    try {
      MigrateMojo mojo = new MigrateMojo();

      mojo.setConfigFile(configFilePath);
      mojo.setMigrationsPath(migrationsPath);

      // we do not use environments like this, but we process the properties file in our build process with a maven profile
      mojo.setEnvironments("myenv");

      mojo.execute();
    } catch (Exception e) {
      log.error("Exception when executing migrator: ", e);
      System.exit(-1); // stop further execution of the webapp with a non zero error code
    }
  }

  public void contextDestroyed(ServletContextEvent sce) {
      /* This method is invoked when the Servlet Context
         (the Web application) is undeployed or
         Application Server shuts down.
      */
  }
}


What if I want to roll back?

So there are 2 scenarios for this.

  1. you’re releasing, and something went wrong.
  2. your release went fine, but you want to revert some changes in the next release

In scenario #1 roll back to the backup you created before executing the release and migrations.

In scenario #2 your release went fine, and you should create a new migration rolling back the changes. There is no reverting of SQL scripts, just a continuation (roll-forward); see the example below.

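For example, if an earlier migration added a column that turned out to be wrong, the “rollback” is simply a new migration that removes it again (hypothetical file name and table):

-- 20200801120000_drop_column_Books_isbn.sql
ALTER TABLE Books DROP COLUMN isbn;
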
Conclusion

Automating your SQL migrations is pretty straightforward. If you write idempotent SQL scripts and configure your (web) app to execute them automatically with ActiveJDBC’s db-migrator, you get convenience for free. With this you are a step closer to automating your deploy process, which is a very powerful tool for both developers and non-developers.

You can use either Maven profiles or ActiveJDBC’s environments to differentiate properties. What you use is up to you.

If you found this helpful or have any questions, let me know.

Coupling: The factory method

One of the challenges we face when coding is dealing with coupling. Coupling is an important aspect of programming: it tells us how entangled our code is. When coupling is too high, we can’t easily re-use code. When coupling is too low, our code does little, since objects that never interact cannot accomplish much together. You can even measure coupling; there are several metrics for it (for instance “Coupling Between Objects”, CBO).

In this blog post I’d like to talk about a subtle introduction of coupling: when you introduce a factory method.

Consider you have an interesting piece of code, and this piece of code has quite a lot of properties:

class Person {
	private String firstName;
	private String lastName;
	// .. more properties here

	public void subscribeTo(Subscription subscription) {
		// do something interesting here
	}

}

The problem is that, because of the number of properties and other dependencies, we’d like to simplify its creation by introducing a factory method. In this case we are building a web application, so we take the Request as input to read the parameters:

class Person {
	private String firstName;
	private String lastName;
	// .. more properties here

	public static Person create(HttpServletRequest request /* .. more arguments here .. */) {
		Person person = new Person();
		person.firstName = request.getParameter("firstName");
		// .. read more properties
		// .. set up dependencies, etc.
		return person;
	}

	public void subscribeTo(Subscription subscription) {
		// do something interesting here
	}

}

In the code that uses Person, it becomes easier to construct the Person and we’re happy with that. However, we have introduced coupling on several levels:

  • We construct the object with specific parameters in the create method. If we want to create a Person from different parameters, we cannot use it. There is coupling between the request parameters and the properties.
  • The object is constructed using a Request object. We can no longer move the class to an application that is not web based. A person has nothing to do with a request; it is merely convenient to have the factory method in the Person class. There is coupling between the code of Person and the dependency delivering the Request object.

There are several ways to deal with this, but let’s start with the last source of coupling. It is easy to fix by creating a Factory class within your web application; from there you can generate the Person object out of a request. The Person class no longer has a create method, and thus is not tightly coupled to the Request class. The newly created Factory, however, is coupled to the Request, which is fine as it is meant to convert Requests into Person objects. Hence we could even name it that way:

class Person {
	private String firstName;
	private String lastName;
	// .. more properties here

	Person(String firstName, String lastName /* , ... */) {
		this.firstName = firstName;
		this.lastName = lastName;
		// ...
	}

	public void subscribeTo(Subscription subscription) {
		// do something interesting here
	}

}

class PersonFromRequestFactory {

	// .. dependencies here

	public Person create(HttpServletRequest request) {
		Person person = new Person(request.getParameter("firstName"), request.getParameter("lastName"));
		// .. read more properties
		// .. set up dependencies in Person, etc.
		return person;
	}

}

Once we have this Factory, we can take it a step further. If you have different kinds of request parameters from which to create the same object, you could add different methods to the new Factory (the parameter names below are just illustrative):

class PersonFromRequestFactory {

	// .. dependencies here

	public Person createFromRegistrationForm(HttpServletRequest request) {
		Person person = new Person(request.getParameter("firstName"), request.getParameter("lastName"));
		// .. read more properties
		// .. set up dependencies in Person, etc.
		return person;
	}

	public Person createFromSubscriptionForm(HttpServletRequest request) {
		Person person = new Person(request.getParameter("givenName"), request.getParameter("surname"));
		// .. read more properties
		// .. set up dependencies in Person, etc.
		return person;
	}

}

You could also create a Parameter object and go from there. For instance, if your web application uses Spring, you could wire your request parameters to an object automagically (called “form binding”) and use that object as input in your Factory. This way it is type safe:

class PersonFromRequestFactory {

	// .. dependencies here

	public Person create(RegistrationForm form) {
		Person person = new Person(form.getFirstName(), form.getLastName());
		// .. read more properties
		// .. set up dependencies in Person, etc.
		return person;
	}

	public Person createFromSubscriptionForm(SubscriptionForm form) {
		Person person = new Person(form.getGivenName(), form.getSurname());
		// .. read more properties
		// .. set up dependencies in Person, etc.
		return person;
	}

}

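To round things off, here is a sketch of how such a factory could be used from a Spring controller with form binding (the controller, the mapping and the RegistrationForm fields are made up for illustration):

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PostMapping;

@Controller
class RegistrationController {

	private final PersonFromRequestFactory factory = new PersonFromRequestFactory();

	@PostMapping("/register")
	public String register(RegistrationForm form) {
		// Spring binds the request parameters onto RegistrationForm,
		// so the factory only ever sees a type safe object
		Person person = factory.create(form);
		// .. persist person, etc.
		return "redirect:/welcome";
	}
}

Here RegistrationForm is just a simple bean with getters and setters that Spring can bind the request parameters onto.
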
But how do you test all this?
Did you notice the Person has private fields, and no get/set methods? The only way to set the fields is through the Person constructor. So how do you test the correct construction of this Person class from the request? Since we are not able to read the properties, we have to test that code in other ways. I’ll cover that in the next blog post.