Test pollution can be a frustrating issue to track down, especially when the failures are sporadic. These failures may be caused by reliance on hard-coded fields (such as ids), timezone/DST issues, or leakage from previous test cases (for which Ted Naleid has a great debug script).
Yet sometimes fixing these issues is not enough: the tests still fail on the CI server, but never fail when run locally. For integration tests, this can happen when multiple CI test executors run against the same database instance. The two (or more) test runners may be working on the same data, leading to some fun errors. You could give each job its own datasource environment, but that isn't maintainable, as you'd need to add a new environment for every job you create. Instead, it's easier and faster to dynamically create the test database for each build, so that each database is a unique, one-time-use datasource specific to that run. I'll be detailing a quick-and-dirty way to set up this configuration with Grails 2.x, PostgreSQL, and Jenkins.
Before we start, it may be helpful to install the Jenkins PostBuild Script plugin. This will allow us to execute scripts after a job is completed (regardless of whether it failed), rather than, for example, having to put the logic in another job. It's not necessary, but it is certainly helpful.
The first thing you will need to do is update the script that runs your Grails tests so it creates a new database and passes it a unique parameter. For this example I'll be using the Jenkins BUILD_NUMBER variable; since my two jobs were never in sync/on the same build number, that was enough to keep them apart. If your jobs can share a build number, you may still see clashes, so use another Jenkins-provided variable or one of your own (date/timestamp/random id/etc.).
I'll be using PostgreSQL 9.3 as an example, but a similar approach should work for other database variants as well. You'll need to add your PostgreSQL bin directory to the PATH if it is not there already, as well as add the `createdb` call. Then you need to pass this variable to Grails so it knows which datasource to run against.
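The build step might look something like the following sketch. The `myapp_test` database prefix, the PostgreSQL install path, and the `dbSuffix` system property name are all assumptions here; substitute your own.

```shell
# Jenkins "Execute shell" build step (a sketch; names and paths are examples).
export PATH="$PATH:/usr/lib/postgresql/9.3/bin"

# Create a one-time database unique to this build.
# BUILD_NUMBER is provided by Jenkins.
createdb "myapp_test_${BUILD_NUMBER}"

# Hand the suffix to Grails as a system property so
# DataSource.groovy can pick it up.
grails -DdbSuffix="${BUILD_NUMBER}" test-app integration:
```
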
So at this point we have created the new database and run our tests against it. However, we also want to drop this database at the end of the build REGARDLESS of whether the tests pass. We could set a variable or alter the original shell script, but since our SQL variant and version supports an `--if-exists` flag, we can safely run the `dropdb` command even if, say, the original `createdb` call failed. This is where the PostBuild Script plugin is helpful, as it lets us execute a set of shell scripts after the build completes. Bear in mind that if you are not using the plugin and are instead running another job to drop the database, your variables may not be the same (e.g. BUILD_NUMBER), so be sure to use something reproducible across jobs.
Here we can add the following code to drop the database safely:
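A sketch of the PostBuild Script step, again assuming the `myapp_test` naming from the build step:

```shell
# PostBuild Script step: always drop this build's database.
export PATH="$PATH:/usr/lib/postgresql/9.3/bin"

# --if-exists makes this succeed even when createdb never ran,
# so the cleanup step itself cannot fail the build spuriously.
dropdb --if-exists "myapp_test_${BUILD_NUMBER}"
```
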
Finally, we need to update our Grails DataSource.groovy to accept the new parameter. Now, you may have noticed that I am only passing the suffix, not the entire database URL. Passing the full URL may be OK too, but then be sure that no job gets updated to accidentally point at "prod", or that you have validation in place on the URL.
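The relevant section of DataSource.groovy might look like this sketch. The database name, credentials, and the `dbSuffix` property key are assumptions; match them to whatever your build script passes in.

```groovy
// grails-app/conf/DataSource.groovy
environments {
    test {
        dataSource {
            // Fall back to a fixed name when no suffix is supplied
            // (e.g. when a developer runs the tests locally).
            String suffix = System.getProperty('dbSuffix') ?: 'local'
            driverClassName = 'org.postgresql.Driver'
            url = "jdbc:postgresql://localhost:5432/myapp_test_${suffix}"
            username = 'myapp'
            password = 'secret'
        }
    }
}
```
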
In the example above I use the same environment for both the CI servers and developer testing, so I include the Elvis operator ("?:") as a fallback in case a suffix is not provided. It's not necessary if your CI servers specify a separate environment.
Hopefully this example can help you set up a configuration that is more maintainable and robust to errors, so you can keep testing away.