Loading test BLOB data into HSQLDB with a Groovy Ant Task

At my current client we use an in-memory HSQLDB for unit testing. An in-memory DB gives us a way to test DAOs and services against a known dataset. Overall this has worked out very well for our unit testing.

Until recently we used only the HSQL jar file to create the schema and the DBUnit Ant task to load the data from .csv files. The Ant targets looked something like the following.

Ant INIT target

<target name="init">
  <property name="hsql.url" value="jdbc:hsqldb:hsql://localhost:9001/testSchema" />
  <property name="hsql.username" value="testUser" />
  <property name="hsql.password" value="testPass" />
  <property name="hsql.dir" value="./hsqldb" />
  <property name="hsql.jar" value="${hsql.dir}/lib/hsqldb-1.8.0.7.jar" />
  <property name="ddl.dir" value="./ddl" />
  <property name="data.dir" value="./data" />
</target>

Ant Target to load the DDL

<target name="hsql-load-schema"
           depends="init"
      description="Loads schema into hsql.">

  <!-- load ddl -->
  <java jar="${hsql.jar}" fork="true" failonerror="true">
    <arg value="--rcFile" />
    <arg value="${hsql.dir}/sqltool.rc" />
    <arg value="testSchema" />
    <arg value="${ddl.dir}/base-schema.ddl" />
  </java>
</target>
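For completeness, the CSV data itself was loaded with the DBUnit Ant task. A rough sketch of such a target follows; the target name is illustrative, and it assumes the dbunit jar is already on Ant's classpath and that ${data.dir} contains the .csv files along with DBUnit's table-ordering.txt:

<target name="hsql-load-data"
           depends="init"
      description="Loads test data into hsql from csv files.">

  <taskdef name="dbunit"
           classname="org.dbunit.ant.DbUnitTask" />

  <!-- CLEAN_INSERT deletes existing rows, then inserts the csv data -->
  <dbunit driver="org.hsqldb.jdbcDriver"
          url="${hsql.url}"
          userid="${hsql.username}"
          password="${hsql.password}">
    <operation type="CLEAN_INSERT"
               format="csv"
               src="${data.dir}" />
  </dbunit>
</target>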

This solution worked great for the data types we normally use. Recently, however, we needed to add support for logos stored as JPEGs. In the DB the JPEG files are stored as BLOBs. From what I could find, DBUnit wasn't an option for loading BLOB data types from image files in the manner we were familiar with.

Our solution for loading images into a BLOB column was to use the Groovy Ant task. This task allowed us to write Groovy code inside the same build.xml file we were already using to create the tables and load the data.

Updated Ant INIT target

<target name="init">
  <property name="hsql.url" value="jdbc:hsqldb:hsql://localhost:9001/testSchema" />
  <property name="hsql.username" value="testUser" />
  <property name="hsql.password" value="testPass" />

  <property name="hsql.dir" value="./hsqldb" />
  <property name="hsql.jar" value="${hsql.dir}/lib/hsqldb-1.8.0.7.jar" />
  <property name="ddl.dir" value="./ddl" />
  <property name="data.dir" value="./data" />

  <!-- Set up the Groovy path used to define the task -->
  <path id="groovy.path">
    <fileset dir="./lib">
      <include name="groovy-all-1.7.5.jar" />
    </fileset>
    <fileset dir="${hsql.dir}/lib">
      <include name="hsqldb-1.8.0.7.jar" />
    </fileset>
  </path>

  <!-- Define the Groovy task -->
  <taskdef name="groovy"
             classname="org.codehaus.groovy.ant.Groovy"
          classpathref="groovy.path" />
</target>

Groovy Ant task to load BLOBs

<target name="hsql-load-logos"
           depends="init"
      description="Loads Logos into the DB.">
<!-- Below is a Groovy Ant task that allows Groovy code to be executed.
The script uses Groovy to load an image into a BLOB DB column.   -->

    <groovy>
      import groovy.sql.Sql
      import org.hsqldb.jdbcDriver

      def sql = Sql.newInstance("jdbc:hsqldb:hsql://localhost:9001/testSchema",
                                "testUser", "testPass", "org.hsqldb.jdbcDriver")

      // Read the directory containing the images.
      // Image files are prefixed with a 4-digit school identifier.
      def dir = new File('../finance-student-test/blobs/logos')
      dir.eachFile {
        if (it.isFile()) {
          println "----  processing file ---- " + it.name

          def schoolId = it.name.substring(0, 4)

          // The table was not set up with a sequence to generate the next value,
          // so we need to get the current max LOGO_ID value for the school.
          def firstRow = sql.firstRow("""select LOGO_ID
                                           from LOGO_CNTRL
                                          where SCHOOL_ID = ${schoolId}
                                          order by LOGO_ID desc""")

          def maxId = 0
          if (firstRow != null) {
            maxId = firstRow.get("LOGO_ID")
          }

          // Effective-date range for the logo (sample values).
          def beginDate = new java.sql.Date(System.currentTimeMillis())
          def endDate = java.sql.Date.valueOf('2099-12-31')

          // Get the file as a FileInputStream.
          InputStream fis = new FileInputStream(it)

          // Create a PreparedStatement to do the insert.
          // LOGO_TYPE_ID and LOGO_FILE_TYPE_ID are hardcoded in the insert statement.
          def ps = sql.connection.prepareStatement(
              """insert into LOGO_CNTRL
                   (SCHOOL_ID, LOGO_ID, LOGO_TYPE_ID, LOGO_FILE_TYPE_ID,
                    LOGO_BEGIN_DATE, LOGO_END_DATE, LOGO_FILE)
                 values (?, ?, 101001, 42, ?, ?, ?)""")

          ps.setObject(1, schoolId)
          ps.setObject(2, maxId + 1)
          ps.setObject(3, beginDate)
          ps.setObject(4, endDate)

          // Set the BLOB with the FileInputStream.
          ps.setBinaryStream(5, fis, (int) it.length())

          ps.execute()
          sql.commit()
          ps.close()
          fis.close()
        }
      }
      sql.close()
  </groovy>
</target>
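As an aside, the max-ID lookup in the script is only needed because LOGO_ID has no generator. If you control the schema, HSQLDB can assign the ID for you with an IDENTITY column; a sketch of that alternative is below (remaining columns omitted). Note that our LOGO_ID was numbered per school, so a table-wide IDENTITY did not fit our case directly:

-- Alternative: let HSQLDB generate LOGO_ID automatically.
CREATE TABLE LOGO_CNTRL (
  LOGO_ID INTEGER IDENTITY,
  SCHOOL_ID CHAR(4)
  -- remaining columns as before
)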

If you have a better way to solve this problem I would encourage you to share it in the comments below.
