Integration tests: connection to tcp server from two processes does not work

Ivan Prosperi

Jun 7, 2020, 10:10:18 AM
to H2 Database
Hi everyone.

First of all I would like to thank the developers/maintainers of H2 database. It has always served me perfectly for my needs (especially for automated testing of persistence layer).

I am running H2 database version 1.4.200 and using Maven to manage dependencies. I am using Java 8 and running on Ubuntu 18.04.

I am currently in this situation:
  • I am developing a Java EE 8 REST application which uses a database
  • I am using Hibernate for persistence
  • I am using Wildfly 18 as application server
  • I want to test the endpoints of my application (using Arquillian), e.g. test that if some entities are in the database and I call the respective endpoint, the entities are returned.

My integration tests for endpoints are structured in the following way:
  • Start an H2 TCP server
  • Create a non-managed EntityManagerFactory with a persistence.xml referring to the H2 database, which will be used for populating the database in the tests
  • Use Arquillian to deploy a micro-deployment for the application on Wildfly
  • The deployment will internally use another persistence.xml, with a connection to the same H2 database 
  • (The Arquillian tests are executed outside the container, with @RunAsClient and @Deployment(testable = false))

To my understanding, at this point I should have two connections to a single H2 database (running in the TCP server):
  1. The first one using a non-managed EntityManagerFactory, which should be used to insert instances in the database from the tests. I will call this "external" connection
  2. The second one, using a managed Factory, handled by the Wildfly environment, which is used internally by the deployed application when I call the REST API from outside the container. I will call this "internal" connection, in that it is used inside the deployed war application
The problem is that I cannot establish the two connections to the same database: from the logs I can see that two connections are created for the same database (at least it seems so), but when I add some entities to the database from the test case using the "external" connection, no entities are retrieved from the "internal" connection.

So it seems that two separate databases are created, instead of the single one that I need.

The configurations that I tried are the following:
  1. In-Memory. Using an in-memory database, following the recommended approach:
    • Create the "external" EntityManagerFactory manually, using
      "jdbc:h2:mem:H2-db;DB_CLOSE_DELAY=-1"
      as url inside persistence.xml. This should create the database
    • Start a TCP server in the test case programmatically:
      Server.createTcpServer("-tcp", "-tcpAllowOthers", "-tcpPort", "9128", "-trace");
    • Use a tcp url for the persistence.xml url inside the micro-deployment:
      "jdbc:h2:tcp://localhost:9128/mem:H2-db;DB_CLOSE_DELAY=-1;IFEXISTS=TRUE"
    • Let Arquillian deploy the micro-deployment (I can see the logs that the persistence unit is created successfully)
  2. Persistent. Using persistent databases:
    • Manually start the server with the following command:
    • java -cp h2-1.4.200.jar org.h2.tools.Server -tcp -tcpAllowOthers -tcpPort 9128 -trace -web -webAllowOthers
    • Use the same TCP url for both persistence.xml files:
      "jdbc:h2:tcp://localhost:9128/~/H2-db;DB_CLOSE_DELAY=-1;IFEXISTS=TRUE"
      As far as I understand, this means that both connections are made to the same "remote" database, as in the normal server mode of other DBMSs.
    • Let Arquillian deploy the micro-deployment (I can see the logs that the persistence unit is created successfully)
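The two persistence.xml files in the in-memory configuration can be sketched as follows (the unit names are illustrative and only the JDBC-related elements are shown):

```xml
<!-- "external" unit, used directly by the test code -->
<persistence-unit name="test-external" transaction-type="RESOURCE_LOCAL">
  <properties>
    <property name="javax.persistence.jdbc.driver" value="org.h2.Driver"/>
    <property name="javax.persistence.jdbc.url"
              value="jdbc:h2:mem:H2-db;DB_CLOSE_DELAY=-1"/>
  </properties>
</persistence-unit>

<!-- "internal" unit, bundled in the Arquillian micro-deployment -->
<persistence-unit name="test-internal" transaction-type="JTA">
  <properties>
    <property name="javax.persistence.jdbc.url"
              value="jdbc:h2:tcp://localhost:9128/mem:H2-db;DB_CLOSE_DELAY=-1;IFEXISTS=TRUE"/>
  </properties>
</persistence-unit>
```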

Unfortunately, both approaches seem correct but do not work: they do not throw errors, and from the logs I can see both persistence units being created successfully, but if I insert entities from one persistence unit, the other does not see them, as if they were linked to different databases.

I would prefer an in-memory approach, as these are data used only for tests and can be destroyed. I only tried the persistent approach because the in memory one did not work.

I hope I made myself clear, because it is a quite complex situation for myself too.

Ivan Prosperi

Jun 10, 2020, 4:40:58 AM
to H2 Database
Hi everyone.

I found the problem with my configuration: Wildfly only accepts database connections through datasources and I was not using a datasource for the in-deployment connection. So this first problem is solved.
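For reference, a minimal sketch of a *-ds.xml datasource descriptor that can be bundled in the micro-deployment (the JNDI name and credentials are illustrative; WildFly ships an "h2" driver by default):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- test-ds.xml, added to the Arquillian micro-deployment -->
<datasources xmlns="http://www.jboss.org/ironjacamar/schema">
  <datasource jndi-name="java:jboss/datasources/TestDS" pool-name="TestDS">
    <connection-url>jdbc:h2:tcp://localhost:9128/mem:arquillian;DB_CLOSE_DELAY=-1</connection-url>
    <driver>h2</driver>
    <security>
      <user-name>sa</user-name>
      <password>sa</password>
    </security>
  </datasource>
</datasources>
```

The in-deployment persistence.xml then refers to it with <jta-data-source>java:jboss/datasources/TestDS</jta-data-source> instead of a raw JDBC URL.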


However I found another problem during the setup of integration tests.

I am able to:
  1. start a Server instance in the test case, using the appropriate API Server.createTcpServer(...)
  2. obtain a connection to an in-memory database using a local url "jdbc:h2:mem:arquillian;DB_CLOSE_DELAY=-1"
  3. use a remote url to use the same in-memory database inside the arquillian-managed micro-deployment: "jdbc:h2:tcp://localhost:9128/mem:arquillian;DB_CLOSE_DELAY=-1"
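The three steps above can be sketched as a single round trip (a minimal example; the database name, port, and table are illustrative, and the H2 jar must be on the classpath):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.h2.tools.Server;

public class H2TestServer {

    /** Starts the TCP server, writes through the local URL, reads back
     *  through the remote URL, and reports whether both see the same data. */
    public static boolean roundTrip() throws Exception {
        // 1. Start a TCP server in this JVM (port 9128 is illustrative)
        Server server = Server.createTcpServer("-tcpPort", "9128");
        server.start();
        try (// 2. Local URL creates the in-memory database;
             //    DB_CLOSE_DELAY=-1 keeps it alive between connections
             Connection local = DriverManager.getConnection(
                     "jdbc:h2:mem:arquillian;DB_CLOSE_DELAY=-1");
             // 3. Remote URL reaches the same database through the server
             Connection remote = DriverManager.getConnection(
                     "jdbc:h2:tcp://localhost:9128/mem:arquillian;DB_CLOSE_DELAY=-1")) {
            try (Statement s = local.createStatement()) {
                s.execute("CREATE TABLE IF NOT EXISTS T(ID INT)");
                s.execute("INSERT INTO T VALUES (42)");
            }
            try (Statement s = remote.createStatement();
                 ResultSet rs = s.executeQuery("SELECT ID FROM T")) {
                return rs.next() && rs.getInt(1) == 42;
            }
        } finally {
            server.stop();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip());
    }
}
```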
But I would like to automatically start the server during the Maven build, and I found that this can be done with the "java" goal of the "exec-maven-plugin" (link), as this goal uses the same JVM as the tests.

So I configured the plugin with the following code:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <id>pre-integration-test</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>java</goal>
      </goals>
      <configuration>
        <mainClass>org.h2.tools.Server</mainClass>
        <commandlineArgs>-tcp -tcpDaemon -tcpAllowOthers -tcpPort 9128 -trace</commandlineArgs>
        <classpathScope>test</classpathScope>
        <cleanupDaemonThreads>false</cleanupDaemonThreads>
      </configuration>
    </execution>
  </executions>
</plugin>

I would have expected this to work for in-memory databases, as the java goal executes the server in the same process as the tests. Unfortunately, this does not work.

With lsof I can see that the server is listening on 9128, but no other process uses that port, so the connections do not go through that server:
COMMAND   PID     USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
java    24736 ivan94fi  218u  IPv6 446093      0t0  TCP *:9128 (LISTEN)


Instead, if I run the server inside the test case with Server.createTcpServer(...).start(), the output of lsof is the following:
COMMAND   PID     USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
java    24736 ivan94fi  218u  IPv6 446093      0t0  TCP *:9128 (LISTEN)
java    24736 ivan94fi  220u  IPv6 444986      0t0  TCP localhost:9128->localhost:37394 (ESTABLISHED)
java    24777 ivan94fi  589u  IPv4 448079      0t0  TCP localhost:37394->localhost:9128 (ESTABLISHED)


The only way I managed to make the exec plugin work was by using remote TCP URLs for both connections and adding the parameter "-ifNotExists" to the server creation, so that the database can be created from the remote URL.

As this last approach does not seem secure, I would prefer to avoid it. Is it possible to start the server for an in-memory database in some other way that does not clutter the test case code? (Mostly because I am planning to use other databases for tests, so I don't want the code for H2 server creation in tests that use another database.)

Evgenij Ryazanov

Jun 10, 2020, 4:58:34 AM
to H2 Database
Hello.

Why are you using -tcpAllowOthers? Do you really need to connect to your testing system from other hosts? Combining this setting with -ifNotExists effectively creates a remote security hole on your system. -ifNotExists should not normally be used even for local connections; a local security hole is not as critical as a remote one, but it could still be exploited in some cases.

Actually you should start a separate Java process with the following code:

// Create a local database
DriverManager.getConnection("jdbc:h2:mem:H2-db;DB_CLOSE_DELAY=-1", "user", "password").close();
// Create the TCP Server
org.h2.tools.Server server = org.h2.tools.Server.createTcpServer("-tcpPort", "9128");
server.start();

If you need to use -tcpAllowOthers, specify a strong password in this connection.

Other processes must use a remote URL instead (jdbc:h2:tcp://localhost:9128/mem:H2-db).

Another option is to start the H2 Server process and create a database in it with the H2 Console or from the context menu of its tray icon.

Ivan Prosperi

Jun 10, 2020, 11:33:19 AM
to H2 Database
Hi Evgenij, thank you for your answer.

I have been working on my project since you posted your answer and now I can say I have solved the problem by following your suggestion of creating the database in another process, with a local url.

The only modifications I needed from your code were the -tcpDaemon option (otherwise the exec-maven-plugin goal blocks during execution) and the -tcpPassword option when creating and shutting down the server.

Actually the whole workflow was successful even without a TCP password, but the shutdown method Server.shutdownTcpServer would fail with the following error if the password used is a blank String (""):
org.h2.jdbc.JdbcSQLInvalidAuthorizationSpecException: Wrong user name or password [28000-200]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:461)
at org.h2.message.DbException.getJdbcSQLException(DbException.java:429)
at org.h2.message.DbException.get(DbException.java:205)
at org.h2.message.DbException.get(DbException.java:181)
at org.h2.message.DbException.get(DbException.java:170)
at org.h2.engine.Engine.validateUserAndPassword(Engine.java:357)
at org.h2.engine.Engine.createSessionAndValidate(Engine.java:176)
at org.h2.engine.Engine.createSession(Engine.java:166)
at org.h2.server.TcpServerThread.run(TcpServerThread.java:168)
at java.lang.Thread.run(Thread.java:748)

at org.h2.message.DbException.getJdbcSQLException(DbException.java:461)
at org.h2.engine.SessionRemote.done(SessionRemote.java:611)
at org.h2.engine.SessionRemote.initTransfer(SessionRemote.java:147)
at org.h2.engine.SessionRemote.connectServer(SessionRemote.java:435)
at org.h2.engine.SessionRemote.connectEmbeddedOrServer(SessionRemote.java:321)
at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:173)
at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:152)
at org.h2.Driver.connect(Driver.java:69)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.h2.server.TcpServer.shutdown(TcpServer.java:466)
at org.h2.tools.Server.shutdownTcpServer(Server.java:381)
at it.unifi.dinfo.stlab.viola.backend.H2Manager.stopDB(H2Manager.java:49)
at it.unifi.dinfo.stlab.viola.backend.H2Manager.main(H2Manager.java:65)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:254)
at java.lang.Thread.run(Thread.java:748)

By specifying a password both at creation and at shutdown time, I was able to get rid of this error. Maybe this could be a bug?
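The start/stop pair with a matching TCP password can be sketched as follows (the port and the password value are illustrative, and the H2 jar must be on the classpath):

```java
import org.h2.tools.Server;

public class H2Shutdown {

    /** Starts a TCP server with a password and shuts it down again;
     *  returns true if the shutdown succeeded. */
    public static boolean startAndStop() throws Exception {
        // The password given at creation time...
        Server server = Server.createTcpServer(
                "-tcpPort", "9128", "-tcpPassword", "secret");
        server.start();
        // ...must also be supplied at shutdown time, otherwise the call
        // fails with "Wrong user name or password"
        Server.shutdownTcpServer("tcp://localhost:9128", "secret", true, false);
        return true;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(startAndStop());
    }
}
```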

Furthermore, I have removed the -tcpAllowOthers flag, as you suggested, because I actually did not need it. I originally added it because I thought it was mandatory when using the TCP server.

For future reference, I report here the approach that works for me to use H2 as the database in my integration tests using Arquillian. This assumes the need to test a Java EE REST application using Arquillian and JPA.
  • create a datasource to deploy inside Arquillian micro-deployment. This uses the tcp url to the database
  • create a utility class to manage server creation and shut down. There are some examples on StackOverflow (e.g. this answer)
  • in the utility class, use the code by Evgenij (previous post) to create the database with DriverManager.getConnection(...).close(). Here use a local url (not involving tcp)
  • in the same class and method, just after creating the database, start the server (with -tcpDaemon option)
  • configure exec-maven-plugin to run in the pre-integration-test phase. It should run your utility class method that creates the database and starts the server. Specify <cleanupDaemonThreads>false</cleanupDaemonThreads> in the configuration of the plugin (e.g. this SO answer)
  • configure exec-maven-plugin to run in post-integration-test phase. It should run your utility class method that shuts down the tcp server
  • In your test case, create an EntityManagerFactory that uses a persistence unit with a tcp url to the database created in the utility class

Now you should be able to use the EntityManager inside the test case to populate the database, and the same database is used by the war deployed on the Application Server by Arquillian.
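The two exec-maven-plugin executions in the list above can be sketched as follows (the H2Manager class name and its start/stop argument convention are hypothetical):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <id>start-h2</id>
      <phase>pre-integration-test</phase>
      <goals><goal>java</goal></goals>
      <configuration>
        <mainClass>com.example.H2Manager</mainClass>
        <arguments><argument>start</argument></arguments>
        <classpathScope>test</classpathScope>
        <cleanupDaemonThreads>false</cleanupDaemonThreads>
      </configuration>
    </execution>
    <execution>
      <id>stop-h2</id>
      <phase>post-integration-test</phase>
      <goals><goal>java</goal></goals>
      <configuration>
        <mainClass>com.example.H2Manager</mainClass>
        <arguments><argument>stop</argument></arguments>
        <classpathScope>test</classpathScope>
      </configuration>
    </execution>
  </executions>
</plugin>
```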

Thank you again for your assistance.

Greetings,
Ivan

Evgenij Ryazanov

Jun 10, 2020, 11:58:05 AM
to H2 Database
You must specify a TCP password to use shutdownTcpServer(), for security reasons.

Ivan Prosperi

Jun 10, 2020, 3:16:25 PM
to H2 Database
Sorry, my bad. I completely missed that. When I hit the problem I only looked in the javadocs, and I did not find that information there.