
Writing integration tests with an in-memory Mongo DB

As I mentioned in my previous post, I've been working closely with the document-oriented NoSQL database Mongo DB lately. As an advocate of sustainable systems development (test driven development, that is), I took the lead in our team on designing tests for our business logic towards Mongo. For relational databases there are a lot of options; a common solution is to use an in-memory database such as H2. For NoSQL databases the options are not always as generous from an automated test perspective. Luckily, we found an in-memory version of Mongo DB from Flapdoodle which is easy to use and fits our use case perfectly. If your (Java) code base relies on Maven, just add the following dependency:

<dependency>
    <groupId>de.flapdoodle.embed</groupId>
    <artifactId>de.flapdoodle.embed.mongo</artifactId>
    <version>1.27</version>
    <scope>test</scope>
</dependency>

Then, in your test class (here based on JUnit), use the provided classes to start an in-memory Mongo DB, preferably in a @Before method, and tear down the instance in an @After method, as in the example below:

public class TestInMemoryMongo {

    private static final String MONGO_HOST = "localhost";
    private static final int MONGO_PORT = 27777;
    private static final String IN_MEM_CONNECTION_URL = MONGO_HOST + ":" + MONGO_PORT;

    private MongodExecutable mongodExe;
    private MongodProcess mongod;
    private Mongo mongo;

    /**
     * Start in-memory Mongo DB process
     */
    @Before
    public void setup() throws Exception {
        MongodStarter runtime = MongodStarter.getDefaultInstance();
        mongodExe = runtime.prepare(new MongodConfig(Version.V2_0_5, MONGO_PORT, Network.localhostIsIPv6()));
        mongod = mongodExe.start();
        mongo = new Mongo(MONGO_HOST, MONGO_PORT);
    }

    /**
     * Shutdown in-memory Mongo DB process
     */
    @After
    public void teardown() throws Exception {
        if (mongod != null) {
            mongod.stop();
            mongodExe.stop();
        }
    }

    @Test
    public void shouldAssertSomeInteractionWithMongo() {
        // Create a connection to Mongo using the IN_MEM_CONNECTION_URL property
        // Execute some business logic towards Mongo
        // Assert the expected behaviour
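        // The lines below are an illustrative sketch only (not part of the original
        // post): interact with the in-memory instance directly through the driver,
        // using a hypothetical "users" collection in a "test" database.
        DB db = mongo.getDB("test");
        DBCollection users = db.getCollection("users");
        users.insert(new BasicDBObject("name", "test-user"));
        Assert.assertEquals(1, users.count());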
    }

    protected MongoPersistenceContext getMongoPersistenceContext() {
        // Returns an instance of the class containing business logic towards Mongo
        return new MongoPersistenceContext();
    }
}

… then you have an in-memory Mongo DB available for your test cases. The private member named “mongo” in the example above is of type com.mongodb.Mongo, which you can use to interact with the Mongo database. As you can see, all you need is basically JUnit and Mongo, nothing more. But life is not always as easy as the simplest examples. In our case, we make use of EJBs and other components which rely on running inside a container. As I'm a contributor to the JBoss Arquillian project, a Java framework for integration tests, I was obviously curious to try the approach of making the in-memory database available when executing tests inside the container. If you already use Arquillian like we do, the transition is smooth. Just make sure to put the in-memory Mongo and the Java driver on your classpath. The following Arquillian test extends the JUnit test class above, but with a bean injection for the business class under test:

@RunWith(Arquillian.class)
public class TestInMemoryMongoWithArquillian extends TestInMemoryMongo {

    @Deployment
    public static Archive getDeployment() {
        return ShrinkWrap.create(WebArchive.class, "mongo.war")
                .addClass(MongoPersistenceContext.class)
                .addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml")
                .addAsWebInfResource(new StringAsset("<web-app></web-app>"), "web.xml")
                .addAsLibraries(DependencyResolvers.use(MavenDependencyResolver.class)
                        .artifact("de.flapdoodle.embed:de.flapdoodle.embed.mongo:jar:1.27")
                        .artifact("org.mongodb:mongo-java-driver:jar:2.9.1")
                        .resolveAs(JavaArchive.class));
    }

    @Inject MongoPersistenceContext mpc;

    @Override
    protected MongoPersistenceContext getMongoPersistenceContext() {
        return mpc;
    }
}
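The MongoPersistenceContext class referenced in the tests above holds the actual business logic towards Mongo and is not reproduced in this post. To give an idea of its shape, here is a minimal hypothetical sketch (the class name comes from the tests, but the method, database and collection names below are purely illustrative, and a real implementation would typically have its connection details injected rather than hard-coded):

public class MongoPersistenceContext {

    /**
     * Illustrative example only: look up a user document by name in the
     * in-memory Mongo instance started by the test (localhost:27777).
     */
    public DBObject findUserByName(final String name) {
        try {
            Mongo mongo = new Mongo("localhost", 27777);
            DBCollection users = mongo.getDB("test").getCollection("users");
            return users.findOne(new BasicDBObject("name", name));
        } catch (UnknownHostException e) {
            throw new IllegalStateException("Unable to connect to Mongo", e);
        }
    }
}

In a real application you would of course manage the Mongo connection centrally instead of creating a new one per call, but the sketch shows where the business logic under test lives.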

I’ve uploaded a full example project, based on Java 7, Maven, Mongo DB, Arquillian and Apache TomEE (embedded) on GitHub to demonstrate how to use an in-memory Mongo DB in a unit test as well as with Arquillian inside TomEE. This should serve as a good base when starting to write automated tests for your business logic towards Mongo.

 

Tommy Tynjä
@tommysdk

Writing integration tests in Java with Arquillian

Arquillian is a JBoss project that focuses on integration testing for Java. It's an open source project I'm contributing to and using on the project I'm currently working on. It lets you write integration tests just as you would write unit tests, but adds some very important features into the mix. Arquillian actually lets you execute your test cases inside a target runtime, such as your application server of choice! It also lets you use dependency injection to test your services directly in your integration test. Arquillian comes with a whole bunch of adapters for different runtimes, such as JBoss 7, 6 and 5, GlassFish 3, Weld, WebSphere and even Selenium. So if you're working with a common runtime, it is probably already supported! If your runtime is not supported, you could contribute an adapter for it! What Arquillian basically needs to know is how to manage your runtime: how to start and stop it and how to do deployments and undeployments. Just put the adapter for your runtime on your classpath and Arquillian will handle the rest.

Let's get down to business. What does an Arquillian test look like? Imagine that we have a very basic EJB which we want to integration test:

@Stateless
public class WeekService {
  public String weekOfYear() {
      return Integer.toString(Calendar.getInstance().get(Calendar.WEEK_OF_YEAR));
  }
}

This EJB provides one single method which returns what week of the year it is. Sure, we could just as well unit test the functionality of this EJB, but imagine that we might want to extend this bean with more functionality later on and that we want to have a proper integration test already in place. What would an Arquillian integration test look like that asserts the behaviour of this bean? With JUnit, it would look something like this:

@RunWith(Arquillian.class)
public class WeekServiceTest {

    @Deployment
    public static Archive deployment() {
        return ShrinkWrap.create(JavaArchive.class, "week.jar")
                .addClass(WeekService.class);
    }

    @EJB WeekService service;

    @Test
    public void shouldReturnWeekOfYear() {
        Assert.assertNotNull(service.weekOfYear());
        Assert.assertEquals("" + Calendar.getInstance().get(Calendar.WEEK_OF_YEAR),
                service.weekOfYear());
    }
}

The first thing you notice is the @RunWith annotation at the top. This tells JUnit to let Arquillian manage the test. The next thing you notice is the static method annotated with @Deployment. This method creates a Java Archive (jar) representation containing the WeekService class which we want to test. The archive is created using ShrinkWrap, which lets you assemble archives through a fluent Java API. Arquillian will take this archive and deploy it to the target runtime before executing the test cases, and will undeploy the same archive from the runtime after the test execution. You are not forced to have a deployment method, as you might as well want to test something that is already deployed or available in your runtime. If you have a runtime that is already running somewhere, you can set up Arquillian to run in remote mode, which means that Arquillian expects the runtime to already be running and just does deployments and undeployments in the current environment. You could also tell Arquillian to run in managed mode, where Arquillian expects to be able to start the runtime before the test execution, and will also shut down the runtime when the test execution completes. Some runtimes also come with an embedded mode, which means that Arquillian will run its own isolated runtime instance during the test execution.

Probably the biggest difference compared to a standard unit test in the above example is the @EJB annotation on the WeekService property. It lets you specify that you want an EJB dependency injected into your test case when executing your tests, and Arquillian will handle that for you! If you want to make sure that is the case, just remove the @EJB annotation and you will indeed get a NullPointerException when calling service.weekOfYear() in your test method. A feature which is going to be added to a future version of Arquillian is to fail fast if dependencies can't be fulfilled (instead of throwing NPEs).

The @Test annotated method handles the actual test logic, just like a unit test. The difference here is that the test is actually executed within your target runtime!

This was just a basic example of what you can do with Arquillian. The more you work with it, the more you will start to recognize different use cases where integration tests become a natural way of testing the intended behaviour.

The full source code of the above example is available on GitHub: https://github.com/tommysdk/showcase/tree/master/arquillian/. It also contains a more sophisticated example which uses CDI and JPA within an integration test. The examples run on a JBoss 7.0.2 runtime, which comes bundled with the example. For more details and a deeper dive into Arquillian, visit the project website at http://arquillian.org.
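As a rough illustration of what the CDI flavour can look like (a hypothetical sketch, not the actual code from the repository), the deployment just needs to include an empty beans.xml to enable CDI, and the test can then use @Inject instead of @EJB:

@RunWith(Arquillian.class)
public class WeekServiceCdiTest {

    @Deployment
    public static Archive deployment() {
        // Adding an empty beans.xml enables CDI in the deployed archive
        return ShrinkWrap.create(JavaArchive.class, "week-cdi.jar")
                .addClass(WeekService.class)
                .addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml");
    }

    @Inject WeekService service;

    @Test
    public void shouldReturnWeekOfYear() {
        Assert.assertNotNull(service.weekOfYear());
    }
}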

 

Tommy Tynjä
@tommysdk

Configure datasources in JBoss AS 7.1

JBoss released their latest application server, 7.1.0, last week. It is a full-fledged Java EE 6 server (certified for the full Java EE 6 profile). As I wanted to start working with the application server right away, I downloaded it, and the first thing I needed to do was to configure my datasource. Setting up the datasources was simple and I'll cover the steps in the process in this blog post.

I'll use Oracle as an example in this blog post, but the same procedure worked with the other JDBC datasources I tried. First, add your driver as a module to the application server (if it does not already exist). Go to JBOSS_ROOT/modules and add a directory structure appropriate for your driver, such as com/oracle/ojdbc14/main (note the main directory at the end), which gives you the path JBOSS_ROOT/modules/com/oracle/ojdbc14/main.

In the main directory, add your driver jar-file and create a new file named module.xml. Add the following content to the module.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<module xmlns="urn:jboss:module:1.1" name="com.oracle.ojdbc14">
    <resources>
        <resource-root path="the_name_of_your_driver_jar_file.jar"/>
    </resources>
    <dependencies>
        <module name="javax.api"/>
        <module name="javax.transaction.api"/>
    </dependencies>
</module>

In the module.xml file, the module name attribute has to match the directory structure, and the resource-root path should point to the name of the driver jar-file.

As I wanted to run JBoss in standalone mode, I edited JBOSS_ROOT/standalone/configuration/standalone.xml and added my datasource under the datasources subsystem, such as:

<subsystem xmlns="urn:jboss:domain:datasources:1.0">
  <datasources>
    <datasource jndi-name="java:jboss/datasources/OracleDS" pool-name="OracleDS" enabled="true" jta="true" use-java-context="true" use-ccm="true">
      <connection-url>jdbc:oracle:thin:@my_url:my_port</connection-url>
      <driver>oracle</driver>
      ... other settings
    </datasource>
  </datasources>
  ...
</subsystem>

Within the same subsystem, in the drivers tag, specify your driver module:

<subsystem xmlns="urn:jboss:domain:datasources:1.0">
  ...
  <drivers>
    <driver name="oracle" module="com.oracle.ojdbc14">
      <driver-class>oracle.jdbc.OracleDriver</driver-class>
    </driver>
  </drivers>
</subsystem>

Note that the driver tag in your datasource should point to the driver name in standalone.xml, and your driver module should point to the name of the module in the appropriate module.xml file.

Please also note that if your JDBC driver is not JDBC4 compatible, you need to specify your driver-class within the driver tag, otherwise it can be omitted.

Now you have your datasources set up, and when you start your JBoss AS 7.1 instance you should see it start without any datasource related errors and with logging output along the lines of the following:
org.jboss.as.connector.subsystems.datasources (ServerService Thread Pool — 27) JBAS010403: Deploying JDBC-compliant driver class oracle.jdbc.OracleDriver
org.jboss.as.connector.subsystems.datasources (MSC service thread 1-1) JBAS010400: Bound data source [java:jboss/datasources/OracleDS]
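
Once the datasource is bound, application code can look it up by its JNDI name. As a minimal illustration (a hypothetical bean, assuming the OracleDS datasource configured above), a standard @Resource injection could look like this:

@Stateless
public class OrderRepository {

    // Inject the container-managed datasource by its JNDI name from standalone.xml
    @Resource(lookup = "java:jboss/datasources/OracleDS")
    private DataSource dataSource;

    public String currentDatabaseTime() throws SQLException {
        try (Connection connection = dataSource.getConnection();
                Statement statement = connection.createStatement();
                ResultSet result = statement.executeQuery("select sysdate from dual")) {
            return result.next() ? result.getString(1) : null;
        }
    }
}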

Tommy Tynjä
@tommysdk

An introduction to Java EE 6

Enterprise Java is really taking a giant leap forward with its latest specification, Java EE 6. What earlier (more or less) required third party frameworks to achieve is now available straight out of the box in Java EE. EJBs, for example, have gone from being cumbersome and complex to easy and lightweight, without compromises in functionality. For the last few years, every single project I've been working on has in one way or another incorporated the Spring framework, and especially its dependency injection (IoC) container. One of the best things with Java EE 6, in my opinion, is that Java EE now provides dependency injection straight out of the box, through the CDI (Contexts and Dependency Injection) API. With this easy to use, standardized and lightweight framework I can see how many projects could actually move away from depending on Spring for this reason alone. CDI is not enabled by default; to enable it you need to put a beans.xml file in the META-INF (or WEB-INF for web modules) folder of your module (the file can be empty though). With CDI enabled you can just inject your dependencies with the javax.inject.Inject annotation:

@Stateless
public class MyArbitraryEnterpriseBean {

   @Inject
   private MyBusinessBean myBusinessBean;

}

Also note that the above POJO is actually a stateless session bean thanks to the @Stateless annotation! No mandatory interfaces or ejb-jar.xml are needed.

Working with JSF and CDI is just as simple. Imagine that you have the following bean, where the javax.inject.Named annotation marks it as a CDI bean:

@Named("controller")
public class ControllerBean {

   public void happy() { ... }

   public void sad() { ... }
}

You could then invoke the methods from a JSF page like this:

<h:form id="controllerForm">
   <fieldset>
      <h:commandButton value=":)" action="#{controller.happy}"/>
      <h:commandButton value=":(" action="#{controller.sad}"/>
   </fieldset>
</h:form>

Among the other nice features of Java EE 6 is that EJBs are now allowed to be packaged inside a war. That alone can definitely save you from packaging headaches. Another step in making Java EE lightweight.

If you are working with servlets, there is good news for you too. The notorious web.xml is now optional, and you can declare a servlet as easily as:

@WebServlet("/myContextRoot")
public class MyServlet extends HttpServlet {
   ...
}

To start playing with Java EE 6 using Maven, you can just run mvn archetype:generate and select one of the jee6-x archetypes to get a basic Java EE 6 project structure, e.g. jee6-basic-archetype.

Personally I believe Java EE 6 is breaking new ground for enterprise Java. Java EE 6 is what J2EE was not: easy, lightweight, flexible and straightforward, and it has a promising future. Hopefully Java EE will from now on be the natural choice when building applications, instead of depending on a wide selection of third party frameworks, as has been the case in the past.

 

Tommy Tynjä
@tommysdk