Devsoap


To subscribe to this RSS feed add https://devsoap.com/rss to your RSS Feed reader!

Blog entries:

3 things to look out for when using Spring Framework

Learn to write Spring applications that will be a joy to evolve and improve in the future.

I have for some years been involved in evolving Java web applications, written with early versions of the Spring Framework, into more modern incarnations. In doing so I have repeatedly stumbled upon the same issues that make an application hard to maintain and develop further.

1 Database schema is exposed via API

This is the issue that causes by far the most problems for the clients I work with.

The major problem is that the Spring Framework tutorials drive developers towards this pattern. It might look like a simple and easy solution, and it will look good in a presentation, but once the application matures it turns into a real headache from both a security and a maintainability perspective.

A telltale code smell is when you start asking questions like this:

https://stackoverflow.com/q/26161080/4301255

There are multiple issues with exposing everything through the API layer.

From a security perspective the developer is no longer controlling what the application exposes via the API. All it takes is for a developer to add a new column with some sensitive data to the database and voilà, the sensitive information is immediately exposed via the end-point. To guard against this you would need API schema tests that thoroughly verify that no unknown fields appear in the returned JSON. Most applications I've seen so far lack even the most basic tests, not to mention these kinds of corner-cases.

Another issue, from a maintainability perspective, is that when the database structure is directly exposed via the end-point, you are also locking the API down to whatever the database schema happens to be on that day.

As the application matures it is very likely that at some point you will need a V2 of your API which returns a different JSON structure. This might happen when you add a new consumer of your service, like another micro-service or a mobile client. The consumer of the service might have needs that are totally different from how your database schema looks.

The way I've seen most developers solve this is to start adding extra transient fields to the existing entities returned by the repository. This of course affects V1 of the API, which starts receiving extra information it never expected. The new clients likewise start getting information from V1 of the API that they do not need. In the worst cases I've seen, after enough consumers and versions have been added, those entities need to be composed from almost every table in the database and the queries become slow and hard to understand.

Let's look at a better, proven solution for this!

If you do not want to run into the above problems you cannot take the shortcut the Spring tutorials show. Forget about RestRepository; consider it an abstraction meant for demo purposes only.

If you want an easier time maintaining and building on the API in the future, you will need a layered approach that separates the data from the data representation returned via the API.

Instead, you could use an approach like this:

  1. Use a Repository for the data layer only! Use it to fetch data, nothing more. A good indication of this is that your Entity classes contain no annotations related to the JSON representation (like @JsonProperty), only validation and query related annotations.
  2. Use a Service to mutate data and perform extra validations on it. A good indication this is done properly is that the service is the only class that accesses the repository. All data access goes through the service.
  3. Use a RestController to map the entities returned by the service to the API JSON. Your RestController methods return only DTO's (Data-Transfer-Objects), never entities. To convert between the entities and the DTO's use ModelMapper's TypeMaps! A good indication of proper DTO usage is that the DTO contains only JSON annotations and no database related annotations. Do not try to be smart and extend your Entity classes into DTO's ;) A minimal sketch of this layering is shown below.
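Here is that sketch, using ModelMapper's default mapping for brevity (a TypeMap can customize it further). All class, field and path names are illustrative, not taken from any real project:

@Entity
public class Customer {

    @Id
    @GeneratedValue
    private Long id;

    @NotBlank
    private String name;

    // Never exposed via the API unless explicitly added to a DTO
    private String socialSecurityNumber;

    // Getters and setters omitted for brevity
}

@Service
public class CustomerService {

    private final CustomerRepository repository;

    public CustomerService(CustomerRepository repository) {
        this.repository = repository;
    }

    // The service is the only class that touches the repository
    public Customer getCustomer(Long id) {
        return repository.findById(id).orElseThrow(IllegalArgumentException::new);
    }
}

// The DTO carries only JSON annotations, no persistence annotations
public class CustomerDto {

    @JsonProperty("name")
    private String name;

    // Getters and setters omitted for brevity
}

@RestController
public class CustomerController {

    private final CustomerService service;
    private final ModelMapper mapper = new ModelMapper();

    public CustomerController(CustomerService service) {
        this.service = service;
    }

    // The controller returns DTO's only, never entities
    @GetMapping("/api/v1/customers/{id}")
    public CustomerDto getCustomer(@PathVariable Long id) {
        return mapper.map(service.getCustomer(id), CustomerDto.class);
    }
}

Note the direction of the dependencies: the controller knows about the service and the DTO, the service knows about the repository and the entity, and nothing above the service ever touches the repository.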

Now, let's look at the problems we had before and see how they are solved with the above approach.

Now if a developer adds a new field to the database table, it is exposed to the application via the Entity class but the API remains unchanged. The developer has to manually add the field to the DTO if he wants to expose the information via the API as well. This forces the developer to think about what he is doing and whether it is a good idea or not.

Also the versioning problem goes away. Now that the API uses DTO's instead of Entity classes, a new Controller or method can easily be added with a different DTO composed from the Entity. This means the application can provide different versions of the API without changing the underlying data representation.
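For example, a hypothetical V2 endpoint could map the same entity to a different DTO, leaving V1 untouched (a sketch continuing the illustrative names from above):

// Hypothetical V2 method in the same controller; CustomerDtoV2 is a
// separate DTO composed from the same entity V1 uses
@GetMapping("/api/v2/customers/{id}")
public CustomerDtoV2 getCustomerV2(@PathVariable Long id) {
    return mapper.map(service.getCustomer(id), CustomerDtoV2.class);
}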

2 Data is only validated at database level or not at all

This is a common scenario I also see, where developers rely only on database constraints for data validation. Usually companies wake up to this only after a successful XSS or SQL injection attack. At that point the data is already full of garbage and it will be really hard to get it consistent and useful again.

A milder version of this is that validation is done at only one level, either at the API or the data level. The usual argument from developers is that it is wasteful to perform validation twice and that validating what comes in via the API should be enough.

However, I have seen many times how a simple coding mistake in the Controller or Service, or a security issue with the API validations, has let invalid data end up in the database when it could easily have been prevented.

If you have followed the solution in #1 and separated your API from your data layer, then solving this is as easy as applying the same validators to your DTO's as well as your Entity classes, and always using @Valid for every input DTO in your Controller.

A good sign is that every field in both your DTO's and your Entity classes has some validator attached. This is especially true for String fields.
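As a rough sketch of what this looks like in practice (the DTO, constraints and path are again illustrative):

// The same constraints are applied to the DTO as to the Entity
public class CustomerDto {

    @NotBlank
    @Size(max = 100)
    private String name;

    // Getters and setters omitted for brevity
}

@RestController
public class CustomerController {

    // @Valid rejects invalid input before the method body is ever
    // reached, regardless of what the database constraints look like
    @PostMapping("/api/v1/customers")
    public CustomerDto createCustomer(@Valid @RequestBody CustomerDto dto) {
        // Mapping and service call omitted for brevity
        return dto;
    }
}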

3 You don't need streams if you are not streaming data to the front-end or sending events

Most business applications today use REST as the main communication method, both between services and between front-end and back-end. Unless you are working with a fully event driven architecture, or you plan to move to one, you should not pay much attention to the hype around streams today.

Streams are useful for applications that process numerical data (mostly IoT), and most demos of streams revolve around exactly this scenario where you have a numerical stream you want to process. But that is far from the CRUD application most businesses are using Spring for today, and using the stream API to feed a CRUD application with data usually leads to these kinds of issues:

https://stackoverflow.com/a/46434000/4301255

On the surface this might seem neat. But let's dig into it a bit.

In the repository you now need to rely on query hints that a given database might or might not support. In this case the developer is instructing the database driver to return everything. Depending on how good the database driver is and how modern the database is (which might be pretty old in the enterprise), that might cause performance issues.

The service method no longer reveals what type of entities it handles. While with non-streaming operations the service would return a list or Stream of entities, here we are handed an output stream with no notion of what will happen to it. This would be a nightmare to test.

One key element of data persistence is transactions. Traditionally streaming and transactions have not worked together, and only recently has Spring gained some support for them for MongoDB and R2DBC, neither of which is widely used for enterprise data. You can read more about the support in the Pivotal blog https://spring.io/blog/2019/05/16/reactive-transactions-with-spring. To summarize: if you stream, you lose transactions.

Finally, let's look at the controller. We are returning a StreamingResponseBody instead of a clear list or stream of entities, again making testing harder and more opaque.

Remember KISS. If your application does not rely on data streams you don't need the Spring streaming API. But by all means do not confuse this with Java Streams, which can be useful when doing data transformations.
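To contrast with the streaming example above, here is a minimal sketch of the plain, non-streaming version of such an endpoint (the names are illustrative):

@RestController
public class ItemController {

    private final ItemService service;

    public ItemController(ItemService service) {
        this.service = service;
    }

    // Returns a plain list of DTO's: the return type documents itself,
    // a test can simply assert on the list, and the transaction
    // boundary stays in the service layer
    @GetMapping("/api/v1/items")
    public List<ItemDto> getItems() {
        return service.findAllItems();
    }
}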


I hope these three simple observations will help you build applications that not only work today, but will allow your application to grow and evolve without major refactoring and security issues. The best advice I can give you to achieve this: never adopt the vendor's latest flashy solution unless it brings significant improvements and remains easy to change in the future.


Product documentation now available!

Product documentation now available at docs.devsoap.com!

Using a library, extension or plugin can be hard without proper documentation, especially if you are just jumping into a new technology or language.

To better help developers get started with the DS products I am now happy to open up the Documentation site at docs.devsoap.com!

The documentation site offers full technical documentation into the details of how to use the products and in some cases how to develop them further.

The documentation site also offers the possibility to comment on specific pages, to help others or to ask for clarification on unclear topics.

I hope this will help everyone in getting to know the DS products and use them in your projects. If you have improvement ideas on how to make the documentation better feel free to comment below.

Happy reading!


TodoMVC: Fullstack serverless applications (Part 2: The REST API)

Learn how to write REST API's with serverless FN Project functions, as well as how to connect them to AWS DynamoDB.

In this article let's explore how we can build a REST API using FN functions. This is the second part of the TodoMVC app we started building in the previous post. A demo running this project can be found here.

To build our back-end we are going to do two things: set up the REST API end-points and connect them to DynamoDB, where we are going to store our todo items.

The API

The API we are going to create is a simple CRUD (Create-Read-Update-Delete) API for the Todo items.

There are multiple ways we could split this functionality up to FN functions.

  1. Handle all REST methods in one function
  2. Create one function for each REST method
  3. Create one function for read operations (GET) and one for write operations (PUT,POST,PATCH,DELETE)

Which approach you select depends on your use-case.

If we had a lot of business logic then option 2 might have been the better choice, as we could have split up our code by operation. However, we then wouldn't have been able to write a pure REST API, since every function needs a unique path, while in REST, for example, GET and POST requests often share the same path.

The third option might be interesting if we anticipated that our application would receive a lot of read requests but not that many writes. By splitting it this way we could load balance read operations differently from write operations, and perhaps add more FN servers for reads to provide better throughput. This approach has the same downside as option 2: you will not be able to write a pure REST API.

We are going to select option 1, as our business logic is really small and it allows us to use a single URL path for all the operations our TodoMVC app needs. We also don't anticipate a lot of requests, so we don't have to care about load balancing.

Before we continue, let's recap how our project structure looks after we added the UI logic in the previous post.

project-structure

So to add the API we start by creating a new submodule in the existing project for our back-end functionality.

todomvc-api-module

Next we will need to turn the module into a FN function to serve our REST API.

We start by removing any auto-generated src folder IntelliJ might have created for us. Then open up the api/build.gradle file and add the following content:

/* 
 * We use ReplaceTokens to replace property file placeholders
 */ 
import org.apache.tools.ant.filters.ReplaceTokens

/* 
 * Main FN function configuration
 */
fn {
    functionClass = 'TodoAPI'
    functionMethod = 'handleRequest'
    functionPaths = ['/items']
}

/**
 * Configure FN Function timeouts
 */
fnDocker {
    idleTimeout = 30
    functionTimeout = 60
}

dependencies {
    compile 'com.amazonaws:aws-java-sdk-dynamodb:1.11.490'
    compile 'org.slf4j:slf4j-simple:1.7.25'
}

/**
 * Replaces the AWS credential placeholders with real credentials
 */
processResources {
    from(sourceSets.main.resources.srcDirs){
        filesMatching('aws-credentials.properties'){
            filter(ReplaceTokens, tokens: [
                'aws.accessKeyId' : System.getenv('AWS_ACCESS_KEY_ID') ?: project.findProperty('aws.accessKeyId') ?: '',
                'aws.secretKey' : System.getenv('AWS_SECRET_ACCESS_KEY') ?: project.findProperty('aws.secretKey') ?: '',
                'aws.region' : System.getenv('AWS_REGION') ?: project.findProperty('aws.region') ?: ''
            ])
        }
    }
}

Finally, we just invoke the :api:fnCreateFunction task to create the function source stubs based on the build configuration we created above.

todomvc-create-api-function

Now our project structure looks like this:
Final project structure

We are now ready to implement the TodoAPI.

Persistence with AWS DynamoDB

Now that we have our function ready, let's implement the persistence layer.

The first thing we need is to model how the Todo items should look in AWS DynamoDB. We can do that by creating a model class (TodoItem.java) that specifies how a single item is modeled:

@DynamoDBTable(tableName = "todomvc")
public class TodoItem implements Serializable {

    private String id = UUID.randomUUID().toString();
    private boolean active = true;
    private String description;

    @DynamoDBHashKey
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    @DynamoDBAttribute
    public boolean isActive() { return active; }
    public void setActive(boolean active) { this.active = active; }

    @DynamoDBAttribute
    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description;}

    /**
     * Helper method to create a TodoItem from an InputStream
     */
    public static Optional<TodoItem> fromStream(InputStream stream) {
        try {
            return Optional.of(new ObjectMapper().readValue(stream, TodoItem.class));
        } catch (IOException e) {
            return Optional.empty();
        }
    }

    /**
     * Helper method to convert the items into a byte array
     */
    public Optional<byte[]> toBytes() {
        try {
            return Optional.of(new ObjectMapper().writeValueAsBytes(this));
        } catch (JsonProcessingException e) {
            return Optional.empty();
        }
    }
}

This is pretty much a standard POJO with some DynamoDB specific annotations to help serialize the object. Our model is pretty simple: every item only needs two fields to keep track of, description and active.

The id field is only there to help us uniquely identify an item so we can modify or remove it. We could just as well have used the description field as our DynamoDB key, but that would have implied that we wouldn't be able to store duplicate items in our todo list.

Now that we have our item model, let's get back to the API implementation.

For our todomvc application we will need to support the following actions:

  1. List all todo items (GET)
  2. Add a new item (POST)
  3. Update an existing item (PUT)
  4. Delete an item (DELETE)

To do that we are going to modify our function in TodoAPI.java a bit to handle all those cases with a switch-statement:

public OutputEvent handleRequest(HTTPGatewayContext context, InputEvent input) throws JsonProcessingException {
    switch (context.getMethod()) {
        case "GET": {
            return fromBytes(new ObjectMapper().writeValueAsBytes(getItems()), Success, JSON_CONTENT_TYPE);
        }
        case "POST": {
            return input.consumeBody(TodoItem::fromStream)
                    .map(this::addItem)
                    .flatMap(TodoItem::toBytes)
                    .map(bytes -> fromBytes(bytes, Success, JSON_CONTENT_TYPE))
                    .orElse(emptyResult(FunctionError));
        }
        case "PUT": {
            return input.consumeBody(TodoItem::fromStream)
                    .map(this::updateItem)
                    .flatMap(TodoItem::toBytes)
                    .map(bytes -> fromBytes(bytes, Success, JSON_CONTENT_TYPE))
                    .orElse(emptyResult(FunctionError));
        }
        case "DELETE": {
            return input.consumeBody(TodoItem::fromStream)
                    .map(this::deleteItem)
                    .flatMap(TodoItem::toBytes)
                    .map(bytes -> fromBytes(bytes, Success, JSON_CONTENT_TYPE))
                    .orElse(emptyResult(FunctionError));
        }
        default:
            return emptyResult(FunctionError);
    }
}

As you can see we start by modifying our function to inject the HTTPGatewayContext as well as the InputEvent so we can process the request. From the context we get the HTTP method used to call the function and from the input event we get the HTTP request body.

Next, depending on which HTTP method was used, we convert the HTTP body into our TodoItem model and save it to the database.

To help us understand how this gets saved to the database, let's look at the rest of TodoAPI.java:

public class TodoAPI {

    private static final String JSON_CONTENT_TYPE = "application/json";

    private final DynamoDBMapper dbMapper;

    public TodoAPI() {
        var awsProperties = getAWSProperties();
        var awsCredentials = new BasicAWSCredentials(
                awsProperties.getProperty("aws.accessKeyId"),
                awsProperties.getProperty("aws.secretKey"));
        var awsClient = AmazonDynamoDBClient.builder()
                .withRegion(awsProperties.getProperty("aws.region"))
                .withCredentials(new AWSStaticCredentialsProvider(awsCredentials))
                .build();

        dbMapper = new DynamoDBMapper(awsClient);
    }

    public OutputEvent handleRequest(HTTPGatewayContext context, InputEvent input) throws JsonProcessingException {
    // Implementation omitted
    }

    private List<TodoItem> getItems() {
        return new ArrayList<>(dbMapper.scan(TodoItem.class, new DynamoDBScanExpression()));
    }

    private TodoItem updateItem(TodoItem item) {
        dbMapper.save(item);
        return item;
    }

    private TodoItem addItem(TodoItem item) {
        dbMapper.save(item);
        return item;
    }

    private TodoItem deleteItem(TodoItem item) {
        dbMapper.delete(item);
        return item;
    }

    private static Properties getAWSProperties() {
        var awsProperties = new Properties();
        try {
           awsProperties.load(TodoAPI.class.getResourceAsStream("/aws-credentials.properties"));
        } catch (IOException e) {
            throw new RuntimeException("Failed to load AWS credentials!", e);
        }
        return awsProperties;
    }
}

As you probably noticed, we set up a DynamoDBMapper using the credentials we have stored in a file called aws-credentials.properties under our project resources.

If you check out the api/build.gradle file you will notice that we are populating the real credentials into the aws-credentials.properties file at build time.
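Given the ReplaceTokens configuration in that file, the template under src/main/resources presumably looks something like this, with the @...@ placeholders being the tokens that get replaced (a sketch, not pulled from the actual repository):

# src/main/resources/aws-credentials.properties
aws.accessKeyId=@aws.accessKeyId@
aws.secretKey=@aws.secretKey@
aws.region=@aws.region@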

Once we have the DynamoDBMapper it is a trivial task to query DynamoDB for items as well as add, update and remove items. The mapper will handle all communication for us.

Wrapping up

This is pretty much all it takes to create a REST API using an FN function.

We can now run the project as we did in the first part.

The difference now is that both the UI and the API functions will be deployed to the FN server. If you want to try out the REST API it will be available under http://localhost:8080/t/todomvc/items, for example with curl as shown below.
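Assuming the server runs locally on port 8080, listing and adding items could look like this (the JSON payload is just an example):

$ curl http://localhost:8080/t/todomvc/items

$ curl -X POST -H "Content-Type: application/json" \
       -d '{"description": "Buy milk"}' \
       http://localhost:8080/t/todomvc/items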

The sources for the full example, which you can check out and run directly, are available here. You will need valid AWS credentials to try out the example, as well as a DynamoDB instance to host your data.


Gradle Vaadin Flow 1.0 released!

Gradle Vaadin Flow 1.0 provides you with the most performant and easy to use build integration for Vaadin Flow applications today.

I'm happy to announce that the plugin has now reached production ready status!

After 17 pre-releases and countless rounds of testing and bugfixes it is about time the plugin gets a stable release. I know some of you have been eagerly waiting for this :)

It has been a joy working on the plugin, and a big thank you goes out to those who have tested the plugin and given excellent feedback at such an early stage of the project. I don't think it would have been possible to iron out most of the rough edges without your help.

A big thank you also goes out to the project sponsors who have made this project possible. By providing Open-Source sponsoring they have made it possible to work on this project and provide you with a Gradle integration for your Vaadin projects. If you want to join them, be sure to check out the Sponsorship page to find out how you too could help with the project funding.

For a short list of the features the plugin provides, and for more information in general, check out the product page.

But of course we are not done yet, we are only getting started!

Now it is your turn to take the plugin into use and give feedback on what is still missing or what does not work. If there is a feature or tweak you would like, or you spot a bug that is preventing you from using the plugin, be sure to submit an issue to the issue tracker over at Github.

To read more about the different releases and what they contained, be sure to check out the blog articles, example projects, or the project wiki.

Happy building!


TodoMVC: Fullstack serverless applications (Part 1: The UI)

Learn how to write fullstack serverless Java web applications with Fn Project and ReactJS.

In this two part blog series we are going to look at how we can serve a full web application using only FN Project Java serverless functions. We are going to do this by writing a classic TodoMVC application all the way from the UI with React to the persistence leveraging Amazon DynamoDB. In this first part we are going to focus on building the front-end, while in the second part we finish the application by creating an API for the UI.

Why serverless?

When thinking of "serverless" or FAAS (Function-As-A-Service) you might think that the primary benefit is its simplicity: you don't have to care about running an application server and can focus on writing application code. While that is partly true, I think there are other, more substantial benefits to consider.

Stateless

All serverless functions are stateless by design. Trying to save state in a function simply will not work, since after the function has executed the application is terminated, and along with it all the memory it consumed. This means a lot less worry about memory leaks or data leaks, and allows even junior developers to write safe applications.

Scalable

Serverless as a paradigm is similar to what micro-services provide: a way of cleanly separating functionality into smaller units, or Bounded Contexts as Eric Evans famously called them. Serverless functions allow you to do the same as micro-services, grouping functions into serverless applications (like the one I will be showing) with the benefit of writing less boiler-plate code than traditional micro-service frameworks.

Cost effective

A common way to host your applications is to purchase a VPS from a vendor like Digital Ocean, or to set up an Amazon EC2 instance, and what you pay for is ultimately how much memory and CPU you use. A common micro-service approach is then to deploy the application on an embedded application server like Jetty or Tomcat and further wrap that inside a Docker container. The downside of this is that once deployed it actively consumes resources even while nobody is using your application, and every micro-service contains a fully fledged application server. In contrast, serverless functions only consume resources while they are active, which means you only pay for what you actually need. You can optimize even further on a per-function basis: if you have split your application wisely into functions, the most used functionality gets higher priority (and resources) while the less used gets less.

Of course, using serverless functions is not a silver bullet and comes with some considerations.

If you have a high-volume application it might be wise to split it into a few micro-services that take the most load, as they are always active, and then implement serverless functions around those services for the less used functionality. It is also worth noting that serverless functions come with a ramp-up time: if the function is not hot (it hasn't been invoked in a while), it will take a few more milliseconds to start as the Docker container wakes up from hibernation, causing a slight delay. You can affect this by tweaking the function, but more about that later.

Creating our TodoMVC project

For those impatient ones who just want to browse the code, the full source code for this example can be found here: https://github.com/devsoap/examples/todomvc-fn-java.


Getting started

To create a new serverless app, create a new Gradle project in IntelliJ IDEA and select Java.

Next we will need to configure our Gradle build to create Serverless applications.

In the newly created project, open up the build.gradle file and replace its contents with the following:

plugins {
    // For support for Serverless FN applications
    id 'com.devsoap.fn' version '0.1.7' apply false
    // For support for fetching javascript dependencies
    id "com.moowork.node" version "1.2.0" apply false
}

group 'com.example'

subprojects {

    // Apply the plugin to all sub-projects
    apply plugin: 'com.devsoap.fn'

    // We want to develop with Java 11
    sourceCompatibility = 11
    targetCompatibility = 11

    // Add Maven Central and the FN Project repositories
    repositories {
        mavenCentral()
        fn.fnproject()
    }

    // Add the FN function API dependency
    dependencies {
        compile fn.api()
    }
}

As you probably already figured out, we are going to build a multi-module Gradle project where our sub-modules will be FN functions. To do that we leverage the Devsoap FN Gradle plugin as well as the Moowork Node plugin.

Also, you might want to remove any src folder that was generated for the parent project; our sources will live in the submodules.

Here is how the project will look.

Next, let's create our first function!

Right-click on the project and create a new UI module.

As we did before, remove any src folder which is automatically created.

Open up the ui/build.gradle file if it is not open yet, and replace the contents with the following:

apply plugin: 'com.moowork.node'

/**
 * Configure FN Function
 */
fn {
    // The name of the entrypoint class
    functionClass = 'TodoAppFunction'

    // To name of the entrypoint method
    functionMethod = 'handleRequest'

    // The available URL sub-paths
    functionPaths = [
            '/',
            '/favicon.ico',
            '/bundle.js',
            '/styles.css'
    ]
}

Let's take a look at what this means.

On the first line we are applying the Node Gradle plugin. We are later going to use it to compile our front-end React application.

Then we configure the Fn function.

functionClass will be the main class of our UI; this is the class that is called when somebody accesses our application.

functionMethod is the actual method that will get called. This will host our function logic.

functionPaths are all the sub-paths our function will listen to. We will have to implement some logic to handle all of these paths.

Right, now we have our function definition, but we don't yet have our function sources. Let's create them.

From the right-hand side Gradle navigation menu, open up the UI Fn task group and double-click on fnCreateFunction.

Let's have a look at the created function:

import static java.util.Optional.ofNullable;

public class TodoAppFunction {

    public String handleRequest(String input) {
        String name = ofNullable(input).filter(s -> !s.isEmpty()).orElse("world");
        return "Hello, " + name + "!";
    }
}

By default it generates a basic Hello-world type of function, which is not very exciting. Let's now add our function logic so it looks like this:

/**
 * Serves our react UI via a function call
 */
public class TodoAppFunction {

    private static final String APP_NAME = "todomvc";

    /**
     * Handles the incoming function request
     *
     * @param context
     *      the request context
     *
     * @return
     *      the output event with the function output
     */
    public OutputEvent handleRequest(HTTPGatewayContext context) throws IOException {
        var url = context.getRequestURL();
        var filename = url.substring(url.lastIndexOf(APP_NAME) + APP_NAME.length());
        if("".equals(filename) || "/".equals(filename)) {
            filename = "/index.html";
        }

        var body = loadFileFromClasspath(filename);

        var contentType = Files.probeContentType(Paths.get(filename));
        if(filename.endsWith(".js")) {
            contentType = "application/javascript";
        } else if(filename.endsWith(".css")) {
            contentType = "text/css";
        }

        return OutputEvent.fromBytes(body, OutputEvent.Status.Success, contentType);
    }

    /**
     * Loads a file from inside the function jar archive
     *
     * @param filename
     *      the filename to load, must start with a /
     * @return
     *      the loaded file content
     */
    private static byte[] loadFileFromClasspath(String filename) throws IOException {
        var out = new ByteArrayOutputStream();
        try(var fileStream = TodoAppFunction.class.getResourceAsStream(filename)) {
            fileStream.transferTo(out);
        }
        return out.toByteArray();
    }
}

Let's look at the function implementation a bit.

We create a helper method loadFileFromClasspath that will load any file from the current function classpath. By using the helper method we will be able to serve any static resources via our function.

Next, to the meat of it: the handleRequest method. This is the entry point where all requests made to the function will arrive.

If you remember from the function definition we did previously, we assigned four sub-paths to the URL: '/', '/favicon.ico', '/bundle.js' and '/styles.css'. What we do in handleRequest is simply examine the incoming URL and extract the filename from it. Then we load the file from our classpath. In essence, the function we have created is a static file loader!

What about security? Does this mean that you can now load any file via this function? The answer is of course no: you will only be able to call the function with the sub-paths given in the function definition. Any other paths will simply never reach this function.

Including the static files

We now have our function, but it will not yet return anything as we don't yet have the static files we have defined in our function definition.

Let's start with the bootstrap HTML file we want to serve.

We create a file named index.html and place it under src/main/resources. By placing the file there it will be included in our function resources and can be found on the classpath by the function we defined above.

<!DOCTYPE html>
<html lang="en">
    <head>
        <meta charset="UTF-8">
        <title>TodoMVC - A fully serverless todo app!</title>
        <link rel="shortcut icon" href="todomvc/favicon.ico" />
        <link rel="stylesheet" type="text/css" href="todomvc/styles.css">
    </head>
    <body>
        <div id="todoapp" />
        <script src="todomvc/bundle.js"></script>
    </body>
</html>

Pretty basic stuff: we define a favicon and a CSS style sheet in the head section, and in the body we define the root div-element and the bundle.js script that bootstraps our React app.

Next we create a CSS file under src/main/resources and call it styles.css. In it we define some styles for the application:

body {
    background: #f5f5f5;
    font-weight: 100;
}
.container {
    background: #fff;
    width:600px;
    margin-left: auto;
    margin-right: auto;
}
h3 {
    color: rgba(175, 47, 47, 0.15);
    font-size: 100px;
    background: #f5f5f5;
    text-align: center;
    margin: 0;
}
.inner-container {
    border: 1px solid #eee;
    box-shadow: 0 0 2px 2px #eee
}
#new-todo {
    background: none;
    margin-top:10px;
    margin-left:5%;
    width:95%;
    font-size: 24px;
    height: 2em;
    border: 0;
}
.items {
    list-style: none;
    font-size: 24px;
}
.itemActive {
    width: 2em;
    height: 2em;
    background-color: white;
    border-radius: 50%;
    vertical-align: middle;
    border: 1px solid #ddd;
    -webkit-appearance: none;
    outline: none;
    cursor: pointer;
    margin-right:20px
}
.itemActive:checked {
    background-color: lightgreen;
}
.itemRemove {
    float:right;
    margin-right: 20px;
    color: lightcoral;
    text-decoration: none;
}
footer {
    width:100%;
    height:50px;
    line-height: 50px;
    color: #777
}
.itemsCompleted {
    padding-left: 20px;
}
.activeStateFilter {
    width:70%;
    float: right
}
.stateFilter {
    margin:10px;
    cursor: pointer;
    padding: 2px;
}

.stateFilter.active {
    border: 1px solid silver;
    border-radius: 4px
}

If you've done any webapps before this shouldn't be anything new.

Finally we download a nice favicon.ico file for our application and also place it under src/main/resources. You can find some nice ones from https://www.favicon.cc or design a new one yourself if you are the creative type. For our demo I chose this one.

Building the UI with React and Gradle

Now that we have our static files in place we still need to build our front-end React application.

We start by defining our front-end dependencies in a file called package.json in the root folder of the UI project. It will look like this:

{
    "name": "ui",
    "version": "1.0.0",
    "main": "index.js",
    "license": "MIT",
    "babel": {
        "presets": [
            "@babel/preset-env",
            "@babel/preset-react"
        ]
    },
    "scripts": {
        "bundle": "webpack-cli --config ./webpack.config.js --mode=production"
    },
    "devDependencies": {
        "@babel/core": "^7.2.2",
        "@babel/preset-env": "^7.3.1",
        "@babel/preset-react": "^7.0.0",
        "babel-loader": "^8.0.5",
        "css-loader": "^2.1.0",
        "html-webpack-inline-source-plugin": "^0.0.10",
        "html-webpack-plugin": "^3.2.0",
        "style-loader": "^0.23.1",
        "webpack": "^4.29.0",
        "webpack-cli": "^3.2.1"
    },
    "dependencies": {
        "babel": "^6.23.0",
        "babel-core": "^6.26.3",
        "react": "^16.7.0",
        "react-dom": "^16.7.0",
        "whatwg-fetch": "^3.0.0"
    }
}

This should be a very standard set of dependencies when building React apps.

Next we are going to use Webpack and Babel to bundle all our Javascript source files into one single bundle.js that also will get included in our static resources.

To do that we need to create another file, webpack.config.js in our UI root folder to tell the compiler how to locate and bundle our javascript files. In our case it will look like this:

var path = require('path');

module.exports = {
    entry: [
        'whatwg-fetch',
        './src/main/jsx/todo-app.js'
    ],
    output: {
        path: path.resolve(__dirname, './build/resources/main'),
        filename: 'bundle.js'
    },
    module: {
       rules: [
         {
           test: /\.(js|jsx)$/,
           exclude: /node_modules/,
           use: ['babel-loader']
         }
       ]
     },
     resolve: {
       extensions: ['*', '.js'],
     }
}

There are two noteworthy things I should mention about this.

In the entry section we are pointing to a javascript source file that will act as our main application entry point. In a moment we are going to create that file.

In output we are setting the path where we want to output the ready bundle.js file. In our case we want to output to build/resources/main as that is what Gradle will use when packaging our function.

Note: We could also have set the path to src/main/resources and it would have worked. But it is a good idea to separate generated files we don't commit to version control from static files we want to commit to version control.

Now that we have our configurations in place, we still need to instruct our Gradle build to build the front-end. We do so by adding the following task to our build.gradle file:

/**
 * Configre Node/NPM/YARN
 */
node {
    download = true
    version = '11.8.0'
}

/**
 * Bundles Javascript sources into a single JS bundle to be served by the function
 */
task bundleFrontend(type: YarnTask) {
    inputs.file project.file('package.json')
    inputs.file project.file('yarn.lock')
    inputs.files project.fileTree('src/main/html')
    inputs.files project.fileTree('src/main/jsx')
    outputs.file project.file('build/resources/main/bundle.js')
    yarnCommand = ['run', 'bundle']
}
jar.dependsOn(bundleFrontend)

What this task does is download all the necessary client dependencies using Yarn (a package manager) and then compile our sources into the bundle.js file.

The last line ensures that the bundle is built whenever we build the function, so the latest bundle is always included in the function distribution.

Now the only thing we are missing are the actual Javascript source files. So we create a new directory src/main/jsx and in it we place two source files:

todo-app.js

import React from 'react';
import ReactDOM from 'react-dom';
import TodoList from './todo-list.js'

/**
 * Todo application main application view
 */
class TodoApp extends React.Component {

  constructor(props) {
    super(props);
    this.state = { items: [], filteredItems: [], text: '', filter: 'all' };
    this.handleChange = this.handleChange.bind(this);
    this.handleSubmit = this.handleSubmit.bind(this);
    this.handleActiveChange = this.handleActiveChange.bind(this);
    this.handleRemove = this.handleRemove.bind(this);
    this.handleFilterChange = this.handleFilterChange.bind(this);
  }

  componentDidMount() {
      fetch("todomvc/items")
        .then(result => { return result.json() })
        .then(json => { this.setState({items: json}) })
        .catch(ex => { console.log('parsing failed', ex) });
  }

  componentWillUpdate(nextProps, nextState) {
        if(nextState.filter === 'all') {
            nextState.filteredItems = nextState.items;
        } else if(nextState.filter === 'active') {
           nextState.filteredItems = nextState.items.filter(item => item.active);
        } else if(nextState.filter === 'completed') {
           nextState.filteredItems = nextState.items.filter(item => !item.active);
        }
  }

  render() {
    return (
      <div class="container">
        <h3>todos</h3>
        <div class="inner-container">
            <header class="itemInput">
                <form onSubmit={this.handleSubmit}>
                  <input
                    id="new-todo"
                    onChange={this.handleChange}
                    value={this.state.text}
                    placeholder="What needs to be done?"
                  />
                </form>
            </header>
            <section class="itemList">
                <TodoList items={this.state.filteredItems} onActiveChange={this.handleActiveChange} onRemove={this.handleRemove} />
            </section>
            <footer class="itemControls">
                <span class="itemsCompleted">{this.state.items.filter(item => item.active).length} items left</span>
                <span class="activeStateFilter">
                    <span filter="all" class={this.state.filter === 'all' ? "stateFilter active" : "stateFilter"} onClick={this.handleFilterChange}>All</span>
                    <span filter="active" class={this.state.filter === 'active' ? "stateFilter active" : "stateFilter"} onClick={this.handleFilterChange}>Active</span>
                    <span filter="completed" class={this.state.filter === 'completed' ? "stateFilter active" : "stateFilter"} onClick={this.handleFilterChange}>Completed</span>
                </span>
            </footer>
        </div>
      </div>
    );
  }

  handleChange(e) {
    this.setState({ text: e.target.value });
  }

  handleSubmit(e) {
    e.preventDefault();
    if (!this.state.text.length) {
      return;
    }

    const newItem = {
      description: this.state.text,
    };

    fetch('todomvc/items', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(newItem)
    }).then(result => {
       return result.json();
    }).then(json => {
       this.setState( state => ({ items: state.items.concat(json), text: ''}) );
    }).catch(ex => {
      console.log('parsing failed', ex);
    });
  }

  handleActiveChange(newItem) {
    this.setState( state => ({
        text: '',
        items: state.items.map(oldItem => {
            if(oldItem.id === newItem.id) {
                fetch('todomvc/items', {
                      method: 'PUT',
                      headers: { 'Content-Type': 'application/json' },
                      body: JSON.stringify(newItem)
                }).then(result => {
                  return result.json();
               }).then(json => {
                  return json;
               })
            }
            return oldItem;
        })
    }));
  }

  handleRemove(item) {
    fetch('todomvc/items', {
          method: 'DELETE',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(item)
    }).then( result => {
        this.setState(state => ({
            items: state.items.filter(oldItem => oldItem.id != item.id)
        }))
    });
  }

  handleFilterChange(e) {
    var filter = e.target.getAttribute("filter");
    this.setState( state => ({
        filter: filter
    }));
  }

}

ReactDOM.render(<TodoApp />, document.getElementById("todoapp"));

And todo-list.js:

import React from 'react';

/**
 * Todo list for managing todo list items
 */
export default class TodoList extends React.Component {

  render() {
    return (
      <ul>
        {this.props.items.map(item => (
          <li class="items" key={item.id}>
             <input class="itemActive" itemId={item.id} name="isDone" type="checkbox" checked={!item.active}
                     onChange={this.handleActiveChange.bind(this)} />
             <span class="itemDescription">{item.description}</span>
             <a class="itemRemove" itemId={item.id} href="#" onClick={this.handleRemove.bind(this)} >&#x2613;</a>
          </li>
        ))}
      </ul>
    );
  }

  handleActiveChange(e) {
    var itemId = e.target.getAttribute('itemId');
    var item = this.props.items.find ( item => { return item.id === itemId })
    item.active = !item.active;
    console.log("Changed active state of "+item.description + " to " + item.active);
    this.props.onActiveChange(item);
  }

  handleRemove(e) {
      var itemId = e.target.getAttribute('itemId');
      var item = this.props.items.find ( item => { return item.id === itemId })
      console.log("Removing item " + item.description);
      this.props.onRemove(item);
  }
}

Both of these JavaScript source files are pretty basic if you have done React before. If you haven't, the concepts used here are covered in any React primer.

Now we have everything we need for our app. Let's have a final look at how our project structure looks:
Final Project Structure

Running the project

Of course when we develop the project we also want to try it out on our local machine. Before you continue you will need Docker, so install that first.

To get your development FN server running, run the fnStart task from the root project:

FN Server start

Once the server is running you can deploy the function by double-clicking on the fnDeploy task.

Once the function is deployed you should be able to access it on http://localhost:8080/t/todomvc.

To be Cont’d!

We are now finished with the front-end function. But if you run the application we just made you will notice it does not work yet; the back-end API is still missing.

In the next part we will finish the application and hook our front-end function up to our back-end API and DynamoDB. Check it out!


2018 in Review

A summary of what has happened on devsoap.com in 2018.

The year 2018 is soon coming to an end, and I think this is a good time to look at what has been accomplished this year before we move on to the next one.

We started the year in February by examining how we can improve keeping all those Maven dependencies in check and up to date by creating dependency version reports in Dependency version reports with Maven. In the article we learned how to leverage Groovy to read Maven plugin text reports and convert them to color encoded HTML reports.

In March the first version of the Gradle Vaadin Flow Plugin was released to support building Vaadin 10 projects with Gradle. The launch was described in Building Vaadin Flow apps with Gradle where we examined the basics of how the plugin worked.

In April the Gradle Vaadin Flow Plugin was improved to work with Javascript Web Components, as can be read in Using Web Components in Vaadin with Gradle.

In May the Gradle Vaadin Flow Plugin got its first support for Vaadin production mode. To read more about production mode check out this article Production ready Gradle Vaadin Flow applications.

We also examined how we can build robust, functional micro-services with Groovy and Ratpack in Building RESTful Web Services with Groovy. As a side note this has been the most read blog article the whole year so if you haven't read it yet, you have missed out!

In June the Gradle Vaadin Flow Plugin got support for Polymer custom styles as well as improvements to creating new Web components in Vaadin 10+ projects. The release notes (Gradle Vaadin Flow plugin M3 released) from that time reveal more about that.

In July we took a look at Gradle plugin development and how we can more easily define optional inputs for Gradle tasks in Defining optional input files and directories for Gradle tasks.

A new version of the Gradle Vaadin Flow Plugin was also released with new support for the Gradle Cache, HTML/Javascript bundling, Web Templates and Maven BOM support. Wow, what a release that was! The new features were described in Gradle Vaadin Flow plugin M4 released.

In September we took a look at using alternate JVM languages to build our Vaadin 10 applications in Groovy and Kotlin with Vaadin 10.

In October the Gradle Vaadin Flow Plugin got a new version again, this time with Spring Boot and multi-module support.

The release also brought a controversial breaking change in requiring Gradle 5 due to the backward incompatible changes to the BOM support done in Gradle 5. However, it is starting to look like a good choice now that Gradle 5 is out and working for at least most of the community.

In late October or early November we also saw the second Devsoap product released: a new Gradle plugin, the Fn Project Gradle Plugin, for building serverless functions on top of Oracle's FN server.

The plugin lets you leverage Gradle to both develop and deploy functions using all common JVM languages (Java, Groovy and Kotlin), both locally and to remotely running FN servers. The plugin is still in heavy development but is already used in projects around the world. To read more about the plugin, check out the article Groovy functions with Oracle Fn Project.

In November the Gradle Vaadin Flow Plugin entered the Release Candidate stage, where the last bug fixes and improvements are being made to produce a stable, production ready release. This means it is very likely that in early 2019 we will see the first stable release of the plugin, so stay tuned ;)

--

Looking back, that is a whole lot of new releases and articles to fit into 2018. Beyond that, the year has seen a lot of smaller releases and plenty of discussion on Github and elsewhere regarding the products. It has been good to see that the communities we are involved in have embraced these new ideas, and I'm certainly looking forward to what 2019 will bring.

Have a good new year everyone, and see you in 2019!


Groovy functions with Oracle Fn Project

In this introduction I'll show you how you can easily start building your Fn Project functions with Groovy and Gradle.

The Fn Project is a new Open Source FAAS (Function-As-A-Service) framework by Oracle. In contrast to what Amazon or Google provide, this framework is fully open source and can be set up on your local hardware or with any VPS provider. In this short introduction I'll show you how you can easily start building your functions with Groovy and Gradle.

The FN Project framework consists of many parts for load balancing and scaling the infrastructure, so it might seem daunting. But don't worry, you won't need any of that for this tutorial! We are only going to look at how we can develop a function and deploy it to a single server; the operations part can come later. There are a few things you need to install first though.

To be able to run the Fn Server you will need a Docker enabled host. So if you don't yet have Docker installed, install it first.

You also will need to have Gradle installed.

You have Docker and Gradle installed now? Good!

Before we begin, the question we have to ask ourselves first is: what is an Fn function anyway?

An Fn function is in essence a small program that takes some inputs (the HTTP request) and from those inputs produces some output (the HTTP response). It does not really matter which programming language the function is written in, as long as the inputs and outputs are defined. In fact the Oracle Fn Project is programming language agnostic and allows you to use any language you prefer.

But if our functions can be written in any language, how can it all be deployed on the same server?

This is where Docker comes in. Every function is wrapped in a Docker image provided by the Fn Project that handles routing the HTTP request information from the FN Server to the function running in the Docker container, and routing the response from your function back to the caller. This is what the Fn Project maintainers call cloud native. While all this routing might sound tricky (and most likely is internally), for the function developer it is made fully transparent.

If you already took a look at the Fn Project documentation you most likely noticed that they currently offer a number of official language choices for writing functions.

They also offer a CLI for simplifying creating functions in those languages.

However, while all of those are good languages and the CLI is OK to use, I felt that I wanted a bit more mature tool stack to work with. So I set out to write support for using Gradle to both build and deploy the function, and to allow developers to leverage Groovy for writing functions.

Introducing a new Gradle plugin for building Groovy Fn functions

For a full introduction to the Gradle plugin see its Product Features page.

So let's have a look at how we can write our functions with Groovy and deploy them with Gradle!

Start by creating a new folder (in this example I used hello-fn as the folder name) somewhere on your system.

In that folder add an empty build.gradle file.

The new plugin is available in the Gradle Plugin Portal, so you can easily include it in your project by adding the following to your build.gradle:

plugins {
    id 'groovy'
    id 'com.devsoap.fn' version '0.1.7'
}

After you have applied the plugin the following tasks will be made available in your Gradle project (you can check by running gradle tasks):

Fn tasks
--------
fnCreateFunction - Creates a Fn Function project
fnDeploy - Deploys the function to the server
fnDocker - Generates the docker file
fnInstall - Installs the FN CLI
fnInvoke - Invokes the function on the server
fnStart - Starts the local FN Server
fnStop - Stops the local FN Server

Before we run any of the tasks, we still need to add a bit of configuration to our build.gradle so the Gradle plugin knows how to create a correct function.

fnDocker {
    functionClass = 'com.example.HelloWorld'
    functionMethod = 'sayHello'
}

We will also need some more dependencies so add those as well:

repositories {
    mavenCentral()
    maven { url 'https://dl.bintray.com/fnproject/fnproject' }
}

dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.5.3'
    compile "com.fnproject.fn:api:1.0.74"
}

Right, now we are ready to run the fnCreateFunction task to create our function structure.

Once you have run the task it should have created the following folder structure:

hello-fn
├── build.gradle
└── src
    └── main
        └── groovy
            └── com
                └── example
                    └── HelloWorld.groovy
                    

Not much boilerplate code there.

Let's have a look at the generated HelloWorld.groovy class:

package com.example

class HelloWorld {

    String sayHello(String input) {
        String name = input ?: 'world'
        "Hello, $name!"
    }
}

It couldn't be much simpler than this: a simple class with one method, sayHello, that takes a string input, assumes it is a name, and returns a greeting for that name.

Of course in real-world situations you will most likely want to do a lot more like reading request headers, setting response headers and generating different content-type payloads. This is all achievable using the Java FDK the FN Project provides.
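For a small taste, here is a hypothetical Groovy sketch that returns a JSON payload with an explicit content type, using the FDK's OutputEvent the same way the TodoMVC articles in this feed do; treat it as an illustration rather than a tour of the FDK:

class HelloWorld {

    // Hypothetical variant of sayHello that controls the response
    // content type by returning an OutputEvent instead of a String
    OutputEvent sayHelloJson(String input) {
        String name = input ?: 'world'
        String json = "{\"greeting\": \"Hello, ${name}!\"}"
        OutputEvent.fromBytes(json.bytes, OutputEvent.Status.Success, 'application/json')
    }
}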

Deploying the function locally

Now that we have our function, we most likely want to test it out locally before we put it in production.

To start the development server locally the Gradle plugin provides you with an easy task to use, fnStart. When you run that task it will download the CLI and start the FN Server on your local Docker instance.

$ docker ps
CONTAINER ID        IMAGE                COMMAND             CREATED             STATUS              PORTS                              NAMES
30c96567802a        fnproject/fnserver   "./fnserver"        5 seconds ago       Up 4 seconds        2375/tcp, 0.0.0.0:8080->8080/tcp   fnserver

Once it successfully has started it should be running on port 8080. If you want to stop the server you can run fnStop which will terminate the server.

Once the server is running we can deploy our function there. This can be achieved by running fnDeploy.

It will first build a docker image and then deploy the image to the FN Server.

$ gradle fnDeploy
> Task :fnDeploy

Building image hello-fn:latest 

BUILD SUCCESSFUL in 23s
6 actionable tasks: 6 executed

If everything went fine the function should now be deployed and ready to use.

Testing our function locally

The plugin comes with a built-in way of testing our running function.

By using the task fnInvoke we can issue a request to the function.

$ gradle fnInvoke

> Task :fnInvoke
Hello, world!

And if we post some input to the function:

$ gradle fnInvoke --method=POST --input=John

> Task :fnInvoke
Hello, John!

Of course the fnInvoke task is limited in what it can do, and for more advanced use-cases we might want to use a separate app for testing queries (my favourite being Insomnia :) ).

To do that, point your REST client to http://localhost:8080/t/<app name>/<trigger name>. In the example case the app name and trigger name are the same, so the URL would be http://localhost:8080/t/hello-fn/hello-fn. This is also what fnInvoke calls in the background.

Development tip: While developing the function locally you can make Gradle continuously monitor the source files and automatically re-deploy the function whenever something changes. You can do that by running the fnDeploy task with the -t parameter, like so: gradle -t fnDeploy.

Taking the function to production

Once you have the function working locally you can deploy it to production quite easily by adding a few parameters to build.gradle.

fnDeploy {
  registry = '<docker registry url>'
  api = '<FN Server URL>'
}

Now if you run the fnDeploy task the function will be deployed to your remote Docker registry and FN Server.

And beyond...

This was just a short introduction to how you can work with Gradle and Groovy to make your functions. There are plenty of other fun things you can do with these functions; for a slightly more advanced demo, have a look at the CORS proxy example at https://github.com/devsoap/examples/tree/master/fn-cors-proxy.

For more information about how to use the plugin see https://github.com/devsoap/fn-gradle-plugin

If you find any issues do not hesitate to create an issue at https://github.com/devsoap/fn-gradle-plugin/issues

Thanks for reading, I hope to get your feedback on this project!


Gradle Vaadin Flow plugin M6 released

The Gradle Vaadin Flow M6 release brings Spring Boot and multi-module support for the plugin.

This will be the last Milestone release for the plugin, after which the plugin will start targeting the first stable 1.0 release aimed for production use!

Spring Boot support

The plugin now includes full support for building Spring Boot projects. To enable the support you will need to include the official Spring Boot plugin in your build.

plugins {
    id "org.springframework.boot" version "2.0.5.RELEASE"
    id "com.devsoap.vaadin-flow" version "1.0.0.M6"
}

Once you have the plugin applied you can create a new Spring Boot project by just issuing the vaadinCreateProject command.
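For example, from the project root:

$ gradle vaadinCreateProject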

For a fully working example have a look at the project hosted at https://github.com/devsoap/examples/tree/master/flow-spring-tutorial. It is the standard flow-spring-tutorial converted to use Gradle and the plugin instead of Maven.

Multi-module support

This release also includes improvements to how multi-module projects are handled; a minimal layout is sketched below.
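As an illustration only (the module names here are hypothetical, not something the release mandates), a multi-module build applying the plugin per module could be wired up along these lines:

// settings.gradle
include 'ui', 'backend'

// ui/build.gradle
plugins {
    id 'com.devsoap.vaadin-flow' version '1.0.0.M6'
}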

Upcoming breaking changes

To guarantee the longest possible forward compatibility, M6 is the last release to support Gradle 4.x. The stable 1.0 release will require Gradle 5, which means that the 1.0 release will coincide as closely as possible with the release of Gradle 5.

To start preparing your project for Gradle 5 you can take Gradle 5 snapshots into use by using the Gradle wrapper with the latest snapshot versions of Gradle:

./gradlew wrapper --gradle-version=5.0-milestone-1

While M6 does not officially support Gradle 5 it should work for most use-cases already.

If you see a warning, or something is broken, then please report it to the issue tracker. This will allow us to build a stable first release on top of Gradle 5.

--

As usual the release is available to download from the Gradle Plugin Directory.

For more information about how to use the plugin see https://github.com/devsoap/gradle-vaadin-flow/wiki

If you find any issues do not hesitate to create an issue at https://github.com/devsoap/gradle-vaadin-flow/issues


Gradle Vaadin Plugin to be included in Vaadin 11 Platform

As some of you may already have suspected, today it became official that the Gradle Vaadin Plugin made by Devsoap Inc. will become the foundation that the next major Vaadin Framework 11 Platform release will be built upon.

This announcement was made by Vaadin CEO Joonas Lehtinen late last night on Twitter.

Shortly thereafter a teaser video was released, where you can already see the plugin in action right at the beginning.

These sure will be exciting times for the Vaadin community, as they will be able to embrace the power of the Gradle eco-system, and it is a pleasure to see the hard work the community has put in over the years paying off in this kind of big way!

For a full listing of what is planned for Vaadin 11 see https://vaadin.com/blog/vaadin-11-is-now-available-with-gradle-support-and-new-components

Exciting times ahead!

--

As usual the Gradle Vaadin Plugin is available to download from the Gradle Plugin Directory.

For more information about how to use the plugin see https://github.com/devsoap/gradle-vaadin-flow/wiki

If you find any issues do not hesitate to create an issue at https://github.com/devsoap/gradle-vaadin-flow/issues


Groovy and Kotlin with Vaadin 10

Yet again a month has passed since the latest release of the Gradle Vaadin Flow plugin and now it is time for the fifth release of the plugin. This release focuses on exciting alternative JVM languages, Groovy and Kotlin!

Building Vaadin applications with Groovy

The plugin now provides templates in Groovy for all the creation tasks the plugin has. To enable the Groovy support you will need to include the official Groovy plugin in your build.

To do that add the following:

plugins {
    id 'groovy'  
    id 'com.devsoap.vaadin-flow' version '1.0.0.M5'
}

vaadin.autoconfigure()

That is it, nothing more needed.

Note: If you are not using vaadin.autoconfigure() you will need to add the Groovy dependency to your compile classpath, like this:

dependencies {
  compile vaadin.groovy()
}  

Once you have the Groovy plugin in the project, all tasks the plugin provides will generate Groovy instead of Java. To try it out, create a new project with gradle vaadinCreateProject and you will see the following structure:

├── src
│   └── main
│       ├── groovy
│       │   └── com
│       │       └── example
│       │           └── vaadinflowplugintest
│       │               ├── VaadinFlowPluginTestServlet.groovy
│       │               ├── VaadinFlowPluginTestUI.groovy
│       │               └── VaadinFlowPluginTestView.groovy
│       └── webapp
│           └── frontend
│               └── styles
│                   └── vaadinflowplugintest-theme.css

All the other create* tasks will also now generate Groovy instead of Java, go ahead and try them out!

Building Web Templates with Groovy

One special feature the Groovy support brings is allowing you to build your Web Templates with Groovy.

If you run gradle vaadinCreateWebTemplate you will now see a new template file generated into src/main/webapp/frontend/templates, ending with .tpl instead of .html. It will look like this:

link (rel:'import', href: '../bower_components/polymer/polymer-element.html')

'dom-module' ( id:'example-web-template') {
    template {
        div (id:'caption', 'on-click' : 'sayHello') {
            yield '[[caption]]'
        }
    }

    script('''
        class ExampleWebTemplate extends Polymer.Element {
            static get is() {
                return 'example-web-template'
            }
        }
        customElements.define(ExampleWebTemplate.is, ExampleWebTemplate);
    ''')
}          

The TPL file format is a DSL provided by the Groovy Markup Template Engine to allow you to define HTML files programmatically with Groovy. At build time these templates will automatically be converted into plain HTML files.

So why use them?

The DSL allows you to use any Groovy inside the templates: you can compose templates into each other, inject file system properties, and do everything else you normally can with Groovy. No more error-prone manual editing of HTML files; if your syntax is wrong, the Groovy compiler will notify you at compile time, not at runtime in production.
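Since a .tpl file is compiled as ordinary Groovy, you can mix plain Groovy statements with the markup. A tiny hypothetical example:

// Any Groovy expression can drive the markup
def year = 2018

div (id: 'footer') {
    yield "Copyright $year Example Inc."
}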

The TPL support will for now only be available for Groovy projects, but in the future it might also be enabled for other project types.

Kotlin support

Alongside the Groovy support, Kotlin support has also been added to the plugin.

Similarly to how the Groovy support works, you need to include the official Kotlin plugin to enable the Kotlin features.

plugins {
    id 'org.jetbrains.kotlin.jvm' version '1.2.61'
    id 'com.devsoap.vaadin-flow' version '1.0.0.M5'
}

Once you have that in your build.gradle you can create a Kotlin project with vaadinCreateProject and you will get the same file structure as with the Groovy project.

The other tasks will also provide Kotlin templates.

If you are building your Vaadin applications with Kotlin I can also highly recommend looking into Vaadin On Kotlin maintained by Martin Vysny (https://github.com/mvysny).

--

As usual the release is available to download from the Gradle Plugin Directory.

For more information about how to use the plugin see https://github.com/devsoap/gradle-vaadin-flow/wiki

If you find any issues do not hesitate to create an issue at https://github.com/devsoap/gradle-vaadin-flow/issues


Gradle Vaadin Flow plugin M4 released

This release focuses on improving production mode as well as client side dependency handling. It improves caching as well as adds support for Web templates.

Here is a more detailed overview of what improvements and features the new release contains:

Improved Yarn/Bower integration

The plugin's client-side handling has now been re-written to be fully built on top of Yarn instead of NPM.

This has allowed the plugin to take advantage of Yarn's caching and repository offline-mirroring capabilities, providing more stable, reproducible and faster builds.

Gradle Cache support

The plugin now supports the Gradle build cache. The cache is leveraged in production mode to store the component directories (bower_components and node_modules) as well as the transpiled results and other static resources. This should speed up builds considerably and allows pre-building production mode applications.

To enable the Gradle cache add --build-cache when running Gradle from the command line or put org.gradle.caching=true in your gradle.properties file.
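In gradle.properties that is just a single line:

# gradle.properties
org.gradle.caching=true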

Support for HTML/Javascript bundling

Previously the HTML and Javascript were not bundled when generating the transpiled production mode frontend. That made the application issue several HTTP requests for every component used, which in turn made application startup slow.

Now Polymer Bundler is used, compressing all component javascript into one downloadable HTML file that can be cached by the browser.

Support for Web Templates

One new feature of Vaadin 10 is building components using pure HTML templates (Polymer Templates).

To create a Web Template the following task can be used:

gradle vaadinCreateWebTemplate --name MyWebTemplate

This will create two files: a Java class MyWebTemplate.java that contains the server side logic of the component, and a MyWebTemplate.html in src/main/webapp/templates/ which contains the client side logic (HTML+JS).

The templates work as an alternative way of creating simple Vaadin components. For more advanced components, separating the view from its representation is highly recommended.

For more information about the templates see https://vaadin.com/docs/v10/flow/polymer-templates/tutorial-template-basic.html

Improved Maven BOM support

The plugin's BOM support now takes advantage of the latest Gradle BOM features. This means that if you want to use the Vaadin BOM you will now need to add a settings.gradle file to your project with the following contents:

enableFeaturePreview('IMPROVED_POM_SUPPORT')

--

As usual the release is available to download from the Gradle Plugin Directory.

For more information about how to use the plugin see https://github.com/devsoap/gradle-vaadin-flow/wiki

If you find any issues do not hesitate to create an issue at https://github.com/devsoap/gradle-vaadin-flow/issues


Defining optional input files and directories for Gradle tasks

Currently Gradle does not support defining optional input files and directories, but with a few tricks it is easy to do.

To define an input file for a task you usually use either @InputFile or @InputDirectory, depending on whether you require a file or a directory as input. However, this only works when the file or directory actually exists on the file system; if it does not exist, Gradle will throw an exception.

So how do we go about defining optional input files?

It turns out there is no built-in solution for this, but with a little imagination we can accomplish it with the @Optional annotation and a Closure.

Here is an example:

@Optional
@InputDirectory
final Closure<File> inputDir = { new File('/path/to/dir').with { it.exists() ? it : null } }

So why does this work?

The @Optional annotation declares that the property is optional: it either has a value, or it is null to signal that the value is missing and the input should be ignored.

By using @Optional we prevent Gradle from complaining about a missing input value, but that alone does not solve the problem: we can still assign a File which does not exist, and Gradle will still complain about an @InputDirectory that does not exist.

To solve the second issue we use a closure in which we map the File instance either to itself, if it exists, or to null to denote that it does not exist.

We use the .with{} method to do the mapping, and since we return null when the File does not exist, the @Optional contract is fulfilled and Gradle understands that this input should not be considered.

So why did we use a Closure instead of assigning the File directly to the field?

The reason is that if we had assigned it directly, the existence of the file would have been checked at object construction time. With the Closure in place we have deferred the evaluation until the value is actually needed by Gradle. This gives us the possibility to create the directory in an earlier task, before Gradle evaluates the @InputDirectory value.
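Putting it all together, a custom task with an optional input directory could look like the sketch below. The task and directory names are hypothetical; the pattern relies on Gradle unwrapping the Closure when it evaluates the input annotations:

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.InputDirectory
import org.gradle.api.tasks.Optional
import org.gradle.api.tasks.TaskAction

class ProcessReportsTask extends DefaultTask {

    // Evaluated lazily; returning null tells Gradle to ignore this input
    @Optional
    @InputDirectory
    final Closure<File> inputDir = {
        new File(project.buildDir, 'reports').with { it.exists() ? it : null }
    }

    @TaskAction
    void process() {
        File dir = inputDir.call()
        if (dir) {
            logger.lifecycle("Processing reports in ${dir}")
        } else {
            logger.lifecycle('No reports directory found, skipping')
        }
    }
}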


Gradle Vaadin Flow plugin M3 released

This is a maintenance release for the Gradle Vaadin Flow plugin. This release mostly focuses on tweaking the existing functionality to be simpler and faster.

Generated resources are no longer stored in source tree

Previously some of the resources related to production mode were put in src/main/webapp. However, since these resources should not be committed into source control, you would need to ignore them somehow in your VCS. This was a bit inconvenient, so these resources were moved into a new directory, build/webapp-gen, inside the Gradle build directory, which should never be committed into VCS.

This means that if you have been using either M1 or M2, please remove the following generated directories from source control and let the plugin re-generate them the next time it runs:

src/main/webapp/frontend/bower_components/*
src/main/webapp/frontend-es5/*
src/main/webapp/frontend-es6/*

Support for Polymer custom-styles

In previous versions of the plugin the theme CSS file was directly generated into src/main/webapp/frontend/<projectname>-styles.css and included in the View class with a @StyleSheet annotation.

This has changed so that the style CSS file will now be generated into src/main/webapp/frontend/styles/<projectname>-styles.css and a @HtmlImport will be generated for it in the UI class.

The reason for this change is that to be able to style Polymer components, especially their inner parts (the shadow DOM), we need to embed our CSS into HTML files inside a custom-style tag. Don't ask me why though, as we know this goes against most good design practices of separating concerns.

The plugin however will now allow you to keep your CSS styles in CSS files and will auto-generate an HTML wrapper for any CSS file that you place in the ../frontend/styles/ folder.

That way you can choose to either use your CSS file directly via @StyleSheet(frontend:<your-css-file>.css) or with @HtmlImport(frontend:<your-css-file>.html), which will use the HTML wrapper that is generated at build time.

By storing your CSS in CSS files rather than HTML files in source control, you will have much better editor support and correctly separate the concerns.

All new projects generated with the vaadinCreateProject task will use the CSS HTML wrapper and a @HtmlImport, so you can start by styling any part of your application easily.

Of course, if you specifically wish to store your styles within HTML files in the styles directory then that will work as well, but I cannot really recommend it.
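For example, importing the generated wrapper from the UI class could look like this sketch (the class and file names are hypothetical):

import com.vaadin.flow.component.UI
import com.vaadin.flow.component.dependency.HtmlImport

// Imports the HTML wrapper generated for frontend/styles/myapp-theme.css
@HtmlImport('frontend://styles/myapp-theme.html')
class MyAppUI extends UI { }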

New component generation task

And finally, M3 adds a new component creation task, vaadinCreateComponent, that will generate a new Vaadin component in the project. This is useful if you quickly need a component template to start making a new component from.
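Assuming it accepts a --name option like the other creation tasks (an assumption on my part), the invocation would look something like:

$ gradle vaadinCreateComponent --name MyComponent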

--

That is it for this release, summer seems to be here, so go out, enjoy the sunshine and I'll see you later!


Building RESTful Web Services with Groovy

There are many techniques and frameworks for making micro-services on the JVM today. Some of them are well known, others less so. In this article I'm going to go through how you can leverage Groovy's expressiveness combined with the robust micro-service framework Ratpack to create services that consume fewer resources and contain less boilerplate and annotations.

Why Groovy?

It is now roughly 6 years since I started exploring what Groovy can do. Since then Groovy has taken huge steps forward along with the JVM platform. Before Java 8, Kotlin, Scala and numerous other languages came along, Groovy had already pioneered most of the functional paradigms the "new kids on the block" are touting today. While it was still a bit slow on Java 6, the introduction of invoke-dynamic in the JVM made it super-fast, almost as fast as Java itself, making it a first-rate language to use on the JVM.

I think Groovy's expressiveness, combined with the gentle learning curve if you are coming from Java, makes it a perfect language to use both for configuration (Gradle, Jenkins) and for application frameworks (Grails, Ratpack).

Many programmers today are also getting tired of the annotation hell Java applications are descending into (for example with the Spring Framework), where functionality is hidden behind a black-box annotation. Groovy's DSL support provides a nice, simple alternative to annotations.

Ratpack 101 (and hopefully not 404)

Before we can begin, I need to introduce the star of the show, the Ratpack framework.

I have now been using Ratpack for some time and it has grown on me more and more, turning out to be a great toolbox for writing Groovy based services. One strength is its easy learning path to Groovy, Groovy's DSLs and building end-points (EPs). Another is that it is very lightweight, since it runs on the lightweight Netty server, which is crucial if you are spinning up lots of services.

But instead of just taking my word for it, let me show you how to make an app so you can judge for yourself.

Since I like to keep examples relevant I will not write a "Hello world!" type of application; you will easily find such an example with Google anyhow. Instead, I will show you how to make an EP that does currency conversions using the European Central Bank's (ECB) daily currency rates and caches them in a local database to reduce the hits on the ECB. I will show you how to set up an in-memory H2 database and initialize it using Flyway scripts. Finally I will show you how to package and deploy it with Gradle to Docker.

Sounds hard? I assure you if you can write a hello world app you can write what I just described in no time with Ratpack and Groovy.

So let's get crackin'!

We will start by creating our EP with Ratpack. Create a new file called Ratpack.groovy and add the following logic to it:

import groovy.sql.Sql
import org.flywaydb.core.Flyway
import org.javamoney.moneta.FastMoney
import ratpack.h2.H2Module
import ratpack.service.Service
import ratpack.service.StartEvent

import javax.sql.DataSource

import static javax.money.convert.MonetaryConversions.getConversion
import static ratpack.groovy.Groovy.ratpack
import static ratpack.jackson.Jackson.json

ratpack {

    bindings {

        // Use H2 in-memory database
        module(new H2Module("sa", "", "jdbc:h2:mem:conversions;DB_CLOSE_DELAY=-1"))

        // Migrate flyway scripts on startup
        bindInstance new Service() {
            void onStart(StartEvent event) {
                new Flyway(dataSource: event.registry.get(DataSource)).migrate()
            }
        }
    }

    handlers {

        /*
            GET /convert/EUR/USD/100
         */
        get("convert/:from/:to/:amount") {

            // 1: Try to find the conversion from the db
            def amount = getAmountFromDb(get(DataSource), pathTokens.from, pathTokens.to, pathTokens.amount as BigDecimal)

            if(!amount) {
                // 2: Do the conversion via the ECB
                amount = FastMoney.of(pathTokens.amount.toBigDecimal(), pathTokens.from)
                        .with(getConversion(pathTokens.to)) .number.toBigDecimal()

                // 3: Store the conversion for future reference
                setAmountToDb(get(DataSource), pathTokens.from, pathTokens.to, pathTokens.amount as BigDecimal, amount)
            }

            // 4: Render a response for our clients
            render json([
                    "fromCurrency" : pathTokens.from,
                    "toCurrency" : pathTokens.to,
                    "fromAmount": pathTokens.amount.toBigDecimal().stripTrailingZeros(),
                    "toAmount": amount.toBigDecimal().stripTrailingZeros()
            ])
        }
    }
}

/*
 * Helper function to retrieve an amount from the db
 */ 
static Number getAmountFromDb(DataSource ds, String from, String to, Number amount) {
    new Sql(ds).firstRow("""
        SELECT toAmount FROM conversions 
        WHERE fromCurrency=:from
        AND toCurrency=:to
        AND fromAmount=:amount
    """, ['from': from, 'to': to, 'amount': amount])?.values()?.first()
}

/*
 * Helper function to set an amount in the db
 */ 
static void setAmountToDb(DataSource ds, String from, String to, Number amount, Number convertedAmount) {
    new Sql(ds).executeInsert("""
        INSERT INTO conversions (fromCurrency, toCurrency, fromAmount, toAmount) 
        VALUES (:from, :to, :amount, :toAmount) 
    """, ['from': from, 'to': to, 'amount': amount, 'toAmount': convertedAmount])
}

That is the whole application, nothing more to it.

When we run the application and call /convert/EUR/USD/100 we will receive the following JSON response:

{
  "fromCurrency": "EUR",
  "toCurrency": "USD",
  "fromAmount": 100,
  "toAmount": 116.99
}

But let's go through the code in more detail.

We start with the ratpack{}-closure. This is the main closure for any Ratpack application; all logic to configure the app goes here.

Next, let's have a look at the bindings{}-closure.

Bindings in Ratpack are the equivalent of dependency injection in other frameworks. You bind an interface (or concrete class) to an implementation, and in your handlers you can then get the implementation by asking for the interface. Ratpack takes this further with multiple hierarchical registries holding multiple implementations, but that is a more advanced topic for later.

In our simple application we register two implementations: the H2Module and an anonymous service.

A module in Ratpack is a way to bundle functionality into an easily consumable form for Ratpack applications. Most libraries you use will have a module for you to bind into the project; in our case we are using the H2 module that allows us to use an in-memory database. In the background the module registers a DataSource with the registry, which we can then use in our handlers to query the database.

We also bind a service to integrate a non-Ratpack dependency into the service life-cycle. In our case we use Flyway to migrate our database tables, on application start, into the in-memory database defined above. Since Flyway does not ship a Ratpack module, we just wrap it in a service to do our bidding.
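The migration itself is a plain Flyway SQL script picked up from the default classpath location. Judging from the columns the handlers use, it could look something like this (the file name matches the migration visible in the startup log later in the article; the column types are my assumption):

-- src/main/resources/db/migration/V1__create_conversion_table.sql
CREATE TABLE conversions (
    fromCurrency VARCHAR(3) NOT NULL,
    toCurrency   VARCHAR(3) NOT NULL,
    fromAmount   DECIMAL(19, 4) NOT NULL,
    toAmount     DECIMAL(19, 4) NOT NULL
);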

If we were good open-source developers we could write a Ratpack module for Flyway with exactly that service, release it to the public, and gain fame and fortune ;)

Alright, now that we have our database set up, let's look at our single EP.

We define our EPs in the handlers{}-closure. In our case we have one EP for the currency conversion, so we bind it to GET /convert/:from/:to/:amount. This syntax means that when our endpoint is invoked with the URI /convert/EUR/SEK/100 it will return the conversion of 100 EUR into SEK.

The implementation of the EP is rather trivial: we first check if we already have a conversion for that amount in the database; if it exists we just use it, otherwise we use the Money API (JSR 354) to retrieve the conversion via the ECB and then store it in the database for future use.

Once we have the conversion rate we just construct a JSON object to return to the client.

And that is it, application done, time to package it up and run it.

Note: The above example is written solely using the Ratpack DSL which is convenient for small EPs like this. For bigger projects you will want to split out the handlers into their own classes and only define the routes and bindings in the Ratpack.groovy file. I have made this same example using that approach at https://github.com/devsoap/examples/tree/master/currency-converter-ratpack-ext.

Packaging Ratpack applications with Gradle and Docker

Alright, time to wrap up and deploy our cool new service.

Here is the Gradle file to do it:

buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath "io.ratpack:ratpack-gradle:1.5.4"
        classpath 'se.transmode.gradle:gradle-docker:1.2'
    }
}

apply plugin: "io.ratpack.ratpack-groovy"
apply plugin: 'groovy'
apply plugin: 'docker'

repositories {
    jcenter()
}

dependencies {
    compile 'javax.money:money-api:1.0'
    compile 'org.javamoney:moneta:1.0'
    compile 'org.slf4j:slf4j-simple:1.7.25'
    compile 'org.flywaydb:flyway-core:4.0.3'
    compile ratpack.dependency('h2')
}

distDocker {
    exposePort(5050)
}

With this you can now try out the application by running

$> gradle -t run

And you would see something like this in your console:

[main] INFO ratpack.server.RatpackServer - Starting server...
[main] INFO ratpack.server.RatpackServer - Building registry...
[main] INFO ratpack.server.RatpackServer - Initializing 1 services...
[ratpack-compute-1-1] INFO org.flywaydb.core.internal.util.VersionPrinter - Flyway 4.0.3 by Boxfuse
[ratpack-compute-1-1] INFO org.flywaydb.core.internal.dbsupport.DbSupportFactory - Database: jdbc:h2:mem:conversions (H2 1.4)
[ratpack-compute-1-1] INFO org.flywaydb.core.internal.command.DbValidate - Successfully validated 1 migration (execution time 00:00.005s)
[ratpack-compute-1-1] INFO org.flywaydb.core.internal.metadatatable.MetaDataTableImpl - Creating Metadata table: "PUBLIC"."schema_version"
[ratpack-compute-1-1] INFO org.flywaydb.core.internal.command.DbMigrate - Current version of schema "PUBLIC": << Empty Schema >>
[ratpack-compute-1-1] INFO org.flywaydb.core.internal.command.DbMigrate - Migrating schema "PUBLIC" to version 1 - create conversion table
[ratpack-compute-1-1] INFO org.flywaydb.core.internal.command.DbMigrate - Successfully applied 1 migration to schema "PUBLIC" (execution time 00:00.038s).
[main] INFO ratpack.server.RatpackServer - Ratpack started for http://localhost:5050

You can see the application starting, Flyway running the migrations, and finally, if you point your browser to for example http://localhost:5050/convert/EUR/SEK/100, you should get the conversion. So it works!

This is nice for development, as you can try the service out locally before deploying it to Docker. I also used the -t switch, which allows hot-swapping any changes immediately without re-running the task.

Next let's look into how to make a Docker image out of it.

We added the Docker plugin and configured it to expose port 5050, so all that is left is to build the image. That can simply be done by running the following command:

$> gradle distDocker

Here is some of the output of that:

Task :distDocker
Sending build context to Docker daemon 21.06 MB

Step 1/4 : FROM aglover/java8-pier
 ---> 3f3822d3ece5
Step 2/4 : EXPOSE 5050
 ---> Using cache
 ---> 77ed4f7913f9
Step 3/4 : ADD example.tar /
 ---> 765aff2cd114
Removing intermediate container 4eaaf85dd4b2
Step 4/4 : ENTRYPOINT /example/bin/example
 ---> Running in d8046bea895f
 ---> a81175d6b690
Removing intermediate container d8046bea895f
Successfully built a81175d6b690

As you can see, the distDocker task creates the image and deploys it to the local Docker image registry.

Note: You will probably be running the distDocker command on a CI server, and you most likely want to deploy to your internal Docker registry instead of locally. To do that you can set the docker.registry parameter in your build.gradle to point to the correct registry. More information can be found at https://github.com/Transmode/gradle-docker.
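As a sketch (assuming the plugin exposes the registry setting on its docker extension, and with a placeholder URL), that configuration could look like:

docker {
    // Images built by distDocker are tagged for this registry
    registry = 'registry.example.com:5000'
}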

Once we have it, we can run the docker image simply by running

$> docker run -p 5050:5050 a81175d6b690

And that will run your EP via docker!

End notes

Now you know how to build a Groovy REST service without using a single annotation. How does it feel? Was it hard? Easy? Leave me a comment below.

All the code and examples can be found on GitHub; here are the links:

https://github.com/devsoap/examples/tree/master/currency-converter-ratpack-dsl

https://github.com/devsoap/examples/tree/master/currency-converter-ratpack-ext

Thanks for reading, hope you found it useful!


Production ready Gradle Vaadin Flow applications

If we all only needed to support the latest version of Google Chrome then this article would most likely not need to exist. However, in real life we certainly want to support other browsers as well, so let's have a look at how we can take our Vaadin Flow application built with Gradle and make it production ready.

Supporting old browsers

At this stage I could go into a lengthy rant about the differences between ECMAScript 5 and ECMAScript 6, different transpilers and linters, polyfills and polygons, how the world is an unfair place, and how frontend technologies are moving way too fast for any company to ever be able to support them.

But instead I will just say that most Web Components today only target the most bleeding-edge browsers, because developers like to tinker with new things, so we need to compile the Javascript source code into different versions which different browsers can load.

The Gradle plugin added support for this via the vaadin.productionMode property in the 1.0.0.M2 release.

Production mode

Setting the application to production mode signals the plugin that it needs to compile two versions of the frontend: one for old browsers and one for new browsers.

To do that, the first thing the plugin does is unpack any WebJars you have included via Gradle. That includes the Vaadin dependencies as well as any custom WebJar you have added.

Note: Here it is important to mention that any WebJar you add needs to have the Javascript sources included in the jar, as well as a bower.json file. If they are missing the WebJar will be excluded from compilation.

Once the WebJars have been unpacked the plugin performs the compilations and copies the two new versions of the frontend into the src/main/webapp folder, as frontend-es5 and frontend-es6. It will also copy anything you had in src/main/webapp/frontend (for example your theme) into those directories.

Note: You should not modify these directories as they are just a compilation result. Anything you change in them will be overwritten in the next compilation.

Note: You might want to exclude src/main/webapp/frontend-es5 and src/main/webapp/frontend-es6 from version control (add them to .gitignore) as they are compilation results.
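The corresponding .gitignore entries would be:

src/main/webapp/frontend-es5/
src/main/webapp/frontend-es6/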

Enabling production mode in the Vaadin application

Turning on vaadin.productionMode tells the Gradle plugin that you want to deploy to production, but the Vaadin app itself will still be running in development mode.

The first thing to know is that when Vaadin is running in development mode it expects everything to be located under src/main/webapp/frontend. This is fine, as it allows you to develop faster without compiling.

To make the Vaadin application also run in production mode we need to set the productionMode init-parameter. We can do that by setting it on the servlet like so:

@WebServlet(urlPatterns = "/*", name = "TestServlet", asyncSupported = true)
@VaadinServletConfiguration(ui = TestUI.class, productionMode = true)
public class TestServlet extends VaadinServlet { }

This tells Vaadin to instead use src/main/webapp/frontend-es5 for older browsers and src/main/webapp/frontend-es6 for newer browsers.

You will also need to use the generated index.html file in the frontend directories. To do that, add @HtmlImport("index.html") to your UI class like so:

import com.vaadin.flow.component.dependency.HtmlImport;
import com.vaadin.flow.component.UI;

@HtmlImport("index.html")  
public class MyUI extends UI { }

The plugin will add the import automatically to new projects you create with vaadinCreateProject but for older projects you might need to add it manually.

Development with production mode configured

Once you have turned on production mode your build will start to take longer due to the compilation.

A good tip to mitigate this is to use a Gradle parameter or system property for productionMode so you can only have it on in your CI builds and otherwise have it off while developing.

We first have to configure our build.gradle to use the parameter by adding the following:

vaadin.productionMode = findProperty('productionMode') ?: false

Next, we will need to add a placeholder for the property in our Servlet class as well. Change the @VaadinServletConfiguration to look like this:

@VaadinServletConfiguration(ui = TestUI.class, productionMode = @PRODUCTION_MODE@ )

Finally, add the following task to your build.gradle:

// Pre-process Java sources
sourceSets.main.java.srcDirs = ['build/processedSrc']
task processSources(type: Copy) {
    from 'src/main/java'
    into 'build/processedSrc'
    filter { line -> line.replaceAll('@PRODUCTION_MODE@', "$vaadin.productionMode") }
}
compileJava.dependsOn processSources

After you have done that you can control the production mode switch from the command line by appending the -PproductionMode=true parameter, for example like so:

$ gradle -PproductionMode=true jettyRun

Of course, now that you have the source processing in place you could also replace the version etc. with the same functionality ;)