Create a REST services layer with Spark

The purpose of this post is to explain how to work with Spark: a micro-framework that lets us quickly create a REST services layer. Spark is the Java port of Sinatra, a famous micro-framework written in Ruby. With Spark it’s possible to start a REST web server with a few lines of code, as we can see in this very simple example.

import spark.Spark;

public class App
{
    public static void main(String[] args)
    {
        Spark.get("/hello", (request, response) -> "Hello World");
    }
}

If we now run the main method, Spark starts a server that listens on port 4567. If we invoke the route “/hello”, the server responds with the canonical developer’s greeting.

Notice that to create a REST route, all we need is the get method; its two mandatory parameters are a URL and a lambda expression that wires the business logic to that URL. Obviously this route responds to the HTTP method GET: Spark has a specific method for each HTTP verb.

Spark Setup

Before analyzing the next examples, let’s discuss the setup of this library. This is the pom.xml file of a Spark project.

<dependencies>
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>2.2.4</version>
    </dependency>
</dependencies>

The only mandatory dependency is Spark itself, which works out of the box. The web server used by Spark is an embedded Jetty (v. 9.0.2). The other dependency we are going to use is Gson: Google’s API to create and parse JSON. In the following example we are going to manage a REST resource: a very small group of users, defined by this Active Record:

import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

public class User implements Serializable{

	private static final long serialVersionUID = 1L;

	private static List<User> users = new ArrayList<User>();

	static{
		users.add(new User(0, "Solid Snake"));
		users.add(new User(1, "Vulcan Raven"));
		users.add(new User(2, "Meryl Silverburgh"));
		users.add(new User(3, "Hal Emmerich"));
		users.add(new User(4, "Frank Jaeger"));
	}

	private Integer id;
	private String name;

	public User(Integer id, String name) {
		super();
		this.id = id;
		this.name = name;
	}

	public User() {}

	public Integer getId() {
		return id;
	}

	public void setId(Integer id) {
		this.id = id;
	}

	public String getName() {
		return name;
	}

	public void setName(String name) {
		this.name = name;
	}

	public static List<User> getAll(){
		return users;

	}

	public static User get(final Integer id){
		// orElse(null) lets the routes detect a missing user instead of getting a NoSuchElementException
		return users.stream().filter((p)->p.getId().equals(id)).findFirst().orElse(null);
	}

	public static User store(User p){
		if(p.getId() == null){
			// assign the next free id (one past the current maximum)
			int maxId = users.stream().mapToInt(User::getId).max().orElse(-1);
			p.setId(maxId + 1);
			users.add(p);
		}else{
			// replace the user with the same id; set(index) would break once a user is deleted
			users.replaceAll(u -> u.getId().equals(p.getId()) ? p : u);
		}

		return p;
	}

	public static void delete(User p){
		users.remove(p);
	}
}
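The Java 8 stream idioms used in get and store are worth a closer look. Here is a standalone sketch (plain JDK, hypothetical helper names, no Spark involved) of the same lookup and next-id logic applied to a simple list of ids:

```java
import java.util.Arrays;
import java.util.List;

public class StreamIdioms {

    // Find the first element matching a predicate, or null if none matches
    static Integer find(List<Integer> ids, int wanted) {
        return ids.stream().filter(id -> id == wanted).findFirst().orElse(null);
    }

    // Compute the next free id: one past the current maximum (0 for an empty list)
    static int nextId(List<Integer> ids) {
        return ids.stream().mapToInt(Integer::intValue).max().orElse(-1) + 1;
    }

    public static void main(String[] args) {
        List<Integer> ids = Arrays.asList(0, 1, 2, 4);
        System.out.println(find(ids, 4));  // 4
        System.out.println(find(ids, 9));  // null
        System.out.println(nextId(ids));   // 5
    }
}
```

The orElse variants matter: findFirst().get() and max().get() both throw on an empty result, while orElse gives us a sensible default to act on.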

To bind the resource to the routes we will use the other methods of the Spark API:

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonSyntaxException;

import spark.Spark;

public class App
{

    private static final Gson GSON = new GsonBuilder().create();

    public static void main( String[] args )
    {
    	Spark.get("/hello", (request, response) -> "Hello World");

    	Spark.get("/user/:id",  (request, response) -> {
    		Integer id = Integer.parseInt(request.params("id"));
    		return GSON.toJson(User.get(id));
    	});

    	Spark.post("/user",  (request, response) -> {
    		User toStore = null;
			try {
				toStore = GSON.fromJson(request.body(), User.class);
			} catch (JsonSyntaxException e) {
				response.status(400);
				return "INVALID JSON";
			}

			if(toStore.getId() != null){
				response.status(400);
				return "ID PROVIDED DURING CREATE";
			}else{
				User.store(toStore);
	    		return GSON.toJson(toStore);
			}
    	});

    	Spark.put("/user/:id",  (request, response) -> {
    		Integer id = Integer.parseInt(request.params("id"));
    		if(User.get(id) == null){
    			response.status(404);
    			return "NOT_FOUND";
    		}else{
    			User toStore = null;
    			try {
    				toStore = GSON.fromJson(request.body(), User.class);
    			} catch (JsonSyntaxException e) {
    				response.status(400);
    				return "INVALID JSON";
    			}
    			// the URL, not the body, identifies the resource being updated
    			toStore.setId(id);
        		User.store(toStore);
        		return GSON.toJson(toStore);
    		}
    	});

    	Spark.delete("/user/:id", (request, response) -> {
    		User user = User.get(Integer.parseInt(request.params("id")));
    		if(user == null){
    			response.status(404);
    			return "NOT_FOUND";
    		}else{
    			User.delete(user);
    			return "USER DELETED";
    		}
    	});
    }
}

Authentication

Spark lets us inject a Filter into the HTTP chain; using a filter we can modify the behavior of the routes. A filter can be invoked before or after a route, and we use the before and after methods to register it on the HTTP stack. In the next example, we add a simple password check to every route that responds to the HTTP verbs POST, PUT or DELETE.

Spark.before((request, response) -> {
    String method = request.requestMethod();
    if(method.equals("POST") || method.equals("PUT") || method.equals("DELETE")){
        String authentication = request.headers("Authentication");
        if(!"PASSWORD".equals(authentication)){
            Spark.halt(401, "User Unauthorized");
        }
    }
});
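The filter’s two checks — is the verb protected, and does the header match — can be isolated as plain-Java helpers (hypothetical names, no Spark involved) to make the logic easy to unit-test:

```java
public class AuthCheck {

    // Only state-changing verbs require a password in this example
    static boolean isProtected(String method) {
        return method.equals("POST") || method.equals("PUT") || method.equals("DELETE");
    }

    // null-safe comparison: a missing header is treated as unauthorized
    static boolean isAuthorized(String authenticationHeader) {
        return "PASSWORD".equals(authenticationHeader);
    }

    public static void main(String[] args) {
        System.out.println(isProtected("GET"));       // false
        System.out.println(isProtected("DELETE"));    // true
        System.out.println(isAuthorized(null));       // false
        System.out.println(isAuthorized("PASSWORD")); // true
    }
}
```

Note also the order of the equals call in isAuthorized: putting the constant first avoids a NullPointerException when the client sends no header at all, which is exactly why halt(401) fires for anonymous requests.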

CORS

In the last example we will enable CORS in our Spark server. CORS is the acronym for “Cross-origin resource sharing”: a mechanism that allows REST resources to be accessed from outside the original domain of the request. This is the case for a mobile app that requires some data from a server, or for a web app that is not hosted on the same machine as the REST server. To activate this feature we have to add an OPTIONS route that responds on every registered URL; this route must return the headers and HTTP methods accepted by the server (in our example we simply echo back every header/verb requested). Finally, we register a filter that adds the “Access-Control-Allow-Origin” header to indicate that access is granted to every client.

Spark.options("/*", (request,response)->{

    String accessControlRequestHeaders = request.headers("Access-Control-Request-Headers");
    if (accessControlRequestHeaders != null) {
        response.header("Access-Control-Allow-Headers", accessControlRequestHeaders);
    }

    String accessControlRequestMethod = request.headers("Access-Control-Request-Method");
    if(accessControlRequestMethod != null){
        response.header("Access-Control-Allow-Methods", accessControlRequestMethod);
    }

    return "OK";
});

Spark.before((request,response)->{
    response.header("Access-Control-Allow-Origin", "*");
});
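The preflight negotiation above boils down to echoing two request headers back under their Access-Control-Allow-* names. A standalone sketch of that mapping (plain JDK, hypothetical helper name, no Spark involved):

```java
import java.util.HashMap;
import java.util.Map;

public class CorsPreflight {

    // Echo the two preflight request headers back as Allow-* response headers,
    // skipping any header the client did not send
    static Map<String, String> allowHeaders(Map<String, String> requestHeaders) {
        Map<String, String> responseHeaders = new HashMap<>();
        String headers = requestHeaders.get("Access-Control-Request-Headers");
        if (headers != null) {
            responseHeaders.put("Access-Control-Allow-Headers", headers);
        }
        String method = requestHeaders.get("Access-Control-Request-Method");
        if (method != null) {
            responseHeaders.put("Access-Control-Allow-Methods", method);
        }
        return responseHeaders;
    }

    public static void main(String[] args) {
        Map<String, String> request = new HashMap<>();
        request.put("Access-Control-Request-Method", "PUT");
        System.out.println(allowHeaders(request));
    }
}
```

Echoing everything back, as here, is the most permissive policy; a production server would typically compare the requested verbs and headers against a whitelist instead.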

Conclusions

With Spark it is extremely easy to create a REST web server. Its natural complement could be an HTML5 + JavaScript front-end web application. Here you can read an interesting tutorial on how to deploy a Spark + AngularJS application on an OpenShift machine. Spark is not as powerful as a JAX-RS framework like RESTEasy, but I don’t think that this is an API flaw: it simply targets another kind of project. Finally, Spark requires Java 8: if you’re stuck on a legacy environment you have to choose another REST library.

Here on my personal blog you can read a very simple RESTEasy tutorial (in Italian). It follows the same example as this article, so you can easily compare the two libraries. You can look at the code of this post on my GitHub account.

Author:

Francesco Strazzullo - software architect and one of the committers on the PrimeFaces Extensions project.
