Java9 – What is new and worthy?


Finally Java9

I have finally got around to experimenting with some of the new Java9 features. Yes, I know I took my sweet time; but, as usual for me, having 2000 parallel projects and always finding new and interesting topics to read up on or try does not exactly help when it comes to checking what made it into Java9.

What is new in Java9?

There are several small tweaks and cool new features that were the target of this release. The exact number was 82 JEPs targeted for the end of September 2016, a date that was later pushed back to the end of March 2017. I have selected a couple that I liked, in no particular order, that I would like to share with you. This is by no means a complete list, but I hope you find something new!

  1. Java9 REPL
  2. The Jigsaw Project
  3. New HTTP API
  4. Javadoc Improvement (HTML5)
  5. Cross compilation
  6. Process Control
  7. Convenience Factory Methods for Collections
  8. StackWalker
  9. Reactive Streams
  10. Deprecate the Applet API
  11. Private methods in Interfaces

Java9 REPL

JEP 222
When I first got into contact with Python, the REPL was maybe one of the most useful tools I had as a learner. Years later, I still love it!
I am happy that Java is finally joining the REPL world and offering this feature. It is very useful for testing small snippets or even looking up the correct syntax you might have forgotten, without going to the docs.
Bottom line: it helps the bottom line! It shortens the feedback loop, letting you make better use of your time without running a big compile just to test a small snippet.

To get it going you only have to type:

jshell

and you can type away as in a normal Java program (semicolons are optional!), including:

Adding imports
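A minimal sketch (java.time.LocalDate is just an arbitrary class to import, and the scratch-variable number depends on your session):

jshell> import java.time.LocalDate

jshell> LocalDate.of(2017, 9, 21)
$1 ==> 2017-09-21
|  created scratch variable $1 : LocalDate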


Defining Functions

jshell> void helloWorld() { System.out.println("Hello world");}
|  created method helloWorld()

Creating Variables

jshell> String name = "Fdiez"
name ==> "Fdiez"
|  created variable name : String

Testing Expressions

jshell> String.format("%d bottles of beer", 100)
$1 ==> "100 bottles of beer"
|  created scratch variable $1 : String

And coolest of all, it comes with some meta-commands to inspect the environment of your current session. They start with a forward slash (/) and you can see a list of them by running /help or /?:


  • /imports – lists all the imports that are active in the current session
  • /vars – lists the variables defined in the current session
  • /methods – lists the methods defined in the current session
  • /list – shows everything you have defined during the session (vars, methods, etc.) but does not show commands
  • /history – shows everything, including the commands typed in the current session
  • /exit – well… it exits 😉
  • /save – allows you to save the current state in order to re-use it or keep working on it later
  • /open – allows you to re-open a previously saved state


I really like the potential for small experiments that the REPL adds, and I am pretty sure all major IDEs will soon support it, making it a real companion for all developers.

The Jigsaw Project

JEP 200 (+ some others)
This was a big topic for the Java team to tackle. It has also been handled in innumerable posts and even whole books already, so I won’t go into much length here; consider this more of a TL;DR.

In a Nutshell

The Jigsaw project aimed to bring a module system into the native java ecosystem.
A module system:

  • allows loose coupling between different components
  • allows strong encapsulation of the components
  • pushes you to actively manage the dependencies
  • due to the encapsulation it also pushes you to create clear contracts between modules

This sounds familiar though

And it should: the concept is not new. OSGi has been using this idea for a long time, with JARs as the modules. The innovation here is that the JAR, due to some limitations that we are not going into in this post, is no longer the unit of modularity; the “Module” is.
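To get a feel for what a module declaration looks like, here is a minimal, hypothetical module-info.java (the module and package names are made up for illustration):

// module-info.java – a hypothetical module declaration
module org.fdiez.greetings {
    // this module depends on the java.logging platform module
    requires java.logging;

    // only this package is visible to consumers of the module
    exports org.fdiez.greetings.api;
}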

New module, new extension

The module also has its own extension dubbed “jmod”.
This new format allows you to include native code, configuration files, and other data that do not fit into JAR files.


There are also some new tools available to make the implementation and management of these modules (and thus our lives) easier:

  • jdeps – helps you find the dependencies of your existing code, making the modularization process easier
  • jdeprscan – scans for uses of deprecated APIs
  • jlink – combines the application’s and the JDK’s modules into a smaller custom runtime image
  • jmod – helper tool for working with jmod files.

Learn more

If you’d like to check out a more in-depth explanation, but still not a complete book, check out this source.


New HTTP API

JEP 110
The internet evolved, and so must we!
The previous major version of the protocol, HTTP/1.1, dates back to the late 1990s. The landscape back then was quite different from today, and so were the requirements. In 2015 an update finally arrived, and thus HTTP/2 was born.

What is it about or why do we have it?

We have it because the old protocol, HTTP/1.1, had a lot of issues with speed, security and user-friendliness.
The new format brings some major improvements:

  • It is binary instead of textual
  • It is fully multiplexed – which means one can send multiple requests for data in parallel over a single connection
  • it reduces overhead by sending smaller compressed headers
  • it allows push responses from the server to the client
  • reduces round trip times (a.k.a. makes website loading faster)
  • widely supported by browsers

Some features did not make it into the final version of the protocol in order to maintain backward compatibility with the old version. To be fair, here are some cons too:

  • It is not THE fastest
  • Encryption is NOT required
  • Cookie security is still an issue

The support for Java

As of Java9 there is an HTTP client which can be used to request HTTP resources over the network.
It supports both versions of the protocol, supports synchronous and asynchronous programming models, and handles request and response bodies as reactive streams (see topic 9 on reactive streams).

Here is a GET request example which prints the response body as a string (in Java9 these classes live in the incubating jdk.incubator.http package):

HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("https://example.com/"))   // placeholder URI
        .build();
client.sendAsync(request, HttpResponse.BodyHandler.asString())
      .thenAccept(response -> System.out.println(response.body()));
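The synchronous flavour is similar; a minimal sketch that blocks until the response arrives:

HttpResponse<String> response = client.send(request, HttpResponse.BodyHandler.asString());
System.out.println(response.statusCode());
System.out.println(response.body());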

For a gentle introduction, containing examples for each of the features mentioned above, check the official OpenJDK documentation.

Javadoc Improvement (HTML5)

JEP 224
This was one of the most cosmetic updates of Java9, but I really like it nonetheless. It enables the standard doclet to generate HTML5 output for the JavaDocs.
This is great because, although the output looks pretty similar (so there is nothing new to get used to), HTML5 markup is semantic, which makes for better accessibility.
Also, although not due to HTML5 per se, this version brings a nice goodie: the ability to search. Let’s face it, this one was overdue!

You can now search for:

  • Method parameter types (such as int, String…)
  • Module names
  • Types and members
  • Packages

making your life easier, with no more need to use the browser search to find your things.

Cross compilation

JEP 247
If you are not aware, a cross compiler is a compiler capable of creating executables for a platform other than the one the compiler itself runs on. In Java this would mean, for example, compiling for Java version 7 while using the javac from Java version 9.
Targeting the right version of a library is important because libraries are usually NOT forward compatible.
The Java compiler (javac) already had two flags for this:

  • -source: language level accepted by the compiler
  • -target: version of the classes the compiler produces

By default, however, javac compiles against the most recent version of the platform API. This can lead to programs that cannot run on older versions of the platform, since they may use APIs only available in the newer versions.
You could, though, pass another flag:

  • -bootclasspath – points to the documented APIs of the desired version

The improvement here is a single new flag that replaces all of the above:

  • --release: takes care of using the correct options to compile against the desired platform version.

So using:

--release N

is equivalent to:

For N < 9:

-source N -target N -bootclasspath <documented-APIs-from-N>

For N >= 9:

-source N -target N --system <documented-APIs-from-N>.

How is this implemented?

From the JEP itself we have:

For JDK N and --release M, M < N, signature data of the documented APIs of release M of the platform is needed. This data is stored in the $JDK_ROOT/lib/ct.sym file, which is similar, but not the same, as the file of the same name in JDK 8. The ct.sym file is a ZIP file containing stripped-down class files corresponding to class files from the target platform versions.

For JDK N and --release N, the JDK’s own image is used as the source of the class files to compile against. The list of observable modules is limited, however, to the documented modules and the jdk.unsupported module.

A Caveat

There are still problems, though. Imagine the following situation with a library and a client using it:

// Original
public class Library {
  public void foo(double d) { System.out.println("double foo"); }
}

// Evolved
public class Library {
  public void foo(double d) { System.out.println("double foo"); }
  public void foo(int i)    { System.out.println("int foo"); }
}

and a program using the library:

public class Client {
  public static void main(String... args) {
     (new Library()).foo(1);
  }
}

If the client was compiled against the initial library, the call to foo(1) binds to the double overload. After the overload is added in the second version of the library, recompiling the client makes the same call resolve to the int version, since it is more specific.
This problem appears independently of how the library itself is compiled.
The takeaway is that library maintainers should be careful when overloading in general.

Process Control

JEP 102
With this update you get better control over the processes spawned from your application, as well as over processes that were not spawned by the API. It adds the ability to retrieve information about, inspect and even kill processes.


This was mostly motivated by the fact that the old API was limited, forcing developers to turn to native code to achieve what they intended, which of course comes with its own risks.
The old API basically let you set up the environment and start a process, but that was pretty much it. Here is what you could do using the Process class:

    Runtime r = Runtime.getRuntime();
    Process p = r.exec("firefox");
    p.waitFor(10, TimeUnit.SECONDS);

You could also use the ProcessBuilder class to create the commands (let's call the calendar for the year 3001 on Linux):

    try {
        ProcessBuilder pb = new ProcessBuilder("cal", "3001");
        final Process p = pb.start();
        BufferedReader br = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = br.readLine()) != null) {
            System.out.println(line);   // print the output of the process
        }
    } catch (Exception ex) {
        ex.printStackTrace();
    }
The scope of what you could do was limited. If you compare the APIs of Java7 and Java9, you will see that the Process class went from 6 methods to 16.
I know that more methods doesn't always mean “improvement”, but in this case it does.

So what goodies do we get?

There are all kinds of new things, such as the following (a short sketch follows the list):

  • children​() – Returns a snapshot of the direct children of the process.
  • destroyForcibly​() – Kills the process forcibly.
  • pid​() – Returns the native process ID of the process.
  • waitFor​(long timeout, TimeUnit unit) – Causes the current thread to wait, if necessary, until the process represented by this Process object has terminated, or the specified waiting time elapses.
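Here is a minimal, hedged sketch of how these read in practice ("sleep" is just an illustrative long-running command):

import java.util.concurrent.TimeUnit;

public class ProcessDemo {
    public static void main(String[] args) throws Exception {
        // spawn a throw-away child process
        Process p = new ProcessBuilder("sleep", "60").start();

        System.out.println("pid: " + p.pid());                   // native process id
        p.children().forEach(c -> System.out.println(c.pid()));  // direct children as ProcessHandles
        p.destroyForcibly();                                      // forcibly terminate
        p.waitFor(10, TimeUnit.SECONDS);                          // bounded wait for termination
    }
}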

The ProcessBuilder class also got some improvements, of which I really like the following:

  • startPipeline​(List<ProcessBuilder> builders) – Starts a Process for each ProcessBuilder, creating a pipeline of processes linked by their standard output and standard input streams.

This is very useful since it works like the pipe ('|') operator in Linux. With it you don't have to grab the output from one process and feed it to the next yourself; everything is taken care of for you out of the box!
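A minimal sketch, assuming a Unix-like system (the "ls | grep" pair is purely illustrative):

import java.util.List;

public class PipelineDemo {
    public static void main(String[] args) throws Exception {
        // roughly equivalent to: ls | grep txt
        List<Process> pipeline = ProcessBuilder.startPipeline(List.of(
                new ProcessBuilder("ls"),
                new ProcessBuilder("grep", "txt")
                        .redirectOutput(ProcessBuilder.Redirect.INHERIT)));  // print the final output

        pipeline.get(pipeline.size() - 1).waitFor();  // wait for the last stage to finish
    }
}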

Convenience Factory Methods for Collections

JEP 269
The idea here was to get rid of the verbosity of creating and populating a collection.
For instance, to create an unmodifiable set and fill it with values, you would have to do something like this:

// Method 1 - the hard-working method
Set<String> set = new HashSet<>();
set.add("a");
set.add("b");
set.add("c");
set = Collections.unmodifiableSet(set);

// Method 2 - collection from another collection
Set<String> set2 = Collections.unmodifiableSet(new HashSet<>(Arrays.asList("a", "b", "c")));

// Method 3 - the double-brace "trick"
Set<String> set3 = Collections.unmodifiableSet(new HashSet<String>() {{
    add("a"); add("b"); add("c");
}});

This is far from nice, and personally I find it annoying too, but I am not one to complain 😉
Since Java8 we could at least leverage the Stream API:

Set<String> set = Collections.unmodifiableSet(Stream.of("a", "b", "c").collect(toSet()));

but it is still quite verbose.

The new way

Although this is something already present in some libraries, such as Guava, Java9 brought a new and simplified way to do this to the standard library:

// Lists now work like this
List<String> l = List.of("a", "b", "c");

// And sets like this
Set<String> s = Set.of("d", "e", "f", "g");

// Maps have two ways of being created
Map<String, String> map = Map.of("k1", "v1", "k2", "v2", "k3", "v3");

// or with entry objects: Map.entry(K k, V v), statically imported here
Map<String, String> map2 = Map.ofEntries(
    entry("k1", "v1"),
    entry("k2", "v2"),
    entry("k3", "v3"),
    // ...
    entry("kn", "vn"));

This is much less verbose and can be really useful, especially for unit tests or for pre-populating a map at startup, for example.
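One caveat worth keeping in mind: the collections returned by these factory methods are immutable, so trying to modify them blows up at runtime:

List<String> letters = List.of("a", "b", "c");
letters.add("d");   // throws UnsupportedOperationException – the list cannot be modified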


StackWalker

JEP 259
You know those biiiiiiig stack traces full of classes you cannot even look into, or don't want to, because they are part of a library or of the Java runtime itself? Wouldn't it be nice if they only showed the classes you actually care about, i.e. the classes in your project?
Well, that is what this improvement does for you! Here is an example that collects the frames belonging to my project:

private static List<String> walkAndFilterStackframe() {
  return StackWalker.getInstance().walk(frames -> frames
      .map(frame -> frame.getClassName() + "/" + frame.getMethodName())
      .filter(name -> name.startsWith("org.fdiez"))
      .collect(Collectors.toList()));
}

Pretty cool huh?

Did you know?

Not a lot of people are aware that, prior to Java9, there was already something that tried to help you with stack traces. It has actually been available since Java4 and is called StackTraceElement. This little guy represents a single frame in the stack trace and lets you get the fully qualified name of the class where the error happened, among other information.
Along with this class came the addition of .getStackTrace() to the Throwable class, which gives you an array of StackTraceElement.
You could use this to (see the small example after the list):

  • Understand an application’s behavior.
  • Log stack trace element details to assist with debugging.
  • Find out who called a certain method in order to identify the source of a resource leak.
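A minimal sketch of the pre-Java9 approach, which captures the whole stack eagerly:

// snapshot of the entire stack at this point
StackTraceElement[] stack = new Throwable().getStackTrace();

// stack[0] is the current method, stack[1] its direct caller
System.out.println("called from: "
    + stack[1].getClassName() + "." + stack[1].getMethodName());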

So why do we need the StackWalker?

The approach introduced in Java4 was very costly and hurt performance, since it captured the full stack eagerly before letting you go through it. What Java9 does with its Stack-Walking API is let you traverse and filter the frames lazily, making the whole process more performant.
StackWalker is also thread-safe, i.e. multiple threads can share a single StackWalker instance without any problems.

Reactive Streams

JEP 266
This feature is actually part of a wider effort which aims to constantly improve the concurrency APIs in the Java ecosystem. It was also pushed forward by the Reactive Manifesto and, more precisely, guided by the Reactive Streams initiative, which aims at creating a standard for “asynchronous stream processing with non-blocking back pressure”. It is a mouthful, but nonetheless a good idea.

What was introduced

Interfaces that describe a Publisher and a Subscriber, the core pieces of a Reactive Streams setup. They were added as nested interfaces of a class called Flow (in java.util.concurrent).
The Publisher has one simple method that needs to be overridden:

void subscribe​(Flow.Subscriber<? super T> subscriber)

This allows a Subscriber to subscribe to whatever this Publisher is producing.

The Subscriber has 4 methods that it needs to override:

  • onComplete​() – Method invoked when it is known that no additional Subscriber method invocations will occur for a Subscription that is not already terminated by error, after which no other Subscriber methods are invoked by the Subscription.
  • onError​(Throwable throwable) – Method invoked upon an unrecoverable error encountered by a Publisher or Subscription, after which no other Subscriber methods are invoked by the Subscription.
  • onNext​(T item) – Method invoked with a Subscription’s next item.
  • onSubscribe​(Flow.Subscription subscription) – Method invoked prior to invoking any other Subscriber methods for the given Subscription.

That is basically it.
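To make this a bit more concrete, here is a minimal, hedged sketch using SubmissionPublisher, the Flow.Publisher implementation that ships with the JDK (the strings and the final sleep are purely for illustration):

import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class FlowDemo {
    public static void main(String[] args) throws InterruptedException {
        SubmissionPublisher<String> publisher = new SubmissionPublisher<>();

        publisher.subscribe(new Flow.Subscriber<String>() {
            private Flow.Subscription subscription;

            @Override public void onSubscribe(Flow.Subscription subscription) {
                this.subscription = subscription;
                subscription.request(1);                 // back pressure: ask for one item at a time
            }
            @Override public void onNext(String item) {
                System.out.println("received: " + item);
                subscription.request(1);                 // ask for the next item
            }
            @Override public void onError(Throwable t) { t.printStackTrace(); }
            @Override public void onComplete()          { System.out.println("done"); }
        });

        publisher.submit("hello");
        publisher.submit("flow");
        publisher.close();

        Thread.sleep(500);   // crude wait so the asynchronous delivery can finish before the JVM exits
    }
}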

But wait there is more!

Yes, there is more.
You can also create processor classes (Flow.Processor) that transform the data flowing from the publisher to the subscriber, and you can even control how many messages each subscriber will receive.
These improvements should facilitate the creation of Reactive Systems using Java!

Deprecate the Applet API

JEP 289
Since most web-browser vendors are removing support for Java browser plugins (they pose security risks), the decision was made to deprecate the Applet API. It is deprecated for now and will eventually be removed, although that will not happen in the next major release (Java11) and there is no fixed date yet. In the meanwhile, developers can be guided towards other technologies such as Java Web Start or installable applications.

Private methods in Interfaces

Java8 added the ability to have default and static methods inside interfaces. This was a big thing, especially if you are an API maintainer, since it allows you to evolve an API in a backward-compatible way.
With this release, they have “rounded off” the feature by also adding private and private static methods to interfaces.
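A small, hypothetical example of what this enables (the names are made up): a private helper shared by the default methods but invisible to implementing classes:

public interface Greeter {

    default void greetWorld()  { greet("world"); }
    default void greetReader() { greet("reader"); }

    // Java9: a private method, reusable by the default methods above
    // but not part of the public contract of the interface
    private void greet(String whom) {
        System.out.println("Hello, " + whom + "!");
    }
}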


I wasn't a big fan of the ideas added in Java8. I understand the benefits, yes, but there were ways around it before. I prefer pure interfaces that just describe a contract and let someone else fill in the details.
I like a sleek, to-the-point interface that is just a contract between two parties and stays well decoupled… but hey: that's just me!

Some honorable mentions

There is also some other interesting stuff you come across when going through the list of JEPs targeted for Java9. Here are a few:

  • JEP 158 – Unified JVM Logging
  • JEP 238 – Multi-Release JAR files
  • JEP 248 – Make G1 the Default Garbage Collector
  • JEP 252 – Use CLDR Locale Data by Default
  • JEP 254 – Compact Strings (more space efficient internal representation for strings)
  • JEP 253 – Prepare JavaFX UI Controls & CSS APIs for Modularization
  • JEP 283 – Enable GTK 3 on Linux
  • JEP 299 – Reorganize Documentation


How about you? Any favourites that weren't covered? Anything you are sad is still not around? Drop a line, I would be interested to hear what you think.

