The JVM is a monster – in a good way. Its architecture, and especially its heavily optimized just-in-time (JIT) compiler, helps Java bytecode apps run at a level of performance that makes startups rewrite their entire software for it once they grow up. But this amazing piece of engineering doesn’t come without a price: even small JVM applications are known to eat up a lot of memory, and the start-up times of non-trivial JVM apps are – well – iconic.
GraalVM is a newish JDK distribution from Oracle Labs. It partly shares its codebase with the regular OpenJDK, a variant of which you most likely use on a daily basis, but focuses on performance and polyglot support. On the JVM side, GraalVM replaces the JIT compiler in the HotSpot Java Virtual Machine with a custom version that, especially in the enterprise edition, is expected to deliver significantly better performance (34%, according to their own benchmarks), along with some memory savings.
The more radical performance-related feature is the ability to run without a JVM at all. GraalVM contains tooling (an “ahead-of-time compiler”) to compile Java applications into native executables. While the JVM improvements in GraalVM bring a notable performance gain, and the JVM still beats native in long-running server processes, native compilation is a game-changer in areas where Java apps used to suck. Start-up and execution times of Java apps compiled to native code are comparable to raw C apps, and the memory reserved by a trivial application is a fraction of what the JVM needs just to boot up.
I recently spent a while setting up native compilation for a couple of Vaadin examples. It is certainly possible, but for most use cases probably not worth it. Let’s discuss what it takes to compile your Vaadin application as a native image, what it means for performance characteristics, and if/when you should consider it.
How to compile to native
When creating native images, GraalVM takes your bytecode, the JDK and a custom VM, and compiles all of that into a native binary that your OS and processor can run as-is. Sounds easy, until you hear the limitations: the dynamic features of Java don’t work out of the box. Native compilation needs hints, for example about reflection usage and about resources accessed from the classpath. Thus, for most Java applications, native compilation isn’t just a matter of adding a new build step.
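To see why hints are needed, consider the kind of dynamic lookup the ahead-of-time compiler cannot follow: a class resolved by name only at run time. A minimal sketch (the class and its usage here are illustrative, not taken from the examples discussed):

```java
import java.lang.reflect.Constructor;

public class ReflectionDemo {
    public static void main(String[] args) throws Exception {
        // The class name is only known at run time, so static analysis
        // cannot tell that ArrayList must be kept in the native image –
        // without a hint, Class.forName would fail there.
        String className = args.length > 0 ? args[0] : "java.util.ArrayList";
        Class<?> clazz = Class.forName(className);
        Constructor<?> ctor = clazz.getDeclaredConstructor();
        Object instance = ctor.newInstance();
        System.out.println("Created: " + instance.getClass().getName());
    }
}
```

On a regular JVM this prints `Created: java.util.ArrayList`; in a native image, the same code throws an exception unless the class has been registered in the reflection configuration.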
Because so many of the Java libraries we use rely on reflection, and because of the overall complexity of native compilation, I suggest starting with a stack that is built for native compilation from the ground up. I chose Quarkus for this exercise, for which we recently added an official Vaadin integration library. Quarkus has nice instructions for getting started with native compilation, and most “Quarkus extensions” contain hints for GraalVM about reflection usage and the like.
The Vaadin extension is one of those that don’t yet come with these hints, so you can expect a bunch of exceptions during the build when you try it for the first time. To overcome the compilation issues caused by the Vaadin UI, I followed these great instructions. The only sane way to make a non-trivial app compile to native is to run the app with a special JVM agent, which automatically collects the needed hints, for example about reflection usage.
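Assuming a standard Quarkus project layout (the paths here are illustrative), running the app with GraalVM’s tracing agent looks roughly like this:

```shell
# Run the app in JVM mode with GraalVM's native-image tracing agent.
# While you click through the UI, the agent records reflective accesses,
# resource loads, proxies etc. and writes reflect-config.json and friends
# into the given directory, where the native-image tool later picks them up.
java -agentlib:native-image-agent=config-output-dir=src/main/resources/META-INF/native-image \
     -jar target/quarkus-app/quarkus-run.jar
```

Note that the agent can only record what actually gets executed, so exercise all the views of your app during the run.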
In some places, I changed my dependencies to tools that were compatible out of the box. I moved from an embedded H2 database to a separate database with a compatible JDBC driver. Also, my demo data generation used Node.js calls behind the scenes, so it was much easier to switch to an SQL script to populate the DB.
In addition to the DB-backed example, I made a version of the Quarkus starter project that compiles to native. It is rather trivial, but it can probably help you get your own app to compile.
Quarkus itself comes with a bunch of helpers for native image creation. In the starter example, I manually moved the hints generated by the JVM agent into a Quarkus-specific Java class that passes the same hints to the toolchain in a slightly nicer format. In an ideal world, the Vaadin extension would provide these hints for the core library out of the box. If you are interested in this feature, please go and give a thumbs-up to the enhancement issue I created.
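A Quarkus-specific hints class of the kind mentioned above can be built on Quarkus’ `@RegisterForReflection` annotation. A minimal sketch – the target classes below are hypothetical placeholders, not from the actual example:

```java
import io.quarkus.runtime.annotations.RegisterForReflection;

// Registers the listed classes for reflection in the native image,
// replacing the raw reflect-config.json entries the agent generated.
// The target classes are made-up examples.
@RegisterForReflection(targets = {
        com.example.ui.MainView.class,
        com.example.data.Customer.class
})
public class NativeImageHints {
}
```

The advantage over the raw JSON files is that these references are checked by the compiler and survive refactoring.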
Pros and cons of native compilation
Quarkus applications are fast to start up even on a normal JVM, but you’ll definitely see the difference once you have compiled one to native. In my example app, the start-up time is just a fraction of a second, even with Hibernate et al.
Native compilation definitely makes a difference in certain areas, but does it make sense for Vaadin applications? During my tests, I collected the following list of pros and cons:
Pros:
- Less memory reserved for the process
- Smaller artifacts
- Faster startup
- No “warm-up period” for a new process, during which the JIT compiler hasn’t yet brought performance to its optimum
Cons:
- Worse peak performance compared to a warmed-up JVM, which beats native compilation easily even without the “enterprise edition” of GraalVM
- Much longer build times; on the bright side, the frontend’s webpack step now feels like a snap :-)
- A more complex and fragile build; even when the last compilation issue is cleared in your Java IDE, you certainly can’t expect your app to work as a native image
- Not compatible with the JVM tooling (debugging, JVM agents, configuration)
- Artifacts are no longer platform independent; the “write once, run everywhere” promise is broken to some degree.
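For reference, the much longer build mentioned above boils down to one (slow) command in a standard Quarkus Maven project:

```shell
# Produces a native executable (target/*-runner) for the build platform.
# Depending on the Quarkus version, the switch is -Pnative or -Dnative.
# Add -Dquarkus.native.container-build=true to run the GraalVM toolchain
# in a container if you don't have it installed locally.
./mvnw package -Pnative
```
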
While I’m impressed by the capabilities of natively compiled Java apps, and can see great possibilities for the Java ecosystem to expand into areas that were ruled out before, I’d still say the JVM is the way to go for most Vaadin apps. The greatest benefit for Vaadin apps would probably be the memory savings. But as Vaadin deployments love memory anyway, even when running natively, I’d claim it’s much cheaper to buy more memory for your servers than to fiddle with native compilation. The trade-off is basically the same reason you probably chose Vaadin in the first place: developer productivity over hosting expenses.
The memory savings might become relevant in certain scenarios, though – like if you have a lot (like dozens or hundreds) of different Vaadin apps running (with just a few active users in each). This could happen in, for example, a SaaS service where you chose to have a separate deployment for each and every customer.
Don't miss the possibilities of "the other GraalVM"!
While I’m currently hesitant to suggest native compilation to our customers, I want to highlight a thing about GraalVM that many Java developers miss: GraalVM != native binary compilation. GraalVM is an umbrella project for multiple modernization efforts around the JDK.
The most relevant part for most Java developers is probably the fresh just-in-time (JIT) compiler, written in Java. When using it, you’ll be running almost the same JVM as with a normal JDK, but with better performance on almost all fronts, especially in the enterprise edition. Nothing about your development process or conventions really changes but the name of the JDK. The GraalVM project supports Java 17, and commercial support is available, so it should be a safe option to consider for those hosting CPU-intensive Java projects.
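Taking the Graal JIT into use needs no build changes – you simply run your app on the GraalVM distribution instead of your current JDK. Assuming GraalVM is unpacked to /opt/graalvm (the path is hypothetical):

```shell
# Point the environment at the GraalVM installation...
export JAVA_HOME=/opt/graalvm
export PATH="$JAVA_HOME/bin:$PATH"
# ...and run the same jar as before, now JIT-compiled by the Graal compiler.
java -jar target/myapp.jar
```
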