~$ ./jackson.sh

Java is criminally underhyped

Published Thursday, April 15, 2021

The perspective of an ignorant computer science undergrad

It's likely that you read the title of this post and thought "what is this guy smoking? Java is everywhere!" You're correct: Java still dominates the enterprise and runs some of the world's largest mission-critical applications. But Java's adoption isn't what I'm talking about; I'm talking about its hype. I spend a lot of time around inexperienced programmers. And what do inexperienced programmers love doing? Getting excited and opinionated about tools like programming languages. None of the CS undergrads I meet are hyped about Java, but I think they should be.

Young/naive developers (myself included) often fall into the trap of fetishizing new languages and tools at the expense of productivity and sanity. Prior to working at Halp (now owned by $TEAM), I had a nearly romantic relationship with backend TypeScript. I thought the node.js ecosystem was the coolest thing ever: I loved the idea of transpiled code, live debugging, the massive package library, and even the weird and fragmented build systems. When I actually used it in production and spoke to more experienced engineers the magic quickly faded away.

I had an irrational affinity for the JS ecosystem because it was the hot new thing; it had hype. Reality did not live up to my expectations. Today, the wonderful things I expected from JavaScript are exactly what I'm enjoying as I gain experience with Java. I feel betrayed that hype did not lead me to Java sooner. Java is fun to write, productive, and gets an unfair reputation among new developers as a dinosaur.

Ergonomics are what make Java great #

This cannot be overstated: Java simply feels good to write. A lot of this is due to the craftsmanship JetBrains puts into IntelliJ IDEA. Everything is autocompleted, jump-to-definition is fast, find-usages works well, and refactoring is easy. However, where Java truly shines is the developer experience with third-party libraries.

My experience is limited, but I feel the winds have shifted in favor of liberal usage of external dependencies. Not Invented Here is out, Not Invented There is in. JavaScript developers in particular are extremely likely to include third-party libraries even for trivial operations like left-padding a number. I don't think the current affinity for third-party dependencies is particularly harmful, but upstream API changes can wreak havoc on untyped JS/Python code bases.

When consuming third-party libraries in Java, you always know exactly what types need to be passed to a method. Most importantly, incorrect usage of a function will result in red squigglies in your editor. Given that heavy library usage is in, I think more people should be excited about Java.

Nominal typing saves time #

There are a number of disadvantages to dynamic/duck/weak/whatever typing. When a dependency changes an API method and your application fails at runtime rather than at build time, that's a problem. When a developer has to refer back to the implementation of a method to figure out which types to pass it, that's a waste of time. TypeScript and Python type hints mitigate this problem a bit, but they lack the ability to validate passed types at runtime without extra code.

Type guards are my least favorite TypeScript feature. They're essentially duck typing that you have to implement yourself and trust that they're implemented correctly. In my opinion, this is the worst of both worlds. Consider the following:

interface Dog {
  bark: () => void;
}

/* The developer has to manually implement
   a heuristic check for interface adherence!
   When they update the interface, they have
   to update the type guards too! */
function isDog(pet: object): pet is Dog {
  return (pet as Dog).bark !== undefined;
}

const dog: any = {bark: () => console.log('woof')};

if (isDog(dog)) {
  // TS now knows that objects within this if statement are
  // always type Dog, because the type guard isDog narrowed
  // the type down to Dog
  dog.bark();
}

There's something about declaring a type AND having to write validation logic for said type that really rubs me the wrong way. The code above smells like someone using the wrong tool.

Unlike TypeScript definitions, Java's nominal type system takes load off the programmer's brain by crystallizing type definitions and providing the guarantees of type guards by default.
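For contrast, here's a minimal sketch of the same Dog example in Java. The names (Labrador, NominalTyping, makeNoise) are my own illustrative inventions; the point is that the compiler enforces the contract a hand-written type guard only approximates:

```java
// A Java take on the Dog example: no manual guard needed.
interface Dog {
    void bark();
}

class Labrador implements Dog {
    public void bark() {
        System.out.println("woof");
    }
}

public class NominalTyping {
    // Anything passed here is guaranteed by the compiler to
    // implement Dog; there is no runtime check to write or maintain.
    static void makeNoise(Dog dog) {
        dog.bark();
    }

    public static void main(String[] args) {
        makeNoise(new Labrador()); // prints "woof"
        // makeNoise("not a dog"); // compile-time error, not a runtime surprise
    }
}
```

If the Dog interface gains a method, every implementing class fails to compile until it's updated; with TypeScript type guards, forgetting to update isDog fails silently.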

Removal of responsibility for optimization #

Java developers can confidently trust the JVM to do what's best. Whether they're implementing a multithreaded application or storing a large amount of data on the heap, they can be confident they won't shoot themselves in the foot with memory management or data races. This advantage is primarily in comparison to C++, which contains a multitude of footguns.

This is part of Java's ergonomic experience. When a developer has to worry less about technicalities they can focus more on the problem at hand.

A holy grail of productivity #

How many languages can you think of that meet the following conditions?

  1. Quality package manager and build system (I 💚 Maven)
  2. Nominal typing
  3. Large community
  4. Hands-off optimization

Java is the only qualifying tool I've used, but let me know if there are others! edit: As Jwosty pointed out, Microsoft's Java competitor, C#, has all of these characteristics plus more/newer language features. I have never used C# outside the Unity game engine, so I cannot fairly judge it. However, even counting Mono, C#'s portability seems lacking compared to Java's.

Surprising absence from university curriculum #

I currently attend the University of Colorado Boulder; it's a great school but we're certainly not known for CS. However, the majority of our upper-division computer science curriculum is shamelessly stolen from CMU or Stanford, assignments and all. During my time at CU, I've used the following programming languages:

  1. C++. This language was chosen for all the required core courses: Computer Systems, Operating Systems, Data Structures, etc. This language is a reasonable choice as it enables direct memory management, creation of kernel modules, and presents many challenges and learning opportunities.
  2. Python and Julia. As you might expect, these languages were the darlings of numerical computation and discrete math professors.
  3. Scala. This language was used in Principles of Programming Languages instruction, primarily for its functional programming and pattern matching features. While Scala uses the JVM and interops with Java, it has a very different developer experience than Java.
  4. Web languages (HTML/CSS/JS). These were only used in a single course called Software Development Methods and Tools which focused on industry trends.

I am graduating this semester and Java has not made a single appearance; I think this is a shame.

The delightful scrutability of JVM bytecode #

Debugging external libraries, particularly ones that ship only as compiled binaries, can be exceedingly challenging. The difficulty is compounded further if the library vendor declines to include debugging symbols or a source map. The ease with which developers can include and inspect library code is one of the reasons for JavaScript's and Python's popularity.

Let's consider the worst-case scenario for working with an external library: the library is behaving in unexpected/undocumented ways and the code is obfuscated. One way this can happen is due to code minification, which is extremely common in the JS ecosystem. If you jump-to-definition and land in a minified JavaScript file, you're overwhelmed by a completely incomprehensible blob of syntax.

A similarly difficult situation comes from attempting to debug static libraries. While a few products (such as Hopper) can produce pseudo-code from binaries, it's still extremely difficult to interpret.

JetBrains' bytecode decompiler makes this nightmare slightly less bad. While still not ideal compared to reading real source code, it provides the easiest experience for investigating libraries you don't have source for. This is possible because JVM bytecode is organized into classes rather than a single stack of instructions (like we'd see from optimized gcc output).

Someone left an insightful comment about how Java propelled Minecraft's popularity by making extensive modding possible (even against the developer's intentions):

The amount of vitriol Java receives in the game development space is rather strong. However, whenever people dismiss Java as being a bad language, I can't help but think of Minecraft.

Yes, it's one of the few mainstream titles where Java has been successfully applied, but I believe Minecraft's success was specifically because it was written in Java.

With no JVM, there would have been no Minecraft mods at the level of flexibility that Forge provides, and the game would probably have stagnated in comparison with the sheer amount of community content from the past ten years that's available. Being able to use libraries like ASM that allow you to transform the compiled bytecode in radical ways was possible only because Minecraft targeted a virtual machine. The incredible thing was that this was in spite of the obfuscation Mojang applied to the compiled source. If it was possible to create a flexible mod system at all thanks to the JVM, people were just too motivated for any deterrence to stop them.

For the people who say that Minecraft should have been written in C++ or something from the start, because Java is a mediocre programming language, there's Bedrock Edition. Nobody I know cares about it enough to play it. For all its bloat and performance issues, the benefits of the JVM were simply too convincing.

-- nonbirithm

Out of all the machine code formats, the JVM's is easily the most human-understandable. While it may not be relevant often, it's still an interesting boon of the JVM ecosystem.

Conclusion #

There is no One True Way to build applications, but I think that Java doesn't get enough attention particularly among startups and the newbie programming community. Untyped languages are useful tools, but I don't think they should be the default choice for building large applications. If you're a full-stack dev and have never extensively used Java, I think you'll be pleasantly surprised should you try it for your next project.

Java and the JVM were hyped to the moon in the '90s and early 2000s, and I don't think that hype should ever have died out! The developer experience I've found with IntelliJ and Java is worth getting excited about.

I am curious why Java lost its hype in the first place. Programmer culture history is poorly documented and if you have insight, please email me or leave a comment.
