How Java and Ruby made me a better Node developer

Herein is a partial recounting of the bits and bobs that I have learned passing through different programming languages: principles and processes that I found useful, and not so. It is by no means a complete recounting, but rather meant to help people understand why it's so important to try new things, even when you think you've 'got it down', and how each culture brings new and innovative ideas that can better your code. Onward...

Just shy of a decade with Java

There are many things to be said in favor of and against Java. While I personally have no plans of returning to the language in the mid-term future, I learned many things from it and its community.

The Java Virtual Machine is a rock. Meaning, it is a very suitable foundation upon which to build the next generation of languages, much the same as C is. It allows portable code to be crafted such that it is highly performant on just about any system the JVM runs on, in languages (Groovy, Scala, Clojure, JRuby, and more) that are much more terse and flexible.

Everything in Java-land is configurable. This can be both a boon and a bane. Many Java libraries take their configurability to the point of absurdity, which in turn leads to codebase bloat. Spring is a fantastic example of this. It started off as a great DI platform, and ended up with nearly a J2EE level of complexity; in and of itself it requires hundreds of megabytes of space on disk, to say nothing of its footprint once it gets into memory. It is not desirable that every library should cover every possible option.

The Java community has a love for design patterns. I was first exposed to many useful patterns that I still use in other languages while I was in Java. Here are a few that I still use quite regularly: MVC (or rather, variants thereof), Observer, Messaging, Facade, Factory, and Flyweight. Many of the others still apply, but aren't as useful when you have greater syntactic flexibility. Realizing that these patterns are available to me has been a great help: when I am trying to reason through particularly difficult problems, I can fall back on some of these patterns and find possible solutions. Unfortunately, the Java community takes this to an unhealthy level in a lot of cases. They get bogged down in talking about patterns instead of solutions. As martial arts masters say: first you have to learn how to do it, then you have to unlearn how to do it, and just do it.
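To show how little ceremony these patterns need outside of Java, here's a minimal Observer sketch in JavaScript (the class name and the order events are hypothetical, just for illustration):

```javascript
// Minimal Observer: observers register callbacks, and the subject
// notifies every one of them when something happens.
class Subject {
  constructor() {
    this.observers = [];
  }
  subscribe(fn) {
    this.observers.push(fn);
  }
  notify(data) {
    this.observers.forEach(fn => fn(data));
  }
}

// Hypothetical usage: two independent reactions to one event.
const orders = new Subject();
orders.subscribe(order => console.log('email receipt for order', order.id));
orders.subscribe(order => console.log('restock inventory for order', order.id));
orders.notify({ id: 42 }); // both observers fire, in subscription order
```

In Java this pattern typically means an interface, an abstract class, and a handful of files; in a dynamic language the same idea fits in a dozen lines.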

Document all the things. When you work with so many highly configurable pieces of code, you have to make sure you know exactly how to make those libraries perform as you wish. The Java community is fantastic at generating vast quantities of documentation. There is a lot of white noise when you go looking for a specific problem that you have, but more likely than not, someone has your problem documented, somewhere. You just have to dig hard enough to find it. There are many communities that could learn from this approach.

Boilerplate is bad, mm'kay? Statically typed languages require lots of it, no matter how you slice the problem, particularly when trying to encapsulate absolutely everything. My takeaway from this is that boilerplate, while sometimes necessary, is undesirable. If you find yourself constantly writing the same pieces of syntax over and over, you're probably wasting your time and making your code more complex. Sometimes the solution is to refactor. Sometimes you need to re-evaluate the tools that you're using.

No conversation about Java would be complete without mention of its dogmatism. This was one of my greatest takeaways from the Java community: don't adhere to dogma. More than that, don't listen to appeals to authority. Just because it's writ in an RFC somewhere doesn't make it the right way to do something. The Java culture takes its 'thou shalts' to an unhealthy level, and its practitioners end up writing massive amounts of code in order to prove their adherence.

Two years in Ruby

Ruby was built with developer happiness and ease of use in mind. It is an object oriented language that inherits a lot of its thought processes from Smalltalk. It was my first completely object oriented language, and I must admit that I found its terseness to be a breath of fresh air when I first joined that world after so long in Java. When I was first coding for the web using Perl and PHP, and then went to Java, I thought for a very long time that it would be almost impossible to go back to a dynamic language, because of how poorly written all the dynamic code I had seen in Perl and PHP was. Oh how wrong I was, but it took a good amount of my own personal time, and jumping off the cliff into enterprise Ruby, to truly find out how "ready for prime time" these new dynamic languages have become!

While I was not originally exposed to the Pareto principle ("80/20 rule") in Ruby, it was the first time that I saw many libraries trying to follow its advice. While controversial to many people, I saw this work out very well in a lot of cases. Take ActiveRecord for example. That library makes it very easy to do basic database queries against any number of relational databases, but if you need to do something more specific, like call a stored procedure, you can just use the connection directly rather than the library trying to encapsulate that functionality somehow.

Convention over configuration. A phrase that seems to have come directly from the Ruby community, as far as I can tell. Those involved in Ruby still build some fairly large libraries (though nothing approaching the monolithic ones in Java), but they provide sane defaults. A good portion of the time, even if you're using a fairly sizable library, you can start by just bringing it in and using the default settings it provides. Ruby was the first language I had seen that provided sane defaults you didn't have to shepherd and tweak to get just right.

Keeping It Simple and Stupid is a mantra for the Ruby community. That's where a lot of the innovation people hear about from Ruby-land comes from. Don't overbuild. As DHH said, Rails is omakase. Don't like ActiveRecord? Don't use it. Need to kick some processing out of your web worker thread? Great, put it on a queue somewhere for a worker to pick up, but don't overbuild that queue. You should have sane defaults so that you can get a worker on that queue almost immediately. Anything else is overbuilding and a waste of time.
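In that spirit, a deliberately minimal in-process job queue can be sketched in a few lines of JavaScript (all names and jobs here are hypothetical; a production system might later swap in a Redis-backed queue, but you don't have to start there):

```javascript
// The simplest queue that could possibly work: an array of jobs and a
// drain loop that runs them outside the request/response cycle.
const jobs = [];

function enqueue(job) {
  jobs.push(job);
}

function drain() {
  while (jobs.length > 0) {
    const job = jobs.shift(); // FIFO: oldest job first
    job();
  }
}

// Hypothetical usage: defer the slow work, then let a worker drain it.
enqueue(() => console.log('resize uploaded image'));
enqueue(() => console.log('send welcome email'));
drain();
```

Start this simple, and only reach for a heavier queue once you actually need durability or multiple machines.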

Two very important acronyms: DRY (Don't Repeat Yourself) and YAGNI (You Ain't Gonna Need It). These are mantras that I find myself repeating constantly while I'm programming. "Do I need to be able to do that with this piece of code? ..Nope!" And when I start to copy and paste a block of code, or start retyping code that I know exists elsewhere, I immediately stop myself and start refactoring so that only one piece of code performs any particular function. I can't even begin to recount how many times this has saved my bacon when hotfixing applications.
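A tiny JavaScript illustration of the DRY half (the function and the numbers are made up): instead of pasting the same calculation into every call site, extract it once, so a hotfix only has to land in one place.

```javascript
// Before DRY: `amount * (1 + rate)` was copy-pasted into every handler.
// After: one function is the single source of truth for the rule.
function applyTax(amount, rate = 0.08) {
  // Round to cents so every caller gets the same rounding behavior.
  return Math.round(amount * (1 + rate) * 100) / 100;
}

const invoiceTotal = applyTax(100);   // 108
const cartTotal = applyTax(19.99);    // 21.59
console.log(invoiceTotal, cartTotal);
```

When the tax rule changes, the hotfix touches one function instead of a scavenger hunt through the codebase.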

Rails style: "Skinny controller, fat model". This means that you should put your business logic, the logic that actually performs changes, in the model. The controller shouldn't be overbuilt; it should only contain enough code to tell the model what to do with itself, and then pass the information back up to the client. I think that this doesn't go far enough sometimes. There are usually opportunities to improve even further on this during fairly complex processes. Rather than expanding a model to quite large proportions, I occasionally take a cue from Java-land and write up a service layer instead, so that you're not bloating your controller or your model to an unmanageable level.
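Sketched in JavaScript, the service-layer idea looks something like this (every name below is hypothetical; the point is only where the multi-step logic lives):

```javascript
// The service owns the multi-step business process, so neither the
// controller nor the model has to bloat to hold it.
class CheckoutService {
  constructor(orderRepo, mailer) {
    this.orderRepo = orderRepo; // persistence stays behind the model/repo
    this.mailer = mailer;       // side effects get coordinated here
  }

  placeOrder(cart) {
    const order = this.orderRepo.create(cart);
    this.mailer.sendReceipt(order);
    return order;
  }
}

// The controller stays skinny -- it only delegates, e.g. in Express:
// app.post('/orders', (req, res) => res.json(checkout.placeOrder(req.body)));
```

The model keeps its persistence concerns, the controller keeps its HTTP concerns, and the process itself has one obvious home.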

While the Ruby culture is not quite as dogmatic as the Java culture, its members still definitely believe there is a right way and a wrong way to build your applications, particularly if you live in Rails-land rather than one of the subcultures (Padrino/Sinatra, for example). Still, the Ruby culture is more palatable and less stodgy than the Java one, in my estimation.

What's more: The Ruby culture taught me that development can be fun again. I blame that expressly on the fact that so few "paycheck developers" spend their evenings learning Ruby, so it is a group of enthusiasts instead of people that are looking for their next gig.

A year of Node

My introduction to Node came when I started using it regularly as tooling, which I do believe is how it is getting its induction into a lot of development groups! Whether it is setting up an asset pipeline (grunt/gulp), giving you a foundation for your next web application (yeoman), or managing your front end JavaScript dependencies (bower), there are a lot of uses for Node without even utilizing it as a primary language. That's a pretty big win. Once something starts to feel familiar, you can't help but get curious about how more of it works. Eventually, I decided to take the plunge. I'm glad I did, despite some of the pundits that speak of Node's insufficiencies.

So far in my time spent in Node, there are two big things that I have taken away, one of which I've mildly hinted at in the Java and Ruby sections: modularity. Ruby still has some fairly large stacks that it requires one to use. While Rails is not one monolithic framework, but a collection of fairly large frameworks, it is still nearly 200MB in memory when it is first loaded. While memory is cheap, and disk is too, I still don't believe that relieves us developers from paying attention to those numbers. When I first built an Express application with Node and it used less than fifty megabytes from the get-go, I knew I was hooked. Sure, there are some lighter weight frameworks in Ruby and Java, but that's not really how their communities think. The strongest and most active communities in Node revolve around the ExpressJS foundation. The primary application that I support today is less than 150MB, with all code, in memory. The largest Node app that I maintain doesn't go over 300MB in memory until it starts taking requests. I remember one Java gig I took where they had a server with 32GB of memory for me on my first day, because the application itself took up more than 8GB in memory... I still look back and shake my head.

Asynchrony is pure win, but it has its trade-offs; chief among them is comprehension. It can be harder for the developer to reason through code when nearly everything is an event (everyone that has spent much time in JavaScript has heard of "Callback Hell"). Despite that, there are many advantages to having your code be non-blocking. Any program that has to spend time in wait cycles (and any program that accesses a database spends lots and lots of time waiting) is losing computer cycles just sitting there, waiting for that response. Node takes full advantage of that time spent waiting to do other things, instead of making them wait in line. This is primarily achieved with the heavy use of streams, and any non-trivial application will eventually end up doing some work with them. It'll probably bend your mind a bit at first, but eventually you'll see the light. At this point, I'd have a really hard time justifying going to an "IO-blocking" language again, having seen the ease with which Node can scale.

And.... That's a Wrap!

Phew! That was quite a bit! I hope you found this review interesting, or at least a bit insightful. Feel free to leave me a comment either way; I'd like to hear what you think. Whatever you do, I would encourage you not to get too comfortable where you're at. Keep pushing the boundaries. Keep innovating. Because that is what being a coder is all about.


tamouse said...

Really great wrap up. Now you have at least a year's worth of articles to write expanding on this in detail :)

Christopher Rueber said...

Thanks! And that's exactly the goal. To try and write in this blog every week or two. :) It's been pretty dull in here for a while, trying to resurrect this thing and see how it goes!
