Monday, January 29, 2007

Thinking Differently with Design Patterns, Java and Accidental Complexity

Bob Lee has posted a very useful tip on performant singletons. Have a look if you are writing concurrent Java applications for the enterprise. From plain old synchronization to double-checked locking (DCL) to the initialize-on-demand holder (IODH) idiom - I guess this pattern's implementation has come full circle in Java. If you are into Java, follow this idiom; it is possibly the best you can get for a fast, thread-safe, lazily loaded singleton with JLS guarantees.

This post is not about singletons, although we start with the code for implementing that same pattern under the post-Java-5 memory model:

public class Singleton {
  private Singleton() {}

  static class SingletonHolder {
    static final Singleton instance = new Singleton();
  }

  public static Singleton getInstance() {
    return SingletonHolder.instance;
  }
}

Every time you need a singleton in your application, make use of the above idiom. Unless you are coding a trivial application, very soon you will feel the spiralling cost of the growing number of classes. This is a basic problem with many Java idioms and design patterns: we try to force functional programming paradigms through nested classes or anonymous inner classes. This is what many refer to as accidental complexity in modeling, which very often tends to overshadow the domain complexity, resulting in lots of glue code.

I am not in the league to snub Java. I am a Java programmer myself and have been doing OO with Java and C++ for the last 10 years. The GOF design patterns book has been my bible, and all my thoughts have, so far, been soaked in the practices and principles that the book professes. With Ruby and Lisp, I have started to think about programming a bit differently. And as Alan Perlis put it in one of his epigrams, "A language that doesn't affect the way you think about programming, is not worth knowing".

Singleton Pattern Elsewhere

require 'singleton'

class Foo
  include Singleton
end

That's it! Ruby's powerful mixin functionality automatically makes our class a singleton - the new method is rendered private, and we get a class-level instance method for getting the object.
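A quick check of the behavior (a sketch; Foo is just the example class above):

```ruby
require 'singleton'

class Foo
  include Singleton
end

# instance always hands back the same object
a = Foo.instance
b = Foo.instance
puts a.equal?(b)  # => true

# new has been made private by the mixin
begin
  Foo.new
rescue NoMethodError
  puts "new is private"
end
```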

Another new-generation language, Scala, offers the object keyword for implementing singletons.

object SensorReader extends SubjectObserver {
  // ..
}

In this example the object declaration for SensorReader creates a singleton - a class with exactly one instance.

Design Patterns in Java and Accidental Complexity

As a language, Java does not offer the powerful functional abstractions that Ruby, Scala or Lisp provide. Hence many design patterns which are invisible in these languages stand out as elaborate design constructs in Java. These add to the *accidental complexity* of a Java codebase and often turn out to be more difficult to manage than the *actual complexity*, which is the complexity of the domain that you are trying to model. Technologies like aspects and metadata-based annotations are attempts to raise the abstraction level of the Java programming language. Unfortunately these can never give the programmer the seamlessness of extending the syntax of the core language. The programmer will never be able to program bottom-up or carve out a DSL as elegant as Rails using Java. Norvig has an excellent presentation on how dynamic languages make many of the GOF patterns invisible. The presentation illustrates how macros can ease the implementation of the Interpreter design pattern, how method combinations can make Observers seamless, and how multi-methods can ease the implementation of the Builder design pattern. Mark Dominus has posted a very thought-provoking essay concluding that patterns are signs of weakness in programming languages. What he means is that languages in which we need to write repetitive code to solve recurring problems lack abstraction power. The very fact that we have to repeat the code for the Strategy design pattern for every application of the pattern in Java implies that the language lacks the extensibility to absorb the design construct into itself. And by doing so, the implementation inherits lots of *accidental complexity*, or yellow markers, as part of the codebase. OTOH, in a typical functional implementation, the strategy is simply a variable whose value is a function, and with first-class functions, the pattern is invisible.
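To make the last point concrete, here is a minimal Ruby sketch (my own illustration, not from the Norvig or Dominus material) where the "strategy" is just a variable holding a function:

```ruby
# the strategy is a plain lambda; swapping behavior is just reassignment,
# with no class hierarchy and no pattern boilerplate
strategy = lambda { |x| x * 2 }
puts [1, 2, 3].map { |x| strategy.call(x) }.inspect  # => [2, 4, 6]

strategy = lambda { |x| x + 10 }
puts [1, 2, 3].map { |x| strategy.call(x) }.inspect  # => [11, 12, 13]
```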

The singleton pattern implementation in Java, despite providing a performant solution, also contributes to this accidental complexity of the codebase. The same is true of many other pattern implementations in Java or C++.

Thinking Differently

I cannot imagine myself writing about the lack of abstractions in OO languages had I not been exposed to Ruby, Scala or Lisp. I realize the truth in the 19th epigram of Alan Perlis - these languages have really affected my thinking on programming at large. Now I can appreciate why Steve Yegge describes design patterns [in Java] as mostly pounding star-shaped pegs into square holes.

Thursday, January 18, 2007

Syntax Extensibility, Ruby Metaprogramming and Lisp Macros

Over the last few days I have been feeling a bit Lispy. It's not that I have been immersed in Lisp programming; I still do Java for my day job and enjoy the process of staring at reams of standard object-oriented API calls and the gigantic frameworks that provide the glue code for enterprise software. Java is still my favorite programming language, I still enjoy writing Java, and I have recently taken on bigger commitments to write more Java with more Spring and more Hibernate.

The only difference is that I have started reading Paul Graham's On Lisp again!

I am convinced that I will not be programming production-level business applications in Lisp in the foreseeable future. But reading Lisp makes me think differently: the moment I start writing event listeners in Java Swing, I start missing true lexical closures and longing for higher-order functions in the language. Boilerplate irritates me much more and makes me imagine how I could have modeled it better using Scheme macros. True, I have been using the best IDEs and leaving it to their code generation engines to produce all the boilerplate; I have also put a bit of an MDA into my development environment that generates much of the code from the model. I am a big fan of AOP and have been using aspects for quite some time to modularize my designs and generate write-behind logic through the magic of bytecode weaving.

The difference, once again, is that I have been exposed to the best code generator of all time - the one with a simple uniform syntax and access to the whole language parser, which receives the source code as a single uniform data structure and knows how to munch out the desired transformation in a fail-safe manner, day in and day out: the Lisp macro.

Abstractions - Object Orientation versus Syntax Construction

As someone obsessed with the OO paradigm, thriving on the backbone of objects, virtual functions and polymorphism, I have learnt to model abstractions in terms of objects and classes (the kingdom of nouns). I define classes on top of the Java language infrastructure, add data members as attributes, add behavior through methods defined within the classes that operate on those attributes, and, whenever the need arises, invoke the methods on an instantiated object. This is the way I have, so far, learnt to add abstraction to an application layer. Abstraction, as they say, is an artifact of the solution domain, which should ultimately bring you closer to the problem domain. We have:

Machine Language -> High Level language -> Abstractions in the Solution Domain -> Problem Domain

In the case of object-oriented languages like Java, the size of the language is monstrous; add to that at least a couple of gigantic frameworks, and abstractions are clearly guests on top of the language layer. Lisp, in its original incarnation, was conceived as a language with very little syntax. It was designed as a programmable programming language, and developing abstractions in Lisp enriches not only the third block above, but a significant part of the second block as well. I now get what Paul Graham has been talking about: programming bottom-up, the extensible language, building the language up toward your program.

Take this example:

I want to implement dolist(), which performs an operation on each member of a list. With a Lisp implementation, we can have a natural extension of the language through a macro

(dolist (x '(1 2 3)) (print x) (if (evenp x) (return)))

and the moment we define the macro, it blends into the language syntax like a charm. This is abstraction through syntax construction.

And, the Java counterpart will be something like :

// ..
Collection<..> list = ... ;
    new Predicate() {
      public boolean evaluate() {
        // ..
// ..

which provides an object-oriented abstraction of the same functionality. This solution provides the necessary abstraction, but it is definitely not as seamless an extension of the language as its Lisp counterpart.
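For comparison, Ruby's blocks get close to the seamlessness of the Lisp version. A small sketch mirroring the dolist example above:

```ruby
# iterate, remember each element, and stop at the first even one -
# the control construct reads as part of the language itself
seen = []
[1, 2, 3].each do |x|
  seen << x
  break if x.even?
end
puts seen.inspect  # => [1, 2]
```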

Extending Extensibility with Metaprogramming

Metaprogramming is the art of writing programs that write programs. Languages which offer syntax extensibility provide the natural paths to metaprogramming. And Java is a complete zero in this regard. C offers more trouble to programmers through its whacky preprocessor macros, while C++'s template metaprogramming facilities are no less hazardous than pure black magic.

Ruby offers excellent metaprogramming facilities through its eval() family of methods, here-docs, open classes, blocks and procs. Ruby is a language with a very clean syntax, the natural elegance of Lisp and extremely powerful metaprogramming facilities. Ruby's metaprogramming capabilities have given a new dimension to API design in applications. Have a look at this example from a sample Rails application:

class Product < ActiveRecord::Base
  validates_presence_of :title, :description, :image_url
  validates_format_of :image_url,
    :with => %r{^http:.+\.(gif|jpg|png)$}i,
    :message => "must be a URL for a GIF, JPG, or PNG image"
end

class LineItem < ActiveRecord::Base
  belongs_to :product
end

It's a really cool DSL made possible through the syntax extension capabilities offered by Ruby. It's not so much OO that Rails exploits to offer great APIs; rather, it's Ruby's ability to define new syntactic constructs through first-class symbols that adds to the joy of programming.

How would the above LineItem definition look in a Lisp database binding? Let's take this hypothetical model:

(defmodel <line_item> ()
  (belongs_to <product>))

The difference from the above Rails definition is the use of macros in the Lisp version as opposed to class methods in Rails. In the Rails definition, belongs_to is a class method which, when called, defines a bunch of instance methods in the class LineItem. Note that this is a commonly used idiom in Ruby metaprogramming, where the base class can define methods in the derived class. But the main point here is that in the Lisp version, the macros are expanded during the macro-expansion phase, before the program runs, and hence provide an obvious performance improvement over the Rails counterpart.
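The Ruby side of that idiom can be sketched in a few lines. ModelBase and the generated accessors below are hypothetical stand-ins for what ActiveRecord actually does, just to show the mechanism:

```ruby
# A class method defined on the base class runs at class-definition time
# in the subclass, and uses define_method to add instance methods there.
class ModelBase
  def self.belongs_to(name)
    define_method(name) { instance_variable_get("@#{name}") }
    define_method("#{name}=") { |val| instance_variable_set("@#{name}", val) }
  end
end

class LineItem < ModelBase
  belongs_to :product   # defines product and product= on LineItem
end

item = LineItem.new
item.product = "a rubber duck"
puts item.product  # => "a rubber duck"
```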

Another great Lispy plus ..

Have a look at the following metaprogramming snippet in Ruby, which uses class_eval to generate the accessors in a sample bean class:

  properties.each do |prop|
    class_eval <<-EOS
      def #{prop}
        @#{prop}
      end

      def #{prop}=(val)
        @#{prop} = val
      end
    EOS
  end
Here the code which the metaprogram generates is embedded within a Ruby here-doc as a string - and eval-ing a string is not the recommended best practice in the Ruby world. Such stringified code is not treated as first class, in the sense that IDEs do not recognize the snippets as code and neither do debuggers. This has been described in his usual style and detail by Steve Yegge in this phenomenal blog post. Using define_method makes the code IDE-friendlier, but at the expense of readability and speed. The whacky class_eval runs much faster than the define_method version: a rough benchmark indicated that the class_eval version ran about twice as fast on Ruby 1.8.5 as the one using define_method.

  properties.each do |prop|
    define_method(prop) do
      instance_variable_get("@#{prop}")
    end

    define_method("#{prop}=") do |value|
      instance_variable_set("@#{prop}", value)
    end
  end
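The speed claim can be checked with a rough micro-benchmark along these lines. This is only a sketch - the class names are illustrative, and absolute numbers vary widely across Ruby versions and machines:

```ruby
require 'benchmark'

# accessors generated via class_eval on a string
class EvalBean
  class_eval <<-EOS
    def prop; @prop; end
    def prop=(v); @prop = v; end
  EOS
end

# the same accessors generated via define_method
class DefineBean
  define_method(:prop)  { @prop }
  define_method(:prop=) { |v| @prop = v }
end

n = 200_000
e, d = EvalBean.new, DefineBean.new
Benchmark.bm(14) do |bm|
  bm.report("class_eval")    { n.times { e.prop = 1; e.prop } }
  bm.report("define_method") { n.times { d.prop = 1; d.prop } }
end
```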

Anyway, all of these are examples of dynamic metaprogramming in Ruby, since everything gets done at runtime. This is a big difference from Lisp, where the code templates are not typeless strings - they are valid Lisp data structures which the macro processor can process like normal Lisp code, since macros in Lisp operate on the parse tree of the program. Thus code templates in Lisp are IDE-friendly, debugger-friendly, real first-class code snippets. Many people have expressed the wish to have Lisp macros in Ruby - Ola Bini has some proposals on that as well. From whatever little I have seen of Lisp, Lisp macros are really cool and a definite step toward providing succinct extensibility to the language through user-defined syntactic control structures.

OO Abstractions or Syntax Extensions ?

Coming from an OO-soaked background, I could only think in terms of OO abstractions. Ruby is possibly the first language that has pointed me to situations where syntax extensions scale better than OO abstractions - Rails is a live killer example of this paradigm. And finally, when I tried to explore the roots, Lisp macros really floored me with their succinctness and power. I do not have the courage to say that the functional abstractions of Lisp and Ruby are more powerful than OO abstractions. Steve Yegge has put the natural inhibition of OO programmers toward extended syntactic constructs so subtly:

Lots of programmers, maybe even most of them, are so irrationally afraid of new syntax that they'd rather leaf through hundreds of pages of similar-looking object-oriented calls than accept one new syntactic construct.

My personal take is to exploit all the features the language has to offer. With a language like Ruby, Scala or Lisp, syntax extensibility is the natural model, while Java offers powerful OO abstractions - look at the natural difference in paradigms between modeling a Ruby on Rails application and a Spring-Hibernate application. This is one of the great eye-openers that the new dynamic languages have brought to the forefront for OO programmers - beautiful abstractions are no longer a monopoly of OO languages. Lisp tried to force this realization long back, but possibly the world was not ready for it.

Monday, January 08, 2007

Why I should learn Lisp

At the beginning of 2006, I promised myself that I would learn Ruby and the tricks of the trade of functional programming. I do Java for a day job and get paid for consulting on enterprise Java architectures. I like Java, I love the Java community, and I am a big fan of some of the great, cool Java frameworks out there. I used to do C++ as well, five years back, and took great pride in designing allocators and smart pointers. All of this was part of the application codebase, and despite productive libraries like Boost, infrastructure code management (aka memory management and memory leaks) took away most of my night's sleep - the price of the geek feeling that I was doing C++. Java was the language that took away from me the pride of writing destructors and allocators. But in the course of this sense of loss, I realized that I was now programming at a higher level of abstraction, with the entire memory management left to the runtime. I got deeper into encapsulation and better object orientation, and embraced each successive release of Java with great enthusiasm.

One day, after reading a few pages of the pickaxe book and hunting the internet for Ruby evangelists, I came up with the following piece of Ruby code as an implementation of the Strategy design pattern:

class Filter
  def filter(values)
    new_list = []
    values.each { |v| filter_strategy(v, new_list) }
    new_list
  end
end

class EvenFilter < Filter
  def even?(i)
    i % 2 == 0
  end

  def filter_strategy(value, list)
    if even?(value)
      list << value
    end
  end
end

of = EvenFilter.new
array = [1, 2, 3, 4, 5]
puts of.filter(array)

On further introspection, more reading of the pickaxe book and more rummaging through the musings of Java bashers on LtU, the light of lambda dawned on me. It looked like I was going through the enlightenment of the functional programming paradigm - the enhanced expressivity and abstraction that higher-order procedures add to programs. I could appreciate the value of lexical closures, bottom-up programming and functional abstractions. The new class for the Strategy implementation is adapted from Nathan Murray's excellent presentation on higher-order procedures in Ruby:

class FilterNew
  def filter(strategy)
    lambda do |list|
      new_list = []
      list.each do |element|
        new_list << element if strategy.call(element)
      end
      new_list
    end
  end
end

of = FilterNew.new
filter_odds = of.filter(lambda { |i| i % 2 != 0 })
array = [1, 2, 3, 4, 5]
puts filter_odds.call(array)

The Disappearing Strategy Classes

In the new implementation, where is the strategy class that is supposed to be hooked polymorphically into the context to provide the flexible OO implementation?

It has disappeared into the powerful abstraction of the language. The method filter() in the second example does not return the newly created list, unlike the first one - it returns a procedure, which can act on other sets of data. The second example operates at a much higher level of abstraction, which adds to the expressivity of the intended functionality.
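Since the filter returns a procedure, the same filter can be reused on any number of data sets. A self-contained sketch of the idea (make_filter is my own illustrative name):

```ruby
# build a reusable filtering procedure from a strategy lambda
make_filter = lambda do |strategy|
  lambda { |list| list.select { |e| strategy.call(e) } }
end

filter_odds = make_filter.call(lambda { |i| i % 2 != 0 })
puts filter_odds.call([1, 2, 3, 4, 5]).inspect  # => [1, 3, 5]
puts filter_odds.call([10, 11, 12]).inspect     # => [11]
```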

In fact, with functional programming paradigms, many of the design patterns which the GOF carefully cataloged in the celebrated Design Patterns book simply go away in a language that allows the user to program at a higher level of abstraction. Have a look at this excellent presentation by Norvig.

As Paul rightly mentions in his post, the paradigms of functional programming hide a lot of accidental complexity, mainly because of the following traits which such languages offer:

  1. Higher level of abstraction, which leads to less code, and hence fewer bugs

  2. Side-effect free pure functional code, which liberates the programmer from managing state and sequence of execution

  3. Improved concurrency and scalability because of the stateless and side-effect-free programming model
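A tiny illustration of the first point - the same filtering written first with explicit state, then functionally:

```ruby
# imperative: mutable accumulator and explicit sequencing
evens = []
[1, 2, 3, 4, 5].each { |i| evens << i if i % 2 == 0 }

# functional: no user-managed state, and fewer lines
evens_fp = [1, 2, 3, 4, 5].select { |i| i % 2 == 0 }

puts(evens == evens_fp)  # => true
```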

Ruby or Lisp ?

People look upon Lisp as the language of the Gods; someone has called Ruby an acceptable Lisp, and many others consider Ruby to lie midway between Java and Lisp. Ruby is an object-oriented language with functional programming capabilities, while Lisp came into being in 1958 with John McCarthy's landmark 'eval' function. As Paul Graham says:
With macros, closures, and run-time typing, Lisp transcends object-oriented programming.

Lisp and Smalltalk were the main inspirations for Matz in designing the Ruby language. Maybe Ruby is more pragmatic than Lisp, but the roots of Ruby are definitely engrained in the concepts of macros, lexical closures and the extensibility mechanisms that Lisp provides. Lisp is the true embodiment of the "code-as-data" paradigm. Lispers claim that Lisp (or any of its dialects) is definitely more expressive than Ruby, and that Lisp macros can extend the language more seamlessly than Ruby blocks. I am not qualified enough to comment on this. But my one observation is that behind the nice Lispy DSL that Rails provides, the implementation looks really clumsy and possibly would have been much cleaner in Lisp.

Not only in Ruby - functional programming constructs are beginning to make their appearance in modern OO languages as well. C# and Visual Basic already offer lambdas and comprehensions, Java will have closures in the next release - the Lisp style is promising to come back.

Still, I do not think Lisp is going to become mainstream; yet I need to learn Lisp to be a better fit in today's world of changing programming paradigms.