Node-m-r – A Simple CQRS Example using Node.js

Anyone learning about DDD, CQRS and/or event sourcing has probably read the source code of Greg Young’s simple CQRS example at some point or another. This is one of the simplest examples possible demonstrating the practical side of CQRS and event sourcing, originally developed using .NET.

I made an attempt to rewrite the source code of this example using Node.js, trying to apply as many features of the core library as possible, particularly streams and event emitters. For example, the implementation of the base aggregate root object uses an event emitter internally for applying events in derived objects. An aggregate root is also a writable stream, which is used by the repository for replaying events. The report aggregators are also writable streams, which receive events from the message bus.

I’ve put the resulting source code up on GitHub. This is still very much a work in progress though. One improvement that I’m considering is spawning a separate child process for both the command handlers and the report aggregators. No data is persisted at the moment, so everything is stored in memory; adding physical data stores is something that I’m also looking into. There’s no UI available either. You can run the example by executing the following command:

$ node src/application.js

Have a look at the source code and let me know what you think.

Until next time.

Either.js

Some time ago, I was watching this excellent video course by Neal Ford titled “Functional Thinking – Functional Programming using Java, Clojure and Scala”. In one of the modules on functional data structures, Neal talks about this container type in Scala named Either.

The Either type in Scala represents one of two possible values, a Left value or a Right value. By convention, the Left value is used for error or exception objects, while the Right value is used for normal values. Why is this useful?

In many programming languages, like Java, C#, JavaScript, Scala, etc., there are generally two ways a function or method call can return something. The first is through a regular return value, be it some primitive type or some object. The second is by throwing an exception, which means that the return value of the function/method in question is not available to the calling code. In order to deal with this dual behavior, the calling code has to verify/use the value returned when invoking a function or method as well as deal with possible exceptions that this function/method might throw.
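As a quick illustration of that dual behavior (parseQuantity is just a made-up example function):

```javascript
// A function can communicate its outcome through two separate channels:
function parseQuantity(text) {
    var quantity = parseInt(text, 10);
    if (isNaN(quantity))
        throw new Error('Not a number: ' + text); // channel two: an exception

    return quantity; // channel one: a regular return value
}

// The calling code has to deal with both channels separately.
try {
    var quantity = parseQuantity('42');
    console.log('Got ' + quantity);
} catch (error) {
    console.log('Failed: ' + error.message);
}
```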

This is where the Either type comes in, unifying both behaviors by offering a Left value containing a possible error/exception object or a Right value containing the regular output of a function/method. In his course, Neal further elaborates on this concept by implementing a similar Either type in Java using interfaces, because Java has no support for lambdas yet (ouch).

While working on some JavaScript code a while back, I ran into this situation where the code in question could really benefit by using the concept of an Either object. So I decided to implement such an object in JavaScript and extracted the result into a Node.js module named either.js that I recently published on npm.
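In essence, such an Either type boils down to very little code. Here's a minimal sketch of the concept; the published either.js module may well differ in its details:

```javascript
// Left wraps an error, Right wraps a regular value.
// fold() invokes exactly one of the two provided callbacks.
function Left(value) { this._value = value; }
Left.prototype.fold = function(onLeft, onRight) { return onLeft(this._value); };

function Right(value) { this._value = value; }
Right.prototype.fold = function(onLeft, onRight) { return onRight(this._value); };

var either = {
    left: function(value) { return new Left(value); },
    right: function(value) { return new Right(value); }
};
```

With fold as the only way in, the calling code is forced to handle both the error case and the success case explicitly.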

Take a look at the following code sample:

_this.insertReport = function(dataArea, inventoryReport, callback) {
    simulateAsynchronousIO(function() {
        var reportsCollection = _dataAreas[dataArea];
        if(!reportsCollection) {
            var error = new InvalidDataAreaError('The specified area is unknown.');
            callback(error);
            return;
        }

        reportsCollection.push(inventoryReport);
        callback();
    });
};


There were a couple of functions like this one where an array was retrieved from an object named _dataAreas, which contains a number of array properties. In case the specified property name could not be found on the containing object, the callback was invoked with an error object. Using an Either object, this code got revamped into something like this:


_this.insertReport = function(dataArea, inventoryReport, callback) {
    simulateAsynchronousIO(function() {
        getReportsCollectionFor(dataArea).fold(
            function left(error) { 
                callback(error); 
            },
            function right(reportsCollection) {
                reportsCollection.push(inventoryReport);
                callback();        
            }
        );
    });
};

function getReportsCollectionFor(dataArea) {
    var reportsCollection = _dataAreas[dataArea];

    if(reportsCollection)
        return either.right(reportsCollection);
    else
        return either.left(new InvalidDataAreaError('The specified area is unknown.'));
}


I admit that this definitely takes some getting used to at first, but I find this approach to be very clean and useful. In fact, it kind of reminds me of the beauty of using promises, but for synchronous code instead.

Until next time.

Detecting the End of a Rainbow Inside a Writable Stream

I was implementing a custom writable stream in Node.js the other day when I ran into this issue where I wanted to know whether more data was coming or whether we were actually done writing stuff. Looking at the recently revised API for stream implementers, the only thing that gets mentioned in the docs is the _write method.

After some intensive DuckDuckGoing (yes, this might become an actual word in the future), I ran into this thread on the Node.js user group and this issue on GitHub. In short, it is possible and even advisable to listen to the ‘finish’ event inside a Writable stream. Apparently, this happens all the time in the Node.js library itself. This means that by catching the ‘finish’ event, I was able to implement the flush functionality that I was looking for.

So without further ado, here’s a simple example of a custom Readable and Writable stream that are piped together. The Writable stream listens to the ‘finish’ event in order to flush stuff down the drain.

var stream = require('stream'),
    util = require('util');

//
// Reading stuff
//

var ReadingStuff = function() {
    stream.Readable.call(this);

    this._data = [1, 2, 3, 4, 5];
};

util.inherits(ReadingStuff, stream.Readable);

ReadingStuff.prototype._read = function() {
    if(0 === this._data.length) {
        this.push(null);
        return;
    }

    this.push(this._data[0].toString());
    this._data.shift();
};


//
// Writing stuff
//

var WritingStuff = function() {
    stream.Writable.call(this);

    this.on('finish', function() {
        console.log('Finished writing stuff!!');
    });
};

util.inherits(WritingStuff, stream.Writable);

WritingStuff.prototype._write = function(chunk, encoding, next) {
    // chunk arrives here as a Buffer (encoding is 'buffer' in that case),
    // so we call toString() without passing the encoding along
    console.log(chunk.toString());
    next();
};

//
// Application
//

var readingStuff = new ReadingStuff();
var writingStuff = new WritingStuff();

readingStuff.pipe(writingStuff);


Notice that inside the _read method of our custom Readable stream, we call this.push(null) in order to signal the end of the data.

Hope this helps. 

Taking Toddler Steps with Node.js – The Towering Inferno Revisited

Soon after I started using Node.js, I ran into the phenomenon of multiple nested callbacks that create some kind of horizontal tower effect. The solution I came up with in order to improve the readability of my code was using a library called step, as described in this blog post that I wrote at that time.

Over the past years I switched over to a couple of other control flow libraries that solve the same problem as step, but eventually I settled on using the async library.

Let’s look back at the problem I used in my original blog post:

[Image: the original deeply nested callback version of this code]

Here’s the slightly refactored equivalent using async:

var http = require('http'),
    path = require('path'),
    fileSystem = require('fs'),
    async = require('async');

http.createServer(function(request, response) {
    async.waterfall([
        assembleFilePath,
        readFavoritePodcastsFromFile,
        addNewFavoritePodcastToFile
        ], 
        function(error, favoritePodcasts) {
            if(error)
                return response.end(error.toString());

            response.writeHead(200, {
                'Content-Type': 'text/html', 
                'Content-Length': Buffer.byteLength(favoritePodcasts)
            });

            response.end(favoritePodcasts); 
        }
    );
})
.listen(2000);

function assembleFilePath(callback) {
    var filePath = path.join(__dirname, 'podcasts.txt');
    callback(null, filePath);
}

function readFavoritePodcastsFromFile(podcastsFilePath, callback) {
    fileSystem.readFile(podcastsFilePath, 'utf8', function(error, data) {
        if(error)
            return callback(error);

        callback(null, podcastsFilePath, data);
    });                     
}

function addNewFavoritePodcastToFile(podcastsFilePath, favoritePodcastData, callback) {
    var favoritePodcasts = favoritePodcastData;

    if(-1 == favoritePodcasts.indexOf('Astronomy Podcast')) {
        favoritePodcasts = favoritePodcasts + '\n' + 'Astronomy Podcast';       
        fileSystem.writeFile(podcastsFilePath, favoritePodcasts, function(error) {
            if(error)
                return callback(error);

            callback(null, favoritePodcasts);
        });                     
    }
    else {
        process.nextTick(function() {
            callback(null, favoritePodcasts);
        });     
    }
}


Here I’ve used the waterfall method of the async library in order to pass results from one function to the next. Other functions that I often use are series and parallel. Notice that in the addNewFavoritePodcastToFile function I used process.nextTick instead of just invoking the callback. This is done in order to prevent inconsistent behavior of the function. I also wrote about this in the past.
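Conceptually, waterfall does nothing more than feed the results of each task into the next one, bailing out on the first error. A stripped-down version (not the actual async internals) could look like this:

```javascript
// Each task receives the results of the previous task plus a callback;
// the first error short-circuits straight to the final callback.
function waterfall(tasks, finalCallback) {
    var index = 0;

    function next(error) {
        if (error || index === tasks.length)
            return finalCallback.apply(null, arguments);

        // Drop the error argument, keep the results, append the callback.
        var args = Array.prototype.slice.call(arguments, 1);
        args.push(next);
        tasks[index++].apply(null, args);
    }

    next(null);
}
```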

There has been a lot of buzz lately around promises, so I decided to drink some of this kool-aid. Basically, we can achieve the same kind of solution as with the async library.

var http = require('http'),
    path = require('path'),
    fileSystem = require('fs'),
    Q = require('q');

http.createServer(function(request, response) {

    assembleFilePath()
    .then(readFavoritePodcastsFromFile)
    .then(addNewFavoritePodcastToFile)
    .then(function(favoritePodcasts) {
        response.writeHead(200, {
            'Content-Type': 'text/html', 
            'Content-Length': Buffer.byteLength(favoritePodcasts)
        });

        response.end(favoritePodcasts); 
    })
    .fail(function(error) {
        response.end(error.toString());
    })
    .done();
})
.listen(2000);

function assembleFilePath() {
    return Q.fcall(function() {
        return path.join(__dirname, 'podcasts.txt');
    });
}

function readFavoritePodcastsFromFile(podcastsFilePath) {
    var deferred = Q.defer();

    fileSystem.readFile(podcastsFilePath, 'utf8', function(error, favoritePodcasts) {
        if(error)
            return deferred.reject(new Error(error));

        deferred.resolve({
            favoritePodcasts: favoritePodcasts,
            podcastsFilePath: podcastsFilePath
        });
    });

    return deferred.promise;
}

function addNewFavoritePodcastToFile(data) {
    var deferred = Q.defer(),
    favoritePodcasts = data.favoritePodcasts;

    if(-1 == favoritePodcasts.indexOf('Astronomy Podcast')) {
        favoritePodcasts = favoritePodcasts + '\n' + 'Astronomy Podcast';       
        fileSystem.writeFile(data.podcastsFilePath, favoritePodcasts, 
        function(error) {
            if(error)
                return deferred.reject(new Error(error));

            deferred.resolve(favoritePodcasts);
        });                     
    }
    else {
        process.nextTick(function() {
            deferred.resolve(favoritePodcasts);
        });
    }

    return deferred.promise;
}


I’ve used the Q library for this code sample. For an excellent introduction to promises and the Q library, check out this great article on the StrongLoop blog. I think the approach using promises looks, uhm … promising as well.

Are you, dear reader, using a control flow library, which one and why?

Until next time.

Introducing node-validation

Some time ago I was looking for a validation library/module for use in a small Express application that I was writing at the time. I couldn’t find anything that suited my taste so I decided to write one myself just for kicks. The goal was learning how to publish a module to npm and making a futile attempt to contribute something back to the vibrant Node.js community. node-validation is a minimal but slightly opinionated validation library for Node.js.

Installing node-validation can be done using the canonical package manager:

$ npm install node-validation

Validation rules must be defined in a custom validator by deriving from the base Validator.

// assuming node-validation exposes its base Validator like this
var util = require('util'),
    Validator = require('node-validation').Validator;

var MyObjectValidator = function() {
    Validator.call(this);

    this.ruleFor('stringProperty').isNotEmpty();
    this.ruleFor('otherStringProperty').hasMaximumLength(10);

    this.ruleFor('numericStringProperty').isNumber()
        .withMessage('Oops, something is wrong ...');
    this.ruleFor('dateStringProperty')
        .matches(/^(19|20)\d\d[-](0[1-9]|1[012])[-](0[1-9]|[12][0-9]|3[01])$/);

    this.ruleFor('numberProperty').isInteger();
    this.ruleFor('otherNumberProperty').isMaximum(5);

    this.ruleFor('exoticProperty').is(function(value) {
        return 3 === value.propertyA + value.propertyB;
    }).withMessage('Either propertyA or propertyB has an incorrect value.');
};

util.inherits(MyObjectValidator, Validator);


After creating a validator object, an object that needs to be validated (the subject) can be passed to the validate method. The validate method returns an array of validation errors specifying a message and the name of the violating property.


//
// Validation subject
//
var subject = {
    stringProperty: '',
    otherStringProperty: 'Some string value that is too long ...',

    numericStringProperty: '65.85 invalid',
    dateStringProperty: '2013-04-30 invalid',

    numberProperty: 'Some invalid number',
    otherNumberProperty: 48,

    exoticProperty: {
        propertyA: 1,
        propertyB: 1
    }
};

//
// Now it's time to validate
//
var validator = new MyObjectValidator();
var validationErrors = validator.validate(subject);

for(var i=0; i < validationErrors.length; i++) {
    console.log('Property name: ' + validationErrors[i].propertyName 
                + ', Message: ' + validationErrors[i].message);
}


There you go. Head over to the GitHub repository and give it a try. I’m definitely looking forward to hearing your feedback.

Taking Toddler Steps with Node.js – Express Routing Revisited

Last year I wrote this blog post where I described a couple of ways to tackle routing with Express. In the meantime I moved on from the “Plain Old School” approach to an approach where I replaced underscore.js with node-require-directory.

Setting up node-require-directory is quite easy. In the routes folder, we just need to add an index.js module with the following two lines:

var requireDirectory = require('require-directory');
module.exports = requireDirectory(module);

Setting up the routes for Express then looks like this:

var routes = require('./../routes');

// Setting up an application ...

application.get('/', routes.root);
application.get('/home', routes.home);
application.get('/signin', routes.authentication.signin);
application.post('/signout', routes.authentication.signout);

// More route registrations

Here we simply reference the index.js module. The node-require-directory module takes care of building up a tree of functions which we can then access for our route registrations. Adding a new route is as simple as creating a new module somewhere inside the routes folder or one of its subfolders and creating a new route registration. Have a look at this example.
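For example, a signin module inside the routes folder might look like this (a hypothetical file; the exact layout depends on your application):

```javascript
// routes/authentication/signin.js (hypothetical example)
// Because require-directory mirrors the folder structure, this function
// ends up being reachable as routes.authentication.signin.
var signin = function(request, response) {
    response.render('signin', { title: 'Sign in' });
};

module.exports = signin;
```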

I found this little gem to be quite useful and it might be helpful for some of you as well.

Until next time.

Taking Toddler Steps with Node.js – Passport

Recently I added Twitter authentication to TrackMyRun using a library called Passport. I was pretty impressed by how smoothly this all went, as I had completely neglected all security concerns from the get-go, which is definitely not recommended by the way. For this post I’ll walk you through the process of setting up Passport for Express using Twitter OAuth authentication.

Passport is the core library, which provides the general authentication infrastructure. Instead of being one single monolithic library, Passport uses pluggable strategies that handle authentication directly with specific OpenID/OAuth providers.

So in order to get up and running, we need to install passport as well as passport-twitter for Twitter OAuth authentication. After we install these modules using npm, we can start by configuring the Twitter strategy.

var express = require('express'),
    passport = require('passport'),
    TwitterStrategy = require('passport-twitter').Strategy;

var users = [];

passport.use(new TwitterStrategy({
        consumerKey: 'twitter-app-consumer-key',
        consumerSecret: 'twitter-app-consumer-secret',
        callbackURL: "http://test.passport-twitter.com:3000/auth/twitter/callback"
    },
    function(token, tokenSecret, profile, done) {
        var user = users[profile.id] || 
                   (users[profile.id] = { id: profile.id, name: profile.username });
        done(null, user);
    }
));


The strategy must be configured by providing the consumer key and consumer secret as well as the callback URL. I’m not going to go too much in depth on how OAuth works. Make sure to check out the Twitter for developers website on how to configure an application that uses the Twitter API.

Besides adding the strategy for Twitter, we also specified a callback function. In this callback, we’re supposed to find and verify a user that matches a specified set of credentials. Usually we have some code here that checks to see if the specified user exists in a database of some sort. In order not to clutter this example, I used a simple array here instead.

If we can find the requested user in our data store, we need to invoke done() to supply Passport with the user.

done(null, user);

When the user cannot be found, we can simply pass false instead of a user object.

done(null, false);
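Putting those two outcomes together, a more database-like verify callback might look like this; findUserByTwitterId is a made-up stand-in for a real data store lookup:

```javascript
// Hypothetical verify callback backed by a lookup function.
function verifyUser(token, tokenSecret, profile, done) {
    findUserByTwitterId(profile.id, function(error, user) {
        if (error)
            return done(error);       // the lookup itself failed

        if (!user)
            return done(null, false); // no matching user: authentication fails

        done(null, user);             // success: hand the user to Passport
    });
}

// Stand-in for a real database query.
function findUserByTwitterId(id, callback) {
    var knownUsers = { '12345': { id: '12345', name: 'alice' } };

    process.nextTick(function() {
        callback(null, knownUsers[id] || null);
    });
}
```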

In our example we always ensure that the specified credentials match a particular user object. Next we need to configure the Passport middleware for initialization and session management.

application.configure(function() {
    application.use(express.bodyParser());
    application.use(express.methodOverride());
    application.use(express.cookieParser());
    application.use(express.session( { secret: '498f99f3bbee4ae3a075eada02488464' } ));
    application.use(passport.initialize());
    application.use(passport.session());
    application.use(application.router);
    application.use(express.errorHandler({ showStack: true, dumpExceptions: true }));
    application.set('view engine', 'jade');
});

Please note that the express.session() middleware needs to be called before passport.session(). Next we add the routes necessary for authenticating requests and handling the token callback.

application.get('/auth/twitter', passport.authenticate('twitter'));

application.get('/auth/twitter/callback', 
    passport.authenticate('twitter', 
        { successRedirect: '/', 
          failureRedirect: '/auth/twitter' }));


Last but not least, we also need to declare serializeUser and deserializeUser callback functions. These are necessary for supporting login sessions.

passport.serializeUser(function(user, done) {
    done(null, user.id);
});

passport.deserializeUser(function(id, done) {
    var user = users[id];
    done(null, user);
});

Instead of reading the requested user objects from the data store, we simply use the array that we incorporated earlier.

That’s basically it. We can add other authentication providers by simply configuring more strategies. Have a look at the full source code of this example and try to get it up and running.

Until next time.

Taking Toddler Steps with Node.js – Express Error Handling

In the previous post I wrote about my personal routing flavor for Express. For this post, I want to briefly discuss how to set up error handling using Express.

In order to get up and going very quickly, we only need to add the errorHandler middleware provided by Connect.

application.use(express.errorHandler({ showStack: true, dumpExceptions: true }));

Here we configured the errorHandler middleware to report on exceptions and show the stack trace as well. This is quite handy during development as this setup provides us with enough detail.

[Image: the default Express error page showing the exception details and stack trace]

But this is not very effective when we want to move our application to a production environment. When deployed into production, we usually want to show a user-friendly message instead of technical details, stack traces and what not. In this case we can use the application.error() method. This function receives all errors thrown by the regular route functions or errors passed to the next() function. In this catch-all-errors function we can simply render our own custom view.

application.error(function(error, request, response, next) {
    response.render('500', {
        status: 500,
        error: util.inspect(error),
        showDetails: application.settings.showErrorDetails
    });
});

This is what our custom error page looks like:

[Image: the custom 500 error page]

We can also use the application.error() function for rendering custom pages for all kinds of specific errors. Suppose we want to render a custom page for ‘404 – Page Not Found’ errors. Quite easy. We just need to register a catch-all route after all the regular route functions that simply passes a custom error to next().

function PageNotFoundError(message){
  this.name = 'PageNotFoundError';
  Error.call(this, message);
  Error.captureStackTrace(this, PageNotFoundError);
}

PageNotFoundError.prototype.__proto__ = Error.prototype;

application.use(function(request, response, next) {
    next(new PageNotFoundError());
});

Next we need to enhance the application.error() function so that it appropriately handles our PageNotFoundError.

application.error(function(error, request, response, next) {
    if (error instanceof PageNotFoundError) {
        response.render('404', {
            status: 404
        });
    } 
    else {
        response.render('500', {
            status: 500,
            error: util.inspect(error),
            showDetails: application.settings.showErrorDetails
        });
    }
});

As you can see, error handling is quite easy to setup for different environments using Express.

Until next time.