Concept: High Performance web server w/ Promises


My intended purpose is to provide a higher level of throughput: to service more requests per second through an async API. The time to service each individual request is expected to increase, but that is acceptable if we handle more requests per second, or, failing that, at least handle more requests before problems begin to occur.

Concept Overview - How?

The high level idea is an HTTP server running on Express + Node. From there, any incoming request is wrapped in a promise, and that promise resolves by responding to the request. Of course, if other async processing is needed while handling the request, additional then-ables can be chained.

For example, this:

/* GET home page. */
router.get('/', function(req, res, next) {
    debug("GET /");
    res.render('site/index', { title: 'The Open Source U' });
});


/* GET home page. */
router.get('/', function(req, res, next) {
  new Promise(function(resolve) {
    debug("GET /");
    resolve({ title: 'The Open Source U' }); //calculation or object
  })
  .then(function(payload) {
    res.render('site/index', payload);
  });
});

This is done regardless of need. I intentionally use a promise even though there is no async operation; that accurately represents the idea.

Result: Ineffective

I quickly built a proof of concept... which showed that this doesn't help. In fact, it made performance worse. In considering why, I suspect the cost of servicing the promise (memory and processing) exceeds the cost of simply responding to the HTTP request.
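To make the suspected overhead concrete, a rough micro-benchmark (not the original POC; the names and iteration count here are my own) might compare returning a value directly against resolving the same value through a promise:

```javascript
// Micro-benchmark sketch: return a plain object directly vs. resolving
// the same object through a Promise. Iteration count is arbitrary.
const ITERATIONS = 100000;

function direct() {
  return { title: 'The Open Source U' };
}

function promised() {
  return new Promise(function (resolve) {
    resolve({ title: 'The Open Source U' });
  });
}

async function run() {
  let start = process.hrtime.bigint();
  for (let i = 0; i < ITERATIONS; i++) direct();
  const directNs = process.hrtime.bigint() - start;

  start = process.hrtime.bigint();
  for (let i = 0; i < ITERATIONS; i++) await promised();
  const promisedNs = process.hrtime.bigint() - start;

  console.log('direct   (ns):', Number(directNs));
  console.log('promised (ns):', Number(promisedNs));
}

run();
```

The promise path allocates an object and schedules a microtask per call, which is the kind of per-request cost the POC would have been paying on every route.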

This is not to say a promise should be avoided in the HTTP request/response cycle; rather, it should be used where async work actually occurs. Regardless, we've learned something here, and I thought it was worth sharing.

Success is not final, failure is not fatal: it is the courage to continue that counts. -- Winston Churchill

I am disappointed this didn't have even a neutral effect on performance, let alone a positive one, but it is what it is.

Note: The POC was served through Node + Express directly, rather than through a reverse proxy as most production setups would use.

Update (new article coming)

Because of a comment, I've revisited this. The commenter pointed out a mistake in the code I posted. That code was an example and not taken 1:1 from the original test code (which I ran from a Linode server).

I've posted my test code here:

I should have done this in the first place.