Luica Obscura
Posted: May 16, 2012
But, whenever I say the word “Lua” to any of my programmer friends, I get anything from quizzical “tell me more” looks to the raised eyebrows of “he’s gone off the deep end”. I think this is driven primarily by the relative obscurity of the Lua language. It isn’t well known for driving web development; it’s known primarily for its usage in game development environments, particularly World of Warcraft.
But, that’s not satisfying. Why would I switch languages just to get the same performance? Well, the problem with C is the strength of C. Pointers are the biggest pain, right there with memory management. You have all the power to bend the machine to your will, but with that awesome power comes awesome responsibility, as well as buffer overflows, dangling pointers, unbalanced malloc/free, and the like. Lua eliminates most of these concerns because it has a garbage collector and does not natively support pointers. LuaJIT takes that Lua core and adds a bit of danger for those who just MUST have their pointers: it brings pointers back, with all that implies.
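To make that concrete, here is a minimal sketch of LuaJIT’s ffi library, which is how pointers come back. This assumes LuaJIT (plain Lua has no ffi module); the struct and variable names are my own illustration, not from the post.

```lua
-- Requires LuaJIT; the ffi module reintroduces C memory layout and pointers.
local ffi = require("ffi")

ffi.cdef[[
typedef struct { int x; int y; } point_t;
]]

-- Allocate a C struct. The GC still tracks the allocation itself,
-- but the contents are raw C memory.
local p = ffi.new("point_t")
p.x, p.y = 3, 4

-- A cast yields a genuine pointer, with all the danger that implies:
-- nothing stops you from indexing past the end of the struct.
local ip = ffi.cast("int *", p)
print(ip[0], ip[1])  -- 3  4
```

The garbage collector and the pointer world coexist: `p` is collected like any Lua value, while `ip` is an unchecked C pointer into the same bytes.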
How to explain this really… LuaJIT is a powerful string processor. As such, it is more powerful than RegEx, more compact, and more expressive. Using LuaJIT, I can easily handle things like HTTP parsing in a fraction of the space required to do so by any other means, including the various libraries that come with myriad other environments. Since LuaJIT is a runtime compiler, my code is as fast as if I had written it natively in C. And since I use LuaJIT as a string handler, and Lua code itself is just a string, I can easily update my string handling capabilities without having to take down, or re-deploy, the core host environment that’s running the string handler.
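As a taste of the kind of string handling meant here, this is a sketch of parsing an HTTP request line using Lua’s built-in pattern matching, with no regex library at all. The function name is mine, for illustration.

```lua
-- Parse an HTTP request line like "GET /index.html HTTP/1.1"
-- using Lua string patterns: %u+ = method, %S+ = path, %d%.%d = version.
local function parse_request_line(line)
  local method, path, version =
    line:match("^(%u+)%s+(%S+)%s+HTTP/(%d%.%d)")
  return method, path, version
end

local m, p, v = parse_request_line("GET /index.html HTTP/1.1")
print(m, p, v)  -- GET  /index.html  1.1
```

One pattern does the work that would otherwise take a handful of library calls, which is the compactness being claimed.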
Well, isn’t that a funny thought? And thus, dealing with HTTP, at first I used http_parser and LuaJIT interop to call that library. But, after some wrangling, I have been able to replicate that library entirely in Lua code, so there is no library to deploy. At about 40K of code, this is a pretty good thing. What does it mean? Let’s say I have a generic network node that’s just kind of sitting out there in the world. That node has the simple ability to listen on a port and feed whatever comes in to a routine that will deal with it. By default this simple node does nothing but throw the received packets away and carry on receiving. It’s rather a black hole. But now I want to give it the power to deal with HTTP requests. I can send it an HTTP parser, as Lua script. Now, when it receives a chunk, that chunk is handed to the HTTP parser code, which then executes. By default, still not doing anything but parsing the HTTP, and having knowledge of various requests. Then I send it a bit of code to deal with file retrieve requests (a simple static web page service).
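Continuing that sketch, here is a toy version of header parsing in pure Lua, nothing like the full pure-Lua http_parser described above, but enough to show how little machinery the language needs for it:

```lua
-- Collect "Name: value" headers from a raw request chunk into a table.
-- [%w%-]+ matches a header name (letters, digits, hyphens);
-- [^\r\n]+ grabs the value up to the end of the line.
local function parse_headers(raw)
  local headers = {}
  for name, value in raw:gmatch("([%w%-]+):%s*([^\r\n]+)") do
    headers[name:lower()] = value
  end
  return headers
end

local h = parse_headers("Host: example.com\r\nContent-Length: 42\r\n\r\n")
print(h.host, h["content-length"])  -- example.com  42
```

Because this is plain Lua source, it is exactly the kind of handler a node could receive over the wire and start running.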
Great: starting from small bits of code, building up to more robust services.
Let’s say I want a node to take on the face of a REST service that is serving something up. Same process: send it a bit of code to deal with HTTP, then some to deal with the REST service interface. Oops, then I find a mistake in the REST handling. I just send a bit of code replacing the bit sent before. When it’s not being used, it is simply replaced, re-compiled on the fly, and the node is none the wiser.
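The hot-swap idea above can be sketched in a few lines: handlers live in a table, and a node replaces one by compiling a newly received chunk of Lua source. The `handlers` table and `install` function are my own illustration, assuming standard `load`/`loadstring` for compiling a string of code.

```lua
-- loadstring is the Lua 5.1/LuaJIT name; load accepts strings in 5.2+.
local compile = loadstring or load

local handlers = {}

-- Compile a received chunk of Lua source and swap it in as the handler.
local function install(name, source)
  local chunk, err = compile(source)
  assert(chunk, err)
  handlers[name] = chunk()  -- the chunk returns the new handler function
end

install("rest", [[ return function(req) return "v1: " .. req end ]])
print(handlers.rest("GET /items"))  -- v1: GET /items

-- A fix arrives over the wire: install it right over the old version.
install("rest", [[ return function(req) return "v2: " .. req end ]])
print(handlers.rest("GET /items"))  -- v2: GET /items
```

The host process never stops; only the table entry changes, which is all the “re-compiled on the fly” step requires.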
Assuming I had about 100 such nodes distributed around the world, how could I do something like A/B testing, or rather “testing in production”? Well, send the new bit of code out to as many nodes as you want, see how it goes, gather data, and if all is well, tell the rest of the nodes to pick up the new code. No need to take servers down, or suffer the various other bad things that occur when you have to deploy fully compiled binaries that have been tested in-house.