And now, two other things are happening. LuaJIT has this thing called DynASM. This Dynamic Assembler quickly turns what looks like embedded assembly instructions into actual machine instructions at ‘runtime’. This is kind of different from what nasm does. Nasm is an assembler proper: it takes assembly instructions and turns them into machine-specific code, as part of a typical ‘compile/link/run’ chain. DynASM instead generates a function in memory, which you can then call directly while your program is running.
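To make that concrete, here is a minimal sketch of what runtime code generation boils down to, in Python rather than DynASM itself: write machine instructions into executable memory, then call that memory as a function. It assumes a Unix-like OS on x86-64; the byte string is the encoding of `mov eax, 42 ; ret`, and everything about it is illustrative, not how DynASM is actually implemented.

```python
import ctypes
import mmap
import platform

CODE = b"\xb8\x2a\x00\x00\x00\xc3"  # x86-64: mov eax, 42 ; ret

# Allocate one page that is readable, writable, and executable.
# (This is the Unix signature of mmap; Windows takes different arguments.)
buf = mmap.mmap(-1, mmap.PAGESIZE,
                prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
buf.write(CODE)

# Treat the start of the mapping as a C function returning int.
fn_type = ctypes.CFUNCTYPE(ctypes.c_int)
f = fn_type(ctypes.addressof(ctypes.c_char.from_buffer(buf)))

# Only actually jump into the generated code on the architecture it targets.
if platform.machine() in ("x86_64", "AMD64"):
    print(f())  # prints 42
```

No compile/link/run chain anywhere: the "assembler" output lands in memory and is a callable function a few lines later, which is the whole trick.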
This concept of dynamic machine code generation seems to be a spreading trend, and all JIT runtimes do it. I just came across another tool that helps you embed such a JIT into your C++ code. AsmJIT does something similar to what LuaJIT’s DynASM does.
These of course are not unique, and I’m sure there are countless projects that can be pointed to that do something somewhat similar. And that’s kind of the point. This dynamic code generation and execution thing is rapidly leaving the p-code phase, and entering the direct machine execution phase, which is making dynamic languages all the more usable and performant.
So, what’s next?
Well, that got me to thinking. If really fast code can be delivered and executed at runtime, what kinds of problems can be solved? Remote code execution is nothing new. There are always challenges with marshaling, versioning, different architectures, security, and the like. Some of the problems that exist are due to the typically static nature of the code that is being executed on both ends. Might things change if both ends are more dynamic?
Take the case of TLS/SSL. There are all these certificate authorities, a system that is inherently fragile and error-prone. Then there’s the negotiation of the highest-common-denominator parameters for the exchange of data. Well, what if this whole mess were handed over to a dynamic piece? Rather than negotiating the specifics of the encryption mechanism, the two parties could simply negotiate, and possibly transfer, a chunk of code to be executed.
How can that work? The client connects to the server, using some mechanism to identify itself (possibly anonymous, possibly handled higher up in the stack). The server then sends a bit of code that the client will use to pass through every chunk of data headed to the server. Since the client has DynASM embedded, it can compile that code and continue operating. Whoever wrote the client doesn’t know anything about the particulars of communicating with the server. They didn’t mess up the cryptography, and they didn’t have to keep up to date with the latest Heartbleed. The server can change and customize the exchange however it sees fit.
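A toy sketch of that flow, with no real networking, compilation to machine code, or cryptography: the “server” hands over a piece of source text, and the client turns it at runtime into the function it passes all outgoing data through. The XOR “cipher” is a deliberately silly stand-in for whatever the server actually chooses; every name here is hypothetical.

```python
# Source text the server might send over the wire. The client never
# needs to know what transform the server picked this week.
SERVER_SENT_CODE = """
def transform(data, key=0x5A):
    # Toy reversible transform chosen by the server; NOT encryption.
    return bytes(b ^ key for b in data)
"""

def load_transform(source):
    # Compile the received source at runtime and pull out the function.
    namespace = {}
    exec(compile(source, "<server>", "exec"), namespace)
    return namespace["transform"]

transform = load_transform(SERVER_SENT_CODE)
wire_bytes = transform(b"hello server")
print(transform(wire_bytes))  # XOR twice round-trips: b'hello server'
```

The point is the shape of the exchange: the concrete transform lives on the server, and the client only carries the machinery to compile and run whatever it receives.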
The worst-case scenario is that the parties cannot agree on anything interesting, so they fall back to using plain old TLS. This seems useful to me. A lot of code that has a high probability of being done wrong is eliminated from the equation. If certificate authorities are desired, they can be used. If something more interesting is desired, it can easily be encoded and shared. If things need to change instantly, it’s just a change on the server side, and move along.
Of course, each side needs to provide an appropriate sandbox so the code can’t just execute something arbitrary. Each side also needs to provide some primitives, like the ability to grab certificates and access to crypto libraries, if needed.
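A rough sketch of that sandboxing idea: received code runs with no builtins and only a short, explicitly granted list of primitives (here, just `hashlib`’s sha256). This is illustrative only. `exec`-based sandboxes in Python are famously leaky, so a real system would need a genuinely isolated runtime; all the names are made up for the example.

```python
import hashlib

# The host decides exactly which primitives the received code may touch.
ALLOWED_PRIMITIVES = {
    "__builtins__": {},        # no open(), no __import__, etc.
    "sha256": hashlib.sha256,  # one crypto primitive, explicitly granted
    "bytes": bytes,
    "len": len,
}

RECEIVED_CODE = """
def digest(data):
    return sha256(data).hexdigest()
"""

namespace = dict(ALLOWED_PRIMITIVES)
exec(RECEIVED_CODE, namespace)
digest = namespace["digest"]
print(digest(b"abc"))  # the standard SHA-256 digest of b'abc'
```

The received code can hash, and nothing else; anything it needs beyond the granted primitives simply isn’t there to call.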
If the server wants to use a non-centralized form of identity, it can just code that up, and be on its way. The potential is high for extremely dynamic communications, as well as mischief.
And what else? Well, I guess just about anything that can benefit from being dynamic. Learning new gestures, voice recognition, image recognition, learning to walk, learning new algorithms for searching, sorting, filtering, etc. Just about anything.
Following this line of reasoning, I’d expect my various machines to start talking with each other using protocols of their own making, changing dynamically to fit whatever situation they encounter. The communications algorithms go meta. We need algorithms to create algorithms. Threats and intrusions are perceived, and dealt with, dynamically. No waiting for a rev of the OS, no centrally distributed patches, no worrying about incompatible versions of this, that, and the other thing. The machines, and their communications, become individual and dynamic.
This could be interesting.