Dynamic Setting of Uniform Shader Values

I’m all about the lazy, and avoiding typing when I program. If I had my way, I would just think my program into existence. But alas, I still have to type.

But now I can type a lot less error-prone code when it comes to dealing with OpenGL, and in particular with shaders. What’s a ‘shader’? You know, those pesky little programs you have to write to get anything done on the GPU these days.

Here’s the entirety of the code for the Mandelbrot zooming thing, at least the part that runs on the GPU. It’s a ‘fragment shader’, and pretty rudimentary at that:

local fragtext = [[
uniform sampler1D tex;
uniform vec2 center;
uniform float scale;
uniform int iter;

void main() {
    vec2 z, c;

    c.x = 1.3333 * (gl_TexCoord[0].x - 0.5) * scale - center.x;
    c.y = (gl_TexCoord[0].y - 0.5) * scale - center.y;

    int i;
    z = c;
    for(i=0; i<iter; i++) {
        float x = (z.x * z.x - z.y * z.y) + c.x;
        float y = (z.y * z.x + z.x * z.y) + c.y;

        if((x * x + y * y) > 4.0) break;
        z.x = x;
        z.y = y;
    }

    gl_FragColor = texture1D(tex, (i == iter ? 0.0 : float(i)) / 100.0);
}
]]

Hey! Wait a minute. Isn’t that “C” code? Well, not quite. Notice a couple of things. First of all, the shader code is encapsulated in a Lua string (the [[ ]]). When you look at the code itself, you see the word “uniform”. Those are variables that can be set from the outside world. In the case of the zooming Mandelbrot explorer from the video, I’m changing things like the ‘iter’(ations), the scale, and the center.

Normally, in order to change these values, you have to do things like this:

gpuprog:set_uniform1i("iter", iter);
gpuprog:set_uniform2f("center", cx, cy);
gpuprog:set_uniform1f("scale", scale);

My fingers start to cramp just from having to copy and paste that code. Not only that, it requires that the programmer know a lot more about the intricacies of the GLSL/OpenGL API, which means less time actually coding and more time doing glue work.

What I really want to do as a programmer is this:

gpuprog.iter = iter;
gpuprog.center = float2(cx, cy)
gpuprog.scale = scale;

And while I’m at it, creating that GPU program is quite a pain in the first place. Roughly speaking, the steps are like this:

  • Create Program
  • Create Shader
  • Compile Shader
  • Attach Shader to Program
  • Link Program
  • Use Program

That’s more than a mouthful, and there’s tons of error checking to do in esoteric sorts of ways. But really, as a programmer who’s not particularly focused on learning the details of the OpenGL APIs, and just wants to use the GPU for some kick-butt graphics, I simply want to do the following:

gpuprog = GLSLProgram(fragtext)
gpuprog:Use();

Once again, saving my arthritic hands from a lot of typing, and sparing the few brain cells I have left from having to remember a bunch of APIs that I will use infrequently.

There’s really something to this. Thanks to the magic of Lua, I can implement GLSLProgram so that it does a lookup whenever it doesn’t recognize a field access. Basically, the lookup does the following:

function glsl_get(self, key)
    -- try the class table, as it might be a
    -- function for the class
    local field = rawget(GPUProgram, key)
    if field ~= nil then
        print("returning glsl field: ", field)
        return field
    end

    -- Last, do whatever magic to return a value
    -- or nil
    local value, ncomps =  GetUniformValue(self, key)

    if ncomps == 1 then
        return value[0];
    end

    return value
end
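
So a call like gpuprog:Use() resolves through the rawget on the GPUProgram class table, while a read like gpuprog.scale falls all the way through to GetUniformValue and hands back whatever value the shader currently holds.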

Setting a field is even easier:

function glsl_set(self, key, value)
    -- try to set the value
    -- in the shader
    SetUniformValue(self, key, value)
end

The functions GetUniformValue and SetUniformValue are where all the action is. They figure out what type of field we’re accessing (there’s a GLSL API for that), construct the appropriate type of array, stuff it with the value, and call the appropriate API to get the values in and out, then hand the result back to the user in a nice, usable form. You can actually round-trip values between get/set, so everything works out great. All the error checking can be put in one place, rather than spread all over your code.
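
To give a flavor of it, here is a rough sketch of what the setting half might look like. This is only a sketch: it assumes the ‘ogm’ binding passes the raw GL calls through, that self.ID holds the program handle, that a hypothetical FindUniformType helper walks glGetActiveUniform to discover the declared type, and that vector values index from 0, like the value[0] in the getter above:

function SetUniformValue(self, name, value)
    local loc = ogm.glGetUniformLocation(self.ID, name);
    if loc < 0 then return end   -- no such uniform; quietly ignore it

    -- ask GL what the uniform was declared as (GL_INT, GL_FLOAT,
    -- GL_FLOAT_VEC2, ...) and dispatch to the matching glUniform* call
    local utype = FindUniformType(self.ID, name);   -- hypothetical helper

    if utype == GL_INT or utype == GL_SAMPLER_1D then
        ogm.glUniform1i(loc, value);
    elseif utype == GL_FLOAT then
        ogm.glUniform1f(loc, value);
    elseif utype == GL_FLOAT_VEC2 then
        ogm.glUniform2f(loc, value[0], value[1]);
    end
    -- ...and so on for vec3, vec4, matrices, and the rest
end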

And the way it ties together is like this:

GPUProgram = {}
GPUProgram_mt = {}

function GPUProgram.new(fragtext, vertext)
    local self = {}

    self.ID = ogm.glCreateProgram();

    if fragtext ~= nil then
        self.FragmentShader = GPUShader(GL_FRAGMENT_SHADER, fragtext);
        GPUProgram.AttachShader(self, self.FragmentShader);
    end

    if vertext ~= nil then
        self.VertexShader = GPUShader(GL_VERTEX_SHADER, vertext);
        GPUProgram.AttachShader(self, self.VertexShader);
    end

    GPUProgram.Link(self)

    setmetatable(self, GPUProgram_mt)

    return self
end

GPUProgram_mt.__index = glsl_get
GPUProgram_mt.__newindex = glsl_set

function GLSLProgram(fragtext, vertext)
    local prog = GPUProgram.new(fragtext, vertext)

    return prog
end

Basically, the GLSLProgram function creates an ‘instance’ of a GPUProgram table. That table gets a metatable carrying the __index and __newindex functions. __index is called whenever a key is not found in the GPUProgram instance, and __newindex is called whenever a value is assigned to a key the instance doesn’t already have.
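
Putting it all together, the per-frame update for the Mandelbrot explorer boils down to something like this. This is a hypothetical sketch, since the actual event loop and drawing code aren’t shown in this post:

gpuprog = GLSLProgram(fragtext)
gpuprog:Use();

local cx, cy, scale, iter = -0.5, 0.0, 2.0, 100

function update_frame()
    scale = scale * 0.99    -- zoom in a little every frame
    iter = iter + 1         -- and crank up the detail as we go

    gpuprog.center = float2(cx, cy)
    gpuprog.scale = scale
    gpuprog.iter = iter
end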

It’s a good thing I was able to come up with this little string of code. Otherwise, I fear that I would never be able to make any progress using the GPU. The same little tricks will need to be done to set vertex attributes as well, which is a whole other beast to deal with.

But, there you have it. I don’t know about using other dynamic programming languages to deal with GPU shading, but I find this to be the cat’s meow as far as I’m concerned. Having this particular mechanism in place makes GPU shader programming feel as natural as my regular Lua programming. There is that small step of writing that bit of shader code in what looks like “C”, but maybe even that isn’t a big deal. I don’t see why shader code can’t be written in Lua, and just translated into the appropriate backend language. If I could do that, then I could stay safely tucked away in my “Lua Only” world. But I digress…


