Spelunking Linux – drawing on libdrm
Posted: September 26, 2015
And what’s this then? It’s the draw_crtc_lines.lua example from the LJIT2RenderingManager repository. I am using a LuaJIT-wrapped libdrm to draw graphics on the console of my Linux machine. Yah, the console, where text is usually the norm. But really, text is just a bunch of glyphs, which is nothing but a bunch of graphics, right?
That was three years ago. With the Raspberry Pi, it was actually a fairly straightforward thing to get a handle on the screen, and thus a pointer to the frame buffer data. Just a couple of function calls…
With the image here, I am using libdrm, which is at the core of graphics on Linux systems. The way it works is, you have fairly low-level access to the graphics subsystem within the Linux kernel itself. Then there’s the userspace code, as represented by libdrm, which wraps up various ioctl() calls to make it easy to interact with those kernel functions. Then other things, from the console to X11 windows, are built atop this very lowest level. Throw in libudev and libevdev for input device handling, and you begin to have a UI subsystem.
Using libdrm isn’t particularly hard, but it’s pretty darned hard to find any crisp, clear example articles or code to show you the way. Most of the examples are associated with mode setting, or are fairly low-level tests, or are very out of date. I’m not quite sure why that is, other than the fact that most people don’t really care about this lowest-level stuff; they just use higher-level frameworks such as Qt, SDL, and what have you. I was hard pressed to find anything like “how to draw to the current screen’s frame buffer directly”. So, here’s my version:
--[[
    Draw on the frame buffer of the current default crtc

    The way to go about drawing to the current screen, without changing modes is:
      Create a card
      From that card, get the first connection that's actually connected to something
      From there, get the encoder it's using
      From there, get the crt controller associated with the encoder
      From there, get the framebuffer associated with the controller
      From there, we have the width, height, pitch, and data ptr

    So, that's enough to do some drawing
--]]
package.path = package.path..";../?.lua"

local ffi = require("ffi")
local bit = require("bit")
local bor, band, lshift, rshift = bit.bor, bit.band, bit.lshift, bit.rshift

local libc = require("libc")
local utils = require("utils")
local ppm = require("ppm")
local DRMCard = require("DRMCard")

local function RGB(r,g,b)
    return band(0xFFFFFF, bor(lshift(r, 16), lshift(g, 8), b))
end

local function drawLines(fb)
    local color = RGB(23, 250, 127)
    for i = 1, 400 do
        utils.h_line(fb, 10+i, 10+i, i, color)
    end
end

local card, err = DRMCard();
local fb = card:getDefaultFrameBuffer();
fb.DataPtr = fb:getDataPtr();
print("fb: [bpp, depth, pitch]: ", fb.BitsPerPixel, fb.Depth, fb.Pitch)

local function drawRectangles(fb)
    utils.rect(fb, 200, 200, 320, 240, RGB(230, 34, 127))
end

local function draw()
    drawLines(fb)
    drawRectangles(fb);
end

draw();

ppm.write_PPM_binary("draw_crtc_lines.ppm", fb.DataPtr, fb.Width, fb.Height, fb.Pitch)
There, clear as can be, right? It is actually fairly easy once you have the right wrappers and an understanding of how these things are supposed to work. The comment at the top of the code block explains the steps that go into it. Basically, you get a ‘card’, which is a representation of the video card in question. In this case, we just grab the first available card. If there are two in a system, you could be more specific about it. From the card, there are connectors (VGA, DVI, HDMI, and the like). You choose one of those, and ultimately you find out which frame buffer is associated with it. From there, you can ask the frame buffer for its data pointer, and then you’re ready to go.
Asking for the data pointer from the frame buffer looks like this:
function DRMFrameBuffer.getDataPtr(self)
    local fd = self.CardFd;
    local fbcmd = ffi.new("struct drm_mode_fb_cmd");
    fbcmd.fb_id = self.Id;
    local res = xf86drm.drmIoctl(fd, drm.DRM_IOCTL_MODE_GETFB, fbcmd)
    if res < 0 then
        return nil, "drmIoctl DRM_IOCTL_MODE_GETFB failed";
    end

    local mreq = ffi.new("struct drm_mode_map_dumb");
    mreq.handle = fbcmd.handle;
    if (xf86drm.drmIoctl(self.CardFd, drm.DRM_IOCTL_MODE_MAP_DUMB, mreq) ~= 0) then
        error("drmIoctl DRM_IOCTL_MODE_MAP_DUMB failed");
    end

    local size = fbcmd.pitch * fbcmd.height;
    local dataPtr = emmap(nil, size, bor(libc.PROT_READ, libc.PROT_WRITE), libc.MAP_SHARED, fd, mreq.offset);

    return dataPtr
end
Well, that’s certainly a mouthful, just to get a pointer to the screen’s frame buffer data! The way it breaks down actually isn’t that bad. First, you do that drmIoctl to get a handle on the framebuffer. This is just how the kernel knows which framebuffer you’re talking about, because there can be many. Then you use that handle to get some information which is useful for doing an mmap() call (the offset from the beginning of the ‘file’ which represents the frame buffer). And finally, you make the mmap call so that you can get a pointer to the actual data of interest.
At this point, you now have a pointer to the data portion of the framebuffer, and you can start drawing to your heart’s content. Yep, it’s that easy.
Well, this is all fine and dandy then. If you don’t feel like creating your own graphics rendering engine from scratch, you could enlist the power of libpixman (LJIT2pixman – Drawing on Linux), another low-level portion of the graphics pipeline on Linux. With libpixman, you can take this pointer you got from the framebuffer and hand it to one of the pixman_image_create_bits() calls to get yourself a pixman drawing canvas; then you’re all set to use all the goodness pixman has to offer.
That’s very nifty, and this is how windowing systems are born.
Using libdrm can be a daunting task to the uninitiated (such as myself). Through nice Lua wrapping, and a bit of objectification, it can be tamed, and drawing on the screen is no harder than drawing into any other bit of memory you might have. One added bonus here. Since you have a pointer to the colors on the screen, doing a screencast capture, or desktop sharing, can’t be too far away…