The Challenge of Writing Correct System Calls

If you are a veteran of the Windows APIs, you might be familiar with some recurring patterns. One of them is the dual return value:

int somefunc();

Documentation: somefunc() will return some value other than 0 upon success. On failure, it will return 0, and you can then call GetLastError() to find out which error actually occurred…

In some cases, 0 means error. In some cases, 0 means success, and anything else is actually the error. Some functions return BOOL, and some BOOLEAN (completely different types!).

Another pattern is the “pass me a buffer, and a size…”

int somefunc(char *buff, int sizeofbuff)

Documentation: Pass in a buffer, and the size of the buffer. If 'sizeofbuff' == 0, then the return value will indicate how many bytes need to be allocated in the buffer, so you can call the function again.

A slight variant is this one:

int somefunc(char *buff, int * sizeofbuff)

In this case, the return value of the function will indicate whether there was an error or not. If there was an error such as ERROR_INSUFFICIENT_BUFFER, then 'sizeofbuff' will be stuffed with the actual size needed to fulfill the request.

This can be very confusing to say the least. What makes it more confusing, at least in the Windows world, is that there is no single way this is done. Windows APIs have existed since the beginning of time, so there is as much variety in the various APIs as there are programmers who have worked on them over the years.

How to bring sanity to this world? I’ll examine just one case where I use Lua to make a more sane picture. I want to get the current system time. In kernel32, there are a few date/time formatting functions, so I’ll use one of those. Here is the ffi declaration of the one I want:

    int GetTimeFormatEx(
        LPCWSTR lpLocaleName,
        DWORD dwFlags,
        const SYSTEMTIME *lpTime,
        LPCWSTR lpFormat,
        LPWSTR lpTimeStr,
        int cchTime);

That’s one hefty function to get a time printed out in a nice way. There are pointers to unicode strings, pointers to structures that contain the system time, size of buffers, buffers in unicode…

In the end, I want to be able to do this: GetTimeFormat(), and have it print: “7:54 AM”

Alrighty then. Can’t be too hard…

local GetTimeFormat = function(lpFormat, dwFlags, lpTime, lpLocaleName)
  dwFlags = dwFlags or 0;
  --lpFormat = lpFormat or "hh':'mm':'ss tt";
  if lpFormat then
    lpFormat = k32.AnsiToUnicode16(lpFormat);
  end

  -- first call to figure out how big the string needs to be
  local buffsize = k32Lib.GetTimeFormatEx(lpLocaleName, dwFlags,
    lpTime, lpFormat, nil, 0);

  -- buffsize should be the required size
  if buffsize < 1 then
    return false, k32Lib.GetLastError();
  end

  local lpDataStr = ffi.new("WCHAR[?]", buffsize);
  local res = k32Lib.GetTimeFormatEx(lpLocaleName, dwFlags,
    lpTime, lpFormat, lpDataStr, buffsize);

  if res == 0 then
    return false, k32Lib.GetLastError();
  end

  -- We have a widechar, turn it into ASCII
  return k32.Unicode16ToAnsi(lpDataStr);
end

Not too bad, if a bit redundant. There are a couple of things of note, which are easy to miss if you’re not paying close attention.

First of all, I’m following the convention that any system function that succeeds returns the value, and any that fails returns false plus an error code.

The first step is to deal with default parameters. The dwFlags parameter is an integer, so if it has not been specified, a default value of 0 will be used. If you don’t do this, a nil will be passed to the system function, and that will surely not work.

The time value can be passed in. If it is, it will be used. If not, the nil in this position will cause the current system time to be used. The same goes for lpLocaleName and lpFormat: if they are nil, the system default values will be used, according to the function call documentation.

The next important thing is turning the lpFormat string into a Unicode string if it was specified. Lua, by default, deals in plain ANSI 8-bit strings, not Unicode, so I assume what’s been specified is a standard Lua ANSI string, and I convert it to UTF-16.

And finally, the first function call to the system. In this first call, I want to get the size of the buffer needed, so I pass in ‘0’ as the size of the buffer. The return value of the function will be the size of the buffer needed to fill in the string. Of course, if the return value is ‘0’, then the ‘GetLastError()’ function must be called to figure out what was wrong. In this case, it could be that one of the parameters specified was wrong, or something else. But, bail out at any rate.

Now that I know how big the buffer needs to be (in UTF-16 characters, not in bytes), I allocate the appropriate buffer, and make the call again, this time passing in the specified buffer.

Last step: take the Unicode string that was returned, and turn it back into an ANSI string so the rest of Lua can be happy with it.

There are a couple more error conditions that could be handled, like checking the types of the passed-in parameters, or handling the case where the needed buffer size changes between the two system calls, but this is a ‘good enough’ approach.

It’s a few dozen lines of boilerplate code to ensure the simplicity and relative correctness of a single system call. With literally hundreds of very interesting system calls in the Win32 system, you can imagine how challenging it can get to do these things right.

Of course, this is why libraries exist, because someone has actually gone through and done all the challenging work to get things right. I find that doing this work in Lua is pretty easy. The biggest challenge is reading and interpreting the documentation of the API. Sometimes it’s clear, sometimes it’s not. Once conquered though, it sure does make programming in Windows a lot easier. I suspect the same is true of any language/OS binding.