Here it is, the first video that I’ve done related to LEAP:
So, I received an email a few weeks back which essentially said, “would you consider a role working for the CTO as a Technical Advisor?” Well, at first I wasn’t sure what to think, but then I actually talked to the person asking the question, and I thought, “wait a minute, this could be a really cool thing”.
It’s like this. At Microsoft, we don’t always have a person in the role of CTO. Bill Gates was “Chief Scientist” at one point, and Craig Mundie I think had the CTO role, as did Ray Ozzie. Sometimes it works, sometimes it’s a distraction.
The current CTO is Kevin Scott, and before I actually met him, the #1 comment I heard about him was “he’s a really cool guy”. Well, after meeting him, I have the same sentiment. Kevin’s not an industry luminary from the birth of the personal computing industry like Ray Ozzie was; he’s an engineer’s engineer with a pedigree that extends through Google, the startup AdMob, and LinkedIn, where he continues to be responsible for their backend infrastructure.
I’ve been at Microsoft for 18 years, which means I’ve done a fair number of things, and I know a fair number of people. The first aspect of being a TA is getting around, meeting with people, and spreading the word that there’s actually a CTO.
What does the CTO do? Well, the best description I can give is that the CTO acts as the dev manager/architect for the company. The scope and responsibility of the CTO can be very broad. Part of it is about the efficiency of our joint engineering objectives. Part of it is making sure we’re marching to the beat of a similar drummer. Can you imagine: Microsoft has a few multi-billion dollar businesses, led by business managers who are fairly autonomous and have quite strong, independent personalities, or they would not be in the positions they are in. And along comes the CTO to help unify them.
Really, the job is being fairly impartial where necessary, and just reminding people of their shared goals and objectives, and helping them to reinforce achieving them.
Being a TA to the CTO? Mostly it’s about going deep in areas. Kevin Scott is a fast learner, fully capable of digesting tons of info and forming a well-informed opinion on his own. The challenge is one of time. Microsoft is vast, and if you want to go beyond the surface level in many areas, you’d spend all your time in meetings and not actually be able to synthesize anything. So, enter the TA role. We take those infinite meetings, going deep on multiple topics, synthesizing to a certain level, and surfacing interesting bits to Kevin where decisions and direction might be required.
That’s the surface description of the role and its responsibilities. The truth is, it’s not at all a well defined role. Eric Rudder was Bill Gates’s TA for five years, and he was quite a force, doing more than just feeding Bill Gates opinions on what he heard in the company. We’ll see what our current office of the CTO is capable of, and what kinds of value we are going to impart to the company.
I am excited for this latest opportunity. I think it’s a fitting role for where I’m at in my career, and what value I can contribute to the company. So, here we go!
Well, it’s finally done.
I began this journey by creating the excuses for doing the build in the first place, and then purchasing the various parts.
Now here is the fully assembled thing. Some final thoughts. The scariest part was doing the water cooling piping. I practiced tube bending on a scrap piece before embarking on the final pieces. Like a plumber, it’s helpful to plan out where the pipes are going, do some measurements, then do the bending and cutting. Really, I was afraid that once it got assembled, it would be springing leaks all over the place, ruining the fairly expensive electronics. When I first put the tubing together, I tested by running some distilled water through the system to flush things out.
In the end, there were no leaks, and everything runs beautifully, and cool. Having done this once now, I can see redoing the tubing at some point to make it more fancy, but for now, it works just fine, and looks cool.
One thing of note: this thing is really quiet. You literally need to almost stick your ear into the various fans to hear them at all. The power supply fan is dead quiet. This is dramatically different from the power supply on my Shuttle PC, which I thought was fairly quiet. Now the Shuttle PC sounds like a jet engine in comparison.
The fans on the cooling radiator are whisper quiet as well, and provide those cool lighting effects to boot. Really this thing shows off best in a fairly dark room where the various glowing light effects can be seen.
The noisiest part of the entire build is actually the disk drive. You wouldn’t normally think of that, but when everything else is absolutely silent, to the point where the AC fan in the room is way louder, the steady rumble of the disk drive is the most notable sound.
I’m loving it so far. I feel a sense of accomplishment in putting it together. I got to use it as a visual aid for the latest cohort of the LEAP class. Having a transparent case makes it easy to point at stuff, and the liquid cooling just adds a nice wow factor.
As far as the OS is concerned, I installed Windows 10 Pro. I figure even if I want to run Linux, I can simply use Hyper-V to create Linux VMs and go that way. Given that the graphics card can run 4 monitors at a time (I think), that’s more than enough to give me the illusion of a common desktop, with two Windows screens, and a third with Linux on a VM. So, it’s a sweet combo.
As for the excuse to be able to run the Vulkan API on a modern graphics board, that’s coming along. I had to install Visual Studio, build LuaJIT, and dust off the cobwebs of my Vulkan ffi binding. All in due time. For now, the screaming machine is being used to type this blog post, and otherwise sits beside my desk looking cool. I’ll have to design a desk specifically for it just to add to the DIY nature of the thing.
Well, since I’m no longer interested in building the ultimate streaming PC, I’ve turned my attention to building a more traditional tower PC. What? Those are so 1980! It’s like this. What I’m really after is using the now not-so-new Vulkan API for graphics programming. My current challenge is that my nice, tidy Shuttle PC doesn’t have the ability to run a substantial enough graphics card to try the thing out! I do in fact have a tower PC downstairs, but it’s circa 2009 or something like that, and I find this a rather convenient excuse to ‘upgrade’.
I used to build a new PC every two years, back in the day, but tech has moved along so fast that it hardly makes sense to build so often; you’d be building every couple of months to keep pace. The machines you build today will last longer than an old hardware hacker cares to admit, but sometimes you’ve just got to bite the bullet.
Trying to figure out what components to use in a current build can be quite a challenge. It used to be that I’d just go to AnandTech, look at that year’s different builds, pick a mid-range system, and build something like that. Well, AnandTech is no longer what it used to be, and TomsHardware seems to be the better place for the occasional consumer such as myself.
The first thing to figure out is the excuse for building the machine, then the budget, then the aesthetics.
Excuse: I want to play with the Vulkan graphics API
Budget: Less than $2,000 (to start ;-))
Aesthetics: I want it to be interesting to look at, probably wall or furniture mounted.
Since the excuse is being able to run the Vulkan API, I started contemplating the build based on the graphics card. I’m not the type of person to go out and buy any of the most current, most expensive graphics cards, because they come out so fast that if you simply wait 9 months, that $600 card will be $300. The top choice in this category would be an NVIDIA GTX 1080. Although a veritable beast of a consumer graphics card, at $650+ it’s quite a budget buster. Since I’m not a big gamer, I don’t need super duper frame rates, but I do want the latest features, like support for DirectX 12, Vulkan, OpenGL 4.5, etc.
A nice AMD alternative is the Radeon RX 480. That seems to be the cat’s meow at the more reasonable $250 price range. This will do the trick as far as being able to run Vulkan, but since it’s AMD and not NVIDIA, I would not be able to run CUDA. Why limit myself, when NVIDIA will also run OpenCL? So, I’ve opted for an NVIDIA-based MSI GeForce GTX 1060.
The specialness of this particular card is the 6GB of GDDR5 RAM that comes on it. From my past history with OpenGL, I learned that the more RAM on the board the better. I also chose this particular one because it has some red plastic on it, which will be relevant when I get to the aesthetics. Comparisons of graphics cards abound. You can get stuck in a morass trying to find that “perfect” board. This board is good enough for my excuse, and at a price that won’t break the build.
Next most important after the graphics card is the motherboard you’re going to stick it in. The motherboard is important because it’s the skeleton upon which future additions will be placed, so a fairly decent board that will support your intended expansions for the next 5 years or so would be good.
I settled on the GIGABYTE G1 Gaming GA-Z170X-Gaming GT (rev. 1.0) board.
It’s relatively expensive at $199, but it’s not outrageous like the $500+ boards. This board supports up to three graphics cards of the variety I’m looking at, which gives me expansion on that front if I ever choose to use it. Beyond that, it supports at least 64GB of DDR4 RAM. It has a ton of peripherals, including USB 3.1 with a Type-C connector, which is good since that standard is just emerging. On top of all that, it has good aesthetics, with white molding and red highlights (sensing a theme?).
To round out the essentials, you need a power supply. For this, I want ‘enough’, not overkill, and relatively silent.
The Seasonic Snow Silent 750 is my choice. Besides getting relatively good reviews, it’s all white on the outside, which just makes it look more interesting.
And last, but not least, the CPU to match. Since the GPU is what I’m actually interested in, the CPU doesn’t matter as much. But, since I’m not likely to build another one of these for a few years, I might as well get something reasonable.
I chose the Intel i7-6700K for the CPU.
At $339 it’s not cheap, but again, it’s not $600. I chose the ‘K’ version to support overclocking. I’ll probably never actually do that, but it’s a useful option nonetheless. I could have gone with a less expensive i5 solution, but I think you lose out on hyper-threading or something, so I might as well spend $100 more and be slightly future proof.
Now, to hold all these guts together, you need a nice case. I already have a very nice case housing the circa 2009 machine. I can’t bring myself to take it apart, and besides, I tell myself, it doesn’t have the I/O access on the front panels required of a modern machine. Since part of my aesthetic is to be able to show the guts of the machine (nicely themed colors), I went with something a bit more open.
The Thermaltake core P5 ATX Open Frame case is what I have chosen.
Now, I’m more of a throw-it-together-and-start-using-it kind of builder, but putting a little bit of flash into the build could make it a tad more interesting. Fewer heat dissipation problems, and if I ever do that cool liquid cooling piping stuff, I’ll be able to show it off. This case also has options to mount it against the wall/furniture, and I’ll probably take advantage of that. I can imagine having a home office desk with a couple of these mounted on the front just for kicks. Throw in a few monitors for surround, and… Oh, but I’m not a gamer.
The rest of the kit involves various memory, storage, etc. The motherboard has M.2 as well as mSATA, so I’ll probably put an SSD on one of those interfaces as the primary OS drive. Throw in a few terabytes of spinning rust, and 64GB of RAM, and it’s all set.
The other nice thing about the motherboard is dual NICs. One is for gaming; the other (Intel) is for more pedestrian networking. This can be nothing but goodness, and I’m sure I can do some nice experimenting with that.
Well, that’s what I’m after. I added it all up on newegg.com, and it came out to about $1,500, which is nicely under budget, and will give me a machine I can be happy with for a few years to come.
I’m not a twitch communicator. 140 characters, or whatever, doesn’t really do it for me, but this is the way the world works now. So, I joined twitter: @LeapToTech
I had a previous twitter account, but since I never used it, I can’t even remember what it was. But, at the behest of Laura Butler @LauraCatPJs, I put up another account so that I could retweet a tweet. It’s the future man! The internet is going to be big some day.
So, now I’m learning about the value of #HashTags, and @callsigns, and that sort of stuff. Really I only wrote this post so I could figure out how to stick a tweeter at the bottom…
Over the past few years, I figured out that in the US, if you spend on your business, you’ll recoup more money than you’ll ever make by simply saving that money in a bank account. So, here we are, it’s the end of the year, what to do?
Well, first up was a new laser cutter:
Then came another large format powder printer
A replacement for my original Up! automated glue gun
And finally, I took a flyer on the GlowForge laser cutter
GlowForge (link to save $100: http://glowforge.com/referred/?kid=Y1kbpm)
Now, what business does any sane person have putting all this stuff into their garage? Well, first of all, the Pro Jet isn’t actually in my garage; it’s being put into production in California. That does leave me with the Weike and the Afinia (ignore the GlowForge). The Weike is an interesting tool in that it’s big enough to cut pieces large enough to incorporate into real furniture. The way I see it, I can cut 1/8″ or 1/16″ slices and create my own “plywood”. That opens up a lot of possibilities. And of course there’s always the paper/cardboard cutting for kids’ crafts, ‘living hinge’ jewelry boxes, and the like. The Afinia, whenever it shows up, can serve for in-house ‘print/cut’, rather than being the heavy duty garage-bound tool. It will go in my office.
The Afinia is just an evolution of a fairly good ‘just print’ printer. I purchased one of the earliest Up! printers a few years back, after first spending quite a few frustrating months with the original MakerBot Cupcake. I’ve loved my Up! ever since, but it’s getting long in the tooth, so it’s time for an upgrade. The H800 has a larger print area and, probably more importantly, an enclosed volume with a HEPA filter. This makes it a candidate for living inside the house if I make space for it. The larger print volume will make it feasible to print fun parts for construction, and not just little trinkets. I’m looking forward to having an enclosed build volume; it will likely make for better temperature control, and thus less warping of ABS.
I recently had one of my ancient designs printed on the powder printer, just to see what it would look like. I have plans to have it made using an injection molding process; meanwhile, the Afinia will help me make some small runs of the same, in true rapid prototyping fashion.
Speaking of designs, I recently installed the 2015 version of OpenSCAD. It renders my old designs nicely enough, and it has some new features which will make some work I had done previously even better. So, what better time to dust off some of those old designs and update them with the modern OpenSCAD. We’ll see how hard it is to get back into the mindset of that programming environment.
And how can the year end purchases ignore electronics!
Raspberry Pi Zero (from Adafruit)
Raspberry Pi Microsoft iOT package
At $5 (for the basic board), how could any self respecting tinkerer pass this up? Yes, if you really add things up, it’s much more than this, but I already have enough power supplies, keyboards, and wifi dongles in stock that these are not additional costs; the incremental cost to me is truly $5. At the very least, I can see using these as a learning tool to teach Linux. A Raspberry Pi Zero with LuaJIT installed, plus tons of “Lua Tools For Linux”, and you’re all set.
So, I’m finishing the year with a wallet-busting blowout, taking back some tax dollars in the process, and generally setting up to have some great fun in the new year.
Happy New Year!
Of course you’d expect to see a picture here, or on Facebook, or Instagram, or whatever, but we don’t roll like that.
New baby girl arrived 9/30 at 9:30am. September 28th was apparently a ‘supermoon’, and we’ve been having ‘blood moon’, and the pope was visiting and the president of China was visiting, and…
Seems like an interesting confluence, or near confluence. I especially like the 9/30, 09:30 thing. That will make it easier to remember.
Having newborns around means a couple of things. First and foremost, not a lot of sleep, although this one seems to sleep at least a couple of hours here and there. It seems like we’re constantly up from 10pm to 4am, although that’s probably not true. The second thing it means is that a lot of fairly shoddy code gets written. During those wee hours, between feedings, changes, and general soothing walks around the house, one or the other of the laptops is actually on my lap, and I’m slinging out code. It’s funny, though: something that might take 2 hours to figure out during these late night sessions can take a mere minute to realize in the bright light of regular day hours. Nonetheless, I’m heeding the call of our President and making it possible for everyone to write code!
And so it goes. A new reason to write inspired code. There’s someone new to reap the benefits of my labors.
Quite a while ago (it looks like about 3 years), I created the LAPHLibs repository. It was an outgrowth of various projects I was doing, and an experiment in open licensing. The repo is full of routines varying from hash functions to bit banging. It’s not a ‘library’ as such, just a curation of things that are all pure LuaJIT code.
Well, after I spun it out, I didn’t show it much love. I made a couple of updates here and there as I found fixes while using the routines in other projects. Recently, though, I found that this is the most linked-to of all my GitHub projects. As such, I thought it might be useful to give it a makeover, because having all that bad code out there doesn’t speak well of the Lua language, nor of what I’ve learned over the past few years.
So, I recently spent a few hours cleaning things up. Most of the changes are documented in the new CHANGELOG.md file.
If you are one of the ten people who so happens to read this blog, and are a user of bits and pieces from that library, you might want to take a look.
One of the biggest sins I fixed was that, in a lot of cases, I was polluting the global namespace with my functions. It was inconsistent: some functions were global, some local, sometimes even in the same file! Now everything is local, and there are proper exports.
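The fix is the standard Lua module pattern. Here’s a minimal sketch of the before and after; the function name is illustrative, not one of the actual LAPHLibs routines:

```lua
-- Before: declaring a function without 'local' silently made it a global.
-- function rotl32(x, n) ... end

-- After: keep everything local, and export through a returned table.
local bit = require("bit")  -- LuaJIT's built-in bit operations library

local function rotl32(x, n)
    -- rotate a 32-bit value left by n bits
    return bit.rol(x, n)
end

-- The module's public surface is just this table; nothing touches _G.
return {
    rotl32 = rotl32,
}
```

A caller then does `local lib = require("somemodule")` and uses `lib.rotl32(...)`, with no globals leaking between files.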
I also got rid of a few things, like the implementation of strtoul(). The native tonumber() function is more correct, and deals with all the cases I had implemented.
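For example, tonumber() already handles the strtoul()-style cases, including hex prefixes and explicit bases:

```lua
-- tonumber() covers what a hand-rolled strtoul() did, and more.
print(tonumber("42"))       -- 42
print(tonumber("0x1F"))     -- 31, hex prefix handled automatically
print(tonumber("1010", 2))  -- 10, explicit base argument
print(tonumber("junk"))     -- nil, instead of a partial parse or an error
```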
There are a few places where I was building idiomatic classes, and I cleaned those up by adding proper-looking ‘constructors’.
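The usual Lua shape for that kind of constructor is a metatable plus a `new` function. A sketch, using a made-up `Point` type rather than an actual LAPHLibs class:

```lua
local Point = {}
Point.__index = Point

-- 'Proper looking' constructor: build the instance table, attach the metatable.
function Point.new(x, y)
    return setmetatable({ x = x or 0, y = y or 0 }, Point)
end

-- Methods declared with ':' get an implicit 'self' parameter.
function Point:length()
    return math.sqrt(self.x * self.x + self.y * self.y)
end

local p = Point.new(3, 4)
print(p:length())  -- 5
```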
Overall, the set of routines stands a little taller than it did before. I can’t say when I’ll next do another overhaul. I did want to play around a bit more with the bit banging stuff, and perhaps I can add a little more from projects I’ve done recently, like the schedlua scheduler and the like.
Bottom line, sometimes it’s actually worth revisiting old code, if for no other reason than to recognize the sins of your past and correct them if possible.
You see, it’s like this. As it turns out, a lot of people want to run code against a Linux kernel in the cloud. Even though Windows might be a fine OS for cloud computing, the truth is, many customers are simply Linux savvy. So, if we want to make those customers happy, then Linux needs to become a first class citizen in the Azure ecosystem.
Being one to jump on technological and business-related grenades, I thought I would join the effort within Microsoft to make Linux a fun place to be on Azure. What does that mean? Well, you can already get a Linux VM on Azure pretty easily, just like with everyone else. But what added value is there coming from Microsoft, so this isn’t just a simple commodity play? Microsoft does in fact have a rich set of cloud assets, and not all of them are best accessed from a Linux environment. This might mean anything from providing better access to Azure Active Directory, to creating new applications and frameworks altogether.
One thing is for sure. As the Windows OS heads for the likes of the Raspberry Pi, and Linux heads for Azure, the world of computing is continuing to be a very interesting place.