So, I’ve written quite a lot about computicles over the past few years. In most of those articles, I’m talking about the software implementation of tiny units of computation. The idea for computicles stems from a conversation I had with my daughter circa 2007, in which I laid out a grand vision of a world where units of computation would be really small, sized to fit in your hand, and able to connect and do stuff fairly easily. That was my envisioning of ubiquitous computing. And so, yesterday, I received the latest creation from Hardkernel, the Odroid HC1 (HC for Home Cloud).
Hardkernel is a funny company. I’ve been buying at least one of everything they’ve made for the past 5 years or so. They essentially make single board computers in the “credit card” form factor. What you see in the picture is the HC1, with an attached 120GB SSD. The SSD is a standard 2.5″ drive, which gives you a sense of the size of the thing. The black casing is aluminum, and it doubles as the heat sink for the unit.
The computer bit of it is essentially a reworked Odroid XU4, which all by itself is quite a strong computer in this category. Think Raspberry Pi, but 4 or 5 times stronger. The HC1 has a single SATA connector to slot in whatever hard storage you choose. No extra ribbon cables or anything like that. The XU4 itself can run variants of Linux, as well as Android. The uSD card sticking out of the right side provides the OS. In this particular case I’m using OMV (Open Media Vault), because I wanted to try the unit out as a NAS in my home office.
One of the nice things about the HC1 is that it’s stackable. So, I can see piling up 3 or 4 of these to run my local server needs. Of course, when you compare them to the giant beefy 64-bit super server I’m currently typing at, these toy droids offer very little competition in the way of absolute compute power. Hardkernel even did an analysis of bitcoin mining and determined it would take years to see a return on the investment. But computing, and computicles, aren’t about absolute concentrated power. Rather, they are about distributed processing.
Right now I have a Synology, probably the equivalent of today’s DS1517. That thing has RAID up the wazoo, redundant power, multiple NICs, and it’s a reliable workhorse that just won’t quit, so far. The base price starts at $799, before you actually add any storage media. The HC1 starts at $49. Of course there’s no real comparison in terms of reliability, redundancy, and the like. Or is there?
Each HC1 can hold a single disk. You can throw in whatever size and variety you like. This first one has a Samsung SSD that’s a couple of years old, at 120GB. These days you can get 250GB for $90. You can go up to 4TB with an SSD, but that’s more like a $1,600 proposition. So, I’d be sticking with the low end. That makes a reasonable storage computicle roughly $150.
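Running the numbers on that (using the prices above plus the $49 board), a quick sketch shows where the cost-per-gigabyte sweet spot sits:

```python
# Back-of-the-envelope cost-per-GB for an HC1 storage node.
# Prices are the rough figures quoted above; adjust to taste.
HC1_BOARD = 49.00  # USD, bare board

options = {
    "250GB SSD": (250, 90.00),
    "4TB SSD":   (4000, 1600.00),
}

for name, (gigabytes, price) in options.items():
    total = HC1_BOARD + price
    print(f"{name}: ${total:.0f} total, ${total / gigabytes:.3f}/GB")
```

The 4TB node actually wins on cost per gigabyte, but the up-front outlay is what keeps me at the low end.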
You could of course skip the SSD or HDD and just stick in a largish USB thumb drive, or a 128GB uSD card for $65, but those interfaces aren’t going to be nearly as fast as the SATA III interface the HC1 is sporting. So, great for light use, but for relatively constant streaming and downloading, the SSD, and even HDD, solutions will be more robust.
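For a sense of scale, here’s a rough comparison using nominal bus bandwidths. Real-world throughput is lower across the board, especially for uSD cards, but the relative gap is the point:

```python
# Nominal bus bandwidths in MB/s (real drives and cards sustain less).
BANDWIDTH_MB_S = {
    "SATA III":  600,  # 6 Gbit/s link
    "UHS-I uSD": 104,  # bus ceiling; cards rarely sustain this
    "USB 2.0":    60,  # 480 Mbit/s, typical thumb drive port
}

file_mb = 10 * 1024  # moving a 10GB video file
for bus, mb_s in BANDWIDTH_MB_S.items():
    print(f"{bus}: ~{file_mb / mb_s:.0f}s to move 10GB at bus speed")
```

Even at these optimistic figures, the thumb drive route takes roughly ten times as long as SATA for the same file.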
So, what’s the use case? Well, besides the pure geekery of the thing, I’m trying to imagine more appliance like usage. I’m imagining what it looks like to have several of these placed throughout the house. Maybe one is configured as a YouTube downloader, and that’s all it does all the time, shunting to the larger Synology every once in a while.
How about video streaming? Right now that’s served up by the Synology running a Plex server, which is great for most clients, but sometimes, it’s just plain slow, particularly when it comes to converting video clips from ancient cameras and cell phones. Having one HC1 dedicated to the task of converting clips to various formats that are known to be used in our house would be good. Also, maybe serving up the video itself? The OMV comes with a minidlna server, which works well with almost all the viewing kit we have. But, do we really have any hiccups when it comes to video streaming from the Synology? Not enough to worry about, but still.
Maybe it’s about multiple redundant distributed RAID. With 5–10 of these spread throughout the house, each one could fail in time, be replaced, and nothing would be the wiser. I could load each with a couple of terabytes, configure some form of pleasant redundancy across them, and be very happy. But then there’s the cloud. I actually do feel somewhat reassured having the ability to back up somewhere else. As the recent flooding in Texas shows, as well as wildfires, no matter how much redundancy you have locally, it’s still local.
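As a sanity check on that intuition, here’s a toy model of replica survival across a handful of nodes. The 5% annual failure rate is a made-up figure for illustration, not a spec for any of this hardware:

```python
# Toy availability model: data replicated across several small nodes,
# each assumed to fail independently with some annual probability.
def survival_probability(replicas: int, p_fail: float = 0.05) -> float:
    """Probability that at least one replica survives the year."""
    return 1 - p_fail ** replicas

for r in (1, 2, 3):
    print(f"{r} replica(s): {survival_probability(r):.6f}")
```

Of course, the independence assumption is exactly what a flood or fire breaks, all the nodes are in the same house, which is why the off-site backup still earns its keep.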
Then there’s compute. Like I said, a single beefy x64 machine with a decent GPU is going to smoke any single one of these, likewise a small cluster of them. But that doesn’t mean they’re not useful. Odroid boards are ARM based, which makes them inherently lower in power consumption than their Intel counterparts. If I have some relatively light loads that are trivially parallelizable, then a cluster of a few of these might make some sense. Again with the ubiquitous computing: if I want the Star Trek style “computer, where’s my son”, or “turn on the lights in the garage”, without having to send my voice to the cloud constantly, then performing operations such as speech recognition on a little cluster might be interesting.
The long and short of it is that having a compute/storage module in the $150 range makes for some interesting thinking. It’s surely not the only storage option in this range, but the combination of darned good hardware, tons of software support, low price and easy assembly, gives me pause to consider the possibilities. Perhaps the hardware has finally caught up to the software, and I can start realizing computicles in physical as well as soft form.
Last time around, I outlined what would go into my build. This time, I’ve actually placed the order for the parts. I was originally going to place it with newegg, but the motherboard was out of stock. This forced me to consider amazon instead. Amazon had everything, and at fairly decent prices. That, plus Prime shipping and a good return policy, made it a relative no-brainer (sorry newegg).
I did a hand wave on some of the parts in the last post, so I’ll round out the inventory in detail here.
RAM used to require a ton of thought in the past, but today you can spit in generally the right direction and things will likely work out. I wanted to outfit my rig with 64GB total, and I wanted RAM that was reliable and looked good. I probably should have gone for some red colored stuff, but I went with the black G.SKILL Ripjaws V Series DDR4 PC4-25600 3200MHz parts (model F4-3200C16D-32GVK).
They come in sets of two (32GB per set), so I ordered two sets. Who knows, maybe I’ll get lucky and they’ll be red.
I know from my laptop, and my current Shuttle PC, that having an SSD as your primary OS drive is an absolute must these days. Please, no 5400 RPM spinning rust! For this item, I chose the Samsung 950 Pro, a 256GB M.2 NVMe drive with V-NAND.
Well, since I’m no longer interested in building the ultimate streaming PC, I’ve turned my attention to building a more traditional tower PC. What? Those are so 1980! It’s like this. What I’m really after is using the now not so new Vulkan API for graphics programming. My current challenge is, my nice tidy Shuttle PC doesn’t have the ability to run a substantial enough graphics card to try the thing out! I do in fact have a tower PC downstairs, but it’s circa 2009 or something like that, and I find this a rather convenient excuse to ‘upgrade’.
I used to build a new PC every two years, back in the day, but tech has moved along so fast that it hardly makes sense to build so often; you’d be building every couple of months to keep pace. The machines you build today will last longer than an old hardware hacker cares to admit, but sometimes you’ve just got to bite the bullet.
Trying to figure out what components to use in a current build can be quite a challenge. It used to be that I’d just go to AnandTech and look at this year’s different builds, pick a mid-range system, and build something like that. Well, AnandTech is no longer what it used to be, and TomsHardware seems to be the better place for the occasional consumer such as myself.
The first thing to figure out is the excuse for building the machine, then the budget, then the aesthetics.
Excuse: I want to play with the Vulkan graphics API
Budget: Less than $2,000 (to start ;-))
Aesthetics: I want it to be interesting to look at, probably wall or furniture mounted.
Since the excuse is being able to run the Vulkan API, I started contemplating the build based on the graphics card. I’m not the type of person to go out and buy any of the most current, most expensive graphics cards, because they come out so fast that if you simply wait 9 months, that $600 card will be $300. The top choice in this category would be an NVidia GTX 1080. Although a veritable beast of a consumer graphics card, at $650+ it’s quite a budget buster. Since I’m not a big gamer, I don’t need super duper frame rates, but I do want the latest features, like support for DirectX 12, Vulkan, OpenGL 4.5, etc.
A nice AMD alternative is the AMD Radeon RX 480. That seems to be the cat’s meow at the more reasonable $250 price range. It would do the trick as far as being able to run Vulkan, but since it’s AMD and not NVidia, I would not be able to run CUDA. Why limit myself, when NVidia cards will also run OpenCL? So, I’ve opted for an NVidia based MSI GeForce GTX 1060.
The specialness of this particular card is the 6GB of GDDR5 RAM that comes on it. From my past history with OpenGL, I learned that the more RAM on the board the better. I also chose this particular one because it has some red plastic on it, which will be relevant when I get to the aesthetics. Comparisons of graphics cards abound. You can get stuck in a morass trying to find that “perfect” board. This board is good enough for my excuse, and at a price that won’t break the build.
Next most important after the graphics card is the motherboard you’re going to stick it in. The motherboard is important because it’s the skeleton upon which future additions will be placed, so a fairly decent board that will support your intended expansions for the next 5 years or so would be good.
I settled on the GIGABYTE G1 Gaming GA-Z170X-Gaming GT (rev. 1.0) board.
It’s relatively expensive at $199, but it’s not outrageous like the $500+ boards. This board supports up to three graphics cards of the variety I’m looking at, which gives me expansion on that front if I ever choose to use it. It also supports at least 64GB of DDR4 RAM, and it has a ton of peripherals, including USB 3.1 with a Type-C connector, which is good to have as that standard emerges. On top of all that, it has good aesthetics, with white molding and red highlights (sensing a theme?).
To round out the essentials, you need a power supply. For this, I want ‘enough’, not overkill, and relatively silent.
The Seasonic Snow Silent 750 is my choice. Besides getting relatively good reviews, it’s all white on the outside, which just makes it look more interesting.
And last, but not least, the CPU to match. Since the GPU is what I’m actually interested in, the CPU doesn’t matter as much. But, since I’m not likely to build another one of these for a few years, I might as well get something reasonable.
I chose the Intel i7-6700K for the CPU.
At $339, it’s not cheap, but again, it’s not $600. I chose the ‘K’ version to support overclocking. I’ll probably never actually do that, but it’s a useful option nonetheless. I could have gone with a less expensive i5 solution, but I think you lose out on hyper-threading or something, so might as well spend $100 more and be slightly future proof.
Now, to hold all these guts together, you need a nice case. I already have a very nice case housing the circa 2009 machine. I can’t bring myself to take it apart, and besides, I tell myself, it doesn’t have the I/O access on the front panels required of a modern machine. Since part of my aesthetic is being able to show off the guts of the machine (nicely themed colors), I went with something a bit more open.
The Thermaltake core P5 ATX Open Frame case is what I have chosen.
Now, I’m more of a throw it together and start using it kind of builder, but putting a little bit of flash into the build could make it a tad more interesting. Fewer heat dissipation problems, and if I ever do that cool liquid cooling piping stuff, I’ll be able to show it off. This case also has options to mount it against the wall/furniture, and I’ll probably take advantage of that. I can imagine having a home office desk with a couple of these mounted on the front just for kicks. Throw in a few monitors for surround, and… Oh, but I’m not a gamer.
The rest of the kit involves various memory, storage, etc. The motherboard has M.2 as well as mSATA. So, I’ll probably put an SSD on one of those interfaces as the primary OS drive. Throw in a few terabytes of spinning rust, and 64GB of RAM, and it’s all set.
The other nice thing about the motherboard is dual NICs. One is for gaming, the other (intel) is for more pedestrian networking. This can be nothing but goodness, and I’m sure I can do some nice experimenting with that.
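To get a feel for what experimenting with two NICs looks like in software, here’s a minimal Python sketch: enumerate the interfaces the kernel knows about, then bind a listener to a specific local address so traffic stays on the NIC you intend. The loopback address below is just a stand-in for whichever NIC’s LAN address you’d actually use:

```python
# Sketch: list network interfaces, then pin a listener to one local address.
import socket

# Enumerate (index, name) pairs for the interfaces the kernel knows about.
for idx, name in socket.if_nameindex():
    print(idx, name)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Substitute the chosen NIC's LAN address, e.g. the intel port's address;
# port 0 lets the OS pick a free port.
server.bind(("127.0.0.1", 0))
server.listen()
print("listening on", server.getsockname())
server.close()
```

Binding by address like this is the simple, portable way to steer a service onto one NIC; anything fancier (policy routing, per-interface binding) is OS-specific territory.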
Well, that’s what I’m after. I added it all up on newegg.com, and it came out to about $1,500, which is nicely under budget, and will give me a machine I can be happy with for a few years to come.
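For the curious, here’s roughly how that tally shakes out. Only the motherboard ($199) and CPU ($339) prices appear earlier in the post; the rest are placeholder estimates I’ve filled in for illustration, so treat this as a sketch, not an invoice:

```python
# Rough build tally. Motherboard and CPU prices are quoted in the post;
# the remaining figures are placeholder estimates, not actual quotes.
parts = {
    "GIGABYTE GA-Z170X-Gaming GT": 199,  # quoted above
    "Intel i7-6700K":              339,  # quoted above
    "MSI GeForce GTX 1060 6GB":    280,  # estimate
    "64GB DDR4 (2 x 32GB kits)":   300,  # estimate
    "Samsung 950 Pro 256GB M.2":   170,  # estimate
    "Seasonic Snow Silent 750":    120,  # estimate
    "Thermaltake Core P5":         110,  # estimate
}
total = sum(parts.values())
print(f"Estimated total: ${total}")  # lands in the neighborhood of $1,500
```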