If you are old enough, you may remember and even relate. Picture this: the early 2000s; DivX--and later, its rival XviD--on the software side, and Pentium 4s and Athlons on the hardware side, had finally made video compression practical; no more bulky, moldy VHS tapes; Napster in its best days, changing P2P history forever. Down here in South America, dial-up internet was finally dying and ADSL had arrived. CD-RW drives were relatively affordable, as was their media, especially compared with the emerging--and expensive--DVD. YouTube wasn’t even born. Add it all together and you have the perfect scenario to end up with this in your house:
A CD tower full of physical media containing software, mp3 files, and movies. My tower was exactly like the one pictured, but in tobacco. It could hold up to two hundred CDs. When this was taken, I’d switched from standard CD cases to slim cases, doubling the capacity.
Let’s fast forward a few years to the early 2010s. YouTube existed, but wasn’t as huge as it is today. Netflix had just arrived in South America, with just a few obscure titles in its portfolio. Faster ADSL--2 to 5 Mbps--was widely available in most big cities down here. Downloading not just movies, but entire series, became a big deal. Although DVDs and DVD drives had become cheaper, they still couldn’t beat the price per megabyte of hard disk drives (HDDs) before the floods in Thailand (https://goo.gl/rP6kyE). Instead of managing a pile of media, some of which was starting to go bad, why not have it all on HDDs, available with just a click? But what about all the CD and DVD covers? And would I trust my family photos to a mechanical HDD prone to failure?
When I started planning how to organize all the media, some research showed XBMC (which would be renamed Kodi a few years later) to be the perfect solution. XBMC could handle the media library, download covers and lyrics, play back almost any video codec with subtitles, display my family photos, and much more. All it would take was a good amount of work standardizing folders and file names.
What about the data itself? Disks fail and develop bad blocks all the time. RAID is okay, but it has one main problem when dealing with large amounts of mostly static files--efficiency! RAID 1 would have doubled the cost of disks, space, noise, and energy. RAID 5 seemed okay, but what if a disk failed and I could not get another of the same size right away? What if two disks failed? There’s RAID 6, but few controllers support it. And what if the controller itself fails? Different manufacturers have different implementations, making recovery of all the data a nightmare!
Here comes snapshot RAID. Unlike conventional RAID, it doesn’t work at the disk level. Instead, it works at the data level, on top of any file system. As long as you have a parity disk at least as large as the biggest data disk in the array, you’re fine. You can even mix disks of different sizes. The downside is speed and data availability. Some RAID 5 controllers offer live data reconstruction as long as only one disk fails, and RAID parity is calculated live as data changes. That’s fine, but it keeps all disks spinning most of the time, even if you’re just changing a small file. Snapshot RAID, as the name says, calculates parity in snapshots. If a disk fails, you’re stuck with the last snapshot. The good part is that any file not on the failed disk remains available for use, with no reconstruction needed.
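To give an idea of how simple the snapshot approach can be, here is a minimal SnapRAID configuration sketch. All the disk paths and names below are hypothetical examples, not my actual setup:

```
# One dedicated parity disk, at least as large as the biggest data disk
parity /mnt/parity/snapraid.parity

# Content files hold the list of protected files and their checksums
# (keep copies on more than one disk)
content /mnt/disk1/snapraid.content
content /mnt/disk2/snapraid.content

# The data disks to protect
data d1 /mnt/disk1/
data d2 /mnt/disk2/
data d3 /mnt/disk3/
```

Running `snapraid sync` then takes a new parity snapshot, and `snapraid fix` rebuilds a failed disk from the last snapshot.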
Another goal was merging all data and displaying it as one big disk--almost like JBOD, but with security. Using multiple disks is fine, but makes handling the media too complex. With this in mind, and with plans to use the server as a gaming PC as well (which kind of limited the OS options to Windows) I decided to go with FlexRAID (http://www.flexraid.com/). It was still in its beta stage back then, but showed a lot of potential. To my surprise, as the Brazilian translator I was gifted a full license when it became paid.
With all the software needs addressed, it was time to get my hands dirty and build some hardware. As this was intended to be a media/gaming PC, and as one or two 1.5TB HDDs would be enough to hold all of my media at the time, I chose this case by Sentey to be my living room rack. I will explain all the dust later.
The case could fit up to three 3.5” HDDs, a DVD-RW drive, a micro ATX motherboard, a full ATX PSU and a full height GPU. It worked wonderfully until I added the third disk. No matter how many fans I added, or how fast the fans worked (producing a lot of noise), it would overheat.
It became quite obvious that I would need more fans and a bigger case to fit more disks, as the library was growing fast thanks to the convenience of a 24-hour server and faster internet.
It just happened that I had this old Compaq desktop case laying around:
With a “few”, not particularly pretty, mods to the case and the PSU housing, it became this:
I added seven HDDs and two additional HDD controllers, as the motherboard could only handle two disks. When this photo was taken, the GPU, the additional rear fan, and the 120mm fan at the top of the case had not yet been added. This was the original motherboard, which I replaced a few months later with one that would work better with the processor. The four 80mm fans in the front--added to cool down the HDDs--had been removed for cleaning. The power connector is hanging in the front.
- Motherboard - Abit VA-10 (replaced later)
- Processor - AMD Athlon XP 2000+ (later replaced with an Athlon X2)
- RAM - 2GB DDR 333 (actually DDR 400 underclocked; two more gigabytes added later)
- SiI 3112 RAID controller
- SiI 3114 RAID controller
HDDs (the small ones were later replaced with bigger ones):
- 1 x 40GB IDE (system)
- 3 x 1.5TB SATA (media)
- 1 x 1.5TB SATA (personal data)
- 1 x 500GB SATA (downloads and virtual memory)
- 1 x 1.5TB SATA (parity)
- Windows 7 32-bit
This setup worked decently for weeks until my then-fiancée (now my wife) started complaining about the noise. I must admit, it was LOUD, especially with the extra fans that came with the graphics card. With a wedding pending and no plans of leaving my fiancée, I decided to extend the bathroom renovation we were planning to the living room and bedrooms too. Remember all the dust in the first case pic? This is why:
Yes, we use masonry down here in Brazil, not drywall. And yes, renovating an apartment while living in it can be compared to hell with no exaggeration. The CD tower can be seen in the first two pics, by the table. The computer case is on the right side of the first pic. All those yellow things are conduits to hide the cables and wires that would connect the living room and bedroom TVs to the server, now confined to a cabinet in the hallway. Of course, heat would be a problem in a confined space, so yet more fans were added to the cabinet itself, pumping the hot air into a vented upper partition. After a lot of dust and days of hard work, the results can be seen in Figures 9 and 10.
The server could still be heard, even with the door closed, but it was not even close to the noise we had before. As for the living room, Figure 11 shows the result.
If you’re still reading after all this, you’re probably asking, “What about ODROID in all this?” After a few years focused on expanding the family, we decided to move to a bigger apartment. Since almost every apartment where we live is about 70 years old, some renovation would again be required, and again we’d have to live through it. Since we had an extra small room in the back of the new apartment, I decided to put the old server there, running conduits into the walls and everything. After months of renovation, the time to power up the old server finally came, and after discussing the capabilities of the Raspberry Pi with some work friends, the big idea popped up. Small, surprisingly powerful boards with lots of processing cores and RAM were available. Many people were using them as retro gaming consoles and even mini-PCs. The big questions were: “Is the technology there yet?” and “If so, which board to use?”
If successful, the new server would be way smaller, quieter, and more energy-efficient than the old one. At least five years had passed since my initial server build. A few weeks of research and the answer was clear: the technology was there, and the ODROID-XU4 was the obvious choice. Why? Its eight cores were way more than the two I had in the old server. Even having 2GB less RAM would not be a problem, as I wouldn’t be running Windows anymore, but lightweight Linux instead. Gigabit Ethernet would be perfect to feed all the devices in the new apartment. The ODROID-XU4 also has a decent GPU with hardware decoding capabilities, which was important, as I planned to use it not only as a server but as a video player too, just like the old PC server. Last, but very important, was the availability of USB 3.0. As the XU4 has no SATA interface (and even if it had one or two, that would not be enough), USB 3.0 made the perfect alternative. Its theoretical 625MB/s is way faster than any mechanical hard drive. In fact, when testing the HDDs I was using, the fastest one only delivered about 120MB/s, whether attached directly to a PC via SATA or through a SATA-USB adapter on USB 3.0.
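If you want to repeat that kind of quick sequential-read check on Linux, `hdparm` does the job. The device name below is just an example; double-check which device is which before running anything against a real disk:

```
# Timed buffered sequential reads from the disk itself
sudo hdparm -t /dev/sda

# Add -T to also measure cached reads (memory/bus speed, not the disk)
sudo hdparm -tT /dev/sda
```

If the number barely changes between a direct SATA connection and a USB 3.0 adapter, the drive itself is the bottleneck, not the bus.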
One important consequence was that I would no longer be able to use the old 3.5” HDDs. They are power hungry, bulky, and need 12V to run. I could have used a PC or ITX PSU to power them, but once again that solution would have been bulky and inefficient. So 2.5” HDDs came into play: they are small, resilient, power efficient, and silent, and they can run from just the 5V of a USB port if you have enough juice for them.
I ordered the ODROID-XU4 from Hardkernel and in a few days I had it in my hands. What a beauty! It even has an intelligent fan to reduce the noise. After a few more days, I got some HDDs and a self-powered USB 3.0 hub to start testing. Before all the tech stuff, here’s a comparison of the ODROID box that contained the XU4 and power adapter, and the old server. You can’t see it, but inside the box, along with the XU4 and its power adapter, there were seven HDDs with their SATA-USB adapters, a USB hub and its power adapter, and a lot of SD cards with OS images. Having all this stuff in a box about the same size as the old PSU was not bad at all.
In the final version of the old server, the PSU would no longer fit inside the case with the DVD drive (the red cable to the left is connected to it).
During the first tests, the powered USB hub turned out to be a bottleneck. A USB 3.0 port can only supply up to 900mA, which is not even close to the amount needed to run more than one or two disks, depending on the model used. Disks randomly spinning up and down during boot and general use made that very clear. The first powered USB hub I tried did not provide enough power for even two disks. I even tried running one disk directly from the XU4’s USB 3.0 port and another from the USB 2.0 port to divide the load, but the ones plugged into the hub kept shutting down.
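A back-of-the-envelope power budget makes the problem obvious. The per-disk figure below is an assumption (a typical 2.5” drive can peak around 1A at 5V during spin-up; check your drive’s label), but the arithmetic is the point:

```shell
# Rough USB power budget for a pile of 2.5" drives.
# ASSUMPTION: ~1000 mA per drive at 5 V during spin-up (varies by model).
PER_DISK_MA=1000
DISKS=5
USB3_PORT_MA=900   # what a single USB 3.0 port is specified to supply

TOTAL_MA=$((PER_DISK_MA * DISKS))
echo "Worst case: ${TOTAL_MA} mA, but one port supplies only ${USB3_PORT_MA} mA"
```

Five drives can peak at around 5A, so a hub whose power adapter is rated at 5A is about the practical minimum for that many disks.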
The solution was to find a decent powered USB hub. After more research and reviews, I decided to go with a Xcellon 10-port USB 3.0 hub. It’s made of aluminum, so it dissipates heat well. The 5A from its power adapter proved to be enough to run at least the five disks I now use; I never tested it with all seven I have.
Hardware tested, it was time to focus on the software. As running Windows was no longer an option, the question became which Linux flavor to use. As I said before, the old PC served as my home NAS, media player, gaming machine, and download center, using Sonarr and a torrent client. I would not accept losing any of those capabilities.
First, I focused on data availability and safety. I wouldn’t put my data on an unreliable server. The easiest and safest way I could find was OpenMediaVault (https://www.openmediavault.org/). It’s open source and has everything I need in an easy-to-use web interface, with tons of plug-ins. As it’s based on Debian, the OS choice was already made. I downloaded a Debian image from the Hardkernel forums and started installing everything. It took me a few weeks to figure it all out. OMV’s Greyhole plug-in would now do the job of merging all the disks into one. The SnapRAID plug-in would deal with data parity. OMV would handle the rest and let me configure the two. But just then, something occurred to me. After so many years, I no longer have a 56 kbps dial-up connection, but a 60 Mbps cable one. Protecting anything more than my family photos and personal documents is irrelevant. I can now watch most movies and series via streaming, or simply download them in minutes if they’re not available to stream. I don’t need that much disk space--maybe not even parity. Since Greyhole offers an option to replicate the data of a share across as many HDDs from the pool as I want, and since my sensitive data is not that large, simple double or triple copies would be enough.
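For illustration, that per-share replication is set with a few lines in Greyhole’s greyhole.conf. The share names and mount points below are hypothetical, and the database settings that Greyhole also needs are omitted:

```
# Disks that form the storage pool
storage_pool_drive = /mnt/hdd1/gh, min_free: 10gb
storage_pool_drive = /mnt/hdd2/gh, min_free: 10gb
storage_pool_drive = /mnt/hdd3/gh, min_free: 10gb

# How many copies of each Samba share to keep across the pool
num_copies[Photos] = 3       # triple copies for the irreplaceable stuff
num_copies[Documents] = 2
num_copies[Movies] = 1       # easily re-downloaded, one copy is fine
```

Greyhole then spreads the copies across different physical disks, so losing one disk never loses the only copy of a protected share.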
But since a local copy isn’t a backup, I’ve also configured this data to be synchronized with the cloud (a real backup), just in case. So, after a few weeks I left SnapRAID behind, as it was no longer needed. It may still fit your needs if you have a huge amount of sensitive, slow-changing data. I ran this system flawlessly for about two months, so the concept was right. It was time to bring the focus back to media playing and gaming. Digging into how to install Kodi on Debian, I stumbled upon the amazing work of Meveric: ODROID GameStation Turbo (https://forum.odroid.com/viewtopic.php?f=98&t=7322).
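One way to set up that kind of cloud sync is rclone. The remote name and paths below are assumptions for illustration; you would first create the remote with `rclone config`:

```
# Nightly one-way sync of the sensitive shares to a cloud remote.
# "cloud" is whatever remote name was set up via `rclone config`.
# Example crontab entry:
0 3 * * * rclone sync /srv/personal cloud:backup/personal --log-file /var/log/rclone.log
```

`rclone sync` makes the remote match the local side, so it pairs well with Greyhole’s local copies: replication protects against disk failure, the cloud copy against everything else.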
It’s based on Debian, but has a lot of optimizations for video playback and retro gaming. A few days later, I had it all merged. After two or three months of running just fine, I started to experience occasional system hangs. It turned out to be a dying SD card. As I could not recover it, the solution was to rebuild from scratch. I figured, hey, if I’m rebuilding the software, why not the hardware too? For months, a bunch of wires had been hiding behind my living room door.
With some creativity and a Dremel tool, I turned a two-drawer mini desktop organizer into the latest version of my server. Here are pictures of the process:
In this last build, I added one of the fans from the old server to help to cool down the HDDs and maybe extend their life-span. The HDD at the top holds the OS/download/temp files. I could have mounted it behind the fan to keep it cool too, but I decided to keep it more accessible in case of maintenance. It’s more reliable than the previous SD card, but not immune to failure. In addition, I can attach one additional HDD to the server via the USB hub, if needed. The whole system is so silent that it resides on the TV rack in the living room. If you don’t look at the LEDs, you can’t tell if it’s even powered on or not.
This is not to say everything was perfect. A few weeks ago, the system froze a few times. The cause was easy to figure out: overheating, again. The server was on the top shelf of the rack, in the same place the stereo is now. It is hard to see, but there is a 400VA APC UPS there too, in the back. The wiring goes out through a hole in the back of the lower shelf. The heat from the server and its power supplies, plus the UPS and a lack of ventilation, made the ODROID-XU4 run at around 79°C (174°F) all the time, even with no load at all. The HDDs likely ran just as hot, especially the one containing the OS. Under a medium load, the CPU and/or the HDDs--I forgot to check the temperatures--passed their limit, bringing the whole system to a halt. Swapping the stereo with the server solved the issue. The heat can now flow out through the wiring hole. Even after a six-hour series marathon, the server runs fine for weeks without glitching or freezing. The temperatures dropped by around 4°C (about 7°F). It could be better, but I can open the rack door whenever I need to. If the system didn’t freeze during a Brazilian summer with temperatures up to 43°C (109°F), I doubt it will ever freeze from overheating again. In the future, I may 3D-print a case for the server, but the current one fulfills my needs.
The saga is not over, and probably never will be. I’d love to upgrade to an XU5 with 4GB of RAM, H.265 and 4K support, and an even faster processor. It would be the perfect transcoding/playing platform. Running OMV combined with Emby or Plex would make the ultimate powerhouse for home transcoding and streaming. Imagine a C3 with embedded 5GHz AC WiFi: it would be THE combo, making any “Smart” TV look like a piece of trash in terms of power and flexibility. Right now, though, my setup is more than good enough for playing retro games, holding almost all of my media (H.265 is gaining ground along with 4K), serving my files all over the apartment, and managing and downloading my movies and series, all while staying small, silent, and power efficient. Any media that I can’t play on the XU4, I can play on my bedroom’s ODROID-C2 (another piece of art from Hardkernel).
So, this is it. I hope this helps others like me. If you have any further questions, you can find me on the ODROID forums (https://forum.odroid.com) as will_santana.