Meet An ODROIDian: Ernst Mayer, Mathematician Extraordinaire

Please tell us a little about yourself. My name is Ernst Mayer. I've been living and working in Silicon Valley for roughly the last 20 years, doing algorithmic and coding work, first for several tech startups and then for larger firms. Most of that work was related to EDA software for chip design. I've actually been semi-retired (in the sense that I still work, but mostly on my own research projects, not for pay) since getting laid off from my last large-company gig six years ago. I currently live in Cupertino, the heart of Apple country, though I never worked there. It's nice being close to the coastal hills, but the real estate prices and rents are really high. My sister and her family (husband and twin 9-year-old boys) live in the North Bay, so I see them fairly often.

I actually come from a science but non-computer-science background: my graduate degrees from the University of Michigan are in Aerospace Engineering and Mathematics. My PhD thesis was in theoretical fluid mechanics, specifically vortex flows: lots of differential equations, applied mathematics, and numerical analysis. My coding background coming out of college was scientific computing, that is, Fortran. I taught myself the rudiments of C and C++ after moving to Silicon Valley, and learned most of the rest of what I needed to know about those languages and CS-style algorithmics and data structures on the job.

Figure 1 - Ernst visiting the Canadian Rocky Mountains in 1986

How did you get started with computers? In the context of the kind of algorithmic and programming work I ended up doing, both by way of a career and ongoing research, it's important to note that I was, based on my early-college experiences, one of the most unlikely code geeks ever. I was a freshman at the University of Michigan in Fall 1981, and as such was a member of one of the last few engineering-program graduating classes made to suffer through the freshman Fortran-programming-for-engineers course as then constituted. The computer center was housed in a nondescript structure which sat literally under a pedestrian-overpass bridge, formally named the North University Building but known to everyone by its acronym, NUBS. Everything was based around a then-standard mainframe-based tiered-price timeshare setup, said mainframe being an Amdahl system which I'm sure made for a suitably budget-priced alternative to the market-leading IBM 360 series.

The real problem, as so often is the case, lay in the software: a non-IBM system meant a non-IBM compiler, and while I don't know what alternative compiler offerings Amdahl Corp. may have had, I do know that the users of the system were stuck with a somewhat-experimental compiler from Waterloo U. in Canada. The problem with it was that its error messaging was so cryptic (often just obscure hexadecimal error codes, lacking even a program line number) that for us newbies it effectively amounted to a binary syntax-error messaging system: 1 = "there are >= 1 syntax errors in your code, don't ask me where, so good luck finding them", and 0 = "congratulations! there are 0 syntax errors in your code, here are your (probably wrong) outputs."
Even that would have been annoying but workable, but for the second major issue: there were exceedingly few line-printer terminals and even fewer actual interactive terminals available, and those were 100% occupied by computer science grad students, so we were limited to punched-card machines, for which there were also often wait lines. So once you finally got a seat at one of those and transcribed your handwritten initial attempt at a program for the current assignment onto your little deck of playing-card-thick paper cards, you had to vacate your seat, take your card deck over to the nearest riffle-reader machine, maybe wait in line some more there, then head over to the paper-printout dispenser window to retrieve your program listing and, if you were lucky, your outputs. Got a syntax-errors-remain crypto-message? Spend time poring over your program listing, identify the likely error(s), then proceed back to the wait line outside the punch-card machine room. No syntax errors but errors in your outputs? More of the same. The final insult was that us bottom-rungers were allotted some ridiculously small amount of timeshare-account credits: I seem to recall $2 of said funny-money credits for the entire semester's projects, with no option to add more. Given the pricing system in place, the only way to turn said amount into a remotely reasonable number of the above-described debug cycles was to use the facility in the dead of night, when prices were at their cheapest. The net result was that even the simplest 50-line programming assignment almost invariably turned into a hellish all-night work session.

Figure 2 - Ernst in his office at Case Western Reserve University in 1997, in front of a DEC Alpha workstation, the first true 64-bit RISC architecture, which was a fine system for its day, but his ODROID-C2 has an order of magnitude more computing power

As a consequence, for the rest of my undergraduate days, the only kind of coding I did was on my trusty HP-41C programmable calculator. If someone from the future had come back and told my then-self that I would end up writing (almost entirely from scratch) and maintaining a program consisting of on the order of a half-million lines of code, I would've told them they were crazy. Of course fate, as it so often does, had other plans. While in graduate school, I earned extra money by working roughly 20 hours per week doing carpentry and maintenance work for a local landlord, and spent as many of my remaining waking hours as I could indulging my love of "bashing around in the great outdoors": rock climbing and summer mountaineering trips, cycling, martial arts.

In the summer of 1987, having completed my master's degree, I was preparing to start a PhD program in experimental fluid dynamics when, during a mountain-biking session, I took a nasty headfirst spill and ended up with a broken neck and paralysis from the chest down. So crawling around an equipment-filled experimental lab was out; math and computer work were in. Thankfully, by then the university computing labs had moved to workstations and PCs, so I was able to do most of my graduate-research coding on DEC VAX workstations with quality software, a vastly different experience than I'd had as a freshman.

Figure 3 - Ernst with Stanford's Donald Knuth and a bunch of fellow Mersenners at the Mountain View Tied House to celebrate the discovery of the 39th Mersenne prime, M(13466917), December 2001

What attracted you to the ODROID platform? As I noted in last month's prime numbers article in ODROID Magazine, after spending much of the past five years writing assembly code for the various flavors of the x86 SIMD vector-arithmetic instruction set (SSE2, AVX, AVX2+FMA3, AVX-512), last year I was also considering a first foray into adding SIMD support for a non-x86 processor family, and the ARMv8 CPU family's 128-bit vector support nicely fit the bill. After doing some homework regarding a low-cost but still reasonably high-performance development platform for that coding effort, the ODROID-C2 emerged as the top choice. I get about 50% greater throughput for my code on the C2 than on the Raspberry Pi 3. I also got performance benchmarks on a pre-release version of the ODROID-N1, thanks to an ODROID forum user who was selected as one of the beta testers for the N1, and it looks very promising. Using both of the N1's CPUs (a dual-core Cortex-A72 and a quad-core Cortex-A53), I get more or less the same throughput for the "little" A53 socket as for a C2 (which is based on the same quad-core CPU), and the "big" dual-core A72 CPU delivers about 1.5 times that throughput. Running jobs on both sockets simultaneously cuts about 10% off each of those single-socket throughputs, so we don't get quite 2.5 times the C2's total throughput, but it's still more than double that of the C2. So the C2 was definitely a good choice as my first ODROID.
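The N1 figures quoted above can be sanity-checked with a quick back-of-envelope calculation. This is just a sketch using the throughput ratios from the interview (the variable names and the flat 10% contention penalty are illustrative assumptions, not fresh measurements):

```python
# Back-of-envelope check of the N1 throughput figures, in units of
# one ODROID-C2's throughput. Numbers are the ratios quoted in the
# interview, not new benchmarks.
a53_alone = 1.0   # quad-core Cortex-A53 cluster: roughly one C2
a72_alone = 1.5   # dual-core Cortex-A72 cluster: ~1.5x a C2
penalty = 0.10    # ~10% throughput lost per cluster when both run at once

combined = (1 - penalty) * (a53_alone + a72_alone)
print(combined)   # 2.25: short of the naive 2.5x, but more than double a C2
```

Which matches the "not quite 2.5 times, but still more than double" figure in the text.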

How do you use your ODROIDs? The 4-5 months after buying my C2 last year were spent doing heavy-duty inline-assembly coding, debugging, and performance tuning. Since releasing the ARMv8 code, I've used my C2 pretty much the same way I hope other users of my code will: large-prime hunting for the GIMPS (Great Internet Mersenne Prime Search) distributed-computing project. (In case anyone is wondering, I neither chose the project acronym nor take any offense from it.) It's also handy to have a machine with a different version of Linux and GCC installed than either my MacBook or my big Intel quad-core system, in case I need to track down a build issue that looks like it may be compiler- or OS-version-related.

Which ODROID is your favorite and why? The ODROID-C2 of course, at least until the N1 goes on sale.

What innovations would you like to see in future Hardkernel products? I'm a throughput hog, so I guess my answer boils down to "more sockets!" In particular, I'm interested in how many ODROID boards it would take to compete with a high-end Intel quad system, and how the respective prices for those similar-throughput options compare. Based on the relative performance of my Intel Haswell quad and the ODROID-C2 and N1, we'd need around 20 C2s to match the Haswell, and around 10 N1s. Once one gets down to "around 10", that kind of compact multi-board bundle is no longer unreasonable to contemplate. The price, based on the estimated retail price of the N1, is still a little higher than one would like, but not by much. Anyway, those are the kinds of daydreams this ODROIDer has. I also think Linux micro-PCs are a great way to get kids interested in computers in a way that avoids the "walled garden" effect of PCs running proprietary OSes; some kind of educational-outreach initiative to get such systems into the hands of low-income schoolchildren would be worthwhile for the Hardkernel folks to look into. That's the kind of thing that might attract government or private-foundation grant money to sponsor it.

What hobbies and interests do you have apart from computers? I've always been a person who likes to work not only with his head but also with his hands. One thing coding and mathematics lack is the tangible satisfaction that comes with physically building something, so I always like to have some kind of handicraft project going to fulfill that need. A couple of years ago I built a really sturdy workbench using salvage lumber, mostly discarded heavy-duty computer-equipment shipping pallets. This winter's project was to build a display mount for a large (45 kg) iron meteorite I'd bought some years back, out of a travertine limestone base topped with a block of natural sandstone, drilled to hold three lengths of steel rod acting as a tripod to cradle the meteorite. The drilling proved to be the hardest part: one expects sandstone to be fairly soft and easy to work, but this block was sand which had apparently eroded from some kind of hard mineral, and it ended up taking a diamond-encrusted hollow-core drill and hours of steady heavy pressure using a drill press, with water to lubricate things. I went through a lot of ibuprofen that week!

Figure 4 - Ernst is currently building a display mount for his iron meteorite

What advice do you have for someone wanting to learn more about programming and mathematics? Find a problem that really interests you which can serve as a learning vehicle for these subjects. I've had several such problems in my career: my PhD research had aspects of differential equations, asymptotic analysis, perturbation theory, and linear algebra and eigensystems, and my prime-number work involves large-integer arithmetic and signal-processing algorithms, vector-arithmetic assembly code, plus a fascinating, rich history featuring some of math's brightest luminaries. The world of science is full of such interesting problems; believe me, you'll know when one grabs hold of you. Making time in our distraction-filled and money-ruled modern world to pursue it is perhaps the trickiest issue.

Figure 5 - Ernst’s first contribution to the field of computational number theory in 2002
