

Blog Posts.

My New Year’s resolution: 5120×2880

Current setup, 2023 edition

Alternate title: you want to experience true level? Do you?1

I have successfully converted a late-2015 27-inch Retina iMac into a standalone 5K display. After umm-ing and ahh-ing about it for a few weeks while I debated whether I wanted to potentially irreversibly disassemble a perfectly working iMac, I removed all the internals of the iMac and replaced them with a 5K driver board that I bought from AliExpress, turning the whole thing into the cheapest standalone 5K display money can buy. Not to mention the only one you can drive using regular old DisplayPort, no Thunderbolt required.

Why? All for an extra 55 PPI compared to readily available and much cheaper 27-inch 4K displays? Well, there are two main reasons you’d want to use a 27-inch 5K display compared to a 27-inch 4K one. But first, we need some Retina backstory…

Apple cops a lot of flak for introducing marketing terms without concrete technical specifications to back them up, and perhaps the best example of this is the term “Retina”. The Retina display article on Wikipedia actually has a pretty good explainer if you’re interested in the origin of the term, as well as the derived definition, based on what was said when it was first introduced with the iPhone 4, the first-ever Retina-class device. Remember, no concrete technical specs means we have to infer based on what we’ve been told at Apple launch events, but the inferred definition seems to hold up.

The term Retina has been somewhat diluted now. A handful of prefixes and suffixes have been added to it to denote other variations on the theme, but whatever the marketing connotation, part of Apple’s theory behind Retina-class displays is that a display needs a certain pixel density at a certain viewing distance before you can no longer see individual pixels on it. Obviously this assumes you have perfect eyesight, but putting that aside for the moment: for phones, that PPI figure is typically a lot higher than for laptop or desktop displays, because you’re typically holding your phone a lot closer to your face. Holding a display closer means you need a higher pixel density before you can’t discern individual pixels; hence the higher PPI.2

You can do the maths yourself using any freely-available calculator, and Wikipedia has the actual formula. If you do, you’ll realise that theoretically, any display can be Retina if you’re sitting far enough away. For example, a 27-inch display using a very typical resolution of 2560×1440 is technically Retina from 80cm away. But I don’t know that many people who use their desktop displays from that far away, so not only do we have to start sitting closer, we have to go deeper into the Retina rabbit hole.
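You can sketch that calculation yourself in a few lines. Here’s a rough Python version of the formula, assuming perfect 20/20 vision and the one-arcminute-per-pixel threshold the Wikipedia article uses (the function names are mine):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel with the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_distance_cm(panel_ppi: float) -> float:
    """Viewing distance beyond which one pixel subtends less than one
    arcminute of vision -- the commonly inferred threshold for 'Retina'."""
    pixel_in = 1 / panel_ppi
    distance_in = pixel_in / math.tan(math.radians(1 / 60))
    return distance_in * 2.54

print(round(ppi(2560, 1440, 27)))                      # 109 PPI for 27-inch 1440p
print(round(retina_distance_cm(ppi(2560, 1440, 27))))  # ~80 cm, as above
print(round(ppi(5120, 2880, 27)) - round(ppi(3840, 2160, 27)))  # the 55 PPI gap
```

The last line is where the “extra 55 PPI” figure earlier comes from: a 27-inch 5K panel is about 218 PPI versus about 163 PPI for 27-inch 4K.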

For simplicity’s sake, Apple also considers Retina to be a perfect multiplier of “standard” display resolutions. If we can’t change the viewing distance in the Retina formula, we can simply put more pixels into the same space. Turning one pixel into four quadruples the total number of pixels while keeping everything else the same, creating a sharper interface at the same physical dimensions. Earlier iPhones used this simple “2x” formula, with two times the number of pixels in both dimensions being four times the total number of pixels, but modern iPhones use a 3x scale, which is nine pixels for every one on the original iPhone. For desktop displays, that means either doubling 1920×1080 (1080p, or Full HD) to give us 3840×2160 (4K), or doubling 2560×1440 (1440p) to 5120×2880 (5K). Apple refers to this as “HiDPI” mode.
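The multiplier arithmetic is trivial, but it’s worth seeing laid out (a quick sketch of my own, not an Apple API):

```python
def hidpi(logical_w: int, logical_h: int, scale: int = 2) -> tuple[int, int]:
    """Physical resolution needed to drive a logical resolution at an
    integer Retina scale factor (2x on Macs, 3x on modern iPhones)."""
    return logical_w * scale, logical_h * scale

print(hidpi(1920, 1080))   # (3840, 2160): 4K is pixel-doubled 1080p
print(hidpi(2560, 1440))   # (5120, 2880): 5K is pixel-doubled 1440p
print(hidpi(320, 480, 3))  # (960, 1440): 3x of the original iPhone's screen
```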

What this means for us is that you can absolutely use a 27-inch 4K display in HiDPI mode; it will just look like a 1080p display, with four physical pixels representing every virtual one. Which brings us to our first problem. If typical 1080p displays are usually in the 20-24 inch range, a 27-inch display that looks like 1080p is too much physical screen for the virtual screen real estate you get. Everything looks too big, which is where the magic of display scaling comes in.

But now we’ve introduced a second issue! Yes, you can use a scaled resolution on your 27-inch 4K display. Instead of the “native” pixel-doubling you would get by using the 3840×2160 physical pixels to represent 1920×1080, you can set your display to render 5120×2880 virtual pixels (which corresponds to a native pixel-doubling of 2560×1440), then downscale that to fit the real 3840×2160 pixels your display has. Doing so works, and fixes our issue of everything being too big, but it comes with its own set of issues, as per Bjango:

However, display scaling comes with some significant caveats, including a blurrier picture, shimmering when scrolling, moiré patterns, worse GPU performance, and worse battery life if you’re using a laptop. Display scaling also undoes dithering, which can mean gradients aren’t as smooth. With those issues in mind, it’s far, far better to run macOS at the pixel density it was designed for.

Check out their pictures and GIFs at the link, and you’ll be able to see the difference. Some of those issues aren’t as significant as others, but the biggest one for most people who care about this sort of thing is how using a scaled resolution makes your whole display look less crisp.
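To make the render-then-downscale step concrete, here’s a toy model of what a “looks like 1440p” mode does on a 4K panel (a simplification of my own, not how macOS literally implements it):

```python
def scaled_mode(looks_like_w: int, looks_like_h: int,
                native_w: int, native_h: int):
    """Model a macOS 'looks like' scaled resolution: render the UI at 2x
    the logical size, then squeeze that framebuffer onto the panel."""
    render = (looks_like_w * 2, looks_like_h * 2)  # HiDPI backing store
    downscale = native_w / render[0]  # <1.0 means supersample, then shrink to fit
    return render, downscale

# "Looks like 2560x1440" on a 27-inch 4K (3840x2160) panel:
render, factor = scaled_mode(2560, 1440, 3840, 2160)
print(render)  # (5120, 2880) rendered internally
print(factor)  # 0.75: four rendered pixels squeezed into every three physical ones
```

That fractional 0.75 factor is exactly why scaled output can never be pixel-perfect: rendered pixels no longer map cleanly onto physical ones, hence the blur and shimmer Bjango describes. On a true 5K panel the factor is 1.0 and the problem disappears.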

Continue Reading →

The new NAS

PC parts for a custom NAS build

It’s close enough to 2023 now, and off the back of my QNAP being on borrowed time, it’s time to think about a new NAS. I’ve had a couple of NAS iterations over the years, starting off with a $200 HP MicroServer, then the aforementioned QNAP, and now, whatever I want to go with next.

I could, of course, go with another consumer-grade NAS like a Synology. Or even a QNAP, if I’m feeling particularly brave. Apparently, Synology units with processors that had the LPC CLK issue weren’t affected to the same degree as QNAP units were, because they implemented their LPC interfaces at 1.8V, preventing 2V across that circuit from being an issue like it was in the QNAP units. That, possibly in combination with a firmware update that somehow mitigated the issue, meant that a Synology unit would have been the more reliable choice at the time, and we wouldn’t even be having this discussion if I had purchased a Synology instead of a QNAP back in 2016.

Alas, I did, and we are.

Which brings us back to the original question: what kind of NAS do I want in 2023, one that will hopefully last 5-7 years, if not more?

I’ve been thinking about this for a while now. There are definite, distinct advantages to having an all-in-one unit like a QNAP or Synology. You get the smallest possible chassis, minuscule power usage, and the entire software experience that buying a QNAP or Synology gets you, which means that even if you’re going to run your own Docker containers and just use the software to manage your storage, it’s definitely a more cohesive, user-friendly experience compared to rolling your own OS. But even after all that, I keep being drawn to the idea of building my own PC to serve NAS duties.

Why? It comes down to hardware, both in terms of choice and flexibility.

Buying a consumer-grade NAS like a QNAP or Synology means you’re buying into their ecosystem, with all of the advantages and disadvantages that entails. Yes, you can upgrade the RAM and install your own drives, but that’s about it in terms of upgrades. With the exception of some of their pricier units, you can’t drop in a PCIe card to add discrete graphics, or more M.2 drives, or even 10 GbE, if that ever becomes a thing at home. Maybe it will, maybe it won’t.

There are some Synology units that let you add a PCIe expansion card with 10 GbE and more M.2 slots (in addition to the ones you already have), but by the time you pony up for one of the pricier Synology units and the PCIe expansion card, you’ve basically spent as much as you would have picking your own parts and building your own PC from scratch, with none of the benefits of custom hardware. It’s a trade-off. I think it makes far more sense to buy a QNAP or Synology NAS instead of building your own than it does to buy a pre-built gaming PC from a major computer retailer like Dell or HP, purely because you’ll get more value out of a consumer NAS than you do out of a gaming PC that uses non-standard parts and layouts. You’re far more likely to want to upgrade your gaming PC within its expected lifetime than you are your NAS, and at that point you’ll appreciate standard PC components far more than you would if you were upgrading your NAS. But I digress; that’s a topic for another time.

When you’re building your own NAS out of commodity PC hardware, you have the complete freedom to choose which standard PC components you want, and the flexibility that affords you down the line. You might not ever need to upgrade your QNAP or Synology CPU in the lifetime of your NAS — but don’t you wish you could, when something better comes along?

But if there was a single reason I wanted to build my own NAS, it comes from being able to have access to hardware transcoding. Specifically, Intel Quick Sync Video.

While video transcoding isn’t generally a problem for me right now, that’s not to say it won’t be in the future. The Celeron J1900 in my current QNAP supports Quick Sync, and I haven’t had an issue streaming most of my content to iOS devices via Plex, thanks to the wonders of direct play and most of my content being in a format that’s compatible with my devices. But Quick Sync support for different codecs and formats varies between CPU architectures. My current CPU, while it supports H.264, will only decode HEVC H.265, not encode it, with zero support for newer video codecs like VP9 or AV1, or even the 10 or 12-bit HEVC H.265 that’s sometimes used by HDR versions of videos. I don’t currently have Quick Sync working on my QNAP, but that is probably a configuration issue on my part; it’s entirely possible I haven’t set it up correctly in the Plex container.
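The direct-play-versus-transcode decision boils down to a simple check. Here’s a hedged sketch; the support table is illustrative, based on the codec support described above, and the function is my own, not Plex’s actual logic:

```python
# Illustrative Quick Sync support table: decode/encode pairs per codec,
# roughly as described above for a Bay Trail-era Celeron J1900.
J1900_QSV = {
    "h264": {"decode": True, "encode": True},
    "hevc": {"decode": True, "encode": False},
    "vp9":  {"decode": False, "encode": False},
    "av1":  {"decode": False, "encode": False},
}

def playback_path(source_codec: str, client_codecs: set[str], qsv=J1900_QSV) -> str:
    """Decide (simplistically) how a media server like Plex serves a file."""
    if source_codec in client_codecs:
        return "direct play"          # no transcoding needed at all
    support = qsv.get(source_codec, {})
    if support.get("decode"):
        return "hardware transcode"   # QSV decodes, then re-encodes to something playable
    return "software transcode"       # back to sheer CPU grunt

print(playback_path("h264", {"h264", "hevc"}))  # direct play
print(playback_path("av1", {"h264", "hevc"}))   # software transcode
```

The last line is the future I’m trying to avoid: any codec missing from the hardware table falls through to the CPU.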

Not supporting hardware-accelerated video encoding/decoding means we’re back to software decoding. And if YouTubers are to be believed, AV1 is going to be the next big thing, so even if we have to wait a couple of years for it to be adopted by content farms, won’t I be glad I picked a 12th-gen CPU that can decode AV1 in hardware, as opposed to some Ryzen chip that would have had to rely on sheer CPU grunt to do it in software?

While this might not be a big deal right now, it’ll matter if everyone starts using the royalty-free, and even more efficient, AV1 format. If that happens within the remaining lifetime of my QNAP, it’ll be an issue for me, because it will mean I’m back to software decoding everything. I’m using software transcoding now, and it’s an extremely poor experience on a quad-core 2.0GHz CPU, even on my local network. The good news is, only Intel Arc has hardware AV1 encoders, which means everyone else has to throw CPU grunt at the problem if they want to encode their content in AV1.

The other main advantage of rolling your own NAS hardware is that you can run whatever OS you want on it. While there are technically ways you can run other OSes on QNAPs or Synology units, it’s a hack. Building my own NAS lets me choose between straight Linux, like whatever version of Ubuntu that I ran on my HP Microserver, or the more storage-focused flavours of Linux/BSD like Unraid or TrueNAS. TrueNAS in particular is interesting because it is known for natively implementing OpenZFS, which is generally regarded as the best storage-focused filesystem. I don’t currently have a need to run any of the crazier storage configurations afforded by ZFS as I’ll be limited by the hardware and case that I’ve chosen (at least to begin with), but it’s nice to know they’re an option, if I decide to do that later on down the track.

Continue Reading →

The QNAP of Death

Alternate title: the day my NAS died

A QNAP NAS with System Booting text

Not quite the same System Booting text I was greeted with, but close enough. Excuse the dust.

System booting? Yes, but the system has been booting for literally hours now. If it hasn’t booted within five minutes, there’s something wrong.

And dear reader, there was indeed something wrong. I tried all the usual stuff: turning it off and on again, leaving it off for a couple of days, pulling all the hard drives out, turning it off and on in between all of those steps, but nothing worked, nor did it give me any kind of video output to indicate what might be wrong. It turned on, but wouldn’t boot into the OS. That probably should have been my first clue that although something was wrong, maybe it wasn’t completely dead. And if it wasn’t completely dead, then maybe there was something we could do to fix it.

But after unplugging every piece of hardware I had added to the QNAP and returning it to the stock hardware configuration, the thing would still not boot up properly, giving me that same error message. System booting. Whatever was wrong with it, it wasn’t because of something I had added or done to the system, which probably meant it was hardware-related. Ugh.

With my extensive troubleshooting prowess exhausted, it was time to turn to old mate Google.

Google immediately led me to a 100-page forum thread about the issue on QNAP’s own forums. This was either very good, or very bad. In my case it was initially very bad, because it meant I had to read through most of it, but then things turned out very good, because within those 100 pages was the trifecta: a known recurring issue, exact steps to diagnose that specific issue, and a fix that worked for enough people to be considered the official unofficial fix.

The problem, as it was described, is some kind of “degraded” LPC clock. As I understand it, there’s a timing component that keeps things in your PC running on time for low-pin-count (Intel’s definition of low-pin here actually means 1170 soldered pins) processors like the Intel Celeron J1900 in the QNAP that I have. In some systems, including my QNAP and even some Synology units, the circuit for this LPC clock degrades over time due to “reasons”, and eventually reaches a state where it fails to provide a stable clock to the system, meaning the CPU doesn’t work like it should. Or something along those lines, anyway.

According to the forum post, it’s remarkably similar to an issue that affected Intel’s C2000 Atom processors, which Cisco and Synology both issued advisories about all the way back in 2017, although that case was slightly more serious, as it caused C2000 Atom-equipped gear to fail after as little as 18 months. In the case of my QNAP, it lasted over six years. Not bad, but buyer beware, I guess; not that you’d be able to tell this kind of thing at the time of purchase.

Thankfully, diagnosing the issue is pretty easy. Use a multimeter to measure the voltage between some pins or pads on the motherboard, depending on your specific model of QNAP, and if the voltage shows over 2V, your LPC CLK is likely broken and needs to be fixed if you want to use your NAS again.

The fix is easy enough as well. Because we need to drop the voltage of the LPC CLK signal, we can drop in a resistor. Experimentation by some helpful forum members indicated that a 100 Ohm resistor, soldered between the “negative cycle transistor” and ground, will restore the voltage to a correct value to allow the LPC CLK to supply a correct clock signal to the CPU.
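As a back-of-the-envelope sketch of why a pull-down resistor lowers the voltage on the line: the source impedance of the failing driver and the added resistor form a simple voltage divider. The numbers below are invented purely for illustration; the actual impedances of the LPC circuit aren’t documented in the thread:

```python
def divider_out(v_in: float, r_source: float, r_pull: float) -> float:
    """DC level seen on the line after adding a pull-down resistor:
    the source impedance and the pull-down form a voltage divider."""
    return v_in * r_pull / (r_source + r_pull)

# Purely illustrative numbers: a failing driver sitting at ~2.4V with
# ~35 ohms of effective source impedance gets pulled back under 2V
# by a 100 ohm resistor to ground.
print(round(divider_out(2.4, 35, 100), 2))  # 1.78
```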

Simple, right?

There was just one problem. Well, besides “the problem”. I don’t own a multimeter, nor a soldering iron. Oh, and I don’t really know how to solder. I’ve soldered before, but I wouldn’t say I’m particularly good at it. But as my old swimming coach used to say, no one is born knowing how to swim or solder, so I grabbed a cheap and cheerful soldering iron, some solder, a multimeter, and prepared myself for the hackiest soldering job in the world. Yes, it was really that bad. No, I didn’t trim the ends of the resistor. Yes, I probably should have. Yes, I managed to melt a little plastic connector next to where I was soldering, but in my defence, it was impractical to pull out the entire motherboard for easier access, so I kind of had to do it in situ while it was still attached to the case, which made it all the more awkward. No, I’m not going to show you a picture. Suffice to say, I got the job done. Just.

After all was said and done, and I put my drives back into the system, it booted up just fine. Not that I didn’t expect it to, given so many other people had had success after attempting the same fix, but it was still a relief. Getting the system back up and running again meant I didn’t have to try and go to lengths to recover the data I cared about, never mind wondering what was on there that I might have forgotten about in the first place.

I wish that was the end of the story. Alas, the forum had one more golden nugget of information to dispense, and that was that the fix was only temporary. Continued degradation of the clock timer was inevitable, and the next time it failed, there was no guarantee it would be fixable with any kind of resistor. It was hard to estimate how long the fix would work for, but six months to a couple of years seemed reasonable. Reasonable, but only if you were willing to put up with the fact that your NAS might die at any moment, and maybe even be completely unrecoverable from that point on.

Which worked for me, because now I knew that it was on the way out, it was time to build a replacement.

Retina Displays, Part Three

An iMac with its screen attached with electrical tape

Yes, the screen is attached via electrical tape. Temporarily, because attaching it with adhesive is a one-time thing.

By some fortuitous mechanism that can only be described as “scoring something off the local computer forum” (close enough to be my own version of “scoring something for cheap off Craigslist”), I am now the proud owner of a late 2015 27-inch iMac. Yes, that’s right, the one with a 5K display. Now to be clear, there’s something wrong with this iMac, which is why I was able to get it for so cheap, but I figure if I can put a little work into it to get it working again, then that’s time well spent, in my books.

It’s a pretty decent-specced machine, too: a 4GHz quad-core Intel Core i7 (as it turns out, the same i7-6700K CPU that’s in my current gaming rig), maxed out with 32GB of DDR3 RAM, and an AMD R9 M395 with 2GB of VRAM. It’s in great condition as well. There are some very minor, superficial scratches on the display that you can only see if you’re looking for them when the screen is off, but otherwise no chips, dents, or scratches anywhere else on the screen or on the external aluminium enclosure.

The issue with this particular iMac was that its internal 3.5-inch hard drive was dead. These iMacs came standard with a Fusion Drive, which was Apple’s term for a SSD and HDD combo that was supposed to give you the best of both worlds in terms of speed and storage. This particular iMac had a 128GB SSD that still seemed to be working, but its hard drive was only good for making clicking sounds and being recognised as an unknown 4GB storage device, which I’m taking to mean it was one of those even weirder hard drives with built-in SSD caches, or something.

Installing the latest supported OS on the thing turned out to be a nightmare, even for someone who has had some experience with Macs. No matter what I did, macOS Monterey would get halfway installed before rebooting and restarting the install process. At first I thought this was some kind of reboot loop — an issue I also experienced with my own late 2013 MacBook Pro when attempting to install Catalina or Big Sur, I can’t remember which — but no matter how many times I tried, I couldn’t get Monterey to finish installing. Catalina, on the other hand, installed fine. I had issues getting Big Sur install media working, so for the time being, it was Catalina or nothing.

I suspected my Monterey install issues were related to the failed internal hard drive. That seemed to line up with the random kernel panics I was getting in Catalina, where it seemed like macOS would attempt to access the internal hard drive, realise it was dead, and then freak out and fall over when it didn’t know what to do. It seemed Catalina was as far as this thing was going. At least not without opening it up, disconnecting the hard drive, and then trying again… which is what I ended up doing about two weeks later, after some tools arrived for me to open it properly (and put it back together again).

See, my original plan for this machine was to turn it into a standalone 5K display. I had first heard of the idea via The Sizzle, a great tech newsletter by the founder of MacTalk. By pulling out all the internals of the iMac and replacing them with a relatively inexpensive — I’d hesitate to call it cheap, but it was definitely cheaper than buying a standalone 5K display — driver board you can buy from AliExpress, you can turn your 5K iMac into a standalone display that you can drive via regular old DisplayPort, which is great if you’re after none of the complexities that come with a Thunderbolt-driven 5K display1. A YouTuber probably wasn’t the first person to turn a 5K iMac into a standalone display, but they probably contributed to popularising the idea.

Like I mentioned in my other post on Retina-class displays, there are very few options if you want something other than the not-quite-Retina-class 4K at 27-inches, and none of them can be had for under $1800. With any luck, this iMac conversion will be a third of that. Not cheap — you could easily buy a great display for $600, even if it’s not quite 4K 144Hz, but 1440p 144+ Hz is easily doable — but much less expensive than what a new 5K display would set you back.

Continue Reading →

My iPhone 11 Pro Home Screen

It’s been three long years since I last had a new iPhone. While my iPhone 11 Pro remains a best-in-class smartphone that would undoubtedly serve me well for at least another couple of years, it’s time to move on.

But before we can do that, I kind of need to break down the apps that are currently on my iPhone 11 Pro home screen. For posterity’s sake, if nothing else, but mainly so I can look back on this one day and reflect fondly on the interfaces and design paradigms of the era, for a time in the not-too-distant future when we’re all using augmented reality interfaces.

I’m writing most of this while waiting for the delivery of my iPhone 14 Pro. I should have had it delivered to work and had someone else sign for it, because the worst part of having it delivered to your home address is needing to listen intently for a knock at your door, or if you live in an apartment building like I do, the ringing of your wall phone to indicate someone has called your apartment from outside. Now I’m stuck here at 6pm like some kind of hostage, waiting for a delivery that would ordinarily, on any other day, have been here already. But maybe there are lots of iPhones on the StarTrack truck today, so the deliveries are taking a little longer than normal.1

In the future I think if I’m not getting my iPhone delivered to work, I’ll probably pre-order it for in-store pickup. If I take the day off work I can take it slow, pick it up from the store first thing in the morning, and then spend the rest of the day transferring all my stuff. Maybe taking the single day of annual leave is worth it.

It’s been a long time since my last iPhone home screen post. The iPhone X launched in 2017 to much fanfare because it was the biggest change to the iPhone silhouette in the entire history of the iPhone. The distinctive top and bottom bezels were gone, replaced by an almost uninterrupted edge-to-edge display that had a small notch at the top. The home button that had long contained a fingerprint Touch ID sensor was replaced with an upgraded front-facing camera system and biometric unlocking system called Face ID, which shone a pattern of infrared LEDs into your face so you could be recognised by your phone. Combined with an all-new gesture-based navigation system and an OLED display with curved corners that touched every edge on the front of the device, it was the future of iPhones for years to come.

The iPhone 11 Pro wasn’t that much different to the iPhone X in terms of the screen. Apple added an ultra-wide camera to the back, bumped just about every spec they could across the board, and that was about it.

Which might be why, although my home screen looks a little different, my apps stayed more or less the same.

Once upon a time, choosing which apps to put on your home screen was a challenge because there were so many apps and only a 5×4 grid to put them in. There might be more and better-quality apps now, but because we have folders and search, deliberately choosing which apps go on your Home Screen, versus those that are relegated to a folder, or worse, the App Library, remains as much of a challenge now as it was then, even if it’s for different reasons.

I’m still using a modified CGP Grey home screen organisation method, although the only tenet I choose to adhere to is “only one page of apps”. With some mental gymnastics I can claim to have a free row, although I definitely have four apps in my dock. The dock’s real estate is too important not to, but for big screen phone reasons, not because it’s the one row that stays static over multiple pages of apps, because the latter isn’t something that I have to consider when I only have one page of apps anyway.

Speaking of which, let’s get into the apps.

Continue Reading →

Stories from the road: I miss photography

An almost-deserted Bourke Street Mall in Melbourne

It’s September 4th, 2022. I’m in Melbourne for the first time in a long time. It’s been nearly 8 years since PAX 2014, and while there have been a few PAX events in between, and several interstate and overseas trips since, for some reason, I haven’t been to Melbourne in all that time. Proper Melbourne, as opposed to just transiting through.

And it’s every bit as good as I remember.

I was supposed to go to Melbourne earlier this year with friends. But ol’ rona was still a thing, and I didn’t think it was the best idea. I might have been right, too, because everyone that went caught it and ended up staying an extra week before they could travel back to Queensland.

It’s September now, and ol’ rona is still a thing. It definitely seems like it will continue to be for the foreseeable future, if that wasn’t clear before. Eventually, though, people are going to have to decide for themselves what kind of risk they’re willing to accept, because the alternative seems similar to becoming something of a complete recluse.

But this isn’t about rona, or travel. It’s about photography.

A little while ago I took out my Bessa, only to find that the battery was flat after not using it for a while. I replaced the batteries, and a quick test shot resulted in some kind of stuck shutter. After panicking a bit, I did a little searching online and discovered it was a common enough issue that people had come across it before. A short bit of percussive maintenance later, the shutter was un-stuck and Bessie was working normally again.

I do feel a little guilty about putting down my camera. I’ve hardly done any photography since moving to Brisbane, so much so that any film I had brought up with me from Hobart expired a little while ago. By “a little while ago”, I mean a few years ago, so yeah, you could say it has been a while.

But it wasn’t until I went to Melbourne to see the sights and sounds that I realised how much I missed taking photos. I heard from my friends who went to Melbourne earlier this year that the city was so much different post-Covid, that it seemed less lively and a shadow of its former self, but if that was the case, I didn’t see it. Melbourne city seemed about the same as I remember from all those years ago, even if it wasn’t as busy as it was pre-Covid.

I ended up taking a few shots with my iPhone 11 Pro, and compared to the iPhone 6 that I had the last time I was in Melbourne, the versatility and quality of the 11 Pro camera system was leaps and bounds ahead. Not entirely unexpected given the multi-generational gap between the two, but phone cameras have performed wonderfully in great lighting conditions for years now. Probably since the iPhone 7 or iPhone X, now that I think about it.

But as much as I liked the photos coming out of my iPhone, it made me miss a standalone camera. Taking photos with an iPhone felt like cheating, somehow, because it was all too easy to get good photos. Point and click, right? With iPhone, anyone can be a photographer. And that’s great! But taking photos with a real camera feels nicer, somehow, like you’re a little more involved in the process rather than letting a bunch of computers and algorithms do all the photography for you.

Melbourne made me miss taking photos.

I miss taking photos with a real camera, and the only fix is to start taking photos again.