

Blog Posts.

But what about the Steam Deck?

The keen-eyed amongst you will have noticed I failed to mention the technically-handheld, borderline-portable Steam Deck in my last retro handheld post. And it’s a curious omission of the handheld that arguably kicked off the portable-computer-as-handheld-gaming-console era that we have today. What makes it downright weird is that I’m arguably part of the ideal target market for the Steam Deck. I have a large, expansive Steam library that’s a mix of triple-A titles (although there’s much less of that these days), some of the most popular indie games, and plenty of releases from years gone by, and yet somehow, I haven’t picked up a Steam Deck in the two and a half years that it’s been on sale.

And while there are a couple of reasons for that, I’m just not sure the Steam Deck is a product I’m really interested in. For one, it isn’t officially available in Australia, so I’d either have to import it myself or pay a markup to the Australian company that imports them, which adds to the already significant cost. The other major reason I don’t own a Steam Deck is that it’s kind of expensive for what it is. It’s a fairly serious investment, on the same level as buying a new home appliance, a new graphics card, or even an iPad, all of which you would probably get more use from. If you already have a good-to-great gaming PC, the only drawcard the Steam Deck has for you is being able to play titles portably, even away from your home.

And the crazy thing is, the specs aren’t even that good. Phones from the same era have high-refresh-rate 1080p displays, but the Steam Deck is stuck with an (admittedly larger than normal phone size) 7.4-inch, 800p display at 90Hz. But if you compare the Steam Deck’s cost to what it can do, then yes, maybe the cost is justified. Being able to play triple-A titles without shelling out for a complete gaming PC is a pretty impressive drawcard indeed.

The problem with doing this, particularly when you already have a great gaming PC, is that playing games portably sounds like a good idea until you realise it’s a compromise in basically every way. Playing away from home sounds like a good idea, at least for the few hours of battery life that you get if you’re playing heavier titles. Sure, you can extend your battery life by turning down the graphics options, but then it becomes a question of how much of a trade-off between battery life and graphical fidelity you’re willing to make. And while you can get perfectly fine battery life if you’re playing lighter, easier-to-run games, you’re really telling me you forked out for an entire Steam Deck so you could play Stardew Valley outside your house? I’m not one to yuck someone else’s yum, but that seems like a pretty wild decision to me.

But none of this is new if you’re a member of the PC master race. There’s always been this balancing act on your current hardware: which settings you can tweak to hit an acceptable frame rate versus keeping enough graphical bells and whistles to make you feel immersed. If there’s one thing Valve really accomplished with the Steam Deck, it was that they put this power of choice in the palm of your hand. Well, hands, given how large the Steam Deck is.

While consoles have mostly been immune from this, in the past few years we’ve definitely seen games that give console gamers the choice between a lower resolution, fewer graphical effects, and a higher frame rate, or a higher resolution, more graphical effects, and a lower frame rate. It’s increasingly common to see games offer the choice between 4K at 30 fps, or something like 1440p at 60 fps or more.

What’s interesting about all of this is that if it’s retro emulation I’m interested in, the Steam Deck has been able to play GameCube and PS2 titles portably since it launched, and I hear battery life when emulating retro consoles is even acceptable. Not great, mind you, but acceptable. But one of the major drawcards of the Steam Deck, and why it commands such a high price in the first place, is that it can play PC games. Again, you can do what you like, but buying a Steam Deck for retro emulation seems like a strange decision when there are other devices that can emulate retro consoles just as well with better battery life. Well, one or two, and only within the last year.

But as someone from the Game Boy Pocket generation (I never owned one personally, but friends did), the Steam Deck is huge and ungainly by comparison. Yes, all that gaming goodness has to go somewhere, not to mention the battery to power it all AND get acceptable battery life from what is essentially a smaller laptop, and a lot of the size is dictated by the screen size of the device. But there’s just no getting around how cumbersome the Steam Deck, and most other PC-based handhelds including the ASUS ROG Ally, the Lenovo Legion Go, and the MSI Claw, really are. I don’t think handheld gaming consoles have to be pocketable, necessarily, but I don’t want to draw unnecessary attention in public by pulling out a Steam Deck and gaming on the bus/train/plane. It’d be like taking an iMac to a Starbucks. Sure you can, but do you really want to? Do you really want to be that guy?

So when you ask me “what about the Steam Deck?” I say that while it may be a reasonable product at a reasonable price, it’s not for me. It’s not any one thing: that I prefer playing most games with a keyboard and mouse, that few of my games would really suit it, or that it’s large, bulky, and kind of pricey. It’s really all of those things together, which make it unsuitable for what I’m looking for in a portable handheld console.

Now if and when Valve decide to release something like a Steam Deck mini, I would definitely be interested depending on the compromises that they decide to make for a smaller form factor. But for the time being, the search continues for my perfect handheld console.

The retro handheld console and software emulation rabbit hole

The TrimUI Smart Pro handheld console.
Basically a perfect modern GBA/DS emulator. It can run N64 and PSP, but I wouldn’t recommend it.

Every couple of years, I’ll go on a handheld gaming bender where I eschew all responsibilities and spend as much time as possible with my head buried in a handheld console, playing a game that might have been released 20 years ago. For those couple of weeks, I’ll be a teenager again, on holidays and having nothing to do but play video games on a handheld.

By any measure, we’re long overdue for one of those times. While Covid and lockdowns might have been the ideal time to dust off one or more of my old handhelds, I think I was more concerned about surviving and avoiding Covid than I was with playing a handheld console.

One of the great things about handheld gaming consoles like the 3DS and Vita — and indeed, all consoles — is that you can expect them to work 100% reliably with every game that was released for them, because that’s just how consoles work. There are no performance issues. No incompatibilities. If you have a copy of the game and a working console, then you can always expect to play it, whether that’s 20 years ago when the console was first released, now, or 20 years from now. I know that I’ll be able to pull out my 3DS or Vita, give it a charge, and pick up right where I left off. And that’s the beauty of consoles; they just work.

But as much as I love the Game Boy Color that I grew up with, the Game Boy Advance SP I eventually received, and the Nintendo DS that ended up rounding out the handhelds of my youth, I know this isn’t sustainable indefinitely.

The main problem with the handhelds I have is that they, like me, aren’t getting any younger. The batteries they have now are likely the best batteries they’re ever going to have, and while 3D scanning and printing have come a long way and you’ll probably be able to buy replacement plastic parts, that’s not guaranteed for anything else, including screens or other electronics. Nobody is making new 3DSes or Vitas, so it’s a one-way street for these handhelds unless I get lucky and find a good second-hand model for a non-exorbitant price. So as much as I want to be able to play all my Vita games on my Vita, or all my 3DS games on my 3DS, I know that one day that won’t be possible, because time marches ever forward. Parts will break. Batteries will wear out. And when that happens, there’s no guarantee I’ll be able to restore them to working condition. Even if I can guarantee access to the games I want to play, which in 2024 and the age of digital downloads is absolutely not a given, seeing as Nintendo has already shut down the 3DS eShop and Sony was about to do the same with the Vita PlayStation Store until the backlash made them renege, there’s no guarantee the hardware is going to last. How many consoles from 20 years ago do you know of, much less working examples?

Obviously this isn’t an option for even older handhelds like the GBA; in those cases the ageing hardware is even more of a limitation, and it’s getting worse every day. So for gaming on a retro handheld like the GBC, GBA, or even a DS, emulation is really the best option, with all the inherent advantages and disadvantages that brings.

The question is whether I’m willing to live with the tradeoffs of imperfect software emulation for the conveniences of modern hardware and software. Modern hardware in this case means things like hall-effect analog sticks and triggers, USB-C charging, Wi-Fi, Bluetooth, and displays with contemporary technologies like IPS (or ideally OLED, like the Vita had all the way back in 2012) and pixel densities far above the handful of pixels that older consoles had. I’ve been PC gaming at 4K since 2015 at a healthy, if not incredible, 163ppi, so going back to anything less than 720p on a 5-inch display (293ppi) seems like a huge step backwards when you consider that even the very first Apple Watch had 326ppi in 2015. Which, I’ll remind you, was almost ten years ago.
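Those ppi figures fall straight out of Pythagoras: pixel density is the diagonal resolution in pixels divided by the screen diagonal in inches. A quick sketch; the 27-inch monitor size and the watch panel dimensions are my assumptions, chosen to match the numbers quoted above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density: diagonal resolution in pixels over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f"27-inch 4K monitor: {ppi(3840, 2160, 27.0):.0f} ppi")       # ~163 ppi
print(f"5-inch 720p display: {ppi(1280, 720, 5.0):.0f} ppi")        # ~294 ppi
print(f"1.53-inch 312x390 watch: {ppi(312, 390, 1.53):.0f} ppi")    # ~326 ppi
```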

Modern software, on the other hand, means being able to emulate whatever console I’m interested in, provided my device has enough power to run those games. Whether that’s an Android or Linux-based handheld, or something like the PC-based Steam Deck, mostly depends on what I’m interested in playing, given the hardware is more or less the same. Android, for example, currently doesn’t have emulators for Wii U, PS3, Xbox, or Xbox 360, and while that might change in the future, that’s the way it is right now.

Conceptually, I think I’m OK with having a device that doesn’t run everything. I think it would be weird to play GBA games on a 6 or 7-inch screen, for example, irrespective of how good the integer scaling is, but a device that runs GBC, GBA, and even DS games could work. Then, if I wanted to, I’d keep 3DS games on my 3DS and Vita games on my Vita, and potentially have another device for 3DS, Vita, and the more demanding consoles, including GameCube, PS2, and maybe even Switch. From a hardware perspective, this sort of separation works great as well, because retro handheld consoles fit into one of a handful of tiers of modern hardware, each with varying power and price to handle its own set of retro systems.

Continue Reading →

The 10GbE, all-fibre home network rabbit hole

My paltry AliExpress-special 8-port 2.5G switch

When I moved out of home in 2015, I needed my own home networking equipment. And unfortunately, moving back to ADSL2+ from FTTP NBN was every bit as awful as it sounds. I absolutely don’t remember why I ended up choosing the venerable Asus DSL-AC68U for my all-in-one home wireless modem and router, so I won’t pretend to, but I did, and for the past nine years, it’s done an absolutely bang-up job making sure I have the internets/pipes filled with cats/access to the information superhighway on all my devices. That means it’s time for an upgrade!

Or it was, anyway. Read part one and two of that saga.

Enter: the 10GbE home networking rabbit hole, and cue the OCAU thread with over 1500 posts discussing when 10GbE will become consumer-level technology.

Home networking gear has changed a bit in the past 10 years. 2.5Gb network interfaces are becoming more and more common; my Thunderbolt 4 dock has one, as does the B660 motherboard I built my new NAS with. And at the upper end, it’s not uncommon to find 10GbE RJ45 ports as standard, whether that’s on your top-of-the-line PC motherboard, or the iMac Pro (RIP), or today’s Mac Studio.

Which is why it’s strange that plain ol’ gigabit still seems to be the standard for home networking. Yes, home internet speeds haven’t increased anywhere near as much; only within the last couple of years has gigabit internet become possible in Australia, and it’s still uncommon. I think we have other countries, where multi-gigabit internet is not only possible but common, to thank for even the adoption of 2.5G as a gateway to faster wired networking speeds. But within that same time period, NVMe SSDs have become near-ubiquitous, bringing speed increases of over 20 times their spinning-rust predecessors, trading storage capacity for speed. So why are we only now upgrading to home network technology that’s a paltry two and a half times faster than what we already have?

There’s a myriad of reasons, including slow internet, but I think the main one is that for most consumer uses, there just aren’t many real reasons to have a faster network connection between your devices. Your Netflix experience isn’t noticeably improved by a faster connection to your phone or computer, because even mediocre NBN connections can handle a 4K stream of your favourite TV show. Most people aren’t transferring huge files between their computers, so the practical applications of faster network connections are limited, despite computer storage getting faster, not larger. If you’re not storing files in the first place, there’s also no need to transfer them between computers. I suspect it’s also why successive Wi-Fi releases have focused on better Wi-Fi more than raw speed increases: more efficient usage of the wireless channels we have available, opening up new wireless spectrum, smarter usage of airtime, that sort of thing. If Wi-Fi is already fast enough, even faster speeds benefit few, but more efficient Wi-Fi benefits everyone, even those who aren’t hitting theoretical maximums.

But what if you’re a nerd?

A version of the trickle-down philosophy applied to technology says that as businesses and enterprises upgrade their own equipment, you can often grab upgrades to your own gear for fractions of the cost of what said business would have paid for it originally. I’m not saying that this has completely happened with 10GbE networking gear, but running a 10GbE fibre optic network at home is now within the realms of possibility, and more importantly, at something of a reasonable cost.

If you only have two devices that you want to connect at 10GbE speeds, you can pick up two SFP+ PCIe cards for about $100 each, connect them directly with a Direct Attach Copper cable for $50-100, and still use standard Ethernet to connect to your switch and your actual internet connection. If you have more than two devices to connect, that’s where you’ll need an SFP+ switch of some kind. But even at $130 for an 8-port 10G SFP+ switch on AliExpress, roughly $50 for each SFP+ module, and your optical cables on top of that, a mostly-fibre home network is still well within the realms of possibility. And unlike other IT equipment, networking hardware has a lifetime measured in decades. Barring incredible breakthroughs in technology, there’s every possibility that any 10GbE equipment you buy today will still be useful 20-30 years from now, although I’d be slightly concerned about the longevity of the PCIe cards themselves. For that reason, I’m on the fence about splurging on the 25Gb SFP28 versions. Even though they’re backwards compatible with 1/2.5/5/10Gb, who knows what kind of PCIe tech we’ll have in 20 years. For reference, that’s about the same period of time it took for PCI to die out and be overtaken by PCIe. And although the PCIe train doesn’t seem to be stopping anytime soon, it’s foolish to think it’ll be around forever.
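To put rough totals on it, here’s some back-of-the-envelope arithmetic using the prices above; the four-machine count and the fibre patch lead price are assumptions of mine:

```python
# Rough AUD prices quoted above; the patch lead price is a guess.
sfp_card = 100       # SFP+ PCIe NIC, per machine
dac_cable = 75       # Direct Attach Copper cable, midpoint of $50-100
switch_8_port = 130  # 8-port 10G SFP+ switch from AliExpress
sfp_module = 50      # optical transceiver; one per card, one per switch port
fibre_lead = 20      # LC-LC fibre patch lead (assumed)

# Option 1: point-to-point between two machines, no switch involved.
point_to_point = 2 * sfp_card + dac_cable
print(f"Two machines, direct attach: ~${point_to_point}")  # ~$275

# Option 2: a hypothetical four machines linked through the SFP+ switch.
machines = 4
switched = machines * (sfp_card + 2 * sfp_module + fibre_lead) + switch_8_port
print(f"{machines} machines via SFP+ switch: ~${switched}")  # ~$1010
```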

The only problem with going to an all-fibre 10GbE home network is that you will, inevitably, have devices that you can’t plug an SFP+ transceiver into. For these devices, you basically have two options: either you keep using them on standard 1/2.5G copper, or you put them on wireless. Unfortunately, there aren’t any options for an all-in-one switch that has 4+ 10G SFP+ ports as well as 4+ 1/2.5G RJ45 ports, so either you’re stuck using 1/2.5/10G RJ45 transceivers in your SFP+ ports, or you run two switches: one for your optical network, and one for your standard copper one. You can upgrade your copper transceivers to optical ones eventually, but for heat reasons you want to limit how many copper transceivers you’re using. At 1/2.5G speeds this probably isn’t too bad; I’ve only read about heat issues with 10G copper transceivers.

As it stands, I think it would make little financial sense to upgrade some parts of my network to fibre 10G links. I could direct connect my PC and new NAS with 10G or even 25G, then do my router and switch with at least fibre, but from there it gets tricky. There are few practical ways to do 10G or fibre on laptops and even fewer economically friendly ones — despite Thunderbolt 4 being 40G — so it doesn’t seem worthwhile, and especially not when it would be a downgrade in terms of speeds over the current 2.5G connections. My end-game home networking setup would be 10G between my router and switch, then 10/25G from a switch to every computer that supports it, 2.5G to everything that doesn’t, and wireless everything else.

As fun as a theoretical mostly-fibre network is, its practical uses are limited at best. Probably why faster home networking hasn’t caught on. I’d consider running fibre between my router and switch in the future, just because fibre cabling is slightly less noticeable when I’m skirting it around the edge of my rooms.

But otherwise, 2.5G between computers is plenty fast enough.

Just like 640k of RAM ought to be enough for anybody.

The new home network, part II

Previously, on Prison Break:

For the past nine years, a venerable Asus DSL-AC68U wireless modem/router has dutifully been providing access to the pipes filled with cats to all my devices. It’s done its job so well, faultlessly, that I feel like I need to put it out to pasture while it still can be repurposed as someone else’s wireless router. Besides, it’s 2024 now, and the Wi-Fi 5 that it came with is positively pedestrian compared to what we have now, putting aside the glaring limitations of Australian internet speeds or your device’s ability to utilise that kind of speed. Plus, WPA3 is also a thing now too, and any security upgrade is always worthwhile.

The MikroTik Hex has been rock solid as a router. After a solid week of Googling and configuring, I think I have it set up just how I want.

In many ways, RouterOS reminds me a lot of when I played with DD-WRT all those years ago. There are just as many options to configure, and while that means there’s a bit of a learning curve, especially if you want to start from scratch, basically everything is configurable, and there’s very little hand-holding. Want to use one of the Ethernet ports as WAN? Of course, take your pick. Want to remove one of the Ethernet ports from the bridge and use it as a backup/dedicated management port? No problem! RouterOS will tell you when your config is invalid, but it won’t stop you from doing something stupid if it’s technically possible. It’s entirely possible to lock yourself out of your router if you’ve configured the management interfaces to be accessible only from certain interfaces or network ranges, so you can absolutely shoot yourself in the foot. If you want, you can start from literal scratch: no DHCP server, no DNS, no firewall rules. I can tell you now, you haven’t truly lived until you have set up your own DHCP server, even if all that really means these days is ticking a box to turn it on and configuring a few options like your desired IP address range. The next best thing is customising the one that comes with the standard default config, which is what I ended up doing.
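For a sense of what “from scratch” looks like, here’s a minimal sketch of standing up a DHCP server in the RouterOS CLI, assuming the default bridge interface and MikroTik’s usual 192.168.88.0/24 range:

```
# Give the bridge an address, define a pool of leases, then attach
# a DHCP server and its network settings. (Assumes the default
# "bridge" interface and the stock 192.168.88.0/24 range.)
/ip address add address=192.168.88.1/24 interface=bridge
/ip pool add name=dhcp-pool ranges=192.168.88.10-192.168.88.254
/ip dhcp-server add name=dhcp1 interface=bridge address-pool=dhcp-pool
/ip dhcp-server network add address=192.168.88.0/24 gateway=192.168.88.1 dns-server=192.168.88.1
```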

But did the Hex fix what marginal levels of bufferbloat I had? Yes, absolutely, although I don’t have SQM enabled all the time. For whatever reason, Opticomm FTTP connections are usually over-provisioned, in that I get slightly faster speeds (usually around 110 Mbps down, 45 Mbps up) than what I actually pay for (100/40), so I have SQM disabled outside of peak periods so I don’t miss out on that little bit of extra speed. The place SQM is most noticeable is when I’m downloading something and watching a YouTube video at the same time. With SQM off, when that download is saturating my connection, my YouTube video drops quality and starts stuttering like it’s buffering over a dial-up connection. But with SQM enabled, I can download something and watch YouTube at the same time, without any loss in quality and without any buffering pauses. It’s a small thing, but SQM has made a minor yet appreciable impact on my internet quality. If nothing else, now I can use my internet connection with impunity. Not that I didn’t before, but now I know it will actually work when I want it to, irrespective of whatever else I might be doing.
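For the curious, my understanding is that SQM on RouterOS v7 amounts to a smart queue on the WAN interface, shaped slightly below the line rate so the router, rather than the ISP’s equipment, is the bottleneck. A minimal sketch assuming CAKE queue types and my 100/40 plan; the WAN interface name and the exact rates are placeholders:

```
# Define CAKE queue types for each direction, then hang a simple
# queue off the WAN interface shaped just under the line rate.
# (Assumes RouterOS v7 with WAN on ether1; rates are examples.)
/queue type add name=cake-down kind=cake cake-diffserv=diffserv4
/queue type add name=cake-up kind=cake cake-diffserv=diffserv4
/queue simple add name=SQM target=ether1 queue=cake-up/cake-down max-limit=35M/95M
```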

And yes, the Hex has limitations in terms of throughput with SQM enabled, but thanks to Australian internet speeds, I can save money by having a cheaper router. As it stands, the Hex is apparently good for about 200-500 Mbps with SQM enabled. Given that I’m not planning to upgrade my internet speeds anytime soon, that’s plenty, but if and when I do, an RB5009 (or its successor of the time) has my name on it. I’m still tossing up whether I want to “upgrade” to 250/25 for $4 more per month. Whether or not that’s worth it, I can absolutely recommend SQM on any modern internet connection. In a one-person household it might not be that big of a deal, but even I’ve noticed it, so I can only imagine how great it would be in a family home.

But honestly, the Hex is too fully-featured for my meagre networking requirements. I’m not running my own ISP, nor do I need any kind of failover. Fancy routing rules for specific traffic, or complicated NAT rules, are also outside what I want from my home network. I’m not even using VLANs or anything that would require me to know more about networking than I currently do. But it’s good to know that if I want to in the future, or if my networking circumstances change, I can do all of that without having to redo my entire home network setup.

If I have a hesitation about the Hex, it’s that the hardware is fairly basic. While it does have a microSD card slot and a USB port, there are “only” gigabit Ethernet ports on the thing, no 10G SFP+, no PoE, and I can’t run containers on it like you can on some higher-end MikroTik hardware. It feels bad buying networking gear with only gigabit Ethernet in 2024, but unless I want to spend many hundreds more dollars and buy one of those little fanless mini-PCs that come with 2.5G/10G SFP+ ports and run RouterOS on that, I’m stuck with the hardware that MikroTik currently offers. I think the RB5009 would be great, but as it is, I can probably wait until the next iteration, as there’s basically nothing the Hex doesn’t do for me today. That changes if I get gigabit internet, but I can’t see that happening anytime soon, especially with the state of internet infrastructure in Australia right now. Further compounding this is that while you can get gigabit internet on the NBN, Opticomm doesn’t seem interested in competing with the NBN or even offering higher speed tiers, so the fastest I can get is 500/200 at roughly triple what I currently pay. For a one-person household, that just doesn’t seem worth it.

So for now, the Hex has this strange dichotomy between incredible software and mid-tier hardware: fine, capable hardware that’s more than enough for home network usage, but lacking a few niceties and/or esoteric features that would have been “nice to have” in 2024.

Continue Reading →

The new home network

Asus DSL-AC68U wireless modem router

Next year will be 10 years since I bought any new home networking gear. Compared to typical IT gear lifetimes, where you’re normally replacing gear every couple of years, hitting double digits on anything is an impressive feat that usually represents one of two things. Either you over-invested to begin with in the name of “future-proofing”, even if you couldn’t fully use the gear at first, or there have been so many other expenses/upgrades ahead of it that you haven’t even thought about upgrading something that works perfectly well. As the old adage goes: if it ain’t broke, don’t fix it.

But look, I’m not here to judge your personal technology choices. Merely provide some insight into some of my own, a cautionary tale or two, and some helpful anecdotes along the way. If we’re lucky, maybe we’ll get all three in a single post, but if not, two out of three ain’t bad.

For the past nine years, a venerable Asus DSL-AC68U wireless modem/router has been dutifully providing access to the interwebs to all my devices. It’s done its job so well, faultlessly, that I feel like I need to put it out to pasture before it starts getting ideas and kicks off the robot uprising that every sci-fi has warned us about. Besides, it’s 2024 now, and putting aside the glaring limitations of Australian internet speeds or your device’s ability to utilise that kind of speed, the Wi-Fi 5 that it came with is positively pedestrian compared to what we have now. Plus, WPA3 is also a thing now too, and any security upgrade is always worthwhile.

When I was waiting for the internet to be connected in my first apartment, I was able to plug a USB 4G modem into it and have the AC68U share it with all my devices. And when that same apartment joined the 21st century and upgraded to the NBN, albeit the slightly-inferior FTTB version, the AC68U just kept on working. And now that I’m on Opticomm (i.e. non-NBN) FTTP, it just keeps on working. I have no doubt it would keep doing so until one of two things happens: it releases the magic smoke and spontaneously combusts into a small pile of ash, or the heat death of the universe. Whichever comes first.

That means it’s time for an upgrade! But to what?

Home networking gear is boring in the best possible way. The ideal scenario is that you set it up once, and don’t ever touch it again unless you’re changing something. But if you’re like me, you’ll spend a few weeks every ten years fiddling with it, then never touch it again. That’s basically how I’ve run my AC68U over the years, besides upgrading the firmware every now and again, or forwarding a port here and there. Like I said, if it ain’t broke, don’t fix it.

I have basically three options for upgrading my home network.

I consider myself pretty lucky (for the purposes of picking home networking gear), in that I live by myself in a small apartment. That means I don’t need a fancy mesh system, or multiple APs to cover the whole thing. Because I’m the only one that uses the network, I can wire up all my computers for the latency and consistency advantages wired connections provide, put everything else on wireless, and have a pretty simple setup overall.

It would have been easy to pick up something like the 2024 version of the AC68U, an all-in-one wireless router. This time around, I won’t even have to buy something with an ADSL modem, because I’m not planning to live in a place with ADSL ever again. But as ugly as most of today’s wireless routers are, surely there are better options? Some of the Wi-Fi 7 wireless routers from TP-Link don’t look too bad, although they are a little on the pricey side. But what if I wanted a slightly less consumer option? After almost a decade of skipping past all the marketing buzzwords in the Asus web interface and going straight to the advanced settings, what if I wanted to step it up a notch?

Ubiquiti seem to be the current flavour of the month for prosumer networking gear. Their new-ish UniFi Express is a nice little all-in-one that I could probably recommend pretty comfortably to anyone who wants something configurable, backed by a great user interface that makes setting it all up easy enough. I can even see myself trying out a UniFi Express to see if I like the Ubiquiti ecosystem as a whole, as it’s also a pretty cheap entry point into the UniFi ecosystem. It would probably also suit someone deploying it at a “secondary” site like their parents’ house to replace ageing network gear, and it even comes with cool features like remote management.

But as nice as the UniFi Express is, it “only” comes with Wi-Fi 6, so it wouldn’t be that much of an upgrade. It’s also lacking some features. There’s no USB port, for example, if you wanted to share a 4G/5G USB modem between all your devices, just like I had to do while I was waiting for the internet to be connected after moving in. That isn’t a huge deal these days given the relatively fast provisioning times of the NBN, but it’s a nice-to-have. There’s also only one LAN port, like Ubiquiti expect you to have a switch if you plan to network a few computers together like it’s 1999 or something. It’s a reasonable assumption, but would it have killed Ubiquiti to put a few more Ethernet ports on the thing? Fortunately, this also isn’t a big deal for me, as I already have a switch connecting my computers together on a LAN like it’s 1999.

Which brings us to the third option: separating out my router, switch, and wireless access point into three separate devices. While there’s nothing particularly wrong with having an all-in-one wireless router, sometimes you just can’t find the right device at the right price. There’s also something to be said for the modularity of three separate devices, meaning that if you want to upgrade something in the future, you can do so without having to replace everything. But now instead of having one decision to make, I have three! Make that two, on account of the fact I already have a switch. It’s an 8-port, 2.5G RJ45 switch with one 10G SFP+ port from Keeplink, which you can also have for the low price of around $70 if you’re willing to buy it from AliExpress.

Continue Reading →

Polaroids

A set of four Polaroids taken with friends

I have a love-hate relationship with my Polaroid camera.

On paper, my Polaroid is the perfect alternative to the point-and-shoot nature of my iPhone. It’s the ideal analog equivalent to digital photos that might as well only exist on your phone, or at most in a post on social media or group chat somewhere. I love that it produces real, physical photos that people can then take home and put somewhere they’ll see them, like on their fridge or wall, to remind themselves of a nice moment in time. The photos have character that you just can’t get from a modern phone camera, even if they’re not always perfectly in focus, and aren’t timestamped, geo-tagged, or accompanied by a little two-second movie.

But in practice, there are just as many negatives as there are positives to shooting Polaroids, even though the film it uses doesn’t use negatives. Sorry, little film photography pun there.

The film that it does use is expensive, expires if I don’t use it within a certain timeframe, produces sub-standard photos if I don’t store it properly, and the photos produced are so wildly inconsistent as to be basically unusable half the time.

When each photo costs you at least $3, it’s not something that you can just snap away with. I’ve been limiting myself to only taking photos of people with my Polaroid for that very reason, because if I’m going to spend that much on physical photos, I want them to be of something real, and not just some nice scenery or whatever.

But because opportunities for nice photos with friends don’t come around all that often, and I’m not taking that many photos when they do, I often find myself with leftover film. Yes, even when each pack is only eight shots, which makes a 36-shot roll of film seem limitless by comparison. I then have to either force people to take more photos to finish off a pack, or contend with storing it and hoping it will still be good, and not expired, the next time a photos-with-friends opportunity comes around. Improper storage or outright expiration of the film probably isn’t that big of a deal, but with results being so wildly inconsistent and the photos costing as much as they do, I want to give myself the best possible chance of getting good photos, which ideally means film that’s within its use-by date and has been stored correctly.

Which brings us to the other part of the problem. I’ve had such varied results shooting Polaroids that there’s always a small part of me that wonders if it’s worth it. Maybe I don’t have enough experience with it to get a good feel for what works and what doesn’t, or maybe I’m too used to my iPhone camera and its ability to produce perfect photos every, single, time, and keep trying to pull off technically challenging photos with my Polaroid. Either way, getting good photos out of my Polaroid seems like such a coin toss that I wonder if there’s anything I can do to improve my chances of getting photos I’d be happy to stick up on my fridge or wall.

What’s interesting about all of this is that I don’t have these kinds of inconsistency issues with film. Yes, I’ve shot hundreds more frames of film than I have Polaroids. But with film, I know that when a shot turns out blurry, it’s usually my fault for not nailing the focus using the manual focus lens. And when the image turns out under- or over-exposed, it’s because I intentionally wanted it to be. My film rangefinder has automatic metering, which prevents shots coming out too dark or too light when using aperture priority, but it also doesn’t have the benefit of a flash.

By doing away with any kind of adjustable shutter speed or aperture and relying on fixed-focus lenses, the Polaroid should theoretically produce consistent exposures, given how simplified the whole exposure triangle is. But maybe that’s one of its limitations: it can only produce good exposures in a few limited scenarios, and it over-relies on the flash to compensate for less-than-ideal lighting.

Even in the early days of shooting film, when my very first film rangefinder didn’t have (working) metering and I had to manually meter every shot using my phone before dialling the shutter speed and aperture into the camera, I was able to take OK photos most of the time. Yes, in the beginning I might have had a photo turn out too dark, or too bright, or blurry from too slow a shutter speed. But I feel as though I was able to pretty quickly learn what worked and what didn’t and compensate accordingly. The Polaroid, by comparison, seems to have a mind of its own when it comes to exposing correctly. What I think should be exposed correctly isn’t, and what shouldn’t be, is! It’s madness!
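For context, manually metering a shot boils down to one formula. A worked example using the standard exposure value relation, with sunny-16 numbers as the assumed scene:

```latex
% Exposure value for aperture N (f-number) and shutter time t in seconds,
% at ISO 100:
\[ \mathrm{EV} = \log_2 \frac{N^2}{t} \]
% A sunny-16 scene at ISO 100, i.e. f/16 at 1/100s:
\[ \mathrm{EV} = \log_2\left(16^2 \times 100\right) = \log_2 25600 \approx 14.6 \]
% Any pair with the same N^2/t is an equivalent exposure; opening up to
% f/8 needs t = N^2 / 2^{\mathrm{EV}} = 64 / 25600 = 1/400s.
```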

Continue Reading →