A nostalgic, shelter-in-place note of praise for my iMac. Bought as a refurb in 2010, it has been running virtually nonstop ever since. The only upgrades have been to increase the RAM (to 12GB) and add an SSD (a cheap upgrade that yielded a huge performance boost). It has outlived two HDDs so far; currently rocking an 8TB HDD and a 250GB SSD in a Fusion Drive.
Still, at 10 years old, she's our media hub - receiving, transcoding on the fly, and distributing up to two simultaneous HD video feeds - while acting as our fully functional DVR (record, pause, rewind, etc.).
Now, though, in these times of social distancing, she is also my work computer, on which I log in to my virtual Windows desktop. The 27" screen gives me plenty of real estate to replace my dual-screen set-up at the office, and it all happens without missing a beat even if someone is watching live TV - even if I'm watching live TV in a pop-out over my workspace.
I am, of course, biased, but I can't imagine any WinBox - let alone the weak iMac knockoffs from Dell et al. - still being this functional 10 years after release. Who knew that an upright laptop with a big screen could be such a workhorse? Steve Jobs? Oh yeah, him.
A year or so ago, Apple stopped supporting this model with OS updates, so I’m aged out at 10.13 “High Sierra”. Who at Apple would make such an arbitrary decision not to support older models? Tim Cook? Oh yeah, him.
From time to time I look at the current iMac refurbs with their 6-core processors and wonder what benefits a $1,600 upgrade would bring. Other than the nebulous benefits of an up-to-date OS, now that I'm not doing a whole bunch of Blu-ray transcoding, I'm not sure there are any. The old girl still looks good, runs smoothly and quietly, and powers through whatever I throw at her.
And, while I would’ve given Steve Jobs my bank passwords, I don’t like giving any money to Tim Cook’s Apple.
I think Apple still sells an iPhone 6-sized phone, but it’s got the innards of an iPhone 3.
I hung on to my 6S as long as I could, but had to upgrade once even the replacement battery couldn't last into the afternoon. I enjoy the battery life of my 11 Pro, and the camera is far better (even though it still lags behind competitors), but I hate the bigger form.
My very first iMac, a 27" Late 2009 quad-core i7 with 16GB RAM and a 1TB drive, continues to run, and run strong, as a media core as well. I've never replaced the hard drive or any other part in the unit. It's topped out at High Sierra too, but I see no need whatsoever to replace it. Same thing with my daughter's MacBook Air from 2009: every time I ask her if it needs anything, she just says, "Nope, I'm good."

That said, the Dell Precision workstation with two dual-core Xeon processors has been acting as my "server" in the basement for 13 years. I did upgrade it from 8GB to 16GB about 5 years ago (for a whopping $34) and threw a $100 5TB portable drive on it to extend its storage. It said "thank you very much" to the free Win10 Pro upgrade last year and just keeps self-updating and running. The latter part of the first decade of this century saw manufacturing quality jump by leaps and bounds; if you spent a little $$, things would just run and run. Although, if you bought cheap crap, you got what you paid for.
Since the advent of smartphones and other internet-connected technology, computing has moved increasingly back to a pre-PC client-server model. High-end PC gaming has also slowed a lot because of the expense of developing games, and much of it has moved to console architectures. So there is less and less need for processing power on the client end of things, which is why computer systems are sticking around longer.
I built my home desktop in late 2015. It is hugely capable: 6-core i7 (12 threads with hyperthreading), 32GB RAM, SSD-only as of a couple of years ago (after I tore it down when the 3-HDD RAID I put in it crapped out… that was frustrating). I have had the case for about 12 years, so it only has USB2 ports on the front. Dammit.
I initially put Windows 8.1 on it. Then nuked it to Server 2012 R2 in 2017(?). Then last year I nuked it again to finally bring it up to Windows 10 because some apps I use (e.g. Adobe anything) stopped supporting anything else. Ever since then the computer randomly blue-screens, and only when I'm not sitting at it. Sometimes it will go three weeks between BSODs; sometimes it will go two days. It's crashed during the day when I've been at work, in the evening, and overnight. Usually you can just Google the bugcheck code to track down the faulty module or driver and troubleshoot from there. The exact bugcheck code I'm getting references ntoskrnl (you couldn't possibly get less specific) and pulls up precisely one page of Google search results, none of which provide any answers. Fuck. It might be a driver, but I've got all the most recent drivers for my motherboard, which stopped getting updates from the manufacturer in 2016. Fuuuck. My options are 1) nuke and reload Windows 10, hoping it was just a bad install (it happens), 2) nuke and reload Win 8.1/2012 and run a Win10 VM for the apps I need, 3) start methodically tearing apart the computer piece by piece to attempt to isolate the issue and wait potentially weeks at a time to see if the BSODs stop, and/or 4) depending on the results of 3, spend hundreds of dollars replacing potentially faulty parts. Fuuuuuuck.
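In the meantime, to at least see the crash cadence without digging through Event Viewer: Windows normally leaves one minidump per BSOD (by default in C:\Windows\Minidump, assuming minidump creation is enabled), so a throwaway script can list them with the gaps between crashes. A rough sketch, nothing more:

```python
from datetime import datetime
from pathlib import Path

# Default minidump location; differs if crash-dump settings were changed.
DUMP_DIR = Path(r"C:\Windows\Minidump")

dumps = sorted(DUMP_DIR.glob("*.dmp"), key=lambda p: p.stat().st_mtime)
prev = None
for dump in dumps:
    when = datetime.fromtimestamp(dump.stat().st_mtime)
    gap = f"  (+{(when - prev).days}d)" if prev else ""
    print(f"{when:%Y-%m-%d %H:%M}  {dump.name}{gap}")
    prev = when
```

No answers in there, obviously, but at least it shows whether the crashes cluster around anything.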
So now I'm working from home every day and the computer has held up reasonably well despite some minor annoyances. I added a third monitor via a USB3-to-HDMI adapter (held my nose initially, but it works surprisingly well for anything not gaming/video related). Yesterday I was editing audio/video all day, and despite a little bit of chugging when it got up to 29-30GB of RAM in use, it did pretty well. Finished rendering and uploading an hour-long HD video, closed everything out, and walked away. Came back about 20 minutes later and the damn thing had crashed. WTF?? There is nothing more demoralizing.
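I'm tempted to leave a dumb memory logger running so the tail of a log file shows how loaded the box was right before it dies. A sketch, assuming the third-party psutil package (the filename and one-minute interval are arbitrary):

```python
import os
import time

import psutil  # third-party: pip install psutil

LOG = "ram_log.txt"  # hypothetical path; put it wherever

# Append a RAM reading once a minute. After a crash, the last lines
# show how much memory was in use right before the machine died.
while True:
    vm = psutil.virtual_memory()
    line = f"{time.strftime('%Y-%m-%d %H:%M:%S')}  used={vm.used / 2**30:.1f} GiB ({vm.percent}%)\n"
    with open(LOG, "a") as f:
        f.write(line)
        f.flush()
        os.fsync(f.fileno())  # force it to disk so a hard crash can't eat it
    time.sleep(60)
```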
The 2015 version of me loved to tinker: rooted my Android phones/tablets, built my own computers, etc. The 2020 version of me is so over all of that shit, but I'm still living in the cage I built for myself five years ago (including my 2015-era laptop that I intentionally spec'ed lower because I decided at the time that if I needed to do "serious" work, I'd do it on my beefy desktop. Fuuuuuuuuuuuuuck).
There is zero doubt in my mind that my next computer will be a MacBook Pro. I bitched endlessly in 2017 about their switch to USB-C, but it really started to make sense when we bought them for our faculty. A USB-C monitor (like the Dell P2419HC) can now serve as a pretty damn capable docking station for an MBP: you can drive two DisplayPort monitors (one daisy-chained), get four USB-A ports, AND charge your laptop, all over a single USB-C cable. Add a $50 third-party USB-C adapter that includes 3-4 more USB ports, Ethernet, HDMI, and SD/microSD, and I won't need a desktop anymore. And I have gotten so fucking tired of bringing my work home by having to deal with drivers and other bullshit on my home computers. There's no doubt I'll have some growing pains going whole-hog macOS and will probably need a Windows VM to fall back on for some things, but I honestly can't wait for Apple to update their 13"/14" MBP with the new keyboard.
The PC market is flooded with cheap low-end bullshit. You also made a number of upgrades to your iMac over the years.
And the Dell AIOs have their uses. At my last job we needed a few touchscreen kiosk PCs without all the wire mess; Dell has them, Apple doesn’t. At my current job we have several Dell AIOs available to students in our library. Better specs than the baseline iMac for ~25% less.
The line may be drawn somewhat arbitrarily, and there is certainly some amount of planned obsolescence, but it’s also the same reason that Apple doesn’t sell $299 laptops, and the same reason you can’t put iOS 13 on an iPhone 4: it’s that baseline experience I mentioned earlier. The average Mac owner doesn’t drop an 8TB HDD and 12GB RAM into their 10-year-old iMac; they probably still have the shitty 5400rpm drive and 4GB RAM that it came with from the factory.
Catalina support goes as far back as mid-2012 on some models. Eight years is a long fucking time in computer terms. Not quite as long as Windows (10 years), but pretty long. You have to draw the line somewhere.
Somewhat interestingly, Microsoft is catching a little bit of flak for a very Apple-like decision on the upcoming Xbox Series X. Among other things, they are going all in on the speed of the user experience (such as load times within games). It's got a beast of a PCIe SSD instead of a hard drive or cheaper SATA SSD. If you max out the built-in storage, you have to buy a proprietary external SSD (instead of expanding storage via a USB flash drive or hard drive). While it seems anti-consumer, the reasoning is that if MS allowed any external storage to be used, users would buy the cheapest possible storage device, it wouldn't be fast enough to meet the console's standards, and then they'd blame MS for the degraded experience. I get it.
Sorry your home machine isn’t behaving. My wife hated Apple, but got a MacBook Air (last model before the USB-C switch) because it checked all the boxes for functionality, size, weight and cost. She loves it.
The switch to USB-C only on Apple's laptops has meant the elimination of the MagSafe charger, which is a ridiculous retrograde step, IMHO. OK, go with USB-C, but keep the damn MagSafe! Now you're forced to have an adapter for almost everything - you can't even plug in your iPhone without one - whereas my wife's MB Air has MagSafe, HDMI, a couple of USB-As, and memory card slots. All part of post-Jobs Apple's effort to make sure that none of their products can connect to each other.
Agreed, I did upgrade my iMac. The memory upgrade was done right out of the box. I could go up to 16GB, but it seems a little late now. I did swap out the HDD, which didn't improve performance - just capacity - whereas adding the SSD and creating the Fusion Drive was like adding NOx!
IDK, our home PC is probably 10 years old and we’re having no problems (Windows 7 upgraded to Windows 10). I am an IT professional so that may have something to do with it.
The SSD upgrade is a revelation in performance on any Mac or Wintel unit, and SSDs are so very cheap now. For anyone who has a Wintel system of any type: if you have 4GB of RAM, 32-bit Win10, and a spinning drive, upgrade to an SSD, max out the memory, and switch to the 64-bit OS, and, unless the PC/laptop you are using is absolute junk, it'll run great for another 5 years for less than $100 - and feel just like a new machine.
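If you're not sure whether the spinning drive is the bottleneck, a crude sequential-write test makes it obvious. A rough sketch (the 256MB test size is arbitrary, and the speed ballparks in the comments are my own rules of thumb, not gospel):

```python
import os
import time

# Writes ~256 MiB to the current directory and times it. An old 5400rpm
# drive usually lands well under 100 MB/s; even a cheap SATA SSD should
# be several times faster.
SIZE = 256 * 1024 * 1024
CHUNK = b"\0" * (4 * 1024 * 1024)

start = time.perf_counter()
with open("bench.tmp", "wb", buffering=0) as f:
    for _ in range(SIZE // len(CHUNK)):
        f.write(CHUNK)
    os.fsync(f.fileno())  # make sure it actually hit the disk, not the cache
elapsed = time.perf_counter() - start
os.remove("bench.tmp")
print(f"Sequential write: {SIZE / elapsed / 1e6:.0f} MB/s")
```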
@waldo, the reason I don't have any of those compatibility issues is that the Dell unit I bought back in 2007 was a business-grade system with solid, well-supported internal components and drivers.
The SE's front-facing camera was willfully downgraded to keep that bad-ass phone from becoming everything it should have been. Only Apple will do shit like this (see the battery-life throttling debacle).
I couldn’t care less about a front-facing camera. I’ve never used one on any phone I’ve ever owned. I do care about it being able to fit into my pocket, and not having a headphone jack is an absolute deal killer for me, which is why I haven’t upgraded to the Xperia XZ2 Compact. That was the dumbest move any phone manufacturer ever made.
I never thought I'd become a Mac guy, but when I went to work for myself, I bought an iMac out of necessity, as most everyone in my field uses Macs. I've come to like its simplicity and style as well as its compact power. I leased an iMac Pro last year and I love it. I just wish Apple and Nvidia would play nice.
Yep, if you go Windows you have to buy a business-class machine. Not only are the parts a little better and the build quality much better, but the manufacturer support - not just technical support, but also length of firmware/driver updates - is far superior.
Which leads me to my biggest pet peeve about the whole PC/Mac argument: people getting pissed off at their cheap Windows shitboxes, then switching to Mac and blowing their load about how much better/faster/???er it is. At my last job I had a lawyer client who for years had used a $249 10" netbook (remember those pieces of shit?) with 2GB of non-upgradeable RAM to drive two monitors and a couple of cheap USB hubs. He finally got fed up enough to ask us for a recommendation on a new PC. We got him a quote on a reasonably appointed $1,100 Dell Latitude. "I dunno, that's pretty expensive."
A couple of days later he called in asking for help setting up his new, tricked-out $2,500 MacBook Pro. It didn't run ANY of his shit (all of his legal software was Windows-only, at least at the time), so we had to set him up with VMware Fusion, in which he spent ~85% of his time and which made his life so much more difficult (but he absolutely refused to use Boot Camp - "I got a Mac for a reason"). After two weeks of some serious hand-holding he finally got the hang of his new workflow, despite it taking him 25-30% longer to do his work (maybe that was billable time?). He sat back, smiled, and said, "Man, 'they' really were right that Macs are so much better than PCs." No, fucker, you're doing it wrong.
No kidding. You get what you pay for. And what you do with it matters, too. My shop is 96% (business-class) Dell and 4% Apple. My CustSat numbers are 91%, 4.6 standard deviations ahead of Apple's. The Apple "argument" is elitism of the highest order and not based in any actual elite reality. He says, as he types this comment on a nice, new MacBook Pro…
Switching to Mac at home makes a lot of sense for me. But in a business environment Macs are still expensive square pegs for round holes. Windows all day, every day in the medium/large business or enterprise.
My company has gone over completely to virtual desktops, so even when sitting at my desk I still "remote" in through Citrix. In that regard a Mac would work just the same, but it would be an expensive option.
The flip side is that I work off a Surface laptop in the office and my iMac at home. More power available at home, I think.