faster isn't better

at the time of writing (2021-02-16), apple's latest products have taken the 'tech community' by storm. said products are the new 'm1' macs. these devices promise a cooler, faster, more efficient computer, and by all accounts, they deliver on that promise. the m1 SoC (system on chip) is considerably faster, runs cooler, and is overall a better chip than its intel x86 counterparts. this article isn't here to analyse or benchmark these claims; for all intents and purposes, they are true. apple has produced a chip that is overall better than any other 'desktop class' chip on the market. i will instead focus on the consequences of producing a faster chip.


software and hardware developers are in an arms race. for each big step forward in hardware speed, a step of roughly the same size is taken in software, but in the opposite direction. the result, to the nondiscerning user, is that speed remains at a net neutral. the mac m1 represents this step up in speed. ask anyone who bought an m1, and they'll be happy to tell you how fast and performant their machine is. apps that were once slow, such as discord, microsoft teams, xcode, slack, chromium and firefox, now run quickly. anyone who's used these apps (or seen their code, where that's possible) can tell you that they are massive, often trudging along at paces comparable to snails. one account i read described xcode taking up to 20 seconds to initialise on their hardware. yet the latest hardware dispels this: unoptimised, inefficient, poorly designed programs run faster on the m1 than a large portion of optimised apps on any other CPU. so, what's wrong here? where's the issue?


at the time of writing this article, we are stuck in a global lockdown as a result of the COVID-19 pandemic. this is the second lockdown in the uk, where i reside. as a result, many people have to work or learn remotely. the software used for this is commonly microsoft teams, zoom or slack. this software is effectively essential to receive an education, or to survive and receive sustenance. if you aren't deemed an essential worker and forced to work in unsafe conditions, then chances are you have to use one of these software packages. this wouldn't be an issue, if they were good software.

chances are, you've heard of the term 'planned obsolescence'. i want to contest that term. mobile devices certainly have this quality to them: designed to become redundant, so that you buy a new one. outside of mobile devices, though, this isn't really the case. devices become obsolete through the software that runs on them, and through the release of newer hardware. as new hardware is released, software raises its minimum requirements, and so runs slower on older machines. in response, hardware becomes faster, and the cycle repeats ad infinitum. hardware barely lasts a few years now. to an average person, there is only one thing to be done with hardware that can no longer run the apps they need for work or education: throw it away. the computer does nothing else for them, and they can't upgrade it, after all. having thrown away their old hardware, they buy a new machine that can run their software, and continue with their lives.

for many people, this is not an option. frivolous spending like this is wasteful, both in terms of money and in terms of the environment. the increasing power of hardware and the increasing demands of software present a growing barrier to participation in society, and thus to staying alive. i do not think i am unreasonable in saying that future jobs will require more and more technology. and while technology does get cheaper over time, 'modern' technology does not. a laptop from 2010 may be much cheaper now than it was then, but it isn't really going to be able to do anything.

within capitalism, you have to spend money to make money. what happens if you can't spend any money does not warrant pointing out.

on user freedom

the apple m1 computers present a large step in apple's history: all their hardware is now produced internally. previous intel macs could run any operating system, thanks to the bootcamp program, and to the fact that the x86 platform has a 'lowest common denominator' that can be relied upon to exist. if you target the x86 architecture, you can make a set of assumptions that will hold true for almost all x86 chips. this is what enables windows to run with very few modifications on mac hardware. this changes with the mac m1. linux has been booted on the m1, but the situation is far from pretty. the m1 chips are nonstandard, even compared to other arm chips, to the point where versions of redhat enterprise linux cannot even be virtualised on them. their approach to booting, hardware access, core access, and more is completely different from everyone else's. the result is hostility towards running anything other than macos on these macs, and macos has its own share of issues in regard to being locked down. take the instance where apps took far too long to load, on every machine, all at the same time. why did they all go out at once? launching an app in macos sends a request to apple's servers. those go down? you can't open apps anymore.

on repairability

apple hardware is increasingly hard to repair. if you break an apple mobile device, you have to pay upwards of 80 GBP to replace even a simple part such as a battery. the situation for macbooks is even worse: the lowest price for a battery replacement is 130 GBP. for comparison, the raspberry pi 400 (with its kit) is a desktop class computer, suitable for casual browsing and games, and it costs 90 GBP. many people at this point will say that unofficial repairs exist. unofficial repairs of apple devices are difficult, require expensive equipment, and, thanks to apple's increasingly repair-hostile design, become harder with each new device that is released. this leads to wastefulness. being able to repair a device easily and inexpensively is obviously a desirable quality for a consumer: who wants to pay 100 GBP to replace a battery in their laptop, when they could do it themselves, or pay someone less money to do it? the issue with this ideal is that it isn't profitable. you can't keep people purchasing your devices if they're reliable and last for a long time. with apple's place as a symbol of wealth, and thus power, people are stuck in a vicious cycle of fragile apple devices being bought, breaking or becoming obsolete, and then another being bought. this is not exclusive to apple; it happens with all major technology and software manufacturers. apple is merely the most prominent example, given the controversies regarding Right to Repair and their devices.


the apple m1 chip represents two trends in 'modern computing': ever faster hardware, sold to outrun ever slower software; and ever more locked-down, repair-hostile devices.

these two seemingly unrelated trends are woven together to create a moving target for consumers. thanks to this cycle, there will always be a slightly better machine: one that doesn't have that expensive-to-repair crack, or that hard-to-replace battery that seems to barely last any time now.

faster hardware isn't better.