That is a tremendous update. Thank you for providing a great read about the diligent efforts of an increasing number of talented contributors: Oliver and Janne and Alyssa M and Shiz and Robert and Sven and James and Neal and chaos_princess and Davide and Lina and Michael and Sasha and Alyssa R have been killing it.
Great work. I believe used M1/M2 machines will be favored by young developers as their personal fun laptop in a few years, like the Thinkpad T420 used to be. For different reasons, of course.
Do the M4 and M5 GPUs also change a lot from the M3? I hope it's not too much work to get those going once M3 is usable.
> I believe used M1/M2 machines will be favored by young developers as their personal fun laptop in a few years
I doubt it. For one, the SSDs have limited lifespans, and are soldered on the mainboard. They'll be fine enough for the planned life of the laptop, but eventually secondary market laptops will start seeing waves of failures, at which point people learn that purchasing one is a gamble.
The entire Apple silicon lineup is designed for limited lifespan.
SSDs can be resoldered, and that service is actually becoming popular and inexpensive. It's not just MacBooks; nearly all laptops have the SSD and RAM soldered. This will be a totally normal thing a few years from now.
Soldered storage is extremely uncommon for laptops not from Apple. You pretty much only find it in very low-end Chromebook type hardware that's using eMMC for cost reasons, and a small fraction of more expensive Qualcomm-based laptops that use UFS for no good reason. All mainstream PC laptops use M.2 NVMe storage.
> > It's not just MacBooks, nearly all laptops have SSD and RAM soldered
> That's simply a lie. No other laptop have soldered SSD. An increasing number do have soldered RAM.
That's simply a lie. Pretty much all laptops using eMMC or eUFS for storage have it soldered directly to the mainboard. These are often budget devices, and many are things like x86-based tablets or Chromebooks, but there are models that are very much laptops. I do concede I am unaware of any non-Apple laptops with directly soldered NVMe storage, but your claim that no other laptop has soldered SSD is patently false.
I think it's a little disingenuous to compare high(er)-end Apple laptops with soldered storage to $200-300 Chromebooks and budget devices with soldered eMMC (which is much more like a CF/SD card than anything else) in the first place.
As you acknowledge, when you look at actual competitors to Apple, you're forced to admit that yup, no other manufacturer solders storage.
But yes, with due pedantry, the statement that "no other laptop has a soldered SSD" is technically wrong.
You could get into additional debates on whether eMMC and eUFS would map to most people's understanding of "SSD", but...
Neither the RAM nor the SSD is on-chip. The RAM is on-package; the SSD is on-board.
On-chip means literally on top of the silicon, like how AMD X3D CPUs mount the SRAM die. On modern Apple devices the RAM is mounted on the organic package substrate. The difference is significant, and it's shitty that Apple outright lied about it.
I think that particular definition of "on chip" is entirely your invention. I've usually seen it broadly used for anything on-package, whether it's on-die or on a separate die within the same package.
"On chip" definitely does not have much if any history of referring specifically to stacked dies with TSVs, because that has been a very niche packaging technique until recently, and "on chip" is a much more broadly used term.
I've yet to see a desktop SSD wear out from writes. The only dead desktop SSDs I've seen have died due to buggy firmware (early drives, or that recent batch of Samsungs) or well before their rated wear level was reached (cheap no-name drives off Amazon).
My first SSDs were from Intel, and I completely wore them out by writing their specified maximum amount of data within a couple of years or so.
After that, I have been careful to always buy only SSDs with the highest write endurance available on the market. I have not worn out the others yet, but those that have been used for many years show in their SMART counters that a large fraction of the permissible amount of written data has been reached, and not much remains until their end of life.
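(For anyone who wants to check where their own drive stands, here's a minimal sketch reading smartctl's JSON output. It assumes smartmontools 7+, an NVMe drive, and root privileges; the device path is just an example.)

```python
"""Rough SSD wear check via smartctl's JSON output.
Assumes smartmontools 7+ and an NVMe drive; run as root."""
import json
import subprocess

DEVICE = "/dev/nvme0"  # example path; adjust for your machine

# check=False: smartctl encodes warnings in its exit status even on success
proc = subprocess.run(["smartctl", "--json", "-a", DEVICE],
                      capture_output=True, text=True, check=False)
data = json.loads(proc.stdout)

health = data.get("nvme_smart_health_information_log", {})
used = health.get("percentage_used")       # % of rated endurance consumed
units = health.get("data_units_written")   # 1 unit = 512,000 bytes per the NVMe spec

if used is not None:
    print(f"Rated endurance used: {used}%")
if units is not None:
    print(f"Total data written: {units * 512_000 / 1e12:.2f} TB")
```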
My point is the vast majority of desktop users do not write much. My experience is not just my own computers but the network of extended family and friends' small businesses I'm the informal tech support for. This includes video editors who edit large video files.
I do wear out SSDs but they're on servers I run with different use patterns.
It's really a shame. My last "favorite" MacBook was from 2013, where everything was upgradable. I bought the fastest Core processor with the lowest everything else and upgraded to 16GB of RAM, an SSD (granted, at SATA speeds), and a second data drive in the optical drive bay. What luxury!
Hah. Yes. A couple of drops of water on an MBA... laptop worked fine. Battery, fine. Healthy. Charging circuit would not work. Perfectly functioning laptop on AC, but unable to run on battery because it read 0% charge.
Me, at Genius Bar, expecting you know, maybe $300 with parts and labor?
"Here's a quote, sir, we're looking at $850+tax, perhaps we should talk about getting you into a new Mac today?"
No. The laptop was primarily connected to AC anyway and only 18 months old, if that. Sorry, Apple.
Same here. Bought an Apple Keyboard a long time ago. Spilled some juice on it. Some keys stopped working. That's when I learnt that a $200+ Apple Keyboard isn't even water-resistant, unlike the previous $25-50 keyboards I had. That was the first major red flag about Apple for me. The soldered RAM and SSDs, and locked bootloaders on the Mac, were the last straw. Will never purchase an Apple device again.
S**, I haven't felt much urge to upgrade from my 16GB M1 Air and I even use it to play some Windows games under Crossover. Quite possibly the best laptop I've ever owned.
It's always come in handy for containers/VMs (and I assume compiling Rust, as it uses as much of every other resource as it can get its hands on), but yeah, being able to run actually useful local LLMs on my now >4 year old machine has been fantastic.
Public information seems to describe the M4 GPU as mostly a performance-oriented refresh of the one from M3. M5 has brought bigger changes, not least neural/tensor accelerators on chip.
I just like the build quality, and they are reaching the 200€ threshold on the used market. I bought one with 16GB RAM and a small black strip on the side of the screen (doesn't bother me) for 230€ last week.
>they are reaching the 200€ threshold on the used market.
Where?! I just checked the used market in Austria and 2020 M1s go for at least 350 for the 8GB RAM models and 450 for the 16GB model. Your 230 for the 16GB one feels more like a rare exception than the norm everywhere.
Damn that's lucky. I checked facebook marketplace in Austria and prices are double that of what you're showing, even on Intel macbooks, there's no M1 macs for 200 Euros, only 400 Euros and up. Same on Vinted. No 200 Euro M1s, only at 2x the price.
The ones I saw priced similarly low to yours are obvious scams, from scam profiles all repeating the same message in the ad.
So maybe the ones you saw are scams as well. Otherwise Hungary seems to be a lucky exception for some odd reason. Maybe because people have less disposable income, IDK?
Anyway, I wouldn't spend 400 Euros on a used mac with no warranty. The point of buying an old ThinkPad for cheap was that if something broke on it you could easily swap that part yourself for cheap because it was easily repairable and the used market was flooded with spare parts. But if your used macbook dies out of warranty, then you're shit out of luck, you can't fix anything, it's 400 Euros wasted.
I bought one already so I know it's not a scam. Scams usually communicate badly and they don't want to meet you in a public space (like a McDonald's with free wifi)
Obviously ymmv
>Anyway, I wouldn't spend 400 Euros on a used mac with no warranty.
This I agree with. I still prefer Thinkpads too but these M1s are also pretty good in almost every sense except for repairability
Apparently there are changes to boot that are more or less understood, but they require some heavy work to handle.
Basically, starting with the M4 you have a choice: either boot with Apple's page table monitor already running in their guarded mode extension, or boot with all Apple extensions disabled on the CPU cores.
In fact, the current state of M3 support is about where M1 support was when we released the first Arch Linux ARM based beta; keyboard, touchpad, WiFi, NVMe and USB3 are all working, albeit with some local patches to m1n1 and the Asahi kernel (yet to make their way into a pull request) required. So that must mean we will have a release ready soon, right?
I often wonder whether the folks at Apple have the Asahi team on their radar. Are they in awe of the reverse engineering marvels coming out of the Asahi project, or are they indifferent to it?
I think the Apple dev who was responsible for the bootloader (or some security chip) having the option to be opened posted on Twitter a while ago. Pretty sure he implied or mentioned that this was what he was hoping for.
In fairness, that decision predates the existence of Asahi. Unless Apple was working with the community before Apple Silicon dropped, this isn't much of an acknowledgement of their existence.
They are aware. They are also aware of the designs sitting in the cabinet right next to them in Cupertino that would make all the reverse engineering instantly unnecessary.
Such a monumentally Sisyphean waste of effort on the part of the Asahi devs, in my opinion.
If you care about personal computing or Linux, don’t buy a Mac.
I'm sure Apple has data showing that their extreme lockdown strategy is good for their business, but I feel like I'm one of the potential customers Apple could gain if they didn't have it.
They're a fantastic hardware company. But my admittedly very limited experience with Apple software, from the iPad to their streaming service website, has been miserable. The UX doesn't work for me; the software just doesn't do what I want. Understandable: Apple very much designs their software around a particular workflow they come up with, and if you like that workflow it's great, but for someone like me it's miserable. I would gladly buy their hardware if I could freely run an OS of my own choosing.
I doubt that any company actually cares, at the C-suite level, about what any of the myriad metrics they collect mean. I mean, "maybe" they do; I just think it is unlikely. I bet 9/10 times someone just makes a decision about how things "ought" to be and then that's the way it is going forward.
The assumption that this is a triangulated and well-researched strategy doesn't match my experience in the "real-job" world. I mean, maybe Apple is different because of their history, but I am not convinced anyone listens to anyone who articulates any math ideas beyond algebra outside of some niche specialties, because they don't understand it. And it's not that I'm some math god - I mean, that's what I studied, but there are people SO much more knowledgeable and capable, and they seem to get ignored too.
Like, I'm sure the guy who runs an insurance company listens to the actuaries about relative risk, but mostly what I've seen is someone makes a decision, and then finds post hoc ergo propter hoc rationales for why this was a good decision down the line when they have to account for their choices.
For instance, it took me like a year at my old job, but I finally got most of the KPIs we were using to set strategy cancelled. The data we were using to generate those KPIs? Well, in a few cases, after you seasonally differenced it, the data was no different from white noise. No autocorrelation whatsoever. In ALL the cases the autocorrelation was weak, and it all evaporated after a month or two. You could MAYBE fit an MA model to it, but that seemed dodgy to me. And like, I'm not a major expert - I took one time series class in grad school, and frankly, time series is kind of hard. But management had ZERO idea of what I was talking about when I was like, "hey, I don't think these numbers actually mean anything at all? Did anyone run an ACF?"
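(For the curious, the check described above is only a few lines in Python with statsmodels. This is a sketch on synthetic stand-in data; `kpi` here obviously isn't the real series.)

```python
"""Sketch of the white-noise check described above: seasonally difference a
monthly KPI, then test whether any autocorrelation survives."""
import numpy as np
from statsmodels.tsa.stattools import acf
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
kpi = rng.normal(100, 5, size=60)  # placeholder: 5 years of monthly numbers

diffed = kpi[12:] - kpi[:-12]  # seasonal (lag-12) difference

# Autocorrelation function with a 95% confidence band
values, confint = acf(diffed, nlags=12, alpha=0.05)

# Ljung-Box: large p-values at every lag mean you can't distinguish
# the series from white noise
lb = acorr_ljungbox(diffed, lags=[1, 3, 6, 12])
print(lb)  # if lb_pvalue is large, the month-to-month movement is just noise
```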
Then each month someone higher up the chain would say, "why is this number low?" And then they'd go out and search through the reams of data they had to come up with an answer that plausibly explained things. Was the number particularly "low"? No, it was within expected statistical noise thresholds; you're probably going to have at least one number out of whack every 20 cycles or so... You still had to spend an hour in a meeting coming up with reasons for why it was low that went beyond "ummm, well, this is kind of random, we'd expect to see this sort of thing once or twice every couple of years, and we won't know if it's a trend for a few more months."
Anyway, this is a long anecdote to explain why I have no confidence that most companies do any sort of actual introspection. CEO creates targets and underlings build models that show how they're meeting or not meeting those targets. Now, hilariously, with Apple in particular I might be wrong, because in Tim Cook's defense, I'm pretty sure his education is in Industrial Engineering? So if any CEO is thinking about that stuff, it's him. Still, I am totally and completely unimpressed with the C-Suite sort of thinkers.
They're not dumb - like I've never really had a straight up dumbass manager outside of shitty lower jobs or small-mom-and-pop businesses? But I have seldom met any company that actually cared about the numbers - most say they do, but most just use those numbers to justify decisions they've already made.
Am I just unlucky? Or am I the witch in church here?
The environment is why I quit my job and started working for myself in January. I hated it. And not to sound like an arrogant ass because there were a LOT of way smarter people than me at $PREVIOUS_EMPLOYER, but having to have meetings to set our meetings, having to explain things that aren't statistically meaningful to people who don't understand stats anyway, and getting code reviews (when I could get them scheduled) from dudes who hadn't touched a keyboard in 5 years was... soul sucking? I'm not doing that anymore. Or ever again.
I mean, maybe it's because I had a more hands-on, blue-collar-adjacent job before I got into tech? Maybe it's because I'm a fool and couldn't play the game of "pretend to work and look busy." But - and I know this might be kind of messed up - I really like not having to explain things in a series of emails to people other than the customers. I really like not having to answer to anyone but myself and my customers. If I want to do something, well, I just do it now? That's a nice place to be. Riskier for sure, but I think the prior environment would have killed me, so maybe not.
Also, I have time to do shit that's interesting? Who would have guessed how much more time I'd have in the day when I didn't have 4.5 hours of meetings per day? Hell, I'm taking 2 classes at the university for fun (weird, right?!) - I never could have done that before, because I would have had to make a slide deck for Thursday's all-hands or whatever and couldn't have missed the SUPER IMPORTANT MEETING that Jake has on the schedule that he'll show up for unprepared or just not show up to.
As opposed to what hardware, then? Because this is pretty much how most other drivers became a thing in the first place. Linux has come a long way, and because it "won the cloud" many hardware vendors started properly supporting it, but this was absolutely not the case for the longest time.
Maybe I was lucky, but I never had any serious problems in Linux with any of the many Dell, HP and Lenovo laptops that I have used during the last 2 decades.
The most serious problem that I had was about 10 years ago on a Lenovo laptop with NVIDIA Optimus (i.e. where the NVIDIA GPU does not have direct video outputs but must pass through the Intel GPU). At that time, I spent a couple of days before succeeding in making NVIDIA Optimus work OK in Linux. With the Intel GPU, Linux worked fine from the beginning. This happened because at that time the Linux NVIDIA driver did not support Optimus, so you had to install a separate program to be able to select which GPU an application should use. I do not know if any laptops with Optimus still exist today.
Except for that case, I never encountered any hardware compatibility problem that could not be solved in minutes, or a few hours at most. For contrast, with Windows I have seen many problems that could not be solved in weeks, even with the assistance of IT support personnel from multiple continents, because nobody, not even the "professionals," had any idea about what Windows is really doing and what may be wrong.
It is true that some of the laptops that I have used had a few features that I have never used, so I do not know if they worked in Linux. For instance I have never used a fingerprint reader or a NFC reader.
If you want an ARM CPU, there are now a few single-board computers with a quad-core Cortex-A78 CPU in the "Qualcomm Dragonwing QCM6490" SoC (similar to a Snapdragon from the flagships of 2021), which run circles around the Raspberry Pi and the like.
There are also older NVIDIA Orin SBCs with Cortex-A78, but those are severely overpriced, so they are not worthwhile, unless you really want to use them in an automotive project.
For software development, the Arm-designed cores have the advantage of excellent documentation, unlike the proprietary cores designed by Apple and Qualcomm, which are almost undocumented. Good documentation simplifies software debugging and tuning.
Unfortunately there are no cheap solutions for developing on the latest ARM ISA variants (except for a Chinese Armv9.2-A CPU, which has some quirks and is available in mini-ITX and smaller formats). For the latest ISA, you should develop software on a smartphone, e.g. on one of the Motorola smartphones that have DisplayPort for connecting an external monitor and a desktop mode for Android.
You don't need an ARM processor; many modern x86 chips match or outcompete the M series on power efficiency and performance. Mainly Lunar Lake gen 1 and the new gen 3 (Arrow Lake, not really).
The efficiency of ARM chips was never really about ARM; it was the manufacturing node and the SoC design. Well, Intel and AMD can make SoCs, and they do.
There are reasons beyond pure power efficiency to use ARM processors. It is a nice architecture to work with, especially if you plan to write low-level code. Also, you might want to deploy on ARM servers.
Also, there is the question of who in general makes laptops as nice as a MacBook Air. Who makes a fanless laptop of roughly comparable power?
If you're writing true low-level code then you're most likely doing it for performance reasons, like ffmpeg. But ARM doesn't have the instruction set to make the best use of that; x86 does, with its extensions. Otherwise, the compiler handles translation, so there's just no reason for you to care about the assembly unless you're writing assembly.
As for nice laptops, I think Asus and Lenovo make some nice ones. I don't believe any are fanless, but most are quiet - Lunar Lake gen 3 is an SoC with a base TDP of 25 watts, and it can even go down to 15 watts. These CPUs are slightly faster in multi-core performance than the M4, and they use similar wattage. I believe the Asus Zenbook Duo gets better battery life by a wide margin because of its 99 watt-hour battery. They still fall a little short of the M5 in performance, but it's very close.
As for servers, it's a good point. But I think currently most servers are still using x86 CPUs, so it might not be relevant for a while.
The Qualcomm laptops have various problems with Linux that have not been solved yet.
You have much better Linux support for an older Snapdragon from 2021 (with four Cortex-A78 cores) which has been rebranded as "Dragonwing QCM6490" and which is sold by Qualcomm for use in embedded computers. Thus Qualcomm promises at least 10 years of support for it.
There are a few cheap single-board computers with it, e.g. Particle Tachyon 5G and Radxa Dragon Q6A.
Unfortunately, "cheap" means something very different today than last summer, due to the huge increase in the price of DRAM. Nevertheless, the SBCs with soldered LPDDR memory have been affected less by the price increase than the computers for which you have to buy SODIMM or DIMM memory modules, which may cost now more than a mini-PC in which you would want to install them.
There are millions of Macbooks out there that will be out of MacOS support one day. If this project diverts just a fraction of them from becoming e-waste for a little, it will be a win.
And then beyond that, there is simply no laptop manufacturer that meets the quality of Apple's hardware design. I like Macs for their hardware; the software is a compromise. A Linux MacBook would be my ideal laptop.
Maybe so, but 15-20 year old laptops are definitely starting to show their age.
An M2 MacBook Pro, on the other hand, is only 4 years old, has a fairly OK keyboard, and is still in striking distance of current high-end ultrabooks when it comes to performance.
The only thing my X230 struggles to do is run LLMs locally. My needs are simple, and I think normal people (i.e. probably not most people on this site) don't have needs that are any more demanding than mine.
Granted, this is running GNU/Linux rather than Windows. If you're running Windows then yeah, they show their age.
I think an X230 would be performant enough for 95% of the things I do, but a 14-year-old CPU is going to have pretty terrible battery life for anything more than very light usage. And things that would be light usage on a recent PC, like watching video encoded with a modern codec, would be fairly taxing on an old CPU with no hardware decode.
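(If you want to check what your own GPU can hardware-decode, here's a rough sketch via VA-API. It assumes a Linux box with libva-utils installed, and matches codec names as plain substrings.)

```python
"""Quick check of which codecs the GPU can hardware-decode through VA-API.
Assumes libva-utils is installed (vainfo is its CLI)."""
import subprocess

proc = subprocess.run(["vainfo"], capture_output=True, text=True)
text = proc.stdout + proc.stderr  # libva splits its output across both streams

# Decode support is advertised as "VAProfile<codec>... : VAEntrypointVLD"
for codec in ("H264", "HEVC", "VP9", "AV1"):
    supported = any(codec in line and "VAEntrypointVLD" in line
                    for line in text.splitlines())
    print(f"{codec}: {'hardware decode' if supported else 'not supported'}")
```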
True. By the time I upgraded from my X200 (fantastic machine, noticeably outdated), the lack of software support for hardware decoding H264 was noticeable. Also being stuck with OpenGL 2.1 isn't the best either.
I don't know what I'll do if and when my X230 stops being sufficient. If I could buy an Apple motherboard in an X200 chassis I'd do it in a heartbeat.
Congrats, but I think you may be in a small minority when it comes to developers shopping for laptops.
Personally, I had to upgrade from a late-model i9 MacBook Pro to this M2 MacBook Pro, because the npm + docker setup at work was taking upwards of 20 minutes for a production build...
>The only thing my X230 struggles to do is run LLMs locally. My needs are simple, and I think normal people (i.e. probably not most people on this site) don't have needs that are any more demanding than mine.
People who edit video or make music and other such tasks are totally normal too, and there are hundreds of millions of them
I think maybe you don't understand what normal people's needs are. It's only partially about what software they run.
I recommend Macs to the people in my life because when they have a problem they can take the machine to the Apple Store in the mall. Or if they want to understand iPhoto or Pages better, they can go to the Apple Store and take a class. They like Apple laptops because they look nice, they feel great, sound amazing (for a laptop), and have excellent battery life.
Like you, I have a ThinkPad (a P-something) and, frankly, it kind of sucks. It's all plasticky, it flexes, battery life is a joke, the trackpad is meh, and the fans are almost always running. I do like the keyboard, though (I'm a fan of backspace).
The problem is that no one else is currently making hardware that is competitive with Apple silicon. Apple is the only one offering both performance and battery life.
I love my Thinkpads, I really do but they are bulky, loud and the battery doesn't last very long. They are not an option for many people.
1. Asahi Linux's battery life is like 2/3 as long as on macOS
2. The Thinkpad X1 Carbon is just about as thin and nice as a Mac but it also costs just as much.
3. Apple is still leading in single core CPU speeds but x86 has caught up or surpassed M devices in both multicore and graphics. And even last gen x86 can beat the 3-generations-old M2 that is the latest one supported by Asahi Linux.
Re: 3 - only because they've only released the base M3 so far; once they release Pro/Max configs they'll easily regain the lead, as can be seen from the single-core dominance.
Intel Lunar Lake gen 1 and 3 are competitive with the M series in both battery life and performance. Qualcomm makes ARM chips that are somewhat competitive too, but you run into similar problems as with the M series. x86 chips can most definitely reach the efficiency of the M series.
If I have learned one thing, it's that current corporate strategy is no guarantee for the future. If you want to purchase a laptop now and want a great Linux experience, then the M2 is a great option. But don't assume that M(n+1) will ever get support.
This reasoning is essentially just as true for any other laptop maker: Dell, Lenovo, Asus, Framework, HP, etc. might also decide to abandon Linux support at any time.
> Apple allows booting unsigned/custom kernels on Apple Silicon Macs without a jailbreak! This isn’t a hack or an omission, but an actual feature that Apple built into these devices. That means that, unlike iOS devices, Apple does not intend to lock down what OS you can use on Macs (though they probably won’t help with the development).
IIRC Marcan mentioned something he found that had been deliberately put into the Mac boot loader that made booting alternative operating systems easier and perhaps making it possible altogether.
That's apparent enough from the fact that you don't need a jailbreak exploit to boot non-Apple-signed kernels on a Mac, unlike iPads with exactly the same silicon. They are intentionally configured differently.
iPads are cheaper than MacBooks and more popular. They'd prefer you buy another one instead of using it indefinitely. The same with smartphones. The answer always has been: I like money!
Yes, because if they didn't, the fact that macOS doesn't lock you down in a sandbox means it's more likely the entire boot chain would've been jailbroken - a boot chain shared for the most part with iOS devices, where Apple 100% does not want "jailbreaks," because those mean App Stores that they don't get to take 15-30% from.
The happy path on the Mac was provided so the talent capable of booting Linux on it could take the happy path that hides all of the stuff Apple would rather not have a bunch of reverse engineers sniffing around.
I don't own Apple hardware, but just reading through this made me appreciate how gifted these souls are, doing god's work. I hope they succeed in upstreaming their contributions and achieve first-class Linux-on-ARM support.
I was watching Blade Runner last night, specifically the part where Ford is zooming in on the photograph using voice commands.
Above the display is an amber horizontal bar that changes in sync with the activity on the display and my first thought was, "Finally they found a use for the Mac Touch Bar!"
The Touch Bar has so many uses in Linux I can't wait for it to work.
My favorite feature of the Touch Bar was that, if memory serves, force push was right next to cancel in one of the IDEs; can't recall if it was Xcode or IntelliJ.
If your design language is “flat as we can make it” how can you visualise a third dimension? You have to already know which things are 3D touch ready.
I blame the software refresh of Apple after the 5-series UI language was removed. Minimal mechanical design with rich complex software is a beautiful contrast that strengthens how both feel.
Yeah it could have been useful but I feel like they nerfed it from the start. Still wasn’t a big fan.
I was hoping it was a tease for a fully software defined haptic feedback based keyboard. There’s the obvious usefulness and coolness of that, and then the fact that you could make a laptop closer to the sealed clean-ability of a phone. Probably not quite submersible/waterproof due to ports and fans but able to survive a spill and be cleaned well.
The main thing that held me back from using Asahi on my M2 MacBook Air was the missing external display support. If I read TFA correctly, that should now work with a custom kernel.
If that's true - I'd say MacBook air M2 is probably the new sweetspot - depending on how cheap you could get an M1.
My impression is that until now, MacBook air M1 was the sweetspot.
It would be nice if projects like this were more willing to take on paid devs and accept regular payment. Have some sort of a subscription? I don't say that because my money is burning a hole in my pocket, but because I want to see more hardware support and stability. More testing, QA, bug hunting.
With the attention this project is getting, I'd be surprised if they can't get the equivalent of a small startup's seed round just by crowdfunding. Do they have all the funding and resources they need or not? That's really my ultimate question. I know you can't always just throw money at these things and make them happen faster.
For me the lack of Thunderbolt is a showstopper; when it's supported, a lot of needed peripherals will be supported automatically. They have apparently been working on TB support since the M1 was released 4 years ago.
To be frank, TB support on Linux in general is kinda crap. I'm not surprised this might take them a while, and I'm sure it's lower priority compared to other things on the road map
This is amazing work, and I certainly respect the talent of those involved.
That said, my question to those interested is why? I've been a daily user of both Ubuntu since 2005 and Mac since 2012. There are some edge case differences but for the most part they are so similar that I nearly always run the same code on both without modification. Clearly I'm missing something important but I'm curious what it is. Thanks in advance.
One reason is for continued software support after Apple drops macOS compatibility for your machine. Intel Macs could be patched to run newer, unsupported OS versions (essentially hackintoshing your real Mac) but from my understanding that's basically impossible for Apple Silicon Macs.
Apple makes some of the best (if not the best) hardware around. It makes sense that some people would want to buy a MacBook Pro for its hardware and run their favorite OS on it.
This is incredibly impressive and also quite sad. Six years later, we have a very-nearly-right kernel for the M1.
Apple is launching the M5. It seems like the future is going to be a world of closed systems and custom silicon, with any free software lagging far behind.
M1 and M2 hardware isn't going anywhere. They're still great machines. And progress will be faster once the project finishes getting their code merged upstream into the Linux kernel and distros. They have a first alpha of M3 support ready; they're just refraining from releasing it in that state because they're so busy with everything else they're doing - a key difference compared to when they first came out with alpha support for the M1.
I'm sad because the WiFi on my M1 MBA is dying/dead and there's no way to replace it without swapping the whole "logic board." Apple also doesn't support any USB WiFi adapters in recent versions of macOS, so it's now tethered to a wired network connection. I'm just waiting for the M5 refresh to hit at this point. Anyway, all that's to say that at least some M1 hardware is going to the trash heap soon :(
To add to that, they'll continue to be great machines running Linux even after Apple has bloated macOS to death. Tahoe has made my M1 MBP feel significantly less snappy for no good reason.
Most of my software development career was spent working at a small company that sold a product that emulated the operating system developed and sold by a much, much larger company. The work was interesting and when you had a breakthrough or a small victory, it sure felt good. The challenge of keeping up was exhilarating and kept folks motivated to keep pressing forward.
But eventually it wears you down. It's nearly impossible to keep up in the long-term. Normal product evolution, the sheer size of the behemoth and sometimes even malice on their part to thwart the little guy make it really tough to stay current.
Think of Wine vis-a-vis Windows. They will never catch up.
Except they did with Wine, in a way. They got to the point where a sufficient number of third-party software developers target the common base between Wine and Windows (Steam/Proton), electing to have broader compatibility rather than chasing all the newest Windows-only APIs.
I wonder how much similar behavior influences other buying choices. I've been eyeing an upgrade from my M1 for a while - so far punting on it, mostly because of Asahi.
I guess I wasn't aware that Wine pivoted from trying to be a general purpose, drop-in replacement for Windows to being a platform for games that only supports a subset of Windows functionality.
It's much more difficult to keep current and support the full functionality of a much larger competitor's offering when you have to support everything. In my experience it was an all-or-nothing proposition. Either you emulated it 100% or you had nothing. I think Asahi is more in this realm than Wine, maybe. It really needs to support all the hardware, 100%, or its value is greatly diminished.
> I guess I wasn't aware that Wine pivoted from trying to be a general purpose, drop-in replacement for Windows to being a platform for games that only supports a subset of Windows functionality.
Or „just enough” for the subset of users that is „enough” to ensure product viability. The absolutism of „all or nothing” is rooted in the strictly-better mentality for replacing something.
For Wine/Proton, the core demographic is essentially gamers, who tend to overlap heavily with the engineering population later on, and are thus a core population for Microsoft to capture and retain. Once Steam removed that vendor lock-in, the corporate discussion became more flexible.
For Asahi (proud Asahi user for 4y now), the added value of „most powerful Linux/Arm64 laptop on the market” outweighs the few things that don’t work on Asahi (HDMI out is probably the only one that occasionally matters for me, but screencasting works well enough). Yes, there are gaps, but they are smaller than things from Linux that are missing on OSX or Windows for me.
There was a PR for Adobe products a few weeks ago (https://github.com/ValveSoftware/wine/pull/310), though it seems like they're redirecting it to the main Wine repo now since it makes more sense there
First you have to convince consumers to give up on supported systems that work for build-it-yourself development kits with (lots of) caveats, which is why the year of the Linux desktop still hasn't arrived.
At least Apple usually supports their hardware for ~7 years so that's plenty of time to get Asahi working on newer Ms. I don't care too much about getting instant support but I definitely care about having the option to use my hardware more than 7 years
And that's with Apple deliberately leaving the bootloader open on Macs. If they had locked it down like they do on every other product then it would be even more of a struggle, and there's always the looming possibility that they'll just change their mind with future models.
This is my main concern. I applaud the effort that the Asahi team is doing, but there's no way that I would rely on a small team of inarguably passionate and talented hackers to maintain a system that uses reverse engineered software running on hardware manufactured by a company historically opposed to everything they're doing, even if they left this small door open for that.
It would be like going back to the days of early Linux and all the Windows-specific hardware we had to deal with, but extrapolated to the entire system. As impressive as all of their work is, it's not worth the IMO minor UX benefits of Apple's hardware.
Mainline Linux on ARM is solid these days; new x86 chips from Intel perform very well and are reasonably power efficient; and battery life of most professional laptops in Linux is quite good. For example, I get a good ~12 hours of work done on an X1 Carbon Gen 13 from a single charge. This may not be as impressive as Macbooks, and the packaging certainly isn't as sleek, but it's good enough for me. The tradeoff for a solid software experience, modulo the usual Linux shenanigans, is worth it to me.
Thankfully, hardware progress is relatively slow in a way that makes the M1 still a perfectly capable machine. Maybe we'll have a future of "flagship community devices" where only one of every X is chosen as the supported option.
The problem is, many "nerds" have few other options in hardware. Which ARM laptop would be the alternative? Even if you allow x86, which ones are as nice as the MacBooks? Then there is the problem that not everyone wants to be Linux-only in their setup. I definitely prefer macOS to W11.
I've never bought any Mac, but I've been issued several at jobs. At one company, I scandalized the Apple fans by wiping OSX and installing Linux on a 10th gen Macbook Pro.
So all of the nerds that care about open platforms stop buying their systems. Everyone else is going to keep buying them, and they outnumber us massively.
And then there's the fact that it's still a dark ending if the best hardware out there — even if we all refuse to buy it because we're on a moral high ground — is a closed platform that we have to refuse to buy.
Will the gap remain just as big once earlier architectures are fully covered? I would expect some inertia bringing positive feedback in the development loop.
I wonder how this will be impacted by the Steam Frame et al.
Apple is probably the most mainstream supplier of ARM computers at the moment, but Valve is likely to soon have the most mindshare amongst ARM-shippers who actively support Linux. I expect that will improve ARM support in the ecosystem, which should be good for Asahi also.
For me, it has been ready as a daily driver for more than a year. Battery life is shorter than macos but still long enough that I don't have to think about it (which I can't say about any x86 laptops, even when they use iGPUs).
The notable missing features are external displays (an experimental kernel branch is publicly available though) and the fingerprint sensor. That's about it, though. Given the amount of polish combined with the hardware, it's arguably the most polished Linux laptop experience you'll get.
Anyone know if the M4 GPU support has improved? I might not use Asahi "today" but I'm sure the day will come when my M4 laptop will be deemed "vintage" despite being perfectly fine hardware, and on that day I am most likely to flip over. Hopefully by then the GPU support is much richer.
Couldn't Apple make a change that renders all this work a waste of time? E.g. lock other OSes out of booting. I applaud the effort, but given macOS is already a capable Unix, I don't see the rewards being worth it.
macOS is a capable UNIX, but it's not Linux - which has since become the standard platform for most cloud/web/ML development.
As a developer myself who uses Fedora Asahi Remix as my daily driver, I can also tell you that Linux runs 2x faster (often much more) for everything compared to macOS - on the same hardware! And that performance gain is also important for my work :-)
Totally. I have a Minisforum PC running Void Linux that I ssh into from my MB Air. My worry is hitching your wagon to a project that could stop working one day through no fault of the devs.
They could, which is ultimately why we need regulators in the EU to preemptively ban those kinds of anti-repair, anti-ownership business decisions.
Consumers should be allowed to install whatever software they want on devices they own.
Just came to praise the seven heroic souls who are working on this wonderful project, which I'm just waiting to install on my M1 once battery life meets my needs. And may I ask why, with all the crypto and tech money floating around, Asahi doesn't have a fully funded staff of fifty people?
Well, I wish I could contribute something that could help, but I am homeless, living with schizoaffective disorder, and the inflation and wealth inequity caused by the tech industry has left my life on a razor's edge when it comes to money.
But I would gladly pledge 1% of my monthly income to match anyone here making over $500k a year who pledges the same 1%. That would be my $20/month vs. their $416/month.
If “long running agents” have any use, this feels like the kind of project where they should be put to work. You have an oracle implementation (macOS) and a well defined problem with lots of grueling trial and error.
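(A minimal sketch of what such a loop might look like. Everything here is hypothetical: the two functions are stand-ins for submitting the same work to macOS as the oracle and to the driver under test.)

```python
"""Hypothetical differential-testing loop against an oracle: feed the same
random input to the reference implementation and the reverse-engineered one,
and flag any divergence for a human to look at."""
import random

def run_on_oracle(cmd: list[int]) -> list[int]:
    # stand-in for submitting a command buffer on macOS and reading results back
    return [x * 2 for x in cmd]

def run_under_test(cmd: list[int]) -> list[int]:
    # stand-in for the implementation under test; hides a bug at a boundary value
    return [x * 2 if x < 255 else 0 for x in cmd]

random.seed(0)
for trial in range(10_000):
    cmd = [random.randrange(256) for _ in range(8)]
    expected, actual = run_on_oracle(cmd), run_under_test(cmd)
    if expected != actual:
        print(f"divergence on trial {trial}: input={cmd}")
        print(f"  oracle={expected}  under-test={actual}")
        break
```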