Asahi is awesome!
But this also proves how much laptops outside the MacBook realm still need to improve. I wish there were a Linux machine with the hardware quality of a MacBook
* x86 chips can surpass the M-series CPUs in multithreaded performance, but still lag in single-threaded performance and power efficiency
* Qualcomm kinda fumbled the Snapdragon X Elite launch with nonexistent Linux support and shoddy Windows stability, but here's hoping that they "turn over a new leaf" with the X2.
Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great: there were some weird regressions, and newer chips have caught up anyway [1].
On the build quality side, basically all PC makers still lag behind Apple; e.g., yesterday's rant post about the Framework laptop [2] touched on a lot of important points.
Of course, there are the Thinkpads, which are still built decently but are quite expensive. Some of the Chinese laptops like the Honor MagicBooks could be attractive and some reddit threads confirm getting Linux working on them, but they are hard to get in the US. That said, at least many non-Apple laptops have decent trackpads and really nice screens nowadays.
I have no faith in Qualcomm to make even basic gestures towards the Linux community.
All I want is an easy way to install Linux on one of the numerous Snapdragon laptops. I think the Snapdragon Thinkpad might work, but none of the others really do.
A $400 Arm laptop with good Linux support would be great, but it's never ever going to happen.
Google has previously delivered good Linux support on Arm Chromebooks and is expected to launch unified Android+ChromeOS on Qualcomm X2 Arm devices in 2026.
My personal beef with Thinkpads is the screen. Most of the thinkpads I’ve encountered in my life (usually pretty expensive corporate ones) had shitty FHD screens. I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.
Old Thinkpads are great! I used to have a Lenovo Thinkpad X1 Carbon Gen 6 with Intel Core i7 8640U, 16 GB of RAM, and 1 TB SSD. I installed Arch Linux on it with Sway.
The keyboard and body are not bad at all - rather, they're best in class, and so is the rest of the hardware. It is a premium hardware experience, and has been since Jony Ive left, which is what makes the software so disappointing.
Excellent power efficiency from Apple silicon: good battery life and good performance at the same time. The aluminum body is also very rigid and premium-feeling, unlike so many creaky, bendy PC laptops. Good screen, good speakers.
Aluminum and magnesium non-Apple laptops are just as stiff. There's just a wider spectrum of options available, down to $200 plastic ARM Chromebooks.
> "I never understood why people claim the Macbook is so good."
Apple's good enough for the average consumer, just like a 16-bit home computer back in the day. Everyone who looks for something bespoke/specialized (e. g. certified dual- or multi-OS support, ECC-RAM, right-to-repair, top-class flicker-free displays, size, etc.) looks elsewhere, of course.
I believe there are a few all-metal laptops competing in the marketplace, but I was unaware they were actually better than the Apple laptops ... which all-aluminum laptops are better, and how are they better?
Strawman. Because Apple designed it well. Metal’s not an issue. My legacy 2013 MacBook Air still looks and feels and opens like new.
I was looking at Thinkpad Auras today. There are unaligned jutting design edges all over the thing. From a design perspective, I’ll take the smooth oblong squashed egg.
Every PC laptop I’ve touched feels terrible to hold and carry. And they run Windows, and Linux only okay. Apple MacBooks are a long mile better than everything else, and so I don’t care about upgraded memory — buy enough RAM at purchase time and you don’t have to think about it again.
Memory upgrades aren’t priced super well, granted, but I could never buy HP, Dell, or Lenovo ever again. They’re terrible. I’ve had all of them. Ironically the best device I’ve had from the other side was a Surface Laptop. But I don’t do Microsoft anymore. And I don’t want to carry squeaky, squishy, bendy plastic.
Most of all, I’m never getting on a customer support call with the outsourced vendors that do the support for those companies ever ever ever again. I’ll take a visit to an Apple store every day of the week.
I’ve never heard someone describe the aluminum body as bad.. what do you not like about it?
The number one benefit is the Apple Silicon processors, which are incredibly efficient.
Then it’s the trackpad, keyboard and overall build quality for me. Windows laptops often just feel cheap by comparison.
Or they’ll have perplexing design problems, like whatever is going on with Dell laptops these days with the capacitive function row and borderless trackpad.
Looking at a Thinkpad 16" P1 Gen 8 with 2X 1TB SSD, 64GB RAM, QHD+ screen, centered keyboard like MBP (i.e. no numpad), integrated Intel GPU, lightweight (4 lbs) for a little under $2.5K USD.
Closest I've found to an MBP 16" replacement.
Have been running Dell Precision laptops for many years on Linux, not sure about Lenovo build quality and battery life, but hoping it will be decent enough.
Would run Asahi if it supported the M4, but it looks like that's a long way away...
I am giving my MacBook Air M2 15” to my wife and bought a Lenovo E16 with a 120Hz screen to run Kubuntu last night. She needed a new laptop, and I have had enough of macOS and just need some stuff to work that will be easier on Intel and Linux. Also I do bookwork online, so a bigger screen and dedicated numpad will be nice.
It reviews well and seems like good value for money with current holiday sales but I don’t expect the same hardware quality or portability just a little more freedom. I hope I’m not too disappointed.
https://www.notebookcheck.net/Lenovo-ThinkPad-E16-G3-Review-...
I outfitted our 10 person team with the E16 g2 and it’s been great.
Two minor issues: it’s HEAVY compared to the T models.
Because of the weight, try not to walk around with the lid up while holding it by one of the front corners. I’ve noticed one of ours is kind of warped from being carried around the office that way.
That’s great news thanks. I got the gen 3 so maybe some improvements. Weight is ok as I really just move it around the house. I buy used Panasonics for the workshop.
Been a kubuntu user since .. 2006? 2007? Don't remember when kubuntu became a thing, but as soon as I tried Ubuntu, I went kubuntu. I believe it was 5.10 or 6.04 or something. :-)
Am growing tired of Ubuntu though. Just not sure where I should turn. I want a .deb based system. Ubuntu is pushing snaps too heavily for my liking.
I liked Ubuntu and its variants back when it first came out and I was newer to Linux, but it didn't take long for me to realise there always seemed to be a better option for me as a daily driver. To me it's like a new-Linux-user OS, where a lot of stuff is chosen for you to use basically as-is. Even the name gives it away: the K in Kubuntu is for KDE, whereas on other distros you would just choose your DE when you install.
I agree. It feels like a combination of peak Windows UI with the ease of Ubuntu baked in. And the little mobile app they have that gives you a shared clipboard with iOS is cool.
> I wish there were a Linux machine with the hardware quality of a MacBook
It really depends what you mean by "quality". To me, the first and foremost quality I look for in a laptop is for it to not break. As I'm a heavy desktop user, my laptop is typically with me on the couch or on vacation. Enter my MacBook Air M1: after 13 months, and sadly no extended warranty, the screen broke for no reason overnight. I literally closed it before going to bed, and when I opened the lid the next day: screen broken. Some refer to that phenomenon as "bendgate".
And every time I see a Mac laptop I can't help but think "slick and good looking but brittle". There's a feeling of brittleness with Mac laptops that you don't have with, say, a Thinkpad.
My absolute best laptop is a MIL-SPEC (I know, I know, there are many different types of military specs) LG Gram. Lighter than a MacBook too. And every single time I demo it to people, I take the screen and bend it left and right. This thing is rock solid.
I happen to have this laptop (not my vid) and look at 34 seconds in the vid:
The guy literally throws my laptop (well, the same) down concrete stairs and the thing still just works fine.
The friend who sold it to me (I bought it used) one day stepped on it when he woke up. No problemo.
To me that is quality: something you can buy used and that is rock solid.
Where are the vids of someone throwing a MacBook Air down the stairs and the thing keeps working?
I'm trading a retina display any day for a display that doesn't break when it accidentally falls on the ground.
Now I love the look and the incredible speed of the MacBook Air laptops (I still have my M1, but since its screen broke, I turned it into a desktop), but I really wish they were not desk queens: we've got desktops for that.
I don't want a laptop that requires exceptional care and mad packaging skills when putting it inside a backpack (and which then requires the backpack to be handled with extreme care).
So: bring me the raw power, and why not the nice look, of a MacBook Air, but make it sturdy (really the most important thing for me) and have it support Linux. That I'd buy.
I've owned two LG gram laptops. Neither were milspec, but both were really nice. Sure, the screen quality isn't going to win any awards, nor will the speakers, but the light weight, fantastic battery life and snappy performance always get a recommendation from me.
Why? Lots of people more or less use their computer as a glorified web browser, with some Zoom calls and document editing thrown in for good measure. 256GB seems like overkill. My girlfriend is somehow still rocking a 2011 MacBook Air. She mostly just uses it for internet banking and managing her finances. Why would she want more than 256GB?
A 1TB M.2 SSD cost $70 in summer 2025, and probably much less when bought in bulk as a chip. It doesn't make sense to install anything less than 1TB in an expensive premium laptop. Or it should be upgradeable.
Apple's pricing is one of the reasons I am not going to buy their laptops. Expensive, and with no upgradeable or replaceable parts. And closed-source OS with telemetry.
> Lots of people more or less use their computer as a glorified web browser
For this purpose they can buy a $350 laptop with a larger screen.
Because the price tag is quite high to get as much storage as you would 15 years ago for about the same money.
I agree that many people use them as glorified internet machines, but even then, when they occasionally decide to back up some photos or edit a few videos, the 256GB of non-upgradable storage quickly becomes a limitation.
Price matters. 256GB is fine on a $500 web browsing laptop, but on a $1000+ one it's just a bad deal in 2025, even ignoring the fact that you cannot upgrade it later (it's soldered in place).
Possibly, but I don't see why those people would buy a new MacBook rather than a used $100 laptop (which would be better both for their finances and for the planet...)
I think they mean that in 2025, 256GB is unreasonably small. Which is true: Apple wants to up-charge hundreds of dollars just to get to the otherwise-standard 1TB drive.
From a supply perspective, 256GB seems ridiculous because you can get way more capacity for not very much money, and because 256GB is now nowhere close to enough flash chips operating in parallel to reach what is now considered high performance.
But from a demand perspective, there are a lot of PC users for whom 256GB is plenty of capacity and performance. Most computers sold aren't gaming PCs or professional workstations; mainstream consumer storage requirements (aside from gaming) have been nearly stagnant for years due to the popularity of cloud computing and streaming video.
>I am very impressed with how smooth and problem-free Asahi Linux is. It is incredibly responsive and feels even smoother than my Arch Linux desktop with a 16 core AMD Ryzen 7945HX and 64GB of RAM.
Hmmm, I still have an issue with the battery in sleep mode on the M1. It drains a lot of battery in sleep mode compared to the Mac's sleep mode.
A new Wayland protocol is in the works that should support screen cutout information out of the box: https://phosh.mobi/posts/xdg-cutouts/ Hopefully this will be extended to include color information whenever applicable, so that "hiding" the screen cutout (by coloring the surrounding area deep black) can also be a standard feature and maybe even be active by default.
You can't be serious. Wayland is the opposite of modular, and the concept of an extensible protocol only creates fragmentation.
Every compositor needs to implement the giant core spec, or, realistically, rely on a shared library to implement it for them. Then every compositor can propose and implement arbitrary protocols of their own, which should also be supported by all client applications.
It's insanity. This thing is nearly two decades old, and I still have basic clipboard issues[1]. This esoteric cutouts feature has no chance of seeing stable real-world use for at least a decade.
Shh... you're not supposed to mention these things, lest you be downvoted to death.
I also have tremendous issues with Plasma. Things such as graphics glitching in the alt+tab task switcher or Firefox choking the whole system when opening a single 4k PNG image. This is pre-alpha software... So back to X11 it is. Try again in another decade or two.
YMMV and all, but my experience is that Wayland smoothness varies considerably depending on hardware. On modernish Intel and AMD iGPUs for example I’ve not had much trouble with Wayland whereas my tower with an Nvidia 3000 series card was considerably more troublesome with it.
Generally true, though this particular case is due to a single company deciding to not play ball and generally act in a manner that's hostile to the FOSS world for self-serving reasons (Nvidia).
I don't think it's even that. These bugs seem like bog-standard bugs related to correctly sharing graphics resources between processes and accessing them with correct mutual exclusion. Blaming NV is likely just a convenient excuse.
> my tower with an Nvidia 3000 series card was considerably more troublesome with it.
I think you're describing a driver error from before Nvidia really supported Wayland. My 3070 exhibited similar behavior but was fixed with the 555-series drivers.
The Vulkan drivers are still so/so in terms of performance, but the smoothness is now on-par with my Macbook and Intel GNOME machine.
The thing is that I'm not experiencing this clipboard issue on Plasma, but on a fresh installation of Void Linux with niri. There are reports of this issue all over[1][2][3], so it's clearly not an isolated problem. The frustrating thing is that I wouldn't even know which project to report it to. What a clusterfuck.
I can't go back to X11 since the community is deliberately killing it. And relying on a fork maintained by a single person is insane to me.
You can see the same problem in the XMPP world, with a lot of the extensions implemented only by a few applications. But at least most XMPP extensions are designed to be backwards-compatible with clients that don't support them.
You know what OS doesn’t handle the notch? OSX. It happily throws the system tray icons right back there, with only an obscure workaround to bring them back. Software quality at Apple these days…
Seems like a crazy hobby to me though! Photography is inconvenient enough without having to make your own mounts and use an sdk to do it! History is filled with inconvenient hobbies though.
I would agree with the sentiment about the lack of good bright screens on Lenovo's hacker laptops like the X1 Carbon.
Each controller and subcomponent on the motherboard needs a driver that correctly puts it into low power and sleep states to get battery savings.
Most of those components are proprietary and don't use the standard drivers available in Linux kernel.
So someone needs to go and reverse engineer them, upstream the drivers and pray that Apple doesn't change them in next revision (which they did) or the whole process needs to start again.
In other words: get an actually Linux supported laptop for Linux.
One of my favorite machines was the MacBook Air 11 (2012). This was a pure Intel machine, except for a mediocre Broadcom wireless card. With a few udev rules, I squeezed out the same battery performance from Linux I got from OS X, down to a few minutes of advantage in favor of Linux. And all this despite Safari being a marvel of energy efficiency.
The problem with Linux performance on laptops boils down to i) no energy tweaks by default and ii) poor device drivers due to the lack of manufacturer cooperation. If you pick a machine with well supported hardware and you are diligent with some udev rules, which are quite trivial to write thanks to powertop suggestions, performance can be very good.
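For a concrete sketch of what those rules look like, here are a few of the classic powertop-suggested tweaks; the file path and values here are illustrative assumptions, and some rules (USB autosuspend especially) need tuning per machine:

```
# /etc/udev/rules.d/90-powersave.rules (illustrative path)

# Enable runtime power management for PCI devices
SUBSYSTEM=="pci", ACTION=="add", ATTR{power/control}="auto"

# Enable USB autosuspend (can make some mice/keyboards laggy; tune per device)
ACTION=="add", SUBSYSTEM=="usb", TEST=="power/control", ATTR{power/control}="auto"

# Use a power-saving SATA link policy
SUBSYSTEM=="scsi_host", KERNEL=="host*", ATTR{link_power_management_policy}="med_power_with_dipm"
```

After dropping the file in place, `udevadm control --reload` plus a reboot (or replug) applies them, and powertop's "Tunables" tab shows which devices flipped to "Good".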
I am getting a bit more than 10 hours from a cheap ThinkPad E14 Gen 7, with a 64Wh battery and light coding use. That's less than a MacBook Air, where I would be getting around 13-14 hours, but it's not bad at all. The difference comes mainly from the cheap screen, which draws more power, and ARM's superior efficiency when idling.
But I prefer not to trade the convenience and openness of x86_64 plus NixOS for a bit more battery range. IMHO, the gap is not sufficiently wide to make a big difference in most usage scenarios.
The need to tweak that deeply just to get “baseline” performance really stings, though, particularly if you’re not already accustomed to having to do that kind of thing.
It’d be a gargantuan project, but there should probably be some kind of centralized, cross-distro repository for power configuration profiles that allows users to rate them with their hardware. Once a profile has been sufficiently user-verified and is well rated, distro installers could then automatically fetch and install the profile as a post-install step, making for a much more seamless and less fiddly experience for users.
> 40% battery for 4hrs of real work is better than pretty much any linux supported laptop I've ever used
Not sure what "real work" is for you, but I regularly get more than 12 hours of battery life on an old Chromebook running Linux and the usual IDEs/dev tooling (in a Crostini VM). All the drivers just work, and sleep has no detectable battery drain. It's not a workstation by any means, but dual-core Intels are great for Python/Go/TypeScript.
Out of curiosity, does Google contribute the drivers for Chromebook hardware upstream to Linux, or do they keep them for themselves? Could it be that they just choose hardware that works very well out of the box with Linux?
What's the bar here? My Thinkpad X270 gets about 16 hours under Ubuntu with swaywm.
If we really want to get pedantic, its internal battery means the external pack is hot-swappable, so I can actually get several days on a "single charge." Good machine for camping trips.
Yet. Plenty of people have with Intel ones - I’m one of them. My first experience with Linux was on a 2016 MacBook Pro. And inevitably people will do the same with the Apple silicon Macs, likely using Asahi.
It's not inevitable. That's not what that word means.
Intel Macs supported Linux because they used Intel's Linux drivers and supported bog-standard UEFI. There are no preexisting drivers or DeviceTree files published by Apple for Linux. There is no UEFI implementation, just a proprietary bootloader that can be updated post-hoc to deny booting into third-party OSes.
> Why are some of y'all so hostile to this idea?
I would love for Linux to support as many ARM devices as possible. Unfortunately, it requires continuous effort from the OEM to be viable. I've bought Qualcomm, Rockchip and Broadcom boards before, none of them have been supported for half as long as my x86 machines are. Nevermind how fast ARM architectures become obsolete.
It feels like Apple is really the only hostile party here, and they coincidentally decide whether or not you get to use third-party OSes.
It is inevitable. I guarantee you there will be people who run Linux on their silicon Macs. I don’t know how you could possibly hold a stance that no one ever will.
Apple is very hostile to it. It won’t stop everyone though. It’ll continue to be niche but it’s happening.
It's not inevitable. It's fragile. Go boot up your old iPad; that should be well-studied, right? We ought to know how to boot into Linux on an ARM machine that old, it's only fair.
Except, you can't. The bootloader is the same iBoot process that your Apple Silicon machine uses, with mitigations to prevent unsigned OSes or persistent coldboot. All the Cydia exploits in the world won't put Linux back on the menu for iPhone or iPad users. And the same thing could happen to your Mac with an OTA update.
It is entirely possible for Apple to lock down the devices further. There's no guarantee they won't.
That's an admirable goal, but, depending on the hardware, it can run into that pesky thing called reality.
It's getting very tiresome to hear complaints about things that don't work on Linux, only to find that they're trying to run it on hardware that's poorly supported, and that's something they could have figured out by doing a little research beforehand.
Sometimes old hardware just isn't going to be well-supported by any OS. (Though, of course, with Linux, older hardware is more likely to be supported than bleeding-edge kit.)
This is very true. I've been asked by lots of people "how do I start with Linux" and, despite being a 99.9% Linux user for everything, every day, my advice was always:
1. Use VirtualBox. Seriously, it won't look cool, but it will 100% work after maybe 5 mins mucking around with installing guest additions. Also snapshots. Also no messing with WiFi drivers or graphics card drivers or such.
2. Get a used beaten down old Thinkpad that people on Reddit confirm to be working with Linux without any drivers. Then play there. If it breaks, reinstall.
3. If the above didn't make you yet disinterested, THEN dual boot.
Also, if you don't care about GUI, then use the best blessing Microsoft ever created - WSL, and look no further.
I've never gotten along too well with virtualization, but would second the ThinkPad idea, or something similar. Old/cheap machine for tinkering is a good way to ease in, and I think bare metal feels more friendly.
I'd probably recommend against dual booting, but I understand it's controversial. I like to equate it to having two computers, but having to fully power one off to do anything* on the other one. Torrents stop, music collection may be inaccessible depending on how you stored it, familiar programs may not be around anymore. I dual booted for a few years in the past and I found it miserable. People who expected me to reboot to play a game with them didn't seem to understand how big of an ask that really was. Eventually things boiled over and I took the Windows HDD out of that PC entirely. Much more peaceful. (Proton solves that particular issue these days also)
That being said, I've had at least two friends who had a dual boot due to my influence (pushing GNU/Linux) who ended up with some sort of broken Windows install later on and were happy to already have Ubuntu as an emergency backup to keep the machine usable.
*Too old might be a problem these days with major distros not having 32bit ISOs anymore
I've tried this once for IntelliJ to work around slow WSL access for Git repos. Was greeted by missing fonts and broken scaling on the intro screen. Oops. But probably I was just unlucky, it might work well for most.
Apple does tons of optimizations for every component to improve battery life.
Asahi Linux, which is reverse engineered, doesn't have the resources to figure out each of those tricks, especially for undocumented proprietary hardware, so it's a "death by a thousand cuts" as each of the various components is always drawing a couple of milliwatts more than on macOS.
Eh it's pretty awful. I get 8 hours, yes, but in Linux, those 8 hours are ticking whether my laptop is sleeping in my bag or on my desk with the lid closed or I'm actively using it. 8 hours of active use is pretty good, but 8 hours in sleep is absolutely dreadful.
Exactly. This myth keeps being perpetuated, for some reason.
I'm typing this from a ThinkPad X1 Carbon Gen 13 running Void Linux, and UPower is reporting 99% battery with ~15h left. I do have TLP installed and running, which is supposed to help. Realistically, I won't get 15h with my usage patterns, but I do get around 10-12 hours. It's a new laptop with a fresh battery, so that plays a big role as well.
This might not be as good as the battery life on a Macbook, but it's pretty acceptable to me. The upcoming Intel chips also promise to be more power efficient, which should help even more.
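For reference, most of my TLP tweaking fits in one small drop-in file. This is only a sketch using documented TLP keys; the values (especially the charge threshold) are illustrative, not recommendations:

```
# /etc/tlp.d/10-battery.conf (illustrative)
CPU_SCALING_GOVERNOR_ON_AC=performance
CPU_SCALING_GOVERNOR_ON_BAT=powersave
PCIE_ASPM_ON_BAT=powersupersave
USB_AUTOSUSPEND=1
# ThinkPads can cap charging to extend battery lifespan
STOP_CHARGE_THRESH_BAT0=80
```

`tlp-stat -b` and `tlp-stat -p` are handy for checking that the settings actually took effect.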
For optimal battery life you need to tweak the whole OS stack for the hardware. You need to make sure all the peripherals are set up right to go into the right idle states without causing user-visible latency on wake-up. (Note that often just one peripheral being out of tune here can mess up the whole system's power performance. Also the correct settings here depend on your software stack). You need to make sure that cpufreq and cpuidle governors work nicely with the particular foibles of your platform's CPUs. Ditto for the task scheduler. Then, ditto for a bunch of random userspace code (audio + rendering pipeline for example). The list goes on and on. This work gets done in Android and ChromeOS.
This doesn't match my experience. My previous three laptops (two AMD Lenovo Thinkpads, one Intel Sony VAIO) had essentially the same battery life running Linux as running Windows.
The idea that a group of people would spend so much of their time trying to get linux to work on Apple hardware through reverse engineering always seemed absolutely crazy to me. I would never consider buying Apple hardware precisely because it doesn't support linux and the work they put in achieves nothing because the risk will always remain that they will lock the hardware further. Nevermind the fact that they will likely never fully reverse engineer all the components.
It just seems like a completely pointless endeavor... perhaps some people buy into it? Why would anyone buy overpriced hardware with partial support that may one day be gone? The enhanced battery life doesn't hold much appeal to me, and the ARM architecture, if anything, is just another signal to stay away.
The only thing that makes sense to me is that they wanted the achievement on their resume, and in that, given recent developments, they succeeded?
Asahi is all reverse engineering. It’s nothing short of a miracle what has already been accomplished, despite, not because of, Apple.
That said some of the prominent developers have left the project. As long as Apple keeps hoarding their designs it’s going to be a struggle, even more so now.
If you care about FOSS operating systems or freedom over your own hardware there isn’t a reason to choose Apple.
To be clear, the work the asahi folks are doing is incredible. I’m ashamed to say sometimes their documentation is better than the internal stuff.
I’ve heard it’s mostly because there wasn’t an m3 Mac mini which is a much easier target for CI since it isn’t a portable. Also, there have been a ton of hardware changes internally between M2 and M3. M4 is a similar leap. More coprocessors, more security features, etc.
For example, PPL was replaced by SPTM and all the exclave magic.
This is what ruffles my jimmies about this whole thing:
> I’m ashamed to say sometimes their documentation is better than the internal stuff.
The reverse engineering is a monumental effort, this Sisyphean task of trying to keep up with never-ending changes to the hardware. Meanwhile, the documentation is just sitting there in Cupertino. An enormous waste of time and effort from some of the most skilled people in the industry. Well, maybe not so much anymore since a bunch of them left.
I really hope this ends up biting Apple in the ass instead of protecting whatever market share they are guarding here.
I strongly support a project's stance that you shouldn't ask when it will be done. But the time between the M1 launch and a good experience was less than the time that has passed since the M3, so I would love to know what is involved.
That's an email from James Calligeros. All this patch says is that the author is Hector Martin (and Sven Peter). The code could have been written a long time ago.
The new project leadership team has prioritized upstreaming the existing work over reverse engineering on newer systems.
> Our priority is kernel upstreaming. Our downstream Linux tree contains over 1000 patches required for Apple Silicon that are not yet in upstream Linux. The upstream kernel moves fast, requiring us to constantly rebase our changes on top of upstream while battling merge conflicts and regressions. Janne, Neal, and marcan have rebased our tree for years, but it is laborious with so many patches. Before adding more, we need to reduce our patch stack to remain sustainable long-term.
> Last time, we announced that the core SMC driver had finally been merged upstream after three long years. Following that success, we have started the process of merging the SMC’s subdevice drivers which integrate all of the SMC’s functionality into the various kernel subsystems. The hwmon driver has already been accepted for 6.19, meaning that the myriad voltage, current, temperature and power sensors controlled by the SMC will be readable using the standard hwmon interfaces. The SMC is also responsible for reading and setting the RTC, and the driver for this function has also been merged for 6.19! The only SMC subdevices left to merge is the driver for the power button and lid switch, which is still on the mailing list, and the battery/power supply management driver, which currently needs some tweaking to deal with changes in the SMC firmware in macOS 26.
> Also finally making it upstream are the changes required to support USB3 via the USB-C ports. This too has been a long process, with our approach needing to change significantly from what we had originally developed downstream.
Very little progress has been made this year after high-profile departures (Hector Martin, project lead; Asahi Lina and Alyssa Rosenzweig, GPU gurus). Alyssa's departure isn't reflected on Asahi's website yet, but it is in her blog. I believe she also left Valve, which I think was sponsoring some aspects of the Asahi project. So when people say "Asahi hasn't seen any setbacks", be sure to ask them who has stepped in to make up for these losses in both talent and sponsorship.
I have no insight into the Asahi project, but the LKML link goes to an email from James Calligeros containing code written by Hector Martin and Sven Peter. The code may have been written a long time ago.
Without official support, the Asahi team needs to reverse engineer a lot of stuff. I’d expect it to lag behind by a couple of generations at least.
I blame Apple for pushing out new models every year. I don’t get why they do that. An M1 is perfectly fine after a few years, but Apple treats it like an iPhone. I think one new model every 2-3 years is good enough.
The M1 is indeed quite adequate for most, but each generation has brought substantial boosts in single-threaded and multi-threaded performance and, with the M5 generation in particular, in GPU-bound tasks. These advancements are required to keep pace with the industry and, in a few aspects, stay ahead of competitors; plus there are high-end users whose workloads greatly benefit from these performance improvements.
I agree. But Apple doesn't sell new M1 laptops anymore, AFAIK. There are some refurbished ones, but most likely I'd need to go into a random store to find one. I only saw M4 and M5 laptops online.
That’s why I don’t like it as a consumer. If they keep producing M1 and M2 I’d assume we can get better prices because the total quantity would be much larger. Sure it is probably better for Apple to move forward quickly though.
In the US, Walmart is still selling the M1 MacBook Air new, for $599 (and has been discounted to $549 or better at times, such as Black Friday).
In general, I don't think it's reasonable to worry that Apple's products aren't thoroughly achieving economies of scale. The less expensive consumer-oriented products are extremely popular, various components are shared across product lines (eg. the same chip being used in Macs and iPads) and across multiple generations (except for the SoC itself, obviously), and Apple rather famously has a well-run supply chain.
From a strategic perspective, it seems likely that Apple's long history of annual iteration on their processors in the iPhone, and their now well-established pattern of updating the Mac chips less often but still frequently, is part of how Apple's chips have been so successful. Annual(ish) chip updates with small incremental improvements compound over the years. Compare Apple's past decade of chip progress against Intel's troubled past decade of infrequent technology updates (when you look past the incrementing of the branding), uneven improvements and some outright regressions in important performance metrics.
> That’s why I don’t like it as a consumer. If they keep producing M1 and M2 I’d assume we can get better prices because the total quantity would be much larger.
Why would this be true? An M5 MacBook Air today costs the same as an M1 MacBook Air did in 2020 or whenever they released it, and is substantially more performant. Your dollar-per-performance is already better.
If they kept selling the same old stuff, then you spread production across multiple different nodes and the pricing would be inherently worse.
If you want the latest and greatest you can get it. If an M1 is fine you can get a great deal on one and they’re still great machines and supported by Apple.
The author mentions he paid $750 for a MacBook Air M2 with 16GB, while on Amazon an M4 Air with 16GB is usually $750-800. I get that M4/M3 aren't supported to boot Asahi yet, but still.
I really wanted this to work, and it WAS remarkably good, but palm rejection on the (ginormous) Apple trackpad didn't work at all, rendering the whole thing unusable if you ever typed anything.
That was a month ago, this article is a year old. I'd love to be wrong, but I don't think this problem has been solved.
Yeah, what is up with that? When I've tried to look into it, I've just been met with statements that palm rejection should pretty much just work, but it absolutely doesn't, and accidental inputs are so bad it's unusable without a disable/enable trackpad hotkey.
All Firefox users should switch to LibreWolf. In the short term it’s for telling Mozilla to go f**; in the long term, it’s a browser fork with really good anti-fingerprinting.
Note that LibreWolf relies on Mozilla's tech infrastructure for account synchronization and plugin distribution. If you are truly hostile to this organization, is there another browser you can recommend?
* x86 chips can surpass the M series cpus in multithreaded performance, but are still lagging in singlethreaded performance and power efficiency
* Qualcomm kinda fumbled the Snapdragon X Elite launch with nonexistent Linux support and shoddy Windows stability, but here's hoping that they "turn over a new leaf" with the X2.
Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great as there were some weird regressions and anyway newer chips have caught up [1].
On the build quality side, basically all the PCs are still lagging behind Apple, e.g. yesterday's rant post about the Framework laptop [2] touched on a lot of important points. Of course, there are the Thinkpads, which are still built decently but are quite expensive. Some of the Chinese laptops like the Honor MagicBooks could be attractive and some reddit threads confirm getting Linux working on them, but they are hard to get in the US. That said, at least many non-Apple laptops have decent trackpads and really nice screens nowadays.
[1] https://www.phoronix.com/review/snapdragon-x-elite-linux-eoy...
[2] https://news.ycombinator.com/item?id=46375174
All I want is an easy way to install Linux on one of the numerous Snapdragon laptops. I think the Snapdragon Thinkpad might work, but none of the others really do.
A $400 Arm laptop with good Linux support would be great, but it's never ever going to happen.
Installed Arch, set up some commands to underclock the processor on login and easily boost it when I'm compiling.
Battery life is great but I'm not running a GUI either. Good machine for when I want to avoid distractions and just code.
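A minimal sketch of that kind of setup, assuming the cpufreq interface and the `cpupower` utility are available (the frequency values and alias names are illustrative, not from the original comment):

```shell
# ~/.bashrc excerpt (illustrative): toggle between a capped and an
# uncapped maximum CPU frequency using cpupower's --max flag.
alias cpu-slow='sudo cpupower frequency-set --max 1.2GHz'  # quiet and cool for everyday use
alias cpu-fast='sudo cpupower frequency-set --max 4.0GHz'  # full speed for compile jobs
```

The same cap could be applied automatically at login via a systemd user unit or a line in a login script, which matches the "underclock on login" workflow described above.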
Bad keyboard, bad aluminium body, soldered RAM...
Is it just the Apple Silicon that somehow makes it worth it? It's ARM, most software is still written and optimized for x86.
Apple's good enough for the average consumer, just like a 16-bit home computer back in the day. Everyone who looks for something bespoke/specialized (e.g. certified dual- or multi-OS support, ECC RAM, right-to-repair, top-class flicker-free displays, size, etc.) looks elsewhere, of course.
Would you elaborate?
I believe there are a few all-metal laptops competing in the marketplace, but I was unaware they were actually better than the Apple laptops... which all-aluminum laptops are better, and how are they better?
It's a stylistic choice, not a logical one.
I was looking at Thinkpad Auras today. There are unaligned jutting design edges all over the thing. From a design perspective, I’ll take the smooth oblong squashed egg.
Every PC laptop I’ve touched feels terrible to hold and carry. And they run Windows, and Linux only okay. Apple MacBooks are a long mile better than everything else and so I don’t care about upgraded memory — buy enough ram at purchase time and you don’t have to think about it again.
Memory upgrades aren’t priced super well, granted, but I could never buy HP, Dell, or Lenovo ever again. They’re terrible. I’ve had all of them. Ironically, the best device I’ve had from the other side was a Surface Laptop. But I don’t do Microsoft anymore. And I don’t want to carry squeaky, squishy, bendy plastic.
Most of all, I’m never getting on a customer support call with the outsourced vendors that do the support for those companies ever ever ever again. I’ll take a visit to an Apple store every day of the week.
The number one benefit is the Apple Silicon processors, which are incredibly efficient.
Then it’s the trackpad, keyboard and overall build quality for me. Windows laptops often just feel cheap by comparison.
Or they’ll have perplexing design problems, like whatever is going on with Dell laptops these days with the capacitive function row and borderless trackpad.
Closest I've found to an MBP 16" replacement.
Have been running Dell Precision laptops for many years on Linux, not sure about Lenovo build quality and battery life, but hoping it will be decent enough.
Would run Asahi if it supported M4, but it looks like that's a long way off...
Two minor issues: it’s HEAVY compared to T models.
Because of the weight, try not to walk around with the lid up while holding it by one of the front corners. I’ve noticed one of the corners is kind of warped from walking around the office holding it that way.
Are you running windows?
Am growing tired of Ubuntu though. Just not sure where I should turn. I want a .deb based system. Ubuntu is pushing snaps too heavily for my liking.
It really depends what you mean by "quality". To me, the first and foremost quality I look for in a laptop is for it not to break. As I'm a heavy desktop user, my laptop is typically with me on the couch or on vacation. Enter my MacBook Air M1: after 13 months, and sadly no extended warranty, the screen broke for no reason overnight. I literally closed it before going to bed, and when I opened the lid the next day: screen broken. Some refer to that phenomenon as "bendgate".
And every time I see a Mac laptop I can't help but think "slick and good looking but brittle". There's a feeling of brittleness with Mac laptops that you don't have with, say, a Thinkpad.
My absolute best laptop is a MIL-SPEC (I know, I know, there are many different types of military specs) LG Gram. Lighter than a MacBook too. And every single time I demo it to people, I take the screen and bend it left and right. This thing is rock solid.
I happen to have this laptop (not my vid) and look at 34 seconds in the vid:
https://youtu.be/herYV5TJ_m8
The guy literally throws my laptop (well, the same) down concrete stairs and the thing still just works fine.
The friend who sold it to me (I bought it used) one day stepped on it when he woke up. No problemo.
To me that is quality: something you can buy used and that is rock solid.
Where are the vids of someone throwing a MacBook Air down the stairs and the thing keeps working?
I'm trading a retina display any day for a display that doesn't break when it accidentally falls on the ground.
Now I love the look and the incredible speed of the MacBook Air laptops (I still have my M1, but since its screen broke, I turned it into a desktop), but I really wish they were not desk queens: we've got desktops for that.
I don't want a laptop that requires exceptional care and mad packaging skills when putting it inside a backpack (and which then requires the backpack to be handled with extreme care).
So: bring me the raw power and why not the nice look of a MacBook Air, but make it sturdy (really the most important for me) and have it support Linux. That I'd buy.
Apple's pricing is one of the reasons I am not going to buy their laptops. Expensive, and with no upgradeable or replaceable parts. And closed-source OS with telemetry.
> Lots of people more or less use their computer as a glorified web browser
For this purpose they can buy a $350 laptop with a larger screen.
I agree that many people use them as glorified internet machines but even then when they occasionally decide to back up some photos or edit a few videos the 256GB non-upgradable storage quickly becomes a limitation.
Price matters. 256GB is fine on a $500 web browsing laptop, but on a $1000+ one it's just a bad deal in 2025, even ignoring the fact that you cannot upgrade it later (it's soldered in place).
Realistically, it is reasonable to expect 2TB drives, based on normal progression https://blocksandfiles.com/2024/05/13/coughlin-associates-hd...
But from a demand perspective, there are a lot of PC users for whom 256GB is plenty of capacity and performance. Most computers sold aren't gaming PCs or professional workstations; mainstream consumer storage requirements (aside from gaming) have been nearly stagnant for years due to the popularity of cloud computing and streaming video.
Hmm, I still have an issue with the battery in sleep mode on the M1. It drains a lot of battery in sleep mode compared to the Mac's sleep mode.
Every compositor needs to implement the giant core spec, or, realistically, rely on a shared library to implement it for them. Then every compositor can propose and implement arbitrary protocols of their own, which should also be supported by all client applications.
It's insanity. This thing is nearly two decades old, and I still have basic clipboard issues[1]. This esoteric cutouts feature has no chance of seeing stable real-world use for at least a decade from now.
[1]: https://bugs.kde.org/show_bug.cgi?id=466041
I also have tremendous issues with Plasma. Things such as graphics glitching in the alt+tab task switcher or Firefox choking the whole system when opening a single 4k PNG image. This is pre-alpha software... So back to X11 it is. Try again in another decade or two.
If my Ferrari has an issue with the brakes and I go to my dealer I don't care if the brakes were by Brembo.
Blaming the vendor and their drivers is just trying to shift the blame.
I think you're describing a driver error from before Nvidia really supported Wayland. My 3070 exhibited similar behavior but was fixed with the 555-series drivers.
The Vulkan drivers are still so-so in terms of performance, but the smoothness is now on par with my MacBook and Intel GNOME machine.
I can't go back to X11 since the community is deliberately killing it. And relying on a fork maintained by a single person is insane to me.
[1]: https://old.reddit.com/r/hyprland/comments/1d4s9bw/ctrlc_ctr...
[2]: https://old.reddit.com/r/tuxedocomputers/comments/1i9v0n7/co...
[3]: https://old.reddit.com/r/kde/comments/1jl6zv7/why_does_copyp...
Bloody Wayland.
For those curious about the Alkeria line-scan camera, he wrote a blog about 3d printing a lens mount etc. https://daniel.lawrence.lu/blog/2024-08-31-customizing-my-li...
Seems like a crazy hobby to me though! Photography is inconvenient enough without having to make your own mounts and use an SDK to do it! History is filled with inconvenient hobbies though.
I would agree with the sentiment about the lack of good bright screens for Lenovo's hacker laptops like the X1 Carbon.
Most of those components are proprietary and don't use the standard drivers available in the Linux kernel.
So someone needs to go and reverse engineer them, upstream the drivers and pray that Apple doesn't change them in next revision (which they did) or the whole process needs to start again.
In other words: get an actually Linux supported laptop for Linux.
40% battery for 4 hrs of real work is better than pretty much any Linux-supported laptop I've ever used.
The problem with Linux performance on laptops boils down to i) no energy tweaks by default and ii) poor device drivers due to the lack of manufacturer cooperation. If you pick a machine with well supported hardware and you are diligent with some udev rules, which are quite trivial to write thanks to powertop suggestions, performance can be very good.
I am getting a bit more than 10 hours from a cheap ThinkPad E14 Gen7, with a 64 Wh battery and light coding use. That's less than a MacBook Air, where I would be getting around 13-14 hours, but it's not bad at all. The difference comes mainly from the cheap screen, which consumes more power, and ARM's superior efficiency when idling.
But I prefer not to trade the convenience and openness of x86_64 plus NixOS for a bit more battery range. IMHO, the gap is not sufficiently wide to make a big difference in most usage scenarios.
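For reference, the kind of rules powertop suggests look roughly like this; the file name and the specific policy values here are illustrative, and the right settings depend on your hardware:

```
# /etc/udev/rules.d/99-powersave.rules (illustrative)
# Enable SATA link power management, a common powertop suggestion
ACTION=="add", SUBSYSTEM=="scsi_host", KERNEL=="host*", ATTR{link_power_management_policy}="med_power_with_dipm"
# Enable runtime power management for PCI devices
ACTION=="add", SUBSYSTEM=="pci", ATTR{power/control}="auto"
```

Running `powertop` and checking its "Tunables" tab shows the exact sysfs knobs it wants toggled on a given machine, which can then be translated into rules like these so they persist across reboots.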
It’d be a gargantuan project, but there should probably be some kind of centralized, cross-distro repository for power configuration profiles that allows users to rate them with their hardware. Once a profile has been sufficiently user-verified and is well rated, distro installers could then automatically fetch and install the profile as a post-install step, making for a much more seamless and less fiddly experience for users.
I agree that in case of Linux, a udev rule generator would be a fantastic step ahead in terms of usability.
It's generally the most optimized system, down to the fact that Apple controls everything about its platform.
If that's considered baseline, then nothing but full vertical integration can compete
Not sure what "real work" is for you, but I regularly get more than 12 hours of battery life on an old Chromebook running Linux and the usual IDEs/dev tooling (in a Crostini VM). All the drivers just work, and sleep has no detectable battery drain. It's not a workstation by any means, but dual-core Intels are great for Python/Go/TypeScript.
If we really want to get pedantic, its internal battery means the external pack is hot-swappable, so I can actually get several days on a "single charge." Good machine for camping trips.
I see how the GP comment could be provocative, but on this site we want responses that dampen provocation, not amplify it.
For a lot of people the point is to extend the life of their already-purchased hardware.
If your vendor is hostile like Apple, it will be hard to make it keep on working.
Why are some of y'all so hostile to this idea?
Intel Macs supported Linux because they used Intel's Linux drivers and supported bog-standard UEFI. There are no preexisting drivers or DeviceTree files published by Apple for Linux. There is no UEFI implementation, just a proprietary bootloader that can be updated post hoc to deny booting into third-party OSes.
> Why are some of y'all so hostile to this idea?
I would love for Linux to support as many ARM devices as possible. Unfortunately, it requires continuous effort from the OEM to be viable. I've bought Qualcomm, Rockchip and Broadcom boards before, and none of them have been supported for half as long as my x86 machines. Never mind how fast ARM architectures become obsolete.
It feels like Apple is really the only hostile party here, and they coincidentally decide whether or not you get to use third-party OSes.
Apple is very hostile to it. It won’t stop everyone though. It’ll continue to be niche but it’s happening.
Except, you can't. The bootloader is the same iBoot process that your Apple Silicon machine uses, with mitigations to prevent unsigned OSes or persistent coldboot. All the Cydia exploits in the world won't put Linux back on the menu for iPhone or iPad users. And the same thing could happen to your Mac with an OTA update.
It is entirely possible for Apple to lock down the devices further. There's no guarantee they won't.
It's getting very tiresome to hear complaints about things that don't work on Linux, only to find that they're trying to run it on hardware that's poorly supported, and that's something they could have figured out by doing a little research beforehand.
Sometimes old hardware just isn't going to be well-supported by any OS. (Though, of course, with Linux, older hardware is more likely to be supported than bleeding-edge kit.)
This is very true. I've been asked by lots of people "how do I start with Linux" and, despite being 99.9% Linux user for everything everyday, my advice was always:
1. Use VirtualBox. Seriously, it won't look cool, but it will 100% work after maybe 5 mins of mucking around with installing guest additions. Also snapshots. Also no messing with WiFi drivers or graphics card drivers or such.
2. Get a used beaten down old Thinkpad that people on Reddit confirm to be working with Linux without any drivers. Then play there. If it breaks, reinstall.
3. If the above didn't make you yet disinterested, THEN dual boot.
Also, if you don't care about GUI, then use the best blessing Microsoft ever created - WSL, and look no further.
I'd probably recommend against dual booting, but I understand it's controversial. I like to equate it to having two computers, but having to fully power one off to do anything* on the other one. Torrents stop, music collection may be inaccessible depending on how you stored it, familiar programs may not be around anymore. I dual booted for a few years in the past and I found it miserable. People who expected me to reboot to play a game with them didn't seem to understand how big of an ask that really was. Eventually things boiled over and I took the Windows HDD out of that PC entirely. Much more peaceful. (Proton solves that particular issue these days also)
That being said, I've had at least two friends who had a dual boot due to my influence (pushing GNU/Linux) who ended up with some sort of broken Windows install later on and were happy to already have Ubuntu as an emergency backup to keep the machine usable.
*Too old might be a problem these days with major distros not having 32bit ISOs anymore
2. If your priority is system lifespan, you are already using OEM macOS.
2. By all means start with macOS, but eventually Apple will stop supporting your machine. And y'know what will still work and get updates then? Linux.
Which old hardware? You're circling around to the grandparent's point again; Linux support is hardware dependent.
> And y'know what will still work and get updates then?
No, I don't. Deprecated iPads lay dead in piles, and they don't run Linux for shit. You want me to believe the M4 will graduate to the big leagues?
In every thread about Linux, inevitably someone says “it gave new life to my [older computer model].” We’ve all seen it countless times.
I'm typing this from a ThinkPad X1 Carbon Gen 13 running Void Linux, and UPower is reporting 99% battery with ~15h left. I do have TLP installed and running, which is supposed to help. Realistically, I won't get around 15h with my usage patterns, but I do get around 10-12 hours. It's a new laptop with a fresh battery, so that plays a big role as well.
This might not be as good as the battery life on a Macbook, but it's pretty acceptable to me. The upcoming Intel chips also promise to be more power efficient, which should help even more.
[1] https://wiki.archlinux.org/title/TLP
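For anyone curious what TLP's tuning actually looks like, its configuration is plain key=value pairs; a minimal on-battery excerpt might look like this (the values shown are illustrative, and the right ones depend on the machine):

```
# /etc/tlp.conf excerpt (illustrative values)
# Favor efficiency over performance on battery
CPU_ENERGY_PERF_POLICY_ON_BAT=power
# Select the firmware's low-power platform profile on battery
PLATFORM_PROFILE_ON_BAT=low-power
# Enable runtime power management for PCI(e) devices on battery
RUNTIME_PM_ON_BAT=auto
```

`tlp-stat` reports what settings are currently active, which makes it easy to verify that the battery-mode profile actually kicked in.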
It just seems like a completely pointless endeavor... perhaps some people buy into it? Why would anyone buy overpriced hardware with partial support that may one day be gone? The enhanced battery life doesn't really hold much appeal to me, and the ARM architecture, if anything, is just another signal to stay away.
The only thing that makes sense to me is that they wanted the achievement on their resume, and in that, given recent developments, they succeeded?
That said some of the prominent developers have left the project. As long as Apple keeps hoarding their designs it’s going to be a struggle, even more so now.
If you care about FOSS operating systems or freedom over your own hardware there isn’t a reason to choose Apple.
I’ve heard it’s mostly because there wasn’t an M3 Mac mini, which is a much easier target for CI since it isn’t a portable. Also, there have been a ton of hardware changes internally between M2 and M3. M4 is a similar leap. More coprocessors, more security features, etc.
For example, PPL was replaced by SPTM and all the exclave magic.
https://randomaugustine.medium.com/on-apple-exclaves-d683a2c...
As always, opinions are my own
> I’m ashamed to say sometimes their documentation is better than the internal stuff.
The reverse engineering is a monumental effort, this Sisyphean task of trying to keep up with never-ending changes to the hardware. Meanwhile, the documentation is just sitting there in Cupertino. An enormous waste of time and effort from some of the most skilled people in the industry. Well, maybe not so much anymore since a bunch of them left.
I really hope this ends up biting Apple in the ass instead of protecting whatever market share they are guarding here.
https://lore.kernel.org/asahi/20251215-macsmc-subdevs-v6-4-0...
> Our priority is kernel upstreaming. Our downstream Linux tree contains over 1000 patches required for Apple Silicon that are not yet in upstream Linux. The upstream kernel moves fast, requiring us to constantly rebase our changes on top of upstream while battling merge conflicts and regressions. Janne, Neal, and marcan have rebased our tree for years, but it is laborious with so many patches. Before adding more, we need to reduce our patch stack to remain sustainable long-term.
https://asahilinux.org/2025/02/passing-the-torch/
For instance, in this month's progress report:
> Last time, we announced that the core SMC driver had finally been merged upstream after three long years. Following that success, we have started the process of merging the SMC’s subdevice drivers which integrate all of the SMC’s functionality into the various kernel subsystems. The hwmon driver has already been accepted for 6.19, meaning that the myriad voltage, current, temperature and power sensors controlled by the SMC will be readable using the standard hwmon interfaces. The SMC is also responsible for reading and setting the RTC, and the driver for this function has also been merged for 6.19! The only SMC subdevices left to merge is the driver for the power button and lid switch, which is still on the mailing list, and the battery/power supply management driver, which currently needs some tweaking to deal with changes in the SMC firmware in macOS 26.
> Also finally making it upstream are the changes required to support USB3 via the USB-C ports. This too has been a long process, with our approach needing to change significantly from what we had originally developed downstream.
https://asahilinux.org/2025/12/progress-report-6-18/
Stop buying Apple laptops to run Linux.
https://rosenzweig.io/blog/asahi-gpu-part-n.html
https://lore.kernel.org/asahi/20251215-macsmc-subdevs-v6-4-0...
Asahi Lina, who also did tons of work on the Asahi Linux GPU development, also quit as she doesn't feel safe doing Linux GPU work anymore [1].
[0] https://marcan.st/2025/02/resigning-as-asahi-linux-project-l...
[1] https://asahilina.net/luna-abuse/
They are more common than you would think. There just aren't many willing to work on a shoestring salary.
I've got a few ideas