As someone who came from the SGI O2/Octane era when high-end workstations were compact, distinctive, and sexy, I’ve never really understood the allure of the Mac Pro, with the exception of the 2013 Mac Pro tube, which I owned (small footprint, quiet, and powerful).
For me, aesthetics and size are important. That workstation on your desk should justify its presence, not just exist as some hulking box.
When Apple released the Mac Studio, it made perfect sense from a form-factor point of view. The internal expansion slots in the M2 Mac Pro didn't make any sense. It was like a bag of potato chips - mostly air. And far too big and ugly to be part of my work area! I'm surprised Apple didn't discontinue it sooner.
As someone who worked on the M2 Mac Pro and has a real soft spot for it, I get it. It's horrendously expensive and doesn't offer much benefit over a Mac Studio and a Thunderbolt PCIe chassis. My personal dream is that VMs would support PCIe passthrough, so you could just spin up a Linux VM and let it drive the GPUs. But at that point, why are you buying a Mac?
The G5 was the thing. Companies were buying the G5 and other Macs like it all the time, because you could actually extend it with video cards and specialized equipment.
But now we have M chips. You don't need a video card with M chips. You kinda do, but truthfully, it's cheaper to buy a beefier Mac than to install a video card.
The Pro was a great thing for designers and video editors, those freaks who need to color-calibrate monitors. And right now even a mini works just fine for that.
And as for expansion: gone are the days of PCIe. Audio interfaces and other specialized equipment work and live just fine on USB-C and Thunderbolt.
I remember how many months I spent trying to make a Creative Labs Sound Blaster work on my 486. At that time you had to install a card to extend your system. Right now I'm using a Focusrite Scarlett 2i2. It works over USB-C with my iPhone, iPad, and Mac. DJI's mics work just as well.
Damn, you can buy an oscilloscope that works over USB-C or the network.
It's not the Mac's or Apple's fault. We actually live in an age where peripherals are quite independent and don't require internal installation.
The top Mac Studio has six Thunderbolt 5 ports, each of which is a PCIe 4.0 x4 link. That's about 8 GB/s in each direction per port, which is a lot. Going from x16 down to x4 costs less than 10% in games: https://www.reddit.com/r/buildapc/comments/sbegpb/gpu_in_pci...
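For context, that per-port figure falls straight out of PCIe 4.0 signaling; a quick sanity check (a sketch using the standard PCIe 4.0 rate and line coding, nothing Thunderbolt-specific):

```python
# Usable PCIe 4.0 bandwidth for an x4 link (per direction),
# i.e. what one Thunderbolt 5 port's PCIe tunnel can carry.
GT_PER_LANE = 16e9        # PCIe 4.0 signals at 16 GT/s per lane
ENCODING = 128 / 130      # 128b/130b line coding overhead
LANES = 4

bytes_per_sec = GT_PER_LANE * ENCODING * LANES / 8  # 8 bits per byte
print(f"{bytes_per_sec / 1e9:.2f} GB/s per direction")  # ~7.88 GB/s
```

Protocol overhead (TLP headers, flow control) shaves a bit more off in practice, which is why "8 GB/s" is the usual round number.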
OCuLink is generally faster than TB5 even though both carry PCIe 4.0, because OCuLink is a direct PCIe connection while Thunderbolt routes all PCIe traffic through its controller. Benchmarks show the overhead introduced by the TB5 controller slows down GPU performance.
When people talk about 100-gigabit networks for Macs, I'm really curious what kind of network you run at home and how much money you spent on it. Even at work I generally see 10-gigabit ports, with 100-gigabit+ only in data centers, where Macs don't have a presence.
Local AI is probably the most common application these days.
Apple recently added support for InfiniBand over Thunderbolt. And now almost all decent Mac Studio configurations have sold out. Those two may be connected.
I work in media production and I have the same thought constantly. Hell, I'm cursing in church as far as my industry is concerned, but I find 2.5GbE to be fine for most of us. 10GbE, absolutely.
I suppose throughput is not the key; latency is. When you split an operation that normally runs within one machine between two machines, anything that crosses the boundary becomes orders of magnitude slower. Even with careful structuring, there are limits on how little and how rarely you can send data between nodes.
I suppose that splitting an LLM workload is pretty sensitive to that.
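A rough, illustrative model of why (every figure below is an assumption, not a benchmark of any real setup): with a pipeline-style split, each generated token has to ship one layer's activations across the link at every machine boundary, and the fixed per-message latency quickly dominates the wire time:

```python
# Toy cost model for splitting an LLM across two machines.
# All figures are illustrative assumptions, not measurements.
hidden_dim = 8192           # hypothetical model width
act_bytes = hidden_dim * 2  # fp16 activations: ~16 KB per token per crossing

link_bw = 8e9               # Thunderbolt-class link, bytes/sec
msg_latency = 50e-6         # assumed per-message latency through the stack

wire_time = act_bytes / link_bw       # time the bytes spend on the wire
hop_cost = msg_latency + wire_time    # dominated by latency, not bandwidth
print(f"wire: {wire_time*1e6:.1f} us, total per hop: {hop_cost*1e6:.1f} us")
```

Even a much fatter link only removes the ~2 us of wire time; the fixed per-message cost stays, which is why latency rather than raw throughput sets the floor on cross-machine token rates.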
Does the M5 series have a better video encoding block than the M4 series? Because while I'm happy with my M4 Pro overall, H.264 encoding performance with videotoolbox_h264 is disappointingly almost exactly the same as on my previous 2018 Intel Mac mini, and blown out of the water by NVENC on any mid- to high-end Nvidia GPU released in the last half-decade, maybe even the full decade. And video encoding is a pretty important part of a video editing workflow.
If you mean editing, ProRes is a better fit. If you mean final export, software always beats hardware encoders in terms of quality. If you mean mass H.264 transcoding, though, a Mac workstation is probably not the right place.
Just about every single consumer computer shipped today uses PCIe. If you were referring only to physical PCIe slots, that's wrong too: the vast majority of desktop computers, servers, and workstations shipped in 2025 had physical PCIe slots (the only ones that didn't were Macs and certain mini-PCs).
The 2023 Mac Pro was dead on arrival because Apple doesn't let you use PCIe GPUs in their systems.
My post-mortem sentiments exactly. The lack of Nvidia GPU support on the M-series Mac Pro models kneecapped the platform for professionals. If Apple had included that, they'd be the de facto professional workstation for many more folks working in AI.
Plus modern interconnects like CXL are also layers on top of PCIe, and USB4 supports PCIe tunnelling. PCIe is a big collection of specifications, the physical/link/transaction layers can be mixed and matched and evolved separately.
I don't see it disappearing, at most we'll get PCIe 6/7/etc.
It's not just about PCIe; it's socketed memory and disks. I guess disks are just PCIe technically, but memory sockets are great. Hell, in the Pro chassis I'm surprised they didn't opt for a socketed CPU that could be upgraded.
Apple really dropped the ball here. They had every ability to make something competitive with Nvidia for AI training as well as inference, by selling high end multi GPU Mac Pro workstations as well as servers, but for some reason chose not to. They had the infrastructure and custom SoCs and everything. What a waste.
It really could have been a bigger market for them than even the iPhone.
Just about everybody who isn't Nvidia dropped the ball, bigtime.
Intel should have shipped their GPUs with much more VRAM from day one. If they had done this, they'd have carved out a massive niche and much more market share, and it would have been trivially simple to do.
AMD should have improved their tools and software, etc.
Apple should have done as you say.
Google had nigh on a decade to boost TPU production, and they're still somehow behind the curve.
Such a lack of vision. And thus Nvidia is, now quite durably, the most valuable company in the world. Imagine telling that to a time traveler from 2018.
I think for AMD, they were focused on competing against Intel. Remember, AMD was almost bankrupt about 15 years ago because of that fight. But the very first GPU use for AI was actually with an ATI/AMD GPU, not an Nvidia one. Everyone thinks Nvidia kicked off the GPU AI craze when Ilya Sutskever cleaned up on AlexNet with an Nvidia GPU back in 2012, or when Andrew Ng and team at Stanford published their "Large Scale Deep Unsupervised Learning using Graphics Processors" in 2009, but in 2004, a couple of Korean researchers were the first to implement neural networks on a GPU, using ATI Radeons: https://www.sciencedirect.com/science/article/abs/pii/S00313...
And as of now I do believe AMD is in the second strongest position in the datacenter space after Nvidia, ahead of even Google.
Don't mistake stock market performance for revenue. Nvidia makes ~$200B annually, about the same as Apple makes from iPhones. It's a big market, but GPUs aren't just for AI.
I'm purely talking in terms of revenue. There's a huge demand for AI systems from personal workstations to datacenter servers, and Apple was one of the few companies in the world in a position to build complete systems for it.
But for some reason Apple thought the sound recording engineer or the video editor market was more important... like, WTF dude? Have some vision at least!
> something competitive with Nvidia for AI training
Apple is counting on something else: model shrink. Every one is now looking at "how do we make these smaller".
At some point a beefy Mac Studio and the "right sized" model is going to be what people want. Apple dumped a 4 pack of them in the hands of a lot of tech influencers a few months back and they were fairly interesting (expensive tho).
> Apple is counting on something else: model shrink
The most powerful AI interactions I've had involved giving a model a task and then fucking off. At that point, I don't actually care if it takes 5 minutes or an hour. I've queued up a list of background tasks it can work on, and that I can circle back to when I have time. In that context, smaller isn't even the virtue at hand; user patience is. Having a machine that works on my bullshit questions and modelling projects at one tenth the speed of a datacentre could still work out to be a good deal, even before considering the privacy and lock-in problems.
Give every iPhone family an in-house Siri that will deal with canceling services and pursuing refunds.
Your customer screw-up results in your site getting an agent-driven DDoS on its CS department till you give in.
Siri: "Hey User, here's your daily update, I see you haven't been to the gym, would you like me to harass their customer service department till they let you out of their onerous contract?"
The Ultra variants of the M series chips had previously consisted of two of the Max chips bonded together.
The M5-generation Pro and Max chips have moved to a chiplet-based architecture, with all the CPU cores on one chiplet and all the GPU cores on another.
Not surprising, as the market has broadly moved on from add-in cards in favor of smaller form factors and external devices, absent some notable holdouts in specific verticals.
Gonna miss it, though. If they had reduced the add-in card slots to something more reasonable, lowered the entry price, and given us multi-socket options for the CPU (2x M# Ultras? 4x?), it could have been an interesting HPC or server box - though they’ve long since moved away from that in software land, so that was always but a fantasy.
At least the Mac Studio and Minis are cute little boxes.
Here's an interesting fact: one of the more famous and fanatical Mac Pro fanboys was the late radio host Rush Limbaugh (he owned four of them), who dedicated an entire segment to the topic on his normally all-politics show when Apple dropped the ball on Thunderbolt back in the day.
MCPRUE sells shameless ripoffs of the Mac Pro case, but with support for standard motherboard sizing, if you really want your PC to double as a cheese grater: https://www.mcprue.com/case
I own one and there's nothing shameful about it. It's basically CNCed to Apple's standards, just without the logo. The cool thing is since Studio Displays work on Windows too, with Thunderbolt motherboards you can have a setup that's visually the same as a Mac but is actually a PC.
P.S. Does anyone know how well Studio Displays work on Linux these days? The best I could get was Ubuntu, where it basically worked out of the box on a fresh install. X11 KDE on Fedora was a close second. I couldn't get it working on Wayland whatsoever.
A Ryzen 9800X3D is about 40% faster in single-core tests, and the same speed to slightly faster at multi-core tasks, compared to the M2 Ultra in the Mac Pro. In addition, the Ryzen computer would presumably be modular and let the user choose their preferred configuration of memory, storage, GPU, etc., with options far exceeding those offered by Apple in its limited, non-user-upgradable machine. And configuring the Ryzen machine with specs comparable to the base model Mac Pro (64GB of RAM, 1TB of storage, and a low-end to midrange discrete GPU) would put you at a total system cost of something like 20-25% of the $6999 that the Mac Pro cost, even with today's inflated memory prices.
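As a rough sketch of where a 20-25% figure like that comes from (every part price below is an assumed ballpark, not a quote):

```python
# Hypothetical DIY Ryzen build priced against the $6999 base Mac Pro.
# All component prices are assumed ballpark figures for illustration.
MAC_PRO_BASE = 6999
build = {
    "Ryzen 9800X3D":       480,
    "AM5 motherboard":     200,
    "64GB DDR5":           300,
    "1TB NVMe SSD":        100,
    "midrange GPU":        350,
    "case + PSU + cooler": 250,
}
total = sum(build.values())
print(f"${total} -> {total / MAC_PRO_BASE:.0%} of the Mac Pro's price")
```

Swap in your own prices; the conclusion is not sensitive to any single line item.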
I'm not sure if this is what the parent meant by "a real modern PC," but it would certainly be 1) faster and 2) vastly cheaper than the Mac. So at minimum, your assertion that it'd be slower is wrong.
Depending on your configuration, you could likely also match the overall power consumption of the Mac as well, though yes, it is easily possible to exceed it. But the most likely way you'd exceed it is with a high-end GPU, which would vastly outperform the (fixed, non-upgradeable) GPU in the Mac.
And a 9800X3D is not even the fastest CPU out there, nor even the fastest CPU you could use with your specific motherboard. A 9950X3D is essentially two of the 9800X3Ds combined, and would be a drop-in replacement.
Still rocking a 2019 (Intel) Mac Pro here, all slots filled with various Pro Tools and UAD DSP cards, SSD, GPU, etc. I'm planning to get as much mileage out of it as I can. I'm sure a Studio would be more performant, but the Thunderbolt to PCIe chassis are not cheap.
While the trash can generation was somewhat present and around, I don't think I ever saw a cheese grater in the flesh. Did it have any users? Were there any actually useful expansion cards? Did anybody keep buying it at all after it didn't get the M3 Ultra bump that the Mac Studio got last year?
It had many hardware upgrades over the years: upgraded CPUs, 128GB RAM, 4TB NVMe storage, a modern AMD GPU, USB 3/USB-C, Thunderbolt, etc.
The only reason it got replaced is that it became too much of a PITA to keep modern macOS running on it (via OCLP).
I replaced it with an M4 Max Mac Studio, which is a nice and faster machine, but with no ability to upgrade anything, and with the much worse hardware resale value of M-series machines I'll have to replace it in 2-3 years.
I'm a former 4,1 user myself — replaced with an M2 Pro mini in Jan 2023 (finally retired fully in 2025).
Absolutely recommend the 4-bay TerraMaster external enclosure — gives you four SATA bays that are hot-swappable (unlike the Mac Pro's). 10 Gbps via USB-C.
The cheese grater mac pros were very popular, in that people got them and continued to use them.
The most notable feature was that there were Mac-specific graphics cards, and you could also run PC graphics cards (without a nice boot screen). They had a 1.4 kW power supply, I believe, and there was extra PCIe power for higher-end graphics cards. You could upgrade the memory and add up to 6 or more SATA hard disks (2 in the DVD slot). You could run Windows, dual-booting if you wanted, and Apple supported the drivers.
The 2013 was kind of a joke: small and quiet, but expansion was minimal.
The 2019 looked beefy, but the expansion was more like a cash register for Apple, not really democratic. There were third-party SATA hard disk solutions, though.
The 2023 model was basically a joke. I think maybe the PCIe slots were OK for NVMe cards, but not much else (unless Apple made it).
Nowadays an Apple computer is more like an iPhone: Apple would prefer everything to be welded shut.
I have three of the trash can ones. They are absolute pieces of art, as useless as they are computationally these days (energy-to-performance wise at least). I will never sell nor give them away.
They've been trying to kill the Mac Pro for over a decade. I wonder how long before they backtrack again? It seems like they should at least have a migration path for users who needed the expansion cards the Mac Pro supported. Pushing them to the PC seems pretty bad.
Apple's new "Pro" definition seems more like "Prosumer".
The form-factor always felt like a weird fit for Apple Silicon. With the Intel boxes it was understandable; you want a few liters of free space for a couple AMD cards or some transcode hardware. The system was designed to be expandable, and the Mac Pro was the apex of Apple's commitment to that philosophy after bungling the trashcan Mac Pro.
None of the Apple Silicon hardware can seemingly justify this form factor, though. The memory isn't serviceable, PCIe devices aren't really supported, the PSU doesn't need much space, and the cooling can be handled with mobile-tier hardware. Apple's migration path is "my way or the highway" for Mac Pro owners.
Their justification for the form factor, when it was released, was that pro users need various PCI cards to interface with some of their equipment, and this would allow them to do that.
It seemed like the guts of the Mac Pro were essentially shoved inside of a box and stuck in the corner of the tower. It would seem like they could decouple it and sell a box that pro users could load cards into (like other companies do for eGPUs). It wouldn’t feel like a very Apple-like setup, but it would function and allow Apple to focus where they want to focus without simply leaving those users behind.
I suppose the other option would be to dispense with the smoke and mirrors and let people slot a Mac Studio right into the Mac Pro tower, so it could be upgraded independently of the tower.
The alternative is people leave the platform or end up with a bunch of Thunderbolt spaghetti. Neither of which seem ideal.
I suspect we'll start seeing higher-spec Mac Studio options.
One of those with an M* Ultra, and some sort of Thunderbolt storage expansion would probably cover most of the Pro's use cases. And Apple probably doesn't want to deal with anything more exotic than those.
With the popularity of the Mac mini (and MacBooks, for that matter) for ML/AI work, I would have thought Apple could make a Mac Pro that would be a good workstation for in-house ML/AI stuff.
I bought a GPU maybe a decade ago for this, and it's not worth the hassle (for me at least), but a nice out-of-the box solution, I would pay for.
The problem is that the M1 chips foretold the doom of the Mac Pro unless Apple could figure out something you couldn't do with a Mac Studio; Thunderbolt is so good that it's hard to justify anything else.
If they had done more with NUMA in the M series maybe you could have a Mac Pro with M5 Ultras that can take a number of M5 "daughter cards" that do something useful.
Reading the comments, I don't think people are being completely fair here. For Intel and AMD to approach what Apple has accomplished, they're making many of the same compromises with Panther Lake and Ryzen AI Max. Apple chose to put the disk controllers on their SoP rather than on the storage module. This shaves a tiny bit of latency. Worth it? No idea. I'm shit at hardware design.
As for not having a Pro or otherwise expandable system? It’s shit. They make several variations of their chips, and I don’t think it would hurt them to make an SoP for a socket, put a giant cooling system in it, and give it 10 or 12 PCIe slots. As for what would go in those slots? Make this beast rack mountable and people would toss better network cards, sound/video output or capture, storage controllers, and all kinds of other things in there. A key here would be to not charge so much just because they can. Make the price reasonable.
They have tried variations of this since time immemorial (we can argue about "price reasonablé") but there's just not much you can do with it that you can't do much cheaper or simpler in other ways.
The Xserve has been dead for 15 years now, and it was never tremendously amazing (though it was nice kit).
Apple apparently has some sort of in-house Xserve-like thing they don't sell; turning that into a product would likely be more useful than a Mac Pro, unless they add NUMA or some other way of letting an M5 access racks and racks of DIMMs.
Mac Studio waits for the Ultra chips to ship, which are always last in a generation. Perhaps the M5's chiplet architecture will help them move faster there.
The 2013 trash can was the end of the Mac Pro. It was never the same after that. The 2012 and earlier Mac Pros were awesome. I had a 2010 model. Here's what I loved:
• Multiple hard drive bays for easy swapping of disks, with a side panel that the user could open and close
• Expandable RAM
• Lots of ports, including audio
• The tower took up no desktop space
• It was relatively affordable, starting at $2500. Many software developers had one. (The 2019 and later Mac Pros were insanely expensive, starting at $6000.)
The Mac Studio is affordable, but it lacks those other features. It has more ports than other Macs but fewer in number and kind than the old Mac Pro, because the Mac Studio is a pointlessly small desktop instead of a floor tower.
That's when they stopped designing computers for the pro market and started selling mid-century Danish furniture that can also edit videos.
I knew it was all over when third party companies had to develop the necessarily-awkward rack mount kits for those contraptions. If Apple actually cared about or understood their pro customers, they would have built a first party solution for their needs. Like sell an actual rack-mount computer again—the horror!
Instead, an editing suite got what looked like my bathroom wastebasket.
When it was introduced, Apple said the trash can was a revolution in cooling design.
Then they said they couldn't upgrade the components because of heat. Everyone knows that wasn't true.
By the time Apple said they had issues with it in 2017, AMD were offering 14nm GCN4 and 5 graphics (Polaris and Vega) compared to the 28nm GCN1 graphics in the FirePro range. Intel had moved from Ivy Bridge to Skylake for Xeons. And if they wanted to be really bold (doubtful, as the move to ARM was coming) then the 1st gen Epyc was on the market too.
Moore's Law didn't stop applying for 6 years. They had options and chose to abandon their flagship product (and most loyal customers) instead.
Aside from the GPU mess, the 2013 was a nice machine, basically a proto-Mac Studio. Aside from software, the only thing that pushed me off my D300/64GB/12-core as an everyday desktop + front-end machine is the fact that there's no economically sensible way to get 4K video at 120 Hz given that an eGPU enclosure + a decent AMD GPU would cost as much as a Mac mini, so I'm slumming it in Windows for a few months until the smoke clears from the next Mac Studio announcement.
At which point I'll decide whether to replace my Mac Pro with a Mac Studio or a Linux workstation; honestly, I'm about 60/40 leaning towards Linux at this point, in which case I'd also buy a lower-end Mac, probably a MacBook Air.
I'm in the Linux desktop / Mac laptop camp, and it works well for me. Prevents me getting too tied up in any one ecosystem so that I can jump ship if Apple start releasing duds again.
The biggest issue was actually that the Mac Pro was designed specifically for dual GPUs. In the era of SLI this made some sense, but once that technology was abandoned, it was a technological dead end.
If you take one apart you'll see why, it's not the case that you could have ever swapped around the components to make it dual-CPU instead; it really was "dual GPU or bust".
Somewhat ironically, in today's ML ecosystem, that architecture would probably do great. Though I doubt it could possibly do better than what the M-series is doing by itself with unified memory.
I'll admit that while I've used the trash can, I've never taken one apart myself. But I can't imagine it would have been impossible to put two Polaris 10 GPUs on the daughterboards in place of the FirePros.
For what is essentially a dead-end technology, I'm somewhat doubtful people would have bought it (since the second GPU would sit idle and add massively to the cost).
Upgrading the CPU would have been much easier, though, I think.
If I remember correctly, the maximum configuration was something like $35k back in the day. I wonder how those buyers feel now. On the other hand, if they had $35k to burn, they probably don't even think about it.
> Serviceable, repairable, upgradable Macs are officially a thing of the past.
Well, not exactly. Apple’s desktop Macs actually all have modular SSD storage, and third parties sell upgrade kits. And it’s not like Thunderbolt is a slouch as far as expandability.
I can see why the Mac Pro is gone. Yeah, it has PCIe slots…that I don’t really think anyone is using. It’s not like you can drop an RTX 5090 in there.
The latest Mac Pro didn’t have upgradable memory so it wasn’t much different than a Mac Studio with a bunch of empty space inside.
The Mac Studio is very obviously a better buy for someone looking for a system like that. It’s just hard to imagine who the Mac Pro is for at its pricing and size.
I think what happened is that the Studio totally cannibalized Mac Pro sales.
Apparently the Neo is surprisingly repairable - in that parts can be replaced, not that you can buy stuff at Microcenter or Fry's (RIP) and shove them in.
I didn’t phrase myself very well. What I’m saying is that the loss of the Mac Pro didn’t reduce the repairability or modularity at all in the product lineup.
It was exactly as modular as the Mac mini and Mac Studio.
The only difference is that it had some PCIe slots that basically had no use, since you couldn't throw a GPU in there and Thunderbolt 5 exists.
Yeah, sure, there were some niche PCIe things that two people probably used. Hence the discontinuation.
I am an ex-Mac user, I own a Framework. Don’t worry, you’re preaching to the choir.
Those are all for Intel Macs, and not even the recent Intel Macs. You can't use a passive adapter to put a NVMe SSD into a current Mac like you could a decade ago, because back then the only thing non-standard about the SSD was the connector. Now most of the SSD controller itself has moved to the SoC and trying to put an off the shelf SSD into the current slot makes no more sense than trying to put an SSD into a DIMM slot.
Honestly I don't care, but Apple's SSDs don't have a storage controller on them, and those adapters are designed to bypass the controller on M.2 drives.
You can argue that it's different for the sake of being different, but
A) I personally don't always hold that monopoly is a good thing, even if we agree m.2 is fairly decent it doesn't make it universally the best.
B) I'd make the argument that Apple is competing very well on performance and reliability.
C) IIRC there are some hardware guarantees that the new filesystem needs to be aware of (for wear levelling and error correction), and those would be obfuscated by a controller that thinks it's smarter than the CPU and OS.
If we're talking about Intel-era Macs, then that proprietary connector predates M.2 entirely and is actually even thinner and smaller (which is pretty important when the primary use case is thin-and-lights). Though I suppose the fact that the adapter fits is a sign that a larger connector would have been possible...
That is an absolutely awful argument against what I just said. I can tell that you don't care.
Tens of thousands of mini PC and laptop boards ship with multiple M.2 slots. Apple can use both connectors, with the exact same caveats that normal M.2 SSDs have on ordinary filesystems. Apple does not have to enable swap, zram, or other high-wear settings on macOS if they are uncomfortable with the inconsistency of M.2 drives. Now, I'd make the argument that people don't complain about APFS wear on external SSDs, but maybe I'm wrong and macOS does have some fancy bypass saving thousands of TBW/year.
Whatever the case is, "the annoying thing is competitive" was not a justification for the Lightning cable when it reached the gallows. It did not compete; it specifically protected Apple from the competitive pressure of higher-capacity connectors. The same is true of Apple's SSD racket and the decade-old meme of $400 1TB NVMe drives.
Siracusa—probably best known here for his fabulous OS X reviews for Ars—is a co-host of ATP. He is also known in such circles for having Mac Pros, and for using them for a long time (sometimes by choice, sometimes by circumstance). He thinks Apple should make a Mac Pro, not necessarily because it's a big seller, but because he thinks Apple should make a "best computer," much in the same way car companies might make a car that will never sell but pushes their engineers, etc.
Ages ago, when new Mac hardware came out, I'd amuse myself by putting together an "ultimate Mac workstation" in the configurator --- once upon a time, one could hit 6 figures pretty easily --- these days, well I panic bought a duplicate computer because I was worried a chipped/cracked display was going to make it unusable (turns out a screen protector has worked thus far).
I agree with the reasoning, and would like to see Apple continue to make aspirational hardware, but maybe the mainstream stuff is good enough?
Even Siracusa admits that - he's found it hard to articulate what a true "Mac Pro" would do that you can't do with other things.
Back in the heyday of the $100k Mac Pro you could certainly imagine it doing things that wouldn't be easily done by anything under $50k, and it would look good doing it.
Apple betrayed their pro customers years ago—right around the time they went to version X of the Pro apps—it's all been a slow death by a thousand paper cuts since then.
The money's all in selling phones to teen girls now, and taking their mafia cut of app store sales.
In 2007 Steve Jobs went on stage (next to a very young-looking Tim Cook) and angrily told a reporter "we don't ship junk". Those days are over, because the flagship product is now a $600 netbook.
It's not like Apple soldered some plain old DDR5 to a PCB to be difficult:
1. It's TSMC's InFO_POP, which has significant performance benefits.
2. There weren't any modules that existed for LPDDR until very recently.
3. The power/price/performance/thermals they're able to achieve with this configuration are not possible with socketed RAM. You're asking them to make the device worse.
Go pop open a Framework with a Ryzen AI Max processor -- you won't find socketed RAM. Technology has moved on. Math coprocessors and CPU cache aren't separate modules anymore either. AMD has even said they studied the possibility of LPCAMM for Strix Halo and found that even it wasn't good enough for signal integrity.
I picked up a 15" MacBook Air (M3) for $849 — clearance at Costco in early 2025.
This model only has 8GB of RAM — which is fine for streaming videos/typing — it absolutely could not be my daily driver, but it makes for good casual usage.
Machines probably should ship with more than that (or a lighter operating system?), particularly when the RAM isn't upgradeable. I reckon Apple will support at least two more macOS releases on these 8GB configurations.
My favorite machine only has 4GB of RAM (Core2Duo Max, Win7 Pro) and works well, albeit with nothing modern.
> simp: be excessively attentive or submissive to a person in whom one is romantically or sexually interested.
This word does not appear to be in any way relevant. You do not have to buy a MacBook Neo, but approximately everyone else in the low-end laptop market will.
If you think it is a bad product, go buy some Acer stock.
> but approximately everyone else in the low-end laptop market will
This is delusional. The retail price point right now for comparable PC laptops is $429, and they ship with DOUBLE the RAM and storage (16GB, 512GB).
For the same specs as the Neo, we're talking < $350.
There is NO market for this device. Apple is catering to the welfare crowd with this one, except anyone in that situation would opt for a PC at half the price.
Like trying to sell a Cadillac with a park bench for seats to save money. It makes no sense.
Bookmark this post, the Neo will be discontinued within two years.
> but approximately everyone else in the low-end laptop market will.
No, they won't. People repeat this, but Macs constitute a minority of the low-end market and will continue to for the foreseeable future.
This was the case when $400 Retina Intel MacBooks were flooding the used market; it was the case when Costco sold $700 M1 MBAs. If you cannot extrapolate what will happen with the $600 laptop, then I don't think you have paid attention to what the market is buying.
Opinions are my own obvs.
SR-IOV is just that, and it's well supported by both Windows and Linux.
Whose else would they be?
Damn, you can buy an oscilloscope that works over USB-C or the network.
It's not the Mac's or Apple's fault. We actually live in an age where systems are quite independent and don't require direct installation.
My GPU, NVMe drives and motherboard might disagree.
…so what do you actually need PCIe for?
Thunderbolt is also too slow for higher-end networks. A single port is already insufficient for 100-gigabit speeds.
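Rough numbers behind that claim, as a sketch (the Thunderbolt 5 link rates below come from Intel's published spec — 80 Gbit/s symmetric, 120 Gbit/s in asymmetric "Bandwidth Boost" mode — not from the comment above):

```python
# Back-of-the-envelope: Thunderbolt 5 link rates vs. a single
# 100GbE port, all in Gbit/s.
TB5_SYMMETRIC = 80    # standard bidirectional mode
TB5_BOOST = 120       # asymmetric "Bandwidth Boost" (one direction only)
ETH_100G = 100

# A symmetric TB5 link cannot carry full 100GbE line rate:
print(TB5_SYMMETRIC >= ETH_100G)  # False
# Boost mode exceeds it, but only in one direction:
print(TB5_BOOST >= ETH_100G)      # True
```

So even before protocol overhead, a single port comes up short for sustained bidirectional 100-gigabit traffic.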
Apple recently added support for InfiniBand over Thunderbolt. And now almost all decent Mac Studio configurations have sold out. Those two may be connected.
TIL:
* https://developer.apple.com/documentation/technotes/tn3205-l...
Or maybe I forgot:
* https://news.ycombinator.com/item?id=46248644
I suppose that splitting an LLM workload is pretty sensitive to that.
Thunderbolt is external PCIe.
This is a wild and very wrong take.
Just about every consumer computer shipped today uses PCIe. If you were referring only to physical PCIe slots, that's wrong too: the vast majority of desktop computers, servers, and workstations shipped in 2025 had physical PCIe slots (the only ones that didn't were Macs and certain mini-PCs).
The 2023 Mac Pro was dead on arrival because Apple doesn't let you use PCIe GPUs in their systems.
I don't understand how this is a response to anything I said.
I don't see it disappearing, at most we'll get PCIe 6/7/etc.
It really could have been a bigger market for them than even the iPhone.
Intel should have shipped their GPUs with much more VRAM from day one. If they had done this, they'd have carved out a massive niche and much more market share, and it would have been trivially simple to do.
AMD should have improved their tools and software, etc.
Apple should have done as you say.
Google had nigh on a decade to boost TPU production, and they're still somehow behind the curve.
Such a lack of vision. And thus Nvidia is, now quite durably, the most valuable company in the world. Imagine telling that to a time traveler from 2018.
And as of now I do believe AMD is in the second strongest position in the datacenter space after Nvidia, ahead of even Google.
But for some reason Apple thought the sound recording engineer or the video editor market was more important... like, WTF dude? Have some vision at least!
Apple is counting on something else: model shrink. Everyone is now looking at "how do we make these smaller".
At some point a beefy Mac Studio and the "right sized" model is going to be what people want. Apple dumped a 4 pack of them in the hands of a lot of tech influencers a few months back and they were fairly interesting (expensive tho).
The most powerful AI interactions I've had involved giving a model a task and then fucking off. At that point, I don't actually care if it takes 5 minutes or an hour. I've queued up a list of background tasks it can work on, and that I can circle back to when I have time. In that context, smaller isn't even the virtue at hand; user patience is. Having a machine that works on my bullshit questions and modelling projects at one tenth the speed of a datacentre could still work out to be a good deal, even before considering the privacy and lock-in problems.
Give every iPhone family an in-house Siri that will deal with canceling services and pursuing refunds.
Your customer screw-up results in your site getting an agent-driven DDoS on its CS department till you give in.
Siri: "Hey User, here's your daily update, I see you haven't been to the gym, would you like me to harass their customer service department till they let you out of their onerous contract?"
The M5 generation Pro and Max chips have moved to a chiplet based architecture, with all the CPU cores on one chiplet, and all the GPU cores on another.
https://www.wikipedia.org/wiki/Apple_M5
So what will the M5 Ultra look like?
If you integrate two CPU chiplets and two GPU chiplets, you're looking at 36 CPU cores, 80 GPU cores, and 1228 GB/s of memory bandwidth.
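That's just doubling the per-die figures — the per-die numbers (18 CPU cores, 40 GPU cores, ~614 GB/s) are assumptions inferred from the totals above, under the guess that an Ultra fuses two Max dies as in prior generations:

```python
# Hypothetical M5 Ultra as two fused M5 Max dies (UltraFusion-style).
# Per-die numbers are assumptions inferred from the parent comment's
# totals, not confirmed specs.
m5_max = {"cpu_cores": 18, "gpu_cores": 40, "mem_bw_gb_s": 614}

m5_ultra = {spec: 2 * value for spec, value in m5_max.items()}
print(m5_ultra)  # {'cpu_cores': 36, 'gpu_cores': 80, 'mem_bw_gb_s': 1228}
```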
Gonna miss it, though. If they had reduced the add-in card slots to something more reasonable, lowered the entry price, and given us multi-socket options for the CPU (2x M# Ultras? 4x?), it could have been an interesting HPC or server box - though they’ve long since moved away from that in software land, so that was always but a fantasy.
At least the Mac Studio and Minis are cute little boxes.
https://macdailynews.com/2012/06/12/rush-limbaugh-okay-apple...
P.S. Does anyone know how well Studio Displays work on Linux now? The best I could get was on Ubuntu, where it basically worked out of a fresh install. X11 KDE on Fedora was a close second. Couldn't get it working on Wayland whatsoever.
I'm not sure if this is what the parent meant by "a real modern PC," but it would certainly be 1) faster and 2) vastly cheaper than the Mac. So at minimum, your assertion that it'd be slower is wrong.
Depending on your configuration, you could likely also match the overall power consumption of the Mac as well, though yes, it is easily possible to exceed it. But the most likely way you'd exceed it is with a high-end GPU, which would vastly outperform the (fixed, non-upgradeable) GPU in the Mac.
Not only is it screamingly fast (the fastest on earth for some workloads), but I can upgrade it easily. And it's dead silent too.
The best thing is it runs native Linux and it just works.
It had many hardware upgrades over the years: upgraded CPUs, 128GB RAM, 4TB NVMe storage, a modern AMD GPU, USB3/USB-C, Thunderbolt, etc.
The only reason it got replaced is that it became too much of a PITA to keep modern macOS running on it (via OCLP).
Replaced with an M4 Max Mac Studio, which is a nice and faster machine, but with no ability to upgrade anything and much worse resale value on M-series hardware, I'll have to replace it in 2-3 years.
Absolutely recommend you purchase the 4-bay Terramaster external enclosure — gives you four SATA slots that are hot-swappable (unlike the Mac Pro's). 10Gbps via USB-C.
The most notable feature was that there were Mac-specific graphics cards, and you could also run PC graphics cards (without a nice boot screen). They had a 1.4kW power supply I believe, and there was extra PCIe power for higher-end graphics cards. You could upgrade the memory, add up to 6 or more SATA hard disks (2 in the DVD slot). You could run Windows, dual booting if you wanted, and Apple supported the drivers.
The 2013 was kind of a joke. Small and quiet, but expansion was minimal.
The 2019 looked beefy, but the expansion was more like a cash register for Apple, not really democratic. There were third-party SATA hard disk solutions, though.
The 2023 model was basically a joke. The PCIe slots were maybe OK for NVMe cards, but not a lot else (unless Apple made it).
Nowadays an Apple computer is more like an iPhone: Apple would prefer everything be welded shut.
Apple's new "Pro" definition seems more like "Prosumer".
That's a cute way of saying that GPUs aren't supported.
None of the Apple Silicon hardware can seemingly justify this form factor, though. The memory isn't serviceable, PCIe devices aren't really supported, the PSU doesn't need much space, and the cooling can be handled with mobile-tier hardware. Apple's migration path is "my way or the highway" for Mac Pro owners.
It seemed like the guts of the Mac Pro were essentially shoved inside of a box and stuck in the corner of the tower. It would seem like they could decouple it and sell a box that pro users could load cards into (like other companies do for eGPUs). It wouldn’t feel like a very Apple-like setup, but it would function and allow Apple to focus where they want to focus without simply leaving those users behind.
I suppose the other option would be to dispense with the smoke and mirrors and let people slot a Mac Studio right into the Mac Pro tower, so it could be upgraded independently of the tower.
The alternative is people leave the platform or end up with a bunch of Thunderbolt spaghetti. Neither of which seem ideal.
I always hoped we’d get a consumer version of what they have internally - 10 or 20 or more Apple Silicon chips for 1000 cores or so.
One of those with an M* Ultra, and some sort of Thunderbolt storage expansion would probably cover most of the Pro's use cases. And Apple probably doesn't want to deal with anything more exotic than those.
https://www.youtube.com/watch?v=x4_RsUxRjKU or something
I bought a GPU maybe a decade ago for this, and it's not worth the hassle (for me at least), but a nice out-of-the box solution, I would pay for.
If they had done more with NUMA in the M series maybe you could have a Mac Pro with M5 Ultras that can take a number of M5 "daughter cards" that do something useful.
As for not having a Pro or otherwise expandable system? It’s shit. They make several variations of their chips, and I don’t think it would hurt them to make an SoP for a socket, put a giant cooling system in it, and give it 10 or 12 PCIe slots. As for what would go in those slots? Make this beast rack mountable and people would toss better network cards, sound/video output or capture, storage controllers, and all kinds of other things in there. A key here would be to not charge so much just because they can. Make the price reasonable.
The Xserve has been dead for 15 years now, and it was never tremendously amazing (though it was nice kit).
Apple apparently has some sort of "in-house" Xserve-like thing they don't sell, but turning that into a product would likely be more useful than a Mac Pro, unless they add NUMA or some other way of allowing an M5 to access racks and racks of DIMMs.
• Multiple hard drive bays for easy swapping of disks, with a side panel that the user could open and close
• Expandable RAM
• Lots of ports, including audio
• The tower took up no desktop space
• It was relatively affordable, starting at $2500. Many software developers had one. (The 2019 and later Mac Pros were insanely expensive, starting at $6000.)
The Mac Studio is affordable, but it lacks those other features. It has more ports than other Macs but fewer in number and kind than the old Mac Pro, because the Mac Studio is a pointlessly small desktop instead of a floor tower.
I knew it was all over when third party companies had to develop the necessarily-awkward rack mount kits for those contraptions. If Apple actually cared about or understood their pro customers, they would have built a first party solution for their needs. Like sell an actual rack-mount computer again—the horror!
Instead, an editing suite got what looked like my bathroom wastebasket.
Then they said they couldn't upgrade the components because of heat. Everyone knows that wasn't true.
By the time Apple said they had issues with it in 2017, AMD were offering 14nm GCN4 and 5 graphics (Polaris and Vega) compared to the 28nm GCN1 graphics in the FirePro range. Intel had moved from Ivy Bridge to Skylake for Xeons. And if they wanted to be really bold (doubtful, as the move to ARM was coming) then the 1st gen Epyc was on the market too.
Moore's Law didn't stop applying for 6 years. They had options and chose to abandon their flagship product (and most loyal customers) instead.
At which point I'll decide whether to replace my Mac Pro with a Mac Studio or a Linux workstation; honestly, I'm about 60/40 leaning towards Linux at this point, in which case I'd also buy a lower-end Mac, probably a MacBook Air.
If you take one apart you'll see why, it's not the case that you could have ever swapped around the components to make it dual-CPU instead; it really was "dual GPU or bust".
Somewhat ironically, in today's ML ecosystem, that architecture would probably do great. Though I doubt it could possibly do better than what the M-series is doing by itself using unified memory.
https://i.ebayimg.com/images/g/RQIAAOSwxKFoTHe3/s-l1200.jpg
For what is essentially a dead-end technology, I'm somewhat doubtful people would have bought it (since the second GPU is going to be idle and add to the cost massively).
Upgrading the CPU would have been much easier, though, I think.
I was talking about the form factor of the machine.
Well, not exactly. Apple’s desktop Macs actually all have modular SSD storage, and third parties sell upgrade kits. And it’s not like Thunderbolt is a slouch as far as expandability.
I can see why the Mac Pro is gone. Yeah, it has PCIe slots…that I don’t really think anyone is using. It’s not like you can drop an RTX 5090 in there.
The latest Mac Pro didn’t have upgradable memory so it wasn’t much different than a Mac Studio with a bunch of empty space inside.
The Mac Studio is very obviously a better buy for someone looking for a system like that. It’s just hard to imagine who the Mac Pro is for at its pricing and size.
I think what happened is that the Studio totally cannibalized Mac Pro sales.
We should demand better of our computer-manufacturing overlords.
> It’s not like you can drop an RTX 5090 in there.
Why not? Oh, right, because Apple won't let you. Sad.
It was exactly as modular as the Mac mini and Mac Studio.
The only difference is that it had some PCIe slots with basically no use, since you couldn't throw a GPU in there, and because Thunderbolt 5 exists.
Yeah, sure, there were some niche PCIe things that two people probably used. Hence the discontinuation.
I am an ex-Mac user, I own a Framework. Don’t worry, you’re preaching to the choir.
"Modular" does not mean serviceable, repairable, or upgradable. Apple's refusal to adopt the basic M.2 spec is a pretty glaring example of that.
I get the ideological angle, but in practical terms that's not a barrier: https://www.aliexpress.us/w/wholesale-apple-ssd-adapter.html...
You can argue that it's different for the sake of being different, but:
A) I personally don't hold that a monopoly is always a good thing; even if we agree M.2 is fairly decent, that doesn't make it universally the best.
B) I'd make the argument that Apple is competing very well on performance and reliability.
C) IIRC there are some hardware guarantees the new filesystem needs to be aware of (for wear levelling and error correction), and those would be obfuscated by a controller that thinks it's smarter than the CPU and OS.
If we're talking about Intel-era Macs, then that proprietary connector predates M.2 entirely and is actually even thinner and smaller (which is pretty important when the primary use case is thin-and-lights); though I suppose the fact that the adapter fits is a sign that it would have been possible to use a larger connector...
Tens of thousands of mini PC and laptop boards ship with multiple M.2 slots. Apple can use both connectors, with the exact same caveats that normal M.2 SSDs have on ordinary filesystems. Apple does not have to enable swap, zram, or other high-wear settings on macOS if they are uncomfortable with the inconsistency of M.2 drives. Now, I'd make the argument that people don't complain about APFS wear on external SSDs, but maybe I'm wrong and macOS does have some fancy bypass saving thousands of TBW/year.
Whatever the case is, "the annoying thing is competitive" was not a justification for the Lightning cable when it reached the gallows. It did not compete, it specifically protected Apple from the competitive pressure of higher-capacity connectors. The same is true of Apple's SSD racket and the decade-old meme of $400 1tb NVMe drives.
I guess the t-shirts at WWDC were not enough.
Thus the workstation market joins OS X Server.
They made a shirt. It was fun.
I agree with the reasoning, and would like to see Apple continue to make aspirational hardware, but maybe the mainstream stuff is good enough?
Even Siracusa admits that - he's found it hard to articulate what a true "Mac Pro" would do that you can't do with other things.
Back in the heyday of the $100k Mac Pro you could certainly imagine it doing things that wouldn't be easily done by anything under $50k, and it would look good doing it.
https://www.macrumors.com/2026/03/26/mac-pro-wheels-kit-disc...
The money's all in selling phones to teen girls now, and taking their mafia cut of app store sales.
1. It's TSMC's InFO_POP, which has significant performance benefits.
2. There weren't any modules that existed for LPDDR until very recently.
3. The power/price/performance/thermals they are able to achieve with this configuration is not possible with socketed RAM. You are asking them to make the device worse.
And Apple is effectively committing to supporting 8GB computers with their OS upgrades for years to come.