I always get a little bothered when I see negative reviews from a CPU update in Apple laptops. While a new CPU alone isn’t a thrilling update, it’s important that they do these regularly so consumers looking to buy aren’t forced to buy a 3 year old product with no idea when a refresh will come. I’ve been in this situation many times with Apple and it has been very frustrating. I’m glad they are back on a yearly refresh schedule.
I think the issue stems from too many people making their living off reviews that require something exciting to get views. When updates are more evolution than revolution, it makes for a more boring article/video. I always worry that these types of responses will lead Apple to do silly things, like leaving old chips out there too long, or adding pointless features just so there is something new to talk about.
> While a new CPU alone isn’t a thrilling update, it’s important that they do these regularly so consumers looking to buy aren’t forced to buy a 3 year old product with no idea when a refresh will come.
Also: incremental updates add up.
A (e.g.) 7% increase from one year to the next isn't a big deal, but +7%, +7%, +7%, …, adds up when you finally come up for a tech refresh after 3-5 years.
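Concretely (using the same illustrative 7% rate as above, not a measured number):

```python
# Compounding a modest 7%/yr gain over a typical refresh cycle.
for years in (3, 4, 5):
    gain = 1.07 ** years
    print(f"{years} years of +7%/yr -> {gain:.2f}x (+{gain - 1:.0%})")
# 3 years -> 1.23x (+23%), 4 years -> 1.31x (+31%), 5 years -> 1.40x (+40%)
```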
It's 2025; the fact that Apple is delivering CPUs with actual, noticeable annual performance improvements is pretty astounding in itself. Sure it's not 1990s levels, but it's still pretty great.
M silicon/SoC is the best thing to happen to computing, for me.
I have 64GB of RAM in my MacBook Pro. I load a 48GB DuckDB database into RAM and run real-time, split-second, complex, unique analyses using Polars and Superset. Nothing like this was possible before unless I had a supercomputer.
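A minimal sketch of that kind of workflow, with made-up file, table, and column names (the actual stack is obviously more involved):

```python
import duckdb
import polars as pl

# Open the DuckDB database; with enough RAM, the working set stays
# memory-resident after the first scan.
con = duckdb.connect("analytics.duckdb")  # hypothetical path

# Ad-hoc aggregation, handed straight to Polars as a DataFrame.
df = con.execute("""
    SELECT region, date_trunc('day', ts) AS day, sum(revenue) AS revenue
    FROM events
    GROUP BY region, day
""").pl()

# Follow-up transformations stay in Polars, entirely in RAM.
print(
    df.filter(pl.col("revenue") > 0)
      .sort("revenue", descending=True)
      .head(10)
)
```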
Is it really that much better than some small-form-factor AMD Ryzen box with 2x32GB of SODIMM RAM thrown in? I get that the M series is amazing in terms of efficiency and some people love Apple hardware, but you could likely have had that performance with a $700 setup.
The only server that actually matched the performance of a Mac Studio was the Xeon Max series (formerly codenamed Sapphire Rapids HBM), with 64GB of memory integrated into the CPU package. The latency between the CPU and RAM is simply too high in a regular PC.
Yes, we have PCs. AFAIK, the cheapest PC that compares for my workflow is an EPYC/NUMA box or another very expensive CPU/latency-optimized server. We have a complex stack, with clients running unique queries that we can't predict and gigabytes loaded into RAM; the L3 cache doesn't always save us. I haven't found another solution. I wish we could drop the Macs, because the OS is pretty awful.
We're using Macs as servers. But it's a small operation.
Also: we shouldn't make a big deal out of every update then. Celebrating M1: alright, but then M2-M500 are boring and not even worth noting, because you know there's a new one every year.
In the vocabulary of finance you don't multiply gains, you add them. That probably derives historically from dividends being added. At the transactional level you never actually multiply money, after all (unless you're a bank).
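For what it's worth, log returns really do add exactly, which is part of why additive language works. A toy illustration with assumed numbers:

```python
import math

simple = [0.07, -0.03, 0.05]  # assumed period returns
# Compounded total return...
total = math.prod(1 + r for r in simple) - 1
# ...equals the exponential of the summed log returns.
via_logs = math.exp(sum(math.log(1 + r) for r in simple)) - 1
assert abs(total - via_logs) < 1e-12
print(f"total return: {total:.2%}")  # ~8.98%
```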
Agree. So many people online (not just reviewers) are complaining that it's just a spec bump and demanding a new design. I remember when people were (rightfully) complaining that the update schedules for Macs were slow, mainly because of Intel's limitations. Now we get a yearly refresh, and they complain that it looks the same.
I don't think they appreciate the cost of redesigning and retooling. I echo your thoughts and hope Apple doesn't listen to this feedback. Imagine more expensive laptops because some people want more frequent design changes!
> So many people online (not just reviewers) complaining that it's just a spec-bump, demanding a new design.
If ever there was a case of "be careful what you wish for" - whether it's the Touch Bar, deleting ports or the butterfly keyboard, a redesign isn't necessarily a positive.
Apple is the company where every new iPhone release is both simultaneously boring, not worth it and a sure sign of their impending collapse and also somehow a vicious treadmill which forces people to upgrade every year, throwing out their old phone and contributing to e-waste. They could announce a cure for cancer tomorrow and within a month people would be back to asking what have they innovated recently. People just like to complain.
Perhaps it's just a language slip, but how are people forced to upgrade every year? My experience is the opposite: iOS 15 is still supported[0] and my 2016 iPhone still lets me access the World Wide Web.
The force you're talking about comes instead from developers (like me) who implement ever more CPU/GPU-hungry features and systems.
In certain circles it's assumed that people buy a new phone every year. Then they make 30-minute YouTube videos lambasting Apple about "incremental upgrades" and "zero innovation"
While also not getting that they're NOT the target market.
For the person whose iPhone finally (after half a decade or more) falls out of major version support a 5-6 generation jump on hardware is amazing.
I think some people see having an older generation iPhone as sending a signal of "I’m poor", a status thing. Pretty ugly thing, but the act of buying iPhones on credit happens too often.
People quickly become accustomed to common occurrences that are not threatening (like extreme weather events). Apollo 8 was the first time humans reached the moon, just orbiting it. Sixteen months later, in a time with much less media and information than we have now, US TV networks chose not to broadcast an en-route feed for Apollo 13 because this was no longer seen as interesting. We often seem spoiled, we often seem prone to complaining, and we often seem more enamored with something new. Yet there are so many remarkable things we take for granted.
That’s because it’s called “the news”, not “the olds”.
When I teach people how to talk to reporters I always emphasize this. If it’s the 10th time something happened, you need to explain it in terms of what’s -new- or your info won’t go beyond the pitch meeting.
That's why your town's street fair makes a big deal that it's the 10th anniversary event. It's "news" that you've hit a round number. That's why Trump breaks the law in a little way before doing it in a big way… the second time isn't interesting.
Add me to the group. I'm on an iPhone 11 and I couldn't be happier. I do follow their new launches and then look at what I have. It looks and works like new; I have absolutely no complaints. 6+ years and going strong.
iPhone 12 mini here. I replaced the battery last year, and it's still going strong. Actually, I fear the day I'm forced to upgrade due to the mini size no longer being available...
People who upgrade every year don't do it for technical needs. We're long past the times when phones were inadequate and yearly improvements were big leaps that made them less unusable.
Yearly phone upgrades are just to sport the latest model; it symbolizes status. Or maybe there's some deal where you can do it for close to no cost, which beats long upgrade cycles, but I don't think "free upgrades" are common.
See!? They waste all this time and money curing cancer, which ended up being soo easy in the end, and not a single bit of effort into curing Alzheimer’s. Pricks.
Intel had multiple years of promising that their new next-gen more efficient 10nm CPUs were coming very soon, and then those kept being delayed.
The chips they did release in that time period were mostly minor revisions of the same architecture.
Apple was pretty clearly building chassis designs for the CPUs that Intel was promising to release, and those struggled with thermal management of the chips that Intel actually had on the market. And Apple got tired of waiting for Intel and having their hardware designs out of sync with the available chips.
> and those struggled with thermal management of the chips
An ironic mirror of the PowerPC era when every version of the G5 was struggling with high power consumption and heat generation when operated at any competitive frequency/performance level. The top end models like the 2.5GHz quad-G5 needed water cooling, consumed 250W when idle, and needed a 1kW PSU.
Intel's offering at the time was as revolutionary as the M-series chips.
Yep, it was a very similar situation where Apple wanted to keep their hardware cadence but were beholden to a third party on the chip roadmap.
These days they're still somewhat beholden to TSMC continuing to make progress on nodes etc, but I think they have a closer partnership and a lot more insight into that roadmap so they can keep their hardware plans in sync.
Indeed. There was a significant period where they, er, weren't really better than last year's, though. Remember Broadwell? Followed by Skylake, which took about two years to go from "theoretically available" to "actually usable".
And then Skylake's successors, which were broadly the same as Skylake for about four years.
Yeah, I literally just bought an M4 device mere weeks before the M5 came out. The performance jump is nontrivial for my use case. Am I worried about it? Nah. In another year there will be another jump, and then the year after. I’m just on a different upgrade cadence, that’s all.
Meanwhile back in the pre-M1 days I remember stalking Mac rumors for months trying to make sure I wasn't going to buy right before their once-in-a-blue-moon product refresh. You could buy a Mac and get most of its useful life before they upgraded the chip, if you timed it right, so an upgrade right after you bought was a real kick in the pants.
The review ecosystem is really toxic in that regard, as manufacturers will cater to it.
We had the silly unboxing-video fad, and it meant gorgeous packaging flying in the face of recyclability and cost reduction.
I wonder if the glass backs and the utterly shiny but heavy, PITA-to-repair designs also come partly from there. A reviewer doesn't care much that it costs half the price of the phone to repair the back panel.
I don’t think most people are upset about the update itself. They’re just reacting to the mismatch between the scale of the improvements and the scale of the marketing.
To be fair, Apple didn't even market this launch; it was a very quiet release. But the reviewers needed to make content, so here we are.
That's not to say that they generally aren't guilty of making massive fanfare about ridiculously incremental phone updates and swapping the location of the camera bump just to make it look different.
> I think the issue stems from too many people making their living off reviews that require something exciting to get views.
The problem is that our hardware, as we know it, has lost a lot of its headroom. It used to be that we got 100% performance gains from one generation to the next. Then it became 50%, 30%... In the GPU market, for example, the last generation that actually got me excited was the 1000 series (the 1070 specifically).
Now it's "boring" 10 to 15% upgrades generation over generation (if we don't count naming/pricing rearrangements).
When was the last time any of us said "hey, I am genuinely excited to buy this tech"? The Apple M1 comes to mind, and that was 5 years ago.
Nvidia tried to push ray tracing (a bit too early), but again, it's just an incremental update to graphics (we already had a lot of performant tricks to simulate lighting effects). So again, kind of a boring gain in hindsight.
Mobile gaming handhelds were thrilling: the Steam Deck... Then we got competitors, but with high price tags, so the excitement faded. And now nobody blinks when a new generation gets released, because the CPU/iGPU gains are the same boring 15 to 20%... So who wants to put down 700 or 900 euros for a 15% gain?
What has really gotten you excited, where you were just willing to throw money at something? AI? And we see the same issue with LLMs... in barely a year, what used to be big steps has gone from massive gains to incremental ones. 10% better on this benchmark, 5% better there... So it becomes boring (see the GPT-5 launch and reaction, or the Sora 2 launch and reaction).
> When updates are more evolution than revolution, it makes for a more boring article/video.
If you think about it, there is a reason why tech channels struggle and are even more clickbaity than ever. Those people live on views, so when the tech they follow/review is boring to the audience, they push more and more clickbait. But that eventually burns the channels out.
Unfortunately, we have an entire industry designed around making parts smaller and smaller every generation to get those gains, and we have lost the ability to extract large gains from that shrinking...
It's ironic: we knew this was coming, and yet it seems nobody made any breakthrough at all. Quantum computing was a field everybody knew had no road to general computing at home (materials issues).
So what is left is the same old routine: make the die a bit smaller, gain a bit, do some optimizing left and right, and call it a new product. But customers are getting product 2.1 marketed as "this is our product 3.0!!!! Buy buy"... and they can see it's really just 2.1, 2.2, 2.3...
We are in a boring time because companies sat too darn long on their behinds, milking their existing products but never really figuring out how to make new ones. I think the only one that took a risk was Intel years ago, and it blew up in their face.
So yes, unless some smart cookie comes up with a new invention that can revolutionize how we make chips (and that can be mass-produced), boring is the standard now. No matter how companies try to repackage it.
Every car company in the world realized that yearly product updates were the way to go, and no one whines that this year's model isn't good enough to justify upgrading from the previous year.
The auto market doesn't really work that way any more. Each new model is now expected to last about 8 years (plus or minus depending on the manufacturer) with only one minor mid-cycle refresh.
Nobody really talks about the knock-on effects of the attention economy. I opted out of it a long time ago. I despise YouTubers, TikTokers, etc. because of the world they're shaping, and it's not just the injection of mindless rubbish into people's brains; it's real-world effects like you outline in your second paragraph. It amazes me when I see seemingly smart people on HN talking about how they're addicted to YouTube "Shorts"; it's like being addicted to lobotomies.
Not me: I wanted Apple's software division to innovate like its hardware division. Extra power with nothing to use it on except more and more Docker containers isn't compelling to me. I've not upgraded my M1 MacBook Pro and don't plan to.
I'd much prefer they focus on fixing their existing software quality problems. No innovation needed, just boring old software maintenance and design work.
They need a "Snow Leopard-style" release. This was the macOS version that came after Leopard and it was explicitly a release with no new features. It just focused on performance, efficiency, and reducing overall memory usage. Famously Apple even advertised it as having "zero new features". Recent macOS releases feel like they're going in the opposite direction. Really wish their next release would be more like Snow Leopard.
This is all I want. My MacBook Air is already stupid fast. Now I want the OS to support working faster. I would love nothing more than a ton of small quality-of-life updates.
Windows has also needed this for 10 years, since the Windows 10 launch. But it never happened. I guess it's just not really viable, as they need to keep selling new devices...
I still don't understand how there can be something like 10 different UI styles in Windows 11 with no progress toward getting that fixed. It feels more like they are adding new ones instead...
I agree with your sentiment, but let's not rewrite history too much. Snow Leopard didn't have any new features, but under the hood it was a massive undertaking IIRC: it introduced a 64-bit kernel and 64-bit system applications like Finder, Mail, Safari, etc. It also replaced many 32-bit system frameworks. Until Snow Leopard, Mac OS X was still mostly 32-bit.
When Snow Leopard came out it was very buggy, and many apps simply did not run on it. I've been a Mac user since 1993, and I think it's the only version of macOS I ever downgraded from. Don't get me wrong, it eventually became rock solid, the apps I needed were eventually upgraded, and it became a great OS.
But let's not mistake MacOS 10.6.8 for MacOS 10.6.0. And maybe let's not compare macOS 26.0 to MacOS 10.6.8 either, it's not quite fair. Ever since Snow Leopard I've been waiting at least 6 months before upgrading macOS. I don't intend to change that rule anytime soon...
They are already innovating more than they can deliver. While Apple's hardware quality is usually good, their quality standards are much lower on the software side.
This is surprising to me. I always thought Apple was ahead on the UX side of software with their attention to detail. Though it's been a while since their hardware design flaws, and their software has had new issues. Even Louis Rossmann has mostly stopped talking about Apple since their repairability changes (or due to having bigger fish to fry).
Their hardware used to be atrocious and software was the only thing they would get praised on. Now they've gotten lazy in the software department and outstanding in the hardware department.
Side note: Rossmann has stopped talking about Apple because he is no longer focused on Apple repair and is turning his attention to other causes, not because of Apple's "repairability" changes, which are still a token gesture.
Apple UX: "intuitive" features you have to discover via some random video reel. Of course you have to drag your messages sideways to see when they were sent, that's Good UX!
Sprinkle with crashes and bugs that are never fixed and charge a premium.
Apple UX: A beginner on a new Mac can’t right-click and copy, because there is a right-click but it isn’t active by default.
Go to Spotlight -> Type “Settings” -> Locate the settings -> In settings, go to Accessibility -> Wait no, it’s Mouse -> Gestures -> Activate the right-click.
^ That’s the experience for beginners. That screen should be in the installation wizard if Apple wants to make it optional. “Customize your mouse gestures”.
I don't even want more Docker containers; I want to be able to run the same containers with much less overhead on the system. Hoping their new native containerization delivers that soon.
You want Apple to invent reasons for you to need a more powerful computer? I could understand this argument for the iPad, but this is a weird complaint for Macs. Play a video game, use local LLMs, or get into video editing?
Never owned an Apple device myself, but honestly I just want the existing tech to start coming down in cost. I grabbed a foldable this year and it's great. It was very much not worth $2,200 though (and that was before tariffs), so I grabbed a used two-year-old model for $800 or so.
I'm already salivating at the thought of a foldable tablet in any form. But not at the thought of paying $3,000 for one at current pricing.
I'm just saying I need a real reason to upgrade beyond "benchmarks make me feel good".
I can run local LLMs fine on my M1 Pro from 2021. I play games on it too. Why would I spend multiple thousands on an M(N) MacBook if there's no real reason to? It's not like when I upgraded from a 386DX to a Pentium.
I have a similar argument for phones right now. There are some AI-related reasons to upgrade, but there's not really a clear killer app yet besides frontier model chat apps. Why should I spend thousands of euros on an upgrade when I don't get anything significantly different for it?
> Why should I spend thousands of euros on an upgrade when I don't get anything significantly different for it?
You shouldn't and nobody is asking you to. Apple can sell their new computers to billions of prospective customers who wish to upgrade from x86 or buy their first computer.
At this point, does the hardware division really innovate that much?
There is significant improvement from the M4 to the M5, but how much of it comes from TSMC and how much from Apple? They have exclusivity on the latest processes, so it's harder to compare with what Qualcomm or AMD are doing, for instance, but right now Strix Halo is basically on par with the M3/M4 developed on the same node density.
As for the rest of the hardware, form factor has mostly stagnated, and the last big jump was the Vision Pro...
The Vision Pro was still pretty recent. They also refreshed the hardware designs of everything when moving to the M-series chips.
They also made that new wireless chip recently, the chips for the headphones, and some for the Vision Pro. The camera in the iPhone also gets a lot of attention, which takes a lot of hardware engineering. In the iPhone more generally we saw fairly big changes just a month or so ago with the new Pro phone and the Air. The Pro models of the MacBook and iPad are almost as thin as, if not thinner than, the Air line, which I'm sure took a considerable amount of work, to the point of making the Air branding a little silly.
The Vision Pro is a technical marvel, but as a hardware product it's uncomfortable (too heavy), bloated (useless front screen) and badly designed (straps are finally getting decent if we can trust the reviews).
These decisions IMHO fall on the hardware team, and they're not doing a good job. Meta's hardware team is arguably pulling more weight, as much as we can hate Meta for being Meta.
> headphones
Here again, the reception wasn't that great. The most recent AirPods Pro were a mixed bag, and the AirPods Max had most of the flaws of the Vision Pro; they didn't learn anything from it.
> camera
The best smartphone cameras are no longer on the iPhone, by far; it's losing to the Chinese makers, but it doesn't have to compete since the market is segmented.
> MacBook and iPad are almost as thin
I wouldn't put the relentless focus on thinness as a net positive though.
All in all I'm not saying they're slacking, I'm arguing they lost the plot on many fronts and their product design is lagging behind in many ways. Apple will stay the top dog by sheer money (even just keeping TSMC in their pocket) and inertia, but I wouldn't be praising their teams as much as you do.
I still think the future for the Vision Pro is very bright. I think this version is more to get developers working on applications for it. Spatial computing is a fascinating idea.
In my own lived experience, every person I have met IRL who is dismissive of the Vision Pro has never actually used it seriously for more than a handful of minutes. People I know who have swear by spatial computing being the next UI/UX revolution.
I own an AVP, and I agree. Now I bought it secondhand for half the price, so I acknowledge that necessarily means there is at least one counterparty out there who disagrees.
Using the AVP for one work day, once I got the right fit and optical inserts, was such an eye-opener. It's like using an ultraportable laptop after living an entire life with large CRT monitors and desktop rigs tied to an actual desk, an experience, btw, which I also lived through. It just radically opened my eyes to fresh new possibilities and interaction mechanisms I never before thought possible.
But at $3.5k? No sane company exec could have been serious in thinking that would take off.
My company-issued laptop isn't far off $3.5k. The tail end of AVP development and release was happening during the pandemic. I can absolutely imagine sane company execs who looked at the new remote work reality (at the time), and figured every single major enterprise would buy every one of their remote employees (all of them) an AVP.
I kinda hope someone would have the sanity to stop that.
Zoom calls with mandatory camera-on were already barbaric; asking employees to strap on a headset for a team meeting sounds like a generally cruel idea to me.
It has a reasonable chance of getting somewhere (that would require a lot of redesign, but they have the money to do it), but to be honest I wouldn't be happy with the most restrictive and closed ecosystem winning again in a new field.
If it were through design excellence and truly providing a better proposition, that would sweeten the pill, but as of now it would only be because the far better products come from a company everyone hates.
In a weird way, Meta has been good at balancing hardware lockdown, and I'd see a better future with them leading the pack and allowing better alternatives to come up along the way. Basically the same way the Quest allowed for exploration and extended the PCVR market enough for it to survive up to this point. That wouldn't happen with Apple dominating the field.
Honestly, I'd be happy if they just made it stop lagging when I switch between multiple desktops in Mission Control. I spend most of my time in third-party apps anyway. I think they recently added that lag with Liquid Glass.
They tend to add lag in major OS releases. Gets people to consider refreshing their hardware. Just by sheer coincidence, they have a new model out this year! :-)
It's actually very sad to see the state Final Cut is in. It's a perfectly competent NLE for speed editing and has some solid features, but they had a real piece of software on their hands for years and just kind of sat around doing nothing from 2018 or so onward. I guess it just isn't generating enough revenue to warrant the attention it deserves. It was my workhorse for a solid decade; I passionately defended FCPX because it was truly excellent after they got it to a good place 12-18 months in. Their native multicam and audio sync blew Premiere/PluralEyes out of the water for years. But now it's just so... meh
I can’t imagine leaving Resolve to go back even though I still wayyyy prefer the FCPX UI.
I still have to install third party apps like Rectangle, LinearMouse and Middle to get a fraction of the ease of navigation I get on my Linux machines.
I still have to install a third party terminal like Kitty or Ghostty for basic, modern rendering.
That particular issue is just a conceptual mismatch. Exactly 0% of the time do I want to segregate my activities as "chrome" things vs "terminal" things vs whatever. When I want a feature like that, multiple desktops (mission control or whatever) is the tool of choice.
The backtick thing is just a constant annoyance. My workflow is to open windows for the things I'm working on, and I want to quickly switch to the window with my next work item. Instead, I need to keep track of extra mental state and figure out whether backtick is the right keystroke or whether tab and then backtick is the right thing to do.
It's...fine. I'm thankful I have better options at home, but it's tolerable at work with a few third-party apps.
That was the killer feature that convinced me to switch from Windows + Linux to Mac a long time ago. I often have too many windows open, and the conceptual separation between apps and windows helps me find the right task faster. Especially because I can also switch to an app that doesn't have any windows open at the moment.
Yep, my description was mostly negative (I personally hate it because I don't think that way), but I was serious about it just being a mismatch of expectations. There's nothing written in stone about the MacOS method being wrong, and it's nice that it works better for some people. UI is partly objective and partly subjective, and this particular point definitely falls on the subjective end of the spectrum.
> How much innovation is there to do in the OS at this point?
Infinite, just like in any complex UI. All the basic interaction primitives built into the OS are somewhat broken, from app/window management and text editing to keybindings and mouse gestures.
As someone who was new to Mac and eager to use AppleScript for automation, I was disappointed to find that a number of things just don't work under AppleScript in Apple Silicon. It's pretty deprecated; Shortcuts seems supported though.
While it’s certainly true that AppleScript has taken a bit of a back seat to Shortcuts, as far as I know everything AppleScript that works on Intel should work on Apple Silicon.
It's been a long while, but I remember running into issues with scripts that used osascript to click through an application on a cadence. My debugging ended when the consensus seemed to be that some things just wouldn't work any more on Apple Silicon.
Apologies that my memory fails me here! This was a few years ago, I only have my zsh history (and the name of a now-deleted script) to go by.
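For context, the general pattern looks something like this (the app name and menu path are placeholders, not the original script; note that on recent macOS this kind of UI scripting also requires Accessibility/Automation permissions, which is a common way such scripts silently break):

```python
import subprocess

# AppleScript that drives another app's UI via System Events.
SCRIPT = '''
tell application "System Events"
    tell process "SomeApp" -- placeholder app name
        click menu item "Refresh" of menu "File" of menu bar 1
    end tell
end tell
'''

result = subprocess.run(["osascript", "-e", SCRIPT],
                        capture_output=True, text=True)
if result.returncode != 0:
    # Permission problems or UI changes typically surface here.
    print("osascript failed:", result.stderr.strip())
```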
> How much innovation is there to do in the OS at this point?
1) Sign Nvidia's drivers again, at least for compute (there's no excuse)
2) Implement Vulkan 1.2 compliance (even Asahi did it, c'mon)
3) Stop using notifications to send me advertisements
3.1) Stop using native apps to display advertisement modals
4) Do not install subscription services on my machine by-default
5) Give macOS a "developer mode" that's at-least as good as WSL2 (if they won't ship GNU utils)
6) Document the APFS filesystem so the primary volume isn't inscrutable, akin to what M$ did for NTFS
If they're trying to get me to switch off Linux, those would be a nice start. I don't think any of that is too much to ask from a premium platform, but maybe my expectations are misaligned.
> 5) Give macOS a "developer mode" that's at-least as good as WSL2 (if they won't ship GNU utils)
The de facto answer is Homebrew — even internally at Apple. They just can’t publicly say it without liability issues.
> If they're trying to get me to switch off Linux
It’s important to know that Apple is not trying to get you to switch from Linux. Converting “UNIX workstation” people was an effort of theirs circa 2001 but that marketing campaign is long over with.
Their targets are consumer, prosumers, and media/influencer people. They give app developers just enough attention to keep their App Store revenue healthy.
Plan your long-term computing needs accordingly. You’ll see what I mean in the next 12-24 months.
I don't think anyone should upgrade if they're happy, but I also think faster chips have real-world benefits that tend not to be appreciated by people who aren't valuing their time enough. I replaced my M1 MBP with an M4 earlier this year, and it's had a couple of real-world benefits:
- builds are noticeably faster on later chips, as multicore performance has increased a lot. When I replaced my M1 MBP with an M4, builds in Xcode, Cargo, and LaTeX (I'll switch to Typst one of these days, but haven't yet) took about 60% of the time they had previously. That adds up to real productivity gains
- when running e.g. qwen3 on LM Studio, I was getting 3-5 tok/s on the M1 and 10-15 on the M4, which to me at least crosses the fuzzy barrier between "interesting toy to tinker with sometimes" and "can actually use for real work"
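For anyone wanting to reproduce that kind of number, here's a rough sketch of measuring it against LM Studio's local OpenAI-compatible server (port 1234 is the default at the time of writing; the model id is whatever you have loaded):

```python
import time
import requests

start = time.time()
r = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "qwen3",  # placeholder for the loaded model id
        "messages": [{"role": "user",
                      "content": "Explain Moore's law in ~200 words."}],
        "max_tokens": 256,
    },
    timeout=300,
)
elapsed = time.time() - start
tokens = r.json()["usage"]["completion_tokens"]
# End-to-end rate, including prompt processing; good enough for a
# rough tok/s comparison between machines.
print(f"{tokens} tokens in {elapsed:.1f}s -> {tokens / elapsed:.1f} tok/s")
```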
- 5G connectivity
- WiFi 7
- Tandem OLED Screen
- Better webcam
- FaceID
- Cheaper RAM (RAM is more important to me these days than CPU speed)
- More ports
- Better/cheaper monitors
- Make a proper tablet OS
- Maybe a touchscreen but I really don't want one
As someone who hates Apple's facial recognition implementation, I'm eagerly awaiting the day they ditch TouchID for FaceID. That'll be the year for me to upgrade to a high-spec laptop on the last generation with TouchID.
Honest question, why? Sounds like you have a technical issue with it. I’m not a fan of FaceID for digital privacy reasons, but the reliability and ease of use eventually forced me to give in. I don’t think I’ve encountered anyone who hates Apple’s specific implementation of facial recognition.
Price gouging on RAM is a very intentional decision by Apple to charge 8x market rate for it. Same for storage, you can get a blazing fast 4 TB NVMe SSD for just a few hundred bucks vs $2k or whatever Apple extorts from you.
It’s just market segmentation. Other companies do this by putting nonperformant CPUs lacking sufficient bus lanes in the consumer laptop. Apple gives the entry models a real piece of hardware, just with insufficient RAM. I like this situation better than the alternative.
Yeah I get that, but it still feels really unpleasant from the side of a regular customer. Sure, Apple is targeting the software industry and media industry who'll pay $5k for a fully kitted out MBP for all of their employees. And the regular normies who don't need much RAM/storage get amazing hardware at a good price point - good for them.
But as a regular guy who just has a lot of files and tends to keep tons of browser tabs open... it really sucks that I'm in the situation of getting extorted for $3k of pure profit for Apple, or have to settle for subpar hardware from other companies (but at a reasonable price). Wasn't an issue when the RAM & SSD weren't soldered on, but now you can't upgrade them yourself.
I think the point is that every manufacturer is playing this game, and with comparable margins.
I have no idea what the hip PC laptop is these days; is it still the Lenovo X1 Carbon? I went to their website and picked the pre-configured laptop with the most RAM (32GB), the best CPU, and a 1TB SSD. This was $3k: https://www.lenovo.com/us/en/p/laptops/thinkpad/thinkpadx1/t...
Roughly the same size and specs as the most expensive pre-configured MacBook Pro of the same screen size (the MBP has 36GB RAM, +4GB over the Lenovo, and a much better processor & GPU for $3.2k).
It's all market segmentation. Apple is just being upfront about it and giving you a clean, simple purchase page that shows the tradeoffs. Whereas Lenovo is using car salesman techniques to disorient you with a bewildering array of options and models all of which have decision paralysis-inducing tradeoffs not entirely in your favor.
Then go get another computer? Why do you rage against a product which you don't like? Forget about Apple and stick to other makers. There's plenty of products and manufacturers I don't like. I never think about them.
Not sure why you're taking it so personally and getting defensive. Was my comment not related and relevant to parent comment? I did in fact buy another computer because I don't like getting price gouged. Have a nice day.
Apple could have my money in exchange for their hardware. I won't even ask for support. They just need to provide the hardware specifications to Linux developers.
We want cheaper storage, now. 1TB costs around $400, based on speccing out a 2TB M5 MacBook Pro.
That's probably a 4x markup, and the $200 to go from 256GB to 512GB is even worse.
Every time a user considers jumping from Windows but balks at the storage costs, that's many thousands of dollars in potential revenue left on the table. I just can't believe it really makes economic sense for them, except in short-term cash-flow terms.
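The implied math, with an assumed ~$100/TB street price for a fast consumer NVMe drive:

```python
apple_per_tb = 400   # implied by the M5 MacBook Pro config pricing above
street_per_tb = 100  # assumption; consumer NVMe prices vary
print(f"~{apple_per_tb / street_per_tb:.0f}x markup")  # ~4x
# The 256 GB -> 512 GB step is worse: $200 for 0.25 TB => $800/TB.
```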
RAM too. They want +$1K to go from 64GB to 128GB, with no other spec changes. It's a way of segmenting the market -- those who actually need it are willing to pay a lot more (e.g. for AI / 4K videos).
> Back in the PowerPC and Intel days, Macs would sometimes go years between meaningful spec bumps, as Apple waited on its partners to deliver appropriate hardware for various machines.
Yes and no. Sometimes Intel did not move as fast as Apple wanted, and sometimes Apple didn't feel like it.
The Mac Pro (trash can and old cheese grater) and the Mac mini (2012-2018) in particular were neglected.
Today, the Mac Pro ships with the M2 Ultra, the Mac Studio ships with the M3 Ultra, and it's not certain whether the Mac mini and the iMac will get the M5 or will continue shipping with the M4 for the foreseeable future.
I have an M1 Max MBP from 2022 with 32GB of RAM (which I'm grateful for).
More performance (especially for local AI models) is always great, but I'm trying to imagine what I'd want out of a design change!
I think slightly thinner would be nice, but not if it runs hotter or throttles.
Smaller bezels on the screen maybe?
I'm one of those who liked the Touch Bar (because applications that labelled their shortcuts in the Touch Bar were awesome), so I think some innovation around things like that would be nice. But not if it compromises the perfect keyboard.
I do think MacOS would be improved with touchscreen support.
> I do think MacOS would be improved with touchscreen support.
On the contrary, I appreciate the Mac UI not being forced into touch friendliness. The whitespace increase in Big Sur is already bad enough, at least to me.
For those of us immersed in hardware fandom, the cycle is neither new nor disappointing - if anything, a lot of us relish the "boring" times, because it means we can finally squeeze performance out of our investment without fretting about arbitrary replacement timelines or major improvements in technology leading to gargantuan gains. It's nice, quiet, and lets us enjoy the fruits of our labors and hobbies.
That being said, I do kind of head-tilt at the folks screaming that this sort of “boring” cycle of hardware isn’t sustainable, that somehow, someone must create the next major improvement to justify all new spend or otherwise this is a worthless exercise. In reality, it’s always been the opposite: Moore’s Law wasn’t infinitely scalable, and anyone who suffered through the Pentium 4 era was painfully aware of its limitations. Sure, we can find other areas to scale (like going from clock speed to core counts, and core counts to core types), but Moore’s Law is not infallible or infinite; eventually, a plateau will be reached that cannot be overcome without serious R&D or a fundamental sea-change in the marketplace (like moving from x86 to ARM), often a combination of both.
Apple, at least, has the unenviable position of being among the first in addressing this challenge: how do you sell more products when power or efficiency gains are increasingly thin, year over year? Their approach has been to leverage services for recurring revenue and gradually slowing down product refreshes over time, while tempering expectations of massive gains for those product lines seeing yearly refreshes. I suspect that will be the norm for a lot of companies going forward, hence the drive to close walled gardens everywhere and lock-in customers (see also the Android sideloading discourse).
The hardware cycle at present is fairly boring, and I quite like it. My M1 iPad Pro and M1 Pro Macbook Pro dutifully serve me well, and I have no need to replace either until they break.
I thought this was going to be about Apple's various recent catastrophic software innovations, saying "why did you have to mess with a good thing? We just wanted it to stay as-is, even if that's considered 'boring'"
I'm still typing this from an M1 Max MBP with 64GB of RAM. I ended up needing more memory, so I swapped to this machine instead of my M1 Air with 16GB. Both machines are completely capable of most tasks I deal with as a developer. Do I like my work M3? Sure. I wish I had the old M3 Air I had to give back. But I'm happy with my machines.
It's funny that my iPad has a more current CPU than my two laptops.
Personally: I am extremely excited for a world where we have silicon that's capable of driving triple-A level gaming in the ~20w TDP envelope. M5 might actually be the first real glimpse we've had into this level of efficiency.
I have a 2020 Intel 10nm quad-core MBP, and my god, even the M2 is so much faster. They are doing absolutely incredible work to be getting >10% improvements every single year without fail starting from that point.
I've heard that the M-series chips with Metal do great on the small-models-with-low-latency front, but I have no practical experience doing this yet. I'm hoping to add some local LLM/STT function to my office without heating my house.
I'm uncertain whether any M-series Mac will be performant enough, the M1/M2 Mac minis specifically, or whether there are features in the M3/M4/M5 architectures that make it worth my while to buy new.
Are these incremental updates actually massive in the model performance and latency space, or are they just as small or smaller?
As someone who purchased their first M-series Mac this year (M4 pro), I've been thrilled to discover how well it does with local genAI tasks to produce text, code and images. For example openai/gpt-oss-20b runs locally quite well with 24GB memory. If I knew beforehand how performant the Mac would be for these kinds of tasks, I probably would have purchased more RAM in order to load larger models. Performance for genAI is a function of GPU, # of GPU cores, and memory bandwidth. I think your biggest gains are going from a base chip to a pro/max/ultra version with the greater gpu cores and greater bandwidth.
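As a rough rule of thumb (assumed quantization levels, ignoring KV cache and runtime overhead), the weights alone need about bits/8 bytes per parameter, which is why a 20B model fits comfortably in 24GB at 4-bit:

```python
def weight_gb(params_billions: float, bits: int) -> float:
    # bits/8 bytes per parameter; the 1e9s for params and GB cancel out.
    return params_billions * bits / 8

for params, bits in [(20, 4), (20, 8), (70, 4)]:
    print(f"{params}B @ {bits}-bit -> ~{weight_gb(params, bits):.0f} GB of weights")
# 20B @ 4-bit -> ~10 GB, 20B @ 8-bit -> ~20 GB, 70B @ 4-bit -> ~35 GB
```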
The M5 is a huge upgrade over the M4 for local inference. They advertise 400%, and there is reason to believe this isn't a totally BS number: they redid the GPU cores to avoid having to emulate certain operations in the core inner loop of LLM inference.
I have an M4 and it is plenty fast enough. But honestly the local models are just not anywhere near the hosted models in quality, due to the lower parameter count, so I haven’t had much success yet.
I can relate. Most users just want stable, quiet performance improvements, not a revolution every update.
Do you care more about performance improvements or new features?
Frankly, I'd be incredibly excited if the next Apple OS update was "No major new features. Bug fixes, perf optimization, and minor ergonomic improvements only".
A few ergonomics improvements and a wifi stack that doesn't periodically crash could be enough to pull me to the dark side. I like my setup, but I'm lazy and don't really like the process of upgrading computers. Apple taking that load off could buy me as a customer for a decade. The amazing battery life is nothing to sneeze at either.
The important work isn’t always glamorous. This is a problem that has tainted the entire industry.
The new pretty stuff feels a lot less magical when it lags or the UI glitches out. Apple sells fluidity and a seamless user experience. They need those bug fixes and an obsessive attention to detail to deliver on what is expected of their products.
I hate that computers get faster, because it means I'll be forced to buy another laptop. It goes like this:
- Some developer buys a new laptop
- Developer writes software (a browser)
- When the software works "fast enough" on their new laptop, they ship it
- The software was designed to work on the dev's new laptop, not my old laptop
- Soon the software is too bloated to work on my old laptop
- So I have to buy a new laptop to run the software
Before I'd buy a laptop because it had cool new features. But now the only reason I buy a new one is the new software crashes from too little RAM, or runs too slowly. My old laptops work just fine. All the old apps they come with work just fine. Even new native apps work just fine. But they can't run a recent browser. And you can't do anything without a recent browser.
If our computers never got faster, we would still be able to do everything the same that we can do today. But we wouldn't have to put down a grand every couple years to replace a perfectly good machine.
I think what you want is for software developers not to write bloated code, instead of computers not getting faster. The bloated code is a result of undisciplined programming and not paying attention to users' devices.
If our computers never got faster, we would never get faster computers (obviously...) to run efficient code even faster. 3D rendering and physics simulation come to mind.
I have noticed what you mention over longer timescales (e.g. a decade). But it's mostly "flashy" software - games, trendy things... Which also includes many websites sadly - the minimum RAM usage for a mainstream website tab these days seems to be around 200MB.
Anecdata: My 12 year old desktop still runs Ubuntu+latest Firefox fine (granted, it probably wouldn't be happy with Windows, and laptops are generally weaker). Counter-anecdata: A friend's Mac Pro from many years ago can't run latest Safari and many other apps, so is quite useless.
I don't know if that's accurate for software developers generally, but it makes me cringe a bit as a game developer. I upgraded from a 1060 to a 4060 and suddenly did waaaay less optimization; it just wasn't top of mind anymore. Of course, that bill still comes due eventually...
I agree. My M1 Air is the best laptop I’ve ever owned (and that goes back 30 years). While I’m finally getting tempted to upgrade by M5, the reality is my M1 is still quite usable. I’m thinking I might use it until it either fails or Apple finally cuts support for it.
Mac hardware has so significantly outpaced software needs I think there are diminishing returns. I'm a software developer who uses all sorts of advanced stuff and I only bought an M4 Pro, not a Max, because it wasn't worth the extra money. There are so few applications that max out a CPU for any meaningful amount of time these days like rendering videos or 3D.
My M4 iPad Pro is amazing but feels totally overpowered for what it's capable of.
I guess what I'm saying is.......I don't need faster CPUs. I want longer battery life, 5G connectivity, WiFi 7, lighter weight, a better screen, a better keyboard, etc.
I guess it's odd that Apple spends so much time making faster computers when that is practically an already solved problem.
They aren’t just making computers, though. Today’s CPU improvements go into tomorrow’s Vision Pro. Today’s improved E cores become tomorrow’s watch cores. Or something like that.
We want Apple to compete. When they stopped signing CUDA drivers, I thought it was because Apple had a competitive GPGPU solution that wasn't SPIR-V in a trenchcoat. Here we are 10 years later with SPIR-V in a trenchcoat. The lack of vision is pathetic and has undoubtedly cost Apple trillions in the past half-decade alone.
If you think this is a boring architecture, more power to you. It's not boring enough for me.
Genuine question, how does SPIR-V compare with CUDA? Why is SPIR-V in a trench coat less desirable? What is it about Metal that makes it SPIR-V in a trench coat (assuming that's what you meant)?
There might be a subset of people, such as yourself, who treat CUDA as a hard requirement when buying a GPU. But I think it's fair to say that Vulkan/SPIR-V has a _lot_ of investment and momentum currently, outside of the US AI bubble.
Valve is spending a lot of resources on it, and AFAIK so are all the AI companies in the Asian market.
There are plenty of people who want an open-source alternative that breaks the monopoly Nvidia has with CUDA.
I think the new AMD R9700 looks pretty exciting for the price: basically a power-tweaked RX 9070 with 32GB of VRAM and pro drivers. Wish it had been an option 6-7 months ago when I put my new desktop together.
Great. I’m with you there. There is no way that’s describing Apple though.
They’re not open source, for sure. But even setting that aside, they don’t offer anything like CUDA for their system. Nobody is taking an honest stab at this.
I have an old card printer that I only use occasionally, and firing up a Windows 7 virtual machine is (was?) the most convenient way to do it. I think it's not so uncommon to have old devices around that don't work with newer versions of Windows.
Perhaps a MacBook is now fast enough to just run Windows 7 in full emulation? Haven't tried, though.
Edit: Checked on Youtube. Yeah, Windows 7 seems to be fast enough on an Apple silicon Macbook in full emulated mode. For example: https://www.youtube.com/watch?v=B9zqfv54CzI
So it was with PowerPC, Sparc, SGI CPUs, and a bunch of older now obsolete architectures. I don't think we should be limiting the technological potential to keep old Windows drivers afloat, and they weren't native to the platform to begin with. You can always get a PC and virtualize Windows 7 just fine.
Yes, I really want an M5, a CPU I can buy today, more than Panther Lake, which isn’t on the market yet and hasn’t been reviewed by 3rd parties.
I want a laptop that gives me amazing performance, thermals, build quality, and battery life. It's gonna take a while to see what manufacturers will do with Panther Lake.
These arguments constantly devolve into "why would you want an APPLE product that's less good than $_new_shiny_PC_thing" and it's always currently available products being pitched against conceptual products in the heads of Intel's fanboys that may come to market in a year. It's a ridiculous comparison.
I got an M3 Pro Macbook Pro on clearance recently for $1,600, 16 inch screen brighter than any PC laptop's I've ever seen, that's the fastest computer I have ever used, hands down and it's 2 generations out of date already. OR I can have a PC gaming laptop where the fit and finish isn't as nice, where the screen is blurrier, the battery life maxes out at 4 hours if I do absolutely nothing with it, and any time I do anything of remote consequence the fans kick up and make it sound like it's trying to take off.
And that's without even taking into account the awful mess Windows is lately, especially around power management. It makes every laptop experience frustrating, with the same issues that were there when I was in fucking high school.
Like if you just hate Mac, fine, obviously a Mac is a bad fit for you then and I wouldn't try and tell you otherwise. But I absolutely reserve the right to giggle when those same people are turning their logical brains into pretzels to justify hating a Mac when it has utterly left the PC behind in all things apart from gaming.
I have a PC on the other side of my wall from where I'm sitting with a 7800X3D and an nvidia 4090. It's for gaming only most of the time, though I do take advantage of the 4090 for some basic LLM stuff, mostly for local audio transcription and summarizing (I take a lot of notes out loud, I speak faster than I can write). The rest of the time it's playing AAA gaming titles at full tilt at 5120x1440, full res, getting 60fps on basically anything I throw at it, all while sucking down 600W (400 for the GPU alone while playing Cyberpunk). It's a beast. I love it.
I have an M4 Mac mini on my desk. At full tilt it pulls 30W. It scores higher in benchmarks than my gaming PC. It cost less than my 4090 did on its own, and that's including a third-party iBoff storage upgrade.
Of course, trade offs and process size differences abound; the M4 is newer, I can pack way more RAM into my PC years after I built it. I can swap cards. I can add another internal SSD. It can handle different kinds of load better, but at a cost of FAR more power draw and heat, and its in a full tower case with 4 180mm fans moving air over it (enough airflow to flap papers around on my desk). It's huge. Lumbering. A compute golem, straining under the weight of its own appetite, coils whining at the load of amps coursing through them.
Meanwhile, at idle, my Mac mini uses less power than the monitors connected to it, and eats up most of the same tasks without ruffling its suit. At full tilt, it uses less power than my air purifier. It's preposterous how good it is for what it costs to buy and run. I don't even regret not getting the M4 Pro.
When the Thunderbolt Display came out I was in a raid group and I wanted a display with great refresh rate and low delay (melee character, don’t get stuck standing in the poo). So I researched and researched and the only monitor that had equivalent response times to that dumb Thunderbolt Display was only $60 cheaper, had a plastic shell and I’d have to fight UPS over getting it.
Or I could drive across town and have a monitor today and pay $60 for the aluminum shell that hides dust better.
I think it's sometimes tempting for people to spill logical fallacies trying to argue against Mac lovers, when actually they just have different priors (they just value different aspects of the computer).
Yes, Macs have incredible compute/watt, display quality, and design. However, I like to think of myself as logical, and I would not buy a Mac.
Given the choice between a M5 Mac and a latest-gen ThinkPad, I would not take the Mac. That is fine, and so are people who would do the opposite. We are just looking for different qualities in our computer.
It's all tradeoffs after all - similar to how we value personal freedom in the West, I value freedom to do what I want with the hardware I own, and am willing to accept a performance downgrade for that. (No Windows means that the battery life hit is relatively light. FWIW, there's no chance I would buy a computer locked down to Windows either.)
I also value non-commitment to a particular ecosystem so I prefer not to buy Apple, because I think a significant amount of the device's value is in how seamlessly it integrates with other Apple devices.
However, one day in the future when many of my beliefs have become "bought out", perhaps my priorities will change and I will go all in on the ecosystem. That's OK as well.
I mean you have a much more reasonable and nuanced opinion than the GP so I wouldn't rope you in with the aforementioned mental-gymnastic-ing fanboys. However, I feel the need to take issue here:
> It's all tradeoffs after all - similar to how we value personal freedom in the West, I value freedom to do what I want with the hardware I own, and am willing to accept a performance downgrade for that.
Genuine question: what do you mean, locked down? By default the Mac won't run unsigned software, but even today in macOS 26 that's not an unsolvable issue. I run all kinds of software not signed by Apple daily. There are further nuances, like sometimes if you want to install kernel-level stuff or tweak certain settings you have to disable SIP, which is definitely a bit of a faff, but that's a Google-able thing any tech-literate person could accomplish inside of 30 minutes.
I would bow to the technical limitations, as you're rather locked to ARM64-compiled software, but I don't recall the last time I saw a piece of software getting current updates that doesn't include a binary for that.
Yea, this is how I feel too. I've been hoping that Intel would turn itself around, but Intel has failed at its roadmap over the past few years. Intel canceled 20A and 18A is delayed. It had looked like Intel would leapfrog TSMC, but that didn't come to fruition.
I hope that Intel does well in the future. It's better for us all if more than one company can push the boundaries on fabrication.
I also remember the days when the shoe was on the other foot. Motorola or IBM was going to put out a processor that would decimate Intel - it was always a year away. Meanwhile, Intel kept pushing the P6 architecture (Pentium Pro to Pentium 3) and then NetBurst (Pentium 4) and then Core. Apple keeps improving its M-series processors and single-core speed is up 80% since the M1 and 25% faster than the fastest desktop processor from AMD and 31% faster than the fastest desktop processor from Intel.
I'd love for Panther Lake to be amazing. It will put pressure on Apple to offer better performance for my dollar. Some of performance is how much CPU a company is willing to give me at a price point and what margins they'll accept. If an amazing Panther Lake pushes Apple to offer more cores at a cheaper price, that's a win for Apple users. If an amazing Panther Lake pushes Apple to offer 2nm processors quicker (at higher cost to them), that's a win for Apple users.
But I'm also skeptical of Intel. They kept promising 10nm for years and failed. They've done a bit better lately, but they've also stumbled a lot and they're way behind their roadmap. What kind of volume will we see for Panther Lake? What prices? It's hard to compare a hopeful product to something that actually exists today. Part of it isn't just whether Intel can make 18A chips, but how fast can they produce them. If most of Intel's laptop, desktop, and server processors in 2026 aren't 18A, then it isn't the same win. And before someone says "Apple is just a niche manufacturer," they aren't anymore. Apple is making CPUs for every iPhone in addition to Macs so it has to be able to get CPUs manufactured at a very high scale - around the same scale as the Intel's CPU market.
I hope Intel can do wonderfully, but given how much Intel has overpromised and underdelivered, I'm definitely not taking their word for it.
I am excited about Panther Lake myself but where are you reading that it has higher performance/watt than M5? The chips aren't even out yet. All we have are Intel marketing materials with vague lines on charts. No one could have possibly done a performance/watt test on Panther Lake yet. I'm hoping they beat M5 but if I had to, I'd put my money on M5.
Leaving aside the availability of various Intel processors, exactly what I want is for the various manufacturers to compete as hard as they possibly can.
I want Intel to catch up this month. And then next month I want AMD to overtake them. And then ARM to make them all look slow. And then Apple to show them how it's done.
The absolute last thing I'd want is for Apple to have special magic chips that nobody else even comes close to.
From what I’ve read, single-thread for Panther Lake is roughly the same as last gen. The gains are in efficiency, multi-thread, and GPU. The most optimistic reading I’ve seen suggested 50% gains in GPU performance and in multi-thread. I’ll wait for independent testing before making any judgements, but Intel has a way to go to rebuild trust.
That’s partly the difference between making your own components and getting them from a vendor. Sure Intel can send select vendors prerelease prototypes but the feedback loop will never be as efficient as in house.
But it’s like a margin call. Everything is great until it completely sucks. Of course a lot of that comes down to TSMC. So if Apple falls it’s likely others will too.
I think it's the difference between having enough CPUs that you can launch a product and having enough CPUs that people start planning future products.
Volume takes time. That's why we're seeing 2026. And before someone says "that just gives Apple an advantage because they're smaller," Apple is shipping a comparable volume of CPUs - and they're doing basically all their volume on the latest fabrication tech.
> The difference is that with Apple silicon, Apple owns and controls the primary technologies behind the products it makes, as Tim Cook has always wanted.
Customers seem pretty happy with the changes and sales are up since the transition.
A Macbook with some of the best processors available in a laptop with the battery life and thermal characteristics of an iPhone or iPad is a pretty compelling product for many people.
Long time Apple customers? Almost certainly. Apple has a history of conflict with its chip makers and other component vendors. For the longest time, Apple wasn't big enough for those vendors to bother doing anything Apple wanted them to do. Or even when Apple was big, it still wasn't big enough to make a dent in its vendors' pipelines (see also Intel and low-power chips).
I always get a little bothered when I see negative reviews from a CPU update in Apple laptops. While a new CPU alone isn’t a thrilling update, it’s important that they do these regularly so consumers looking to buy aren’t forced to buy a 3 year old product with no idea when a refresh will come. I’ve been in this situation many times with Apple and it has been very frustrating. I’m glad they are back on a yearly refresh schedule.
I think the issue stems from too many people making their living off reviews that require something exciting to get views. When updates are more evolution than revolution, it makes for a more boring article/video. I always worry that these types of responses will lead Apple to do silly things, like leaving old chips out there too long, or adding pointless features just so there is something new to talk about.
> While a new CPU alone isn’t a thrilling update, it’s important that they do these regularly so consumers looking to buy aren’t forced to buy a 3 year old product with no idea when a refresh will come.
Also: incremental updates add up.
A (e.g.) 7% increase from one year to the next isn't a big deal, but +7%, +7%, +7%, …, adds up when you finally come up for a tech refresh after 3-5 years.
It's 2025; the fact that Apple is delivering CPUs with actual, noticeable annual performance improvements is pretty astounding in itself. Sure it's not 1990s levels, but it's still pretty great.
M silicon/SoC is the best thing to happen to computing, for me.
I have 64GBs of RAM in my Macbook Pro. I load a 48GB DuckDB to RAM and run real-time, split-second, complex, unique analysis using Polars and Superset. Nothing like this was possible before unless I had a supercomputer.
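For anyone curious what that workflow looks like, here's a minimal Python sketch of the pattern: attach an on-disk DuckDB file, copy it into an in-memory database, and hand query results to Polars. The file and table names here are hypothetical, and COPY FROM DATABASE needs DuckDB 0.10 or newer:

    import duckdb
    import polars as pl

    # pull the on-disk database entirely into RAM
    con = duckdb.connect(":memory:")
    con.execute("ATTACH 'analytics.duckdb' AS disk (READ_ONLY)")
    con.execute("COPY FROM DATABASE disk TO memory")
    con.execute("DETACH disk")

    # run an ad-hoc query and continue the analysis in Polars
    df: pl.DataFrame = con.sql(
        "SELECT customer_id, SUM(amount) AS total FROM orders GROUP BY 1"
    ).pl()
    print(df.top_k(10, by="total"))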
Is it really that much better than some small form AMD Ryzen with 2x32 SODIMM thrown in? I get that the M series is amazing in terms of efficiency and some people love Apple hardware but you could likely have had that performance with a $700 setup.
For DBs, bandwidth to RAM and storage is just as important.
The only server that actually matched the performance of a Mac Studio was the XEON Max series (formerly codenamed Sapphire Rapids HBM) with 64GB of memory integrated into the CPU package. The latency between the CPU and RAM is simply too big in a regular PC.
Have you tried other PCs with 64 GB of RAM?
yes. we have PCs. AFAIK, the cheapest PC that compares for my workflow is an EPYC/NUMA or another very expensive CPU/latency optimized server. We have a complex stack, with clients running unique queries that we can't predict and gigabytes loaded into RAM, L3 cache doesn't always save us. I haven't found another solution, I wish we could drop the Macs cause the OS is pretty awful.
We're using Macs as servers. But it's a small operation.
I'm guessing you're using Macs newer than the M2, so they can't run Linux; but I wonder if Fedora Server (Asahi remix) would suit your operation well.
Also: we shouldn't make a big deal out of every update then. Celebrating M1: alright, but then M2-M500 are boring and not even worth noting, because you know there's a new one every year.
adds up to 22.5%
> adds up to 22.5%
after 3 years
and 40% after 5 years.
yep! I was only throwing in the final number for those folks who were like me and saw math, and wanted an answer.
Sorry for the nit, but it's a compounding improvement, not additive. It's 22.5% after 3 years and 40% after 5.
That’s what he’s saying.
No. He said that the 7% 'add up', when the proper term would be 'compound'.
You are both right. What is compounding? It is when you add the gains, year by year.
The gains are multiplied though, no?
In the vocabulary of finance you don't multiply in gains, you add them. It probably historically derives from dividends being added. At the transactional level you never actually multiply money, after all (unless you're a bank).
Everyone is right! It’s a multiply-accumulate (accumulate of course being a synonym for add).
What is multiplication if not adding over time?
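The whole sub-thread reduces to two lines of arithmetic; a Python sketch just to make the numbers concrete:

    additive   = 3 * 0.07       # "adding" three 7% gains:  21.0%
    compounded = 1.07 ** 3 - 1  # compounding them:         22.5%
    print(f"{additive:.1%} vs {compounded:.1%}")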
Agree. So many people online (not just reviewers) complaining that it's just a spec-bump, demanding a new design. I remember the time people were (rightfully) complaining that the update schedules were slow for Macs, mainly because of Intel's limitations. Now we get yearly refresh, they complain that it looks the same.
I don't think they appreciate the cost of redesigning and retooling. Echo your thoughts and hope Apple doesn't listen to this feedback. Imagine more expensive laptops because some people want more frequent design changes!
> So many people online (not just reviewers) complaining that it's just a spec-bump, demanding a new design.
If ever there was a case of "be careful what you wish for" - whether it's the Touch Bar, deleting ports or the butterfly keyboard, a redesign isn't necessarily a positive.
Apple is the company where every new iPhone release is simultaneously boring, not worth it, and a sure sign of their impending collapse, and also somehow a vicious treadmill that forces people to upgrade every year, throwing out their old phone and contributing to e-waste. They could announce a cure for cancer tomorrow and within a month people would be back to asking what they have innovated recently. People just like to complain.
> which forces people to upgrade every year
Perhaps it’s just a language slip, but how are people forced to upgrade every year? My experience is the opposite: iOS 15 is still supported[0] and my 2016 iPhone lets me access the World Wide Web.
The force you're talking about comes instead from developers (like me) who implement ever more CPU/GPU-hungry features and systems.
[0] security patched last month: https://news.ycombinator.com/item?id=45270108
In certain circles it's assumed that people buy a new phone every year. Then they make 30-minute YouTube videos lambasting Apple about "incremental upgrades" and "zero innovation".
While also not getting that they're NOT the target market.
For the person whose iPhone finally (after half a decade or more) falls out of major version support a 5-6 generation jump on hardware is amazing.
They are the target market.
I think some people see having an older-generation iPhone as sending a signal of "I’m poor" - a status thing. It's pretty ugly, but buying iPhones on credit happens too often.
People quickly become accustomed to common occurrences that are not threatening (like extreme weather events). Apollo 8 was the first time humans reached the moon, just orbiting it. Sixteen months later, in a time with much less media and information than we have now, US TV networks chose not to broadcast an en-route feed for Apollo 13 because this was no longer seen as interesting. We often seem spoiled, we often seem prone to complaining, and we often seem more enamored with something new. Yet there are so many remarkable things we take for granted.
That’s because it’s called “the news”, not “the olds”.
When I teach people how to talk to reporters I always emphasize this. If it’s the 10th time something happened, you need to explain it in terms of what’s -new- or your info won’t go beyond the pitch meeting.
That’s why your town’s street fair makes a big deal that it’s the 10th anniversary event. It’s “news” that you’ve hit a round number. That’s why Trump breaks the law in a little way before doing it in a big way… the second time isn’t interesting.
Maybe I'm a weird one, but I'm still quite happy on my iPhone 12.
I changed my iPhone 8 about 1 year ago. Now I have a 16, which I will probably be using in 2031
Add me to the group. I’m on iPhone 11 and I couldn’t be happier. I do follow their new launches and then look at what I have. It looks and works like new, have absolutely no complaints. 6+ years and going strong.
iPhone 12 mini here. I replaced the battery last year, and it's still going strong. Actually, I fear the day I'm forced to upgrade due to the mini size not being available anymore...
> forces people to upgrade every year
People who upgrade every year don't do it for technical needs. We're long past the times when phones were inadequate and yearly improvements were big leaps that made them less unusable.
Yearly phone upgrades are just to sport the latest model; it symbolizes status. Or there's some deal where you can do it for close to no cost, which beats long upgrade cycles, but I don't think "free upgrades" are common.
See!? They waste all this time and money curing cancer, which ended up being soo easy in the end, and not a single bit of effort into curing Alzheimer’s. Pricks.
The capitalist class truly are leeches.
I'd much rather Apple put their energy into performance, battery life, and long-term reliability than chasing novelty for novelty's sake.
Intel released new CPUs every year; they shouldn't be blamed when Apple refused to update.
Intel had multiple years of promising that their new next-gen more efficient 10nm CPUs were coming very soon, and then those kept being delayed.
The chips they did release in that time period were mostly minor revisions of the same architecture.
Apple was pretty clearly building chassis designs for the CPUs that Intel was promising to release, and those struggled with thermal management of the chips that Intel actually had on the market. And Apple got tired of waiting for Intel and having their hardware designs out of sync with the available chips.
> and those struggled with thermal management of the chips
An ironic mirror of the PowerPC era when every version of the G5 was struggling with high power consumption and heat generation when operated at any competitive frequency/performance level. The top end models like the 2.5GHz quad-G5 needed water cooling, consumed 250W when idle, and needed a 1kW PSU.
Intel's offering at the time was as revolutionary as the M-series chips.
Yep, it was a very similar situation where Apple wanted to keep their hardware cadence but were beholden to a third party on the chip roadmap.
These days they're still somewhat beholden to TSMC continuing to make progress on nodes etc, but I think they have a closer partnership and a lot more insight into that roadmap so they can keep their hardware plans in sync.
I remember reading about an aging Mac Pro not seeing an update because the Xeon chips it used hadn’t seen an update from Intel.
I’m sure Intel had some releases each year, but did they have the right ones to make it possible for Apple to release an update?
Indeed. There was a significant period where they, er, weren't really better than last year's, though. Remember Broadwell? Followed by Skylake, which took about two years to go from "theoretically available" to "actually usable".
And then Skylake's successors, which were broadly the same as Skylake for about four years.
Yeah, I literally just bought an M4 device mere weeks before the M5 came out. The performance jump is nontrivial for my use case. Am I worried about it? Nah. In another year there will be another jump, and then the year after. I’m just on a different upgrade cadence, that’s all.
Meanwhile back in the pre-M1 days I remember stalking Mac rumors for months trying to make sure I wasn’t going to buy right before their once-in-a-blue-moon product refresh. You could buy a Mac and get most of its useful life before they upgraded the chip, if you timed it right, so an upgrade right after you bought was a real kick in the pants.
Agree... regular, even if "boring," updates are a feature, not a bug
Yes.
The review ecosystem is really toxic in that regard, as makers will court it.
We had the silly unboxing-video fad, and it meant gorgeous packaging flying in the face of recyclability and cost reduction.
I wonder if the glass backs and the utterly shiny but heavy, PITA-to-repair designs also come partly from there. A reviewer doesn't care that much if it costs half the phone to repair the back panel.
Makers? Could you expand on this?
You can replace it with "manufacturer", I do think it becomes clearer.
"Maker" has a specific connotation, but it technically still fits the GP.
He means OEMs. Device manufacturers.
Examples include Apple, Samsung, Lenovo, etc etc.
I don’t think most people are upset about the update itself. They’re just reacting to the mismatch between the scale of the improvements and the scale of the marketing.
To be fair, Apple didn't even market this launch; it was a very quiet release, but the reviewers needed to make content, so here we are. That's not to say that they generally aren't guilty of making massive fanfare about ridiculously incremental phone updates and swapping the location of the camera bump just to make it look different.
> I think the issue stems from too many people making their living off reviews that require something exciting to get views.
The problem is that our hardware as we know it has lost a lot of its stretch. It used to be that we got 100% performance gains from one generation to the next. Then it became 50%, 30%... Like in the GPU market: the last generation that actually got me excited was the 1000 series (the 1070 specifically).
Now it's "boring" 10 to 15% upgrades within the same generation (if we don't count naming/pricing rearrangements).
When was the last time any of us said "hey, I am excited to potentially buy this tech, really"? The Apple M1 comes to mind, and that is 5 years ago.
Nvidia tried to push the whole ray tracing thing (a bit too early), but again, it's just an incremental update to graphics (as we had a lot of tricks to simulate lighting effects with good performance). So again, kind of a boring gain if we look back.
Mobile gaming handhelds were thrilling - the Steam Deck... Then we got competitors, but with high price tags, so the excitement faded. And now nobody blinks when a new generation gets released, because the CPU/iGPU gains are the same boring 15 to 20%... So who wants to put down 700-900 euros for a 15% gain?
What has really gotten you excited? Where you're just willing to throw money at something? AI? And we see the same issue with LLMs... what used to be a big step/gain has gone, in barely a year, from massive gains to incremental gains. 10% better on this benchmark, 5% better there... So it becomes boring (the GPT-5 launch and reaction, the Sora 2 launch and reaction).
> When updates are more evolution than revolution, it makes for a more boring article/video.
If you think about it, there is a reason why tech channels have issues and are more clickbait than ever. Those people live on views, so when the tech they follow/review is boring to the audience, they start pushing more and more clickbait. But that eventually burns the channels out.
Unfortunately, we have an entire industry designed around making parts smaller every generation to get those gains, and we have lost the ability to make large gains just by shrinking...
It's ironic, as we knew this was coming, and yet it seems nobody made any breakthrough at all. Quantum computing was a field that everybody knew had no road to general computing at home (materials issues).
So what is left is the same old routine: make the die a bit smaller, gain a bit, do some optimizing left and right, and call it a new product. But customers get product 2.1 marketed as "this is our product 3.0!!!! Buy buy"... when customers can see it's just 2.1, 2.2, 2.3...
We are in a boring time because companies sat too darn long on their behinds, milking their existing products but never really figuring out how to make new products. I think the only one that took a risk was Intel years ago, and it blew up in their face.
So yes, unless some smart cookie makes a new invention that can revolutionize how we make chips (and that can be mass-produced), boring is the standard now. No matter how companies try to repackage it.
It's so tiresome.
Every car company in the world realized that yearly product updates were the way to go, and no one whines that this year's model isn't good enough to justify upgrading from the previous year.
The auto market doesn't really work that way any more. Each new model is now expected to last about 8 years (plus or minus depending on the manufacturer) with only one minor mid-cycle refresh.
My worry is that all their hardware teams are now on the same "must release yearly with insane marketing" cycle as software.
Nobody really talks about the knock-on effects of the attention economy. I opted out of it a long time ago; I despise YouTubers, TikTokers etc. because of the world they're shaping, and it's not just the injection of mindless rubbish into people's brains, it's real-world effects like you outline in your second paragraph. It amazes me when I see seemingly smart people on HN talking about how they're addicted to YouTube "Shorts" - it's like being addicted to lobotomies.
Not me: I wanted Apple’s software division to innovate like its hardware division. Extra power with nothing to use it on except more and more Docker containers isn’t compelling to me. I’ve not upgraded my M1 MacBook Pro and don’t plan to.
I'd much prefer they focus on fixing their existing software quality problems. No innovation needed, just boring old software maintenance and design work.
They need a "Snow Leopard-style" release. This was the macOS version that came after Leopard and it was explicitly a release with no new features. It just focused on performance, efficiency, and reducing overall memory usage. Famously Apple even advertised it as having "zero new features". Recent macOS releases feel like they're going in the opposite direction. Really wish their next release would be more like Snow Leopard.
This is all I want. My Macbook Air is already stupid fast. Now I want the OS to support working faster. I would love nothing more than a ton of small quality of life updates.
Windows has also needed this for 10 years, since the Windows 10 launch, but it never happened. I guess it's just not really viable, as they need to continue selling new devices...
I still don't understand how there can be something like 10 different UI styles in Windows 11 with no progress toward getting that fixed. It feels more like they are adding new ones instead...
WinUI looks good to me. Improvements in efficiency are needed before it can become mainstream, though.
They should go back to the Windows 2000 theme, that was the best one.
AFAICT from the marketing, macOS 26 didn’t come with any new features.
[I get your point; I just refuse to consider a ridiculous reskin no one asked for to be a “feature.”]
I agree with your sentiment, but let's not rewrite history too much. Snow Leopard didn't have any new features, but under the hood it was a massive undertaking IIRC: it introduced a 64-bit kernel and 64-bit system applications like Finder, Mail, Safari, etc. It also replaced many 32-bit system frameworks. Until Snow Leopard, Mac OS X was still mostly 32-bit.
When Snow Leopard came out it was very buggy, and many apps simply did not run on it. I've been a Mac user since 1993, and I think it's the only version of macOS I ever downgraded from. Don't get me wrong, it eventually became rock solid, the apps I needed were eventually upgraded, and it became a great OS.
But let's not mistake MacOS 10.6.8 for MacOS 10.6.0. And maybe let's not compare macOS 26.0 to MacOS 10.6.8 either, it's not quite fair. Ever since Snow Leopard I've been waiting at least 6 months before upgrading macOS. I don't intend to change that rule anytime soon...
They are already innovating more than they can deliver. While Apple's hardware quality is usually good, their quality standards are much lower on the software side.
This is surprising to me. I always thought Apple was ahead on the UX side of software with their attention to detail. Though it's been a while since their hardware design flaws, and their software has had new issues. Even Louis Rossmann has mostly stopped talking about Apple since their repairability changes (or due to having bigger fish to fry).
Their hardware used to be atrocious and software was the only thing they would get praised on. Now they've gotten lazy in the software department and outstanding in the hardware department.
Side note: Rossmann has stopped talking about Apple because he is no longer focused on Apple repair and is turning his attention to other causes, not because of Apple's "repairability" changes, which are still a token gesture.
They still support x86 on Tahoe. I wonder if macOS 27 will see some changes now they can finally be rid of x86 baggage.
Apple UX: "intuitive" features you have to discover via some random video reel. Of course you have to drag your messages sideways to see when they were sent, that's Good UX!
Sprinkle with crashes and bugs that are never fixed and charge a premium.
Apple UX: A beginner on a new Mac can’t right-click and copy, because there is a right-click but it isn’t active by default.
Go to Spotlight -> Type “Settings” -> Locate the settings -> In settings, go to Accessibility -> Wait no, it’s Mouse -> Gestures -> Activate the right-click.
^ That’s the experience for beginners. That screen should be in the installation wizard if Apple wants to make it optional. “Customize your mouse gestures”.
I don't even want more Docker containers - I want to be able to run the same containers with much less overhead on the system. Hoping their new native containerisation delivers that soon.
You want Apple to invent reasons for you to need a more powerful computer? I could understand this argument for the iPad, but this is a weird complaint for Macs. Play a video game, use local LLMs, or get into video editing?
Never owned an apple device myself, but honestly I just want the existing tech to start coming down in cost. I grabbed a fordable this year and it's great. It was very much not worth $2200 though (and that was before tarriffs), so I grabbed a used 2YO model for $800 or so.
I'm already salivating at the thought of a fordable tablet in any form. But not at the thought of paying $3000 for one with current pricing.
Am I right to assume those typos meant to say "foldable"?
I'm just saying I need a real reason to upgrade beyond "benchmarks make me feel good".
I can run local LLMs fine on my M1 Pro from 2021. I play games on it too. Why would I spend multiple thousands on an M(N) MacBook if there's no real reason to? It's not like when I upgraded from a 386DX to a Pentium.
I have a similar argument for phones right now. There are some AI-related reasons to upgrade, but there's not really a clear killer app yet besides frontier model chat apps. Why should I spend thousands of euros on an upgrade when I don't get anything significantly different for it?
Being able to continue running a 4 year old laptop for many more years without performance issues seems like a positive thing, not a negative one.
> Why should I spend thousands of euros on an upgrade when I don't get anything significantly different for it?
You shouldn't and nobody is asking you to. Apple can sell their new computers to billions of prospective customers who wish to upgrade from x86 or buy their first computer.
There's so much untapped potential with these chips
At this point, does the hardware division really innovate that much?
There is significant improvement from the M4 to the M5, but how much of it comes from TSMC and how much from Apple? They have exclusivity on the latest processes, so it's harder to compare with what Qualcomm or AMD is doing, for instance, but right now Strix Halo is basically on par with the M3-M4, developed on the same node density.
On the other hardware parts, form factor has mostly stagnated, and the last big jump was the Vision Pro...
The Vision Pro was still pretty recent. They also refreshed the hardware designs of everything when moving to the M-series chips.
They also made that new wireless chip recently, the chips for the headphones, and some for the Vision Pro. The camera in the iPhone also gets a lot of attention, which takes a lot of hardware engineering. In the iPhone more generally we saw fairly big changes just a month or so ago with the new Pro phone and the Air. The Pro models of the MacBook and iPad are almost as thin, if not thinner, than the Air line, which I’m sure took a considerable amount of work, to the point of making the Air branding a little silly.
The Vision Pro is a technical marvel, but as a hardware product it's uncomfortable (too heavy), bloated (useless front screen) and badly designed (straps are finally getting decent if we can trust the reviews).
These decisions IMHO fall on the hardware team, and they're not doing a good job. Meta's hardware team is arguably pulling more weight, as much as we can hate Meta for being Meta.
> headphones
Here again, the reception wasn't that great. The most recent AirPods Pro was a mixed bag, and the AirPods Max had most of the flaws of the Vision Pro; they didn't learn anything from it.
> camera
The best smartphone cameras aren't on the iPhone by far now; it's losing to the Chinese makers, but doesn't have to compete, as the market is segmented.
> MacBook and iPad are almost as thin
I wouldn't put the relentless focus on thinness as a net positive though.
All in all I'm not saying they're slacking, I'm arguing they lost the plot on many fronts and their product design is lagging behind in many ways. Apple will stay the top dog by sheer money (even just keeping TSMC in their pocket) and inertia, but I wouldn't be praising their teams as much as you do.
I still think the future for the Vision Pro is very bright. I think this version is more to get developers working on applications for it. Spatial computing is a fascinating idea.
In my own lived experience, every person I have met IRL who is dismissive of the Vision Pro has never actually used it seriously for more than a handful of minutes. People I know who have swear by spatial computing being the next UI/UX revolution.
I own an AVP, and I agree. Now I bought it secondhand for half the price, so I acknowledge that necessarily means there is at least one counterparty out there who disagrees.
Using the AVP for one work day, once I got the right fit and optical inserts, was such an eye-opener. It’s like using an ultraportable laptop after living an entire life with large CRT monitors & desktop rigs tied to an actual desk - an experience, btw, which I also lived through. It just radically opened my eyes to fresh new possibilities and interaction mechanisms I never before thought possible.
But at $3.5k? No sane company exec could have been serious in thinking that would take off.
My company-issued laptop isn't far off $3.5k. The tail end of AVP development and release was happening during the pandemic. I can absolutely imagine sane company execs who looked at the new remote work reality (at the time), and figured every single major enterprise would buy every one of their remote employees (all of them) an AVP.
I kinda hope someone would have the sanity to stop that.
Zoom calls with mandatory camera on were already barbaric, asking employees to strap a headset for team meeting sounds like a generally cruel idea to me.
It has a reasonable probability of getting somewhere (that would require a lot of redesign, but they have the money to do so), but to be honest I wouldn't be happy with the most restrictive and closed ecosystem winning again in a new field.
If it won through design excellence and a truly better proposition, that would sweeten the pill, but as of now it would only be because the far better products come from a company everyone hates.
In a weird way, Meta has been good at balancing hardware lockdown, and I'd see a better future with them leading the pack and allowing better alternatives to come up along the way - basically the same way the Quest allowed for exploration and extended the PCVR market enough for it to survive to this point. That wouldn't happen with Apple dominating the field.
Honestly I'd be happy if they just made it stop lagging when I switch between multiple desktops in Mission Control. I spend most of my time in 3rd-party apps anyway. They recently added that lag, I think with Liquid Glass.
> They recently added that lag
They tend to add lag in major OS releases. Gets people to consider refreshing their hardware. Just by sheer coincidence, they have a new model out this year! :-)
I'd rather they stop innovating beyond the team's skill budget.
I upgraded to the M4 for more ram and more GPU for local LLMs so I'm not sending all my shit to OpenAI, but it's not for everybody.
I think Final Cut and maybe Logic make good use of the new silicon features.
I’m rather happy I don’t have to upgrade from my M1. More performance is nice, but making it the baseline to run an OS would just be silly.
Yep, super optimized, super responsive, and they feel like they justify the hardware gains
It’s actually very sad to see the state final cut is in. It’s a perfectly competent NLE for speed editing and has some solid features, but they had a real piece of software on their hands for years and just kind of sat around doing nothing from 2018 or so onward. I guess it just isn’t generating enough revenue to warrant the attention it deserves. It was my workhorse for a solid decade, I passionately defended FCPX because it was truly excellent after they got it to a good place 12-18mo in. Their native multicam and audio sync blew premiere/plural eyes out of the water for years. But now it’s just so…meh
I can’t imagine leaving Resolve to go back even though I still wayyyy prefer the FCPX UI.
Now you can run LLMs beside your Docker containers!
How much innovation is there to do in the OS at this point? You can install applications and they can innovate.
Maybe you need AI, but maybe you just need some AI agent app that uses AppleScript under the hood.
I'd rather buttery smooth, secure, fast, no bugs, let me do my work.
I still have to install third party apps like Rectangle, LinearMouse and Middle to get a fraction of the ease of navigation I get on my Linux machines.
I still have to install a third party terminal like Kitty or Ghostty for basic, modern rendering.
Tons of things from the gaming side. There's a reason people laud SteamOS over a full-blown Windows handheld as an OS.
But what's "marketable"... well, I guess we need to drizzle whatever we come up with in AI. or douse it.
Well, it’s 2025 and I still need to install a 3rd party toolbar calendar app, so there’s that.
I also can’t snap windows, and Cmd-tab still can’t tab between different windows of the same application.
There’s lots more usability that can be improved IMO
lol! It's an Operating System! It allows you to install your own apps to do things like snap windows.
If you want the OS with all the shit you do (and don't) need, then maybe Windows is for you. ;-)
You also can snap windows in MacOS as of a few years ago.
To switch between windows of the same app, use Cmd-`
That particular issue is just a conceptual mismatch. Exactly 0% of the time do I want to segregate my activities as "chrome" things vs "terminal" things vs whatever. When I want a feature like that, multiple desktops (mission control or whatever) is the tool of choice.
The backtick thing is just a constant annoyance. My workflow is to open windows for the things I'm working on, and I want to quickly switch to the window with my next work item. Instead, I need to keep track of extra mental state and figure out whether backtick is the right keystroke or tab-then-backtick is the right thing to do.
It's...fine. I'm thankful I have better options at home, but it's tolerable at work with a few third-party apps.
That was the killer feature that convinced me to switch from Windows + Linux to Mac a long time ago. I often have too many windows open, and the conceptual separation between apps and windows helps me find the right task faster. Especially because I can also switch to an app that doesn't have any windows open at the moment.
Yep, my description was mostly negative (I personally hate it because I don't think that way), but I was serious about it just being a mismatch of expectations. There's nothing written in stone about the MacOS method being wrong, and it's nice that it works better for some people. UI is partly objective and partly subjective, and this particular point definitely falls on the subjective end of the spectrum.
It's also possible to do this on Windows via external tools - easier than changing the whole OS.
> How much innovation is there to do in the OS at this point?
Infinite, just like in any complex UI. All the basic interaction primitives built into the OS are somewhat broken, from app/window management and text editing to keybindings and mouse gestures
As someone who was new to the Mac and eager to use AppleScript for automation, I was disappointed to find that a number of things just don't work under AppleScript on Apple Silicon. It's pretty deprecated; Shortcuts seems supported, though.
While it’s certainly true that AppleScript has taken a bit of a back seat to Shortcuts, as far as I know everything AppleScript that works on Intel should work on Apple Silicon.
What things are you finding that aren’t that way?
It's been a long while, but I remember running into issues with scripts that used osascript to click through an application on a cadence. I remember my debugging ending when the consensus seemed to be that some things just wouldn't work anymore on Apple Silicon.
Apologies that my memory fails me here! This was a few years ago, I only have my zsh history (and the name of a now-deleted script) to go by.
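For context, this is the kind of osascript-driven UI automation being described - a minimal Python sketch with a hypothetical app/menu target; System Events UI scripting only works once the calling app has been granted Accessibility permission:

    import subprocess

    # click a menu item via System Events (needs Accessibility permission)
    script = '''
    tell application "Finder" to activate
    tell application "System Events" to tell process "Finder"
        click menu item "New Finder Window" of menu "File" of menu bar item "File" of menu bar 1
    end tell
    '''
    subprocess.run(["osascript", "-e", script], check=True)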
> How much innovation is there to do in the OS at this point?
1) Sign Nvidia's drivers again, at least for compute (there's no excuse)
2) Implement Vulkan 1.2 compliance (even Asahi did it, c'mon)
3) Stop using notifications to send me advertisements
3.1) Stop using native apps to display advertisement modals
4) Do not install subscription services on my machine by-default
5) Give macOS a "developer mode" that's at-least as good as WSL2 (if they won't ship GNU utils)
6) Document the APFS filesystem so the primary volume isn't inscrutable, akin to what M$ did for NTFS
If they're trying to get me to switch off Linux, those would be a nice start. I don't think any of that is too much to ask from a premium platform, but maybe my expectations are misaligned.
> 5) Give macOS a "developer mode" that's at-least as good as WSL2 (if they won't ship GNU utils)
The de facto answer is Homebrew — even internally at Apple. They just can’t publicly say it without liability issues.
> If they're trying to get me to switch off Linux
It’s important to know that Apple is not trying to get you to switch from Linux. Converting “UNIX workstation” people was an effort of theirs circa 2001 but that marketing campaign is long over with.
Their targets are consumer, prosumers, and media/influencer people. They give app developers just enough attention to keep their App Store revenue healthy.
Plan your long-term computing needs accordingly. You’ll see what I mean in the next 12-24 months.
I don't know; I was using WSL2 on Windows before I switched to macOS, and WSL2 gets annoying, to be honest.
You're better off using macOS's native Unix binaries plus a VM or Docker.
I never noticed ads in notifications, unlike with Windows, which is ads-infested everywhere now.
I agree that better GPU support would be nice, but so would better Metal support in common open-source projects, since I'm a laptop user.
> Give macOS a "developer mode" that's at-least as good as WSL2 (if they won't ship GNU utils)
They shipped something similar in macOS 26 - native Linux container support.
I don't think anyone should upgrade if they're happy, but I also think faster chips do have real-world benefits that tend not to be appreciated by people who aren't valuing their time enough. I replaced my M1 MBP with an M4 earlier this year, and it's had a couple real-world benefits:
- builds are noticeably faster on later chips, as multicore performance has increased a lot. When I replaced my M1 MBP with an M4, builds in Xcode, cargo, and LaTeX (I'll switch to Typst one of these days, but haven't yet) took about 60% of the time they had previously. That adds up to real productivity gains
- when running e.g. qwen3 on LM Studio, I was getting 3-5 tok/s on the M1 and 10-15 on the M4, which to me at least crosses the fuzzy barrier between "interesting toy to tinker with sometimes" and "can actually use for real work"
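For reference, LM Studio exposes an OpenAI-compatible server on localhost:1234 by default, so "can actually use for real work" can be as simple as this sketch (the model name is hypothetical; use whatever you've loaded):

    import json, urllib.request

    payload = {
        "model": "qwen3-8b",  # hypothetical; match your loaded model's id
        "messages": [{"role": "user", "content": "Summarize these notes."}],
    }
    req = urllib.request.Request(
        "http://localhost:1234/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])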
How about:
- 5G connectivity
- WiFi 7
- Tandem OLED screen
- Better webcam
- FaceID
- Cheaper RAM (RAM is more important to me these days than CPU speed)
- More ports
- Better/cheaper monitors
- Make a proper tablet OS
- Maybe a touchscreen, but I really don't want one
just to get started
As someone who hates Apple's facial recognition implementation, I'm eagerly awaiting the day they ditch TouchID for FaceID. That'll be the year for me to upgrade to a high-spec laptop on the last generation with TouchID.
Honest question, why? Sounds like you have a technical issue with it. I’m not a fan of FaceID for digital privacy reasons, but the reliability and ease of use eventually forced me to give in. I don’t think I’ve encountered anyone who hates Apple’s specific implementation of facial recognition.
Price gouging on RAM is a very intentional decision by Apple to charge 8x market rate for it. Same for storage: you can get a blazing-fast 4TB NVMe SSD for just a few hundred bucks vs the $2k or whatever Apple extorts from you.
It’s just market segmentation. Other companies do this by putting nonperformant CPUs lacking sufficient bus lanes in the consumer laptop. Apple gives the entry models a real piece of hardware, just with insufficient RAM. I like this situation better than the alternative.
Yeah I get that, but it still feels really unpleasant from the side of a regular customer. Sure, Apple is targeting the software industry and media industry who'll pay $5k for a fully kitted out MBP for all of their employees. And the regular normies who don't need much RAM/storage get amazing hardware at a good price point - good for them.
But as a regular guy who just has a lot of files and tends to keep tons of browser tabs open... it really sucks that I'm in the situation of getting extorted for $3k of pure profit for Apple, or have to settle for subpar hardware from other companies (but at a reasonable price). Wasn't an issue when the RAM & SSD weren't soldered on, but now you can't upgrade them yourself.
I think the point is that every manufacturer is playing this game, and with comparable margins.
I have no idea what the hip PC laptop is these days, is it still the Lenovo Carbon X1? I went to their website and picked the pre-configured laptop with the most RAM (32GB), best CPU, and 1TB SSD. This was $3k: https://www.lenovo.com/us/en/p/laptops/thinkpad/thinkpadx1/t...
Roughly the same size and specs as the most expensive pre-configured MacBook Pro of the same screen size (the MBP has 36GB RAM, +4GB over the Lenovo, and a much better processor & GPU for $3.2k).
It's all market segmentation. Apple is just being upfront about it and giving you a clean, simple purchase page that shows the tradeoffs. Whereas Lenovo is using car salesman techniques to disorient you with a bewildering array of options and models all of which have decision paralysis-inducing tradeoffs not entirely in your favor.
Then go get another computer? Why do you rage against a product which you don't like? Forget about Apple and stick to other makers. There's plenty of products and manufacturers I don't like. I never think about them.
Not sure why you're taking it so personally and getting defensive. Was my comment not related and relevant to parent comment? I did in fact buy another computer because I don't like getting price gouged. Have a nice day.
Apple could have my money in exchange for their hardware. I won't even ask for support. They just need to provide the hardware specifications to Linux developers.
Exactly. I might even pay them for the opportunity cost of not harvesting my data
That would immediately induce my first ever significant Apple purchase.
We want cheaper storage, now. 1TB costs around $400, based on speccing a 2TB M5 MacBook Pro.
That's probably a 4x markup, and the $200 to go from 256 to 512 is even worse.
Every time a user considers jumping from Windows but balks at the storage costs, that's leaving many thousands in potential revenue on the table. I just can't believe it really makes economic sense for them, except in short-term cashflow terms.
RAM too. They want +$1K to go from 64GB to 128GB, with no other spec changes. It's a way of segmenting the market -- those who actually need it are willing to pay a lot more (e.g. for AI / 4K videos).
1TB of good NAND costs 30€ or so.
> Back in the PowerPC and Intel days, Macs would sometimes go years between meaningful spec bumps, as Apple waited on its partners to deliver appropriate hardware for various machines.
Yes and no. Sometimes Intel did not move as fast as Apple wanted, and sometimes Apple didn't feel like it. Especially the Mac Pro (trash can and old cheese grater) and the Mac mini (2012-2018) were neglected.
Today, the Mac Pro ships with the M2 Ultra, the Mac Studio ships with the M3 Ultra, and it's not certain whether the Mac mini and the iMac will get the M5 or will continue shipping with the M4 for the foreseeable future.
I have an M1 Max MBP from 2022 with 32GB of RAM (which I'm grateful for).
More performance (especially for local AI models) is always great, but I'm trying to imagine what I'd want out of a design change!
I think slightly thinner would be nice, but not if it runs hotter or throttles.
Smaller bezels on the screen maybe?
I'm one of those who liked the Touch Bar (because I think applications which labelled their shortcuts in the Touch Bar are awesome), so I think some innovation around things like that would be nice. But not if it compromises the perfect keyboard.
I do think MacOS would be improved with touchscreen support.
> I do think MacOS would be improved with touchscreen support.
On the contrary, I appreciate the Mac UI not being forced into touch friendliness. The whitespace increase in Big Sur is already bad enough, at least to me.
For those of us immersed in hardware fandom, the cycle is neither new nor disappointing - if anything, a lot of us relish the “boring” times, because it means we can finally squeeze performance out of our investment without fretting about arbitrary replacement timelines or major improvements in technology leading to gargantuan gains. It’s nice, quiet, and lets us enjoy the fruits of our labors and hobbies.
That being said, I do kind of head-tilt at the folks screaming that this sort of “boring” cycle of hardware isn’t sustainable, that somehow, someone must create the next major improvement to justify all new spend or otherwise this is a worthless exercise. In reality, it’s always been the opposite: Moore’s Law wasn’t infinitely scalable, and anyone who suffered through the Pentium 4 era was painfully aware of its limitations. Sure, we can find other areas to scale (like going from clock speed to core counts, and core counts to core types), but Moore’s Law is not infallible or infinite; eventually, a plateau will be reached that cannot be overcome without serious R&D or a fundamental sea-change in the marketplace (like moving from x86 to ARM), often a combination of both.
Apple, at least, has the unenviable position of being among the first to address this challenge: how do you sell more products when power or efficiency gains are increasingly thin, year over year? Their approach has been to leverage services for recurring revenue and to gradually slow down product refreshes over time, while tempering expectations of massive gains for those product lines seeing yearly refreshes. I suspect that will be the norm for a lot of companies going forward, hence the drive to close walled gardens everywhere and lock in customers (see also the Android sideloading discourse).
The hardware cycle at present is fairly boring, and I quite like it. My M1 iPad Pro and M1 Pro Macbook Pro dutifully serve me well, and I have no need to replace either until they break.
I thought this was going to be about Apple's various recent catastrophic software innovations, saying "why did you have to mess with a good thing? We just wanted it to stay as-is, even if that's considered 'boring'"
I wouldn't call the old software good either
General purpose computing is what we wanted.
I'm still typing this from an M1 Max MBP w/ 64 gigs of RAM. I ended up needing more memory, so I swapped to this machine instead of my M1 Air w/ 16 gigs. Both machines are completely capable for most tasks I deal with as a developer. Do I like my work M3? Sure. I wish I had the old M3 Air I had to give back. But I'm happy with my machines.
It's funny that my ipad has a more current CPU than my two laptops.
Soo we can't demand both stability and constant reinvention
Personally: I am extremely excited for a world where we have silicon that's capable of driving triple-A level gaming in the ~20w TDP envelope. M5 might actually be the first real glimpse we've had into this level of efficiency.
Still no need to upgrade my M1 MBA... life is good.
I just want the old Macbook Air M1 design back :(
That’s what I am holding on to.
I have a 2020 Intel 10nm quad-core MBP and my god, even the M2 is so much faster. They are doing absolutely incredible work to be getting >10% improvement every single year without fail starting from that point.
Yep, one of the biggest successes for Apple in recent years was hiring a team of chip designers for these M chips.
I've heard that the M-series chips with Metal do great on the whole small-model, low-latency front, but I have no practical experience doing this yet. I'm hoping to add some local LLM/STT function to my office without heating my house.
I'm uncertain whether any M-series Mac will be performant enough - the M1/M2 Mac minis specifically - or whether there are features in the M3/M4/M5 architecture that make it worth my while to buy new.
Are these incremental updates actually massive in the model performance and latency space, or are they just as small or smaller?
As someone who purchased their first M-series Mac this year (an M4 Pro), I've been thrilled to discover how well it does with local genAI tasks to produce text, code, and images. For example, openai/gpt-oss-20b runs locally quite well with 24GB memory. If I had known beforehand how performant the Mac would be for these kinds of tasks, I probably would have purchased more RAM in order to load larger models. Performance for genAI is a function of the GPU, # of GPU cores, and memory bandwidth. I think your biggest gains come from going from a base chip to a pro/max/ultra version with more GPU cores and greater bandwidth.
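A rough rule of thumb behind that last point: token generation is usually memory-bound, so each generated token has to stream the active weights from RAM once, which puts a ceiling of bandwidth divided by model size on tok/s. A sketch with illustrative numbers only (an M4 Pro's ~273 GB/s, and roughly 12 GB for a 20B model at ~4-bit quantization); real throughput lands below this:

    def max_tokens_per_sec(bandwidth_gb_s: float, model_gb: float) -> float:
        # upper bound: weights are read once per generated token
        return bandwidth_gb_s / model_gb

    print(f"~{max_tokens_per_sec(273, 12):.0f} tok/s ceiling")  # ~23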
The M5 is a huge upgrade over the M4 for local inference. They advertise 400%, and there is reason to believe this isn't a totally BS number: they redid the GPU cores to avoid having to emulate certain operations in the core inner loop of LLM inference.
I have an M4 and it is plenty fast enough. But honestly the local models are just not anywhere near the hosted models in quality, due to the lower parameter count, so I haven’t had much success yet.
I can relate. Most users just want stable, quiet performance improvements, not a revolution every update. Do you care more about performance improvements or new features?
Frankly, I’d be incredibly excited if the next Apple OS update was “No new major features. Bug fixes, perf optimization, and minor ergonomic improvements only”.
The Mac OS X 10.6 Snow Leopard approach. It was good.
A few ergonomics improvements and a wifi stack that doesn't periodically crash could be enough to pull me to the dark side. I like my setup, but I'm lazy and don't really like the process of upgrading computers. Apple taking that load off could buy me as a customer for a decade. The amazing battery life is nothing to sneeze at either.
Sadly this approach is less likely to get the exec a bonus
The important work isn’t always glamorous. This is a problem that has tainted the entire industry.
The new pretty stuff feels a lot less magical when it lags or the UI glitches out. Apple sells fluidity and a seamless user experience. They need those bug fixes and an obsessive attention to detail to deliver on what is expected of their products.
The exec, but also the SWEs. In my company, if all you have to show is minor improvements and bug fixes, you're at risk of being fired.
That’s sad.
Nah. I want fixes in macos, not "boring" nor "shiny updates".
I hate that computers get faster, because it means I'll be forced to buy another laptop. It goes like this:
Before, I'd buy a laptop because it had cool new features. But now the only reason I buy a new one is that the new software crashes from too little RAM, or runs too slowly. My old laptops work just fine. All the old apps they came with work just fine. Even new native apps work just fine. But they can't run a recent browser. And you can't do anything without a recent browser. If our computers never got faster, we would still be able to do everything we can do today. But we wouldn't have to put down a grand every couple of years to replace a perfectly good machine.
I think what you want is for software developers not to write bloated code, instead of computers not getting faster. The bloated code is a result of undisciplined programming and not paying attention to users' devices.
If our computers never got faster, we would never get faster computers (obviously...) to run efficient code even faster. 3D rendering and physics simulation come to mind.
I have noticed what you mention over longer timescales (e.g. a decade). But it's mostly "flashy" software - games, trendy things... Which also includes many websites sadly - the minimum RAM usage for a mainstream website tab these days seems to be around 200MB.
Anecdata: My 12 year old desktop still runs Ubuntu+latest Firefox fine (granted, it probably wouldn't be happy with Windows, and laptops are generally weaker). Counter-anecdata: A friend's Mac Pro from many years ago can't run latest Safari and many other apps, so is quite useless.
I don't know if that's true of software developers generally, but it makes me cringe a bit as a game developer. I upgraded from a 1060 to a 4060 and suddenly did waaaay less optimization; it just wasn't top of mind anymore. Of course, that bill still comes due eventually...
What nonsense.
Name one piece of software that won’t run comfortably on my M1 MacBook Air, now 5 years old.
Well, if you bought the 8GB RAM version there might be some apps that won't work that well ;-)
I agree. My M1 Air is the best laptop I’ve ever owned (and that goes back 30 years). While I’m finally getting tempted to upgrade by M5, the reality is my M1 is still quite usable. I’m thinking I might use it until it either fails or Apple finally cuts support for it.
Mac is not the lowest common denominator
What happened to the M3 GPU to give it a drop in score?
Same reason Asahi Linux only supports up to the M2: they completely redid the system architecture.
When you make a major architecture change (e.g. dynamic caching) there's always one or two workloads that get slower.
Reading this on my brand new M5 Mac :)
Agree! very happy with the M4 performance.
This seems like a straw man. Are reviewers really calling the M5 boring?
For Exciting, look into RISC-V.
That's gonna be wild starting 2026, with the first implementations of RVA23, such as Tenstorrent Ascalon devboards TBA Q2.
Mac hardware has so significantly outpaced software needs I think there are diminishing returns. I'm a software developer who uses all sorts of advanced stuff and I only bought an M4 Pro, not a Max, because it wasn't worth the extra money. There are so few applications that max out a CPU for any meaningful amount of time these days like rendering videos or 3D.
My M4 iPad Pro is amazing but feels totally overpowered for what it's capable of.
I guess what I'm saying is... I don't need faster CPUs. I want longer battery life, 5G connectivity, WiFi 7, lighter weight, a better screen, a better keyboard, etc.
I guess it's odd that Apple spends so much time making faster computers when that is practically an already solved problem.
They aren’t just making computers, though. Today’s CPU improvements go into tomorrow’s Vision Pro. Today’s improved E cores become tomorrow’s watch cores. Or something like that.
We want Apple to compete. When they stopped signing CUDA drivers, I thought it was because Apple had a competitive GPGPU solution that wasn't SPIR-V in a trenchcoat. Here we are 10 years later with SPIR-V in a trenchcoat. The lack of vision is pathetic and has undoubtedly cost Apple trillions in the past half-decade alone.
If you think this is a boring architecture, more power to you. It's not boring enough for me.
Genuine question, how does SPIR-V compare with CUDA? Why is SPIR-V in a trench coat less desirable? What is it about Metal that makes it SPIR-V in a trench coat (assuming that's what you meant)?
At this stage of the game what people want is CUDA. I just bought a new GPU and the only requirement I had was "must run reasonably modern CUDA".
There might be a subset of people, such as yourself, that looks for CUDA as a hard requirement when buying a GPU. But I think it's fair to say that Vulkan/SPIR-V has a _lot_ of investment and momentum currently outside of the US AI bubble.
Valve is spending a lot of resources, and AFAIK so are all the AI companies in the Asian market.
There are plenty of people who want an open-source alternative that breaks Nvidia's CUDA monopoly.
I think the new AMD R9700 looks pretty exciting for the price: basically a power-tweaked RX 9070 with 32GB VRAM and pro drivers. Wish it had been an option 6-7 months ago when I put my new desktop together.
Great. I’m with you there. There is no way that’s describing Apple though.
They’re not open source, for sure. But even setting that aside, they don’t offer anything like CUDA for their system. Nobody is taking an honest stab at this.
Triton is a very compelling alternative to CUDA for many applications. Many people know it from the outstanding Unsloth kernels.
https://triton-lang.org/main/python-api/triton.language.html
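For a flavor of it, this is the canonical Triton starter kernel - an elementwise add written in Python and JIT-compiled for the GPU (CUDA/ROCm backends; not something that runs on Apple Silicon today):

    import torch
    import triton
    import triton.language as tl

    @triton.jit
    def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK: tl.constexpr):
        pid = tl.program_id(axis=0)
        offsets = pid * BLOCK + tl.arange(0, BLOCK)
        mask = offsets < n_elements  # guard the ragged final block
        x = tl.load(x_ptr + offsets, mask=mask)
        y = tl.load(y_ptr + offsets, mask=mask)
        tl.store(out_ptr + offsets, x + y, mask=mask)

    x = torch.randn(4096, device="cuda")
    y = torch.randn(4096, device="cuda")
    out = torch.empty_like(x)
    add_kernel[(triton.cdiv(x.numel(), 1024),)](x, y, out, x.numel(), BLOCK=1024)
    assert torch.allclose(out, x + y)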
Mojo has support for Apple Silicon kernels: https://forum.modular.com/t/apple-silicon-gpu-support-in-moj...
They say no downside, but if you need to run Windows 7 in VirtualBox, you still need an Intel Mac (or another non-ARM computer).
Windows 7 is sixteen years old. There are full x86 emulators available. Seems like a niche pursuit.
I have an old card printer that I only use occasionally, and firing up a windows 7 virtual machine is (was?) the most convenient way to do it. I think it's not so uncommon to have old devices around that don't work with newer versions of windows.
Perhaps a MacBook is now fast enough to just run Windows 7 in full emulation? Haven't tried, though.
Edit: Checked on YouTube. Yeah, Windows 7 seems fast enough on an Apple silicon MacBook in fully emulated mode. For example: https://www.youtube.com/watch?v=B9zqfv54CzI
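(For context, UTM is essentially a frontend over QEMU, so the fully emulated setup boils down to an invocation like the one below. A minimal sketch wrapped in Python; it assumes qemu-system-x86_64 is installed, e.g. via Homebrew, and that win7.qcow2 is a disk image you already have:)

    import subprocess

    # Full x86 emulation (TCG) on Apple silicon - no hardware virtualization,
    # which is exactly why this used to be too slow to bother with.
    subprocess.run([
        "qemu-system-x86_64",
        "-machine", "q35",
        "-cpu", "qemu64",
        "-smp", "4",
        "-m", "4096",  # 4 GB of guest RAM
        # Win7 has no virtio drivers out of the box, so use an IDE/AHCI disk.
        "-drive", "file=win7.qcow2,format=qcow2,if=ide",
        "-vga", "std",
        "-usb", "-device", "usb-tablet",  # absolute pointer so the mouse tracks
        "-nic", "user,model=e1000",       # e1000 NIC has in-box Win7 drivers
    ], check=True)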
I don’t think that use case is worth designing a computer for in 2025.
I have an old 2019 MBP running Windows 10 for old gaming.
So it was with PowerPC, SPARC, SGI's MIPS, and a bunch of other now-obsolete architectures. I don't think we should limit technological potential to keep old Windows drivers afloat, and they weren't native to the platform to begin with. You can always get a PC and virtualize Windows 7 just fine.
Today I was testing an x64 MSI installer and app in a Windows-on-ARM VM in UTM, and it worked just fine with Windows' built-in emulation.
Windows on ARM?! Someone should write an article about that.
M1 had performance/watt way ahead of x86.
M5 has performance/watt below Panther Lake.
Is that really what you want?
Yes, I really want an M5, a CPU I can buy today, more than Panther Lake, which isn’t on the market yet and hasn’t been reviewed by 3rd parties.
I want a laptop that gives me amazing performance, thermals, build quality, and battery life. It’s gonna take a while to see what manufacturers will do with panther lake.
These arguments constantly devolve into "why would you want an APPLE product that's less good than $_new_shiny_PC_thing", and it's always currently available products being pitched against conceptual products, existing only in the heads of Intel's fanboys, that may come to market in a year. It's a ridiculous comparison.
I recently got an M3 Pro MacBook Pro on clearance for $1,600, with a 16-inch screen brighter than any PC laptop's I've ever seen. It's the fastest computer I have ever used, hands down, and it's already 2 generations out of date. Or I could have a PC gaming laptop where the fit and finish isn't as nice, where the screen is blurrier, where the battery life maxes out at 4 hours if I do absolutely nothing with it, and where any time I do anything of remote consequence the fans kick up and make it sound like it's trying to take off.
And that's without even taking into account the awful mess Windows is lately, especially around power management. It makes every laptop experience frustrating, with the same issues that were there when I was in fucking high school.
Like if you just hate Mac, fine, obviously a Mac is a bad fit for you then and I wouldn't try and tell you otherwise. But I absolutely reserve the right to giggle when those same people are turning their logical brains into pretzels to justify hating a Mac when it has utterly left the PC behind in all things apart from gaming.
I have a PC on the other side of my wall from where I'm sitting with a 7800X3D and an nvidia 4090. It's for gaming only most of the time, though I do take advantage of the 4090 for some basic LLM stuff, mostly for local audio transcription and summarizing (I take a lot of notes out loud, I speak faster than I can write). The rest of the time it's playing AAA gaming titles at full tilt at 5120x1440, full res, getting 60fps on basically anything I throw at it, all while sucking down 600W (400 for the GPU alone while playing Cyberpunk). It's a beast. I love it.
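(Aside: that kind of local transcription is only a few lines these days. A minimal sketch using the openai-whisper package - just one possible tool, not necessarily the setup described above:)

    import whisper  # pip install openai-whisper; runs on the GPU if available

    # "medium" is a decent accuracy/speed tradeoff on a 4090-class card.
    model = whisper.load_model("medium", device="cuda")
    result = model.transcribe("spoken_notes.m4a")  # hypothetical path to a recording
    print(result["text"])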
I have an M4 Mac Mini on my desk. At full tilt it pulls 30W. It scores higher in benchmarks than my gaming PC. It cost less than my 4090 did on its own, and that's including an upgraded third-party iBoff storage upgrade.
Of course, trade-offs and process-size differences abound; the M4 is newer, and I can pack way more RAM into my PC years after I built it. I can swap cards. I can add another internal SSD. It can handle different kinds of load better, but at the cost of FAR more power draw and heat, and it's in a full tower case with four 180mm fans moving air over it (enough airflow to flap papers around on my desk). It's huge. Lumbering. A compute golem, straining under the weight of its own appetite, coils whining at the load of amps coursing through them.
Meanwhile, at idle, my Mac mini uses less power than the monitors connected to it, and eats up most of the same tasks without ruffling its suit. At full tilt, it uses less power than my air purifier. It's preposterous how good it is for what it costs to buy and run. I don't even regret not getting the M4 Pro.
When the Thunderbolt Display came out I was in a raid group, and I wanted a display with a great refresh rate and low delay (melee character, don’t get stuck standing in the poo). So I researched and researched, and the only monitor with response times equivalent to that dumb Thunderbolt Display was only $60 cheaper, had a plastic shell, and I’d have to fight UPS over getting it.
Or I could drive across town and have a monitor today and pay $60 for the aluminum shell that hides dust better.
I think it's sometimes tempting for people to reach for logical fallacies when arguing against Mac lovers, when actually they just have different priors (they value different aspects of a computer).
Yes, Macs have incredible compute/watt, display quality, and design. However, I like to think of myself as logical, and I would not buy a Mac.
Given the choice between a M5 Mac and a latest-gen ThinkPad, I would not take the Mac. That is fine, and so are people who would do the opposite. We are just looking for different qualities in our computer.
It's all tradeoffs after all - similar to how we value personal freedom in the West, I value freedom to do what I want with the hardware I own, and am willing to accept a performance downgrade for that. (No Windows means that the battery life hit is relatively light. FWIW, there's no chance I would buy a computer locked down to Windows either.)
I also value non-commitment to a particular ecosystem so I prefer not to buy Apple, because I think a significant amount of the device's value is in how seamlessly it integrates with other Apple devices.
However, one day in the future when many of my beliefs have become "bought out", perhaps my priorities will change and I will go all in on the ecosystem. That's OK as well.
I mean you have a much more reasonable and nuanced opinion than the GP so I wouldn't rope you in with the aforementioned mental-gymnastic-ing fanboys. However, I feel the need to take issue here:
> It's all tradeoffs after all - similar to how we value personal freedom in the West, I value freedom to do what I want with the hardware I own, and am willing to accept a performance downgrade for that.
Genuine question: what do you mean by locked down? By default the Mac won't run unsigned software, but even today in macOS 26 that's not an unsolvable issue; I run all kinds of software not signed by Apple daily. There are further nuances, like sometimes if you want to install kernel-level stuff or tweak certain settings you have to disable SIP (boot into Recovery and run csrutil disable), which is definitely a bit of a faff, but that's a Google-able thing any tech-literate person could accomplish inside of 30 minutes.
I would bow to the technical limitations, as you're rather locked to ARM64-compiled software, but I can't recall the last time I saw a piece of software under current development that doesn't ship a binary for it.
Nit: the M5 Pro isn’t out yet (or even announced). Your system is only one generation out of date :)
Oh I thought we were on M5. Time is a lie, lol.
And even for gaming, depending on what you play it's perfectly serviceable.
Yea, this is how I feel too. I've been hoping that Intel would turn itself around, but Intel has failed at its roadmap over the past few years. Intel canceled 20A and 18A is delayed. It had looked like Intel would leapfrog TSMC, but that didn't come to fruition.
I hope that Intel does well in the future. It's better for us all if more than one company can push the boundaries on fabrication.
I also remember the days when the shoe was on the other foot. Motorola or IBM was going to put out a processor that would decimate Intel - it was always a year away. Meanwhile, Intel kept pushing the P6 architecture (Pentium Pro to Pentium 3), then NetBurst (Pentium 4), then Core. Apple keeps improving its M-series processors: single-core speed is up 80% since the M1, and it's 25% faster than the fastest desktop processor from AMD and 31% faster than the fastest desktop processor from Intel.
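Back-of-the-envelope, that 80% implies mid-teens gains per generation over the four steps from M1 to M5 (assuming even gains, which real generations don't deliver exactly):

    # Hypothetical even-gains model: what per-generation improvement
    # compounds to +80% over the four steps from M1 to M5?
    total_gain = 1.80
    steps = 4
    per_gen = total_gain ** (1 / steps) - 1
    print(f"~{per_gen:.1%} per generation")  # prints ~15.8%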
I'd love for Panther Lake to be amazing. It would put pressure on Apple to offer better performance for my dollar. Part of performance is how much CPU a company is willing to give me at a price point and what margins they'll accept. If an amazing Panther Lake pushes Apple to offer more cores at a cheaper price, that's a win for Apple users. If an amazing Panther Lake pushes Apple to ship 2nm processors sooner (at higher cost to them), that's a win for Apple users.
But I'm also skeptical of Intel. They kept promising 10nm for years and failed. They've done a bit better lately, but they've also stumbled a lot and they're way behind their roadmap. What kind of volume will we see for Panther Lake? At what prices? It's hard to compare a hopeful product to something that actually exists today. Part of it isn't just whether Intel can make 18A chips, but how fast they can produce them. If most of Intel's laptop, desktop, and server processors in 2026 aren't 18A, then it isn't the same win. And before someone says "Apple is just a niche manufacturer," they aren't anymore. Apple makes CPUs for every iPhone in addition to Macs, so it has to get CPUs manufactured at very high scale - around the same scale as Intel's CPU market.
I hope Intel can do wonderfully, but given how much Intel has overpromised and underdelivered, I'm definitely not taking their word for it.
I am excited about Panther Lake myself but where are you reading that it has higher performance/watt than M5? The chips aren't even out yet. All we have are Intel marketing materials with vague lines on charts. No one could have possibly done a performance/watt test on Panther Lake yet. I'm hoping they beat M5 but if I had to, I'd put my money on M5.
Leaving aside the availability of various Intel processors, exactly what I want is for the various manufacturers to compete as hard as they possibly can.
I want Intel to catch up this month. And then next month I want AMD to overtake them. And then ARM to make them all look slow. And then Apple to show them how it's done.
The absolute last thing I'd want is for Apple to have special magic chips that nobody else even comes close to.
I don't want Panther Lake, whatever that is.
https://www.youtube.com/watch?v=FL7yD-0pqZg
This is just how Intel names their CPU generations. It's far more boring than you're imagining. It's presumably named after https://snohomishcountywa.gov/5383/Panther
Comet Lake, Elkhart Lake, Cooper Lake, Rocket Lake, Alder Lake, Raptor Lake, Meteor Lake.
From what I’ve read, single-thread for Panther Lake is roughly the same as last gen. The gains are in efficiency, multi-thread, and GPU. The most optimistic reading I’ve seen suggested 50% gains in GPU performance and in multi-thread. I’ll wait for independent testing before making any judgements, but Intel has a way to go to rebuild trust.
So, slower than the 14th gen, then.
Though it sounds like it won't be a 400W desktop part at least.
No, you're right, that's not—let me go buy a Panther Lake laptop right now. What site would you recommend?
M5 and Panther Lake are both late 2025 releases. They're fair comparisons.
One of them I can go to the store and buy right now. One I cannot. That is a very important difference.
Panther Lake isn't appearing in any products until 2026.
That’s partly the difference between making your own components and getting them from a vendor. Sure, Intel can send select vendors prerelease prototypes, but the feedback loop will never be as efficient as in-house.
But it’s like a margin call. Everything is great until it completely sucks. Of course a lot of that comes down to TSMC. So if Apple falls it’s likely others will too.
I think it's the difference between having enough CPUs that you can launch a product and having enough CPUs that people start planning future products.
Volume takes time. That's why we're seeing 2026. And before someone says "that just gives Apple an advantage because they're smaller," Apple is shipping a comparable volume of CPUs - and they're doing basically all their volume on the latest fabrication tech.
Intel's CEO said Q1 2026 for market availability for Panther Lake.
There are no benchmarked samples yet.
I'd love Intel to do well with this, but Intel has disappointed before.
You're absolutely right. Where should we go to get a Panther Lake laptop?
That’s exactly what I’m asking.
And we’re sure that when it shows up in products it’ll be as good as Intel says it is?
... I mean, given Intel's history in this department, you'd probably want to wait until Panther Lake is available before getting too excited.
> The difference is that with Apple silicon, Apple owns and controls the primary technologies behind the products it makes, as Tim Cook has always wanted.
But did customers want it?
I'll leave it here, as the point is made.
Customers seem pretty happy with the changes and sales are up since the transition.
A Macbook with some of the best processors available in a laptop with the battery life and thermal characteristics of an iPhone or iPad is a pretty compelling product for many people.
Long-time Apple customers? Almost certainly. Apple has a history of conflict with its chip makers and other component vendors. For the longest time, Apple wasn't big enough for those vendors to bother doing anything Apple wanted them to do. And even when Apple was big, it still wasn't big enough to make a dent in its vendors' pipelines (see also: Intel and low-power chips).
You didn’t make any point.