For some context about why a portable, user-friendly, hardware-level emulator for classic Mac systems is such a big deal, see this blog post from 2020: https://invisibleup.com/articles/30/
For game consoles, we've had emulators like Nestopia and bsnes and Dolphin and Duckstation for years.
For PCs, virtualisation systems like VMWare and VirtualBox have covered most people's needs, and recently there's been high-fidelity emulators like 86Box and MartyPC.
The C64 has VICE, the Amiga has WinUAE, even the Apple II has had high-quality emulators like KEGS and AppleWin, but the Mac has mostly been limited to high-level and approximate emulators like Basilisk II.
Compatibility-wise it's MUCH worse than all the others, but there's also Executor: https://en.wikipedia.org/wiki/Executor_(software) . You can even use it to run a Macintosh version of Solitaire in your browser, by having the browser emulate MS-DOS, which then runs Executor/DOS: https://archive.org/details/executor
In addition to Executor/DOS, a non-released version ran on Sun 3 workstations (they too had 680x0 processors), and Executor/NEXTSTEP ran on NeXT machines, both the 680x0-based ones and the x86-powered PCs that could run NEXTSTEP.
Executor was the least compatible because it used no intellectual property from Apple. The ROMs and system software substitutes were all written in a clean room: no disassembly of the Apple ROMs or System file.
Although Executor ostensibly has a Linux port, it's probably hard to build (I haven't tried in a couple of decades), in part because, to squeeze maximum performance out of an 80386 processor, the synthetic CPU relied on gcc-specific extensions.
I know a fair amount about Executor, because I wrote the initial version of it, although all the super impressive parts (e.g., the synthetic 68k emulator and the color subsystem) were written by better programmers than I am.
Thank you so much for Executor. I used to run it on my 486 Linux box, over an X11 SSH tunnel to the Sun workstation I used in the computer labs for work on campus. I balanced my checkbook and wrote essays in emulated Excel and Word (with rough compatibility with the Windows versions). It was so cool to be able to mix and match systems that way.
https://github.com/autc04/executor is a more recent fork of Executor (but based on the issues, it does NOT build on recent OSes).
When I was starting out in the 90s, Executor was one of those very cool pieces of software I would love to play around with.
That article is objectively true, but... I've never seen such a grotesque dismissal of the hard work people have done for free.
I feel like that’s a bit harsh, but I’ll admit that it is needlessly inflammatory. I wasn’t in the best state mentally when I wrote that. (I do sometimes worry that I’m responsible for the disappearance of Paul C. Pratt…) At some point I need to either rewrite it to be less hostile or just yank it entirely.
For the amount of time and effort that went into that article, the author could surely have fixed at least one of the things they complain about! And they don't seem to understand the C #include mechanism at all, so should we even pay attention to their technical criticisms in the first place?!
I don't know if you read the whole article. The author did make a Mini vMac fork to clean up the build system and code, she linked it at the end. https://github.com/InvisibleUp/uvmac .
Ha. I was starting to find the article a bit tiring and the moment my eyes landed on the Conclusion heading I stopped right there. You shouldn't trust my criticisms either.
Wait, but that last sentence makes you a reliable narrator now. I am lost.
Oh, thank goodness. Hacking Mini vMac was a chore and I didn't have the energy to do something like this.
My 2c: having played around with the codebase recently (to add C++ support), I did find Cursor helpful in figuring out how to work with the bespoke build system.
It might not count as "user-friendly", but MAME does hardware-level emulation of the Macintosh and Apple II (more accurate, and with more peripherals, but less user-friendly than KEGS and AppleWin).
You forgot Mini vMac for 68k. QEMU handles Mac PPC fine.
there's definitely room to improve the user friendliness of mac emulation (Mini vMac's compile-time config is so infuriating), but I think it's a bit unfair to compare it to most of those emulators
vmware and virtualbox were backed by billion dollar corps
the 16 bit machines are much simpler than macs
game consoles had highly homogeneous, well-documented hardware, and sold in much greater numbers (the snes alone sold more than all macs from 1987 to 1995), so there's a larger community to draw devs and users from. writing a nes emulator is almost a weekend project now, it's so well documented.
It should be pointed out that VMware started as a tiny, scrappy company mostly focused on selling Workstation seats so you could run Windows on your Linux computer (which I did back circa 1999, so I could use Linux on my desktop). VirtualBox started out at InnoTek, a tiny company essentially making software to emulate Windows on OS/2, which later did a contract with Connectix to run OS/2 on Windows (or other hosts) using Virtual PC.
Connectix got bought by Microsoft, and InnoTek got bought by Sun, which is now Oracle. Connectix themselves started as a scrappy outfit making it possible to run DOS/Win95 on a Mac.
The core emulation was pretty much done and stable and optimised before the billion-dollar corps bought them out.
vmware apparently had 20 employees in year one; I don't think a single person has ever worked full time on a mac emulator (other than Apple's internal ones, of course)
even a "tiny, scrappy company" has massive manpower compared to 99.999% of open source projects
I suppose - owing to its accuracy - that this doesn't have some of Basilisk II's killer features: Basilisk patches the OS/ROMs to add support for super-high resolutions and (mostly) seamless integration with the host's file system and network.
It's a shame that Basilisk - possibly owing to its inaccurate but killer features - is as janky as it is, because it's really remarkably pleasant to use when it works.
An accurate emulator with a clean codebase is a good starting point onto which to add patches/shortcuts. I've looked through the Basilisk patching code, and it's not really complicated; there are also a handful of partial Toolbox reimplementations, including the bits in Basilisk, Executor (whose author is in the comments here), MACE, etc. It would take some work to port, but it's mostly direct translation of the code plus adding test infrastructure.
One way to add devices that doesn't require ROM or software modification, but _does_ require modifying the emulator: create a virtual memory-mapped device off the 68K bus, and write a driver/CDEV to drive it. The SE and Macintosh II had blessed, but different, expansion options, after all.
For earlier models, there are unused apertures in the memory maps that are at least 2 KB large, but they do differ between models.
As an aside, I really wish MACE would open-source their work. They've made some impressive progress, but I worry that work's going to go to waste if it stays closed.
Aha, I got MACE and Advanced Mac Substitute (AMS) mixed up. AMS looks less complete but has source available.
https://www.v68k.org/advanced-mac-substitute/
I tried loading this with the readily available standard Mac OS 7.1 install disks, using a Mac Plus ROM. "Drive 0: disk ejected"? Mini vMac seems to work. I guess it needs some work still.
A little help - how can I find ROMs? I tried to download some from sites found via Google, but the emulator always says "Unknown or unsupported ROM file". How can I find usable ROMs?
https://macintoshgarden.org/ has always been the gold standard source for me!
I usually use https://www.macintoshrepository.org , the garden lack some proper organisation
These seem to work:
https://archive.org/details/mac_rom_archive_-_as_of_8-19-201...
Very nice, thank you.
I tried running some of them, e.g. the Macintosh Plus. It does accept the ROM, but it just shows a flashing floppy disk icon and doesn't do anything else. How can this be fixed?
The icon is telling you that you need a disk to boot from.
https://www.gryphel.com/c/minivmac/start.html has some links.
The Mac Plus didn’t have an internal hard drive. So you need to start the OS from a floppy.
My father used an external hard drive with his Mac Plus, back in the day.
Does the Mac - like the Lisa - also require cycle-accurate emulation of the hardware? I spent some time with LisaEm and experimented with QEMU, but the Lisa OS makes assumptions about hardware timing that the latter cannot meet.
Much of my early post-college work is stored across a stack of Mac-formatted Bernoulli disks. The software requires an ADB dongle to run, so physical hardware is needed. I wonder if any of those ADB-to-USB adapters could be mapped into the emulator?
All of the ADB-to-USB adapters I know of only support mice and keyboards, and have internal firmware that maps to USB HID. You'd have to write custom firmware to do a raw passthrough to an emulator...
It would probably be easier to crack the software!
I have a large collection of vintage Macs and peripherals, with the largest quantity being the Apple Keyboard II [1]. Archive forums all suggest the Belkin ADB Adapter [2], but that has long since been retired. I would like to make my own; I know instructions exist for a raw passthrough.
[1]https://en.wikipedia.org/wiki/File:Apple_Keyboard_II.jpg
[2]https://www.cnet.com/tech/computing/hack-your-old-macs-adb-k...
The Griffin iMate was the most popular ADB-USB adapter from the time, and probably supports non-input devices (it would’ve been the only option at the time to make those dongles work).
Ah yeah, the ones that were sold at the time would work if you passed through USB to an emulator that supported USB hardware, or reverse-engineered their proprietary protocol. I was only thinking of the modern options when I wrote my comment.
If you've not backed it up already, that data might be gone. If it's valuable to you, then I'd recommend finding out sooner rather than later.
Good advice of course. It is not valuable, and it is not my product - I merely worked on it. The real value was guiding me _away_ from a career as a programmer (and the friends we made along the way).
Anyone who has a working Bernoulli box probably has a matching old Mac to go with it.
Several, yes ;-)
maybe this would work? https://www.bigmessowires.com/usb-wombat/
It turns out there has been some discussion on emulating or passing through ADB hardware keys but nothing conclusive seems to have come of it.
Feels so real, great work.
Any chance this could be made to emulate an Atari ST?
Was this inspired by MartyPC?
Funny you mention that - I'm actually friends with twvd; we share a Discord server and trade UI ideas, as we both use the same GUI toolkit. Snow actually uses the disk image library I built for MartyPC.
Inspired is a strong word. I didn't invent the concept of an accurate emulator, although I'm certainly a fan of his approach.
Any Flatpak, Snap or Scoop editions?
The original submission was to a post that explains why this is news, and not just a random project:
A brand new 68k Mac emulator quietly dropped last night!!
“Snow” can emulate the Mac 128k, 512k, Plus, SE, Classic, and II. It supports reading disks from bitstream and flux floppy images, and offers full execution control and debugging features for the emulated CPU. Written in Rust, it doesn't do any ROM patching or system call interception, instead aiming for accurate hardware-level emulation.
* Download link (Mac, Windows, Linux): https://snowemu.com
* Documentation link: https://docs.snowemu.com
* Source link: https://github.com/twvd/snow
* Release announcement: https://www.emaculation.com/forum/viewtopic.php?t=12509
-- https://oldbytes.space/@smallsco/114747196289375530
I understand why links get re-written, but I think the context is relevant and can help the random reader who is unfamiliar with the project.
Off-topic...
I wish Apple would bring back the white menubar background and the coloured logo.
The white menubar makes the whole computer easier to use in a small but constant way. The coloured Apple icon would suggest they no longer have their heads stuck up their asses and might bring "fun", rather than "showing off", back to their design process. And then maybe, maybe... with that "suggestion" symbolised in the UI, we can hope they might bring back the more rigorous user-centric design process they used to be famous for.
https://www.macrumors.com/2025/06/23/macos-tahoe-beta-2-menu...
Are they really changing the UI up again? I am actually so done at this point. The endless UI churn drives me absolutely mad, but I suppose when there's nothing left to do, making it look different is easy.
I suppose a built in volume mixer is still too much to ask for though.
Nice, thanks. I'll use that when I upgrade.
But I'm not going to upgrade whilst the back/next buttons are floating 3m above the window, as suggested in that screenshot.
Turning “Reduce Transparency” on in Accessibility > Display will solidify the menubar in both light and dark modes.
I go through phases with transparency off or on.
Same.
Sometimes I enjoy the translucent menus. They make the machine look "glossy" and expensive. But they're definitely harder to read than opaque flat ones.
With "reduce transparency" on, it's better, but the menubar still isn't white. It's a textured light grey that's closer to the look of an unfocused app window than the solid, dependable, flat thing I wish it still was.
What about setting a white background, which yields a white menubar?
A color logo might be added with an overlay app, or you can reminisce about a black & white screen.
So are we supposed to make custom backgrounds with a 30px white bar on top instead of expecting this to be an option in the settings like in every other sanely customizable OS?
Seconding the overlay app. I forget the name, but there was an app that can configure the appearance of the menubar. Maybe it's my menubar icon organizer? Not Dozer or Bartender, but I can't recall right now.
Ice organizes menubar icons and can alter the bar's appearance.
I'm not sure why OP links to this site, but the actual project is here
https://snowemu.com/
https://github.com/twvd/snow
Personally I find an announcement like the one linked more helpful and useful to create a context, rather than linking directly to the project.
Links to the actual project are in the submitted post, so you can get an overview before then being directed to the project itself.
As always YMMV, indeed, YMWV, but I like seeing the announcement giving the context rather than a bare pointer to the project.
... and while I appreciate the rationale behind it, I'm always saddened when a carefully chosen link that suits the way I think, giving an overview and context with links to the projects, is then overwritten by the direct link to the project, which doesn't give a sense of why it's interesting or relevant.
But as the Man in Black says in The Princess Bride: "Get used to disappointment".
We can have our cake and eat it.
The guidelines are clear that the original/canonical source is what we want on HN:
Please submit the original source. If a post reports on something found on another site, submit the latter.
But you're welcome to post a comment with links to other sources that give the extra information and context, and we can pin it to the top of the thread, or do what I've done here and put them in the top text.
so just to confirm, this HN submission [1] should have linked to this PDF of the paper [2] and put the article [3] that is the current link for the post as a comment?
The question we always ask is whether a source contains “significant new information”.
In the case you cited, the Quanta Magazine article is a report about the study’s findings that is readable and understandable to lay people, and includes backstory and quotes from interviews with the researchers and also images.
I.e., there’s plenty of information in the article that isn’t in the paper. So we’ll always go with that kind of article, over the paper itself, particularly in the case of Quanta Magazine which is a high-quality publication.
In other cases an article is "blog spam" – i.e., it just rewords a study without adding any new information – and in those cases we'll link directly to the study, or to a better article if someone suggests it.
Anyone is always welcome to suggest a source that is the most informative about a topic and we’ll happily update the link to that.
We won't agree on this.
I understand the rationale, and as someone who moderates other communities I can totally understand why this is administered as a blanket policy. Having said that, it does sometimes result in what I think of as sub-optimal situations where information is unnecessarily lost or obscured.
In particular, adding a link to the original post, as you have done here, is likely to be of minimal value. People will click on the headline link, wonder what it's about or why it's "news", and close the window. On the other hand, clicking through first to the post means people will see the context, and then those who are interested will click through to the project site(s). I've done this analysis in other contexts and found that the decision tree for engagement and user information favours linking to the post, not the project.
But as I say, I understand your position, and in the end, it's not my forum, not my community, and not my choice.
I think you're implying that we're more rigid and/or self-defeating about this than we are.
We always want the source that contains the greatest amount of information about the topic. As I wrote in the other reply in this subthread, the heuristic is whether a source contains "significant new information" vs an alternative.
That means, as explained in that reply, an article about the findings of an academic study is better than the academic paper, if it contains significant new information that isn't easily found from the paper itself (particularly if the article contains quotes from interviews with the researchers). A project creator’s blog post about a new project or release is better than a link to the project's GitHub page.
We generally prefer not to link to a third party's social media post about a project, on the basis that it's light on significant new information and takes traffic and attention away from the primary source or another in-depth article about it. (It's different if it's a third party's detailed blog post about a project that includes their own experiences using it and compares it with other projects in the same category. But then it's more of a review than a report about the project itself.) Another problem with submitting a third-party post about a project is that the choice of source then becomes a topic of debate in the comments, as happened here.
In a case like this, the information that was in that social media post could easily have been quoted in a comment in the thread, which we could have pinned.
Given that the author of the project posted an announcement in a discussion forum, there could be a case for making that the HN source, given that it contains the other relevant links and some additional commentary, though in this case it's a bit light on detail. But it makes all the difference that the source we link to is by the author of the project.
In the case of this submission, the story has been on the front page for 12 hours already, including some time at #1, and is still going strong, so I don't think anything has been lost.
You're always welcome to make a case for why a particular source is the one that contains the most "significant new information" and is thus the one that should be the HN source.