Quick link to a video of one of the No-CPU demos, way better than what I was expecting...
https://youtu.be/OXT5MrDdyB8?si=cZChImbAi3JBbFFl&t=49
This also gives me a bit more understanding of how the Video Toaster could be architected back in that day with such slow CPU clock speeds. It seemed like magic at the time compared to the limited capabilities of IBM PC clones. I hadn't realized how much capability these other Amiga chips provided.
The Video Toaster was built for the Amiga mainly due to the Amiga's built-in genlock. A similar contemporaneous product for PCs existed, the Matrox Studio, but it was pricier (due to needing extra hardware) and not as cool.
Ah TIL https://retrocomputing.stackexchange.com/questions/22320/wha...
The Video Toaster used the Amiga's ability to sync to a video clock supplied by a card in the special video slot. But the Amiga itself didn't include a video input for an external source, so the Toaster hardware card included that external input. However, other Amiga video add-on cards did the same. The real magic of the Toaster was how heavily it relied on the Amiga's blitter and other custom chip-based graphics abilities, so much so that the Amiga's partial genlock ability wasn't even essential to the Toaster existing. In the 1980s, one of the most difficult and costly parts of a real-time digital video effects system able to remap live video inputs interactively was the address generator, which told the hardware where to remap the pixels 59.94 times a second. These addresses had to be generated at video speeds, which wouldn't have been possible at full broadcast video resolutions on any other mainstream computer of the time. For example, the Mindset computer was released the year before the Amiga and had a full genlock built in, but the Toaster could never have worked on it because the Mindset couldn't have generated the addresses fast enough.
If another non-genlockable computer could have generated the remapping addresses fast enough in those days, a Toaster-like device could have been designed to work with it. It just would have had to include a full separate genlock and the output would have been delayed a frame to allow that circuit time to work. So, it was possible, it just would have cost a bit more. How much more? Around that time, separate genlock cards capable of syncing up two disparate video sources sold for around $1,000.
On the other hand, an interactive, real-time address generator capable of keeping up with full video speeds would have cost tens of thousands of dollars. For example, the Quantel Mirage cost $700,000 in the 1980s and didn't even generate its addresses in real-time. The Mirage pre-rendered all the remapping addresses to a massive RAM disk using a Pascal program running on a workstation (which could take an hour or more) and only played them back on cue. The real-time part of the Mirage's hardware just digitized live video input and mapped it to the fixed stream of pre-generated addresses. Thus the Mirage could apply complex effects to live video but the effect geometry had to be canned, not interactive. The Ampex Digital Optics hardware did generate addresses in real-time entirely in logic-gate based hardware. Take a look at the ADO patent to gain an appreciation for just how insanely hard it was for 1980s era pre-GPU hardware to generate addresses at broadcast video rates. (https://patents.google.com/patent/US4472732/en) In fact, these esoteric real-time broadcast video effects systems were some of the first steps toward GPUs. According to Ken Kutaragi, Sony's $350,000 System-G 3D video effects hardware (which reportedly had worldwide sales of... three units) led to the Playstation GPU.
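To make the address-generator idea concrete, here's a toy sketch (mine, not from any of these machines): a digital video effect builds each output pixel by fetching from a computed source address, and the hard part in the 1980s was producing that address stream fresh for every field.

```python
# Toy model of DVE address remapping. The `addresses` stream plays the
# role of the address generator's output - pre-generated and played back
# (Mirage-style) or computed on the fly (ADO-style). The remap itself is
# the easy part; producing the addresses at broadcast rates was not.

def remap(frame, addresses):
    """Build an output frame by fetching each pixel from addresses[i]."""
    return [frame[a] for a in addresses]

scanline = [10, 20, 30, 40]        # one 4-pixel scanline of live video
mirror = [3, 2, 1, 0]              # canned address stream: horizontal flip
flipped = remap(scanline, mirror)  # -> [40, 30, 20, 10]
```

Generating `mirror` once is trivial; generating a new, effect-dependent stream of hundreds of thousands of addresses every 59.94th of a second was the expensive part.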
The Toaster effects could be interactive because the Toaster relied on the Amiga's graphics output as its address generator. In fact, the Amiga computer's monitor output was normally used to display the Toaster user interface, but when a video effect was running the interface was entirely replaced with odd patterns that looked like bar codes. These were the addresses being generated by the Amiga custom chips and fed from the Amiga to the Toaster hardware in real-time via the monitor output (technically the fastest 'bus' in the system). This was a remarkable hack which required the Toaster to be very tightly coupled with the Amiga's hardware design and also involved some other extremely clever trade-offs. Relying on the Amiga's graphics output as the real-time address generator enabled the Toaster to exist, and it was also why the Toaster could never be ported to another computer or even to the PAL video standard. The Amiga's genlockability just allowed the Toaster to be even less expensive.
> In fact, the Amiga computer's monitor output was used to display the Toaster user interface but when a video effect was running the interface was entirely replaced with odd patterns that looked like bar codes. These were the addresses being fed from the Amiga to the Toaster hardware in real-time.
No way, that’s why the wipes took over the switcher screen and where that peculiar vertical line pattern came from?
I can still remember these with an odd intensity from playing with the Toaster at the studio where my dad worked 30 years ago.
Yep, it was an absolutely bonkers combined hardware / software hack conceived by Tim Jenison and Brad Carvey on the hardware side and a team of low-level assembly language Amiga hackers on the software side, for which they won a Primetime Emmy Award for outstanding technical achievement.
I'll never forget seeing a group of Japanese engineers from Sony who designed high-end video effects hardware (which cost >20x what a Toaster system did) seeing the Toaster at the NAB trade show for the first time. There was nothing but stunned silence and shocked looks... then after a long while of watching the demo and seeing the interface turn into weird garbage during each effect, you could see the light dawn on their faces - and then the whole group erupted into excited, highly-animated chatter. I've never wished I spoke Japanese as much as that moment :-).
> Brad Carvey
Trivia: He's the brother of Dana Carvey, and Garth wears a Video Toaster T-Shirt in Wayne's World 2.
The feature only existed in the A2000, A3000 and A4000 because of the forethought of Jay Miner and team when building the original Amiga 1000. Many computers of the time could have been modified to use an externally sourced video clock (basically anything that used NTSC / PAL video), but Only Amiga Makes It Possible (tm) by including the signals on the DB23 video port of the A1000 and subsequent models at virtually no cost.
Even the CGA card of an IBM PC could have been modified in this way. I can imagine a hack where the flash attribute in text mode gets used to act as the overlay key, but it would have been pretty darned ugly.
Of course, the very thing that made genlock work so well on the Amiga (the use of bitplanes) was a factor in the ultimate demise of the Amiga in the 1990s. It's funny how the memory bandwidth tradeoff changes things.
Sadly, there's no built-in genlock, just the ability to be easily genlocked. A full built-in genlock and one input pin would have allowed the Amiga to easily read PCM audio stored on VHS with no external hardware: https://en.wikipedia.org/wiki/PCM_adaptor
Some additional context that isn't really covered in the readme - there are two components in the Amiga chipset that are especially relevant here, the copper and the blitter.

The copper (named because it's a coprocessor) is a separate execution core contained within Agnus, one of the custom chips. It only supports three instructions - move, wait, and skip. Skip will skip the next instruction if a condition is met, wait will block execution until the condition is met (in both these cases, the condition can only be the location of the video beam), and move can write an arbitrary value to a custom chip register. This means that the copper can't write to other pieces of hardware (like the CIA chips that control various things like system timers and the audio filter), but also that it can't write to RAM.

The copper is used fairly heavily in the demo scene, but in gaming it's probably most commonly used to reprogram the colour palette as the screen is being displayed, increasing the number of on-screen colours. The code the copper runs is in RAM so the copper can't directly modify it, but the copper can reprogram the registers telling it where its code is, so it's possible to use the skip instruction to conditionally bounce to different code depending on where you are in scanout.
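Those three instructions are simple enough to model in a few lines. Here's a hedged sketch (my own toy interpreter, not real hardware code - the beam model is simplified and the register file is just a dict, though COLOR00 is the real name of the background-colour register):

```python
# Toy interpreter for copper-style MOVE / WAIT / SKIP semantics.
# The only condition the real copper can test is the beam position;
# here the beam is modelled as a simple stream of scanline numbers.

def run_copper(program, beam_positions, registers):
    """Execute a copper-style list against a stream of beam positions.

    program: list of ("MOVE", reg, value) | ("WAIT", line) | ("SKIP", line)
    """
    beam = iter(beam_positions)
    line = next(beam)
    pc = 0
    while pc < len(program):
        op = program[pc]
        if op[0] == "MOVE":            # write a value to a chip register
            registers[op[1]] = op[2]
            pc += 1
        elif op[0] == "WAIT":          # block until the beam reaches a line
            while line < op[1]:
                line = next(beam)
            pc += 1
        elif op[0] == "SKIP":          # skip next instr if beam has passed line
            pc += 2 if line >= op[1] else 1
    return registers

# The classic trick: change the background colour halfway down a
# 200-line frame, doubling the on-screen colours for free.
regs = run_copper(
    [("MOVE", "COLOR00", 0x000),   # black at the top of the frame
     ("WAIT", 100),                # wait for scanline 100
     ("MOVE", "COLOR00", 0xF00)],  # red for the bottom half
    range(200), {})
# regs["COLOR00"] is 0xF00 once the frame completes
```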
The blitter is another coprocessor with a different set of limited features. Blitter as a generic term refers to a piece of hardware that can copy memory from one location to another without CPU involvement, but the Amiga blitter is more full featured than that. It also has the ability to render lines, fill areas of the screen, and apply shifts and masks to data rather than just copying it. Of course, when I say "render lines" or "fill areas of the screen", what I actually mean is that it can write specific patterns to regions of memory - there's no actual requirement that they be on screen at the time.
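The masking side can be illustrated with the classic "cookie cut" minterm, D = AB + /AC: source pixels show through where the mask is set, and the old destination shows everywhere else. This sketch is my own word-at-a-time model, not the blitter's actual register interface (no shifts or DMA here):

```python
# Cookie-cut blit on 16-bit words: a = mask, b = source, c = destination.
# Where a mask bit is 1 the source pixel wins; where it is 0 the
# destination shows through - the D = AB + /AC minterm.

def blit_cookie_cut(mask, src, dst):
    """Combine source into destination through a mask, per 16-bit word."""
    return [((a & b) | (~a & c)) & 0xFFFF for a, b, c in zip(mask, src, dst)]

# Stamp a patterned "sprite" word into a solid background:
words = blit_cookie_cut([0x0FF0], [0xAAAA], [0xFFFF])
# middle 8 bits come from the source pattern, the rest from the background
# -> [0xFAAF]
```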
A really important thing here is that while the copper can't write to memory itself, it can configure the blitter, and the blitter can then modify memory. One fun thing here is that there's absolutely nothing stopping you from using the blitter to modify the code that the copper is executing.
The naive implementation of a no-CPU demo would simply be to load all the assets into RAM and then have the copper reprogram the custom chips to display them and play audio. But combining the copper and the blitter gives a Turing-complete execution environment that ought to be able to do almost anything you could do with the CPU (the blitter can't touch hardware registers, so you're still limited to whatever registers the copper can access, and you can only access the RAM the custom chips have access to, not the larger range of fast RAM), just somewhat more slowly.
The copper was widely used in the OS not just to change the entire colour palette half-way down the screen - it could change the display resolution half-way down the screen, along with other things such as the colour depth and memory location of the bitmap data. This allowed the OS to support having multiple "screens" with different resolutions, colour depths, and palettes, and you could drag the front one down to reveal the one behind it.
It would also be used to support double-buffering for displaying smooth videos or other moving graphics - you'd have the two buffers, and the copper would change which one was being displayed while the scan was in the flyback period between frames.
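Since the display hardware finds the bitmap through pointer registers that the copper list itself writes, flipping buffers is just rewriting those MOVE values before the next frame starts. A hedged sketch (BPL1PTH/BPL1PTL are the real names of the bitplane-pointer high/low registers, but everything else here is a simplified model of mine):

```python
# Double-buffering via a copper-style list: the list contains MOVEs that
# load the bitplane pointer, so swapping buffers means patching the
# high and low halves of that pointer during the vertical blank.

def flip_copper_list(copper_list, front_addr):
    """Rewrite the bitplane-pointer MOVEs to display front_addr next frame."""
    for i, (op, reg, _val) in enumerate(copper_list):
        if op == "MOVE" and reg == "BPL1PTH":
            copper_list[i] = (op, reg, front_addr >> 16)     # high 16 bits
        elif op == "MOVE" and reg == "BPL1PTL":
            copper_list[i] = (op, reg, front_addr & 0xFFFF)  # low 16 bits

buffers = [0x20000, 0x30000]   # two frame buffers in chip RAM
clist = [("MOVE", "BPL1PTH", buffers[0] >> 16),
         ("MOVE", "BPL1PTL", buffers[0] & 0xFFFF)]

# during the flyback period, point the display at the back buffer:
flip_copper_list(clist, buffers[1])
```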
The demoscene has always been about hacking machines to make them do what they're not supposed to do.
I made a "Star Wars" scrolltext once, on a so-called "bazaar" screen with other effects... but it was on the ST, which was kind of challenging ;) I still have the floppy disk in a box, but I'm afraid it's been demagnetised like the others over time (~35 years ago).
Go for it, you might be surprised!

In the last 6 years or so, I decided to poke at my childhood Amiga 500 again when visiting family (which is where it still lives) and I wasn't sure how many of the errors I saw came from a dead disk drive or fading floppies.

Last week, enraged with disk/drive issues, I rushed a poor man's drive cleaning with 90 proof alcohol on the heads (99.9% IPA much preferred), and WD40/90 proof in the disk presence switch (again, proper contact cleaner and IPA much preferred).

Most disks behaved well after that - original software or user written. A few of my own disks had errors, and a couple sound like they need gluing back to the center spindle. These are all disks from 90-98 and I was able to move data back and forth through the serial cables I'd finally acquired for that purpose.
This machine now has more problems with flaky IC connections and wonky caps than disks. Given the visual glitches in some of my old DeluxePaint files viewed on the actual machine, I'm surprised the circuits didn't mess up any of the transfers, because I didn't get the glitches in the emulator afterwards..
Look into GreaseWeazle[0].
Data can probably still be read. Especially if the floppies are DD. Don't wait for it to become unreadable. Backup now rather than later.
0. https://github.com/keirf/greaseweazle/
Tremendous. Miss my Amiga and booting up the latest demo with my mates. It was a different time.
Interesting parallels with GPUs too.
I kinda wish each "era" of computing/video games lasted 3-5x longer than it did.. :')
I'd have loved to live through 10 years of the Commodore 64, 10 years of the Amiga, 10 years of the NES, 10 years of the SNES...
Seems we were the lucky generation. In a way we did. As they say, when you're 10, 1 year is 10% of your life and lasts forever. Now, years turned into months. Time _does_ pass slower when you turn off the intertubes though.
we'd still be on the pentium pro by now. but imagine all the Doom clones we could have!
That gave me a nightmare vision of Doom clones where you pay money to appear as a different sprite and it's inflicted with last man standing type game modes and all kinds of other bad things modern games do.
you pay money to appear as a different sprite with shiny particles around it. important distinction, of course.
and iddqd costs ten bucks to unlock, but it's part of a lootbox with all the other cheat codes in it
I'm fine with that
only if we got more Heretic/Hexenlikes too!
… Ya know, I think Doom may have actually been a parallel-universe/built-by-aliens type of fluke: It seriously accelerated gaming and the social perception of gaming, and in turn pushed computer technology adoption towards 3D cards (and everything else required to support them) much faster than it may have happened without Doom.
So I think if certain "killer apps" weren't released when they did, then maybe people might have been fine with tech chugging along at a more relaxed pace..
With current silicon heat-dissipation limits, we're in a 10-year cycle now (and growing).
But everything feels pretty much the same and has been "good enough" for a long while now, with little left to look forward to.. I mean just look at the Switch 1 vs Switch 2.
Back in "those days" you could literally count the extra colors you would get to see on the screen after each new generation!
The PS1 survived quite a while as well, to the point that some complained it was eating into PS2 sales numbers.
I think those all had close to or more than 10 years. SNES had the least but C64 was a retail product for 12 years.
Stuff would certainly be very well optimized near the end of each era.
Just look at the demos people are still making now for the C64, ZX Spectrum, and even the OG IBM PC! Multicolor 60 FPS on CGA!
can someone link some kind of documentation?
what are these not-CPU chips even capable of?
A great description was just posted as a comment here: https://news.ycombinator.com/item?id=45068268#45070258
https://archive.org/details/amiga-hardware-reference-manual-...
Updated version from 2025: https://aminet.net/package/docs/misc/rkrm-dos
Via https://news.ycombinator.com/item?id=45074183
https://en.wikipedia.org/wiki/Amiga_Chip_RAM
Alice, Lisa, Paula were some of the chips that made the Amiga the Amiga.