I miss when gaming in general was less mainstream and more weird like this. Now the silicon manufacturers hate that they even have to sell us their scraps, let alone spend time on making unique designs for their boxes.
I bought a small press book with a collection of this art and it was a fun little trip down memory lane, as I’ve owned some of the hardware (boxes) depicted in it.
For anyone else interested: https://lockbooks.net/pages/overclocked-launch
> I miss when gaming in general was less mainstream and more weird like this.
To me, this is a continuum with the box art of early games, where because the graphics themselves were pretty limited the box art had to be fabulous. Get someone like Roger Dean to paint a picture for the imagination. https://www.rogerdean.com/
The peak of this was the Maplin electronics catalogue: https://70s-sci-fi-art.ghost.io/1980s-maplin-catalogues/ ; the Radio Shack of UK electronics hobbyists, now gone entirely. Did they need to have cool art on the catalogue? No. Was it awesome? Yes.
This great interview with Roger Dean about his career in games was recently posted:
https://spillhistorie.no/2025/10/03/legends-of-the-games-ind...
Turns out that the Psygnosis developers in the 1980s used him as a kind of single-shot concept artist. They would commission the box art first, then use that as inspiration for designing the actual game to go inside the box.
Reminds me of the Anderson Bruford Wakeman & Howe 12” cover:
https://row.rarevinyl.com/products/anderson-bruford-wakeman-...
On the plus side, PC gaming hardware seems to last ages now. I built my gaming desktop in 2020, and when I recently looked at what a reasonable modern mid-tier setup is, they were still recommending a lot of the parts I have. So I'll probably keep using it all for another 5 years.
It's a double-edged sword. Yes, back then a 2-year-old computer was old, but at the same time, every 2 years a new generation of games came out the likes of which had never been seen before. Each generation was a massive step up.
Today, a layman couldn't chronologically sort the CoD games from the past 10 years by looks/play/feel, each new FIFA and the like is _the_ same game with new teams added to it, and virtually every game made is a "copycat with their own twist" with almost zero technical invention.
This is fine? Today’s games look beautiful and developers are hardly restricted by hardware. Games can innovate on content, stories, and experiences rather than on technology.
Feels similar to how painting hasn’t had any revolution in new paints available.
It's not a one-or-the-other. One wouldn't want content, stories, and experiences to stagnate just because graphics were improving, so why would the opposite be assumed?
AAA games suck compared to 10 years ago though.
I disagree, AAA games started nosediving with the seventh generation 20 years ago and only recently have they started to tentatively show signs of recovery.
Do you have any examples in mind from each era? I thought Fallout 3 was quite good back then. Today we've got stuff like Borderlands 4 (or whatever the newest one is) that barely runs on anyone's PC, and general game install size has also shot up drastically, so it's no longer really feasible to keep most of your games installed all the time and ready to play.
I mostly play indie/retro/slightly-old games these days, so I mostly hear of the negatives for modern AAA, admittedly. I'm also tempted to complain about live service, microtransactions, gacha, season passes, and so on in recent big releases, but maybe that would be getting off-topic.
> Today we've got stuff like Borderlands 4 (or whatever the newest one is) that barely run on anyone's PC
Just like Crysis did 18 years ago?
>it's no longer really feasible to keep most of your games installed all the time and ready to play.
Crysis was around 5% of a common HDD back then. The equivalent today would be around 80 GiB, which is just about what Elden Ring with the DLC takes.
Yes, thanks, I don't need "technical invention" in the form of more shaders to hide the ass quality of the gameplay. Mirror's Edge Catalyst still looks great despite being almost 10 years old and manages to bring a 2080 Ti to its knees at Full HD+.
Just this year I finally replaced my 3rd-gen i5 system. It was well over 10 years old and still just able to keep up with my workloads.
Now it's a node in my Proxmox cluster running transcodes for Jellyfin. The circle of life.
But your stuff from 2020 probably isn't AI "enhanced"!! Throw it in the garbage!
I was planning on hanging on to my Win10 PC, which was perfectly fine except that Microsoft were both pestering me to upgrade and telling me it wasn't possible, but the death of its SSD after 7 years put paid to that.
I have a PC that is probably at least 10 years old but works perfectly well for browsing, spreadsheets and the occasional lightweight game but MS has decided, in its infinite wisdom, to say that it can't be upgraded to Windows 11.
So I will probably install Linux (probably Debian) and move on and forget about those particular games... (~30 years since I first installed Linux on a PC!).
What sort of lightweight games? Nethack? Solitaire?
World of Tanks...
Sounds like a plus to me.
On the other hand, you're also stuck with design mistakes for ages.
The AM5 platform is quite lacking when it comes to PCIe lanes - especially once you take USB4 into account. I'm hoping my current setup from 2019 survives until AM6 - but it seems AMD wants to keep AM5 for another generation or two...
There's minimal demand for Thunderbolt/USB4 ports on Windows desktops. It won't ever make sense to inflate CPU pin count specifically for that use case, especially when the requisite PCIe lanes don't have to come directly from the CPU.
You'd be better off complaining about how Threadripper CPUs and motherboards are priced out of the enthusiast consumer market, than asking for the mainstream CPU platform to be made more expensive with the addition of IO that the vast majority of mainstream customers don't want to pay for.
One person's "design mistake" is another person's "market segmentation".
x16 GPU + x4 NVMe + x4 USB = 24 direct CPU lanes covers 99% of the market, with add-ons behind the shared chipset bandwidth. The other 1% of the market are pushed to buy Threadripper/Epyc.
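For a rough sense of that budget, here's a tiny Python sketch (using the lane counts from the comment above as assumptions, not figures from any datasheet) that tallies the direct CPU lanes and shows why a ~x4 USB4 controller would end up behind the chipset's shared uplink:

    # Toy tally of a mainstream desktop CPU's direct PCIe lanes.
    # The figures mirror the comment above (assumed, not from a spec sheet).
    DIRECT_LANES = 24
    consumers = {
        "GPU slot": 16,
        "primary NVMe": 4,
        "chipset/USB uplink": 4,
    }
    used = sum(consumers.values())
    print(f"used {used} of {DIRECT_LANES} direct lanes, {DIRECT_LANES - used} spare")
    # With 0 lanes spare, a discrete USB4 controller (~x4) can only hang off
    # the chipset, where its bandwidth is shared with everything else.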
Aren't they waiting for DDR6?
Woah, that book is cool; and so much more from this publisher!
LGR took a look at it on his channel; it's a very tiny book, with very tiny art, apparently all grabbed from Google Images. Something of a letdown.
Wow, I just checked and that's really underwhelming: tiny page size, lots of padding around the images, and yet there are often 4 images per page. The layout makes it seem like the size was a late decision; it would be appropriate for a large art book.
You ain't kidding! What a treasure trove of a publisher. Never heard of them before, great rec
> Now the silicon manufacturers hate that they even have to sell us their scraps, let alone spend time on making unique designs for their boxes.
I genuinely don't believe this to be true for AMD. I bought a 6600 XT on release day, and by the time I was able to build my complete PC, it had upstream Linux kernel support. You can say what you will about AMD, but any company that respects my freedoms enough to build a product with great Linux support and without requiring any privacy-invading proprietary software is a-ok in my book.
Fuck NVidia though.
I agree with the spirit of your post, and that AMD is probably the lesser evil, but it's worth noting they "just" moved the proprietary bits into a firmware blob. It's still similarly proprietary but due to the kernel being pretty open to such blobs, it's a problem invisible to most users. You'd have to use linux-libre to get a feel for how bad things really are. You can't really use any modern GPUs with it.
I understand the sentiment, but I don't see how devices with proprietary firmware stored in ROM or NVRAM are any more free or open than devices that require proprietary firmware loaded at boot.
And it looks like Linux-Libre also opposes CPU microcode updates[1], as if bugged factory microcode with known security vulnerabilities is any less non-free than fixed microcode. Recommending an alternative architecture that uses non-proprietary microcode I can understand; this I cannot.
[1] https://www.phoronix.com/news/GNU-Linux-Libre-4.16-Released
Sound cards too. The Hercules website still proudly shows all their boxes from back when sound cards were popular for gaming and more: https://support.hercules.com/en/cat-soundcards-en/
Several models don't even have pictures of the card, but every one of them shows the crazy box.
They also still list all their old GPUs. Compare the wild boxes at the top with the TV tuner boxes at the bottom: https://support.hercules.com/en/cat-videocards-en/
Just the mention of pieces of hardware we don't really need anymore (sound cards, modems, etc.) triggers a flood of nostalgia. I used to spend DAYS poring over PC part catalogues dreaming of my ideal rig. And brands like Hercules, Creative, and Matrox all trigger the same feelings.
Crazy contrast to me having spent the past weekend wondering whether cloud gaming services like GeForce Now have matured enough that I can fully move to a thin-client / fat-server setup for the little bit of gaming I still do.
The technology works, but the business model doesn't, so there's the eternal risk that it might get shut down at short notice with no way to export your saves.
Yeah, that's definitely a worry. Also, the fact that you're dependent on them for adding support for future games, and that (like any cloud service) it might not be available right when you want it.
Yeah that's the issue - nobody wants to just rent you a gaming PC in the cloud, they all want a cut of game sales/licensing. But if someone were to do it, the technology is absolutely there.
You don't even need to create any internal tech - Steam Remote Play already has everything you need, and I successfully used it to play Battlefield from an AWS GPU instance (it was even good enough for multiplayer).
Behold, the Delphi 1.0 installer screen: https://www.gladir.com/SOFTWARE/DELPHI1/delphi1-install5.png
Early to mid 90s Borland had a lot of fun making "multimedia" :-)
There was also this[0] video of a Turbo Pascal tutorial posted here recently, which has an interesting intro :-)
[0] https://www.youtube.com/watch?v=UOtonwG3DXM
Nvidia also had really cool demo programs you'd run with each generation to show off what your new card could do. [0]
You'll have to use the Internet Archive to see them all. [1] Several, like 'Dawn' for example, were quietly removed in 2020.
[0] https://www.nvidia.com/en-us/geforce/community/demos/
[1] http://web.archive.org/web/2019/https://www.nvidia.com/en-us...
You can watch them on youtube also: https://m.youtube.com/watch?v=DfWSJvKFMPE
That's nice, but they were interactive - You could move around the scene or change the camera angles. The fact that you could do this and prove it was realtime and not prerendered was part of the demo and most of the charm. Lacking that, it's just... lacking.
Most of that charm is gone after 20 years; nobody needs proof that the dynamic lights are really dynamic anymore.
They're still fun to interact with anyway, or just fun as a way to review what was hot shit at the time, but I couldn't get a few of the really old ones to run on Windows 10/11 this summer. A video is a lot better than saying "well, I'm not going to build an old PC just to play this demo" and not seeing it at all.
Besides the box art, I miss the days when 1) the graphics card didn't cost more than the rest of the components put together, 2) the graphics card got all of its damn power through the connector itself, and 3) MSRP meant something.
I just bought an RTX 5090 at MSRP. While expensive, it's also a radically more complicated product that plays a more important role in a modern computer than old GPUs did years ago.
Compared to my CPU (9950X3D), it's got a massive monolithic die measuring 750 mm² with over 4x the transistor count of the entire 9950X3D package. Beyond graphics, it's got tensor and RT cores, dedicated engines for video decode/encode, and 32 GB of GDDR7 on the board.
Even basic integrated GPUs these days have far surpassed GPUs like the GTX 970, so you can get a very cheap GPU that gets its power through the CPU socket, at MSRP.
Do yourself/me a favor, and give your 5090's power plug/socket a little jiggle test.
I'm a retired data center electrician, and my own GPU's plug has come "loose" more than once. Really make sure that sucker is jammed in there/latched.
> 3) MSRP meant something
I'm not in the market for a 5090 or similar, but the other day I was looking at a lower-end model, an AMD 9060 or Nvidia 5060. What shocked me was the massive variation in prices for the same model (9060 XT 16 GB or 5060 Ti 16 GB).
The AMD could be had for anywhere from 400 to 600 euros, depending on the brand. What can explain that? Are there actual performance differences? I see models pretending to be "overclocked", but in practice they barely have a few extra MHz. I'm not sure if that's going to do anything noticeable.
Since I'm considering the AMD more and it's cheaper, I didn't take that close a look at the Nvidia prices.
> What can explain that?
Looks. I'm not joking. The market is aimed at people with a fishbowl PC case who care about having a cooler with an appealing design, an interesting PCB colour and the flashiest RGB. Some may have a bit better cooling, but the price for that is also likely marked up several times, considering a full dual-tower CPU cooler costs $35.
I was thinking about cooling, but basically they all have either two or three fans, and among those they look the same to my admittedly untrained eye.
The manufacturer can use better fans that move more air and stay quieter. They can design a better vapor chamber, or lay out the PCB so the VRMs and RAM get more cooling. But all that stuff still shouldn't account for more than a $30-50 markup.
Hey, c'mon now - some of that is flooding the market so hard that it's ~8:1 Nvidia:AMD on store shelves, letting Nvidia be the default that consumers will pay for. That's without touching on marketing or the stock price (as under-informed consumers conflate it with performance, thinking "if it wasn't better, the stock would be closer to AMD's").
>What shocked me was the massive variation in prices for the same model [AMD vs. Nvidia]
I am not a tech wizard, but I think the major (and noticeable) difference would be the available tensor cores — currently Nvidia's tech is faster/better in the LLM/genAI world.
Obviously AMD jumped ~30% last week on the OpenAI investment — so that is changing with current-model GPUs.
They were talking about within one model, not between AMD and Nvidia.
> the graphics card didn't cost more than the rest of the components put together
In fairness, the graphics card has many times more processing power than the rest of the components. The CPU is just there to run some of the physics engine and stream textures from disk.
4) games came in a retail box accompanied by detailed manuals, booklets printed with back stories, and some swag.
4) scalpers only existed for sports and music venues
The existence of scalpers shows, rather, that the producer set the price of the product (in this case the GPU) too low [!] for the number of units produced.
Because the price is too low, more people want to buy a graphics card than there are graphics cards being made, so even people who would love to pay more can't get one.
Scalpers solve this mismatch by balancing the market: now people who really want a graphics card (with a given specification) and are willing to pay more can get one.
So, if you hate scalpers, complain that the graphics card producer did not raise its prices. :-)
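As a toy illustration of that argument (all numbers invented, purely to show the mechanism, not real market data), here's a sketch of how underpricing creates the shortage that resellers then clear:

    # Toy supply/demand sketch with made-up numbers; not real market data.
    SUPPLY = 10_000                      # cards produced this cycle (assumed)

    def demand(price):
        """Number of buyers willing to pay at least `price` (assumed linear curve)."""
        return max(0, 50_000 - 40 * price)

    MSRP = 600
    print("demand at MSRP:", demand(MSRP), "vs supply:", SUPPLY)  # shortage

    # The price at which demand just matches supply is roughly what scalpers
    # can charge; the gap above MSRP is their margin.
    clearing = next(p for p in range(MSRP, 5_000) if demand(p) <= SUPPLY)
    print("approx. market-clearing price:", clearing)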
I would guess part of the reason box art used to matter is that most of these cards were sold through dedicated electronics retailers like Fry's Electronics, Microcenter, and CompUSA. There was basically no such thing as online ordering for this sort of thing. People were physically browsing goods on shelves.
Just chiming in here, but at least two of the generations of cards there are from ~2005-2008 and we old farts definitely bought (or convinced our parents to buy) things from Newegg at the time!
100%. Used Newegg and Tigerdirect a bunch during that period. Shipping took forever.
Oh wow Tigerdirect. Forgot all about them.
I think it comes from a marketing exaggeration of what the card could do. None of the cards of the day could actually produce their own box art (in real time) but the art implies they could in a way they can get away with. It follows the tradition of box art on 8-bit games wildly exaggerating what the in-game graphics might look like and they'd sometimes post a tiny disclaimer in the corner.
Unusual designs are still a thing in some markets (mainly china) - for example, a cat themed cooler: https://www.youtube.com/watch?v=fGGKaX1D9Zo and various anime themed backplates on cards are available from Yeston: https://yestonstore.com/collections/graphics-card
Thanks, Japan is in the same boat.
From full cases [0], including the CPU cooler, to themed components [1] - when it comes to gaming, makers are going above and beyond to create cool visuals.
[0] https://www.dospara.co.jp/gamepc/kuzuha.html
[1] https://www.yodobashi.com/product/100000001009108157/
The ones in the article are boxes only; the actual cards were different from what was represented on the box. Anime-themed products, by contrast, are themed products in themselves. I'd argue these are two different phenomena.
Years ago, I picked up a low profile, single-slot GPU that worked well in Linux to throw in old machines when someone gives me one to mess with or recover. The best fit at the time was a Yeston AMD card, and in a world of cards that are all "Black with {{primary_color}}" the choice of blinding magenta made me smile.
That's fantastic. I recently bought a Lofree mechanical keyboard (they're a Chinese brand) and they definitely have the most unusual hardware designs I've ever seen.
Here's one of their mice: https://www.lofree.co/products/lofree-petal-mouse
It's nice to see, but the design feels like it's meant to go into a clear case so that it can be streamed for the world to see.
Speaking of unhinged gaming-related designs nowadays, I raise the Cooler Master Shark X PC case for your attention:
https://www.coolermaster.com/en-global/products/shark-x/
Please stop reminding me of how soulless and watered down everything has become :(
Games are no different, in Morrowind gods ripped each other's penises off and used them as spears; in Skyrim you fight dragons.
For sure, games have gotten bland and lame. But in an era of quirky games Morrowind was still extra quirky.
Well there is still an NPC that proclaims Kirkbride's drug binge fueled [affectionately] lore.
Besides, in Skyrim you eat souls of hitherto immortal beings in an act of metaphysical cannibalism, and, among other things, get to witness firsthand exactly what happens to those souls you trap to fuel your fancy sword of zapping.
Meanwhile, in the background, Vivec might or might not have been kidnapped to be on the receiving end of that spear thing, and fascist elves are trying their hardest to undo the universe (it's not plot pertinent though), and also briefly did (or claimed to do) something to the moons (that are not moons, remember) that terrified an entire race into submission.
The point is, the lore is still there. You just have to pay attention, because it's not always spelled out.
I'll never forget the big, dumb gorilla on the side of my mid-2000's 7900 GTX from a King Kong game tie-in product launch. Never played the game, but when I'm installing a GPU I sometimes think of that stupid decal and chuckle.
Searched and managed to find an image:
https://www.reddit.com/r/pcmasterrace/comments/144323l/great...
Crazy, outrageous graphics on a graphics accelerator box seems quite fitting. Of course these days they do far more than just render 3D graphics (and that which they do has become quite common), so perhaps that also reflects the shift away from this branding.
GPUs back in the day were only for gamers. Now they are for increasing your ai b2b saas arr.
The Voodoo range had some cool boxes: https://www.reddit.com/r/pcmasterrace/comments/yp5qzo/3dfx_v...
It wasn't just the art, but the boxes themselves. I remember buying this GPU which came in an X-shaped box: https://www.amazon.nl/-/en/First-NVidia-GeForce-128MB-Graphi...
I loved the weird boxes back in the 90s and 2000s. I remember dad would always take us to computer trade shows and ham events, and occasionally you'd see someone from ATi or Nvidia (or one of the integrators) demoing their wares with all sorts of bizarre and funny demo software and renders. I don't know if it was just me or what, but they always sent real nice sales or marketing people and it was fun to talk to them about the GPUs as a kid. I think they were as mystified (I recall several of them laughing about it) about the box art as everyone else was.
I had a Powercolor 9000 pro "Evil Commando". My friends and I thought it looked like a terrorist out of some old action movies. It kinda looked a bit like Universal Soldier.
Later on I bought a Sapphire 9800 Pro "Atlantis" which had some T-1000-esque figure on the box art.
After that, a lot of stuff became more corporate and boring.
These images were originally posted on Reddit; there are more examples in the original post:
https://old.reddit.com/r/pcmasterrace/comments/y7wcd7/gpu_bo...
This is a blast from the past! I remember being really young and buying a GPU based solely on what art was on the box (and yes, it was a scantily clad woman) and getting really, really lucky that it actually worked with my components - but it was my intro to upgrading PCs!
Blast from the past.
Can't believe this was a hobby for me and my dad during primary school, and that understanding how computers work has led me to my current full-time job that puts food on the table for my own children.
Ahhh, this reminded me of my Sapphire 3870 Toxic edition. Cool box art and one of the coldest-running cards I've owned, thanks to the Vapor-X chamber.
When people still bought Graphics Processing Units for processing graphics and not crypto mining or AI inferencing
Those box designers appear to have moved on to the performance whey protein and workout supplement industry.
Unhinged? That's how regular ads were; they could be for anything.
TFA calls it unhinged, I call it creative and exciting. Now all we get is rounded edges, solid colours, and "copies of reality" - boring; if I wanted reality I'd go outside and touch grass.
I think what happened is, at the time those were literally more or less examples of the best scenes the cards could render. Nowadays, putting together an example of the best scene the card could render requires a whole art department and a couple months of design. Nobody’s going to spend months on box art, so we get bland rectangles or whatever.
It's nothing that complicated. Nvidia started micromanaging their distributors, and removed all the fun, and AMD just copies what they do.
Or it was just a fad when the scene was novel and it ran its course as fads and design elements do. This explanation doesn't require there to be an enemy to demonize but sometimes there just isn't, as much as we might want there to be.
What "the best scene you could render" means is a bit fuzzy. In Blender you could render anything at all. But in a game, at what resolution and framerate? Are the shadows dynamic or baked in?
Look at the evolution of the DirectX branding through the years as well. OGs remember the logo themed after the radioactive hazard symbol.
Link because I had to look it up to remember: https://logos.fandom.com/wiki/DirectX
dxdiag.exe
> you could say they were unhinged
> GPU makers have all abandoned this practice, which is a shame as it provided something different through box art alone. Now, we're drowning in bland boxes and similar-looking graphics cards
I feel like there could be a more positive adjective than “unhinged” if you're going to turn around and praise it. OED sez “wildly irrational and out of touch with reality”. How about “whimsical”? I love this stuff and think we need to bring this kind of whimsy back to computing.
> There's a scantily dressed lady in armor
Author neglects to mention that ATi/AMD had a named ongoing marketing character for many many years — Ruby!
- Agent Ruby Demo Compilation https://www.youtube.com/watch?v=sUAuj0Jn8UI
- 2008 Ruby demo https://www.youtube.com/watch?v=0YjXCae4Gu0
- Ruby origin story https://web.archive.org/web/20071023192128/http://game.amd.c...
- ATI Agent Ruby™ Usage Guidelines 1.0 http://www.barbaraburch.com/portfolio/whitepaper6.pdf
- She even stuck around long enough for the ATi name to entirely disappear from AMD Radeon branding: https://i.imgur.com/uBWfzCA.jpeg https://www.youtube.com/watch?v=bwIMHX7rW8Q (2013)
- AMD-exclusive Ruby skin for Quake Champions https://www.youtube.com/watch?v=-LRSqC9n0Tc (2017)
> GeForce 6600 GT was enclosed inside a box featuring a lovely lady
Nvidia had several named demo characters too, but they removed all the pretty-lady ones some time in 2020. Compare:
- https://web.archive.org/web/20200921115422/https://www.nvidi...
- https://www.nvidia.com/en-us/geforce/community/demos/
Adam Sessler voice I give this article a two… out of five.
As usual, when money is to be found, the soulless bean-counting serious MBA types come along and kill all the fun. Not to mention all the money-chasing pretenders who can't code their way out of a paper bag.
> As usual, when money is to be found, the soulless bean-counting serious MBA types come along and kill all the fun.
A reminder: Even years after inventing CUDA, Nvidia, the top GPU manufacturer, was fighting for survival. I'm not sure what saved them - perhaps crypto.
If you ignored the money side, they appeared quite strong. But they struggled financially. Intel famously considered buying them around 2010 because they knew they could get them cheap (Nvidia might not survive and wasn't in a position to negotiate). Thankfully, the Intel CEO killed the idea because he knew Jensen wouldn't work well with Intel.
Nvidia may not have been saved by "bean counters", but they do have a place in the world.
I remember some of those.
oh god some of these just brought back memories long repressed
When you first got a 3D accelerator you entered a completely new world; the graphics and speed were on a different planet compared with what your computer could do without one.
I think that the boxes initially reflected that.
My first accelerator (rather late) was the 3D Blaster Voodoo2; the graphics on the box contributed to the emotion of holding it, and in person they looked even better than in the picture.
I was mind-blown when I saw what the card could do, and I remember thinking that the box graphics reflected its capabilities well.
I sure kept the box for many years.
I imagine that manufacturers then felt compelled to keep making boxes that would stand out; and in part, yes, they were trying to attract purchases from people who hadn't originally meant to get a new graphics card.
soul