The early internet was like some settlers' shacks, built uncontrollably and unsystematically, whereas the modern web is all skyscrapers, residential and business: uniform apartments and uniform offices, all looking very similar and differing only in subtle interior-design details here and there.
Should we go back to the shack era? Of course not. But maybe we should start a new era of land exploration and start over. It shouldn't necessarily be Internet 3.0; it might be something else completely. AR/VR? Possibly, although that has already failed once.
The only thing missing from your analogy is the fact that the shacks were filled with personal diaries and curios, while the skyscrapers are mostly chock-full of homogeneous sewage slurry.
Also, the shacks weren't really particularly shabby or anything; they were more like well-enough-constructed single-family homes.
Old websites before scripting became popular were pretty much solid in that boring-tech way. Hardware and networks were not as reliable, but the sites themselves could be fine via simplicity.
Modern overdesigned sites are sort of like modern apartment buildings: shitty build quality under fake plastic marble and wood.
If you've visited old mining operations / shacks, that's pretty common! There are always some weird choices and cool things to see.
> Should we go back to the shack era? Of course not.
This isn’t obvious, at least, we can’t write the idea off with an “of course not.”
Keep in mind the early websites were mostly built by an enthusiast minority, technical or not, but willing to learn HTML and Netscape Composer. You can't expect all of humanity to be as enthusiastic. The skyscraper era, no matter how much we all hate it, makes the web more democratic: it gives everyone some standardized space (Facebook, YouTube, etc.) with algorithmized discovery, which is the parking and elevators if you want to continue the analogy.
Hard to live through what social media has done to society over the past decade without at least entertaining the idea that the higher barrier to entry of being online was maybe not a bad thing.
I don't disagree, but notice how it's about the second decade of Web 2.0, not the first one. Profit-driven algorithms are an era in their own right. I.e., you can't blame the skyscrapers themselves for your shitty life; you just need to demand more regulation.
If the skyscraper is designed with elevators that try to keep me in and away from the first floor so I don't leave I can definitely complain.
If we're talking about the internet before Eternal September, maybe, but putting up a site on Geocities or Tripod or using Dreamweaver certainly was not a high barrier to entry.
I wouldn't agree that the higher barrier to entry was a good thing, but I would also say that the barrier to entry was actually pretty low: Angelfire, GeoCities, etc., Dreamweaver and other WYSIWYG editors, and no need for a giant JS framework with bundling and tree-shaking.
The problem is that the barrier to entry got too low, so it was necessary for large companies to interpose themselves between producers and audiences, starting with Google (becoming something other than a grep for the web, and instead becoming the editor and main income source for the web) and expanding outwards into Facebook.
Remember that we started with walled gardens like AOL and CompuServe, and the web (and the end of those companies) was people desperate to break out of them. Now people have been herded in again since the indexers bought the ad companies.
Yes, for sure! It was a different time. Early website authors were pioneers. They had something worth sharing and they thought it worthwhile enough to learn some coding. Nobody was trying to push ads and monetize, and there was no ubiquitous tracking or cookies.
Facebook and YouTube are top-down managed systems, and I think it is a real disservice to the idea of democracy to call this sort of thing “more democratic.” They are democratic like a mall is, which is to say, not.
I think a better analogy is large corporate-built-and-owned buildings vs. small artisan businesses and single-family homes. Why not have both?
I like to compare today's web to radio in the late 1800s and early 1900s.
Back then, if you could piece together a transmitter and throw an antenna up, you were a broadcaster, and many broadcast whatever they felt like. Just like today's internet.
Social media is the CB radio of the 1970s and 80s when anyone could buy a small rig and do all kinds of weird and wild things for cheap.
But, eventually, something had to rein in all that, and the FCC, along with international laws and standards, came along to calm things down. In the same way, I think the internet will eventually become licensed and regulated.
The FCC licenses radio broadcasters because the spectrum is finite. Which finite aspects of the internet do you see driving such eventual practice?
The rationale behind the FCC is that it's regulating a limited resource (spectrum space). The web is not a limited resource (bandwidth is, but that's a different debate). The web is also international, and we're already seeing conflicts where one country tries to force its regulations onto another. That metaphor just doesn't work where the web is concerned.
I agree that the web in the US, and specifically large social media platforms, will probably be regulated because that seems to be one of the few things both parties agree on for their own reasons. But more so because the government wants to control information and surveil citizens. I think the balkanization of the web as a whole into smaller, closed networks is probably inevitable.
But what's most depressing of all is how many people in tech and on HN would be thrilled if one needed a license to publish on the internet just because that would implicitly push most people off of the web and leave it for a privileged elite.
As bad as social media can be (and I think its harm is often oversold for political ends) having a space where anyone can publish and communicate and create freely, where different platforms can exist and cater to different needs, where media isn't entirely controlled and gatekept by corporations, is critically important. More important than any other communications paradigm before it, including the printing press.
It's really going to be sad when we burn it all down, because it seems unlikely anyone is going to make something as free and open as the web ever again.
> But, eventually, something had to rein in all that, and the FCC, along with international laws and standards, came along to calm things down.
No, it actually stayed pretty lively until the 90s, when the government decided that there could be huge monopolies in media, all the stations were bought up by like 6 guys, and were automated to play Disney music 24 hours a day.
Not such a neat story, right?
> Should we go back to the shack era? Of course not.
I am not sure. Different people want different things. I run a Hetzner cloud instance where I toss up a simple webpage with locally hosted travel photos for friends and family, and a Jupyter server (with a very weak password) on the same instance for myself and any friend, for when we want something more powerful than a calculator.
And this messy, improperly organized, breaks-every-design-pattern approach works just fine for me. So I'm fine with a shack for personal communication and as a personal space. My 2c.
> AR/VR? Possibly, although that has already failed once.
I'm pretty sure it's already failed 3 times.
I think that Homestar Runner did a great short[1] about this for their 25th anniversary.
[1] https://homestarrunner.com/toons/backtoawebsite
> You’ve enabled HTTPS-Only Mode for enhanced security, and a HTTPS version of tttthis.com is not available.
This, too, is nostalgic, in a way.
It's silly to try and encrypt everything, and arguably it can be worse for the author's privacy.
Sometimes a blog post on a plain HTTP web site doesn't need to be encrypted.
I disagree. The primary threat model for unencrypted HTTP connections is a MITM attack: a middle box (a proxy or router) modifies the response payload to inject malicious content or alter the page. For an ordinary blog or personal website, an attacker who injects a script can gain compute, violate privacy, or acquire a (minor) DDoS source from the blog's users.
Another type of attack would modify the content of the site to suit the attacker's purpose - either to hurt the author and/or their message. Consider the damage an attacker could do by injecting CSAM onto a person's blog. The victim's life would be ruined long before the wheels of justice turned (if they turn at all). The one mitigating factor is that you'd need reliable control over a relatively stable middle box to execute this attack, but that's quite feasible. Last but not least, don't underestimate the way software grows. Sooner or later someone is going to implement HTTP basic authentication over plain HTTP and, needless to say, that's a bad idea.
Look, I don't like it either. I remember when you could telnet into a server and interact with it. That was good for pedagogy and building a mental model of the protocol. But we have to deal with how things are, not how we want them to be.
What’s the argument that it can be worse for the author’s privacy?
In general, I think we should encrypt everything. The more encrypted stuff floating around, the less it stands out, and the better for everybody’s privacy. Of course, nowadays encrypted content is quite common. But it didn’t become that way without effort!
Thanks to Let's Encrypt it's now at least possible to get a valid certificate anonymously, but it's a pain that requires renewal every 60 to 90 days and puts you at their mercy.
If they decide they don't like your brand of free speech it's lights out and they are the only game in town.
Yes, I know you can automate renewal if you have shell access, but you'll probably have to remember to do it manually if you use shared hosting that doesn't provide a cert for you.
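For reference, the automation mentioned above usually amounts to a single scheduled command; a minimal sketch, assuming certbot is the ACME client in use and is already configured for the site (run it with whatever privileges your setup requires):

    # crontab entry: attempt renewal twice a day; certbot only replaces
    # certificates that are actually close to expiry.
    0 3,15 * * * certbot renew --quiet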
That's a lot of work, and a lot of risk, to secure a message that's meant to be publicly broadcast in the first place.
I imagine it to be a bit like encrypting OTA television. Sure, you could stop a pirate broadcast from impersonating your station by encrypting it, but that's not actually a threat model that applies to normal people most of the time, and it makes everything far more complex.
Can your ISP MITM you? Yep, and if they do you should cancel your service then sue them into the ground.
If you don't encrypt everything, malicious actors (including some ISPs) can inject nasty things like pervasive tracking or zero-day exploits.
It helps against this kind of stuff (2015) https://blog.fox-it.com/2015/04/20/deep-dive-into-quantum-in...
Unfortunately this isn't 1999, and bad actors are everywhere. Even ISPs themselves (cough Comcast) have been injecting unsolicited new code into people's webpages for many years now.
It was kind of awesome that you could telnet to port 80, type "GET / HTTP/1.0", press return a couple of times, and receive a web page. Then shitty hotel wifi that injected ads happened, so we had to encrypt traffic that had absolutely no sensitive information.
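To make that exercise concrete, here is a minimal sketch of the same raw HTTP/1.0 exchange done from Python instead of telnet (example.com is just a placeholder host that still answers plain HTTP):

    import socket

    # Open a TCP connection to port 80 and send a bare HTTP/1.0 request,
    # exactly what you would have typed into telnet.
    with socket.create_connection(("example.com", 80)) as s:
        s.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:  # with HTTP/1.0 the server closes the connection when done
                break
            chunks.append(data)

    print(b"".join(chunks).decode("latin-1", "replace"))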
Our solution was technical, the problem was political. It should have been solved using criminal penalties for the criminals who did this.
This seems like it has at least partial overlap with the "small web": see https://kagi.com/smallweb
The last sentence can’t be true. If you go looking for them, they’re easy to find. The problem is you don’t.
Sort of. There are many confounding factors here. For one, they're harder to find because the number of personal websites doesn't scale as quickly as commercial content and SEO spam. It's also a bit of a vicious cycle, because if your website is less likely to be read by anyone, why bother writing it in the first place?
But for the most part, the very people bemoaning the current state of affairs then go back to scrolling through TikTok / Instagram / Facebook / Reddit.
Or, when you do go looking, it doesn’t feel the same. Why?
Because you were 25 years younger. When you are, say, 20 years old, people >= 20 years old are likely to be interesting, which was pretty much everyone with a web page. When you are 45 years old, writings from 20 year olds are much less interesting, on average.
https://wiby.me/
I remember. Stumbled upon
I think e.g. Mastodon with IndieWeb is a way to fight against the enshittification and to bring back the good from the early years. "The IndieWeb is a people-focused alternative to the “corporate web”. We are a community of independent and personal websites based on the principles of: owning your domain and using it as your primary online identity, publishing on your own site first (optionally elsewhere), and owning your content." https://indieweb.org/
I mean. There was a lot of absolute toss also?
I like the current state of webpages. I'm confident they will load quickly and render properly. I'm happy to read streamlined posts, and I find the bigger sites well organised and much easier to navigate than sites of old.
I have been compiling a website on neocities.org for about two months now, and it won't be complete for another year. It's basically a place for photos, maps, and descriptions of a local community xeriscape garden with about 400 specimens.
Others will take photos and videos of the place throughout the year, and post to social media, where they instantly get a couple dozen thumbs up, and gloat about it. That is not my intention. I want a coffee table book.
Does anyone remember geocities, tripod, webrings, and Amazon affiliate links?
Pepperidge farm remembers...
I am currently building on neocities.org, which is a modern take on the old GeoCities.
My website joelx.com has been available for 18 years now. Lots of articles.
This comes up every so often, and I always post something like this :)
It still exists with the Gemini protocol and Gopher:
https://www.linux-magazine.com/Issues/2021/245/The-Rise-of-t...
https://en.wikipedia.org/wiki/Gemini_(protocol)
https://en.wikipedia.org/wiki/Gopher_(protocol)
https://geminiprotocol.net/
https://wiki.sdf.org/doku.php?id=gemini_site_setup_and_hosti...
https://sdf.org/?tutorials/gopher
I have moved my site to Gemini (and Gopher); maintaining both is far easier than what I had to go through with the web/HTML.
I can't speak for Gemini, but when I found out about Gopher I read the specification and made a very simple server implementation for myself. If you're looking for a quick project to play with, it's not a bad one to try.
It didn't support everything, mostly just basic browsing and linking. But it was cool to build something mostly compliant with a spec that quickly.
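Something in that spirit really does fit in a few dozen lines. A minimal sketch in Python (the port, hostname, and sample content are arbitrary placeholders; real Gopher listens on port 70, and a full server would serve actual files):

    import socketserver

    HOST, PORT = "localhost", 7070  # real Gopher uses port 70, which needs root

    # Hypothetical toy content: one text "file" reachable from the root menu.
    PAGES = {"/hello.txt": "Hello from a toy Gopher server.\r\n"}

    class GopherHandler(socketserver.StreamRequestHandler):
        def handle(self):
            # A Gopher request is just a selector string terminated by CRLF.
            selector = self.rfile.readline().decode("ascii", "replace").strip()
            if selector in PAGES:
                # Item type 0 (plain text): send the body, then a lone "." line.
                self.wfile.write(PAGES[selector].encode("ascii") + b".\r\n")
            else:
                # Anything else (including the empty root selector) gets the menu.
                # Menu lines are <type><display> TAB <selector> TAB <host> TAB <port> CRLF.
                menu = (
                    f"iWelcome to a toy Gopher hole\tfake\t{HOST}\t{PORT}\r\n"
                    f"0A text file\t/hello.txt\t{HOST}\t{PORT}\r\n"
                    ".\r\n"
                )
                self.wfile.write(menu.encode("ascii"))

    if __name__ == "__main__":
        with socketserver.TCPServer((HOST, PORT), GopherHandler) as server:
            server.serve_forever()

Pointing a Gopher client at gopher://localhost:7070/ should then show the two-item menu, with the text file one selector away.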
Websites do still exist. People do know what they are. There are more of them on the web than there ever have been. Nothing is stopping anyone from creating a website if they want. Nothing is stopping you.
They will not be indexed by search engines, though, so you had better email all your friends so you'll have a few visitors.
Yes they will?
I run/host a bunch of personal websites for friends.
I do nothing special to get them indexed and they are all on search engines.
I've created a few sites and never explicitly told search engines about them, and they got picked up just fine surprisingly quickly.
https://www.spacejam.com/1996/
If this is true to the original, I am surprised to see this was table-oriented layout and not a bitmap image with clickable x,y coordinates.
It was common to make tables and use them to assemble a bitmap, where each cell had zero border/margin/padding and an exact size, and contained a "slice" of the image. Web authoring tools (and Photoshop) even had explicit support for generating this sort of thing, as I recall. This was I guess simpler to automate than defining clickable regions of a single image, and it allowed for the individual pieces of the image to be requested in parallel on slow connections (adding another dimension of progressive loading).
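For anyone who never saw it, the generated markup looked roughly like this; a minimal sketch with hypothetical slice filenames, using the zero-border, zero-spacing table described above:

    <!-- Each cell holds one exactly-sized slice of the original image. -->
    <table border="0" cellspacing="0" cellpadding="0">
      <tr>
        <td><img src="logo_r1_c1.gif" width="200" height="80" alt=""></td>
        <td><img src="logo_r1_c2.gif" width="200" height="80" alt=""></td>
      </tr>
      <tr>
        <!-- Wrapping a slice in a link gives you a "clickable region". -->
        <td><a href="about.html"><img src="logo_r2_c1.gif" width="200" height="80" border="0" alt="About"></a></td>
        <td><img src="logo_r2_c2.gif" width="200" height="80" alt=""></td>
      </tr>
    </table>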
Yeah, I remember this. Macromedia Fireworks had a slice tool that I used quite a bit. You'd basically make an image which was your website, and then do all the layout with zero-border tables. But for me, this was what I was doing circa 2004, before CSS was dominant. Earlier software from the '96 era like FrontPage I think would use whole bitmaps, but maybe I'm misremembering.
Ah, nothing like trying to save the logo from such a website, then discovering the image you saved is partially cut off and includes the navbar behind it instead of having a transparent background.
Table-oriented layouts were a thing back then too.
Looking at the code, it has definitely been modified from the original... there is now CSS as well as a Google ad tracker... but visually it's probably almost exactly the same.