Up to the early 2000s, people would go to the internet to have fun. Everything was new; it was the mass migration from the analog to the digital era.
In the 2020s, people are going offline to have fun.
Homelab is becoming a thing even for people who never had experience with computers; people are hosting their own documents, movies, music, and backups in case things go bad.
Even some companies have realised the price of going cloud, and some are moving back to on-prem hardware with full control.
Before we had facebook and iphones, the only people able to run a home lab were the technically adept. In 1998 I used AvantGo and Vindigo to browse the news on the train and find restaurants when nobody else could. In 2005 I remember running my Netgear MP101 UPnP player, and everybody was impressed that I could stream music. Then we made things like iphones and facebook which got everybody on the internet, and we made all the “hard” things like music, video, news, reservations, etc. a “service” – democratizing it (what a nice word). But for technical people it was actually shittier than just running it on your own. Not right away -- there was a small overlapping period from 2005 to 2014 or so, where the pace of advancement of technology was complementary to hosting it yourself, but after the corporate monopolies got fully involved everything just went to shit. I think it has come full circle again, where the “technically illiterate” will just consume the shitty services, and will be happy or oblivious to it – they are actually serfs giving their labor/money away and they don’t care. The rest of the technical folks are just going to do their own thing again because we’re sick of the crappy services. And it will be better than the general public can ever do, just like 1998 again.
> Homelab is becoming a thing even for people who never had experience with computer, ...
Oh totally. I got my brother, who lives on the other side of the world and who's not a dev/sysadmin, just a poweruser, to install Proxmox and he's now using GPU passthrough to have VMs run different AI models (in either Linux or Windows) for image generation, experimenting, etc. He's also got a NAS with RAID etc.
To me a homelab is the 2020's version of having fun with computing: there's something incredibly refreshing in disconnecting my sub-LAN from the Internet and still having music, movies, a private pastebin (yup, I use this at times between computers for simple stuff I don't want to bother scp'ing), private Git repositories, a complete backup system (including offline HDDs/SSDs that I rotate into a safe at the bank), etc.
A movie projector, a dumb one, is another very cool thing: connected to nothing but an HDMI cable (not that HDMI is the best standard ever, but it does the job).
And to be sure I can still code and work without having a nanny holding my hand as if I was a toddler, I regularly have coding sessions where I don't use Claude Code (but I also pay for a subscription: these things aren't mutually exclusive).
For anyone who wants to have fun, a used HP Workstation with ECC memory is basically $200 and makes a perfectly fine server at home. It doesn't need to be up 24/7 either: the only service of mine that is up 24/7 is my Unbound DNS resolver, and I run that one on a Raspberry Pi (for the low power consumption). The rest of my homelab (two Proxmox servers) is basically something I only need when I'm awake/at my desk. So I turn them off at night.
It's kind of funny that people are talking about "home labs" as a new thing because I've been running some form of servers on consumer PC hardware in my home since around 1998. For me this was an inseparable part of getting to know Linux and *BSD in that era.
I guess I'm just old though.
>I've been running some form of servers on consumer PC hardware in my home since around 1998
My excuse is that I never had the financial stability that I have now in my mid-30s to get things going; moving overseas and whatnot didn't help either.
But I didn't go crazy: I have 3 Proxmox servers running a few services, plus Pi-hole + Unbound as a recursive DNS resolver to avoid DNS poisoning and personal data tracking.
A DIY TrueNAS box as the primary system holding a copy of my data.
I have a 4K Blu-ray player and physical media, but I do have Jellyfin as well, because nothing matches 80s, 90s and early 2000s movies, and buying DVDs in 2026 is pointless. Also, it is either not easy or very, very expensive to find a Blu-ray copy of old movies in 2026. Jellyfin solves that.
All my servers consume 110W / 200VA tops, connected to a second-hand 1000VA APC UPS.
If the whole world goes to shit right now, I can still run all my stuff without depending on the internet.
My last goal is to have a solar/battery system, so if WW3 really happens and sends us back to the cave age, wherever I am will still be the 21st century.
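For anyone curious, the Unbound half of a Pi-hole + Unbound setup is mostly one small config file. A minimal sketch, assuming the resolver sits at 192.168.1.2 on a /24 LAN (the address and subnet are illustrative; 5335 is just the conventional port when Pi-hole forwards to a local Unbound):

```
server:
    interface: 192.168.1.2          # LAN address of the resolver (illustrative)
    port: 5335                      # conventional port for a Pi-hole upstream
    access-control: 192.168.1.0/24 allow
    do-ip4: yes
    do-udp: yes
    do-tcp: yes
    harden-glue: yes                # require glue records to be in-bailiwick
    harden-dnssec-stripped: yes     # reject answers with stripped DNSSEC data
    prefetch: yes                   # refresh popular cache entries before expiry
    cache-min-ttl: 300
```

Point Pi-hole's sole custom upstream at 192.168.1.2#5335 and every query is resolved recursively from the root servers instead of being handed to a third-party resolver.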
1988. On a math TA salary I paid $600 for an 80MB (That's megabytes) hard drive. I had dialup. I also had Turbo Pascal and an 8087 coprocessor. I was a MS student in computational math AKA numerical analysis.
It was goddam glorious.
Took until 1995ish to have a homelab to experiment with FreeBSD and later Linux over a 10-Base-T network with gcc/g++ and dialup access to this thing called "The World Wide Web". The browser had a throbber dinosaur.
It was even more goddam glorious.
Right now I've got three main systems with decent CPUs and 128GB of memory, and several ephemeral satellite systems. With 8GB of NVIDIA VRAM I'm running gemma4:31b just fine on my media system. Which, curiously enough, has, ah... media on it.
I feel like I have a good idea how EV owners feel right now. (We have a Prius.)
>I feel like I have a good idea how EV owners feel right now. (We have a Prius.)
The difference is that you don't own your EV; it is a computer on wheels. One firmware update, like Tesla has pushed in the past, and features are no longer available.
That is the total opposite of a homelab: you have full control, and you flash firmware that gives you full control over your devices.
I am a hard-core Linux user: my wireless access point runs Linux, my router is bare-metal Sophos hardware running OPNsense (which is FreeBSD-based). My 3D printer is DIY, running Debian Linux.
That is the best thing about a homelab: nobody can take it away from you. You own everything; it is yours and yours only.
"The difference is that you don't own your EV, it is a computer on wheel. Any firmware update like Tesla has done in the past and features are no longer available."
Yeah, I think that's right.
I only thought in one dimension: reliance on the corporate-controlled, high-density, entrenched fossil-fuel delivery infrastructure, which is worthless if prices occasionally turn exorbitantly volatile or supply even runs to zero.
Another, equally important dimension is that the EV might just be a puppet, and it's not you pulling the strings.
I'm pretty sure the Prius doesn't phone home (2015), but I admit that I've not gone deep into it.
I can't stand this thing I just did in this comment where I tried not to sound like an AI. I might have to give up short comments entirely because I can't generate enough context for authenticity credibility. <= It's a fact, and that right there sounds like AI to me now.
"Oh totally. I got my brother, who lives on the other side of the world and who's not a dev/sysadmin, just a poweruser, to install Proxmox and he's now using GPU passthrough to have VMs run different AI models (in either Linux or Windows) for image generation, experimenting, etc. He's also got a NAS with RAID etc."
Dude, this is way more than "power user"; you're being unserious.
If you tell a genuine power user, someone comfortable with Windows registry edits, Office macros, maybe some light PowerShell scripting, that they can "totally do what my brother did," and then the actual task list is Proxmox installation, IOMMU group isolation, VFIO stub drivers, GPU passthrough debugging, multi-OS VM management, subnetting, and RAID and HBA configuration, you're setting them up for a brutal wall of frustration.
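For a sense of scale: just the VFIO-stub part of GPU passthrough involves host configuration along these lines (a hedged sketch for a Debian/Proxmox-style host; the PCI IDs are placeholders you'd replace with your own GPU's, found via `lspci -nn`):

```
# /etc/default/grub -- turn the IOMMU on (Intel shown; AMD uses amd_iommu=on)
GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on iommu=pt"

# /etc/modules -- load the VFIO modules at boot
vfio
vfio_iommu_type1
vfio_pci

# /etc/modprobe.d/vfio.conf -- bind the GPU and its HDMI audio function to the
# vfio-pci stub driver (the IDs below are placeholders, not a real card)
options vfio-pci ids=10de:aaaa,10de:bbbb
```

After `update-grub` and a reboot, the card should show up bound to vfio-pci in `lspci -nnk`, at which point it can be handed to a VM. And that's before any of the IOMMU-grouping or RAID/HBA items on the list.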
I'm surprised people are advocating self-hosting as a viable solution. It takes a lot of knowledge to do sync and backup yourself, most of it implicit knowledge that people here don't realize we have and so for us it seems very easy.
There was a comment in another post on the front page about how anyone "remotely technical" can set up a docker container, and I think this is a good example because the mechanics of it are simple (edit a couple text files, run a couple commands), but half the world couldn't tell you what a terminal is and they're focused on other things in life instead of learning how computers work. Cloud succeeded because cloud is easy (at least in the beginning), it's that simple.
If we are to solve this problem, we're going to have to make self-hosting easy enough for the average 7-8 year old to do it without struggling. One promising way forward is with local-first E2EE sync and backup. The only good implementation I know of personally is Obsidian Sync, which has a UX that I adore, and hope to see more of in the future. There's other good options too, but none that I'd feel comfortable trusting a seven-year-old to execute correctly first try.
> If we are to solve this problem, we're going to have to make self-hosting easy enough
It used to be easy enough in the 90s, when plenty of folks had their own custom website. You signed up with a hosting company; they provided you with a bunch of different ways to upload files; your website was hosted.
Of late, I've decided that the problem is that HTML development halted at what is still a very beta product. HTMX is a reasonable attempt to continue the spirit of HTML; where I'm going with this is that I think HTML should have continued to add enough reasonable features, without needing Javascript or massive amounts of CSS, so that most websites most people wanted to develop would still be straightforward enough to do. HTML stopped before it even had a usable <table> with sorting and filtering defined, so we've spent decades with inferior tables in every web app that all suck compared to what we got used to with even Windows 3.1 apps. HTML finally grew date and colour pickers but it should have had all kinds of rich UI controls and behaviour that would have made it totally unnecessary for people to build all the Javascript middleware that essentially treats the browser as a display canvas and otherwise totally reimplements the GUI from scratch. And we wonder why the beautiful new Macbook Neo is kneecapped by only having 8GB????
It's time for HTML6. My standard will be: everything a restaurant website needs should be basically batteries included, with the exception of an e-commerce backend. It should all be doable in static HTML files with almost no Javascript and really just enough CSS to set artistic theming elements instead of having to do arcane shit just to position things.
I echo similar sentiments. It is high time to choose self-hosting over handing essentials to the cloud; you don't know when it could become inaccessible for a plethora of reasons. It is just that every time I looked into setting up a home lab, it felt prohibitively expensive.
>It is just that every time I looked into setting up a home lab, it felt prohibitively expensive.
You would say that if you looked at my 12U rack right now, but only 6 months ago all I had was 2x second-hand Dell SFF computers from eBay that might have cost me AUD 300.
Before that, I had one of those mini PCs with two network ports, which cost me AUD 200. I installed Proxmox on it, then OPNsense (router) and Pi-hole as virtual machines, and it ran like that for years.
Install Proxmox on them and you can run everything.
This is the major misconception regarding homelabs: you absolutely don't need expensive gear.
A single mini PC + Proxmox is all you need to start. Try to have at least 16GB of memory; a 256GB NVMe is more than enough to begin with.
Don't let those massive homelab setups you see on the internet tell you that's the only way :)
Well, that's your problem right there. Homelabber setups are for experimentation or "hot rodding" purposes, and they typically way overbuild their solutions.
What most people need is an old desktop in a corner somewhere (preferably close to your router so you can get to it with an ethernet cable).
It won't be grandma-proof, but if you're remotely technical you can write a Docker Compose file that glues together some useful home server utilities that sound interesting to you.
My setup is roughly speaking: Ubuntu LTS, ZFS (with 4 disks in a RAID10 style config), and a docker compose file that runs plex, transmission, syncthing, vaultwarden behind an nginx-proxy[1] container that even automagically renews my Let's Encrypt certs for me (though it's probably even easier if you use a Cloudflare tunnel).
If you're confident all your apps are available on these platforms, the storage part is easier with something like TrueNAS or Unraid. If you don't need storage at all you can slim down your hardware a lot and just use a Raspberry Pi.
IMO, just find an old beater machine and get hacking :)
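As a concrete (and deliberately trimmed) illustration of how little glue the setup above takes, a compose file in that spirit might look like this. The image names are the commonly published ones, but the hostnames and volume paths are placeholders, and a real setup would add the proxy's ACME companion and per-service config:

```yaml
services:
  proxy:
    image: nginxproxy/nginx-proxy       # reverse proxy; routes by VIRTUAL_HOST
    ports: ["80:80", "443:443"]
    volumes:
      - /var/run/docker.sock:/tmp/docker.sock:ro
  vaultwarden:
    image: vaultwarden/server
    environment:
      VIRTUAL_HOST: vault.example.home  # placeholder hostname
    volumes:
      - ./vaultwarden:/data
  syncthing:
    image: syncthing/syncthing
    volumes:
      - ./syncthing:/var/syncthing
```

`docker compose up -d` and the proxy wires itself to the other containers by watching the Docker socket; automatic Let's Encrypt renewal comes from running the nginxproxy/acme-companion container alongside it.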
I moved my DO server to a pi that was gathering dust. I agree, folks need to get off the cloud, find an old laptop or an old $40 mac mini, they are usually low power enough.
What makes it grandma-proof is software that makes it extremely simple, like a home appliance, and that is within the realm of possibility.
The simpler way to go on most fronts is some form of Proxmox with things like the above managed for you; it takes care of much of the overhead and doubt on its own, or through a reasonable point-and-click interface, which could be pre-configured.
We're teetering on the brink of highly capable software agents that can run on a phone using a local model and manage things like basic digital hygiene and operating a self-hosted cloud, with Tailscale and other private VPNs that can leverage your own home internet service, with a well-maintained set of firewall rules and a level of locked-down access that makes it actually practical.
An inspired nerd can do it right now, but grandma will be able to do a curated, accessible set of things by the end of the year, and by the end of 2027, the internet and self hosted things are going to be incredibly different. When people can self host plex and anonymously pirate anything, and their local model can do the ethically gray area stuff like ensure everything is done so they don't get caught - cloud services can't compete with that. Cable and netflix and spotify and the rest are going to have to up their game, and not do the stupid lashing out, price gouging, hunting the pirates type of thing or they're just going to burn down faster.
We're headed for some really cool, interesting times.
People overengineer homelabs all the time for fun and practice. To selfhost (which is not, in fact, the same thing), you can get a mini PC and probably host all of the basics. A small two-bay NAS plus a mini PC and you're really cooking with gas.
Homelab = Experimenting with environments you might use at work.
Selfhost = Hosting what you need at home.
These are two very different goals with very different reasonable choices. People homelab with Kubernetes clusters, selfhosting with Kubernetes is dumb.
The idea of offshoring computing is good. However, the cloud developed as a centralized computing platform instead of a distributed one. This has created power dynamics that harm customers. The same happened with social media, and has happened to other industries. I think it would be better for customers if there were many small cloud providers and they could easily switch between them. But even migrating from one cloud provider to another is a huge endeavor these days.
If you actually mean offshore as in located in a different country or especially a different continent, then that is a terrible idea for latency for many forms of computation. There are acceptable use cases, eg when round trips are infrequent and average latency is already high like batch workloads or some forms of LLM, but even then closer compute is pretty much always a better experience.
I've taken to buying the occasional CD and DVD. While I still use Spotify more than physical CDs, I still have my old CD collection, and the sound quality is so refreshing. And soundtracks aren't on Spotify. With movies it's hard to rewatch favourites because they don't stay on the streaming services. Again, it's much more satisfying to own them yourself.
Reliability, speed, and independence from internet connectivity make local-first more appealing. Honestly, I host my own webpage, file server, and some compute locally.
That's really odd. If I remember correctly Jason Scott has talked in his podcast about how textfiles.com is a co-lo'd self hosted box. I wonder what "resource limit" got hit.
> Insult, berate and make fun of any company that offers you something like a “sharing” site that makes you push stuff in that you can’t make copies out of or which you can’t export stuff out of. They will burble about technology issues. They are fucking lying. They might go off further about business models. They are fucking stupid.
Most cloud features are open source tools with special sauce sprinkled in. But at the same time these companies heavily fund said open-source projects, so I suppose it's not just pure community-based work.
It's an encrustified open source offering where the original vendors aren't compensated. Where there's lock-in, proprietary offering creep, and highway-robbery billing.
Counter-take: this was almost entirely wrong, and the author should be embarrassed looking back after 17 years.
I mean, it was 2009. How much of your personal data from then is still around on non-archival media you still control? Even among the geek set here, the answer is likely to be "almost none of it". At best it's "backed up" on media you haven't validated.
Or more likely, copied somewhere else to keep it secured. Like... Dropbox or Backblaze or S3 one of those, you guessed it, CLOUD services.
Likewise, do you still have your email from 2009 online in a useful form? Gmail users, many of them in this very thread, still do.
All of mine. Music, photos, copies of important documents, archived sets of email (and gmail) across different eras. My facebook archive export, IRC & IM logs stretching back to ~2000. A lot of it even on SSD, let alone HDD's, let alone "archival media". The spinning rust is mostly used for double- and triple-redundant copies of my music and photos, as well as the usual movie collection.
I'm not sure HN is the best place for such... technological anachronistic skepticism? A lot of us ARE going to be storing all that for shits and giggles.
Yeah I have all this data backed up on a couple different drives. IRC and ICQ logs going back to when I was a teenager. Digitised photos from when I was a kid through to the present day. Source code for projects I worked on from when I was 10. Rips of all the cds I used to own. And yes, email exports dating back to about 2003.
I wish I kept more, honestly. It’s a beautiful record.
I think my most treasured possession is videos of myself and my parents from when I was young. I’m thinking of sitting my sisters kids down in front of a camera for 15 minutes and getting them to talk about their life. It’s beautiful to rewatch this stuff decades later. It’s transporting.
> How much of your personal data from then is still around on non-archival media you still control? Even among the geek set here, the answer is likely to be "almost none of it".
As other commenters have stated, maybe this isn't the best place to ask.
I'm definitely in the "almost all of it" camp. I have Diablo II game saves on my desktop that are carried forward directly from my Windows 98 SE box circa 2002-2003. As well as Linux ISO's I acquired on Kazaa while still on dial-up internet.
I have all my music from 2009, shuffled from drive to drive. It out-survived my subscriptions to on-demand music streaming services (I do Pandora for discovery but don’t like the feeling of building an Amazon streaming “library” that will actually vanish when I stop paying).
I think the drive that held my old home directory might have died, though.
Uhhh, me? My home directory has 20-30 years of documents, photos, emails, the email address itself, instant-messaging logs, etc. Even a downloaded zip of every comment I ever made on Reddit. (But not HN, I should look into that.)
The primary exception would be Google Photos pictures which were auto-uploaded from my phone that I haven't curated and downloaded yet.
I predict I will maintain my custom-domain email address much longer than if I had used Gmail, given the attrition rate of bannings without support.
> on non-archival media you still control [...] Or more likely, copied somewhere else to keep it secured.
Hold up, is this OR or XOR? It sounds like you're trying to add unreasonable (dis-)qualifiers. TFA isn't saying one must boycott "the cloud" and erase all data, it just advocates that you retain an independent copy.
> Dropbox or Backblaze or S3 one of those, you guessed it, CLOUD services.
I think that's conflating different use-cases.
* Having a regular offsite backup into S3 isn't that different from when the data was rsync'ed to a Linux machine I paid for an account on. Any cloud-ness is a remote implementation detail, not a change in the consumer relationship.
* In contrast, "all my photos are in the cloud and my friends and family can collaborate on shared albums" is different, it permanently moves the locus of control.
I still have hour long techno/house mixes that I downloaded from some dude who was trying to get into DJing in 2008/did house shows or something, because we played on the same garry's mod server. They don't exist anywhere else on the internet as far as I could possibly find. Searching his dj name doesn't bring up anything.
A UK trance artist called Deathboy left directory traversal open on his website about 23 years ago. Since then I've had a lot of MP3s that he's never released or put on albums, which is sad because a lot of them are pretty great.
Similarly (also from ~2003), the (Australian) ABC's website held a lot of recorded breakfast radio show clips from when Adam & Wil hosted it, getting the awesome comedy band Tripod [0] to write songs in an hour. Many of these were released on their CD's, but nowhere near all of them.
Eventually that ABC server was shut down due to lack of government funds. There's a very good chance I'm the only one on the planet with these excellent songs & interviews from those shows.
> a used HP Workstation with ECC memory is basically $200 and makes a perfectly fine server at home. ...

You never go full cloud.
> I have a 4K bluray with physical media, but I do have Jellyfin also ... Jellyfin solves that.

How does Jellyfin solve finding Blu-ray copies of old movies? Unless you mean you just pirate them? Jellyfin isn't just for movie pirates.
"Oh totally. I got my brother, who lives on the other side of the world and who's not a dev/sysadmin, just a poweruser, to install Proxmox and he's now using GPU passthrough to have VMs run different AI models (in either Linux or Windows) for image generation, experimenting, etc. He's also got a NAS with RAID etc."
dude this is way more than "power user" you're being unserious.
If you tell a genuine power user, someone comfortable with Windows registry edits, Office macros, maybe some light PowerShell scripting, that they can "totally do what my brother did," and then the actual task list is Proxmox installation, IOMMU group isolation, VFIO stub drivers, GPU passthrough debugging, RAID configuration, and multi-OS VM management, subnetting, raid and HBA configuration, you're setting them up for a brutal wall of frustration.
I'm surprised people are advocating self-hosting as a viable solution. It takes a lot of knowledge to do sync and backup yourself, most of it implicit knowledge that people here don't realize we have and so for us it seems very easy.
There was a comment in another post on the front page about how anyone "remotely technical" can set up a docker container, and I think this is a good example because the mechanics of it are simple (edit a couple text files, run a couple commands), but half the world couldn't tell you what a terminal is and they're focused on other things in life instead of learning how computers work. Cloud succeeded because cloud is easy (at least in the beginning), it's that simple.
If we are to solve this problem, we're going to have to make self-hosting easy enough for the average 7-8 year old to do it without struggling. One promising way forward is with local-first E2EE sync and backup. The only good implementation I know of personally is Obsidian Sync, which has a UX that I adore, and hope to see more of in the future. There's other good options too, but none that I'd feel comfortable trusting a seven-year-old to execute correctly first try.
> If we are to solve this problem, we're going to have to make self-hosting easy enough
It used to be easy enough in the 90s, when plenty of folks had their own custom website. You signed up with a hosting company; they provided you with a bunch of different ways to upload files; your website was hosted.
Of late, I've decided that the problem is that HTML development halted at what is still a very beta product. HTMX is a reasonable attempt to continue the spirit of HTML. Where I'm going with this is that I think HTML should have kept adding enough reasonable features, without needing JavaScript or massive amounts of CSS, that most websites most people wanted to develop would still be straightforward to build. HTML stopped before it even had a usable <table> with sorting and filtering defined, so we've spent decades with inferior tables in every web app, all of which suck compared to what we were used to in even Windows 3.1 apps. HTML finally grew date and colour pickers, but it should have had all kinds of rich UI controls and behaviour that would have made it totally unnecessary for people to build all the JavaScript middleware that essentially treats the browser as a display canvas and otherwise reimplements the GUI from scratch. And we wonder why the beautiful new Macbook Neo is kneecapped by only having 8GB?
It's time for HTML6. My standard would be: everything a restaurant website needs should be basically batteries included, with the exception of an e-commerce backend. It should all be doable in static HTML files with almost no JavaScript and really just enough CSS to set artistic theming elements, instead of having to do arcane shit just to position things.
The irony of this is hilarious
> Resource Limit Is Reached
> The website is temporarily unable to service your request as it exceeded resource limit. Please try again later.
A copy for people who want to read the article :
https://archive.md/Q0DYu
So what? It will be back up.
I echo similar sentiments. It is high time to choose self-hosting over handing essentials to the cloud. You don't know when it could become inaccessible, for a plethora of reasons. It's just that every time I've looked into setting up a home lab, it has felt prohibitively expensive.
>It is just that, every time I looked into setting up a home lab, it feels cost prohibitively expensive.
You might say that if you looked at my 12U rack right now, but only 6 months ago all I had was two second-hand Dell SFF computers from eBay that might have cost me AUD300.
Before that, I had one of those mini PCs with two network ports that cost me AUD200. I installed Proxmox on it, then ran OPNsense (router) and Pi-hole as virtual machines, and it ran like that for years.
Install Proxmox on them and you can run everything.
This is the major misconception regarding homelabs: you absolutely don't need expensive gear. A single mini PC + Proxmox is all you need to start; try to have at least 16GB of memory, and a 256GB NVMe is more than enough to begin with.
Don't let those massive homelab setups you see on the internet tell you that's the only way :)
> home lab
Well that's your problem right there. The home labber setups are for experimentation or "hot rodding" purposes and they typically way overbuild their solutions.
What most people need is an old desktop in a corner somewhere (preferably close to your router so you can get to it with an ethernet cable).
It won't be grandma-proof, but if you're remotely technical you can write a docker compose file that glues together some useful home server utilities that sound interesting to you.
My setup is roughly speaking: Ubuntu LTS, ZFS (with 4 disks in a RAID10 style config), and a docker compose file that runs plex, transmission, syncthing, vaultwarden behind an nginx-proxy[1] container that even automagically renews my Let's Encrypt certs for me (though it's probably even easier if you use a Cloudflare tunnel).
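For anyone curious what that glue actually looks like, here's a trimmed sketch of such a compose file. The domains, paths, and email are placeholders, and this only shows one app behind the proxy; nginx-proxy's `VIRTUAL_HOST` routing convention and its acme-companion sidecar for Let's Encrypt renewal are real, but check their current docs before copying (the companion needs a couple more shared volumes for ACME challenges than shown here):

```yaml
# docker-compose.yml: minimal sketch, one self-hosted app behind nginx-proxy.
# Hostnames, volumes, and email are illustrative placeholders.
services:
  nginx-proxy:
    image: nginxproxy/nginx-proxy
    container_name: nginx-proxy
    ports: ["80:80", "443:443"]
    volumes:
      - /var/run/docker.sock:/tmp/docker.sock:ro  # watches for containers to route
      - certs:/etc/nginx/certs

  acme-companion:
    image: nginxproxy/acme-companion   # renews Let's Encrypt certs automatically
    environment:
      NGINX_PROXY_CONTAINER: nginx-proxy
      DEFAULT_EMAIL: you@example.com
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - certs:/etc/nginx/certs

  vaultwarden:
    image: vaultwarden/server
    environment:
      VIRTUAL_HOST: vault.example.com      # nginx-proxy routes this hostname here
      LETSENCRYPT_HOST: vault.example.com  # companion issues/renews the cert
    volumes:
      - ./vaultwarden:/data

volumes:
  certs:
```

Bring it up with `docker compose up -d`; adding plex, syncthing, etc. is mostly a matter of another stanza with its own `VIRTUAL_HOST`.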
If you're confident all your apps are available on those platforms, the storage part is easier with something like TrueNAS or Unraid. If you don't need storage at all, you can slim down your hardware a lot and just use a Raspberry Pi.
IMO, just find an old beater machine and get hacking :)
[1]: https://github.com/nginx-proxy/nginx-proxy
I moved my DO server to a pi that was gathering dust. I agree, folks need to get off the cloud, find an old laptop or an old $40 mac mini, they are usually low power enough.
What makes it grandma-proof is software that makes it extremely simple, like a home appliance, which is within the realm of possibility.
The simpler way to go on most fronts is some form of Proxmox with things like the above managed; it takes care of much of the overhead and doubt on its own, or through a reasonably point-and-click interface, which could be pre-configured.
We're teetering on the brink of highly capable software agents that can run on a phone using a local model and manage things like basic digital hygiene and operating a self-hosted cloud, with Tailscale and other private VPNs that can leverage your own home internet service, plus a well-maintained set of firewall rules and a level of locked-down access that makes it actually practical.
An inspired nerd can do it right now, but grandma will be able to do a curated, accessible set of things by the end of the year, and by the end of 2027 the internet and self-hosted things are going to be incredibly different. When people can self-host Plex and anonymously pirate anything, and their local model can handle the ethically gray-area stuff like making sure they don't get caught, cloud services can't compete with that. Cable and Netflix and Spotify and the rest are going to have to up their game, and not do the stupid lashing-out, price-gouging, hunting-the-pirates type of thing, or they're just going to burn down faster.
We're headed for some really cool, interesting times.
It’s like drugs, the first hit is cheap or free, but you end up spending all your money and your entire life on it!
Just get an old server for free somewhere and go …
People overengineer homelabs all the time for fun and practice. To selfhost (which is not, in fact, the same thing), you can get a mini PC and probably host all of the basics. A small two-bay NAS plus a mini PC and you're really cooking with gas.
Homelab = Experimenting with environments you might use at work. Selfhost = Hosting what you need at home.
These are two very different goals with very different reasonable choices. People homelab with Kubernetes clusters, selfhosting with Kubernetes is dumb.
As it appears to be hugged to death, archive link: https://archive.ph/qsdc3
Sometimes you fuck the cloud, sometimes the cloud fucks you
Maybe the old man is on to something.
Forgot to put a cache on it probably. :)
Nope
or https://web.archive.org/web/20260414220159/https://ascii.tex...
lol, as a VPN user, I get to read nothing. No offense to archive.org, I get it.
Ironically I've been opening up Tor for archive.org lately and it seems to never be on the same blocklist the VPN IPs are on.
the irony
Flared by the cloud.. sic
The idea of offshoring computing is good. However, the cloud developed as a centralized computing platform instead of a distributed one. This has created power dynamics that harm customers. The same happened with social media, and has happened to other industries. I think it would be better for customers if there were many small cloud providers and they could easily switch between them. But even migrating from one cloud provider to another is a huge endeavor these days.
If you actually mean offshore as in located in a different country or especially a different continent, then that is a terrible idea for latency for many forms of computation. There are acceptable use cases, eg when round trips are infrequent and average latency is already high like batch workloads or some forms of LLM, but even then closer compute is pretty much always a better experience.
I've taken to buying the occasional CD and DVD. While I still use spotify more than physical cds, I still have my old CD collection and the sound quality is so refreshing. And soundtracks aren't on Spotify. With movies it's hard to rewatch favourites because they don't stay on the streaming services. Again it's much more satisfying to own them yourself.
The reliability, speed, and internet connectivity make local-first more appealing. Honestly, I host my own webpage, file server, and some compute locally.
https://news.ycombinator.com/item?id=10771539
Thanks, Macroexpanded, and some others too...
Fuck the Cloud (2009) - https://news.ycombinator.com/item?id=10771539 - Dec 2015 (219 comments)
Fuck the cloud (2009) - https://news.ycombinator.com/item?id=2984083 - Sept 2011 (2 comments)
Fuck the Cloud - https://news.ycombinator.com/item?id=441885 - Jan 2009 (23 comments)
Resource Limit Is Reached - Hug of death
That's really odd. If I remember correctly Jason Scott has talked in his podcast about how textfiles.com is a co-lo'd self hosted box. I wonder what "resource limit" got hit.
It really isn’t odd at all.
The irony.
The sequel to Kiss the sky
Anyway, I love how well GDPR demonstrated this:
> Insult, berate and make fun of any company that offers you something like a “sharing” site that makes you push stuff in that you can’t make copies out of or which you can’t export stuff out of. They will burble about technology issues. They are fucking lying. They might go off further about business models. They are fucking stupid.
I think very soon we will read “Fuck AI”.
Neither would be fucked if they were open source.
Or if they had a ton of viable competition.
Is not most cloud tech based on open-source? Without Linux I feel like we would have seen cloud take off 20 years later than it did.
Most cloud features are open-source tools with special sauce sprinkled in. But at the same time, these companies heavily fund said open-source projects, so I suppose it's not just pure community-based work.
Please show me the open source AWS.
It's an encrustified open-source offering where the original vendors aren't compensated. Where there's lock-in, proprietary offering creep, and highway-robbery billing.
>Please show me the open source AWS.
I will not, because that is not what I asked.
>Is not most cloud tech *based* on open-source?
Counter-take: this was almost entirely wrong, and the author should be embarrassed looking back after 17 years.
I mean, it was 2009. How much of your personal data from then is still around on non-archival media you still control? Even among the geek set here, the answer is likely to be "almost none of it". At best it's "backed up" on media you haven't validated.
Or more likely, copied somewhere else to keep it secured. Like... Dropbox or Backblaze or S3 one of those, you guessed it, CLOUD services.
Likewise, do you still have your email from 2009 online in a useful form? Gmail users, many of them in this very thread, still do.
All of mine. Music, photos, copies of important documents, archived sets of email (and gmail) across different eras. My facebook archive export, IRC & IM logs stretching back to ~2000. A lot of it even on SSD, let alone HDD's, let alone "archival media". The spinning rust is mostly used for double- and triple-redundant copies of my music and photos, as well as the usual movie collection.
I'm not sure HN is the best place for such... technological anachronistic skepticism? A lot of us ARE going to be storing all that for shits and giggles.
Yeah I have all this data backed up on a couple different drives. IRC and ICQ logs going back to when I was a teenager. Digitised photos from when I was a kid through to the present day. Source code for projects I worked on from when I was 10. Rips of all the cds I used to own. And yes, email exports dating back to about 2003.
I wish I kept more, honestly. It’s a beautiful record.
I think my most treasured possession is videos of myself and my parents from when I was young. I’m thinking of sitting my sisters kids down in front of a camera for 15 minutes and getting them to talk about their life. It’s beautiful to rewatch this stuff decades later. It’s transporting.
> How much of your personal data from then is still around on non-archival media you still control? Even among the geek set here, the answer is likely to be "almost none of it".
As other commenters have stated, maybe this isn't the best place to ask.
I'm definitely in the "almost all of it" camp. I have Diablo II game saves on my desktop that are carried forward directly from my Windows 98 SE box circa 2002-2003. As well as Linux ISO's I acquired on Kazaa while still on dial-up internet.
I have all my music from 2009, shuffled from drive to drive. It out-survived my subscriptions to on-demand music streaming services (I do Pandora for discovery but don’t like the feeling of building an Amazon streaming “library” that will actually vanish when I stop paying).
I think the drive that held my old home directory might have died, though.
Uhhh, me? My home directory has 20-30 years of documents, photos, emails, the email address itself, instant-messaging logs, etc. Even a downloaded zip of every comment I ever made on Reddit. (But not HN, I should look into that.)
The primary exception would be Google Photos pictures which were auto-uploaded from my phone that I haven't curated and downloaded yet.
I predict I will maintain my custom-domain email address much longer than if I had used Gmail, given the attrition rate of bannings without support.
> on non-archival media you still control [...] Or more likely, copied somewhere else to keep it secured.
Hold up, is this OR or XOR? It sounds like you're trying to add unreasonable (dis-)qualifiers. TFA isn't saying one must boycott "the cloud" and erase all data, it just advocates that you retain an independent copy.
> Dropbox or Backblaze or S3 one of those, you guessed it, CLOUD services.
I think that's conflating different use-cases.
* Having a regular offsite backup into S3 isn't that different from when the data was rsync'ed to a Linux machine I paid for an account on. Any cloud-ness is a remote implementation detail, not a change in the consumer relationship.
* In contrast, "all my photos are in the cloud and my friends and family can collaborate on shared albums" is different, it permanently moves the locus of control.
I still have hour long techno/house mixes that I downloaded from some dude who was trying to get into DJing in 2008/did house shows or something, because we played on the same garry's mod server. They don't exist anywhere else on the internet as far as I could possibly find. Searching his dj name doesn't bring up anything.
A UK trance artist called Deathboy left directory traversal open on his website about 23 years ago. Since then I've had a lot of MP3s that he never released or put on albums, which is sad because a lot of them are pretty great.
Similarly (also from ~2003), the (Australian) ABC's website held a lot of recorded breakfast radio show clips from when Adam & Wil hosted it, getting the awesome comedy band Tripod [0] to write songs in an hour. Many of these were released on their CD's, but nowhere near all of them.
Eventually that ABC server was shut down due to lack of government funds. There's a very good chance I'm the only one on the planet with these excellent songs and interviews from those shows.
[0] https://en.wikipedia.org/wiki/Tripod_(band)