> Traditional frameworks hydrate entire pages with JavaScript. Even if you've got a simple blog post with one interactive widget, the whole page gets the JavaScript treatment. Astro flips this on its head. Your pages are static HTML by default, and only the bits that need interactivity become JavaScript "islands."
Back in my day we called this "progressive enhancement" (or even just "web pages"), and it was basically the only way we built websites with a bit of dynamic behavior. Then SPAs were "invented", and the "progressive enhancement" movement became something fewer and fewer people practiced.
Now it seems this is called JavaScript islands, but it's actually just good ol' web pages :) What is old is new again.
Bit of history for the new webdevs: https://en.wikipedia.org/wiki/Progressive_enhancement
You're making the opposite mistake: you're seeing someone's description of a tool's feature and, because it kinda sounds similar, assuming it's just the way we've always done things, without even checking whether the tool is transformative.
Astro's main value prop is that it integrates with JS frameworks, lets them handle subtrees of the HTML, renders their initial state as a string, and then hydrates them on the client with preloaded data from the server.
TFA is trying to explain that value to someone who wants to use React/Svelte/Solid/Vue but only on a subset of their page while also preloading the data on the server.
It's not necessarily progressive enhancement because the HTML that loads before JS hydration doesn't need to work at all. It just matches the initial state of the JS once it hydrates. e.g. The <form> probably doesn't work at all without the JS that takes it over.
These are the kind of details you miss when you're chomping at the bit to be cynical instead of curious.
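To make that concrete, here is roughly what it looks like in an Astro page; the LikeButton component and the API URL are invented for illustration, but the frontmatter fence and the client:load directive are the actual mechanism:

```astro
---
// src/pages/index.astro -- this part runs only on the server / at build time
import LikeButton from '../components/LikeButton.jsx'; // hypothetical React component
const posts = await fetch('https://example.com/api/posts').then((r) => r.json());
---
<html>
  <body>
    <h1>My blog</h1>
    <ul>
      {posts.map((post) => <li>{post.title}</li>)}
    </ul>
    <!-- Only this island ships JS to the browser; the list above stays plain HTML -->
    <LikeButton client:load postIds={posts.map((p) => p.id)} />
  </body>
</html>
```

Until that one component hydrates, its server-rendered markup is just the initial state, which is the point being made about the <form> above.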
What is the value in first sending dysfunctional HTML and then fixing it with later executed JS? If you do that, you might as well do 100% JS. Probably would simplify things in the framework.
Sending functional HTML, and then only doing dynamic things dynamically, that's where the value is for web _apps_. So if what you point out is the value proposition for Astro, then I am not getting it, and don't see its value.
Dysfunctional HTML in the sense that interactivity is disabled, but visually it is rendered and therefore you can see the proper webpage immediately.
Just compare the two cases, assuming 100ms for the initial HTML loading and 200ms for JS loading and processing.
With full JS, you don't see anything for 300ms; the form does not exist (300ms is a noticeable delay for the human eye).
With frameworks such as Astro, after 100ms you already see the form. By the time you move the mouse and/or start interacting with it, the JS will probably be ready (because 200ms is almost instant in the context of starting an interaction).
This is not new at all; old-school server-side processing always did this. The advantage is writing the component only once, in one framework (React/Vue/whatever). The server-client passage is transparent for the developer, and that wasn't the case at all in old-school frameworks.
Note that I'm not saying this is good! But this is the value proposition of Astro and similar frameworks: transparent server-client context switching, with better performance perceived by the user.
I usually write my HTML and CSS in a way that lets me write it once and then reuse it elsewhere, too. I do that by employing traditional template engines like Jinja2 (in Python) or using SXML in Lisps.
Then you don't have any complex interactivity, and this solution doesn't apply to your use cases.
>e.g. The <form> probably doesn't work at all without the JS that takes it over.
What a value!
I guess I may be chomping at the bit to be cynical, but I have quite a bit of experience in these fields, and I don't think Astro sounds especially transformative.
I didn't read their comment as particularly cynical, and at a high level they're still correct, but so are you.
I think your comment gets at a very specific and subtle nuance that is worth mentioning, namely that typically, if you were a progressive-enhancement purist, you'd have a fallback that did work: a form that submitted normally, a raw table that could be turned into an interactive graph, etc.
I don't think these details are mutually exclusive though, and it was certainly valid in those days to add something that didn't have a non-JS default rendering mode; it was just discouraged from being in the critical path. Early fancy "engineered" webapps like Flipboard got roasted for poorly re-implementing their core text content on top of canvas so they could reach 60fps scrolling, but if JS didn't work their content wasn't there, and they threw out a bunch of accessibility stuff that you'd otherwise get for free.
Now that I'm thinking back, it's hard to recall situations *at that time* where there would be both something you couldn't do without JavaScript and that couldn't also have a more raw/primitive version, but one example that comes to mind and would still be current are long-form explanations of concepts that can be visualized, such as those that make HN front page periodically. You would not tightly couple the raw text content to the visualization, you would enhance the content with an interactive visual, but not necessarily have fallback for it, and this would still be progressive enhancement.
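To make the "fallback that did work" concrete, here is a minimal sketch of that older pattern: a form that posts normally with JS disabled, plus a few lines of script that upgrade it to an in-page submit when JS is available (the /subscribe endpoint is made up):

```html
<!-- Works without JS: a normal POST and a full page reload -->
<form id="subscribe" action="/subscribe" method="post">
  <input type="email" name="email" required />
  <button type="submit">Subscribe</button>
</form>

<script>
  // Progressive enhancement: intercept the submit and stay on the page
  const form = document.getElementById('subscribe');
  form.addEventListener('submit', async (event) => {
    event.preventDefault();
    await fetch(form.action, { method: 'POST', body: new FormData(form) });
    form.replaceChildren(document.createTextNode('Thanks!'));
  });
</script>
```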
Here's another good example from that time, which is actually only somewhat forward compatible (doesn't render on my Android), but the explanation still renders: https://www.ibiblio.org/e-notes/webgl/gpu/fluid.htm
I'm sorry to go off-topic, but since when did hydrating web pages with JavaScript become a thing? First time I've heard the term.
Edit: according to WP history, around December 2020
https://en.wikipedia.org/w/index.php?title=Hydration_(web_de...
Second edit: "Streaming server rendering", "Progressive rehydration", "Partial rehydration", "Trisomorphic rendering"... Seems I woke up in a different universe today.
It's definitely older. I remember hearing about it around a job I had in 2017-2020, but it was already an established term by that time.
Quote:
2013-2015: The concept of SSR in JavaScript frameworks emerged with React (released in 2013). Initially, it was referred to as "reconciliation" or "bootstrapping" — React would match the DOM with the virtual DOM.
2015: Around the release of React 0.14 and React 15 (2016), the term "hydration" began appearing in the React ecosystem to describe the process of attaching event listeners to server-rendered markup.
The first known mention in React documentation and community discussions around 2015-2016 clarified that React would "hydrate" the HTML.
This distinguished it from full re-rendering, which would discard the server-rendered DOM.
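For reference, this is what that distinction looks like in today's React API: hydration reuses the server-rendered DOM and just attaches listeners, while a plain client render rebuilds it from scratch:

```tsx
import { hydrateRoot, createRoot } from 'react-dom/client';
import App from './App'; // hypothetical root component

const rootEl = document.getElementById('root')!;

// The server already sent the HTML for <App /> inside #root:
// hydrateRoot attaches event listeners to that existing markup.
hydrateRoot(rootEl, <App />);

// The non-hydrating alternative ignores any server markup and
// renders from scratch on the client:
// createRoot(rootEl).render(<App />);
```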
The concept is not new in any sense (duh). I mean: send the HTML first, send the important CSS first so the browser can show something - hopefully with the correct dimensions, to avoid reflows (and a flash of white and so on) - and JS interaction "always" got enabled on some kind of trigger (onload, DOMContentLoaded, or simply the script tag being added at the end of the body, which as far as I know meant all of the event handlers registered by the time the DOM tree was there).
Then chunking was the next step, and basically the logical endpoint is this mix-and-match strategy that NextJS is "leading" (?), allowing things to be streamed in while sending and caching as much of the static parts up front as possible.
Oh no, go read about AJAX. You may be too young; there's no revolution here. I celebrate your curiosity, but it's still a wheel. Changing the name doesn't make it any less of a reinvention.
> you're chomping at the bit to be cynical instead of curious.
Nicely sums up a lot of interactions these days.
Agreed! Astro is fantastic, but the biggest barrier I had in learning it was getting my head around the range of terms that developers who entered the workplace after 2010 have invented to describe "how the web works".
I really appreciate that innovation can sometimes reinvent things that already exist, out of pure ignorance and sometimes hubris. I've seen people turn molehills into mountains that take on a lore of their own, and then along comes someone new who doesn't know better, or is a little too sure of themselves, and suddenly the problem is solved with something seemingly obvious.
I'm not sure JavaScript islands are that, but I appreciate a new approach to an old pattern.
This field (software) in general, and especially web stuff, has no memory. It’s a cascade of teens and twentysomethings rediscovering the same paradigms over and over again with different names.
I think we are overdue for a rediscovery of object oriented programming and OOP design patterns but it will be called something else. We just got through an era of rediscovering the mainframe and calling it “cloud native.”
Every now and then you do get a new thing like LLMs and diffusion models.
This. So much this. I'm not even 42 and I feel old when I read stuff like this. Like you say, there's no memory in the web software field.
Wasted effort reinventing things and rediscovering their limits and failure modes is a major downside of the ageism that is rampant in the industry. There is nobody around to say “actually this has been tried three times by three previous generations of devs and this is what they learned.”
Another one: WASM is a good VM spec and the VM implementations are good, but the ecosystem with its DSLs and component model is getting an over engineered complexity smell. Remind anyone of… say… J2EE? Good VM, good foundation, massive excess complexity higher up the stack.
Even if you are 10 years younger than me and you've been doing webdev since your late teens, you've seen like three spins of the concept wheel come back around again and again.
Htmx and, even more so, Datastar have brought us back on track. Hypermedia + AJAX with whatever backend language(s)/framework(s) you want. Whereas Astro is Astro.
I agree that it's not a new concept by itself, but the way it's being done is much more elegant, in my opinion.
I originally started as a web developer during the time when PHP+jQuery was the state of the art for interactive webpages, shortly before React and SPAs became a thing.
Looking back at it now, architecturally, the original approach was nicer; however, DX used to be horrible at the time. Remember trying to debug with PHP on the frontend? I wouldn’t want to go back to that. SPAs have their place, most of all in customer dashboards or heavily interactive applications, but Astro finds a very nice balance: having your server and client code in one codebase, being able to define which is which, and not having to parse your data from whatever PHP is doing into your JavaScript code is a huge DX improvement.
> Remember trying to debug with PHP on the frontend? I wouldn’t want to go back to that.
I do remember that, all too well. Countless hours spent with templates in Symfony, or dealing with Zend Framework and all that jazz...
But as far as I remember, the issue was debuggability and testing of the templates themselves, which was easily managed by moving functionality out of the templates (lots of people put lots of logic into templates...) and then putting that behavior under unit tests. Basically being better at enforcing proper MVC split was the way to solve that.
The DX wasn't horrible outside of that, even early 2010s which was when I was dealing with that for the first time.
The main difference is as simple as modern web pages having on average far more interactivity.
More and more logic moved to the client (and JS) to handle the additional interactivity, creating new frameworks to solve the increasing problems.
At some point, the bottleneck became the context switching and data passing between the server and the client.
SPAs and tools like Astro propose themselves as a way to improve DX in this context, either by creating complete separation between the two worlds (SPAs) or by making it transparent (Astro).
Well, that's a way to manage server-side logic, but your progressively-enhanced client-side logic (i.e. JS) still wasn't necessarily easy to debug, let alone write unit tests for.
> but your progressively-enhanced client-side logic (i.e. JS) still wasn't necessarily easy to debug, let alone write unit tests for
True, don't remember doing much unit testing of JavaScript at that point. Even with BackboneJS and RequireJS, manual testing was pretty much the approach, trying to make it easy to repeat things with temporary dev UIs and such, that were commented out before deploying (FTPing). Probably not until AngularJS v1 came around, with Miško spreading the gospel of testing for frontend applications, together with Karma and PhantomJS, did it feel like the ecosystem started to pick up on testing.
There wasn't as much JS to test. I built a progressively-enhanced SQLite GUI not too long ago to refresh my memory on the methodology, and I wound up with 50-ish lines of JS not counting Turbo. Fifty. It was a simple app, but it had the same style of partial DOM updates and feel that you would see from a SPA when doing form submissions and navigation.
> Astro find a very nice balance of having your server and client code in one codebase, being able to define which is which and not having to parse your data from whatever PHP is doing into your JavaScript code is a huge DX improvement.
the point is pretty much that you can do more JS for rich client-side interactions in a much more elegant way without throwing away the benefits of "back in the days" where that's not needed.
Modern PHP development with Laravel is wildly effective and efficient.
Facebook brought React forth with influences from PHP's immediate context switching, and Laravel's Blade templates have brought a lot of React and Vue influences back to templating in a very useful way.
> because it implies something better would have replaced it
Hah, if only... Time and time again the ecosystem moves not to something that is better, but "same but different" or also commonly "kind of same but worse".
Back in your day, there wasn’t a developer experience by which you could build a website or web app (or both) as a single, cohesive unit, covering both front-end and back-end, while avoiding the hydration performance hit. Now we have Astro, Next.js with RSC, and probably at least a dozen more strong contenders.
That is a perfect description of it, released 1996. So far ahead of its time it’s not even funny. Still one of the best programming environments I’ve ever used almost <checks calendar> 30 years later.
The syntax is horrible, and seeing it today almost gives me the yuckies, but seems like the same idea to me, just different syntax more or less. I'm not saying it was better back then, just seems like very similar ideas (which isn't necessarily bad either)
I find that syntax more palatable than what is going on in JSX to be honest. I actually like having a separate declarative syntax for describing the tree structure of a page. Some languages manage to actually seamlessly incorporate this, for example SXML libraries in Schemes, but I have not seen it being done in a mainstream language yet.
Yeah, not a huge fan of either of them to be honest, but I'd probably put JSX slightly before Twig, only because it's easier for someone who's written lots of HTML.
Best of them all has to be hiccup, I think: the smallest and most elegant way of describing HTML. Same template but as a Clojure function returning hiccup:
And as you say, part of the language itself, which means no need to learn something different, and no need to learn a pseudo-HTML or lookalike like with JSX, which then needs to be actually parsed by the framework (or its dependencies), unlike SXML, which is already structured data, already understood perfectly in the same language and only needs to be rendered into HTML.
> there wasn’t a developer experience by which you could build a website or web app (or both) as a single, cohesive unit, covering both front-end and back-end
How much of the frontend and how much of the backend are we talking about? Contemporary JavaScript frameworks only cover a narrow band of the problem, and still require you to bootstrap the rest of the infrastructure on either side of the stack to have something substantial (e.g., more than just a personal blog with all of the content living inside of the repo as .md files).
> while avoiding the hydration performance hit
How are we solving that today with Islands or RSCs?
It can cover as much of the back-end that your front-end uses directly as you’d care to deploy as one service. Obviously if you’re going to have a microservices architecture, you’re not going to put all those services in the one Next.js app. But you can certainly build a hell of a lot more in a monolith than a personal blog serving a handful of .md files.
In terms of the front-end, there’s really no limit imposed by Next.js and it’s not limited to a narrow band of the problem (whatever that gibberish means), so I don’t know what you’re even talking about.
> How are we solving that today with Islands or RSCs?
Next.js/RSC solves it by loading JavaScript only for the parts of the page that are dynamic. The static parts of the page are never client-side rendered, whereas before RSC they were.
There are a lot of moving parts when building an application, and the abstractions that Next.js provides don't cover the same ground as frameworks like Ruby on Rails, Django and Laravel. The narrow band of problems Next.js solves for you are things like data fetching, routing, bundling assets and rendering interactive UI. It leaves things like auth, interacting with a database, logging, mailing, crons, queues, etc. up to you to build yourself or integrate with 3rd party services. When you work with one of those frameworks, they pretty much solve all of those problems for you and you don't have to leave the framework often to get things done.
This is fine generally because you have the choice to pick the right tool for the job, but in the context of "a single, cohesive unit" you only get that with Next.js if all that you care about are those specific abstractions and you want your backend and frontend to be in the same language. Even then you run into this awkwardness where you have to really think about where your JavaScript is running because it all looks the same. This might be a personal shortcoming, but that definitely broke the illusion of cohesion for me.
> The static parts of the page are never client-side rendered, whereas before RSC they were.
Didn't the hydration performance issues start when we started doing the contemporary SSR method of isomorphic JavaScript? I think islands are great and a huge improvement on how we started doing SSR with things like the Next.js Pages Router. But that's not truly revolutionary industry-wide, because we've been able to do progressive enhancement since long before contemporary frameworks caught up. The thing I'm clarifying here is that "before RSC" only refers to what was once possible with frameworks like Next.js, not what was possible in general; you could always template some HTML on the server and progressively enhance it with JavaScript.
Why do you think that? You could absolutely "build a website covering FE and BE without the hydration performance hit" using standard practices of the time.
You'd render templates in Jade/Handlebars/EJS, break them down into page components, apply progressive enhancement via JS. Eventually we got DOM diffing libraries so you could render templates on the client and move to declarative logic. DX was arguably better than today as you could easily understand and inspect your entire stack, though tools weren't as flashy.
In the 2010-2015 era it was not uncommon to build entire interactive websites from scratch in under a day, as you wasted almost no time fighting your tools.
I can only approve. To me Astro started as "it's just html and css but with includes."
I used it for my personal website, and recently used it when reimplementing the Matrix Conference website. It's really a no-fuss framework that is a joy to use.
Among the things I love about Astro:
- It's still html and css centric
- Once built, it doesn't require js by default
- You can still opt-into adding js for interactivity here and there
- Content collections are neat and tidy
- Astro massively optimizes for speed, and the maintainers know how to do it
- It has a very helpful dev toolbar to help you visually figure out what easy fix can make your website snappier (like lazily loading images if it detects them below the fold)
For the "optimize for speed" bit, an example is that the CSS minifier cleverly inlines some CSS to avoid additional queries. The Image component they provide will set the width and height attributes of an image to avoid content layout shifts. It will also generate responsive images for you.
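For reference, the Image component mentioned here comes from astro:assets; given a locally imported image (the file path is invented), it fills in width/height from the source file and emits optimized output at build time:

```astro
---
import { Image } from 'astro:assets';
import hero from '../assets/hero.png'; // hypothetical local asset
---
<!-- width/height are inferred from the file, so the layout doesn't shift while it loads -->
<Image src={hero} alt="Conference venue" />
```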
> - It's still html and css centric - Once built, it doesn't require js by default - You can still opt-into adding js for interactivity here and there
I've never used Astro so forgive my ignorance, but isn't that just creating a .html file, a .css file and then optionally provide a .js file? What does Astro give you in this case? You'd get the same experience with a directory of files + Notepad basically. It's also even more optimized for speed, since there is no overhead/bloat at all, including at dev-time, just pure files, sent over HTTP.
> an example is that the css minifier cleverly inlines some CSS to avoid additional queries
Is that a common performance issue in the web pages you've built? I think across hundreds of websites, and for 20 years, not once have "CSS queries" been a bottleneck in even highly interactive webpages with thousands of elements, it's almost always something else (usually network).
For the first one, the main benefits of Astro over static html and css (for my use cases) are the ability to include components and enforce the properties that must be passed. A typical example would be [here][0] where I define a layout for the whole website, and then [on each page that uses it](https://github.com/matrix-org/matrix-conf-website/blob/main/...) I have to pass the right properties. Doable by hand, but it's great to have tooling that can yell at me if I forgot to do it.
Content Collections also let me grab content from e.g. markdown or json and build pages automatically from it. The [Content Collections docs][1] are fairly straightforward.
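To give a flavour of what that looks like, a content collection is just a schema next to your markdown; a minimal sketch (the blog collection and its fields are made up, while defineCollection and the z re-export come from astro:content):

```ts
// src/content/config.ts
import { defineCollection, z } from 'astro:content';

const blog = defineCollection({
  type: 'content', // markdown/MDX entries under src/content/blog/
  schema: z.object({
    title: z.string(),
    pubDate: z.date(),
    draft: z.boolean().default(false),
  }),
});

export const collections = { blog };
```

Frontmatter that doesn't match the schema fails the build, which is also where the "type safety for your Markdown meta" mentioned further down comes from.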
As for performance issues, I've spent quite a bit of time on the countryside where connectivity was an issue and every extra request was definitely noticeable, hence the value of inlining it (you load one html file that has the css embedded, instead of loading an html file that then tells your browser to load an extra css file). The same can be true in some malls where I live.
Embedded CSS circumvents proper caching of the CSS. Also, with HTTP/2 your client can download several resources over a single connection. So it shouldn’t make much of a difference whether the CSS is embedded or separate. It's just that embedded CSS has to be loaded over and over again, whereas a separate file can be cached and reused from the local cache.
Caching is only relevant if you think your site is going to be visited by the same person often enough that the cache is worthwhile. If the assets are small that HTTP overhead is non-negligible, and if your CSS would get few enough cache hits, then you're often better off just inlining stuff.
HTTP/2 does not change this equation much. Server Push is dead, and bypasses caching anyway. Early Hints can help if configured correctly, but still require the client to make the request roundtrip to fetch that asset.
> I've never used Astro so forgive my ignorance, but isn't that just creating a .html file, a .css file and then optionally provide a .js file? What does Astro give you in this case? You'd get the same experience with a directory of files + Notepad basically. It's also even more optimized for speed, since there is no overhead/bloat at all, including at dev-time, just pure files, sent over HTTP.
Astro is super for completely static stuff too. Sometimes static stuff can be complex, and that's where a modern framework like Astro shines.
I will share a couple of files to explain.
The site is almost completely static. It serves minimal JS for:
(1) Prefetching (you can block that and nothing will break)
(2) Mobile menu (you cannot make an accessible mobile menu without JS)
The site is for the docs and demos of a JS library. I want many demos on it, to be able to see what patterns the lib can handle and where things break down. I want to be able to add/remove demos quickly to try ideas. Straight HTML written in index.html files would not allow me to do that (but it is fine for the site where I have my CV, so I just use that there).
This is the Astro component I made that makes it super easy for me to try whatever idea I come up with:
So Astro is basically a (what we used to call) "static site generator" in that case? Something that existed for decades, and is basically just "compiling" templates, which could be in various syntaxes and languages, but in this case it's for JavaScript specifically. I guess the "what's old is new" point continues to stand tall :)
Again I'm failing to see exactly what Astro is "innovating" (as you and others claim they're doing). There's nothing wrong with taking a workflow and making it really stable/solid, or making it really fast, or similar. But for the claim to be "innovative" to be true, they actually have to do something new, or at least put existing stuff together in a new way :)
I am not saying Astro is innovative and I don’t think I implied that in my reply. I don’t view Astro as innovative and I don’t understand people who view it like that because of, say, its islands. (We knew about islands before Astro.)
As you said, in the example I shared Astro is an SSG. It happens to use server-side JS, but this is irrelevant.
But it is more than that. Astro is an SSG and it is also a *very well made* SSG.
I have used all the usual suspects: Ruby ones, Go ones, Python ones, JS ones. The closest I came to having fun was 11ty, but 11ty is a bit too chaotic for me. Astro is the one that clicked. And the one that was fun to use right from day 1.
I am not a JavaScript person, mind you. JavaScript is not my strongest FE skill. The JS conventions, tricks, and syntaxes of modern FE frameworks, even less so.
So Astro did not click for me because of that. It clicked because of how well it is made and because of how fun it is to use.
Oh! It does this!
Oh! It does that!
Oh! It gives you type safety for your Markdown meta! (What?!)
Oh! It gives you out of the box this optimization I was putting together manually! You just have to say thisOptim: true in the configuration file!
Astro is a very well made tool that improves continually and that aligns with my vision of the platform and of how we should make stuff for the platform.
The best way to do "just html and css with includes" is to run any common webserver like nginx and turn on server-side includes. It is literally just html and css with includes then. And zero javascript, anywhere, unless you want it.
SSI hasn't changed in 20+ years and it's extremely stable in all webservers. A very tiny attack surface with no maintenance problems. It just does includes of html fragments. The perfect amount of templating power to avoid redundancy but also avoid exploitable backends.
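For anyone who hasn't used SSI: it's a comment-style directive in the HTML plus one switch (ssi on;) in the nginx config; the fragment paths here are made up:

```html
<!-- index.html: the server splices in the fragments before sending the page -->
<!--#include virtual="/fragments/header.html" -->
<main>
  <h1>Hello</h1>
  <p>Plain HTML, no build step, no client-side JS.</p>
</main>
<!--#include virtual="/fragments/footer.html" -->
```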
I love Astro so much. After 20 years in data and backend, I got back into frontend for a big project. After banging my head with React, I took a leap of faith and choose Astro with Svelte. That includes an initial try with SvelteKit.
It's worked out so wonderfully. By being HTML/CSS centric, it forces a certain predictable organization with your front end code. I handed the frontend to another developer, with a React background, and because it's so consistently and plainly laid out, the transition happened almost overnight.
My one criticism, and the reason I ditched it for now, is that complex routing gets confusing and abstract quickly.
I don't know that there's a serious solution to it because complexity can't come with zero friction but just my gut feeling was to back out and go with something else for now.
I am feeling old reading the phrase "traditional frameworks" as a reference to SPA/Virtual DOM frameworks all while the actual traditional frameworks like Backbone, jQuery, etc. actually worked the way described in the blogpost.
"Traditional" always been a measure that depends on when we were born. "Traditional" internet for me is 56kbit modems, vbulletin forums, GTA:VC modding and IRC, while for older people "traditional" internet is probably BBS and such, and for the younger crowd things like Discord is part of the "traditional" internet.
You see the same thing in political conservative/traditional circles, where basically things were good when they were young, and things today are bad, but it all differs on when the person was born.
>You see the same thing in political conservative/traditional circles, where basically things were good when they were young, and things today are bad, but it all differs on when the person was born
when things decline that's still an accurate representation, not just an artifact of subjectivity
It's baffling to me why more SSR frameworks, namely Astro and NextJS, can't adopt static pages with dynamic paths like SvelteKit. So for example, if you have a page /todos/[todoId], you can't serve those in your static bundle, and NextJS straight-out refuses to build your app statically.
Whereas SvelteKit builds happily and does this beautiful catch-all mechanism where a default response page, say 404.html in Cloudflare, fetches the correct page, and from the user's perspective it works flawlessly. Even though behind the scenes the response was a 404 (since that dynamic page was never really compiled). Really nice, especially when bundling your app as a webview for mobile.
I partly agree with you, but it is a design decision that comes with a drawback. A URL /todos/123 cannot be resolved in a SPA in a hard-reload. I.e. if a user were to bookmark /todos/123 or press reload in the browser, the browser would ultimately ask the underlying HTTP server for that file. As you mentioned, you would need a 404 page configured to fetch that - but that requires a configuration in the HTTP server (nginx etc.). So you are not just a static html+js+css+images deploy, you always will need server support. Another issue is, that 4xx errors in the HTTP spec are treated differently than 2xx: most notably, browsers are NOT allowed to cache any 404 responses, no matter what response header your server sends. This will ultimately mean, those /todo/123 bookmarks/hard-reloads will always trigger a full download of the page, even though it would be in the cache. And again, you would always need support in the web server to overwrite 404 pages. While the current NextJS output can be just deployed to something like github-pages or other webspace solutions.
Now, these are just the limitations I can think of, but there are probably more. And to be fair, why "break" the web this way, if you can just use query params: /todo?id=123. This solves all the quirks of the above solution, and is exactly what any server-side app (without JS) would look like, such as PHP etc.
> if a user were to bookmark /todos/123 or press reload in the browser, the browser would ultimately ask the underlying HTTP server for that file. As you mentioned, you would need a 404 page configured to fetch that - but that requires a configuration in the HTTP server (nginx etc.). So you are not just a static html+js+css+images deploy, you always will need server support.
> use query params: /todo?id=123. This solves all the quirks of the above solution, and is exactly what any server-side app (without JS) would look like, such as PHP etc.
We had PATH_INFO in virtually every http server since CGI/1.0 and were using it for embedding parameters in urls since SEO was a thing, if not earlier. Using PATH_INFO in a PHP script to access an ID was pretty common, even if it wasn't the default.
By way of example, here's a sample url from vBulletin, a classic PHP application <https://forum.vbulletin.com/forum/vbulletin-sales-and-feedback/vbulletin-pre-sales-questions/4387853-vbulletin-system-requirements>[0] where the section, subsection, topic id, and topic are embedded into the URL path, not the query string.
Interesting. You can set up the server to respond with 200.html as the catch-all so the requests would return 200. There was some issue with it—can't remember what—which is why I switched to 404.html. After the initial load though the subsequent navigations would go through pushState so I think they'd be cached.
But I don't see this as that big of a problem. With this I can pick and choose—SSR the dynamic pages or use the hacky catch-all mechanism. For any reasonably large site you probably would SSR for SEO and other purposes. But for completely offline apps I have to do zero extra work to render them as is.
Personally, I much prefer route paths to query parameters, not just because query parameters look ugly but because they lose hierarchy. Also, you can't then just decide to SSR the pages individually, as they're now permanently fixed to the same path.
Maybe I misunderstood you, but I did dynamic routes/pages for Next and Astro static builds, using Contentful or Storyblok as a CMS, where the editor defines the routes and the components/bloks per page. Basically, the projects had one slug like [...slug].
Routes and components per page are dynamically created while exporting Next or building Astro static pages. In both frameworks you create the pages/slugs via getStaticPaths. And if ISR/ISP is enabled, even new pages (that are not known at build time) are pre-rendered while running the server.
In Next it is called dynamic routes[1] and in Astro dynamic pages[2].
Catch-all slugs in Next and Astro are written as [...slug], for example.
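For context, a dynamic static route in Astro is shaped roughly like this; the todos API is invented, while getStaticPaths and Astro.params are the real mechanism:

```astro
---
// src/pages/todos/[todoId].astro -- each returned param set becomes a page at build time
export async function getStaticPaths() {
  const todos = await fetch('https://example.com/api/todos').then((r) => r.json());
  return todos.map((todo) => ({ params: { todoId: String(todo.id) } }));
}
const { todoId } = Astro.params;
---
<h1>Todo {todoId}</h1>
```

Anything not returned from getStaticPaths simply doesn't exist in the static output, which is the 404 behaviour being discussed here.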
I think since they used [todoId] in the example they mean a static page which does not exist at build time. Which both can do, it’s called ISG (or on-demand in the Astro docs), but it requires a server to work, or you can create a static route that accepts any parameters and pass those to JavaScript to run on the client.
It doesn't. Those are executed at build time, and you can't just set a wildcard, so anything outside the given set results in a 404.
As background, I wanted to make a PoC with NextJS bundled into a static CapacitorJS app somewhat recently and had to give up because of this.
You can try tricking NextJS by transforming the pages into "normal" ones with e.g. query parameters instead of paths, but then you need complicated logic changing the pages as well as rewriting links, since you of course want the normal routes in the web app. Just a huge PITA.
It's clear to me that the frontend conversation space is broken. Not even just the ecosystem being a mess.
Boiling down the conversation I see in the article, it just seems to be: the browser as an HMI vs the browser as an application runtime. Depending on what you want to do, one might be a better fit than the other. But the points it puts forward are fluff arguments like "it's a breath of fresh air" or "it loads faster".
It's difficult to articulate the source of just how broken the discussion space is, nor have I made a particularly strong argument myself. But I think it's important to keep pushing back on conversations that position frameworks like they are brands winning hearts and minds. A la the fashion industry.
The fashion industry is the best analogy I've seen so far for frontend frameworks. It's obvious that the amount of technical rigor involved with declaring something "content-driven" and "server-first" is approximately zero.
Astro is trying to position itself in opposition to things like Next.js or Nuxt, which are specifically marketed as application frameworks?
And the architecture is more suited to something like a content site, because of the content collections, built-in MDX support, SSR, image handling, and server routing?
> But the points it puts forward are fluff arguments like "it's a breath of fresh air" or "it loads faster".
Fluff arguments do exist, but you can also measure. The site is static with minimal JS on the one page, and a bit more JS on the other page, so there's nothing surprising in the numbers, and nothing you can say was achieved thanks to the magic of Astro, but I wanted to share them:
Write some straight HTML pages and serve it from bog standard Apache. Heck, get really fancy and do some server-side includes for your CSS or something.
It's really fast, you can edit it with Notepad, and you can probably saturate your bandwidth with a consumer level PC.
It's fluff because, well, our expectations are so unbelievably low. By the time you've bolted on every whizbang dingus leveraging four different languages (two of which are some flavor of Javascript), your twelve page site takes a couple of minutes to compile (what?), and it chokes your three load-balanced AWS compute nodes.
Web applications are hard. I get that. Web sites? They were, by design, incredibly simple. We make them complicated for often unclear reasons.
I appreciate what the Astro folks are trying to do, and it's very clever. But your basic Web site need not require freaking npm in order to "return to the fundamentals of the Web".
I think the main benefit is that you are not forced to use any other library or framework like React or Vue on top of it. You can simply use HTML or Web Components. However, Astro can perform similar tasks to Next or Nuxt, such as SSR, ISR, static site generation, and middleware.
Another difference, and benefit, of Astro compared to other frameworks is the island architecture. This means you can implement micro frontends.
Island architecture and micro frontends are features that companies or projects may want if they have multiple teams. For example, one team could be working on the checkout process, another on the shopping basket, and another on product listings.
Now, you can use Astro to combine these components on a single route or page. And you control how these components are rendered. Astro also allows you to share global state between these islands.
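As a sketch of how that state sharing usually works: a tiny framework-agnostic store (nanostores is what the Astro docs point to), with the store itself invented for the example:

```ts
// src/stores/cart.ts -- imported by otherwise isolated islands
import { atom } from 'nanostores';

export const cartCount = atom(0);

export function addToCart(): void {
  cartCount.set(cartCount.get() + 1);
}
```

A React island can then read it with useStore(cartCount) from @nanostores/react, a Svelte island can subscribe with $cartCount, and both stay in sync without sharing a framework runtime.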
This approach is beneficial because teams can develop and ship a single feature while having full responsibility for it. However, it also has downsides, and similar outcomes can be achieved with non-island architectures.
For instance, if all teams use React, it is common for each team to use a different version of React, forcing the browser to load all these versions. The same issue arises if one team uses Vue, another uses Angular, and another uses React or any other framework.
I'm not fully convinced that it will change the web. It is basically a Next or Nuxt without the library/framework lock-in. And it offers the island architecture, which is usually only beneficial for very large projects.
But you should try it. I have worked with Astro since its first release, now for several years, and I can recommend giving it a try.
It is also a nice tool if you want to get rid of React or Vue and move to web components, or if you want to replace Next or Nuxt. You can do this with Astro, step by step.
To me, no, it's not. It works well for some of the use cases, but if all you needed was offline rendering of your JS in a build step to generate static HTML, then you really didn't need all that JS to begin with. Islands work until they don't, and a lot of stuff gets inlined too. I guess it's fine if you stop caring about the final build.
I feel a lot of the hype around Astro has more to do with vite than anything else. And there yes, without doubt, vite is amazing.
Not the OP, but I’d guess islands become a PITA when there is user-visible state that must be synchronized and rendered coherently across multiple islands.
Setting `client:load` and `client:visible` for the (Svelte) islands you want to run on the client ends up inlining module script tags all around the HTML. It looked like a big hack to me.
On the positive side their use of web components is a nice bet.
Please stop recommending Next.js as the de facto React framework, we need some critical thinking back into front-end. Remix (React Router v7) or TanStack are much better alternatives.
Second this. Next.js had potential, but it feels like it's gone downhill majorly since Vercel got involved.
Been on the Next.js journey since v10, lived through the v13 debacle and even now on v15, I've very much cooled on it.
I find both React and Next.js move way too fast and make incredibly radical changes sub-annually. It's impossible to keep up with. Maybe it could be justified if things improved from time to time, but often it just feels like changes for changes' sake.
Remix/React Router v7 was/is on the right path. I hope whatever they are planning with Remix, with Preact and web standards, will bring back the robust way of building websites.
I did not like how the Remix to RR7 transition was made, though; my project built with Remix was not an easy upgrade and I am rewriting a lot of it on RR7 now.
Please stop recommending React as the de facto framework, we need some critical thinking back into front-end. HTML, CSS and JS are much better alternatives.
The fundamentals of the Web haven't gone away; anyone still coding across PHP, Spring, Quarkus, or ASP.NET MVC hasn't really noticed how bad things have become with JS frameworks.
Unfortunately, in a fashion-driven industry, it isn't always easy to stick to the basics.
I spent a small amount of time looking into Astro and I didn't get the difference with the Fresh framework created by the Deno team. Fresh does this island architecture already, and the benchmarks on the Astro website don't include Deno+Fresh to compare. So I'm still wondering what the benefit is of using Deno+Astro vs. Deno+Fresh.
Astro and Fresh were both inspired by the islands idea which was iirc coined by an Etsy frontend architect and further elaborated on by the creator of Preact.
My understanding is that Astro is able to more-or-less take a component from any combo of popular frameworks and render it, whereas Fresh is currently limited to just Preact via Deno. I think the limitation is to optimize for not needing a build step, and not having to tweak the frameworks themselves like Astro does (did?).
I'm not affiliated; I've just looked at both tools before.
It doesn't, which is why all these solutions break down long-term compared to things like WP for small-biz brochure stuff. 5 to 10 years from now, when you're no longer talking to your client who has absolutely no technical experience, they're not gonna know that their website code is in some random GitHub repository that needs to be compiled with vite, and that you then need to magically wait for Netlify/etc to pull in your changes. They'll probably be fuming that they have to find a developer who knows how to edit and manage that, compared to something like WordPress, which is used for the majority of those websites.
For a small business that simply displays an informational site but occasionally needs to add content, a post on a blog or a change in the text or similar, it is sufficient to build a simple setup that reads some markdown files, let them SFTP markdown files up, and keep periodic backups. One could also go as far as automatically committing those user-uploaded files to a repo. None of which is difficult to do.
It does get more complicated, if the small business wants users to be able to message them through the website, not just via e-mail. Or if the business suddenly wants to have a shop functionality on the site. Then CMSs start to slowly become an option.
Many medium businesses don't even need that, btw. In many instances marketing people just want to have control over websites that they should not be given control over, since they usually are incapable of knowing the impact of what they are doing when they add something like Google Tag Manager to their site. They also tend to break things when using Wordpress, because they do not understand how it works under the hood, so that side of things is also not without hassle. And then the devs are called in to fix what marketing broke. Even with Wordpress. At that point it would often be easier to let the devs build a non-Wordpress site, and any ideas about things that are not just content in markdown files need to be requests for the dev team to evaluate, and possibly work on, when deemed safe and lawful.
Sadly the power dynamics in businesses are often stacked against conscientious developers.
If they need to bring in a CMS to edit content and juggle assets they could have used Wordpress in headless mode, which the customer was already used to…
By default, Astro generates static pages. So it makes sense to compare it to an approach that doesn't use a framework at all.
Using a framework has upsides over writing static pages manually. Most notably, you can decompose your website into reusable components which makes your implementation more DRY. Also, you can fluently upgrade to a very interaction-heavy website without ever changing tech or architecture. But that's just what I value. I whole-heartedly recommend trying it out.
My personal website is written with Astro. I love it! Admittedly I'm still a little trigger-happy with my JS usage, but overall the ability to write templated pages and utilize islands that don't require big dependencies is the most friendly web development experience I've had.
I tried Astro but it's not for me. I was upgrading my personal website from Jekyll to Astro a couple of years back. It was a steep learning curve with lots of folders. Things used to break all the time. Then I switched to raw HTML and CSS.
The title change led to a bit of an unexpected jolt; I assumed I'd clicked on the wrong link. I'm not sure where that falls on the guidelines, though, given the circumstances.
Could someone compare it like I'm 5 to static site generators like Hugo, Jekyll? Does it make it easier to throw in necessary JS (e.g. for comments)? Thanks.
Astro is, and should be treated as, a static site generator.
> Does it make it easier to throw in necessary JS (e.g. for comments)?
With Astro you can combine HTML, CSS and JS in a single file (.astro). You write plain JS (or TypeScript) within a <script> tag. There you can, e.g., import your comment library, point to a separate .js/.ts file, or write whatever logic you want for client-side JS.
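A minimal sketch of that single-file shape (the comments helper import is hypothetical):

```astro
---
// Build-time / server-side part
const title = 'Comments';
---
<section>
  <h2>{title}</h2>
  <div id="comments"></div>
</section>

<style>
  section { max-width: 40rem; }
</style>

<script>
  // Client-side part: bundled and shipped to the browser
  import { mountComments } from '../scripts/comments'; // hypothetical helper
  mountComments(document.getElementById('comments'));
</script>
```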
See the docs for example JS usage in astro components:
Coming from Go, I don't enjoy working with Go's or other template engines. I have compared various 3rd party Go template libraries, and settled on a JSX-like syntax, which is just way easier.
I've become pretty convinced everyone working in software should have something like a static site generator somewhere in their back pocket. It could be Astro, Hugo (my choice), or even pandoc running in server mode, which is unhackable because you would have to first understand the Haskell type system to hack it. But it should be there for all the little times you just want to spin up something fast, cheap, and content-driven.
Next.js hydrates only client components - so effectively it's doing island architecture. And it's React end to end. How's that different from Astro? Stating things like "Components without the complexity" doesn't really mean anything unless you do some comparisons.
I ported my personal website from Jekyll to Astro a few weeks back and I really liked it. Astro is much easier to build and extend for me - and that is a personal preference point, and by "I" I mostly mean Claude - and it's cool to be able to add React components to create more interactive parts (though I haven't deployed that element yet).
Speed is probably the same as Jekyll - but relative to my React Vite and Next.js apps it's about 10 times faster.
I would definitely use Astro for websites more complicated than purely content-driven ones - but would probably return to Next.js or heftier full-stack solutions for complicated web apps.
Potentially the heuristic would be the level of user state management - e.g. whether you need to support various workflows vs. just presenting content.
What I don't understand from these conversations is the main selling proposition for Astro: if you have a content-heavy website, you should ship zero JavaScript. And I agree with that; most websites heavy in content will at most have a couple of pages consumed in a single session.
But if my "website" is an application, JavaScript makes the whole user experience better, if implemented well. It doesn't matter that the user will wait one more second if they're going to spend the entire day working in it.
The selling point is that you can ship JS, but only when necessary, and only scoped to the components that need it. It's not at all about removing the framework entirely.
I recently migrated my blog from WordPress to Astro (https://aligundogdu.com). The development process was genuinely enjoyable, and the performance boost especially in SEO and speed scores was a fantastic bonus.
Astro is great and I had a good experience using it. However, I still don't like the complexity needed. I don't mind a build step per se, but I'd just like to know that today's version of my site will still build in ~5 years.
I've only heard of Astro before, but I got interested today and it seems like an intriguing framework.
That said, Astro also seems to be developed under a venture-backed company.
Is it still less likely to end up like Next.js and React under Vercel's influence?
>> See that code fence at the top? That runs at build time, not in the browser. Your data fetching, your logic - it all happens before the user even loads the page. You get brilliant TypeScript support without any of the complexity of hooks, state management, or lifecycle methods.
This is satire, right?
If only there were any other server-side language that could do the same and produce static, compliant, super-light, HTML-first pages!
> Traditional frameworks hydrate entire pages with JavaScript. Even if you've got a simple blog post with one interactive widget, the whole page gets the JavaScript treatment. Astro flips this on its head. Your pages are static HTML by default, and only the bits that need interactivity become JavaScript "islands."
I guess I'd argue "Traditional Frameworks" were the ones that never stopped doing this. Laravel, Django, Rails etc. Then the SPA frameworks came along and broke everything.
Also - what on earth is "f*"? I originally assumed it was shorthand for "fuck" but is "fuck dream" a common expression? And wouldn't you normally write it as "f***"?
Although Astro is great and all, it collects telemetry by default and requires an opt-out. Zola and 11ty, on the other hand, don’t do such nonsense and have a way smaller dependency footprint.
I misread the title and was quite confused about when the F* programming language connection would take the stage in the article. Spoiler alert: it never does, because the title doesn't make a reference to the F* programming language :D
>See that code fence at the top? That runs at build time, not in the browser. Your data fetching, your logic - it all happens before the user even loads the page.
I can't with these goddamn LLM blog posts; they just drown everything.
> Traditional frameworks hydrate entire pages with JavaScript. Even if you've got a simple blog post with one interactive widget, the whole page gets the JavaScript treatment. Astro flips this on its head. Your pages are static HTML by default, and only the bits that need interactivity become JavaScript "islands."
Back in my days we called this "progressive enhancements" (or even just "web pages"), and was basically the only way we built websites with a bit of dynamic behavior. Then SPAs were "invented", and "progressive enhancements" movement became something less and less people did.
Now it seems that is called JavaScript islands, but it's actually just good ol' web pages :) What is old is new again.
Bit of history for the new webdevs: https://en.wikipedia.org/wiki/Progressive_enhancement
You're making the opposite mistake: you're seeing someone's description of a tool's feature and confusing it with the way we've already done things without even checking if the tool is transformative just because it kinda sounds similar.
Astro's main value prop is that it integrates with JS frameworks, let's them handle subtrees of the HTML, renders their initial state as a string, and then hydrates them on the client with preloaded data from the server.
TFA is trying to explain that value to someone who wants to use React/Svelte/Solid/Vue but only on a subset of their page while also preloading the data on the server.
It's not necessarily progressive enhancement because the HTML that loads before JS hydration doesn't need to work at all. It just matches the initial state of the JS once it hydrates. e.g. The <form> probably doesn't work at all without the JS that takes it over.
These are the kind of details you miss when you're chomping at the bit to be cynical instead of curious.
What is the value in first sending dysfunctional HTML and then fixing it with later executed JS? If you do that, you might as well do 100% JS. Probably would simplify things in the framework.
Sending functional HTML, and then only doing dynamic things dynamically, that's where the value is for web _apps_. So if what you point out is the value proposition for Astro, then I am not getting it, and don't see its value.
Dysfunctional HTML in the sense that interactivity is disabled, but visually it is rendered and therefore you can see the proper webpage immediately.
Just compare the two cases, assuming 100ms for the initial HTML loading and 200ms for JS loading and processing.
With full JS, you don't see anything for 300ms, the form does not exists (300ms is a noticeable delay for the human eye).
With frameworks such as Astro, after 100ms you already see the form. By the time you move the mouse and/or start interacting with it, the JS will probably be ready (because 200ms is almost instant in the context of starting an interaction).
This is not new at all, old school server side processing always did this. The advantage is writing the component only once, in one framework (React/vue/whatev). The server-client passage is transparent for the developer, and that wasn't the case at all in old school frameworks.
Note that I'm not seeing this is good! but this is the value proposition of Astro and similar frameworks: transparent server-client context switching, with better performance perceived by the user.
Usually I write my HTML and CSS also in a way that I write it once and can then reuse it elsewhere. I do that by employing traditional template engines like Jinja2 (in Python) or using SXML in Lisps.
You don't have any complex interactivity than, and therefore this solutions does not apply to your use cases
>e.g. The <form> probably doesn't work at all without the JS that takes it over.
What a value!
I guess I may be chomping at the bit to be cynical, but I have quite a bit of experience in these fields, and I don't think Astro sounds especially transformative.
I didn't read their comment as particularly cynical, and at a high level they're still correct, but so are you.
I think your comment gets at a very specific and subtle nuance that is worth mentioning, namely that typically if you were a proghance purist, you'd have a fallback that did work; a form the submitted normally, a raw table that could be turned into an interactive graph, etc..
I don't think these details are mutually exclusive though, and that it was certainly valid in those days to add something that didn't have a non-js default rendering mode, it's just that it was discouraged from being in the critical path. Early fancy "engineered" webapps like Flipboard got roasted for poorly re-implimenting their core text content on top of canvas so they could reach 60fps scrolling, but if JS didn't work their content wasn't there, and they threw out a bunch of accessibility stuff that you'd otherwise get for free.
Now that I'm thinking back, it's hard to recall situations *at that time* where there would be both something you couldn't do without JavaScript and that couldn't also have a more raw/primitive version, but one example that comes to mind and would still be current are long-form explanations of concepts that can be visualized, such as those that make HN front page periodically. You would not tightly couple the raw text content to the visualization, you would enhance the content with an interactive visual, but not necessarily have fallback for it, and this would still be progressive enhancement.
Here's another good example from that time, which is actually only somewhat forward compatible (doesn't render on my Android), but the explanation still renders https://www.ibiblio.org/e-notes/webgl/gpu/fluid.htm
I'm sorry to go off-topic, but since when hydrating web pages with JavaScript became a thing? First time I've heard the term.
Edit: according to WP history, around December 2020
https://en.wikipedia.org/w/index.php?title=Hydration_(web_de...
Second edit: "Streaming server rendering", "Progressive rehydration", "Partial rehydration", "Trisomorphic rendering"... Seems I woke up in a different universe today.
It's definitely older. I remember hearing about it around a job I had in 2017-2020, but it was already an established term by that time.
Quote:
2013-2015: The concept of SSR in JavaScript frameworks emerged with React (released in 2013). Initially, it was referred to as "reconciliation" or "bootstrapping" — React would match the DOM with the virtual DOM.
2015: Around the release of React 0.14 and React 15 (2016), the term "hydration" began appearing in the React ecosystem to describe the process of attaching event listeners to server-rendered markup.
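For anyone who hasn't seen the term in code: in current React the call is hydrateRoot from react-dom/client, which reuses the server-rendered markup and just attaches listeners and state rather than rebuilding the DOM. A minimal sketch (App and the #root element are placeholders):

```tsx
import { hydrateRoot } from 'react-dom/client';
import App from './App'; // placeholder component

// "Hydration": reuse the HTML the server already rendered inside #root and
// attach event listeners/state, instead of rendering the markup from scratch.
hydrateRoot(document.getElementById('root')!, <App />);
```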
the concept is not new in any sense (duh). I mean: send the HTML first, send the important CSS first so the browser can show something - hopefully with the correct dimensions to avoid reflows (and the flash of white and so on) - and JS interaction "always" got enabled on some kind of trigger (onload, DOMContentLoaded, or simply the script tag being placed after the closing body tag, which as far as I know meant all of the event handlers registered by the time the DOM tree was there)
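A rough sketch of that old trigger pattern, with a hypothetical menu toggle: the HTML renders on its own, and the behavior attaches once the DOM is parsed.

```ts
// Old-school "enable JS on a trigger": the markup is already usable,
// the script only wires up extra behavior once the DOM exists.
document.addEventListener('DOMContentLoaded', () => {
  const button = document.querySelector('#menu-toggle'); // hypothetical element
  button?.addEventListener('click', () => {
    document.body.classList.toggle('menu-open');
  });
});
```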
then chunking was the next step, and basically the logical endpoint is this mix-and-match strategy that NextJS is "leading" (?), allowing things to be streamed in while sending and caching as much of the static parts up front as possible.
Oh no, go read about AJAX. You may be too young; there's no revolution here. I celebrate your curiosity, but it's still a wheel. Changing the name, it's still reinventing it.
> you're chomping at the bit to be cynical instead of curious.
Nicely sums up a lot of interactions these days
Agreed! Astro is fantastic, but the biggest barrier I had in learning it was getting my head around the range of terms that developers who entered the workplace after 2010 have invented to describe "how the web works".
I really appreciate that innovation can sometimes reinvent things that already exist, out of pure ignorance and sometimes hubris. I’ve seen people make mountains out of molehills that take on a lore of their own, and then along comes someone new who doesn’t know better, or is a little too sure of themselves, and suddenly the problem is solved with something seemingly obvious.
I’m not sure javascript islands is that but I appreciate a new approach to an old pattern.
This field (software) in general, and especially web stuff, has no memory. It’s a cascade of teens and twentysomethings rediscovering the same paradigms over and over again with different names.
I think we are overdue for a rediscovery of object oriented programming and OOP design patterns but it will be called something else. We just got through an era of rediscovering the mainframe and calling it “cloud native.”
Every now and then you do get a new thing like LLMs and diffusion models.
This. So much this. I'm not even 42 and I feel old when I read stuff like this. Like you say, there's no memory in the web software field.
Wasted effort reinventing things and rediscovering their limits and failure modes is a major downside of the ageism that is rampant in the industry. There is nobody around to say “actually this has been tried three times by three previous generations of devs and this is what they learned.”
Another one: WASM is a good VM spec and the VM implementations are good, but the ecosystem with its DSLs and component model is getting an over engineered complexity smell. Remind anyone of… say… J2EE? Good VM, good foundation, massive excess complexity higher up the stack.
Even if you are 10 years younger than that and you’ve been doing webdev since your late teens, you’ve seen like three spins of the concept wheel come back again and again.
Software engineering/development has had a long September.
https://en.wikipedia.org/wiki/Eternal_September
I remember when it was called AJAX. We have completely lost the plot.
It's why Alan Kay calls our industry a Cargo Cult and says it's much more akin to the fashion industry than engineering.
Absolutely.
I remember it was called DHTML. (And the code never worked perfectly on both IE and Netscape/Mozilla.)
DynamicDrive my beloved
Htmx and, even more so, Datastar have brought us back on track. Hypermedia + Ajax with whatever backend language(s)/framework(s) you want. Whereas astro is astro.
I agree that it’s not a new concept by itself, but the way it’s being done is much more elegant in my opinion.
I originally started as a web developer back when PHP+jQuery was the state of the art for interactive webpages, shortly before React with SPAs became a thing.
Looking back at it now, architecturally, the original approach was nicer; however, DX used to be horrible at the time. Remember trying to debug with PHP on the frontend? I wouldn’t want to go back to that. SPAs have their place, mostly in customer dashboards or heavily interactive applications, but Astro finds a very nice balance of having your server and client code in one codebase; being able to define which is which and not having to parse your data from whatever PHP is doing into your JavaScript code is a huge DX improvement.
> Remember trying to debug with PHP on the frontend? I wouldn’t want to go back to that.
I do remember that, all too well. Countless hours spent with templates in Symfony, or dealing with Zend Framework and all that jazz...
But as far as I remember, the issue was debuggability and testing of the templates themselves, which was easily managed by moving functionality out of the templates (lots of people put lots of logic into templates...) and then putting that behavior under unit tests. Basically being better at enforcing proper MVC split was the way to solve that.
The DX wasn't horrible outside of that, even early 2010s which was when I was dealing with that for the first time.
The main difference is as simple as modern web pages having on average far more interactivity.
More and more logic moved to the client (and JS) to handle the additional interactivity, creating new frameworks to solve the increasing problems.
At some point, the bottleneck became the context switching and data passing between the server and the client.
SPAs and tools like Astro propose themselves as a way to improve DX in this context, either by creating complete separation between the two worlds (SPAs) or by making the boundary transparent (Astro).
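A minimal sketch of that transparent hand-off, assuming a hypothetical TodoList island and API endpoint: the fetch runs on the server (or at build time), and the result is passed to the client component as ordinary props.

```astro
---
// Runs on the server / at build time, not in the browser.
import TodoList from '../components/TodoList.jsx'; // hypothetical framework component
const todos = await fetch('https://example.com/api/todos').then((r) => r.json());
---
<!-- The island hydrates in the browser with the server-fetched data already in its props. -->
<TodoList todos={todos} client:idle />
```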
Well, that's a way to manage server-side logic, but your progressively-enhanced client-side logic (i.e. JS) still wasn't necessarily easy to debug, let alone being able to write unit tests for them.
> but your progressively-enhanced client-side logic (i.e. JS) still wasn't necessarily easy to debug, let alone being able to write unit tests for them
True, don't remember doing much unit testing of JavaScript at that point. Even with BackboneJS and RequireJS, manual testing was pretty much the approach, trying to make it easy to repeat things with temporary dev UIs and such, that were commented out before deploying (FTPing). Probably not until AngularJS v1 came around, with Miško spreading the gospel of testing for frontend applications, together with Karma and PhantomJS, did it feel like the ecosystem started to pick up on testing.
There wasn't as much JS to test. I built a progressively-enhanced SQLite GUI not too long ago to refresh my memory on the methodology, and I wound up with 50-ish lines of JS not counting Turbo. Fifty. It was a simple app, but it had the same style of partial DOM updates and feel that you would see from a SPA when doing form submissions and navigation.
Not usually, but in the context of
> Astro finds a very nice balance of having your server and client code in one codebase, being able to define which is which and not having to parse your data from whatever PHP is doing into your JavaScript code is a huge DX improvement.
the point is pretty much that you can do more JS for rich client-side interactions in a much more elegant way without throwing away the benefits of "back in the days" where that's not needed.
Bingo.
Modern PHP development with Laravel is wildly effective and efficient.
Facebook brought React forth with influences from PHP’s immediate context switching, and Laravel’s Blade templates have brought a lot of React and Vue influences back to templating in a very useful way.
The first web frameworks really got it right with stateless websites and server rendering.
>Traditional frameworks hydrate
Dear God. In 20 years people will hire HTML experts as if they are COBOL experts today.
I look forward to that day, because it implies something better would have replaced it.
> because it implies something better would have replaced it
Hah, if only... Time and time again the ecosystem moves not to something that is better, but "same but different" or also commonly "kind of same but worse".
There are so many cases where the "worse" solution "won", and there is a reason "worse is better" is such a popular mantra: https://en.wikipedia.org/wiki/Worse_is_better
Back in your day, there wasn’t a developer experience by which you could build a website or web app (or both) as a single, cohesive unit, covering both front-end and back-end, while avoiding the hydration performance hit. Now we have Astro, Next.js with RSC, and probably at least a dozen more strong contenders.
WebObjects.
That is a perfect description of it, released 1996. So far ahead of its time it’s not even funny. Still one of the best programming environments I’ve ever used almost <checks calendar> 30 years later.
> build a website or web app (or both) as a single, cohesive unit, covering both front-end and back-end, while avoiding the hydration performance hit.
Isn't that basically just what Symfony + Twig does? Server-side rendering, and you can put JS in there if you want to. Example template:
The syntax is horrible, and seeing it today almost gives me the yuckies, but it seems like the same idea to me, just different syntax more or less. I'm not saying it was better back then, it just seems like very similar ideas (which isn't necessarily bad either).

I find that syntax more palatable than what is going on in JSX, to be honest. I actually like having a separate declarative syntax for describing the tree structure of a page. Some languages manage to actually seamlessly incorporate this, for example SXML libraries in Schemes, but I have not seen it being done in a mainstream language yet.
Yeah, not a huge fan of either, to be honest, but probably JSX slightly before Twig, only because it's easier for someone who has written lots of HTML.
Best of them all has to be hiccup, I think: the smallest and most elegant way of describing HTML. Same template, but as a Clojure function returning hiccup:
Basically, just lists/vectors with built-in data structures in them, all part of the programming language itself.

What you have written, that's what SXML is, basically.
Have a look at: https://www.gnu.org/software/guile/manual/html_node/SXML.htm... or https://docs.racket-lang.org/sxml/SXML.html
And as you say, part of the language itself, which means no need to learn something different, and no need to learn a pseudo-HTML or lookalike like with JSX, which then needs to be actually parsed by the framework (or its dependencies), unlike SXML, which is already structured data, already understood perfectly in the same language and only needs to be rendered into HTML.
> there wasn’t a developer experience by which you could build a website or web app (or both) as a single, cohesive unit, covering both front-end and back-end
How much of the frontend and how much of the backend are we talking about? Contemporary JavaScript frameworks only cover a narrow band of the problem, and still require you to bootstrap the rest of the infrastructure on either side of the stack to have something substantial (e.g., more than just a personal blog with all of the content living inside of the repo as .md files).
> while avoiding the hydration performance hit
How are we solving that today with Islands or RSCs?
It can cover as much of the back-end that your front-end uses directly as you’d care to deploy as one service. Obviously if you’re going to have a microservices architecture, you’re not going to put all those services in the one Next.js app. But you can certainly build a hell of a lot more in a monolith that a personal blog serving a handful of .md files.
In terms of the front-end, there’s really no limit imposed by Next.js and it’s not limited to a narrow band of the problem (whatever that gibberish means), so I don’t know what you’re even talking about.
> How are we solving that today with Islands or RSCs?
Next.js/RSC solves it by loading JavaScript only for the parts of the page that are dynamic. The static parts of the page are never client-side rendered, whereas before RSC they were.
There are a lot of moving parts when building an application, and the abstractions that Next.js provides don't cover the same stuff when compared to frameworks like Ruby on Rails, Django and Laravel. The narrow band of problems Next.js solves for you are things like data fetching, routing, bundling assets and rendering interactive UI. It leaves things like auth, interacting with a database, logging, mailing, crons, queues, etc. up to you to build yourself or integrate with 3rd-party services. When you work with one of those frameworks, they pretty much solve all of those problems for you and you don't have to leave the framework often to get things done.
This is fine generally because you have the choice to pick the right tool for the job, but in the context of "a single, cohesive unit" you can only get that with Next.js if all that you care about are those specific abstractions and you want your backend and frontend to be in the same language. Even then you run into this awkwardness where you have to really think about where your JavaScript is running because it all looks the same. This might be a personal shortcoming, but that definitely broke the illusion of cohesion for me.
> The static parts of the page are never client-side rendered, whereas before RSC they were.
Didn't the hydration performance issues start when we started doing the contemporary SSR method of isomorphic JavaScript? I think Islands are great and they're a huge improvement over how we started doing SSR with things like the Next.js Pages Router. But that's not truly revolutionary industry-wide, because we were able to do progressive enhancement long before contemporary frameworks caught up. The thing I'm clarifying here is that "before RSC" only refers to what was once possible with frameworks like Next.js, not what was possible in general; you could always template some HTML on the server and progressively enhance it with JavaScript.
Why do you think that? You could absolutely "build a website covering FE and BE without the hydration performance hit" using standard practices of the time.
You'd render templates in Jade/Handlebars/EJS, break them down into page components, apply progressive enhancement via JS. Eventually we got DOM diffing libraries so you could render templates on the client and move to declarative logic. DX was arguably better than today as you could easily understand and inspect your entire stack, though tools weren't as flashy.
In the 2010-2015 era it was not uncommon to build entire interactive websites from scratch in under a day, as you wasted almost no time fighting your tools.
WebObjects and ColdFusion come from the 90s
The headline is literally “returns to the fundamentals of the web” so that’s the point.
I can only approve. To me Astro started as "it's just html and css but with includes."
I used it for my personal website, and recently used it when reimplementing the Matrix Conference website. It's really a no-fuss framework that is a joy to use.
Among the things I love about Astro:
- It's still html and css centric
- Once built, it doesn't require js by default
- You can still opt into adding js for interactivity here and there
- Content collections are neat and tidy
- Astro massively optimizes for speed, and the maintainers know how to do it
- It has a very helpful dev bar to help you visually figure out what easy fix can make your website snappier (like lazily loading images if it detects them below the fold)
For the "optimize for speed" bit, an example is that the css minifier cleverly inlines some CSS to avoid additional queries. The Image component they provide will set the width and height attribute of an image to avoid content layout shifts. It will also generate responsive images for you.
> - It's still html and css centric - Once built, it doesn't require js by default - You can still opt-into adding js for interactivity here and there
I've never used Astro so forgive my ignorance, but isn't that just creating a .html file, a .css file and then optionally provide a .js file? What does Astro give you in this case? You'd get the same experience with a directory of files + Notepad basically. It's also even more optimized for speed, since there is no overhead/bloat at all, including at dev-time, just pure files, sent over HTTP.
> an example is that the css minifier cleverly inlines some CSS to avoid additional queries
Is that a common performance issue in the web pages you've built? I think across hundreds of websites, and for 20 years, not once have "CSS queries" been a bottleneck in even highly interactive webpages with thousands of elements, it's almost always something else (usually network).
Fair questions.
For the first one, the main benefits of Astro over static html and css (for my use cases) are the ability to include components and enforce the properties that must be passed. A typical example would be [here][0] where I define a layout for the whole website, and then [on each page that uses it](https://github.com/matrix-org/matrix-conf-website/blob/main/...) I have to pass the right properties. Doable by hand, but it's great to have tooling that can yell at me if I forgot to do it.
Content Collections also let me grab content from e.g. markdown or json and build pages automatically from it. The [Content Collections docs][1] are fairly straightforward.
As for performance issues, I've spent quite a bit of time on the countryside where connectivity was an issue and every extra request was definitely noticeable, hence the value of inlining it (you load one html file that has the css embedded, instead of loading an html file that then tells your browser to load an extra css file). The same can be true in some malls where I live.
[0]: https://github.com/matrix-org/matrix-conf-website/blob/main/... [1]: https://docs.astro.build/en/guides/content-collections/
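For anyone curious what that yelling looks like, a hypothetical layout along those lines: the Props interface in the frontmatter is what lets Astro's tooling complain when a page forgets to pass something.

```astro
---
// src/layouts/Base.astro (hypothetical): pages that use this layout
// must pass title and description, or the editor/build will complain.
interface Props {
  title: string;
  description: string;
}
const { title, description } = Astro.props;
---
<html lang="en">
  <head>
    <title>{title}</title>
    <meta name="description" content={description} />
  </head>
  <body>
    <slot />
  </body>
</html>
```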
Embedded CSS circumvents proper caching of the CSS. Also, with HTTP/2 your client can download several resources in one transaction, so it shouldn’t make much of a difference whether the CSS is embedded or separate. It's just that embedded CSS has to be loaded over and over again, whereas a separate file can be cached and reused from the local cache.
Caching is only relevant if you think your site is going to be visited by the same person often enough that the cache is worthwhile. If the assets are small, the HTTP overhead is non-negligible, and if your CSS would get few enough cache hits, then you're often better off just inlining stuff.
HTTP/2 does not change this equation much. Server Push is dead, and bypasses caching anyway. Early Hints can help if configured correctly, but still require the client to make the request roundtrip to fetch that asset.
> I've never used Astro so forgive my ignorance, but isn't that just creating a .html file, a .css file and then optionally provide a .js file? What does Astro give you in this case? You'd get the same experience with a directory of files + Notepad basically. It's also even more optimized for speed, since there is no overhead/bloat at all, including at dev-time, just pure files, sent over HTTP.
Astro is super for completely static stuff too. Sometimes static stuff can be complex, and that's where a modern framework like Astro shines.
I will share a couple of files to explain.
The site is almost completely static. It serves minimal JS for:
(1) Prefetching (you can block that and nothing will break)
(2) Mobile menu (you cannot make an accessible mobile menu without JS)
The site is for the docs and demos of a JS library. I want many demos on it, to be able to see what patterns the lib can handle and where things break down. I want to be able to add/remove demos quickly to try ideas. Straight HTML written in index.html files would not allow me to do that (but it is fine for the site where I have my CV, so I just use that there).
This is the Astro component I made that makes it super easy for me to try whatever idea I come up with:
https://github.com/demetris/omni-carousel/blob/main/site/com...
Here is one page with demos that use the component:
https://github.com/demetris/omni-carousel/blob/main/site/pag...
Basically, without a setup like this, I would publish the site with 3 or 4 demos, and I would maybe add 1 or 2 more after a few months.
Cheers!
So Astro is basically a (what we used to call) "static site generator" in that case? Something that existed for decades, and is basically just "compiling" templates, which could be in various syntaxes and languages, but in this case it's for JavaScript specifically. I guess the "what's old is new" point continues to stand tall :)
Again I'm failing to see exactly what Astro is "innovating" (as you and others claim they're doing). There's nothing wrong with taking a workflow and making it really stable/solid, or making it really fast, or similar. But for the claim of being "innovative" to be true, they actually have to do something new, or at least put together stuff in a new way :)
I am not saying Astro is innovative and I don’t think I implied that in my reply. I don’t view Astro as innovative and I don’t understand people who view it like that because of, say, its islands. (We knew about islands before Astro.)
As you said, in the example I shared Astro is an SSG. It happens to use server-side JS, but this is irrelevant.
But it is more than that. Astro is an SSG and it is also a *very well made* SSG.
I have used all the usual suspects: Ruby ones, Go ones, Python ones, JS ones. The closest I came to having fun was 11ty, but 11ty is a bit too chaotic for me. Astro is the one that clicked. And the one that was fun to use right from day 1.
I am not a JavaScript person, mind you. JavaScript is not my strongest FE skill. The JS conventions, tricks, and syntaxes of modern FE frameworks, even less so.
So Astro did not click for me because of that. It clicked because of how well it is made and because of how fun it is to use.
Oh! It does this!
Oh! It does that!
Oh! It gives you type safety for your Markdown meta! (What?!)
Oh! It gives you out of the box this optimization I was putting together manually! You just have to say thisOptim: true in the configuration file!
Astro is a very well made tool that improves continually and that aligns with my vision of the platform and of how we should make stuff for the platform.
The best way to do "just html and css with includes" is to run any common webserver like nginx and turn on server side includes. It is literally just html and css with includes then. And zero javascript, anywhere, unless you want it.
SSI hasn't changed in 20+ years and it's extremely stable in all webservers. A very tiny attack surface with no maintenance problems. It just does includes of html fragments. The perfect amount of templating power to avoid redundancy but also avoid exploitable backends.
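For reference, the whole mechanism is a comment-style directive in the HTML plus `ssi on;` in the nginx config; a minimal sketch with hypothetical partial paths:

```html
<!-- index.html: with "ssi on;" set in the nginx server/location block,
     nginx splices these fragments in before sending the page -->
<body>
  <!--#include virtual="/partials/header.html" -->
  <main>Page content goes here.</main>
  <!--#include virtual="/partials/footer.html" -->
</body>
```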
I love Astro so much. After 20 years in data and backend, I got back into frontend for a big project. After banging my head with React, I took a leap of faith and choose Astro with Svelte. That includes an initial try with SvelteKit.
It's worked out so wonderfully. By being HTML/CSS centric, it forces a certain predictable organization with your front end code. I handed the frontend to another developer, with a React background, and because it's so consistently and plainly laid out, the transition happened almost overnight.
My one criticism, which is why I ditched it for now, is that complex routing gets confusing and abstract quickly.
I don't know that there's a serious solution to it because complexity can't come with zero friction but just my gut feeling was to back out and go with something else for now.
I am feeling old reading the phrase "traditional frameworks" as a reference to SPA/Virtual DOM frameworks all while the actual traditional frameworks like Backbone, jQuery, etc. actually worked the way described in the blogpost.
"Traditional" always been a measure that depends on when we were born. "Traditional" internet for me is 56kbit modems, vbulletin forums, GTA:VC modding and IRC, while for older people "traditional" internet is probably BBS and such, and for the younger crowd things like Discord is part of the "traditional" internet.
You see the same thing in political conservative/traditional circles, where basically things were good when they were young, and things today are bad, but it all differs on when the person was born.
>You see the same thing in political conservative/traditional circles, where basically things were good when they were young, and things today are bad, but it all differs on when the person was born
when things decline that's still an accurate representation, not just an artifact of subjectivity
The problem is that overall decline is generally subjective while personal decline is objective (we all grow old and sick and ultimately die).
People frequently conflate the two.
I remember Backbone being pure SPA-focused with client-side MVC.
It's baffling to me why more SSR frameworks, Astro and NextJS namely, can't adopt static pages with dynamic paths like SvelteKit. So for example, if you have a page /todos/[todoId], you can't serve those in your static bundle, and NextJS straight-out refuses to build your app statically.
Whereas SvelteKit builds happily and does this beautiful catch-all mechanism where a default response page, say 404.html in Cloudflare, fetches the correct page and, from the user's perspective, works flawlessly. Even though behind the scenes the response was 404 (since that dynamic page was never really compiled). Really nice, especially when bundling your app as a webview for mobile.
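For anyone wanting to reproduce that setup, the SvelteKit side is roughly adapter-static with a fallback page; a sketch, not the exact config used above:

```js
// svelte.config.js (sketch): build everything statically, and serve the
// fallback page for any path that wasn't prerendered (e.g. /todos/123);
// the client-side router then resolves the route.
import adapter from '@sveltejs/adapter-static';

export default {
  kit: {
    adapter: adapter({
      fallback: '404.html',
    }),
  },
};
```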
I partly agree with you, but it is a design decision that comes with a drawback. A URL like /todos/123 cannot be resolved in a SPA on a hard reload. I.e. if a user were to bookmark /todos/123 or press reload in the browser, the browser would ultimately ask the underlying HTTP server for that file. As you mentioned, you would need a 404 page configured to fetch that - but that requires configuration in the HTTP server (nginx etc.). So you are not just a static html+js+css+images deploy; you will always need server support. Another issue is that 4xx errors in the HTTP spec are treated differently than 2xx: most notably, browsers are NOT allowed to cache any 404 responses, no matter what response header your server sends. This ultimately means those /todos/123 bookmarks/hard-reloads will always trigger a full download of the page, even though it would otherwise be in the cache. And again, you would always need support in the web server to overwrite 404 pages, while the current NextJS output can just be deployed to something like github-pages or other webspace solutions.
Now, these are just the limitations I can think of, but there are probably more. And to be fair, why "break" the web this way, if you can just use query params: /todo?id=123. This solves all the quirks of the above solution, and is exactly what any server-side app (without JS) would look like, such as PHP etc.
> if a user were to bookmark /todos/123 or press reload in the browser, the browser would ultimately ask the underlying HTTP server for that file. As you mentioned, you would need a 404 page configured to fetch that - but that requires a configuration in the HTTP server (nginx etc.). So you are not just a static html+js+css+images deploy, you always will need server support.
> use query params: /todo?id=123. This solves all the quirks of the above solution, and is exactly what any server-side app (without JS) would look like, such as PHP etc.
We had PATH_INFO in virtually every http server since CGI/1.0 and were using it for embedding parameters in urls since SEO was a thing, if not earlier. Using PATH_INFO in a PHP script to access an ID was pretty common, even if it wasn't the default.
By way of example, here's a sample url from vBulletin, a classic PHP application <https://forum.vbulletin.com/forum/vbulletin-sales-and-feedback/vbulletin-pre-sales-questions/4387853-vbulletin-system-requirements>[0] where the section, subsection, topic id, and topic are embedded into the URL path, not the query string.
[0] https://forum.vbulletin.com/forum/vbulletin-sales-and-feedba...
Interesting. You can set up the server to respond with 200.html as the catch-all so the requests would return 200. There was some issue with it—can't remember what—which is why I switched to 404.html. After the initial load though the subsequent navigations would go through pushState so I think they'd be cached.
But I don't see this is as big of a problem. With this I can switch and choose—SSR dynamic pages or use hacky catch-all mechanism. For any reasonably large site you probably would SSR for SEO and other purposes. But for completely offline apps I have to do zero extra work to render them as is.
Personally, I much prefer route paths over query parameters, not just because query parameters look ugly but because they lose hierarchy. Also, you can't then just decide to SSR the pages individually, as they're now permanently fixed to the same path.
Maybe I misunderstood you, but I did dynamic routes/pages for Next and Astro static builds. Using Contentful or Storyblok as a CMS, where the editor defines the routes and the components/bloks per page. Basically, the projects had one slug like [...slug].
Routes and components per page are dynamically created while exporting Next or building Astro static pages. In both frameworks you create the pages/slugs via getStaticPaths. And if ISR/ISP is enabled, even new pages (that are not known during build time) are pre-rendered while running the server.
In Next it is called dynamic routes[1] and in Astro dynamic pages[2]. Catch all slugs in Next and Astro are [...slug] e.g..
[1] https://nextjs.org/docs/pages/building-your-application/rout...
[2] https://docs.astro.build/en/guides/routing/#example-dynamic-...
Maybe I am misunderstanding you, but isn't this what Astro's `getStaticPaths`[0] function is for?
[0]: https://docs.astro.build/en/guides/routing/#static-ssg-mode
I think since they used [todoId] in the example they mean a static page which does not exist at build time. Which both can do, it’s called ISG (or on-demand in the Astro docs), but it requires a server to work, or you can create a static route that accepts any parameters and pass those to JavaScript to run on the client.
Yeah, what the other commenter said. getStaticPaths still requires you to define the rendered routes at build time.
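A minimal sketch of that limitation, with a hypothetical todos route: whatever getStaticPaths returns at build time is the complete set of pages in a fully static build; anything else 404s unless you add a server or a catch-all.

```astro
---
// src/pages/todos/[todoId].astro (hypothetical): only the ids returned here
// become pages in a static build.
export function getStaticPaths() {
  return [
    { params: { todoId: '1' } },
    { params: { todoId: '2' } },
  ];
}
const { todoId } = Astro.params;
---
<h1>Todo {todoId}</h1>
```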
I’m also a big fan of static, and nextjs supports this: https://nextjs.org/docs/app/api-reference/functions/generate...
It doesn't. Those are executed at build time and you can't just set a wildcard, so anything outside the given set results in a 404.
As background, I wanted to make a PoC with NextJS bundled into a static CapacitorJS app somewhat recently and had to give up because of this.
You can try tricking NextJS by transforming the pages into "normal" ones with e.g. query parameters instead of paths, but then you need complicated logic changing the pages as well as rewriting links, as you of course want the normal routes in the web app. Just a huge PITA.
I might not be understanding you (or the various frameworks) completely, but are Astro's server Islands what you're looking for?
https://docs.astro.build/en/guides/server-islands/
Or, actually, https://docs.astro.build/en/guides/on-demand-rendering/
You can indeed do that with Next.js. In the app router, it’s called generateStaticParams. In the pages router, it’s getStaticProps.
This isn't the same, as getStaticProps is evaluated at build time, not runtime.
It's clear to me that the frontend conversation space is broken. Not even just the ecosystem being a mess.
Boiling down the conversation I see in the article, it just seems to be: the browser as an HMI vs the browser as an application runtime. Depending on what you want to do, one might be a better fit than the other. But the points it puts forward are fluff arguments like "it's a breath of fresh air" or "it loads faster".
It's difficult to articulate the source of just how broken the discussion space is; nor have I made a particularly strong argument myself. But I think it's important to keep pushing back on conversations that position frameworks like they are brands winning hearts and minds. Ala the fashion industry.
> Ala the fashion industry.
The fashion industry is the best analogy I've seen so far for frontend frameworks. It's obvious that the amount of technical rigor involved with declaring something "content-driven" and "server-first" is approximately zero.
Care to explain?
Astro is trying to position itself in opposition to things like Next.js or Nuxt, which are specifically marketed as application frameworks?
And the architecture is more suited to something like a content site, because of the content collections, built-in MDX support, SSR, image handling, and server routing?
> But the points it puts forward are fluff arguments like "it's a breath of fresh air" or "it loads faster".
Fluff arguments do exist, but you can also measure. The site is static with minimal JS on the one page, and a bit more JS on the other page, so nothing surprising in the numbers, and nothing you can say was achieved thanks to the magic of Astro, but I wanted to share them:
| Metric | Home page | Demos page |
|--------|-----------|------------|
| TTFB   | .024s     | .033s      |
| SR     | .200s     | .300s      |
| FCP    | .231s     | .281s      |
| SI     | .200s     | .200s      |
| LCP    | .231s     | .231s      |
| CLS    | 0         | 0          |
| TBT    | .000s     | .000s      |
| PW     | 108KB     | 174KB      |
"Loads faster" is fluff? How so?
Write some straight HTML pages and serve it from bog standard Apache. Heck, get really fancy and do some server-side includes for your CSS or something.
It's really fast, you can edit it with Notepad, and you can probably saturate your bandwidth with a consumer level PC.
It's fluff because, well, our expectations are so unbelievably low. By the time you've bolted on every whizbang dingus leveraging four different languages (two of which are some flavor of Javascript), your twelve page site takes a couple of minutes to compile (what?), and it chokes your three load-balanced AWS compute nodes.
Web applications are hard. I get that. Web sites? They were, by design, incredibly simple. We make them complicated for often unclear reasons.
I appreciate what the Astro folks are trying to do, and it's very clever. But your basic Web site need not require freaking npm in order to "return to the fundamentals of the Web".
I recently built a website for a medical practice using Astro.
I was amazed by how easy it was compared to my experience with Wordpress for this several years ago.
And I can host it for free on something like Netlify and I don’t need to worry about the site being hacked, like with WP.
I even built a very simple git-based CMS so that the client can update the content themselves.
Web dev has really come a long way, despite what a lot of people say.
Were you commissioned by a friend? How does one find a gig building a medical practice website?
I’m a doctor myself and thus have contacts with people who run medical practices :)
But at least in Germany there are some agencies doing nothing else.
Be careful with Netlify. Their bandwidth charges are even more egregious than Vercel’s.
> Be careful with Netlify. Their bandwidth charges are even more egregious than Vercel’s.
$550/TB for those who want to save a search.
Yes, good point. Although the traffic for these websites is so small that I think I'm good there for a long time.
I think the main benefit is that you are not forced to use any other library or framework like React or Vue on top of it. You can simply use HTML or Web Components. However, Astro can perform similar tasks to Next or Nuxt, such as SSR, ISR, static site generation, and middleware.
Another difference and benefit of Astro, compared to other frameworks, is the island architecture. This means you can implement micro frontends. Island architecture and micro frontends are features that companies or projects may want if they have multiple teams. For example, one team could be working on the checkout process, another on the shopping basket, and another on product listings.
Now, you can use Astro to combine these components on a single route or page. And you control how these components are rendered. Astro also allows you to share global state between these islands.
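The usual pattern for that shared state is a tiny framework-agnostic store (the Astro docs point at nanostores); a minimal sketch with a made-up cart store:

```ts
// src/stores/cart.ts: any island can import this, regardless of which
// framework renders it, and they all see the same state.
import { atom } from 'nanostores';

export const cartCount = atom(0);

export function addToCart() {
  cartCount.set(cartCount.get() + 1);
}
```

Each island then subscribes through the framework-specific helper package (@nanostores/react, @nanostores/vue, and so on).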
This approach is beneficial because teams can develop and ship a single feature while having full responsibility for it. However, it also has downsides, and similar outcomes can be achieved with non-island architectures.
For instance, if all teams use React, it is common for each team to use a different version of React, forcing the browser to load all these versions. The same issue arises if one team uses Vue, another uses Angular, and another uses React or any other framework.
I'm not fully convinced that it will change the web. It is basically a Next or Nuxt without the library/framework lock-in. And it offers the island architecture, which is usually only beneficial for very huge projects.
But you should try it. I have worked with Astro since its first release, now for several years, and I can recommend giving it a try.
It is also a nice tool if you want to get rid of React or Vue and move to web components, or if you want to replace Next or Nuxt. You can do this with Astro, step by step.
To me no, it's not. It works well for some of the use cases, but if all you needed was offline rendering of your js in a build step to generate static html then you really didn't need all that js to begin with. islands work until they don't, and a lot of stuff gets inlined too. I guess it's fine if you stop caring about the final build.
I feel a lot of the hype around Astro has more to do with vite than anything else. And there yes, without doubt, vite is amazing.
> islands work until they don't
Like when?
Not the op, but I’d guess islands become a PITA when there is user-visible state that must be synchronized and rendered coherently across multiple islands.
Setting `client:load` and `client:visible` for (svelte) islands you want to run on the client ends up inlining script type modules all around the html. It looked like a big hack to me.
On the positive side their use of web components is a nice bet.
Please stop recommending Next.js as the de facto React framework, we need some critical thinking back into front-end. Remix (React Router v7) or TanStack are much better alternatives.
Second this. Next.js had potential, but it feels like it's gone downhill majorly since Vercel got involved.
Been on the Next.js journey since v10, lived through the v13 debacle and even now on v15, I've very much cooled on it.
I find both React and Next.js move way too fast and make incredibly radical changes sub-annually. It's impossible to keep up with. Maybe it could be justified if things improved from time to time, but often it just feels like changes for changes' sake.
Remix/React Router v7 was/is on the right path. I hope whatever they are planning with Remix, using Preact and web standards, will bring back the robust way of building websites.
I did not like how the Remix to RR7 transition was made though; my project built using Remix was not an easy upgrade, and I am rewriting a lot of it on RR7 now.
Absolutely agree, the Remix/RR7 move was awkward and they have a thing for breaking changes, TanStack seems more reasonable on that part.
Please stop recommending React as the de facto framework, we need some critical thinking back into front-end. HTML, CSS and JS are much better alternatives.
Fundamentals of the Web haven't gone away, anyone still coding across PHP, Spring, Quarkus, ASP.NET MVC hasn't noticed that much how bad things have become with JS frameworks.
Unfortunately in fashion driven industry, it isn't always easy to keep to the basics.
I spent a small amount of time looking into Astro and I didn't get the difference from the Fresh framework created by the Deno team. Fresh does this island architecture already, and the benchmarks on the Astro website don't include Deno+Fresh to compare. So I'm still wondering what's the benefit of using Deno+Astro vs. Deno+Fresh.
Astro and Fresh were both inspired by the islands idea which was iirc coined by an Etsy frontend architect and further elaborated on by the creator of Preact.
My understanding is that Astro is able to more-or-less take a component from any combo of popular frameworks and render it, whereas Fresh is currently limited to just Preact via Deno. I think the limitation is to optimize for not needing a build step, and not having to tweak the frameworks themselves like Astro does (did?).
I'm not affiliated; I've just looked at both tools before.
There are plenty of differences. Eg Fresh can only run on Deno and can only use Preact.
Sure but I’m trying to see if Astro goes further in the optimization
But does it have a CMS? The author said they replaced WordPress sites with Astro, and on https://astro.build/ I see a comparison to WordPress.
Does Astro bring a friendly UI to maintain and update the sites, like the WordPress panel and editor?
It doesn't, which is why all these solutions break down long-term compared to things like WP for small-biz brochure stuff. 5 to 10 years from now, when you're no longer talking to your client who has absolutely no technical experience, they're not gonna know that their website code is in some random GitHub repository that needs to be compiled with vite and that you then need to magically wait for Netlify/etc to pull in your changes. They'll probably be fuming that they have to find a developer who knows how to edit and manage that, compared to something like WordPress, which is used for the majority of those websites.
For a small business that simply displays an informational site, but occasionally needs to add content, a post on a blog or a change in the text or similar, it is sufficient to build a simple infrastructure of reading some markdown files and let them SFTP some markdown files and have periodic backups. One could also go as far as automatically committing those user uploaded files to a repo. None of which is difficult to do. It does get more complicated, if the small business wants users to be able to message them through the website, not just via e-mail. Or if the business suddenly wants to have a shop functionality on the site. Then CMSs start to slowly become an option.
Many medium businesses don't even need that btw. In many instances marketing people just want to have control over websites, that they should not be given control over, since they usually are incapable of knowing the impact of what they are doing, when they add something like Google tagmanager to their site. They also tend to often break things, when using Wordpress, because they do not understand how it works under the hood, so that side of things is also not without hassle. And then the devs are called to fix what marketing broke. Even with Wordpress. At that point it would often be easier to let the devs build a non-Wordpress site, and any ideas about things that are not just content in markdown files need to be requests for the dev team to evaluate, and possibly work on, when deemed safe and lawful.
Sadly the power dynamics in businesses are often stacked against conscientious developers.
It would be cheaper to hire a dev to update some content than to pay a wp host for 5-10 years.
Not sure if anything major has changed recently, but I think you bring your own CMS. I've used Astro successfully with Strapi, for example.
If they need to bring in a CMS to edit content and juggle assets they could have used Wordpress in headless mode, which the customer was already used to…
Eg. https://www.gatsbyjs.com/docs/glossary/headless-wordpress/
> When I say Astro sites are fast, I mean properly fast. We're talking 40% faster load times compared to traditional React frameworks.
That's a really low bar. Why not static pages? Why even use a framework at all if you're thinking of using Astro?
By default, Astro generates static pages. So it makes sense to compare it to an approach that doesn't.
Using a framework has upsides over writing static pages manually. Most notably, you can decompose your website into reusable components which makes your implementation more DRY. Also, you can fluently upgrade to a very interaction-heavy website without ever changing tech or architecture. But that's just what I value. I whole-heartedly recommend trying it out.
What about static site generators, in that case?
How do you do server-side rendering without a framework?
If you use static pages, how do you make sure that shared UI like navbars all update if you decide to make a change?
Pretty much every static site generator has an option of splitting a page into reusable components, so something like:
Some even allow you to pass variables, so something like:

Static site generators?
"The fundamentals of the web" does not include a build step or the use of a package manager.
Backend stuff is orthogonal to web fundamentals. What matters is ssr html.
Still, yes, I prefer other tooling in the backend. But astro is a good thing for JS devs
My personal website is written with Astro. I love it! Admittedly I'm still a little trigger-happy with my JS usage, but overall the ability to write templated pages and utilize islands that don't require big dependencies is the most friendly web development experience I've had.
https://evklein.com
Your website isn't loading for me.
Give it another shot. Azure static web apps are no bueno when it comes to reliable hosting...
neither for me
Try again, I had to adjust my hosting plan.
Astro is a fantastic project that I would probably use more if SvelteKit didn’t exist (but it does, so Astro is always a downgrade).
Sveltekit doesn’t have islands. That single feature puts Astro ahead in situations where you don’t want an SPA.
I was all in on SvelteKit until I noticed weird CSS issues thanks to :global.
I tried Astro, but it's not for me. I was upgrading my personal website from Jekyll to Astro a couple of years back. It was a steep learning curve with lots of folders. Things used to break all the time. Then I switched to raw html and css.
The title change led to a bit of an unexpected jolt; I assumed I'd clicked on the wrong link. I'm not sure where that falls on the guidelines though, given the circumstances.
Could someone compare it like I'm 5 to static site generators like Hugo, Jekyll? Does it make it easier to throw in necessary JS (e.g. for comments)? Thanks.
Astro is, and should be treated as, a static site generator.
> Does it make it easier to throw in necessary JS (e.g. for comments)?
With astro you can combine html, css and js in a single file (.astro). You write plain JS (TypeScript) within a <script> tag. There you can, e.g., import your comment library, point to a separate .js/.ts file, or write whatever logic you want for client-side JS.
See the docs for example JS usage in astro components:
https://docs.astro.build/en/guides/client-side-scripts/#web-...
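A minimal sketch of what that looks like in a single .astro file; the comment-loading bit is hypothetical, just to show where the client-side code lives:

```astro
---
// Runs at build time (or on the server): never shipped to the browser.
const title = 'Comments demo';
---
<h1>{title}</h1>
<div id="comments"></div>

<script>
  // Bundled by Astro and shipped to the browser.
  const el = document.querySelector('#comments');
  if (el) el.textContent = 'Comments would load here.';
</script>
```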
Coming from Go, I don’t enjoy working with the Go or other template engines. I have compared various 3rd-party Go template libraries and settled on JSX-like syntax, which is just way easier.
You should try it out instead of just comparing.
I've become pretty convinced everyone working in software should have something like a static site generator somewhere in their back pocket. It could be Astro, Hugo (my choice), or even pandoc running in server mode, which is unhackable because you would have to first understand the Haskell type system to hack it. But it should be there for all the little times you just want to spin up something fast, cheap, and content-driven.
For me, a return to web fundamentals would include not having to use NPM.
Next.js hydrates only client components - so effectively it's doing island architecture. And it's react end to end. How's that different from Astro? Stating things like "Components without the complexity" doesn't really mean anything unless you do some comparisons?
Client vs server routing
I ported my personal website from Jekyll to Astro a few weeks back and I really liked it. Astro is much easier to build and extend for me (and that is a personal preference point - and by "I" I mostly mean Claude), and it's cool to add React components in to create more interactive parts (but I haven't deployed that element yet).
Speed is probably the same as jekyll - but relative to my react vite and nextjs apps it's about 10 times faster.
I would definitely use Astro for more complicated websites than content-driven ones - but would probably return to nextjs or more hefty full-stack solutions for complicated web apps.
Do you have an intuition of when something becomes hefty and complicated, and would require a full stack solution like Next.js?
I do not but I think I'm growing one!
Potentially the heuristics would be about the level of user state management - e.g. if you're needing to do various workflows vs just presenting content.
What I don't understand from these conversations is the main selling proposition for Astro: if you have a content-heavy website, you should ship zero JavaScript. And I agree with that; most websites heavy in content will at most have a couple of pages consumed in a single session.
But if my "website" is an application, Javascript makes the whole user experience better, if implemented well. It doesn't matter that the user will wait for 1 more second if they will have to spend the entire day working on it.
The selling point is that you can ship JS, but only when necessary, and only scoped to the components that need it. It's not at all about removing the framework entirely.
I recently migrated my blog from WordPress to Astro (https://aligundogdu.com). The development process was genuinely enjoyable, and the performance boost especially in SEO and speed scores was a fantastic bonus.
I do not want to go anywhere near the "fundamentals of the web." E.g. html forms are such a badly designed mess; just look at the checkbox form field.
But don't you need to understand the platform you're building on, warts and all?
How else can you fully grasp what's possible on that platform and the costs of different abstractions?
I don't think so. If your framework is high level enough, you've successfully outsourced the bad stuff to someone else.
Astro is great and I had a good experience using it. However, I still don't like the complexity needed. I don't mind a build step per se, but I'd just like to know that today's version of my site can still build in ~5 years.
Tanstack Start is another nice alternative, if you need a bit more, but don't want to go with Next.js.
I have returned to the fundamentals of the web and there is no sign of Astro here let alone JavaScript.
I've only heard of Astro before, but I got interested today and it seems like an intriguing framework.
That said, Astro also seems to be developed under a venture-backed company. Is it still less likely to end up like Next.js and React under Vercel's influence?
Gotta say, even as a programmer, my "f* dreams" don't involve programming.
Does this have any advantage over something like Hugo, for a static only website?
>> See that code fence at the top? That runs at build time, not in the browser. Your data fetching, your logic - it all happens before the user even loads the page. You get brilliant TypeScript support without any of the complexity of hooks, state management, or lifecycle methods.
This is satire, right? If only there was any other server side language that could do the same and produce static compliant super-light HTML-first pages!
How is this different from something like PHP? I don't get it.
Reminds me of Zola (https://www.getzola.org/), which was a step up from Hugo for me.
How does this compare to a static site generator like 11ty? What would be the benefits of Astro over that or another site generator?
Just curious if non-techy folks might ask whether Astro is reliable enough for production use, such as e-commerce, marketing sites, etc.
> Traditional frameworks hydrate entire pages with JavaScript. Even if you've got a simple blog post with one interactive widget, the whole page gets the JavaScript treatment. Astro flips this on its head. Your pages are static HTML by default, and only the bits that need interactivity become JavaScript "islands."
I guess I'd argue "Traditional Frameworks" were the ones that never stopped doing this. Laravel, Django, Rails etc. Then the SPA frameworks came along and broke everything.
Also - what on earth is "f*"? I originally assumed it was shorthand for "fuck" but is "fuck dream" a common expression? And wouldn't you normally write it as "f***"?
Why would you SSR with JS when there's Go and Rust? Why? Why waste so many resources on purpose?
Astro didn't introduce island architecture.
it says static html and css but open the site and first thing i saw is an npm command. yuck.
Although Astro is great and all, it collects telemetry by default and requires an opt-out. Zola and 11ty, on the other hand, don’t do such nonsense and have way smaller dependency footprint.
Is this not similar to htmx?
Return? It never left for many of us
Seriously. This is how things are done in most non-JS frameworks
The island concept is really exactly what PHP has been doing for decades
I misread the title and was quite confused about when the F* programming language connection will take a stage in the article. Spoiler alert: it never does, because the title doesn't make a reference to the F* programming language :D
> I've found Astro perfect for marketing sites, blogs, e-commerce catalogues, and portfolio sites
Basically, not suitable for anything complex.
>See that code fence at the top? That runs at build time, not in the browser. Your data fetching, your logic - it all happens before the user even loads the page.
I can't with these goddamn LLM blog posts; it just drowns everything.