If you just want to call C code, you don't have to translate it. The D compiler recognizes C files and will run its very own internal C compiler (ImportC) to compile it. As a bonus, the C code can use data structures and call functions written in D! The compatibility goes both ways.
This affects static libc only. If you pass -dynamic -lc then the libc functions are provided by the target system. Some systems only support dynamic libc, such as macOS. I think OpenBSD actually does support static libc though.
> I think OpenBSD actually does support static libc though.
How does that work, with syscalls being unable to be called except from the system’s libc? I’d be a bit surprised if any binary’s embedded libc would support this model.
For static executables, “the system’s libc” is of course not a thing. To support those, OpenBSD requires them to include an exhaustive list of all addresses of syscall instructions in a predefined place[1].
(With that said, OpenBSD promises no stability if you choose to bypass libc. What it promises instead is that it will change things in incompatible ways that will hurt. It’s up to you whether the pain that thus results from supporting OpenBSD is worth it.)
> How does that work, with syscalls being unable to be called except from the system’s libc?
OpenBSD allows system calls being made from shared libraries whose names start with `libc.so.' and all static binaries, as long as they include an `openbsd.syscalls' section listing call sites.
Good point. C's "freestanding" mode, analogous to Rust's no_std, does not provide any functions at all, just some type definitions and constants which obviously evaporate when compiled. Rust's no_std can not only compute how long a string is, it can unstably sort a slice, do atomic operations if they exist on your hardware, and lots of other fancy stuff; as a consequence, even no_std has an actual library of code behind it (a similar but maybe less organized situation occurs in C++). Most of the time this is simply better: why hand-write your own crap sort when your compiler vendor can just provide an optimised sort for your platform? But on very, very tiny systems this might be unaffordable.
Anyway, C doesn't have Rust's core versus std distinction and so libc is a muddle of both the "Just useful library stuff" like strlen or qsort and features like open which are bound to the operating system specifics.
That just reminds me: does anyone know whether Rust has something similar? Not wanting to start any Rust v. Zig debate. I just want to be even more independent when it comes to some of my Rust projects.
This is very exciting for zig projects linking C libraries. Though I'm curious about the following case:
Let's say I'm building a C program targeting Windows with MinGW & only using Zig as a cross compiler. Is there a way to still statically link MinGW's libc implementation or does this mean that's going away and I can only statically link ziglibc even if it looks like MinGW from the outside?
If you specify -target x86_64-windows-gnu -lc then some libc functions are provided by Zig, some are provided by vendored mingw-w64 C files, and you don't need mingw-w64 installed separately; Zig provides everything.
You can still pass --libc libc.txt to link against an externally provided libc, such as a separate mingw-w64 installation you have lying around, or even your own libc installation if you want to mess around with that.
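For context on what that file contains: `zig libc` prints a template for the current target, and the file that `--libc` consumes is a small key=value list of paths. The exact keys have shifted between Zig versions, so treat this as a rough sketch (paths are examples for a typical Linux system):

```
# The directory that contains `stdlib.h`.
include_dir=/usr/include

# The system-specific include directory (e.g. where `sys/errno.h` lives).
sys_include_dir=/usr/include/x86_64-linux-gnu

# The directory that contains `crt1.o`.
crt_dir=/usr/lib/x86_64-linux-gnu

# Only needed when targeting MSVC on Windows; may be left blank otherwise.
msvc_lib_dir=
kernel32_lib_dir=
gcc_dir=
```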
That's cool. I imagine I could also maintain a MinGW package that can be downloaded through the Zig package manager and statically linked without involving the zig libc? (Such that the user doesn't need to install anything but zig)
That's a good way to sell moving over to the zig build system, and eventually zig the language itself in some real-world scenarios imo.
It's completely possible to implement printf. Here is my impl (not 100% correct yet) of snprintf for my custom libc, implemented on top of a platform I'm working on: <https://zigbin.io/ab1e79>
The va_arg stuff is extern because Zig's va_arg support is pretty broken at the moment.
Here's a C++ game ported to web using said libc running on top of the custom platform and web frontend that implements the platform ABI <https://cloudef.pw/sorvi/#supertux.sorvi> (you might need javascript.options.wasm_js_promise_integration enabled if using firefox based browser)
yeah I just thought there are "compiler shenanigans" involved with printf! zig's va arg being broken is sad, I am so zig-pilled, I wish we could just call extern "C" functions with a tuple in place of va arg =D
The only thing C compilers do for printf is statically analyze the format string for API usage errors. Afaik that isn't possible in Zig currently. But I don't know why you'd downgrade yourself to using the printf interface when std.Io.Writer has a `print` interface where fmt is comptime and args can be reflected, so it catches errors without special compiler shenanigans.
Cool idea, for sure, but I can't help but wonder: for the code that's been ported, is there a concern that you'd have to perpetually watch out for CVEs in glibc/musl and determine if they also apply to the Zig implementations?
> It’s kind of like enabling LTO (Link-Time Optimization) across the libc boundary, except it’s done properly in the frontend instead of too late, in the linker
Why is the linker too late? Is Zig able to do optimizations in the frontend that, e.g., a linker working with LLVM IR is not?
Seems like it ought to be able to do inlining and dead code stripping which, I think, wouldn't be viable at link time against optimized static libraries.
Right, but I think that's what the question of "Why is the linker too late?" is getting at. With zig libc, the compiler can do it, so you don't need fat objects and all that.
---
expanding: so, this means that you can do cross-boundary optimizations without LTO and with pre-built artifacts. I think.
I will say first that C libcs do this - the functions are defined inline in header files - but this is mainly a pre-LTO artifact.
Otherwise it has no particular advantage other than disk space, it's the equivalent of just catting all your source files together and compiling that.
If you think it's better to do it in the frontend, cool: you could make it so all the code gets seen by the frontend by fake-compiling all the stuff, writing the original source into a special section of the object file, and then making the linker really call the frontend with all those special sections.
You can even do it without the linker if you want.
Now you have all the code in the frontend if that's what you want (I have no idea why you'd want this).
It has the disadvantage that it's the equivalent of this, without choice.
If you look far enough back, lots of C/C++ projects used to do this kind of thing when they needed performance in the days before LTO, or they just shoved the function definitions in header files, but stopped because it has a huge forced memory and compilation speed footprint.
Then we moved to precompiled headers to fix the latter, then LTO to fix the former and the latter.
Everything old is new again.
In the end, you are also much better off improving the ability to take lots of random object files with IR and make it optimize well than trying to ensure that all possible source code will be present to the frontend for a single compile. Lots of languages and compilers went down this path and it just doesn't work in practice for real users.
So doing stuff in the linker (and it's not really the linker, the linker is just calling the compiler with the code, whether that compiler is a library or a separate executable) is not a hack, it's the best compilation strategy you can realistically use, because the latter is essentially a dream land where nobody has third party libraries they link or subprojects that are libraries or multiple compilation processes and ....
Zig always seems to do this thing in blog posts and elsewhere where they add these remarks that often imply there is only one true way of doing it right and they are doing it.
It often comes off as immature and honestly a turnoff from wanting to use it for real.
As I understand it, compiling each source file separately and linking together the result was historically kind of a hack too, or at least a compromise, because early unix machines didn't have enough memory to compile the whole program at once (or even just hold multiple source files in memory at a time). Although later on, doing it this way did allow for faster recompilation because you didn't need to re-ingest source files that hadn't been changed (although this stopped being true for template-heavy C++ code).
"Furthermore, when this work is combined with the recent std.Io changes, there is potential for users to seamlessly control how libc performs I/O - for example forcing all calls to read and write to participate in an io_uring event loop"
This is exciting! I care more about kqueue myself, but I guess the quote applies to it too.
Does anyone know if there is a timeline on when Zig might achieve 1.0? I've been interested in the language for a while, but I'm a bit concerned about writing anything important in it when it seems to be evolving so much at the moment
Nobody knows, but for what it's worth, existing large projects that are used in production environments have been fairly good at keeping up with Zig releases. See: Bun, Ghostty, and Tigerbeetle for good examples of this. Because the semantics of Zig are relatively simple, porting to the latest version is usually as simple as bumping your compiler version, trying to build, making a fairly mindless, mechanical change, and repeating until it builds.
The biggest thing holding me back from using Zig for important projects is the willingness of my peers to adopt it, but I'm just building projects that I can build myself until they are convinced :)
I expect a lot of C code may be quite mechanically translated to Zig (by help of LLMs). Unlike C->Rust or C->C++, where there's more of a paradigm shift.
There's solid reason for the translation here; the Zig core team is aiming to eliminate duplicated code and C functions, and avoid the need to track libc from multiple sources. In the future, LLMs could serve as part of this, but they are currently quite terrible at Zig (as far as I understand it, it's not a lack of Zig code samples, it's an imbalance of OLD Zig to NEW Zig, as Zig changes quite frequently).
You would need to consider if it is even worth it translating your C code. If the paradigm is identical and the entire purpose would be "haha it is now one language," surely you could just compile and link the C code with libzigc... In my opinion, it's not worth translating code if the benefit of "hey look one language" requires the cost of "let's pray the LLM didn't hallucinate or make a mistake while translating the code."
The very same day I sat at home writing this devlog like a coward, less than five miles away, armed forces who are in my city against the will of our elected officials shot tear gas, unprovoked, at peaceful protestors, including my wife.
This isn't some hypothetical political agenda I'm using my platform to push. There's a nonzero chance I go out there next weekend to peacefully protest, and get shot like Alex Pretti.
Needless to say, if I get shot by ICE, it's not good for the Zig project. And they've brought the battle to my doorstep, almost literally.
Andy stay safe. We gotta all come to realization that none of this is possible if we let our democracy slip away. Millions before us died to preserve it. We owe it to them to put up a good fight.
My 85 year old mom lives in Portland and she attends rallies frequently. If you know of any way to support you or other local people doing this work, I'm very interested. My email is on my profile page.
I have a friend who is in Minneapolis. He's involved in caravans which are tracking ICE. He wasn't the driver in the last one. But, the ICE vehicle they tailed suddenly started going in a very direct path, instead of randomly driving. The driver figured it out first. They drove to the driver's house and then stood outside of their car for ten minutes staring at his house. Cars in Minnesota have their license plates on both the front and the back.
Is there any justification for that kind of intimidation? Did any of the Trump supporters vote for that? I hear about paid agitators on the left but not that kind of compensated actors. Is his name in a database now once they did the lookup?
In the Federal model of US government, state authority overrides centralized government except in the explicit cases enumerated by the Constitution.
So yes, of course they mean their local officials, because in this case there isn’t an explicit line in the Constitution explaining why the feds are allowed to invade Minnesota.
The Supreme Court has disagreed with you on the matter of federal immigration constitutional authority for more than a century. There isn’t any “invasion”; that’s a propaganda device.
I like how you call it peaceful protest when people throw huge rocks, break city infrastructure, damage property, and take zero accountability for it. And most likely don't pay taxes to fix it up later.
How convenient it must be to blame officers instead of bad actors just because you agree with their side.
This is purely pushing a political agenda; you're just covering it up.
Since you're so eager to construe his support for peaceful protest as support for civil unrest, I therefore think it's fair if I construe your defence of ICE to mean support for their extrajudicial executions and the people who dress up as ICE (ie: masked men dragging people at gunpoint into unmarked vans) to kidnap and rape people.
I can't hold it so had to create an account to share, I'm sorry. I'm one of the minor zig contributors, and I'm reading ziglang blog for the purpose of engagement in software engineering craft.
I don't want to see these ICE stuff or whatever else political opinion you or somebody else have. I'm not from US and I barely know what ICE is but you're hating on people (I'm sure you think it's deserved, as with any hate) and I assume you may hate me at some point because I do something you don't share or like (like this comment for example).
Thinking that the creator of Zig may hate me takes a lot of the fun out of using the language, let alone contributing to it or the areas surrounding it. What if tomorrow people with tattoos in a particular spot are hated in the media and you're posting "Abolish people with tattoos"? Not the best comparison, but I hope you get why I feel scared of engaging with the community now.
I think you have a big responsibility for maintaining a community of people with different political opinions, and you are definitely free to share yours on your personal blog. But you chose to do it in a community-driven project, as the lead of that project. And it's not the first time. It's a bit different. For me at least.
Also the fear is what made me create this new account, I'm not a bot or something like that. I'm just afraid due to many (political) reasons and I want to find peace in playing with computers and one of these safe places was just taken from me, which you probably have the right to do but you could've avoided it. You're not the only one. There are many projects like this who mention Gaza, Ukraine, Russia, Israel, all these stuff. It's getting less and less projects to engage with (again, for me, I think it works well for those projects as they attract people they like).
I'm sorry you have to suffer and see people's deaths. Me too. I understand it's difficult to hold this stuff inside. As you can see, I couldn't either. But I hoped you were stronger than me.
The world demonstrates in many instances, that you do not have to have empathy with people suffering from oppression, rape, murder, etc in order to "succeed" in terms of wealth and power.
Meaning: if you can't accept that someone publishing words/code/etc on the web at the same time also offers their own strong opinions (that you directly claim to be hate) about their own such issues, there's plenty of "communities" in which this kind of unempathetic approach to other people and their lives is celebrated and normalized.
If you barely know what ICE is, how can you claim his opinions to be "hate"? How can you claim that Andrew may hate you without thinking you identify with what you understand about ICE?
What ICE does is unmistakably fascistic and authoritarian, far beyond the powers they have been granted by law and democratic processes. It's utterly disgusting to try and compare protesting and fighting against that with "abolish people with tattoos".
ICE is an institution, a government agency among a dozen+ law enforcement agencies in the US. You compare advocating for abolishing it through democratic process (what Andrew expressed) with calling for the murder of many millions of people with a private hobby.
And while Andrew may have some responsibility towards the community he founded; if he has the responsibility to include different political opinions, he most certainly has the responsibility to exclude fascism. Fascism is the destruction of different opinions, it is not a political opinion that can stand among others and be compared on the same basis: that of human rights at the minimum.
Ask yourself and reflect: why does this very simple and inoffensive call by Andrew make you scared, especially if you don't know what ICE is and does? Could you have been influenced into this feeling? It is certainly not a rational reaction to a few characters of text viewed on a screen.
This strikes me as a very agent-friendly problem. Given a harness that enforces sufficiently-rigorous tests, I'm sure you could spin up an agent loop that methodically churns through these functions one by one, finishing in a few days.
Have you ever used an LLM with Zig? It will generate syntactically invalid code. Zig breaks so often and LLMs have such an eternally old knowledge cutoff that they only know old ass broken versions.
The same goes for TLA+ and all the other obscure things people think would be great to use with LLMs, and they would, if there was as much training data as there was for JavaScript and Python.
i find claude does quite well with zig. this project is like > 95% claude, and it's an incredibly complicated codebase [0] (which is why i am not doing it by hand):
[0] generates a dynamically loaded library which does sketchy shit to access the binary representation of datastructures in the zig compiler, and then transpiles the IR to zig code which has to be rerun to do the analysis.
To be fair, this was true of early public LLMs with rust code too. As more public zig repositories (and blogs / docs / videos) come online, they will improve. I agree it's a mess currently.
A little bit! I wrote a long blog post about how I made it, I think the strategy of having an LLM look at individual std modules one by one make it actually pretty accurate. Not perfect, but better than I expected
Try it again. This time do something different with CLAUDE.md. By the way it's happy to edit its own CLAUDE.md files (don't have an agent edit another agent's CLAUDE.md files though [0])
250 C files were deleted. 2032 to go. Watching Zig slowly eat libc from the inside is one of the more satisfying long term projects to follow
That's something I've always admired about Zig.
A lot of languages claim to be a C replacement, but Zig is the second language I've seen that seemed like it had a reasonable plan to do so at any appreciable scale. The language makes working with the C ABI pretty easy, but it also has a build system that can seamlessly integrate Zig and C together, as well as having a translate-c that actually works shockingly well in the code I've put through it.
The only thing it didn't do was be 99% compatible with existing C codebases...which was the C++ strategy, the first language I can think of with such a plan. And frankly, I think Zig keeping C's relative simplicity while avoiding some of the pitfalls of the language proper was the better play.
D can import C files directly, and can do C-source to D-source translation.
D can compile a project with a C and a D source file with:
Do you have to bring up D in every Zig related post?
I do like D. I've written a game in it and enjoyed it a lot. I would encourage others to check it out.
But it's not a C replacement. BetterC feels like an afterthought. A nice bonus. Not a primary focus. E.g. the language is designed to use exceptions for error handling, so of course there's no feature for BetterC dedicated to error handling.
Being a better C is the one and only focus of Zig. So it has features for doing error handling without exceptions.
D is not going to replace C, perhaps for the same reasons subsets of C++ didn't.
I don't know if Zig and Rust will. But there's a better chance since they actually bring a lot of stuff to the table that arguably make them better at being a C-like language than C. I am really hyped to see how embedded development will be in Zig after the new IO interface lands.
He doesn't have to, he _gets_ to! It's knowledge exchange. Take it as a gift and not self-promotion. There's no money in this game, so don't treat it like guerrilla marketing. Treat it like excited people pushing the limits of technology.
I think the history of D having a garbage collector (and arguably exceptions / RTTI) from the beginning really cemented its fate. We all know that there's a "BetterC" mode that turns it off - but because the D ecosystem initially started with the GC-ed runtime, most of the D code written so far (including most of the standard library) isn't compatible with this at all.
If D really wants to compete with others for a "better C replacement", I think the language might need some kind of big overhaul (a re-launch?). It's evident that there's a smaller, more beautiful language that can potentially be born from D, but in order for this language to succeed it needs to trim down all the baggage that comes from its GC-managed past. I think the best place to start is to cleanly remove GC / exception handling / RTTI from the language, rewrite the standard library to work with BetterC mode, and probably also change the name to something else (it needs a re-brand...)
My post was not about betterC, it was about the super easy interoperability of C and D. This capability has been in D for several years now, and has been very popular as there's no longer a need to write an adapter to use C source code. The ability to directly compile C code is part of the D compiler, and is known as ImportC.
One interesting result of ImportC is that it is an enhanced implementation of C in that it can do forward references, Compile Time Function Execution, and even imports! (It can also translate C source code to D source code!)
C++ is more C-like than Zig and Rust, so it's more likely to become a C replacement.
I do feel like allowing for in-place source upgrading was critical to C++'s early successes. However, I feel like this ultimately worked against C++, since it also wed the language to many of C's warts and footguns.
C++ cannot seem to let go of the preprocessor, which is an anchor hurting the language at every turn.
BTW, in my C days, I did a lot of clever stuff with the preprocessor. I was very proud of it. One day I decided to replace the clever macros with core C code, and was quite pleased with the clean result.
With D modules, imports, static if, manifest constants, and templates the macro processor can be put on the ash heap of history. Why doesn't C++ deprecate cpp?
This is, like, the most ironic comment ever posted on HN. An article about cat nutrition could hit the front page and the Rust fanbois would hijack the conversation.
In this case, however, Walter was not the one that brought up D. He was replying to a comment by someone promoting Zig with the claim that only Zig and C++ have ever had a strategy to replace C. That is objectively false. There's no way to look at what D does in that area and make that sort of claim. Walter and anyone else is right to challenge false statements.
> claim that only Zig and C++ have ever had a strategy to replace C
What I actually said was that it was the second language I have seen to do so at any appreciable scale. I never claimed to know all languages. There was also an implication that, even if a language claims to be a C replacement, its ambition might exceed its ability to actually do so.
That said I also hold no ill will towards Walter Bright, and in fact was hoping that someone like him would hop into the conversation to try and sell people on why their language was also worthy of consideration. I don't even mind the response to Walter's post, because they bring real-world Dlang experience to the table as a rebuttal.
On the other hand, I find it difficult to find value in your post except as a misguided and arguably bad-faith attempt to stir the pot.
No, he never stated that "only Zig and C++ have ever had a strategy to replace C"; you made that up. And despite your claim that "Walter was not the one that brought up D", he actually was.
Did the text get changed? Because it seems you claim exactly the opposite of what is written about five sentences up, so it can't be credited to "misunderstanding" either.
I also didn't find any "D evangelism" comments in his history (first page), but then again, he has 78801 karma points, so I'm not going to put energy into going through his entire online history.
This is a bad comment in so many ways.
Walter's short limited comment was quite relevant.
> C-source to D-source translation.
I'm not so familiar with D, what is the state of this sort of feature? Is it a built-in tool, or are you talking about the ctod project I found?
In most languages, I've found source-translation features to be woefully lacking; they almost always require human intervention. By contrast, it feels like Zig's `translate-c` goes the extra mile in trying to convert the source to something that Zig can work with as-is. It does this by making use of language features and compiler built-ins that are rather rare to see outside of `translate-c`.
Obviously the stacks of @as, @fooCast, and @truncate you are left with aren't idiomatic Zig, but I find it easier to start with working, yet non-idiomatic code than with 90%-working code that merely underwent a syntactic change.
It's hardwired into the D compiler binary. It will even translate C macros into D code!
Well, most macros. The macros that do metaprogramming are not translatable. I read that Zig's translator has the same issue, which is hardly surprising since it is not possible.
So, yes, the translation is not perfect. But the result works out of the box most of the time, and what doesn't translate is easily fixed by a human. Another issue is every C compiler has their own wacky extensions, so it is impractical to deal with all those variants. We try to hit the common extensions, though.
If you just want to call C code, you don't have to translate it. The D compiler recognizes C files and will run its very own internal C compiler (ImportC) to compile it. As a bonus, the C code can use data structures and call functions written in D! The compatibility goes both ways.
I've always admired your work, Walter. Keep it up!
Thank you for the kind words, Andy!
This is the smart choice
You keep compatibility with C, can tap into its ecosystem, but you are no longer stuck with outdated tooling
D gives you faster iteration, clearer diagnostics, and a generally smoother experience, even if it doesn't go as far as Rust in terms of safety
I wish more languages would follow this strategy. ImportC is great; it lets you port things one step at a time, if needed
Let's be honest: who wants to write or generate C bindings? And who wants to risk porting robust/tested/maintained C code incorrectly?
> who wants to write or generate C bindings?
Not me, and not anyone else. Many D users have commented to me on how ImportC eliminates the tedium of interfacing with C.
And with D, you don't have to write .h interface files, either (you can, but it turns out pretty much nobody bothers to).
Does this mean long term Zig won’t run on OpenBSD?
Because doesn't OpenBSD block direct syscalls and force everything to go through libc?
https://news.ycombinator.com/item?id=38039689
This affects static libc only. If you pass -dynamic -lc then the libc functions are provided by the target system. Some systems only support dynamic libc, such as macOS. I think OpenBSD actually does support static libc though.
> I think OpenBSD actually does support static libc though.
How does that work, with syscalls being unable to be called except from the system’s libc? I’d be a bit surprised if any binary’s embedded libc would support this model.
For static executables, “the system’s libc” is of course not a thing. To support those, OpenBSD requires them to include an exhaustive list of all addresses of syscall instructions in a predefined place[1].
(With that said, OpenBSD promises no stability if you choose to bypass libc. What it promises instead is that it will change things in incompatible ways that will hurt. It’s up to you whether the pain that thus results from supporting OpenBSD is worth it.)
[1] https://nullprogram.com/blog/2025/03/06/
> How does that work, with syscalls being unable to be called except from the system’s libc?
OpenBSD allows system calls being made from shared libraries whose names start with `libc.so.' and all static binaries, as long as they include an `openbsd.syscalls' section listing call sites.
Can't you just have one syscall(2) to rule them all? https://man7.org/linux/man-pages/man2/syscall.2.html
Sorry I got mixed up with FreeBSD: https://codeberg.org/ziglang/zig/issues/30981 (original github link has more information)
Not all of libc is syscalls. E.g. strlen() is Zig libc but open() goes to the system libc.
Good point. C's "freestanding" mode, analogous to Rust's no_std, does not provide any functions at all, just some type definitions and constants, which obviously evaporate when compiled. Rust's no_std can not only compute how long a string is, it can unstably sort a slice, do atomic operations if they exist on your hardware, and lots of other fancy stuff; as a consequence, even no_std has an actual library of code. A similar but maybe less organized situation occurs in C++. Most of the time this is simply better: why hand-write your own crap sort when your compiler vendor can just provide an optimised sort for your platform? But on very, very tiny systems this might be unaffordable.
Anyway, C doesn't have Rust's core-versus-std distinction, and so libc is a muddle of both "just useful library stuff" like strlen or qsort and features like open which are bound to operating-system specifics.
That just reminds me: does anyone know whether Rust has something similar? Not wanting to start any Rust vs. Zig debate; I just want to be even more independent when it comes to some of my Rust projects.
There are a couple of libc implementations:
- c-ward [0] a libc implementation in Rust
- relibc [1] a libc implementation in Rust mainly for use in the Redox os (but works with linux as well)
- rustix [2] safe bindings to posix apis without using C
[0]: https://github.com/sunfishcode/c-ward
[1]: https://gitlab.redox-os.org/redox-os/relibc/
[2]: https://github.com/bytecodealliance/rustix
This is very exciting for zig projects linking C libraries. Though I'm curious about the following case:
Let's say I'm building a C program targeting Windows with MinGW & only using Zig as a cross compiler. Is there a way to still statically link MinGW's libc implementation or does this mean that's going away and I can only statically link ziglibc even if it looks like MinGW from the outside?
This use case is unchanged.
If you specify -target x86_64-windows-gnu -lc then some libc functions are provided by Zig, some are provided by vendored mingw-w64 C files, and you don't need mingw-w64 installed separately; Zig provides everything.
You can still pass --libc libc.txt to link against an externally provided libc, such as a separate mingw-w64 installation you have lying around, or even your own libc installation if you want to mess around with that.
Both situations unchanged.
That's cool. I imagine I could also maintain a MinGW package that can be downloaded through the Zig package manager and statically linked without involving the zig libc? (Such that the user doesn't need to install anything but zig)
That's a good way to sell moving over to the zig build system, and eventually zig the language itself in some real-world scenarios imo.
do you suspect it will be possible to implement printf??
while we're talking about printf, can i incept in you the idea of making an io.printf function that does print-then-flush?
It's completely possible to implement printf. Here is my impl (not 100% correct yet) of snprintf for my custom libc, implemented on top of a platform I'm working on: <https://zigbin.io/ab1e79> The va_arg stuff is extern because Zig's va-arg support is pretty broken at the moment. Here's a C++ game ported to the web using said libc, running on top of the custom platform and a web frontend that implements the platform ABI: <https://cloudef.pw/sorvi/#supertux.sorvi> (you might need javascript.options.wasm_js_promise_integration enabled if using a Firefox-based browser)
yeah I just thought there are "compiler shenanigans" involved with printf! zig's va arg being broken is sad, I am so zig-pilled, I wish we could just call extern "C" functions with a tuple in place of va arg =D
The only thing C compilers do for printf is statically analyze the format string for API usage errors. Afaik that isn't possible in Zig currently. But I don't know why you'd downgrade yourself to the printf interface when std.Io.Writer has a `print` interface where fmt is comptime and args can be reflected, so it catches errors without special compiler shenanigans.
I'm thinking: do a translate-c and then statically catch errors using my zig-clr tool.
Cool idea, for sure, but I can't help but wonder: for the code that's been ported, is there a concern that you'd have to perpetually watch out for CVEs in glibc/musl and determine if they also apply to the Zig implementations?
Yes but we already have to do that for our own standard library. For shared codepaths (e.g. math) it's strictly fewer potential bugs.
> It’s kind of like enabling LTO (Link-Time Optimization) across the libc boundary, except it’s done properly in the frontend instead of too late, in the linker
Why is the linker too late? Is Zig able to do optimizations in the frontend that, e.g., a linker working with LLVM IR is not?
Seems like it ought to be able to do inlining and dead code stripping which, I think, wouldn't be viable at link time against optimized static libraries.
It is viable against the IR that static libraries contain when LTO is enabled.
LTO essentially means “load the entire compiler backend into the linker and do half of the compilation work at link time”.
It’s a great big hack, but it does work.
Right, but I think that's what the question of "Why is the linker too late?" is getting at. With zig libc, the compiler can do it, so you don't need fat objects and all that.
---
expanding: so, this means that you can do cross-boundary optimizations without LTO and with pre-built artifacts. I think.
Calling this "properly" is a stretch at best.
I will say first that C libc already does this to a degree: some functions are defined inline in header files, but that is mainly a pre-LTO artifact.
Otherwise it has no particular advantage other than disk space; it's the equivalent of just catting all your source files together and compiling that. If you think it's better to do in the frontend, cool: you could make all the code visible to the frontend by fake-compiling everything, writing the original source into a special section of the object file, and then making the linker call the frontend with all those special sections.
You can even do it without the linker if you want.
Now you have all the code in the frontend if that's what you want (I have no idea why you'd want this).
It has the disadvantage that it's the equivalent of this, without choice.
If you look far enough back, lots of C/C++ projects used to do this kind of thing when they needed performance in the days before LTO, or they just shoved the function definitions in header files, but stopped because it has a huge forced memory and compilation speed footprint.
Then we moved to precompiled headers to fix the latter, then LTO to fix the former and the latter.
Everything old is new again.
In the end, you are also much better off improving the ability to take lots of random object files with IR and make it optimize well than trying to ensure that all possible source code will be present to the frontend for a single compile. Lots of languages and compilers went down this path and it just doesn't work in practice for real users.
So doing stuff in the linker (and it's not really the linker; the linker is just calling the compiler with the code, whether that compiler is a library or a separate executable) is not a hack. It's the best compilation strategy you can realistically use, because the alternative is essentially a dreamland where nobody links third-party libraries, or has subprojects that are libraries, or runs multiple compilation processes, and so on.
Zig always seems to do this thing in blog posts and elsewhere where they add these remarks that often imply there is only one true way of doing it right and they are doing it. It often comes off as immature and honestly a turnoff from wanting to use it for real.
Yeah, like their solutions to detecting use after free are hardly any different from using something like PurifyPlus.
As I understand it, compiling each source file separately and linking together the result was historically kind of a hack too, or at least a compromise, because early unix machines didn't have enough memory to compile the whole program at once (or even just hold multiple source files in memory at a time). Although later on, doing it this way did allow for faster recompilation because you didn't need to re-ingest source files that hadn't been changed (although this stopped being true for template-heavy C++ code).
It's hardly a hack when it's how most languages work in the first place.
"Furthermore, when this work is combined with the recent std.Io changes, there is potential for users to seamlessly control how libc performs I/O - for example forcing all calls to read and write to participate in an io_uring event loop"
This is exciting! I particularly care more about kqueue but I guess the quote applies to it too.
There are so many scary parts of libc, this is a really exciting project
There are many useful functions too. Like "memfrob" and "strfry". I hope the Zig libc makes those available too
Just joking of course. Those are sadly only in glibc.. :)
Does anyone know if there is a timeline on when Zig might achieve 1.0? I've been interested in the language for a while, but I'm a bit concerned about writing anything important in it when it seems to be evolving so much at the moment
Nobody knows, but for what it's worth, existing large projects that are used in production environments have been fairly good at keeping up with Zig releases. See: Bun, Ghostty, and Tigerbeetle for good examples of this. Because the semantics of Zig are relatively simple, porting to the latest version is usually as simple as bumping your compiler version, trying to build, making a fairly mindless, mechanical change, and repeating until it builds.
The biggest thing holding me back from using Zig for important projects is the willingness of my peers to adopt it, but I'm just building projects that I can build myself until they are convinced :)
There's no timeline for 1.0
You might find this interesting: https://www.youtube.com/watch?v=x3hOiOcbgeA
I’m sure this has crossed someone’s mind but why isn’t this called zlibc? :-)
Perhaps to avoid confusion with zlib?
https://www.zlib.net/
Well we also have glib and glibc.
I rather like "libz".
Does Zig have the manpower to keep these up to date?
I think we either need to make operating systems not in C, or just accept that at some level we rely on C.
Not C. PDP-11
Super cool project.
I expect a lot of C code may be quite mechanically translated to Zig (by help of LLMs). Unlike C->Rust or C->C++, where there's more of a paradigm shift.
There's solid reason for the translation here; the Zig core team is aiming to eliminate duplicated code and C functions, and avoid the need to track libc from multiple sources. In the future, LLMs could serve as part of this, but they are currently quite terrible at Zig (as far as I understand it, it's not a lack of Zig code samples, it's an imbalance of OLD Zig to NEW Zig, as Zig changes quite frequently).
You would need to consider if it is even worth it translating your C code. If the paradigm is identical and the entire purpose would be "haha it is now one language," surely you could just compile and link the C code with libzigc... In my opinion, it's not worth translating code if the benefit of "hey look one language" requires the cost of "let's pray the LLM didn't hallucinate or make a mistake while translating the code."
> they are currently quite terrible at Zig
hard disagree (example elsewhere)
"Abolish ICE" at the bottom. Obviously a Bad Bunny fan, as I am.
The very same day I sat at home writing this devlog like a coward, less than five miles away, armed forces who are in my city against the will of our elected officials shot tear gas, unprovoked, at peaceful protestors, including my wife.
https://www.kptv.com/2026/01/31/live-labor-unions-rally-marc...
This isn't some hypothetical political agenda I'm using my platform to push. There's a nonzero chance I go out there next weekend to peacefully protest, and get shot like Alex Pretti.
Needless to say, if I get shot by ICE, it's not good for the Zig project. And they've brought the battle to my doorstep, almost literally.
Abolish ICE.
Andy stay safe. We gotta all come to realization that none of this is possible if we let our democracy slip away. Millions before us died to preserve it. We owe it to them to put up a good fight.
My 85 year old mom lives in Portland and she attends rallies frequently. If you know of any way to support you or other local people doing this work, I'm very interested. My email is on my profile page.
I have a friend who is in Minneapolis. He's involved in caravans which are tracking ICE. He wasn't the driver in the last one. But, the ICE vehicle they tailed suddenly started going in a very direct path, instead of randomly driving. The driver figured it out first. They drove to the driver's house and then stood outside of their car for ten minutes staring at his house. Cars in Minnesota have their license plates on both the front and the back.
Is there any justification for that kind of intimidation? Did any of the Trump supporters vote for that? I hear about paid agitators on the left but not that kind of compensated actors. Is his name in a database now once they did the lookup?
> against the will of our elected officials
Did you mean your local officials?
In the Federal model of US government, state authority overrides centralized government except in the explicit cases enumerated by the Constitution.
So yes, of course they mean their local officials, because in this case there isn’t an explicit line in the Constitution explaining why the feds are allowed to invade Minnesota.
yep. It's not in the constitution, but in all practicality mcculloch vs maryland (which i would love to see repealed) disagrees
The Supreme Court has disagreed with you on the matter of federal immigration constitutional authority for more than a century. There isn’t any “invasion”; that’s a propaganda device.
And yet they didn’t brag about invading other states bordering, let’s see, Canada, just the blue one they had a political spat with.
That's clever. Just slap the "immigration sticker" on ICE and do whatever you want.
Just go home. Work on Zig. Don't do anything stupid.
I like how you call it a peaceful protest when people throw huge rocks, break city infrastructure, damage property, and take zero accountability for it. And most likely don't pay the taxes to fix it up later.
How convenient it must be to blame officers instead of bad actors, just because you agree with their side.
This is purely pushing a political agenda; you're just covering it up.
Since you're so eager to construe his support for peaceful protest as support for civil unrest, I therefore think it's fair if I construe your defence of ICE to mean support for their extrajudicial executions and the people who dress up as ICE (ie: masked men dragging people at gunpoint into unmarked vans) to kidnap and rape people.
Please stay safe.
I can't hold it in, so I had to create an account to share this; I'm sorry. I'm one of the minor Zig contributors, and I read the Zig blog to engage with the craft of software engineering. I don't want to see this ICE stuff or whatever other political opinions you or somebody else have. I'm not from the US and I barely know what ICE is, but you're hating on people (I'm sure you think it's deserved, as with any hate), and I assume you may hate me at some point because I do something you don't share or like (like this comment, for example). Thinking that the creator of Zig may hate me takes a lot of the fun out of using the language, let alone contributing to it or the areas surrounding it. What if tomorrow people with tattoos in a particular spot are hated in the media and you're posting "Abolish people with tattoos"? Not the best comparison, but I hope you understand why I now feel scared of engaging with the community.
I think you have a big responsibility for maintaining a community of people with different political opinions, and you are definitely free to share yours on your personal blog. But you chose to do it in a community-driven project, as the lead of that project. And it's not the first time. It's a bit different. For me, at least.
Also, fear is what made me create this new account; I'm not a bot or anything like that. I'm just afraid, for many (political) reasons, and I want to find peace in playing with computers, and one of those safe places was just taken from me, which you probably have the right to do, but you could have avoided it. You're not the only one: there are many projects like this that mention Gaza, Ukraine, Russia, Israel, all this stuff. There are fewer and fewer projects to engage with (again, for me; I think it works well for those projects, as they attract the people they like).
I'm sorry you have to suffer and see people's deaths. Me too. I understand it's difficult to hold this stuff inside. As you can see, I couldn't either. But I hoped you were stronger than me.
Get some empathy and awareness. I’m not from the US either but I am against fascist thugs occupying cities. It’s not difficult.
The world demonstrates in many instances, that you do not have to have empathy with people suffering from oppression, rape, murder, etc in order to "succeed" in terms of wealth and power.
Meaning: if you can't accept that someone publishing words/code on the web also offers their own strong opinions (which you directly claim to be hate) about issues affecting their own life, there are plenty of "communities" in which this kind of unempathetic approach to other people and their lives is celebrated and normalized.
If you barely know what ICE is, how can you claim his opinions to be "hate"? How can you claim that Andrew may hate you without thinking you identify with what you understand about ICE?
What ICE does is unmistakably fascistic and authoritarian, far beyond the powers they have been granted by law and democratic processes. It's utterly disgusting to try to compare protesting and fighting against that with "abolishing people with tattoos". ICE is an institution, one government agency among a dozen-plus law-enforcement agencies in the US. You compare advocating for abolishing it through the democratic process (what Andrew expressed) with calling for the murder of many millions of people over a private hobby.
And while Andrew may have some responsibility towards the community he founded; if he has the responsibility to include different political opinions, he most certainly has the responsibility to exclude fascism. Fascism is the destruction of different opinions, it is not a political opinion that can stand among others and be compared on the same basis: that of human rights at the minimum.
Ask yourself and reflect: why does this very simple and inoffensive call by Andrew make you scared, especially if you don't know what ICE is and does? Could you have been influenced into this feeling? It is certainly not a rational reaction to a few characters of text viewed on a screen.
I too am sick of internal compiler errors
I too am sick of internal combustion engines, a product of the last century.
I too am sick of intrusion countermeasures electronics. Think of all the poor netrunners out there.
Well, I guess the Zig project is now writing in NTSC, causing compatibility issues for the PAL folks out there.
/s
This strikes me as a very agent-friendly problem. Given a harness that enforces sufficiently-rigorous tests, I'm sure you could spin up an agent loop that methodically churns through these functions one by one, finishing in a few days.
hallucinations in a libc implementation would be especially bad
Have you ever used an LLM with Zig? It will generate syntactically invalid code. Zig breaks so often and LLMs have such an eternally old knowledge cutoff that they only know old ass broken versions.
The same goes for TLA+ and all the other obscure things people think would be great to use with LLMs, and they would, if there was as much training data as there was for JavaScript and Python.
i find claude does quite well with zig. this project is like > 95% claude, and it's an incredibly complicated codebase [0] (which is why i am not doing it by hand):
https://github.com/ityonemo/clr
[0] generates a dynamically loaded library which does sketchy shit to access the binary representation of datastructures in the zig compiler, and then transpiles the IR to zig code which has to be rerun to do the analysis.
To be fair, this was true of early public LLMs with rust code too. As more public zig repositories (and blogs / docs / videos) come online, they will improve. I agree it's a mess currently.
You must have not tried this with an LLM agent in the past few months.
i tested sonnet 4.5 just last week on a zig codebase and it has to be instructed the std.ArrayList syntax every time.
I made a Zig agent skill yesterday if interested: https://github.com/rudedogg/zig-skills/
Claude getting the ArrayList API wrong every time was a major reason why
It’s AI generated but should help. I need to test and review it more (noticed it mentions async which isn’t in 0.15.x :| )
The linked blog post about making this is an excellent read.
Fighting fire with fire
A little bit! I wrote a long blog post about how I made it. I think the strategy of having an LLM look at individual std modules one by one makes it pretty accurate. Not perfect, but better than I expected.
Are you using an agent? It can quickly notice the issue and fix it. Obviously if it's trained on an older version it won't know the new APIs.
Try it again. This time do something different with CLAUDE.md. By the way it's happy to edit its own CLAUDE.md files (don't have an agent edit another agent's CLAUDE.md files though [0])
0: https://news.ycombinator.com/item?id=46723384