Slightly odd suggestion: package it up as both a Python and an NPM module - both just thin wrappers around the combined binary - and then people within those ecosystems will be able to run:
uvx ut md5 ...

Or:

npx ut md5 ...

To execute it without having to figure out cargo or how to add a Rust binary to their path.

I've seen a few tools do things like this recently; it's a pretty interesting pattern. I believe there's tooling in the Python/Rust world that makes compiling the different binary wheels relatively easy using GitHub Actions.
If you know you will use it often, uv has `uv tool install ...`. So, after the first `uv tool install ut` you can just run `ut md5 ...` or whatever. Don't need to keep using uvx.
uv also has a couple commands I throw in a systemd unit[1] to keep these tools updated:
1: https://github.com/level12/coppy/blob/main/systemd/mise-uv-u...

Interesting perspective. I mean, the issue exists with any ecosystem. See Nix, which has to wrap everything under the sun down to the library level to make its package system usable for all kinds of use cases; they need to do this because of the deterministic nature of the system. Brew, on the other hand, discouraged packaging tools that are available from other package managers (I don't know if this is still the case). I feel a bit uncertain about this. It would mean that a tool should not only strive to be included in all the major Linux distribution repositories, along with winget, brew, MacPorts, etc. (which is a tough ask), but should also publish to npm, PyPI, RubyGems, etc. as well? I feel something is taking a wrong turn here.
cargo-dist will get you the npm one for free. They've got PyPI support planned as well but don't have it yet, though they can also generate standard curl | sh installers and more.
This looks cool, thanks!
I will personally never want to install uvx or npx.
Another suggestion is to enable cargo-binstall, which lets people install cargo binaries conveniently; cargo-binstall itself is just a single binary to install.
+1 for this great suggestion
I can understand why people would find `ut` convenient. That said, I would caution against trying to include too much functionality. What is too much? I don't have a clear idea on this yet.
But I would argue that including HTTP functionality is going too far. Why? Because there are already excellent tools dedicated to this. On the client side, see `xh` [1]. On the server side, see `miniserve` [2]. Both have approximately 7K stars on GitHub.
It seems wiser to let specialized projects focus on a particular functional area; this is better for users and less work for maintainers.
[1]: https://github.com/ducaale/xh
[2]: https://github.com/svenstaro/miniserve
I agree. This tool feels like an all-in-one Swiss Army knife. Granted, it is useful, but it goes directly against one of the UNIX core values, which is "do one thing and do it well". As well, including too much functionality in one single package will eventually bloat it, and there are many examples of this happening (Windows, systemd, etc.).
I feel that the applicability of the "do one thing and do it well" philosophy scales with the frequency with which you expect to do that thing and the complexity of the task. In the case of a text editor or a VCS it obviously makes sense. In the case of tools that I use a couple of times a year, like these, having them all grouped together under a 'misc' application is exactly what I need, rather than wasting time fumbling through my installed apps trying to remember where I left the QR code generator.
I see this more like `busybox`, where being a single binary is an implementation detail. The commands are still orthogonal and composable.
I maintain a Rust-based CLI HTTP server that embeds Nushell. It’s a handy little Swiss-army knife that’s replaced Nginx and Caddy for my personal projects.
You can serve a folder of static assets like this:
http-nu :3021 '{|req| .static "www" $req.path}'
https://github.com/cablehead/http-nu
I was looking to try something like this (without the nushell...), but I find that I always have caddy installed anyway, and for the really local cases I use php -S.
I'm curious, what do you think is missing from Caddy that has you looking for something new?
> ut – Rust based CLI utilities for devs and IT
In this title you give exactly zero information about what your tools actually do, but somehow find it important to mention the language they're written in.
To capture the attention of Rust purists.
Very neat.
Although philosophically I prefer the Unix approach of "do one thing and do it well", I really admire this tool. I think it might be the fact that the one thing this does well is curating a set of functions for a particular profile of developer. In my case, that's someone doing web-focused full-stack development.
It might be worth doing a survey of your users to see what they use ut for and what areas you should focus on next.
I think for packaging it's okay to "have" lots of things that individually "do" one thing.
The important part is that the user controls the entry points. It's more Unixy to allow someone to decode audio from one pipe to another than to only allow them to play a file to a speaker.
Consider that Debian "does" lots of things because it has a kernel, hardware abstractions, a userland, a package manager, and often a GUI and web browser. But it also "does" none of those because it's just a convenient and useful wrapper to publish all the other tools, which you can still call upon individually
I agree. This feels like a level two busybox/toybox. Nice job.
When importing a library, it becomes part of your project, therefore it becomes your responsibility to ensure that the imported code is safe and sound.
I am seeing the list of dependencies, and even without looking at the transitive ones, I am sure you didn't review any of those, nor will properly maintain that huge list.
That's a supply chain ticking bomb in my book.
I like Rust, but most projects look like a kindergarten collage, with no regard for security.
This is literally anti-unix.
Thanks for disclosing your use of GenAI, that kind of transparency is nice to see up front. On that note though, I took a random example (`calc.rs`) and noticed there was no unit or integration testing validating the parsers or etc. did what was expected of them, or that the end results were correct. Are tests for these tools planned? I get sketched out a bit when I see GenAI code with no automated validations, but I really want to like this tool because it fills such an interesting niche :)
Hey, yeah, fair concern. Some tools already have tests, but I do plan on adding them to all of them.
Why would a user of these tools care about what language they are written in?
As it is a utility toolkit, I'd like to have:
- the Web & Network section expanded: the copyparty features (github.com/9001/copyparty) and curlie (github.com/rs/curlie).
- compress/decompress features with password support: it doesn't need to use the best compression algorithm; gzip is good enough.
The first thing I checked was how hashing is implemented and two things stand out to me:
1. Input must be valid UTF-8.
2. stdin is read to EOF instead of being read incrementally.

Neither is ideal, and they can make ut unfit for a fair few use cases.
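For contrast, a streaming hash loop that accepts raw bytes in bounded memory is straightforward. This sketch uses a toy FNV-1a digest purely to stay dependency-free; a real tool would drive a crate like `sha2` or `md-5` with the same chunked read loop:

```rust
use std::io::Read;

/// Hash a stream incrementally, in fixed-size chunks of raw bytes:
/// no UTF-8 requirement, no reading the whole input into memory.
/// FNV-1a stands in for a real digest to keep the sketch
/// dependency-free; the chunked read loop is the point.
fn hash_stream<R: Read>(mut reader: R) -> std::io::Result<u64> {
    let mut hash: u64 = 0xcbf29ce484222325; // FNV-1a 64-bit offset basis
    let mut buf = [0u8; 8192];
    loop {
        let n = reader.read(&mut buf)?;
        if n == 0 {
            break; // EOF
        }
        for &byte in &buf[..n] {
            hash ^= byte as u64;
            hash = hash.wrapping_mul(0x100000001b3); // FNV-1a 64-bit prime
        }
    }
    Ok(hash)
}

fn main() -> std::io::Result<()> {
    // Bounded memory even for piped multi-gigabyte inputs.
    let digest = hash_stream(std::io::stdin().lock())?;
    println!("{digest:016x}");
    Ok(())
}
```

Because it takes any `Read` source, files, pipes, and in-memory buffers all behave identically.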
Would be cool if this also had a `retry` sub-command, for running any command with exponential backoff retry logic. Similar to these Rust tools:
https://github.com/demoray/retry-cli
https://github.com/rye/eb
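For a sense of what such a sub-command would do under the hood, here is a minimal dependency-free sketch of exponential backoff in Rust; the function name and signature are illustrative, not taken from either linked tool:

```rust
use std::thread::sleep;
use std::time::Duration;

/// Retry a fallible operation with exponential backoff:
/// wait `base` after the first failure, then 2*base, 4*base, ...
/// up to `max_attempts` tries in total.
fn retry_with_backoff<T, E>(
    max_attempts: u32,
    base: Duration,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut delay = base;
    let mut attempt = 1;
    loop {
        match op() {
            Ok(v) => return Ok(v),
            Err(e) if attempt >= max_attempts => return Err(e),
            Err(_) => {
                sleep(delay);
                delay = delay.saturating_mul(2); // exponential growth
                attempt += 1;
            }
        }
    }
}

fn main() {
    // Toy operation that fails twice before succeeding.
    let mut calls = 0;
    let result = retry_with_backoff(5, Duration::from_millis(1), || {
        calls += 1;
        if calls < 3 { Err("not yet") } else { Ok(calls) }
    });
    println!("succeeded after {calls} calls: {result:?}");
}
```

A real sub-command would wrap spawning a child process in `op` and likely add jitter and a delay cap.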
This project cleverly combines formal verification with AI ethics! Using mathematical certainty to constrain AI's uncertainty is like adding a transparent audit window to a "black box." Wadler would surely smile knowingly upon seeing this semantic ledger specification: theory has finally been transformed into verifiable practice. From philosophical aphorism to verifiable fact, this is a brilliant path.
> Data Generation: uuid (v1, v3, v4, v5)
The new versions 7 and 8 are really a must these days, especially v7.
DevToys (https://github.com/DevToys-app/DevToys) has a CLI version like this.
On what basis are new features included?
Has the creator thought about the definition of "done"? Will it grow indefinitely like a katamari ball?
[1]: https://en.wikipedia.org/wiki/Katamari
I don't have a strict idea of "done" for ut, but I am not keen on adding increasingly complex things to it either. Its purpose is convenience, not exhaustiveness.
is this stuff not pretty easy to do with python?
```
python -c "import base64; print(base64.b64encode('$INPUT_STRING'.encode('utf-8')).decode('utf-8'))"
```
I love typing python commands that are 10x longer than a shorthand.
Yes, it's easy to set up an alias or shell command or whatever, but that's beside the point :p
You don't even have to go that far, `base64` is a coreutil (https://github.com/coreutils/coreutils/blob/ebfd80083b4fe4ae...).
The point of ut is not to replace or invent new tooling. It is meant to be a set of tools that are simple, self-explanatory, and work out of the box with sane defaults. So, essentially, something that you don't have to remember syntax for or go through help/man pages every time you want to use it.
uutils/coreutils has a `base64` in Rust which just gained better performance due to the base64-simd crate for SIMD: https://github.com/uutils/coreutils/pull/8578
Note that uutils does not work if the file does not fit into memory.
With GNU coreutils:
With uutils, doing the same would exhaust your system's memory until either it freezes or oomd kills the process.

For now. There's no reason this won't/can't be worked on in the future.
If you can remember all of that off the top of your head and find it ergonomic to type out, then sure. But much like how I prefer someone else to do my content syncing as an ergonomic appliance rather than using FTP + curlftpfs + a VCS [1], I quite like this idea of a focused toolbox (written in a language that compiles to native code) and welcome it rather than having to store these massive snippets in my head (or write shell wrappers for them).
[1] https://news.ycombinator.com/item?id=9224
You can send a heredoc into it in a single line of shell, too.
How about uuidv7? That’s catching on, as it’s got improved sortability / clustering behavior in DBs
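For context on why v7 sorts well: it front-loads a 48-bit Unix-millisecond timestamp before the random bits, so byte-wise ordering tracks creation time. A dependency-free sketch of the layout (the random bytes are caller-supplied here only because std has no RNG; a real implementation would fill them from a CSPRNG):

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Build a UUIDv7: low 48 bits of the Unix timestamp in milliseconds,
/// big-endian, then version/variant bits over 74 random bits.
/// `rand_bytes` is caller-supplied in this sketch; a real
/// implementation would use a CSPRNG.
fn uuid_v7(millis: u64, rand_bytes: [u8; 10]) -> [u8; 16] {
    let mut b = [0u8; 16];
    b[..6].copy_from_slice(&millis.to_be_bytes()[2..]); // low 48 bits
    b[6..].copy_from_slice(&rand_bytes);
    b[6] = (b[6] & 0x0f) | 0x70; // version nibble = 7
    b[8] = (b[8] & 0x3f) | 0x80; // variant bits = 10 (RFC 4122/9562)
    b
}

fn main() {
    let millis = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .unwrap()
        .as_millis() as u64;
    let id = uuid_v7(millis, [0u8; 10]); // zeroed "random" bits for the demo
    // Print in the canonical 8-4-4-4-12 form.
    let hex: String = id.iter().map(|x| format!("{x:02x}")).collect();
    println!(
        "{}-{}-{}-{}-{}",
        &hex[..8], &hex[8..12], &hex[12..16], &hex[16..20], &hex[20..]
    );
}
```

Because the timestamp comes first, two v7 UUIDs generated a millisecond apart compare in generation order, which is what gives the improved DB clustering behavior.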
Why is everything in the same binary? Why not multiple binaries, one for each function? That way people can install only the ones they need, a-la Unix tools: do only one thing and do it well.
I also have the exact same tools but written in Go. Rust would be a nice upgrade (lower footprint), but keeping them all in the same binary is a bit silly.
The single binary thing reminds me of BusyBox - https://en.wikipedia.org/wiki/BusyBox
Sometimes it can be quite convenient to grab a single binary that does a whole bunch of useful stuff in one package.
Also, lots of the ut functions already exist as decades old unix tools.
Those tools either don't ship with, or exist in wildly different forms on Windows. It's particularly bad for curl, which might be the real curl.se curl or Microsoft's confusingly-named Powershell alias.
I could definitely see using this in a cross-platform build or installation environment.
Windows and *nix systems are often used for very different things, so I don't understand why there would be a need for some kind of universal superbinary. And thanks to WSL you can already get GNU coreutils running in Windows anyway.
I lead a large-ish open source software project. We have developers that need to build on Linux, macOS, and Windows. It's useful to be able to get everyone bootstrapped with as few steps as possible and with as few dependencies as possible. For our uses CMake works well as a universal superbinary, but I'm always on the lookout for tools that can reduce developer friction.
Probably for the same reasons that uutils/coreutils uses a single binary. Specifically:
- it reduces the total install size, since common code, including the standard library, is only included once, rather than copied for each executable
- it makes installation easier on some platforms, since you just have to install a single executable, instead of a bunch of executables
To me the biggest upside would be terminal completion and discovery via help text. Sure you can always bounce to a search engine and bounce back, but I can imagine cases where you want a toolkit in front of you that you know how to use when your focus is not on memorizing commands.
This could be great for students without sysadmins needing to lodge complaints.
I don't think it's that silly. BusyBox packages a bunch of utilities in a single binary. It amortizes fixed costs: a single binary takes less space than 30 binaries that each do one tiny thing.
These are small bits of code, and the functionality is interrelated. The entire thing feels like a calculator, or awk, and seems reasonable to put in one binary.
The Unix philosophy doesn't actually say anything about whether different tools should live in the same binary. It's about having orthogonal bits of functionality and using text as the universal interface. BusyBox is just as Unix as having all the tools be different binaries.
Amusingly, if you look historically, it's also a traditional approach to reduce total binary size - a bunch of small utilities were all Sim links to a single binary, which conditioned on argv[0] to figure out what to do.
In The Old Days, it was hard links, no symlinks.
Eek, you're right. Also ugh, I should stop talking to my phone, just noticed "Sim links". sigh.
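For reference, the argv[0] multiplexing described above looks roughly like this in Rust; the applet names are made up for illustration and are not ut's actual commands:

```rust
use std::env;
use std::path::Path;

/// BusyBox-style multi-call dispatch: one binary, many names.
/// Links named `upper`, `lower`, ... all point at the same file,
/// and the program picks its behavior from the name it was
/// invoked as. (Applet names here are illustrative only.)
fn run_applet(name: &str, args: &[String]) -> Result<String, String> {
    match name {
        "upper" => Ok(args.join(" ").to_uppercase()),
        "lower" => Ok(args.join(" ").to_lowercase()),
        other => Err(format!("unknown applet: {other}")),
    }
}

fn main() {
    let argv: Vec<String> = env::args().collect();
    // Strip any leading directory, so /usr/local/bin/upper -> "upper".
    let name = Path::new(&argv[0])
        .file_name()
        .and_then(|s| s.to_str())
        .unwrap_or("");
    match run_applet(name, &argv[1..]) {
        Ok(out) => println!("{out}"),
        Err(e) => eprintln!("{e}"),
    }
}
```

Whether the links are hard links or symlinks, the dispatch is the same: only argv[0] changes.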