> trash a.txt b.png moves `a.txt` and `b.png` to the trash. Supports macOS and Linux.
The way you're doing it trashes files sequentially, meaning you hear the trashing sound once per file and ⌘Z in the Finder will only restore the last one. You can improve that (I did it for years), but consider just using the `trash` command that ships with macOS. It doesn't use the Finder, so no sound and no ⌘Z, but it's fast, official, and still allows "Put Back".
> jsonformat takes JSON at stdin and pretty-prints it to stdout.
Why prioritise node instead of jq? The latter is considerably less code and even comes preinstalled with macOS now.
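For reference, the jq version of such a script is about one line (a sketch, assuming jq is installed):

    #!/bin/sh
    # jsonformat: pretty-print JSON from stdin (or file arguments) to stdout
    exec jq . "$@"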
> uuid prints a v4 UUID. I use this about once a month.
Any reason to not simply use `uuidgen`, which ships with macOS and likely your Linux distro?
https://www.man7.org/linux/man-pages/man1/uuidgen.1.html
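One portability nit: macOS uuidgen prints uppercase while util-linux prints lowercase, so a tiny wrapper may still be worth keeping (a sketch):

    #!/bin/sh
    # uuid: v4 UUID, lowercased for consistency across platforms
    uuidgen | tr '[:upper:]' '[:lower:]'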
I am not the author, but my bet is that he didn't know of its existence.
The best part about sharing your config or knowledge is that someone will always light up your blind spots.
Python also pretty-prints out of the box:

    python3 -m json.tool
> Why prioritise node instead of jq?
In PowerShell I just do `$input | ConvertFrom-Json | ConvertTo-Json`, but as a function.
And it's `New-Guid` in PowerShell.
It's weird how the circle of life progresses for a developer or whatever.
- When I was a fresh engineer I used a pretty vanilla shell environment
- When I got a year or two of experience, I wrote tons of scripts and bash aliases and had a 1k+ line .bashrc the same as OP
- Now, as a more tenured engineer (15 years of experience), I basically just want a vanilla shell with zero distractions, aliases or scripts and use native UNIX implementations. If it's more complicated than that, I'll code it in Python or Go.
I think it's more accurate to say that this comes from a place of laziness than some enlightened peak. (I say this as someone who does the same, and is lazy.)
When I watch the work of coworkers or friends who have gone down these rabbit holes of customization, I always learn some interesting new tools to use - lately I've added atuin, fzf, and a few others to my Linux install.
I went through a similar cycle. Going back to simplicity wasn't about laziness for me; it was because I started working across a bunch more systems and didn't want to do my whole custom setup on all of them, especially ephemeral stuff like containers allocated on a cluster for a single job. So rather than using my fancy setup sometimes and fumbling through the defaults at other times, I just got used to operating more efficiently with the defaults.
You can apply your dotfiles to servers you SSH into rather easily. I'm not sure what your workflow is like but frameworks like zsh4humans have this built in, and there are tools like sshrc that handle it as well. Just automate the sync on SSH connection. This also applies to containers if you ssh into them.
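The rough idea behind those tools, as a hand-rolled sketch (illustrative only, not zsh4humans' or sshrc's actual mechanism):

    # ship a minimal rc file to the host, then start a shell that sources it
    sshx() {
      scp -q ~/.remoterc "$1":/tmp/.remoterc &&
        ssh -t "$1" 'bash --rcfile /tmp/.remoterc'
    }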
I'm guessing you haven't worked in Someone Else's environment?
The amount of shit you'll get for "applying your dotfiles" on a client machine or a production server is going to be legendary.
Same with containers, please don't install random dotfiles inside them. The whole point of a container is to be predictable.
When I had one nix computer, I wanted to customize it heavily.
Now I have many nix computers and I want them consistent and with only the most necessary packages installed.
For anyone else reading this comment who was confused because this seems like the opposite of what you'd expect about Nix: Hacker News ate the asterisks and turned them into italics.
Besides many nix computers I also have a wife, dog, children, chores, and shopping to be done. Unlike when I was a young engineer, I can no longer stay up all night fiddling with bash scripts and environments.
Yeah - been there, done that, too. I feel like the time I gain from having a shortcut is often less than what I would need to maintain it or to remember the real syntax when I'm on a machine where it's not available (which happens quite often in my case). I try to go with system defaults as much as possible nowadays.
I would still call my Python scripts “scripts.” I don’t think the term “scripts” is limited to shell scripts.
I am going through a phase of working with younger engineers who have many dotfiles, and I just think "Oh, yeh, I remember having lots of dotfiles. What a hassle that was."
Nowadays I just try to be quite selective with my tooling and learn to change with it - "like water", so to speak.
(I say this with no shade to those who like maintaining their dotfiles - it takes all sorts :))
Prepare to swing back again. With nearly 30 years experience I find the shell to be the best integration point for so many things due to its ability to adapt to whatever is needed and its universal availability. My use of a vanilla shell has been reduced to scripting cases only.
On the other hand, the author seems to have a lot of experience as well.
Personally I tend to agree... there is a very small subset of things I find worth aliasing. I have a very small number and probably only use half of them regularly. Frankly I wonder how my use case is so different.
edit: In the case of the author, I guess he probably wants to live in the terminal full time. And perhaps offline. There is a lot of static data he's stored, like HTTP status codes: https://codeberg.org/EvanHahn/dotfiles/src/commit/843b9ee13d...
In my case I'd start typing it in my browser and just click something I've visited 100 times before. There is something to be said for reducing that redundant network call, but I don't think it makes much practical difference, and the mental mapping/discoverability of aliases isn't nothing.
this is how it works for you
as a person who loves their computer, my ~/bin is full. i definitely (not that you said this) do not think "everything i do has to be possible on every computer i am ever shelled into"
being a person on a computer for decades, i have tuned how i want to do things that are incredibly common for me
though perhaps you're referring to work and not hobby/life
It's the bell curve meme all along.
I prefer using kubectl over any other method, so I have plenty of functions to help with that. I'd never consider using Python or Go for this, although I do have plenty of Python and Go "scripts" on my path too.
Different strokes for different folks - tenured engineers just settle into whatever works best for them.
If you come out the other side, you set up LocalCommand in your .ssh/config, which copies your config to every server you ssh to, and you get your setup everywhere.
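A sketch of that approach (hypothetical host pattern; LocalCommand runs on the local machine once the connection is up, and the %d/%r/%h tokens are standard ssh_config tokens):

    # ~/.ssh/config
    Host dev-*
        PermitLocalCommand yes
        # push the rc file up; PermitLocalCommand=no stops scp's own ssh from recursing
        LocalCommand scp -o PermitLocalCommand=no -q %d/.bashrc %r@%h:~/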
I like this one.
Regarding the `line` script, just a note that sed can print an arbitrary line from a file, no need to invoke a pipeline of cat, head, and tail:

    sed -n 2p file

prints the second line of file. The advantage sed has over this line script is it can also print more than one line, should you need to:

    sed -n 2,4p file

prints lines 2 through 4, inclusive.

It is often useful to chain multiple sed commands and sometimes shuffle them around. In those cases I would need to keep changing the first sed. Sometimes I need to grep before I sed. Using cat, tail and head makes things more modular in the long run, I feel. It's the ethos of each command doing one small thing.
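For instance (an illustration of that modularity), a stage can be inserted or reordered without touching its neighbours:

    cat access.log | grep ' 500 ' | sed -n '1,20p'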
True, everything depends on what one is trying to do at the time.
yeah I almost always start with `cat` but I still pipe it into `sed -n 1,4p`
My fav script to unpack anything, found a few years ago somewhere (with `$1` quoted so filenames with spaces work):

    # ex - archive extractor
    # usage: ex <file>
    function ex() {
      if [ -f "$1" ]; then
        case "$1" in
          *.tar.bz2) tar xjf "$1"    ;;
          *.tar.gz)  tar xzf "$1"    ;;
          *.tar.xz)  tar xf "$1"     ;;
          *.bz2)     bunzip2 "$1"    ;;
          *.rar)     unrar x "$1"    ;;
          *.gz)      gunzip "$1"     ;;
          *.tar)     tar xf "$1"     ;;
          *.tbz2)    tar xjf "$1"    ;;
          *.tgz)     tar xzf "$1"    ;;
          *.zip)     unzip "$1"      ;;
          *.Z)       uncompress "$1" ;;
          *.7z)      7z x "$1"       ;;
          *)         echo "'$1' cannot be extracted via ex()" ;;
        esac
      else
        echo "'$1' is not a valid file"
      fi
    }
`aunpack` does the trick for me.
This is exactly the kind of stuff I'm most interested in finding on HN. How do other developers work, and how can I get better at my work from it?
What's always interesting to me is how many of these I'll see and initially think, "I don't really need that." Because I'm well aware of the effect (which I'm sure has a name - I suppose it's similar to induced demand) of "make $uncommon_task much cheaper" -> "$uncommon_task becomes the basis of an entirely new workflow/skill". So I'm going to try out most of them and see what sticks!
Also: really love the style of the post. It's very clear but also includes super valuable information about how often the author actually uses each script, to get a sense ahead of time for which ones are more likely to trigger the effect described above.
A final aside about my own workflows which betrays my origins... for some of these operations, and for others I occasionally need, I'll just open a browser dev tools window and use JS to do it, for example lowercasing a string :)
I'd love to see a cost benefit analysis of the author's approach vs yours, which includes the time it took the author to create the scripts, remember/learn to use them/reference them when forgetting syntax, plus time spent migrating whenever changing systems.
why is this interesting to you? the whole point of doing all of this is to be more efficient in the long run. of course there is an initial setup cost and learning curve, after which you will hopefully feel quite efficient with your development environment. you are making it sound like it is not worth the effort because you have to potentially spend time learning "it"? i do not believe that it takes long to "learn" it, but of course it can differ a lot from person to person. your remarks seem like non-issues to me.
It's interesting because there's a significant chance one wastes more time tinkering around with custom scripts than saving in the long run. See https://xkcd.com/1205/
For example: the task from the blog post that "saves 5 seconds" and comes up once a month. Hopefully the author did not spend more than 5 minutes writing and maintaining said script, or they're losing time in the long run.
I find that now with AI, you can make scripts very quickly, reducing the time to write them by a lot. There is still some time needed for prompting and testing but still.
I keep meaning to generalize this (directory target, multiple sources, flags), but I get quite a bit of mileage out of this `unmv` script even as it is:

    #!/bin/sh
    if test "$#" != 2
    then
        echo 'Error: unmv must have exactly 2 arguments'
        exit 1
    fi
    exec mv "$2" "$1"
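So undoing a move is just recalling the same command with "un" prefixed:

    mv notes.txt /tmp/notes.txt
    unmv notes.txt /tmp/notes.txt   # runs: mv /tmp/notes.txt notes.txt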
I have mkcd exactly (I wonder how many of us do, it's so obvious).
I have almost the same, but differently named, with scratch(day), copy(xc), markdown quote(blockquote), murder, waitfor, tryna, etc.
I used to use telegram-send with a custom notification sound a lot for notifications from long-running scripts if I walked away from the laptop.
I used to have one called timespeak that would speak the time to me every hour or half hour.
I have go_clone, which clones a repo into GOPATH and which I use for organising even non-Go projects, long after putting Go projects in GOPATH stopped being needed.
I liked writing one-offs, and I don't think it's premature optimization because I kept getting faster at it.
Obviously that script is more convenient, but if you’re on a system where you don’t have it, you can do the following instead:
Some cool things here but in general I like to learn and use the standard utilities for most of this. Main reason is I hop in and out of a lot of different systems and my personal aliases and scripts are not on most of them.
sed, awk, grep, and xargs along with standard utilities get you a long long way.
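For instance, the classic "most frequent words" one-liner needs nothing beyond the standard toolbox:

    # top 10 most common words in a file, standard utilities only
    tr -cs 'A-Za-z' '\n' < file.txt | tr 'A-Z' 'a-z' | sort | uniq -c | sort -rn | head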
I totally agree with this, I end up working on many systems, and very few of them have all my creature comforts. At the same time, really good tools can stick around and become impactful enough to ship by default, or to be easily apt-get-able. I don't think a personal collection of scripts is the way, but maybe a well maintained package.
Nice! Tangentially related: I built a (macOS-only) tool called clippy to be a much better pbcopy. It was just added to homebrew-core. Among other things, it auto-detects when you want files as references so they paste into GUI apps as uploads, not bytes.

    clippy image.png   # paste into Slack, etc. as upload
    clippy -r          # copy most recent download
    pasty              # copy file in Finder, paste actual file here

https://github.com/neilberkman/clippy / brew install clippy
> cpwd copies the current directory to the clipboard. Basically pwd | copy. I often use this when I'm in a directory and I want to use that directory in another terminal tab; I copy it in one tab and cd to it in another. I use this once a day or so.
You can configure your shell to notify the terminal of directory changes, and then use your terminal’s “open new window” function (eg: ctrl+shift+n) to open a new window retaining the current directory.
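A zsh sketch of that (the escape sequence is OSC 7; terminal support varies, and the hook mechanism here is zsh's chpwd_functions):

    # tell the terminal the cwd after every directory change
    _report_cwd() { printf '\e]7;file://%s%s\e\\' "$HOST" "$PWD"; }
    chpwd_functions+=(_report_cwd)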
Broadly, I very much love this approach to things and wish it were more "acceptable". It reminds me of the opposite of things like "the useless use of cat", which to me is one of the WORST meme-type-things in this space.
Like, it's okay -- even good -- for the tools to bend to the user and not the other way around.
> alphabet just prints the English alphabet in upper and lowercase. I use this surprisingly often (probably about once a month)
I genuinely wonder: why would anyone want to use this often?
These are great, and I have a few matching myself.
Here are some super simple ones I didn't see that I use almost every day:
cl="clear"
g="git"
h="history"
ll="ls -al"
path='echo -e ${PATH//:/\\n}'
lv="live-server"
And for common navigation:
dl="cd ~/Downloads"
dt="cd ~/Desktop"
I have a bunch of little scripts and aliases I've written over the years, but none are used more than these...
    alias ..='cd ..'
    alias ...='cd ../..'
    alias ....='cd ../../..'
    alias .....='cd ../../../..'
    alias ......='cd ../../../../..'
    alias .......='cd ../../../../../..'
Does zsh support this out-of-the-box? Because I definitely never had to setup any of these kinds of aliases but have been using this shorthand dot notation for years.
Yes it does.
up() { local d="" for ((i=1; i<=$1; i++)); do d="../$d" done cd "$d" }
up 2, up 3 etc.
I have set up a shortcut, alt+., to run cd ..; it's pretty cool.
I also aliased - to run cd -
This is one area where I've found success in vibe coding: making scripts for repetitive tasks that are just above the complexity threshold where the math between automating and doing manually is not so clear. I have Copilot generate the code for me, and honestly I don't care too much about its quality or extensibility; the scripts are easy enough to read through that I don't feel like my job is AI PR reviewer.
Share yours!
I use this as a bookmarklet to grab the front page of the New York Times (print edition). (You can also go back to any date up to around 2011.)
I think they go out at like 4 am, so, day-of, note that it will fail if you're in that window before publishing.
Historical note: getting hold of these scripts by chatting to various developers was the motivation for the original 2004 "lifehacks" talk[1][2]. If you ever get into an online argument over what is a "life hack" and what isn't, feel free to use short scripts like these as the canonical example.
Otherwise, I am happy to be pulled into your discussion, Marshall McLuhan style[3] to adjudicate, for a very reasonable fee.
[1] https://craphound.com/lifehacksetcon04.txt
[2] https://archive.org/details/Notcon2004DannyOBrienLifehacks
[3] https://www.openculture.com/2017/05/woody-allen-gets-marshal...
A lot of these scripts could be just shell aliases.
OP links to a long blog post with reasons for using scripts
https://evanhahn.com/why-alias-is-my-last-resort-for-aliases...
An important advantage of aliases was not mentioned: I see everything in one place and can easily build aliases on top of other aliases without much thinking.
Anyways, my favourite alias that I use all the time is this:

    alias a='nvim ~/.zshrc && . ~/.zshrc'

It solves the "not loaded automatically" part, at least for the current terminal.

30% of the productivity hacks can be achieved in vanilla Nushell.
As a bonus, I prepend my custom aliases or scripts with my user name and a hyphen (e.g. helicaltwine-). It helps me recall rarely used scripts when I need them but have forgotten the names.
I follow a similar but more terse pattern. I prepend them all with a comma, and I have yet to find any collisions. If you're using bash (and I assume posix sh as well), the comma character has no special meaning, so this is quite a nice use for it. I agree that it's nice to type ",<tab>" and see all my custom scripts appear.
> `rn` prints the current time and date using date and cal.
And you can type `rn -rf *` to see all timezones recursively. :)
Nice. I have a bash script similar to the listed "removeexif" called prep_for_web, which takes any image file (PNG, BMP, JPG, WebP), scrubs EXIF data, checks for transparency, and then compresses it to either JPG using MozJPEG[1] or to PNG using PNGQuant[2].
[1] https://github.com/mozilla/mozjpeg
[2] https://pngquant.org
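A rough sketch of that shape of pipeline (assuming exiftool, pngquant, and ImageMagick are installed; not the commenter's actual script, which uses MozJPEG for the JPEG step):

    #!/bin/sh
    # strip metadata in place, then compress by type
    f="$1"
    exiftool -all= -overwrite_original "$f"
    case "$f" in
      *.png) pngquant --force --output "${f%.png}.min.png" "$f" ;;
      *)     magick "$f" -strip -quality 80 "${f%.*}.min.jpg" ;;
    esac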
I also have a radio script to play internet streams with mpv (?). Other random stuff:
A password or token generator, simple or complicated random text (a minimal sketch follows after this list).
Scripts to list, view and delete mail messages inside POP3 servers
n, to start Nautilus from terminal in the current directory.
lastpdf, to open the last file I printed as PDF.
lastdownload, to view the names of the n most recent files in the Downloads directory.
And many more but those are the ones that I use often and I remember without looking at ~/bin
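For the simple end of that password/token generator, something like this is a common idiom (a sketch, not the commenter's actual script):

    # 24 random alphanumeric characters from the kernel CSPRNG
    LC_ALL=C tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 24; echo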
It's been a while since I read something this useful!
There's also some very niche stuff that I won't use but found funny.
The NATO phonetic alphabet one cracked me up. My dude, you don't need that; call center employees don't know it, just say "S as in Sugar" like your grandma used to.
The NATO alphabet is made of commonly known words that are hard to misspell and hard to mishear (at least the initial phonemes). The person on the other end doesn't need to be able to recite the words, they just need to be able to hear "november" and recognize that it starts with N.
The NATO phonetic alphabet is still useful even if the other party doesn't know it; I've used it a bunch of times on the phone to spell out my 10-letter last name. Saves quite a lot of time and energy for me vs saying "letter as in word" for each letter.
Exactly. The listening party doesn't need to have knowledge of the NATO alphabet to still benefit from it since they are just regular English words.
I once had someone sound out a serial number over a spotty phone connection years ago and they said "N as in NAIL". You know what sounds a lot like NAIL? MAIL.
And that is why we don't just arbitrarily make up phonetic alphabets.
> saying "letter as in word" for each letter
Which often just confuses things further.
Me: My name is "Farb" F-A-R-B. B as in Baker.
Them: Farb-Baker, got it.
"M as in Mancy."
Right but it's not much more useful than any other phonetic alphabet the other party doesn't know, including the one you make up on the spot.
If you're me, it's still useful because the ones I make up on the spot aren't great.
"S-T-E-V-E @ gmail.com, S as in sun, T as in taste, ..." "Got it, fpeve."
I dunno, there's a pretty good chance that the one people spent time and effort designing (to replace earlier efforts, with the goal of reducing ambiguity over noisy connections, where mistakes could cost lives) is probably better than what you improvise on the spot.
When I worked in customer service, I asked a teammate what I could do to spell back something the customer said, and she taught me that system, it helped me a lot.
I once had the customer service agent for Iberia (the Spanish airline) confirm my confirmation number with me using it.
It worked with me and I guess it must have usually worked for him in most of his customer interactions.
If you use x0vnc (useful if you use a Linux machine both from the attached screen and from VNC, and in a bunch of other scenarios), copy and paste to and from the VNC client is not implemented, which is quite frustrating. Here are 2 scripts that do that for you; I now use this all day. https://github.com/francoisp/clipshare
Where are the one-letter aliases? My life got better after I aliased k=kubectl.
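If you go that route, the kubectl docs' trick for keeping tab completion on the alias is worth adding (bash shown; assumes the output of `kubectl completion bash` is already sourced):

    alias k=kubectl
    complete -o default -F __start_kubectl k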
The markdownquote can be replaced by (at least in vim):
^ (jump to the beginning)
ctrl+v (block selection)
j (move cursor down)
shift+i (bulk insert?)
type ><space>
ESC
Also: `:'<,'>s/.*$/> \0/`
No offense, but a lot of those scripts are pretty hacky. They may work for the user, but I would not use them without reviewing them and adapting them to my workflow.
mksh is already the MirBSD Korn SHell
Looks very useful!
cool collection