So I'm sure we've all spent time writing scripts or figuring out CLIs for that one project we're working on, and then kind of gone on to forget what we did. Then, when another project comes along later, you wish you had that script again so you could see how you did that thing you did.
Personally, I used to just check random scripts into a repo as a kind of "archive" of all my scripts. But I wanted a better way to organize and use these things.
For years I've been building and collecting these scripts into a CLI that I call Devtools to make it so that each script is a subcommand.
I've had a lot of my friends and coworkers ask me to open-source it so they could use it and see how some things are done in Bash, what tools I use, etc. So...here's that CLI!
But what I'd honestly like is more...
So what are your useful scripts or CLIs you've built? Or what's that script you wrote years ago that you now swear by? Or what's that one application you use daily that just makes your life infinitely easier? I want to grow this collection and feed the addiction!
Thanks so much for the other stuff you use! I've been using `bm` for years, but I haven't used `mkcd`, so I'm definitely going to add that.

I'm going to give your library, config, and AoC a good look because that's exactly what I was hoping for in this conversation! :)
In general: The only time I add it to my `~/.bashrc` is when it's either an alias for something simple, or a very simple function. Otherwise, anything that requires more legwork or is bigger than a few lines, I put in `dtools`. I used to put it all in my `~/.bashrc`, but that honestly became kind of cumbersome when I have different configs on different servers, or machines for work vs. personal, etc. And sometimes the exports would differ, making functions work differently, and I didn't want to have to copy that section of my `~/.bashrc` as well every time something updated, hence why I created the `dtools` repo!

To respond to your other comments, I'm going to do my best to respond in the order they show up:
**`printf` vs `echo` through `bold()` vs `$'\e[1m'`**

The `dtools` script is actually compiled, not written by me. So in the vast majority of the project, my code is all in the `src` directory, not in the `dtools` script. In fact, my repo without compilation consists of only 3k lines. The compiled code in the script then makes up the completions, coloring, some error handling, validations, filters, help messages, constraints (like conflicting flags), etc.

So many of the `echo`s you see are from the `bashly` framework, not my code. I often use heredocs for longer or multiline strings (being SUPER careful when using `<<-EOF` to make sure my damn editor is using TABs...that's such a nightmare otherwise 😂). If you look through my code in particular, you'll see I use many of these bash-isms you've mentioned!
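To illustrate that `<<-EOF` caveat, here's a minimal, hypothetical sketch (the `print_usage` name and the usage text are made up, not from dtools). The `-` in `<<-` strips *leading TAB characters only*, which is why an editor that silently converts tabs to spaces breaks the layout:

```shell
#!/usr/bin/env bash
# The heredoc body below is indented with real TAB characters;
# <<- strips those leading tabs from the output. If an editor had
# converted them to spaces, the spaces would survive into the output.
print_usage() {
	cat <<-EOF
	Usage: dtools <command> [args]
	See 'dtools --help' for details.
	EOF
}

print_usage
```

With tabs intact, each output line starts at column 0; with spaces, the indentation would leak through.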
So the One vs Many comment is exactly how the repo works! Each subcommand is its own directory and file. So, for example: all `dtools aws` commands are in the `src/commands/aws` directory, and any further subcommands like `dtools aws secretsmanager` are in another subdirectory where each command has its own individual script!

**Bashisms**
I'm familiar with many of the bashisms you mentioned except the `var="$(< file)"` one; that's awesome! I've been trying to migrate away from using `cat` just to output file contents and use more direct, purpose-built methods that are often built into tools (like `jq '.[]' file.json` instead of `cat file.json | jq '.[]'`). However, I'll say that when I'm trying to read each line into an iterable array, I often use `readarray` too.

**`grep | awk | sed`**

I've been trying for years to get more people to look into `awk` because it's amazing! It's so undervalued! `sed` takes some getting used to with the pattern and hold space, but it's worth the initial suffering 😛

**Shellcheck**
I've got my Helix editor set up with Shellcheck! It's awesome! You'll notice if you look at my code directly that there are a number of places where I have to add `# shellcheck disable=SC2154` (a variable is referenced but not assigned). This is because the framework creates and passes those variables to my scripts for me.

**A Loose Offer**
You seem a lot like me in that you do a LOT of bash scripting! So I'll admit that I've looked at the compiled code and noted that the most important code is mine, and while there are a lot of things going on in the compiled script, I agree with most of it. But I've also been a bit concerned about how often it's spawning subshells when it doesn't have to.
I think I can fix some of them with associative arrays if I add a minimum bash version requirement in my config, but I've honestly never tried. I'll check that out now!
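For what it's worth, here's a minimal sketch of the kind of swap I mean (the names are made up, not from dtools or bashly): replacing a per-call command substitution, which forks a subshell every time, with an associative-array lookup, which runs in-process. This is also why it needs a minimum-version gate, since associative arrays arrived in bash 4:

```shell
#!/usr/bin/env bash
# Associative arrays need bash >= 4, hence a minimum-version check.
((BASH_VERSINFO[0] >= 4)) || { echo "bash 4+ required" >&2; exit 1; }

# Subshell style: every "$(...)" call forks a process just to map a value.
color_code() {
  case "$1" in red) echo 31 ;; green) echo 32 ;; *) echo 0 ;; esac
}
via_subshell="$(color_code red)"

# Associative-array style: a plain in-process lookup, no fork.
declare -A COLOR_CODES=([red]=31 [green]=32)
via_array="${COLOR_CODES[red]:-0}"

echo "$via_subshell $via_array"
```

Both approaches yield the same value; the array version just avoids a fork on every lookup, which adds up in hot paths like completion generation.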
You make a solid point about a lot of this that should maybe be updated in the Bashly framework itself, so maybe we could work together to give the framework better conventions like the ones you've mentioned?
If you mean from my dotfiles, that's wild. A friend of mine wrote his own implementation in Rust, but I've not really used his version, and I'm not sure it's on GitHub.
While I'm not currently using it, it's on my todo list to take a real look at chezmoi for these per-machine differences, especially as I'm always between Linux, Windows & WSL. While chezmoi is outside the scope of this topic, it seems like a pretty solid configuration management option...and probably safer than what I'm doing (`ln -s`).

My "solution" is a collection of templates I'll load into my editor (`nvim`, with my ~~lackluster~~ plugin), which contain the basics for most scripts of a certain type. The only time I'll write something that relies on something that isn't builtin, e.g. a customization, is if I:

- `require()` the item & add testing for it
- `[[ -z "${var}" ]]`, or `command -v` usually

For my work, every script is usually as "batteries included" as reasonable, in whatever language I'm required to work with (`bash`, `sh`, `pwsh` or `groovy`). That said, the only items that appear in nearly every script at work are:

- `main()`, `show_help()`, etc.
- `log()`
- a `curl` wrapper for Mailgun

Transpiling `bash` into `bash` is probably the weirdest workflow I've ever heard of. While I can see some benefit of a "framework" mentality, if the 'compiled' result is a 30K-line script, I'm not sure how useful it is, IMO. For me at least, I view most shell scripts as simple automation tools, and an exercise in limitation.
I did see some of that, even in the transpiled `dtools` monolith.

Just be aware that this reads the full contents into a variable, not an array; I would generally use `mapfile`/`readarray` for multiline files. As for the `jq` example, you should be able to get away with `jq '.[]' < file.json`, which is also POSIX when that's a concern.

I don't think I'm the right person for that job -- I'm both unfamiliar with Ruby and have no desire to interact with it. I'm also pretty opinionated about shell generally, and likely not the right person to come up with a general spec for most people.
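The `$(< file)` vs `mapfile` distinction mentioned above, as a quick sketch (the temp file and its contents are just for illustration):

```shell
#!/usr/bin/env bash
tmp="$(mktemp)"
printf 'alpha\nbeta\n' > "$tmp"

# $(< file): one flat string; trailing newlines are stripped.
contents="$(< "$tmp")"

# mapfile -t: an array with one element per line, newlines removed.
mapfile -t lines < "$tmp"

echo "line count: ${#lines[@]}"

rm -f "$tmp"
```

So iterating over `$contents` with word splitting behaves very differently from iterating over `"${lines[@]}"`, which preserves each line whole.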
Additionally, my initial reaction, that `bashly` seems like a solution in search of a problem, probably isn't healthy for the project.