So I'm sure we've all spent time writing scripts or figuring out CLIs for that one project we're working on, only to go on and forget what we did. Then, when another project comes along later, you wish you had that script again so you could see how you did that thing you did.
Personally, I used to just check random scripts into a repo as a kind of "archive" of all my scripts. But I wanted a better way to organize and use these things.
For years I've been building and collecting these scripts into a CLI that I call Devtools to make it so that each script is a subcommand.
I've had a lot of my friends and coworkers ask me to open-source it so they could use it and see how some things are done in Bash, what tools I use, etc. So...here's that CLI!
But what I'd honestly like is more...
So what are the useful scripts or CLIs you've built? Or what's that script you wrote years ago that you now swear by? Or what's that one application you use daily that just makes your life infinitely easier? I want to grow this collection and feed the addiction!
I've gotten to the point that anything "useful" enough goes in a repo -- unless it's for work, since I'd otherwise be polluting our "great" Subversion server...
Functions
I've stopped using as many functions, though a few are just too handy:
- `bm()`: super basic bookmark manager, `cd` or `pushd` to some static path -- a far cry from everything `zoxide` has to offer
- `mkcd()`: essentially `mkdir -p && cd`, but I use it enough that I forgot it isn't standard

I'm also primarily a WSL user these days (one day, I'll move permanently) -- to deal with `ssh-agent` shenanigans there, I also rely on `ssh.sh` in my config. I should at some point remove `kc()`, as I don't think I'll ever go back.

Scripts
Despite having a big collection of scripts, I don't use these too often, but still wanted to mention:
- `md2d.sh`: `pandoc` wrapper, mostly using it to convert Markdown into docx
- `gitclone.sh`: `git clone` wrapper, but I use it as `gcl -g` quite often

A lot of my more useful scripts are, unfortunately, work related -- and probably pretty niche.
"Library"
I also keep a library of sorts for reusable snippets, which I'll `source` as needed. The `math` & `array` libs in particular are very rarely used -- AoC, for the most part.

Config
Otherwise, my `bash` config is my lifeblood -- without it, I'm pretty unproductive.

dtools comments

Had a look through your repo, and have some thoughts if you don't mind. You may already know about several of these items, but I'm not going to be able to sift through 30K lines to see what is/isn't known.
`printf` vs `echo`

There's a great writeup on why `echo` should be used with caution. It's probably fine, but I wanted to mention it -- personally, I'll use `echo` when I need static text and `printf` doesn't make sense otherwise.

Multiline - `printf` vs HEREDOC

In the script, you've got something like 6K lines of `printf` statements to show various usage text. Instead, I'd recommend using HEREDOCs (`<<`).
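As a sketch of what that could look like (command and option names here are hypothetical):

```shell
# One heredoc replaces a pile of printf calls for usage text.
usage() {
  cat <<EOF
Usage: dtools [command] [options]

Commands:
  aws     AWS helpers
  help    Show this message
EOF
}

usage
```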
HEREDOCs can also be used to feed basically any `stdin` stream, not just `cat`.

`bold()` vs `$'\e[1m'`

On a related note, rather than using functions -- and by extension subshells (`$(...)`) -- to color text, you could set each format up front in an associative array (a hash table, sort of), callable with `${ANSI_FMT["key"]}`, which expands like any other variable. As such, the text is inserted directly, without needing to spawn a subshell.

Additionally, the `$'...'` syntax is a bashism that expands escape sequences directly; so `$'\t'` expands to a literal tab character. Unlike `$"..."` or plain double quotes, `$'...'` does not expand `$` expressions -- `$"\e[31m$HOME\e[0m"` substitutes `$HOME`, while `$'\e[31m$HOME\e[0m'` keeps it literal -- and only the `$'...'` form turns `\e` into a real escape character.

Do also note that a trailing `$'\e[0m'` (or equivalent reset) is required with this method, as you're no longer confining the formatting to a subshell. I personally find this tradeoff worthwhile, though. But, I also don't use it very often. The heredoc example from before would then look something like the following.
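A self-contained sketch of both pieces together (the array and key names are my own):

```shell
# Escape codes live in an associative array; $'...' turns \e into a real
# ESC byte, so no subshell is needed at use time.
declare -A ANSI_FMT=(
  [bold]=$'\e[1m'
  [reset]=$'\e[0m'
)

usage() {
  # Unquoted delimiter, so ${ANSI_FMT[...]} expands inside the heredoc.
  cat <<EOF
${ANSI_FMT[bold]}Usage:${ANSI_FMT[reset]} dtools [command] [options]
EOF
}

usage
```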
As a real-world example, a recent work project of mine does the same kind of thing.
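The original snippet isn't reproduced here, but the gist was roughly this (the logfile path and message are stand-ins):

```shell
LOG_FILE="$(mktemp)"   # stand-in for the real logfile path

log() {
  local msg="\e[32m[INFO]\e[0m $*"
  # %b expands the \e escapes, so the terminal sees real color codes...
  printf '%b\n' "$msg"
  # ...while sed (fed a real ESC byte via $'...') strips the ANSI
  # sequences back out before the line lands in the logfile.
  printf '%b\n' "$msg" | sed $'s/\e\\[[0-9;]*m//g' >> "$LOG_FILE"
}

log "deploy finished"
```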
Here, I'm using `printf`'s `%b` to expand the color code, then later using `$'...'` with `sed` to strip those codes back out when writing to a logfile. While I'm not using an associative array in this case, I do something similar in my `log.sh` library.

One vs Many
Seeing that there's nearly 30K lines in this script, I would argue it should be split up -- given the insane length of this monolith, it's probably worth it. You can easily split scripts up, to keep everything organized or to make code reusable, by `source`ing the pieces; to use my `log.sh` library, for example, I do something like this.
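A sketch of the pattern -- here I fake up a tiny `log.sh` so the snippet is self-contained; the real library obviously does more:

```shell
# Stand-in for the real log.sh, created just for demonstration.
libdir="$(mktemp -d)"
cat > "${libdir}/log.sh" <<'EOF'
log() { printf '[log] %s\n' "$*"; }
EOF

# The actual pattern: source the library, then use its functions.
# shellcheck source=/dev/null
source "${libdir}/log.sh"
log "doing the thing"   # prints "[log] doing the thing"
```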
`run()` and related functions could stay within `dtools`, but each part could be split out into its own file, which does its own subcommand argparse?

Bashisms
The Wooledge wiki page on Bashisms is a great writeup explaining the differences between `POSIX` and `bash` -- more specifically, what kinds of tools are available out of the box when writing for `bash` specifically. Some that I use on a regular basis:
- `&>` or `&>>`: redirect both `stdout` & `stderr` to some file/descriptor
- `|&`: shorthand for `2>&1 |`
- `var="$(< file)"`: read file contents into a variable
- `mapfile` or `readarray`: prefer these for most of those cases (when in `bash`)
- `(( ... ))`: arithmetic expressions, including C-style for-loops, e.g. `(( myvar >= 1 ))` or `(( count++ ))`

grep | awk | sed

Just wanted to note that
`awk` can do basically everything. These days I tend to avoid it, but it can do it all -- the pipeline at ln. 6361, for instance, could collapse into a single `awk` invocation. The downside of `awk` is that it can be a little slow compared to `grep`, `sed` or `cut`. More power in a single tool, but maybe not as performant.

Shellcheck
I'm almost certain I'm preaching to the choir, but will add the recommendation for shellcheck or
`bash-language-server` more broadly.

While there's not much it spat out for `dtools`, there are some items of concern -- notably, unclosed strings.

A Loose Offer
If interested, I could look at rewriting `dtools`, taking into account the items I've listed above, amongst others. Given the scope of the project, that's quite the undertaking from a new set of eyes, but figured I'd throw it out there. Gives me something to do over the upcoming long weekend.

Thanks so much for the other stuff you use! I've been using
`bm` for years, but I haven't used `mkcd`, so I'm definitely going to add that. I'm going to give your library, config, and AoC a good look, because that's exactly what I was hoping for in this conversation! :)
In general: the only time I add something to my `~/.bashrc` is when it's either an alias for something simple, or a very simple function. Otherwise, anything that requires more legwork or is bigger than a few lines goes in `dtools`. I used to put it all in my `~/.bashrc`, but that honestly became kind of cumbersome once I had different configs on different servers, or machines for work vs personal, etc. And sometimes the exports would differ, making functions work differently, and I didn't want to have to copy that section of my `~/.bashrc` every time something updated -- hence why I created the `dtools` repo!

To respond to your other comments, I'll do my best to go in the order they show up:
`printf` vs `echo` through `bold()` vs `$'\e[1m'`

The `dtools` script is actually compiled, not written by me. So in the vast majority of the project, my code is all in the `src` directory, not in the `dtools` script. In fact, my repo without compilation consists of only 3K lines. The compiled code in the script then makes up the completions, coloring, some error handling, validations, filters, help messages, constraints (like conflicting flags), etc. So many of the
`echo`s you see are from the `bashly` framework, not my code. I often use heredocs for longer or multiline strings (being SUPER careful when using `<<-EOF` to make sure my damn editor is using TABs...that's such a nightmare otherwise 😂). If you look through my code in particular, you'll see I use many of these bashisms you've mentioned!
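For anyone following along, the tab thing bites because `<<-` strips leading TABs only -- spaces don't count. A way to see it unambiguously, writing the tab via `printf` so no editor can mangle it:

```shell
demo="$(mktemp)"
# \t writes a real TAB character; <<- strips leading tabs from the body
# *and* the closing delimiter. If an editor had silently converted that
# tab to spaces, the indentation would survive and the delimiter could
# stop matching.
printf 'cat <<-EOF\n\tHello!\n\tEOF\n' > "$demo"
bash "$demo"   # prints "Hello!" with no leading whitespace
```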
So the One vs Many comment is exactly how the repo works! Each subcommand is its own directory and file. So, for example: all `dtools aws` commands are in the `src/commands/aws` directory, and any further subcommands like `dtools aws secretsmanager` are in another subdirectory, where each command has its own individual script!

Bashisms
I'm familiar with many of the bashisms you mentioned, except the `var="$(< file)"` one -- that's awesome! I've been trying to migrate away from using `cat` just to output file contents, and toward more direct, purpose-built methods that are often built into tools (like `jq '.[]' file.json` instead of `cat file.json | jq '.[]'`). However, I'll say that when I'm trying to read each line into an iterable array, I often use `readarray` too.

grep | awk | sed

I've been trying for years to get more people to look into
`awk`, because it's amazing! It's so undervalued! `sed` takes some getting used to with the pattern and hold spaces, but it's worth the initial suffering 😛

Shellcheck
I've got my Helix editor set up with Shellcheck! It's awesome! You'll notice if you look at my code directly that there's a number of places I have to do
`# shellcheck disable=SC2154` (a variable is referenced but not assigned). This is because the framework creates and passes those variables to my scripts for me.

A Loose Offer
You seem a lot like me in that you do a LOT of bash scripting! So I'll admit to the fact that I've looked at the compiled code and noted that the most important code is mine, and while there's a lot of things going on in the compiled script, I agree with most of it. But I've also been a bit concerned about how often it's spawning subshells when it doesn't have to.
I think I can fix some of them with associative arrays if I add a minimum bash version requirement in my config, but I've honestly never tried. I'll check that out now!
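For reference, a minimal version gate might look like this (the error wording is my own):

```shell
# Associative arrays need bash >= 4 (macOS still ships 3.2 by default),
# so fail fast on older interpreters before using declare -A.
if (( BASH_VERSINFO[0] < 4 )); then
  printf 'error: bash >= 4 required, found %s\n' "$BASH_VERSION" >&2
  exit 1
fi

declare -A cache=()
cache[key]="value"
printf '%s\n' "${cache[key]}"
```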
You make a solid point that a lot of this should maybe be updated in the Bashly framework itself -- maybe we should work together to give the framework better conventions like the ones you've mentioned?
If you mean from my dotfiles, that's wild. A friend of mine wrote his own implementation in Rust, but I've not really used their version; I'm not sure it's on GitHub, either.
While I'm not currently using it, it's on my todo list to take a real look at chezmoi for these per-machine differences, especially as I'm always between Linux, Windows & WSL. While chezmoi is outside the scope of this topic, it seems like a pretty solid configuration-management option...and probably safer than what I'm doing (`ln -s`).

My "solution" is a collection of templates I'll load into my editor (
`nvim`, with my ~~lackluster~~ plugin), which contains the basics for most scripts of a certain type. The only time I'll rely on something that isn't builtin, e.g. a customization, is if I `require()` the item & add testing for it (`[[ -z "${var}" ]]` or `command -v`, usually).

For my work, every script is usually as "batteries included" as reasonable, in whatever language I'm required to work with (`bash`, `sh`, `pwsh` or `groovy`). That said, the only items that appear in nearly every script at work are:

- the usual scaffolding: `main()`, `show_help()`, etc.
- `log()`
- a `curl` wrapper (for Mailgun)

Transpiling `bash` into `bash` is probably the weirdest workflow I've ever heard of. While I can see some benefit of a "framework" mentality, if the 'compiled' result is a 30K-line script, I'm not sure how useful it is, IMO. For me at least, I view most shell scripts as simple automation tools, and an exercise in limitation.
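For what it's worth, that `require()` guard habit is roughly this shape (a sketch; the error wording is mine):

```shell
# Bail out early if a non-builtin dependency is missing, rather than
# failing halfway through the script.
require() {
  command -v "$1" >/dev/null 2>&1 && return 0
  printf 'error: missing required tool: %s\n' "$1" >&2
  return 1
}

require sed && echo "sed is available"
```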
I did see some of that, even in the transpiled `dtools` monolith.

Just be aware that this reads the full contents into a variable, not an array. I would generally use
`mapfile`/`readarray` for multiline files. As for the `jq` example, you should be able to get away with `jq '.[]' < file.json`, which is also POSIX when that's a concern.

I don't think I'm the right person for that job -- I'm both unfamiliar with Ruby and have no desire to interact with it. I'm also pretty opinionated about shell generally, and likely not the right person to come up with a general spec for most people.
Additionally, my initial reaction -- that `bashly` seems like a solution in search of a problem -- probably isn't healthy for the project.