I’ve been working on a web project for a few months now and thought I’d take stock of the tools I’ve been using. So I went through my zsh_history (fun fact: since March I’ve run 3,950 commands for this project). These are only the “generic” tools; I’ve excluded any stack-specific tools, like ghci, pgcli, docker, spago, etc. I’ve also excluded obvious tools and built-ins, like ssh, scp, and git.

{html,js,css}-beautify

When I’m on the frontend, it’s not uncommon to deal with minified code. I’ll often pipe curl’s output through one of these before redirecting it into a file, e.g. curl www.google.com | html-beautify > google.html.

Note that js-beautify will also beautify json, but I find the next tool to be better for that.

jq

jq is the one and only tool I’ve needed to mess with json on the command line. The easiest use of it is to provide no arguments, which will format and color the output:

curl jsonplaceholder.typicode.com/posts | jq

jq has its own language that can do pretty much any json manipulation you’d want, for example something like this:

curl jsonplaceholder.typicode.com/posts | jq ".[0:8]|map(.title)"

will give you just the titles of the first 8 results. There’s a lot to learn, and the manual does a pretty good job. However, like with most things, googling for Stack Overflow answers is often easier than reviewing the manual.
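select is the other jq verb I reach for constantly: it filters array elements by a predicate. A small sketch with inline sample data (the field names mirror the jsonplaceholder posts above; -c just compacts the output):

```shell
# Keep only posts by userId 1, then pull out their titles
echo '[{"userId":1,"title":"first"},{"userId":2,"title":"second"},{"userId":1,"title":"third"}]' \
  | jq -c 'map(select(.userId == 1) | .title)'
# → ["first","third"]
```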

scc

This is more of a vanity thing than a really useful tool, but I like to check in occasionally on my projects and see how much code I’ve written. Just running scc in the root directory will give a breakdown of how many lines of code there are in each language, in a nice table. It also has a fun section with estimates of how much it would cost to develop a similar codebase, how many people it would take, and how long it would take (not to be taken seriously, of course).

ncdu

ncdu gives a nice interactive view of disk usage. I encountered a couple issues with disk usage since I was doing a lot of scraping for this project, and ncdu was very useful to find out where my disk space was going.

htop

htop is just a better top. If you don’t know top: it shows an overview of CPU and memory usage, plus a list of all processes, which you can sort by % CPU or % memory. This makes it easy to find and kill runaway processes, or to figure out why your computer is slow (in my experience it’s almost always the 120 browser tabs I’ve opened trying to debug an issue).

tmux

tmux is a program to manage multiple terminals (or a terminal multiplexer, in nerd). I usually have a setup like this running:

  • Frontend window
      • Editor pane
      • Dev server pane
      • Misc commands pane
  • Backend window
      • Editor pane
      • Server pane
      • REPL
  • Deployment/architecture/misc window
      • Editor pane
      • Docker compose logs
      • ssh session

The key commands take some getting used to, but once you’re comfortable, it becomes a very powerful tool, especially if you’re a heavy terminal user. I never use “vanilla” terminal sessions anymore (unless it’s to ssh into a server to start a tmux session there!).

tmux is also great because of its client-server model. You can quit your terminal session and your tmux session will still be alive. I use this as a hacky way to start long-running processes on remote servers, since the tmux session will stay alive after I’m disconnected.
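That long-running-process trick can even be scripted, since tmux will happily create a detached session from the command line; a sketch (the session name and the python scrape.py command are made up):

```shell
# Start a detached session running a long job (name and command are examples)
tmux new-session -d -s scrape 'python scrape.py'

# Later, even from a fresh ssh login: list sessions and reattach
tmux ls
tmux attach -t scrape
```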

I’ve also heard good things about byobu, but haven’t seen anything that justifies relearning key commands and setting it up.

trash-cli

This is an rm replacement of sorts. It will use the system’s trash instead of just erasing your files from existence. Now I delete stuff without the terror that comes with writing an rm command. I wouldn’t recommend aliasing rm to trash-put, but try to get in the habit of using trash-put when appropriate (everything except scripts, imo).

live-server

This tool takes a directory and serves files from it. It can be useful for prototyping html/js/css stuff, and it also provides live reloading, which is nice.