
software shell scripts

whiteVHS

joined jan 12, 2024

this thread is dedicated to the discussion of shell scripts - share ones you've made or found in the wild; share tips, tricks, and hacks you've learned; or, if you're new to shell scripting, ask about it!

i frequently write little shell scripts to help improve my computing experience. for example, just yesterday i wrote this, called mutemic:

#!/usr/bin/env bash
checkmute() {
    # grep -q keeps the matched line off stdout; we only need the exit status
    if pactl get-source-mute 0 | grep -q yes; then
        notify-send "Microphone [MUTED]" "Your mic is now muted."
    else
        notify-send "Microphone [UNMUTED]" "Your mic is unmuted."
    fi
}
pactl set-source-mute 0 toggle && checkmute

i have this script set to alt+m on my keyboard. it toggles mute for my microphone, then notifies me if it is muted or not by using my desktop's native notification system. neat, right?
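the keybinding itself lives outside the script. as one example, if you run a hotkey daemon like sxhkd, a hypothetical sxhkdrc entry could look like this (assumes mutemic is somewhere on your PATH; your desktop's own shortcut settings work just as well):

# ~/.config/sxhkd/sxhkdrc (hypothetical)
alt + m
    mutemic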

it's a fun way of learning to interact with your system and if you're creative enough you can make some pretty cool things.

posted 2/4/2024, 5:26 pm

joined jan 1, 2024

notify-send --icon="x-office-calendar" "Discordian Date" "$(ddate)"

My first shell script, which makes a little pop-up alert that tells you the date on the Discordian Calendar. It's not much, but it was a nifty little introduction to shell scripts as a concept. I haven't made another since because I haven't needed one yet, but I can see the power behind them now.
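a minimal way to grow it into a standalone script; the guard is an addition here, since ddate isn't installed everywhere (it's packaged separately from util-linux on some distros):

#!/usr/bin/env bash
# bail out with a message instead of sending an empty notification
command -v ddate >/dev/null 2>&1 || { echo "ddate not found" >&2; exit 1; }
notify-send --icon="x-office-calendar" "Discordian Date" "$(ddate)"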

edited 2/5/2024, 1:03 pm

starbreaker

non serviam

joined aug 16, 2023

I didn't mean to, but I ended up building a static site generator using a makefile and shell scripts.

hxtoc and hxincl are part of the HTML-XML-utils package.

#!/usr/bin/env bash

INPUT=$1
OUTPUT=$2

# shellcheck source=common.sh
. "./common.sh"
. "${INPUT/.htm/.sh}"

PAGE_URL=$(echo "${INPUT}" | sed -e "s:site/::" -e "s:htm:html:")
SOURCE_TEXT=$(echo "${PAGE_URL}" | sed -e "s:html:txt:")
SUBJECT=$(echo "${PAGE_TITLE}" | sed -e "s/ /%20/g")

# Use tools from HTML-XML-utils to implement server-side includes on localhost.
# Use HTML Tidy to pretty-print markup and make it readable for newbies using "view source".
"./templates/${PAGE_TEMPLATE}.sh" "${INPUT}" \
    | hxincl -f -b ./partials/ -x \
    | hxtoc -x -t -l 2 \
    | tidy -config ./tidy-html.conf \
    | sed -e "s|SED_SITE|${SITE}|g" \
          -e "s|SED_TITLE|${PAGE_TITLE}|g" \
          -e "s|SED_SUBJECT|${SUBJECT}|g" \
          -e "s|SED_DESCRIPTION|${PAGE_DESCRIPTION}|g" \
          -e "s|SED_URL|${URL}|g" \
          -e "s|SED_LANG|${SITE_LANG}|g" \
          -e "s|SED_ICBM_LAT|${ICBM_LAT}|g" \
          -e "s|SED_ICBM_LON|${ICBM_LON}|g" \
          -e "s|SED_PAGE_URL|${PAGE_URL}|g" \
          -e "s|SED_SOURCE_TEXT|${SOURCE_TEXT}|g" \
          -e "s|SED_YEAR|${YEAR}|g" \
          -e "s|SED_AUTHOR_EMAIL|${AUTHOR_EMAIL}|g" \
          -e "s|SED_AUTHOR_URL|${AUTHOR_URL}|g" \
          -e "s|SED_AUTHOR|${AUTHOR}|g" \
          -e "s|SED_LICENSE_DESC|${LICENSE_DESC}|g" \
          -e "s|SED_LICENSE|${LICENSE}|g" \
          -e "s|SED_CREATED_ON|${PAGE_CREATED}|g" \
          -e "s|SED_UPDATED_ON|${UPDATED_DATE}|g" \
          -e "s|SED_UPDATED_DISPLAY|${UPDATED_DISPLAY}|g" \
          -e "s|SED_UPDATED_TIMESTAMP|${UPDATED_TIMESTAMP}|g" \
    > "./${OUTPUT}"
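For reference, this script is normally driven by make (see the makefile later in the thread); invoked by hand it would look something like this, with made-up file names:

./to-html.sh site/about.htm site/about.html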

That's just one script ("to-html.sh"). I've got more in a git repository.

edited 2/6/2024, 3:57 pm

c1

Do not induce vomiting if swallowed.

joined feb 22, 2023

Wrote yet another little bookmark manager for dmenu.

The most important functions are

bmks_dadd() {
    url="$(dmenu -p "URL" <&- | sed -E 's/^https:\/\///g')"
    [ -z "$url" ] && exit
    description="$(dmenu -p "Description" <&-)"
    [ -z "$description" ] && echo "$url" >> $URL_FILE_PATH/$URL_FILE_NAME
    [ -n "$description" ] && echo "$description - $url" >> $URL_FILE_PATH/$URL_FILE_NAME
}

and

bmks_display() {
    sort $URL_FILE_PATH/$URL_FILE_NAME | dmenu | awk '{print $(NF)}' | xargs -Iα firefox α
}
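a sketch of how the two functions might be glued together; the paths and the dmenu action picker here are guesses, not c1's actual code:

#!/bin/sh
URL_FILE_PATH="${HOME}/.local/share/bmks"   # hypothetical location
URL_FILE_NAME="bookmarks"                   # hypothetical file name
mkdir -p "$URL_FILE_PATH"

# ... bmks_dadd and bmks_display defined as above ...

# pick an action in dmenu, then dispatch to the matching function
case "$(printf 'add\ndisplay\n' | dmenu -p "bookmarks")" in
    add)     bmks_dadd ;;
    display) bmks_display ;;
esac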

posted 2/5/2024, 5:51 pm

deleted_user

joined some time ago

content has been deleted

posted 2/7/2024, 5:35 am

starbreaker

non serviam

joined aug 16, 2023

quoting Melvian:

This has become my new personal favorite way to write build scripts for my C-based software projects. This assumes zsh though. I'm not sure how well it works in bash (looks like it might?).

Basically, sources, include directories, library directories, etc. are stored as array variables. Then special variable-access syntax converts these into the necessary parameters (like prepending -I to each include directory), which are concatenated into an ARGS variable. Finally, ARGS is passed to clang in such a way that all the arguments are expanded correctly.
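Melvian's actual script was lost with the deleted post, but a minimal zsh sketch of the technique being described might look like this (all file, directory, and output names invented):

#!/usr/bin/env zsh
SRCS=(main.c util.c)       # sources as an array
INC_DIRS=(include vendor)  # include directories
LIBS=(m pthread)           # libraries

# per-element substitution prepends the flag to every array entry
ARGS=(${INC_DIRS/#/-I} ${LIBS/#/-l} $SRCS)

# zsh expands arrays one word per element, so clang sees each flag separately
clang $ARGS -o prog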

For a moment I thought this was a makefile, not a shell script. Also, I'm surprised the script doesn't start with #!/usr/bin/env zsh so that it can execute itself. Or did you leave that out for your post?

posted 2/7/2024, 4:07 pm

deleted_user

joined some time ago

content has been deleted

edited 2/8/2024, 5:16 am

whiteVHS

joined jan 12, 2024

quoting starbreaker:

I didn't mean to, but I ended up building a static site generator using a makefile and shell scripts.

dude, that's incredible! truly utilizing these tools to their fullest extent.

posted 3/15/2024, 4:23 pm

whiteVHS

joined jan 12, 2024

quoting c1:

Wrote yet another little bookmark manager for dmenu.

sweet! i might see if this works with rofi

posted 3/15/2024, 4:24 pm

starbreaker

non serviam

joined aug 16, 2023

quoting whiteVHS:

dude, that's incredible! truly utilizing these tools to their fullest extent.

Thanks. Yesterday I put together a script to create an XML sitemap for non-AI/non-commercial search engine crawlers.

#!/usr/bin/env bash

# shellcheck source=common.sh
. "./common.sh"

LASTMOD_DATE=$("${DATE}" -u -Iseconds)

ENTRIES=$( (find site -name '*.html' && find site -name '*.xml') \
    | "${SORT}" -u \
    | awk -F '\t' '{printf "\t<url>\n\t\t<loc>%s</loc>\n\t\t<lastmod>__DATE__</lastmod>\n\t\t<changefreq>weekly</changefreq>\n\t</url>\n", $1}' \
    | "${SED}" -e "s|site/|${URL}/|g" \
               -e "s|__DATE__|${LASTMOD_DATE}|g")

cat <<EOF
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${ENTRIES}
</urlset>
EOF
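a quick well-formedness check on the generated file, assuming libxml2's xmllint is available:

xmllint --noout site/sitemap.xml && echo "sitemap OK"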

And this is my makefile. I use recursive make for selective parallelization to take advantage of near-ubiquitous multi-core CPUs.

.SUFFIXES: .png .jpg .avif .htm .html .rss .headline .sh .tsv

.jpg.avif:
	avifenc --ignore-exif --speed 4 --min 20 --max 22 --qcolor 50 --qalpha 80 "$<" "$@"
.png.avif:
	avifenc --ignore-exif --speed 4 --min 20 --max 22 --qcolor 50 --qalpha 80 "$<" "$@"
.htm.html:
	./to-html.sh "$<" "$@"
.htm.rss:
	./to-rss-entry.sh "$<" "$@"
.htm.headline:
	./to-rss-headline.sh "$<" "$@"
.sh.tsv:
	./to-tsv.sh "$<" "$@"
.tsv.rss:
	cat `cut -f4 "$<"` > "$@"
.tsv.html:
	./to-index-html.sh "$<" "$@"

JPGS!=find site/assets/ -name '*.jpg' && find site/media/ -name '*.jpg'
JPG_AVIF=${JPGS:.jpg=.avif}
PNGS!=find site/assets/ -name '*.png' && find site/media/ -name '*.png'
PNG_AVIF=${PNGS:.png=.avif}
HTM!=find site -name '*.htm'
RSSHTML!=find site/blog -name '*.htm'
HTM_HTML=${HTM:.htm=.html}
RSSHTML_RSS=${RSSHTML:.htm=.rss}
RSSHTML_HEADLINE=${RSSHTML:.htm=.headline}
SH!=find site -name '*.sh'
SH_TSV=${SH:.sh=.tsv}
TSV!=find data -name '*.tsv'
TSV_RSS=${TSV:.tsv=.rss}
TSV_HEADLINES=${TSV:.tsv=.headline}
TSV_HTML=${TSV:.tsv=.html}

include sshvars

default: build

help:
	@echo "targets:"
	@grep -E '^[a-zA-Z0-9_-]+:.*?## .*$$' $(MAKEFILE_LIST) \
	| sed -n 's/^\(.*\): \(.*\)##\(.*\)/ \1|\3/p' \
	| sed 's/makefile://g' \
	| column -t -s '|'

exif: ## strip data from source images.
	exiftool -overwrite_original -all:all= site/

data: $(SH_TSV) ## process shell variables with shell scripts
	mkdir -p data
	find site -name "*.tsv" -exec cat {} + | sort -ru > data/everything.tsv
	head -n 10 data/everything.tsv > data/recent.tsv
	head -n 1 data/everything.tsv > data/latest.tsv
	grep RECOMMENDED data/everything.tsv > data/recommended.tsv
	grep blog/entertainment/ data/everything.tsv > data/entertainment.tsv
	grep blog/lewd/ data/everything.tsv > data/lewd.tsv
	grep blog/misc/ data/everything.tsv > data/misc.tsv
	grep blog/personal/ data/everything.tsv > data/personal.tsv
	grep blog/rants/ data/everything.tsv > data/rants.tsv
	grep blog/tech/ data/everything.tsv > data/tech.tsv
	grep blog/writing/ data/everything.tsv > data/writing.tsv
	grep blog/links/ data/everything.tsv > data/links.tsv

includes: $(TSV_HTML) ## process TSV files with shell scripts

indexes: ## update post indexes
	cp data/*.html partials/indexes/
	./latest-post.sh ./data/latest.tsv ./partials/latest-post.html
	./bookmarks.sh ./bookmarks.tsv ./partials/bookmarks.html
	touch -am ./site/index.htm site/blog/*.htm ./site/blog/*/index.htm ./site/bookmarks/index.htm

html: $(HTM_HTML)

rss: $(RSSHTML_RSS) $(RSSHTML_HEADLINE)

content: html rss ## process HTML with shell scripts

images: $(JPG_AVIF) $(PNG_AVIF) ## process image files with ImageMagick

feeds: $(TSV_RSS) ## generate the RSS feed
	cp data/*.rss partials/feeds/
	./to-rss-feed.sh feeds/rss.xml feeds/everything.sh > site/feeds/everything.xml
	./to-rss-feed.sh feeds/rss.xml feeds/headlines.sh > site/feeds/headlines.xml
	./to-rss-feed.sh feeds/rss.xml feeds/recent.sh > site/feeds/recent.xml
	./to-rss-feed.sh feeds/rss.xml feeds/recommended.sh > site/feeds/recommended.xml
	./to-rss-feed.sh feeds/rss.xml feeds/entertainment.sh > site/feeds/entertainment.xml
	./to-rss-feed.sh feeds/rss.xml feeds/lewd.sh > site/feeds/lewd.xml
	./to-rss-feed.sh feeds/rss.xml feeds/misc.sh > site/feeds/misc.xml
	./to-rss-feed.sh feeds/rss.xml feeds/personal.sh > site/feeds/personal.xml
	./to-rss-feed.sh feeds/rss.xml feeds/rants.sh > site/feeds/rants.xml
	./to-rss-feed.sh feeds/rss.xml feeds/tech.sh > site/feeds/tech.xml
	./to-rss-feed.sh feeds/rss.xml feeds/writing.sh > site/feeds/writing.xml
	./to-rss-feed.sh feeds/rss.xml feeds/links.sh > site/feeds/links.xml

headlines: ## create RSS headlines feed
	sed 's/\.rss/\.headline/g' data/everything.tsv > data/headlines.tsv
	cat `cut -f4 data/headlines.tsv` > data/headlines.rss

sitemap: ## generate XML sitemap for legitimate search engines.
	./sitemap.sh > site/sitemap.xml

archives: ## generate downloadable archives
	./archive.sh

text: ## build everything but images
	gmake -j8 data
	gmake -j8 includes
	gmake exif indexes
	gmake -j8 content
	gmake headlines feeds sitemap

build: ## build everything with recursive make calls
	gmake -j8 images
	gmake text
	gmake archives

serve: ## build and run the site on http://localhost:8000 for testing
	python3 -m http.server -d ./site 8000

install: ## build and deploy the site to my hosting provider
	rsync --rsh="ssh ${SSH_OPTS}" \
		--delete-delay \
		--exclude-from='./rsync-exclude.txt' \
		-acvz site/ ${SSH_USER}@${SSH_HOST}:${SSH_PATH}

clean: ## remove generated files from working directory
	find . -name ".DS_Store" -exec rm -f {} +
	find site/ -name "*.html" -exec rm -f {} +
	find site/ -name "*.avif" -exec rm -f {} +
	find site/ -name "*.rss" -exec rm -f {} +
	find site/ -name "*.headline" -exec rm -f {} +
	find site/ -name "*.tsv" -exec rm -f {} +
	rm -f site/feeds/*.xml data/*.{tsv,rss,html} partials/feeds/*.rss partials/includes/*.html

.PHONY: help clean dep feeds archive metadata indexes
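typical invocations, going by the targets above (the ## comments are what the help target scrapes):

gmake          # same as gmake build: images, then text, then archives
gmake help     # list annotated targets
gmake serve    # preview at http://localhost:8000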

posted 3/15/2024, 6:23 pm

hasp

joined mar 1, 2024

I have a few scripts to make my life easier :P

  • i use this when i want to quickly look something up. (wikipedia for the terminal)

#!/usr/bin/env bash
# needs bash, not plain sh: ${query^} is a bash-ism
query="$@"
title=${query^}
query=$(echo $query | tr ' ' '_')
content=$(curl -sL "https://en.wikipedia.org/w/index.php?search=$query" \
    | grep "<p>" \
    | sed -e 's/<[^>]*>//g' -e 's/\&\#[0-9][0-9]//g' -e 's/\;[0-9][0-9]\;//g' -e 's/\;//g' \
    | head -1)
echo -e "\033[1m$title\033[0m\\n$content"
  • aaaannd this one liner for word definitions

curl -s "https://api.dictionaryapi.dev/api/v2/entries/en/$1" | jq . | grep "\"definition\"" | cut -d '"' -f4 | sed 's/^/- /g' | less

edited 3/16/2024, 12:22 pm

starbreaker

non serviam

joined aug 16, 2023

quoting hasp:

I have a few scripts to make my life easier :P

  • i use this when i want to quickly look something up. (wikipedia for the terminal)

#!/usr/bin/env bash
# needs bash, not plain sh: ${query^} is a bash-ism
query="$@"
title=${query^}
query=$(echo $query | tr ' ' '_')
content=$(curl -sL "https://en.wikipedia.org/w/index.php?search=$query" \
    | grep "<p>" \
    | sed -e 's/<[^>]*>//g' -e 's/\&\#[0-9][0-9]//g' -e 's/\;[0-9][0-9]\;//g' -e 's/\;//g' \
    | head -1)
echo -e "\033[1m$title\033[0m\\n$content"

Nice.

quoting hasp:

  • aaaannd this one liner for word definitions

curl -s "https://api.dictionaryapi.dev/api/v2/entries/en/$1" | jq . | grep "\"definition\"" | cut -d '"' -f4 | sed 's/^/- /g' | less

It makes me sad that DICT isn't better maintained, better known, and more popular. You can run a dict server on localhost and do lookups offline.
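for instance, with dictd running locally and a dictionary such as GCIDE loaded (package names vary by distro), lookups are one command:

dict -h localhost serendipity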

posted 3/16/2024, 7:00 pm

joined dec 4, 2022

I decided this weekend that I wanted a better way to take backups of the basement community database, so I found a tutorial online and tweaked it a bit to take a SQL dump of the database and upload it to another server via sftp.

works really well, and the script runs twice per day. The only issue is that both servers are running on linode, so god forbid something happens to the company or the datacenter burns down, we'd be fucked. Should probably have the FTP server with a different hosting provider just to be safe. maybe digitalocean or something

https://www.cyberciti.biz/faq/ubuntu-linux-mysql-nas-ftp-backup-script/

#!/bin/bash

### MySQL Server Login Info ###
MUSER="user"
MPASS="password"
MHOST="ip_address_here"
DB="database_name"

## Automatically detect path to mysql and mysqldump/lftp/gzip ##
MYSQL="$(command -v mysql)"
MYSQLDUMP="$(command -v mysqldump)"
GZIP="$(command -v gzip)"
LFTP="$(command -v lftp)"

## Tmp backup directory ##
BAK="/temp-backups"

### FTP SERVER Login info ###
FTP_USER="ftpuser"
FTP_PASSWORD="ftppass"
FTP_HOST="ftp_server_ip_here"
FTP_USER_HOME="/home/directory/here"

NOW=$(date +"%d-%m-%Y")

# Create the temp folder where we store the database dump
[ ! -d "$BAK" ] && mkdir -p "$BAK"

# dump the database into a timestamped file
FILE=$BAK/$DB.$NOW-$(date +"%T").gz
$MYSQLDUMP --no-tablespaces -u "$MUSER" -h "$MHOST" -p"$MPASS" "$DB" | $GZIP -9 > "$FILE"

# SFTP into the FTP server and copy the dump into a directory
$LFTP "sftp://$FTP_USER:$FTP_PASSWORD@$FTP_HOST" -e "mkdir server_backups/$NOW; cd server_backups/$NOW; mput $BAK/*; quit"

# finally, delete the local database dump
rm -rf $BAK
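for the twice-a-day schedule mentioned above, the crontab entry might look like this (the script path is hypothetical):

0 6,18 * * * /root/scripts/db-backup.sh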

edited 3/21/2024, 4:45 am

dean

lives in a pineapple under the sea

joined sep 22, 2023

This imageboard is made completely in shell script, and it's open source!

edited 3/21/2024, 9:18 am

divsel

joined mar 29, 2024

quoting dean:

This imageboard is made completely in shell script, and it's open source!

that's pretty crazy. I would have picked python to do that any day.

posted 3/29/2024, 12:47 am

dean

lives in a pineapple under the sea

joined sep 22, 2023

quoting divsel:

that's pretty crazy. I would have picked python to do that any day.

Yeah, but it's quite unique, and that makes it quite cool.

It doesn't work for some reason now tho, so I never really got the chance to try it out sad smiley

posted 4/5/2024, 11:48 am

divsel

joined mar 29, 2024

quoting dean:

Yeah, but it's quite unique, and that makes it quite cool.

It doesn't work for some reason now tho, so I never really got the chance to try it out sad smiley

You could always try out my imageboard :P

https://board.divsel.com

It doesn't have the cool backstory of being a shell script but it serves a similar function.

posted 4/6/2024, 3:24 am

dean

lives in a pineapple under the sea

joined sep 22, 2023

quoting divsel:

You could always try out my imageboard :P

https://board.divsel.com

It doesn't have the cool backstory of being a shell script but it serves a similar function.

I already tried it out and used it salute smiley

posted 4/7/2024, 7:32 am
