this post was submitted on 16 Dec 2025

Linux


A community for everything relating to the GNU/Linux operating system (except the memes!)


founded 2 years ago
[–] Object@sh.itjust.works 22 points 3 days ago* (last edited 3 days ago) (3 children)

I have a script named d in my PATH and it contains this:

("$@" > /dev/null 2>&1 &)

It allows me to run any program in a fully detached state in a way that works even if the terminal that started the program closes, and it's as simple as d <command>.
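As a minimal sketch, the whole script is just the snippet above plus a shebang (the file name d is from the comment; the shebang line is an assumption):

```shell
#!/bin/sh
# d: run the given command fully detached.
# A subshell launches "$@" in the background with all output discarded,
# so the program keeps running even after the starting terminal closes.
("$@" > /dev/null 2>&1 &)
```

Drop it somewhere in PATH, mark it executable, and invoke it as d <command>.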

[–] stewie410@programming.dev 4 points 2 days ago (1 children)

Shouldn't you end with & disown to fully detach?

[–] Object@sh.itjust.works 2 points 2 days ago* (last edited 2 days ago) (1 children)

IIRC disown is a shell built-in command, so its use is a bit limited. Not sure if & is also a built-in, but I found disown to not work in some situations. Besides, it's shorter.

[–] stewie410@programming.dev 1 points 2 days ago

shell built-in command

After looking into it a bit more, it's at least a builtin for bash but is otherwise not POSIX. I guess nohup ... & would be the POSIX-compliant equivalent, though still not a builtin.

It's my understanding that & backgrounds, but doesn't necessarily detach, a process -- if the parent process closes, I think the background tasks would still be wait()ed on if only & is used.
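A sketch of that POSIX-portable form (the command name here is just a placeholder):

```shell
# nohup makes the process ignore SIGHUP, & puts it in the background,
# and the redirection detaches its output from the terminal.
nohup sleep 1 > /dev/null 2>&1 &
```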

[–] Redjard@lemmy.dbzer0.com 7 points 3 days ago

Good idea; I've been manually typing out variations of this as needed for years.

[–] Gobbel2000@programming.dev 2 points 2 days ago (2 children)

How does this even work? I get the redirection part, but how is the command executed in a detached state?

[–] Object@sh.itjust.works 3 points 2 days ago

() creates a subshell, and & runs the command in the background. "$@" expands to all of the script's arguments, so the <command> is executed like a normal command. I am not sure why this works, but it has worked more consistently than nohup or disown, and it's a lot shorter than most other solutions.
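A plausible explanation (my reading, not from the thread): the subshell exits as soon as it forks the background job, so the job never lands in the interactive shell's job table, gets re-parented (typically to PID 1), and is therefore never sent SIGHUP or wait()ed on when the terminal closes. The empty job table is easy to observe:

```shell
# Start a background job inside a subshell that exits immediately
(sleep 5 > /dev/null 2>&1 &)
jobs    # prints nothing: the outer shell never registered the job
```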

[–] MadhuGururajan@programming.dev 1 points 2 days ago* (last edited 2 days ago)

The last & is like doing "command &". d takes arguments, and "$@" expands to all of them, so the whole command line gets run.

[–] MoonRaven@feddit.nl 3 points 2 days ago
[–] vk6flab@lemmy.radio 10 points 3 days ago (1 children)
[–] victorz@lemmy.world 4 points 3 days ago* (last edited 3 days ago) (4 children)

What do y'all use awk for really? In 20 years of using Linux, I've never had to use awk. And I've done a looot of scripting in my days. Anything from building my own clone of polybar using eww (with loads of scripts underneath), to automated systems for bulk handling of student assignments back at uni when I used to help out with grading and such.

What's awk good for that other standard utilities can't do?

[–] stewie410@programming.dev 3 points 2 days ago (1 children)

I both love and hate awk -- on the one hand, it provides much the same functionality as similar tools (sed, grep, cut, etc.); but it is a bit of a bear and can be pretty slow.

If you need more "complex" tasks done that would be cumbersome with the rest of the standard tooling, and performance is a non-issue, awk/gawk can probably get it done.

Though, I too am trying to use it as little as possible in scripts. I think multiple subshells/pipes are still better than awk in some cases. The syntax also leaves a lot to be desired...

[–] victorz@lemmy.world 2 points 2 days ago (1 children)

My experience exactly. I'd rather use a specific tool designed for the task than invoke a whole new language. It just feels... icky for some reason.

[–] stewie410@programming.dev 2 points 2 days ago

There are times when dealing with annoying field separators that awk is a more convenient tool -- though, I'm also now at the stage that I want to do as much with bash-builtins as I possibly can.

[–] vk6flab@lemmy.radio 6 points 3 days ago (2 children)

I've been using Linux for 25 years, awk is a more recent addition to my arsenal, but rapidly becoming more and more useful.

For example, awk is extremely helpful if you want to rearrange columns, do math on columns, essentially do things that would take multiple lines of bash with cut and read.
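For example (with made-up input data), a one-liner that both rearranges columns and does math on one of them:

```shell
# Swap the first two columns while totaling the third
printf 'alice 1 10\nbob 2 20\n' |
    awk '{ sum += $3; print $2, $1 } END { print "total:", sum }'
# → 1 alice
#   2 bob
#   total: 30
```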

[–] nik9000@programming.dev 3 points 3 days ago

I used to switch to perl or python if I needed awk. These days I don't tend to run into it as much. Not sure if that was a good choice. But it's how I spent the past 25 years.

[–] victorz@lemmy.world 3 points 3 days ago

That makes sense! I think I'd be running Nushell for this if the scripts didn't need to be very portable, sounds like a good use case for that.

[–] TrumpetX@programming.dev 2 points 2 days ago (2 children)

I use awk instead of cut cause I can remember the syntax better.

ps aux | grep zoom | grep -v grep | awk '{print $2}' | xargs kill -9

[–] stewie410@programming.dev 2 points 2 days ago (2 children)

You can get rid of those greps, btw:

ps aux | awk '/zoom/ && $0 !~ /awk/ {print $2}'

Or just use pgrep.
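A sketch of the pgrep version (zoom being the process name from the example above):

```shell
# pgrep matches process names directly, so no self-match filtering is needed;
# xargs -r (a GNU/BSD extension) skips the kill when nothing matches.
pgrep zoom | xargs -r kill -9
```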

[–] TrumpetX@programming.dev 1 points 1 day ago* (last edited 1 day ago)

grep is everywhere, pgrep is not

[–] victorz@lemmy.world 2 points 2 days ago* (last edited 2 days ago)

The greps were much more legible than this concoction 😅

But I second the use of pgrep. 👌

Better yet, just killall zoom.

[–] victorz@lemmy.world 1 points 2 days ago (1 children)

May I interest you in the command killall? 😁

[–] TrumpetX@programming.dev 1 points 1 day ago (1 children)

The thing with killall is I can't see what it's going to do first. With the greps, I can look first, kill second.

[–] victorz@lemmy.world 1 points 1 day ago

True, then I would personally run pgrep first, then killall. 😊
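That look-first workflow could be sketched as (zoom again as the example target):

```shell
pgrep -a zoom   # -a shows pid plus full command line, so you can inspect first
killall zoom    # then kill every process with that exact name
```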

[–] dontsayaword@piefed.social 4 points 3 days ago* (last edited 3 days ago) (1 children)

I usually use something like awk '{print $2}' to get a bit of some output

Ex: list processes, grep for the line I want, then awk out the chunk I want from that line (the pid):

ps aux | grep myprogram | awk '{print $2}'

[–] hallettj@leminal.space 4 points 3 days ago (1 children)

That's the only thing I know how to do with awk, and I reach for it a lot! cut is purpose-built for that function, and is supposedly easier to understand; but it doesn't seem to just work like awk does.

[–] victorz@lemmy.world 2 points 3 days ago (1 children)

I think cut is a little bit finicky because two consecutive occurrences of the field delimiter count as separate boundaries, so selecting the index between them gives an empty field.

choose is a bit better at this from what I remember, which is like the modern cut, I believe, of course written in Rust.

Otherwise Nushell excels at this sort of thing, although I don't really use it.
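A quick illustration of that finicky behavior, comparing cut with awk's run-collapsing whitespace splitting:

```shell
# Input has two spaces between the fields
printf 'a  b\n' | cut -d' ' -f2     # empty: field 2 is the gap between the spaces
printf 'a  b\n' | awk '{print $2}'  # → b  (awk splits on runs of whitespace)
```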

[–] hallettj@leminal.space 2 points 3 days ago

Oh, I hadn't heard about choose!

I have been using Nushell, and you're right, it is great at parsing input. Commands like detect columns and parse are very nice, and have been supplanting awk for me.

[–] stewie410@programming.dev 2 points 2 days ago

If we're including shell scripts/functions as "terminal commands":

  • bm()
    • Zoxide wasn't quite what I wanted
    • Quick jump to a static list of directories with some alias
  • mkcd()
    • Basic mkdir && cd
    • Some extra stuff for my specific workflow
  • bashlib
    • My small, but helpful, "stdlib"
  • gitclone.sh
    • git clone wrapper without needing to give a full URI
  • md2d.sh
    • pandoc wrapper
    • I'm required to provide documentation at work in docx format, so I use this to avoid Office as much as possible

If we just mean terminal applications:

A couple of bash specific items I'm using quite often these days:

[–] justdaveisfine@piefed.social 10 points 3 days ago
[–] hallettj@leminal.space 7 points 3 days ago (1 children)

When I'm in some subdirectory of a git repository, I use this command to jump to the repo root:

alias gtop="cd \$(git rev-parse --show-toplevel)"
[–] victorz@lemmy.world 5 points 3 days ago (1 children)

What I do with all git related aliases is I alias git to just g in the shell. Then for any alias I want that uses git I just put that alias in the global git config under the alias section.

This avoids polluting the shell with a bunch of git-specific aliases. Just the one, g.
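A sketch of that setup (the specific alias names here are just examples):

```shell
# One shell alias...
alias g=git

# ...and everything else lives in git's own config ([alias] section)
git config --global alias.st status
git config --global alias.sw switch
# Now `g st` runs `git status`, `g sw` runs `git switch`, etc.
```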

[–] hallettj@leminal.space 3 points 3 days ago (1 children)

I certainly see the value in this strategy! But I'm not going to give up my top-level aliases. I enjoy saving two keystrokes too much!

Here are my most used aliases (these ones use Nushell syntax):

alias st = git status
alias sw = git switch
alias ci = git commit
alias lg = git log --color --graph '--pretty=format:%Cred%h%Creset -%C(yellow)%d%Creset %s %Cgreen(%cr) %C(bold blue)<%an>%Creset' --abbrev-commit
alias push = git push

I was also delighted to learn that I could get the same short aliases for corresponding fugitive commands in vim/neovim using the vim-alias plugin:

-- This is a lazy.nvim plugin module
return {
  'Konfekt/vim-alias',
  config = function()
    -- Shortcuts for git operations to match some of the shell aliases I have.
    -- For example, `:sw ` expands to `:Git switch `
    vim.cmd [[Alias sw Git\ switch]]
    vim.cmd [[Alias ci Git\ commit]]
    vim.cmd [[Alias pull Git\ pull]]
    vim.cmd [[Alias push Git\ push]]
    vim.cmd [[Alias show Git\ show]]
    vim.cmd [[Alias re Git\ restore]]
    vim.cmd [[Alias lg GV]]
  end,
}

Fugitive is very nice for integrating git workflows in the editor, and its commands have very nice tab completion for branches and such.

[–] victorz@lemmy.world 3 points 3 days ago* (last edited 3 days ago) (1 children)

two keystrokes

For me I'd be saving one keystroke. Status for me would be g s, g c for commit, and so on. Single letter aliases for the most common commands, two letters for less common in a conflict. 😁

But these days since a few years back I just use lazygit (aliased to lg btw, lol).

Everything in lazygit is basically just single keystrokes also. c for commit, etc. Very handy.

Fugitive

Cool beans, sounds like a good tool! I'm on team Helix since a few years, after being a vim/nvim user for about a decade, and emacs a couple years before that. Helix's paradigm just makes so much sense. 🎯👌 Jumping around symbols intra-file and inter-file, and LSP support built-in, no fussing. Worth a try for a few weeks if you ask me.

[–] hallettj@leminal.space 2 points 3 days ago (1 children)

Oh yeah, I do find Helix interesting! I sometimes recommend it to people who don't have a background with modal editing as a batteries-included option for getting started. I have tried it a little bit myself. It's hard for me to give up leap.nvim and fugitive, which is holding me back.

I've been meaning to try out dedicated git programs to see how comfortable I can be without fugitive. Tig is one that caught my eye. Or sometimes I even think about using Gitbutler because its virtual branch feature seems very useful, and I haven't seen any other tool that does that.

[–] victorz@lemmy.world 2 points 3 days ago* (last edited 3 days ago)

I think the best way is just take the leap, and try it out for real. 😉

I used Tig before lazygit actually. It's great for getting an overview of history. But lazygit I think is more focused on the current state, and workflow-oriented. It is very easy to drop commits, rebase, edit commits, etc.

I'm not sure what virtual branches are or why I would need them but sounds interesting. 😅

[–] deifyed@lemmy.wtf 6 points 3 days ago (2 children)

Maybe "alias u='cd ..'" or "alias mkcd=mkdir -p $1 && cd $1"
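One caveat: a plain alias can't reference $1 (aliases are simple text substitution), so the mkcd variant is usually written as a shell function instead, e.g.:

```shell
# mkdir-and-cd; must be a function (not a script) so the cd affects the caller
mkcd() {
  mkdir -p "$1" && cd "$1"
}
```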

[–] hallettj@leminal.space 2 points 3 days ago

I like mkcd! I have the same thing. Although what I use more is a function I pair with it that specifically creates a temporary directory, and cds to it. It's useful for temporary work, like extracting from a zip file.

These are my Nushell implementations:

# Create a directory, and immediately cd into it.
# The --env flag propagates the PWD environment variable to the caller, which is
# necessary to make the directory change stick.
def --env dir [dirname: string] {
  mkdir $dirname
  cd $dirname
}

# Create a temporary directory, and cd into it.
def --env tmp [
  dirname?: string # the name of the directory - if omitted the directory is named randomly
] {
  if ($dirname != null) {
    dir $"/tmp/($dirname)"
  } else {
    cd (mktemp -d)
  }
}
[–] deifyed@lemmy.wtf 1 points 3 days ago

Totally forgot my favorites. C=clear, vi=nvim, l=eza -lh

pacman -Syu
chown
chmod
[–] petsoi@discuss.tchncs.de 1 points 2 days ago

I'd say

cd
rm 
mkdir
su

and many more 😂

[–] data1701d@startrek.website 2 points 3 days ago

for file in *.WAV; do ffmpeg -i "$file" -i cover.png -disposition:v attached_pic "$(basename "$file" .WAV).flac"; done

(I’m doing this from memory, so I may have messed something up, but that’s the gist of it for taking a bunch of WAV files and turning them into FLACs with cover art. I also do a similar setup for combining the metadata of an MP3 and audio data of a WAV, since They Might Be Giants seems to have forgotten FLAC was invented.)

[–] hallettj@leminal.space 3 points 3 days ago* (last edited 3 days ago)

I have to dual boot for work, so every day I have to reboot into a different OS install. It's on its own drive with its own bootloader, so I can't use systemctl reboot --boot-loader-entry. But I was able to get a smooth process using efibootmgr.

This is my Nushell implementation:

def boot-to [
  device: string@boot-devices # Identifier of device to boot to (e.g. 0003)
] {
  sudo efibootmgr --bootnext $device
  systemctl reboot
}

# This function exists to provide tab completion for boot-to
def boot-devices [] {
  efibootmgr | parse --regex 'Boot(?<value>\S+)\* (?<description>(?:\w+ )*\w+)'
}

namei -om to check permission issues

[–] U@piefed.social 2 points 3 days ago

top, obviously.