fool

joined 2 years ago
[–] fool@programming.dev 24 points 7 months ago (1 children)

Make your data useless or wrong.

More passively, there's probably an oddly large number of John Does born on January 1, 2000 ;)

More offensively, anti-image-gen data poisoning such as Nightshade exists. It's well-defended against IIRC, so hopefully someone can Cunningham's-Law me with a correction. And this is also more of a solo effort (as opposed to gaining mass support for something)

[–] fool@programming.dev 39 points 7 months ago* (last edited 7 months ago) (4 children)

https://www.explainxkcd.com/wiki/index.php/327:_Exploits_of_a_Mom

SQL injection (sanitization) joke. Won't affect most software today, since parameterized queries are standard practice.
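
For the curious, a minimal sketch of the mechanism (made-up table/variable names, via the sqlite3 CLI):

$ STUDENT="Robert'); DROP TABLE students;--"
$ sqlite3 school.db "INSERT INTO students (name) VALUES ('$STUDENT');"
# the shell splices the "name" straight into the SQL, which expands to:
#   INSERT INTO students (name) VALUES ('Robert'); DROP TABLE students;--');
# the INSERT runs, the injected DROP TABLE runs, and -- comments out the stray ');
# a parameterized query never splices the name into the SQL text at all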

[–] fool@programming.dev 21 points 7 months ago* (last edited 7 months ago)

Disclosure: I've done very little UI/UX.

Google's Material Design (wikipedia) is much more widely-adopted across OSes/Flutter/the web (see how many websites have that dropshadow topbar and ≡?); Microsoft's Fluent (wikipedia) is Windows-first, but is usable anywhere.

Both are based on responding to user actions. Fluent uses light-up acrylic (translucent) canvases (e.g. hover? the border glows), while Google's Material uses paper-esque whitespace, navbars, dropshadows, and round corners (e.g. scrolling? a dropshadow appears on the nav).

Think Microsoft Teams vs. Google Drive.

They're both full-fledged but Material You is way more common judging by places such as the F-Droid ecosystem on Android. As for which is "better", Material You supposedly has better colorscheme flexibility since it 'wants' to adapt to e.g. user wallpapers. But other than that it's really just preference (or whether relevant tooling exists :P). I know some devs use Material You for a predictable, unified look across Android apps, while others bend them to their will to reduce animations or whatnot.

If you're designing something, make sure you keep your own self in the mix too. Breezy Weather uses Material Design, but it's more customized to have a unique feel than, say, TrackerControl (which also uses Material).

[–] fool@programming.dev 7 points 7 months ago (1 children)

Solitaire 2: Collector's Edition. 175k hours

[–] fool@programming.dev 12 points 7 months ago* (last edited 7 months ago) (1 children)

NOOOOOO HOW DID I GET SHITTYMORPHED ON LEMMY

[–] fool@programming.dev 6 points 7 months ago* (last edited 7 months ago) (1 children)

CRYPT-- oh, you mean how the nice tutorial peeps have affected us.

Vimjoyer has increased the adoption rate for flakes on NixOS. And also NixOS use in general.

Mental Outlaw has probably brought in some new Gentoo users, quoth the meme, but Gentoo is still a dying breed compared to its heyday in the early aughts.

Fireship has made people -- particularly CS students I believe -- more comfortable with trying out new programming languages. (The "I'll check out the Fireship video first" approach. But then again, ChatGPT has arguably had the same effect across undergraduates... that's a digression)

Asahi Lina's longform Rust dev work, while less of a network effect, has had its own substantial impact within the Asahi Linux "Linux on the M-series" sphere. I believe she also helped port a kind of anime mocap engine to Linux, which could over the long term grow the center of the anime-nerd/Linux-nerd Venn diagram. But that's speculation.

edit:

From a broader perspective, with the combination of SteamOS and large YouTubers trying out Linux, Linux desktop adoption will probably keep climbing. I doubt it will pass 10% though, given Linux's reputation (tech nerds, compile all day, games don't run, command line -- even though these are improving, that reputation is hard to kick)

[–] fool@programming.dev 3 points 7 months ago* (last edited 7 months ago)

Thank you! (˶ˆᗜˆ˵) ✧

[–] fool@programming.dev 20 points 7 months ago (4 children)

A lot of comments here are suspicious of you, so I'm going to try my hand at guessing whether this was AI.

Since GPTs are hilariously bad at detecting themselves, I'll venture on the human spirit!

First, we establish truth 1: this is copy-pasted.

Although Moissanite isn't mentioned twice, everything from "Synthetic Alexandrite" onward is mentioned twice. That means this was procedurally copy-pasted. Someone writing on their own would either CTRL+A then CTRL+C and make no mistakes, or not repeat themselves at all.

Of course, we can also look at the half-formalized format that indicates something was copied from raw text and pasted into markdown, rather than formatted with markdown first.

Colon:
words words Colon:
words words Colon:

copy-paster spotted

Second, we cast doubt that a human wrote the source.

  • AI-isms vs. non AI-isms
    • Non-reused acronym definitions.

      Garnets like... yttrium iron garnet (YIG)

      This is probably taken straight from Wikipedia's short description of YIG. Humans usually don't define an acronym only to never use it (unless by mistake), especially not just to fill out repeated structure. So either Wikipedia was in the training corpus or this was Googled.

    • 5/23 sentences start with "While" (weak AI indicator)

    • three-em dashes and obvious tricolons aren't overused (non-AI indicator)

    • no filler bullshit introduction or conclusion (non-AI indicator)

    • obvious repeated structure that you can feel (strong AI indicator)

    • Suspiciously uncreative descriptions (AI indicator)

      "These stones are not just rare but impossible to find naturally, offering a unique and unconventional aesthetic perfect for someone looking to stand out." (emphasis added)

    • Repetition of "unusual" and "rare" rather than more flavorful or useful adjectives (AI indicator)

      • We're talking synthetic stuff. Would a human write about rarity?
    • Superficial, neutral-positive voice despite length and possible source. If this was pasted from a technical blog, I'd expect it to have more "I" and personal experiences, or more deep anecdotal flavor (AI indicator)

      • e.g. use of "fascinating" but doesn't go deeper into any positivities

Third... let's take a guess

So it was copy-pasted from somewhere, but I can't imagine it being from a blog or website, and it isn't directly from Wikipedia. It has some nonhuman mistakes, but is otherwise grammatical, neutral-positive, and repetitively structured. And it lacks that deeper flavor. So.... it was an AI, but likely not OpenAI.

At least there aren't any very "committal" facts, so the length but lack of depth suggests that everything's maaaaaaybe true...

I wasted my time typing this

[–] fool@programming.dev 6 points 7 months ago* (last edited 7 months ago) (1 children)

Cool-ass economics fun fact, hell yeah

~or~ ~not~ ~so~ ~fun~

[–] fool@programming.dev 1 points 7 months ago

What's the most recent thing that made you laugh? Why your username? Also, do you think wisdom can be taught (vs. making the mistakes ourselves)?

[–] fool@programming.dev 2 points 7 months ago (1 children)

Not sure about the latter, but NZT was the focal "smart" drug in Limitless (a show on the premise "this drug makes u smart but if u stop taking it bad stuff happens")

[–] fool@programming.dev 3 points 7 months ago (3 children)
 

It can be a small skill.

The last thing I learned to do was whistle. Never could whistle my whole life, and tutorials and friends never could help me.

So, for the last month or two, I just sort of made the blow shape then spam-tried different "tongue configurations" so to speak -- whenever I had free time. Monkey-at-a-typewriter type shit. It was more an absentminded thing than a practice investment.

Probably looked dumb as hell making blow noises. Felt dumb too ("what? you can't whistle? just watch"), but I kept at it like a really really low-investment... dare I attract self-help gurus... habit.

Eventually I made a pitch, then I could shift the pitch up a little, then five pitches, then Liebestraum, then the range of a tenth or so. Skadoosh. Still doing it now lol.

(Make of this what you will: If I went the musician route my brain told me to, then I would've gotten bored after 1 minute of major scales. When I was stuck at only having five pitches, I had way more longevity whistle-blowing cartoonish Tom-and-Jerry-running-around chromaticisms than failing the "fa" in "do re mi fa".)

So, Lemmings: What was the last skill you learned? And further, what was the context/way in which you learned it?

44
submitted 9 months ago* (last edited 9 months ago) by fool@programming.dev to c/linux@lemmy.ml
 

I had a teeny pet project using GNU assembly that was going to target two platforms.

Instead of keeping my handwritten worst-practices Makefile I decided to try GNU Autotools for the educated reasons of:

  • Text scrolling by looks pretty
  • Vague memories of ./configure make make install tarballs

I got hit with mysterious macro errors, recompile with -fPIE errors (didn't need this before?), autotools trying to run gcc on a .o file w/ the same options as an .s file, "no rule for all:", and other things a noob would run into. (I don't need a bugfix, since my handspun Makefile is "working on my machine" with uname -m.) So there's a bit of a learning curve here, inhibited by old documentation ~and~ ~more~ ~quietly,~ ~genAI~ ~being~ ~shittier~ ~than~ ~normal~ ~in~ ~this~ ~department~
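
For context, the two-file minimum I was aiming for looks roughly like this (made-up names, not my actual files):

$ cat configure.ac
AC_INIT([tinyproj], [0.1])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AM_PROG_AS
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
$ cat Makefile.am
bin_PROGRAMS = tinyproj
tinyproj_SOURCES = main.s
$ autoreconf --install   # generates ./configure, Makefile.in, and the aux scripts
$ ./configure && make    # the classic round trip; "make dist" rolls the tarball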

With this I ask:

Do people still use Autotools for non-legacy stuff? If not, what do people choose for a new project's build system and why?

edit: trimmed an aside

 

A bookmarklet is a bookmark whose URL is JavaScript code instead of a site. It might be, for example,

javascript:document.querySelector('video').playbackRate = Number(prompt("speed")) || 1; void(0)

// formatted version:
javascript:
document
  .querySelector('video')
  .playbackRate = 
  Number(prompt("speed")) || 1; 
void(0)

so that if you click the bookmark, it sets the speed of the video to whatever you want (e.g. 3.7).

You could also run this directly in the URL bar (in some cases -- I think desktop Chrome does that), or you can simply type alert() into the dev console (desktop Firefox prefers this for security reasons).

Is running my own arbitrary JS like this a thing on mobile? I'm on Android but I'm not sure if Brave disabled it -- I vaguely remember it working once, but it doesn't anymore. No luck on Firefox either. Maybe there's a workaround?

 

edit: solved lol

When I run Cheese, the inbuilt webcam light flashes for an instant and then stops. Even when Cheese itself opens correctly, it never successfully shows the webcam feed. Cheese worked prior to the update. ( Zoom webcam fails too D: )

setup

Arch Linux, kernel 6.11.5-arch-11, Hyprland v0.44.1 (pipewire 1.2.6) on a hybrid Nvidia+Intel Lenovo laptop.

$ lsusb
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 002: ID 25a7:fa23 Areson Technology Corp 2.4G Receiver
Bus 001 Device 004: ID 8087:0a2b Intel Corp. Bluetooth wireless interface
Bus 001 Device 005: ID 138a:0094 Validity Sensors, Inc. 
Bus 001 Device 033: ID 13d3:5673 IMC Networks EasyCamera
Bus 002 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub

$ v4l2-ctl --list-devices                                         
EasyCamera: EasyCamera (usb-0000:00:14.0-5):
	/dev/video0
	/dev/video1
	/dev/media0

packages

I'm up-to-date.

$ yay -Q |egrep -i 'gstreamer|video|cam|media|mtp'
fswebcam 20200725-1
gnome-video-effects 1:0.6.0-2
gst-plugin-libcamera 0.3.2-1
gstreamer 1.24.8-1
gstreamer-vaapi 1.24.8-1
guvcview 2.1.0-4
guvcview-common 2.1.0-4
gvfs-mtp 1.56.1-1
haskell-http-media 0.8.1.1-18
intel-media-driver 24.3.3-1
libcamera 0.3.2-1
libcamera-ipa 0.3.2-1
libmtp 1.1.21-2
media-player-info 26-1
perl-lwp-mediatypes 6.04-6
pipewire-libcamera 1:1.2.6-1
# qt omitted
xf86-video-intel 1:2.99.917+923+gb74b67f0-2

tried apps

Tried cheese and fswebcam, among a few others (logs are too long to fit)

$ cheese

(cheese:53011): Gdk-WARNING **: 11:41:45.977: Native Windows taller than 65535 pixels are not supported
[0:56:26.030577084] [53011] ERROR IPAModule ipa_module.cpp:171 Symbol ipaModuleInfo not found
[0:56:26.030591748] [53011] ERROR IPAModule ipa_module.cpp:291 v4l2-compat.so: IPA module has no valid info
[0:56:26.030607425] [53011]  INFO Camera camera_manager.cpp:325 libcamera v0.3.2

(cheese:53011): GStreamer-CRITICAL **: 11:41:46.096: gst_structure_get_value: assertion 'structure != NULL' failed
[0:57:26.270746293] [53011] ERROR IPAModule ipa_module.cpp:171 Symbol ipaModuleInfo not found
[0:57:26.270790988] [53011] ERROR IPAModule ipa_module.cpp:291 v4l2-compat.so: IPA module has no valid info
[0:57:26.270850259] [53011]  INFO Camera camera_manager.cpp:325 libcamera v0.3.2
[0:57:26.432615229] [53061]  INFO Camera camera.cpp:1197 configuring streams: (0) 640x480-MJPEG
[0:57:26.449271361] [53574] ERROR V4L2 v4l2_videodevice.cpp:1931 /dev/video0[450:cap]: Failed to start streaming: Protocol error
[0:57:26.449379043] [53574] ERROR V4L2 v4l2_videodevice.cpp:1266 /dev/video0[450:cap]: Unable to request 0 buffers: No such device

(cheese:53011): cheese-WARNING **: 11:42:46.537: Failed to start the camera: Protocol error: ../libcamera/src/gstreamer/gstlibcamerasrc.cpp(680): gst_libcamera_src_task_enter (): /GstCameraBin:camerabin/GstWrapperCameraBinSrc:camera_source/GstBin:bin37/GstLibcameraSrc:libcamerasrc0:
Camera.start() failed with error code -71

(cheese:53011): Clutter-CRITICAL **: 11:42:48.119: Unable to create dummy onscreen: No foreign surface, and wl_shell unsupported by the compositor
$ fswebcam -v
***
Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
src_v4l2_get_capability,91: /dev/video0 information:
src_v4l2_get_capability,92: cap.driver: "uvcvideo"
src_v4l2_get_capability,93: cap.card: "EasyCamera: EasyCamera"
src_v4l2_get_capability,94: cap.bus_info: "usb-0000:00:14.0-5"
src_v4l2_get_capability,95: cap.capabilities=0x84A00001
src_v4l2_get_capability,96: - VIDEO_CAPTURE
src_v4l2_get_capability,107: - STREAMING
No input was specified, using the first.
src_v4l2_set_input,185: /dev/video0: Input 0 information:
src_v4l2_set_input,186: name = "Camera 1"
src_v4l2_set_input,187: type = 00000002
src_v4l2_set_input,189: - CAMERA
src_v4l2_set_input,190: audioset = 00000000
src_v4l2_set_input,191: tuner = 00000000
src_v4l2_set_input,192: status = 00000000
src_v4l2_set_pix_format,523: Device offers the following V4L2 pixel formats:
src_v4l2_set_pix_format,532: 0: [0x47504A4D] 'MJPG' (Motion-JPEG)
src_v4l2_set_pix_format,532: 1: [0x56595559] 'YUYV' (YUYV 4:2:2)
Using palette MJPEG
Adjusting resolution from 384x288 to 424x240.
src_v4l2_set_mmap,675: mmap information:
src_v4l2_set_mmap,676: frames=4
src_v4l2_set_mmap,725: 0 length=203520
src_v4l2_set_mmap,725: 1 length=203520
src_v4l2_set_mmap,725: 2 length=203520
src_v4l2_set_mmap,725: 3 length=203520
Error starting stream.
VIDIOC_STREAMON: Protocol error
Unable to use mmap. Using read instead.
Unable to use read.

logs

$ journalctl -b-0
# some stuff removed for post character limit
pipewire[1028]: spa.v4l2: '/dev/video0' VIDIOC_STREAMON: Protocol error
pipewire[1028]: pw.node: (v4l2_input.pci-0000_00_14.0-usb-0_5_1.0-78) suspended -> error (Start error: Protocol error)
kernel: usb 1-5: USB disconnect, device number 50
pipewire[1028]: spa.v4l2: VIDIOC_REQBUFS: No such device
kernel: usb 1-5: new high-speed USB device number 51 using xhci_hcd
kernel: usb 1-5: New USB device found, idVendor=13d3, idProduct=5673, bcdDevice=16.04
kernel: usb 1-5: New USB device strings: Mfr=3, Product=1, SerialNumber=2
kernel: usb 1-5: Product: EasyCamera
kernel: usb 1-5: Manufacturer: AzureWave
kernel: usb 1-5: SerialNumber: 0001
kernel: usb 1-5: Found UVC 1.00 device EasyCamera (13d3:5673)
mtp-probe[66565]: checking bus 1, device 51: "/sys/devices/pci0000:00/0000:00:14.0/usb1/1-5"
mtp-probe[66565]: bus: 1, device: 51 was not an MTP device
mtp-probe[66599]: checking bus 1, device 51: "/sys/devices/pci0000:00/0000:00:14.0/usb1/1-5"
mtp-probe[66599]: bus: 1, device: 51 was not an MTP device
kernel: usb 1-5: USB disconnect, device number 51
kernel: usb 1-5: new high-speed USB device number 52 using xhci_hcd
kernel: usb 1-5: New USB device found, idVendor=13d3, idProduct=5673, bcdDevice=16.04
kernel: usb 1-5: New USB device strings: Mfr=3, Product=1, SerialNumber=2
kernel: usb 1-5: Product: EasyCamera
kernel: usb 1-5: Manufacturer: AzureWave
kernel: usb 1-5: SerialNumber: 0001
kernel: usb 1-5: Found UVC 1.00 device EasyCamera (13d3:5673)
$ sudo dmesg
[ 5248.449913] usb 1-5: USB disconnect, device number 50
[ 5248.842621] usb 1-5: new high-speed USB device number 51 using xhci_hcd
[ 5249.025592] usb 1-5: New USB device found, idVendor=13d3, idProduct=5673, bcdDevice=16.04
[ 5249.025612] usb 1-5: New USB device strings: Mfr=3, Product=1, SerialNumber=2
[ 5249.025620] usb 1-5: Product: EasyCamera
[ 5249.025626] usb 1-5: Manufacturer: AzureWave
[ 5249.025632] usb 1-5: SerialNumber: 0001
[ 5249.030816] usb 1-5: Found UVC 1.00 device EasyCamera (13d3:5673)
[ 5259.873533] usb 1-5: USB disconnect, device number 51
[ 5260.268988] usb 1-5: new high-speed USB device number 52 using xhci_hcd
[ 5260.454354] usb 1-5: New USB device found, idVendor=13d3, idProduct=5673, bcdDevice=16.04
[ 5260.454371] usb 1-5: New USB device strings: Mfr=3, Product=1, SerialNumber=2
[ 5260.454378] usb 1-5: Product: EasyCamera
[ 5260.454384] usb 1-5: Manufacturer: AzureWave
[ 5260.454389] usb 1-5: SerialNumber: 0001
[ 5260.460370] usb 1-5: Found UVC 1.00 device EasyCamera (13d3:5673)

Any help appreciated! ^_^

Solution: It fixed itself after like 15 power cycles. Easy peasy

121
kdesu (programming.dev)
submitted 10 months ago* (last edited 10 months ago) by fool@programming.dev to c/linuxmemes@lemmy.world
 
45
submitted 10 months ago* (last edited 10 months ago) by fool@programming.dev to c/nostupidquestions@lemmy.world
 

This site is so cool!

             />  フ
            |  _ _| 
          /` ミ_xノ 
         /     |
        /  ヽ   ノ
        │  | | |
   / ̄|   | | |
   ( ̄  ヽ__ヽ_)__)
    \二)

But how do people make these? I searched online and the best I could find were small Japanese communities still using MS Gothic (which is metrically incompatible with Arial and other widely used fonts) and halfhearted JPG-to-ASCII-bitmap converters.

Further, how do people manage these? I'd imagine something like an emoji search, but these emoticons exist by the million and don't have names; the other alternatives seem to be the "I've got a meme for that" infinite-camera-roll scroll, or searching them up every time.

⠀/\_/\
(˶ᵔ ᵕ ᵔ˶) thanks lol
/ >🌷<~⁠♡

 
 

I saw a post recently about someone setting up parental controls -- screentime, blocked sites, etc. -- and it made me wonder.

In my childhood, my free time was very flexible. Within this low-pressure flexibility I was naturally curious, in all directions -- that meant both watching brainteaser videos and watching Gmod brainrot. I had little exposure to video games other than Minecraft, which ran poorly on my machine, so I tended to surf Flash games and YouTube.

Strikingly, while watching a brainteaser video, tiny me had a thought:

I'm glad my dad doesn't make me watch educational videos like the other kids in school have to.

For some reason, I wanted to hold onto that thought, to "remember what my thought process was as a child", so that memory has stuck with me.

Onto the meat: if I had had a capped screentime, like a timer I could see, and knew that I was being watched in some way, I'd feel pressure. For example,

10 minutes left. Oh no. I didn't have fun yet. I didn't have fun yet!!

Oh no, I'm gonna get in so much trouble for watching another YTP...

and maybe that pressure wouldn't have shaped me into an independent, curious kid -- into the person I am now. Maybe it would've made me fearful or suspicious instead. I was suspicious once, when one of my parents said "I can see what you browse from the other room" -- so I ran the scientific method to verify whether they could. (I wrote "HI MOM" in Paint and watched whether her expression changed.)

So what about now? Were we too free, and is it now our job to tighten the reins on the next generation? I said "butthead" often. I loved asdfmovie, but my parents probably wouldn't have. I watched SpingeBill YTPs (at least it's not corporatized YouTube Kids).

Or differently: do we watch our kids without them knowing? Write a keylogger? Or just take router logs? Do we police them like some sort of panopticon, for their own good?

Or do we completely forgo this? Take an Adventure Playground approach?

Of course, I don't expect a one-size-fits-all answer. Where do you stand, and why?

 

Git cheat sheets are a dime a dozen, but I think this one is awfully concise for its scope.

  • Visually covers branching (WITH the commands -- rebasing the current branch can be confusing for the unfamiliar; see the sketch after this list)
  • Covers reflog
  • Literally almost identical to how I use git (most sheets are either Too Much or Too Little)
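
A rough sketch of those two spots, with generic branch names:

$ git fetch origin
$ git rebase origin/main        # replay the current feature branch on top of the updated main
# if the rebase goes sideways, reflog still remembers the old tip
$ git reflog                    # find the pre-rebase commit hash
$ git reset --hard <old-hash>   # jump the branch back to it
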
73
submitted 10 months ago* (last edited 10 months ago) by fool@programming.dev to c/linux@lemmy.ml
 

What was your last RTFM adventure? Tinker this, read that, make something smoother! Or explodier.

As for me, I wanted to see how many videos I could run at once. (Answer: 60 frames per second or 60 frames per second?)

With my sights on GPUizing some ethically sourced motion pictures, I RTFW, graphed, and slapped on environment variables and flags like Lego bricks. I got the Intel VAAPI thingamabob to jaunt by (and found that it butterized my mpv videos)

$ pacman -S blahblahblahblahblahtfm
$ mpv --show-profile=fast
Profile fast: 
 scale=bilinear
 dscale=bilinear
 dither=no
 correct-downscaling=no
 linear-downscaling=no
 sigmoid-upscaling=no
 hdr-compute-peak=no
 allow-delayed-peak-detect=yes
$ mpv --hwdec=auto --profile=fast graphwar-god-4KEDIT.mp4
# fucking silk
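
(If you're retracing the Intel half, here's a rough way to check that the VAAPI driver and mpv's hardware decode actually kicked in -- assuming libva-utils is installed.)

$ vainfo | grep -iE 'driver|h264|hevc'   # which driver loaded, which codecs it claims to decode
$ mpv --hwdec=auto --frames=60 -v graphwar-god-4KEDIT.mp4 2>&1 | grep -iE 'hwdec|vaapi'
# expect something like "Using hardware decoding (vaapi)" if it actually took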

But there was no pleasure without pain: Mr. Maxwell F. N. 940MX (the N stands for Nvidia) played hooky. So I employed the longest envvars ever

$ NVD_LOG=1 VDPAU_TRACE=2 VDPAU_NVIDIA_DEBUG=3 NVD_BACKEND=direct NVD_GPU=nvidia LIBVA_DRIVER_NAME=nvidia VDPAU_DRIVER=nvidia prime-run vdpauinfo
GPU at BusId 0x1 doesn't have a supported video decoder
Error creating VDPAU device: 1
# stfu

to try translating Nvidia VDPAU to VAAPI -- of course, here I realized I rtfmed backwards and should've tried to use just VDPAU instead. So I did.

Juice was still not acquired.

Finally, after a voracious DuckDuckGoing (quacking?), I was then blessed with the freeing knowledge that even though post-Kepler is supposed to support H264, Nvidia is full of lies...

 ______
< fudj >
 ------
          \   ‘^----^‘
           \ (◕(‘人‘)◕)
              (  8    )        ô
              (    8  )_______( )
              ( 8      8        )
              (_________________)
                ||          ||
               (||         (||

and then right before posting this, gut feeling: I can't read.

$ lspci | grep -i nvidia
... NVIDIA Corporation GM108M [GeForce 940MX] (rev a2)
# ArchWiki says that GM108 isn't supported.
# Facepalm

SO. What was your last RTFM adventure?

 

I have a little helper command in ~/.zshrc called stfu.

stfu() {
    if [ -z "$1" ]; then
        echo "Usage: stfu <program> [arguments...]"
        return 1
    fi

    nohup "$@" &>/dev/null &
    disown
}
complete -W "$(ls /usr/bin)" stfu

stfu will run some other command but also detach it from the terminal and make any output shut up. I use it for things such as starting a browser from the terminal without worrying about CTRL+Z, bg, and disown.

$ stfu firefox -safe-mode
# Will not output stuff to the terminal, and
# I can close the terminal too.
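
For comparison, the manual dance it replaces:

$ firefox      # runs in the foreground, tied to the terminal
^Z             # suspend it
$ bg           # resume it in the background
$ disown       # detach it so closing the terminal won't kill it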

Here’s my issue:

On the second argument and above, when I hit tab, how do I get autocomplete to suggest the arguments and command-line switches for the command I'm passing in?

e.g. stfu ls -<tab> should show me whatever ls’s completion function is, rather than listing every /usr/bin command again.

# Intended completion
$ stfu cat -<TAB>
-e                      -- equivalent to -vE                                                                                                                                                     
--help                  -- display help and exit                                                                                                                                                 
--number            -n  -- number all output lines                                                                                                                                               
--number-nonblank   -b  -- number nonempty output lines, overrides -n                                                                                                                            
--show-all          -A  -- equivalent to -vET                                                                                                                                                    
--show-ends         -E  -- display $ at end of each line                                                                                                                                         
--show-nonprinting  -v  -- use ^ and M- notation, except for LFD and TAB                                                                                                                         
--show-tabs         -T  -- display TAB characters as ^I                                                                                                                                          
--squeeze-blank     -s  -- suppress repeated empty output lines                                                                                                                                  
-t                      -- equivalent to -vT                                                                                                                                                     
-u                      -- ignored  

# Actual completion
$ stfu cat <tab>
...a list of all /usr/bin commands
$ stfu cat -<tab>
...nothing, since no /usr/bin commands start with -

(repost, prev was removed)

EDIT: Solved.

I needed to set the curcontext to the second word. Below is my (iffily annotated) zsh implementation, enjoy >:)

stfu() {
  if [ -z "$1" ]; then
    echo "Usage: stfu <program> [arguments...]"
    return 1
  fi

  nohup "$@" &>/dev/null &
  disown
}
#complete -W "$(ls /usr/bin)" stfu
_stfu() {
  # Curcontext looks like this:
  #   $ stfu <tab>
  #   :complete:stfu:
  local curcontext="$curcontext" 
  #typeset -A opt_args # assoc array that _arguments fills with parsed option args; not needed here, so removed

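  # first word completes as a command name; everything after it falls into
  # the "args" state handled below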
  _arguments \
    '1: :_command_names -e' \
    '*::args:->args'

  case $state in
    args)
      # CURRENT is set by the completion system: the index of the word the cursor is on
      if (( CURRENT > 1 )); then
        # $words is magic that splits up the "words" in a shell command.
        #   1. stfu
        #   2. yourSubCommand
        #   3. argument 1 to that subcommand
        local cmd=${words[2]}
        # We update the autocompletion curcontext to
        # pay attention to your subcommand instead
        curcontext="$cmd"

        # Call completion function
        _normal
      fi
      ;;
  esac
}
compdef _stfu stfu

Deduced via docs (look for The Dispatcher), this dude's docs, stackoverflow and overreliance on ChatGPT.

EDIT: Best solution (Andy)

stfu() {
  if [ -z "$1" ]; then
    echo "Usage: stfu <program> [arguments...]"
    return 1
  fi

  nohup "$@" &>/dev/null &
  disown
}
_stfu () {
  # drop "stfu" from the completion word list and shift the cursor index
  # with it, so _normal completes as if the subcommand started the line
  shift words
  (( CURRENT-=1 ))
  _normal
}
compdef _stfu stfu