starshipwinepineapple

joined 1 year ago
[–] starshipwinepineapple@programming.dev 10 points 1 week ago* (last edited 1 week ago)

There are cuda packages for arch. I can confirm they work.

https://wiki.archlinux.org/title/GPGPU

Edit: to add, I haven't found Arch's limit; it feels like I'm bound only by my own time, ability, and willingness to tinker with my setup. Arch itself is widely supported: it has many official packages through pacman, plus additional ones through the Arch User Repository (AUR), so chances are most of the things you want or need have a package that can be installed with an AUR helper (like yay or paru, which install from both the official repos and the AUR). Other distros are more one-size-fits-all and you lose some of that ability to change things, whereas Arch gives you a minimal setup and leaves you to build the system to your liking. It does take more time and expertise than other distros, but it gives you more control. For me the trade-off was an easy decision, but it's not something I blanket-recommend to everyone.
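For reference, a typical install looks something like this (package names are examples, double-check them against the wiki page above, and this assumes yay is already installed):

```shell
# NVIDIA's CUDA toolkit from the official repos:
sudo pacman -S cuda

# An AUR helper like yay resolves packages from both the
# official repos and the AUR with the same command:
yay -S cudnn
```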

[–] starshipwinepineapple@programming.dev 4 points 2 weeks ago (2 children)

Codeberg has been stable enough for my small usage. It does have a CI, Woodpecker, that requires manual approval; I haven't used their CI yet.

TFLOPS is a generic measurement of theoretical peak throughput, not actual utilization, and not specific to a given type of workload. Not all workloads saturate GPU utilization equally, and AI models will depend on CUDA/tensor cores; the generation and count of your cores determine how well optimized the card is for AI workloads and how much of those TFLOPS it can actually use for your task. And yes, AMD uses ROCm, which I didn't feel I needed to specify since it's a given (and years behind CUDA's capabilities). The point is that these things are not equal, and there are major differences here alone.
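To illustrate why the TFLOPS number is theoretical: the headline FP32 figure is just cores × clock × 2 FLOPs per clock (one fused multiply-add). A quick sketch, using RTX 3090 spec figures (10496 CUDA cores, ~1695 MHz boost) as an assumed example:

```python
def peak_fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Theoretical peak FP32 throughput: 2 FLOPs (one FMA) per core per clock."""
    return cuda_cores * boost_ghz * 2 / 1000  # GFLOPS -> TFLOPS

# RTX 3090 spec figures, assumed for illustration:
print(peak_fp32_tflops(10496, 1.695))  # ~35.6 TFLOPS on paper
```

Real workloads land well below that ceiling, and how far below depends on exactly the core-generation and memory details above.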

I mentioned memory type since the cards you listed use different kinds (HBM vs GDDR), so you can't just compare the capacity alone and expect equal performance.

And again, for your specific use case of this large MoE model, you'd need to solve the GPU-to-GPU communication issue (ensuring both connectivity and sufficient speed without getting bottlenecked).

I think you're going to need to do actual analysis of the specific setup you're proposing. Good luck!

[–] starshipwinepineapple@programming.dev 2 points 2 weeks ago (2 children)

The table you're referencing leaves out CUDA/tensor cores (count and generation), which are a big part of these GPUs, and also doesn't factor in the type of memory. From the comments it looks like you want to use a large MoE model. You aren't going to be able to just stack raw power and expect to run this without major deterioration of performance, if it runs at all.

Don't forget your MoE model needs all-to-all communication for expert routing.
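As a rough back-of-envelope (my own sketch, with assumed example numbers): each routed token activation may have to cross the interconnect once per MoE layer, so per-layer traffic scales with tokens × top-k × hidden size:

```python
def all_to_all_bytes(tokens: int, hidden_dim: int, top_k: int,
                     dtype_bytes: int = 2) -> int:
    """Rough upper bound on per-layer, per-direction all-to-all traffic:
    each token's activation vector is sent to its top_k experts."""
    return tokens * top_k * hidden_dim * dtype_bytes

# Assumed example: 4096-token batch, hidden size 4096, top-2 routing, fp16:
print(all_to_all_bytes(4096, 4096, 2) / 2**20)  # 64.0 MiB per layer, each way
```

Multiply that by the number of MoE layers and your target tokens per second, and it becomes obvious why a slow GPU-to-GPU link turns into the bottleneck.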

[–] starshipwinepineapple@programming.dev 4 points 2 weeks ago (1 children)
  • Custom DNS servers specified on the device to circumvent the Pi-hole
  • DNS over HTTPS or TLS
  • Hotspot from an approved device
  • Alternative YouTube front ends

These are just off the top of my head. Best case scenario, the blocking works and the teen never tries to bypass it; they'll still just move on to "wasting" time on something else. This is treating the symptom and not the root cause.

[–] starshipwinepineapple@programming.dev 5 points 2 weeks ago (3 children)

Pi-hole can set up "groups" with different blocklists. You specify clients by IP or MAC address, so it doesn't matter what the DHCP server is, so long as there's a static IP or static MAC address. My Pi-hole server doesn't have DHCP set up and I'm able to do this fine.
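For anyone curious what that looks like under the hood, here's a sketch assuming Pi-hole v5's stock gravity.db schema and made-up IP/group names; the Group Management page in the web UI does the same thing:

```shell
# Register a client by IP, create a group, and link the two.
# 'group' is quoted because it's a reserved word in SQLite.
sudo sqlite3 /etc/pihole/gravity.db "
  INSERT INTO client (ip, comment) VALUES ('192.168.1.50', 'teen laptop');
  INSERT INTO 'group' (enabled, name) VALUES (1, 'strict');
  INSERT INTO client_by_group (client_id, group_id)
    SELECT c.id, g.id FROM client c, 'group' g
    WHERE c.ip = '192.168.1.50' AND g.name = 'strict';
"
pihole restartdns reload  # pick up the new assignment
```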

Though from personal experience this just becomes a game of cat and mouse, and if you have a motivated teenager they will find a way to circumvent it. For example, Android can randomize MAC addresses, and IP addresses are trivial to spoof as well.

[–] starshipwinepineapple@programming.dev 7 points 2 weeks ago (1 children)

Haven't used all of those, but my recommendation would be to just start trying them. Start small, get a feel for it, and then expand usage or try a different backup solution. You should be able to do automatic backups with any of them, either directly or by setting up your own timer/cron jobs (which is how I do it with rsync).
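For reference, the cron + rsync approach can be as simple as one crontab entry (paths and schedule here are just examples):

```shell
# crontab -e entry: nightly backup at 2:00 AM.
# -a preserves permissions/timestamps; --delete mirrors removals
# so the destination stays an exact copy of the source.
0 2 * * * rsync -a --delete /home/me/ /mnt/backup/home/
```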

[–] starshipwinepineapple@programming.dev 94 points 2 months ago* (last edited 2 months ago) (8 children)

I submitted a response, but if I may give some feedback: the second portion brings up:

I am willing to pay a substantial amount for hardware required for self-hosting.

This seemed out of place because there were no other value-related questions (IIRC). Such as:

  • I believe self-hosting saves me money in the short term
  • I believe self-hosting saves me money in the long run

I'm sure you could also think of more. But I think it's pretty important, because between cloud service providers and any non-free apps you want to use, it can be quite costly compared to the cost of some hardware and the time it takes to set things up.

The rest of my responses don't change, but if you're wanting to understand the impact of money in all of this, I think some more questions are needed.

Best of luck!

[–] starshipwinepineapple@programming.dev 4 points 2 months ago (1 children)

I use VSCodium and it is available on the AUR (vscodium / vscodium-bin). Supposedly some plugins aren't available for it, but I don't use a ton of plugins, and the ones I used in VS Code were available in VSCodium when I switched.

[–] starshipwinepineapple@programming.dev 26 points 2 months ago* (last edited 2 months ago)

For some background: it turns out Organic Maps had a for-profit LLC registered while long positioning itself as free and open source. When the LLC was discovered, the community volunteers wrote an open letter.

When their concerns were not answered, they forked the project and created CoMaps, which in theory is supposed to be everything Organic Maps ever portrayed itself as.

[–] starshipwinepineapple@programming.dev 8 points 2 months ago* (last edited 2 months ago)

Why Not Use…?

I am aware that there are many other git “forge” platforms available. Gitea, Codeberg, and Forgejo all come to mind. Those platforms are great as well. If you prefer those options instead of SourceHut that’s fine! Switching to any of those would still be a massive improvement over GitHub.

Unfortunately, I find the need to have an account in order to contribute to projects a deal breaker. It causes too much friction for no real gain. Email based workflows will always reign supreme. It’s the OG of code contributions.

I've been using Codeberg (a public Forgejo instance) and it felt more familiar coming from GitHub/GitLab. SourceHut wasn't bad, but it did feel quite a bit different, and I admittedly didn't get far past that. I do like the idea of contributing without an account, though. I know creating a patch file is a git feature, but having a forge support it natively is neat.
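That email-based flow is plain git, nothing forge-specific. A sketch (the list address is a placeholder in sr.ht's `~user/list@lists.sr.ht` format, and git send-email needs SMTP configured in your gitconfig):

```shell
# Turn local commits into mailable patch files, one per commit
# ahead of origin/main:
git format-patch origin/main

# Send them straight to the project's mailing list:
git send-email --to="~someproject/devel@lists.sr.ht" *.patch
```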

Semi-related, I do look forward to Forgejo federation, which I think helps with the "needing an account" issue somewhat. It's more reasonable to expect someone to have an account on any federated forge than to have an account on the specific forge my project is on.

Good article, though. It did help make SourceHut make more sense than the first time I looked at it.

 

Hi all, I'm relatively new to this instance but reading through the instance docs I found:

Donations are currently made using snowe’s github sponsors page. If you get another place to donate that is not this it is fake and should be reported to us.

Going to the sponsor page we see the following goal:

@snowe2010's goal is to earn $200 per month

Pay for our:
  • 📫 SendGrid account: $20 a month
  • 💻 Vultr VPS for prod and beta sites: prod is $115-130 a month, beta is $6-10 a month
  • 👩🏼 Paying our admins and devops any amount
  • ◀️ Upgrade tailscale membership: $6-? dollars a month (depends on number of users)
  • Add in better server infrastructure including paid account for Pulsetic and Graphana
  • Add in better server backups, and be able to expand the team so that it's not so small

Currently only 30% of the break-even goal is being met. Please consider setting up a sponsorship, even if it's just $1. Decentralized platforms are great, but they still have real costs behind the scenes.

Note: I'm not affiliated with the admin team, just sharing something I noticed.
