[–] brickfrog@lemmy.dbzer0.com 5 points 1 month ago

You see proponents of both views engaging in egregious argumentative practices at times and it is clear that this situation is continually degrading and needs something to be done about it.

Does it? I'm kind of thinking that if people insist on browsing Lemmy in All mode, forcing themselves to view everything they don't want to view, then it's on them to learn how to block communities in their own profile settings. Or if you want to help them somehow, maybe a quick how-to showing people how to block communities and/or browse in Subscribed mode could be useful. Just not sure how feasible that would be overall if people are browsing All and just reacting to things they don't want to see.

For me, browsing Lemmy in Subscribed mode and purposely subscribing to communities I care about works well enough - no need to wade into the for/against AI drama or any other topics I'm not interested in.

[–] brickfrog@lemmy.dbzer0.com 3 points 1 month ago (1 children)

Eh, sure, OP could do that. It does seem a bit over the top for OP to pursue the most complicated backup solution possible :D Maybe as a strange experiment to see how it goes, not as a trusted backup solution (like you said, not for critical data).

IPFS would also require more bandwidth than just about any other solution, since it has to constantly talk to other IPFS nodes. It's also more finicky: last I used IPFS, the client would run into memory leaks and other weirdness requiring restarts every now and then (hopefully it's more stable for long-term runs nowadays).

[–] brickfrog@lemmy.dbzer0.com 7 points 1 month ago

Similar but no, Syncthing does not use bittorrent or the bittorrent protocol.

Though if you're curious, Resilio Sync (formerly BitTorrent Sync) is similar to Syncthing and does use bittorrent.

[–] brickfrog@lemmy.dbzer0.com 12 points 1 month ago (3 children)

Wouldn't be a good solution; you'd be hoping that other users volunteer to pin (i.e. store and seed) your personal backup data for you.

Using IPFS for personal backups is effectively the same as creating a torrent of your backup data: with both, it's unlikely your personal backup data will actually exist anywhere beyond your own storage, since no one is going to freely volunteer to store your backups for you.
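
To make the pinning point concrete, here's a rough Python sketch using the stock kubo `ipfs` CLI (the backup path is made up, obviously):

```python
import subprocess

# Add a folder to the local IPFS node. This stores and pins it locally
# and announces the CID to the network -- it does NOT copy the data
# to any other node.
result = subprocess.run(
    ["ipfs", "add", "-r", "-Q", "/home/me/backup"],  # -Q: print only the final CID
    capture_output=True, text=True, check=True,
)
cid = result.stdout.strip()
print(f"Backup CID: {cid}")

# The data survives only where it is pinned. Unless someone on another
# node voluntarily runs the equivalent of this, your "backup" lives and
# dies with your own disk:
#   ipfs pin add <cid>
```

That last `pin add` is the step no stranger is going to run for you, which is the whole problem.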

[–] brickfrog@lemmy.dbzer0.com 15 points 1 month ago* (last edited 1 month ago) (10 children)

Not overly active, but there are a few communities you could join if you like:

!OpenSignups@lemmy.dbzer0.com

!Opensignups@lemmy.ml

!Opensignups@noworriesto.day

https://opentrackers.org/ is also a good site to keep an eye on (though it seems to be less active at the moment).

[–] brickfrog@lemmy.dbzer0.com 2 points 1 month ago

Hmm, I can see all 13 current comments here (via the web UI), granted I'm looking at it an hour after you posted. If you can see all the comments now then maybe federation with lemmy.world was slow for a bit?

Or it could just be a new-account issue, though you could probably rule that out after a day or so if the problem still persists.

[–] brickfrog@lemmy.dbzer0.com 1 points 2 months ago (1 children)

Prime95 runs under Linux, and Memtest86+ boots standalone (from a USB stick or a GRUB entry, no OS needed), so no issues there.

You could also run those sorts of apps off a Linux boot USB, or one of the Linux-based diagnostic boot USB projects. I like SystemRescue (https://www.system-rescue.org/System-tools) but there are plenty of others you can check out - SystemRescue includes memtest86+ / stress / stress-ng / stressapptest for system stress testing, so that could be something to try.
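
If you end up scripting a burn-in with stress-ng, something like this rough sketch is the idea - load all cores for a few minutes and watch the kernel's thermal zones while it runs (assumes stress-ng is installed and a typical /sys/class/thermal layout):

```python
import glob
import subprocess
import time

# Kick off stress-ng: load all CPUs (--cpu 0 = one worker per CPU)
# for 5 minutes.
burn = subprocess.Popen(["stress-ng", "--cpu", "0", "--timeout", "300s"])

# While it runs, sample every thermal zone the kernel exposes.
while burn.poll() is None:
    for zone in sorted(glob.glob("/sys/class/thermal/thermal_zone*/temp")):
        with open(zone) as f:
            millideg = int(f.read().strip())
        print(f"{zone}: {millideg / 1000:.1f}°C")
    time.sleep(10)

print("stress-ng finished, exit code:", burn.returncode)
```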

[–] brickfrog@lemmy.dbzer0.com 1 points 2 months ago (3 children)

Yup, like others said, the lack of a CPU cooler is definitely the problem here. CPUs heat up quickly, and once they hit their thermal limit the system will shut down to try to avoid hardware failure. Hopefully the CPU wasn't damaged from repeatedly overheating while you were testing without cooling... it might be okay, but the only way to know for sure is to properly install a cooler and then test.

Once you've got it going I'd suggest doing a burn-in test just to be sure the CPU will last. It's been a bit since I've done a build, but usually I'd run something like Prime95 to make sure the CPU and cooling are stable.

[–] brickfrog@lemmy.dbzer0.com 1 points 2 months ago

Ah yeah, I saw that one, but I don't think it does quite what OP wants. It seems more like it's designed to monitor a running qBittorrent client and then copy the .torrent file(s) to Transmission, with all torrent data in the same data folder. Might not help much for OP with all the different data folders they have in their current setup.

My concept is as such: have a shared folder where everything is moved after download. I call this /mnt/torrents.

The script provided that makes all of this happen is a python script. It queries the qBittorrent client for uploading or completed downloads, checks to see if they are private or public torrents, then copies the .torrent files to the respective "watched" directory of the public or private (transmission) client. It just copies the .torrent files to directories, so it should be usable with other torrent clients that have "watched" directories.

But either way, nice effort! I'm kind of surprised at the lack of scripts to import torrents into Transmission. The only related script I could find does Transmission --> qBittorrent, and it doesn't seem to do the reverse: https://github.com/Mythologyli/transmission-to-qbittorrent
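
FWIW, the qBittorrent half of such a script is pretty approachable via its Web API. Rough sketch, assuming the Web UI is enabled on localhost:8080 - the port and credentials here are placeholders:

```python
import requests

BASE = "http://localhost:8080/api/v2"
session = requests.Session()

# Authenticate against the qBittorrent Web UI (must be enabled in settings).
session.post(f"{BASE}/auth/login",
             data={"username": "admin", "password": "adminadmin"})

# List every torrent along with where its data actually lives --
# the hard part of a migration when save paths are scattered.
for torrent in session.get(f"{BASE}/torrents/info").json():
    print(torrent["hash"], torrent["save_path"], torrent["name"])
```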

[–] brickfrog@lemmy.dbzer0.com 2 points 2 months ago* (last edited 2 months ago) (2 children)

and even then, I tried one and for some reason it wouldn’t verify my downloaded files and insisted on redownloading the torrent from scratch. Even though I had made sure I was pointing to the correct directory. This may be because I’ve renamed files in the past

That should work fine... I suspect it failed because of the renaming, like you said. Make sure Transmission is adding torrents in paused mode, then do another test with a torrent you definitely didn't rename. Maybe just do a test download in qBittorrent and then attempt to add it into Transmission - a Linux Mint torrent or similar is usually a safe test: https://www.linuxmint.com/edition.php?id=319

Because of how you have your torrents organized, it does sound like you'll need to tough it out and add and configure each torrent manually.

It would be easier if you had all the torrent data saved in the same folder(s). In that case you'd just configure Transmission to add torrents in paused mode, configure a watch folder, copy your qBittorrent .torrent files into that watch folder, and finally do a re-check in Transmission and start all the torrents. Then hardlink the torrent data out into your own nested folders however you want them set up, so the same data exists and is linked in two places (the torrent data folder and your own folders) - something like the sketch below. Maybe it's something to consider for your future configuration, but it's not going to help you much right now.
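
The hardlink step is plain stdlib Python if you ever want to script it - rough sketch with made-up paths; note hardlinks only work within a single filesystem:

```python
import os
from pathlib import Path

# Hypothetical paths: the flat folder the client seeds from,
# and your own curated layout.
seed_dir = Path("/data/torrents/linux-mint-21")
library_dir = Path("/data/library/OS/Mint/21")

for src in seed_dir.rglob("*"):
    if src.is_file():
        dst = library_dir / src.relative_to(seed_dir)
        dst.parent.mkdir(parents=True, exist_ok=True)
        # Hardlink: both paths point at the same bytes on disk,
        # so the client keeps seeding while your library stays tidy.
        os.link(src, dst)
```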

For now, yeah, the best you could do is set Transmission to add torrents in paused mode, configure a watch folder, copy over your current qBittorrent .torrent files, then afterwards in Transmission change each torrent's data location and re-check them one by one. Not sure if that's any faster than just adding the torrents manually one by one :/

You should be able to find the current .torrent files wherever macOS saves your qBittorrent data - look for a folder that looks like qBittorrent / BT_backup; all the .torrent files in BT_backup are the torrents currently loaded in qBittorrent.
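
And the copy itself is trivial to script rather than dragging files around by hand - rough sketch with guessed paths (double-check where your qBittorrent actually keeps BT_backup on macOS):

```python
import shutil
from pathlib import Path

# Guessed locations -- adjust for your machine. On macOS, qBittorrent's
# BT_backup is typically under ~/Library/Application Support.
bt_backup = Path.home() / "Library/Application Support/qBittorrent/BT_backup"
watch_dir = Path.home() / "transmission-watch"

watch_dir.mkdir(exist_ok=True)

for torrent in bt_backup.glob("*.torrent"):
    # Copy (don't move) so qBittorrent keeps working if you need to retry.
    shutil.copy2(torrent, watch_dir / torrent.name)
    print("queued", torrent.name)
```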

With some luck maybe you can find a tool that does qBittorrent --> Transmission migrations? I couldn't confirm any exist; all I could find are tools for the Transmission --> qBittorrent direction, e.g. https://github.com/undertheironbridge/transmission2qbt

(note I'm not on macOS so maybe someone else has more direct advice to offer)

[–] brickfrog@lemmy.dbzer0.com 3 points 2 months ago* (last edited 2 months ago)

Core 2 Duos are slow, yeah. I've got an Asus F8SP-X1 laptop from ~2008 with a Core 2 Duo T9500, 4 GB RAM, and a SATA SSD in it. It was originally a mid-range Windows Vista system; over the years I upgraded it as far as it could go. It does run standard Ubuntu and Windows 10 - certainly not fast, but it runs. Performance would lean toward unbearable without the SSD. I suspect GNOME isn't doing it any favors and switching to a lighter DE or distro would help (or maybe just ditching the DE altogether), but since it's just a spare laptop it's no big deal.

One of the takeaways from your experiment is that if the system was already struggling to run Windows 10, it's not necessarily going to fare better with Linux, at least if you're expecting a nice desktop environment. And I don't know that in 2025 the "will this run Linux?" challenge on old Windows XP/7-era hardware matters much outside of geek/techie users who want to do something with that old hardware. Anyone non-technical stuck with that type of hardware isn't thinking about Windows 10 being retired.

[–] brickfrog@lemmy.dbzer0.com 14 points 2 months ago* (last edited 2 months ago)

You may as well call them and ask. The main things you want to find out are what plans/prices they offer and whether they have any data caps. And if it's still under construction, definitely ask to be put on their list of interested customers.

Honestly, just about anything fiber is going to be an improvement over Comcast cable internet... if I were you I'd at least ask whether they have a 1 gig download/upload plan and work from there. Good luck!
