All the answers are so technical, and I'm completely computer illiterate. I want to download various wikis/Fandom sites for offline reading, but it's ridiculously hard. HTTrack downloads everything, including Google data and outside links, and I have no idea how to stop it. I have no experience with Python, so trying to use Wikiteam and MediaWiki is nearly impossible for me (it doesn't seem to work on Windows, and I'm trying on Linux, but Linux is extremely frustrating). The instructions are difficult to follow. Currently I'm using a browser extension that lets me save each page individually, which takes hours and still doesn't capture everything.
Can anyone ELI5 how to use Python/Wikiteam? I've read all the FAQs and readmes, but they expect way more familiarity than I have. I don't want edit history or talk pages; I SOLELY want the wiki pages as they exist at the time of download, plus images. Thanks!!!