Original Title: Trying to download a niche wiki site for offline use, tried zimit but it takes far too long for simple sites, tried httrack but it struggles with modern sites, thinking of using CURL, how is everyone creating web archives of modern wiki sites?
What I'm trying to do is extract the content of a website that has a wiki-style format/layout. I dug into the source code and there is a lot of pointless markup that I don't need. The content itself sits inside a frame/table, with the necessary formatting information in the CSS file. Just wondering if there's a smarter way to create an offline archive that's browsable on my phone or the desktop?
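Since the useful content lives in one container and the rest is chrome, one option is to strip each downloaded page down to just that container before archiving. Here's a rough stdlib-only sketch of the idea using Python's `html.parser`; the `mw-content-text` id is just an assumption (it's what MediaWiki uses), so swap in whatever id/class the target wiki actually wraps its content in:

```python
from html.parser import HTMLParser

class ContentExtractor(HTMLParser):
    """Capture the inner HTML of one container div, dropping the page chrome."""
    def __init__(self, target_id):
        super().__init__()
        self.target_id = target_id
        self.depth = 0    # nesting depth of <div>s inside the target container
        self.parts = []   # collected inner-HTML fragments

    def handle_starttag(self, tag, attrs):
        if self.depth:
            # Inside the container: keep the tag verbatim.
            self.parts.append(self.get_starttag_text())
            if tag == "div":
                self.depth += 1
        elif tag == "div" and dict(attrs).get("id") == self.target_id:
            self.depth = 1  # entered the content container

    def handle_endtag(self, tag):
        if self.depth:
            if tag == "div":
                self.depth -= 1
            if self.depth:  # still inside: keep the closing tag
                self.parts.append(f"</{tag}>")

    def handle_data(self, data):
        if self.depth:
            self.parts.append(data)

def extract_content(html, target_id="mw-content-text"):
    """Return only the inner HTML of the wiki's content div (id is an assumption)."""
    parser = ContentExtractor(target_id)
    parser.feed(html)
    return "".join(parser.parts)
```

You'd run this over every page a mirroring tool (wget, curl, etc.) pulled down, writing out slimmed-down HTML files that still reference the original CSS. It's crude compared to a real archiver, but for a simple wiki layout it keeps the archive tiny.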
Ultimately I think I'll transpose everything into Obsidian MD (the note-taking app that has wiki-style features but works offline and uses Markdown to format everything).
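For the Obsidian step, a dedicated converter like pandoc (`pandoc -f html -t gfm page.html -o page.md`) will do a much more thorough job, but just to show the shape of the HTML-to-Markdown pass, here's a deliberately minimal stdlib sketch that handles a handful of common tags (everything else passes through as plain text):

```python
from html.parser import HTMLParser

class MarkdownConverter(HTMLParser):
    """Very rough HTML-to-Markdown conversion for a few common tags only."""
    def __init__(self):
        super().__init__()
        self.out = []
        self.href = None  # href of the <a> tag currently open, if any

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self.out.append("\n" + "#" * int(tag[1]) + " ")
        elif tag in ("b", "strong"):
            self.out.append("**")
        elif tag in ("i", "em"):
            self.out.append("*")
        elif tag == "li":
            self.out.append("\n- ")
        elif tag == "a":
            self.href = dict(attrs).get("href", "")
            self.out.append("[")
        elif tag == "p":
            self.out.append("\n\n")

    def handle_endtag(self, tag):
        if tag in ("b", "strong"):
            self.out.append("**")
        elif tag in ("i", "em"):
            self.out.append("*")
        elif tag == "a":
            self.out.append(f"]({self.href})")
        elif tag in ("h1", "h2", "h3"):
            self.out.append("\n")

    def handle_data(self, data):
        self.out.append(data)

def html_to_md(html):
    converter = MarkdownConverter()
    converter.feed(html)
    return "".join(converter.out).strip()
```

Run it over the extracted content of each page and you get files Obsidian can open directly; internal wiki links would still need rewriting to point at the local `.md` filenames.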