Want to back up your favorite NSFW Creative Writing stories? Here's the complete workflow:
Step 1: Gather Links
- Start with the comprehensive link collection from Cyb3rNexus's GitHub Gist – it already contains hundreds of pre-filtered thread links!
- For more recent stories, browse the NSFW Creative Writing subforum on QuestionableQuesting directly
- Use the Link Gopher browser extension to extract all links from the paginated results
- Combine both sources and save all the links into qq_links.txt
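If you'd rather not rely on a browser extension, the extraction part of this step can also be sketched in Python. This assumes you've saved the forum's results pages as HTML files and that the markup uses XenForo-style relative hrefs; the helper name extract_thread_links is mine, not part of any tool mentioned above:

```python
import re


def extract_thread_links(html: str) -> list[str]:
    """Pull QuestionableQuesting thread URLs out of a saved HTML page."""
    base = "https://forum.questionablequesting.com"
    # Relative hrefs as they typically appear in XenForo markup, e.g.
    # href="/threads/some-title.12345/"
    rel = re.findall(r'href="(/threads/[^/"]+\.\d+/)"', html)
    # Absolute links, in case the page uses full URLs instead
    abs_ = re.findall(
        r'href="(https://forum\.questionablequesting\.com/threads/[^/"]+\.\d+/)"',
        html,
    )
    links = [base + path for path in rel] + abs_
    # De-duplicate while preserving order
    return list(dict.fromkeys(links))


if __name__ == "__main__":
    sample = '<a href="/threads/example-story.12345/">Example Story</a>'
    print(extract_thread_links(sample))
```

Run it over each saved page and append the results to qq_links.txt; duplicates are harmless since the filter step below handles a messy input list anyway.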
Step 2: Filter Thread Links
Create filter_qq_links.py with the Python script below, then run:
python filter_qq_links.py
This keeps only valid thread links (those matching the /threads/ URL pattern) and writes them to filtered_links.txt
Step 3: Download with Fichub
# Install fichub-cli
pip install -U fichub-cli
# Download all fics
fichub_cli -i filtered_links.txt -o "/home/user/downloads/QQ"
That's it! All stories will be downloaded as EPUB files to your specified folder. Happy reading!
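As a quick sanity check after the run, you can compare the number of downloaded EPUBs against the link list. This is a minimal sketch using the file paths from the commands above; adjust them to match your setup:

```python
from pathlib import Path


def check_downloads(links_file: str, download_dir: str) -> tuple[int, int]:
    """Return (expected, downloaded) counts so failed fics stand out."""
    # One link per non-blank line in the filtered list
    expected = sum(
        1 for line in Path(links_file).read_text().splitlines() if line.strip()
    )
    # Count EPUBs anywhere under the download folder
    downloaded = len(list(Path(download_dir).rglob("*.epub")))
    return expected, downloaded


if __name__ == "__main__":
    expected, got = check_downloads("filtered_links.txt", "/home/user/downloads/QQ")
    print(f"{got}/{expected} EPUBs downloaded")
    if got < expected:
        print("Some downloads may have failed - consider re-running fichub_cli on the missing links.")
```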
#!/usr/bin/env python3
"""
Simple version - filter QuestionableQuesting thread links
"""
import re


def filter_qq_links(input_file, output_file):
    """Filter QQ thread links from input file and save to output file."""
    # QQ thread URLs look like:
    # https://forum.questionablequesting.com/threads/some-title.12345/
    pattern = r"^https://forum\.questionablequesting\.com/threads/[^/]+\.\d+/?$"
    with open(input_file, "r") as f:
        links = [line.strip() for line in f if line.strip()]
    valid_links = [link for link in links if re.match(pattern, link)]
    with open(output_file, "w") as f:
        f.write("\n".join(valid_links))
    print(f"Found {len(valid_links)} valid links out of {len(links)} total")
    print(f"Results saved to {output_file}")


# Usage
if __name__ == "__main__":
    filter_qq_links("qq_links.txt", "filtered_links.txt")