page_grabber
Download managers can be pesky creatures with hard-to-use batch-downloading features. This is why Python is useful. This program has been used and modified so many times that I've forgotten just when I wrote it. I'm going to settle on fall 2010-ish.
import os
import urllib.error
import urllib.request

url_part = "http://examplewebsite.com/"
url_end = "page.php?number=%i"
save_dir = "page/"

# Make sure the target directory exists before writing into it
os.makedirs(save_dir, exist_ok=True)

for i in range(1, 40):
    addr = url_part + (url_end % i)
    try:
        response = urllib.request.urlopen(addr)
        # Save under the last path component of the URL
        with open(os.path.join(save_dir, addr.split("/")[-1]), "wb") as f:
            f.write(response.read())
        print(addr, "OK")
    except (urllib.error.URLError, OSError):
        print(addr, "FAIL")
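When I reuse this for a different site, the only part that really changes is how the page URLs are generated. That step can be pulled out into a small helper so it can be tested without touching the network; `page_urls` is a name I'm making up here, a minimal sketch of the same `range` + `%`-formatting pattern the loop uses.

```python
def page_urls(base, template, start, stop):
    """Build the list of page URLs the download loop iterates over."""
    return [base + (template % i) for i in range(start, stop)]

# Same addresses the loop above would visit
urls = page_urls("http://examplewebsite.com/", "page.php?number=%i", 1, 40)
print(urls[0])   # → http://examplewebsite.com/page.php?number=1
print(len(urls)) # → 39
```

With the URLs in a plain list, it's also easy to skip pages you already have or to feed them to a different fetcher later.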