Download all photos for a given keyword as a zip file?
darryl
To make it easier for parents in my son's preschool to download all of the images tagged with their kid's name, I've written a Perl CGI script that:
- Accepts a SmugMug keyword as input
- Downloads the RSS feed for that keyword
- Downloads all of the images for that keyword to a temp directory
- Renames all of the images to "somestring_YYYYMMDD_HHMMSS_UNIQ.jpg"
- Zips all the images up
- Returns a link to download the zip file.
(Yeah, yuck, Perl. But I needed to use the EXIF and Zip modules, as well as the nifty Proc::Background module to manage multiple calls to curl to download the images in parallel.)
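Here's a stripped-down sketch of the middle three steps — not the real script, and the directory, prefix, and concurrency limit are made up, but it shows the shape of how those modules fit together:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Proc::Background;
use Image::ExifTool qw(ImageInfo);
use Archive::Zip qw(:ERROR_CODES);
use File::Basename qw(basename);
use File::Copy qw(move);

my $dir    = '/tmp/smugkey';    # hypothetical temp directory
my $prefix = 'somestring';      # the "somestring" part of the filename
my $max    = 5;                 # how many curls to run at once (a guess)
my @urls   = @ARGV;             # image URLs pulled from the RSS feed

# --- Download all the images in parallel with curl ---
my @running;
while (@urls or @running) {
    while (@urls and @running < $max) {
        my $url = shift @urls;
        my $out = "$dir/" . basename($url);   # simplified naming
        push @running, Proc::Background->new('curl', '-s', '-o', $out, $url);
    }
    @running = grep { $_->alive } @running;   # reap finished curls
    sleep 1;
}

# --- Rename to somestring_YYYYMMDD_HHMMSS_UNIQ.jpg from EXIF dates ---
my $uniq = 0;
for my $file (glob "$dir/*.jpg") {
    my $info  = ImageInfo($file, 'DateTimeOriginal');
    my $stamp = $info->{DateTimeOriginal} or next;   # skip if no EXIF date
    # EXIF stores "YYYY:MM:DD HH:MM:SS"; squash it to "YYYYMMDD_HHMMSS"
    $stamp =~ s/^(\d{4}):(\d{2}):(\d{2}) (\d{2}):(\d{2}):(\d{2}).*/$1$2$3_$4$5$6/;
    # a simple counter stands in for the UNIQ suffix here
    move($file, sprintf("%s/%s_%s_%03d.jpg", $dir, $prefix, $stamp, $uniq++));
}

# --- Zip everything up ---
my $zip = Archive::Zip->new;
$zip->addFile($_, basename($_)) for glob "$dir/*.jpg";
$zip->writeToFileNamed("$dir/photos.zip") == AZ_OK
    or die "couldn't write zip file";
```

Proc::Background is what makes this tolerable: the curls run concurrently instead of the script blocking on each download in turn.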
If you're interested, let me know and I'll clean up the code and post links to the script source.
--Darryl
Comments
http://www.dgrin.com/showthread.php?t=74036
Both scripts need to run from some kind of Unix server on a regular basis. For the zip script, I also have a clean-up script that deletes the zip files (they can take up a lot of space) three hours after they're generated.
Just from a usability perspective, I really dig this idea, and I'd love to see the API augmented in some way post-1.2.2 to allow for "soft requests" like this.
Nice job!
kallasoft | The "Break It Down" Blog
The code's a tad messy, but it gets the job done, albeit slowly.
http://www.darryl.com/smugkeysrc
keyword.php -- generates links for specified keywords to pass to the CGI script. I don't automatically generate this list; a static list worked fine for my needs.
getfiles.cgi -- logs into the site-wide password-protected site, downloads the RSS feed for the given keyword, downloads all images listed in the feed, zips the images, and provides a download link for the zip file.
cleancache.pl -- cleans up zip files older than 3 hours. I run it hourly from crontab.
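For a taste of the feed-parsing step in getfiles.cgi, a simplified sketch with XML::RSS follows. It assumes each item carries the image URL in an <enclosure> element, with the item link as a fallback — check your actual feed, yours may differ:

```perl
use strict;
use warnings;
use XML::RSS;

my $rss = XML::RSS->new;
$rss->parsefile($ARGV[0]);   # the feed, already fetched to a local file

# Collect one image URL per item
my @urls;
for my $item (@{ $rss->{items} }) {
    push @urls, $item->{enclosure}{url} || $item->{link};
}
print "$_\n" for @urls;
```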
Additional Notes:
The site I wrote this for has a site-wide password, so I need to log in to get a cookie before I can reach the RSS feeds. If yours doesn't, you can comment out that bit of getfiles.cgi.
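If you do need the cookie, that login step can be done with LWP and a cookie jar. A rough sketch — the URL and form field name below are placeholders, not the real ones:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;

# The cookie jar holds the session cookie the password form hands back
my $ua = LWP::UserAgent->new(cookie_jar => HTTP::Cookies->new);

# Placeholder URL and field name -- yours will differ
my $resp = $ua->post(
    'http://example.smugmug.com/password',
    { Password => 'the-site-wide-password' },
);
die 'login failed: ' . $resp->status_line unless $resp->is_success;

# Requests through $ua now send the cookie, so the keyword
# RSS feed is reachable
my $feed = $ua->get('http://example.smugmug.com/keyword-feed-url');
```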
I have to run getfiles.cgi with cgiwrap so that the zip files get written by my uid instead of nobody or apache. This is because the cleancache script has to delete them later and it runs as my uid.
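The guts of the cleanup are tiny — something like this would do it (the cache path is a placeholder):

```perl
#!/usr/bin/perl
# run hourly, e.g. crontab: 0 * * * * /path/to/cleancache.pl
use strict;
use warnings;

my $cache = '/var/www/zipcache';   # wherever the zip files land

# -M gives a file's age in days, so three hours is 3/24
for my $zip (glob "$cache/*.zip") {
    unlink $zip if -M $zip > 3 / 24;
}
```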