Download all photos for a given keyword as a zip file?

darryl Registered Users Posts: 997 Major grins
To make it easier for parents in my son's preschool to download all of the images tagged with their kid's name, I've written a Perl CGI script that:

- Accepts a SmugMug keyword as input
- Downloads the RSS feed for that keyword
- Downloads all of the images for that keyword to a temp directory
- Renames all of the images to "somestring_YYYYMMDD_HHMMSS_UNIQ.jpg"
- Zips all the images up
- Returns a link to download the zip file.
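The pipeline above can be sketched in Python (the original is Perl; the Media-RSS `media:content` layout and the helper names here are my assumptions, not the script's):

```python
import xml.etree.ElementTree as ET

MEDIA_NS = "{http://search.yahoo.com/mrss/}"

def image_urls_from_rss(rss_text):
    """Pull image URLs out of a Media-RSS feed (keyword feeds commonly
    list each photo as a media:content element with a url attribute)."""
    root = ET.fromstring(rss_text)
    return [el.attrib["url"]
            for el in root.iter(MEDIA_NS + "content")
            if el.attrib.get("url")]

def zip_member_name(prefix, exif_datetime, uniq):
    """Build 'somestring_YYYYMMDD_HHMMSS_UNIQ.jpg' from an EXIF-style
    'YYYY:MM:DD HH:MM:SS' timestamp plus a uniquifying counter."""
    date, time = exif_datetime.split(" ")
    return "%s_%s_%s_%04d.jpg" % (
        prefix, date.replace(":", ""), time.replace(":", ""), uniq)

# Tiny inline sample feed so the sketch runs without a network connection:
SAMPLE = """<rss xmlns:media="http://search.yahoo.com/mrss/"><channel>
<item><media:content url="http://example.com/photos/1.jpg"/></item>
<item><media:content url="http://example.com/photos/2.jpg"/></item>
</channel></rss>"""

urls = image_urls_from_rss(SAMPLE)
names = [zip_member_name("alice", "2008:05:15 14:30:02", i)
         for i, _ in enumerate(urls)]
```

From there the real script downloads each URL, renames to the generated filenames, and feeds them to the Zip module.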

(Yeah, yuck, Perl. But I needed to use the EXIF and Zip modules, as well as the nifty Proc::Background module to manage multiple calls to curl to download the images in parallel.)

If you're interested, let me know and I'll clean up the code and post links to the script source.

--Darryl

Comments

  • darryl Registered Users Posts: 997 Major grins
    edited May 15, 2008
    By the way, this is a companion hack to my earlier hack to allow browsers to tag photos by adding comments: smcomments2keywords.pl

    http://www.dgrin.com/showthread.php?t=74036

    Both scripts require some kind of Unix server to run from on a regular basis. For the zip script, I also have a clean-up script to delete the zip files (they can take up a lot of space) three hours after they're generated.
  • rkalla Registered Users Posts: 108 Major grins
    edited May 16, 2008
    Darryl,
    Just from a usability perspective I really dig this idea, would love to see the API augmented in some way post-1.2.2 to allow for "soft requests" like this.

    Nice job!
  • darryl Registered Users Posts: 997 Major grins
    edited May 16, 2008
    What's a "soft request"? If you mean something that needs to be hosted on a server, then heck yeah it'd be great if SmugMug could host these hacks instead of me. But heck, if they're going to go to that much trouble, they should just *fix* the problems: 1) provide an option for some kind of distributed keywording without forcing people to be paid users of SmugMug, and 2) provide an option to allow bulk downloads from the web interface.
  • darryl Registered Users Posts: 997 Major grins
    edited June 5, 2008
    Somebody else asked, so:

    The code's a tad messy, but it gets the job done, albeit slowly.

    http://www.darryl.com/smugkeysrc

    keyword.php -- generates links for specified keywords to pass to the CGI script. I do not automatically generate this list; for my needs a static list worked fine.

    getfiles.cgi -- logs into a site-wide password-protected site, downloads the RSS feed for the given keyword, downloads all images listed in the feed, zips the images, and provides a download link for the zip file.

    cleancache.pl -- cleans up zip files older than 3 hours. I run it hourly from crontab.
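A minimal Python sketch of what cleancache.pl presumably does (assuming the zips sit in one cache directory and age is judged by file mtime; the original is Perl run from crontab):

```python
import glob
import os
import time

MAX_AGE = 3 * 3600  # three hours, matching the cleanup window described above

def clean_cache(cache_dir, now=None):
    """Delete .zip files in cache_dir older than MAX_AGE seconds.
    Returns the list of paths removed."""
    now = time.time() if now is None else now
    removed = []
    for path in glob.glob(os.path.join(cache_dir, "*.zip")):
        if now - os.path.getmtime(path) > MAX_AGE:
            os.remove(path)
            removed.append(path)
    return removed
```

Run hourly (e.g. `0 * * * * python3 cleancache.py` in crontab), this keeps the cache bounded while giving parents a three-hour window to grab their zip.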

    Additional Notes:
    The site I wrote this for has a site-wide password, so I need to log in to get a cookie before I can reach the RSS feeds. If yours does not, you can comment out that bit of getfiles.cgi.

    I have to run getfiles.cgi with cgiwrap so that the zip files get written by my uid instead of nobody or apache. This is because the cleancache script has to delete them later and it runs as my uid.