How can I update files remotely from a website/FTP site?

Hi, I don’t know if you guys get tired of people asking you to write scripts for them, but you seem like a helpful bunch so I thought I’d ask! :wink:

My company has web kiosks in our stores (eMacs running Panther or Jaguar) and would like to use the Mac screensaver to show ads.

We need the ability to update the screensaver image files (jpegs) remotely. We thought about using Remote Desktop, but that seems like overkill when we’re only updating 5 or 6 files a week.

I’m not sure if it’s even possible, but here’s what I’d like the script to do…

  1. Run in the background and kick off automatically once a day (say midnight).

  2. Download a zipped folder from our website or FTP site containing images (I don’t know the site address or path yet)

  3. If necessary, wake the computer from the screensaver (I assume replacing these files while the screensaver is running is a bad idea)

  4. Unzip the file and REPLACE the images in our existing screensaver folder.

(I believe the path is /Users/kiosk/Pictures/screensaver)

  5. Clean up after itself (turn screensaver back on, delete zipped files and any temp folders created)


Well no one responded, so I reckon I’ll answer my own question for all the lurkers out there! :smiley:

The following shell command seems to work:

curl http://mysite/ad1.jpg > /Users/Shared/Pictures/screensaver/ad1.jpg

Of course, this doesn’t unzip files or check to see if the screensaver is already running, but I figured those steps aren’t necessary. I can use Cronnix to make it a cron job and run it several times to grab all the files.
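Rather than running the command several times by hand, a little loop does the same thing — this is just a sketch, and the site address is still a placeholder:

```shell
#!/bin/sh
# Fetch ad1.jpg through adN.jpg into the screensaver folder.
# Usage: fetch_ads <base-url> <dest-folder> <count>
fetch_ads() {
    base="$1"; dest="$2"; count="$3"
    mkdir -p "$dest"
    i=1
    while [ "$i" -le "$count" ]; do
        # Same idea as the single curl command above, once per image
        curl -s -o "$dest/ad$i.jpg" "$base/ad$i.jpg"
        i=$((i + 1))
    done
}

# fetch_ads "http://mysite" "/Users/Shared/Pictures/screensaver" 6
```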

I can’t figure out how to make it grab a folder, and when I use wildcards, the images become corrupt.


Or I could just create an AppleScript that runs the shell command for each image:

do shell script "curl http://mysite/ad1.jpg > /Users/Shared/Pictures/screensaver/ad1.jpg"

do shell script "curl http://mysite/ad2.jpg > /Users/Shared/Pictures/screensaver/ad2.jpg"

do shell script "curl http://mysite/ad3.jpg > /Users/Shared/Pictures/screensaver/ad3.jpg"

Curl doesn’t know what files are in a folder. However, if you know which files exist, you can get curl to download them.

One example:

cd /Users/Shared/Pictures/screensaver; curl -O http://mysite/ad[1-10].jpg

This will get the files ad1.jpg, ad2.jpg, etc., up to ad10.jpg.

Another example:

cd /Users/Shared/Pictures/screensaver; curl -O http://mysite/{apple,orange,lemon}.jpg

This would download the files apple.jpg, orange.jpg, and lemon.jpg.

In both of those examples, “-O” tells curl to save each file locally under the remote file’s name.

Thanks Bruce, your example did the trick!

When I originally used -O and [] the images became corrupt… my syntax must have been wrong.

One other question… to make this a cron job that runs at midnight without resorting to Cronnix, do I simply add 0 0 * * * /usr/bin/ to the beginning of the command?
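Here’s what I’m thinking of trying, using the command from above — someone please correct me if the format is wrong:

```
# Crontab entry I'm guessing at -- run every night at midnight
# (fields: minute hour day-of-month month weekday, then the command)
0 0 * * * cd /Users/Shared/Pictures/screensaver && /usr/bin/curl -s -O http://mysite/ad[1-10].jpg
```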

Thanks again,