sucking images from Safari

Does anybody know of any scripts/widgets/kludges that can extract off-site images from a locally saved Safari web page and save the images to a folder of my choosing? I could do a “Download File to Desktop” on each image, but frankly, that would take forever for anything more than 30 images.

Any ideas? None of the website duplicators I know of works with local files.

signed,

Desperately Seeking Scriptings

Model: G4 400 MHz single-processor
AppleScript: 1.9.3
Browser: Safari 312.6
Operating System: Mac OS X (10.3.6)

Hi mensa.guy,

I’m not sure exactly what you mean by off-site images, but you can try this:


set f to choose folder with prompt "Choose a folder for download:"
set u_path to quoted form of POSIX path of f
tell application "Safari"
	set num_images to ¬
		(do JavaScript "document.images.length" in document 1) as integer
end tell
if num_images is 0 then return
set dup_count to 0
repeat with i from 1 to num_images
	tell application "Safari"
		set image_url to ¬
			(do JavaScript "document.images[" & (i - 1) & "].src" in document 1)
	end tell
	set user_tid to AppleScript's text item delimiters
	set AppleScript's text item delimiters to {"/"}
	set file_name to last text item of image_url
	set AppleScript's text item delimiters to user_tid
	tell application "Finder" to set _exists to exists file file_name of f
	if _exists then
		set dup_count to dup_count + 1
		beep 1
	else
		do shell script ¬
			"/usr/bin/curl " & quoted form of image_url & " -o " & u_path & file_name
	end if
end repeat
display dialog "Duplicates: " & dup_count

It doesn’t do anything with duplicate names in the selected folder; what to do with them is up to you.
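Incidentally, the text-item-delimiters business in the script is just pulling off everything after the last “/” in the URL. The same filename extraction in shell terms (with a made-up URL) looks like this:

```shell
#!/bin/sh
# Same idea as the AppleScript text item delimiters trick:
# keep only what follows the last "/" in the URL.
url="http://example.com/images/photo.jpg"   # placeholder URL
file_name="${url##*/}"                      # strip everything up to the last slash
echo "$file_name"
```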

Here are places where I got info:

Safari javascript examples:
http://www.apple.com/applescript/safari/jscript.01.html

UNIX curl info:
curl man pages,
Developer:Examples:AppleScript Studio:Daily Dilbert

AppleScript:
AppleScriptLanguageGuide.pdf

gl,

Oops, I made a mistake with quoted form.


set f to choose folder with prompt "Choose a folder for download:"
tell application "Safari"
	set num_images to ¬
		(do JavaScript "document.images.length" in document 1) as integer
end tell
if num_images is 0 then return
set dup_count to 0
repeat with i from 1 to num_images
	tell application "Safari"
		set image_url to ¬
			(do JavaScript "document.images[" & (i - 1) & "].src" in document 1)
	end tell
	set user_tid to AppleScript's text item delimiters
	set AppleScript's text item delimiters to {"/"}
	set file_name to last text item of image_url
	set AppleScript's text item delimiters to user_tid
	tell application "Finder" to set _exists to exists file file_name of f
	if _exists then
		set dup_count to dup_count + 1
		beep 1
	else
		set u_path to quoted form of POSIX path of ((f as string) & file_name)
		do shell script ¬
			"/usr/bin/curl " & quoted form of image_url & " -o " & u_path
	end if
end repeat
display dialog "Duplicates: " & dup_count

BTW, you need to add error handling for other things, like when there is no page open, the page isn’t loaded yet, etc.
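For example (just a sketch; the dialog wording is mine), you could wrap the first Safari call in a try block so a missing or unloaded page gives a readable message instead of a raw error:

try
	tell application "Safari"
		set num_images to ¬
			(do JavaScript "document.images.length" in document 1) as integer
	end tell
on error err_msg
	display dialog "Couldn't read the front Safari page: " & err_msg
	return
end try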

gl,

For local web pages you could also use Automator.
It does sometimes choke on external (the rest of the world :D)
web pages, but it’s fine for local stuff.

Safari Actions

1. Get Current Webpage from Safari
2. Get Image URLs from Webpage
3. Download URLs (with show action when run)

There happens to be an episode of the MacBreak video podcast where Sal himself creates an Automator action to scrape images from web pages using Safari. It might be worth a look:

http://www.twit.tv/mb3

Don