Screenshots from a list of links

I have a text file that contains about 100 URLs.
I was wondering if it would be possible to automate the process of opening each link and taking a screenshot of that webpage.

I know that in the past screenshots were out of reach, but perhaps that has changed.

Any thoughts or comments are appreciated.
Thanks,
jcole

If your text file has the URLs each on a separate line, this should work. For more information about the screencapture CLI tool, open the Terminal and type “screencapture --help”.
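Here's a rough sketch of the idea. It assumes the URLs are one per line; the use of Safari, the “screenshots” folder on the desktop, the “sc_” file names, and the 10-second delay are just placeholders to adjust:

-- rough sketch: read a text file of URLs (one per line), load each page
-- in Safari, give it time to render, then capture the screen with screencapture
set urlFile to choose file with prompt "Select the text file of URLs:"
set urlText to read urlFile
set shotFolder to POSIX path of (path to desktop) & "screenshots/"
do shell script "mkdir -p " & quoted form of shotFolder -- create the folder if it doesn't exist
set shotNum to 0
repeat with aURL in paragraphs of urlText
	if (contents of aURL) is not "" then -- skip blank lines
		set shotNum to shotNum + 1
		tell application "Safari" to open location (contents of aURL)
		delay 10 -- give the page time to load; adjust as needed
		do shell script "/usr/sbin/screencapture -x " & quoted form of (shotFolder & "sc_" & shotNum & ".png")
		tell application "Safari" to close window 1
	end if
end repeat

The -x flag just suppresses the capture sound; note that screencapture grabs the entire screen, not just the browser window.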

Jon



Thanks, Jon. I’ll take a look at the script you posted. No doubt it’ll be a bit more refined than my own frankenScript. I bound a few scripts together and it seems to work:

-- select and read contents of text file
set theFile to (choose file with prompt "Select a file to read:" of type {"TEXT"})
open for access theFile
set fileContents to (read theFile)
set target_URL to fileContents
close access theFile

repeat with each_URL in (paragraphs of target_URL)
	set target_folder_name to "screenshots"
	set target_folder to (((path to desktop) as string) & target_folder_name & ":")
	set file_prefix to "sc_"
	set sc_num to 0
	tell application "Finder"
		try
			get target_folder as alias
		on error
			make new folder at desktop with properties {name:target_folder_name}
		end try
		repeat
			set sc_num to sc_num + 1
			set file_name to file_prefix & sc_num & ".pdf"
			if not (exists file (target_folder & file_name)) then exit repeat
		end repeat
	end tell
	
	tell application "Safari"
		open location each_URL
		delay 10 -- wait 10 seconds
		do shell script "/usr/sbin/screencapture -x " & ((quoted form of POSIX path of target_folder & file_name) as string)
		close window 1
	end tell
end repeat

Thanks again.
jcole

Thanks, Jon. This is perfect.

jcole