I am using System Events to script a non-AppleScriptable app. One thing the app in question does not provide properly is the “selected rows” attribute of an outline view.
If I select rows, their background color changes to blue, but in Script Debugger’s System Events dictionary viewer I see no change in the attributes of the selected rows.
I can get the row’s position, size and frame.
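For reference, a minimal sketch of reading those attributes through System Events — the process name and UI element hierarchy here are assumptions and will differ for the actual app:

```applescript
-- Hypothetical process name and UI element path; adjust to the real app's hierarchy
tell application "System Events"
	tell process "SomeApp"
		set thePosition to position of row 1 of outline 1 of scroll area 1 of window 1
		set theSize to size of row 1 of outline 1 of scroll area 1 of window 1
	end tell
end tell
```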
I’m wondering whether, via AppleScript, I can get a color or a list of colors from a point or frame. I have an extension on NSImage that can give me the “dominant color” from within an image.
I’m looking for something a little simpler than using CGContext, etc., to sample a small frame. Even a way to get an NSImage from a frame via AppleScript would help.
I’ll be trying to figure this one out, but my mind immediately went to the scenario where the list with selected items extends beyond the visible frame, requiring scrolling to see all the rows. If it is not the case that all the rows will always be visible, color scraping will fail unless that state can be detected and scrolling can be managed to count and expose them. If you can’t be certain of the number of rows, there is no workable solution I can ponder.
Then you can take a screenshot and get the RGB values of some pixel:
use AppleScript version "2.5"
use framework "Foundation"
use scripting additions
set imagePath to ((current application's NSTemporaryDirectory())'s stringByAppendingString:"TemporaryItems/screenshot.png") as string
do shell script "screencapture -xtpng " & quoted form of imagePath
set imageRep to current application's NSBitmapImageRep's imageRepWithContentsOfFile:imagePath
if imageRep = missing value then return "this file does not contain a usable image"
set theColor to imageRep's colorAtX:20 y:20 -- modify according to your needs
set theRed to theColor's redComponent()
set theGreen to theColor's greenComponent()
set theBlue to theColor's blueComponent()
set theAlpha to theColor's alphaComponent()
set theSpace to theColor's colorSpace()
{theRed, theGreen, theBlue, theAlpha, theSpace}
I ended up finding this approach, which uses a Core Image filter to average the input:
do shell script "screencapture -R " & theX & "," & theY & "," & theWidth & "," & theHeight & " -c"
set theData to current application's NSPasteboard's generalPasteboard()'s dataForType:(current application's NSPasteboardTypeTIFF)
set theCIImage to current application's CIImage's imageWithData:theData
set theCIFilter to current application's CIFilter's filterWithName:"CIAreaAverage"
theCIFilter's setValue:theCIImage forKey:(current application's kCIInputImageKey)
set theResult to theCIFilter's valueForKey:(current application's kCIOutputImageKey)
set thisRep to current application's NSBitmapImageRep's alloc()'s initWithCIImage:theResult
set theColor to (thisRep's colorAtX:0 y:0)
I created an extension on NSImage which I use for the screen capture, then compare the returned color to my testColor.
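Comparing colors by exact equality tends to be fragile; a sketch of a component-wise comparison with a small tolerance (the handler name, tolerance value, and the use of plain {r, g, b} lists are my own, not from the post):

```applescript
-- Compare two {r, g, b} component lists within a tolerance
on colorsMatch(colorA, colorB, tol)
	repeat with i from 1 to 3
		set d to (item i of colorA) - (item i of colorB)
		if d > tol or -d > tol then return false
	end repeat
	return true
end colorsMatch

-- e.g. a sampled blue-ish pixel against a stored test color
colorsMatch({0.0, 0.4, 0.9}, {0.01, 0.39, 0.91}, 0.05)
```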
It works great, except that the System Events “AXFrame” attribute doesn’t return a true NSRect but rather the corner points of the frame, so I had to calculate the width and height for the rect from those values (x1, y1, x2, y2).
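Converting those four corner values into the x, y, width, height that screencapture -R expects can be sketched as (the example values are placeholders, not from the app):

```applescript
-- AXFrame came back as corner points rather than origin + size
set {x1, y1, x2, y2} to {100, 200, 400, 260} -- example values from the attribute
set theX to x1
set theY to y1
set theWidth to x2 - x1
set theHeight to y2 - y1
{theX, theY, theWidth, theHeight} -- → {100, 200, 300, 60}
```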
I have filtering happening in the viewed list, so scrolling is not needed. The app’s outline view also does not support “scrollToRow:” or “scrollToVisible:”.