QTKit and ASOC

After reading Shane’s book, I’ve been trying to convert the Objective-C code from http://iloveco.de/adding-isight/ into ASOC. The first issue that came up is that the defaultInputDeviceWithMediaType class method only ever returns missing value, but I got around it by hard-coding my iSight device using inputDevices().

However, my bigger problem is assigning the session output to my QTCaptureView using setCaptureSession. Any thoughts or suggestions that might help? Please let me know. Thanks.

script VideoPreviewAppDelegate
	property parent : class "NSObject"
    
    property outputView : missing value
	
	on applicationWillFinishLaunching_(aNotification)
        set session to current application's QTCaptureSession's alloc()'s init()
        
        --set theVideo to current application's QTCaptureDevice's defaultInputDeviceWithMediaType_("QTMediaTypeVideo")
        
        set isight to item 2 of current application's QTCaptureDevice's inputDevices()
        log current application's QTCaptureDevice's inputDevices()

        tell isight to open_(reference)

        set input to current application's QTCaptureDeviceInput's deviceInputWithDevice_(isight)
        
        tell session to addInput_error_(input, reference)
        
        set outputView to current application's QTCaptureSession's setCaptureSession_(session)
              
        tell session to startRunning()
    end applicationWillFinishLaunching_
	
	on applicationShouldTerminate_(sender)
		-- Insert code here to do any housekeeping before your application quits 
		return current application's NSTerminateNow
	end applicationShouldTerminate_
	
end script

In the objective C method there is this line:

 [outputView setCaptureSession:session];

Your line, “set outputView to current application’s QTCaptureSession’s setCaptureSession_(session)” is not equivalent.

outputView is an IBOutlet for a QTCaptureView, which you drag into your interface in IB. The line should just be:

outputView's setCaptureSession_(session)

Ric

After edit: I don’t know if “reference” will work in the two places you have it – normally you use “missing value” where the Objective-C method takes an (NSError **) parameter. Also, this may be different on your computer, but my iSight camera is item 3 of the devices list, not item 2. A safer way to get a reference to your iSight camera would be:

    set isight to current application's QTCaptureDevice's defaultInputDeviceWithMediaType_(current application's QTMediaTypeVideo)

This should give you the iSight device regardless of where in the devices list it resides.
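If you ever do need to pick a device by hand, a safer approach than a hard-coded index is to log the device names first and match on the one you want. Here is a rough sketch (localizedDisplayName is a real QTCaptureDevice method; the exact matching you do with the logged names is up to you):

    --  Sketch: list each capture device's display name so the camera
    --  can be identified by name rather than by its position in the list
    set theDevices to current application's QTCaptureDevice's inputDevices() as list
    repeat with aDevice in theDevices
        log aDevice's localizedDisplayName()
    end repeat

That way a different device order on another machine won’t silently hand you the wrong camera.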

Thanks Ric, that all worked perfectly. I’ve now decided to expand my program to take a picture and save it as a JPEG. To do this, I attempted to translate the following Objective-C code into ASOC.

- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer *)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
    if ( currentImage ) return;
    
    CVBufferRetain(videoFrame);
    
    @synchronized (self) {
        currentImage = videoFrame;
    }
    
    [self performSelectorOnMainThread:@selector(saveImage) withObject:nil waitUntilDone:NO];
}

The best I could come up with is as follows. The delegate fires as expected, but it doesn’t assign the data to variables the way the Objective-C code does, so I added the set statement hoping it would suffice.

    on captureOutput_didOutputVideoFrame_withSampleBuffer_fromConnection_(QTCaptureOutput, CVImageBufferRef, QTSampleBuffer, QTCaptureConnection)

        set theVideoFrame to CVImageBufferRef
        
    end captureOutput_didOutputVideoFrame_withSampleBuffer_fromConnection_

Thinking theVideoFrame might actually contain the data I want, my goal is to convert it to JPEG format and save it, but I can’t seem to find any examples of how to do that. Any help would be appreciated. Thanks.

I don’t know if this is possible in ASOC. For one, I don’t think you can deal with the @synchronized block, which has to do with the method being called on another thread. However, when I tried implementing the delegate method, it does get called, so I’m not sure what is going on there. But the (CVImageBufferRef)videoFrame parameter, which is a CVImageBufferRef when I do it in Objective-C, shows up as BAIntermediateData in ASOC. That doesn’t work as the parameter for the addFrame method, which you need to turn the video frame into an image:

on addFrame_(sender)
		log videoFrame
		if videoFrame is not missing value then
			set imageRep to current application's NSCIImageRep's imageRepWithCIImage_(current application's CIImage's imageWithCVImageBuffer_(videoFrame))
			set theImage to current application's NSImage's alloc()'s initWithSize_(imageRep's |size|())
			theImage's addRepresentation_(imageRep)
		end if
		log theImage
	end addFrame_

That code is an ASOC translation of the code in the “Creating a QTKit Stop Motion Application” example in the docs.
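For completeness, once theImage holds a valid NSImage, writing it out as a JPEG in ASOC looks roughly like this (a sketch only – the output path is made up for illustration, and it assumes the earlier conversion actually succeeded):

    --  Sketch: round-trip the NSImage through a TIFF representation,
    --  then ask NSBitmapImageRep for JPEG data and write it to disk
    set tiffData to theImage's TIFFRepresentation()
    set bitmapRep to current application's NSBitmapImageRep's imageRepWithData_(tiffData)
    set jpegData to bitmapRep's representationUsingType_properties_(current application's NSJPEGFileType, missing value)
    jpegData's writeToFile_atomically_("/Users/you/Desktop/snapshot.jpg", true)

The TIFF round-trip is the usual Cocoa idiom for getting a bitmap rep out of an NSImage before re-encoding it.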

Ric

Thanks Ric, I appreciate all your help and effort. I think you are right that it is not going to work properly because of the multi-threading issue. I wanted to post the full script in case someone else finds it useful, but I received the following errors when attempting the conversion. It was still a useful experiment and I learned a lot with your help. Thanks so much. --Andrew

+[CIImage imageWithCVImageBuffer:]: value passed to argument of type ^{__CVBuffer=} must be either `missing value' or `reference'; assuming `missing value'.
-[NSImage addRepresentation]: unrecognized selector sent to instance 0x400f14880
-[VideoPreviewAppDelegate saveImage:]: -[NSImage addRepresentation]: unrecognized selector sent to instance 0x400f14880 (error -10000)


script VideoPreviewAppDelegate
	property parent : class "NSObject"
    
    property theOutputView : missing value
    property theSession : missing value
    property theVideoFrame : missing value
	
	on applicationWillFinishLaunching_(aNotification)
        
        --  initialize capture session
        set theSession to current application's QTCaptureSession's alloc()'s init()
        
        --  get default video device
        set isight to current application's QTCaptureDevice's defaultInputDeviceWithMediaType_(current application's QTMediaTypeVideo)

        --  open default video device
        tell isight to open_(missing value)
        
        --  create input object
        set input to current application's QTCaptureDeviceInput's deviceInputWithDevice_(isight)
        
        --  add input to the session
        tell theSession to addInput_error_(input, missing value)
        
        --  initialize video output
        set output to current application's QTCaptureDecompressedVideoOutput's alloc()'s init()
       
        --  set the delegate that will capture the image
        output's setDelegate_(me)
        
        -- add output to session
        tell theSession to addOutput_error_(output, missing value)
        
        --  output the capture session to the capture view window
        theOutputView's setCaptureSession_(theSession)
              
        --  start the video stream
        tell theSession to startRunning()

    end applicationWillFinishLaunching_
    
    
    on captureOutput_didOutputVideoFrame_withSampleBuffer_fromConnection_(QTCaptureOutput, CVImageBufferRef, QTSampleBuffer, QTCaptureConnection)
        
        --  set video frame object to each streaming video frame 
        set theVideoFrame to CVImageBufferRef
        
    end captureOutput_didOutputVideoFrame_withSampleBuffer_fromConnection_
    
    
    on saveImage_(sender)
        
        -- NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:currentImage]]
        if my theVideoFrame is not missing value then
            set theImageRep to current application's NSCIImageRep's imageRepWithCIImage_(current application's CIImage's imageWithCVImageBuffer_(my theVideoFrame))
            
            set theImage to current application's NSImage's alloc()'s initWithSize_(theImageRep's |size|())
            theImage's addRepresentation_(theImageRep)
        end if
        
    end saveImage_
    
    
    on applicationShouldTerminateAfterLastWindowClosed_(sender)
        --  terminate the application when the main window is closed
        return true
    end applicationShouldTerminateAfterLastWindowClosed_
    
	
	on applicationShouldTerminate_(sender)
		-- Insert code here to do any housekeeping before your application quits 
		return current application's NSTerminateNow
	end applicationShouldTerminate_
	
end script