cocoa - Extracting an image from H.264 sample data (Objective-C / Mac OS X)
Given a sample buffer of H.264 data, is there a way to extract the frame it represents as an image?

I'm using QTKit to capture video from a camera, with a QTCaptureMovieFileOutput as the output object.

What I want is something similar to the CVImageBufferRef that's passed as a parameter to the QTCaptureVideoPreviewOutput delegate method. For some reason, the file output's callback doesn't provide a CVImageBufferRef; what it provides is a QTSampleBuffer which, since I've set it in the compression options, contains an H.264 sample.
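For reference, this is the difference as I understand the QTKit delegate API (a sketch, with the signatures reproduced from memory of the headers):

```objc
// QTCaptureVideoPreviewOutput (and QTCaptureDecompressedVideoOutput)
// hand the delegate a raw frame directly:
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection;

// QTCaptureMovieFileOutput (via QTCaptureFileOutput) only hands the
// delegate the sample buffer - compressed H.264 in my configuration:
- (void)captureOutput:(QTCaptureFileOutput *)captureOutput
didOutputSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection;
```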
I have seen that on the iPhone, CoreMedia and AVFoundation can be used to create a CVImageBufferRef from a given CMSampleBufferRef (which, I imagine, is as close to a QTSampleBuffer as I'll be able to get) - but this is the Mac, not the iPhone.
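(A minimal sketch of the iOS-side call I mean - this is CoreMedia's API on the iPhone, not anything available to me in QTKit:)

```objc
#import <CoreMedia/CoreMedia.h>

// On iOS, CoreMedia can surface the image buffer carried by a sample
// buffer (when the buffer actually contains decoded pixel data):
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
```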
Neither CoreMedia nor AVFoundation exist on the Mac, and I can't see any other way to accomplish the same task.

What I need is an image (whether a CVImageBufferRef, CIImage or NSImage doesn't matter) of the current frame from the H.264 sample given to me by the output object's callback.
extended info (from comments below)
I have posted a related question that focuses on the original issue - attempting to play a stream of video samples using QTKit: Playing a stream of video data using QTKit on Mac OS X.

It appears that this is not possible, which is why I've moved on to trying to obtain frames as images and create the appearance of video, by scaling, compressing and converting the image data from CVImageBufferRef to NSImage and sending it to a peer over the network.
I can use QTCaptureVideoPreviewOutput (or QTCaptureDecompressedVideoOutput) to get uncompressed frame images in the form of a CVImageBufferRef. However, these image references need compressing, scaling and converting into NSImages before they're of any use to me - hence the attempt to get an already scaled and compressed frame from the framework using QTCaptureMovieFileOutput (which allows the compression and image size to be set before starting the capture), saving me from the expensive compression, scale and conversion operations, which kill the CPU.
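(A rough sketch of that per-frame conversion, assuming the usual CIImage/NSCIImageRep bridge - the scaling and compression work would come on top of this:)

```objc
#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>

// Wrap the CVImageBufferRef in a CIImage, then bridge it into an
// NSImage via NSCIImageRep. Doing this (plus scaling and compressing)
// for every single frame is the CPU cost I'm trying to avoid.
static NSImage *NSImageFromImageBuffer(CVImageBufferRef imageBuffer)
{
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:imageBuffer];
    NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:ciImage];
    NSImage *image = [[[NSImage alloc] initWithSize:[rep size]] autorelease];
    [image addRepresentation:rep];
    return image; // manual retain/release, as in QTKit-era code
}
```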
Does the "Creating a Single-Frame Grabbing Application" section of the QTKit Application Programming Guide not work in this instance?
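(A sketch of that guide's setup, from memory - the 320x240 size is an arbitrary example; the point is that pixelBufferAttributes asks the framework to scale the frames before your delegate ever sees them:)

```objc
#import <QTKit/QTKit.h>
#import <CoreVideo/CoreVideo.h>

// Attach a decompressed video output whose pixel buffer attributes
// request already-scaled frames from the framework:
QTCaptureDecompressedVideoOutput *output =
    [[[QTCaptureDecompressedVideoOutput alloc] init] autorelease];
[output setPixelBufferAttributes:
    [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:320], (id)kCVPixelBufferWidthKey,
        [NSNumber numberWithInt:240], (id)kCVPixelBufferHeightKey,
        nil]];
[output setDelegate:self];

NSError *error = nil;
[captureSession addOutput:output error:&error];

// The delegate then receives each decompressed, pre-scaled frame in
// captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection:.
```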