Objective-C: Get pixel colour from a webcam
I'm trying to get the colour of a pixel from an image displayed by the webcam. I want to see how the pixel colour changes over time.

My current solution uses a lot of CPU. It works and gives me the correct answer, but I'm not 100% sure if I'm doing it correctly or whether I could cut some steps out.
- (IBAction)addFrame:(id)sender
{
    // Get the most recent frame. This must be done in an @synchronized block
    // because the delegate method that sets the most recent frame is not
    // called on the main thread.
    CVImageBufferRef imageBuffer;
    @synchronized (self) {
        imageBuffer = CVBufferRetain(mCurrentImageBuffer);
    }

    if (imageBuffer) {
        // Create an NSImage and add it to the movie.
        // I think I can remove some steps here, but not sure where.
        NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
        NSSize n = {320, 160};
        //NSImage *image = [[[NSImage alloc] initWithSize:[imageRep size]] autorelease];
        NSImage *image = [[[NSImage alloc] initWithSize:n] autorelease];
        [image addRepresentation:imageRep];
        CVBufferRelease(imageBuffer);

        NSBitmapImageRep *raw_img = [NSBitmapImageRep imageRepWithData:[image TIFFRepresentation]];
        NSLog(@"image width %f", [image size].width);
        NSColor *color = [raw_img colorAtX:1279 y:120];
        float colourValue = [color greenComponent] + [color redComponent] + [color blueComponent];
        [graphView setXY:10 andY:200 * colourValue / 3];
        NSLog(@"%0.3f", colourValue);
    }
}
Any help is appreciated, and I'm happy to try other ideas. Thanks guys.
There are a couple of ways this could be made more efficient. Take a look at the imageFromSampleBuffer: method in this Tech Q&A, which presents a cleaner way of getting an image from a CVImageBufferRef (the sample uses UIImage, but it's practically identical for NSImage).

You can also pull the pixel values straight out of the CVImageBufferRef without any conversion. Once you have the base address of the buffer, you can calculate the offset of any pixel and just read the values from there.