iphone - Is it possible for an iOS app to take an image and then analyze the colors present in said image?
For example, after taking an image, the app could tell you the relative amounts of red, blue, green, and yellow present in the picture, and how intense each color is.

I know that's super specific, but I'd like to know if it's possible and, if so, whether anyone has an idea of how to go about it.

Thanks!
Sure, it's possible. You'd have to load the image into a UIImage, get its underlying CGImage, and then get a pointer to the pixel data. If you just average the RGB values of all the pixels you'll get a pretty muddy result, though, unless you're sampling an image with large areas of strong primary colors.
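To make that concrete, here's a minimal Swift sketch of the idea, assuming an RGBA bitmap with 8 bits per channel. The `averageColor(of:)` name and the redraw-into-a-known-format approach are my own choices, not code from any particular recipe:

```swift
import UIKit

// A minimal sketch (my own names, not from the book): redraw the image
// into an RGBA bitmap we control, then average every pixel's channels.
func averageColor(of image: UIImage) -> UIColor? {
    guard let cgImage = image.cgImage else { return nil }

    let width = cgImage.width
    let height = cgImage.height
    let colorSpace = CGColorSpaceCreateDeviceRGB()

    // Let Core Graphics allocate the buffer; we read it back via context.data.
    guard let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: 0,   // 0 = let CG pick an aligned row size
                                  space: colorSpace,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue),
          let data = context.data else { return nil }

    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

    let rowBytes = context.bytesPerRow
    let pixels = data.bindMemory(to: UInt8.self, capacity: rowBytes * height)

    var totalR = 0, totalG = 0, totalB = 0
    for y in 0..<height {
        let row = pixels + y * rowBytes
        for x in 0..<width {
            let p = row + x * 4          // RGBA, 8 bits per channel
            totalR += Int(p[0])
            totalG += Int(p[1])
            totalB += Int(p[2])
        }
    }

    let count = CGFloat(width * height)
    return UIColor(red: CGFloat(totalR) / count / 255,
                   green: CGFloat(totalG) / count / 255,
                   blue: CGFloat(totalB) / count / 255,
                   alpha: 1)
}
```

Redrawing into a context you create yourself is the easy way to sidestep the many pixel formats a CGImage can arrive in; the alternative is to inspect `cgImage.bitmapInfo` and handle each layout. Note that `premultipliedLast` means the color channels are pre-scaled by alpha, which is harmless for opaque photos.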
Erica Sadun's excellent iOS Developer's Cookbook series has a section on sampling pixel image data that shows how it's done. In recent editions there are a "Core" and an "Extended" volume; I think it's in the Core iOS volume. My copy of Mac iBooks is crashing repeatedly right now, so I can't find it for you. Sorry about that.
EDIT:

I finally got it open on my iPad. It's in the Core volume, in Recipe 1-6, "Testing Touches Against Bitmap Alpha Levels." As the title implies, the recipe looks at an image's alpha levels to figure out whether you've tapped on an opaque image pixel or missed the image by tapping on a transparent one. You'll need to adapt the code to come up with an average color for the image, but Erica's code shows the hard part: getting and interpreting the bytes of image data. The book is in Objective-C. Post a comment if you have trouble figuring it out. One way the adaptation might look is sketched below.
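Since the question asked for relative amounts of red, blue, green, and yellow rather than a single average, here's a hedged sketch of one possible adaptation: classify each pixel's hue into coarse buckets and tally them, using saturation as a rough proxy for how intense the color is. It assumes a tightly packed RGBA byte array like the one the previous sketch draws into; the bucket boundaries, the gray-pixel thresholds, and the `colorBreakdown` name are all my own assumptions, not anything from the book:

```swift
import UIKit

// Coarse hue buckets; cyan, magenta, etc. fall outside on purpose.
enum ColorBucket: String, CaseIterable {
    case red, yellow, green, blue
}

// Assumed bucket boundaries on UIColor's 0...1 hue scale.
func bucket(forHue hue: CGFloat) -> ColorBucket? {
    switch hue {
    case ..<0.10, 0.90...: return .red      // near 0/360 degrees
    case 0.10..<0.22:      return .yellow   // near 60 degrees
    case 0.22..<0.45:      return .green    // near 120 degrees
    case 0.50..<0.78:      return .blue     // near 240 degrees
    default:               return nil
    }
}

// Tallies tightly packed RGBA bytes (no row padding) into buckets,
// returning each bucket's share of the image and mean saturation.
func colorBreakdown(rgbaPixels: [UInt8]) -> [ColorBucket: (share: CGFloat, intensity: CGFloat)] {
    var counts: [ColorBucket: Int] = [:]
    var saturationTotals: [ColorBucket: CGFloat] = [:]
    let pixelCount = max(rgbaPixels.count / 4, 1)

    for offset in stride(from: 0, to: rgbaPixels.count, by: 4) {
        let color = UIColor(red: CGFloat(rgbaPixels[offset]) / 255,
                            green: CGFloat(rgbaPixels[offset + 1]) / 255,
                            blue: CGFloat(rgbaPixels[offset + 2]) / 255,
                            alpha: 1)
        var hue: CGFloat = 0, sat: CGFloat = 0, bri: CGFloat = 0, alp: CGFloat = 0

        // Near-gray or near-black pixels have no meaningful hue; skip them.
        guard color.getHue(&hue, saturation: &sat, brightness: &bri, alpha: &alp),
              sat > 0.15, bri > 0.10,
              let b = bucket(forHue: hue) else { continue }

        counts[b, default: 0] += 1
        saturationTotals[b, default: 0] += sat
    }

    var result: [ColorBucket: (share: CGFloat, intensity: CGFloat)] = [:]
    for b in ColorBucket.allCases {
        let n = counts[b, default: 0]
        result[b] = (share: CGFloat(n) / CGFloat(pixelCount),
                     intensity: n > 0 ? saturationTotals[b, default: 0] / CGFloat(n) : 0)
    }
    return result
}
```

Building a UIColor per pixel is slow for a full-resolution photo; in practice you'd either inline the RGB-to-HSB math or downsample the image first.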