Some time ago I posted a tutorial on the basics of using Core Image face detection on the iOS platform.
One question raised in the comments on that post was how to use Core Image face detection on a live video feed – and whether it would be fast enough.
Recently I received a submission from Jeroen Trappers showing how to run Core Image face detection on a live video feed and overlay a moustache on each face found in the video stream.
Here’s an image of the example in action from Jeroen’s site:
You can find the tutorial here.
The speed is respectable on newer iOS devices at a 640×480 capture resolution.
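This is not code from Jeroen's tutorial, but a minimal sketch of the core idea: running a `CIDetector` over a `CIImage` (which, in a live-video setup, you would create from each captured frame's pixel buffer). Low accuracy and tracking are the options typically chosen for speed on a live feed; the function name `detectFaces` is my own.

```swift
import CoreImage

// Minimal sketch (assumed API usage, not Jeroen's code): detect faces
// in a CIImage, e.g. one made from a video frame's CVPixelBuffer.
func detectFaces(in image: CIImage) -> [CIFaceFeature] {
    // CIDetectorAccuracyLow trades precision for speed, which is usually
    // the right choice for per-frame detection; tracking helps Core Image
    // follow the same face across consecutive frames.
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyLow,
                                        CIDetectorTracking: true])
    return detector?.features(in: image) as? [CIFaceFeature] ?? []
}
```

In a live capture pipeline you would call this from an `AVCaptureVideoDataOutput` sample buffer delegate, wrapping each frame's pixel buffer in a `CIImage` before detection.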
Added to the Core Image page.
Submit A Resource
Have you created a useful tutorial, library or tool for iOS development that you would like to get in front of our 300,000+ monthly page views from iOS developers?
You can submit the URL here.
The resources we feel will appeal to our readers the most will be posted on the front page.