I found a couple of really cool examples from an upcoming book called iOS 4 Sensor Programming. I was checking out the rough cut, and the book covers a lot of cool topics.
Two of the topics covered in the book are face detection and augmented reality, both of which have examples available.
The first example is face detection with OpenCV, and the code here is a bit more understandable than most of the OpenCV samples I’ve seen around. Getting OpenCV working on the iPhone is somewhat of a pain, but this tutorial alleviates most of it:
Using OpenCV On The iPhone
The second example currently available is on augmented reality, and while it’s pretty simple, it shows you the potential of what you can do by combining the camera and GPS features available on the iPhone. The example shows you the direction to a couple of specific cities, with the directions updated in real time.
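The core of an AR view like that is computing the compass bearing from your current GPS position to each target city, then drawing a marker when the camera heading matches. Here is a minimal sketch of that bearing math in Python; the function name and the illustrative coordinates are my own, not from the book, and on the iPhone you would feed it coordinates from Core Location instead.

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees (0 = north, clockwise)
    from point 1 to point 2; all arguments in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

# Bearing from San Francisco toward New York (coordinates approximate):
# roughly 70 degrees, i.e. east-northeast along the great circle.
print(round(initial_bearing(37.77, -122.42, 40.71, -74.01)))
```

Comparing that bearing against the device's compass heading tells you where on screen to draw each city's label, which is essentially all a simple AR overlay needs.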
You can find the current book examples on the book’s homepage at:
Programming iPhone Sensors
The rough cut is available on Safari here for those looking to read the book. It looks like it wasn’t updated for a while, as the author had to wait for the last iPad update, but he’s back to working on it full time. Looking forward to it.
[thanks to Alex Curylo for pointing out the book]