😎 Fun Feature Friday: Scanning Text from Photos on iOS
Or the "I just want to share the text from this photo" feature, also known as OCR
Welcome to 😎 Fun Feature Friday 😎 where I take a look at a feature I love and break it down from a product perspective. I plan on doing these once a month or so. - AQ
My wife and I have a constant iMessage dialogue going as we manage the endless tasks of kids, home, family and life. Often I see something and I just need to get her the most pertinent information - so I use this handy feature on iOS. I call it the “I just want to share the text from this photo” feature.
Yesterday is a perfect example. The mail arrived with a thick AmEx brochure about our credit card benefits. I scanned it to see if there was anything we didn’t already know and there was - apparently our card allows us to pay with points on Alaska Airlines - cool! I knew snapping a full (or zoomed in) photo of the brochure would lose my wife (WTF did he send me this?) so I snapped the photo, scanned the text and sent her just the sentences she needed to see (demo below).
TL;DR what does this feature do?
It lets you highlight text in a photo and copy/paste it anywhere.
Who did Apple build this for?
While my use case (busy parents) is interesting, I don’t think I’m in the majority user group Apple was thinking about when they built this feature. My hypothesis is that the end users they had in mind are students and researchers who need the ability to quickly take notes on key points they need to remember.
However, I think that Apple actually built this for app developers who might want to create fantastic new iOS app experiences for students and researchers to use.
What problem were they trying to solve?
Removing the greatest technical challenge of building a note-taking app: recognizing text within an image (also known as OCR, or optical character recognition). By enabling OCR as a platform capability, Apple levels the playing field and lets developers focus their efforts on creating fantastic (and creative) user experiences that truly make note-taking and organization easier.
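For a sense of how little work this leaves developers, here is a minimal sketch of text recognition using Apple's Vision framework. The function name and the assumption that you already have a `CGImage` in hand are mine; the Vision API calls (`VNRecognizeTextRequest`, `VNImageRequestHandler`) are the real platform capability:

```swift
import Vision

// A hypothetical helper: recognize all text in an already-loaded CGImage.
func recognizeText(in cgImage: CGImage) throws -> String {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate  // favor accuracy over speed

    // Run the request against the image.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Each observation offers ranked candidate strings; take the top one.
    let observations = request.results ?? []
    return observations
        .compactMap { $0.topCandidates(1).first?.string }
        .joined(separator: "\n")
}
```

That's roughly the entire OCR "engine" from the developer's point of view; everything else in a note-taking app is user experience.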
How does Apple measure success?
Apple should be looking at the success of note-taking apps on the iOS platform. Since OCR was introduced as a platform capability, have there been more note-taking apps? Has the quality of those apps improved? Overall, is note-taking on an iPhone easier?
Ultimately, if Apple can say that note-taking on an iPhone is demonstrably easier than it was before, they've achieved success - and that success will yield a healthier iOS app ecosystem and contribute to iPhone sales.
What might Apple do next? How might they improve this experience?
To improve this for both end users and developers, Apple might add app suggestions. Today, when you highlight text, the native iOS "Copy | Select All…" menu displays. Imagine if that bar proactively suggested apps (based on either a user's past usage or App Store ranking) for the user to copy the text into. Apple could set a quality bar so that only apps with an X App Store ranking would be considered for inclusion. This would incentivize developers even further to create high-quality experiences, thereby powering Apple's iOS ecosystem growth flywheel.


