By: Nu Yang
CrowdOptic connects users by what they’re looking at with their phones

Checking in to a location on Facebook and live-tweeting from an event are common practice these days, but the latest mobile technology now lets consumers engage and interact just by pointing their smartphone camera at an object.
Based in San Francisco and launched last summer, CrowdOptic uses focus-based services to supply data on engagement as well as time and location. What that means, according to the company, is that venues, advertisers, and sponsors can respond in real time to shifts in crowd momentum, enhance in-venue security, and deliver hyper-targeted mobile media.
CrowdOptic works with event organizers and advertisers to create specific event data for a phone app (available for both iPhone and Android devices). Once the app is downloaded, users can point their mobile camera at people or things to learn more about them. Fans at a sporting event can focus their phone on a player and receive statistics, or concert-goers can find out what band is playing by pointing their camera toward the stage.
With focus-based technology, users can also connect with other people who are zooming in on the same thing with their camera. These “clusters” can share their comments, photos, and videos with each other and with people outside the event.
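CrowdOptic has not published how its clustering works, but the underlying geometry can be sketched: each phone reports a position (from GPS) and a compass bearing (from its magnetometer), which together define a line of sight, and two users are candidates for the same cluster when those sight lines converge on a common point. The function below is a hypothetical illustration of that idea in local planar coordinates (metres), not CrowdOptic's actual method.

```python
import math

def _direction(bearing_deg):
    # Compass convention: 0 degrees = north (+y), 90 degrees = east (+x).
    rad = math.radians(bearing_deg)
    return (math.sin(rad), math.cos(rad))

def ray_intersection(pos1, bearing1, pos2, bearing2):
    """Where two phones' lines of sight cross, in local metres.

    Each phone is modelled as a ray: a position plus a compass
    bearing. Returns the crossing point, or None if the sight
    lines are parallel or would cross behind either camera.
    """
    d1, d2 = _direction(bearing1), _direction(bearing2)
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:               # parallel sight lines
        return None
    dx, dy = pos2[0] - pos1[0], pos2[1] - pos1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    t2 = (dx * d1[1] - dy * d1[0]) / denom
    if t1 < 0 or t2 < 0:                # crossing is behind a camera
        return None
    return (pos1[0] + t1 * d1[0], pos1[1] + t1 * d1[1])

# Two fans whose sight lines converge on the same player: one at the
# origin aiming due east, one 100 m east and 100 m south aiming due
# north. Both sight lines pass through roughly (100, 0).
focus = ray_intersection((0, 0), 90, (100, -100), 0)
```

A full clustering pass would compare these focal points across all users and group those whose sight lines meet within some tolerance.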
CrowdOptic recently conducted beta tests at this year’s Academy Awards in Hollywood, Calif., and Super Bowl XLVI in Indianapolis. When users pointed their phone camera to take photos, they were able to join live online discussions with others who were pointing their phone at the same thing.
Chief executive officer and co-founder Jon Fisher said for the newspaper industry, a product such as CrowdOptic would make news events more “tangible.”
“It’s a real-time tool,” Fisher said. “It could encourage users to bring in the news and photos, and cut through a lot of noise.”
With more reporters using mobile devices, CrowdOptic could help with breaking news stories. Journalists are also able to verify photos and tweets sent in by eyewitnesses based on the phone’s location and line of sight. Fisher said editors can sift through the thousands of photos submitted to a newspaper to find specific photos based on what the user is looking at, rather than just a location.
“It narrows down the search and makes it go by faster,” he said. “CrowdOptic creates a template for who’s witnessing (the event). There’s more focus with the data.”
Fisher said there are currently two major trends: putting more advanced technology into people’s hands and connecting people interactively. Though he said QR codes and augmented reality each have their place, he sees them as static.
“They know where the object is, but they don’t move,” Fisher said. “With clusters, the static goes away, and the object is able to move. The smartphone becomes the naked eye.”
For more information, visit crowdoptic.com.