A new ‘human recognition system’ app can identify people in a crowd from their clothing, and overlay their name on the display of a Google Glass headset.
The system, called InSight, is part-funded by Google and was unveiled at the HotMobile technology conference in Jekyll Island, Georgia, last week. It aims to help users find their friends and be spotted themselves in busy places like shopping centres, sports stadia and airports. Face recognition systems cannot be used for this, says InSight developer Srihari Nelakuditi at the University of South Carolina in Columbia, because it is unlikely someone in a crowd will be looking straight at a headset’s camera.
In early tests, the system showed 93 percent accuracy in identifying volunteers, even with their backs turned.
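The published reports don’t detail InSight’s matching algorithm, but the general idea of a clothing “fingerprint” can be sketched with ordinary colour histograms: quantize each person’s clothing pixels into coarse colour buckets, then match an unknown sighting against enrolled fingerprints by cosine similarity. This is a hypothetical illustration of the concept, not the researchers’ actual method; all function names here are invented for the example.

```python
# Hypothetical sketch of clothing-based matching (NOT InSight's actual
# algorithm): represent clothing as a normalized RGB colour histogram
# and identify a sighting by cosine similarity against enrolled people.
from collections import Counter
import math

def colour_histogram(pixels, bins=4):
    """Quantize (r, g, b) pixels into bins**3 coarse buckets, normalized."""
    hist = Counter()
    for r, g, b in pixels:
        hist[(r * bins // 256, g * bins // 256, b * bins // 256)] += 1
    total = sum(hist.values())
    return {bucket: count / total for bucket, count in hist.items()}

def cosine_similarity(h1, h2):
    """Cosine similarity between two sparse histograms."""
    dot = sum(v * h2.get(bucket, 0.0) for bucket, v in h1.items())
    n1 = math.sqrt(sum(v * v for v in h1.values()))
    n2 = math.sqrt(sum(v * v for v in h2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def identify(query_pixels, enrolled):
    """Return the enrolled name whose clothing fingerprint best matches."""
    query = colour_histogram(query_pixels)
    return max(enrolled, key=lambda name: cosine_similarity(query, enrolled[name]))
```

A scheme like this needs no view of the face, which is why it could plausibly work on people with their backs turned, as reported in the tests.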
In the auction listing, the seller claimed (conveniently, given that Google hasn’t even sorted out its early adopters yet) that he would be getting Google Glass through Google’s #ifihadglass competition.
I’ve been selected as an early adopter for Google’s upcoming release. You are buying a brand new, unopened pair of Google’s Project Glass glasses. I will be personally attending and picking up my pair in either Los Angeles or New York at Google’s Project Glass launch event, which will take place some time after February 27th.
Unfortunately for the seller, the claim was completely false, as Google will not be letting winners know until mid-March. In fact, the competition only closed today. Winners are also barred from reselling the devices, as per Google’s rules.
One may well ask: what does Google Glass offer that the world hasn’t already seen? Google Glass sports a camera for capturing photos on the fly, comes with Google Maps preinstalled, and can translate from one language to another. But a smartphone can do all of this and more. Isn’t it improbable and impractical that anyone would buy a pair of glasses just to look like a dork?
“If you look at other wearable pieces of functional technology, there’s a reason they’re not ubiquitous. There’s a reason we all make fun of someone wearing a Bluetooth or a BlackBerry holster,” said Daniella Yacobovsky, co-founder of BaubleBar, an online jewellery retailer. “Is it useful? Of course it is. Do I look like a tool? Yeah. I’m not going to wear it.”
I consider Google Glass to be in that same league of innovations, as it brings regular activities closer to our senses and makes them much easier and faster to execute. Which would you prefer: saying “OK Glass. Take a photo,” or taking out your phone, launching the camera, pointing and shooting? Would you rather run to the next room to fetch your video camera to record your daughter’s first steps, or simply say “OK Glass. Record this”?
As Google and other companies begin to build wearable technology like glasses and watches, an industry not known for its fashion sense is facing a new challenge — how to be stylish. […] In a sign of how acute the challenge is for Google, the company is negotiating with Warby Parker, an e-commerce start-up that sells trendy eyeglasses, to help it design more fashionable frames…
Want to see how Glass actually feels? It’s surprisingly simple. Say “take a picture” to take a picture. Record what you see, hands free. Even share what you see, live.
We as human beings are very visual, and for that reason, Google has unveiled the first demo video of its much-anticipated Google Glass a.k.a. Google Project Glass, dubbing it the future of “wearable technology.” So, could Glass change the course of technology?
The unveiling video showed how these “glasses” can be used to take pictures, record video, share content directly via email or social networks, and get information quickly. Other features include weather reports, map directions, quickest-route suggestions and, of course, a spell-checker.
Yes. It’s my expectation that in three to five years it will actually look unusual and awkward when we see someone holding an object in their hand and looking down at it. Wearable computing will become the norm.
Google’s Project Glass product lead Steve Lee walks us through his experience with the development of the company’s sci-fi-inspired eyewear—from his team’s “hundreds of variations and dozens of early prototypes” to his vision of the future.
At the start of this month, Google secretly began testing its augmented reality goggles with selected employees. The prototype eyewear allows users to view messages, videos, maps, and images in real-time. Now, Oakley has confirmed that it’s testing similar technology that will rival Google’s ‘Project Glass.’
Colin Baden, CEO of Oakley, stated in an interview that:
As an organization, we’ve been chasing this beast since 1997. Ultimately, everything happens through your eyes, and the closer we can bring it to your eyes, the quicker the consumer is going to adopt the platform.
Oakley has been working on technology to produce head-mounted displays for nearly 15 years, and has 600 patents mainly relating to optical specifications. Baden refused to confirm if Oakley would produce its own pair of smart glasses, or if the company would only license its technology.
Essentially, the fifth screen represents a paradigm shift in advertising as much as it represents a change in the way we use computers on the move. It means ads everywhere, and different kinds of ads too—something TV advertisers are beginning to wake up to, now that viewers use their iPads and phones while watching TV… “Every single piece of advertising now has behaviour change as its goal. Understanding how to change behaviour, and what people will choose to interact with, is key for advertisers. Expect to see more science, more psychologists, and more behavioural data informing advertising in the future.”
When I saw that Google had somehow forgotten to include any ads in their Project Glass promotional video, I just couldn’t resist fixing that oversight for them. […] So here is my slightly more realistic version of Google’s augmented reality glasses, now featuring contextual Google Ads!
Bryce is referring to the profound changes in the technological infrastructure our culture sits on, as we start to create nearly unimaginable amounts of data — both personalized and anonymized — through the streams of our existence.
The proximate cause of this insight is the new demo video of Google glasses (or goggles), and Bryce’s realization that Google might be in a position to track literally everything we see (at least while the glasses are on). Every product you reject and put back on the shelf, every plate of food, every friend (and stranger) you pass in the street, every store you enter, every breath you take, every move you make, they’ll be watching you.