Is Google Glass the Segway of this era?

Not so long ago, Mike Butcher (of TechCrunch) tried on a pair of Google Glass lent to him by a Glass Explorer at a conference. Even though his experience with Glass was rather short, he reached a conclusion that nailed it for me:

“So Google Glass for me will be this era’s Segway: hyped as a game changer but ultimately used by warehouse workers and mall cops.”

I had the chance to buy a Google Glass 5-6 months ago, when the original Glass Explorers were allowed to invite 3 friends to join the program. As managing partner for digital at Duval Guillaume, I need and want to be on top of major tech innovations, and Glass is definitely one of those things. So I volunteered immediately and made some arrangements to be able to buy it while not completely following the standard procedure. You have to be a US resident for instance, which I'm not, but that was pretty easy to overcome. So I got my Google Glass soon after, and once it was all set up (which was pretty easy) I started playing around with it.

At first I was in awe. The little projection screen of Google Glass is crisp, the voice and touch controls are intuitive and simple to get familiar with, and it's pretty impressive what it can do. It's the same when you share that experience with others: every time one of my friends or colleagues put on Google Glass and performed some of the key tasks, they were amazed by the result. That, and some of them were jealous enough to want one of their own.

I took some really great pictures of the rising sun while driving my car, got directions to an unfamiliar shop pointed out while walking in the city, watched YouTube videos after searching for them via voice commands, shared Facebook updates also via voice, … A lot of nice things actually. But then there's also a problem. There are a few, like battery life for instance (which is worse than on a smartphone). But that's not the real problem.

The real problem is that actually wearing it makes you look weird – or at least different enough for people to notice. It doesn't look natural, so people will comment on it. They either know what it is and want to try it, or worse, want you to take it off, as if you're constantly filming them. Or people don't know what it is and think you look ridiculous. And you can't blame them, because you know you look ridiculous with the glasses on.

And if people ask what the benefits are and you tell them, they will tell you all of that is also possible with your smartphone. With the difference that you don't have to take it out of your pocket, but then again you don't have to wear those strange glasses all the time. And indeed, there's not much you can say against that. Because there are very few moments where you can say you couldn't possibly reach for your phone, in which case Google Glass really was beneficial to you.

And that's where Mike's comment makes a lot of sense. When it makes no real difference whether you use Google Glass or your smartphone for the same tasks, the smartphone is still the winner. But when you're a policeman, a flight attendant, a medic, … and you need your hands for other things, then Glass makes total sense. So it can't come as a surprise that the NYPD and Virgin Atlantic are testing Google Glass.

I'm not sure how the final Google Glass will go to market, nor when that will happen. But it still needs massive change before people will adopt it, because I don't think it is appealing enough to the masses as it is right now. Let alone the price tag of course: for USD 1,500 you can buy yourself some pretty sweet smartphones.

Don't get me wrong by the way, I'm still pretty happy to have one and I will keep testing the device for quite some time. It does help to get insights into where wearables might be going, and it still is pretty amazing if you're willing to look past the fact that you're wearing empty glasses with a battery pack on the side. Let's see what comes next.

Note – I wrote this on the plane about a week ago. Since then Google announced Android Wear, which makes a lot more sense to me right now than Glass does. Or maybe I should just wait and see what Ray-Ban is going to make of it.

The crying invoice

Did you know that 1 in 3 invoices in Belgium is paid late? That brought us to the idea for this campaign we created for ikki, a new service from USG People developed to support freelancers. From now on, invoices will never go unnoticed again: the crying invoice.

Hats off to my colleagues at Duval Guillaume who developed the idea.

When augmentation is about reducing (Pt. 2)

In February of this year I wrote a post about Kevin Slavin’s talk on Augmented Reality at PICNIC NY Salon. In that video he talked about something that made total sense to me… which to be honest is true for most of what Kevin says anyway :)

“His thoughts around augmented cities and why maybe ‘augmented’ should be about taking things away instead of just adding them to the world as we are already drowning in data as it is.”

So when a colleague sent me this video today about a research project on 'Diminished Reality' by Jan Herling and Wolfgang Broll of the Ilmenau University of Technology, it felt like proof of the concept Kevin talked about a year ago now. I don't like the name 'Diminished Reality', because it is still doing more on top of what is really there. But in this case less really is more, check it out:

Glass. How to make sharing more contextual.

Well, at least that's what I think it does. When I first read about this Firefox/Chrome plugin I thought it would be something similar to Weblin, a service I blogged about in 2007. Luckily it's not the same. For one, Weblin wasn't as cool and interesting as I first thought and is in the deadpool by now, and Glass has a different offering, so let's give that a try.

“Glass is a browser add-on that lets you share experiences and not just content. We’ve created a virtual sheet of Glass that lies over the entire internet that’s yours to affect. You can share your thoughts about anything on the web, right in the moment, by literally placing notes, (highlighting text, and even placing pictures and videos – to come soon) on top of any website and share those thoughts with only those you choose. We let you share the moment and thought together as an experience.”
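To make that "sheet of glass" idea a bit more concrete, here's a minimal sketch of how a browser add-on could draw a transparent note layer on top of any page, using nothing but standard DOM APIs. This is my own illustration of the concept, not Glass's actual code; the Alt+click interaction, the prompt, and the missing sharing backend are all made up.

```typescript
// Minimal sketch of an overlay "note layer" content script.
// Illustration only – not Glass's implementation; interactions are hypothetical.

type Note = { x: number; y: number; text: string; url: string };

// A transparent layer that sits on top of the page and holds the notes.
function createOverlay(): HTMLDivElement {
  const layer = document.createElement("div");
  layer.style.position = "fixed";
  layer.style.top = "0";
  layer.style.left = "0";
  layer.style.width = "100%";
  layer.style.height = "100%";
  layer.style.zIndex = "2147483647";   // stay above the page content
  layer.style.pointerEvents = "none";  // let the page underneath stay usable
  document.body.appendChild(layer);
  return layer;
}

// Render a single note at its recorded position on the overlay.
function renderNote(layer: HTMLDivElement, note: Note): void {
  const el = document.createElement("div");
  el.textContent = note.text;
  el.style.position = "absolute";
  el.style.left = `${note.x}px`;
  el.style.top = `${note.y}px`;
  el.style.background = "lightyellow";
  el.style.padding = "4px 8px";
  el.style.pointerEvents = "auto";
  layer.appendChild(el);
}

// Alt+click anywhere to leave a note tied to this page's URL.
const overlay = createOverlay();
document.addEventListener("click", (event) => {
  if (!event.altKey) return;
  const text = window.prompt("Leave a note on this page:");
  if (!text) return;
  const note: Note = { x: event.clientX, y: event.clientY, text, url: location.href };
  renderNote(overlay, note);
  // In a real add-on the note would also be sent to a sharing backend here.
});
```

The pointer-events trick is what lets the layer sit on top of the site without breaking it; the actual sharing with the friends you choose would happen server-side and is left out of this sketch.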

I'm actually not sure there's a benefit to sharing websites/comments the way Glass wants you to, but I'll have to get some friends on Glass first to figure that one out :) See it in action:

Glass is still in beta (invitation only), but I could still use the invitation code offered by The Next Web in their post about Glass, and so can you (code = thenextweb). If that doesn't work anymore, I have 5 invites left, so give me a shout if you need one of those.

That way we can both find out if this is a keeper or not.

Mindgoggling

This is how Urban Dictionary defines it: "(adj) something that is so baffling only goggles could understand". I suppose that is how you should think of Google Goggles, a mobile tool that lets you take a picture of something to get instant search results based on the content of the picture. Sounds cool, check this out.

It did remind me of a Microsoft project I read and blogged about 3 years ago, a side project of Photosynth at the time. They talked about a very similar tool, but I don't remember hearing anything about it since.

[Image: Photosynth mobile concept]

A question to ask the Photosynth guys maybe? Or Steve, maybe you know (or can find out)?

Sketch2Photo: do want.

This is a pretty awesome piece of software I must admit; too bad it's not publicly available yet, so you'll have to make do with the photos and videos instead. The software composes a realistic picture based on a simple freehand sketch annotated with text labels. Basically, the sketch below is supposed to result in the photo next to it (and a few alternatives), based on photos found on the internet. If that really works… you'll have to agree that's pretty cool.

[Image: teaser – annotated sketch and the resulting composite photo]

Here’s the full process:

[Image: overview of the Sketch2Photo process]
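Reading between the lines of that overview, the pipeline seems to boil down to: pick a background, search for a photo per labelled item, keep the best-fitting candidate, and blend it in at the sketched position. Here's a rough structural sketch of that loop; the types and helpers (searchPhotos, pickBestMatch, blend) are hypothetical stand-ins, not the authors' code.

```typescript
// Structural sketch of the Sketch2Photo pipeline as I read it.
// All helpers below are hypothetical placeholders, not the real implementation.

type SketchItem = { label: string; box: { x: number; y: number; w: number; h: number } };
type Photo = { url: string; score: number };

// Stand-in for an internet image search keyed on the item's text label.
async function searchPhotos(label: string): Promise<Photo[]> {
  return [{ url: `https://example.com/${label}.jpg`, score: 0.9 }]; // placeholder result
}

// Stand-in for filtering candidates by how well they fit the sketched shape.
function pickBestMatch(candidates: Photo[]): Photo {
  return candidates.reduce((best, p) => (p.score > best.score ? p : best));
}

// Stand-in for seamlessly compositing a cut-out item onto the background scene.
function blend(background: string, item: SketchItem, photo: Photo): string {
  return `${background} + ${photo.url} at (${item.box.x}, ${item.box.y})`;
}

// Compose the scene: one background plus one photo per labelled sketch item.
async function sketchToPhoto(backgroundLabel: string, items: SketchItem[]): Promise<string> {
  let scene = pickBestMatch(await searchPhotos(backgroundLabel)).url;
  for (const item of items) {
    const candidates = await searchPhotos(item.label);
    scene = blend(scene, item, pickBestMatch(candidates));
  }
  return scene;
}

// Example: a "beach" background with a "seagull" sketched in the top-left corner.
sketchToPhoto("beach", [{ label: "seagull", box: { x: 40, y: 30, w: 120, h: 80 } }])
  .then(console.log);
```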

You can see more examples as well as a video showing the process on the Sketch2Photo website.

[Via The Web Life]

Because clicking is so '90s

This trailer shows a website Andreas Lutz created as part of a study project, and it's rather inspirational. The website uses video and sound input to control the navigation, so basically you navigate using gestures and voice.

[Image: audiovisual navigation interface]
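As a side note, browsers expose the two ingredients this kind of interface needs: camera access for gestures and speech recognition for voice. Here's a minimal sketch of wiring those up with getUserMedia and the (prefixed) Web Speech API; it's an illustration under those assumptions, not Andreas' implementation, and the actual gesture analysis of the video frames is left out.

```typescript
// Minimal sketch of camera + voice input in a web page.
// Illustration only – assumes getUserMedia and Web Speech API support.

// 1. Camera input: stream the webcam into a <video> element. A real gesture
//    system would analyse these frames; that part is omitted here.
async function startCamera(): Promise<void> {
  const video = document.createElement("video");
  video.autoplay = true;
  document.body.appendChild(video);
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
}

// 2. Voice input: listen for simple spoken commands and map them to navigation.
function startVoiceCommands(): void {
  const SpeechRecognition =
    (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;
  if (!SpeechRecognition) return; // not supported in this browser

  const recognizer = new SpeechRecognition();
  recognizer.continuous = true;
  recognizer.onresult = (event: any) => {
    const phrase = event.results[event.results.length - 1][0].transcript.trim().toLowerCase();
    if (phrase.includes("back")) history.back();   // spoken "back" -> go back
    if (phrase.includes("home")) location.href = "/"; // spoken "home" -> homepage
  };
  recognizer.start();
}

startCamera().then(startVoiceCommands);
```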

Make sure you try it out for yourself on Andreas' website as well. I found the navigation still pretty hard to use and have to admit I still prefer clicking for now :) But you can see where this is going, and that's an opportunity I do want to think about. Very cool!

[Via fubiz]