The world’s most tagged photograph?

I missed this one earlier on. Orange has tried to set a world record for getting the most people tagged in a single photo, using a view from the Pyramid stage at the Glastonbury festival.

[image: glastonbury]

“The pic itself is a 1.3 gigapixel, 75,000 pixel-wide image compiled from 36 photos that took one minute to capture. They used two Hasselblad H4D-50 cameras with 50 megapixel digital backs and, camera geeks, a 150mm lens on top and 100mm lens tilt shift adapter. Both cameras were mounted vertically on a tripod and rotated at 10 degree increments to take the pictures.”
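A quick back-of-the-envelope check of the numbers in that quote: 36 frames from two vertically stacked cameras rotated in 10-degree increments, producing a 1.3-gigapixel image that is 75,000 pixels wide. The frame count, rotation step, and pixel figures come from the quote; the derived coverage and image height below are my own rough estimates.

```python
# Figures quoted in the post
frames = 36
cameras = 2
step_deg = 10
total_pixels = 1.3e9   # 1.3 gigapixels
width = 75_000         # pixels

# Derived (my estimates, not from the source)
positions = frames // cameras          # rotation stops per camera
field_of_view = positions * step_deg   # horizontal coverage in degrees
height = round(total_pixels / width)   # implied image height in pixels

print(f"{positions} stops covering ~{field_of_view} degrees")
print(f"implied height: ~{height} pixels")
```

That works out to roughly a 180-degree sweep and an image around 17,000 pixels tall, which fits a view out from the stage over the crowd.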

8,195 people are tagged as we speak; has it been confirmed yet that this is a record?

Mindgoggling

This is how the Urban Dictionary defines this: “(adj) something that is so baffling only goggles could understand”. I suppose that is how they came up with Google Goggles, a mobile tool that lets you take a picture of something and get instant search results based on the content of the picture. Sounds cool, check this out.

It did remind me of a Microsoft project I read and blogged about three years ago, a side project of Photosynth at the time. They talked about a very similar tool, but I don’t remember hearing about it since.

[image: photosynthmobile]

Question to ask the Photosynth guys maybe? Or Steve, maybe you know (can find out)?

Sketch2Photo: do want.

This is a pretty awesome piece of software, I must admit; too bad it’s not publicly available yet, so you’ll have to make do with the photos and videos instead. The software composes a realistic picture from a simple freehand sketch annotated with text labels. Basically, the sketch below is supposed to result in the photo next to it (and a few alternatives), based on photos found on the internet. If that really works… you’ll have to agree that that is pretty cool.

[image: teaser]

Here’s the full process:

[image: overview]
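Very loosely, the process described above boils down to: for each labelled outline in the sketch, search the internet for matching photos, filter the candidates by how well they fit the sketched shape, and blend the best matches onto a background. Here is a hypothetical, heavily simplified sketch of that flow; none of these function names come from the actual Sketch2Photo software, and the real system’s matching and blending are far more sophisticated than these stand-ins.

```python
def search_candidates(label):
    """Stand-in for an internet image search keyed on the text label."""
    return [f"{label}_photo_{i}.jpg" for i in range(3)]

def matches_sketch(photo, shape):
    """Stand-in for filtering candidates by how well their content
    fits the sketched outline; here it simply accepts everything."""
    return True

def compose(background, items):
    """Stand-in for blending the selected items onto the background."""
    return {"background": background, "items": items}

# A sketch is a set of labelled, roughly positioned outlines
# (hypothetical example data).
sketch = [
    {"label": "seagull", "shape": "outline-1", "position": (120, 40)},
    {"label": "sailboat", "shape": "outline-2", "position": (300, 200)},
]

selected = []
for item in sketch:
    candidates = [p for p in search_candidates(item["label"])
                  if matches_sketch(p, item["shape"])]
    selected.append((candidates[0], item["position"]))  # best match per item

result = compose("beach_background.jpg", selected)
print(result)
```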

You can see more examples as well as a video showing the process on the Sketch2Photo website.

[Via The Web Life]

Photo tourism: stunning!

Remember Photosynth? About two years ago Microsoft Live Labs released the tech preview of Photosynth, and it hasn’t lost any of its coolness since. It was presented at TED Talks by Blaise Aguera y Arcas (together with SeaDragon) and is one of the most watched videos in the entire TED video collection (which in itself is pretty awesome already). Watch that first if you haven’t done so already.

Long Zheng at istartedsomething.com reports on a new 3D photo viewer created by Microsoft Research and the University of Washington:

“The collaborative research team from the University of Washington and Microsoft Research who only two years ago in 2006 published their paper “Photo Tourism” and their technology demonstration “Photosynth” have again pushed the boundaries of what can be achieved by intuitively processing the abundance of digital images shared on the web. This week at SIGGRAPH 2008 they’re sharing with the world some even better technology they’ve been working on which they call “Finding Paths through the World’s Photos”. Don’t let the name fool you, it’s damn cool. If you’re not much of a reading person like me, take a look at this video demonstration.”

Here’s the video, pretty stunning indeed: