Friday, February 21, 2014

Project Tango

In case you guys have been wondering what I've been working on.

It's nice to finally be able to share.

The future is awesome. g.co/ProjectTango

9 comments:

  1. Very nice! I've submitted a proposal!

  2. Hi Johnny,

    I'm seriously impressed by your work and enthusiasm regarding human-robot interaction.

    Regarding Project Tango:

    What do you think about the phone in combination with an FPV RC plane/car/truck, an Oculus Rift (or any other 3D glasses capable of high resolution), and a long-range transmitter?

    Would the phone be capable of "altering" the perceived reality in real time, and what can we expect regarding resolution?

    I think the phone's main task will be indexing indoor space to complete a virtual model of Earth, but I'm really excited about the virtual reality opportunities, like remote exploring while simultaneously turning it into a video game environment.
    It probably won't be as effective in metropolitan areas, since isolating your plane's frequency is a pain in the ass, but I could see myself driving out to the green outskirts, setting up my gear, and going mountain/cave exploring while sitting in my chair with the controller and 3D glasses.

    Is that too far-fetched, or something that could become real pretty soon?

    Kind regards

  3. Hello Johnny,

    GTango!

    Watching the video for this project, I was reminded of three similar human-machine interactions that, from my point of view, are fully compatible with it:

    1) Google Street View with live broadcast.

    2) Zebedee lidar: https://www.youtube.com/watch?v=DUEAz_naHHg

    3) SixthSense with @pranavmistry: https://www.youtube.com/watch?v=rWf4xS-28nU

    And I could imagine:

    A Jean-Michel Jarre concert where thousands of phones running a Tango app stream data to a cloud server, and a powerful data processor uses the resulting 3D map for projection mapping, generating artistic lighting over the crowd and the buildings, all in real time.

    Adding GTango to all city cabs, processing the resulting "big data", and generating maps for analyzing social dynamics and predicting social behavior.

    Using multiple synchronized photo/lidar/Tango cameras to register an event, and saving the map so we can navigate our recent history in 3D.

    I really see a pre- and post-GTango era.

    Congratulations.

  4. Dear JL,

    I've just submitted a proposal. I'm an HCII alum as well and worked closely with Scott. I'm currently developing tech-enabled therapy tools for children with autism. A lot of my work has focused on making interactions natural and strengthening the link between the virtual and real worlds (a big deficiency for kids with ASD).

    This device could be the missing piece, and I hope you'll consider my application. I love the idea of putting a depth camera inside a phone; it will be the next big shift in mobile and ubiquitous computing.

    Again, you are awesome. You're the reason I went to HCII and your work is a really big part of what keeps me motivated every day. Kudos to you and your team.

  5. I haven't seen a post from you in so long. I love the new Tango project. Too bad it's only for companies due to FCC restrictions :)

    Johnny, I have been following your projects since your post about the Wii grid project several years ago. I have actually just applied to CMU's HCI program, just like you :) I hope I can get a spot this year. I would really appreciate it if you could share some insights and a contact.

    Great work.

  6. Hello Mr. Lee!

    My name is Maitreyee Joshi; I am currently a senior at Lynbrook High School in San Jose. I recently read an article about Project Tango, which you are leading, and I was hoping to incorporate its indoor mapping functionality into an app for the blind that I am working on.

    The app is an indoor navigation system for the blind: it uses sound cues to let blind users virtually navigate through maps of indoor buildings.

    I'm currently in the initial development stages and tested a basic prototype of my app a couple of months ago at the Silicon Valley Blind Center. The results were encouraging: the blind person who tested it was able to describe the entire layout of a building after navigating through it twice in the app.

    I am now working on the app's automatic indoor mapping functionality using Wi-Fi signal-strength fingerprinting and depth sensors (a rough sketch of the fingerprinting idea is included at the end of this comment). The Project Tango devices do exactly the type of indoor mapping my app needs. Unfortunately, I can't apply for a device since I'm not a professional developer at a company, but it would be extremely helpful if you were available to discuss this project with me sometime.

    Thank you for all your help and I look forward to hearing back from you!

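    For anyone curious, here is a minimal sketch of the Wi-Fi fingerprinting idea, not my actual implementation: the access-point IDs, locations, and signal values below are made up for illustration. The phone records the signal strengths it sees at known spots ahead of time (the "radio map"), and a live scan is later matched against that map by nearest neighbor.

    ```python
    # Illustrative Wi-Fi RSSI fingerprinting sketch (all names and values are made up).
    # Offline phase: record the signal strengths (dBm) seen from nearby access
    # points (keyed by BSSID) at known indoor spots. Online phase: match a live
    # scan against that recorded radio map with a nearest-neighbor lookup.
    from math import sqrt

    RADIO_MAP = {  # hypothetical radio map: location -> {BSSID: RSSI in dBm}
        "lobby":    {"aa:bb:cc:01": -45, "aa:bb:cc:02": -70, "aa:bb:cc:03": -80},
        "hallway":  {"aa:bb:cc:01": -60, "aa:bb:cc:02": -55, "aa:bb:cc:03": -75},
        "room_101": {"aa:bb:cc:01": -75, "aa:bb:cc:02": -50, "aa:bb:cc:03": -60},
    }

    MISSING_RSSI = -100  # placeholder when an access point is not heard at all

    def distance(scan, fingerprint):
        """Euclidean distance between two BSSID -> RSSI dictionaries."""
        bssids = set(scan) | set(fingerprint)
        return sqrt(sum(
            (scan.get(b, MISSING_RSSI) - fingerprint.get(b, MISSING_RSSI)) ** 2
            for b in bssids
        ))

    def locate(scan):
        """Return the radio-map location whose fingerprint best matches the scan."""
        return min(RADIO_MAP, key=lambda loc: distance(scan, RADIO_MAP[loc]))

    live_scan = {"aa:bb:cc:01": -62, "aa:bb:cc:02": -53, "aa:bb:cc:03": -78}
    print(locate(live_scan))  # prints "hallway"
    ```

    In a real deployment the radio map would come from a survey of the building, and the depth-sensor data would refine the estimate; this only shows the matching step.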
  7. I love it. I make sand sculptures for a living and am slowly building a computer game involving robots that will be controlled through the web to pick up beach litter. I can't wait to get my hands on this. Thanks for building it :)
    www.dirtybeach.tv

  8. This comment has been removed by the author.
