While certainly not the technology-focused topic I usually post about, I definitely wasted a few minutes trying to sing the following chorus in the true spirit of procrastination. Try to sing along (if you can). If you are linguistically incapable, just reading along is amusing enough.
"he was like
she was all
he was all
they were like
we were all,
like oh my god
like totally
we were like
that was all
they were all
he was like
she was like
all totally
like oh my god"
If this was not educational enough for you, the following "anthropological introduction to YouTube" has a boring title, but is an incredibly fascinating and entertaining discussion of the cultural and social phenomena within the depths of YouTube... and relevant to the video above. Like totally. (warning: 1 hour talk, but definitely one of the better uses of 1 hour in my life).
Wednesday, December 17, 2008
"He was like..."
Posted by Johnny Chung Lee at 10:35 AM 23 comments
Monday, December 15, 2008
Nice Pen-based Input Research
One of the things I enjoy using this blog for is sharing cool projects from Human-Computer Interaction (HCI) research. This post highlights projects by Gonzalo Ramos (or "Gonzo" for short) and his co-authors. He has worked on several projects demonstrating how much better pen input software could be. These are just a few I like.
1. The Zlider - A pressure-sensitive slider widget that adds additional navigation and control capability to standard slider interactions. Academic research video below. Quick demo montage at the beginning, but the demo meat is at 3:07.
2. Using a Pen to Effortlessly Bridge Displays - Using a stylus, you can simply drag documents between computer screens or mobile devices. The pen motion also implicitly defines the orientation of the displays relative to one another. Academic video below. Demos at the beginning and more mobile screen scenarios at around 2:43.
3. Rolling the Pen as Input - Using an external tracker and a Wacom tablet, rotating the pen in your fingers can be used to control another parameter without moving the stylus. Academic video below, demo meat at 2:19.
You can check out more of his projects on his website.
Posted by Johnny Chung Lee at 5:06 PM 40 comments
Thursday, November 20, 2008
Some great Wiimote IR tracking projects
I've decided to collect some of my favorite projects I've seen people do with my Wiimote projects, derivatives of them, or things distantly inspired by them (by the creators' own admission). It's surprising, and flattering, to see how many people seem happy to credit me. Thanks all! The list gets more "unusual" the further you go down.
Two Wiimote Whiteboards used to make a competitive relay race:
Great IR wands for the Wiimote whiteboard. I've been meaning to make these, but I haven't gotten to it yet.
Some nice two handed, two finger pinching systems:
Wiimote Wheelchair art. Unfortunately, there's no video, but more information is available at this link.
Head tracking prototypes with Anime assets. The effect of the girl coming out of the screen (about halfway through the video) is very nicely done with the "haze" layer. His other videos are also worth checking out. I don't know what he does for a living, but he's good at it.
Wii Theremin gallantly created/performed by Ken Moore:
Finally, a video on "chicken head tracking". It doesn't use the Wii remote, but was posted as a response to my video and I love it!
Posted by Johnny Chung Lee at 12:28 PM 72 comments
Thursday, November 13, 2008
Scratch Input and Low-Cost Multi-spectral material sensor
Chris Harrison, a PhD student in my old program at CMU, presented a couple of projects at UIST 2008 that I really, really like. The first is his "Scratch Input" device. The basic idea is that if you place a sensitive microphone on the bottom of a mobile device, any large, hard surface you put it down on can be used as an input gesture surface. A variety of gestures can be distinctly and reliably detected with some simple machine learning. The (academic) video below includes a nice demo where he turns his entire wall into an MP3 player controller:
The other project he presented was a simple, cheap multi-spectral sensor for recognizing various materials. It includes an IR LED, a UV LED, an RGB LED, a photoresistor, and a TAOS TSL230 optical sensor. With these, it measures the reflectivity of the surface under the different illuminations to recognize 27 different materials with 86.9% accuracy, whether that's your jeans, your backpack, your desk at home, or your desk at work. This means coarse location awareness for mobile devices on the cheap, some opportunities for more intelligent power management, and implicit security behaviors when placed on familiar or unfamiliar surfaces. Very nice work.
Posted by Johnny Chung Lee at 7:28 PM 53 comments
Friday, November 7, 2008
SurfaceWare - sensing glasses for Surface
My colleague Paul Dietz, in the Applied Sciences group, released a video of one of the first projects he did when he joined Microsoft. These glasses use the transparent material of the glass itself as a prism, sensing the amount of liquid inside by watching how much IR light is internally reflected. Check out the video:
If you aren't familiar with how Surface works, it is a rear-projected table that also has a bright IR emitter inside that illuminates objects placed on the surface, which are then visible to an IR camera. The video does a good job explaining how the glasses work.
This is actually a revisit of an older project of Paul's called iGlassware. That one used passively powered RFID sensor tags in the base of the glass to capacitively measure the liquid level; the table had a big RFID antenna in it. Paul was also a key developer of Mitsubishi Electric Research Lab's DiamondTouch table, skillfully demonstrated by Ed Tse below.
Ed is currently at SMART Technologies, where he helped push out their new touch table:
Posted by Johnny Chung Lee at 3:21 AM 34 comments
Thursday, October 9, 2008
Andy Wilson
I was re-watching some videos of work done by one of my colleagues, Andy Wilson, and I don't think his work gets as much attention as it deserves given how amazing it is. If you think my stuff is cool, you should bow down to his greatness... or at least watch these videos.
Posted by Johnny Chung Lee at 5:17 PM 80 comments
Thursday, September 4, 2008
Working with the PixArt camera directly
The past few months have been a pretty whirlwind. Lots of things have happened, almost none of which were procrastineering related, which is why I haven't posted anything here. But one of the things I have poked at in the past few weeks was creating a PixArt-to-USB-HID device, which allows the camera from the Wiimote to appear as a relatively easy-to-access USB device. This addresses several problems with using the Wiimote, such as running off batteries for extended periods and flaky platform-specific Bluetooth drivers. It's also possible to read from the PixArt cam at over 100Hz if you read directly via I2C, as well as track visible dots once you remove the IR filter. Of course, none of this was discovered by me. All credit belongs to the numerous individuals who have contributed their knowledge to the various Wiimote hacking websites. Normally, this project wouldn't be worth a post, but all the information on how to do this is pretty scattered and difficult to follow. So, I figured I would contribute by trying to make this all a bit clearer.
Here's the pinout thanks to kako, and a PCB picture. The Reset pin is active low, so use a pull-up resistor to Vcc. The Wiimote runs the camera with a 25MHz clock, but it also works with a 20MHz clock, so you might get away with fudging this a bit. The I2C communication is fast-mode 400kHz and the slave device address is 0xB0. Most microcontroller development platforms should include I2C communication capabilities. If yours doesn't, get a better dev kit =o). Desoldering the camera can be hard with so many pins, but careful use of a hot air gun will do the trick. The first part is to initialize the camera over I2C. Here's the pseudo code for initializing to maximum sensitivity (actual CCS C code in comments, and a rough C sketch follows the list below):
- write(hex): B0 30 01
- wait 100ms
- write(hex): B0 00 00 00 00 00 00 00 90 //sensitivity part 1
- wait 100ms
- write (hex): B0 07 00 41 //sensitivity part 2
- wait 100ms
- write(hex): B0 1A 40 00 //sensitivity part 3
- wait 100ms
- write(hex): B0 33 03 //sets the mode
- wait 100ms
- write(hex): B0 30 08
- wait 100ms
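To make that concrete, here's roughly what the sequence above looks like in CCS C. This is just a sketch using the compiler's built-in i2c_start()/i2c_write()/i2c_stop() and delay_ms() routines; the pixart_write() helper name is my own shorthand, and it assumes you've already configured the bus with a #use i2c(master, ...) directive.

// Send one raw I2C message; the first byte is the 0xB0 slave write address.
void pixart_write(int8 *data, int8 len)
{
   int8 i;
   i2c_start();
   for (i = 0; i < len; i++)
      i2c_write(data[i]);
   i2c_stop();
}

// The init sequence from the list above, with 100ms waits between messages.
void pixart_init(void)
{
   int8 m1[] = {0xB0, 0x30, 0x01};
   int8 m2[] = {0xB0, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x90}; // sensitivity part 1
   int8 m3[] = {0xB0, 0x07, 0x00, 0x41};                               // sensitivity part 2
   int8 m4[] = {0xB0, 0x1A, 0x40, 0x00};                               // sensitivity part 3
   int8 m5[] = {0xB0, 0x33, 0x03};                                     // sets the mode
   int8 m6[] = {0xB0, 0x30, 0x08};

   pixart_write(m1, sizeof(m1)); delay_ms(100);
   pixart_write(m2, sizeof(m2)); delay_ms(100);
   pixart_write(m3, sizeof(m3)); delay_ms(100);
   pixart_write(m4, sizeof(m4)); delay_ms(100);
   pixart_write(m5, sizeof(m5)); delay_ms(100);
   pixart_write(m6, sizeof(m6)); delay_ms(100);
}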
It's still somewhat mysterious to me what all these mean, but buried in this mess are the sensitivity and mode settings described at Wiibrew. The above code uses the sensitivity setting suggested by inio, "00 00 00 00 00 00 90 00 41, 40 00", expressed in the 2nd, 3rd, and 4th messages. The wait times are conservatively long. After you initialize, you can read samples from it:
- write(hex): B0 37 //prepare for reading
- wait 25us
- write(hex): B1 //read request
- read 8 bytes
- wait 380us
- write(hex): B1 //read request
- read 4 bytes
This yields one sample from the camera containing 12 bytes, 3 for each of the 4 potential points. The format of the data is the Extended Mode format (X, Y, Y 2-msb, X 2-msb, Size 4-bits). The wait timings approximate what the Wiimote does. I've called this routine 1000 times per second without ill effect, though I doubt this is actually scanning the sensor and suspect it is just reporting the contents of an internal buffer. People claim 200Hz updates are possible, so you can use that as a guideline.
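For reference, here's a small sketch of how the 3 bytes for each point unpack into a 10-bit X, a 10-bit Y, and a 4-bit size, assuming the Extended Mode bit layout described above (the function name and the CCS-style int8/int16 types are just my choices):

// Unpack one 3-byte Extended Mode blob into X (0-1023), Y (0-767), and size.
void unpack_blob(int8 *b, int16 *x, int16 *y, int8 *size)
{
   *x    = b[0] | ((int16)(b[2] & 0x30) << 4);   // bits 5-4 of byte 2 are X[9:8]
   *y    = b[1] | ((int16)(b[2] & 0xC0) << 2);   // bits 7-6 of byte 2 are Y[9:8]
   *size = b[2] & 0x0F;                          // low 4 bits are the blob size
}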
Hooking this up to your microcontroller is pretty straightforward. Give the camera 3.3V power from a voltage regulator, connect ground, supply a 20-25MHz clock, connect the SDA and SCL lines (don't forget your pull-up resistors), and pull up the reset pin.
The CCS C Compiler for the PIC18F4550 includes USB-HID sample code. It's simply a matter of stuffing the data you got from the PixArt camera into the input report buffers for the USB. With this, you could actually create a USB mouse profile and make it control the cursor without any software or drivers at all. If you set it up as a full-speed device, it's possible to get 1ms reports, providing extremely low latency updates. CCS provides relatively affordable PIC programmers as well. Explaining how to set all this up is not within the scope of this post, but it should be plenty to get you started. If you want to make a PCB, you can try ExpressPCB, which can get you boards in-hand for as low as $60.
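The main loop ends up looking something like the sketch below. It assumes the helper routines from CCS's own USB HID example code (usb_init_cs(), usb_task(), usb_enumerated(), usb_put_packet()) and a report descriptor sized to carry the 12 camera bytes; pixart_read() is a hypothetical helper that performs the read sequence above.

// Rough main loop: read a 12-byte camera sample and ship it out as a HID report.
void main(void)
{
   int8 report[12];

   pixart_init();                  // init sequence from earlier in this post
   usb_init_cs();                  // start the USB stack without blocking
   while (TRUE) {
      usb_task();                  // keep the USB state machine serviced
      if (usb_enumerated()) {
         pixart_read(report);      // hypothetical helper: the read sequence above
         usb_put_packet(1, report, sizeof(report), USB_DTS_TOGGLE);
      }
   }
}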
Update 9/6/08: Just a note about the clock. Since my PIC was using a 20MHz resonator, I just piggybacked the PixArt clock pin off the OSC2/CLKO pin of the PIC, which seemed to work fine. Also, kako has more details (in Japanese) on doing this with an Arduino.
Posted by Johnny Chung Lee at 5:51 PM 50 comments
Monday, June 23, 2008
More Wiimote Projects - A Brain Dump
It’s been a while since I’ve posted anything. That’s largely because I’ve been traveling a lot, giving talks, and most recently relocating to a new city. It became clear to me a while ago that I wasn’t going to get around to making more videos anytime soon. So, I figured I would make a post about the projects that I would probably make videos of if I had more free time. The content of this post has been in the talks that I’ve been giving, but I’m just sitting down to write it out now for my trusty blog readers.
1. Throwable Displays using the Wii remote
This I actually built and demoed in my lab at CMU. But it only existed for about two days before I had to break it down to move, and I didn't get a chance to document it. Several months ago, a patent filed by Philips made some of the tech news sites about throwable displays in games. But it was a concept patent pretty far from a working demo. However, it turns out it's pretty easy to implement using a projector, a Wiimote, an IR emitter, and some of our trusty retro-reflective tape. It essentially combines the techniques from the finger tracking and the Wiimote Whiteboard projects. You put a little bit of reflective tape on each corner of a square piece of foam core, turn on the IR emitter so the Wiimote can see the four corners, align the camera tracking data with a projector using the 4-point calibration, and then the projector can display images perfectly aligned to the edges of a moving piece of foam core. The process of using a projector to augment the appearance of objects is called "Spatially Augmented Reality".
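For the curious, the 4-point calibration boils down to fitting a 3x3 homography that maps the four corner points seen by the camera onto the four known projector corners, then warping every tracked point through it. Here's a rough, self-contained C sketch of that standard textbook approach; the structure and names are mine, not the actual Wiimote Whiteboard source.

#include <math.h>

/* Solve the 8x8 system A*h = b (b stored in column 8) by Gaussian elimination
   with partial pivoting. h receives [h00 h01 h02 h10 h11 h12 h20 h21]; h22 = 1. */
static void solve8(double A[8][9], double h[8])
{
    int i, j, k;
    for (i = 0; i < 8; i++) {
        int p = i;                                   /* find the pivot row */
        for (j = i + 1; j < 8; j++)
            if (fabs(A[j][i]) > fabs(A[p][i])) p = j;
        for (k = 0; k < 9; k++) { double t = A[i][k]; A[i][k] = A[p][k]; A[p][k] = t; }
        for (j = i + 1; j < 8; j++) {                /* eliminate below the pivot */
            double f = A[j][i] / A[i][i];
            for (k = i; k < 9; k++) A[j][k] -= f * A[i][k];
        }
    }
    for (i = 7; i >= 0; i--) {                       /* back substitution */
        double s = A[i][8];
        for (k = i + 1; k < 8; k++) s -= A[i][k] * h[k];
        h[i] = s / A[i][i];
    }
}

/* Fit the homography from 4 camera points (u,v) to 4 projector points (x,y). */
void fit_homography(const double u[4], const double v[4],
                    const double x[4], const double y[4], double h[8])
{
    double A[8][9] = {{0}};
    int i;
    for (i = 0; i < 4; i++) {
        double *rx = A[2*i], *ry = A[2*i + 1];
        rx[0] = u[i]; rx[1] = v[i]; rx[2] = 1;       /* x-equation for point i */
        rx[6] = -u[i]*x[i]; rx[7] = -v[i]*x[i]; rx[8] = x[i];
        ry[3] = u[i]; ry[4] = v[i]; ry[5] = 1;       /* y-equation for point i */
        ry[6] = -u[i]*y[i]; ry[7] = -v[i]*y[i]; ry[8] = y[i];
    }
    solve8(A, h);
}

/* Map any tracked camera point (u,v) into projector coordinates. */
void warp_point(const double h[8], double u, double v, double *x, double *y)
{
    double w = h[6]*u + h[7]*v + 1.0;
    *x = (h[0]*u + h[1]*v + h[2]) / w;
    *y = (h[3]*u + h[4]*v + h[5]) / w;
}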
Research colleagues of mine made a really fun demo where they tracked an air hockey puck from above and projected down onto the air hockey table to display all sorts of visual effects that responded to the location/motion of the puck. They were demonstrating a fancy new type of high-speed tracking system, but the Wiimote works quite well at 100Hz. I wish I had documented the throwable display on video, because it worked quite well. You really could pick it up and throw it around, and the video image stays fairly locked onto the surface. There's a small amount of latency, primarily due to the 60Hz refresh of the projector. I even made a rough demo of the air hockey table, but it was VERY rough - it just drew a line tail behind the puck. Again, just a little patch of reflective tape on the puck and an IR-ring-illuminated Wiimote above. However, the throwable display concept is actually a simpler implementation of a project I did earlier on "Foldable Displays" (tracked using a Wii remote), which I did make a video of, but not in tutorial format like my other Wii videos:
2. 3D tracking using two (or more) Wii remotes
Since the tracking in the Wiimote is done with a camera, if you have two cameras you can do a simple stereo vision triangulation to do full 3D motion capture for about $100. This was actually already done by some people at the University of Cambridge:
This is a textbook computer vision algorithm, but I haven't gotten around to making a C# implementation. Obviously, you can use more than two Wii remotes to increase tracking stability as well as occlusion tolerance. This would be a VERY useful and popular utility if anyone out there wants to make a nice software tool that transforms multiple Wiimotes into a cheap mocap system.
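To give a flavor of the core math, here's a rough C sketch of the midpoint triangulation step: given each camera's position and the ray through the tracked dot (which you get from the pixel coordinates once the cameras are calibrated), find the point halfway between the two rays where they pass closest to each other. Calibration and the pixel-to-ray conversion are left out, and the names are my own.

typedef struct { double x, y, z; } vec3;

static vec3 v_sub(vec3 a, vec3 b)     { vec3 r = {a.x-b.x, a.y-b.y, a.z-b.z}; return r; }
static vec3 v_add(vec3 a, vec3 b)     { vec3 r = {a.x+b.x, a.y+b.y, a.z+b.z}; return r; }
static vec3 v_scale(vec3 a, double s) { vec3 r = {a.x*s, a.y*s, a.z*s}; return r; }
static double v_dot(vec3 a, vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Closest-point (midpoint) triangulation of two rays p1 + t*d1 and p2 + s*d2. */
vec3 triangulate(vec3 p1, vec3 d1, vec3 p2, vec3 d2)
{
    vec3 r = v_sub(p1, p2);
    double a = v_dot(d1, d1), b = v_dot(d1, d2), c = v_dot(d2, d2);
    double d = v_dot(d1, r),  e = v_dot(d2, r);
    double denom = a*c - b*b;                 /* near zero if the rays are parallel */
    double t = (b*e - c*d) / denom;           /* parameter along ray 1 */
    double s = (a*e - b*d) / denom;           /* parameter along ray 2 */
    vec3 q1 = v_add(p1, v_scale(d1, t));      /* closest point on ray 1 */
    vec3 q2 = v_add(p2, v_scale(d2, s));      /* closest point on ray 2 */
    return v_scale(v_add(q1, q2), 0.5);       /* midpoint is the 3D estimate */
}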
3. Universal Pointer using the Wii remote
The nice thing about the camera is that it can detect multiple points in different configurations. The four dots could be used to create a set of barcode-like or glyph-like identifiers above each screen in a multi-display environment. This would not only provide pointing functionality on each screen, but also provide screen ID which means you could interact with any cooperating computer simply by pointing at its screen. No fumbling for the mouse and keyboard, just walk around the room, or office building, or campus, and point at a screen. If all the computers were networked, you could carry files with your Wiimote virtually (using the controller ID) letting you copy/paste or otherwise manipulate documents across arbitrary screens regardless of what computer is driving the display or what input device is attached to the computer. You just carry your universal pointer that works on any screen, anywhere automatically. This makes a big infrastructure assumption, but it really alters the way one could interact with computational environments. The computers disappear and it becomes just a bunch of screens and your universal pointer.
Similarly, arbitrary objects could have unique IR identifiers. For example, if each lamp in your house had a uniquely shaped Wii sensor bar on it (and they were computer-controlled lamps, of course), you could turn on a specific lamp simply by pointing at it and pressing a button, or dim it by rotating the Wiimote. If it was an RGB LED lamp, you could specify brightness, hue, and saturation with a quick gesture.
4. Laser Tag using Wii remotes
If you put IR LEDs on each of the Wii remotes, they can see each other. So, you can have a laser-tag-like interaction just using Wii remotes - no display, except perhaps if you wanted a big score board. You'd have to validate which Wii remote you were shooting at, which you could do using some kind of IR LED blink sequence for confirmation. Just wire up the IR LEDs to the LEDs built into the Wii remote so you can computer-control their illumination.
5. IR tracking with ID using the Wii remote
This is more technical (and related to the above idea), but it addresses an important issue that I have yet to see solved in either commercial or research systems. The problem with IR blob tracking using cameras is that you can't tell which blob is which. You could blink the LEDs to broadcast their IDs. But this 1) would be slow, because the ID data rate is limited by the frame rate of the camera, and 2) really hurts your tracking rate/reliability, because you don't know where the dot is when the LED is off. Now, the Wii remote's camera chip gives a 100Hz update, which might be tolerable for a small number of IDs. But this approach doesn't really work well when you want fast tracking with lots of unique IDs. One solution is to attach a high-speed IR receiver to the side of the Wii remote for data transmission and simply use the camera for location tracking. The IR receivers used in your TV probably support data rates of around 4000 bps - much higher than the 50 bps sampling limit you could squeeze out of the Wii remote's camera. So, as the LEDs furiously blink their IDs at 4Kbps, they look like they are constantly on to the camera. This yields good tracking as well as many IDs. Now, when you have multiple LEDs transmitting simultaneously, you'll get packet collisions. So, some type of collision avoidance scheme would be needed, of which there are many to choose from. It will also be necessary to re-associate each data packet with a visible dot, so not all the LEDs can be visible all the time. But you only have to sacrifice a small number of camera frames to support a large number of IDs. You can also probably boost performance if you are willing to accept short-term probabilistic ID association.
Posted by Johnny Chung Lee at 12:01 AM 90 comments
Thursday, March 27, 2008
WiimoteWhiteboard v0.2 - slightly updated/fixed
Hi, all. I got a few moments yesterday to make a couple of small improvements to the Wiimote Whiteboard software. Most notably, I improved the mouse emulation code. There were problems where it wouldn't work with some programs like PowerPoint, Alias Sketchbook, etc. So, those work fine now.
I also added a "Tracking Utilization" readout to the GUI, which tells you how much of the camera's field of view you are utilizing for tracking. This gets updated after calibration. It provides a way to evaluate how good your Wiimote placement is, which directly impacts tracking quality. Getting this number to 100% is virtually impossible in any usable configuration, but dropping below 50% is a sign of not-so-great Wiimote placement.
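For the curious, a back-of-the-envelope way to compute a number like this is the area of the calibration quadrilateral (in camera pixels, via the shoelace formula) divided by the full 1024x768 sensor area. The C sketch below is just an illustration of that idea, not necessarily the exact metric the software reports.

#include <math.h>

/* Fraction of the 1024x768 camera image covered by the calibration quad.
   Corners must be passed in order around the quadrilateral. */
double tracking_utilization(const double cx[4], const double cy[4])
{
    double area = 0.0;
    int i;
    for (i = 0; i < 4; i++) {
        int j = (i + 1) % 4;                     /* next corner, wrapping around */
        area += cx[i] * cy[j] - cx[j] * cy[i];   /* shoelace accumulation */
    }
    area = fabs(area) / 2.0;                     /* quad area in pixels^2 */
    return area / (1024.0 * 768.0);              /* fraction of the sensor used */
}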
I've also updated the code to use Brian Peek's WiimoteLib v1.2. I think this may help a little with Bluetooth/Vista compatibility. On a technical note, the WiimoteLib in this download is slightly modified to increase the Wiimote camera sensitivity.
You can use this link to download the new version: WiimoteWhiteboard v0.2
Posted by Johnny Chung Lee at 3:11 PM 91 comments
Saturday, March 22, 2008
Inspiring Students
One of the great, unexpected, and perhaps most influential aspects of creating these videos has been how many people they have inspired and how much innovative spirit they have sparked. I've gotten hundreds of emails from young students expressing this enthusiasm. But perhaps one of the best testimonials is this news article about kids in the Clara Byrd Baker Elementary School's Lego Club in Williamsburg, VA. The students there, led by Kofi Merritt, are getting excited about innovating with technology by creating their own electronic whiteboards.
Posted by Johnny Chung Lee at 10:02 AM 10 comments
Thursday, March 13, 2008
Tracking multiple laser pointers @ 200Hz using the Wii remote
A couple of videos (vid1 and vid2) from sha433 demonstrate that if you are willing to crack open your Wii remote (you'll need a tri-wing screwdriver) and take out the IR filter, the camera will track visible points of light, including laser pointers. That's pretty neat. Additionally, a few people have recently discovered that if you read the data directly off the camera chip using I2C, you can get 200Hz tracking (vs. the 100Hz you get over Bluetooth). It looks like there might be a latency hit, though, depending on how you get the data into your computer.
So, if you are willing to get a little down and dirty with the hardware, you can pull out quite a bit more capability! Good job Sha!
Posted by Johnny Chung Lee at 8:08 PM 29 comments
YouTube Awards: Nominated for Best Instructional Video of 2007
Hey all, (shameless plug) it looks like my video on head-tracking was nominated for the YouTube Award for Best Instructional Video of 2007! Very neat. =o) Please go to the site and cast your vote! Thanks!
Posted by Johnny Chung Lee at 3:55 PM 19 comments
Friday, February 22, 2008
EA's Boom Blox to include Wiimote Headtracking
Louis Castle announced yesterday at GDC 2008 that EA's upcoming title "Boom Blox" will ship with an easter egg that allows head-tracking using the Wii remote! Very cool. Here's the Digg article someone created for it - which you should all digg to help drum up some attention and demand for this feature. Reward the developers who decided this was worth including and send a signal to EA and the greater game development community that this is a desired step forward in the evolution of gameplay technology. The expected release will be in May.
I'm proud. If this pans out, it'll be only 5 months from the initial research prototype to integration into a major product release. Sweet! Happy to see my stuff being used. Humorously, there were 3 other demos on the GDC expo floor showing variations of my head tracking demo. =o)
Just in case you are wondering: No, I don't get any royalties or benefits for the use of this technique in games. Personally, I'm much happier impacting the state of technology on such a large scale in such a short period of time rather than struggling to transform it into personal financial gain. In terms of my original intent behind creating the head-tracking demo, it has already been a wild success beyond my highest expectations.
Joystiq article
Digg article
Posted by Johnny Chung Lee at 1:44 AM 92 comments
Wednesday, January 9, 2008
Official Wiimote Project Forum!
The number of comments on my blog posts has gotten rather unwieldy, making them difficult to read and thus not terribly useful. It has become clear that a full-blown discussion forum is needed to manage the threads of conversation. I know there is already a lot of chatter scattered all around the web about these Wiimote projects, so hopefully this isn't too late to centralize some of that discussion.
JD has honorably stepped up to the challenge. So,... announcing the official discussion forum for Wii remote projects!
http://www.wiimoteproject.com/
Everyone give JD a pat on the back, and try to use the forum for discussing Wii remote projects. I'll try to add links to this forum on the main project page.
Link to the WiimoteProject Forum
Posted by Johnny Chung Lee at 1:53 PM 45 comments