Wednesday, December 17, 2008
"He was like..."
"he was like
she was all
he was all
they were like
we were all,
like oh my god
like totally
we were like
that was all
they were all
he was like
she was like
all totally
like oh my god"
If that wasn't educational enough for you, the following "anthropological introduction to YouTube" has a boring title, but it is an incredibly fascinating and entertaining discussion of the cultural and social phenomena within the depths of YouTube... and it's relevant to the video above. Like totally. (Warning: it's a 1-hour talk, but definitely one of the better uses of an hour in my life.)
Monday, December 15, 2008
Nice Pen-based Input Research
1. The Zlider - A pressure-sensitive slider widget that adds navigation and control capability to standard slider interactions. Academic research video below. Quick demo montage at the beginning, but the demo meat is at 3:07.
2. Using a Pen to Effortlessly Bridge Displays. Using a stylus, you can simply drag documents between computer screens or mobile devices. The pen motion also implicitly defines the orientation of the displays relative to one another. Academic video below. Demos at the beginning and more mobile screen scenarios at around 2:43.
3. Rolling the Pen as Input. Using an external tracker and a Wacom tablet, rotating the pen in your fingers can be used to control another parameter without moving the stylus. Academic video below; demo meat at 2:19.
You can check out more of his projects on his website.
Thursday, November 20, 2008
Some great Wiimote IR tracking projects
Two Wiimote Whiteboards to make a competitive relay race:
Great IR wands for the Wiimote whiteboard. I've been meaning to make these, but I haven't gotten to it yet.
Some nice two-handed, two-finger pinching systems:
Wiimote Wheelchair art. Unfortunately, no video but more information at this link.
Head tracking prototypes with Anime assets. The effect of the girl coming out of the screen (about halfway through the video) is very nicely done with the "haze" layer. His other videos are also worth checking out. I don't know what he does for a living, but he's good at it.
Wii Theremin gallantly created/performed by Ken Moore:
Finally, a video on "chicken head tracking". It doesn't use the Wii remote, but was posted as a response to my video and I love it!
Thursday, November 13, 2008
Scratch Input and Low-Cost Multi-spectral material sensor
The other project he presented was a simple, cheap multi-spectral sensor for recognizing various materials. It includes an IR LED, a UV LED, an RGB LED, a photoresistor, and a TSL230 TAOS optical sensor. With these, he reads the reflectivity under different illuminations to recognize 27 different materials with 86.9% accuracy, be it your jeans, your backpack, your desk at home, or your desk at work. This means coarse location awareness of mobile devices for cheap, some opportunities for more intelligent power management, and implicit security behaviors when placed on familiar or unfamiliar surfaces. Very nice work.
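To make the sensing loop concrete, here's a minimal sketch of how such a device might cycle its illumination sources and match the readings against stored material profiles. Everything here is illustrative: select_illumination(), read_reflectivity(), the channel count, and the nearest-neighbor matching are placeholder assumptions, not the actual firmware or classifier from his project.

#include <stdint.h>

#define NUM_CHANNELS   6    /* e.g., IR, UV, R, G, B, ambient (assumed) */
#define NUM_MATERIALS 27

/* Hypothetical hardware hooks supplied elsewhere in the firmware. */
void     select_illumination(int channel);   /* turn on one LED, others off */
uint16_t read_reflectivity(void);            /* read the light sensor       */

/* Reference profiles captured ahead of time, one per material. */
extern uint16_t profiles[NUM_MATERIALS][NUM_CHANNELS];

int classify_surface(void)
{
   uint16_t sample[NUM_CHANNELS];
   int ch, m, best = -1;
   long best_dist = 0;

   for (ch = 0; ch < NUM_CHANNELS; ch++) {
      select_illumination(ch);
      sample[ch] = read_reflectivity();
   }

   /* Nearest-neighbor match on squared Euclidean distance. */
   for (m = 0; m < NUM_MATERIALS; m++) {
      long dist = 0;
      for (ch = 0; ch < NUM_CHANNELS; ch++) {
         long d = (long)sample[ch] - (long)profiles[m][ch];
         dist += d * d;
      }
      if (best < 0 || dist < best_dist) {
         best = m;
         best_dist = dist;
      }
   }
   return best;   /* index of the closest known material */
}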
Friday, November 7, 2008
SurfaceWare - sensing glasses for Surface
If you aren't familiar with how Surface works, it is a rear-projected table that also has a bright IR emitter inside; it illuminates objects placed on the surface, which are then visible to an IR camera. The video does a good job explaining how the glasses work.
This is actually a revisit of an older project of Paul's called iGlassware. That one used passively powered RFID sensor tags in the base of the glass to capacitively measure the liquid level. The table had a big RFID antenna in it. Paul was also a key developer of Mitsubishi Electric Research Lab's Diamond Touch table being skillfully demonstrated by Ed Tse below.
Ed is currently at Smart Technologies, where he helped push out their new touch table:
Thursday, October 9, 2008
Andy Wilson
Thursday, September 4, 2008
Working with the PixArt camera directly
Here's the pinout thanks to kako and a PCB picture. The Reset pin is active low, so use a pull-up resistor to Vcc. The Wiimote runs the camera with a 25 MHz clock, but it also works with a 20 MHz clock, so you might get away with fudging this a bit. The I2C communication is 400 kHz fast mode, and the slave device address is 0xB0. Most microcontroller development platforms should include I2C communication capabilities. If yours doesn't, get a better dev kit =o). Desoldering the camera can be hard with so many pins, but careful use of a hot air gun will do the trick. The first part is to initialize the camera over I2C. Here's the pseudocode for initializing to maximum sensitivity (actual CCS C code in the comments):
- write(hex): B0 30 01
- wait 100ms
- write(hex): B0 00 00 00 00 00 00 00 90 //sensitivity part 1
- wait 100ms
- write (hex): B0 07 00 41 //sensitivity part 2
- wait 100ms
- write(hex): B0 1A 40 00 //sensitivity part 3
- wait 100ms
- write(hex): B0 33 03 //sets the mode
- wait 100ms
- write(hex): B0 30 08
- wait 100ms
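For reference, here's a rough sketch of the same init sequence written with the CCS C I2C routines (assuming hardware I2C on the PIC18F4550 with SDA on RB0 and SCL on RB1; adjust the pins to match your wiring). It just replays the byte sequences above; 0xB0 is the camera's write address.

#use i2c(MASTER, SDA=PIN_B0, SCL=PIN_B1, FAST=400000)

void pixart_write(int8 *msg, int8 len)
{
   int8 i;
   i2c_start();
   for (i = 0; i < len; i++)
      i2c_write(msg[i]);          // first byte is the 0xB0 slave write address
   i2c_stop();
}

void pixart_init(void)
{
   int8 m1[] = {0xB0, 0x30, 0x01};
   int8 m2[] = {0xB0, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x90};  // sensitivity part 1
   int8 m3[] = {0xB0, 0x07, 0x00, 0x41};                                // sensitivity part 2
   int8 m4[] = {0xB0, 0x1A, 0x40, 0x00};                                // sensitivity part 3
   int8 m5[] = {0xB0, 0x33, 0x03};                                      // sets the mode
   int8 m6[] = {0xB0, 0x30, 0x08};

   pixart_write(m1, 3);  delay_ms(100);
   pixart_write(m2, 9);  delay_ms(100);
   pixart_write(m3, 4);  delay_ms(100);
   pixart_write(m4, 4);  delay_ms(100);
   pixart_write(m5, 3);  delay_ms(100);
   pixart_write(m6, 3);  delay_ms(100);
}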
It's still somewhat mysterious to me what all these mean, but buried in this mess are the sensitivity and mode settings described at Wiibrew. The above code uses the sensitivity setting suggested by inio "00 00 00 00 00 00 90 00 41, 40 00" expressed in the 2nd, 3rd, and 4th messages. The wait times are conservatively long. After you initialize, you can read samples from it:
- write(hex): B0 37 //prepare for reading
- wait 25us
- write(hex): B1 //read request
- read 8 bytes
- wait 380us
- write(hex): B1 //read request
- read 4 bytes
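In the same CCS C style, one sample read might look like this sketch (buf needs room for 12 bytes; 0xB1 is the camera's read address):

void pixart_read_sample(int8 *buf)
{
   int8 i;

   i2c_start();
   i2c_write(0xB0);
   i2c_write(0x37);               // prepare for reading
   i2c_stop();
   delay_us(25);

   i2c_start();
   i2c_write(0xB1);               // read request
   for (i = 0; i < 8; i++)
      buf[i] = i2c_read(i < 7);   // ACK all but the last byte of the burst
   i2c_stop();
   delay_us(380);

   i2c_start();
   i2c_write(0xB1);               // read request
   for (i = 0; i < 4; i++)
      buf[8 + i] = i2c_read(i < 3);
   i2c_stop();
}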
This yields one sample from the camera containing 12 bytes, 3 for each of the 4 potential points. The format of the data will be the Extended Mode (X, Y, Y 2-msb, X 2-msb, Size 4-bits). The wait timings approximate what the Wiimote does. I've called this routine 1000 times per second without ill effect, though I doubt this is actually re-scanning the sensor; it's probably just reporting the contents of an internal buffer. People claim 200 Hz updates are possible, so you can use that as a guideline.
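Unpacking that Extended Mode sample is straightforward. The sketch below assumes the layout documented at Wiibrew: for each of the 4 objects, byte 0 is X[7:0], byte 1 is Y[7:0], and byte 2 packs Y[9:8], X[9:8], and the 4-bit size (a slot of all 0xFF bytes means no blob was seen).

#include <stdint.h>

typedef struct {
   uint16_t x;      /* 0..1023 */
   uint16_t y;      /* 0..767  */
   uint8_t  size;   /* 4-bit blob size */
} ir_point;

void decode_extended(const uint8_t *buf, ir_point pts[4])
{
   int i;
   for (i = 0; i < 4; i++) {
      const uint8_t *b = &buf[i * 3];
      pts[i].x    = b[0] | ((uint16_t)(b[2] & 0x30) << 4);   /* X[9:8] from bits 5:4 */
      pts[i].y    = b[1] | ((uint16_t)(b[2] & 0xC0) << 2);   /* Y[9:8] from bits 7:6 */
      pts[i].size = b[2] & 0x0F;
   }
}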
Hooking this up to your microcontroller is pretty straightforward. Give the camera 3.3V power using a voltage regulator, ground, a 20-25 MHz clock, connect the SDA and SCL lines (don't forget your pull-up resistors), and pull up the reset pin.
The CCS C Compiler for the PIC18F4550 includes USB-HID sample code. It's simply a matter of stuffing the data you got from the PixArt camera into the input report buffers for the USB. With this, you could actually create a USB mouse profile and make it control the cursor without any software or drivers at all. If you set it up as a full-speed device, it's possible to get 1 ms reports, providing extremely low-latency updates. CCS provides relatively affordable PIC programmers as well. Explaining how to set all this up is not within the scope of this post, but it should be plenty to get you started. If you want to make a PCB, you can try ExpressPCB, which can get you boards in hand for as low as $60.
Update 9/6/08: Just a note about the clock. Since my PIC was using a 20 MHz resonator, I just piggybacked the PixArt clock pin off the OSC2/CLKO pin of the PIC, which seemed to work fine. Also, Kako has more details (in Japanese) on doing this with an Arduino.
Monday, June 23, 2008
More Wiimote Projects - A Brain Dump
1. Throwable Displays using the Wii remote
This I actually built and demoed in my lab at CMU. But, it only existed for about two days before I had to break it down to move, and I didn't get a chance to document it. Several months ago, a patent filed by Philips made some of the tech news sites about throwable displays in games. But it was a concept patent pretty far from a working demo. However, it turns out it's pretty easy to implement using a projector, a wiimote, an IR emitter, and some of our trusty retro-reflective tape. It essentially combines the techniques from the finger tracking and the wiimote whiteboard projects. You put a little bit of reflective tape on each corner of a square piece of foam core, turn on the IR emitter so the Wiimote can see the four corners, align the camera tracking data with a projector using the 4-point calibration, and then the projector can display images perfectly aligned to the edges of a moving piece of foam core. The process of using a projector to augment the appearance of objects is called "Spatially Augmented Reality".
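The 4-point calibration is essentially estimating a planar homography (a projective warp) between the Wiimote's camera coordinates and the projector image. As a minimal sketch, assuming the 3x3 matrix H has already been computed during calibration, mapping each tracked corner into projector space looks like this; the image is then drawn into the resulting quadrilateral every frame.

/* Apply a 3x3 homography H (row-major) to a camera-space point,
 * producing projector-space coordinates. H itself comes from the
 * 4-point calibration step, which is assumed to be done elsewhere. */
void map_point(const double H[9], double cx, double cy,
               double *px, double *py)
{
   double x = H[0] * cx + H[1] * cy + H[2];
   double y = H[3] * cx + H[4] * cy + H[5];
   double w = H[6] * cx + H[7] * cy + H[8];
   *px = x / w;
   *py = y / w;
}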
Research colleagues of mine made a really fun demo where they tracked an air hockey puck from above and projected down on the air hockey table to display all sorts of visual effects that responded to the location/motion of the puck. They were demonstrating a fancy new type of high-speed tracking system. But, the Wiimote works quite well at 100 Hz. I wish I had documented the throwable display on video, because it worked quite well. You really could pick it up and throw it around, and the video image stays fairly locked onto the surface. There's a small latency primarily due to the 60 Hz refresh of the projector. I even made a rough demo of the air hockey table, but it was VERY rough - just drew a line tail behind the puck. Again, a little patch of reflective tape on the puck and an IR-ring-illuminated Wiimote above. However, the throwable display concept is actually a simpler implementation of a project I did earlier on "Foldable Displays" (tracked using a Wii remote) which I did make a video of, but not in tutorial format like my other Wii videos:
2. 3D tracking using two (or more) Wii remotes
Since the tracking in the Wiimote is done with a camera, if you have two cameras you can do a simple stereo vision triangulation to do full 3D motion capture for about $100. This was actually already done by some people at the University of Cambridge:
This is a textbook computer vision algorithm, but I haven't gotten around to making a C# implementation. Obviously, you can use more than two Wii remotes to increase tracking stability as well as occlusion tolerance. This would be a VERY useful and popular utility if anyone out there wants to make a nice software tool to transform multiple Wiimotes into a cheap mocap system.
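For anyone who wants to take a crack at it, here's a minimal sketch of just the triangulation step, assuming each Wiimote's position and orientation have already been calibrated into a shared world frame and each camera dot has been converted into a viewing ray using the camera's field of view. It uses the simple midpoint method: find the closest points on the two rays and average them.

typedef struct { double x, y, z; } vec3;

static double dot(vec3 a, vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }
static vec3 add(vec3 a, vec3 b)     { vec3 v = {a.x+b.x, a.y+b.y, a.z+b.z}; return v; }
static vec3 sub(vec3 a, vec3 b)     { vec3 v = {a.x-b.x, a.y-b.y, a.z-b.z}; return v; }
static vec3 scale(vec3 a, double s) { vec3 v = {a.x*s, a.y*s, a.z*s}; return v; }

/* p1/p2 are the two camera centers, d1/d2 the ray directions toward the
 * same IR dot. Returns the midpoint of the shortest segment between the
 * two rays, i.e. the estimated 3D position of the dot. */
vec3 triangulate(vec3 p1, vec3 d1, vec3 p2, vec3 d2)
{
   vec3 r = sub(p1, p2);
   double a = dot(d1, d1), b = dot(d1, d2), c = dot(d1, r);
   double e = dot(d2, d2), f = dot(d2, r);
   double denom = a * e - b * b;        /* near zero if the rays are parallel */
   double s = (b * f - c * e) / denom;
   double t = (a * f - b * c) / denom;

   vec3 q1 = add(p1, scale(d1, s));     /* closest point on ray 1 */
   vec3 q2 = add(p2, scale(d2, t));     /* closest point on ray 2 */
   return scale(add(q1, q2), 0.5);
}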
3. Universal Pointer using the Wii remote
The nice thing about the camera is that it can detect multiple points in different configurations. The four dots could be used to create a set of barcode-like or glyph-like identifiers above each screen in a multi-display environment. This would not only provide pointing functionality on each screen, but also provide screen ID which means you could interact with any cooperating computer simply by pointing at its screen. No fumbling for the mouse and keyboard, just walk around the room, or office building, or campus, and point at a screen. If all the computers were networked, you could carry files with your Wiimote virtually (using the controller ID) letting you copy/paste or otherwise manipulate documents across arbitrary screens regardless of what computer is driving the display or what input device is attached to the computer. You just carry your universal pointer that works on any screen, anywhere automatically. This makes a big infrastructure assumption, but it really alters the way one could interact with computational environments. The computers disappear and it becomes just a bunch of screens and your universal pointer.
Similarly, arbitrary objects could have unique IR identifiers. For example, if each lamp in your house had a uniquely shaped Wii sensor bar on it (and they were computer-controlled lamps, of course), you could turn on a specific lamp simply by pointing at it and pressing a button, or dim it by rotating the wiimote. If it were an RGB LED lamp, you could specify brightness, hue, and saturation with a quick gesture.
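Purely as an illustration of how four dots could encode an identity (this isn't from any existing implementation): if the four emitters sit along a line, the cross-ratio of their spacings is invariant under perspective, so the Wiimote recovers the same number no matter where it is pointing from. Different physical spacings then act as different IDs.

/* xs[] holds the four dot positions projected onto the line through them,
 * sorted in order. The cross-ratio (AC * BD) / (BC * AD) is preserved by
 * any perspective view of the same emitter strip, so it can be matched
 * (within a tolerance) against a table of known screens or lamps. */
double cross_ratio(const double xs[4])
{
   double ac = xs[2] - xs[0];
   double bd = xs[3] - xs[1];
   double bc = xs[2] - xs[1];
   double ad = xs[3] - xs[0];
   return (ac * bd) / (bc * ad);
}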
4. Laser Tag using Wii remotes
If you put IR LEDs on each of the Wii remotes, they can see each other. So, you can have a laser-tag-like interaction just using Wii remotes – no display, except perhaps if you wanted a big score board. You'd have to validate which Wii remote you were shooting at, which you could do using some kind of IR LED blink sequence for confirmation. Just wire up the IR LEDs to the LEDs built into the Wii remote, so you can computer-control their illumination.
5. IR tracking with ID using the Wii remote
This is more technical (and related to the above idea), but it addresses an important issue that I have yet to see done in either commercial or research systems. The problem with IR blob tracking using cameras is that you can't tell which blob is which. You could blink the LEDs to broadcast their IDs. But, this 1) would be slow, because the ID data rate is limited by the frame rate of the camera, and 2) really hurts your tracking rate/reliability, because you don't know where the dot is when the LED is off. Now, the Wii remote's camera chip gives 100 Hz updates, which might be tolerable for a small number of IDs. But, this approach doesn't really work well when you want fast tracking with lots of unique IDs. One solution is to attach a high-speed IR receiver to the side of the Wii remote for data transmission and simply use the camera for location tracking. IR receivers used in your TV probably support data rates of around 4000 bps - much higher than the 50 bps sampling limit you could squeeze out of the Wii remote. So, as the LEDs furiously blink their IDs at 4 kbps, they look like they are constantly on to the camera. This yields good tracking as well as many IDs. Now, when you have multiple LEDs transmitting simultaneously, you'll get packet collisions. So, some type of collision avoidance scheme would be needed, of which there are many to choose from. It will also be necessary to re-associate each data packet with a visible dot. So, not all the LEDs can be visible all the time. But, you only have to sacrifice a small number of camera frames to support a large number of IDs. You can also probably boost performance if you are willing to accept short-term probabilistic ID association.
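To make the re-association idea concrete, here's one entirely hypothetical scheme sketched in C: each LED goes dark for a single camera frame right after transmitting its ID packet, and the tracker tags whichever blob vanished during that frame with the decoded ID. A real design would need to handle collisions and occlusions more carefully.

#include <stdbool.h>

#define MAX_BLOBS 4

typedef struct {
   bool   visible_now;    /* seen in the current 100 Hz camera frame */
   bool   visible_prev;   /* seen in the previous camera frame       */
   int    id;             /* -1 until an ID has been associated      */
   double x, y;           /* last known camera position              */
} blob;

/* Called once per camera frame, after blob matching has filled in
 * visible_now/x/y. decoded_id is the ID just received by the IR
 * receiver, or -1 if no packet finished this frame. */
void associate_ids(blob blobs[MAX_BLOBS], int decoded_id)
{
   int i;
   if (decoded_id >= 0) {
      for (i = 0; i < MAX_BLOBS; i++) {
         if (blobs[i].visible_prev && !blobs[i].visible_now) {
            blobs[i].id = decoded_id;   /* the dot that blinked off just sent this packet */
            break;
         }
      }
   }
   for (i = 0; i < MAX_BLOBS; i++)
      blobs[i].visible_prev = blobs[i].visible_now;
}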
Thursday, March 27, 2008
WiimoteWhiteboard v0.2 - slightly updated/fixed
I also added a "Tracking Utilization" feedback to the GUI, which tells you how much of the camera you are utilizing for tracking. This gets updated after calibration. It provides a way to evaluate how good your Wiimote placement is, which directly impacts tracking quality. Getting this number to 100% is virtually impossible in any usable configuration, but dropping below 50% is a sign of not-so-great Wiimote placement.
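The formula isn't spelled out here, but one natural way to define such a utilization metric (shown purely as an illustration, not the actual implementation) is the area of the quadrilateral formed by the four calibration points relative to the camera's full 1024x768 coordinate space.

/* Illustrative utilization metric: fraction of the camera's coordinate
 * space covered by the calibration quadrilateral. x[]/y[] are the four
 * calibration points in camera coordinates, ordered around the quad. */
double tracking_utilization(const double x[4], const double y[4])
{
   double area = 0.0;
   int i;
   for (i = 0; i < 4; i++) {            /* shoelace formula */
      int j = (i + 1) % 4;
      area += x[i] * y[j] - x[j] * y[i];
   }
   area = 0.5 * (area < 0 ? -area : area);
   return area / (1024.0 * 768.0);      /* 1.0 = entire sensor used */
}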
I've also updated the code to use Brian Peek's WiimoteLib v1.2. I think this may help a little with Bluetooth/Vista compatibility. On a technical note, the WiimoteLib in this download is slightly modified to increase the Wiimote camera sensitivity.
You can use this link to download the new version: WiimoteWhiteboard v0.2
Saturday, March 22, 2008
Inspiring Students
Thursday, March 13, 2008
Tracking multiple laser pointers @ 200Hz using the Wii remote
So, if you are willing to get a little down and dirty with the hardware, you can pull out quite a bit more capability! Good job Sha!
YouTube Awards: Nominated for Best Instructional Video of 2007
Friday, February 22, 2008
EA's Boom Blox to include Wiimote Headtracking
I'm proud. If this pans out, it'll be only 5 months from the initial research prototype to integration into a major product release. Sweet! Happy to see my stuff being used. Humorously, there were 3 other demos on the GDC expo floor showing variations of my head tracking demo. =o)
Just in case you are wondering: No, I don't get any royalties or benefits for the use of this technique in games. Personally, I'm much happier impacting the state of technology on such a large scale in such a short period of time rather than struggling to transform it into personal financial gain. In terms of my original intent behind creating the head-tracking demo, it has already been a wild success beyond my highest expectations.
Joystiq article
Digg article
Wednesday, January 9, 2008
Official Wiimote Project Forum!
JD has honorably stepped up to the challenge. So,... announcing the official discussion forum for Wii remote projects!
http://www.wiimoteproject.com/
Everyone give JD a pat on the back, and try to use the forum for discussing Wii remote projects. I'll try to add links to this forum on the main project page.
Link to the WiimoteProject Forum