…or any character that you really want to!
At Keio University, a group led by Professor Yasue Mitsukura has recently developed a way for an ordinary PC with a USB webcam to detect the orientation and expression of a person’s face in real time and update the model on screen accordingly.
The group is hoping to bring this technology to market for CG hobbyists looking to animate their characters, and in the near future to turn it into a sort of avatar system.
According to early showcases, the system can reproduce expressions such as anger, happiness, sadness, laughter, and surprise using nothing more than a webcam. It tracks the eyelids, eyebrows, cheeks, and even the mouth movements of the user.
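To get a feel for how webcam tracking might turn into expression labels, here is a tiny sketch of the final classification step: given a few normalized face measurements (which a landmark tracker would produce each frame), pick a coarse expression. The measurements, thresholds, and labels are purely illustrative guesses on my part, not the Keio group’s actual method.

```python
def classify_expression(mouth_open, brow_raise, corner_lift):
    """Map normalized face measurements (each 0.0 to 1.0) to a coarse label.

    mouth_open  -- how wide the mouth is open
    brow_raise  -- how high the eyebrows are raised
    corner_lift -- how far the mouth corners are lifted (a smile)

    All thresholds are hypothetical, chosen only to illustrate the idea.
    """
    # Wide-open mouth plus raised brows reads as surprise.
    if mouth_open > 0.6 and brow_raise > 0.5:
        return "surprise"
    # Lifted mouth corners: a closed-mouth smile vs. open-mouth laughter.
    if corner_lift > 0.5:
        return "happiness" if mouth_open < 0.4 else "laughter"
    # Lowered brows and flat mouth corners: anger or sadness.
    if brow_raise < 0.2 and corner_lift < 0.2:
        return "anger" if mouth_open < 0.3 else "sadness"
    return "neutral"


# A real system would run this (or something far more sophisticated)
# on every webcam frame and drive the on-screen model with the result.
print(classify_expression(0.7, 0.6, 0.1))  # surprise
print(classify_expression(0.2, 0.4, 0.6))  # happiness
```

In practice the hard part is the tracking itself, not this lookup; the appeal of the Keio work is doing that tracking robustly with just a cheap webcam.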
The low cost (only a standard webcam is required) and simplicity of the method would let casual users and even companies get their hands on the tech and expand it to other uses.
Imagine an MMORPG in the near future that lets you use your own expressions on your avatar in the game! Isn’t that one step closer to the virtual reality world that everyone’s been dreaming of? Combine this with the Kinect’s motion sensing, and I think we’d pretty much have a primitive version of what people would call a virtual reality machine!