
Animating Textures & Avatars, Exploring Augmented Reality

by Alex Davis (SL/OS: Alex09 Danitz)

Recently, I have been working on a cardiac arrest simulation for the Point-of-Care Center for Emerging Neurotechnologies. My focus has been on creating and animating the textures for the patient monitor in the simulation. The model for the patient monitor was fairly easy to create using reference images of monitors found through Google; however, animating the textures to sync up with the simulation programming has been a little trickier. We're currently using this script to rotate the textures in a way that simulates the kind of movement you would see on a real patient monitor. I also found video tutorials through Google helpful when learning how to do this.
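For anyone curious about the technique, the heavy lifting in a script like that is done by LSL's built-in llSetTextureAnim call. Below is a minimal sketch, not our production script: the face number and rate are placeholder values. It scrolls a waveform texture smoothly across one face of a prim so it reads like a live monitor trace (swapping SMOOTH for ROTATE would spin the texture instead).

// Minimal sketch: scroll a waveform texture across face 1
// so it reads like a live patient-monitor trace.
// Face number and rate are placeholders, not our production values.
default
{
    state_entry()
    {
        // ANIM_ON | SMOOTH | LOOP slides the texture continuously
        // instead of stepping through discrete animation frames.
        llSetTextureAnim(ANIM_ON | SMOOTH | LOOP,
                         1,      // face with the waveform texture
                         1, 1,   // treat the texture as a single frame
                         0.0,    // start offset
                         1.0,    // length: the full texture width
                         0.25);  // rate: texture widths per second
    }
}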

In addition to general graphic design, I use Poser to create custom animations needed across a variety of projects in Second Life and OpenSim: walking and running cycles, sitting, talking, flying, and so on. I developed some instructions and a tutorial for training co-workers, which might also be useful for others who want to use Poser to create animations in virtual worlds.

Since Second Life only lets you truly animate avatars, getting movement out of prims is rather tricky. I have just begun to explore Puppeteer, which will allow me to move objects in order to get some simple animations going.
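In the meantime, plain LSL can fake some simple prim movement on its own, either by spinning a prim client-side with llTargetOmega or by nudging it along with a timer. Here is a quick sketch of the timer approach, with the step size and interval made up purely for illustration.

// Sketch: nudge a prim back and forth with a timer.
// Step size and interval are made-up illustration values.
vector offset = <0.0, 0.0, 0.1>; // move 10 cm per step
integer steps;

default
{
    state_entry()
    {
        llSetTimerEvent(0.5); // fire every half second
    }

    timer()
    {
        llSetPos(llGetLocalPos() + offset);
        // Reverse direction every 10 steps for a ping-pong motion.
        if (++steps % 10 == 0)
            offset = -offset;
    }
}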

Finally, in my spare time I've also been looking into Augmented Reality. I haven't gotten into any coding yet, since I'm still researching how to go about creating it. Recently I found a program called SSTT (Simplified Spatial Target Tracking), which works in conjunction with Unity3D. There is also a way to do it using Flash and Papervision. Here's an example I found using a demo from Craig Kapp's blog.

Creating OpenSim Avatars

by Ayan Daniels (SL/OS: Ayan Deluxe)

At UCSIM we are made up of people with great talents, covering everything from programming to three-dimensional modeling. Personally, I am a graphic design student worker and a Graphic Communication major, so on any given day I could be in Photoshop creating faces for an avatar or sketching out ideas for branding. The work here is fast-paced and fun, which makes for an awesome environment.

Something that I am working on right now is the creation of the default characters for a new platform we are offering called OpenSim. The process began like almost every project I have over at DAAP: sketching. I dedicated one page to each character idea I had. In total this came to about 15, give or take a few mistakes! Some of the ones I'm focusing on now are in the picture below.

After plenty of sketching, a review with the team, and refinement, it is time to put them into OpenSim. The desired look is achieved by making flat textures in Photoshop (using the sketches as inspiration) and uploading them into OpenSim. Some of these textures include denim to create jeans, flat body textures to create skin, and hair strands for a realistic head of tresses. After all of this is done, plenty of tweaking still needs to happen before the avatars are ready to use.

The work I’m doing with avatar creation will help new users on the University of Cincinnati OpenSim Campus. When students, professors, and interested parties log into the campus they will have a variety of options for their avatars. These avatars range in ethnicity, age, gender, weight, and style. Therefore, after completion, anyone will be able to find an avatar that looks like them, makes them laugh, or opens their imagination.

Kinect Research

by Brad Cruse (SL/OS: Bartleby Zeritonga)

Finding new ways to create simulations is always fun. Lately I've been working with the Microsoft Kinect and Unity 3D. The Kinect is a depth-sensing IR camera that, paired with skeletal-tracking software, can pick up joint position, depth, and rotation. Unity 3D is a game engine with a broad community of developers. Anyone developing simulations or making animations should look into this technology as a means of inexpensive motion capture.

The Kinect can be used in a number of ways, but for right now I have been concentrating on two. The first area of concentration is using the Kinect to develop prefabricated animation files for simulations. One of the best software products out there for this is Brekel. You can download a free version at www.Brekel.com. Brekel can export .bvh files with joints that match the default Second Life skeleton and the 3DS Max skeleton, so implementation is a breeze. It also comes with a MotionBuilder plug-in and can be used to capture and edit directly in MotionBuilder.
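If you have never looked inside one, a .bvh file is just plain text: a joint hierarchy up top and per-frame channel data below. Here is a heavily trimmed illustration with only two joints and made-up offsets, using the Poser-style joint names (hip, abdomen, and so on) that the Second Life uploader expects.

HIERARCHY
ROOT hip
{
    OFFSET 0.00 43.00 0.00
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    JOINT abdomen
    {
        OFFSET 0.00 3.50 0.00
        CHANNELS 3 Zrotation Xrotation Yrotation
        End Site
        {
            OFFSET 0.00 3.50 0.00
        }
    }
}
MOTION
Frames: 2
Frame Time: 0.033333
0.00 43.00 0.00 0.0 0.0 0.0 0.0 0.0 0.0
0.00 43.10 0.00 0.0 0.0 0.0 0.0 5.0 0.0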

The second aspect of Kinect research I have been getting into is using the Kinect in real time, mapping the joints to an avatar in Unity. Right now there are a few different wrappers being developed to use the Kinect with Unity, each with its own good and bad features. The three wrappers are OpenNI's Kinect wrapper, Microsoft's Kinect SDK Unity plug-in, and a wrapper developed by Amir at Tinker Heavy Studios.

As of now, deciding on which wrapper to use is a wash; each has its burdens. Amir's wrapper does not work 100% of the time, but it can do joint rotations and depth at 30 fps. OpenNI's wrapper only runs at 17 fps, but it may be possible to hack it to use only what is needed rather than streaming everything the Kinect gathers. The Microsoft plug-in would be the best option; however, installing everything it needs is more complicated than jumping into the programming itself.

FAAST (the Flexible Action and Articulated Skeleton Toolkit) is another keybinding tool developed for the Kinect. FAAST maps body gestures to keystrokes, turning gestures into an interfacing technique, and we recently made a brief demo using it with Second Life – check it out!
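Since FAAST works by emulating keystrokes, nothing special is needed on the Second Life side: the viewer treats a recognized gesture exactly like a pressed movement key, and a worn script can watch for it with a control event. Here is a minimal sketch, assuming the demo maps a gesture to the forward key; the chat message is just for illustration.

// Sketch: react to the forward key (which FAAST can press
// for you when it recognizes a gesture). Worn as an attachment.
default
{
    state_entry()
    {
        llRequestPermissions(llGetOwner(), PERMISSION_TAKE_CONTROLS);
    }

    run_time_permissions(integer perms)
    {
        if (perms & PERMISSION_TAKE_CONTROLS)
            // accept = TRUE, pass_on = TRUE so the avatar still moves
            llTakeControls(CONTROL_FWD, TRUE, TRUE);
    }

    control(key id, integer level, integer edge)
    {
        // edge & level set together: the key was just pressed
        if ((edge & level) & CONTROL_FWD)
            llOwnerSay("Forward gesture detected!");
    }
}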