Elvis Deane
Hi, I’m Elvis Deane, a seasoned digital event and video producer. I take pride in my ability to communicate clearly with clients and my proficiency with a wide range of production tools. My educational background in visual effects coordination, virtual production, and creative arts has provided me with a strong foundation in both traditional and cutting-edge industry techniques. After graduating in 1999 from one of Toronto’s first 3D college programs, I chose not to pursue a full-time career in animation. Instead, it became a hobby of mine alongside my work in video production, making me a long-time 3D enthusiast.
The character I selected for the 2024 Reallusion 3D Character Contest is one I’ve been developing since 2001. Drawing inspiration from my love for Samurai Jack and Star Wars, I created Griff as the protagonist of a series of silent short animated action films. His backstory casts him as a pilot on the run from a powerful, evil military. However, I soon realized that completing the entire project alone was too ambitious. Instead, I pivoted and created a comic that tells part of his story.
Character Design in Character Creator
Every few years, I revisit the character and model new versions based on the software I’m using at the time. In 2020, I tried a demo of Character Creator and created a version of Griff to test the program. Since then, the character has undergone three revisions, and I’ve finally settled on a more cartoony, stylized version for the 2024 Reallusion 3D Character Contest.
I didn’t use any reference material when stylizing Griff; instead, I drew from intuition. Given that he’s a tall and thin character, I wanted to exaggerate those traits in his face. I opted for realistic textures while keeping his features cartoonish to create a strong contrast, allowing me to apply realistic props and set pieces.
Versatility of AI-based mocap
As part of my setup for mocap, I make sure to have a few hours of uninterrupted time. I also mark boundaries at the edges of the camera’s frame so I don’t move off-screen and lose tracking data. This is vital because I use AI-based mocap with Move.ai, and processing happens only after recording: if there’s an issue, I won’t know until about half an hour later, when the data is finally processed and I can preview the motion.
By using an AI-based mocap solution, I’m not required to wear a marker suit. Instead, I typically wear something comfortable and flexible, while clipping on a lavalier microphone to capture high-quality audio. I prefer recording facial motion capture simultaneously with the body performance, as it enhances the realism of the final result.
In the past, I recorded body and facial capture separately, but this caused a loss of subtle details, particularly in eye and head movements. This project marked the first time I used the Rokoko Headrig, designed to hold a phone. Previously, I built my own facial capture rig using an airsoft helmet and GoPro mounts. While it worked, the helmet was heavy and shook during fast movements. In contrast, the Headrig is lightweight and pairs well with an iPhone for Live Face. I also like to mount a light on the helmet when possible, ensuring my face stays lit even if I turn away from the light source.
Facial Animation in iClone
Live Face is really a dream app. It captures a very accurate version of my performance, and the more recent addition of denoising is amazing for taking out any small jitters. Once I have the data inside iClone, I also like to use AccuLips with the audio recording of my performance to get a more accurate lip-sync. One of my favorite parts of AccuLips is the Talking Style Editor, which lets you break an audio clip into different sections and apply different styles of talking to each. For this animation, Griff is pretty loud, so I used the Bellowing setting for most of the monologue.
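Live Face’s denoising happens inside the app, but the idea behind it is worth illustrating: smooth each blendshape curve over a short window so frame-to-frame jitter disappears while the shape of the performance survives. Here’s a minimal Python sketch of that kind of moving-average filter; the jawOpen curve and frame count are hypothetical stand-ins, not data from the app.

```python
import numpy as np

def denoise_curve(weights: np.ndarray, window: int = 5) -> np.ndarray:
    """Smooth a per-frame blendshape curve with a centered moving average.

    A small window removes frame-to-frame jitter without flattening
    the broader shape of the performance. Note the ends of the curve
    are averaged against zero-padding, so trim or clamp them in practice.
    """
    kernel = np.ones(window) / window
    return np.convolve(weights, kernel, mode="same")

# Hypothetical example: a jittery 'jawOpen' curve over 120 frames
frames = np.linspace(0, 2 * np.pi, 120)
jaw_open = 0.5 + 0.4 * np.sin(frames) + np.random.normal(0.0, 0.03, 120)
jaw_open_smooth = denoise_curve(jaw_open, window=5)
```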
For the body mocap, I used MoveOne, an iPhone app that records a video and sends it to the cloud, where the movement is processed into an FBX file. Even though it only sees the movement from one angle, it does a very good job of capturing my overall motion.
Having the mocap as a basis for my animation is very helpful. I like to take that performance and exaggerate certain things as needed; sometimes the eyebrows and eyes need bigger movements to emphasize a point in the dialogue.
Character Animation in iClone
The thing I love about iClone is how many tools there are to adjust and tweak a motion capture performance. I did two full takes of the monologue but preferred the start of the second take and the end of the first. It was very easy to line them up in iClone and blend them into one seamless performance. I also really appreciate the curve editor: as good as MoveOne is, it often leaves small jitters in the motion capture data that can make the character look shaky, and the curve editor makes smoothing them out painless. I also like to work in separate motion layers so that I can add new animation to the character without affecting what I tried before.
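iClone handles the take blending interactively, but under the hood this kind of blend is essentially a crossfade: over a short window, each joint’s rotation is interpolated from one take toward the other. Here’s a rough Python sketch of that idea for a single joint, using SciPy’s spherical interpolation; the quaternion arrays are stand-ins, not iClone data.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def crossfade_takes(take_a: np.ndarray, take_b: np.ndarray,
                    blend_frames: int) -> np.ndarray:
    """Crossfade one joint's rotations from take A into take B.

    take_a, take_b: (N, 4) arrays of xyzw quaternions, one row per frame.
    The last `blend_frames` frames of A are slerped toward the first
    `blend_frames` frames of B, then B continues on its own.
    """
    blended = [take_a[:-blend_frames]]
    for i in range(blend_frames):
        t = i / max(blend_frames - 1, 1)  # blend weight: 0 -> 1
        pair = Rotation.from_quat([take_a[-blend_frames + i], take_b[i]])
        blended.append(Slerp([0.0, 1.0], pair)([t]).as_quat())
    blended.append(take_b[blend_frames:])
    return np.vstack(blended)
```

The same crossfade idea applies to the root position, typically with a plain linear blend instead of a slerp.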
Final Render and Composition
For the final renders, I chose to use Unreal Engine, starting with LiveLink to transfer my characters from iClone. Sticking to my preferred method, I built a rough set using boxes and cylinders to get a general sense of camera movement. Once I was satisfied with the layout, I recorded a take of the LiveLink from iClone and began refining the set before focusing on lighting.
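I do the blockout by hand in the editor, but the same rough-set idea can be scripted when you need placeholders quickly. This is only a sketch using Unreal’s Python editor scripting, assuming the Python Editor Script Plugin is enabled; the shape paths are standard engine assets, and the locations and scales are arbitrary.

```python
import unreal

def spawn_blockout(shape_path: str, location: unreal.Vector,
                   scale: unreal.Vector) -> unreal.StaticMeshActor:
    """Spawn a basic engine shape as a stand-in set piece."""
    actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
        unreal.StaticMeshActor, location)
    mesh = unreal.EditorAssetLibrary.load_asset(shape_path)
    actor.static_mesh_component.set_static_mesh(mesh)
    actor.set_actor_scale3d(scale)
    return actor

# A rough wall and a column, enough to frame a camera move
spawn_blockout("/Engine/BasicShapes/Cube",
               unreal.Vector(0, 0, 150), unreal.Vector(10.0, 0.2, 3.0))
spawn_blockout("/Engine/BasicShapes/Cylinder",
               unreal.Vector(400, 300, 150), unreal.Vector(1.0, 1.0, 3.0))
```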
It usually takes me a day or two to finalize the look and composition, but for my contest entry, I managed it in about a day since I started the animation late and was running out of time. I’ve learned to delete all my lighting and start fresh if something doesn’t feel right—sometimes a blank slate is more effective than trying to fix what’s not working.
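That blank-slate reset can even be scripted: loop over the level’s actors and delete anything that’s a light. A small sketch with the same editor Python API, again assuming the scripting plugin is enabled:

```python
import unreal

# Remove every light from the open level so lighting can be rebuilt
# from scratch; geometry and characters are left untouched.
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if isinstance(actor, (unreal.Light, unreal.SkyLight)):
        unreal.EditorLevelLibrary.destroy_actor(actor)
```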
Benefits of iClone
Though I almost talked myself out of participating, I’m incredibly thankful I entered the Reallusion contest. It provided a challenge and a deadline that pushed me to complete the animation, forcing me to think deeply about new directions for the character that I hadn’t explored before. Griff had always worn a jacket, but for fun, I decided to dress him in a hockey jersey instead. His character had always been a bit serious, and this contest gave me the opportunity to experiment with a different look. I’m hoping this fresh energy will translate into the film I’m currently working on.
Art contests are a great way to push yourself out of your comfort zone and spark fresh ideas. For years, I avoided creating 3D work because the technical demands of building and rigging multiple characters were so time-consuming, sapping my energy for animation. Since I started using iClone, I’ve taken on many more creative projects. I can use motion capture for a base performance, but iClone also offers excellent tools if I prefer to animate by hand.
iClone is especially beneficial for indie or solo creators. It simplifies the technical aspects of 3D that I’ve always struggled with, allowing me to focus entirely on animation. I no longer have to fight with rigging or clothing, and there are plenty of tools for scaling up a project, adding crowds, or creating complex interactions in scenes. Another advantage is its integration with other software. The new CC Control Rig in Unreal Engine lets you make adjustments after exporting, adding elements like squash and stretch to enhance your animation.
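As a quick aside, squash and stretch boils down to a volume-preserving scale: if you lengthen one axis, shrink the other two so the overall volume stays roughly constant. A tiny illustrative snippet of that formula (not the CC Control Rig’s actual implementation):

```python
def squash_stretch(stretch: float) -> tuple[float, float, float]:
    """Volume-preserving squash and stretch.

    Scaling one axis by `stretch` scales the other two by
    1 / sqrt(stretch), so x * y * z stays at 1.0.
    """
    side = 1.0 / (stretch ** 0.5)
    return (side, side, stretch)

print(squash_stretch(1.5))  # stretched: taller and thinner
print(squash_stretch(0.7))  # squashed: shorter and wider
```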
Closing thoughts
Participating in this contest not only reignited my passion for my character, Griff, but also opened my eyes to the possibilities of blending creativity with cutting-edge tools. It’s been an invaluable experience, pushing me to step outside my usual workflow and explore new ways of bringing my characters to life. The process of combining motion capture, hand animation, and powerful software like iClone has made a significant difference in how I approach animation. For anyone working solo or on smaller teams, iClone’s user-friendly features are a game-changer, enabling more focus on the creative aspects without getting bogged down by technical challenges. The journey continues, and I’m excited to see where this hybrid approach takes me in future projects.