Interview: FX Artist Jon Finger On Using Xsens Motion Capture Technology
Why don't we start with how you got involved with using Xsens and why you chose their motion-capture technology over others.
I've actually been checking in on Xsens since 2006. I've always had this idea that big action blockbusters could be achievable on a much more modest budget and production scale thanks to advances in technology, and Xsens seemed like an obvious step in that direction. No studio, extra cameras, infrared, or line of sight is required, so pretty much anyone anywhere can mo-cap. I recorded a space short, playing two astronauts by myself, just because I had a suit lying around from another shoot.
The other piece to this is that I love experimenting and testing limits. With the setup speed and recording mobility, I can quickly try out experiments and iterate on tests. It's a huge opportunity to remove barriers. You can put it on a parkour artist in a public place, record CG actors on location, or act out a VR scene in your living room.
I know what you mean about experimenting. This gives the user time to try things out and make mistakes, which I think is essential to the creative process. Tell me more about the space short you created.
Sure. I post some of my personal experiments to my YouTube channel. (See video at bottom of page.) The project was really on a whim; I had thought about doing something in space, but there was no script. It was pretty much a free-flow story, and then I filled in the rest when I acted out the second character. I actually didn't think I was going to show it to anyone. I was just curious how my faking a "magnetic boot walk" would look, and I wanted to try acting as two interacting characters. The bonus was I did test renders in 3D and YouTube 360.
Can you give a brief backgrounder on yourself and your history as a digital artist? Also, do you have a current picture?
I've been a video geek since the moment I made my sister disappear by stopping my dad's camcorder and starting it again with her gone. When I left for college, I decided to be a computer science major because I saw similar creative processes in game development and believed a stable job in that area was more likely. After an elective film class, I realized I could use some of the 3D modeling and animation I had been experimenting with for games in videos. I started winning awards at CMF and decided to switch to a communications major and focus on film production.
Once I left Sonoma State, I found that it was much easier to get editing or directing work if I offered visual effects as well. I started doing trailers for games and some tech companies, which slowly became almost exclusively motion graphics work. I used those jobs and some personal experiments to land a position as R-west's in-house post-production guy. It was a great opportunity to edit and do visual effects on spots for big companies like Intel, DeMarini, and Deep Silver.
After feeling a bit too disconnected from the creative process, I took a job at Viacom in hopes that entertainment would offer more creative mobility. When the GameTrailers department was sold in 2014, I started freelancing, focusing mostly on live action/VFX projects that I can grow and be creative on.
Last question: can you briefly describe your workflow for using Xsens to capture motion, how you use the data and incorporate it into your project?
I started out sending FBX files from MVN Studio to MotionBuilder, but I quickly switched to sending straight to Cinema 4D (my package of choice). I tend to use Retarget or Constraint tags so my character's bones reference the mocap bones. Retarget is faster, but I have a bit more flexibility with the Constraint tags if I want to alter how, or how much, the mocap data is referenced.
Thanks a lot for your time, Jon.
You are welcome.
Redshift (Xsens Mocap short film)
For more info on Xsens motion capture technology visit their main website. You can follow Jon's experiments on his YouTube channel. Be sure to watch his Redshift: Xsens Mocap Short Film below.