SIGGRAPH 2020: Talks and presentations - part 2

Sep 09, 2020 at 10:00 am by nemirc


Welcome to the last part of my SIGGRAPH 2020 coverage. There are still a few interesting things I wanted to talk to you about.

Reallusion

Reallusion was presenting Character Creator and SkinGen. In the presentation, they showed how Character Creator works and the wide variety of customizations available for your characters. Things like body shape, musculature, size, height, and age are completely customizable, and the models have very good, deformation-friendly topology.

A section of the presentation was devoted to SkinGen, and another to how Character Creator works with Unreal Engine. I will definitely take a look at Character Creator in the near future and see how it works in conjunction with Unreal Engine 4, as it looks like a very good alternative for creating high-quality human characters.

Baobab Studios

Baobab Studios was presenting their VR short “Baba Yaga.” The short is inspired by the Slavic folk tale of Baba Yaga, but they said they took some liberties with the story and adapted it for VR, making you, the user, part of the story. This is one of the instances where I think this event format falls short, since it would be much better to experience the story live, using a VR headset. Still, the idea of presenting a short film in VR is interesting, and, considering the earlier mentions of using VR in museums and other mass-audience venues, I think VR could be a great medium for presenting stories and books.

Felix & Paul Studios

Speaking of alternate ways of presenting books, Felix & Paul Studios were presenting their AR book developed using Unreal Engine, and I have to say it looks amazing. The concept goes like this: you have a set of AR glasses (like the HoloLens) and a physical book, but rather than containing written words, the book has AR markers.

As you turn the pages, the scenes are rendered by the HoloLens on top of the book, similar to pop-up books, but rendered and animated using AR. During the presentation they explained the process of selecting AR marker types, how the information on those markers is used to load content, how the scenes react to external stimuli like tilting the book, and so on.
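
To give a rough idea of the general “marker triggers content” mechanism they described, here is a minimal sketch. Their actual pipeline is Unreal Engine on HoloLens, which they did not show code for, so this uses Apple's ARKit image detection as a stand-in; the "BookPages" group name and the loadScene helper are illustrative placeholders, not anything from the presentation.

```swift
import ARKit

// Minimal sketch: recognize a printed page marker and use its identity
// to load the matching animated scene. Names marked as placeholders are
// assumptions, not part of Felix & Paul's actual implementation.
final class BookPageTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // Reference images (the "markers" printed on each page), bundled in an asset catalog.
        configuration.detectionImages =
            ARReferenceImage.referenceImages(inGroupNamed: "BookPages", bundle: nil) // placeholder group name
        session.delegate = self
        session.run(configuration)
    }

    // ARKit adds an anchor whenever it recognizes one of the reference images.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let imageAnchor as ARImageAnchor in anchors {
            guard let pageName = imageAnchor.referenceImage.name else { continue }
            // The marker's identity is the key used to pull in the scene for that page.
            loadScene(named: pageName, at: imageAnchor.transform)
        }
    }

    // Placeholder: a real app would spawn and animate the page's 3D scene here.
    func loadScene(named name: String, at transform: simd_float4x4) {
        print("Page \(name) detected; loading its scene")
    }
}
```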

If I had a criticism, it would be that this type of book could push actual books away even more. We now have audiobooks, so people have less incentive to actually read a book. When you add AR books to the mix, younger generations might have even less desire to pick up a book and read. I see this as a big problem, because I've met a lot of people who can't read quickly, have trouble understanding what they read, and, even worse, can't write correctly (even in the age of autocorrect).

Unity

Lastly, I saw a presentation from Unity, also related to AR, about how you can use Unity’s renderers and ARKit to create stunning visuals. AR is one of those things that started out very rudimentary but has gradually improved in render quality (and tracking quality, of course). Some things you can now do with Unity and AR are adding atmospheric fog to your AR output, detecting real-world lighting conditions so virtual objects are lit accurately, and tracking complex scene data for more accurate placement of your AR objects. On top of that, you can now use Unity’s Universal Render Pipeline (the renderer that aims to replace the built-in render pipeline) to create your AR applications, meaning you can reach a very high level of visual quality.
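
As a rough illustration of what “detecting lighting conditions” means in practice, here is a minimal native ARKit sketch that reads the per-frame light estimate; Unity’s AR tooling surfaces the same estimate to your scene lights. The class name here is just a placeholder, and this is the underlying ARKit capability rather than Unity’s own API.

```swift
import ARKit

// Minimal sketch, assuming a native ARKit session: read ARKit's per-frame
// estimate of the real-world lighting so virtual objects can be lit to match.
final class LightEstimateReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.isLightEstimationEnabled = true // ask ARKit to estimate real-world lighting
        session.delegate = self
        session.run(configuration)
    }

    // Called once per camera frame; read the current light estimate if one is available.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let estimate = frame.lightEstimate else { return }
        let intensity = estimate.ambientIntensity          // lumens, ~1000 in neutral lighting
        let temperature = estimate.ambientColorTemperature // Kelvin, ~6500 in neutral lighting
        // A renderer would use these values to match virtual lights to the real room.
        print("Ambient light: \(intensity) lm at \(temperature) K")
    }
}
```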

And that is all for this time. Next time I will focus on sharing my final impressions of SIGGRAPH 2020.





