
Epic’s new motion-capture animation tech has to be seen to be believed

Would you believe that creating this performance took only minutes of video processing and no human tweaking? (credit: Ninja Theory / Epic)

SAN FRANCISCO—Every year at the Game Developers Conference, a handful of competing companies show off their latest motion-capture technology, which transforms human performances into 3D animations that can be used on in-game models. Usually, these technical demonstrations involve a lot of specialized hardware for the performance capture and a good deal of computer processing and manual artist tweaking to get the resulting data into a game-ready state.

Epic's upcoming MetaHuman facial animation tool looks set to revolutionize that kind of labor- and time-intensive workflow. In an impressive demonstration at Wednesday's State of Unreal stage presentation, Epic showed off the new machine-learning-powered system, which needed just a few minutes to generate impressively real, uncanny-valley-leaping facial animation from a simple head-on video taken on an iPhone.

The potential to get quick, high-end results from that kind of basic input "has literally changed how [testers] work or the kind of work they can take on," Epic VP of Digital Humans Technology Vladimir Mastilovic said in a panel discussion Wednesday afternoon.




from Gaming & Culture – Ars Technica https://ift.tt/JqbhMdg
