At the Game Developers Conference (GDC) 2023, Epic Games introduced a new tool called MetaHuman Animator in its State of Unreal keynote. With it, developers can animate their own MetaHumans using nothing more than an iPhone. It builds on MetaHuman Creator, Epic's software for creating realistic-looking, highly detailed characters that can then be used in games, movies and other media.
Until now, creating and animating such characters was a laborious, time-consuming process usually carried out by specialized teams. Epic Games' new tool is intended to remove these hurdles and allow individual developers to animate their own characters as well. At the same time, the user interface is designed to be simple and intuitive so that as many developers as possible can use it. Particularly interesting is that an iPhone is all that is needed to animate the characters. Overall, MetaHuman Animator could help revolutionize character creation and animation in the gaming and film industries and open the field to more people.
During the announcement, a video demonstrated how easily developers can use footage captured with an iPhone to animate characters. The clip shows the animation tool's impressive results: realistic movements, subtle facial expressions, eye tracking and high graphics quality. Importing the video footage into the tool and using it for animation takes remarkably little effort.
The characters' realistic movements suggest that the tool reproduces motion and expressions with considerable fidelity. The subtle facial expressions and eye tracking make the characters more lifelike and add a further layer of authenticity. Overall, the video shows how powerful and simple the animation tool is, letting developers achieve impressive results without investing a great deal of time and effort.
In an impressive technology showcase, Epic Games recently demonstrated what is possible with the latest tools by releasing a clip from Ninja Theory's upcoming game Senua's Saga: Hellblade II. In the clip, performance capture artist Melina Juergens, who portrays the game's main character, acts out a scene that is then turned into a striking animation. This demonstration was shot with expensive motion capture cameras in a specialized studio, and the final result is impressive.
Motion capture technology makes it possible to record human movements and expressions with great accuracy and realism. This is an important step in the evolution of video games and the film industry, as it allows real actors' performances to be integrated seamlessly into animated characters. With expensive cameras and specialized studios, motion capture artists can achieve even more precise results, bringing characters to life that look far more believable thanks to their human-like movements and expressions.
The footage was then transformed into a stunning animation that brings the characters to life, giving players an exceptionally realistic and immersive experience. Overall, this demonstration by Epic Games shows that motion capture is a key step in the development of both video games and film: it allows real actors to be blended seamlessly with animated characters, creating a far more believable experience for viewers and players. Expensive cameras and specialized studios are still necessary to achieve the best results, but the technology will undoubtedly continue to evolve.
The technology behind MetaHuman Animator allows even low-budget indie developers to animate their game characters by generating animation from footage shot with almost any camera. As Vladimir Mastilovic, VP of Digital Humans Technology at Epic, explained, the technology is driven by a machine-learning model that Epic trained on a "large, diverse, highly curated database" of facial images, which is what allows it to generate realistic animations.
MetaHuman Animator is built on machine learning, a process in which an algorithm learns from data in order to make decisions or perform tasks. In this case, the algorithm draws on a large set of facial images carefully selected to cover a wide range of facial shapes, structures and movements. By analyzing this data, MetaHuman Animator can track facial movements in real time and transfer them to the game character.
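Epic has not published how MetaHuman Animator works internally, but the general idea described above, a learned solver that turns video frames into facial animation data, can be sketched conceptually. The following Python sketch is purely illustrative: names such as solve_frame and RigControls are assumptions, and the "solver" is a stand-in for a trained model, not Epic's actual pipeline.

```python
# Purely illustrative sketch of a video-to-facial-animation pipeline.
# solve_frame(), RigControls and the dummy "solver" are assumptions for
# illustration; Epic has not published MetaHuman Animator's internals.
from dataclasses import dataclass
from typing import List
import numpy as np


@dataclass
class RigControls:
    """Per-frame facial rig control values (e.g. blendshape weights)."""
    weights: np.ndarray  # shape: (num_controls,), values in [0, 1]


def solve_frame(frame: np.ndarray, num_controls: int = 64) -> RigControls:
    """Stand-in for a learned solver trained on a curated facial database.

    A real solver would regress rig controls from the image; here we derive
    a dummy signal from the frame so the sketch stays runnable.
    """
    signal = float(frame.mean()) / 255.0  # placeholder "analysis" of the frame
    return RigControls(weights=np.full(num_controls, signal))


def animate(frames: List[np.ndarray]) -> List[RigControls]:
    """One solve per video frame; the resulting control curves could then be
    retargeted onto any character that shares the same rig definition."""
    return [solve_frame(f) for f in frames]


if __name__ == "__main__":
    # Three fake 640x480 RGB frames standing in for iPhone footage.
    fake_video = [np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
                  for _ in range(3)]
    curves = animate(fake_video)
    print(f"{len(curves)} frames solved, {curves[0].weights.size} controls per frame")
```

The point the sketch tries to capture is the separation of concerns: the footage only has to be good enough for the solver to read facial motion from it, and the resulting control curves are independent of any particular camera or studio setup.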
This allows developers to create realistic animations without hiring an animator or relying on expensive motion capture setups. The underlying dataset was most likely assembled through Epic Games' acquisition of several motion capture companies such as Cubic Motion, 3Lateral, and Hyprsense. Epic Games states on its MetaHuman website that all of the acquired companies have evolved their technology to now work in real time.
This means that the motion capture technology can now record motion in real time and translate it into digital characters instantly, without time-consuming post-processing. The improved technology holds enormous potential for creating realistic, detailed digital characters for the film, gaming and virtual reality industries. For developers interested in exploring MetaHuman Animator's capabilities, there is good news: the tool is set to launch in the summer, so interested users will soon be able to bring their creative visions and ideas to life with it.