Sonic’s very fast redesign

When the redesign [of the Sonic the Hedgehog movie] was commissioned, artist Tyson Hesse, who had worked on previous Sonic the Hedgehog media, was brought on to lead it. “But a lot of the work that the team had already done as far as the look of Sonic’s fur, should he have eyelashes and what they look like…how we should handle the eyes…all that kind of stuff was actually directly translatable over to the new design,” comments Wright. “I think the redesign took only seven or eight weeks, which is a record at MPC for the design of a 3D character.”

The process was helped by Hesse’s existing strong relationship with Sega. The team flew to London, “where we all sat with the guy that was concept sculpting and literally just did it in real-time.” Wright felt that the clear remit for Hesse to take the redesign lead allowed everything to fall into place very simply. “Actually, funnily enough, the redesign was pretty painless. In some respects, a more exaggerated character is a simpler thing for the team to execute,” he comments.

https://www.fxguide.com/fxfeatured/sonics-very-fast-redesign/

Real-time, in-camera background compositing in The Mandalorian

2020-02-06 Jay Holben

For decades, green- and bluescreen compositing was the go-to solution for bringing fantastic environments and actors together on the screen. . . . However, when characters are wearing highly reflective costumes, as is the case with Mando (Pedro Pascal), the title character of The Mandalorian, the reflection of green- and bluescreen in the wardrobe causes costly problems in post-production. . . .

The solution was what might be described as the heir to rear projection — a dynamic, real-time, photo-real background played back on a massive LED video wall and ceiling, which not only provided the pixel-accurate representation of exotic background content, but was also rendered with correct camera positional data.

“The Mandalorian: This Is the Way” in American Cinematographer

Also see The Virtual Production of The Mandalorian, Season One on YouTube.

Discovered from: https://news.ycombinator.com/item?id=22378679
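
As a rough illustration of what “rendered with correct camera positional data” means in practice: the wall imagery must be drawn with an off-axis projection computed from the tracked camera’s position relative to the physical screen, so the background parallax matches what the camera would see through a real window. Below is a minimal Python/NumPy sketch of the standard generalized perspective projection construction; the screen dimensions, camera position, and clip planes are illustrative assumptions, not values from the production.

    # Minimal sketch of parallax-correct (off-axis) projection for a
    # flat screen section, given a tracked camera position. All
    # numbers and names here are illustrative assumptions.
    import numpy as np

    def off_axis_projection(eye, lower_left, lower_right, upper_left,
                            near=0.1, far=1000.0):
        """Projection for an eye at `eye` looking through the rectangle
        defined by three of its corners (all in world coordinates)."""
        # Orthonormal basis of the screen plane.
        vr = lower_right - lower_left
        vr = vr / np.linalg.norm(vr)            # screen right
        vu = upper_left - lower_left
        vu = vu / np.linalg.norm(vu)            # screen up
        vn = np.cross(vr, vu)
        vn = vn / np.linalg.norm(vn)            # screen normal, toward the eye
        # Vectors from the eye to the screen corners, and eye-screen distance.
        va = lower_left - eye
        vb = lower_right - eye
        vc = upper_left - eye
        d = -np.dot(va, vn)
        # Frustum extents on the near plane.
        l = np.dot(vr, va) * near / d
        r = np.dot(vr, vb) * near / d
        b = np.dot(vu, va) * near / d
        t = np.dot(vu, vc) * near / d
        # Standard off-axis (asymmetric) perspective frustum.
        P = np.array([
            [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
            [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
            [0.0, 0.0, -(far + near) / (far - near),
             -2 * far * near / (far - near)],
            [0.0, 0.0, -1.0, 0.0],
        ])
        # Rotate world axes into the screen basis and move the eye to
        # the origin, so the frustum is expressed in screen space.
        M = np.eye(4)
        M[:3, :3] = np.stack([vr, vu, vn])
        T = np.eye(4)
        T[:3, 3] = -eye
        return P @ M @ T

    # Hypothetical 4 m x 2 m wall section, camera tracked 3 m in front.
    camera = np.array([0.5, 1.2, 3.0])
    proj = off_axis_projection(camera,
                               lower_left=np.array([-2.0, 0.0, 0.0]),
                               lower_right=np.array([2.0, 0.0, 0.0]),
                               upper_left=np.array([-2.0, 2.0, 0.0]))

The returned matrix replaces the usual symmetric projection in whatever real-time engine drives the wall; on The Mandalorian that engine was reportedly Epic’s Unreal Engine, feeding ILM’s LED volume.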

How AlphaStar Became a StarCraft Grandmaster

2020-02-13 Tommy Thompson

One of the biggest headlines in AI research for 2019 was the unveiling of AlphaStar – Google DeepMind’s project to create the world’s best player of Blizzard’s real-time strategy game StarCraft II. After shocking the world in January as the system defeated two high-ranking players in closed competition, an updated version was revealed in November that had achieved grandmaster status: ranking among the top 0.15% of Europe’s 90,000 active players. So let’s look at how AlphaStar works, the underpinning technology and theory that drives it, the truth behind the media sensationalism and how it achieved grandmaster rank in online multiplayer.

https://www.gamasutra.com/blogs/TommyThompson/20200213/358051/How_AlphaStar_Became_a_StarCraft_Grandmaster.php
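
One piece of that underpinning technology that is easy to sketch is DeepMind’s “league” training: rather than pure self-play against its latest self, the learning agent trains against a pool of frozen past agents and exploiters, with matchmaking biased toward opponents it still loses to. The toy Python loop below illustrates only that matchmaking idea, under loudly simplified assumptions (a scalar “skill” standing in for a whole policy, an Elo-style logistic match model); it is nothing like DeepMind’s actual implementation.

    # Toy sketch of league-style matchmaking: train against frozen
    # snapshots, preferring opponents we lose to, and periodically
    # freeze a copy of ourselves into the league. All modeling choices
    # here are illustrative assumptions.
    import math
    import random

    def play_match(skill_a, skill_b):
        """Toy match: win probability follows a logistic curve in the
        skill difference (an Elo-like assumption)."""
        p_a_wins = 1.0 / (1.0 + math.exp(skill_b - skill_a))
        return random.random() < p_a_wins

    learner = 0.0                  # stand-in for the learning policy
    league = [0.0]                 # frozen snapshots of past policies
    record = {0: [0, 0]}           # per-opponent [wins, games]

    for step in range(1, 5001):
        # Prioritized matchmaking: weight opponents by loss rate.
        def loss_rate(i):
            w, g = record[i]
            return 1.0 - (w / g if g else 0.5)
        weights = [loss_rate(i) + 0.1 for i in range(len(league))]
        opp = random.choices(range(len(league)), weights=weights)[0]

        won = play_match(learner, league[opp])
        record[opp][0] += int(won)
        record[opp][1] += 1

        # Stand-in for a gradient update: learn more from losses.
        learner += 0.002 if not won else 0.0005

        # Periodically freeze a snapshot into the league.
        if step % 1000 == 0:
            league.append(learner)
            record[len(league) - 1] = [0, 0]

    print(f"final toy skill {learner:.2f}, league size {len(league)}")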

Google Colaboratory Notebook and Repository Gallery

2019-11-25 firmai / Derek Snow

“A curated list of repositories with fully functional click-and-run colab notebooks with data, code and description. The code in these repositories [is] in Python unless otherwise stated.”

Some of the notebooks listed use artificial neural networks. Google Colaboratory notebooks are Jupyter notebooks that run Python or other code on Google’s cloud for free.
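
For a flavor of what a click-and-run notebook cell looks like, here is a minimal sketch that trains a small neural network on MNIST with TensorFlow/Keras, which comes preinstalled on Colab; the architecture and hyperparameters are arbitrary illustrative choices.

    # Minimal Colab-style cell: train a small MNIST classifier.
    # Architecture and hyperparameters are illustrative assumptions.
    import tensorflow as tf

    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=3, validation_data=(x_test, y_test))

Runtime → Change runtime type in Colab’s menu switches the notebook to a free GPU, which speeds up training loops like this one.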