Although many games allow players to freely roam and explore fully realized 3D worlds, fans who watch a favorite player’s exploits on streaming services like Twitch are limited to that player’s perspective alone. But that may not be the case for much longer.
Exploring a game’s 3D environment while following a single player’s progress is far from a new idea. Many multiplayer first-person shooters let players who die follow a teammate until they respawn, and many games offer spectator modes in which viewers can float freely around the map others are playing on. That experience, however, requires every spectator to have a copy of the game installed and running.
Streaming services like Twitch instead take a single player’s rendered view of a game and broadcast it to thousands of viewers as a 2D video stream. This approach has helped popularize video games as a spectator sport, but letting viewers change the perspective of the live stream they are watching would make the experience even more compelling, much as alternative camera angles can enhance a sports broadcast. Normally, though, that would require each viewer to have a full copy of the game installed on their device, which isn’t always possible. For example, you can watch a live stream of a game that demands a powerful high-end gaming PC on a low-end tablet that could never run it.
Researchers from the University of Waterloo’s David R. Cheriton School of Computer Science in Ontario, Canada have found a way to give viewers of a video game stream the ability to actually look around the featured 3D world, without needing to own or install a copy of the game itself. Even today’s low-end mobile devices, such as tablets and smartphones, have enough processing power to render simple 3D environments, and the researchers leverage that to create a live-view tool with extended capabilities. Their findings appear in a recently published article.
Instead of supplying everyone in a live stream with a copy of the game (many A-list games, with gigabytes of graphics assets, can take hours to download even on a fast connection), the researchers simply augment the 2D video data sent to viewers with additional information extracted from the game’s real-time renderer, including the “depth buffer, camera pose, and projection matrix”.
On the viewer side, this additional data lets the 2D video be used to reconstruct a rendered 3D environment that matches the in-game geometry. And much as with in-game spectator modes, the viewer can manipulate the environment to change where they are looking, or even where the camera is positioned. Want to see where the missile that killed the player you’re watching came from? You could potentially look around the same 3D environment they’re in and see for yourself.
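The article doesn’t show the researchers’ actual pipeline, but the core idea, using the depth buffer together with the camera’s projection and view matrices to lift each 2D pixel back into a 3D world-space point, is standard “unprojection”. Here is a minimal sketch, assuming OpenGL-style conventions and a hypothetical `unproject` helper (all names are illustrative, not taken from the paper):

```python
import numpy as np

def make_projection(fov_y_deg, aspect, near, far):
    """OpenGL-style perspective projection matrix."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), (2.0 * far * near) / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ])

def unproject(px, py, depth, width, height, proj, view):
    """Map a pixel (px, py) and its depth-buffer value in [0, 1]
    back to a 3D point in world space."""
    # Pixel coordinates -> normalized device coordinates in [-1, 1]
    ndc = np.array([
        2.0 * (px + 0.5) / width - 1.0,
        1.0 - 2.0 * (py + 0.5) / height,  # flip y: screen y grows downward
        2.0 * depth - 1.0,
        1.0,
    ])
    # Undo the projection and camera (view) transforms in one step
    world = np.linalg.inv(proj @ view) @ ndc
    return world[:3] / world[3]  # perspective divide
```

Running this over every pixel of one video frame yields a point cloud of the visible geometry, which a lightweight client-side renderer can then redraw from any new camera position.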
The only downside is that parts of the recreated 3D environment lack texture and other graphical detail, making them look mostly monochromatic and bland. For now, that may make the remote experience less compelling. But as the research progresses, we may eventually see ways to fix this problem, especially since interactive game streaming seems to be the direction the industry is moving in general.