Google’s Chrome Beta 94 announcement mentions that the company is implementing some new web standards that could make browser-based gaming experiences even better. The soon-to-be-released WebCodecs API could help make cloud gaming easier and faster, while the experimental WebGPU could make it easier for developers of games that run in the browser to leverage the power of your computer.
WebCodecs is an API designed to give developers better access to the video encoding/decoding codecs already bundled with your browser, rather than leaving them to work out how to handle video streams on their own. While there are already methods for playing video in Chrome, they aren’t necessarily designed for things like cloud gaming, which works best when latency is as low as possible. WebCodecs is built to avoid that overhead, making it easier to get an incoming video stream onto your screen as quickly as possible, potentially using hardware decoding. In theory, this will also make it perform better on slower machines (the types of computers where cloud gaming is most desirable anyway).
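To illustrate the low-overhead path described above, here is a minimal, browser-only sketch of decoding an incoming video stream with WebCodecs and drawing the frames to a canvas. The codec string, the `ctx` canvas context, and the `chunkBytes` buffer arriving from the network are assumptions for illustration, not part of the announcement:

```javascript
// Assumes a 2D canvas context `ctx` and encoded bytes `chunkBytes`
// arriving from the network (e.g. a cloud-gaming stream).
const decoder = new VideoDecoder({
  // Called once per decoded frame; draw it and release it promptly.
  output: (frame) => {
    ctx.drawImage(frame, 0, 0);
    frame.close(); // frames hold scarce decoder resources
  },
  error: (e) => console.error("decode error:", e),
});

decoder.configure({
  codec: "avc1.42E01E", // example: H.264 Baseline profile
  hardwareAcceleration: "prefer-hardware", // hint, not a guarantee
});

// For each encoded chunk received from the server:
decoder.decode(
  new EncodedVideoChunk({
    type: "key",       // "key" or "delta", per the chunk's metadata
    timestamp: 0,      // microseconds
    data: chunkBytes,
  })
);
```

Because the decoder hands frames directly to the page, there is no intermediate media element or container parsing in the way, which is where the latency savings come from.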
The newer, more experimental WebGPU gives web developers better access to your computer’s graphics power by connecting them to your computer’s native graphics API (such as Apple’s Metal, Microsoft’s DirectX 12, or Vulkan). In simpler terms, it makes it easier for web developers to talk to your graphics card in a language it understands, without going through other layers that could slow things down. It’s intended as a next-generation successor to WebGL, which lets developers leverage the (now fairly outdated) OpenGL framework. In the future, the technology should make it easier for developers to create graphically intense games that run in the browser, leveraging the full power of the current generation of GPUs.
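For a sense of how directly WebGPU maps onto modern GPU concepts, here is a browser-only sketch that requests a GPU device and records a single render pass that clears a canvas. The `canvas` element is an assumption; everything else uses the standard WebGPU API surface:

```javascript
// Assumes an HTMLCanvasElement `canvas` on the page.
const adapter = await navigator.gpu.requestAdapter(); // physical GPU handle
const device = await adapter.requestDevice();         // logical device

const context = canvas.getContext("webgpu");
context.configure({
  device,
  format: navigator.gpu.getPreferredCanvasFormat(),
});

// Record GPU commands explicitly, much like Metal/DX12/Vulkan.
const encoder = device.createCommandEncoder();
const pass = encoder.beginRenderPass({
  colorAttachments: [
    {
      view: context.getCurrentTexture().createView(),
      loadOp: "clear",
      clearValue: { r: 0, g: 0, b: 0, a: 1 }, // clear to opaque black
      storeOp: "store",
    },
  ],
});
pass.end();
device.queue.submit([encoder.finish()]);
```

Compared with WebGL’s implicit global state machine, this explicit command-recording model is what lets the browser hand work to the native API with fewer translation layers in between.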
Both technologies also have their place outside of gaming. In a talk from July 2020, Google mentioned that Zoom was interested in using WebCodecs for video conferencing, and WebGPU can be used to display 3D models in the browser or to accelerate machine learning models. It makes sense that they would appear in Chrome, as these are all areas Google is playing in, from cloud gaming with Google Stadia to its own video conferencing apps. However, both pieces of technology are open standards, developed by the W3C, and other browser makers have started testing them too.
Of course, we probably won’t see experiences powered by WebCodecs or WebGPU for a while. While WebCodecs is nearing release (it is expected to be enabled by default in the upcoming Chrome 94), developers will still need to get their apps working with it. As for WebGPU, it is currently in its experimental trial phase, which Google expects to run until early 2022. Whether it ships as a stable feature at that point will depend on how the trial goes, whether the spec is complete, and whether enough people are interested in using it.
While these technologies may not enable things that were previously impossible, they are exciting nonetheless. When things are simpler or more flexible, the barrier to entry for developers gets lower. For gamers looking to play on the web, whether through streaming or native browser games, the time developers save trying to figure out how to get frames on your screen is time they can spend improving other parts of the experience.