Translation and editing | Alex
Technical review | Liu Lianxiang
This article is from Streaming Media, by Adrian Pennington
Video Exploration #007 – The Metaverse
Companies like Magic Leap, Nvidia, and Huawei were trying to build metaverses long before mainstream media became interested in them, though they called them the Magicverse, the Omniverse, or the Cyberverse. Other companies conceptualize the metaverse as planet-scale AR, the AR cloud, or mirror worlds. The person most closely associated with the metaverse today, Mark Zuckerberg, once described it as an “embodied Internet”.
These ideas dovetailed with spatial computing (also known as the 3D Web), or computer-generated 3D VR and AR as envisioned by the likes of film special-effects pioneer John Gaeta. Cultural historians and science fiction aficionados will tell you that the term metaverse was coined by Neal Stephenson in his 1992 novel Snow Crash, long before Ready Player One, in both novel and film form, took the world by storm.
All of these concepts focus on the same evolution of the Internet: from flat text and photo web pages to a virtual space that digitally enhances the physical world and enables (at least) 3D experiences.
Ori Inbar, founder of AWE (Augmented World Expo), explained the AR cloud in 2017 as “the continuous digital replication of the real world, enabling shared AR experiences across different users and devices”. The latest vision of the metaverse hasn’t strayed far from his.
In an article for NAB Amplify, Infinite Reality president Rodric David described the metaverse this way: “The metaverse is… the full convergence of streaming, interactive experiences, and social media. Content, communication, and interaction are presented as deep, intensely realistic experiences that drive user behavior and ultimately brand value.”
Craig Donato, CBO of online game creation platform Roblox, told _Protocol_:
“People use the Internet to get information, but they enter the metaverse for social opportunities. You’re no longer constrained by physical distance or by limits on who you interact with and how you present yourself. It’s absolutely disruptive.”
No one knows when the metaverse will be fully formed, but proponents are optimistic that this next generation of the Internet will have a profound impact on everything. It will become the dominant platform for producing and viewing live content globally, David said: “It will have features such as interaction, real-time transactions, integrated brand promotion, gaming, integrated social, blockchain and NFTs, and gaming tools.”
This article won’t dwell too much on the current activity surrounding the metaverse, which lets tech giants claim ever more of our data, money, and minds. But let’s admit it: the battle for the next Internet framework has already begun.
01 / Open or closed?
Tim Sweeney, CEO of Epic Games, summed up the battle when he pointed out that the “walled gardens” erected by Facebook and Google have to be torn down in order to realize the metaverse’s value financially, creatively, and socially. “Right now we’re in a closed platform, and Apple and Google are taking advantage of that closure. But when we get out of it, everyone will realize: OK, we spent the last 10 years doing it for other people.”
A “walled garden” is an environment that controls user access to apps, web pages, and services. It limits users to a specified range, allowing them to access only designated content, applications, or services while restricting or forbidding everything else. — From Baidu Encyclopedia
Leaving aside the fact that Epic Games’ own apps, such as Fortnite, are largely walled gardens themselves, it is widely acknowledged that “if the value of the metaverse is to be realized, there needs to be a cross-industry alliance based on a set of standards, guidelines, and best practices in order to support the continuous production and distribution of large-scale, cross-platform 3D and XR content,” as Neil Trevett, VP of Developer Ecosystem at Nvidia, put it at RealTime Conference 2021.
02 / Working Group on Standardization
Just as the IETF was set up in 1986 to develop and promote Internet standards, Sweeney believes such collaboration is needed to build the metaverse. You need a whole set of standards; the Web is based on a few of them (like HTML). The metaverse requires many more, such as file formats to describe 3D scenes and network protocols to describe how players interact in real time. Every multiplayer game uses some kind of network protocol, and not all of them can talk to each other, but eventually they should all be able to interoperate.
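As a toy illustration of what such a real-time interaction protocol might carry, a player-state update can be serialized as a compact message. This is a minimal sketch; all field names here are hypothetical and not drawn from any real standard:

```python
import json

def encode_player_update(player_id, position, rotation, tick):
    """Serialize one player-state update as a compact JSON message.
    Field names are illustrative, not from any real protocol spec."""
    return json.dumps({
        "type": "player_update",
        "id": player_id,
        "pos": position,      # [x, y, z] in world units
        "rot": rotation,      # [yaw, pitch, roll] in degrees
        "tick": tick,         # server tick, used for ordering and interpolation
    }, separators=(",", ":")).encode("utf-8")

def decode_player_update(raw):
    """Parse a wire message back into a dict, checking its type."""
    msg = json.loads(raw.decode("utf-8"))
    assert msg["type"] == "player_update"
    return msg

# Round-trip one update, as two interoperating clients would.
wire = encode_player_update("avatar-42", [1.0, 0.0, 3.5], [90.0, 0.0, 0.0], 1024)
msg = decode_player_update(wire)
```

Real games typically use compact binary encodings over UDP rather than JSON, but the interoperability question is the same: two implementations can only exchange such messages if they agree on the schema.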
Interaction standards and tools, the protocols, formats, and services that support persistent, ubiquitous virtual simulation, will probably be the most important part of the entire metaverse framework. “Without them, the metaverse would not exist; it would be at best a more virtual and immersive version of today’s mobile Internet and app stores,” Matthew Ball, managing partner of EpyllionCo, and Jacob Navok, CEO of Genvid Technologies, wrote on Ball’s website. “More importantly, this mediocre imitation would be far less profitable and far from healthy and vibrant…”
While most articles on the subject assume there is only one metaverse, it is more accurate to say it will be a multiverse. Much like today’s Internet, which uses hundreds of millions of personal homepages or apps as access points, the portal to a metaverse home will be a browser-based URL and a personalized avatar. “People will be able to navigate the entire virtual metaverse using mobile devices that incorporate game-engine mechanics,” David said. “Infinitely customizable avatars will carry keys, wallets, and identities, becoming our virtual selves.”
03 / Billions of metaverses
There could well be billions of such metaverses, in each of which everyone has their own digital identity, but the whole point is that they can synchronize and interoperate. Our avatars should move freely in and out of every metaverse and every hardware device (VR headsets, AR mobile devices), and our actions, creations, data, and blockchain wallets should be free of the limitations and obstacles of the walled garden.
“Blockchain is an undisputed expression of individual ownership in a neutral, shared way, and it is the most practical way to implement a long-term open framework in which people can control their own presence without gatekeepers,” Sweeney wrote in _Business of Business_.
Closed metaverses will require individual funding and effort to build as unique creations, while open metaverses will rely on open standards and massively scalable approaches. At the moral level, the debate between open and closed metaverses tends to be framed as capitalist monopoly versus socialist (democratic) utopia. In reality, the differences are more subtle.
“The closed metaverse can only be accessed by downloading proprietary source IP after the user signs an end-user license agreement,” said David. “Fortnite, Roblox, Call of Duty, Minecraft, and League of Legends are all closed metaverses. In the open metaverse, anyone who creates an avatar can enter through a browser and a URL on a PC or mobile device. Over the next few years, brands need to make investing in the metaverse their number-one strategy.”
If the goal is to integrate the world into the metaverse, many of the devices and platforms around us will need to be interconnected: cars, security cameras, VR and AR headsets, projection cameras and screens, wearables, and more. “Developing these will require common standards, or at least the ability to benefit from them,” Ball and Navok said. This puts enormous pressure on developers, and it can become a vicious circle: no platform has enough users to justify development, and no platform has enough content to attract users. “You can’t import an experience from Roblox into Minecraft or Fortnite any more than you can easily import photos and likes from Instagram into TikTok or Snapchat.”
04 / Composition of the open metaverse
How to describe shared virtual worlds in an open, flexible, and efficient way is the key to constructing an open metaverse. “It certainly won’t be an extension to HTML or JavaScript rendering libraries,” Michael Kass, a senior distinguished engineer at Nvidia, wrote in a LinkedIn post. “It will not be created by a standards committee. It will be an open-source 3D scene description, battle-tested over years under challenging conditions.”
Nvidia backs Pixar’s open-source USD (Universal Scene Description). “Pixar’s USD was created to facilitate teamwork and the interchange of tools between artists, resulting in cinema-quality characters, sets, and animations,” Kass said. “The very characteristics that support teamwork are what the cooperative and social parts of the metaverse need, and the standardization of interchange tools is what binds the metaverse together.”
Nvidia uses USD as the core technology of its Omniverse, a B2B platform on which a variety of companies build metaverse applications. Kass said Nvidia’s enhancements to USD let it render applications “wherever the relevant rendering process is located,” enabling metaverse scalability.
“The Web already has multiple replication mechanisms, including distributed databases, CDNs, and various caches,” Kass explains. “But replicating a very complex 3D virtual world presents unique challenges. If the HTML of a Web page changes, you can resend the entire changed page. That is impractical for virtual worlds built from hundreds of megabytes of data. Any practical open metaverse must be able to replicate by sending incremental updates that specify only what has changed.”
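The idea of replicating by incremental updates can be sketched as a simple scene diff. This is purely illustrative (real systems diff structured scene graphs, not flat dicts, and USD-based replication works at a much finer grain), but it shows why only the delta needs to travel:

```python
def diff_scene(old, new):
    """Compute a minimal delta between two scene states, each a flat
    dict mapping node path -> properties. Illustrative only."""
    delta = {"changed": {}, "removed": []}
    for node, props in new.items():
        if old.get(node) != props:
            delta["changed"][node] = props   # new or modified node
    for node in old:
        if node not in new:
            delta["removed"].append(node)    # node deleted from the scene
    return delta

def apply_delta(state, delta):
    """Apply a delta to a replica, producing the updated scene state."""
    state = dict(state)
    state.update(delta["changed"])
    for node in delta["removed"]:
        state.pop(node, None)
    return state

old = {"/root/chair": {"pos": (0, 0, 0)}, "/root/lamp": {"on": True}}
new = {"/root/chair": {"pos": (1, 0, 0)}}

delta = diff_scene(old, new)        # only the chair move and the lamp removal travel
replica = apply_delta(old, delta)   # a remote replica converges to the new state
```

Even in this toy form, the delta stays small regardless of how large the unchanged rest of the scene is, which is the property a hundreds-of-megabytes virtual world depends on.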
By building an efficient replication system on top of USD, Nvidia believes it can synchronize the virtual experience of multiple participants. In another post, Ball wrote: “The key thing for Omniverse is that it can achieve this synchronization regardless of the file format and engine or simulation technology used. That is to say, none of it has to go through Unity, Unreal, or AutoCAD.” While Omniverse is primarily used for design and testing right now, it is conceivable that Nvidia will use this technology, along with its own industrial computing power, to implement much of the metaverse experience in the future.
Nvidia describes Omniverse as an end-to-end platform for building and simulating virtual worlds
(Image from Nvidia)
05 / Create large-scale 3D assets
Nvidia is also a supporter of the Khronos Group, an industry consortium that includes Huawei, Google, Epic Games, and Valve and focuses on developing open-source APIs and creating royalty-free open standards for graphics, computing, and rendering acceleration. The Khronos Group manages standards such as Vulkan, OpenXR, OpenGL ES, WebGL, and glTF.
According to the Khronos Group, use of the WebGL API has become widespread, allowing users to view, manipulate, and modify 3D models without installing any browser plug-ins. VR and AR are now also supported in browsers via WebXR, it said, and the glTF 3D file format, designed for efficient downloading and rendering, enables the creation and exchange of 3D models.
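A glTF file is, at its core, a JSON document. The sketch below builds a minimal, geometry-free glTF 2.0 document to show the shape of the format; the glTF 2.0 spec requires only the `asset.version` field, and real models add meshes, accessors, and binary buffers on top:

```python
import json

def minimal_gltf(node_name="box"):
    """Build a minimal glTF 2.0 document as a Python dict.
    Geometry-free: just the required asset version plus one scene
    containing one named node."""
    return {
        "asset": {"version": "2.0"},   # the only field glTF 2.0 strictly requires
        "scene": 0,                    # index of the default scene
        "scenes": [{"nodes": [0]}],    # one scene referencing node 0
        "nodes": [{"name": node_name}],
    }

doc = minimal_gltf("hello-node")
gltf_json = json.dumps(doc)   # what a .gltf file would contain on disk
```

Because the container is plain JSON, tools and browsers can parse the scene structure cheaply and stream the heavy binary geometry separately, which is exactly the “efficient downloading and rendering” design goal the Khronos Group cites.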
Companies like Epic Games, Nvidia, and Google are already building digital worlds in the metaverse. Companies that can replicate the real world in digital form have a lot of business ahead of them. In 2019, for example, Epic Games acquired Quixel, a library of 2D and 3D photogrammetric assets founded in 2011 by artists Teddy Bergsman and Waqar Azim. In 2016, Quixel had launched Megascans, an online library of assets based on real-world materials and 3D object scans.
In an article for NAB Amplify, Ball said the ability to map the real world has become an important source of IP. “This shift explains why companies like Epic and Unity are buying companies that have the ability to scan the real world rather than building it from scratch. We are likely to see some very fierce competition in this space over the next few years. Companies with similar businesses, including Nvidia, Autodesk, Facebook, Snap, and Niantic, will all choose to create their own databases.”
However, the creation of 3D assets requires highly skilled technicians and artists, and the lack of such talent will be a major hindrance to the growth of the metaverse. Developers at the Khronos Group think mass-market LiDAR (Light Detection and Ranging) technology may be the answer. The technology is built into some newer phones, such as the iPhone 12, and is available to the average user.
Rumor has it that the iPhone 13 Pro may have a second-generation LiDAR scanner built in that, in conjunction with machine-learning algorithms, can instantly transform our everyday surroundings into 3D. “Many experts think 3D capture will be as common as digital photography was in 2000,” TechRadar reports.
LiDAR doesn’t just deal with still images; it is also important for user-generated volumetric video. _AppleInsider_ points out that an Apple patent issued in 2021 compresses LiDAR spatial information in video using an encoder: “It allows the A15 chip to simulate video bokeh based on LiDAR depth information while still shooting high-quality video.”
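To make the idea of depth-driven synthetic bokeh concrete, here is a toy model of how a per-pixel LiDAR depth value could be mapped to a blur radius. The formula and constants are invented for illustration and are not Apple's actual algorithm:

```python
def bokeh_radius(depth, focal_depth, aperture=2.0, max_radius=12.0):
    """Map a per-pixel depth (meters) to a blur radius (pixels).
    Pixels at the focal plane stay sharp; blur grows with distance
    from it, capped at max_radius. Illustrative constants only."""
    spread = abs(depth - focal_depth) / max(focal_depth, 1e-6)
    return min(max_radius, aperture * spread * 10.0)

# Subject at 1.5 m is in focus; a background wall at 6 m gets the
# largest blur disc the toy model allows.
sharp = bokeh_radius(1.5, focal_depth=1.5)
blurred = bokeh_radius(6.0, focal_depth=1.5)
```

The point of the patent, as described, is that the encoder carries the depth channel alongside the video so this kind of per-pixel computation can run (or re-run, with a different focal plane) after capture.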
3D media management platforms based on interoperability standards such as glTF, including Sketchfab and Poly, already let users view and interactively control 3D models through Web browsers. “LiDAR technology… now makes it possible for anyone with the latest iPhone to capture the real world at scale, turn it into machine-readable 3D models, turn those into tradable NFTs, and then quickly upload them to the open virtual world: avatars, wearables, furniture, even whole buildings or whole streets,” Jamie Burke, CEO and founder of venture capital firm Outlier Ventures, wrote on the company’s website.
Burke is also leading efforts to lay the groundwork for the open metaverse. Outlier Ventures invests in cryptocurrency, blockchain, and other startups in the emerging Web 3 space. “The convergence of decentralized technologies will radically reshape the Internet and form a new data economy,” the company says. “The past 20 years have been dominated by global digitization, the rise of increasingly antisocial platforms, and the ‘cloud.’ The next 20 years will be defined by the redistribution of network value and the unbundling of platform monopolies. Power will shift from the platforms to individual users.”
Outlier Ventures hopes to accelerate the realization of the metaverse by promoting the Open Metaverse OS, a shared, open operating system that builds on the success of decentralized primitives such as NFTs. It explicitly places digital currencies and “on-chain NFT-traded assets” at the heart of the emerging metaverse economy.
Ball and Navok agree with this approach. They believe blockchain will become an important exchange technology that can “retain maximum value and benefit from open standards” and is very likely to thrive in the metaverse.
Synthetic camera image and corresponding ground-truth data (image from Nvidia)
06 / The open operating system of the metaverse
Burke notes that framework technologies such as LiDAR, Pixar’s USD, and Nvidia’s Omniverse have more to gain in a global open market than on any closed platform, and that Web 3 and crypto are increasingly converging with new environments such as gaming and VR as those environments migrate from Web 2 platforms to the next generation. The best way to describe the Open Metaverse OS is as a continuous set of highly composable technologies that will increasingly, and selectively, be used to develop more open metaverses.
Other platforms hope to achieve similar results with social. OWAKE is a “real-time, moment-to-moment sharing system” that enables communication between people, between people and machines, and between machines. It was developed by Kronosa, whose mission is to “use the next generation of the Internet to build sustainable human societies where people can live and work in both virtual and real worlds.” In addition, the Open Metaverse Interoperability Group is “dedicated to connecting the virtual world through the design and development of protocols for identity, social graphs, inventory, and more.”
Nvidia Drive Mapping offers zoomable, high-definition maps and localization features for autonomous vehicles
(Image from Nvidia)
07 / Hardware in the metaverse
Metaverse or not, software can no longer cope with the vast amount of data traveling across networks. Optimizing video transmission bandwidth, latency, and reliability is critical. “If we want to interact in a vast, real-time, shared, and persistent virtual environment, we will need to receive massive streams of cloud data,” Ball and Navok wrote in a separate article. “Cloud data flows are going to be important if we want to jump seamlessly between different virtual worlds.”
Latency is a big problem in live sports today. In general, by optimizing technologies such as LL-HLS and ABR, the latency from video capture to on-screen playback can be reduced to about 5 seconds. That level of latency is fine for live NFL or Premier League games, but not for competitive video games, online gambling, and multiplayer matchups, let alone real-time social interaction in a future metaverse. “Slight changes in facial expression are important for human communication; we’re hyper-sensitive to small errors or synchronization issues (hence the CGI ‘uncanny valley’ effect),” Ball and Navok said. It doesn’t matter how powerful your device is, they insist: if it doesn’t receive all the information it needs in time, then the availability and distribution of computing power will limit and define the metaverse.
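To see where a figure like 5 seconds comes from, it helps to think of glass-to-glass latency as a budget summed across pipeline stages. The per-stage numbers below are illustrative ballparks, not measurements of any real deployment:

```python
def glass_to_glass_latency(stages):
    """Sum per-stage latencies (ms) in a live-streaming pipeline."""
    return sum(stages.values())

# Hypothetical stage budget for an LL-HLS-style live stream.
pipeline = {
    "capture_and_encode": 1000,   # camera pipeline + encoder buffering
    "segmenting":         2000,   # packaging into (partial) segments
    "cdn_propagation":     500,   # origin -> edge delivery
    "player_buffer":      1200,   # ABR safety buffer on the client
    "decode_and_render":   300,
}
total_ms = glass_to_glass_latency(pipeline)
```

Every millisecond in this budget is acceptable for watching a match but is far above the tens of milliseconds that competitive gaming or face-to-face avatar interaction tolerates, which is why the metaverse conversation keeps returning to latency rather than raw bandwidth.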
08 / Computing, 5G, and the edge of the metaverse
5G, along with infrastructure built at the edge (in data centres or mobile phones), is thought to be key to the metaverse. Beyond the usual developments in telecommunications, there is plenty of metaverse-specific innovation. LionShare Media, a Los Angeles-based startup, has launched THIN/AIR, a platform for premium entertainment and immersive media experiences. According to its website, the cloud-native, direct-to-user, 5G decentralized media distribution platform gives creators their own media channels called Projects. LionShare Media describes Projects as spatial 3D Web apps with hyper-cube UI/UX designs that go beyond over-the-top video, social media, and live-streaming experiences.
But even if we improve the computing power of user devices, bring more enterprise computing power closer to users, and build more decentralized infrastructure, it will still fall short.
Ball and Navok’s idea is that a P2P network will emerge in which the available computing power of every local PC and device is used to meet demand, with device owners paid for the CPU and GPU cycles they contribute. They believe this approach becomes possible if future transactions are handled via blockchain. “Every computer, no matter its size, will be designed to auction off any idle computing time. Billions of dynamically allocated processors will be able to power deep computing for the largest industrial customers, eventually connecting all computers and providing unlimited computing power to enable the metaverse.”
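The auction idea above can be sketched as a greedy matching of compute demand against idle-device offers. This is purely illustrative of the mechanism: a real system would also settle payments (e.g. on-chain, as Ball and Navok suggest) and verify that the purchased work was actually done:

```python
def match_compute(bids, offers):
    """Greedily match compute demand to idle-device offers.
    bids:   list of (job_id, units_needed, max_price_per_unit)
    offers: list of (device_id, units_free, ask_price_per_unit)
    Returns (job_id, device_id, units, price) allocations,
    filling each job from the cheapest available capacity first."""
    remaining = sorted(([d, u, p] for d, u, p in offers), key=lambda o: o[2])
    allocations = []
    for job, need, max_price in bids:
        for slot in remaining:
            if need == 0:
                break
            device, free, ask = slot
            if free == 0 or ask > max_price:
                continue                      # no capacity, or too expensive
            take = min(need, free)
            allocations.append((job, device, take, ask))
            slot[1] -= take                   # consume this device's capacity
            need -= take
    return allocations

# A render job needs 8 units at no more than 5 per unit; three idle
# devices offer capacity at different asking prices.
allocs = match_compute(
    bids=[("render-job", 8, 5)],
    offers=[("laptop", 4, 3), ("desktop", 10, 4), ("phone", 2, 9)],
)
```

Here the job is filled from the laptop and desktop; the phone's asking price exceeds the bid ceiling and its capacity goes unsold, which is exactly the price-discovery behavior an idle-compute auction is meant to produce.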
References:
Amplify.nabshow.com/articles/me…
www.awexr.com/
www.bbc.com/news/techno…
www.protocol.com/roblox-meta…
www.businessofbusiness.com/articles/wh…
www.matthewball.vc/all/themeta…
www.khronos.org/webgl/
www.techradar.com/news/why-th…
Thank you:
This article has been translated and published with permission from Adrian Pennington.
Original link:
www.streamingmedia.com/Articles/Re…