To this end, part of the reason the battle royale genre has only recently become popular in video games is that it has only recently become possible for so many users to play live together. Although games with high concurrent-user counts have existed for more than twenty years, such as Second Life or World of Warcraft, they essentially spoofed the experience through “sharding”: splitting users across different “worlds” and servers. Eve Online, for example, can technically have more than 100,000 players “in the same game”, but they are split across different galaxies (i.e. server nodes). As a result, a player only ever sees or interacts with a small handful of other players at any one time. In addition, traveling to another galaxy means disconnecting from one server and loading another (which the game narratively “hides” by forcing players to jump to light speed in order to cross the vastness of space). This worked because the gameplay was predominantly large-scale, pre-planned, ship-based combat. In a “fast-twitch” game such as Rocket League or Call of Duty, these slowdowns would be unplayable.
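To make “sharding” concrete, here is a minimal sketch of how a game backend might deterministically assign players to server shards. The function name, player IDs, and shard count are all hypothetical, and real systems (including Eve Online's) are far more sophisticated, but the core idea is this simple: a stable mapping from user to server node.

```python
import hashlib

def assign_shard(player_id: str, num_shards: int) -> int:
    """Deterministically map a player to one of N server shards.

    Hash-based assignment keeps the mapping stable: the same player
    always lands on the same shard as long as num_shards is unchanged.
    """
    digest = hashlib.sha256(player_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

# Only players on the same shard can see or interact with one another,
# even though everyone is nominally "in the same game".
print(assign_shard("pilot-1138", 16))
print(assign_shard("pilot-2187", 16))
```

The catch, as described above, is the seams: moving a player between shards means a disconnect-and-reload, which the game must hide behind a loading screen or a light-speed jump.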
A number of companies are working hard to solve this problem, such as the aptly named Improbable. But this is an enormous computational challenge and one that fights against the underlying design/intent of the Internet.
The Internet as we experience it today works because of standards and protocols for visual presentation, file loading, communications, graphics, data, and so forth. These include everything from consumer-recognizable file types such as .GIF to the WebSocket protocol that underlies almost every form of real-time communication between browsers and servers on the internet.
The Metaverse will require an even broader, more complex, and more resilient set of S&Ps. What’s more, the importance of interoperability and live synchronous experiences means we’ll need to prune some existing standards and “standardize” around a smaller set per function (today, images alone span .GIF, .JPEG, .PNG, .BMP, .TIFF, .WEBP, and more). And while the web today is built on open standards, much of it is closed and proprietary. Amazon and Facebook and Google use similar technologies, but they aren’t designed to transition into one another – just as Ford’s wheels aren’t designed to fit a GM chassis. In addition, these companies are incredibly resistant to cross-integrating their systems or sharing their data. Such moves might raise the overall value of the “digital economy”, but they would also weaken each company’s hyper-valuable network effects and make it easier for a user to move their digital life elsewhere.
This will be enormously difficult and take decades. And the more valuable and interoperable the Metaverse is, the harder it will be to establish industry-wide consensus around topics such as data security, data persistence, forward compatible code evolution, and transactions.
While the establishment of standards usually involves formal meetings, negotiations, and debates, the standards for the Metaverse won’t be established upfront. The process will be much messier and more organic, with meetings held and opinions changing on an ad hoc basis.
To use a meta analogy for the Metaverse, consider SimCity. In ideal circumstances, the “Mayor” (i.e. the player) would first design their mega-metropolis, then build from day one toward this final vision. But in the game, as in real life, you can’t just “build” a 10MM-person city. You start with a small town and optimize it first (e.g. where the roads and schools go, utility capacity, etc.). As it grows, you build around this town, occasionally but judiciously tearing down and replacing “old” sections – sometimes only when a problem (insufficient power supply) or disaster (a fire) hits. But unlike in SimCity, there will be many mayors, not one – and their desires and incentives will often conflict.