<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>SynWorlds</title>
	<atom:link href="https://www.synworlds.com/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.synworlds.com</link>
	<description>Virtual Worlds.  Real Play.</description>
	<lastBuildDate>Mon, 22 Jul 2024 23:29:28 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.8.38</generator>
	<item>
		<title>How AI Brought 11,000 College Football Players to Digital Life in Three Months</title>
		<link>https://www.synworlds.com/2024/07/22/how-ai-brought-11000-college-football-players-to-digital-life-in-three-months/</link>
		<comments>https://www.synworlds.com/2024/07/22/how-ai-brought-11000-college-football-players-to-digital-life-in-three-months/#comments</comments>
		<pubDate>Mon, 22 Jul 2024 23:29:28 +0000</pubDate>
		<dc:creator><![CDATA[Monty Simus]]></dc:creator>
				<category><![CDATA[Blog]]></category>

		<guid isPermaLink="false">http://www.synworlds.com/?p=1062</guid>
		<description><![CDATA[Via the Wall Street Journal, a report on Electronic Arts&#8217; use of new tech to scan photos for its videogame after securing players’ likeness rights for the first time A character artist works on the new ‘EA Sports College Football 25.’ PHOTO: EA SPORTS It has been over a decade since Electronic Arts released a college football videogame. To [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Via the Wall Street Journal, a <a href="https://www.wsj.com/tech/ai/how-ai-brought-11-000-college-football-players-to-digital-life-in-three-months-84d7c00f?" target="_blank">report</a> on Electronic Arts&#8217; use of new tech to scan photos for its videogame after securing players’ likeness rights for the first time:</p>
<article>
<section>
<div style="text-align: center;" data-type="image" data-inset_type="" data-sub_type="" data-layout="inline">
<blockquote><figure><em><img class="aligncenter" alt="" src="https://images.wsj.net/im-982589?width=700&amp;height=525" srcset="https://images.wsj.net/im-982589?width=540&amp;size=1.3333333333333333 540w, https://images.wsj.net/im-982589?width=620&amp;size=1.3333333333333333 620w, https://images.wsj.net/im-982589?width=639&amp;size=1.3333333333333333 639w, https://images.wsj.net/im-982589?width=700&amp;size=1.3333333333333333 700w, https://images.wsj.net/im-982589?width=700&amp;size=1.3333333333333333&amp;pixel_ratio=1.5 1050w, https://images.wsj.net/im-982589?width=700&amp;size=1.3333333333333333&amp;pixel_ratio=2 1400w, https://images.wsj.net/im-982589?width=700&amp;size=1.3333333333333333&amp;pixel_ratio=3 2100w" width="700" height="525" /></em></figure>
<p><em>A character artist works on the new ‘EA Sports College Football 25.’ PHOTO: EA SPORTS</em></p></blockquote>
</div>
<blockquote>
<p data-type="paragraph"><em>It has been over a decade since Electronic Arts released a college football videogame. To get the likenesses of some 11,000 players into the new version that launched Friday, it had three months.</em></p>
<p data-type="paragraph"><em>For its long-running Madden NFL series, EA developers travel the country to make three-dimensional scans of professional players. But that wasn’t financially or logistically feasible for “EA Sports College Football 25.” </em></p>
</blockquote>
<div>
<blockquote>
<p data-type="paragraph"><em>There are about six times as many players in the National Collegiate Athletic Association’s top-tier Football Bowl Subdivision as there are in the National Football League. In addition, colleges don’t set their rosters until the late spring and they often change significantly, as some players go pro and others matriculate.</em></p>
<p style="text-align: center;" data-type="paragraph"><em>To release <a href="https://www.wsj.com/sports/football/electronic-arts-college-football-videogame-7867f3d1?mod=article_inline" rel="" data-type="link">the college-football game</a> this month, EA relied on artificial-intelligence technology it began developing about four years ago, before the NCAA set a <a href="https://www.wsj.com/articles/ncaa-proposes-interim-policy-on-name-image-likeness-11624915224?mod=article_inline" rel="" data-type="link">new policy</a> allowing players to sell their likeness rights in 2021 amid high-profile litigation and state legislation.</em></p>
<p data-type="paragraph"><em>“It was a leap of faith” that the NCAA would change its rules, said Cam Weber, president of EA Sports. </em></p>
<p data-type="paragraph"><em>EA collected photos of the athletes’ heads from their schools and then used its AI to create their videogame doppelgängers in seconds. The technology isn’t generative AI that creates new images such as OpenAI’s Dall-E, but rather a kind that takes data from photos and creates full 3-D avatars.</em></p>
<p data-type="paragraph"><em>If the results weren’t up to snuff, artists were brought in to make enhancements. Their changes were then fed back into the AI program so it could learn from its mistakes.</em></p>
<p data-type="paragraph"><em>The digital versions aren’t as detailed as in Madden, but they mark the first time EA has been able to put replicas of real players in its college football game.</em></p>
<p data-type="paragraph"><em>“You couldn’t do this with the regular workflows we’ve done in the past,” Weber said. </em></p>
<div style="text-align: center;" data-type="image" data-inset_type="" data-sub_type="" data-layout="inline">
<figure><em><img alt="" src="https://images.wsj.net/im-982591?width=700&amp;height=394" srcset="https://images.wsj.net/im-982591?width=540&amp;size=1.7777777777777777 540w, https://images.wsj.net/im-982591?width=620&amp;size=1.7777777777777777 620w, https://images.wsj.net/im-982591?width=639&amp;size=1.7777777777777777 639w, https://images.wsj.net/im-982591?width=700&amp;size=1.7777777777777777 700w, https://images.wsj.net/im-982591?width=700&amp;size=1.7777777777777777&amp;pixel_ratio=1.5 1050w, https://images.wsj.net/im-982591?width=700&amp;size=1.7777777777777777&amp;pixel_ratio=2 1400w, https://images.wsj.net/im-982591?width=700&amp;size=1.7777777777777777&amp;pixel_ratio=3 2100w" width="700" height="394" /></em></figure>
<p><em>Texas quarterback Quinn Ewers, one of the players on the cover of ‘EA Sports College Football 25,’ as he appears in the game. PHOTO: EA SPORTS</em></p></div>
<p data-type="paragraph"><em>For the 134 college football stadiums in the game, EA used other technology it developed in-house to incorporate details such as Notre Dame’s “Touchdown Jesus” mural and the waterfalls at Arkansas State. </em></p>
<p data-type="paragraph"><em>The project was so large that the team reached the maximum number of cells possible on a Google Sheets spreadsheet—10 million—while trying to keep track of all the data it collected.</em></p>
<p data-type="paragraph"><em>“All these things are meaningful” to fans, said Robert Jones, a senior production director at EA. </em></p>
<p data-type="paragraph"><em>Weber said EA sees long-term value in its AI technology because the company plans to release new installments of the college game annually. In addition, some of the athletes are likely to become part of its Madden NFL series in the future, which could potentially make their 3-D avatars continually useful. </em></p>
<p data-type="paragraph"><em>The company also hopes to use the technology for its other sports titles, which span soccer, auto-racing, hockey and mixed-martial arts.</em></p>
<h3 data-type="hed"><em>Securing likeness rights</em></h3>
<p data-type="paragraph"><em>In 2013, EA <a href="https://www.wsj.com/articles/SB10001424052702304526204579099672450514140?mod=article_inline" rel="" data-type="link">canceled</a> its old college football game in the aftermath of a <a href="https://www.wsj.com/articles/ncaa-unveils-20-million-settlement-with-ex-players-over-videogames-1402330931?mod=article_inline&amp;mod=article_inline&amp;mod=article_inline" rel="" data-type="link">class-action lawsuit</a> involving its use of a college basketball player’s likeness in another one of its sports titles.</em></p>
<p data-type="paragraph"><em>In early 2021, anticipating the NCAA would set its new likeness rules later that year, EA <a href="https://www.wsj.com/articles/electronic-arts-plans-return-to-college-football-videogames-11612288511?mod=article_inline" rel="" data-type="link">announced plans</a> to reboot its college-football franchise—and with a major upgrade.</em></p>
<p data-type="paragraph"><em>Previously, the games featured generic college players, because EA didn’t have a way to secure the rights to real ones. Now the company wanted to include real players for the first time. </em></p>
<p data-type="paragraph"><em>Questions arose in the collegiate football community about how EA would compensate players and whether stars would get paid more. </em></p>
<p data-type="paragraph"><em>The company ended up offering players $600 each, plus a deluxe copy of the game that retails for $99.99 and comes with extra content. An EA spokesman said more players opted in than the company was able to fit into the game.</em></p>
<div style="text-align: center;" data-type="image" data-inset_type="" data-sub_type="" data-layout="inline">
<figure><em><img alt="" src="https://images.wsj.net/im-982590?width=700&amp;height=394" srcset="https://images.wsj.net/im-982590?width=540&amp;size=1.7777777777777777 540w, https://images.wsj.net/im-982590?width=620&amp;size=1.7777777777777777 620w, https://images.wsj.net/im-982590?width=639&amp;size=1.7777777777777777 639w, https://images.wsj.net/im-982590?width=700&amp;size=1.7777777777777777 700w, https://images.wsj.net/im-982590?width=700&amp;size=1.7777777777777777&amp;pixel_ratio=1.5 1050w, https://images.wsj.net/im-982590?width=700&amp;size=1.7777777777777777&amp;pixel_ratio=2 1400w, https://images.wsj.net/im-982590?width=700&amp;size=1.7777777777777777&amp;pixel_ratio=3 2100w" width="700" height="394" /></em></figure>
<p><em>An image from EA Sports’ new college-football videogame, which has gotten mostly positive reviews from gamers who paid for early access. PHOTO: EA SPORTS</em></p></div>
<p data-type="paragraph"><em>Some stars are getting more money, however, in exchange for extra work. They include Michigan Wolverines running back Donovan Edwards, Texas Longhorns quarterback Quinn Ewers and Colorado Buffaloes cornerback and wide receiver Travis Hunter, who are on the cover of the game, and more than 100 players who agreed to promote “EA Sports College Football 25” on social media. </em></p>
<p data-type="paragraph"><em>That cost, which adds up to more than $7.7 million, is notable but not huge for a game that will likely cost at least $40 million to make and tens of millions of dollars more to market, according to analysts.</em></p>
<h3 data-type="hed"><em>Competition with Madden</em></h3>
<p data-type="paragraph"><em>“EA Sports College Football 25” is coming out at a time when the videogame industry is in a funk. It has been beset by layoffs this year, and publishers have been releasing fewer big-budget, visually complex titles—so-called triple-A games—which typically sell the most copies. U.S. videogame software sales fell 3% in May, according to the most recent monthly data available from market-research firm Circana.</em></p>
<p data-type="paragraph"><em>The lack of competition and the long delay since EA last put out a college-football game should help fuel demand for the new one, said Oppenheimer analyst Martin Yang. He estimates it will sell at least four million units, translating to more than $240 million in sales. </em></p>
<p data-type="paragraph"><em>The version of the game that came out in 2013, called “NCAA Football,” had a devoted following but was never as popular as EA’s Madden NFL series. </em></p>
<p data-type="paragraph"><em>This year, college football returning to videogames with real players for the first time could dent demand for August’s “Madden NFL 25,” Yang said. The annual Madden sequels are typically among EA’s bestselling games. </em></p>
<p data-type="paragraph"><em>To address that potential problem, EA is selling the deluxe editions of both games in a bundle for $149.99, or $50 less than buying them separately. The standard format costs $69.99.</em></p>
<p data-type="paragraph"><em>So far, social-media reactions to the game from people who paid for early access appear to be mostly positive. However, many complained about problems trying to play the game online.</em></p>
<p data-type="paragraph"><em>“Man I sure wish I could play #NCAA25 without EA’s fragile servers dying,” an X user wrote Monday. </em></p>
<p data-type="paragraph"><em>EA said it is increasing server capacity to try to address the problem.</em></p>
<p data-type="paragraph"><em>The company is also hoping some small details created without the help of AI will help engage fans. To make an in-game version of the University of Texas mascot, a live bull named Bevo, developers used motion-capture technology on a colleague who ran on all fours.</em></p>
</blockquote>
</div>
</section>
</article>
]]></content:encoded>
			<wfw:commentRss>https://www.synworlds.com/2024/07/22/how-ai-brought-11000-college-football-players-to-digital-life-in-three-months/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>How Generative AI Could Reinvent What It Means To Play</title>
		<link>https://www.synworlds.com/2024/06/21/how-generative-ai-could-reinvent-what-it-means-to-play/</link>
		<comments>https://www.synworlds.com/2024/06/21/how-generative-ai-could-reinvent-what-it-means-to-play/#comments</comments>
		<pubDate>Fri, 21 Jun 2024 07:34:47 +0000</pubDate>
		<dc:creator><![CDATA[Monty Simus]]></dc:creator>
				<category><![CDATA[Blog]]></category>

		<guid isPermaLink="false">http://www.synworlds.com/?p=1056</guid>
		<description><![CDATA[Via MIT&#8217;s Technology Review, a report on how AI-powered NPCs that don’t need a script could make games—and other worlds—deeply immersive. First, a confession. I only got into playing video games a little over a year ago (I know, I know). A Christmas gift of an Xbox Series S “for the kids” dragged me—pretty easily, [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Via MIT&#8217;s Technology Review, a <a href="https://www.technologyreview.com/2024/06/20/1093428/generative-ai-reinventing-video-games-immersive-npcs/?" target="_blank">report</a> on how AI-powered NPCs that don’t need a script could make games—and other worlds—deeply immersive:</p>
<div>
<header id="how-generative-ai-could-reinvent-what-it-means-to-play">
<div>
<div>
<div>
<blockquote>
<div><em>First, a confession. I only got into playing video games a little over a year ago (I know, I know). A Christmas gift of an Xbox Series S “for the kids” dragged me—pretty easily, it turns out—into the world of late-night gaming sessions. I was immediately attracted to open-world games, in which you’re free to explore a vast simulated world and choose what challenges to accept. Red Dead Redemption 2 (RDR2), an open-world game set in the Wild West, blew my mind. I rode my horse through sleepy towns, drank in the saloon, visited a vaudeville theater, and fought off bounty hunters. One day I simply set up camp on a remote hilltop to make coffee and gaze down at the misty valley below me.</em></div>
</blockquote>
</div>
</div>
</div>
</header>
</div>
<div>
<div id="content--body">
<blockquote>
<div>
<p><em>To make them feel alive, open-world games are inhabited by vast crowds of computer-controlled characters. These animated people—called NPCs, for “nonplayer characters”—populate the bars, city streets, or space ports of games. They make these virtual worlds feel lived in and full. Often—but not always—you can talk to them.</em></p>
</div>
<div>
<div>
<p><em>In open-world games like Red Dead Redemption 2, players can choose diverse interactions within the same simulated experience.</em></p>
</div>
</div>
<div>
<p><em>After a while, however, the repetitive chitchat (or threats) of a passing stranger forces you to bump up against the truth: This is just a game. It’s still fun—I had a whale of a time, honestly, looting stagecoaches, fighting in bar brawls, and stalking deer through rainy woods—but the illusion starts to weaken when you poke at it. It’s only natural. Video games are carefully crafted objects, part of a multibillion-dollar industry, that are designed to be consumed. You play them, you loot a few stagecoaches, you finish, you move on. </em></p>
<p><em>It may not always be like that. Just as it is upending other industries, generative AI is opening the door to entirely new kinds of in-game interactions that are open-ended, creative, and unexpected. The game may not always have to end.</em></p>
<p><em>Startups employing generative-AI models, like ChatGPT, are using them to create characters that don’t rely on scripts but, instead, converse with you freely. Others are experimenting with NPCs who appear to have entire interior worlds, and who can continue to play even when you, the player, are not around to watch. Eventually, generative AI could create game experiences that are infinitely detailed, twisting and changing every time you experience them. </em></p>
<p><em>The field is still very new, but it’s extremely hot. In 2022 the venture firm Andreessen Horowitz launched Games Fund, a $600 million fund dedicated to gaming startups. A huge number of these are planning to use AI in gaming. And the firm, also known as A16Z, has now invested in two studios that are aiming to create their own versions of AI NPCs. A second $600 million round was announced in April 2024.</em></p>
</div>
<div>
<div>
<p><em>Early experimental demos of these experiences are already popping up, and it may not be long before they appear in full games like RDR2. But some in the industry believe this development will not just make future open-world games incredibly immersive; it could change what kinds of game worlds or experiences are even possible. Ultimately, it could change what it means to play.</em></p>
<p><em>“What comes after the video game? You know what I mean?” says Frank Lantz, a game designer and director of the NYU Game Center. “Maybe we’re on the threshold of a new kind of game.”</em></p>
<h3><em>These guys just won’t shut up</em></h3>
<p><em>The way video games are made hasn’t changed much over the years. Graphics are incredibly realistic. Games are bigger. But the way in which you interact with characters, and the game world around you, uses many of the same decades-old conventions.</em></p>
<p><em>“In mainstream games, we’re still looking at variations of the formula we’ve had since the 1980s,” says Julian Togelius, a computer science professor at New York University who has a startup called Modl.ai that does in-game testing. Part of that tried-and-tested formula is a technique called a dialogue tree, in which all of an NPC’s possible responses are mapped out. Which one you get depends on which branch of the dialogue tree you have chosen. For example, say something rude about a passing NPC in RDR2 and the character will probably lash out—you have to quickly apologize to avoid a shootout (unless that’s what you want).</em></p>
</div>
</div>
<div>
<div>
<div>
<p><em>In the most expensive, high-profile games, the so-called AAA games like Elden Ring or Starfield, a deeper sense of immersion is created by using brute force to build out deep and vast dialogue trees. The biggest studios employ teams of hundreds of game developers who work for many years on a single game in which every line of dialogue is plotted and planned, and software is written so the in-game engine knows when to deploy that particular line. RDR2 reportedly <a href="https://www.vulture.com/2018/10/the-making-of-rockstar-games-red-dead-redemption-2.html">contains an estimated 500,000 lines of dialogue</a>, voiced by around 700 actors. </em></p>
<p><em>“You get around the fact that you can [only] do so much in the world by, like, insane amounts of writing, an insane amount of designing,” says Togelius. </em></p>
</div>
</div>
<p><em>Generative AI is already helping take some of that drudgery out of making new games. Jonathan Lai, a general partner at A16Z and one of Games Fund’s managers, says that most studios are using image-generating tools like Midjourney to enhance or streamline their work. And in a 2023 survey by A16Z, 87% of game studios said they were already using AI in their workflow in some way—and 99% planned to do so in the future. Many use AI agents to replace the human testers who look for bugs, such as places where a game might crash. In recent months, the CEO of the gaming giant EA said generative AI could be used in more than 50% of its game development processes.</em></p>
<p><em>Ubisoft, one of the biggest game developers, famous for AAA open-world games such as Assassin’s Creed, has been using a large-language-model-based AI tool called Ghostwriter to do some of the grunt work for its developers in writing basic dialogue for its NPCs. Ghostwriter generates loads of options for background crowd chatter, which the human writer can pick from or tweak. The idea is to free the humans up so they can spend that time on more plot-focused writing.</em></p>
<p><em>Ultimately, though, everything is scripted. Once you spend a certain number of hours on a game, you will have seen everything there is to see, and completed every interaction. Time to buy a new one.</em></p>
<p><em>But for startups like Inworld AI, this situation is an opportunity. Inworld, based in California, is building tools to make in-game NPCs that respond to a player with dynamic, unscripted dialogue and actions—so they never repeat themselves. The company, now valued at $500 million, is the best-funded AI gaming startup around thanks to backing from former Google CEO Eric Schmidt and other high-profile investors. </em></p>
<p><em>Role-playing games give us a unique way to experience different realities, explains Kylan Gibbs, Inworld’s CEO and founder. But something has always been missing. “Basically, the characters within there are dead,” he says. </em></p>
<p><em>“When you think about media at large, be it movies or TV or books, characters are really what drive our ability to empathize with the world,” Gibbs says. “So the fact that games, which are arguably the most advanced version of storytelling that we have, are lacking these live characters—it felt to us like a pretty major issue.”</em></p>
<p><em>Gamers themselves were pretty quick to realize that LLMs could help fill this gap. Last year, some came up with ChatGPT mods (a way to alter an existing game) for the popular role-playing game Skyrim. The mods let players interact with the game’s vast cast of characters using LLM-powered free chat. One mod even included OpenAI’s speech recognition software Whisper AI so that players could speak to the players with their voices, saying whatever they wanted, and have full conversations that were no longer restricted by dialogue trees. </em></p>
<p><em>The results gave gamers a glimpse of what might be possible but were ultimately a little disappointing. Though the conversations were open-ended, the character interactions were stilted, with delays while ChatGPT processed each request. </em></p>
</div>
<div>
<p><em>Inworld wants to make this type of interaction more polished. It’s offering a product for AAA game studios in which developers can create the brains of an AI NPC that can be then imported into their game. Developers use the company’s “Inworld Studio” to generate their NPC. For example, they can fill out a core description that sketches the character’s personality, including likes and dislikes, motivations, or useful backstory. Sliders let you set levels of traits such as introversion or extroversion, insecurity or confidence. And you can also use free text to make the character drunk, aggressive, prone to exaggeration—pretty much anything.</em></p>
<p><em>Developers can also add descriptions of how their character speaks, including examples of commonly used phrases that Inworld’s various AI models, including LLMs, then spin into dialogue in keeping with the character. </em></p>
<blockquote><p><em>“Because there’s such reliance on a lot of labor-intensive scripting, it’s hard to get characters to handle a wide variety of ways a scenario might play out, especially as games become more and more open-ended.”</em></p>
<p><em><cite>Jeff Orkin, founder, Bitpart</cite></em></p></blockquote>
<p><em>Game designers can also plug other information into the system: what the character knows and doesn’t know about the world (no Taylor Swift references in a medieval battle game, ideally) and any relevant safety guardrails (does your character curse or not?). Narrative controls will let the developers make sure the NPC is sticking to the story and isn’t wandering wildly off-base in its conversation. The idea is that the characters can then be imported into video-game graphics engines like Unity or Unreal Engine to add a body and features. Inworld is collaborating with the text-to-voice startup ElevenLabs to add natural-sounding voices.</em></p>
<p><em>Inworld’s tech hasn’t appeared in any AAA games yet, but at the Game Developers Conference (GDC) in San Francisco in March 2024, the firm unveiled an early demo with Nvidia that showcased some of what will be possible. In Covert Protocol, each player operates as a private detective who must solve a case using input from the various in-game NPCs. Also at the GDC, Inworld <a href="https://www.youtube.com/watch?v=1od2pIs9220">unveiled a demo called NEO NPC</a> that it had worked on with Ubisoft. In NEO NPC, a player could freely interact with NPCs using voice-to-text software and use conversation to develop a deeper relationship with them.</em></p>
<p><em>LLMs give us the chance to make games more dynamic, says Jeff Orkin, founder of Bitpart, a new startup that also aims to create entire casts of LLM-powered NPCs that can be imported into games. “Because there’s such reliance on a lot of labor-intensive scripting, it’s hard to get characters to handle a wide variety of ways a scenario might play out, especially as games become more and more open-ended,” he says.</em></p>
<p><em>Bitpart’s approach is in part inspired by Orkin’s PhD research at MIT’s Media Lab. There, he <a href="https://web.archive.org/web/20221006212223/http:/alumni.media.mit.edu/~jorkin/papers/orkin_phd_thesis_2013.pdf">trained AIs to role-play social situations</a> using game-play logs of humans doing the same things with each other in multiplayer games.</em></p>
<p><em>Bitpart’s casts of characters are trained using a large language model and then fine-tuned in a way that means the in-game interactions are not entirely open-ended and infinite. Instead, the company uses an LLM and other tools to generate a script covering a range of possible interactions, and then a human game designer will select some. Orkin describes the process as authoring the Lego bricks of the interaction. An in-game algorithm searches out specific bricks to string them together at the appropriate time.</em></p>
<p><em>Bitpart’s approach could create some delightful in-game moments. In a restaurant, for example, you might ask a waiter for something, but the bartender might overhear and join in. Bitpart’s AI currently works with Roblox. Orkin says the company is now running trials with AAA game studios, although he won’t yet say which ones.</em></p>
</div>
<div>
<p><em>But generative AI might do more than just enhance the immersiveness of existing kinds of games. It could give rise to completely new ways to play.</em></p>
<h3><em>Making the impossible possible</em></h3>
<p><em>When I asked Frank Lantz about how AI could change gaming, he talked for 26 minutes straight. His initial reaction to generative AI had been visceral: “I was like, oh my God, this is my destiny and is what I was put on the planet for.” </em></p>
<p><em>Lantz has been in and around the cutting edge of the game industry and AI for decades but received a cult level of acclaim a few years ago when he created the Universal Paperclips game. The simple in-browser game gives the player the job of producing as many paper clips as possible. It’s a riff on the famous thought experiment by the philosopher Nick Bostrom, which imagines an AI that is given the same task and optimizes against humanity’s interest by turning all the matter in the known universe into paper clips.</em></p>
<p><em>Lantz is bursting with ideas for ways to use generative AI. One is to experience a new work of art as it is being created, with the player participating in its creation. “You’re inside of something like Lord of the Rings as it’s being written. You’re inside a piece of literature that is unfolding around you in real time,” he says. He also imagines strategy games where the players and the AI work together to reinvent what kind of game it is and what the rules are, so it is never the same twice.</em></p>
<p><em>For Orkin, LLM-powered NPCs can make games unpredictable—and that’s exciting. “It introduces a lot of open questions, like what you do when a character answers you but that sends a story in a direction that nobody planned for,” he says. </em></p>
</div>
</blockquote>
<blockquote>
<div>
<p><em>It might mean games that are unlike anything we’ve seen thus far. Gaming experiences that unspool as the characters’ relationships shift and change, as friendships start and end, could unlock entirely new narrative experiences that are less about action and more about conversation and personalities. </em></p>
<p><em>Togelius imagines new worlds built to react to the player’s own wants and needs, populated with NPCs that the player must teach or influence as the game progresses. Imagine interacting with characters whose opinions can change, whom you could persuade or motivate to act in a certain way—say, to go to battle with you. “A thoroughly generative game could be really, really good,” he says. “But you really have to change your whole expectation of what a game is.”</em></p>
<p><em>Lantz is currently working on a prototype of a game in which the premise is that you—the player—wake up dead, and the afterlife you are in is a low-rent, cheap version of a synthetic world. The game plays out like a noir in which you must explore a city full of thousands of NPCs powered by a version of ChatGPT, whom you must interact with to work out how you ended up there. </em></p>
</div>
<div>
<div>
<p><em>His early experiments gave him some eerie moments when he felt that the characters seemed to know more than they should, a sensation recognizable to people who have played with LLMs before. Even though you know they’re not alive, they can still freak you out a bit.</em></p>
<p><em>“If you run electricity through a frog’s corpse, the frog will move,” he says. “And if you run $10 million worth of computation through the internet … it moves like a frog, you know.” </em></p>
<p><em>But these early forays into generative-­­AI gaming have given him a real sense of excitement for what’s next: “I felt like, okay, this is a thread. There really is a new kind of artwork here.”</em></p>
<h3><em>If an AI NPC talks and no one is around to listen, is there a sound?</em></h3>
<p><em>AI NPCs won’t just enhance player interactions—they might interact with one another in weird ways. Red Dead Redemption 2’s NPCs each have long, detailed scripts that spell out exactly where they should go, what work they must complete, and how they’d react if anything unexpected occurred. If you want, you can follow an NPC and watch it go about its day. It’s fun, but ultimately it’s hard-coded.</em></p>
<p><em>NPCs built with generative AI could have a lot more leeway—even interacting with one another when the player isn’t there to watch. Just as people have been fooled into thinking LLMs are sentient, watching a city of generated NPCs might feel like peering over the top of a toy box that has somehow magically come alive.</em></p>
<p><em>We’re already getting a sense of what this might look like. At Stanford University, Joon Sung Park has been experimenting with AI-generated characters and watching to see how their behavior changes and gains complexity as they encounter one another. </em></p>
<p><em>Because large language models have sucked up the internet and social media, they actually contain a lot of detail about how we behave and interact, he says.</em></p>
</div>
</div>
<div>
<div>
<p><em>In Park’s recent research, he and colleagues set up a Sims-like game, called Smallville, with 25 simulated characters that had been trained using generative AI. Each was given a name and a simple biography before being set in motion. When left to interact with each other for two days, they began to exhibit humanlike conversations and behavior, including remembering each other and being able to talk about their past interactions. </em></p>
</div>
</div>
<div>
<div>
<p><em>For example, the researchers prompted one character to organize a Valentine’s Day party—and then let the simulation run. That character sent invitations around town, while other members of the community asked each other on dates to go to the party, and all turned up at the venue at the correct time. All of this was carried out through conversations, and past interactions between characters were stored in their “memories” as natural language.</em></p>
<p><em>For Park, the implications for gaming are huge. “This is exactly the sort of tech that the gaming community for their NPCs have been waiting for,” he says. </em></p>
<p><em>His research has inspired games like AI Town, an open-source interactive experience on GitHub that lets human players interact with AI NPCs in a simple top-down game. You can leave the NPCs to get along for a few days and check in on them, reading the transcripts of the interactions they had while you were away. Anyone is free to take AI Town’s code to build new NPC experiences through AI. </em></p>
</div>
</div>
<div>
<div>
<p><em>For Daniel De Freitas, cofounder of the startup Character AI, which lets users generate and interact with their own LLM-powered characters, the generative-AI revolution will allow new types of games to emerge—ones in which the NPCs don’t even need human players. </em></p>
<p><em>The player is “joining an adventure that is always happening, that the AIs are playing,” he imagines. “It’s the equivalent of joining a theme park full of actors, but unlike the actors, they truly ‘believe’ that they are in those roles.”</em></p>
<p><em>If you’re getting Westworld vibes right about now, you’re not alone. There are plenty of stories about people torturing or killing their simple Sims characters in the game for fun. Would mistreating NPCs that pass for real humans cross some sort of new ethical boundary? What if, Lantz asks, an AI NPC that appeared conscious begged for its life when you simulated torturing it?</em></p>
<p><em>It raises complex questions, he adds. “One is: What are the ethical dimensions of pretend violence? And the other is: At what point do AIs become moral agents to which harm can be done?”</em></p>
<p><em>There are other potential issues too. An immersive world that feels real, and never ends, could be dangerously addictive. Some users of AI chatbots have already reported losing hours and even days in conversation with their creations. Are there dangers that the same parasocial relationships could emerge with AI NPCs? </em></p>
</div>
</div>
<div>
<p><em>“We may need to worry about people forming unhealthy relationships with game characters at some point,” says Togelius. Until now, players have been able to differentiate pretty easily between game play and real life. But AI NPCs might change that, he says: “If at some point what we now call ‘video games’ morph into some all-encompassing virtual reality, we will probably need to worry about the effect of NPCs being too good, in some sense.”</em></p>
<h3><em>A portrait of the artist as a young bot</em></h3>
<p><em>Not everyone is convinced that never-ending open-ended conversations between the player and NPCs are what we really want for the future of games. </em></p>
<p><em>“I think we have to be cautious about connecting our imaginations with reality,” says Mike Cook, an AI researcher and game designer. “The idea of a game where you can go anywhere, talk to anyone, and do anything has always been a dream of a certain kind of player. But in practice, this freedom is often at odds with what we want from a story.”</em></p>
<p><em>In other words, having to generate a lot of the dialogue yourself might actually get kind of … well, boring. “If you can’t think of interesting or dramatic things to say, or are simply too tired or bored to do it, then you’re going to basically be reading your own very bad creative fiction,” says Cook. </em></p>
<p><em>Orkin likewise doesn’t think conversations that could go anywhere are actually what most gamers want. “I want to play a game that a bunch of very talented, creative people have really thought through and created an engaging story and world,” he says.</em></p>
<p><em>This idea of authorship is an important part of game play, agrees Togelius. “You can generate as much as you want,” he says. “But that doesn’t guarantee that anything is interesting and worth keeping. In fact, the more content you generate, the more boring it might be.”</em></p>
<p><em>Sometimes, the possibility of everything is too much to cope with. No Man’s Sky, a hugely hyped space game launched in 2016 that used algorithms to generate endless planets to explore, was seen by many players as a bit of a letdown when it finally arrived. Players quickly discovered that being able to explore a universe that never ended, with worlds that were endlessly different, actually fell a little flat. (A series of updates over subsequent years has made No Man’s Sky a little more structured, and it’s now generally well thought of.)</em></p>
<p><em>One approach might be to keep AI gaming experiences tight and focused.</em></p>
</div>
<div>
<div>
<p><em>Hilary Mason, CEO at the gaming startup Hidden Door, likes to joke that her work is “artisanal AI.” She is from Brooklyn, after all, says her colleague Chris Foster, the firm’s game director, laughing.</em></p>
<p><em>Hidden Door, which has not yet released any products, is making role-playing text adventures based on classic stories that the user can steer. It’s like Dungeons &amp; Dragons for the generative AI era. It stitches together classic tropes for certain adventure worlds, and an annotated database of thousands of words and phrases, and then uses a variety of machine-learning tools, including LLMs, to make each story unique. Players walk through a semi-­unstructured storytelling experience, free-typing into text boxes to control their character. </em></p>
<p><em>The result feels a bit like hand-annotating an AI-generated novel with Post-it notes.</em></p>
<p><em>In a demo with Mason, I got to watch as her character infiltrated a hospital and attempted to hack into the server. Each suggestion prompted the system to spin up the next part of the story, with the large language model creating new descriptions and in-game objects on the fly.</em></p>
<p><em>Each experience lasts between 20 and 40 minutes, and for Foster, it creates an “expressive canvas” that people can play with. The fixed length and the added human touch—Mason’s artisanal approach—give players “something really new and magical,” he says.</em></p>
<h3><em>There’s more to life than games</em></h3>
<p><em>Park thinks generative AI that makes NPCs feel alive in games will have other, more fundamental implications further down the line.</em></p>
<p><em>“This can, I think, also change the meaning of what games are,” he says. </em></p>
<p><em>For example, he’s excited about using generative-AI agents to simulate how real people act. He thinks AI agents could one day be used as proxies for real people to, for example, test out the likely reaction to a new economic policy. Counterfactual scenarios could be plugged in that would let policymakers run time backwards to try to see what would have happened if a different path had been taken. </em></p>
</div>
</div>
</blockquote>
<div>
<div>
<blockquote><p><em>“You want to learn that if you implement this social policy or economic policy, what is going to be the impact that it’s going to have on the target population?” he suggests. “Will there be unexpected side effects that we’re not going to be able to foresee on day one?”</em></p>
<p><em>And while Inworld is focused on adding immersion to video games, it has also worked with LG in South Korea to make characters that kids can chat with to improve their English language skills. Others are using Inworld’s tech to create interactive experiences. One of these, called Moment in Manzanar, was created to help players empathize with the Japanese-Americans the US government detained in internment camps during World War II. It allows the user to speak to a fictional character called Ichiro who talks about what it was like to be held in the Manzanar camp in California. </em></p>
<p><em>Inworld’s NPC ambitions might be exciting for gamers (my future excursions as a cowboy could be even more immersive!), but there are some who believe using AI to enhance existing games is thinking too small. Instead, we should be leaning into the weirdness of LLMs to create entirely new kinds of experiences that were never possible before, says Togelius. The shortcomings of LLMs “are not bugs—they’re features,” he says. </em></p>
<p><em>Lantz agrees. “You have to start with the reality of what these things are and what they do—this kind of latent space of possibilities that you’re surfing and exploring,” he says. “These engines already have that kind of a psychedelic quality to them. There’s something trippy about them. Unlocking that is the thing that I’m interested in.”</em></p>
<p><em>Whatever is next, we probably haven’t even imagined it yet, Lantz thinks. </em></p>
<p><em>“And maybe it’s not about a simulated world with pretend characters in it at all,” he says. “Maybe it’s something totally different. I don’t know. But I’m excited to find out.”</em></p></blockquote>
</div>
</div>
</div>
</div>
]]></content:encoded>
			<wfw:commentRss>https://www.synworlds.com/2024/06/21/how-generative-ai-could-reinvent-what-it-means-to-play/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Gym Class VR Builds On Popularity With NBA Logos and Venues</title>
		<link>https://www.synworlds.com/2023/07/29/gym-class-vr-builds-on-popularity-with-nba-logos-and-venues/</link>
		<comments>https://www.synworlds.com/2023/07/29/gym-class-vr-builds-on-popularity-with-nba-logos-and-venues/#comments</comments>
		<pubDate>Sat, 29 Jul 2023 10:46:54 +0000</pubDate>
		<dc:creator><![CDATA[Monty Simus]]></dc:creator>
				<category><![CDATA[Blog]]></category>

		<guid isPermaLink="false">http://www.synworlds.com/?p=1053</guid>
		<description><![CDATA[Via Sports Business Journal, a look at how Gym Class VR is building on its popularity with the addition of NBA logos and venues: The mid-range jumper may be a lost art in the NBA game, but I sank one from just inside the free-throw line at the TD Garden, as the PA announcer called out the basket. [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Via Sports Business Journal, a <a href="https://www.sportsbusinessjournal.com/Journal/Issues/2023/07/24/Technology/nba-gym-class-vr.aspx?" target="_blank">look</a> at how Gym Class VR is building on its popularity with the addition of NBA logos and venues:</p>
<div id="content-wrap">
<blockquote>
<div><em><img alt="" src="https://www.sportsbusinessjournal.com/-/media/Images/Daily/2023/07/27/SBJ-TECH/NBA-Bundle---Cover-Landscape.ashx?mw=768" /></em></div>
<p><em>The mid-range jumper may be a lost art in the <a>NBA</a> game, but I sank one from just inside the free-throw line at the <a>TD Garden</a>, as the PA announcer called out the basket. When my opponent lost control of the ball, I went back on offense as the crowd started chanting, “De-fense! De-fense!”</em></p>
<p><em>After firing an air ball, I collected my own rebound and swished an uncontested putback. I then forgot to play defense, distracted by conversation with Gym Class co-founder Paul Katsen in what was my first interview conducted in the metaverse.</em></p>
<p><em>I wasn’t actually in Boston, though the parquet court and <a>Celtics</a> logo looked like the genuine artifact. Instead, I was in a PR firm’s Manhattan office wearing a Meta Quest 2 and trying not to break the overhead light fixture with a wayward shooting motion.</em></p>
<p><em><a href="https://www.sportsbusinessjournal.com/Daily/Issues/2022/10/13/Technology/gym-class-vr-meta-quest-store-virtual-reality-gaming-basketball.aspx" target="_blank" rel="noopener noreferrer">Gym Class</a> has been attracting tens of thousands of daily users — and more than 100,000 on peak traffic days — making it the most popular VR sports game, with Meta reporting that more than 40% of users are between the ages of 13 and 24.</em></p>
<p><em>A recent driver has been a <a href="https://www.sportsbusinessjournal.com/Daily/Issues/2023/02/09/Technology/nba-gym-class-vr.aspx" target="_blank" rel="noopener noreferrer">partnership agreement with the NBA</a>, permitting Gym Class to license the league’s logos and venues. The NBA also took an equity stake in the game, which has also received backing from <a>Kevin Durant</a>’s 35 Ventures, <a>Andre Iguodala</a>, <a>Lonzo Ball</a>, <a>Danny Green</a> and the <a>Golden State Warriors</a> through GSW Sports Ventures. All 29 arenas are represented with attention paid to small details, such as the <a>Toronto Raptors</a>’ home court featuring the word “North” written in 25 different languages.</em></p>
<div>
<blockquote>
<p dir="ltr" lang="en"><em> What NBA team will you unlock?</em></p>
<p><em>Step onto your favorite NBA team’s court, wear NBA apparel and accessories, and hoop with friends in Gym Class VR.<a href="https://t.co/QX87fdEDrV">https://t.co/QX87fdEDrV</a> <a href="https://t.co/sWIbJq2VhB">pic.twitter.com/sWIbJq2VhB</a></em></p>
<p><em>— Gym Class &#8211; Basketball VR (@Gymclassvr) <a href="https://twitter.com/Gymclassvr/status/1659203612315377665?ref_src=twsrc%5Etfw">May 18, 2023</a></em></p></blockquote>
</div>
<p><em>“You have this ability for the first time to step onto a court, see what that&#8217;s like, and that thing hopefully is the magnet that brings you in,” Katsen said, noting that several dormant accounts were recently resurrected in order to subscribe to the NBA bundle.</em></p>
<p><em>The premise of Gym Class is as much social as it is basketball. Most users spend time on public courts, places for pickup games and hangouts. The game developers created their own avatar system for better individualized expression than the halfatars endemic to Meta games.</em></p>
<p><em>“If you grew up playing basketball, often as kids and adults, that’s where you hang out with your friends,” Katsen said. “Your parents aren&#8217;t there, it’s a very social atmosphere. And so once you have friends in these spaces, and you meet people from around the world, you end up wanting to differentiate how you look, you want to customize your space.”</em></p>
<p><em>“It’s way better than texting someone or playing Fortnite where you can&#8217;t see their body movements,” he added.</em></p>
<p><em>The NBA organically learned about Gym Class and saw an opportunity to give basketball-minded fans a chance to rep their favorite team and hoop in venues that are reasonable facsimiles of where the stars play.</em></p>
<p><em>“Our fan base is young and tech-savvy, multicultural, diverse geographically, and we&#8217;re always looking for different ways to meet them where they are,” said Adrienne O’Keefe, Head of Gaming &amp; Digital Assets at the NBA.</em></p>
<p><em>Wearing virtual NBA team apparel is a glimpse of possible future partnerships. It’s easy to envision future brand opportunities with sneaker companies and more.</em></p>
<div><em><img alt="" src="https://www.sportsbusinessjournal.com/-/media/Images/Daily/2023/07/27/SBJ-TECH/gym-class-nba-courts-in-story.ashx?mw=408" /></em>
<div><em>GYM CLASS</em></div>
</div>
<p><em>“I didn&#8217;t come from gaming, I came from consumer social,” said Katsen, a former product manager at Twitter and Coinbase prior to Gym Class. “And if you look at an interest graph of things that people talk about, or communities that pop up on Twitter, for instance, basketball is one of those central nodes that&#8217;s connected to fashion, fitness, media, celebrities.”</em></p>
<p><em>Other avenues for enhancing Gym Class include deeper basketball experiences — permitting full-court or five-on-five games — as well as expanding to other activities, some of which are already happening in a makeshift way by creative users. Katsen also acknowledged that the game has lacked a proper onboarding experience, which the team will soon add.</em></p>
<p><em>Katsen said they’ve seen users post videos to TikTok that replicate other sports, like football and dodgeball, as well as self-expression through haircuts and tattoos, meaning the creation of barbershops and tattoo parlors could be on the product roadmap.</em></p>
<p><em>“The unexpected ways that we see people use these courts are how we get our next set of product insights,” he said. “It&#8217;s super interesting because it&#8217;s an open space.”</em></p></blockquote>
</div>
<div>
<p>&nbsp;</p>
</div>
]]></content:encoded>
			<wfw:commentRss>https://www.synworlds.com/2023/07/29/gym-class-vr-builds-on-popularity-with-nba-logos-and-venues/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>K-pop: The Rise Of Virtual Girl Bands</title>
		<link>https://www.synworlds.com/2022/12/12/k-pop-the-rise-of-virtual-girl-bands/</link>
		<comments>https://www.synworlds.com/2022/12/12/k-pop-the-rise-of-virtual-girl-bands/#comments</comments>
		<pubDate>Tue, 13 Dec 2022 01:27:30 +0000</pubDate>
		<dc:creator><![CDATA[Monty Simus]]></dc:creator>
				<category><![CDATA[Blog]]></category>

		<guid isPermaLink="false">http://www.synworlds.com/?p=1051</guid>
		<description><![CDATA[Via BBC, a look at the rise of virtual K-pop girl bands: Since releasing their debut single I&#8217;m Real in 2021, K-pop girl group Eternity have racked up millions of views online. They sing, dance and interact with their fans just like any other band. In fact, there&#8217;s mainly one big difference between them and [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Via BBC, a <a title="Virtual Kpop bands" href="https://www.bbc.com/news/world-asia-63827838?at_medium=RSS&amp;at_campaign=KARANGA" target="_blank">look</a> at the rise of virtual K-pop girl bands:</p>
<div data-component="text-block">
<div>
<blockquote><p><em><b>Since releasing their debut single I&#8217;m Real in 2021, K-pop girl group Eternity have racked up millions of views online.</b></em></p></blockquote>
</div>
</div>
<blockquote>
<div data-component="text-block">
<div>
<p><em>They sing, dance and interact with their fans just like any other band.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>In fact, there&#8217;s really just one big difference between them and any other pop group you might know &#8211; all 11 members are virtual characters.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Non-humans, hyper-real avatars made with artificial intelligence.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;The business we are making with Eternity is a new business. I think it&#8217;s a new genre,&#8221; says Park Jieun, the woman behind Eternity.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;The advantage of having virtual artists is that, while K-pop stars often struggle with physical limitations, or even mental distress because they are human beings, virtual artists can be free from these.&#8221;</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>The cultural tidal wave of Korean pop has become a multibillion-dollar force over the last decade. With its catchy tunes, high-tech production and slinky dance routines, K-pop has smashed into the global mainstream, becoming one of South Korea&#8217;s most lucrative and influential exports.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>But the top K-pop stars, their legions of loyal fans, and the business-owners looking to capitalise on their success are all looking to the future.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>With the explosion of artificial intelligence (AI), deepfake and avatar technologies, these pop idols are taking their fame into a whole new dimension.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>The virtual faces of Eternity&#8217;s members were created by deep learning tech company Pulse9. Park Jieun is the organisation&#8217;s CEO.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Initially the company generated 101 fantasy faces, dividing them into four categories according to their charms: cute, sexy, innocent and intelligent.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Fans were asked to vote on their favourites. In-house designers then set to work animating the winning characters according to the preferences of the fans.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>For live chats, videos and online fan meets, the avatar faces can be projected onto anonymous singers, actors and dancers, contracted in by Pulse9.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>The technology acts like a deepfake filter, bringing the characters to life.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;Virtual characters can be perfect, but they can also be more human than humans,&#8221; Park Jieun tells BBC 100 Women.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>As deepfake technology moves into the mainstream, there have been concerns that it could be used to manipulate people&#8217;s images without permission or generate dangerous misinformation.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Women have reported having their faces put into pornographic films, while deepfakes of Russian President Vladimir Putin and President Volodymyr Zelensky of Ukraine have been shared on social media sites.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;I&#8217;m always trying to make it clear that these are fictional characters,&#8221; says the CEO.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>She says Pulse9 uses the European Union&#8217;s draft ethical AI guidelines when making their avatars.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>And Park Jieun sees other advantages in virtual bands where each avatar can be controlled by their creators.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;The scandal created by real human K-pop stars can be entertaining, but it&#8217;s also a risk to the business,&#8221; says the CEO.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>She believes she can put these new technologies to good use and minimise risks for overstressed and pressurised K-pop artists trying to keep up with the demands of the industry.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Over the past few years, K-pop has made headlines for various social issues &#8211; from dating gossip to online trolling, fat-shaming and extreme dieting of band members.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>The genre has also spurred a conversation about mental health and cyberbullying in South Korea, after the tragic deaths of young K-pop stars, which many believe had a significant impact on their following.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>In 2019, singer and actress Sulli was <a href="https://www.bbc.co.uk/news/world-asia-50051575">found dead</a> in her apartment, aged 25. She had taken a break from the entertainment industry, after reportedly &#8220;suffering physically and mentally from malicious and untrue rumours spreading about her&#8221;.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Her close friend Goo Hara, another bright K-pop artist, was <a href="https://www.bbc.co.uk/news/world-asia-50535937">also found dead</a> at her home in Seoul soon after. Before taking her own life, Goo was fighting for justice after secretly being filmed by a boyfriend, and was being viciously abused online for that.</em></p>
</div>
</div>
<div data-component="subheadline-block">
<h2 id="Threat-or-aid" tabindex="-1"><em>Threat or aid?</em></h2>
</div>
<div data-component="text-block">
<div>
<p><em>For the human stars working around the clock to train, perform and interact with their fans, having some avatar assistance in the virtual world could provide some relief.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Han Yewon, 19, is the lead vocalist of newly launched girl group mimiirose, managed by YES IM Entertainment in South Korea.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>She spent almost four years as a trainee, waiting for her opportunity to be thrust into the limelight &#8211; one of many candidates who had to undertake monthly evaluations. Those who didn&#8217;t show sufficient progress were let go.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;I worried a lot about not being able to debut,&#8221; says Yewon.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Becoming a K-pop star doesn&#8217;t happen overnight. And with new groups making their debut every year, it can be hard to stand out.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;I went to work around ten in the morning and did my vocal warm-ups for an hour. After that, I sang for two or three hours, I danced for three to four hours and worked out for another two hours&#8221;, says the vocalist.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;We practised for more than 12 hours in total. But if you aren&#8217;t good enough, you end up staying longer.&#8221;</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Yet the prospect of virtual avatars flooding the industry worries Yewon, who says that fans appreciate her authenticity.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;Because technology has improved so much lately, I&#8217;m afraid that virtual characters will take the place of human idols,&#8221; she says.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Other K-pop groups, however, have been quick to adopt new avatar technologies &#8211; and the business is forecast to grow steadily.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>The digital human and avatar market size is estimated to reach $527.58bn (£429bn) globally by 2030, according to projections by market consulting company Emergen Research.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>At least four of K-pop&#8217;s biggest entertainment companies are investing heavily in virtual elements for their stars, and five of the top-earning K-pop groups of 2022 are getting in on the trend.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Using virtual copies of themselves allows them to reach fans across time zones and language barriers &#8211; in ways that flesh-and-blood artists would never be able to do.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Girl band aespa, for instance, consists of four human singers and dancers (Karina, Winter, Giselle and Ningning) and their four virtual counterparts &#8211; known as ae-Karina, ae-Winter, ae-Giselle and ae-Ningning. The avatars can explore virtual worlds with the fans and be used across multiple platforms.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Chart-topping girl band Blackpink, meanwhile, made history with the help of their virtual twins, winning the first-ever <a href="https://www.mtvema.com/video-clips/ud0ulm/2022-mtv-emas-blackpink-wins-first-ever-best-metaverse-performance-award">MTV award for Best Metaverse Performance</a> in 2022.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>More than 15 million people from around the world tuned in to popular online gaming platform PUBGM to watch the group&#8217;s avatars perform in real time.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>During the Covid-19 pandemic Moon Sua and her K-pop group Billlie had to cancel their live performances and fan meets. Instead, the band&#8217;s management company created virtual copies of band members, to throw a party for fans in the virtual world.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;Since it was our first time doing it, we were a bit clumsy,&#8221; says Sua.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;But as time went by, we got used to it, talking with the fans while adapting to the virtual world. We had such a good time.&#8221;</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Moon Sua was impressed by how real the group&#8217;s avatars looked, but says she still prefers to meet with their followers in person.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;I don&#8217;t think it&#8217;s something threatening. Maybe we can learn skills from watching them? I don&#8217;t think they are a threat that can replace us,&#8221; says the band&#8217;s main rapper.</em></p>
<p><em>But there are also some concerns in the wider industry about ethical and copyright issues that avatar technologies can present.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;There&#8217;s a lot of unknowns when it comes to artists in the metaverse, virtual versions, icons of themselves, whatever it might be,&#8221; Jeff Benjamin, Billboard&#8217;s K-pop columnist, tells BBC 100 Women.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;It might be the fact that the artists themselves might not be in control of their image and that can create an exploitative situation.&#8221;</em></p>
</div>
</div>
<div data-component="subheadline-block">
<h2 id="Too-soon-to-know" tabindex="-1"><em>&#8216;Too soon to know&#8217;</em></h2>
</div>
<div data-component="text-block">
<div>
<p><em>For fans like Lee Jisoo, 19, who studies at an engineering college, K-pop has been a welcome distraction during times of stress. She has been a dedicated Billlie fan since the group launched in 2019.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;Their love for their fans is amazing. You cannot help but love them more,&#8221; says Jisoo.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Jisoo collects fan albums and merchandise, while also interacting with the band online and in the virtual world.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;I feel emotions through Billlie that I wouldn&#8217;t have felt if I didn&#8217;t like them,&#8221; she says.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;And I&#8217;m fangirling even more because I want to give back those feelings to Billlie. I think this is a positive thing for me.&#8221;</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>But the virtual world can also be an unwelcoming space for K-pop stars and fans alike, with regulations to prevent cyberbullying or abuse lacking or rarely being enforced. The industry has been rocked by online bullying and smear campaigns waged against successful stars.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;I get more stressed out when I see mean comments on Billlie online. Because it&#8217;s also an insult to the things I like, so I get stressed out and heartbroken,&#8221; says Jisoo.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>Child and adolescent psychiatrist Jeong Yu Kim, who works in Seoul, says it&#8217;s too soon to know how virtual technology and the rise of AI characters will affect young people.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;I see the real problem is that we&#8217;re not seeing each other in an authentic way,&#8221; Jeong Yu says.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;In virtual worlds, we could be more free and do things that you can&#8217;t do outside, you can be someone else,&#8221; explained Jeong Yu. &#8220;This K-pop industry is really responsive to what the public wants, and they would want their artists to fulfil that.&#8221;</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;Just like any entertainment industry, there are so many pressures,&#8221; says Jeff Benjamin.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;The artists are really expected to always show a good image, they&#8217;re supposed to be that shining example for their fans.&#8221;</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>But this is changing, he says, and there has been an industry-wide shift taking place in order to better serve the mental health needs of the stars and reduce the intensive workload.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;The artists themselves are also opening up about what&#8217;s going on with their mental health, and that&#8217;s actually forging a deeper connection with those fans.&#8221;</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>In the fast-changing K-pop industry, it might be too soon to say whether virtual idols are a short-term fad or the future of the music industry.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>But for now, for fans like Jisoo, the choice of who to follow is an easy one.</em></p>
</div>
</div>
<div data-component="text-block">
<div>
<p><em>&#8220;Honestly, if someone asks me, &#8216;Do you want to watch Billlie on the metaverse for 100 minutes or in real life for ten minutes?&#8217;, I&#8217;ll choose to see Billlie for ten minutes in real life.&#8221;</em></p>
</div>
</div>
</blockquote>
<div data-component="text-block">
<div>
<blockquote><p><em>She believes &#8220;people who like real idols and people who like virtual idols are completely different&#8221; &#8211; and for many like her, it would be &#8220;hard&#8221; to fall for the avatars at the expense of human K-pop stars.</em></p></blockquote>
</div>
</div>
]]></content:encoded>
			<wfw:commentRss>https://www.synworlds.com/2022/12/12/k-pop-the-rise-of-virtual-girl-bands/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>The Metaverse: And How It Will Revolutionize Everything</title>
		<link>https://www.synworlds.com/2022/07/23/the-metaverse-and-how-it-will-revolutionize-everything/</link>
		<comments>https://www.synworlds.com/2022/07/23/the-metaverse-and-how-it-will-revolutionize-everything/#comments</comments>
		<pubDate>Sat, 23 Jul 2022 12:54:24 +0000</pubDate>
		<dc:creator><![CDATA[Monty Simus]]></dc:creator>
				<category><![CDATA[Blog]]></category>

		<guid isPermaLink="false">http://www.synworlds.com/?p=1046</guid>
		<description><![CDATA[Via Time, a look at the metaverse and how it will revolutionize many aspects of our lives: The U.S. Securities and Exchange Commission reports that in the first six months of 2022, the word metaverse appeared in regulatory filings more than 1,100 times. The previous year saw 260 mentions. The preceding two decades? Fewer than a dozen [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Via Time, a <a title="Metaverse" href="https://time.com/6197849/metaverse-future-matthew-ball/" target="_blank">look</a> at the metaverse and how it will revolutionize many aspects of our lives:</p>
<blockquote><p><em>The U.S. Securities and Exchange Commission reports that in the first six months of 2022, the word metaverse appeared in regulatory filings more than 1,100 times. The previous year saw 260 mentions. The preceding two decades? Fewer than a dozen in total. It increasingly feels as though every corporate executive feels the need to mention the <a href="https://time.com/6116826/what-is-the-metaverse/">metaverse</a>—and of course, how it naturally fits the capabilities of their company better than those of their competitors. Few seem to explain what it is or exactly what they’ll build. The executive class also appears to disagree over fundamental aspects of this new platform, including the criticality of virtual reality headsets, blockchains and crypto, as well as whether it’s here now, might be soon, or is decades in the future.</em></p>
<p><em>None of which has constrained investment. Much has been written of Facebook’s <a href="https://www.cnbc.com/2021/10/28/facebook-changes-company-name-to-meta.html" target="_blank">name change</a> to “Meta” and the more than $10 billion it now loses each year on its metaverse initiatives. But another six of the largest public companies in the world—Amazon, Apple, Google, Microsoft, Nvidia, Tencent—have also been busy preparing for the metaverse. They are reorganizing internally, rewriting their job descriptions, reconstructing their product offerings, and prepping multi-billion-dollar product launches. In January, Microsoft announced the largest acquisition in Big Tech history, <a href="https://www.nytimes.com/2022/01/18/business/microsoft-activision-blizzard.html" target="_blank">paying $75 billion</a> for gaming giant Activision Blizzard, which would “provide building blocks for the metaverse.” In total, <a href="https://www.mckinsey.com/~/media/mckinsey/business%20functions/marketing%20and%20sales/our%20insights/value%20creation%20in%20the%20metaverse/Value-creation-in-the-metaverse.pdf" target="_blank">McKinsey &amp; Company estimates</a> that corporations, private equity companies, and venture capitalists made $120 billion in metaverse-related investments during the first five months of this year.</em></p>
<p><em>Nearly all of the aforementioned work has, thus far, remained invisible to the average person. Rather like the metaverse itself. There isn’t really a metaverse product we can go buy, nor can “metaverse revenue” be found on an income statement. In fact, it might seem as though the metaverse, to the extent it ever existed, has already come and gone. Crypto has <a href="https://time.com/6188396/crypto-crash-future/">crashed</a>. So too has Facebook’s market capitalization, which topped $900 billion when the company changed its name to Meta, but now sits around $445 billion. This year, video game sales have <a href="https://twitter.com/MatPiscatella/status/1547927601901027336" target="_blank">fallen</a> by nearly 10%, due in part to the end of the pandemic that forced many people inside.</em></p>
<div>
<div>
<div>
<figure><em><img class="aligncenter" alt="" src="https://api.time.com/wp-content/uploads/2022/07/TIM220808_Metaverse-Cover_final.jpg?quality=85&amp;w=2363" width="477" height="635" /></em></figure>
<div>
<div><em>Illustration by Micah Johnson for TIME</em></div>
</div>
</div>
</div>
</div>
<p><em>To many, it’s a good thing that the metaverse seems to be sputtering. The largest tech platforms have already established enormous influence over our lives, as well as the technologies and business models of the modern economy. It’s also clear that there are many problems with today’s internet; why not solve them before moving on to what Mark Zuckerberg calls “the successor” to it?</em></p>
<p><em>The answer is embedded in that very question. The metaverse, a 30-year-old term but nearly century-old idea, is forming around us. Every few decades, a platform shift occurs—such as that from mainframes to PCs and the internet, or the subsequent evolution to mobile and cloud computing. Once a new era has taken shape, it’s incredibly difficult to alter who leads it and how. But between eras, those very things usually do change. If we hope to build a better future, then we must be as aggressive about shaping it as are those who are investing to build it.</em></p>
<p><em>***</em></p>
<p><em>So what is this future? Think of the metaverse as a parallel virtual plane of existence that spans all digital technologies and will even come to control much of the physical world. This construct helps explain another common description of the metaverse as a 3D internet—and why establishing it is so hard, but also likely to be worthwhile.</em></p>
<p><em>The internet as we know it today spans nearly every country, 40,000 networks, millions of applications, over a hundred million servers, almost 2 billion websites, and tens of billions of devices. Each of these technologies can coherently, consistently exchange information, find one another “on the net,” share online account systems and files (a JPEG, an MP4, a paragraph of text), and even interconnect (think of how a news publisher links to another outlet’s report). Nearly 20% of the world economy is considered “digital,” with much of the remaining 80% running on it.</em></p>
<p><em>Though the Internet is resilient, wide-ranging, and powerful, it wasn’t built for live and interactive experiences involving a large number of participants—especially when it comes to 3-D imaging. Rather, the internet was designed primarily so that one static file (such as an email or spreadsheet) could be copied and sent from one device to another, such that it might be independently and asynchronously reviewed or modified. This is partly why, even in the age of the “Streaming Wars” and multi-trillion dollar big tech companies, simple two-person video calls can be so unreliable. (It’s a marvel that online multiplayer games work at all.) Furthermore, there’s no consensus on file formats or conventions for 3D information, no standard systems to exchange data in virtual worlds. We also lack the computing power to pull off the metaverse as we imagine it. And we will want many new devices to realize it—not just VR goggles, but things like holographic displays, ultra-sonic force-field generators, and, spooky as it sounds, devices to capture electrical signals sent across muscles.</em></p>
<p><em>We cannot know in advance exactly how important a 3D internet might be to our global economy, just as we didn’t know the value of the internet. But we do have some view to the answer. As internet connectivity and computer processors have improved, we’ve shifted from colorless text to primitive webpages and web blogs, then online profiles (like a Facebook page) and video-based social networks, emojis, and filters. The volume of content we produce online has grown from a few message board posts, emails, or blog updates a week to a constant stream of multimedia content encapsulating our lives. The next evolution to this trend seems likely to be a persistent and “living” virtual world that is not a window into our life (such as Instagram) nor a place where we communicate it (such as Gmail) but one in which we also exist—and in 3D (hence the focus on immersive VR headsets and avatars).</em></p>
<p><em>Already, nearly a hundred million people a day log onto Roblox, Minecraft, and Fortnite Creative, platforms that operate tens of millions of interconnected worlds, which support a consistent virtual identity, virtual goods, communications suites, and can be accessed from most devices. Most time on these platforms is spent on leisure—playing games, attending concerts—but we are starting to see people go further.</em></p>
<p><em>Education is a category we have long expected to be transformed by the digital era, but one that has thus far resisted it. Since 1983, the cost of higher education has grown over 1,200%; medical care and services, which ranks second for cost increases in the U.S. over that period, is up half as much. The challenge is that the real thing requires no fewer resources than it did decades ago, and much is lost when shifting to a remote computer screen. Eye contact. Peers. Hands-on experimentation. Equipment. Zoomschool, YouTube videos, and digital multiple choice are no substitute for the real thing.</em></p>
<p><em>In the metaverse, the Magic School Bus becomes possible. For decades, students learned about gravity by watching their teacher drop a feather and a hammer, and then seeing a tape of Apollo 15 commander David Scott doing the same on the moon. (Spoiler: They fall at the same speed.) Such demonstrations need not go away, but they can be supplemented by the creation of elaborate virtual Rube Goldberg machines, which students can then test under Earth-like gravity, on Mars, and even under the sulfuric rainfalls of the Venusian upper atmosphere. Instead of dissecting a frog, we can travel its circulatory system not unlike the way we drive through the Mushroom Kingdom in Mario Kart. And all of this is available irrespective of geographic location or resources of the local school board.</em></p>
<p><em>In 2021, neurosurgeons at <a href="https://www.hopkinsmedicine.org/news/articles/johns-hopkins-performs-its-first-augmented-reality-surgeries-in-patients#:~:text=The%20device%20used%20in%20the,the%20spine%20of%20a%20cadaver." target="_blank">Johns Hopkins performed</a> the hospital’s first-ever live patient surgery using an augmented-reality headset, thereby providing the surgeon with an interactive display of the patient’s internal anatomy. Dr. Timothy Witham, who performed the surgery and also directs the hospital’s Spinal Fusion Laboratory, likened it to having GPS. This frame of reference is important. We often think of the metaverse replacing something we do today—such as wearing a VR headset instead of using a smartphone or watching TV—but we don’t drive GPS instead of a car; we drive a car with GPS.</em></p>
<p><em>Earlier in 2021, Google unveiled its <a href="https://blog.google/technology/research/how-were-testing-project-starline-google/" target="_blank">Project Starline</a> device, which uses machine learning, computer vision, a dozen depth sensors and cameras and fabric-based multi-layered light field displays to create 3D “holographic video” without requiring the use of mixed reality goggles. In comparison to traditional “2D” video calling, Google says its Starline technology leads to 15% increases in eye-contact, 25-50% increases in non-verbal forms of communication (hand gestures, head nods, eyebrow movements) and 30% better memory recall of the conversation. Few of us enjoy Zoom; perhaps some of our displeasure can be alleviated by adding another dimension.</em></p>
<p><em>Infrastructure is another good example. The Hong Kong International Airport now operates a live “digital twin” of the facility, allowing airport operators to use a live 3D simulation to determine where passengers and planes should be directed. Multi-billion-dollar, multi-decade city projects are using these technologies to determine how a given building might affect traffic flows and emergency response times, or how its design will affect the temperature and sunlight of a local park on a specific day. These are mostly disconnected simulations. The next step is to bring them online—like shifting from offline Microsoft Word documents to cloud-based, collaborative ones—and turning the world into a digital development platform.</em></p>
<p><em>***</em></p>
<p><em>For society, however, exactly what the metaverse means is unclear. This gives understandable pause to some, who see billions invested in what feels like a game. But think of the metaverse as a fourth era of computing and networking—succeeding mainframes, which ran from the 1950s to 1970s; personal computers and the Internet of the 1980s to mid-2000s; and the mobile and cloud era we experience today. Each era changed who accessed computing and networking resources, when, where, why, and how. The results of these changes were profound. But they were also hard to specifically predict.</em></p>
<p><em>Even the biggest believers in the mobile internet once struggled to predict more than “more people, online more often, for more reasons.” Having a detailed technical understanding of digital networking didn’t illuminate the future, nor did deploying billions in R&amp;D. Services such as Facebook, Netflix, or Amazon’s AWS cloud computing platform are obvious in hindsight, but nothing about them—their business models, technology, design principles—was at the time. In this regard, we should recognize that confusion, conflation, and uncertainty are prerequisites for disruption.</em></p>
<p><em>Still, there are specific issues that can be cleared up. The metaverse is often misdescribed as immersive virtual reality headsets, such as the Meta Quest (née Oculus VR), or augmented reality glasses, the most famous example of which to date is Google’s infamous Glass. VR and AR devices may become a preferred way to access the metaverse, but they are not it. Consider that smartphones are not the same thing as the mobile internet. The metaverse is also not Roblox, Minecraft, Fortnite, or any other game; these are virtual worlds or platforms that are likely to be part of the metaverse, just as Facebook and Google are part of the internet. For similar reasons, think of the metaverse as singular, just as we say “the internet” not “an internet.” (To the extent we identify different internets today, this largely reflects regional regulatory differences.) Another frequent conflation is that between the metaverse and Web3, crypto, and blockchains. This trio may become an important part of realizing the metaverse’s potential, but they are merely principles and technologies. In fact, many metaverse leaders doubt there is any future for crypto.</em></p>
<p><em>The metaverse should not be thought of as an overhaul of the Internet, nor something that will replace all mobile models, devices, or software. It will produce new technologies and behaviors. But that doesn’t mean we leave what we prefer behind. I still write on a PC, and that’s likely to remain the best way to write long-form text. The majority of internet traffic today both originates and terminates on a mobile device, yet nearly all of it is transmitted on fixed-line cables and using the Internet Protocol Suite as it was designed in the 1980s.</em></p>
<p><em>The metaverse is not yet here (even if some executives will claim it is, or at least imminent). At the same time, transformations don’t experience “switch flips.” We are in the mobile era today, but the first cellular network call was in 1973, the first wireless data network was in 1991, smartphone in 1992, and so on until the iPhone in 2007. While it’s impossible to say when the development of the metaverse began, it’s clearly underway. In mid 2021, only weeks before Facebook unveiled its metaverse intentions, Tim Sweeney, CEO and founder of Fortnite maker Epic Games, tweeted prerelease code from the company’s 1998 game Unreal, adding that players “could go into portals and travel among [different worlds]…with no combat and [would stand] in a circle chatting.” These experiences didn’t take off at the time for a number of reasons—there were too few people online, tools for world-creation were too difficult to use, the devices that could support them were too costly and heavy, etc. “We’ve had metaverse aspirations for a very, very long time…” he added a few minutes later, “but only in recent years have a critical mass of working pieces started coming together rapidly.”</em></p>
<p><em>The metaverse is also not inherently dystopic. This is a common misconception, as the word “Metaverse” comes from a dystopic novel, Neal Stephenson’s Snow Crash. Snow Crash’s forebears, such as William Gibson’s Neuromancer (1984) and Philip K. Dick’s The Trouble With Bubbles (1953), similarly leave readers with the sense that the metaverse worsened the real world. Drama is at the root of most fiction; utopias are rarely the setting for popular stories. But since the 1970s, numerous “proto-metaverses” have emerged that have not been centered on subjugation or profiteering, but on collaboration and creativity. Each decade, the realism of these worlds improves, as does their functionality, value, and cultural impact.</em></p>
<p><em>***</em></p>
<p><em>The foundation of today’s internet was built over several decades through the work of government research labs, universities, and independent technologists and institutions. These mostly not-for-profit collectives typically focused on establishing open standards that would help them share information from one server to another, and in doing so make it easier to collaborate on future technologies, projects, and ideas. The benefits of this approach were far-ranging. Anyone could access or build on the internet, from any device, on any network, at low to no cost.</em></p>
<p><em>None of this prevented businesses from making a profit on the internet or building closed experiences through paywalls or proprietary tech. Rather, the “openness” of the internet enabled more companies to be built, reaching more users, and achieving greater profits, while also preventing pre-internet giants (and, crucially, telecom companies) from controlling it. Openness is also why the internet is considered to have democratized information, and why the majority of the most valuable public companies in the world today were founded (or were reborn) in the internet era.</em></p>
<p><em>It’s not difficult to imagine how different the internet would be if it had been created by multinational media conglomerates in order to sell widgets, serve ads or harvest user data for profits.</em></p>
<p><em>However, a “corporate internet” is the current expectation for the metaverse. When the internet was born, government labs and universities were effectively the only institutions with the computational talent, resources, and ambitions to build a “network of networks,” and few in the for-profit sector imagined its commercial potential. None of this is true when it comes to the metaverse. Instead, it is being pioneered and built by private businesses.</em></p>
<p><em>In 2016, long before the metaverse was seriously contemplated by corporate executives worldwide, Epic Games’ Sweeney told <a href="https://venturebeat.com/2016/12/16/epics-tim-sweeney-be-patient-the-metaverse-will-come-and-it-will-be-open/" target="_blank">VentureBeat</a> that “If one central company gains control of [the metaverse], they will become more powerful than any government and be a god on Earth.” It’s easy to find such a claim hyperbolic. But according to <a href="https://ir.citi.com/gps/x5%2BFQJT3BoHXVu9MsqVRoMdiws3RhL4yhF6Fr8us8oHaOe1W9smOy1%2B8aaAgT3SPuQVtwC5B2%2Fc%3D" target="_blank">Citi</a> and <a href="https://www.yahoo.com/video/consulting-giant-kpmg-makes-first-110000768.html?guccounter=1" target="_blank">KPMG</a>, the metaverse could generate as much as $13 trillion in revenue per year by 2030. Morgan Stanley has <a href="https://www.morganstanley.com/ideas/metaverse-investing" target="_blank">estimated</a> $8 trillion in both the U.S. and China, similar to <a href="https://www.goldmansachs.com/insights/pages/gs-research/framing-the-future-of-web-3.0-metaverse-edition/report.pdf" target="_blank">Goldman Sachs’ global projection</a> of between $2.5 and $12.5 trillion; <a href="https://www.mckinsey.com/~/media/mckinsey/business%20functions/marketing%20and%20sales/our%20insights/value%20creation%20in%20the%20metaverse/Value-creation-in-the-metaverse.pdf" target="_blank">McKinsey forecasts</a> $5 trillion worldwide. Jensen Huang, the founder and CEO of Nvidia, which ranked as one of the ten largest public companies in the world for most of the year, believes the GDP of the metaverse will eventually exceed that of “the physical world.”</em></p>
<p><em>It is here that fears of a dystopia seem fair, rather than alarmist. The idea of the metaverse means an ever-growing share of our lives, labor, leisure, time, wealth, happiness, and relationships will be spent inside virtual worlds, rather than just aided through digital devices. It will be a parallel plane of existence that sits atop our digital and physical economies, and unites both. As a result, the companies that control these virtual worlds and their virtual atoms will be more dominant than those who lead in today’s digital economy.</em></p>
<p><em>The metaverse will thus render more acute many of the hard problems of digital existence today, such as data rights, data security, misinformation and radicalization, platform power, and user happiness. The philosophies, culture, and priorities of the companies that lead in the metaverse era, therefore, will help determine whether the future is better or worse than our current moment, rather than just more virtual or remunerative.</em></p>
<p><em>As the world’s largest corporations and most ambitious start-ups pursue the metaverse, it’s essential that we—users, developers, consumers, and voters—understand we still have agency over our future and the ability to reset the status quo, but only if we act now. Yes, the metaverse can seem daunting, if not outright scary, but this moment of change is our chance to bring people together, to transform industries that have resisted disruption, and to build a more equal global economy.</em></p>
<p><em>Much about the future is uncertain, just as the internet was in the 1990s and 2000s. But we can understand how the metaverse is likely to work and why; which experiences might be available when, why, and to whom; what might go wrong and what must go right. And we can use this information to shape the future, just as Big Tech is and will. There are trillions of dollars at stake, as executives are wont to remind us—and, more importantly, our lives.</em></p></blockquote>
]]></content:encoded>
			<wfw:commentRss>https://www.synworlds.com/2022/07/23/the-metaverse-and-how-it-will-revolutionize-everything/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Esports and National Security</title>
		<link>https://www.synworlds.com/2022/03/22/esports-and-national-security/</link>
		<comments>https://www.synworlds.com/2022/03/22/esports-and-national-security/#comments</comments>
		<pubDate>Tue, 22 Mar 2022 14:10:42 +0000</pubDate>
		<dc:creator><![CDATA[Monty Simus]]></dc:creator>
				<category><![CDATA[Blog]]></category>

		<guid isPermaLink="false">http://www.synworlds.com/?p=1043</guid>
		<description><![CDATA[Via Breaking Defense, commentary on the potential for militaries to &#8220;learn from world-class players how they develop, train and practice the quick-twitch skills and reaction times needed for competitive gaming: Sport and national security often cross paths. The Duke of Wellington famously observed (albeit probably not in these exact words), that “The battle of Waterloo was [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Via Breaking Defense, <a title="esports and defense" href="https://breakingdefense.com/2022/03/esports-and-national-security-dod-should-invest-effort-now-to-reap-benefits-in-the-future/" target="_blank">commentary</a> on the potential for militaries to &#8220;learn from world-class players how they develop, train and practice the quick-twitch skills and reaction times needed for competitive gaming&#8221;:</p>
<blockquote><p><em>Sport and national security often cross paths. The Duke of Wellington famously observed (albeit probably not in these exact words), that “The battle of Waterloo was won on the playing fields of Eton.” Gen. Douglas MacArthur absolutely did say, “Upon the fields of friendly strife are sown the seeds that on other days, on other fields, will bear the fruits of victory.”</em></p>
<p><em>War and sports are about to intersect again, but these sports won’t be grounded on grassy fields. The new teams and technology of 21st century warfare will draw from the players of electronic sports, aka esports.</em></p>
<p><em>Most people engaged in esports are under 30. Most people who plan for war are over 30. The great militaries that work to bridge this generational divide could gain a major competitive advantage in future national security conflict.</em></p>
<p><em>Esports is the field of virtual, organized competitive gaming in which individuals and teams compete against each other, often for cash prizes. There are rankings, sponsorships, leagues, and a lot of advertising. High schools, colleges and universities have teams. They offer scholarships, academic courses, and research and development opportunities. NBC, ESPN, TBS, and Telemundo broadcast esports competitions.</em></p>
<p><em>In short, this is a big deal. The global industry is already <a href="https://www.prnewswire.com/news-releases/global-esports--games-streaming-estimated-to-reach-us3-5-billion-by-2025--up-from-us2-1-billion-in-2021--301410630.html">worth multi-billions of dollars</a>. The audience live-streaming esports events is closing in on a <a href="https://cyberathletiks.com/how-many-people-watch-esports/">billion people</a> worldwide.</em></p>
<p><em>Any sport that attracts that many people and that much money will also continue to attract a lot of technology. For that reason alone, the esport world deserves our attention.</em></p>
<p><em>Esport technology involves collecting, storing, processing, and moving large amounts of data fast and displaying that information in an ever more accessible and realistic virtual environment. All the key areas of technology are dual-use, applicable to both civil and military purposes. They could all be shaped by what are expected to be the most impactful emergent technologies, including <a href="https://breakingdefense.com/tag/5g-networks/">5G</a> and 6G data networks, artificial intelligence, quantum computing and virtual reality systems.</em></p>
<p><em>A lot of the technology that appears in war games could show up in real war. From training, planning, testing, exercising, and preparing for fighting, to managing armed conflict, the technology driven by esports could wind up on battlefields the way technologies developed through air and car racing influenced the engineering, design, and manufacture of cutting-edge weapons in World War II.</em></p>
<p><em>The key to leaping ahead in any sort of competition is to experiment, innovate and integrate emerging technologies faster and better than the other guy. Esports is already a testbed for what Gen. George Patton called “the musicians of Mars,” orchestrating all the capabilities a commander can bring to bear on the enemy on a battlefield. Only a competitor who wants to lose the next war would refuse to listen.</em></p>
<p><em>Esports is also a fertile recruiting ground for high tech warriors with high tech knowledge and skills. The U.S. Air Force has already figured this out. DRL, for example, runs a global, professional drone racing league where pilots control custom-built drones equipped with cameras that zip through a course at up to 90 miles an hour. This is like a training league for drone top guns. The Air Force is now a prominent sponsor of DRL, running recruitment ads on their social media platforms. As the Center for a New American Security <a href="https://www.cnas.org/publications/reports/esports-and-the-military">has observed</a>: “developing familiarity with [esports] platform norms and rules will assist the services in using such platforms effectively for recruitment.”</em></p>
<p><em>Militaries can also learn from world-class players how they develop, train and practice the quick-twitch skills and reaction times needed for competitive gaming. Along with the constructive skills, knowledge, and attributes of gaming and networked warfare, militaries will also learn about the challenges. One example is <a href="https://www.espn.com/esports/story/_/id/24427802/mental-health-issues-esports-remain-silent-very-real-threat-players">mental health</a>. Some world-class gamers play 10 or more hours a day while constantly texting and communicating with fans and other gamers. Some have demonstrated pathologies akin to PTSD, a response to the mental strain of intense gaming. On the other hand, some have also argued esports can provide <a href="https://blog.coastline.edu/how-esports-benefit-mental-health#:~:text=Another%20study%20by%20a%20group,used%20as%20a%20medical%20remedy.">mental health benefits</a>.</em></p>
<p><em>None of this is to say the gaming world is primed to become part of the US military-industrial complex. Currently, the most dominant games are Dota 2, Counter-Strike: Global Offensive, Fortnite, League of Legends, and StarCraft II. Many who play these games are Chinese; no advantage to the West there. (Notably, the Chinese government in 2019 <a href="https://archive.esportsobserver.com/china-recap-feb6-2019/">recognized esports as a profession</a>; you can be sure Beijing views the benefits the same way the West should.) And many Western players are interested in, well, just playing games. They don’t particularly like distractions and disruptions.</em></p>
<p><em>For a while, intelligence <a href="https://www.bbc.com/news/technology-58600181">agencies were worried</a> about terrorists coordinating and recruiting on video-game chats. It appears, however, that the gaming world is not a particularly fruitful space for extremist recruiting. That’s because the gaming community treats outside distractions as an annoyance. This suggests that successful intervention in the space — and successful exploitation of the esports experience — will require militaries to understand gaming culture as well as gaming technologies.</em></p>
<p><em>The emerging technologies of the 21st century will likely be as transformative as the digital revolution of the 1990s. The world of esports sits at the crossroads of many of these capabilities, with the potential to remake virtually every aspect of human life, including conflict. Connecting with the cutting edge of esports could well be crucial to winning the next fight.</em></p></blockquote>
]]></content:encoded>
			<wfw:commentRss>https://www.synworlds.com/2022/03/22/esports-and-national-security/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Virtual Workouts, Real Sweat</title>
		<link>https://www.synworlds.com/2022/02/01/virtual-workouts-real-sweat/</link>
		<comments>https://www.synworlds.com/2022/02/01/virtual-workouts-real-sweat/#comments</comments>
		<pubDate>Tue, 01 Feb 2022 18:34:08 +0000</pubDate>
		<dc:creator><![CDATA[Monty Simus]]></dc:creator>
				<category><![CDATA[Blog]]></category>

		<guid isPermaLink="false">http://www.synworlds.com/?p=1040</guid>
		<description><![CDATA[Via The Wall Street Journal, an interesting article on what it is like to exercise in the Metaverse With the Meta Quest 2: One thing they don’t tell you about working out in the metaverse: The exercise might be in virtual reality, but the sweat is real. My first VR exercise was a boxing class that’s [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Via The Wall Street Journal, an interesting <a title="Virtual Workout" href="https://www.wsj.com/articles/virtual-workouts-real-sweat-exercisingin-the-metaverse-with-the-meta-quest-2-11643501247?utm_source=SportTechie_Newsletter&amp;utm_medium=Email&amp;utm_campaign=SPRTCD220126002" target="_blank">article</a> on what it is like to exercise in the metaverse with the Meta Quest 2:</p>
<blockquote><p><em>One thing they don’t tell you about working out in the metaverse: The exercise might be in virtual reality, but the sweat is real.</em></p>
<p><em>My first VR exercise was a boxing class that’s like “Dance Dance Revolution” for your hands. I punched color-coded targets zooming at my face and tilted to avoid swirling bars. After the session, I peeled the damp Meta Quest 2 (formerly known as Oculus) off my face—which is just as disgusting as it sounds. But there’s an upside: I couldn’t believe how hard my heart was pumping and how fast those 20 minutes flew by.</em></p></blockquote>
<div>
<blockquote><p><em>VR fitness isn’t just a gimmick. The headset turns a dreaded aspect of exercise (actually doing it) into an interactive game that makes you forget it’s a workout. That is, when it doesn’t give you motion sickness.</em></p>
<p><em>Fitness experts say virtual workouts, which largely involve squats and air punches, can be an effective way to burn calories. Most can be done in your living room, as long as there’s a clear area for hopping and arm flinging. Some require cardio equipment, such as an elliptical machine. However, you won’t find virtual workouts with much yoga, weight training or other dynamic movement that could pose a safety hazard. Wearing a headset is like having an electronic blindfold strapped to your face.</em></p>
<h6><em>VR punch-dancing = great cardio</em></h6>
<p><em><a href="https://www.getsupernatural.com/" target="_blank" rel="nofollow">Supernatural</a> was the most impressive of the VR workout apps I tried. It offers four types of classes: boxing, meditation, stretching and a high-intensity, rhythmic workout called Flow. The sensors in the Quest’s handheld controllers tell the app how fast and accurately you’re punching the flying orbs. The visuals are stunning. Workouts are set in expansive real-life landscapes captured by 360-degree cameras. I never thought I’d be working out in the Sahara. (The app’s creator, Within, recently entered into <a href="https://www.oculus.com/blog/within-to-join-meta/" target="_blank" rel="nofollow">an agreement to be acquired</a> by <a href="https://www.wsj.com/market-data/quotes/FB">Facebook</a> parent <a href="https://www.wsj.com/market-data/quotes/FB">Meta Platforms</a> Inc.)</em></p>
<p><em>Like <a href="https://www.wsj.com/articles/the-best-fitness-apps-for-working-out-at-home-11641740402?mod=article_inline" target="_blank" rel="nofollow">digital fitness platforms</a> that run on phones and tablets, both VR apps require a subscription; the <a href="https://www.oculus.com/quest-2/" target="_blank" rel="nofollow">Meta Quest 2 headset</a> starts at $299. The apps are pricey: a Supernatural membership for up to four people (sharing one device) costs $19 a month, or $180 a year. FitXR costs $10 a month for five profiles. Unlike on smartphone apps, workouts on the Quest 2 can’t be downloaded for offline use. Naturally, the headset requires a Facebook account.</em></p>
<p><em>The apps’ up-and-down and side-to-side movements and arm-raises are legitimate exercises, according to Jimmy Bagley, associate professor of exercise and muscle physiology at San Francisco State University. “You use a lot of muscle mass lunging and squatting,” he said.</em></p>
<p><em>“If you are a non-exerciser or beginning exerciser and want to lose some fat or gain some muscle, then you could do most of your workouts in VR now,” Dr. Bagley said. More serious athletes should stick with a real trainer in a real gym, he added.</em></p>
<h6><em>Virtual cycling: Dangerous curves ahead</em></h6>
<p><em>A <a href="https://pubmed.ncbi.nlm.nih.gov/30325233/" target="_blank" rel="nofollow">2018 study</a> by Dr. Bagley and other researchers found that VR exercise can be just as effective at increasing your calorie burn rate as treadmill running or cycling, depending on the intensity of the game. Subjects in <a href="https://www.liebertpub.com/doi/abs/10.1089/g4h.2019.0196" target="_blank" rel="nofollow">a 2020 study</a> by researchers at the University of Minnesota Duluth said they felt less tired or exhausted biking in VR, compared with traditional bike exercise.</em></p>
<p><em>I’m already a big fan of Zwift, an <a href="https://www.wsj.com/articles/peloton-schmelotonheres-how-to-turn-your-bike-into-a-connected-workout-station-11595163600?mod=article_inline" target="_blank" rel="nofollow">indoor biking app</a> that runs on my iPad and controls my smart bike trainer’s resistance based on the virtual route’s elevation. Could a VR app be even better? </em></p>
<p><em>I tried out <a href="https://www.holodia.com/" target="_blank" rel="nofollow">Holofit</a> by Holodia ($10.75 a month), which works with Bluetooth-enabled ellipticals, rowers and stationary bikes. You can also use a traditional machine with a cadence sensor attached to your shoe. Holofit’s virtual worlds are programmed with absurd set pieces to keep you engaged. I passed an overturned bus on the Golden Gate Bridge, rode through a rock concert in the middle of Paris and flew out of a rocket ship onto Saturn’s rings. </em></p>
<p><em>This kind of cardio was a different beast in VR. The headset got steamy fast, and while biking on a straight road didn’t bother me, sharp turns and going uphill or downhill made my stomach flip. After 30 minutes, I was too nauseated to continue. This might be a unique issue: I have amblyopia, which means one eye doesn’t work as well as the other. Wearing glasses inside of the headset helps, but I still get headaches after long stints. (My colleague Joanna, who spent <a href="https://www.wsj.com/articles/metaverse-experience-facebook-microsoft-11636671113?mod=article_inline" target="_blank" rel="nofollow">24 hours in the metaverse</a>, experienced similar symptoms after prolonged headset use.) </em></p>
<p><em>A Holofit spokesman said this feeling is common for new users and eventually dissipates. The company is continuing to refine the avatar’s in-game movements, he added. Holofit was fine otherwise, but it doesn’t have Zwift’s automatic resistance-adjustment features or detailed graphics.</em></p>
<p><em>I asked Zwift Chief Executive Officer Eric Min why the app isn’t in the metaverse yet. “Zwift does indeed have a build for VR, but it’s not something that’s publicly available,” he said. For now, a VR headset is too bulky for bikers, he said. “It’s weighty, hot, fogs up and the sweat consideration is significant.” He said the company also is concerned about user safety.</em></p>
<p><em>He’s right about the sweat. The Quest 2 engulfs half your face like a snorkel mask, trapping heat inside, and needs to be snug to stay put while you move around. After each workout, the face pad on my headset was drenched.</em></p>
<p><em>Meta acknowledges that the headset needs improvement for fitness: The company plans to sell new accessories, including an exercise-optimized facial interface that can be wiped down and grips that make the controllers easier to hold while you sweat.</em></p>
<p><em>To become mainstream fitness devices, headsets will need to be lighter, with even faster processors and more precise head-tracking to reduce motion sickness. They should also be able to track leg movements, which they don’t do currently.</em></p>
<p><em>New devices are imminent. Meta says it plans to <a href="https://www.youtube.com/watch?v=Uvufun6xer8&amp;t=3335s" target="_blank" rel="nofollow">expand its Quest offerings later this year</a> with more advanced hardware. And <a href="https://www.wsj.com/market-data/quotes/AAPL">Apple</a>’s <a href="https://www.wsj.com/articles/apples-iphone-successor-comes-into-focus-11638594004?mod=article_inline" target="_blank" rel="nofollow">anticipated smart glasses</a> are expected to feature augmented-reality technology. Perhaps a virtual fitness instructor could appear overlaid in the real world.</em></p>
<p><em>For now, the experience is great for early adopters who are bored with their exercise routine—and willing to navigate a novel, complex interface. Before beginning a VR fitness journey, however, there are two things you will definitely need: a <a href="https://www.oculus.com/silicone-cover/" target="_blank" rel="nofollow">silicone face-cushion cover</a> for your Quest 2 headset, plus a <a href="https://www.insidehook.com/daily_brief/style/jeff-horwitz-headband-meet-press-msnbc" target="_blank" rel="nofollow">good old-fashioned sweatband</a>. Trust me.</em></p></blockquote>
</div>
]]></content:encoded>
			<wfw:commentRss>https://www.synworlds.com/2022/02/01/virtual-workouts-real-sweat/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Why Investors Are Paying Real Money For Virtual Land</title>
		<link>https://www.synworlds.com/2022/01/21/why-investors-are-paying-real-money-for-virtual-land/</link>
		<comments>https://www.synworlds.com/2022/01/21/why-investors-are-paying-real-money-for-virtual-land/#comments</comments>
		<pubDate>Fri, 21 Jan 2022 16:03:21 +0000</pubDate>
		<dc:creator><![CDATA[Monty Simus]]></dc:creator>
				<category><![CDATA[Blog]]></category>

		<guid isPermaLink="false">http://www.synworlds.com/?p=1037</guid>
		<description><![CDATA[Via Time, a report on why investors are paying real money for virtual land: Chris Adamo considers himself late to the game when it comes to investing in NFTs, or non-fungible tokens. He collected his first one in summer 2021. But when it comes to buying up property in the metaverse, Adamo is early. Eight [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Via Time, a <a title="Virtual Land" href="https://time.com/6140467/metaverse-real-estate/?utm_source=SportTechie_Newsletter&amp;utm_medium=Email&amp;utm_campaign=SPRTCD220115002" target="_blank">report</a> on why investors are paying real money for virtual land:</p>
<blockquote><p><em>Chris Adamo considers himself late to the game when it comes to investing in NFTs, or non-fungible tokens. He collected his first one in summer 2021. But when it comes to buying up property in the metaverse, Adamo is early. Eight months ago, the Miami-based venture capitalist and a group of associates calling themselves the MetaCollective DAO used a virtual real estate broker to buy 23 parcels in The Sandbox, a user-generated, blockchain-based virtual world, for prices starting at 1ETH (about $3,000). A nearby property <a href="https://www.benzinga.com/markets/cryptocurrency/22/01/24960365/virtual-land-just-sold-for-42-eth-in-the-sandbox" target="_blank" rel="noopener noreferrer">sold</a> for about 42ETH, or $130,000.</em></p>
<p><em>The land—pixels, really—<a href="https://twitter.com/ryanfreedman_/status/1481067990435401728?s=20" target="_blank">borders</a> the compound of the Bored Ape Yacht Club, a buzzy NFT community, and a plot owned by Adidas. They’re calling it Sandbox Hill Road, as a nod to Silicon Valley’s famous Sand Hill Road and The Sandbox, the platform where this “land” exists. Already, the parcels’ value has gone up about ten times in price, making their holdings potentially worth many millions of dollars.</em></p>
<p><em>“It’s like the New York City of The Sandbox,” Adamo says. “Like the Lower East Side or Soho right now.” Translation: it’s hip—or at least, they are invested in believing it can be.</em></p>
<p><em>If the metaverse is meant to encompass everything that exists virtually, from digital art to virtual worlds, then the real estate parcels that are being snapped up can be seen as just one type of metaversal investment, often listed as NFTs. These virtual worlds—The Sandbox, Decentraland, Cryptovoxels, Earth2, Nifty Island, Superworld, Wilder World—each offer different things to users: hyper-realistic graphics, gaming options, communities of specific types of early adopters. (<a href="https://decrypt.co/87524/someone-paid-450k-snoop-dogg-metaverse-neighbor" target="_blank">Snoop Dogg</a>, for instance, staked out a home for himself in The Sandbox; Paris Hilton has an island in Roblox.)</em></p>
<p><em>Right now, if you open The Sandbox on a web browser, all you’ll see is a flat map of brand logos scattered throughout land-shaped masses made up of colorful pixels. (Each of those pixels, or plots, is a property worth real money; in general, the concept of scarcity is a farce online, but in these worlds—as in our physical one—it is often real.) Meanwhile over on Cryptovoxels, things feel more like an early-stage video game populated by blank walking mannequins. (Sometimes, they fly.) Click on a billboard, and you’ll see details of the NFT work and artist you’re viewing, with a link to OpenSea, the NFT marketplace.</em></p>
<p><em>MetaCollective has big plans for their blank squares. For Drew Austin, managing partner at venture capital syndicate RedBeard Ventures and leader for MetaCollective, it’s all about developing this corner of the future internet into a learning center or “university” for self-education on all things web3. He envisions virtual classes, dormitory rooms that users can rent, and a full social experience. “We can recreate what an educational digital experience is, in this new digital world,” he says. None of this has been built or designed yet. But the money is real.</em></p>
<p><em>One way to think about it is like purchasing a domain name, or snagging a good social media handle. If email was our home in Web 1, and social profiles—like a Facebook or Instagram page—were the Web 2 home bases for each of us, then personal property in the form of virtual real estate may be the Web 3 version. The difference is that instead of being beholden to providers or platforms to design, regulate and control the experience, Web 3 property is intended to be something you, the end user, can build yourself. For brands, it could mean something much more interactive and active than their current digital presences. For individuals, it could mean earning income by playing games or selling products.</em></p>
<p><em>Andrew Steinwold, managing partner at metaverse-native fund Sfermion, calls it “unlimited optionality,” breaking free of the bounds of our profiles and pages. An entire industry of virtual world developers has already popped up. “One of the things that’s so exciting and fascinating about the metaverse is it’s all about cocreation, right?” says Jessica Peltz Zatulove, another MetaCollective member. “So we’re also just seeing this blending between creators and celebrities and communities.” Then again, right now this is all speculation.</em></p>
<p><em>The big winners—at the moment, at least—are the platforms and developers, who are raking in investment dollars from early buyers. Animoca Brands, the company behind The Sandbox, recently reported it is now <a href="https://www.businessinsider.in/stock-market/news/metaverse-company-animoca-brands-more-than-doubles-in-value-to-5-billion-after-soros-and-the-winklevoss-twins-back-the-sandbox-owners-latest-funding-round/articleshow/88981948.cms#:~:text=In%20a%20Tuesday%20press%20release,about%20%242.2%20billion%2C%20Bloomberg%20reported." target="_blank">worth $5 billion</a>, up from a valuation just over $2 billion in 2021. Roblox, a more established gaming universe, listed on the New York Stock Exchange in March 2021 at a valuation of <a href="https://venturebeat.com/2021/03/10/roblox-goes-public-at-42-2-billion-valuation-in-direct-listing/" target="_blank">$42 billion</a>. One research <a href="https://coinmarketcap.com/alexandria/article/revealed-how-much-metaverse-industry-could-be-worth" target="_blank">report</a> predicts virtual gaming worlds alone could be worth $400 billion by 2025, with the broader metaverse industry worth over $1 trillion.</em></p>
<p><em>Many of the early buyers of virtual real estate are doubly invested—in the platforms themselves and through personal plays like DAOs buying and developing new land—so their bullishness is ultimately self-serving. (Steinwold’s fund, for instance, has its hand in both platform investments and individual properties; Austin runs a fund that invests in five different worlds.) The technology, too, is early—Adamo is the first to admit we’re about a decade out from easy mass adoption, and Austin notes plenty of “room for improvement,” from the interface to the technically complicated process of buying property.</em></p>
<p><em>But the hunger is there for web3 investors. Virtual property prices have gone up as much <a href="https://www.cnbc.com/2022/01/12/investors-are-paying-millions-for-virtual-land-in-the-metaverse.html" target="_blank">as 500%</a> since Facebook’s much-hyped transition to Meta, according to CNBC. Already, plots in some virtual worlds are just as expensive as a real-world house.</em></p>
<p><em>Decentraland</em></p>
<p><em>Even if the casual user experience leaves much to be desired, ways to claim land and plans to develop property are expanding daily. ONE Sotheby’s just <a href="https://futureparty.com/sothebys-real-home-metaverse-nft/?utm_medium=email&amp;utm_campaign=Crypto%20cowboys%20%20Building%20East%20Hollywood&amp;utm_content=Crypto%20cowboys%20%20Building%20East%20Hollywood+Version+B+CID_7206f9ccc9ccd66db49a3693e63e226a&amp;utm_source=CampaignMonitor&amp;utm_term=real%20estate%20project" target="_blank">announced</a> it will build a virtual replica of a real-world property in The Sandbox, with ownership crossing over. Meanwhile, an anonymous buyer snapped up the property neighboring Snoop Dogg’s for a <a href="https://decrypt.co/87524/someone-paid-450k-snoop-dogg-metaverse-neighbor" target="_blank">reported $450,000</a>, betting on proximity to a famous neighbor as a value-add, just as MetaCollective is betting on Bored Ape Yacht Club. Over at Cryptovoxels, one developer is planning to build a New York Stock Exchange-style trading center and home for crypto-native companies like DeFi protocols in their centrally located Frankfurt property, a spot they purchased because it allows for larger virtual buildings. The dream is for it to become a central hub in this universe, and one with real utility as we migrate into virtual realms.</em></p>
<p><em>If this all sounds quixotic, that cynicism is warranted. Even investors are maintaining healthy skepticism about the current iterations of virtual worlds. Steinwold has raised over $100 million from investors for his funds, but he sees much of the virtual world speculation as being overvalued so far. In fact, he says, overvaluation in web3 is “true broadly,” from NFT art to crypto tokens. But that still hasn’t stopped him from investing “at the company-building level.” And it hasn’t stopped him from backing the Frankfurt NYSE plan in Cryptovoxels. “We’re kind of in the pre-Napster era. We don’t have Napster yet. We don’t have iTunes, and we don’t have Spotify,” he says, comparing today’s virtual worlds to the early-2000s music-sharing platform and its successors. “That’ll come, but it’s gonna take a pretty long time.”</em></p>
<p><em>For Zatulove, another MetaCollective investor, the draw is in the business potential. As a founding partner of Hannah Grey, an early stage venture firm that specializes in emerging platform potential for brands, Zatulove is focused on finding ways to build commerce into this new landscape. “It’s about having an office space in a prime location, but it’s really about: Can you rent this land?” she says, “Can you have a store? Can you host events? We’re in a gold rush moment with virtual real estate where people don’t know what they’re gonna build or how they’re going to build it, but they’re acquiring land in the best possible locations to create an interesting financial future.” She imagines setting up office space on the MetaCollective campus.</em></p>
<p><em>“Maybe we have a coffee shop, maybe we have a cool hangout. Maybe we have town hall meetings, maybe we host office hours for founders, maybe we just have a museum that inspires creativity, in collaboration across different builders in this space,” she says, brainstorming. Plus, the market is untapped; Zatulove cites the three billion people worldwide who are gamers, and who are used to spending time in virtual environments. Even if Sandbox hasn’t captured their attention yet, the potential is there. “The delight right now of virtual real estate is that it’s recognizing that there’s opportunity ahead that you’re setting up for yourself,” she says.</em></p>
<p><em>Adamo has kids and, like any dad, he’s thinking about their future—and about what he can pass down to them. This real estate might not be a brick-and-mortar property, but it’s still something bought with their best interests in mind. “With the rates of this year’s growth, this looks like a really multi-generational plan purchase,” he says. Maybe Sandbox Hill Road will disappear into the ether of the internet in a few years, like Limewire and Kazaa. Maybe he’s bought into a future Spotify. In the meantime, the bubble just gets bigger.</em></p></blockquote>
]]></content:encoded>
			<wfw:commentRss>https://www.synworlds.com/2022/01/21/why-investors-are-paying-real-money-for-virtual-land/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Nike Buys Virtual Sneaker Maker To Sell Digital Shoes In The Metaverse</title>
		<link>https://www.synworlds.com/2021/12/15/nike-buys-virtual-sneaker-maker-to-sell-digital-shoes-in-the-metaverse/</link>
		<comments>https://www.synworlds.com/2021/12/15/nike-buys-virtual-sneaker-maker-to-sell-digital-shoes-in-the-metaverse/#comments</comments>
		<pubDate>Wed, 15 Dec 2021 22:53:53 +0000</pubDate>
		<dc:creator><![CDATA[Monty Simus]]></dc:creator>
				<category><![CDATA[Blog]]></category>

		<guid isPermaLink="false">http://www.synworlds.com/?p=1033</guid>
		<description><![CDATA[Via CNN Business, an article on Nike&#8217;s acquisition of a virtual sneaker maker to sell digital shoes in the metaverse: Nike said on Monday it had bought virtual sneaker company RTFKT for an undisclosed sum, as the sportswear giant looks to quickly expand its footprint in the fast-growing &#8220;metaverse.&#8221; Last month, Nike (NKE) became one of the first big [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Via CNN Business, an <a title="Metaverse" href="https://www.cnn.com/2021/12/14/tech/nike-rtfkt/index.html" target="_blank">article</a> on Nike&#8217;s acquisition of a virtual sneaker maker to sell digital shoes in the metaverse:</p>
<div>
<blockquote>
<p data-paragraph-id="paragraph_488631CE-A956-95AB-62CE-B8C95D8ED01A" data-act-id="paragraph_0"><em>Nike said on Monday it had bought virtual sneaker company RTFKT for an undisclosed sum, as the sportswear giant looks to quickly expand its footprint in the fast-growing &#8220;metaverse.&#8221;</em></p>
</blockquote>
</div>
<blockquote>
<div data-paragraph-id="paragraph_AF8DEB32-5E89-B7C5-AEB7-B8CB64ED951C" data-act-id="paragraph_1"><em>Last month, Nike (<a href="https://money.cnn.com/quote/quote.html?symb=NKE&amp;source=story_quote_link">NKE</a>) became one of the first big brands to enter the shared virtual world that gained prominence after Facebook recently rebranded itself to Meta Platforms (<a href="https://money.cnn.com/quote/quote.html?symb=FB&amp;source=story_quote_link">FB</a>).</em></div>
<div data-paragraph-id="paragraph_AF8DEB32-5E89-B7C5-AEB7-B8CB64ED951C" data-act-id="paragraph_1"><em>In such blockchain-based environments, users can buy virtual land and other digital assets such as clothing for avatars in the form of a crypto asset called a non-fungible token (NFT).</em></div>
<div data-paragraph-id="paragraph_46F6D0E8-B49D-BDAB-029D-B8CB64F520F0" data-act-id="paragraph_3"><em>Formed in 2020 by Benoit Pagotto, Chris Le and Steven Vasilev, RTFKT also makes NFT collectibles and memes, according to its website.</em></div>
<div data-paragraph-id="paragraph_4CF03447-D6F5-C6D5-A7BB-B8CB64F75080" data-act-id="paragraph_4"><em>&#8220;This acquisition is another step that accelerates Nike&#8217;s digital transformation and allows us to serve athletes and creators at the intersection of sport, creativity, gaming and culture,&#8221; Nike Chief Executive Officer John Donahoe said in a statement.</em></div>
</blockquote>
]]></content:encoded>
			<wfw:commentRss>https://www.synworlds.com/2021/12/15/nike-buys-virtual-sneaker-maker-to-sell-digital-shoes-in-the-metaverse/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>What Will Art Look Like in the Metaverse?</title>
		<link>https://www.synworlds.com/2021/12/06/what-will-art-look-like-in-the-metaverse/</link>
		<comments>https://www.synworlds.com/2021/12/06/what-will-art-look-like-in-the-metaverse/#comments</comments>
		<pubDate>Mon, 06 Dec 2021 14:23:33 +0000</pubDate>
		<dc:creator><![CDATA[Monty Simus]]></dc:creator>
				<category><![CDATA[Blog]]></category>

		<guid isPermaLink="false">http://www.synworlds.com/?p=1029</guid>
		<description><![CDATA[Via the New York Times, an interesting look at what art will look like in the future: In the opening pages of Ben Lerner’s debut novel, “Leaving the Atocha Station,” his narrator goes to Madrid’s Prado museum and observes a stranger breaking into sobs in front of Rogier van der Weyden’s “Descent From the Cross,” a votive [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>Via the New York Times, an interesting <a title="Art in Metaverse" href="https://www.nytimes.com/2021/12/01/magazine/mark-zuckerberg-meta-art.html" target="_blank">look</a> at what art will look like in the future:</p>
<div>
<div>
<blockquote><p><em>In the opening pages of Ben Lerner’s debut novel, <a title="" href="https://www.nytimes.com/2012/03/11/books/review/what-leaving-the-atocha-station-says-about-america.html">“Leaving the Atocha Station,”</a> his narrator goes to Madrid’s Prado museum and observes a stranger breaking into sobs in front of Rogier van der Weyden’s “Descent From the Cross,” a votive portrait attributed to Paolo da San Leocadio, and <a title="" href="https://www.museodelprado.es/en/the-collection/art-work/the-garden-of-earthly-delights-triptych/02388242-6d6a-4e9e-a992-e1311eab3609" target="_blank" rel="noopener noreferrer">Hieronymus Bosch’s “Garden of Earthly Delights.”</a> He watches the man until he leaves and follows him out into the sunshine. The narrator has worried for a long time that he is incapable of having such a profound experience of art. Many of us, I imagine, have experienced the failure to be moved by a painting as we’d hoped to be. I thought of this passage as I watched the first major ad for Meta, Facebook’s rebranding as a metaverse company, which also takes place in a museum. But here the art is moving — quite literally.</em></p>
<p><em>The video begins with four young people looking at Henri Rousseau’s “Fight Between a Tiger and a Buffalo,” which hangs in the Cleveland Museum of Art. As they peer into the frame, the tiger’s eyes flicker and the whole painting comes to life and opens up into a three-dimensional animated jungle. The tiger and the buffalo, the toucans and monkeys and the mandrills in the trees, all start dancing to an old rave tune; the kids bop along, too. Fruit trees grow around them in the gallery. Above the rainforest canopy, in the distance, stands a mysterious hexagonal portal, and beyond that, in the misty red hills, the towering skyline of a great tropical city. It’s a scene that suggests Facebook may be returning to Silicon Valley’s countercultural origins: a psychedelic dream of a global community sharing in collective hallucinations.</em></p></blockquote>
</div>
</div>
<blockquote><p><em><a title="" href="https://www.youtube.com/watch?v=XOn2CZWnxxY&amp;ab_channel=Reuters" target="_blank" rel="noopener noreferrer">The video keynote that Meta released</a> to explain itself to investors also features art prominently, opening with a demo in which a couple of Mark Zuckerberg’s co-workers find a piece of augmented-reality street art hidden on a wall in SoHo. It’s brought to life by 3-D animation and ported over from Lower Manhattan to virtual reality, growing into a nightmarish Cthulhu-like blob that surrounds their avatars. (Zuckerberg: “That’s awesome!”) For some reason, the company wants us to think about art when we think about its new product. Perhaps it’s because they want us to see it as a platform for creative self-expression — or perhaps just because fine art provides a more edifying context than video games or working from home.</em></p></blockquote>
<blockquote><p><em>This apparent stance toward art is at once moronic and apt; moronic because it reduces art to a mere gewgaw, apt because other entrepreneurs have already embraced this view. The animated Rousseau assumes the popular logic of the <a title="" href="https://www.nytimes.com/2021/03/07/arts/design/van-gogh-immersive-experiences.html">“Van Gogh immersive experience,”</a> in which the dour old Dutchman’s paintings of starry nights and ominous wheat fields are projected onto the walls and floors to create an enveloping spectacle, attraction and selfie backdrop. Both presume that audiences can enjoy artworks only when they’re in the process of being ruined. And in the case of the Van Gogh experience, the market has proved them right: There are currently at least five different competing Van Gogh experiences touring the country. The copy has surpassed the original. This has remained a consistent theme throughout the history of Facebook, which offers a pale simulation of friendship and community in place of the real thing. Meta promises to lead us farther into the forest of illusions.</em></p>
<p><em>And yet the return to an art of dreaming and escapism is an enticing proposition. Rousseau, painting jungles in his studio in Paris beginning in his middle age, was escaping his own humdrum life as a retired municipal toll-service employee. He is said to have often told stories of his youthful adventures and how his tour of duty in Napoleon III’s intervention in Mexico had inspired his jungle pictures; but these were all lies. In reality he played in an infantry band and never once left France.</em></p>
<p><em>Rousseau found his actual inspiration in travel books and regular visits to the Jardin des Plantes, of which he once told an art critic, “When I go into the glass houses and I see the strange plants of exotic lands, it seems to me that I enter into a dream.” It was this uncanny dream space, where fierce animals have the quality of children’s book illustrations and bananas grow upside-down on the trees, that he conjured in his paintings; and it was the childlike originality and naïve purity of these depictions that his fellow artists would come to admire.</em></p>
<p><em>In late-19th- and early-20th-century Paris, Rousseau and his contemporaries (Paul Gauguin, Georges Seurat, Pablo Picasso, etc.) were busy inventing bohemian modernity, creating new ways of living and of seeing the world. In our century, that visionary role appears to have passed from the artists to the engineers, to Zuckerberg and his ilk. Who else tries to invent new universes? Who dares spin grand utopian fantasies? Artists don’t anymore. It’s Silicon Valley’s Promethean founders who try — and routinely fall short.</em></p></blockquote>
<blockquote><p><em>Meta’s offering is not an appealing one: It’s somehow both childish and cynical. But a vision of the future ideated by a creative agency for a megacorporation was always going to be dreadful. The problem is not that kids today cannot appreciate a Rousseau masterpiece, but that their elders, my generation, are unsure of how to come up with anything that might compare to it — we have forgotten how to imagine a different world entirely.</em></p>
<p><em>An important thing to remember about the metaverse is that none of this has been made, neither the jungle nor the technology to display it. You can’t really go to a museum and do this. It’s just an idea, a whisper on the wind. An ad about nothing. It’s Meta. The more times I watch the ad, and the keynote where Zuckerberg explains his vision in detail, the more it seems that he has no idea what he’s making or selling. That’s bad for a company but not for artists, who flourish with an open brief. Indeed, much of the keynote is a call for thousands of “creators” to help build a functioning metaverse and a promise that they’ll be paid to do so.</em></p>
<p><em>Contemporary art is currently dominated by painting and sculpture, by traditional materials and old ways of making. Companies outside of the art world, meanwhile, are using digital technology to remake timeless masterpieces as evanescent gimcracks, as projected tourist attractions and animations. But few artists are doing what Rousseau and his peers did: accepting the realities imposed by new technologies — in their case, photography — and breaking the old ways apart to create something new. An artist with the spirit of Rousseau might appreciate the potential of this new medium and want to make art for the metaverse and the wider public. Now, as in his day, he wouldn’t be remaking old works from the past but coming up with fantastical scenes from his dreams: sights he’d never witnessed in his own life, rendered in a style that nobody had ever seen. Today it feels possible, perhaps for the first time this century, to invent completely new aesthetics — so long as someone takes the reins from the technologists.</em></p></blockquote>
]]></content:encoded>
			<wfw:commentRss>https://www.synworlds.com/2021/12/06/what-will-art-look-like-in-the-metaverse/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
