The Simulation by Fable said it is open sourcing its new tool for creating Westworld-like simulations that feature AI characters.
San Francisco-based The Simulation by Fable (previously known as Fable) has a team that includes two-time Emmy winners and veterans of both Pixar and Oculus, and it has been working at the intersection of games and AI for years.
Today, it revealed its AI framework, SAGA (Skill to Action Generation for Agents), now available as an open source project. The technology aims to help developers craft immersive, Westworld-like simulations of the future where AI characters, rather than player characters, are the main actors within games. A demo of the tech is set in an 1800s Wild West town called Thistle Gulch.
“We’re releasing SAGA this week, which is a tool to help people bring agents to life within simulations,” said Edward Saatchi, CEO of The Simulation by Fable, in an interview with GamesBeat. “And we’re going to be open sourcing it and allowing a community to start to grow up around simulations. We want to enable them to bring their simulations to life with intelligent agents that can plan and take actions.”
The launch of SAGA allows developers to create highly detailed worlds, contextualized within historical events or fictitious narratives, reflecting real-world occurrences or speculative scenarios, said Saatchi.
Frank Carey, the company’s CTO, expressed excitement about shifting from rudimentary AI applications to more narrative-driven embodied agents living within creators’ simulated environments.
“The promise of Westworld was narrative agents living in a simulation until humans entered. I think the moral is that humans treated it like a theme park for their own personal ambitions and ruined what could have been – a powerful simulation of agents,” Carey said.
Pete Billington, cofounder of The Simulation by Fable, said in a statement, “The technology we’re releasing today, SAGA, will allow people to build highly detailed worlds that exist in the context of historical events and well-defined fictional worlds, like the Cuban Missile Crisis, the end of the Roman Empire, the first time humans inhabit Mars, or the discovery that dragons really did exist.”
Although SAGA is currently used in smaller villages and towns with tens of agents, Saatchi aspires to evolve it to power simulations housing a million agents. The objective is to enable these agents to make complex, unpredictable decisions, with their choices rated through Reinforcement Learning from Human Feedback (RLHF).
Demo
In a demo, Saatchi showed how SAGA works within the story of Thistle Gulch, a 3D simulation portraying a Wild West town.
“You have genuinely intelligent agents who are not reacting based on some backstory, but they’re actually making intelligent choices in the moment,” Saatchi said.
In Inworld AI’s recent demo, you as a human play a detective interrogating AI characters. In this case, the AI characters are the investigators as well.
Two AI agents, Sheriff Cooper and Blackjack Kane, face off against each other. They have conflicting goals after a murder occurred in the town. A local native is dead. The local tribe’s chief has threatened to take things into his own hands if Sheriff Cooper doesn’t uncover the murderer quickly.
Blackjack Kane, the saloon owner and local gang leader, is also unsure who the murderer is, but he doesn’t want any investigation to blow back on him and to ruin his plans to rob a stagecoach in a few weeks. Cooper wants to catch the killer. Their interests conflict, and they make their plans to achieve their goals in real time. As a player, I did have a little control in deciding which tasks the AI should do first.
Either the Sheriff moves the native’s body into the jail to preserve the evidence, or Blackjack and his goons find something incriminating and plant it in the scapegoat’s room to frame them for the murder.
In most simulations of the murder, Dan Deadshot is chosen as the killer by the AI. Dan’s ally, Blackjack Kane, decides to help him cover up the crime and stop Sheriff Cooper from finding the murderer. All agents pursue their own ends while also cooperating with other agents. The Sheriff talks to the townsfolk looking for leads and cooperation, while Blackjack conspires with his gang to plant evidence and spread rumors to throw him off the trail.
“The sheriff has to try to figure out what’s happened by interrogating people and finding the people responsible, but the killer and his ally in town have to try to figure out how to cover it up and put the sheriff on the wrong track,” Saatchi said. “We’ve internally done it many times. You get different outcomes. Sometimes the sheriff arrests the right person, sometimes there’s a shootout, sometimes the sheriff is drawn to arrest the wrong person because fake evidence has been planted by the other side.”
The AI characters have memories, and they can recall conversations so the sheriff can figure out if someone has lied to him, Carey said.
In my demo, the town had 17 characters in it. On the first day, the characters go about their business. Some have jobs to do, and they have needs, much the same way characters in The Sims do. They also have to sleep, or they get tired. The Simulation by Fable gives the characters their own backstories, but it doesn’t control them directly once the game starts, Carey said. When a developer inserts a plot point, like the discovery of a murder weapon, they control that story beat, but not how the characters react to it.
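As a rough, hypothetical illustration of that kind of needs-driven tick and developer-inserted plot point, the idea might be sketched like this; the class and field names are invented for this example and are not SAGA’s actual API.

```python
# Hypothetical sketch of a Sims-style, needs-driven character tick plus a
# developer-inserted plot point. The names here are invented for illustration
# and do not reflect SAGA's actual API.
from dataclasses import dataclass, field

@dataclass
class Character:
    name: str
    backstory: str
    job: str | None = None
    energy: float = 1.0                       # 1.0 = fully rested, 0.0 = exhausted
    memories: list[str] = field(default_factory=list)

    def tick(self, hour: int) -> str:
        """Advance the character by one in-game hour and pick a routine action."""
        self.energy -= 0.04                   # characters tire as the day goes on
        if self.energy < 0.2:                 # unmet need: sleep to recover
            self.energy = min(1.0, self.energy + 0.3)
            return f"{self.name} sleeps to recover energy"
        if self.job and 8 <= hour < 18:
            return f"{self.name} works as {self.job}"
        return f"{self.name} goes about their business in town"

def insert_plot_point(characters: list[Character], event: str) -> None:
    """A plot point only adds the event to each character's memories;
    how they react to it is left to the agents themselves."""
    for c in characters:
        c.memories.append(event)

town = [Character("Sheriff Cooper", "lawman of Thistle Gulch", job="sheriff"),
        Character("Blackjack Kane", "saloon owner and gang leader", job="saloon owner")]
insert_plot_point(town, "A murder weapon was found near the saloon")
for hour in range(6, 24):                     # simulate the first day
    for c in town:
        print(hour, c.tick(hour))
```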
“The purpose of the demo is that we want it to wrap up by the end of the first day and so we’ll need to settle on someone to arrest,” Carey said. “Sometimes you get the right person or it’s a character who has been framed.”
The Thistle Gulch demo video below shows Blackjack cooperating with his gang to draw the sheriff’s attention away from his criminal schemes. While the dialogue is generated by the simulation, whom to talk to, as well as the topic and goal of each conversation, are all generated via SAGA.
“Using our simulation tool, which is kind of a creator tool, we allow people to go in and create a world, like a Minecraft or a builder type tool, set out the world, set out the characters, and then let it play out as a simulation,” Carey said. “Playing it out with SAGA, you get this really emergent behavior with all the characters. But it’s really up to you to decide how much control you want. You set up the dominoes.”
The framework facilitates agent decision-making, generating varied actions that humans can rate, or even intervene on if a different simulation path is desired. Because the agents are embodied, they can manipulate their environment, creating dynamically evolving narratives.
Moreover, SAGA’s simulations have yielded unexpected outcomes, including shifts in agent loyalty and intricate inter-agent dynamics. The ability to score and fine-tune outcomes keeps the simulation experience evolving and nuanced.
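In practice, that rating and intervention step could be as simple as collecting human scores on the candidate actions the framework proposes and, when desired, forcing a different pick. Here is a minimal, hypothetical sketch of that idea; the class and function names are illustrative and do not reflect SAGA’s actual interface.

```python
# Hypothetical sketch of the human rating / intervention step described above:
# the framework proposes several candidate actions, a human can rate them, and
# a human can also force a different path. Illustrative only.
from dataclasses import dataclass

@dataclass
class CandidateAction:
    description: str
    model_score: float                  # score assigned by the framework
    human_rating: float | None = None   # optional 0-1 rating from a person

def choose_action(candidates: list[CandidateAction],
                  override_index: int | None = None) -> CandidateAction:
    """Pick the next action: a human override wins outright, otherwise blend
    the framework's score with any human ratings that were collected."""
    if override_index is not None:
        return candidates[override_index]          # intervention: force a path

    def blended(c: CandidateAction) -> float:
        if c.human_rating is None:
            return c.model_score
        return 0.5 * c.model_score + 0.5 * c.human_rating

    return max(candidates, key=blended)

options = [
    CandidateAction("Interrogate the saloon patrons", 0.7),
    CandidateAction("Search Blackjack's office", 0.6, human_rating=0.9),
    CandidateAction("Arrest the first suspect", 0.8, human_rating=0.1),
]
print(choose_action(options).description)                      # blended pick
print(choose_action(options, override_index=2).description)    # forced path
```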
Backed by research
SAGA’s development was influenced by recent academic work on embodied agents, acknowledging the pioneering research of individuals like Joon Park of Stanford University and Jim Fan of Nvidia and Stanford.
Academic interest in embodied agents in simulations peaked this year when Park of Stanford created a simple simulation in the browser using a 2D-grid game engine backed by a Python webserver. His paper drove millions of views and has won several awards.
Shortly after Park’s paper, Voyager by Fan of Nvidia was released. It’s a research project focused on “life-long learning” and creating new skills via code generation and refinement while leveling an agent up in Minecraft and learning to craft new things along the way.
SAGA is inspired by the work of Park and Fan but also takes a different approach.
With SAGA, agents first provide contextual metadata about themselves and their world via a companion simulation: who they are, what they know, what “Skills” they have, and what their goals are. Then, when an agent is deciding what to do next, SAGA generates a set of “Actions” that best serve the agent’s goals in that moment.
These action options are then scored and returned to the simulation in order to direct the agent. This process repeats each time the agent is deciding its next action and can be scaled to multiple agents running simultaneously.
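In rough terms, that generate-score-return loop might look something like the following sketch. The types and the generate/score functions here are placeholders standing in for what is, in practice, an LLM-backed service; they are not SAGA’s real interface.

```python
# Minimal sketch of the generate-score-return loop described above.
# The AgentContext/Action types and the generate/score functions are
# placeholders for an LLM-backed service; they are not SAGA's real interface.
from dataclasses import dataclass

@dataclass
class AgentContext:
    identity: str            # who the agent is
    knowledge: list[str]     # what the agent knows
    skills: list[str]        # the "Skills" available to it
    goals: list[str]         # what it is trying to achieve

@dataclass
class Action:
    skill: str
    description: str
    score: float = 0.0

def generate_actions(ctx: AgentContext) -> list[Action]:
    """Generative step: propose actions that apply the agent's skills to its
    current goals (in practice, a language-model call)."""
    return [Action(skill=s, description=f"Use {s} toward '{ctx.goals[0]}'")
            for s in ctx.skills]

def score_actions(ctx: AgentContext, actions: list[Action]) -> list[Action]:
    """Scoring step: rank the options, here by a trivial goal-overlap count.
    Human ratings or RLHF-style feedback could adjust these scores."""
    for a in actions:
        a.score = sum(goal.lower() in a.description.lower() for goal in ctx.goals)
    return sorted(actions, key=lambda a: a.score, reverse=True)

def next_action(ctx: AgentContext) -> Action:
    """One pass of the loop: generate options, score them, and return the best
    one to the simulation, which calls this again at each decision point."""
    return score_actions(ctx, generate_actions(ctx))[0]

sheriff = AgentContext(
    identity="Sheriff Cooper",
    knowledge=["A local native was murdered", "The tribe's chief demands justice"],
    skills=["interrogate", "search", "arrest"],
    goals=["find the murderer"],
)
print(next_action(sheriff))
```

The same loop runs independently for each agent, which is how the process scales to multiple agents acting simultaneously.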
The Simulation by Fable aims to stimulate a community centered around simulations, envisioning applications in economic simulations, historical reenactments, creation games, and even AI-powered episodic TV shows, challenging the boundary between human and AI-driven creativity.
The open-sourced SAGA aims to spur innovation and experimentation within the developer community, pushing the boundaries of AI-driven simulations. The company plans to gather feedback from developers before Christmas, welcoming exploration and input from diverse perspectives.
Because SAGA is open source, developers can use it with their own simulations.
“We’re trying to kind of kickstart a community around not what we think is a bit boring, which is chatbots just being dumped into NPCs, but actually intelligent worlds filled with intelligent agents and no humans allowed,” Saatchi said.
Saatchi outlined the company’s long-term goal: training embodied AIs within simulations to foster an intelligent community capable of venturing beyond simulated realms and onto the internet as peers. The aim is to create a living simulation with a million agents, and SAGA will grow in complexity over time.
The Simulation by Fable is releasing SAGA as an open-source tool for developers because it believes simulations are still in their infancy, and it wants to help kickstart a community around them, as it has done around Virtual Beings. The next Virtual Beings Summit, in 2024, will have simulations as its theme, and the company hopes to show off new work created with SAGA at the event.
Saatchi said, “Vast virtual simulations with embodied agents are the future for AI research, as evidenced by the brilliant work of Dr. Jim Fan and Joon Park of Stanford – 3D simulations are clearly in the realm of gaming, and we believe a company with the right mix of gaming and AI talent could be the next OpenAI. Game developers should not see themselves as mere clients and grateful recipients of the work of OpenAI and Anthropic but as the drivers of the next stage of AI – realistic, embodied agents in complex, emergent 24/7 simulations.”
Simulations built around SAGA could power varied applications. Saatchi lays out some of the ideas below, including one that The Simulation by Fable itself is pursuing: having simulations power weekly episodic stories using the Showrunner AI it debuted earlier this year.
Saatchi believes the simulation could test economic policies or stage a historical reenactment with a twist, like the 13 days of the Cuban Missile Crisis. Developers could also use the tech to build a game where players devote thousands of hours to building up simulations, with the goal of reaching god-like intelligence on par with human intelligence.
The company expects to link the SAGA tool to Showrunner, another Fable tool which creates TV shows based on prompts, so that it can create shows based on the gameplay and make a series of episodes about what is going on in the game.
Big ambitions
Saatchi said he doesn’t want AI experts to just drop a large language model into a tavern keeper so you can talk to the tavern keeper for 30 hours. That’s not going to make RPGs any better, and it doesn’t push us closer to AGI, or artificial general intelligence, where AI is as smart as humans are.
“We’re really excited to see what the community does because, just like a lot of these very large quests toward AGI, you essentially are signing up to create a template for the entire world forever. And so we need help from the wider community to push forward because we ultimately want every skill that a human has to be a skill that an agent has in here. Every thought that a human has to be a thought that an agent can have here. Every action that a human can take to be an action that an agent can take here because we are ultimately trying to get to AGI.”
That is, Saatchi wants to get truly intelligent emergent life coming from the simulation. It’s not unlike the Ryan Reynolds movie Free Guy.
“I’m impressed by the 17 characters that are in this Westworld-like or Deadwood world,” he said. “They’re behaving in a complex way. But this is just one day. A small number of characters, a small number of skills. But to get up to hundreds of skills, thousands of actions, hundreds of thousands of characters, all with their own agendas and motivations. That gets out of our control very, very quickly. And it means that they start to exhibit behavior that we certainly didn’t program and couldn’t even have predicted.”