Hasbro CEO Chris Cocks Is Talking About AI in D&D Again



Chris Cocks, the CEO of Hasbro, is talking about the use of AI in Dungeons & Dragons again. In a recent interview with Semafor, Cocks once again raised the potential use of AI in D&D and other Hasbro brands. He described himself as an "AI bull" and floated a subscription service that uses AI to enrich D&D campaigns as one way to integrate the technology. The relevant section of Semafor's interview is below:

Smartphone screens are not the toy industry’s only technology challenge. Cocks uses artificial intelligence tools to generate storylines, art, and voices for his D&D characters and hails AI as “a great leveler for user-generated content.”

Current AI platforms are failing to reward creators for their work, “but I think that’s solvable,” he says, describing himself as “an AI bull” who believes the technology will extend the reach of Hasbro’s brands. That could include subscription services letting other Dungeon Masters enrich their D&D campaigns, or offerings to let parents customize Peppa Pig animations. “It’s supercharging fandom,” he says, “and I think that’s just net good for the brand.”


The D&D design team and others involved with D&D at Wizards of the Coast have repeatedly stood by a statement posted back in 2023 declaring that D&D is made by humans, for humans. The D&D team's full official stance on AI can be found below.

For 50 years, D&D has been built on the innovation, ingenuity, and hard work of talented people who sculpt a beautiful, creative game. That isn't changing. Our internal guidelines remain the same with regards to artificial intelligence tools: We require artists, writers, and creatives contributing to the D&D TTRPG to refrain from using AI generative tools to create final D&D products. We work with some of the most talented artists and creatives in the world, and we believe those people are what makes D&D great.
 


Christian Hoffer


LLMs aren't so much snake oil as they are real but limited (and unethically sourced and horribly wasteful) products with legitimate applications, being sold by snake oil salespeople and deluded technocultists who see more humanity in Siri than in their neighbors.
Okay, but the thing about snake oil is that there was one particular snake oil with one particular set of useful properties. It became the iconic "scam" product because other types of it were sold as a panacea.

LLMs, like snake oil, are a thing that exists. They do not do what the people pitching them to the public, and the people advocating for them in this thread, claim they do. They literally just pattern match and attempt to replicate a pattern in response to a prompt. In order to do this, they use unfathomable amounts of stolen data, electricity and water (for cooling).

They use so much electricity that companies were looking into building nuclear power plants just to supply their server farms, they use so much data that OpenAI has essentially said if they don't get to just steal anything they like "it's over", and they use so much water that Forbes (not exactly a bastion of anti-capitalist sentiment) is concerned.

OpenAI is marketing it primarily as a research tool and a substitute for knowledge workers:

[screenshot of OpenAI marketing material]

Because if they marketed it as what it actually does, it'd be blatantly obvious it isn't worth any of this, and it would join Tay in the Museum of Failure. As I've pointed out constantly, people who defend generative AI invariably do so by attributing to it the properties of other advanced computing and machine learning.

Famously, Elon Musk claimed his copy-paste of ChatGPT could diagnose medical conditions. It couldn't, of course, but remember Theranos.

If people don't want to accept the reality when it comes to LLMs in their D&D games, they're not going to want to accept it once there's real stakes on the line.
 

