September 20, 2024

The Terrifying Potential of OpenAI’s New Text-to-Video Tool, Sora

3 min read

OpenAI, the company behind the popular AI chatbot ChatGPT, has unveiled another generative artificial intelligence (AI) tool, Sora. The new application turns written prompts into original videos, a capability that has one AI expert “terrified.”

The rapid evolution of generative AI tools such as Sora carries significant implications for many industries, and it raises particular concerns about disinformation ahead of the 2024 presidential election. Oren Etzioni, founder of TrueMedia.org, an organization dedicated to fighting AI-based disinformation in political campaigns, expressed his concerns to CBS MoneyWatch.

“Generative AI tools are evolving so rapidly, and we have an election coming up that could have serious consequences,” Etzioni stated. “As we’re trying to sort this out, we’re coming up against one of the most consequential elections in history.”

OpenAI shared a teaser of its text-to-video model, Sora, explaining that it can create sophisticated videos up to 60 seconds long featuring highly detailed scenes, complex camera motion, and multiple characters with vibrant emotions. The tool is not yet publicly available; OpenAI has restricted its use to “red teamers” and a group of visual artists, designers, and filmmakers who will test the product and give the company feedback before a wider release.

Safety experts will evaluate the tool to understand how it could be used to create misinformation and hateful content, OpenAI stated. Even so, Etzioni believes its capabilities could be used to generate deepfake videos, with serious implications for democracy and the upcoming election.

“We’re trying to build this airplane as we’re flying it, and it’s going to land in November if not before—and we don’t have the Federal Aviation Administration, we don’t have the history, and we don’t have the tools in place to do this,” Etzioni said.

The potential for deepfake videos to be created using Sora or similar technology from OpenAI competitors is a significant concern. Dr. Andrew Newell, chief scientific officer for identity verification firm iProov, told CBS MoneyWatch that this puts the onus on organizations, such as banks, to develop their own AI-based tools to protect consumers against potential threats.

Banks that rely on video-based authentication as a security measure are among the most exposed, according to Newell. “Voice actors or people who make short videos for video games, education purposes, or ads will be the most immediately affected,” he said. “For professions like marketing or creative, multimodal models could be a game changer and could create significant cost savings for film and television makers, and may contribute to the proliferation of AI-generated content rather than using actors.”

Because Sora makes it easier for anyone, even people without artistic ability, to create visual content, it could let users develop choose-your-own-adventure-style media. A major player like Netflix could even enable end users to generate their own content from prompts.

Megan Cerullo, a New York-based reporter for CBS MoneyWatch, first reported this story in an article published on February 16, 2024. The original text is copyright CBS Interactive Inc.; all rights reserved.

OpenAI’s new text-to-video tool, Sora, could reshape content creation across many industries. At the same time, its ability to produce convincing deepfake videos poses real risks for democracy and the upcoming election. Organizations will need to build defenses against these threats, and safety experts must continue evaluating the tool to understand its limitations and potential for misuse.
