Scarlett Johansson isn’t just the powerhouse actress you know from the silver screen; she’s turning into a fierce defender of personal identity in the age of AI. Recently, Johansson found herself at the crossroads of technology and privacy, as she voiced strong concerns about OpenAI using a voice strikingly similar to hers without her green light. This isn’t your run-of-the-mill celebrity gossip—it’s a pivotal moment that signals deeper implications for consumers and big brands alike.
First, let’s take a step back and look at what transpired. OpenAI, the tech behemoth behind ChatGPT, decided to create an AI voice for their latest model, GPT-4o. They showcased this voice, named Sky, which bore an uncanny resemblance to Johansson’s signature sultry tone. Scarlett was not amused. After hearing the demo, she felt a mix of shock and outrage, stating that she had explicitly declined to license her voice for AI use.
OpenAI quickly responded by pausing the use of Sky and reiterated it was never their intention to mimic Johansson. But here’s the kicker: Scarlett believes the resemblance is no accident. She argues that OpenAI made a deliberate attempt to replicate her voice after her refusal to collaborate—an accusation that stirs up a hornet’s nest.