Formula E team fires its AI-generated female motorsports reporter, after backlash: “What a slap in the face for human women that you’d rather make one up than work with us.”
Yes? It’s nothing personal, human women, but once “having a pleasant feminine voice” is something that machines can do more efficiently than humans, why shouldn’t those machines be given the job?
You’ve got bigger problems than labour relations when “having a pleasant feminine voice” is the success criterion you use to measure the performance of a reporter.
I dunno, this sounds exactly like the fucked-up logic from the conference room that dreamed up this shitty idea, only to have it face reality and get pulled on day one.
We’re not talking about replacing Bernstein and Woodward here…
What else does a racecar reporter have to do? There’s only so many ways you can say the cars are going round in circles.
Never really watched racing, have you…
deleted by creator
But any insight it might have about racing is not its own, and so it may not feel genuine to someone who knows it’s AI. It’s nice to have former racers as commentators because they give you information that few other people have, like Martin Brundle for example.
Some AI could say the same stuff, but you’d know it was coming from a computer, not experience. Maybe that would change over time, but I’m not convinced.
Such a Brave Little Toaster.
So should be pretty easy for a human to do then right? A lot easier than training an AI model to be able to spontaneously describe what’s happening on the race track at any given moment.
Cheaper too I bet.
Sure, you just have to hire a team of AI engineers whose job it is to train the AI on thousands of races and test it and test it and test it. Definitely cheaper than just hiring one human to be an announcer.
Not really. The real power of these LLMs is their ability to understand the written word, context, and emotion, and then generate text based on it.
Bing AI uses search to get its sources and its training to summarise them. It doesn’t need to be trained on the specific things it’s generating from. It just needs to understand them.
Anyone who used ChatGPT to get information and not generate text was using it wrong. This is a very common misconception.
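The split that comment is describing (retrieve live sources with search, then have the model summarise them, rather than training it on the subject) is basically retrieval-augmented generation. A minimal sketch of the idea, purely illustrative: `search_web`, `llm_generate`, and the snippet text are hypothetical stand-ins, not any real Bing or ChatGPT API.

```python
# Sketch of the retrieve-then-summarise pattern described above.
# The model never needs race-specific training data; it only needs
# the retrieved text placed in its prompt at generation time.

def search_web(query: str, top_k: int = 3) -> list[str]:
    """Hypothetical search call: returns snippets of live race coverage."""
    return [
        "Lap 31: Cassidy passes Evans into Turn 1 to take the lead.",
        "Evans reports front-left graining over team radio.",
        "Safety car deployed after contact at the chicane.",
    ][:top_k]

def llm_generate(prompt: str) -> str:
    """Hypothetical LLM call: swap in any chat-completion API here."""
    return "[model output would appear here]"

def commentate(query: str) -> str:
    # Build a prompt from whatever the search step returned and let the
    # model turn it into commentary, instead of relying on memorised facts.
    snippets = search_web(query)
    prompt = (
        "You are a live motorsport commentator. Using only the snippets "
        "below, describe what just happened in an engaging way.\n\n"
        + "\n".join(f"- {s}" for s in snippets)
    )
    return llm_generate(prompt)

if __name__ == "__main__":
    print(commentate("Formula E latest lap events"))
```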
As a fan of F1, and I casually watch FE, I’d much rather have human commentators, thanks.
Brutal take.
What about all the dudes that don’t get a shot either way because they’re not an attractive woman? Is it a slap in the face to them?
I feel like AI haters really struggle to grasp the concept of an actually competent AI that can do something better than a human can. The counter-arguments always seem to come from the assumption that this will never be the case, but that’s changing the subject.
If there is an AI doctor that has a proven track record of being better at diagnosing illnesses than any human doctor, then I’d rather consult the AI. I’m fully aware how “unfair” it is for the human doctor, but I don’t want to have to deal with a misdiagnosis just because I wanted to show my support for human doctors and knowingly go for the inferior option.
Because we haven’t seen an AI yet that can do what a human can do but better.
The flip side is that the company that owns the doctor AI doesn’t want you to use it, because a 95% successful diagnosis rate means that in 1 out of every 20 cases they’re opening themselves up to getting sued.
Well presumably they would be using it to replace a doctor with an even worse success rate, so I’m not sure why they wouldn’t want me to use it instead.
Why do you think you know what’s happening in a hypothetical doctor’s mind?
Legislation is always 2+ decades behind technology. Legal protections are in place for doctors making wrong decisions with the information they have on hand as long as it’s to the best of their ability. The same protection doesn’t extend to someone’s brand new AI doctor.