Not just that, remember it’s just an LLM. It predicts which tokens (or words and letters, if you will) come next. It doesn’t matter whether that’s factual or fictional - if it looks good enough, that’s what it produces.
LLMs are very confident in lying. I once asked one if there is a magic method to catch magic method calls in PHP - it told me it’s __magic. Lo and behold, there isn’t and never was such a method. That was my first and last time trying it, and there ain’t gonna be a second time in the near future.
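For what it’s worth, the closest thing PHP actually has is __call() and __callStatic(), which intercept calls to undefined or inaccessible methods - there’s no hook that catches calls to other magic methods. A minimal sketch of the real thing (the class name and messages are just made up for illustration):

```php
<?php

// PHP's real interception hooks: __call() fires when an undefined or
// inaccessible instance method is invoked; __callStatic() does the same
// for static calls. There is no "__magic" hook for intercepting other
// magic methods - that one was invented by the LLM.
class Proxy
{
    public function __call(string $name, array $arguments): string
    {
        return sprintf('instance call to %s(%s)', $name, implode(', ', $arguments));
    }

    public static function __callStatic(string $name, array $arguments): string
    {
        return sprintf('static call to %s(%s)', $name, implode(', ', $arguments));
    }
}

$p = new Proxy();
echo $p->doesNotExist('foo'), PHP_EOL;   // instance call to doesNotExist(foo)
echo Proxy::alsoMissing('bar'), PHP_EOL; // static call to alsoMissing(bar)
```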
Just wait until AI starts rewriting history, changing historical facts, and purposely misinforming people.
It’s only a matter of time before AI denies the Holocaust, the enslavement of Black people in the US, and the numerous African genocides.
The Google chatbot is already doing just that.
Haha, just what happens when you create AI that exists in a fantasy world. You get fantastic (as in fantasy) results.
Cough 1989 Tiananmen Square protests and massacre cough
Chinese AI bot says “what”?
Your computer just explodes, killing you instantly.
What a tragic accident…
That’s the Russian chatbot.
State media would report that a man using his computer suddenly fell to his death from the roof of his apartment. Neighbors say the man didn’t live in an apartment. 🤷‍♂️
In Russia, apartments fall on you
In Russia, windows fall out of you