ChatGPT often gets things wrong.
It has been likened to a mansplainer: supremely confident in its answers, regardless of their accuracy.
Unlike search engines, which mostly direct people to other pages and make no claims for their veracity, chatbots present their answers as gospel truth.
Chatbots must also grapple with bias, prejudice and misinformation as they scan the internet.
There are sure to be controversies as they produce incorrect or offensive replies. (Google is thought to have held back the release of its chatbot over such concerns, but Microsoft has now forced its hand.)
ChatGPT already gives answers that Ron DeSantis, Florida’s governor, would consider unacceptably woke.
This is the sixth paragraph of “The battle for internet search”, an article from the February 2023 issue of The Economist.
1
ChatGPT often gets things wrong.
2
mansplainer: a man who explains things in a condescending, overconfident way
supremely: extremely
accuracy: correctness; exactness
It is rather like a mansplainer: supremely confident in its answers, regardless of their accuracy.
3
veracity: truthfulness (not “sincerity”)
gospel: absolute, unquestionable truth
Unlike search engines, which mostly direct users to other pages and make no claims about their veracity, chatbots present their answers as gospel truth.
4
Chatbots must also grapple with bias, prejudice and misinformation as they scan the internet.
5
controversies: disputes; public disagreements (not “debates”)
There are sure to be controversies as they produce incorrect or offensive replies. (Google is thought to have held back the release of its chatbot over such concerns, but Microsoft has now forced its hand.)
6
ChatGPT already gives answers that Ron DeSantis, Florida’s governor, would consider unacceptably woke.
Vocabulary
mansplainer: a man who explains things in a condescending, overconfident way
supremely: extremely
accuracy: correctness; exactness
veracity: truthfulness
gospel: absolute, unquestionable truth
controversies: disputes; public disagreements