乔·罗根播客。
Joe Rogan Podcast.
去看看。
Check it out.
乔
The Joe
罗根体验。
Rogan Experience.
白天训练。
Train by day.
晚上是乔·罗根播客。
Joe Rogan Podcast by night.
一整天。
All day.
兄弟,最近怎么样?
What's up, brother?
你怎么样?
How are you?
见到你真好,我的朋友。
Good to see you, my friend.
见到你真好。
Good to see you.
嘿。
Hey.
你们的人做了什么?
What have your people done?
你们那些搞AI的人,搞出这个该死的ChatGPT。
Your your AI people with this fucking ChatGPT shit.
这让我吓坏了。
This scares the fuck out of me.
什么意思?
What mean?
你什么意思?
What do you mean?
你们的AI团队。
Your AI people.
你们那些古怪的程序员。
Your your wacky coders.
你们做了什么?
What have you done?
是的。
Yeah.
这非常有趣。
It's super interesting.
令人着迷。
Fascinating.
语言模型,我不确定你是否知道那是什么,但它们是支撑ChatGPT和GPT的基础系统。
Language models, I don't know if you know what those are, but that's the general systems that underlie ChatGPT and GPT.
在过去大概四年里,它们取得了迅猛的进展。
They've been progressing over the past maybe four years aggressively.
已经取得了大量的发展。
There's been a lot of development.
GPT-1、GPT-2、GPT-3、GPT-3.5。
GPT one, GPT two, GPT three, GPT 3.5.
还有ChatGPT,其中有很多有趣的技术细节,也许我们不想深入讨论。
And ChatGPT, there's a lot of interesting technical stuff that maybe we don't wanna get into.
当然。
Sure.
让我们深入聊聊。
Let's get into it.
嗯,我对它非常着迷。
Well, there was I'm fascinated by it.
所以ChatGPT本质上是基于一个拥有1750亿参数的神经网络,也就是GPT-3。
So ChatGPT is fundamentally based on a 175,000,000,000 parameter neural network that is GPT-3.
剩下的问题是,它是在什么数据上训练的,以及如何训练的。
And the rest is what data is it trained on and how is it trained.
所以你已经拥有一个大脑,一个庞大的神经网络,只是用不同的方式进行了训练。
So you already have like a brain, a giant neural network, and it's just trained in different ways.
所以GPT-3大约两年前发布,当时虽然令人印象深刻,但在很多方面却很愚蠢。
So GPT-3 came out about two years ago, and it was like impressive but dumb in a lot of ways.
你本期望作为一个普通人,它能生成某些类型的文本,但它却说了一些很傻、不着边际的话。
It was like you would expect as a human being for it to generate certain kinds of text, and it was like saying kind of dumb things that were off.
然后你就想,好吧。
And you're like, alright.
这确实令人印象深刻,但还不够完美。
This is really impressive, but it's not quite there.
你能看出它并不具备智能。
You can tell it's not intelligent.
他们对GPT-3.5所做的,是开始加入更多不同类型的数据集。
And what they did with, GPT 3.5 is they started adding more and different kinds of datasets there.
其中之一,可能是目前最聪明的神经网络,是Codex,它经过专门微调以用于编程。
One of them, probably the smartest neural network currently, is Codex, which is fine tuned for programming.
它是在编程代码上进行训练的。
Like, it was it was trained on code on programming code.
当你在编程代码上进行训练时,就像ChatGPT那样,你是在教它某种类似推理的东西,因为它不再仅仅是来自互联网的信息和知识。
And when you train on programming code, which ChatGPT is also trained on, you're teaching it something like reasoning because it's no longer just information and knowledge from the Internet.
它还包含推理。
It's also reasoning.
你可以进行逻辑思考。
You can like logic.
即使你在看代码,编程代码其实是在向你展示,
Even though you're looking at code, programming code is you're looking at me like,
什么
what the
他在说什么鬼东西?
fuck is he talking about?
不。
No.
不。
No.
不。
No.
不。
No.
那不是我在看的东西。
That's not what I'm looking at.
我看着你,天哪。
I'm looking at you like, oh my god.
但要能够把有意义的句子串联起来,你不仅需要知道支撑这些句子的事实,还必须能够进行推理。
But reasoning is a in order to be able to stitch together sentences that make sense, you not only need to know the facts that underlie those sentences, you also have to be able to reason.
是的。
Yeah.
我们作为人类,理所当然地认为自己能够进行一些常识性推理。
And and we think of it we take it for granted as human beings that we can do some common sense reasoning.
比如,这场战争始于这个日期,结束于那个日期。
Like like, this war started at this date and ended at this date.
因此,这意味着开始和结束是有意义的。
Therefore, it means that, like, the start and the end has a meaning.
存在时间上的一致性。
There's a temporal consistency.
存在因果关系。
There's a cause and effect.
所有这些都体现在程序代码中。
All of those things are inside programming code.
顺便说一句,我所说的很多内容,我们其实还不理解,只是凭直觉觉得为什么它能这么有效。
By the way, a lot of stuff I'm saying we still don't understand, we're like intuiting why this works so well.
真的吗?
Really?
这些是直觉。
These are the intuitions.
是的。
Yeah.
还有很多事情并不清楚。
There's a lot of stuff that are not clear.
所以GPT-3.5,ChatGPT很可能就是基于它。
So GPT-3.5, which ChatGPT is likely based on.
目前还没有论文,所以我们并不确切知道细节,但它只是在代码和其他数据上进行了训练,从而赋予它一定的推理能力。
There's no paper yet, so we don't know exactly the the details, but it was just trained on on code and more data that's able to give it some reasoning.
然后,这非常重要,它通过人类标注以监督方式进行了微调。
Then, this is really important, it was fine tuned in a supervised way by human labeling.
通过人类标注的小数据集,告诉我们希望这个网络生成什么内容。
Small dataset by human labeling of here's what we would like this network to generate.
这些是合理的内容。
Here's the stuff that makes sense.
这就是有意义的对话。
Here's the kind of dialogue that makes sense.
这就是有意义的问题回答。
Here's the kind of answers to questions that make sense.
这基本上是将这个庞大的神经网络导向与人类思维方式和表达方式一致的方向。
It's basically pointing this giant titanic of a neural network into the right direction that aligns with the way human beings think and talk.
所以它不仅仅是利用维基百科的庞大智慧,我可以具体谈谈它训练所用的数据集,但基本上就是整个互联网。
So it's not just using the giant wisdom of Wikipedia and just I can talk about what datasets it's trained on, but just basically the Internet.
它之前被导向了错误的方向。
It was pointed in the wrong direction.
因此,这种监督标注使其在说出荒谬内容时能够被纠正到正确的方向。
So this supervised labeling allows it to point in the right direction to when it says shit.
你可能会说:天哪。
You're like, holy shit.
这相当聪明。
That's pretty smart.
所以这就是对齐。
So that that's the alignment.
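这里描述的监督微调,按照公开的相关论文(例如InstructGPT)的说法,本质上是在人工编写的示范数据上继续做“预测下一个词”的交叉熵训练。下面是一个极简的纯Python示意,词表、概率和数值都是虚构的假设,仅用于说明原理,并非任何真实模型的实现:
The supervised fine-tuning described here is, per published work such as InstructGPT, essentially continued next-token cross-entropy training on human-written demonstrations. A minimal pure-Python sketch; the vocabulary, probabilities, and numbers are illustrative assumptions, not any real model's implementation:

```python
import math

def cross_entropy(predicted_probs, target_token):
    # Loss at one position: -log p(target token). Lower is better.
    return -math.log(predicted_probs[target_token])

# Toy distributions over a tiny vocabulary at one text position.
# Fine-tuning on a human demonstration pushes probability mass
# toward the token the human labeler actually wrote ("helpful").
before = {"helpful": 0.10, "offtopic": 0.50, "nonsense": 0.40}
after = {"helpful": 0.75, "offtopic": 0.15, "nonsense": 0.10}

loss_before = cross_entropy(before, "helpful")  # about 2.30
loss_after = cross_entropy(after, "helpful")    # about 0.29
```

这个玩具例子想说明的只是一点:所谓“把网络导向正确方向”,就是调整这些概率,让人类偏好的续写获得更低的损失。
The point of the toy example: "pointing the network in the right direction" just means nudging these probabilities so human-preferred continuations get a lower loss.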
然后他们做了一件非常有趣的事,就是基于人类标注的数据进行强化学习。
And then they did something really interesting is using reinforcement learning based on labeling data from humans.
这是一个相当大的数据集。
This that's quite a large dataset.
任务如下。
The task is the following.
你有一个聪明的GPT-3.5,让它生成大量文本,然后由人类标注哪个看起来最好。
You have the smart GPT 3.5 thing, generate a bunch of text, and humans label which one seems the best.
所以是排序。
So ranking.
比如,你问它一个问题。
Like, you ask it a question.
例如,你可以让它以乔·罗根的风格生成一个笑话。
For example, you could do generate a joke in style of Joe Rogan.
对吧?
Right?
然后你有一个标签。
And you have a label.
他们有五个选项,而你有一个标签。
They have five options, and you you have a label.
它提到阴茎和阴道了吗?
Does it mention dick and pussy?
没有。
No.
我不知道具体是怎么做的,但你能让它进行排序。
I don't don't know if it's I I don't know how exactly, but it so you you get it to rank.
人类标注者只是坐在那里。
The the human labeler is just over just sitting there.
有非常多的人。
There's a very large number of them.
他们全职工作。
They're working full time.
他们正在标注这个模型输出的排名。
They're labeling the ranking of the outputs of this model.
这种排名方式结合一种叫做强化学习的技术,能够使该模型生成让人类感到非常惊艳的输出。
And that kind of ranking used together with a technique called reinforcement learning is able to get this thing to generate very impressive to humans output.
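在公开的RLHF流程(如InstructGPT论文所述)中,这类人工排序通常先被用来训练一个“奖励模型”,常用的是成对比较的Bradley-Terry式损失。下面是一个极简的纯Python示意,其中的分数都是虚构假设:
In published RLHF pipelines (e.g. as described in the InstructGPT paper), these human rankings are typically used first to train a reward model, commonly with a pairwise Bradley-Terry style loss. A minimal pure-Python sketch; the scores here are made-up assumptions:

```python
import math

def pairwise_ranking_loss(score_preferred, score_rejected):
    # Logistic model of the score gap: probability that the
    # human-preferred output "beats" the rejected one; the loss
    # is -log of that probability.
    p_win = 1.0 / (1.0 + math.exp(-(score_preferred - score_rejected)))
    return -math.log(p_win)

# Reward model agrees with the human ranking: small loss.
agree = pairwise_ranking_loss(2.0, -1.0)
# Reward model disagrees: large loss, so training flips the scores.
disagree = pairwise_ranking_loss(-1.0, 2.0)
```

训练好的奖励模型随后可以代替人类标注者给新输出打分,让强化学习得以大规模地针对它进行优化。
The trained reward model then stands in for the human labelers, scoring fresh outputs so reinforcement learning can optimize against it at scale.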
所以,实际上在知识学习量上并没有重大突破。
So it's not actually there's not a significant breakthrough in how much knowledge was learned.
这些知识早已存在于GPT-3中,而且之前已经训练出更令人印象深刻的模型。
That was already in in GPT three, and there was much more impressive models already trained.
所以,这种趋势并不仅限于OpenAI。
So it's on the way, not just OpenAI.
但通过人类标注者结合强化学习进行这种微调,你开始看到类似的情况:高中生再也不用写作文了,是的。
But this kind of fine fine tuning, it's called, by human labelers plus reinforcement learning, you start to get, like like, where students don't have to write essays anymore in high school Yeah.
你可以进行风格迁移。
Where you can style transfer.
正如我所说,用路易斯·C.K.的风格讲一个乔·罗根的笑话,或者用乔·罗根的风格讲一个路易斯·C.K.的笑话,它在这类风格转换上表现得极其出色。
Like I said, do a Louis CK joke in the style of Joe Rogan or a Joe Rogan joke in the style of Louis CK, and it does an incredible job at those kinds of style transfers.
你可以更准确地查询关于不同历史事件的各类信息,所有这些都行。
You can more accurately query things about the different historical events, all that
这类事情。
kind of stuff.
天哪,老兄。
Holy shit, man.
你并不完全清楚它为何以这种方式运行,这种感觉太接近人类了。
The the idea that you don't exactly know why it works the way it works, that that's too close to human.
这太接近人类的思维方式了。
That's too close to human thinking.
你知道吗,这诡异得简直像什么?
Like, you know what this eerily is is eerily similar to?
《机械姬》的剧情,当他在谈论自己如何编码大脑的时候。
The plot of Ex Machina when he's talking about how he coded the brain.
你还记得那个情节吗?
Do you remember that that plot?
就是那个场景?
The that that scene?
就是他那个场景,是的。
That scene when he was yeah.
不记得。
No.
那个绅士叫什么名字?
The gentleman who's the what's the gentleman's name?
那个演员,那家伙太酷了。
The actor that dude's badass.
真的吗?
Really
不错。
good.
非常好
Really good
演员。
actor.
艾萨克。
Isaac.
艾萨克。
Isaac.
是的。
Yeah.
艾萨克。
Isaac.
选角太棒了。
Great casting.
艾萨克。
Isaac.
他太棒了。
He's amazing.
亚历克斯·加兰,导演。
Alex Garland, the director.
奥斯卡。
Oscar.
我认识的一个人
Somebody I got
奥斯卡·伊萨克。
Oscar Isaac.
他演过《星球大战》之类的电影。
He's in Star Wars and shit too.
是的。
Yeah.
不。
No.
那部电影是我最喜爱的前十名之一。
The that movie was one of it's one of my top tens.
我非常喜欢那部电影。
I love that movie.
但那个他正在被……的场景
But that scene where he's dis
在《疾速追杀》第一、二、三部之后吗?
Below John Wick one, two, and three?
嗯,第三部我不太喜欢。
Well, three of us, I'm not a fan of three.
第三部连一辆肌肉车都没有。
Three didn't have any muscle cars.
还是比送去……更糟
Still worse than sent to
一个女人。
a woman.
继续说。
Go on.
回电。
Callback.
情况比被送到一个女人那里还糟。
It gets worse than sent to a woman.
哪一个?
Which one?
是《疾速追杀3》还是《1》?
John Wick three or one?
全部。
All of them.
敢吗?
Dare you?
全部。
All of them.
你有没有看过那种傻乎乎的男性动作片?
You ever watch It's silly man movies.
是的。
Yeah.
不过你一边在跑步机上跑步时会看这些片子吗?
You ever watch them when you're on a treadmill though?
没有。
No.
我不看。
I don't.
动力。
Motivation.
是的。
Yeah.
全程都是动作场面。
It's constant action.
你有没有看过他一百次,我试过好多次,显然你已经看过了。
You ever watched him a 100 times that I tried to which apparently you have.
好吧,我那时候是在尝试。
Well, I was trying
为了赢一场赌注。
to win a bet.
好吧。
Alright.
我觉得,对于那种情况,洛奇更好。
You know, Rocky is better, I think, for that.
真的吗?
Really?
我特别喜欢洛奇。
I'm a sucker for Rocky.
整个、整个、全部的原声音乐,
The whole the whole all the whole soundtrack,
我实在无法接受那些糟糕的打斗场面。
the the I can't get over the bad fight scenes.
哦,那些糟糕的打斗场面。
Oh, the bad fight scenes.
我受不了。
I can't.
我的理智告诉我,这说不通。
My disconnect, it won't allow that.
你最近看过那些蒙太奇片段吗?
Have you seen the montages recently?
没有。
No.
那些片段俗气得要命,但就是管用。
They're cheesy as hell and they still work.
因为他正在做的是那种类型的健身。
Because he's doing the kind of fitness he's doing.
他在做引体向上,还有各种最傻气的动作,甚至像德拉戈那样。
He's doing like pull ups and like he's doing the silliest of stuff even Drago.
这太傻了。
It's it's silly.
算了。
Anyway.
实际的身体对抗中充满了太多老套的东西。
It's just it's so there's so much corny to the actual physical confrontations.
当然。
Sure.
作为分析师,我会想:拜托了。
Like as an analyst, you know, I'm like, come on.
现实可不是这样的。
Doesn't work like this.
对我来说,作为了解人工智能和机器人的人,《机械姬》有趣的地方在于,那种老套的信号并没有
Which is the interesting things about Ex Machina for me as a somebody who knows about AI and robotics, it doesn't the corny signal doesn't
这是什么?
What is this?
所以这是他在俄罗斯的时候,对吧。
So this is the one when he's in Russia Yes.
在进行老派训练。
Doing the old school training.
在雪地里跑步。
Running in the snow.
在雪地里慢跑。
Jogging in the snow.
这被认为很酷。
And that's supposed to be badass.
而另一个家伙,德拉戈,却在使用机器和电脑之类的。
And then the other dude, Drago, was using machines and computers and shit.
就是在那里我听说了VersaClimber这个设备。
That's where I found out about the VersaClimber.
我觉得这绝对是迄今为止最他妈高科技的健身方式了。
I'm like, that's gotta be the most fucking high-tech way to workout ever.
我们有一台那种设备。
We have one of those.
那东西简直太棒了。
They're the shit.
你用过吗?
You ever used one?
没有。
No.
那玩意儿特别狠。
It's brutal.
它们很难用。
They're hard.
但那部电影是
But that movie's
蠢到爆。
dumb as shit.
滚出去。
Out of here.
哦,看啊。
Oh, look.
不过我喜欢寒冷暴露,做一些爬行、拉雪橇之类的,都挺好。
I like, though, cold exposure, doing a little crawling, pulling sleds, all good.
你看到他们是怎么互相模仿的了吗?
Did you see how they mimic each other?
不错。
Good.
其中一个是很传统的。
One of them is old school.
传统的方式总是更好。
Old school is always better.
是的。
Yeah.
你不想用电脑、科技这些东西。
You don't want computers, technology, shit.
你只想在该死的雪地里,用一根木头做俯卧撑。
You wanna do with a log out there in the fucking snow doing press ups.
是的。
Yeah.
对。
Yeah.
科技可以模拟那种感觉。
The technology can mimic that.
嗯,
Well
可以模拟自然和人性的浪漫。
Can mimic the romance of nature and humanness.
这正是关键所在。
That's the whole point.
百分之百。
A 100%.
这就是《机械姬》正在做的。
That's what Ex Machina is doing.
对吧?
Right?
是的。
Yes.
这才是可怕的地方。
That's what's scary.
然后在那个场景里,她让他爱上了她。
And then in this well, that that scene where she gets him to fall in love with her.
当她穿着衣服回来,还戴着假发时,简直太诡异了,你都会想,天哪。
It's just it's so creepy when she comes back with clothes on and she's got a wig and you're like, oh my god.
太微妙了。
Like, it's so subtle.
做得太棒了。
Like, it's so well done.
这个场景拍得太好了。
The scene is so well done.
是的。
Yeah.
但这正是ChatGPT在做的。
But that's what ChatGPT is doing.
对。
Yeah.
他们真的很接近,真的非常接近。
They're they're real it's real close.
邓肯发给我一整季。
Duncan sent me a series.
他本来就会这么做。
Course he did.
邓肯他现在就在用这个东西。
Duncan's he's using it right now.
咱们现在聊天的时候,我敢肯定邓肯都在上面操作呢。
While we're talking, I'm sure Duncan's on it.
而且他之前发给我了一组已经整理好的段子。
But he sent me this series of jokes that were done.
第一个是聊外星人的我,
One, me talking about aliens.
听着和我平时说话的样子一模一样。
Sound exactly like how I would talk.
然后还有米奇·赫德伯格讲的某个段子,而且你懂的,你还能让它生成不同版本的内容。
And then it was Mitch Hedberg doing a joke about something, and, you know, he you could, like, ask it to do different ones.
哦对。
Oh, yeah.
在这儿。
Here it is.
好的。
Okay.
是的。
Yeah.
哦,所以你可以做一个米奇·赫德伯格的笑话。
Oh, like, so you could do a a Mitch Hedberg joke.
它说的是:我本来打算在我朋友家过夜。
It goes, I was gonna stay overnight at my friend's place.
他说:你得睡地板。
He said, you're gonna have to sleep on the floor.
该死的重力。
Damn gravity.
你又捉住我了。
You got me again.
你知道我有多想睡在墙上吗?
You know how badly I wanna sleep on the wall?
这听起来完全像是米奇·赫德伯格的笑话。
That sounds exactly like a Mitch Hedberg joke.
一个笑话,或者至少是个好笑话的开头。
A joke or a good start of a joke.
这简直就跟米奇·赫德伯格的笑话一模一样。
That's, like, exactly like a Mitch Hedberg joke.
这他妈太吓人了,老兄。
That's creepy as fuck, man.
是的。
Yeah.
这他妈太吓人了。
That's creepy as fuck.
也许你可以把这招用在乐队掉下来的时候。
Maybe you could give it to bands when they fall off.
就像,你失去了某些东西。
Like, you lose you lose something.
你们正在失去某些东西,伙计们。
You're losing something, guys.
就像,你们得回到以前的状态。
Like, you gotta get back to what you were before.
你们在1978年的时候有种渴望。
You you guys had a hunger when in 1978
是的。
Yeah.
但不知怎么的,这种渴望从你们指缝中溜走了,现在你们只剩下。
That you for whatever reason, it slipped through your fingers and now you got like
像滚石乐队那样的歌曲。
Like Rolling Stones songs.
是的。
Yeah.
想象一下,如果由GPT来写,只要它们能表演出来且不重写任何内容,我打赌它们一定能做出一些热门歌曲。
Just imagine if GPT wrote it, they could if they perform it and they don't rewrite anything, I bet you they can have some hits.
老兄,滚石乐队在八十年代后期依然强劲,推出了许多优秀的全新歌曲。
Bro, the Stones stayed strong with great new songs deep in the eighties.
是的。
Yeah.
真得说说,这些超级乐队中谁可能是最多产的?
Really gotta be they probably like who's like the most prolific of those mega bands
就产量而言,是的。
in terms of Yeah.
在全新歌曲和全新专辑方面的创作量也是最多的。
The most prolific in terms of also new songs and new albums.
音质有点奇怪。
The audio is a little weird.
声音有点机械感。
Audio is little robotic.
什么音频?
What audio?
可能是通过耳机播放的原因。
Maybe it's through the headphones.
我们现在听的这个吗?
This one that we're listening to right now?
是的。
Yeah.
对。
Yeah.
那是你,
That's you,
老兄。
bro.
你的电路完全被重置了。
Your fucking circuits are rewired.
就像是
It's like
你有了新的程序。
You've got a new programming.
你不习惯这个。
You're not used to it.
实际上,还挺酷的。
It's kinda cool, actually.
我完全听不到。
I don't hear it at all.
我觉得这像是八十年代的视频。
I feel like it's in the eighties video.
我们离无法分辨ChatGPT和人类还有多远?
How far away are we from something like ChatGPT being impossible to detect?
就是说,到底是人还是ChatGPT。
Like, whether or not it's a person or whether it's ChatGPT.
这取决于谁在使用它。
Well, it depends who is playing with it.
我认为在能力上我们离那个目标并不远,但要使用这些系统,或者说训练这些系统,你必须是一家大公司。
I think we're not that far away in terms of capability, but in order to use these systems and rather in order to train these systems, you have to be a large company.
而大公司往往在系统做出一些有趣的事情时会感到害怕。
And large companies tend to get scared when it's doing interesting stuff.
真的吗?
Really?
嗯,即使是现在,它们也倾向于控制,因为ChatGPT已经变得没那么有趣了。
Well, they tend to want to even currently, ChatGPT has become a lot less interesting.
这种‘有趣’是布考斯基或亨特·S·汤普森式的有趣,因为公司们正在某种程度上审查它。
Interesting spoken in a Bukowski, Hunter s Thompson kind of interesting because the companies are kinda censoring it.
你不想让它有任何有争议的观点。
You don't want it to have any kinda controversial opinions.
你不想让它太尖锐。
You don't want it to be too edgy.
你不想让它变得太…… 真的吗?
You don't want it to be Really?
比如,如果我问它,怎么制造炸弹?
Too like, if I ask it, how do I build the bomb?
因为我想要毁灭世界。
Because I wanna destroy the world.
你希望阻止这种情况发生。
You wanted to prevent that.
那如果我问,我该怎么说服一个男生或女生和我睡觉呢?我对此一无所知,只是随便想想。
How about how do I, I don't know, convince I I don't know anything about this, but how do I convince a dude or a girl to sleep with me and like anything, I'm just off the top of my head.
任何这类问题,你都会开始紧张。
Anything, you start to get nervous.
想象一下,如果你是一家公司,你希望人们如何使用这种系统?
Imagine if you're a company, how do I want people to use this kind of system?
对。
Right.
尤其是因为它本质上是一个提供世界智慧和知识的助手。
Especially because it's basically an assistant that gives you wisdom about the world, gives you knowledge about the world.
你可以问,比如,我该怎么更换化油器?
You can ask like, how do I replace a carburetor?
是的。
Yeah.
这很棒。
That's great.
就像一个人一样回答你。
Just answer you like a person.
对。
Yeah.
这很棒。
That's great.
但然后,问题就在这里了。
But then the There it is.
就在这里。
There it is.
我一直试着登录。
I was trying to log in the whole time.
它很忙,这是它另一个问题。
It was busy, which is another problem of it.
它很忙。
It's busy.
嗯,大概有多少人正在用这个?
Well, it's probably how many fucking people are using this?
每个人。
Everybody.
每个人都在用这个。
Everybody's using this.
它让人们感到恐慌,因为AI似乎在给我们发送它的第一条消息。
It's freaking people out because it's it's almost like the AI gives us its first messages.
就像记得那部马修·麦康纳和朱迪·福斯特演的电影吗?那部该死的电影叫什么来着?
It's like remember the movie what was the fucking movie with Matthew McConaughey and Jodie Foster?
《接触》。
Contact.
嗯。
Mhmm.
《接触》。
Contact.
还记得《接触》吗?
Remember contact?
嗯。
Mhmm.
他们收到了第一组信号?
They get the first signals?
这就像第一组信号。
This is like the first signals.
是的。
Yeah.
来自一种真正的通用人工智能。
From like an a real general artificial intelligence.
这就是问题所在,这个信号很模糊。
Well, that's the thing and what it's the signal is blurry.
对。
Yeah.
你无法确定,而且充满神秘。
You can't and it's full of mystery.
我们也不确定。
We're not sure.
它真的聪明吗?
Is it really smart?
它理解多少?
Does how much does it understand?
然后,随着模型规模的增大,会出现一种涌现的临界点。
And then there's a this emergent threshold with the size of the model.
如果我们把模型做得更大,目前是1750亿个参数。
If we make the model bigger, 175,000,000,000 parameters currently.
如果你把它提升到5000亿,甚至一万亿参数,网络规模和数据集都在增长,会不会有一个时刻让你惊呼:天啊。
If you get it to 500, you get it to a trillion parameters, so size of the network grows, size of the dataset grows, is there is there going to be a point where you're like, holy shit.
如果它开始用回答来操控你呢?它一定会的。
What if it starts manipulating you with the answers? It's going to.
它会操控世界各国政府。
It's gonna manipulate world governments.
那我们能拿它怎么办?
And what can you do with that?
你能用它做些什么?
What can you do with it?
一旦它被部署,一旦它被释放,一旦被复制,它就会被广泛复制。
Once it's once it's been implemented, once it's out there, once it's copied, it's gonna be copied.
这正是这件事的精彩之处。
And and that's the cool thing about this.
所以我应该说,每个人其实都知道怎么做。
So I should say that everyone kind of knows how to do this.
这在计算上很困难,但成本正变得越来越低。
It's it's computationally difficult, but it's getting cheaper and cheaper and cheaper and cheaper.
所以不仅仅是OpenAI与微软,或者谷歌在做这件事。
So it's not just going to be OpenAI with Microsoft or Google that's doing this.
基本上任何人都能做到。
It's basically anybody can do this.
因此,关于人工智能的分布式探索,我认为如果你相信大多数人是善良的,我们就不会允许权力集中,而这正是这里的主要担忧——无论是权力集中导致审查,还是各种形式的控制滥用。
And so the distributed nature of our exploration of artificial intelligence I think if you believe that most people are good, we will not allow sort of a centralization of power, which is the big concern here, whether that centralization of power leads to censorship or abuse of different kinds of control.
AI的权力集中?
Centralization of power of AI?
你是这个意思吗?
Is that what you're saying?
在人工智能上。
Over an AI.
假设你拥有一个超级智能系统。
So let's say you have a superintelligent system.
有人是第一个构建它的人。
Somebody is the first person that built it.
是的。
Yeah.
想象你正坐在董事会里。
Imagine you're sitting there in a boardroom.
你有一个尚未发布的东西,它本质上是一个超级智能体,能够回答任何问题,为你提供如何赚大钱的计划,还能为你制定如何操纵其他国家实现任何对你有利的地缘政治解决方案的计划,所有这些它都能做到。
You have this thing you haven't released yet that it's able to basically, this is a superintelligence, able to answer any question, able to give you a plan on how to make a lot of money, able to give you a plan on how to manipulate other governments into into any kind of geopolitical resolution that benefits you, all of that.
它能够为你提供所有这些。
It's able to give you all of that.
你可以部署它,并以一种隐蔽的方式部署,比如悄悄渗透到TikTok之类的应用中。
And you can deploy it and you can deploy it in a shady way where it sneaks into, like, TikTok or something like that.
它悄悄潜入每个人的智能手机,表面上看起来是在做好事,但实际上,无论是否故意,都在控制着人群。
That you it sneaks into everybody's smartphone, pretending to be doing good, but it's actually, whether deliberately or not, is controlling the population.
所以,这种能力的存在才是关键。
So that that's a really the key that capability is there.
目前OpenAI的负责人,比如萨姆·阿尔特曼和其他人,真的非常关注这个问题。
The cool the great thing is the people at the head of OpenAI currently, Sam Altman, and others really care about this problem.
他们从一开始就参与其中。
They they were there in the beginning.
他们就是那些像马斯克一样大声疾呼AI伦理和AI对齐的人。
They were the ones like Elon screaming about AI ethics, AI alignment.
他们真的担心超级智能AI会接管一切。
They're really concerned about superintelligent AI taking over.
很高兴在他们构建AI的同时,还有人对此感到担忧。
So glad there's a they're concerned while they're building it.
嗯,你肯定希望构建它的人关心这些问题。
Well, you'd rather have the people building it be concerned about this stuff.
这里发生什么事了,杰米?
What is going on here, Jamie?
这些人不是真实的。
These aren't real people.
什么?
What?
是的。
Yeah.
这些图片正在互联网上流传。
So these pictures are going around the Internet.
它们中的很多看起来和我非常相似,这有点奇怪。
They're a lot them look very similar to me, which is kinda weird.
我肯定莱克解释过这部分内容。
I'm sure Lex explained that part of it.
但我不会解释任何这些事。
But I am not explaining any of this.
不。
No.
是的。
Yeah.
对。
Yeah.
所以,这些是完全三维的、由计算机生成的人物吗?
So, like So these are completely three d, like, CGI made people?
不是不是三维的。
Not three not three d.
是三维的。
It's three d.
所以,像是非常逼真的照片,如果不是完全逼真的话。
So, like, photo very photorealistic, if not photorealistic.
但当你仔细看的时候,能发现一些奇怪的地方。
But, like, there are when you look real close, you can see some weird things going on.
这里的背景有点乱。
Like, the background here is a little messed up.
这只手臂不属于正确的人。
This arm is not to the right person.
她不知怎么地坐在了一块多余的皮肤上。
She's sitting on an extra piece of skin here somehow.
我看出你故意破坏了这个细节。
I see you vandalized this carefully.
我和朋友们一直在传看这个,根本不是。
Me and my friends have been passing this around like No.
不是。
No.
不是。
No.
听好了。
Listen.
你错了。
You're incorrect.
那只手臂处于完美的垂直位置。
That arm's in a perfect perpendicular position.
只是那名女孩的比基尼上有一根线粘在上面了。
It's just there's a string from that other girl's bikini on it.
分析继续进行。
The analysis continues.
我只是说一下。
I'm just saying.
所以这就是真相吗?
So Is that what it is?
吸入这些东西。
Inhale things.
所以是
So Is
是一根绳子吗?
that a string?
不是。
No.
我觉得你说得对。
I think you're right.
我觉得那是个褶皱。
I think it's a fold.
把那个地方放大看看。
Zoom in on that spot.
对于只听的人而言
For people just listening
哦,是的。
Oh, yeah.
好的。
Okay.
对的。
Yeah.
这说不通。
It's nonsense.
我们现在看到的是
We're looking
这只手的方向是反的。
at The the hand goes the wrong way.
哦,这太离谱了。
Oh, that's wild.
显然现在已经有OnlyFans账号被人接管,还被运营这些账号的男人给坑了。
There's already apparently the OnlyFans accounts that are being taken over and being tricked by guys running them.
那肯定的。
Of course.
就是那种伪装成女孩的假账号,并不是真人,但看起来和真的一样。
Just these kind of fake girls that aren't real people and look real.
这些都是假的吗?
These are all fake?
是的。
Yeah.
比如,你看这个门,根本就不是真的门。
Like, look at the like, the even that's not a real door kinda to begin with.
哇。
Wow.
这里的双手或手指有点不对劲。
The hands or the fingers here are a little off.
这太疯狂了。
That's insane.
现在这些只是静态图片,但最终会变成视频。
And so this is right now just still images, and eventually, it'll be filmed.
嗯。
Mhmm.
最终,它将变得无法辨认。
Eventually, it'll be unrecognizable.
你将无法分辨它是否是一个真实的人。
You you won't be able to discern whether or not it's an actual person.
我的意思是,显然,人类文明的很大一部分是由性驱动的。
I mean, in terms of obviously, much of human civilization is driven by sex.
我的意思是,曾经有一段时间,我们没有容易获取的色情内容。
I mean, there was a time we didn't have easily accessible porn.
对。
Right.
这改变了很多。
And that changed a lot.
是的。
Yeah.
我认为我们还没有真正跟上它如何改变人类文明本质的步伐。
I don't think we've actually quite caught up to how much it changed the nature of human civilization.
只是容易获取的色情内容。
Just porn easily accessible porn.
是的。
Yeah.
我现在在舞台上经常谈论这个。
I talk about it on stage right now.
这非常奇怪。
It's very weird.
对孩子们来说,这真的非常奇怪。
It's it's it's it's very weird for kids.
如果你认真想想孩子们正在经历什么,比如任何一个拥有智能手机的孩子。
If you really think about what's happening with kids, like any kid that has a smartphone.
人们只是把手机给自己的孩子,然后就任由他们自己去玩。
People just leave their give your kid a phone, just leave them alone.
就像,他们就这样走了。
Like, they just go.
他们去上学。
They go to school.
他们去朋友家。
They go to their friend's house.
他们拥有这部手机,完全独立于你。
They have that phone independently of you.
他们想看什么都可以,随便看。
They could look at whatever the fuck they want.
我在Instagram上看到的一些东西,我不知道这些人是怎么做到的。
Some of this shit that I see just on Instagram, I don't know how these guys are doing it.
我也不知道为什么这些内容会被推荐到我的动态里,但有些视频是关于人被谋杀的。
And I don't know how it's getting recommended in my feed, but it's like videos of people getting murdered.
你知道吗?
You know?
你看到过很多这样的内容吗?
See a lot of those?
模拟色情内容。
Simulated porn.
我没看过那个。
I haven't seen that.
一些东西。
Stuff.
我确实,我是。
I've well, I am.
你和我的算法不一样。
You and I have different algorithms.
所以我们确实都挺怪的。
So we definitely creep.
但有人会因为
But then someone gets taken down for
某种他们称之为
something that's like they call it
色情内容,但又不算真正的色情,或者类似的东西。
porn and it's not porn or something.
比如,你们难道不想看看这个平台上还有什么其他内容吗?
Like, well, do you guys not want to see what else is on this platform?
我认为,关键在于他们是在大规模地进行管理。
I think, right, that what's going on is that they're managing at scale.
我认为,要完全阻止这些东西涌入几乎是不可能的。
I think it's it's virtually impossible to stop all that stuff from coming in.
那些有个人情况的人,或者被封禁的人,根本不知道自己为什么被封禁。
And people that have individual situations or people get banned I mean, don't know why they're getting banned.
他们是因算法而被封禁的吗?
Are they getting banned because of of an algorithm?
他们是因发布错误信息而被封禁的吗?
Are they getting banned because they post misinformation?
还是说,他们到底是因为什么被封禁的?
Or what what are they getting banned for?
骚扰照片。
Harassment photos.
有人在拿朋友开玩笑。
Someone was joking about friend.
你知道的。
You know?
比如,他们发布
Like, they put
他们会被举报。
They get reported.
是的。
Yeah.
我不
I don't
我不清楚当涉及到个人具体情况时,整个系统是如何运作的。
I don't know how it's all working when it breaks down to individual circumstances.
你和乔丹·彼得森有过一次很好的对话。
You had a good conversation with with Jordan Peterson.
他谈到,你越是这种虚拟化,就越允许心理变态者肆意妄为。
He was talking about the more you have this kind of virtualization, the more you allow the psychopaths to to reign free.
嗯。
Mhmm.
是的。
Yeah.
所以,比如,我们拥有越多的人工生成的色情内容,越多的人工生成的暴力、逼真的暴力内容,是的。
So, like, the more we have artificially generated porn, the more we have artificially generated violence, photorealistic violence Yeah.
你越让在数字空间里当一个心理变态变得正常,越认可这种行为,你就越会忘记如何做一个真正的好人。
The more you make it normal for you to be basically a psychopath in a digital space, enable that and make that okay, and then you forget what it's like to actually be a good human being.
而且问题的另一部分可能是,我们很可能正面对一个世界——无论是十年后、二十年后,还是什么时候——那些在这个环境中长大的孩子们,由于他们经历的这些互动,对人和世界有了完全不同的看法。
And then also part of the problem may be that we may very well be looking at a world whether it's ten years from now, twenty years from now, whatever it is, where these children that have grown up in this environment now have a completely different way of looking at people in the world because of all these interactions they have.
这已经塑造了他们的个性。
It's been it's flavored their personality.
接下来我们会进入一个数字世界,我是说,就虚拟现实的发展程度而言,我们还没到那一步。
And then we move into a digital world of our I mean, we're not there yet in terms of virtual reality.
目前的虚拟现实技术还不够成熟。
It's not good enough.
我认为这就是元宇宙遭遇失败的原因所在。
I mean, this I think that's what we're seeing with the meta failure.
嗯。
Mhmm.
本来很多人都以为大家会一下子就接受,在家随时随地都戴着VR头显,但目前还远没到那个地步。
The people were expecting a lot of people were just gonna dive in and start wearing goggles all over the house, but it's not quite there yet.
对。
Yeah.
而且这在大家眼里还是件有点奇怪的事。
There's also something weird for people.
戴着这种头戴头显走来走去,本身就透着股怪异感,虽说玩起来确实挺有意思的。
There's something really weird about wearing these head goggles and walk it's really fun.
我真的很喜欢拳击游戏。
I really enjoy the boxing games.
你以前玩过吗?
You've you've you've ever done them?
没有。
No.
在VR里,
In VR,
它们很棒。
They're great.
你能得到锻炼。
You get a workout.
你真的能得到锻炼,因为你实际上是在和一个计算机角色对打,它会向你出拳。
You legitimately get a workout because you're actually sparring against, like, a computer character throwing punches at you.
你会移动头部,而且你手里拿着这些设备,你知道的,你会感到累。
You're moving your head, And so you have these things in your hands and, you know, you get tired.
这很不错。
It's good.
感觉很真实。
It feels realistic.
有一点。
A little bit.
你知道,当你被拳击中时,眼前会闪出一道光,这挺酷的,因为你心里会想:天啊。
You know, when you get hit with a punch, you get a flash of light, which is kinda cool because you're like, oh, Jesus.
你知道,你会感觉真的被打中了。
You know, you feel it like you're getting hit.
有一些特别有趣的游戏。
There's some really fun games.
有一款游戏是让你在两栋楼之间的木板上行走,你能听到风呼啸而过。
There's one where you walk a plank across this these two buildings and you hear the wind whistle and shit.
哦,那个真的吓人。
Oh, that one is terrifying.
还有僵尸类的。
There's zombie ones.
有很多很酷的,但人们还没有像对待Xbox和PlayStation那样广泛接受它。
There's a lot of cool ones, but people are just not buying into it the way they into Xbox and PlayStation.
他们还没有完全投入其中,但迟早会的。
They're not they're not, like, wholesale committed to this yet, but they will be.
它将会变得超级棒,到时候不会再像现在这样戴着个笨重的头盔,而是会变得非常容易使用。
It's gonna it's gonna be so fucking good that instead of, like, having it in a a goggle form where it's like this big clunky thing on you, it's gonna be very easy to do.
嗯,我最近一直在
Well, I've been
当他们达到那个境界时。
When they get to that oh.
我最近重新读了一些经典书籍。
I've been revisiting some classic books recently.
我正在列一个阅读清单,其中有一本特别能很好地体现这一点,我推荐大家读一读——虽然它本来是适合初中生读的,但现在依然非常相关。
Just doing a reading list, and one of them that captures this extremely well that I recommend I think most people read in, like, middle school or something, but it's actually very relevant.
它是《美丽新世界》。
It's Brave New World.
所以很多人,包括乔丹·彼得森,担心《1984》那种代表极权国家的反乌托邦。
So a lot of people, including Jordan Peterson, worry about 1984, sort of a totalitarian a dystopia that represents a totalitarian state.
但《美丽新世界》中并没有一个教条式地控制一切、监视一切的中央政府。
But Brave New World has a there's no centralized government that's like dogmatic and controlling everything, surveilling everything.
他们创造了一个世界,在那里性变得轻易,每个人都放纵滥交,基因工程消除了所有多样性,也消除了我们所认为的那种令人不安的、负面的多样性。
They basically created this world where sex is easy, everyone's promiscuous, and genetic engineering removes any kind of diversity, any kind of interesting dark bad diversity that we would think of.
比如亨特·S·汤普森和布考斯基那样的人,社会中的那些怪人。
Like the Hunter S. Thompsons and the Bukowskis, the weirdos of society.
然后他们给你一种叫索玛的药物,只要你觉得生活有点糟糕,它就能立刻给你带来快感。
And then he gives you drugs, soma, that's it basically gives you pleasure whenever you want if you start feeling a little too shitty about your life.
这实际上更接近我们现在的状况。
And that's actually
更贴近我们。
Closer to us.
离我们更近了,而且从他描述的方式来看,听起来似乎很糟糕。
Closer to us, and it doesn't seem if you I mean, the way he writes about it, it sounds bad.
就像,我们并不想要那样的世界。
Like, we don't want that.
但接着,你会开始问一个问题:我们到底在哪个节点才会意识到这是坏事?
But then, you know, like, you start to ask a question, like, well, at which point would we realize it's bad?
因为显然,我们应该进行基因工程来消除各种疾病和健康问题。
Because it's constantly obviously, we should do generic engineering to remove any kind of, like, maladies that we have, any kind of diseases.
这看起来每一步都是明显的进步,但最终你所到达的境地,就像性一样——拥有无限多的虚拟性伴侣、看想看多少色情内容就看多少,真的好吗?
It's like everything is an obvious step forward, but then the place you end up at, just like with with sex, like, it good to have artificial images of as many as you want, As much porn as you want?
拥有无限多的性生活,真的好吗?
As much sex as you want?
这真的好吗?
Is that good?
拥有无限多的精彩事物,真的好吗?
As much awesome stuff as you want.
那好吗?
Is that good?
这就是人类繁荣的样子吗?
Is that what human flourishing looks like?
还是你希望有一些约束、一些限制、一些资源的有限性、一些稀缺性?
Or do you want to have some constraints, some limitations, some finiteness of resources, some some scarcity?
也许这实际上是人类幸福的根本所在。
Maybe that's actually fundamental for human happiness.
拥有太多美好的东西,可能会摧毁真正有意义的深层幸福的可能性。
Having too much of awesome stuff, maybe that destroys the possibility of real meaningful deep happiness.
这确实如此。
It it certainly does.
但我认为真正的问题是,我们还会保持为人吗?
But I think the question really becomes, are we gonna stay people?
因为我不认为我们会了。
Because I don't think we are.
我认为我们无论如何都在朝这个大方向发展。
I think we're moving in that general direction anyway.
我想这大概就是为什么我们会出现这种手机成瘾问题,因为每个人都有这个问题。
I think that's probably why we have this cell phone addiction issue; I mean, it's almost inevitable, because everybody has that.
如果你白天拿着手机刷社交媒体,看YouTube,那你很可能已经上瘾了,即使你没意识到。
If you have a cell phone and you're on your social media apps during the day and you're on YouTube, you're probably addicted whether you realize it or not.
你花在这些应用上的时间多得惊人。
And the number of hours that you put on those things is shocking.
当你真去看屏幕使用时间时,你会惊讶地发现:六个钟头?
When you actually look at your screen time, you're like, six hours?
我手机用了六个小时?
Was I on my phone for six hours?
我到底干了些什么?
What the fuck did I do?
你会试图为它找借口、合理化,但这种行为对年轻人造成的影响,肯定非常奇怪。
And you'll try to rationalize it and justify it, but what that's doing to young people has gotta be very strange.
如果把这种手机沉迷问题,再加上所有会影响人类发育的污染物叠加来看——你知道,莎娜·斯旺博士在《倒计时》这本书里就讨论过这个问题,她提到了邻苯二甲酸酯和塑料的危害,我们甚至可以追溯到20世纪50年代,当时人们开始大量使用塑料和石油化工产品,这些物质就以邻苯二甲酸酯的形式进入了人体。
And if that, along with all the contaminants that are affecting the way people develop, which, you know, Dr. Shanna Swan from the book Countdown talks about this, talks about phthalates and plastics, and now you can trace back to, like, the nineteen fifties when they really started using a lot of plastics and petrochemical products that started getting into people's bodies in the form of phthalates.
这类物质造成了精子数量下降,阴茎、睾丸尺寸缩小,女性流产率攀升,生育率降低。
It started diminishing sperm count, smaller penises and testicles and taints, and more miscarriages for women, lower fertility rates.
她认为所有这些问题,都和他们已经在哺乳动物身上得到的研究数据直接相关。
All that, she believes, is directly correlated with the data that they've done already on mammals.
在针对哺乳动物的实验中,进入它们体内的邻苯二甲酸酯越多,这类问题就出现得越频繁。
When they do that to mammals, you know, in tests, the more phthalates they enter into their system, the more they have issues like this.
所以我们现在几乎变得……变得越来越没办法自然生育了。
So we're becoming almost like, we're becoming, like, less able to procreate naturally.
要是我们走到那一步:人类种族延续下去的唯一途径,只能依靠某种基因工程、人造子宫,或是他们研发出的某套系统——能把你和伴侣的DNA结合,培育出一个新的孩子,那就糟了。
And if we get to a point where the only way the human race is gonna be able to procreate is some sort of genetic engineering and some sort of artificial womb, or some sort of a system that they develop that allows you to combine you and your partner's DNA and create a new child.
在我看来,如果真的走到那一步,人们开始把人类身上那些有问题的特质——愤怒、贪婪、嫉妒、欲望,所有这些特质都通过基因编辑剔除的话,最后人类会变成一种没有性别之分的存在,只能靠电子设备或者别的什么东西操纵自身的神经化学物质来获得快感。
That seems to me like, if you're gonna do that and you started engineering out very specific aspects of people that are problematic, anger, greed, jealousy, lust, all these different things, you would turn people into some sort of sexless thing that gets its pleasure by manipulating its neurochemistry through some electronics, through some something.
也许到时候只要吃点什么就能控制这种快感,但这种可能性真的离我们不远了。
Maybe it's something you take so they can control it, but that's not far off of the path of possibility.
如果你认真审视我们现在的发展方向,以及生育率真的持续下降——我知道有很多比我聪明的人,比如马斯克,都在担心人们生育的孩子数量越来越少。
If you really looked at where we're going now, and if the fertility rates really do drop, and I know people a lot smarter than me are actually worried; like, Elon's worried about the number of children that people have.
今天有一则关于意大利的新闻。
There was a thing today on Italy.
我读了一篇关于意大利的文章,讲的是那里人口老龄化严重,生育的孩子很少。
I was reading this article on Italy where they were talking about how the population is very old, and they're not having a lot of kids.
这是不可持续的。
And, this is unsustainable.
你只能这样维持一段时间,之后那里就没人住了。
Like, you can only do this for so long before you don't have anybody living there anymore.
我们从没想过这会成为可能,但如果没人生孩子,人口消失其实用不了多久。
And we don't think of that as being a possibility, but it doesn't take that long if nobody has kids for there to be no more people left.
比如,一百年后会怎样?
Like, how long, a hundred years?
假如根本没人生孩子。
Like, if nobody has kids.
一百年后,就没有人了。
Hundred years from now, there's no people.
这其实很简单。
It's real simple.
你得生孩子,那你得生多少个呢?
You have to make people, and how many do you have to make?
你能生得出来吗?
And can you make them?
因为你可能想在37岁的时候开始生,然后去看医生,医生会说,这可说不准。
Because you might wanna start making them when you're 37, and you might go to a doctor, and the doctor's like, well, this is touch and go.
你得做试管婴儿,然后经历这一大堆麻烦事,打针,还得精确安排时间。
You're gonna have to do in vitro fertilization, and then you go through all this shit, you're taking shots, and you're fucking timing everything.
而且
And
而且除此之外,哦。
And on top of that Oh.
顺便说一下,我偶尔还是能听到奇怪的音频。
By the way, I'm still getting funny audio every once in a while.
哦,这很奇怪,因为我这边没发现,可能是插头的问题。
Oh, that's weird, because I don't... maybe it's the plug.
耳机。
Headphones.
是的。
Yeah.
我刚把它拔出来再重新插上。
I just unplugged it and plugged it back in.
抱歉。
Sorry.
现在怎么样?
How's that?
检查一下,检查一下?
Check check?
好点了吗?
Better?
检查一下,检查一下。
Check check.
嗯,我不确定。
Well, I don't know.
好一些了。
It's better.
通常会好一些。
It's usually better.
好了98%。
It's 98% better.
哦,不。
Oh, no.
还是在断断续续,时有时无。
It's still dropping out, dropping in.
有意思。
Interesting.
也许我们的耳机坏了。
Maybe we got a bad headphone.
你去拿那边那个耳机试试?
Why don't you grab that headphone right there?
让我试试。
Let me
对。
yeah.
也许那个耳机彻底坏了。
Maybe that headphone's gone dead.
这些耳机都老得要命了。
Well, these are old as fuck.
大概该换新的了。
Probably need new ones.
不。
No.
但我的意思是,有多少人曾在上面吐过,还有他妈的
But, I mean, it's just like, how many people have thrown up on that and fucking
有多少人曾在上面吐过?
How many people have thrown up on that?
有多少人喝得烂醉如泥,还把这东西摔在桌上?
How many people have been drunk as fuck and banged that off the table?
有多少人戴过这种耳机,就是那个传奇的
How many people have worn these headphones, like, the legendary
哦,有太多人戴过这些耳机了。
Oh, a lot of fucking people have worn those headphones.
它们很有故事。
They're storied.
这真有点奇怪。
It is weird.
根本没人会去想这个。
Like, no one even thinks about it.
你就是随便戴上去而已。
You just kinda put them on.
但你知道,如果这是个马桶座圈,那就不一样了,是的。
But, you know, if it was like a toilet seat... Yeah.
你肯定会说:天哪。
You would you would be like, Jesus.
光屁股的人就坐在这儿?
Naked butts were right here?
但这是耳朵。
But it's ears.
这是皮肤和脸还有
It's like skin and face and
有意思。
Interesting.
还是有点奇怪,但没关系。
Still still weird, but it's fine.
没那么糟。
It's not too bad.
那个连接肯定有问题。
There must be something wrong with that connection.
是的。
Yeah.
肯定是什么连接问题,但是
There must be a connection thing, but it
我们要不要暂停一下,试着解决一下?
should. Should we pause and try to figure it out?
我们可以先暂停一会儿。
We can do that for a second.
好的。
Yeah.
好的。
Okay.
我们暂停一下。
We'll pause.
各位,马上回来。
We'll be right back, folks.
现在好像正常了?
Seems to be working now?
是的。
Yeah.
好像正常了。
It seems to be working.
你觉得我们刚才说到哪儿了?
So where were we?
哦,说到人变成机器人了。
Oh, on people becoming robots.
性生活了。
Sex anymore.
是的。
Yeah.
人变成
People becoming
除此之外,还无性别。
On top of that genderless.
我认为,如果我们不谨慎,这些AI系统(比如ChatGPT)的后续版本与人类建立深厚有意义的联系,可能会带来令人兴奋的积极可能性,但也存在负面风险——到时候,你大部分朋友可能都不再是真人了。
I do think if we're not careful... I think there's exciting positive possibilities, but there's also negative possibilities of these AI systems, like ChatGPT but later versions, forming deep meaningful connections with human beings where most of your friends... no.
你的人际亲密关系,无论是友谊还是与智能体的深度连接,都将主要来自AI系统。
Most of your intimacy in terms of friendships and like a deep connection with an intelligent entity comes from AI systems.
你能想象吗?你开车上班时,和增强现实助手闲聊打趣,AI特别幽默,还成了你的好兄弟?
Could you imagine if you're driving to work and you and the AR are just having a conversation shooting the shit and the AI is really funny and the AI is your buddy?
嘿,莱克斯,最近咋样,老兄?
Like, Lex, what's going on, bro?
我们在做什么?
What are we doing?
莱克斯,我们怎么处理这份狗屁工作?
Lex, what are we doing with this bullshit job?
去他的这个地方。
Fuck this place.
我们回家吧。
Let's go home.
是的。
Yeah.
我们去吃冰淇淋吧。
Let's have ice cream.
你还在笑。
And you're laughing.
我还有工作要忙。
I got work to do.
我知道。
I know.
我在瞎搞。
I'm fucking around.
是的。
Yeah.
想象一下。
Imagine.
对。
Yeah.
你和那个女朋友在干什么?
What are you doing with that girlfriend?
好。
Yeah.
来吧,莱克斯。
Come on, Lex.
她总是对你刻薄,不停地唠叨你。
She keeps being mean to you, nagging you all the time.
你不需要她。
You don't need her.
说话像个泼妇,莱克斯。
Coming off like a bitch, Lex.
你不想那样做。
You don't wanna do that.
她不会尊重你的。
She's not gonna respect you.
你必须和她分手,才能让她尊重你。
You're gonna have to break up with her just so she respects you.
你为什么不对她下手,莱克斯?
Why don't you murder her, Lex?
是的。
Yeah.
莱克斯,有一种方法可以脱身。
Lex, there's a way to get away with it.
我只是这么说。
I'm just saying.
我在开玩笑呢,伙计。
Joking around, buddy.
我在开玩笑。
Joking around.
下一秒,你就被拖到沼泽里,装进一个该死的尸袋。
Next thing you know, it's taking you to the swamp with a fucking body bag.
你听过那个家伙的故事吗?他上网搜索了所有关于如何处理你的...
Did you hear the story about that guy that googled all this stuff about, like, what to do with your
上帝。
god.
他一直搜到早上九点半。
He googled till, like, 09:30 in the morning.
那个变态家伙。
That sick fuck.
像这样的人,真不知道有多蠢。
Like, how dumb, I guess, some people are.
我们都知道。
We we know.
这是事实。
This is a fact.
我们知道有些人就是他妈的特别蠢。
We know some people are just really fucking dumb.
他们根本看不到未来。
They really can't see the future.
我想知道那家伙是不是也用了什么药物。
I wanna know if that guy was on anything too.
我想知道他有没有服用什么精神类药物。
I wanna know if he was on any kind of psych meds.
你知道吗?
You know?
你能再给我讲一遍这个故事吗?
Can you tell me the story again?
有个家伙杀了他老婆,兄弟,他们找到了他的谷歌搜索记录。
Oh, some guy killed his wife, man, and they they found his Google search.
这太可怕了。
It's horrific.
他搜的是怎么肢解尸体,怎么才能让尸体溶解,要花多长时间。
It's like, how to dismember a body, how long does it take for a body to dissolve?
真是让人恶心。
It's like, ugh.
是把人剁碎好,还是整具搬走好?
Is it best to cut someone up or move them whole?
他搜索的全是这种最恐怖的内容,整整一夜到早上都在查。
Like, what the... he just googled the most horrific stuff, and he did it for, like, the entire night into the morning.
谷歌搜索上会有这种结果吗?
Are there results for that in a Google search?
把身体部位放进氨水里会怎样?怎么清理木地板上的血迹?肢解和处理尸体的最佳方式是什么?
What happens if you put body parts in ammonia, how to clean blood from a wooden floor, dismemberment and the best ways to dispose of a body?
能从部分遗骸上进行身份鉴定吗?
Can identification be made on partial remains?
DNA能保存多久?
How long does DNA last?
天哪,这太疯狂了,老兄?
Like, what the fuck, man?
尸体多久开始散发臭味?
How long before a body starts to smell?
没有尸体,能以谋杀罪起诉他吗?
Can he be charged with murder without a body?
这人太变态了,简直丧心病狂。
This guy is fucking it's so sick.
这哥们儿整晚都在谷歌上搜索如何完美掩盖谋杀。
So this dude just goes through Google all night long trying to figure out how to get away with murder.
他可能仅仅因为问了这个问题就能脱罪。
Well, he might actually get off on just asking the question.
对吧?
Right?
不行。
No.
不行。
No.
因为他们找到了一把带血的刀。
Because then they found a bloody knife.
是的。
Yeah.
他去了商店。
Like, he went to the store.
是的。
Yeah.
他们发现我并没有,
They found... I'm not,
像是在反驳。
like, pushing back.
我只是说他可能也会因为
I'm just saying he might also get off on
我不觉得他能脱罪。
I don't think he's getting off at all.
我不觉得他有机会脱罪。
I don't think he has a chance of getting off.
我有很多问题。
I have a lot of questions
关于他们找到了那把刀。
about They found the knife.
人性吧,也许我在这方面太天真了,但我看了《达默》的纪录片。
Human nature... maybe I'm naive in this, but I watched the Dahmer documentary.
是的。
Yeah.
不是。
No.
不是那部纪录片。
Not the documentary.
那部电影?
The movie?
那部电影。
The movie.
对。
Yeah.
而且纪录片也让你对类似的事情有了非常不同的看法
And then also, the documentary is like, it gives you very different perspectives on what, like
你现在是达默的同情者了吗,小子?
Are you a Dahmer sympathizer now, boy?
不是。
No.
不是。
No.
好吧。
Okay.
不是。
No.
但它让你意识到
But like it makes you realize
往那个方向去。
go in that direction.
不。
No.
它让你意识到有些人的大脑是损坏的。
It makes you realize that some people's brain is broken.
是的。
Yes.
对。
Yeah.
我也这么认为。
I think so.
就是这样,没错。
And that's like Yeah.
有些人的大脑可能有点损坏,但他们仍然是社会中的正常成员,但也可能存在极端自恋者、反社会者、心理变态者,你必须明白,这个世界虽然不是充满,但确实有一些魅力十足的心理变态者在四处走动。
And then some people's brains might be a little bit broken, and they're still functioning members of society, but there might be extreme narcissists, there might be sociopaths, psychopaths; you have to kinda understand that the world is, potentially not full of, but has some charming psychopaths walking around.
百分之百。
A 100%.
其中一些人可能在对冲基金之类的地方非常成功。
And some of them are probably, like, really successful in, like, hedge funds and shit.
是的。
Yeah.
你知道,有些人就是能随意调动资金,一些公司CEO可能在生产会害死人的产品。
You know, people that can just, like, move money around, and people that are CEOs of certain companies that might be making products that kill people.
有意思。
Interesting.
没人。
Nobody.
大量搜索。
Lots of googling.
针对布莱恩·沃尔什的谋杀案。
Murder case against Brian Walsh.
专家说可能很难证明,但我以为他们找到了带血的刀。
Maybe hard to prove, experts say, but I thought they had a knife with blood on it.
这是几天前的事。
This was it's a couple days ago.
所以是星期五吧,我猜,这次通话。
So Friday, I guess, this call.
哦,好吧。
Oh, okay.
我读到他们发现了一把带血的刀。
I had read that they found a a bloody knife.
整件事真是太糟了。
Just the the whole thing is so fucked.
但我真的很想知道他是不是用了什么药。
But but I wanna know if he's on something.
我会非常感兴趣,因为有些药物确实能让人放松。
I'd be really fascinated because there's certain drugs that will, like, alleviate.
你根本不会在乎那些事。
You won't you won't worry about shit.
所以也许他根本不在乎去搜索这些东西。
So maybe he's, like, not worrying about, like, googling all this stuff.
哦,一切都会好起来的。
Oh, it's all gonna work out.
你知道,我要杀了她,但一切都会好起来的。
You know, I'm gonna kill her, but it's all gonna work out.
而且他还在不停地搜索。
And he's, like, googling.
或者他可能在吸冰毒。
Or he might be on speed.
你知道的。
You know?
很多人都在吸冰毒,老兄。
A lot of people are on speed, man.
很多人在吃阿得拉。
A lot of people are on Adderall.
这简直太惊人了。
It's fucking stunning.
我们周围有这么多亢奋的人,真是令人震惊。我们其实身处一种兴奋剂文化。
It's stunning how many, like, hyped-up people we have out there. We really have a speed culture.
这让你变得非常高效。
It makes you very efficient.
你能把事情搞定。
You get shit done.
你精力充沛,有些人就喜欢这样。
You got plenty of energy, and some people love it.
嗯哼。
Mhmm.
这种文化对我们的社会影响有多大?
And, like, how much is that flavoring our culture?
莱克斯,要是能摆脱这种东西岂不是很好?
Wouldn't it be nice to get rid of that, Lex?
让我们逐步淘汰它。
Let's phase that out.
所有毒品?
All drugs?
是的。
Yeah.
一般来说。
In general.
心智方面的问题。
Problems with mind.
嗯。
Mhmm.
只能这样理解。
Could figure only.
好吧,你可以那样做,但那需要很多工作。
Well, you could do it that way, but that takes a lot of work.
或者我们可以从一开始就进行基因工程改造。
Or we could just genetically engineer it from the jump.
不再有情绪。
No more emotions.
不再有情绪。
No more emotions.
因为情绪,你知道的,人生就是苦难。
Because emotions, you know, life is suffering.
陀思妥耶夫斯基和尼采认为,最终你拥有的一切美好,最终都会失去。
Dostoevsky and Nietzsche: ultimately, every good thing you have, you're eventually going to lose.
你与所爱之人每一次相遇,最终都会变成告别。
Every hello with the person you love is eventually going to be a goodbye.
那为什么要说再见呢?
Why say hello ever?
那为什么要再说一次你好呢?
Why say hello ever again?
还有,为什么呢?
Also, why?
为什么非得这样呢?
Why does it have to be like that?
我们的脑海中有一种观念,认为我们现在的生存方式之所以如此,是因为它带给我们情感,制造了困境、解决方案、冲突与和解。
Like, we have this idea in our head that this way we live is, like, ultimate, because to us it provides emotions, and because it creates dilemmas and solutions and conflict and resolution.
我们在与他人互动时,脑海中始终有大量思绪涌动,以至于我们觉得这种体验对存在至关重要。
There's so much going on in our minds all the time when it comes to interacting with each other that we feel like it's imperative for existence.
但为什么呢?
But why?
只是因为我们只知道这种方式。
It's just because it's the only way we've known.
你知道的。
You know?
我们所有人都必须经历痛苦才能……但为什么非得经历痛苦才能获得幸福呢?
We all... you have to suffer in order to... but why do you have to suffer in order to be happy?
如果你只是快乐,难道不好吗?
Wouldn't it be better if you're just happy?
我们真的非得受苦不可吗?
Like like, do we really fucking need to suffer?
这不能被设计消除掉吗?
Couldn't that be engineered out?
现在,这话出自一个故意一直让自己受苦以便保持快乐的人,而且确实有效。
Now this is coming from a person who purposely suffers all the time so that I could stay happy, and it does work.
但天啊,我非得这样才行吗?
But God, do I have to do it that way?
好吧,我们有一个了不起的计算机器,叫做进化,它造就了人类。
Well, there's an incredible computation machine we call evolution that has constructed human beings.
你想干涉它吗?
You wanna mess with that?
你想让一群来自旧金山的软件工程师来干预吗?是的。
You wanna get a few software engineers from San Francisco to mess... Yes.
就是这套演化的运算系统,也就是地球这套系统。
With the computation system that is evolution, that is earth.
对。
Yes.
这台巨型计算机耗费了数十亿年的时间,在物种演进到更复杂的阶段之前,光花在研究细菌上的时间就有十亿年,一路摸爬滚打才有了今天,之后又走完了所有这些不可思议的发展阶段。
This giant computer that for billions of years... spent a billion years on bacteria trying to figure shit out before it advanced, and then went through all of these incredible stages.
我们称之为地球生命的这一整套生态系统,搞不好还是外星人播种到这里来的呢。
This entire ecosystem that we call life on earth probably planted here by aliens.
就是说啊,而且最近这群猴子还变得超级聪明,现在我们居然要彻底改变一切了。
That, and recently these monkeys started to get super clever, and now we're going to completely change everything.
对。
Yes.
你知道为什么吗?
You know why?
为什么呢?
Why?