TED Radio Hour - 创作者经济如何让你说话像互联网一样 封面

创作者经济如何让你说话像互联网一样

How the creator economy is making you talk like the internet

本集简介

网络语言学家亚当·亚历克西奇揭秘自己如何通过解读网络俚语走红。此外,在我们的创作者经济系列第二集中,探讨为何以线上谋生意味着为"算法"工作。TED Radio Hour+ 订阅用户现可解锁独家番外篇,获取TED演讲者的更多观点,并跟随制作团队一探幕后花絮。Plus订阅还能让您无广告收听常规节目(比如本期内容!)。立即注册:plus.npr.org/ted。赞助商信息及广告选择:podcastchoices.com/adchoices。NPR隐私政策。

双语字幕

仅展示文本字幕,不包含中文音频;想边听边看,请使用 Bayt 播客 App。

Speaker 0

这里是TED广播时间。

This is the TED Radio Hour.

Speaker 0

每周,带来突破性的TED演讲。

Each week, groundbreaking TED Talks.

Speaker 1

我们现在的任务是大胆梦想。

Our job now is to dream big.

Speaker 2

在TED大会上呈现。

Delivered at TED conferences.

Speaker 2

为了实现我们期待的未来。

To bring about the future we want to see.

Speaker 2

遍布全球。

Around the world.

Speaker 2

去理解我们是谁。

To understand who we are.

Speaker 2

从这些演讲中,我们为您带来会令您惊讶的演讲者和观点。

From those talks, we bring you speakers and ideas that will surprise you.

Speaker 1

你永远不知道会发现什么。

You just don't know what you're gonna find.

Speaker 2

挑战你。

Challenge you.

Speaker 2

我们真的需要自问:为什么这值得关注?

We truly have to ask ourselves, like, why is it noteworthy?

Speaker 2

甚至改变你。

And even change you.

Speaker 2

我真的感觉自己像是变了一个人。

I literally feel like I'm a different person.

Speaker 2

是的。

Yes.

Speaker 2

你有这种感觉吗?

Do you feel that way?

Speaker 2

来自TED和NPR的值得传播的思想。

Ideas worth spreading from TED and NPR.

Speaker 2

我是玛努什·佐莫罗迪。

I'm Manoush Zomorodi.

Speaker 2

今天的节目是我们探索创作者经济系列的第二部分,讨论人们如何通过创意赚钱,以及在AI时代创意的真正含义。

Today on the show, part two of our exploration of the creator economy, how people are making money off of creativity, and what creativity even means in the age of AI.

Speaker 1

这是成为全职旅行博主的第一天。

This is day one of becoming a full time travel influencer.

Speaker 2

我经常外出就餐,过去一周我吃过的每家餐厅都能听到我们的声音。

I eat out a lot to hear us everywhere I ate this past week.

Speaker 2

如果有机会,美国超过一半的Z世代表示他们想成为社交媒体网红。

If given the opportunity, more than half of Gen Z in the US say they'd like to be a social media influencer.

Speaker 2

我听说过的一款产品……以创作内容谋生,通常是网络短视频的形式。

A product that I heard about from... Making a living by creating content, often in the form of short videos online.

Speaker 2

比如亚当·亚历克西奇。

Like Adam Aleksic.

Speaker 1

我用一种自称为'教育类网红口音'的方式说话。

I speak in a what I call an educational influencer accent.

Speaker 1

我会说得非常快。

I'll talk very quickly.

Speaker 1

我会强调更多词汇来吸引你观看我的视频。

I'll stress more words to keep you watching my video.

Speaker 1

而这一切都是通过算法进行交流的预期方式的一部分。

And all of that is part of this expected way to talk through the algorithm.

Speaker 2

亚当在社交媒体上自称词源学爱好者,在TikTok和Instagram等平台拥有超过300万粉丝。

Adam calls himself etymology nerd on social media, where he has over 3,000,000 followers across platforms like TikTok and Instagram.

Speaker 1

我在大学学习语言学,当你即将获得语言学学位时,经典的自问是:接下来我该做什么?

I studied linguistics in college, and the classic question you ask yourself when you're graduating with a linguistics degree is, well, what do I do next?

Speaker 2

是啊。

Yeah.

Speaker 1

我想,不妨试试社交媒体。

I thought, you know, I might as well give social media a try.

Speaker 1

创造了一种鸟语。

Made a bird language.

Speaker 1

它被称为。

It's called.

Speaker 2

亚当三年前开始发布关于语言的内容。

Adam started posting about language three years ago.

Speaker 1

八度音程表示情绪,半音,以及

The octave indicates mood, the semitone, and

Speaker 2

包括一个关于他自己交流习惯的早期系列

Including an early series about his own habit of communicating

Speaker 1

我开始了。

Here I go.

Speaker 2

像鸟儿一样。

Like a bird.

Speaker 2

其他视频讲解语音学。

Other videos explain phonetics.

Speaker 1

我很确定第一个音节是

I'm pretty sure first syllable is

Speaker 2

一个模仿作品。

a parody.

Speaker 2

歌手Cardi B发出的。

Singer Cardi B makes.

Speaker 1

显然,这个r音比普通颤音持续得更长。

Obviously, the r is held longer than a regular trill.

Speaker 2

在此期间,他目睹观众逐渐增多。

All the while, he watched as the audience grew.

Speaker 2

得益于他可爱的书呆子气质,也归功于算法的响应方式。

Thanks to, yes, his endearing nerdiness, but also how the algorithm responded.

Speaker 1

使用两个茶词中的一个,具体用哪个取决于

Uses one of two words for tea and which one they use depends on how the

Speaker 2

以他最受欢迎的一个视频为例,该视频解释了为何我们有些人用chai这个词,而其他人称之为

Take one of his most popular videos, which explains why some of us use the word chai while others call it

Speaker 1

Chao De Chinos。

Chao De Chinos.

Speaker 1

如今,'cha'沿着陆路贸易路线传播。

Now cha spread along land trade routes.

Speaker 1

给我

Give me

Speaker 2

这个视频已被观看超过1600万次。

This video has been watched over 16,000,000 times.

Speaker 1

我想我从高中就知道这件事了,但很长一段时间我都没考虑过制作相关视频,因为我觉得,哇,大家都知道这个。

I think I've known about this thing since high school, and I didn't even consider making a video about it for the longest time because I thought, wow, everybody knows this.

Speaker 1

但我想人们其实不知道。

But I guess people don't.

Speaker 2

视频中,亚当站在地图前,指着几个世纪前的贸易路线。

In the video, Adam stands in front of a map pointing out centuries old trade routes.

Speaker 1

基本上每种语言都有这两个词中的一个。

Every language basically has one of these two words.

Speaker 1

'chai'这个词通过陆路沿着丝绸之路贸易路线传播。

The word chai spread by land along Silk Road land trade routes.

Speaker 1

像印地语、俄语、波斯语等语言中用的是'chai'这个词;而'tea'这个词则沿着海洋贸易路线传播。

The word chai is in languages like Hindi, Russian, Persian. The word tea spread along ocean trade routes.

Speaker 1

现在欧洲每个国家都想得到一些茶叶,于是他们开始通过海洋与荷兰人进行贸易

Now every country in Europe wants to get their hands on some tea, so they start trading with the Dutch through ocean

Speaker 2

贸易路线。

trade routes.

Speaker 2

整件事有种紧张、略带狂躁的感觉,就像他在匆忙上完这堂历史课,尽管他说的每个词都经过精心选择。

There's an intense, slightly manic feel to the whole thing, like he's rushing through this history lesson even though every word he says has been chosen with care.

Speaker 1

是的。

Yeah.

Speaker 1

我觉得我用了一个钩子开头。

I think I opened with a hook.

Speaker 2

首先,要有开场白。

First, there's the intro.

Speaker 1

哦,你不知道每个国家都用两个词中的一个来指代茶。

Oh, you didn't know about how every country uses one of two words for tea.

Speaker 1

所以一开始我就用了'你'这个词,这是第二人称代词,总是更容易传播,因为人们喜欢关于自己的视频。

So right off the bat, I use the word you, which is a second person pronoun, which always goes more viral because people like videos about them.

Speaker 1

所以如果我换一种开场白说,这是一个关于茶的视频,那就不会火,即使内容完全相同,理念也完全一样。

So if I just said as an intro instead, this is a video about Chai and that's not gonna go viral, even if it's the exact same content, the exact same, like, idea.

Speaker 2

然后是故事情节。

Then there's the storyline.

Speaker 1

这可能有点简化,比如波兰语里有这个词,但我把故事简化了,因为简单的叙事在社交媒体上传播非常重要。

It's maybe slightly reductive, like Polish has the word, but I'm sort of simplifying the story because simple storytelling is so important to go viral on social media.

Speaker 1

你不能有多个叙事线。

You cannot have multiple narratives.

Speaker 1

在社交媒体上,你一次只能讲一个故事。

You have to tell one narrative at a time on social media.

Speaker 2

还有他的表达方式。

And his delivery.

Speaker 1

我语速很快,用的是教育类网红的腔调。

I am talking quickly using my educational influencer accent.

Speaker 1

我在地图上四处指点,来回移动,制造视觉干扰。

I am pointing around in the map and kinda moving around, like, creating visual disruptions.

Speaker 1

所有这些某种程度上都是你走红的原因。

And all of that kind of is part of how you go viral.

Speaker 2

但除了历史课程,亚当还将焦点放在粉丝们当下使用的词汇上。

But also, in addition to history lessons, Adam turns the spotlight on words his followers are using right now.

Speaker 1

你知道'他下厨了'是好事,但'他完蛋了'这句话

You know how the phrase he cooked is a good thing, but the phrase he's cooked

Speaker 2

似乎每天都在变化。

Which seemed to change by the day.

Speaker 1

关于'skibidi'这个词最有趣的是……关于'W rizz'的特点是,'W'并不代表……

The most interesting thing about the word skibidi is that... The thing about W rizz is that W doesn't mean...

Speaker 2

他的独门秘诀就是向用户解释他们为何这样说话。

His special sauce is explaining to his users why they talk the way they do.

Speaker 2

算法也喜欢这样。

The algorithm likes it too.

Speaker 2

那么是他在操控算法,还是算法在操控他?

So is he working the algorithm or is the algorithm working him?

Speaker 2

操控着所有制作和消费内容的人。

Working everyone who makes and consumes content.

Speaker 2

今天的节目中,我们将继续探讨创作者经济系列。

Today on the show, we're continuing our series exploring the creator economy.

Speaker 2

我们如何买卖创意,以及它如何改变我们的价值观。

How we're buying and selling creativity and how it's changing what we value.

Speaker 2

上次,我们与企业家兼艺术家杨西·斯特里克勒探讨了商业层面。

Last time, we looked at the business side with entrepreneur and artist, Yancey Strickler.

Speaker 2

这次,我们来看看创作者经济如何影响我们的文化和语言。

This time, how the creator economy is influencing our culture and language.

Speaker 2

过去几年里,亚当将他的视频内容拓展成了多重职业身份。

So in the last couple years, Adam has parlayed his videos into a multi hyphenate career.

Speaker 1

我是一名语言学家、网红,也是《算法语言》的作者。

I'm a linguist, influencer, and author of Algospeak.

Speaker 2

24岁的他,自称收入对毕业于相对冷门专业的大学生来说还算不错。

At 24, he makes what he calls a decent salary for a college grad who majored in a relatively obscure field.

Speaker 1

大学时我的研究更偏向政府与语言学方向,本科论文全是关于塞尔维亚和克罗地亚的语言民族主义。

In college, my research was more focused on government and linguistics, and my undergrad stuff was all about language nationalism in Serbia and Croatia.

Speaker 1

哦。

Oh.

Speaker 1

所以我就这样进入了这个行业。

So that's how I ended up in that line of work.

Speaker 2

我这么问绝非有意冒犯。

I really don't mean to be impertinent by asking.

Speaker 2

我知道有些人会说,这家伙才二十出头,

I know some people will say like, how can this guy he's like in his early twenties.

Speaker 2

怎么能自称语言学家?

How can he call himself a linguist?

Speaker 1

是啊。

Yeah.

Speaker 1

当然。

Of course.

Speaker 2

有些人可能会遇到你,然后想,天哪,任何人都可以成为任何领域的专家。

Some people might come across you and think, oh my god, anybody can be an expert about anything.

Speaker 1

确实,我只有语言学本科学位,没有研究生学位。

It is true that I have an undergrad degree in linguistics, not a graduate degree.

Speaker 1

我认为我所做的事其实特别适合研究俚语词源学,因为要真正理解某些内容如何在网上传播,你自己必须身处网络环境。

I think what I'm doing is actually uniquely suited to studying slang etymology because to really understand what it means for something to spread online, you have to be online yourself.

Speaker 1

我绝对没想到自己最终会研究社交媒体语言学,但事情就这样发生了——我开始在这些平台上制作关于语言、词源学的视频,然后我注意到自己的表达开始受到一些限制。

And I definitely never knew I'd end up studying social media linguistics, but it just happened that I started making videos about language, about etymology on these platforms, and I started noticing my own speech being restricted a little bit.

Speaker 1

我开始注意到自己会如何绕过平台限制来调整语言表达。

I started noticing how I would reroute my language around platform constraints.

Speaker 2

能举个例子吗?

Can you give me an example of that?

Speaker 2

比如说什么?

Like, what?

Speaker 1

是的。

Yeah.

Speaker 1

在TikTok上你不能说'杀'这个词。

Well, you can't say the word kill on TikTok.

Speaker 1

我的意思是,你可以说,但会被限流。

I mean, I guess you can, but it's suppressed.

Speaker 1

你的视频会被推荐给更少的人。

Your video is gonna be shown to fewer people.

Speaker 1

于是人们转向替代说法,比如'unalive'(非活)。

So people turn to alternatives like unalive.

Speaker 1

当你‘非活’后会发生什么?

What happens after you unalive?

Speaker 2

实际上,在读到你的书之前,我从未听说过‘非活’这个词。

I had not heard of unalive until I read your book, actually.

Speaker 2

显然,我接触的年轻人还不够多。

Clearly, I don't hang out with enough young people.

Speaker 2

但人们开始用‘非活’代替‘被杀’或‘死亡’,是为了规避审查或平台的语言限制。

But, like, people started saying unalive instead of someone was killed or dead to get around censorship or get around restrictions on language on these platforms.

Speaker 1

没错。

Exactly.

Speaker 1

这是个更安全的平台替代词。

It's a more platform safe alternative.

Speaker 1

据我们所知,'unalive'(非活)一词始于2013年的《终极蜘蛛侠》梗,后来在2010年代末演变成Roblox梗,随后被TikTok上的心理健康社区使用。

As far as we can tell, the word unalive started in 2013 with an Ultimate Spider-Man meme that then turned into a Roblox meme in the late twenty tens that then started being used by the mental health community on TikTok.

Speaker 1

但我也发现人们在线下使用‘非活’这个词。

But then I also found that people were using the word unalive offline.

Speaker 1

有些初中生在课堂作文里会讨论哈姆雷特考虑‘非活’自己。

There are kids in middle schools who talk about Hamlet contemplating unaliving himself in their classroom essays.

Speaker 1

或是课堂上讨论《化身博士》中发生的‘非活’行为。

Or a classroom discussion on the unaliving that happens in Dr. Jekyll and Mr. Hyde.

Speaker 2

现在有请亚当·亚历克西奇登上TED讲台。

Here's Adam Aleksic on the TED stage.

Speaker 1

这些并非假设情境。

And these aren't hypothetical situations.

Speaker 1

这些都是我从上千名中学教师关于这个词的调查中提取的真实案例。

These are actual examples drawn from the thousand plus middle school teachers I've surveyed about this word.

Speaker 1

显然,作为一个新近出现的词汇,'非活体'在各种场景中的使用频率令人印象深刻。

Clearly, for such a recent word, unalive shows up in an impressive range of scenarios.

Speaker 1

但其主要功能似乎是委婉表达。

But the main function appears to be euphemistic.

Speaker 1

许多孩子在谈论死亡等令人不适的话题时会使用这个词,因为'非活体'听起来不那么可怕。

Many kids use the word when they're uncomfortable talking about topics like death since unalive sounds like a less scary word.

Speaker 1

从很多方面来说,这并不新鲜。

And in many ways, this is nothing new.

Speaker 1

自语言诞生以来,我们就在用委婉语表达死亡。

We've been euphemizing death as long as we've had language.

Speaker 1

例如单词'deceased'(已故)源自拉丁语'decessus',它本身就是对更早的拉丁语死亡词汇'mors'的委婉替代。

The word deceased, for example, comes from Latin decessus, which was a euphemism for the previous Latin word for death, mors.

Speaker 1

显然,就连坚忍的罗马人也和当今的中学生一样对死亡感到不安。

Apparently, even the stoic Romans were as queasy about death as today's middle schoolers.

Speaker 1

但'非活体'与'已故'之间存在一个关键区别。

But there is a crucial difference between unalive and deceased.

Speaker 1

这就是为什么我们只能用'非活'这个词,因为在TikTok上不能说'杀死'。

And that's that we only got the word unalive because you can't say kill on TikTok.

Speaker 1

他们有个神秘的算法,会删除或压制任何可能违反社区准则的帖子。

They have a mysterious algorithm that removes or suppresses any post that might violate their community guidelines.

Speaker 1

所以人们就用'非活'这个词来绕过限制。

So people got around that with the word unalive.

Speaker 1

初中生们不知道这一点。

The middle schoolers don't know this.

Speaker 1

他们在网上看到这个词或从朋友那里听说,就以为它和其他词没什么两样。

They see the word online or hear from friends and assume it's a word like any other.

Speaker 1

公平地说,你可能也不知道'已故'这个词的来历,除非你是个词源学爱好者。

And fair enough, you probably didn't know where the word deceased came from, unless you're some kind of etymology nerd.

Speaker 1

但'deceased'这个词的出现,并不是因为无法在古罗马石碑上刻下'mors'这个词。

But deceased didn't happen because it was impossible to carve the word mors into an ancient Roman tablet.

Speaker 1

我们正进入一个由社交媒体算法驱动的全新语言变革时代。

We are entering an entirely new era of language change driven by social media algorithms.

Speaker 1

所谓的'算法话术'——传统上对这类平台规避行为的称呼——只是这些平台如何影响我们语言的冰山一角。

Algospeak, as this sort of platform circumvention is traditionally called, is only the tip of the iceberg of how these platforms are affecting our language.

Speaker 1

我也注意到自己会迎合病毒式传播的趋势。

I also noticed, you know, myself playing into viral trends.

Speaker 1

我看到其他网红也在这么做。

I look at other influencers doing this too.

Speaker 1

我关注那些抓人眼球的表达方式,因为这些平台的设计就是为了把你的注意力变现,整个激励结构都在促使网红们效仿。

I look at attention grabbing language because these platforms are designed to monetize your attention, and all the incentive structure is there for influencers to replicate.

Speaker 1

我们该如何吸引人们的注意力?

How do we grab people's attention?

Speaker 1

因此,语言也更多地围绕着转瞬即逝的潮流和吸引眼球的内容演变。

So language is also kind of revolving more around ephemeral trends and what grabs attention.

Speaker 1

而算法通过用户分类形成了小圈子和回音室。

And we have algorithms creating in groups and echo chambers through how they categorize users.

Speaker 1

这些小圈子实际上充当了语言形成的孵化器。

And these in groups actually serve as incubators for language formation.

Speaker 1

所有这些就是我所说的'算法语言'——广义上指我们的语言正被社交媒体算法广泛塑造。

And all of this is sort of what I call algo speak, the expanded definition that our language is broadly being shaped by social media algorithms right now.

Speaker 2

我们能先聊聊这个词的前半部分吗?'算法语言'。

Can we talk about the first part of that word, algo speak?

Speaker 2

'算法'指的是算法。

Algo meaning algorithms.

Speaker 2

对吧?

Right?

Speaker 2

比如,我们来谈谈这个算法。

Like, let's talk about the algorithm.

Speaker 2

作为一名前科技记者,我想说,其实并没有一个统一的算法。

As a former tech reporter, I'm like, well, it's not the algorithm.

Speaker 2

是的。

Yeah.

Speaker 2

算法都有各自的名称。

There's like names of algorithms.

Speaker 2

对吧?

Right?

Speaker 2

就像,但当人们使用它时,他们具体指的是什么?

Like but but when people use it, what are they referring to?

Speaker 2

你能详细说明你观察到的现象,或者你是如何了解它在某些地方的具体运作方式吗?

And can you walk us through what exactly you have noticed or how you've learned about how it does work in certain places?

Speaker 1

是的。

Yeah.

Speaker 1

我很高兴你提到这不仅仅是算法问题,因为我们在讨论算法时最大的倾向之一就是将其简化、拟人化。

And I'm so glad you brought up that it's not the algorithm because one of the biggest tendencies we have when talking about the algorithm is to simplify it, to personify it.

Speaker 1

我想这让正在发生的事情感觉更正常些。

And that makes it feel more normal what's happening, I guess.

Speaker 1

嗯。

Mhmm.

Speaker 1

其实涉及许多不同的算法。

It is a lot of different algorithms.

Speaker 1

用户如何被分类是一个算法。

How users are classified is one algorithm.

Speaker 1

内容如何分类是另一个算法。

How content is classified is another algorithm.

Speaker 1

实际上,即使在这些算法内部,还有一系列子算法各自负责内容分类的不同环节。

And actually, even within those algorithms, there's a bunch of sub algorithms that are all doing their own thing with content classification.

Speaker 1

因此有许多不同的流程同时运行,所有这些都在视频从上传到最终出现在'为你推荐'页面的宏大机制中发挥作用。

So there's a lot of different processes running at the same time, and all of these are playing a part in the greater machination of how a video ends up from someone uploading it onto your for you feed.
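
Adam's point that "the algorithm" is really several cooperating algorithms can be sketched as a toy pipeline. Everything below — the feature names, tags, and scoring weights — is invented for illustration and bears no relation to any real platform's code:

```python
# A toy sketch of "the algorithm" as several cooperating algorithms,
# following Adam's description: one algorithm classifies users, another
# classifies content, and a third ranks candidates into a "For You" feed.
# All names, tags, and weights here are invented for illustration.

def classify_user(watch_history):
    """User-classification algorithm: bucket a user by dominant tag."""
    tags = [tag for video in watch_history for tag in video["tags"]]
    return max(set(tags), key=tags.count) if tags else "general"

def classify_content(video):
    """Content-classification algorithm: label a video by its first tag."""
    return video["tags"][0] if video["tags"] else "general"

def rank_feed(user_bucket, candidates):
    """Ranking algorithm: score candidates for this user's feed."""
    def score(video):
        topical = 1.0 if classify_content(video) == user_bucket else 0.0
        return topical + 0.1 * video["engagement"]  # engagement always helps
    return sorted(candidates, key=score, reverse=True)

history = [{"tags": ["linguistics"]}, {"tags": ["linguistics", "history"]}]
pool = [
    {"id": "cooking-101", "tags": ["cooking"], "engagement": 9},
    {"id": "chai-vs-tea", "tags": ["linguistics"], "engagement": 3},
]
bucket = classify_user(history)
feed = rank_feed(bucket, pool)
print([v["id"] for v in feed])  # the on-topic video outranks the busier one
```

Even in this toy version, the final feed depends on several separate decisions chained together — which is the point: there is no single "the algorithm" to train.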

Speaker 2

我是说,我在考虑自己。

I I mean, I'm thinking of myself.

Speaker 2

我很喜欢自己成功迫使Instagram的算法只给我推送自然和宠物内容。

I love how I have bullied the algorithm on Instagram into only giving me nature and pets.

Speaker 2

我保护了自己在那个平台的使用体验。

I've protected my experience on that platform.

Speaker 2

但它显然知道我是个喜欢长途散步的中年女性,不想在Instagram上看到新闻。

But it obviously knows I'm a middle aged woman who likes to go on long walks and doesn't wanna look at the news when she's on Instagram.

Speaker 1

是的。

Yep.

Speaker 2

没错。

Yeah.

Speaker 2

告诉我,这基本上就是我们讨论的内容吗?

Tell me about, like, is that basically what we're talking about here?

Speaker 1

很多人都觉得你可以训练算法,可以个性化定制它。

There's a lot of sense that you can train your algorithm, that you can personalize it.

Speaker 1

毕竟它叫'为你推荐'页面。

It is after all called the for you page.

Speaker 1

人们常说类似'我自己打造了推荐页'或'算法真的很懂我'这样的话。

And people often say things like, oh, you know, I built my own for you page or the algorithm really knows me.

Speaker 1

我认为这也符合训练算法的理念。

And I think this is also playing into that idea of training your algorithm.

Speaker 1

这是个简化版的说法。

It is a simplified story.

Speaker 1

是的,你在某种程度上训练了系统推荐给你的内容类型。

Yes, you partially have trained some aspect of what content is being recommended to you.

Speaker 1

但即便在你的Instagram推送中,主要是户外内容,每三个视频里可能就有一个是广告。

But also on your Instagram feed that is mostly, you know, outdoors content, every third video is probably an advertisement.

Speaker 2

嗯。

Mhmm.

Speaker 1

而你其实并不想看那些广告。

And you don't actually want to see the advertisement.

Speaker 1

所以这个算法真的是为你服务的吗?

So is the algorithm really for you?

Speaker 1

不是。

No.

Speaker 1

某种程度上它既为你服务,也为平台利益服务。

It's kind of both for you and for the platform's benefit.

Speaker 1

每条推送给你的视频,每个出现在'为你推荐'里的内容,本质上都是对平台有利的——因为这会获得你的参与度和注意力。

And every video that's shown to you, every video that shows up on the for you feed is actually something that benefits the platform, definitionally, because it gets your engagement, it gets your attention.

Speaker 1

所以传播的内容更多是为平台而非你服务,但它们都被包装成'为你推荐'的样子。

So the things that spread are more for the platform than for you, but it's all kind of packaged under this guise of for you.

Speaker 1

我认为个性化推荐是最危险的事情之一,因为它让我们完全忽视了更宏观层面发生的事。

And I think the personalization is one of the most dangerous things because we really forget what's going on on a broader level.

Speaker 2

稍后继续连线网络语言学家亚当·亚历克西奇,探讨社交媒体算法如何影响网络红人。

In a minute, more with online linguist Adam Aleksic on how social media algorithms are influencing influencers online.

Speaker 1

它不断改变着奖励机制的优先级,网红们只能靠直觉去摸索。

It's constantly changing what priorities, incentives it's rewarding, and influencers have to just feel it out.

Speaker 1

他们必须通过日常互动来了解算法。

They have to be aware of the algorithm by interacting with it on a daily basis.

Speaker 2

今天的节目内容是创作者经济第二部分。

On the show today, the creator economy part two.

Speaker 2

我是玛努什·佐莫罗迪,您正在收听的是NPR的TED广播时间。

I'm Manoush Zomorodi, and you're listening to NPR's TED Radio Hour.

Speaker 2

我们马上回来。

We'll be right back.

Speaker 2

本消息由Wise提供,这是一款全球资金管理应用。

This message comes from Wise, the app for using money around the globe.

Speaker 2

使用Wise管理资金时,您总能获得中间市场汇率,且无隐藏费用。

When you manage your money with Wise, you'll always get the mid market exchange rate with no hidden fees.

Speaker 2

加入数百万用户行列,访问wise.com。

Join millions of customers and visit wise.com.

Speaker 2

条款与条件适用。

Ts and cs apply.

Speaker 2

本消息由NPR赞助商TED Tech提供,这是TED旗下的播客节目。

This message comes from NPR sponsor, TED Tech, a podcast from TED.

Speaker 2

每周五收听关于技术如何影响人们对社会、科学、设计、商业等领域思考的对话、演讲与见解。

Every Friday, hear conversations, talks, and insights on the way technology shapes how people think about society, science, design, business, and more.

Speaker 2

在您获取播客的任何平台收听TED Tech。

Listen to TED Tech wherever you get your podcasts.

Speaker 2

这里是NPR的TED广播时间。

It's the TED Radio Hour from NPR.

Speaker 2

我是玛努什·佐莫罗迪。

I'm Manoush Zomorodi.

Speaker 2

本期节目是我们探讨创作者经济的第二部分。

On the show, part two of our look at the creator economy.

Speaker 2

今天的向导是创作者亚当·亚历克西奇。

Our guide today is creator Adam Aleksic.

Speaker 2

亚当被他的300万粉丝称为词源学极客,他破解了如何在网上走红的密码。

Adam is known as etymology nerd to his 3,000,000 followers online, where he's cracked the code of how to go viral.

Speaker 2

他的视频剖析了不同人群如何使用不同词汇,比如'demure'。

His videos dissect how different people use different words, like demure.

Speaker 2

端庄。

Demure.

Speaker 2

'端庄'这个词的含义是否永远改变了?

Is demure changed forever?

Speaker 2

你能向大家解释一下'demure'发生了什么变化吗?

Can you explain to people what's happened to demure?

Speaker 1

是的。

Yeah.

Speaker 1

之前有个关于在职场中表现得端庄、谨慎和可爱的'demure'风潮。

There was a demure trend about being demure, mindful, and cutesy in the workplace.

Speaker 1

上班化妆?

Make up for work?

Speaker 1

非常端庄。

Very demure.

Speaker 1

非常符合文化中关于含蓄内敛的理念。

Very... Playing into cultural ideas of reservedness.

Speaker 1

绿色切割眼影?不端庄。

A green cut crease? Not demure.

Speaker 2

Demure过去指谦逊含蓄。

Demure used to mean modest and reserved.

Speaker 1

在我吃沙拉的时候?

While I'm eating a salad?

Speaker 1

非常端庄。

Very demure.

Speaker 2

紧张。

Nervous.

Speaker 2

它在TikTok上走红,在那里低调反而成了表演。

It trended on TikTok where understatement became performative.

Speaker 2

多么讽刺。

How ironic.

Speaker 2

孩子们不知道如何保持端庄。

Kids don't know how to keep a demure.

Speaker 2

这股风潮席卷各处,从青少年到卡戴珊、詹妮弗·洛佩兹等名人都在用。

It went everywhere, from teens to celebrities like the Kardashians and Jennifer Lopez.

Speaker 2

这个,对着瓶子喝。

This, drink from the bottle.

Speaker 2

非常端庄。

Very demure.

Speaker 2

是的。

Yeah.

Speaker 1

Z世代有一种巨大的淡漠美学。

There is this huge Gen Z aesthetic of nonchalance.

Speaker 1

而‘端庄’这个词某种程度上是在调侃这种风格,但与此同时,我认为也有人真心实意地使用它。

And demure was sort of making fun of that, but at the same time, I think some people used it genuinely.

Speaker 2

这种对‘端庄’的新用法使其成为2024年dictionary.com的年度词汇。

This new use of demure made it dictionary.com's word of the year in 2024.

Speaker 1

而且这不仅仅是为了规避算法审查而创造的新词。

And it's not just new words to avoid algorithmic censorship.

Speaker 1

社交媒体的结构本身正在改变词汇的来源、流行方式以及传播速度。

The very structure of social media is changing where words come from, how words get popular, and how quickly those words spread.

Speaker 0

伸出你的

Sticking out your

Speaker 1

我想在座有些人可能对这首歌很熟悉。

I believe some of you might be familiar with this song.

Speaker 0

你真是skibidi。

You're so skibidi.

Speaker 0

你真是Fanum tax。

You're so Fanum tax.

Speaker 0

我只想成为你的

I just wanna be your

Speaker 1

为Rizzler伸出你的gyat。

Sticking out your gyat for the Rizzler.

Speaker 1

你真是skibidi。

You're so skibidi.

Speaker 1

你真是Fanum tax。

You're so Fanum tax.

Speaker 1

我只想成为你的大佬。

I just wanna be your sigma.

Speaker 1

快给我过来。

Freaking come here.

Speaker 1

把你的俄亥俄给我。

Give me your Ohio.

Speaker 1

给不了解情况的人解释一下,这是去年爆红的Rizzler神曲歌词,一个病毒式传播的梗。

For those of you out of the loop, these are the lyrics to the Rizzler song, a meme that went massively viral last year.

Speaker 1

里面全是当下初中生流行的俚语,比如rizz、gyat和skibidi,这首歌对这些词的出圈功不可没。

It's full of current middle school slang words like rizz, gyat, and skibidi, and was instrumental in popularizing those words to a broader audience.

Speaker 1

社交媒体算法奖励重复内容。

Social media algorithms reward repetition.

Speaker 1

如果一首歌既搞笑又洗脑,用户互动率高,算法就会把它推给更多人,因为已证明能提升应用活跃度。

If a song is funny or catchy and people interact with it, the algorithm will then push that song to more people since it's proven to drive engagement on the app.

Speaker 1

梗或流行语也是同理,因为像话题标签这类趋势数据也会推送给曾对类似内容感兴趣的用户。

The same is true of memes or words in general since trending metadata like hashtags will also be pushed to people who've previously shown interest in similar content.
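
The reward loop Adam describes — engagement earns reach, and reach earns more engagement — can be shown with a few lines of arithmetic. The starting numbers and the boost factor are invented; only the compounding matters:

```python
# Minimal sketch of the engagement feedback loop: each round, a video's
# reach grows in proportion to its current share of total engagement,
# so an early lead compounds. All starting values are invented.

def run_feedback_loop(items, rounds=5, boost=1.5):
    for _ in range(rounds):
        total = sum(items.values())
        for name in items:
            share = items[name] / total       # the platform's allocation
            items[name] *= 1 + boost * share  # reach begets more reach
    return items

# Two videos start almost even; a 10% head start compounds every round.
reach = run_feedback_loop({"rizzler-song": 110.0, "other-video": 100.0})
print(reach["rizzler-song"] / reach["other-video"])  # ratio grows past 1.1
```

The design point is that nothing in the loop measures quality — the allocation rule only measures prior engagement, which is why a funny, catchy song snowballs.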

Speaker 1

创作者们对此心知肚明。

Creators are very aware of this.

Speaker 1

我们会主动蹭热门音频或标签来提升视频流量。

And we actively use trending audios or hashtags to make our videos perform better.

Speaker 1

以《Rizzler》这首歌为例,随后我们看到大量视频使用rizz、gyat和skibidi这些词,因为创作者知道这类视频数据会好。

In the wake of the Rizzler song, for example, we saw an explosion of people making videos with the words rizz, gyat, and skibidi because they knew those videos would do well.

Speaker 1

结果就是这些词汇迅速传播开来。

And as a result, the words spread.

Speaker 1

语言向来就有点像病毒。

Language has always been a little bit like a virus.

Speaker 1

词汇通过社交网络在不同宿主间传播,在感染不同人群的过程中不断复制变异。

Words are transmitted from one host to another, reproducing and changing as they infect different people along social networks.

Speaker 1

但如今社交媒体真正的病毒式传播特性,让这个过程从开始到结束都在加速。

But now the literally viral nature of social media is accelerating this process from start to finish.
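
The virus comparison can be made literal with a toy logistic-spread model: raising a single transmission parameter (a stand-in for algorithmic promotion) compresses how long a word takes to reach most speakers. The rates and population size are invented for illustration:

```python
# Toy model of a word spreading like a virus through a population.
# A higher transmission rate (standing in for algorithmic promotion)
# shortens the time to majority adoption. All parameters are invented.

def days_to_majority(transmission_rate, population=1000):
    """Days until more than half the population uses the word."""
    users = 1.0
    days = 0
    while users <= population / 2:
        # logistic growth: spread slows as fewer non-users remain
        users += transmission_rate * users * (1 - users / population)
        days += 1
    return days

word_of_mouth = days_to_majority(transmission_rate=0.3)  # pre-platform pace
algorithmic = days_to_majority(transmission_rate=1.2)    # promoted pace
print(word_of_mouth, algorithmic)  # the promoted word saturates far sooner
```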

Speaker 1

短短一年内,像rizz这样的词就能从完全默默无闻变成《牛津英语词典》年度词汇。

In the span of just a year, a word like rizz can go from complete obscurity to becoming the Oxford English Dictionary word of the year.

Speaker 1

算法是罪魁祸首,而网红们则是帮凶。

And the algorithm is the culprit, but influencers are the accomplices.

Speaker 1

我们使出浑身解数来娱乐观众,因为这样能提升视频数据,让我们得以谋生。

We use whatever tricks we can to keep you entertained because that makes our videos do better, which helps us earn a living.

Speaker 1

这意味着我们最终往往创造并传播着对系统有利的词汇。

This means that we often end up creating and spreading words that help the system.

Speaker 2

显然你很懂算法机制,才能用这种方式让更多人关注语言起源并为他们讲解。

Clearly, you understand how the algorithm works in that you have been able to leverage it in a way that gets more people interested in the roots of language and explaining it to them.

Speaker 2

你是怎么做到的?

How did you do that?

Speaker 1

确实如此。

Absolutely.

Speaker 1

某种程度上必须与魔鬼做笔交易。

There's sort of a deal with the devil that has to happen.

Speaker 1

对吧?

Right?

Speaker 1

作为创作者你必须接受,是的,我正在迎合算法所奖励的内容,而算法会奖励趋势性的东西。

You have to accept as a creator, yes, I'm playing into what the algorithm is rewarding, and they're going to reward, for example, trends.

Speaker 1

所以我开始讨论流行语,并分析这些流行语的来源。

So I started talking about trending language, also analyzing the trending language where it came from.

Speaker 1

你知道,'skibidi'这个词爆红对我的生意有好处,因为现在我可以讨论'skibidi'的词源,由于人们对这一现象的社会性着迷,我的视频会比分析'椅子'或'桌子'这类词的视频传播得更广。

It, you know, it's good for my business that the word skibidi goes viral because now I get to talk about the etymology of skibidi, and because people are socially fascinated by this phenomenon, my video is gonna go more viral than an analysis of the word chair or table or whatever.

Speaker 1

那些内容对人们来说不太有趣。

That's less interesting to people.

Speaker 2

我们暂停一下。

Let's pause.

Speaker 2

脚注:skibidi。

Footnote: skibidi.

Speaker 2

它是什么意思?

What's it mean?

Speaker 2

它从哪来的?

Where'd it come from?

Speaker 2

为那些像我一样不了解的人解释下。

For those who don't know, like me.

Speaker 1

我觉得至少现在大多数人都曾在新闻里听说过'skibidi'这个词。

I feel like at this point, at least most people have heard of skibidi somewhere in the news.

Speaker 1

它刚被收录进剑桥词典。

It was just added to the Cambridge dictionary.

Speaker 1

但'skibidi'是个无意义的词。

But skibidi is a nonsense word.

Speaker 1

它可以完全随机地用作不表示任何意义的感叹词,但也可能是个强调词。

It can be used completely randomly as an interjection that doesn't mean anything, but it could be an intensifier.

Speaker 1

所以你可以说'skibidi rizz'就像非常强烈的'rizz',可能是好也可能是坏的方向。

So you could say skibidi rizz is like very intense rizz in either a good or a bad direction.

Speaker 2

意思是魅力。

Meaning charisma.

Speaker 2

就是,对。

Just Yeah.

Speaker 2

你懂的。

You know.

Speaker 2

没错。

Right.

Speaker 2

我们……这是NPR。

We... it's NPR.

Speaker 2

我们的听众年龄层偏大。

We have an older audience.

Speaker 1

不。

No.

Speaker 1

当然。

Course.

Speaker 1

嗯。

Yeah.

Speaker 1

澄清这一点非常重要。

Very important to clarify.

Speaker 1

嗯哼。

Uh-huh.

Speaker 1

对。

Yeah.

Speaker 1

这些都被称为'脑残词汇',是一种网络迷因美学,据说这些词会对大脑产生不良影响。

So these are all kind of called brain rot words, and it's this meme aesthetic of supposedly words that are bad for your brain.

Speaker 1

作为语言学家,我必须指出这些词汇实际上对心智完全无害。

As a linguist, I have to say there is nothing actually about these words that is mentally deleterious at all.

Speaker 1

但这确实形成了一种无意义重复算法俚语的迷因美学。

But it is sort of the catch all meme aesthetic of nonsensical repetition of algorithmic slang.

Speaker 1

而'skibidi'源自那个YouTube Shorts系列。

And skibidi comes out of that YouTube Shorts series.

Speaker 1

'rizz'来自Twitch主播Kai Cenat。

Rizz comes from this Twitch streamer Kai Cenat.

Speaker 1

但两者都在算法环境中扮演着荒诞的角色。

But both are kind of taking on this role of absurdity in the algorithmic kind of milieu.

Speaker 1

更准确地说,它的荒诞恰恰在于它是算法推荐给我们的。

It's more that it's absurd because it's being recommended to us by the algorithm.

Speaker 1

嗯。

Mhmm.

Speaker 1

然后初中生们开始说这个词,因为它听起来很有趣。

And then middle schoolers start saying it because it sounds funny.

Speaker 2

然后当年长的人开始说这个词时,他们就不说了。

And then when older people start saying it, they stop saying it.

Speaker 1

没错。

Exactly.

Speaker 1

我觉得我们已经完全过了'skibidi'这个阶段了。

I think we're well past skibidi.

Speaker 1

我认为'skibidi'现在已是强弩之末。

I think skibidi is on its last throes right now.

Speaker 1

它可能被收录在词典里,但在初中生中越来越不流行了。

It may be in the dictionary, but it's getting less and less popular in the middle schools.

Speaker 2

在初中生中。

In the middle schools.

Speaker 2

是啊。

Yeah.

Speaker 2

我感觉这些新词冒出来时,我就...我我实在搞不懂。

I feel like these new words come up, and I'm like, I I can't I don't know.

Speaker 2

等我终于弄明白时,他们早就换新词了。

By the time I understand it, they'll have moved on.

Speaker 2

那还有什么意义呢?

So what's the point?

Speaker 2

不过这样可能也好,反正没人想听我说‘skibbidi’——虽然他们刚刚就听到了。

But that's probably smart because nobody wants to hear me saying skibbidi anyway, except they just did.

Speaker 1

嗯,你知道,那些真正突出的潮流,我认为它们的生命周期更短,因为你提到了那个原因。

Well, you know, some of the trends that really stick out, I think they have shorter lifespans because it is that thing that you mentioned.

Speaker 1

一旦你奶奶开始说'skibidi',你自己就不想再说这个词了。

Once your grandmother starts saying skibidi, you'd no longer wanna say skibidi yourself.

Speaker 2

嘿。

Hey.

Speaker 2

我还没当奶奶呢。

I'm not a grandma yet.

Speaker 1

我没说你是奶奶。

I wasn't calling you a grandmother.

Speaker 1

好吧。

Okay.

Speaker 1

但潮流消退确实有这个特点——当它开始被圈外人使用时,它最初的价值在于它是圈内人的专属。

But there is that aspect to how trends die out that once it starts being used by an out group, it initially had value because it was in the in group.

Speaker 1

现在它不再具有那种价值,所以作为笑话也不再有任何吸引力。

And now it no longer has that value, so it no longer has any cachet as a joke.

Speaker 1

嗯哼。

Mhmm.

Speaker 1

然后我们就继续前进。

And then we move on.

Speaker 1

而且总有新的笑话在不断涌现。

And there's constantly new jokes percolating.

Speaker 1

我还认为关注那些不太明显或一开始甚至不像笑话的笑话很重要。

I also think it's important to pay attention to the jokes that don't seem as obvious or don't even appear as jokes in the first place.

Speaker 1

我认为这些词汇将更持久地改变我们的语言。

I think these are the words that are actually going to change our language more permanently.

Speaker 1

那些通常来自少数群体或边缘现象的词汇,最终反而会对我们的语言产生影响。

Words that are percolating from usually minority groups or from more fringe phenomena, those are the ones that actually end up influencing our language.

Speaker 1

所以我要谈谈'skibbidi'和'low key'之间的区别。

So I I talk about the distinction between skibbidi and low key.

Speaker 1

'low key'是个副词。

So low key is this adverb.

Speaker 1

这个词自19世纪以来就存在,意思是'低调的'之类的。

It's been around since the eighteen hundreds as a word meaning like muted or something.

Speaker 1

嗯。

Mhmm.

Speaker 1

但最近它作为副词用法流行起来,这源自非裔美国人英语,人们会说'那个派对低调地好'。

But it was recently popularized in an adverbial sense, and this is from African American English, where people would say, oh, that party was low key good.

Speaker 2

意思是像'放松'那样吗?

Meaning like chill?

Speaker 1

对。

Yeah.

Speaker 1

就是'low key'。

Well, low key.

Speaker 1

没错。

Yeah.

Speaker 1

它有点像是'漫不经心'的意思。

It kinda means like, you know, nonchalant.

Speaker 1

我不想过多提及,想保持低调。

I don't wanna say it too much being more low key about it.

Speaker 1

对吧?

Right?

Speaker 1

它是被消音的。

It's it's muted.

Speaker 1

是一种消音状态。

It's mutedness.

Speaker 1

是有所保留的。

It's reservedness.

Speaker 2

还有,你那种漫不经心的用法,跟我理解的漫不经心不一样。

Also, the way you just use nonchalant, not how I use nonchalant.

Speaker 1

哦,确实。

Oh, yeah.

Speaker 1

‘漫不经心’这个词的语义也在演变。

The word nonchalant is evolving as well.

Speaker 1

自2023年左右就有网络梗在讨论人们怎么使用这个词。

There's been an Internet meme since like 2023 about how people are using this.

Speaker 1

所以我觉得当译者自己也深陷算法推送时,工作会变得很困难。

So I guess it's tough being a translator when you're also deep in the kind of algorithm yourself.

Speaker 1

不过确实。

But yeah.

Speaker 1

所以‘低调’这个词,以及我们看到许多其他词汇的定义变化都在发生,只是没有‘skibbidi’那么引人注目。

So low key, I think, and these changing definitions that we see with a lot of other words are also happening, but don't stick out as much as skibbidi.

Speaker 1

我认为正是那些不显眼的词汇,将在数百年后对我们的语言产生更深远的影响。

And I think it's those words that don't stick out that are going to have a greater impact on our language hundreds of years from now.

Speaker 2

所以你刚才说的这些,其实解释了我15岁女儿的很多行为。

So what you're just saying actually is explaining a lot of my 15 year old daughter's behavior.

Speaker 2

我是说,青少年嘛,从来都是这样。

I mean, teenagers, whatever, it was ever thus.

Speaker 2

每一代人都有自己独特的叛逆方式。

Every generation has its own way of rebelling.

Speaker 2

但冷漠态度绝对是个典型特征。

But nonchalance is definitely a thing.

Speaker 1

哦,完全同意。

Oh, absolutely.

Speaker 1

关于15岁青少年我还想强调一个重要观点。

I also wanna make an important point about the 15 year olds.

Speaker 1

我认为这些孩子是语言发展走向的风向标。

I think these are these kids are the bellwethers of where language is gonna go.

Speaker 1

他们才是语言行为最具可塑性的人群。

They are the ones that are actually most flexible with their linguistic behaviors.

Speaker 1

小学生还不太能把握社会潮流,也不那么急于塑造自我身份。

The elementary schoolers aren't quite as tapped into social trends yet or as searching to create identities.

Speaker 1

而年长者更固守语言应该怎样或词汇该如何发音的观念。

And older people are more entrenched in our ideas of how language should be or what words are supposed to sound like.

Speaker 1

初中生则正处于身份认同的形成阶段。

And middle schoolers are forming their identity.

Speaker 1

他们试图将自己与成年人和前几代人区分开来。

They're trying to differentiate themselves from adults and from previous generations.

Speaker 1

他们正试图为自己构建一种集体认同。

They're trying to build a collective kind of identity for themselves.

Speaker 1

因此他们在语言接纳上更为灵活。

And so they are more flexible with adopting language.

Speaker 1

所以如果你想真正了解英语的现状,就该多关注15岁的孩子们。

So you should be paying attention to your 15 year old if you wanna really understand what the state of the English language is right now.

Speaker 2

我猜有人会说,是啊。

So I guess someone would say like, yeah.

Speaker 2

但你知道,在我的学校,我记得有些特定的词。

But in my, you know, at my school, I remember there was like certain words.

Speaker 2

这有点恶心——四年级时有个小孩会用舌下腺体朝人吐口水,他管这叫‘舌射’。

This is kinda gross, but in fourth grade, there was a kid who would spit at people but only using his glands under his tongue, and he called it a gleek.

Speaker 2

超级怪。

Super weird.

Speaker 2

但这个行为在四年级突然流行起来。

But then it just took off in the fourth grade.

Speaker 2

男生们会朝喜欢的女生‘舌射’

It was like boys would gleek the girl

Speaker 1

那个

that

Speaker 2

他们喜欢的。

they liked.

Speaker 2

恶心。

Gross.

Speaker 2

但显然,这也是个好词,因为它多年来一直萦绕在我心头。

But also, clearly, good word because it stuck with me for all these years.

Speaker 2

但但情况一直如此。

But but that's always been the case.

Speaker 2

比如,它与社交群体创造自己的语言,或者印刷机、广播这些媒介有什么不同?

Like, how is it different than, you know, social groups coming up with their own language or for that matter, like, the printing press, the the radio?

Speaker 1

很高兴我们提到了印刷机和广播,因为媒介即信息。

I'm so glad we're bringing up the printing press and the radio because the medium is the message.

Speaker 1

我们的交流方式将塑造趋势传播的路径。

The way we're talking is going to shape the way that trends diffuse.

Speaker 1

我并不是说人类用语言交流或创造社会趋势是新鲜事。

And I'm not going to say, oh, it's new that humans are using language to communicate or come up with social trends.

Speaker 1

不是的。

No.

Speaker 1

这本质上就是人类的活动。

That is a fundamentally human activity.

Speaker 1

我们会一直这样做,并且总会找到新的方式。

And we will always do it, and we will always find new ways to do it.

Speaker 1

但现在算法正在调节这种互动,就像教室对'gleek'这个词的影响一样。

But the algorithm is now mediating that interaction in the same way that the classroom did with the word gleek.

Speaker 1

语言的整个故事是由我们使用的交流媒介所讲述的。

The story of language as a whole is told by what mediums we're using to communicate.

Speaker 1

在我们拥有羊皮纸、莎草纸等一切之前,我们更多依赖口头交流。

Before we had parchment and papyrus and everything, we relied a lot more on oral communication.

Speaker 1

故事曾以押韵和格律传颂,后来我们转向羊皮纸和纸张,当时人们对此深感忧虑。

Stories would be told through rhyme and meter, and then we moved to parchment and paper, and people were really concerned about that at the time.

Speaker 1

我认为柏拉图担心的是人们不再大量记忆会导致脑力衰退。

I think Plato was concerned about brain rot from people not memorizing things as much.

Speaker 1

随后我们开始用纸张以不同方式分割故事。

Then we start segmenting our stories differently with paper.

Speaker 1

对吧?

Right?

Speaker 1

我们可以划分章节。

We can we can have chapters.

Speaker 1

我们能在词与词之间留出空格,而不像石碑上所有文字都挤在一起。

We can actually put spaces between words as opposed to stone tablets where all the words were smooshed together.

Speaker 1

接着我们开始用印刷机大规模生产。

And then we we start mass producing things with the printing press.

Speaker 1

印刷机使得方言得以复制传播,同时也巩固了传统话语权掌控者的地位。

And the printing press now allows for vernacular language to reproduce and kind of solidify also at the same time the traditional gatekeepers of who gets to control how we talk.

Speaker 1

因此有段时间,印刷机是主导的传播方式。

And so for a while, the printing press is the dominant mode of communication.

Speaker 1

大量标准语言在各种语言中被复制,而不仅限于教会拉丁语。

You have a lot of standard language being replicated in all these different languages rather than just church Latin.

Speaker 1

最终互联网为非正式口语的书面复制创造了机会。

And the Internet finally creates this opportunity for the written replication of informal speech.

Speaker 1

人们开始使用俚语。

You have people using slang words.

Speaker 1

人们开始用小写字母书写并使用缩写。

You have people writing in lower case and using abbreviations.

Speaker 1

这某种程度上是一场关于谁有权发言、谁有权决定语言形态的革命。

And this is sort of this revolution in who can talk and who can vote on what language is supposed to be like.

Speaker 1

这为我认为的新转折点奠定了基础。

And that sort of lays the foundation for what I think is the new inflection point.

Speaker 0

M to the b, some to the b, some

And m to the b, some to the b, some

Speaker 1

算法掌控着内容的分发。

Algorithms govern the distribution of content.

Speaker 1

我还不知道要在这里发布什么类型的视频。

I don't know what type of videos I'm gonna post on here yet.

Speaker 1

好的。

Okay.

Speaker 2

但如果你做的一切都对,为什么你的内容没有爆火?

But why is your content not blowing up if you're doing everything right?

Speaker 2

但让我来指点你

But let me put you on

Speaker 1

除非符合算法机制,否则内容不会病毒式传播。

Things don't go viral unless they work for the algorithm.

Speaker 1

这意味着我们使用的语言某种程度上也始终在为算法服务。

And what that means is that the language we're using is always sort of performing for the algorithm as well.

Speaker 1

而这种媒介始终存在,不断塑造着思想在网络上的传播方式。

And this is this medium that is always there, that is always shaping how ideas diffuse online.

Speaker 1

这意味着任何传播开来的思想要么对算法有利,要么就是为算法而生的。

And that means any idea that diffuses is either good for the algorithm or it's created for the algorithm.

Speaker 1

这就导致了像'非活体'这类词汇的使用。

That means using words like unalive.

Speaker 1

这就催生了诸如'Skibbidi'这类趋势的跟风。

That means playing into trends like skibbidi.

Speaker 1

这意味着许多创作者被激励去创造新词或使用流行语,因为算法将这些语言视为元数据。

That means a lot of creators are just incentivized to come up with new words or use trendy language because the algorithm sees that language as metadata.

Speaker 1

过去,元数据更像是标签这类能提供内容信息的东西。

In the past, metadata was something like a hashtag, something that gives you information about the content.

Speaker 1

现在算法能洞察一切。

Now the algorithm sees everything.

Speaker 1

自然语言处理、计算机视觉组件等所有子算法都在分析这些数据,从而生成视频内容的数字化表征。

The natural language processing, computer vision components, all sub algorithms analyze that, and they create this numerical representation of what a video is.

Speaker 1

嗯。

Mhmm.

Speaker 1

它们早已洞悉视频内容。

And they already know what the video is about.

Speaker 1

而词汇正是其中的组成部分。

And but the word is part of that.

Speaker 1

因此我们被激励使用那些有助于视频传播的词汇。

So we're incentivized to use words that aid in the distribution of our own video.

Speaker 1

只是

Just

Speaker 2

跟进一下,你知道的,就是马歇尔·麦克卢汉那个经典理论——媒介即讯息。

to follow-up on, you know, the classic Marshall McLuhan idea of the medium is the message.

Speaker 2

算法才是真正塑造语言的力量。

The algorithm is the one that's shaping language.

Speaker 2

但需要明确的是,算法是由人编写的,这些人为想要变现的科技巨头工作。

But to be clear, the algorithm is programmed by people, people who work for very large tech companies who want to monetize.

Speaker 1

完全正确。

Absolutely.

Speaker 1

他们建立的这些机制会被后来的网红们效仿,比如强调需要互动、需要留存率(即观众观看视频的时长)、需要点赞。

And they create those structures that are then replicated by other influencers down the line that, oh, you need engagement, you need retention, which is how long people watch a video, you need likes.

Speaker 1

所以现在作为网红,我制作视频时就要想方设法让观众看完,还要获得点赞。

And so now, as a influencer, I'm trying to make all my videos to get people to watch the whole time, and I wanna get likes.

Speaker 1

否则视频就不会被推荐,我只能顺应这套规则。

Otherwise, the video is not gonna be distributed, so I have to play into that.

Speaker 1

我还想指出,有些社会现象并非完全由工程师编程决定。

I also want to note that there's emergent social phenomena that aren't programmed just by these engineers.

Speaker 1

并不是嗯...

It's not Mhmm.

Speaker 1

纯粹自上而下的过程。

Purely like this top down process.

Speaker 1

每种媒介都伴随着相应的社会期待。

There is social expectations that come with each medium.

Speaker 1

这里是NPR新闻华盛顿分台现场报道。

Live from NPR News in Washington.

Speaker 1

我是戴夫·马丁利。

I'm Dave Mattingly.

Speaker 1

在NPR,你会听到一种经过专业训练的播音腔调。

On NPR, you have, like, an NPR voice that's been studied.

Speaker 2

特朗普总统正展开他上任后的首次正式出访

President Trump is making his first official trip of

Speaker 1

显然,我并不擅长适应这种风格。

Clearly, I am not good at adapting into this.

Speaker 2

不过我觉得他们多年来一直想摆脱这种腔调。

Oh, I think they've been trying to get rid of that for years, though.

Speaker 1

但与此同时

But at the

Speaker 2

我就在这里。

same time Here I am.

Speaker 1

嗯哼。

Uh-huh.

Speaker 1

你现在用的可能是更符合广播要求的发声方式。

You're speaking in a maybe a more radio friendly voice.

Speaker 1

我正用着算法更易识别的发音方式,因为这是我们进行这类交流时学会的说话方式。

I'm speaking in a more algorithm friendly voice because that's what we've we've learned to talk when we're kind of communicating like this.

Speaker 2

等等。

Wait.

Speaker 2

有人可能会问,你说的算法友好型声音是什么意思?

Somebody might be like, what do you mean an algorithm friendly voice?

Speaker 2

解释一下。

Define that.

Speaker 1

实际上网红口音有很多种类型,但你可能最熟悉那种著名的生活方式类网红口音。

There's actually a bunch of different types of influencer accents, but you're probably familiar with the most famous sort of lifestyle influencer accent.

Speaker 1

就是那种'嗨,大家好'的调调。

The hey, guys.

Speaker 1

欢迎收听NPR。

Welcome to NPR.

Speaker 1

今天我们要讨论算法。

Today, we're gonna be talking about algorithms.

Speaker 1

带着那种上扬的尾音。

It has that uptalk.

Speaker 1

语调总是往上扬。

It has the rising tone.

Speaker 1

呃,你可能没

Well, could you're not

Speaker 2

收到任何人的回应。

getting any response from anyone.

Speaker 2

所以你就会

So you're like

Speaker 1

呃,它

Well, it

Speaker 2

试图与自己对话。

trying to have a conversation with yourself.

Speaker 1

确实如此。

Exactly.

Speaker 1

这是一种单向交流,虽然对塑造它很重要,但远不止于此。

This is a one-sided communication that is an important part of shaping it, but that's not all.

Speaker 1

要知道,如果我只是一直面无表情地对着镜头说话,这种视频是不会爆红的,因为人们对说话方式是有期待的。

You know, I could just continuously monotone deadpan into the camera and that wouldn't go viral because there's an expectation of how people are supposed to talk.

Speaker 1

这类生活方式博主的腔调源自早年的YouTube口音,而YouTube口音又源自'山谷女孩'口音。

And this sort of lifestyle influencer accent came out of the previous YouTube accent, which came out of the Valley Girl accent.

Speaker 1

所以我们说话方式里存在社会声望的层级结构,人们对生活方式博主应该怎么说话是有既定期待的。

So there's layers of social prestige in how we talk, and there's an expectation of what the lifestyle influencer is supposed to sound like.

Speaker 2

等等。

Now, wait.

Speaker 2

等一下。

Wait.

Speaker 2

稍等。

Wait.

Speaker 2

你是说人类大脑会被这种声音吸引,从而获得更多收听量,所以算法就持续推送它?

Are you saying that the human brain is attracted to those and therefore it gets more listens and therefore the algorithm continues to push it?

Speaker 2

还是说算法能识别这些语调特征,然后更大力地向人们推送?

Or are you saying that the algorithm is hearing those intonations and pushing it harder on people?

Speaker 1

不是。

No.

Speaker 1

我认为算法没有捕捉到口音语调。

I don't think the algorithm is picking up on accent intonations.

Speaker 1

我认为人类确实存在对社会声望的偏见。

I think there is actual human bias toward social prestige.

Speaker 1

我们通过自己的视角评估各种社交情境,比如判断某件事是否值得倾听。

We evaluate any kind of social situation through our lens of like, is this something worth listening to?

Speaker 1

然后我们的轻微滑动习惯会通过算法推荐被复制传播。

And then, like, our slight scroll patterns get replicated through what the algorithm recommends.

Speaker 1

算法甚至不需要分析这个人是否在用生活方式影响者口音说话。

The algorithm doesn't even need to analyze whether or not this person is speaking in a lifestyle influencer accent.

Speaker 1

它只是表明:'哦,这个观众快速滑走了,因为他们认为说话者没有使用社会认可的口音,这意味着我要减少推荐这个视频的用户量'。

It's merely telling that, oh, this this viewer scrolled away quickly because this person wasn't speaking in a socially approved accent in their minds, and that means I'm gonna recommend this video to fewer users.

Speaker 1

因此某种程度上形成了对社会认可口音的筛选复制机制。

And so there's sort of a selection that happens for socially desirable accents to replicate.

Speaker 1

我还想澄清这仅是其中一种最刻板印象化的口音,因为女性在语言表达上总是受到更严苛的审视。

And I also wanna clarify that's only one type of accent that's the most stereotyped because, you know, women are scrutinized with language.

Speaker 1

我用的是自称为'教育类影响者口音'的说话方式。

I speak in a what I call an educational influencer accent.

Speaker 1

我会说得非常快。

I'll talk very quickly.

Speaker 1

我会加重更多词语的发音来吸引你继续观看我的视频。

I'll stress more words to keep you watching my video.

Speaker 2

不过我感觉你一直都是个语速很快的人。

I get the sense though that you were a fast talker always.

Speaker 2

不是?

No?

Speaker 1

我确实语速很快,而且在这里我确实暴露了自己。

I definitely am a fast talker, and I'm definitely giving away myself here.

Speaker 1

但更重要的是,我在视频中追求完美,如果语调不对或措辞不当,或者语速不够快,我会重拍某些片段。

But also, there's a perfection that I strive for in the videos where I'll retake certain clips if it seems like I don't have the right intonation on the right word, or if I stumble on my words, if I don't talk fast enough.

Speaker 1

所以现在对我来说更像是一种更自然的说话方式,如果在更大的群体对话中,我可能会说得不一样,那种场合不需要这样,或者更随意,更含蓄。

And so what now is still like a kind of more natural way of speaking for me, and I, again, would probably speak differently if I was in a larger group conversation where you don't need to or it's like more laid back, you know, it's more demure.

Speaker 2

你真是太淡定了,亚当。

You're so nonchalant, Adam.

Speaker 2

天啊。

My god.

Speaker 1

我也不想一直'张扬'(shalant)。

I don't wanna be shalant all the time.

Speaker 1

有时该'张扬'(shalant),有时该'淡定'(nonchalant)。

There's a time and a place to be shalant, and there's a time to be nonchalant.

Speaker 2

等等。

Wait.

Speaker 2

稍等一下。

There's a hold on.

Speaker 2

'张扬'(shalant)和'淡定'(nonchalant)还有区别?

There's a difference between shalant and nonchalant?

Speaker 1

嗯,淡定是指保持克制。

Well, nonchalant is being reserved.

Speaker 1

然后

And then

Speaker 2

但'shalant'这个词本身并不存在

But shalant is not its own word.

Speaker 2

什么?

What?

Speaker 1

但它确实存在

But it is.

Speaker 1

现在它存在了

It is now.

Speaker 1

这是一个通过社交媒体流行起来的俚语

It's a slang word that was popularized through, like, social media.

Speaker 1

意思是?

Meaning?

Speaker 1

人们用它表示'nonchalant'(淡定)的反义。

And people talk meaning the opposite of nonchalant.

Speaker 1

意思是,你对某事表现得很明显

Meaning, you are overt about something.

Speaker 2

就像'兴奋'那样?

Like jazzed?

Speaker 1

Yeah.

Speaker 1

你对某事感到兴奋

You're excited about something.

Speaker 1

我对语言可是很'chalant'(张扬)的,你懂的。

I'm chalant about language, you know.

Speaker 2

当我说兴奋时,你是那种'哇,87岁吗'之类的反应吗?

When I say jazzed, are you like, woah, 87 or what?

Speaker 1

我是说,我可能会直接用'shalant'这个词。

I mean, I probably would just use the word shalant.

Speaker 1

没错。

Exactly.

Speaker 2

稍后回来,我们将讨论亚当对社交媒体用户(尤其是年轻用户)的关切与乐观看法。

When we come back, Adam's concern and optimism for social media users, younger ones in particular.

Speaker 2

另外,亚当和我要火了。

Plus, Adam and I go viral.

Speaker 2

也许吧。

Maybe.

Speaker 1

哦,你居然不知道

Oh, you didn't know about the

Speaker 2

我们该怎么开始?

How do we begin?

Speaker 2

哦,抱歉。

Oh, sorry.

Speaker 2

糟透了。

Terrible.

Speaker 2

好吧。

Okay.

Speaker 2

本期节目将探讨创作者经济及其对语言的影响。

Today on the show, the creator economy and its effects on language.

Speaker 2

我是玛努什·扎莫罗迪,您正在收听的是NPR出品的TED广播时间。

I'm Manoush Zamorodi, and you're listening to the TED Radio Hour from NPR.

Speaker 2

请继续关注我们。

Stay with us.

Speaker 2

这里是NPR出品的TED广播时间。

It's the TED Radio Hour from NPR.

Speaker 2

我是玛努什·扎莫罗迪。

I'm Manoush Zamorodi.

Speaker 2

我们一直在与亚当·亚历克西奇讨论创作者经济。

We've been talking to Adam Aleksic about the creator economy.

Speaker 2

亚当自称是社交媒体语言学家。

Adam calls himself a social media linguist.

Speaker 2

他还是《算法语言:社交媒体如何重塑语言未来》(Algospeak)一书的作者。

He's also the author of Algospeak: How Social Media Is Transforming the Future of Language.

Speaker 2

他的职业生涯致力于阐释词汇使用中体现的文化与代际鸿沟。

And he has built his career explaining the cultural and generational divide on how we use words.

Speaker 1

在19世纪时我们还没有'社交世代'这个概念。

We didn't have the idea of a social generation in the eighteen hundreds.

Speaker 1

这算是新鲜事物。

This is like a new thing.

Speaker 1

当时人们正从第一次世界大战归来。

People are coming back from World War one.

Speaker 1

我们称他们为迷惘的一代。

We called them the lost generation.

Speaker 1

哦,那些从二战归来的人们。

Oh, the people coming back from World War two.

Speaker 1

那是大兵一代。

That's the GI generation.

Speaker 1

接着我们有了婴儿潮一代。

And we have baby boomers.

Speaker 1

然后我们其实不知道该怎么称呼你们,就是之后的那一代人。

And then we actually don't know what to call you, the the next generation after that.

Speaker 1

我们就想,好吧,也许他们不想被定义,所以就叫他们X世代吧。

We're like, well, I guess they don't have a desire to be defined, so we'll call them Gen X.

Speaker 1

然后我们就继续沿用世代存在的这个概念。

And then we just keep running with the idea that generations exist.

Speaker 1

再后来就有了千禧一代,对,因为是2000年嘛。

And then we have like, millennial, yeah, because it's the year 2000.

Speaker 1

然后呢,我们也不知道该怎么称呼Z世代。

And then, well, we don't know what to call Gen Z either.

Speaker 1

所以我们就随便用X世代之后的第二个字母来命名。

So we'll just do whatever was two after Gen X.

Speaker 1

现在字母都用完了,所以就叫阿尔法世代吧。

And now, we just ran out of the alphabet, so we'll call it Gen Alpha.

Speaker 1

但我认为网红们助长了这些标签,因为人类社交中存在着对被贴标签的迷恋。

But I think influencers play into these labels because there is a human social fascination with being labeled.

Speaker 1

我们喜欢MBTI、星座这类事物,也喜欢归属于某个群体和世代的概念。

We love things like MBTIs and zodiacs and the idea of belonging to a bucket and with a generation.

Speaker 2

还有简写。

And shorthand.

Speaker 2

对吧?

Right?

Speaker 1

确实如此。

It is.

Speaker 1

这就是简写。

It is shorthand.

Speaker 1

显然这过于简化了,我想人们也意识到了这点。

Obviously, it's reductive, and I think people realize that.

Speaker 1

但我们比以往更频繁地使用世代标签,因为它确实有效。

But we're using more generational language than we were before because it works.

Speaker 1

比如用'Z世代手指心形'和'千禧一代心形'来区分就很有效。

It works to talk about the Gen Z finger heart versus the millennial heart.

Speaker 1

或者在X世代之前的人拍短视频时,他们会先做个小小的吸气动作。

Or before Gen X people will start a short form video, they'll they'll do, like, a little breath in.

Speaker 1

嗯哼。

Uh-huh.

Speaker 1

我们这样把自己归类实在太奇怪了,而这些标签实际上成了算法趋势的诱饵,用来宣称'哦这是Z世代的特征'。

It's it's so strange to me that we put ourselves in buckets like that, and they actually serve as algorithmic trend bait to say that you are, oh, this is a Gen z thing.

Speaker 1

这是...嗯哼。

This is a Uh-huh.

Speaker 1

Z世代的凝视表情最近在网上疯传。

The Gen z stare went viral recently.

Speaker 1

我做过一个关于婴儿潮一代用省略号的视频。

I did a video on the boomer ellipses.

Speaker 1

他们的消息中间总会出现随意的点点点。

There's always a random dot dot dot in the middle of their messages.

Speaker 2

哦,我就这样。

Oh, I do that.

Speaker 1

我故意那么说是因为知道会火。

I phrased it that way because I knew it'd go viral.

Speaker 1

比如,为什么婴儿潮一代要在文字里加句点?年轻人可不会这么做。

Like, why do boomers put dots in their text, which is not a normal thing that young people do?

Speaker 1

所以分开表达想法其实更高效

So it's simply more efficient to separate ideas

Speaker 2

但这就像你说话时声音渐弱。

But it's like trailing off in your language.

Speaker 2

是啊。

Yeah.

Speaker 1

对啊。

Yeah.

Speaker 1

只是不同世代有不同的短信礼仪。

It's just it there's different texting etiquette among different generations.

Speaker 1

再说用'世代'这个词本身,其实所有语言都在把事物分类。

And again, to use that word generation already feels like I mean, all of language is putting things into categories.

Speaker 1

但我认为某些分类方式比其他方式更具简化性。

But I think some categories are more reductive than others.

Speaker 1

例如,后缀'core'最近在Z世代俚语中非常流行,用来描述特定的美学风格,比如'cottage core'、'goblin core'或'angel core'。

For example, the suffix core has recently gotten very popular in Gen z slang to describe specific aesthetics, like cottage core or goblin core or angel core.

Speaker 1

从表面上看,这些都很可爱。

And on the surface level, these are cute.

Speaker 1

你看了个田园风视频,觉得喜欢。

You watch a cottage core video, you like it.

Speaker 1

之后你会收到更多田园风内容。

Later on, you get more cottage core content.

Speaker 1

你甚至可能开始认同田园风美学。

You might even start to identify with the cottage core aesthetic.

Speaker 1

但问题是。

But here's the thing.

Speaker 1

这一切都是假的。

It's all fake.

Speaker 1

这些美学风格存在的全部原因,是因为TikTok算法认定'cottagecore'这类词汇属于热门元数据。

The entire reason these aesthetics exist is because TikTok algorithm has decided that words like cottagecore qualify as trending metadata.

Speaker 1

于是创作者们就制作更多田园风内容来传播这个词,更多人互动后又让这个词更热门。

So creators respond by making more cottagecore content that propagates the word, and then more people interact with it, which makes the word trendier.

Speaker 1

这种现象源于社交媒体算法希望你认同高度细分化的标签,这样它们就能为你提供针对该身份的极度精准的商业化内容。

And this happens because social media algorithms want to make you identify with hypercompartmentalized labels since they can then give you extremely specific commercialized content catering to that identity.

Speaker 1

现在你成了田园风爱好者,每次收到田园风视频都会觉得自己很特别。

Now that you're a cottagecore person, you feel special every time you get a cottagecore video.

Speaker 1

你是那种田园风的人吗?

You're like, cottagecore?

Speaker 1

算法真的很了解我。

Well, the algorithm really knows me.

Speaker 1

算法给了你这个身份。

The algorithm gave you that identity.

Speaker 1

你甚至可能开始购买田园风服装或装饰来适应你作为田园风人士的新生活方式,而这正是他们想要的。

You might even start buying cottagecore clothing or cottagecore decorations to fit your new lifestyle as a cottagecore person, and that's exactly what they want.

Speaker 1

最疯狂的是他们甚至没打算隐藏这一点。

The craziest part is they're not even trying to hide this.

Speaker 2

这正是资本主义创客经济的核心。

This is this is at the heart of the capitalist maker economy.

Speaker 2

不是吗?

No?

Speaker 1

这是理解算法运作原理的关键点。

This is such an important point to understand about how algorithms work.

Speaker 1

他们想要更多分类。

They want more categories.

Speaker 1

如果我是一个半性恋地精风Z世代霉粉,对他们来说就再好不过了。

It's really good for them if I'm a demisexual goblin core Gen Z Swiftie.

Speaker 2

我完全听不懂你在说什么,不过你继续。

I don't even know what you just said, but yes, go on.

Speaker 1

重点是,这些都是小标识符,是关于我作为用户的一小段元数据。

The point is, those are all little identifiers, little pieces of metadata about myself as a user.

Speaker 1

算法实际上并不需要这些文字,但人类感觉到自己在为算法表演,于是创造出这些词汇,循环往复中自发形成了一个认同该标签的用户群体,这反过来又为算法提供了更多可奖励的内容。

The algorithm actually doesn't need the words, but humans sense that they are trying to perform for the algorithm, and they create these words, and they actually circularly, emergently make more of a cluster of users who identify with this label, and then it makes more for the algorithm to reward.

Speaker 1

这一切都是监控资本主义体系的一部分,现在他们有更多关于我们的标签,能以更具体的方式锁定我们。

And it's all part of this surveillance capitalism system where now they have more things to label about us, more ways to target us more specifically.

Speaker 1

如果你查看2021年TikTok的商业页面,上面说亚文化已成为新的人口统计特征,并为企业提供了如何从田园风、坏女孩美学等标签中获利的思路。

If you look at TikTok's business page in 2021, it says that subcultures are the new demographics, and it gives businesses ideas for how to profit off things like cottagecore and hashtag baddie aesthetic.

Speaker 1

最让我震惊的是,他们居然在页面上使用了'人口统计'这个词。

And what's crazy to me about that page, they are using the word demographic.

Speaker 1

嗯。

Mhmm.

Speaker 1

因为过去的人口统计特征指的是种族、年龄、性别。

Because in the past, a demographic was like race, age, gender.

Speaker 1

而现在还包括你是否属于田园风。

And now it's also whether you're cottagecore or not.

Speaker 2

作为一个X世代的人,我内心有一部分在想:老兄,你这完全是在出卖自己啊。

So so as a Gen Xer, part of me is like, dude, you're totally selling out.

Speaker 2

但我猜那是一种极具时效性的测试方式,用来检验你为何选择现在的职业。

But I'm guessing that that was an extremely time specific way of testing you on why you are in the career you're in.

Speaker 1

我想我非常清楚这种矛盾——作为一个创作者谈论这些词汇并关心算法如何影响我们的文化时必须要面对的割裂感。

I think I'm extremely aware of the dissonance, I guess, that is required to be a creator talking about these words and caring about how algorithms are affecting our culture.

Speaker 1

我认为颠覆性地使用这些算法是可能的,否则我根本不会这么做。

I I think it is possible to use these algorithms subversively, otherwise, I I would not.

Speaker 1

另外,如果我们回归麦克卢汉的媒介理论,我认为混合多种媒介形式也非常重要。

It's also, I think, very important to mix forms of media if we're going back to McLuhan and media theory.

Speaker 1

有位学者叫哈罗德·英尼斯,嗯。

There's a scholar called Harold Innis Mhmm.

Speaker 1

他在加拿大的麦克卢汉之前。

Who came before McLuhan in Canada.

Speaker 1

我认为他关于空间偏向与时间偏向传播的观点非常精辟。

I I think he has a fascinating point about space biased versus time biased communication.

Speaker 1

时间偏向媒介与文化记录共存,比如书籍和口头传统。

And time biased sticks with the cultural record, like books and oral traditions.

Speaker 1

虽然要触达同等规模的受众可能需要更长时间,但空间偏向媒介能迅速占据大量空间,却难以持久。

And so while it might take way longer to reach an audience that's the same size, space biased media fills up a lot of space really quickly, but doesn't stick around.

Speaker 1

像报纸、大众媒体、电视,或者现在的算法,都具有极强的空间偏向性。

Something like a newspaper, or mass media, or TV, or right now algorithms are incredibly space biased.

Speaker 1

它们充斥我们的意识,然后转瞬即逝。

They they fill our consciousness, and then they go away.

Speaker 1

看来两种媒介类型各有利弊。

So it seems that both types of media have their pros and cons.

Speaker 1

对吧?

Right?

Speaker 1

空间偏向媒介擅长覆盖广泛受众,时间偏向媒介则利于保持文化延续性。

Space biased media is very good for communicating to a big audience, and time biased media is very good for maintaining some kind of cultural constancy or longevity.

Speaker 1

因此我认为应该尽可能混合使用两种媒介,通过接触多元媒介来构建更全面的社会认知。

And so I think we should be mixing as much time biased and space biased media as possible, where we build a more holistic picture of society and reality by consuming and engaging with all these different types of media.

Speaker 1

这就是我写书的原因。

So that's why I wrote a book.

Speaker 2

是啊。

Yeah.

Speaker 2

我确实想问你这个问题。

I did wanna ask you that.

Speaker 2

写书在某种程度上似乎有些复古,但根据你刚才说的,我是否可以认为你很高兴创作了它?

Writing a book seems kind of retro in a way, but based on what you've just said, am I right to assume that you were happy to create it?

Speaker 2

嗯,一是因为你拿到了报酬,二是能让更多人了解你的想法。

Well, a, because you got paid, and b, gets your ideas to more people.

Speaker 2

但第三点更重要,它更有可能经受住时间的考验,不像你的一些TikTok视频?

But c, it's more tangible that it'll stand the test of time, maybe unlike some of your TikTok videos?

Speaker 1

没错。

Yeah.

Speaker 1

我觉得没人会看我2023年的TikTok视频了。

I don't think anybody's looking at my TikTok videos from 2023.

Speaker 1

但两年后可能还会有人看这本书。

Somebody might look at this book two years from now though.

Speaker 1

所以思想传播的方式确实存在差异,不同媒介会带来不同的传播效果。

So there is a difference to how ideas can diffuse, and ideas will diffuse differently depending on the medium.

Speaker 1

这其中很多都是人类社会构建的层层规则。

So a lot of this is just built up layers of human social constructs.

Speaker 1

比如NPR规定的谈话方式,和书籍要求的写作规范——顺便说句,这本书用的是非正式英语。

The NPR social construct of how we're supposed to talk versus the book social construct of how I'm supposed to write, which is informal English, mind you.

Speaker 1

此外还有TikTok平台要求的表达方式。

And then there's the TikTok construct of how I'm supposed to talk.

Speaker 1

每一种情境对我应如何沟通都有不同层次的尊重或理解。

And each of these has a different layer of respect or understanding of how I should be communicating.

Speaker 2

那么亚当,你究竟是谁?

So who are you really, Adam?

Speaker 1

我认为优秀的沟通者应该懂得适应不同媒介、不同类型人群和不同社交场合,我们时刻都在进行语境转换。

I think a good communicator should know to adapt to different media and to different types of people and to different social settings, and we are constantly code switching all around us.

Speaker 1

你和祖母交谈时的状态与和挚友聊天时完全不同,这再正常不过。

You you are a different person talking to your grandmother than talking to your best friend, and that is super normal.

Speaker 1

祖母家的氛围与你和朋友聚会的酒吧本就是不同的沟通媒介。

The your grandmother's house is a different medium than the bar that you're hanging out with your friend at.

Speaker 1

所以根据媒介切换状态是人类天性的一部分,但某种程度上仍要保持概念上的一致性。

So this is a aspect of human nature to switch with the medium, but kind of maintain conceptual constancy, I guess.

Speaker 1

始终坚守自己的道德体系和行动方向。

Still have your own moral system and your own direction with what you're trying to do.

Speaker 2

我邀请亚当和我一起拍个视频。

I asked Adam to make a video with me.

Speaker 1

对你并不了解。

Didn't know about you.

Speaker 2

我们怎么开始?

How do we begin?

Speaker 1

哦,抱歉。

Oh, sorry.

Speaker 2

用他的爆款公式试试能否走红。

And use his formula to see if we could go viral.

Speaker 1

灯光有点差。

The lighting's kinda bad.

Speaker 2

迎合算法。

By playing to the algorithm.

Speaker 2

是啊。

Yeah.

Speaker 1

好吧。

Okay.

Speaker 1

对啊。

Yeah.

Speaker 2

我们的首要需求,一个时髦的新词。

Our first requirement, a trendy new word.

Speaker 1

哦,你不知道'clanker'这个词吗?

Oh, you didn't know about the word clanker?

Speaker 2

而且你总需要一个好噱头。

And you always need a good hook.

Speaker 2

X世代对阵Z世代。

Gen X versus Gen Z.

Speaker 2

不。

No.

Speaker 2

我完全不知道。

I had no idea.

Speaker 2

我是X世代。

I'm Gen X.

Speaker 1

这算是一种对机器人的臆想性蔑称。

It's kind of a speculative slur for robots.

Speaker 1

这个想法是它们未来会变得更有感知力,而现在就可以被当作人类少数群体对待。

So the idea that they would become more sentient in the future and they could be treated as a human minority today.

Speaker 2

这个视频的重点是让人们思考:为什么我们要为人类创造的事物发明蔑称。

The point of this video, to get people thinking about why we would even invent a slur for something that humans created.

Speaker 2

在这个案例中,我们正用'铁皮佬'这个蔑称来指代机器人。

In this case, robots, which we are now calling clankers as a slur.

Speaker 1

没错。

Exactly.

Speaker 1

所以这个想法是对机器人'哐当声'的贬义化演绎,同时也影射了N开头的那个词。

So the idea is a derogatory play on the idea of clanking as a robot, but also the n word itself.

Speaker 1

这个概念源自《星球大战》,在那里机器人被视为次等种族,有很多关于'铁皮佬'作为底层阶级的设定,但某种程度上也是未来拟人化的公民。

It comes from Star Wars where robots are seen as sort of a secondary race, and there's a lot of ideas about clankers being a lower class, but sort of anthropomorphized citizen in the future.

Speaker 2

从某些角度看确实合理。

In some ways, makes sense.

Speaker 2

比如它们不是人类,难道不该被当作铁皮佬对待吗?

Like, they're not humans, so shouldn't they be treated as clankers?

Speaker 1

你知道,这个想法是未来它们会产生意识,而我们却拒绝承认这点。

You know, it's just the idea that in the future they would become conscious, and we aren't recognizing that.

Speaker 1

这有点像网络迷因,设想2060年你女儿带回家一个铁皮佬男友。

It's sort of an online meme that's speculating about your daughter bringing home a clanker in the year 2060.

Speaker 2

所以就像,2060年我女儿会带回家一个对她特别特别好的机器人,比真人男友更体贴,但他是个铁皮佬。

So, like, 2060, my daughter's gonna bring home a robot who is really, really nice to her, therefore, a nicer boyfriend than a real human, but he's kind of a clanker.

Speaker 1

对。

Right.

Speaker 1

嗯,这个想法就是你不希望自己的女儿和机器人约会。

Well, the idea is you don't want your daughter dating a robot.

Speaker 2

确实如此。

Amen to that.

Speaker 1

嗯,到了2060年,这种想法可能就成种族歧视了。

Well, in 2060, that might be racist.

Speaker 1

天啊。

Oh god.

Speaker 2

一年后'clanker'这个词还会存在吗?

Will the word clanker be around a year from now?

Speaker 2

也许不会。

Maybe not.

Speaker 2

这些流行词和梗图算是当今世界最紧迫的问题吗?

Are these trendy words and memes among the world's most urgent issues?

Speaker 2

当然不算。

Definitely not.

Speaker 2

尽管它们有趣又新奇,但这个算法和梗图的世界也有其阴暗面。

But as much as they are fun and interesting, there is a darker side to this world of algorithms and memes.

Speaker 2

我确实想确认一下,有些人正利用语言和算法运作方式来传播各种观点。

I do wanna make sure that I ask you, there are people who are using language and the way the algorithm works to get across all kinds of ideas.

Speaker 2

最近就发生了令人不安的可怕事件——保守派网红查理·柯克被杀害了。

And, you know, most recently, the very upsetting and awful news that Charlie Kirk, a conservative influencer, was killed.

Speaker 2

但我很好奇,听说他在网上积累了大量粉丝,部分原因在于他对语言的独特运用。

But I guess I'm curious, because he built a big following online in part through a very particular use of language.

Speaker 2

对吧?

Right?

Speaker 2

它简短有力,专为病毒式传播设计。

It was short, punchy, geared for viralness.

Speaker 2

从你的角度来看,这如何说明社交媒体平台正在塑造公众人物沟通方式的现象?

From your perspective, how does that illustrate the way that social media platforms are shaping the way public figures communicate?

Speaker 1

是的。

Yeah.

Speaker 1

有种容易走红的'陷阱式'风格,你会在本·夏皮罗或查理·柯克过去的内容中看到。

There's a sort of gotcha style that goes viral that you'll see Ben Shapiro or Charlie Kirk, in the past use.

Speaker 1

就是那种——保持简洁,做成适合剪辑传播的片段。

That's like, keep it concise, keep it clip-farmed.

Speaker 1

比如能作为30秒片段走红的尖酸语录。

So something that can go viral as a thirty second clip, some snarky sound bite.

Speaker 1

谁更正确不重要,重要的是谁更能病毒式传播。

It doesn't matter who's more correct, just who can go more viral.

Speaker 1

如果你的传播力比对手强,就能压制反驳或反对观点。

And if you can go more viral than the other person, you can outcompete like rebuttals or counterpoints.

Speaker 1

所以重点在于抛出观点,而非给出深思熟虑的回应。

So it's more about just get your idea out there than have some measured response.

Speaker 1

算法总是奖励极端内容。

And algorithms reward extreme things.

Speaker 1

对吧?

Right?

Speaker 1

当我们讨论参与度优化时,这些算法为了吸引我们的注意力所采取的手段。

When we were talking about engagement optimization, what these algorithms are doing to get our attention.

Speaker 1

不幸的是,真正能吸引注意力的往往是极端的内容。

Things that are good at getting attention are unfortunately really extreme things.

Speaker 1

极左和极右的观点与声音将会被放大。

So far left and far right opinions and voices are going to be amplified.

Speaker 1

你知道的,AOC和玛乔丽·泰勒·格林总会比我家乡纽约奥尔巴尼的议员保罗·唐科更容易走红。

You know, AOC and Marjorie Taylor Greene are always gonna go more viral than my congressman from where I grew up in Albany, New York, Paul Tonko.

Speaker 1

无意冒犯。

No offense.

Speaker 1

他有点无聊。

He's kinda boring.

Speaker 1

他不会走红。

He doesn't go viral.

Speaker 1

对吧?

Right?

Speaker 1

他的观点没有极端之处,所以在算法上传播不了那么远。

There's no kind of extreme point to his views, so it's not gonna go as far on the algorithm.

Speaker 1

因此极端观点比温和观点更具竞争力,传播更广,可能导致更多人开始接受这些观点。

And so you have this ability of more extreme views to outcompete less extreme views, diffuse further, and then more people maybe start adopting that.

Speaker 1

这为人们捐款提供了新机会,或者说我认为特朗普是第一位算法总统,就像肯尼迪是第一位电视总统一样。

It represents a new chance for people to donate money or I think Donald Trump is the first algorithmic president in the same way that Kennedy was the first TV president.

Speaker 1

他之所以能战胜尼克松当选,很大程度上是因为他更上镜。

He was famously kind of elected over Nixon because he looked more photogenic.

Speaker 1

而我认为唐纳德·特朗普是更具网络迷因特质的总统。

And I think Donald Trump is the more memeable president.

Speaker 1

如果说上镜的候选人适合电视时代,那么具有迷因特质的总统就更适合算法时代。

And if photogenic candidates were good on TV, memeable presidents are good for algorithms.

Speaker 2

举个例子来说明——还是回到查理·柯克的话题——这种现象在现实中如何体现。

I mean, one illustration I think of how just back to Charlie Kirk, how this can play out in real life.

Speaker 2

我曾和一位当妈妈的朋友聊天,她不好意思地承认自己从没听说过查理·柯克。

I had a conversation with a friend, a mom, who said to me, I I hate to admit it, I had not heard of Charlie Kirk.

Speaker 2

但我问她14岁的儿子时,他回答说:'拜托老妈,我当然知道他。'

But I asked my 14 year old son if he had, and he said, duh, mom, of course I have.

Speaker 2

'我可是社交媒体上的白人14岁少年。'

I'm a white 14 year old kid on social media.

Speaker 2

'他们投放的内容就是专门针对我的。'

I'm targeted exactly by them.

Speaker 2

'当然看过他的视频。'

Of course, I've seen his videos.

Speaker 2

关键在于他完全明白自己接收的内容及其推送原因。

The fact that he understood what he was getting and why he was getting it.

Speaker 2

你认为年轻一代是否更精明?他们是否会因为思考媒介属性而质疑接收到的信息?

Do you think that the younger generation is more savvy, and are they questioning the information that they're getting because they are thinking about the medium?

Speaker 1

是的。

Yes.

Speaker 1

我对年轻一代确实充满乐观。

I actually have so much optimism for the younger generation.

Speaker 1

实际上我认为老一辈人对新媒体可能更加陌生,这令人担忧。

I actually think older generations are maybe more concerningly illiterate of the new medium.

Speaker 1

最早被低质内容欺骗的是脸书上的婴儿潮一代,比如那个病毒式传播的'虾仁耶稣'。

The first people to get tricked by slop were boomers on Facebook, and there was like a shrimp Jesus that went viral.

Speaker 1

我认为年轻人普遍能较好地理解'哦,这可能是AI生成的'。

I think young people generally have a decent understanding of, oh, this could be AI.

Speaker 1

这可能不是真实的。

This might not be real.

Speaker 1

或者说,这是算法在作祟,算法不一定会展示我真正想看的内容。

Or like, this is the algorithm, and the algorithm is not necessarily showing me what I want.

Speaker 1

当人们较少接触新媒体时,他们更难意识到某些内容是虚假或被过度呈现的。

When people engage less with new media, they are less aware of how some things are fake or overrepresented.

Speaker 1

但没错,这确实仍是年轻人被激进化的渠道之一。

But yes, definitely, it is still a pipeline for younger people to get radicalized.

Speaker 1

很多极端意识形态直接源自4chan论坛。

A lot of this comes directly out of extreme ideology built on 4chan.

Speaker 1

算法形成的某些语言社群可能具有实际危害性。

Some of the linguistic communities created by the algorithm can be actively harmful.

Speaker 1

许多年轻人开始用后缀'-pilled'表示'被说服接受某种生活方式'。

Many younger people have started using the suffix pilled to mean convinced into a lifestyle.

Speaker 1

比如如果我最近发现自己特别爱吃墨西哥卷饼,就可以说'我完全墨西哥卷饼pilled了'。

If I recently discovered that I really like eating burritos, for example, I can say, I'm so burrito pilled.

Speaker 1

但这个词是通过类比'black pilled'形成的,后者指被说服接受非自愿独身主义意识形态。

But that word was formed through analogy with black pilled, a term meaning convinced into incel ideology.

Speaker 1

如今非自愿独身主义者是危险的厌女群体,但他们的词汇却通过算法提供的空间渗透进了Z世代俚语。

Now incels are a dangerous misogynistic group, and yet somehow the vocabulary is filtering into Gen Z slang because the algorithm gave these hate groups the space.

Speaker 1

而且很多人根本不知道这些词的来源。

And again, many people don't even know where it came from.

Speaker 1

但对少数可能对其核心理念感兴趣的人来说,网络俚语的传播方式让这些思想变得更易接触了。

But for the few people who might be interested in the underlying idea, it's now more accessible to them because of the way that slang spreads on the Internet.

Speaker 2

你认为这意味着什么?

What do you think that means?

Speaker 2

你刚才还说对年轻一代持乐观态度。

You just said you were an optimist for younger generation.

Speaker 2

但听完你说的这些,我非常担忧。

But after hearing you say that, I I worry a lot.

Speaker 1

我乐观的是年轻一代能认清这些东西的本质。

I'm an optimist for younger generations having literacy about this stuff.

Speaker 1

但对算法及其文化影响方面,我并不乐观。

I am not an optimist in terms of algorithms and how they impact our culture.

Speaker 1

我想把语言和文化区分开来,因为我经常同时讨论这两者。

I do wanna draw that distinction between language and culture because I sort of comment on both a lot.

Speaker 1

我认为语言本身没问题,单个词汇从不会出问题。

And I think language is fine, and there's nothing wrong with individual words.

Speaker 1

它们永远不会腐蚀大脑。

They're never brain rot.

Speaker 1

我也认为,与此同时,语言指向着更宏大的文化变迁。

I do think, at the same time, language points us towards greater cultural shifts.

Speaker 1

你会发现很多词汇源自4chan男性圈层,我们应当关注这些词汇为何会从该圈层中产生。

You see a lot of words coming from that 4chan manosphere space, and we should be paying attention to why these words are coming out of that space.

Speaker 1

因为这个圈层擅长制造传播其意识形态的模因,有时其意识形态与极右翼思想和美学深度交织。

It's because that space is good at creating memes that export their ideology, and sometimes their ideology is very intertwined with far right ideas and aesthetics.

Speaker 1

这某种程度上也是算法自身的问题——它们如何助推极端观点。

And, yeah, that's kind of a problem of algorithms themselves, how they push extreme perspectives.

Speaker 1

先前谈到媒体融合时,我真心希望我们能形成一种文化共识:不应仅依赖算法获取新闻,当然主流媒体也存在自身偏见。

What I said earlier about mixing media, I really hope that we come to realize as a culture that we should not just be relying on algorithms for our news, but also, you know, mainstream news has its own biases.

Speaker 1

从尽可能多的渠道获取新闻有助于构建更完整的现实图景。

It's good to get news from as many places as possible and build a greater picture of reality.

Speaker 1

通过现实人际交往来了解他人也很重要,因为算法呈现的社会面貌是扁平化的。

It's good to build your stories of who people are through in person connections as well because algorithms are going to show you a flattened representation of what society is.

Speaker 1

我认为我们确实需要保持警觉。

I do think we should be aware.

Speaker 1

当我们的说话方式可能已被算法塑造时,我们应当有所觉察。

We should be aware when the way we're talking may have been conditioned by the algorithm.

Speaker 1

当我们使用的词汇可能是被设计来推销商品时,我们应当有所觉察。

We should be aware when the words we're using may have been engineered to sell us things.

Speaker 1

当我们的语言复读极端主义论调时,当这种语言可能伤害他人时,我们都应当有所觉察。

We should be aware when our language regurgitates extremist rhetoric, and we should be aware when that language can be used to harm other people.

Speaker 1

我们总体上应当关注词源学,因为这能帮助我们更好地理解当下的自己。

We should be aware of etymology in general because it helps us better understand who we are today.

Speaker 1

我们应该保持警觉。

We should be aware.

Speaker 1

最后,我还要分享一个俚语给大家。

And with that, I've just one final piece of slang for you.

Speaker 1

这是年轻人在结束冗长解释时常用的说法。

It's a common phrase used by younger people when we finish a long winded explanation of something.

Speaker 1

感谢收听我的TED演讲。

Thanks for coming to my TED talk.

Speaker 2

刚才发言的是亚当·亚历克西克。

That was Adam Aleksic.

Speaker 2

他是《算法语言:社交媒体如何改变语言未来》的作者。

He's the author of Algospeak: How Social Media Is Transforming the Future of Language.

Speaker 2

你可以在社交媒体上通过@etymologynerd找到他,也可以在ted.com观看他的演讲。

You can find him at @etymologynerd on social media, and you can see his talk at ted.com.

Speaker 2

非常感谢大家收听今天的节目。

Thank you so much for listening to our show today.

Speaker 2

如果喜欢我们的节目,请留言评论。

If you enjoyed it, leave us a comment.

Speaker 2

你正在使用哪些新词汇?

What words are you using?

Speaker 2

我们很想知道。

We would love to know.

Speaker 2

你可以在Spotify留言或发邮件至TedRadioHour@NPR.org告诉我们。

You can do that on Spotify or email us at TedRadioHour@NPR.org.

Speaker 2

我们很高兴收到您的反馈。

We love hearing from you.

Speaker 2

本期节目由马修·克卢捷制作,由萨娜兹·梅什坎普尔和我负责剪辑。

This episode was produced by Matthew Cloutier and edited by Sanaz Meshkampur and me.

Speaker 2

NPR的制作团队成员还包括詹姆斯·德拉胡西、瑞秋·福克纳·怀特、凯蒂·蒙特莱昂、菲奥娜·吉隆、哈沙姆·内贾达和菲比·莱特。

Our production staff at NPR also includes James Delahoussaye, Rachel Faulkner White, Katie Monteleone, Fiona Geiran, Harsha Nahata, and Phoebe Lett.

Speaker 2

我们的执行制片人是艾琳·野口。

Our executive producer is Irene Noguchi.

Speaker 2

音频工程师是达米安·赫林和吉利·穆恩。

Our audio engineers were Damian Herring and Gilly Moon.

Speaker 2

主题音乐由拉姆廷·阿拉布鲁伊创作。

Our theme music was written by Ramtin Arablouei.

Speaker 2

TED的合作伙伴包括克里斯·安德森、罗克珊·海拉什和丹妮拉·巴拉拉佐。

Our partners at TED are Chris Anderson, Roxanne Hai Lash, and Daniella Balarezo.

Speaker 2

我是马努什·扎莫罗迪,您正在收听的是NPR出品的TED广播时间。

I'm Manoush Zomorodi, and you've been listening to the TED Radio Hour from NPR.
