Citadel Dispatch - 想象一下:耶格尔与罗斯——控制时代的社群 封面

想象一下:耶格尔与罗斯——控制时代的社群

IMAGINE IF: YEAGER AND ROSS - COMMUNITIES IN THE AGE OF CONTROL

本集简介

与Shawn Yeager和Derek Ross在田纳西州纳什维尔Imagine If大会上的对话。我们探讨了数字通信与身份的现状,特别聚焦于nostr协议。

日期:2025年9月20日
Shawn的Nostr主页:https://primal.net/shawn
Derek的Nostr主页:https://primal.net/derekross
比特币公园Nostr主页:https://primal.net/park
Imagine If大会官网:https://bitcoinpark.com/imagineif

(00:40) 数字通信、身份与社交的未来
(01:41) 问题诊断:激励机制、KYC与信任崩塌
(03:05) 内容审查、隐形封禁与社交图谱自主权
(05:00) AI、深度伪造与信息可信度
(07:24) 算法控制 vs 用户选择权
(10:10) Nostr协议介绍:开放协议与健康互动
(11:54) 数字健康、末日刷屏与育儿挑战
(15:21) 开放协议下的青少年安全:责任与工具
(18:22) 为父母赋能:操作系统级控制与用户体验
(19:35) Nostr入门指南:密钥管理、Primal与用户体验谱系
(21:17) 氛围编程应用:Nostr+比特币上的Soapbox/Shakespeare
(22:39) 无许可支付与AI建站

视频链接:https://primal.net/e/nevent1qqs0v7rgjh55wygwuc8pmqvk0qz6qts30uaut2c8lp4dgh9usw2cdpgnznwd9
节目更多信息:https://citadeldispatch.com
关于我的更多信息:https://odell.xyz

双语字幕

仅展示文本字幕,不包含中文音频;想边听边看,请使用 Bayt 播客 App。

Speaker 0

让我们为马特·奥德尔、德里克·罗斯和肖恩·耶格尔鼓掌。

A round of applause for Matt Odell, Derek Ross, and Shawn Yeager.

Speaker 0

好的。

Okay.

Speaker 0

哟。

Yo.

Speaker 0

大家最近怎么样?

How's it going, guys?

Speaker 0

今天我们将讨论数字通信、身份识别、社交和开放社区的未来。

We'll be talking today about the future of digital comms, identity, social, and open communities.

Speaker 0

还有我的好朋友德里克和肖恩也在这里。

And then my good friends, Derek and Shawn, here with us.

Speaker 0

我认为从问题诊断开始会是个有趣的切入点。

I think an interesting place to start is diagnosing the problem.

Speaker 0

我们发现自己身处一个日益数字化的世界,而现状就像是毫无规划地一步步走到今天。

We've found ourselves in an increasingly digital world, and the status quo has kind of just been built one foot in front of the other, without really any real planning, and now we're here.

Speaker 0

那么让我们先从肖恩开始。

So let's start off with Shawn.

Speaker 0

当你思考数字社区、身份识别和社交的现状时?

When you think about the current state of digital communities and identity and social?

Speaker 0

你认为问题出在哪里?

Where do you diagnose the problems existing?

Speaker 1

作为一个比特币信徒,我认为就像大多数事情一样,问题始于货币体系的崩溃。

I think, as with most everything, as an admitted bitcoiner, it starts with broken money.

Speaker 1

货币体系一旦崩溃,激励机制也随之瓦解。

And with broken money come broken incentives.

Speaker 1

由此衍生的商业模式,众所周知,将我们变成了产品。

And from that flow, business models that turn, as we all know, us into the product.

Speaker 1

我认为,近年来存在一种愈演愈烈的趋势——试图从终端用户、广告主和企业身上榨取更多价值。

And there has been an increasing, I think, drive to try to milk more out of the consumer user and then the advertisers and the businesses.

Speaker 1

科利·多克托罗有个生动的说法(或许我不该在此复述)来描述这种循环如何运作。

Cory Doctorow has a colorful phrase, maybe I won't utter it here, but to describe how these cycles roll out.

Speaker 1

因此我们面临的处境不仅是沦为用户或产品,更逐渐被视为可打包的商品。

And so where we find ourselves is, not only are we the user or the product, but I think increasingly we are seen as something to be packaged.

Speaker 1

我们目睹着KYC(了解你的客户)制度悄然蔓延。

And we see creeping KYC.

Speaker 1

我们看到英国《网络安全法案》引发的种种事态。

We see everything happening in the UK with the Online Safety Act.

Speaker 1

是的,我们当前的处境确实不妙。

And, yeah, we're not in a good place right now.

Speaker 0

说得非常好。

Very well said.

Speaker 0

德里克,你怎么看?

Derek, how do you think about it?

Speaker 2

我认为过去几年里,我们都有这样的经历:认识某人、与网友互动或关注的对象遭遇封禁、影子禁令等类似情况。

Well, I think that over the past few years we've all come to a place where we know somebody or we interacted online with somebody, followed somebody that has been censored or has been shadow banned, something along those lines.

Speaker 2

这种现象正变得日益明显,且加速蔓延。

It's becoming more apparent, and it's accelerating.

Speaker 2

看到它在加速发展确实有点奇怪。

It's kind of odd to see it accelerating.

Speaker 2

就像肖恩刚才说的,我们正目睹这种情况在整个欧盟发生,你知道的,在英国也是如此。

Like Shawn just said, we're seeing that happen across the European Union and, you know, in the UK.

Speaker 2

我们最近甚至开始在美国看到这种现象——人们的整个生计、他们的事业可能被夺走,因为他们把事业建立在别人的基础上。

We're starting to see this actually happen recently even here in the United States, where people can have their entire livelihood, their business, taken away because they built their business on somebody else's foundation.

Speaker 2

而且他们并不拥有那些内容。

And they don't own that content.

Speaker 2

他们也不拥有那些粉丝。

They don't own their followers.

Speaker 2

他们不拥有整个社交关系图谱,而这些正在一夜之间消失。

They don't own their entire social graph, and it's disappearing overnight.

Speaker 2

多年辛勤工作的成果可能被夺走,你却无能为力,因为你把整个数字生活都建立在别人的基础上。

Years and years of hard work can be taken away from you, and you can't do anything about it because you built your entire digital life on somebody else's foundation.

Speaker 2

现在越来越明显的是,必须要有更好的解决方案。

And it's becoming very apparent that there needs to be a better way.

Speaker 0

是啊。

Yeah.

Speaker 0

我认为有几个问题相互叠加,导致了我们当前在科技巨头和数字平台方面的发展轨迹。

I think there's a there's a couple of issues that compound on top of each other that result in the current trajectory that we're that we're going down in terms of of big tech and digital platforms.

Speaker 0

所以,你们提到了审查和控制,我觉得这是人们经常讨论的一个重点。

So, I mean, you guys honed in on censorship and control, which I think is one that people talk about a lot.

Speaker 0

肖恩,你一直在探索人工智能和比特币之间的交叉领域对吧?

So, Sean, you've been exploring, like, kind of this intersection between, you know, AI and Bitcoin.

Speaker 0

这里另一个让我非常感兴趣的点是关于深度伪造和可验证性的概念。

And the other piece here that is really interesting to me is is, like, this idea of deepfakes and verifiability.

Speaker 0

在当前范式下,你如何看待这个问题?

How do you think about that in the current paradigm?

Speaker 1

我认为,简单说明一下背景——希望不是无耻的广告——信任革命的核心是要解决两个问题。

I think, I mean, just a brief bit of background, hopefully not a shameless shill: the point of Trust Revolution is to pursue two questions.

Speaker 1

第一个问题是:作为发达国家,我们为何会陷入低信任社会?

One is how do we, as developed nations, find ourselves in low trust societies?

Speaker 1

我想我们大多数人都能认同这点,皮尤研究中心等机构的研究也支持这一观点。

I think most of us can agree on that, and Pew Research and others would certainly back it up.

Speaker 1

我们不信任政府。

We don't trust the government.

Speaker 1

不信任媒体。

Don't trust the media.

Speaker 1

但我们不信任医疗体系。

We don't trust health care.

Speaker 1

但不信任教育系统。

We don't trust education.

Speaker 1

我们彼此之间也不信任。

We don't trust each other.

Speaker 1

我们甚至不信任跨党派合作。

We don't trust across party lines.

Speaker 1

这并不是悲观论调。

That's not a black pill.

Speaker 1

我认为这显然是事实。

I think it's just observably true.

Speaker 1

第二个更有希望的问题是,我们如何以及在哪里能重新获得我们付出或被要求却已破裂的信任?

The second, more hopeful question is: how and where can we reclaim the trust that we have given, or that has been demanded of us, and that has been broken?

Speaker 1

我们如何能在我们认为应该存在信任的地方建立信任?

And how can we build trust where we believe it should be?

Speaker 1

也就是说,我们能相信自己的眼睛吗?针对你的问题。

That's all to say, can we trust our eyes, to your question, you know?

Speaker 1

我们能信任我们所看到和消费的媒体吗?

Can we trust the media that we see and we consume?

Speaker 1

我认为有希望的是利用公私钥加密技术来签名、验证和归属媒体内容。

I think what's hopeful about that is the ability to utilize public-private key cryptography to sign, authenticate, and attribute media.

Speaker 1

我认为距离大规模应用还很遥远。

I think we're quite a ways away from that being large scale.

Speaker 1

我认为再次强调,激励机制未必能让这项技术被广泛采用,但工具已经存在。

I think once again the incentives are not necessarily aligned for that to be widely adopted, but I think the tools are there.

Speaker 1

我心中的大问题是——呼应你的观点——我们何时会达到这个拐点:当人们对所见是否真实产生大量质疑和困惑时,就会更广泛采用现有工具(如Nostr和这些公私钥对)来应对挑战。

And the big question in my mind, to echo yours, is at what point do we reach this inflection where there is so much questioning and confusion about whether what I'm seeing is real that there's a broader adoption of the tools that we do have, like Nostr and these public-private key pairs, to address that challenge.
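The signing-and-attribution idea Shawn describes can be made concrete. In Nostr (per NIP-01), every event carries an id that is the SHA-256 hash of a canonical JSON serialization of its fields; the author signs that id with their private key, so anyone can recompute the hash and check the signature to verify authorship and integrity. A minimal stdlib sketch of the id computation (the Schnorr signature itself needs a secp256k1 library and is omitted; the sample values below are illustrative):

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    # Per NIP-01, the event id is the SHA-256 of the canonical JSON
    # serialization [0, pubkey, created_at, kind, tags, content].
    payload = json.dumps([0, pubkey, created_at, kind, tags, content],
                         separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Tampering with the content changes the id, so an edited note can no
# longer match the author's signature over the original id.
original = nostr_event_id("ab" * 32, 1758326400, 1, [], "hello nostr")
tampered = nostr_event_id("ab" * 32, 1758326400, 1, [], "hello nostr!")
print(original != tampered)  # True
```

Because the hash is deterministic, any relay or client can re-derive the id independently, which is what makes content verifiable without trusting the platform that delivered it.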

Speaker 0

但我的意思是,我们不是已经差不多到那个阶段了吗?

But, I mean, aren't we aren't we kind of already there?

Speaker 1

具体指什么方面?

In what way?

Speaker 1

指的是在...方面

There in terms of

Speaker 0

我觉得大多数人,比如当你打开手机时,你会想,这是真的吗?

I think most people, like, when you open your phone, you're like, is that real?

Speaker 0

哦,是的。

Oh, yes.

Speaker 0

就像,我们已经非常接近了,甚至可能已经跨越了那道鸿沟。

Like, we're very close, if not already across the chasm.

Speaker 0

对吧?

Right?

Speaker 1

是的。

Yeah.

Speaker 1

我的意思是,我想简单补充一点,就像以往的技术浪潮一样,我们需要培养一定的认知能力和批判思维。

And I'll just say one quick thing there: I think, much as in prior waves of technology, there has been the need to create a certain literacy and a certain ability to scrutinize.

Speaker 1

我希望这能激励人们更审慎地思考他们所消费的内容,以及他们应该质疑或信任什么。

I hope that it incentivizes and motivates people to become more thoughtful about what they consume and what they question or trust.

Speaker 2

我认为扩展消费内容本身就是一个独特的问题,因为我想消费的内容与我被迫消费的内容截然不同。

I think expanding on what you consume is a unique problem in itself because what content I want to consume versus what content I'm forced to consume is very different.

Speaker 2

没错。

Yes.

Speaker 2

因为我们被算法和这些平台想让我们看到的内容所奴役。

Because we are slaves to the algorithms and what these platforms want us to see.

Speaker 2

我们实际上无法控制这些内容。

We don't really have control over the content.

Speaker 2

我们也无法控制自己的注意力。

We don't have the control over our attention.

Speaker 2

而这正是问题的另一部分。

And that that's part of the problem too.

Speaker 2

所以如果你不想看到某些类型的内容,在使用这些现有的传统社交平台时很难避免看到它们。

So if you didn't wanna see certain types of content, it's really hard to not see it using these existing legacy social platforms.

Speaker 1

你正在被强行灌输。

You're being spoon fed.

Speaker 1

是啊。

Yeah.

Speaker 1

是啊。

Yeah.

Speaker 0

我是说,从生产力的角度来看,你如何缓解这种情况?

I mean, from, like, a productive point of view, how do you how do you mitigate that?

Speaker 0

你实际上如何解决这个问题?

How do you actually solve that problem?

Speaker 0

我是说,这说起来容易做起来难。

I mean, that's easier said than done.

Speaker 2

是啊。

Yeah.

Speaker 2

说起来容易做起来难,但我们需要为用户提供工具,让他们能选择自己的算法,选择他们想看的内容类型,选择和策划他们的社交信息流。仅仅因为埃隆和马克·扎克伯格说这是你需要看的内容,并不意味着我想看它。

It's easier said than done, but we need tools for users that allow them to choose their own algorithm, to choose the type of content they want to see, to choose and curate their social feeds. Just because Elon and Mark Zuckerberg say that this is the content that you need to see doesn't mean that I want to see it.

Speaker 2

并不意味着你想看它,但如果我使用Instagram、Facebook或X(推特),我就没有选择权。

Doesn't mean that you want to see it, but I don't have a choice if I use Instagram or Facebook or x, Twitter.

Speaker 2

比如,我不得不看那些算法推荐的内容。

Like, I have to see that algorithm content.

Speaker 2

要知道,如果我想看些猫咪或其他什么内容,我其实没有选择权来决定我的信息流里显示什么。

I don't have a choice of choosing, you know, cat pics as my feed if I want a feed of cats or whatever it is.

Speaker 2

我当然可以轻松做到。

I I can easily sure.

Speaker 2

我可以浏览某个话题标签之类的,但你知道,这不是个好选择。

I could browse a hashtag or something like that, but that's not a good, you know, that's not a good choice.

Speaker 2

我们需要更多用户工具。

We need more user tools.

Speaker 2

我们需要更多用户选择权,市面上已有一些方案能让用户完全掌控他们想消费的内容,完全掌控自己的注意力。

We need more user choice, and there are options out there that give users full control over what they want to consume, full control over their attention.

Speaker 2

因为这些平台正是在靠这个盈利。

Because that's what these platforms are are monetizing.

Speaker 2

他们在靠我们的注意力赚钱。

They're monetizing our attention.

Speaker 2

对吧?

Right?

Speaker 2

我们需要一种方式把注意力夺回来。

Like, we need a way to take that back.

Speaker 2

要知道,这是我的——我的眼睛在看的东西。

It's our it's, you know, what my eyes see.

Speaker 2

这是我的注意力。

It's my attention.

Speaker 2

我应该能自主决定什么能吸引我的注意力。

I should be able to designate what gets my attention.

Speaker 0

你认为那个摩擦点是因为我觉得这才是前进的方向。

And do you think the the friction point with that because I do think that's the path forward.

Speaker 0

这个摩擦点在于它需要用户承担一定程度的个人责任。

The friction point with that is it requires a level of personal responsibility from the actual user.

Speaker 2

是的。

Yeah.

Speaker 0

比如,我们该如何处理这种摩擦?

Like, how do we handle that friction?

Speaker 0

我是说,有些人

I mean, there's some people that

Speaker 2

只想随便刷刷。

just wanna scroll.

Speaker 2

对吧?

Right?

Speaker 2

他们不想——他们没时间构建和筛选自己的内容流,这也没关系。

They they don't want the they don't have time to build and curate their own feed, and and that's fine.

Speaker 2

对此,你本可以有选择。

For that, you have a choice.

Speaker 2

问题就在于你根本没有选择权。

The fact that you don't have a choice is the problem.

Speaker 2

如果你想要被投喂的内容,那很好。

If you want the spoon fed content, great.

Speaker 2

如果你不想要被投喂的内容,想自己掌控算法,你就应该拥有广泛的选择权。

If you don't want the spoon-fed content, if you wanna be in control of your own algorithm, you should have that choice, and a wide variety of choices.

Speaker 2

选择应当是开放透明的,你应该能自主决定要走哪条路。

The choices the choices should be open and transparent, and you should be able to decide which path you wanna go.

Speaker 1

我想说它还具有体验性,如果你还没接触过Nostr,没尝试过Nostr

And I I would say it's also experiential in the sense that if you're not on Nostr, if you haven't tried Nostr

Speaker 2

Nostr是什么?

What is Nostr?

Speaker 1

我们甚至还没讨论过这个。

We didn't even talk about that yet.

Speaker 1

Nostr是什么?

What is Nostr?

Speaker 1

就像比特币一样——这个让马特来解释吧——它是一个开放协议。

Well, so like Bitcoin, and I'll I'll let Matt talk to this, it is an open protocol.

Speaker 1

没人能控制它。

No one controls it.

Speaker 1

没人拥有它。

No one owns it.

Speaker 1

因此它可供自由构建。

And therefore, is there to be built upon.

Speaker 1

我提到它的原因是,我认为大多数传统社交媒体和一对多传播渠道不仅通过我们的注意力获利,更日益通过煽动我们的愤怒牟利。

And the reason I mentioned it is, I think most of traditional social media and communications channels, one to many, they are not only monetizing our attention, increasingly, they're monetizing our outrage.

Speaker 1

根据我观察人们使用替代平台(如Mastodon等)的经验,我认为我们都认同Nostr才是未来方向。

And I think, as I've observed people experience an alternative (Mastodon, others that are out there), we all agree that Nostr's the way to go.

Speaker 1

一旦消除愤怒情绪,通过Nostr与他人互动的体验让我感觉至少不会比使用X(原推特)、Facebook等平台时更糟,甚至可能更好。

Once you remove the outrage, it is experiential: I feel better, or at least not worse, as I have engaged with others on Nostr versus X versus Facebook versus others.

Speaker 1

以上就是全部要说的内容。

And so that is all to say.

Speaker 1

我认为关键部分在于让人们理解那种感受。

I think part of the key is just giving people a sense of what that's like.

Speaker 1

而且我认为他们可以开始——我们每个人都可以——重新调整那些受体,那些我们习以为常的多巴胺刺激。

And I think they can begin, each of us, to sort of rewire those receptors, those dopamine, you know, hits that we're accustomed to getting.

Speaker 1

但这需要一些时间。

But it will take some time.

Speaker 0

我的意思是,你正在深入探讨这个关于健康使用技术的基本概念。是的。

I mean, you're drilling down on basically this concept of healthy usage of of technology Yes.

Speaker 0

作为一个社会,我们可能已经深陷于对这些工具的不健康使用中。

Which I would say as a society, we're probably very deep into unhealthy usage of of the tools.

Speaker 0

而且,我是说,我在自己的生活中亲眼目睹了这一点。

And, I mean, I see this firsthand with my own life.

Speaker 0

我现在在社会各个不同层面都看到了这种现象。

I see this across all different aspects of society right now.

Speaker 2

如今我们对此有个专门术语。

We have a term for that nowadays.

Speaker 2

叫做'末日刷屏'。

It's called doomscrolling.

Speaker 2

末日刷屏。

Doomscrolling.

Speaker 1

就像,它

Like, it

Speaker 2

变得如此明显,我们拥有了这一点。

became so apparent we have that.

Speaker 0

AI精神错乱。

AI psychosis.

Speaker 0

是啊。

Yeah.

Speaker 0

末日刷屏。

Doomscrolling.

Speaker 0

人人都这么做。

Everyone does it.

Speaker 0

很多人都会这样。

A lot of people do it.

Speaker 0

他们明知自己在做。

They know they're doing it.

Speaker 0

却依然继续。

They continue to do it.

Speaker 0

但关于数字健康与健康使用理念中,我认为对社会未来发展至关重要的一个方面是——我们三人都是父母。

But one part on this one one aspect of this idea of of digital health and healthy usage that I think is incredibly key for our society going forward is all three of us are parents.

Speaker 0

具体来说,我认为成年人以非常不健康的方式使用它,但问题在于这会如何影响儿童发展?

It's specifically I mean, I think adults use it in very unhealthy ways, but but the question is, like, how does that affect childhood development?

Speaker 0

而对于像Nostr这样的开放协议,它不受任何人控制。

And for something like Nostr, that's an open protocol that's not controlled by anybody.

Speaker 0

你觉得呢?我是说,我们再次从肖恩开始。

How do you think? I mean, we'll start with Shawn again.

Speaker 0

你打算如何处理这个问题?

How do you think about handling that issue?

Speaker 0

比如,社会该如何应对孩子们在信息洪流中成长的问题?

Like, how how does how does society handle that going forward with kids growing up with basically just a fire hose of information?

Speaker 1

嗯,那边就是我家的小家伙,快四岁了。

Well, I am there's my little guy right there, my almost four year old.

Speaker 1

所以我是一个小男孩的父亲。

So I'm a dad to a young boy.

Speaker 1

我有些时间,或许可以分享个轶事——这完全要归功于我妻子,她规定(Lathe别听)每天早上一到两小时的屏幕时间,这样她在家就能有些空间处理事情。

And so I have a bit of time, but I'll just maybe share an anecdote, which is that we, full credit to my wife, had given (close your ears, Lathe) maybe an hour to two per morning of screen time so that, you know, she at home could have some space to do some things.

Speaker 1

变化非常显著,这对有经验的人来说显而易见,但对我来说很震撼——当我们说不并彻底取消屏幕时间后,儿子的变化令人难以置信。

It is remarkable, the change, and this will be obvious to those of you who've done it, but it was remarkable to me that in saying no and and ending that and having zero screen time, the change in our son was incredible.

Speaker 1

就我个人而言,这是我人生中最有参考价值的亲身观察。

And I personally don't know of any better reference point in my life than to have observed that firsthand.

Speaker 1

所以我无法想象一个幼儿手里拿着电子设备会怎样,这不是对选择这么做的人的评判,只是我无法想象这会造成多大伤害。

So I can only imagine what a young child given a device in their hand, that's not a judgment for anyone who chooses to do that, but I just can't imagine the damage that that will do.

Speaker 1

因此我强烈认为,我们集体和个人——尤其是家庭内部——有责任找到更好的方式。

So I feel very passionate about our collective and individual, most of all, responsibility within our families to find better ways.

Speaker 0

所以我们现在看到很多关于边缘化青年在网络社区被激进的讨论。

So, I mean, we're seeing like, right now, we're seeing a lot of conversation about disenfranchised youth getting radicalized on Internet communities.

Speaker 0

这已成为非常敏感的话题。

It's become a very sensitive conversation.

Speaker 0

某些所谓的解决方案包括限制言论、加强KYC身份验证等措施。

Some of the, quote, unquote, solutions that have been proposed involve restricting speech, restricting access, more KYC.

Speaker 0

添加数字身份认证,添加年龄限制。

Adding digital ID, adding age restrictions.

Speaker 0

我是说,我们刚看到Bluesky的例子,好像有两个州刚给他们的应用加上了年龄限制。

I mean, we just saw Bluesky; I think two states just added age restrictions to their app.

Speaker 0

德里克,你认为最有效的推进路径是什么?

Derek, how do you what is what is the most productive path forward?

Speaker 0

因为我觉得关键在于这确实是个问题。

Because I I think the key here is that that is actually a problem.

Speaker 0

比如,我确实认为边缘化青年正在小众网络社区中被激进化。

Like, I I do think disenfranchised youth are getting radicalized on niche Internet communities.

Speaker 0

但当你构建像Nostr这样的开放协议时,本质上无法自上而下地进行年龄限制,这种情况下最有效的解决路径是什么?

But when you're building out something like Nostr, an open protocol where you inherently can't age-restrict on a top-down level, what is the most productive path?

Speaker 0

我们该如何以健康的方式真正解决这个问题?

How how do we actually solve that in a healthy way?

Speaker 2

这是个非常好的问题,可能也是个非常难的问题。

That's a very good question, and it's probably a very hard question.

Speaker 2

我认为部分原因可以追溯到肖恩暗示的观点——归根结底,父母应该履行监护责任。

I think I'll say part of it goes back to what Shawn was alluding to, which is that, you know, ultimately, parents should parent.

Speaker 2

如果孩子因接触某些内容而在网上被激进化,你不想让自己孩子遭遇这种情况,就需要限制他们使用某些应用。

If kids are having issues online getting radicalized over certain content, you don't want that to happen to your kid, then you need to restrict access to certain applications.

Speaker 2

这并不意味着完全禁止,因为我们知道现在的孩子社交生活高度依赖网络,所以还是可以给他们使用部分应用。

Now that doesn't mean completely take away because we know that kids today are very social and online, so you can still give them apps.

Speaker 2

因此第二点是:我们需要更多用户控制功能,也需要Nostr生态中有更多专注于限制和过滤这类内容的应用。

So the second part of this is we just need more user controls, and we need more apps across the Nostr ecosystem that maybe do focus on restricting and filtering that type of content.

Speaker 2

所以你可能会有这样的选择,因为Nostr是完全开放的,你可以随心所欲,也许有人会开发一个更适合青少年的Nostr应用。

So maybe, because Nostr is wide open and you can do anything you want, maybe somebody builds a Nostr application that is more suitable for the youth.

Speaker 2

或许可以限制某些类型的内容。

Maybe restrict certain type of content.

Speaker 2

它仅绑定到特定经过内容过滤的中继服务器,你只能使用这些指定的服务。

It's only bound to certain content-filtered relays, and you can't use anything else but that.
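Derek's idea of a client hard-bound to vetted relays can be sketched in a few lines. Everything here is hypothetical: the relay URLs and function names are illustrative, not from any real Nostr client.

```python
# Hypothetical sketch: a kid-safe client wrapper that refuses to
# connect to any relay outside a parent-approved allowlist.
APPROVED_RELAYS = {
    "wss://family-relay.example.com",  # illustrative filtered relay
    "wss://kids.example.org",
}

def connect(relay_url: str) -> str:
    # Stand-in for a real websocket dial; only approved relays pass.
    if relay_url not in APPROVED_RELAYS:
        raise PermissionError(f"relay not parent-approved: {relay_url}")
    return f"connected to {relay_url}"

print(connect("wss://kids.example.org"))
```

The design point is that the restriction lives in the client the parent installs, not in the protocol itself, which stays open.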

Speaker 2

现在的问题是,孩子可以拿着个人资料和NSEC,直接换用另一个应用。

Now the argument is, well, the kid can take the profile, the NSEC, and just use another app.

Speaker 2

但如果你是家长,你履行监护职责,可以限制孩子访问某些应用程序。

But if you're the parent, you do the parenting and you lock down access to certain applications.

Speaker 2

你只允许他们使用家长审核通过的应用。

You only give them access to the parent approved app.

Speaker 2

毕竟他们是你的孩子。

I mean, they're your kids.

Speaker 2

你应该有权决定他们使用什么应用。个人经历是:我很久都不允许我的孩子用TikTok,现在他们14岁和16岁了。

You should be able to say what apps they use. A personal example: I didn't let my kids use TikTok for a very long time, and my kids are now 14 and 16 years old.

Speaker 2

他们现在用TikTok了,但几年前当朋友们都在用时(大概10岁、12岁)他们就想用,我说不行。

They now use TikTok but they wanted to use it years ago when their friends were all using it, you know, 10, 12 years old and and I said, no.

Speaker 2

你们不能用那个应用。

You're not using that app.

Speaker 2

我很抱歉,他们抱怨了很多次,但我作为家长坚持说:很抱歉。

They complained a lot, and I was the parent and said, well, I'm sorry.

Speaker 2

你们就是不能用。我行使了家长权利,限制孩子接触我不希望他们使用的东西。

You're not using it and I used my parental rights to restrict my kids' access to something I didn't want them on.

Speaker 2

现在,他们更年长了。

Now, they're older.

Speaker 2

当然。

Sure.

Speaker 2

我让他们这么做,任何Nostr应用也都适用。

I let them do it, and the same would go for any Nostr app.

Speaker 2

如果我想的话,我会限制并屏蔽相应的访问权限,因为我们有实现这一点的工具。

I would restrict and block access if I wanted to, because we have the tools to do that.

Speaker 2

但另一方面,正如我所说,我们确实需要有Nostr客户端站出来,构建一个经过内容过滤、适合儿童的安全环境。

But then, as I said, on the other side, we do need a Nostr client to step up and build a kid-filtered, kid-safe environment.

Speaker 1

我认为最有力的地方在于——这也是我大力推广Nostr或未来可能出现的类似产品的原因——它能让个体,特别是这个案例中的父母,获得自主选择的工具。

Well, and I think, just quickly, the thing that's so powerful about this, in my strong promotion of Nostr or whatever may come after, is the ability for individuals, for parents in this particular case, to be given the tools to make the choice.

Speaker 1

是的。

Yeah.

Speaker 1

我认为这才是核心。

I think that's the core.

Speaker 1

这不应该由X来决定。

It should not come from X.

Speaker 1

这不应该由政府来决定。

It should not come from the government.

Speaker 1

应该由那些最贴近、最关心这个小生命健康的人来决定。

It should come from the individuals closest to and most invested in that little human's health.

Speaker 1

我认为Nostr是开放协议如何赋予我们这种权力的典范。

And I think Nostr is a prime example of what an open protocol does with regard to giving us that power.

Speaker 0

是啊。

Yeah.

Speaker 0

我认为你们为父母提供了工具,让他们能更好地履行父母职责。

I think you you give you give parents tools so that they can parent better.

Speaker 2

是的。

Yes.

Speaker 2

完全正确。

Absolutely.

Speaker 0

并且让他们承担责任。

And and have them take responsibility.

Speaker 0

而且这比Nostr本身更宏大。

And it's bigger than Nostr.

Speaker 0

对吧?

Right?

Speaker 0

因为,就像你说的,完全正确。

Because, like Absolutely.

Speaker 0

我的意思是,苹果没有在iPhone或其他设备中内置让父母能精细控制孩子使用方式的功能,这有点让人费解。

I mean, it's kind of bewildering that Apple doesn't have built into the iPhone or whatever, like, really granular controls for parents to choose how their kids are interacting with these things.

Speaker 0

我觉得应该把它下沉到几乎操作系统的层面。

I think you bring it down almost to the OS.

Speaker 2

对吧?

Right?

Speaker 2

是啊。

Yeah.

Speaker 2

因为我是个技术宅,我知道怎么进入路由器,屏蔽孩子们设备对某些网站的访问。

Like, because I'm a tech nerd, I know how to go in on my router and block access on my kids' devices to certain websites.
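The router trick Derek describes boils down to a domain blocklist: refuse lookups for a hostname or any of its parent domains. A toy sketch of the idea (the blocked entries are just examples):

```python
# Toy router-style domain blocking: a hostname is refused when it,
# or any parent domain of it, appears on the blocklist.
BLOCKED = {"tiktok.com", "socialapp.example"}  # example entries

def is_blocked(hostname: str) -> bool:
    parts = hostname.lower().rstrip(".").split(".")
    # Check the full name and every parent domain against the list.
    return any(".".join(parts[i:]) in BLOCKED for i in range(len(parts)))

print(is_blocked("www.tiktok.com"))   # True
print(is_blocked("news.example.org"))  # False
```

Real routers apply the same matching at the DNS layer, which is exactly why Derek notes it's easy for a tech nerd but not yet easy for everybody.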

Speaker 2

我会说这很简单,但对每个人都容易吗?

I'll say it's easy, but is it easy for everybody?

Speaker 2

可能不是。

Probably not.

Speaker 2

所以我们需要更简单的工具让每个人都能使用。

So we need easier tools for everybody to use.

Speaker 0

我同意。

I agree.

Speaker 0

我是说,这次讨论非常棒。

I mean, this has been a great conversation.

Speaker 0

我们刚才讨论得有点抽象了。

We've been a little bit more abstract.

Speaker 0

现在让我们回归主题,为在座从未使用过Nostr但可能想亲手尝试这项技术的朋友们提供些实用建议。

Just to bring it all back together and make it a little bit more actionable for people here that have never used Nostr and maybe wanna play around with the tech.

Speaker 0

我认为最好的学习方式就是亲自动手实践这些工具。

I I think, you know, the best way to learn is to, you know, just get your hands dirty and actually use the tools.

Speaker 0

肖恩,对于那些想了解当前开发进展的人,你会给出什么建议?

I mean, Sean, what would be your recommendation to someone who's interested in seeing what's being built out there?

Speaker 1

好的。

Yeah.

Speaker 1

请允许我再抽象地补充一句:我认为Nostr及其底层技术的强大之处在于——借用别人的比喻——就像中世纪的国王要向全国军队发布命令时,你会记得他们都有个印章戒指。

I'll take just a brief moment of further abstraction and just say: I think what's so powerful about Nostr and some of the technology that underlies it (I'll steal someone else's analogy) is, if you were a medieval king and you needed to issue a directive throughout the kingdom, to your military, to someone else, as you would probably recall, you would have a signet ring.

Speaker 1

那枚印章戒指需加热后压入蜡中。

That signet ring would be heated, pressed into wax.

Speaker 1

它形成了一道封印。

It creates a seal.

Speaker 1

随后这封信件被递送给马特将军。

That letter is then delivered to Matt, the general.

Speaker 1

而我的印章戒指就是我的私钥。

And my signet ring is my private key.

Speaker 1

它难以模仿、难以伪造,想必也很难被盗取。

It is difficult to mimic, difficult to forge, presumably hard to steal.

Speaker 1

这是让我得以签名的专属财产。

That's my piece of of property that allows me to sign.

Speaker 1

封印即是公钥。

The seal is the public key.

Speaker 1

总而言之,通过这些古往今来不断被创造和重现的方式,Nostr赋予了你那种所有权。

And so that is all to say: in these ways that have been created and recreated throughout time, Nostr gives you that ownership.

Speaker 1

随之而来的是重大责任。

Now with that comes great responsibility.

Speaker 1

你拥有那把钥匙。

You own that key.

Speaker 1

你持有那枚印章戒指。

You have that signet ring.

Speaker 1

基于这种你能拥有身份、能拥有标注创作或内容发布权的认知,事情可以变得非常简单。

And so from that understanding, that you can own your identity, you can own the ability to attribute your creation or publishing of content, it can be quite simple.

Speaker 1

我认为Primal非常出色。

So I think Primal's brilliant.

Speaker 1

先做个免责声明:Ten31投资了Primal。

Full disclaimer: Ten31, investor in Primal.

Speaker 1

绝佳的应用。

Fantastic application.

Speaker 1

所以primal.net,我认为这是个很好的入门方式。

So primal.net, I think it's a great way to get started.

Speaker 1

我认为这是最优秀的消费者用户体验之一。

I think it's one of the best consumer UX's.

Speaker 1

根据你在使用偏好光谱上的位置——从'只想要苹果式开箱即用'到像我们这样的极客想要深入钻研——还有很多其他选择。

There are many others, depending on where you are on the spectrum, from "I just want it to work," Apple-esque style, to, you know, like us, nerds who wanna dig in.

Speaker 1

但简而言之,我会说:primal.net,值得一看。

But I would say in short, primal.net, take a look.

Speaker 0

很棒的推荐。

Great recommendation.

Speaker 0

我觉得他处理得非常好。

I think he handled that really well.

Speaker 0

是的。

Yeah.

Speaker 0

趁我们还有点时间,简单快速聊聊:氛围编程、Nostr、AI、比特币,这些是你当前的重点领域。

So while we have a little bit more time, just real quick, vibe coding, Nostr, AI, Bitcoin, that's where your focus is right now.

Speaker 0

没错。

Yes.

Speaker 0

为什么这很强大?

Why is that powerful?

Speaker 2

因为Soapbox正在开发工具,让创作者或拥有自己社区的人能够构建应用程序。

Because Soapbox is building tools that allow people who are creators or have their own community to build an application.

Speaker 2

你可以通过氛围编程(vibe coding)来实现它。

You can vibe code it.

Speaker 2

你可以为自己的社区构建专属应用。

You can build your own app for your own community.

Speaker 2

由于它基于Nostr构建,你可以完全拥有所有内容。

And because it's built on Nostr, you can own all of that content.

Speaker 2

所以与其用Discord或Twitter之类的平台服务社区,你可以用Shakespeare构建完全按你理想方式定制的社区应用,并且完全拥有它。

So instead of using Discord or Twitter or whatever for your community, you could use Shakespeare to build your own community app customized how you've always wanted it to be and you own it.

Speaker 2

你拥有全部源代码。

You own all the source code.

Speaker 2

你拥有所有数据。

You own all the data.

Speaker 2

它是去中心化的。

It's decentralized.

Speaker 2

你可以随心所欲地使用它,没人能夺走。

You can do whatever you want with it and nobody can take that away from you.

Speaker 2

但如果你是个主播、音乐人、艺术家之类的人,Discord服务器被封了,那你就完了。

Whereas, if your Discord server gets taken down because you're a streamer or a musician or an artist or something, well, you're screwed.

Speaker 2

你什么都做不了。

You can't do anything.

Speaker 2

但如果你使用Soapbox的工具并用Shakespeare构建,你就能拥有拼图的每一块。

But if you use Soapbox tools and you build with Shakespeare, you can own every piece of the puzzle.

Speaker 0

对。

Yeah.

Speaker 0

关键在于你不需要封闭的API接口。

And the key there is you don't need closed API access.

Speaker 0

你不需要

You don't need to

Speaker 2

是啊。

Yeah.

Speaker 0

验证你

Verify You

Speaker 2

不需要请求许可。

don't need to ask permission.

Speaker 0

直接做就行。

You just do it.

Speaker 0

没错。

Yeah.

Speaker 0

你已经拥有社交图谱。

You have the you have the social graph.

Speaker 0

你拥有身份层。

You have the the identity layer.

Speaker 0

Nostr里已包含通讯协议,本质上就像面向世界的开放API。

You have the comms protocol all in Nostr, which is basically like an open API for the world for that.

Speaker 0

是的。

Yeah.

Speaker 0

在支付方面,你可以使用比特币,这样就不必去集成Stripe API之类的支付接口了。

And then on the payment side, you have Bitcoin, so that you don't have to, you know, get a Stripe API or something like that to integrate

Speaker 1

无需任何许可。

No permission required.

Speaker 1

直接去做就行。

Just go do it.

Speaker 2

没错。

Yeah.

Speaker 2

你想为正在销售的产品或个人网站等搭建一个接受比特币支付的网站吗?

You wanna build a website that accepts Bitcoin payments for your product that you're selling or for your personal website or something?

Speaker 2

你不需要懂任何代码。

You don't need to know any code.

Speaker 2

你不需要成为开发者才能做到。

You don't need to be a developer on how to do it.

Speaker 2

你只需要和AI对话,告诉它:给我建这个网站。

You just have a conversation with AI, and you and you say, build me this website.

Speaker 2

让它实现a b c d这些功能,几分钟后,砰——

It does this thing, a b c d, and a few minutes later, boom.

Speaker 2

就搞定了。

It's done.

Speaker 2

它就是你的了,你可以随意使用。

And it's yours, you can do whatever you want with it.

Speaker 0

太棒了。

Love it.

Speaker 0

让我们为Derek和Sean热烈鼓掌好吗?

Can we have a huge round of applause for Derek and Sean?

Speaker 0

谢谢你们。

Thank you, guys.

Speaker 0

谢谢。

Thank you.

Speaker 0

多谢。

Thanks.
