本集简介
双语字幕
仅展示文本字幕,不包含中文音频;想边听边看,请使用 Bayt 播客 App。
欢迎各位收看The Information的TITV。
Welcome everyone to The Information's TITV.
我叫阿卡什·帕斯里查。
My name is Akash Pasricha.
今天是1月6日,星期二。
It is Tuesday, January 6.
今天在节目中,我们将深入解析CES。
Today on the show, we are unpacking CES.
我们将详解英伟达的重大发布。
We will break down the big announcements from NVIDIA.
我们邀请了Kindred Ventures做客节目,我们的AI与机器人记者将帮助我们理解英伟达的新款自动驾驶汽车模型。
We've got Kindred Ventures coming on the show and our AI and robotics reporter is going to help us understand NVIDIA's new self driving car model.
随后,我们将转向独家报道,即今日早晨The Information发布的新闻。
We'll then turn to some exclusive reporting The Information published this morning.
AI评估初创公司LM Arena以17亿美元的估值融资1.5亿美元。
AI evaluation startup LM Arena has raised $150 million at a $1.7 billion valuation.
我们会获得更多细节。
We will get more details on that.
接下来,我们将转向我们亚马逊记者报道的一则新闻,关于AWS的AI野心及其自研模型与竞争对手的对比,最后我们将深入探讨能源和工业领域各个细分板块如何调整战略以应对AI热潮。
Next up, we'll turn to a story from our Amazon reporter about AWS's AI ambitions and how its homegrown models stack up against its competitors, and we will wrap with the closest look that we have taken yet into the different pockets of the energy and industrials sector that are changing their strategy to adjust to the AI boom.
这将是一场精彩的节目,让我们马上进入正题。
It's gonna be a fun show, and so let's get right on into things.
昨天在CES上,所有人的目光都聚焦在英伟达身上,黄仁勋登台发表了主题演讲。
At CES yesterday, it was all eyes on NVIDIA as Jensen Huang took the stage for his keynote.
他在这次活动中公布的两大重要消息是英伟达的自动驾驶汽车模型,以及关于其下一代大型AI芯片Rubin的更多细节。
Two of his big announcements from the event were NVIDIA's model for self driving cars and more details about Rubin, the company's next big AI chip it is working on.
这款芯片的使用成本预计将低于以往型号,在某些情况下,企业甚至需要更少的芯片。
The chip is supposed to be cheaper to use compared to previous models, and in some cases, companies should have to use less of them.
我想邀请Kindred Ventures的创始人兼管理合伙人Steve Jang来帮我们深入解析这一切。
I want to bring on Steve Jang, founder and managing partner at Kindred Ventures, to help us break it all down.
Steve,欢迎再次做客我们的节目。
Steve, welcome back to the show.
很高兴你能来。
It's great to have you here.
谢谢邀请我。
Thanks for having me.
很高兴见到你。
It's good to see you.
你亲临CES现场。
So you are on the ground at CES.
你昨天参加主题演讲了吗?
Were you at the keynote yesterday?
是的。
Yeah.
你知道,每年的CES都是我们的硬件公司和计算公司齐聚一堂的绝佳时刻。
You know, CES every year is a great moment for all of our hardware companies and compute companies to come together.
由于AI的兴起,CES本身焕发了新的活力,这真令人欣喜。
So CES itself has a new life because of AI, which is great to see.
不错。
Nice.
好吧,我想和你聊聊主题演讲。
Well, I want to talk to you about the keynote.
你看,黄仁勋昨天演讲中的两大重磅发布,其一是自动驾驶汽车模型,我知道你对这个领域很熟悉,我们稍后再谈这个。
So, look, the two big announcements coming out of Jensen Huang's speech yesterday were the self driving car model, I know you're close to that space, so maybe we'll get there in a second.
但我先想聊聊他们更详细披露的新一代芯片——Rubin系列。
But I want to start with the new generation of chips that they gave us more details on, the Rubin family.
关于黄仁勋这次关于芯片的演讲,你印象最深的是什么?
What stood out to you about Jensen's speech about the chips?
而且,我觉得我们昨天其实已经知道不少这方面的信息了,不是吗?
And, you know, I kind of thought that we knew a lot of this stuff yesterday, didn't we?
是,也不是。
Yes and no.
我觉得大家都知道Vera Rubin超级芯片和Rubin显卡即将发布,但一些细节可能还没被广泛理解。
I think everyone knew that the Vera Rubin Superchip and the Rubin GPU were coming, but I think some of the details weren't widely understood yet.
我觉得进来之前有几个担忧。
I think there were a couple of concerns coming in.
一个是,你知道,这会不会是用来替代 B200 和 GB200 的直接换代芯片?
One was, you know, is this gonna be a drop-in chip to replace the B200s and GB200s?
但看起来并不是。
And it looks like it's not.
对吧?
Right?
所以你昨天看到的是Vera Rubin超级芯片,一个CPU和两个GPU组成一个节点,然后是Rubin GPU。
So what you saw yesterday was the Vera Rubin Superchip, a CPU and two GPUs in a node, and then Rubin GPUs.
因此,这些是作为节点和机架销售和交付的,你不能只拿一个 GPU 来替换你现有的芯片。
So these are sold and delivered as nodes and racks, you can't just take a single GPU and replace your existing chips.
在数据中心里,你实际上需要改变你的机架系统运作方式。
In the data center, you actually have to change how your rack system works.
所以在这个新的 Rubin 系列中,机架本身就是一个系统。
So the rack is a system in this new Rubin series.
但你得到的是10倍的推理效率,以及3到4倍的训练效率,不过这需要云服务商和新云服务商做一些适配工作。
What you get, though, is 10x inference efficiency and 3 to 4x training efficiency, but it's going to require some work on the part of the clouds and neo clouds.
所以这是付费才能参与,对吧?
So it's a pay to play, right?
他们必须在还在想办法交付和实现Blackwell芯片盈利的同时,为新系列支付高昂费用。
They have to pay up for the new series while they're still figuring out how to get the Blackwells delivered and monetized.
因此,这正是当前超大规模云服务商、新云服务商、推理引擎和前沿实验室之间正在进行的混战的延续。
And so this is, you know, a continuation of the battle royale that the hyperscalers, the neo clouds, the inference engines, frontier labs are playing right now.
他们有合作关系、预先承诺和订单,但仍在摸索如何充分利用上一代芯片。
They have alliances, pre-commitments, bookings, and they're still figuring out how to utilize the last series of chips.
所以我们对2026年的预测之一是,这场混战将比去年更加激烈。
So one of our predictions for 2026 is that this battle royale is gonna be even more intense than last year.
你会看到更多的订单和更多的联盟合作。
You'll see many more bookings, many more alliances.
他们必须争夺客户,同时还要实现并利用那些仍存放在数据中心、并持续运送到那里的B200芯片。
And they're going to have to fight for customers and still monetize and utilize all those B200s sitting in data centers and still being shipped to them.
对。
Right.
但总的来说,这对推理引擎来说是好事。
So the net net, though, is that it's great for inference engines.
这对前沿实验室来说是好事。
It's great for frontier labs.
这对新云实验室来说是好事。
It's great for neo labs.
这对新云和超大规模云服务商来说将非常昂贵。
It's going to be expensive for neo clouds and hyperscalers.
但长远来看,它们将能够拥有比世界上任何其他地方都更先进的芯片。
But, you know, long term, they're going to be able to have the very best chips compared to anywhere else around the world.
让我问你一个问题。
Well, let me ask you this.
你所说的必须更换整个机架,而不是简单替换芯片,这个观点。
This idea that you have to sort of replace the entire rack and it's not a drop in chip.
这在你看来是英伟达的一种战略举措,旨在确保客户从他们那里购买整个系统吗?
Does that come across to you as a strategic move from NVIDIA in a way to sort of ensure that customers are buying more of the whole system from them?
还是说,这是一种创新,意味着如果不购买芯片周围的其他所有东西,你就无法获得它想要提供的所有好处?
Or is this an innovation in the sense that you couldn't actually get the benefits that it wants you to get without buying all the other stuff around the chip?
我的理解是,我们也刚刚初步查看了相关数据,这不是一个商业决策,而是一个技术系统决策,即Vera Rubin节点和机架就是这样工作的。
My understanding, and we just took a quick look at the data on this, is that it's not a business decision, it's a technical systems decision: this is how the Vera Rubin nodes and racks will work.
我不知道你有没有看过它的图片。
I don't know if you saw the picture of it.
它是一个金色的巨石。
It's a golden monolith.
是的。
Yeah.
很漂亮。
Beautiful.
我当时评论说,嘿,这看起来很令人印象深刻,但也看起来非常昂贵。
My comment at the time was, hey, it looks impressive, but it also looks very expensive.
我的意思是,是啊。
I mean Yeah.
所以他们才把它涂成金色。
That's why they colored it in gold.
对吧?
Right?
所以我认为这是一个机架系统解决方案。
And so I think this is a system-as-a-rack solution.
因此,我不觉得这主要是定价或商业决策,而更多是出于技术原因。
And so I don't know that it's a pricing or a business decision so much as it is
这是他们想要推广的新机架系统和节点系统。
This is the new rack system and node system that they want to push out there.
这是一个技术要求。
And that's a technical requirement.
所以,我们会在接下来的几天里了解到更多信息。
And so we'll, you know, we'll find out more in the coming days.
但我认为这非常有趣,因为这是一个替换周期。
But I think this is really interesting because it's a replacement cycle.
就像消费者会用新款iPhone替换他们现有的iPhone一样。
It's the same way that consumers replace their current iPhone with the next iPhone.
对。
Right.
他们旧的iPhone、现在的iPhone仍然能用,但他们想要下一代。
Their old iPhone, their current iPhone still works, but they want the next one.
我认为,整个计算行业本质上正处于英伟达的替换周期中。
I think basically the entire compute industry is essentially on the NVIDIA replacement cycle.
是的,你目前还不需要,但很快就会需要,所以你应该现在就订购。
Yeah, you don't need it yet, but you'll need it soon, and so you should get it now and book it now.
否则你可能会排队等候,从而在与竞争对手的赛跑中落后。
Otherwise you might be waiting in line and then slip in your competitive race with others.
我认为
I think
我觉得这挺有意思的,我当时在YouTube上看直播,下面置顶的评论都在说,这可是消费电子展啊。
I will say it's funny to me, I was watching the livestream on YouTube, and the top comments underneath were, this is the Consumer Electronics Show.
这简直是最不像消费类产品的东西了,我知道你说过所有AI都依赖于芯片,最终一切都会回归到这一点,但有些人说,天啊,这些活动发布的内容越来越技术化了。
This is like, you know, one of the least consumer things, you know, and look, I know you say all of AI rests on the chips, right, and at the end of the day it all comes back to it, but some people were saying, gosh, you know, it's getting more and more technical in terms of what's being announced at these events.
我的意思是,CES一直就是消费类设备和大量基础设施的混合体。
I mean, CES has always been a mix of some consumer devices and a lot of infrastructure.
服务器公司、芯片公司、元器件公司一直都在这里参展。
The server companies, the chip companies, the component companies have always been out here.
这一直都是从很久以前就延续下来的交易和订单对接展会。
It's always been sort of the trading, order-filling convention from way back.
但现在规模扩大了一千倍,因为如果你想想所有的主题演讲,真正涉及消费端的其实很少。
But it's, you know, it's a thousand x now because if you think about all the keynotes, like very few of them are actually touching the consumer.
对吧?
Right?
你看了AMD的主题演讲,还会看到西门子的主题演讲,还有许多来自距离消费者两到三步之遥的公司演讲,但这些都极其重要,对吧?
You saw the AMD keynote, you'll see a Siemens keynote, you'll see so many keynotes from companies that are two, three steps away from the consumer, but it's super important, right?
他们推出的这套Vera Rubin系列,是为未来面向消费者代理和企业代理的场景打造的。
You know, this whole Vera Rubin series that they announced is built for a future around consumer agents and enterprise agents.
这就是他们开发它的原因。
That's why they built this.
预测是,在未来一两年内,推理工作负载将超过训练工作负载,这将是历史上首次发生这种情况。
The prediction is that over the next year or two, inference workloads will exceed training workloads, which will be the first time that's ever happened.
因此,这款产品是为代理型(agentic)工作负载设计的。
And so this is made for agentic workloads.
在推理层面,你发起的请求数量需要达到一种每瓦特每秒令牌数的量级,这是整个行业前所未见的。
The number of requests you're making at the inference level requires a tokens-per-second-per-watt calculation that we just haven't seen in the industry.
所以像Perplexity和Claude Code这样的公司,你可以看到他们在Opus模型、Codex等之上用执行框架(harness)所做的事情。
So with companies like Perplexity and Claude Code, you see what they're doing with their harness on top of the Opus models, Codex, things like that.
这些代理平台将消耗大量的推理资源,因此你需要一种不同的芯片和不同的系统。
These agent platforms are going to use up so much inference that you need a different chipset and a different system.
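The tokens-per-second-per-watt metric mentioned a moment ago is simply inference throughput divided by power draw. A back-of-the-envelope sketch, with entirely hypothetical rack numbers chosen only to mirror the claimed 10x inference gain (these are not NVIDIA specs):

```python
def tokens_per_second_per_watt(tokens_per_second: float, watts: float) -> float:
    """Inference efficiency: token throughput delivered per unit of power."""
    return tokens_per_second / watts

# Hypothetical figures for illustration only:
# same power envelope, 10x the token throughput.
old_rack = tokens_per_second_per_watt(tokens_per_second=50_000, watts=100_000)
new_rack = tokens_per_second_per_watt(tokens_per_second=500_000, watts=100_000)

print(f"old: {old_rack} tok/s/W, new: {new_rack} tok/s/W")  # old: 0.5, new: 5.0
```

At agentic-workload request volumes, this ratio, rather than raw throughput, is what determines a data center's operating cost per token.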
所以
So
让我问你一下,消费级工作负载显然是昨天的重点,但很多内容也涉及物理AI,对吧?
Let me ask you about consumer workloads was definitely a focus yesterday, but a lot of it too was physical AI, right?
我的意思是,制造业和机器人领域,昨天我们也讲了一个很好的故事,关于NVIDIA如何现在通过其Omniverse套件大力进军这一客户群体。
I mean, manufacturing, robotics, and we had a good story about that yesterday too, how NVIDIA is really gunning for this category of customers now with its Omniverse suite.
你如何看待对物理AI的重视?
What do you make of the focus on physical AI?
因为昨天的焦点并不仅限于NVIDIA。
Because it wasn't just NVIDIA yesterday.
我认为其他芯片公司也发布了更多针对这些工业应用的产品。
I think other chip companies as well announced more products targeted towards these industrial applications.
为什么现在这成为如此重要的焦点?
Why is this such a big focus now?
这是他们必须寻找增长的地方吗?
Is this where they have to look for growth?
不,我不认为他们的叙事是必须寻找增长,因此才强行推动这个主题。
No, I don't think the narrative is that they have to look for growth and so they're sort of forcing this theme.
我认为这是一个正在酝酿的主题。
I think that this is a theme that's been brewing.
我的意思是,想想芯片制造商过去的做法,英特尔很久以前就做过,但英伟达现在所做的,是为这个领域的对话设定方向,并提供推动这一进程所需的工具。
I mean, you think about what a chipmaker, and Intel used to do this a long time ago, but what NVIDIA is doing is sort of setting the mandate for the conversation and providing the tools for that conversation and that building to happen in a sector.
所以这本质上就像是在扶持王者,对吧?
So essentially it's sort of like king making, right?
我们经常谈论……
You know, we often talk about
他们正在定调。
They're setting the tone.
风险投资基金会扶持行业领导者吗?
Yeah, do venture funds king-make companies?
我告诉你,英伟达正在扶持整个行业,对吧?
Well, I'll tell you, NVIDIA is king-making sectors, right?
所以你看到的是,他们之前在大语言模型领域就这么做过,对吧?
And so what you're seeing is that they're, you know, they did this with LLMs, right?
他们专注于大语言模型,并对OpenAI以及x AI和埃隆·马斯克迅速搭建数据中心的能力大加赞赏——而这些数据中心内部用的正是他们的芯片。
They focused in on LLMs, and, you know, had great things to say about not only OpenAI, but also xAI and Elon Musk's ability to stand up a data center very quickly with, lo and behold, their chips inside.
然后他们又扶持了机器人领域,对吧?
And then they king-made the robotics sector, right?
你从Groot以及他们在此领域的所有工作就能看出来。
And you saw that with Groot and everything that they were doing there.
现在当他们提到物理AI时,已经包含了机器人相关的概念。
And now when they say physical AI, it includes robotics concepts.
还包括VLA,对吧?
It also includes VLAs, right?
以及VLM。
And VLMs.
这些并不是孤立的技术,而是行业里不断演进的衍生技术。
So these are derivative technologies that are improving over time in the industry.
如果你在过去一年参加过任何机器学习会议——比如ICML、ICLR、NeurIPS——就会发现物理AI是一个热门话题。
If you went to any of the ML conferences over the last year, ICML, ICLR, NeurIPS, you saw that physical AI was a huge thing.
所以,世界模型、从世界模型到机器人模型再到自主性,都属于物理AI的一部分。
So world models, everything from world models to robotics models to autonomy is part of physical AI.
这正是现在人们都想追随的叙事。
It's very much the narrative that people want to attach themselves to right now.
是的,如果你看了他们昨天发布的消息,他们推出了开放数据集和开源模型,对吧?
Yeah, and if you saw what they announced yesterday, they have open data sets, they have open source models, right?
他们展示了那个演示视频,我们对此非常熟悉,因为我们投资组合中的一家公司Nuro是英伟达的合作伙伴,获得了英伟达和优步的投资;他们与Lucid和优步联合推出了搭载英伟达AGX Thor芯片的Nuro汽车,对吧?
They showed that demo video, and we know this well because one of our portfolio companies, Nuro, is an NVIDIA partner, is invested in by NVIDIA and Uber, and they launched their Nuro car with Lucid and Uber, and it's AGX Thor inside, NVIDIA chips, right?
英伟达昨天发布的内容非常有趣,即其他汽车制造商和技术公司也可以使用这些模型和他们的芯片,对吧?
And what Nvidia basically shipped yesterday, very interesting, is that other auto OEMs and other technology companies can use these models and their chips, right?
他们的芯片始终内置其中。
Their chips are always inside.
没错。
Right.
这加速了整个生态系统的发展,而使用开源技术,我认为一直有人质疑:从商业化角度来看,开源的意义到底是什么?
And this accelerates the ecosystem. And with open source, I think there's been an argument: what is the purpose of open source from a monetization perspective?
在这种情况下,它销售的是芯片。
Well, in this case, it sells chips.
我认为更有趣的是第二部分,他们实际上提供了一套硬件模型和软件工具包,让你可以打造自己的特斯拉FSD或自己的Waymo。
And then the second part of that, of what I thought was super interesting is that they basically offered a hardware model and software kit to create your own Tesla FSD or your own Waymo.
所以,当他们
So, you know, when they're
这就像是任何人都可以参与,它极大地让这项技术的开发变得大众化。
It's like anyone can, you know, it's very much democratizing who can develop on this technology.
我觉得这个观点其实很好。
I think that's a good point, actually.
我之前没想过这一点。
I hadn't thought about that.
简单问一下,在昨天所有这些公告发布后,股价其实并没有太大变动。
Very quickly, after all these announcements yesterday, the stock didn't really move.
你看,英伟达的股价可以说是很难预测的。
Look, NVIDIA stock is kind of, you know, a tough thing to predict.
但你对这部分怎么看?
But what did you make of of that part of this?
嗯。
Yeah.
而且,从公开股票的角度来看,我认为很多Rubin相关的公告,甚至物理AI,相关信息早已流传在外。
And, you know, from a public stock perspective, I think with a lot of the Rubin announcements and even the physical AI, there's been data out there.
对吧?
Right?
Nuro一直在用Thor与Uber合作。
Nuro's been using Thor with Uber.
Uber也被宣布为英伟达的合作伙伴。
Uber was announced as an NVIDIA partner as well.
所以,这些内容很可能已经被华尔街的定价所考虑了。
So a lot of this stuff is already probably factored in to Wall Street's pricing.
但这里另一个有趣的地方是,如果你看看今天的英伟达,他们正在推动一个对抗‘是否存在泡沫’这一叙事的议程。
But the other thing that's interesting here is that, you know, if you look at Nvidia today, they are pushing forward an agenda against the narrative that is, is there a bubble?
还有另一种相反的观点,即NVIDIA的统治地位是否因谷歌的TPU或推理芯片而受到威胁?
And there's another counter narrative, which is, is NVIDIA's throne at risk because of TPUs from Google or is it at risk because of inference chips?
我要告诉你们的是,维拉·鲁宾系列芯片,在推理方面实现了10倍的性能提升。
And what I'll tell you is the Vera Rubin family, that series of chips, is a 10x improvement in inference.
这就让人质疑,之前围绕Groq交易流传的种种说法究竟是怎么回事?
So it kind of calls into question, what was the narrative that was floating around the Groq acquisition?
但我仍然认为,我们有
And I still think that, you know, we have
这是否让你对Groq收购产生了一些疑问?
Does that make you question the Groq acquisition a little bit?
不,
No,
实际上并没有,因为对于Groq这笔交易,我们还得看看他们是否真的会在产品套件中使用Groq的芯片授权。
It doesn't, actually, because with the Groq acquisition, you know, we'll see if they actually use the Groq chip license in their product suite.
我认为,如果他们使用了,那也一定会经过大幅修改以适应NVIDIA的生态系统。
I think if they do, it's heavily modified to fit into Nvidia's suite.
但我认为,他们真正得到的是世界上顶尖的推理团队之一。
But what I think it was, was getting one of the world's best teams in inference.
你知道,当我们审视这一点时,发现的是乔纳森和他的团队所实现的SRAM创新。
You know, when we looked at that, it was the SRAM innovation that Jonathan and team had achieved.
世界上能做这件事的人寥寥无几。
There are just very few people on the planet that know how to do that.
我认为这并不在英伟达的核心能力范围内。
And I don't think that's in the wheelhouse of NVIDIA.
所以我认为这本质上是一次价值200亿美元的人才收购,对吧?
So I think that was essentially a $20 billion talent acquisition, right?
他们实际上就是这么做的:做了一切,只是不称之为收购。
Which is very much what they did, doing everything but calling it an acquisition.
他们并没有称之为人才收购,
I mean, they didn't call it an acquihire,
但抛开监管和联邦贸易委员会的问题不谈,这无疑是一次出色的人才收购,200亿美元对任何公司来说都是巨款,除了英伟达,对吧?
but regulatory and FTC issues aside, it's a great talent acquisition, and $20 billion is a lot for any company other than NVIDIA, right?
当我回顾Groq这笔交易时,你知道,他们并没有拿下GroqCloud,对吧?
And when I look back at the Groq acquisition, you know, they didn't take GroqCloud, right?
他们有DGX Cloud,所以并不需要那个。
They have DGX Cloud, so they don't need that.
我认为他们将开发一款新的推理芯片,专注于每秒处理的token数量,显然不是用于训练。
And I think that they're gonna build a new inference chip that's focused on tokens per second, and not focused on training, obviously.
如果你看看他们现在的产品策略:你可以付费参与,获得性能强得多的推理和训练芯片Rubin;然后我们可能在差不多同一时间,或者稍晚一点,再提供一款专注于这一目标和使用场景的推理加速器。
And if you look at what they're offering here, it's saying you can pay to play and get a much more performant inference and training chip in Rubin, and we're probably going to offer you, around the same time or maybe a little after, an inference accelerator that is singularly focused on that outcome and that use case.
我认为随着这一进程展开,价格可能会面临一些上行压力,但我认为人们仍在消化这些叙事中的几个关键点。
And I think as this unfolds, I think there may be a little bit more upward pressure on that price, but I think people are still grappling with a couple of these narratives.
所以我们只能拭目以待,看看未来会如何发展,但Rubin芯片也不会很快面世。
And so we'll see how that plays out in the future, but the Rubin chips aren't coming out for a while either.
而且,我想我们昨天都期待能获得更多细节,但没来得及讨论自动驾驶汽车的部分,我们将在下一个环节,由我们的AI与机器人记者来探讨这个话题。
Well, and I think we were all excited to get more details on it yesterday, and we didn't have time to get to the self driving car stuff, but we're going to talk about that in our next segment with our AI and robotics reporter.
史蒂夫,感谢你加入我们。
Steve, I want to thank you for joining us.
祝你在CES剩下的时间玩得开心,度过愉快的一周。
Have fun at the rest of CES and enjoy the rest of the week.
这位是Kindred Ventures的创始人兼管理合伙人Steve Jang,欢迎收看TITV。
That is Steve Jang, founder and managing partner at Kindred Ventures, here on TITV.
好的。
Okay.
继续我们的CES报道,我想邀请我们的AI与机器人记者Rocket Drew,帮助我们更深入地探讨英伟达昨天发布的自动驾驶汽车模型Alpamayo。
Sticking with our CES coverage, I want to bring on our AI and robotics reporter, Rocket Drew, to help us dig a bit deeper into Alpamayo, the self driving car model that NVIDIA unveiled yesterday.
他在今天发布的《AI议程》通讯中撰文介绍了这一点。
He wrote about that in our AI Agenda newsletter out today.
火箭,欢迎再次回到节目。
Rocket, welcome back to the show.
很高兴你来到这里。
It's great to have you here.
嗨,阿卡什。
Hey Akash.
很高兴回来。
It's great to be back.
我们刚刚谈到了NVIDIA公布或提供更多细节的芯片。
So we just talked about the chips that NVIDIA unveiled or gave us more details on.
我想请你给我们解释一下Alpamayo 1到底是什么。
I want you to explain to us what exactly Alpamayo 1 is.
没错。
That's right.
这个模型到底要做什么?
What is this model supposed to do?
是的。
Yeah.
这是一个用于自动驾驶汽车的模型。
It's a model for self driving cars.
它是在大量真实世界中人们驾驶汽车收集的数据基础上训练的,此外还使用了大量通过NVIDIA自己的计算机模拟技术生成的数据,特别是使用他们的主要AI世界模型Cosmos。
So it's trained on a bunch of data that people have collected by driving their cars in the real world, and then a lot of data besides that that's generated in computer simulations using NVIDIA's own simulation technology, in particular using Cosmos, which is their mainline AI world model.
这个模型旨在驾驶汽车。
And the model is intended to drive cars.
它可以根据道路状态和输入图像预测汽车应采取的行动。
It can predict what action a car should take based on a state of the road, based on an input image.
这个模型本身非常大。
The model itself is pretty big.
因此,如果有人想在实际中使用它,通常会用自己的数据进行一些定制。
So if anyone was going to use it in practice, you'd typically customize it a bit on your own data.
也许你会将其蒸馏成一个更小、更快的模型,以便在汽车内部运行。
Maybe you'd distill it into a model that's smaller and faster so it can run in the car itself.
但它旨在以多种方式加速自动驾驶技术的发展。
But it's intended to accelerate the development of self driving technology in a number of ways.
那么,这个模型与其他自动驾驶汽车模型相比如何?
And so how does this model compare against other self driving car models?
我的意思是,我们知道特斯拉正在开发他们自己的机器人出租车车队,我想。
I mean, we know that Tesla is working on their own robo taxi suite of cars, I guess.
我的意思是,他们大概有自己的模型。
I mean, presumably they have their own model.
我也不清楚。
I don't even know.
这个领域竞争激烈吗?
Is this a very competitive landscape?
你知道,很多技术都是保密的,非常专有,所以很难说特斯拉到底在做什么。
You know, I think a lot of it's locked down and very proprietary, so it's difficult to say what Tesla is doing.
我的意思是,他们明确表示只使用摄像头输入。
I mean, Tesla has taken a pretty firm position that they're interested in using camera inputs only.
所以他们希望汽车仅凭视觉就能在环境中行驶,而不是使用激光雷达等额外传感器来估算不同物体的距离。
So they want their car to be able to maneuver the world solely based on vision rather than using additional sensors like lidar to estimate how far away different objects are.
我认为最大的区别在于,这个模型是开源的。
And then the biggest difference, I think, is that this model is open source.
所以任何人都可以下载它,进行修改、实验和定制。
So anyone can download it, anyone can tinker with it and experiment and customize it.
这对模型来说是一个巨大的区别。
And that makes a big difference for the model.
它也不打算一推出就提供所谓的四级自动驾驶功能。
It's also not intended to provide, you know, so called level four autonomy right out of the box.
他们的本意并不是让你明天就把这个系统装进车里,然后自己待在家里,让车自己开。
The intention isn't that you can put this in your car tomorrow and you can stay home while your car drives around for you.
你知道,最理想的情况下,在他们开始推出这套系统、与梅赛德斯-奔驰合作时,它将以二级自动驾驶上路,这意味着……
You know, at best, as they're starting to roll this out, working with Mercedes-Benz, it's going to hit the road at level two autonomy, which means...
对,就是这个区别。
Right, the difference.
二级和四级之间的区别是……
The difference between level two and level four is...
它能调整你的转向、加速和减速,但你必须全程保持警惕。
It can tweak your steering and your acceleration, speeding up and slowing down, but you've gotta be vigilant the whole time.
你必须随时准备接管方向盘,以防出现任何问题。
You gotta be ready to take over and take the wheel if anything goes wrong.
明白了。
Got it.
所以你写过这是个端到端模型。
So you you wrote about this being an end to end model.
那是什么意思?
What what does that mean?
没错。
That's right.
在许多机器人技术中,包括自动驾驶汽车,传统方法是使用专门的软件模块来控制机器人或自动驾驶汽车所需执行的所有任务,从感知世界并理解它,到规划采取什么行动,再到控制机器人或汽车并执行这些动作。
So the traditional approach in a lot of robotics, including for self driving cars, is you have specialized pieces of software that are governing everything the robot, or in this case the self driving car, needs to do, from perceiving the world and making sense of it, to planning what actions to take, to controlling the robot or the car and taking those actions.
每个环节都有独立的软件层来处理。
And you'd have a separate layer of software that handles each of those.
这具有一些非常不错的优势。
That has some really nice properties.
比如,你能知道什么时候出了问题。
Like, you can tell when something's going wrong.
我知道这个过程中的感知部分出了问题。
I know it's going wrong with the perception part of this process.
你可以看到信息从一个环节流向另一个环节。
And you can see the flow of information from one to the other.
但随着人工智能的发展,如今非常流行使用单一的统一AI模型,直接从输入的像素到输出的动作,控制机器人或汽车的所有行为。
But with the rise of AI, it's very in vogue these days to just have a single unitary AI model that from pixels coming in to actions coming out controls everything that the robot or the car does.
这确实有一些真正的优势,因为AI模型非常强大且富有表现力,能够捕捉所有难以定义的规则,并以微妙的方式从数据中学习。
And that has some real advantages, because AI models are so expressive and powerful, they can capture all of these difficult to define rules, and they can learn from data in subtle ways.
另一方面,你现在让一个无法监控、缺乏透明度的黑箱来控制你的汽车。
On the other hand, now you're letting some black box that you can't really monitor that isn't very transparent control your car.
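The two architectures contrasted here, a modular perceive-plan-act pipeline versus one end-to-end network, can be sketched in a few lines. This is an illustrative toy: every function name, field, and scene format below is invented, not NVIDIA's or any real stack's API.

```python
from dataclasses import dataclass

@dataclass
class Action:
    steering: float      # radians
    acceleration: float  # m/s^2

# --- Traditional modular stack: separate, inspectable stages ---

def perceive(camera_image):
    """Detect objects, lanes, etc. Output is human-interpretable."""
    return {"obstacle_ahead": False, "lane_offset_m": 0.1}

def plan(scene):
    """Decide a maneuver from the structured scene description."""
    return "keep_lane" if not scene["obstacle_ahead"] else "brake"

def control(maneuver):
    """Turn the planned maneuver into actuator commands."""
    return Action(steering=0.0, acceleration=1.0 if maneuver == "keep_lane" else -3.0)

def modular_drive(camera_image) -> Action:
    # Information flows through inspectable intermediate representations,
    # so a failure can be localized to perception, planning, or control.
    return control(plan(perceive(camera_image)))

# --- End-to-end: one learned model maps pixels directly to an action ---

def end_to_end_drive(camera_image, model) -> Action:
    # The model is a black box with no intermediate representation to
    # inspect, but it can capture subtle rules that are hard to hand-code.
    return model(camera_image)
```

The design trade-off is exactly the one discussed: the modular version exposes where a failure happened, while the end-to-end version trades that transparency for expressiveness.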
而这种机器人应用的风险非常高。
And that's a very high stakes kind of robotics.
对吧?
Right?
这是一个非常严肃的部署。
That's a very serious deployment.
如果出了问题,你也不知道到底是哪一部分出了错,因为它没有被分段。
Also, if something goes wrong, you don't know what part of it actually went wrong, because it's not as segmented.
没错。
Exactly.
没错。
Exactly.
而且你知道,它总会遇到一些从未见过的情况。
And you know, it's always going to encounter some scenario it's never seen before.
你总会遇到一些在训练中模型没见到过的道路场景,而你希望确保它能够
You're always gonna come across something on the road that your model didn't see during training, and you wanna make sure that it's going to be able to
妥善应对。
respond properly.
你愿意分享一下你在通讯中提到的那个例子吗?
Do you wanna share the example that you that you gave in your newsletter?
我正想着圣塔孔活动呢。
I had I had SantaCon on the mind.
我在想,如果这些自动驾驶汽车遇到的是一个醉酒圣诞老人的游行队伍,而火箭决定把它当作
I'm thinking, know, if one of these self
障碍物,
driving Parade of Drunken Santas is the obstacle that Rocket decided he was
没错。
gonna That's right.
打算纳入
Was gonna put
训练数据,你知道吧?
in training data, you know?
你知道,NVIDIA在训练时并没有模拟这些醉酒的圣诞老人。
Know that NVIDIA isn't simulating those drunken Santas during training.
对。
Right.
所以,我的意思是,我想问一下,毕竟这有优点也有缺点。
So, I mean, I do want to ask you, though: there are pros and cons.
那你认为 NVIDIA 为什么选择这种端到端的方法呢?
Why do you think NVIDIA went with that end to end approach then?
这真的只是为了与竞争对手区分开来吗?
Was it really just a way to differentiate itself?
这是一个商业决策吗?
Is it a business decision?
我的意思是,你说这是开源的,所以想必里面还有更多东西可供人们折腾。
I mean, you're saying it's open source, so presumably there's a lot more there that people can tinker with.
跟我详细说说。好的。
Walk me through it. Yeah.
我认为这真正展示了他们的仿真软件。
I think it really showcases their simulation software.
我认为这才是从他们发布的 Cosmos 模型中获取巨大优势的关键,他们正试图推动它的采用。
I think that's really how you get a lot of juice out of this Cosmos model that they've put out, and they're trying to drive adoption of it.
然后我认为他们希望人们消耗更多的计算资源。
And then I think they want people to spend compute.
正如史蒂夫之前提到的,英伟达希望人们在实际运行这个模型时消耗计算资源。
To Steve's point earlier, NVIDIA wants people to spend compute on actually running this model.
我认为这种端到端的范式也最符合这一点。
And I think that makes most sense in this kind of end to end paradigm as well.
但英伟达做了一件聪明的事。
But NVIDIA does something clever.
对吧?
Right?
当他们将这个模型推向市场,与梅赛德斯合作时,并不只依赖端到端模型。
When they're actually putting this model out into the world, working with Mercedes, they're not relying just on the end to end model.
与此同时,他们还在运行传统的感知、规划和执行堆栈。
In parallel, they're running the more traditional stack of perception and planning and action.
这两个系统同时在决定该采取什么行动。
And both systems at the same time are deciding what action we should take.
他们管这个叫什么来着?
And they have sort of a what do they call it?
就像在上面加了一个安全策略评估器。
Like a safety policy evaluator on top.
这是一种委婉的说法,意思是他们还有一个系统,随时决定听从哪一个。
It's a fancy way to say they have another system that's deciding at any given moment, which one should we listen to?
我们是否信任黑箱模型做出的决定?
Do we trust what the black box model decides?
你要么选择新模型,要么选择旧技术。
You either go with the new model or the older technology.
对。
Right.
根据模型的置信度,以及我们是否处于模型未曾预料到的新场景中。
Based on how confident the models are, based on are we in a novel scenario that the model hasn't anticipated.
这为系统增加了一层安全性,因为我们可以保证,一旦遇到特殊情况,系统会回退到更安全的模型。
And that adds a level of safety to it, because you can guarantee that we'll fall back on the safer model in the event that we run into some exceptional circumstance.
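The arrangement described here, both stacks running in parallel with a supervisor choosing whose output to trust, might look roughly like this. It is a hedged sketch: the thresholds, signal names, and function are invented for illustration, not taken from NVIDIA's actual system.

```python
def arbitrate(e2e_action, e2e_confidence, fallback_action, novelty_score,
              confidence_floor=0.9, novelty_ceiling=0.5):
    """Pick between the end-to-end model and the classical stack.

    Falls back to the traditional perception/planning stack whenever the
    learned model is unsure or the scene looks unlike its training data.
    """
    if e2e_confidence >= confidence_floor and novelty_score <= novelty_ceiling:
        return e2e_action
    return fallback_action

# Familiar scene, confident model: trust the end-to-end output.
assert arbitrate("merge_left", 0.97, "keep_lane", novelty_score=0.1) == "merge_left"
# Novel scene (say, a parade of Santas): fall back to the classical stack.
assert arbitrate("merge_left", 0.97, "brake", novelty_score=0.9) == "brake"
```

The guarantee comes from the structure, not the learned model: whatever the black box outputs, the supervisor can always route around it.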
对。
Right.
我想转到谈谈你们今天发布的另一条独家新闻。
I want to shift to talking about another scoop that you published today.
你和我们的风险投资副总监凯蒂·鲁夫写了一篇关于 LM Arena 的报道。
You and Katie Roof, our Deputy Bureau Chief of Venture Capital, had a story about LM Arena.
我在标题里已经透露了,但你们的报道显示,它在新一轮融资中的估值达到了17亿美元。
I gave it away here in the heading, but the reporting showed that it's valued at $1.7 billion in a new funding round.
跟我们多讲讲
Tell us a
为什么LM Arena能够筹集到这么多资金。
little bit more about why LM Arena was able to raise all this money.
当然可以。
Yeah, absolutely.
我的意思是,LM Arena已经成为比较各种模型能力和性能的首选平台。
I mean, LM Arena has become a go-to place to compare the capabilities and performance of models.
我可以告诉你,我经常和AI领域的人交流,他们都依赖LM Arena来了解不同模型的表现,从而评估新发布的模型质量。
I can tell you I talk to people in AI all the time who look to LM Arena for a sense of how different models are performing, to gauge the quality of new releases.
它已经成为衡量不同模型在各种任务中表现的共同参考标准,无论是回答问题、生成图像,还是从图像生成视频。
It's really become a source of common knowledge for how different models perform at different tasks, from answering questions to generating images to generating videos from images.
只要你能想到的,LM Arena都在为其创建一个比较模型的类别。
Like, you name it, LM Arena is creating a category to compare models.
它是怎么赚钱的?
How does it make money?
它有一支小型团队,这些人免费地在全球范围内比较各种模型,并告诉你,比如 ChatGPT 在做这个,Claude 在做那个。
So it has a small army of people who, for free, are comparing models around the world and telling you, you know, ChatGPT is doing this and Claude is doing that.
LM Arena收集的这些数据非常有价值。
That data that LM Arena collects is really valuable.
我的意思是,这对人工智能公司来说是一个极其宝贵的信号,可以帮助他们改进自己的模型。
I mean, that's a super valuable signal for AI companies to use to improve their models.
所以LM Arena与AI公司合作,包括模型开发者以及致力于部署这些AI系统的大企业,并为它们提供定制化评估。
So LM Arena works with AI companies, both the model developers and bigger businesses that are working on deploying these AI systems, and it gives them custom evaluations.
比如,一家公司会来找他们,说:我们想知道它在某个特定类别中的表现如何。
Like a company comes to them and says, We want to know how well it's doing in a certain category.
得益于LM Arena收集的数据,以及他们拥有的这支免费志愿者队伍,他们能够提供非常有用的评估指标并进行对比。
And thanks to the data that LM Arena has collected, and thanks to the sort of free workforce that they have, they're able to provide really useful evaluation metrics and comparisons.
它能提前告诉你:你的模型将如何与其他模型相比。
It can tell you ahead of time, This is how your model is going to stack up.
所以它基本上有一个面向公众的基准测试系统,供公众查看,但同时也能私下接受这些实验室的付费,为他们提供关于模型表现好坏的反馈。
So basically, it has this public benchmarking system for the public to see, but then privately, it can get paid by these labs to give feedback on how good their models are.
我想象这两者之间有一道墙,他们必须明确表示:我们不偏袒任何一方。
And I imagine there's a wall between the two, that they'd have to really say, we're not picking favorites.
对吧?
Right?
不能说,我们和这个实验室比那个实验室关系更近。
Not, we're closer to this lab than that lab.
我们从这个实验室获得了更多业务。
We have more business from this lab.
必须真的有一道墙,确保他们不会把两者混在一起。
There would really have to be a wall so that they don't mix the two.
是的。
Yeah.
过去,他们因与此相关的做法而受到一些批评。
And they've come under some criticism for the practices related to this in the past.
几年前有一篇由Cohere主导的论文,批评称Meta在发布其某个Llama模型之前,似乎能够以你所说的那种方式付费参与。
There was a paper published a while ago that was headed up by Cohere, and it criticized that it seemed like Meta, in advance of releasing one of its Llama models, was able to sort of pay to play in the way you're pointing to.
他们可以提交多个以不同方式定制的模型检查点。
They could submit a number of checkpoints for the model that were customized in different ways.
他们可以查看哪一个表现最好,然后选择那个。
They could see which one performed the best, and then they could go with that one.
因此,在模型发布当天,它看起来在与其他模型的对比中表现得非常好。
So on the day the model comes out, it seems like it's performing very well against the others.
现在,LM Arena已对此作出详细回应,解释了他们的做法和运作方式。
Now LM Arena has responded to this extensively and has explained their practices and how they go about it.
但你可能也会问:这有什么害处呢?
But you also might say, What's the harm?
你可以说:当然,Meta 愿意支付比其他客户更多的费用,但他们获得的反馈如果质量高,将有助于推出最优质的 Llama 模型。
You could say, Well, sure, Meta is willing to pay more than other customers, but the feedback they're getting, if it's high quality, is leading to the best version of Llama that they could possibly release.
因此,你也可以持这种观点。
So that's a position you could have on it as well.
对。
Right.
好了,Rocket,感谢你前来做客。
Well, Rocket, I want to thank you for coming on.
这位是 Rocket Drew,我们《The Information》的 AI 与机器人记者。
That is Rocket Drew, our AI and robotics reporter here at The Information.
好的。
Okay.
亚马逊的AI团队最近一直处于压力之中,努力确保其自身的模型能够与公司平台上众多外部模型相媲美。
Amazon's AI team has been in a stressful spot lately trying to make sure that its own models are up to par with the many outside models the company has on its platform.
这是我们的亚马逊记者凯瑟琳·佩洛夫今天发表的一篇深度报道的主题。
That is the subject of an in-depth story out today from Katherine Perloff, our Amazon reporter.
我想请她来告诉我们她发现了什么。
I want to bring her on to tell us more about what she learned.
凯瑟琳,欢迎再次做客我们的节目。
Katherine, welcome back to the show.
很高兴你来到这里。
It's great to have you here.
嗨,阿卡什。
Hi, Akash.
我们之前在这档节目中讨论过亚马逊的Nova模型。
We've talked a little bit about Amazon's Nova models on this show.
当我们想到领先公司时,通常不会首先想到这个模型,但这是AWS。
It's not the model that we really think about first when we think about the leading companies, and yet this is AWS.
那么,Nova 模型系列表现如何?
So how good is the Nova model family?
这是个复杂的问题,但告诉我你发现了什么。
It's a loaded question, but let me know what you found.
是的。
Yeah.
你知道,我认为 Nova 刚刚在上个月底发布了第二代模型,而且 Nova 在许多指标上都有所提升。
You know, I think that with Nova, they just released the second generation of models at the end of last month, and Nova has been improving on a lot of metrics.
在某些指标上,它甚至优于 Anthropic 或 OpenAI。
And in some metrics, it's even better than Anthropic or OpenAI.
但在许多指标上,它仍然落后于这些大公司。
But on a lot of metrics, it still ranks behind those big companies.
而且,我接触的外部客户觉得它相当可靠。
And the external customers I talk to find that it's pretty reliable.
它能完成任务。
It does the job.
它很便宜,但并不是最先进的,这也是为什么一些亚马逊员工开始戏称Nova为"亚马逊基础款"(Amazon Basics),确实有点调皮。
It's cheap, but it's not state of the art, which is why some employees who work at Amazon have actually taken to calling Nova Amazon Basics, which is a little bit cheeky.
是的。
Yeah.
你知道,那是这家电商网站的自有品牌系列。
You know, because that's the private label line of the ecommerce site.
所以,它们在进步,但还没达到顶尖水平,我想这就是我们目前的状况。
So, you know, they're getting better, but they're still not quite top of the line, I guess, is kind of where we're at.
给我们解释一下吧。
And just explain it to us.
我的意思是,AWS显然是云业务。
I mean, so AWS is obviously the cloud business.
在这个平台上,客户可以选择使用Nova模型、Anthropic模型,或者任何其他众多模型来满足他们的需求。
And so on that platform, customers have the choice of using either the Nova models or Anthropic or any other number of models basically for their purposes.
他们可以自己进行比较,说这个模型更好或更差。
They're able to do the comparisons on their own, saying, you know, this one works better or worse.
对。
Right.
没错。
Exactly.
所以,AWS 提供多种模型。
So, yeah, AWS, you know, offers a variety of models.
他们有一个名为 Bedrock 的服务,客户可以在此基础上利用这些模型构建产品。
They have this Bedrock service that, is where their customers can, you know, build products on these models.
我认为,亚马逊公开会说,而且这确实有道理,Nova 主要是为他们 AWS 客户提供更多选择。
You know, I think Amazon would say publicly, and I think there is definitely truth to it, that Nova is just about them offering more choice, especially to their AWS customers.
你知道吗?
You know?
应该有更多模型供他们的企业客户选择,而不必被锁定在少数几家大公司身上。
There should be more models for their business customers, so they don't have to be locked into a couple of big players.
但显然,他们也在投资构建自己的模型,出于他们自身的业务原因。
But, you know, obviously, they're also trying to invest in building their own models for their own business reasons.
而且,你知道,我们在故事中探讨的一个问题是,如果Nova自身不够强大,尤其是不足以支撑亚马逊自己的产品,这会对亚马逊造成问题吗?
And that's something we kind of get into in the story: if Nova is not good enough on its own, especially to underpin Amazon's own products, is that gonna be a problem for Amazon?
那么,亚马逊正在采取什么措施来改进Nova模型呢?
So what is it doing to improve its Nova models then?
是的。
Yeah.
我的意思是,我认为他们在过去一年里确实有所改进。
I mean, I think that, you know, they did improve in the past year.
他们在去年12月的AWS大会上发布了第一代模型,现在又推出了新一代模型。
They announced the first generation last December at the AWS conference, and now they've announced a new generation.
他们在许多不同指标上都取得了进步,比如遵循规则和其他一些基准测试。
They've gotten better on a lot of different metrics, including things like rule following and other benchmarks.
还有,阿卡什,你刚才谈到的LM Arena,就是这类评估。
And, Akash, you were just talking about LM Arena, those kinds of evaluations.
但内部有人担心,它在实际产品开发中的表现,有时并不如Anthropic那么出色。
But, internally, there is concern that it doesn't always do the job as well as Anthropic, especially when folks are making products.
所以,很多亚马逊的旗舰AI产品,比如Rufus、QuickSuite(一个AWS企业搜索产品)、Curo(一个编程助手),都在使用Anthropic和Nova。
So a lot of flagship Amazon AI products, like Rufus, like QuickSuite, which is an AWS enterprise search product, and Curo, which is a coding assistant, are using Anthropic and Nova.
实际上,Curo完全依赖于Nova。
And actually, Curo relies exclusively on Nova.
抱歉,这些产品名称有点让人混淆。
And sorry, the names are a little confusing.
我与一些参与过这些产品的人交谈过,他们说AWS的领导层希望他们更多地基于Nova构建,因为他们担心,如果太多亚马逊产品都建立在Anthropic之上,那么当任何公司都能使用Anthropic构建产品时,这些产品就无法在市场中脱颖而出。
And, I talked to some of the folks, you know, who had worked on those, and they said that AWS leaders want them to build more on Nova because they're worried if too many Amazon products are built on Anthropic, those products won't stand out on the market when any company can build products on Anthropic.
而Nova是亚马逊的独特优势。
And Nova's sort of Amazon's special sauce.
因此,他们也在向其他客户开放Nova。
So they they are also making it available to other customers.
他们还指示AI实验室以外的员工对Nova模型进行微调和后训练,以提升其性能,使其跟上进度。
And they've been instructing their employees outside the AI lab to work on fine-tuning and post-training the Nova models to improve them and bring them up to speed.
而且,我聊过的有些人并没有使用最新一代的Nova模型。
And some of the people I was talking to weren't working with the newest generation of Nova models.
有些人正在处理今年才推出的一些模型,但可能还不是12月re:Invent大会上发布的那些。
Some of them were working with models that were kind of coming out this year, but maybe not quite the ones that were announced at re:Invent in December.
但无论如何,他们发现,他们在对这些模型进行后训练,但效果并不总是如他们所愿。
But nonetheless, they had found that they're post-training these models, and the models aren't always doing quite as good a job as they want them to.
在某些方面,这些模型表现尚可,甚至更好,但并非所有情况都如此,因此仍不得不依赖Anthropic。
For some things, they're okay or even better, but not for everything, and that's why there has had to be some reliance on Anthropic.
所以让我问你一个问题。
So let me ask you this.
你才刚开始报道亚马逊,过去几个月才涉足这一领域,这肯定是你深入研究亚马逊AI业务的众多报道之一。
You've just started covering Amazon in the past couple months, and this was one of your first of many, I'm sure, deep dives into the AI business.
我一直努力理解,为什么AWS在某些情况下会在AI话题上落后于其他云服务商。
I've always really tried to understand what happened that had AWS, in some cases, fall behind, I guess, in the AI conversation in a way that the other cloud providers didn't.
作为刚接触这个领域的新手,观察到这种情况,你认为AWS究竟发生了什么?为什么他们在AI领域的声量不如GCP或Azure?
And I wonder, as you sort of coming into this beat fresh and looking at this, what is your sense for what happened at AWS and why they haven't been at the top the way that GCP or Azure have been in the AI conversation?
是的。
Yeah.
我的意思是,谷歌作为这项技术的发明者之一,具有优势。
I mean, well, Google has the advantage of being one of the inventors of this technology.
微软确实全力支持OpenAI,其与OpenAI的关系比亚马逊与Anthropic的关系更为深入。
Microsoft has really thrown its weight behind OpenAI, and their relationship with OpenAI is more in-depth than Amazon's relationship with Anthropic.
我的意思是,微软也试图开发自己的AI模型,但与OpenAI建立了非常深入的合作关系,并享有某些独家权益,而亚马逊和Anthropic,我想,你知道,
I mean, Microsoft has also tried to develop its own AI models, but it's had this sort of very deep partnership with OpenAI and certain exclusivities to that partnership, which Amazon and Anthropic, you know, I guess, you know
这是一段更年轻的合作关系。
I mean, it's a newer partnership.
对吧?
Right?
而且,Anthropic也在不断拉近与微软和谷歌的关系。
And Anthropic has also been cozying up to Microsoft and to Google.
最后,我们还可以考虑Meta,它一直在为这个问题投入大量资金。
And then the last kind of player we can think about is Meta, which has just been throwing a ton of money at this problem.
他们没有云业务。
They don't have a cloud business.
我想这可能是他们这样做的部分原因,因为亚马逊拥有云业务,AI初创公司可以利用它来缓解一些市场压力。
And I guess maybe that's part of the reason they've done this is Amazon kinda has this cloud business where AI startups, you know, will use it to sort of insulate them from some of the market pressure.
但你知道,亚马逊的文化并不倾向于花这么多钱。
But, you know, Amazon's culture is not to spend so much money.
这从来就不是他们的文化。
That's never been their culture.
他们一直比较节俭。
They've always been a bit thriftier.
我们在故事中还谈到另一点,他们并没有像Meta那样支付巨额薪水,而且他们在人才保留方面采取了更为谨慎的策略。
That's something else we talk about in the story: they weren't paying exorbitant salaries like Meta was, and they had a bit of a more careful approach to retention.
所以我认为这有点像一把双刃剑:他们在AI领域不像谷歌那样先进,但从某种程度上说,你也可以说他们拥有一种谷歌所没有的起点优势。
And so I think it's sort of a double-edged sword, where they weren't so advanced in AI like Google, but in some ways they had a head start that you could say Google didn't.
我的意思是,他们只有Alexa,而Alexa并不被视为先进的AI。
I mean, they just had Alexa, which wasn't seen as as advanced AI.
而且你在报道中也提到,Alexa团队的许多人才逐渐转去从事AI相关工作。
And you talk in the story about a lot of the talent from the Alexa team sort of migrating over to work on AI.
对吧?
Right?
没错。
Right.
他们正是从这些人才中为这个AI实验室招募了骨干。
That's where they kind of drew the talent for this AI lab.
而且内部还存在一些争论。
And there's some kind of, you know, debates internally.
这些人才真的合适吗?
Was this the right talent?
因为负责Alexa AI的员工对大型语言模型的熟悉程度,不如对自然语言处理、机器学习等更传统的AI形式。
Because the folks working on Alexa AI were not as well versed in large language models versus other older forms of AI, like natural language processing and machine learning.
事实上,如果我仔细想想,我们曾写过苹果在让Siri借助新AI功能实现规模化方面遇到的困难。
Which is actually kind of, you know, if I think about this, we've written about Apple's struggles to get Siri to scale the way it would have hoped with new AI features.
我的意思是,Alexa和Siri某种程度上依赖的是同一类人才,我想你可以说,依赖这类人才来扩展AI战略,对这两家公司来说可能都没有带来好结果。
I mean, with Alexa and Siri it's sort of the same group of talent, and I guess you could make the argument that relying on that type of talent to scale your AI strategy maybe hasn't boded well for either of these companies in some way.
对。
Right.
没错。
Yeah.
不过公平地说,亚马逊确实做了一些招聘和授权合作。
I think, I mean, to be fair, Amazon has done some of those sort of hiring-and-licensing deals.
他们在2024年与一家专注智能代理的初创公司Adept达成了这样一笔合作。
They did one with the agent specialist startup, Adept, back in 2024.
他们还与一家机器人AI公司Covariant做了类似的合作。
And they did the same with a robotics AI company, Covariant.
但确实,人们已经做了这种与苹果的对比。
But, yeah, I think people have made the apple comparison.
我的意思是,仅仅为了展示这一战略正处于变动之中,上个月他们宣布了这一人工智能部门的新领导层。
I mean, just to kind of show how this strategy is in flux, last month they announced new leadership, for this AI organization.
之前负责这一部门的人即将离开。
The person who had been leading it is leaving.
他们任命了一位AWS高管,并将原本仅专注于开发基础模型的部门扩展至涵盖芯片和量子计算的职责。
They're bringing in an AWS exec, and they're expanding this division, which had just been focused on making foundation models, to also have chips and quantum computing in its purview.
所以我认为他们正在试图解决这个问题。
So I think they're trying to sort of address the problem.
但没错,要看他们在大型科技公司的AI格局中处于什么位置,苹果是一个相当恰当的比较对象。
But, yeah, Apple is sort of a fair comparison for how they stack up within the big tech companies in AI.
这真是个精彩的故事。
Well, it was a great story.
凯瑟琳,感谢你前来分享你的报道。
Kathryn, I want to thank you for coming on and sharing your reporting with us.
这位是凯瑟琳·佩罗夫,我们《信息》杂志的亚马逊记者。
That is Kathryn Perloff, our Amazon reporter here at The Information.
好的。
Okay.
人工智能热潮严重依赖能源行业,在我们最新一期的AI基础设施通讯中,我们考察了这一行业各个领域如何因数据中心的不断涌现而调整策略。
The AI boom relies heavily on the energy sector, and in our newest edition of our AI infrastructure newsletter, we look at all the different corners of that industry that are changing their strategies as more data centers pop up.
现在加入我们的是我们的专栏作家、该通讯的作者安·戴维斯·沃恩。
Joining me now is Ann Davis Vaughn, our columnist and the author behind that newsletter.
安,欢迎再次做客我们的节目。
Ann, welcome back to the show.
很高兴你能来到这里。
It's great to have you here.
嗨,阿卡什。
Hi Akash.
很高兴能来这里。
Great to be here.
我们之前在节目中讨论过,人工智能热潮对能源行业带来了巨大推动。
So we've talked on the show about the extent to which the AI boom has been a big boon for the energy sector.
但我们还没有像你今天的专栏那样深入探讨,究竟是能源行业的哪些领域真正受益并改变了策略。
We haven't really gone as in detail as your column did today about what corners of the sector are actually benefiting and changing their strategy.
你提到了公用事业公司,也提到了设备制造商。
You talked about utilities, you talked about equipment manufacturers.
请为我们详细列出哪些群体受益了,然后谈谈你从报道中观察到的情况。
Lay out the list for us of who's benefiting, and then talk about what you're seeing from your reporting.
当然可以。
Yeah, absolutely.
所以,我希望在这篇专栏中,让我们的读者了解所有从这场人工智能竞赛中获益的不同参与者。
So I wanted in this column for our readers to understand all the different players that are getting something from this AI race.
人工智能对能源的巨大需求是催化剂,但许多行业都在努力抓住这一时刻,因为我们不仅对能源、也对工业资源有着非同寻常的需求,而且企业们看到了一个政策可能随之改变的时间窗口。
The incredible demand for energy to power AI is the catalyst, but this is a moment that many industries are trying to seize, because we've got an extraordinary need, not just for energy but for industrial resources, and companies see a moment in time where some policies could change for them.
因此,思考这场热潮时,你的眼光要超越硅谷正在发生的事情。
So thinking about this boom, you want to think bigger than just what's happening in Silicon Valley.
我们在专栏中谈到的,其中包括天然气行业如何利用这一热潮实现某些目标。
What we talk about in the column is, among other things, what the natural gas industry is able to use this boom to do.
我来给你举个例子。
And so I'll just give you an example.
天然气行业将这场人工智能竞赛视为一个千载难逢的机会,得以推进那些过去十年甚至二十年都未能实现的议程。
The natural gas industry is seeing the AI race as a generational opportunity to get some aspects of their agenda accomplished that they have not been able to get accomplished for a decade or two.
这包括建设更多永久性的天然气管道和天然气发电设施,因为他们可以辩称,美国必须赢得这场人工智能竞赛,而他们扮演着重要角色。
And that includes building more permanent infrastructure for natural gas pipelines and natural gas plants, because they can argue that this AI race must be won by the United States and that they have a part to play.
在某些情况下,你也提到,对这类投资而言,有时它是否是最清洁的能源来源其实并不那么重要,对吧?
And in some cases, you talked a little bit about how, with this investment, sometimes it doesn't really matter if it's the cleanest energy source, right?
我的意思是,因为能源供应实在太短缺了。
I mean, because there's such a shortage.
能源短缺确实非常严重。
There's such a shortage.
我想说,你可能会感到意外:随着时间推移,能源行业的最佳实践其实有助于改善环境。
I would say, you know, you'd be surprised: over time, best practices in the energy industry can help clean things up.
这促使天然气行业与一些科技公司合作,承诺改善我们运营中的甲烷泄漏问题。
It's been a motivator for the natural gas industry to be a partner to some of these tech companies by saying, hey, we're gonna clean up our, you know, methane leaks from some of our operations.
目前还有一些正在洽谈的交易,旨在开展首批碳捕获项目。
Or there are deals in the works right now to try first of a kind projects that would include carbon capture.
因此,这是一个正在受益并看到自身机遇的行业。
So that's one industry that is benefiting and seeing their moment.
清洁能源公司也在迎来自己的机遇,因为我们需要多种不同的能源来源,国家也将以多种方式利用能源基础设施。
You know, clean energy companies are seeing their moment, because we need so many different energy sources, and the country will be able to use energy infrastructure in different ways.
电池产业正在蓬勃发展,尤其是在德克萨斯州,作为太阳能的备用和配套设施。
Batteries are absolutely booming, especially in Texas as a backup and a pairing to solar.
尽管你可能听说特朗普政府的政策并不那么支持,但这些行业的创业者仍获得了惊人的投资。
And entrepreneurs in those industries are seeing incredible investment despite what you might hear about the Trump administration's policies not being as supportive.
他们并不是唯一受益的群体。
They're not the only ones.
核工业,包括整个核供应链,由于我们正在重启核电站,正积极应对日益增长的电力需求。
The nuclear industry, including the whole nuclear supply chain, because we are reviving nuclear power plants, is revving up to help this incremental need for power.
这也延伸到了国内铀矿的采矿公司。
And that also extends into mining companies for domestic uranium.
还延伸到了为数据中心内部建设设备的设备公司,以及电气设备供应商。
It extends into equipment companies that are building out what goes inside of the data center, electrical equipment providers.
如果人工智能泡沫如人们所担心的那样破裂了,会发生什么?
And what happens if the AI bubble bursts as people fear it might?
我的意思是,这些数据中心的建设都有多年的承诺,我不禁想到我们同事史蒂夫·莱文的报道。
I mean, these data centers, there are multi year commitments to build them, and I can't help but think of our colleague Steve Levine's reporting.
我们在The Information网站上有一个完整的数据库,专门记录那些为生产电动汽车电池及相关材料等而涌现的超级工厂。
We have an entire database on The Information website dedicated to the gigafactories that popped up to make, I think it was, the batteries for electric vehicles and the materials, stuff like that.
其中一些现在只是空壳而已。
And some of them are just, I mean, they're just shells now.
它们已经被废弃,或者在某些情况下根本就没有启动。
They've been gutted or in some cases they didn't even get going.
如果这些投资消失了,会发生什么?
What happens if this investment goes away?
是的。
Yeah.
是的。
Yeah.
你正在看到一些电池公司转向公用事业电力。
I mean, you're seeing some battery companies pivot into utility power.
关于这一点,阿卡什,有几件有趣的事情,因为我专栏里写到的这些行业,在规划项目时是以几十年为单位来思考的。
So there's a few interesting things about this, Akash, because the industries that I wrote about in my column think in terms of decades for their projects.
而他们正在试图帮助那些以十二到十八个月为周期思考的行业,看看如何能尽快实现他们的目标,比如通过人工智能获得某种产品收入。
And they're trying to help an industry that is thinking in a twelve-to-eighteen-month timeframe about how quickly it can achieve what it's trying to achieve, you know, getting some sort of product revenue with AI.
但这些行业以前也经历过繁荣与萧条,因此在选择与谁签订这些长期合同时非常谨慎。
But these industries have, you know, been through booms and busts before and they're being really careful about who they engage in these big long term contracts with.
长远来看,这可能会给大型科技巨头带来优势,因为如果你要与石油和天然气公司合作,而它们还能提供碳捕集或帮助你控制成本,那么它们希望有一个信用良好的合作伙伴。
It could ultimately give the very large tech giants an advantage over time, because if you're going to engage with an oil and gas company that can also do carbon capture or help you on price, they wanna have a creditworthy counterparty.
这就是能源公司保护自己的方式之一。
That's one way that the energy companies are protecting themselves.
我们也在专栏中解释了公用事业业务的性质。
We also just explained the nature of the utility business in the column.
公用事业公司在特定区域内被授予垄断权,尤其是在监管更严格的州。
Utilities have been granted a monopoly in territories, especially in states that have heavier regulation.
它们向州监管机构申请进行长达数十年的资本支出。
And they go to state regulators and ask to do multi-decade capital expenses.
一旦获得批准,它们就能获得一个内置的保证回报率,并向客户收取费用,包括数据中心。
And once they get approval, they have a guaranteed rate of return that is built in that they can charge to their customers, including data centers.
这些项目因此相当持久,因为我们都在为此付费。
And those projects are then pretty durable because we're all paying for it.
有趣的是,我们在电力基础设施方面严重滞后,工业基础需要投入的资金远超当前水平,我们正在建设的这些产能很可能只是我国所需产能的开端。
And the interesting thing is that we are so behind on power infrastructure and we have so much more that needs to be invested in the industrial base that this capacity that we're building is likely just the beginnings of what we need in this country.
其中一些项目的进展非常缓慢。
And it goes so slowly, some of it.
这自然成为能源基础设施泡沫的制约因素。
That is a natural governor on the energy infrastructure bubble.
而且,我们现在有了一个可以加速这些事情的借口,这更让人相信这些项目在长期内实际上不会被取消。
And the fact that we sort of have an excuse now to accelerate things is sort of more reason to say that maybe these projects, they won't actually be canceled in the long run.
安妮,感谢你前来做客。
Anne, I want to thank you for coming on.
这位是安妮·戴维斯·沃恩,我们在《信息》杂志的AI基础设施专栏作家。
That is Anne Davis Vaughn, our AI infrastructure columnist here at The Information.
谢谢你的分享。
Thanks for that.
好的。
Okay.
那么,今天的节目就到这里。
Well, that does it for today's show.
提醒一下,我们每周一至周五上午10点(太平洋时间)、下午1点(东部时间)直播。
A reminder, we are on this stream Monday through Friday at 10AM Pacific, 1PM Eastern.
感谢您的收听。
I wanna thank you for tuning in.
我们非常感谢您的观看。
We really do appreciate your viewership.
我已经迫不及待期待明天的节目了。
I am already excited for our next show tomorrow.
祝您周二剩下的时间愉快。
Have a great rest of your Tuesday.
暂时再见了。
Bye bye for now.