本集简介
双语字幕
仅展示文本字幕,不包含中文音频;想边听边看,请使用 Bayt 播客 App。
欢迎各位收看资讯TITV。我是主持人Akash Pasricha。今天是11月3日星期一。我们为您准备了一期精彩节目。AWS与OpenAI宣布达成380亿美元合作协议,OpenAI将在AWS云平台上运行工作负载,而AWS将为OpenAI提供NVIDIA芯片资源。
Welcome everyone to the Information's TITV. My name is Akash Pasricha. It is Monday, November 3. We have an exciting show lined up for you today. AWS and OpenAI are announcing a $38 billion deal, which will see OpenAI run workloads on AWS, while AWS will provide OpenAI with access to NVIDIA chips.
我们将邀请专注微软与企业软件领域的记者Aaron Holmes深入解析。同时还将探讨他今日发布的新报道——关于企业使用AI代理时面临的挑战。随后我将对话初创公司StarCloud的联合创始人兼CEO,该公司刚将搭载NVIDIA GPU的卫星发射至太空。我们还将聆听谷歌高管与ServiceNow副主席分别讲述各自公司的AI战略。节目最后将由Tubi CEO分享流媒体行业格局变化,以及该平台如何实现盈利。
We're going to break that down with Aaron Holmes, who covers Microsoft and all things enterprise software. We're also going to talk about a new story that he has out today about the challenges that companies are seeing with AI agents. Then I'm talking with the co-founder and CEO of StarCloud, the startup that just launched a satellite into outer space carrying an NVIDIA GPU. We'll also hear from a Google executive and the vice chairman of ServiceNow about each of those companies' AI playbooks. And we're gonna wrap up the show with Tubi's CEO on how the streaming and media landscape is changing and on that platform reaching profitability.
本期节目精彩纷呈。内容非常丰富,让我们直接进入主题。AWS与OpenAI签署了为期多年的大型合作协议,OpenAI将在AWS基础设施上运行部分工作负载,并通过AWS的NVIDIA芯片获得强大算力支持。现在有请专注微软与企业软件报道的记者Aaron Holmes为我们解读这则新闻。
It is an exciting show. We've got a lot to get to, so let's get right on into it. AWS and OpenAI have signed a big multi year deal together that will see OpenAI run some of its workloads on AWS infrastructure. OpenAI will get access to massive compute power via AWS's NVIDIA chips. Joining me now to break down the news is our Microsoft and enterprise software reporter, Aaron Holmes.
Aaron,很高兴见到你。周一愉快。
Aaron, it's good to see you. Happy Monday.
周一愉快。很荣幸参与节目。
Happy Monday. Happy to be here.
新的一周,OpenAI又与一家主要云服务商达成重大合作。你对今早这则新闻有何看法?
So it's another week and another big deal that OpenAI is signing with a major cloud player. What did you make of the news this morning?
确实。这对AWS无疑是利好消息,毕竟过去几个季度其增速一直落后于谷歌云和微软Azure。此前业界认为AWS可能错过了这波AI云服务支出热潮。但我们可以看到OpenAI非常渴望获取任何可能的算力协议。虽然这笔交易规模小于OpenAI未来对甲骨文和微软的承诺投入,但它确实让AWS重新加入战局,未来几年将有力提振其AI云业务。
Yeah. So this is definitely good news for AWS, which has been growing at a slower rate than Google Cloud and Microsoft Azure in the last few quarters. There was a perception that AWS might have been missing out on some of this AI cloud spending bonanza. But we've seen that OpenAI is really eager to sign basically any compute deal it can get its hands on. And this is definitely a smaller deal than what OpenAI has committed to spend on Oracle or on Microsoft in the coming years, but it does get AWS back into this conversation and will help boost their AI cloud business in the coming years.
我注意到这次公告中完全没有提及Trainium——这是AWS正在研发的芯片。而就在上周的财报电话会上,高管们对这个芯片还滔滔不绝。正是在那场电话会上,AWS展示了他们实际已加速了增长率。你对Trainium未被提及怎么看?我感觉这事可能后续会有进展。
One of the things that I noticed in the announcement, there was no discussion of Trainium, which is the chip that AWS is working on. And this, of course, is also the chip that last week executives couldn't stop talking about on the earnings call. That was the same call where AWS showed that they have actually been able to accelerate their growth rate. What did you make of the fact that Trainium was not in the conversation? I feel like maybe this comes up later on.
他们只是还没准备好公布而已。我也不清楚这意味着什么。
They just weren't ready to announce it yet. I don't know what to make of it.
没错。这份公告确实如你所说,明确表示OpenAI将在AWS数据中心使用英伟达芯片。我认为这基本说明在GPU领域,特别是训练和推理方面,英伟达仍处于技术前沿。而亚马逊虽然一直在大力推广自研芯片,但至今成效参差不齐。
Yeah. So this announcement, you're right, specifically said OpenAI will be using NVIDIA chips in AWS data centers. I think it basically just shows that NVIDIA is still state of the art when it comes to GPUs for things like training and inference. And Amazon has been trying really hard to push its own in-house chips, with mixed results so far.
OpenAI坚持选择英伟达芯片的事实,或许反映了AWS在说服大型实验室规模化使用Trainium时遇到的挑战。
I think that the fact that OpenAI is opting to stick with NVIDIA chips maybe speaks to some of the challenges that AWS has seen convincing these big labs to use Trainium at scale.
确实。我想转个话题,聊聊你今天早上发表的关于企业如何应对AI智能体的报道。你提到各公司正竞相将智能体集成到工作流中,但现实是许多AI智能体在实际工作中的表现不尽如人意。这是为什么?问题出在哪里?
Right. Well, I want to turn to a story that you published this morning about how enterprises are dealing with AI agents. You looked at how companies are racing to integrate agents into workflows, but the reality is that many AI agents are underperforming in real life on the job. Why is that? What's going wrong here?
是的,我们正处在一个非常有趣的阶段。AI无疑正在改变很多工作方式,许多人也从聊天机器人或AI编程工具中获得了个人效率提升。但我采访过不少尝试部署自主AI智能体接管客服等工作的客户,他们对目前看到的效果仍存顾虑。比如博世电动工具部门曾想开发能解释不同电动工具说明书的AI智能体,却始终无法投入生产,因为它总是幻觉出可能导致用户受伤的错误信息。
Yeah. So we're in this really interesting place where AI is undeniably changing a lot of how work is done, and a lot of people are getting personal productivity benefits from things like chatbots or AI coding tools. But I have spoken to a lot of customers who have tried to deploy autonomous AI agents to take over things like customer service, and they're still a little bit concerned by what they're seeing. For example, I spoke to somebody at Bosch Power Tools who wanted to build an AI agent to explain the user manuals of different power tools, but they haven't been able to put that into production because it kept hallucinating wrong information that could result in a customer getting hurt.
因此我们看到AI实验室正在投入更多资源帮助客户配置智能体,因为这可能比最初设想的要困难得多。
So as a result, we're seeing AI labs actually devote more resources to helping these customers configure agents because I think it can be harder than they thought originally.
这就是实验室将人才输送给企业的构想,基本上就是在说:'来,把我们的人带走。让我们教你们如何配置这些智能体。'我猜这肯定会影响到这些公司的盈利能力,对吧?
And so this is the idea where the labs will actually give companies talent, essentially saying, hey, take our people. Let us show you how to configure these agents. I imagine that has to take a hit on profitability for these companies, no?
没错。过去一年我们看到OpenAI经常这么做,现在Anthropic也加入进来,甚至像AWS这样的云服务商也开始加大力度帮助客户配置智能体。目前Anthropic和OpenAI其实并不太关注盈利问题,所以他们可以投入大量资源确保这些试点项目真正落地见效。我认为这就是当前正在上演的战略。
Yeah. I mean, I think we've seen OpenAI do this a lot over the past year, and now we're seeing Anthropic, but also cloud firms like AWS, start to step up their efforts to help customers configure agents. And Anthropic and OpenAI for now are not really concerned with profitability, so they can pour a lot of resources into making sure that these pilots actually get off the ground and work effectively. And I think that's the strategy that we're seeing play out right now.
你采访的人对这些问题达成共识了吗?比如这些智能体会彻底取代人类工作吗?它们能否在某些方面完全自主运行?还是说仍然需要人类参与监督?
What was the consensus from the people you talked to around whether or not these agents will make work completely obsolete or whether or not they will be completely autonomous in some ways or whether or not they will still need to have a human in the loop?
大多数和我交流过的软件公司高管、CEO和创始人都认为,距离AI智能体真正能替代工作岗位至少还需要几年时间。这很有趣,因为今年早些时候,像OpenAI的首席产品官还在宣称今年将是'智能体之年'。但企业现在开始面对残酷现实——很多智能体还无法完全自主工作,必须有人类核查它们的输出。
You know, most of the software executives, CEOs, and founders I spoke to now feel like it's gonna be at least a couple years before AI agents can truly automate jobs. Which is interesting, because earlier this year you had folks like OpenAI's chief product officer saying it would be the year of agents. But I think companies are starting to run into the harsh reality that a lot of these agents are not quite ready to work completely autonomously without having a human essentially check their work.
确实。这个故事一直在发展变化。Erin,感谢你的分享。这位是我们The Information新闻部的Aaron Holmes。我们之前在这个节目里讨论过某些公司想在太空建造数据中心的雄心壮志。
Right. Well, it's a story that is always evolving. So, Aaron, I want to thank you for coming on. That is Aaron Holmes from our newsroom here at The Information. We've talked before on this show about the ambitions of some companies to build data centers in outer space.
上周末,处于这个梦想核心位置的公司Star Cloud迈出了实现愿景的第一步。这家卫星公司周日通过SpaceX火箭将英伟达H100 GPU送入了太空。现在有请Star Cloud联合创始人兼CEO Philip Johnson为我们详细介绍这次发射以及公司下一步计划。Philip,周一快乐。对Star Cloud来说今天应该是个大日子吧?
This weekend, one company at the center of that ambition, Star Cloud, took early steps to making that dream a reality. On Sunday, the satellite company launched an NVIDIA H100 GPU onboard a SpaceX rocket and sent it into outer space. I want to bring on Philip Johnston, co-founder and CEO of Star Cloud, to tell us more about the launch and what's next for the company. Philip, happy Monday to you. It's a great Monday for all things Star Cloud, I imagine.
非常感谢邀请。是的,这个周末简直太棒了。
Thanks so much for having me on. Yeah. No. It's been an awesome weekend.
周末太棒了。跟我们说说发射的情况吧。详细讲讲整个过程。具体是什么时候发射的?
Awesome weekend. So tell us about the launch. Just walk us through it. What time did it happen?
是的。发射时间是11月2日,也就是周日凌晨非常早的时候。
Yeah. So launch was November 2, so Sunday, very early morning.
好的。
Okay.
我们有一群人在卡纳维拉尔角。我现在还在这里观看发射。发射后大约一小时与航天器分离,然后在大约十二小时后我们收到了第一批遥测数据。所有的
We had a bunch of us down at Cape Canaveral to watch the launch. I'm still down here at Cape Canaveral right now. So, yeah, we separated from the spacecraft about an hour after launch, and then we got the first telemetry back about twelve hours after that. And all of the
第一批遥测数据回来了。能不能给我们这些不懂航天术语的人解释一下这是什么意思?
First telemetry back. Translate that for us, for those of us who don't know space speak.
那基本上就是我们经过地面站的时候。这让我们能建立无线电连接,然后它会告诉我们卫星是否正常?太阳能板是否展开?电池是否充电?我们是否按预期接收数据?
That's essentially when we pass over a ground station. It allows us to form a radio link, and then that will tell us, okay, is the satellite healthy? Have the solar panels deployed? Is the battery charged? Are we receiving data as we should be?
所以,是的,在那之前是一个非常紧张的时刻。据我所知,大约一半的初创卫星首次发射时都无法建立这种首次联系。所以,是的,那是个相当令人紧张的时刻。
So, yeah, it's a very nerve-racking moment up until that point. I think about half of all first-time startup satellites don't ever make this first contact. So it's a pretty nerve-racking moment.
那么这家公司成立还不到两年对吧?
And the company is, what, less than two years old right now?
对,对。我们是去年初成立的,也就是24年。
Yeah. Yeah. We started early last year, so '24.
好的。首先,你们发射了多少个GPU?
Okay. First of all, how many GPUs did you launch?
上面有五个GPU,但其中两个特别值得关注:一个是英伟达的H100,另一个是英伟达的A6000。
So it has five, but the two that are interesting are one H100 from NVIDIA and one A6000, also from NVIDIA.
明白了。现在那颗芯片在上面具体执行什么任务?
Okay. And what is the chip doing up there now?
目前我们尚未正式启用它。大约还需要一个月时间降低轨道高度才能开始部署卫星。它的主要任务是处理其他卫星影像的高性能推理运算,并成为首颗在太空进行模型训练和微调的卫星。我们将运行谷歌云的Gemini版本,这本质上是个验证项目——证明像H100这样的地面数据中心级高性能GPU能在太空运行。
Well, right now, we still haven't commissioned it. It will take about a month before we lower altitude to the point where we're gonna be commissioning the spacecraft. But what it will be doing is running high-powered inference on imagery from other satellites, as well as being the first to do things like training a model in space and the first to do fine-tuning of a model in space. We're gonna run a version of Gemini from Google Cloud. So this is really a demonstrator to prove that you can run high-powered, terrestrial data center grade GPUs like H100s in space.
所以你们实际上是在用它处理现有其他卫星的AI工作负载?你们如何应对太空卫星特有的问题?比如太空碎片、温度控制——虽然外太空听起来很冷,但如何保持设备冷却显然是个挑战。
So you're actually using it to power some of the AI workloads from the other satellites that are up there right now. How do you deal with some of these issues with having satellites in space at all? Stuff like space debris. Temperature control is obviously an issue; you've got to figure out how to keep the thing cold, even though it is outer space, and people think of it as cold.
实际上这个问题比那要复杂一些。
It's actually a bit of a more complicated issue than that.
是的,确实如此。关于太空碎片和微流星体这类问题,对我们来说,主要是要避开太空最拥挤的区域,也就是那些在400到800公里高度飞行的其他卫星所在区域。最初几颗卫星实际上飞得很低,只有380公里高度,这被称为极低地球轨道。
Yeah. That's true. So for space debris and micrometeorites and things like this, for us, we're gonna be avoiding the most congested parts of space, which is where all of the other satellites are flying, between about 400 to 800 kilometers altitude. So the first few are flying actually very low, at 380 kilometers. It's what's called very low Earth orbit.
那个高度具有自清洁特性,因为上层大气阻力较小,几个月内就能让任何东西脱离轨道。所以那些轨道非常非常干净。至于后续版本的卫星,我们会在更高的约1200公里高度飞行,这样我们就能始终处于阳光照射下。而且那里飞行的人也不多。
You have a self-cleaning property at that altitude, because you have low levels of drag from the upper atmosphere, and that deorbits anything within a few months anyway. So those orbits are very, very clean. And then for the later versions of the satellite, we'll be flying quite a bit higher, around 1,200 kilometers altitude, so that we are always in the sun. Also, not too many people want to fly there.
所以风险不算太大。但你始终要面对来自轨道外的微流星体风险。因此我们需要在最敏感的部件上加装防护。
So not too much. But you always have the risk of micrometeorites coming in from outside orbit. And for that, we need shielding on the most sensitive parts.
比如说芯片。而对于其他部件,我们允许它们随时间逐渐老化。比如太阳能板,基本上就是让微流星体直接穿透过去。对吧。
So, for example, the chips. And then we allow a certain degradation over time for other parts. For example, the solar panels: you just allow micrometeorites to pass through them, essentially. Right.
谈谈业务方面的事。抱歉,你继续。
Talk to me about the business. Sorry, go ahead.
还有一个关于冷却的部分,
There was one part on cooling, which is
冷却系统。没错。
Cooling. Right.
是的。我们需要非常大型的可展开散热器。这就是我们在星云公司开发的核心技术——这些低成本、低质量的大型可展开散热器,专门用于解决散热问题。
Yeah. We need very large deployable radiators. That's the core IP we're developing at Star Cloud: these enormous, low-cost, low-mass deployable radiators, just to get rid of this heat.
明白了。跟我聊聊这里的商业模式。你们是向客户出售计算资源吗?显然目前太空中的工作量还很小,但已经有公司表示愿意购买你们在太空提供的计算能力了吗?星云公司作为企业的商业模式是怎样的?
Right. Talk to me about the business model here. Are you guys selling compute to customers? Obviously, this is a very small workload up there right now, but do you have companies saying, hey, we will buy that compute when you get it to outer space? What does the business model look like for Star Cloud as a company?
谁在给你们付钱?
Who's paying you?
没错。明年十月发射的第二颗卫星,其发电量大约是首颗卫星的100倍,计算能力是10倍。我们主要面向国防部客户销售。你可能听说过特朗普推动的那个"金穹顶"计划。
Yeah. For sure. So the second satellite, launching in October of next year, has about 100 times the power generation of the first one and ten times the compute. We're gonna be selling mainly to DOD customers. So you might have come across this Golden Dome that Trump is orchestrating.
这本质上是个导弹防御系统。我不能透露太多细节,但可以想象,在轨部署高性能计算——准确说是高性能推理能力——会非常有用。比如能实时对采集的图像进行分析推理。假设你想知道某地是否发射了导弹,肯定不愿像现在这样必须等待地面站处理。
It's basically a missile defense system. I can't speak too much about it, but, for example, you can imagine it would be useful to have high-powered compute (apologies, high-powered inference) on orbit, in order to be able to do things like running inference on the imagery that you're collecting. For example, if you were to want to know, has a missile been launched in this location? You don't wanna have to wait for a ground station, which is what they currently do.
你想把数据立即传送到太空给我们这样的人,我们就能实时提供分析结果。
You wanna ship that data immediately in space to somebody like us, and we can then provide an insight in real time.
我得告诉你,菲尔,我们节目已经邀请过数百位嘉宾。我们还算是个新节目。请过数百位嘉宾,但从没人打过喷嚏。在直播中打喷嚏...连我自己都没干过这事。
I've gotta tell you, Phil, we've had hundreds of guests on the show. We're still a new show, but we've had hundreds of guests on, and we've never had anyone sneeze. Sneezing on air, even I haven't done it.
我整天都在打喷嚏。所以祝你健康。我该说'保佑你'。好吧,继续讲。
I'm sneezing all the time. So, bless you, I should say. Well, go on.
你得快速反应。
You had to hit fast.
好的。那么你们正在与国防部合作。这些合同具体是怎样的?我甚至不知道这类服务的定价机制会是怎样。
Okay. So you're working with DOD. What do these contracts look like? I don't even know how pricing would work for something like this.
最终阶段我们会采用类似云服务商的定价模式,比如CoreWeave和Lambda向超大规模客户出售算力的方式。基本上就是按GPU分钟数计费。短期来看,国防部有很多研发资助项目,我们肯定会争取这类机会。那更像是基于里程碑的资助模式。
So in the end state, we'll be pricing ourselves very similar to how cloud providers like CoreWeave and Lambda sell compute to the hyperscalers. So it's basically dollars per minute of GPU time. That's the end state. In the short term, the DOD has a lot of programs to fund R&D, and we'll certainly be going after some of those opportunities. That's really more of a milestone-based opportunity.
没错。这是项重大成就,也是我们节目中最引人入胜的故事之一。菲利普,感谢你来做客。这位是Star Cloud公司的CEO菲利普·约翰斯顿。谢谢。
Right. Well, it's a big accomplishment, and it's certainly one of the more fascinating stories we've had on the show. Philip, I wanna thank you for coming on. That is Philip Johnston, CEO of Star Cloud. Thanks.
上周,Alphabet公布季度财报时,谷歌云交出了一份亮眼的成绩单。该部门增长率加速至34%,运营利润大幅跃升。当然,Alphabet也大幅增加了资本支出,以保持其在人工智能领域的竞争力。今天我们邀请到谷歌云全球生成式AI市场推广副总裁Oliver Parker,来谈谈他团队此刻的动向。Oliver,欢迎来到节目。
Last week, Google Cloud put up some impressive numbers as Alphabet reported quarterly results. The division accelerated its growth rate to 34% and operating profit jumped dramatically. Of course, Alphabet is also upping its CapEx significantly to stay competitive with its AI footprint. I want to bring on Oliver Parker, Vice President, Global Generative AI Go to Market at Google Cloud, to talk about this moment for his group. Oliver, welcome to the show.
很高兴你能来。
It's great to have you.
早上好,Akash。很高兴见到你。
Good morning, Akash. Good to meet you.
这对谷歌云来说是个激动人心的一周。你们刚公布了强劲的盈利数据。能否简单概述下你负责的工作组合架构?因为你的头衔很长,我们有时只想了解你具体负责哪些领域。
So this is kind of an exciting week for Google Cloud. You're coming off some big earnings, obviously. Can you give us a little bit of an overview here? How do you structure the portfolio of work that you oversee? Because you've got a really long title, and sometimes we just want to translate what you actually oversee.
当然。我主要负责云部门相关的AI市场推广工作,包括与公司其他部门的广泛合作。我们直接开展的工作涉及模型开发、开发者平台建设,以及我们重点布局的新领域。几周前我们刚推出了Gemini企业版。
Sure. So I oversee our AI go-to-market as it relates to the cloud division. So obviously, a lot of partnership across other parts of the company. It's really what we're doing directly with our models, the platforms for developers, as well as new areas that we're focused on. We made a launch a couple of weeks ago with Gemini Enterprise.
这个全栈解决方案已成为我们的关键差异化优势,许多客户正就此与我们展开合作。是的,这就是我的职责范围——我们在云部门围绕AI开展的所有工作。
And the full stack, which has obviously been a really big differentiator for us and where many of the customers are actually partnering with us. So yeah, that's my responsibility: really what we're doing in the cloud division around AI.
稍后我们会再讨论公司战略。但今早看到OpenAI与AWS签署大单的新闻,你认为这对谷歌云是机遇还是威胁?
And so we're going to come back to the company strategy in a minute, but we saw the news this morning, OpenAI and AWS signing a big deal together. Do you see that deal as an opportunity for Google Cloud or a threat to Google Cloud?
嗯,你看,我想我可能不愿意评论AWS和OpenAI的动向,这显然可以理解。但我们确实看到市场上更广泛的趋势,以及OpenAI与我们合作发布的一些公告,主要是关于他们寻求的计算能力。再次强调,我认为这反映了整个行业对计算能力和算力的广泛需求,以及人们试图解决模型分发的问题。显然,我们在这一领域具有独特优势——我们不仅支持分发许多其他公司的模型,也包括我们自己的模型,同时长期以来我们还在自有硅基平台方面投入了大量资源。
Well, look, I probably wouldn't want to comment on what AWS and OpenAI are doing, obviously, as you can imagine. But I think we've seen, more broadly in the market, and in some announcements that OpenAI have been making with ourselves as well, just in terms of the compute capacity that they're seeking. So again, I think this is just part of a broader industry requirement around compute and power, and really people trying to figure out distribution of their models. And obviously, we have a unique position there, where we support distribution of many other people's models as well as our own models, and we've been heavily invested for a long period of time in our own silicon with the platform too.
那么今天的交易与你们和OpenAI的合作相比如何?
And how does the deal today compare to the work that you guys are doing with OpenAI?
我们并未公开与OpenAI合作的具体财务细节。不过几周前我们确实公布了与Anthropic合作的财务数据,你可能已经看到了。但如果你纵观整个行业,OpenAI在计算需求方面的动作其实是相当公开透明的,Satya和Sam也讨论过很多相关内容。或许我们可以多谈谈我们自己的布局,希望这些能引起你的兴趣。
We didn't share the financials on our partnership with OpenAI. We obviously did share financials around some of the work that we're doing with Anthropic a couple of weeks ago; you've probably seen that. But if you look broadly at the industry, what OpenAI is doing in terms of the compute requirements that they need has been pretty open right now, and there's lots of stuff that Satya and Sam have been talking about too. But maybe it'd be good to talk a little bit more about what we're doing, and hopefully that's of interest to you.
我想重点讨论这个。上周财报会议上TPU是个热议话题。我的问题是:这是否为谷歌打造了AI领域端到端解决方案的优势,特别是在向企业销售工作负载方面?当企业客户接触谷歌云时,他们会直接要求'我们希望这部分工作负载跑在TPU而非GPU上'吗?还是需要你们去说服他们?
So I want to talk about that. TPUs were a big point of discussion on the earnings results last week. And one of the questions I had for you: this sort of sets up Google nicely for an end-to-end playbook as it relates to AI and selling workloads to enterprises. When enterprises are coming to you and to Google Cloud broadly, are they asking you specifically, hey, we want this proportion of our workload to be run on TPUs, not on GPUs? Or is that something that you have to convince them on?
这类讨论的具体情形是怎样的?
What does the dynamic of that discussion look like?
我认为这完全取决于客户类型。有些极具前瞻性的数字企业——特别是那些试图深入技术栈底层的初创公司,也有更传统的企业希望我们帮他们处理复杂性。这正是我们的独特价值——我们在产品组合的所有领域都建立了合作关系。根据企业需求,无论是希望更贴近硅基硬件和加速器平台,还是开发者想通过Vertex等平台选用我们的模型(包括Anthropic等第三方模型),甚至是几周前推出的Gemini Enterprise——正如Sundar在上周财报中提到的,我们将其视为企业级AI的主要入口。
Look, I think it really depends on the kind of customer that you're talking about, right? You have some of the very progressive digital companies, especially some of the startups that are looking to build lower down the stack. And then you have more traditional enterprises that are really looking for us to manage a lot of that complexity. I think that's honestly the uniqueness of what we offer here: we have partnerships in all the areas of our portfolio. And again, depending on the company, whether they want to work closer to the silicon, to the hardware, and to our accelerator platforms, right the way through to developers that are looking to leverage platforms like Vertex for choosing our models as well as anyone else's models, including companies like Anthropic, right the way through to a launch that we made a few weeks ago, which Sundar referenced on our earnings last week, around things like Gemini Enterprise, which we see as sort of the front door for AI at a business level.
归根结底,根据客户所处阶段、他们的建设方向以及合作方式,我们能够在其技术栈的各个环节提供多样选择,对此我们充满信心。
And again, really depending on where the clients are at and what they're building and how they want to have a partnership, we feel very good about being able to offer them choice and optionality at all areas of that stack.
关于TPU及其作为产品组成部分的问题,您如何向客户推荐使用它而非英伟达GPU等替代方案?
And as it relates to the TPU and having that as part of the offering, how do you make the case to customers to use that as opposed to something like NVIDIA's GPUs?
我认为,如果关注我们公开披露的TPU相关信息,以及几周前与Anthropic共同宣布的百万级TPU访问计划,同时观察Gemini在训练和推理方面的应用——Gemini的大量服务与推理正是通过我们的TPU平台运行。除了这种选择性优势外,更值得关注的是我们因此能够为这些模型提供卓越性能、优异成本和低延迟保障。Gemini系列的加速发展很大程度上得益于TPU平台在推理和训练环节的双重支撑。
Well, I think if you look at what we've been very public about as it relates to TPUs, and, again, the announcement with Anthropic that was shared a couple of weeks ago, in terms of them having access to up to a million TPUs, also look at what we do with Gemini from a training as well as an inference standpoint. A lot of the serving and inference for Gemini is running through our TPU platform. And outside of that option of choice, I think what's interesting is that it puts us in a position to provide great performance, great cost, and great latency around those models. So we're seeing a significant acceleration of our Gemini family, and a big part of that is really due to it running, from an inference standpoint, as well as being trained, on our TPU platform as
当然也包括GPU。考虑到谷歌云当前所有的AI工作,其中有多少是依赖于基于使用量的定价模式?
well as obviously GPUs. When you think about all of the AI work that is happening at Google Cloud right now, how much of that is dependent on or based on usage based pricing right now?
我们提供多样化的商业化方案:从直接使用token的用户到通过产品消耗token以优化预算运营的客户。实际上我们根据不同客户需求,在多个产品线上提供灵活选择。这个领域正在持续演进,我们将确保始终引领创新,为客户提供更多选择和灵活性。
Well, we have different monetization, different commercial packages, for people that are using tokens directly, right the way through to people that are using products that consume tokens, to really alleviate budgets and how they operate. So we're providing flexibility across multiple different products, really depending on what clients want. And that's a space that I think we'll see continually evolving, and we're going to look to make sure that we are leading in that space, to be able to offer optionality and flexibility to many customers.
您提到的灵活性具体指什么?您如何看待基于结果的定价、使用量定价和固定定价模式之间的比例变化趋势?
And when you say flexibility, how do you see that shift changing over time, that mix between outcome based pricing, usage based pricing, and then more fixed pricing models?
token成本定价已通过API服务验证,这也是其他云服务商的常见做法。同时我们推出了更侧重AI的产品线,采用类似传统SaaS行业按用户/月收费的模式。在客服AI等领域,我们还提供基于成效的定价方案(如呼叫转移场景)。我们将持续创新以确保领先地位,让AI价值在采购决策中清晰显现。这些讨论仍在进行中,事实上我们已取得显著进展——AI正开始为众多合作企业创造实质价值。
Well, obviously, the cost of tokens has been a well-tried-and-tested approach with our APIs, as it has for many of the other frontier providers and hyperscalers. But we also now have products that are very AI-centric, which are looking more at the per-user-per-month approach that the traditional SaaS industry has had for a while. And then we also have certain outcome-based capabilities around things like call deflection, when you start thinking about AI for customer service. So I think we're going to continue to experiment and make sure that we lead in that category, so that the value of AI becomes very apparent in terms of how people buy. Those conversations are ongoing, I would say, and we've actually done really well here, because I think we've proven a lot of our AI is actually now starting to make a difference across many of the enterprises that we're working with.
好的。感谢奥利弗的分享,这位是谷歌云的奥利弗·帕克。ServiceNow在企业软件领域已被视为AI应用的标杆企业。
Right. Well, Oliver, I want to thank you for joining us. That was Oliver Parker from Google Cloud. Okay. ServiceNow has in many ways been seen as a bit of an AI darling in the enterprise software world.
该公司过去几年每季度的营收增长率始终保持在20%左右。它不仅拥有非常强劲的自由现金流利润率,同时还是业内最大的SaaS企业之一,这意味着它必须快速适应人工智能时代。今天我们邀请到ServiceNow副董事长Nick Tzitzon,来谈谈他的团队将如何引领公司发展。Nick,欢迎来到节目,很高兴你能来。
The company has consistently grown its top line at or around 20% every quarter for the past few years. It also boasts some very strong free cash flow margins, and it is one of the biggest SaaS companies in town, which means it is also having to adapt quickly to the AI era. I want to bring on Nick Tzitzon, Vice Chairman at ServiceNow, to talk about where his team is taking the company. Nick, welcome to the show. It's great to have you.
Akash,很高兴见到你。正如他们所说,我是节目的常看观众,却是第一次打电话参与。
Akash, great to see you. Frequent viewer, first-time caller, as they say.
我们期待未来岁月里能越来越多地邀请你参与节目。我想聊聊当前企业软件行业的整体氛围。你们上周刚发布财报,我想我们都看到了结果。但总的来说,我想先退一步观察——你看,纳斯达克指数今年是上涨的。
Well, we look forward to having you on more and more as the years go on here. I want to talk about the general vibe right now in enterprise software. You guys obviously reported earnings last week and I think we saw the results there. But broadly speaking, I just want to take a step back here. I mean, look, the Nasdaq is up this year.
ServiceNow的股价今年却下跌了。所有企业软件股票都在下跌。所以我想了解下你的看法,你认为投资者目前是在对什么做出反应?
ServiceNow stock is down for the year. All enterprise software stocks are down. And so I just want to take your temperature a bit. What do you think investors are reacting to here?
我认为投资者试图厘清行业未来格局是完全正常的。现在有很多噪音。说实话Akash,你和客户交谈时,他们也会告诉你同样的情况——每家企业都在讲AI故事。所以投资者采取观望态度是很自然的。
Well, I think it's perfectly normal that investors try to figure out exactly what the shape of the industry is going to be moving forward. There's a lot of noise. And frankly, Akash, you talk to customers, and they'll tell you the same thing: everybody has an AI story. So I think it's pretty natural that investors are taking a bit more of a wait-and-see approach.
就ServiceNow而言,正如Bill McDermott所说,企业AI领域是一个全新且不同的赛道,相比传统SaaS领域,对我们这样的公司更为有利。因此我们非常乐观能够继续保持你在我们最新财报中看到的那种超额表现。
What I will say for ServiceNow, and Bill McDermott has said this, is that the enterprise AI neighborhood is a different and new neighborhood, and much more advantaged for a company like ServiceNow than the traditional SaaS neighborhood. So we're very optimistic that we can continue the kind of overperformance that you saw in our recent earnings.
你们设定了一些宏伟目标:今年底实现5亿美元的年度合同价值,明年达到10亿美元。目前你们看到企业采用AI面临哪些挑战?因为尽管有这些数字,我们在《信息报》的报道中指出,对企业来说这是个缓慢而稳健的过程。当前企业在AI应用上必须克服的最大障碍是什么?
Now, you've put up some big numbers: $500 million in ACV, annual contract value, by the end of this year, and $1 billion by the end of next year. What are the challenges that you're seeing to enterprise adoption right now? Because even despite those numbers, we've been writing here at The Information about how it is a slow and steady process for enterprises. What are the biggest hurdles that enterprises have to overcome right now with AI?
我认为这就是我之前描述的,Akash。每个企业的技术领导者都面临着汹涌而来的挑战,因为坦白说,那些非天生AI玩家或没有实质AI战略、只有口号式AI战略的公司。所以人们正在试图理清头绪——他们正在思考:如果要为未来五到十年构建参考架构,这个架构必将与过去十年截然不同。差异究竟在哪里?
I think it's what I described earlier, Akash. Every technology leader in every enterprise is under a siege of incoming, frankly, from companies that aren't natural AI players, or that have a sound-bite-driven AI strategy and not a content-driven AI strategy. So I think people are trying to figure it out. They're thinking: if I'm going to build a reference architecture for the next five to ten years of my organization, and it's going to look different than the reference architecture from the last ten years, how is it different?
新架构中哪些角色应该保留旧架构的成员?哪些又该被替换?这些都是复杂的议题。对于像ServiceNow这样始终处于技术核心地位、帮助治理各类系统并整合企业资源的公司而言,这些挑战在AI时代只会加剧。这再次证明——对我们和绝大多数客户而言——我们的战略重要性只会提升。
And which players belong in the new one that perhaps were in the old one, and which ones should be replaced? So these are complex conversations. For a company like ServiceNow that has been at the core of the technology estate, helping to govern all the different systems and helping to integrate these enterprises, those exact challenges are only going to be more acute in the AI era. So, again, it's a good sign for us and our strategic relevance for most of our customers, if not all.
那么人员因素呢?几周前Vinod Khosla做客时提到:软件只是问题的一面。如果缺乏合适的一线人员——他认为这正是当前许多企业的现状——他们既不会使用软件,也无法发挥其价值。这是一派观点。
What about the people issue in all this? We had Vinod Khosla on the show a couple weeks ago, and he talked about the idea that the software is one side of the problem. If you don't have the right people on the ground, which in his opinion is the case for a lot of enterprises right now, they don't know how to use the software. They don't know how to make use of it. That's one perspective.
另一派我们常听到的观点是:'我不可能直接替换数百人的整个IT团队'。您在这个问题光谱上持何种立场?目前如何应对?
The other perspective we've been hearing from people is, well, I can't just replace my entire IT team of hundreds of people. Where do you stand on that spectrum of where the issue lies and how are you approaching that right now?
我认为这更多是思维模式而非技能水平的挑战——当然两者兼有。当前存在持续争论:AI是革命性的吗?它会彻底改变组织运作方式?还是渐进式的?意味着我们只会看到传统流程的逐步改进?
Yeah, I think it's more of a mindset challenge than frankly a skills challenge; it's a little bit of both, right? There's this ongoing debate: is AI revolutionary, meaning it's going to change everything about how organizations run? Or is it evolutionary, meaning you're going to see incremental improvements in some of these legacy processes?
那个转折点——革命还是进化——很大程度上取决于人们的思维方式。当我们身处一个崇尚变革的环境,企业文化宣称'我们要拥抱新商业模式、新服务方式、新盈利途径'时,它就会成为革命。确实有许多组织正采取这种态度。但我不认同'可以轻易找到比现有团队更懂AI的新团队'这种观点。
I would tell you that inflection point, whether it's revolutionary or evolutionary, depends in large part on the mindset of people. What do we want it to do? And look, if you have a change-oriented environment and a culture that says we're going to embrace the possibility that this is a new business model, a new way for our business to serve our customers, a new way for us to make money, it's going to be revolutionary. And there are a lot of organizations that are taking that approach. But the notion that you can go out shopping for a new team that's better and more AI-native than the one you have, I don't subscribe to that.
我们ServiceNow认为,市场对技术工作者——对懂软件人才的需求只会增长。AI绝不能成为裁员的代名词,否则我们如何期待人们真正拥抱企业级AI的无限潜力?
I think we at ServiceNow believe that you're going to see an increase in demand for technical workers, for people who understand software. So I don't think this is a scenario where we can let AI become a code word for layoffs. Because if we do that, I don't know how we can expect people to embrace the unbelievable potential of what enterprise AI should really be.
我们来谈谈ServiceNow的公共部门业务。你当然也有政府工作背景。我知道这是你非常关心的一类客户群体。我想听听你对Doge的看法,因为今年早些时候,Doge闹得沸沸扬扬。当时科技新闻的头条基本都是:政府正在削减哪些软件开支?
Let's talk about the public sector business that ServiceNow has. You of course have a background in government too. I know that this is a customer set that is very near and dear to your heart. I want to take your temperature here on DOGE, because earlier in the year, DOGE made a lot of noise. The headline that was dominating the tech news altogether was, you know, what software is the government cutting back on?
政府正在用什么软件来替代零散解决方案?比如将所有功能整合到一个平台上?我记得年初时ServiceNow讨论过,他们说'政府并没有削减我们的预算,我们反而视之为机遇'。最近关于Doge的讨论确实少了很多。当然,现在正值政府停摆期间,整个讨论语境都完全不同了。
What software is the government using to replace point solutions, for example, bringing them all under one hood? With ServiceNow, I recall at the start of the year, the discussion was, hey, they're not cutting back on us; we're actually seeing it as an opportunity. We've started to hear less about DOGE altogether. And of course, it's a very different conversation now that we're talking in the middle of a government shutdown.
但如果我们暂时搁置停摆问题,你观察到政府在软件支出方面呈现怎样的态势?Doge项目整体进展如何?我们已经很久没有听到相关消息了。
But if we just put the shutdown aside for a minute, what is the temperature that you're seeing from the government in terms of their software spend? And what's going on with DOGE altogether? We haven't heard about it in a long time.
确实。这类讨论越来越难脱离政治因素。但如果抛开政治不谈,没人会否认美国政府需要适应和现代化。坦白说,政府现行运作方式难以为继。我们既不能无限制加税,也不能一味削减项目。
Yeah. I mean, it's increasingly hard to take politics out of these kinds of conversations. But if you take the politics out of the conversation, I don't think anybody would deny that the US government needs to adapt and needs to modernize. It's frankly unsustainable for government to run the way that it has. We don't have unlimited power to just continue to raise taxes or to cut programs.
因此必须为这些机构找到更高效的运作方式。这个观点应该没有争议。Akash,我认为许多政府工作人员都愿意参与这种变革讨论。他们希望成为积极变革的一部分,提供更好的公民服务。像ServiceNow这样的平台在政府部署才刚起步,但我们接触的政府客户都坚信我们将是AI未来100%的组成部分。
So we have to find more efficient ways for these agencies to run. I don't think that's a controversial statement. And I think many of the people, Akash, who work in government are perfectly up for that conversation. They want to be part of a change, a change that's positive, that provides better citizen service. So when you look at a platform like ServiceNow, which is only in the early days of its deployment in government, I think the government customers that we talk to believe that we are 100% part of the AI future.
他们认为缺乏系统整合是重大挑战。那些十年前、二十年前开发但已无人使用的自制解决方案,正是问题所在。不管称之为Doge还是现代化,政府确实渴望变革。对我们而言,这正是我们平台大展身手的绝佳机遇。
They believe that lack of integration is a big challenge for them. They believe homegrown solutions that maybe were built ten or twenty years ago, that people aren't using, are part of the problem they have. So whether you call it DOGE or modernization or whatever you call it, government wants to change. And for us, we see that as a massive tailwind for what our platform can do.
让我最后问一个关于AI故事中芯片环节的问题。我们在节目中多次讨论过英伟达及其替代方案。上周看到谷歌TPU在财报中表现亮眼,AWS显然也有自研Trainium芯片。今早还看到个大新闻。
Let me ask you one last question here about the chips angle to the AI story. We have talked a lot on this show about NVIDIA and alternatives to NVIDIA. We saw in Google's earnings last week that its TPU was getting a lot of traction. We were talking about how AWS obviously has their Trainium chip. And we saw a big deal this morning.
虽然不涉及Trainium,但我们今早看到了AWS与OpenAI的交易。你们与所有超大规模云服务商都有合作,这是ServiceNow的一大价值主张。目前ServiceNow是否仅使用NVIDIA芯片?你们有使用Trainium或TPU吗?
Although it didn't involve Trainium, we saw the deal between AWS and OpenAI this morning. You work with all the hyperscalers; that is one of the big value props for ServiceNow. Are you only using NVIDIA chips right now at ServiceNow? Are you using Trainium, TPUs?
你们对此持什么立场?
Where do you stand on that?
听着,NVIDIA是ServiceNow的重要合作伙伴。我们与NVIDIA合作多年,早在AI热潮兴起前就长期基于NVIDIA技术构建和训练模型。但我们的原则是满足客户的一切需求,我们拥有天然异构的技术栈。必须强调的是,NVIDIA确实是我们的优先合作伙伴。
So look, NVIDIA is a terrific partner of ServiceNow. We've been partnered with NVIDIA for years, and we were building and training models on NVIDIA technology long before it became fashionable. I will say that our orientation is to do whatever our customers need us to do; we have an inherently heterogeneous tech stack. But I would be remiss not to say that NVIDIA is a priority partner.
他们是伟大的盟友,其技术对我们未来持续增强架构的规划至关重要。但我们始终保持开放平台理念,致力于技术栈多元化。因此大家都有充分机会,不过我们确实为与NVIDIA的合作感到无比自豪。我们密切关注了黄仁勋的主题演讲,也很感激他提到ServiceNow,我们对此报以最大的善意。
So they're great friends, and their technology plays a major role in how we think about continuing to enhance our architecture for the future. But we're also open; we've always been an open platform, and we always look to diversify the stack. So there's plenty of opportunity for everybody, but we remain super proud of the NVIDIA partnership. We watched Jensen's keynote with great interest, we appreciate the reference he made to ServiceNow, and we repay that with full kindness.
那么你们当前是否有在使用Trainium、TPU或谷歌/AWS的芯片处理任何工作负载?
So are you using Trainium or TPUs, Google or AWS chips, on any of your workloads right now?
可以说我们的技术栈是异构的,但NVIDIA仍是我们最重要的合作伙伴。
Let's just say that it's a heterogeneous stack, but NVIDIA remains the most important partner we have.
好的。非常感谢Nick今天的分享。这位是ServiceNow副董事长Nick Tzitzon。好的。
Right. Great. Well, Nick, I want to thank you for coming on. That was Nick Tzitzon, Vice Chairman at ServiceNow. Okay.
上周在我们的WTF峰会上,我们邀请了Tubi CEO安贾莉·苏德登台探讨流媒体行业的动态。就在她演讲后的第二天,福克斯公司财报显示其免费广告支持的流媒体服务已实现盈利。因此我想请Tubi CEO安贾莉·苏德来到节目,详细聊聊她所运营的流媒体服务的发展轨迹。安贾莉,欢迎来到节目,很高兴你能来。
Last week at our WTF Summit, we welcomed Tubi CEO Anjali Sud to the stage to talk about the dynamics in the streaming sector. Just a day after she spoke at our summit, Fox Corp announced in its earnings that the free, ad-supported streaming service had turned profitable. So I want to bring Tubi CEO Anjali Sud onto the show to talk more about the trajectory of the streaming service that she's running. Anjali, welcome to the show. It's great to have you.
谢谢邀请我,阿卡什。
Thanks for having me Akash.
你在Tubi任职大约两年了,现在流媒体服务刚刚实现盈利。你们是如何做到的?能否详细介绍一下成功策略和实现过程?
So you've been at Tubi for around two years now, and the streaming service just turned profitable. How did you do it? Walk us through the playbook here and how it happened.
确实。我认为Tubi在流媒体领域恰逢其时地找到了正确的商业模式,并具备规模优势。我们是完全免费的流媒体平台,对消费者100%免费,没有付费层级,完全依靠广告支持。我们发现在当前流媒体环境中,随着价格上涨、市场愈发碎片化和使用门槛提高,消费者——尤其是年轻群体——正明显倾向于免费服务。只要无需付费,他们愿意接受广告这种价值交换模式。
Yeah. I mean, look, I think Tubi has really found itself with the right business model at the right time in streaming, and with the advantages of scale. We're free streaming: 100% free to consumers, no paid tiers, fully ad supported. And what we've been finding is that in a streaming environment where prices are increasing and there's more fragmentation and friction, consumers, particularly younger consumers, are really just gravitating to free. And they're willing to engage in the value exchange of ads if they don't have to pay.
这种商业模式确实引起了强烈共鸣。我们投入大量资源扩充内容库,目前拥有全球最庞大的影视剧收藏,平台还引入了创作者内容和原创作品。
And so that business model has really been resonating. We've invested a lot in growing our catalog. We have the world's largest collection of movies and TV series. We now have creator content on the platform. We have originals.
随着我们不断提升价值主张,我们也看到了增长势头。
I think as we've improved that value prop, we've also seen momentum.
最后,我们拥有这种飞轮效应,用户、数据和规模越大,效果就越好。目前每月有超过1亿人观看Tubi十亿小时的节目。想象一下,所有这些用户参与度和使用时长,加上我们庞大的长尾内容库,使我们能够提供更个性化的推荐和体验,从而吸引更多广告商的关注和需求,进而让我们能对内容进行再投资,整个飞轮就这样良性运转起来。
And lastly, we just have a flywheel that gets better the more users, data, and scale we have. Over 100,000,000 people are watching a billion hours of Tubi a month now. So you can imagine: if you have all that engagement and all that usage, paired with this long-tail library of content, our ability to deliver better personalized recommendations and experiences drives more advertiser attention and demand, which then allows us to reinvest in our content. It all starts to really move in that flywheel.
所以,令人兴奋的是,我们实现盈利的方式并非通过削减成本,而是通过高效增长和持续提升价值主张。
So what's been exciting is that the way we've gotten to profitability hasn't been by cutting costs; it's been by growing efficiently and improving that value proposition.
所以你们完全不需要削减开支?全是靠营收增长实现的?
So you haven't had to cut the expenses at all? It's all been top line growth?
是的。我们的整体运营费用在增长,尤其是内容投资持续增加。而同期许多其他流媒体平台却在收缩。这确实体现了我们商业模式的势头,以及观众正逐渐向我们靠拢。
Yeah. Our overall operating expenses are growing and our content investment in particular is growing. This is at a time when I think a lot of other streamers are having to pull back. And so it really is sort of just the momentum of the model and the audience kind of gravitating towards us.
我想详细聊聊内容投资,因为你们对如何利用创作者经济、帮助创作者带着自己的影视作品入驻平台一直很公开。作为免费平台,我想了解这类合作的具体结构——当你们与创作者签约,承诺帮助他们制作节目或电影时,是基于他们现有的内容或制作能力。你们是分给他们广告收入分成?还是预付费用?这类合作具体是怎样的?
I wanna talk about that content investment because you guys have been very open about how you are leveraging the creator economy and helping creators get on your platform with their own films and stuff like that. And I wanted to get a sense for how these deals are structured, given that you are a free platform. When you have a creator and you sign a deal with them saying, okay, we'll help you make a show or a movie given the following that you have or the content that you produce, are you giving them a cut of the ad sales? Are you paying them something upfront? What does a deal like that look like?
没错。我们与内容创作者的合作——无论是好莱坞还是创作者经济领域——大多采用广告收入分成模式,这也是我们推崇的方式。这样激励完全一致,双方都能获益。
Yeah. The majority of the deals we do with content creators, whether it's Hollywood or the creator economy, tend to be a share of advertising revenue, which we love. Right? The incentives are totally aligned. We win.
创作者会赢。这种模式实际上具有非常好的扩展性。当我们制作独家内容或原创内容时,你经常会看到稍微不同的模式,但它看起来和其他流媒体或行业内的内容一样标准。所以我认为,真正与众不同的是我们愿意与更广泛多样的故事讲述者合作,以及我们帮助这些故事找到观众的能力。因为我们是长尾平台,你不需要拥有大量观众就能在Tubi上做得很好。
The creator wins. And that actually scales remarkably well. When we do exclusive content or original content, that's when you'll often see a slightly different model, but it looks pretty standard, like anything else you might see in streaming or in the industry. And so I would say the thing that really differentiates Tubi is our willingness to work with a much broader and more diverse set of storytellers, and then our ability to help those stories find an audience. Because we're long tail, you don't have to have a mass audience to actually do quite well on Tubi.
我们擅长找到特定的粉丝群体。你可能是恐怖片爱好者或真实犯罪题材粉丝,我们会帮助这些消费者深入内容。我认为这正是我们独特的优势,效果非常好。这与YouTube在用户生成内容和短视频领域的做法非常相似,但我们是在长视频、电影和电视剧领域实现这一点。
We're good at finding specific fandoms. Maybe you're a horror fan or a true crime fan, and we help those consumers go deep into the content. I think that's actually been a really unique thing that has worked really well for us. It's very similar to what YouTube has done in UGC and short form, but we're really doing it in long form, in movies and TV series.
那么你如何看待这里的竞争?因为你们的目标是年轻一代或Gen Z群体,这些人在社交平台上观看大量创作者经济影响者的内容。你们确实面临来自TikTok和Instagram的竞争。在我们的WTF峰会上,你提到注意力经济是最大的竞争形式。根据报道,TikTok和Instagram实际上正在考虑推出自己的电视应用。
So how do you think about competition here? Because you're targeting the younger demographic, Gen Z, people who we think of as watching a lot of these creator-economy influencers on social platforms. You do have competition from TikTok and from Instagram. On stage at our WTF Summit, you talked about the attention economy being the biggest form of competition for you. And we've seen the reporting that TikTok and Instagram are actually looking to launch their own TV apps.
你们计划如何与他们竞争?
How do you plan to compete against them?
是的。听着,在注意力经济中,任何吸引人们注意力的东西都是竞争对手。所以竞争确实存在。但最终我认为竞争不是坏事,它会迫使你改善粉丝的体验。
Yeah. Listen, attention economy, your competition is anything that takes people's attention. So it is competitive. And ultimately I think competition isn't a bad thing. It forces you to improve the experience for your fans.
但我想说,我们在Tubi花了十多年时间专注于长视频叙事的体验。过去两年我学到的所有经验都告诉我,将多种形式和媒介整合到一个体验中,并提供同样高度优化、令人愉悦的体验是非常非常困难的。因此我认为我们在长视频领域实际上有很大优势。从短视频转向长视频比帮助想要制作长视频的创作者进入我们的生态系统要困难得多。我可以告诉你,我们四个月前才开始将创作者引入平台。
But I would say, you know, we have spent over ten years at Tubi obsessing over the experience of long-form storytelling. And everything I've learned over the last two years, Akash, tells me that it is very, very hard to bring many different formats and mediums into one experience and to offer that same highly optimized, delightful experience. So I think we actually have a pretty big advantage in long form. It's harder, I think, to go from short form to long form than it is, in our case, to help creators who want to do long form come into our ecosystem. And I can tell you, we started bringing creators onto the platform only four months ago.
我们现在拥有来自全球最受欢迎创作者近10,000集内容。你将看到我们增加更多独家原创内容。上周我们宣布了首个与创作者合作的原创电影计划。但我可以告诉你的是,到目前为止,已经有创作者在Tubi上赚得比其他任何平台都多,他们能够实现那些艺术上一直想去且具备能力去做的项目。
We have nearly 10,000 episodes from some of the world's most popular creators now. You're gonna see us add more exclusive and original content; we announced our first original film slate with creators last week. But what I can tell you is that so far, we have creators who are already making more money on Tubi than they have on any other platform. And they're able to do projects in places they have wanted to go artistically and have had the skill sets to go.
但这与你在短视频环境中的体验大不相同,算法优化的方式也截然不同。
But it's been very different than when you're in a short form environment and the way the algorithm is optimized.
在结束前让我快速问两个问题。你们有考虑过推出订阅产品吗?
Let me ask you two quick questions before I let you go. You ever consider launching a subscription product at all?
我们目前没有计划。而且我要告诉你,我们正在全力押注免费模式。我们认为娱乐的未来是免费的,因为消费者将要求零门槛的体验。同时我们也相信,内容创作者希望自己的作品能触达最广泛的受众。
We have no plans. And I will tell you, we are doubling down so much. We think the future of entertainment is free. The reason is because we believe that consumers are going to demand that lack of friction. And we also think that storytellers, they want their stories to reach the widest audience.
所以我们正全力推进免费模式。实际上我们现在就有一个名为'永远免费'的品牌宣传活动。呃不对,是'我们永远'。好吧,就这样。
So we're doubling down on free. I think we actually have a brand campaign out right now called Free Forever. Actually want to- We Forever. Okay. All right.
没有订阅制。我们对此非常坚定。我认为当你拥有优势时,就应该全力投入。
No subscription. We are really leaned in here. And I think when you have strength, you lean into it.
最后再问一个问题。上周我们看到转播权之争再次上演,比如迪士尼与YouTube TV的纠纷。说实话这种转播权冲突屡见不鲜,这次肯定也能解决。
And one question before you go. We've seen sort of the carriage battles, I guess, play out last week. Of course, the Disney YouTube TV dynamic. I mean, look, carriage battles are a dime a dozen really. This one, I'm sure it'll get sorted out.
但你对这次事件怎么看?你认为这反映了行业当前怎样的整体状况?
But what do you make of this one? What do you think it says about the industry as a whole right now?
你知道,这并不那么令人惊讶。我是说,你可能已经注意到,内容争夺战一直相当常见。而且我认为在这种环境下,你会看到更多这样的情况。这是一个竞争更激烈的环境,对注意力的争夺会更加激烈。
You know, it's not that surprising. I mean, as you noted, carriage battles have been pretty common. And I think you're just going to see more of that in this environment. It is a more competitive environment, and there is going to be more of that battle for attention.
因此赌注更高了。另外我认为,对于行业外的人来说有时不太清楚的是,这些交易背后存在着真实的人口结构变化——大多数广播电视和付费电视或有线电视套餐的观众群体往往年龄偏大。而许多服务实际上正试图更吸引年轻观众。所以我认为在未来几年,你可能会看到更多这类争夺和更高风险的谈判上演。
And so the stakes are higher. And then I think the other big thing that sometimes isn't clear to sort of the outsider looking into the industry is you have a real demographic shift that's underlying a lot of these deals, in that, you know, most of broadcast television and pay TV or the cable bundle, you know, those audiences tend to be much older. And yet a lot of services are really trying to move more towards younger audiences. So I think you're just gonna see potentially more sort of battles and higher stakes negotiations play out in the coming years.
好的。安贾莉,感谢你的参与。这位是Tubi的CEO安贾莉·苏德。今天的节目就到这里。提醒一下,我们的直播时间是周一至周五太平洋时间上午10点,东部时间下午1点。
Right. Well, Anjali, I want to thank you for coming on. That is Anjali Sud, CEO of Tubi. Well, that does it for today's show. A reminder, we are on this stream Monday through Friday at ten a.m. Pacific, one p.m. Eastern.
我要感谢亚马逊网络服务,他们是本节目的主要赞助商。
I want to thank Amazon Web Services, who is our presenting sponsor for this production.
我还要感谢各位的收看。我们非常感激你们的观看。我已经对明天的节目充满期待了。祝大家周一愉快。现在先再见啦。
And I want to thank you for tuning in. We really do appreciate your viewership. I'm already excited for our next show tomorrow. Have a great rest of your Monday. Bye-bye for now.
关于 Bayt 播客
Bayt 提供中文+原文双语音频和字幕,帮助你打破语言障碍,轻松听懂全球优质播客。