本集简介
双语字幕
仅展示文本字幕,不包含中文音频;想边听边看,请使用 Bayt 播客 App。
最近怎么样,书呆子们?
What's up, nerds?
是你哥们儿我。
It's your boy.
我是Jared,这里是2025年11月10日星期一的《更新日志》周报。
I'm Jared, and this is Changelog News for the week of Monday, 11/10/2025.
Spencer Chang做了个东西,这东西让我今天特别开心。
So Spencer Chang made a thing, a thing that made my day.
它叫《活着的互联网理论》,论证了互联网永远会充满真实的人在寻找彼此、响应求助,甚至在争吵中也不忘分享欢笑。
It's called the Alive Internet Theory, and it makes the case that the Internet will always be filled with real people looking for each other, answering calls for help, and sharing laughs even in the midst of arguing.
这个网站更适合亲身体验而非描述,所以我就不多说了,你们可以通过简报里的链接自己看看。
This is a website that's better felt than tellt, so I'll just leave you to follow the link in the newsletter.
好的。
Okay.
让我们来看看本周充满活力的新闻。
Let's get into this week's very alive news.
这个新的AI岗位正在爆发式增长。
This new AI role is exploding.
AI相关的岗位流失和未来招聘冻结现在是软件圈的热门话题。
AI-related job losses and future non-hires are the talk of the software town right now.
但至少在短期/近期,随着岗位发布量激增800%(过去九个月数据),一个新的AI主导的技术岗位已经出现。
But at least in the short/near term, a new AI-led tech role has emerged with a massive increase in job postings, up 800% over the last nine months.
引用:'AI竞赛的领跑者如Anthropic和OpenAI正在积极招聘一种叫前线部署工程师(FDE)的软件专家,负责根据客户需求定制AI模型。'
Quote, forerunners in the AI race such as Anthropic and OpenAI are actively recruiting software engineering specialists called forward deployed engineers, FDEs, to help with tailoring AI models to meet customer needs.
这些工程师不仅与后台程序员合作,更会直接嵌入客户和产品工程团队中。
More than just working with back office coders, these engineers are embedded within customer and product engineering teams, end quote.
还是不确定前线部署工程师(FDE)具体是做什么的?
Still not sure what a forward deployed engineer or an FDE does exactly?
引述,与传统软件工程师不同,FDE(前线部署工程师)不仅编写代码,还会深入现场,理解AI能产生最大影响的领域。
Quote, unlike traditional software engineers, FDEs go beyond writing code to go out in the field and understand where AI can make the biggest impact.
他们的使命是打通AI落地的最后一公里,将通用模型转化为可扩展的AI解决方案,满足客户的复杂需求并解决问题,引述结束。
Their mission is to bridge the last mile of AI, transforming a general purpose model into scalable AI solutions that reflect complex client requirements and solve their problems, end quote.
如果这一趋势持续下去,如果你想在2026年成为抢手人才,现在就该确保自己能自信且诚实地将FDE写入简历。
If this trend has any staying power and if you want to be in demand in 2026, now is the time to ensure you can confidently and truthfully put FDE on your resume.
年轻开发者不会容忍AWS带来的痛苦。
Younger devs won't tolerate pain in the AWS.
顺便说一句,非常幽默的Corey Quinn终于意识到,自从我第一次尝试在EC2上部署Rails应用时就明白的道理。
Corey Quinn, who is hilarious by the way, finally realized what I've known since the first time I tried shipping a Rails app on EC2.
对于AWS新手而言,这纯粹是折磨。
AWS, for the uninitiated: it's pure pain.
引述,最近我为了好玩又捣鼓出一个代码很烂的东西——因为我信奉把自己的问题变成大家的问题——结果发现了一个长期困扰我的真相。
Quote, recently, I was spinning up yet another terribly coded thing for fun, because I believe in making my problems everyone else's problems, and realized something that had been nagging at me for a while.
使用AWS相对而言是痛苦的,引述结束。
Working with AWS is relatively painful, end quote.
Corey详细描述了从零开始搭建AWS的典型需求,然后将其与Vercel在AWS基础上提供的丝滑体验进行对比。
Corey lays out what a typical zero to one AWS setup often requires then compares it to the silky smooth experience Vercel provides on top of AWS.
他对这种差异的解释是:这是代际问题。
His explanation for the discrepancy, it's generational.
引述,我觉得这像是代际差异。
Quote, this feels generational to me.
对于特定年龄段的人(X世代和千禧一代),AWS和GCP已经证明了它们的价值。
For folks of a certain age, Gen X and millennials, AWS and GCP have made their bones.
我们是在这些平台伴随下成长起来的技术一代,早已习惯了它们的缺陷。
We came of technical age with the platforms, and we are used to their foibles.
当然,Azure是婴儿潮一代的云平台,而Z世代使用的平台不需要用户通过技能测试来证明自己的诚意,引述结束。
Azure is, of course, the boomer cloud, but Gen Z is using platforms that aren't designed as tests of skill to let customers prove how much they want something, end quote.
向Corey致敬,他将Azure称为'婴儿潮一代云'。
Hat tip to Corey for calling Azure the boomer cloud.
这太神奇了。
That's amazing.
不过,我不认为这是代际问题。
However, I don't think this is a generational thing.
像我这样偏爱Heroku风格部署平台而非AWS的资深开发者大有人在。
There's an entire group of elder devs like myself who have always preferred Heroku style deployment platforms over AWS.
虽然他对过去的看法可能因身处AWS泡沫而有所偏颇,但他对未来的预测可能是正确的。
While his view of the past seems skewed from inside the AWS bubble, he might be right about the future.
引述:'AWS花了二十年打造了全球最强大的云平台'。
Quote, AWS spent two decades building the most powerful cloud platform in the world.
他们可能会再用二十年看着它变得对尚未入局的人毫无意义。
They may spend the next two watching it become irrelevant to anyone who wasn't already bought in.
你应该写一个代理程序。
You should write an agent.
Thomas Toczek提出,要真正理解LLM代理程序,成为最专业的黑粉或铁粉,你需要亲手写一个。
Thomas Toczek makes the case that to truly grok LLM agents, so you can be the best hater or stan that you can be, you need to write one.
引述:'代理程序是我职业生涯中最令人惊讶的编程体验,不是因为它们强大的能力让我感到敬畏。'
Quote, agents are the most surprising programming experience I've had in my career, not because I'm awed by the magnitude of their powers.
我喜欢它们,但不是那种喜欢。
I like them, but I don't like like them.
而是因为让一个代理程序快速运行起来如此简单,以及在这个过程中我学到了多少东西,引述结束。
It's because of how easy it was to get one up on its legs and how much I learned doing that, end quote.
四月份当Thorsten Ball的帖子一步步指导我时,我就有过这种体验。
I had this experience back in April when Thorsten Ball's post walked me through it step by step.
Thomas说得没错。
Thomas isn't wrong.
为自己构建一个代理程序,能让你更清晰地理解这可能是本十年最重要的面向开发者的技术。
Building an agent for yourself brings clarity to what is likely the most important developer facing technology of the decade.
现在是赞助新闻时间。
It's now time for sponsored news.
为什么GitHub Actions的checkout操作对98.5%的组织来说都很慢?
Why GitHub's actions/checkout is slow for 98.5% of orgs.
Depot刚刚发布了另一篇深度分析,这篇内容对任何使用GitHub Actions的人来说都直击要害。
Depot just dropped another deep dive, and this one hits home for anyone using GitHub Actions.
他们分析了数千个工作流,发现98.5%的组织运行的actions/checkout比实际需要的速度慢。
They analyzed thousands of workflows and found that 98.5% of orgs are running actions/checkout slower than they need to.
事实证明大多数团队使用的默认设置并不理想。
Turns out the default settings most teams use are not great.
冷克隆、缺少浅层获取和臃肿的历史记录浪费了宝贵的CI时间。
Cold clones, missing shallow fetches, and bloated histories waste precious CI minutes.
而这还只是在你构建开始之前。
And this is before your build even starts.
Depot的文章详细分析了问题原因、你因此损失的时间以及如何解决。
Depot's post breaks down why this happens, how much time it's costing you, and what you can do to fix it.
关键点在于:CI性能不仅仅取决于更大的运行器。
The takeaway, CI performance isn't just about bigger runners.
更在于更智能的运行方式。
It's about smarter ones.
Depot痴迷于为每个步骤节省时间,新数据证明你的流水线中隐藏着大量唾手可得的优化空间。
Depot's obsessed with shaving seconds off every step, and this new data proves there's a ton of low hanging fruit hiding in your pipelines.
完整分析请阅读depot.dev,了解为什么速度现在比以往任何时候都重要。
Read the full breakdown at depot.dev and see why speed matters more now than ever.
博客完整链接见通讯稿中。
Full link to the blog is in the newsletter.
框架消亡论。
Dead framework theory.
Paul Kinlan表示去年十月他预测LLM会消除框架选择的差异时错了。
Paul Kinlan says he was wrong last October when he predicted that LLMs would abstract away framework choice.
或许他没错,只是时间线预测有误。
Well, maybe he wasn't wrong, but he was wrong about the timeline.
引用:'现实更加有趣且持久。'
Quote, the reality is more interesting and more permanent.
React已不再与其他框架竞争。
React isn't competing with other frameworks anymore.
React已成为平台本身。
React has become the platform.
如今若开发新框架、库或浏览器特性,必须明白你不仅在与React竞争。
And if you're building a new framework, library, or browser feature today, you need to understand that you're not just competing with React.
而是在对抗LLM训练数据、系统提示和开发者输出之间的自增强反馈闭环,这使得取代React在功能上已不可能,引述结束。
You're competing against a self-reinforcing feedback loop between LLM training data, system prompts, and developer output that makes displacing React functionally impossible, end quote.
当他说'自增强闭环'时,并非夸张。
When he says self reinforcing feedback loop, he is not exaggerating.
今天我了解到Replit、Bolt等工具确实在系统提示中硬编码了React。
Today, I learned Replit, Bolt, and tools like them are literally hard coding React into their system prompts.
引用:'他们不得不这样做。'
Quote, they have to.
如今要开发吸引开发者的工具,必须提供可维护的代码。
If you're building a tool today to attract developers, you need to give them code they can maintain.
而当前开发者能维护的代码,对绝大多数网页开发者而言就是React。
And code developers can maintain now means React for the vast majority of web developers, end quote.
记得2022年Josh Collinsworth曾断言React除了流行外一无是处,他还在播客里和我们辩论过这事。
I remember back in 2022 when Josh Collinsworth declared React isn't great at anything except being popular, and he even debated this with us on the pod.
事实证明,受欢迎可能正是它所需要的全部。
Turns out that being popular might be all it needed.
别再凭感觉编写你的单元测试了。
Stop vibe coding your unit tests.
我们仍在努力理解整个代理编程的概念。
We're still trying to figure out this whole agentic coding thing.
我们应该让代理写测试而自己写实现,还是我们写测试让代理写实现?
Should we make the agent write the tests and we write the implementation ourselves, or should we write the tests and make the agent write the implementation?
或者也许我们该放手说:嘿代理,你来掌舵吧。
Or maybe we should just sit back and say, hey, agent, take the wheel.
安德鲁·加拉格尔对这些问题有见解。
Andrew Gallagher has thoughts on these questions.
引述:'越来越多人认为大语言模型擅长CRUD、样板代码和测试。'
Quote, there is a growing sentiment that LLMs are good for crud, boilerplate, and tests.
虽然我不太确定AI在CRUD或生成样板代码方面有多强,但作为现代大语言模型驱动的AI编程环境中的软件工程师,一年工作经验让我确信:大语言模型写的单元测试毫无建设性、充满干扰、脆弱不堪,简直糟糕透顶。
While I am not so sure about how good AI is at making CRUD or stamping out boilerplate, a year of working as an SWE in the modern LLM-powered AI codescape has proven to me that LLMs write unconstructive, noisy, brittle, and downright bad unit tests.
请勿凭感觉编写单元测试,引述结束。
Please do not vibe code your unit tests, end quote.
安德鲁确实提到有方法能让大语言模型生成好测试,但目前需要你让它们逐个编写测试。
Andrew does say there's a way to get good tests from LLMs, but right now, it requires you to make them write tests one at a time.
谁有那闲工夫啊。
Ain't nobody got time for that.
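Andrew's complaint about brittle LLM-written tests is easy to illustrate. A minimal sketch, with a hypothetical `slugify` function that is not from his post: the first test pins one exact output string the way generated tests often do, while the second asserts the contract callers actually rely on.

```python
import re

def slugify(title):
    """Turn a title into a URL slug (hypothetical example function)."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

# Brittle, LLM-style test: pins one exact string, so it breaks on
# harmless changes (say, switching '-' to '_') that no caller cares about.
def test_exact_snapshot():
    assert slugify("Hello, World!") == "hello-world"

# Behavior-focused test: asserts the properties callers depend on,
# and survives cosmetic changes to the implementation.
def test_slug_is_url_safe():
    slug = slugify("Hello, World!")
    assert slug == slug.lower()
    assert " " not in slug
    assert re.fullmatch(r"[a-z0-9-]+", slug)

test_exact_snapshot()
test_slug_is_url_safe()
print("ok")
```

A suite full of the first kind is noise that fails on every refactor; the second kind is the sort of test worth reviewing an LLM into writing, one at a time.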
以上就是目前的新闻。
That's the news for now.
但快去订阅更新日志通讯,获取我们点击的所有链接的完整内容,比如复兴经典Unix游戏、离网长距离去中心化网状网络,以及MCP有何特别之处。
But go and subscribe to the changelog newsletter for the full scoop of links we're clicking on such as reviving classic Unix games, off grid long range decentralized mesh networks, and what is so special about MCP.
在changelog.news上订阅通讯吧。
Get in on the newsletter at changelog.news.
上周的播客中,安德鲁·尼斯贝特在周三向我们介绍了开源元数据的世界。
Last week on the pod, Andrew Nisbet told us all about the world of open source metadata on Wednesday.
而在周五,我们与之前的冠军们玩了一场激烈的 #define 猜词游戏。
And on Friday, we played a heated game of #define with our previous champs.
本周即将登场的是黑客新闻最爱的博主肖恩·格德克。
Coming up this week, it's Hacker News' favorite blogger, Sean Goedecke.
祝大家本周愉快。
Have a great week.
如果你喜欢这个节目,请点赞、订阅并给我们五星好评,我们很快会再聊。
Like, subscribe, and five star review us if you dig the show, and I'll talk to you again real soon.