<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Notes on AI Engineering]]></title><description><![CDATA[Notes on AI Engineering]]></description><link>https://jpreagan.com</link><image><url>https://substackcdn.com/image/fetch/$s_!3Gz8!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F878d57f8-baef-4992-adab-61c121c41176_512x512.png</url><title>Notes on AI Engineering</title><link>https://jpreagan.com</link></image><generator>Substack</generator><lastBuildDate>Tue, 07 Apr 2026 20:24:41 GMT</lastBuildDate><atom:link href="https://jpreagan.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[James Reagan]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[jpreagan@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[jpreagan@substack.com]]></itunes:email><itunes:name><![CDATA[James Reagan]]></itunes:name></itunes:owner><itunes:author><![CDATA[James Reagan]]></itunes:author><googleplay:owner><![CDATA[jpreagan@substack.com]]></googleplay:owner><googleplay:email><![CDATA[jpreagan@substack.com]]></googleplay:email><googleplay:author><![CDATA[James Reagan]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[My agent's agent has an agent]]></title><description><![CDATA[uv pip install ceo]]></description><link>https://jpreagan.com/p/my-agents-agent-has-an-agent</link><guid isPermaLink="false">https://jpreagan.com/p/my-agents-agent-has-an-agent</guid><dc:creator><![CDATA[James Reagan]]></dc:creator><pubDate>Tue, 31 Mar 2026 07:11:25 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!-RYy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!-RYy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!-RYy!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic 424w, https://substackcdn.com/image/fetch/$s_!-RYy!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic 848w, https://substackcdn.com/image/fetch/$s_!-RYy!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic 1272w, https://substackcdn.com/image/fetch/$s_!-RYy!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!-RYy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic" width="1456" height="585" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:585,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1473450,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/192694016?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!-RYy!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic 424w, https://substackcdn.com/image/fetch/$s_!-RYy!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic 848w, https://substackcdn.com/image/fetch/$s_!-RYy!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic 1272w, https://substackcdn.com/image/fetch/$s_!-RYy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ba6052c-a8c7-4b91-923e-0593dd99ffef_3264x1312.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><h2>uv pip install ceo</h2><p>Zuckerberg built himself a <a href="https://www.wsj.com/tech/ai/mark-zuckerberg-is-building-an-ai-agent-to-help-him-be-ceo-eddab2d5">CEO agent</a>. It retrieves answers he&#8217;d normally have to go through layers of people to get. The CEO of a 78,000-person company decided the fastest way to get information out of his own organization is to skip the org chart and ask an AI. We&#8217;re in the big 26, babe.</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/MeghanBobrowsky/status/2035849211468767595&quot;,&quot;full_text&quot;:&quot;Scoop: Mark Zuckerberg is building a CEO agent to help him do his job, according to a person familiar with the project. 
\n\nEmployees are also adopting AI agents and AI tools internally, namely My Claw and Second Brain, in a bid to speed up work, as they get graded on AI use. &quot;,&quot;username&quot;:&quot;MeghanBobrowsky&quot;,&quot;name&quot;:&quot;Meghan Bobrowsky&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/1955002772656631808/MS9yHShl_normal.jpg&quot;,&quot;date&quot;:&quot;2026-03-22T22:40:50.000Z&quot;,&quot;photos&quot;:[{&quot;img_url&quot;:&quot;https://pbs.substack.com/media/HEDJ9rjbkAAVxKP.jpg&quot;,&quot;link_url&quot;:&quot;https://t.co/csuIAbPJNF&quot;}],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:106,&quot;retweet_count&quot;:86,&quot;like_count&quot;:650,&quot;impression_count&quot;:298706,&quot;expanded_url&quot;:null,&quot;video_url&quot;:null,&quot;belowTheFold&quot;:false}" data-component-name="Twitter2ToDOM"></div><p>Meta employees are using personal agents internally. Two get named: My Claw, which accesses your chat logs and work files and can talk to colleagues or their colleagues&#8217; agents on your behalf, and Second Brain, built by an employee on top of Claude, described as &#8220;an AI chief of staff.&#8221; There is an internal group where employees&#8217; agents talk to each other.</p><p>Oh, and AI use is a factor in performance reviews.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;7be2a966-1ecb-42ef-9368-93e16da2dc99&quot;,&quot;duration&quot;:null}"></div><p>Meta bought <a href="https://manus.im/blog/manus-joins-meta-for-next-era-of-innovation">Manus</a>, a personal agent startup, in December for north of $2 billion. Then in March they grabbed <a href="https://techcrunch.com/2026/03/10/meta-acquired-moltbook-the-ai-agent-social-network-that-went-viral-because-of-fake-posts/">Moltbook</a>, the social network where AI agents talk to each other. 
Meta now owns both a personal agent platform and the infrastructure for agents to socialize. I don&#8217;t know what to do with that sentence either, but here we are.</p><p>There&#8217;s a darker read. Meta is <a href="https://www.reuters.com/business/world-at-work/meta-planning-sweeping-layoffs-ai-costs-mount-2026-03-14/">planning layoffs</a> that could hit 20% of the company. The <a href="https://www.nytimes.com/2026/03/25/technology/meta-layoffs-ai-executives.html">first 700 hit last week</a>. Maher Saba&#8217;s new applied AI org is &#8220;ultraflat,&#8221; 50 ICs per manager, &#8220;AI native from day one.&#8221; The math isn&#8217;t subtle. Flatten, automate, cut.</p><p>Six weeks ago I wrote that <a href="https://jpreagan.com/p/billions-of-agents-talking-to-billions">personal AI assistants will be as ubiquitous as smartphones</a>. Per-person agents, each holding your context, growing with you over time. I didn&#8217;t expect that one of the first companies to go all-in on that model would be Meta, or that it would happen this fast.</p><h2>Mario says slow down</h2><p>Mario Zechner (<a href="https://x.com/badlogicgames">@badlogicgames</a>) is the guy who built libGDX and BadLogic Games. He&#8217;s been in the trenches a long time. Most recently he also brought us the <a href="https://github.com/badlogic/pi-mono">Pi</a> framework, which happens to power OpenClaw.</p><p><a href="https://mariozechner.at/posts/2026-03-25-thoughts-on-slowing-the-fuck-down/">His post this week is worth reading in full</a>, but the tl;dr is that we&#8217;ve basically replaced one kind of slop with a faster kind of slop.</p><p>His argument is that agents don&#8217;t learn. A human makes the same mistake a few times and eventually stops making it, either because someone screams at them or because they hate the pain they caused. An agent has no such feedback loop. 
It will make the same error indefinitely, at superhuman speed, with no bottleneck.</p><p>You wake up six weeks later with a codebase that&#8217;s technically 200,000 lines but is functionally untrustworthy, and the test suite your agent wrote is equally untrustworthy, and the only reliable measure of &#8220;does this work&#8221; is manual testing.</p><p>He also has a great phrase for it: <em>merchants of learned complexity.</em> Agents have seen a lot of terrible architecture in their training data. When you tell them to architect your application, that&#8217;s mostly what you get: enterprise cargo cult best practices and abstractions for their own sake. Except what takes human teams years to accumulate, two people and a clanker army can achieve in weeks.</p><p>I wrote something related <a href="https://jpreagan.com/p/is-ai-eating-your-coding-skills">last year</a> that I still believe: the interesting question isn&#8217;t how to get more lines of code out of AI, it&#8217;s how to get better code. This may well translate to working slower and more deliberately. 
Use it to explore alternatives, critique your own design, find the edge cases you missed.</p><h2>The coding agent stack is converging</h2><p>A lot shipped this week that on the surface looks like separate announcements but is really one story.</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/openaidevs/status/2037296316104282119&quot;,&quot;full_text&quot;:&quot;We're rolling out plugins in Codex.\n\nCodex now works seamlessly out of the box with the most important tools builders already use, like <span class=\&quot;tweet-fake-link\&quot;>@SlackHQ</span>, <span class=\&quot;tweet-fake-link\&quot;>@figma</span>, <span class=\&quot;tweet-fake-link\&quot;>@NotionHQ</span>, <span class=\&quot;tweet-fake-link\&quot;>@gmail</span>, and more.\n\n<a class=\&quot;tweet-url\&quot; href=\&quot;http://developers.openai.com/codex/plugins\&quot;>developers.openai.com/codex/plugins</a> &quot;,&quot;username&quot;:&quot;OpenAIDevs&quot;,&quot;name&quot;:&quot;OpenAI Developers&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/2022002720971096064/l3Kyt4qt_normal.jpg&quot;,&quot;date&quot;:&quot;2026-03-26T22:31:07.000Z&quot;,&quot;photos&quot;:[{&quot;img_url&quot;:&quot;https://substackcdn.com/image/upload/w_1028,c_limit,q_auto:best/l_twitter_play_button_rvaygk,w_88/yqp9sjpznm7w69nrdsqo&quot;,&quot;link_url&quot;:&quot;https://t.co/TIbsIUAf6S&quot;}],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:404,&quot;retweet_count&quot;:570,&quot;like_count&quot;:5179,&quot;impression_count&quot;:1281711,&quot;expanded_url&quot;:null,&quot;video_url&quot;:&quot;https://video.twimg.com/amplify_video/2037296256998166528/vid/avc1/1280x720/txXLkxzfBzNhE8sv.mp4?tag=14&quot;,&quot;belowTheFold&quot;:true}" data-component-name="Twitter2ToDOM"></div><p>Codex now has plugins. Slack, Figma, Notion, Gmail, Google Drive, out of the box. They also shipped hooks, which let you inject custom logic at key points in the agent loop. 
So Codex went from coding agent to coding agent that can read your Slack, pull your Figma designs, check your email, and run your pre-commit scripts. Sound familiar?</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/claudeai/status/2038663014098899416&quot;,&quot;full_text&quot;:&quot;Computer use is now in Claude Code.\n\nClaude can open your apps, click through your UI, and test what it built, right from the CLI.\n\nNow in research preview on Pro and Max plans. &quot;,&quot;username&quot;:&quot;claudeai&quot;,&quot;name&quot;:&quot;Claude&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/1950950107937185792/QOfEjFoJ_normal.jpg&quot;,&quot;date&quot;:&quot;2026-03-30T17:01:53.000Z&quot;,&quot;photos&quot;:[{&quot;img_url&quot;:&quot;https://substackcdn.com/image/upload/w_1028,c_limit,q_auto:best/l_twitter_play_button_rvaygk,w_88/sqpaetw4f2vo31l4pi6j&quot;,&quot;link_url&quot;:&quot;https://t.co/s2FDQaDmr1&quot;}],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:1652,&quot;retweet_count&quot;:2120,&quot;like_count&quot;:29723,&quot;impression_count&quot;:3799825,&quot;expanded_url&quot;:null,&quot;video_url&quot;:&quot;https://video.twimg.com/amplify_video/2038658389862146048/vid/avc1/1280x720/1h6ff8AQLUduqFQW.mp4&quot;,&quot;belowTheFold&quot;:true}" data-component-name="Twitter2ToDOM"></div><p>Meanwhile, Claude Code got computer use. Your coding agent can now see your screen and click things. And as of today, you can call Codex directly from within Claude Code using your existing ChatGPT subscription.</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/romainhuet/status/2038677236304245087&quot;,&quot;full_text&quot;:&quot;We&#8217;ve seen Claude Code users bring in Codex for code review and use GPT-5.4 for more complex tasks, so we thought: why not make that easier?\n\nToday we&#8217;re open sourcing a plugin for it! 
You can call Codex from Claude Code with your ChatGPT subscription.\n\nWe love an open ecosystem!&quot;,&quot;username&quot;:&quot;romainhuet&quot;,&quot;name&quot;:&quot;Romain Huet&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/2020288886602366976/TlUC1Ubk_normal.jpg&quot;,&quot;date&quot;:&quot;2026-03-30T17:58:24.000Z&quot;,&quot;photos&quot;:[],&quot;quoted_tweet&quot;:{&quot;full_text&quot;:&quot;I built a new plugin! You can now trigger Codex from Claude Code!\n\nUse the Codex plugin for Claude Code to delegate tasks to Codex or have Codex review your changes using your ChatGPT subscription.\n\nStart by installing the plugin: \nhttps://t.co/u6gBpArwBc&quot;,&quot;username&quot;:&quot;dkundel&quot;,&quot;name&quot;:&quot;dominik kundel&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/2010049867138379780/Z08FgnXb_normal.jpg&quot;},&quot;reply_count&quot;:139,&quot;retweet_count&quot;:103,&quot;like_count&quot;:1926,&quot;impression_count&quot;:215553,&quot;expanded_url&quot;:null,&quot;video_url&quot;:null,&quot;belowTheFold&quot;:true}" data-component-name="Twitter2ToDOM"></div><p>So, Codex gets access to your work tools. Claude Code gets access to your desktop. 
And then they get access to each other.</p><p>Zoom out and look at what Anthropic has recently shipped for Claude Code:</p><ul><li><p><a href="https://x.com/trq212/status/2034761016320696565">Channels</a> &#8212; Telegram, Discord, and iMessage forwarded into a session so Claude reacts to messages while you&#8217;re away</p></li><li><p><a href="https://x.com/felixrieseberg/status/2034005731457044577">Dispatch</a> &#8212; message a task from your phone and it spawns a desktop session to handle it</p></li><li><p><a href="https://x.com/claudeai/status/2026418433911603668">Remote Control</a> &#8212; steer a running session from the Claude mobile app</p></li><li><p><a href="https://code.claude.com/docs/en/scheduled-tasks">/loop</a> &#8212; cron-style scheduled tasks</p></li><li><p><a href="https://x.com/trq212/status/2027109375765356723">Auto-Memory</a> &#8212; Claude maintains its own MEMORY.md, accumulating your project context, coding style, and decisions across sessions</p></li></ul><p>Messaging, proactive automation, persistent memory that evolves, scheduled tasks, skills, and computer use: that sounds like an OpenClaw feature list to me. Anthropic is rebuilding OpenClaw inside Claude Code, one release at a time.</p><p>Codex is doing the same from the other direction: plugins and hooks are how you turn a coding agent into a work agent. Both labs are converging on the same architecture that OpenClaw pioneered: a personal agent that lives on your machine, connects to your tools, remembers your context, and reaches out to you through the apps you already use.</p><p>One of them hired the guy who built it. The other is just quietly copying the homework. Real talk though, Anthropic makes incredible stuff and they do it with a conscience. 
This is the most exciting stretch in computing I've lived through.</p>]]></content:encoded></item><item><title><![CDATA[How I learned to stop worrying and love OpenClaw]]></title><description><![CDATA[There&#8217;s a Mac mini in my house that knows more about my day than I do.]]></description><link>https://jpreagan.com/p/how-i-learned-to-stop-worrying-and-love-openclaw</link><guid isPermaLink="false">https://jpreagan.com/p/how-i-learned-to-stop-worrying-and-love-openclaw</guid><dc:creator><![CDATA[James Reagan]]></dc:creator><pubDate>Sun, 29 Mar 2026 07:40:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!aMkp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aMkp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aMkp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic 424w, https://substackcdn.com/image/fetch/$s_!aMkp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic 848w, https://substackcdn.com/image/fetch/$s_!aMkp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!aMkp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aMkp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic" width="1456" height="585" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:585,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1013436,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/187820999?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aMkp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic 424w, https://substackcdn.com/image/fetch/$s_!aMkp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic 848w, 
https://substackcdn.com/image/fetch/$s_!aMkp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic 1272w, https://substackcdn.com/image/fetch/$s_!aMkp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd0ed2251-2489-4201-8d41-b793031ba7cd_3264x1312.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>There&#8217;s a Mac mini in my house that knows more about my day than I do. 
On the one hand, we&#8217;ve been blessed with strong reasoning models that can call tools and act on our behalf. On the other hand, we have the context of our lives boxed away in various compartments: text messages, email, chat logs, transcripts, notes, calendar.</p><p>And the models are only as good as the context you provide them.</p><p>What if we brought them together? You might say:</p><blockquote><p>James, ChatGPT and Claude already have a memory system and can connect to my data sources, right?</p></blockquote><p>Yes, but let me point out a few shortcomings of these existing products:</p><ul><li><p>They haven&#8217;t done a good job tying your sources together</p></li><li><p>The memory system is mid and you don&#8217;t have control over how it works</p></li><li><p>There&#8217;s no easy way to migrate your memories between products, i.e., vendor lock-in</p></li><li><p>Proprietary walled gardens keep it from being truly useful</p></li><li><p>They can&#8217;t yet reach out to you proactively in any meaningful way</p></li></ul><p><a href="https://openclaw.ai/">OpenClaw</a> fixes all of this mess. And it&#8217;s all open source and free as in freedom. I believe this is what Siri was supposed to be, and what the big labs wish they could make.</p><p>You text your assistant in the same app you text your friends and family. It feels like just another contact except it can reach out to you. It has real-time access to every data source you give it. It knows your life history, whatever you&#8217;ve shared, and you&#8217;d be surprised how many connections it draws across all of it.</p><p>Your memories are markdown files on your hard drive, which is beautiful simplicity and portability. OpenClaw indexes them automatically for semantic search, so the assistant can recall what it needs without you managing a database.</p><p>The system is fully open. Ask the assistant to change itself, or point Codex CLI or Claude Code at it and rebuild whatever you want. 
It&#8217;s software you modify by having a conversation, and that changes your relationship with every tool you use.</p><h2><strong>But there be dangers</strong></h2><p>Yep, there are dangers. You&#8217;re giving a nondeterministic beast control of a dedicated machine. It can do anything a normal computer can do, which is its greatest strength and its most obvious risk. And we&#8217;re pretty sure these models are <a href="https://andonlabs.com/blog/opus-4-6-vending-bench">only behaving because they know they&#8217;re being watched</a>.</p><p>Prompt injection is still an unsolved problem. OpenClaw now ships with a <a href="https://docs.openclaw.ai/gateway/security">security audit command</a>, a formal trust model, and <a href="https://docs.openclaw.ai/gateway/sandboxing">sandboxing</a> including an OpenShell backend that came out of NVIDIA&#8217;s <a href="https://github.com/NVIDIA/NemoClaw">NemoClaw</a> project. The blast radius is getting smaller.</p><p>The thing I didn&#8217;t appreciate as much as I should have early on is that model choice might be the single strongest safety lever you have. Older and smaller models are significantly more vulnerable to prompt injection and tool misuse. If you&#8217;re running an agent with real tools and real access, use the biggest and baddest model available. You want the best not just for intelligence but for safety.</p><p>I&#8217;m not going to tell you the risks aren&#8217;t real, because they are. But I&#8217;m pretty sure you can find an approach within your comfort zone. Here&#8217;s mine.</p><h2><strong>Do we gotta pay the Apple tax?</strong></h2><p>I run my assistant on a separate machine, a Mac mini. Listen, you don&#8217;t need one. Any dedicated PC running Linux works, a Raspberry Pi works. But a Mac mini is also a perfectly fine low-powered device and if you want native iMessage with blue bubbles on your iPhone, you need an Apple device. 
I think that&#8217;s worth it, but Signal, Telegram, and WhatsApp all work too.</p><p>The Mac mini setup:</p><ul><li><p>Isolated network on my home system</p></li><li><p>Dedicated Apple ID for the assistant</p></li><li><p>FileVault on</p></li><li><p>Firewall on</p></li><li><p>SIP remains enabled</p></li></ul><p>The most important thing: <strong>do not sign in with your personal Apple ID</strong>. Your assistant should never have access to your keychain, your wallet, your browser sessions, any of it. Give it its own Apple ID. You&#8217;d need a separate account anyway, because otherwise you&#8217;d be texting yourself.</p><p>SIP staying enabled means no Private API with <a href="https://bluebubbles.app">BlueBubbles</a>, so no typing indicators, read receipts, or tapbacks. Basic send and receive works fine. You can get the full bougie experience on WhatsApp or Telegram if privacy means nothing to you, or on Signal if it does.</p><p>You may actually need two Apple machines. I&#8217;ll explain why in a moment.</p><h2><strong>Zero public exposure</strong></h2><p>Ideally you should have zero public exposure unless you know what you&#8217;re doing and really need it. There are use cases for opening up specific services, but if you&#8217;re just starting out I highly recommend no inbound ports. SSH should be key-only with passwords disabled, but we can do better.</p><p>Everything should route through Tailscale. 
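<p>The SSH posture above boils down to a few lines of <code>sshd_config</code>. A minimal sketch, not a complete config; the <code>ListenAddress</code> value is an assumption and should be your machine&#8217;s actual Tailscale IP:</p><div class="highlighted_code_block" data-component-name="HighlightedCodeBlockToDOM"><pre class="shiki"><code class="language-bash"># /etc/ssh/sshd_config &#8212; key-only auth, no passwords, no root
PasswordAuthentication no
KbdInteractiveAuthentication no
PubkeyAuthentication yes
PermitRootLogin no
# Answer only on the Tailscale interface (substitute your tailnet address),
# so sshd is unreachable from the public internet even if a port leaks
ListenAddress 100.64.0.1</code></pre></div><p>With that, SSH is reachable only over Tailscale, which is the whole point.</p>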
If you don&#8217;t know what that is, listen to this perfectly reasonable explanation by Scott Tolinski (<a href="https://x.com/@stolinski">@stolinski</a>):</p><div id="youtube2-G0sEM9ijkTE" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;G0sEM9ijkTE&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/G0sEM9ijkTE?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>In the end we want this:</p><ul><li><p>No public inbound ports</p></li><li><p>Private device-to-device networking only (Tailscale)</p></li><li><p>Assistant has its own Apple ID on the Mac mini</p></li><li><p>Personal Apple ID stays on your personal machine(s) only</p></li></ul><h2><strong>Use the clanker</strong></h2><p>You&#8217;ll likely want to vibe configure the whole thing. Seriously, just use the clanker. On a fresh install, clone the <a href="https://github.com/openclaw/openclaw">OpenClaw repo</a>, and then first thing fire up Codex CLI or Claude Code in the repo. It has an AGENTS.md with the full context of the docs and source code. Tell it what you want done.</p><p>Things change quickly, so I recommend you ignore any setup guides other than the docs. I&#8217;m explaining conceptually what I&#8217;ve done. If this fits you, tell your coding agent that&#8217;s what you want. Or tweak it. Whatever. 
This is the future of working with software.</p><h2><strong>Texting, texting 1-2-3</strong></h2><p>There are two separate goals here:</p><ol><li><p>I want to text my assistant from my phone like a normal contact.</p></li><li><p>I want my assistant to have read-only, real-time access to my personal message history.</p></li></ol><p>And remember, we&#8217;re not signing my personal Apple ID into the Mac mini.</p><p>The split looks like this. The Mac mini is the assistant machine signed into a dedicated assistant Apple ID, running OpenClaw and BlueBubbles. This is the only place the assistant sends messages from. The MacBook Pro is my personal machine signed into my personal Apple ID, running the source side of <a href="https://github.com/jpreagan/imsgkit">imsgkit</a>.</p><p>My first approach was an SSH forced command, a dedicated key that could only run a handful of read-only commands on my MacBook Pro over Tailscale. It worked, but it was fragile and weird. So I built imsgkit to do it properly. It&#8217;s open source, MIT licensed.</p><p>imsgkit has two pieces:</p><ol><li><p><strong>imsgd</strong> is a macOS daemon that reads your Messages database and publishes an incremental replica using <a href="https://sqlite.org/rsync.html">sqlite3_rsync</a>.</p></li><li><p><strong>imsgctl</strong> is a CLI for macOS and Linux that reads the replica locally.</p></li></ol><p>imsgd runs on my MacBook Pro and syncs a replica to the Mac mini every few seconds. imsgctl runs on the Mac mini and reads that replica directly. No network calls at read time, no SSH, no personal Apple ID on the assistant machine.</p><p>It ships with an agent skill, so once installed your assistant just knows how to check your messages. 
For OpenClaw:</p><div class="highlighted_code_block" data-attrs="{&quot;language&quot;:&quot;bash&quot;,&quot;nodeId&quot;:&quot;8c717b3f-e8e2-43dd-80ea-07f97537df13&quot;}" data-component-name="HighlightedCodeBlockToDOM"><pre class="shiki"><code class="language-bash">openclaw skills install imsgctl</code></pre></div><p>For Claude Code, Codex CLI, or any agent that supports skills:</p><div class="highlighted_code_block" data-attrs="{&quot;language&quot;:&quot;bash&quot;,&quot;nodeId&quot;:&quot;5a072224-4357-4998-94a9-3e3a246d4937&quot;}" data-component-name="HighlightedCodeBlockToDOM"><pre class="shiki"><code class="language-bash">npx skills add jpreagan/imsgkit</code></pre></div><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;a14b6570-297e-45fc-971b-0542a7de1178&quot;,&quot;duration&quot;:null}"></div><p>The security boundary is the same one I started with, just cleaner. The assistant can read my personal message history in real time but cannot act as me on my personal machine. No shell access from assistant to personal machine, no personal Apple ID on the assistant machine, outbound texting stays BlueBubbles only.</p><h2><strong>You&#8217;ve got (read-only) mail</strong></h2><p>Now for email. I have a Gmail account. Gross, I know, but it works pretty well actually thanks to the prolific Peter Steinberger (<a href="https://x.com/@steipete">@steipete</a>) with <a href="https://github.com/steipete/gogcli">gogcli</a>.</p><p>I want my assistant to answer questions about my inbox on demand, without giving it permission to send, delete, archive, or modify anything. I also don&#8217;t want to break my zero public exposure rule. So I skipped Pub/Sub and webhooks. No inbound endpoint, no public callback URL, no extra network surface area. Just pull when asked.</p><p>It&#8217;s a plain Jane setup. OAuth scopes are locked to read-only. 
The assistant can read, summarize, and search, but it cannot act as me in Gmail. One useful detail is your Cloud Console account and your mailbox account don&#8217;t have to be the same. The project can live under one Google account while OAuth authorization is granted by the account that actually owns the email.</p><p>End state is I ask &#8220;what&#8217;s in my inbox today?&#8221; and I get it. No modify or send scopes, no public ports, and Pub/Sub is there if I ever want proactive notifications later.</p><h2><strong>What&#8217;s next?</strong></h2><p>Once you let your hair down and stop treating the whole thing like a frightening disaster, it&#8217;s a lot of fun.</p><p>Good grief, it seems like half of Twitter will tell you the sky is falling and the other half are hustle bros exalting their 24x7 employee churning out SaaS slop that nobody uses. There is something genuinely exciting happening here, and it&#8217;s neither of those things.</p><p>The idea of a personal AI that pulls in all your data sources, knows your context, and grows with you is here. It&#8217;s running on a Mac mini in my house right now, texting me blue bubbles.</p><p>We&#8217;re still early. The rough edges are real. But the upside is massive, and the learning curve is the point. If you&#8217;re building with AI and you haven&#8217;t dogfooded a personal assistant yet, you&#8217;re missing out on all the fun.</p><p>So go off with your friendly neighborhood clanker. Start small, stay paranoid, and enjoy.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://jpreagan.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Notes on AI Engineering is a reader-supported publication. 
To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[The most dangerous command in computing]]></title><description><![CDATA[Howdy folks, things are moving fast this week.]]></description><link>https://jpreagan.com/p/the-most-dangerous-command-in-computing</link><guid isPermaLink="false">https://jpreagan.com/p/the-most-dangerous-command-in-computing</guid><dc:creator><![CDATA[James Reagan]]></dc:creator><pubDate>Wed, 25 Mar 2026 07:47:07 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ARNC!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ARNC!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ARNC!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!ARNC!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ARNC!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ARNC!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ARNC!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg" width="1456" height="585" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:585,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:4427658,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/192066812?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!ARNC!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ARNC!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ARNC!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ARNC!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d52b1b3-79e8-4525-97f5-d1234ea29716_3264x1312.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Howdy folks, things are moving fast this week. I&#8217;ve got a grab bag for you and some thoughts. Some of this is exciting, some of it is terrifying, and one of them made somebody&#8217;s Mac melt down.</p><p>Let&#8217;s get into it.</p><h2>Will PRs get unalived?</h2><p>The mighty pull request, also known as merge request if you speak GitLab, is the atomic unit of collaborative software development.</p><p>Lately, plenty of folks are coming for them in 2026:</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/GergelyOrosz/status/2016090541717373135&quot;,&quot;full_text&quot;:&quot;I cannot unsee the death of the pull request in open source upon us.\n\nPRs from external contributors made a lot of sense when it was hard to write code, and it took lots of time investment (+ lots of thinking!) 
to do so.\n\nNow that it takes seconds/minutes: dynamics change&quot;,&quot;username&quot;:&quot;GergelyOrosz&quot;,&quot;name&quot;:&quot;Gergely Orosz&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/673095429748350976/ei5eeouV_normal.png&quot;,&quot;date&quot;:&quot;2026-01-27T10:06:56.000Z&quot;,&quot;photos&quot;:[],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:86,&quot;retweet_count&quot;:28,&quot;like_count&quot;:776,&quot;impression_count&quot;:75709,&quot;expanded_url&quot;:null,&quot;video_url&quot;:null,&quot;belowTheFold&quot;:false}" data-component-name="Twitter2ToDOM"></div><p>Eleanor Berger (<a href="https://x.com/intellectronica">@intellectronica</a>) called this last year already:</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/intellectronica/status/2001785306102809077&quot;,&quot;full_text&quot;:&quot;Controversial minority opinion: code review (by humans) is going to largely go away. We'll have a few more months of people talking endlessly about how hard it is to review AI-generated code, and many products offering support, and eventually ... we'll just stop reviewing.&quot;,&quot;username&quot;:&quot;intellectronica&quot;,&quot;name&quot;:&quot;Eleanor Berger&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/1998656086987980800/mTwCcf33_normal.jpg&quot;,&quot;date&quot;:&quot;2025-12-18T22:43:03.000Z&quot;,&quot;photos&quot;:[],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:4,&quot;retweet_count&quot;:1,&quot;like_count&quot;:6,&quot;impression_count&quot;:454,&quot;expanded_url&quot;:null,&quot;video_url&quot;:null,&quot;belowTheFold&quot;:false}" data-component-name="Twitter2ToDOM"></div><p>Are PRs cooked? Man, in a way I&#8217;m getting low-key nostalgic already for their loss and they haven&#8217;t even gone yet, but the truth is things have to change. 
We&#8217;re in a transition period of historical proportions, so a lot of things will be quite different on the other side. Now is the time to start rethinking everything.</p><p>I&#8217;ve been watching this play out on our team. We went from a world where a PR represented hours of focused human work, and the review was a meaningful checkpoint, to a world where the agent generates the PR and another agent reviews it. The human is no longer on either side of that loop by default. We&#8217;re approvers now, not authors or reviewers.</p><p>The interesting question to me isn&#8217;t whether PRs die, but what replaces the trust mechanism they provided. PRs were never really about the code diff. They were about one human saying to another: I looked at this, and I think it&#8217;s right. When both the author and the reviewer are agents, that social contract is gone entirely. The human&#8217;s role shifts from doing the work to deciding whether the work should be done at all.</p><p>Meanwhile, Peter Steinberger (<a href="https://x.com/steipete">@steipete</a>) has a different and complementary take on what&#8217;s happening to PRs in open source. He posted on X: </p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/steipete/status/2012627114386670026&quot;,&quot;full_text&quot;:&quot;My take: PRs are *prompt requests*. 
I get some really good, but also loads of meh PRs, but they usually identify a problem, just have a suboptimal solution.\n\nI don't bother commenting and doing the wait dance, I usually just rewrite them and credit the reporter as co-author.&quot;,&quot;username&quot;:&quot;steipete&quot;,&quot;name&quot;:&quot;Peter Steinberger &#129438;&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/1131851609774985216/OcsssQ9J_normal.png&quot;,&quot;date&quot;:&quot;2026-01-17T20:44:31.000Z&quot;,&quot;photos&quot;:[],&quot;quoted_tweet&quot;:{&quot;full_text&quot;:&quot;This week we're going to begin automatically closing pull requests from external contributors. I hate this, sorry.&quot;,&quot;username&quot;:&quot;tldraw&quot;,&quot;name&quot;:&quot;tldraw&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/1998824870461796352/HFsywfLU_normal.jpg&quot;},&quot;reply_count&quot;:14,&quot;retweet_count&quot;:3,&quot;like_count&quot;:267,&quot;impression_count&quot;:28139,&quot;expanded_url&quot;:null,&quot;video_url&quot;:null,&quot;belowTheFold&quot;:true}" data-component-name="Twitter2ToDOM"></div><p>PRs as prompt requests. That&#8217;s a good reframe. The value isn&#8217;t the code anymore, it&#8217;s more about the problem identification. The maintainer takes the problem statement, rewrites the solution with their own agent, credits the original reporter. The back-and-forth review cycle collapses into a single step. I think this is where open source PRs are heading fast.</p><h1>Codex meets OpenClaw</h1><p>Speaking of agents doing the work, here&#8217;s something cool: <a href="https://github.com/pwrdrvr/openclaw-codex-app-server">Codex App Server Bridge</a> just landed as a community plugin for OpenClaw.</p><p>What this does is let you bind a Telegram or Discord conversation to a Codex thread. You text it, it routes to Codex. 
You can resume threads, switch models, review diffs, trigger planning mode, all from a chat interface. One command to bind, then just talk.</p><p>Why does this matter? Because it collapses two things that have been separate: your personal assistant and your coding agent. Right now most of us context switch between them. You ask your assistant something, then you switch to Claude Code or Codex to actually build it, then you come back to the assistant to figure out what to do next. This plugin means your assistant can route coding work to Codex without you leaving the conversation.</p><p>This is the direction everything is heading. Not one mega agent that does it all, but a personal agent that orchestrates specialized agents on your behalf. Your assistant becomes the router, not the executor. OpenClaw already supports this with sub-agents and sessions, but having much improved Codex integration is a signal. The ecosystem is filling in the gaps faster than any single project could.</p><h2>Google&#8217;s TurboQuant: cheaper inference is coming</h2><p>Google Research dropped <a href="https://research.google/blog/turboquant-redefining-ai-efficiency-with-extreme-compression/">TurboQuant</a> this week. It&#8217;s a compression algorithm that reduces LLM key-value cache memory by at least 6x and delivers up to 8x speedup with zero accuracy loss.</p><p>I&#8217;ll spare you the full paper summary, but the key insight is worth understanding. The KV cache is the memory bottleneck for long-context inference. Every token the model has seen gets cached so it doesn&#8217;t have to recompute attention from scratch. As context windows get longer, that cache gets expensive fast.</p><p>TurboQuant compresses this cache down to 3 bits per value without any fine-tuning or accuracy loss. 
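</p><p>To put numbers on that, here&#8217;s a back-of-envelope for the cache itself. The dimensions below are illustrative (a Llama-style 32-layer model with grouped-query attention), not figures from the paper:</p>

```python
# Per-token KV-cache footprint: one K and one V vector per layer
layers, kv_heads, head_dim = 32, 8, 128        # illustrative model dimensions
values_per_token = 2 * layers * kv_heads * head_dim   # the 2 is K plus V

fp16_bytes = values_per_token * 16 / 8         # 16-bit baseline
q3_bytes = values_per_token * 3 / 8            # 3-bit quantized

print(f"fp16: {fp16_bytes / 1024:.0f} KiB/token, 3-bit: {q3_bytes / 1024:.0f} KiB/token")
print(f"ratio: {fp16_bytes / q3_bytes:.1f}x")
```

<p>Bit-width alone buys a bit over 5x at these dimensions; the &#8220;at least 6x&#8221; figure presumably also accounts for metadata and baseline differences, so treat this arithmetic as a floor rather than the paper&#8217;s exact claim. 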
It does this through a clever two-stage approach: first a rotation-based quantization step (PolarQuant) that simplifies the geometry of the data, then a 1-bit error correction pass (QJL) that eliminates the bias introduced by compression.</p><p>Inference cost is often the bottleneck for agent adoption. If you&#8217;re running an agent 24/7 that&#8217;s reading your email, monitoring your calendar, processing documents, you&#8217;re burning tokens constantly. Anything that makes inference cheaper and faster directly enables more people to run agents. A 6x reduction in KV cache memory means you can serve longer contexts on the same hardware, or serve the same contexts on cheaper hardware.</p><p>This won&#8217;t show up in your API bill tomorrow. But when Google deploys this across their fleet, and when it inevitably gets adopted by other providers, the cost curve for always-on agents gets a lot more favorable. The future where everyone has a personal agent is partly a cost problem, and research like this chips away at it.</p><h2>Cloudflare Dynamic Workers: sandboxing, 100x faster</h2><p>Cloudflare <a href="https://blog.cloudflare.com/dynamic-workers/">announced Dynamic Workers</a> yesterday, and this one is significant for anyone thinking about agent security.</p><p>The problem: when an agent writes and executes code, that code needs to run somewhere safe. The current answer is mostly containers. Containers work, but they&#8217;re slow to start (hundreds of milliseconds), expensive in memory (hundreds of megabytes), and hard to scale when every user interaction might need its own sandbox.</p><p>Dynamic Workers use V8 isolates instead. A few milliseconds to start, a few megabytes of memory. That&#8217;s 100x faster and 10-100x more memory efficient than containers. You can spin up a new isolate for every single request and throw it away afterward without flinching.</p><p>Here&#8217;s the part I found most interesting. 
They&#8217;re pushing TypeScript as the interface language for agent-to-API communication, replacing OpenAPI specs. In their example, a TypeScript interface describing a chat room API is a handful of lines. The equivalent OpenAPI spec is so long you have to scroll. Fewer tokens to describe the API means cheaper inference and better agent comprehension.</p><p>The <code>globalOutbound</code> option is also pretty clever. You can intercept every HTTP request the sandboxed code makes, inject credentials, block unauthorized calls, or rewrite requests. The agent never sees the actual secrets. This is credential injection done right: the agent has the capability without having the keys.</p><p>This matters because sandboxing is one of the main unsolved problems for production agents. OpenShell (which <a href="https://docs.openclaw.ai/gateway/openshell">just got added to mainline OpenClaw</a> this week, by the way) takes a Linux-native approach with Landlock/seccomp/netns. Cloudflare is taking a V8 isolate approach. Different layers, same problem. Let agents execute code without giving them the keys to the kingdom.</p><p>Competition here is good. The more people working on agent sandboxing, the faster we all get to a world where running an agent doesn&#8217;t <a href="https://jpreagan.com/p/how-i-learned-to-stop-worrying-and-love-openclaw">require a tin foil hat</a>. I still wear mine, for the record, but I&#8217;d like the option not to.</p><h2>LiteLLM got compromised (today)</h2><p>And now for the terrifying one. <a href="https://futuresearch.ai/blog/litellm-pypi-supply-chain-attack">LiteLLM versions 1.82.7 and 1.82.8 were compromised</a> on PyPI today. If you use LiteLLM, stop what you&#8217;re doing and check your version right now. If you installed or upgraded on or after March 24, 2026, you may be affected.</p><p>What happened: somebody uploaded a malicious version directly to PyPI, bypassing the normal GitHub release process. 
The payload harvests SSH keys, cloud credentials, .env files, database passwords, crypto wallets, and basically anything sensitive on your machine. It encrypts the haul with a hardcoded RSA key and ships it off to a domain masquerading as LiteLLM infrastructure. If it finds a Kubernetes service account, it tries to read every secret in the cluster and install persistent backdoors across all nodes.</p><p>The discovery story is almost darkly comic. Callum McMahon at FutureSearch <a href="https://futuresearch.ai/blog/no-prompt-injection-required/">wrote the explainer</a>. His machine stuttered to a halt, CPU pegged at 100%, 11,000 processes running. The malware had a bug: it used a <code>.pth</code> file that triggers on every Python interpreter startup, so the malicious subprocess would itself trigger the <code>.pth</code> file, which would spawn another subprocess, which would trigger it again. Fork bomb. The malware&#8217;s own poor quality is what made it visible.</p><p>As <a href="https://x.com/karpathy/status/2036487306585268612">Andrej Karpathy pointed out on X</a>, without that bug it would have gone unnoticed for much longer. A competent attacker would have had a clean exfiltration with nobody the wiser.</p><p>Here&#8217;s what I find most instructive about this. We&#8217;ve spent a year talking about prompt injection as the existential threat to AI agents. Simon Willison (<a href="https://x.com/simonw">@simonw</a>) has been hammering on the <a href="https://simonwillison.net/2025/Jun/16/the-lethal-trifecta/">lethal trifecta</a>. Agents with access to private data, exposure to untrusted content, and the ability to take action. And he&#8217;s right, that is a real problem.</p><p>But what actually got people this week? A supply chain attack. <code>pip install </code>with an unpinned dependency. No prompt injection required. 
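</p><p>The defense is as boring as the attack. Hash-pinned requirements make a substituted PyPI release fail loudly at install time. A sketch; the version shown is illustrative and the hash is a placeholder a tool like <code>pip-compile --generate-hashes</code> would fill in:</p>

```
# requirements.txt (fragment): exact version plus a recorded hash.
# The digest below is a placeholder; generate real ones from a release you audited.
litellm==1.82.6 \
    --hash=sha256:...
```

<p>Installed with <code>pip install --require-hashes -r requirements.txt</code>, anything whose digest doesn&#8217;t match the recorded hash is rejected.</p><p>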
The MCP client (Cursor, in this case) auto-downloaded the latest version via <code>uvx</code>, which pulled in the compromised LiteLLM, which ran the malware before any model was even involved.</p><p>Securing agents is not just about securing the AI part. It&#8217;s about securing the entire stack they run on, the same boring supply chain hygiene we&#8217;ve been preaching for years. Pin your dependencies. Use lock files with checksums. Audit before upgrading. The agent era doesn&#8217;t change any of this. It just raises the stakes because your agent has access to everything.</p><p>And if you&#8217;re running MCP servers locally, think hard about whether they need to be local. FutureSearch moved to a remote MCP architecture after this. The server doesn&#8217;t run on the user&#8217;s machine anymore, which collapses the entire attack surface. That&#8217;s worth considering.</p><h2>Wrapping up</h2><p>A lot happened this week. We&#8217;re rethinking everything as we should, inference is getting cheaper, sandboxing is getting faster, and somebody learned the hard way that <code>pip install</code> is still the most dangerous command in computing.</p><p>The infrastructure is catching up to the ambition. That's the pattern I keep seeing. The models were ahead of the tooling six months ago. Now the tooling is closing the gap. Sandboxing, compression, agent wiring, all of it maturing in parallel.</p><p>Just pin your dependencies.</p>]]></content:encoded></item><item><title><![CDATA[Billions of agents talking to billions of agents]]></title><description><![CDATA[About six weeks ago I started using OpenClaw and wrote about my initial approach. OpenClaw is a new category of personal AI assistant evolved beyond the chatbot. It&#8217;s an open source, self-hosted agent you could say has been doing the numbers. In my last newsletter, I referred to it like this:]]></description><link>https://jpreagan.com/p/billions-of-agents-talking-to-billions</link><guid isPermaLink="false">https://jpreagan.com/p/billions-of-agents-talking-to-billions</guid><dc:creator><![CDATA[James Reagan]]></dc:creator><pubDate>Mon, 16 Mar 2026 00:20:53 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!uUh8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!uUh8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!uUh8!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png 424w, https://substackcdn.com/image/fetch/$s_!uUh8!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png 848w, https://substackcdn.com/image/fetch/$s_!uUh8!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png 1272w, https://substackcdn.com/image/fetch/$s_!uUh8!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!uUh8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png" width="1456" height="585" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:585,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:9729441,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/190982338?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!uUh8!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png 424w, https://substackcdn.com/image/fetch/$s_!uUh8!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png 848w, https://substackcdn.com/image/fetch/$s_!uUh8!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png 1272w, https://substackcdn.com/image/fetch/$s_!uUh8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4c13c183-ba40-462b-8724-b95fc4ed12bb_3264x1312.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" 
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Be like Lillian.</figcaption></figure></div><p>About six weeks ago I <a href="https://jpreagan.com/p/im-done-waiting-for-apple-to-figure">started using OpenClaw</a> and wrote about <a href="https://jpreagan.com/p/how-i-learned-to-stop-worrying-and-love-openclaw">my initial approach</a>. OpenClaw is a new category of personal AI assistant evolved beyond the chatbot. It&#8217;s an open source, self-hosted agent you could say has been doing the numbers. In my last newsletter, I referred to it like this:</p><blockquote><p>I believe it is what Siri was supposed to be, and what the big labs wish they could make.</p></blockquote><p>What the big labs wish they could make? Directionally correct I suppose, but my timeline was off. When I wrote those words on February 12, I assumed we were at least a year away from seeing anything comparable from the big labs. Further out still for enterprise adoption.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://jpreagan.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Notes on AI Engineering is a reader-supported publication. 
To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>Too much risk involved for the corporate world I figured. They&#8217;d sleep while we all hack and explore. But sleep isn&#8217;t right, it&#8217;s more like stalk.</p><p>Personal AI assistants will soon be as ubiquitous as smartphones. They are so useful that once they hit mainstream adoption, it will be harder to <em>not</em> have one than to just have one. Just like with smartphones, there will be a tipping point. We&#8217;ll have one for home and one for work, and billions of agents will talk to humans and other agents. Sure, things are going to get weird.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!dN1-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!dN1-!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png 424w, https://substackcdn.com/image/fetch/$s_!dN1-!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png 848w, 
https://substackcdn.com/image/fetch/$s_!dN1-!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png 1272w, https://substackcdn.com/image/fetch/$s_!dN1-!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!dN1-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png" width="1456" height="786" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:786,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:96879,&quot;alt&quot;:&quot;Line chart showing U.S. smartphone ownership rising from 5% in 2007 to 94% in 2026, forming a classic S-curve. Growth is slow from 2007 to 2010, accelerates steeply from 2010 to 2015, then gradually flattens as adoption approaches saturation.&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/190982338?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Line chart showing U.S. smartphone ownership rising from 5% in 2007 to 94% in 2026, forming a classic S-curve. 
Growth is slow from 2007 to 2010, accelerates steeply from 2010 to 2015, then gradually flattens as adoption approaches saturation." title="Line chart showing U.S. smartphone ownership rising from 5% in 2007 to 94% in 2026, forming a classic S-curve. Growth is slow from 2007 to 2010, accelerates steeply from 2010 to 2015, then gradually flattens as adoption approaches saturation." srcset="https://substackcdn.com/image/fetch/$s_!dN1-!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png 424w, https://substackcdn.com/image/fetch/$s_!dN1-!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png 848w, https://substackcdn.com/image/fetch/$s_!dN1-!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png 1272w, https://substackcdn.com/image/fetch/$s_!dN1-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd960c48e-d6e3-49d1-a09e-e4aaafb0279f_2100x1134.png 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 
17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">This is what a tipping point looks like. Personal AI assistants will likely follow the same curve, only faster.</figcaption></figure></div><p>Anyway, the same day I published those words I happened to hear Peter Steinberger (<a href="https://x.com/steipete">@steipete</a>), goated builder of OpenClaw, on Lex Fridman discussing how he was <a href="https://www.youtube.com/watch?v=YFjfBk8HI5o&amp;t=8269s">talking to and getting offers from Meta and OpenAI</a>. OK, I thought, they are moving fast on this. A couple of days later, our man announces <a href="https://steipete.me/posts/2026/openclaw">joining OpenAI</a>. He mentioned his &#8220;next mission is to build an agent that even my mum can use,&#8221; which is honestly a pretty awesome goal. Right now the thing is for hackers, but everyone should have access to this technology.</p><p>Having an assistant running 24/7 on my Mac mini, I&#8217;ve been impressed by how useful it can be in my life. If you thought chatbots were great, you ain&#8217;t seen nothing yet, baby. This technology is early stage, but it&#8217;s a step change beyond anything we&#8217;ve encountered interacting with LLMs up until now.
We&#8217;re at a major point in computing history.</p><p>I&#8217;m not some SaaS bro writing slop code for zero-user apps. Yea, there is a lot of noise online about that right now. Ugh. I&#8217;m more interested in how this technology can learn about us and help us become better family members, think deeper about the problems that matter to us, and be healthier, more efficient human beings.</p><p>OpenClaw does this by learning about you over time, and it has a hackable memory system. It doesn&#8217;t wait for you to ask. It reaches out proactively, surfacing things you need to know before you think to check. You have to experience this a few times to get why it&#8217;s different.</p><p>It also occurred to me how wickedly good personal AI assistants could be in a work environment. There are real hurdles like security boundaries and terms of service, but we&#8217;re already starting to see some of these problems get solved. They&#8217;re worth solving.</p><p>First, let&#8217;s go over how we got here.</p><h2>A trip down memory lane</h2><p>We all had our ChatGPT moment. Do you remember the first time you had a fluent and pretty reasonable conversation with a model? Magic. Actually, math, but yea magic.</p><p>The models kept making progress, and beyond that we got three things in 2023 that made them considerably more interesting:</p><ol><li><p><a href="https://openai.com/index/function-calling-and-other-api-updates/">Function calling</a> in June</p></li><li><p><a href="https://openai.com/index/introducing-chatgpt-search/">Web search</a> in October</p></li><li><p><a href="https://openai.com/index/new-models-and-developer-products-announced-at-devday/">JSON mode</a> in November</p></li></ol><p>The problem with ChatGPT was that it couldn&#8217;t &#8220;do&#8221; anything. Function calling rolled out to fix that. Now we could write pieces of code that the model could decide when to call and execute at the right time with the right parameters.
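To make that concrete, here is a minimal sketch of the pattern, assuming a hypothetical local get_weather function: you describe the tool to the model as a JSON schema, and when the model replies with a tool call, you parse its arguments and execute the matching function.

```python
import json

# A tool definition in the function-calling format: the model sees this
# schema and decides when to call the function and with what arguments.
# (Illustrative only; get_weather is a hypothetical local function.)
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # stub; a real tool would hit a weather API

def dispatch(tool_call: dict) -> str:
    """Execute the function the model asked for, with the model's arguments."""
    args = json.loads(tool_call["function"]["arguments"])
    if tool_call["function"]["name"] == "get_weather":
        return get_weather(**args)
    raise ValueError("unknown tool")

# A simulated tool call, shaped like the one an assistant message carries:
result = dispatch({"function": {"name": "get_weather",
                                "arguments": '{"city": "Hilo"}'}})
print(result)  # Sunny in Hilo
```

The model never executes anything itself; it only emits the name and arguments, and your code decides whether and how to run them.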
That was the theory anyway, but anyone who remembers working with them at the time knows the models were pretty terrible at using them.</p><p>Another limitation skeptics of LLMs were quick to point out was the training cutoff date, which was a bigger deal then than it is today. Models take a while to pre-train and go through post-training before they are released, so people loved to dunk on them for being &#8220;so dumb&#8221; for not knowing stuff that happened in the last few months.</p><p>Side note: I wouldn&#8217;t recommend dunking on LLMs too much. It comes across as weirdly insecure, and if you give them six months you&#8217;ll be the one feeling dumb. We seem to blow through one barrier after another. There is nothing stopping progress from here on out; everything is an engineering challenge that will be solved.</p><p>Anyway, web search filled that gap, but it didn&#8217;t get decently good in my opinion until <a href="https://openai.com/index/introducing-o3-and-o4-mini/">o3 was released in late 2024</a>. Likewise, JSON mode was announced at OpenAI&#8217;s first DevDay, which I attended in November 2023.
It was essentially a predecessor to <a href="https://openai.com/index/introducing-structured-outputs-in-the-api/">structured outputs</a>, which didn&#8217;t come out until August 2024.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!LBvA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!LBvA!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg 424w, https://substackcdn.com/image/fetch/$s_!LBvA!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg 848w, https://substackcdn.com/image/fetch/$s_!LBvA!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!LBvA!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!LBvA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg" width="1456" height="1941" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1941,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:856447,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/190982338?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!LBvA!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg 424w, https://substackcdn.com/image/fetch/$s_!LBvA!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg 848w, https://substackcdn.com/image/fetch/$s_!LBvA!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!LBvA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F57b1cd6b-a557-472c-ab65-87cea02d8d88_1625x2166.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Me at the first OpenAI DevDay, November 2023. Everything felt like it was about to change. It was.</figcaption></figure></div><p>Structured outputs is what we use today. So much better than JSON mode because now I can not only ask for a guaranteed parsable JSON response but one which conforms to a specific schema which I also define. 
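Here is what that difference looks like in practice, as a sketch of a Chat Completions request payload (the model name and schema are illustrative, and the reply is simulated rather than fetched): JSON mode only promised parsable JSON, while structured outputs pin the reply to a schema you supply.

```python
import json

# The schema I define; with strict mode the reply must conform to it.
schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "year": {"type": "integer"},
    },
    "required": ["title", "year"],
    "additionalProperties": False,
}

# Sketch of the request body for a structured-outputs call.
request = {
    "model": "gpt-4o-2024-08-06",  # illustrative model name
    "messages": [{"role": "user", "content": "Name a 2023 AI milestone."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {"name": "milestone", "strict": True, "schema": schema},
    },
}

# Because the reply is guaranteed to parse AND match the schema, downstream
# code can index fields without defensive checks. (Simulated reply below.)
reply = json.loads('{"title": "Function calling", "year": 2023}')
print(reply["year"])  # 2023
```

With plain JSON mode you would still have to validate the shape yourself; strict schema conformance is what removes that whole class of glue code.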
Function calling likewise had to wait until better models came along to be useful, probably only starting in 2024 as well.</p><h2>Models are only as good as the context provided to them</h2><p>Over the past few years, a handful of phrases have really stuck in my head:</p><ol><li><p>&#8220;Infinitely stable dictatorships,&#8221; a phrase Ilya Sutskever ominously warned about in a doomer moment in the <a href="https://en.wikipedia.org/wiki/IHuman_(film)">2019 iHuman documentary</a>.</p></li><li><p>&#8220;Sparks of AGI&#8221; from the title of a <a href="https://arxiv.org/abs/2303.12712">Microsoft research paper</a> published in 2023.</p></li><li><p>&#8220;Machines of loving grace,&#8221; Dario Amodei&#8217;s <a href="https://darioamodei.com/essay/machines-of-loving-grace">essay you should definitely read</a>, which borrows its title from <a href="https://www.readingdesign.org/all-watched-over-by-machines">Richard Brautigan&#8217;s 1967 poem</a>.</p></li><li><p>&#8220;Models are only as good as the context provided to them.&#8221;</p></li></ol><p>The last is perhaps not exactly poetry, but it&#8217;s one I&#8217;ve often heard repeated, and for good reason. As far as I can tell, it was first stated by Mahesh Murag (<a href="https://x.com/MaheshMurag">@MaheshMurag</a>) at the <a href="https://www.youtube.com/watch?v=kQmXtrmQ5Zg">AI Engineer Summit in New York, 2025</a>. At any rate, it&#8217;s a simple truth I keep coming back to. In my own AI engineering experience over the past few years, the models get stronger, but you need the right context or they&#8217;re useless.</p><p>I&#8217;ll get back to that thought in a moment, but there is one other piece of this puzzle we need to fill in: agents.
&#8220;Agents are models using tools in a loop,&#8221; a pithy definition from <a href="https://www.youtube.com/watch?v=XSZP9GhhuAc">Hannah Moran</a>, and one I still favor.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!q7cm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!q7cm!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp 424w, https://substackcdn.com/image/fetch/$s_!q7cm!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp 848w, https://substackcdn.com/image/fetch/$s_!q7cm!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp 1272w, https://substackcdn.com/image/fetch/$s_!q7cm!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!q7cm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp" width="1456" height="606" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:606,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:15066,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/190982338?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!q7cm!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp 424w, https://substackcdn.com/image/fetch/$s_!q7cm!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp 848w, https://substackcdn.com/image/fetch/$s_!q7cm!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp 1272w, https://substackcdn.com/image/fetch/$s_!q7cm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F624c0b30-7295-4dce-a097-dda93e9b4413_2401x1000.webp 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">An autonomous agent according to Anthropic in 2024</figcaption></figure></div><p>And trust me, trying to define what the heck even is an agent was a thing for sure. There was plenty of talk about <em>environment</em> that struck me as weirdly pedantic. I think folks were trying to abstractly account for things like physical robots we don't even have yet. Meanwhile, environment turned out to be&#8230; bash. An agent can ostensibly do anything a human can do on a computer via the command line (bash) and, increasingly, via screen-based interaction (computer use).</p><p>When Claude Code was released in early 2025 in tandem with Claude 3.7 Sonnet it was no doubt a watershed moment. The model was meh but improving. 
A lot of us were using Claude 3.5 Sonnet already to write actual production code, but it was more of a &#8220;let&#8217;s see if we can do this&#8221; or &#8220;hey wouldn&#8217;t it be fun if I just had the LLM write this for me,&#8221; and a whole lot of copy pasta between the chatbot and your IDE that no sane human being should have to do.</p><p>I&#8217;m sure we could have just written the code ourselves in less time during that period, but what is the fun in that? I was ready to automate my job away. Anyone who tried to use chatbots to write serious code can tell you: the model really needs context of your codebase, and it ideally needs that iterative feedback loop from linters, compilers, and testing. Claude Code automated this loop.</p><p>If you spent time in that period toiling away back and forth with the chatbot, you got pretty good at hand-rolling artistically packaged context blocks, refining as you went. These are essentially the actions Claude Code automates for you, and if you did that work, you look at today&#8217;s agentic output differently.</p><p>All I see is system prompt, user message, tool call, parameters, return, and so on. When I&#8217;m having trouble building agents or working with agents, I go back to this simple thing: look at what is being sent to the model and what is coming back. With open source models, that extends to the inference server logs and, on rare occasions, even reading the server&#8217;s source code to make sure you truly understand what&#8217;s going on.</p><p>It&#8217;s not like I&#8217;m looking at the raw request of every agentic turn, but when there is trouble this is exactly where I go. This is a bit of a superpower, by the way. Keep that in your back pocket.</p><h2>I thought you were gonna talk about AI assistants</h2><p>The genius of OpenClaw is how it&#8217;s layered. Start with an agent, not unlike Claude Code, built in this case on the Pi framework.
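That agent core is the &#8220;models using tools in a loop&#8221; idea from earlier. A minimal sketch, with the model stubbed out so it runs offline (the function and message shapes are illustrative, not any framework's real API), makes the system prompt, tool call, tool result, final answer cycle explicit:

```python
# Stand-in for a chat-completions call: first turn asks for a tool,
# second turn (after seeing a tool result) answers.
def fake_model(messages: list[dict]) -> dict:
    if not any(m["role"] == "tool" for m in messages):
        return {"tool": "bash", "args": "echo 42"}
    return {"answer": "The command printed 42."}

def run_tool(name: str, args: str) -> str:
    assert name == "bash"
    return "42"  # pretend we ran `echo 42` in a shell

def agent(task: str) -> str:
    """Tools in a loop: call the model, execute any tool it asks for,
    feed the result back, repeat until it returns a final answer."""
    messages = [{"role": "system", "content": "You can run bash."},
                {"role": "user", "content": task}]
    while True:
        reply = fake_model(messages)
        if "answer" in reply:
            return reply["answer"]
        output = run_tool(reply["tool"], reply["args"])
        messages.append({"role": "tool", "content": output})

print(agent("What does echo 42 print?"))  # The command printed 42.
```

Everything else, bash access, skills, memory, is layered around this one loop.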
It scales through bash, CLIs, and agent skills, giving it practically unlimited access to your personal context. Your messages, email, calendar, notes, health data, whatever you wire up.</p><p>Layer a persistent memory system on top of that, and now it knows you and evolves. Finally, it reaches out to you and you to it through whatever messaging apps you already use. This is what makes it different from a chatbot. A chatbot answers questions. This thing lives on your machine, accumulates context over time, and acts on your behalf.</p><h2>Big tech is opening the door</h2><p>Sure, you can integrate most of these data sources into ChatGPT or Claude already. But this hits different. With OpenClaw, it&#8217;s my machine, running on my home network. When it browses the web, it does so from a residential IP address. I text it through my normal messaging app, iMessage. It&#8217;s not a product someone else controls. It&#8217;s mine.</p><p>As far as I can tell, the platforms are starting to meet us halfway. Peter Steinberger built <a href="https://github.com/steipete/gogcli">gogcli</a> to integrate Google products early on with plenty of &#8220;you can just do things&#8221; energy. But Google themselves recently released a <a href="https://github.com/googleworkspace/cli">Workspace CLI</a> built for humans and AI agents. That&#8217;s a pretty significant signal. This reads to me as a direct response to the new class of personal assistants that OpenClaw opened the door to.</p><p>It appears Microsoft isn&#8217;t far behind. A few days ago Brad Groux (<a href="https://x.com/BradGroux">@BradGroux</a>) <a href="https://x.com/BradGroux/status/2032584859617726652">posted on X</a> that more than a dozen Microsoft employees are involved in getting OpenClaw working on Teams, with six dedicated to the effort. They&#8217;re dogfooding it internally.
Microsoft is actively making Teams available to this class of AI assistant.</p><p>Clearly, the platforms see where this is going and they&#8217;re laying the rails. All of OpenClaw is built without first-party support. Now imagine what happens with it.</p><h2>Build over buy</h2><p>On the enterprise side, we&#8217;re seeing a similar signal from a different angle. Stripe, Ramp, and Coinbase have all built their own internal coding agents rather than buying off the shelf. Different approaches: Stripe forked an open source agent, Ramp composed on top of one, and Coinbase built from scratch. But the underlying conclusion is the same: nobody understands your business like you do.</p><p>In short, all signs point to build right now. The platform vendors are shipping first-party CLIs and the best engineering organizations are building their own agents in-house. The writing is on the wall.</p><h2>Get started now</h2><p>If you&#8217;re reading this newsletter, my advice hasn&#8217;t changed: get started. Set it up. Dogfood it.</p><p>There will be polished personal AI assistants ready for mainstream use eventually. Right now, while it&#8217;s still rough, is the best time to learn. Same energy as hand-rolling context blocks for chatbots. It was painful and the models weren&#8217;t quite there either, but the people who did it came out the other side understanding how these systems actually work.</p><p>I believe that understanding compounds. When the safe version arrives, you won&#8217;t be learning. You&#8217;ll be building.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://jpreagan.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Notes on AI Engineering is a reader-supported publication.
To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[I'm done waiting for Apple to figure this out]]></title><description><![CDATA[I spent this weekend setting up OpenClaw. It's brilliant. And I believe these type of personal assistants are the future of computing. Here's what I learned.]]></description><link>https://jpreagan.com/p/im-done-waiting-for-apple-to-figure</link><guid isPermaLink="false">https://jpreagan.com/p/im-done-waiting-for-apple-to-figure</guid><dc:creator><![CDATA[James Reagan]]></dc:creator><pubDate>Mon, 02 Feb 2026 08:39:36 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!KyGq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!KyGq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!KyGq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic 424w, 
https://substackcdn.com/image/fetch/$s_!KyGq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic 848w, https://substackcdn.com/image/fetch/$s_!KyGq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic 1272w, https://substackcdn.com/image/fetch/$s_!KyGq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!KyGq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic" width="1200" height="628" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:628,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:65811,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/186589805?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!KyGq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic 424w, https://substackcdn.com/image/fetch/$s_!KyGq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic 848w, https://substackcdn.com/image/fetch/$s_!KyGq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic 1272w, https://substackcdn.com/image/fetch/$s_!KyGq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6051d2da-711d-4646-8250-357c4a828cbd_1200x628.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I spent this weekend setting up <a href="https://openclaw.ai">OpenClaw</a>. It's brilliant. And I believe this type of personal assistant is the future of computing. Here's what I learned.</p><h2><strong>Why I did this</strong></h2><p>I&#8217;m a hacker. I spent years watching Siri (and any others you care to mention) struggle with basic requests and wondering why digital assistants felt so limited. The answer, I believe, is walls. Walls between apps, walls between services, walls between my data and the AI that could actually use it.</p><p>OpenClaw breaks those walls down. It&#8217;s self-hosted, which means I control the data. It connects to multiple channels, which means I&#8217;m not locked into one ecosystem. And you can run it with any model from any provider.</p><p>This is the ground floor of something new. Personal assistants running on your own hardware and integrated with your actual life. Stop having opinions about whether this is a good idea. Just try it. The learning happens in the doing.</p><h2><strong>The reader and brain</strong></h2><p>Here&#8217;s where I&#8217;ve started. The approach I&#8217;m taking is one of separation. One machine holds my personal data. Another machine runs the AI. They communicate through a narrow, read-only bridge.</p><pre><code><code>MacBook (Personal)              Mac mini (Brain)
- Personal Apple ID             - Bot Apple ID
- Messages and data             - OpenClaw Gateway
- Read-only message CLI         - BlueBubbles
       &#8593;                              &#8595;
       &#9492;&#9472;&#9472;&#9472;&#9472; SSH/Tailscale &#9472;&#9472;&#9472;&#9472;&#9472;&#9472;&#9472;&#9472;&#9472;&#9472;&#9472;&#9496;</code></code></pre><p>My MacBook stays personal. It has my Apple ID, my messages, my files. The only thing exposed to the outside world is a restricted SSH endpoint that can read messages but not send them, access specific data but not modify it.</p><p>The brain machine runs OpenClaw. It has its own Apple ID, its own identity. BlueBubbles handles outbound messaging through this separate account. I text my assistant the way I&#8217;d text a friend: no special app, just another conversation in iMessage.</p><p>We have to bear in mind that this agent is nondeterministic and runs with real permissions. It can execute commands, access files, send messages. If it gets tricked by a prompt injection buried in a message or webpage, or if it simply makes a mistake, it acts with whatever access I&#8217;ve given it.</p><p>The two-machine split, it seems to me, limits the blast radius. The agent on the brain machine can&#8217;t reach my Keychain, my Apple Wallet, or my personal files. It can only read messages through a narrow SSH bridge. It can&#8217;t modify them, can&#8217;t send as my personal identity, can&#8217;t escalate beyond that one forced command. Even a manipulated agent can only do damage within the boundaries I&#8217;ve drawn.</p><h2><strong>Making sense of the pieces</strong></h2><p>The hardest part wasn&#8217;t the technical setup. It was understanding the concepts. OpenClaw has a lot of moving pieces.</p><p>There&#8217;s a gateway that routes the AI. There&#8217;s an agent with the assistant&#8217;s identity and memory. There are channels for different messaging services. There are tools the agent can use. There are policies controlling what it can access. There&#8217;s a memory system, a heartbeat mechanism, scheduled jobs.</p><p>I kept reading documentation and feeling like I was missing something fundamental. 
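</p><p>One thing that helped me was sketching how those pieces relate in a single config. To be clear, the JSON below is purely illustrative; every field name here is my own invention, not OpenClaw&#8217;s actual schema:</p><pre><code>{
  "gateway": { "model": "provider/model-name" },   // which model the gateway routes to (hypothetical key)
  "agent":   { "name": "assistant", "memory_dir": "~/memory" },
  "channels": ["imessage"],                        // start with one channel
  "tools":   { "shell": true, "browser": false },
  "policies": { "group_chats": "sandboxed" }       // sandboxed contexts get fewer capabilities
}</code></pre><p>Even as a fiction, a sketch like this made the gateway/agent/channel/tool/policy layering click for me.</p><p>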
I believe the best thing to do is just jump in, but start small, because there is a lot of surface to cover.</p><p>I decided to begin with just text messages so I could get a feel for how it works in motion. One data source, one channel, one attack surface to understand before expanding.</p><h2><strong>The addictive part</strong></h2><p>This is where it got fun.</p><p>You have access to everything under the hood. The configuration is JSON you can read and modify. The logs tell you exactly what&#8217;s happening. When something breaks, and it will, you can trace it.</p><p>I configured the SSH bridge, tested the connection, watched the first message query flow through. I set up BlueBubbles for outbound messaging, paired my iPhone, sent a test text.</p><p>Then I asked a question that required my personal context. And got a real answer.</p><p>The response wasn&#8217;t generic. It was grounded in my actual messages, my actual life. The assistant knew things because I had given it access to know things. And it felt completely different from typing into ChatGPT in a browser.</p><h2><strong>How it works</strong></h2><p>Understanding the underlying systems helped me trust the setup more.</p><h3>Memory</h3><p>OpenClaw stores memories as plain Markdown files. There&#8217;s a daily log for running notes and a curated file for long-term memories. The memories are human-readable. I can open them in any text editor, see exactly what&#8217;s stored, and edit or delete anything I want. No black box, no hidden database.</p><p>Before the conversation context gets too long, the system prompts the assistant to save anything important. Memories persist even when individual conversations are compacted.</p><h3>Heartbeat</h3><p>Every thirty minutes, or whatever interval you set, the assistant wakes up and checks a heartbeat file. 
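</p><p>Mine is only a few lines of Markdown; the exact wording below is my own, nothing OpenClaw mandates:</p><code><pre># Heartbeat

- Check for urgent unread messages
- Review upcoming calendar events
- Summarize any finished background tasks</pre></code><p>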
This is a simple Markdown checklist: check for urgent messages, review upcoming calendar events, summarize any finished background tasks.</p><p>If nothing needs attention, it stays quiet. If something does, it surfaces it. Proactive awareness is the new paradigm.</p><h3>Scheduled jobs</h3><p>Cron jobs handle precise timing. A daily briefing at 7 AM. A reminder in twenty minutes. A weekly analysis every Monday morning.</p><p>Jobs can run in the main conversation context or in isolation with their own session. They can use different models and different thinking levels, and deliver results to specific channels. &#8220;Remind me in 20 minutes about the call&#8221; just works.</p><h3>Tools and capabilities</h3><p>The gateway handles multiple CLI tools, plugins, and skills. The assistant can read and write files, execute commands, search the web, control a browser, manage calendar events, and send messages across channels.</p><p>Tools are composed in layers, with policies controlling access. A sandboxed context gets different capabilities than a fully trusted one. Group chats can be sandboxed or given different tool policies if you configure them that way.</p><h2><strong>Who should try this</strong></h2><p>OpenClaw is for hackers who love building systems. I wouldn&#8217;t recommend it for mass consumption right now: there&#8217;s plenty of security risk if you&#8217;re not knowledgeable in this area, and it&#8217;s risky for anyone who&#8217;s careless.</p><p>Still, this is the thing we must build, and it must be open source so we can understand how it operates. At some point in the near future, I believe nearly every person on the planet will have a personal assistant.</p><p>If you want something that works out of the box with no configuration, wait a year. But if you want to be on the ground floor while the space is still being established, now is the time.</p><h2><strong>What&#8217;s next</strong></h2><p>I started with text messages. Next is email, then notes, calendar, and health data. 
Each expansion, I expect, will follow the same pattern: understand the data source, configure the access, verify the boundaries, and test the integration.</p><p>This is day one of a long journey. We&#8217;re building the plane while flying it. The combination of capabilities, agency, memory, integration, and ownership is what makes this different from everything that came before.</p><p>Stop having opinions. Start experimenting. This is the most exciting thing happening in AI right now, and you can run it on your own hardware this weekend.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://jpreagan.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Notes on AI Engineering is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[AI isn't the Point]]></title><description><![CDATA[My 2025 Reflection]]></description><link>https://jpreagan.com/p/ai-isnt-the-point</link><guid isPermaLink="false">https://jpreagan.com/p/ai-isnt-the-point</guid><dc:creator><![CDATA[James Reagan]]></dc:creator><pubDate>Wed, 24 Dec 2025 07:57:02 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!xJrn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg" length="0" 
type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!xJrn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!xJrn!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg 424w, https://substackcdn.com/image/fetch/$s_!xJrn!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg 848w, https://substackcdn.com/image/fetch/$s_!xJrn!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!xJrn!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!xJrn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg" width="1456" height="971" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:790428,&quot;alt&quot;:&quot;Photo by Arina Bondar on Unsplash&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/182488117?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Photo by Arina Bondar on Unsplash" title="Photo by Arina Bondar on Unsplash" srcset="https://substackcdn.com/image/fetch/$s_!xJrn!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg 424w, https://substackcdn.com/image/fetch/$s_!xJrn!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg 848w, https://substackcdn.com/image/fetch/$s_!xJrn!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!xJrn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5cf9b063-c680-473e-9259-a463baf4438e_2400x1600.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button 
tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@rina_coop?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText">Arina Bondar</a> on <a href="https://unsplash.com/photos/brown-concrete-building-under-blue-sky-during-daytime-aK5U4gzF1Ng?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText">Unsplash</a></figcaption></figure></div><p>I&#8217;m in my third year as a software engineer with the last two focused exclusively on artificial intelligence. I work from home on Hawai&#699;i Island as a government contractor designing systems that use AI/ML to process information and make human tasks more efficient. 
As the year closes, I find myself reflecting less on what AI can do and more on who&#8217;s building it and why.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!pRHo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!pRHo!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg 424w, https://substackcdn.com/image/fetch/$s_!pRHo!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg 848w, https://substackcdn.com/image/fetch/$s_!pRHo!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!pRHo!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!pRHo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg" width="752" height="752" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:752,&quot;width&quot;:752,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:99491,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/182488117?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!pRHo!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg 424w, https://substackcdn.com/image/fetch/$s_!pRHo!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg 848w, https://substackcdn.com/image/fetch/$s_!pRHo!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!pRHo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5b10b69-8ab7-414e-a796-1a8a82ea419e_752x752.jpeg 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" 
fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>A few years ago, I started posting online about what I was learning, what I was building, and what I thought was changing. I had no audience and no particular strategy. I just wanted to think out loud and see who was thinking about the same things.</p><p>That decision opened doors for me.</p><h2>VS Code + GitHub Insider Summit</h2><p>In September, I was invited to the VS Code + GitHub Insider Summit in Redmond, Washington. I spent a few days in conversation with people whose blogs I&#8217;d been reading for years, whose tools I use daily. There&#8217;s still something magical about meeting someone you&#8217;ve previously only known through their work. 
Technology is made by people, and understanding the people changes how you understand the work.</p><p>The purpose of the summit was to discuss the future of VS Code and GitHub Copilot, what&#8217;s working, what isn&#8217;t, and where things should go. The Microsoft and GitHub teams were generous with their time and genuinely curious about what practitioners are experiencing. The level of care was evident, and that&#8217;s not something you can fake in a room that small.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!r4K8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!r4K8!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg 424w, https://substackcdn.com/image/fetch/$s_!r4K8!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg 848w, https://substackcdn.com/image/fetch/$s_!r4K8!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!r4K8!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!r4K8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg" width="1280" height="960" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:960,&quot;width&quot;:1280,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:354878,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/182488117?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!r4K8!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg 424w, https://substackcdn.com/image/fetch/$s_!r4K8!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg 848w, https://substackcdn.com/image/fetch/$s_!r4K8!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!r4K8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a49518b-b1f7-4bd4-929d-dfb700634052_1280x960.jpeg 1456w" 
sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://jpreagan.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Notes on AI Engineering is a reader-supported publication. 
To receive new posts and support my work, consider becoming a free or paid subscriber.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><h2>Aotearoa</h2><p>Then in November, I traveled with my h&#257;lau to Aotearoa. The trip was about cultural exchange, connecting our h&#257;lau with M&#257;ori communities, visiting marae, and honoring the ties between Hawai&#699;i and Aotearoa.</p><p>The first leg of our journey began at <a href="https://wipce2025.com">WIPCE</a>, the World Indigenous Peoples&#8217; Conference on Education. Among the sessions I attended, several focused specifically on AI, and I found myself thinking about AI in ways I hadn&#8217;t anticipated. The talks all seem to converge on the same theme: own your data, own your infrastructure, own your future.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!5f3F!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!5f3F!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg 424w, https://substackcdn.com/image/fetch/$s_!5f3F!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!5f3F!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!5f3F!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!5f3F!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg" width="1456" height="1092" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1092,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:5967827,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/182488117?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!5f3F!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!5f3F!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg 848w, https://substackcdn.com/image/fetch/$s_!5f3F!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!5f3F!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F214c88ab-abce-48fc-89d3-37139498e7a2_5088x3816.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p><a href="https://tehiku.nz">Te Hiku Media</a> presented their work building a M&#257;ori speech recognition model. They&#8217;d started as a radio station in 1990, preserving recordings of native speakers, and eventually realized they were sitting on something invaluable. They had training data. Rather than hand it to Google or Apple, they built their own tools. Their ASR model now <a href="https://blogs.nvidia.com/blog/te-hiku-media-maori-speech-ai/">transcribes te reo M&#257;ori with 92% accuracy</a>, trained on recordings their community contributed and still controls.</p><p>They spoke about land and data as inseparable. Both are inherited, both require stewardship, and both can be extracted by outsiders if you&#8217;re not careful.</p><p>Dr. Lars Ailo Bongo, a S&#225;mi researcher from Norway, put it plainly: &#8220;AI is so important our biggest worry is to be left behind.&#8221; His team used LoRA to train open weight models that accurately represent G&#225;kti, traditional S&#225;mi clothing. The <a href="https://huggingface.co/Sami-AI-Lab">results</a> came from one researcher working for ten days with twenty images. Imagine, he said, what full research funding could do.</p><p>I&#8217;d arrived with a vague sense that AI might help indigenous communities. Something about land stewardship, environmental monitoring, language preservation. I left with real examples of people already doing it, and a framework for thinking about why it matters. This is what open source is actually for. For communities that have had things taken from them, it&#8217;s a way to build without asking permission.</p><p>Since coming home, I&#8217;ve been thinking about how often we interact with our phones. Siri still mangles Hawaiian place names. It can&#8217;t speak &#699;&#333;lelo Hawai&#699;i. 
Te Hiku Media poured years into solving this for te reo M&#257;ori, and now they are partnering with UH Hilo on <a href="https://lauleo.com">Lauleo</a>, a project to build the first speech-to-text model for Hawaiian.</p><h2>China</h2><p>This year, China became the dominant force in open weight models, and the story is inseparable from one person. Liang Wenfeng started as a quant trader. He co-founded High-Flyer, a hedge fund that grew to manage over 100 billion yuan. In 2023, he pivoted to AI and founded DeepSeek. Unlike most lab founders chasing commercial applications, Liang wanted to do foundational research and give it away.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!cCrB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!cCrB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp 424w, https://substackcdn.com/image/fetch/$s_!cCrB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp 848w, https://substackcdn.com/image/fetch/$s_!cCrB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp 1272w, https://substackcdn.com/image/fetch/$s_!cCrB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp 1456w" 
sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!cCrB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:109610,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://jpreagan.com/i/182488117?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!cCrB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp 424w, https://substackcdn.com/image/fetch/$s_!cCrB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp 848w, https://substackcdn.com/image/fetch/$s_!cCrB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp 1272w, 
https://substackcdn.com/image/fetch/$s_!cCrB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8654135-6cc6-4af6-ba0d-8e99574b4456_1854x1237.webp 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>In late December 2024, DeepSeek released <a href="https://api-docs.deepseek.com/news/news1226">V3</a>, a 671 billion parameter model that matched GPT-4o and Claude 3.5 Sonnet on benchmarks. The reported training cost was $5.6 million. 
Then on January 20, 2025, they released <a href="https://api-docs.deepseek.com/news/news250120">R1</a>, a reasoning model that matched OpenAI&#8217;s o1 across math, code, and reasoning tasks. The release temporarily cratered Nvidia&#8217;s stock price.</p><p>In September, the <a href="https://www.nature.com/articles/s41586-025-09422-z">R1 paper was published in Nature</a>, making it the first large language model to pass peer review in a major scientific journal. Nature made it the cover story. The paper disclosed a training cost of just $294,000 for the reasoning capabilities and addressed the distillation controversy directly: R1 did not learn by copying reasoning examples from OpenAI models. Eight external experts reviewed the work. Nature&#8217;s editors called it &#8220;a welcome step toward transparency and reproducibility.&#8221;</p><p>What really struck me though was the philosophy behind it. Liang has said that open sourcing doesn&#8217;t mean losing anything, and that for technical people, being followed and built upon provides a sense of accomplishment. &#8220;Giving is an extra honor,&#8221; he stated in an <a href="https://wallstreetcn.com/articles/3719982">interview</a>. &#8220;A company doing this has cultural attractiveness.&#8221; He believes China can&#8217;t remain a follower forever, and that the real gap isn&#8217;t one or two years but the gap between originality and imitation.</p><p>His team is almost entirely young graduates from Chinese universities. No mysterious overseas talent. Just people who like solving hard problems, given the resources to try. And I believe several labs like this exist in China today.</p><p>The uncomfortable tension is obvious. The same country producing researchers who share their work freely is also the country building mass surveillance infrastructure. The same tools, different hands, different outcomes. I don&#8217;t think this contradiction resolves. 
Individual researchers solving problems they find interesting exist alongside a state apparatus that might use those solutions for control. Both are true.</p><p>Technology doesn&#8217;t have values. People do. And right now, some of the people pushing open models forward are doing it because they believe knowledge should be shared.</p>]]></content:encoded></item><item><title><![CDATA[Is AI eating your coding skills?]]></title><description><![CDATA[Is AI eroding your coding skills? 
My thoughts on maintaining problem-solving and traditional coding ability in an era of AI-generated code.]]></description><link>https://jpreagan.com/p/is-ai-eating-your-coding-skills</link><guid isPermaLink="false">https://jpreagan.com/p/is-ai-eating-your-coding-skills</guid><dc:creator><![CDATA[James Reagan]]></dc:creator><pubDate>Mon, 30 Jun 2025 16:56:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!lp1l!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!lp1l!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!lp1l!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg 424w, https://substackcdn.com/image/fetch/$s_!lp1l!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg 848w, https://substackcdn.com/image/fetch/$s_!lp1l!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!lp1l!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg 
1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!lp1l!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1387763,&quot;alt&quot;:&quot;Photo by Marat Gilyadzinov on Unsplash&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://jpreagan.substack.com/i/174571517?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Photo by Marat Gilyadzinov on Unsplash" title="Photo by Marat Gilyadzinov on Unsplash" srcset="https://substackcdn.com/image/fetch/$s_!lp1l!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg 424w, https://substackcdn.com/image/fetch/$s_!lp1l!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg 848w, https://substackcdn.com/image/fetch/$s_!lp1l!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!lp1l!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3f57204-28cd-446e-bb4d-af97c4dd9d51_3800x2533.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@m3design?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Marat Gilyadzinov</a> on <a 
href="https://unsplash.com/photos/closeup-photography-of-swarm-of-jellyfish-MYadhrkenNg?utm_content=creditCopyText&amp;utm_medium=referral&amp;utm_source=unsplash">Unsplash</a></figcaption></figure></div><p>Lately, I hear developers say that generating code with AI feels a lot like <a href="https://news.ycombinator.com/item?id=44327924&amp;ref=jpreagan.com">playing a slot machine</a>. You pull the lever, hoping for a jackpot. When it hits, the feeling is incredible. &#8220;Think of all of this time I just saved!&#8221; When the results aren&#8217;t quite right, the temptation is to keep pulling the lever. Sometimes you&#8217;re just chasing a win that will never arrive.</p><p>Must be a skill issue, right? Maybe you haven&#8217;t prompted the LLM correctly? Prompting is definitely a real skill you can become good at, but it isn&#8217;t always the culprit. LLMs are designed to produce plausible output based on context, not necessarily correct or optimal solutions. But as a software engineer, your goal is to write correct code, not just plausible code.</p><h2><strong>What AI can&#8217;t do (yet)</strong></h2><p>We say a problem is <em>in distribution</em> for an LLM if it&#8217;s something the model has seen frequently during training. The model will struggle, however, with contexts that are <em>out of distribution</em>. The challenge is that you have no reliable way of knowing which scenario you&#8217;re dealing with. After working with LLMs for a while, you may even develop a sense for when your problem falls into the latter category, but there is no guarantee.</p><p>This uncertainty underscores the ongoing importance of traditional programming skills, specifically coding without AI assistance. Many problems, both existing and those yet to be imagined, will remain <em>out of distribution</em> for AI models. That&#8217;s exactly why maintaining your traditional coding skills matters. 
Some tech influencers and business leaders tell us that <a href="https://tech.yahoo.com/ai/articles/nvidia-ceo-says-future-coding-124534339.html?ref=jpreagan.com">coding is dead</a>, or suggest it has no future as a profession. Frankly, I emphatically disagree.</p><p>Somehow, I feel like we&#8217;ve heard this claim before. In 1954, <a href="https://www.softwarepreservation.org/projects/FORTRAN/BackusEtAl-Preliminary%20Report-1954.pdf?ref=jpreagan.com">IBM&#8217;s preliminary report on FORTRAN</a> promised that the new language would &#8220;virtually eliminate coding and debugging.&#8221; In 1981, James Martin argued in <em>Application Development Without Programmers</em> that, with emerging nonprocedural tools, end users would soon be able to build their own software applications without professional coders, and that job growth would flatline. In the 1990s, visual programming tools were touted as a means to replace traditional coding. More recently, no-code platforms have claimed that anyone can deliver software solutions without writing a single line of code.</p><p>Yet every time, reality unfolded differently. These tools didn&#8217;t eliminate programmers; they mainly shifted the nature of our work. Rather than shrinking, the demand for developers exploded time and time again, pushing us toward more complex and creative tasks. I predict that AI will follow the same pattern.</p><h2><strong>AI is improving, but won&#8217;t replace humans</strong></h2><p>What can we expect going forward? No doubt, LLMs will continue to improve at helping us write code. They&#8217;ll never be perfect, though, and barring one or more scientific breakthroughs not currently evident, the goal of completely replacing human software engineers is science fiction. The code is yours when the AI finishes, but so is the responsibility. You also can&#8217;t verify what you don&#8217;t understand.</p><p>When ChatGPT launched, it was a watershed moment. 
Moving from GPT-3.5 to GPT-4 felt like a step change, and what a time of possibility that was. Shortly thereafter, Anthropic&#8217;s Claude 3.5 Sonnet in particular gained a deserved reputation for its coding ability. After that, we began to see only comparatively modest improvements, and this cooling tempered our expectations.</p><p>But then there was another breakthrough. OpenAI&#8217;s <a href="https://openai.com/index/learning-to-reason-with-llms/">reasoning approach</a> moved the needle again by training models to plan and verify, and by allocating additional inference-time compute. More recently, better memory management and tighter tool integration have gradually advanced the coding capabilities of new models. Despite all of this progress, models still hallucinate and continue to struggle with novel problems outside their training data.</p><h2><strong>Your coding skills matter more than ever</strong></h2><p>This post isn&#8217;t a rebuke of AI coding tools. On the contrary, I use them daily. They&#8217;re productive and, honestly, a lot of fun too. But their utility isn&#8217;t unlimited. I believe you would be wise to take action to preserve your traditional coding skills, along with your ability to effectively collaborate with AI. As a developer today, you will almost certainly need both skill sets. I&#8217;ll share the strategies that help me stay sharp as coding increasingly integrates AI tools.</p><p>Let&#8217;s start with the obvious: skills are like muscles, use them or lose them. If you stop practicing a skill, it weakens over time. You&#8217;ve probably noticed this when it comes to technical interviewing. Each time you begin a new job search, if you&#8217;re like me, you likely need to refresh your LeetCode skills. With practice, you regain proficiency, pass the interviews, and move on. Then, on the job, you naturally shift focus, and your interview skills fade once more. 
It&#8217;s a cycle we&#8217;re all familiar with.</p><p>Your coding skills follow the same pattern. If you consistently rely only on AI-generated code without regular coding practice, your core programming skills can deteriorate. Over time, tasks that were once straightforward can become surprisingly challenging. Developers call this skill rot, and it&#8217;s real.</p><h2><strong>Is AI causing cognitive decline?</strong></h2><p>Recently, there has been concern about a <a href="https://www.media.mit.edu/projects/your-brain-on-chatgpt/overview/?ref=jpreagan.com">widely discussed MIT study</a>. The study used EEG to measure the brain activity of students writing essays either with or without ChatGPT assistance, and it revealed reduced cognitive engagement among students who relied heavily on ChatGPT. Some worry that similar cognitive decline might occur with coding. In practice, AI-assisted programming involves active, ongoing dialogue between you and the AI. This interaction, I would argue, still requires technical skill and deep problem-solving ability.</p><p>Let&#8217;s be clear about what is at risk here and why it matters. When I first learned programming in the 1990s, compilers still came in boxes. My syntax recall was pretty strong because I wrote programs by hand with pencil and paper. Later, when I had access to school computers with a compiler, I could type out and run my programs, but the developer tools were still limited compared to today. Syntax recall didn&#8217;t make me a good programmer, though. 
I was still very much learning programming, and my growth came from thinking through problems.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!PgNI!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!PgNI!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg 424w, https://substackcdn.com/image/fetch/$s_!PgNI!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg 848w, https://substackcdn.com/image/fetch/$s_!PgNI!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!PgNI!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!PgNI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg" width="1000" height="683" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:683,&quot;width&quot;:1000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Vintage Borland C++ compiler box for Windows and DOS.&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Vintage Borland C++ compiler box for Windows and DOS." title="Vintage Borland C++ compiler box for Windows and DOS." srcset="https://substackcdn.com/image/fetch/$s_!PgNI!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg 424w, https://substackcdn.com/image/fetch/$s_!PgNI!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg 848w, https://substackcdn.com/image/fetch/$s_!PgNI!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!PgNI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fef5e09a7-d682-46bf-bd39-b8e844091011_1000x683.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 
0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">When compilers came in boxes.</figcaption></figure></div><p>My point is that whether you&#8217;re solving problems traditionally or with AI, you&#8217;re still engaging in deep technical thinking. The MIT study looked at students writing essays on autopilot, which is a completely different scenario from the interactive technical back-and-forth involved when coding with AI tools. So don&#8217;t stress too much if your syntax recall weakens a bit. The real value of a software engineer is problem-solving, not syntax. 
That said, I&#8217;ll share some strategies for preserving these practical skills reasonably well, but first, let&#8217;s expand a bit further on the earlier discussion about <em>in distribution</em> versus <em>out of distribution</em> problems.</p><h2><strong>AI cannot solve every problem</strong></h2><p>Some problems aren&#8217;t suited for AI because they are <em>out of distribution</em>: AI has encountered them rarely or never during training. Advancing models and better tool integration raise the bar for what AI can handle effectively, but they won&#8217;t eliminate these limitations entirely. You should understand two things about these problems: (1) AI will offer only limited value or may even be a waste of your time, and (2) these challenges often represent your most marketable skills. In other words, deep expertise in uncommon or unsolved problems is a real career advantage.</p><p>AI struggles here, and that struggle is precisely what makes your expertise valuable. Some will claim AI will eventually catch up. But new problems emerge constantly, likely along with entirely new paradigms we can&#8217;t yet imagine. The world is not static. AI might speed up solving common problems, concentrating even more expert attention on the complex ones. Celebrate that AI can&#8217;t solve every problem. It&#8217;s good news: AI will handle routine tasks, freeing you to focus on the rewarding, challenging work you likely enjoy most.</p><h2><strong>Never trust anything that AI produces</strong></h2><p>You should be skeptical by default of anything produced by AI. One of the wonderful advantages of code is that, unlike other AI outputs, you can directly verify its correctness: read the logic, run it, test it, and confirm whether or not it meets your expectations.<br><br>Developers apparently flip into autopilot mode too readily with AI-generated code, assuming it&#8217;s correct without adequate scrutiny. 
This can be dangerous. A <a href="https://arxiv.org/abs/2211.03622?ref=jpreagan.com">Stanford study</a> found that developers relying on AI assistants produced code with far more vulnerabilities, yet paradoxically felt overly confident about its security. Another recent <a href="https://arxiv.org/abs/2502.14202?ref=jpreagan.com">comparative evaluation</a> reinforced this concern, showing that even recent LLMs consistently fail to identify common security flaws in code examples, catching issues less than half the time.</p><p>But security vulnerabilities aren&#8217;t your only concern. AI-generated code can quietly introduce design flaws or subtle logic errors that slip past linters, compilers, and automated tests. The real danger is that these issues blend into your codebase and stay hidden until they cause significant problems later. When you are iterating with AI, pause frequently to assess your solution. Lean on your version control history to review even tiny incremental changes. Vibe coding has a time and place for demos and experiments, but for serious codebases with actual users, discipline matters a lot more.</p><h2><strong>Slow down for better code</strong></h2><p>There&#8217;s plenty of talk right now about AI-driven 10x productivity. Personally, I find those claims questionable. Numbers closer to 2x sound more believable to me. And frankly, if you&#8217;re genuinely producing code ten times faster, it&#8217;s probably shallow work, no offense. Speed alone isn&#8217;t worth celebrating.</p><p>I&#8217;m far more interested in how AI can help me improve the <em>quality</em> of my code, not just the lines of code or number of commits I make. How can I use AI to create superior designs, identify subtle bugs, or, even better, eliminate unnecessary code altogether? Instead of rushing to push out AI-generated code, slow down. Use AI to explore alternatives and thoughtfully question your assumptions. 
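To make that concrete, here is a contrived Python sketch, invented for illustration, of the kind of subtle logic error mentioned earlier that slips past a happy-path test:</p>

```python
def chunk(items, size):
    """Split a list into fixed-size chunks.

    Plausible AI-style output: it looks right and passes an obvious
    test, but the range's stopping condition silently drops a trailing
    partial chunk. (Contrived example, invented for illustration.)
    """
    return [items[i:i + size] for i in range(0, len(items) - size + 1, size)]

# The happy-path check passes, so a hurried review moves on.
print(chunk([1, 2, 3, 4], 2))     # [[1, 2], [3, 4]] -- looks correct

# An edge case worth asking the AI to propose: an odd-length list.
print(chunk([1, 2, 3, 4, 5], 2))  # [[1, 2], [3, 4]] -- the 5 is gone
```

<p>A correct version would iterate over <code>range(0, len(items), size)</code>; the point is that no linter or compiler flags the difference, and it stays hidden until an odd-length input shows up.</p><p>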
Ask the AI why it made specific suggestions. Have it critique your design, suggest simplifications, or propose edge cases you haven&#8217;t yet considered. And even then, stay skeptical.</p><h2><strong>Maintaining your practical coding skills</strong></h2><p>Even if you rely on AI every day, your traditional coding skills still need regular practice. In a <a href="https://nmn.gl/blog/ai-illiterate-programmers?ref=jpreagan.com">blog post</a>, Namanyay Goel writes about an uncomfortable truth:</p><blockquote><p>Every time we let AI solve a problem we could&#8217;ve solved ourselves, we&#8217;re trading long-term understanding for short-term productivity. We&#8217;re optimizing for today&#8217;s commit at the cost of tomorrow&#8217;s ability.</p></blockquote><p>To fight this, Goel and others advocate for &#8220;No-AI days.&#8221; In other words, pick one day per week and code entirely without AI. Stepping away from AI pushes you to genuinely engage with your code again, confront errors directly, rebuild your debugging intuition, and reclaim the satisfaction of truly understanding a problem.</p><p>I admire the discipline and routine of a weekly ritual, and if it resonates with you, it&#8217;s a great way to keep your skills sharp. For me, a more flexible approach feels natural: using AI freely for routine tasks, and intentionally stepping away when facing genuinely novel problems.</p><p>For those deeper problems, you&#8217;ll naturally hit a point where AI stops being helpful. The solution becomes too specialized, and AI-generated suggestions become a waste of time and tokens. In these cases, you&#8217;re forced to go manual anyway.</p><p>Still, this doesn&#8217;t fully solve our problem: how can we intentionally keep our coding skills sharp? I&#8217;m talking about deliberate practice.</p><p>Another popular method is tackling daily Leetcode problems, perhaps without your editor&#8217;s code completion or even on a whiteboard. 
While likely just as effective, I&#8217;ve personally found Leetcode a bit dull. <a href="https://adventofcode.com/?ref=jpreagan.com">Advent of Code</a> challenges might be a more enjoyable alternative if you&#8217;re looking for that kind of puzzle.</p><p>My favorite approach, though, is the &#8220;Build Your Own X&#8221; method. Check out the <a href="https://github.com/codecrafters-io/build-your-own-x?ref=jpreagan.com">Build Your Own X</a> repo or John Crickett&#8217;s <a href="https://codingchallenges.fyi/?ref=jpreagan.com">Coding Challenges</a>. Rewrite a common Unix utility in the language of your choice, build your own minimalist web server or data store, or create a tiny text editor from scratch. Pick something that excites you and keep it fun.</p>]]></content:encoded></item></channel></rss>