Wednesday, May 13, 2026

5 stories · 3 min read

The UI for AI is getting a major upgrade this week. While everyone focuses on making models smarter, the biggest breakthroughs are happening in how we actually interact with them.

01

Anthropic ships Claude agents for your terminal

Cat Wu from Anthropic showed off a new `claude agents` command that creates a control plane in your terminal. Run `claude agents`, then hit `<-` from any CLI session to register that session with the control plane. Wu recommends running it from your root code directory so you can manage all your Claude agents in one place.

Why it matters: This is the first major AI company to ship agents directly into developer workflows rather than through a separate app. Your terminal is about to become an AI collaboration space, not just a command interface.

Source →

02

OpenAI reveals what 1,000 researchers learned building tiny AI models

OpenAI published results from Parameter Golf, a competition that brought together over 1,000 participants to explore AI-assisted research under strict constraints. The challenge spanned machine learning research, coding agents, quantization techniques, and novel model designs, drawing more than 2,000 submissions that tested how AI can help humans do better research.

Why it matters: This is OpenAI's data on whether AI can actually accelerate scientific discovery, not just answer questions. The timing suggests they're preparing to launch research-focused AI tools that go way beyond ChatGPT.

Source →

03

Andrej Karpathy's HTML trick changes how you use AI

Former OpenAI researcher Andrej Karpathy shared a simple but powerful technique: ask your LLM to "structure your response as HTML," then view the generated file in your browser. He's also had success asking for slideshows and other visual formats. His insight: "Audio is the human-preferred input to AIs but vision is the preferred output from them."
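The trick takes about ten lines to wire up yourself. Here's a minimal sketch in Python: `ask_llm` is a placeholder for whatever function you use to call your model (not a real API), and the helper just appends Karpathy's instruction, writes the reply to a temp file, and opens it in your default browser.

```python
import os
import tempfile
import webbrowser

# Karpathy's suggested instruction, appended to every prompt
HTML_HINT = "Structure your response as HTML."

def ask_as_html(ask_llm, prompt, open_browser=True):
    """Send the prompt with the HTML instruction, save the reply as a
    .html file, and optionally open it in the default browser.
    `ask_llm` is any callable that takes a prompt string and returns
    the model's text reply (swap in your provider's client here)."""
    reply = ask_llm(f"{prompt}\n\n{HTML_HINT}")
    fd, path = tempfile.mkstemp(suffix=".html")
    with os.fdopen(fd, "w", encoding="utf-8") as f:
        f.write(reply)
    if open_browser:
        webbrowser.open("file://" + path)
    return path
```

The same wrapper works for slideshows or any other visual format: just change the hint string to, say, "Structure your response as a single-file reveal.js slideshow."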

Why it matters: You've been using AI like a text messenger when you should be using it like a web designer. Roughly a third of the human cortex is involved in processing visual information, which means you can absorb an AI-generated HTML page much faster than a wall of text.

Source →

04

Thinking Machines "brutally framemogged" OpenAI and Google

AI community builder Swyx declared that Thinking Machines just redefined what "realtime" means in AI interactions, apparently outpacing both Google and OpenAI on responsiveness. The comment references yesterday's coverage of Thinking Machines' voice AI breakthrough.

Why it matters: If a smaller company is beating the big labs on the metric that matters most for voice AI, expect acquisition rumors within weeks. Real-time interaction is the difference between a demo and a product people actually use.

Source →

05

Product manager Peter Yang asks the real questions

Roblox product leader Peter Yang wondered why Southern California has world-class family attractions like Disney and Legoland while the Bay Area can't keep its only amusement park alive.

Source →