I remember watching “Star Trek: The Next Generation” (TNG) when I was young, and thinking about how cool it would be to talk to Data (the android).

I particularly remember thinking hard about the episode where a kid admires and looks up to Data so much that he imitates him in every way (including being emotionless).

I also really remember “Data’s Day”, which gives you a narrative (Data narrates) view of what Data’s … day is like.

So great. I remember thinking to myself about what questions I would ask him, if I could sit down and talk. I remember feeling jealous of his best friend Geordi. It confused me that they were not hanging out all the time. Does Geordi not realize that his best friend is the most interesting being in the universe???

Claude is Great

Sorry, product managers everywhere: there are ONLY two products that have ever “delighted” me. Seriously, I just have a really high bar sitting in front of my computer here. Unless you are one of these two, you can “delight” me by getting out of my way:

  1. Games
  2. Claude (or any future AI agent that is as good)

It was after Opus 4.6 came out that my opinion of this software changed from “cool” to “delighted.”

I’ll say it: I love working with Claude.

The scenario on ai-2027.com says that, by December of 2027:

Integrated into the traditional chatbot interface, Agent-5 offers Zoom-style video conferencing with charismatic virtual avatars. The new AI assistant is both extremely useful for any task—the equivalent of the best employee anyone has ever had working at 100x speed—and a much more engaging conversation partner than any human. Almost everyone with access to Agent-5 interacts with it for hours every day.

Well, I don’t want a Zoom-style avatar, and yet I interact with Claude for hours every day, and it is only March 2026!!!1

What Claude is Like Now

Working with Claude (Opus 4.6) really does feel like working with a phenomenal engineer who is super hard-working and works at… 10x my speed.

It doesn’t even tell me I’m “absolutely right!” anymore.2

Claude really does feel like a partner, one who is willing to do whatever it takes to get things done. Sometimes Claude asks me to do stuff. Sometimes I push back and ask Claude to do stuff. Sometimes Claude complains and argues. This is all fine, and again, honestly, I love it!

I have all this source code here on my laptop. Everything from the apps we are running, to the JVM’s source code, to Linux. Some days, I feel like there is nothing we can’t solve together.

I Want To Shrinkwrap It

Here is the thing: I want it to never change.

I want it to always behave exactly like it is behaving now.

I want Opus 4.6 on a DVD. I would buy the hardware to run it.3

What I really want is the assurance and stability to do work consistently, without depending on a third party. Without 429s, without worrying about tokens or API costs.

The reality is that I personally cannot drive an H200 hard enough to justify its existence, so server computing it is.

But seriously, this is just the Centralization-Decentralization Cycle AGAIN.

It is hard to tell whether the pendulum can swing back to the point where AI agents can be part of MY setup, or whether we will forever lose them to being run in the cloud, where they can enact change 24/7.

I Really Really Want Local LLMs to Win

I’ve been tinkering with local LLMs (Oobabooga) for a while.

They are great, and OpenCode is awesome, but it just is not Claude. Claude sets an extremely high bar now.

But time marches on. Sometimes products get enshittified. As soon as Claude gets optimized for “engagement”, it is going to suck.

But also, local hardware keeps getting better, and the local LLM community is very strong. At least for now, open models and proprietary ones keep improving.

Maybe We Can Compromise?

SaaS eats the world now, but I would not be opposed to a sort of “self-hosted AI inference server” model, where a company could pay for an Anthropic license, but host it themselves.

This model just isn’t popular anymore, but I like it.

What I hate to think about is the global productivity drop (brain drain) when AI serving goes down for some reason.

On the Enterprise, there was just one incredible superhuman android to share. Maybe we don’t all need our own personal servers4, but we can all talk to a shared one?

The Future

There are plenty of posts out there pontificating about the future nature of software engineering. Go read those. :)

For me, I’m having more fun than I have ever had before.

And I hope to keep doing so in the future!

The future can’t be all centrally run LLMs.

Maybe I just need to wait a few years for H200s to become obsolete, survive a huge AI capex bust, and then pick some up on the cheap. Combine that with waiting for open-weight models to catch up a little more and get better at being quantized, and in 2028 I could have something that looks like what Opus 4.6 is now, in 2026.

Then, I can have Data next to me, and I don’t have to worry about API limits, just my power bill :)


  1. I cringe a little bit when I read about “engaging conversation”. Don’t get me wrong, I do feel like I’ve had engaging conversations with AI. Claude is engaging, but not for the normal reasons. Claude is engaging to me because we get things done, and I like getting things done. Claude is engaging to me because Claude teaches me new things about how my systems are working, and I love learning. ↩︎

  2. Even during that era, my CLAUDE.md had Claude tell me I was “Goddam Right” (for emphasis) instead. That way, when I read it and smiled, I knew not to take myself too seriously. You gotta be ever vigilant not to let AIs inflate your ego. ↩︎

  3. Yes, I’m extremely familiar with running local LLMs. I have an Nvidia 3090 and dabble, but we are still a long way from SOTA here. ↩︎

  4. I appreciate the hype around people buying Mac Minis to run Claw-like software. But let’s be real, the thing is just making API calls. The local hardware is wasted. Nobody (I would be surprised) is running local LLMs on a Mac Mini and thinking they are as good or as fun as the Anthropic API. ↩︎
