Under Glass, Under Constraint

When we talk about information “just under glass” here at MIJUG.NET, we think of that thin digital barrier between our thoughts and the screen—a space both protective and limiting. This week, we lost someone who understood small spaces better than most: David Ketchum, the actor who brought Agent 13 to life in the classic TV series Get Smart.

Ketchum’s passing on August 10, 2025, at age 97, reminds us that some of our fondest technological metaphors come from the most unexpected places. His character, Agent 13, was CONTROL’s master of tight spaces—emerging from mailboxes, lockers, washing machines, and other impossibly cramped quarters to deliver crucial intelligence to Maxwell Smart.

The Irony of Constraint

There’s a beautiful irony in how David Ketchum’s career paralleled our modern computing dilemmas. Just as Agent 13 was perpetually squeezed into spaces barely large enough to contain him, we find ourselves constantly bumping against the limitations of our hardware—especially when it comes to the demanding world of artificial intelligence.

Consider this: Ketchum spent his career making comedy gold out of spatial impossibility, while Don Adams (Maxwell Smart) got to stride confidently through the world above. Today, we face a similar dynamic between our ambitious AI aspirations and the humble reality of our graphics cards.

When 2016 Became Ancient History

Here’s where the metaphor gets personal and practical. Modern large language models (LLMs) like GPT-4, Claude, and their contemporaries require graphics processing units (GPUs) with capabilities that didn’t exist in consumer hardware until around 2016-2017. If your GPU predates the NVIDIA GTX 10-series (Pascal architecture) or AMD RX 400-series (Polaris), you’re essentially trying to run a modern AI in Agent 13’s mailbox.

The technical requirements are staggering:

  • Minimum VRAM: 8GB for basic local LLM operation
  • Recommended VRAM: 16-24GB for comfortable performance
  • CUDA Compute Capability: 6.1 or higher for NVIDIA cards
  • Memory Bandwidth: roughly 448 GB/s or more for reasonable inference speeds

That means perfectly functional graphics cards from 2015—hardware that can still run most modern games beautifully—suddenly find themselves crammed into impossibly small computational spaces when asked to process transformer models with billions of parameters.
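As a back-of-the-envelope check, the bullet figures above can be folded into a small helper. This is a sketch of our own—the function name and the example card specs are ours, and the thresholds simply mirror the numbers listed above, not any library’s official minimums:

```python
def meets_local_llm_minimums(vram_gb, compute_capability, bandwidth_gbs):
    """Check a GPU's specs against the rough minimums quoted above.

    Thresholds mirror the article's figures: 8 GB VRAM, CUDA compute
    capability 6.1 (Pascal), and ~448 GB/s memory bandwidth.
    compute_capability is a (major, minor) tuple so versions compare
    correctly, e.g. (6, 1) for a GTX 1080.
    """
    return (
        vram_gb >= 8
        and compute_capability >= (6, 1)
        and bandwidth_gbs >= 448
    )

# A GTX 980 (2014): 4 GB VRAM, compute 5.2, 224 GB/s -- falls short.
print(meets_local_llm_minimums(4, (5, 2), 224))   # False
# An RTX 3090 (2020): 24 GB, compute 8.6, 936 GB/s -- clears easily.
print(meets_local_llm_minimums(24, (8, 6), 936))  # True
```

Feeding in the specs of a 2015-era card makes the squeeze obvious: it isn’t one spec that falls short, it’s all of them at once.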

The Supporting Actor’s Dilemma

David Ketchum understood something profound about being a supporting actor: sometimes your job is to make the impossible look effortless, even when you’re working in conditions that would challenge anyone else’s sanity. His Agent 13 never complained about the ridiculous spaces he occupied—he just delivered his lines and made the show better.

Our older GPUs are the Agent 13s of the computing world. They’re not the stars of the show; they can’t command the spotlight like an RTX 4090. But they’ve been faithfully serving in their small spaces, running our displays, handling our video editing, and managing our gaming for years. It’s only when we ask them to cram themselves into the equivalent of a washing machine—running a 70-billion parameter language model—that we remember their limitations.

The Space We Share

In our “under glass” philosophy, we recognize that the screen creates both connection and constraint. David Ketchum spent his career proving that limitation can breed creativity, that working within tight boundaries often produces the most memorable moments.

When we encounter the hardware constraints that prevent us from running the latest AI models locally, we’re experiencing our own Agent 13 moment. Like Ketchum emerging from increasingly impossible spaces with perfect timing and delivery, we learn to work within our constraints—perhaps using cloud services, optimizing smaller models, or finding creative workarounds that our high-end hardware friends might never discover.
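On the “optimizing smaller models” front, the arithmetic is refreshingly simple: a model’s weight footprint is roughly parameters × bits-per-weight ÷ 8, plus some runtime overhead for caches and activations. Here is a hypothetical helper sketching that estimate—the 20% overhead factor is our own loose assumption, and real usage varies with context length and runtime:

```python
def model_vram_gb(params_billion, bits_per_weight, overhead=1.2):
    """Rough VRAM (in GB) needed to hold a model's weights.

    overhead (~20%) is a loose allowance for the KV cache and
    activations; treat the result as a ballpark, not a guarantee.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B model at 4-bit quantization: ~4.2 GB -- fits on an 8 GB card.
print(round(model_vram_gb(7, 4), 1))   # 4.2
# The same model at 16-bit: ~16.8 GB -- out of most consumer GPUs' reach.
print(round(model_vram_gb(7, 16), 1))  # 16.8
```

This is exactly the Agent 13 move: the same intelligence, folded into a quarter of the space, still arriving on time.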

A Fitting Tribute

David Ketchum’s legacy isn’t just in the laughs he generated from small spaces—it’s in the reminder that constraint can be a catalyst for ingenuity. In a world where technology seems to demand ever-expanding resources, there’s something beautifully subversive about finding ways to do more with less.

The next time your 2015-era GPU politely declines to run a local AI model, remember Agent 13. Sometimes the most important work happens in the spaces nobody else wants to occupy, delivered by the supporting actors who make the whole show possible.

The Memory We Keep

As we remember David Ketchum this week, we’re reminded that technology—like comedy—is often about timing, constraint, and making the impossible look effortless. Whether you’re squeezing into a mailbox for a television show or coaxing AI performance from older hardware, the secret is the same: embrace the limitation, find the humor in the challenge, and deliver your best work regardless of the space you’re given.

In our digital age of ever-expanding requirements and planned obsolescence, maybe what we need is a little more Agent 13 spirit—the willingness to work brilliantly within whatever small space we’re given, always ready with the information that keeps the mission moving forward.

Rest in peace, David Ketchum. Thank you for showing us that sometimes the smallest spaces hold the biggest possibilities.


This tribute was written using the same spirit of creative constraint that David Ketchum brought to his performances—finding meaning and humor in the spaces we’re given, no matter how small they might seem.