Borrowing Tolerances

Charles Gallant, April 10, 2025

When you watch a working engineer use a 3D printer, you see something quite different from the typical at-home user. Instead of a standalone printed object, you’ll often see the print as just one part of a larger composition, with the plastic filament (PLA) used only where it adds unique value.

Maybe it’s adapting or coupling two objects, maybe it’s shielding a delicate sensor, maybe it’s channeling air in just the right way. But when used well, the printed part never does more than it should.

This is because PLA isn’t always the best material for the entire job. Combine it with steel, silicone, and machined bolts and fasteners, and the true versatility of 3D printing emerges.

This concept is called “borrowing tolerances,” and it’s articulated nicely by David Malawey on his engineering YouTube channel. You borrow steel’s strength, you borrow silicone’s flexibility, and by thoughtfully integrating them with a custom print, you can achieve something far superior to what any single material could manage on its own.

It struck me that this is a great visual analogy for explaining AI software development.

AI works best when it’s just one piece of a larger technical puzzle, borrowing the tolerances of its adjacent parts. Thoughtful database structures, careful API integrations, smart application logic, and intuitive user interfaces all enhance the AI-powered service within, supplying it with context from the larger system.
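
To make that concrete, here’s a minimal sketch of the shape I have in mind. Everything in it is hypothetical: call_llm stands in for whichever model API you actually use, and the orders table is invented for the example. The database supplies the facts, ordinary application logic handles the easy cases, plain validation checks the output, and the model call is deliberately the smallest piece in the middle.

```python
import sqlite3


def call_llm(prompt: str) -> str:
    # Stand-in for whichever model API you actually use (OpenAI, Anthropic, Gemini...).
    return "(model-generated summary would go here)"


def summarize_order(conn: sqlite3.Connection, order_id: int) -> str:
    # Borrow the database's strength: pull the facts the model shouldn't guess at.
    row = conn.execute(
        "SELECT status, item_count, last_update FROM orders WHERE id = ?",
        (order_id,),
    ).fetchone()

    if row is None:
        # Plain application logic handles the easy case; no model needed.
        return f"Order {order_id} was not found."

    status, item_count, last_update = row

    # The model only does the part nothing else can: turning structured facts
    # into a short, readable explanation for a human.
    prompt = (
        "Summarize this order for a support agent in two sentences.\n"
        f"Status: {status}\nItems: {item_count}\nLast update: {last_update}"
    )
    summary = call_llm(prompt)

    # Borrow the rigidity of ordinary validation before trusting the output.
    if not summary or len(summary) > 500:
        summary = f"Order {order_id}: {status}, {item_count} items, last updated {last_update}."

    return summary
```

The design choice mirrors the 3D-printing one: let the surrounding, more rigid parts do what they’re good at, and keep the model’s job as small as possible.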

As with PLA, though, it remains difficult for the average person to understand AI’s strengths and appropriate applications. After all, creativity with a new tool is a specialized skill in itself. It’s probably why so many optimistically purchased 3D printers are currently gathering dust. Without an understanding of AI’s adjacent context, we default to simple answers like “make an image” or “transform some text.”

I believe all the Studio Ghibli images are motivated by the same thinking that drives the Baby Yoda statues on Thingiverse: They’re an immediate + emotionally driven + safe answer to the question “What should I make with this?”

Where this analogy breaks down is also where I find optimism: AI is oddly self-fulfilling. I write code with AI and create AI-powered software. I can ask AI how to proceed and where I might fail. Given enough context, it can even suggest ways to interface AI with the adjacent systems I’ve composed and essentially “borrow tolerances.” My 3D printer seems lazy by comparison; I’d like to see its plans for teaching me how to use Blender.

There’s no magic “developer-only” way to trick ChatGPT, Claude, or Gemini into this sort of behavior, either. You just need to learn to ask.


