So, you’re looking at a drawing or maybe holding a micrometer and wondering about the math. It's a classic shop-floor question. Honestly, the answer is dead simple on paper, but in practice? It depends on which "thousandth" you mean and who you’re talking to.
There are roughly 39.37 thousandths of an inch in one millimeter.
Wait. Let’s back up.
If you are talking about "thousandths" in the metric sense—as in, how many microns are in a millimeter—the answer is exactly 1,000. But usually, when someone asks "how many thousandths in a mm," they are a machinist or an engineer trying to convert between the metric system and the Imperial system used in the United States. They want to know how many "thou" (0.001 inches) fit into that tiny metric gap.
One millimeter equals approximately $0.03937$ inches.
If you round that for a quick mental check, most guys in the shop will just tell you "forty thou." But if you’re CNC machining a press-fit bearing or building a high-performance engine, that rounding error will ruin your day. You can't just guess when you're dealing with tolerances that determine whether a part slides smoothly or seizes up and smokes.
The math behind how many thousandths in a mm
Let's get technical for a second. The international yard is defined as exactly $0.9144$ meters. This means an inch is exactly $25.4$ millimeters. No more, no less. It's a hard-and-fast rule, locked in by international agreement in 1959.
To find out how many thousandths of an inch are in a millimeter, you just do the division. Take 1 and divide it by 25.4. You get $0.0393700787...$ and it keeps going.
For 99% of practical applications, $39.37$ is the magic number.
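If you want to sanity-check that arithmetic, here's a minimal Python sketch (the function name is mine, just for illustration):

```python
# By international definition (1959): 1 inch = 25.4 mm exactly.
MM_PER_INCH = 25.4

def mm_to_thou(mm: float) -> float:
    """Convert millimeters to thousandths of an inch (thou)."""
    return mm / MM_PER_INCH * 1000

print(mm_to_thou(1.0))       # 39.37007874015748 thou per millimeter
print(mm_to_thou(25.4))      # 1000.0 thou, i.e. exactly one inch

# The shop-floor shortcut "1 mm = 40 thou" hides this much error:
print(40 - mm_to_thou(1.0))  # ~0.63 thou per millimeter
```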
Think about a human hair. It's usually about 2 to 3 thousandths of an inch thick. So, a single millimeter is roughly the thickness of 15 to 20 human hairs stacked together. It's small, but in the world of modern manufacturing, a millimeter is actually a massive canyon. We live in the world of "tenths" (which, confusingly, refers to ten-thousandths of an inch, or $0.0001$).
Why does this conversion trip people up?
It's the terminology.
In the US, we use "thou" for $0.001$ inches. In the UK or Australia, they might call it a "mil," which is incredibly confusing because "mil" sounds like "millimeter." But a mil is an inch-based measurement. If you tell a European engineer you need a gap of "five mils," they might think you mean five millimeters ($5.0$ mm), which is nearly $0.200$ inches. That’s a massive mistake.
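Here's the size of that mistake in plain numbers (a quick sketch, nothing library-specific):

```python
MM_PER_INCH = 25.4

# "Five mils" read correctly, as five thousandths of an inch:
five_mil_in = 5 * 0.001        # 0.005 in
# "Five mils" misheard as five millimeters, expressed in inches:
five_mm_in = 5 / MM_PER_INCH   # ~0.19685 in

print(five_mm_in / five_mil_in)  # ~39.4x too big
```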
Mistakes like this have literally crashed spacecraft.
Remember the Mars Climate Orbiter in 1999? NASA lost a $125 million piece of hardware because one team's software worked in metric units (newtons) while another's used English units (pounds-force), and nothing ever converted between them. While that wasn't specifically about thousandths vs. millimeters, it’s the same fundamental breakdown in communication. If you don't know exactly how many thousandths in a mm you're dealing with, things break.
Real-world tolerances in the shop
Imagine you are turning a shaft on a lathe. The blueprint says the part needs to be $25$ mm. You measure it with an Imperial micrometer. You see $0.984$ inches. Is that right?
$25 \times 0.03937 = 0.98425$.
So, $0.984$ is close, but you're off by a quarter of a thou. In aerospace or medical device manufacturing, a quarter-thou is the difference between a "pass" and a "scrap" part. You have to be precise.
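Here's that check as a few lines of Python; note the $\pm 0.2$ thou pass band is a made-up example for illustration, not a real spec:

```python
MM_PER_INCH = 25.4

nominal_mm = 25.0
measured_in = 0.984

nominal_in = nominal_mm / MM_PER_INCH      # 0.984251968... in
error_thou = (measured_in - nominal_in) * 1000

print(f"nominal: {nominal_in:.5f} in")     # nominal: 0.98425 in
print(f"error:   {error_thou:+.2f} thou")  # error:   -0.25 thou

# Hypothetical tolerance band of +/- 0.2 thou:
print("PASS" if abs(error_thou) <= 0.2 else "SCRAP")  # SCRAP
```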
Common reference points for perspective
Sometimes it helps to stop looking at the calculator and look at the objects in your pocket.
- A standard credit card is about $0.030$ inches thick (30 thou). That’s roughly $0.76$ mm.
- A single sheet of high-quality printer paper is usually $0.004$ inches (4 thou), or about $0.1$ mm.
- A spark plug gap for a modern car might be $0.044$ inches. That’s just a hair over $1.1$ mm.
When you start visualizing these sizes, the question of how many thousandths in a mm becomes less about math and more about feel. You start to realize that a millimeter is a fairly chunky unit of measure when you're talking about precision fitment.
Dealing with the "Tenths" confusion
Here is where it gets really messy for beginners.
When an American machinist says "I need to take off two tenths," they almost never mean $0.2$ inches, and they don't mean $0.2$ mm either. They mean two ten-thousandths of an inch ($0.0002$), that is, two tenths of a thou.
Now, try to convert that to metric.
$0.0002$ inches is approximately $0.005$ millimeters (or 5 microns).
When you are working at this level of precision, temperature starts to matter. If you hold a steel gauge block in your hand for a minute, the heat from your palm will cause the metal to expand. It can easily grow by a few "tenths" or several microns just from your body heat. This is why high-end metrology labs are kept at exactly 20°C (68°F).
If you’re converting millimeters to thousandths in a garage that’s 90 degrees Fahrenheit, your math might be perfect, but your measurement will be wrong. The material itself has changed size.
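To put numbers on that, here's a rough sketch of linear thermal expansion, $\Delta L = \alpha \cdot L \cdot \Delta T$. The coefficient below is a typical value for carbon steel; the exact figure varies by alloy, so treat it as an assumption:

```python
# Linear thermal expansion: dL = alpha * L * dT
ALPHA_STEEL = 11.5e-6   # per degree C; typical for carbon steel, alloy-dependent

length_mm = 25.0        # a 25 mm gauge block
dt_c = (90 - 68) / 1.8  # a 90 F garage vs. the 68 F (20 C) lab standard

growth_mm = ALPHA_STEEL * length_mm * dt_c
growth_thou = growth_mm / 25.4 * 1000

print(f"{growth_mm * 1000:.1f} microns")  # 3.5 microns
print(f"{growth_thou:.4f} thou")          # 0.1383 thou, i.e. ~1.4 "tenths"
```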
Tools of the trade: Micrometers vs. Calipers
If you're asking this question, you're probably using one of two tools.
Digital calipers are great. Most of them have a "mm/in" button that does the heavy lifting for you. You can slide it open, hit the button, and see $25.4$ mm turn into $1.000$ inch instantly. But calipers are only accurate to about a thousandth or two. They flex. They rely on how hard you squeeze them.
For real accuracy, you use a micrometer.
A standard 0-1 inch micrometer is graduated in thousandths. Many have a "vernier scale" on the sleeve that lets you read down to the ten-thousandth ($0.0001$). If you have a metric micrometer, it usually reads in hundredths of a millimeter ($0.01$ mm).
One "tick" on a metric micrometer ($0.01$ mm) is about $0.00039$ inches. Basically, four-tenths of a thou.
The psychological hurdle of switching systems
Most people who grew up with the Imperial system find metric "easier" but "harder to visualize." We know what an inch looks like. We know what a "thou" feels like when we run a fingernail over a scratch on a car door.
But metric is just base-10. It's objectively better for calculation.
The struggle with how many thousandths in a mm usually comes from trying to live in both worlds at once. My advice? Don't. If the drawing is in metric, buy a metric micrometer. Don't convert back and forth on a calculator every five minutes. That is exactly how errors creep in. You'll misplace a decimal point, or you'll round $39.37$ to $39.4$ and suddenly your parts don't fit together at the end of the week.
Practical Steps for Precision
If you are currently standing at a workbench trying to make sense of a measurement, here is what you should actually do:
- Use the $25.4$ constant. Don't try to memorize $0.03937$. Just remember that one inch is exactly $25.4$ mm. If you have millimeters and want inches, divide by 25.4. If you have inches and want millimeters, multiply by 25.4. It’s a cleaner way to work (there's a short sketch of this after the list).
- Trust your "Feel." If you're checking a gap, remember that $1$ mm is roughly $40$ thou. If your measurement comes out to $15$ thou or $80$ thou, you know you've made a massive decimal error in your head.
- Check the Tool. Ensure your digital tools are zeroed properly. If you're switching between "in" and "mm" modes, zero the tool in one mode and then check it in the other.
- Temperature Control. If you're working with tolerances tighter than $0.001$ inches (about $0.025$ mm), let your parts and your tools sit in the same room for an hour before measuring. Thermal expansion is a real thief of precision.
- Write it down. Never trust your memory for a four-digit decimal conversion. Use a Sharpie on the workbench or a notepad.
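Here is a minimal sketch of the first two steps, assuming nothing but the exact $25.4$ constant; the helper names are my own, not any standard library:

```python
MM_PER_INCH = 25.4  # exact, by definition

def mm_to_in(mm: float) -> float:
    """Millimeters to inches: divide by 25.4."""
    return mm / MM_PER_INCH

def in_to_mm(inches: float) -> float:
    """Inches to millimeters: multiply by 25.4."""
    return inches * MM_PER_INCH

def thou_looks_sane(mm: float, thou: float) -> bool:
    """Crude decimal-point check: 1 mm should be roughly 40 thou."""
    expected = mm_to_in(mm) * 1000
    return 0.5 <= thou / expected <= 2  # flags 10x decimal slips

print(mm_to_in(25.4))             # 1.0
print(in_to_mm(1.0))              # 25.4
print(thou_looks_sane(1.0, 39))   # True
print(thou_looks_sane(1.0, 15))   # False -> recheck your decimal point
```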
Understanding how many thousandths in a mm is basically the "Hello World" of machining. Once you internalize that $1$ mm is just shy of $40$ thou, the metric system stops feeling like a foreign language and starts feeling like just another tool in your box. Precision isn't about being perfect; it's about knowing exactly how much error you can live with—and exactly how to measure it.