A tolerance is the allowable plus-or-minus deviation listed on a tech pack for a point of measure (POM); it determines whether a product meets the specified quality standard. For example, the POM for a bust line may carry a tolerance of plus or minus 1/2″. This means that for each size (34, 36, 38, etc.), the garment bust could measure 33.5″–34.5″ for the size 34, 35.5″–36.5″ for the size 36, and so on. My question today is: how are these tolerances determined? Frankly, many of them seem to be pulled out of thin air or copied from whatever similar tech packs one can find.
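To make the pass/fail window concrete, here is a minimal sketch of the arithmetic. The sizes and specs are illustrative, not from any real tech pack:

```python
# Illustrative specs: size -> bust measure in inches (assumed, not from a real tech pack)
spec_bust = {34: 34.0, 36: 36.0, 38: 38.0}
tolerance = 0.5  # plus or minus 1/2"

for size, spec in spec_bust.items():
    low, high = spec - tolerance, spec + tolerance
    print(f'Size {size}: passes if bust measures {low}" to {high}"')
```

So a size 34 passes anywhere from 33.5″ to 34.5″, exactly as described above.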
Tolerances are a new wrinkle, and I'm not finding established or good practices to follow. In the olden days, few worried about tolerances because nearly everyone made their own goods. Since things have changed, people have been winging it, and don't let anyone tell you differently. That doesn't mean some people don't have good or even correct answers, only that there is no established practice, much less agreement on how to arrive at one.
One approach I've seen is using half the grade rule as the tolerance, which as a blanket rule doesn't work. For example, if the grade rule were 2″ and the tolerance were half that (1″), the product could measure an inch over or an inch under and still make spec; that is a 2″ spread, the entire distance to the next size. I also don't agree that the larger the piece, the larger the permitted tolerance should be.
In trying to drill this down to something that can be determined concretely, the illustration below is something I've come up with. I'm not saying this is the answer, or that it is the best one, or that somebody else doesn't have a better solution [but I'd certainly love to hear about it!].
My idea is pretty simple in that it is tied to production processes. In the top pane is a tolerance for cutting the side seams, specifically 1/16″ at each cut edge; since the flat measure crosses four cut edges (front and back panels at both side seams), the worst case stacks up to a total cutting tolerance of 1/4″. Cutting is where one can make errors, but ideally a cutting error would amount to one edge, once, so I think 1/4″ is very generous.
Still, we don't measure cut pieces for tolerance, only sewn ones, so the lower pane adds a tolerance for sewing. Again, each is allotted 1/16″, for another 1/4″ in total. An error or variance at this stage is actually more likely, or should be considered more likely, than at cutting. Adding the two together, we come up with a 1/2″ total tolerance for this point of measure laid flat. [Yes, of course, since only the half measure is taken, this amounts to a full inch of girth; you'll need to be specific in your specifications as to whether these are full or half (flat) measures.]
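The budget above can be sketched as a sum of per-process allowances. The 1/16″ figure and the four contributing edges/plies are the assumptions from this worksheet, not an industry standard:

```python
# Per-process tolerance budget (assumed figures from the worksheet above)
CUT_PER_EDGE = 1 / 16   # cutting variance allowed at each cut edge
SEW_PER_PLY = 1 / 16    # sewing variance allowed at each ply
CONTRIBUTORS = 4        # front + back panel at both side seams (flat measure)

cutting_total = CUT_PER_EDGE * CONTRIBUTORS   # 1/4" worst-case cutting stack-up
sewing_total = SEW_PER_PLY * CONTRIBUTORS     # 1/4" worst-case sewing stack-up
flat_tolerance = cutting_total + sewing_total  # 1/2" on the flat (half) measure
girth_tolerance = flat_tolerance * 2           # 1" all around

print(f'flat: {flat_tolerance}", girth: {girth_tolerance}"')
```

The point of writing it this way is that each allowance is tied to a process step, so the total can be defended rather than copied from another tech pack.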
In total, 1/2″ (or 1″ all around) is too much, in my opinion. The reason is that if you were running numbered sizes with only 1″ of difference between sizes, a size 2 could be as large as a 4, the 4 could be as small as a 2, and both would still meet the specification. So, returning to my exercise of attempting to arrive at a solution, I would halve the tolerance from the worksheet to make the final tolerance only 1/4″ (or 1/2″ for total girth).
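The size-overlap argument can be checked directly. This sketch assumes a 1″ girth grade between numbered sizes and asks whether the top of one size's acceptance window reaches past the bottom of the next size's window:

```python
# Sketch of the size-overlap argument; grade and tolerances are the
# assumed figures from this post, not standards.
def windows_cross(grade: float, plus_minus: float) -> bool:
    """True if a garment at the top of size N's window would exceed
    the bottom of size N+1's window (the sizes become confusable)."""
    # top of size N:    spec + plus_minus
    # bottom of size N+1: spec + grade - plus_minus
    return plus_minus > grade - plus_minus

# With the worksheet's full 1" girth tolerance, a size 2 can measure
# like a size 4 and vice versa:
print(windows_cross(grade=1.0, plus_minus=1.0))   # True
# Halving it to 1/2" girth keeps adjacent sizes from crossing:
print(windows_cross(grade=1.0, plus_minus=0.5))   # False
```

This is exactly why the worksheet total gets halved: the halved tolerance is the largest value at which adjacent 1″-graded sizes stay distinguishable.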
Ideas? Opinions? Do tell.