First, how ACCURATE do you want this? Accuracy generally translates into $$$$.
The first thing you will need is a certified standard that is traceable to NIST (National Institute of Standards and Technology). It will state what its readout is (ohms, inches, weight, whatever) and what its tolerance band is: ±.01 ohm, ±.0001", whatever the unit of measurement is.
Then you check and adjust your tools against the standard. Then you maintain the standard's certification, or not if you think it is stable and good for several years, in which case it becomes just a reference standard.
I wouldn't know how to do anything electrical, but say the Machine Shop wants to make a simple in-house standard for calipers. We could get a piece of tool steel, say 2" wide, 1" thick, and 18" long; it's hard, stiff, and wear resistant.
Then we do our absolute best to align it in the HAAS, or send it out, and have notches machined through at specified intervals, with surfaces as smooth and parallel as possible. Then we send it to a calibration lab and have them measure the distance from the end to each edge and provide us the measurements to within .0001".
To check our calipers, it won't matter what the actual distances on our machined bar turn out to be. We may have been aiming for 1.000", 3.000", 6.000", 10.000", 12.000", and 16.000"; the results will come out slightly over or under, but they will be precisely measured. (The reason to go to a calibration lab is that they can measure to at least .000025", i.e. 25 millionths of an inch.)
When we check/calibrate against this, let's say our 4.000" notch is actually 4.0018". If my measuring tool is good to ±.001", then as long as we get a reading within ±.001" of 4.0018" (after adjustments if needed), we know the tool is good. If the tool reads to .0001" but we can only get repeatability to within .0002", then we would note that on the tool even though it has a readout in .0001".
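To make that pass/fail arithmetic concrete, here's a quick sketch in Python. This isn't anything we actually run in the shop; the certified notch value, the tolerance, and the five readings are invented for illustration. The idea is just: the lab gives you the true value of the notch, the tool's rated accuracy gives you the allowed band around it, and the spread of repeated readings tells you the resolution you can honestly claim.

```python
# Hypothetical example values - not real calibration data.
CERTIFIED_NOTCH = 4.0018   # inches, as reported by the outside calibration lab
TOOL_TOLERANCE = 0.001     # inches, the caliper's rated accuracy (+/-)

def reading_passes(reading, certified=CERTIFIED_NOTCH, tol=TOOL_TOLERANCE):
    """A reading is acceptable if it falls within +/- tol of the certified value."""
    return abs(reading - certified) <= tol

def repeatability(readings):
    """Spread (max - min) of repeated measurements of the same notch."""
    return max(readings) - min(readings)

# Five made-up repeated readings of the nominally 4.000" notch (true value 4.0018")
readings = [4.0019, 4.0017, 4.0020, 4.0016, 4.0018]

print(all(reading_passes(r) for r in readings))   # True -> tool is good to +/- .001"
print(round(repeatability(readings), 4))          # 0.0004 -> note this limit on the tool
```

If the spread came out at, say, .0004", you'd mark the tool as repeatable to .0004" even though the display shows .0001" digits.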
Anything we use at the Space would, in industry, be marked REFERENCE ONLY, meaning it wouldn't be allowed for product acceptance purposes.
The most accurate measuring devices I've used: a leakage meter that measured helium loss in a vacuum chamber - it could detect several cc's per year. Flatness: optical flats that were good to about 30 nanometers (roughly a millionth of an inch). A super micrometer: .00001". Ohms: all I know is that as you approached the meter you affected the "electrical field", the humidity had to be in a certain range, you waited till it stabilized, zeroed it, then did the measurement (this was for detonators for explosive bolts used for rocket separations).
Or you buy standards and pay a hefty premium.