That is a Thomas-type standard resistor. It was used in primary metrology labs as a first-tier transfer standard. At 23 °C (73 °F), plus or minus 2 degrees, it would have been accurate to about 0.05% when new. Most field transfer standards maintain a 10:1 accuracy ratio; that is, the standard is ten times more accurate than the device under test. Sometimes only 4:1 is possible, depending on the type of standard. Time to stir the pot, George?
Is that a one-percent resistor? If so, then you can't prove the meter is accurate to one percent. Back in the day, when I worked with automated test equipment, we always used a minimum 3:1 accuracy ratio between the measurement we were making and the instruments and/or standards we used to make it. So in your case, I would want the resistor to be accurate to at least 0.3 percent.
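The ratio arithmetic in the posts above is simple enough to sketch. This is just an illustration of the idea, not anyone's calibration procedure; the function name is my own, and the 1% meter, 3:1, and 10:1 figures come from the thread.

```python
# Hypothetical sketch of the accuracy-ratio check discussed above:
# the standard must be N times more accurate than the device under test.

def required_standard_tolerance(dut_tolerance: float, ratio: float = 3.0) -> float:
    """Worst-case tolerance the reference standard must meet to verify
    a device under test (DUT) at the given accuracy ratio."""
    return dut_tolerance / ratio

# Verifying a 1% meter at the minimum 3:1 ratio mentioned above:
print(required_standard_tolerance(0.01))        # ~0.0033, i.e. about 0.33%

# At the 10:1 ratio typical of a first-tier transfer standard:
print(required_standard_tolerance(0.01, 10.0))  # 0.001, i.e. 0.1%
```

So a nominal "at least 0.3 percent" resistor is just the 1% spec divided by three, rounded in the standard's favor.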
I worked in metrology on and off for 45 years. I graduated as a 35H40 Army calibration technician back in 1974. I was always fascinated with the art and science of accurate measurements. The accuracy of today's modern multimeters amazes me, and the A-to-D successive-approximation ASICs being made are a wonder of technology. The only thing missing for ohms measurements is the old zeroing reference adjustment to accommodate the meter leads. We would touch the leads together, then turn the adjustment on the lowest scale (usually 10 ohms) to a perfect zero on the scale. Not possible with the new digital units.
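For anyone who never used one of those analog meters, the lead-zero adjustment amounted to storing the resistance of the shorted leads and subtracting it from every reading. A minimal sketch of that idea, with made-up names and values purely for illustration:

```python
# Sketch of the "touch the leads together and zero" trick described above.
# Not modeled on any particular meter; the 0.15-ohm lead value is invented.

class OhmsRange:
    def __init__(self) -> None:
        self.lead_zero = 0.0  # stored lead resistance, ohms

    def zero_leads(self, shorted_reading: float) -> None:
        # With the leads touched together, the whole reading is lead resistance.
        self.lead_zero = shorted_reading

    def measure(self, raw_reading: float) -> float:
        # Subtract the stored lead resistance from the raw reading.
        return raw_reading - self.lead_zero

rng = OhmsRange()
rng.zero_leads(0.15)       # shorted leads read 0.15 ohms
print(rng.measure(10.15))  # a nominal 10-ohm part now reads 10.0
```

The analog knob did this mechanically on the lowest scale; the sketch just shows the subtraction it performed.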