I was using the forms at http://www.usatf.org/usatf/fil...d5d-58ee97c196bb.pdf to create a calibration course.
The temperature correction formula is:
correction factor = ((temp - 68) x 0.00000645) + 1
Then you multiply that by your average raw measurement.
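For example, here's that calculation as a quick sketch (the 0.00000645 coefficient and the 68°F reference are from the form; the temperatures and the 1000' raw reading are just made-up numbers):

```python
# Steel-tape temperature correction from the USATF calibration course form.
# The 0.00000645/degree-F coefficient and the 68 F reference are from the
# form; the temperatures and raw reading below are made-up examples.

COEFF_PER_F = 0.00000645  # thermal expansion of a steel tape, per degree F
REF_TEMP_F = 68.0         # the tape is assumed to read true at 68 F

def correction_factor(temp_f):
    """correction factor = ((temp - 68) x 0.00000645) + 1"""
    return (temp_f - REF_TEMP_F) * COEFF_PER_F + 1.0

def corrected(raw_ft, temp_f):
    """Multiply the average raw measurement by the correction factor."""
    return raw_ft * correction_factor(temp_f)

print(corrected(1000.0, 90.0))  # 1000.1419 ft -- above 68, the formula adds
print(corrected(1000.0, 50.0))  # 999.8839 ft -- below 68, it subtracts
```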
So if the temperature is above 68 you're adding, and if it's below 68 you're subtracting.
I understand this is correct if you're measuring the distance between two fixed points A and B, but when you're just measuring out 1000' it seems like it's backwards.
If the tape is hotter than 68, you will actually measure out more than 1000', and the correction should make it smaller.
When it's colder, your measurement will actually be less than 1000', and the correction should make it longer.
I really don't see how, if a 1000' tape expands to 1000.5', you should add 0.5' to your measurement. Then it would be 1001', not 1000'.
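To put numbers on it, here's my example run through the form's arithmetic (the temperature is made up purely so the tape expands by exactly 0.5' over 1000'):

```python
# Hypothetical: a temperature at which a 1000' steel tape grows by 0.5'.
COEFF_PER_F = 0.00000645
REF_TEMP_F = 68.0

temp_f = REF_TEMP_F + (0.5 / 1000.0) / COEFF_PER_F  # about 145.5 F

raw_reading_ft = 1000.0  # the expanded tape still *reads* 1000'
factor = (temp_f - REF_TEMP_F) * COEFF_PER_F + 1.0  # 1.0005
print(raw_reading_ft * factor)  # 1000.5 -- the formula adds the 0.5'
```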
Am I totally wrong about this? Can anyone help explain this?
Thanks,
Brian