
Here is a new one for my book. I got an application for a certification yesterday. I opened it up today, and was struck once again by the lack of detail from some of my measurers. I feel like a broken record when it comes to one or two guys who have been measuring for me for a while. Strangely, this one does not have the usual nightmare ending.

The times on the calibration course data sheet were off. That makes me nervous. But I did the usual quick scan and moved on to the Course Measurement Data Sheet. Once again, this particular measurer used counts instead of distance when calculating the difference between the measurements. I moved quickly on to the calculated final distance and noted that he made no effort to adjust the measurement, even though it was 20 meters long (according to his calculation). Then I looked over the application. He is still using an old one, but it seemed filled out. Then on to the map. Needing to print a copy just to make sure the margins were good, I noticed he was using GPS points for key points, with very little detail, and the turnaround detail map had NO detail whatsoever. I scrolled down further in the document to see that he once again attached the map and application for a calibration course (10 years old now), but no certificate. I've tried looking this one up, but it appears it may never have been submitted. Since the original application is all here, I think I can let it slide, but I will once again remind him to re-measure and submit a new application.

So, now, many of you are ready to throw this back at the guy; I was too. But I persisted. I went back to dig out some details to chat with him about. I realized that no mere email would be enough. I was either going to sound critical or too picky. I promise, that is not the case. I want to help these measurers do a better job. And as I said, it was a quick look through, just to get a starting point on this application, before I found what needed to be done.

Error 1: When calculating the working constant per km, he rounded up his counts per mile, then calculated the metric constant and rounded up again.
Error 2: He used the lower constant (Finish Constant) as his constant for the day
Error 3: He used the lower constant (Finish Constant) to calculate all of his distances on the Measurement Data Sheet
Error 4: He calculated the difference in counts for each ride and divided it by the total counts of the first ride (while this works in this instance, it does not work for two riders or multi-day rides)
Error 5: He stated the course length in total counts (not distance) for both the measured length of the course and the desired length of the course. He gave no distances for either.

Using all the errors above, the measurement came out to be correct! Yes, I did all of the calculations, using his count numbers, and the measurement was less than one inch long. Saved by some miracle of fate, no re-measurement needed to be done.
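For what it's worth, the correct arithmetic is only a few lines. Here is a minimal Python sketch; the calibration counts, calibration course length, and ride counts are made-up numbers for illustration, not this measurer's data. The procedure shown (counts per km with the 1.001 short course prevention factor applied, the larger of the pre- and post-calibration constants used as the constant for the day, and results expressed as distances rather than raw counts) is the standard one the errors above departed from.

```python
# Hypothetical numbers for illustration only.
CAL_COURSE_M = 300.0   # calibration course length, metres (assumed)
SCPF = 1.001           # short course prevention factor

def constant_per_km(avg_counts, cal_length_m=CAL_COURSE_M):
    """Counts per km, including the SCPF."""
    return (avg_counts / cal_length_m) * 1000.0 * SCPF

# Average of four calibration rides, pre- and post-measurement.
pre = constant_per_km(sum([3520.5, 3521.0, 3520.0, 3521.5]) / 4)
post = constant_per_km(sum([3522.0, 3522.5, 3521.5, 3523.0]) / 4)

# Constant for the day: the LARGER of the two (Error 2 used the smaller).
day_constant = max(pre, post)

# Express the ride as a distance, not counts (Errors 3 and 5).
ride_counts = 58720.0
ride_km = ride_counts / day_constant
```

Using the larger constant is the conservative choice: it converts the same counts into a shorter computed distance, so any error leaves the course slightly long rather than short.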

With that, I close my tale. The moral of this story: even a broken clock is right twice a day?

An interesting story for sure. From the story I see that as certifier you reviewed the data and did your own calculations. As certifiers it is our responsibility to ask the question: is there information to support a properly measured race distance? This is different from just making sure all the lines are filled in on the application. Common mistakes can be resolved by using an average of the pre- and post-measurement constants.
Using this measurement as a teaching opportunity, rather than sending it back with a message to "fix it," allows for the improvement of his or her skills without your becoming known as a difficult certifier. I am sure the measurer got a list of suggestions for the future.
I agree with your comment, Mike. Perhaps something we should all talk about is balancing our roles as arbiters and as teachers/coaches. And there are many times when measurers have made certain mistakes but still end up with a course that is acceptable. And using the average constant has often been the key to deciding that course length is acceptable.

Toni that's a great story. I am curious about one thing, how did you resolve (or did you) the question of details on key points?
I do find a small satisfaction in my ability to "see" what transpired (or possibly didn't actually transpire) during the measurement just from reviewing the data submitted.

I'm so thankful for the competent measurers who consistently send clean data with good maps. Conversely, there are some submissions that make my stomach sink before I even open the documents because I know that I'm going to spend a long time figuring out the information and then drafting an email to address the issues and coach them through resolution. The submission fee becomes inconsequential for these individuals.
Nathan, you are making points, to my way of thinking, in support of us establishing an online certification application system. I sincerely feel that most measurers have sufficient cyber skills to submit this way, including maps. Good-quality scanners are dirt cheap these days. If you have a home printer, it is most likely an "all-in-one" that includes a scanner.

Now, I suspect that we'll have to accept that few people will be able to submit hand-drawn maps electronically in the presently required <500 KB .png. However, it is not a big deal to include in the online system an "auto-processing" feature that takes scanned documents and transforms and saves them in our presently required format. Parenthetically, this may also be a good opportunity to open up the types of acceptable map formats, to ease the submission process. For instance, all scanners save to .pdf.

If you have ever used Vista Print or one of the many sites that provide an efficient way to upload your graphic in many different formats, you know that it is no big deal to submit almost any image and then see it in its "transformed to standard" version.

Our online system would be programmed to check calculations as entered by the measurer, and to highlight or outright reject measurement data falling outside of our requirements. For instance, the drop & separation and the 0.08% requirements would be checked automatically as the data is entered. Because we are able to automate much of the data checking this way, I think we are then able to simultaneously raise the quality of our measurement submissions and reduce the work for all concerned.
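As a rough illustration of what such automated checks might look like, here is a Python sketch. The function names and field layout are my own framing; the 0.08% ride-agreement figure comes from the paragraph above, while the 1 m/km drop and 50% separation limits are the usual record-eligibility thresholds.

```python
def rides_agree(ride1_counts, ride2_counts, tol=0.0008):
    """Two measurement rides must agree within 0.08%."""
    diff = abs(ride1_counts - ride2_counts)
    return diff / min(ride1_counts, ride2_counts) <= tol

def drop_ok(start_elev_m, finish_elev_m, course_km, limit_m_per_km=1.0):
    """Net elevation drop per km must not exceed 1 m/km."""
    drop = (start_elev_m - finish_elev_m) / course_km
    return drop <= limit_m_per_km

def separation_ok(straight_line_m, course_km, limit=0.5):
    """Start/finish straight-line separation must not exceed
    50% of the course length."""
    return straight_line_m <= limit * course_km * 1000.0
```

A submission form could run these as the measurer types, flagging any field that fails before the application is ever sent to a certifier.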

It wouldn't be a big deal to program this as a shared system, in which state certifiers would log in and review each submission in his/her state(s), then decide whether to issue a certification number or to require a re-submission or correction update from the measurer. The certification numbers would be kept in the state database and issued in the appropriate order, every time.

Some, but by no means all, of the considerations that come to mind - first, the pros:
  • workload reduction
  • automated error notification and correction
  • automated data integrity calculations
  • complete data entry guarantee
  • easy identification of faulty data
  • eliminates the need for USATF/RRTC to scan thousands of maps
  • automates certificate entry, ensuring accuracy
  • facilitates posting of maps and certificates online

And the cons:
  • system development cost
  • organizational adoption and adaptation barriers

I doubt that there is any question in any measurer's mind anymore that we will inevitably employ such a system. It is now just a question of "how soon", IMO. By no means is this proposed initiative any kind of indictment or criticism of our present system, which obviously works well enough. It is all about our time, efficiency, and helping ensure better quality measurements and measuring data.

For myself, I am willing to contribute to a fund to pay for the development. I will even volunteer to establish a donation campaign. This is how strongly I feel the time has come to review our options and devise a plan to go forward adopting the readily available benefits that technology can provide us.

I welcome all thoughts, gripes, arguments, etc.
Lyman, that's an interesting idea, and it makes me think about what it is that I do as a certifier. While a small part of it is checking for completeness and checking math, I feel like a more important part is trying to get a feel for how someone has approached the measurement task, how they are thinking about their results, and how well their course map serves to instruct that imaginary race director from out of town who uses this map as a guide to set up and conduct the race correctly.

I just don't think we are at the stage where computers can take over very many of those tasks. For example, sometimes a measurer gives elevations that I have a hunch may be wrong-- I can go and check on Google Earth, or I can ask them how they got that information, etc-- but the checking starts with the hunch, which I'm not sure can be programmed. Another example would be submissions like the one Toni described to start this conversation, and ones that Mike W and I have received-- lots of mistakes (or just one) but possibly still acceptable. I'd rather work with someone to help them get things right, for the course they've just measured but also for future ones, than risk just having the course rejected due to a math error. (The risk isn't that a measurement is rejected but that a measurer might be discouraged and stop.)
First, we do have an auto-fill certificate that can make it easier for our certifiers. It quickly fills in things like the following: drop, separation, measurers' names, race contacts, etc. If anyone needs this, please contact me.

As for trying to use PDFs, Bob Baumel explained a long time ago that this is not the way to go. Now, to get USATF to change their system? GOOD LUCK, as they need an IT person. At this point in time they haven't filled that position.

I don't want to be negative, but I just don't see it happening anytime soon.

We have come a long way with getting our information online quickly. Since I have been the registrar the time frame has gone from a month to less than a day. We now can provide color maps online.

The one thing that frustrates me is that most certifiers don't check to see whether everything is posted correctly online. I have asked for this to be done, but it only happens with a few people.
Good thoughts, Bob. I see your points. What I am not totally grasping is how an online submission would preclude "getting a feel" for how someone approached the measurement task.

The autofill certificate is an example of this. That this form is semi-automated in no way precludes our need to review it for accuracy. If you don't like the elevation numbers on a certificate, you are not likely to like them on an automated submission, right? Measurers would still be required to provide a brief narrative of the measuring techniques they employed for each individual measurement, just as I and others have done for many years. Unless you are a handwriting analyst, I am not certain how seeing it can enhance the submission and review process, but I could be missing something here.

As far as PDFs are concerned, I have no interest in a campaign for replacing our .png format with them, though this is the format all of my clients request their maps and certificates in. I have a huge amount of respect and gratitude for all that Bob Baumel has done and continues to do for all of us. Bob and I just have differing opinions about file formats, and I do not see that changing any time soon, if ever. I'm OK with it.

My thing is this: I am inspired by the ongoing meticulous attention that members of our community pay to the aspects of measuring and mapping that help ensure the high-quality product we produce only gets better and better. I do not always sense the same interest, however, in employing the administrative technologies that could take us even further than we have come. If USATF has no IT Director, that is an obvious and unfortunate reason to maintain the status quo, I agree. Nevertheless, I feel confident that our submission system will, one day, be far more automated than it is today. I have solid reasons for saying this. There is nothing wrong with our current system. But an analogy may help express my feelings.

Up until a few weeks ago, I had a 2009 car that I took conscientious mechanical care of. Though the exterior and the interior had suffered years of indignities of bike racks and race equipment hauling, this car functioned perfectly well, and it was safe. A few months before deciding to upgrade my vehicle, I had occasion to drive a certain late-model rental SUV. It impressed me so much that I decided to trade in my old car and purchase a new one that is just like the rental. With all its safety and comfort features, driving this new vehicle was a minor epiphany for me. I was now able to load my bike in the back. Out of the weather and no rack required. I could haul race equipment without destroying the seats and headliners. The feel was different, but it was and is much better. Features such as in-console bluetooth phone calls and text and email message text-to-voice reading options make it convenient to stay connected as needed while driving. The parking camera and blind spot radar are nice.

Now, I realize what a reach this little comparison may be when considering an upgrade to automated submission on our site. I don't do a great job of articulating my reasons for mentioning the many benefits such a system can provide us with. I just hope our minds are open about this, because I see so many benefits that we have yet barely touched upon. Maybe someday, or maybe not. Regardless, we perform a valuable service with our current system, and I am proud to be a part of it.
I have given the automated application some thought. And while I use my own form of automated application, in excel, I wouldn't use the same form for my measurers.

What I would like to see available to them is a form that they fill in with all of the numbers they generated, and the form would kick back to them if anything "looked" off. Maybe some sort of "warning" back to them that the numbers they have entered don't agree with the correct calculation. I wouldn't necessarily want to give them the correct numbers, as that does not teach them how to do the work when they don't have the crutch of an electronic buddy to help them.
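A warn-only check along these lines might look like the following sketch (Python, with hypothetical field names and tolerances). It recomputes the working constant from the calibration data the measurer entered and flags a mismatch without handing back the corrected figure.

```python
def check_constant(entered_constant, avg_cal_counts, cal_length_m,
                   scpf=1.001, tolerance=0.0001):
    """Recompute the working constant (counts per km, with SCPF) and
    warn if the entered value disagrees beyond a small rounding slack.
    Returns a warning string, or None if the entry looks consistent."""
    computed = (avg_cal_counts / cal_length_m) * 1000.0 * scpf
    if abs(entered_constant - computed) / computed > tolerance:
        # Deliberately does NOT reveal the computed value.
        return ("Warning: your working constant does not agree with "
                "your calibration counts. Please recheck your calculation.")
    return None
```

Because the form only says "recheck your calculation," the measurer still has to work through the arithmetic by hand, which is the teaching point made above.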

We all needed to do the calculations by hand when we started learning to measure. While things that help us calculate are fantastic time savers, we want to make sure our measurers know and understand the principles of what we do. I believe that when I get those cringe worthy applications, that is a measurer who doesn't quite understand what is being done and why it is being done.

But maybe I am being short-sighted on this. An application that is filled in by a computer program would certainly warrant having the summary portion filled out. I don't always see a detailed summary from my people.
I think we've made a lot of improvements in our process recently: the autofill submission forms; autofill certificates; folks submitting on line; being able to send completed maps and certificates to Gene and Justin online.

Adding the "essay question" to the application form was a good step. It sounds like it is underused, so I'm thinking we should really push for measurers to at least write something there -- could be a narrative, or special problems they faced or solved, even saying what was fun or miserable about this measuring project.

In short, I would favor continuing with incremental steps like the ones I've listed-- some of those steps might even get us closer to the sort of thing Lyman is talking about, while some could be in other directions.
