Calibration of Ocular Micrometer

So I’m writing up the discussion section for one of my short lab write-ups, and it asks for another way to calibrate the ocular micrometer besides using a stage micrometer or another type of ruler. My idea was to take an already prepared slide with cells of known size and use those to calibrate the ocular micrometer with the equation:

apparent size of object = magnification × actual size of object

From that we could determine the distance between each pair of ocular units at each magnification setting. Does this make sense? Does anyone have a better approach? I can’t think of anything else.
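As a rough sketch of that calibration idea (the cell size and division count below are made-up example numbers, not values from the lab):

```python
# Calibrate ocular divisions from a cell of known size.
# The 7.5 um cell diameter and the 30-division span are
# illustrative assumptions, not measured values.

def um_per_division(known_cell_size_um, divisions_spanned):
    """Real distance represented by one ocular division
    at the magnification where the measurement was made."""
    return known_cell_size_um / divisions_spanned

# e.g. a cell of known diameter 7.5 um spans 30 ocular divisions
cal = um_per_division(7.5, 30)
print(cal)  # 0.25 um per ocular division
```

Averaging over several cells on the slide would reduce the error from any single cell being smaller or larger than typical.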

That makes sense, but I think it might be considered cheating, because a cell of known size is really just a ruler with non-standard units.

I don’t have a better answer, but I’ll keep thinking. What level course is this?

huh?

It’s just my intro micro course, so it seems like a dumb question to me. The only other thing I came up with is to assume that at 1000x magnification under oil immersion there is a distance of 1 micrometer between every 2 ocular divisions. From that you could calculate the corresponding distance at every other magnification. That’s another stretch, though, and it doesn’t seem like the right answer to write down.
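For what it’s worth, the scaling step would look like this, assuming the calibration in micrometers per division varies inversely with total magnification (and taking the 1 µm per 2 divisions at 1000x as the starting assumption):

```python
# Scale an ocular-micrometer calibration from one magnification
# to others, assuming um/division is inversely proportional to
# total magnification. The 1000x value (1 um per 2 divisions)
# is the assumed starting point, not a measured one.

def scale_calibration(known_mag, known_um_per_div, target_mag):
    """um per ocular division at target_mag, by inverse scaling."""
    return known_um_per_div * known_mag / target_mag

um_per_div_1000x = 1.0 / 2  # the assumed 1 um per 2 divisions

for mag in (100, 400, 1000):
    print(mag, scale_calibration(1000, um_per_div_1000x, mag))
# 100x -> 5.0 um/div, 400x -> 1.25 um/div, 1000x -> 0.5 um/div
```

The math is fine; the weak link is the unverified 1000x assumption, which is why it feels like a stretch.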