We can approximate the speed of light as 1 ft/ns for this experiment.
We set up transparent light sensors every 10 ft, each of which transmits a pulse back toward the source whenever the laser beam hits it.
We then send a laser through them. The receiver at the source detects pulses every 20 ns, suggesting the speed of light is 10 ft / 20 ns = 0.5 ft/ns, half its well-known value. Why?
I don't have the spacetime diagram handy, but if you draw it yourself, the axes are marked in units of 10 ft and 10 ns, and the laser's worldline is a diagonal through the points (10 ft, 10 ns), (20 ft, 20 ns), (30 ft, 30 ns), etc. From each of these points a return light line runs back to the time axis, intersecting it every 20 ns. What is that saying about this way of measuring the speed of light? Why does it get the wrong answer?
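To make the timing concrete, here is a small sketch of the arrival times under the stated assumptions (c = 1 ft/ns, sensors every 10 ft; the function name is mine, just for illustration):

```python
C = 1.0          # speed of light, ft/ns (the approximation above)
SPACING = 10.0   # sensor spacing, ft

def pulse_arrival_times(n_sensors):
    """Time at which each sensor's pulse reaches the receiver at the source."""
    arrivals = []
    for k in range(1, n_sensors + 1):
        x = k * SPACING      # sensor position, ft
        outbound = x / C     # laser reaches the sensor at t = x/c
        inbound = x / C      # the pulse needs another x/c to travel back
        arrivals.append(outbound + inbound)
    return arrivals

print(pulse_arrival_times(5))   # [20.0, 40.0, 60.0, 80.0, 100.0]
```

The pulses arrive 20 ns apart even though the sensors are hit 10 ns apart, because each pulse's arrival time includes the return trip.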