I'm guessing, based on observation, that the distance at which the atmosphere typically obscures vision is roughly D = arcsec(1+H), where H is your height above sea level. Both are measured in units of the constant "R", which is derived from atmospheric density and which RE scientists misinterpret as the radius of the earth -- but that's pretty silly, since the earth is flat.

The difference: you are making a hypothesis about how to interpret a formula that agrees with observations (actually, it probably *doesn't* agree with observations), whereas I performed a derivation from first principles.
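For what it's worth, the arcsec formula can be sanity-checked numerically against the familiar sqrt(2·R·h) horizon rule. A quick sketch, with height and distance both in units of R (the function name here is mine, just for illustration):

```python
import math

def horizon_distance(H):
    # The horizon angle theta satisfies sec(theta) = 1 + H,
    # and arcsec(x) = arccos(1/x). The arc along the surface is
    # R * theta, which is just theta when distances are in units of R.
    return math.acos(1.0 / (1.0 + H))

# For small heights this should reduce to the well-known
# sqrt(2*R*h) approximation, i.e. sqrt(2*H) in units of R.
H = 1e-6
print(horizon_distance(H))   # very close to math.sqrt(2 * H)
```

For H = 1e-6 (about 6 metres if R really were the Earth's radius) the two values agree to several significant figures, which is exactly what you'd expect if the formula were describing a horizon on a sphere.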

Somebody with more knowledge than me could derive a formula from first principles of atmospheric optics as well, and until they do so, you may not commandeer my formula -- especially since you have not actually *made* the observations that you claim support your formula (I know you haven't, because my formula doesn't take atmospheric effects into account; the real formula would be different).

Just a hint that my formula will not work on a flat Earth: it's basically a geometric definition of the secant through the centre of a circle. So really, it's more of a first principle itself than it is a derivation.
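Spelling out that secant geometry: the sight line to the horizon is tangent to the circle, so the radius to the tangent point is perpendicular to it, and the angle θ at the centre satisfies

```latex
\cos\theta = \frac{R}{R+h}
\;\Longrightarrow\;
\sec\theta = 1 + \frac{h}{R} = 1 + H,
\qquad
D = R\,\theta = \operatorname{arcsec}(1+H)\ \text{(in units of }R\text{)}.
```

There is no way to get that construction off the ground without a circle of radius R, which is the point.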

Also: if the atmosphere limits visibility, then being higher up in the air means there's more air between you and the ground, which means your visibility of the ground should *decrease*, not increase. Basically, there would be some sphere of visibility around you; the ground you could see would be the intersection of the flat Earth with that sphere. As the distance between the centre of that sphere and the surface of the Earth increases, the intersection gets smaller. My formula predicts exactly the opposite, so... yeah, there's an easily verifiable test for sphericity.
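The two opposite predictions are easy to put side by side. A tiny numeric sketch (the visibility limit V and the heights are made-up numbers, purely to show the trends):

```python
import math

def flat_earth_visible_radius(h, V):
    # On a flat Earth with sight cut off at straight-line distance V,
    # the visible ground is the intersection of the plane with a
    # sphere of radius V centred on the observer at height h:
    # a disc of radius sqrt(V^2 - h^2), which shrinks as h grows.
    if h >= V:
        return 0.0  # so high that the visibility sphere misses the ground
    return math.sqrt(V * V - h * h)

def round_earth_horizon(H):
    # Arc distance to the horizon on a sphere, in units of R.
    return math.acos(1.0 / (1.0 + H))

# Flat Earth + visibility sphere: climbing shrinks the visible patch...
print(flat_earth_visible_radius(2.0, 10.0))   # larger
print(flat_earth_visible_radius(5.0, 10.0))   # smaller
# ...while the arcsec formula says the horizon recedes as you climb.
print(round_earth_horizon(0.001))             # smaller
print(round_earth_horizon(0.01))              # larger
```

One model says "climb and see less ground", the other says "climb and see more"; you only need one balloon ride to tell them apart.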

-Erasmus