Assume that the earth is flat.
Now GPS transmitters broadcast a radio signal that contains the time at which the signal was sent. The difference between that time and the time of reception is used to calculate the distance between the receiver and the transmitter. The distances calculated this way are too large to agree with the flat-earth idea of transmitters on the ground, so the only way for the flat-earth hypothesis (FEH) to be right is if the time contained in the signal is not the real time the signal was sent. This would mean that the distance you calculate to a transmitter is always a certain constant more than the real distance. That fact can be used to calculate the real distances on the flat earth.
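As a quick illustration of the distance calculation described above, here is a minimal sketch of how a range is derived from the broadcast timestamp. The function name and example travel time are illustrative, not taken from any real GPS library.

```python
# Speed of light in metres per second.
C = 299_792_458.0

def pseudo_range(t_sent, t_received):
    """Distance implied by the signal's travel time (illustrative)."""
    return C * (t_received - t_sent)

# A signal taking ~67.4 ms to arrive implies a transmitter
# roughly 20,200 km away -- far too high for a ground-based antenna.
print(pseudo_range(0.0, 0.0674))
```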
The two yellow dots are two points on the surface of the earth; one is directly beneath a transmitter (it is the point with the smallest distance to it). Given the "fake" round-earth distance X between the two points (the distance following the curve of the earth, not the length of a straight line connecting them), the real height of the transmitter above the ground T, and the "fake" height of the transmitter S, the "real" flat-earth distance between the points can be found.

The green dot is the transmitter.
The angle a = X/R (in radians), and D = sqrt(2*R^2*(1-cos(a))) is the straight-line (chord) distance between the two points. The angle between D and S is a/2 + pi/2. Using the cosine rule again, F = sqrt(D^2 + S^2 - 2*D*S*cos(a/2 + pi/2)), so from the "fake" distance X, the "fake" distance F to the transmitter can be found.
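The chord-and-cosine-rule steps above can be sketched in a few lines of Python. The function name is mine, and the Earth radius R is an assumed mean value.

```python
import math

R_EARTH = 6371e3  # assumed mean Earth radius in metres

def fake_transmitter_distance(X, S, R=R_EARTH):
    """'Fake' distance F from a surface point to the transmitter.

    X: round-earth (arc) distance between the two surface points.
    S: 'fake' height of the transmitter above the sub-transmitter point.
    """
    a = X / R                                        # central angle in radians
    D = math.sqrt(2 * R**2 * (1 - math.cos(a)))      # chord between the points
    theta = a / 2 + math.pi / 2                      # angle between D and S
    # Cosine rule on the triangle with sides D and S:
    return math.sqrt(D**2 + S**2 - 2 * D * S * math.cos(theta))
```

Sanity check: with X = 0 the two points coincide, so F collapses to the transmitter height S.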

Now H = F - S + T: subtracting the constant offset S - T from the measured distance F gives the real slant distance H to the transmitter. Then, by Pythagoras on the flat plane, Dr = sqrt(H^2 - T^2).
This is the full chain of equations for finding the real distance between the two points.
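Putting the whole chain together, here is a self-contained sketch that goes from the round-earth inputs X, S, T straight to the "real" flat-earth distance Dr. Names and the Earth radius are my own assumptions.

```python
import math

def real_flat_distance(X, S, T, R=6371e3):
    """'Real' flat-earth distance Dr between the two surface points.

    X: round-earth arc distance between the points.
    S: 'fake' transmitter height; T: assumed real transmitter height.
    """
    a = X / R
    D = math.sqrt(2 * R**2 * (1 - math.cos(a)))
    F = math.sqrt(D**2 + S**2 - 2 * D * S * math.cos(a / 2 + math.pi / 2))
    H = F - S + T                   # strip the constant offset S - T
    return math.sqrt(H**2 - T**2)   # Pythagoras on the flat plane
```

Note that F is always at least S (the cosine term is non-positive since the angle exceeds pi/2), so H is at least T and the final square root is always defined.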

Now, for a transmitter and two points, S and X can be found using round-earth data; all that is needed is a method of finding T, and you have a way of either creating a flat-earth map using triangulation, or disproving the FEH via a contradiction in the distances.