It takes a special kind of person to believe this kinda malarkey.

Yes, it does take a special kind of person to believe the malarkey you are spouting: that a close sun could somehow appear due east, all at once, to every observer along a line of longitude.

(yes, I noticed an error in my prior post but will fix it).

It takes a special kind of person to believe the sun can magically set while remaining above you (i.e. above a flat Earth).

With any shadow in all of existence, except that one supposedly created by the sun, you would simply draw a line between the end of the shadow and the top of the object to locate the light source.

No, you wouldn't.

That is because a single line only gives you a direction, not a position in 3D space. The sun could be anywhere along that line.

There is also a certain degree of measurement error in this. So to do it properly, instead of a line you would have a cone, growing wider as it gets further away. The light source would be somewhere in that cone, a region of 3D space.

In order to determine the location in 3D space, you need 2 such lines, which again due to error means 2 cones; instead of a single point of intersection you get a small region.

But that requires knowing the relationship between those 2 lines/cones, i.e. you need to know a point on each line/cone and the angle between them.

Otherwise you can only make assumptions and see if they hold.
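To make the triangulation concrete, here is a quick Python sketch (the observers, positions and the lamp location are my own illustrative numbers, not anyone's measurements). Each shadow gives a ray: a known point plus a direction toward the source. One ray alone leaves the distance unknown; two rays pin the source down, up to measurement error.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def closest_points(p1, d1, p2, d2):
    """Return the point on each line closest to the other line.

    Lines are p1 + t*d1 and p2 + s*d2. With measurement error the
    lines rarely intersect exactly; the midpoint of the two closest
    points is the best estimate of the source's position.
    """
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    w = [x - y for x, y in zip(p1, p2)]
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b          # zero if the lines are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = [x + t * u for x, u in zip(p1, d1)]
    q2 = [x + s * u for x, u in zip(p2, d2)]
    return q1, q2

# Two observers sighting a lamp placed at (0, 0, 10):
q1, q2 = closest_points([0, 0, 0], [0, 0, 1], [10, 0, 0], [-1, 0, 1])
source = [(x + y) / 2 for x, y in zip(q1, q2)]
print(source)  # -> [0.0, 0.0, 10.0]
```

Note that with a single ray, any point along it would fit the shadow equally well; it is the second ray, from a known offset and angle, that fixes the position.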

With close objects you can typically locate the source fairly accurately.

With very distant objects, like the sun, you can't.

But oh no! not with the sun. To sustain the unsustainable belief in a globe they blow our universe up to ridiculous scales and claim the beams from a radial source are 'almost parallel.'

Again, you dismiss this as ridiculous without any basis, even though it is what all the evidence shows.

Again, along a line of longitude, at a particular time on the equinox, the sun is due east for every observer on that line. This shows these beams of light are roughly parallel.

Truth be told, the further you are from a radial source, the greater the distance between any two of its rays.

Truth be told, the further away you are from a radial source, the slower the proportional rate of divergence.

For example, if you are 1 m away from a radial source, and looking at 2 beams which are 1 m apart, that subtends an angle of 60 degrees.

If you follow them for an additional 1 m, they end up 2 m apart.

If instead you are 1 km away, and you look at 2 beams which are 1 m apart, they now subtend an angle of roughly 1 milliradian. If you follow them for another 1 m, the separation only grows to roughly 1.001 m.

Similar relations to those above can be used.

You start at a distance d1, with 2 beams separated by 2*s1. Half the separation and the distance form a right-angled triangle, giving the relation tan(a)=s1/d1.

If you then go to some further distance, d2, the separation grows to 2*s2 and the same relation holds: tan(a)=s2/d2.

Thus s1/d1=s2/d2.

Thus s2/s1=d2/d1.

If you let d2=d1+dd, you get:

s2/s1=(d1+dd)/d1=d1/d1+dd/d1=1+dd/d1

Thus the proportional rate of divergence is dd/d1.

So the further away you go, the more slowly they diverge in proportional terms.
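A quick numerical check of the derivation (the specific distance and separation here are arbitrary examples, chosen by me; the relation holds for any values):

```python
import math

d1 = 1000.0             # initial distance from the source
s1 = 0.5                # half the initial separation, i.e. 2*s1 = 1 m apart
a = math.atan2(s1, d1)  # half-angle of divergence, fixed for a radial source
dd = 1.0                # extra distance travelled
d2 = d1 + dd
s2 = d2 * math.tan(a)   # half-separation at the new distance

print(s2 / s1)          # 1.001
print(1 + dd / d1)      # 1.001, confirming s2/s1 = 1 + dd/d1
```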

So if you are 150 000 000 km away from a light source, then travelling the entire width of Earth (roughly 13 000 km, which I will overestimate as 15 000 km) would only give a divergence of 15 000/150 000 000 = 1/10 000 = 0.0001, i.e. 0.01%. I would say that is nearly parallel.
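Plugging the numbers from that paragraph into the formula:

```python
d1 = 150_000_000  # km, roughly the Earth-sun distance
dd = 15_000       # km, generous overestimate of Earth's width

divergence = dd / d1           # proportional rate of divergence, dd/d1
print(divergence)              # 0.0001
print(f"{divergence:.2%}")     # 0.01%
```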

Now do you have a rational objection, or just more emotional appeals dismissing it as ridiculous?