My mistake, let me clarify my question: to what percentage of the object's height must a person be accurate in making his readings, 99%, 99.9%, or 99.99999%?
Your question is fine, you are just ignoring the answer.
Here, let me try and clarify it:
IT DEPENDS ENTIRELY UPON WHAT YOU ARE TRYING TO DO WITH IT!!!
If you are looking for a change of 1% (a direct change), then it would need to be within less than that 1%, preferably 0.1%. If you are looking for a change of 0.00001%, then you need better than that.
Do you understand that?
OK, for this example, what % accuracy would you expect his readings to have?
And how can people disregard this work without even looking at the numerical data?
That would depend upon his exact methodology, i.e. exactly what he is measuring and calculating.
We can disregard his work because he blatantly lied about the accuracy he had.
He claimed to have had micrometer accuracy when in fact the best he could do is half a mm (or 500 times what he claims).
As he has now admitted the data was fake, I can provide a method by which you could do something like this:
We know that just considering the vertical component, a=-g (for simplicity, you could have it as g as well, it is just the direction that changes)
v=v0-g*t
d=d0+v0*t-0.5*g*t^2.
So the simplest way is to measure 3 points in time.
You have the first, which we define at t0, at which point the object is at d0 and v0.
You have some small time later, t1, at which point the object is at d1=d0+v0*t1-0.5*g*t1^2
And you have another time, t2, at which the object is at d2=d0+v0*t2-0.5*g*t2^2.
We are really looking at the differences in distance, so what we have is:
dd1=v0*t1-0.5*g*t1^2
dd2=v0*t2-0.5*g*t2^2
We directly measure d0, d1 and d2, then use the differences to find dd1 and dd2, and we measure t1 and t2; but each difference also carries the error of the starting measurement.
This means the worst-case error is doubled for each difference.
But at least we now have 2 equations with 2 unknowns (v0 and g).
Well, using the first one:
dd1=v0*t1-0.5*g*t1^2
Thus v0*t1=dd1+0.5*g*t1^2
Thus v0=dd1/t1+0.5*g*t1
Now substitute that into equation 2:
dd2=v0*t2-0.5*g*t2^2
Thus 0.5*g*t2^2=v0*t2-dd2
Thus 0.5*g*t2^2=(dd1/t1+0.5*g*t1)*t2-dd2
Thus 0.5*g*t2^2=dd1*t2/t1+0.5*g*t1*t2-dd2
Thus 0.5*g*t2^2-0.5*g*t1*t2=dd1*t2/t1-dd2
Thus g*0.5*t2*(t2-t1)=dd1*t2/t1-dd2
Thus g=2*(dd1*t2/t1-dd2)/(t2*(t2-t1))
Thus g=2*(dd1/t1-dd2/t2)/(t2-t1)
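As a sanity check, that algebra can be verified numerically. Here is a short Python sketch (all the numbers are made up for illustration) that generates three exact points from the kinematic equations above and recovers g with the formula just derived:

```python
import math

# Illustrative values (assumptions, not from any real measurement).
g_true = 9.81       # m/s^2
d0 = 2.0            # position at t0, m
v0 = 1.5            # velocity at t0, m/s
t1, t2 = 0.1, 0.2   # measurement times after t0, s

def height(t):
    # d = d0 + v0*t - 0.5*g*t^2 (vertical position under constant gravity)
    return d0 + v0 * t - 0.5 * g_true * t**2

# The three "measured" positions and their differences from d0.
d1, d2 = height(t1), height(t2)
dd1, dd2 = d1 - d0, d2 - d0

# g = 2*(dd1/t1 - dd2/t2)/(t2 - t1), from eliminating v0.
g_est = 2 * (dd1 / t1 - dd2 / t2) / (t2 - t1)
print(g_est)  # recovers 9.81 with exact inputs
```

With noise-free inputs the formula returns g exactly; the whole argument below is about how measurement error in d and t corrupts this estimate.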
So, now analysing the error, bit by bit.
So on the bottom we have (t2-t1).
In reality this is (t2-t0)-(t1-t0)=t2-t0-t1+t0=t2-t1.
Thus, at worst, it is just twice the initial timing error.
Up top we have dd1/t1 (and dd2/t2). Because these are quotients, you need to use the percentage errors, and as the distance and time errors are uncorrelated, you add them in quadrature.
So the percentage error for each of those terms is sqrt(pEd^2+pEt^2).
You then effectively multiply it by sqrt(2) when the two terms are combined (again in quadrature, and assuming the pE is similar for both).
The top and bottom then combine by division, so you add their percentage errors in quadrature:
So pEg=sqrt(pEtop^2+pEbot^2)
=sqrt((sqrt(2)*sqrt(pEd^2+pEt^2))^2+(2*Et/dt)^2)
=sqrt(2*(pEd^2+pEt^2)+(2*Et/dt)^2)
=sqrt(2*((Ed/d)^2+(Et/t)^2)+(2*Et/dt)^2)
=sqrt(2*(Ed/d)^2+2*(Et/t)^2+4*(Et/dt)^2)
(where dt=t2-t1)
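Under the same assumptions (uncorrelated errors, similar percentage errors in both terms), that final expression is easy to evaluate for any proposed instrument. A Python sketch, with purely illustrative error values:

```python
import math

def percent_error_g(Ed, d, Et, t, dt):
    """Percentage error in g from the quadrature formula above.

    Ed: absolute distance error, d: measured distance,
    Et: absolute timing error, t: a typical measurement time,
    dt: the time difference t2 - t1.
    """
    return math.sqrt(2 * (Ed / d)**2 + 2 * (Et / t)**2 + 4 * (Et / dt)**2)

# Illustrative case: 0.5 mm distance error over a 1 m drop,
# timing error taken as zero for simplicity.
print(percent_error_g(Ed=0.0005, d=1.0, Et=0.0, t=0.1, dt=0.1))
# ~0.0007, i.e. roughly 0.07%
```

That 0.07% is already far above the 0.005% level discussed below, which is the whole point: half-millimetre resolution is nowhere near good enough.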
And the difference he was looking for was 9.807 vs 9.809, so a difference of 0.002 out of 9.807, or roughly 0.02%; so you would want the error down to something like 0.005%, i.e. 0.00005.
This means we have:
0.00005=sqrt(2*(Ed/d)^2+2*(Et/t)^2+4*(Et/dt)^2)
So first, let's assume the error in time is 0.
That makes it a lot simpler:
0.00005=sqrt(2*(Ed/d)^2)
=sqrt(2)*(Ed/d)
So Ed/d=0.00005/sqrt(2)
So Ed=d*0.00005*sqrt(2)/2
So assuming the jump was 1m, the maximum you can get for d is 1 m.
So the error is 1 m*0.00005/sqrt(2)=3.5e-5 m=35 micron.
That is not achievable with that camera.
Thus the result must be fake, even without looking into it further.
So you want an instrument which gives at least a 35 micron resolution. When you factor in the error with time the requirements are even tighter.
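The requirement can also be turned around: given a target percentage error and a measured distance, solve for the largest tolerable distance error. A small Python sketch of the time-error-free case worked above:

```python
import math

# Target overall percentage error in g (0.005%, as above).
target = 0.00005
d = 1.0  # maximum distance for a ~1 m jump, in metres

# With Et = 0: target = sqrt(2)*(Ed/d), so Ed = d*target/sqrt(2).
Ed = d * target / math.sqrt(2)
print(Ed)  # about 3.5e-5 m, i.e. 35 micron
```

Any nonzero timing error only shrinks this distance budget further.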
Let me put it to you this way: the camera on the iPhone is a much better video camera than the ones costing thousands of dollars 30 years ago. So by this logic, the iPhone should be just as good a measuring instrument as the measuring instruments of 30 years ago.
If you discard this measurement, then you also have to discard all measurements made more than 30 years ago.
No.
We are discarding using an iPhone (just the iPhone) to measure a jump to within micrometer accuracy.
I will also discard using a camera from 30+ years in the past to do that.
That doesn't mean I have to discard other instruments or techniques.
The iPhone is not as good as other instruments.
The iPhone camera is for taking pictures, not for measuring gravity or distances to micrometer accuracy.