Do you understand that a ten percent error multiplied by a three percent error does not equal a thirty percent error?
That ten percent error is happening over and over again. Ergo, recurring. The three percent variable is a multiplier variable, since the three percent is the rate at which the ten percent error recurs.
Oh dear. I'll take that as a "no".
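To spell it out: when two measured quantities are multiplied, their relative errors combine roughly additively (or in quadrature if they're independent); they don't multiply up into thirty percent. Here's a quick numerical sketch, with made-up values (only the ten percent and three percent matter):

# Quick numerical sanity check (hypothetical numbers): multiply two quantities
# carrying 10% and 3% relative uncertainties and see what the product's
# relative uncertainty actually is.
import math

a, da = 100.0, 10.0   # some measured quantity with a 10% uncertainty
b, db = 50.0, 1.5     # another measured quantity with a 3% uncertainty

product = a * b

# Worst case: the relative errors simply add (10% + 3% = 13%).
worst_case = da / a + db / b

# Independent errors: add in quadrature, sqrt(0.10^2 + 0.03^2) ~ 10.4%.
quadrature = math.sqrt((da / a) ** 2 + (db / b) ** 2)

print(f"worst-case relative error: {worst_case:.1%}")   # 13.0%
print(f"quadrature relative error: {quadrature:.1%}")   # 10.4%
# Neither is anywhere near 30%. Literally multiplying the percentages
# (0.10 * 0.03 = 0.3%) isn't how uncertainties combine either.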
If the period varied between 5 and 15 seconds, then the average would be 10 seconds; this would in fact be an average error of 2 percent (another failure). However, even then I'm confused. An error is usually expressed as a +/- value, so "5 to 15" doesn't make much sense.
Anyway, how this error would propagate into the error of the final result is more complicated. See the equations here:
http://en.wikipedia.org/wiki/Cavendish_experiment

However, it still stands that a ten percent error multiplied by a three percent error is not a thirty percent error. The fact that neither you nor Miles Mathis understands this is irrelevant.
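For what it's worth, here is a minimal sketch of how a timing error actually feeds through, assuming the standard small-oscillation formula given on the linked page, G = 2*pi^2*L*r^2*theta/(M*T^2), and treating the uncertainties as independent. Every numerical value below is a placeholder, not a claim about the real experiment; only the relative errors matter.

# Rough sketch of error propagation for the Cavendish result, assuming
# G = 2*pi^2 * L * r^2 * theta / (M * T^2). Placeholder values and
# placeholder relative errors throughout.
import math

# (value, relative error) for each measured quantity
L     = (1.8,   0.001)  # torsion rod length [m]
r     = (0.23,  0.005)  # centre-to-centre separation [m]
theta = (0.005, 0.02)   # deflection angle [rad]
M     = (158.0, 0.001)  # large ball mass [kg]
T     = (420.0, 0.03)   # oscillation period [s], say a 3% timing error

# r and T enter the formula squared, so their relative errors count double.
rel_err_G = math.sqrt(
    L[1] ** 2
    + (2 * r[1]) ** 2
    + theta[1] ** 2
    + M[1] ** 2
    + (2 * T[1]) ** 2
)

print(f"relative error in G: {rel_err_G:.1%}")
# With these placeholder numbers the period term dominates and G comes out
# good to roughly 6-7%, because the T^2 doubles the 3% timing error. The
# errors propagate through the formula term by term; they don't just
# multiply together.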