The fallacy is that because we have not physically seen with our own eyes ("directly observed") hydrogen atoms fusing and releasing light, we cannot say it happens.
That's not what I'm saying, though; I'm saying we cannot assume it happens. To go back to the electron metaphor: we had no working understanding of the atom until recently, and our conception of its components and their workings is constantly changing as we gain new information. Early models were completely wrong, yet they were considered the best representation of the atom at the time. Current models may be completely wrong as well; we can describe the atom mathematically with great rigor, but as for what it actually is in reality we can only speculate, since we cannot observe it directly. No credible nuclear physicist would ever claim that we've accurately mapped the structure of the atom in anything but an abstract, mathematical way.
In this case, we can observe that stars are composed largely of hydrogen and helium; we cannot observe them undergoing the constant nuclear fusion that supposedly sustains their energy output, and although fusion is the best theory to fit the observations, we have not been able to replicate self-sustaining fusion in a laboratory either. Given that we cannot perform an experiment confirming that such self-sustaining fusion is possible, and that we have not observed it directly, it is nothing more than the best theory of a star's energy output that anyone has been able to come up with; it is certainly not the final word, because our understanding of how such a thing could even be possible is incomplete.
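For what it's worth, the arithmetic behind why fusion fits the observations so well is a simple mass-defect calculation. The following is only an illustrative back-of-envelope sketch using standard atomic masses, not a claim that anyone has watched the process happen:

% Standard atomic masses, in unified atomic mass units (u);
% 1 u of mass converts to 931.494 MeV of energy.
% Four hydrogen-1 atoms in:  4 x 1.007825 u = 4.031300 u
% One helium-4 atom out:                     4.002602 u
% Mass defect:               0.028698 u  (about 0.7% of the input mass)
\[
  4\,{}^{1}\mathrm{H} \;\longrightarrow\; {}^{4}\mathrm{He} + 2e^{+} + 2\nu_e,
  \qquad
  E = \Delta m\,c^{2} \approx 0.0287\,\mathrm{u} \times 931.5\,\tfrac{\mathrm{MeV}}{\mathrm{u}} \approx 26.7\,\mathrm{MeV}
\]
% per helium nucleus formed (a small share of this is carried away by
% neutrinos rather than radiated as light).

Converting roughly 0.7% of a star's hydrogen mass to energy is enough to account for the observed luminosity over billions of years, which is why the theory fits so well; but fitting well is not the same as being directly observed.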
It would be folly to discount entirely the possibility that stars work in a way completely different from what we theorize, given our incomplete understanding of the phenomenon. If the most obvious explanations had always been taken for granted throughout history, where would scientific progress have come from?