Gadianton wrote:could you, or physics guy, or lemmie, maybe analytics, I don't know about his field, comment on the certainty published - 10^132 (I'm not looking at the number exactly right now) - and how often legitimate and very well respected scientific findings achieve this level of certainty?
Great question, Dean Robbers.
Physics Guy wrote:Bayes is just a hobby with me; it's rarely used in my field. Not even experimentalists resort to it often. I don't know when I've heard an experimental physicist talk about evidence for a hypothesis; they generally just measure numbers and compare them with theoretical predictions, and both the theory and the measurement are precise enough that it's either an obvious Yes or an obvious No. The data points either hug the theory curve or they don't. You could dress it up in Bayesian language but it would just be longwinded.
In experimental high energy particle physics, which is not my field, there's a whole lot of background noise, and you're trying to infer what happened in a very brief instant from its later consequences. So Bayesian reasoning is probably much more important there—or at least it could be. Things are pretty Gaussian for them so they may still be able to get away with simple estimates. I know the standard for claiming to have discovered a new particle is a signal five standard deviations above background noise, so about a 1 in 3.5 million chance that it's just a fluke. Then they keep on measuring, of course, and the chance that it's not real steadily falls as the data accumulates. They'll give you a Nobel prize long before you get to 1 in 10^132, though.
My attitude is that no extremely low probability estimates are ever valid or even meaningful, because the chance that some unknown thing might be wrong in the analysis is going to be higher than the reported chance.
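As a quick check on that 1-in-3.5-million figure (my own back-of-the-envelope arithmetic, not Physics Guy's calculation), the five-sigma discovery threshold corresponds to the one-sided Gaussian tail probability:

```python
from math import erfc, sqrt

# One-sided Gaussian tail: the chance that pure background noise
# fluctuates five or more standard deviations above its mean.
p = 0.5 * erfc(5 / sqrt(2))
print(p)  # roughly 2.9e-7, i.e. about 1 in 3.5 million
```

That is about as far out on the tail as anyone in physics ever claims to be, and it is still 126 orders of magnitude short of 10^132.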
Physics Guy's response described the situation in physics well. Here is an example.
High-precision measurements of physical constants such as the Newtonian gravitational constant (G) recently went from a generally accepted relative uncertainty of 47 parts per million to a claimed uncertainty of 11.6 ppm. This latter work was done by infinitely patient Chinese physicists and reported last year in Nature.
Qing Li, et al., in Nature wrote: Here we report two independent determinations of G using torsion pendulum experiments with the time-of-swing method and the angular-acceleration-feedback method. We obtain G values of 6.674184 × 10^−11 and 6.674484 × 10^−11 cubic metres per kilogram per second squared, with relative standard uncertainties of 11.64 and 11.61 parts per million, respectively. These values have the smallest uncertainties reported until now, and both agree with the latest recommended value within two standard deviations.
Please note the use of standard deviations as the probabilistic characterization of the uncertainty in a recommended standard value. A search with the FIND function on my computer shows that the term "standard deviation" does not appear anywhere in the Dale & Dale paper.
To reiterate: the best individual sets of measurements of G over the last few decades have only now reached the precision the Chinese team reports, 11.61 parts per million, yet the spread between the highest and lowest values of G reported over the last 40 years or so remains on the order of 500 ppm.
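A quick sanity check on those numbers (my own arithmetic, using only the two values and uncertainties quoted above from Li et al.) shows that even the two new determinations are in mild tension with each other, which is exactly why the spread between experiments matters more than any single error bar:

```python
from math import sqrt

# The two values and relative uncertainties quoted in Li et al. (2018).
g_tos, g_aaf = 6.674184e-11, 6.674484e-11  # m^3 kg^-1 s^-2
u_tos, u_aaf = 11.64, 11.61                # relative uncertainty, ppm

mean = (g_tos + g_aaf) / 2
diff_ppm = (g_aaf - g_tos) / mean * 1e6      # how far apart the values are
sigma_ppm = sqrt(u_tos**2 + u_aaf**2)        # combined uncertainty of the difference

print(round(diff_ppm, 1))              # ~45 ppm apart
print(round(diff_ppm / sigma_ppm, 1))  # ~2.7 combined standard deviations
```

Two state-of-the-art measurements from the same laboratory, each claiming ~11.6 ppm, still sit roughly 45 ppm apart.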
The fact that a fundamental constant of nature can only be determined to an uncertainty of 11.61 parts per million, by a team of more than a dozen scientists, in experiments important enough to be published in Nature in 2018, makes the pseudo-Bayesian, faith-based claim of certainty at the level of one part in 10^132 look more than ridiculous.
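Physics Guy's point above, that no reported probability can meaningfully beat the chance of an unnoticed flaw in the analysis, can be sketched in a couple of lines (an illustrative toy of my own, not anyone's published method):

```python
def effective_error_prob(reported_error_prob, analysis_flaw_prob):
    # You cannot be more certain of a conclusion than you are that the
    # analysis producing it is sound: the probability of a broken model,
    # bad data, or a coding mistake puts a floor under the total chance
    # of being wrong, no matter how small the reported number is.
    return max(reported_error_prob, analysis_flaw_prob)

# A claimed error of 1 in 10^132 is swamped by even a
# one-in-a-million chance that the analysis itself is flawed.
print(effective_error_prob(1e-132, 1e-6))  # 1e-06
```

In other words, the smallest error probability anyone can honestly report is bounded by how sure they are of their own method, and nobody is sure of anything to one part in 10^132.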