[SystemSafety] Fwd: Contextualizing & Confirmation Bias

Matthew Squair mattsquair at gmail.com
Wed Feb 5 20:25:56 CET 2014


Isn't this restating the problem of experimental regress?

No experiment is ever completely theory-free: you generally need an idea of
what you're looking for before you start, and you inevitably interpret data
through the lens of your theory. If others get the same result, great!
If not, everyone gets to throw stones...

A case in point from a safety perspective would be Boeing's and the FAA's
lithium-ion battery woes. :)

http://criticaluncertainties.com/2013/03/27/battery-tests-and-experimenters-regress-part-ii/

Matthew Squair

MIEAust, CPEng
Mob: +61 488770655
Email: Mattsquair at gmail.com
Web: http://criticaluncertainties.com

On 6 Feb 2014, at 4:46 am, Nick Lusty <nl887 at my.open.ac.uk> wrote:

Peter,

Whilst accepting that all science is done in its own context, surely
getting "the right answer ... when there was a right answer to get" is
not evidence of the absence of confirmation bias in the experiment.
Indeed, turning this argument on its head, if a theory is correct and the
experimenter suffers confirmation bias, "the right answer" would seem, at
least to me, to be the inevitable conclusion of the experiment.

Nick Lusty


On 05/02/2014 17:18, Peter Bernard Ladkin wrote:


On 5 Feb 2014, at 15:53, Derek M Jones <derek at knosof.co.uk> wrote:


Some of Physics' famous experiments, e.g., Millikan's oil drop
experiment, had a great deal of confirmation bias built into their
original reporting:

http://www.albany.edu/~scifraud/data/sci_fraud_0732.html

I did see a photograph of a page in Millikan's original notebook
that showed exactly what he had done with his data, but cannot find
the link.

It's worth being a little careful when discussing work which is both right
and Nobel-prize-winning. Holton's article on the Millikan oil-drop
experiments was one of the first in which it was pointed out that great
scientists don't just dispassionately collect data, do some averaging and
publish the mean as The Truth. I don't have the original article any more,
but the lesson I seem to remember taking away from it (I read it in the
1970s, I think) is this.


There are, in some experiments, a plethora of confounding factors, and you
don't know what they are or how to control for them. But you are
certain there is something there that you know about and are trying to
measure. You can do some obvious things such as throw away outliers. But
for others there is no algorithm. There are some people who have an
intuition for when an experiment they have devised is going "right" (the
unknown factors aren't overwhelming the run) and when it's going "wrong"
(something else is dominating) - an intuition for how one can get things to
work. I've met people with such magic skills, starting as a teenager in
school science classes. That's how much of experimental physics had always
been done - you couldn't get results otherwise. Just as some people knew
how to hit a tennis ball better than others.
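As an aside on the "obvious things" above: the algorithmic part of outlier rejection can at least be sketched. This is purely illustrative, not anything from Millikan's actual procedure; the data, threshold, and function name are my own invented choices. It discards measurements more than a chosen number of standard deviations from the mean:

```python
# Hypothetical sketch: simple n-sigma outlier rejection.
# Note the catch that motivates the "no algorithm" remark: a single wild
# run inflates the standard deviation itself, so with small samples a
# naive 3-sigma rule may fail to reject the very point you'd discard by
# eye -- here a tighter 2-sigma cut is needed to catch it.
import statistics

def reject_outliers(runs, n_sigma=2.0):
    """Keep only measurements within n_sigma std devs of the mean."""
    mean = statistics.mean(runs)
    sigma = statistics.pstdev(runs)
    if sigma == 0:
        return list(runs)
    return [x for x in runs if abs(x - mean) <= n_sigma * sigma]

# Invented example data: five consistent runs and one wild one.
measurements = [4.80, 4.77, 4.81, 4.79, 9.50, 4.78]
print(reject_outliers(measurements))  # the 9.50 run is dropped
```

Beyond a fixed rule like this, deciding which runs are "going wrong" is exactly the intuition-driven judgement the paragraph above describes.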


Millikan was one of these people. And he was in a hurry. He knew the
phenomenon was there, and he was trying to get things to cluster. He did
so aggressively; from our modern point of view, too aggressively. Holton
couldn't find any reasons why he denigrated some runs. But he was after a
result, knew it was there to be got, believed in his intuition with very
good reason, and got it. He was probably also lucky, in the sense that
almost anyone who won the Nobel prize for their work has to have been in
some respect or other lucky - you don't get it just by putting in your time.


He seems to have been a supremely self-confident experimentalist, working a
century ago at a time when one did what one could to get the right result.
To put a discussion of what he did up on a WWW page with "fraud" in the
title seems out of bounds to me. The observation that he couldn't do that
nowadays seems right - Holton's point was that you could barely do that even then.


But, if you have stayed with me this far, the point is this. It's not an
example of confirmation bias. He got the right answer, for goodness' sake,
when there was a right answer to get. His intuition was spot on.


PBL


Prof. Peter Bernard Ladkin, University of Bielefeld and Causalis Limited

_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE



