Jumping to conclusions: an example.

The climate argument recently came down to the accuracy of the primary data and the manner in which conclusions were drawn from that data. Stats.org has another example of how a bit of agenda-driven zeal can produce questionable conclusions: when the inference was checked by experiment, it was found to be flawed.

In this case, it was the Ecology Center and plastic use in automobiles. “Ecology” and “plastic” are terms that often seem to correlate with an unseemly bias, and that alone should raise a bit of skepticism.

But the Center didn’t actually measure the “off-gas” or what it contained. It was enough to know that the car contained plastic and that off-gassing occurred; therefore, the fumes must be poisonous. … a real toxicologist has finally studied new-car smell, and the results are rather different from the Ecology Center’s.

As in the Hansen climate data brouhaha, one wonders whether some are thinking “why let measurement get in the way of a good story?” After all, does the end justify the means? Or should a jumped-to conclusion be treated as only a starting point, to be tested against careful measurement?
