Getting linky with it.
Some good stuff to read, especially if you're resurfacing from unpleasant tasks like writing or grading papers:
Sharing reagents/no good deed goes unpunished. YoungFemaleScientist has a great post about possible downsides to sharing published reagents. In particular, if the person who requests your reagent can't get it to work — and doesn't keep you in the loop so you can share your expertise — they might decide your reagent is crap. They might even go to your competitor for a similar reagent. Go read the post to see how this feeds into the eternal struggle to publish and have your findings recognized and used by your scientific peers. One important take-home message is that effective scientific communication involves asking questions, and it extends well past publishing findings or using published findings.
Sometimes "magic hands" are just hands that are better than yours. The Pluripotentate has a very nice response to my post about "magic hands" in the Korean stem cell scandal. The whole too-good-to-be-true nature of certain experimentalists' achievements can make you suspect ... well, that they really are too good to be true and these experimentalists are maybe makin' stuff up. But the Pluripotentate reminds us that this is not the only explanation:
I'd add another possibility -- stubbornness on the part of people trying to learn the technique. Right now I'm the magic hands in the lab on a particular technique. I remember being suspicious of the woman who taught me, until I finally got it to work. I finally gave in and made fresh buffers and followed the protocol exactly. So painful. So tedious. So rewarding.
The people I'm teaching now, who've been struggling for months, may look on me with some suspicion. But they're the ones hanging on to their precious contaminated Tris, the poor dears. One day, at the end of their ropes, they'll pour their buffers down the sink in frustration. And the week after that, the heavens will rain blessings down on their sparkling new magic hands.
Experiments are hard. That's why scientists get the big bucks (and the chicks, and the public's adulation, and a pony!).
Methodology matters. Check out Shakespeare's Sister's discussion of methodological flaws in a Norwegian study that purports to show that women who have abortions suffer “mental distress” longer than do women who have miscarriages. The dissection of the problems is detailed — go read it. Shake's Sis tells us:
... the mental health of the participating women who sought an abortion was almost statistically significantly poorer than the participating women who had a miscarriage, and the complexity of the abortion issue may account for discrepancies. That’s the problem with poor controls; you can end up with a study that has a completely meaningless conclusion. And yet here it goes—out into the world, reported as fact. Women who get abortions are more highly traumatized than women who have miscarriages. Even though it may be the women who got abortions and participated in the study were more inclined toward mental distress irrespective of their abortions, or that societal views of abortion—and specifically, women who get abortions—may facilitate feelings of shame and guilt.
It's not clear that the Norwegian researchers were trying to skew the results. Good experimental design, the work that has to happen before you collect any data, is hard, too.
It's nice that the blogosphere keeps chugging with such nice posts while I've been off in my cave grading.