
20110225

Democrats, Experts, and STS

[Image: Predrag Bokšić | perceptron]
Governing is no easy task. While in some idealized Athenian past every decision required of the body politic might have drawn solely on common sense, these days every decision is intertwined with knowledge known only to specialists in the relevant field; it is locked behind walls of expertise. The body politic, if it is not to flail randomly in insensate throes, must rely on the advice of experts. How, then, can rule by a small elite be reconciled with democracy?

The modern expert advisor is the spiritual descendant of Machiavelli. The brutally realist Italian revolutionized the Mirror for Princes genre, speaking directly in the vernacular, and cloaking his rhetoric in an objective "view from nowhere." To prove his credibility, Machiavelli erased himself, claiming merely to transmit the facts of history and psychology into applicable lessons on power. Early scientists, as exemplified by the British Royal Society of Boyle's era, used the same technique to 'merely transmit the facts of nature,' displaying for the public that which was self-evidently true.

The Machiavellian advisor works primarily at the point of power, at the person of the sovereign, but in a modern democracy the sovereign is a fiction. The people rule through their representatives. Though the relation between the people and their representatives is far from straightforward (representatives speak for the people, make decisions for the people, and serve as targets of blame for the people, among their diverse functions), a representative who strays too far from the desires of his or her constituents will soon fall. Therefore, expert advice applied at this level becomes useless once it departs from common knowledge. The experts, and those who listen to them, will be discarded at the first opportunity.

Instead, in a democracy, experts must also address the validity of their claims to the public. The end product of advice, and the advisory process itself, must appear credible. Science (roughly, the process of discovering facts about the natural world), with its Enlightenment legacy and the scientifically derived technologies around us, is one means of certifying the validity of expert claims and representative decisions. Yet because scientific claims speak to fundamental truths about the world, and can thereby override deliberation, astute politicians have learned to deploy counter-claims and counter-experts. Moreover, political figures have disseminated a narrative that discredits the ability of science to make any epistemically true and relevant claims about the world.

How, then, can scientists operate in a climate of such hostility? Dewey provides a model: by visualizing society as composed of a network of identities, with individuals belonging to multiple identities at once, he suggests that science can be democratized by tying as many people as possible to the "scientist" network. But what exactly is it that individuals should be educated in? There is no way for people to learn more than a scanty sampling of science. Rather, the chief science, the skill of kings, is learning to evaluate experts and their claims. There are universal patterns to how expert knowledge is created, and the vitamin the body politic needs today is not more public scientific knowledge, but more public science, technology, and society scholarship.


20101201

Belief-based certainty vs. evidence-based certainty

Over at [this] forum I noticed the following comment (#10), which plays into some recent thoughts I've been having:
"Evidence-based certainty uses rationality to gradually prove or disprove theories based on empirical evidence. Belief-based certainty works in the other direction, the desired certainty is already known and rationality is abused to build on carefully selected evidence to “prove” that belief.

Belief-based certainty will always have a higher value socially and politically in the short term because it satisfies the immediate need for certainty and it is purchased by those who have the assets to afford it and have the most to lose.

Evidence-based inquiry is a process that only produces a gradually increasing probability of certainty in the long term. Facts will lose the news cycle but quietly win the cultural war."
I think that in a very broad sense the narrative which "RFLatta, Iowa City" is drawing, and which Paul Krugman often uses to distinguish himself from those dastardly freshwater economists, is true, but it should be taken with a grain of salt because the dichotomy it rests on is a false one.

"Evidence-based inquiry" is surely what we ultimately want to point to when we talk about science and mathematics, but the process of how the sausage is made is obviously different in some important respects. An investigator knows he must collect evidence, but what are the right questions to ask? What are the right experiments to perform? These decisions cannot be made on the basis of hard evidence, since we haven't collected any hard evidence yet -- one must take existing hard evidence from other's experiments and then try to extrapolate to make a plausible prediction.

Indeed, in computational learning theory too, we see the importance of this approach: finding a plausible fit to some of the data based on unjustified assumptions, and then testing the hypothesis against other data.

The point is, we can't find a good fit until we understand the data, but we have to start somewhere. So where do we start? The answer, generally, is that we start with our beliefs and go with our gut. A toy sketch of that loop follows below.
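
Here is a minimal sketch of that loop in Python. Everything in it -- the data, the choice of a linear model family, the even/odd split -- is invented for illustration, not taken from the post: commit to a model family up front (that's the "belief"), fit it on part of the data, and then let held-out data pass judgment on the hypothesis.

```python
import random

random.seed(0)

# Invented "observations": a noisy linear signal.
xs = [x / 10 for x in range(100)]
ys = [2.0 * x + 1.0 + random.gauss(0, 0.5) for x in xs]

# Fit on half the data, hold out the other half for testing.
train = list(zip(xs[::2], ys[::2]))
test = list(zip(xs[1::2], ys[1::2]))

def fit_line(points):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    cov = sum((x - mx) * (y - my) for x, y in points)
    var = sum((x - mx) ** 2 for x, _ in points)
    a = cov / var
    b = my - a * mx
    return a, b

a, b = fit_line(train)

# The hypothesis earns its keep only on data it never saw.
mse = sum((y - (a * x + b)) ** 2 for x, y in test) / len(test)
print(f"fit: y = {a:.2f}x + {b:.2f}, held-out MSE = {mse:.3f}")
```

The "belief" here is cheap and unjustified (why a straight line?), but it is also disposable: a bad held-out error is the signal to throw it out and guess again.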

In mathematics, of course, having good intuition is critically important. Famously, for Gödel, intuition was all-important -- even though the Continuum Hypothesis is known to be independent of ZFC, Gödel believed we can have set-theoretic intuition about some of its consequences, such that we should reject it as false. How Gödel could possibly have cultivated such an intuition continues to be regarded as something of a mystery, depending on how much you read into it. Richard Lipton writes a nice blog post about all of this: http://rjlipton.wordpress.com/2010/10/01/mathematical-intuition-what-is-it/
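
For reference (standard facts of set theory, not part of the original post), the statement in question and the two halves of its independence are:

```latex
% The Continuum Hypothesis (CH) and its status relative to ZFC.
\[
\mathrm{CH}:\qquad 2^{\aleph_0} = \aleph_1
\]
% Gödel (1940): Con(ZFC) implies Con(ZFC + CH), via the constructible universe L.
% Cohen (1963): Con(ZFC) implies Con(ZFC + \neg CH), via forcing.
% Together these give the independence of CH from ZFC mentioned above.
```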

Which brings me to a critical juncture -- what is the distinction between intuition and prejudice? My contention is that there is none; they are semantically equivalent and differ only in positive versus negative connotation. I should mention another quote I am fond of, which I may have disseminated previously:
"A great many people think they are thinking when they are merely rearranging their prejudices." -William James
How do we know when we are really, meaningfully investigating an open question, as opposed to just juggling our prejudices? It really seems that, at least some of the time, this may be the hardest aspect of science. I can certainly remember advisors on projects I worked on in the past who were pleased when I led myself to backtrack on some entrenched assumption I had made.

How do we confront issues like this when the question is something like P vs. NP, where now essentially 90% of the field believes P != NP, and takes the attitude "we know they aren't equal, now we just have to prove it"? In at least one talk I've seen, Peter Sarnak stuck his neck out and opined that this attitude is unscientific.

It seems to me that most of the time we don't spend long arguing about intuitions, because it is largely unproductive. Use whatever mystical value system you want to guide your research, but if it doesn't produce results, you'd better toss it out the window; in the end it must yield to proofs. It's fine to believe "P != NP because everything is an expander graph," and get it tattooed on yourself in German if you want, but if it doesn't go anywhere... don't get too attached to your burdens.

So what's the moral? At this point it seems to me that mathematical intuition is a total myth, part of this silly hero-worship ritual that we all seem to indulge in to some extent. Yet on the other hand, I've never known professors to disabuse undergrads or grad students of this idea. Indeed, we even see really famous people like Gödel, Richard Lipton, and Enrico Bombieri "indulging".

So perhaps a reasonable hypothesis is that we progress as follows: when we are young, we believe anything; when we are grad students, we become dramatically more skeptical; and then somehow, with experience, we come around and believe again.

I just spent about twenty minutes trying to find a webcomic I believe I saw along these lines... it was either xkcd or smbc, one of those strips with a graph showing how, either with age or with the amount of thought put into it, your belief in God starts very high, then plummets ("how could God possibly exist?"), and then continues to oscillate between 50% and 0 for the rest of your life ("oh, that's how...").

Personally I don't find that to be the case wrt God, but I now think it's plausible with respect to mathematical intuition.

And there we go again, extrapolating some kind of crazy oscillating curve based on two data points, some hearsay, and a web comic... fml.