
20100825

Why are some aggregates "smarter" than their individual components, while others are not?

So I have a very broad question that I've been dimly aware of for some time, but have never asked. I'd be interested in what people have to say... I'd almost consider posting this on a site like Math Overflow, even though it's not really what they go for, I think (I would probably phrase it differently). Please chip in your 2 cents.

In many, many fields we have this idea of simple units acting together in cohesion to create some very complicated aggregate body. In some cases, for instance brains, the behavior of a single neuron is thought to be quite simple and "unintelligent," while the overall body displays substantially more complexity of behavior, and a capacity to adapt favorably to various environments and situations. On the other hand, maybe the clearest example of the opposite is the "stupidity of crowds." A crowd of people is thought to have dramatically less problem-solving ability than a single person. A single person is frequently able to monitor their spending and manage a budget appropriately, for instance, while the California legislature is not.

Other examples of systems may not be so clear-cut. For instance, I'm not really sure whether an electron is smarter than a cloud of electrons. In areas like probabilistic combinatorics, we can frequently create large probabilistic systems composed of very simple components coupled together in simple ways, but about which we can say almost nothing in terms of the behavior of the whole system. In statistical physics, I suppose it is the opposite -- predicting the motion of a single particle, given its local environment, is thought to be extremely difficult, and is typically modeled using something like Brownian motion, yet we can deterministically model the evolution of the gas as a whole and develop useful statistics to "characterize" the macrostate.
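To make that contrast concrete, here is a minimal sketch (my own illustration, not drawn from any particular source): each individual walker's +/-1 steps are unpredictable, yet an aggregate statistic of the whole population tracks the deterministic prediction Var(X_T) = T almost exactly.

```python
import random

# Individual trajectories are unpredictable, but an aggregate
# statistic of the population is not.
N, T = 5000, 500            # number of walkers, number of time steps
positions = [0] * N

for t in range(T):
    # every walker takes an independent, unpredictable +/-1 step
    positions = [x + random.choice((-1, 1)) for x in positions]

mean = sum(positions) / N
var = sum((x - mean) ** 2 for x in positions) / N

# For a simple +/-1 random walk, theory says Var(X_T) = T exactly.
print(f"empirical variance: {var:.1f}, predicted: {T}")
```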

I suppose that in most of science, the only time we study aggregates of "smart" components is when the components are people or animals. Perhaps we simply do not recognize other components as smart?

It's not clear to me what precisely is different about the way that societies are built of humans and the way that brains are built of neurons. You might suggest, for one, that humans can move freely, but in most cases it seems that people establish a local network of people they trust and respect, through which they receive information, and these parameters of trust and esteem become established and then fine-tuned as life progresses, perhaps not unlike neural network weights. Obviously the aggregator function is substantially more complicated.
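A toy version of that analogy might look like the following (entirely my own sketch, with made-up contacts and a perceptron-style update standing in for the "fine tuning" of trust): a person forms a yes/no judgment as a trust-weighted vote over their contacts, then nudges the weights toward whoever turned out to be right.

```python
# Hypothetical contacts with initial trust levels.
trust = {"alice": 0.5, "bob": 0.5, "carol": 0.5}

def my_opinion(opinions):
    """Weighted vote: opinions maps contact -> +1 or -1."""
    score = sum(trust[c] * o for c, o in opinions.items())
    return 1 if score >= 0 else -1

def update_trust(opinions, truth, rate=0.1):
    """Raise trust in contacts who were right, lower it otherwise --
    loosely like a neural network weight update."""
    for c, o in opinions.items():
        trust[c] += rate * (1 if o == truth else -1)

opinions = {"alice": 1, "bob": -1, "carol": 1}
print(my_opinion(opinions))       # aggregate judgment: +1
update_trust(opinions, truth=1)   # fine-tune after seeing the outcome
print(trust)                      # alice and carol gain trust, bob loses it
```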

Perhaps you might suggest that for some problems, networks of humans are quite effective -- for instance, we can design space shuttles, which a single individual probably cannot do. It's only political problems that such networks fail at. Perhaps you can point out the problems that brains fail at by analogy... certain long-term risk-reward tradeoffs? drug addiction?

Of course, the answers you get will depend heavily on the formalism you choose for computational ability. What I would like to ask is this:

1) Are brains organized from subunits in ways substantially different from societies / other schemes?
2) Given subunits of a certain design with a certain computational power under some formalism (the VC dimension that can be learned efficiently? the topological entropy of the analogous dynamical system?), how much computational power does the aggregate possess when formed under one connection scheme vs. another?

Obviously 2 is going to be pretty hard to answer... and it will depend on your answer to 1, which may be contentious. Pitch in your 2 cents.


20100220

Algorithmic Thinking

Welcome back, Chris Beck!

You asked for comments on the Princeton essay, so here they are. Professionally, every field of science has been bemoaning how it must explain its fundamental methodology and importance to the public, starting with Galileo and the Catholic Church. Computer science is unique in that it is the most intimate form of technology, the one we interact with most on a daily basis. Yet the basic paradigm is not to show people the code but to lock software down as much as possible. Programming is treated as a trade, making little trinkets, and programmers as mere tradesmen.

Computer scientists should distinguish their field from the mechanics of making a computer work. I am not a computer scientist (but I do play one on TV), and it seems to me that the paradigm of CS is an algorithmic understanding of phenomena. An algorithmic view of reality can be incredibly powerful: by reducing complex systems to discrete steps, we gain insight. Keegan once said that he finds programming very humanistic; he imagines himself as the computer and acts out the steps he would take to solve a problem. What kinds of complex systems could we reduce if we looked at them from the level of the basic agent and developed a simple, fool-proof algorithm for solving them? (I'm thinking healthcare here, but there should be other examples.)
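As a concrete (if trivial) illustration of "acting out the steps" -- my example, not Keegan's -- here is the guess-a-number game solved by repeatedly halving the range, i.e. binary search narrated from the agent's point of view:

```python
# "Be the computer": guess a hidden number in 1..100 by repeatedly
# halving the range -- exactly the steps a person might act out.
def guess_number(secret, lo=1, hi=100):
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2        # my best guess given what I know
        if mid == secret:
            return mid, steps
        elif mid < secret:
            lo = mid + 1            # too low: discard the bottom half
        else:
            hi = mid - 1            # too high: discard the top half

print(guess_number(73))             # finds 73 in at most 7 steps
```

Nothing about the procedure requires a machine; the point is that once the steps are made discrete and explicit, a person can execute them, reason about them, and see why they always terminate.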