
20120326

Primer Part 2

As promised, part two of my technology in education post. Thanks to John and Cameron for their comments. In this article I’m going to examine the three core life skills that a person has to develop to become a good citizen capable of living a fulfilling life: self-discipline, cooperation, and reflective evaluation, and explain how educational technology can be used to build each of them.

Self-discipline is simply the ability to do something which is not much fun in the short term, yet which you know to be good in the long term. It is a key component of long-term planning and law-abiding behavior. Walter Mischel’s famous marshmallow study demonstrated a strong correlation between the ability to delay gratification at the age of four and future academic success and good life outcomes. David Brooks’s recent column The Relationship School had an interesting vignette about discipline at the New American Academy: “Even though students move from one open area to the next, they line up single file, walk through an imaginary doorway, and greet the teacher before entering her domain.” The goal here is clearly to get students to treat normative barriers in the same way that they’d treat architectural barriers. While different people have different levels of self-control and discipline, it is clear that these are skills that can be learned. Indeed, character-building is the central premise of the KIPP schools.

Cooperation is the ability to work well with others. Human beings are intrinsically social, most tasks these days take place in a group, and even a ‘lone genius’ acts as part of a tradition of human effort. An MIT study on collective intelligence found that the skill of a group didn’t depend on the abilities of its individual members, but rather on “the willingness of the group to let all its members take turns and apply their skills to a given challenge,” a quality the researchers deemed social sensitivity (in which, as an aside, women have a substantial edge). In preschool and kindergarten, teachers focus on social tasks like sharing toys, but education soon becomes very structured and individualized. Aside from team sports and group projects, we leave learning how to cooperate to ‘free time’ like lunch and recess, and leave teachers out of it. And then we wonder why kids are horrible to one another.

Finally, reflective evaluation is the ability to look at yourself, look at the world around you, and figure out what it is that you want to do with your life. It is the ability to appreciate aesthetics, to create, to explore, to seek clarity, to behave morally, to be driven by something other than shallow urges towards pleasure and away from pain. This is the fuzziest notion of the three that I have advanced, but it is connected with Maslow’s idea of self-actualization, or Foucault’s goal of “becoming yourself”. Basically, we are happiest when we are doing something that we love and something that we are good at. The trick is figuring out what it is that you actually want.

So how can technology enable these goals? The key features of the educational technology that I envision are a touchscreen tablet that can easily be linked to keyboards, mice, and peripherals like microscopes and sensors for special assignments; a front-mounted camera that can identify the user; a microphone with voice recognition; GPS for locational data; and a system that monitors user activity. Expect it to be loaded with apps that are directly educational, like classic books, history lessons, and math and science worksheets, along with creative apps for art, music, and programming, and the usual social networking and games.

Today, the majority of a teacher's energy is spent making lesson plans, delivering lectures, and grading assignments. These core educational tasks are about transmitting information and checking that it has been received. As currently performed, they involve a massive duplication of effort, waste trained experts' time on menial tasks, and make it nearly impossible to develop best practices.

Rather than following the lecture, homework, and test model, a textbook app provides multimedia tutorials on the subject (text + video + extra clarifications), followed by a series of practice problems. Rather than having to demonstrate lessons and maintain order in the classroom, the teacher has time to work with students individually, along with a wealth of data about exactly what in the lesson their students don’t understand. The class doesn’t have to run at the pace of the slowest student; each student can work through the lesson at their own pace and in their own style: alone, in a group, outdoors, or late at night.

Completing a lesson gives you ‘credit’ that you can use to unlock the next lesson, games, and in some cases a useful tool. For example, reading a book and writing about it lets you check out more ebooks from the school’s library, or audition for the play. Mastering addition, subtraction, multiplication, and division gives you a four-function calculator. Trig and algebra give you a graphing calculator, and completing calculus gives you something like Wolfram Alpha. Completing a unit in science might allow you to sign up for a field trip or an experiment. Participating in a gallery show might upgrade your available art apps from Paint to Photoshop to Maya. Rewarding people for doing good work by giving them more of the same taps both intrinsic and extrinsic motivation.
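To make the unlock mechanic concrete, here’s a minimal sketch in Python. The lesson names, reward names, and the Student class are all hypothetical illustrations, not part of any real platform:

```python
# Hypothetical token-economy sketch: completing lessons unlocks tools.
ARITHMETIC = {"addition", "subtraction", "multiplication", "division"}

class Student:
    def __init__(self):
        self.completed = set()   # lessons finished so far
        self.tools = set()       # tools unlocked as rewards

    def complete(self, lesson):
        self.completed.add(lesson)
        # Mastering all four arithmetic operations unlocks a calculator.
        if ARITHMETIC <= self.completed:
            self.tools.add("four-function calculator")
        # Trig and algebra together unlock a graphing calculator.
        if {"trig", "algebra"} <= self.completed:
            self.tools.add("graphing calculator")

nell = Student()
for lesson in ["addition", "subtraction", "multiplication", "division"]:
    nell.complete(lesson)
print(nell.tools)  # {'four-function calculator'}
```

The point of the structure is that the reward is itself a learning tool, so unlocking it reinforces rather than interrupts the work.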

Finally, I do believe that testing is important, but as it stands, we spend too much time testing students, and students spend too much time worrying about tests. Why not let students review the material as much as they want, and then, when they’re ready, go to a “Quiet Room” where talking is discouraged, put their tablets in test mode, and complete a worksheet just like the ones they’ve practiced with by themselves? In test mode, communication applications are locked down and the face-recognition camera makes sure that the right person is using the tablet. If a student didn’t bother to prepare, or scored poorly compared to their previous work, their teacher is automatically alerted.
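A sketch of what “test mode” might look like in code, with a trivial stand-in for face recognition and an invented alert list (every name here is hypothetical):

```python
# Hypothetical test-mode sketch: lock apps, verify the user, grade,
# and automatically flag weak results for the teacher.
class Tablet:
    def __init__(self, enrolled_user):
        self.enrolled_user = enrolled_user
        self.locked = set()   # apps disabled while in test mode
        self.alerts = []      # (student, score) pairs for the teacher

    def verify_face(self, user):
        # Stand-in for the face-recognition camera.
        return user == self.enrolled_user

    def run_test(self, user, answers, key, passing=0.6):
        self.locked = {"messaging", "email", "browser"}  # test mode on
        if not self.verify_face(user):
            self.locked = set()
            return None                       # wrong person: refuse to start
        score = sum(a == k for a, k in zip(answers, key)) / len(key)
        self.locked = set()                   # test mode off
        if score < passing:
            self.alerts.append((user, score)) # auto-alert the teacher
        return score
```

In practice the lockdown and identity check would be OS-level features; the sketch just shows the flow.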

Entire curricula can be built out of a series of lessons of increasing complexity. Students and teachers can work together to develop an individualized lesson plan that meets state-mandated minimum requirements for a certain number of hours of math and English and what have you, as well as a timeline of how the student should be progressing. A student who is behind on their lessons can be instantly located and given extra support. The goal is a total assessment environment that is both accurate and effortless, rather than the artificial hoops of the current system.
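“Instantly locating” students who are behind is just a comparison against the agreed timeline. A toy sketch, with invented names and numbers:

```python
# Hypothetical progress check: flag anyone more than two lessons
# behind the planned pace for this point in the term.
plan = {"week": 10, "expected_lessons": 20}
progress = {"Ana": 24, "Ben": 19, "Cai": 11}   # lessons completed

behind = {name: plan["expected_lessons"] - done
          for name, done in progress.items()
          if done < plan["expected_lessons"] - 2}

print(behind)  # {'Cai': 9}
```

The tolerance (two lessons here) is the kind of knob a teacher and student would set together in the individualized plan.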

This model is based on the most addictive video games, which combine token economies with collecting activities and increasing difficulty. Make the lessons just hard enough to induce ‘flow’, and students will spend hours in the zone as they burrow through math lessons, scientific theories, and novels and histories. Because “textbooks” are all electronic and monitor how they are used, publishers can continually modify lessons at no cost to clarify issues that lots of students find confusing. And teachers have more time to work with students one-on-one or in small groups.

The conventional wisdom is that electronics, and particularly the internet, are making us perennially distracted and unable to focus. Won’t putting more computers in school make this worse, and lead to the opposite of self-discipline? Well, the internet is only distracting because there’s no reward for resisting distraction. When your email pings, you check it because checking it is rewarding and there’s no reason not to. Imagine that a student is using an educational app and gets a text from a friend (we might as well surrender to the fact that kids are going to text and use social media in school, because that genie is not going back in the bottle). Rather than look at the message, the student chooses to finish the lesson and respond in 15 minutes. The tablet gives them a little smiley face and a ‘good job on good decisions’ message. Only small changes are necessary to make technology an aid to concentration rather than a distraction.
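The nudge itself is a tiny bit of logic. A hypothetical sketch: messages that arrive mid-lesson are held quietly, and finishing the lesson before opening them earns the praise message:

```python
# Hypothetical deferral nudge: queue messages during a lesson and
# reward the student for finishing before reading them.
class NudgeTablet:
    def __init__(self):
        self.in_lesson = False
        self.held = []     # messages deferred during a lesson
        self.praise = 0    # 'good job on good decisions' count

    def receive(self, msg):
        if self.in_lesson:
            self.held.append(msg)   # hold quietly instead of pinging
        else:
            print("ping:", msg)

    def finish_lesson(self):
        self.in_lesson = False
        if self.held:
            self.praise += 1        # reward the deferred gratification
            for msg in self.held:
                print("delivered:", msg)
            self.held.clear()
```

Nothing is blocked; the student can still open the message at any time. The design only adds a reward for the better choice.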

Through little nudges like this, discipline is taught as an incremental process. Early apps are very structured, and as students advance they are required to learn how to structure their own education. For example, a ‘learn-to-read’ book might highlight each word in turn and ask the student to follow along out loud, using voice recognition to demonstrate proficiency. A book for 4th graders might have multiple choice questions about the content at the end of every chapter. A book for middle-school students would ask for a series of short answers about specific elements of the book. And a book for high school students might not have any work attached until the end, where the student has to synthesize an essay about the major themes. At each level, the student is provided the tools to make a smooth transition to the next level.

And finally, we can encourage discipline by providing a clear reward for success: freedom! A straight-A student maintaining an ambitious course load can be expected to have the tools to succeed on their own. A C student struggling with the concepts needs more support. So give the C student a lot of structure and a lot of supervision, and let them graduate to making their own decisions. With constant monitoring of activity, teachers can see instantly whether a student is slacking off or excelling.

Teaching cooperation is hard, but my first premise is that since learning core skills will take less time and can be done in groups, students will inherently learn to work together. Students can learn to manage common resources like playground equipment, teaching them about sharing. Finally, virtual spaces offer a great way to learn the norms of collaboration. Minecraft has become a platform for monumental shared world-building, including a recreation of The Lord of the Rings as a spiritual pilgrimage. Imagine using similar techniques to integrate students into a scientific and political study of local water quality issues, or an international sharing of cultures. And rather than being seen only by the teacher, students’ work could be presented to the entire class, with participation in constructive criticism part of every student’s job. Several of my colleagues have observed that their students (ASU undergrads) write better for class blogs than they do on traditional papers, because they know their audience and want to leave a good impression.

More formally, the MIT study linked above used electronic badges designed by the Media Lab to record the pattern of interactions in groups. Tablets with the right software could do the same, recording who is talking and when. But this may be overkill; what we really need is to encourage students to cooperate of their own free will, with people nearby and over the net, because it’s fun and useful.

As for the final key skill, reflective evaluation is something that can’t be taught by a machine; a person has to find it on their own. But the process can be made easier by providing many different types of opportunities, and by continually probing with that most important question: “Why?” I hope that under this system, students will spend less time on basic skills, and more time out in the real world working on interesting problems or pursuing their passions in the creative arts. I hope that education can become more relaxed, rather than a rush from kindergarten to college without pausing to take stock. And I hope that it becomes a process that never ends: that even after passing their High School Certification Exam, people continue to access educational materials.

This model might be optimistic, but I don’t think it’s utopian. The technology is almost ready, requiring just a little more integration. The major problems are now political: teachers’ unions, national testing standards, and a belief in the intrinsic value of a high school diploma. The money for reform can be found: in 2008, according to Department of Education statistics, public schools spent an average of $10,441 per student for results which are at best average and are truly failing disadvantaged communities. We must do better. And the first step towards doing better is replacing 19th-century information technologies with 21st-century ones, and freeing our skilled education professionals to do what they do best: inspire and mentor!

I’d also like to preemptively address two major concerns: surveillance and cyber-bullying.

Surveillance and privacy is a hard nut to crack. You should really see Bruce Sterling’s 2006 story “I Have Seen the Best Minds of My Generation Destroyed by Google” for a vision of a dystopia where “All our social relations have been reified with a clunky intensity. They're digitized! And the networking hardware and software that pervasively surround us are built and owned by evil, old, rich corporate people!” (because Chairman Bruce does it better than I can). But on the other hand, read your Foucault: schools are disciplinary environments that run on surveillance, and that can’t be stripped out of the educational system. If you want to educate the masses, you have to monitor and evaluate them.

But we can set up the surveillance humanely. Make only the highest level data available to end-users, like grades and time to completion. A full data review should be reserved for major correctional interventions. Anonymized usage statistics can be sent to publishers to improve software without major issues, as is already standard tech practice.
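A sketch of what that tiered access could look like. The roles, field names, and record contents are invented for illustration:

```python
# Hypothetical tiered-access sketch: end-users see only summary
# fields, full review requires an explicit intervention flag, and
# publishers get anonymized usage aggregates only.
FULL_RECORD = {
    "student_id": "s-1042",
    "grade": "B+",
    "hours_to_completion": 11.5,
    "keystroke_log": ["..."],              # sensitive, detailed activity
    "app_usage": {"math": 7.0, "chat": 1.2},
}

SUMMARY_FIELDS = {"grade", "hours_to_completion"}

def view(record, role, intervention=False):
    if role == "teacher" and intervention:
        return dict(record)                # full review, explicitly flagged
    if role == "publisher":
        # anonymized usage only: identity and raw logs stripped
        return {"app_usage": record["app_usage"]}
    # default (students, parents): highest-level summary only
    return {k: v for k, v in record.items() if k in SUMMARY_FIELDS}
```

The key design choice is that the detailed data exists but is expensive to reach: the intervention flag creates a record of who looked and why.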

We’re operating in a world that ranges from fully public, like what Google knows about you, to semi-private (Facebook profiles), to absolutely secret (an encrypted file on a disk in a bank vault). Make an environment that supports varying degrees of privacy, and teach students how to use it from the beginning: maintaining a public profile, a place for their friends, a private site for their best friends, and a spot for their personal secrets. And then respect their privacy. Teenagers are finding themselves, and a desire for privacy is not a crime in and of itself. It should not be treated like one.

While the Cory Doctorow story Knights of the Rainbow Table is a bit of an exaggeration, in this age of Wikileaks and Anonymous, technological architectures cannot protect us. Only strong norms can.

As for cyber-bullying, it’s a real issue, but I see it more as an extension of normal bullying than as a wholly new phenomenon. Right now, cyberbullying can be hard to prosecute, since schools (rightfully) don’t have jurisdiction over what happens outside their walls. But if platforms are substantially used for education, schools can intervene. While I am not an expert on bullying, my sense is that bullying is aggravated by the fact that we force children to spend their time packed together in small rooms with no exit. In a more tech-enabled school, a bullied child could simply leave and find an environment that is more protective and conducive to learning. And on the internet, there’s a subculture for everybody, as Juggalos and Bronies so evidently prove.


20120320

A Primer on the Future of Education

In the Diamond Age, a novel full of astounding technologies, the clear star is the Young Lady’s Illustrated Primer. Sure, nanobots and cyborg implants and geological engineering and kitchen 3D printers and--well, you get the point--are cool, but the Primer--the talking, moving, intelligent and empathetic book that educates Nell--is at the heart of the novel. More than a MacGuffin, the Primer links all the characters together, and plays a pivotal role in the explosive conclusion.

Primer technology is so awesome that it’s one of the moonshots for the SOOPER SEKRIT PROJEKT, which is why I’d like to take a closer look at it. I know two things for sure: 1) computing technology is getting better all the time, and 2) our model for education is broken.

Let’s start with the technology side. Cellphones have achieved basically universal penetration. A decently powerful computer is within the budget of the global middle class, and many people are trying to make it more accessible all the time, from the One Laptop Per Child computer, to the $50 Aakash Ubislate sponsored by the Indian Education Ministry, to a roughly $300 Acer or Asus netbook. These aren’t particularly great computers, but they’ll let you access the internet, watch videos, create and edit documents, and learn to code. Not that there aren’t very real problems with getting electricity and bandwidth to the poorest billion, but there are also lots of very dedicated people and organizations pushing on the technology. It won’t be as slick and indestructible as the Primer, but the hardware is definitely getting there.

The second side is education. According to Bruce Mau, higher education is accessible to only 1% of the world’s population. Schools are underfunded and overcrowded pretty much everywhere outside of a small group of wealthy post-industrial countries, and even there you get ridiculous, soul-crushing South Korean study mills. If you believe that education is a prerequisite to living a good life (and I do, by and large), then we have problems. Big problems.

I’m not alone in this assessment, and there is a tidal wave of innovation directed towards applying the power and scale of information technology to education. Khan Academy, MITx, and ShowMe are some of the bigger names, but probably the most innovative players are Sebastian Thrun and Peter Norvig, the Stanford professors behind the 160,000 student open access graduate level course, CS221: Introduction to Artificial Intelligence. You should really just read the whole article about it at Wired Science, but I’m going to pull out the crunchy philosophical bits.

After seeing Khan at TED, Thrun dusted off a PowerPoint presentation he’d put together in 2007. Back then he had begun envisioning a YouTube for education, a for-profit startup that would allow students to discover and take courses from top professors. In a few slides, he’d spelled out the nine essential components of a university education: admissions, lectures, peer interaction, professor interaction, problem-solving, assignments, exams, deadlines, and certification. While Thrun admired MIT’s OpenCourseWare—the university’s decade-old initiative to publish online all of its lectures, syllabi, and homework from 2,100 courses—he thought it relied too heavily on videos of actual classroom lectures. That was tapping just one-ninth of the equation, with a bit of course material thrown in as a bonus.

Thrun knew firsthand what it was like to crave superior instruction. When he was a master’s-degree student at the University of Bonn in Germany in the late 1980s, he found his AI professors to be clueless. He spent a lot of time filling in the gaps at the library, but he longed for a more direct connection to experts. Thrun created his PowerPoint presentation because he understood that university education was a system in need of disruption. But it wasn’t until he heard Khan’s talk that he appreciated he could do something about it. He spoke with Peter Norvig, Google’s director of research and his CS221 coprofessor, and they agreed to open up their next class to the entire world. Yes, it was an educational experiment, but Thrun realized that it could also be the first step in turning that old PowerPoint into an actual business…

He’s envisioning his own digital university, with a less conventional curriculum, one based on solving problems, not simply lectures on abstract topics. It would offer a viable alternative for students of the global one-world classroom—particularly those who lack the resources to move to the US and attend college.

Thrun decides that KnowLabs will build something called Udacity. The name, a mashup of audacity and university, is intended to convey the boldness of both Thrun’s and his students’ ambitions. His goal is for Udacity to offer free eight-week online courses. For the next six months or more, the curriculum will focus on computer science. Eventually it will expand into other quantitative disciplines including engineering, physics, and chemistry. The idea is to create a menu of high-quality courses that can be rerun and improved with minimal involvement from the original instructor. KnowLabs will work only with top professors who are willing to put in the effort to create dynamic, interactive videos. Just as Hollywood cinematography revolutionized the way we tell stories, Thrun sees a new grammar of instruction and learning starting to emerge as he and his team create the videos and other class materials. Behind every Udacity class will be a production team, not unlike a film crew. The professor will become an actor-producer. Which makes Thrun the studio head.

He’s thinking big now. He imagines that in 10 years, job applicants will tout their Udacity degrees. In 50 years, he says, there will be only 10 institutions in the world delivering higher education and Udacity has a shot at being one of them. Thrun just has to plot the right course.

Yikes! If education is about transferring knowledge from teachers to students, i.e. information transmission via some sort of web app, then we know from observation how that process plays out. With Google, Amazon, Youtube, Facebook, and so on, one firm establishes technological superiority, gains a larger market share, and then just eats everybody else. Education is actually a fairly conservative business (the oldest continually operating institutions on the planet go: the Catholic Church (~2000 years), medieval universities (~1000 years), and then a bunch of Johnny-come-lately corporations and governments), and it runs on a lot of prestige and momentum, but web apps are very cheap to develop and operate compared to a traditional university, and they are much more scalable. The only real barrier is credentialing, the process of giving somebody a piece of paper that says that they’re qualified to do something, and as soon as the education app developers figure out the politics of their credentialing system, the whole edifice of higher education is just going to blow away.

Universities are increasingly expensive, lousy at teaching useful skills, and produce a worthless credential. And all this is doubly true for American primary and secondary education. It won’t take much innovation to make something that is a lot cheaper, and has comparable or even better educational outcomes.

The stage is set for something like the Primer to actually come about. It won’t be slick and seamless like in the novel, but the continuously improving combination of hardware and software that we see in real gadgets can make an educational platform that is cheap, accessible, and able to take a student from kindergarten to a bachelor’s degree. The technological problems are essentially solved.

But wait, education isn’t just about information transmission. Schools do more than teach facts and theories, they are factories of socialization. They produce citizens. What kind of people will tablet educated students grow up to be? There’s an echo of this in the Diamond Age, where the most impressive feat of the Primer is not that Nell knows kung fu, or computer programming, or nanoscale engineering, or even how to get along in NeoVictorian society, but that the Primer creates a Mouse Army of 100,000 Chinese orphans who are capable of acting as a perfectly coordinated network. We spend only a little time with the Mouse Army girls, but on reflection, they are a profoundly strange society.

Now, I’m not going to defend the kind of socialization that happens in the American school systems that I’m familiar with. I think it’s often dominated by the most pathological personalities, both students and teachers, and results in trauma rather than personal growth. Much of what made me who I am today happened far away from the classroom, and from the structured process of education (and I had, objectively, one of the best educational trajectories possible, from pre-school to Oakwood to Caltech, Vassar, and ASU). But socialization has to happen; students are going to be socialized in some way whether we want it or not.

The question that I therefore pose to you, my loyal readers, is what kind of citizens do we want our schools to produce? How can we best socialize students for the future? And how can new educational technologies and our legacy systems work together to maximize opportunity for all?

I have my own theories, which I’ll try and explain later in the week, but I want to hear from you guys first.


20110713

Two more from Breakthrough

The last two mandatory blogs from my time at Breakthrough are up. Click the links for the full thing.

Technological Mojo
Liberalism as it exists today isn't so much an ideology as a flag of convenience. The progressive position on policies promoting the welfare state and cultural attitudes towards abortion, gun control, and gay marriage unites a solid minority coalition, but one without big ideas except for a vague notion of 'play nice' and 'be yourself.' As Michael Lind of the New America Foundation put it, the Democratic Party is about checking off the wish-lists of its constituent interest groups. "What is the liberal position on the environment? It's what the Sierra Club wants." Rather than discuss values, liberals have retreated to policy literalism, appealing to a slew of "scientific" and "rational" policies to achieve narrow, tactical ends: price carbon dioxide, extend healthcare to the uninsured, stop the war, decrease classroom sizes. Liberals have ceded values and emotion to conservatives, with disastrous electoral and policy results at every level of government. Liberal scientism is a rhetoric of failure.

It's Dangerous Being Modern
The Breakthrough Dialog began with a very interesting idea, that of second modern risk, which was not fully fleshed out. At the heart of second modernity is the idea that humanity has become responsible for its own fate. Thanks to the power of science and technology, we have banished the ancient gods and forces of nature. Food, shelter, and physical security are all assured in the first world, and so humanity has directed its efforts to fulfilling post-material needs for status, power, and a moral society. In many ways, this is a zero-sum game; unlike material goods, status and power cannot be increased, only redistributed. Different cultures have profoundly different concepts of morality. For all our efforts to improve the second modern condition, it seems that the best we can do is run to stay in place. Post-material failure is one kind of second modern risk.

But while people worry about their job security, and their child's chances of getting into Harvard, and what their neighbors are up to, second modernity has its own apocalyptic horsemen. Flood, famine, fire and plague are primitive problems. In their place, we have substituted the business cycle, anthropogenic climate change, and total war. Second modern risks are more worrying not just because they are bigger (mankind finally has the power to wipe itself out), but because they are human in origin, and therefore, in some sense, our responsibility. My fear is that decades or centuries from now, the weary, broken survivors of whatever ended our technological civilization will look back and say, "But why didn't they change?" How then can we, as individuals and as a collective, come to grips with both kinds of second modern risks?


20110629

The Devil is in the Assumptions

Google just came out with a report on the potential of clean energy technology, which has received some fairly rapturous coverage in the environmental press. The key insights of the report are as follows:

  • Energy innovation pays off big: We compared “business as usual” (BAU) to scenarios with breakthroughs in clean energy technologies. On top of those, we layered a series of possible clean energy policies (more details in the report). We found that by 2030, when compared to BAU, breakthroughs could help the U.S.:
    • Grow GDP by over $155 billion/year ($244 billion in our Clean Policy scenario)
    • Create over 1.1 million new full-time jobs/year (1.9 million with Clean Policy)
    • Reduce household energy costs by over $942/year ($995 with Clean Policy)
    • Reduce U.S. oil consumption by over 1.1 billion barrels/year
    • Reduce U.S. total carbon emissions by 13% in 2030 (21% with Clean Policy)
  • Speed matters and delay is costly: Our model found a mere five year delay (2010-2015) in accelerating technology innovation led to $2.3-3.2 trillion in unrealized GDP, an aggregate 1.2-1.4 million net unrealized jobs and 8-28 more gigatons of potential GHG emissions by 2050.
  • Policy and innovation can enhance each other: Combining clean energy policies with technological breakthroughs increased the economic, security and pollution benefits for either innovation or policy alone. Take GHG emissions: the model showed that combining policy and innovation led to 59% GHG reductions by 2050 (vs. 2005 levels), while maintaining economic growth.
Well, hot damn. Those are some good outcomes. All we need is a carbon price, some deployment policy, and a couple of scientific breakthroughs, and we can save the world and get rich at the same time.

Well, I was feeling cynical, so I decided to look at exactly what breakthroughs we might need. Google was nice enough to publish its data in Appendix C, so please turn down there and look at solar PV. Google puts the 2010 overnight capital cost--what it takes to build an electrical plant--at $4000/kW. The breakthrough scenario has solar PV at $1000/kW in 2020. Batteries are another core technology, for electric vehicles and grid-scale storage. Right now, Google has batteries at $500/kWh, and in their good scenario, $100/kWh in 2020. Other clean energy technologies see slightly smaller but similar three- to fourfold decreases in price in just a decade, along with major increases in reliability and lifespan.

Now, I won't go out and say that those kinds of cost reductions are impossible, since prediction, especially about science and technology, is very hard. But in the case of solar PV, it would be an improvement roughly an order of magnitude greater than what was seen in the past decade. In many cases it appears that we may be approaching limits imposed by the cost of raw materials: silicon, cobalt, lithium, steel, and rare earth metals. Without a better idea of what scientific breakthroughs are needed, or how those breakthroughs could be achieved, Google's report should be taken with a large grain of salt.
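To see how aggressive those scenarios are, it helps to compute the compound annual rate of cost decline they imply. A quick back-of-the-envelope in Python, using the Appendix C figures quoted above ($4000/kW to $1000/kW for solar PV, $500/kWh to $100/kWh for batteries, both over 2010-2020):

```python
# Implied compound annual cost decline for Google's breakthrough scenario.
def annual_decline(start, end, years):
    """Fraction by which cost must fall each year to get from start to end."""
    return 1 - (end / start) ** (1 / years)

pv = annual_decline(4000, 1000, 10)       # solar PV: ~12.9% per year
battery = annual_decline(500, 100, 10)    # batteries: ~14.9% per year
print(f"solar PV: {pv:.1%}/yr, batteries: {battery:.1%}/yr")
```

Sustaining a 13-15% annual decline for a full decade is the bar the report quietly sets, which is why the raw-materials floor matters.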


20110623

Between Innovation and Evolution

Breakthrough has my second blog, on innovation and technological evolution. Is evolutionary economics worthwhile, or just more pseudo-scientific bunk?

Policy-makers seeking to ignite the engines of economic growth are turning to a new theory of "innovation economics," which focuses on technological evolution and its supporting institutions. However, the axiom that "innovation drives economic growth" derives mostly from the observation that conventional explanations of growth based on capital and population fail to explain differences in economic outcomes, not from incontrovertible evidence. Failure and innovation seem to run hand in hand. Fantastically innovative technologies, from the SAGE air defense network, to the Concorde SST, to the EV-1 electric car, became technological turkeys when they failed in the market. Entrepreneurs have a failure rate approaching 80%. Neoclassical economics--the doctrine that innovation economics seeks to replace--grew up crippled because it borrowed from an incomplete model of equilibrium physics, using the First Law of Thermodynamics but not the Second. Similarly, without a better understanding of the forces behind technological evolution, innovation economics will develop as a fundamentally flawed theory. There is a difference between faster evolution and real improvements in quality of life. Read the rest.


20110325

Technological Citizenship

In this post, I will advance an explanation of the differences between law and technology, and how ordinary people can reclaim control over their lives through what I refer to as “technological citizenship.”

The modern liberal state is defined by the rule of law, a fair and evenhanded treatment of all people according to clear rules. The most basic laws are constitutional: those that define the relationships between the parts of government, and between government and the citizens. In democracies, and particularly in America, the Constitution has been carefully designed to allow for citizenship and participation in the law-making process. The Federalist Papers worked out how abstract principles like liberty and justice could be translated into the concrete institutions of policy, and despite occasional hiccups, and one major war, their framework endures today.

But laws are only half the story. The world is also full of technologies, and as Langdon Winner points out in The Whale and the Reactor, our technological constitution--the core systems for providing food, shelter, power, mobility, etc.--is not nearly as well designed as the law. While the Constitution and the law grew through a process of considered debate and democratic input, technologies have accreted over time into centralized bureaucratic systems, operating according to a depersonalizing logic of efficient markets. For Winner, the power and omnipresence of these technological systems is a grave threat to democracy and liberty, as society becomes dependent on entities which are essentially autonomous from public control.

The democratic person is a political citizen, taking an active role in the process of governance by becoming informed on the issues, voting, and communicating with their representatives and their neighbors. Our ideal of democracy remains ancient Athens (albeit with an updated version of who counts as a citizen), where every citizen participated equally in government, and positions were rotated regularly. The technological person is a consumer, and the end goal of technology is the 'utilitization' of everything: technologies becoming absolutely reliable, simple, and omnipresent. The more advanced a technology is, the fewer buttons, access panels, and failure modes it has; compare an early computer like ENIAC to an iPad. The best-realized vision of this phenomenon is E.M. Forster's "The Machine Stops", where planetary civilization is controlled by an immense computer system that is beyond the understanding of its inhabitants.

Now, reverse these roles. A political consumer is an unthinking, uncritical clod who unquestioningly obeys the dictates of The Party, whatever The Party might be. Political consumers are poison to democracy. But what is the technological citizen? By analogy, the technological citizen is somebody who takes an active stance towards technology, who is informed about the features and full scope of a given device or system, is prepared to think critically about the implications of that technology, and is not afraid to transform, adopt, or abandon technologies as alternatives become available. My friends at HeatSync Labs are great examples of technological citizens, actively experimenting with and adapting emerging technologies, and their lives have certainly been made richer through their close understanding of technology.

The challenge is therefore encouraging this new mode of technological citizenship. This will not be easy: citizenship demands deep, continuous engagement (and political citizenship is in decline in this country as well). And more and more technologies are becoming utilities, slick services that non-specialists can't even view, let alone think critically about. But conversely, with the internet, the cost of gaining expert technical knowledge is falling. As devices become smarter, making it easier to communicate with and analyze them should become a priority--for example, a SmartGrid technology that tracks home energy usage room by room, device by device. Finally, education is a vital part of citizenship, and technological toys that are just visible enough should be developed to teach relevant skills, like computer programming, design and architecture, and ecosystems thinking. Personally, I've always been disappointed that Lego Mindstorms came out just after I lost interest in Lego; it would have made me a much better engineer. While developing technological citizenship is not easy, technological citizens will find it far easier to adapt and live in the future, and as partisan politics becomes increasingly rancorous and alienating, technological citizenship may provide a new space for civic action and social development.


20110120

What's Taking so Long?

Anybody who's ever had to deal with a remodel or roadwork knows that construction is slow, slow, slow. Of course, it wasn't always this way. The Empire State Building was built in 14 months, under budget; the Pentagon was built in 16 months; the Hoover Dam took five years. So what the hell has happened since then?

That was the topic of a lecture I attended today by Edd Gibson, an ASU construction expert. He made several valuable points, which I would like to extend and speculate on, if I may. Gibson noted that there are several common threads to successful projects: strong leadership, a sense of urgency and purpose, intensive planning, excellent communication, and innovation. Additionally, many of these great projects either failed to turn a profit for years, or required significant renovations afterwards.

The second part of Gibson's talk focused on the difference between successful projects and failures. This is more subjective than one might think: success is a matter of perception and of matching prior expectations. However, you can reliably detect the difference between projects headed for success and those headed for failure using the Project Definition Rating Index, a scorecard that measures how well the team understands its objectives, local context, and ability to work together.
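To give a flavor of how a scorecard like that works: the real PDRI, from the Construction Industry Institute, rates several dozen weighted definition elements, but the mechanics can be sketched with a toy version (the categories and weights below are invented for illustration, not the CII's actual values):

```python
# Toy sketch of a PDRI-style readiness score. Each element is rated
# from 1 (fully defined) to 5 (undefined); a LOW weighted total signals
# a well-understood project, a high total predicts trouble.

ELEMENT_WEIGHTS = {        # hypothetical weights, for illustration only
    "objectives": 40,
    "site_context": 25,
    "scope_of_work": 20,
    "team_alignment": 15,
}

def pdri_score(ratings):
    """Weighted sum of definition ratings across all elements."""
    return sum(ELEMENT_WEIGHTS[k] * r for k, r in ratings.items())

well_planned = pdri_score({"objectives": 1, "site_context": 2,
                           "scope_of_work": 1, "team_alignment": 1})
rushed = pdri_score({"objectives": 4, "site_context": 5,
                     "scope_of_work": 4, "team_alignment": 3})

print(well_planned, rushed)  # the rushed project scores far worse
```

The point of the instrument is exactly this kind of early, quantified warning: a project team that cannot rate its own objectives better than "undefined" is headed for failure before ground is broken.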

Leadership and teamwork are important, but they're also largely intangible qualities (unless of course they aren't). What I want to know are the social and technical factors that have driven this slowdown. Gibson alluded to regulation as a force that hinders rapid completion. While this point is not really contentious, it's also not something that I've seen conclusively demonstrated. Are there specific regulations (worker safety, public input, material and architectural standards, inspections for various subsystems) that delay projects? It's easy to point to regulation as a vague bogeyman, but regulation also ensures that buildings are safe to use, and embodies the virtue of clear planning that Gibson rightly values so highly.

When I first heard about this lecture, I assumed that the problem was technology. Simply put, buildings are vastly more complex than they were 60 years ago. If the Empire State Building were built today, it would be LEED certified, wi-fi enabled, ergonomic, subject to all sorts of review, etc. On the other hand, CAD makes design much easier than paper drafting, and logistics systems are much more efficient. Communication technology is better, but according to Gibson, there's no substitute for face-to-face contact, an opinion that I share. It's too easy to form a sham consensus in cyberspace.

What I fear is that the very mechanisms of public participation and input that I generally champion have in fact led to this perennial inability to rapidly complete projects. It's too easy for last-minute legal challenges to derail a major project. Technology might also contribute: the protean ease with which technology lets us improvise might hinder proper planning (this is certainly the case with my weekends). We might just have to accept that the great works of the mid-20th century were an industrial anomaly. But hopefully, their 21st-century equivalents will be more durable.


20101206

Report from Transforming Humanity

This past weekend (Dec 3-4), I attended the Transforming Humanity: Fantasy? Dream? Nightmare? conference hosted by the Center for Inquiry, the Penn Center for Bioethics, and the Penn Center for Neuroscience and Society. James Hughes and George Dvorsky of the Institute for Ethics and Emerging Technologies gave their blow-by-blow record of the conference, but I'd like to step back and provide an overview of the field and its position today.

The ability to use pharmaceuticals, cybernetics, and genetic engineering to alter human beings poses many complicated ethical, philosophical, and political issues about the potential deployment of these technologies. The attendees at the conference ranged from hardcore transhumanists, to left-wing bio-conservatives, and took a variety of approaches, from theology, to philosophy, to bioethics and medical regulation.

On the philosophical side, several speakers traced the philosophical heritage of transhumanism--the demand either to find a place for man in the natural world, or to create a unique standpoint for him--through the works of Thoreau, Sartre, and Cassirer. Patrick Hopkins of Millsaps College gave an interesting lecture on a taxonomy of post-human bodies: Barbies, Bacons, Nietzsches, and Platos. Post-humans will have to find internal meaning in their lives in many ways, and while I appreciated the scholarship, there should have been more about the new intimacy of technology to the post-human, and its effects on daily life, beyond the obligatory references to Haraway's Cyborg Manifesto.

On the practical side, the Penn contingent (Jonathan Moreno, Martha Farah, and Joseph Powers) talked about coming developments in cybernetic devices, brain implants, and pharmaceuticals. As it stands, there exists no regulatory framework for enhancements. The FDA will only certify the safety of therapeutics--drugs that treat diseases--which means that a prospective enhancement will either have to find a disease (medicalization, in the jargon) or exist in legal limbo. Katherine Drabiak-Syed gave a great lecture about the legal and professional risks run by doctors who prescribe Modafinil off-label. Despite American Academy of Neurology guidelines approving neuroenhancement, prescribing doctors are putting their patients at risk and violating the Controlled Substances Act.

Allen Buchanan opened the conference by suggesting that there was nothing special about unintended genetic modification, or evolution, while Max Mehlman of Case Western closed the conference by asking if humanity can survive evolutionary engineering. Dr. Mehlman proposed four laws: do nothing to harm children; create an international treaty banning a genetic arms race; do not exterminate the human race; and do not stifle future progress in understanding the universe. Good principles, but as always, the devil is in the details. International law has been at best only partially successful at controlling weapons of mass destruction and global warming.

To close on two points: The practical matter of regulating human enhancement remains highly unsettled, and leading scholars in the field are only beginning to figure out how we can judge the effectiveness and risk of particular enhancements on a short-term basis, let alone control long-term societal changes. The potential creators, users, and regulators of enhancement are spread across medicine, electrical engineering, law, education, and nearly every other sector of activity, and they are not communicating well. Basic questions such as “What does it mean to enhance?” and “Who will be responsible?” are unlikely to be closed any time soon.

On a philosophical level, the question of whether “To be human is to choose our own paths,” and “To be human is to find and accept your natural limits,” is unlikely to have a right answer. But Peter Cross was correct when he pointed out that even enhanced, humans will still need to find a source of meaning in their lives. If there is a human nature, it is to be unsettled, to always seek new questions and answers. The one enhancement we should absolutely avoid is the one that will make us content.


20100911

Nature

Yes, it has come to this: the prestigious scientific journal Nature mentions, by name, Limbaugh, Fox News, Glenn Beck, and Sarah Palin as concrete threats to the American scientific complex, and hence to the future prosperity of this country.

I almost don't want to weigh in on this (all sides have committed errors), but the fact that one of the most prestigious scientific journals is listing enemies by name clearly signifies at least one thing: the relationship between science and society is deteriorating.

Some time in the past, I'm told, Americans respected science. It let us end WWII brutally and decisively, and got us to the moon before the Soviets. What is different today, that the public no longer respects scientific evidence as a basis for informed policy decisions?

It also seems to me that complaints about liberal bias in science have a very simple solution. If you think science is too liberal, get off your political stage, get a doctorate from a reputable research institution, and do some quality peer-reviewed science of your own. Science doesn't really care that much about your politics. One of my more excellent mentors, who taught me about the Fourier transform and various image processing algorithms, was politically conservative. This was absolutely no barrier to his ongoing stem cell research. So, my more conservative friends, rather than complaining about the inherent liberal bias in science, why don't you come on over, learn some rigorous scientific reasoning, and help us out.

P.S.: the comments on that Nature opinion piece get, predictably, a little crazy. "La République n'a pas besoin de savants, uniquement d'équité" ("The Republic has no need of scientists, only of fairness")... oh dear, has it really come to this again?


20100829

Making Do [ Review & Speculation ]

I recently got an opportunity to speak with Steve Daniels, author of "Making Do", about building businesses from waste materials in Kenya. You should read it; it's free online. This was Mr. Daniels' senior honors thesis at Brown University, and I am impressed.

I'm unfamiliar with the social sciences, and have never done anything as ballsy as travel to the poorest parts of the world to study their problems and potential solutions firsthand. So, utmost respect to Mr. Daniels. I doubt I will be able to contribute much to this particular topic, but here are some thoughts.

My absolute favorite line in the thesis is the following:
"microenterprise efficiency comes not from the individual firm, but from the dynamics among similar enterprises in collective geospatial clusters. In fact, through clustering the jua-kali economy displays a critical property of ecosystems that Western economies lack: it produces virtually no waste."
Daniels illustrates how large industrial centers have arisen spontaneously in Kenya, capable of processing local scrap into useful goods. Everything is recycled and re-used. Items as complex as welding tools and metal lathes are made locally from improvised scrap. These tools can then be used to process more scrap, and to improvise even more tools, creating an organically growing and self-reproducing means of production.
"... the linkages among microenterprises form dense networks of activity. Take a stroll through Gikomba, and one can’t help but think of the informal sector as a living organism with intricate systems that form a concordant whole."
Daniels goes on to elaborate on what industrialized nations can learn from this production scheme, in terms of building an efficient economy that wastes nothing and can run on constrained resources. Daniels also notes that there are a number of barriers slowing the technological progress of this informal manufacturing. Most notable is that subsistence lifestyles leave no room for risk-taking, and credit is scarce, so there is no room to develop new technology. There are a number of clear and very doable suggestions in the text, and the work represents one of the few satisfying "and this, in detail, is how we will save the world" answers that I've come across.

In some sense, this form of production is similar to an idealized future society that we've been considering for some time: a society that can run on very low resources, produces zero waste, fits into the natural environment, is local, is self-reproducing, and is highly mobile. Now, scrap processing in Kenya does not fit all of these criteria, and it has some serious drawbacks. It would be worth considering whether the flaws evident in scrap-based production in Kenya generalize to our idealized science-industrial complex.

In Making Do, Daniels notes that production of uniform, high quality machinery in the improvised production environment is not cost effective. The specific example cited is an irrigation pump that could not be made affordable to local farmers until its production was outsourced to China. I believe that this could be a fundamental flaw in aiming for local, robust economies that contain a complete industrial basis. At the end of the day, the most efficient means of production is always to produce en masse, for the entire planet, and in a giant optimized and computer controlled assembly facility. Globalization is not just inevitable, it is the necessary and most efficient means to produce high technology.

That said, the optimal means of production may be a hybrid of the local-cellular approach and the global-centralized approach. It makes some sense to claim that the level of sophistication in a technology is directly related to the extent that globalization of production is necessary. As a simple matter of resources, technology that requires neodymium, sapphire, gold, silicon, arsenic, and germanium must necessarily involve global trade networks. Simpler technology that requires only rocks and plants can be manufactured in your backyard. Scrap metal re-use in Kenya, which can produce lathes and welding torches and stoves, requires an intermediate level of trade integration.

I suspect we can reason about this formally. If we look at a graph representing the transportation cost between population centers, we can get some insight as to where to place the global-local trade off. If transportation costs are low, the whole world is highly connected anyway, so in a sense everything is local. If transportation costs are infinite, no global activity is possible. Reality lies somewhere on this spectrum. It also seems likely that the transportation cost has been kept artificially low due to the temporary abundance of fossil fuels, skewing our means of production to have more globalized characteristics than may be sustainable.
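A minimal version of that trade-off can be made concrete. Assuming (purely for illustration, all numbers invented) that centralized production enjoys a scale discount but pays per-unit-distance shipping over a hub-and-spoke network, the break-even transport cost falls directly out of the comparison:

```python
# Sketch of the global-vs-local production trade-off on a transport
# network. Every number here is an illustrative assumption, not data.

def total_cost(unit_cost, demand_by_city, transport_cost, dist_from_factory):
    """Production cost plus shipping cost over a hub-and-spoke network."""
    production = unit_cost * sum(demand_by_city.values())
    shipping = sum(transport_cost * dist_from_factory[c] * q
                   for c, q in demand_by_city.items())
    return production + shipping

demand = {"A": 100, "B": 100, "C": 100}   # units demanded per city
dist = {"A": 0, "B": 500, "C": 1000}      # km from the central factory

def cheaper_mode(transport_cost):
    # Centralized: cheap units ($10 each) but everything ships out.
    central = total_cost(10, demand, transport_cost, dist)
    # Local: each city makes its own at $15/unit, with no shipping.
    local = total_cost(15, demand, transport_cost, {c: 0 for c in demand})
    return "central" if central < local else "local"

print(cheaper_mode(0.001))  # cheap fuel: globalized production wins
print(cheaper_mode(0.02))   # costly fuel: local production wins
```

The interesting feature is the crossover: nothing about either mode of production changes, only the price of moving goods, which is exactly why artificially cheap fossil-fueled transport could skew the whole system toward globalization.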

So why was I ever set on some science fiction of the self-reproducing assembly machine that could produce everything I'd ever want in my garage (this being the epitome of hyper-local high tech production)? I remember it has to do with Star Trek and those "replicator" machines that you carry around in space and will make you tea, lasers, whatever you want. If we are to achieve off-world colonization, it is necessary to reduce our industrial basis to something that can escape Earth's gravity well. If we can figure out how to make computers in a slum in India, from purely local resources, maybe we can start working on how to produce them on Mars.

So, with that selfish interest in mind, it sounds like improving the availability of micro-finance in poor regions, coupled with Fab Labs to distribute prototyping technology and skills, is a promising course to follow, both for alleviating world poverty and for purely academic advancement in local manufacturing. Read Daniels' thesis for the full list of concrete solutions for expanding improvised manufacturing in the third world.