Tuesday, October 6, 2009

The Freedom of Information Post

As much as I used to love The Simpsons, Sunday's episode - in which they gave social media a send-up several years too late - may have been the clearest signal yet that the show is woefully behind the times (and no longer funny). Nevertheless, it briefly touched upon the very topic I planned to discuss this time around. When the technophile teacher who replaces Edna ridicules Martin's rote memorization in favor of looking the answer up online, the show manages to highlight a cogent question: is it more important to know something or to be able to know something?

Ubiquitous computing is quickly becoming a reality, and accessing new information - an encyclopedia entry, a news article, a restaurant menu - is becoming easier, faster, and more reliable. We will reach a point at which there is no clear distinction between the information stored in your head and the information stored in a readily accessible database. When that moment arrives, what will the value of knowledge be?

Some argue that the widely scattered bits of information we gather through our various electronic media are inferior to the more deeply textured knowledge we can acquire from books and other conventional sources. If we don't actually know the details of a thing, but only an excerpt or a summary of it, then perhaps we don't know it at all. But I would argue that the way we store information in our brains now is not much different from the way we've always done it.

When we cram for a test the night before we take it, we're storing information for easy access and retrieval in our short-term, fleshy memory, but we're not retaining it over the long run. The only effective difference between Martin's rote memorization and the teacher's search engine is where we access the information. Whether we pull it out of our brains or out of a cell phone, the knowledge will be gone the next day.

This happens in areas outside of education as well, beyond "teaching to the test" and cramming the night before. Most of what we commit to memory is in the form of snippets of information that lack any real depth. The route we take to drive to work, our impressions of a politician, what our favorite food is - all of this knowledge is based on incomplete observations thrown together and regurgitated when necessary.

And this has been true since well before the internet age. While most of us know that gravity holds us to the ground, few of us understand how it actually functions. We know nothing about how Einstein's general relativity upended Newton's classical view, why gravity remains stubbornly at odds with quantum mechanics, or the fact that the Sun's gravity bends the light of distant stars passing behind it. Most people's understanding of a fundamental force that rules our lives could be summed up in a newspaper or blog headline: "New Study Indicates Big Things Attract Small Things."
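As an aside, the detail most of us never learn is surprisingly compact. As a rough sketch: in general relativity, light grazing a body of mass M at a closest distance b is deflected by an angle of about

    θ ≈ 4GM / (c²b),

which works out to roughly 1.75 arcseconds at the Sun's edge - the bending of starlight that the 1919 eclipse expeditions measured and that made Einstein a household name.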

Then there are those who believe the technological revolution we're experiencing will lead to a new enlightenment, brought about by the profusion of information available to us. Proponents of this view point to rising IQs throughout the world, the seemingly unstoppable democratization that social media promises, or the sheer complexity of today's technological culture. The freedom with which information can cross national, personal, and physical boundaries gives every internet user the power to learn about, discuss, and influence the world at large, from restaurant reviews to global diplomacy.

But what our cell phones, Twitter accounts, and cloud computing don't give us is a better way to synthesize and assimilate the data we're reading. And we're no longer just pulling all-nighters for tomorrow's Calc test. Now we're keeping track of the updates and activities of all our friends, companies, and favorite celebrities; reading opinions and reviews for every item we can spend our money on; and scanning through statistics, blogs, and reports to win comment and forum wars. But are we closer to our friends, more frugal with our money, or more reasoned in our debates? Or are we simply overwhelmed by the deluge of free information?

So far, very little scientific and technological development has been geared toward helping us think critically, learn, or teach any better. Without an increased capacity for synthesis and assimilation, the knowledge available to us, whether in the form of memorized math formulas or Wikipedia entries, is not any more useful. The issue is not how we acquire information but what we're able to do with it once we have it.

Consequently, as the information age engulfs us, inventing new ways to understand knowledge is just as important as inventing new ways to spread it. Technology that rejuvenates our neurons or restores our memories must keep pace with our information technology. We must become smarter before we can become even smarter still.

Tune in next time when I discuss the Singularity Summit, because what kind of futurism blog would this be if I didn't?
