The world is a different place than it was even a scant ten years ago. In the late 1990s, we knew technology was advancing and that changes were in store. By 1999, we had bought into the idea that the legacy systems running major infrastructure could collapse under the weight of a simple shift of year, from 1999 to 2000. Education in today’s schools is lagging almost hopelessly behind because neither its culture nor its technological capacity can keep up.
Ten-some-odd years after Y2K, the angst is almost laughable. In those ten years, internet proliferation has reached all corners of the world. In 1999, AOL had just begun to demystify internet chat rooms and introduce the concept of “IM” to the masses. As early as 1998, I can remember conversations around corporate America trying to figure out how “instant messaging” would play a role in business ventures. I remember lobbying my boss to set up an email account for me by detailing the legitimate business uses for such a workplace “luxury” – after all, it cost the company an additional $5 a month. I remember upgrading my computer from a 2400 baud modem to 14.4 kbps, skipping right over 9600. Man, that was screaming.
Today, you can get a free email account anywhere, but email has become such an outmoded means of communication it could easily be called the “Betamax” of its day. There are so many other means by which we can communicate and disseminate information and data. I use my phone for almost everything but as a phone: SMS messages. Instant messaging. Camera. Twitter. Facebook (i.e., Web 2.0). I can even read, review, and edit documents and spreadsheets. Keep a calendar. Oh, and make telephone calls.
That’s only the tip of the iceberg. I first saw the potential of mass collaboration in a simple application called SETI@home, put together at the University of California, Berkeley, in which a user would download a packet of data collected by the SETI – Search for Extraterrestrial Intelligence – project and use their PC to analyze it. Not remarkable in 2010, but in 1999 it was a groundbreaking innovation: harnessing distributed collaboration through the internet to create more information and knowledge than any single machine could have produced.
We now see entire businesses built around mass collaboration – take, for instance, Local Motors. In short: they build open source cars through the collaboration of a community of enthusiasts and engineers. Linux – an open source operating system – would cost something on the order of $10 billion to develop from scratch, and yet it is given away freely through the efforts of millions of programmers donating their time to collaborate with others to build something better. A perfect example of where we are going: Wikipedia. Wikipedia single-handedly destroyed Encarta, the encyclopedia funded by the mighty Microsoft, on the strength of the idea that people will collaborate and give away their knowledge. Google has become one of the largest and wealthiest companies in the world by giving things away and opening its APIs (application programming interfaces) to allow collaboration: WikiLeaks publishes confidential Iraq war data, and someone somewhere plugs that data into Google Maps to create a visualization of where war deaths have occurred.
To be sure, these are not all altruistic efforts. Engineers contribute to Linux to build experience with the operating system and to build their own credentials for work on projects built on the platform. Chris Anderson, author of Free: The Future of a Radical Price, made the book available for download on Audible for… free. Why? As he explains it: give the book away and more people will read it, which will likely increase demand for his other work – specifically, speaking engagements. Give it away through collaboration, and burnish your own credentials to build something else.
We bemoan that we need to educate children for careers that haven’t even been invented yet. What we are missing is that we’re trying to educate children in outmoded ways and in outmoded facilities. Rather than embracing technology in our classrooms, teachers restrict students from using their phones and wifi-enabled iPods, even though there are myriad ways a student could use these personal technologies to further educational purposes. Teachers limit students’ use of Wikipedia over concerns about the provenance of its data. Schools restrict access to Web 2.0 applications such as Facebook and Twitter because they distract from the educational experience. Most importantly, teachers themselves do not understand how to use current technologies and therefore can’t use them in their classrooms to deliver pertinent and effective lessons. To be sure, these are all legitimate concerns, but at the end of the day these are the technologies – and their successors – that today’s children will be using as they begin their careers.
With as much change as has taken place in just the last ten years, it is clear that what students need to know and be able to do is to navigate these technologies and to use them to create ideas and find novel uses for them. Nothing a student learns from being directed to put together a foam-board conference display will serve them in learning how to present data. When schools cannot figure out how to collaborate across disciplines, students lose the educational experience of synthesizing knowledge. Schools need to learn how to collaborate, because collaboration is how these students will work and will be the keystone of the world in which they will live.