We all know the routine: all the world has changed, but the classroom is the same as it was a millennium ago. Faculty feel guilty but don't know what to do. Those who use technology for some part of their teaching now learn that they have a lot further to go: Techno-nirvana keeps receding into the distance.
Ideas are powerful, especially when they have become beliefs and have been unquestioned for generations. Three in particular may be standing in the way of more faculty using our new learning tools in enlightened ways.
Myth One: Content is What You Teach
What is the content of a vacation? What is the content of Christmas or Ramadan or Passover? What is the content of a night out at a good restaurant? These questions seem absurd and unanswerable because these are not commodities or things but experiences.
Learning is also an experience. There is no "content" to hand to a student or to tell a student; there is only a process of guiding students as they learn.
You might ask, what difference does it make? You know what we mean by content, you might argue. But a label, as we know all too well in this political year, can determine how something is perceived and understood. Labeling learning as "content delivery" distorts human reality so severely that teaching methods are believed effective if they "deliver content" to as many students as possible.
Why don't students complain? Sometimes, for the simple reason that they don't have to do anything under this model of learning.
Information technology provides the means to guide the process of learning, which is what we should have been doing all along. It also makes "the content" freely available, so the myth of scarce content that can be charged for has been exploded. If it's only about content, and the content is free through open courseware programs, why pay for education?
Myth Two: The Individual Learner
Many years ago, Paul Theroux wrote a travel book called The Old Patagonian Express. He walked from his home in Boston to a train station and then rode trains all the way to the southern extremes of South America.
In Mexico, on the train, he met a Japanese man and spent part of a day talking with him. After a few hours, the Japanese man started looking around quizzically. He kept scanning the railroad car, then turned back to face Theroux and asked, "But, but, where is your team?"
To this man, the idea of a person traveling alone was inconceivable, almost unnatural.
Another group of stories describes feral children raised by animals, often by wolves. When the children are discovered years later, they have no recognizable human traits, including language, but are instead like the animals that raised them. The children never had the chance to develop their innate human capacities because they grew up isolated from other humans. A human is only human in a social context.
One feral child researcher felt that the story of one feral girl demonstrated just how mentally naked humans are when born and how much we rely on society to shape us. As another researcher put it, human culture operates on the mind as "a large-scale moulding matrix, a gigantic conditioning apparatus" without which we would remain at the level of animals.
With these stories in mind, we must wonder at the very basis of U.S. higher education, which has traditionally insisted on individual learning, going against the very grain of human nature.
Information technology enables social learning. It is, in this regard, a more natural learning tool than the less social learning tools and materials we've relied on for centuries.
Myth Three: Machines are De-Humanizing Influences
I am often asked one way or another about technology coming between people or about machines intruding in our lives. The questioners seem to believe or fear that if we communicate through or use information technology we will be less human. Using cell phones too much will cause brain cancer; texting will take kids away from other important activities; video games will lead to violence; students will cheat more because of easy access to Google and Wikipedia. And so on.
When I first started using computers in my teaching, in 1985, I was at Gallaudet University in Washington, DC, which is the only university in the world completely dedicated to educating deaf students. The students I worked with faced the same fundamental problem that all people born deaf face: no access to the living form (the spoken form) of the native language of their country. Imagine a blind person trying to learn sign language, for example. It is that hard for a deaf child to pick up a spoken language. Lip reading is largely a myth.
But in my class, we set up what was then a new invention, a local-area network, and used a little-known utility that came with the network software: chat. It is hard to believe that chat was almost completely unknown in 1985, but it was. As each of my deaf students sat at their own PC, they could write to all the other students in the room and to me, and we could all have a conversation in English.
Talk about a revelation! For the first time in their lives, they were able to have a group conversation in English and the group included a native speaker of English (me). Suddenly, my class became a destination for my students: They would arrive early and stay late. A new kind of information technology had transformed their learning from a gantlet of failure to a joyful interaction.
Why These Myths No Longer Serve Us Well
Assuming learning is about acquiring content is a distortion of reality. Learning is about learning how to learn. It is a social process: For young people the social process must be tangible, present, and immediate; for more advanced learners, the social context is internalized but still indispensable.
Of course we are social beings, so we should design learning around the social process of learning. Information technology gives us the flexibility to design learning situations that move away from the three crippling myths above, myths we've lived with for centuries. There is nothing superficial about the changes we are going through now in our understanding of learning. Now that we have alternative methods that move us from myth to human reality, the time for resistance is over.
Trent Batson, Ph.D., has served as an English professor and director of academic computing, and has been an IT leader since the mid-1980s. He is currently a Communication Strategist in the Office of Educational Innovation and Technology at MIT. email@example.com