Well, this is deflating. Here I have been extolling the teaching benefits of various widespread technologies, many of which are not particularly new, in the grand scheme of things (Blogs, about 10 years old. Video games, at least 25 years old). I have also been accused of not giving traditional universities or professors credit for being great innovators.
Many of them are. But these survey results would seem to indicate that the vast majority are not. They use their Learning Management Software (at least the gradebooks and syllabus features) but when it comes to anything else, they don’t bother.
Many folks on Twitter asked “How do we change that?”
Here are some thoughts.
“The first step may be teaching the teachers how to use those tools” said @Hutchirish on Twitter.
I actually disagree. I think that the beauty of the current generation of social media is that anyone who’s motivated can pick up the basics of these tools in a couple of minutes. Remember, we’re talking about a highly educated, intelligent group of people here. There are 70 million blogs out there and 120,000 are being created every day. There are 500 million members of Facebook and several million on Twitter, and almost 190 million households will soon have videogame players. How many of these millions of people had to be explicitly instructed in the use of these technologies?
I don’t think the reason professors aren’t using these technologies is that they don’t know how. The reality is a little more complicated.
@Chanders writes, “You’re prepping to teach a class you’ve never taught. You’ve a month. What should you do: learn social software or class content?”
I actually think this is a real question, not a rhetorical one; and it’s a much broader problem with university teaching in general. I think it’s far more important to know how to teach than to necessarily know all the content the course is supposed to put across, because teaching is not supposed to be about feeding forward information.
Whereas pre-K-12 teachers, in order to be certified, receive at least nominal instruction in the practice of instruction itself, from what I understand it’s possible to get a PhD or an MFA or even just be a graduate student on the path to a degree, and get stuck in front of classes full of undergraduates with zero instruction in how to teach. Nor, from everything I understand about how the professoriat works, are professors generally subject to any kind of organized professional development on teaching practice. I don’t remember my parents going to weekend workshops on how to conduct writing seminars at Louisiana State University.
Nor generally do they get rewarded directly for how well their students perform or master information (although they may catch flak via student evaluations if their students don’t LIKE them). In fact all of the work they do to prepare for their classes, if they are lucky enough to be tenure track, is considered to be taking away from their real work of publishing and scholarship. (If they are adjuncts, the work they do to prepare for their classes is likely to be done on the train or during downtime on their shifts at their second job!)
Consider that this system, which doesn’t prepare professors to teach, doesn’t particularly reward them for teaching well, and prefers that they do other things rather than getting better at teaching on their own time–consider the arrogance of the fact that this system considers itself the best kind of teaching there is. Once you get through 12 years of being taught by folks trained and certified in actual pedagogy, only then are you deemed worthy of studying with real scholars–i.e., people who know only the content of their own field. And it is held to be self-evident, by the professors themselves, that whatever they come up with to do in their classrooms–with no oversight or instruction about how to do it–is a far, far better learning experience than some stupid research-based learning software or some dumb award-winning video lecture by the most fascinating and brilliant expert in the world. Do you prefer wandering into a random regional theater and sitting through an original play by the local Corky St. Clair, or Netflixing a five-star film of your choice?
So, the reason most profs don’t use blogs, or wikis, or Google Docs, or videoconferencing or video games or even clickers in their lectures, is probably because they don’t see why they should, they’re not being supported to learn why they should, and they wouldn’t be rewarded for it even if they did. It’s an exceptional minority who will endeavor to innovate and be excellent under those circumstances.
If you want to change that you probably have to change the circumstances.
“Whereas pre-K-12 teachers, in order to be certified, receive at least nominal instruction in the practice of instruction itself, from what I understand it’s possible to get a PhD or an MFA or even just be a graduate student on the path to a degree, and get stuck in front of classes full of undergraduates with zero instruction in how to teach.”
Possible? Try “overwhelmingly likely.” Most graduate degree programs don’t even have an option to take a pedagogy course. The only discipline that regularly does, as far as I know, is English, and very often that’s one course in a five-year full-time program.
So much to respond to here. Since you’ve been rather blunt, I’ll do the same.
1. You misplace benefits by focusing on the technology so much. The benefits are less in the technology and more in the design of the instruction.
2. Course prep by faculty is not just about mastering content. It’s also about designing the instruction, whether it uses high-tech tools or not. Often you can accomplish the same thing using either high or low tech.
3. You’re right that many faculty aren’t trained explicitly in pedagogy. However, I’d have to say that nearly all my college profs were pedagogically superior to nearly all my K-12 teachers. YMMV.
4. I’ve spent a fair amount of time talking with faculty about why they do or don’t use certain technologies in their teaching. I would say that most of them are neither ludditical (love that word) nor are they technophiliacs. They are mostly dedicated teachers who will use technology when the perceived benefit of using it outweighs the perceived cost of learning & ongoing use. For many of the technologies you list, the apparent benefits don’t outweigh the expected costs. Of course it could well be that faculty are misjudging both costs and benefits. I think that’s sometimes the case. But learning technologists often misjudge this tradeoff as well.
5. You probably don’t mean to equate technical innovation with teaching excellence. I certainly wouldn’t.
The great irony about not being “taught to teach” is that most professors/instructors in higher education are not qualified to teach high school. Yet they have no problem looking down their noses at those “teachers” (some might not realize that teacher is a very bad word in higher ed).
Another great irony is that higher ed faculty usually resent working with an instructional designer with a Master’s degree or better, usually because the ID is not a faculty member (and is viewed as a technician). The ID has received far more education in pedagogy, cognitive load theory, effective teaching practices (and much more) than the research-trained prof, yet IDs are still awaiting promotion to the level of second-class citizens of the higher ed community.
The general public has no clue about the idiocy.
You’re right, I don’t mean to equate technical innovation with teaching excellence, but I do see a correlation between innovation in general and excellence in general, inasmuch as innovation is the ongoing pursuit of excellence. It’s not “blogs are magic” it’s “Why don’t professors use blogs? Because they don’t use new stuff that often. Why not? Because the system is not designed to get them to change or improve.”
Some of my professors were better than some of my previous teachers, but I was a very good student, so I appreciated simply being in a room with smart studious people and talking to them about ideas. I imagine you were a good student as well. I think to figure out how good a professor is you should see how well they do with reluctant or distracted or otherwise low-performing students.
The first option is the blocker, I believe. Once you make the decision to use your institution’s learning management system (LMS: Blackboard, Moodle, Sakai), you’re pretty much locked into it, and it’s harder to use outside tools. And a university or college generally only supports the use of an LMS, not blogs or videos.
The new versions of Blackboard and Moodle do have built-in support for blogs and wikis, but they are always years behind.
I’m not 100% sure about that survey though. 5% of professors don’t know what a videogame even is? Maybe they got tenure before the 1980s.
Yes, good point about the LMS.
I feel like “Never heard of it” is a weirdly dismissive phrasing like something your parents would say, “What is this ‘The Facebook’ that the kids are talking about?”
I think you raise the right issues and questions, but you also oversimplify the place of pedagogy in higher ed, in my view. For one, blanket statements like “all of the work they do to prepare for their classes … is considered to be taking away from their real work of publishing and scholarship” really need to be qualified pretty heavily. That statement applies primarily to research institutions. Many institutions are more teaching focused. The much bigger problem is how you measure and reward good teaching (which could conceivably include metrics on effective use of technology, but in my experience does not).
Do you think measurements of good teaching should include metrics on effective use of technology?
Hi Anya,
Interesting stats. But not really surprising to me. In fact I would say it’s worse than they show. The LMS stat shows 72% of professors ‘use it at least some’. When LMS usage stats for a typical university are analysed, they show that (being generous) maybe 15% to 20% of course shells are being actively used, and maybe only 3 or 4% are using what might be called the advanced features of the LMS.
Often this is treated as a technology problem by universities but really it’s a people and processes problem. Wearing my fantasy Vice Chancellor/President hat I would do the following:
1. Make it clear when faculty are employed that the university expects their teaching and learning material to be made available under a CC license as part of an open learning initiative.
2. Current staff would be given three years to convert their current material into open learning format.
3. Base promotion within universities not just on research output but also on the production of open educational resources based on that research, and use social and professional reputation benchmarks that credit the dissemination of learning through blog posts and social networking links, not just through traditional peer-reviewed journals.
4. Allow groups of staff to develop innovative online, flexible teaching models that provide some benefit to the staff either financially or through a redistribution of their work load through the year to allow them to focus on the things that interest them most. That is to say, a model for some courses that may not fit the traditional semester pattern.
5. Allow the Schools, Colleges and Departments to benefit directly from innovations in online delivery. At the moment there is very little incentive for a Head of School at a traditional university to encourage innovation. In fact it is a risk. The organisational structure of universities and the budget processes fundamentally inhibit innovation.
6. Allow staff to be flexible in the technologies they use and avoid locking them in to one solution where possible. I agree with Doug Holton’s point here.
Essentially this is an innovation diffusion problem, and what we see is the same proportion of early adopters year after year engaging in innovative online teaching practices. There is no mainstreaming of the innovation. That is largely because there is no perceived benefit for the majority of professors to innovate.
I could go on and on here I’m afraid. But I’ll leave it there for now.
Cheers
Mark
@marksmithers
As others have noted, there are many different and important issues that this post brings out…with limited time I’ll just comment on a couple.
Ultimately, I think we’re talking about change and how one supports and drives change in an organization. Higher education organizations tend to be rather unique given the shared governance structure found in most of them (for-profits are clearly an exception). The control faculty have over policy and business issues, I find, adds an additional layer of complexity in driving change in these environments. My “secret recipe” for addressing this “change challenge” is to focus on three primary change drivers: (1) incentives; (2) support services; and (3) policies. For example, I’ll work to provide faculty, especially early adopters, with financial rewards, or work to get their use of technology considered as part of tenure and promotion…these incentives help build up a critical mass behind the change. I’ll also make sure that there are significant support resources in place, so that as faculty experiment with new technologies and new teaching methods they know that if they hit a problem someone will be right there to help. And finally, I work with upper administration to implement policy changes, such as requiring that online courses be designed to specific standards, that push folks to change.
Change of this nature takes time, and I find 3-5 years is often needed to really get faculty to adopt new teaching tools and use them in ways that transform their students’ learning experience.
Josh
Anya:
Based on my experience only, absolutely not. I think “good teaching” is typically evaluated based on standardized evaluation forms, which are basically simplified customer satisfaction surveys.
Oh sorry, wrong question answered. I think I could imagine that use of technology might be considered as part of some larger umbrella rubric that might get at the underlying issues that the technology might enable (about communication, collaboration, etc.).
Is this reflective of the reality of teachers and professors in higher education? If teachers or professors don’t “feel” the need to introduce or use Web 2.0, or PLE/PLN, in teaching and learning, how would the learners be “convinced” that they should learn using those tools? I have often heard, or gotten feedback from students and educators, that students prefer simple, well-explained lessons by instructors, rather than going through the complicated and complex process of learning through social networks, or Web 2.0, etc. Maybe some students find social networking a waste of time, or feel that they could easily be distracted from studies if they spend too much time in social media and social networking. So, it is not a simple solution when it comes to learning pathways, especially when students would like to see how and why their teachers and professors are using the technology (VLE, PLE/PLN & Web 2.0) in the teaching and learning process. If teachers are not recognising the importance of blogging, wikis or forum discussion in teaching and learning, would students think that they should embark on such a learning pathway or learn about those tools?
Would there still be many myths about technology based learning in an open networked learning environment? Are there boundaries between teachers and students relating to technology based learning? Many of our students still “believe” that they could learn better in a “safe and closed 4 walls classroom” environment where they pay their fees to “listen and learn” from their expert professors. There may be some “truth” to this for young learners who have little or no experience in online and networked learning. Is it the belief in some of our (your) students in HE? How could we move forward to help them in understanding the needs and expectations of “future learning” in a more open learning environment – the internet and social networks?
Very interesting discussion. There is an over-arching philosophical issue – that of exclusive and exclusionary history and tradition of higher education. If someone has reached the top tiers of the professoriate, s/he has most likely been in school since age 5. The process to get to the top is a weeding-out process, a kind of false meritocracy. The university concept essentially was, the smartest men in a town got together and organized a place of study – every one else just carried coal. The professors professed and the students had the responsibility to learn – I’m not sure ‘teaching’ was the point.
As an adult education specialist whose focus is socially constructed knowledge, I see the use of technology as another tool in a process that starts with the willingness to ask questions and seek answers. Productive learning can and does occur in low-tech settings as well as high-tech settings. Critical evaluation is the key. As Sui Fei observes above, one can easily get lost in the social setting.
There is a widespread myth that technology is some sort of magic bullet for education. The state of South Dakota put laptops in the hands of high school students in some districts for 3 years. The results? No change in learning outcomes. I’ve been searching for studies that actually assess the value of laptops in high school–so far I’ve found lots of true believers, but no data showing positive educational benefits.
This article takes professors to task for not making greater use of technology, but never provides any reason to believe the use of blogs or web management tools offers any educational benefit beyond the author’s own faith-based belief. Come on, if you are going to criticize someone for not doing something, at least explain why they should be doing it.
Your analysis is shallow and uninformed.
According to the author, I am a Luddite because I don’t have a blog or use Desire2Learn for more than posting materials. But I have been advocating that students be required to have a laptop computer for nearly 20 years. The second wave of internet access on my campus was driven in large part by me. I was an early adopter of the internet for communication with faculty colleagues across the country and, soon, the world.
I’m a Luddite?
I spent 8 years doing human-computer interaction research, and I learned that the first rule of good interface design is to use the computer when it is appropriate, not just because it is what the “cool kids” are doing. Blogs make no educational sense. Desire2Learn is a poor environment for teaching–it isn’t even properly designed for online gradebook usage, never mind serious education. (For a certain, fairly narrow, range of courses D2L does some beneficial things. But for most courses its best use is as a fancy webpage for posting info and providing an online gradebook.)
Show me a technology that makes educational sense, and give me the support I need to implement it, and I’ll adopt it in a second. (As will 90% of the college/university faculty I know.)
I was an early adopter of clickers, but administrative support for the needed infrastructure never materialized, so I had to give them up despite their potential benefits. (It is easy to criticize professors for not using a technology, but better research might have shown you that lack of administrative support and resources is very much to blame in many cases.)
Math folks on my campus are screaming for more SMART boards, but there is no money to purchase/install them. (I’d like to use a SMART board but I’ll be retired long before enough classrooms have them in place for me to use one. And I am years away from retiring.)
But I don’t, and will not, blog. Blogs are vanity/advertising tools, not serious conduits for educational content.
If this blog’s author wants to be taken seriously, she might want to take the time to familiarize herself with her topic.