Archive for September, 2008
Sarah Robbins at Ubernoggin posted “An open letter to Baby-Boomer Managers” following a conversation she had with colleagues at the Young Professionals Summit in Florida this week. The letter describes differences that the group identified and discussed between Generation X/Y workers and their Baby Boomer managers. I want to comment on at least one of the differences Sarah describes from an educational, “in-the-classroom” perspective. Sarah comments,
The internet has served as a great social equalizer. In most online communities your value (and therefore reputation and power) are based on what you contribute not who you are. A well-read 18 year old who knows his stuff and is constantly active in the editing process of a Wikipedia article may be revered more than the heavily credentialed professor who interjects, corrects, and condescends to the community of the page. These relationships break down entitlements and, instead, center on accomplishment and contribution.
This alludes to an issue I’ve mentioned in this space before: shifting notions of what constitutes expertise and what is necessary to establish it. The ubiquitous availability of information makes the development of expertise more possible than it has ever been; as Sarah describes it, the internet is a social equalizer. Those who may not have had access to information previously now have it and are capable of using it to develop knowledge and make unique contributions. I recently encountered another example via one of my favorite blogs, Wired’s GeekDad, in a story they noted about the development of a potentially revolutionary solar cell.
William Yuan developed a three-dimensional solar cell that absorbs UV as well as visible light. The combination of the two might greatly improve cell efficiency. William’s project earned him a $25,000 scholarship and a trip to the Library of Congress to accept the award, which is usually given out for research at the graduate level.
Here’s the kicker. William is a 12-year-old 7th grader from Portland, OR! How did a 7th grader get the information necessary to learn enough about solar cells to develop such a project? I’d like to find out to be sure, but I’m guessing this wasn’t a classroom-contained project conducted exclusively between William and his middle school science teacher(s). Further, has William’s use and application of that information not made him an expert of some manner in that field? Absolutely. The implications for education focus on the manner in which we “teach.” Educators need to acknowledge that learners – of all ages – have the potential to develop expertise given the opportunity, and that opportunity is less likely to develop if the learner’s primary source of information is the instructor at the front of the room. We need to encourage and facilitate learner exploration of information resources beyond what we can offer. For many – particularly those reading this blog – that seems a logical and simple concept, but if it seems simple and logical to you, you are also probably familiar, as I am, with a number of educators who not only believe but insist that their learners need to listen to information only they can provide: their lectures are a necessary precondition for learners developing an understanding of the content. We need to learn how to teach “unconventional” experts – to not just give but to facilitate the opportunity for learners to develop and demonstrate expertise.
At the beginning of my class this semester – college-level Microcomputer Applications – I used Google Forms to deliver a survey and collect information about students: their life situation, computer skills, computer use experience and habits, etc. Learner responses to the questions lend at least some support to the idea, discussed in this space previously, that the computer-literate “digital native” is a myth. When self-reporting their own computer literacy on a scale of 1 (Not at all) to 10 (Extremely), a full 70% of the class (N=29) rated themselves a 6 or higher, with another 22% placing themselves in the middle of the scale. That makes a full 92% of the class that considers themselves moderately to extremely computer literate. If you exclude the most extreme responses, 86% of my students reported between 5 and 8, inclusive. However, from the same survey, 10% or fewer of the students indicated they use a database application (10%), presentation software (0%), or spreadsheet application (3%) more than “maybe once a week” each. And, surprisingly, the numbers weren’t much higher for word processing applications; only 11% of the group suggested they use word processing software more than “maybe once a week.” (review summary results here) Given that:
- Word Processing, Spreadsheet & Presentation applications are considered “core applications” by the IC3 definition of computer literacy.
- An “Introduction to Computers” curriculum – ostensibly to provide a foundation for computer literacy – typically teaches all four types of applications.
- and those four types of applications form the core of the most commonly used office productivity suites,
how can a group of learners consider themselves to have better-than-average computer literacy skills when they rarely use these applications? My answer? Either (a) our learners are not as computer literate as they (and many others) would like to think, or (b) our institutional definitions of computer literacy are way off. Granted, this only considers one component of the IC3 definition of computer literacy (the others being basic hardware knowledge and use of the internet), but I tend to think it’s the former. Your thoughts?
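For what it’s worth, the percentages above are just frequency counts over the self-ratings. A minimal sketch of that arithmetic (the ratings in the example are made up for illustration, not the actual class data):

```python
def literacy_summary(ratings):
    """Summarize self-reported computer-literacy ratings on a 1-10 scale.

    Returns whole-number percentages of respondents who rated
    themselves 6 or higher, exactly 5 (the middle of the scale),
    and between 5 and 8 inclusive.
    """
    n = len(ratings)
    pct = lambda count: round(100 * count / n)
    return {
        "n": n,
        "pct_6_plus": pct(sum(1 for r in ratings if r >= 6)),
        "pct_middle": pct(sum(1 for r in ratings if r == 5)),
        "pct_5_to_8": pct(sum(1 for r in ratings if 5 <= r <= 8)),
    }

# Hypothetical ratings for ten respondents
summary = literacy_summary([7, 8, 6, 6, 9, 5, 5, 4, 3, 6])
```

Nothing fancy, but it makes clear how the headline numbers (70% at 6 or higher, 86% between 5 and 8) were derived from the raw responses.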
I decided to experiment with Twitter this semester in my campus-based “Microcomputer Applications” class – basically an introduction-to-computers type class. My plans focused mostly on offering to send out text message reminders of deadlines for class assignments to students interested in receiving such device notifications.
The results and interest have been much more positive than I expected. I conducted the survey below on the Tuesday of the third week of class. More than half the class is already using text notifications, with another 10-20% interested but not yet set up.
In addition to the reminders that I’ve been sending out, two other things have happened.
First, I was able to notify more than half of my students of the emergency closing due to Hurricane Ike within ten minutes of having personally received the phone call from College personnel. Twitter also provides a means of communicating with them regarding changes to class (work to be done, due dates, etc.) between now and the next class meeting.
Second, there’s a healthy handful of students using Twitter as a semi-regular communications channel. I didn’t expect students to post many – or any – updates to communicate with each other. However, at the beginning of class, two students indicated they had used Twitter previously. At this point, several students are posting at least occasional updates and replying to classmates.
I’m wondering whether the usage rate will increase for (a) text notices / device notifications and/or (b) students posting updates. I will post an update or two later in the semester and at its end.
I’m teaching a section of Microcomputer Applications – an introductory-level computer course. An early topic in the semester, for me, is security and ethics. Of course, I prefer discussion to lecture, and as I prepped this semester, I was wishing I had a classroom clicker – aka audience-response or polling – system to help engage the class and to solicit their input beyond a simple show of hands. Having just used Google Forms to collect introductory information about students, I figured I could try using it as a makeshift classroom polling system. All it requires to be functional is a teaching station with a PC and projector, plus learners having individual access to the internet (a computer lab environment). It worked perfectly, and I definitely got more feedback and learner participation than I’ve gotten in the past. Here’s what I did.
First, I used Google Forms to create a survey for each question that I wanted to ask and use to facilitate the discussion. One example is this question about virus protection software:
I actually created ten surveys since there were various questions that I wanted to poll learners about throughout the discussion. It required more work and prep that way, but I think it helped break up the discussion a little as I stopped to ask a question and poll them. The alternative would be to put all the questions in a single survey and have learners respond to all of them in one fell swoop; that definitely would take less prep time.
Second, I added to the learning module within my LMS the list of questions with links to the surveys and the results. There are also additional links for learners related to each question or concept.
Third, during class, the process was relatively simple. As the discussion progressed, I stopped and asked the class to click on the current survey link and complete the survey.
After students had 30-45 seconds to start responding, I accessed the results page and continued to refresh until I had a number of responses matching the number of students in the class. The results, particularly given the chart and percentage summaries, provided a number of opportunities to personalize the issues and discuss the implications. This example shows that 12 of 21 in the class were/are not sure what phishing is.
With the preparation in advance and the links readily available to learners, Google Forms provided an excellent, makeshift polling system. Several side notes. First, it is anonymous since there’s no way to pass learner ID from the LMS to Google Forms; of course, I could have had learners enter their identity, but I chose not to since they would have had to do that on each and every form/question/survey. Second, it’s not good for impromptu polling; it takes relatively thorough preparation for the discussion. The process of creating the form, making the link available and accessing the results via the Google Forms interface takes some time. Finally, students couldn’t access the results on their own; Google Docs doesn’t currently have a way that I’m aware of to publish results. Clicking on the results link required logging into Google with my identity, but having the link available with the course materials allowed me to quickly and easily access the results.
Certainly, hardware solutions and proprietary software solutions provide more bells, whistles and features, but for the value/cost ratio I got from this, it was a great solution ;-)