Sunday, May 30, 2010

Blog #2 - Information Literacy and Instructional Models

• Define what it means to be information literate.

As I’m beginning to learn, the changing technological landscape has made attempts to define literacy a moving target. Or maybe it is more of an expanding definition, something akin to the universal expansion scientists see following the big bang. I’ll have to think some more on the propriety of connecting these two creations later.

What is certain is that literacy had a much narrower meaning pre-connectivity. Dictionary.com still publishes a definition that focuses on having a knowledge of literature, along with the ability to read words for meaning.

Information literacy still refers to the ability to digest content for meaning. It has simply expanded to include content beyond literature and the written word in a literal sense. It now involves ascertaining the meaning of images and sounds. Perhaps more importantly, it involves reading between the lines of text, images, and sounds. Following the lessons of media literacy, it should also include the ability to evaluate content for reliability, accuracy, and bias.

Both the format of the information and the depth of our analysis appear to be changing.




• How can teachers and students thoughtfully evaluate online information resources, including the online encyclopedia, Wikipedia?

This is a topic that my school has been very thoughtful about. Many of my students fail to approach the world with a critical eye. They spend so much time and energy decoding their texts and struggling to keep up with the literal nature of their content that they often fail to look beyond these basic literacies. For that reason, our Academy has added direct instruction in research skills to the standard curriculum for all students. Much of what we have done coincides with the suggestions of our authors. Starting with the basics, students should know who the author is. They should check the author’s expertise to determine whether she/he is qualified to report, and they should check the author’s background. As the text points out, Wikipedia editors can be tracked and rated on their Wikipedia user pages. Students can also follow an editor’s links and prior contributions to look for patterns of special interest that might suggest a bias or conflict of interest.

Another key strategy is to check the sources referenced in any posting to be sure they are reputable, actually exist, and, more importantly, have been interpreted properly by the Wikipedia editor.

As I begin to understand the true nature of our new levels of connectivity, I’m beginning to ponder new additions to my research skills curriculum. One key understanding/standard that we should be teaching is the potential downside of this collaborative production of information. One interesting read that comes to mind is James Surowiecki’s recent book, The Wisdom of Crowds.

Surowiecki is a strong proponent of the ability of what he calls the “collective intelligence” to produce meaningful analysis and, ultimately, decisions (Surowiecki, 2005, p. 17). He does, however, identify certain conditions that must be present for the collective process to produce meaningful results. Individual contributors to the collective intelligence must be diverse in background and opinion, make their decisions independently of the influence of others, and be decentralized, in that they draw their conclusions from local information. Surowiecki argues that only when these conditions have been met can the aggregating mechanism (in our case, the internet) produce meaningful collective conclusions (Surowiecki, 2005, p. 10).

When I apply Surowiecki’s thinking to media literacy objectives, it is easy for me to see that these conditions are not met. Anyone who has ever served on jury duty has probably seen the downside of attempts to make decisions using the consensus model. But the question remains whether today’s increased connectivity and collaborative information sites like Wikipedia fail just as badly to meet Surowiecki’s requirements. Data collected recently for another book, Wikinomics, suggests that mass collaboration is a social boon and that Wikipedia is actually more accurate than some traditional encyclopedias. Is it possible that today’s ‘research-publish-feedback-edit’ process balances out the potential dangers of the collective intelligence in some way?

I’ve not yet read beyond the introduction, but I’m generally inclined to think of the individual contribution as the origin and foundation of all collaborative efforts. It would seem to follow that, if the initial idea becomes corrupted by the influence of the collective, then the final product will be faulty. Are there measures in our current technology that might prevent undue influence?

In the spirit of the collective, I remain open to discussion… and my new read, Wikinomics, remains on my nightstand waiting for summer vacation.




• What are a few of the similarities and differences between the four instructional models of Internet use (Internet Project, Internet Workshop, Internet Inquiry, and WebQuest)?

Some key similarities of these models are the use of the internet to access a virtually unlimited range of resources, the use of an inquiry-based approach to construct understanding, the need for information literacy to be successful, and the potential to make classroom learning more active. One key benefit in today’s classroom that is also shared across these models is the ability to connect learning to an information medium already familiar to students.

The key differences I noted center on the degree of independence required or allowed in the process. Some of these models give students a great deal of freedom; others do not. WebQuests, for example, tend to prescribe the inquiries to be made and the sites to be used in the research. This may make the WebQuest model better suited to a classroom or an individual student that needs precise direction.

There are also differences in the degree to which student work is openly published, and therefore in the potential for feedback from an audience outside the classroom. Limiting that exposure may actually be a benefit for fragile students who have not yet developed the sense of confidence that comes from success.

2 comments:

  1. Hi Chaucey,
    I have one question for you based upon your response to question #2. How do you accommodate the levels of the students you teach when helping them understand how to appropriately evaluate Internet sources? As a Special Education teacher, too, I've noticed that many of my students are so frustrated by the time they have typed a topic into the Google search engine (since they struggle to spell, they get links unrelated to what they are actually looking for) that they just take whatever they find first and go from there. What are some of your strategies? I'm just looking to expand my knowledge base on how I can further help my students.

    Thanks,
    Amy

  2. Hi Amy,
    We do things step by step, and it's not easy. I have a whole checklist sheet that has them answer a series of 10 (I think) questions. They then total up their score and use it to rate reliability on a scale of 1-10.

    And, of course, all of that comes after they've done the research part. For research they have to fill in another worksheet that asks them to list 5-10 keywords or keyword combinations to aid their query, followed by an analysis of the titles and brief descriptions they get on the 'hits' page.

    Did I mention it was a lot of work? But they actually get pretty good at it after a while. Let me know if you want copies of the forms I use. I'm happy to share.
