On Tuesday, I had the opportunity to participate in the “Assessing Digital Literacy: Outcomes and Impact” webinar presented by ALA’s Office for Information Technology Policy (OITP) and Digital Literacy Task Force. Unlike other webinars, this one allowed for very active participation. It was very similar to the Wimba classes I sometimes had during graduate school. For those of you not familiar with Wimba, it is a module within the Blackboard learning content management system. It features a main video feed from the professor (or a PowerPoint with voice-over), chat, and participation via microphone and webcam (professor permitting). In this case, YouTube was used to stream the video (link to recording above) while the chat ran in two places: Twitter and Google Hangouts. I used Twitter. Usually, webinars feature only a video feed and maybe an in-site chat feature.
Before the presenters spoke, the moderator, Renee Hobbs, asked the participants to define digital literacy. I quickly chimed in with an answer that OITP promptly repeated:
As the presentation went on, it turned out that I had hit the nail on the head, as many others focused only on computer skills. Two others pointed out how digital literacy helps promote equality, which is very true.
The webinar covered two ways to assess digital literacy: Northstar and ORCA. The first presenter, Karen Hanson, a federal program officer at the U.S. Department of Commerce’s National Telecommunications and Information Administration, began by describing BTOP, the Broadband Technology Opportunities Program. While the program’s goal is to expand broadband access, it uses the Northstar digital literacy assessment to help its cause. This assessment covers a user’s familiarity with general computer usage, the internet, the Windows and Mac interfaces (not yet updated for Windows 8), e-mail, and word processors. The assessment is used for two reasons. First, test takers see and report their own skills, which makes them more aware of what they don’t know so they can sharpen those skills. Second, BTOP wants lawmakers and funding sources to know how digital literacy skills measure up in areas without broadband, in the hope that this will lead to greater high-speed internet access (critical for rural America!).
The ORCA (Online Reading Comprehension Assessment) segment was presented by Dr. Julie Cuiro of the University of Connecticut. ORCA is an online program, including a mock search engine (Gloogle), that has students (in the case of the study, those aged 11-12) locate, evaluate, and synthesize information. They must type and summarize their answers because copy and paste is disabled and there is a word limit. Finally, they are asked whether the sources they found are reliable and/or biased. To make scoring easier, a second, multiple-choice version of ORCA was created. Interestingly, while the design made it less valid, it was determined to be equally reliable at predicting a student’s digital literacy skills!
Following this revelation, Dr. Cuiro discussed the next step taken. Instead of solely assessing digital literacy skills, the students’ offline skills were also checked. The results showed that offline print media skills were not correlated with digital literacy skills. In fact, some students were extremely good at either digital literacy or traditional print-based literacy but not the other. This was attributed to the home’s focus on digital media: the greater it was, the weaker the print media skills, and vice versa.
As the presentation on ORCA concluded, it was mentioned that many students do not conduct research properly. They often form their hypothesis/idea first and look solely for ideas to support it. This led to the suggestion that they might read print and digital media differently. Again, I quickly replied via Twitter with a link to an article I had recently read about that issue:
I was shocked that it was quickly retweeted by five accounts, including OITP! I was very lucky to quickly relocate the article I had shared on Twitter a week or two before. Here is a link to the article. It’s a very worthy read because it describes how reading digital media affects the brain differently than reading print media. Scholars at the University of Texas-Austin conducted the study the article is based on (the article links to the original study report too).
I will admit that while it was hard to keep up with YouTube in one tab and Twitter in another, I wouldn’t trade the experience for the world. On Twitter, we had many active conversations going that added to the streamed content on YouTube. For example, we had an active conversation about whether learning should be redefined in the digital world, and another librarian and I compared notes on how we saw students look for information that fits their hypothesis/idea while using only one source or not looking at other viewpoints. It was a great learning experience and a great way to connect with fellow librarians and library students.
Have you ever participated in a similar-style webinar? If so, what did you think? Additionally, as I can see this going either way, do you see digital literacy as a subset of information literacy, or is digital literacy a distinct field (or could information literacy be considered a subset of digital literacy)? Feel free to share thoughts and debate!
5 thoughts on “Assessing Digital Literacy”
What a great example of digital literacy: synchronous global interaction embedding high-quality content and discussion.
Thank you for commenting! Additional thanks for the shares. Yes, the webinar was a great example of digital literacy information and interaction. I’m glad I participated in the webinar and could summarize the presenters’ work (while adding information about the great format!).