Category Archives: Evaluating Sources

DART Recap – Authority is Constructed and Contextual

The format for the latest DART session was an open forum to discuss one of the frames in the ACRL Framework for Information Literacy: Authority is Constructed and Contextual. While no specific articles circulated prior to the meeting, participants were encouraged to explore threads within two related ALA listservs, the ILI-L and the ACRL Frame.

The discussion generated and explored multiple complex, open-ended questions: How do we instruct students in distinguishing between news media and entertainment media? What are strategies for helping students to navigate the tension between innate bias and journalistic ethics and ideals, particularly in the current landscape of distrust for the media? How do we enable students to evaluate peer-review and persuasive research agendas within varying disciplinary frameworks and norms? What is the difference or interplay between expertise and authority? And overall, how do we foster critical thinking skills that transfer beyond the classroom?

Sorry to disappoint, but I will not be providing any tidy answers to these questions within this blog post. I can, however, share some potential ideas for activities and strategies that folks discussed having used or seen in relation to these issues.

  • Media Diet: Have students map out their media diets. What media/information sources do they readily consume? The diet metaphor can be stretched as far as you like (daily intake; splurge sources; holes in your diet; allergies/intolerance). Use the maps to guide or generate class discussion about bias and media literacy.
  • Choose Your Own Adventure: Present students with a real-world problem-solving scenario in which they are evaluating information. Example: You wake up one morning with a horrible rash on your arm. You do not know what it is. What do you do? Discuss student responses and continue to add questions/choices. (The doctor you see wants to amputate your arm, what do you do now?) This activity can provide a different access or entry point for talking about authority and information literacy.
  • Opposing Viewpoints: Have students look for sources that take different stances on an issue. Unpack this experience as a class, and guide a discussion of the who, what, when, why of these viewpoints and the process of uncovering them.
  • Evidence First: When looking at research, encourage and instruct students to focus on the evidence rather than homing in on the conclusion. Is there a clear trajectory from the evidence provided to the conclusion presented?

There are no easy solutions or quick-fixes in this area, and that can be uncomfortable at times, particularly in the role of educator. But it is possible that in some way, the questions here are just as meaningful as the answers, if not more so. It is through the posing of such questions, engaging in the dialogue, and learning how to operate in the uncertainty, that progress is made, both for us as professionals, and for our students.

We do not always have to have clean-cut, black and white answers for our students, and doing so would ultimately do them a disservice anyway, as it misrepresents the gray areas inherent in critical thinking. When a student asks a complicated question about bias or authority, don’t be afraid to shrug, and say “yeah, it’s really tricky, isn’t it?” Or flip the question back to the students, and ask them what they think and why. Be transparent about the challenges and the give and take, and talk about it explicitly. Indeed, scholarship is a conversation after all.

More to check out:

Read: Interesting blog post examining the frame Authority is Constructed and Contextual.

Listen: Circulating Ideas podcast with UT Doctoral Fellow Jeremy Shermack talking about bias, journalism, and the media.

DART Recap – scoUT Discovery Tool

Yesterday we launched the new platform for our professional discussion group, Discussions About Resources and Teaching (DART), formerly known as RIOT.  Motivated by feedback and transitions within the department, this change better reflects our current structure and goals as a community of practice.  Thank you to everyone who participated in kicking off DART!

Our topic for discussion was teaching with web-scale discovery tools like scoUT. To gather different perspectives, participants were invited to read one of three articles beforehand:

“Teaching Outside the Box: ARL Librarians’ Integration of the ‘One Box’ into Student Instruction” College & Research Libraries

“Beyond Simple, Easy, and Fast: Reflections on Teaching Summon” College & Research Libraries News

“Teaching ‘Format as Process’ in an Era of Web-Scale Discovery” Reference Services Review

We began with a round robin to share how, or in what capacity, people are using or not using our discovery tool scoUT. Perhaps not surprisingly, there was a broad and varied spectrum of responses. Some people actively use it as a teaching tool in classes, while others mention it briefly, do not teach with it at all, or find themselves engaging with it more on the reference desk than during instruction sessions. People mentioned using it for developing topics; searching by citation; refining vague reference requests; finding book reviews; and locating material on obscure subjects or with very specific search phrases only found in full-text.  Also, it seems that few people actually call the tool scoUT when talking about it with students, referring to it instead with names like the “all tab” search; “main library” search; or the “big box” search.

Interestingly, some of the features discussed that make scoUT useful are also what can make it challenging. For example, it is helpful in retrieving sources that cover obscure, specific or seemingly unrelated topics because it searches and crawls across so many things. Yet that also means that it often returns a deluge of results, which people expressed can be difficult or overwhelming to deal with.

After the round robin, much of our discussion stemmed from the third article, which explored the concept of teaching format as process and how web-scale discovery tools factor into this approach.  When searching online, sources become decontextualized; content is separated from its package, and so visual indicators cannot necessarily be relied upon. Guiding students to consider the creation process inherent in different source types can help foster the development of higher-level critical thinking and evaluation skills.  A tool like scoUT, which requires sifting through a large number of different, and at times random, source types, presents an authentic opportunity to discuss and hone these skills. However, this depth and engagement takes time, and can be difficult to address in a one-shot instruction session.

There was also a general consensus that whether or not we are teaching scoUT directly, students are going to use it. Not only is it the first, obvious search box on the website, but it also has that familiar Google-like quality that will draw students toward it. So if they are going to use it anyway, it only makes sense for us to think about how we can teach them to do so in a discerning and productive manner that will serve them even outside of school.

It was great to hear at the end of the discussion that several people felt interested and inspired to find new ways to incorporate scoUT into their teaching practice. Thanks again for an insightful and engaging first DART!

Do you have an article or topic you would like to bring to DART? Feel free to contact Elise Nacca with any ideas and feedback!


Kulp, C., McCain, C., & Scrivener, L. (2014). Teaching outside the box: ARL librarians’ integration of the “one-box” into student instruction. College & Research Libraries, 75(3), 298-308. doi:10.5860/crl12-430

Cardwell, C., Lux, V., & Snyder, R. J. (2012). Beyond simple, easy, and fast: Reflections on teaching Summon. Chicago: Association of College and Research Libraries.

Seeber, K. P. (2015). Teaching “format as a process” in an era of web-scale discovery. Reference Services Review, 43(1), 19-30. doi:10.1108/RSR-07-2014-0023


RIOT: On reading

Recently, I was introducing an evaluation activity to a UGS class in which students worked in groups to come up with evaluation criteria and apply them to an assigned source. I had a lot to cover during the class, and found myself repeatedly telling the students not to actually read the information I had provided them, but to skim the beginning if necessary, and analyze its merits based on context. While the activity led to a great discussion about evaluation and how to use various kinds of sources, something about it felt inauthentic.
Upon further reflection, I came back to something that has bothered me in the past. By necessarily compressing parts of the research process to make room for a deep discussion in a 50-minute one-shot, one of the first things that goes out the window is reading and reflection. I've often thought of reading as a problem to overcome while teaching, and have designed most activities to require little or no reading. I ran into this problem again this semester in trying to rethink how I teach students to use background information to find keywords. I struggled to come up with a good way to demonstrate how to pull keywords out of an encyclopedia article without slowing down and giving students time to read and digest the article. I kept coming back to the same roadblock: how can I in one breath tell students that research is a slow, iterative process, and in the next breath tell them that it's not necessary to actually read the information I'm asking them to evaluate?

While searching for something to RIOT, I came across an article co-written by a librarian and an English professor at Hunter College. The article outlines the reasoning behind a “Research Toolkit” they created that includes both student-facing online learning tools and a faculty guide for their use. While the resource itself doesn’t sound too dissimilar from our Information Literacy Toolkit, the portions of the article explaining their pedagogical reasoning for moving from mechanics of research to deeper, critical inquiry-based research spoke to my own cognitive dissonance around reading and research. Here’s one excerpt:

Reading is an area often neglected by both library and composition scholars. As Brent (1992) explained, “instruction on the research process…deals with the beginning and the end of the process (using the library and writing the drafts), but it has a gaping hole in the middle where much of the real work of knowledge construction is performed. The evaluation of sources is treated chiefly as a matter of measuring the writer’s overall authority as a witness to facts, as measured by factors such as his reputation and the recency of the source” (p. 105). Looking at a variety of writing textbooks and library instruction materials confirms Brent’s statement: most of them focus only on evaluating sources rather than reading them.

Furthermore, the way evaluation of sources is often taught forefronts ideas such as identifying the “bias” of a source. While sources are indeed biased, most students do not understand that all sources will have a bias; it’s how they choose to use the source that matters. Students reading only to evaluate the credibility or bias of a source are not going to do the deeper reading that truly understanding a source requires. Brent (1992) called for a “theory of rhetorical reading” (p. 103), something that has yet to be fully realized. Keller’s (2014) study analyzed student reading practices and noted that focusing on the evaluation of sources may have resulted in a form of overcorrection (p. 65), which may lead students to dismiss valuable resources.

Am I doing my students a disservice by focusing on evaluation skills to the detriment of critical inquiry? How can I teach them to construct knowledge when I don’t even give them time to read? The longer I teach, the more I sometimes feel like by cramming the entire research process into a one-shot, I’m deceiving my students. Some things I’d like to discuss:

  • By focusing on evaluating information, are we leading students towards “overcorrection” and away from inquiry?
  • Is it our responsibility to teach students how to read deeply, or does that fall outside of information literacy?
  • How can you truly model a process that involves reading and sometimes rereading when you have a limited amount of time? Can tutorials and other online learning objects help?
  • Can we come up with exercises that help students practice the “reading and thinking” parts of research?
  • Are there ways to collaborate with our colleagues at the Writing Center around this issue?

Referenced source:

Margolin, S., & Hayden, W. (2015). Beyond mechanics: Reframing the pedagogy and development of information literacy teaching tools. The Journal of Academic Librarianship, 41(5), 602–612.

TLS TIP: Background Knowledge for the Win!

Captain Obvious says: “Students don’t come to us as blank slates.”  Though we know this as teachers, we don’t always plan ways to activate and connect students’ prior knowledge to whatever is being taught.  I’m often guilty of this because I have so much to get through in a session.  After learning more about the utility of activating students’ background knowledge, though, it is something I’ll be making more time for in the future.  

If you are also interested in this topic, I recommend picking up a copy of How Learning Works: Seven Research-Based Principles for Smart Teaching (print/ebook), which illuminates what happens in students’ brains during class.  Each chapter is an expanded literature review, summarizing and explaining the research on various aspects of learning, and offering concrete ideas for putting this research into action.  For this TLS tip, Chapter 1: “How Does Students’ Prior Knowledge Affect Their Learning?” will be my focus.  This chapter highlights the importance of connecting content to students’ existing knowledge, and argues that this is an effective way to increase their retention and attention.  Though activating background knowledge can come with issues (such as incorrect or insufficient knowledge), the authors make a strong case that it can and should be a part of good pedagogical practice, and that these issues can be ameliorated through good planning.

The authors deal with this topic deeply, but I was particularly interested in the idea of what they called “elaborative interrogation,” or “asking students questions specifically designed to trigger recall” (16-17) studied by Woloshyn, Paivio, and Pressley (available here).  This strategy, asking students not only to remember something, but to engage with the information, increased retention, and could easily be applied to information literacy instruction.  By bringing in a variety of research, Ambrose et al. were able to convincingly argue that teachers need to engage with students’ prior knowledge so that we can help them recall information more effectively, and understand what level they’re at when we meet them.     

Activating background knowledge is an important consideration when we think of our students’ learning, but how do we go about doing this in our new learning labs?  I think there are two times ripe for turning students’ minds to what they might already know about a subject – before you meet with them and during your session, and I have a few strategies for each situation.  These strategies were informed by How Learning Works.  

Before class:

It isn’t always possible, but if you’re lucky enough to have students do some kind of prep for the session, this can be a great time to encourage students to activate background knowledge!  Think about asking students to:

  • Write a paragraph (or list) on what they already know about a given subject.  If they submit it beforehand, you’ll have an idea of the concepts you still need to target during your session.    
  • Brainstorm what they know, and where they see gaps in their knowledge.  This can have the added benefit of drawing the student’s attention to the idea that he or she has something to learn, and again can allow you to target your session more effectively.     

During class:

  • If you’ve had students do anything beforehand, this is a good time to lead a discussion about it. Remember the concept of “elaborative interrogation” discussed above, and think about what questions would draw out knowledge and connect it to class concepts.
  • Have students brainstorm what they know about a topic – use the whiteboards or individual flat panels in our new space.  You can ask students to respond to a set of specific questions or to a more general prompt like “tell me what you know about…”  
    • I have had good results doing carousels, in which students move around the room in groups, answering questions on individual whiteboards.  As students circulate around the room and answer each question, they add to the answers the previous groups have given.  If you don’t want that much movement, groups can work together on each question at one whiteboard.  For evaluation of sources, the questions could be something like: 1. What clues indicate to you that a website is a trustworthy source of information?  2. What clues indicate to you that a website is not a trustworthy source of information?  3. What do you think your professors mean when they ask you to use quality sources of information?  4. What does the term “scholarly source” mean to you?
    • This exercise could be done on any concept.
    • After students have finished circulating, you, the instructor, have an opportunity to fill in gaps, correct misconceptions, etc., but often the students can, as a group, come up with many of the points themselves.
  • Have students draw a concept map, beginning with what they know on a topic, and adding more items as they learn, either from research or from discussion.  This can be done in a group, on whiteboards (again, for a topic like evaluation) or on paper either in groups or alone.  Connecting new knowledge to what students already know should help them retain anything new they’re gaining.  This strategy is a great way to get the conversation going and to activate background knowledge in a consultation as well.

What strategies do you use to access what students already know? Please feel free to share in the comments!

Resources for further reading:

Ambrose, Susan A., et al. How Learning Works: Seven Research-Based Principles for Smart Teaching. Hoboken: Wiley, 2010.

Woloshyn, Vera E., Allan Paivio, and Michael Pressley. “Use of Elaborative Interrogation to Help Students Acquire Information Consistent with Prior Knowledge and Information Inconsistent with Prior Knowledge.” Journal of Educational Psychology 86.1 (1994): 79-89.

TLS Tips: Lowering the Stakes to Teaching with Technology

When I began teaching, incorporating active learning into my class plan was a big step. It meant that I might have to field unexpected questions, realize I didn’t have all the answers, and think on my feet.  [Sidenote: I think this is the perfect librarian job description.]  It meant that I needed to let go of control and share the teaching responsibility with students to truly be more of a guide on the side.
So, if you’re just getting started with the idea of integrating active learning into your teaching, adding in technology may sound a bit overzealous. Anything and everything can go wrong with technology, right? Well, I’d like to share a few examples and techniques that may lower the stakes to using technology meaningfully as part of your pedagogical practice in different kinds of teaching environments and situations. I hope these examples will help illustrate how some of these tools could facilitate not only more active learning but also meaningful dialogue and teaching. And also, there’s a lot that can go right with technology, too!
Using GoogleDocs for Group Work and Collaborative Discussion
Previous TLS tips have name-checked GoogleDocs or GoogleForms for integrating active learning. I’m going to go a little more in-depth to explain how I set this up and why I take my particular approach.
Most, if not all, of the learning outcomes I identify for UGS classes involve source evaluation. As Krystal mentioned in her previous TLS tip, I prefer to have the students do this exploration and discovery on their own in groups and then come together to share their experience. During the larger group discussion, I try to reiterate the most important takeaways of source evaluation.
Here are a few examples of GoogleDocs that I’ve used in the past to get students working in groups:
  • Exhibit 1: Prof. Min Liu’s class
    About the class: 65 min total in a computer lab classroom in SZB. The students worked in groups for 15 min, and the report out took about 30 minutes, which was longer than I originally allotted, but the discussion was really fruitful.
    Document design: This is an openly editable GoogleDoc, so students do not need to log in to edit, which means there are fewer FERPA issues. I selected a topic based on the students’ assignments and then found a variety of sources that would enable us to cover multiple aspects of evaluation. I added the “final answer” of Read It, Skip It, or Cite It to help reinforce the idea that research is an iterative process and that background information can come in many different containers (not just Wiki/Encyclopedia articles). I link these documents to the class’s course guide (in this instance, this one) so they can find everything all in one place.
    How I use it: As the students are working, I have each of the Docs opened in different browser tabs and toggle back and forth between them. I actually project their documents up on the screen so they know I’m paying attention; I think they also like to see how far other groups have gotten, which provides some motivation.  As I’m looking through, I note (mentally, digitally, or analog) which groups have covered a particular point I want to highlight, as well as something I want to discuss further with them. I make sure to start out with one thing they’ve done well, since students can often be shy about sharing and talking about their work in front of the class.

    Changes: Over time I added “Whys” to some of these questions because I wanted the students to delve deeper into their answer. Additionally, this really helped our class discussion because I could see their thought process.

  • Exhibit 2, Form + Responses: Prof. Charumbira’s class
    About the class: 75 minutes total. This was the second of two classes I taught for Prof. Charumbira, and the activity took up the entire session.
    Document design: Since we had already had one session about source evaluation, the second session was focused on getting the students to understand the types of resources available to them.  Each group was assigned a different resource to find using tips from the course research guide, and the students filled out the form with one student assigned as the recorder so there weren’t multiple entries for each group, a bit of a difference from using the GoogleDoc for class activities.
    How I use it: I circulate as students fill out the form; those that have identified their source as a book are free to go into the stacks to retrieve it (I make sure ahead of time it’s in PCL). As in the exercise above, I pull up the Google Spreadsheet and check in, noting some of the points I’d like the group to discuss. In this activity in particular, I also ask the students to provide feedback on the research process so we can talk about that as well. This gives me an opportunity to see what I’ve missed covering and where I need to make changes for next year.

    Changes: I wanted to focus the students on the questions, and using a GoogleForm rather than a GoogleDoc enabled me to do this. GoogleForms limit participation, but I think that also sort of forces students to talk. In the future, I would also definitely think about asking students to fill out the form ahead of time and then discuss their answers in groups or as a larger class if there wasn’t available technology in the classroom.

I hope this gives a little bit more insight into some of the ways that just one form of technology can be integrated into the classroom and can help facilitate discussion. Students are very familiar with the Google Suite of tools so hopefully using this tool won’t be as scary as some others.  If you are interested in creating something similar or have an idea about translating a paper activity into something digital, I’d love to hear about and/or help!

TLS TIPS: Evaluating Sources in the Classroom

This semester I started teaching source evaluation differently and wanted to share this approach in case it can be adapted for use in anyone else’s classroom.

Using the Assignment in Your Classroom

Step 1:  Split the room into groups.   This works really well in PCL 1.124 because they’re already facing each other at tables.  Tell them the first thing they have to do is assign a recorder and a presenter.  This gets their attention.

Step 2:  Explain the exercise and pass it out on a half sheet of paper.  It is interesting to see the types of things students write down, some of which, they will have learned by the end of the exercise, aren’t really helpful (for example, “if it is an .edu you can use it, but if it is a .com you shouldn’t”).   Here is the exercise:

[Image: eval sources exercise]

Step 3:  Have a student explain the exercise back to you.   This way students hear it two ways and it ensures they understand what they are supposed to be doing.  I didn’t do this the first time and they didn’t really get it, but I didn’t know that until they were reporting out.

Step 4:  Assign each group a source.  I pre-pick the sources and put links to them on the SubjectsPlus course guide.

Step 5:  Give them about 7 minutes to do the exercise and then have each group report out one criterion.  As you add it to the board, ask questions and hold a discussion.

My experiences with it

This has worked well in every class in which I’ve used it (all freshman classes, though).  Sometimes I mix up the source types and other times I’ll stick to one or two types.  I tie the types of sources I use to the assignment and learning outcomes for the session.  It can be used as a viewpoint evaluation exercise, a web evaluation exercise, a scholarly versus popular exercise, or a more general source evaluation exercise.

I always do this at the beginning of the class, after I’ve introduced myself, gotten them logged on and told them the goals (LOs) and agenda for the class.  It works nicely as an ice breaker, but more importantly, it lays the groundwork for weaving source evaluation in to discussion of tools.   When they are doing their own searching during class, they can refer to the criteria list they generated and apply it to the sources they are finding.

I think you could do this exercise in a classroom with no technology and just hand out print sources.

In the honors classes I’ve taught, they’ve gotten really into it and don’t want to stop talking.  It brings up all sorts of issues they want to know more about, including evaluating (or arguing with each other about) Wikipedia, figuring out how funding may impact a web site, or figuring out which journals are more important than others (not really a freshman thing, but this exercise has led to that question).  Other times it takes a while because they aren’t quite getting it, but they always do eventually, and I see that they have begun to move away from black-and-white criteria (“all blogs are bad!” “don’t use opinions,” etc.).

It is really fun and establishes a nice connection with the students.  If I start with this exercise, students seem to ask more questions during the rest of the class and seek out my help more readily.

While I would love to know how effective this is beyond what I can learn from anecdotal evidence, I only have that anecdotal evidence right now.  I’d be interested to know what other people’s experiences are if they adapt this exercise for use in their own classroom.

Discussion: Grab bag of conference ideas

I was lucky enough to attend both of these conferences that Krystal mentions. It’s great to be able to send two delegates from our institution because we were able to split up, see more talks between the two of us, share what we learned and bring back twice as much to share with our colleagues here in the Libraries.

In RIOT, our conversation led us to think about how we can present our own assessment findings in a more public way – to students and to faculty. We are all excited at the idea of translating the data we gather into infographics, and tossed around the idea of looking into easy-to-use software, like Piktochart, that could help us out.

We also talked about our own struggles with reaching graduate students. The ideas Krystal brought back inspired us to think about what we can do in our changing space here in PCL to more meaningfully engage with graduate students. We have thus far implemented a few initiatives, including a photo show of research at the Benson, and we would like to keep translating ideas the branch libraries have thought up into the new Learning Commons.

Lastly, in response to our conversation about wanting to focus less on teaching skills (tool-based searching) and more on concepts, Janelle promised to share with us what sounds like a great infographic that explains the cycle of information succinctly.

RIOT: Grab bag of conference ideas

I recently attended two different conferences in the Pacific NW, Library Instruction West (LIW) and ARL’s Library Assessment Conference (LAC). I encountered tons of great ideas and inspiring work at both conferences, and was reminded of how intertwined instruction and assessment really are. While I could probably prattle on for way too long about things I’m interested in trying, I chose three ideas to bring to the group for discussion. I had trouble choosing sessions to discuss that fit a particular theme so I just went with a grab bag approach, figuring we can just talk about whatever catches y’all’s attention the most.

1) Rubric Assessment Project at Claremont Colleges Library

Libguide with project info

I saw a presentation on this project at LIW, and have linked to a poster that was presented at LAC. Librarians at Claremont undertook this research project through the Assessment in Action program in order to determine what impact librarian intervention in first-year courses has on IL performance in student work. They assessed student papers produced in First-Year Experience programs (similar to Signature Courses) using a broad IL rubric, and analyzed the results to see if different levels of librarian involvement in the courses impacted student performance. They found that librarians did positively influence student performance in the first three levels of involvement, but a fourth higher level of involvement had no impact.

I think my favorite aspect of this study is how they are using the results to communicate with their constituents, like in this infographic. I like the idea of using data to communicate our value on campus.


  • What research questions do we have regarding the impact of our instruction?

  • What would be convincing evidence to our faculty?

2) Assessment in Space Designed for Experimentation: The University of Washington Libraries Research Commons

Lauren Ray and Katharine Macy (University of Washington)

See abstract here:

At LAC, I attended a lightning talk on assessing the University of Washington Libraries Research Commons, described in the program abstract as “a space intended to meet collaborative needs, foster interdisciplinary connections and provide a sandbox for innovating and testing new library space designs, service models and programs.” While I was inspired by the way their assessment plan focused on student learning in the space rather than just satisfaction and use, I was also really excited to learn about the programming they do in the space, which is targeted at creating interdisciplinary connections between scholars.

Their Scholars’ Studio series consists of 5-minute lightning talks delivered by grad students and postdocs doing research related to whatever the interdisciplinary theme of the evening is. Example themes include “predictions” and “Pacific Northwest.” The talks are followed by a Q&A and a reception. Before the event, students are also provided with guidance on distilling their research into a short talk and presenting to an interdisciplinary audience.

The presentation also covered Collaborating with Strangers workshops, in which students, faculty, and researchers sign up to connect with one another in 3-minute speed meetings – like speed dating for research. Each session is organized around a particular interdisciplinary topic, such as food research, so that participants can connect with others who have similar interests.

In one-on-one interviews with past graduate student presenters from the Scholars’ Studio series, librarians learned that the program helped participants rethink their research, consider how other disciplines would approach what they do, and become more concise in talking about their research. I thought that these were both interesting ideas to consider as we think about ways to include graduate student services in our Learning Commons plans.


  • Could we adapt these ideas to fit in our Learning Commons plans?

  • How can we ensure that we assess student and researcher learning in the new spaces and programs we’re designing?

3) Teaching “Format as Process” in an Era of Web-Scale Discovery

Kevin Seeber (Colorado State University-Pueblo)

Slide deck:

In this session at LIW, the presenter focused on how he shifted from teaching skills based on database navigation to teaching the publishing processes that lead to different formats of information. He stated that “instruction tied to static interfaces, locations, or appearances is not sustainable” because technology and delivery systems change so rapidly. I liked an activity he uses to start instruction sessions: he hands out cards with different formats on them (scholarly journal article, blog post, etc.) and has students arrange them from least to most editing, research, time to publication, and other attributes, then uses that as a launching point for discussion of different formats and their uses. This seems like a nice way to discuss source evaluation, as it gives students skills that last beyond university library-based access to information and sets them up for critical reflection that extends beyond the source itself to funding models and how sources are distributed.

I often find myself trying to spend less time discussing interfaces and the like, and am planning on challenging myself to cut down on time spent doing demos even more this fall. I also thought that this was a good example of the pending new ACRL standards being put into action.


  • What ideas do you have for teaching “Format as Process” and other threshold concepts while still making sure students know how to access the materials in question? How can we work together to develop strategies?

  • Now that we’ve had scoUT for a while, how do you see it affecting (or not) students’ understanding of information formats?

RIOT: Political Blogs

Information Literacy in the Study of American Politics: Using New Media to Teach Information Literacy in the Political Science Classroom
Behavioral & Social Sciences Librarian, Volume 32, Issue 1, 2013
I chose this article because it looks at an interesting collaboration between a librarian and a political science professor. It also challenges my thinking about how to present evaluative criteria for resources. Given the rise and ubiquity of political blogs, news aggregators, amateur journalism sites, and social networks, it’s important to think about how to use them in teaching information literacy: “the new media environment for covering American politics is a chaotic blend of independent bloggers, Internet media aggregators (e.g., The Huffington Post), social media networks, and traditional news organizations with a Web presence. In this context it becomes necessary to think about IL more as a group of methods for thinking about and analyzing the claims made by variegated information sources than as a set of skills that can be taught divorced from a disciplinary engagement with the information content.”
The authors describe an assignment in which 12 undergraduate students examined a competitive congressional race. They were instructed to look at a number of variables, like fundraising info, campaign tactics, advertising, and media coverage, and to consider local political history and demographic info for context. A challenge the authors saw was that while it was easy for students to find bits and pieces of news information related to the assignment, they had difficulty critically examining the claims or their sources, distinguishing between different types of content (for example, a highly polemical blog post vs. an empirical analysis), and then synthesizing that info into a coherent and original analysis.

Based on their findings, they came up with 4 categories or types of students. These categories are fluid, and it’s probably not accurate to divide all students into these 4 neat groups, but I think they are instructive in giving insight into how students might engage with these new media and other information sources.

The Believer (4): Takes all news sources as trustworthy. There is no attempt to judge the verity of claims either in the context of the news item itself, or on any understanding of the institutional platform from which the reporter is writing.

The Cynic (4): Claims that nothing written about a campaign can be trusted. In the competition to win an election, candidates and their campaigns will distort facts to win. All reporting about the campaign is similarly biased, where amateur and professional journalists have some agenda that favors one side or the other.

The Opportunistic Surfer (2): Takes satisfaction in the easily available and diverse sources of information available to the technology-savvy researcher. The benefit of access to information is not so much for deeper analysis but to use the technology to find easier ways to collect …

The Discerning Analyst (2): Can navigate through all types of information sources and can evaluate the veracity of claims using disciplinary tools and concepts from history, political science, and current affairs. That is, the analyst can draw on recent historic events like previous …

These new media sources can provide an amazing array of opinions and viewpoints on current events and policy developments that were not available 10 years ago.

Questions / Points of Discussion:

What has your experience been like working with new media in information literacy sessions?

Discussion: Cathedrals and Bazaars: Discussing Scholarly Publishing and Open Access with Undergraduates

Elise’s RIOT post led to a rich discussion of the limitations of the one-shot instruction session, how to discuss the economics of information in that limited time period, and how we can work with faculty across disciplines to help students understand and evaluate different models of scholarly publishing.

Elise mentioned the Library Class that she and Shiela Winchester developed to discuss scholarly publishing practices.  The session was developed with graduate students in mind, but Elise wondered what it would look like to redevelop the session for an undergraduate audience, and what changes would be necessary for an effective discussion of topics like open access with that audience.

As Elise mentioned in her post, the authors suggest that when we spend so much time teaching about how to find information and use tools, we don’t have time to talk about all the nuances of the scholarly conversation.  Time is a barrier to explaining such a complicated issue.

A few threads of the conversation that followed:

  • How can we address the journal pricing crisis in a way that resonates with undergraduates?
  • When we tell students that they should use our great resources because they won’t have access to them after they graduate, how does that make the tools and information meaningful to students who won’t continue as scholars?
  • Elise mentioned using the Peer Review in 5 Minutes video from NCSU on research guides to embed this information in the support she provides courses.
  • The publishing model of Wikipedia can be an entry point into this discussion with undergraduates.
  • Discussion of authority led to deeper discussion of publishing models and why information is being published in a certain place.
  • Kristen mentioned getting students to try out different searches in different tools and evaluate the results without making distinctions about whether they were provided by the library or another information service.
  • April talks a lot about evaluating business and government resources that are often free and open.  Students crave a checklist and don’t necessarily want to deal with the nuances of critical evaluation.
  • Kristen likes to see the discussion in assignments of seeking the “authoritative source” rather than an emphasis on a “peer-reviewed source.”
  • Brittany talked about how her work with public relations students requires discussions of corporate responsibility and communications.  For example, when evaluating PR literature, it’s important to understand the relationships between brands and corporations: Dove’s empowerment messages for women become even more problematic when you recognize that the same company owns Axe Body Spray.  Evaluation becomes an endless series of asking “Why?” and/or “So what?”
  • The discussion ended with some consideration of how students struggle to recognize formats and how this is complicated even more with new publishing models, like open access journals and repositories.  As formats for scholarly publishing change, how are they impacting citation practices? Students already struggle to follow style guidelines for websites versus newspapers published online, for example.