Tag Archives: critical thinking

Discussion: Alignment of Research and Instruction

We met to discuss April’s post about the article, “Reinventing the Library’s Message through the Alignment of Research and Instruction,” a project by librarians at the Vanderbilt University graduate school of business.  The project described in this article involved the librarians choosing three broad objectives they thought all of their students should understand and that they could all commit to teaching, using similar language.  Our conversation revolved around two of those three objectives: information has value and research is a process.

Information has value:  The group discussed when and how we talk about the value of information, including the price tag associated with it, and that this resonates with students outside of business schools as well.  Krystal gave the example of how she uses this concept in UGS classes.  Before discussing databases, they discuss how and why you can’t get everything for free on the Internet, which sets the stage for understanding that different information lives in different places and helps students decide where to search.  Others talk about the actual cost of certain databases in their classes to show the value of this information.

Research is a process:  The group spent the majority of its time talking about this objective.  We know that students don’t think like librarians, but we also think it is important to teach them that research is a process – that the more you practice, the better you’ll become – and that we, as librarians, are able to do some of these things reflexively and think in certain ways because of practice.  We should teach students to think differently about the research process so they can improve.

One of the ways people teach the process is to start by asking students “who cares?”  This helps them decide who would collect or create the data/information so they know where to begin looking for it.  There were numerous examples of how people incorporate this into a class, such as Laura’s Art History example.  She asks students to consider who would care about a piece of artwork besides art historians.  It helps them move beyond their discipline and understand that “art doesn’t live in a vacuum.”  Kristen frames it as a “reflective research process” where students are asked to consider who is talking about the topic and map it to databases and research guides before starting their searches.

The group also talked about how there is some resistance from students because just using Google and simple searches has worked for them.  They look at librarians as unnecessarily complicating things.  This led to a discussion of how students don’t really understand what a college or university is and what faculty and librarians do.  Instead, they see college as a place to get a degree so they can graduate and get a job.  Faculty and librarians, however, are trying to teach critical thinking skills, which are what will help students succeed in work and in life.  We discussed ways we can explain what college is and what a research university is so students can understand why they are asked to go beyond what worked in high school and how their work fits in with the mission of a university.  This ties in with understanding and evaluating scholarly sources, and we had our usual discussion about how difficult it is to teach source evaluation.

We ended with a discussion about alternative ways to show our value and our learning objectives to our students.  We agreed that some of the information literacy threshold concepts apply here, such as authority is contextual, information has value and research as inquiry (research as a process).  One idea was to make posters/infographics showing our objectives and the value of what we have.  This is something we may explore further in the spring.


Teaching About the Literature

Here’s a scenario:

A local professor has published an article that has led to controversy. Other professors, both from your institution and elsewhere, have published a flurry of responses to this article. A cursory Google search shows that it’s even appeared in several news outlets.
You face a class full of undergraduates. How can you help them make sense of/think critically about this controversy?
Well, initial critical thinking relates to IL Competency Standard #1—figuring out what to ask. But further critical thought must be fed with evidence.
In the shift from BI to ILI, librarians have somewhat moved away from teaching specific tools and techniques to teaching more big-picture stuff. That’s not a bad thing, but check out what Bodemer (2012, p.337) says about what professors want:

First, while faculty may say that they want students to find good, scholarly sources for their papers, what they ultimately want is for students to learn how to find such sources. Moreover, though never put in such broad terms, they also want students to exclude sources.

I’m going to say that students in such cases need to be taught specifics—not just to find what they specifically need but how to think and explore the literature. In the case of a controversy like this, the novice researcher needs context. A controversial study like this, with findings that are at variance with the previous literature, needs careful critiquing, in a systematic way that isn’t just savaging it on PC grounds.
I don’t see anywhere in the IL Standards where Knowing About The Literature is explicitly mentioned. But in a situation like this, it’s exactly what students need to know. What are the norms of literature in this discipline? What’s the shape of the literature? Some of this can be known without extensive reading. Some of this might be known in a quantified way, via thorough searching.
How many other studies are on the same/similar topics as Regnerus’s? What proportion of social science articles on this topic have this philosophical or methodological approach? What is Regnerus’s methodology anyway? After reading criticisms of Regnerus’s article, students might need to know, for example, how to read his article to identify its methodology and find search terms they can use to find other Social Science Research articles using that methodology.
Teaching students about the literature—faculty I know aren’t doing it. I don’t typically get to do it. How about y’all? How do we work it in, and how do we make the case to faculty that we should be teaching this?

Bodemer, Brett B. (2012). The importance of search as intertextual practice for undergraduate research. College & Research Libraries 73(4): 336-348.
Brown, Matthew. (2012). Social scientists defend Mark Regnerus’ controversial study on same-sex parenting. Deseret News.
Olson, Walter. (2012). The Regnerus Gay-Parenting Study: More Red Flags. The Huffington Post.
Osborne, Cynthia. (2012). Further comments on the papers by Marks and Regnerus. Social Science Research 41(4): 779-783.
Regnerus, Mark. (2012). How different are the adult children of parents who have same-sex relationships? Findings from the New Family Structures Study. Social Science Research 41(4): 752-770.

Critical Thinking – Information Literacy – PotAYto – PotAHto?

Schroeder, R. (2012). Merging Critical Thinking and Information Literacy Outcomes — Making Meaning or Making Strategic Partnerships? in C. Wilkinson & C. Bruch (Eds.), Transforming Information Literacy Programs: Intersecting Frontiers of Self, Library Culture, and Campus Community (131-151). Chicago: Association of College and Research Libraries.

Before selecting this chapter, I skimmed a few others, and I’m pleased to say that I think this book is worth picking up for any instruction librarian interested in viewing their instruction program within a wider perspective. As the subtitle notes, the focus of the book is on the many ways information literacy theories and instruction intersect with and are influenced by a variety of factors.

I selected the chapter that I did because I thought it would have wide appeal to our cross-departmental group of RIOTers, but also because we have all been part of conversations where the idea of “critical inquiry” has been discussed as a university initiative.

The crux of Schroeder’s argument is that librarians conflate information literacy with critical thinking skills without defining what we mean by critical thinking. However, many other fields, such as education, philosophy or psychology, do the same thing. Not surprisingly, when scholars in those aforementioned fields do define it, they do not all agree on the same definition; on the contrary, their own subject expertise influences what skills they think essential to critical thinking in their field.

To prove this point, Schroeder recalls a Delphi study conducted by Peter Facione to find out if there was consensus on a definition of critical thinking.

What struck me most about the consensus definition that emerged was that it completely summarizes all of the expectations that our seasoned faculty have of their students. It also speaks to our continued conversations about managing our expectations with respect to creating an assessment plan, and about communicating these expectations when talking to faculty.

Schroeder continues his research on the written relationship between information literacy and critical thinking by discussing Craig Gibson’s 1995 article, “Critical Thinking: Implications for Instruction,” and Dean Cody’s 2006 article, “Critical Thoughts on Critical Thinking.” Both articles highlight what Schroeder identifies as a controversy within the library instruction movement in the 1990s, termed the “back-to-basics” movement.  There were two factions of IL instructors: those who thought information literacy and the conceptual ideas behind it deserved a place in the classroom (The Sharks) and those who argued that what is taught in the classroom ought to be practical (The Jets). While most library instruction programs (fingers crossed!) have evolved past the idea that we should focus only on practical skills (and I would argue that teaching the conceptual is practical), this conversation again made me think of the way in which some faculty perceive our value as just a practical means to an end. This very idea is often why it’s so hard to explain and illustrate our value: we teach the conceptual intertwined with the practical.

Moving beyond the literature, Schroeder then identifies five libraries that have written information literacy and critical thinking into their campus-wide initiatives. He works through each (including his home institution’s), analyzing the vagaries of each. While it’s interesting to see how, perhaps unconsciously, our field has integrated the idea of critical thinking into our own language and communications without necessarily defining it (or does it really need to be defined?), the follow-up survey conducted via the ILI listserv (anyone take it?) was even more revealing. In it, Schroeder asks librarians to reflect on what consequences there may be from such a merged outcome, using the following questions:

1. What is information literacy’s relationship to critical thinking?
2. Reactions to a merged information literacy/critical thinking campus outcome: “Students will develop skills to strategize, gather, organize, create, refine, analyze, and evaluate the credibility of relevant information and ideas.”
3. What is missing from the wording of the critical thinking outcome used in this survey?
4. What are the advantages of having the critical thinking outcome mentioned above?
5. What are the disadvantages?

Tomorrow I’ll discuss in further detail the findings of the survey, as well as the takeaways Schroeder draws from using shared info lit/critical thinking outcomes, but I would like to know what you think about blending this terminology. What’s your definition of critical thinking? How, if at all, is it affected by your subject area? How does the merging of critical thinking and info lit affect our strategic vision as librarians in a faculty-run world?

Search process as web evaluation

“I usually click on the first thing that I see.” Asked to clarify how she decides to pick the first result, she emphasized, “Well, I know the ones that are […] in here [pointing to the shaded Sponsored Link section on a Google results page] they’re the most relevant to what I’m looking for.”

Article: Young Adults’ Evaluation of Web Content


I chose this article because of its relevance to what we all do as instructors in today’s climate – attempting to reach students and speak a language that they understand, in terms of how they look for information and how they find and decide to use it.

This study is somewhat unusual in that, instead of looking at the steps people use to evaluate sources after they locate them, it focused on how they arrive at those sources in the first place as a component of evaluation. I don’t think the information is necessarily revolutionary – any of us could probably pull similar answers out if asked – but this study is nice in that it validates what we know but don’t always have data to back up, and it is research done not by librarians but by communications researchers, all of whom focus on new media.

If you want a good down and dirty overview, here is an entry about it from ReadWriteWeb –


The research was conducted like a usability test – researchers observed students doing searches for answers to canned questions, audio was recorded, and there was some follow-up.

The reason they focused on how sources were arrived at in the first place was that they felt sources were chosen as much because of where they fell in search results as because of content evaluation. Some of their related findings that are relevant to us:

  1. Over 25% of respondents chose sites because they were the first result in a search engine, not because of the site’s content
  2. Only 10% of respondents even mentioned an author or credentials when completing tasks (“However, examining the screen captures of the tasks being performed makes it clear that even among those participants, none actually followed through by verifying either the identification or the qualifications of the authors whose sites gave them the information they deemed to be providing the best solution for the tasks at hand.”)
  3. Many respondents expressed trust in .org sites, saying that they were more trustworthy than .com sites.
  4. Among our sample of 102 participants, overall 60% stated, at one point or another, that they would contact an institution such as a university or governmental agency for information. Broken down by method of contact, 52% of the sample suggested placing a phone call, while 17% said they would send an email to the organization. Professionals, both medical and educational, were second on the list of those whom participants would contact offline, with a fifth of our respondents suggesting that they would pursue this method.
  5. One of their conclusions was that instructors “must start by recognizing the level of trust that certain search engines and brand names garner from some users and address this in a way that is fruitful to a critical overall evaluation of online materials.”

The questions this article made me think about relate to what we can do to speak to our students about web evaluation in a way that is relevant to them, and to include more of a focus on the search process itself as a component of evaluation. This is where I would like to focus RIOT discussion . . .

Domo Arigato – Strategies for the Chat Reference Interaction

As librarians, especially teaching librarians, we want to be everywhere at once in an effort to provide assistance in the times and places where students and faculty encounter the most frustrating or beguiling of problems. Students and faculty demand help that is both immediate and convenient. As a result, our digital reference service is heavily used and heavily marketed. We’ve all seen students get stoked when we mention that librarians are available late night (and online) for their research needs. But I would be remiss if I didn’t admit that there are times when I feel frustrated with my ability to provide good instruction over chat and to assess the quality of the reference interaction.

Megan Oakleaf and Amy VanScoy had similar issues, it seems, because in the most recent issue of Reference and User Services Quarterly they published an article entitled “Instructional Strategies for Digital Reference: Methods to Facilitate Student Learning,” which analyzes a year-long collection of chat transcripts to reveal strategies for providing instruction and guiding student learning in a digital reference transaction.

While instruction and reference departments are often seen as separate, integrating instructional strategies and concepts into reference transactions means that students get a two-for-one: the information they need, delivered in a highly effective way at their specific time of need – perfectly tailored, just-in-time learning.

Oakleaf and VanScoy focused specifically on strategies that can be employed within the digital environment. Integrating educational theories such as metacognition, constructivism and active learning, and social constructivism, they derived eight strategies, listed below:

from Metacognition

1. Catch Them Being Good:  Oakleaf and VanScoy note this as a way to “reveal to students that information-seeking is not random, but rather has a logical problem-solving process,” which can be as simple as thanking a student for coming to chat to ask a question, since this acknowledges they had a question they could not answer on their own. Complimenting student behavior reinforces positive behaviors that students may have brought with them to the reference transaction and may ease any anxiety they have about not having found the answer on their own.

2. Think Aloud:  Narrating the process – showing our work, and how even we may fail when trying to find an answer – demystifies the research process for students. It’s no longer something they can’t do, and it shows that research really is a recursive process that requires patience and, oftentimes, another brain.

3. Show, Don’t Tell: The authors talk about using “co-browsing” to share your screen, as well as pushing URLs, tutorials, or videos, or asking students to open a browser window and follow along, narrating their own steps. Not only does this integrate activity and keep the student interested, it also accommodates multiple learning styles.

4. Chunk It Up:  Because we most likely cannot stay with the student until they have completed their task, it may be helpful to prepare them for the road ahead, especially if there are particularly tricky tasks coming up.  The authors also describe “chunking it up” as a way to give students time to practice the librarian’s guidance while being able to come back and ask questions when they have completed one task.

from Constructivism & Active Learning

5. Let Them Drive: Invite students to describe what their research process has consisted of and where they began, and allow the librarian to be the guide. This, of course, is not new to us in our instruction sessions, but it can sometimes be tricky in a digital setting – especially without co-browsing.  This discussion-based chat reference interaction allows the librarian to guide the student to see “patterns, ask relevant questions, and encourage reflection.”  It also allows the librarian a peek into the student’s research process, much as #2, Think Aloud, allows the student into our thinking process.

from Social Constructivism:

6. Be the Welcome Wagon: The idea here is to initiate students into the new community of information-seekers of which they are now part, but it also calls upon librarians to acknowledge that students bring something to the table as well.

7. Make Introductions: Acknowledging that there are others with expertise who may serve students better demonstrates that there is a wide community of people with specializations that can help answer a student’s question in an efficient manner. Again this also illustrates that there is a system in place for problem-solving.

8. Share Secret Knowledge: Our acronyms stump people, as do our subject headings and other librarian-speak that seems natural to us but confusing to others. We can demystify the library’s inner workings and share tips and tricks in order to cultivate inquiry and understanding in novice researchers.

The results of this specific analysis can be found within the data preparation and results sections of Oakleaf and VanScoy’s article, so I will turn my attention to what I find the most thought-provoking piece of the article: the idea that digital reference can in fact employ very similar strategies to an in-class/in-person information literacy session. This may seem an obvious point, but I rarely stop to reflect on my digital reference techniques – especially if chat is particularly busy. The integration of multimedia, such as specifically tailored how-to video captures, has gone a long way toward perfecting the “Show, Don’t Tell” technique to engage multiple learning styles.  Using a keyword/brainstorming grid can illustrate abstract concepts that are hard to verbalize in a short IM conversation, but the interview process of asking students to think critically about what they find during a chat session seems a lot harder without facial expressions and intonation. How do we draw students who may come to IM chat for a quick answer into a longer reference interaction?

Ideas for Reflection

  • Do you use any of these strategies during your chat reference?
  • Are there concepts, descriptions or analogies that you use during an instruction session that you’ve employed in digital reference?
  • What have been the most/least successful?
  • Anything that you’ve been wanting to try but haven’t figured out a way to do so?

Helping students evaluate

A colleague sent round this interesting blog post the other day:

Anderson, Kent. Improving Peer Review: Let’s Provide An Ingredients List for Our Readers. The Scholarly Kitchen, March 30, 2010.

Anderson wants articles to include more information about the process of peer review – reviewers’ credentials, number of revisions, etc. – to help readers distinguish more rigorously reviewed work from less rigorously reviewed work. He writes:

“Here are some potential categories I’d like to see:

  • Number of outside reviewers
  • Degree of blindedness (names and institutions eliminated from the manuscript, for instance)
  • Number of review cycles needed before publication
  • Duration of the peer review portion of editorial review
  • Other review elements included (technical reviews, patent reviews, etc.)
  • Editorial board review
  • Editorial advisers review
  • Statistical review
  • Safety review
  • Ethics and informed consent review”

While it might inform practitioners, would this information help students evaluate material?

What about this sort of information (from Mosby’s Nursing Consult)?

“Levels of Evidence
Studies are ranked according to the following criteria:
Level I    All relevant randomized controlled trials (RCTs)
Level II   At least one well-designed RCT
Level III  Well-designed controlled trials without randomization
Level IV  Well-designed case-controlled or cohort studies
Level V    Descriptive or qualitative studies
Level VI   Single descriptive or qualitative study
Level VII  Authority opinion or expert committee reports”

How useful will such guidelines be for students who lack subject expertise? Debra Rollins’ recent post on ILI-L, on the thread “evaluating resources,” considers who should best deliver this aspect of IL instruction.

Burrowing beyond the surface

Piggy-backing off of AJ’s recent post about the “future of instruction,” I thought that, in the context of our need to begin supporting many more large-format (lecture) courses, a RIOT on our “real future” of instruction is pertinent. In addition to our recent meeting with Undergraduate Studies, I was fortunate to attend a session at the First-Year Experience Conference by Dr. Wren Singer of the University of Wisconsin-Madison, entitled “Engaging First Year Students in Large Classes: What Works, What Doesn’t and Why?”

Though Dr. Singer’s approach is more of an anthropological study of the faculty who teach UW-Madison’s First Year Experience classes, her bibliography offered up one article in particular that we could reflect upon while designing our approach to next year’s UGS courses. This article by C. Herrington and S. Weaven, entitled “Action Research and Reflection on Student Approaches and Learning in Large First Year University Classes,” offers ideas for motivating students in discussion sections in order to create a positive, enthusiastic learning environment.

At first I thought this was more the terrain of the Learning Center, since they are best equipped for teaching TAs instructional and pedagogical skills. However, the article then begins to discuss the different ways that motivation plays into deeper student learning versus mere surface learning. Surface learning, in contrast to deeper learning, is the idea that students just want to know the “right” answer. In addition to acknowledging that everyone has different learning styles (and teaching styles, for that matter), Herrington and Weaven posit that such surface learning is most likely a product of past experience within the classroom.

According to Herrington and Weaven, action research is the solution to creating an engaging classroom environment in a large-format course. Action research is “a form of self reflexive enquiry undertaken by participants in social situations in order to improve the rationality and justice of their own practices, their understanding of these practices and the situations in which these practices are carried out.” In practice, it essentially means recognizing the trial-and-error process of inquiry-based learning, and it is supposed to help shape the student’s understanding of the iterative process of learning and research. Seemingly, the keyword brainstorm grid would be a perfect way to document just such a trial-and-error process.

Herrington and Weaven also point out that students are extremely adaptable to their learning environment; however, there are specific cognitive strategies that students may employ to cope with their environment which most often do not lead to deeper student learning:

“Apart from regulation and approaches to learning, a related stream of research has investigated the differences in the way that students manage their studies in response to challenges in academic environments (Kivinen & Nurmi, 2003). These cognitive strategies refer to “cognitive, affective and behavioural processes people apply to achieve their goals and to evaluate the outcomes of their actions” (Heikkila & Lonka,2006, p. 102). Three main types of strategic processes have been introduced including illusory optimism (students striving for success), defensive pessimism (students with low expectations of future performance) and self handicapping (students who focus on task irrelevant behaviours to justify likely task failure) (Eronen, Nurmi, & Salmela, 1998).

Although illusory optimism and defensive pessimism have been shown to be successful strategies in higher education (Eronen et al., 1998), we did not know initially what cognitive strategies our students were adopting. Consistent with recent research (Heikkila & Lonka, 2006), we concluded that an integrated framework investigating the interplay of SAL, SRL and cognitive strategies was needed to ascertain differences between students’ motivation, situational and contextual thinking prior to, and following changes in their learning environment.”

So what exactly does all of this have to do with us? As I continued to read this article, it dawned on me that multiple student learning styles and classroom coping systems mean that students aren’t focusing on the task at hand, but are trying to slide by it. So why aren’t we just changing the task?  It’s easier when we have input or full control over the types of research assignments or activities to do in class, but once we are brought in as “information literacy” consultants, how do we wield our influence not only to provide activities that engage students in a deeper, more critical thinking process, but also to reach the TAs/AIs with whom we work?

So, with that being said,

  • how do we create IL activities that aren’t necessarily task-oriented (even a source analysis sheet can seem task-oriented)?
  • are there things we can do as librarians to help TAs/AIs become more student-centered or help create a student-centered environment?
  • have you noticed ways in which students “cope” with learning library skills? Are there ways you’ve tried or found that at least help them morph from surface to deeper learning?

International students and source evaluation

  • “How Helping Chinese ESL Students Write Research Papers Can Teach Information Literacy,” Mei-Yun (Annie) Lu, Journal of East Asian Libraries, No. 141, Feb 2007, pp. 6-11.
  • “The Adaptation of Asian Masters Students to Western Norms of Critical Thinking and Argumentation in the UK,” Kathy Durkin, Intercultural Education, 19(1), Feb 2008, pp. 15-17.

Last week, Meghan, Cindy and I went to RHE 398T (the pedagogy class for Assistant Instructors new to teaching introductory writing) to introduce support materials for unit 2 and talk about how unit 1 was going.  One Assistant Instructor talked about the difficulty he was having with his international students.  He couldn’t seem to find a way to teach them how to evaluate sources because they just didn’t seem prepared to think critically about them. After a somewhat unsatisfying conversation (nobody really had any decent suggestions for him), I decided to look a bit into this issue in the library and education literature.  The above two articles offered interesting insights into the issue this Assistant Instructor presented, but have implications beyond just web evaluation in the introductory writing classroom.

UT Environment:  According to the 2009 preliminary enrollment figures, 9.1% of the UT student population is made up of international students.  Just over 3% of undergraduates are international students and just over 25% of graduate students are.

Because international students are such a varied group, I chose to focus on East Asian students (Chinese especially) to narrow the field a bit.  These two articles do a nice job of describing the cultural norms that East Asian/Chinese (depending on the article) students bring into the Western classroom.  A few items that seem relevant to information literacy instruction and active learning include:

  • emphasis on basic knowledge, memorization and repetition rather than defining and answering your own question
  • teacher as authority figure (not facilitator; not constructivist approach)
  • emphasis on harmony, not disagreeing, not voicing opinions
  • there is a right and a wrong answer; don’t question

This is obviously quite different from the Western classroom.  For example, the Signature Courses curriculum – courses designed to support first-year college students in their transition to college academic expectations – very explicitly emphasizes learning to disagree with one’s classmates and professors, albeit civilly (the opposite of harmony and keeping your opinion to yourself); self-reflection through writing, discussion, and presentation; and learning to become a college student who can define and then explore his or her own intellectual questions.  In other words, the facets of critical thinking that we are trying to teach freshmen as they transition from high school to college come as culture shock to East Asian students.

The first article talks about how librarians can help students transition to this Western style of self-directed learning through the research paper assignment.  The author illustrates how the process of research paper writing can help international students begin to learn to question, evaluate and synthesize knowledge.  It’s interesting and makes a good point, but doesn’t have much in the way of practical ideas librarians can use.

The second article, which isn’t in the library literature, doesn’t really propose a solution.  Instead it examines a group of East Asian Masters students adjusting from their own culture’s approach to teaching and learning to the Western approach and determines that these students, often by choice, never make it all the way there.  They choose a “middle path” that allows some questioning and evaluating but doesn’t give up on harmony.

Some questions to consider:

– Given the fact that a chunk of our students may be having this cultural experience, how can/should/does it impact information literacy and critical thinking instruction?

– If these students don’t like to talk, have opinions, disagree, etc., how do our standard approaches to source/web evaluation work in the classroom?

– If these students aren’t taught to question authority and don’t value doing so, how do we find common ground to talk about source evaluation?

– If these students are used to being lectured to and then memorizing what they’ve been told, where does active learning fit in?