Category Archives: Lifelong Learning

DART Recap – scoUT Discovery Tool

Yesterday we launched the new platform for our professional discussion group, Discussions About Resources and Teaching (DART), formerly known as RIOT.  Motivated by feedback and transitions within the department, this change better reflects our current structure and goals as a community of practice.  Thank you to everyone who participated in kicking off DART!

Our topic for discussion was teaching with web-scale discovery tools like scoUT. To gather different perspectives, participants were invited to read one of three articles beforehand:

“Teaching Outside the Box: ARL Librarians’ Integration of the ‘One Box’ into Student Instruction,” College & Research Libraries

“Beyond Simple, Easy, and Fast: Reflections on Teaching Summon,” College & Research Libraries News

“Teaching ‘Format as Process’ in an Era of Web-Scale Discovery,” Reference Services Review

We began with a round robin to share how, or in what capacity, people are using (or not using) our discovery tool scoUT. Perhaps not surprisingly, responses varied widely. Some people actively use it as a teaching tool in classes, while others mention it briefly, do not teach with it at all, or find themselves engaging with it more at the reference desk than during instruction sessions. People mentioned using it for developing topics; searching by citation; refining vague reference requests; finding book reviews; and locating material on obscure subjects or with very specific search phrases found only in full text. Also, it seems that few people actually call the tool scoUT when talking about it with students, referring to it instead with names like the “all tab” search, the “main library” search, or the “big box” search.

Interestingly, some of the features that make scoUT useful are also what can make it challenging. For example, it is helpful in retrieving sources on obscure, specific, or seemingly unrelated topics because it searches across so many collections at once. Yet that same breadth means it often returns a deluge of results, which people said can be difficult or overwhelming to deal with.

After the round robin, much of our discussion stemmed from the third article, which explored the concept of teaching format as process and how web-scale discovery tools factor into this approach. When searching online, sources become decontextualized; content is separated from its package, so visual indicators cannot necessarily be relied upon. Guiding students to consider the creation process inherent in different source types can help foster higher-level critical thinking and evaluation skills. A tool like scoUT, which requires sifting through a large number of different, and at times random, source types, presents an authentic opportunity to discuss and hone these skills. However, this depth of engagement takes time and can be difficult to achieve in a one-shot instruction session.

There was also a general consensus that whether or not we are teaching scoUT directly, students are going to use it. Not only is it the first, obvious search box on the website, but it also has that familiar Google-like quality that will draw students toward it. So if they are going to use it anyway, it only makes sense for us to think about how we can teach them to do so in a discerning and productive manner that will serve them even outside of school.

It was great to hear at the end of the discussion that several people felt interested and inspired to find new ways to incorporate scoUT into their teaching practice. Thanks again for an insightful and engaging first DART!

Do you have an article or topic you would like to bring to DART? Feel free to contact Elise Nacca with any ideas and feedback!


Kulp, C., McCain, C., & Scrivener, L. (2014). Teaching outside the box: ARL librarians’ integration of the “one-box” into student instruction. College & Research Libraries, 75(3), 298-308. doi:10.5860/crl12-430

Cardwell, C., Lux, V., & Snyder, R. J. (2012). Beyond simple, easy, and fast: Reflections on teaching Summon. College & Research Libraries News.

Seeber, K. P. (2015). Teaching “format as a process” in an era of web-scale discovery. Reference Services Review, 43(1), 19-30. doi:10.1108/RSR-07-2014-0023


“…how we teach in the classroom can be as important as what we teach…”

In July 2015, David Gooblar wrote a pithy Chronicle Vitae column, the crux of which is that we should sometimes “model stupidity” for our students. Gooblar cites a couple of other short pieces, notably “The Importance of Stupidity in Scientific Research,” that support the idea that students need instructors to pull back the curtain on their learning process.

“Modeling stupidity,” as Matthew Fleenor writes in a 2010 article for Faculty Focus, “is one of the best ways we can provide an example to our students. It’s important for them to understand that learning involves seeking out the gaps in our knowledge.”

One commenter on Gooblar’s column prefers to call this “modeling curiosity,” but points out the particular pitfalls of exposing ignorance in the classroom for female instructors. That’s almost certainly an issue for instructors of color as well. And it may well pose concerns for instruction librarians, who are already regarded by students as “guest speakers” instead of as experts. Yet clearly, students need to know how instructors recognize and deal with their own ignorance.

Probably we’ve all had an “I don’t know” experience in one-on-one reference encounters, and have pursued answers/solutions with some measure of poise. But “I don’t know” in front of a classroom full of students is a whole ‘nother ball of wax.

How do instructors get comfortable with saying “I don’t know,” and what action plan follows?

Can instruction librarians model stupidity in the often limited time we have with students?

Is there a cost to authority if an instructor models stupidity?

Is it possible to provide faculty with examples of opportunities to model stupidity with respect to literature searching and information resources?

Fleenor, Matthew. (2010). “Responding to student questions when you don’t know the answer.” Faculty Focus.

Gooblar, David. (2015). “Modeling the behavior we expect in class.” Chronicle Vitae, Pedagogy Unbound.

Moore, Katherine. (2015). Comment on “Modeling the behavior we expect in class.” Chronicle Vitae, Pedagogy Unbound.

Schwartz, Martin. (2008). “The importance of stupidity in scientific research.” Journal of Cell Science, 121(11), 1771.


How College Graduates Solve Information Problems Once They Join the Workplace

Project Information Literacy Research Report: “Learning Curve: How College Graduates Solve Information Problems Once They Join the Workplace” by Alison Head, October 2012

This report includes data from UT graduates who participated in focus groups back in April (Thanks to Michele for doing all the coordination around this!).

An exploratory study of the post-college information-seeking behaviors of 33 graduates from four different programs, this report provides some interesting insights into what employers hope their staff will be able to do and how the information literacy instruction provided in college might align with those expectations. Rather than attempting to paraphrase the findings, here they are as found in the report:

“The major findings from our interviews and focus groups are as follows:

1. When it was hiring time, the employers in our sample said they sought similar information proficiencies from the college graduates they recruited. They placed a high premium on graduates’ abilities for searching online, finding information with tools other than search engines, and identifying the best solution from all the information they had gathered.

2. Once they joined the workplace, many college hires demonstrated computer know-how that exceeded both the expectations and abilities of many of their employers. Yet we found these proficiencies also obscured the research techniques needed for solving information problems, according to our employer interviews.

3. Most college hires were prone to deliver the quickest answer they could find using a search engine, entering a few keywords, and scanning the first couple of pages of results, employers said, even though they needed newcomers to apply patience and persistence when solving information problems in the workplace.

4. A majority of employers said they were surprised that new hires rarely used any of the more traditional forms of research, such as picking up the phone or thumbing through an annual report for informational nuggets. Instead, they found many college hires—though not all—relied heavily on what they found online and many rarely looked beyond their screens.

5. At the same time, graduates in our focus groups said they leveraged essential information competencies from college to help them gain an edge and save time at work when solving workplace information problems. Many of them applied techniques for evaluating the quality of content, close reading of texts, and synthesizing large quantities of content, usually found […]

6. To compensate for the gaps in their skill sets, graduates said they developed adaptive strategies for solving information problems in the workplace, often on a trial-and-error basis. Most of these strategies involved cultivating relationships with a trusted co-worker who could help them find quick answers, save time, and learn work processes”

Let’s focus tomorrow’s discussion on the implications of these findings for our instruction programs. How might these findings inform our teaching practices and our collaborations with faculty and programs? What opportunities might exist for addressing these issues through new initiatives? Any other reactions to the findings?

Information Literacy outside academia

Lately it seems like we’re constantly being bombarded with calls to justify the expense of college in a bad economy. As we’ve discussed in RIOTs past, most of our students are not going to become scholars, and will likely not have access to the “vetted” scholarly materials that academic libraries provide. Recently, we’ve discussed how to help students understand the information-making process and how to approach discussing information that doesn’t fall into the “find a scholarly article on your topic” kind of model. With these things in mind, I found two articles discussing approaches to engage students with the kinds of information they might encounter outside of school. While neither of them fits the model of how we usually interact with students (one looks at a one-credit IL class and one at an internship program), they got me thinking about information literacy in different contexts, and both include some interesting ideas.

Wong, G. (2010). Facilitating students’ intellectual growth in information literacy teaching. Reference & User Services Quarterly, 50(2), 114-118.

Wong discusses a unit in her one-credit course in which she created a series of assignments that engage students with socioeconomic data. She designed the assignments to map to the stages of intellectual development, challenging students to build cognitive skills gradually and master the following outcomes:

• describe what socioeconomic data are about;

• describe the differences between socioeconomic and scientific data;

• craft workable strategies to access various socioeconomic data; and

• evaluate data quality on the basis of reliability and authority.

Students started by gathering socioeconomic data from sources like the CIA World Factbook (and some Hong Kong sources I wasn’t familiar with), and through the rest of the unit they looked at data definitions and the nature of socioeconomic data (as opposed to scientific data); the reliability and authority of data collection methods and agencies; and how to locate, evaluate, and use data.

I thought this was a great way to illustrate the life cycle of data, and I liked how the students were able to move through the material systematically so that they could really understand what they were looking at before being asked to evaluate and analyze it. As more and more jobs require graduates to work with data, I think it is part of our job to prepare students for this level of analysis. The author notes that in the class evaluation, “many [students] specified the understanding that socioeconomic data had no absolute accuracy similar to scientific data.” While I would argue that scientific data does not necessarily have absolute accuracy either, I do think the comparison is useful in helping students grasp how information is built in different disciplines and how it is used in the “real world.”

Hoyer, J. (2011). Information is social: information literacy in context. Reference Services Review, 39(1), 10-23.

The Hoyer article takes information literacy instruction totally out of academia. The author works as a librarian for the Edmonton Social Planning Council (which sounds like an awesome job!) and runs an internship that “provides an opportunity for young people to learn about information use in community and nonprofit settings and allows the librarian to mentor information best practices over the course of a long-term project.” Interns can pursue any project related to social issues in the community, and must have a project in mind before they apply. The author states that using information in a new context provides an opportunity to model how information best practices are relevant not just in academia, but in “all work sectors and in all aspects of social interaction.” Interns are also asked to reflect on how the information they use for their projects is shaped by whatever social environment they are working in.

Examples of previous projects outline how interns gathered information from the grey literature, through networking with community stakeholders and finding experts, and from non-academic writing. They also practiced managing information through community organizing, gathering information to write grants, and managing financial information. Hoyer argues that these non-traditional ways of learning information literacy map onto the ACRL Standards, and that “recognizing the importance of social context to the production, evaluation, and communication of good information is key awareness that will allow individuals to relate their skills to whatever context they find themselves in.”

I love the idea of teaching students information literacy skills that transcend academia, but I struggle with how to incorporate it into my teaching when I usually see students for such a short amount of time. This seems like a goal that could really benefit from some hardcore faculty collaboration. It could be because I’m such a data nerd, but I think the ability to work with data will become increasingly important as we continue to transition into an information-based society. How can we make sure that students graduate with these skills? Is there a way to stress how transferable information literacy skills are as we guide students through academic research projects?