All posts by krystal

Krystal Wyatt-Baxter joined LIS as a Graduate Research Assistant in 2009, then used her data prowess to score a gig as Instruction & Assessment Librarian in 2012. Krystal coordinates assessment of LIS programs, consults with librarians on assessment projects, and works with faculty to integrate information literacy into first-year undergraduate programs at UT through instruction sessions, assignments, and other learning objects.

Spring CATs or Easy Assessment of Student Learning

Resource:  Bowles-Terry, M., & Kvenild, C. (2015). Classroom assessment techniques for librarians. Chicago, Illinois: Association of College and Research Libraries, a division of the American Library Association.

One of the most common questions I’m asked as Learning & Assessment Librarian is how to quickly and effectively assess learning in the classroom. I always feel like my answers are unsatisfying, but the reality is that there is no perfect way to do this. When I went to Assessment Immersion a few years back, much attention was given to Angelo’s Classroom Assessment Techniques, a giant tome full of assessment examples that I believe was referred to at one point as an “assessment bible.” While I agree that it’s a great resource (though perhaps not at the biblical level), it is also huge and sometimes daunting. Not all of the techniques in the book lend themselves to the kind of one-shot teaching we often find ourselves engaging in. All of this to say, I was excited to get my hands on the recently published “Classroom Assessment Techniques for Librarians.” Inspired by Angelo, the authors tailor various classroom assessment techniques (CATs… meow!) for the kinds of outcomes and learning situations that librarians often engage with.

Their model is to simplify CAT usage by breaking it down into three steps:

1) Plan. (Choose a session and a technique.)

2) Do it! (Explain to students that you’re going to be checking their understanding during the session, tell them why, provide clear instructions, and execute your plan.)

3) Respond. (This is the “closing the loop” part. Read and interpret student responses and address what you learn by letting students know what difference that information makes. An example of this is sending a follow-up email to the instructor detailing changes you’ve made to the course guide based on students’ understanding. You should also think about changes you might make to your instruction based on what you learned, and make specific notes for the next time you work with that class.)

The book is broken down into chapters based on the kinds of skills being assessed, and includes examples of CATs being used in various class types and levels. For this RIOT, I’ll give examples of a few that I’m going to try out this semester, and we can talk about things that you have tried/want to try, challenges and possible solutions, and anything else related to CATs (purring, claws, etc).

Assessing Prior Knowledge and Understanding

I used to sometimes send specific pre-assessment questions to classes to gauge where students were at, but eventually learned that first-year students (with the possible exception of honors classes) are almost always going to be all over the place. “CATs for Librarians” includes an example of using pre-assessment in a way I haven’t done before: asking questions before or at the beginning of class to find out about students’ conceptions of how information is available on the Web. Their example questions are as follows (pg. 8):

  • “Google indexes everything on the Web” (answer choices in a Likert scale ranging from agree to disagree)
  • “Information is free”

I love the idea of using this pre-assessment to not only find out more about students’ beliefs, but to set the tone for a session and let students know that we’re not just going to talk about where to click on the Libraries website. I can see that this could be a way to introduce multiple threshold concepts, and I’m excited to try it out. I’ll probably use a Google form linked from classes’ Subjects Plus pages and have students respond as they enter the classroom and get settled.

Assessing Skill in Synthesis and Creative Thinking

After going through an explanation of what peer review is, I often wonder how much of my diatribe the students absorbed. This semester, I’ll try the “One-Sentence Summary” CAT (pg. 40). For this technique, students are asked “Who does what to whom, when, where, how, and why?” about a particular concept, in this case, “How do scholarly articles get published?” Bowles-Terry and Kvenild suggest that this technique works particularly well for difficult new ideas and threshold concepts, and offer examples using each of the frames.

A few notes on analysis

Bowles-Terry and Kvenild include the always-useful reminder that assessment is not the same as research. Your goal is to see what your students learned, not to draw sweeping conclusions that can be applied in other settings. Do make sure to set aside time to close the loop, but don’t feel like you have to spend hours carefully categorizing each student response. For some of the higher-level skills (like the “one-sentence summary” example above) it might be useful to score responses with a simple rubric, or even a yes/no checklist. Here’s a very rough “rubric bank” that I sometimes pull from to assess relevant CATs; feel free to use it if it’s helpful to you, but don’t get caught up in “doing it right.” Even if you don’t have time to utilize a rubric for analysis, you can learn a lot by sorting through student responses and thinking about how to respond (to students themselves and in your own teaching).

What CATs are you going to try this semester? What has worked well for you in the past, and what have you learned about your students using CATs?

p.s. We have a copy of “Classroom Assessment Techniques for Librarians” in our collection, and TLS also has an office copy that I’m happy to share. There are many more ideas than I was able to address in this post, and I highly recommend browsing it. If you want to see the “assessment bible” itself, we have collection copies and an office copy of that one, too. :)

RIOT: On reading

Recently, I was introducing an evaluation activity to a UGS class in which students worked in groups to come up with evaluation criteria and apply them to an assigned source. I had a lot to cover during the class, and found myself repeatedly telling the students not to actually read the information I had provided them, but to skim the beginning if necessary, and analyze its merits based on context. While the activity led to a great discussion about evaluation and how to use various kinds of sources, something about it felt inauthentic.
Upon further reflection, I came back to something that has bothered me in the past. When I compress parts of the research process to make room for a deep discussion in a 50-minute one-shot, one of the first things that goes out the window is reading and reflection. I’ve often thought of reading as a problem to overcome while teaching, and have designed most activities to require little or no reading. I ran into this problem again this semester in trying to rethink how I teach students to use background information to find keywords. I struggled to come up with a good way to demonstrate how to pull keywords out of an encyclopedia article without slowing down and giving students time to read and digest the article. I kept coming back to the same roadblock. How can I in one breath tell students that research is a slow, iterative process and in the next breath tell them that it’s not necessary to actually read the information I’m asking them to evaluate?

While searching for something to RIOT, I came across an article co-written by a librarian and an English professor at Hunter College. The article outlines the reasoning behind a “Research Toolkit” they created that includes both student-facing online learning tools and a faculty guide for their use. While the resource itself doesn’t sound too dissimilar from our Information Literacy Toolkit, the portions of the article explaining their pedagogical reasoning for moving from mechanics of research to deeper, critical inquiry-based research spoke to my own cognitive dissonance around reading and research. Here’s one excerpt:

Reading is an area often neglected by both library and composition scholars. As Brent (1992) explained, “instruction on the research process…deals with the beginning and the end of the process (using the library and writing the drafts), but it has a gaping hole in the middle where much of the real work of knowledge construction is performed. The evaluation of sources is treated chiefly as a matter of measuring the writer’s overall authority as a witness to facts, as measured by factors such as his reputation and the recency of the source” (p. 105). Looking at a variety of writing textbooks and library instruction materials confirms Brent’s statement: most of them focus only on evaluating sources rather than reading them.

Furthermore, the way evaluation of sources is often taught forefronts ideas such as identifying the “bias” of a source. While sources are indeed biased, most students do not understand that all sources will have a bias; it’s how they choose to use the source that matters. Students reading only to evaluate the credibility or bias of a source are not going to do the deeper reading that truly understanding a source requires. Brent (1992) called for a “theory of rhetorical reading” (p. 103), something that has yet to be fully realized. Keller’s (2014) study analyzed student reading practices and noted that focusing on the evaluation of sources may have resulted in a form of overcorrection (p. 65), which may lead students to dismiss valuable resources.

Am I doing my students a disservice by focusing on evaluation skills to the detriment of critical inquiry? How can I teach them to construct knowledge when I don’t even give them time to read? The longer I teach, the more I sometimes feel like by cramming the entire research process into a one-shot, I’m deceiving my students. Some things I’d like to discuss:

  • By focusing on evaluating information, are we leading students towards “overcorrection” and away from inquiry?
  • Is it our responsibility to teach students how to read deeply, or does that fall outside of information literacy?
  • How can you truly model a process that involves reading and sometimes rereading when you have a limited amount of time? Can tutorials and other online learning objects help?
  • Can we come up with exercises that help students practice the “reading and thinking” parts of research?
  • Are there ways to collaborate with our colleagues at the Writing Center around this issue?

Referenced source:

Margolin, S., & Hayden, W. (2015). Beyond Mechanics: Reframing the Pedagogy and Development of Information Literacy Teaching Tools. The Journal of Academic Librarianship, 41(5), 602–612. http://doi.org/10.1016/j.acalib.2015.07.001

TLS TIP: Stray observations/tips on using our new Learning Labs

I will start by saying that so far, I LOVE teaching in the new Learning Labs. I was a bit apprehensive coming into the semester without much time to practice using the new technology (I have, at times, been accused of over-preparing) but my first few sessions pretty much got me past that fear. I’m working on transitioning a lot of my old session outlines into more interactive, student-centered formats, but admittedly, I have a way to go. Since we’re all busy at this time of the year, I’ll make this post short and sweet and share a few stray observations and suggestions regarding our Learning Labs.

1) Use the group structure to your advantage

As admitted above, I haven’t yet infused active learning into my teaching as thoroughly as I’d like to. Despite this, I’ve still noticed that the students seem much more relaxed sitting down at group tables with their own devices (as an aside, I’ve started asking faculty members to have their students bring them, and placing a laptop cart in the room for anyone who forgot) than they did in rows with immovable laptops. When I ask them to work in a group, they seem primed to do so. I’ve also observed that group work activities that make use of the closest flat screen, like coming up with evaluation criteria by looking at a website, tend to help with engagement. Whereas previously, students would often default to looking at their own computers and have to be prodded to talk in a small group, the shared space and technology seem to invite discussion. Directing students to complete tasks as a small group, then report back to the big group, is a fairly easy entrée into using the student-controlled screen capability. I recommend trying it out!

2) Minimize distraction

Grace and I co-taught a class in Learning Lab 2, and set the room up so that the instructor laptop connected to the overhead projector, and individual student groups connected to the flatscreens. This was great during group work, but I found myself getting distracted by students following along on the screens while Grace showed them databases. Easy fix! From now on, any time group work isn’t happening, I will either freeze the student panels or send the instructor laptop to all panels.

3) Mix it up and capture results

I’ve also made use of the whiteboards throughout the room, having students answer questions or brainstorm on them. I like that this gets them moving around a little, but I lamented to Sarah that I couldn’t capture their work for assessment this way. She had the great idea of using my iPad to snap photos of the whiteboards. Why didn’t I think of that? Remember that you don’t have to incorporate all of the technology into every activity. I prefer to mix it up a little.
I know that y’all have ideas and observations of your own by now, and we’d love to hear how you’re adjusting to the Learning Labs, too! As always, let us know if you have questions or ideas.

TLS TIPS: Summertime…and the teaching is easy

The end of the semester always feels weird to me. As classes wind down and I find myself with more open blocks in my calendar, it usually takes me a while to transition from the stop-and-go pace of class planning to the long haul of projects that I have lined up for the summer. It can be all too easy to set the teaching aspect of my job largely aside for a few months, but I know I’ll be better off in the fall if I use the slower pace of summer to work on my teaching practice. I searched online to see if I could find any good ideas for  ways to systematically think about teaching during the summer, and found a post on the ProfHacker blog focused on “looking backward and forward” at the end of the academic semester. While some of the tips are specific to faculty (let’s all be glad we don’t have grades to submit), many of the ideas translate to our work. Here are a few things I plan to do this summer to keep my mind on my teaching and my teaching on my mind.

Review & renew online teaching materials

I often forget that the students I teach are likely to spend more time with the course research guides I create than they spend with me in the classroom. When I’m in a hurry to plan classes, the course guide sometimes becomes an afterthought. Summer is a great time to take an in-depth look at SubjectsPlus guides and other online materials we use and refresh them where needed. I plan to spend some time creating at least one brand new guide for a class that I know I’ll work with in the fall so that when things do get busy, I’ll have a great template to use for other guides. In TLS, we usually review and update our “how-to guides” during the summer. If you see something that needs our attention, let us know.

Revisit conference notes/bookmarks/inspiration

I’m sure we’ve all experienced it. You go to a conference and see tons of great ideas, but jump right back into the fray before you can put anything into action. I’m going to set aside some time this summer to go through my notes from ACRL and find things I want to try in the fall. Since I’ll be revamping a lot of my lesson plans anyway to prepare for our new Learning Labs, this is a great opportunity to take a closer look at my teaching practices overall to make sure they don’t get too stale. I also have a list of articles and links I bookmarked throughout the semester that I didn’t have time to fully investigate. Many of these are related to using technology in the classroom, so I’ll use the summer to make a short list of things I want to try out when the Learning Labs are ready.

Look at data & feedback

I can’t seem to write one of these posts without sneaking in something about assessment. I usually spend a lot of time in the summer analyzing data, so assessment is already on my mind. I think that summer is the perfect time to look for trends in how our students might be changing and in what’s working or not working in our teaching. Something I’ve been thinking about lately (and that seems to be reflected in our UGS post-test results) is how difficult it is to fully convey the keyword brainstorming process, especially when students are at the beginning stages of refining their topic selection. I’m not sure how to approach this issue differently (let me know if you have ideas) but the data I’ve been looking at reminded me to think about it. If you have any feedback or data to review, now is the time to do it. If not, personal reflection can help you pinpoint specific areas to focus on.

Before we know it, summer will be over and we’ll be back to the grind. Do you have any tips for using this time to improve your teaching? Please comment if you do.

TLS Tips: Strategies for Classes with Lots of Active Learning

I recently worked with our TLS GRA, Grace, to prepare for and teach a set of instruction sessions that consist entirely of teaching students about evaluating information. I always look forward to this class because it gives me the unique opportunity to spend an entire class period focusing on a single learning outcome. Any time I get a chance to plan a class with a narrow focus, I immediately think about active learning. Me lecturing about evaluation for an entire class period sounds like a painful experience for all involved, and in my experience, students learn this skill better by talking with one another and working through examples.

The basic class plan included students working in groups to read a short assigned article (different kinds of articles on the same topic), then answer a set of questions in Google Forms designed to lead them through info evaluation. After that activity, we had each table report out on what kind of information they had, its strengths and weaknesses, and whether they would recommend it to a friend considering trying a specific diet (the topic of the articles). We then had a class discussion on different information formats and their possible uses. Although I’ve been teaching with active learning for years, I still get nervous before leading a class that relies almost entirely on student engagement. I’d like to share a few strategies and challenges I typically think about when planning active learning sessions.

1) Set expectations for participation early and often. I told students from the start that there would be lots of group work and class discussions. This probably wasn’t surprising, as the setup of the tables in PCL 1.124 naturally lends itself to group work. Before beginning the activity, I had students designate one group “recorder” and one “reporter.” This gets them talking and makes it difficult for each student to work through the example independently. While they worked, I walked around and reminded quiet groups to work together, clarified that they only needed to submit the form once per group, etc.

2) Come with flexible plans. In classes like this, I usually plan more activities than I think there will be time for. While it may seem like overkill, this has saved me more than once when classes work through an activity more quickly than I expect. My backup plan was to have them work together to find a “better” article than the one they were assigned if time allowed. As it turned out, the first activity took so long that I had to cut some of the debrief time. I was ready for this possibility, and Grace and I chatted after the first class to come up with strategies for time management in the following classes.

3) Outline the most important points you want to debrief. This is something I continue to find challenging. You never know if students are going to report out all the salient points you want to cover, or if you’re going to need to guide them there. If I’m not careful, I find myself going off topic during discussions and debrief sessions. Sometimes writing things on the board can help with this, but I find that providing students with a structured way of reporting back helps too. If I provide myself with an outline of the essential points I want to hit, I am more likely to facilitate a focused discussion.

4) Assess! And share that info with faculty. I love using Google Forms for these kinds of activities, because I can watch groups’ answers roll in in real time on my iPad, and I can easily share their work with faculty afterwards. Faculty often overestimate students’ evaluation skills, so I like being able to show them exactly where their students are at. This information also helps me plan future sessions and refine my approaches and activity materials.

What are your strategies for leading active learning sessions?

TLS Tips: Reflection as Assessment

At the end of the semester, I recommend scheduling time to reflect and perform a self-assessment of your teaching throughout the fall. If you incorporated any in-class assessments into your practice this fall, now is the perfect time to sit down with all the data you collected and look for patterns. Even if you didn’t get a chance to do any formal assessment, think about your high points in the classroom and the challenges you encountered. Ask yourself some questions.

Can you trace any identified patterns to something specific you did or didn’t do as part of your practice this semester? How do you think your particular successes and failures affected student learning? Most importantly, try to think of a concrete change you can implement next semester to try and address the challenges you faced this fall.

Don’t dwell on what went poorly, but use what you learn through your self-assessment to set goals for the next semester. You may want to take a moment to write down your reflections and goals to make them feel more concrete. Revisit them at the end of the semester, and see if you made any progress.

An example from my fall:

At the beginning of the semester, I tried to incorporate a simple “End of Class Questions” form into my instruction sessions by including a link in the Course Guide and setting aside time at the end of class for students to answer the questions. While I didn’t do a good job of prioritizing this exercise as I became busier, the few results I did get line up with my own reflections. While overall satisfied, students in one class suggested having the class search for something specific to make the session more interesting. I agree! In that particular class, students hadn’t yet been given their assignment when they met with me. Reflecting on my semester as a whole, I realized that several of my challenges (students who didn’t know their assignments or why they were at the library, poorly timed instruction sessions, etc.) could be traced back to a lack of strong, clear communication with the faculty members or TAs I was planning with.

In the future, I will build more time into my planning process to ensure that instruction sessions are scheduled at the point of need. Neglecting to do so likely weakens student learning, because research concepts never get tied to immediate student interests (such as getting a good grade on an upcoming assignment). Next semester I will also try to focus on fewer learning outcomes so that I have time for quick assessments during sessions.

What did you learn through self-assessment?

RIOT: Grab bag of conference ideas

I recently attended two different conferences in the Pacific NW, Library Instruction West (LIW) and ARL’s Library Assessment Conference (LAC). I encountered tons of great ideas and inspiring work at both conferences, and was reminded of how intertwined instruction and assessment really are. While I could probably prattle on for way too long about things I’m interested in trying, I chose three ideas to bring to the group for discussion. I had trouble choosing sessions to discuss that fit a particular theme so I just went with a grab bag approach, figuring we can just talk about whatever catches y’all’s attention the most.

1) Rubric Assessment Project at Claremont Colleges Library

Poster
Preprint
Libguide with project info

I saw a presentation on this project at LIW, and have linked to a poster that was presented at LAC. Librarians at Claremont undertook this research project through the Assessment in Action program in order to determine what impact librarian intervention in first-year courses has on IL performance in student work. They assessed student papers produced in First-Year Experience programs (similar to Signature Courses) using a broad IL rubric, and analyzed the results to see if different levels of librarian involvement in the courses impacted student performance. They found that librarians did positively influence student performance at the first three levels of involvement, but a fourth, higher level of involvement had no impact.

I think my favorite aspect of this study is how they are using the results to communicate with their constituents, like in this infographic: http://libraries.claremont.edu/informationliteracy/images/FYS_Infographic.jpg. I like the idea of using data to communicate our value on campus.

Questions:

  • What research questions do we have regarding the impact of our instruction?

  • What would be convincing evidence to our faculty?

2) Assessment in Space Designed for Experimentation: The University of Washington Libraries Research Commons

Lauren Ray and Katharine Macy (University of Washington)

See abstract here: http://libraryassessment.org/bm~doc/2014-program-abstracts.pdf

At LAC, I attended a lightning talk on assessing the University of Washington Libraries Research Commons, described in the program abstract as “a space intended to meet collaborative needs, foster interdisciplinary connections and provide a sandbox for innovating and testing new library space designs, service models and programs.” While I was inspired by the way their assessment plan focused on student learning in the space rather than just satisfaction and use, I was also really excited to learn about the programming they do in the space, which is targeted at creating interdisciplinary connections between scholars.

Their Scholars’ Studio series (http://commons.lib.washington.edu/scholarsstudio) consists of 5-minute lightning talks delivered by grad students and postdocs doing research on whatever the interdisciplinary theme of the evening is. Example themes include “predictions” and “Pacific Northwest.” The talks are followed by a Q&A and a reception. They also provided students with guidance on distilling their research into a short talk and presenting to an interdisciplinary audience before the event.

The presentation also covered Collaborating with Strangers workshops (http://commons.lib.washington.edu/news-events/colab) in which students, faculty and researchers sign up to connect with one another in 3-minute speed meetings – like speed dating for research. Each session is organized around a particular interdisciplinary topic, such as food research, so that participants can connect with others who have similar interests.

In one-on-one interviews with past graduate student presenters from the Scholars’ Studio series, librarians learned that the program helped participants rethink their research and think about how other disciplines would approach what they do, as well as how to be more concise in talking about research. I thought that these were both interesting ideas to consider as we think about ways to include graduate student services in our Learning Commons plans.

Questions:

  • Could we adapt these ideas to fit in our Learning Commons plans?

  • How can we ensure that we assess student and researcher learning in the new spaces and programs we’re designing?

3) Teaching “Format as Process” in an Era of Web-Scale Discovery

Kevin Seeber (Colorado State University-Pueblo)

Slide deck: http://kevinseeber.com/liw2014.pdf

In this session at LIW, the presenter focused on how he shifted from teaching skills based on database navigation to teaching the publishing processes that lead to different formats of information. He stated that “instruction tied to static interfaces, locations, or appearances is not sustainable” because of rapidly changing developments in technology and delivery systems. I liked an activity he uses to start instruction sessions: he gives out cards with different formats on them (scholarly journal article, blog post, etc.) and has students arrange them from least to most editing, research, time to publication, and other attributes, then uses that as a launching point for discussion of different formats and their uses. This seems like a nice way to discuss source evaluation, as it gives students skills that last beyond university library-based access to information and sets them up for critical reflection that extends beyond the source itself to funding models and how sources are distributed.

I often find myself trying to spend less time discussing interfaces and the like, and am planning on challenging myself to cut down on time spent doing demos even more this fall. I also thought that this was a good example of the pending new ACRL standards being put into action.

 Questions:

  • What ideas do you have for teaching “Format as Process” and other threshold concepts while still making sure students know how to access the materials in question? How can we work together to develop strategies?

  • Now that we’ve had scoUT for a while, how do you see it affecting (or not) students’ understanding of information formats?

RIOT: What are our standards?

Last week, Meghan and I attended a Student Learning Outcomes Symposium hosted by the Greater Western Library Alliance. The symposium was the culmination of work done by a GWLA taskforce to find out how member libraries were implementing and assessing information literacy learning outcomes at their institutions, and included workshops, presentations, and roundtables highlighting current and best practices. Patricia Iannuzzi, Dean of Libraries at the University of Nevada, Las Vegas, opened the symposium with a talk focused on “the challenges and opportunities in creating a campus-wide information literacy agenda.”

During the talk, she remarked on the current work being undertaken by an ACRL task force to revise the Information Literacy Competency Standards for Higher Education. If you haven’t done so already, you can learn about the revision and watch recordings of online forums that recently took place to discuss the coming changes. While much remains unclear, we do know that the standards will somehow incorporate threshold concepts and metaliteracy. Though it is probably fruitless to discuss the merits of the revised standards before we can actually see them, I was intrigued by a statement Dean Iannuzzi made during her talk. Her discussion focused on the importance of aligning learning outcomes, assessments, and learning activities at the library, course, major, and institution level, and communicating outcomes in a way that resonates with your campus. In other words, how to infuse information literacy into the curriculum by paying attention to campus culture and framing what you’re doing so that others buy into it. It was within this context that Dean Iannuzzi expressed her opinion that the IL Standards revision process is focused on the wrong things and mentioned that she had written a forthcoming article for a special issue of Communications in Information Literacy explaining her position. I tracked down the preprint and thought it was pretty interesting and could spur discussion.

Iannuzzi begins the article by arguing that the original Standards were so influential in 2000 because they incorporated the growing push for colleges and universities to “articulate measurable learning outcomes that extended beyond disciplinary content knowledge.” She argues that work done by those within our profession has placed information literacy among other important sets of outcomes (critical thinking, oral and written communication, etc.) that everyone pretty much agrees need to be integrated throughout the curriculum and infused within disciplinary content. While the creation of the Standards was an important starting place, Iannuzzi argues that they should serve as a framework for campuses to develop their own measurable outcomes, and that those discussions are the ones librarians should be focusing on.

If the challenge before the reviewers was to reword, reframe, and rehash the writing of each learning outcome, then the recommendations would suffice. However, I see little to gain from continuing the decades-old battle of “the literacies.” That discussion is a red herring, which leads ACRL and advocates of reform down the path of professional navel gazing at a time when academic libraries should expand their focus on the challenges of undergraduate and graduate education. (Iannuzzi 2013)

Her point (as I take it) is that we have already taught many within higher ed what information literacy encompasses and how it is important to our students, and we don’t need to continue focusing on refining our message. Instead, we need to move toward partnering with others at our institutions to comprehensively and systematically build information literacy into the curriculum and assess it throughout. Rather than focusing on changing the Standards, she calls for ACRL to:

  • work with groups involved in education reform (AAC&U, regional accreditation associations, etc.)
  • distance itself from technology associations on this issue
  • clarify how information literacy is included within existing national frameworks (such as The Degree Qualifications Profile)
  • create developmental models to assist in curriculum mapping
  • address issues of assessment through leadership on standardized testing (since comprehensive tests like the Collegiate Learning Assessment that are meant to measure integrated skills often do not include information literacy)
  • partner to promote already developed, normed, and reliable rubrics that integrate information literacy with related skills and abilities
  • promote research on the relationship between information literacy and student success

Her views retain the idea that information literacy must be addressed within the disciplines, and expand the role of librarians to one that includes curriculum mapping and vertical integration. What do y’all think?

Possible questions to discuss include:

  1. What do you think about the direction of the Standards revisions?

  2. Do you/how do you currently use the Standards? What issues do you have?

  3. What is our campus culture – how can we best communicate outcomes?

  4. Has anyone been involved with curriculum mapping in your department? Do you see this as part of our role?

  5. Could we create a campus-wide information literacy agenda at UT?

Sharing is caring (and caring is sharing)

This year, for the first time, the Libraries hosted a series of events to celebrate Open Access Week. Though two weeks have now passed (where is this semester going?), I wanted to share some thoughts about one of the events I went to. In addition to helping Meghan and Elise host a Wikipedia Edit-a-thon, I managed to sneak away from teaching long enough to attend a talk on Open Educational Resources delivered by Garin Fons, Project Manager for UT’s Center for Open Educational Resources and Language Learning.

While there was a lot of good fodder for thought within the presentation, I think that my biggest takeaway is that I do not always embody the ideas of “openness” that I believe in the abstract. Garin opened his talk by discussing “open” as a “mentality and a belief system,” stating that “you either are open or you’re not.” Later in the talk, he discussed some of the issues holding us back from forming a shared culture of openness in education. With much chagrin, I recognized myself as part of the problem.

A quick scan of the For Undergraduates page that houses our student-targeted guides and tutorials reminds me that none of the pages I created include a Creative Commons License. While it’s not difficult to include the license (and we do have it on some of our resources), it rarely occurs to me to do so. In thinking about this I was reminded of a blog post I read a short while ago discussing (among many other things) the difficulties teaching librarians have historically had in establishing a robust practice of systematically sharing what we create. While I think that there are many big-picture approaches we can take toward becoming better sharers (both within libraries and within higher education overall), this talk reminded me that one small step I can take is to practice what I preach and get in the habit of marking what I create as open. As my teaching winds down for the semester, I hope I can put a little more thought into ways of approaching the larger goal of forming a shared culture to help us spread out some of the work so that we can devote more time to cultivating deep learning.

Is that part of the Assessment Plan?

I am currently reporting to you right from the thick of it. At last count Library Instruction Services had 169 instruction sessions recorded in our database for the semester, and that’s on top of all the other projects and responsibilities that don’t pause when we’re busy. At times like this it’s important to remember that while it may feel validating (and explain our stress dreams) to see how many instruction sessions we have scheduled, sheer numbers don’t actually tell us very much. We know how many students we teach, but the real insight comes from discovering how much we taught them and where their information literacy skills are at the end of the semester. Unfortunately, those data are much more difficult to come by.

Luckily, we have an Assessment Plan to help us remember to assess student learning even in the midst of the madness. One of my first tasks as Instruction & Assessment Librarian was to pull together a comprehensive plan outlining and guiding LIS’s assessment efforts. While the task proved to be daunting at first, with a lot of guidance (I highly recommend ACRL’s Immersion: Assessment Track) we completed our first iteration of the plan last spring.

So far, one of the nicest effects of having a plan has been the ability to refer back to it whenever the question arises of what new data we could collect. Rather than collecting everything we come across only to rummage through it later (or, more likely, forget about it), we have taken a targeted approach that helps us focus most of our efforts on the most important parts of our practice.

When the work feels overwhelming, it’s nice to have a system to fall back on that allows us the time we need to reflect on and improve our teaching, without grabbing every data point we come across. Parts of our assessment become almost automated (pre- and post-tests that we administer through email at the beginning and end of the semester), freeing up each of us to dig in deeper and authentically assess at least two classes each semester for a snapshot of critical thinking in action. In the tide of the fall semester it might feel like we can only keep our heads above water, but having an underlying plan will help guide us back to shore so we can begin again and do even better in the spring.