Discussion: Research Skills and College Readiness

A small group convened to review Carolyn’s post on Research Skills and College Readiness. We discussed the assumption that all students need preparation to be “college ready” when they arrive at UT Austin. All students learn writing and research skills in the UGS courses, but professors sometimes assume students have a more advanced skill set than they actually do. When teaching these classes, the expectation is to start from scratch with information literacy skills. In addition, some students are required to take rhetoric classes to become college ready.

 

We then discussed assignment design and the role librarians can play in designing research assignments that are fuzzy rather than strictly linear. Research is not just about solving a problem but about formulating it. We also discussed how the research process is cyclical rather than linear, something we regularly emphasize to first-year students.

 

We learned that engineering assignments are built around formulating problems as well as finding solutions. It would be difficult to move first-year students away from a step-by-step process and toward this idea of formulating problems. Topics should be selected after preliminary research is done, but often topics get picked with very little research behind them.

 

We also learned about primary sources: in the humanities, it is important to spend time with archival collections and to really look closely at the materials, and formulating questions comes after that close looking. This varies greatly from the social sciences, where you are constantly looking at and analyzing the data and the evidence.

 

We discussed how to help students ask questions when they have a research problem, and why students may be reluctant to ask for research help even though they don’t seem reluctant to ask for writing or tutoring help. Why is this? Encouragement from peers to ask for research help makes a difference. Research shows that students often think they are proficient in research skills when much of the time they are not, which may be another reason they don’t ask for assistance.

 

We discussed research and writing expectations in high school, how these expectations differ in college, and how new skills need to be taught and old skills sometimes even unlearned.

Research Skills and College Readiness

I have been thinking about how academic libraries are starting to bring more non-library people into our organizations to help us with our thinking about how to best serve students. Librarians have always looked to education professionals to understand pedagogy and adult learning, and I hope this trend continues as we fine-tune our roles in the wider academic landscape.

I came across an interview with David Conley, a policy analyst and professor of educational policy and leadership at the University of Oregon. The interview was conducted by Project Information Literacy (PIL), a large-scale, national study about early adults and their research habits, conducted in partnership with the University of Washington’s iSchool. PIL has a whole series of Smart Talks, interviews with professors, authors, and others involved in education who are not necessarily librarians (though there is one with Char Booth!). Very interesting perspectives for us to consider.

Read the interview here:

David Conley: Deconstructing College Readiness

I will try to hit the highlights here and we can discuss further in person. The first thing that jumped out at me is his definition of college readiness. He takes issue with how the current admissions system rewards eligibility over readiness. Eligibility here means grades, courses taken, and admissions test scores. Readiness is much more comprehensive:

  • A college and career ready student possesses the content knowledge, strategies, skills, and techniques necessary to be successful in a post-secondary setting.
  • Not every student needs exactly the same knowledge and skills to be college and career ready.
  • A student’s college and career interests help identify the precise knowledge and skills the student needs.

From our perspective, it seems like students would need a lot more information literacy before college to meet this definition.

The next thing I found interesting is that Professor Conley names “research skills” as a key competency for college readiness in his latest book. Students will not learn how to define a research problem and solve it by following predetermined steps. Real research and real workplace problem-solving are often unstructured and fuzzy. He sees librarians in the role of helping students and teachers understand this new “paradigm.” One way is to help teachers design assignments that require formulating a problem before solving it. That formulation step itself requires exploration and research skills.

Then, when the problem has been identified and placed in context, students must know which type of material they will need and be able to evaluate its quality. For us, this seems like a no-brainer, but it is nice to know that someone who studies education policy sees the role of the librarian in this process.

The last thing I will point out is a set of findings from Conley’s research related to students seeking assistance when they need it. The students most in need of help are the least likely to ask for it because:

  • Many believe doing so indicates they don’t really belong in college in the first place, so they attempt to do it all by themselves.
  • Some students simply have little experience asking for help, and colleges are far more complex institutional environments than high schools, which makes it more difficult to get help.
  • Some students do seek help only to be rebuffed or frustrated when a professor or campus advisor does not do what the student wants.

This seems like an important finding to keep in mind as we fine-tune our services and build partnerships across campus units.

Finally, another no-brainer for us, but I really liked what he had to say about librarians helping students in the long run:

Librarians can guide students toward materials that cast light on the big questions of the subject, and the organizing concepts necessary to connect and unite the specific competencies. They can suggest extensions and ways for students to get more deeply involved in what they are learning. And they can, of course, help the struggling student find resources that break down course content into simpler pieces that can be digested and integrated to reach the level necessary to demonstrate a competency.

 

QUESTIONS

What are some strategies for following the “new research paradigm” that Conley lays out, wherein librarians help students explore and formulate research problems before solving them? How can we fit this into a one-shot, if possible?

How can we move teachers away from creating assignments where research problems and solutions are predetermined?

What can we do in and outside the library to help students who don’t ask for help, or conversely, to get those students to start asking for help (from us or others)?

DISCUSSION: “Does Library Instruction Make You Smarter?”

Since we didn’t actually get to Michele’s RIOT back in the summer, it was good to finally reach this topic. The busy start to the fall semester puts assessment and instructional effectiveness on all of our minds. A lively discussion followed Michele’s presentation.

All were chagrined that the studies didn’t show any correlation between student success and library instruction (LI), and we wondered what kinds of instructional interventions do show a correlation with student success. Real-world guest lecturers? Recitation? What?

The tendency of education researchers to work with giant datasets instead of qualitative research leads to literature that doesn’t adequately characterize students’ motivation, an important part of student success. Assessing students’ research papers would show the effect of LI, but one first needs to think about the audience for this information. Creating a controlled teaching environment could help us know what’s effective, but that’s neither practicable nor desirable. Besides, there are so many variables that it’s better to collect information on the perceived value of LI, and on whether LI changes students’ behaviors.

From here the conversation shifted to library anxiety and creating a culture that takes away the imposter-syndrome discomfort students may feel when they ask for help. Outward-facing attitudes benefit LI, and outreach and campus partnerships can help to create a culture of comfortable help-seeking.

 

Discussion: Grab bag of conference ideas

I was lucky enough to attend both of the conferences that Krystal mentions. It’s great to be able to send two delegates from our institution: we were able to split up, see more talks between the two of us, and bring back twice as much to share with our colleagues here in the Libraries.

In RIOT, our conversation led us to think about how we can present our own assessment findings in a more public way, to students and to faculty. We are all excited about translating the data we gather into infographics, and we tossed around the idea of looking into easy-to-use software, like Piktochart, that could help us out.

We also talked about our own struggles with reaching graduate students. The ideas Krystal brought back inspired us to think about what we can do in our changing space here in PCL to engage more meaningfully with graduate students. We have thus far implemented a few initiatives, including a photo show of research at the Benson, and we would like to translate ideas the branch libraries have come up with into the new Learning Commons.

Lastly, in response to our conversation about wanting to focus less on teaching skills (tool-based searching) and more on concepts, Janelle promised to share with us what sounds like a great infographic that explains the cycle of information succinctly.

RIOT: Grab bag of conference ideas

I recently attended two different conferences in the Pacific NW, Library Instruction West (LIW) and ARL’s Library Assessment Conference (LAC). I encountered tons of great ideas and inspiring work at both conferences, and was reminded of how intertwined instruction and assessment really are. While I could probably prattle on for way too long about things I’m interested in trying, I chose three ideas to bring to the group for discussion. I had trouble choosing sessions to discuss that fit a particular theme so I just went with a grab bag approach, figuring we can just talk about whatever catches y’all’s attention the most.

1) Rubric Assessment Project at Claremont Colleges Library

Poster
Preprint
Libguide with project info

I saw a presentation on this project at LIW, and have linked to a poster that was presented at LAC. Librarians at Claremont undertook this research project through the Assessment in Action program in order to determine what impact librarian intervention in first-year courses has on IL performance in student work. They assessed student papers produced in First-Year Experience programs (similar to Signature Courses) using a broad IL rubric, and analyzed the results to see whether different levels of librarian involvement in the courses affected student performance. They found that librarians did positively influence student performance at the first three levels of involvement, but a fourth, higher level of involvement had no impact.

I think my favorite aspect of this study is how they are using the results to communicate with their constituents, like in this infographic: http://libraries.claremont.edu/informationliteracy/images/FYS_Infographic.jpg. I like the idea of using data to communicate our value on campus.

Questions:

  • What research questions do we have regarding the impact of our instruction?

  • What would be convincing evidence to our faculty?

2) Assessment in Space Designed for Experimentation: The University of Washington Libraries Research Commons

Lauren Ray and Katharine Macy (University of Washington)

See abstract here: http://libraryassessment.org/bm~doc/2014-program-abstracts.pdf

At LAC, I attended a lightning talk on assessing the University of Washington Libraries Research Commons, described in the program abstract as “a space intended to meet collaborative needs, foster interdisciplinary connections and provide a sandbox for innovating and testing new library space designs, service models and programs.” While I was inspired by the way their assessment plan focused on student learning in the space rather than just satisfaction and use, I was also really excited to learn about the programming they do in the space, which is targeted at creating interdisciplinary connections between scholars.

Their Scholars’ Studio series (http://commons.lib.washington.edu/scholarsstudio) consists of a series of 5-minute lightning talks delivered by grad students and postdocs doing research on whatever the interdisciplinary theme of the evening is. Example themes include “predictions” and “Pacific Northwest.” The talks are followed by a Q&A and a reception. They also provided students with guidance on distilling their research into a short talk and presenting to an interdisciplinary audience before the event.

The presentation also covered Collaborating with Strangers workshops (http://commons.lib.washington.edu/news-events/colab) in which students, faculty and researchers sign up to connect with one another in 3-minute speed meetings – like speed dating for research. Each session is organized around a particular interdisciplinary topic, such as food research, so that participants can connect with others who have similar interests.

In one-on-one interviews with past graduate student presenters from the Scholars’ Studio series, librarians learned that the program helped participants rethink their research and consider how other disciplines would approach what they do, as well as how to be more concise in talking about their research. I thought these were both interesting ideas to consider as we think about ways to include graduate student services in our Learning Commons plans.

Questions:

  • Could we adapt these ideas to fit in our Learning Commons plans?

  • How can we ensure that we assess student and researcher learning in the new spaces and programs we’re designing?

3) Teaching “Format as Process” in an Era of Web-Scale Discovery

Kevin Seeber (Colorado State University-Pueblo)

Slide deck: http://kevinseeber.com/liw2014.pdf

In this session at LIW, the presenter described shifting from teaching skills based on database navigation to teaching the publishing processes that lead to different formats of information. He stated that “instruction tied to static interfaces, locations, or appearances is not sustainable” because of rapidly changing technology and delivery systems. I liked an activity he uses to start instruction sessions: he gives out cards with different formats on them (scholarly journal article, blog post, etc.) and has students arrange them from least to most editing, research, time to publication, and other attributes, then uses that as a launching point for discussion of different formats and their uses. This seems like a nice way to discuss source evaluation. It gives students skills that last beyond university library-based access to information, and it sets them up for critical reflection that extends beyond the source to examine funding models and how sources are distributed.

I often find myself trying to spend less time discussing interfaces and the like, and am planning on challenging myself to cut down on time spent doing demos even more this fall. I also thought that this was a good example of the pending new ACRL standards being put into action.

Questions:

  • What ideas do you have for teaching “Format as Process” and other threshold concepts while still making sure students know how to access the materials in question? How can we work together to develop strategies?

  • Now that we’ve had scoUT for a while, how do you see it affecting (or not) students’ understanding of information formats?

RIOT: Does Library Instruction Make You Smarter?

All across UT (and higher education in general), people are attempting to assess student learning and articulate the value of their programs to student success, measured by retention, on-time graduation, GPA, post-college success and more.  While we are successfully measuring the impact of our sessions on student learning, meaning we know they are achieving our learning outcomes in our sessions for at least some of our programs, we haven’t measured whether what they are learning translates to more general success in or after college.   Since Megan Oakleaf’s Value of Academic Libraries Review and Report in 2010, I have been wondering just what impact one-shot instruction sessions have on student success, whether that is defined as GPA, retention or on-time graduation.  I am clearly not the only one wondering this so I put together this post as an attempt to answer that question.

In 2007, Joseph Matthews published the book “Library Assessment in Higher Education,” which I haven’t read yet but have read about many times. He looked at studies up to 2007 and found that they are pretty evenly split between finding a correlation between library instruction and GPA/retention and finding no correlation. I found a few more articles published since 2007 that represent what has been happening since his book came out. This list is by no means comprehensive, but the articles illustrate the state of the research on the question and the ways people are approaching it.

Vance, Jason M., Rachel Kirk, and Justin G. Gardner. “Measuring the Impact of Library Instruction on Freshman Success and Persistence: A Quantitative Analysis.” Communications in Information Literacy 6.1 (2012): 49–58.

Librarians from Middle Tennessee State University attempted to find out whether one-shots for freshmen impacted their GPAs and/or their likelihood of returning for a second year (retention).  To do so, they gathered information about the one-shot classes they were offering to freshmen over a two year period, noting that these were introductory rather than research intensive classes.  They also gathered information about high school GPA, family income, ACT scores, race, gender, and major (all variables that have been correlated with retention).  The results of the study were that they could not find a direct connection between library instruction and student retention, although library instruction does appear to have a “small measurable correlation with student performance” (which, in turn, is tied to success and persistence).  There were a lot of issues with the study that the authors themselves point out, including the fact that the students they included as having attended instruction sessions may not have – they were enrolled in the courses that came in but they may have skipped.

Wong, Shun Han Rebekah, and Dianne Cmor. “Measuring Association Between Library Instruction and Graduation GPA.” College & Research Libraries 72.5 (2011): 464–473.

Librarians from Hong Kong Baptist University looked at the correlation between GPA and library workshop attendance for 8,000+ students who graduated between 2007 and 2009.  The findings were that GPAs were positively correlated with increased workshop offerings.  In programs that offered 5 workshops, GPAs were highest.  In those that offered 3 or 4, GPAs were positively affected and in those that offered 1 or 2, there was no positive correlation.  Workshops, in this case, were a mix of required and voluntary, stand-alone and course integrated.  One issue with this (and many) study is that it is only about correlation, not causation.

Bowles-Terry, Melissa. “Library Instruction and Academic Success: A Mixed-Methods Assessment of a Library Instruction Program.” Evidence Based Library and Information Practice 7.1 (2012): 82–95.  

This study from the University of Wyoming used a mixed-methods approach, with qualitative data provided by focus groups with 15 graduating seniors and quantitative data provided by transcripts for about 4,500 students. The interesting thing about this study is that it provided some evidence for the idea that scaffolded information literacy instruction is most effective for student success. Students in the focus groups said the ideal form of instruction was a session in their freshman year and then at least one more when they were farther along in their majors, focusing on research in their discipline. Transcript analysis showed a correlation (not causation) between GPA at graduation and receiving upper-division library instruction. Once again, the authors identified issues, such as not knowing whether students in the transcript analysis actually attended sessions or skipped that day, and the fact that the analysis only showed correlation.

So what is the answer to our question? A definitive “we don’t know.” And where does that leave us as we struggle to demonstrate our value to the teaching and learning mission of UT? It is clear that researchers in libraries are attempting to answer the question of whether what we do in library instruction is transferable and positively impacts students’ retention, graduation, and academic success. It is also clear that we can’t definitively say it does. On the plus side, I didn’t find anything saying it harmed students.

Questions for discussion:

  • How do you articulate the value of library instruction to the students you work with?  To the faculty?
  • Is there something we could or should be doing here in the Libraries to attempt to answer the question?
  • Does the fact that we don’t know affect your plans for library instruction provision?
  • Does the fact that we don’t know (beyond anecdotal evidence from our faculty) even matter?

 

 

RIOT: Political Blogs

Information Literacy in the Study of American Politics: Using New Media to Teach Information Literacy in the Political Science Classroom
Behavioral & Social Sciences Librarian, Volume 32, Issue 1, 2013

I chose this article because it looks at an interesting collaboration between a librarian and a political science professor. It also challenges my thinking about how to present evaluative criteria for resources. Given the rise and ubiquity of political blogs, news aggregators, amateur journalism sites, and social networks, it’s important to think about how to use them in teaching information literacy. As the authors put it: “the new media environment for covering American politics is a chaotic blend of independent bloggers, Internet media aggregators (e.g., The Huffington Post), social media networks, and traditional news organizations with a Web presence. In this context it becomes necessary to think about IL more as a group of methods for thinking about and analyzing the claims made by variegated information sources than as a set of skills that can be taught divorced from a disciplinary engagement with the information content.”
The authors describe an assignment in which 12 undergraduate students looked at a competitive congressional race. They were instructed to examine a number of variables, like fundraising information, campaign tactics, advertising, and media coverage, and to consider local political history and demographic information for context. A challenge the authors saw was that it was easy for students to find bits and pieces of news information related to the assignment, but they had difficulty critically examining the claims or their sources, distinguishing between different types of content (for example, a highly polemical blog post versus an empirical analysis), and then synthesizing that information into a coherent and original analysis.

Based on their findings, the authors came up with four categories or types of students based on their work. These categories are fluid, and it’s probably not accurate to divide all students into these four neat groups, but I think they are instructive in giving insight into how students might engage with these new media and other information sources.

  • The Believer (4): Takes all news sources as trustworthy. There is no attempt to judge the verity of claims either in the context of the news item itself, or on any understanding of the institutional platform from which the reporter is writing.

  • The Cynic (4): Claims that nothing written about a campaign can be trusted. In the competition to win an election, candidates and their campaigns will distort facts. All reporting about the campaign is similarly biased, with amateur and professional journalists having some agenda that favors one side or the other.

  • The Opportunistic Surfer (2): Takes satisfaction in the easily available and diverse sources of information open to the technology-savvy researcher. The benefit of access to information is not so much deeper analysis as using the technology to find easier ways to collect information.

  • The Discerning Analyst (2): Can navigate through all types of information sources and can evaluate the veracity of claims using disciplinary tools and concepts from history, political science, and current affairs. That is, the analyst can draw on recent historic events like previous elections.

These new media sources can provide an amazing array of opinions and viewpoints on current events and policy developments that were not available 10 years ago.

Questions / Points of Discussion:

What has your experience been like working with new media in information literacy sessions?

RIOT: Visual Literacy Discussion

Beatty, N. A. (2013) Cognitive Visual Literacy: From Theories and Competencies to Pedagogy. Art Documentation, 32(1), 33-42.

This article reviews the ACRL standards and demonstrates ways to integrate visual literacy instruction into the classroom. The author also reviews cognitive theories associated with visual literacy.

First, the author makes the case for why visual literacy is essential to being literate in the 21st century. Images are everywhere, and we interact with them daily in both our professional and personal lives; creating and posting images is a regular activity for most of us. The author argues that librarians can incorporate visual literacy instruction into information literacy instruction, and I would add: when it is appropriate or when it makes sense. She also argues that cognitive theories such as Dual Coding Theory, Cognitive Load Theory, and Multimedia Learning Theory can help teach visual literacy to students.

Here are brief explanations of the theories:
Dual Coding Theory: humans process information through two channels, a visual memory and a verbal memory.
Cognitive Load Theory: working memory is limited, so new information is best presented by tying it to knowledge already in long-term memory.
Multimedia Learning Theory: people learn more deeply when words and images are presented together than from words alone.

The Visual Literacy Standards definition: “a set of abilities that enables an individual to effectively find, interpret, evaluate, use and create images and visual media.”

The author mentions the following visual literacy standards and performance indicators in the paper, though this is not a complete list. I have provided examples for application, from either the author or myself:

Standard 1. The visually literate student determines the nature and extent of the visual material needed.
1. The visually literate student defines and articulates a need for an image
2. The visually literate student identifies a variety of image sources, materials, and types (ex. Help students find images, show them more effective ways to find images, and introduce tools)

Standard 2. The visually literate student finds and accesses needed images and visual media effectively and efficiently. (ex. Ask students to find images on a particular topic)

Standard 3. The visually literate student interprets and analyzes the meanings of images and visual media.
1. The visually literate student can identify information relevant to an image’s meaning.
2. The visually literate student situates an image in its cultural, social and historical contexts.
3. The visually literate student should be able to identify the physical, technical and design components of an image. (ex. Analog or born digital; original or reproduction; altered or manipulated)
4. The visually literate student validates interpretation and analysis of images through discourse with others. (ex. this could be done in a seminar style class)

Standard 4. The visually literate student evaluates images and their sources (ex. Comparing images of an iconic work like the Mona Lisa)

Standard 5. The visually literate student uses images effectively for different purposes. (ex. Performance indicator 2: using technology effectively. Using new digital media lab and programming available)

Standard 6. The visually literate student designs and creates meaningful images and visual media. (ex. Performance indicator 3: using a variety of tools and technologies to produce images and visual media. Again leveraging the digital media lab offerings)

Standard 7. The visually literate student understands many of the ethical, legal, social and economic issues surrounding the creation and use of images and visual media, and accesses and uses visual materials ethically. (ex. Comparing the same image and metadata/citation from two different sources)

Questions:
The author talks about finding an image in an art history class, but do you teach classes where the students clearly need to find images?

Are you familiar with the ACRL Visual Literacy standards? Do you incorporate Visual Literacy into library instruction?

Do you think it is important to discuss visual literacy with students outside of the visual disciplines? Or do you think visual literacy is interdisciplinary?

Do you show students how to find images and how to use particular tools to find images?

Do you teach students how to cite images analogous to citing textual sources?

How can we create opportunities in the new Digital Media Labs for teaching Visual Literacy?

Discussion: One-Shot Library Instruction, does it work?

RIOT began with a round-robin. Roxanne shared a recent “crashing failure.” She worked with a nutrition professor to assign pre-readings on how to write scientific articles. Not many of the students did the reading. The professor did not attend the session and was not there to scold them. Roxanne dealt with this problem by summarizing the readings for the students, proving that she was flexible and able to think on her feet.
Michele shared that Meghan had assigned some preliminary readings and tutorials before some of her classes, and it worked. There was probably some kind of accountability, or perhaps the students had to submit something beforehand.
Janelle shared her experience. She assigned something that the students had to complete before the session. She said it was a success.
Cindy shared her strategy of a two-shot instruction session: she assigns something to be submitted and works with the professor to make sure there is a participation grade in Canvas.
Martha then summarized why she thought the article was interesting:
• It was realistic: a one-shot, 50-minute session
• She liked that they used a Google search to evaluate comprehension of concepts
• She liked the blind methodology of not telling students what they were really studying
• Overall, its simple approach was refreshing

Martha was also heartened because the study showed that one-shots actually do work.

Other points of interest:
• Background literature: internet is easier to use than library resources; students will sacrifice quality for ease-of-use
• Students with low info-literacy skills are less likely to know that they need training. “They don’t know that they don’t know”
• Sex/gender or other variables didn’t have a significant influence
• Students who had library sessions made better judgments about the authority of the resources and had better/more sophisticated justifications of their judgments
• Students demonstrated that they were transferring the skills and using these techniques in more personal, casual searches

Martha asked:
• How can we incorporate these findings in how we approach instruction?
• Are there any interesting concepts that are not being addressed?
• What did people think of the study’s methodology?

Kristen shared that there is often not enough time in these sessions to cover evaluating information.

Michele said that we know that one-shots are not enough, but that’s all we have.

Cindy questioned whether we could use these findings to demonstrate the need for more library instruction and to make the case for selecting relevant, non-library resources later in life.

Martha stated that the study shows that library sessions are more than databases and tools: they are about critical thinking and information literacy.

Kristen stated that there is something to be said for teaching students that there is proprietary, subscription-based information.

AJ said that this is the other side of libraries promoting open access: promoting that the Libraries have access to proprietary information.

Cindy said she thought the Libraries should do more to promote the public library and access.

Janelle pointed out that it is often difficult to find academic research at the public library and that she recommends that graduates join professional associations to access those associations’ journals.

Roxanne uses a pre-class survey to determine students’ exposure to info-literacy and previous library instruction.

Many spoke of increased library usage and questions from students who had information literacy sessions. The study showed that students ask more, and more complex, questions after information literacy sessions.

The group discussed that students often do not know what kinds of questions to ask. We may need to provide examples: What can you Ask a Librarian?