Teaching | Scholarship | Service

Below are selected aspects of my professional work as an academic librarian. As tenured faculty, I am evaluated in the areas of research, librarianship/teaching, and service.

[tabby title=”Responsibilities”]My current position is Associate Professor of Library Science and Head of Information Use and Fluency at Milner Library at Illinois State University. My research encompasses pedagogical methods for incorporating information fluency into curricula (specifically through the development of digital learning objects), exploring the difference between information fluency and information literacy, and utilizing various types of assessment methods for a comprehensive picture of student knowledge and learning. I am also interested in the application of graphic design, instructional design, and web design principles for course development.

As department head, I facilitate and guide the teaching and learning initiatives at Milner Library. This includes a variety of activities, among them collaboration and instruction with the ISU Critical Inquiry courses (first-year students), working with subject liaisons to develop scaffolded learning outcomes, and creating a comprehensive Information Fluency Plan. I am an advocate of active learning techniques and of using portable digital learning objects to enhance the student learning experience and environment. My position also allows me to participate actively at the campus level through various standing committees and special task forces. Additionally, I participate in the administration of Milner Library by serving on several administrative committees.

[tabby title=”Teaching Philosophy”]Academic libraries are in a unique position to connect and interact with students in a variety of ways. I firmly believe that educating students to be information fluent in our information-rich environment needs to be front and center. Being information fluent is more than just finding, using, and evaluating sources. It also encompasses the management, synthesis, and creation of information and knowledge utilizing a variety of technologies. As educators, it is our responsibility to help students achieve this.

One of the best aspects of my current responsibilities is helping to shape and promote library-based curriculum focused on information fluency. I am an advocate of active learning techniques and teaching at the point of need. To connect with today's incoming students, being a facilitator in a collaborative and active learning environment is imperative. Research shows that motivation is a key component of engaging with new content. Since completing a project or assignment is a common motivation for college students, academic librarians can tap into this to guide students through their process. Utilizing digital learning objects to reach students in their environment is a strategy that can create a stronger connection between the library and students.

Effective assessment is another key aspect of a strong teaching program within an academic library. Getting a well-rounded picture of what students know and can do is key to shaping curriculum development and a teaching program. It is also important to use multiple types of assessment to gain better insight into a specific student population. These need to range from observational assessment within the classroom to examination of student work to overarching, system-wide analysis.

[tabby title=”Instructional Innovations”]Throughout my career I’ve had the opportunity to use the classroom as a lab to explore and experiment with different teaching methods and active learning strategies. This has allowed me to grow as a teacher and educator. As my responsibilities have grown and shifted to overseeing program development and implementation, I have been able to investigate other aspects of teaching that are outside of the typical classroom.

Critical Inquiry Outcomes

The development of the Critical Inquiry Outcomes was a collaborative effort with various colleagues. These outcomes were developed specifically for freshmen enrolled in the introductory speech and composition courses and are written to support a yearlong library curriculum. The first five outcomes cover the fall semester and focus on more introductory concepts and skills, including the library as space and place, becoming familiar with the ISU and library research environment, and general search techniques. The last three outcomes are covered in the spring semester and build upon what was learned in the fall by focusing on more advanced search strategies, ethical use of information, and evaluating information.

Interface identification classroom activity

One thing I have noticed in the classroom as students attempt to use academic search engines and databases is that they do not use or even notice the various tools available to fine-tune their search results. In observations and discussions with students, most don't use these features because they don't understand what they mean or do. The unfamiliarity of the interface is a barrier in and of itself. I created an activity in which I give students a screenshot of search results from an academic database, place them into groups, and ask them to identify three ways to reduce the number of results. I tell them that they will share their selections with the rest of the class, but with a challenge: no group can repeat what was shared previously. Students love this little bit of a challenge and enjoy showing off what they know. This is a great activity because students are the ones discovering the tools themselves rather than me telling them or pointing out specific features. Once students start searching for sources, I invariably see them using the interface features more extensively.

CRAAP activity

This activity is a revision of the typical web site evaluation exercise (Credibility, Relevance, Accuracy, Authority, Purpose) and was created specifically for a group assignment within the introductory speech course. It was modified to accommodate multiple types of information sources, not just web sites. Students hand in this assignment and receive feedback from the librarian partnered with their class.

Online Module

This module was developed as a supplement to the introductory composition course, which does not have a required library session. The module outcomes are: reviewing module one, distinguishing between various types of sources in a variety of ways (source type, citation style, abstract, etc.), determining which types of sources to use, and using additional features of Search Anything. I developed the structure and content for the course. Of specific note are the two videos: Clues for Detecting Types of Sources and Additional Elements for Using Search Anything.

Instruction Tips

Providing professional development opportunities for Milner librarians is a component of the department's role, and as Head it is my responsibility to facilitate them. About two years ago the department developed an Instruction Tips newsletter. Each article is a short, focused piece on some aspect of the teaching and learning spectrum, and everyone in the department contributes articles. Originally the articles were located on a hosted site, accessible through a unique username and password. In an effort to make the newsletter more accessible, I moved it to a local web server and created an XML document to enable sorting and categorization of the articles.
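
As an illustration, here is a minimal sketch of how such an XML article list could be read, sorted, and grouped for display. The element and attribute names (article, title, category, date) are hypothetical, since the actual schema of the newsletter file is not shown here.

```python
# A minimal sketch, assuming a hypothetical schema for the Instruction Tips XML.
# Element and attribute names (article, title, category, date) are illustrative only.
import xml.etree.ElementTree as ET

SAMPLE_XML = """
<articles>
  <article date="2016-09-01">
    <title>Quick wins with think-pair-share</title>
    <category>Active Learning</category>
  </article>
  <article date="2016-10-15">
    <title>Writing measurable outcomes</title>
    <category>Assessment</category>
  </article>
</articles>
"""

def load_articles(xml_text):
    """Parse the newsletter XML into a list of dicts."""
    root = ET.fromstring(xml_text)
    return [
        {
            "title": a.findtext("title", default=""),
            "category": a.findtext("category", default="Uncategorized"),
            "date": a.get("date", ""),
        }
        for a in root.findall("article")
    ]

def by_category(articles):
    """Group article titles under their category for display on the site."""
    grouped = {}
    for art in articles:
        grouped.setdefault(art["category"], []).append(art["title"])
    return grouped

if __name__ == "__main__":
    # Sort newest first, then group by category for the newsletter index page.
    articles = sorted(load_articles(SAMPLE_XML), key=lambda a: a["date"], reverse=True)
    for category, titles in by_category(articles).items():
        print(category, "->", titles)
```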

[tabby title=”Teaching Challenge”]For most academic librarians, our teaching consists of guest lectures by invitation from the course professor or instructor. In my experience, many professors and instructors have only a very general notion of what librarians are able to bring to the classroom. For most, the idea of a guest lecture by a librarian is an overview of the library's web site and a quick demonstration of several academic databases. Because of this perception, the concept of linking or integrating the competencies of information fluency into a course's curriculum does not occur to many professors or instructors. Throughout my years of teaching, I have learned that I must initiate a deeper conversation with a professor or instructor about information fluency competencies prior to entering the classroom to ensure a positive learning experience for the students.

Early in my career, as I was just learning what it meant to engage in a conversation with faculty about library skills and competencies, I received a request to provide a library session for an introductory course to a major. In my reply to the professor, I included my customary questions: how many students were in the class and what was their makeup (e.g., all freshmen, a mixture of statuses, etc.), what assignment would the students be working on at the time of the visit, what were the resource and research requirements of the assignment, and was there anything specific s/he wanted the students to know. The professor happened to be the department chair, so I was excited about the prospect of forging a teaching partnership with this individual. The response I received was somewhat vague and included only a description of an assignment. I wasn't too concerned, as there were several weeks before the session, so I made a note to follow up closer to the session date.

As the date of the session drew closer, I again contacted the professor to confirm the date and time and to get more information about the class. I did not receive any response, so I contacted the professor again. This time I did get a response, but only to confirm the date and time. As time was running out, I decided to work with the information I had and developed a 50-minute library session around the assignment I had been given. As with most of the sessions I provided, it was a combination of overview, demonstration, and hands-on activities. The classroom was a flexible learning space with mobile tables and laptops, which could accommodate 30 students comfortably and was particularly well suited to having students work in small groups.

Approximately 20 minutes prior to the session, I went to the classroom to help the student assistants set up the room, boot up the instructor's workstation, load my PowerPoint, and open the specific web sites I would be using. About 5 minutes before the session, I was a bit surprised that no students had yet come into the classroom. However, professors would often meet their students at the entrance of the library and walk the entire class down to the classroom together. Five minutes after the time the professor indicated the class started, there were still no students and no professor. I waited another 5 minutes; still no students or professor. At that point I went up to the entrance of the library to check whether the class was there or possibly in a different classroom. Nobody was around. I then called the professor's office, wondering if I had the wrong day and time. No answer. I left a voice mail message. Twenty minutes after the class was supposed to start, with still no students, I shut down the instructor's workstation and told the student workers they could put away the laptops. I assumed there had been some type of miscommunication. As I started back to my office, I was met by what looked like a professor leading a class to the classroom. I introduced myself to the professor and quickly discovered it was indeed the class scheduled to be at the library.

As I quickly restarted the instructor's workstation and turned on the LCD projector, I instructed the students to take a seat as they entered the classroom. And students kept coming in, and coming in, and coming in. The class had 60 students enrolled. There were students everywhere — sitting on the floor and standing in the back. By the time all the students got situated there were maybe 20 minutes left of class. To make the most of the situation, I decided to forgo most of the overview and demonstration components of my lesson and jump right into the portion dealing with the class assignment. I started asking questions about topics and resource requirements. Absolute blank stares from some students, but mostly confused looks. All of a sudden the professor sitting in the back stated that the assignment had been changed that semester, and he explained what they were doing. My confused response was, 'Okay, that wasn't the assignment you sent to me.' The professor then informed me he wasn't actually teaching the class but was just filling in for a couple of sessions while the professor of record was at a conference. In an attempt to salvage the class and engage the students, I started asking questions related to the actual assignment. Students at this point were completely disengaged. Many were ignoring me entirely, talking to each other or reading the student newspaper. The students who were marginally paying attention made no attempt to answer the questions I was asking. Finally, one student raised his hand and informed me that they had not yet been given the assignment, so they didn't really know how to answer the questions. I have to admit at this point I was quite mad and had a hard time concealing it. All I ended up doing for the class was a 10-minute generic demonstration of a relevant database. After the class was over, the professor came up to me and said, 'That was great! It is so important that the students come into the library and learn about the web site and databases.' I was so angry that all I could do was mutter a banal response and walk away. I felt that this professor had no regard for my expertise or for me as an individual.

It took me a while to get to a point where I could process the experience without putting all of the blame on the professor and reflect on how to avoid something like that again. Through my reflection I realized I had made a huge assumption that the professor understood why I was asking the questions in my email message. I was assuming he knew that I would be creating a session linked to an assignment the students were actually working on and that I would be planning to teach for the entire session. At that point I recognized I needed to be upfront about my expectations and my teaching methods. I needed to state specifically what I would be covering, how I would engage students, and how much time it would take. I also needed to be clear about classroom capacity and fire code; I did get reprimanded for teaching the class with that many students in that space. Once I started being clearer and more forthcoming in my initial communications, I found professors and instructors were much more open to discussing what students could and would learn. Most didn't even realize that a library session could be more than just a generic database demonstration. Fortunately, I have not had a situation like that again. However, I am still surprised how many professors think the only option for a library session is a database demonstration.

[tabby title=”Scholarship”]As I have progressed through my career, I have been drawn to exploring how students learn, the cognitive processes that drive retention of content, and various delivery methods related to teaching. My research encompasses pedagogical methods for merging information and technology into curricula (specifically through the development of digital learning objects) and utilizing various types of assessment methods for a comprehensive picture of student knowledge and learning. I am also interested in the application of graphic design, instructional design, and web design principles for course development. Included here is a selected list of my publications and presentations. Full-text versions of some of my articles are also available in the ISU ReD institutional repository.

Publications (selected)

In Print

*Rinehart, Amanda, Jennifer Sharkey, and Chad Kahl. “Learning Style Dimensions and Professional Characteristics of Academic Librarians.” College & Research Libraries 76.3 (2015). [Abstract & Full text]

*Sharkey, Jennifer, Bill McMillin, and Trisha Prosise. “One Size Can’t Fit All: A Multi-Layered Assessment Approach to Identifying Skill and Competency Levels.” In Fiesta De Excelencia: Celebrating Excellence in Library Instruction: Thirty-Ninth National LOEX Library Instruction Conference Proceedings Fort Worth, Texas, May 5-7, 2011, edited by Brad Sietz, Randal Baier, Susann DeVries, Sarah Fabian, Sara Memmott and Robert Stevens. Ypsilanti, MI: LOEX Press, 2013. [full text]

*Sharkey, Jennifer. “Establishing twenty-first century information fluency.” Reference & User Services Quarterly 53.1 (2013). [full text]

+Sharkey, Jennifer. “Creating Documentary Shorts in a Credit-bearing Information Literacy Course.” Information Literacy through the Streets of Hollywood. Eds. Germain, Carol Anne and Gerald T. Burke. Active Learning Series. Pittsburgh, PA: Library Instruction Publications, 2011.

+Sharkey, Jennifer. “Political and Social Agendas: Analyzing Commercials & Public Service Announcements Attempting to Sway Public Opinion.” Information Literacy through the Streets of Hollywood. Eds. Germain, Carol Anne and Gerald T. Burke. Active Learning Series. Pittsburgh, PA: Library Instruction Publications, 2011.

+Sharkey, Jennifer. “Beyond the Keyboard: Optimizing Technology Spaces for Collaborative Learning, Instruction, and Service.” Teaching with Technology: An Academic Librarian’s Guide. Eds. Joseph Williams and Susan Goodwin. Oxford, UK: Chandos Publishing (Oxford) Limited, 2007. [more information]

*Rein, Diane, Jennifer Sharkey, and Jane Kinkus. “Integrating Bioinformatic Instruction into Undergraduate Biology Laboratory Curricula” in Tested Studies for Laboratory Teaching, Proceedings of the 28th Workshop/Conference of the Association for Biology Laboratory Education (ABLE). Ed. M.A. O’Donnell. 2007: 183-216. Simultaneously published in a CD-ROM format. [description] [fulltext]

+Sharkey, Jennifer. “Utilizing Filmmaking to Advance Generation Y’s Information Fluency.” LOEX Quarterly Fall 2006. [full text] [citation/abstract]

*Sharkey, Jennifer. “Towards Information Fluency: Applying a Different Model to an Information Literacy Credit Course.” Reference Services Review 34.1 (2006). [full text] [abstract]

*Sharkey, Jennifer and F. Bartow Culp. “Cyberplagiarism and the Library: Issues and Solutions.” Reference Librarian 44.91/92 (2005). [full text] [abstract] Simultaneously published in: The Reference Collection: From the Shelf to the Web. Ed. William J. Frost. New York: Haworth Press, 2005. [table of contents (from LC)] [Preview from Google Books]

Presentations (selected)

Catanzaro, Salvatore, Doug Smith, George Seelinger, Aaron Paolucci, Jay Percell, and Jennifer Sharkey. “Designing Learning Spaces to Engage Students with Emerging Pedagogies: A Panel Discussion.” Presentation. 15th Annual University-Wide Symposium on Teaching and Learning. Illinois State University, Normal, Illinois. January 7, 2015.

Sharkey, Jennifer. “Establishing an Information Fluency Plan for Navigating the Changing Instruction Landscape.” Panel Presentation. New Directions in Information Fluency. Augustana College, Rock Island, Illinois. April 5, 2014.

Sharkey, Jennifer. “Helping Students Engage in Academic Integrity.” Workshop. Center for Teaching, Learning, and Technology Faculty Fellow Summer Institute. Illinois State University, Normal, Illinois. May 22, 2013.

Sharkey, Jennifer. “Assessment for Teaching and Learning.” Guest Lecture. LIS590: Advanced Library Instruction, University of Illinois Graduate School of Library and Information Science. Urbana-Champaign, Illinois. March 26, 2013.

Sharkey, Jennifer, Dane Ward, and Chad Kahl. “Moving Beyond the Database Demo: Helping Students Interact with and Create Information.” Presentation. 13th Annual University-Wide Symposium on Teaching and Learning. Illinois State University, Normal, Illinois. January 9, 2013.

Sharkey, Jennifer. “Integrating Information Literacy into Curriculum.” Workshop. Center for Teaching, Learning, and Technology Faculty Fellow Summer Institute. Illinois State University, Normal, Illinois. July 23, 2012.

Sharkey, Jennifer. “Assessment for Teaching and Learning.” Guest Lecture. LIS592MBE: Advanced Library Instruction, University of Illinois Graduate School of Library and Information Science. Urbana-Champaign, Illinois. March 23, 2012.

Sharkey, Jennifer. “Information Use & Access in the Digital Environment.” Presentation. Technology & Pedagogy ISU/IWU Joint Workshop. Illinois Wesleyan University, Bloomington, Illinois. January 28, 2012. http://prezi.com/vyhbh4lz3yzu/information-use-access-in-the-digital-environment/

Sharkey, Jennifer, Joyce Walker, and Bill McMillin. “Gauging Students’ Information Fluency Through Multiple Assessments.” Presentation. 12th Annual University-Wide Symposium on Teaching and Learning. Illinois State University, Normal, Illinois. January 11, 2012.

Franzen, Susan, Jennifer Sharkey, Frances Whaley, Kelly Fisher, and Susan Avery. “Information Literacy and General Education Goals: A Brilliant Combination.” Panel. 2011 Illinois Library Association Annual Conference. October 18, 2011.

*Sharkey, Jennifer, Bill McMillin, and Trisha Prosise. “One size can’t fit all: a multi-layered assessment approach to identifying skill and competency levels.” Presentation. Fiesta De Excelencia: Celebrating Excellence in Library Instruction: Thirty-Ninth National LOEX Library Instruction Conference. Fort Worth, Texas, May 5-7, 2011.

Sharkey, Jennifer, Bill McMillin, and Trisha Prosise. “Assessing First Year Students’ Information Literacy: a multifaceted approach.” Information Literacy Summit 2011. Normal, Illinois, April 18, 2011.

Sharkey, Jennifer and Joyce Walker. “Critical Inquiry: An Interdisciplinary Information Literacy Collaboration.” Information Literacy Summit 2010. Normal, Illinois, April 20, 2010.

Hooker, John, Joyce Walker, Nancy McKinney, and Jennifer Sharkey. “Sustainable Teaching in General Education: Critical Inquiry in ENG 101, COM 110, and Milner Library.” Presentation. 10th Annual University-Wide Symposium on Teaching and Learning. Illinois State University, Normal, Illinois. January 6, 2010.

*Sharkey, Jennifer and Catherine Fraser Riehle. “Beyond the entertainment factor: integrating multimedia into library instruction projects and activities.” Workshop. 2009 ACRL National Conference. Seattle, Washington. March 14, 2009.

[*peer reviewed; +editor reviewed]

Academic Blog Posts

“IT and the Year 2020: 13 Questions We Are Still Asking.” Musings: Info, Tech, Learn, Teach, January 22, 2014.

“Information Fluency versus Information Literacy: What Is the Difference?” Musings: Info, Tech, Learn, Teach, February 17, 2014.

“Source Distinctions: Print vs. Electronic – Still a Viable Differentiation?” Musings: Info, Tech, Learn, Teach, February 24, 2014.

“Scarce and Abundant Resources.” Musings: Info, Tech, Learn, Teach, March 13, 2014.

“Humor in the Library Classroom or ‘Why Are We so Serious All of the Time?’” Musings: Info, Tech, Learn, Teach, April 30, 2014.

“The Road to Validity #1 (Assessment Series).” Musings: Info, Tech, Learn, Teach, January 28, 2016.

“The Road to Validity #2 (Assessment Series).” Musings: Info, Tech, Learn, Teach, February 4, 2016.

“Professional Discourse about the ACRL Framework (a Chronology).” Musings: Info, Tech, Learn, Teach, April 22, 2016.

[tabby title=”Service”]

Milner

Public Services Heads Committee, Standing Member, 2014-present
Library Operations Council, Standing Member, 2009-present
Search Committee – Scholarly Communication Librarian, Chair, 12/2016-04/2017
Department Faculty Status Committee, Elected Member, 2014-2017
Library Technology Committee, Public Services Representative, 2012-2015
Information Literacy Summit, ISU Coordinator, 2012-2013
Public Services Coordinating Committee, Standing Member, 2011-2013
Library Faculty Council, Elected Member, 2011-2014

University

Committee on Critical Inquiry, Standing Member, 07/2009-present
Educating Illinois Revision Task Force, Appointed Milner Representative, 2017-2018
University Teaching Committee, Appointed Milner Representative, 09/2011-05/2017
Council on General Education, Appointed Milner Representative, 09/2012-05/2016
Instructional Technology Domain Team of the Enterprise Architecture Initiative, 2013-2015
Dean of Students – Human Library Project, 2013-2014
Search Committee – Director of the CTLT, Appointed Milner Representative, 10/2012-05/2013
Foundations of Excellence, Learning Dimension Committee, Appointed Member, 09/2012-05/2013
General Education Review Task Force, Appointed Milner Representative, 09/2010-05/2012

National Organizations

American Library Association (ALA) – member since 1997
Association of College and Research Libraries (ALA/ACRL) – member, 2003-present
Member, Instruction Section (IS), 2003-present

Member at Large, Executive Committee, 2011-2013
Liaison to Communication Committee, Instructional Technologies Committee, and Web Revision Task Force, 2011-2013
IS Co-Web Administrator, 2007-2011
Ex-officio Non-voting Member, IS Advisory Committee, 2007-2011
Communication Committee, 2007-2011

Member, University Libraries Section (ULS) 2015-present

Communication Committee, Blog Team (Lead: 2016), 2015-2017
Communication Committee, Chair, 2017-2018
Awards Committee, Member, 2017-2019

[tabbyending]

 

Academic Blog Posts – Full Text

IT and the year 2020: 13 questions we are still asking

Posted on January 22, 2014

I recently came across an article(1) by Robert C. Heterick, Jr. and John Gehl, written in the nineties and published in Educom Review (now known as EDUCAUSE Review), discussing what IT in education would look like in the year 2020. The authors make some interesting statements and pretty astute observations. They refrained from the obvious discussions of that time period, such as predicting that paper resources would no longer exist, and instead focused on the broad picture: the larger impact of digitization efforts on society as a whole and the adoptability of devices by the mass population. One key phrase demonstrates this: “in the year 2020 the capability [“electronic data interchange”] will be global (geographically) and universal (unrestricted to transaction type).” Just six years away from 2020, we as a society have rapidly moved in that direction with the numerous platforms and applications available; we are communicating in a very global way.

We have widely available functionalities that do not bind us to specific software applications. We can create presentations, videos, and images online and via our smart devices — most of which are free to use. (The “cost” of releasing personal data is a subject for another time.) The Open Source movement is definitely an indication of a desire to move away from proprietary, enterprise-based solutions. One area that will be interesting to watch is open video formats such as .ogv (Ogg Vorbis is an audio compression format that is free, open, and not patented; for more information visit the Xiph open source community at http://xiph.org/). The release and adoption of HTML5 makes utilization of these open source options easier. Of course, MOOCs are another trend people are watching very closely. Additionally, the Futures Lab at Reynolds Journalism Institute presents quick video reports on up-and-coming web-based tools.

Heterick and Gehl reiterate Donald A. Norman's concept that technology is easy to predict but its social impact is much harder. Certainly by today's standards, high social impact is one of the only ways a new technology can survive. With the idea of social impact in mind, the authors pose thirteen questions all educators need to consider when trying to plan for the future. What is so interesting about these questions is that we are still asking them and still seeking answers. In today's stretched economic environment, following the recent recession, they seem more relevant and important to ask than ever before. Some are easier to discuss and perhaps answer than others. If we are ever able to answer these questions, does that ensure higher education will survive? Maybe. I think what is more important is that we never stop asking them, and that we ask them before a crisis hits rather than during or after.

  1. What will it mean to “register” a student?
  2. What will it mean to “be” a student?
  3. What will credentials and degrees signify?
  4. What will a course of study look like, and who will design it?
  5. Will there be any difference at all between distance learning and traditional education?
  6. Will the distinction between individual effort and collaborative effort be different in 2020 from what it is now?
  7. What will it mean to be “published”?
  8. What will the difference be between a faculty member and an author?
  9. What will the difference be between an author and a publisher?
  10. How will the costs of higher education be apportioned among the state, the federal government, philanthropy, and the student?
  11. What will be the role of the residential campus? What percentage of students in higher education will be seeking a first degree versus continuing education?
  12. Will the calendar of 50-minute classes, 16-week semesters, summer breaks, and so forth continue?
  13. Will the hierarchy of research, comprehensive, liberal arts, and two-year institutions make sense?

Within the structures of higher education and the student experience, a universal experience, one that is unrestricted to transaction type, is still a challenge. Students often struggle with navigating the multiple requirement lists to obtain a degree. Transferring to another school can mean adding one, two, or more years, with implications that are not only temporal but monetary. As educators we are talking about many things to enhance the student experience — effective teaching strategies, developing quality assessments, implementing programs like writing across the curriculum. Certainly, the discussion about what it means to be a professor – published, tenured, etc. – is quite active. Many departments struggle with trying to recognize achievement within the new ways we are communicating, creating, and connecting. As I sit in numerous campus meetings, I am not sure we are any closer to being able to effectively answer those thirteen questions. I do know we can't depend on any one person to find the answers; we have to work together to determine what works within our own institutions.

1. Heterick, R., & Gehl, J. (1995). Information technology and the year 2020. Educom Review, 30(1), 22. http://www.educause.edu/node/158250.

information fluency versus information literacy: what is the difference?

Posted on February 17, 2014

Back in the spring of 2009 I created a Google alert for the phrase information fluency. At that time the phrase was being thrown around left and right, and the alert helped me stay abreast of what people were posting about the term. Until about March 2011, the alerts came daily, often indicating upwards of five mentions across certain segments of the web. Not long after that I started to notice a slight downturn in the number of alerts, often with a one- or two-day gap. Since that time I've seen a steady downward trend to where there are now significant gaps between the alerts.

What does this mean? Does it mean no one is focusing on information fluency any more? Google alerts certainly are not going to answer that question; I am under no illusion that everything posted on the web is caught in a Google alert. However, based on my limited analysis of the alerts I did receive, it is clear information fluency was a buzz phrase that was easy to throw around in education circles, while at the same time many people were struggling to clearly define what exactly the phrase meant and represented.

Here it is 2014 and I don’t think anyone is any closer to creating a definition that is clearly accepted by the many groups that want to promote the idea of information fluency. In my own professional circles where many of us have built careers off of the concept of information literacy, the phrase information fluency just became another way to talk about information literacy. Many people started talking about transliteracy or metaliteracy as a mechanism to combine all possible literacies. I actually disagree with that sentiment and trend. What has happened is that every potential knowledge domain now has its own literacy. How does that help anyone? All we have now is a list of literacies that one needs to check off to deem oneself a capable, informed, and engaged citizen.

It is no secret that I am not a fan of the word literacy when it comes to describing the type of teaching we librarians are doing (or at least should be doing) in higher education. Once students reach our doors we should be focusing on much more than rudimentary ways in which students can interact with information; fortunately many of us are. The newly published 2014 Horizon Report: Higher Education edition is a clear indication that the learning environment has moved well past the traditional practice of students sitting in a room being lectured to. Higher education is moving towards supporting environments that encourage and foster knowledge creation rather than knowledge consumption. As a profession, we also need to be thinking in those terms.

How do we define information fluency? I believe we need to step back and truly acknowledge that everything around us is information regardless of platform or format. Not so long ago we could clearly separate tools and information. Where the two were linked was that specific tools allowed access to specific information. In our current more ubiquitous communicative environment, the tools used to find and access information are less relevant. Today, fluency is represented in the quality and ability of discourse, methods of interaction, willingness to learn, and avenues of creation. It is not an umbrella for different literacies. As information professionals we need to be thinking about how we help students achieve these higher order aspects and concepts that define and represent fluency.

Among those who hold the mindset that information fluency is just another phrase for information literacy, I often hear people say that what I describe above is implied in the current ACRL standards and definition of information literacy. I don't think implied intent is a good model, particularly when it comes to teaching and learning. Fortunately, an ACRL task force is working to revise the current information literacy standards and, from the reports I've read, is really rethinking the overall approach. My concern is that the phrase information literacy will remain. It is not that I don't support the efforts of my talented colleagues; I am happy the current standards are being evaluated and revised. My concern is that by using the same label there will be confusion and increased reluctance to consider anything new. I can't honestly say information fluency is a better phrase. What is tough about information fluency (as well as information literacy) is its abstractness and the fact that no one really “owns” it like the traditional knowledge domains we've created in higher education. As academic librarians, we need to think carefully about the language we use as a means to communicate our intent and goals for advancing student learning.

source distinctions: print vs. electronic – still a viable differentiation?

Posted on February 24, 2014

With the semester in full swing, I am still surprised when I talk with students about what sources they need for a project and the response is “I can't use anything from the Internet; I need scholarly sources.” Often these are novice researchers still trying to figure out what a scholarly source actually is and what it means to do research as a university student. These students still think in black-and-white terms – using the right sources, avoiding the wrong sources, meeting the source limit criteria, etc. Being told not to use Internet sources, without more distinction, inevitably creates a barrier for the student. All they see is Internet=wrong, print=right. As a result these students are unable to determine if electronic versions of books, journals, newspapers, magazines, images, etc. are appropriate.

This black-and-white viewpoint of the Internet starts early and is related to the way students are commonly taught about Internet content in school — very much a checklist approach. Many students I work with have an ingrained rote response to any question resembling what Internet sources are appropriate to use — always along the lines of .org, .gov, and .edu are good sites and .com sites are bad. In a recent bavatuesdays blog post, Going Online, Jim Groom discusses a worksheet his son needed to fill out to learn about “good websites.” To his chagrin (and it should also be to ours) the blog site was the “bad” site. He has a great quote at the end of the post which I definitely agree with, as it is important not only in K-12 but also in post-secondary education.

What if we start creating an elementary school curriculum that gets at the more dynamic and generative possibilities of the web rather than thinking about it like a set of static, scary resources.

Given our highly digitized world, we need to ask if it is still appropriate to make blanket statements like ‘Don't use Internet sources’ or ‘X sites are bad.’ In my work with college students, I find trying to teach within those parameters highly limiting, and it becomes more confusing for students. Those of us who observed the rapid evolution of the Internet from a rudimentary, text-based environment (many sucky websites both in content and design) to one that is highly complex and visually rich understand why the distinction of ‘Internet=Bad, Print=Good’ transpired. However, I feel like many of us in higher ed (where we are trying to encourage higher levels of critical thinking and inquiry) still have the mindset of the early days of the Internet.

In conversations with teaching assistants and professors about appropriate sources, specifically whether students can use Internet sources, the response is often that students shouldn't. When pressed further about digital versions of newspapers, magazines, ebooks, and scholarly journals, particularly those within a library database, the response is always ‘Yes, students can use those sources.’ At that point I move into “teachable moment” mode and ask further questions to understand how a professor or instructor makes the distinction between a scholarly article on the web and a commercial website. In numerous conversations with professors, I've come to realize that many see electronic versions of journals, books, and newspapers as ‘non-Internet’ because they rely on the construct that these originated on paper and are therefore not really part of the Internet.

As librarians, we need to help expand how various pieces of information are viewed and ultimately discussed within the classroom. In the eyes of students, if one uses a web browser to view content, that content is part of the Internet regardless of where it resides – e.g., a library database, an institutional repository, YouTube – or whether it is a digitized version of something that originated in a non-digitized format. As educators, distinguishing whether or not content originates in an Internet-ready format is not as important as focusing on how the content is significant to what is being learned, the questions students are asking, or the problems they may be trying to solve. It comes down to how the parameters and expectations of assignments are communicated. It is easy to include criteria like a minimum number of sources, appropriate formats, etc. It is more complex to engage students in a conversation about information sources and ways to interact with those sources to determine significance and relevance to the project.

Scarce and abundant resources

Posted on March 13, 2014

I was recently reminded of the article “Tech Is Too Cheap to Meter: It's Time to Manage for Abundance, Not Scarcity” by Chris Anderson, published in Wired Magazine. Even though it was written in 2009, it still has merit. Anderson puts forth an interesting discussion of how we perceive the abundance or scarcity of technology resources and how this influences our ability to leverage them in different and unique ways. The examples he uses in the article are computer storage space and streaming video. Neither is any longer an expensive or scarce resource, yet both are often viewed as such. The key question he asks is: when an organization views an abundant resource as scarce, what impact does that have on the organization's ability to meet customer needs and maintain goodwill?

As I read this article I of course began thinking about academic libraries and how we view certain technologies and resources, along with the policies we implement based on those views. The “no food and drink” policy is a good example of a policy based on the perception that items such as computer keyboards, mice, and all print sources are scarce or irreplaceable. Yet, as computer hardware costs continue to decrease, peripherals like keyboards and mice are no longer a scarce commodity, particularly since many institutions buy them in bulk for a fraction of the off-the-shelf retail cost. With print-based resources, really only a small percentage of those within a typical academic library are truly scarce or irreplaceable. As more materials are born digital or converted, document delivery systems become more robust, and consortia or shared library systems become more common, any print items that do not fall into the rare and irreplaceable category are more easily replaced or accessed should they become damaged. A more important policy for an academic library is one focused on good stewardship of resources and methods by which to meet the unique needs of its community.

Anderson also asks the opposite question: what are the implications for an organization that views a scarce resource as an abundant one?

Many instructional initiatives within academic libraries use a variety of technologies, including those that are part of Web 2.0. While these technologies are abundant, how they are used in connection with student learning varies significantly among librarians. Some approach their use within teaching environments with a scattershot approach — try everything and anything so maybe something will stick. Those who take this approach view time, both students' and their own, as an abundant resource. We all know (even those teaching in a scattershot way) that time is not an abundant resource, and teaching as if it were is not effective. Why do people do it, then? There are many potential answers to that question. However, in my experience, those who are new to teaching are more likely to take the scattershot approach out of a lack of experience rather than ill intent.

As someone who views technology as a conduit and support mechanism for learning outcomes, I am not prone to take a scattershot approach, although I do not necessarily take a wait-and-see approach either. For me the important questions to ask are ‘what is the impact on student learning when using X technology, and how does it connect to established learning outcomes?’ Student time and our time are scarce resources; we need to be cognizant of that. Being enthusiastic to try what's cool and new won't necessarily put us at risk of losing students' goodwill and, in turn, potential lifelong supporters. However, a bad learning experience can often translate into a student not asking for help when needed, perpetuate attitudes of “I already know how to do this so I don't need to participate,” or even establish a lack of respect for the expertise a library science professional does have.

In his closing remarks Anderson talks about a hybrid model evolving in our society, using YouTube and Hulu as examples. Both provide free access to streaming video, but one, YouTube, takes the scattershot approach — open to all for viewing and uploading, no criteria for content or quality (certain exceptions apply), and no commercials. It should be noted that, since Google purchased YouTube, a Google Ads overlay appears on most videos. The other, Hulu, has a more focused content type, provides high-quality streaming video, restricts who can upload, and requires viewers to watch short commercials. Even though both are free, each has a different model and perspective on abundance and scarcity. Both of these services are doing extremely well, and in the foreseeable future neither is in danger of fading use or activity.

For a complex organization like an academic library, choosing one model over the other (scarcity or abundance) is not necessarily appropriate. As academic libraries reevaluate today's learning environments and focus on student learning, we will need to consider how we engage students and what we consider to be scarce and abundant resources. In my view, my time and student time are scarce and therefore valuable — the learning opportunities we provide should take this into consideration. We should try to avoid the scattershot approach with technology. However, with today's abundance of web-based technology and applications, we have multiple choices for creating interesting learning environments with different types of technology.

I am continually amazed at the creativity of my fellow librarians. Teaching strategies to watch will be the flipped classroom, blended learning, and fully online learning. While all of these methods have been around for many years, what will be important is the movement away from generic, one-size-fits-all instruction to a personalized learning experience. How the personalization occurs can be multifaceted and should be connected with the goals of the program or institution. Take, for instance, the flipped classroom methodology. This strategy can allow for use of various technologies and at the same time provide a powerfully positive learning experience for students. In the past year, more discussion has popped up regarding academic libraries using this model. Here are three articles discussing interesting initiatives.

Arnold-Garza, Sara. 2014. “The flipped classroom: Assessing an innovative teaching model for effective and engaging library instruction.” College & Research Libraries News 75, no. 1: 10. http://crln.acrl.org/content/75/1/10.full

Datig, Ilka, and Claire Ruswick. 2013. “Four quick flips: Activities for the information literacy classroom.” College & Research Libraries News 74, no. 5: 249. http://crln.acrl.org/content/74/5/249.long

Lemmer, Catherine A. 2013. “A View from the Flip Side: Using the ‘Inverted Classroom’ to Enhance the Legal Information Literacy of the International LL.M. Student.” Law Library Journal 105, no. 4: 461-491. https://scholarworks.iupui.edu/bitstream/handle/1805/3815/Published%20Flip%20Article.pdf?sequence=1

Humor in the library classroom or ‘why are we so serious all of the time?’

Posted on April 30, 2014

I just recently came across Mashable's video about the deep web (embedded below). While not hold-your-gut-as-tears-roll-down-your-face funny, it is humorous. At my institution we do have videos that discuss the deep web (we use the term hidden web) and, to be honest, they aren't funny. They are straightforward and well constructed, but not funny. Mashable's video got me thinking about how academic librarians construct our instruction, whether online or in person. To be honest, most of us are serious most of the time. I often wonder why that is.

I’ve been told throughout my life that I am funny but not in a stand up comedian type of funny, in a dry witty way. (I could go into a tangent here about how this is an insult to comedians but I won’t digress.) I should note here that I don’t consider myself funny. Anyway, in the classroom, any shred of humor I may have with my friends and colleagues goes out the window. I am not sure what it is but I have this mental barrier when it comes to humor related to instruction whether it is in the classroom or through digital learning objects (and frankly, if I am honest with myself, all of my public speaking). I marvel at people who seem to be able to be funny with ease in front of large groups and in the classroom. Is it off the cuff or do they really prepare and practice to be funny?

The best summation of humor I found came ironically from an encyclopedia; certainly not a format one would associate with humor.

Humor is a ubiquitous, pervasive, universal phenomenon potentially present in all situations in which people interact. It is a complex, multifaceted phenomenon involving cognitive, emotional, behavioral, physiological, and social aspects that have a significant effect on individuals, social relations, and even social systems. 1

This definition made me realize that I perhaps make humor bigger than it needs to be. I perceive it as elusive, so therefore it will always be elusive. Various research into humor as an effective teaching tool indicates that it helps relax the environment, creates a stronger connection between student and teacher, and can enhance critical thinking.2 However, it has also been found that certain types of humor, such as funny stories and comments, professional humor, and jokes, are better than others and are perceived by students as the most appropriate.3

In my quest to know more, I am not surprised that over the years academic librarians have talked about ways to be funny in the classroom. While many students are reluctant to admit it, they do find the overall research process stressful and challenging. In her most recent report on student research behaviors, Alison Head specifically examined college freshmen. One of the key findings in her research was that

[n]early three-fourths of the sample (74%) said they struggled with selecting keywords and formulating efficient search queries. Over half (57%) felt stymied by the thicket of irrelevant results their online searches usually returned.4

If there is ever a time to cut tension, it is when students are really struggling. During my short investigation into the use of humor in the classroom, I've tested the waters a little bit. While I didn't spend any significant time planning a “comedy routine,” I did keep myself open to opportunities to insert humor. Interestingly, I found the most opportune times to be when exploring the selection and use of search terms. Just being aware of current pop culture provides a plethora of humor opportunities.

References

1. Westwood, Robert. “Humor.” In International Encyclopedia of Organization Studies, edited by Stewart R. Clegg and James R. Bailey, 621-24. Thousand Oaks, CA: SAGE Publications, 2008. doi: http://dx.doi.org/10.4135/9781412956246.n214.

2. Chabeli, M. “Humor: a pedagogical tool to promote learning.” Curationis 31, no. 3 (2008): 51-59. http://www.curationis.org.za/index.php/curationis/article/viewFile/1039/975

3. Torok, Sarah E., Robert F. McMorris, and Lin Wen-Chi. “Is humor an appreciated teaching tool? Perceptions of professors’ teaching styles and use of humor.” College Teaching 52, no. 1 (2004): 14-20.

4. Head, Alison J. Learning the Ropes: How Freshmen Conduct Course Research Once They Enter College. Research Report. Project Information Literacy, December 4, 2013. http://projectinfolit.org/images/pdfs/pil_2013_freshmenstudy_fullreport.pdf.

The Road to Validity #1 (Assessment Series)

Posted on January 28, 2016

Literature Review

Something I think about on a regular basis is assessment, particularly how to do it well and where to focus my department's and library's efforts. Of course assessment means many different things to many different people. How librarians approach or even define assessment is often very different from a college dean's definition and approach. Still, I do believe that at the core, most people in higher education view assessment as a vehicle to improve the student learning experience. Fortunately, the literatures of both library and information science and education are rife with articles, books, presentations, etc. discussing assessment, ranging from types of assessments to design techniques to analysis methods.

In my journey into assessment, I found that for many librarians the focus of assessment efforts is typically on student learning; however, it needs to encompass much more. As Avery emphasizes, when considering the overall student learning experience, assessment needs to focus not just on evaluating students but should also inform and shape how teaching programs are developed.[1] An important step is establishing a foundation from which assessment efforts can be built. This may be establishing the skill sets of a specific group of students (e.g., first-year) or documenting which classes have or have not had library instruction. When determining students' knowledge, pre- and post-tests are a good mechanism for finding out what students know or have learned.

Utilization of pre-/post-tests within library instruction is not new. A general search on this assessment method will return a significant number of articles in which pre-tests and post-tests were used to determine the level of student learning, the impact of instruction techniques, or the direction of an instruction program redesign. In making my way through the assessment maze, I personally found it challenging to determine which sources would be the most useful. Much of it came down to the goal of the assessment effort my library was trying to achieve and how closely aligned an article was with that goal. One of the earliest reports of using pre-/post-tests is Kaplowitz's description of a program's impact on library use and attitudes towards librarians.[2] This was an interesting article to read, as it highlights the beginning of a trend within library assessment efforts.

As discussion about the use and effectiveness of online courses and digital learning objects has increased substantially, research investigating types of instruction delivery methods provides context for the evolution of this movement. Not surprisingly, librarians were assessing these methods just as Web 2.0 was beginning to explode. In their article comparing teaching in online-only, in-class-only, and hybrid environments, Kraemer, Lombardo, and Lepkowski used an identical pre-test and post-test taken by all of the participating students.[3] Their analysis showed the most improvement in the hybrid class; however, this research was done in 2007, when online learning software and platforms were just beginning to mature. In a more recent study, Mery, Newby, and Peng evaluated pre-test and post-test scores of students who received different types of instruction to identify whether a particular method, either an online course or a one-shot guest lecture, had a higher impact on student learning.[4] Their results showed that the online course yielded the greatest improvement in students' skills. Credit-bearing information literacy courses offer one of the best environments for assessing student learning and using pre-/post-tests; research by both Joanna Burkhardt and Bonnie Swoger shows effective use of the pre-/post-test method in credit courses.[5] [6]
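
To make the pre-/post-test comparison concrete, here is a minimal sketch (standard library only, with invented scores rather than data from any of the studies cited above) that computes the average raw gain and the normalized gain for a small group of students:

```python
# A minimal sketch with invented scores; not drawn from any of the studies cited above.
from statistics import mean

# Paired pre-/post-test scores (percent correct) for the same five students.
pre_scores  = [45, 60, 55, 70, 50]
post_scores = [65, 75, 70, 85, 60]

# Raw gain: simple point difference for each student.
raw_gains = [post - pre for pre, post in zip(pre_scores, post_scores)]

# Normalized gain: improvement relative to the room each student had to improve.
norm_gains = [
    (post - pre) / (100 - pre)
    for pre, post in zip(pre_scores, post_scores)
    if pre < 100
]

print(f"Mean raw gain:        {mean(raw_gains):.1f} points")
print(f"Mean normalized gain: {mean(norm_gains):.2f}")
```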

Determining the best design or method to use for assessment is not always straightforward. In a review of 127 articles focusing on assessment tools used in libraries, 34% of the tools were placed in the multiple-choice questionnaire category.[7] When written well, multiple-choice questions can assess recall, understanding, prediction, evaluation, and problem solving.[8] While strongly favored within the library teaching community, these types of assessment tools do have limitations; they are not effective at gathering a holistic picture of students' skills and competencies.[9] While many pre-/post-tests are developed as multiple-choice tests or questionnaires, this does not preclude the option of using other assessment types or blending different types together in one tool.

Open-ended questions provide the option to gather qualitative data. Patton categorizes qualitative data into three kinds: interviews, observations, and documents.[10] Within the documents category he identifies open-ended survey questions as a data collection technique. In his book Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, Creswell discusses the advantages of using various types of assessment methods.[11] Mixed-methods research can refer to a) a methodology, b) the philosophy behind the research, or c) a method, i.e., the actual techniques and strategies used in the research. Specifically defined, “it focuses on collecting, analyzing, and mixing both quantitative and qualitative data in a single study or series of studies.”[12] Ultimately, Creswell concludes that a mixed-methods approach provides the best way not only to assess a population as a whole, but also to accumulate more granular data for subgroups or even individuals.[13] In her research study using mixed-methods assessment in the form of pre-/post-tests and interviews, Diana Wakimoto was able to explore the impact of a credit course on students' learning and satisfaction.[14]
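
As a rough illustration of what a mixed-methods record might look like in practice, the sketch below pairs each student's quantitative pre-/post-test gain with coded themes from a hypothetical open-ended response. The field names and theme codes are invented for the example, not drawn from Creswell or Wakimoto.

```python
# Illustrative only: hypothetical per-student records mixing quantitative and qualitative data.
from collections import Counter

records = [
    {"student": "S1", "pre": 50, "post": 70, "themes": ["keyword anxiety", "prefers Google"]},
    {"student": "S2", "pre": 65, "post": 80, "themes": ["keyword anxiety"]},
    {"student": "S3", "pre": 40, "post": 55, "themes": ["unsure what counts as scholarly"]},
]

# Quantitative strand: average score gain across the group.
avg_gain = sum(r["post"] - r["pre"] for r in records) / len(records)

# Qualitative strand: frequency of coded themes from the open-ended responses.
theme_counts = Counter(theme for r in records for theme in r["themes"])

print(f"Average gain: {avg_gain:.1f} points")
print("Most common themes:", theme_counts.most_common(2))
```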

Despite the growing evidence that mixed-methods assessment is the best route to take, it is important to note that institutional culture has a large impact on the type and success of any assessment initiative. It is not uncommon for large-scale assessments to be unrealistic for many libraries, and smaller-scale assessments are highly dependent on personal relationships with individual departments, programs, or faculty. However, assessment is a necessity, and quality small-scale efforts can often lead to larger-scale initiatives.

Endnotes

  1. Elizabeth Fuseler Avery, “Assessing Information Literacy Instruction,” in Assessing Student Learning Outcomes for Information Literacy Instruction in Academic Institutions, ed. Elizabeth Fuseler Avery (Chicago: Association of College and Research Libraries, 2003).
  2. Joan Kaplowitz, “A Pre- and Post-Test Evaluation of the English 3-Library Instruction Program at UCLA,” Research Strategies 4, no. 1 (1986).
  3. Elizabeth W. Kraemer, Shawn V. Lombardo, and Frank J. Lepkowski, “The Librarian, the Machine, or a Little of Both: A Comparative Study of Three Information Literacy Pedagogies at Oakland University,” College & Research Libraries 68, no. 4 (2007), doi: 10.5860/crl.68.4.330. <http://crl.acrl.org/content/68/4/330>
  4. Yvonne Mery, Jill Newby, and Ke Peng, “Why One-Shot Information Literacy Sessions Are Not the Future of Instruction: A Case for Online Credit Courses,” College & Research Libraries 73, no. 4 (2012), doi: 10.5860/crl-271 <http://crl.acrl.org/content/73/4/366>
  5. Joanna M. Burkhardt, “Assessing Library Skills: A First Step to Information Literacy,” portal: Libraries and the Academy 7, no. 1 (2007), doi: 10.1353/pla.2007.0002. <http://digitalcommons.uri.edu/lib_ts_pubs/55/>
  6. Bonnie J. M. Swoger, “Closing the Assessment Loop Using Pre- and Post-Assessment,” Reference Services Review 39, no. 2 (2011), doi: 10.1108/00907321111135475. <http://www.geneseo.edu/~swoger/ClosingTheAssessmentLoopPostPrint.pdf>
  7. Andrew Walsh, “Information Literacy Assessment: Where Do We Start?,” Journal of Librarianship and Information Science 41, no. 1 (2009), doi: 10.1177/0961000608099896. <http://eprints.hud.ac.uk/2882/1/Information>
  8. Thomas M. Haladyna, Writing Test Items to Evaluate Higher Order Thinking (Boston: Allyn and Bacon, 1997).
  9. Davida Scharf et al., “Direct Assessment of Information Literacy Using Writing Portfolios,” The Journal of Academic Librarianship 33, no. 4 (2007), doi: 10.1016/j.acalib.2007.03.005. <http://www.njit.edu/middlestates/docs/2012/Scharf_Elliot_Huey_Briller_Joshi_Direct_Assessment_revised.pdf>
  10. Michael Quinn Patton, Qualitative Research and Evaluation Methods (Thousand Oaks, CA: Sage Publications, 2002).
  11. John W. Creswell, Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 3rd ed. (Thousand Oaks, CA: Sage Publications, 2009).
  12. John W. Creswell and Vicki L. Plano Clark, Designing and Conducting Mixed Methods Research (Thousand Oaks, CA: Sage Publications, 2007): 5.
  13. Creswell, Research Design.
  14. Diana K. Wakimoto, “Information Literacy Instruction Assessment and Improvement through Evidence Based Practice: A Mixed Method Study,” Evidence Based Library and Information Practice 5, no. 1 (2010), https://ejournals.library.ualberta.ca/index.php/EBLIP/article/viewFile/6456/6447.

The Road to Validity #2 (Assessment Series)

Posted on February 4, 2016

Validity Techniques

In my previous post, I highlighted some of the literature that discusses pre- and post-test assessment. In this post I will discuss the challenges of creating an effective online questionnaire. One thing I have learned over the years is that quality data can only be acquired by creating and using valid and reliable assessment tools. For online questionnaires, poorly written questions will throw off the results and make analysis almost impossible.

The reason we create and administer questionnaires is to help us find answers to broad overarching questions. If a library, a team, or an individual puts forth the effort to gather data and potentially publish the results, it is far better to take the time up front to develop well-crafted questions than to find out after the fact that the data cannot be used because the original survey questions were not the right ones.

Validity of questions, and ultimately of an assessment tool, ensures that the data gathered represent the stated purpose and goal. The questionnaire as a research format, while having many benefits, has several disadvantages. One of the most significant drawbacks is that questionnaires often contain poorly formed questions because of the ease with which these instruments can be developed.[1] Valid questions are clearly written and eliminate any possibility that the individual taking the assessment could misinterpret or be confused by the question.[2] Alreck and Settle state that survey questions should have focus, brevity, and clarity.[3] Multiple-choice questions are particularly prone to being invalid if the question writers are uninformed about what makes a question valid or invalid.

Developing overarching questions before gathering data is imperative and a good technique for writing appropriate questions. For instance, if an overarching question is whether students’ skills improved after an instruction session, asking students if they liked the session and the instructor’s teaching style cannot answer that question. When I started at my current library, one of our first attempts at instructional assessment was to create a questionnaire from pre-existing tutorial quiz questions because we thought this would save time. Even though we generally knew we wanted to find out whether students’ knowledge had improved after a library session, we didn’t take the time to create overarching questions and identify what it was we really wanted to know. Not surprisingly, once we started looking at the data, all we could really conclude was the number of questions students got right and wrong. The tutorial quiz questions, as written, were out of context, and the library sessions didn’t specifically address many of the skills or competencies linked to the questions. Also, in discussing the data, it became clear that everyone on the development team had a different opinion about what we were supposed to be measuring.

When embarking on the development of an assessment questionnaire it is important to be aware of the different levels of assessment: Classroom Assessment, Programmatic Assessment, and Institutional Assessment.[4] The data gathered for each of the assessment levels tells a different story. A mismatch between the assessment level and what questions need answering has numerous consequences but primarily produces invalid data and the inability to conduct proper analyses.

Administration options within classroom assessment are quite numerous. Radcliff et al. point out that these can be categorized in the following ways: informal assessment, such as observations or self-reflection; classroom assessment techniques (CATs); surveys; interviews; knowledge tests; concept maps; performance and product assessments; and portfolios.[5] Each of these has advantages and disadvantages. Informal assessments and CATs, often used within library instruction settings, fit well into a one-time guest lecture scenario because they are quick and easy to administer and analyze; the drawback is the difficulty of gaining a well-rounded picture of students’ skill sets and of transference. Assessments such as interviews and portfolios, while providing the most in-depth data, require significant amounts of time for data gathering and analysis. Other types of assessments, like surveys and knowledge tests, can address the time factor and often provide more information than informal assessments or CATs. When administered as a pre-/post-test, acquisition or improvement of skill sets can be tracked.[6] However, depending on administration and data analysis, these assessments may or may not address the question of transference to other courses or to real-life scenarios.

One Example of a Validity Process

Being aware of all of the different options just for classroom assessment was really important for the instruction department at my library. When developing a questionnaire, understanding the strengths of each helped bring into context what questions we could realistically ask and answer. Since an online questionnaire was really our only administration option, we concluded that the questions needed to take the form of a knowledge test given as a pre-/post-test.

We chose a knowledge test because the questions in this type of assessment do not focus on students’ self-reporting of skills or self-efficacy, nor do they focus on the effectiveness of an instructor. Instead, they focus on specific knowledge, competencies, and skill sets. For administration, a time series design was selected, as it involves giving several post-tests over a designated time period after the pre-test and library instruction sessions have been conducted.[7] This would provide the opportunity to gather data on student knowledge at specific points in the academic year and, if developed well, involve a minimal time commitment on the part of the course instructors and students.

However, we didn’t want only multiple-choice questions, as we were also interested in gaining some insight into how students approached researching a specific topic. We decided to create a two-part questionnaire: Part A included multiple-choice questions, and Part B gave students a scenario and open-ended questions asking them to describe how they would research it. To avoid once again gathering completely invalid data, we engaged in two activities: applying a validity chart to the multiple-choice questions and mapping the questions to our established learning outcomes.

After our first attempt, we did rewrite several questions, but there was uncertainty about whether they were written appropriately. One statistical validity/reliability analysis that is often used on questionnaires is Cronbach’s α (alpha). However, we wanted to keep the number of multiple-choice questions to ten, which is too small a set for Cronbach’s α or other validity analyses. The team instead used a slightly modified validity chart from Radcliff et al.[8] The validity chart is a yes/no checklist that clarifies how questions should be constructed and what types of answer options should be present (see Figure 1). Any ‘no’ indicates that the question is invalid and should be rewritten until it generates all yeses.
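For readers unfamiliar with it, Cronbach’s α is a standard internal-consistency statistic; the formula below is included only for context and was not part of our own tool development. Here k is the number of items, σ²ᵢ the variance of item i, and σ²_X the variance of the total score:

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^{2}}{\sigma_X^{2}}\right)
\]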

FIGURE 1—Example of the modified validity chart checklist using one of the original assessment questions[8]


Applying the validity chart to all of the multiple-choice questions revealed that none of them (even the rewritten questions) were valid. It was a great exercise in revealing our assumptions as librarians and forcing us to clarify what we really wanted to ask. All of the questions were rewritten until they generated a check in every Yes box. Once the validity chart showed that all of the questions were clearly valid, we still made some wording revisions to increase reader comprehension. During this process we discovered that one question, when first revised, was valid, but advances in library database search algorithms later made it invalid: two answer options became correct instead of just one. This reinforced the need to regularly check the questions and answer options to make sure they are in line with current tools and services. Below is an example of how one question was revised from its original version to its final version (Figure 2).

FIGURE 2—Example of how one question was revised over the course of the tool development


The other activity was to make sure the questions were linked to our learning outcomes. To verify this, each question was mapped to one or more learning outcomes. Our initial mapping revealed that the first set of questions clustered almost entirely around the outcomes that concentrated on finding and searching for information. Other outcomes, such as those addressing source types and plagiarism, were completely omitted from the question set. The question set was revised to encompass all of the outcomes. After a second review, some slight adjustments were made to the questions to create an even stronger alignment between questions and outcomes.
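To give a rough sense of what this mapping exercise looks like, here is a minimal sketch of a coverage check. The question labels and outcome names are hypothetical placeholders, not our actual instrument or learning outcomes:

```python
# Hypothetical sketch: map each question to the learning outcomes it addresses,
# then flag any outcomes that no question covers. Labels are illustrative only.

outcomes = {"finding", "searching", "source types", "evaluation", "plagiarism"}

question_map = {
    "Q1": {"finding"},
    "Q2": {"searching"},
    "Q3": {"finding", "searching"},
    "Q4": {"evaluation"},
}

# Union of all outcomes touched by at least one question
covered = set().union(*question_map.values())
missing = outcomes - covered

print("Outcomes with no mapped questions:", sorted(missing))
# -> Outcomes with no mapped questions: ['plagiarism', 'source types']
```

In our case the equivalent check was done by hand on a chart, but the idea is the same: the gaps it surfaces are the outcomes that need new or revised questions.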

Even though we did not conduct extensive reliability and validity testing of this assessment instrument, the processes we used served our needs at the time. Should there come a time when the University wants a standardized assessment instrument for research skills, this process will better situate the library to evaluate or develop such an instrument. In consulting the literature during our validation process, I came across some good articles (listed below) that articulate more rigorous validation and reliability processes.

Recommended Articles on Validity Testing

Ondrusek, Anita, Valeda F. Dent, Ingrid Bonadie-Joseph, and Clay Williams. “A Longitudinal Study of the Development and Evaluation of an Information Literacy Test.” Reference Services Review 33, no. 4 (2005): 388-417. doi: 10.1108/00907320510631544

Ondrusek et al. discussed the development of an online quiz associated with a group of online tutorials that are part of their university’s first-year orientation seminars. The authors highlighted how the quiz went through multiple iterations and testing to develop valid questions, in addition to using statistical analyses such as score summaries, standard deviation, and item analysis for test reliability. This extended and thorough development process helped establish the assessment within the university curriculum.

Mery, Yvonne, Jill Newby, and Ke Peng. “Assessing the Reliability and Validity of Locally Developed Information Literacy Test Items.” Reference Services Review 39, no. 1 (2011): 98-122. doi: 10.1108/00907321111108141

Mery, Newby, and Peng described the methodology used in the development of an information literacy test associated with an online credit course. To determine validity and reliability, they used classical test theory and item response theory in correlation with SAILS test items. The data were gathered over two semesters, and the instrument was administered as a pre- and post-test to students enrolled in the course.

Cameron, Lynn, Steven L. Wise, and Susan M. Lottridge. “The Development and Validation of the Information Literacy Test.” College & Research Libraries 68, no. 3 (2007): 229-36. doi: 10.5860/crl.68.3.229. <http://crl.acrl.org/content/68/3/229>

Cameron, Wise, and Lottridge reported on the development of the James Madison University Information Literacy Test (ILT) and the methods used to create a reliable and valid instrument. The questions were based on the original ACRL Information Literacy Competency Standards. Their statistical analysis included content validity and construct validity. Additionally, they used standard-setting methods to determine expected proficiency levels and performance standards so that the test could be administered across a variety of student cohorts.

Mulherrin, Elizabeth, and Husein Abdul-Hamid. “The Evolution of a Testing Tool for Measuring Undergraduate Information Literacy Skills in the Online Environment.” Communications in Information Literacy 3, no. 2 (2009): 204-15. http://www.comminfolit.org/index.php?journal=cil&page=article&op=view&path[]=Vol3-2009AR12

Mulherrin and Abdul-Hamid provided an overview of the process used to develop a valid and reliable final exam for an information literacy credit course offered as part of the general education curriculum. As with similar articles, the authors discussed the use of content and construct validity, item difficulty and discrimination, Cronbach’s α (alpha), and item characteristic curve (ICC) analyses. A clear and ongoing theme in these articles is the importance of using reliable and valid instruments when conducting large-scale assessment.

Endnotes

  1. Bill Gillham, Developing a Questionnaire (London: Continuum, 2000).
  2. Linda A. Suskie, Assessing Student Learning: A Common Sense Guide, 2nd ed. (San Francisco, CA: Jossey-Bass, 2009).
  3. Pamela L. Alreck and Robert B. Settle, The Survey Research Handbook, 3rd ed. (Boston: McGraw-Hill/Irwin, 2004).
  4. Elizabeth Fuseler Avery, “Assessing Information Literacy Instruction,” in Assessing Student Learning Outcomes for Information Literacy Instruction in Academic Institutions, ed. Elizabeth Fuseler Avery (Chicago: Association of College and Research Libraries, 2003).
  5. Carolyn J. Radcliff et al., A Practical Guide to Information Literacy Assessment for Academic Librarians (Westport, CT: Libraries Unlimited, 2007).
  6. Carol McCulley, “Mixing and Matching: Assessing Information Literacy,” Communications in Information Literacy 3, no. 2 (2009), http://www.comminfolit.org/index.php?journal=cil&page=article&op=view&path%5B%5D=Vol3-2009AR9.
  7. Alreck and Settle, The Survey Research Handbook, 414.
  8. Radcliff et al., A Practical Guide to Information Literacy Assessment for Academic Librarians, 94-95.

Professional Discourse about the ACRL Framework (a chronology)

Posted on April 22, 2016

As many of us are, I am still grappling with the ACRL Framework and what it means for me individually as well as for the profession. I fully admit I am still on the fence about the Framework, though I’ve never really been a fan of the Standards either. I have several professional friends who served on the Task Force and others who helped elevate the Standards. I feel like I am in a bit of a professional dilemma and often ask myself whether creating my own middle ground will resolve this odd place I find myself in. Ultimately, the instructional designer and educational theorist in me suggests there can be a way for the two to co-exist, or at least that an attempt should be made.

A colleague of mine recently asked if I had a list of sources regarding the ACRL Framework. What resulted was a fairly long list of blog posts and articles. I thought I would share the list, which is certainly not exhaustive and not the first of its kind.

The list is my attempt to create a chronology of both informal and formal commentary surrounding the Framework to provide a snapshot of the issues, questions, and concerns. If there are any posts or articles you think should be included, please add them to the comments.

Blog posts

Swanson, Troy. “The New Information Literacy Framework and James Madison.” Tame The Web, February 14, 2014. http://tametheweb.com/2014/02/20/the-new-information-literacy-framework-and-james-madison-by-ttw-contributor-troy-swanson/.

Berg, Jacob. “The Draft Framework for Information Literacy for Higher Education: Some Initial Thoughts.” BeerBrarian, February 25, 2014. http://beerbrarian.blogspot.com/2014/02/the-draft-framework-for-information.html.

Burkhardt, Andy. “New Framework For Information Literacy.” Andy Burkhardt, February 25, 2014. http://andyburkhardt.com/2014/02/25/new-framework-for-information-literacy/.

Fister, Barbara. “On the Draft Framework for Information Literacy.” Library Babel Fish, February 27, 2014. https://www.insidehighered.com/blogs/library-babel-fish/draft-framework-information-literacy.

Pagowsky, Nicole. “Thoughts on ACRL’s New Draft Framework for ILCSHE.” Nicole Pagowsky, March 2, 2014. http://pumpedlibrarian.blogspot.com/2014/03/thoughts-on-acrls-new-draft-framework.html.

Swanson, Troy. “Using the New IL Framework to Set a Research Agenda.” Tame The Web, May 5, 2014. http://tametheweb.com/2014/05/05/using-the-new-il-framework-to-set-a-research-agenda-by-ttw-contributor-troy-swanson/.

Wilkinson, Lane. “The Problem with Threshold Concepts.” Sense & Reference (blog). June 19, 2014. https://senseandreference.wordpress.com/2014/06/19/the-problem-with-threshold-concepts/.

Swanson, Troy. “Information as a Human Right: A Missing Threshold Concept?” Tame The Web, July 7, 2014. http://tametheweb.com/2014/07/07/information-as-a-human-right-a-missing-threshold-concept-by-ttw-contributor-troy-swanson/.

Dalal, Heather. “An Open Letter Regarding the Framework for Information Literacy for Higher Education.” ACRLog (blog). January 7, 2015. http://acrlog.org/2015/01/07/an-open-letter-regarding-the-framework-for-information-literacy-for-higher-education/.

Swanson, Troy. “The IL Standards and IL Framework Cannot Co-Exist.” Tame The Web, January 15, 2015. http://tametheweb.com/2015/01/12/the-il-standards-and-il-framework-cannot-co-exist-by-ttw-contributor-troy-swanson/.

Fister, Barbara. “The Information Literacy Standards/Framework Debate.” Library Babel Fish, January 22, 2015. https://www.insidehighered.com/blogs/library-babel-fish/information-literacy-standardsframework-debate.

Farkas, Meredith Gorran. “Framework? Standards? I’m Keeping It Local.” Information Wants To Be Free (blog). February 4, 2015. http://meredith.wolfwater.com/wordpress/2015/02/04/framework-standards-im-keeping-it-local/.

Accardi, Maria. “I Do Not Think That the Framework Is Our Oxygen Mask.” Librarian Burnout, May 14, 2015. https://librarianburnout.com/2015/05/14/i-do-not-think-that-the-framework-is-our-oxygen-mask/.

Becker, April Aultman. “Visualizing the ACRL Framework for Students.” Librarian Design Share, September 22, 2015. https://librariandesignshare.org/2015/09/22/visualizing-the-acrl-framework-for-students/.
All visualizations can be found here: http://researchbysubject.bucknell.edu/framework

Articles

Oakleaf, Megan. “A Roadmap for Assessing Student Learning Using the New Framework for Information Literacy for Higher Education.” Journal of Academic Librarianship 40, no. 5 (2014). http://meganoakleaf.info/framework.pdf

Morgan, Patrick K. “Pausing at the Threshold.” portal: Libraries and the Academy 15, no. 1 (2015): 183-195. https://muse.jhu.edu/article/566428.

Burgess, Colleen. “Teaching Students, Not Standards: The New ACRL Information Literacy Framework and Threshold Crossings for Instructors.” Partnership: The Canadian Journal of Library & Information Practice & Research 10, no. 1 (January 2015): 1-6. http://dx.doi.org/10.21083/partnership.v10i1.3440.

Beilin, Ian. “Beyond the Threshold: Conformity, Resistance, and the ACRL Information Literacy Framework for Higher Education.” In the Library with the Lead Pipe (February 25, 2015). http://www.inthelibrarywiththeleadpipe.org/2015/beyond-the-threshold-conformity-resistance-and-the-aclr-information-literacy-framework-for-higher-education/.

Baer, Andrea. “The New ACRL Framework for Information Literacy: Implications for Library Instruction & Educational Reform.” InULA Notes: Indiana University Librarians Association 27, no. 1 (May 15, 2015): 5–8. https://scholarworks.iu.edu/journals/index.php/inula/article/view/18978/25096

Kuglitsch, Rebecca Z. “Teaching for Transfer: Reconciling the Framework with Disciplinary Information Literacy.” portal: Libraries and the Academy 15, no. 3 (2015): 457-470. https://muse.jhu.edu/article/586067.

Foasberg, Nancy M. “From Standards to Frameworks for IL: How the ACRL Framework Addresses Critiques of the Standards.” portal: Libraries and the Academy 15, no. 4 (2015): 699-717. https://muse.jhu.edu/article/595062.

Communications in Information Literacy 9, no. 2 (September 11, 2015) – Special Section

Jacobson, Trudi E., and Craig Gibson. “First Thoughts on Implementing the Framework for Information Literacy.” Communications in Information Literacy 9, no. 2 (September 11, 2015): 102–10. doi:10.7548/cil.v9i2.348.

Battista, Andrew, Dave Ellenwood, Lua Gregory, Shana Higgins, Jeff Lilburn, Yasmin Sokkar Harker, and Christopher Sweet. “Seeking Social Justice in the ACRL Framework.” Communications in Information Literacy 9, no. 2 (September 11, 2015): 111-25. doi:10.7548/cil.v9i2.359.

Hosier, Allison. “Teaching Information Literacy Through ‘Un-Research.’” Communications in Information Literacy 9, no. 2 (September 11, 2015): 126-35. doi:10.7548/cil.v9i2.334.

Pagowsky, Nicole. “A Pedagogy of Inquiry.” Communications in Information Literacy 9, no. 2 (September 11, 2015): 136-44. doi:10.7548/cil.v9i2.367.

Critten, Jessica. “Ideology and Critical Self-Reflection in Information Literacy Instruction.” Communications in Information Literacy 9, no. 2 (September 11, 2015): 145-56. doi:10.7548/cil.v9i2.324.

Seeber, Kevin Patrick. “This Is Really Happening: Criticality and Discussions of Context in ACRL’s Framework for Information Literacy.” Communications in Information Literacy 9, no. 2 (September 11, 2015): 157–63. doi:10.7548/cil.v9i2.354.

Dempsey, Megan E., Heather Dalal, Lynee R. Dokus, Leslin H. Charles, and Davida Scharf. “Continuing the Conversation: Questions about the Framework.” Communications in Information Literacy 9, no. 2 (September 11, 2015): 164–75. doi:10.7548/cil.v9i2.347.

Anderson, Melissa J. “Rethinking assessment: Information literacy instruction and the ACRL framework.” SJSU School of Information Student Research Journal 5, no. 2 (2015). http://scholarworks.sjsu.edu/slissrj/vol5/iss2/3

Berkman, Robert. “ACRL’s New Information Framework: Why Now and What Did It Discover?” Online Searcher (March/April 2016). http://www.infotoday.com/OnlineSearcher/Articles/Features/ACRLs-New-Information-Framework-Why-Now-and-What-Did-It-Discover-109503.shtml.