Information Literacy, or How I Learned to Stop Worrying and Love Library Instruction
Edited by Kristin Fontichiaro
Each essay in this collection is copyright 2015 by the individual author and shared under a Creative Commons Attribution-NonCommercial-ShareAlike License (CC BY-NC-SA). http://creativecommons.org/licenses/by-nc-sa/4.0/
Cover Image: “Library Love” by ABC Open Riverland on Flickr. CC-BY-2.0. http://flickr.com/photos/[email protected]/12313151875
Smashwords Edition License Notes: Thank you for downloading this free eBook. Although this is a free book, we hope you will encourage your friends to download their own copy at Smashwords.com, where they can also discover other works. Thank you for your support.
For Our Mentors
Here at the University of Michigan School of Information, we recognize that information literacy is inherently murky … and that’s what makes it so much fun. It is also inherently complex: differences in background knowledge, researchers’ experience, information needed, purpose, instructor guidelines, and more mean each research task is unique. For experienced librarians, it is easy to forget – because they have done it for so long – that information literacy exists, as well, in the middle of a pulsing vortex of library culture, campus or community climate, time constraints, and competing priorities. Therefore, successful instructors are those who know not only how to plan and assess effectively, but also how to navigate the relationships and expectations around them.
The authors in this collection, most of whom are relative newcomers to librarianship, recognize these tensions right away. In this essay collection, you’ll hear them think aloud as they explore information literacy’s complexity across contexts.
Kristin Fontichiaro is a clinical assistant professor at the University of Michigan School of Information. Contact: [email protected]
WHAT OTHER “BEST PRACTICES” LIBRARY GUIDES LEAVE OUT
Best practices for library guides is a subject that, for librarians, feels much like catalog search classes feel for undergraduates: “We’ve learned this before, and it’s honestly not even that difficult. Why do we keep being told we need more classes about this?” Today the most popular tool for creating library guides is the LibGuides content management system (CMS) by Springshare, which is used by nearly 5,000 libraries and nearly 80,000 librarians (LibGuides 2015). Most librarians have been creating guides in this system for years. If you’re a librarian at an academic library, you have likely attended a half-dozen classes on how best to use guides. At this point, I suspect that most librarians feel they have enough resources – instruction sessions, best practices documentation, examples of guides at their own and other institutions – to successfully create any guide. But in my own research, I’ve found that there are a few important concepts that most best practices documentation doesn’t mention.
My research began with a summer internship at the University of Michigan Library, which ranks among the ten largest libraries in the United States (Kyrillidou, Morris, & Roebuck 2015). HathiTrust, ProQuest, and JSTOR all began as entities of the University of Michigan Library. It may be a bit of an understatement to say that I was excited by my project for the summer, which was to help prepare for the library’s transition to the new version of LibGuides.
This internship began as a technical assignment: The new version of LibGuides had not yet been branded or customized, and the library needed the system to look like it belonged to the University of Michigan before it launched. I got this opportunity because recent staffing changes had left the library’s user experience (UX) department understaffed, and as a graduate student preparing for a future in library UX work, I was brought on to help fill the gap.
In a few weeks, I came up with a set of specifications for what would be in a branded web front-end, created a draft that got general approval from my supervisor and the library UX department, and put those specifications into production on the site. However, as I continued to work on polishing the appearance and behavior of our LibGuides platform, I increasingly moved away from aesthetic questions and into the more significant matter of library policy and usability design. Should we use side navigation or top navigation? Should we limit librarians to templates or allow them to construct guides from scratch? Should we force the librarian profile and Ask a Librarian content boxes to appear on every page, or should we let librarians decide where to put them? These and other questions led me to move from doing design work to doing design research.
I started by looking at LibGuides best practices guides at a number of other universities. Two guides that I found particularly well designed and useful were Best Practices for LibGuides at UCLA (Martin 2015) and Boston College’s LibGuides Standards guide (Martinez 2015). Both provide a wealth of information on web and accessibility best practices that apply to guides, institution-specific policies, and instructions on specific aspects of LibGuide construction – from avoiding lengthy blocks of text to naming conventions to adding images. Using this information, I constructed a guide with many of the same elements, customized to the University of Michigan’s standards.
Even at this point, though, I felt like something was wrong. I was having trouble determining the scope of my guide – it didn’t seem to me that instructions on how to do specific tasks in LibGuides belonged in the same place as instructions on how to make a guide that better fit the needs of the user. This feeling, paired with my desire to answer some of the larger policy questions I’d come across, led me to look for other research on LibGuides. Here is what I found.
Software instruction, policy, and web best practices don’t make up the whole picture of how to create a good library guide. This isn’t to say they aren’t important; without software instruction, librarians might not know the correct way to format an image, insert embedded video, or make sure that their guide can be found when patrons search for it. Without policy, librarians wouldn’t know how to ensure that their guide fits with other university guides and matches their institution’s vision of what guides should be. Without web best practices, users will find the sites unprofessional and non-authoritative, and will be less likely to find the content they’re looking for.
What most best practices guides leave out is context. The ultimate goal of a guide is to provide exactly the resources that will be most beneficial to your users. What most guides don’t acknowledge is that doing this always means serving users in context and navigating between extremes – dealing with conflicting forces on all fronts. For example:
WRITING SCHOLARLY CONTENT VS. WRITING FOR READABILITY: As members of an academic community, academic librarians want to write in a way that highlights their knowledge of the subject. At the same time, they want to write guides that are accessible to users who may have almost no knowledge of the subject.
STANDARDIZED APPROACHES VS. CUSTOMIZED APPROACHES: Having a consistent layout for all guides helps frequent guide users navigate new guides and maintains a consistent appearance across all guides. On the other hand, customizing your guide’s layout can help ensure you’re delivering resources in the way that’s most effective for your specific topic.
MEETING FACULTY NEEDS VS. MEETING STUDENT NEEDS: Faculty, students, and librarians often have different ways they want to approach information and different sorts of information that are more valuable for them.
WRITING ABOUT RESOURCES VS. WRITING ABOUT BEHAVIORS: It’s faster and easier to create guides around the resources available for a topic, but users will often approach information with a specific need, such as doing background research, or researching an author.
WRITING FOR NEW SCHOLARS VS. WRITING FOR EXPERIENCED SCHOLARS: Some resources are more useful to researchers who have been involved in a field for years, while other resources are more suitable to getting an introduction to a discipline.
PUTTING CONTENT ALL IN ONE PLACE VS. POINTING TO EXISTING CONTENT: Practically, it often makes more sense to point to a single source of truth for a resource. But making users jump through yet another set of links to reach the resource they’re looking for is inconvenient, and it can look unprofessional – as if your school couldn’t be bothered to create its own resource on the topic.
QUALITY VS. TIME: It takes a lot of time and effort to make sure that your guide perfectly fits the needs of your user. Spending more time on guides often means spending less time on something else important.
It’s easy to understand why most best practices guides don’t get into issues of context, and why you don’t frequently see sections such as “How to Select the Right Resources for Your Audience,” “How to Group Resources to Meet User Information-Seeking Needs,” or “How to Write Content for Both New and Experienced Users.” While these are all important parts of making good guides, there are no simple answers as to how to do these things. At best, the answer depends on a number of factors that are unique to the specific context of each guide being created. At worst, we may not be sure what the right answer is at all. Any attempt to create a standardized guide that can teach how to create a better guide will likely lead to frustration on the part of the guide author, since the author is the one writing the guide and may be the only one who understands the context it’s being written in. No best practices guide is in a position to understand that context better than the guide author, and any attempt to say otherwise can be seen as unhelpful at best and insulting at worst.
That said, as LibGuides adoption has matured, I think it’s important that we revisit best practices guides, think about ways we can make context a part of them, and consider what the next piece of the puzzle is for making guides as useful as possible. As part of my research, I attempted to come up with a list of recommendations that could help librarians understand the importance of designing around the context of the guide and provide some very basic guidelines for building better guide content. My goal with these recommendations wasn’t to provide authoritative advice, but rather to highlight the importance of context-based design and, hopefully, to provide a few tools to make it take a bit less time. Here is a short summary of what I’ve come up with so far:
WHY MAKING BETTER GUIDES MATTERS: Guides serve as a means of getting users to interact with the library. A good guide helps educate people and spreads knowledge. A good guide takes longer to make, but a guide that no one uses isn’t a very good use of time either. Guides can serve as a reflection of the skill of the librarian and can help give people confidence in the library or university as an institution. Finally, guides also serve as a great resource for librarians themselves.
CREATING YOUR GUIDE’S PAGE STRUCTURE: Create a need-driven page layout by using page titles and content boxes based around specific needs, rather than types of resources. To figure out what sorts of needs users have, use information literacy standards such as Carol Kuhlthau’s Information Search Process (2004) or the Association of College and Research Libraries’ Information Literacy Competency Standards (ACRL 1998) as your starting point (Brazzeal 2006, Pendell & Armstrong 2014). Some examples of need-driven page names include “Background Research” instead of “Encyclopedias” or “Researching a Subject” instead of “Databases” (Sinkinson et al. 2012).
CREATING CONTENT FOR YOUR PAGES: Focus only on what’s most important. Get rid of resources that aren’t the best fit for your subject and your audience, including generic resources like your library catalog (Alverson et al. 2015, Ouellette 2011). Work with faculty members when selecting content; students trust faculty, and you should make it clear that faculty have had a hand in content curation (Ouellette 2011). Avoid non-actionable items – students are most interested in resources that they can actually use for their work (Ouellette 2011). Teach skills with your content: a Google search can discover a database, so focus instead on why that database is a good fit and how to use it (Hintz et al. 2010). Finally, write for the modern web. In other words, web best practices and accessibility standards apply to guides just as they apply to everything else on the web.
COMMUNICATE WITH YOUR AUDIENCE: No guide is an island (Martin 2015). Pay attention to other resources on the internet, and make sure your guide connects to anything else that might be helpful to someone researching your topic. Make your guide welcoming to user contact: put contact information in obvious, useful places and make it clear when it’s appropriate for the user to get in touch with you (Alverson et al. 2015, Sonsteby & DeJonghe 2013). Finally, design around your audience. It’s important to have a strong understanding of who you’re writing the guide for, so when possible, try to meet with members of your audience and ask them what they want. A resource designed for one specific user will often be more helpful to everyone than a resource designed to be “generally useful” (Staley 2007).
This is a very brief overview, but I believe it hits upon the part of guide creation that other best practices guides leave out. Creating a guide isn’t just about format: It’s about content and context, and I believe that providing some foundation for acknowledging that is fundamental to helping librarians to work towards ensuring that their guides are made for the people they serve. A more complete summary of my research can be found on the best practices LibGuide I created as part of my practicum (Barry 2015).
Of course, a guide on how to make guides ends up in an interesting position, as you may have already noticed: how can I write a guide that explains how best to make guides if it doesn’t follow its own advice? This question has led me to conduct an interview study with librarians at the University of Michigan, to help us understand the unique pressures our librarians are facing, so that we can customize a guide to their needs that in turn helps them customize guides to the needs of others. Guides are one of our best resources for interacting with users and making library resources accessible to the world, and any gain we make in improving guide content is a gain in improving access to knowledge on a large scale.
Alverson, Jessica, LeFager, James, Schwartz, Jennifer, and Brunskill, Amelia. 2015. “Creating Audience and Environment-Friendly Research Guides: Findings from a User Study.” Paper presented at Creating Sustainable Community: The Proceedings of the ACRL 2015 Conference, Portland, Oregon.
Association of College and Research Libraries (ACRL). 1998. “Information Literacy Competency Standards for Higher Education.” ACRL Web site. Retrieved December 16, 2015, from http://www.ala.org/acrl/standards/informationliteracycompetency.
Barry, Matthew. 2015. “Best Practices for Library Guides.” University of Michigan Library. Retrieved December 16, 2015.
Brazzeal, Bradley. 2006. “Research Guides as Library Instruction Tools.” Reference Services Review 34(3), 358-67.
Hintz, Kimberley, Farrar, Paula, Eshghi, Shirin, Sobol, Barbara, Naslund, Jo-Anne, Lee, Teresa, Stephens, Tara, & McCauley, Aleha. 2010. “Letting Students Take the Lead: A User-Centred Approach to Evaluating Subject Guides.” Evidence Based Library and Information Practice 5(4), 39-52.
Kuhlthau, Carol Collier. 2004. Seeking Meaning: A Process Approach to Library and Information Services. 2nd ed. Westport, Conn: Libraries Unlimited.
Martin, Scott. 2015. “Best Practices for LibGuides at UCLA.” UCLA Library. Retrieved December 6, 2015.
Martinez, Jesse. 2015. “LibGuides Standards.” Boston College University Libraries. Retrieved December 6, 2015.
Ouellette, Dana. 2011. “Subject Guides in Academic Libraries: A User-Centred Study of Uses and Perceptions.” Canadian Journal of Information and Library Science 35(4), 436-51.
Pendell, Kimberly, and Armstrong, Annie. 2014. “Psychology Guides and Information Literacy: The Current Landscape and a Proposed Framework for Standards-Based Development.” Reference Services Review 42(2), 293-304.
Sinkinson, Caroline, Alexander, Stephanie, Hicks, Alison, & Kahn, Meredith. 2012. “Guiding Design: Exposing Librarian and Student Mental Models of Research Guides.” portal: Libraries and the Academy 12(1), 63-84.
Sonsteby, Alec, and DeJonghe, Jennifer. 2013. “Usability Testing, User-Centered Design, and LibGuides Subject Guides: A Case Study.” Journal of Web Librarianship 7(1), 83-94.
Staley, Shannon M. 2007. “Academic Subject Guides: A Case Study of Use at San Jose State University.” College & Research Libraries, 68(2), 119-140.
Matthew Barry is a second year master’s student at the University of Michigan School of Information.
FISH OUT OF WATER:
ADAPTING AND CUSTOMIZING INFORMATION LITERACY INSTRUCTION
I’ll be honest: the only science class I ever took in college, while pursuing a degree in English, was called “The History of Astronomy.” It barely even sounds like a science class, right? I spent my time in the library focusing on humanities research, reveling in the dusty feeling of pages from Victorian periodicals and the excitement of using primary sources. I never once read a scientific article or conducted an experiment, let alone set foot in a laboratory. I walked by labs in the physics building on my way to ASTR1660 and thought to myself, “I bet they’re having fun in there,” and then went on my merry way.
Fast forward a few years to my third semester as a graduate student in the School of Information at the University of Michigan. As a student in SI641, Information Literacy, I knew that I would be participating in a practicum experience in which I would be matched up with a mentor whom I would observe and eventually teach library instructional courses alongside. I dreamed of being paired with an English literature subject specialist and planning lessons that combined primary resources with the humanities databases I used as an undergraduate. I imagined myself instilling excitement and wonder in my students as they discovered nineteenth century texts and felt the literature come alive. But this was not written in the stars, as I might have known had I paid more attention in my astronomy class as a freshman in college.
Instead, I received an email of introduction to the Chemistry Librarian at the University of Michigan Library, who was to be my mentor that semester. Shock set in quickly: how was I going to assist an accomplished and knowledgeable professional in teaching students who were studying a subject vastly different from anything I was remotely familiar with? How might I explain searching a database for topics far beyond my comprehension? Would the students believe that I knew what I was doing and respect me as an instructor, or would they laugh when I crumbled at the first question that went beyond the cramming I would have to do just to set foot in the classroom?
I scheduled a meeting with my mentor, Ye Li, and when the appointed Monday morning came I made my way up to the science library on the third floor of the Shapiro library and knocked on her door. After simple introductions, Ye explained to me the types of classes that she normally teaches and the resources she uses. She explained the differences between humanities research and scientific research, describing the different mindsets, publishing cycles, and writing styles of the disciplines in detail. Then she asked me to think about whether or not I was sure I wanted to continue working with her, given my background in English and my lack of experience with science-related library resources. Panic washed through me as I felt her wondering whether I would be an asset or a liability in the classes that she taught as part of her career, her job and her livelihood.
Feeling stuck, I softly said, “Yes. I think this will be a great learning experience for me, to work with material outside my comfort zone. I need to learn how to teach things that I’m not familiar with, and I think I’m up for the challenge!” I may also have timidly raised a triumphant fist in the air at this point, and I wondered whether either of us believed the words coming out of my mouth at that moment.
We boldly set off on a journey of mutual learning and discovery and agreed to draw on our individual strengths to make the experience worthwhile and to ensure that the class we would co-teach met her standards and expectations. What I soon discovered was that the most important thing to consider when teaching a class was emphatically not my own feelings, emotions, and apprehensions. It was, instead, the students. Call me crazy, but this idea was far from apparent when I enrolled in SI641. At first, all I could think about was overcoming my shyness in front of students, learning about scientific databases, and the fear of doing something I never thought I would do. But what really mattered was this group of students, the skills they needed to succeed in their program, and the confidence they would carry forward after a class with Ye and me.
Once I realized this, I put my effort into constructing a lesson plan for our first class, an instruction session on scientific literature, and I found that a lot more goes into instruction than subject knowledge. I was charged with creating a pre-class survey that would collect information about the students’ research projects and backgrounds and elucidate their specific interests, with the idea that we would tailor our instructional session to meet those needs and interests. “I can do this!” I thought. With Ye’s guidance on the content of the questions, I was completely capable of creating a well-written and functional survey using Google Forms, and after selecting a visually appealing “science” theme for my form – which I was perhaps a little too excited about – I sent it off. I then created an entire set of slides for the class and found that, with a little direction about the content, this too was something I could do. I had been learning about creating powerful, clean, clear, and meaningful presentations and practicing those skills for years, and now I was able to put them to work. During the class, I walked the students through a scientific article whose meaning I barely understood, but because I had grasped its basic structure after a little studying, I was able to explain how they might go about dissecting and understanding it.
What I’d like to illustrate by describing the work I did to prepare for my very first information literacy instruction class is that I learned a very valuable lesson by working with an instructor in a field outside of my comfort zone. Teaching students information literacy in any capacity, and centered around any subject matter, requires a foundation that transcends field knowledge. It requires skills that can be cultivated in many settings. Making good slide presentations, writing clearly, printing and stapling helpful worksheets and documents, and practicing the actual act of standing up in front of a classroom and teaching are just a few of the basic building blocks one must cultivate to be a good instructor. I stumbled upon an analogy for my conceptualization of information literacy instruction one day after attending my Web Design class: in web design, you must first learn the basics of different coding languages. Once you understand how to put together the bones of a website and how different languages interact with one another, a whole world of new possibilities opens up, because you suddenly have the ability to create a simple page and insert different widgets or functionalities to customize it. Much of this involves not knowing inherently or fully how to create a complex widget, but rather knowing how to link to a package that someone else built – or that you found while sifting through help documentation – and get it to function in your context.
I believe that this analogy illuminates the fundamental nature of information literacy instruction in an academic setting, especially for a librarian embarking on a career not situated as a subject specialist or tied to a department as a liaison. New instructors must take the time to cultivate their teaching abilities, which include everything from how to dress and how to project your voice across a large classroom, to how to choose the right font sizes and colors for your slides, to how to use instructional strategies like group work, think-pair-share, worksheets, and demonstrations to instill knowledge and know-how in your students. Once you begin to cultivate these skills in any setting, they are transferable and form the basis of instructional efforts in any subject area, whether you’re teaching students how to use a database or how to search for and make use of Victorian periodicals. Like learning basic HTML and inserting a widget, information literacy instructors must strive to learn the fundamentals of teaching and the tenets of information literacy, and then customize their sessions with different content as required.
After this experience, I know that my peers and I have all made great strides in beginning or continuing to work on our instructional strategies and our personas as teachers. Whether or not we pursue instruction as a career, we have all begun to carefully place fundamental building blocks in our repertoires, which we will always be able to access and use – whether we are called upon to create a last-minute one-shot instructional session on citation management tools or given the opportunity to construct a subject-specific workshop from the ground up. If we continue to cultivate our knowledge of information literacy trends and practices as they change over time and to practice our teaching skills, there is no limit to the variety of information we will be able to convey to our students.
Erin Curtis is from Athens, Georgia, where she attended the University of Georgia and obtained a degree in English literature. She is currently pursuing her Master of Science in Information at the University of Michigan, specializing in Library and Information Science with a special focus on digital and physical resource preservation. Throughout her academic career she has studied Victorian literature extensively, with an emphasis on nonfiction prose, and she continues to pursue this literature in the context of information studies. She hopes to begin a career in academic librarianship upon graduating, with a focus on special collections and literary archives.
PURPOSEFUL ACTIVITIES IN INFORMATION LITERACY INSTRUCTION
I told myself, “I will never become the kind of librarian who does database walkthroughs.” And then, to my horror, I did just that. In two of my first SI641 practicum lessons, I found myself demonstrating the steps for searching a database: Access database through library gateway. Introduce the Federated Search page like it’s the black sheep of the family (“We don’t really use that search box.”). Introduce the Advanced Search page like it’s the golden child (“It’s just so much more capable. This is really what you want to use.”). Ask the students to follow along as you carry out your canned search. Interpret the search results.
I was instructing in the same way I had been instructed as a student at my college library. It was my worst information literacy nightmare. Where did I go wrong?
I split my practicum between two concurrent placements at a large, Midwestern public university library system. My first placement was under the guidance of a research librarian who serves library users across academic disciplines. With her recommendation and guidance, I developed and delivered two workshops. The first session was a staff-only workshop focusing on a database of charitable organizations and grant makers. The purpose of the session was to introduce the features and functions of the database to an audience of reference and public services staff. The assumption was that librarians could apply their workshop training to help patrons seek out grant funding. The second workshop was open to the broader university community, with a focus on finding funding for social science research.
My second placement was with the director of a new university library space, which was being reconfigured from physical book stacks to a center for collaborative learning and digital scholarship. The space employs student consultants, and together with the director of the space, we offered faculty consultation and instruction for an undergraduate course in the humanities. These two very distinct learning environments allowed me to see a range of possibilities for information literacy instruction in an academic library. In this chapter, I will be focusing on my teaching experiences in my first placement.
I enrolled in SI641 because I was fed up with database walkthroughs. I knew there had to be something more to library instruction, and technical instruction in general, than the standard narration of a web page or software interface, followed by a series of context-less exercises. I embarked on my practicum experience a little too smug, bent on finding a way to peel back the surface of library instruction. As an instructor, I aimed to convey the kind of meanings and to set off the kind of understanding that reverberates beyond our time together in the workshop. Of course, I totally failed.
SI641 sets us up to enact the kind of robust and transformative learning and teaching I had always dreamed of. As a class, we pored over the ACRL and AASL standards, unpacking the ways our students should demonstrate understanding in complex and empowering ways. We practiced with the Hunter model and took on the backward design framework developed by Wiggins and McTighe in Understanding by Design (2005). I contemplated the potential for transfer in my own learning at the School of Information (Bransford 2000). But somewhere between seeing the potential of good IL instruction and executing my own lessons, I lost my way.
I realize now that my database walkthroughs were a symptom of a larger problem. The problem started with how I planned my workshops and how the idea for the workshops emerged. My mentor and I thought it would be a great idea to offer staff training on a database that had received increased reference inquiries from our user community. Our bottom line was, “Librarians should know about these key features of this database!” The mistake was in framing the workshop as a resource-focused “Introduction to XYZ database”; as a result, we approached the lesson plan from the perspective of database features. The lesson’s learning objectives may have been stunted, stopping short of transferable, “representative” (Phenix 1964, 323), or “Big Ideas” understanding (Wiggins and McTighe 2005, 69).
After my database walkthrough, I included a hands-on activity in which participants engaged with real-life examples of patrons searching for funding. In my worksheet, I included a profile of the patron persona and a short description of their need. Workshop attendees were encouraged to use the database to find resources that would address the patron’s situation. This hands-on activity was my attempt to place the database tool in the context of its use for public services staff. Where I think the exercise fell short was in reaching what Wiggins and McTighe, in Understanding by Design, call a “minds-on activity” (2005, 16). The difference between a hands-on activity, like my worksheet, and a minds-on activity is an engagement with bigger questions that leads to understanding, and assessment that checks for deeper understanding. The activities in my first lesson were simply to follow along with my walkthrough and then complete the worksheet – what Wiggins and McTighe would call “mere engagement” (2005, 20). And my check for understanding was as passive as the sound of typing and clicking and the absence of confused eye contact during the walkthrough and hands-on activity. I had not asked the big questions that push students to “consider the meaning of the activity” (2005, 16).
Database walkthroughs and worksheets are easy, but as an instructor, I’m left wondering what impact I’ve had beyond activity completion. To get at “enduring understanding,” the instructor must shift from resource-centered teaching to inquiry and process-centered learning. Activities should enable engagement that “burst[s] through the boundaries of the topic” and that is “broad [and] full of transfer possibilities” (2005, 106). Questions like, “What’s the point? What’s the big idea here? What does this help us understand or be able to do? To what does this relate? Why should we learn this?” are tools that can uncover important aspects of information literacy, like reflection, evaluation, and contextualization (2005, 16).
In my observations, I’ve learned that the most adept instructors will leverage these questions and the minds-on activity in a way that shifts authority away from the information resource and transfers it to their students. Students feel empowered to dig deeper into the resource, to discern what is meaningful for them, and to critique and reject when necessary. Together, the instructor and the students create an environment of trust, inquiry, and engagement, where they can explore the information resource together.
Bransford, John. 2000. How People Learn: Brain, Mind, Experience, and School. Expanded ed. Washington, D.C.: National Academies Press.
Phenix, Philip. 1964. Realms of Meaning. New York: McGraw-Hill.
Vanderbilt University Center for Teaching. n.d. “Understanding by Design.” Retrieved December 6, 2015, from https://cft.vanderbilt.edu/guides-sub-pages/understanding-by-design/.
Wiggins, Grant P., and Jay McTighe. 2005. Understanding by Design. Expanded 2nd ed. Alexandria, VA: Association for Supervision and Curriculum Development.
Christina Czuhajewski is a second year MSI student at the University of Michigan School of Information and a University Library Associate at the University of Michigan Library. As a librarian, she aims to foster connections and community through technology, advocacy, and design. When she’s not librarianing, Christina enjoys reading graphic novels and making amateur podcasts.
“A VAST UNKNOWN”:
EMBRACING DISCOMFORT AND MAKING SENSE IN RESEARCH
“We felt the lonely beauty of the evening, the immense roaring silence of the wind, the tenuousness of our tie to all below. There was a hint of fear, not for our lives, but of a vast unknown which pressed in upon us…”
Thomas F. Hornbein, from his memoir, Everest: The West Ridge
The view from the west ridge of Mount Everest may seem a little dramatic for a conversation about information literacy instruction. Stick with me, though. I promise it’s related.
I used to fall over in open spaces. Just feel a little dizzy, like I might float into the sky, then – plop! – right down. It would probably still happen today if I didn’t make such an effort to walk along walls and seek out visual cues to create some kind of tether to the earth. It’s a strange bodily discomfort that I explored in my BFA thesis, which, in part, provided a fictional history to explain why the pioneers wore sunbonnets and rode in covered wagons. That big, open space was just too much. Bonnets and covered wagons served as blinders, helping them focus on what was ahead, instead of becoming overwhelmed by the vastness of the prairie. Having that feeling of security allowed them to move forward.
When it came to deciding what I wanted to explore and teach about in the realm of information literacy, I had an idea, but I didn’t know quite what to do with it. I didn’t realize at the time that I was setting out to research and discuss a more practical application related to my thesis work on the unheimlich (the uncanny). There may be something more to be said about returning to the same ideas again and again, even if in different forms. After all, academics often make their life’s work in a single area, exploring and revisiting the ideas that drive them. Certainly this is because of a genuine interest, but building on something familiar allows them to go deeper, learn more, and maybe even start to make connections that they weren’t expecting. Thomas Hornbein would not have seen that same view from Everest if he had instead decided to just go up about 200 feet on 10 different mountains.
I was first introduced to Carol Kuhlthau’s model of the Information Search Process last fall, and was fascinated to see that it considered the student’s affective responses to the research process. That first step—initiation, where the student feels uncertainty—is where the student is on that west ridge of Everest. It is the very tip of information overload, where the student may say, “I don’t even know where to start.” It’s the “vast unknown.” It’s the open prairie.
Recognizing the student’s feelings of discomfort during research is important, because without guidance (or an awareness or reminder that those feelings are perfectly normal) the student may give up. Or maybe they won’t give up. Maybe they will feel like they have to continue because they have invested time and hard work into their research, and so they continue to sit in that discomfort, frustrated. Maybe they will somehow make it out of the exploration phase with a bunch of information they can’t quite formulate into the full idea they were hoping to communicate. That sounds pretty awful, doesn’t it? So, what can we—librarians, educators—do about it?
Here enters Guided Inquiry. Carol Kuhlthau collaborated with her educator daughters to develop Guided Inquiry, which gives the student and the instructor a roadmap for that sometimes-scary research process. At first, Guided Inquiry might sound like hand-holding, but it isn’t. Those steps are the walls I walk closely to, the elongated rim of a sunbonnet, a tether tied back to the earth when things are starting to spiral out of control. At each stage, there are natural places for instructor or librarian intervention. Those interventions are opportunities to normalize the students’ affective responses and help them to move through the tricky spots by having someone to dialogue with.
The real beauty of this model is that it can be adapted for any grade level and any subject. This means that, in an education system where information literacy instruction is inconsistent, a single, practical method can be reused again and again. If we start young and stay constant, students will have a tool that can serve them throughout their academic careers and on into real life, where they will still be posing questions, researching options, and deciphering which information sources they can trust. When students can refer back to a constant, there is less anxiety about reaching further. In this way, rather than inhibiting growth, having a guide provides the confidence a student needs to move forward. I saw this while observing instruction this fall.
For the observation portion of this course, I prepared for and observed instruction in the research center of an art museum. During this observation, I noticed a pattern: not only are these sessions always structured the same way, but the students—a different class every time—almost always have the same reaction.
For the first part of every session, the students are invited to walk around the room and look at the different pieces of art that my mentor (the instructor) selected in support of their professor’s lesson plans. This activity orients them to a new space, as well as allowing them to explore what is before them. In that exploration, they may be searching for something they can connect with, something they recognize within any single work of art.
After a few minutes, the instructor brings the group back together and asks them to choose which piece they would like to discuss first. This is where things get interesting. At most, maybe 25% of the time, there is an outspoken student who will say right away that they want to talk about the big/weird/colorful piece. That piece is an obvious draw; everyone is curious about it, but not everyone will say so right away. You see, the rest of the time, whether someone volunteers right away or it takes some instructor reassurance that anyone can pick any piece, the students will initially choose a work of art that they think they will be able to talk about. They are interested in the exotic, but they will choose the familiar. After a couple of works, the students become more familiar with the process. They become more confident, make bolder choices, and learn something bigger.
This is the same thing that can happen if we build a familiar process for our students. Let’s use an adaptable framework that they can carry with them throughout their years, a roadmap they can refer back to, a tether to keep them from floating away. Let’s help them build a covered wagon, not to protect them from the elements, but so that they will have a vehicle for moving out into the vastness of the prairie. Let’s help them make sense of research, help them to see that a little bit of discomfort is part of the process, and that it’s worth it to keep going.
Hornbein, Thomas F. 1980. Everest: The West Ridge. 3rd ed. Seattle: Mountaineers.
Amy Eiben is a second-year master’s student at the University of Michigan School of Information.
THE FUTURE OF COMPUTER CLASSES IS ONE ON ONE
My practicum for this semester was with a smaller public library that serves an interesting mix of rural and urban communities. There is a wide variety of technology skills among library patrons, but many of the people we see on a day-to-day basis have fairly low skills. My practicum lessons included preparing and teaching an introductory Pinterest class as well as a beginner smartphone class. My mentor and I were hopeful that these relatively new classes would be attractive to patrons. Four learners attended our Pinterest class, and two came for our smartphone class. Since the two smartphone learners had different phones (one Android and the other Apple), my mentor and I split up, and it turned into a one-on-one class. From these experiences, and from my experience teaching one-on-one sessions on a regular basis, I have found many drawbacks to group classes. This is why technology literacy should really be taught one on one.
With group classes, it can be difficult to plan for and teach to such a wide array of skill sets. We can say a particular class is for beginners, but even beginners have a wide range of skills. In my class on Pinterest, I had two learners who were comfortable with computers and already fairly familiar with Pinterest, one who struggled a little with using the computer but had a Pinterest account she never used, and one whose kids had recently taught her how to use a computer but who did not have a Pinterest account. So even in this small group of learners, there was a wide spread of skills and familiarity with the technology.
In one-on-one sessions, I have been able to work with many different skill levels and have seen an even greater range of computer skills. Some people come in knowing quite a bit about computers and are only looking for extra tidbits to add to their knowledge; some occasionally use a computer but would like to know more about its various functions; and some just got a new tablet or phone and know nothing about working it, plus everything in between. It is a great benefit to the learner when the instructor can gauge their personal skill level at the beginning of the lesson and focus on helping them grow from where they are.
To do this, I always ask the learner at the beginning of every session, whether I’m seeing them for the first time or it’s their tenth lesson, “What do you want to work on today?” This is the most important step. My goal is to make sure the learner is learning what they want in the session. I never want to impose upon them what I think they should know. I emphasize, “Please ask me questions; we can go over something again if it’s not clear,” especially with new learners. It is never what I want to work on, but always what the learner would like to know, driven by what they want to be able to do. This can be difficult when someone’s answer to what they want to learn is “everything.” For this, I keep a quick list of tasks to find something of interest to the learner. Once I have sparked an idea, we build up the fire from there.
Not only are the skill levels in a group different, but so are the patrons’ end goals. Two learners who have never used a computer before can both be looking to gain basic computer skills, but for completely different reasons. One might be interested in learning how to find coupons and what movies are playing at a local theatre while the other is looking to move up at work and learn how to use technology in the workplace. Those two lessons are going to look incredibly different.
The learner is the one who knows what they would like to get out of the session. They are the ones who should be running the process. There is always a motivation or a reason behind coming in for a lesson. Using the same skills one would use in a reference interview can help bring out these personal goals and help us teach a better and more meaningful lesson. This is difficult to do in a group session, where you could spend the whole time just learning why each person is there and what they hope to accomplish with their knowledge. In a one-on-one session, because the learner has helped choose the topic and has been prompted to think about applications, they gain a sense of control. This is a step toward feeling more open and comfortable with unfamiliar technology, as well as with the learning process.
One of the other great opportunities of one-on-one computer classes is finding teachable moments within lessons. While I let the learner choose the topic for the lesson, there are often places where we can stop in the middle of that lesson and focus on useful mini-topics. Many times we will be working on a topic and I will notice something that would make the learner’s task easier. Sometimes it’s a simple fix, like not needing to type ‘@gmail.com’ at the end of your username when logging in; sometimes it’s something bigger, like introducing the capabilities of a smartphone to someone who was still using it only for calling and texting. This is a great way to learn because it is very organic, growing out of what the learner is interested in doing as well as building on what they already know. No one comes in knowing absolutely nothing. Even a learner who had never used a computer before our session had heard lots of talk about what computers can do and was familiar with the idea of a computer, even if less familiar with the computer itself. Everyone has some prior knowledge.
Another area of difficulty with group classes is assessment. With groups, pre-assessment of skill can be problematic, because testing everyone individually would require a lot of time and would turn many away. Yet asking the group initial questions really only gauges the knowledge of those who answer; the skill levels of those who do not answer remain a mystery. Formative assessment has similar problems, as those who need a second walk-through may keep quiet if others in the class are more vocal about being ready to move on. Summative assessment poses similar issues.
Assessment in one-on-one sessions is more concrete. I can immediately ask, “Can you tell me what we just did?” or “Can you repeat the steps we took to get here?” and see if the learner understood. Constant assessment makes sure that we are moving at a good pace and also lets us go back and fill in any details that might have been lost in the first pass at something. With one learner, I noticed that there was not much retention between weeks, and I was able to adjust my teaching so that we did more repetitive exercises, building the muscle memory that helps lessons stick. I could see that we were moving too fast for her to remember. It was a great opportunity to reevaluate and move forward to a better lesson at the right level for her.
One-on-one sessions offer greater flexibility to meet the learner’s needs, both by gauging interest in topics and by learning how the learner plans to use that knowledge. Assessment is faster and completely focused on the learner, allowing the lesson to develop as fits the individual. This helps to create a meaningful learning experience, which can then empower learners to continue on their path toward greater technology literacy.
Alyssa Hanson is a second-year master’s student at the University of Michigan School of Information.
SHAKING OFF PLAGIARISM WITH TAYLOR SWIFT
When I think about information literacy instruction, I think of rich and exciting lessons that challenge students to develop the skills, knowledge, and ethical wherewithal to navigate our changing landscape of information and technology. Plagiarism is not high on my mental list. So when my mentor, a middle school librarian in a nearby town, asked me to teach about plagiarism, the question that immediately struck me was, “How do I make this not boring?” If it is not something that I am thrilled to teach about, then how can I expect students to be engaged, when they are already less disposed to be passionate about information literacy?
I was not excited to teach about plagiarism, but it was not because the topic is unimportant. My mentor discussed how she had already had to speak to some of the seventh graders I would be teaching about plagiarism in their papers. She emphasized that the seventh graders were in dire need of this lesson, so that they can learn what plagiarism is and how to avoid it. According to my mentor, the most critical lesson which the students needed to learn was that paraphrasing without citing your source is still plagiarism. They also needed to know that plagiarism can have serious consequences which far outweigh the so-called benefits.
My mentor asked me to create a website for the students that would cover the same basic content as my lesson: what plagiarism is, how to avoid plagiarism (including how to cite sources and an overview of paraphrasing), and why it is important. All of the instruction I have received on plagiarism has focused on its consequences. I wanted to challenge the students to think about the ethical reasons why they should not plagiarize: respect for others’ ideas, respect for their instructors, and respect for themselves. This is the part which was left out of the lessons I received, and it is the part which I think is the most important for learners to grasp as they build their understanding of how knowledge is constructed and shared in the 21st century.
Although I was confident in the website I created (http://responsibleresearch.weebly.com), and I knew the content well enough to plan my lesson, I still struggled with how to create something that would engage students, rather than just supplying them with vague arguments and threats of consequences. Since I was teaching in the students’ English classes (and did not think to ask what they were currently reading in class), I could not rely on tying my lesson into the course content. My breakthrough came when I was putting together my slide deck the night before my lesson. I decided to open with a recent news story about the judge who dismissed a plagiarism lawsuit against Taylor Swift using lyrics from her songs (Gajanan 2015).
I had an epiphany: since my mentor wanted me to emphasize paraphrasing in my lesson, why not have the students practice paraphrasing Taylor Swift lyrics? Taylor Swift is one of the most well-known pop stars of our time, and I knew it was safe to assume that almost all students would know her reputation and be familiar with her songs, whether or not they liked her music. Choosing something that I could assume prior knowledge of would eliminate the need for any instruction on the materials to be paraphrased. I printed out portions of lyrics from several of her well-known hits including “Shake it Off,” “Teardrops on my Guitar,” “Love Story,” “We Are Never Ever Getting Back Together,” “Blank Space,” and “Wildest Dreams” using the website AZLyrics.com. I cut up the slips so I could pass them out and have students work in groups of 2-3 to paraphrase the lyrics.
I was honestly a little nervous about how this activity would go (would the students be too distracted by the lyrics to actually work on paraphrasing?) but I was very pleased at how well the lesson went. Instead of having eyes glazed over, the students were mostly excited when I announced we would be paraphrasing our generation’s greatest wordsmith, Taylor Swift (although there were also a few groans mixed in). The success of the activity varied between the four classes I taught, but in each class the students were engaged in the activity, completed at least one paraphrase, and were excited to share with each other and the class what they had written. The fact that the students knew who Swift is, what she is known for, and are familiar with her songs meant that many of them took great care with their paraphrases. My third class of the day was so enthusiastic about paraphrasing her lyrics, I had to scramble to keep up with the demand of new lyrics for each group.
As I circled the room and listened to the examples the class shared, I noticed there were still a few students whose desire to go for the joke undermined their ability to paraphrase effectively. However, they were by far the minority. Most students understood that what they were doing was an act of translation, and they created fresh, concise re-wordings that honored the original meaning. When I asked the students to reflect on what was difficult about paraphrasing, they were full of insightful observations about balancing respect for the original ideas with putting something in their own words without over-complicating it. My favorite observation from a student was that not knowing the context of a lyric could greatly alter its perceived meaning and make it more difficult to paraphrase accurately.
So what does Taylor Swift teach us about plagiarism? First, that when students “know” a person, they are more likely to respect that person’s words. Getting the meaning of Swift’s lyrics right was a greater priority than it might have been if I had chosen a more traditional text, where the author is little more than a name on a page. Second, choosing something with a personal connection to students’ lives can make even a very dry topic a little more engaging. Say what you will about Taylor Swift, her songs are catchy and her lyrics are memorable. Students knew what her songs meant. They had heard them before and could sing them by heart (more than one broke out into song during this activity). For a librarian doing a one-shot instruction session, tapping into a pop culture icon is a great way to establish common ground by connecting to something that students find important in their personal lives.
Gajanan, Mahita. 2015. “Judge Borrows Taylor Swift Lyrics When Shaking Off Plagiarism Suit.” The Guardian, Nov. 12. Retrieved November 17, 2015, from http://www.theguardian.com/music/2015/nov/12/taylor-swift-plagiarism-lawsuit-dismissed-shake-it-off.
Amber Lovett is an aspiring librarian in her second year at the University of Michigan School of Information, specializing in Library and Information Science and School Library Media. She is an intern at the Canton Public Library in Canton, MI, and also a Site Coordinator for Michigan Makers which brings a pop-up makerspace to local schools as an after-school club. She is interested in inquiry-based instruction, makerspaces and maker learning, teacher education, and collaboration across public, school, and academic libraries. Upon graduation, she hopes to work with children and youth in a school or public library setting.
LEARNING FROM HEALTH SCIENCES LIBRARIANSHIP:
QUESTION FORMULATION AND SEARCH STRATEGY
Compared to the many extensive personality questionnaires and surveys used frequently in psychological studies, one of my personal favorites is a single question: “What is your favorite stage in a work or school project – the beginning, middle, or end?” As I have found, a surprising amount of insight reveals itself in individuals’ choices here. Those who enjoy the beginning of a project often value that stage for its optimism toward unnumbered possibilities, not yet entangled by project limitations and choices. Those who like the middle of a project seem to value the guidance those limitations and decisions provide, seeing this stage as comfortingly stable and therefore most productive. Finally, those who prefer the end of a project feel more than general relief on finishing, valuing most its gratifying sense of accomplishment. Interestingly, people who favor certain stages often have strong feelings about the others. For example, those who enjoy the beginning often dislike the middle and end stages, seeing them as overshadowed by regrettable decisions and mistakes that can no longer be undone. Meanwhile, those who prefer the middle or end of a project might dislike the beginning because of its ambiguity, seeing the unknown as potential for those regrettable decisions and mistakes.
I have always enjoyed the beginning of a project, when there is everything to learn and nothing yet out of reach. But even I, a first-stage enthusiast, would agree that starting a project with no limitations or guidance whatsoever makes that part of a project much less fun and much more worrisome. These are precisely the feelings that many students who are still developing information literacy experience. Information literacy provides structure and background for information searches and project-building, and without that understanding and skill to help users conceptualize, plan, find, evaluate, and use the information they need, a project such as the classic freshman-year research paper “on a topic of your choice” seems more like an obstacle than an opportunity. Many students ask, “How do I start when I don’t even know what I need to start?” Several studies on how students approach the research process have highlighted this as a common anxiety-causing spot at the very beginning of a project. Kuhlthau’s classic model of the affective and cognitive stages of the information search process is the result of some of these studies. In my own experience working as a reference desk assistant at the campus’s graduate and undergraduate research libraries, I have seen this anxiety and confusion. Often, students are not actually sure what they need to begin learning about their topic. Consequently, their questions are vague, or wrongly specific, and their attitude timid and insecure.
In particular, students seem to struggle with question formulation in beginning a research project. In Kuhlthau’s model (1991), question formation is described as topic selection, followed by scholarly investigation and refinement into a research focus. I like Kuhlthau’s model because it acknowledges the difference between topic and focus, as well as the need to explore the topic before deciding on a focus. It allows students to answer the question, “How do I start when I don’t even know what I need to start?” The first step is to learn enough about the topic to formulate a focused research question, often using reference sources such as encyclopedias or dictionaries. The second step is transitioning from question formulation to search strategy. A great deal of librarian instruction has traditionally focused on search strategy in teaching scholarly databases, Boolean operators, controlled vocabulary, and other similar topics.
While many scholarly databases now support keyword searching, making searches require less technical expertise, the bridge between the question the user has and the statement the database needs remains fluid and is sometimes overlooked in our instruction. Often at the reference desk, I have made do with demonstrating to students how they might have to “play around” with keywords in different databases, but it is difficult to teach this stage in a few minutes at a reference desk without a more stable instruction model.
However, during my practicum at the Taubman Health Sciences Library in fall 2015, I observed an interesting solution to the issue of teaching students how to begin a research project, using the PICO Question Model and Keyword Table Exercise. These exercises are based in health sciences librarianship, but I believe they could be used effectively for students in multiple disciplines.
The Taubman Health Sciences Library (THSL) serves both UM students and health professionals of the UM Hospital, through instruction and consultation with its expert health sciences librarians, or Informationists. For my practicum, I engaged with the Nursing Core of informationists at the THSL, and during its course, I had several opportunities to observe, and later lead, library instruction in a variety of settings.
My first instructional observation was particularly memorable in that it introduced me to a couple of new strategies for beginning research in the health sciences. The first was the PICO Question Model. A commonly used structure in health sciences education, it stands for
P = Population/Person/Problem
I = Intervention
C = Control/Comparison
O = Outcome.
A question formulated using PICO which we used in our THSL instruction sessions was “What non-pharmacological treatments have proved effective in reducing depression in pregnant women?” In this example, shown in Figure 1 below, P is “pregnant women with depression,” I is “non-pharmacological treatment,” C (here implied) is “pharmacological treatment” or “drug prescription,” and O is “effective in reducing depression.”
This question formulation model gained popularity in the health sciences along with the evidence-based practice movement in the early 1990s (Adams 2014), and from my observations of students at the THSL, it is easy to see why it remains in common use. The structure outlines the essential facets of a health sciences research question, helping nursing students identify and diagnose a medical problem. It is a reliable yet flexible structure, giving students and librarians a base point from which to start instruction and learning at the beginning of a research project. Consequently, I found that our nursing students seemed much less uncomfortable with this stage of the research project than the undergraduates from other disciplines I encountered at the main UM libraries. In general, while most instruction sessions included discussion of finessing the PICO, both librarians and students were able to use this model to discuss effectively how to improve their question formulation.
Figure 1. Sample PICO question from slides created by THSL Nursing informationists for literature search workshop, for Nursing 254 class of sophomore nursing students
The second strategy I observed in my THSL practicum for beginning a research project was the Keyword Table Exercise. Used by all of the Nursing informationists in their literature searching workshops, the keyword table exercise organizes the main concepts of a research question and collects related terms for students to use later in generating a search statement in a scholarly database. For the previous example about pregnancy, the table shown in Figure 2 below was used in guided practice. The informationists then encouraged students to fill in the blanks, first with the main ideas of the question (“pregnancy,” “depression,” “non-pharmacological treatment”) and then with a few related terms (“childbirth,” “postpartum,” etc.).
I found this exercise extremely useful because it visualized the relationship between concepts and keywords for students and was simple and flexible enough for them to work out in guided and independent practice. The exercise also lent itself well to explaining concepts such as truncation, or how to easily cover all forms of a related term, and how to use quotation marks to contain and search for the exact phrase or related term. Finally, upon completing the table, the informationist could illustrate the use of Boolean operators by aligning ORs across the rows of related terms and ANDs down the columns of main concepts. I was especially impressed with how easy the exercise made explaining this complex skill for students because in my past experience at the reference desk, I have found it difficult to explain how Boolean operators work, short of just walking students through plugging them into the database.
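The pattern described above – ORs across the rows of related terms, ANDs down the columns of main concepts – is mechanical enough to sketch in code. The following is a minimal illustration, not anything the THSL actually uses; the related terms beyond “childbirth” and “postpartum,” and the function name, are my own assumptions:

```python
# Hypothetical sketch of the Keyword Table Exercise: each main concept
# (a column of the table) maps to its list of related terms (the rows).
keyword_table = {
    "pregnancy": ["pregnancy", "pregnan*", "childbirth", "postpartum"],
    "depression": ["depression", "depressive symptoms"],
    "treatment": ['"non-pharmacological treatment"', "psychotherapy"],
}
# "pregnan*" illustrates truncation (covering pregnancy/pregnant), and the
# quoted phrase illustrates exact-phrase searching, as described above.

def build_search_statement(table):
    """OR together the related terms in each row, then AND the concepts."""
    groups = ["(" + " OR ".join(terms) + ")" for terms in table.values()]
    return " AND ".join(groups)

print(build_search_statement(keyword_table))
```

Running this yields a single Boolean search statement – one parenthesized OR group per concept, joined with AND – ready to paste into a scholarly database’s search box.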
Figure 2. Sample Keyword Table from slides created by THSL Nursing informationists for literature search workshop, for Nursing 254 class of sophomore nursing students
Both the PICO structure and the Keyword Table have been used at the THSL with great success. Literature in library scholarship suggests these strategies are so useful because they address student information literacy, particularly question formulation and search strategy at the beginning of a research project. Highlights include the use of PICO and the value of evidence-based practice within the context of information literacy, as well as areas of strength and areas for improvement in the ACRL Standards and Framework. For example, Adams’ comparison (2014) with the Standards argues that more attention should be given to question formulation and application of knowledge in health sciences librarianship, and more emphasis placed on source evaluation based on internal logic rather than traditional authority. Another, more recent article on the Framework’s implications for health sciences librarianship has suggested that the Framework’s conceptual flexibility can allow instructors to develop discipline- or course-specific learning objectives (Knapp and Brower 2014), which might very well include Adams’ suggestions based on the needs of health sciences students and professionals. From this, I further suggest that one of the most exciting things about the new Framework is its flexibility: it invites us to learn from other disciplines to improve our information literacy instruction. The Keyword Table Exercise could easily be employed in humanities research instruction, and the PICO Question Model could be adapted, or at least inspire librarians of other disciplines to bear in mind the importance of question formulation and other beginning-research issues in their instruction. Overall, my experience in health sciences librarianship has convinced me that all librarians can learn from each other, especially where one branch of librarianship has cultivated effective strategies for meeting its users’ learning needs.
Adams, Nancy E. 2014. “A Comparison of Evidence-Based Practice and the ACRL Information Literacy Standards: Implications for Information Literacy Practice.” College & Research Libraries 75(2), 232-248.
Knapp, Maureen and Stewart Brower. 2014. “The ACRL Framework for Information Literacy in Higher Education: Implications for Health Sciences Librarianship.” Medical Reference Services Quarterly 33(4), 460-468.
Kuhlthau, Carol C. 1991. “Inside the Search Process: Information Seeking from the User’s Perspective.” Journal of the American Society for Information Science 42(5), 361-371.
Amanda Palomino is a second year Master’s student at the University of Michigan School of Information.
PUBLIC LIBRARY INSTRUCTION – DEMYSTIFIED?
In my project I conducted observations and taught lessons at a large, multi-branch public library in southeastern Michigan. I was particularly interested in the intersection of technology and information literacy instruction. Although I had previous experience working with patrons on technology issues in a public library setting, this experience left me with a deeper appreciation for the nuances and challenges of teaching such a diverse group of students.
For my observations I attended courses on a range of beginning computer skills, from navigating the Internet to using office productivity software. I also observed courses about using tablet devices. From my observations I learned that public library patrons differ from “traditional” students in several important ways. First, each class might be composed of students who vary greatly in age, familiarity with technology, and physical abilities. In the computer classes, the teacher began every class by checking to see that students had some experience using a mouse and keyboard. Second, the students in a public library are not a captive audience but are free to come and go as they please. The programs themselves—the way that they are named and marketed—provide their own instructional “hook.” Students would not take the time and effort to attend a class voluntarily unless they expected to learn something from it. In turn, this contributes to the overall mindset of the students during class. They are highly motivated, but also might come with their own agendas. In contrast to an educational context where librarians work with classroom instructors or faculty, at the public library it is the students themselves who are ultimately partners in curriculum development.
Keeping these observations in mind, I thought carefully about the ABCD portion of the Hunter Model for lesson planning when I began to develop my own lesson plan for “Your Smartphone—Demystified,” which was part of a series of similarly named and designed classes at the library. Although I had visited a variety of courses at this library, I still had only a vague idea of what kind of students to expect at the program. I decided to base my lesson plan on one that had been developed by one of my instructional mentors, adding a few additional components. The outline covered seven basic components of smartphone operation, beginning with basic hand gestures and finishing with connecting to the library’s Wi-Fi network. In addition to this interactive presentation, I wanted to allow time for the students to help each other in small groups, as I noticed this happening organically in the larger sessions I attended. I also included a handout to help guide the discussion and give the students a place to take notes. The handout also helped spark discussion for students who had already mastered the basic smartphone functions.
One aspect of the lesson planning and implementation that I struggled with was how to weave information literacy topics into skills instruction. How could I make my sessions more than an Apple Store workshop about how to use a phone, but without seeming too forced or lecture-y? From the instructional sessions that I had observed I knew that there are many potential opportunities for this in skills-based instruction. For instance, a class about adding web images to a word processing document could include a discussion about copyright and attribution practices. A lesson on downloading and updating tablet apps could include a discussion about personal data sharing and harvesting.
However, although I planned for a few different “advanced” topics in my presentation about smartphones, I also found out how difficult it was to cover everything after bringing everyone up to speed on the basic topics. For instance, I hoped to get to a discussion about assessing the safety of apps available for download, but ran out of time because we had to move on to the group activity portion. Maybe with more experience and better timing I will be able to incorporate more topics into my teaching.
Additionally, I felt that the “mysteries” that I had anticipated were not always the ones that were most perplexing to the students. They had already mastered some topics that seemed tricky, but asked questions about other things that never occurred to my mentor or me. Teaching students who come from different backgrounds, who are older than I am, or who have different communication needs will take much more careful preparation, feedback, and iteration.
Towards the end of each session I showed the class a picture of Hedy Lamarr and talked briefly about how she had helped to develop the technology on which modern Wi-Fi is based. This was at the end of my presentation and I thought everyone might need a mental break from all the topics that we had covered before we moved into group work. The students seemed pleased to learn about this little-known episode in computing history. However, the picture also provoked additional questions: about the differences between Wi-Fi and broadband; about the operation of the technologies themselves; and eventually the final stumper: “What is data?”
As I thought back on how difficult it was for me to answer this question on the spot, it occurred to me that an understanding of technology need not be a sidebar in 21st century information literacy: it is actually crucial to understanding the information that flows from our devices and the decisions we can make about that information. We are in a technological moment in which the makers of our devices do not want us to ask critical questions, but to simply accept the answers that they have chosen for us. We are being taught to value convenience over control, and we are being controlled by convenience. In this context, even actions as simple as picking the second Google search result or customizing a news headline widget can be radical acts by users who understand how to use these tools beyond the default. During my sessions I was pleased to find that the students are already asking critical questions of their devices. “Why on Earth does my phone substitute the words I want to use for these other ones?” “Why does Siri sometimes get things so wrong?” “Why is Siri so often right?”
Perhaps the heart of digital literacy instruction is connecting this natural curiosity (and frustration!) with a deeper understanding about how and why technologies work the way they do. Without this base of knowledge the mysteries might only continue to deepen with each new device, for those on either side of the digital divide.
As one of my mentors prefaced her own instructional session, “demystification” is an ongoing process that takes time, practice, and experimentation to reach some resolution. Spending time with instructors in the public library provided vivid examples of many of the issues that I had previously read about, but also revealed others that I did not expect. In the end I wished that I could have taught more sessions, so that I could try out different techniques and get more feedback from the students. In some ways I feel even more mystified than when I started; I am also full of new ideas to apply to my instruction in the future.
“Madeline Hunter’s Lesson Plan” via Iowa State University Department of Economics. http://www2.econ.iastate.edu/classes/tsc220/hallam/MadelineHunterModel.pdf
Fiona Potter is a second-year master’s student at the University of Michigan School of Information. Her previous experiences with digital science education and technology mentoring brought her to the field of information literacy instruction. She is particularly interested in the work that public libraries do to narrow the digital divide in their communities. In addition to her current studies she also works for an e-resource vendor in Ann Arbor, Michigan.
FOAM EXPLOSIONS AND NON-NEWTONIAN FLUIDS:
ARE WE TEACHING STEM?
CASE STUDY 1: PUBLIC LIBRARY
On a Saturday afternoon, children and parents crowd the programming room of a Midwestern town’s public library. There are tables and chairs set out, and the room is filled with excited talking and even laughter as children mix hydrogen peroxide and yeast or baking soda and citric acid to create mini foam explosions. Staff members circle around, removing the messy remains of finished experiments and refilling water jugs and other supplies as needed. Small half sheets of paper serve as handouts with instructions on how to conduct the experiment, sometimes referencing words such as “endothermic” or “viscosity,” but more often containing simple directions: “empty the bag of baking soda into the bowl” or “add a small amount of dish soap.” The handouts invite patrons to record their observations on the back of the sheet, but few do so. Some parents help their children follow the instructions precisely, asking them questions about what they are doing. More parents step back and watch as their children add this and that ingredient, mixing foams and happily making a mess before heading home.
CASE STUDY 2: SCIENCE MUSEUM
On a Wednesday morning, children from a 1st grade class sit with chaperones at small tables in a classroom inside an interactive children’s museum, situated in the same town as the public library above. Their instructor reminds the children to raise their hands to ask questions and to give her their eyes and ears when she counts down from five. The lesson begins with the instructor questioning the class and using props and the whiteboard to explain the states of matter: who can tell me what a solid is? What shape is the liquid in this pitcher? And what if I pour it into this round bowl? Where does the gas go when I let it out of the balloon? After reviewing the five senses and hinting at the ideas of observation and hypotheses, the instructor and the chaperones help the children mix glue and borax soap to make small cups of slime. The instructor circles the classroom, stopping at each table to ask students what they see, smell, feel. What state of matter do they think the slime is? Why? The class votes on what state of matter the slime is, before moving on and repeating the process, this time using cornstarch and water to make a second slime. At the end, the instructor explains that slime is a non-Newtonian fluid, writing the term on the board and telling a short story about Sir Isaac Newton. The children have never learned that term before, and many of them are excited by it. The instructor hands out slime recipe cards to take home, and gives the classroom teacher an evaluation sheet to fill out about the lesson. When the children return to school, their teacher will have a follow-up activity to reinforce what they learned at the museum.
For much of this semester, I have been exploring what Science, Technology, Engineering and Mathematics (STEM) instruction looks like in libraries and museums. Like many of my coworkers and classmates, I do not have a STEM background. Once I started my practicum observing and co-teaching at a children’s hands-on science museum, however, I started thinking about STEM literacy quite a bit, and I started comparing how the museum approaches STEM and how the public library I have worked at for the last year and a half incorporates STEM. The above (real-life) case studies are illustrative of the two approaches, at least for the organizations with which I have been involved. I think each approach has value, but I also wonder how they can inform and improve one another.
The public library I work at has a wonderful department manager who is trying to incorporate more STEM programming into the library’s already robust events and activity schedule. These STEM programs follow similar lines to our public crafting and DIY programs: an activity is created, instructions are written, materials gathered, and patrons encouraged to give it a shot. Registration is never required, so all events are walk-in. The museum has events similar to the library’s open structured programs where patrons can walk in and have the chance to experiment with kinetic sand or see wild animals from the Leslie Science Center, for instance. However, the museum also offers field trip labs that provide structured instruction to classes, crafted to adhere to educational standards and complement the school curriculum. Throughout my practicum I observed over a dozen of these labs, which taught a variety of topics such as circuits, the water cycle, sound and light waves, magnets, simple machines, and states of matter. These labs are what made me start comparing STEM instruction in libraries and museums, and they made me think about the difference between providing an experience and really teaching a concept.
The labs aim not just to expose children to hands-on science, but also to teach relatively complex concepts. Each lab has a detailed lesson plan, with clearly stated objectives and carefully documented compliance with Michigan Grade Level Content Expectations and Next Generation Science Standards. Every lab lesson has a full script, with activities carefully planned to encourage active participation, often beginning by exploring and reviewing the students’ prior knowledge (for instance, questions about and examples of states of matter), moving to activities that require students to apply this knowledge and critical thinking skills (making observations and inferences to determine the state of matter of the slime), and then using this application to introduce new concepts (the slime was both a liquid and a solid; it’s a non-Newtonian fluid). Not every lab follows the same exact structure, but all are designed to determine and build off of prior knowledge through a hands-on activity. The labs are clear, thoughtful, and intentional instruction, the likes of which I have not yet seen in a public library setting.
The library where I work is a Library Journal Five Star library and is well known for the exceptional service that it provides, including top-rate programs and events. Attendance at our programs, particularly during the summer, is notably high. It is clear that the administration and librarians have discovered what services are valued by our community. The program I described in Case Study 1 was a very successful science program for K-5 patrons. The introduction was brief, letting patrons know staff names, where to find the supplies and instructions, and a few tips such as cleaning their work space in-between experiments to avoid mixing ingredients that wouldn’t react properly. From there, patrons were left to experience the activity on their own, with guidance from staff when asked for (or when visibly struggling). This program, and other STEM programs like it, bears little resemblance to the classroom-style labs at the museum and is much more similar to the crafting programs the library hosts. In fact, in many ways our library STEM programs qualify as STEM because they have a science, math, or engineering theme, more so than because they actually teach a STEM concept. (The exception may be technology courses such as Minecraft and other coding classes, and the newly up-and-running makerspace-like laboratory, which I have little experience with as of yet.)
Although the Case Study 1 STEM program did not aim to directly teach a science concept, it was still considered a success by the library administration and by me as well. Most of the patrons who attended this program probably did not gain a true understanding of what an endothermic reaction is, but many patrons did leave with an awakened interest in chemistry, even if they might not quite use that word. Library programs that focus on an enjoyable and engaging experience have the potential to play an important role in STEM education, or any kind of education or literacy. Many professionals and researchers are already aware of the importance of immersive experiences in the research process and for information literacy instruction: Kuhlthau and Maniotes’ research framework, Guided Inquiry Design, for example, acknowledges the importance of a kind of immersion in the inquiry process, even suggesting a visit to a museum or similar place to spark interest and ignite inquiry through experiences (Maniotes and Kuhlthau, 2014, 11).
In short, the STEM-themed programs the library offers are valuable experiences for patrons, and I think we can recognize that creating positive experiences is a strength of public libraries. However, are we teaching STEM? Are we teaching information literacy, or technological literacy? Public libraries rarely hold the kind of instructional sessions that an academic library would have, and even more rarely get the chance to work with faculty or teachers to create integrated courses on research methods. The public library I work at has historically poor attendance for database instructional programs, and for computer classes in general. Yet many public librarians are adamant about the role that public libraries play in information literacy, touting public libraries as places for technology and information literacy instruction. When, however, we compare our experience-centered programs to the instruction that I see at the hands-on science museum, are the services we offer truly instruction? I would argue that they are not.
More importantly, though, does it really matter what we call our services, so long as we are providing good ones? I think that it does. Much as learning objectives direct a lesson plan, the language we use to express our intentions directs patron and staff expectations of what we are going to be delivering. In order to advertise and promote public libraries as institutions of instruction, STEM or otherwise, we need to adopt the kind of standards-driven lesson plans and more formal instruction and assessment that are being utilized in the hands-on museum setting. At the same time, I believe that the approach to STEM programming many public libraries have taken, exposing patrons to STEM themes in casual and fun ways, is still a valuable and important part of the future of library services. If public libraries have found that STEM programs intended to provide experience or exposure are effective and valued by their communities, it isn’t necessary to undercut and rebuild entirely new services. Instead, we need to decide upon a new way of expressing what our existing services are. Meanwhile, visiting and understanding how museums and other institutions are incorporating STEM and other education can only be beneficial. After all, adapting ideas from the museum setting, whether it be in the form of more clearly educational handouts, posters, or program introductions, will only strengthen our own services.
Maniotes, Leslie K. and Carol C. Kuhlthau. 2014. “MAKING THE SHIFT: From Traditional Research Assignments to Guiding Inquiry Learning.” Knowledge Quest 43(2), 8-17.
Shannon Powers is a graduate student at the University of Michigan, School of Information. She will graduate in the spring of 2016 with a specialization in Library and Information Services. She works as a Public Library Associate at the Ann Arbor District Library, and also as a Research Data Services Intern at the University of Michigan Clark Library. She is interested in nearly all aspects of libraries, public or academic, and is eager to see what she can do in the ever changing world of information.
FAIR USE EMPOWERMENT FOR VISUAL ARTS PRACTITIONERS, SCHOLARS, AND STUDENTS
If I have learned anything at all about information literacy, it is that it provides very real and useful empowerment. Regarding my own experience, the successful completion of my first year in information school left me feeling genuinely empowered in terms of being able to locate, gather, synthesize, and disseminate information. In learning about the Internet, the World Wide Web, digital databases, web-based applications, and digital tools, I became assured of my ability to find and use almost any kind of information that I would ever need.
My second year in information school has involved a heightened level of practical experience in the area of teaching. In SI641, this meant that I would be assigned a real-world mentor from whom I would learn and for whom I would work. Because my ultimate career goal is to be a librarian in an institution of higher learning, I was thrilled to learn that my mentor would be the Librarian for Art and Design at a major Midwestern university.
At our first meeting, my mentor assigned me to create a research guide that would enable university practitioners, scholars, and students to understand -- to the extent possible -- the right of fair use in copyright law, as applied in the field of visual arts. This guide, the librarian explained, was to be based on the College Art Association’s newly-published Code of Best Practices in Fair Use for the Visual Arts (hereinafter, “the Code”) (College Art Association 2015a).
I would soon discover that the issue of empowerment, so important to me in terms of my own educational experience, is likewise important in the area of fair use in copyright law. Succinctly and accurately -- if not optimistically -- stated, the doctrine of fair use in copyright law “defines an open-norm for deciding permissible uses of copyrighted material based on a fairly ambiguous set of standards” (Elkin-Koren & Fischman-Afori 2015, p.3). Widely accepted is the notion that because fair use is a flexible legal doctrine, uncertainty in its application inevitably results (Elkin-Koren & Fischman-Afori 2015, p.4). This uncertainty is concerning, because it gives rise to the risk of “chill[ing] socially desirable behaviors” such as those connected with education and scholarship (Elkin-Koren & Fischman-Afori 2015, p.4). As stated in the Code:
“the practices of many professionals in the visual arts are constrained due to the pervasive perception that permissions to use third-party materials are required even where a confident exercise of fair use would be appropriate. . . . Although members of the community may rely on fair use in some instances, they may self-censor in others, due to confusion, doubt, and misinformation about fair use, leading them to over-rely on permissions. . . . Doing so jeopardizes their ability to realize their own full potential, as well as that of the visual arts community as a whole” (College Art Association 2015a, p. 6).
The decision not to invoke fair use rights is most commonly made by visual arts professionals themselves (College Art Association 2015a, p.6). A recent movement to create best practice guides is supported by the idea that it is important for “an educated user base” to make “reasoned, rather than indiscriminate, uses of [copyrighted] materials” (Subotnik 2014, p. 985). The Code is a thoughtfully considered attempt to educate and empower the visual arts community with regard to fair use.
The research guide that I created in response to my mentor’s request begins with an introductory page, entitled “Getting Started,” that defines the concept of fair use in a simple and easily understandable way and, further, sets forth the four factors that courts consider when making fair use determinations. Those factors include:
1. the purpose and character of the use of the copyrighted work;
2. the nature of the copyrighted work;
3. the amount of the copyrighted work that is used; and
4. the effect of the use of the copyrighted work on the potential market for or value of that work.
This page also contains a link to a colorful and informative infographic (College Art Association 2015b), which explains:
1. the need for a fair use code in the field of visual arts;
2. the risks associated with not having such a code;
3. the way in which the Code was developed; and
4. how one can use the code to make an informed decision about rights under the fair use doctrine.
My intent in creating this page was to provide users with a straightforward, uncomplicated, and visually attractive presentation of the fair use doctrine, and some reasons as to why that doctrine should be important to them.
A second page describes, in uncomplicated terms, the flexible and unpredictable nature of the fair use standard, the difficulties that users may face in applying that standard, and the visual arts community’s response to that difficulty as set forth in the Code. The page contains a statement explaining:
1. that the purpose of the Code is to “provide an approach to reasoning about the application of fair use issues both familiar and emergent” (College Art Association 2015a, p. 8); and
2. that the Code accomplishes its purpose by setting forth five situations, together with related principles and limitations, about which there was a strong consensus among relevant discussion participants (College Art Association 2015a, p. 8).
Importantly, the second page also describes what the Code does not do, and the circumstances to which it does not apply. Specifically, the Code does not provide rules of thumb or bright-line rules, nor does it apply to uses outside of the United States or where other permissions, licenses or agreements apply (College Art Association 2015a, p. 7). I elected to place this information in the second page of the research guide, because I felt that it would be important for users to understand at the outset just what the Code can and cannot do to help.
A third page in the research guide contains an abbreviated version of the five situations, together with related principles and limitations. In addition, each of the five situations contains a link to the section of the actual Code where the full entry can be read in its entirety. In creating this page, I thought it might benefit users to be able to view in isolation, without confusing and unnecessary distraction, the information content that comprises the very heart of the Code.
Finally, there is a page, entitled, “Now You Try!,” which links to hypothetical situations that have been created by the College Art Association. These hypothetical situations allow users to think about how they might evaluate specific fair use situations and to compare their reasoning with that of the authors (College Art Association 2015c). I put this page at the end for those users who, after reading the first three pages, desire additional conceptual help and practice in understanding and utilizing fair use provisions.
My mentor’s reaction, upon seeing the finished research guide, was positive. She confirmed to me that this particular resource, in this particular format, would, indeed, be of value to her users. It is my particular hope that use of the guide will demonstrate to those who are reluctant to use copyrighted material for any purpose that they are entitled to a broader range of uses of copyrighted material than they ever before thought possible.
College Art Association. 2015a. “Code of Best Practices in Fair Use for the Visual Arts.” Retrieved November 13, 2015, from .
College Art Association. 2015b. “Fair Use in the Visual Arts: Why We Need It.” Retrieved November 7, 2015, from http://www.collegeart.org/pdf/fair-use/best-practice-fair-use-infographic.pdf.
College Art Association. 2015c. “Fair Use: You be the Judge.” Retrieved November 7, 2015, from .
Elkin-Koren, Niva, and Orit Fischman-Afori. 2015. “Taking users’ rights to the next level: A pragmatist approach to fair use.” Cardozo Arts & Entertainment 33(1), 1-45.
Subotnik, Eva E. 2014. “Intent in Fair Use.” Lewis & Clark Law Review 18, 935-88.
Sofia-Cerilli, Tamara, and Jamie Vanderbroek. 2015. “Copyright and the Arts: Getting Started.” University of Michigan Library. Retrieved December 16, 2015, from .
Tamara Sofia-Cerilli is a second year graduate student at the University of Michigan School of Information and an enthusiastic, if not always traditional, wife and mother of one. Having a strong interest and background in the law, Tamara hopes that after earning her master’s degree in information science she will have the opportunity to channel any remaining energies into a robust and active career as a law librarian, where she can teach law students how to conduct thorough and effective legal research.
GETTING UP CLOSE AND PERSONAL WITH NUMBERS:
FORAYS INTO DATA LITERACY VIA DATA VISUALIZATION
Before I began researching data visualization, my affinity for numbers was lukewarm at best, and my charting abilities were limited to occasionally making a line graph in my budget spreadsheet. Even my nascent interest in data visualization was rooted in linguistics: ever a lover of languages, I had thought it would be interesting to learn more programming, and data visualization seemed like a good way to familiarize myself with more Python or pick up R (two programming languages that can be used to create advanced data visualizations). As a former English major, I had to struggle with the “fight-or-flight” instincts that kicked in when it came to looking at, talking about, and – horror of horrors! – working with numbers. Little did I know that my journey to data literacy would draw upon skills I already had, encouraging me to think critically about data presentation and look at numbers in a new light.
My project revolved around the idea of “data presentation” – thinking about how to present data in the most compelling and coherent way, weighing the merits of one chart type over another in different situations, reflecting on the questions that are answered through different means of presenting data. Professor Kristin Fontichiaro recommended the topic to me, suggesting that I look at it from the point of view of creating materials for high school educators – teachers and librarians alike – a strong need that she recognized and recently received an IMLS grant to address.
Once I started researching, I was hooked. I discovered that there is a lot of information and research, both scholarly (like the works of Edward Tufte and William S. Cleveland) and informal (blog posts, infographics, and the like), out there on this topic, but often it is not collected in a thorough manner, and much of it is aimed at an audience of people who are already data enthusiasts, rather than data amateurs with very little prior knowledge. Professor Fontichiaro and I decided that I would come up with some rules of thumb for data presentation, and give a rundown of how to most effectively use a variety of common charts and graphs, in the form of a chapter geared towards high school educators. I also spent time observing at the University of Michigan Library: over the course of the semester, my mentor taught a variety of workshops on various data visualization tools to faculty members, graduate students, and undergraduate students of all disciplines. Thinking about the needs and desires of learners in both the high school and higher education settings helped crystallize my thinking.
The first time I really looked closely at a bad bar chart was a revelation. I had read the oft-repeated rule that “bar charts should always start at zero,” but its value did not sink in until I looked at a chart and thought about why. I could see how the trimmed bars’ proportions were completely off, since each bar is taken to represent a whole value, and I could understand how a reader with little prior knowledge or experience could easily be misled. It felt eye-opening.
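For readers who want to see the effect for themselves, here is a minimal matplotlib sketch; the poll numbers are invented purely for illustration. It draws the same two bars twice, once with a trimmed axis and once with a zero baseline:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical poll results: the true difference is small (52 vs. 48).
labels = ["Option A", "Option B"]
values = [52, 48]

fig, (ax_bad, ax_good) = plt.subplots(1, 2, figsize=(8, 3))

# Misleading: trimming the axis at 45 leaves visible bars of 7 and 3,
# so Option A looks more than twice as large as Option B.
ax_bad.bar(labels, values)
ax_bad.set_ylim(45, 55)
ax_bad.set_title("Axis starts at 45 (misleading)")

# Honest: a zero baseline keeps each bar proportional to its whole value.
ax_good.bar(labels, values)
ax_good.set_ylim(0, 55)
ax_good.set_title("Axis starts at 0")

fig.savefig("bar_baseline.png")
```

The arithmetic makes the point: the true ratio of the values is about 1.08, but the visible portions of the trimmed bars stand in a ratio of more than 2 to 1.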
Likewise, a good visualization can be a revelation (for all the right reasons). In Data Points, Nathan Yau demonstrates how even the simplest data visualization can tell a compelling story (2013, pp. 231-232). He uses a scatterplot to display the weekly median salaries of a variety of professions in 2011: a dot’s position on the x-axis shows how much men are paid for that job, and the dot’s position on the y-axis shows how much women are paid for it. He first shows the graph without any annotation: an upward trend to the dots shows generally that jobs that pay more do so for both men and women. But on the next page, he plots a line representing equal pay (x=y) through the middle of the graph, and this small detail puts the entire graph into perspective: there is one dot over the line, and two or three right on it, but all the rest of the dots (several dozen of them) are under it, showing quite plainly that men are paid more than women in many professions. Yau’s visualization shows the storytelling power of numbers, and his succinct yet arresting style makes the reader hungry for more.
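The same idea can be sketched in a few lines of matplotlib. The salary figures below are invented stand-ins, not Yau’s actual data; only the technique, plotting an x=y reference line through a scatterplot, is borrowed:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

# Hypothetical weekly median salaries (men, women) for a handful of jobs.
salaries = {
    "Teachers": (1050, 950),
    "Nurses": (1200, 1090),
    "Engineers": (1600, 1400),
    "Retail": (600, 520),
    "Lawyers": (2100, 1800),
}
men = [m for m, w in salaries.values()]
women = [w for m, w in salaries.values()]

fig, ax = plt.subplots()
ax.scatter(men, women)

# The x = y reference line: any dot below it is a job where men out-earn women.
limit = max(men + women) * 1.1
ax.plot([0, limit], [0, limit], linestyle="--", color="gray",
        label="equal pay (x = y)")
ax.set_xlabel("Men's weekly median salary ($)")
ax.set_ylabel("Women's weekly median salary ($)")
ax.legend()
fig.savefig("pay_gap_scatter.png")
```

Without the dashed line, the scatterplot only shows an upward trend; with it, every dot’s position instantly reads as “above, on, or below equal pay.”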
Over the course of the semester I looked at an abundance of charts, graphs, and visualizations: some great, some not so great, some ordinary – and all absolutely fascinating. Numbers can tell a story just like words can. Numbers can be used to confound and mislead. Often you have to look not only at what is in the chart, but also at what is missing. Every detail of a graph can mean something. Feeling scared? Nervous? Wary? There is no need to be: no matter your background or interests, you already have the skills required to think critically about data visualization – you just need to apply them.
Numbers are not so different from words: the two are interwoven together, forming the fabric of our intellectual lives. Numbers have power: they can tell a story. Numbers can be a foundation upon which you can build an argument, or the ammunition needed to take an argument apart. We do ourselves, and our patrons and students, a disservice when we needlessly silo different topics: the topics may be different, but the critical thinking skills they require are the same. My observations at the U-M Library showed me that patrons from all disciplines are interested in data visualization, and would love to use data visualizations in their work, even when they know very little about how to create them. In his classes, my mentor often brought in visualizations and encouraged students to voice their opinions and pick the visualizations apart – this instructional strategy truly showcased the fact that data visualization can be understood and enjoyed by anyone.
Take the critical thinking skills you already have, and apply them to numbers! The underlying, foundational competencies are the same in any discipline: deep reading, attention to detail, an ability to take ideas apart and draw conclusions. The very same skills you use to interpret a scholarly article on library instruction, or critique a study on students’ information-seeking behavior (or even analyze Hamlet, for all you other former English majors out there) can be unleashed onto data visualization.
Exploring data visualization simply involves applying the relevant skills to particular situations. That is the beauty of data literacy. It is not subject- or discipline-specific: data literacy invokes skills and ways of thinking that are applicable to any number of contexts and situations. Data literacy encompasses competencies that are valuable not only in the classroom, but which translate into real-life applications. It allows us to look at the information around us in a new light, and it helps us make more informed decisions. Take the news, for example: being a more critical interpreter of the visualizations you see in news articles allows you to assess credibility and form your own opinions. Data visualization can act as a door into data literacy, bringing in new abilities that build on old ones, and transferring prior knowledge to new contexts of learning, thinking, and doing.
Take a deep breath and dive in: data visualization is all around you already. Critique a map in a news article. Think critically about a line graph in a scholarly paper. Look closely at a bar chart in a report at work. Practice makes perfect, and this can be a fun skillset to practice. Then let yourself plunge to even greater depths: seek out visualizations online, do some in-depth research – maybe even begin creating charts and graphs yourself, to visualize data from your workplace, or from the lessons you teach, and glean new insights. Before you know it, you will be fully immersed in the engrossing world of data visualization.
You can encourage your students and patrons to start doing the same. As a group, question the way data is presented in the charts, graphs, and other visualizations that you encounter every day. Consider together how changes in presentation might change the meaning or essence of what is being conveyed: for example, a line graph might allow you to visualize the changes in values over time, while a bar chart would let you see those same values as wholes, and compare them to one another. Foster in-depth critical discussions, and watch data literacy – both your own and that of your students or patrons – grow before your very eyes.
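To make that line-versus-bar comparison concrete, a small sketch like the following (the visit counts are invented for illustration) plots the same values both ways:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

# Hypothetical monthly library visit counts.
months = ["Sep", "Oct", "Nov", "Dec"]
visits = [420, 510, 480, 300]

fig, (ax_line, ax_bar) = plt.subplots(1, 2, figsize=(8, 3))

# A line graph emphasizes change over time: a rise, a dip, a sharp drop.
ax_line.plot(months, visits, marker="o")
ax_line.set_title("Change over time")

# A bar chart emphasizes each month as a whole value to compare.
ax_bar.bar(months, visits)
ax_bar.set_title("Values as wholes")

fig.savefig("line_vs_bar.png")
```

The underlying numbers are identical, yet the line graph invites a question about trend ("what happened in December?") while the bar chart invites a comparison ("which month was biggest?") — exactly the kind of difference worth discussing with students.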
Even after all the research that I did, I am still unable to make a visualization with Python, and my grasp of R is still tenuous (and that is being generous) – but I know now that there is more to data literacy than that. I am proud to call myself newly data literate: there is always room for improvement on the spectrum of literacy, so I look forward to continuing my forays into the analysis and appreciation of data visualization. I hope my enthusiasm is contagious: data literacy is well worth the plunge. You will be rewarded with a new outlook on the data that you encounter, and deepened analytical skills that will serve you well in both your personal and professional lives.
Yau, Nathan. 2013. Data Points: Visualization That Means Something. Indianapolis: Wiley.
Tierney Steelberg is a Master’s student studying library science at the University of Michigan School of Information. In addition to her interests in information literacy and data literacy, she is passionate about access, digital scholarship, and innovation in libraries. In her spare time, she enjoys reading voraciously and cooking comfort food.
WHERE DO YOU LEARN DIGITAL ORGANIZATION FOR YOUR ONLINE LIFE?
Consider your online life. How do you organize yourself? How did you learn your organizational habits? Do you use tools like RSS feeds, Google Alerts, and Diigo (a social bookmarking and digital annotation website)? Or, do you have lists of favorite websites in huge hierarchies of bookmark folders that you sift through in your browser?
Using digital tools, from RSS feeds to Diigo accounts, can make online work more efficient and effective, but students and professionals do not seem to consistently employ them or even know about them. The rapidity with which the Internet was adopted meant that instruction for working online was not part of education at first; rather, people had to figure out how to use tools and develop their own personal practices. However, employing these and other such tools can save time, make re-finding online content easier, and automate common tasks. These digital organization skills, which are a subset of digital literacy, need to be taught, particularly in the high school environment, because believing that people will “just figure it out” – a notion prevalent in high schools – is inadequate.
For my class on information literacy in the fall of 2015, I observed a library media specialist and library media center at a well-funded, suburban Michigan high school with more than 2,000 students. Watching instruction sessions, talking with teachers, participating in daily tasks, troubleshooting technology issues, and planning lessons were among my responsibilities and experiences. To gain teaching experience, I taught two lessons: a one-shot session for an AP Language class about creating an online personal learning environment, and professional development sessions for teachers about using Diigo to form a professional learning community.
My concern about instruction on digital organization evolved from my observations of teachers and students in both the library and classrooms. The lack of a home for digital organization instruction in high school education quickly became apparent. The high school offers graphic design, computer-aided design (CAD), and programming classes only as electives. A required course on technology skills does not exist, nor do teachers take on technology instruction as a consistent learning objective. In the classroom, teachers give point-of-need technology instruction when it fits with lessons, such as a research assignment or a unit for which the final product is an infographic. This issue – no clear approach to teaching digital organization, including how to use particular tools and when – left me wondering who should be accountable for teaching these technology skills.
At the high school where I observed, the library media center serves students, teachers, and staff. Its services and tools include computers for students to use, a laminator, book checkout, laptop carts, conference rooms, and seating areas for classes to work in the media center. The school has an abundance of technology for its students and classrooms, from iPads to carts of Chromebooks, owing to a recently passed technology bond. The library media specialist typically gives point-of-need instruction for classes and teaches freshman orientations each fall. Sessions range from database tutorials to lessons on how to use citation systems. Teachers request help from the media specialist with planning assignments or giving ten-minute to full-hour sessions on information literacy topics.
When I inquired about the lack of a home for digital organization instruction in conversation with my mentor, the answer was that students are expected to “figure out” how to use devices and digital tools. The efficiency of searching online – searching Google, that is – and abundance of devices have caused teachers and administrators to assume that students can operate technology and do research effectively. However, my mentor and I agreed that this approach does not instill the necessary technical know-how.
For example, my mentor said students will compose papers on their smartphones during lunch on the day an assignment is due. This practice stopped me in my tracks. What about editing the draft? Are the students looking at notes that they have synthesized as they write, and, if so, where are the notes? Do they at least proofread the document before turning it in? This concerning behavior raises questions not only about good scholarship but also technology use and best practices. Why is it acceptable for students to key a paper during lunch hour? And where will this behavior leave them when they move on to universities and colleges?
Furthermore, in considering this issue, I heard two related anecdotes from university librarians. One librarian, who teaches an undergraduate research skills class, said that a freshman college student from a less privileged high school struggled with navigating the online course management system, which most courses use to communicate, provide course readings and materials, and collect completed assignments. It is expected that the students can figure out how to navigate it. However, to figure it out, students would need either prior experience with such systems, which would come from their previous academic program (high school for undergraduates), or instruction on using the system from the university – neither of which this student presumably received. This is a gap.
Another librarian noted that young people are supposedly “digital natives” (Prensky 2001). Yet, this librarian has found undergraduates lacking in technical knowledge. This comment corroborates my observation that there is no home for digital organization instruction in high schools. This issue is manifesting itself in higher education because students are not coming in with the skills to navigate technology at the level that they need to be able to operate.
The common thread through these stories is that students will just figure it out. This approach leaves their competency up to chance: chance that a teacher teaches a particular skill, chance that the student takes the time to figure it out, chance that they have the skills to complete a task, fulfill job requirements, or operate effectively in their education. Tools and devices in use now will change, but knowledge of how to use them, and why, will equip students to transfer their understanding to new devices and contexts – skills that will serve them in both the workplace and higher education.
Instruction in digital organization is uneven, and several factors influence this inconsistency. Students have digital access at home and often on their own devices, so school leaders assume that students know how to use them well. (It is reasonable to expect that students can use their devices, but not safe to assume they can use them efficiently.) Educators leave it up to students and parents to figure out the tools and devices that figure prominently in their lives and education. Also, websites and digital tools update frequently and consequently require teachers to relearn them in order to teach them. This recurring change can cause frustration when lesson plans are structured for an older version of a tool or device and time is short. These factors, though valid, do not seem strong enough to justify leaving digital organization skills up to chance.
Marc Prensky’s assertion that students who have grown up with technology are “digital natives,” while people who did not grow up with technology are “digital immigrants” (2001), draws a dichotomy that does not reflect students’ actual skills. Technology is intuitive to some and not to others, including those who grow up using it – often on an as-needed basis. White and Le Cornu’s observation that students absorb skills at varying rates applies to digital organization: some students might not pick up digital literacy skills easily and will need assistance (White and Le Cornu 2011). This point undercuts the idea that students will just “figure out” their digital skills.
Additionally, White and Le Cornu re-imagine Prensky’s typology: they define “places” as spaces for interacting digitally with others and “tools” as the means of doing so, and they distinguish “visitors,” who use the web to access content, from “residents,” who occupy the web as a social network and create content (2011). These distinctions demonstrate the need for metacognition about one’s actions on the web and in digital environments generally. Rather than diving into the web or a digital program to achieve a task, considering what tool and what place will best support a need will improve digital organization. For example, all but one of the 22 Advanced Placement Language students I taught said that they go to Google first for searching. What about setting up a Google Alert or RSS feed for the topic? Similar to the questions that White and Le Cornu propose people ask about their motivations, I aimed to encourage the students and teachers whom I taught to step back and consider what tools they use to fill a need and why. Making the Internet work for them would save precious time and improve their organization, whether they take a “visitor” or “resident” approach. Digital organization instruction would prompt students to make such considerations.
For these reasons, I am dissatisfied with the digital organization skills that high school students have and receive – skills necessary personally and professionally. My mentor told me that basic instruction occurs in middle school in her district, and by high school, it is expected that students have picked up the skills. Perhaps they know enough to negotiate high school – enough to use their smartphones to dash off a paper – but they might not employ time-saving digital tools and practices that increase their competency in the workplace or at the university. Just as education becomes more advanced in higher grades, technology instruction needs to correspondingly advance.
I assert that digital organization instruction needs a home – a dedicated class – in high schools to equip students for the workplace and higher education. Evidence comes from the reception of my lessons. The students in the AP Language class I taught asked for access to my slides about creating a personal learning environment. Their request reveals their interest in organizing themselves and making the Internet work for them. It also indicates the need for formal instruction on digital organization and related digital literacy skills, such as digital citizenship. They were not figuring it out themselves, and they were not aware of all the tools (RSS feeds and apps, Google Alerts, and Diigo) that I taught. To guide this instruction, the International Society for Technology in Education (ISTE) defines National Education Technology Standards (NETS) for students, including selecting appropriate tools and transferring skills (ISTE 2007), and is launching a revision. The nonprofit Common Sense Media also offers a curriculum for digital citizenship. These standards and curricula can inform the teaching of digital organization in high schools.
Students need more preparation in digital organization, from using tools to choosing the best ones, to be competent personally and professionally. Prensky recognized the need to incorporate technology into teaching methodology nearly 15 years ago. Now, the need is not only to use technology but to teach digital organization. Teachers and library media specialists do their best to prepare students. Still, that no one has claimed responsibility for digital organization instruction is a growing problem, because this component of digital literacy is a key competency for life. High school is the place to teach digital organization because it is when students begin to need more tools, and it comes before their entry into the workplace or higher education, where they will use those tools professionally. I propose that this underlying skill, which is like mortar between the bricks of core subjects (Fontichiaro 2015), needs time and a place in high schools. Assuming that students will simply figure it out is not sufficient.
Common Sense Media, Inc. 2015. “K-12 Digital Citizenship Curriculum.” Common Sense Education. Retrieved November 25, 2015, from .
Fontichiaro, Kristin. 2015. Interview by the author. Ann Arbor, MI. Dec. 2.
International Society for Technology in Education (ISTE). 2007. “ISTE Standards For Students.” International Society for Technology in Education. Retrieved December 6, 2015, from .
Prensky, Marc. 2001. “Digital natives, digital immigrants.” On the Horizon 9(5). Retrieved December 6, 2015, from http://www.marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf.
White, David S., and Alison Le Cornu. 2011. “Visitors and residents: A new typology for online engagement.” First Monday 16(9), Sept. 5. Retrieved December 6, 2015, from .
Martha Stuit is a graduate student at the University of Michigan’s School of Information and will receive her Master of Science in Information in the spring of 2016. She serves as a User Information Services Assistant on the reference desks and instant message service at U-M Library. She works on an IMLS-funded project to increase data literacy in high schools as the grant’s Project Assistant. Martha is eager to help people navigate information and data.
RULES OF THUMB FOR TEACHING SCHOLARLY PUBLICATIONS TO UNDERGRADUATE STUDENTS
Once they reach the university level, students are often expected to navigate the confusing maze of library resources to locate, read, analyze, and cite scholarly literature on a certain topic, even though few are taught these skills in high school. Course instructors usually know that students need help from librarians to learn how to find scholarly sources through the library but may incorrectly assume that students know how to read and analyze scholarly articles. Library instructors are often asked to give generic information literacy sessions for a variety of disciplines and course levels (from beginning freshmen to graduate students), but these requests often don’t include how to read and analyze scholarly publications.
My practicum experience at a large public university in the Midwest focused on observing and teaching information literacy sessions for undergraduate students in STEM fields. I observed sessions in which undergraduate students were guided in how to retrieve scholarly publications from scientific databases. Unfortunately, many library instructors were clearly more interested in the quantity of information conveyed than in its quality. These instructors planned only a lecture-style session with very little student interaction. The sessions were held in a computer lab, so between a boring lecture on using scientific databases and a computer in front of them, most students started using the computer for other tasks within a few minutes.
For my own instruction, I taught several sessions for two student groups in the STEM fields on how to read scholarly publications. One of these, a session for undergraduate engineering researchers, I had taught for the first time the previous year. I was very lucky to have a year to reflect on the session and how to improve it.
After reflecting on my observation and instruction sessions, I have developed a few Rules of Thumb for teaching how to read and analyze scholarly publications. While my focus is on the STEM disciplines, many of these Rules of Thumb are also applicable to humanities and social sciences. I hope that these Rules of Thumb will help library instructors focus on quality rather than quantity of information conveyed during these instruction sessions.
EIGHT RULES OF THUMB FOR TEACHING HOW TO READ SCHOLARLY PUBLICATIONS
ALWAYS EXPLICITLY TIE READING SCHOLARLY PUBLICATIONS TO STUDENTS’ ASSIGNMENTS
Students are much more likely to pay attention if you clearly state how this instruction session will help them complete their assignment more easily. To match your instruction to the assignment, ask the instructor for a copy of the assignment in advance so that you can tailor your session to it. Don’t assume that the assignment is the same as last year’s or the same as the assignment from your colleague’s instruction session. If you can’t get the assignment in advance, ask the instructor or teaching assistant right before the session starts. If all else fails, you can always ask students to describe the assignment and its requirements during the session. (Asking students may lead to incorrect or incomplete information, however.)
PROVIDE AN INTERESTING AND RELEVANT SCHOLARLY PUBLICATION
To engage students in the session, give each student a copy of a scholarly publication. Be strategic when selecting an article. First, select a topic that is both relevant to the discipline of the course and an interesting or current topic. For engineering students, I often use scholarly articles on how hydraulic fracturing (fracking) causes earthquakes. Second, look for articles on this topic that have good visuals, in color if possible. Visuals make articles more interesting, especially for novice readers, and also cater to visual learners. Publications with maps are especially good because most students have prior experience with reading and interpreting maps.
START WITH A COMPARISON OF POPULAR AND SCHOLARLY LITERATURE
Students are often incredibly intimidated by scholarly publications because they think they have no prior knowledge or experience to help with reading these publications. You can ease their anxiety by leading them through a discussion of popular vs scholarly publications. First, you’ll need to provide students with two articles on the same topic: one from a popular source and one from a scholarly source. Second, give students a few minutes to skim both articles. They don’t need to read the entirety of each article; you don’t want students to get bogged down in the technical language of the scholarly article. Third, ask students which article is from a popular source and which is from a scholarly source. Finally, start a discussion on the characteristics that led the students to categorize the articles in this way. Typical characteristics to discuss include the author’s credentials, the type of visuals used, the type of language used, and the format of the articles. Make sure to highlight the difference between a popular and scholarly article for each characteristic (such as popular articles use general language whereas scholarly articles use discipline-specific language). This hands-on discussion will tap into students’ prior knowledge and set a conversational tone for your session.
BRIEFLY DESCRIBE THE DIFFERENT TYPES OF SCHOLARLY PUBLICATIONS THAT THEY MIGHT ENCOUNTER
Before discussing the scholarly article in depth, you should have a short conversation about the different types of scholarly publications (articles, review articles, letters, conference proceedings, etc.). But first use your professional judgment to determine if the students absolutely need this for their assignment; if they don’t, avoid the temptation to include it in your session. Again, having the assignment in advance will help you determine whether this section is necessary and, if so, which types to cover. Some assignments may not explicitly specify the type of scholarly publication needed. Use your knowledge of the assignment and the discipline to guide your decision. For example, conference proceedings are a very common type of scholarly publication in engineering fields, so I usually include them in sessions for engineering students. Finally, after discussing the different types, ask the students to identify the type of scholarly publication from your previous discussion of popular vs. scholarly publications.
NAME AND DESCRIBE ALL OF THE TYPICAL SECTIONS OF SCHOLARLY ARTICLES
For any session on scholarly publications, you shouldn’t assume that students know the different sections of an article. These sections include: abstract, introduction, materials and methods, results, discussion, and conclusion. This part of the session will show students that scholarly publications have a predictable structure. It’s much easier to describe these sections if your scholarly article has explicit section headings to denote each section. In your presentation, I suggest using screenshots of each section so that students can follow along on their copy of the article and you can highlight certain aspects with arrows, circles, etc. Describe the purpose of each section and any unique features. For example, I describe the abstract as a mini version of the article, including roughly one sentence each for the introduction, methods, results, and conclusions.
SHOW STUDENTS A PROCEDURE FOR HOW TO READ AN ARTICLE EFFECTIVELY
After describing the structure of a scholarly publication, give students a plan for how to read the article. Most importantly, highlight that the most efficient way to read an article is not beginning to end but rather by skipping around to various sections. The exact order of sections will depend on the discipline and assignment. After reading each section, I encourage students to ask themselves, “Is this article still relevant to my assignment?”; if not, they shouldn’t waste their time by continuing to read. This part of the session is also a good time to mention that students shouldn’t expect to understand all of the terminology used in the article and should be prepared to read the article multiple times before completely understanding it. I also encourage students to mark up the text with comments or questions. As they read, they should analyze the strengths, limitations, and assumptions of the publication.
BRIEFLY DESCRIBE A CITATION MANAGEMENT TOOL AND HOW IT CAN HELP WITH THEIR ASSIGNMENT
Depending on the level of students and the assignment, introducing students to a citation management tool may be appropriate. Again, use your professional judgment to assess whether these students will benefit from an introduction to one of these tools and which tool to present. Don’t try to present several features of the citation management tool; instead focus on one or two features that will be most important to students. For example, in RefWorks, I demonstrate how to create a bibliography with one click of a button. Also, make sure to highlight online resources for students to download and learn more about these tools.
AVOID ABSTRACT, HIGH-LEVEL TOPICS
We, as librarians, know that scholarly publications are complex and nuanced. But, for students’ sanity, focus on presenting the bare minimum amount of information (which will still be quite a bit of information as demonstrated in the seven rules above). Stick to exactly what they need to know. Students will already be overwhelmed with this amount of information. Don’t add extraneous information merely because it’s interesting. Topics such as open access publications and DOIs are incredibly interesting from a librarian viewpoint but are extraneous for undergraduates.
You may have noticed that several Rules of Thumb contain a caveat: use your professional judgment when deciding whether to include this section in your instruction session. These Rules of Thumb are meant to serve as guidelines, rather than a definitive instruction set. You know your institution, faculty, and students the best and therefore are the best judge of which of these Rules of Thumb are applicable to a given instruction session. Creating and executing tailored lesson plans is more work than using generic ones, but students gain much more from the former.
Joanna Thielen is a University Library Associate at the University of Michigan Library and a second year master’s student at the University of Michigan School of Information.
ABOUT OUR CLASS:
SI 641: Information Literacy for Teaching and Learning
This course introduces theories and best practices for integrating library-user instruction with faculty partnerships. Instructional roles are presented within the wider context of meeting institutional learning goals. Students acquire explicit knowledge, skills, and competencies needed to design, develop, integrate, and assess curriculum and instruction in a variety of information settings, including educational and public organizations. The integral relationship between technology and information literacy is examined. Students are given opportunities to partner with professional mentors in schools, academic libraries, museums, and in other educational institutions.
Upon completion of this course, students will be able to:
1. Identify key theories about inquiry-based learning and information literacy;
2. Create a virtual learning module about some aspect of information literacy and learning;
3. Reflect on their experiences observing practitioners in a teaching role;
4. Lead face-to-face instruction on an aspect of inquiry or information literacy;
5. Develop an online learning module on some aspect of inquiry or information literacy, in partnership with a mentor;
6. Engage in ongoing discussions about how we define literacy(-ies) in the digital age.
To learn more about the School of Information at the University of Michigan, visit http://si.umich.edu