Association of Research Libraries

University of Washington Libraries

Library Assessment Conference

Building Effective, Sustainable, Practical Assessment



2008 Post-Conference Workshops

Date: Thursday, August 7, 2008
Time: Morning Sessions: 8:30am – 12:00pm; Afternoon Sessions: 1:00pm – 4:30pm
Fee: $60 per workshop

(1) Beyond the Pie Chart: Techniques for Structuring and Visualizing Quantitative Information
(2) Getting Started with Learning Outcomes Assessment: Purposes, Practical Options, and Impact
(3) Jumpstarting the Assessment of Library Learning Spaces
(4) Successfully Implementing the Balanced Scorecard
(5) Turning Data into Information: Details behind Telling the Library Valuation Story
(6) Usability Testing: Effective Methodologies and Practical Applications

1. Beyond the Pie Chart: Techniques for Structuring and Visualizing Quantitative Information

Leader: Joe Zucca (University of Pennsylvania)

Time: 8:30am – 12:00pm

This workshop, led by Joe Zucca of the University of Pennsylvania Libraries, explores issues involved in the visualization of data. Topics draw on a wide range of concrete library data sources and data-representation problems, with a focus on techniques for effectively presenting statistical information in graphical form. Joe will discuss basic graphing concepts and look at practical problems of data gathering, particularly automated data collection, drawing on experience with the Penn Library Data Farm. He will also explore how data structures influence the way library systems record, store, and ultimately make raw data available for analysis, for decision-making, and for the expository needs of library managers and higher-level administrators.
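
As a small illustration of the kind of “beyond the pie chart” technique the workshop addresses, the following minimal Python/matplotlib sketch (the categories and counts are hypothetical, not workshop materials) presents categorical service-point data as a sorted horizontal bar chart, which usually makes ranked comparisons easier to read than pie slices:

import matplotlib.pyplot as plt

# Hypothetical service-desk transaction counts, for illustration only
counts = {
    "Circulation": 18450,
    "Reference": 7230,
    "Interlibrary loan": 3110,
    "Media services": 1980,
    "Special collections": 640,
}

# Sort ascending so the largest category ends up as the top bar
labels, values = zip(*sorted(counts.items(), key=lambda kv: kv[1]))

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(labels, values)
ax.set_xlabel("Transactions (hypothetical fiscal year)")
ax.set_title("Service-desk activity, ranked")
fig.tight_layout()
fig.savefig("service_activity.png", dpi=150)

Because the bars share a common baseline and are ordered by size, relative magnitudes can be read directly, which is the kind of structural choice the workshop examines.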

Joe Zucca is the Director for Planning and Communication at the Penn Libraries. His units are responsible for coordinating Library planning and project management; for overseeing management information services, such as Data Farm; for maintaining a wide range of publication and marketing programs, including content management and design of the library web space; and for administering corporate and foundation grant activity. The effective development of data sources and the expository use of data are critical tools in the areas Joe manages, and the workshop is designed to show how evidence-based librarianship works in a large ARL setting.


2. Getting Started with Learning Outcomes Assessment: Purposes, Practical Options, and Impact

Time: 8:30am – 12:00pm

Leader: Megan Oakleaf (iSchool, Syracuse University)

NOTE: This workshop is full. You may be placed on the wait list at the time of conference registration.

Tasked with assessing information literacy on your campus? Confused about your options? Dissatisfied with assessments you’ve already attempted?

Intended for librarians considering, commencing, or retooling a plan for assessing student learning outcomes, this half-day workshop will include mini-lectures, discussion, and hands-on, scenario-based activities to engage participants in answering three questions:

  1. What is the purpose of learning outcomes assessment in my library?
  2. What assessment tools can I use? What are the strengths and limitations of each? How do I choose the right one for my campus?
  3. How will my choices impact teaching and learning? How will I “close the loop”?

Megan Oakleaf is an Assistant Professor in the School of Information Studies at Syracuse University, where she is the professor of record for IST 613, “Planning, Marketing, and Assessing Library Services.” She joined the iSchool at Syracuse after completing her dissertation, “Assessing Information Literacy Skills: A Rubric Approach,” at the School of Information and Library Science at the University of North Carolina at Chapel Hill. She is also a faculty member of the ACRL Institute for Information Literacy Immersion Program. Previously, she was the Librarian for Instruction and Undergraduate Research at North Carolina State University. She has presented on topics including information literacy, outcomes-based assessment, evidence-based decision making, user education, information services, and digital librarianship at numerous conferences, including ACRL National Conferences, the Library Assessment Conference, the IUPUI Assessment Institute, the NCSU Undergraduate Assessment Symposium, the Texas A&M Assessment Conference, and EDUCAUSE. Recently, she won “Best Paper” at the International Evidence Based Library and Information Practice Conference. Prior to her career in librarianship, she taught language arts and advanced composition in public secondary schools, grades 8–12.


3. Jumpstarting the Assessment of Library Learning Spaces

NOTE: This workshop is full. You may be placed on the wait list at the time of conference registration.

Time: 8:30am – 12:00pm

Leaders: Crit Stuart (Research, Teaching, and Learning, Association of Research Libraries) and Todd White (Anthropologist & Library Consultant)

We know that we should continually assess our built (or soon-to-be-delivered) research and learning spaces, but many of us lack the confidence or prior experience to get started. By doing little or no assessment, we risk losing touch with the needs of our constituents, failing to make needed improvements or course corrections, and demonstrating poor stewardship of the enterprise to our students and faculty. Assessment should be a critical component of new space and program planning, and of ongoing post-occupancy analysis. It is best conducted by the people who manage, program, and work within the learning spaces, yet there is often a reluctance to go the final distance. This workshop supplies attendees with effective, easy-to-apply techniques to “jumpstart the momentum” and gives them the confidence they need to create and sustain a comprehensive assessment program for learning spaces.

At the 2006 Library Assessment Conference, Joan Lippincott outlined a variety of assessment techniques to apply to new or renovated learning spaces in her presentation, “Assessing Learning Spaces.” Lippincott suggested that we identify and address the big issues, characterized in part by these questions:

  • How do we establish the learning objectives the learning space supports?
  • How do we identify, engage, and sustain the right mix of partners in the learning space?
  • How is student learning facilitated; are critical skills enhanced; what learning needs languish?
  • How are the technology and physical components of the new spaces working?
  • How are faculty pedagogy and classroom deliverables being supported?

This workshop builds on Lippincott’s framework by exposing participants to a rich suite of assessment methodologies to employ in their settings – typically learning/information commons, multimedia studios, and other spaces that blend services, technologies, mentoring, and training for the benefit of students and faculty.

Attendees will (1) be exposed to a rich suite of assessment techniques that are becoming popular with librarians and their collaborators around learning spaces and (2) begin to jumpstart their confidence through moments of hands-on practice, coupled with discussions around select techniques.

Crit Stuart is ARL’s program director for research, teaching, and learning, the newest of the Association’s strategic initiatives. The RTL program focuses on new and expanding roles for ARL libraries to engage in the transformations affecting research, and undergraduate and graduate education. Previously Crit was senior associate director for public services at Georgia Tech Library, where he facilitated learning space transformations derived from user-centered studies.

C. Todd White (PhD Anthropology, University of Southern California) consults with university libraries to provide assessment support for new and evolving programs that target both space and consumer behaviors. Todd was involved in the University of Rochester’s IMLS-funded project on how doctoral students conduct research, write, collaborate, and use library resources to help create a web-based authoring, archiving, and self-publishing tool for faculty and doctoral candidates. He recently collaborated on a similar project at Colorado State University to build an online search tool to facilitate student research.


4. Successfully Implementing the Balanced Scorecard

Time: 1:00pm – 4:30pm

Leaders: Jim Self and Donna Tolson (University of Virginia Library)

Learn what it takes to successfully implement a Balanced Scorecard.

This workshop is intended for anyone seeking to develop a coherent structure of performance indicators and targets for their library.

As part of its effort to create a culture of assessment, the University of Virginia Library implemented the Balanced Scorecard in 2001. Jim Self has been involved with the scorecard since the beginning, and Donna Tolson chaired the scorecard committee from 2004 to 2007.

Jim and Donna will call on their experience at U.Va. to inform the content of this workshop. They will discuss the principles of designing a scorecard and will emphasize practical approaches and realistic indicators that express the value and impact of library services. Participants in the workshop will have the opportunity to craft the outline of a scorecard specific to their own circumstances and needs.
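
For orientation, a minimal sketch follows of how a scorecard’s structure can be represented: perspectives, each holding indicators with targets, checked against actual values. The perspectives, metrics, and figures below are hypothetical placeholders, not U.Va.’s actual scorecard.

# Hypothetical scorecard: perspective -> list of indicators with targets and actuals
scorecard = {
    "User perspective": [
        {"metric": "Overall satisfaction (1-5 survey)", "target": 4.0, "actual": 4.2},
        {"metric": "Interlibrary loan turnaround (days)", "target": 5.0, "actual": 6.1},
    ],
    "Internal processes": [
        {"metric": "Days until new books are shelf-ready", "target": 10.0, "actual": 8.5},
    ],
    "Finance": [
        {"metric": "Cost per circulation ($)", "target": 3.50, "actual": 3.20},
    ],
    "Learning and growth": [
        {"metric": "Staff with assessment training (%)", "target": 60.0, "actual": 55.0},
    ],
}

# Indicators where a lower value is better need the comparison flipped
LOWER_IS_BETTER = {
    "Interlibrary loan turnaround (days)",
    "Days until new books are shelf-ready",
    "Cost per circulation ($)",
}

for perspective, indicators in scorecard.items():
    print(perspective)
    for ind in indicators:
        if ind["metric"] in LOWER_IS_BETTER:
            met = ind["actual"] <= ind["target"]
        else:
            met = ind["actual"] >= ind["target"]
        print(f"  {ind['metric']}: target {ind['target']}, actual {ind['actual']} -> "
              f"{'met' if met else 'not met'}")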

Donna is Head of Clemons Library, the undergraduate and media services library at U.Va. Previously she served as Head of the Scholars’ Lab, a collaborative venture between the Library and the University’s IT division. Prior to joining the Library, she worked for twenty years in demographic research for the Commonwealth of Virginia and at the U.S. Census Bureau.

Jim is Director of Management Information Services at the U.Va. Library. He formerly headed the Clemons Library at U.Va., and the Undergraduate Library at Indiana University. Working with Steve Hiller, Jim serves as a Visiting Program Officer for the Association of Research Libraries, bringing to life the “Effective, Sustainable and Practical Assessment” service, www.arl.org/stats/initiatives/esp/. During the past four years Jim and Steve have conducted more than thirty assessment consultations at academic libraries in North America, Africa, Europe, and the Middle East.


5. Turning Data into Information: Details behind Telling the Library Valuation Story

Time: 8:30am – 12:00pm

Leader: Neal K. Kaske (Public Services & Regional Libraries Branch, National Oceanic and Atmospheric Administration)

Learn to turn data into information that will change management’s mind about your library’s value and performance. Please bring your questions about library evaluation research and current data to this participatory workshop.

Key tools and concepts needed to build your library’s value case will be addressed, including data dictionaries; international and national standards for library metrics; national, state, and local data sets; online library comparison tools; statistical dictionaries; valuation tools; cost/benefit analysis; and return on investment. You will also define your library’s current valuation data and information needs in this participatory workshop.

The primary goal is to have a better understanding of ways to turn your current data into information that you can use to demonstrate the value of your library. The secondary goal is to start identifying additional data and information you could collect and use to advance your library’s case to upper management. Basic outcomes will be a clearer understanding of the key valuation concepts and methods for presenting information to your organization’s upper management and to current and potential library customers.
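
As a minimal illustration of the return-on-investment and benefit/cost arithmetic mentioned above (all figures are hypothetical placeholders, not workshop data):

# Hypothetical annual figures, for illustration only
annual_library_cost = 1_200_000        # staff, collections, facilities
estimated_annual_benefit = 3_000_000   # e.g., from a contingent-valuation or use-value study

roi = (estimated_annual_benefit - annual_library_cost) / annual_library_cost
benefit_cost_ratio = estimated_annual_benefit / annual_library_cost

print(f"Return on investment: {roi:.0%}")                     # 150%
print(f"Benefit/cost ratio:   {benefit_cost_ratio:.1f} : 1")  # 2.5 : 1

The point is not the particular numbers but the translation step: raw budget and benefit estimates become a single figure that upper management can compare against other investments.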

Neal K. Kaske, Chief of the Public Services & Regional Libraries Branch at the National Oceanic and Atmospheric Administration (NOAA), has been an active library evaluation researcher for many years. He is currently working to document the value of online database and journal use. His experience includes federal and academic library administration, science reference, teaching, research, national survey and statistical management, research management, and grant management. Neal holds a doctorate in industrial engineering (library systems management), a master’s degree in librarianship, and a baccalaureate in sociology. He is an active member of the American Library Association, serves on the editorial board of portal: Libraries and the Academy, and reviews for other library and information science journals.


6. Usability Testing: Effective Methodologies and Practical Applications

Time: 1:00pm – 4:30pm

Leader: Jennifer L. Ward (Web Services, University of Washington)

According to ISO 9241-11 (1998), usability is defined as the “extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.” Increasingly, libraries have turned to usability testing to evaluate their online, and to some extent physical, presence. Not all methodologies associated with usability testing are created equal, and this workshop will focus on the various methodologies and how they can be applied in a library setting.

This practical workshop will cover the following areas:

  • Different methodologies that can be used throughout the product life cycle to iteratively evaluate its usability
  • Minimum equipment needs for the different methods
  • Role of the moderator and observers
  • Recruiting users
  • How the methodologies might be applied in other contexts, such as using the think-aloud protocol to evaluate the usability of the library’s physical space

Participants will have an opportunity to shape the workshop content once registration is confirmed, and will leave the workshop with a better understanding of various usability methodologies and when they should be used at different points in the product life cycle.
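
As a minimal illustration of how the ISO 9241-11 notions of effectiveness and efficiency are often summarized after testing (the participants, tasks, and timings below are hypothetical):

# Each record: (participant, task, completed, seconds on task) -- illustrative only
sessions = [
    ("P1", "Renew a book online", True, 95),
    ("P2", "Renew a book online", True, 140),
    ("P3", "Renew a book online", False, 300),
    ("P1", "Find a course reserve", True, 60),
    ("P2", "Find a course reserve", True, 75),
    ("P3", "Find a course reserve", True, 50),
]

for task in sorted({t for _, t, _, _ in sessions}):
    results = [(done, secs) for _, t, done, secs in sessions if t == task]
    completion_rate = sum(done for done, _ in results) / len(results)
    # Efficiency is conventionally reported for successful attempts only
    successful_times = [secs for done, secs in results if done]
    mean_time = sum(successful_times) / len(successful_times)
    print(f"{task}: {completion_rate:.0%} completed, "
          f"mean time on task {mean_time:.0f}s on successful attempts")

Satisfaction, the third ISO component, is usually captured separately with a short post-test questionnaire.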

Jennifer L. Ward is Head of Web Services at the University of Washington Libraries, where she is responsible for general oversight of the Libraries’ Web presence and has managed the Libraries’ human factors/usability program since its inception in 2001. A member of the Libraries Assessment Group since 2000, Ms. Ward presents and publishes frequently on a variety of assessment-related topics. In addition to her Web and assessment work, she manages the Services Group within the Libraries’ Information Technology Services unit. Ms. Ward received her MSLIS from the University of Illinois at Urbana-Champaign.
