Authored by the YALSA Research Committee
Throughout the current term, the YALSA Research Committee will be looking at the Teen Services Competencies for Library Staff through the lens of research. Through our posts, we will attempt to provide a brief snapshot of how scholarship currently addresses some of the issues put forth in the competencies.
Researching outcomes, libraries, and assessments, the committee narrowed its results to three relatively recent studies. The first study uses a survey to examine the advantages and disadvantages of end-of-program assessments (EPAs) for LIS master’s programs. The second is a case study of an LIS distance learning program that achieved a graduation rate of over 90 percent, and of what its assessments look like. The third reviews recent research on school libraries and the importance of using evidence to achieve successful student outcomes.
Outcomes and assessments are crucial as evidence of getting things done. Everything we do relies on them, and this includes schools, whether you are tracking your own progress or your children’s progress in classes. The first study looks specifically at MLIS programs, examining various end-of-program assessments (EPAs). Burke and Snead’s (2014) study compared the advantages and disadvantages of EPA options including theses, capstone courses, various kinds of projects or papers, and portfolios, noting that about 40 percent of programs require comprehensive exams. One hundred twenty-five faculty members were asked which EPA they would prefer, if given a choice. Overwhelmingly, they chose portfolios; the most common reason given was that “it requires students to reflect on their program experience” (p. 31). It is interesting to note that two respondents stated that a “good assessment to know how well the program is functioning is to assess the careers of graduates” (p. 36). Burke and Snead noted that full professors were more likely than their more junior colleagues to choose portfolios as their preferred EPA. The authors suggest this may stem from their greater experience with accreditation, funding, and administrative activity, which leads them to recognize the value of outcomes-based measures.
Aversa and MacCall (2013) authored the second report, a case study of an MLIS program with successful outcomes, including a 90 percent graduation rate. It is interesting to note that Aversa and MacCall claim, “It has been established, through studies at several colleges, that attrition rates have been 10 to 30 percent higher for courses delivered online than for those delivered face-to-face” (p. 148). First, the program focused on recruitment and supported that process in five ways: student-initiated inquiries, advertising, direct mail to potential students, word-of-mouth, and individual contact by administrative staff. Next, the application and admission process is supported by administrative staff and faculty communicating individually with each applicant as applications move through admissions. Upon admission, students are contacted by the Distance Education Coordinator, with whom they take a “test drive” of the distance learning technology. Enrollment, the next stage of the SLIS program, is supported by a one-credit-hour residential orientation in which students are introduced to one another, to the faculty, and to the technology, curriculum, and traditions of the University of Alabama (UA) SLIS. Instructional delivery for core and elective courses is supported by various synchronous and asynchronous systems, and technical support staff are readily available during and after class sessions for public or private support when needed. Distance learning students can also take classes face-to-face if they wish to enhance their professional networks. The fourth way UA supports its students is by ensuring socialization: beyond introductions of faculty, staff, and cohorts, each year’s cohort is encouraged to “name” its group, which builds group identity and instills a sense of camaraderie.
In addition, online town halls are held once each term with the program’s director and assistant director to ensure open communication and access and to address any problems or concerns. Finally, the UA SLIS program identified and addressed barriers such as isolation, financial pressures, and time management issues. The outcomes of these efforts are impressive: across five cohorts, a graduation rate of 90.4 percent.
The third report the research committee focused on is Hughes’ (2014) review of evidence from formal research, involving purposeful data, on the importance of teacher librarians using evidence to guide their professional practice and demonstrate their contribution to student learning. Hughes reports on what Ross Todd calls ‘evidence for practice,’ which focuses on “the real results of what school librarians do, rather than on what school librarians do” (p. 89). We need to examine “impacts, going beyond process and activities as outputs” (p. 39). The author suggests that professional articles are commentaries and anecdotes, which can be useful but do not provide evidence. We need to focus less on the inputs, what librarians put into making a program successful, and more on the outputs: what our learners actually learned, and what solid evidence demonstrates that something was learned.
The findings of this body of literature contribute evidence and understanding about school libraries and school librarians that are of potential use to a variety of stakeholders. In particular, they can support school librarian practice and indirectly influence student outcomes. However, unless evidence is used strategically, its value is lost (Hughes, 2014). As we are all aware, if we cannot prove our worth, why would we expect our programs to be funded? Actionable evidence is paramount: it must be systematically gathered, evaluated, applied, and presented in a format suitable for the intended audience, not just preaching to the choir (Hughes, 2014). No matter how passionate and persistent the advocacy, it will have limited potency without ‘actionable evidence’ (Hughes, 2014, p. 89). This is the essence of evidence-based practice.
Aversa, E. & MacCall, S. (2013). Profiles in retention part 1: Design characteristics of a graduate synchronous online program. Journal of Education for Library and Information Science, 54(2), 147-161.
Burke, S.K. & Snead, J.T. (2014). Faculty opinions on the use of master’s degree end of program assessments. Journal of Education for Library and Information Science, 55(1), 26-39.
Hughes, H. (2014). School libraries, teacher-librarians and student outcomes: Presenting and using the evidence. School Libraries Worldwide, 20(1), 29-50.