The blog post round-up is a series of posts that pull from the great YALSAblog archive. The topics have been requested by YALSA members. Have an idea for a topic? Post it in the comments.

Community Partnerships:

Teen Programming: Building Teen Futures with Community Partnerships

30 Days of Teen Programming: Develop Rich, Mutually Beneficial Community Partnerships

30 Days of Teen Programming: Develop Partnerships Part 2

Adventures in Outreach: Micro Partnerships & Equity

Connect, Create, Collaborate: The Next Big Thing with Partnerships

School Library Partnerships:

Let it Go … The End of a Partnership

Partnership Profile: Library Linx

 

The MaKey MaKey

The blog post round-up is a series of posts that pull from the great YALSAblog archive. The topics have been requested by YALSA members. Have an idea for a topic? Post it in the comments.

Inexpensive Ideas:

Back to Afterschool: Tech Resources

Pop Up Programming

30 Days of Teen Programming: Low Stress Making through Crafternoons

Easy:

30 Days of Teen Programming: Evaluate Outcomes

Developing Creative Programming for Teen Read Week

Keep These Things in Mind When Creating Programs:

30 Days of Teen Programming: How Do You Know What’s Needed

30 Days of Teen Programming: Programming for the Platform

 

I can take no credit for the creation of my library’s longest-running teen-led program (teen programming guideline 3), and only a little for its continued existence since I took it over in 2007. Project Playbill is an intense, 5-week summer theater program. Teens meet at the library three days a week to write, produce, and perform an original short play. Besides the inherent value of their participation, we also entice them with volunteer service credit.

In 2008, my then-supervisor told me that I could cancel Playbill if more teens didn’t participate, because it sucks up a tremendous amount of time. In fact, because Playbill depends on teen leadership and labor to run, the fewer teens who show up, the more work I end up doing. That’s one of the reasons why no teen is ever turned away: you can’t host a teen-led program without teen participation. For the first couple of years I ran it, attendance hovered around five teens. I seriously considered putting Playbill out of its misery.

Read More →

YALSA’s recently updated Teen Programming Guidelines encourage the use of evidence-based outcome measurement as a means of developing meaningful programs for young people. The Public Library Association – through its latest field-driven initiative, Project Outcome – is also working to assist with librarians’ efforts to capture the true value and impact of programs and services. At ALA Annual 2016, PLA will launch Project Outcome, designed to help any programmer measure outcomes beyond traditional markers such as circulation and program attendance. Instead, Project Outcome focuses on documenting how library services and programs affect our patrons’ knowledge, skills, attitudes, and behaviors. It will help librarians use concrete data to prove what they intuitively know to be true: Communities are strengthened by public libraries and patrons find significant value in library services.

Lessons from the Field: Skokie (IL) Public Library

At Skokie Public Library, we participated in the pilot testing of Project Outcome in the fall of 2014 by administering surveys for 10 different programs. The surveys were conducted online, on paper, and through in-person interviews. In one example, teens attending a class about biotechnology were interviewed using a survey designed to measure outcomes for “Education/Lifelong Learning.” Participants ranked the extent to which they agreed or disagreed with statements measuring knowledge, confidence, application, and awareness. Results showed that 85% of respondents agreed or strongly agreed that they learned something helpful, while only 43% agreed or strongly agreed that they intended to apply what they had just learned. The results demonstrated some improvement in subject knowledge, information that can be useful for advocacy. But they also revealed that there’s room for growth in ensuring program participants understand how they can apply what they’re learning. In an open-ended question asking what they liked most about the program, teens mentioned the chemical experiments conducted during the program. This type of data is something that we can pay attention to when planning future programs.
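For anyone curious about the arithmetic behind figures like those, here is a minimal, purely illustrative Python sketch of tallying Likert-style responses into “agreed or strongly agreed” percentages. The statements and responses below are invented for the example; they are not Project Outcome’s actual survey instrument or data.

```python
# Illustrative only: tally Likert responses into percent "agreed or strongly agreed".
from collections import Counter

AGREE = {"Agree", "Strongly agree"}

def percent_agreement(responses):
    """Return the rounded percent of responses that are 'Agree' or 'Strongly agree'."""
    counts = Counter(responses)
    total = sum(counts.values())
    agreed = sum(counts[r] for r in AGREE)
    return round(100 * agreed / total) if total else 0

# Hypothetical responses to two survey statements
survey = {
    "I learned something helpful": [
        "Strongly agree", "Agree", "Agree", "Neutral", "Strongly agree", "Disagree", "Agree",
    ],
    "I intend to apply what I learned": [
        "Agree", "Neutral", "Disagree", "Agree", "Neutral", "Strongly agree", "Neutral",
    ],
}

for statement, answers in survey.items():
    print(f"{statement}: {percent_agreement(answers)}% agreed or strongly agreed")
```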

Read More →

One of my favorite sections of the Teen Programming Guidelines (is it nerdy to have favorite sections?) is “Align programs with community and library priorities.” But you have to be deeply involved with community agencies and activities in order to be ready to act on the community’s priorities as they arise. This sounds obvious (and it is!), but it’s taken me a few years to figure it out.

Several years back, my coworker and I began working with the Seattle Youth Employment Program (SYEP). SYEP is a city agency that places youth facing barriers to employment in paid internships in a variety of environments in city government and the private sector. It also provides them with job training and academic support. We worked with SYEP staff to design a curriculum that would build the interns’ digital and information literacy skills. We were sometimes surprised by the needs identified by SYEP staff and the interns’ employers: touch typing, for example, and basic MS Word. We learned a lot about putting our own assumptions aside.

Over the years, we continually evaluated and adjusted the program. We dropped some pieces and added others to make it as relevant as possible to the needs of the youth and the needs of their employers.

This year, Seattle’s mayor put forth a huge Youth Employment Initiative in which he asked SYEP to more than double the number of youth placed in jobs over the summer. Suddenly, the community had spoken: youth employment was a major need. Because we already had an ongoing relationship with SYEP, the library was poised to expand the partnership to serve more youth with our trainings. We also helped in other ways, like providing meeting rooms for SYEP staff trainings. Next summer, the mayor intends to make the program five times larger than it is this year (eep!), which will present a huge opportunity for library involvement.

Of course, being in the right place at the right time is always partly a matter of luck. But you can’t be lucky if you’re not out there.

When we plan programs for teens, how do we create programs that teach them something useful but are still fun and exciting? We can search the web, ask our colleagues for ideas, and look in old library school textbooks, but, ultimately, our journey begins with the Search Institute’s 40 Developmental Assets for Adolescents.

When we look closely at the 40 Developmental Assets for Adolescents, the general framework focuses on the external and internal assets found in a teen’s environment, which help them develop. According to the Search Institute:

The 40 Developmental Assets are the “building blocks of healthy development—known as Developmental Assets—that help young people grow up healthy, caring, and responsible.”

What’s great about these developmental assets is that we already offer programs that support one or more of them. Although we can’t hit every single asset (much to our chagrin), we can cover many of these building blocks by creating programs that ensure our teens are getting the support, encouragement, and opportunity to grow and learn in the library. By incorporating several developmental assets within our programs, we can help teens discover new things, which will inspire and entice them to come into the library with their friends to learn more. If we want to draw in new teens and keep our current teens coming back, I highly recommend introducing these programs during the annual summer reading program.

Read More →

Staffing situations vary from library to library based on a number of factors including population served, budget, and organizational structure. So who gets to staff programs? YALSA’s guidelines lay out a number of considerations to take into account whenever making staff and volunteer assignments for a program, no matter our size or structure. Points 6.3 and 6.5 in particular consider the different roles that staff and volunteers take.

6.3: Consider which tasks are best suited to librarians and which are more suited to paraprofessionals, community partners and mentors, adult volunteers or Friends of the Library, and teen volunteers and participants.

With any program, someone needs to take the leadership role and accept responsibility for everything (the good and the bad) that comes of it. I find this is most often the person (usually a librarian) who pitches the program, and who believes in it enough to carry through with it. Whether hiring a presenter or relying on a crew of regular volunteers, the program leader needs to know (or know how to find) the answers to any question anyone may have about it, from the time it first goes on the program schedule to three weeks afterward, when someone calls to ask when the next one will take place. The librarian leading a program is also most often the person charged with enforcing the rules, as in, “Sorry, this is a teen program for teens only.”

Read More →

When the email about doing 30 days of programming went around to the bloggers, my mind instantly went blank. I’m just a librarian-in-training and haven’t done a lot of hands-on programming with teens. What could I bring to the conversation?

Then I remembered I did have a program. A hypothetical one, that is. I’m currently taking a Media Literacy for Youth class, which has been amazing. One of our assignments was to create either a lesson or a program plan about a media literacy topic. It could be targeted to any age group and should last 2-3 hours. We had to write about outcomes and lay out all the activities, essentially planning it so that some librarian could do it with the kids they work with.

I’ll lay out my idea, and then I want your feedback. Is this program realistic? Would it work with the teens you work with? And if it’s not realistic, what needs to be changed?

So…here I go!

As a twenty-something, I would say I’m pretty well-connected on social media. If someone asked what my favorite social media platform is, I would say it’s Twitter. There’s something exciting about Twitter when you think about it like a cocktail party (shout out to blogger Dave Charest for this analogy) — there are hundreds of conversations going on around you, and you decide which ones to tap into. And our teens are using it, so why not have a program that challenges them to think about not only how they use Twitter, but how others use Twitter?

Read More →

As a member of YALSA’s Programming Guidelines taskforce, I thought of my library’s teen technology intern program, which has been in existence since 2005 and has used an outcome-based measurement system for the last few years to measure success.

The Teen Programming Guidelines devote a fair amount of space to discussing the details of developing programs that “Engage in youth-driven, evidence-based evaluation and outcome measurement.” Basically, this means attendance shouldn’t be the sole measure of a program’s success. Considering both the short- and long-term goals that teen participants define for themselves, in terms of what they hope to learn, can be a way to assess whether a program is on track or needs to be tweaked. Evaluations can take the form of pre- and post-surveys, a face-to-face conversation, or even informally asking questions. Aside from teens defining their own goals, other capacities a program might want to focus on include “. . .an improvement or expansion of knowledge, skills, confidence, attitude, or behavior.” Pre- and post-surveys do well at capturing this data because they can show a change from when the teen started the program to when they completed it. Sometimes the hardest part is remembering to give them the survey!
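To make the pre/post comparison concrete, here is a tiny, hypothetical Python sketch. The five-point scale, the capacities, and the average scores are all invented for illustration; a real program would plug in whatever its own survey instrument collects.

```python
# Hypothetical example: average pre- and post-survey scores on a 1-5 scale,
# compared to show the change across the capacities a program cares about.
PRE = {"knowledge": 2.4, "skills": 2.1, "confidence": 2.8}
POST = {"knowledge": 3.9, "skills": 3.2, "confidence": 3.1}

for capacity in PRE:
    change = POST[capacity] - PRE[capacity]
    print(f"{capacity}: {PRE[capacity]:.1f} -> {POST[capacity]:.1f} (change {change:+.1f})")
```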
Read More →

Admission time: like many of us in Library Land, I am still figuring out the best ways to measure program outcomes. Marking attendance is relatively easy (although, to be fair, sometimes the teens do move around a lot, which can make them tricky to count). It’s a bit harder to identify the changes I want to see as a result of my program, and then accurately measure those changes.

The Programming Guidelines ask us to “Engage in youth-driven, evidence-based evaluation and outcome measurement.” I’m not quite there yet. As I mentioned in my post about our weekly drop-in, we’ve been working with participants in that program to identify priorities, and now we’re moving towards evaluations that will measure whether those priorities are being met. But it’s still a work in progress.

What I have gotten better at is working with community partners to create evaluations for programs. For example, we regularly collaborate with Year Up to build their students’ information and digital literacy skills. Before each workshop, we meet with Year Up staff to make sure that we’ll be teaching the skills they want participants to gain. Collaborating with partners on our evaluations and learning from them about their own evaluation methods has made a huge difference in the quality of our evaluations overall.

At Year Up, I give the students pre- and post-tests to see how much our classes are moving the needle on desired skills and knowledge. We send Year Up staff an early draft of the tests (same questions for both) and incorporate their feedback in the final evaluation tool. Seems foolproof, right?

Read More →