How easy was it for parents to help young children track the amount of reading done over the summer? Was the performer you hired money well spent? Why was your teen program attendance lower than last year? Why is your library only reaching about 10% of the student population in your area?

Outputs such as circulation, attendance at programs, or number of participants can tell you how much your program was utilized, but those numbers do not tell you how well your program worked, or why it might not have worked.

Outputs vs. Effectiveness

To track your library’s outputs over time, consider using an Excel spreadsheet to record your participation each year. A benefit of Excel is the ability to generate instant charts and graphs. Here is a spreadsheet designed around the outputs requested on the ICfL summer reading report: These numbers are important, especially when you compare them from year to year.
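If someone on your staff is comfortable with a little scripting, the same year-over-year comparison can be automated. Below is a minimal Python sketch; the column names and figures are made up for illustration and are not the actual fields on the ICfL report.

```python
import csv
from io import StringIO

# Illustrative output data; the columns and numbers are assumptions,
# not the actual ICfL summer reading report fields.
DATA = """year,registered,completed,program_attendance
2014,210,95,480
2015,260,130,540
"""

rows = list(csv.DictReader(StringIO(DATA)))

# Compare each year against the previous one and report percent change.
for prev, curr in zip(rows, rows[1:]):
    for field in ("registered", "completed", "program_attendance"):
        old, new = int(prev[field]), int(curr[field])
        change = 100 * (new - old) / old
        print(f"{curr['year']} {field}: {new} ({change:+.1f}% vs {prev['year']})")
```

In practice you would point `csv.DictReader` at the real spreadsheet exported as a CSV file instead of the embedded sample text.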

However, to assess the quality and effectiveness of your program, consider administering formal surveys and/or conducting focus groups. A sample survey for parents can be found on ICfL’s summer reading resource page: This can be customized for your individual library or library system.

Chatting with parents or observing behavior is not considered “formal” evaluation. It’s hard to make the case for additional funding with “data” such as, “Well, a lot of people told us they really enjoyed summer reading.” You need to be able to compile the information you collect from survey instruments and report hard data. “Only 55% of the parents surveyed felt our new online reporting system was easy to use and said they would use it again. Clearly we need to re-evaluate this tool for next year.” Or, “Over 80% of the parents surveyed indicated they would attend more library programs if they were held in the evenings or on Saturdays.”
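Turning a stack of survey responses into a statement like the ones above takes only a simple tally. Here is a minimal Python sketch; the question wording and responses are invented for illustration.

```python
from collections import Counter

# Hypothetical yes/no answers to "Was the online reporting system
# easy to use?" -- invented data, chosen to mirror the 55% example.
responses = ["yes"] * 11 + ["no"] * 9

counts = Counter(responses)
pct_yes = 100 * counts["yes"] / len(responses)
print(f"{pct_yes:.0f}% of parents surveyed found the online system easy to use")
```

Most online survey tools do this tallying for you, but knowing how the percentage is computed helps you sanity-check the reports they generate.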

What do you really need to know about your summer reading program? When should you find out? What data would show you the best way to utilize your summer reading budget, or provide rationale to request more funding? Who should you survey: those who participated in your summer reading program, or those who did not? Each purpose is unique, but here are a few data points that are frequently collected:

Pre-summer reading–Planning: Survey students and/or parents

Consider sending a survey to past participants, catching walk-ins at the library, asking your school librarian to survey students, or setting up a table outside the grocery store, farmers’ market, soccer field, or community center.

  • What kinds of programs might you attend at the library? (STEM, book clubs, performers, arts/crafts, gaming, etc.)
  • Do they typically attend summer reading at the library? If not, why not?
  • What kinds of books would they like to be able to read? Any specific titles?
  • Would they attend summer reading if programs were held at an alternate location? (school, park, community center, apartment complex, mall, pool, etc.)
  • What kind of prizes would encourage them to “complete” the summer reading program?
  • Would they be more willing to attend if a meal or snack were provided?
  • Do they have access to computers, internet, or mobile devices for e-books or online tracking?
  • What is the best way to tell them about library programs? (mail, email, texts, Facebook, etc.)
  • What other questions would help you plan the best way to promote and implement your program?

During summer reading–

Comment cards are a great way to collect feedback about current programs, as are emails to current participants or Facebook posts. Simply asking, “How are we doing?” or asking for feedback about specific programs or tracking tools, can help you make tweaks to your summer reading program or address challenges before your patrons get frustrated.

Post-summer reading–Evaluation: Survey students and/or parents

Consider sending the survey to registered participants as well as to walk-ins. Ask the school if they would allow you to survey students, or ask the school librarian for assistance.

  • Did they participate? If not, what prevented them from participating?
  • Did the program make a difference in reading attitude or ability?
  • Did parents change their behavior?
  • How many programs did they attend?
  • Is this their first time participating?
  • What were their favorite programs?
  • Was the tracking system (paper or online) easy to use?
  • Will they come back next year?
  • Did they read more this year?
  • What would they like to see more of?
  • Feedback about the facility (space, etc.)
  • Was the timing of programs convenient?
  • Demographic information
  • How could programs be made more accessible?
  • Feedback about staff
  • How many years have they participated?
  • Would they volunteer next year? (collect contact information)
  • Do they have access to computers, internet, or mobile devices for e-books or online tracking?
  • Did they access the web page or Facebook page?
  • Primary language / dual-language household?

Keep your surveys as simple and convenient as possible. You certainly wouldn’t want to ask all of these questions! Asking parents to fill out a survey on-site can increase the return rate, as can asking the school librarian or teachers to help distribute and collect surveys.

Many libraries are now using online survey tools. A benefit of using an online tool is that most will compile the data for you and let you download it; many even generate charts and graphs. Here are some web resources that offer free software:

Be sure to share results with your stakeholders:

  • Board of Trustees
  • School District/Individual Schools
  • Chamber of Commerce
  • Legislators
  • Community (press release)

See a template for End-of-Program Reporting here:

Remember, if you choose not to evaluate the quality and effectiveness of your summer reading program, others will (parents, children, school staff). Proactively assessing your program sends the message that you value the needs of your patrons.