Learning from the Audience: How San Jose Taiko Utilizes Data

In the month of May, TCA is releasing weekly blog posts about the various ways the taiko community is using data. This week's post features a Q&A with San Jose Taiko (SJT) Executive Director Wisa Uemura. Wisa discusses how SJT has used surveys to collect audience feedback and offers tips for groups that would like to start their own survey collection process.

What types of data do you collect, and what are the method(s) you choose to collect the information?

San Jose Taiko (SJT) typically collects data through quantitative and qualitative questions on:

  1. Audience demographics (e.g., age, gender, ethnicity, household income);
  2. Audience/participant expectations prior to the activity (e.g., motivations, how they heard about it, what they thought would happen); and
  3. Audience/participant satisfaction (e.g., quality of service, content of the program, benefit).

We collect the data through paper or online surveys depending on the program, typically following the activity.

Why did you decide to start surveying your audiences, and how long have you been collecting audience data?

SJT’s founding members were civically engaged individuals who laid the groundwork for many Asian American nonprofits in the area (SJT, Yu Ai Kai, and Asian American Law Alliance, to name a few). Their belief in equality, empowerment, and community building instilled processes within SJT that strive to balance the individual experience with the collective vision.

SJT was also encouraged to survey audiences by our private foundation and government funders. As a recipient of private and public funds, we have a responsibility to report back on the impact of our programs: who we’re reaching and to what benefit.

What are the top 3 insights that you’ve gained from conducting these surveys?

Shorter is better: limit the survey to a single page (front and back if necessary) to increase the response rate. However, this can be difficult depending on the range of information you are trying to collect.

There is a difference between real-time and delayed feedback: to get the most responses, we typically have audience members/participants complete the survey immediately following the event. On a few occasions we’ve asked them to return it a few days later. The responses are similar, but asking them to revisit the event a few days later extends the experience, possibly making it more memorable.

Learn from all responses: as artists/instructors we put our creative work out there, and as with all art, there will be a range of feedback and interpretation. SJT tries to learn from all responses so we can be proactive, not reactive.

How much time does it typically take your staff to create the survey, distribute the survey, and calculate the results?

It depends. SJT has template surveys for performance audiences and workshop/class participants. When applicable, using an existing survey decreases the time needed:

  • Survey creation = 1 hour or less;
  • Production (printing or placing into online format) = 30 minutes or less;
  • Calculating the results (immediate informal review, data entry, analysis, sharing) = 1–5 days, depending on the number of surveys collected, the types of responses, and the intended use of the data.

What are 3 things that someone thinking about starting a program evaluation project should consider before starting?

  1. What will you do or want to do with the information? Are you open to potentially changing what you do based on the information collected?
  2. What types of information do you need to serve the purpose determined in question 1?
  3. What is your capacity to analyze the data and/or effectively implement changes responsive to the data?


We would like to thank Wisa and San Jose Taiko for sharing their data insights with us! Please help TCA achieve our data goals by filling out the TCA Census at the link below.