How far can a survey question take you?
What survey respondents say may be less important than the intention behind their words, and what they don’t say may be more important than their intentions.
(Reading time: 4m, 8s)
One of the questions I’ve asked in my survey of decision makers at cultural institutions is: “In the last two years, has your organization spent time or money on studying its audience or constituents?”
81 out of 136 qualified respondents say yes to this question.
(If/when I run this survey again, I think I’ll make this a multiple-choice question with an “other” option, so people can still answer in their own words. I suspect at least some of the people who answered “no” did so because they couldn’t think of an example in the moment. Some methods may not spring to mind immediately — for example, some museum leaders may not think of Google Analytics as a tool for studying audience behavior, even though someone at the museum is at least counting monthly website visits. Counting hits to the website is not a terribly useful practice, but it does qualify.)
Those who answer “yes” to this question are asked a second question: “In what ways has your organization studied its audience or constituents?”
I’ve been struggling to suss out patterns from the responses to this question, so I thought I’d share my thinking with you and see what you think.
A method-focused approach
Let me give you some verbatim responses that I think will show why it’s been difficult to code responses to the question:
“We are looking at who isn't coming to the museum and why.”
That doesn’t tell us how they’re exploring the question of why people aren’t coming to the museum.
“Focus groups, market research, data, strategic marketing plan”
What does “data” mean?
“We do this as part of the TDC (Tourist Development Council) grant.”
Do what? How?
“We hired an expensive consulting firm.”
“Through various programs and continued daily surveys.”
“Satisfaction with experience, why not attending for non-visitors”
And so on.
I wrote about surveys generating more questions than answers earlier this week. It’s fine — I don’t expect people to give in-depth answers — but it does remind me of all the times I’ve asked organizational leaders whether they’re doing any qualitative or contextual research and they respond with something like, “Yes, we ask open-ended survey questions.” Maybe this survey can be an example of just how far open-ended survey questions alone will take you.
Anyway, my inclination was to categorize responses to this question by method — for example, surveys, focus groups, interviews, etc. — but it’s hard to assign mutually exclusive and collectively exhaustive (MECE) themes when you have responses like those above or when people use catch-all terms like “market research” or “evaluation.”
Take interviews, for example. Only four respondents have cited interviews as a method they’ve used to study their audience, which is what I kind of expected, but market research and evaluation are more commonly cited, and either of those terms could include interviews. (Based on my conversations with museum leaders, I’m holding firm to my belief that those who refer to these categories are not interviewing, though.)
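The coding problem can be sketched in a few lines. Everything below (the category names, the keywords, the sample responses echoed from above) is an illustrative assumption rather than my actual codebook; the point is only that keyword-based method coding produces categories that are neither mutually exclusive nor collectively exhaustive.

```python
# Hypothetical codebook: map each method category to keywords that
# signal it. Catch-all terms like "market research" get categories of
# their own, even though they could contain specific methods.
METHOD_KEYWORDS = {
    "survey": ["survey"],
    "focus group": ["focus group"],
    "interview": ["interview"],
    "market research": ["market research"],
    "evaluation": ["evaluation"],
}

def code_response(text):
    """Return every method category whose keyword appears in the response."""
    text = text.lower()
    return [method for method, keywords in METHOD_KEYWORDS.items()
            if any(kw in text for kw in keywords)]

responses = [
    "Focus groups, market research, data, strategic marketing plan",
    "Through various programs and continued daily surveys.",
    "We hired an expensive consulting firm.",
]

for r in responses:
    # The first response lands in two overlapping categories; the third
    # lands in none, so the scheme is neither exclusive nor exhaustive.
    print(r, "->", code_response(r))
```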
I wonder if it might be more useful to categorize responses based on intent.
Categorizing by intent
One way to categorize intent would be to ask whether the research is motivated more by the organization’s desire to advance visitor-oriented goals or by its desire to improve organizational outcomes. (These aren’t always exclusive, I know, but humor me for now.)
Museums use evaluation, for example, primarily as a way to understand the impact of a particular exhibition or program in terms of education, engagement, or satisfaction. The organization is trying to understand the impact on the individual. While evaluation insights could be used to inform appeals for funding — which is definitely an organizational goal — I think evaluation is generally used to improve the museum’s traditional products in terms of visitor-oriented goals, like learning, during a single visit to the museum. I don’t think evaluation insights are often used to inform the museum’s communications or marketing — a department that I think of as being more focused on organizational outcomes. (If those sorts of silos don’t exist at your organization — if you’re using evaluation to inform your communications — reply and let me know. I’d like to hear more.)
Two ways to view “engagement”
Dividing research intent into these two categories — individual-oriented goals vs. organization-oriented goals — may also map to the way museum leaders define “engagement”.
In follow-up interviews with respondents these past few weeks, I’ve asked museum leaders how they define engagement.
I’ve noticed that leaders who think more in terms of what I’m calling “individual growth” are more likely to define engagement based on observations of visitor behavior during a single visit. These individual-growth leaders view people asking questions during a tour, or posting comment cards, or looking up exhibition-related information on the web as signs of engagement.
Leaders who define engagement in terms of organizational growth tend to talk about how constituents interact with the organization over time. They’re thinking of the audience’s journey as it impacts organizational goals. So, they may talk about visitors becoming volunteers or members becoming donors.
These two ways of thinking are not exclusive. A visitor who shows signs of engagement at the museum may be more likely to tell their friends about their experience, which can mean more visitors, which is a metric for organizational growth. And some leaders talk about engagement in terms of both audience impact and organizational impact — but most seem to emphasize one or the other.
The absence of information is more information
I’ve been scratching around in this survey sandbox, looking for patterns in the responses, creating little piles in the sand, leveling those piles and starting over.
But the most valuable information may not be in what these respondents are saying. There may be more to learn from what they’re not saying.
But let’s save that for another day.
As always, feel free to reply with any questions or comments.
Have a good weekend,