A newsletter on progress-space research and audience development for cultural leaders. One reader calls it "sometimes funny."

(Reading time: 2m, 14s)

I’ve been building a new habit of contacting list members individually to ask whether there are any topics they’d like me to cover in these letters. Rather than always drawing from my own list of ideas, it can be grounding to hear from readers in their own words.

List member Randi Korn had a topic (shared with permission):

I guess if I were to pick something for you to write about—you could focus on the importance of not replicating (copying) the testing strategy that another museum used because by doing so, you are inadvertently adopting another museum’s purpose (and your museum’s purpose is probably infused into the website). By replicating another museum’s strategy, your data will become “garbage in, garbage out,” which would be a shame, a waste of resources, and even worse, give a bad name to user testing.

Yes — you can’t copy results and you can’t copy value(s). This is one of those things that I consider so fundamental that I take it for granted, and I probably shouldn’t.

I’ve considered occasionally testing museum websites and sharing results with you here. The benefit, I hope, would be to demonstrate that testing almost always uncovers surprising results and that organizations should be doing their own testing regularly. The goal would not be to try to uncover some universal value proposition that other organizations could adopt.

That said, there are common usability issues that museums can solve for.

Language is pretty important here. I often call usability testing “user testing”, which is a bit of a misnomer. When testing a website, you’re primarily testing the website’s usability — you’re not testing the user. To a lesser degree, you may listen for clues as to comprehension and perception, but it’s important to mostly ignore what the user says in a test about their own personal preferences. We’re not chasing people’s favorite colors in testing.

What usability testing can demonstrate is just how essential seemingly mundane details can be.

Example: If I tell a marketing director that the font on their website is too small, they’ll often mentally shrug — these details seem too minor to bother with. But testing forces that director to stand over the shoulder of a real person trying to complete a task and understand what the museum offers, while design details like the font size work against that person every step of the way.

Using a tiny font on a website is like whispering in front of an auditorium full of people with no microphone.

Testing makes that problem real.

To stick with that analogy, you could say that the value of usability testing is that it shows you how to give a speech in front of a big group of people. You could study other people’s tests for guidelines on what supports the delivery of a speech — a working microphone, comfortable seating, lighting, emergency exits … all this infrastructure stuff.

But the content of the speech you give is not something you can copy, and I think this is what Randi is getting at. Copying someone else’s speech and expecting your audience to receive it in the same way is a recipe for failure.

Thanks for reading,
