LAST UPDATED: 5 February 2023
Designing Actionable Race Surveys
Ann Arbor Track Club President and survey design PhD, Laurel Park, discusses how to use participant survey feedback to inform your decisions and improve your race.
Most races only come around once a year. So when you’re working to improve a race for your participants, you’ve got precious few opportunities to receive feedback from them. How do you use those opportunities right? And what feedback should you look to gather from them?
Well, my guest today, Laurel Park, has the unique privilege of being both a race director and a PhD in survey design, and has helped countless organizations develop effective surveys that leverage customer feedback to inform strategic decisions. As the President of the Ann Arbor Track Club, Laurel knows running and races inside out, and today she’ll help us understand how to craft an actionable race survey, how to maximize survey response rates, and how to avoid some of the common pitfalls of survey design, like asking things you shouldn’t care to know about or asking things you do care to know about in a way that delivers poor quality or unusable results.
If you do send out a race survey after your event, or have thought of doing so, this is an excellent crash course in getting the most of the one shot you get each year to gather productive feedback from your participants.
In this episode:
- The purpose of a race survey
- Working backwards from what you need to know to what you're going to ask
- Collecting demographic information from respondents
- Do people respond truthfully to surveys?
- Reducing survey friction and question bloat
- Avoiding distractions/cognitive load with clean survey styling
- Types of questions to ask and areas to explore with your race survey
- Avoiding leading respondents with biased question phrasing
- Using open-ended vs close-ended questions
- Best practices for sharing your race survey
- Increasing survey response rates with incentives
- Survey software options for designing your race survey
- Analyzing and presenting survey results
RunSignup are the leading all-in-one technology solution for endurance and fundraising events. More than 26,000 in-person, virtual, and hybrid events use RunSignup's free and integrated solution to save time, grow their events, and raise more. Find out more at https://runsignup.com/.
Racecheck can help you collect and showcase your participant reviews on your race website, helping you more easily convert website visitors into paying participants, with the help of their Racecheck Review Box. Download yours for free today at https://organisers.racecheck.com/.
Laurel, welcome to the podcast!
Well, thank you very much. Glad to be here.
Well, thank you very much for coming on. I'm going to guess that you're based in Ann Arbor, is that right?
You're gonna guess correctly. I should have put on my Go Blue sweatshirt, but didn't think of it.
Yes. Because one aspect of you is being the President of the Ann Arbor Track Club.
That is correct.
Do you want to tell us a little bit about that - the kinds of stuff you do at the club, your involvement with it, maybe some of the races you guys put on?
The Ann Arbor Track Club has been in existence for-- I think this is our 53rd year going into it. We serve as a community resource for running, racing, and walking. Despite the name of "Track", we really don't do track events much anymore, although we do have an all-comers meet every year. I have been involved with the club in one capacity or another for more years than I care to admit-- but I'll just say it's been more than about 30 years. Actually, cute story, my husband and I met at an Ann Arbor Track Club workout. It was my second day in town when we met. So yes, I owe much gratitude to the Ann Arbor Track Club. We have one signature event that's kind of a local mainstay race called the Dexter-Ann Arbor Run. It is a half marathon and, as the name suggests, it runs from a little town west of us, Dexter, and finishes in downtown Ann Arbor. It started as a 15-miler and has evolved over the years. We also have accompanying events of a 10K and a 5K. But that's really kind of our big signature event, and I've been involved with that race for 20 years. I've been on the race committee and I've run it and all sorts of different things. Right now, the club is-- we're looking forward-- having come out of the pandemic where everything shut down, it's a good time to say, "Okay - and we're actually doing this - what's the next chapter? What's the next phase of this club?" We have a very, very vibrant running community in Ann Arbor - University of Michigan. It's a very activity-oriented town. So, we are figuring out what we do next. We have a very popular and vibrant youth programme as well. Some of the graduates of that programme have actually gone on to the Olympics. So it's a high-quality programme as well.
You mentioned the University of Michigan there, which is also in Ann Arbor. You've done lots of work with the university. You also got your PhD from the University in survey design and analysis. Do you want to tell us also a little bit about that part of your life as well?
Yeah, I actually have three degrees from the University. I have the hat trick. As one of my friends said, "Congratulations - you've completed the hat trick there" - if you're a hockey fan. I worked for the University for several years after graduating, initially in student affairs-- then, I decided to go on and do a PhD in higher ed, since I enjoyed the environment so much. I hadn't really planned to go into survey design. Actually, my programme was organisational development. A big part of that was training in research methodology, which initially was not really my cup of tea. But as I got more into it - learning how to conduct research studies, survey design was a big one - I took a lot of courses in survey design, questionnaire design, and research methods. I just became more and more interested in that. So following my graduation from my PhD programme, I worked for several years at the University of Michigan doing data management, basically. I did enrollment counts. I worked with the university's database. I also did a lot of survey analysis of different surveys - none that I wrote. Most of them were surveys that the university was putting out - or another organisation - and I would do the analysis of the data on that. I won't say that was my favourite part because all of it was very enjoyable, but it was very interesting to me to see what the results were. I guess I've always enjoyed helping people find answers. Even when I worked with students, "I need help with this. Let me figure it out." So looking through that data is really interesting to see - what the answers were, what the results to this particular survey topic were - and I've been doing that for the past four or five years now. I've been doing it not full-time, but on a consulting, as-needed basis. The Track Club has taken up most of my time recently, but I reach out and let people know, "I have training in surveys. I enjoy doing them. I enjoy analysing them."
So I've had a few people come up to me and say, "Willing to help out or take a look at this?" I'm like, "Sure, sure, why not?" My background-- I work with the track club. I ran for the University of Michigan way back in the day. I ran competitively for several years. I'm still involved in running. I have the background. I have the knowledge. I have the context. So when a race director or someone comes to me, I understand that perspective and I can use that in helping them design a survey or whatever they need me to do. I've been there. It's just fun. If that's the bottom line, it's just fun.
Yeah. I think we can definitively say, from that, that you are by far the most qualified person to have on for an episode on race surveys. You have the running background. You have the racing. You have the club on one end. So you understand all of that stuff. And then, you're obviously very knowledgeable on the design process of a survey - what the objective of a survey is and all that. So let's jump straight into that. I think we should start with the basics. I won't quite ask you what a race survey is, because most people will have filled in some of those, or have seen some of those, or have seen surveys in general, but I will ask you: what is the purpose of a race survey? Starting from that.
The opportunity to gather useful information that a race director or race committee can act upon - we call that actionable information. So, in addition to getting the "What?" - the "What" being, "Did you like the race? Did this go well? Yes? No?" - you want the "Why?" What can we do to improve? And there are other underlying questions - I'm trying to think of some off the top of my head - "Were there adequate water stops?" So beyond just, "It was good, it was bad," in particular, "What worked? What didn't?" And then use that, take that information back, and say, "Okay, here are some things that we need to look at for next year," or "Here are some things that we're doing well," and we put that into our planning for our next event.
So basically, you're saying that the survey is a tool that you use to gather, sort of, answers to questions you have, but always with a view of making something of it, right?
Yeah, I think so. A survey is a project - a good survey. There's really no point in launching a bad survey - garbage in, garbage out. But a good survey-- it takes some time, it takes some effort, and it takes some knowledge to do it properly. You want to keep that in mind. You want to do a good job of it. Then, you gather that useful information and you use it as part of your complete race package.
And is a survey, then, something that typically races or other organisations or other events would aim to do every year and, perhaps, like, establish a standard so they can compare answers across years? Is that sort of like part of it? Or would you only do it when you have something specific to ask - or specific years - or something like that?
Really, the answer to that is it depends on your purpose for the survey. I am a strong advocate of doing them every year because you do get what we call longitudinal data. With that long-term data, you can look over the years and you can say, "Have things changed? Have things shifted?" And so on. If your event is relatively young, that can be useful. But as I said, doing a survey takes time and effort. It's a project. If you don't have time to do it every year, I think it's useful to do it on occasion. It gives you good feedback on where you are, what your participants and your customers think of the event, and the ideas that they have. I've been in a lot of meetings, race director meet-- or excuse me, race committee meetings, professional meetings, where everybody sits around and they try and guess what people want - usually, concerning the T-shirt. I've been in more T-shirt conversations. So you're trying to guess, "Well, would they like this? Would they like that? Would they like that?" And my response is, "Ask them. Try and find out what they're feeling." And if it's a rather major change to the event-- sometimes, those are constrained; whatever the customer wants isn't necessarily what you can do. You have other constraints. But if you're asking, "Would they like A? Would they like B?" Ask. My response to that is, "Ask them."
I mean, to me, it sounds like a very obvious tool to be using. I mean, in any industry - let alone events - you hear people stress the importance of getting customer feedback. So, it's really important on that end. And still, in races I participate in or races I follow, I don't actually see people use race surveys as much as I'd expect on the back of how useful I think they can be. Do you think that might be because of the effort involved or because people don't appreciate the value of a survey? Why don't we see more surveys being done in the industry?
I think, particularly with races, it comes down to the effort involved. I think everybody is-- at least, the majority of race directors that I'm familiar with, even if they're not specifically knowledgeable about survey design - it's a customer-driven industry - they appreciate the value of getting the customers' viewpoint. I think, in terms of races, in particular, there are a couple of things that jump out-- doing a survey is an additional project that gets added to the work of putting on the race. For anybody who has been involved in putting on a race - big race, small race - there are a lot of moving parts and there are a lot of things that have to be done. The survey is something that doesn't have to be done, so it's lower on the priority scale. That doesn't mean that the race director doesn't appreciate the value of it. But when you come down to scarce hands - human resources - to get things done, particularly these days when everybody's looking for volunteers, it's not a critical aspect of the event. So you have to take care of those critical tasks first. And I think the other part of it is the timing of it. A survey is something that takes place after the event, and the event is where all of the energy is focused - "Let's get this event done. Let's make it a successful event." The energy and the motivation after your race is finished-- I've been there. You're cleaning up the finish line, and you're like, "Wow, I'm emotionally exhausted. I'm physically exhausted." I kind of liken it to, after a big family dinner, suddenly, people realise, "Oh, man, we got to do the dishes and wash the pots and pans." Nobody really wants to do that. So there's really not much motivation. If you have someone who is dedicated to doing it and is not needed for other critical tasks, that person can handle that. That's actually one of the things that I always recommend to pretty much anybody who's going to do a survey - designate a project manager.
Designate somebody who is going to be responsible for doing this. I always kind of quiver a little bit when I'm in a meeting and they go, "Well, we should do a survey. We should do a survey," and there's kind of blank stares and a moment of silence until somebody finally says, "Alright, fine, I'll put together some questions and put it up on the website," and I kind of cringe. I'm like, "Well, okay, but that's not the most effective way to get that information."
But of course, the survey is something you could have prepared in advance, right? I mean, it's not something that you need to wait until after the race to put it together. I totally get your points, stepping into the shoes of someone who's been through that and done the event, really, everything after that gets so much harder to do, but it's not something that you need to start doing after the race.
Well, but to a point, you've got to analyse the results after the race. There are two phases to doing a survey. There's, kind of, a before and after. The before is writing the survey, selecting your population, getting all of that launched, and it kind of parallels doing a race. So you've got all that time and energy. You write your survey. You enter your survey in the survey software. You launch your survey, and you're like, "Oh, man, it's gone. It's launched. We're getting responses." Well, then, you've got all these responses, but your job isn't over. The second phase, which could be more time-consuming than the first phase, depending on how many responses you get-- you've got to go through, you've got to analyse your responses, you've got to do some descriptive statistics or some charts - whatever it takes to communicate your responses properly. Ideally, you're going to write up a little report. That takes time. You can do a lot of preparation, but there isn't really anything you can do in advance to analyse the results until you get them.
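The second phase Laurel describes - turning raw responses into descriptive statistics you can report - can be sketched in a few lines. This is a minimal illustration, assuming Python and made-up responses to a single close-ended question; it is not tied to any particular survey tool's export format.

```python
from collections import Counter

# Hypothetical answers to one close-ended question, e.g.
# "The water stops were adequate" on a 5-point agreement scale.
responses = [
    "Agree", "Strongly agree", "Agree", "Neutral", "Disagree",
    "Agree", "Strongly agree", "Agree", "Neutral", "Agree",
]

def summarise(answers):
    """Aggregate individual answers into (count, percentage) per option,
    ordered from most to least common."""
    counts = Counter(answers)
    total = len(answers)
    return {option: (n, round(100 * n / total, 1))
            for option, n in counts.most_common()}

for option, (n, pct) in summarise(responses).items():
    print(f"{option}: {n} ({pct}%)")
```

A table like this - each option with its count and share of respondents - is usually enough for the "little report" back to the race committee; charts are just this same aggregation drawn as bars.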
Yeah, I get that. I get that. Thinking of, basically, who the survey is addressed to - participants being sent various survey questions about the event - would it also make sense to send a survey to volunteers - maybe a different survey? Or are surveys traditionally just for, sort of, like, paying customers?
It is. Surveys are for whoever is going to give you the information you need to improve your event. Let me back up for-- actually, I'm not backing up; I might skip ahead to something we're going to cover. But you know what? I'm going to spill the beans now. I always tell people, "When you're thinking of designing a survey, don't think of it in terms of, 'We want to know. We want to know,' because there are a lot of things you want to know. Think of it in terms of what is the question we're trying to answer." So the question could be, "Would participants return to our event? Would participants recommend our event? Do participants feel that the race was worth the entry fee?" Something like that. You've got to have, kind of, this big question, "What's the key thing you want to get out of the survey?" Then you think, "Okay, what's the information I need to answer that question?" And that information then becomes your survey questions. Otherwise you get what I call mission creep, where you start with 10 questions and then, "Well, we should ask about this. Well, we should ask about that." Everybody on the race committee - well, almost everybody - will have an opinion about that. So that's the focus of it. So the question you may have regarding your volunteers is probably going to be very different from the question you have regarding your race participants. The value of the information you get-- it depends on who provides it. You want to have people who are knowledgeable and can give you valuable, useful information. There's an ad up in Ann Arbor that's kind of a joke - it's for a local finance company - and it says, "Would you take investment advice from your yoga instructor?" I kind of chuckled because, in Ann Arbor, you probably would. But that's the same kind of thing. Who's going to give you that information? And if you're asking about your race, it is probably going to be your participants or-- I usually suggest sending it to your registrants because you have that data file. You have everybody who registered.
If they didn't participate, they're probably not going to respond. I don't worry too much about people jumping in there, saying, "Oh, I'm gonna try and hijack the survey and give bad responses." I don't think people waste their time doing rogue surveys. So send it to whoever is qualified to give you that knowledgeable information.
And the surveys you send out to participants - focusing on that - should they be anonymous? So, basically, you share the same link - requesting no personal information - and you send it out to everyone. As you say, there's a small chance - but no one really would want to waste their time - of someone outside of that circle taking it. Or would you actually want to also know, by name, who's filling in the survey?
It depends, somewhat, on the situation. But generally, if you're using your registration list, you can take that - download that registration list as a CSV file - load it into survey software like SurveyMonkey, Qualtrics, or something like that, and use it, cleaning out the really personal information first. You don't want the date of birth on there. You don't want the address and so on. There are two aspects to collecting data for this. One is to be anonymous. If you're offering an incentive, you can't be completely anonymous because you've got to know who to send the incentive to. So anonymous is one thing, and that means you cannot link a particular response or group of responses to a particular individual. The other, which is much more common, is confidential, which means there is some type of identifying information there. Frequently, it's just an email address. It could be a first name and last name. Confidential means either that responses are only going to be seen by the person doing the survey, who will then take the responses and do what we call "aggregating" them - bringing them together. So we say no individual responses will be distributed; responses will be distributed in aggregated form only. Or it means that you are going to share that information with only a few individuals. I think one of the barriers for some people to taking a survey is, "I'll be glad to give you the information but, depending on the topic, I don't really want you to broadcast that it's from me" - which I would never do anyway. So you have to balance it. Now, for a race survey, you're probably not asking about really sensitive information, but you could get some responses where you go, "Well, that's useful information. I want to make sure it doesn't come back on the person who was willing to share it." Err on the side of caution there. You can send out a link.
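The cleanup step Laurel mentions - stripping the really personal fields from a registration export before loading it into survey software - can be sketched as below. This is a minimal illustration assuming Python; the column names (`date_of_birth`, `address`, etc.) are made up, since real registration platforms will export different headers.

```python
import csv
import io

# Hypothetical registration export; a real platform's CSV will differ.
raw = """first_name,last_name,email,date_of_birth,address,city
Sam,Lee,sam@example.com,1988-04-12,12 Main St,Ann Arbor
Ada,Roy,ada@example.com,1975-09-30,9 Elm Ave,Dexter
"""

# Keep only what the survey invitation actually needs; drop the rest.
KEEP = ["first_name", "last_name", "email"]

def strip_personal_fields(csv_text, keep=KEEP):
    """Return contact rows containing only the whitelisted columns."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [{k: row[k] for k in keep} for row in reader]

contacts = strip_personal_fields(raw)
```

Whitelisting the columns to keep, rather than blacklisting the ones to remove, is the safer design: a new sensitive column added to the export later is dropped by default instead of slipping through.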
I'm not a fan of putting links on websites - and I'll get to that in a minute - but if you have a population file of the people who are going to give you this knowledgeable information, anybody who does the survey using that link knows what they're talking about and, of course, you're not offering an incentive, then sure, you can have one link, send it out, and it can be anonymous, because you know the people and everything they give you is going to be useful.
Yeah. The next question after that would be that, if I do have the name and contact information, maybe, of every person filling in my survey, what happens if I get tempted to reach out and respond to someone who, perhaps, writes some negative comments about my event? Or maybe, I even want to follow up on some suggestions they may have had on one of my open-ended questions. Is that something I can do, something I would want to do, or something I want to avoid? What's the etiquette there on surveys?
My take on this is, if you think you may want to interact with any of your respondents, you put a little checkbox at the end, and you say, "You may contact me. I give you my permission. It is okay for you to contact me regarding any of my responses." Now, realise that if you do that, and somebody gives you that information and allows you to do that, you're going to have to take the extra time to contact them. I don't tend to reach out to people who have given me responses, regardless of what the response is. In general, I am a researcher. I try not to get personally and emotionally involved in what they're talking about, even if I'm involved in the race - putting on the race or something. You really have to kind of take a step back, look at this as research data, not try and read too much into it, and create some distance. I just recommend that to people unless somebody has said - and this has happened in open-ended questions where they say - "Feel free to contact me if you want more information about that." Boom, you have permission to do it.
Yet, another aspect of this might be, I guess, if you have personal information there that tracks back to your registrations, to your participants, perhaps, you may also want to link that to a CRM and use some of those answers, maybe, to tailor your marketing to those people. So basically, data-- is it typical or even something that people might do to use data from a race survey to, then, market to people in a different way, or reach out to people after the survey in a different way?
I personally would not do that. There's a fine line between collecting data for informational purposes and collecting data for marketing purposes, and you're really blurring that boundary. People are overloaded with surveys. People are hesitant to complete surveys anyway. And if they feel that their information is going to be used for marketing in any way, or is going to be shared with any other organisation that is, in turn, going to use it for marketing outside of the organisation - the race committee, the race, or whatever club is putting on the event - then they're probably not going to be as motivated to respond. Depending on the survey and the topic of the survey, I will sometimes say right in the invitation message that this data will not be shared outside of the organisation and will not be shared or sold for any type of marketing purposes. It will be used solely by the race committee to improve the event. Make that clear upfront.
Yeah, I think that's a good tip to get people, sort of, like, comfortable with how their data is being used. Of course, in Europe there are much stricter regulations around that as well. But definitely, always a great idea to be upfront with people on how their data is being used. Beyond personal data, sort of, like, name and email - just to be able to identify someone taking the survey - what kinds of information would I want to ask respondents for on my survey to make my survey - maybe my segmentation of results down the line - a little bit more helpful?
Generally, gender. I caution against asking people for their specific age. For some reason, when you ask somebody for their specific age, I think they're either hesitant to give it or they are afraid that it's going to be used to identify them. I tend to ask for an age group. Some people don't mind giving their exact age. And again, I've seen some race surveys that will ask, "Is this your first race?" That can be helpful because you can segment out the this-was-my-first-time group. What was the experience of the first-timers? What were their responses? Or you can ask, "How many races do you run in a year?" Again, it comes down to what's going to be helpful to you to improve your event. I do caution about asking too many-- again, anything that leads people to believe that, in some way, they're going to be identified by responding. For a lot of small races - and I certainly take this into account - if I'm analysing results for an event around here, odds are, I probably know a lot of the people in that event. So, I don't want to know what their responses are. I want it to be anonymous, and I think that's in people's minds, too - "If this organisation is going to be looking at the responses I give, they're probably going to figure out who I am." So I try to really, really lessen that. Some people don't care - it's not a threatening topic, or the information they're giving is not highly classified or whatever. But I always try to err on the side of the comfort of whoever is taking the survey. The more comfortable they feel, the more likely they are to respond.
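The segmentation Laurel describes - comparing first-timers' responses against everyone else's - is a simple group-by once the data is in. A minimal sketch, assuming Python and made-up respondent records (the field names `first_race` and `overall` are illustrative, not from any real survey export):

```python
# Hypothetical respondent records: one demographic field plus an
# overall-satisfaction rating on a 1-5 scale.
respondents = [
    {"first_race": True,  "overall": 4},
    {"first_race": True,  "overall": 5},
    {"first_race": False, "overall": 3},
    {"first_race": False, "overall": 5},
    {"first_race": False, "overall": 4},
]

def segment_mean(rows, key, field):
    """Average a numeric field within each segment defined by `key`."""
    groups = {}
    for row in rows:
        groups.setdefault(row[key], []).append(row[field])
    return {segment: round(sum(vals) / len(vals), 2)
            for segment, vals in groups.items()}

print(segment_mean(respondents, "first_race", "overall"))
```

The same function works for any demographic you collect - age group, gender, races per year - which is exactly why those one or two extra questions make the rest of the results more useful.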
Yes. And how likely are people, really, to respond truthfully to questions? Because you mentioned already that there is a degree of self-awareness when people take surveys. You already mentioned the fact that they may not feel comfortable sharing some stuff or some information about themselves. We all have biases of all sorts when we respond to these kinds of things. What I'm really getting at is, as the person running the survey and looking at the results, how much can I trust what people say, or their intents? Or how much would their declared intents match what people would actually do?
Not surprisingly, there's been research on this. There's been research on pretty much everything. It was a different type of survey. It was a research-oriented survey. But for the most part, people respond truthfully unless it is a really, really highly-sensitive or seemingly threatening topic - then, they might be a little less likely. But in general, in a race survey, I trust it. I always tell people, "If you get really strange results, if you look and go, 'Wow, 50% of my respondents strongly agree that we need to raise the entry fee?'" Then, I go back and look at the question or I look at the response scale, and I go, "That just doesn't make sense," and sometimes, "Wow, this was not the response I expected." But sometimes, there's an error. The response scale got twisted or the question was phrased in a manner that the way it read to you was not the way it read to other people. You can't go back and validate every single response. So you have to trust that the information they give you is going to be authentic. Again, I keep coming back to-- you ask the people that are going to be invested in the survey.
So, you’ve got your survey all worked out, you know what you want to know, and have crafted an awesome survey that’s going to give you the answers you need to make your race better.
But, how do you distribute it to your participants in a nice, elegant way without spending hours upon hours hacking email templates and moving data back and forth between your registration platform and email marketing software?
The answer is, you do everything in one place - with free integrated email marketing from RunSignup.
“Integrated” is the keyword here. The minute you have your survey set up, you can include that survey link in a beautifully crafted email that gets sent out from your RunSignup dashboard to all the registered participants in your race - or whomever you choose to send your race survey out to. And, as we’ve been discussing with Laurel, you don’t have to wait to set all this up until after the end of your race - you can schedule your race survey email to go out exactly when you want it to go out for minimum hassle and maximum effect.
That’s the beauty of platforms like RunSignup that are built with races at the core. Every tool you use, whether it’s email marketing or fundraising or race results, taps into the same participant data, updated in real time, without you having to do a thing.
So to learn more about RunSignup’s native email marketing capabilities, and so many other awesome free tools and features, visit runsignup.com, and check out all the amazing ways RunSignup’s market-leading race technology can take your event to the next level.
Ok, that’s it from me for now. Now, let’s rejoin Laurel and our chat on actionable race survey design…
Let's look, then, a little bit closer at survey design - how do you put one of these things together? I'm guessing - otherwise, that would have been a very short PhD - there is no one-size-fits-all on surveys.
There's really not. Again, it goes back to what is it you're trying to learn from this particular survey. I already described "What is the question you're trying to answer?" You do want to try and keep the survey fairly short. I think we're going to talk about response rates and how to maximise response rates in a minute here. One of the key factors there is don't make it a chore. People do not want to spend half an hour on a Saturday afternoon responding to 58 questions about whether the mile markers were bright enough on your course and things like that - don't ask that question. Try and keep it short. Now, what "short" means for some races is going to be different from others - ask the questions you need to effectively answer your big question - no more.
So you're saying, basically - which I think makes total sense - that there's a bit of a trade-off: the longer the survey, the less incentivized people will be to fill it in. So there's that kind of balance. So you're saying, "Pick your battles, choose the really important stuff you want to know about, and go with those"?
Yeah, and responses drop off - in long surveys, you see this all the time. Long - I'm saying more than about 25 questions. Some of the research surveys I've worked with have been pretty intense and pretty long-- responses drop off after a certain point. So you'll get the highest percentage of responses early on. Everybody will answer the first 15 questions and then it starts to drop from there. By the time you get to question 40, you have about half the number of respondents you had for question one because they get tired of it. To do a survey well, as a respondent, takes some thought. It's a mental exercise, too. You don't want to put people through that. Again, don't make it a chore.
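The drop-off Laurel describes is easy to check in your own results: compute, for each question in order, the share of respondents who answered it. A minimal sketch, assuming Python and made-up response data where `None` marks a skipped or abandoned question:

```python
# Each inner list is one respondent's answers, in question order;
# None marks where they stopped answering.
surveys = [
    [5, 4, 3, 4, 5],
    [4, 4, None, None, None],   # dropped out after question 2
    [3, 5, 4, None, None],
    [5, 3, 4, 4, None],
]

def completion_by_question(rows):
    """Share of respondents who answered each question, in order."""
    n = len(rows)
    num_questions = len(rows[0])
    return [round(sum(row[q] is not None for row in rows) / n, 2)
            for q in range(num_questions)]

print(completion_by_question(surveys))
```

If that list falls off steeply, the questions past the cliff are costing you data, and that's a concrete argument for trimming the survey next year.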
With the kind of principle people advise when we approach, maybe, writing an email or writing anything - does it make sense to basically have your draft and then go back with a critical eye and maybe take things out? So ask yourself, "Do I really need this kind of thing?"
Yeah. Sometimes what I will do is just, kind of, brainstorm - going back to, "What's the information I need to answer this question?" - and I write it all down: "Here's what comes to mind. Here's what comes to mind. Here's what comes to mind." Then I take a look at it. Can this information be gathered in some other manner? Do we already have this somewhere? Is this something we will really be able to use, that we can take action on? So I always tell people, "Don't ask about anything that you cannot change or will not change." For instance, we have some parameters on the starting time of our race and we can't change it, so we don't ask people, "Should the race start earlier?" - because we can't do anything about it. Also, is this question, as phrased, going to give me useful information? If you have a good idea of what you need to know in order to answer your big question, it's not a hard process. And like you said, go away and come back with fresh eyes, or have fresh eyes look at it.
I think - there's probably not a definitive answer on this - for a typical race survey, in terms of the sweet spot of people's attention and getting as many responses as you can, what's a good length for a survey? Roughly how many questions are we talking about?
There's research on that, too. I generally try to keep a race survey to about 15 to 20 questions. I hate to keep going back to "It depends. It depends. It depends," but I've seen some really effective race surveys that are 10 questions and, boom, they hit on all the important things, and some really effective surveys that are 15 to 20 questions. Just keep in mind that you can ask as many questions as you feel is necessary, but there does come a point where people are gonna say, "Okay, I've had enough." I suggest breaking them into sections. So I generally will do five questions, then next page or next section, because if you continue to go, "Here's another one, and here's another one, and here's another one," you get this feeling of, "This is never going to end. This is this endless survey that just goes on and on." Breaking it up psychologically gives people a break. You can put the little "page one of two", "page two of two" so that they have an idea of how much time and effort is still to come.
And for a well-designed survey of moderate length, broken up in sections and following all those good practices, what kind of response rate should I expect if everything's done well?
A really good response rate would be about 40% or higher - you're getting good, representative feedback from your audience. 30% is fine. I'm generally wary of results from any survey that's less than 20%. I still look at the responses and the data, but I'm like, "Well, this is a really small representation of the people in this event," which could be interpreted in a couple of ways. It could be that you're only hearing from the people who have complaints. It could be that, maybe, a lot of people are really satisfied with what you're doing and don't feel the need to respond. You just don't know. The keys to getting a good response rate are, first of all, people have to be interested in the topic, and that's where choosing your population comes in. They have to be invested in what they're doing. They have to feel that their participation is valued, that it's going to be worth their time to take the survey. Incentives? Absolutely, incentives help. That has been shown time and time again. And as I've already mentioned a couple of times, make sure that the survey is not a chore. Make it quick and easy. Design it well so that they don't have to jump through many hoops to get it done.
As you mentioned, making people feel like their responses are going to be valued - I sort of see why it would work, but how do you make that happen? How do you design a survey - or maybe a reach-out - that makes people comfortable that their response will be valued?
It is primarily in the reach-out. It is really making people feel that they have been selected to do this, which is exactly what you're doing. I am a strong proponent of a personalised email invitation, and your message is: Who's doing the survey? Why is the survey being done? What do we hope to gain from this? Why is this person invited? Generally, for a race, you know that, but you could even say, "Your participation in the 2022 Dexter-Ann Arbor Run makes you important to us - we'd like to get your feedback." This one, I think, is really important too: how many questions is the survey, and approximately how much time is going to be needed to complete it? Giving that upfront also motivates you to try and keep it as short and succinct as possible, because you don't want to say, "Well, it's going to be 50 questions and it's going to take 25 minutes." No. Let them know what they're getting into. That creates some goodwill: "Okay, this is what I'm signing up for." When does the survey close? When is the last date to respond? And this, too, I think is important: tell them how the results are going to be used. I don't mean you have to be very specific, but say, "We're going to use these responses to improve the event for next year." Or if you have a specific reason, "We're thinking of changing the date of this event. Your responses are going to help us determine what the new date will be." If you're offering an incentive, what is the incentive, and are there any parameters for receiving it? Generally not an issue with a race survey, but in some surveys you may want to say, "You must complete the entire survey to qualify for the incentive" - that's a special consideration for certain types of survey. And then, I always kind of joke about this, but it's true: include the link to the survey. Don't forget to do that. I've gotten an invitation sometimes where I'm like, "This is awesome. How do I get to it?"
So when I'm writing up one of these invitation messages, before I get caught up in the details, the first thing I do is paste the link to the survey at the bottom. So it's there - I don't get it all nice and ready, send it out, and then get a bunch of, "There's no link."
Yeah, I guess people would be, "Yeah, what?! Of course, I'll put a link!" But then, I hit all those race websites that don't have a date on the first page or a registration button. So that's, sort of, like, similar to that.
Yeah. And I think one of the things you try and get across in a survey - in really any type of survey - is that you're taking this seriously, you're putting time into it, and that it's professional - the feeling that it's professional, and that you are really serious about what you're doing. I think putting together a detailed invitation message shows people that you've done your homework, you've thought about this, and you're very serious about wanting them to participate - that is very appealing. I make an analogy: are you more likely to go to an event, a lecture, a gathering, or whatever, if you receive a personalised invitation, or if you see a flyer posted up at a coffee shop? You're probably more likely to respond to the direct, individual invitation. That's the same philosophy here.
This professionalism you mentioned should also carry through into the actual style and the aesthetic of the survey as well, right? I guess you don't want to stamp your personality or your personal preferences about, like, floral patterns, serif font, or whatever on it.
Yes. Again, make it look professional. "Keep it simple, stupid" - the "KISS" principle, for the engineers out there. You obviously want to have the logo and the title of your event, and whoever's putting it on - a club or otherwise. If you have a name sponsor, part of the contract is usually to be on all of your promotional and follow-up materials. I tell people, "You don't get any bonus points for creativity." Again, you want people to be able to get in there, take it, and leave. So don't have your race mascot dancing up in the corner. Don't have a slideshow of photos going across the top that's going to distract people. One of the more annoying surveys I've personally ever received was about a year ago, where you opened up the survey and the questions faded in and out as you went through them. Question one came from the background into the fore; I answered it and it disappeared into the mist. By the second one, I was so annoyed that I couldn't focus on the survey and I just shut it down. What you're dealing with here is what we call cognitive load, which is the number of things your brain has to focus on at any one time. The psychologists out there may scream at my very lay explanation of this, but basically, you want to make sure that your respondents can focus on the task at hand, which is answering the survey. If you're in a car and you come to a four-way intersection in a big city, there's a lot going on - a lot you have to take into account. Your task is to safely proceed through the intersection, but you've got other cars, buildings, pedestrians, all sorts of things, so it takes more for you to make that determination. Whereas if you come to a stop sign out in the country or in a smaller area, you look around for other cars, there are not a lot of things distracting you, and you can make that decision pretty quickly and move forward.
So that's what you're trying to get at. The fonts have to be easy to read on any device. Realise that people are not going to be taking it only on a laptop or a desktop. You've got to test this on different devices, and different browsers interpret colour patterns differently. I say, "Stick with a black font. Don't try, again, to get creative." I was signing a release form on a survey a couple of months ago from a company that worked with animals. The motif, the theme, was soft pastel colours, and the release had been written in, I think, a soft pastel blue, which completely faded into the background on my Chromebook. I finally emailed her and said, "I can't read this. I mean, I literally cannot read this." So I understand trying to get that theme, that branding, across, but your number one goal is that people have to be able to do this. Use a fairly large, simple font - I don't mean 18 point, but you do want 12 or higher - and take a look at your population: young people, old people. You want to make sure that everybody can read your survey.
Yeah. Whilst you were saying all that, the thing that came to my mind is registration checkout, which is sort of similar, right? I mean, the length of how long it takes for someone to check out. Now that platforms allow you to add any kind of question, I've seen some race checkouts ask some completely irrelevant stuff just for the sake of it, and I can see people dropping off - or people being too exuberant with colours and stuff. I think what you're saying is the simple principle: treat it like a checkout. Get people to the finish line, get what you need out of it, and make it, sort of, purpose-driven.
Yeah, the perspective is you have to do what works best for the customer and not necessarily what works best for you. I see it on business websites all the time, and on business cards too - hardcopy business cards where everything is in a fancy script or really cool font type, and I can't read the phone number or the name of the business, which, of course, completely defeats the purpose. You can't communicate well.
So let's move on to look at some specific questions, or areas of questions, you may want to have in your survey. You've done this lots of times, both for your races and for others. What are, sort of, the typical areas to survey in this process?
Again, it comes back to what's the information you're trying to gather. A lot of the standard questions - we were talking about demographic questions - you can get through the registration form, though you can certainly ask some of them. You always put your demographic questions last, because people do perceive them as threatening - threatening in a mild, low-key way - so put those last. Beyond that, ask whatever you feel you need to answer your questions. I don't really have what I would call a "You should have these three questions, for sure" list.
So, for instance, the question you mentioned earlier in the discussion about people returning to the race - is that something that, I guess, most races would be interested in?
Yeah, that's something you could add. You could have something along the lines of, "Was this race worth the entry fee?" - you could phrase it a little more nicely. Another great one is, "Would you recommend this event to a friend?" What is that - the NPS?
The NPS question, yeah.
The NPS question. Certainly, that would be a good one to ask.
That's a 0-to-10 question, usually, isn't it? I mean, the way I've seen NPS presented, it's a 0-to-10 scale. A 9 or 10 counts as a promoter - basically a "yes" - a 7 or 8 is a passive, a "maybe", and anything 6 or below is considered a detractor - a negative response.
Yeah, probably not, or whatever, or not likely - something like that.
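[Editor's note: for reference, the standard Net Promoter Score calculation the hosts allude to is simple enough to sketch in a few lines of Python. The scale and cut-offs below follow the conventional NPS definition; the sample scores are invented for illustration.]

```python
# Standard Net Promoter Score (NPS) calculation. Respondents answer
# "How likely are you to recommend this event to a friend?" on a
# 0-10 scale: 9-10 are promoters, 7-8 passives, 0-6 detractors.
# NPS = % promoters minus % detractors, so it ranges from -100 to +100.

def net_promoter_score(scores):
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Illustrative: ten responses from a hypothetical race survey.
print(net_promoter_score([10, 9, 9, 8, 8, 7, 6, 10, 9, 3]))  # → 30.0
```

Note that passives count toward the denominator but neither add to nor subtract from the score, which is why a wall of 7s and 8s yields an NPS of zero.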
Right. And then everything else from there, you're saying, would be just what specific things do you want to know about your race.
Right. So again, if it is going to give you useful, actionable information, add it. Part of my hesitation in responding to this is: if you think it's important to ask, go ahead, but don't put it in there just because everybody asks it, or because, "Oh, we really ought to ask it." Is it going to be useful? Every question is going to take somebody some time to answer, so think about whether it's an important use of someone's time.
And of course, the way in which questions are asked and phrased is also quite important. I mean, I've seen questions that have the answer in the way they're being asked kind of thing, right? So it's very difficult to speak your true mind based on the way some questions are phrased.
Right. "I really enjoyed that stunning course" - something like that. "That stunning course was an important part of the race, right?" That's called introducing bias. I think the way you prevent that is a couple of things. I strongly recommend that you phrase questions as statements, so you're not putting any kind of inflection in there. Rather than, "Were there enough water stops on the course? Strongly disagree, disagree, and so on," use "There were enough water stops along the course" - a statement. There's much less of an emotional component to it; it presents it as a fact and allows you to think about it: do I agree with that? Do I not agree? Do I somewhat agree? The other thing that I say - and people kind of chuckle, but it's true - is to avoid adjectives, because adjectives are what generally put the bias in there. The beautiful course, the delicious food, the refreshing sports drink - just pull those out. It also makes for a slightly shorter survey: you have statements that are a very quick, "Boom. Okay, done. Move on. Boom. Okay, done. Move on." The other thing that I recommend, in making sure that people understand your survey questions, is to avoid slang or jargon. Again, look at who's in your population. Young people, old people - do you have people who are not native speakers of English? Not everybody answering your survey necessarily looks at the question, or the language, from the same perspective you do. Slang changes over time - I have a really amusing example involving my mother that I will share with you after we're done with this podcast. Jargon is language or vocabulary that is typical of a specific industry. The best example of that is information technology - they have a lot of jargon. My husband works in information technology; he's currently working from home in our office.
Every now and then, he will be on a conference call with his project team, and the only words I understand through the entire call are pronouns and conjunctions - pretty much all the rest of it is Klingon. So I warn people: be careful about asking - if this were for evaluating a track workout - "Do you want to do a fartlek workout or intervals?" I had an example of this myself about a year ago. I got an email from an event trying to figure out what would entice you to participate. One of the first options was "Swag" - all capitalised, SWAG. I happen to know what that stands for because I work with races. But if you've never run a race before, or you've only done one race or so, you look at that and go, "What's swag?" Well, it's two things: it's frustrating, and it also sends the message that maybe you don't belong here. "I don't know what this is, so I shouldn't have answered the survey. Clearly, I shouldn't go to races because I don't know any of this stuff." Have someone sit back and look at your survey. It's funny - things that you think are perfectly clear, someone else who's not involved in the industry, or is relatively new to it, will look at and say, "I don't understand that. What does that word mean? Here's what I think you're asking." And you go, "Oh, that's not what I intended."
Did you know that in a recent survey, 73% of responders said that reading reviews influences which races they enter?
Well, Racecheck is the largest aggregator of race reviews in the world and has collected over 40,000 reviews for over 6,000 events globally.
So how can you collect more reviews for your event and make the most of them to increase your race registrations? You can start by listening to our Power of Race Reviews podcast from September 20th last year - plenty of tips there on growing your race reviews - and then visit organisers.racecheck.com to download your free Racecheck Review Box, so you can start showing all your race reviews on your website for an instant boost to your race’s social proof and conversions.
It really is a no brainer. So go to organisers.racecheck.com and download your free Racecheck Review Box today.
Ok, now, let’s get back to the episode…
You mentioned earlier the style of questioning with statements and "agree, strongly agree, strongly disagree" kind of stuff, which I can definitely see having the advantage of honing in, with very specific questions, on the specific answers you want to get. What is, I guess, the consensus on asking more open-ended questions - things like, "What did you like about our race?" or "How would you recommend our race to a friend?" - something like that?
Yeah. "Is there anything else you want to add?" "Is there anything else you want to tell us about this event?" That type of thing - open-ended textbox questions. There are also ones where you have an unlimited number of characters and can prattle on for as long as you desire. The pros and cons of those: the benefit, from the survey-writing perspective, is actually less work for the survey writer, because you don't have to develop as many individual questions. You just ask, "What are the things you liked about our race?" Instead of trying to break that down into the registration part, the packet pickup, the actual event, the awards, and all the different things that go into your race, we let them tell us. More importantly, you can get some really insightful data and comments from open-ended responses about things you really hadn't thought about. Also, if you have permission - you can put in a checkbox saying, "We may use your comments to market our race" - you can pull juicy little quotes out describing the event. So that's the real benefit. The cons, the negatives: analysing open-ended questions properly is extremely time-consuming, because you have to read every single one. I'm not going to go through the whole qualitative data analysis process for you, and it's not an issue if you get 15 or 20 responses. But if you have a race of, let's say, 5,000 people, and you get a response rate of 40%, and then half of those respond to the open-ended questions, that's a thousand responses to go through - a lot of things to look at and take into account.
And ideally, you're going to look for big themes in there, which means you're going to have to read them a number of times. What I will often see in survey results, where there is an open question, is that whoever did the analysis says, "Oh, here are three or four representative comments from all of the open-ended comments," and I go, "No, these are three or four comments that you thought we should see." I get a little smarmy about that. To do it well takes a very long time. But the bigger problem is that people hate them. Why? Because it's a chore to fill them out. Not many people respond to open-ended questions - sometimes only 40% or 50% of your respondents. Even among people willing to take your survey, some are, kind of, grudgingly, "Well, I'm not real fond of surveys, but I'm going to provide input," and when they get to an open-ended question, they're like, "Nope, not going to do it." So, on the one hand, you can get great information if people respond. I'm in favour of them, because you do get information, but I always suggest that you use them sparingly. I reviewed a survey a while ago - it was an evaluation of a training programme - where the first four questions were open-ended: "What were your thoughts about the training programme? What was good about it? What was bad about it? What suggestions would you make?" I said, "Guys, you can't start with four open-ended questions." They're like, "Oh, well, we think that--" Well, it turns out, in the first week, they had 325 people in the population and nine responses. So I said, "You can't do it. Not gonna happen." Give open-ended questions some parameters: "What are three things we can do to improve this event?" rather than just, "Do you have any other comments for us?" Give them something to focus on, not something where they have to think, "Oh, man, what am I going to say?" or "No, I don't have time. I'm just gonna move on."
And put them last because people will get halfway through and see this open-ended and go, "I'm done. Survey closed."
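[Editor's note: proper qualitative analysis means reading every comment, as described above, but a rough first pass over open-ended responses can be sketched as a keyword tally. The theme names and keywords below are purely hypothetical - a real codebook would come from reading the comments themselves.]

```python
# Rough first pass over open-ended comments: tally hypothetical
# keyword "themes" to see what surfaces most often. This is no
# substitute for actually reading the responses - it only helps
# prioritise which themes to dig into first.
from collections import Counter

THEMES = {                      # illustrative keywords, not a real codebook
    "parking": ["parking", "shuttle"],
    "course":  ["course", "route", "mile marker"],
    "aid":     ["water", "aid station"],
    "swag":    ["shirt", "medal", "swag"],
}

def tally_themes(comments):
    counts = Counter()
    for comment in comments:
        text = comment.lower()
        for theme, keywords in THEMES.items():
            # Count each theme at most once per comment.
            if any(k in text for k in keywords):
                counts[theme] += 1
    return counts

comments = [
    "Parking was a nightmare, but the course was beautiful.",
    "More water stops on the course, please!",
    "Loved the shirt and medal this year.",
]
print(tally_themes(comments).most_common())
```

Counting a theme once per comment, rather than once per keyword hit, keeps one enthusiastic rambler from skewing the tally.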
In terms of sharing the final product - the survey that I've put together - first of all, what is the best time to do that, in order to get the maximum response rate I can out of it?
I recommend trying to get it out as soon as feasible after the event is over. We talked earlier about having things set up in advance and ready to go - most survey packages allow you to schedule a send, so you set everything up and schedule it. You're trying to capture people when the event is fresh, when they're still excited about it or their feelings are strongest, and that's generally within about the first 48 hours of the event. So if you can have it set, loaded, and ready to go the evening of or the day after the race, that's probably best. You're going to get the most responses from any survey within your first 48 hours, and then it drops off. I do recommend sending a reminder to the people who have not responded - you generally get a little bump in responses after that. With races, though, people do the race and, after a few days, they're looking to the next race. It fades quickly. Within about a week, you're probably going to have most of your responses; you're not going to get many more after that. So I say keep it open for maybe 10 days and then call it good.
And you're saying you've got people's attention after the race before they move on to something else because it's so fresh to them - like, the feelings and all the impression they have of the race is just there.
Right. Many of them are still excited about it. "We're gonna tell our friends how we did. We're gonna post it on Facebook." It's there. It's very vivid. They're very excited. So when they see it: "Oh, this is from that race I'm very excited about. I'm going to jump in and respond to this short, well-designed survey."
And you've already mentioned sending the survey out over email. You went over - which I think is super helpful - being really upfront about why you're doing it, why someone's been selected, how long it's going to take, how the results are going to be used - all really excellent stuff to put in the email - and, of course, the link to the survey, let's not forget. Is email the only distribution channel you'd use for something like this?
I think it's the most efficient, because you can load the registration list right up into a mail processor, or even your registration software if you want. You could also send the link out in, say, a newsletter - some bigger races will send out a newsletter a week or so after the event to all the participants or registrants thanking them: "Here are some numbers about the race. Oh, by the way, here's the link to the survey." And that's okay, because it's going to the people who've registered. Where I get a little less excited - and don't recommend - is just posting it on a public website or a public Facebook page: "For those of you that completed this race, boom, here's the link." You don't know exactly who's responding, and there's this perception of, "It's not very important. I have other things to do. I'm not going to click on that link." So I think, for effectiveness and for data quality, it's best to get it straight out to the people who were involved in the event.
Because you also wanted, as you were saying earlier, to be personalised, etc. Right? I mean, you want to say, "Hi, Laurel, we really value your response here."
That's ideal. If you don't have a first name or don't feel comfortable doing that, you can certainly say "race participant", but I do think an email invitation directly to the inbox is best. I can't think of a race these days that doesn't collect an email address, because you need it for communication, so that should be a pretty easy way to do it.
Now, you mentioned earlier how effective incentives can be in getting people to fill in a survey. An interesting point I was discussing on another podcast, on race reviews, was that it's a bit of tricky territory to offer incentives for someone to review your race on a public website - I can see the complexities there. For surveys, it sounds like it's pretty straightforward, right? I mean, you're not jeopardising the integrity of the survey by telling people, "We'll give you a $5 Amazon voucher," or whatever.
Absolutely not, no. We're not rewarding a good, favourable response. What we're saying is: in exchange for your willingness to provide this feedback, we're going to give everybody who responds, say, $5 off their next entry. I don't know if your race director would go for that - your treasurer probably would not - but there are people who would really respond well to that. The key with incentives is it has to be something everybody, or pretty much everybody, is going to be interested in. You have to find something that's going to be of value, and applicable, to your entire population. One of the common ones is a gift card to a store - maybe a local sponsor; sometimes you get those as part of the sponsorship, and they'll give you two or three gift cards of a reasonable value. But if you draw 50% of your participants from more than a one- or two-hour drive away, odds are they're not going to drive two hours back to use that card at this particular store. So think creatively: what's going to appeal to the majority of your respondents, based on what your population looks like, where they're coming from, demographics, or whatever? I always recommend offering more than one, because if there's only one, the perspective is, "What are my odds?" When people hear that you're offering two or three, in their minds, that increases the odds more than the actual offering of the additional cards does. They're thinking, "I have a way better chance to get one of these." So I always recommend that. You don't have to offer an incentive - if there isn't something you think would be useful, reasonable, and within your budget, then don't. But it does help; it's empirically been shown that it helps.
But again, it goes back to your point about being professional and putting the effort in. If you've already put in all the effort to do a great survey, use the right tools, and get the right responses, and you really care about the results, then offering two or three or four or five modest prizes or vouchers sounds to me like something you'd want to factor into the cost of doing the survey, if it's going to get you more results.
Yeah, I agree with that. And again, sometimes you can leverage some of your sponsorships to do that - get them to contribute something of value that can be used as your incentive, and that can be very helpful. You can also, depending on your event, offer a free entry into next year's event or a future event. For certain events, that's going to be a really strong incentive; for others, maybe not so much, depending on the size and reputation of your event and whether it sells out or not. I did a survey about four years ago for the Ann Arbor Track Club about which indoor track to rent for our indoor track workouts. My incentive was a free pass to use the indoor track, because I knew that was going to appeal to everybody who responded, it was relatively low cost, and it was going to be very useful. That worked really well.
So some of the things we've been touching on throughout this discussion, implicitly, happen better or are part of purpose-built survey software that you get these days to do the job. Do you have some online tools that you would recommend for cost, ease of use, and maybe specifically working well for race surveys? What kind of stuff do you use?
I generally use SurveyMonkey. That one is pretty accessible and, I think, quite well known, and it has many different platform options with different features. I've seen some very good surveys done on Google Forms, though I'm not a real fan of it, simply because it's not meant to be survey software, so the number of question formats and options is somewhat limited. There are a lot of other platforms out there that I really haven't had time to test. But here are the things to keep in mind. People ask me, "What's the easiest one to use?" and I go, "There's a learning curve for every one. It's software. You're not gonna be able to sit right down and magically understand how to use it. You're gonna have to take whatever time it takes to learn it." Many platforms have fancy features - many of which you're not going to need, and some of which you will, like the different question types I recommend. The advice I would give is to look at the different options out there and not design your survey to fit the tool, to fit the software - look for software that best serves your purpose and that is going to allow you to create a good survey. It's not necessarily going to be expensive, because you're not doing a scientific survey - you're not going to need different types of question logic or skip patterns or stuff like that. My deal breakers when looking at survey software - the two things I absolutely have to have - are, first, that it allows a sufficient number of questions and responses. Some of the free or basic packages will say, "Okay, you're allowed only 10 questions and only the first 100 responses." Hey, if your race only had 100 people, that would work from a response perspective, but that's too limiting, I think.
You're really putting out a lot of effort but you're not going to get as much in return with that. So look and make sure that the number of responses allowed is sufficient for your population. Assume about 50% to 60% of the people you're sending it to will respond - I would err on 60. You probably won't get that, but it's safer to plan for it. And make sure it allows enough questions that you don't have to cut out necessary ones. Now, a limit might actually work to your advantage if it forces you to cut out some questions you're like, "Well, we really shouldn't be asking that anyway." The other thing that I think is really valuable is the ability to export a file of your responses - usually a CSV; that's what I do - so that you have that data. Even if you're going to be using the reporting tools within the survey software - which a lot of people do, and which is a real benefit of some of them - it's your data. Take it. Download it. Save a copy in your Google Drive or wherever, so you can come back to it for reference or when additional questions come up later: "Here's the report. But hey, why don't we look at this? Why don't we look at that?" You've got it. Those are the two things I would always look for in choosing a survey software program.
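Laurel's sizing rule can be sketched as a quick back-of-the-envelope check. This is just an illustration of the 50-60% planning assumption discussed above, not part of any survey tool; the function name and the 500-runner figure are made up for the example:

```python
# Sketch of the sizing rule: budget for 50-60% of invitees responding,
# and err on the high side (60%) so a plan's response cap doesn't cut
# you off mid-survey. All numbers here are illustrative.
import math

def responses_to_budget_for(invitees: int, assumed_rate: float = 0.60) -> int:
    """Upper-bound estimate of responses your survey plan should allow."""
    return math.ceil(invitees * assumed_rate)

# A 500-runner race: plan for up to 300 responses, so a free tier
# capped at the first 100 responses would be too limiting.
print(responses_to_budget_for(500))
```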
The analysis tools you just mentioned - you do get those with most of these. I mean, I know you get them with Google Forms, which is the most basic of them all, so I'm sure SurveyMonkey and the others offer that as well. Do they also allow you to do more advanced segmenting of the data, or is that something you have to do in a spreadsheet?
Most of them will allow that to a certain degree. These are not statistical software packages, so they don't do the advanced statistics, the advanced types of research - but then again, you're probably not interested in that. You just want to know what the various segments of your population think. Think about it this way: you want to be able to present the results in a manner that is very clear, that anybody can understand, and that gives you a good picture of how people feel. You can look at the report - and I do this, it's not uncommon - I do a general report, and then people will ask, "Well, what about this group? What about that group?" So I can go in and select, say, "Females only," and see how they responded to certain things. You can do it by age group. And certainly, for a lot of questions, if your event has multiple races - for example, our Dexter-Ann Arbor Run has a half marathon, a 10K, and a 5K - any question that has to do with the start or with the course, I had to segment out, because otherwise it's noise. I have respondents in there rating three different courses, and looking at the aggregate is not helpful for any of the three, so I segment those out, and it's pretty easy to do. Excel is a great tool with an awful lot of functionality. I think most people just use it as a simple spreadsheet, but you can create formulas and do some statistics, as I do. If I'm doing t-tests or ANOVAs, I tend to do those in Excel. It's pretty easy - I don't need SPSS or Stata or any of those tools for that. Again, there's a little bit of a learning curve, but that's true of any software program.
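The kind of segmenting Laurel describes is also straightforward in a few lines of code once you've exported your responses. A minimal sketch, with made-up column names ("distance", "course_rating") and sample rows standing in for a real CSV export - match the names to whatever your file actually contains:

```python
# Group course ratings by race distance so each course is rated on its
# own rather than as a misleading aggregate. In practice you'd load the
# rows with csv.DictReader from your exported CSV file.
from collections import defaultdict
from statistics import mean

responses = [
    {"distance": "Half Marathon", "course_rating": "5"},
    {"distance": "Half Marathon", "course_rating": "4"},
    {"distance": "10K", "course_rating": "3"},
    {"distance": "5K", "course_rating": "4"},
    {"distance": "5K", "course_rating": ""},  # no rating given; skipped below
]

ratings = defaultdict(list)
for row in responses:
    try:
        ratings[row["distance"]].append(float(row["course_rating"]))
    except ValueError:
        continue  # skip blank or non-numeric ratings

# Per-distance count and mean rating, e.g. {"Half Marathon": (2, 4.5), ...}
summary = {d: (len(r), round(mean(r), 2)) for d, r in ratings.items()}
print(summary)
```

The same grouping works for any segment you care about - gender, age group, first-timers - by swapping the key you group on.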
Yeah, absolutely. We have covered so much ground, and I'm really grateful for all your insights here. I think we've probably gone through enough to demystify it for people, and hopefully more people will be putting together race surveys from now on. One last thing from you-- if people have some questions or want to take this a little further, is there any way they can reach you? Would you be willing to help out people who have survey questions?
Absolutely. I'd be glad to. The best way right now is my email address, email@example.com - just put "Survey question" in the subject line so I can set up a filter and separate those out from all the other spam I get. I'd be glad to do it - I really enjoy it. And I'm glad to have had the opportunity to talk about this. I hope it wasn't a chore. I appreciate the opportunity.
Well, absolutely. We appreciate having you on, and I hope you don't get inundated - when people offer up their help that generously, it sometimes gets a little overwhelming. But it's great that you're offering that help to people. I want to thank you very, very much, Laurel, for taking the time today to talk to us about this.
And I want to thank everyone listening in. I hope you found it useful and we'll see y'all on our next podcast!
I hope you enjoyed today’s episode on actionable race survey design with Ann Arbor Track Club President Laurel Park.
You can find more resources on anything and everything related to race directing on our website RaceDirectorsHQ.com. You can also share your thoughts about designing race surveys or anything else in our Facebook group, Race Directors Hub.
Many thanks again to our awesome podcast sponsors RunSignup and Racecheck for sponsoring today’s episode. And if you enjoyed this episode, please don’t forget to subscribe on your favorite player, and check out our podcast back-catalog for more great content like this.
Until our next episode, take care and keep putting on amazing races.