
Best Practices: Collaborating helps build better surveys

Posted by Jillian Dutson and Drew Chambers on Dec 5, 2019 10:24:13 AM

The Corus platform was designed for teams. But what are some of the best ways to get the most out of it if you’re not used to building surveys collaboratively?

Working collaboratively around large datasets just makes sense: there's usually too much information for any one person to ingest and decipher alone. ProPublica put out a great piece about how their newsroom improved when they implemented large-scale collaborative processes around data analysis, and the results are impressive. But what about before the dataset is built? What about collaborating during the survey phase? Is that possible, practical, and advisable? In our experience, yes - and here are some best practices to get the most out of the collaborative tools integrated into the Corus platform.

 

Signing Up Your Whole Team

Corus was designed from the ground up to be used by your entire team, and we don't charge on a per-seat basis. So we encourage you to sign up everyone on your team who might use the tool - even if it's just to jump in after a survey goes live and view the results in the dashboard. Note: you can control individual permissions, including managing other users, editing, sending out surveys, extracting data, and managing locations. A quick tip here: you can create unique organizations within your company (so sales and marketing could have a different org than human resources, for example).


[Here's a full Intercom help article about inviting members to your team]

Establish a Process for Survey Creation and Editing

Here’s where we get into the nitty-gritty. With anything collaborative, you run the risk of redundant work, miscommunication, and potential headaches. That’s why it’s best to start with a process for your team before you begin your surveys: establish roles and responsibilities, and designate a point person for the initial survey writing, with others editing at the next stage. Here’s what that might look like in practice:

  • You, the project manager, determine who is going to write the survey and what the overall objective is for the dataset you’re building.
  • The survey writer goes in and begins authoring the survey in your organization.
  • Send an email or Slack message to the rest of the team letting them know the survey is ready to be edited and critiqued, along with a deadline for all suggestions and comments.
  • Review the notifications and comments in the platform along with the survey writer and make the final call on edits.

Dashboarding and Review

The data analysis stage is, depending on how you’re using our tool, another great opportunity to collaborate (and it is often necessary to collaborate at this stage, a la ProPublica). Here you can comment on the findings, highlight anything that jumps out at you (e.g. “these responses are not what we were expecting…”), and check statistical significance. After reviewing the dashboard, go into the survey itself and add comments for your team via the “activity” pane on the right side.


[Here's an Intercom article about the comment section]

Some best practices at this stage of the process:

  • Keep your suggestions and edits constructive - at this point the survey has already been fielded, so you can’t change the responses, but you can learn whether a finding is insignificant or whether it might be worth running a new survey with a different question.
  • Always keep your objective in mind - what was the goal of this survey in the first place? Is the data you have giving you insight into that problem statement? Try to look at each question and your comments through this lens.
  • Make sure one person is responsible for consolidating and clarifying the feedback (perhaps the survey author or the project manager).
  • Always set a deadline for this stage of commenting (a day or two is usually enough for the team).

 

Final Thoughts and Other Tips and Tricks

Putting these best practices in place from the get-go will set you up for successful survey writing and dataset building. Once the process is established, writing surveys and collecting data gets quicker and easier with each subsequent project. Pretty soon it will be second nature, and you’ll wonder how you ever created customer or employee surveys without it. A couple more best practices to keep in mind:

  • Keep comments short and clear, much as you would in Slack.
  • Because surveys can include hyperlinks and images, feel free to drop a comment or suggestion about multimedia right in the comment section.
  • Regularly check your organization's permissions.
  • Keep things light-hearted - a fun comment here and there makes the process more enjoyable.

Enjoy, and as always, contact us at support@cor.us if you have any questions!

Topics: use case, best practices
