The following outlines all of the question types available. Each entry below shows the question category and question type, followed by a description.
Key metrics | NPS
Net Promoter Score (NPS) was first proposed as an effective measurement of customer loyalty in the Harvard Business Review in 2003 by Fred Reichheld, a partner at US consulting firm Bain & Co. NPS asks customers to score, on a scale of 0 (very unlikely) to 10 (very likely), how likely they would be to recommend Brand X. Those who score in the range of 0 to 6 are Detractors; those who score 7 or 8 are Passives; and those who score 9 or 10 are Promoters. The NPS score is calculated by subtracting the percentage of the sample who are Detractors from the percentage who are Promoters. So if 20% are Detractors, 30% are Passives and 50% are Promoters, the NPS score is 50 - 20 = 30. The default wording we have used for this question is "On a scale from 0-10, how likely are you to recommend us to a friend or colleague?". This can be tweaked to include a reference to your brand, etc. Users are presented with radio buttons or a slider to select their score (depending on what device they are using). Using this question type will populate the NPS column on the dashboard.
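To make the calculation above concrete, here is a minimal sketch (in Python, with a hypothetical function name and made-up sample data, not part of the product) that derives an NPS score from a list of 0-10 responses:

```python
def nps_score(responses):
    """Net Promoter Score: % Promoters (9-10) minus % Detractors (0-6)."""
    total = len(responses)
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return round(100 * (promoters - detractors) / total)

# Matches the worked example: 20% Detractors, 30% Passives, 50% Promoters.
sample = [3] * 20 + [7] * 30 + [10] * 50
print(nps_score(sample))  # 30
```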
Key metrics | CSAT
Customer satisfaction (CSAT) is a more traditional way of gauging a customer's reaction to a number of issues. Our question uses a 5-point Likert-type scale, where 1 is very dissatisfied and 5 is very satisfied. Its strength (and weakness) is that the format can be used to investigate a wide range of questions. The default wording we have used for this question is "How satisfied were you with your overall experience?". This can be tweaked to meet your needs as per the above. Users are presented with radio buttons or a slider to select their score (depending on what device they are using). Using this question type will populate the CSAT column on the dashboard.
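The article does not spell out how the dashboard aggregates CSAT responses, so the sketch below simply illustrates two common summary conventions for a 1-5 scale (the mean score and the percentage of satisfied responses, i.e. 4s and 5s); treat the function and data as hypothetical rather than a description of the product's own calculation:

```python
def csat_summary(responses):
    """Summarise 1-5 satisfaction ratings: mean score and % scoring 4 or 5."""
    mean = sum(responses) / len(responses)
    satisfied_pct = 100 * sum(1 for r in responses if r >= 4) / len(responses)
    return round(mean, 2), round(satisfied_pct)

ratings = [5, 4, 4, 3, 5, 2, 4, 5, 1, 4]
print(csat_summary(ratings))  # (3.7, 70)
```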
Key metrics | Effort
The Customer Effort Score emerged in 2010, partly as a reaction to NPS. Launched in the Harvard Business Review by the Corporate Executive Board (CEB), it was based on a single question - "How much effort did you personally have to put forth to handle your request?" - to which customers were asked to respond on a scale from 1 (very little effort) to 5 (a great deal of effort). The underlying premise of the Customer Effort Score is that companies spend too much time trying to delight their customers when most people just want an effortless experience. Service organisations can create loyal customers by reducing customer effort - i.e. helping them solve their problems quickly and easily - not by delighting them in service interactions. CEB maintain that Customer Effort is a better predictor of customer loyalty than NPS or customer satisfaction scores. One of the criticisms of the Customer Effort Score has been the awkward phrasing of its question; in 2013 CEB revamped it to the more effortless "The company made it easy for me to handle my issue". The default wording we have used for this question is "How easy was it to deal with us today?". This can be tweaked to include a reference to your brand, etc. Users are presented with radio buttons or a slider to select their score (depending on what device they are using). Using this question type will populate the Effort column on the dashboard.
Standard | Radio buttons
Radio button questions allow you to ask a question which requires the user to select one option from a list of two or more options. Each radio button question requires you to enter the question name and a list of options for the user to select from. This is useful for questions where you ask customers to select a score based on a numerical range (e.g. rating happiness on a scale of 1-5) or to select from a list of options where only one is true.
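As an illustration of the pieces you supply (a question name plus a list of options, with exactly one selectable), here is a minimal sketch; the data structure and function are hypothetical and are not the product's actual configuration format:

```python
# Hypothetical single-select question definition: one name, several options,
# and the respondent picks exactly one.
radio_question = {
    "type": "radio",
    "name": "How happy are you with our service?",
    "options": ["1", "2", "3", "4", "5"],
}

def validate_answer(question, answer):
    """A single-select answer must be exactly one of the listed options."""
    return answer in question["options"]

print(validate_answer(radio_question, "4"))  # True
```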
Standard | Tick boxes
Tick box questions allow you to ask a question which requires the user to select one or more options from a list of two or more options. Each tick box question requires you to enter the question name and a list of options for the user to select from. This is useful for questions where you ask customers to select from a list of options where more than one option could be applicable/true.
Standard | Matrix
Matrix questions allow you to ask users to rate multiple topics/questions on the same scale. When creating a matrix question you first need to give the overarching question a name and select the options required. The options are plotted as columns in the matrix table. You will then need to add "questions" to the matrix; these are plotted as rows in the matrix table.
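To make the rows/columns relationship concrete, the sketch below models a matrix question as an overarching name, a shared set of options (the columns) and a list of sub-questions (the rows); again, the structure and example data are hypothetical, not the product's internal format:

```python
# Hypothetical matrix question: columns come from the shared options,
# rows come from the individual sub-questions.
matrix_question = {
    "type": "matrix",
    "name": "How satisfied were you with the following?",
    "options": ["Very dissatisfied", "Dissatisfied", "Neutral", "Satisfied", "Very satisfied"],
    "questions": ["Speed of response", "Friendliness of staff", "Quality of answer"],
}

# Each sub-question (row) is answered with one of the shared options (columns).
response = {
    "Speed of response": "Satisfied",
    "Friendliness of staff": "Very satisfied",
    "Quality of answer": "Neutral",
}
```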
Standard | Drop down
Drop down questions allow you to ask a question which requires the user to select one option from a list of two or more options. Each drop down question requires you to enter the question name and a list of options for the user to select from. We generally recommend using radio buttons rather than drop downs, as users can instantly see all of the options available. However, drop downs are useful when there are lots of options and/or the user doesn't need to see all of the alternative choices (e.g. when asking which country someone is in, a drop down is more appropriate than radio buttons because there are many countries and the user already knows the answer, so they don't need to review every option).
Standard | Text box (single line)
Text box (single line) questions allow you to ask a question where the user can type their response. Each text box question requires you to enter the question name. Text box (single line) questions are useful for capturing a short amount of text (e.g. a name).
Standard | Text box (multi line)
Text box (multi line) questions allow you to ask a question where the user can type their response. Each text box question requires you to enter the question name. Text box (multi line) questions are useful for capturing larger amounts of text (e.g. general feedback or details about a complaint).
Other | Text
Text isn't specifically a question type, but it allows you to add free-format text and links into your survey. For example, you can use this to add an introduction to your survey.