1: About Verifying a MaxDiff Question
MaxDiff, or Maximum Difference Scaling, is an approach for obtaining preference/importance scores for multiple items (brand preferences, brand images, product features, advertising claims, etc.) in marketing or social survey research. With MaxDiff, respondents are shown a subset of the possible items in the exercise and are asked to indicate, among that subset, the best and worst items (or most and least important, etc.).
In this document we learn how to test the MaxDiff Element in your survey to make sure that it is working properly before you launch the project.
2: Requirements for Testing
The items listed below are required to test a MaxDiff question.
2.1 Questionnaire
Ideally, the questionnaire gives you an overview of how the survey should be programmed, including the MaxDiff’s placement in the survey and the question text. Use the questionnaire to navigate through the survey and verify that the MaxDiff Element is set up properly.
2.2 Design File
The design file illustrates the setup required for the MaxDiff. In our example we've set up the design in an Excel file for easier viewing, but it must be converted to a tab-delimited (.dat) file after the design has been configured. The columns are described below.
1. Version: The first column in the design file, containing the number given to identify the group of sets/tasks.
2. Task/Set: The second column in the design file, containing the number given to identify the group of items.
3. Item [1,2,...]: Each "Item" column corresponds to a statement position in the MaxDiff question, and the number in that column determines which statement appears there.
4. Attributes: These numbers correspond with the statements for the question. You can match them with the item descriptions found in the attribute list.
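As a rough sketch of how this layout is consumed, the tab-delimited design file can be parsed with a few lines of Python. The header names and the `parse_design` function are illustrative, not part of the platform:

```python
import csv
import io

def parse_design(dat_text):
    """Parse tab-delimited MaxDiff design rows into dicts.

    Expects a header row, then one row per task:
    Version <TAB> Task <TAB> Item1 <TAB> Item2 ...
    """
    reader = csv.reader(io.StringIO(dat_text), delimiter="\t")
    next(reader)  # skip the header row
    rows = []
    for raw in reader:
        if not raw:
            continue
        version, task, *items = (int(cell) for cell in raw)
        rows.append({"version": version, "task": task, "items": items})
    return rows

# A tiny two-task design matching the column layout described above.
sample = (
    "Version\tTask\tItem1\tItem2\tItem3\tItem4\n"
    "2\t1\t8\t12\t9\t1\n"
    "2\t2\t3\t7\t11\t5\n"
)
design = parse_design(sample)
print(design[0])  # {'version': 2, 'task': 1, 'items': [8, 12, 9, 1]}
```

In practice you would read the .dat file from disk rather than a string; the parsing is the same.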
2.3 Attribute List
The attribute list will display all of the items that will pipe into the MaxDiff Element, as shown in the example below.
With the MaxDiff Element, only a select number of attributes will show for each task, as shown below.
The order of the items in the attribute list is important. Each attribute must retain its position relative to the other attributes, because that position is referenced by the design file and used to determine which items are actually shown on each task.
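To make that dependency concrete, here is a small Python sketch of the lookup: the design file's item numbers index into the attribute list by position. Only items 1, 8, 9, and 12 are taken from this article's example; the rest of the list is made-up filler:

```python
def items_for_task(item_numbers, attributes):
    """Resolve 1-based design-file item numbers to attribute text."""
    return [attributes[n - 1] for n in item_numbers]

# Positions 1, 8, 9, and 12 come from the example in this article;
# the other entries are hypothetical placeholders.
attributes = [
    "Traditional",                 # Item 1
    "Attribute 2", "Attribute 3", "Attribute 4",
    "Attribute 5", "Attribute 6", "Attribute 7",
    "Expert (product/industry)",   # Item 8
    "Consultative",                # Item 9
    "Attribute 10", "Attribute 11",
    "Proactive",                   # Item 12
]
print(items_for_task([8, 12, 9, 1], attributes))
# ['Expert (product/industry)', 'Proactive', 'Consultative', 'Traditional']
```

If an attribute is moved in the list without updating the design file, every task that references its old position will silently show the wrong item, which is exactly what testing needs to catch.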
2.4 Survey Programmed with MaxDiff Element
The final item that you need is the link to test the survey. You can access the testing link by selecting Test and Test Survey from the project controls in the research hub or from the navigation menu anywhere on the platform.
In the survey test environment, you can configure the survey link as needed and then click “Show Survey with Tools,” as shown in the example below.
Note: Testing the survey with tools allows you to view the hidden version question and the debugging tool (if added during the programming phase).
3: Verifying the MaxDiff Setup
Now that we have everything we need to test the MaxDiff Element, we’re ready to jump into testing!
3.1 Spot Checking
Depending on your project, your MaxDiff exercise can have anywhere from a single version to thousands of versions. It would be unrealistic to check every single version of a MaxDiff, so instead we recommend doing a spot check.
A spot check involves verifying that a few versions of the exercise are programmed correctly by checking each task and attribute against the design file specifications for that version.
If the versions checked are correct, it is reasonable to assume that the rest are correct as well.
If errors are present, make changes to the MaxDiff programming and recheck the versions. Randomly select a few more versions and verify that no additional issues exist. Repeat this process until you can verify a series of versions free of errors.
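The random selection of versions to spot-check can be sketched in Python; the function name and defaults here are illustrative:

```python
import random

def pick_spot_check_versions(total_versions, n=3, seed=None):
    """Choose n distinct version numbers (1-based) to spot-check."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, total_versions + 1), n))

# e.g. pick three of 100 versions to check by hand
print(pick_spot_check_versions(100, n=3, seed=42))
```

Fixing the seed makes the selection reproducible, which is handy if you need to re-run the same spot check after correcting a programming error.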
3.2 QA Tools
The QA tools that you need to verify the MaxDiff setup are only available when you use the Test Survey with Tools link and have QA Codes enabled.
Learn more: QA Codes
3.2.1 MaxDiff Version Assignment Hidden Question
The MaxDiff Element uses a randomly assigned quota to assign respondents to a version of the MaxDiff Exercise. This version assignment appears on a hidden question screen when you’re testing with tools:
This screen displays the version of the MaxDiff Exercise that has been assigned. If preferred, you can manually enter a different version for testing. Click "Continue" to apply the version displayed or entered here.
Note: You can manually adjust the version number on this screen; however, doing so may conflict with the version assignment shown in the recorder.
3.2.2 Print Dialogue Box (only available if added in the XML)
Programmers can implement the Print Statement in the XML to display various information about the MaxDiff in a green box at the top of each task page. If available, it appears when you click the arrow at the top left of the page for the MaxDiff Exercise, as shown in the example below.
In this print dialog box, you typically see the Version & Task for the current page, along with the attribute numbers according to the design file.
Note: The text available in the print dialog box may vary depending on programmer preference. To achieve the print dialog box shown above, use the following code at the end of the exec block which calls the MaxDiff function in your XML:
print "*****STAFF ONLY*****"
print "Version_Task: %s" % vt
for i in range(len(items)):
    print "Item %s: %s" % (i+1, items[i])
</exec>
3.3 Manual Testing
Manual testing involves going through the MaxDiff element in your survey and looking for errors that can only be caught by eye. For the MaxDiff element, you’ll want to verify that the attributes assigned to each task match the version, according to the design file specifications.
3.3.1 Verify Version, Tasks and Attributes
Step 1: Decide which Version you’re going to test.
You want to make sure that you’re getting a good sample of all versions of the MaxDiff from your design file. For example, if your design file has 100 versions, you might test versions 26, 66, and 89.
In the example below, we’re verifying version 2 of our MaxDiff exercise.
Step 2: Use the testing with tools link to reach the MaxDiff version question in your survey.
Answer each question in your survey to ensure that you qualify for the MaxDiff Element.
Step 3: Manually select the version that you wish to test.
Use the MaxDiff hidden question to select which version of the MaxDiff you want to test. Enter the appropriate version number and click "Continue":
Step 4: Verify that the item text and item order displayed on each task matches the design file specifications.
In this example we’re looking at Version 2, Task (Set) 1. According to the design file we should see items 8, 12, 9, & 1 as shown in the example below.
Next, check the design file item numbers against the attribute list and survey:
Note: The goal is to verify that the items in the design file are displayed in the survey in the correct order, with the correct attribute, for each task in the version being checked.
According to the design file, the first item that should be displayed on version 2, task 1 is Item 8, “Expert (product/industry).” As shown in the example below, the “Expert (product/industry)” attribute is in the first item position in the MaxDiff exercise in our survey, which matches the design file and attribute list.
Correspondingly, we should see Item 12, “Proactive,” displayed second; Item 9, “Consultative” displayed third; and Item 1, “Traditional” displayed fourth.
Step 5: Complete checking the version of the MaxDiff.
Verify that each task in the version displays the proper item in the order designated by the design file.
Note: Any discrepancy between the design file, attribute list, and survey indicates a programming error. Resolve the errors and then retest the version.
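The per-task comparison in Steps 4 and 5 can be sketched as a small checker. As before, only items 1, 8, 9, and 12 in the attribute list come from this article's example; the rest are hypothetical fillers:

```python
def check_task(displayed, expected_items, attributes):
    """Compare the item text shown on a task against the design file.

    displayed       -- item texts as they appear in the survey, in order
    expected_items  -- 1-based item numbers for this version/task
    attributes      -- full attribute list in design-file order
    Returns (position, expected, shown) tuples; an empty list means the
    task matches the design.
    """
    mismatches = []
    for pos, (item_no, shown) in enumerate(zip(expected_items, displayed), start=1):
        expected = attributes[item_no - 1]
        if shown != expected:
            mismatches.append((pos, expected, shown))
    return mismatches

# Items 1, 8, 9, and 12 are from the article's example; the rest are fillers.
attributes = [
    "Traditional", "Attribute 2", "Attribute 3", "Attribute 4",
    "Attribute 5", "Attribute 6", "Attribute 7",
    "Expert (product/industry)", "Consultative",
    "Attribute 10", "Attribute 11", "Proactive",
]
design_row = [8, 12, 9, 1]  # Version 2, Task 1 in the article's example
shown = ["Expert (product/industry)", "Proactive", "Consultative", "Traditional"]
print(check_task(shown, design_row, attributes))  # [] -> the task matches
```

A non-empty result pinpoints exactly which position shows the wrong item, which makes the programming fix and retest faster.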
Step 6: Repeat steps 1-5 for each Version of the MaxDiff that you’re testing.