1: About DCM
Discrete Choice Model (DCM) is a statistical technique used to determine how people value the different features or attributes that make up a product or service. The model most commonly used in Decipher surveys is known as the Choice-Based Conjoint (CBC).
With a DCM, respondents are shown several products with different features and asked to pick the product with the most desirable combination of features. The exercise can be repeated several times as determined by the design file specifications.
In this document we’ll learn how to test the DCM Element in your survey to make sure that it is working properly before you launch the project.
To learn how to program the DCM in this example, click here.
2: What You’ll Need to Test the DCM
2.1 Questionnaire
Ideally, the questionnaire will give you an overview of how the survey should be programmed. It’ll include the DCM’s placement in the survey, along with the question text. Use the questionnaire to navigate through the survey and verify that the DCM Element is set up properly.
2.2 Design File
The design file illustrates the setup required for the DCM. In our example we've set up the design in an Excel file for easier viewing, but this file is typically saved as a tab-delimited (.dat) format when uploaded into the system files. Read more about the Design File Setup below.
1. Version: The first column in the design file, containing the number given to identify the group of sets/tasks.
2. Task/Set: The second column in the design file, containing the number given to identify the group of items.
3. Concept [1,2,...]: The "Concept" column represents the products/services to be compared against each other in the DCM.
4. Attributes [1,2,...]: These columns represent the features that will display for each concept. You can match the numbers in these columns with the descriptions found in the attribute list.
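To make the design file layout concrete, here is a minimal Python sketch of reading those columns. The file contents, column order, and values below are hypothetical stand-ins for a real tab-delimited (.dat) design file with a Version, Task, Concept, and five attribute columns:

```python
import csv
import io

# Hypothetical design file rows (tab-delimited):
# Version  Task  Concept  Att1  Att2  Att3  Att4  Att5
DESIGN_DAT = """\
4\t1\t1\t1\t1\t3\t1\t4
4\t1\t2\t2\t3\t1\t2\t1
"""

def load_design(text):
    """Index each row's attribute levels by (version, task, concept)."""
    design = {}
    for row in csv.reader(io.StringIO(text), delimiter="\t"):
        version, task, concept = (int(x) for x in row[:3])
        design[(version, task, concept)] = [int(x) for x in row[3:]]
    return design

design = load_design(DESIGN_DAT)
# Attribute levels for Version 4, Task 1, Concept 1
print(design[(4, 1, 1)])
```

Indexing by (version, task, concept) mirrors how you will look rows up later when spot checking a specific task page against the survey.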
2.3 Attribute List
The attribute list displays all of the items that pipe into each concept of the DCM. As shown in the example below, we have a total of five attributes that pipe into each concept of the DCM:
- Cost of Phone
- Storage
- Color
- Contract
- Battery Life
With the DCM, the level of attributes shown for each concept varies depending on the design file specifications. For example, the "Cost of Phone" attribute contains five different levels, as shown in the example below.
The attribute text that will be piped into the concept is determined by the number under that attribute column for that concept as designated in the design file.
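That piping logic can be sketched in a few lines of Python. The attribute names follow the example above, but the level texts here are hypothetical placeholders, not values from a real attribute list:

```python
# Hypothetical attribute list excerpt: attribute name -> level number -> display text
ATTRIBUTE_LIST = {
    "Cost of Phone": {1: "$199", 2: "$299", 3: "$399", 4: "$499", 5: "$599"},
    "Battery Life": {1: "8 hours", 2: "12 hours", 3: "18 hours", 4: "24 hours"},
}

def attribute_text(attribute, level):
    """Return the text that should pipe into a concept for a given level number."""
    return ATTRIBUTE_LIST[attribute][level]

# A design file value of 1 under "Cost of Phone" pipes in the level-1 text
print(attribute_text("Cost of Phone", 1))
```

The number in the design file column is simply an index into the attribute's list of levels; the survey displays the text at that level.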
2.4 Survey Programmed with DCM Element
The final item that you’ll need is the link to test the survey. You can access the testing link by selecting Test and test survey from the project controls in the research hub or from the navigation menu anywhere on the platform.
In the survey test environment, you can configure the survey link as needed and then click "Show Survey with Tools," as shown in the example below.
Note: Testing the survey with tools will allow you to view the hidden version question and the debugging tool (if added during the programming phase).
3: Verifying the DCM Setup
Now that we’ve got all the necessary supplies to test the DCM, we’re ready to jump into testing it!
3.1 Spot checking
Depending on your project, your DCM exercise can have as few as one version or as many as thousands. Checking every single version of a DCM would be unrealistic, so instead we recommend doing a spot check.
A spot check involves verifying that a few versions of the exercise are programmed correctly by confirming that each task shows the correct concepts and attribute configurations as laid out in the design file specifications.
If the versions checked are correct, then it can be assumed that the rest will also be correct.
If errors are present, make changes to the DCM programming and recheck the versions. Randomly select a few more versions and verify that no additional issues exist. Repeat this process until you can verify a series of versions free of errors.
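Choosing which versions to spot check can be as simple as random sampling. The helper below is a hypothetical sketch, not part of the platform:

```python
import random

def pick_spot_check_versions(total_versions, sample_size=3, seed=None):
    """Randomly choose a handful of distinct version numbers to spot check."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, total_versions + 1), sample_size))

# e.g. pick 3 of 100 versions to verify against the design file
print(pick_spot_check_versions(100, 3))
```

Sampling rather than always checking versions 1-3 helps catch errors that only appear in later portions of the design file.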
3.2 QA tools
The QA tools that you’ll need to verify the DCM setup are only available when you use the Test Survey with Tools link and have QA Codes enabled.
To learn more about QA Codes, click here for a refresher.
3.2.1 DCM Version Assignment Hidden Question
The DCM element uses a quota that randomly assigns respondents to a version of the DCM Exercise. This hidden version assignment shows on two different screens when you’re testing with tools.
The first will show all of the versions that the respondent can possibly qualify for. On this screen you have the option to manually select the version that you’d like to test.
If no versions are selected, then a version will randomly be assigned on the next screen, as shown below. This screen will display the version of the DCM Exercise that has been assigned.
Note: You can manually adjust the version number on this screen; however, it may conflict with the version assignment shown in the recorder.
Use the version assignment question to verify which version of the DCM you will test.
3.2.2 Print Dialog Box (only available if added in the XML)
Programmers can implement the Print Statement in the XML to display various information about the DCM in a green box at the top of each task page. If available, it appears when you click the arrow at the top left of the page for the DCM Exercise, as shown in the example below.
In this print dialog box, you are typically able to see the Version & Task for the current page, along with the Concepts and their corresponding attribute numbers according to the design file.
Note: The text available in the print dialog box may vary depending on programmer preference. To achieve the print dialog box shown above, use the following code at the end of the exec block which calls the DCM function in your XML:
  #set persistent items, format: p.concept#_att#
  def setupDCMItems(d, vt, prefix="1"):
      print "***** STAFF ONLY *****"
      print "***** DCM Matrix *****"
      print "Version_Task: %s" % vt
      for concept in d.get("concepts"):
          attributes = d["%s_c%s" % (vt, concept)]
          print "Concept %s: %s" % (concept, attributes)
          for i, attr in enumerate(attributes):
              p["concept%s_att%s" % (concept, i+1)] = res["%s_att%s_level%s" % (prefix, i+1, attr)]
              p["dcmLegend_att%s" % (i+1)] = res["%s_legend%s" % (prefix, i+1)]
</exec>
Use the Print Dialog Box as reference while manually verifying the DCM element.
3.3 Manual Testing
Manual testing involves going through the DCM in your survey and looking for errors that can only be caught by eye. For the DCM element, you’ll want to verify that the attributes assigned to each concept match the task and version, according to the Design File specifications.
3.3.1 Verify Version, Tasks and Attributes
Step 1: Decide which Version you’re going to test.
You want to make sure that you’re getting a good sample of all versions of the DCM from your design file specifications. For example, if your design file has 100 versions, test versions 26, 66, and 89.
In the example below, we’re verifying version 4 of our DCM exercise.
Step 2: Use the testing with tools link to reach the DCM version question in your survey.
Answer each question in your survey to ensure that you qualify for the DCM Element.
Step 3: Manually select the version that you want to test.
Use the DCM version question to select which version of the DCM you will test. Select the appropriate version number and hit continue.
Verify that the next hidden question displays the correct version assignment, as shown below.
Select "Continue" to start the DCM exercise.
Step 4: Verify the Version, Task, Concepts & Attributes in the Design File/Attribute list against the Survey.
In the example above, we’re looking at DCM Version 4, Task (Set) 1. According to the design file, we should have two concepts with five attributes each (Cost of Phone, Storage, Color, Contract, & Battery Life).
Tip: Verify this setup once; it will remain the same for every version and task of the DCM exercise.
Step 5: Verify the attribute text for each concept.
Each attribute is displayed as a column in the design file. We can match the numbers in each of these columns with the level of the corresponding attribute in the attribute list to find the text that should show in the survey.
For example, Version 4, Task 1, Concept 1 should have the following attributes and levels:
Cost of Phone - 1
Storage - 1
Color - 3
Contract - 1
Battery Life - 4
You can see them outlined in the image below.
Finally, we can verify that the text for these attributes displays correctly in the survey.
Repeat this process for all remaining concepts in the task.
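The comparison in this step can be sketched in Python. Every value below is hypothetical: the level texts stand in for a real attribute list, and the "shown" strings stand in for what the survey page actually renders for Version 4, Task 1, Concept 1:

```python
ATTRIBUTES = ["Cost of Phone", "Storage", "Color", "Contract", "Battery Life"]

# Levels for Version 4, Task 1, Concept 1, as read from the design file
design_levels = [1, 1, 3, 1, 4]

# Hypothetical attribute list excerpt: name -> level -> display text
LEVEL_TEXT = {
    "Cost of Phone": {1: "$199"},
    "Storage": {1: "64 GB"},
    "Color": {3: "Black"},
    "Contract": {1: "No contract"},
    "Battery Life": {4: "24 hours"},
}

expected = [LEVEL_TEXT[name][lvl] for name, lvl in zip(ATTRIBUTES, design_levels)]
shown = ["$199", "64 GB", "Black", "No contract", "24 hours"]  # copied from the survey page

# Flag any cell where the survey text differs from what the design file implies
for name, want, got in zip(ATTRIBUTES, expected, shown):
    status = "OK" if want == got else "MISMATCH"
    print("%s: expected %r, shown %r -> %s" % (name, want, got, status))
```

Any MISMATCH line points to a specific attribute whose piping should be rechecked against the design file and attribute list.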
Step 6: Complete checking the version of the DCM
Verify that each task in the version displays the proper item in the order designated by the design file.
Note: If there are any discrepancies between the design file, attribute list, and survey, this indicates a programming error. Resolve the errors and then retest the version.
Step 7: Repeat steps 1-6 for each Version of the DCM that you’re testing.