1: About Report Testing
When conducting report testing, you analyze how simulated data flows through your survey to ensure that logic, terminates, quotas, and similar elements are working properly. Before getting started, make sure that you’ve successfully completed a simulated data run.
When testing, any simulated data run through your project mimics the paths that actual respondents might take when the survey is live and displays in the reporting tools. The simulated data must obey the setup of the survey, respecting question logic, quotas, sample sources, languages, etc., which allows you to use the reporting tools for testing purposes.
Learn more: Simulated Testing
Note: For general testing purposes, it's good practice to let the data fall out naturally, without restrictions or custom test simulation configuration, in order to identify any issues or errors.
The Decipher Platform offers a number of reporting tools for helping you analyze your data. In this training we’re going to utilize those tools to perform quality assurance checks on our project before going live.
2: Testing with Crosstabs
The Crosstabs reporting system by Decipher provides a quick and easy way to view all of the data in your survey. You can create various crosstabs and splits to dissect your data and conduct more specific side-by-side comparisons of the information.
You can utilize the crosstabs reporting tool to verify that various components of your survey are functioning as expected (quotas are assigning properly, logic is working, etc.) after running simulated data. The goal of testing with crosstabs is to verify that the base of the table matches the base of the crosstabs segment that you’re comparing it to. Take a look at the following documents to learn more:
Tip: Save your crosstab reports for later! This helps save time when reviewing the soft launch data.
Learn More About Crosstabs:
- Crosstabs Overview
- Crosstabs Entry Page
- Configuring Crosstab Settings
- Running a Total Crosstab
- Running a Quick Split Crosstab
- Building a Crosstab
- Using XML to Edit Crosstabs
- Creating a Table Set
- Add/Edit Charts
- Editing Tables
- Statistical Testing in Crosstabs
- Viewing Crosstabs
- Create and Upload a Weighting Scheme
- Pinning Tables and the Whiteboard
- Saved Reports
Use Crosstabs Report Checklist
Use the Crosstabs Report Checklist as a guideline on what types of errors to look for when you conduct report testing in the Crosstabs report.
Click here to download the checklist
Checking Data in Report:
Compare the Questionnaire (QRE) to the Crosstabs reporting tool, paying special attention to quotas, logic, terminates, and survey paths. You can use simulated data and custom banners in crosstabs to verify that the logic is working properly.
Run the Crosstabs Total Report
Split by: Total Frequencies
Respondents: Qualified Only
Verify the base for all tables with no condition logic.
Verify that all terminated answer choices have 0 Total Respondents.
Verify that Rating questions include the following: Top 2 / Bottom 2 / Avg. (top 3 for scales over 7 points).
Verify that 2D checkbox questions are grouped appropriately by columns or by rows.
Verify that appropriate virtual questions are in place and correct.
Create Custom Crosstabs to check logic from QRE:
Check that the logic from the QRE holds in the data.
Verify that quota breakouts and quota tables match your segments.
Verify that autofill elements are collecting data as expected.
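The core crosstab checks above (terminated choices showing 0 qualified respondents, and Top 2 / Bottom 2 boxes on rating questions) can also be sanity-checked outside the report against an export of your simulated data. The sketch below is a minimal illustration in Python; the record layout and field names (`status`, `Q1`, `Q5_rating`) are hypothetical placeholders, not Decipher's actual export schema.

```python
from collections import Counter

# Hypothetical export of simulated respondent records; the field names
# ("status", "Q1", "Q5_rating") are assumptions for illustration only.
records = [
    {"status": "qualified", "Q1": "Yes", "Q5_rating": 5},
    {"status": "qualified", "Q1": "Yes", "Q5_rating": 4},
    {"status": "terminated", "Q1": "No", "Q5_rating": None},
    {"status": "qualified", "Q1": "Yes", "Q5_rating": 2},
]

TERMINATE_CHOICES = {"No"}  # answer choices that should terminate the survey

# 1. Terminated answer choices should have 0 qualified respondents.
qualified = [r for r in records if r["status"] == "qualified"]
answer_counts = Counter(r["Q1"] for r in qualified)
for choice in TERMINATE_CHOICES:
    assert answer_counts[choice] == 0, f"{choice!r} has qualified completes!"

# 2. Top-2 / Bottom-2 box shares for a 5-point rating question.
ratings = [r["Q5_rating"] for r in qualified]
top2 = sum(1 for v in ratings if v >= 4) / len(ratings)
bottom2 = sum(1 for v in ratings if v <= 2) / len(ratings)
print(f"Top 2: {top2:.0%}, Bottom 2: {bottom2:.0%}")
```

If the assertion on a terminate choice fails, a respondent with that answer slipped through as qualified, which points back at a logic or terminate error in the survey setup.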
3: Field Report
The Field Report shows the health of your survey, giving an overall view of what’s happening with your project. All data in the Field Report is real-time and keeps track of who is in the survey.
After running simulated data, you can use the Field Report to verify that quotas and terminates are working as desired. The Field Report indicates areas where respondents were not able to qualify for a quota and identifies any areas with high terminates.
Learn More About The Field Report
- About the Field Report
- Monitoring Completions in Field
- Accessing Click-Through Partial Data
- Monitoring Terminates in Field
- Monitoring Quotas in Field
- Editing Quota Limits in Field
- Adding Quota Alerts in the Field Report
- Quota Structure Download
- Monitoring Drop-outs in Field
- Sharing and Printing Field Statistics
Use Field Report Checklist
Use the Field Report Checklist as a guideline on what types of errors to look for when you conduct report testing on the field report.
Click here to download the checklist
Checking Field Report:
This report allows you to monitor the progress of your survey before, during and after it has launched.
Split the field report by list, segment, or variables if needed.
Quotas Tab:
Quota Limits and totals are set (if provided).
Run simulated data to fill all buckets completely. Check all quota segments for any non-qualifying data; for more information, click here.
Verify that all quota segments match the quota setup designated in the QRE.
Check for Non-Qualified Respondents on the Quota Sheet.
Terminate Tab:
The Terminate tab shows all possible terminates. If you see “Unspecified Terminates,” check your survey immediately for programming errors.
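The Terminate tab check amounts to tallying every terminate marker and flagging anything that didn't match a defined term condition. A minimal sketch of that tally, using hypothetical terminate labels (the `"Unspecified"` label stands in for terminates with no matching condition):

```python
from collections import Counter

# Hypothetical terminate markers pulled from simulated data; the labels
# here are placeholders, not Decipher's actual terminate identifiers.
terminates = ["term_age", "term_quota_full", "term_age", "Unspecified", "term_region"]

counts = Counter(terminates)
print(counts)

# Any unspecified terminate points at a programming error: a respondent
# hit a terminate page without matching a defined term condition.
unspecified = counts.get("Unspecified", 0)
if unspecified:
    print(f"WARNING: {unspecified} unspecified terminate(s); review survey logic.")
```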
4: Soft Launch Data Check
Fielding a study is completed in two steps. The first is the soft launch, reached when 10% or less of the overall quota has completed the survey. At this point, a data check is done to ensure there are no problems with the invite, link, study, etc. Once the data has been reviewed and everything is confirmed to look correct, the full launch, the second step of fielding a study, can begin.
The following section explores what to check when conducting a review of the soft launch data collected for your project.
Tip: Review the soft launch data for your project when you have between 30 and 50 respondents who have qualified and completed your survey. Collecting a larger base size before conducting the data check may give a more accurate picture of the fallout of the data.
Use Soft Launch Data Checklist
Use the Soft Launch Data Checklist as a guideline to review important parts of your survey and troubleshoot potential issues before proceeding with a full launch.
Click here to download the checklist
Soft Launch Data Check:
Review important parts of your survey and troubleshoot potential issues before proceeding with a full launch.
Re-check critical logic and segments and verify totals are matching in the report and quotas.
Verify virtuals are recording data properly.
Review the terminates for the survey and take note of particularly high terminates. Depending on the term, you may want to check with the sample provider to ensure that you’re accurately targeting your sample.
Review the dropouts for the survey and take note of particularly low completion rates, as these may indicate an area of frustration for respondents or a programming error in your survey. Depending on the dropout page, you may want to use manual testing to make sure that no issue is present.
Verify that there is data in all fields of the report and that the counts in the report match the counts in the Field Report.
Verify all required link variables are being passed and recorded properly (check virtual questions, segments, etc.).
Re-check the Quotas. Verify that setup and limits are correct according to the questionnaire.
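The soft launch checks above reduce to two comparisons: the same segments show the same completes in both reporting tools, and the qualified total sits in the soft-launch window (roughly 10% of the overall quota, with 30-50 completes as suggested earlier). A minimal sketch, with segment names, counts, and limits all hypothetical values entered by hand from the two reports:

```python
# Hypothetical counts read off the Crosstabs report and the Field Report;
# the segment names and numbers are placeholders for illustration.
crosstab_counts = {"quota_male": 18, "quota_female": 22, "total": 40}
field_report_counts = {"quota_male": 18, "quota_female": 22, "total": 40}
quota_limits = {"quota_male": 200, "quota_female": 200}

# 1. The two reports should agree segment by segment.
mismatches = [k for k in crosstab_counts
              if crosstab_counts[k] != field_report_counts.get(k)]
assert not mismatches, f"Counts disagree for: {mismatches}"

# 2. Soft-launch rule of thumb: about 10% of the overall quota,
#    with roughly 30-50 qualified completes before full launch.
overall_quota = sum(quota_limits.values())
soft_launch_target = int(overall_quota * 0.10)
print(f"Soft launch target: {soft_launch_target} completes")
assert 30 <= crosstab_counts["total"] <= 50
```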