
Basic Transform Task


About Basic Transform Task

The Basic Transform task lets you edit and validate your raw data, turning it into a simplified, compatible document. Data transformation can automatically unify your datasets so you can export them in the format you want to send to your partners and teams.

Qtip: This task is only compatible with ETL workflows.

Setting Up a Basic Transform Task

  1. Click the navigation menu in the top left.
  2. Choose Workflows.
  3. Click Create a workflow.
  4. Click Extract, transform, and load (ETL) data.
  5. If desired, rename your workflow by clicking the workflow name at the top.
  6. Set up your extractor task. This task provides the data that will be transformed in this workflow. For instructions on setting up extractor tasks, see Building ETL Workflows.
  7. Click Data transformation.
  8. Select Basic Transform from the menu.
  9. Choose the data source from the dropdown.
  10. Click Next.
  11. Edit the fields from your extractor task. There are four main elements for each field:

    • Required: Select the checkbox next to the field if data is required. If the checkbox is selected and a record has no data in that field, the record will be skipped.
    • Source field: All of the fields from the file in the extractor task. When you add a new field, select which column to pull the data from via the dropdown.
    • Destination field name: How the field names will look in the transformed dataset. If you would like to rename fields from the source file, edit the name in this column.
    • Field type: How the extracted data will be interpreted in the transformed dataset. The field type determines which transformation operations are available.
  12. Click the three dot menu on the right to transform your data, add validation rules, add conditional fields, or delete that field from the dataset. For more information on these options, see the sections below.
  13. If you would like to add an additional field to the dataset, click Add field.
    Qtip: You can reorder your fields by hovering your mouse to the left of the field’s name, clicking the icon that appears, and then dragging the field. Reordering fields in the Basic Transform task is the only way to guarantee the order of fields in ETL workflows.
  14. Review all of your transformations in the Applied transformations tab. For more information on each transformation, click on a transformation. Click the three dot menu to edit or delete.

    Qtip: The operations are carried out in the order that they appear in the applied transformations tab, which is the order in which they are configured. If multiple operations are set up on the same field, subsequent operations will be calculated on the results of previous ones.
  15. Review all of your validation rules in the Validation rules tab. For more information on each validation rule, click on a validation rule. Click the three dot menu to edit or delete.
  16. Click Next.
  17. Review the summary of your transformations and validations. Click Save when ready.
Qtip: If the transformation encounters an error (like invalid formatting for a field), then all subsequent operations in that row will be skipped.
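The ordering and error-skipping behavior described in the Qtips above can be sketched in plain Python. This is an illustrative model only, not Qualtrics code: operations on a field run in the order they were configured, each consuming the previous result, and a failure skips the remaining operations in that row.

```python
# Hypothetical model of how applied transformations run on one field value.
def apply_transformations(value, operations):
    """Apply each operation in configuration order; stop at the first failure."""
    for op in operations:
        try:
            value = op(value)
        except (ValueError, TypeError):
            # Invalid formatting for this field: remaining operations in
            # this row are skipped, mirroring the Qtip above.
            return value
    return value

# Two operations on the same field: the suffix sees the uppercased result.
ops = [str.upper, lambda v: v + "!"]
print(apply_transformations("hello", ops))  # HELLO!
```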

Data Transformation

The transformation operations available for your field depend on the destination field type.

The following transformations are available for text value, text set, and open text fields:

  • Concatenate: Combine values from multiple fields together.
  • Split: Split the original field into multiple fields that are separated by a delimiter.
  • Add prefix: Add text to the front of each value.
  • Add suffix: Add text to the end of each value.
  • Format case: Apply a case format (e.g. UPPERCASE) to all values in the field.
  • Replace value: Find and replace values in the field.
  • Trim value: Delete whitespace at either end of the text in the field.
  • Split by number of characters: Split the text into two fields by specifying the number of characters from the beginning or end.
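As an illustration only (these are assumed string semantics, not Qualtrics internals), the text transformations behave roughly like the following helpers:

```python
# Illustrative equivalents of the text transformations above.
def concatenate(*values, sep=""):  return sep.join(values)
def split(value, delimiter):       return value.split(delimiter)
def add_prefix(value, prefix):     return prefix + value
def add_suffix(value, suffix):     return value + suffix
def format_case(value):            return value.upper()   # e.g. UPPERCASE
def replace_value(value, old, new): return value.replace(old, new)
def trim_value(value):             return value.strip()   # whitespace at either end
def split_by_chars(value, n):      return value[:n], value[n:]  # from the beginning

print(trim_value("  ab cd  "))       # ab cd
print(split_by_chars("abcdef", 2))   # ('ab', 'cdef')
```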

The following transformations are available for number value and number set fields:

  • Addition transformation: Add a value to each number in the field or add two fields together.
  • Subtraction transformation: Subtract a value from each number in the field or subtract one field from another.
  • Multiplication transformation: Multiply each number in the field by a value or multiply two fields together.
  • Division transformation: Divide each number in the field by a value or divide one field by another.
  • Integer division transformation: Divide each number in the field by a value or divide one field by another. Only the whole number (integer) is returned.
  • Absolute value: Get the absolute value of each value in the field.
  • Round value: Round all values in the field either up or down.
  • Average of values: Take the average of values in selected fields for each response.
  • Median of values: Take the median of values in selected fields for each response.
  • Sum of values: Take the sum of values in selected fields for each response.
  • Minimum of values: Take the minimum of values in selected fields for each response.
  • Maximum of values: Take the maximum of values in selected fields for each response.
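A few of the numeric transformations above, sketched with plain Python as an assumption of their semantics (each response supplies the selected fields as a list of numbers):

```python
import math
import statistics

def integer_division(value, divisor):
    return value // divisor              # keeps only the whole number

def round_value(value, direction="down"):
    # Round all values either up or down, per the transformation above.
    return math.ceil(value) if direction == "up" else math.floor(value)

def average_of(values):
    return statistics.mean(values)       # average of selected fields per response

print(integer_division(17, 5))           # 3
print(round_value(2.4, direction="up"))  # 3
print(average_of([3, 7, 8, 9]))          # 6.75
```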

The following transformations are available for date fields:

  • Date calculation: Returns how much time has passed between the source date and another date.
  • Date/time offset: Offset the date/time by a specified time value. For example, you could provide the date for 5 days after an interaction.
Qtip: When using offset, 1 day in the future and 1 day in the past are both considered to be today.
Qtip: Our system uses the UTC time zone. All date calculations and offsets will be based on UTC.
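A date/time offset can be modeled as adding a fixed duration to the source date, with everything evaluated in UTC per the Qtip above. This is a rough sketch under those assumptions, not Qualtrics internals:

```python
from datetime import datetime, timedelta, timezone

def offset(source, days=0, hours=0):
    """Shift a source date/time by a fixed amount, evaluated in UTC."""
    return source + timedelta(days=days, hours=hours)

# e.g. the date 5 days after an interaction
interaction = datetime(2024, 3, 10, 15, 0, tzinfo=timezone.utc)
follow_up = offset(interaction, days=5)
print(follow_up.date())  # 2024-03-15
```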

When entering a custom date / time format, these are your formatting options:

  • Year: yy, yyyy
  • Month: M, MM, MMM, MMMM
  • Day: d, dd
  • Day of year: DDD
  • Hour of half day: K, KK
  • Hour of day: HH
  • Minute: m, mm
  • Second: s, ss
  • Fractional second: S, SS, SSS
  • Time zone offset: ZZZZ, ZZ:ZZ (can include +/-)
  • Half day of day: a
  • Day of week: E, EE
  • Time zone format: z
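The tokens above resemble Java-style date patterns. As an illustration only, here is a rough mapping from a few of them to Python strftime directives; the mapping table is an assumption for demonstration, not an official equivalence:

```python
from datetime import datetime, timezone

# Rough, partial translation of the pattern tokens to Python strftime codes.
TOKEN_TO_STRFTIME = {
    "yyyy": "%Y", "yy": "%y",
    "MM": "%m", "MMM": "%b", "MMMM": "%B",
    "dd": "%d", "HH": "%H", "mm": "%M", "ss": "%S",
    "a": "%p",   # half day of day (AM/PM)
    "E": "%a",   # day of week
    "z": "%Z",   # time zone format
}

def render(tokens, dt):
    return dt.strftime(" ".join(TOKEN_TO_STRFTIME[t] for t in tokens))

dt = datetime(2024, 3, 10, 15, 4, 5, tzinfo=timezone.utc)
print(render(["yyyy", "MM", "dd"], dt))  # 2024 03 10
```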

 

Data Validation

Adding validation rules allows you to create criteria and conditions that ensure the data in your fields has the same format. For example, you may want to eliminate any records with an invalid email address.

  1. Click the three dot menu then click Add validation rule.
  2. Select only include records that are not empty if you would like to omit a record when this field is empty. This option behaves the same as the Required checkbox in the transform table.
  3. Select enforce record data type, only including records that can be cast to the specified type to validate based on the format of the data. For example, for a text value you can specify whether the data should be in number or text format.

    • Should warn on failure instead of skip: Enabling this option will include records that do not meet the validation criteria in the dataset. A warning will be displayed in the row report indicating that the response failed this rule. Leaving this unchecked will skip the record so that it is not included in the new dataset.
  4. Add Validation Rules, if you’d like. For more information, see the section below.
  5. Click Apply.
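The options above can be sketched as a small decision function. This is a hypothetical model (the names and return values are illustrative, not an API): a record either passes, is kept with a warning, or is skipped.

```python
def validate(value, require_non_empty=False, cast_type=None, warn_on_failure=False):
    """Return 'keep', 'warn', or 'skip' for a single field value."""
    ok = True
    if require_non_empty and (value is None or value == ""):
        ok = False                       # same effect as the Required checkbox
    if ok and cast_type is not None:
        try:
            cast_type(value)             # e.g. float for "should be a number"
        except (ValueError, TypeError):
            ok = False
    if ok:
        return "keep"
    # "Should warn on failure instead of skip" keeps the record with a warning.
    return "warn" if warn_on_failure else "skip"

print(validate("42", cast_type=float))                         # keep
print(validate("n/a", cast_type=float))                        # skip
print(validate("n/a", cast_type=float, warn_on_failure=True))  # warn
```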

VALIDATION RULES

You can add validation rules to conditionally validate the data in your field. For more information on building conditions, see Basics of Building Conditions.

  1. Select what will happen to the record upon condition failure. You can choose to either skip the record so that it is not included in the dataset or include it with a warning.
  2. Indicate whether all or any of the rules need to be true in order to validate the condition.
  3. Create your condition using the operator dropdown and textbox. This determines what the record in this field needs to satisfy in order for the condition to pass.
    Qtip: To create another condition or delete an existing condition, click the three dot menu on the right.
  4. Add another validation rule, if you’d like.
    Qtip: Click the trash can to delete a validation rule.
  5. Click Apply.

Conditional Fields

Conditional fields are created by setting up logic conditions that use other fields to determine how the new field is created. Add a conditional field from the three dot menu to the right of the source fields. For more information on building conditions, see Basics of Building Conditions.


CONDITIONAL FIELD

A conditional field creates a new field based on a single logic condition.

  1. Enter the New field name.
  2. Choose an operator for your condition.
  3. Enter the value to determine how the condition is met.
  4. Enter what will be returned if the rule created in steps 2 and 3 is true.
  5. Enter what will be returned if the rule created in steps 2 and 3 is false.
  6. Click Apply when finished.
Example: Let’s say you want to create a field for “Consenter” or “Non Consenter” based on how respondents answer the question: “Do you consent to taking this survey?” You can create a conditional field with the logic: If “Yes” is selected, then the value is “Consenter”.
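The consent example can be sketched as a simple if/else. The task itself is configured in the UI, not in code; the names below are assumptions for illustration:

```python
def conditional_field(answer, match="Yes",
                      true_value="Consenter", false_value="Non Consenter"):
    """One operator + value rule, with a result for true and for false."""
    return true_value if answer == match else false_value

print(conditional_field("Yes"))  # Consenter
print(conditional_field("No"))   # Non Consenter
```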

ADVANCED CONDITIONAL FIELD

An advanced conditional field creates a new field based on multiple logic conditions or condition sets.

  1. Enter the New field name.
  2. Indicate whether any or all of the rules need to be true in order to validate the condition.
  3. Create a condition. For more information, see the Conditional Field section above.
  4. To create a new condition set, click Add branch.
    Qtip: There is a limit of 4 branches.
  5. Click Apply when finished.
Example: Let’s say you want to create a field for “Promoters” or “Detractors” based on how respondents answer your 10 point satisfaction question. Create an advanced conditional field with the logic: If Q2 is “greater than or equal to 8” then the value is “Promoter”, but if Q2 is “less than or equal to 5” then the value is “Detractor”.
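The promoter/detractor example can be modeled as a chain of branches evaluated in order. The field name Q2 and the fall-through value are assumptions for illustration, not Qualtrics code:

```python
def advanced_conditional(q2, default=None):
    """Evaluate branches in order, like condition sets in the UI (limit of 4)."""
    branches = [
        (lambda v: v >= 8, "Promoter"),
        (lambda v: v <= 5, "Detractor"),
    ]
    for condition, value in branches:
        if condition(q2):
            return value
    return default  # no branch matched (e.g. passives on a 10 point scale)

print(advanced_conditional(9))  # Promoter
print(advanced_conditional(3))  # Detractor
```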

Transformed File and Reports

The Basic Transform task generates the transformed file along with two reports to help you understand your transformed dataset. In the Run History tab of your workflow editor, click the Basic Transform task within the workflow run to see the files on the right side. These files can be exported by clicking the download icon to their right.


  • Transformation Report: Lists all of the transformations that were carried out on records in the source file, along with details such as each record’s name, row number, status, and failure reason.
  • Transformed File: The result of the data transformation containing the new dataset with all of the transformations and validations applied.
  • Validation Report: Report of any validations that were performed, including whether the record was excluded and the validation that it failed.

Troubleshooting

  • If the file produced by the Basic Transform task is larger than 1 GB, the workflow will fail.
  • Depending on how you choose to transform your data, the Basic Transform task can add more data to your file. If the file you’re extracting is already close to the 1 GB limit, keep in mind that your transformations can make your file too large to process.