---
title: Databricks Integration
sidebar_label: Databricks
pagination_label: Databricks Integration
description: Information about integrating with UID2 through Databricks.
hide_table_of_contents: false
sidebar_position: 04
displayed_sidebar: docs
---

import Link from '@docusaurus/Link';

# Databricks Clean Rooms Integration Guide

This guide is for advertisers and data providers who want to convert their user data to raw UID2s in a Databricks environment.

## Integration Overview

This solution enables you to securely share consumer identifier data without exposing sensitive <Link href="../ref-info/glossary-uid#gl-dii">directly identifying information (DII)</Link>, by processing your data in an instance of the [Databricks Clean Rooms](https://docs.databricks.com/aws/en/clean-rooms/) feature. This feature provides a secure and privacy-protecting environment for working on sensitive data.

When you've set up the Databricks Clean Rooms environment, you establish a trust relationship with the UID2 service and allow the service to convert the data you share in the clean room to raw UID2s.

## Functionality

The following table summarizes the functionality available with the UID2 Databricks integration.

| Encrypt Raw UID2 to UID2 Token for Sharing | Decrypt UID2 Token to Raw UID2 | Generate UID2 Token from DII | Refresh UID2 Token | Map DII to Raw UID2s |
| :--- | :--- | :--- | :--- | :--- |
| — | — | — | — | ✅ |

## Key Benefits

Here are some key benefits of integrating with Databricks for your UID2 processing:

- Native support for managing UID2 workflows within a Databricks data clean room.
- Secure identity interoperability between partner datasets.
- Direct lineage and observability for all UID2-related transformations and joins, for auditing and traceability.
- Streamlined integration between UID2 identifiers and The Trade Desk activation ecosystem.
- Self-service support for marketers and advertisers through Databricks.

## Integration Steps

At a high level, the following are the steps to set up your Databricks integration and process your data:

1. [Create a clean room for UID2 collaboration](#create-clean-room-for-uid2-collaboration).
1. [Send your Databricks sharing identifier to your UID2 contact](#send-sharing-identifier-to-uid2-contact).
1. [Add data to the clean room](#add-data-to-the-clean-room).
1. [Map DII](#map-dii) by running the clean room notebook.

### Create Clean Room for UID2 Collaboration

As a starting point, create a Databricks Clean Rooms environment: a secure space where you collaborate with UID2 to process your data.

Follow the steps in [Create clean rooms](https://docs.databricks.com/aws/en/clean-rooms/create-clean-room) in the Databricks documentation. Use the correct sharing identifier for the [UID2 environment](../getting-started/gs-environments) you want to connect to: see [UID2 Sharing Identifiers](#uid2-sharing-identifiers).

:::important
After you've created a clean room, you cannot change its collaborators. If you have the option to set clean room collaborator aliases (for example, if you're using the Databricks Python SDK to create the clean room), your collaborator alias must be `creator` and the UID2 collaborator alias must be `collaborator`. If you're creating the clean room using the Databricks web UI, the correct collaborator aliases are set for you.
:::

### Send Sharing Identifier to UID2 Contact

Before you can use the clean room notebook, you'll need to send your Databricks sharing identifier to your UID2 contact.

The sharing identifier is a string in this format: `<cloud>:<region>:<uuid>`.

Follow these steps:

1. Find the sharing identifier for the Unity Catalog metastore that is attached to the Databricks workspace where you'll work with the clean room.

   For information on how to find this value, see [Finding a Sharing Identifier](#finding-a-sharing-identifier).
1. Send the sharing identifier to your UID2 contact.

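Before sending the identifier, you can optionally sanity-check its shape. The following is a minimal Python sketch that checks a string against the `<cloud>:<region>:<uuid>` format described above; the pattern also allows extra colon-separated segments, and the example value is hypothetical.

```python
# Minimal sanity check for the <cloud>:<region>:<uuid> sharing identifier shape.
# The example value is hypothetical; extra colon-separated segments are allowed.
import re

SHARING_ID_PATTERN = re.compile(
    r"^[a-z]+:[a-z0-9-]+:[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}(:\S+)*$"
)

def looks_like_sharing_identifier(value: str) -> bool:
    """Return True if the string matches the expected sharing identifier shape."""
    return bool(SHARING_ID_PATTERN.match(value.strip()))

print(looks_like_sharing_identifier("aws:us-east-2:11111111-2222-3333-4444-555555555555"))  # True
```
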
### Add Data to the Clean Room

Add one or more tables or views to the clean room. You can use any names for the schema, tables, and views. Tables and views must follow the schema detailed in [Input Table](#input-table).

### Map DII

Run the `identity_map_v3` Databricks Clean Rooms [notebook](https://docs.databricks.com/aws/en/notebooks/) to map email addresses, phone numbers, or their respective hashes to raw UID2s.

A successful notebook run results in raw UID2s populated in the output table. For details, see [Output Table](#output-table).

## Running the Clean Rooms Notebook

This section provides details to help you use your Databricks Clean Rooms environment to process your DII into raw UID2s, including the following:

- [Notebook Parameters](#notebook-parameters)
- [Input Table](#input-table)
- [DII Format and Normalization](#dii-format-and-normalization)
- [Output Table](#output-table)
- [Output Table Schema](#output-table-schema)

### Notebook Parameters

You can use the `identity_map_v3` notebook to map DII in any table or view that you've added to the `creator` catalog of the clean room.

The notebook has two parameters, `input_schema` and `input_table`. Together, these two parameters identify the table or view in the clean room that contains the DII to be mapped.

For example, to map DII in the clean room table named `creator.default.emails`, set `input_schema` to `default` and `input_table` to `emails`.

| Parameter Name | Description |
| :--- | :--- |
| `input_schema` | The schema that contains the table or view. |
| `input_table` | The name of the table or view that contains the DII to be mapped. |

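To illustrate how the two parameters combine, here's a small Python sketch (the helper name is just for illustration) that builds the fully qualified clean room name the notebook reads from.

```python
# Illustration only: input_schema and input_table together identify the
# clean room table or view as creator.<input_schema>.<input_table>.
def clean_room_table_name(input_schema: str, input_table: str) -> str:
    return f"creator.{input_schema}.{input_table}"

print(clean_room_table_name("default", "emails"))  # creator.default.emails
```
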
### Input Table

The input table or view must have the two columns shown in the following table. It can include additional columns, but the notebook uses only these two.

| Column Name | Data Type | Description |
| :--- | :--- | :--- |
| `INPUT` | string | The DII to map. |
| `INPUT_TYPE` | string | The type of DII to map. Allowed values: `email`, `email_hash`, `phone`, and `phone_hash`. |

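For illustration, here's a minimal PySpark sketch that builds an input table with the required columns. The catalog, schema, and table names are placeholders; use the names of the assets you actually add to the clean room.

```python
# Minimal sketch: create an input table with the required INPUT and INPUT_TYPE columns.
# In a Databricks notebook, `spark` already exists; getOrCreate() simply returns it.
# The catalog/schema/table names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

rows = [
    ("user@example.com", "email"),   # email addresses are normalized by the notebook
    ("+12345678901", "phone"),       # phone numbers must be normalized before mapping
]
df = spark.createDataFrame(rows, schema="INPUT string, INPUT_TYPE string")

# Save as a table that you can then add to the clean room.
df.write.mode("overwrite").saveAsTable("main.default.uid2_input")
```
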
### DII Format and Normalization

The normalization requirements depend on the type of DII you're processing, as follows:

- **Email address**: The notebook automatically normalizes the data using the UID2 [Email Address Normalization](../getting-started/gs-normalization-encoding#email-address-normalization) rules.
- **Phone number**: You must normalize the phone number before mapping it with the notebook, using the UID2 [Phone Number Normalization](../getting-started/gs-normalization-encoding#phone-number-normalization) rules.

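For example, here's a minimal Python sketch of one way to pre-normalize phone numbers before loading them into the input table. It assumes the third-party `phonenumbers` package and a known default region; check the results against the UID2 Phone Number Normalization rules rather than treating this as a reference implementation.

```python
# Minimal sketch: normalize phone numbers before adding them to the input table.
# Assumes the third-party `phonenumbers` package (pip install phonenumbers) and that
# numbers without a country code belong to a known default region ("US" here).
import phonenumbers

def to_e164(raw: str, default_region: str = "US") -> str | None:
    """Return the E.164 form of a phone number, or None if it can't be parsed or is invalid."""
    try:
        parsed = phonenumbers.parse(raw, default_region)
    except phonenumbers.NumberParseException:
        return None
    if not phonenumbers.is_valid_number(parsed):
        return None
    return phonenumbers.format_number(parsed, phonenumbers.PhoneNumberFormat.E164)

print(to_e164("(212) 555-0123"))  # +12125550123
```
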
### Output Table

If the clean room has an output catalog, the mapped DII is written to a table in the output catalog. Output tables are stored for 30 days.

For details, see [Overview of output tables](https://docs.databricks.com/aws/en/clean-rooms/output-tables#overview-of-output-tables) in the Databricks documentation.

### Output Table Schema

The following table provides information about the structure of the output data, including field names and values.

| Column Name | Data Type | Description |
| :--- | :--- | :--- |
| `UID` | string | The value is one of the following:<ul><li>**DII was successfully mapped**: The UID2 associated with the DII.</li><li>**Otherwise**: `NULL`.</li></ul> |
| `PREV_UID` | string | The value is one of the following:<ul><li>**DII was successfully mapped and the current raw UID2 was rotated in the last 90 days**: The previous raw UID2.</li><li>**Otherwise**: `NULL`.</li></ul> |
| `REFRESH_FROM` | timestamp | The value is one of the following:<ul><li>**DII was successfully mapped**: The timestamp indicating when this UID2 should be refreshed.</li><li>**Otherwise**: `NULL`.</li></ul> |
| `UNMAPPED` | string | The value is one of the following:<ul><li>**DII was successfully mapped**: `NULL`.</li><li>**Otherwise**: The reason why the identifier was not mapped: `OPTOUT`, `INVALID IDENTIFIER`, or `INVALID INPUT TYPE`.<br/>For details, see [Values for the UNMAPPED Column](#values-for-the-unmapped-column).</li></ul> |

#### Values for the UNMAPPED Column

The following table shows possible values for the `UNMAPPED` column in the output table schema.

| Value | Meaning |
| :--- | :--- |
| `NULL` | The DII was successfully mapped. |
| `OPTOUT` | The user has opted out. |
| `INVALID IDENTIFIER` | The email address or phone number is invalid. |
| `INVALID INPUT TYPE` | The value of `INPUT_TYPE` is invalid. Valid values for `INPUT_TYPE` are: `email`, `email_hash`, `phone`, `phone_hash`. |

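As an illustration, here's a minimal PySpark sketch that splits an output table into mapped, unmapped, and refresh-due rows using the columns described above. The output table name is a placeholder; use the table that appears in your clean room's output catalog.

```python
# Minimal sketch: split the notebook's output into mapped, unmapped, and refresh-due rows.
# The table name is a placeholder; use the table in your clean room's output catalog.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

out = spark.table("output_catalog.default.uid2_mapping_output")

mapped = out.filter(F.col("UNMAPPED").isNull())        # UID holds the raw UID2
unmapped = out.filter(F.col("UNMAPPED").isNotNull())   # OPTOUT, INVALID IDENTIFIER, or INVALID INPUT TYPE
refresh_due = mapped.filter(F.col("REFRESH_FROM") <= F.current_timestamp())  # remap these rows soon

print(mapped.count(), unmapped.count(), refresh_due.count())
```
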
## Testing in the Integ Environment

If you'd like to test the Databricks Clean Rooms implementation before signing a UID2 POC, you can ask your UID2 contact for access in the integ (integration) environment. This environment is for testing only, and has no production data.

In the request, include your sharing identifier.

While you're waiting to hear back, you can complete the following actions:

- Create the clean room, using the UID2 sharing identifier for the integration environment.
- Put your assets into the clean room.

For details, see [Integration Steps](#integration-steps).

When your access is ready, your UID2 contact notifies you.

## Reference

This section includes the following reference information:

- [UID2 Sharing Identifiers](#uid2-sharing-identifiers)
- [Finding a Sharing Identifier](#finding-a-sharing-identifier)

### UID2 Sharing Identifiers

UID2 sharing identifiers can change. Before creating a new clean room, check this section to make sure you have the latest sharing identifier.

| Environment | UID2 Sharing Identifier |
| :--- | :--- |
| Production | `aws:us-east-2:21149de7-a9e9-4463-b4e0-066f4b033e5d:673872910525611:010d98a6-8cf2-4011-8bf7-ca45940bc329` |
| Integration | `aws:us-east-2:4651b4ea-b29c-42ec-aecb-2377de70bbd4:2366823546528067:c15e03bf-a348-4189-92e5-68b9a7fb4018` |

### Finding a Sharing Identifier

To find the sharing identifier to send to your UID2 contact, follow these steps:

1. In your Databricks workspace, in the Catalog Explorer, click **Catalog**.
1. At the top, click the gear icon and select **Delta Sharing**.
1. On the **Shared with me** tab, in the upper right, click your Databricks sharing organization and then select **Copy sharing identifier**.

For details, see [Request the recipient's sharing identifier](https://docs.databricks.com/aws/en/delta-sharing/create-recipient#step-1-request-the-recipients-sharing-identifier) in the Databricks documentation.