Release notes
v4.0 - 21/11/24
New features
- It is now possible to create rule-based audiences in the Media DCR. Combine and filter any audiences in the Media DCR for even more targeting flexibility.
- Completely revamped user interface: The UI is now cleaner and more consistent across the platform.
- It is now possible to share datasets with other users in your organisation through the Datasets page.
- We deprecated the keychain and instead added email-OTP to the authentication flow. This requires a migration; see this page for more details.
- Change in the Media DCR insights logic: Affinity for a given segment is now calculated as “Share of users in this segment in the overlap” / “Share of users in this segment in all matchable publisher users”. Previously the denominator was “Share of users in this segment in all addressable publisher users”.
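To make the new ratio concrete, here is a minimal worked example in Python with purely hypothetical counts (the variable names and numbers are illustrative, not platform values):

    # Hypothetical counts for one segment (illustrative only)
    segment_in_overlap = 1_200      # segment users who are in the advertiser/publisher overlap
    overlap_total = 20_000          # all users in the overlap
    segment_in_matchable = 45_000   # segment users among all matchable publisher users
    matchable_total = 1_500_000     # all matchable publisher users

    share_in_overlap = segment_in_overlap / overlap_total        # 0.06
    share_in_matchable = segment_in_matchable / matchable_total  # 0.03
    affinity = share_in_overlap / share_in_matchable             # 2.0: over-represented in the overlap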
Bugfixes
- Data Lab charts now render better when there are many segments.
- Fixed a bug that prevented advertisers from using single-column audience lists in the Media DCR.
- Various minor UI fixes across the platform.
Enclave versions
In addition to the enclaves deployed in the last release:
- Driver enclave
- Identifier: decentriq.driver:mrsigner
- Hash: 326d4225aad2a886d9cc7d6fcb79435b8e996bc4c6982f87157804ce378699f4
- Azure Blob Storage worker
- Identifier: decentriq.azure-blob-storage-worker:v4
- Hash: 7b13d5830e8b6ff93f48b106496c344f9a827c127a501f46788c03bf1f45fc79
- Dataset Sink worker
- Identifier: decentriq.dataset-sink-worker:v7
- Hash: 0fbcc3e43572c4a1ec005b1316c9be0c1cfdd5fc986af1a3e4ce34ae3e385ea1
- Data Source S3 worker
- Identifier: decentriq.data-source-s3-worker:v7
- Hash: b908e575ca1c299e1ec88a2fc2d578c88d67bad0d68072139166b6914b809d79
- Data Source Snowflake worker
- Identifier: decentriq.data-source-snowflake-worker:v6
- Hash: 966d8a4028a329f378738bf4513c1efa64ed39d77758a48698d581af7b7e9296
- Google DV360 sink worker
- Identifier: decentriq.google-dv-360-sink-worker:v6
- Hash: f10977527c66b63f87d90d0325cf952007b2ec3c64c3080a177b6eaee56f28d4
- Meta sink worker
- Identifier: decentriq.meta-sink-worker:v6
- Hash: 98034826e2766ab59cd4e8395e6c0150af58e02e573596e0e10956ea298a2f96
- Permutive worker
- Identifier: decentriq.permutive-worker:v2
- Hash: cf0f465a80ff76c4dedfd386798187ffedbfe46ab0a794f9858f99565a9aa612
- Salesforce worker
- Identifier: decentriq.salesforce-worker:v3
- Hash: e4fe508770580b06fb36cc118078686109386f58386da0da7964c31958d8bcdf
- SQL worker
- Identifier: decentriq.sql-worker:v13
- Hash: 5632503c8cba4de2b90631cd6f33ae2d91743a6c0c7b4a9c4d30082b26e1156a
- R-ML worker
- Identifier: decentriq.r-ml-worker-32-32:v2
- Hash: 8cd87f95bd5bb8be0865de4f1a91324cbbb08b4deb0db6f4e18c6f2be9282a3a
- Synthetic data worker
- Identifier: decentriq.python-synth-data-worker-32-64:v19
- Hash: 779775a9500ae33da375256421cc9168ecb2f78459469136c5f9de4744fbc808
Python SDK
- The Python SDK was updated to version 0.34.0.
v3.21 - 26/09/24
New features
- We have added support for integrating SAML SSO via customer managed IDP solutions.
v3.20 - 30/08/24
New features
- We added support for defining custom Python environments in Advanced Analytics DCRs. Packages in these environments can then be used in Python nodes of this DCR. This feature is currently only supported for DCRs created via the Python SDK.
- We added a data output connector to Splicky DSP
- We added a data output connector to Microsoft DSP
Bugfixes
- Fixed an issue with the Google Ad Manager connector (introduced by Google changing their API)
- Fixed various smaller bugs
Enclave versions
In addition to the enclaves deployed in the last release:
- Added Microsoft DSP connector
- Identifier: decentriq.microsoft-dsp-worker:v1
- Hash: 06430e9c7ebe4ac98d32e2511c46492518f97987499ae9b4b2ede74d00c5218f
- New version of the S3 connector
- Identifier: decentriq.s3-sink-worker:v9
- Hash: 6b11170435f4f0f8bbfa99a6215b93dc308f54cacd8f43638468e3f2081a8893
- New version of the Python ML worker
- Identifier: decentriq.python-ml-worker-32-64:v25
- Hash: c9485c10a958f15eddc4a123a47e8845a6ae832951dd7d800175fbc8042b53bd
Python SDK
- The Python SDK was updated to version 0.33.0.
v3.19 - 30/07/24
New features
- Media DCR
- Media DCR can compute over even larger datasets (tested with 140 GB of publisher data, but expected to work beyond)
- Adjusted aggregation thresholds in Media DCR to allow somewhat more fine-grained results
- Added model quality visualisations when computing lookalike audiences
- Renamed Retargeting to Remarketing in order to be better aligned with the market’s expectation
- It is now possible to upload files in the Dataset Portal directly. If it’s a tabular file (CSV or XLSX), the schema is autodetected (and can be modified).
- Added Adform DSP to the export destinations
- Completely overhauled the documentation at https://docs.decentriq.com
- Emails are now also sent when an organisation admin invites an internal user
- Various improvements to the UI
Bugfixes
- Fixed a bug in the Age/Gender calculation in the Data Lab that caused rows where one of the values was NULL to be dropped
- Fixed some smaller UI bugs in the Python computation
- Fixed a bug where files uploaded to File nodes were shown as Tabular type
Enclave versions
In addition to the enclaves deployed in the last release:
- Added Adform DSP connector
- Identifier: decentriq.adform-dsp-worker:v1
- Hash: b760181a0b1d4a582586c402bb4e335b6e2efd6ebd6922641076afe420e7c359
- Changes to the S3 connector
- Identifier: decentriq.s3-sink-worker:v8
- Hash: 7fa49351d3f3ffecd7e0c4155b6b42e4b35e02be25e6eeef1e099d58c2c5377b
Python SDK
- The Python SDK was updated to version 0.32.0.
v3.18.2 - 05/07/24
New features
- Updated Analytics DCR: The analytics DCR has been updated to use a new validation pipeline that can handle much bigger datasets. This new version of the analytics DCR can be built using both the Decentriq Web UI, as well as the Python SDK (v0.31.1 and greater).
Enclave versions
No new enclaves have been released.
Python SDK
- The Python SDK was updated to version 0.31.1.
v3.18.1 - 04/07/24
New features
- Addition of encrypted temporary storage: Python workers starting from version v24 will have access to a large (800 gigabytes) and very fast encrypted volume that can be used for storing temporary data that would otherwise not fit into memory. This volume is available to every user-defined Python script by writing your data to the directory /scratch. Data written to that location will always be encrypted and deleted after the script has been run.
- New data lab: We reworked the internal data lab implementation to make it both faster and compatible with much larger datasets (100 GB+).
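A minimal sketch of how a user-defined Python script might use the /scratch volume described above (the file name and the use of pandas are assumptions for illustration):

    import pandas as pd

    # Write an intermediate result to the encrypted temporary volume instead of holding it in memory
    intermediate = pd.DataFrame({"id": range(1_000_000)})
    intermediate.to_csv("/scratch/intermediate.csv", index=False)

    # Later steps of the same script can read it back; the volume is wiped after the script finishes
    restored = pd.read_csv("/scratch/intermediate.csv")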
Enclave versions
- New Python worker
- Identifier: decentriq.python-ml-worker-32-64:v24
- Hash: fb3f8ffc47d568f241e57800d895318eba71a1edc836b032bcc85cd30fccd17b
Python SDK
- The Python SDK was updated to version 0.30.0.
v3.18 - 27/06/24
New features
- Lookalike model quality: The lookalike audience creation dialog now includes numeric indicators and a visual representation of model quality. Brands can see how choosing a larger lookalike audience affects precision in real time. Publishers and Decentriq can view model performance summary reports to diagnose data issues.
- Conversion measurement: There is a new dedicated clean room creation workflow for conversion measurement that requires no coding to set up. It supports cross-publisher attribution and offers popular rules like last click, even credit, and U-shape. Clickthrough and viewthrough lookback windows are independently configurable.
- Exclusion targeting: Media clean rooms now support exclusion targeting. Brands can target all users meeting criteria except those on a blocklist.
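Purely as a conceptual illustration of the last-click rule mentioned under Conversion measurement above (this is not the platform's implementation; the data and field names are hypothetical):

    from datetime import datetime

    # Hypothetical touchpoints and conversion for one user
    touchpoints = [
        {"publisher": "pub_a", "type": "click", "time": datetime(2024, 6, 1, 10)},
        {"publisher": "pub_b", "type": "view", "time": datetime(2024, 6, 2, 9)},
        {"publisher": "pub_b", "type": "click", "time": datetime(2024, 6, 3, 18)},
    ]
    conversion_time = datetime(2024, 6, 4, 12)

    # Last click: credit the most recent click that happened before the conversion
    clicks = [t for t in touchpoints if t["type"] == "click" and t["time"] < conversion_time]
    credited = max(clicks, key=lambda t: t["time"])["publisher"] if clicks else None
    # credited == "pub_b"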
Bugfixes
- Fixed an issue where editable fields may be small, and only show part of the text. This change is most notable while editing long clean room names in draft mode — you can now see much more of a long name while editing, especially on higher resolutions.
- Fixed an issue where long column names may overflow beyond the edge of the window.
- Fixed an issue where users may not see the deletion confirmation dialog when deleting test datasets in draft mode.
- Fixed an issue where some text, especially in sidebars, was not easily legible on very low or very high resolutions.
Enclave versions
- No enclave changes, please refer to v3.15 below.
Python SDK
- The Python SDK was updated to version 0.29.1.
- Direct data connectors: Data connectors can now be attached directly to a computation, allowing data to be sent to an external location such as a cloud provider or DSP without being stored locally by Decentriq or the computation result owner. The scripts should additionally now be easier to read and more concise.
v3.17 - 30/05/24
New features
- The Media DCR can now be configured to allow exporting of the audience by the advertiser.
- The Media DCR now shows overlap statistics even if the Insights collaboration type is not selected.
- We introduced an optional data partner role in the Media DCR. If added, such a user will provide the seed audience data on behalf of the advertiser.
- We introduced a data partner portal which allows organizations acting as data partners to configure their visibility to advertisers and data usage policies.
- It is now possible to import the schema for table nodes from a CSV file.
- The automated platform emails now have a fresh look.
- We changed the platform’s font from Montserrat to Inter for improved readability.
- Various small UI and wording improvements.
- The dataset uploader now has automatic character encoding detection, also allowing UTF-16 (and other) encoded files.
Bugfixes
- Some logo files were not cropped correctly.
- Media DCR: The data tab UI didn’t update correctly when provisioning a new dataset.
Enclave versions
- No enclave changes, please refer to v3.15 below.
Python SDK
- The Python SDK was updated to version 0.28.0.
- Added support for adding participants with the data partner role when creating Media DCRs.
v3.16 - 02/05/24
New features
- Publisher and Brand Markets: We have added the ability for publishers to declare which markets they are active in. Brands will now be able to browse publishers that are active in markets they choose during Media DCR creation. Brands should expect to see more publishers to choose from when creating clean rooms. Most publishers in our Decentriq-Ready program are now visible within the markets they operate in.
- Media DCR Controls: We have added a new setting to allow finer control over how data is used in our Media DCR UI. Brands can choose to explicitly exclude the seed audience from lookalike model predictions, and publishers may choose to not share audience segment sizes in the insights dashboard.
- Excel uploads: It is now possible to choose files with the xls or xlsx extension when uploading in the UI. This takes the first sheet of the excel file, and otherwise treats it the same way as a csv upload.
Bugfixes
- Small graphical improvements to make the UI more legible and consistent, especially on smaller screen resolutions.
- More detailed error messages for modeling issues in Lookalike activations and when trying to set inconsistent/impossible user permissions in the SDK.
- It is no longer possible to create tables with zero columns, which could cause issues with some types of downstream computations.
- Fixed a display bug where it was sometimes not possible to see the preview size of a pending Airlock node request.
- Added SDK support to improve coverage for features introduced to the Media DCR in the previous release.
Enclave versions
- No enclave changes, please refer to v3.15 below.
v3.15 - 27/03/24
New features
- Publisher Portal - The Publisher Portal now supports a publisher contact form. For publishers that have configured a contact email in the portal, brands viewing them in the publisher discovery interface will be presented with an option to contact them. We also added several UI improvements.
- Insights Dashboards - The Insights dashboard has been extended with Basic, Detailed and Comparison views. This new experience gives the most important insights first and makes it easier and faster for marketers to get the answers they need.
- UI - The Decentriq platform now has a new background color.
- Synthetic Data Report - If the synthetic data generation report includes an error, this is displayed more gracefully now.
Bugfixes
- Fixed a bug so that resizing of the window now also resizes the insight chart.
- Fixed a bug so that the organisation logos don’t overflow anymore on small screens with wide logos.
- Fixed a bug so that no error screen is displayed during the computation of the insights and lookalike audience.
- Fixed a bug that prevented a brand from inviting a publisher to a clean room under some organization configurations.
Enclave versions
In addition to the enclaves deployed in the last release:
- Synthetic data worker
- Identifier: decentriq.python-synth-data-worker-32-64:v18
- Hash: c4460572608c7f5649586f106fb6ea60f2a9055e7803fff1728bab70a8df9a89
- Python worker
- Identifier: decentriq.python-ml-worker-32-64:v23
- Hash: 438b64b3d64e12340a47cdf5df77fc30ccfa4a2a37707114ac34fc994bf7d36f
- R worker
- Identifier: decentriq.r-ml-worker-32-32:v1
- Hash: b899df8ccf97aea2afd34dcd4214bcd184bd0d3d63e8195e10cefc9157ef08d9
v3.14 - 11/03/24
New features
- Revamped Look & Feel: The UI now features a different sidebar layout and several other improvements, making it easier to navigate and improving legibility.
- Publisher Portal: Advertisers can now easily find and create Media Clean Rooms with suitable publishers. Publishers can choose to be discoverable by specifying their preferences and default settings. Advertisers can see these publishers and Media Clean Rooms already pre-configured by them. The process for publishers inviting advertisers to collaborate remains the same.
- Privacy Noise for Media Insights: We now add a small amount of noise to all results from the Insights tab in Media clean rooms. This follows principles established in differential privacy using the Laplace Mechanism. For a more detailed discussion, please see the tooltips and mouseover text in the dashboard itself or request a copy of our whitepaper. A conceptual sketch of the Laplace Mechanism follows this feature list.
- SDK Improvements: We are also releasing v0.25 of our Python SDK this week. The main change is that we have changed the default level of abstraction when building clean rooms using the SDK, more closely mirroring the graphical UI experience. This is a breaking change; users who wish to continue using the old syntax may do so by using the legacy module.
- Additionally, we added new helper functions to make it less verbose to: discover appropriate confidential computing enclave specifications for a given clean room if the enclave version is unknown, verify the identity of the creator of a clean room, and extract the JSON specification of a published clean room.
- Python and R worker libraries: The following libraries are now supported in appropriate python and R workers.
- Python: seaborn, pyreadstat, streamlit, lifelines, scikit-survival
- R: lme4, coxme, cmprsk, msm, randomForestSRC
- Minor Improvements
- Retargeting/Lookalike integration: retargeting audiences and lookalike audiences are now displayed in a single activation UI tab. Data labs may now be provisioned to retargeting-only clean rooms, instead of provisioning matching data directly.
- Data lab versions: Data labs now have versions, and there is a version compatibility check between the clean room and data lab during provisioning. Incompatible versions are explicitly reported in error messages if they occur during provisioning.
- Faster UI: most common tasks in the UI now make fewer calls to the API, speeding performance, especially over higher latency connections or periods of heavy website load.
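Purely as a conceptual sketch of the Laplace Mechanism mentioned above (the noise scale, sensitivity and epsilon used by the platform are not shown here; these values are illustrative):

    import numpy as np

    rng = np.random.default_rng()

    def noisy_count(true_count: int, sensitivity: float = 1.0, epsilon: float = 1.0) -> float:
        # Laplace Mechanism: add noise drawn with scale = sensitivity / epsilon
        return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

    print(noisy_count(10_000))  # a value close to 10000, e.g. 9999.4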
Bugfixes
- Fixed a vulnerability in the Airlock feature where an attacker could raise error messages that exceeded the expected preview budget.
- Fixed an issue where very wide but short training data sets may fail unexpectedly when used for synthetic data generation.
- Fixed an issue where errors would not be reported in the UI if raised during synthetic data generation in some configurations.
- Fixed an issue where clean rooms with a reprovisioning rate limit would report a generic provisioning failure if it failed due to the rate limit. Rate limit failures now have their own error message to make this behavior more clear.
- Fixed an issue where creating too many audiences in the same UI session could cause the UI to report metering errors in some scenarios.
Enclave versions
In addition to the enclaves deployed in the last release:
- driver
- Identifier: decentriq.driver:v21
- Hash: 90f95bb9e5e0a50e5b40376f69ed6bb844284b50dc00c0e69c5f3a51a54e9f1e
- synthetic-data-worker
- Identifier: decentriq.python-synth-data-worker-32-64:v17
- Hash: 7a32e9aef2cc5ab6fb93ee4cf48b5b409de0cdaa46bb777535867d093ea2d416
- python-ml-worker
- Identifier: decentriq.python-ml-worker-32-64:v22
- Hash: bb9805cdc7cbd64521e9bc146e5992465145230dd427921bdbc6d6ec4d073644
- r-latex-worker
- Identifier: decentriq.r-latex-worker-32-32:v17
- Hash: dcf7976d044bd74e5a70b5e1147fde5cda0f208cdd3badec86c71ccb1ad0d775
v3.13 - 01/02/24
New features
- Dataset upload wizard: uploading datasets is now easier than ever. The new UI wizard guides you through the process and helps you fix common issues with your data.
- Image results preview: besides tabular results you can now preview images directly in the Decentriq UI. This is useful for example when an analysis has charts or other visualizations as output.
- Lookalike Clean Room improvements: it now supports much larger publisher audiences, runs faster for all audience sizes, and supports multiple matching IDs per publisher user, increasing the match rate with the advertiser data.
- Publisher Data Lab statistics: Publishers can now visualize the statistics of their datasets in beautiful charts.
Bugfixes
- Small UI improvements.
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v20
- Hash: 8b7cc00da45d87a495e64a5e856aae1456c75b51f71e7c662224183a38c60714
- SQL worker
- Identifier: decentriq.sqlite-container-worker-32-64:v7
- Hash: 8683d5de0e95afe30ab9fdff67459936baada4dfd87633cc867178aa2ca6478a
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v21
- Hash: 331a8810a2984399b5b68a8dc7a8c8d1240f281b619d8bef8bce50d051d5fba5
- R worker:
- Identifier: decentriq.r-latex-worker-32-32:v16
- Hash: 38057eb485d934ba444ff6b3d0b8d5117d074e9e366f54ca36256b1739dedda3
- Synthetic data worker:
- Identifier: decentriq.python-synth-data-worker-32-64:v16
- Hash: f1b623637d56f6af562eff126d18e7b5d2be1a24ece074106f8f1a5defdded82
v3.12 - 21/12/23
New features
- Limited preview of datasets: Data Owners can now specify a limited amount of data they want to allow Analysts to run computations on, to avoid going through the request approval flow multiple times. Please contact us if you wish to enable this feature for your organization.
- Audience type no longer mandatory: Advertisers can now upload their customer list to Lookalike Clean Rooms without having to specify an audience type.
- Export Lookalike Clean insights: it is now possible to export the overlap insights and the top affinity segments to CSV.
- Data Clean Rooms page filters: see your DCRs in an organized way, hide DCRs to keep the list clean.
Bugfixes
- Small UI improvements.
- For some older DCRs it was not possible to see the list of enclave versions defined.
Enclave versions
In addition to the enclaves deployed in the last release:
- Driver enclave
- Identifier: decentriq.driver:v20
- Hash: 8b7cc00da45d87a495e64a5e856aae1456c75b51f71e7c662224183a38c60714
v3.11 - 30/11/23
New features
- Improved upload process: revamped the selection of existing datasets and added validation to make sure they are compatible with the computations in the Data Clean Room. The upload steps now have a progress indicator.
- Data clean rooms page: you can now conveniently browse and manage all your DCRs in a single place.
- Publisher dataset refresh: the Lookalike Clean Room now allows the publisher to reprovision a Data Lab containing updated audience data to existing collaborations with Advertisers. The entire flow is also possible programmatically using the Python SDK.
Bugfixes
- Small UI improvements.
- Mitigated a possible security vulnerability affecting interactive DCRs in combination with Development mode.
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v19
- Hash: 38508f0eec4e5433866cfcbda0e9a82e64debecdaef29be6504cf9ec60bb2736
- SQL worker
- Identifier: decentriq.sqlite-container-worker-32-64:v7
- Hash: 8683d5de0e95afe30ab9fdff67459936baada4dfd87633cc867178aa2ca6478a
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v20
- Hash: 0cb1712c7182118d29ab44d335fd31a1bc6b5e1cc2887f7c99ea4f957ef25839
- R worker:
- Identifier: decentriq.r-latex-worker-32-32:v16
- Hash: 38057eb485d934ba444ff6b3d0b8d5117d074e9e366f54ca36256b1739dedda3
- Synthetic data worker:
- Identifier: decentriq.python-synth-data-worker-32-64:v16
- Hash: f1b623637d56f6af562eff126d18e7b5d2be1a24ece074106f8f1a5defdded82
v3.10 - 02/11/23
New features
- Permutive integration: as a Publisher, import multiple datasets ready to be provisioned to a Data Lab, match with the Advertiser data in the Lookalike Clean Room and export the audience back to Permutive for activation.
- Report error to Decentriq: it is now possible to submit error details and other contextual information about issues within one click.
- Improved platform resilience: interacting with the Decentriq UI is now more robust by automatically retrying operations when the connection with the enclave is interrupted.
Bugfixes
- Small UI improvements.
- Fixed an issue that in rare cases made it impossible to run computations on datasets from duplicated DCRs.
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v18
- Hash: 8caa66630e99ac7be0eb5804e61067403447a79233eca0ca2f49f752ad3268ea
- SQL worker
- Identifier: decentriq.sqlite-container-worker-32-64:v7
- Hash: 8683d5de0e95afe30ab9fdff67459936baada4dfd87633cc867178aa2ca6478a
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v19
- Hash: a72e6c69147b8ba77dab4a693612fc61d2d38ce5676299d8a9293a3da9f07b89
- R worker:
- Identifier: decentriq.r-latex-worker-32-32:v16
- Hash: 38057eb485d934ba444ff6b3d0b8d5117d074e9e366f54ca36256b1739dedda3
- Synthetic data worker:
- Identifier: decentriq.python-synth-data-worker-32-64:v16
- Hash: f1b623637d56f6af562eff126d18e7b5d2be1a24ece074106f8f1a5defdded82
v3.9 - 11/10/23
New features
- Lookalike Clean Room improvements: Duplicated rows in advertiser datasets will be automatically removed. Publishers can export activated audiences using data connectors. Both parties can now see the detailed Lookalike Clean Room configuration and can stop Lookalike Clean Rooms. Insights and audiences computations of Lookalike Clean Rooms are now executed simultaneously, and it is possible to retry in case of errors.
- CSV upload improvements: now it is possible to select which decimal separator is being used in the dataset (dot or comma) and the CSV column delimiter is automatically detected.
- Column uniqueness constraints: when creating a table, you can now indicate which columns or column combinations must have unique values. This will be validated when a dataset is provisioned to the table. A sketch of what such a constraint enforces follows this list.
- More data connectors available: you can now import data from Salesforce, as well as export audiences directly to Google Ad Manager.
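To illustrate what a column uniqueness constraint checks (the actual validation runs inside the enclave when a dataset is provisioned; this pandas sketch with hypothetical column names only mirrors the idea):

    import pandas as pd

    df = pd.DataFrame({
        "email_hash": ["a1", "a1", "b2"],
        "device_id": ["d1", "d2", "d2"],
    })

    # A constraint on email_hash alone would fail, because "a1" appears twice ...
    print(df["email_hash"].is_unique)  # False
    # ... while a constraint on the (email_hash, device_id) combination would pass
    print(not df.duplicated(subset=["email_hash", "device_id"]).any())  # True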
Bugfixes
- Fixed an issue that prevented SQL computations from executing on tables with spaces in their names.
- Fixed an issue that reported N/A values in the Lookalike Clean Room Insights table (they should have been filtered out).
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v17
- Hash: 1708f45dc74d750fb2cb776117c411bb73d41c905028442621136f2dc1cb715c
- SQL worker
- Identifier: decentriq.sqlite-container-worker-32-64:v6
- Hash: c84bda0f3e5f173aabe58c3d764972186da6b9e77b92988e51c4bc768ac8cbdf
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v18
- Hash: 1042f59fb3ef401069481b3aa0e5f5db0de49e194f595c30127336adebdbd3ae
- R worker:
- Identifier: decentriq.r-latex-worker-32-32:v15
- Hash: d4b61b7e33e3c7ac4488f8515951f09d2519528425e04cb2f34d4251b4e11121
- Synthetic data worker:
- Identifier: decentriq.python-synth-data-worker-32-64:v15
- Hash: f5b256b018dd31d2e50a12e5339c1fcce01453477f0527c29ed7a44e5c6b343c
v3.8.1 - 21/09/23
New features
- Dataset validation report: Publishers and advertisers can now access the validation report of datasets provisioned to a Data Lab or to a Lookalike Clean Room directly in the Decentriq UI.
- Performance improvements: Faster publishing of Data Clean Rooms, faster validation for large datasets and various improvements in the Admin Portal.
Bugfixes
- Fixed a misleading error message when sending external invitations.
- Fixed a permission issue that did not allow media agencies to also generate audiences for activation.
- Prevent duplication of rows when hashing emails directly from the Decentriq UI (in case auto-fixing is disabled).
- Numbers indicating the user overlap in the Lookalike Clean Room are now rounded in a consistent manner.
Enclave versions
In addition to the enclaves deployed in the last release:
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v18
- Hash: 1042f59fb3ef401069481b3aa0e5f5db0de49e194f595c30127336adebdbd3ae
v3.8 - 31/08/23
New features
- Lookalike clean room: brands can now create lookalikes with high quality and high reach based on their (and publisher’s) first-party data. Publishers can now provision richer datasets including embeddings and demographics. A machine learning model is confidentially trained inside the clean room on data from both sides, with the aim of increasing the conversion rate of advertising campaigns.
- More data connectors available: you can now connect to Google Cloud and Azure Blob Storage to import and export your datasets.
- Hash values before uploading: for columns with emails and phone numbers in your dataset, it is now possible to first auto-fix these values and then hash them with SHA-256 before the dataset is uploaded to Decentriq. Besides being convenient, this also helps increase the match rate when joining with other datasets. A sketch of the hashing step follows this list.
- New utility functions for dataset upload: it is now more convenient to upload and provision datasets using the Python SDK. For more details, please follow the Datasets guide.
- Performance improvements: significant improvements in several parts of the platform.
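For reference, the hashing step works roughly along these lines (a generic SHA-256 sketch; the exact auto-fix and normalization rules applied by the UI are not shown):

    import hashlib

    def hash_email(email: str) -> str:
        # Normalize first so the same address always produces the same digest
        normalized = email.strip().lower()
        return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

    # The same normalized address yields the same digest on both sides of a join,
    # which is what improves the match rate.
    print(hash_email(" Jane.Doe@Example.com "))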
Bugfixes
- Fixed issues with memory allocation and file system usage for large computations.
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v16
- Hash: 0dd8402de7ecbc3d223f0db4b0e681769701c6e4683f23ae85d9c118b19efaa3
- SQL worker
- Identifier: decentriq.sqlite-container-worker-32-64:v5
- Hash: 03066177b3d76f49b11dda3edf37b4fd3c030aee95e58f3b37a3f1c243d142f0
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v17
- Hash: 6da576563aa69b8080ea67d157016b4744c53378c6226808b21a1c73c0a03b62
- R worker:
- Identifier: decentriq.r-latex-worker-32-32:v15
- Hash: d4b61b7e33e3c7ac4488f8515951f09d2519528425e04cb2f34d4251b4e11121
- Synthetic data worker:
- Identifier: decentriq.python-synth-data-worker-32-64:v15
- Hash: f5b256b018dd31d2e50a12e5339c1fcce01453477f0527c29ed7a44e5c6b343c
v3.7 - 27/07/23
New features
- Standard SQL engine: write your queries the way you are used to. As of this release, all new DCRs will use an SQLite engine in confidential computing by default.
- Google DV 360 integration: export your audience from Decentriq directly to Google Display & Video 360.
- Brand new datasets page: a revamped layout that simplifies dataset management in a centralized location, offering convenient options for browsing, searching, importing, and exporting.
- Computations error messages: now it is possible to safely get more details about computation errors in Python nodes of published DCRs.
- Performance improvements: resolved some bottlenecks to make computations even faster.
Bugfixes
- Several improvements in the UI and data validation.
- The columns of test datasets in draft DCRs were sometimes not mapped in the correct order.
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v14
- Hash: 9e19a5a1102810af4484b8a304ecc9b7513666c32207c9917ae831fdcab1e150
- SQL worker
- Identifier: decentriq.sqlite-container-worker-32-64:v2
- Hash: 676b18ae5ebb7009d5936f6d04e275a77552d1e75a4521e4b2a0a73f420cb560
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v14
- Hash: 4fa689025841825e94eee06c1c8c498a0f6416c8cfb30308444ce496235347a1
- R worker:
- Identifier: decentriq.r-latex-worker-32-32:v12
- Hash: c8bec9b2a5ad40572ca8f906f49d56cb8f1412e76e2d1a5acccc731988dcc9da
- Synthetic data worker:
- Identifier: decentriq.python-synth-data-worker-32-64:v12
- Hash: c95cff6d9d1393a152b9defec7bc1d2dd8bd2bd5fe0361e8858c651e4ff8ac86
v3.6 - 28/06/23
New features
- Introducing the Test Mode: You can test Data Clean Rooms with your own test data before publishing them. In published Data Clean Rooms you can enter the Test Mode to test new computations before requesting changes.
- Enhanced dataset upload and support for new data types: We added client-side validation and auto-fixing of the uploaded data based on the data type. We support string, numeric, hash, date, email and phone data types.
- More data connectors: You can now directly import data also from Snowflake. You can export any result from a Data Clean Room to Amazon S3 and Meta Ads Manager.
- Reuse stored datasets in the Media DCR: the same audience can now be used in multiple Media Data Clean Rooms without needing to be re-uploaded.
- Store Data Clean Room results to a dataset: besides downloading, it is now possible to store any result as a dataset, which can then be used in other DCRs.
Bugfixes
- Mitigated a possible security vulnerability affecting interactive DCRs in combination with Join computations.
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v13
- Hash: 1127742ccc14f7e0c9590981768213ce9215248f31aa538b795f8627e7929762
- SQL worker
- Identifier: decentriq.sql-worker:v10
- Hash: 0c01e1cb6f1ccd10be716124f4d7479ed13de382c9a8eb6ac79a753c1b5ddbc8
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v13
- Hash: d0d80fcc8e864a2d3ddf7876fa23ce1de0665d3a8ad24df486d6e6f99c7640f3
- R worker:
- Identifier: decentriq.r-latex-worker-32-32:v11
- Hash: 6f9be2d92d0506cd955bab2478b9402f95044e5e760f87614c40907a754d8ecc
- Synthetic data worker:
- Identifier: decentriq.python-synth-data-worker-32-64:v11
- Hash: 5afe94be61bf77814f3fe0acd2226521bce906ca62c6a374205ab4efe6304872
v3.5 - 25/05/23
New features
- Integrate consecutive requests: no need to wait for an approval before integrating other submitted requests. Multiple requests can be approved and integrated at any time.
- Top affinity segments charts: Media Data Clean Rooms now have interactive insights.
- Import data from Amazon S3: in the Datasets page, you can import a file from an S3 Bucket and it will become a dataset ready to be used in multiple DCRs. Other sources coming soon.
- No-code join and synthesize: you can now join 2 datasets and generate artificial data based on the joined result without writing any code.
- Decentriq util library for Python: read and write tabular data including schema in just one line in your Python computations.
- Several UI improvements: a few actions were simplified and some error messages were improved.
Bugfixes
- The participants form was not validating unique email addresses.
- Fixed an issue when importing * from decentriq_platform in Python.
- Reading tabular data with a utility function was dropping an entire row when an Integer cell was null.
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v11
- Hash: 5e3095a61edad4ea813f3ae5a7a65ce9235be4aa954aa4c631dc10033f322e9e
- SQL worker
- Identifier: decentriq.sql-worker:v10
- Hash: 0c01e1cb6f1ccd10be716124f4d7479ed13de382c9a8eb6ac79a753c1b5ddbc8
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v12
- Hash: 43ebf1ad0dfe11603bf2d7037ab4338bc00c77a8f8619f7915f8d78f87d89f96
- R worker:
- Identifier: decentriq.r-latex-worker-32-32:v11
- Hash: 6f9be2d92d0506cd955bab2478b9402f95044e5e760f87614c40907a754d8ecc
- Synthetic data worker:
- Identifier: decentriq.python-synth-data-worker-32-64:v11
- Hash: 5afe94be61bf77814f3fe0acd2226521bce906ca62c6a374205ab4efe6304872
v3.4 - 27/04/23
New features
- Keychain & Dataset reprovisioning: when uploading a new dataset to the platform, its encryption key can be confidentially stored in the Decentriq Keychain. Among other features, the Keychain enables you to reuse datasets across data clean rooms without having to re-upload them. It operates like a traditional password manager, deriving from your password an encryption key used to encrypt the secrets you store in the Keychain. Decentriq does not have access to your password, and can never access the secrets stored in the Keychain.
- Media Data Clean Rooms participants and roles: it is now possible to create Media Data Clean Rooms with multiple users representing the Publisher and the Advertiser. This release also introduces the role of the Agency, which can support the Advertiser analysing and choosing the best audiences to activate.
- Export Media Data Clean Rooms results: you can now export the overlap insights and the top affinity segments to CSV.
- UX improvements: Filter the "Overview" tab to show only actionable items. Enhanced the visibility of provisioned datasets.
Bugfixes
- Including certain special characters in the computation name could disrupt its execution.
- In a few cases, the Decentriq UI was not displaying updated information about which Data Owners would be required to approve a new computation request.
Enclave versions
- No enclave changes, please refer to v3.3 below.
v3.3 - 30/03/23
New features
- Admin Portal: as an organization admin, you now have a dedicated interface to manage internal users, invite external users to DCRs, and get an overview of all DCRs and collaborating organizations on a monthly basis.
- Azure Marketplace: You can now subscribe to the Decentriq Platform directly through the Azure Marketplace. Your billing is managed in Azure while your users and DCR activities can be managed in the Decentriq Admin Portal. Contact us for an offer specific to your needs.
- No-code Join: a new computation type that allows you to select two tables and match their columns based on conditions. Data privacy is ensured by letting analysts only see statistics about the resulting join, while the data itself can be queried by downstream computations.
Bugfixes
- Fixed an issue preventing some users from provisioning datasets with empty values in numeric column types.
- Some users were unable to access datasets in existing data clean rooms.
- Small UI improvements in the Media Clean Rooms.
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v10
- Hash: 53e01ba06d349e304f75e37c13d485a3f6f4b0518ee2b9c5a653e5ef3112af51
- SQL worker
- Identifier: decentriq.sql-worker:v10
- Hash: 0c01e1cb6f1ccd10be716124f4d7479ed13de382c9a8eb6ac79a753c1b5ddbc8
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v11
- Hash: 04dad4ce0d15eb40280af521e08b1f0dbe0aca24a5f731fc0c31ba11e899a325
- R worker:
- Identifier: decentriq.r-latex-worker-32-32:v10
- Hash: 36640d2e8c94ee71a45cd6c209ab6a707178fec72a94192f7fae5b59f936c598
- Synthetic data worker:
- Identifier: decentriq.python-synth-data-worker-32-64:v10
- Hash: c6498b2ad3b795c6293562c743f1737bb21cb8ef7ff8b40bc3068b12910caadb
v3.2 - 28/02/23
New features
- Added support for the package imbalanced-learn (version 0.9.0) in the Python worker.
- Improved resilience when uploading large files over an unstable internet connection.
- Improved user experience when creating new Data Clean Rooms.
Enclave versions
In addition to the enclaves deployed in the last release:
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v11
- Hash: 04dad4ce0d15eb40280af521e08b1f0dbe0aca24a5f731fc0c31ba11e899a325
v3.1 - 13/02/23
Decentriq Media Clean Rooms: the no-code solution for brands, publishers and retailers to join their first-party data to reach high-value audiences.
- Create a new Media Data Clean Room and invite your partner.
- Choose between affinity segment activation (best segments such as Sports > Tennis) and retargeting (identifiers in the overlap).
- Securely provision your datasets & view overlap insights.
- Choose desired reach-precision tradeoff and compute audiences.
Bugfixes
- Fixed an issue causing the Python or R code editor to misplace some characters.
- Fixed an invalid token issue affecting users with cached expired authentication tokens.
- Archived DCRs were sometimes not hidden until the next page reload.
- Improved the loading time of published DCRs.
Enclave versions
- No enclave changes, please refer to v3.0 below.
v3.0.1 - 13/01/23
Bugfixes
- Performance improvements for DCRs that include a large number of computations and participants.
- Fixed an issue that prevented downloading large CSV result files.
- Fixed an issue that sometimes delayed the loading of DCR participants or the setting/unsetting of permissions.
- Exposed more context and details to some error messages.
- Dataset information is now accessible again to all DCR participants.
- Several other small UI improvements.
Enclave versions
- No enclave changes, please refer to v3.0 below.
v3.0 - 05/01/23
We are pleased to share that version 3.0 of the Decentriq Platform is now officially available to all our customers.
This is the result of a major engineering effort, focused on security, scalability and resilience.
What's new in version 3.0
- With this release, Decentriq becomes the world’s first analytics platform deployed in production using AMD SEV-SNP. This means more memory available to the enclaves while keeping strong security assurances.
- All sensitive data is hosted and processed exclusively in Switzerland. Auxiliary services are based in the EU only.
- Enhanced resource distribution system to reduce latency, allowing you to run computations even when the platform is being heavily used.
- Revamp of the internals to make the platform future-proof for upcoming feature development.
Additional features
- Larger datasets support: provision Gigabytes of data directly in the Decentriq UI, and run computations on much larger workloads.
- Improved synthetic data generation: you can now choose between a differential privacy-based model and a standard model to synthesize your dataset. The standard model has better pairwise correlations and supports larger datasets.
- S3 bucket integration: your computation results can be uploaded directly from a data clean room to your S3 bucket, following high security standards.
- Data clean room password: it is now possible to set a data clean room password that is shared out of band. If set, any interaction with the data clean room requires knowledge of the password.
- New documentation: revamped documentation page suitable for all levels of expertise, from step-by-step guides to advanced API reference.
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v10
- Hash: 53e01ba06d349e304f75e37c13d485a3f6f4b0518ee2b9c5a653e5ef3112af51
- SQL worker
- Identifier: decentriq.sql-worker:v10
- Hash: 0c01e1cb6f1ccd10be716124f4d7479ed13de382c9a8eb6ac79a753c1b5ddbc8
- Python worker:
- Identifier: decentriq.python-ml-worker-32-64:v10
- Hash: 463d8137e3279f474dc55bcb83246815394b5a941f9255a97b364287480d0290
- R worker:
- Identifier: decentriq.r-latex-worker-32-32:v10
- Hash: 36640d2e8c94ee71a45cd6c209ab6a707178fec72a94192f7fae5b59f936c598
- Synthetic data worker:
- Identifier: decentriq.python-synth-data-worker-32-64:v10
- Hash: c6498b2ad3b795c6293562c743f1737bb21cb8ef7ff8b40bc3068b12910caadb
v2.2 - 20/06/22
New features
- Computation requests: So far, Data Clean Rooms have been immutable. This made it necessary to create new Data Clean Rooms to run different computations on the same data. Now it is possible to request additional computations in existing Data Clean Rooms which were not part of the original specification. Of course the Data Owners have to approve first in order to continue guaranteeing the security of the data. To do this, when you create a request, the platform automatically determines the owners of the affected datasets and asks them for approval. Once all approved, the new computation can be added to the Data Clean Room.
- Development tab: This is a scratch space that allows you to run arbitrary computations on data and results you already have access to without the need for approval. This allows prototyping computations which subsequently can be turned into a request.
- Join then synthesize: Our UI now allows you to create synthetic data not only based on tables, but also based on the output of a SQL computation. This allows generating synthetic copies of joined datasets from multiple sources, enabling much richer synthetic datasets.
- Privacy filter for individual computations: You can now set an independent k-anonymity privacy filter for each SQL computation, previously this was only possible globally.
- Mandatory datasets: You can now set which tables and files must be provisioned before allowing running computations that depend on them.
- New UI look: We did our spring cleaning and removed several borders and other unnecessary elements. We hope you like it too!
- UX improvements: Several enhancements to improve the experience of writing and debugging code in the platform, and preparing the Data Clean Room draft.
- New Python SDK version: A new version of the Python SDK, version 0.11.0, has been released with support for all new features available in the platform.
Bugfixes
- Removed some performance bottlenecks for large Data Clean Rooms.
- Pagination bug in ‘My Datasets’ page is fixed.
- The list of Data Clean Rooms in the sidebar is now updated automatically after creating a new one.
- It is now possible to have multiple files as dependencies for computations.
Enclave versions
- Driver enclave
- Version 4 (new):
- Identifier: decentriq.driver:v4
- Hash: d7af541df8de018effbe537269a624850868e93c6982acc31e3f81e5babaf9a5
- Version 2:
- Identifier: decentriq.driver:v2
- Hash: e3c40c52ccaa92ab420d30d31afcbfe898861d9d67e7ebee2df261db9c0372c4
- SQL worker
- Version 4 (new):
- Identifier: decentriq.sql-worker:v4
- Hash: 3cf64f9d80f538e7d1e48725c63b3ee2091d7cf5508e69e0db3a8ed2d98595c5
- Version 2:
- Identifier: decentriq.sql-worker:v2
- Hash: 3742070ece17bf1c0f37191bdbba4e1ebdb1be0f6194e1594a53bea49484f98d
- Python worker (AWS Nitro-based):
- Version 2 (new):
- Identifier: decentriq.python-ml-worker:v2
- Hash: 0c3952473d23707bf8cd6a909c8162e2c9f13c5a5af34228518a1c4ad35d358e
- Version 1:
- Identifier: decentriq.python-ml-worker:v1
- Hash: 47dd5eee8bbebf33a25dfffd825196bcb4276f4fe06aef09e27d0bdf5f7da43c
- R worker (AWS Nitro-based):
- Version 2 (new):
- Identifier: decentriq.r-latex-worker:v2
- Hash: b58b53ecaa9636d9a1f75d471000226a556bb36c9238dd20e0b063163b5922f1
- Synthetic data worker (AWS Nitro-based):
- Version 2 (new):
- Identifier: decentriq.python-synth-data-worker:v2
- Hash: 73a18198fdd98ded93cd9be57c3457e5a83eb7f998dcfdc625c8bb25c0751a30
v2.1 - 05/04/22
New features
- Synthetic data generation: generate a differentially private synthetic copy of any table, with similar statistical properties. This allows you to prototype your scripts locally before running them on the original data in a dedicated Data Clean Room.
- R computations: you can now also write scripts in the R language.
- Multi-file scripts: Python and R computations support more than just a single script; they can now consist of multiple scripts and file types.
- Preview script results: the CSV result from Python and R scripts can now be viewed directly in the UI.
- Unstructured data support: it is now possible to provision datasets of any kind, not only tabular but also unstructured data such as texts, JSON and images.
Bugfixes
- Some datasets with floating point numbers couldn’t be provisioned.
- Some strings were not wrapped in quotes when downloading results as CSV.
- Fixed a compatibility issue with Firefox.
- Improved specific error messages for Python scripts.
- Improved performance for Data Clean Rooms with a large amount of computations.
Enclave versions
- Driver enclave
- Identifier: decentriq.driver:v2
- Hash: e3c40c52ccaa92ab420d30d31afcbfe898861d9d67e7ebee2df261db9c0372c4
- SQL worker
- Identifier: decentriq.sql-worker:v2
- Hash: 3742070ece17bf1c0f37191bdbba4e1ebdb1be0f6194e1594a53bea49484f98d
- Python worker (AWS Nitro-based):
- Identifier: decentriq.python-ml-worker:v1
- Hash: 47dd5eee8bbebf33a25dfffd825196bcb4276f4fe06aef09e27d0bdf5f7da43c
- R worker (AWS Nitro-based):
- Identifier: decentriq.r-latex-worker:v1
- Hash: 77dab20c5a42a8383083162e9f5f7c6251f5e5ac28b007968dc216b8e9f01750
- Synthetic data worker (AWS Nitro-based):
- Identifier: decentriq.python-synth-data-worker:v1
- Hash: a5324a143bed8f63639fc48cc45046e288ad0d1feb79dc0c323a8ebfe9d6ed07
v2.0 - 07/03/22
We are delighted to announce that the Decentriq platform version 2.0 is released! We are introducing compute nodes that support the execution of Python scripts. This opens the door to confidential machine learning and other exciting applications in order to unlock the value of your sensitive data assets, all powered by confidential computing! Available in Switzerland and globally.
What's new in version 2.0
- The most important improvement is our new compute graph-based platform architecture: the Decentriq platform now consists of driver and worker enclaves. While the driver enforces the Data Clean Rooms permissions and orchestrates the execution, the worker enclaves execute the computations. This architecture allows us to combine different trusted execution environment technologies according to their strengths: Intel SGX, AMD SEV/SNP, AWS Nitro...
- Processing your datasets just got more powerful! Besides writing SQL queries, you can now take your analyses to the next level by writing Python scripts, also running on confidential computing!
- Brand new Python SDK, fully compatible with Data Clean Rooms created using our web platform. Your entire workflow can be automated: manage Data Clean Rooms, datasets and analyses programmatically. Documentation and step-by-step guides available.
Additional Features
- Tables and files browser: writing a query or script is made easier now by just having all available tables, columns and files in a handy side panel.
- Several UI improvements: finding your way in the UI is just getting more intuitive and efficient.
Bugfixes
- When importing a Data Clean Room from a file, the description field was not being read correctly.
- Switching between Data Clean Rooms was slightly time consuming.
- Some brand logos were cropped.
- Some error messages were not very intuitive.
Enclave versions
The following enclave versions will be available.
- Driver enclave
- Identifier: decentriq.driver:v2
- Hash: e3c40c52ccaa92ab420d30d31afcbfe898861d9d67e7ebee2df261db9c0372c4
- SQL worker
- Identifier: decentriq.sql-worker:v2
- Hash: 3742070ece17bf1c0f37191bdbba4e1ebdb1be0f6194e1594a53bea49484f98d
- Python worker (AWS Nitro-based):
- Identifier: decentriq.python-ml-worker:v1
- Hash: 47dd5eee8bbebf33a25dfffd825196bcb4276f4fe06aef09e27d0bdf5f7da43c
v1.6 - 6/12/21
New features
- Data portal: An overview of all datasets you encrypted and connected to Data Clean Rooms is available in the new ‘Datasets’ menu. Keep full control over your data!
- Dataset statistics: When connecting your datasets, the platform computes summary statistics that you can share with other participants of the Data Clean Room. Get and share instant insights into the data quality!
- Fuzzy string matching: The fuzzystrmatch function now takes an optional parameter such that only the best match is returned.
- Several UI improvements
Bugfixes
- Fixed a computation bug of the STDEV and STDEVP SQL commands.
- When CSV datasets were truncated, the file size was not correctly calculated.
- Fixed a bug when editing column names of a table.
- Fixed a bug that forced users to login and logout twice to switch accounts.
Enclave version
1978873e5be413527f9025d18b39e8a7071fbfeea90669064bf8322596c0a595
v1.5 - 11/11/21
New features
- Technical documentation: Easy access to centralised documentation about the platform, SDKs and step-by-step tutorials now available at docs.decentriq.com - check it out!
- Dataset upload wizard: Uploading CSVs got more powerful - now you can adjust the parsing parameters and preview your data before connecting it to a Data Clean Room, besides adding some context to it such as name and description.
- Dataset metadata: On every Data Clean Room table, it is possible to see more information about the uploaded datasets, like number of rows, file size, name and description.
- Previewing query results with many columns is now possible, with horizontal scrolling.
Bugfixes
- Fixed a bug that was sending multiple requests when retrieving the audit log.
- Fixed a bug that was breaking the chronological order of the rendered audit log.
- Now all participants are always copied when duplicating a Data Clean Room.
Enclave version
e6546a05f73a23c0f7fa88fbabe3e0feca4107d82b1336797235626b0554d981
v1.4 - 03/09/21
New features
- Data deletion: Now, when you delete your dataset, all the results, derived datasets and metadata also get deleted from the encrypted data store.
- Improved multitenancy: Enclaves now prioritize fast requests (such as retrieving the audit log or the Data Clean Room definition) over longer running tasks (executing queries) to give you the best possible user experience.
- Data Clean Room stopping: The Data Clean Room creator can now stop the Data Clean Room, so that no participant can upload datasets nor run queries. Stopped Data Clean Rooms still allow data deletion and retrieving the audit log.
- Major changes of Data Clean Room creation UI/UX: The UX of the Data Clean Room creation has been improved thanks to several changes: Modifications of titles or elements are now auto-saved, you can re-order table columns easily, the Data Clean Room creation elements are easier to modify, and multiple styling improvements.
- API tokens self-service: You can create and manage your API tokens on your own from within the platform UI.
- SQL capabilities page is now better accessible through a link in the sidebar.
Bugfixes
- Fixed a bug that sometimes prevented the audit log from rendering properly.
- The sidebar now gets refreshed every time that a DCR is created or deleted.
Enclave version
e6546a05f73a23c0f7fa88fbabe3e0feca4107d82b1336797235626b0554d981
v1.3 - 16/08/21
New features
- Privacy filter: To protect the privacy of individuals, we implemented a configurable privacy filter that enforces a minimum level of aggregation in the output. If enabled, only aggregating queries (with a GROUP BY in the last step) are allowed, and all resulting groups smaller than a specified threshold are filtered out before being returned to the Analyst. A conceptual sketch of this filtering step follows this list.
- Result file naming: Result files are now named according to the query.
- Various small UI/UX improvements
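Conceptually, the privacy filter drops small groups from an aggregated result before it is returned; a minimal sketch of that step (not the enclave implementation; values are illustrative):

    # Aggregated result as (group, count) pairs
    aggregated = {"zurich": 42, "basel": 3, "geneva": 17}
    threshold = 5

    # Groups smaller than the configured threshold are removed before the Analyst sees the result
    filtered = {group: n for group, n in aggregated.items() if n >= threshold}
    # filtered == {"zurich": 42, "geneva": 17}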
Bugfixes
- Fixed a rare concurrency issue when executing long-running queries that could lead to a situation where the query never seems to finish.
- 'Last edited' timestamp bug has been fixed.
Enclave version
59b7768e3c455cec14aac84187618c46a2a49c21cac754bcd021deb5cd0299ea
v1.2 - 02/08/21
New features
- Better support for long-running queries: You can now leave the page when a query is triggered without losing the results. The query is run anyway and you are able to retrieve the results once computed.
- Query constraint: Added the option to run queries only when all datasets have been provided to the Data Clean Room.
- Better support for query development: When creating a Data Clean Room, you can now validate your queries for correctness without publishing the Data Clean Room. To boot, you also get the schema of each query result.
- Better Data Clean Room descriptions: The description field now supports long, rich-formatted texts for beautiful descriptions.
- Branding of Data Clean Rooms: You can now personalize the Data Clean Rooms you create with your company logo, using the toggle from the top-right menu.
- Simplified participant view: We simplified the Data Clean Room participation by combining the STATUS and OVERVIEW tabs.
- Improved audit log: We added a more expressive human-readable column to the audit log and also log more information.
Bugfixes
- In published Data Clean Rooms, the text is now in read-only mode.
- Fixed link to change password.
- Fixed formatting of the headers in the results preview.
Enclave version
597731febdecd0b5fcdfd4331c1b0c238e40f2c4c51e346e9036135ce1a81287
v1.1 - 13/07/21
New features
- SQL engine and stack optimization: The platform now performs computations up to 5x faster.
- The platform now supports Single Sign-On via Microsoft Active Directory.
- Full support for the DISTINCT function and casting to VARCHAR.
- More control of your data: Improved user experience for dataset de-provisioning.
- The description box is now a long rich-text field that can contain instructions and useful information to use the Data Clean Room in a well-formatted manner.
- The audit log is now more readable.
- Improved the validation of identifiers (table names, column names, query names).
- Improved user experience in the analysis and action tab: It is now easier to scroll, run queries, and download results.
- The UI is now able to load and display large result tables.
- Reordering of table columns is now more intuitive.
- Other minor UI and UX improvements.
Bugfixes
- The Data Clean Room publication validation now checks the consistency of the data type when variables are joined in an SQL statement.
- Fixed bug regarding SUM on a window function (it used to be implemented as a sum over the partition but it should be a running sum).
- The analysis tab now refreshes its content properly when a query is deleted.
- Fixed a bug in the login workflow.
- Fixed a bug that froze text input in the modal window when creating a new Data Clean Room.
Enclave version
9288101a27978bfbc67ec393d3060ce5c96139bbf9aab320061a25c562bdac4b
v1.0 - 18/06/21
V1.0 brings major changes in the web interface of the platform, with a whole new look & feel, an improved UI and more intuitive UX.
New features
(1) Expanded SQL functionality:
- Added support for 'IS NULL' and 'IS NOT NULL'.
- ORDER BY ASC/DESC function is now fully supported.
- NTILE function is now supported.
- ROUND function is now supported.
(2) New UI functionalities:
- You can now directly duplicate an existing Data Clean Room.
- Error messages are more actionable and closer to natural language.
- You can now see who has already contributed data to a Data Clean Room.
- Headers are now included in the preview and the .CSV of the results.
- Archiving Data Clean Rooms without deleting them is now possible.
- General improvements and improved visualization of the audit log.
- User roles and permissions are now explicit and have a dedicated section of the Data Clean Room.
- User documentation is available directly from the platform.
- Table creation is now supported by a UI table builder.
- It is now possible to delete your encrypted datasets from the Data Clean Room.
Enclave version
5d93bfef5324f984d7781b55ac0cc4cd13dc21eed1f9a7750e1747016abddaf0
v0.4 - 31/5/21
New features
- Fuzzy matching now supports any type of JOIN in the FROM statement.
- Casting into int64 is now supported.
- Computation speed of the fuzzy matching algorithm has been substantially improved.
Bugfixes
- Timeout error during upload is now fixed.
Enclave version
561fd910346334b5a245b21b81287393310bd90bc1ae09028a7680291ab48d4a
v0.3 - 15/5/21
New features
- Audit log: It is now possible to download a CSV file that shows all the interactions that happened in the secure enclave for a specific Data Clean Room. This tamper-proof log gives Data Owners full transparency into how their data has been used.
- Queries can now use other queries in their FROM part. This UX feature makes it much easier to write longer queries.
- Results can now be arbitrarily large without causing problems.
Enclave version
b6f5fe60884d309951128d02e53e623c66b246d0ae9425974edb28296d3eac4d
v0.2 - 3/5/21
New features
- You can now set a password during the Data Clean Room definition that needs to be used to interact with the Data Clean Room once published.
- The UNION function is now supported.
- Email notifications are sent out when a new account or Data Clean Room is created or you are invited to collaborate in a Data Clean Room.
- Performance improvement: Now the platform switches automatically from single enclave to distributed mode, allowing computation on bigger datasets.
- Table and query editors are now resizable.
- Personalised platform branding has been added.
Bugfixes
- Fixed bug in parsing the table name from the CREATE TABLE statement.
- Minor bug fixes in query validation.
v0.1 - 27/4/21
New features
- The platform now supports NULL values and all types of JOINs.
- Added query polling to prevent the browser from timing out.
- Improved messages and UX for Data Clean Room validation errors.
- Now the uploaded datasets get validated before being ingested.
- Table and query definitions can also be downloaded after publication.
- Improved UI for participants' actions.
Bugfixes
- Now tables and queries cannot be submitted without having at least one assigned Analyst.
- The table name entry is now a read-only field, preventing errors.
- Solved an issue regarding the results preview.