# Importing from Azure Blob Storage

## Setup

First, import the necessary modules:
```python
from decentriq_platform import create_client, Key
from decentriq_platform.analytics import (
    AnalyticsDcrBuilder,
    RawDataNodeDefinition,
)
from decentriq_platform.data_connectors import (
    AzureBlobStorageImportConnectorDefinition,
    AzureBlobStorageCredentials,
)
```
Then, create the Client instance with which you can communicate with the Decentriq platform, and fetch the latest enclave specifications:

```python
import decentriq_platform as dq

user_email = "@@ YOUR EMAIL HERE @@"
api_token = "@@ YOUR TOKEN HERE @@"

client = create_client(user_email, api_token)
enclave_specs = dq.enclave_specifications.latest()
```
## Example: Import a file from Azure Blob Storage

This example shows how to import a file from Azure Blob Storage into your Data Clean Room.
```python
# Build the Data Clean Room
builder = AnalyticsDcrBuilder(client=client)
dcr_definition = (
    builder.with_name("Azure Import DCR")
    .with_owner(user_email)
    .with_description("Import a file from Azure Blob Storage")
    .add_node_definitions([
        # Node to hold the Azure credentials
        RawDataNodeDefinition(
            name="azure_blob_storage_credentials",
            is_required=True,
        ),
        # Import connector node
        AzureBlobStorageImportConnectorDefinition(
            name="azure_blob_storage_import",
            credentials_dependency="azure_blob_storage_credentials",
        ),
    ])
    .add_participant(
        user_email,
        analyst_of=["azure_blob_storage_import"],
        data_owner_of=["azure_blob_storage_credentials"],
    )
    .build()
)

# Publish the Data Clean Room
dcr = client.publish_analytics_dcr(dcr_definition)
```
```python
# Upload the Azure Blob Storage credentials
azure_blob_storage_credentials = dcr.get_node("azure_blob_storage_credentials")
azure_blob_storage_credentials.upload_and_publish_dataset(
    AzureBlobStorageCredentials(
        storage_account="@@ AZURE STORAGE ACCOUNT HERE @@",
        storage_container="@@ AZURE CONTAINER NAME HERE @@",
        blob_name="hello.txt",
        sas_token="@@ AZURE SAS TOKEN HERE @@",
    ).as_binary_io(),
    Key(),
    "credentials.txt",
)

# Import the data from Azure Blob Storage
azure_blob_storage_import_connector = dcr.get_node("azure_blob_storage_import")
result = azure_blob_storage_import_connector.run_computation_and_get_results_as_bytes()
```
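The connector returns the imported file's contents as raw bytes. A minimal sketch of persisting them locally for inspection — the placeholder bytes and output path are illustrative, not part of the Decentriq API:

```python
from pathlib import Path

# Placeholder for the bytes returned by run_computation_and_get_results_as_bytes()
result = b"hello from Azure Blob Storage"

# Write the imported blob to a local file for inspection
output_path = Path("hello.txt")
output_path.write_bytes(result)
```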
## Parameters

The AzureBlobStorageImportConnectorDefinition requires the following parameters:

- `name`: The name of the import connector node
- `credentials_dependency`: The name of the node containing the Azure Blob Storage credentials
The AzureBlobStorageCredentials object specifies the Azure connection details:

- `storage_account`: The Azure storage account name
- `storage_container`: The name of the Azure container containing the file
- `blob_name`: The name (path) of the blob to import
- `sas_token`: The SAS (Shared Access Signature) token for authentication
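Together, these four values identify a single blob: they correspond to the standard Azure Blob Storage addressing scheme. The sketch below shows how they combine into a blob URL; the `build_blob_url` helper is purely illustrative and not part of the Decentriq SDK:

```python
def build_blob_url(storage_account: str, storage_container: str,
                   blob_name: str, sas_token: str) -> str:
    # Standard Azure Blob Storage addressing scheme:
    #   https://<account>.blob.core.windows.net/<container>/<blob>?<sas>
    return (
        f"https://{storage_account}.blob.core.windows.net/"
        f"{storage_container}/{blob_name}?{sas_token.lstrip('?')}"
    )

url = build_blob_url("mystorageaccount", "mycontainer", "hello.txt", "sv=2022-11-02&sig=abc")
# url == "https://mystorageaccount.blob.core.windows.net/mycontainer/hello.txt?sv=2022-11-02&sig=abc"
```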
The imported file will be available in the Decentriq Platform and can be used in your Data Clean Rooms.