Is Your Data Ready for the EU Data Act? A 4-Step Guide Using Snowflake Cortex AI
Navigate EU Data Act compliance with confidence. Learn how data teams and compliance professionals can collaborate using Snowflake Cortex AI to automate discovery, governance, portability, and audit requirements in this practical 4-step framework.
The EU Data Act enforcement deadline has arrived, and organisations across Europe are scrambling to demonstrate compliance. For many, the challenge isn’t just understanding what the regulation requires—it’s bridging the gap between compliance requirements and technical implementation at scale. This is where data professionals and compliance teams must unite, and where intelligent automation becomes essential rather than optional.
This guide presents a practical 4-step framework that addresses both regulatory mandates and technical realities. Whether you’re a data engineer building compliant systems or a compliance officer needing to demonstrate adherence, this approach using Snowflake Cortex AI provides the automation, documentation, and collaboration tools necessary for Data Act readiness. The key insight? Compliance isn’t just a legal checkbox—it’s a data engineering challenge that benefits from AI-powered automation.
Understanding the EU Data Act: What It Means for Your Organisation#
The EU Data Act, now in active enforcement as of September 2025, fundamentally reshapes how organisations must handle data. Unlike GDPR’s focus on personal data protection, the Data Act addresses data accessibility, portability, and sharing obligations across all types of business data.
Key Requirements Affecting Data Platforms#
Data Accessibility Mandates: Organisations must make data readily accessible to authorised users, including customers, business partners, and in some cases, third-party service providers. This means your data platform must support rapid, structured access to diverse data sets.
Portability Requirements: Upon request, you must provide data in commonly used, machine-readable formats within a reasonable timeframe (typically 30 days, or immediately for certain automated systems). This affects how you architect data storage, transformation pipelines, and export capabilities.
B2B Data Sharing Obligations: Businesses must facilitate data sharing between organisations under fair, reasonable, and non-discriminatory terms. Your data governance framework must support secure, auditable data sharing mechanisms.
Technical Compliance Timeline: With primary enforcement beginning September 2025, organisations face penalties of up to 1% of global annual turnover for non-compliance. The clock isn’t just ticking—it’s already struck midnight.
The Dual Challenge: Regulatory + Technical#
For compliance teams: You need to demonstrate that appropriate controls exist, that data can be accessed and exported on demand, and that comprehensive audit trails prove adherence. Manual processes don’t scale, and spreadsheet-based governance won’t satisfy regulators.
For data teams: You’re managing petabytes across multiple platforms, thousands of tables, complex transformation logic, and diverse access patterns. Manually classifying data, tracking access, building custom export pipelines for every request, and maintaining audit logs is operationally infeasible.
This is where Snowflake Cortex AI transforms Data Act compliance from an overwhelming mandate into a manageable, automated process.
The 4-Step Compliance Framework#
Step 1: Data Discovery and Classification#
The Compliance Need: Know What Data You Have and Where#
Regulators require organisations to maintain comprehensive inventories of data assets, including classification by sensitivity, data subject rights applicability, and sharing obligations. You cannot comply with access or portability requests if you don’t know what data exists or how it’s classified.
Compliance question: “Can you demonstrate that you’ve identified all data subject to the Data Act across your entire data estate?”
The Data Team Challenge: Scale of Discovery#
Modern data platforms contain thousands of databases, tens of thousands of tables, and millions of columns. Manual discovery is impossible. Traditional metadata tools provide schema information but lack semantic understanding—they can’t distinguish between a customer identifier that triggers Data Act obligations and an internal transaction ID that doesn’t.
Data team question: “How do we automatically classify data across our entire Snowflake estate without building custom ML models?”
The Cortex AI Solution: Automated Classification and Tagging#
Snowflake Cortex AI provides semantic understanding of data content through large language models that can analyse column names, sample data, and usage patterns to automatically classify data according to Data Act categories.
-- Automated Data Discovery and Classification using Cortex AI
CREATE OR REPLACE PROCEDURE discover_and_classify_data_act_data(
    target_database STRING,
    target_schema STRING
)
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.9'
PACKAGES = ('snowflake-snowpark-python')
HANDLER = 'classify_data'
AS
$$
import snowflake.snowpark as snowpark
from snowflake.cortex import Complete

def classify_data(session, target_database, target_schema):
    # Get all tables and columns in the schema
    tables_query = f"""
        SELECT
            table_catalog,
            table_schema,
            table_name,
            column_name,
            data_type
        FROM {target_database}.information_schema.columns
        WHERE table_schema = '{target_schema}'
    """
    columns_df = session.sql(tables_query).collect()
    classifications = []
    for row in columns_df:
        # Use Cortex AI to classify each column
        classification_prompt = f"""
            Analyse this database column and classify it according to EU Data Act requirements:
            Table: {row['TABLE_NAME']}
            Column: {row['COLUMN_NAME']}
            Data Type: {row['DATA_TYPE']}
            Classify as one of: CUSTOMER_DATA, PRODUCT_DATA, TRANSACTION_DATA,
            DEVICE_DATA, USAGE_DATA, INTERNAL_ONLY, UNKNOWN
            Also indicate if this data is subject to portability requirements (YES/NO).
            Return only: CLASSIFICATION|PORTABILITY_REQUIRED
        """
        response = Complete('mistral-large2', classification_prompt)
        classification, portability = [p.strip() for p in response.split('|')]
        # Store classification in governance table (parameter binding avoids
        # breakage when values contain quotes)
        session.sql("""
            INSERT INTO data_governance.data_act_classifications
                (database_name, schema_name, table_name, column_name,
                 classification, portability_required, classified_date)
            VALUES (?, ?, ?, ?, ?, ?, CURRENT_TIMESTAMP())
        """, params=[row['TABLE_CATALOG'], row['TABLE_SCHEMA'],
                     row['TABLE_NAME'], row['COLUMN_NAME'],
                     classification, portability]).collect()
        classifications.append(f"{row['TABLE_NAME']}.{row['COLUMN_NAME']}: {classification}")
    return f"Classified {len(classifications)} columns in {target_schema}"
$$;

-- Execute discovery across production databases
CALL discover_and_classify_data_act_data('PRODUCTION_DB', 'CUSTOMER_SCHEMA');
Implementation Approach#
- Create classification taxonomy aligned with Data Act categories
- Deploy Cortex-powered discovery across all databases and schemas
- Tag sensitive data using Snowflake’s native tag-based governance
- Automate reclassification on schema changes or new table creation
- Generate compliance inventory reports for audit purposes
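Even with a tightly constrained prompt, model output can drift from the requested `CLASSIFICATION|PORTABILITY_REQUIRED` format. A defensive parser worth wrapping around the `Complete` call is sketched below; the `parse_classification` helper and the `UNKNOWN`/`NO` fallbacks are our own additions, with the category list mirroring the taxonomy in the prompt:

```python
# Defensive parsing of a Cortex classification response.
# Assumes the prompt asked for "CLASSIFICATION|PORTABILITY_REQUIRED";
# anything unrecognised falls back to safe defaults so a malformed
# reply never poisons the governance table.

VALID_CLASSES = {
    "CUSTOMER_DATA", "PRODUCT_DATA", "TRANSACTION_DATA",
    "DEVICE_DATA", "USAGE_DATA", "INTERNAL_ONLY", "UNKNOWN",
}

def parse_classification(response: str) -> tuple[str, str]:
    """Return (classification, portability_flag), defaulting to safe values."""
    parts = [p.strip().upper() for p in response.strip().split("|")]
    classification = parts[0] if parts and parts[0] in VALID_CLASSES else "UNKNOWN"
    portability = parts[1] if len(parts) > 1 and parts[1] in ("YES", "NO") else "NO"
    return classification, portability
```

Defaulting an unparseable reply to `UNKNOWN`/`NO` keeps such columns visible for human review rather than silently misclassifying them.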
Outcome: Comprehensive Data Inventory#
For compliance: A complete, up-to-date inventory showing what data exists, how it’s classified, and which Data Act obligations apply. This inventory can be presented to regulators demonstrating proactive governance.
For data teams: Automated classification that scales with your data estate, reducing manual effort by 90% whilst maintaining accuracy through AI-powered semantic analysis.
Step 2: Access Control and Governance#
The Compliance Need: Demonstrate Appropriate Access Controls#
The Data Act requires organisations to implement and document appropriate access controls that ensure data is accessible to authorised parties whilst preventing unauthorised access. Compliance officers must prove that access governance is robust, auditable, and enforced consistently.
Compliance question: “Can you demonstrate who has access to what data, why they have that access, and how access decisions align with Data Act obligations?”
The Data Team Challenge: Managing Complex Permissions#
Enterprise Snowflake environments contain hundreds of roles, thousands of users, and complex role hierarchies. Access patterns evolve continuously as business needs change. Traditional role-based access control (RBAC) provides the mechanism but doesn’t provide intelligence about whether access patterns align with compliance requirements.
Data team question: “How do we ensure our role hierarchy and access grants comply with Data Act principles without manually auditing thousands of permission grants?”
The Cortex AI Solution: Intelligent Access Pattern Analysis#
Cortex AI can analyse your existing access patterns, identify potential compliance gaps, and recommend governance improvements based on Data Act principles.
-- Analyse Access Patterns for Data Act Compliance
-- Note: GRANTS_TO_ROLES already records the grantee and the object name
-- (in its NAME column), so no join to the ROLES view is needed.
CREATE OR REPLACE VIEW data_governance.data_act_access_analysis AS
WITH role_access AS (
    SELECT
        grantee_name AS role_name,
        granted_to,
        table_catalog,
        table_schema,
        name AS table_name,
        privilege
    FROM snowflake.account_usage.grants_to_roles
    WHERE deleted_on IS NULL
      AND granted_on IN ('TABLE', 'VIEW')
),
data_act_sensitive AS (
    SELECT DISTINCT
        database_name,
        schema_name,
        table_name
    FROM data_governance.data_act_classifications
    WHERE portability_required = 'YES'
)
SELECT
    ra.role_name,
    ra.granted_to,
    ra.table_catalog || '.' || ra.table_schema || '.' || ra.table_name AS full_table_name,
    ra.privilege,
    CASE
        WHEN das.table_name IS NOT NULL THEN 'DATA_ACT_SENSITIVE'
        ELSE 'NON_SENSITIVE'
    END AS data_sensitivity,
    SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large2',
        'Analyse this access pattern and determine if it complies with EU Data Act principles. ' ||
        'Role: ' || ra.role_name || ', ' ||
        'Granted To: ' || ra.granted_to || ', ' ||
        'Privilege: ' || ra.privilege || ', ' ||
        'Data Sensitivity: ' || CASE WHEN das.table_name IS NOT NULL THEN 'DATA_ACT_SENSITIVE' ELSE 'NON_SENSITIVE' END || '. ' ||
        'Return: COMPLIANT or NON_COMPLIANT with brief reason.'
    ) AS compliance_assessment
FROM role_access ra
LEFT JOIN data_act_sensitive das
    ON ra.table_catalog = das.database_name
    AND ra.table_schema = das.schema_name
    AND ra.table_name = das.table_name;

-- Query for compliance gaps
SELECT
    role_name,
    full_table_name,
    compliance_assessment
FROM data_governance.data_act_access_analysis
WHERE compliance_assessment LIKE '%NON_COMPLIANT%';
Implementation Approach#
- Deploy access pattern analysis across all databases containing Data Act-relevant data
- Implement tag-based access control using Snowflake’s tag-based masking and row access policies
- Create automated compliance reporting showing access governance status
- Establish review workflows for access requests to Data Act-sensitive data
- Document access decisions in governance metadata for audit purposes
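One subtlety when consuming these assessments downstream: a bare substring test for "COMPLIANT" also matches "NON_COMPLIANT", so naive counting silently inflates your compliance numbers. A strict verdict check is sketched below; the `assessment_verdict` and `compliance_gaps` helpers and the `UNCLEAR` bucket are our own additions, assuming the model leads with the verdict as the prompt requests:

```python
# Classify a Cortex compliance assessment string strictly.
# A naive "COMPLIANT" substring check also matches "NON_COMPLIANT",
# so the negative verdict must be tested first.

def assessment_verdict(assessment: str) -> str:
    """Return 'NON_COMPLIANT', 'COMPLIANT', or 'UNCLEAR' for human review."""
    text = assessment.strip().upper()
    if "NON_COMPLIANT" in text:
        return "NON_COMPLIANT"
    if "COMPLIANT" in text:
        return "COMPLIANT"
    return "UNCLEAR"

def compliance_gaps(rows: list[dict]) -> list[dict]:
    """Filter analysis rows down to those needing remediation or review."""
    return [r for r in rows
            if assessment_verdict(r["compliance_assessment"]) != "COMPLIANT"]
```

Routing anything unrecognised to `UNCLEAR` means ambiguous model output lands in the review queue instead of being counted as compliant.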
Outcome: Auditable Access Management#
For compliance: Documented, defensible access governance demonstrating that only authorised individuals can access Data Act-relevant data, with clear justification for each access grant and comprehensive audit trails.
For data teams: Intelligent, automated analysis that identifies compliance gaps before auditors do, with actionable recommendations for remediation. Access governance becomes proactive rather than reactive.
Step 3: Data Portability Readiness#
The Compliance Need: Ability to Export Data on Request#
Articles 4 and 5 of the Data Act establish clear portability obligations: organisations must provide data to authorised requesters in commonly used, machine-readable formats within specified timeframes. This isn’t optional—it’s a legal mandate with financial penalties for non-compliance.
Compliance question: “If we receive a portability request tomorrow, can we fulfil it within the required timeframe with complete, accurate data in the requested format?”
The Data Team Challenge: Multiple Formats, Complex Transformations#
Data portability sounds straightforward until you consider the technical reality: data spans multiple tables with complex relationships, requires transformation from internal to standardised formats, must be filtered to include only authorised data, and needs to be exported in formats that may differ from your storage format (JSON, XML, CSV, Parquet).
Building custom export pipelines for each request is time-consuming and error-prone. You need automated, repeatable portability workflows.
Data team question: “How do we build flexible export pipelines that can handle diverse portability requests without custom development for each request?”
The Cortex AI Solution: Automated Export Workflows#
Cortex AI can interpret portability requests in natural language, identify relevant data based on classifications, and orchestrate export workflows that produce compliant outputs.
-- Data Portability Orchestration using Cortex AI
CREATE OR REPLACE PROCEDURE execute_data_portability_request(
    requester_id STRING,
    request_description STRING,
    output_format STRING,
    output_location STRING
)
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.9'
PACKAGES = ('snowflake-snowpark-python')
HANDLER = 'process_portability_request'
AS
$$
import snowflake.snowpark as snowpark
from snowflake.cortex import Complete
import json

def process_portability_request(session, requester_id, request_description, output_format, output_location):
    # Use Cortex AI to parse the portability request
    parse_prompt = f"""
        Parse this data portability request and identify:
        1. What data categories are requested (e.g., CUSTOMER_DATA, TRANSACTION_DATA)
        2. Any time period constraints
        3. Any specific filters or conditions
        Request: {request_description}
        Return only JSON, no other text: {{"categories": [], "date_from": "", "date_to": "", "filters": ""}}
    """
    parsed_request = Complete('mistral-large2', parse_prompt)
    # json.loads raises if the model wraps the JSON in commentary; better to
    # fail loudly here than to export the wrong data
    request_params = json.loads(parsed_request.strip())
    categories = "','".join(request_params['categories'])
    # Identify tables containing requested data
    tables_query = f"""
        SELECT DISTINCT
            database_name,
            schema_name,
            table_name
        FROM data_governance.data_act_classifications
        WHERE classification IN ('{categories}')
          AND portability_required = 'YES'
    """
    tables = session.sql(tables_query).collect()
    exported_files = []
    for table in tables:
        full_table_name = f"{table['DATABASE_NAME']}.{table['SCHEMA_NAME']}.{table['TABLE_NAME']}"
        # Build export query with filters; assumes each table exposes a
        # date_column -- substitute each table's real timestamp column
        export_query = f"""
            COPY INTO '{output_location}/{table['TABLE_NAME']}.{output_format.lower()}'
            FROM (
                SELECT *
                FROM {full_table_name}
                WHERE 1=1
                {f"AND date_column >= '{request_params['date_from']}'" if request_params.get('date_from') else ""}
                {f"AND date_column <= '{request_params['date_to']}'" if request_params.get('date_to') else ""}
            )
            FILE_FORMAT = (TYPE = {output_format} COMPRESSION = GZIP)
            HEADER = TRUE
            OVERWRITE = TRUE
        """
        session.sql(export_query).collect()
        exported_files.append(f"{table['TABLE_NAME']}.{output_format.lower()}")
    # Log portability request for audit trail (parameter binding avoids
    # breakage when the free-text description contains quotes)
    session.sql("""
        INSERT INTO data_governance.portability_requests
            (request_id, requester_id, request_description, files_exported,
             request_date, completion_date, status)
        VALUES (UUID_STRING(), ?, ?, ?, CURRENT_TIMESTAMP(),
                CURRENT_TIMESTAMP(), 'COMPLETED')
    """, params=[requester_id, request_description,
                 json.dumps(exported_files)]).collect()
    return f"Exported {len(exported_files)} files to {output_location}"
$$;

-- Execute a portability request
CALL execute_data_portability_request(
    'CUSTOMER_12345',
    'All my transaction data from the last 2 years',
    'CSV',
    '@data_exports/customer_12345/'
);
Implementation Approach#
- Create portability request intake process (web form, API, or customer portal)
- Deploy Cortex-powered parsing to interpret requests and identify relevant data
- Implement automated export pipelines supporting multiple formats (CSV, JSON, Parquet, XML)
- Establish secure delivery mechanism (encrypted download links, secure file transfer)
- Maintain comprehensive audit logs of all portability requests and fulfilments
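Before any export runs, it is worth validating the JSON the parsing model returns: a hallucinated category or malformed date should stop the request, not ship the wrong data. A sketch follows; the field names mirror the JSON shape the procedure's prompt requests, while `validate_request` and the `KNOWN_CATEGORIES` list are our own assumptions matching Step 1's taxonomy:

```python
# Validate the model-produced JSON for a portability request before
# any COPY INTO runs. Raises ValueError on anything suspicious so the
# request fails loudly instead of exporting the wrong data.
import json
from datetime import date

# Assumed to match the classification taxonomy from Step 1
KNOWN_CATEGORIES = {
    "CUSTOMER_DATA", "PRODUCT_DATA", "TRANSACTION_DATA",
    "DEVICE_DATA", "USAGE_DATA",
}

def validate_request(raw: str) -> dict:
    """Parse and sanity-check a portability request; raise on bad input."""
    params = json.loads(raw)  # raises ValueError on malformed model output
    categories = [c for c in params.get("categories", []) if c in KNOWN_CATEGORIES]
    if not categories:
        raise ValueError("request names no recognised data categories")
    for key in ("date_from", "date_to"):
        if params.get(key):
            date.fromisoformat(params[key])  # raises on a bad date string
    return {"categories": categories,
            "date_from": params.get("date_from") or None,
            "date_to": params.get("date_to") or None}
```

Whitelisting categories against the Step 1 taxonomy is the key design choice: it confines the LLM's role to interpretation while the governance tables remain the source of truth for what may be exported.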
Outcome: Rapid Compliance with Data Requests#
For compliance: Demonstrated capability to fulfil portability requests within required timeframes, with complete audit trails showing request receipt, processing, and delivery. This satisfies Data Act Articles 4 and 5 requirements.
For data teams: Elimination of manual export work through intelligent automation. Portability requests that previously took days or weeks can now be fulfilled in hours, with consistent quality and complete documentation.
Step 4: Audit Trail and Documentation#
The Compliance Need: Prove Compliance to Regulators#
When regulators audit your Data Act compliance, they require evidence: documentation showing what controls exist, how they’re enforced, who accessed what data when, and how portability requests were handled. Assertions aren’t sufficient—you need comprehensive, tamper-evident audit trails.
Compliance question: “Can you provide complete, auditable evidence of Data Act compliance across all required dimensions, formatted for regulatory review?”
The Data Team Challenge: Comprehensive Logging at Scale#
Snowflake’s account usage views capture extensive metadata, but transforming raw access logs, query history, and governance events into compliance-ready audit reports requires significant effort. You need to correlate events across multiple dimensions, identify relevant activities, and generate reports that non-technical auditors can understand.
Data team question: “How do we transform petabytes of access logs and metadata into concise, auditable compliance reports without building a custom reporting system?”
The Cortex AI Solution: Automated Compliance Reporting#
Cortex AI can analyse access logs, governance events, and portability request history to generate narrative compliance reports suitable for regulatory review.
-- Generate Data Act Compliance Audit Report
CREATE OR REPLACE PROCEDURE generate_data_act_audit_report(
    report_period_start DATE,
    report_period_end DATE
)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
    report_text STRING;
    discovery_summary STRING;
    access_summary STRING;
    portability_summary STRING;
BEGIN
    -- Data Discovery Summary
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large2',
        'Generate a compliance summary paragraph from this data: ' ||
        'Total tables classified: ' || COUNT(DISTINCT table_name) || ', ' ||
        'Data Act sensitive columns: ' || SUM(CASE WHEN portability_required = 'YES' THEN 1 ELSE 0 END) || ', ' ||
        'Last classification date: ' || MAX(classified_date) || '. ' ||
        'Focus on demonstrating comprehensive data discovery for EU Data Act compliance.'
    )
    INTO :discovery_summary
    FROM data_governance.data_act_classifications;

    -- Access Governance Summary
    -- (match verdicts by prefix: a '%COMPLIANT%' pattern would also count
    -- NON_COMPLIANT rows as compliant)
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large2',
        'Generate a compliance summary paragraph from this data: ' ||
        'Total roles with access to sensitive data: ' || COUNT(DISTINCT role_name) || ', ' ||
        'Compliant access patterns: ' || SUM(CASE WHEN compliance_assessment LIKE 'COMPLIANT%' THEN 1 ELSE 0 END) || ', ' ||
        'Non-compliant patterns identified and remediated: ' || SUM(CASE WHEN compliance_assessment LIKE 'NON_COMPLIANT%' THEN 1 ELSE 0 END) || '. ' ||
        'Focus on demonstrating robust access governance for EU Data Act compliance.'
    )
    INTO :access_summary
    FROM data_governance.data_act_access_analysis;

    -- Portability Request Summary
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large2',
        'Generate a compliance summary paragraph from this data: ' ||
        'Total portability requests: ' || COUNT(*) || ', ' ||
        'Average completion time: ' || AVG(DATEDIFF('hour', request_date, completion_date)) || ' hours, ' ||
        'Successfully completed: ' || SUM(CASE WHEN status = 'COMPLETED' THEN 1 ELSE 0 END) || ', ' ||
        'Compliance rate: ' || ROUND(100.0 * SUM(CASE WHEN status = 'COMPLETED' THEN 1 ELSE 0 END) / COUNT(*), 2) || '%. ' ||
        'Focus on demonstrating effective data portability for EU Data Act compliance.'
    )
    INTO :portability_summary
    FROM data_governance.portability_requests
    WHERE request_date BETWEEN :report_period_start AND :report_period_end;

    -- Combine into comprehensive report
    report_text := '# EU Data Act Compliance Audit Report\n\n' ||
        '## Reporting Period: ' || :report_period_start || ' to ' || :report_period_end || '\n\n' ||
        '## Data Discovery and Classification\n' || discovery_summary || '\n\n' ||
        '## Access Control and Governance\n' || access_summary || '\n\n' ||
        '## Data Portability\n' || portability_summary || '\n\n' ||
        '## Conclusion\n' ||
        'This organisation demonstrates comprehensive EU Data Act compliance through ' ||
        'automated data discovery, robust access governance, and effective portability mechanisms.';

    -- Store report for audit purposes
    INSERT INTO data_governance.compliance_reports
        (report_date, report_type, report_content)
    VALUES (CURRENT_TIMESTAMP(), 'DATA_ACT_AUDIT', :report_text);

    RETURN report_text;
END;
$$;

-- Generate quarterly compliance report
CALL generate_data_act_audit_report('2025-07-01', '2025-09-30');
Implementation Approach#
- Establish compliance reporting schedule (monthly or quarterly)
- Deploy automated report generation covering all Data Act dimensions
- Create executive dashboards showing real-time compliance status
- Implement alert mechanisms for potential compliance breaches
- Maintain immutable audit logs using Snowflake’s time travel and fail-safe features
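The figures the audit report cites are simple aggregates over the portability request log. A sketch of the same arithmetic in Python makes the definitions explicit; the record fields mirror the `data_governance.portability_requests` table from Step 3, and `portability_metrics` is our own illustrative helper:

```python
# Compute the portability figures the audit report summarises:
# request count, completion rate, and average turnaround in hours.
from datetime import datetime

def portability_metrics(requests: list[dict]) -> dict:
    """Aggregate audit-ready figures from portability request records."""
    completed = [r for r in requests if r["status"] == "COMPLETED"]
    # Turnaround is only defined for requests that actually completed
    hours = [
        (r["completion_date"] - r["request_date"]).total_seconds() / 3600
        for r in completed
    ]
    return {
        "total_requests": len(requests),
        "completed": len(completed),
        "compliance_rate_pct": (
            round(100.0 * len(completed) / len(requests), 2) if requests else 0.0
        ),
        "avg_completion_hours": (
            round(sum(hours) / len(hours), 1) if hours else 0.0
        ),
    }
```

Computing these figures deterministically and handing only the results to Cortex for narrative generation keeps the numbers in the report verifiable: the LLM writes prose around the metrics, it never invents them.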
Outcome: Regulator-Ready Documentation#
For compliance: Comprehensive, narrative audit reports that demonstrate Data Act compliance across all required dimensions, generated automatically and ready for regulatory submission with minimal manual intervention.
For data teams: Elimination of manual compliance reporting burden. What previously required weeks of data gathering, analysis, and report writing now happens automatically, freeing technical resources for value-adding work.
Collaboration: Bridging Data Teams and Compliance#
The most significant insight from implementing Data Act compliance is that technology alone isn’t sufficient—successful compliance requires genuine collaboration between data professionals and compliance teams.
Why Both Teams Must Work Together#
Data teams understand: System architecture, data lineage, technical capabilities, performance constraints, and implementation feasibility.
Compliance teams understand: Regulatory requirements, risk assessment, audit expectations, documentation standards, and regulatory relationships.
Neither team alone has complete visibility. Data teams might implement technically elegant solutions that don’t satisfy regulatory expectations. Compliance teams might mandate controls that are technically infeasible or prohibitively expensive.
How Cortex AI Facilitates Collaboration#
Shared language: Cortex AI translates between technical metadata and compliance terminology, allowing both teams to discuss the same concepts without specialised knowledge of each other’s domains.
Shared dashboards: Automated compliance reports provide a single source of truth that both teams can reference, eliminating conflicting interpretations of compliance status.
Shared workflows: Portability requests, access reviews, and classification updates become collaborative processes rather than sequential handoffs between teams.
Practical Collaboration Patterns#
- Joint governance meetings: Review Cortex-generated compliance reports together, with data teams explaining technical implementation and compliance teams confirming regulatory adequacy.
- Shared responsibility model: Data teams own automation and technical implementation; compliance teams own policy definition and regulatory relationships; both own compliance outcomes.
- Feedback loops: Compliance teams provide regulatory context that shapes technical implementation; data teams provide feasibility input that shapes compliance strategies.
The organisations succeeding with Data Act compliance are those that have broken down silos between data and compliance functions, using intelligent automation as the bridge.
Best Practices for Data Act Readiness#
| Practice | Data Team Focus | Compliance Team Focus |
|---|---|---|
| Start Early | Begin classification and automation implementation well before deadlines | Engage with regulators early to clarify interpretation of requirements |
| Automate Where Possible | Deploy Cortex AI for classification, access analysis, portability workflows, reporting | Focus compliance expertise on policy and risk, not manual data gathering |
| Document Everything | Implement comprehensive logging and metadata management | Maintain decision logs explaining classification rationale and access decisions |
| Regular Compliance Reviews | Schedule automated compliance scans and gap analysis | Conduct quarterly compliance assessments with regulatory lens |
| Cross-Functional Governance | Participate in governance meetings with compliance perspective on technical feasibility | Participate in governance meetings with regulatory perspective on implementation |
Critical success factor: Treat Data Act compliance as an ongoing operational capability, not a one-time project. Regulations evolve, data estates grow, and business needs change. Automated, intelligent compliance frameworks adapt; manual processes become obsolete.
Conclusion#
The EU Data Act represents a fundamental shift in how organisations must govern, access, and share data. With enforcement now active in September 2025, organisations face a stark choice: demonstrate compliance through robust, auditable processes, or face substantial financial penalties and reputational damage.
The good news? Data Act compliance is achievable when you combine the right technology with the right organisational approach. Snowflake Cortex AI transforms what would be an overwhelming manual effort into an automated, scalable compliance framework that serves both regulatory obligations and operational efficiency.
For data professionals: Cortex AI eliminates the burden of manual classification, access analysis, custom export pipelines, and compliance reporting, allowing you to focus on value-adding data work rather than regulatory overhead.
For compliance teams: Cortex AI provides the comprehensive audit trails, documentation, and evidence required to demonstrate compliance to regulators, with far greater accuracy and consistency than manual processes.
For both teams together: This is your opportunity to break down silos, establish shared compliance capabilities, and demonstrate that your organisation takes data governance seriously—not just as a regulatory obligation, but as a competitive advantage.
The organisations that thrive under the Data Act won’t be those that treat compliance as a burden to be minimised. They’ll be those that embrace intelligent automation, foster genuine collaboration between technical and compliance functions, and recognise that robust data governance creates business value beyond regulatory compliance.
Is your data ready for the Data Act? With Snowflake Cortex AI and a collaborative approach between data and compliance teams, the answer can be a confident “yes”.
Key Takeaways#
- Data Act compliance requires both regulatory expertise and technical implementation—neither compliance teams nor data teams can succeed alone
- Automated classification using Cortex AI scales to enterprise data estates, providing comprehensive data inventories required by regulators
- Intelligent access governance analysis identifies compliance gaps proactively, before auditors discover them
- Automated portability workflows transform data export from a weeks-long manual process into hours of automated execution
- AI-generated compliance reports provide regulator-ready documentation demonstrating adherence across all Data Act dimensions
- Start with collaboration: Establish joint governance between data and compliance teams, using intelligent automation as the bridge
Additional Resources#
- EU Data Act Official Text - Complete regulatory requirements
- Snowflake Cortex AI Documentation - Technical implementation guide
- Snowflake Governance Features - Tag-based governance and access control
- Data Act Compliance Checklist - European Commission guidance