EU-only alternatives to Snowflake.
Snowflake is the cloud data warehouse that won the analytics market through separation of compute and storage, with EU regions on AWS, Azure or GCP. Snowflake Inc. is a Delaware corporation; the EU regions live on US-hyperscaler infrastructure — meaning two layers of US jurisdiction. For analytics workloads on EU customer data, Schrems II compliance is genuinely difficult on Snowflake. The sovereign alternatives are: ClickHouse (open-source columnar warehouse), DuckDB (embedded analytics), or PostgreSQL with appropriate columnar extensions — all deployable on EU sovereign infrastructure.
"欧盟区域"不等于主权。四个问题决定一切。
数据驻留告诉你数据在哪里。主权告诉你哪个法律体系可以强制访问。四个答案都必须成立——否则该技术栈就不主权。
Where is the data physically stored?
Not "in the cloud": which data center, in which country, under which jurisdiction.
Who else is in your data path?
Every vendor that touches the data: CDN, mail relay, error tracking, analytics pipelines.
Which laws can compel disclosure?
A US-headquartered vendor is subject to FISA 702 and the CLOUD Act, even when the data sits in Frankfurt.
Who actually holds the encryption keys?
If the cloud vendor holds both the data and the keys, they can read the data regardless of any DPA.
Fails on jurisdiction and key custody.
EU data, a US parent company, US sub-processors in the default path, vendor-managed keys.
Passes all four.
Hosted in the EU on infrastructure from an EU-headquartered provider. Zero US sub-processors in the default path. Customer-held keys or an EU KMS. Listed by name in your Article 28 DPA.
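The four questions above reduce to a mechanical vendor check. A minimal sketch in Python; the field names and the two example vendor records are illustrative assumptions modeled on the fail/pass profiles described above, not an audit of any real provider:

```python
from dataclasses import dataclass

@dataclass
class Vendor:
    name: str
    data_location: str        # Q1: where is the data physically stored?
    path_jurisdictions: list  # Q2: jurisdiction of every party in the data path
    hq_jurisdiction: str      # Q3: which laws can compel disclosure?
    key_custody: str          # Q4: "customer", "eu_kms", or "vendor"

def is_sovereign(v: Vendor) -> bool:
    """All four answers must hold; a single failure means the stack is not sovereign."""
    return (
        v.data_location == "EU"
        and all(j == "EU" for j in v.path_jurisdictions)
        and v.hq_jurisdiction == "EU"
        and v.key_custody in ("customer", "eu_kms")
    )

# Illustrative records mirroring the fail/pass cards above.
us_warehouse_eu_region = Vendor(
    "US warehouse, Frankfurt region", "EU", ["US"], "US", "vendor")
eu_clickhouse = Vendor(
    "Self-managed ClickHouse on EU bare metal", "EU", ["EU"], "EU", "customer")

assert not is_sovereign(us_warehouse_eu_region)  # fails jurisdiction + key custody
assert is_sovereign(eu_clickhouse)               # passes all four
```

In practice this becomes a spreadsheet row per vendor in the Article 28 DPA review; the point is that one "no" anywhere fails the whole stack.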
Why teams are exiting Snowflake
The Snowflake exits we have scoped come from regulated workloads where the analytics warehouse holds personal data of EU customers and the Schrems II analysis fails on multiple layers. The migration challenge is distinctive: data warehouses are large, queries are complex, and dbt / Looker / Tableau pipelines need re-pointing. The honest answer for a Snowflake exit is 3-6 months of careful work, not a quick swap. Where the savings come from: Snowflake credit spend at scale ($20k-100k+/month is common) compresses to a fraction on ClickHouse running on EU bare metal.
Snowflake services and their EU-only equivalents
A migration is not a box swap. The mapping below is what we run for clients leaving Snowflake on Schrems II grounds: full EU jurisdiction, with no US parent company in the data path.
| Snowflake 服务 | 仅欧盟替代方案 | 工程说明 |
|---|---|---|
| Snowflake compute (warehouses) | ClickHouse on EU compute (Hetzner dedicated, OVH bare metal), self-managed Trino on EU | ClickHouse is the strongest sovereign alternative for OLAP workloads. For ad-hoc query workloads, Trino over EU object storage is the lakehouse pattern. |
| Snowflake storage | OVH Object Storage, Wasabi EU as data lake, ClickHouse internal storage on EU NVMe | For lakehouse architecture, EU S3-compatible storage as the data layer with ClickHouse or Trino as the query engine. |
| Snowpipe (continuous ingestion) | ClickHouse Kafka engine, custom ingestion via Apache Airflow on EU compute, self-hosted dbt Core in place of dbt Cloud | For Kafka-based ingestion, ClickHouse has a native Kafka engine. For batch ingestion, Airflow on EU compute. |
| Streams & Tasks | Apache Airflow on EU, ClickHouse materialized views, Postgres triggers + LISTEN/NOTIFY | Materialized views in ClickHouse cover most "Stream" use cases. |
| Snowpark (Python/Scala in DB) | PySpark on EU compute, ClickHouse Python UDFs, dbt models with Python | For ML and feature engineering at the warehouse layer, PySpark on EU compute is the standard pattern. |
| Time Travel + Zero-Copy Cloning | ClickHouse table snapshots, PostgreSQL pg_dump + restore, application-layer point-in-time queries | Snowflake's Time Travel is a unique feature; ClickHouse snapshots provide only a rough equivalent. |
| Secure Data Sharing | Bring-your-own-key encrypted exports to EU object storage, custom API layer for shared datasets | Secure Data Sharing has no direct equivalent; the migration involves redesigning the data-sharing pattern. |
| Snowflake Marketplace | Direct vendor relationships for any third-party data, EU-hosted data marketplaces (limited maturity) | For datasets you currently subscribe to via Marketplace, direct vendor contracts are typically required. |
| Snowflake Cortex (LLMs) | Mistral AI (FR), Aleph Alpha (DE), self-hosted Llama on EU GPUs | Cortex is recent; the sovereign EU LLM space (Mistral, Aleph Alpha) has matured into a real alternative. |
| BI tool integrations (Tableau, Looker, dbt Cloud) | Same BI tools repointed to ClickHouse / Trino, Metabase self-hosted, dbt Core (open-source) on EU runners | The BI tool layer typically transfers cleanly with new connection strings; dbt Cloud → dbt Core on self-hosted EU CI. |
How we migrate off Snowflake
A typical mid-market migration runs in three phases. The numbers below assume an engineering team of 6-10 and a moderately complex application stack.
Architecture decision + audit
Decide ClickHouse vs Trino+lakehouse vs PostgreSQL based on query patterns and data volume. Inventory every dbt model, every dashboard, every external integration. The architecture decision dominates the schedule.
Pilot + parallel run
Migrate a representative subset of workloads to the EU target. Run parallel for validation. Tune ClickHouse cluster sizing based on real query patterns. dbt models converted (most run unchanged on dbt Core with adapter swap).
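Parallel-run validation can be as simple as comparing order-insensitive fingerprints of the same query run on both warehouses. A minimal sketch; the row data is fabricated for illustration, and in practice the two inputs come from the Snowflake and ClickHouse connections after normalizing types (e.g. decimals vs floats):

```python
import hashlib

def fingerprint(rows):
    """Order-insensitive fingerprint: sum of per-row SHA-256 digests."""
    return sum(
        int.from_bytes(hashlib.sha256(repr(row).encode()).digest(), "big")
        for row in rows
    )

def results_match(old_rows, new_rows):
    # Same row count and the same order-insensitive fingerprint.
    return (len(old_rows) == len(new_rows)
            and fingerprint(old_rows) == fingerprint(new_rows))

# Illustrative: the same daily-revenue aggregate pulled from both systems.
old = [("2024-01-01", 1200), ("2024-01-02", 980)]   # e.g. from Snowflake
new = [("2024-01-02", 980), ("2024-01-01", 1200)]   # same query on ClickHouse
assert results_match(old, new)                       # order may differ
assert not results_match(old, [("2024-01-01", 1200), ("2024-01-02", 981)])
```

Running this per dbt model during the parallel phase catches drift early, before any dashboard is repointed.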
Full cutover
Phased migration of remaining workloads. BI tools repointed. Snowflake accounts scoped down. Final cutover with a rollback plan; Snowflake retained for archival access for 60-90 days post-cutover.
5-year TCO on Snowflake → ClickHouse migrations: typically 60-85% cheaper at scale. A team running $50k/month of Snowflake credits often replaces it with €5-10k/month of EU ClickHouse infrastructure plus the managed-partner fee. The break-even point is around $5-10k/month of Snowflake spend; below that, the engineering cost of migration may exceed the saved spend over a 3-year horizon.
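The break-even arithmetic above can be sketched directly. The figures are the illustrative ones from the text; the migration cost and EUR/USD rate are assumptions, not quotes:

```python
def three_year_saving(snowflake_monthly_usd, clickhouse_monthly_eur,
                      migration_cost_usd, eur_usd=1.08):
    """Net 3-year saving of moving a Snowflake workload to EU ClickHouse.

    Toy model: constant monthly spend, a one-off migration cost,
    and a flat EUR/USD rate. All inputs are illustrative assumptions.
    """
    old = snowflake_monthly_usd * 36
    new = clickhouse_monthly_eur * eur_usd * 36
    return old - new - migration_cost_usd

# $50k/month of Snowflake credits replaced by €8k/month of EU ClickHouse
# infrastructure, with a hypothetical $300k migration project:
assert three_year_saving(50_000, 8_000, 300_000) > 1_000_000  # clear win

# Below roughly $5-10k/month, migration cost can exceed the 3-year saving:
assert three_year_saving(5_000, 2_000, 150_000) < 0
```

The shape of the model matters more than the exact numbers: savings scale linearly with Snowflake spend while the migration cost is fixed, which is what puts the break-even around $5-10k/month.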
Frequently asked questions
Snowflake has Frankfurt and other EU regions — does that solve GDPR?
No. Snowflake Inc. is US-headquartered (parent jurisdiction), and the EU regions run on AWS/Azure/GCP, which are also US-headquartered (infrastructure jurisdiction). That is two layers of US legal exposure under the CLOUD Act and FISA 702, and for Schrems II–strict workloads neither layer is acceptable.
Is ClickHouse really comparable to Snowflake?
For OLAP query workloads, ClickHouse is genuinely competitive — often faster on equivalent hardware. The differences: ClickHouse requires more operational expertise, Snowflake's separation of compute and storage is harder to replicate cleanly, and Snowflake's ecosystem (Marketplace, Cortex, etc.) doesn't fully exist on ClickHouse. For pure analytics workloads, the gap is small.
What about ClickHouse Cloud — they have an EU region?
ClickHouse Inc. is a US Delaware corporation. ClickHouse Cloud EU regions run on AWS — same dual US-jurisdiction problem as Snowflake. The sovereign answer is self-hosted ClickHouse on EU compute. Aiven offers managed ClickHouse with a clearer EU-jurisdiction story (Aiven is Finnish).
How does dbt fit in?
dbt Core is open-source and runs anywhere; dbt Cloud is dbt Labs Inc. (US). For sovereign workloads, dbt Core on a self-hosted CI runner (GitLab CI EU, Forgejo Actions) replaces dbt Cloud. The actual dbt models port cleanly with the warehouse adapter swap (snowflake → clickhouse).
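The adapter swap is mostly a `profiles.yml` change. A hedged sketch of a ClickHouse target using the community `dbt-clickhouse` adapter; the project name, host, credentials, and schema are placeholders, and the exact fields supported depend on your adapter version:

```yaml
my_project:                 # placeholder: your dbt project's profile name
  target: clickhouse_eu
  outputs:
    clickhouse_eu:
      type: clickhouse      # was: snowflake
      host: ch.example.eu   # placeholder: your EU ClickHouse endpoint
      port: 8443
      user: dbt
      password: "{{ env_var('CLICKHOUSE_PASSWORD') }}"
      schema: analytics
      secure: true
```

The models themselves usually need only minor SQL-dialect fixes; it is the profile and the CI runner (dbt Cloud → dbt Core on EU-hosted CI) that actually move jurisdiction.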
How long does a Snowflake exit really take?
For small-to-mid Snowflake usage ($5-20k/month, dozens of dbt models): 3-6 months elapsed time. For enterprise Snowflake ($50k+/month, hundreds of models, complex data sharing): 9-18 months. Snowflake migrations are not weekend projects — they require planning, parallel runs, and careful BI-layer choreography.
Can we keep some Snowflake and migrate the rest?
Hybrid is sometimes the right answer for very specific Snowflake-only features. The discipline: keep only non-personal-data workloads on Snowflake (e.g. internal analytics on aggregated metrics with no PII), and document the boundary in the DPA. For most regulated workloads, full exit is cleaner than the documentation burden of a hybrid.