Snowpark Container Services (SPCS) Use Cases
Docker containers running inside Snowflake? It's real.
Snowpark Container Services (SPCS) has been the “cool new toy” for a while, but in 2025 it’s hitting mainstream adoption. The ability to run any Docker image inside the Snowflake perimeter is an architectural paradigm shift.
It means Snowflake is no longer just a SQL engine; it’s a compute platform.
1. Hosting Legacy/Custom Apps#
Do you have a dusty C++ executable developed in 2005 that calculates credit risk? Or a specialized Fortran library for geological modeling?
Previously, you had to extract data from Snowflake, move it to an EC2 instance, run the executable, and load the results back. Painful.
With SPCS, you package that executable in a Docker container, upload it to the Snowflake Image Registry, and run it next to the data.
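For batch work like this, the run-to-completion pattern looks roughly like the following sketch. All names here are hypothetical; the spec file is assumed to point at your container image and entrypoint:

```sql
-- Sketch with hypothetical names: runs the containerized risk engine
-- against the data in place, then the job exits when it finishes.
EXECUTE JOB SERVICE
  IN COMPUTE POOL my_cpu_pool
  NAME = credit_risk_job
  FROM @risk_spec_stage
  SPECIFICATION_FILE = 'risk_job.yml';
```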
2. Long-Running Data-Intensive Services#
Standard UDFs have timeouts. SPCS services come in two flavors: “Job Services” (run to completion, then exit) and “Long-Running Services” (e.g., web servers).
A great use case is hosting a Python Flask or FastAPI server that provides specialized data transformations or validation logic that is too complex for a UDF. You can expose this service via a Service Function, so from SQL it looks like a normal function call, but it routes to your container.
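A minimal sketch of that wiring, assuming a hypothetical validate_service whose spec declares an endpoint named api:

```sql
-- Hypothetical names: validate_service and its 'api' endpoint are assumed
-- to be declared in the service specification.
CREATE FUNCTION validate_record(payload VARCHAR)
  RETURNS VARCHAR
  SERVICE = validate_service
  ENDPOINT = api
  AS '/validate';

-- Callers see an ordinary scalar function; Snowflake routes each call
-- to the container listening behind the endpoint.
SELECT validate_record(raw_payload) FROM staged_events;
```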
3. Generative AI & LLMs (The Big One)#
While Cortex provides managed models, sometimes you need complete control:
- You want to run a specific open-source model from HuggingFace (e.g., a fine-tuned Mistral 7B).
- You want to use a GPU-accelerated library.
SPCS provides GPU compute pools. You can spin up an NVIDIA A10G instance, load your model, and expose an inference endpoint securely.
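A sketch of the pool itself. GPU_NV_S is the instance family backed by A10G GPUs at the time of writing (availability varies by region and cloud); the other values are illustrative:

```sql
-- Illustrative GPU pool for self-hosted inference. AUTO_SUSPEND_SECS
-- suspends the (expensive) nodes when nothing is running on the pool.
CREATE COMPUTE POOL llm_gpu_pool
  MIN_NODES = 1
  MAX_NODES = 1
  INSTANCE_FAMILY = GPU_NV_S
  AUTO_SUSPEND_SECS = 600;
```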
How it works (Simplified)#
1. Build:

   ```bash
   docker build -t my-app .
   ```

2. Push:

   ```bash
   docker push my-org.registry.snowflakecomputing.com/db/schema/repo/my-app:latest
   ```

3. Run:

   ```sql
   CREATE SERVICE my_service
     IN COMPUTE POOL my_gpu_pool
     FROM @my_spec_stage
     SPECIFICATION_FILE = 'service.yml';
   ```
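Once the service is created, two system functions help confirm it actually came up. A sketch; the instance index and container name come from your service.yml:

```sql
-- Overall health, then logs from instance '0' of the container named
-- 'main' (both values depend on what your spec declares).
SELECT SYSTEM$GET_SERVICE_STATUS('my_service');
SELECT SYSTEM$GET_SERVICE_LOGS('my_service', '0', 'main');
```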
Conclusion#
SPCS eliminates the “Data Gravity” friction. Instead of moving massive data to your compute, you bring your custom compute to the massive data.