Snowflake Depots

To read or write data on a Snowflake data source, you first need to create a Depot on top of it. If you haven't created one yet, refer to: Snowflake Depot.

Read Config

Once you have set up a Snowflake Depot, you can start reading data from it.

snowflake_depot_read.yml
version: v1
name: snowflake-read-05
type: workflow
tags:
  - Connect
  - read
  - write
description: Job that reads data from Snowflake
workflow:
  title: Connect Snowflake
  dag:
    - name: read-snowflake-01
      title: Reading data from Snowflake
      description: This job reads data from Snowflake and writes it to a Lakehouse depot
      spec:
        tags:
          - Connect
          - read
        stack: flare:6.0
        compute: runnable-default
        flare:
          job:
            explain: true
            inputs:
            - name: city_connect
              dataset: dataos://sanitysnowflake:public/snowflake_write_12
              format: snowflake
              options:
                sfWarehouse: "TMDCWH"
            logLevel: INFO
            outputs:
              - name: cities
                dataset: dataos://lakehouse:smoketest/snowflake_read_12?acl=rw
                format: iceberg
                description: City data ingested from Snowflake
                options:
                  saveMode: append
                tags:
                  - Connect
                title: City Source Data
            steps:
              - sequence:
                - name: cities
                  sql: SELECT * FROM city_connect LIMIT 10

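The `dataset` fields in the manifest above use DataOS's uniform address scheme, `dataos://<depot>:<collection>/<dataset>`, optionally followed by a query string such as `?acl=rw` to request read-write access on the target. A minimal sketch of pulling those parts out with the Python standard library (the helper name `parse_udl` is just illustrative, not a DataOS API):

```python
from urllib.parse import urlparse, parse_qs

def parse_udl(address: str) -> dict:
    """Split a dataos:// dataset address into depot, collection, dataset, and query params."""
    parsed = urlparse(address)
    # netloc holds "<depot>:<collection>"; path holds "/<dataset>"
    depot, _, collection = parsed.netloc.partition(":")
    return {
        "depot": depot,
        "collection": collection,
        "dataset": parsed.path.lstrip("/"),
        "params": parse_qs(parsed.query),
    }

# The input address from the read config above:
print(parse_udl("dataos://sanitysnowflake:public/snowflake_write_12"))
# The output address, where acl=rw requests write access:
print(parse_udl("dataos://lakehouse:smoketest/snowflake_read_12?acl=rw"))
```

This makes the roles of the two addresses explicit: the input points at a table in the Snowflake depot, while the output targets an Iceberg dataset in the Lakehouse depot with write permission.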
Write Config

You can also write data into the Snowflake Depot. The manifest below reads city data from a third-party depot and writes it to Snowflake.

snowflake_depot_write.yml
version: v1
name: snowflake-write-01
type: workflow
tags:
  - Connect
  - read
  - write
description: Job that reads data from a third-party depot and writes it to Snowflake
workflow:
  title: Connect Snowflake
  dag:
    - name: write-snowflake-01
      title: Reading data and writing to Snowflake
      description: This job writes data to Snowflake
      spec:
        tags:
          - Connect
          - write
        stack: flare:6.0
        compute: runnable-default
        flare:
          job:
            explain: true
            inputs:
            - name: city_connect
              dataset: dataos://thirdparty01:none/city
              format: csv
              schemaPath: dataos://thirdparty01:none/schemas/avsc/city.avsc
            logLevel: INFO
            outputs:
              - name: finalDf
                dataset: dataos://sanitysnowflake:public/snowflake_write_12?acl=rw
                format: snowflake
                description: City data ingested from external csv
                title: City Source Data
                options:
                  extraOptions:
                    sfWarehouse: "TMDCWH"
            steps:
              - sequence:
                  - name: finalDf
                    sql: SELECT * FROM city_connect LIMIT 10
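The CSV input above is paired with an Avro schema file via `schemaPath`, which fixes the column names and types for the otherwise schemaless CSV. The actual contents of `city.avsc` are not shown in this page; purely to illustrate the shape of such a file, here is a hypothetical minimal record schema (Avro schemas are plain JSON, so it can be built and serialized with the standard library):

```python
import json

# Hypothetical sketch of an .avsc file like the one referenced by schemaPath;
# the real city.avsc defines the actual columns of the city dataset.
city_schema = {
    "type": "record",
    "name": "city",
    "fields": [
        {"name": "city_id", "type": "string"},
        {"name": "city_name", "type": "string"},
        {"name": "state_code", "type": "string"},
    ],
}

# Serialize exactly as it would appear in an .avsc file
print(json.dumps(city_schema, indent=2))
```

At read time, Flare applies the schema to the CSV so the downstream SQL step sees typed, named columns rather than raw text fields.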