Add Snowflake JDBC Connector (trinodb#11)
Had to redo the connector because all the rebases caused havoc
yuuteng committed Feb 22, 2024
1 parent 590f710 commit ec745a0
Showing 25 changed files with 2,858 additions and 0 deletions.
6 changes: 6 additions & 0 deletions core/trino-server/src/main/provisio/trino.xml
@@ -296,6 +296,12 @@
</artifact>
</artifactSet>

<artifactSet to="plugin/snowflake">
<artifact id="${project.groupId}:trino-snowflake:zip:${project.version}">
<unpack />
</artifact>
</artifactSet>

<artifactSet to="plugin/sqlserver">
<artifact id="${project.groupId}:trino-sqlserver:zip:${project.version}">
<unpack />
1 change: 1 addition & 0 deletions docs/src/main/sphinx/connector.md
@@ -38,6 +38,7 @@ Prometheus <connector/prometheus>
Redis <connector/redis>
Redshift <connector/redshift>
SingleStore <connector/singlestore>
Snowflake <connector/snowflake>
SQL Server <connector/sqlserver>
System <connector/system>
Thrift <connector/thrift>
96 changes: 96 additions & 0 deletions docs/src/main/sphinx/connector/snowflake.md
@@ -0,0 +1,96 @@
# Snowflake connector

```{raw} html
<img src="../_static/img/snowflake.png" class="connector-logo">
```

The Snowflake connector allows querying and creating tables in an
external [Snowflake](https://www.snowflake.com/) account. This can be used to join data between
different systems like Snowflake and Hive, or between two different
Snowflake accounts.

## Configuration

To configure the Snowflake connector, create a catalog properties file
in `etc/catalog` named, for example, `example.properties`, to
mount the Snowflake connector as the `example` catalog.
Create the file with the following contents, replacing the
connection properties as appropriate for your setup:

```none
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
connection-user=root
connection-password=secret
snowflake.account=account
snowflake.database=database
snowflake.role=role
snowflake.warehouse=warehouse
```
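
Once the catalog is mounted, Snowflake data is available under the `example`
catalog. The following is a minimal sketch; the schema, table, and column
names, as well as the `hive` catalog used in the join, are hypothetical and
must be replaced with objects that exist in your environment:

```sql
-- List the schemas of the configured Snowflake database
SHOW SCHEMAS FROM example;

-- Read a (hypothetical) Snowflake table
SELECT * FROM example.public.orders LIMIT 10;

-- Join Snowflake data with data from another catalog, for example Hive
SELECT o.order_id, c.customer_name
FROM example.public.orders o
JOIN hive.default.customers c ON o.customer_id = c.customer_id;
```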

### Arrow serialization support

This is an experimental feature that adds support for using Apache Arrow
as the serialization format when reading from Snowflake. Note the
following caveats:

- Apache Arrow serialization is disabled by default. To enable it, add
  `--add-opens=java.base/java.nio=ALL-UNNAMED` to the Trino
  {ref}`jvm-config`, as shown below.
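
For example, the only change needed in `etc/jvm.config` is the added option
itself; all existing JVM options stay as they are:

```none
--add-opens=java.base/java.nio=ALL-UNNAMED
```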

### Multiple Snowflake databases or accounts

The Snowflake connector can only access a single database within
a Snowflake account. Thus, if you have multiple Snowflake databases,
or want to connect to multiple Snowflake accounts, you must configure
multiple instances of the Snowflake connector.
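
A minimal sketch of such a setup follows, using hypothetical catalog file and
database names; every file placed in `etc/catalog` becomes a separate catalog:

```none
# etc/catalog/snowflake_sales.properties
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
snowflake.database=SALES_DB
# remaining connection properties as in the configuration example above

# etc/catalog/snowflake_marketing.properties
connector.name=snowflake
connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
snowflake.database=MARKETING_DB
# remaining connection properties as in the configuration example above
```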

(snowflake-type-mapping)=

## Type mapping

The connector maps Snowflake types to the corresponding Trino types according
to the following table:

| Snowflake type | Trino type                 |
| -------------- | -------------------------- |
| `boolean`      | `boolean`                  |
| `tinyint`      | `bigint`                   |
| `smallint`     | `bigint`                   |
| `byteint`      | `bigint`                   |
| `int`          | `bigint`                   |
| `integer`      | `bigint`                   |
| `bigint`       | `bigint`                   |
| `float`        | `real`                     |
| `real`         | `real`                     |
| `double`       | `double`                   |
| `decimal`      | `decimal(P,S)`             |
| `varchar(n)`   | `varchar(n)`               |
| `char(n)`      | `varchar(n)`               |
| `binary(n)`    | `varbinary`                |
| `varbinary`    | `varbinary`                |
| `date`         | `date`                     |
| `time`         | `time`                     |
| `timestampntz` | `timestamp`                |
| `timestamptz`  | `timestamp with time zone` |
| `timestampltz` | `timestamp with time zone` |

See the Snowflake documentation for a complete list of [Snowflake data types](https://docs.snowflake.com/en/sql-reference/intro-summary-data-types.html).

(snowflake-sql-support)=

## SQL support

The connector provides read and write access to data and metadata in
a Snowflake database. In addition to the {ref}`globally available
<sql-globally-available>` and {ref}`read operation <sql-read-operations>`
statements, the connector supports the following statements:

- {doc}`/sql/insert`
- {doc}`/sql/delete`
- {doc}`/sql/truncate`
- {doc}`/sql/create-table`
- {doc}`/sql/create-table-as`
- {doc}`/sql/drop-table`
- {doc}`/sql/alter-table`
- {doc}`/sql/create-schema`
- {doc}`/sql/drop-schema`
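
For example, the following statements sketch a typical write workflow through
the catalog configured above; the schema, table, and column names are
hypothetical:

```sql
-- Create a schema and a table in the Snowflake database
CREATE SCHEMA example.sales;

CREATE TABLE example.sales.orders (
    order_id bigint,
    order_date date,
    total decimal(12, 2)
);

-- Write and read data
INSERT INTO example.sales.orders
VALUES (1, DATE '2024-02-22', 99.90);

SELECT * FROM example.sales.orders;
```
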
Binary file added docs/src/main/sphinx/static/img/snowflake.png
247 changes: 247 additions & 0 deletions plugin/trino-snowflake/pom.xml
@@ -0,0 +1,247 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<parent>
<groupId>io.trino</groupId>
<artifactId>trino-root</artifactId>
<version>435-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>trino-snowflake</artifactId>
<packaging>trino-plugin</packaging>
<description>Trino - Snowflake Connector</description>

<properties>
<air.main.basedir>${project.parent.basedir}</air.main.basedir>
</properties>

<dependencies>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</dependency>

<dependency>
<groupId>com.google.inject</groupId>
<artifactId>guice</artifactId>
</dependency>

<dependency>
<groupId>io.airlift</groupId>
<artifactId>configuration</artifactId>
</dependency>

<dependency>
<groupId>io.airlift</groupId>
<artifactId>log</artifactId>
</dependency>

<dependency>
<groupId>io.trino</groupId>
<artifactId>trino-base-jdbc</artifactId>
</dependency>

<dependency>
<groupId>io.trino</groupId>
<artifactId>trino-plugin-toolkit</artifactId>
</dependency>

<dependency>
<groupId>net.snowflake</groupId>
<artifactId>snowflake-jdbc</artifactId>
<version>3.13.32</version>
</dependency>

<!-- Trino SPI -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<scope>provided</scope>
</dependency>

<dependency>
<groupId>io.airlift</groupId>
<artifactId>slice</artifactId>
<scope>provided</scope>
</dependency>

<dependency>
<groupId>io.opentelemetry</groupId>
<artifactId>opentelemetry-api</artifactId>
<scope>provided</scope>
</dependency>

<dependency>
<groupId>io.opentelemetry</groupId>
<artifactId>opentelemetry-context</artifactId>
<scope>provided</scope>
</dependency>

<dependency>
<groupId>io.trino</groupId>
<artifactId>trino-spi</artifactId>
<scope>provided</scope>
</dependency>

<dependency>
<groupId>org.openjdk.jol</groupId>
<artifactId>jol-core</artifactId>
<scope>provided</scope>
</dependency>

<!-- for testing -->
<dependency>
<groupId>io.airlift</groupId>
<artifactId>testing</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>io.trino</groupId>
<artifactId>trino-base-jdbc</artifactId>
<type>test-jar</type>
<scope>test</scope>
</dependency>

<dependency>
<groupId>io.trino</groupId>
<artifactId>trino-main</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>io.trino</groupId>
<artifactId>trino-main</artifactId>
<type>test-jar</type>
<scope>test</scope>
</dependency>

<dependency>
<groupId>io.trino</groupId>
<artifactId>trino-testing</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>io.trino</groupId>
<artifactId>trino-testing-services</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>io.trino</groupId>
<artifactId>trino-tpch</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>io.trino.tpch</groupId>
<artifactId>tpch</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>org.assertj</groupId>
<artifactId>assertj-core</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>org.jetbrains</groupId>
<artifactId>annotations</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-engine</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>jdbc</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>testcontainers</artifactId>
<scope>test</scope>
</dependency>

<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<scope>test</scope>
</dependency>
</dependencies>

<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
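<!-- Needed so tests can read Snowflake results serialized with Apache Arrow; see the Arrow serialization section of the connector docs -->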
<argLine>--add-opens=java.base/java.nio=ALL-UNNAMED</argLine>
</configuration>
</plugin>
</plugins>
</build>

<profiles>
<profile>
<id>default</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<excludes>
<exclude>**/TestSnowflakeClient.java</exclude>
<exclude>**/TestSnowflakeConfig.java</exclude>
<exclude>**/TestSnowflakeConnectorTest.java</exclude>
<exclude>**/TestSnowflakePlugin.java</exclude>
<exclude>**/TestSnowflakeTypeMapping.java</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
</profile>

<profile>
<!-- Tests that require third-party cloud services are separated from the main test profile -->
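<!-- A typical invocation might be: ./mvnw test -P cloud-tests -pl plugin/trino-snowflake (requires access to a Snowflake account) -->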
<id>cloud-tests</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<includes>
<include>**/TestSnowflakeClient.java</include>
<include>**/TestSnowflakeConfig.java</include>
<include>**/TestSnowflakeConnectorTest.java</include>
<include>**/TestSnowflakePlugin.java</include>
<include>**/TestSnowflakeTypeMapping.java</include>
</includes>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
</project>
