
Install and configure Snowflake JDBC driver

Warning: This operation should only be executed by your database administrator.

You can install a new JDBC driver by registering a data source and adding the JDBC driver as part of that procedure. Once you have installed the JDBC driver, you can use it to register other data sources of the same type.

Prerequisites

  • You have downloaded the JDBC driver of your choice as an archive file (for example, ZIP or JAR).
    Tip: You can find a wide range of drivers on Collibra Marketplace.
  • You have configured one or more Jobservers in Collibra Console. If there is no available Jobserver, the Register data source actions will be grayed out in the global create menu of Collibra Data Governance Center.

Steps

  1. In the main menu, click Catalog.
    The Catalog Home opens.
  2. In the main menu, click the Create button.
    The Create dialog box appears.
  3. In the Create dialog box, click Register data source (use a Collibra provided driver).
  4. If a JDBC driver is already installed for your data source:
    1. Enter the schema properties.
      Field Description
      Schema name: This name is used in Collibra DGC as the Schema asset and must therefore be unique.
      Schema description: The description of the schema. This is used as the description of the Schema asset.
      Data owner: The owner of the registered data in Collibra DGC.
    2. Click Next.
    3. In the JDBC driver version field, click manage drivers....
  5. Do one of the following:
    • Click Add JDBC Driver if you want to create a new JDBC driver.
    • Click the edit icon if you want to edit an existing JDBC driver.
  6. Enter the required information.
    Field Description
    JDBC Driver Version Name: The name of the JDBC driver.
    Upload: Button to navigate to the driver file in JAR or ZIP format and upload it.
    Driver files: This table contains a list of uploaded driver files in JAR or ZIP format. You can remove a driver file by clicking the delete icon.

  7. Click Next.
  8. Configure the JDBC connection.
    Field Description
    Connection

    The JDBC connection string is:

    jdbc:snowflake:

    Driver Class Name

    Driver class name of the connection:

    cdata.jdbc.snowflake.SnowflakeDriver

    Connection properties

    In addition to providing authentication (see below), set the following properties to connect to a Snowflake database; an example connection string follows the list:

    • Url: Both AWS and Azure instances are supported. For example:
      • AWS: https://myaccount.region.snowflakecomputing.com
      • Azure: https://myaccount.region.azure.snowflakecomputing.com
    • Warehouse: warehouse name
    • Database: database name
    • Schema: schema name
    • AsyncQueryTimeout: the connection timeout. This must be set to a large value, for example 7200.
    • Account: only required if your Url does not follow the usual syntax with the account name at the beginning. Snowflake provides the Account name needed in this case.
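    For example, a complete connection string combining these properties could look like the following; the account, warehouse, database, and schema names are placeholders:

    jdbc:snowflake:url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;Schema=PUBLIC;AsyncQueryTimeout=7200;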

    Authenticating to Snowflake

    The driver supports Snowflake user authentication, federated authentication, and SSL client authentication. To authenticate, set User and Password, and select the authentication method in the AuthScheme property.

    Using Snowflake Password Authentication

    Set User and Password to a Snowflake user and set AuthScheme to PASSWORD.
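    For example, a minimal connection string using password authentication (placeholder values):

    jdbc:snowflake:AuthScheme=PASSWORD;url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;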

    Using Snowflake Key Pair Authentication

    The driver allows you to authenticate using key pair authentication by creating a secure token with the private key defined for your user account. To connect with this method, set AuthScheme to PRIVATEKEY and set the following values (an example follows the list):

    • User: The user account to authenticate as.
    • PrivateKey: The private key used for the user, such as the path to the .pem file containing the private key.
    • PrivateKeyType: The type of key store containing the private key, such as PEMKEY_FILE, PFXFILE, and so on.
    • PrivateKeyPassword: The password for the specified private key.
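    For example, a connection string using key pair authentication could look like the following; the key file path and credentials are placeholders:

    jdbc:snowflake:AuthScheme=PRIVATEKEY;url=https://myaccount.region.snowflakecomputing.com;user=Admin;PrivateKey=C:\keys\snowflake_rsa_key.pem;PrivateKeyType=PEMKEY_FILE;PrivateKeyPassword=test123;Database=Northwind;Warehouse=TestWarehouse;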

    Using Federated Authentication

    To use federated authentication, set the User and Password that you need to authenticate to your SSO identity provider and set the following properties to configure the authentication scheme.

    Set AuthScheme based on your IdP (currently, the driver supports Okta only).

    • OKTA: Set AuthScheme to OKTA and set SSOIDPDomain to the Okta SAML endpoint, for example https://cdata-okta.okta.com. An example connection string follows.
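    For example, a connection string for Okta federated authentication could look like the following; the Okta domain and credentials are placeholders:

    jdbc:snowflake:AuthScheme=OKTA;SSOIDPDomain=https://cdata-okta.okta.com;user=user@example.com;password=test123;url=https://myaccount.region.snowflakecomputing.com;Database=Northwind;Warehouse=TestWarehouse;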

    Using SSL Client Authentication

    To authenticate with an SSL client certificate, set SSLClientCert, SSLClientCertPassword, SSLClientCertSubject, SSLClientCertType, and SSLServerCert.
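    For example, a connection string using SSL client authentication could look like the following; the certificate path, type, subject, password, and server certificate value are illustrative placeholders:

    jdbc:snowflake:url=https://myaccount.region.snowflakecomputing.com;SSLClientCert=C:\certs\client.pfx;SSLClientCertType=PFXFILE;SSLClientCertPassword=test123;SSLClientCertSubject=<certificate subject>;SSLServerCert=<server certificate>;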

    Configuring Access Control

    If the authenticating user maps to a system-defined role, specify it in the RoleName property.
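    For example, to connect with the SYSADMIN role (assuming your user maps to it):

    jdbc:snowflake:RoleName=SYSADMIN;url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;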

    See the Connection properties section below for a complete overview of the available connection properties.

  9. Click Create.
  10. In the Register data source dialog box, enter the schema properties.
    Field Description
    Schema name: This name is used in Collibra DGC as the Schema asset and must therefore be unique.
    Schema description: The description of the schema. This is used as the description of the Schema asset.
    Data owner: The owner of the registered data in Collibra DGC.
  11. Click Next.
  12. Enter the database connection properties.
    Option Description

    JDBC driver version

    The JDBC driver to connect to your database.

    Connect via

    The jobserver used for ingesting.

    URL

    Address of the used database. Use the format hostname:port.

    ...
    <The connection properties that are defined in JDBC driver>
    ...

    Store credentials

    Select this option to store the credentials to access the database. With a schema refresh, you can clear this option again.

    Username: Username to access the database.
    Password: The corresponding password to access the database.

    Schedule data refresh

    Enable or disable a schedule to automatically refresh the data registration.
    Cron pattern

    Schedule of the data refresh as a Cron pattern.

    If you create an invalid Cron pattern, Collibra Data Governance Center stops responding.

    Time zone: The time zone of the database.
    Note: If Collibra DGC cannot connect to the database, you cannot continue the data source registration wizard.
    Depending on the selected database, some of the options are not available.
  13. Click Next.
  14. Select the data profiling options.

    Option Description

    Store Data Profile

    Option to perform data profiling on the registered data.
    Detect advanced data types: Option to detect advanced data types in the data source.

    Store Sample Data

    Option to extract sample data from the registered data.
  15. Click Create.

Advanced settings

Customizing the SSL Configuration

By default, the driver attempts to negotiate SSL/TLS by checking the server's certificate against the system's trusted certificate store. To specify another certificate, see the SSLServerCert property for the available formats to do so.

Connecting Through a Firewall or Proxy

HTTP Proxies

To connect through the Windows system proxy, you do not need to set any additional connection properties. To connect to other proxies, set ProxyAutoDetect to false.

To authenticate to an HTTP proxy, set ProxyAuthScheme, ProxyUser, and ProxyPassword in addition to ProxyServer and ProxyPort.
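For example, a connection string that routes traffic through an authenticated HTTP proxy could look like the following; the proxy host and credentials are placeholders:

jdbc:snowflake:url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;ProxyAutoDetect=false;ProxyServer=proxy.example.com;ProxyPort=8080;ProxyAuthScheme=BASIC;ProxyUser=proxyuser;ProxyPassword=proxypass;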

Other Proxies

Set the following properties:

  • To use a proxy-based firewall, set FirewallType, FirewallServer, and FirewallPort.
  • To tunnel the connection, set FirewallType to TUNNEL.
  • To authenticate, specify FirewallUser and FirewallPassword.
  • To authenticate to a SOCKS proxy, additionally set FirewallType to SOCKS5.

Troubleshooting the Connection

To show driver activity from query execution to network traffic, use Logfile and Verbosity. The common connection errors below show how to use these properties to get more context; an example connection string follows the list. Contact the support team for help tracing the source of an error or circumventing a performance issue.

  • Authentication errors: Typically, recording a Logfile at Verbosity 4 is necessary to get full details on an authentication error.
  • Queries time out: A server that takes too long to respond will exceed the driver's client-side timeout. Often, setting the Timeout property to a higher value will avoid a connection error. Another option is to disable the timeout by setting the property to 0. Setting Verbosity to 2 will show where the time is being spent.
  • The certificate presented by the server cannot be validated: This error indicates that the driver cannot validate the server's certificate through the chain of trust. If you are using a self-signed certificate, there is only one certificate in the chain.

    To resolve this error, you must verify yourself that the certificate can be trusted and specify to the driver that you trust the certificate. One way you can specify that you trust a certificate is to add the certificate to the trusted system store; another is to set SSLServerCert.
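For example, to capture a detailed log while reproducing an authentication error, you could add Logfile and Verbosity to the connection string; the log file path is a placeholder:

jdbc:snowflake:url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Logfile=C:\Temp\snowflake.log;Verbosity=4;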

Connection properties

The connection properties are the various options that can be used to establish a connection. This section provides a complete list of the options you can use. Click the links for further details.

Account: The Account provided for authentication with the Snowflake database. This is usually derived from the URL automatically.
Async Query Timeout: The timeout for asynchronous requests issued by the provider to download large result sets.
Auth Scheme: The authentication scheme used. Accepted entries are Password, OKTA, PrivateKey, or AzureAD.
Auto Cache: Automatically caches the results of SELECT queries into a cache database specified by either CacheLocation or both of CacheConnection and CacheProvider.
Batch Size: The maximum size of each batch operation to submit.
Cache Connection: The connection string for the cache database. This property is always used in conjunction with CacheProvider. Setting both properties will override the value set for CacheLocation for caching data.
Cache Driver: The database driver to be used to cache data.
Cache Location: Specifies the path to the cache when caching to a file.
Cache Metadata: This property determines whether or not to cache the table metadata to a file store.
Cache Tolerance: The tolerance for stale data in the cache, specified in seconds, when using AutoCache.
Connection Life Time: The maximum lifetime of a connection in seconds. Once the time has elapsed, the connection object is disposed.
Connect On Open: This property specifies whether to connect to Snowflake when the connection is opened.
Credentials Location: The location of the settings file where credentials are saved.
Database: The name of the Snowflake database.
External Token: The External Token for authentication with the Snowflake database. This is usually derived from the external handler. For example, handling the callback URL from the GetSSOAuthorizationURL procedure will get this token.
Firewall Password: A password used to authenticate to a proxy-based firewall.
Firewall Port: The TCP port for a proxy-based firewall.
Firewall Server: The name or IP address of a proxy-based firewall.
Firewall Type: The protocol used by a proxy-based firewall.
Firewall User: The user name to use to authenticate with a proxy-based firewall.
Ignore Case: Whether to ignore case in identifiers. Default: false.
Location: A path to the directory that contains the schema files defining tables, views, and stored procedures.
Logfile: A filepath which designates the name and location of the log file.
Log Modules: Core modules to be included in the log file.
Max Log File Count: A string specifying the maximum file count of log files. When the limit is hit, a new log is created in the same folder with the date and time appended to the end, and the oldest log file is deleted.
Max Log File Size: A string specifying the maximum size in bytes for a log file (for example, 10 MB). When the limit is hit, a new log is created in the same folder with the date and time appended to the end.
Max Rows: Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This helps avoid performance issues at design time.
Merge Delete: A boolean indicating whether batch DELETE statements should be converted to MERGE statements automatically. Only used when the DELETE statement's WHERE clause contains only the table's primary key fields, combined with the AND logical operator.
Merge Insert: A boolean indicating whether INSERT statements should be converted to MERGE statements automatically. Only used when the INSERT contains a table's primary key field.
Merge Update: A boolean indicating whether batch UPDATE statements should be converted to MERGE statements automatically. Only used when the UPDATE statement's WHERE clause contains only the table's primary key fields, combined with the AND logical operator.
MFA Passcode: Specifies the passcode to use for multi-factor authentication.
Offline: Use offline mode to get the data from the cache instead of the live source.
Okta MFA Provider: Specifies the provider to use for multi-factor authentication with Okta.
Other: These hidden properties are used only in specific use cases.
Pagesize: The maximum number of results to return per page from Snowflake.
Parallel Level: The maximum number of threads executing the SELECT statement.
Password: The user's password.
Pool Idle Timeout: The allowed idle time for a connection before it is closed.
Pool Max Size: The maximum connections in the pool.
Pool Min Size: The minimum number of connections in the pool.
Pool Wait Time: The max seconds to wait for an available connection.
Private Key: The private key provided for key pair authentication with Snowflake.
Private Key Password: The password for the private key specified in the PrivateKey property, if required.
Private Key Type: The type of key store containing the private key to use with key pair authentication.
Proof Key: The ProofKey for authentication with the Snowflake database. This is usually derived from the GetSSOAuthorizationURL call.
Proxy Auth Scheme: The authentication type to use to authenticate to the ProxyServer proxy.
Proxy Auto Detect: This indicates whether to use the system proxy settings or not. This takes precedence over other proxy settings, so you'll need to set ProxyAutoDetect to FALSE in order to use custom proxy settings.
Proxy Exceptions: A semicolon-separated list of destination hostnames or IPs that are exempt from connecting through the ProxyServer.
Proxy Password: A password to be used to authenticate to the ProxyServer proxy.
Proxy Port: The TCP port the ProxyServer proxy is running on.
Proxy Server: The hostname or IP address of a proxy to route HTTP traffic through.
Proxy SSL Type: The SSL type to use when connecting to the ProxyServer proxy.
Proxy User: A user name to be used to authenticate to the ProxyServer proxy.
Pseudo Columns: This property indicates whether or not to include pseudo columns as columns to the table.
Query Passthrough: This option passes the query to the Snowflake server as is.
Readonly: You can use this property to enforce read-only access to Snowflake from the provider.
Role Name: The role of the Snowflake user: PUBLIC, SYSADMIN, or ACCOUNTADMIN.
RTK: The runtime key used for licensing.
Schema: The schema of the Snowflake database.
Session Parameters: The session parameters for Snowflake. For example: SessionParameters='QUERY_TAG=MyTag;QUOTED_IDENTIFIERS_IGNORE_CASE=True;'.
SSL Server Cert: The certificate to be accepted from the server when connecting using TLS/SSL.
SSOIDP Domain: The domain endpoint for SAML requests.
Support Enhanced SQL: This property enhances SQL functionality beyond what can be supported through the API directly, by enabling in-memory client-side processing.
Tables: This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.
Timeout: The value in seconds until the timeout error is thrown, canceling the operation.
URL: The URL of the Snowflake database.
Use Connection Pooling: This property enables connection pooling.
User: The username provided for authentication with the Snowflake database.
Verbosity: The verbosity level that determines the amount of detail included in the log file.
Views: Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.
Warehouse: The name of the Snowflake warehouse.

Account

The Account provided for authentication with Snowflake database. This is usually derived from the URL automatically.

Data Type

string

Default Value

""

Remarks

The Account provided for authentication with the Snowflake database. This is usually derived from the URL automatically and will not need to be set manually. A notable exception is Snowflake VPS, where your Account name doesn't follow the usual URL syntax https://myaccount.region.snowflakecomputing.com. Snowflake provides the Account name in this case.

Async Query Timeout

The timeout for asynchronous requests issued by the provider to download large result sets.

Data Type

int

Default Value

300

Remarks

If the AsyncQueryTimeout property is set to 0, asynchronous operations will not time out; instead, they will run until they complete successfully or encounter an error condition. This property is distinct from Timeout, which applies to individual HTTP operations, while AsyncQueryTimeout applies to the execution time of the operation as a whole.

If AsyncQueryTimeout expires and the asynchronous request has not finished being processed, the driver raises an error condition.

Auth Scheme

The authentication scheme used. Accepted entries are Password, OKTA, PrivateKey or AzureAD.

Data Type

string

Default Value

"Password"

Remarks

The driver supports the following authentication mechanisms. See the Getting Started chapter for authentication guides.

  • Password: Set this to authenticate with a Snowflake user.
  • OKTA: Set this to use the OKTA SSO identity provider. Set SSOIDPDomain in addition to the User and Password you use to authenticate to OKTA.
  • AzureAD: Set this along with User to use the Azure Active Directory identity provider. When connecting, your browser will open allowing you to login to Azure AD to complete the authentication.
  • PrivateKey: Set this to use key pair authentication. In addition, set PrivateKey, PrivateKeyPassword, and PrivateKeyType to authenticate with key pair authentication.

Auto Cache

Automatically caches the results of SELECT queries into a cache database specified by either CacheLocation or both of CacheConnection and CacheProvider .

Data Type

bool

Default Value

false

Remarks

When AutoCache = true, the driver automatically maintains a cache of your table's data in the database of your choice.

Setting the Caching Database

When AutoCache = true, the driver caches to a simple, file-based cache. You can configure its location with CacheLocation, or cache to a different database with CacheDriver and CacheConnection.

See Also

  • CacheMetadata: This property reduces the amount of metadata that crosses the network by persisting table schemas retrieved from the Snowflake metadata. Metadata then needs to be retrieved only once instead of every connection.
  • Explicitly Caching Data: This section provides more examples of using AutoCache in Offline mode.
  • CACHE Statements: You can use the CACHE statement to persist any SELECT query, as well as manage the cache; for example, refreshing schemas.

Batch Size

The maximum size of each batch operation to submit.

Data Type

int

Default Value

0

Remarks

When BatchSize is set to a value greater than 0, the batch operation will split the entire batch into separate batches of size BatchSize. The split batches will then be submitted to the server individually. This is useful when the server has limitations on the size of the request that can be submitted.

Setting BatchSize to 0 will submit the entire batch as specified.
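As a rough illustration (the table and column names are hypothetical), a JDBC batch insert executed through the driver is split into server requests of at most BatchSize rows:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertExample {
    public static void main(String[] args) throws SQLException {
        // BatchSize=500: the driver splits the 2000 queued rows into 4 server-side batches.
        String url = "jdbc:snowflake:BatchSize=500;url=https://myaccount.region.snowflakecomputing.com;"
                + "user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;";
        try (Connection conn = DriverManager.getConnection(url);
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO \"Table\" (\"ID\", \"NAME\") VALUES (?, ?)")) {
            for (int i = 1; i <= 2000; i++) {
                ps.setInt(1, i);
                ps.setString(2, "Name" + i);
                ps.addBatch();            // queue the row client-side
            }
            ps.executeBatch();            // submit; the driver sends batches of at most 500 rows
        }
    }
}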

Cache Connection

The connection string for the cache database. This property is always used in conjunction with CacheProvider . Setting both properties will override the value set for CacheLocation for caching data.

Data Type

string

Default Value

""

Remarks

The cache database is determined based on the CacheDriver and CacheConnection properties. Both properties are required to use the cache database. Examples of common cache database settings can be found below. For more information on setting the caching database's driver, refer to CacheDriver.

The connection string specified in the CacheConnection property is passed directly to the underlying CacheDriver. Consult the documentation for the specific JDBC driver for more information on the available properties. Make sure to include the JDBC driver in your application's classpath.

Derby and Java DB

The driver simplifies caching to Derby, only requiring you to set the CacheLocation property to make a basic connection.

Alternatively, you can configure the connection to Derby manually using CacheDriver and CacheConnection. The following is the Derby JDBC URL syntax:

jdbc:derby:[subsubprotocol:][databaseName][;attribute=value[;attribute=value] ... ]
For example, to cache to an in-memory database, use the following:
jdbc:derby:memory

SQLite

To cache to SQLite, you can use the SQLite JDBC driver. The following is the syntax of the JDBC URL:

jdbc:sqlite:dataSource
  • Data Source: The path to an SQLite database file. Or, use a value of :memory to cache in memory.

MySQL

The installation includes the CData JDBC Driver for MySQL. The following is an example JDBC URL:

jdbc:mysql:User=root;Password=root;Server=localhost;Port=3306;Database=cache
The following are typical connection properties:

  • Server: The IP address or domain name of the server you want to connect to.
  • Port: The port that the server is running on.
  • User: The user name provided for authentication to the database.
  • Password: The password provided for authentication to the database.
  • Database: The name of the database.

SQL Server

The JDBC URL for the Microsoft JDBC Driver for SQL Server has the following syntax:

jdbc:sqlserver://[serverName[instance][:port]][;database=databaseName][;property=value[;property=value] ... ]
For example:
jdbc:sqlserver://localhost\sqlexpress:1433;integratedSecurity=true
The following are typical SQL Server connection properties:
  • Server: The name or network address of the computer running SQL Server. To connect to a named instance instead of the default instance, this property can be used to specify the host name and the instance, separated by a backslash.
  • Port: The port SQL Server is running on.
  • Database: The name of the SQL Server database.
  • Integrated Security: Set this option to true to use the current Windows account for authentication. Set this option to false if you are setting the User and Password in the connection.

    To use integrated security, you will also need to add sqljdbc_auth.dll to a folder on the Windows system path. This file is located in the auth subfolder of the Microsoft JDBC Driver for SQL Server installation. The bitness of the assembly must match the bitness of your JVM.

  • User ID: The user name provided for authentication with SQL Server. This property is only needed if you are not using integrated security.
  • Password: The password provided for authentication with SQL Server. This property is only needed if you are not using integrated security.

Oracle

The following is the conventional JDBC URL syntax for the Oracle JDBC Thin driver:

jdbc:oracle:thin:[userId/password]@[//]host[[:port][:sid]]
For example:
jdbc:oracle:thin:scott/tiger@myhost:1521:orcl
The following are typical connection properties:
  • Data Source: The connect descriptor that identifies the Oracle database. This can be a TNS connect descriptor, an Oracle Net Services name that resolves to a connect descriptor, or, after version 11g, an Easy Connect naming (the host name of the Oracle server with an optional port and service name).

  • Password: The password provided for authentication with the Oracle database.
  • User Id: The user Id provided for authentication with the Oracle database.

PostgreSQL

The following is the JDBC URL syntax for the official PostgreSQL JDBC driver:

jdbc:postgresql:[//[host[:port]]/]database[[?option=value][[&option=value][&option=value] ... ]]
For example, the following connection string connects to a database on the default host (localhost) and port (5432):
jdbc:postgresql:postgres
The following are typical connection properties:
  • Host: The address of the server hosting the PostgreSQL database.
  • Port: The port used to connect to the server hosting the PostgreSQL database.
  • Database: The name of the database.
  • User name: The user Id provided for authentication with the PostgreSQL database. You can specify this in the JDBC URL with the "user" parameter.
  • Password: The password provided for authentication with the PostgreSQL database.

Cache Driver

The database driver to be used to cache data.

Data Type

string

Default Value

""

Remarks

You can cache to any database for which you have a JDBC driver, including CData JDBC drivers.

The cache database is determined based on the CacheDriver and CacheConnection properties. The CacheDriver is the name of the JDBC driver class that you want to use to cache data.

Note that you must also add the CacheDriver JAR file to the classpath.

The following examples show how to cache to several major databases. Refer to CacheConnection for more information on the JDBC URL syntax and typical connection properties.

Derby and Java DB

The driver simplifies Derby configuration. Java DB is the Oracle distribution of Derby. The JAR file is shipped in the JDK. You can find the JAR file, derby.jar, in the db subfolder of the JDK installation. In most caching scenarios, you need to specify only the following, after adding derby.jar to the classpath:

jdbc:snowflake:CacheLocation='c:/Temp/cachedir';url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;Account=Tester1;
To customize the Derby JDBC URL, use CacheDriver and CacheConnection. For example, to cache to an in-memory database, use a JDBC URL like the following:
jdbc:snowflake:CacheDriver=org.apache.derby.jdbc.EmbeddedDriver;CacheConnection='jdbc:derby:memory';url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;Account=Tester1;

SQLite

The following is a JDBC URL for the SQLite JDBC driver:

jdbc:snowflake:CacheDriver=org.sqlite.JDBC;CacheConnection='jdbc:sqlite:C:/Temp/sqlite.db';url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;Account=Tester1;

MySQL

The following is a JDBC URL for the included CData JDBC Driver for MySQL:

  jdbc:snowflake:Cache Driver=cdata.jdbc.mysql.MySQLDriver;Cache Connection='jdbc:mysql:Server=localhost;Port=3306;Database=cache;User=root;Password=123456';url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;Account=Tester1;
  
The CData JDBC Driver for MySQL is located in the lib subfolder of the CData JDBC Driver for Snowflake installation directory.

SQL Server

The following JDBC URL uses the Microsoft JDBC Driver for SQL Server:

jdbc:snowflake:Cache Driver=com.microsoft.sqlserver.jdbc.SQLServerDriver;Cache Connection='jdbc:sqlserver://localhost\sqlexpress:7437;user=sa;password=123456;databaseName=Cache';url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;Account=Tester1;

Oracle

The following is a JDBC URL for the Oracle Thin Client:

jdbc:snowflake:Cache Driver=oracle.jdbc.OracleDriver;CacheConnection='jdbc:oracle:thin:scott/tiger@localhost:1521:orcldb';url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;Account=Tester1;
NOTE: If using a version of Oracle older than 9i, the cache driver will instead be oracle.jdbc.driver.OracleDriver .

PostgreSQL

The following JDBC URL uses the official PostgreSQL JDBC driver:

jdbc:snowflake:CacheDriver=org.postgresql.Driver;CacheConnection='jdbc:postgresql://localhost:5433/postgres?user=postgres&password=admin';url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;Database=Northwind;Warehouse=TestWarehouse;Account=Tester1;

Cache Location

Specifies the path to the cache when caching to a file.

Data Type

string

Default Value

"%APPDATA%\CData\Snowflake Data Provider"

Remarks

The CacheLocation is a simple, file-based cache. The driver uses Java DB, Oracle's distribution of the Derby database. To cache to Java DB, you will need to add the Java DB JAR file to the classpath. The JAR file, derby.jar, is shipped in the JDK and located in the db subfolder of the JDK installation.

If left unspecified, the default location is "%APPDATA%\CData\Snowflake Data Provider" with %APPDATA% being set to the user's configuration directory:

Platform %APPDATA%
Windows The value of the APPDATA environment variable
Mac ~/Library/Application Support
Linux ~/.config

See Also

  • AutoCache: Set to implicitly create and maintain a cache for later offline use.
  • CacheMetadata: Set to persist the Snowflake catalog in CacheLocation.

Cache Metadata

This property determines whether or not to cache the table metadata to a file store.

Data Type

bool

Default Value

false

Remarks

As you execute queries with this property set, table metadata in the Snowflake catalog are cached to the file store specified by CacheLocation if set or the user's home directory otherwise. A table's metadata will be retrieved only once, when the table is queried for the first time.

When to Use CacheMetadata

The driver automatically persists metadata in memory for up to two hours when you first discover the metadata for a table or view and therefore, CacheMetadata is generally not required. CacheMetadata becomes useful when metadata operations are expensive such as when you are working with large amounts of metadata or when you have many short-lived connections.

When Not to Use CacheMetadata

  • When you are working with volatile metadata: Metadata for a table is only retrieved the first time the connection to the table is made. To pick up new, changed, or deleted columns, you would need to delete and rebuild the metadata cache. Therefore, it is best to rely on the in-memory caching for cases where metadata changes often.
  • When you are caching to a database: CacheMetadata can only be used with CacheLocation. If you are caching to another database with the CacheDriver and CacheConnection properties, use AutoCache to cache implicitly. Or, use CACHE Statements to cache explicitly.

Cache Tolerance

The tolerance for stale data in the cache specified in seconds when using AutoCache .

Data Type

int

Default Value

600

Remarks

The tolerance for stale data in the cache specified in seconds. This only applies when AutoCache is used. The driver checks with the data source for newer records after the tolerance interval has expired. Otherwise, it returns the data directly from the cache.

Connection Life Time

The maximum lifetime of a connection in seconds. Once the time has elapsed, the connection object is disposed.

Data Type

int

Default Value

0

Remarks

The maximum lifetime of a connection in seconds. Once the time has elapsed, the connection object is disposed. The default is 0 which indicates there is no limit to the connection lifetime.

Connect On Open

This property species whether to connect to the Snowflake when the connection is opened.

Data Type

bool

Default Value

false

Remarks

When set to true, a connection will be made to Snowflake when the connection is opened. This property enables the Test Connection feature available in various database tools.

This feature acts as a NOOP command as it is used to verify a connection can be made to Snowflake and nothing from this initial connection is maintained.

Setting this property to false may provide performance improvements (depending upon the number of times a connection is opened).

Credentials Location

The location of the settings file where credentials are saved.

Data Type

string

Default Value

"%APPDATA%\CData\Snowflake Data Provider\CredentialsFile.txt"

Remarks

If left unspecified, the default location is "%APPDATA%\CData\Snowflake Data Provider\CredentialsFile.txt" with %APPDATA% being set to the user's configuration directory:

Platform %APPDATA%
Windows The value of the APPDATA environment variable
Mac ~/Library/Application Support
Linux ~/.config

Database

The name of the Snowflake database.

Data Type

string

Default Value

""

Remarks

The name of the Snowflake database.

External Token

The External Token for authentication with the Snowflake database. This is usually derived from the external handler. For example, handling the callback URL from the GetSSOAuthorizationURL procedure will get this token.

Data Type

string

Default Value

""

Remarks

Firewall Password

A password used to authenticate to a proxy-based firewall.

Data Type

string

Default Value

""

Remarks

This property is passed to the proxy specified by FirewallServer and FirewallPort, following the authentication method specified by FirewallType.

Firewall Port

The TCP port for a proxy-based firewall.

Data Type

int

Default Value

0

Remarks

This specifies the TCP port for a proxy allowing traversal of a firewall. Use FirewallServer to specify the name or IP address. Specify the protocol with FirewallType.

Firewall Server

The name or IP address of a proxy-based firewall.

Data Type

string

Default Value

""

Remarks

This property specifies the IP address, DNS name, or host name of a proxy allowing traversal of a firewall. The protocol is specified by FirewallType: use this property with FirewallPort to connect through SOCKS or to tunnel the connection. Use ProxyServer to connect to an HTTP proxy.

Note that the driver uses the system proxy by default. To use a different proxy, set ProxyAutoDetect to false.

Firewall Type

The protocol used by a proxy-based firewall.

Data Type

string

Default Value

"NONE"

Remarks

This property specifies the protocol that the driver will use to tunnel traffic through the FirewallServer proxy. Note that by default, the driver connects to the system proxy; to disable this behavior and connect to one of the following proxy types, set ProxyAutoDetect to false.

TUNNEL (default port 80): When this is set, the driver opens a connection to Snowflake and traffic flows back and forth through the proxy.
SOCKS4 (default port 1080): When this is set, the driver sends data through the SOCKS 4 proxy specified by FirewallServer and FirewallPort and passes the FirewallUser value to the proxy, which determines if the connection request should be granted.
SOCKS5 (default port 1080): When this is set, the driver sends data through the SOCKS 5 proxy specified by FirewallServer and FirewallPort. If your proxy requires authentication, set FirewallUser and FirewallPassword to credentials the proxy recognizes.

To connect to HTTP proxies, use ProxyServer and ProxyPort. To authenticate to HTTP proxies, use ProxyAuthScheme, ProxyUser, and ProxyPassword.
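For example, to tunnel the connection through a SOCKS 5 proxy that requires authentication (the firewall host and credentials are placeholders):

jdbc:snowflake:url=https://myaccount.region.snowflakecomputing.com;user=Admin;password=test123;FirewallType=SOCKS5;FirewallServer=firewall.example.com;FirewallPort=1080;FirewallUser=fwuser;FirewallPassword=fwpass;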

Firewall User

The user name to use to authenticate with a proxy-based firewall.

Data Type

string

Default Value

""

Remarks

The FirewallUser and FirewallPassword properties are used to authenticate against the proxy specified in FirewallServer and FirewallPort, following the authentication method specified in FirewallType.

Ignore Case

Whether to ignore case in identifiers. Default: false.

Data Type

bool

Default Value

false

Remarks

A session parameter that specifies whether Snowflake will treat identifiers as case sensitive. Default: false (identifiers are case sensitive).

Location

A path to the directory that contains the schema files defining tables, views, and stored procedures.

Data Type

string

Default Value

"%APPDATA%\CData\Snowflake Data Provider\Schema"

Remarks

The path to a directory which contains the schema files for the driver (.rsd files for tables and views, .rsb files for stored procedures). The folder location can be a relative path from the location of the executable. The Location property is only needed if you want to customize definitions (for example, change a column name, ignore a column, and so on) or extend the data model with new tables, views, or stored procedures.

If left unspecified, the default location is "%APPDATA%\CData\Snowflake Data Provider\Schema" with %APPDATA% being set to the user's configuration directory:

Platform %APPDATA%
Windows The value of the APPDATA environment variable
Mac ~/Library/Application Support
Linux ~/.config

Logfile

A filepath which designates the name and location of the log file.

Data Type

string

Default Value

""

Remarks

Once this property is set, the driver will populate the log file as it carries out various tasks, such as when authentication is performed or queries are executed. If the specified file doesn't already exist, it will be created.

Connection strings and version information are also logged, though connection properties containing sensitive information are masked automatically.

If a relative filepath is supplied, the location of the log file will be resolved based on the path found in the Location connection property.

For more control over what is written to the log file, you can adjust the Verbosity property.

Log Modules

Core modules to be included in the log file.

Data Type

string

Default Value

""

Remarks

Only the modules specified (separated by ';') will be included in the log file. By default all modules are included.

Max Log File Count

A string specifying the maximum file count of log files. When the limit is hit, a new log is created in the same folder with the date and time appended to the end and the oldest log file will be deleted.

Data Type

int

Default Value

-1

Remarks

A string specifying the maximum file count of log files. When the limit is hit, a new log is created in the same folder with the date and time appended to the end and the oldest log file will be deleted. The minimum supported value is 2. A value of 0 or a negative value indicates no limit on the count.

Max Log File Size

A string specifying the maximum size in bytes for a log file (for example, 10 MB). When the limit is hit, a new log is created in the same folder with the date and time appended to the end.

Data Type

string

Default Value

"100MB"

Remarks

A string specifying the maximum size in bytes for a log file (for example, 10 MB). When the limit is hit, a new log is created in the same folder with the date and time appended to the end. The default limit is 100 MB. Values lower than 100 kB will use 100 kB as the value instead.

Max Rows

Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This helps avoid performance issues at design time.

Data Type

int

Default Value

-1

Remarks

Limits the number of rows returned when no aggregation or GROUP BY is used in the query. This helps avoid performance issues at design time.

Merge Delete

A boolean indicating whether batch DELETE statements should be converted to MERGE statements automatically. Only used when the DELETE statement's WHERE clause contains only the table's primary key fields, combined with the AND logical operator.

Data Type

bool

Default Value

false

Remarks

A boolean indicating whether DELETE statements should be converted to MERGE statements automatically to allow for upsert functionality. This property is primarily intended for use with tools where you have no direct control over the queries being executed. Otherwise, as long as Query Passthrough is True, you could execute the MERGE command directly.

When this property is False, bulk DELETE statements won't be executed against the server. When it is set to True and the DELETE query contains the primary key field, Snowflake will be sent a MERGE query that executes a DELETE if a match is found in Snowflake. For example, this query:

DELETE FROM "Table" WHERE "ID" = 1 AND "NAME" = 'Jerry'
Will be sent to Snowflake as the following MERGE request:
MERGE INTO "Table" AS "Target" USING "RTABLE1_TMP_20eca05b-c050-47dd-89bc-81c7f617f877" AS "Source" ON ("Target"."ID" = "Source"."ID" AND "Target"."NAME" = "Source"."NAME") 
WHEN MATCHED THEN DELETE

Merge Insert

A boolean indicating whether INSERT statements should be converted to MERGE statements automatically. Only used when the INSERT contains a table's primary key field.

Data Type

bool

Default Value

false

Remarks

A boolean indicating whether INSERT statements should be converted to MERGE statements automatically to allow for upsert functionality. This property is primarily intended for use with tools where you have no direct control over the queries being executed. Otherwise, as long as Query Passthrough is True, you could execute the MERGE command directly.

When this property is False, INSERT statements are executed directly against the server. When it is set to True and the INSERT query contains the primary key field, Snowflake will be sent a MERGE query that executes an INSERT if no match is found in Snowflake, or an UPDATE if one is. For example, this query:

INSERT INTO "Table" ("ID", "NAME", "AGE") VALUES (1, 'NewName', 10)
Will be sent to Snowflake as the following MERGE request:
MERGE INTO "Table" AS "Target" USING (SELECT 1 AS "ID") AS [Source] ON ("Target"."ID" = "Source"."ID") 
WHEN NOT MATCHED THEN INSERT ("ID", "NAME", "AGE") VALUES (1, 'NewName', 10) 
WHEN MATCHED THEN UPDATE SET "NAME" = 'NewName', "AGE" = 10

Merge Update

A boolean indicating whether batch UPDATE statements should be converted to MERGE statements automatically. Only used when the UPDATE statement's WHERE clause contains only the table's primary key fields, combined with the AND logical operator.

Data Type

bool

Default Value

false

Remarks

A boolean indicating whether UPDATE statements should be converted to MERGE statements automatically to allow for upsert functionality. This property is primarily intended for use with tools where you have no direct control over the queries being executed. Otherwise, as long as Query Passthrough is True, you could execute the MERGE command directly.

When this property is False, UPDATE statements are executed directly against the server. When it is set to True and the UPDATE query contains the primary key field, Snowflake will be sent a MERGE query that executes an UPDATE if a match is found in Snowflake. For example, this query:

UPDATE "Table" SET "NAME" = 'NewName', "AGE" = 10 WHERE "ID" = 1
Will be sent to Snowflake as the following MERGE request:
MERGE INTO "Table" AS "Target" USING "RTABLE1_TMP_20eca05b-c050-47dd-89bc-81c7f617f877" AS "Source" ON ("Target"."ID" = "Source"."ID") 
WHEN MATCHED THEN UPDATE SET "Target"."NAME" = "Source"."NAME", "Target"."AGE" = "Source"."AGE"

MFA Passcode

Specifies the passcode to use for multi-factor authentication.

Data Type

string

Default Value

""

Remarks

Specifies the passcode to use for multi-factor authentication.

Offline

Use offline mode to get the data from the cache instead of the live source.

Data Type

bool

Default Value

false

Remarks

When Offline = true, all queries execute against the cache as opposed to the live data source. In this mode, certain queries like INSERT, UPDATE, DELETE, and CACHE are not allowed.

Okta MFA Provider

Specifies the provider to use for multi-factor authentication with Okta.

Data Type

string

Default Value

""

Remarks

Specifies the provider to use for multi-factor authentication with Okta.

Other

These hidden properties are used only in specific use cases.

Data Type

string

Default Value

""

Remarks

The properties listed below are available for specific use cases. Normal driver use cases and functionality should not require these properties.

Specify multiple properties in a semicolon-separated list.

Caching Configuration

CachePartial=True: Caches only a subset of columns, which you can specify in your query.
QueryPassthrough=True: Passes the specified query to the cache database instead of using the SQL parser of the driver.

Integration and Formatting

DefaultColumnSize: Sets the default length of string fields when the data source does not provide column length in the metadata. The default value is 2000.
ConvertDateTimeToGMT: Determines whether to convert date-time values to GMT, instead of the local time of the machine.
RecordToFile=filename: Records the underlying socket data transfer to the specified file.

Pagesize

The maximum number of results to return per page from Snowflake.

Data Type

int

Default Value

5000

Remarks

The Pagesize property affects the maximum number of results to return per page from Snowflake. Setting a higher value may result in better performance at the cost of additional memory consumed per page.

Parallel Level

The maximum number of threads executing the SELECT statement.

Data Type

int

Default Value

0

Remarks

The maximum number of threads executing the SELECT statement.

Password

The user's password.

Data Type

string

Default Value

""

Remarks

The password provided for authentication with Snowflake.

Pool Idle Timeout

The allowed idle time for a connection before it is closed.

Data Type

int

Default Value

0

Remarks

The allowed idle time a connection can remain in the pool until the connection is closed. The default is 60 seconds.

Pool Max Size

The maximum connections in the pool.

Data Type

int

Default Value

100

Remarks

The maximum connections in the pool. The default is 100. To disable this property, set the property value to 0 or less.

Pool Min Size

The minimum number of connections in the pool.

Data Type

int

Default Value

1

Remarks

The minimum number of connections in the pool. The default is 1.

Pool Wait Time

The max seconds to wait for an available connection.

Data Type

int

Default Value

60

Remarks

The max seconds to wait for a connection to become available. If a new connection request is waiting for an available connection and exceeds this time, an error is thrown. By default, new requests wait forever for an available connection.

Private Key

The private key provided for key pair authentication with Snowflake.

Data Type

string

Default Value

""

Remarks

The path to the file containing the private key or the name of the certificate store for the client certificate. The PrivateKeyType field specifies the type of the certificate store specified by PrivateKey. If the store is password protected, specify the password in PrivateKeyPassword.

When the certificate store type is PEMKEY_FILE, PFXFILE, etc., this property must be set to the path to the file. When the type is PEMKEY_BLOB, PFXBLOB, etc., the property must be set to the binary contents of the file.

Designations of certificate stores are platform-dependent.

The following are designations of the most common User and Machine certificate stores in Windows:

MY: A certificate store holding personal certificates with their associated private keys.
CA: Certifying authority certificates.
ROOT: Root certificates.
SPC: Software publisher certificates.

In Java, the certificate store normally is a file containing certificates and optional private keys.

Private Key Password

The password for the private key specified in the PrivateKey property, if required.

Data Type

string

Default Value

""

Remarks

The password for the private key specified in the PrivateKey property, if required.

Private Key Type

The type of key store containing the private key to use with key pair authentication.

Data Type

string

Default Value

""

Remarks

This property can take one of the following values:

USER (default): For Windows, this specifies that the certificate store is a certificate store owned by the current user. Note that this store type is not available in Java.
MACHINE: For Windows, this specifies that the certificate store is a machine store. Note that this store type is not available in Java.
PFXFILE: The certificate store is the name of a PFX (PKCS12) file containing certificates.
PFXBLOB: The certificate store is a string (base-64-encoded) representing a certificate store in PFX (PKCS12) format.
JKSFILE: The certificate store is the name of a Java key store (JKS) file containing certificates. Note that this store type is only available in Java.
JKSBLOB: The certificate store is a string (base-64-encoded) representing a certificate store in JKS format. Note that this store type is only available in Java.
PEMKEY_FILE: The certificate store is the name of a PEM-encoded file that contains a private key and an optional certificate.
PEMKEY_BLOB: The certificate store is a string (base-64-encoded) that contains a private key and an optional certificate.
PUBLIC_KEY_FILE: The certificate store is the name of a file that contains a PEM- or DER-encoded public key certificate.
PUBLIC_KEY_BLOB: The certificate store is a string (base-64-encoded) that contains a PEM- or DER-encoded public key certificate.
SSHPUBLIC_KEY_FILE: The certificate store is the name of a file that contains an SSH-style public key.
SSHPUBLIC_KEY_BLOB: The certificate store is a string (base-64-encoded) that contains an SSH-style public key.
P7BFILE: The certificate store is the name of a PKCS7 file containing certificates.
PPKFILE: The certificate store is the name of a file that contains a PuTTY Private Key (PPK).
XMLFILE: The certificate store is the name of a file that contains a certificate in XML format.
XMLBLOB: The certificate store is a string that contains a certificate in XML format.

Proof Key

The ProofKey for authentication with Snowflake database. This is usually derived from GetSSOAuthorizationURL call.

Data Type

string

Default Value

""

Remarks

Proxy Auth Scheme

The authentication type to use to authenticate to the ProxyServer proxy.

Data Type

string

Default Value

"BASIC"

Remarks

This value specifies the authentication type to use to authenticate to the HTTP proxy specified by ProxyServer and ProxyPort.

Note that the driver will use the system proxy settings by default, without further configuration needed; if you want to connect to another proxy, you will need to set ProxyAutoDetect to false, in addition to ProxyServer and ProxyPort. To authenticate, set ProxyAuthScheme and set ProxyUser and ProxyPassword, if needed.

The authentication type can be one of the following:

  • BASIC: The driver performs HTTP BASIC authentication.
  • DIGEST: The driver performs HTTP DIGEST authentication.
  • NEGOTIATE: The driver retrieves an NTLM or Kerberos token based on the applicable protocol for authentication.
  • PROPRIETARY: The driver does not generate an NTLM or Kerberos token. You must supply this token in the Authorization header of the HTTP request.

If you need to use another authentication type, such as SOCKS 5 authentication, see FirewallType.

Proxy Auto Detect

This indicates whether to use the system proxy settings or not. This takes precedence over other proxy settings, so you'll need to set ProxyAutoDetect to FALSE in order to use custom proxy settings.

Data Type

bool

Default Value

false

Remarks

This takes precedence over other proxy settings, so you'll need to set ProxyAutoDetect to FALSE in order to use custom proxy settings.

NOTE: When this property is set to True, the proxy used is determined as follows:

  • A search from the JVM properties (http.proxy, https.proxy, socksProxy, etc.) is performed.
  • In the case that the JVM properties don't exist, a search from java.home/lib/net.properties is performed.
  • In the case that java.net.useSystemProxies is set to True, a search from the SystemProxy is performed.
  • In Windows only, an attempt is made to retrieve these properties from the Internet Options in the registry.

To connect to an HTTP proxy, see ProxyServer. For other proxies, such as SOCKS or tunneling, see FirewallType.

Proxy Exceptions

A semicolon separated list of destination hostnames or IPs that are exempt from connecting through the ProxyServer .

Data Type

string

Default Value

""

Remarks

The ProxyServer is used for all addresses, except for addresses defined in this property. Use semicolons to separate entries.

Note that the driver uses the system proxy settings by default, without further configuration needed; if you want to explicitly configure proxy exceptions for this connection, you need to set ProxyAutoDetect = false, and configure ProxyServer and ProxyPort. To authenticate, set ProxyAuthScheme and set ProxyUser and ProxyPassword, if needed.

Proxy Password

A password to be used to authenticate to the ProxyServer proxy.

Data Type

string

Default Value

""

Remarks

This property is used to authenticate to an HTTP proxy server that supports NTLM (Windows), Kerberos, or HTTP authentication. To specify the HTTP proxy, you can set ProxyServer and ProxyPort. To specify the authentication type, set ProxyAuthScheme.

If you are using HTTP authentication, additionally set ProxyUser and ProxyPassword to a user and password recognized by the HTTP proxy.

If you are using NTLM authentication, set ProxyUser and ProxyPassword to your Windows credentials. You may also need these to complete Kerberos authentication.

For SOCKS 5 authentication or tunneling, see FirewallType.

By default, the driver uses the system proxy. If you want to connect to another proxy, set ProxyAutoDetect to false.

Proxy Port

The TCP port the ProxyServer proxy is running on.

Data Type

int

Default Value

80

Remarks

The port the HTTP proxy is running on that you want to redirect HTTP traffic through. Specify the HTTP proxy in ProxyServer. For other proxy types, see FirewallType.

Proxy Server

The hostname or IP address of a proxy to route HTTP traffic through.

Data Type

string

Default Value

""

Remarks

The hostname or IP address of a proxy to route HTTP traffic through. The driver can use the HTTP, Windows (NTLM), or Kerberos authentication types to authenticate to an HTTP proxy.

If you need to connect through a SOCKS proxy or tunnel the connection, see FirewallType.

By default, the driver uses the system proxy. If you need to use another proxy, set ProxyAutoDetect to false.

Proxy SSL Type

The SSL type to use when connecting to the ProxyServer proxy.

Data Type

string

Default Value

"AUTO"

Remarks

This property determines when to use SSL for the connection to an HTTP proxy specified by ProxyServer. This value can be AUTO, ALWAYS, NEVER, or TUNNEL. The applicable values are the following:

AUTO: Default setting. If the URL is an HTTPS URL, the driver will use the TUNNEL option. If the URL is an HTTP URL, the component will use the NEVER option.
ALWAYS: The connection is always SSL enabled.
NEVER: The connection is not SSL enabled.
TUNNEL: The connection is through a tunneling proxy. The proxy server opens a connection to the remote host and traffic flows back and forth through the proxy.

Proxy User

A user name to be used to authenticate to the ProxyServer proxy.

Data Type

string

Default Value

""

Remarks

The ProxyUser and ProxyPassword options are used to connect and authenticate against the HTTP proxy specified in ProxyServer.

You can select one of the available authentication types in ProxyAuthScheme. If you are using HTTP authentication, set this to the user name of a user recognized by the HTTP proxy. If you are using Windows or Kerberos authentication, set this property to a user name in one of the following formats:

user@domain
domain\user

Pseudo Columns

This property indicates whether or not to include pseudo columns as columns to the table.

Data Type

string

Default Value

""

Remarks

This setting is particularly helpful in Entity Framework, which does not allow you to set a value for a pseudo column unless it is a table column. The value of this connection setting is of the format "Table1=Column1, Table1=Column2, Table2=Column3". You can use the "*" character to include all tables and all columns; for example, "*=*".

Query Passthrough

This option passes the query to the Snowflake server as is.

Data Type

bool

Default Value

false

Remarks

When this is set, queries are passed through directly to Snowflake.

Readonly

You can use this property to enforce read-only access to Snowflake from the provider.

Data Type

bool

Default Value

false

Remarks

If this property is set to true, the driver will allow only SELECT queries. INSERT, UPDATE, DELETE, and stored procedure queries will cause an error to be thrown.

Role Name

The role of the Snowflake user: PUBLIC, SYSADMIN, or ACCOUNTADMIN.

Data Type

string

Default Value

""

Remarks

The role of the Snowflake user using the specified database. The default roles in Snowflake are PUBLIC, SYSADMIN, and ACCOUNTADMIN. A custom role may also be specified.

RTK

The runtime key used for licensing.

Data Type

string

Default Value

""

Remarks

The RTK property may be used to license a build. Refer to the included licensing file for instructions on setting this property. The runtime key is only available if you purchased an OEM license.

Schema

The schema of the Snowflake database.

Data Type

string

Default Value

""

Remarks

The schema of the Snowflake database.

Session Parameters

The session parameters for Snowflake. For example: SessionParameters='QUERY_TAG=MyTag;QUOTED_IDENTIFIERS_IGNORE_CASE=True;';.

Data Type

string

Default Value

""

Remarks

The session parameters for Snowflake. For example: SessionParameters='QUERY_TAG=MyTag;QUOTED_IDENTIFIERS_IGNORE_CASE=True;';
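
For illustration, the documented example embedded in a Java connection string; the whole SessionParameters value is wrapped in single quotes, matching the example above, so the embedded semicolons stay inside the value rather than acting as connection-string separators. The other settings are placeholders.

  public class SessionParametersSketch {
      public static void main(String[] args) {
          // SessionParameters value quoted as a whole; placeholder account and credentials.
          String url = "jdbc:snowflake:"
                  + "Url=https://myaccount.region.snowflakecomputing.com;"
                  + "User=jdoe;Password=secret;"
                  + "Warehouse=COMPUTE_WH;Database=MYDB;Schema=PUBLIC;"
                  + "SessionParameters='QUERY_TAG=MyTag;QUOTED_IDENTIFIERS_IGNORE_CASE=True;';";
          System.out.println(url);
      }
  }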

SSL Server Cert

The certificate to be accepted from the server when connecting using TLS/SSL.

Data Type

string

Default Value

""

Remarks

If using a TLS/SSL connection, this property can be used to specify the TLS/SSL certificate to be accepted from the server. Any other certificate that is not trusted by the machine is rejected.

This property can take the following forms:

Description Example
A full PEM Certificate (example shortened for brevity) -----BEGIN CERTIFICATE----- MIIChTCCAe4CAQAwDQYJKoZIhv......Qw== -----END CERTIFICATE-----
A path to a local file containing the certificate C:\cert.cer
The public key (example shortened for brevity) -----BEGIN RSA PUBLIC KEY----- MIGfMA0GCSq......AQAB -----END RSA PUBLIC KEY-----
The MD5 Thumbprint (hex values can also be either space or colon separated) ecadbdda5a1529c58a1e9e09828d70e4
The SHA1 Thumbprint (hex values can also be either space or colon separated) 34a929226ae0819f2ec14b4a3d904f801cbb150d

If not specified, any certificate trusted by the machine is accepted.

Certificates are validated as trusted by the machine based on the system's trust store. The trust store used is the 'javax.net.ssl.trustStore' value specified for the system. If no value is specified for this property, Java's default trust store is used (for example, JAVA_HOME\lib\security\cacerts).

Use '*' to accept all certificates. Note that this is not recommended due to security concerns.
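
As an illustrative sketch, pinning the connection to the SHA1 thumbprint shown in the table above; all other values are placeholders, and the property key is assumed to be written without spaces (SSLServerCert) in the connection settings.

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.util.Properties;

  public class PinnedCertSketch {
      public static void main(String[] args) throws Exception {
          Properties props = new Properties();
          props.setProperty("Url", "https://myaccount.region.snowflakecomputing.com");
          props.setProperty("User", "jdoe");
          props.setProperty("Password", "secret");
          // Accept only the server certificate with this SHA1 thumbprint
          // (value taken from the example above); any other certificate is rejected.
          props.setProperty("SSLServerCert", "34a929226ae0819f2ec14b4a3d904f801cbb150d");

          try (Connection conn = DriverManager.getConnection("jdbc:snowflake:", props)) {
              System.out.println("Connected with pinned certificate.");
          }
      }
  }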

SSOIDP Domain

The domain endpoint for SAML requests.

Data Type

string

Default Value

""

Remarks

The URL endpoint for SAML requests. For example, if your IdP is Okta, you can use native Okta SSO with the following URL: https://cdata-okta.okta.com.

Support Enhanced SQL

This property enhances SQL functionality beyond what can be supported through the API directly, by enabling in-memory client-side processing.

Data Type

bool

Default Value

false

Remarks

When SupportEnhancedSQL = true, the driver offloads as much of the SELECT statement processing as possible to Snowflake and then processes the rest of the query in memory. In this way, the driver can execute unsupported predicates, joins, and aggregation.

When SupportEnhancedSQL = false, the driver limits SQL execution to what is supported by the Snowflake API.

Execution of Predicates

The driver determines which of the clauses are supported by the data source and then pushes them to the source to get the smallest superset of rows that would satisfy the query. It then filters the rest of the rows locally. The filter operation is streamed, which enables the driver to filter effectively for even very large datasets.

Execution of Joins

The driver uses various techniques to join in memory. The driver trades off memory utilization against the requirement of reading the same table more than once.

Execution of Aggregates

The driver retrieves all rows necessary to process the aggregation in memory.

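The sketch below is illustrative only: with SupportEnhancedSQL enabled, a join and aggregation over two hypothetical tables (ORDERS and CUSTOMERS) can be answered even when Snowflake-side support is partial, with the remainder evaluated in memory by the driver. All connection details are placeholders.

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.Statement;

  public class EnhancedSqlSketch {
      public static void main(String[] args) throws Exception {
          String url = "jdbc:snowflake:"
                  + "Url=https://myaccount.region.snowflakecomputing.com;"
                  + "User=jdoe;Password=secret;"
                  + "Warehouse=COMPUTE_WH;Database=MYDB;Schema=PUBLIC;"
                  + "SupportEnhancedSQL=true;";   // allow client-side predicates, joins, aggregates

          // Hypothetical tables; the driver pushes what it can to Snowflake and
          // evaluates any remaining join and aggregation work in memory.
          String sql = "SELECT c.REGION, COUNT(*) AS ORDER_COUNT "
                     + "FROM ORDERS o JOIN CUSTOMERS c ON o.CUSTOMER_ID = c.ID "
                     + "GROUP BY c.REGION";

          try (Connection conn = DriverManager.getConnection(url);
               Statement stmt = conn.createStatement();
               ResultSet rs = stmt.executeQuery(sql)) {
              while (rs.next()) {
                  System.out.println(rs.getString("REGION") + ": " + rs.getLong("ORDER_COUNT"));
              }
          }
      }
  }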

Tables

This property restricts the tables reported to a subset of the available tables. For example, Tables=TableA,TableB,TableC.

Data Type

string

Default Value

""

Remarks

Listing the tables from some databases can be expensive. Providing a list of tables in the connection string improves the performance of the driver.

This property can also be used as an alternative to automatically listing views if you already know which ones you want to work with and there would otherwise be too many to work with.

Specify the tables you want in a comma-separated list. Each table should be a valid SQL identifier with any special characters escaped using square brackets, double-quotes or backticks. For example, Tables=TableA,[TableB/WithSlash],WithCatalog.WithSchema.`TableC With Space`.
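
For illustration, the documented Tables value placed in a connection string; the account, credentials, warehouse, database, and schema are placeholders.

  public class TablesFilterSketch {
      public static void main(String[] args) {
          // Restrict metadata discovery to three tables; special characters are
          // escaped with square brackets or backticks as described above.
          String url = "jdbc:snowflake:"
                  + "Url=https://myaccount.region.snowflakecomputing.com;"
                  + "User=jdoe;Password=secret;"
                  + "Warehouse=COMPUTE_WH;Database=MYDB;Schema=PUBLIC;"
                  + "Tables=TableA,[TableB/WithSlash],WithCatalog.WithSchema.`TableC With Space`;";
          System.out.println(url);
      }
  }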

Timeout

The value in seconds until the timeout error is thrown, canceling the operation.

Data Type

int

Default Value

120

Remarks

If Timeout = 0, operations do not time out. The operations run until they complete successfully or until they encounter an error condition.

If Timeout expires and the operation is not yet complete, the driver throws an exception.

URL

The URL of the Snowflake database.

Data Type

string

Default Value

""

Remarks

Set this property to the URL of the Snowflake database instance.

AWS Format:

  https://myaccount.region.snowflakecomputing.com

Azure Format:

  https://myaccount.region.azure.snowflakecomputing.com
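
For illustration, a minimal end-to-end connection sketch that combines the AWS-style Url with the other core properties described in this section and runs a trivial query; the account, credentials, warehouse, database, and schema are placeholders.

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.Statement;
  import java.util.Properties;

  public class BasicConnectionSketch {
      public static void main(String[] args) throws Exception {
          Properties props = new Properties();
          // AWS-style account URL; for Azure, include the ".azure" segment as shown above.
          props.setProperty("Url", "https://myaccount.region.snowflakecomputing.com");
          props.setProperty("User", "jdoe");          // placeholder credentials
          props.setProperty("Password", "secret");
          props.setProperty("Warehouse", "COMPUTE_WH");
          props.setProperty("Database", "MYDB");
          props.setProperty("Schema", "PUBLIC");

          try (Connection conn = DriverManager.getConnection("jdbc:snowflake:", props);
               Statement stmt = conn.createStatement();
               ResultSet rs = stmt.executeQuery("SELECT CURRENT_VERSION()")) {
              if (rs.next()) {
                  System.out.println("Snowflake version: " + rs.getString(1));
              }
          }
      }
  }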

Use Connection Pooling

This property enables connection pooling.

Data Type

bool

Default Value

false

Remarks

This property enables connection pooling. The default is false. See Connection Pooling for information on using connection pools.

User

The username provided for authentication with the Snowflake database.

Data Type

string

Default Value

""

Remarks

The username provided for authentication with the Snowflake database.

Verbosity

The verbosity level that determines the amount of detail included in the log file.

Data Type

string

Default Value

"1"

Remarks

The verbosity level determines the amount of detail that the driver reports to the Logfile. Verbosity levels from 1 to 5 are supported. These are described in the following list:

1: Setting Verbosity to 1 will log the query, the number of rows returned by it, the start of execution and the time taken, and any errors.
2: Setting Verbosity to 2 will log everything included in Verbosity 1, cache queries, and additional information about the request, if applicable, such as HTTP headers.
3: Setting Verbosity to 3 will additionally log the body of the request and the response.
4: Setting Verbosity to 4 will additionally log transport-level communication with the data source. This includes SSL negotiation.
5: Setting Verbosity to 5 will additionally log communication with the data source and additional details that may be helpful in troubleshooting problems. This includes interface commands.

The Verbosity should not be set to greater than 1 for normal operation. Substantial amounts of data can be logged at higher verbosities, which can delay execution times.
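
As an illustrative sketch, temporarily raising Verbosity while troubleshooting; Logfile is the companion logging property referenced above, and the path, account, and credentials are placeholders.

  public class LoggingSketch {
      public static void main(String[] args) {
          // Raise Verbosity only while troubleshooting; level 3 also logs the
          // request and response bodies. Revert to 1 for normal operation.
          String url = "jdbc:snowflake:"
                  + "Url=https://myaccount.region.snowflakecomputing.com;"
                  + "User=jdoe;Password=secret;"
                  + "Warehouse=COMPUTE_WH;Database=MYDB;Schema=PUBLIC;"
                  + "Logfile=/tmp/snowflake-jdbc.log;"
                  + "Verbosity=3;";
          System.out.println(url);
      }
  }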

Views

Restricts the views reported to a subset of the available views. For example, Views=ViewA,ViewB,ViewC.

Data Type

string

Default Value

""

Remarks

Listing the views from some databases can be expensive. Providing a list of views in the connection string improves the performance of the driver.

This property can also be used as an alternative to automatically listing views if you already know which ones you want to work with and there would otherwise be too many to work with.

Specify the views you want in a comma-separated list. Each view should be a valid SQL identifier with any special characters escaped using square brackets, double-quotes or backticks. For example, Views=ViewA,[ViewB/WithSlash],WithCatalog.WithSchema.`ViewC With Space`.

Warehouse

The name of the Snowflake warehouse.

Data Type

string

Default Value

""

Remarks

The name of the Snowflake warehouse.