
Releases: snowflakedb/spark-snowflake

Snowflake For Spark Connector 2.8.2 Release Notes

01 Oct 00:36

Fixed some critical issues and added test cases for stability.

  1. Fixed an issue that occurred when reading a large result set in Arrow format from a Snowflake database through AWS PrivateLink.
  2. Added support for quoted table names that contain special characters when writing to the Snowflake database.
  3. Fixed an issue with writing data to the Snowflake database via an external stage.
  4. Fixed an issue with pushdowns and the InSet expression when reading from the Snowflake database.
  5. Upgraded to the latest version of the JDBC Driver (3.12.12).
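As a sketch of item 2 above, a quoted table name containing special characters can be passed through the `dbtable` option. Connection parameters and the table name here are placeholders, not values from the release notes:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

// Placeholder connection options; substitute real account values.
val sfOptions = Map(
  "sfurl"       -> "myaccount.snowflakecomputing.com",
  "sfuser"      -> "USER",
  "sfpassword"  -> "PASSWORD",
  "sfdatabase"  -> "DB",
  "sfschema"    -> "PUBLIC",
  "sfwarehouse" -> "WH"
)

val spark = SparkSession.builder().getOrCreate()

// Writing to a quoted table name that contains '-' and '.' characters.
spark.table("staged_data")            // hypothetical source DataFrame
  .write
  .format("snowflake")
  .options(sfOptions)
  .option("dbtable", "\"my-table.v2\"")
  .mode(SaveMode.Append)
  .save()
```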

Note:

  1. Spark Connector 2.8.2 is NOT compatible with version 3.12.11 (and earlier versions) of the JDBC Driver.
  2. On Linux, when retrieving the Snowflake GPG public key to verify the Snowflake Connector for Spark package signature, use the GPG key ID 37C7086698CB005C.

Release Spark Connector 2.8.1

14 Jul 22:06

This release includes some enhancements and bug fixes.

  1. Added nanosecond support when writing Timestamp values to Snowflake.
  2. Reduced memory usage when writing to Snowflake tables to avoid out-of-memory issues.
  3. Stopped sending the "spark_plan" telemetry message.
  4. Added pushdown support for CASE WHEN/OTHERWISE expressions to Snowflake.
  5. Upgraded to the latest version of the JDBC Driver (3.12.8).
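For the CASE WHEN/OTHERWISE pushdown, a Spark `when`/`otherwise` expression like the one below can now be translated into a SQL CASE expression on the Snowflake side. This is a sketch; the table and column names are hypothetical:

```scala
import org.apache.spark.sql.functions.{col, when}

val orders = spark.read
  .format("snowflake")
  .options(sfOptions)                 // the usual Snowflake connection options
  .option("dbtable", "ORDERS")        // hypothetical table
  .load()

// Compiles to CASE WHEN ... THEN ... ELSE ... END in the generated Snowflake SQL.
val sized = orders.select(
  col("ORDER_ID"),
  when(col("AMOUNT") > 1000, "large")
    .when(col("AMOUNT") > 100, "medium")
    .otherwise("small")
    .as("ORDER_SIZE")
)
```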

Release Spark Connector 2.8.0 for Spark 3.0 support

29 Jun 23:30

Adds Spark 3.0 support and includes some bug fixes.

  1. Fixed a data-loss issue when writing data to a Snowflake table.
  2. Fixed an issue where the ResultSet returned by Utils.runQuery() was always empty.
  3. Upgraded to JDBC Driver 3.12.8.
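Item 2 concerns `Utils.runQuery()`, which executes a query against Snowflake and returns a JDBC ResultSet. A minimal usage sketch, assuming `sfOptions` is the usual map of Snowflake connection parameters:

```scala
import net.snowflake.spark.snowflake.Utils

// Runs the query over a JDBC connection built from the connector options.
val rs = Utils.runQuery(sfOptions, "SELECT CURRENT_VERSION()")
while (rs.next()) {
  println(rs.getString(1))   // with this fix, the result set is populated
}
```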

Note:

  1. Spark 3.0 is supported, but the Spark 3.0 preview/preview2 releases are not compatible.
  2. Starting with Spark Connector 2.8.0, no binary is released for Spark 2.2.

Release Spark Connector 2.7.2

19 Jun 18:22

This release includes some critical enhancements and bug fixes.

  1. Added pushdown support for length(), trunc(), and date_trunc().
  2. Log a WARN message if the runtime JDBC Driver version is not the certified version.
  3. Send telemetry messages for statistical and diagnostic purposes.
  4. Upgraded to JDBC Driver 3.12.7.
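As a sketch of item 1, Spark functions such as the following can now be pushed down into the generated Snowflake SQL instead of being evaluated in Spark. The DataFrame and column names are hypothetical:

```scala
import org.apache.spark.sql.functions.{col, date_trunc, length}

// length() and date_trunc() are translated into the Snowflake query text.
val summary = df.select(
  length(col("CUSTOMER_NAME")).as("NAME_LEN"),
  date_trunc("month", col("CREATED_AT")).as("CREATED_MONTH")
)
```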

Release Spark Connector 2.7.1

13 May 21:01

This release includes some critical enhancements and bug fixes.

  1. Added support for writing Spark BinaryType columns to Snowflake BINARY columns.
  2. Log diagnostic information when writing to Snowflake.
  3. Log diagnostic information for Snowflake Azure deployment accounts.
  4. Implemented backoff and retry when uploading data to the Snowflake internal stage, to tolerate intermittent network failures and cloud storage service throttling.
  5. Allow the Spark Connector option "sfurl" to begin with "https://".
  6. Close the JDBC connection created in Utils.runQuery().
  7. Upgraded to JDBC Driver 3.12.5.
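For item 5, both forms of the "sfurl" option are now accepted. A config sketch with a placeholder account name:

```scala
// Both of these are now valid values for the "sfurl" connector option:
val sfOptionsPlain = Map("sfurl" -> "myaccount.snowflakecomputing.com")
val sfOptionsHttps = Map("sfurl" -> "https://myaccount.snowflakecomputing.com")
```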

NOTE:
DO NOT use JDBC Driver 3.12.4 with any version of the Spark Connector because of a compatibility issue.

Release Spark Connector 2.7.0

16 Mar 16:40

The Spark Connector is enhanced to support Snowflake GCP accounts and OAuth authentication. This release also includes the following minor enhancements and bug fixes:

  • Log diagnostic statistics on data size in a pretty format
  • Upgraded to JDBC Driver 3.12.2
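A sketch of the OAuth authentication added in this release, assuming the authenticator is selected via an "sfauthenticator" option and the token is supplied via "sftoken" (the option keys, account URL, and token here are placeholders, not confirmed by the release notes):

```scala
// Placeholder values; option keys are assumptions for illustration.
val sfOptionsOAuth = Map(
  "sfurl"           -> "myaccount.snowflakecomputing.com",
  "sfuser"          -> "USER",
  "sfauthenticator" -> "oauth",
  "sftoken"         -> "<oauth-access-token>",
  "sfdatabase"      -> "DB",
  "sfschema"        -> "PUBLIC"
)

val df = spark.read
  .format("snowflake")
  .options(sfOptionsOAuth)
  .option("dbtable", "MY_TABLE")   // hypothetical table
  .load()
```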

Release Spark Connector 2.6.0

04 Mar 22:01

The Spark Connector is enhanced to leverage Snowflake's internal Arrow result format for better read performance. This release also includes the following minor enhancements and bug fixes:

  1. Enabled column names to contain the dot (.) character.
  2. Added support for the special values Inf/-Inf/NaN for the Double and Float types.
  3. Upgraded to the latest JDBC Driver (3.12.1).

Release Snowflake Spark Connector 2.5.9

07 Feb 23:42

  • Upgraded to JDBC Driver 3.12.0.
  • Introduced a maximum file count per partition to avoid out-of-memory exceptions when reading large result sets from Snowflake.
  • Enabled pushdown of FULL OUTER JOIN to Snowflake.
  • Wrapped the TRUNCATE TABLE and COPY INTO statements in a single transaction when writing data to Snowflake in Overwrite mode with "usestagingtable" set to "false" and "truncate_table" set to "on".
  • Fixed the "Unparseable number exceptions" issue when reading from Snowflake.
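The single-transaction Overwrite behavior described above applies with this option combination. A sketch, with a hypothetical target table name:

```scala
import org.apache.spark.sql.SaveMode

// With these settings, TRUNCATE TABLE and COPY INTO run in one transaction.
df.write
  .format("snowflake")
  .options(sfOptions)
  .option("dbtable", "TARGET_TABLE")
  .option("usestagingtable", "false")
  .option("truncate_table", "on")
  .mode(SaveMode.Overwrite)
  .save()
```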

v2.5.7

15 Jan 22:50
  • Upgraded Snowflake JDBC to 3.11.1.
  • Removed a confusing warning message when writing to Snowflake on Amazon S3.
  • Log or validate the row count when reading from Snowflake.
  • Removed the CREATE TABLE privilege requirement when saving data into an existing table with APPEND mode.
  • Support case-insensitive column mapping for Spark Streaming.
  • Pretty-format logging.
  • Refactored code formatting to comply with the Scala code style.

v2.5.6

15 Jan 22:50
  • Upgraded Snowflake JDBC to 3.11.0.
  • Enhanced read performance from Snowflake.
  • Implemented a retry mechanism when downloading data from cloud storage.