JDBC Sources
Spark SQL also includes a data source that can read data from other databases using JDBC.
Tables from the remote database can be loaded as a DataFrame or Spark SQL temporary view using the Data Sources API. Users can specify the JDBC connection properties in the data source options. `user` and `password` are normally provided as connection properties for logging into the data sources.
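For example, a remote table can be exposed as a temporary view and queried like any other. The following is a minimal sketch; the host, database, table name (`jdbcPeople`, `public.people`), and credentials are placeholders to adapt to your environment:

```sql
-- Minimal sketch: expose a remote table as a Spark SQL temporary view.
-- Host, port, database, table, and credentials below are placeholders.
CREATE TEMPORARY VIEW jdbcPeople
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:postgresql://localhost:5432/test",
  dbtable "public.people",
  user "username",
  password "password"
);

SELECT * FROM jdbcPeople;
```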
Spark supports the following case-insensitive options:
Property Name | Meaning
---|---
`url` | The JDBC URL to connect to. The source-specific connection properties may be specified in the URL, e.g., `jdbc:postgresql://localhost/test?user=fred&password=secret`.
`dbtable` | The JDBC table that should be read from or written into. Note that when using it in the read path anything that is valid in a `FROM` clause of a SQL query can be used. For example, instead of a full table you could also use a subquery in parentheses. It is not allowed to specify `dbtable` and `query` options at the same time.
`query` | A query that will be used to read data into Spark. The specified query will be parenthesized and used as a subquery in the `FROM` clause. Spark will also assign an alias to the subquery clause. As an example, Spark will issue a query of the following form to the JDBC source: `SELECT <columns> FROM (<user_specified_query>) spark_gen_alias`. There are a couple of restrictions while using this option: it is not allowed to specify `dbtable` and `query` at the same time, and it is not allowed to specify `query` and `partitionColumn` at the same time.
`driver` | The class name of the JDBC driver to use to connect to this URL.
`partitionColumn`, `lowerBound`, `upperBound` | These options must all be specified if any of them is specified. In addition, `numPartitions` must be specified. They describe how to partition the table when reading in parallel from multiple workers. `partitionColumn` must be a numeric, date, or timestamp column from the table in question. Notice that `lowerBound` and `upperBound` are just used to decide the partition stride, not to filter the rows in the table, so all rows in the table will be partitioned and returned. This option applies only to reading (see the partitioned-read sketch after this table).
`numPartitions` | The maximum number of partitions that can be used for parallelism in table reading and writing. This also determines the maximum number of concurrent JDBC connections. If the number of partitions to write exceeds this limit, we decrease it to this limit by calling `coalesce(numPartitions)` before writing.
`queryTimeout` | The number of seconds the driver will wait for a Statement object to execute. Zero means there is no limit. In the write path, this option depends on how JDBC drivers implement the API `setQueryTimeout`, e.g., the h2 JDBC driver checks the timeout of each query instead of an entire JDBC batch. It defaults to `0`.
`fetchsize` | The JDBC fetch size, which determines how many rows to fetch per round trip. This can help performance on JDBC drivers which default to a low fetch size (e.g., Oracle with 10 rows). This option applies only to reading.
`batchsize` | The JDBC batch size, which determines how many rows to insert per round trip. This can help performance on JDBC drivers. This option applies only to writing. It defaults to `1000`.
`isolationLevel` | The transaction isolation level, which applies to the current connection. It can be one of `NONE`, `READ_COMMITTED`, `READ_UNCOMMITTED`, `REPEATABLE_READ`, or `SERIALIZABLE`, corresponding to the standard transaction isolation levels defined by JDBC's Connection object, with a default of `READ_UNCOMMITTED`. This option applies only to writing.
`sessionInitStatement` | After each database session is opened to the remote DB and before starting to read data, this option executes a custom SQL statement (or a PL/SQL block). Use this to implement session initialization code. Example: `option("sessionInitStatement", """BEGIN execute immediate 'alter session set "_serial_direct_read"=true'; END;""")`
`truncate` | This is a JDBC writer related option. When `SaveMode.Overwrite` is enabled, this option causes Spark to truncate an existing table instead of dropping and recreating it. This can be more efficient, and prevents the table metadata (e.g., indices) from being removed. However, it will not work in some cases, such as when the new data has a different schema. It defaults to `false`.
`cascadeTruncate` | This is a JDBC writer related option. If enabled and supported by the JDBC database (PostgreSQL and Oracle at the moment), this option allows execution of a `TRUNCATE TABLE t CASCADE` (in the case of PostgreSQL, a `TRUNCATE TABLE ONLY t CASCADE` is executed to prevent inadvertently truncating descendant tables). This will affect other tables, and thus should be used with care.
`createTableOptions` | This is a JDBC writer related option. If specified, this option allows setting of database-specific table and partition options when creating a table (e.g., `CREATE TABLE t (name string) ENGINE=InnoDB`).
`createTableColumnTypes` | The database column data types to use instead of the defaults, when creating the table. Data type information should be specified in the same format as CREATE TABLE columns syntax (e.g., `"name CHAR(64), comments VARCHAR(1024)"`). The specified types should be valid Spark SQL data types. This option applies only to writing.
`customSchema` | The custom schema to use for reading data from JDBC connectors. For example, `"id DECIMAL(38, 0), name STRING"`. You can also specify partial fields, and the others use the default type mapping. The column names should be identical to the corresponding column names of the JDBC table. This option applies only to reading.
`pushDownPredicate` | The option to enable or disable predicate push-down into the JDBC data source. The default value is `true`, in which case Spark will push down filters to the JDBC data source as much as possible. Otherwise, if set to `false`, no filter will be pushed down to the JDBC data source and thus all filters will be handled by Spark. Predicate push-down is usually turned off when the predicate filtering is performed faster by Spark than by the JDBC data source.
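The partitioning options are easiest to see together. The sketch below assumes a hypothetical `schema.orders` table with a numeric `order_id` column; it splits the read into ten parallel partitions, and also shows the `query` option as an alternative to `dbtable` (the two cannot be combined):

```sql
-- Parallel read: Spark issues 10 partitioned queries over order_id.
-- lowerBound/upperBound only set the partition stride; rows outside
-- the range are still read. Table and column names are placeholders.
CREATE TEMPORARY VIEW ordersParallel
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:postgresql://db_host:db_port/db_name",
  dbtable "schema.orders",
  partitionColumn "order_id",
  lowerBound "1",
  upperBound "1000000",
  numPartitions "10",
  user "username",
  password "password"
);

-- Alternative: push a query instead of a table name.
-- query cannot be combined with dbtable or partitionColumn.
CREATE TEMPORARY VIEW recentOrders
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:postgresql://db_host:db_port/db_name",
  query "SELECT order_id, total FROM schema.orders WHERE total > 100",
  user "username",
  password "password"
);
```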
MySQL Example
-- Creating a proxy table
CREATE TABLE mysqlTable
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:mysql://db_host:db_port/db_name",
  dbtable "schema.tablename",
  driver "com.mysql.cj.jdbc.Driver",
  user "username",
  password "password"
);

-- Reading from the proxy table
SELECT * FROM mysqlTable;

-- Writing to MySQL
INSERT INTO TABLE mysqlTable
SELECT * FROM dwhTable;
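Writer-side options such as `batchsize` go in the same OPTIONS clause and should take effect when the proxy table is written to. A sketch with the same placeholder connection details:

```sql
-- Sketch: a proxy table that batches 10000 rows per JDBC round trip
-- on write. batchsize is a writer option; it is ignored on reads.
CREATE TABLE mysqlTableBatched
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:mysql://db_host:db_port/db_name",
  dbtable "schema.tablename",
  driver "com.mysql.cj.jdbc.Driver",
  user "username",
  password "password",
  batchsize "10000"
);

INSERT INTO TABLE mysqlTableBatched
SELECT * FROM dwhTable;
```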
PostgreSQL Example
-- Creating a proxy table
CREATE TABLE postgreTable
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:postgresql://db_host:db_port/db_name",
  dbtable "schema.tablename",
  user "username",
  password "password"
);

-- Reading from the proxy table
SELECT * FROM postgreTable;

-- Writing to PostgreSQL
INSERT INTO TABLE postgreTable
SELECT * FROM dwhTable;
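Read-side tuning follows the same pattern. As a sketch (placeholder host, credentials, and column names again), `fetchsize` and `customSchema` could be set on a second proxy table:

```sql
-- Sketch: read-side tuning on the PostgreSQL source.
-- fetchsize controls rows per round trip; customSchema overrides the
-- default JDBC-to-Spark type mapping for the listed columns only.
-- Assumes the remote table has columns named id and name.
CREATE TABLE postgreTableTuned
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:postgresql://db_host:db_port/db_name",
  dbtable "schema.tablename",
  user "username",
  password "password",
  fetchsize "1000",
  customSchema "id DECIMAL(38, 0), name STRING"
);

SELECT * FROM postgreTableTuned;
```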