...
Basic authentication (username and password)
Temporary security credentials using an assumed role. Enter the access key ID and secret access key of an IAM user that has no direct permissions on the Amazon S3 bucket but can assume a role that does, as illustrated in the sketch below.
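EazyDI performs the assume-role exchange internally when you supply an IAM Role ARN, but as a rough illustration of what happens, here is a minimal Python sketch of the same flow using boto3. The access keys, role ARN, and external ID are hypothetical placeholders:

```python
import boto3

# Client built from the low-privilege IAM user's long-term keys
# (hypothetical values; this user itself has no S3 permissions).
sts = boto3.client(
    "sts",
    aws_access_key_id="AKIA...",   # access key ID of the IAM user
    aws_secret_access_key="...",   # secret access key of the IAM user
)

# Exchange those keys for temporary credentials by assuming a role
# that does have access to the S3 staging bucket.
resp = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/redshift-staging-role",  # hypothetical
    RoleSessionName="eazydi-staging",
    ExternalId="my-external-id",  # only if the role's trust policy requires one
)

creds = resp["Credentials"]  # AccessKeyId, SecretAccessKey, SessionToken, Expiration
```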
Supported Save Modes as Target:
...
See: Target Connector Save Modes
Connection Properties
The set of required properties depends on whether the connection uses S3 for staging (Use S3 for Staging = No or Yes).

| Name | Mandatory | Description |
|---|---|---|
| Connection Name | Yes | Name of the connection. |
| Description | No | Description of the connection. |
| JDBC Url | Yes | A JDBC URL in the format `jdbc:subprotocol://host:port/database`. Host and port should point to the Redshift master node, so you must configure security groups and/or your VPC to allow access from EazyDI. Database is the Redshift database name. |
| Username | Yes | The Redshift username. |
| Password | Yes | The Redshift password. |
| Use S3 for Staging | Yes | Valid options: Yes, No. Indicates whether the connection uses S3 for staging when performing COPY or UNLOAD. Helpful for large data sets. |
| S3 Bucket name | No (required if Use S3 for Staging is Yes) | A writable bucket in Amazon S3, used for unloaded data when reading and for Avro data to be loaded into Redshift when writing. If you use a Redshift data source for EazyDI as part of a regular ETL pipeline, it can be useful to set a lifecycle policy on the bucket and use it as a temporary location for this data. |
| Region Name | No (required if Use S3 for Staging is Yes) | Region of the S3 bucket. |
| S3 Access Key ID | No (required if Use S3 for Staging is Yes) | The access key ID of the credentials used to access the S3 staging bucket. |
| S3 Secret Access Key | No (required if Use S3 for Staging is Yes) | The secret access key of the credentials used to access the S3 staging bucket. |
| IAM Role ARN | No | The IAM Role ARN used to obtain temporary credentials when the supplied credentials have no direct access to the S3 bucket but can assume a role that does. |
| External Id | No | The external ID used with the IAM Role ARN, if required. |
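Before relying on Test Connection, you can sanity-check the JDBC URL components (host, port, database) and the credentials from the table above with any Redshift client. A minimal sketch in Python using the redshift_connector package; the cluster endpoint, database, and credentials are hypothetical placeholders:

```python
import redshift_connector  # pip install redshift-connector

# These values mirror the pieces of the JDBC URL, e.g.
# jdbc:redshift://examplecluster.abc123.us-west-2.redshift.amazonaws.com:5439/dev
conn = redshift_connector.connect(
    host="examplecluster.abc123.us-west-2.redshift.amazonaws.com",  # master node
    port=5439,               # default Redshift port
    database="dev",          # the Redshift database name
    user="awsuser",          # the Redshift username
    password="my_password",  # the Redshift password
)

cur = conn.cursor()
cur.execute("SELECT current_database(), current_user")
print(cur.fetchone())

cur.close()
conn.close()
```

If this connect call times out, the security group or VPC configuration is usually the culprit rather than the credentials.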
Test Connection
Note: Test Connection only tests the database connection; it does not verify that the S3 bucket is properly configured. For configuration details, see https://docs.aws.amazon.com/redshift/latest/dg/r_UNLOAD.html and https://docs.aws.amazon.com/redshift/latest/dg/r_COPY.html
Make sure that Redshift has the necessary access to S3; see https://docs.aws.amazon.com/redshift/latest/mgmt/authorizing-redshift-service.html. One way to verify this is to run a COPY or UNLOAD manually, as in the sketch after this list.
Make sure EazyDI is allowed to access Redshift by adding the EazyDI IP to the security group inbound rules and setting the cluster's Publicly accessible option to Yes in Redshift.
See: Whitelisting EazyDI for application access
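A minimal sketch of such a manual COPY/UNLOAD check, again in Python with redshift_connector. The endpoint, bucket, table, and role ARN are hypothetical, and the IAM_ROLE clause assumes a role is attached to the cluster as described in the authorizing-redshift-service link above:

```python
import redshift_connector

conn = redshift_connector.connect(
    host="examplecluster.abc123.us-west-2.redshift.amazonaws.com",  # hypothetical
    port=5439, database="dev", user="awsuser", password="my_password",
)
conn.autocommit = True
cur = conn.cursor()

# UNLOAD: Redshift writes query results out to the staging bucket.
cur.execute("""
    UNLOAD ('SELECT * FROM public.orders')
    TO 's3://my-staging-bucket/unload-check/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-staging-role'
""")

# COPY: Redshift reads the unloaded files back into a scratch table.
cur.execute("CREATE TEMP TABLE orders_check (LIKE public.orders)")
cur.execute("""
    COPY orders_check
    FROM 's3://my-staging-bucket/unload-check/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-staging-role'
""")

cur.close()
conn.close()
```

If either statement fails with an access error, revisit the role's S3 permissions before debugging the EazyDI connection.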
Reading Objects
Objects are referenced in the format schema/table (for example, public/orders).
Using Redshift as source or target
When reading or saving with S3 staging enabled, you will notice temporary directories generated in the assigned S3 bucket for the COPY or UNLOAD operations.
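The S3 Bucket name description above suggests attaching a lifecycle policy so these temporary directories do not accumulate. A minimal sketch with boto3; the bucket name and expiration window are hypothetical, and the empty prefix applies the rule to the whole bucket:

```python
import boto3

s3 = boto3.client("s3")

# Expire staging objects a few days after creation. Narrow the
# Prefix filter if the bucket is shared with non-staging data.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-staging-bucket",  # hypothetical staging bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-eazydi-staging",
                "Filter": {"Prefix": ""},
                "Status": "Enabled",
                "Expiration": {"Days": 3},
            }
        ]
    },
)
```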
...