
See: Target Connector Save Modes

Connection Properties

  • Using S3 Staging = No


  • Using S3 Staging = Yes


  • Connection Name (Mandatory): Name of the connection.

  • Description (Optional): Description of the connection.

  • JDBC Url (Mandatory): A JDBC URL in the format jdbc:subprotocol://host:port/database. Host and port should point to the Redshift master node, so you must configure security groups and/or the VPC to allow access from EazyDI. Database is the Redshift database name.

  • Username (Mandatory): The Redshift username.

  • Password (Mandatory): The Redshift password.

  • Use S3 for Staging (Mandatory): Valid options: Yes, No. Indicates whether the connection will use S3 for staging when performing COPY or UNLOAD. Helpful for large data sets.

  • S3 Bucket name (Required if Use S3 for Staging is Yes): A writable bucket in Amazon S3, used for unloaded data when reading and for Avro data to be loaded into Redshift when writing. If you are using a Redshift data source for EazyDI as part of a regular ETL pipeline, it can be useful to set a lifecycle policy on the bucket and use it as a temp location for this data.

  • Region Name (Required if Use S3 for Staging is Yes): Region of the S3 bucket.

  • S3 Access Key ID (Required if Use S3 for Staging is Yes): Access key ID of an AWS user with read/write access to the staging bucket.

  • S3 Secret Access Key (Required if Use S3 for Staging is Yes): Secret access key paired with the S3 Access Key ID.

  • IAM Role ARN (Optional): The IAM Role ARN used to obtain temporary credentials when the supplied credentials have no direct access to the S3 bucket but can assume a role.

  • External Id (Optional): The external ID used when assuming the IAM Role ARN, if one is required.
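To illustrate the JDBC URL format above, here is a minimal sketch (a hypothetical helper, with an illustrative cluster hostname) that splits a URL of the form jdbc:subprotocol://host:port/database into its parts:

```python
from urllib.parse import urlparse


def parse_jdbc_url(jdbc_url: str) -> dict:
    """Split a JDBC URL of the form jdbc:subprotocol://host:port/database."""
    if not jdbc_url.startswith("jdbc:"):
        raise ValueError("not a JDBC URL")
    # Drop the leading "jdbc:" so urlparse sees subprotocol://host:port/database
    parsed = urlparse(jdbc_url[len("jdbc:"):])
    return {
        "subprotocol": parsed.scheme,
        "host": parsed.hostname,
        "port": parsed.port,
        "database": parsed.path.lstrip("/"),
    }


url = "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev"
print(parse_jdbc_url(url))
# {'subprotocol': 'redshift', 'host': 'example-cluster....amazonaws.com', 'port': 5439, 'database': 'dev'}
```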

Test Connection


Note: Test Connection only tests the database connection; it does not verify that the S3 bucket is properly configured. For bucket configuration requirements, see https://docs.aws.amazon.com/redshift/latest/dg/r_UNLOAD.html and https://docs.aws.amazon.com/redshift/latest/dg/r_COPY.html

Make sure that Redshift has the necessary access to S3. See https://docs.aws.amazon.com/redshift/latest/mgmt/authorizing-redshift-service.html
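EazyDI generates the COPY and UNLOAD statements internally; the sketch below (hypothetical helper names, illustrative bucket and role ARN) only shows the general shape of statements that move data through the staging bucket using an attached IAM role, which is why that role needs S3 access:

```python
def build_unload(query: str, s3_path: str, iam_role_arn: str) -> str:
    """Build a Redshift UNLOAD statement that writes query results to S3."""
    escaped = query.replace("'", "''")  # escape quotes inside the SELECT
    return f"UNLOAD ('{escaped}') TO '{s3_path}' IAM_ROLE '{iam_role_arn}'"


def build_copy(table: str, s3_path: str, iam_role_arn: str) -> str:
    """Build a Redshift COPY statement that loads staged Avro files into a table."""
    return (
        f"COPY {table} FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role_arn}' "
        "FORMAT AS AVRO 'auto'"
    )


role = "arn:aws:iam::123456789012:role/RedshiftS3Access"
print(build_unload("SELECT * FROM public.orders", "s3://my-staging-bucket/tmp/run1/", role))
print(build_copy("public.orders", "s3://my-staging-bucket/tmp/run1/", role))
```

If the role and bucket are misconfigured, these statements fail at run time even though Test Connection succeeds, since only the database connection is checked.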


Make sure EazyDI is allowed to access Redshift by adding the EazyDI IP address to the security group inbound rules and setting Publicly accessible to Yes on the Redshift cluster.


See: Whitelisting EazyDI for application access

Reading Objects

Objects are referenced in the format schema/table.
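A minimal sketch of splitting an object reference in that format (the object name public/orders is just a hypothetical example):

```python
def split_object(obj: str) -> tuple:
    """Split an object reference of the form schema/table into its parts."""
    schema, table = obj.split("/", 1)
    return schema, table


print(split_object("public/orders"))  # ('public', 'orders')
```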


Using Redshift as a source or target


When reading or saving with S3 staging enabled, you will see the generated temp directories for COPY or UNLOAD in the assigned S3 bucket.
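The exact temp directory names are generated by EazyDI; the sketch below is only a hypothetical illustration of how a per-run prefix under the staging bucket might be produced, and why a lifecycle policy on the bucket (mentioned in the S3 Bucket name property) is useful for expiring them:

```python
import uuid


def staging_prefix(bucket: str) -> str:
    """Return a unique S3 temp prefix for one COPY or UNLOAD run.

    Hypothetical naming scheme: each run gets its own directory so a
    bucket lifecycle rule can safely expire old staging data.
    """
    return f"s3://{bucket}/eazydi-temp/{uuid.uuid4()}/"


print(staging_prefix("my-staging-bucket"))
```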
