EazyDI Agent
Current Version: v1.7.0
Version Release Schedule
v0.1.0 (Pilot) August 3, 2024
v1.0.0 August 17, 2024
v1.1.0 August 31, 2024
v1.2.0 September 14, 2024
v1.3.0 September 28, 2024
v1.4.0 October 12, 2024
v1.5.0 October 26, 2024
v1.6.0 November 9, 2024
v1.7.0 November 23, 2024
Supported Connectors:
MySQL (since v0.1.0 Pilot)
IBM DB2 (since v0.1.0 Pilot)
Oracle (since v0.1.0 Pilot)
Postgres (since v0.1.0 Pilot)
Snowflake Data Warehouse (since v1.0.0)
Salesforce (since v1.0.0)
MongoDB (since v1.0.0)
Azure Cosmos MongoDB (since v1.0.0)
Azure Cosmos PostgreSQL (since v1.0.0)
Azure PostgreSQL (since v1.0.0)
Azure MySQL (since v1.0.0)
Azure SqlServer (since v1.0.0)
Google Cloud MongoDB Atlas (since v1.0.0)
AWS Aurora MySQL (since v1.1.0)
AWS Aurora PostgreSQL (since v1.1.0)
AWS RDS PostgreSQL (since v1.1.0)
AWS RDS MySQL (since v1.1.0)
AWS RDS SqlServer (since v1.1.0)
AWS RDS Oracle (since v1.1.0)
AWS RDS MariaDB (since v1.1.0)
Clickhouse DB (since v1.1.0)
ElasticSearch (since v1.1.0)
Fauna DB (since v1.1.0)
Cockroach DB (since v1.1.0)
Amazon S3 (since v1.2.0)
Amazon S3 Csv (since v1.2.0)
Amazon S3 Excel (since v1.2.0)
Azure Data Lake (since v1.2.0)
Azure Data Lake Csv (since v1.2.0)
Azure Data Lake Excel (since v1.2.0)
FTP (since v1.2.0)
FTP Csv (since v1.2.0)
FTP Excel (since v1.2.0)
Google Sheets (since v1.2.0)
Google Firebase (since v1.3.0)
Google Firestore (since v1.3.0)
Heroku Postgres (since v1.3.0)
Google Cloud Platform SqlServer (since v1.3.0)
Google Cloud Platform MySQL (since v1.3.0)
Google Cloud Platform PostgreSQL (since v1.3.0)
CSV File for Agent (since v1.4.0)
Excel File for Agent (since v1.4.0)
REST API Connector (since v1.5.0)
Installation and Minimum Hardware Requirements
Note: We advise using one Secure Agent per machine, because EazyDI registers each agent with the following attributes:
system: Operating system (Windows or Linux)
name: name of the machine on the network (unique)
user: The user who registered the Agent to EazyDI
Minimum Hardware Requirements
Before installing the Secure Agent, ensure your system meets the following minimum hardware requirements:
For Windows:
Operating System: Windows 10 or later (64-bit)
Processor: Quad-core CPU (Intel or AMD) with a clock speed of 2.5 GHz or higher
Memory: 16 GB RAM
Disk Space: 50 GB of free disk space
Network: High-speed internet connection
Java: JDK 11 or later
For Linux:
Operating System: CentOS 7 or later, Ubuntu 18.04 or later (64-bit)
Processor: Quad-core CPU (Intel or AMD) with a clock speed of 2.5 GHz or higher
Memory: 16 GB RAM
Disk Space: 50 GB of free disk space
Network: High-speed internet connection
Java: JDK 11 or later
Installation Instructions
Follow these steps to install the Secure Agent on your system.
Prerequisites
(Python 3.9)
Install the latest Python 3.9.x release, which is the currently supported version for the Secure Agent's PySpark runtime.
Download Python from the official Python website, or manage it with conda (see the "Managing Python" section of the conda documentation).
Set the PYSPARK_PYTHON environment variable to point to the Python 3.9 binary used by PySpark, or switch to a conda environment that uses the same version. See https://spark.apache.org/docs/latest/configuration.html
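For example, on Linux the interpreter can be pinned before starting the agent. The path below is an assumption; point it at your own Python 3.9 binary (or activate a conda environment built on 3.9 instead):

```shell
# Pin PySpark to a Python 3.9 interpreter.
# /usr/bin/python3.9 is an example path -- adjust for your system.
export PYSPARK_PYTHON=/usr/bin/python3.9
export PYSPARK_DRIVER_PYTHON="$PYSPARK_PYTHON"
echo "PySpark interpreter: $PYSPARK_PYTHON"
```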
(Install JDK)
For Windows:
Download JDK:
Download the latest JDK from the official Oracle or OpenJDK Website
Install JDK:
Run the installer and follow the on-screen instructions to install the JDK.
Set JAVA_HOME Environment Variable:
Open the Start menu and search for "Environment Variables".
Click on "Edit the system environment variables".
In the System Properties window, click on the "Environment Variables" button.
Under "System variables", click "New" and set the variable name to JAVA_HOME and the variable value to the JDK installation path (e.g., C:\Program Files\Java\jdk-11).
Find the Path variable in the "System variables" section, select it, and click "Edit".
Click "New" and add %JAVA_HOME%\bin to the list.
Click "OK" to close all dialogs.
For Linux:
Install JDK:
For Ubuntu/Debian-based distributions:
sudo apt update
sudo apt install openjdk-11-jdk
For CentOS/RHEL-based distributions:
sudo yum install java-11-openjdk-devel
Set JAVA_HOME Environment Variable:
Open your .bashrc or .profile file in a text editor:
nano ~/.bashrc
Add the following lines at the end of the file:
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH
Save the file and reload it:
source ~/.bashrc
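As a quick sanity check, the variables can be verified afterwards. The JDK path below mirrors the Ubuntu example; adjust it if your distribution installs OpenJDK elsewhere:

```shell
# Confirm JAVA_HOME is set and its bin directory leads the PATH.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64   # example install path
export PATH="$JAVA_HOME/bin:$PATH"
echo "JAVA_HOME=$JAVA_HOME"
# With a real install, `java -version` should now report OpenJDK 11.
```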
Downloading the Eazydi Agent
Log in to the EazyDI application, navigate to the Environments tab, and click the New Agent button
Click Generate New Token and copy it, or optionally download the new token as a text file
Choose the platform and click Download Secure Agent to download the latest build of the EazyDI Agent
Setting up the Eazydi Agent
app.ini
The app.ini file is a critical configuration file for the Secure Agent, dictating how it connects to the cloud service. Below is a detailed breakdown of the sections and key parameters within the app.ini file.
base_socket_url
Description: The URL used by the Secure Agent to establish a connection with the EazyDi platform.
Purpose: This is a crucial setting that should not be altered. It specifies the endpoint for all communication between the agent and the cloud service.
username
Description: The username associated with your EazyDi account.
Purpose: This parameter is used for authenticating the Agent with the EazyDi platform. It ensures that the agent operates under the correct user context.
Example:
username=myusername@eazydi.com
token
Description: The authentication token provided by EazyDi.
Purpose: This token is used in conjunction with the username to authenticate the Secure Agent. It is essential for secure and authorized access to the platform.
Example:
token=320af1e1-5786-4779-bf20-1ecadb92c900
spark_memory
Description: The amount of memory allocated to Apache Spark.
Purpose: This setting determines how much memory is available for Spark jobs executed by the Agent. Adequate memory allocation is crucial for performance, especially when handling large datasets. See https://spark.apache.org/docs/latest/configuration.html
Example:
spark_memory=6g
spark_master
Description: The Spark master URL.
Purpose: This specifies the master node for the Spark cluster. In standalone or local mode, it indicates that Spark jobs should run on the local machine. See https://spark.apache.org/docs/latest/submitting-applications.html
Example:
spark_master=local
spark_cores
Description: The number of CPU cores allocated to Spark.
Purpose: This parameter sets the number of CPU cores available for Spark processing tasks. More cores can improve performance by allowing more parallel processing. See https://spark.apache.org/docs/latest/configuration.html
Example:
spark_cores=2
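Putting the parameters together, a minimal app.ini might look like the sketch below. The values are the examples from this page; leave base_socket_url exactly as it ships in the download:

```ini
; Sketch of app.ini using the example values above.
; base_socket_url is omitted here -- keep the shipped value unchanged.
username=myusername@eazydi.com
token=320af1e1-5786-4779-bf20-1ecadb92c900
spark_memory=6g
spark_master=local
spark_cores=2
```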
For Windows
Unzip the downloaded file and open app.ini
Fill in username and token
Double-click eazydi agent.exe to run
Notice in the Environments tab that your machine is registered and its status shows as up
Notice the agent receiving a message after registering the machine
Using the EazyDI Agent
Create connections that support the EazyDI Agent and assign the environment to your Agent
When creating pipelines with the EazyDI Agent, the source and target connections must belong to the same environment
Before running jobs, make sure the Agent is up and running; the agent logs will show incoming job requests
Once a job is complete, it will appear in EazyDI's Monitor → Archived Listings
For Linux
Extract the downloaded tar.gz archive:
tar -xzf <downloaded-archive>.tar.gz
Update app.ini using your preferred Linux editor
Fill in username and token
Run the agent
Notice in the Environments tab that your machine is registered and its status shows as up
Notice the agent receiving a message after registering the machine
Pre and Post Environment Script Execution (since v1.6.0)
See Pipeline Pre and Post Script Executions | Agent Scripts Execution