ETL Developer / Azure Data Engineer (LATAM)
Gravity IT Resources
Location: LATAM – Remote (HQ in Florida, US)
An ETL Developer specializing in Azure is responsible for designing, developing, and maintaining data pipelines that extract, transform, and load data from various sources into Azure data storage solutions.
Responsibilities:
- Design and Implement ETL Processes: Develop ETL processes using Azure tools like Azure Data Factory, Azure Synapse Pipelines, and Azure Databricks.
- Data Integration: Extract data from various sources, transform it into a suitable format, and load it into Azure Data Lake or Azure SQL Database.
- Data Pipeline Management: Create and maintain data pipelines to ensure efficient data flow and processing.
- Collaboration: Work closely with data analysts, data scientists, and other stakeholders to understand data requirements and ensure data accuracy.
- Optimization: Identify and implement process improvements that optimize data delivery and improve scalability.
- Monitoring and Troubleshooting: Monitor ETL processes and troubleshoot any issues that arise to ensure smooth operation.
- Documentation: Maintain comprehensive documentation of ETL processes and data flows.
Qualifications:
- Experience: Proven experience as an ETL Developer, preferably with Azure.
- Technical Skills: Proficiency in Azure Data Factory, Azure Synapse Analytics, Azure Databricks, and SQL. Familiarity with big data tools like Hadoop and Spark is a plus.
- Analytical Skills: Strong analytical and problem-solving skills to handle complex data integration tasks.
- Communication: Excellent communication skills to collaborate with team members and stakeholders.
- Education: A degree in Computer Science, Information Systems, or a related field.
Preferred Skills:
- Programming: Knowledge of programming languages such as Python or Scala.
- Data Modeling: Experience with data modeling and database design.
- Cloud Security: Understanding of Azure security models and best practices for data security.