Data Engineer

Gravity IT Resources

To Apply for this Job Click Here

Job Title: Data Engineer
Location: Remote
Job-Type: Full-Time
Referral Fee: $1000
Employment Eligibility: Gravity cannot transfer nor sponsor a work visa for this position. Applicants must be eligible to work in the U.S. for any employer directly (we are not open to contract or “corp to corp” agreements).

Position Overview:

Our client is looking for a Data Engineer to join the team. The Senior Data Engineer will provide leadership in the ongoing development of new and existing data pipelines, including data movement, data quality, data cleansing, and other ETL-related activities. The role serves the organization's data integration and information needs, and the candidate will work with all business areas to ensure functional use of, and access to, data intelligence, information, and analytics.


Essential Duties and Responsibilities:

  • Participate in all phases of data integration, including analysis, requirements gathering, design, documentation, and development of data-related architectures, structures, and ETL processes.
  • Develop and maintain high volume streaming pipelines as well as batch processes.
  • Maintain and improve existing framework to support data integration pipelines.
  • Conduct code reviews with peers and provide best-practice design and architecture suggestions.
  • Interface with all areas of the business to understand and analyze business and functional requirements.
  • Provide tactical solutions on priority issues/requirements.
  • Maintain the data dictionary as processes and applications change.
  • Understand and implement best practices, tuning and optimization for applications and reporting systems.
  • Other duties as assigned.


Education and Experience Requirements:


  • BS or MS degree in computer science, information technology, or another computer-based discipline.
  • 5+ years of experience in designing and developing ETL data pipelines.
  • 5+ years of experience with an object-oriented programming language (Python, C#, or Java).
  • Experience with ETL methodologies and practices.
  • Experience with large scale streaming pipelines is a plus.
  • Proven analytical and database skills.
  • MSSQL and NoSQL experience.
  • Experience utilizing and extending ETL processes in a complex, high-volume data environment.
  • Or an equivalent combination of education and experience that provides the required knowledge, skills and abilities.


Knowledge, Skills, and Abilities Requirements:


  • Highly proficient in Python, SQL/T-SQL, and other ETL/ELT applications.
  • Hands-on experience executing high-volume data loads into SQL and NoSQL environments is preferred.
  • Knowledge of RESTful API architecture, including experience creating and consuming APIs.
  • Comfortable using Git version control and working in a CI/CD pipeline.
  • Must have independent problem-solving skills and ability to develop solutions to complex analytical/data storage problems.
  • Excellent interpersonal skills necessary to work effectively with colleagues at various levels of the organization and across multiple locations.
  • Ability to engage in multiple initiatives simultaneously.
  • Ability to identify critical issues and determine the appropriate level of escalation when needed.
  • Ability to work successfully in a team environment using Agile Scrum methodologies.
  • Continually seeks opportunities to increase internal customer satisfaction and deepen relationships.
  • Suggests areas for improvement in internal processes, along with possible solutions.
  • Excellent verbal and written communication skills and the ability to interact professionally with a diverse group including executives, managers, and subject matter experts.
  • Experience in Big Data technologies and utilities is preferred.
  • Financial Services and Commercial banking experience is a plus.
