Data Engineer (Cumulus)

Gravity IT Resources

Position: Data Engineer
Type: Contract
Location: Remote (must be able to work EST hours)

This role will own the client's complex replication framework, replicating tables from source systems and ensuring they flow through with the expected structure.

Our client is looking for a Data Warehouse Engineer to join their IT Data Engineering Team and work on an IT Storefront project. The ideal candidate will be a well-rounded technologist with a passion for building innovative business solutions with big data and ETL technologies. Experience delivering big data solutions – from system setup to development to business solution delivery – is critical, as is strong programming experience. Big data technologies are constantly evolving, so the Data Warehouse Engineer should keep up with the latest developments and be able to connect technical capability with business need.

What You’ll Do:
• Write Python code to process and flatten JSON files, building ingestion pipelines from Snowflake to Vertica (see the sketch after this list)
• Implement both real-time and batch data ingestion routines, following best practices in data modeling and ETL/ELT processes
• Write documentation on design, architecture and solutions
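
Below is a minimal Python sketch of the JSON-flattening step described in the first bullet. It assumes nested records arrive as Python dicts (e.g. unloaded from Snowflake); the record shape, key names, and separator are illustrative assumptions rather than the client's actual schema, and serializing arrays back to JSON text is just one reasonable choice before loading into Vertica.

    # Hedged sketch: flatten nested JSON records into flat column/value rows.
    # The input shape and key names are assumptions for illustration only.
    import json
    from typing import Any

    def flatten(record: dict[str, Any], parent_key: str = "", sep: str = "_") -> dict[str, Any]:
        """Recursively flatten nested dicts; lists become JSON text columns."""
        items: dict[str, Any] = {}
        for key, value in record.items():
            new_key = f"{parent_key}{sep}{key}" if parent_key else key
            if isinstance(value, dict):
                items.update(flatten(value, new_key, sep))
            elif isinstance(value, list):
                items[new_key] = json.dumps(value)  # keep arrays as JSON strings
            else:
                items[new_key] = value
        return items

    if __name__ == "__main__":
        raw = {"order": {"id": 42, "customer": {"name": "Ada"}}, "tags": ["new", "vip"]}
        print(flatten(raw))
        # {'order_id': 42, 'order_customer_name': 'Ada', 'tags': '["new", "vip"]'}

In a real pipeline this function would sit between the extract step and the Vertica load, with the flattened keys mapped onto target table columns.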

What You’ll Need:
• Experience in data warehouse design and data integration methodologies
• SQL expertise and the ability to work with many different MPP databases (Vertica, Snowflake, Postgres, etc.)
• 5+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools
• Experience building data engineering solutions on AWS using S3, EC2, ECS, Kinesis, DynamoDB, Lambda, etc.
• 5+ years of experience integrating data between operational databases using data integration tools like Informatica, CloverDX, DataStage, Talend, Pentaho Data Integrator, or SSIS
• 5+ years of hands-on experience designing and developing scalable, high-performing, and fault-tolerant applications for large enterprises
• 5+ years of experience writing software in native Python
• Expert-level knowledge of data integration and familiarity with common data integration challenges like converting data types, handling errors, and translating between different technology stacks (a small sketch of this kind of conversion and error handling follows this list)
• Experience flattening and transforming JSON files
• Able to work in a fast-paced environment and comfortable being accountable for work products
• Hands-on experience delivering high-performance distributed systems in public cloud environments
• Experience performing analytics to solve complex business problems
• Experience with cloud-based data warehouse platforms (e.g., Snowflake)
• BS in Computer Science, Mathematics, Engineering, or Information Technology
• Position may require travel
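
As a hedged illustration of the data-type conversion and error handling called out above, the sketch below coerces string fields into target types and routes bad rows to a reject list instead of failing the whole batch. The schema and field names are assumptions made for illustration, not a real client schema.

    # Hedged sketch: per-row type conversion with reject handling for a batch.
    # TARGET_SCHEMA and its fields are illustrative assumptions.
    from datetime import date

    TARGET_SCHEMA = {"order_id": int, "amount": float, "order_date": date.fromisoformat}

    def convert_batch(rows):
        good, rejects = [], []
        for row in rows:
            try:
                good.append({col: cast(row[col]) for col, cast in TARGET_SCHEMA.items()})
            except (KeyError, ValueError) as exc:  # missing column or unparseable value
                rejects.append((row, repr(exc)))
        return good, rejects

    if __name__ == "__main__":
        batch = [
            {"order_id": "1", "amount": "19.99", "order_date": "2024-01-31"},
            {"order_id": "oops", "amount": "5.00", "order_date": "2024-02-01"},
        ]
        good, rejects = convert_batch(batch)
        print(f"{len(good)} loaded, {len(rejects)} rejected")  # 1 loaded, 1 rejected

Routing rejects to a side channel (rather than aborting) is a common pattern in the integration tools listed above, such as Informatica or SSIS.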

Bonus:
• Working experience with Airflow
• Background in e-commerce
• Agile experience
• Java experience
• Experience with statistical methods and data science