Data Platform Engineer

Job ID: 8971
Job Type: Contract
Salary Range: $125k - $150k
Referral Bonus: +/- $2080

To Apply for this Job Click Here

Title: Data Platform Engineer
Type: Contract to Hire
Location: REMOTE

Overview
Own the multi-stage data pipeline layer that ingests from external sources (BigQuery direct queries, vendor file feeds, APIs) into a governed lake/warehouse. Deliver scalable event-driven integrations: manage ETL for downstream integrations, an event bus for user-engagement services, and API/query access for billions of data points.
Responsibilities

  • Design and implement ingestion/connectors (BigQuery direct, CSV/JSON, REST) and normalization into standardized data models.
  • Build event-driven jobs and services (Python/Node) that enrich, dedupe, and apply rules; ensure idempotency and safe replays.
  • Define data contracts, schema evolution, backfill strategy, and cutover plans; partner with stakeholders on acceptance criteria.
  • Operate AWS workloads (EC2, Lambda, AppRunner, RDS, Redshift) with Terraform; secure secrets, roles, and least-privilege access.
  • Optimize SQL for MPP systems (Redshift, Snowflake, or similar); profile queries, partition/cluster, and tune materializations.
  • Implement observability (logs, metrics, tracing, lineage) and incident response; drive postmortems and remediation.
  • Maintain concise documentation of architecture, workflows, standards, and governance.

Required Skills

  • 7–10+ years backend/data engineering with production ownership of large event-driven data systems.
  • Proficient in Python, Node, and similar toolsets for ETL job implementation; strong testing and reliability practices.
  • Deep AWS experience and Terraform-based IaC; CI/CD for data and application deployments.
  • Expert SQL and performance tuning on Redshift, Snowflake, or equivalent.
  • Experience delivering idempotent pipelines, restaging/reconciliation, and parity validation against legacy systems.
  • Experience in highly-governed environments (HIPAA, GLBA, PCI, etc.).
  • Solid security and compliance practices, supporting internal and external audits (SOC 2, ISO 27001, etc.) and remediation.
  • Orchestration experience (Airflow or similar), streaming (Kinesis/Kafka/SQS), and dbt or warehouse-centric modeling (preferred).
  • Data quality frameworks, lineage/metadata tooling, and SLA/SLO design (preferred).
  • Exposure to real-time enrichment and rules engines; familiarity with warehouse-native features (tasks/streams) (preferred).



Equal Employment Opportunity Statement
Gravity IT Resources is an Equal Opportunity Employer. We are committed to creating an inclusive environment for all employees and applicants. We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age, disability, genetic information, veteran status, or any other legally protected characteristic. All employment decisions are based on qualifications, merit, and business needs.
