Unix Operations Engineer

Gravity IT Resources

Job Title: Unix Operations Engineer

Location: Remote

Work Authorization: US Citizen or Green Card Holder

Job Type: FTE

Position Overview:

Our client delivers truly disruptive and transformative products and services that will impact the healthcare industry. The work we do makes a difference.

Our client is looking for an exceptional Data Systems Operations Engineer with a passion for technology to help revolutionize the world of Healthcare IT. Data Operations Engineers are at the core of a data-driven business; they build and maintain the infrastructure that empowers customers, analysts, and data scientists to drive insights. We’ve built a team of passionate, creative, and innovative engineers and data scientists who are changing the world and having fun doing it.


You may be a great fit for the Data Systems Operations Engineer opportunity if…

  • You are a gifted Data Operations Engineer with a background in large-scale data engineering support.
  • You enjoy using technology to automate solutions and optimize outcomes, focusing on data engineering, continuous integration, and continuous deployment.
  • You are a collaborator. You thrive in environments that freely exchange ideas and viewpoints.
  • You are an innovator who believes in making a difference and having fun doing it.
  • You are passionate about data and about being part of a tight-knit Data Operations team.

Your Role:

  • Automate, deploy, and operate data pipelines
  • Automate build, deployment, and quality processes
  • Implement facilities to monitor all aspects of the data pipeline
  • Manage data in Spark and other environments using scripts and automation
  • Communicate and address build, deployment, and operational issues as they arise
  • Implement, administer, and support the Qlik Sense systems
  • Perform installations, upgrades, patches, and backups of the Data Operations systems
  • Maintain documentation of Data Operations systems and processes
  • Support the data usage needs of downstream analytics teams
  • Ensure availability meets or exceeds agreed-upon SLAs
  • Participate in a 24×7 on-call rotation for critical issues


Qualifications:

  • BS in Computer Science or equivalent work experience.
  • Experience managing data in relational databases and developing ETL pipelines
  • Experience using Spark SQL or other Big Data tools
  • Experience implementing and administering logging, telemetry, and monitoring tools such as Opsview
  • Experience using AWS or other cloud services
  • Experience scripting for automation and configuration management (Chef, Puppet, Ansible)
  • Experience with orchestration software such as Airflow or Luigi
  • Fluent in at least one scripting or systems programming language (Python, Ruby, Bash, Go, Rust, Crystal, etc.)
  • Deep knowledge of the Linux and Windows operating systems (OS internals, networking, process level)
  • Strong problem-solving skills
  • Interest in DevOps-style engineering teams – we operate what we build!
  • Strong verbal and written communication skills
  • Strong commitment to quality, architecture, and documentation.
  • Experience working with the Qlik Sense product in a large-scale enterprise environment is a plus
