Sr. Big Data Engineer

Gravity IT Resources

To Apply for this Job Click Here

Job Title: Sr. Big Data Engineer
Location: Remote
Job Type: Contract 

Our client supports more than 7,000 institutional and retail hospitals and pharmacies with automation and analytics solutions that help increase operational efficiency, reduce medication errors, deliver actionable intelligence, and improve patient safety.

Our client is seeking an experienced professional to participate in creating and extending product optimizations. This person applies best practices to take data from multiple sources and to build and manage a pipeline serving advanced analytics, including dashboards, reporting, and even ML. Drawing on practical and theoretical knowledge from Data Science or a related field, this person works as part of a team to design and develop new products as well as maintain Omnicell’s current portfolio.

  • Lead the solution design for a diverse range of complex problems with a collaborative approach with team members on a single scrum team.
  • Provide technical leadership to agile teams, onshore and offshore; mentor team members ranging from junior engineers to new hires.
  • Implement data pipelines on a scrum team to deliver new data sources and features for machine learning.
  • Implement data quality checks, classification, and data lineage.
  • Write and validate test cases for other team members’ code.
  • Implement unit tests to cover the new features being created.
  • Augment the CI/CD pipeline to deliver new features to production.
  • Approve pull requests after a thorough code review, or request changes to better solve the problem or adhere to standards.
  • Provide engineering support for escalations from the 24/7 Technical Assistance Center when production downtime occurs.

Required Knowledge and Skills

  • Advanced SQL skills with an emphasis on being able to review and understand data via ER Diagrams and exploratory queries.
  • Deep development experience with distributed/scalable systems and high-volume transaction applications, including architecting big data projects.
  • Ability to conduct unit testing using automated frameworks.
  • Ability to convey ideas and integrate with external systems.
  • Expert in using big data technologies such as Apache Kafka, Apache Spark, real-time streaming, Structured Streaming, and Delta Lake.
  • Expert in Agile development practices and the Software Quality Assurance process. Comfortable shipping new features in complex environments.
  • Uses independent judgment to accomplish objectives.
  • Excellent analytical and problem-solving skills.
  • Focus on development, architectural consultancy, and teamwork.
  • Energetic, motivated self-starter who is eager to excel, with excellent interpersonal skills.
  • Able to balance driving the right architecture with the realities of serving customers and the need to ship software.
  • Able to establish strong relationships with managers and teammates, communicate clearly and on time, and help the team work efficiently.
  • Have a can-do attitude and make a positive impact on our culture.
  • Always put the customer first. 

Basic Qualifications

  • Bachelor’s Degree in Software Engineering or quantitative field of study preferred; may consider relevant experience in lieu of a Bachelor’s degree
  • 8+ years hands on experience in data engineering with a degree; 12+ years hands on experience in data engineering in lieu of a degree
  • Hands-on development experience with distributed/scalable systems and high-volume transaction applications
  • Experience in data profiling, management, and ETL methodology 
  • Hands-on programming experience in Scala, Python, or other object-oriented programming languages.
  • Experience in creating architecture diagrams and models.

Preferred Qualifications

  • Hands-on working experience with cloud infrastructure such as AWS. Able to scale code and deploy applications in the public cloud using technologies like AWS Lambda, Docker, and Kubernetes.
  • Working knowledge of tools like Databricks notebooks, JIRA, CodeFresh, and DataDog.
  • Healthcare or pharmaceutical experience working with interfaces such as HL7, FHIR, and EDI, and familiarity with PHI.
  • Big Data Technical Certification.
