Data Architect/Engineer (Potential Extension or Conversion)
Location: Charlotte, NC (Preferred) or Remote
Reports to: MDM Architect
About the Role
We are seeking a highly skilled Data Architect/Engineer to design and implement scalable, cloud-native data solutions on AWS. In this role, you’ll work hands-on within a modern tech stack (Python, Angular/TypeScript) and collaborate closely with data engineers, software developers, and business stakeholders to support our rapidly growing data infrastructure.
You’ll help build high-performing data pipelines, architect cloud data solutions, optimize storage and retrieval patterns, and develop API-driven integrations. This is an opportunity to join a forward-thinking, AWS-driven environment and contribute to innovative data architecture initiatives.
What You’ll Do
- Design, build, test, and deploy scalable data pipelines.
- Architect cloud-native solutions on AWS, including data lakes, data warehouses, and real-time data pipelines.
- Develop serverless applications using AWS Lambda and AWS Glue (Python).
- Build and maintain API integrations that support reporting and analytics.
- Collaborate with development and analytics teams to define data models for BI/analytics.
- Troubleshoot data issues and optimize systems for scalability and performance.
- Ensure adherence to data security, privacy, and compliance standards.
- Create documentation, data flow diagrams, and architecture artifacts.
- Stay current with AWS technologies and industry best practices.
- Mentor junior engineers and contribute to team technical growth.
Basic Qualifications
- Bachelor’s degree in Computer Science, Information Systems, Analytics, or related field.
- 5+ years in data architecture, data engineering, or similar roles.
- 3+ years of Python development experience.
- 3+ years of ETL/data pipeline engineering experience.
- Strong understanding of OLTP design, ODS reporting, and dimensional modeling.
- Hands-on experience with AWS services (Lambda, Glue, S3, DynamoDB, Athena, etc.) and cloud data platforms such as Snowflake.
- Strong SQL and NoSQL experience with excellent query writing skills.
- Experience with API-based data access and integration.
- Knowledge of serverless architecture patterns.
- Experience working in Agile environments.
- Strong communication, analytical, and problem-solving skills.
- AWS certifications are highly desirable.
Preferred Skills
- Experience with modern data stack tools (dbt, Snowflake, Databricks).
- Exposure to ML pipelines or AI-driven analytics.
- DevOps/IaC experience (Terraform, CloudFormation).
- Experience with CI/CD workflows for data engineering.
Why Join Us?
- Opportunity to influence and shape cloud data architecture.
- Modern AWS-focused tech stack.
- Collaborative, innovative engineering culture.
- Ability to work fully remote if preferred.
- Contract-to-hire pathway with long-term potential.
Equal Employment Opportunity Statement
Gravity IT Resources is an Equal Opportunity Employer. We are committed to creating an inclusive environment for all employees and applicants. We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age, disability, genetic information, veteran status, or any other legally protected characteristic. All employment decisions are based on qualifications, merit, and business needs.