To Apply for this Job Click Here
Job Title: Snowflake Developer
Location: Tallahassee, FL
Job Type: Contract
Employment Eligibility: Gravity cannot transfer or sponsor a work visa for this position. Applicants must be eligible to work in the U.S. directly for any employer.
Required Skills
- Candidate must have a minimum of 3 years of experience in data engineering, analytics, or cloud data warehousing, with at least 2 years of hands-on experience designing and implementing solutions using the Snowflake Data Cloud platform.
- Expert-level SQL programming is REQUIRED for this position.
- Proven experience with Snowflake platform architecture and data warehousing concepts.
- Expertise in building efficient, secure, and scalable data models in Snowflake using views, materialized views, and secure shares.
- Strong knowledge of ELT/ETL patterns and tools (e.g., dbt, Airflow, Talend, Informatica, MS SSIS, Fivetran).
- Solid understanding of data governance, security roles, masking policies, and RBAC within Snowflake.
- Experience working with cloud storage integrations (e.g., AWS S3, Azure Blob) and external tables in Snowflake.
- Familiarity with dimensional modeling (Star/Snowflake Schema), OLAP concepts, and reporting layers for BI tools.
- Strong communication and analytical skills for working with cross-functional teams and converting data requirements into technical solutions.
- Strong understanding of current data governance concepts and best practices.
- Knowledge of data migration best practices from external data sources and legacy systems (e.g., mainframe, DB2, MS SQL Server, Oracle) into Snowflake.
- Experience with data visualization tools (Power BI, Tableau, Looker) and building BI semantic models using Snowflake as a backend is a PLUS.
- Experience working with financial, ERP, or general ledger data in a reporting or analytics capacity is a PLUS.
- Exposure to mainframe systems, legacy flat files, and their integration with cloud-based platforms is a PLUS.
- Familiarity with Agile/SCRUM frameworks and experience working in iterative development cycles is a PLUS.
- Experience with Oracle Data Warehouse is a PLUS.
- Understanding of DevOps and CI/CD practices in data engineering (e.g., Git, dbt Cloud, or GitHub Actions) is a PLUS.
Duties and Responsibilities
- Analyze the current data environment, including data sources, pipelines, and legacy structures, to determine required transformations and optimal migration strategies into Snowflake.
- Collaborate with stakeholders and data architects to design and implement scalable, secure, and cost-effective data architecture using Snowflake.
- Re-engineer legacy reporting logic (e.g., WebFOCUS, Mainframe FOCUS, and T-SQL) by translating it into Snowflake SQL and optimizing performance.
- Develop and automate ELT/ETL data pipelines using Snowflake’s native features such as Snowpipe, Streams, and Tasks, along with tools like Informatica and external orchestration platforms (e.g., dbt, Airflow).
- Partner with analysts and business users to build efficient, reusable data models and secure views within Snowflake that support downstream reporting (e.g., Power BI, Tableau, or Looker).
- Optimize query performance and data governance by implementing best practices in Snowflake for security, access control, caching, clustering, and cost monitoring.
- Support training, documentation, and knowledge transfer to internal teams, ensuring smooth adoption and use of Snowflake-based solutions.