Allata - Data Manager (Snowflake)
Requirements
• Deep hands-on Snowflake experience: warehouses, databases/schemas, stages, file formats, external tables, Snowpipe, Streams and Tasks, Time Travel/Fail-safe, query performance, micro-partitions, clustering, and cost management (see the sketch after this list).
• Strong SQL and Python for data engineering use cases.
• ELT-first development with dbt or an equivalent modeling/testing/documentation framework.
• Experience with modern ingestion and orchestration (Fivetran, ADF, Glue, Matillion, Airflow, or Dagster).
• Cloud experience (AWS preferred; Azure or GCP considered), including storage, IAM, networking basics, and secrets management.
• CI/CD for data with Git-based workflows and environment promotion.
• Familiarity with Infrastructure as Code (e.g., the Terraform Snowflake provider).
• Solid understanding of dimensional modeling and data quality practices.
• Snowpark (Python), UDFs or stored procedures, and Dynamic Tables.
• Streaming or CDC with Kafka, Kinesis, or Debezium; event-driven patterns.
• BI or semantic layer exposure (Power BI, Tableau, Looker); metrics layer concepts.
• Experience with Azure Synapse, Microsoft Fabric, or Databricks in lakehouse contexts.
• Security and compliance exposure (PII handling, encryption, auditing, regulated environments).
• Proven leadership with coaching, feedback, objective setting, and conflict resolution.
• Excellent stakeholder communication; able to facilitate discovery and manage expectations.
• Ownership mindset with proactive risk identification, prioritization, and mitigation.
• Structured problem-solving; comfortable with ambiguity and change.

At Allata, we value differences.
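For illustration only, the Streams and Tasks pattern named in the first requirement might look like the minimal Snowflake SQL sketch below: a stream captures changes on a raw table, and a scheduled task merges them into a curated table. The object names (raw.orders, raw.orders_stream, curated.orders, transform_wh) and the 15-minute schedule are placeholders for this example, not details from the role.

    -- Capture row-level changes on the raw table with a stream.
    CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

    -- Scheduled task that merges captured changes into the curated table,
    -- running only when the stream actually has data. Filtering on
    -- METADATA$ACTION = 'INSERT' keeps the post-update row for updates,
    -- which the stream records as a delete/insert pair.
    CREATE OR REPLACE TASK merge_orders_task
      WAREHOUSE = transform_wh
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
    AS
      MERGE INTO curated.orders tgt
      USING (
        SELECT * FROM raw.orders_stream
        WHERE METADATA$ACTION = 'INSERT'
      ) src
        ON tgt.order_id = src.order_id
      WHEN MATCHED THEN UPDATE SET
        tgt.amount = src.amount,
        tgt.updated_at = src.updated_at
      WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
        VALUES (src.order_id, src.amount, src.updated_at);

    -- Tasks are created suspended; resume to start the schedule.
    ALTER TASK merge_orders_task RESUME;

Dynamic Tables, also listed above, can express the same incremental transformation declaratively; the right choice depends on latency and control requirements.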
Responsibilities
• Define and evolve Snowflake platform architecture, standards, and best practices.
• Translate business goals into a pragmatic technical roadmap and delivery plan.
• Lead and mentor data engineers; establish quality bars, review code, and guide execution.
• Manage delivery using Agile rituals; align priorities across data, analytics/BI, and application teams.
• Design, build, and optimize data pipelines on Snowflake with an ELT-first approach (dbt preferred).
• Implement modern data ingestion using tools such as Fivetran, ADF, Glue, or Matillion.
• Set up orchestration and CI/CD for data using Airflow or Dagster and Git-based pipelines.
• Ensure data quality, observability, monitoring, alerting, documentation, and runbooks.
• Implement security and governance in Snowflake (RBAC, masking, row access policies, auditing, data sharing); see the sketch after this list.
• Facilitate discovery with stakeholders, clarify requirements, and communicate trade-offs and recommendations.
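A companion sketch for the security and governance responsibility: the Snowflake SQL below applies dynamic masking to a PII column and a row access policy driven by a role-to-region mapping table. All names here (PII_READER, curated.customers, curated.orders, security.role_region_map) are hypothetical, chosen only to make the example self-contained.

    -- Dynamic data masking: only the PII_READER role sees raw emails;
    -- every other role sees a redacted value.
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
        ELSE REGEXP_REPLACE(val, '.+@', '*****@')
      END;

    ALTER TABLE curated.customers
      MODIFY COLUMN email SET MASKING POLICY email_mask;

    -- Row access policy: non-admin roles see only rows for regions
    -- granted to them in a (hypothetical) mapping table.
    CREATE OR REPLACE ROW ACCESS POLICY region_filter AS (region STRING) RETURNS BOOLEAN ->
      CURRENT_ROLE() = 'ADMIN'
      OR EXISTS (
        SELECT 1
        FROM security.role_region_map m
        WHERE m.role_name = CURRENT_ROLE()
          AND m.region = region
      );

    ALTER TABLE curated.orders ADD ROW ACCESS POLICY region_filter ON (region);

Policies like these attach once at the object level and apply to every query path, which is what makes them auditable.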