Data Engineer
Requirements
• 5+ years of work experience in a relevant field (Data Engineer, DWH Engineer, Software Engineer, etc.)
• Experience with data-lake and data-warehousing technologies (Presto, Athena, Glue, etc.) and relevant data modeling best practices
• Proficiency in at least one of the main programming languages used: Python or Scala. Expertise in additional programming languages is a big plus!
• Experience building data pipelines/ETL in Airflow, and familiarity with software design principles
• Excellent SQL and data manipulation skills using common frameworks like Spark/PySpark or similar
• Expertise in Apache Spark or similar Big Data technologies, with a proven record of processing high-volume, high-velocity datasets
• Experience gathering business requirements for data sourcing
• Bonus: Kafka and other streaming technologies such as Apache Flink

Additional Information
• This job is accepting ongoing applications; there is no application deadline.
• Please note, applicants are permitted to redact or remove information on their resume that identifies age, date of birth, or dates of attendance at or graduation from an educational institution.
• We consider qualified applicants with criminal histories for employment on our team, assessing candidates in a manner consistent with the requirements of the San Francisco Fair Chance Ordinance.
• Kraken is powered by people from around the world, and we celebrate all Krakenites for their diverse talents, backgrounds, contributions, and unique perspectives. We hire strictly based on merit, seeking out candidates with the abilities, knowledge, and skills most suitable for the job. We encourage you to apply for roles where you don't fully meet the listed requirements, especially if you're passionate or knowledgeable about crypto!