Data Engineer
Requirements
• Experience building both streaming and batch data pipelines/ETL and familiarity with their design principles.
• Expertise in Python, PostgreSQL, and PL/pgSQL, including development and administration of large databases, with a focus on performance and production support in cloud-native deployments.
• Experience with scalability solutions and multi-region replication and failover.
• Experience with data warehouse technologies (Trino, ClickHouse, Airflow, etc.).
• Bachelor’s degree (or its foreign equivalent) in Computer Science, Engineering, or a related technical discipline, or equivalent experience.
• Deep understanding of programming and experience with at least one programming language.
• English language proficiency.
• Knowledge of Kubernetes and Docker.
• 4+ years of experience in a relevant data field.
• Knowledge of blockchain technology and the mining pool industry.
• Experience with agile development methodology.
• Experience delivering and owning web-scale data systems in production.
• Experience working with Kafka, preferably Redpanda and Redpanda Connect.
• Passionate about cryptocurrency and public-blockchain technologies.
• Interested in creating an entirely new market with hashrate (compute power) as a commodity.
• Interested in shaping and evolving the architecture of our software to make it robust and maintainable.
• Enjoys writing code and pushing the boundaries of what has been done so far.
• Brings fun to the team but can also go down the rabbit hole to deliver quality code on schedule.
Responsibilities
• Build scalable and reliable data pipelines that provide accurate data feeds from internal and external systems.
• Govern scalable and performant cloud-deployed production relational and non-relational databases.
• Collaborate on architecture definitions, always designing solutions that are scalable and secure.
• Drive data systems to be as near real-time as possible.
• Design, document, automate, and execute test plans to ensure the quality of the datasets.
• Participate in the process of generating and analyzing features.