Full-Stack Data Platform Engineer - Remote (Europe)
Requirements
• Skills: Experience as a full-stack data platform engineer. Proficiency in JavaScript/TypeScript is preferred (the backend runs on Node.js), and familiarity with React and Redux is expected for frontend work on Reflow's platform, though we endorse learning on demand.
• Years of experience: 3+ years in a similar role, or an equivalent combination of skills and experience. Prior full-stack development and data engineering roles are a plus.
• Education: No explicit requirement; a relevant Bachelor's degree or higher may be preferred.
• Certifications: None required. Willingness to learn new tools relevant to full-stack data platform engineering is valued; we endorse a growth mindset.
• Must-haves: Experience with distributed systems, cloud infrastructure (AWS), and containerization technologies such as Docker and Kubernetes. Familiarity with CI/CD pipelines and experience with tools such as Jenkins or GitLab Runner are also required.
Responsibilities
• Design, implement, and maintain a scalable data warehouse (BigQuery, Snowflake, Redshift, or similar).
• Develop and optimize ETL pipelines to ingest data from APIs and internal systems (see the illustrative sketch after this list).
• Model and manage datasets to support flexible analytics and product features.
• Collaborate with the engineering team to improve data mining and analytics performance.
• Build and maintain dashboards and visualization tools (Metabase, Tableau, Power BI) to enable internal and external insights.
• Ensure data reliability, cost efficiency, and performance optimization across environments.
• Implement event-based pipelines for real-time analytics and reporting.
• Contribute to data governance, privacy, and security best practices.
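For candidates curious about the flavor of this work, here is a minimal, purely illustrative ETL sketch in TypeScript on Node.js (the stack named above). The endpoint, field names, and staging file are hypothetical, not Reflow's actual systems; a real pipeline would hand the staged output to a warehouse load job rather than a local file.

```typescript
// Illustrative ETL step: extract rows from an HTTP API, apply a light
// transform, and stage newline-delimited JSON for a warehouse bulk load.
// Endpoint and field names are hypothetical placeholders.
import { writeFile } from "node:fs/promises";

interface RawEvent {
  id: string;
  ts: string; // ISO-8601 timestamp from the source system
  amount_cents: number;
}

// Extract: pull a batch of raw events (Node 18+ global fetch).
async function extract(url: string): Promise<RawEvent[]> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`extract failed: HTTP ${res.status}`);
  return (await res.json()) as RawEvent[];
}

// Transform: drop malformed rows and normalize units.
function transform(rows: RawEvent[]) {
  return rows
    .filter((r) => r.id && !Number.isNaN(Date.parse(r.ts)))
    .map((r) => ({
      event_id: r.id,
      event_time: new Date(r.ts).toISOString(),
      amount_usd: r.amount_cents / 100,
    }));
}

// Load (staging): NDJSON is a format BigQuery, Snowflake, and Redshift
// can all ingest via their bulk-load paths.
async function run() {
  const raw = await extract("https://api.example.com/v1/events");
  const ndjson = transform(raw).map((r) => JSON.stringify(r)).join("\n");
  await writeFile("events.ndjson", ndjson, "utf8");
}

run().catch((err) => {
  console.error(err);
  process.exit(1);
});
```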
Benefits
• You’ll build the backbone of Reflow’s intelligence layer: the systems that make our data usable, fast, and insightful. You’ll work directly with founders and engineers across analytics, infrastructure, and product. This is a high-impact technical role that sits at the intersection of scale, performance, and strategy.
• We’re open to part- or full-time. Ideal for builders who care about performance and precision at scale.
• We offer competitive pay based on the market and where you’re located. The salary ranges in our job postings are intentionally wide because they need to cover both U.S. and international candidates. Our final offer will depend on factors like your experience, skill set, and location.