
Senior Data Engineer - Data Platform and Analytics

Worldly · Remote - Concord, California, United States · $135k – $165k + Equity · 1w ago
Remote · Senior · NA · Insurance · Logistics · Cloud Computing · Senior Data Engineer · Neo4j · Product Marketing · Reporting · AWS · SQL


Requirements

  • Experience with semantic layers (Cube.dev preferred).
  • Familiarity with genAI/NLP enablement.
  • Exposure to graph databases (knowledge graphs) and related concepts (Neo4j preferred).

What We Can Offer You

  • Medical, Dental, and Vision Insurance are offered through multiple PPO options. Worldly covers 90% of the employee premium and 60% of the spouse/dependent premium.
  • Company-sponsored 401k with up to 4% match for US employees.
  • Incentive Stock Options.
  • 100% Paid Parental Leave.
  • 12 paid company holidays.

Life at Worldly

  • Our team is motivated to transform the way products are made. By helping our customers succeed in a new era of sustainable production, we can build technology that makes a difference on a planetary level.
  • Our team represents over 15 countries and brings unique experiences, from technology to farming, to the table. Surround yourself with kind, enthusiastic, and dedicated people who put collaboration and growth at the center of our shared goals.

Responsibilities

  • You will collaborate with stakeholders across the organization to design and implement scalable, cloud-based data solutions, integrating generative AI to drive innovation.
  • You will work closely with cross-functional stakeholders (finance, product, marketing, customer support, tech, data science) to enable trusted data products for internal decision-making and external-facing tools.
  • You will take a leading role in developing a data lake to complement our existing data warehouse, enabling greater flexibility in analytics and reporting.
  • You will work with AWS services, automation tools, machine learning, and generative AI to enhance efficiency, stability, security, and performance.
  • This role is expected to drive outcomes in day-to-day execution and operational stability, while partnering with senior engineering leadership on longer-range architecture direction.

SQL + Postgres Data Warehouse

  • Operate and evolve our Postgres data warehouse: schema design, performance tuning, indexing, access controls, and more.
  • Build analytics-ready datasets supporting sustainability measurement, supply-chain insights, and business metrics.
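To make the analytics-ready dataset work concrete, here is a minimal Python sketch of the kind of aggregation such a dataset build performs. The schema (`facility_id`, `year`, `score`) is hypothetical, not Worldly's actual warehouse model; in practice this would be a SQL transformation in the warehouse rather than application code.

```python
from collections import defaultdict

def build_facility_metrics(assessment_rows):
    """Aggregate raw assessment rows (hypothetical schema) into an
    analytics-ready dataset keyed by facility and year."""
    totals = defaultdict(lambda: {"assessments": 0, "score_sum": 0.0})
    for row in assessment_rows:
        key = (row["facility_id"], row["year"])
        totals[key]["assessments"] += 1
        totals[key]["score_sum"] += row["score"]
    # One summary row per (facility, year), sorted for stable output.
    return [
        {
            "facility_id": fid,
            "year": year,
            "assessments": t["assessments"],
            "avg_score": round(t["score_sum"] / t["assessments"], 2),
        }
        for (fid, year), t in sorted(totals.items())
    ]
```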
Semantic Layer Deployment

  • Deploy and maintain multiple instances of Cube.dev semantic layers with standardized configuration, CI/CD workflows, and governance practices (including documentation of processes, configurations, and troubleshooting).
  • Establish clear, consistent metric definitions and versioning across dashboards and analytics surfaces.
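The metric definition and versioning idea can be sketched with a small registry. This is a pure-Python stand-in for what a semantic layer like Cube.dev provides (centralized, versioned definitions that every dashboard resolves against); the metric name and SQL fragments are hypothetical examples.

```python
class MetricRegistry:
    """Minimal sketch of centralized, versioned metric definitions."""

    def __init__(self):
        self._metrics = {}  # name -> {version: definition}

    def register(self, name, version, sql, description):
        versions = self._metrics.setdefault(name, {})
        if version in versions:
            raise ValueError(f"{name} v{version} already defined")
        versions[version] = {"sql": sql, "description": description}

    def resolve(self, name, version=None):
        """Return (version, definition); latest version by default,
        so every surface pins or agrees on one definition."""
        versions = self._metrics[name]
        if version is None:
            version = max(versions)
        return version, versions[version]
```

A dashboard that calls `resolve("active_facilities")` always gets the current canonical definition, while older reports can pin an explicit version.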
GenAI/NLP Enablement

  • Support integration and deployment of genAI-enabled workflows, especially NLP-based use cases (classification, extraction, normalization, embeddings/similarity).
  • Ensure that our data infrastructure is "AI-ready".

Graph Data Enablement

  • In collaboration with data scientists, research and develop practical transition plans for evolving selected relational/warehouse data structures into a graph-based knowledge base (for use with Neo4j), including candidate use cases, data modeling approach, migration sequencing, and operational considerations (performance, governance, lineage, and security).
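The relational-to-graph transition can be illustrated with a small mapping sketch: relational rows become node and relationship records shaped for a Neo4j-style load (e.g., via `LOAD CSV` or the official driver). The supplier/facility schema and the `OPERATES` relationship are hypothetical illustrations, not Worldly's actual model.

```python
def rows_to_graph(supplier_rows):
    """Map relational supplier/facility rows (hypothetical schema)
    into deduplicated node records and relationship records."""
    nodes, rels, seen = [], [], set()
    for row in supplier_rows:
        # Emit each node once, even if it appears in many rows.
        for label, key in (("Supplier", row["supplier_id"]),
                           ("Facility", row["facility_id"])):
            if (label, key) not in seen:
                seen.add((label, key))
                nodes.append({"label": label, "id": key})
        # One relationship per source row.
        rels.append({
            "type": "OPERATES",
            "from": row["supplier_id"],
            "to": row["facility_id"],
        })
    return nodes, rels
```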
DBT Pipelines and Automation

  • Maintain and stabilize existing DBT pipelines that underpin reporting and analytics, including automation for incremental processing/scheduling, data quality monitoring, and performance tuning.
  • Lead operational support and modernization planning: rapid triage and root-cause resolution for pipeline issues, and evaluation/prototyping of next-generation transformation approaches with clear, low-risk transition plans in partnership with analytics and engineering stakeholders.
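The data quality monitoring mentioned above follows the same pattern as dbt's generic `not_null` and `unique` tests. A minimal sketch of that pattern, returning failures for alerting rather than raising (column names are hypothetical):

```python
def run_quality_checks(rows, not_null=(), unique=()):
    """Sketch of dbt-style data quality checks over a list of dict rows.
    Returns a list of failure descriptors suitable for alerting."""
    failures = []
    for col in not_null:
        missing = sum(1 for r in rows if r.get(col) is None)
        if missing:
            failures.append(f"not_null:{col}:{missing}")
    for col in unique:
        values = [r.get(col) for r in rows]
        dupes = len(values) - len(set(values))
        if dupes:
            failures.append(f"unique:{col}:{dupes}")
    return failures
```

In a real pipeline these results would feed a monitoring channel (e.g., a scheduler task that pages on a non-empty list) instead of being inspected by hand.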
AWS Data Pipelines

  • Build ingestion and ETL processes using S3, Glue, Lambda, and AppFlow.
  • Integrate data from third-party systems and APIs (e.g., Zendesk, HubSpot, NetSuite, other platform data) with strong auditability and operational resilience.
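The "auditability" requirement can be sketched as an ingestion step of the shape a Lambda handler might take: normalize a third-party payload, attach an audit record (source, timestamp, checksum), and hand the result to a writer. In production the writer would be an S3 put via boto3; it is injected here so the logic stays testable. The payload fields and key layout are hypothetical.

```python
import hashlib
import json
from datetime import datetime, timezone

def ingest_payload(payload, source, writer):
    """Wrap a third-party payload in an audit envelope and persist it.

    `writer(key, body)` is the storage callback (e.g. an S3 put);
    the checksum makes each stored object verifiable and idempotent.
    """
    body = json.dumps(payload, sort_keys=True)
    record = {
        "source": source,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
        "checksum": hashlib.sha256(body.encode()).hexdigest(),
        "payload": payload,
    }
    writer(f"raw/{source}/{record['checksum']}.json", json.dumps(record))
    return record
```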
We'd Like to See

  • 5+ years of professional experience in data engineering, analytics engineering, or data platform engineering.
  • Advanced SQL expertise and strong experience with relational databases, especially Postgres.
  • Strong Python development skills applied to data pipelines, automation, and operational tooling.
  • Strong Git-based development practices (branching, PRs, code review).
  • Demonstrated experience developing and supporting DBT transformations and operational workflows.
  • Hands-on experience building AWS ingestion/ETL workflows using services such as S3, IAM, Glue, Lambda, CloudFormation (or other IaC), and AppFlow.
  • Practical DevOps experience: CI/CD pipelines, Git/GitHub workflows, and containerization fundamentals (Docker).
  • Experience with analytics data modeling and metric definition practices.
  • Experience implementing automated monitoring/alerting and data quality controls for pipelines and critical datasets.
  • Experience operating production data systems (including data quality tests, regression checks, validation frameworks, incident triage, root-cause analysis, runbooks, reliability improvements).
  • Experience working closely with analytics teams and cross-functional stakeholders; familiarity with Jira/Confluence and Agile delivery.
  • Familiarity with data security practices (PII protection, encryption controls, access management).

Benefits

  • Earn a competitive salary and performance-based bonuses. Get healthcare, retirement matching, and equity for US employees.
  • Use the office stipend to get the supplies you need, and combat Zoom fatigue with no-meeting Fridays.
  • Flexible time off. Take the time you need to recharge. Our culture encourages team members to explore and rest to be their best selves.
  • We're remote, not lonely. Join the culture committee, coffee chats, or a variety of other interest groups.
  • Work-from-home stipends.
  • Roles at Worldly may require occasional travel to support business needs, including but not limited to team collaboration, customer engagement, or company events.

Equity Statement

We believe reflecting the diversity of those we strive to serve is essential. True innovation happens when everyone has room at the table, including the tools, resources, and opportunity to excel. We're dedicated to building a culturally and experientially diverse team that leads and works with empathy and respect.

Final compensation figures will be determined based on a wide variety of factors, including experience and location. These factors will be evaluated and considered by Worldly throughout the entirety of this process.

