Data Engineer — Object Computing, Inc.

Object Computing, Inc.

Hybrid - Asia-Pacific · 1mo ago
Remote · APAC · Cloud Computing · Data Analytics · Logistics · Data Engineer · DBA · Applied Scientist · Enterprise Architect · Go · Java · CDK · Python · Scala


Requirements

• ETL data engineering (Databricks, SQL Server, Snowflake, BigQuery, Apache Spark).
• Proficiency in Go, Java, Python, or Scala.
• Proficiency in CI/CD pipelines and Infrastructure as Code (IaC) (Terraform, CDK, Ansible).
• Hands-on experience with event-driven architectures (Kafka, Pulsar).
• Strong knowledge of data warehousing, SQL/NoSQL databases, and cloud platforms.
• Experience with distributed computing, DevOps tools, and data governance.
• Familiarity with Delta Lake, Unity Catalog, Delta Sharing, and DLT.
• Degree in Computer Science, Data Engineering, or a related field, or equivalent experience.
• Experience with AI/ML-driven data solutions and real-time data processing.
• Expertise in building scalable APIs and integrating with modern analytics tools (Power BI, Tableau, QuickSight).
• Cloud certifications (Databricks, AWS, Azure, GCP).
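The ETL work called for above can be sketched in miniature. The snippet below is a hedged, pure-Python illustration of the extract-transform-load pattern (no Spark or Databricks dependency); the record shape and field names are invented for the example:

```python
import io
import json


def extract(raw_lines):
    """Extract: parse newline-delimited JSON records from a source stream."""
    return [json.loads(line) for line in raw_lines if line.strip()]


def transform(records):
    """Transform: keep completed orders and normalize the amount to cents.
    (The 'status'/'amount' fields are hypothetical, for illustration only.)"""
    return [
        {"order_id": r["order_id"], "amount_cents": round(r["amount"] * 100)}
        for r in records
        if r.get("status") == "completed"
    ]


def load(records, sink):
    """Load: write transformed records to the sink, one JSON object per line."""
    for r in records:
        sink.write(json.dumps(r) + "\n")


# Run the pipeline end to end against an in-memory source and sink.
source = [
    '{"order_id": 1, "amount": 9.99, "status": "completed"}',
    '{"order_id": 2, "amount": 5.00, "status": "cancelled"}',
]
sink = io.StringIO()
load(transform(extract(source)), sink)
print(sink.getvalue().strip())  # {"order_id": 1, "amount_cents": 999}
```

A production pipeline would swap the in-memory source and sink for cloud storage or a warehouse table, but the three-stage shape is the same.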

Responsibilities

• Design, build, and optimize large-scale data pipelines (Azure, AWS, GCP).
• Develop data ingestion and storage solutions.
• Implement scalable APIs and ensure system performance.
• Manage big data infrastructure and cloud deployments.
• Collaborate with developers, designers, and data scientists.
• Work in an Agile/DevOps environment.
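The event-driven style the posting asks for (Kafka, Pulsar) can be made concrete with a minimal in-memory publish/subscribe sketch — a toy stand-in for a real broker, with invented topic and event names, shown only to illustrate the pattern:

```python
from collections import defaultdict


class InMemoryBroker:
    """Toy stand-in for a message broker: topics fan events out to subscribers."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a handler to receive every event published on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every handler registered on the topic.
        for handler in self._subscribers[topic]:
            handler(event)


broker = InMemoryBroker()
seen = []
broker.subscribe("orders.created", seen.append)
broker.publish("orders.created", {"order_id": 42})
print(seen)  # [{'order_id': 42}]
```

Real brokers add durability, partitioning, and consumer offsets on top of this fan-out core, but the subscribe/publish contract is the essence of the architecture.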



© 2026 Dominic Morris. All rights reserved.