Data Engineer

Object Computing, Inc.

Hybrid · Asia-Pacific · Posted 3 months ago
Remote · APAC · Cloud Computing · Data Analytics · Logistics · Data Engineer · DBA · Applied Scientist · Enterprise Architect · Go · Java · CDK · Python · Scala

Requirements

• Degree in Computer Science, Data Engineering, or a related field, or equivalent experience.
• Experience with AI/ML-driven data solutions and real-time data processing.
• Expertise in building scalable APIs and integrating with modern analytics tools (Power BI, Tableau, QuickSight); see the sketch after this list.
• Cloud certifications (Databricks, AWS, Azure, GCP).
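The API requirement above is broad; the sketch below shows one minimal shape a data-serving endpoint can take, assuming FastAPI and pydantic as the stack. The framework choice, route, Order model, and in-memory store are illustrative assumptions, not details from the listing.

```python
# Minimal data-serving API sketch, assuming FastAPI; everything here
# (route, model, store) is illustrative, not taken from the job listing.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-api")

class Order(BaseModel):
    order_id: str
    amount: float
    status: str

# Hypothetical in-memory store standing in for a warehouse or Delta table query.
ORDERS = {
    "o-123": Order(order_id="o-123", amount=42.50, status="created"),
}

@app.get("/orders/{order_id}", response_model=Order)
def get_order(order_id: str) -> Order:
    """Return a single order, or 404 if it is unknown."""
    order = ORDERS.get(order_id)
    if order is None:
        raise HTTPException(status_code=404, detail="order not found")
    return order
```

Run with an ASGI server (e.g. `uvicorn app:app`); analytics tools such as Power BI or Tableau can then pull from endpoints like this through their web/API connectors.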

Responsibilities

• Design, build, and optimize large-scale data pipelines (Azure, AWS, GCP).
• Develop data ingestion and storage solutions.
• Implement scalable APIs and ensure system performance.
• Manage big data infrastructure and cloud deployments.
• Collaborate with developers, designers, and data scientists.
• Work in an Agile/DevOps environment.

Core Competencies

• ETL data engineering (Databricks, SQL Server, Snowflake, BigQuery, Apache Spark); see the first sketch after this list.
• Proficiency in Go, Java, Python, or Scala.
• Proficiency in CI/CD pipelines and Infrastructure as Code (IaC) (Terraform, CDK, Ansible).
• Hands-on experience with event-driven architectures (Kafka, Pulsar); see the second sketch after this list.
• Strong knowledge of data warehousing, SQL/NoSQL databases, and cloud platforms.
• Experience with distributed computing, DevOps tools, and data governance.
• Familiarity with Delta Lake, Unity Catalog, Delta Sharing, and DLT.
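As a concrete illustration of the ETL and Delta Lake items above, here is a minimal PySpark batch job that reads raw CSV from object storage, types and cleans it, and writes a partitioned Delta table. The bucket paths, column names, and session config are assumptions made for the sketch, not details from the posting; on Databricks the Delta session config is already in place.

```python
# Minimal PySpark ETL sketch: ingest raw CSV, clean it, write a Delta table.
# Paths, column names, and table layout are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("orders-etl")
    # Delta Lake support assumes the delta-spark package is on the classpath.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Extract: read raw events from cloud storage (hypothetical path).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Transform: type the columns, drop malformed rows, derive a partition key.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .dropna(subset=["order_id", "amount", "order_ts"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write a partitioned Delta table for downstream analytics.
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("s3://example-bucket/curated/orders/"))
```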
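The event-driven item usually comes down to producing and consuming keyed events. Below is a minimal producer sketch assuming the confluent-kafka Python client; the broker address, topic name, and payload shape are assumptions made for the example.

```python
# Minimal event-driven sketch: publish order events to Kafka.
# Broker address, topic, and payload shape are illustrative assumptions.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

def delivery_report(err, msg):
    # Called once per message to confirm delivery or surface the error.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()} [partition {msg.partition()}]")

event = {"order_id": "o-123", "amount": 42.50, "status": "created"}

# Key by order_id so all events for one order land on the same partition,
# preserving per-order ordering for downstream consumers.
producer.produce(
    "orders",
    key=event["order_id"],
    value=json.dumps(event).encode("utf-8"),
    callback=delivery_report,
)
producer.flush()  # block until outstanding messages are delivered
```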
