JioStar - Senior Software Development Engineer
Requirements
• Data Processing: Hands-on experience with data processing frameworks such as Apache Spark, Apache Flink, or similar for batch and streaming workloads
• AI/ML Pipeline Experience: Some exposure to building data pipelines for machine learning workflows, feature engineering, or model training (nice to have)
• Messaging Systems: Experience with message queues and stream processing platforms such as Apache Kafka, Kinesis, or equivalent
• Cloud Platforms: Working knowledge of at least one major cloud platform (AWS, GCP, or Azure) and its data services, with a preference for AWS (S3, EMR, Kinesis, etc.)
• Data Storage: Understanding of modern data warehousing and data lake concepts (Snowflake, BigQuery, Redshift, Delta Lake)
• ETL/Pipeline Tools: Experience building ETL/ELT pipelines with orchestration tools such as Apache Airflow, Prefect, or similar
• Databases: Solid SQL skills and experience with both relational (PostgreSQL, MySQL) and NoSQL (Cassandra, DynamoDB) databases
• Monitoring: Experience with monitoring and observability tools (Datadog, Grafana, Prometheus, etc.)
• IaC: Familiarity with Infrastructure as Code (Terraform, CloudFormation)
• Professional Experience: 4-6 years of software engineering experience, with at least 4 years focused on data engineering or related fields
• Scale Exposure: Experience working with large datasets and high-throughput systems (millions of events, terabytes of data)
• Programming Skills: Proficiency in at least one major programming language such as Python, Java, Scala, or Go
• System Design Understanding: Basic understanding of distributed systems concepts and data processing patterns
Responsibilities
• Feature Implementation: Develop and implement features for data ingestion, processing, governance, observability, and storage systems based on technical specifications and guidance from senior engineers
• Real-time Data Pipeline Contribution: Build components for highly scalable, accurate, real-time event collection systems under the guidance of senior team members
• Data Pipeline Development: Implement data pipelines to support AI model training and autonomous AI agents, ensuring data quality and reliability
• Code Quality: Write clean, maintainable, and well-tested code following established engineering standards and best practices
• System Maintenance: Contribute to improving existing systems and services using modern tooling and engineering practices
• Performance Optimization: Help identify and implement optimizations for ingestion, storage, and processing components