
bee-talents - Software Engineer (Python, Kubernetes, AI/ML) | Gcore | Remote

Remote (within Poland only) · 2mo ago

Remote · EMEA · Artificial Intelligence · Developer Tools · Software Engineer · AI Engineer · Python · Kubernetes · Docker · Helm · Go

Requirements

• Proficiency with Python, especially in the context of ML tooling or backend development.
• Experience with AI/ML pipelines or integrating machine learning frameworks like TensorFlow or PyTorch into production environments.
• Hands-on experience with vLLM and SGLang.
• Familiarity with cloud-native tooling such as Docker, Helm, and related CNCF technologies.
• A problem-solving mindset and genuine interest in working on distributed systems and platform-level challenges.
• Clear communication skills and a collaborative attitude - you enjoy working closely with others to build great solutions.
• Solid experience with Go programming, particularly in the context of Kubernetes - including building controllers, operators, and working with custom resources (CRDs).
• Strong understanding of Kubernetes architecture, container orchestration, and resource management at scale.
• Understanding of GPU scheduling and performance optimization in Kubernetes.
• Awareness of Kubernetes security practices, including RBAC and container hardening.
• Contributions to open-source projects or involvement in cloud-native or MLOps communities.
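The GPU scheduling item above centers on Kubernetes' extended-resource mechanism: a pod requests whole GPUs under `resources.limits` using the `nvidia.com/gpu` key exposed by the NVIDIA device plugin. A minimal illustrative sketch (pod name, container name, and image are hypothetical placeholders):

```python
import json

# Minimal pod manifest requesting one NVIDIA GPU via Kubernetes'
# extended-resource key. All names below are illustrative only.
pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "inference-worker"},
    "spec": {
        "containers": [
            {
                "name": "model-server",
                "image": "example.com/model-server:latest",
                "resources": {
                    # GPUs are requested in whole units; for extended
                    # resources, requests and limits must be equal, so
                    # specifying only limits is the common pattern.
                    "limits": {"nvidia.com/gpu": 1},
                },
            }
        ]
    },
}

manifest = json.dumps(pod, indent=2)
print(manifest)
```

The scheduler only places such a pod on a node advertising at least one free `nvidia.com/gpu`; fractional GPU requests are not allowed through this mechanism.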

Responsibilities

• Contribute to the development of the Everywhere Inference platform - a Kubernetes-based solution enabling scalable and portable AI inference across a wide range of environments.
• Design and implement APIs and developer tools to simplify deployment, management, and monitoring of AI applications.
• Focus on packaging and integrating new ML models into the platform, using Python and common ML frameworks.
• Optimize serverless container workflows for AI workloads, ensuring performance, scalability, and seamless autoscaling.
• Collaborate with customers to fine-tune ML model performance and support their unique use cases.
• Work with cross-functional teams to improve the AI applications marketplace and ensure smooth model onboarding and lifecycle management.
• Stay current with trends in Kubernetes, machine learning, and MLOps, and help drive innovation within the platform.
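The autoscaling work mentioned above is typically grounded in the Horizontal Pod Autoscaler's scaling rule, which Kubernetes documents as desiredReplicas = ceil(currentReplicas × currentMetricValue / desiredMetricValue). A minimal sketch of that formula (the function name and example numbers are my own, not from the posting):

```python
import math

def desired_replicas(current_replicas: int,
                     current_metric: float,
                     target_metric: float) -> int:
    """Kubernetes HPA scaling rule:

    desiredReplicas = ceil(currentReplicas * currentMetric / targetMetric)
    """
    return math.ceil(current_replicas * (current_metric / target_metric))

# e.g. 4 replicas averaging 120% of a 60% CPU target -> scale out to 8
print(desired_replicas(4, 120, 60))
```

In practice the HPA also applies a tolerance band and stabilization windows around this formula so that small metric fluctuations do not cause replica churn.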

Benefits

At Gcore, we want you to do your best work and enjoy the journey. Our benefits are designed to support your growth, well-being, and life beyond work:

• Flexible working hours and hybrid or remote options, depending on your role
• Work from anywhere in the world for up to 45 days per year
• Private medical insurance for you and your family*
• Extra paid vacation and sick leave days*
• Support for life's important moments and celebrations
• Language courses to help you connect and grow
• Modern, welcoming offices with snacks, drinks, and entertainment*
• Team sports and social activities*

*Benefits may vary depending on your location.
