Black Duck Software, Inc. - Data Scientist/Engineer
Requirements
• 5+ years of experience in Data Science, AI/Data Engineering, Data Operations, DevOps, Business Analytics, or a related field
• BSc or MSc in Computer Science, Data Science, Artificial Intelligence, Math, Physics, Engineering, or a related field
• Experience in a relevant analytical programming language, to the point where you can build and deliver a project or module from scratch that others can use (Python is our main daily driver; expert-level experience in Julia or R may also be accepted)
• Experience with Jupyter Notebook or equivalents
• Experience with Airflow, DBT, Databricks, or equivalent stacks
• Experience with the PyData, Spark, or equivalent analytical stacks
• Familiarity with Cybersecurity Governance, Application Security Testing, Quality Assurance, or similar
• Experience in data modelling and working with RDBMS (PostgreSQL, Oracle, or MySQL), plus knowledge of NoSQL databases (e.g., MongoDB)
• Experience with Machine Learning and AI systems
• Hands-on experience with AI-assisted development tools (e.g., GitHub Copilot, Claude Code, Cursor, or similar)
• Ability to operate projects independently and collaborate cross-functionally
• Strong or developing communication skills (in-person and remote)
• Familiarity with Data Mesh / Data Product concepts
• Experience operating in Linux command-line environments
• Experience with LangChain or an equivalent agentic development stack
• Experience training custom Machine Learning models, including familiarity with evaluation criteria and metric design
• Experience integrating AI capabilities into software systems, including prompt engineering, API integration, and leveraging LLM-based services for automation and productivity
• Experience with enterprise data visualisation tools such as Power BI, Tableau, Grafana, Databricks, Snowflake, etc.
• Experience deploying ML/AI models in production environments and workloads
• Experience developing and working within large enterprise applications using microservices architecture and container orchestration technologies, running on Kubernetes and/or cloud platforms (AWS, Azure, or GCP)
• Experience in software architecture, systems design, and interaction design (to the point where you can have constructive conversations with security and architecture leaders)
Responsibilities
• Developing and maintaining analytical data pipelines from a range of internal and external sources
• Participating in system design discussions and contributing to architectural decisions
• Evaluating new analytical and technological opportunities for leveraging data for security and business impact
• Leading projects from research through production deployment and operational handover to the appropriate teams
• Partnering with R&D and Engineering teams to develop and share best practices for data tooling, from pipelines and dashboards to ML and LLM integration