
Welo Global - AI Data Quality Control Coordinator

Remote - Spain / France / Romania / Greece / Italy / United Kingdom / Ireland · 6d ago
Remote · Mid · EMEA · Artificial Intelligence · Data Analytics · Quality Control Inspector · Lab Director · Excel · Google Sheets · Data Analysis · Quality Control · Quality Assurance


Requirements

• Bachelor's degree in any discipline (Data Science, Computer Science, Linguistics, or related fields preferred).
• 2–4 years of experience in Quality Control/Quality Assurance within AI data annotation, data labeling, or content moderation.
• Strong understanding of annotation workflows (bounding boxes, segmentation, classification, transcription, etc.).
• Familiarity with QA metrics such as accuracy, F1 score, precision/recall, and inter-annotator agreement.
• Proficiency in MS Excel/Google Sheets (pivot tables, dashboards, data analysis).
• Strong communication and coordination skills across cross-functional teams.
• Experience with annotation tools (e.g., Labelbox, CVAT, Scale AI, or similar platforms).
• Exposure to NLP, Computer Vision, or Speech datasets.
• Basic knowledge of machine learning workflows and the data lifecycle.
• Experience working with global clients and remote annotation teams.
• Location: Remote (Europe/America based).

Primary Responsibility

• Performing sampling and guideline-adherence checks; logging defects with categorization and severity.
• Tracking corrective actions to closure; verifying re-tests and documenting outcomes.
• Coordinating training sessions and refreshers; managing attendance and quick assessments.
• Keeping guidelines, SOPs, and rubrics current and version-controlled; managing release notes.
• Preparing client-ready exports (tables, charts) with consistent formatting and footnotes.
• Liaising with PMs and Ops to align timelines and inputs for reporting and audits.
• Managing access requests and permissions for QA tools and folders.
• Supporting vendor coordination (checklists, SLAs, documentation requests).
• Identifying minor process gaps and suggesting simple fixes (templates, checklists).

Ideal Profile

• 2–4 years of experience in Quality Control/Quality Assurance within AI data annotation, data labeling, or content moderation.
• Excellent attention to detail and ability to identify subtle quality issues in datasets.

This role offers an exciting opportunity to contribute to cutting-edge AI projects while building expertise in data quality and annotation processes. Join us to play a key role in shaping high-quality datasets that power intelligent systems.
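For candidates unfamiliar with the QA metrics named above, here is a minimal, hypothetical sketch of how they are typically computed, with a gold-standard label list standing in for one annotator and a second list for the annotator under review (the function names and data are illustrative, not part of any Welo Global tooling):

```python
from collections import Counter

def precision_recall_f1(ref, hyp, positive):
    # Count true positives, false positives, and false negatives
    # for a single positive class.
    tp = sum(1 for r, h in zip(ref, hyp) if r == positive and h == positive)
    fp = sum(1 for r, h in zip(ref, hyp) if r != positive and h == positive)
    fn = sum(1 for r, h in zip(ref, hyp) if r == positive and h != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

def cohens_kappa(a, b):
    # One common inter-annotator agreement measure: observed agreement
    # corrected for the agreement expected by chance.
    n = len(a)
    p_o = sum(1 for x, y in zip(a, b) if x == y) / n
    ca, cb = Counter(a), Counter(b)
    p_e = sum((ca[label] / n) * (cb[label] / n) for label in set(a) | set(b))
    return (p_o - p_e) / (1 - p_e)
```

For example, `precision_recall_f1(gold, pred, "cat")` returns per-class precision, recall, and F1, while `cohens_kappa(annotator_a, annotator_b)` returns a chance-corrected agreement score between -1 and 1.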

Responsibilities

• Perform sampling and quality checks on annotated datasets (text, image, audio, or video) to ensure adherence to annotation guidelines.
• Identify, log, and categorize annotation defects (e.g., labeling errors, boundary issues, misclassification) with severity levels.
• Track corrective actions and rework tasks to closure; validate re-tests and document outcomes.
• Coordinate onboarding training, calibration sessions, and periodic refreshers for annotators and reviewers.
• Ensure annotation guidelines, SOPs, and rubrics are updated, version-controlled, and clearly communicated to stakeholders.
• Identify process gaps and recommend practical improvements (annotation templates, QA checklists, sampling strategies).
• Manage access permissions for annotation tools, QA platforms, and shared repositories.
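The sampling and defect-logging responsibilities above can be pictured with a small, hypothetical sketch (the severity scale, sampling rate, and field names are assumptions for illustration, not a description of Welo Global's actual workflow):

```python
import random

# Hypothetical severity scale; real projects define their own rubric.
SEVERITIES = ("minor", "major", "critical")

def sample_for_qa(task_ids, rate=0.1, seed=None):
    # Draw a reproducible random sample of annotation tasks for review.
    rng = random.Random(seed)
    k = max(1, round(len(task_ids) * rate))
    return rng.sample(task_ids, k)

def log_defect(defect_log, task_id, category, severity, note=""):
    # Record one categorized defect with a severity level.
    if severity not in SEVERITIES:
        raise ValueError(f"unknown severity: {severity}")
    defect_log.append({"task": task_id, "category": category,
                       "severity": severity, "note": note})
    return defect_log
```

Seeding the sampler keeps an audit reproducible, and validating severity at logging time keeps the defect log consistent enough to pivot on in Excel or Google Sheets.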
