Software Engineers
Job description
- Design and implement large-scale data processing pipelines using SQL and Python
- Build systems that support data collection, transformation, and monitoring workflows
- Develop scalable data architectures that support high-volume data processing
Platform Development & Monitoring
- Build dashboards and visualization tools that provide insights into data quality and system performance
- Implement monitoring systems and automated alerts to detect issues in data pipelines
- Create end-to-end testing frameworks to ensure reliability and data integrity
Data Quality & Debugging
- Debug complex data flow issues across distributed data systems
- Identify root causes of data inconsistencies and implement long-term solutions
- Maintain platform reliability and ensure consistent data quality
Cross-Functional Collaboration
- Work closely with engineering teams, product managers, and data specialists
- Coordinate with operational partners involved in data collection workflows
- Support targeted data collection strategies and performance optimization initiatives
Requirements
We are sharing a specialised part-time consulting opportunity for experienced Software Engineers with strong backgrounds in data pipelines, monitoring systems, data quality tooling, and large-scale platform development.
- Professional experience working with SQL and large production datasets
- Proficiency in Python for data processing, automation, and pipeline development
- Experience building data engineering pipelines including ETL workflows and data modeling
- Experience creating dashboards or data visualization tools such as Tableau or similar platforms
- Strong analytical and problem-solving skills
- Degree in Computer Science or a related technical field
- Experience working with machine learning datasets and model training pipelines
- Experience building scalable data platforms or distributed systems
- Experience working on privacy-sensitive or advertising-related systems
- Experience working with human labeling or data annotation workflows
Benefits & conditions
Why This Opportunity
- Apply software engineering expertise to high-impact data platform work
- Contribute to data pipeline development, monitoring systems, and data quality workflows
- Collaborate with engineers, data specialists, and product teams on practical infrastructure challenges
- Flexible remote work with competitive hourly compensation
Contract Details
- Independent contractor role
- Fully remote with flexible scheduling
- Competitive hourly rates between $70 and $90 depending on expertise
- Expected commitment may vary based on project scope and business needs
- Weekly payments via Stripe or Wise
- Projects may be extended, shortened, or concluded early depending on business needs and performance
- Work will not involve access to confidential or proprietary information from any employer, client, or institution
About the Platform
This opportunity is available through 24-MAG LLC. We connect experienced professionals with remote consulting opportunities across technical, evaluation, and project-based workstreams.