Staff Data Engineer - Ads Reporting
Role details
Job description
Unity is building a robust, near real-time reporting platform that powers critical analytics and decision-making across our ecosystem. We are looking for a Data Engineer to help architect and implement the distributed data systems that drive this platform at scale.
In this role, you'll design and build high-throughput, low-latency data processing pipelines that power reporting used by internal teams and external customers. You'll operate at the intersection of distributed systems, stream processing, and cloud-native infrastructure, ensuring correctness, reliability, and scalability in a high-volume production environment.
This is a high-impact role where engineering rigor, architectural clarity, and production ownership matter.
What you'll be doing
- Design and implement near real-time data pipelines and reporting infrastructure.
- Architect distributed stream and batch processing systems using technologies such as Apache Flink, Spark, and Airflow.
- Build and maintain data processing frameworks that handle large-scale event ingestion and transformation with strong correctness guarantees.
- Ensure production-grade reliability, observability, and operability across distributed systems.
- Define and enforce data processing semantics, including:
  - Exactly-once processing
  - Event-time vs. processing-time handling
  - Stateful stream management
  - Backpressure and fault-tolerance strategies
- Collaborate cross-functionally with data consumers, product, and infrastructure teams to define scalable reporting solutions.
- Contribute to long-term platform architecture, setting engineering standards for performance, resilience, and maintainability.
Requirements
- Strong foundation in distributed systems and systems design.
- Hands-on experience building and operating large-scale data processing systems.
- Deep understanding of streaming concepts:
  - Exactly-once semantics
  - Watermarking and event-time processing
  - Stateful stream processing
  - Checkpointing and recovery
  - Backpressure handling
- Production experience with frameworks such as Apache Flink, Spark, Kafka, or similar technologies.
- Proficiency in Python, Java, or Scala.
- Experience with workflow orchestration tools (e.g., Airflow) for stream and batch coordination.
- Strong understanding of cloud-native architectures and distributed infrastructure (Kubernetes, containerization, cloud platforms).

This position requires sufficient knowledge of English for professional verbal and written exchanges, as the duties involve frequent and regular communication with colleagues and partners located worldwide whose common language is English.
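For candidates less familiar with the streaming semantics listed above, here is a minimal, framework-free Python sketch of event-time windowing with watermarks and late-data handling. This is illustrative only: real stream processors such as Apache Flink implement these semantics natively, and all names and parameters below are hypothetical.

```python
# Illustrative sketch: event-time tumbling windows with a bounded-out-of-orderness
# watermark. Not production code; frameworks like Apache Flink provide this natively.
from collections import defaultdict

WINDOW_SIZE = 10          # event-time window length (seconds); hypothetical value
MAX_OUT_OF_ORDERNESS = 5  # allowed event-time lateness (seconds); hypothetical value

def process(events):
    """events: iterable of (event_time, key, value) tuples, possibly out of order."""
    windows = defaultdict(int)   # (window_start, key) -> running sum (operator state)
    watermark = float("-inf")
    results = {}
    for event_time, key, value in events:
        # Advance the watermark: assume no event arrives more than
        # MAX_OUT_OF_ORDERNESS behind the maximum event time seen so far.
        watermark = max(watermark, event_time - MAX_OUT_OF_ORDERNESS)
        window_start = (event_time // WINDOW_SIZE) * WINDOW_SIZE
        if window_start + WINDOW_SIZE <= watermark:
            continue  # window already closed: drop (or route to a late-data sink)
        windows[(window_start, key)] += value
        # Close (emit) any windows whose end has passed the watermark.
        for w in [w for w in windows if w[0] + WINDOW_SIZE <= watermark]:
            results[w] = windows.pop(w)
    results.update(windows)  # flush remaining windows at end of (finite) input
    return results

# An out-of-order event (time 2, arriving after time 12) still lands in its
# correct event-time window because the watermark has not yet closed it:
print(process([(1, "a", 1), (3, "a", 2), (12, "a", 5), (2, "a", 10)]))
```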
Benefits & conditions
At Unity, we want our team members to thrive. We offer a wide range of benefits designed to support well-being and work-life balance.
Please note: Benefits eligibility, specific offerings, and coverage vary based on the country and employment status.
While specific benefits vary, here are some of the ways we strive to take care of our eligible team members globally:
- Comprehensive health, life, and disability insurance
- Commute subsidy
- Employee stock ownership
- Competitive retirement/pension plans
- Generous vacation and personal days
- Support for new parents through leave and family-care programs
- Office snacks
- Mental health and wellbeing programs and support
- Employee Resource Groups
- Global Employee Assistance Program
- Training and development programs
- Volunteering and donation matching program