Data Engineer
Job description
Are you ready to trade your job for a journey? Become a FlyMate! Passion, excitement, and global collaboration are all core to what it means to be a FlyMate. At Flywire, we're on a mission to deliver the world's most important and complex payments. We use our Flywire Advantage - the combination of our next-gen payments platform, proprietary payment network, and vertical-specific software - to help our clients get paid, and help their customers pay with ease, no matter where they are in the world. What more do we need to truly be unstoppable? Perhaps that is you!

Flywire is seeking a motivated and skilled Data Engineer to contribute to the development and optimisation of our data platforms and pipelines. In this role, you will work closely with the data engineering, analytics engineering, and business intelligence teams to build, maintain, and enhance the infrastructure that supports Flywire's data needs. You will have the opportunity to work with modern data technologies in a cloud-based environment while contributing to projects that have a direct impact on the business.

Responsibilities
- Assist in the design, development, and maintenance of scalable and efficient data pipelines and ETL/ELT processes.
- Contribute to the optimisation of existing data workflows and applications for performance and resource efficiency.
- Write, test, and deploy data transformation jobs using tools like dbt.
- Work with streaming data frameworks and infrastructure to process real-time and near-real-time data (see the streaming sketch after this list).
- Support the development and maintenance of data models within our cloud data warehouse (BigQuery).
- Collaborate with data scientists, BI developers, analytics engineers, and other data engineers to understand data requirements and deliver reliable data solutions.
- Implement data quality checks and monitoring to ensure data accuracy and reliability.
- Participate in code reviews and contribute to the team's engineering standards and best practices.
- Troubleshoot and resolve issues related to data pipelines and data infrastructure.
- Learn and apply new technologies and techniques in the data engineering space.
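To give a concrete flavour of the streaming work described above, here is a minimal sketch of an Apache Beam (Python SDK) pipeline that reads events from a Pub/Sub subscription and appends rows to a BigQuery table. The project, subscription, table names, and event fields are illustrative placeholders, not Flywire's actual setup.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON event into a BigQuery-compatible row (hypothetical schema)."""
    event = json.loads(message.decode("utf-8"))
    return {"payment_id": event["payment_id"], "amount": event["amount"]}


def run():
    options = PipelineOptions()
    # Pub/Sub is an unbounded source, so the pipeline must run in streaming mode.
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/<project>/subscriptions/<subscription>")
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "<project>:<dataset>.payment_events",
                # Assumes the target table already exists; append new rows to it.
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```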
Requirements
- Proficiency in Python and Java; experience with Ruby preferred.
- Experience with GCP and AWS.
- Understanding of data warehousing concepts and experience working with a cloud data warehouse (e.g., BigQuery, Snowflake).
- Experience with SQL and working with relational databases.
- Experience with data streaming frameworks (e.g., Apache Beam, Flink).
- Experience with streaming infrastructure (e.g., Kinesis, Pub/Sub).
- Experience with the Advanced Message Queuing Protocol (AMQP) and related software (e.g., RabbitMQ).
- Experience with Infrastructure as Code (IaC), specifically Terraform.
- Familiarity with workflow orchestration tools like Airflow (GCP Composer); see the DAG sketch after this list.
- Experience with data transformation tools like dbt.
- Exposure to containerisation technologies like Docker.
- Understanding of data governance principles.
- Ability to work effectively in a team environment and collaborate with technical and non-technical stakeholders.
- Good communication skills and attention to detail.
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related technical field.
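As an illustration of the Airflow familiarity mentioned above, here is a minimal sketch of a DAG, as it might run on GCP Composer, that executes a daily dbt run followed by dbt tests. The dag_id, project path, and schedule are assumptions for the example only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_run",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # run once per day
    catchup=False,               # do not backfill missed past runs
) as dag:
    # Build the dbt models; /opt/dbt_project is a hypothetical project path.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",
    )
    # Run dbt's data quality tests against the freshly built models.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test",
    )
    # Enforce ordering: tests run only after the models build successfully.
    dbt_run >> dbt_test
```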