Senior Software Engineer - Data & APIs
Job description
In this role, as a Senior Software Engineer - Data & APIs, you will design, develop, and maintain high-performance analytical applications and services that power large-scale data pipelines and analytics platforms. Your main focus will be on building Python-based solutions for data ingestion, transformation, and modeling, ensuring efficiency, scalability, and reliability across distributed systems. You will work closely with Data Engineers, Analysts, and cross-functional teams to optimize data workflows, improve ETL/ELT processes, and deliver high-quality datasets that enable advanced analytics and business insights within a cloud-based environment.
What to Expect in This Role (Responsibilities)
- Contribute to all phases of the analytical application development life cycle, from design to deployment.
- Design, develop, and deliver high-volume data analytics applications with a focus on performance and scalability.
- Write well-structured, testable, and efficient code that aligns with technical and business requirements.
- Ensure all solutions comply with design specifications and best engineering practices.
- Support continuous improvement by researching emerging technologies, evaluating alternatives, and presenting recommendations for architectural review.
Requirements
- 5+ years of proven experience developing analytical or data-driven applications.
- 5+ years of strong proficiency in Python and SQL, with hands-on experience in DBMS and Apache Iceberg.
- 5+ years of expertise working with distributed data processing frameworks such as Apache Spark, Hadoop, Hive, or Presto.
- Deep understanding of data modeling techniques (star schema, snowflake schema) and data cleansing/manipulation processes.
- Good knowledge of NoSQL databases and handling semi-structured/unstructured data.
- Experience working within the AWS ecosystem (S3, Redshift, RDS, SQS, Athena, Glue, CloudWatch, EMR, Lambda, or similar).
- Experience with ETL/ELT batch processing workflows.
- Proficiency with version control tools (GitHub or similar).
- Advanced or fluent level of English (written and spoken).
Nice to Have
- Experience in data architecture or pipeline optimization.
- Familiarity with containerization (Docker, Kubernetes).
- Exposure to Agile/Scrum methodologies.
- Knowledge of data orchestration tools (Airflow, Prefect).
- Understanding of data lakehouse architectures.
Benefits & conditions
- 100% remote
- Long-term commitment, with autonomy and impact
- Strategic and high-visibility role in a modern engineering culture
- Collaborative international team and strong technical leadership
- Clear path to growth and leadership within Coderio