Junior Data Engineer
Role details
Job location
Tech stack
Job description
our state-of-the-art data platform, keeping it robust and efficient, leveraging best-in-industry tools & solutions to improve the platform's maintainability and performance. In this 'core engineering' element of the role, you will:
- Partake in the entire SDLC (planning, execution, and maintaining new components).
- Leverage the latest cloud-native, DevOps, and Infrastructure-as-Code practices, CI/CD, and GitOps paradigms.
- Work with the engineering team to develop our data pipeline, collecting analytics events from our microservices and loading them into our reporting suite.
- Work with tech and product teams to ensure new features will enable new meaningful analyses.
- Work with cutting-edge tech such as DBT, Starburst, Snowflake, AWS, Looker, Kafka, etc. (https://www.getdbt.com/casestudies/landbay/).
- Connect data from external sources (e.g. Google Analytics, HubSpot, third-party data suppliers) and integrate with our production sources.
- Integrate testing into our pipeline to ensure integrity and reliability.
- Create and deliver awesome techniques to ensure and monitor the quality and freshness of data.

Data analysis: This role also involves data analysis, helping to foster a data-driven culture and supporting the wider Product & Engineering teams in roadmap prioritisation, feature development, and live troubleshooting. In this element of the role, you will:
- Become the subject matter expert for all data questions and understand our business inside out.
- Understand how to interpret the data, with all its nuances.
- Help to identify and analyse trends - your voice and insights will help shape the roadmap across the business.
- Support our business intelligence activities by working with the Looker platform: maintaining configurations, LookML code and repo management, as well as creating reports to support the business.

Requirements
Desired Skills & Experience
Experience:
- 1-3 years' experience in a data engineering role working on data pipeline and ETL components, or the ability to demonstrate the appropriate skills.
- Exposure to modern, agile software development environments and product-led teams, with collaboration across Product, Engineering, and Data.
Skills:
- Some experience working with a data platform in a commercial or production environment.
Desired Skills:
- A foundational understanding of software engineering principles, including modular code, version control, and basic DevOps concepts.
- A good understanding of internet technologies and terminology.
- Good SQL skills, including joins, aggregations, and basic analytical queries (window functions a plus).
- Working knowledge of Python for data processing, scripting, or automation; some exposure to at least one other programming language (e.g. Java, C#, JavaScript) is a plus.
- Some exposure to cloud platforms (AWS, etc.).
- Exposure to DBT or similar data transformation tools is desirable but not required.
- Experience working with relational databases; familiarity with analytical data warehouses (e.g. Snowflake) is a bonus.
- Exposure to Business Intelligence tools and writing queries to support reporting or analysis; Looker experience is a bonus.
- Excellent attention to detail.
- Proactive self-starter who thrives in a high-trust environment and can take responsibility for owned projects.
- Great communication skills, with the ability to explain complex ideas simply.
- Able to foster good relationships across the business.

Location & Logistics
Hybrid working, with current office working days of Tuesday and Wednesday. The office is based in Victoria, central London.

The Application Process
If you are successful and shortlisted for the role, there will be an HR screening followed by a two-stage interview process with the team.

Company benefits
- Generous holiday entitlement
- Hybrid working policy
- Generous pension scheme
- Health insurance
- Enhanced maternity/paternity leave
- Life insurance
- Cycle-to-work scheme
- Regular social events
- EV scheme