Software Developer SC

Postaladdress Uk
Newcastle upon Tyne, United Kingdom

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£46K

Job location

Newcastle upon Tyne, United Kingdom

Tech stack

Artificial Intelligence
Airflow
Amazon Web Services (AWS)
Data analysis
Cloud Computing
Continuous Integration
Data Warehousing
Database Applications
Database Queries
DevOps
Identity and Access Management
Python
Standard SQL
Jupyter Notebook
Data Processing
Spark
Jupyter
Gitlab
SC Clearance
Containerization
PySpark
Deployment Automation
Cloudwatch
Terraform
Software Version Control
Docker

Job description

  • Design, build, and operate data ingest and publishing pipelines
  • Implement workflow orchestration and task scheduling using managed services
  • Collaborate with Product Owners, Business Analysts, and users to shape technical solutions
  • Provide production support and monitoring, and enhance system resilience, stability, and performance
  • Conduct data analysis to identify root causes of defects and operational issues
  • Work closely with DevOps to support automated deployments and infrastructure management
  • Coach and mentor junior engineers and promote engineering best practices
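The orchestration responsibility above boils down to running tasks in dependency order. A stdlib-only sketch of that idea is below; in this role a managed service such as Airflow would do the scheduling, and the task names and dependency graph here are illustrative placeholders, not part of the actual pipeline.

```python
# Stdlib-only sketch of workflow orchestration: run pipeline tasks in
# dependency order, as a scheduler such as Airflow would.
from graphlib import TopologicalSorter

# Each key runs only after the tasks it depends on (illustrative names).
pipeline = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "publish": {"transform"},
}

def run(name):
    # Placeholder for real task logic (ingest, transform, publish, ...).
    print(f"running {name}")
    return name

# static_order() yields a valid execution order for the dependency graph.
order = list(TopologicalSorter(pipeline).static_order())
results = [run(task) for task in order]
```

An orchestrator like Airflow adds scheduling, retries, and monitoring on top of exactly this dependency-ordering core.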

Technologies:

  • Airflow
  • AWS
  • CI/CD
  • CloudWatch
  • DevOps
  • Docker
  • EC2
  • GitLab
  • IAM
  • Support
  • Jupyter
  • Python
  • PySpark
  • SQL
  • Spark
  • Terraform
  • Cloud
  • AI
  • Security

More:

We are a forward-thinking company offering a hybrid work arrangement with flexible locations, including London, Leeds, Newcastle, and more. Our team focuses on innovative development of data-driven applications within a collaborative DevOps environment. We value continuous improvement, technical leadership, and the mentoring of junior colleagues. Join us to deliver robust technical solutions and contribute to the success of our projects.

Requirements

  • Strong experience in Python and data processing with Apache Spark
  • Knowledge of SQL and familiarity with PySpark
  • Experience using Apache Airflow for task orchestration
  • Understanding of EMR and ability to review output logs
  • Proficiency in using Jupyter notebooks and/or Amazon Athena for data querying
  • Skills in data analysis to identify root causes of issues
  • Understanding of dimensional data models and historic data capture
  • Familiarity with AWS console and services (CloudWatch, IAM, S3, Glue, EC2, etc.)
  • Knowledge of Docker and containerization of solutions
  • Experience with Infrastructure as Code (IaC) using Terraform
  • Understanding of both server-side and client-side encryption
  • Code management experience using GitLab for CI/CD
  • Active BPSS or SC clearance, or eligibility for clearance
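One requirement above, "dimensional data models and historic data capture", is commonly met with a type-2 slowly changing dimension: updates close the old row and insert a new version rather than overwriting. A minimal stdlib sketch follows; the table, column names, and sample data are illustrative, not taken from this role's actual schema.

```python
# Type-2 slowly changing dimension sketch: keep history by closing the
# current row and inserting a new version on each change.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,       -- NULL while the row is current
        is_current  INTEGER
    )
""")

def upsert(customer_id, city, as_of):
    # Close out the current version, then insert the new one.
    conn.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (as_of, customer_id),
    )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, city, as_of),
    )

upsert(1, "Leeds", "2024-01-01")
upsert(1, "Newcastle", "2024-06-01")  # customer moves; history is kept

rows = conn.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id = 1 "
    "ORDER BY valid_from"
).fetchall()
```

Both versions of the row survive, and a query on `is_current` (or a date range over `valid_from`/`valid_to`) recovers the state at any point in time.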

Apply for this position