Data Analyst Course Programme

ITonlinelearning
Charing Cross, United Kingdom
2 days ago

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Junior
Compensation
£ 65K

Job location

Charing Cross, United Kingdom

Tech stack

Artificial Intelligence
Airflow
Amazon Web Services (AWS)
Big Data
Google BigQuery
Cloud Computing
Cloud Storage
Databases
Data as a Service
Information Engineering
ETL
Data Manipulation Languages
Data Transformation
Data Mining
Data Warehousing
Distributed Systems
MapReduce
Python
Machine Learning
NumPy
Performance Tuning
Azure
SQL Stored Procedures
SQL Databases
Technical Data Management Systems
Data Processing
Scripting (Bash/Python/Go/Ruby)
Snowflake
Spark
Git
Pandas
Data Lake
PySpark
Information Technology
Star Schema
Data Management
Software Version Control
Data Pipelines

Job description

  • Data Pipeline Development: Assist in the design, construction, and maintenance of robust ETL/ELT pipelines to integrate data from various sources into our data warehouse or data lake.
  • Data Transformation with Python: Write, optimize, and maintain production-grade Python scripts to clean, transform, aggregate, and process large volumes of data.
  • Database Interaction (SQL): Develop complex, high-performance SQL queries (DDL/DML) for data extraction, manipulation, and validation within relational and data warehousing environments.
  • Quality Assurance: Implement data quality checks and monitoring across pipelines, identifying discrepancies and ensuring the accuracy and reliability of data.
  • Collaboration: Work closely with Data Scientists, Data Analysts, and other Engineers to understand data requirements and translate business needs into technical data solutions.
  • Tooling & Automation: Utilize version control tools like Git and contribute to the automation of data workflows and recurring processes.
  • Documentation: Create and maintain technical documentation for data mappings, processes, and pipelines.
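As a rough illustration of the pipeline work described above, a minimal extract-transform-load step in Python might look like the following (the table, field names, and sample records are hypothetical, and a real pipeline would read from an actual source system):

```python
import sqlite3

def extract(rows):
    # Extract: in practice this would read from a source system (CSV, API, database)
    return list(rows)

def transform(rows):
    # Transform: clean and normalise records, dropping rows with a missing amount
    cleaned = []
    for row in rows:
        if row.get("amount") in (None, ""):
            continue
        cleaned.append({"customer": row["customer"].strip().title(),
                        "amount": float(row["amount"])})
    return cleaned

def load(rows, conn):
    # Load: write the cleaned records into a warehouse table
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:customer, :amount)", rows)
    conn.commit()

source = [{"customer": " alice ", "amount": "10.5"},
          {"customer": "bob", "amount": ""}]
conn = sqlite3.connect(":memory:")
load(transform(extract(source)), conn)
print(conn.execute("SELECT customer, amount FROM sales").fetchall())
```

The same extract/transform/load split is what orchestration tools such as Airflow schedule and monitor as separate tasks.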

Requirements

We're looking for an enthusiastic and detail-oriented Junior Big Data Developer to join our data engineering team. This role is ideal for an early-career professional with foundational knowledge in data processing, strong proficiency in Python, and expert skills in SQL. You'll focus on building, testing, and maintaining data pipelines and ensuring data quality across our scalable Big Data platforms.

Programming

Strong proficiency in Python for data manipulation and scripting. Familiarity with standard Python data libraries (e.g., Pandas, NumPy).
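As an example of the day-to-day data manipulation this skill implies (the column names and values here are purely illustrative):

```python
import numpy as np
import pandas as pd

# Illustrative raw data with a missing value and inconsistent casing
df = pd.DataFrame({
    "region": ["north", "South", "north", "SOUTH"],
    "revenue": [100.0, np.nan, 250.0, 50.0],
})

# Clean: normalise casing, fill missing revenue with 0
df["region"] = df["region"].str.lower()
df["revenue"] = df["revenue"].fillna(0.0)

# Aggregate: total revenue per region
summary = df.groupby("region", as_index=False)["revenue"].sum()
print(summary)
```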

Database

Expert-level proficiency in SQL (Structured Query Language). Experience writing complex joins and stored procedures, and performing performance tuning.
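A sketch of the kind of query this covers — a LEFT JOIN with aggregation, run here through Python's built-in sqlite3 module (the tables and sample data are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO orders VALUES (1, 1, 20.0), (2, 1, 30.0), (3, 2, 15.0);
""")

# LEFT JOIN with aggregation: total order value per customer,
# keeping customers even if they have no orders
query = """
SELECT c.name, COALESCE(SUM(o.total), 0) AS total_spent
FROM customers c
LEFT JOIN orders o ON o.customer_id = c.id
GROUP BY c.name
ORDER BY total_spent DESC;
"""
results = conn.execute(query).fetchall()
print(results)
```

Stored procedures themselves are engine-specific (SQLite does not support them), so in practice this part of the skill set applies to warehouse engines such as SQL Server, PostgreSQL, or Snowflake.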

Big Data Concepts

Foundational understanding of Big Data architecture (Data Lakes, Data Warehouses) and distributed processing concepts (e.g., MapReduce).
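The MapReduce model mentioned here can be illustrated in plain Python — map each record to key/value pairs, shuffle by key, then reduce each group. This toy word count shows the shape of the computation, not a distributed implementation:

```python
from collections import defaultdict
from functools import reduce

documents = ["big data big pipelines", "data pipelines"]

# Map: emit a (word, 1) pair for every word in every document
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group the emitted values by key
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: sum the counts for each word
counts = {word: reduce(lambda a, b: a + b, values)
          for word, values in groups.items()}
print(counts)
```

In a real distributed system (Hadoop, Spark) the map and reduce phases run in parallel across many machines, with the shuffle moving data between them.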

ETL/ELT

Basic knowledge of ETL principles and data modeling (star schema, snowflake schema).
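A star schema, as referenced here, keeps a central fact table keyed to surrounding dimension tables. A toy version with Pandas (the tables and keys are hypothetical):

```python
import pandas as pd

# Dimension tables: descriptive attributes keyed by surrogate IDs
dim_product = pd.DataFrame({"product_id": [1, 2], "category": ["books", "games"]})
dim_date = pd.DataFrame({"date_id": [10, 11], "month": ["Jan", "Feb"]})

# Fact table: one row per sale, referencing the dimensions by key
fact_sales = pd.DataFrame({
    "product_id": [1, 2, 1],
    "date_id": [10, 10, 11],
    "amount": [5.0, 7.5, 3.0],
})

# A typical star-schema query: join facts to dimensions, then aggregate
result = (fact_sales
          .merge(dim_product, on="product_id")
          .merge(dim_date, on="date_id")
          .groupby(["category", "month"], as_index=False)["amount"].sum())
print(result)
```

A snowflake schema further normalises the dimensions into sub-tables; the fact table and join pattern stay the same.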

Version Control

Practical experience with Git (branching, merging, pull requests).

Preferred Qualifications (A Plus)

  • Experience with a distributed computing framework like Apache Spark (using PySpark).
  • Familiarity with cloud data services (AWS S3/Redshift, Azure Data Lake/Synapse, or Google BigQuery/Cloud Storage).
  • Exposure to workflow orchestration tools (Apache Airflow, Prefect, or Dagster).
  • Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
