Big Data specialist

Go Arrow

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£100K

Job location

Tech stack

Java
Amazon Web Services (AWS)
Azure
Big Data
Databases
Data Architecture
Data Infrastructure
ETL
Data Systems
Data Warehousing
Distributed Systems
Hadoop
HBase
Python
Machine Learning
NoSQL
Google Cloud Platform
Spark
Apache Flink
Cassandra
Kafka
Data Management
Data Pipelines
Programming Languages

Job description

We are seeking a highly skilled Big Data Architect to lead the design, development, and implementation of scalable data solutions. The successful candidate will be responsible for creating robust data architectures that support advanced analytics, machine learning, and business intelligence initiatives. This role offers an exciting opportunity to work at the forefront of data technology within a dynamic organisation committed to innovation and excellence.

  • Design and develop comprehensive big data architectures that optimise data collection, storage, processing, and analysis.

  • Collaborate with data engineers, analysts, and stakeholders to understand data requirements and translate them into scalable solutions.
  • Evaluate and select appropriate big data tools and frameworks, such as Hadoop, Spark, Kafka, or similar technologies.
  • Ensure the security, integrity, and privacy of organisational data assets through effective governance practices.
  • Develop best practices for data modelling, ETL processes, and data warehousing to support diverse analytical needs.
  • Lead efforts in integrating new data sources and maintaining existing data pipelines for continuous improvement.
  • Provide technical guidance and mentorship to team members involved in data architecture projects.
  • Stay abreast of emerging trends in big data technologies and recommend innovative solutions to enhance organisational capabilities.

Requirements

Do you have experience in Spark?

  • Proven experience as a Big Data Architect or similar role with a strong understanding of distributed computing systems.

  • Proficiency in big data frameworks such as Hadoop, Spark, Kafka, Flink or equivalent platforms.
  • Solid knowledge of database management systems including NoSQL databases like Cassandra or HBase.
  • Strong understanding of cloud platforms such as AWS, Azure or Google Cloud for scalable big data solutions.
  • Expertise in data modelling, ETL processes, and data warehousing concepts.
  • Familiarity with programming languages such as Java, Scala or Python for developing custom solutions.
  • Excellent problem-solving skills with the ability to translate complex business requirements into technical architectures.
  • Strong communication skills to effectively collaborate across teams and present technical concepts clearly.

This role offers an engaging environment where innovation is valued and professional growth is encouraged for those passionate about shaping the future of organisational data infrastructure.
