Hadoop Developer
Realign Llc
Charlotte, United States of America
1 month ago
Role details
- Contract type: Permanent contract
- Employment type: Full-time (> 32 hours)
- Working hours: Regular working hours
- Languages: English
- Experience level: Senior
- Compensation: $125K
- Job location: Charlotte, United States of America
Tech stack
Java
Airflow
Amazon Web Services (AWS)
Azure
Big Data
Cloud Computing
Information Engineering
Data Governance
ETL
Data Mining
Data Security
Data Warehousing
Linux
Distributed Data Store
Hadoop
Hadoop Distributed File System
MapReduce
Hive
Python
Apache Oozie
Shell Script
SQL Databases
Sqoop
Data Processing
Google Cloud Platform
Spark
Data Lake
Kafka
Data Pipelines
Job description
We are seeking a skilled Hadoop Developer to join our data engineering team. The ideal candidate will design, develop, and maintain scalable big data solutions using the Hadoop ecosystem. You will work closely with data engineers, analysts, and business stakeholders to process large datasets and build reliable data pipelines that support analytics and business intelligence.
Key Responsibilities
- Design, develop, and maintain big data applications using the Hadoop ecosystem.
- Build and optimize data pipelines for ingesting, transforming, and processing large-scale datasets.
- Develop solutions using tools such as Hive, Spark, MapReduce, HDFS, and Kafka.
- Write efficient SQL queries and scripts for data extraction, transformation, and loading (ETL).
- Collaborate with data scientists, analysts, and engineering teams to deliver data-driven solutions.
- Monitor and troubleshoot data processing workflows and cluster performance.
- Ensure data quality, governance, and security best practices.
- Optimize Hadoop jobs for performance and scalability.
- Document technical designs, workflows, and development processes.
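The MapReduce development mentioned in the responsibilities above follows a map, shuffle, reduce pattern. As a rough illustration only (a toy pure-Python word count, not actual Hadoop API code), the pattern looks like this:

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in the input line.
    for word in line.lower().split():
        yield word, 1

def shuffle(pairs):
    # Shuffle phase: group values by key, as Hadoop does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # Reduce phase: aggregate the grouped counts for each word.
    return key, sum(values)

lines = ["big data big pipelines", "data pipelines"]
mapped = [pair for line in lines for pair in mapper(line)]
counts = dict(reducer(k, v) for k, v in shuffle(mapped).items())
print(counts)  # {'big': 2, 'data': 2, 'pipelines': 2}
```

In a real Hadoop job the same three phases run distributed across the cluster, with HDFS providing the input splits and the framework handling the shuffle.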
Key Skills
- Hadoop Ecosystem
- Apache Spark
- Hive & HDFS
- Kafka
- Python / Java / Scala
- ETL & Data Pipelines
- SQL & Data Warehousing
- Linux / Shell Scripting
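As a minimal sketch of the ETL and data-pipeline work listed above, here is a stdlib-only extract-transform-load example in Python (the table name, schema, and sample data are invented for illustration; production pipelines would use Spark, Hive, or an ETL tool instead):

```python
import csv
import io
import sqlite3

# Extract: read raw records from a CSV source (an in-memory string here).
raw = "user_id,amount\n1,10.50\n2,abc\n3,7.25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and reject records that fail validation.
def transform(row):
    try:
        return int(row["user_id"]), float(row["amount"])
    except ValueError:
        return None  # drop malformed records

clean = [r for r in (transform(row) for row in rows) if r is not None]

# Load: write cleaned records into a warehouse table (SQLite stands in here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?)", clean)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 17.75
```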
Requirements
- 6+ years of experience in Hadoop or big data development.
- Strong experience with Hadoop ecosystem tools (HDFS, Hive, Spark, MapReduce, Sqoop, Kafka).
- Proficiency in Python, Java, or Scala.
- Experience with ETL tools and data pipeline development.
- Strong knowledge of SQL and data warehousing concepts.
- Familiarity with Linux/Unix environments.
- Experience with distributed data processing and big data architecture.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with workflow orchestration tools (Airflow, Oozie).
- Knowledge of data lakes, data governance, and data security practices.
- Experience in financial services or large enterprise environments.
Required Skills
Cloud Developer