Senior Data Engineer
Role details
Requirements:
- Experience: 8-10 years in Data Engineering and Data Analysis.
- Informatica Expertise: Strong hands-on experience with Informatica PowerCenter/IDQ for ETL design, development, and optimization.
- PySpark Development: Advanced skills in PySpark for large-scale data processing, transformation, and analytics.
- Hadoop Ecosystem: Solid working knowledge of Hadoop technologies (HDFS, Hive, Sqoop, MapReduce).
- Programming Skills: Proficiency in Python and Kafka for streaming and batch data pipelines.
- Database & Modeling: Strong understanding of database concepts, data design, data modeling, and ETL workflows.
- ETL Lifecycle: Experience analyzing, designing, and coding ETL programs, including data extraction, ingestion, quality checks, normalization, and loading.
- Agile Delivery: Hands-on experience with Agile methodology and Jira for project delivery.
- Client Interaction: Proven ability in client-facing roles, with strong communication and leadership skills to coordinate across the SDLC.
Preferred Skills:
- Exposure to AWS data components and analytics.
- Familiarity with machine learning models and AI concepts.
- Experience with data modeling tools such as Erwin.