Integration & Architect
Role details
- Job location: India
- Contract type: Permanent contract
- Employment type: Full-time (> 32 hours)
- Working hours: Regular working hours
- Languages: English
- Experience level: Senior
Tech stack
API
Agile Methodologies
Amazon Web Services (AWS)
Audit Trail
User Authentication
Cloud Engineering
Databases
Continuous Delivery
Continuous Integration
Data Governance
ETL
Data Mapping
Data Transformation
Data Structures
Data Systems
Relational Databases
Database Design
Middleware
Hypertext Transfer Protocol (HTTP)
Identity and Access Management
JSON
Python
PostgreSQL
Online Analytical Processing
Online Transaction Processing
Performance Tuning
Scrum
E2E Testing
Cloud Services
Standard SQL
Software Deployment
Software Engineering
SQL Databases
Data Streaming
XML
Data Logging
Scripting (Bash/Python/Go/Ruby)
Spark
Boomi
Event-Driven Architecture
PySpark
Enterprise Integration
Kafka
Functional Programming
CloudWatch
API Gateway
REST
Job description
As a Data Engineer, you will be responsible for:
- Delivering data-driven product features by working closely with developers and stakeholders.
- Managing backlog items and designing, developing, delivering, and supporting data changes.
- Ensuring data structures and database designs meet application, scalability, and architectural standards.
- Building and maintaining ETL pipelines for OLAP and OLTP systems.
- Driving build and release activities for data solutions and supporting related architecture artefacts.
- Continuously improving and optimizing data models and warehouse structures.
- Collaborating with product owners, architects, BAs, and scrum teams to deliver change on schedule.

As an Integration Engineer, you will be responsible for:
- Designing, building and owning end-to-end enterprise integrations supporting critical business workflows.
- Developing and maintaining Boomi-based integrations, including process design, connector usage, data mapping, error handling, retries, and environment promotions.
- Building event-driven integrations using Kafka producers and consumers with proper offset management.
- Implementing asynchronous and API-led integration patterns to decouple systems.
- Managing backlog items and delivering integration changes using Agile practices.
- Driving build, deployment, and release activities for integration solutions across environments.
- Ensuring integrations meet scalability, security, auditability, and reliability standards.
- Monitoring and supporting production integrations, including incident analysis, message replay, and recovery.
- Collaborating with product owners, architects, and application teams to deliver change on schedule.
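The offset-management responsibility above comes down to committing an offset only after its message has been fully handled (at-least-once delivery). A minimal pure-Python sketch of that pattern, with an in-memory list standing in for a Kafka topic partition and a plain integer standing in for the committed offset (no real broker or client library involved; all names are illustrative):

```python
# Illustrative sketch only: at-least-once consumption with manual
# offset commits. A Python list stands in for a topic partition.

def process(msg):
    """Placeholder business logic; rejects malformed messages."""
    if msg is None:
        raise ValueError("empty message")
    return msg.upper()

def consume(messages, committed_offset, dead_letters):
    """Handle messages after `committed_offset`, advancing the stored
    offset only AFTER each message is dealt with, so a crash
    mid-message causes a re-read on restart rather than a lost
    message. Failing messages are parked on a dead-letter list
    for later replay."""
    results = []
    for offset in range(committed_offset + 1, len(messages)):
        try:
            results.append(process(messages[offset]))
        except ValueError:
            dead_letters.append((offset, messages[offset]))  # park for replay
        committed_offset = offset  # commit only after handling
    return results, committed_offset

dlq = []
out, last = consume(["order-1", "order-2", None, "order-3"], -1, dlq)
print(out, last, dlq)  # ['ORDER-1', 'ORDER-2', 'ORDER-3'] 3 [(2, None)]
```

With a real Kafka client this typically corresponds to disabling auto-commit and calling the client's commit method explicitly once processing succeeds.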
Requirements
- Strong hands-on SQL and Python skills.
- Strong hands-on experience with relational databases, particularly PostgreSQL.
- Strong hands-on experience with streaming platforms and event-driven architectures.
- Strong hands-on experience with AWS cloud services (EKS, EC2, S3, RDS, Lambda, API Gateway, ECS, VPC, IAM, CloudWatch, etc.) and cloud architecture best practices.
- Solid expertise in physical data modelling, database design, CDC, and performance tuning.
- Experience creating REST APIs.
- Experience designing, developing, and deploying applications using OLTP and OLAP systems.
- Experience with ETL tools such as Boomi; scripting skills in Spark or PySpark are desirable.
- Hands-on experience with CI/CD tools.
- Exposure to financial services, compliance, or regulated data domains.
- Familiarity with data governance, lineage, and auditability concepts.
- Experience working across onsite/offshore teams.
- Strong team player with a willingness to learn new technologies.
Experience
- 10+ years of experience across databases and ETL tools.
- End-to-end ETL experience spanning documentation, development, testing, and deployment to production.
- Strong relational database background and SQL skills.
- Proficiency in automation and continuous delivery methods.
- Proficiency in all aspects of the software development life cycle in an Agile environment.
- Strong hands-on experience with Boomi (process design, connectors, mappings, Atom/Molecule).
- Strong hands-on experience with Kafka, including topics, partitions, offsets, and retries.
- Proven experience designing event-driven and message-based architectures.
- Strong understanding of resiliency patterns (idempotency, retries, DLQs, replay).
- Experience building and consuming REST APIs (JSON/XML, authentication, HTTP semantics).
- Ability to troubleshoot and resolve production integration issues, including failed messages and consumer lag.
- Familiarity with logging, monitoring, and operational observability.
- Experience working across onsite/offshore teams.
- Strong ownership mindset and willingness to learn evolving integration patterns.
- 6+ years of experience in enterprise integration and middleware development.
- Proven end-to-end integration ownership from design through production support.
- Mandatory hands-on experience delivering integrations using Boomi and Kafka.
- Experience supporting business-critical, high-volume integrations in production.
- Strong understanding of integration patterns, data transformation, and error-handling strategies.
- Proficiency in automation, CI/CD, and Agile delivery practices.
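The resiliency patterns named in the requirements above (idempotency, bounded retries, dead-lettering and replay) combine naturally in one message handler. A pure-Python sketch under assumed names, with `make_flaky` simulating a downstream system that fails transiently and no real broker or middleware involved:

```python
# Illustrative sketch only: idempotency, bounded retries, and a
# dead-letter queue (DLQ), with in-memory stand-ins for durable stores.

MAX_RETRIES = 3

def make_flaky(fail_times):
    """Return a downstream call that fails `fail_times` times, then succeeds."""
    state = {"left": fail_times}
    def call(event):
        if state["left"] > 0:
            state["left"] -= 1
            raise ConnectionError("transient downstream failure")
    return call

def handle(event, downstream, seen_ids, dlq):
    """Process one event with the three resiliency patterns combined."""
    if event["id"] in seen_ids:           # idempotency: skip duplicates
        return "duplicate"
    for _ in range(MAX_RETRIES):
        try:
            downstream(event)
            seen_ids.add(event["id"])     # record id only after success
            return "ok"
        except ConnectionError:
            continue                      # retry transient failures
    dlq.append(event)                     # retries exhausted: park for replay
    return "dead-lettered"

seen_ids, dlq = set(), []
r1 = handle({"id": "e1"}, make_flaky(1), seen_ids, dlq)  # succeeds on retry
r2 = handle({"id": "e1"}, make_flaky(0), seen_ids, dlq)  # duplicate, skipped
r3 = handle({"id": "e2"}, make_flaky(5), seen_ids, dlq)  # exhausts retries
print(r1, r2, r3)  # ok duplicate dead-lettered
```

In production the seen-id set and DLQ would be durable (a database table, a dedicated topic), but the control flow is the same.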