Data Engineer
Overview

TechDelivery is looking for a Data Engineer to join a dynamic team focused on building scalable, secure, and high-performance data solutions. The ideal candidate has hands-on experience in Big Data environments, strong programming skills, and the ability to collaborate with business stakeholders.

Responsibilities

* Design, build, and optimize modern data pipelines and data flow architectures to support analytics and business intelligence.
* Work with Big Data technologies, including Databricks and Apache Spark, to process and transform large-scale datasets.
* Develop and maintain scalable data solutions using PySpark and PowerShell, ensuring automation and efficiency.
* Ensure data quality, security, and integrity across all stages of the data lifecycle.
* Collaborate with data analysts, data scientists, and business stakeholders to understand requirements and deliver reliable data products.
* Monitor and troubleshoot data pipelines and proactively identify performance improvements.
* Document data flows, technical specifications, and system designs to support long-term maintenance and scalability.
* Stay up to date with the latest tools and best practices in data engineering and apply them to project work.

Required Skills

* Databricks, Apache Spark, and PySpark
* PowerShell
* Data pipelines and data architecture
* Data quality, data security, and data governance
* Azure and cloud environments
* ETL, ELT, and data integration
* Cross-functional collaboration, analytical thinking, problem-solving, and adaptability
* English (B2) and communication skills in Spanish

Desired Skills

* Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake
* CI/CD for data workflows
* High-volume data processing
* Data compliance and data governance frameworks
* DataOps
* Python, SQL, and automation scripting
* Agile methodologies
* Cloud-native data services
* Distributed computing