#1563 - Senior Data Engineer Python
Job description
At Imagemaker we are looking for the best talent in solution development to join our team of Makers.
The Data Engineer's main role is to develop and maintain data pipelines and build out new API integrations to support continuing increases in data volume and complexity. The skill set required for this profile includes:
Hands-on experience with Python and knowledge of popular Data Science libraries.
Hands-on experience with popular databases: PostgreSQL and MongoDB.
Knowledge of geospatial and time-series databases.
Experience with Docker is a must, along with good knowledge of Kubernetes.
Responsibilities & Accountabilities
Develops and maintains scalable data pipelines and builds out new API integrations to support continuing increases in data volume and complexity.
Collaborates with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
Writes unit/integration tests, contributes to engineering wiki, and documents work.
Performs the data analysis required to troubleshoot and resolve data-related issues.
Works closely with a team of frontend and backend engineers, product managers, and analysts.
Designs data integrations and data quality framework.
Designs and evaluates open source and vendor tools for data lineage.
Works closely with all business units and engineering teams to develop strategy for long term data platform architecture.
Requirements: Skills
Hands-on experience with Image Processing is a must.
Hands-on experience with Python and knowledge of popular Data Science libraries such as PyTorch, TensorFlow, or OpenCV (any one of these).
Hands-on experience with popular databases: PostgreSQL (must) and MongoDB (optional). Knowledge of geospatial (must) and time-series (must) databases.
Experience with Docker is a must. Knowledge of DevOps tools and Kubernetes is a plus (preferred).
Good understanding of message broker concepts (preferred).
Good knowledge of task scheduling techniques (preferred).
Experience with data pipeline and workflow management tools (preferred).
Experience with or knowledge of Agile software development methodologies (desirable).
Excellent problem-solving and troubleshooting skills.
Process-oriented with great documentation skills.
Excellent oral and written communication skills with a keen sense of customer service.
Requirements: Education
BS or MS degree in Computer Science or a related technical field.
Strong analytic skills related to working with unstructured datasets
Experience
5+ years of experience with Python or another programming language
5+ years of SQL and NoSQL experience
5+ years of experience with schema design and dimensional data modeling
Experience with other programming languages like TypeScript (Node.js) and Go is a huge advantage
Ability in managing and communicating data warehouse plans to internal clients
Experience designing, building, and maintaining data processing systems
Experience working with either a MapReduce or an MPP system at any size/scale (a plus)
We invite you to review the following conditions of the offer: contractor engagement (LatAm, paid in EUR), 100% remote work, fluent English, and availability to overlap at least 3 hours with the client's schedule (UAE, Abu Dhabi).
- For countries such as Spain, Colombia, and Chile we offer indefinite-term contracts.