Senior Data Engineer
Job description
As a Senior Data Engineer, you should be an expert in data warehousing technical components (e.g. data modeling, ETL and reporting), in infrastructure (e.g. hardware and software), and in their integration. You will be responsible for collecting data from multiple sources and building optimal pipelines to process and leverage that data to meet various business requirements. You will execute our data strategy through the design and development of the data platform, using (but not limited to) AWS technologies, Airflow and Snowflake to deliver Reporting, BI and Analytics solutions. You will work closely with business and technical stakeholders to aggregate, analyze and transform data and report insights.
Responsibilities
- Create and maintain optimal data pipeline architecture
- Assemble, analyze and organize large, complex data sets that meet functional / non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using DBT, SQL, Snowflake and AWS/GCP 'big data' technologies (see the sketch after this list)
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
- Assist the data science team by preparing data for prescriptive and predictive modeling
- Collaborate with the data architects, analysts and scientists on the team
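To make the stack concrete, here is a minimal sketch of the kind of pipeline this role owns: a daily S3-to-Snowflake load orchestrated with Airflow. The DAG, connection, stage and table names are hypothetical, and the operator choice assumes the Airflow Snowflake provider is installed.

```python
# Minimal sketch of a daily S3 -> Snowflake load; all names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_orders_load",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage raw files from S3 into Snowflake via an external stage.
    load_raw = SnowflakeOperator(
        task_id="copy_orders_from_s3",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO raw.orders
            FROM @raw.s3_orders_stage
            FILE_FORMAT = (TYPE = 'PARQUET')
        """,
    )

    # Transform the staged data into a reporting table.
    build_report = SnowflakeOperator(
        task_id="build_orders_daily",
        snowflake_conn_id="snowflake_default",
        sql="""
            CREATE OR REPLACE TABLE analytics.orders_daily AS
            SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
            FROM raw.orders
            GROUP BY order_date
        """,
    )

    load_raw >> build_report
```

In practice the transformation step would typically live in versioned DBT models rather than inline SQL, but the orchestration shape is the same.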
Requirements
- 5+ years of experience as a Data Engineer or in a similar role
- Experience with data modeling, data warehousing, and building ETL pipelines
- Experience in SQL
- Experience with building data pipelines and applications to stream and process datasets
- Sound knowledge of distributed systems and data architecture (e.g. the Lambda architecture): able to design and implement batch and stream data processing pipelines, and to optimize the distribution, partitioning and MPP processing of large data structures (see the sketch after this list)
- Knowledge of engineering and operational excellence practices based on standard methodologies
- Expertise in designing systems and workflows for handling big data volumes
- Knowledge of data management fundamentals and data storage principles
- Strong problem-solving skills and the ability to prioritize conflicting requirements
- Excellent written and verbal communication skills and the ability to succinctly summarize key findings
- Experience working with AWS big data technologies (EMR, Redshift, S3)
- Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent
Key competencies
Driving Continuous Improvement; Driving for Results; Driving Projects to Completion; Interacting with People at Different Levels; Using Computers and Technology
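As a concrete illustration of the distribution and partitioning points above, here is a minimal sketch of a Spark batch job; the S3 paths, column names and partition count are hypothetical.

```python
# Minimal sketch of distribution/partitioning in a Spark batch job.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_batch").getOrCreate()

events = spark.read.parquet("s3://example-bucket/raw/events/")

# Repartition on the aggregation key so work spreads evenly across the
# cluster instead of skewing onto a few executors.
events = events.repartition(200, "customer_id")

daily = events.groupBy("customer_id", "event_date").count()

# Partition the output by date so MPP engines reading it (e.g. Redshift
# Spectrum or Snowflake external tables) can prune files at read time.
(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/events_daily/"))
```

Keying the repartition on the aggregation column avoids skew, and partitioning the output by date limits how much data downstream queries must scan.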
About the company
At Optimizely, we’re on a mission to help people unlock their digital potential. With our leading digital experience platform (DXP), we equip teams with the tools and insights they need to create and optimize in new and novel ways. Now, companies can operate with data-driven confidence to create hyper-personalized experiences.
We foster an inclusive culture with a global team of 1500+ people across the US, Europe, Australia, Bangladesh, and Vietnam. We blend European and American business culture with an emphasis on teamwork, diversity, and moving fast. Our people make the difference!
If you are looking to work on the next generation of digital technologies in a fast-paced yet stable environment, let’s have a conversation! We’re just getting started...