Data Engineer

Here Technologies
München, Germany
2 days ago

Role details

Contract type
Internship / Graduate position
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

München, Germany

Tech stack

Java
Airflow
Amazon Web Services (AWS)
Big Data
Cloud Computing
Data Cleansing
Data Infrastructure
Data Normalization
Data Warehousing
GitHub
Hadoop
Python
PostgreSQL
MongoDB
NoSQL
Operational Databases
Performance Tuning
Query Optimization
Data Processing
Scripting (Bash/Python/Go/Ruby)
Git
Build Management
Information Technology
Data Management

Job description

Teza Technologies is looking for Data Engineers to join our data team. Data drives systematic trading and is critical to all aspects of the firm's business.

  • Work directly with Portfolio Managers and Quantitative Developers to translate business requirements into technical solutions; be a resource to explain dataset details and nuances.

  • Expand our data warehouse by designing and adding new sources and functionality; improve robustness, speed and scalability of our systems; manage data entitlements
  • Provide innovative data management, analytics and technology input to the team and management.
  • Evaluate new tools and technologies suitable for organizing, querying and streaming large datasets.
  • Design and build automated systems for data cleansing, anomaly detection, monitoring and alerting.
  • Support our production data warehouse as required.
  • Develop and maintain strong vendor relationships aligned with our business objectives.

Requirements

This is a hands-on position on a small team of data engineers with strong growth potential; the team is expected to grow rapidly over the next couple of years. The firm is looking for outstanding technical skills, strong attention to detail, and experience architecting and building data platforms.

  • Proficiency in Python and Unix/Linux for data manipulation, scripting, and automation.

  • Strong SQL knowledge and familiarity with NoSQL databases (ideally Postgres and MongoDB), including query optimization and performance tuning.
  • Strong understanding of data modeling principles, including both normalization and denormalization techniques.
  • Familiarity with cloud platforms (e.g., AWS or GCP).
  • Experience with Git version control, collaborative workflows (e.g., GitHub), and understanding of CI/CD best practices.
  • Bachelor's degree in Computer Science, Information Technology or related field.

Nice-to-have requirements

  • Financial industry internships or experience.
  • Experience with Java.
  • Experience with on-premises data infrastructure (e.g., Hadoop).
  • Experience with Apache Airflow or similar workflow orchestration tools.
  • An understanding of best practices for data modeling, including data normalization techniques.
  • Master's degree in Computer Science, Information Technology, Data Science or related field.

Benefits & conditions

  • Health, visual and dental insurance
  • Flexible sick time policy

Apply for this position