
Your mission & challenges
  • Design, implement, and maintain ETL data pipelines at scale
  • Build and optimize data models for robotics applications
  • Ensure data quality, governance, and security across all platforms
  • Develop data workflows using scalable processing, streaming, and dataset curation technologies
  • Collaborate with cross-functional teams to deliver high-quality datasets
  • Evaluate and integrate emerging data engineering technologies and best practices

What we look forward to
  • Master’s degree in Computer Science, Information Systems, or related field
  • 7+ years of experience in data engineering or related roles
  • Strong programming skills in Python and SQL; experience with Java or Scala for big data frameworks
  • Experience with modern data technologies (Spark, Kafka, Airflow) and NoSQL databases (e.g., MongoDB)
  • Cloud expertise (AWS, Azure, or GCP) and familiarity with data lake/data warehouse solutions
  • Proficiency in containerization and orchestration (Docker, Kubernetes)
  • Excellent problem-solving and debugging skills
  • Ability to work independently and as part of a team
  • Excellent command of English; German language skills are a strong plus
