www.acad.jobs : academic jobs worldwide – and the best jobs in industry
Position: (Senior) Data/Machine Learning Engineer (m/f/diverse)
Institution: Deutsche Lufthansa AG
Location: Hamburg, Germany
Duties:
- We run event-based analytics pipelines on the Google Cloud Platform, processing new data points every day, using technologies such as Python, Terraform, Kubernetes, Docker and Pub/Sub
- We have developed an MLOps stack that allows us to deploy our models seamlessly into a production environment
- We roll out new features through fully automated CI/CD pipelines, including code reviews and automated tests
- We take responsibility for the full DevOps cycle, but thanks to managed cloud services and automation we spend most of our time on new features and architecture optimisations rather than on responding to ops issues
Requirements:
- Completed degree in computer science or a related subject
- At least four years of experience in data engineering (setting up and operating data pipelines in big-data/analytics environments)
- At least two years of experience with one or more of the major cloud providers (GCP/Azure/AWS)
- Experience in software development with Python; knowledge of tools such as VS Code, PyCharm, git, Jupyter, pip, conda and the CLI
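For readers unfamiliar with the event-based pipelines the duties describe, here is a minimal sketch of the pattern in Python. This is an invented illustration, not Lufthansa's actual stack: `queue.Queue` stands in for a Pub/Sub subscription, and `process_event` for an analytics step.

```python
import queue
import threading

# Hypothetical toy analytics step; the field names are invented.
def process_event(event: dict) -> dict:
    """Enrich a raw event with a derived field."""
    return {**event, "doubled": event["value"] * 2}

def run_pipeline(events):
    """Push events through a queue and collect processed results,
    mimicking an event-driven (publish/subscribe) pipeline."""
    inbox = queue.Queue()   # stand-in for a Pub/Sub subscription
    results = []

    def worker():
        # Consumer loop: pull events until the sentinel arrives.
        while True:
            event = inbox.get()
            if event is None:  # sentinel: no more events
                break
            results.append(process_event(event))

    consumer = threading.Thread(target=worker)
    consumer.start()
    for e in events:          # producer side: publish events
        inbox.put(e)
    inbox.put(None)           # signal shutdown
    consumer.join()
    return results
```

In a real deployment the consumer loop would be replaced by a managed subscriber client delivering messages via callback, so the pipeline reacts to new data as it arrives rather than polling on a schedule.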