Industries:
IT & Programming

Data Platform Engineer (f/m/d) – Cologne, Paderborn or remote

DeepL…

is Germany’s best-known AI company. We develop neural networks to help people work with language. With DeepL Translator, we have created the world’s best machine translation system and made it available free of charge to everyone online. Over the next few years, we aim to make DeepL the world’s leading language technology company.

 

Our goal is to overcome language barriers and bring cultures closer together.

 

What distinguishes us from other companies?

DeepL (formerly Linguee) was founded by developers and researchers. We focus on developing new, exciting products, which is why we spend a lot of time actively researching the latest topics. We understand the challenges of developing new products and meet them with an agile and dynamic way of working. Our work culture is very open because we want our employees to feel comfortable. In our daily work we use modern technologies, not only to translate texts, but also to create the world's best dictionaries and to solve other language problems.

When we tell people about DeepL as an employer, reactions are overwhelmingly positive. Maybe it’s because they have enjoyed our services, or maybe they just want to get on board with our quest to break down language barriers and facilitate communication.

 

What will you be doing at DeepL?

As a data platform engineer, you act as a key link between our software developers, data analysts, product managers, and the DevOps team. Together with your dedicated team, you develop, maintain, and manage a constantly growing internal analytics platform (PaaS-like).

 


Your responsibilities

  • Creating and testing data pipelines and setting up alerting for them
  • Building out our data infrastructure and managing dependencies between data pipelines
  • Implementing software for internal tooling and our streaming pipelines
  • Defining and implementing metrics that provide visibility into our data quality
  • Managing and maintaining our Kafka and ClickHouse clusters
  • Defining new requirements and constantly challenging our ideas
  • Last but not least, you are not afraid to try new technologies and approaches, even if StackOverflow says it can't be done

 

What we offer

  • Data at scale from products used by more than 100 million people worldwide
  • Our own analytics and experimentation platform – far beyond the limitations of standard web analytics platforms
  • You get the chance to work and play with data warehouses (ClickHouse), configuration management (Ansible), container orchestration (Kubernetes), and CI/CD (Jenkins/GitLab CI)
  • Interesting challenges: design and programming at the highest level
  • A friendly, international, and highly committed team with a lot of trust and very short decision-making processes
  • Meaningful work: We break down language barriers worldwide and bring different cultures closer together
  • A comfortable office in Cologne or Paderborn (or suitable equipment for your home office) and a lot of flexibility

 

About you

  • 2+ years of industry experience as a data engineer
  • A university degree in computer science, information systems, or a similar technical field, or an equivalent qualification
  • Expert knowledge of Python and basic knowledge of a compiled language (such as C#, C++, or Java)
  • A solid understanding of SQL
  • Hands-on experience with event streaming and common technologies like Apache Kafka or RabbitMQ
  • You know your way around Linux and can debug a failing system
  • Familiarity with processing structured and unstructured data at scale
  • You have worked with containers (like Docker) and automated their creation
  • Good communication and teamwork skills
  • Fluent in English, German is a plus

 

We look forward to your application!

 

Please mention in your application that you found this position through the GetRemote job board. That way, even more remote companies will share their open positions here.