Senior Data Engineer Python/GCP (x/f/m)
Your Impact
We are looking for a Senior Data Engineer to join the AI Team working on our AI Medical Companion.
Your mission will be to build and optimize the data foundations that power safe, scalable, and impactful AI models. You will work on data infrastructure for LLM, VLM, and RAG-based systems, ensuring our engineers and data scientists can train, evaluate, and deploy AI models efficiently on high-quality, well-structured, and compliant data. Your work will directly support health professionals in delivering better care while improving their work-life balance, ultimately impacting 80 million patients and 400,000 healthcare professionals across Europe.
Working in the tech team at Doctolib means building innovative products and features to improve the daily lives of care teams and patients.
What you'll do
Your responsibilities include but are not limited to:
- Design, build, and maintain scalable data pipelines on Google Cloud Platform (GCP) for AI and machine learning use cases
- Implement data ingestion and transformation frameworks that power retrieval systems and training datasets for LLMs and multimodal models
- Architect and manage NoSQL and vector databases to store and retrieve embeddings, documents, and model inputs efficiently
- Collaborate with ML and platform teams to define data schemas, partitioning strategies, and governance rules that ensure privacy, scalability, and reliability
- Integrate unstructured and structured data sources (text, speech, image, documents, metadata) into unified data models ready for AI consumption
- Optimize performance and cost of data pipelines using GCP native services (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Vertex AI)
- Contribute to data quality and lineage frameworks, ensuring AI models are trained on validated, auditable, and compliant datasets
- Continuously evaluate and improve our data stack to accelerate AI experimentation and deployment
Who you are
- You have 5+ years of experience in Data Engineering, ideally supporting AI or ML workloads
- You have strong experience with the GCP data ecosystem and proficiency in Python and SQL
- You have a deep understanding of NoSQL systems (e.g., MongoDB) and vector databases (e.g., FAISS, Vector Search)
- You have experience designing data architectures for RAG, embeddings, or model training pipelines
- You have knowledge of data governance, security, and compliance for sensitive or regulated data
- You are fluent in English
- You hold a Master's or Ph.D. degree in Computer Science, Data Engineering, or a related field
- You have familiarity with W&B / MLflow / Braintrust / DVC for experiment tracking and dataset versioning
- You have experience with containerized environments (Docker, Kubernetes) and CI/CD for data workflows
Our tech environment
- Our solutions are built on a single fully cloud-native platform that supports web and mobile app interfaces, multiple languages, and is adapted to country and healthcare specialty requirements.
- Our stack is composed of Rails, TypeScript, Java, Python, Kotlin, Swift, and React Native.
- We leverage AI ethically across our products to empower patients and health professionals. Discover our AI vision here.
What we offer
- Free comprehensive health insurance (basic package) for you and your children
- 25 days of paid vacation per year, plus up to 14 days of RTT
- Free mental health and coaching services through our partner Moka.care
- Work from abroad for up to 10 days per year thanks to our flexibility days policy
- Lunch vouchers (Swile card) worth €8.50 per working day, with €4.50 covered by Doctolib
- A subsidy from the works council to cover part of the membership fee for a sports club or a creative class
- 50% reimbursement of your public transport subscription
- Parent Care Program: receive one additional month of leave on top of the legal parental leave
- Enrollment in Doctolib's long-term employee value sharing plan called DoctoGrowth
- For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
- Relocation support in case of international mobility
- Access to best-in-class AI tools for coding and development, plus dedicated training
Interview process
- Recruiter Interview
- Technical Deep Dive
- System Design Interview
- Behavioral Interview
- At least one reference check
Job details
- Permanent position
- Tech stack: GCP, Python, SQL, NoSQL, vector databases, AI/ML
- Full-time
- Paris, France
- Hybrid work setup (up to 2 remote days per week)
- Start date: as soon as possible