WHAT YOU'LL DO
- Work in multi-disciplinary environments, harnessing data to deliver real-world impact for organisations globally
- Partner with our clients, from data owners and users to C-level executives, to understand their needs and build impactful analytics solutions
- Design and build data pipelines to support data science projects following software engineering best practices
- Use state-of-the-art technologies to acquire, ingest and transform big datasets
- Map data fields to hypotheses; curate, wrangle and prepare data for use in advanced analytics models
- Create and manage data environments in the cloud or on premises
- Ensure information security standards are maintained at all times
- Contribute to cross-functional problem-solving sessions with your team and deliver presentations to colleagues and clients
- Be willing to travel to our clients' offices to deliver presentations, gather information or share knowledge
- Have the opportunity to contribute to R&D and internal asset development projects
OUR TECH STACK
While we advocate using the right tech for the right task, we often leverage the following technologies: Python, PySpark, SQL, Airflow, Databricks, our own open-source framework Kedro, container technologies such as Docker and Kubernetes, cloud platforms such as AWS, GCP or Azure, and more!
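For illustration only, here is a minimal sketch of the kind of acquire-ingest-transform step this stack supports. It uses standard PySpark calls; the bucket paths, dataset and column names are hypothetical, not part of any real project.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Ingest a raw dataset, apply light cleaning and typing, and write a
# curated, partitioned copy of it.
raw = spark.read.parquet("s3://example-bucket/raw/orders/")        # hypothetical path
curated = (
    raw.dropna(subset=["order_id"])                                # drop incomplete records
       .withColumn("order_date", F.to_date("order_ts"))           # derive a partition column
       .withColumn("amount", F.col("amount").cast("double"))      # enforce a numeric type
)
(curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-bucket/curated/orders/"))          # hypothetical path
```

In practice, a step like this might be wrapped as a Kedro node and orchestrated with Airflow or Databricks, but the exact structure varies from project to project.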