- Bachelor's degree in Computer Science or Software Engineering
- A specialization in data science or a higher degree is a big plus.
- At least 2 years of experience building data platforms and pipelines for analytics.
- Proficiency in at least two programming languages such as SQL, Python, Java, or Scala
- Experience with Hadoop and Spark
- Experience with cloud services (AWS, GCP, Azure)
- Experience with different database/data warehouse systems: MongoDB, PostgreSQL, BigQuery, etc.
- Experience with data pipeline and workflow management tools: Airflow, Cloud Composer, dbt, Airbyte
- Knowledge of data visualization tools such as Metabase, Tableau, Looker, etc.
- Knowledge of stream processing platforms is a plus
- Exposure to emerging open-source technologies.
Why it would be awesome to work with us
- We collect, analyze, and integrate data into every corner of the enterprise. The data team is the backbone of our company. Just like the "on-demand delivery" segment, our data team provides "on-demand technologies" to serve our fast-changing business needs and evolving market.
- Our data team has the freedom to try many up-to-date solutions, tools, and technologies such as cloud services, big data distributed systems, machine learning models…
- We have created a range of data positions to solve problems that are fun, challenging, and meaningful.