- Lead the development of the Data Infrastructure
- Work with the Latest Technologies
- Attractive salary & benefits
Our client is a venture capital firm that invests in startups with a significant impact on society. They are committed to helping these companies reach the next level, drawing on a unique perspective gained from investing in startups across 9 countries. With a wide range of startups in their portfolio, they are seeking talented individuals for roles in their portfolio companies.
You will be responsible for:
- Design, develop, and support data pipelines, warehouses, and reporting systems that solve business, operations, user, and product problems.
- Create extract, transform, and load (ETL) pipelines and reporting systems for new data using both traditional and large-scale distributed data systems.
- Collaborate with and influence User and Product stakeholders and support engineers to ensure our data infrastructure meets constantly evolving requirements.
- Work closely with analysts to produce various statistical and machine learning models using data processing pipelines.
- Continuously learn emerging technologies and evaluate them for adoption in order to keep improving the technology stack we use.
The ideal candidate will have:
- Bachelor's degree in Computer Science, related technical field or equivalent practical experience.
- Experience with one general purpose programming language (e.g., Java, C/C++, Python).
- Experience in data processing using traditional and distributed systems (e.g., Hadoop, Spark, Dataflow, Airflow).
- Experience designing data models and data warehouses and using SQL and NoSQL database management systems.
- You possess strong analytical skills and are comfortable dealing with numerical data.
- You pay strong attention to detail and deliver work that is of a high standard.
- You are a strong team player who can manage multiple stakeholders.