Senior Platform Data Engineer
Flow Traders is seeking a seasoned Senior Platform Data Engineer to join its expanding Data team in Amsterdam. This role is a unique chance to become part of a prominent proprietary trading firm known for its entrepreneurial, forward-thinking culture. The firm particularly values sharp, innovative minds and offers the opportunity to put those skills to full use.
As a Platform Data Engineer at Flow Traders, your primary responsibility will be to maximize the value derived from the company's data. You will develop and enhance platform functionalities, such as self-service offerings for data platform users, improve resource efficiency, and deliver on various business requirements. Close collaboration with Quantitative Traders and Fundamental Analysts is key to ensuring data availability, performance, and accessibility through the selection of suitable technologies.
Platform capabilities include, among others, self-service ETL templates, job scheduling tools, user cluster provisioning, value-added services such as monitoring and alerting, configurable data quality rules, metadata management, and decisions on data modeling, storage, and formats. You will also propose and implement new platform capabilities and technologies.
Key Responsibilities
- Develop and maintain platform functionalities
- Build and maintain data infrastructure and data-focused applications
- Derive insights and ensure data is available, accessible, and performant for business needs
- Collaborate closely with other technology and business teams
- Assist with operational tasks related to the deployment and configuration of data infrastructure
- Mentor and guide junior team members
Qualifications for Success
- Proficiency in data modeling and data warehousing platforms, such as BigQuery
- Extensive experience in distributed storage, middleware (e.g., Kafka), data processing (e.g., Apache Beam, Spark), and container orchestration (e.g., Kubernetes)
- Hands-on expertise in Java and Python programming languages
- Understanding of data governance principles and processes
- Familiarity with modern data science techniques and tools like Pandas, TensorFlow, and SparkML, with exposure to AI and machine learning
- Sound knowledge of cloud computing, especially in storage, databases, data analytics, and processing, preferably within GCP and AWS environments