Senior Data Engineer
We are a prominent algorithmic trading and market-making firm specializing in digital asset liquidity provision. Operating from multiple locations worldwide, we handle a significant share of global daily trading volume and maintain connectivity to more than 50 venues.
Our reputation rests on a systematic approach, sophisticated pricing models, and cutting-edge execution capabilities that deliver resilient, dependable trading performance and foster liquidity in crypto markets worldwide.
What truly distinguishes us is our company culture. Our flat organizational structure gives you autonomy and ample opportunity to drive innovation, contribute ideas, and play a key role in shaping the systems that will underpin our future success.
Role Overview
We are seeking an experienced Data Engineer to manage and enhance our market and trading data archives as well as our internal data products. In this role, you will work with existing data pipelines and databases while spearheading the design and implementation of Auros' next-generation data and analytics capabilities. Your contributions will have a significant impact on our business outcomes as you work in a dynamic environment alongside our experienced trading team.
Responsibilities
- Design, test, and maintain distributed data architectures optimized for high throughput and volume
- Identify, analyze, and automate enhancements to data quality
- Develop and manage real-time data collectors for time series databases
- Enhance trading analytics systems
- Create tools for automating the setup, deployment, and troubleshooting of the data pipeline
- Devise strategies for optimizing the efficiency, reliability, and timeliness of our data pipeline within a 24/7 trading setting
- Establish monitoring mechanisms to ensure the accuracy and completeness of captured data (see the illustrative sketch after this list)
- Mitigate the impact of trading-system and protocol changes on the data pipeline
- Backfill historical datasets and ensure data cleanliness
- Collaborate with traders and trading system developers to understand data analysis requirements and enhance data quality
- Develop user-friendly tools, APIs, and interfaces for accessing archived data
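To give candidates a concrete feel for the data-quality and completeness work described above, here is a minimal, purely illustrative sketch in Python. The `TradeRecord` schema and `find_sequence_gaps` helper are hypothetical assumptions for this example and do not describe our actual pipeline.

```python
# Illustrative only: a minimal completeness check over captured trade records,
# assuming each record carries an exchange-assigned, per-stream sequence number.
# Schema and function names are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class TradeRecord:
    venue: str
    symbol: str
    seq_no: int   # exchange-assigned, monotonically increasing per stream
    ts_ns: int    # event timestamp in nanoseconds


def find_sequence_gaps(records: list[TradeRecord]) -> list[tuple[int, int]]:
    """Return (expected, received) pairs where a per-stream sequence number
    jumps, indicating records the collector may have dropped."""
    gaps: list[tuple[int, int]] = []
    last_seen: dict[tuple[str, str], int] = {}
    for rec in sorted(records, key=lambda r: (r.venue, r.symbol, r.seq_no)):
        key = (rec.venue, rec.symbol)
        prev = last_seen.get(key)
        if prev is not None and rec.seq_no != prev + 1:
            gaps.append((prev + 1, rec.seq_no))
        last_seen[key] = rec.seq_no
    return gaps


if __name__ == "__main__":
    sample = [
        TradeRecord("venue_a", "BTC-USD", 1, 1_700_000_000_000_000_000),
        TradeRecord("venue_a", "BTC-USD", 2, 1_700_000_000_100_000_000),
        TradeRecord("venue_a", "BTC-USD", 5, 1_700_000_000_300_000_000),
    ]
    print(find_sequence_gaps(sample))  # [(3, 5)] -> records 3 and 4 missing
```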
Requirements
- Proficiency in Python for data analysis and for building ad hoc tools to analyze time series and large datasets
- Experience building real-time, large-scale data pipelines that handle massive data volumes (preferred)
- Familiarity with distributed, high-performance SQL and NoSQL database systems
- Bachelor's degree or higher in Computer Science, Software Engineering, or a related field with outstanding academic achievement
Desired Skills
- Experience with data lakes built on Amazon S3 or similar object storage technologies
- Proficiency in C++ development on Linux
- Background in protocol-level network analysis
- Knowledge of Terraform
- Familiarity with Hive, Hadoop, Snowflake, Presto, or similar tools
