Data Engineer
About OKX
OKX is a leading cryptocurrency trading platform and Web3 ecosystem with a strong presence in over 180 international markets. Trusted by more than 20 million customers worldwide, OKX is known for its speed and reliability and is popular among investors and professional traders on a global scale.
Since its establishment in 2017, OKX has fostered a diverse community of individuals passionate about participating in a new financial system that promotes fairness and inclusivity. We are committed to educating people about the potential of cryptocurrency markets and guiding them to make informed investment decisions. In addition to our flagship OKX trading app, we are excited to introduce MetaX, our Web3 wallet, for those interested in exploring NFTs and the metaverse and in trading GameFi and DeFi tokens.
About the Data Team
The OKX data team plays a pivotal role in managing the full data scope of OKG, handling technical selection, architecture design, data ingestion, data storage, ETL, data visualization, business intelligence, and data science. As data engineers, data analysts, and data scientists, we take end-to-end ownership of OKX's data lifecycle, from ingestion and ETL through data warehousing and data services. As a data engineer, you will collaborate with the team to leverage data technologies that drive evidence-based decision-making, enhance the quality of our products and services, and support the company's growth.
Responsibilities:
- Develop and implement robust and efficient data pipelines for both batch and real-time streaming data.
- Design data infrastructure on cloud platforms using industry-standard tools.
- Execute projects with an Agile mindset, ensuring timely delivery and quality outcomes.
- Create software frameworks to tackle complex data challenges at scale.
- Collaborate closely with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools.
- Maintain data integrity and scalability by enforcing data standards, improving data validation and monitoring processes, and proactively addressing any issues that arise.
- Identify and explore opportunities, both internal and external, to enhance our products and services through data-driven insights and improvements.
Requirements:
- Bachelor's Degree in Computer Science or equivalent professional experience.
- Strong proficiency in data processing tools like Spark and Flink.
- Extensive experience in building and implementing batch and streaming data pipelines.
- Solid programming skills in Python, Go, Scala, or Java.
- In-depth knowledge of SQL and NoSQL databases, including performance tuning and troubleshooting.
- Familiarity with DevOps tools such as Git, Docker, and Kubernetes.
- Experience with cloud platforms like AWS, Ali Cloud, GCP, or Azure.
- Strong command of SQL, including advanced features such as window functions, aggregate functions, and scalar and user-defined functions.
- Proven track record of delivering end-to-end data solutions involving data ingestion, persistence, extraction, and analysis.
- Self-motivated, innovative, and collaborative with excellent communication and presentation skills.
- Fluent in written and spoken English.
Preferred Qualifications:
- Experience in FinTech, eCommerce, SaaS, AdTech, or Digital Wallet industries.
- Ability to work effectively with geographically dispersed teams across different time zones.
- Familiarity with analytics and big data tools such as Amplitude, Tableau, QlikView, Ali Cloud DataWorks, MaxCompute, Hadoop, Hive, Spark, and HBase is highly desirable.