Senior Data Engineer
Overview
As a Senior Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure that powers our decentralized data platform. This includes building and optimizing data pipelines, designing and implementing data storage solutions, and ensuring data quality and integrity. You will work closely with our data scientists, developers, and product teams to understand and solve complex data challenges.
The ideal candidate will have a strong background in big data technologies and experience working with distributed systems. They should also have excellent programming skills and be able to work in a fast-paced, collaborative environment. If you have a passion for working with data and an interest in decentralized technologies, we would love to hear from you.
Responsibilities
Design, build, and maintain the data infrastructure, including data storage and processing systems, data pipelines, and data integration.
Develop and implement data pipelines that handle large-scale data processing and analytics with high throughput.
Collaborate with other members of the technical team to gather and analyze requirements, and translate them into technical solutions.
Optimize the data infrastructure for performance and scalability.
Monitor and troubleshoot issues related to the data infrastructure and data pipelines.
Ensure compliance with relevant industry standards and regulations.
Stay current with new developments in data technology and the broader industry, incorporating them into project solutions as appropriate.
Create technical documentation, including architecture diagrams, technical specifications, and user manuals.
Provide technical guidance and mentorship to other members of the team, collaborating across teams as needed.
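As a hedged illustration of the pipeline and data-quality work described above (not a description of the team's actual stack), the extract-transform-load pattern can be sketched with the Python standard library; the table and column names here are hypothetical:

```python
# Minimal ETL sketch: extract raw CSV, validate/transform rows, load into a store.
# Uses only the standard library; "payments", "user_id", and "amount" are
# illustrative names, not part of any real schema.
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and drop malformed records (basic data quality)."""
    clean = []
    for row in rows:
        try:
            clean.append((row["user_id"], float(row["amount"])))
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return clean

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write validated rows into the target store; return row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

raw = "user_id,amount\nu1,10.5\nu2,not_a_number\nu3,3.0\n"
conn = sqlite3.connect(":memory:")
count = load(transform(extract(raw)), conn)  # malformed row dropped -> 2 rows loaded
```

In production this pattern typically runs on an orchestrator and distributed engine rather than in-process, but the extract/transform/load boundaries remain the same.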
Qualifications
Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field.
8+ years of experience in data engineering, data integration, or a related field.
Strong experience with data storage and processing systems such as RDBMS, NoSQL databases, and data warehousing.
Strong understanding of data pipelines, data integration, and ETL processes.
Strong experience with SQL and with programming languages such as Python and Java.
Experience with cloud infrastructure, including AWS, Azure, or GCP.
Strong analytical and problem-solving skills.
Strong experience with monitoring and troubleshooting data infrastructure and data pipelines.
Experience with creating technical documentation, including architecture diagrams, technical specifications, and user manuals.
