Senior Data Engineer
Waterstons Ltd
Internet, IT
Zürich
- Employment type: Full-time
- On-site
About this job
Job description
As a Data Engineer within our Quantitative Platform team, you will play a pivotal role in building and maintaining the data infrastructure that fuels our research and trading strategies. You will be responsible for the end-to-end lifecycle of diverse datasets - including market, fundamental, and alternative sources - ensuring their timely acquisition, rigorous cleaning and validation, efficient storage, and reliable delivery through robust data pipelines. Working closely with quantitative researchers and technologists, you will tackle complex challenges in data quality, normalization, and accessibility, ultimately providing the high-fidelity, readily available data essential for developing and executing sophisticated investment models in a fast-paced environment.
Your responsibilities will include:
- Evaluating, onboarding, and integrating complex data products from diverse vendors, serving as a key technical liaison to ensure data feeds meet our stringent requirements for research and live trading.
- Designing, implementing, and optimizing robust, production-grade data pipelines to transform raw vendor data into analysis-ready datasets, adhering to software engineering best practices and ensuring seamless consumption by our automated trading systems.
- Engineering and maintaining sophisticated automated validation frameworks to guarantee the accuracy, timeliness, and integrity of all datasets, directly upholding the quality standards essential for the efficacy of our quantitative strategies (a minimal illustration follows this list).
- Providing expert operational support for our data pipelines, rapidly diagnosing and resolving critical issues to ensure the uninterrupted flow of high-availability data powering our daily trading activities.
- Participating actively in team rotations, including on-call schedules, to provide essential coverage and maintain the resilience of our data systems outside of standard business hours.
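
To make the validation bullet above concrete, here is a minimal, hypothetical sketch of the kind of check such a framework might run. The column names, thresholds, and the validate_daily_prices helper are illustrative assumptions, not a description of our actual platform:

```python
# Hypothetical sketch, not production code: a minimal automated check of
# the kind a dataset validation framework might run. Column names and
# thresholds are illustrative assumptions.
import pandas as pd


def validate_daily_prices(df: pd.DataFrame) -> list[str]:
    """Return a list of validation failures; an empty list means the feed passed."""
    errors: list[str] = []

    # Completeness: the feed must carry the columns downstream consumers expect.
    required = {"symbol", "date", "close"}
    missing = required - set(df.columns)
    if missing:
        return [f"missing columns: {sorted(missing)}"]

    # Integrity: one row per (symbol, date); duplicates usually signal a bad merge.
    if df.duplicated(subset=["symbol", "date"]).any():
        errors.append("duplicate (symbol, date) rows")

    # Accuracy: close prices must be non-null and positive.
    bad_rows = df[~(df["close"] > 0)]
    if not bad_rows.empty:
        errors.append(f"{len(bad_rows)} rows with missing or non-positive close")

    # Timeliness: the latest date should be no older than the prior business day.
    latest = pd.to_datetime(df["date"]).max()
    cutoff = pd.Timestamp.today().normalize() - pd.offsets.BDay(1)
    if latest < cutoff:
        errors.append(f"stale feed: latest date is {latest.date()}")

    return errors
```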
Requirements
- 1+ years' experience building ETL/ELT pipelines using Python.
- Familiarity with technologies such as Amazon S3, Apache Kafka, Apache Airflow, and Apache Iceberg (see the pipeline sketch after this list).
- Proficiency working with large financial datasets from various vendors.
- A commitment to engineering excellence and pragmatic technology solutions.
- A desire to work in an operational role at the heart of a dynamic data-centric enterprise.
- Excellent communication skills and the ability to collaborate effectively within a team.
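
As a rough illustration of the stack item above, here is a minimal sketch of how an ETL/ELT pipeline might be wired together in Airflow. The DAG name, schedule, and task bodies are invented placeholders, not our actual pipelines:

```python
# Hypothetical sketch: a minimal Airflow DAG illustrating the
# extract-transform-load pattern referenced above. All names, paths,
# and task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    # A real task would pull raw vendor files, e.g. from an S3 bucket.
    print("downloading raw vendor data ...")


def transform() -> None:
    # Clean, normalize, and validate the raw data into analysis-ready form.
    print("normalizing and validating ...")


def load() -> None:
    # Publish the curated dataset, e.g. to an Apache Iceberg table.
    print("writing curated dataset ...")


with DAG(
    dag_id="vendor_prices_daily",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    (
        PythonOperator(task_id="extract", python_callable=extract)
        >> PythonOperator(task_id="transform", python_callable=transform)
        >> PythonOperator(task_id="load", python_callable=load)
    )
```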
What would be advantageous:
- Strong understanding of financial markets.
- Prior experience working with equity data and resolving associated challenges, including cross-reference management across multiple vendors, corporate action handling, and revision workflows.
- Experience working with hierarchical reference data models.
- Proven expertise in handling high-throughput, real-time market data streams.
- Familiarity with distributed computing frameworks such as Apache Spark (a brief sketch follows this list).
- Operational experience supporting real-time systems.
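
To illustrate the Spark item above, here is a minimal, hypothetical PySpark job of the kind such frameworks are used for: aggregating raw trade ticks into daily per-symbol bars. The paths, column names, and output layout are invented for the example:

```python
# Hypothetical sketch: a minimal PySpark job aggregating trade ticks into
# daily per-symbol bars. Paths and column names are invented placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_bar_aggregation").getOrCreate()

# Read a (hypothetical) partitioned dataset of raw trade ticks.
trades = spark.read.parquet("s3://example-bucket/trades/")

# Distributed aggregation: daily volume and price range per symbol.
daily = (
    trades
    .groupBy("symbol", F.to_date("event_time").alias("date"))
    .agg(
        F.sum("quantity").alias("volume"),
        F.min("price").alias("low"),
        F.max("price").alias("high"),
    )
)

daily.write.mode("overwrite").partitionBy("date").parquet(
    "s3://example-bucket/daily_bars/"
)
```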
