
Senior Data Engineer, Unified Platform at DRW
London · Full-time · UP - Platform Data · Posted 11 days ago
About the Role
<p><span style="font-size: 12pt;"><strong>DRW</strong> is a diversified trading firm with over three decades of experience bringing sophisticated technology and exceptional people together to operate in markets around the world. We value autonomy and the ability to quickly pivot to capture opportunities, so we operate using our own capital and trade at our own risk.</span></p>
<p><span style="font-size: 12pt;">Headquartered in Chicago with offices throughout the U.S., Canada, Europe, and Asia, we trade a variety of asset classes including Fixed Income, ETFs, Equities, FX, Commodities and Energy across all major global markets. We have also leveraged our expertise and technology to expand into three non-traditional strategies: real estate, venture capital and cryptoassets.</span></p>
<p><span style="font-size: 12pt;">We operate with respect, curiosity and open minds. The people who thrive here share our belief that it’s not just what we do that matters; it’s how we do it. <strong>DRW</strong> is a place of high expectations, integrity, innovation and a willingness to challenge consensus.</span></p>
<p><span style="font-size: 12pt;">As a <strong>Senior Data Engineer</strong> on our Data Experience team, you will play an integral role in bringing vendor datasets into our data platform, governing our centralized data pipelines, consolidating data delivery infrastructure across disparate systems, and working alongside individual Traders, Quantitative Researchers, and Back-Office personnel to best utilize the firm’s data and platform tools.</span></p>
<p><span style="font-size: 12pt;"><strong>Technical requirements summary:</strong></span></p>
<ul>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Have extensive experience designing and building data pipelines</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Have experience working within modern batch and streaming data ecosystems</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Be an expert in SQL and either Java or Python</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Have extensive knowledge of data modeling techniques</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Be able to own the delivery of data products, working with analysts and stakeholders to understand requirements and implement solutions</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Be able to contribute to project management and project reporting</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Have experience driving nontrivial data migrations across platforms</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Be able to establish, follow, and enforce data modeling standards across a large ecosystem</span></li>
</ul>
<h1><span style="font-size: 12pt;">What you will do in this role:</span></h1>
<ul>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Model, build, and manage data products built atop DRW’s Unified Data Platform.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Work closely with Data Strategists to determine appropriate data sources and implement processes that onboard and manage new datasets for trading, research, and back-office purposes.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Design data governance processes that enable discovery, cost-sharing, usage tracking, access controls, and quality control of datasets to address the needs of DRW trading teams and strategies.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Continually monitor data ingestion pipelines and data quality to ensure stability, reliability, and quality of the data. Design monitoring and quality control software and processes.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Own the technical aspects of vendor ingestion pipelines: coordinating with vendor relationship managers on upcoming changes, performing routine data operations without disrupting internal users, and contributing to the team’s on-call rotation to respond to unanticipated changes.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Rapidly respond to user requests, identifying platform gaps and self-service opportunities that make the user experience more efficient.</span></li>
</ul>
<h1><span style="font-size: 12pt;">What you will need in this role:</span></h1>
<ul>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">7+ years of experience working with modern data technologies and building data-first products.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Excellent written and verbal communication skills.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Proven ability to work in a collaborative, agile, and fast-paced environment, prioritizing multiple tasks and projects and efficiently handling the demands of a trading environment.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Proven ability to deliver rapid results within processes that span multiple stakeholders.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Strong technical problem-solving skills.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Extensive familiarity with SQL and Java or Python, with a proven ability to develop and deliver maintainable data transformations for production data pipelines.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Experience leveraging data modeling techniques and ability to articulate the trade-offs of different approaches.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Experience with one or more data processing technologies (e.g. Flink, Spark, Polars, Dask, etc.).</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Experience with multiple data storage technologies (e.g. S3, RDBMS, NoSQL, Delta/Iceberg, Cassandra, Clickhouse, Kafka, etc.) and knowledge of their associated trade-offs.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Experience with multiple data formats and serialization systems (e.g. Arrow, Parquet, Protobuf/gRPC, Avro, Thrift, JSON, etc.).</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Experience managing data pipeline orchestration systems (e.g. Kubernetes, Argo Workflows, Airflow, Prefect, Dagster, etc.), delving deeply into low-level details.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Proven experience in managing the operational aspects of large data pipelines such as backfilling datasets, rerunning batch jobs, monitoring event throughput, and handling dead-letter queues.</span></li>
<li style="font-size: 12pt;"><span style="font-size: 12pt;">Prior experience triaging data quality control processes and correcting data gaps and inaccuracies.</span></li>
</ul>
<p><span style="font-size: 12pt;"><strong>For more information about DRW's processing activities and our use of job applicants' data, please view our Privacy Notice at <a href="https://drw.com/privacy-notice">https://drw.com/privacy-notice</a>.</strong></span></p>
<p><span style="font-size: 12pt;"><strong>California residents, please review the California Privacy Notice for information about certain legal rights at <a href="https://drw.com/california-privacy-notice">https://drw.com/california-privacy-notice</a>.</strong></span></p>
<p><span style="font-size: 12pt;"><strong>[#LI-TL1]</strong> </span></p>