Responsibilities:
Contributing new software components both for live trading and research that scale effectively
Designing and extending data-intensive ETL pipelines
Profiling and optimizing the hot paths of the codebase for performance
Building and maintaining continuous integration pipelines
Enhancing the live system’s stability and observability via monitoring and alerting
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field
2+ years of experience with core Python and its mainstream big data / scientific libraries (NumPy, pandas, etc.)
Working experience with at least one strongly typed programming language (ideally C++)
Excellent communication and teamwork skills, with the ability to work in a fast-paced, collaborative, and geographically distributed environment
Nice to Have:
Experience with scripting languages such as Bash
Experience with Google Cloud Platform
Experience with workflow management and task scheduling tools