About Solidus Labs

At Solidus, we are shaping the financial markets of tomorrow by providing cutting-edge trade surveillance technology that protects investors, enhances transparency, and ensures regulatory compliance across traditional financial assets and crypto markets. With over 20 years of experience in developing Wall Street-grade FinTech, our team delivers innovative solutions that financial institutions and regulators worldwide rely on to detect, investigate, and report market manipulation, financial crime, and fraud. Headquartered on Wall Street, with offices in Singapore, Tel Aviv, and London, we safeguard millions of retail and institutional entities globally, monitoring over a trillion events each day.

The Role

We're looking for a strong Senior Data Engineer with experience building robust, scalable, maintainable, and thoroughly monitored data pipelines in cloud environments. As a young and ambitious company in an extremely dynamic space, we pride ourselves on being independent, accountable, and organized. The ideal candidate has a self-starter attitude and is willing to get their hands dirty with day-to-day work that might fall outside their official scope, while keeping an eye on their goals and the big picture.

Responsibilities:
- Tackle data duplication, velocity, schema adherence (and schema versioning), high availability, data governance, and more.
- Design, develop, and maintain end-to-end ETL workflows, including data ingestion and transformation logic across different data sources.
- Enrich financial data through third-party data integrations.
- Develop and maintain our data pipeline, written mostly in Java and running on K8S in a microservice architecture.
- Plan and communicate integrations with other teams that consume the data and use it to create insights.
- Continuously improve the way data is stored and served.
- Improve queries and data formats to ensure the data is optimized for consumption by a variety of clients.

Requirements:
- BSc in Computer Science from a top university, or equivalent.
- 8+ years in data engineering and data pipeline development in high-volume production environments, including at least 2 years of experience with Java and 2 years of experience with monitoring systems (Prometheus, Grafana, Zabbix, Datadog).
- Experience working in the fintech or trading industries.
- Experience in object-oriented development.
- Strong software engineering foundations.
- Experience with data-engineering cloud technologies such as Apache Airflow, K8S, ClickHouse, Snowflake, Redis and other caching technologies, and Kafka.
- Experience with relational and non-relational databases.
- Proficiency in SQL and query optimization.
- Experience designing infrastructure to maintain high-availability SLAs.
- Experience monitoring and managing production environments.
- Curiosity and the ability to work independently and proactively identify solutions.
- Strong communication skills.