Senior Data Management Professional - Data Engineering - Physical Assets

Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock - from around the world. In Data, we are responsible for delivering this data, news and analytics through innovative technology - quickly and accurately. We apply problem-solving skills to identify innovative workflow efficiencies, and we implement technology solutions to enhance our systems, products and processes.
The Physical Assets Data Team maintains databases for physical assets such as renewable and conventional power plants, facilities, and storage projects globally. The team is currently working on a new, future-proof data model and workflow that can facilitate and accelerate coverage expansion for integrated use in downstream analysis across our customer groups (including governments, portfolio managers, corporations, equity analysts, etc.).
What's the role?
As a Data Engineer on the Physical Assets team, you will be expected to understand data requirements, specify the modeling needs of datasets, and use existing tech stack solutions for efficient data ingestion workflows and data pipelining. You will implement technical solutions using programming, machine learning, AI, and human-in-the-loop approaches to make sure our data is fit-for-purpose for our clients. You will work closely with our Engineering partners, our Data Product Manager and Product teams, so you need to be able to coordinate with multi-disciplinary, regional teams and have experience in project management and stakeholder engagement. You will need to be comfortable working with large, varied, sophisticated and often unstructured data sets, and you will need to demonstrate strong experience in data engineering.
We trust you to:
1. Build database schema and configure ETL software to onboard new data sets
2. Analyze internal processes to find opportunities for improvement, as well as devise and implement innovative solutions
3. Build quality data workflows to verify and validate third party data
4. Maintain workflow configurations for critical functions such as acquisition, worklist management, and quality control
5. Contribute to the creation of best practices and guidelines for governance
6. Partner with Engineering and Product to propose, develop and implement market leading solutions for our clients
7. Understand customer needs and markets to ensure our data sets are fit-for-purpose and seamlessly integrate with other data products when developing data product strategies
8. Stay updated on market, industry and dataset developments related to your area of support
9. Make well-informed decisions in a fast-paced, ever-changing environment
10. Report on results of on-going operations and projects, as required
You'll need to have:
*Please note: we use years of experience as a guide, but we will certainly consider applications from all candidates who are able to demonstrate the skills necessary for the role.
1. A BA/BS degree or higher in Computer Science, Mathematics, or a relevant data technology field, or equivalent professional work experience
2. 3+ years of programming and scripting in a production environment (Python, JavaScript, etc.)*
3. 3+ years of experience working with databases, either SQL or NoSQL*
4. Understanding and experience of large-scale, distributed systems as well as ETL logic
5. Strong passion for data and the overall energy transition movement
6. The ability to think creatively and provide out-of-the-box solutions, with an eagerness to learn and collaborate
7. Familiarity with data processing paradigms and associated tools and technologies
8. Exceptional problem-solving skills, numerical proficiency and high attention to detail
9. Excellent written and verbal communication skills, especially when explaining technical processes and solutions to business partners and management
10. Ability to work independently as well as in a distributed team environment
We'd love to see:
1. Track record of collaborating with Engineering to promote code to production (BREs or DTPs)
2. Knowledge of Machine Learning frameworks
3. Experience in conducting technical training and mentoring others
4. Proficiency and previous experience working with the Bloomberg tech stack, such as BBDS, BCOS, DFR, BRE, DTPs
5. Prior experience working with QlikSense (both visualizations and load scripting)