What do we do?
Incentive Games is a dynamic and innovative company specialising in retention and acquisition services through free-to-play games. Our mission is to drive traffic and repeat visits to our clients' websites by rewarding the right users. We provide free-to-play games for leading companies in the gaming industry, such as FanDuel, LiveScore, and Bet365.
Job Description:
As a Data Engineer at Incentive Games, you will play a crucial role in our data team. Your primary responsibility will be to design, implement, and maintain robust data infrastructure and pipelines, ensuring efficient data processing and availability for analysis and decision-making.
Responsibilities:
1. Design, develop, and maintain scalable ETL pipelines using AWS cloud technologies, ensuring efficient data ingestion, processing, and storage.
2. Implement data quality checks, monitoring systems, and data governance practices, troubleshooting and resolving data-related issues in production environments.
3. Collaborate with Data Scientists and Analysts to understand data requirements and deliver high-quality, accessible datasets.
4. Develop and maintain data models that support business intelligence and analytics needs.
5. Participate in code reviews and contribute to best practices in data engineering.
6. Stay up to date with emerging technologies and industry trends in data engineering.
7. Document data architectures, pipelines, and processes for knowledge sharing and maintenance.
8. Implement and maintain DevOps/DataOps practices, including CI/CD pipelines, infrastructure as code, and monitoring systems.
Qualifications:
* 2-3 years of experience working as a Data Engineer or in a similar role focused on data infrastructure and management.
* Proven experience in building, developing, and maintaining data pipelines to support large-scale data processing and analytics.
* Proficiency in Python, with strong expertise in Pandas for data manipulation. Experience with frameworks and libraries such as PySpark or NumPy is a plus.
* Experience with cloud platforms is essential. AWS is preferred, but knowledge of Azure, Google Cloud Platform (GCP), or other major cloud providers is also highly valued.
* Excellent knowledge of SQL and a strong understanding of relational database architecture.
* Experience with DevOps/DataOps practices such as CI/CD pipelines, infrastructure as code, and monitoring tools is beneficial.
* Familiarity with real-time data processing and message-queueing technologies such as Amazon SQS is a plus.
* Experience with version control systems (e.g., Git) and familiarity with collaborative development workflows like branching strategies and pull requests.
* Strong problem-solving abilities with high attention to detail.
* Self-motivated, adaptable, and able to thrive in a collaborative, fast-paced environment.
What we offer:
* 4-day work week
* Flexible working (Hybrid or remote)
* Competitive salary and benefits package
* A dynamic and innovative work environment
* Opportunities for career growth and development
* A chance to work on world-class products and lead in our industry
* A flexible, informal work culture that values creativity and teamwork