At PeakMetrics, we stand at the forefront of narrative intelligence, harnessing the power of advanced machine learning to safeguard enterprises and government agencies from social media manipulation and narrative attacks. Our platform rapidly sifts through millions of unstructured, cross-channel media items, transforming them into actionable insights. By identifying adversarial online messaging, understanding its audience, and providing crucial context such as source credibility, we empower our customers to respond effectively to digital threats.
Our team tackles some of today's most intricate media challenges, from crisis management to thwarting state-sponsored disinformation campaigns. Our goal is unwavering: to shield our clients from the risks of social media manipulation and emerging narrative threats by detecting and remediating harmful online trends.
We are looking for an experienced and visionary Director of Data Engineering to lead the design, execution, and evolution of our data infrastructure and DevOps strategies. In this role, you will collaborate with the Data, Product, and Engineering teams to implement a cohesive vision for scalable, secure, and efficient systems, while building a high-performing team of data engineers and DevOps professionals. You will be pivotal in ensuring the seamless integration of data systems with business needs and in driving innovation across the organization. This role will begin hands-on, with the majority of your time spent building software, and will transition to a full-time leadership role as the data team expands.
Responsibilities
* Develop and execute the long-term vision for the organization’s data engineering and DevOps strategies.
* Collaborate with senior leadership to prioritize initiatives, set objectives, and define measurable outcomes.
* Stay ahead of industry trends, tools, and technologies to ensure competitive advantage and efficiency.
* Build, mentor, and lead a diverse team of data engineers and DevOps professionals.
* Foster a culture of innovation, accountability, and collaboration within the team.
* Establish best practices for performance management, career development, and skills growth.
* Oversee the design, development, and maintenance of scalable data pipelines, warehouses, and processing frameworks.
* Build the infrastructure and codebase required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
* Drive the implementation of best practices in data governance, quality, and security.
* Ensure the availability, reliability, and performance of data systems.
* Lead the adoption of modern DevOps methodologies to streamline CI/CD pipelines and deployment processes.
* Ensure robust monitoring, logging, and alerting systems are in place for all applications and data infrastructure.
* Partner with cross-functional teams, including product, analytics, and engineering, to understand and deliver on business needs.
* Present project updates, performance metrics, and strategic initiatives to leadership.
Qualifications
* 12+ years of experience, with a recent focus on data engineering, data infrastructure, or related roles, including 3+ years in leadership positions.
* Proven experience in designing, implementing, and operating data architectures, ETL processes, and data pipelines.
* Expertise in cloud platforms (AWS, Azure, or GCP) and cloud-native solutions.
* Strong understanding of data governance, security, and compliance standards (e.g., GDPR, HIPAA).
* Proficiency in programming with scripting languages (e.g., Python, Node.js, Bash).
* Experience with modern platform and infrastructure tools such as Kubernetes, Docker, Terraform, or CloudFormation.
* Track record of successfully managing and scaling high-performing technical teams.
* Experience with big data technologies (e.g., Kafka, Snowflake, Avro, and Protobuf).
* Familiarity with ML/AI infrastructure and frameworks.
* Hands-on experience with both relational (SQL) and non-relational (NoSQL) databases.
* Certifications in cloud platforms or DevOps practices are a plus.
* Experience maintaining data management systems and building new data pipelines from scratch.
* Comfort automating data flows with resilient Python code.
* Experience with Dagster or other data orchestration platforms such as Airflow.
* Strong knowledge of database architecture design and management for both structured and unstructured data.
* Extensive experience with SQL, SQLAlchemy with Postgres, Athena, Parquet, and Iceberg.
* Ability to define and communicate data architecture requirements while keeping current with data management best practices.