4+ years of data engineering and data management experience, with a proven track record of delivering complex, production-grade data systems.
Strong proficiency in SQL and SQL optimization — including query tuning, indexing strategies, execution plan analysis, and data modeling in BigQuery and PostgreSQL.
Expert-level Python programming and scripting; experience with Go is a strong plus.
Deep expertise with Google Cloud Platform (GCP), including BigQuery, GCS, Composer/Airflow, Cloud Functions, Cloud Run, Pub/Sub, and Cloud SQL.
Proven experience building and operating event-driven and streaming data pipelines using Kafka or similar technologies (Aiven/Debezium experience a plus).
Strong understanding of modern data warehouse and lakehouse architectures, including multi-layered data modeling patterns (bronze/silver/gold or equivalent).
Infrastructure as Code (IaC) experience with Terraform to define, manage, and version cloud data infrastructure.
Solid understanding of Software Development Lifecycle (SDLC) best practices: CI/CD pipelines, automated testing, code review processes, version control, and deployment management.
Experience with data replication tools (e.g., Fivetran, Debezium) and understanding of Change Data Capture (CDC) patterns.
Ability to independently drive data architecture decisions, translate business requirements into source-to-target data mappings, and deliver working, maintainable solutions.
Strong communication skills; able to effectively collaborate with and educate both technical and non-technical stakeholders.
Experience mentoring engineers and leading technical initiatives within a team environment.