Data Engineer

Based in: Pécs – Király utca 42 / hybrid

Role Overview

As a Data Engineer, you will design, build, and maintain scalable data pipelines and architectures that power our analytical and operational systems.
You’ll work closely with Data Analysts, BI Developers, and Software Engineers to ensure data quality, performance, and scalability across all environments, from real-time streaming to batch processing.
You’ll be part of a modern data ecosystem leveraging AWS, Snowflake, Spark, Airflow, and Kafka, supporting mission-critical reporting and predictive analytics across the business.

Key Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines for ingesting and transforming large-scale data from multiple sources (betting, casino, player, finance, CRM, and external feeds).
  • Implement data models and schemas (Bronze/Silver/Gold layers) in Snowflake / Redshift / Postgres, ensuring data consistency and governance.
  • Work on real-time streaming and event-driven architectures using Kafka, Solace, or similar tools.
  • Optimize SQL and Python-based data transformations for performance and scalability.
  • Develop data validation, monitoring, and alerting frameworks (a minimal sketch follows this list).
  • Collaborate with BI developers to deliver Power BI and analytical datasets.
  • Partner with stakeholders across Trading, Finance, Compliance, and Risk to deliver high-impact data solutions.
  • Contribute to data quality standards, documentation, and CI/CD automation of pipelines (GitHub).
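To ground the validation and Bronze/Silver bullets above, here is a minimal Python (pandas) sketch of the kind of quality gate such a pipeline might apply when promoting raw records to a validated layer. The column names, rules, and the bronze_to_silver helper are illustrative assumptions only, not Amelco's actual schema or code.

import pandas as pd

def bronze_to_silver(bronze: pd.DataFrame) -> pd.DataFrame:
    """Promote raw (Bronze) bet records to a validated Silver layer.

    Hypothetical example: columns (bet_id, player_id, stake, placed_at)
    are illustrative, not a real Amelco schema.
    """
    silver = bronze.copy()

    # Enforce types coming off the raw feed; unparseable values become NaT/NaN.
    silver["placed_at"] = pd.to_datetime(silver["placed_at"], errors="coerce", utc=True)
    silver["stake"] = pd.to_numeric(silver["stake"], errors="coerce")

    # Basic quality gates: drop rows failing validation, then dedupe on the key.
    silver = silver.dropna(subset=["bet_id", "placed_at", "stake"])
    silver = silver[silver["stake"] > 0]
    silver = silver.drop_duplicates(subset="bet_id")
    return silver

if __name__ == "__main__":
    raw = pd.DataFrame([
        {"bet_id": "b1", "player_id": "p9", "stake": "25.0", "placed_at": "2024-05-01T10:00:00Z"},
        {"bet_id": "b1", "player_id": "p9", "stake": "25.0", "placed_at": "2024-05-01T10:00:00Z"},  # duplicate
        {"bet_id": "b2", "player_id": "p4", "stake": "-5", "placed_at": "2024-05-01T11:30:00Z"},    # invalid stake
    ])
    print(bronze_to_silver(raw))  # only the valid, deduplicated b1 row survives

In a production pipeline these checks would typically run as a step in an orchestrated DAG (e.g. Airflow) with alerting on failure rates, rather than as a standalone script.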

Skills & Experience

Essential

  • At least 3 years of experience as a Data Engineer or in a similar data-focused role.
  • Strong proficiency in SQL and experience with relational databases (PostgreSQL, Snowflake, Redshift, BigQuery, etc.).
  • Proven experience building ETL pipelines using Python (Pandas, PySpark, Airflow, or dbt).
  • Solid understanding of data warehousing principles and dimensional modelling.
  • Familiarity with cloud platforms (AWS, Azure, or GCP).
  • Strong problem-solving skills and ability to work in fast-paced, data-intensive environments.

Desirable

  • Experience with real-time data streaming (Kafka, Kinesis, or Pub/Sub).
  • Knowledge of sports betting or iGaming data models (bets, markets, transactions, GGR, etc.).
  • Exposure to data governance, data observability, or catalogue tools.
  • Experience with CI/CD for data (GitHub Actions, Terraform).
  • Familiarity with Power BI or other BI tools.

If this sounds interesting, please submit your CV in English using the Apply button.
