Big Data Engineer (m/f)

Requirements:
  • Bachelor's or Master's degree in Information Systems/Engineering, Computer Science and Management, or a related field
  • Proficiency in modelling and maintaining Data Lakes, preferably with PySpark
  • Experience in Big Data technologies (e.g. Databricks)
  • Ability to model and optimize workflows (e.g. Azure Data Factory, Apache Airflow)
  • Experience with Streaming Analytics services (e.g. Kafka, Grafana)
  • Analytical, innovative and solution-oriented mindset
  • Teamwork, strong communication and interpersonal skills
  • Rigour and organisational skills
  • Fluency in English (spoken and written)
 
We also value:
  • Ability to implement APIs and custom connectors
  • Knowledge of automation services (e.g. Terraform, Azure DevOps, AWS CodeBuild, Jenkins)
  • Knowledge of visualization technologies (e.g. Microsoft PowerBI, Looker)
  • Cloud Certifications
  • Experience with Agile methodologies
 
What can you expect from us?
Integration into a professional, dynamic and constantly growing team that:
- Values your professional and personal growth and offers you a range of training courses;
- Has a strong international presence and can offer you experience abroad;
- Is always by your side, supporting and valuing you every step of the way.