Data Engineer (m/f) - Remote work possible

REQUIREMENTS:
  • Bachelor’s Degree in Engineering, Computer Science, or a related discipline
  • Minimum 2 years’ experience
  • Experience building and optimizing data pipelines, architectures, and data sets.
  • Experience performing root cause analysis on data pipelines and identifying opportunities for improvement.
  • Experience with object-oriented/functional/scripting languages: Python, Java, Scala (mastery of at least one).
  • Good knowledge of SQL (mandatory).
  • Experience with Apache Spark (e.g., EMR, Databricks, or HDInsight) (mandatory).
  • Knowledge of Unix shell scripting (e.g., Bash).
  • Experience with Airflow (nice to have, not mandatory).
  • Knowledge of Azure Cloud Services, Azure Data Lake, Azure Databricks, networking, and security (nice to have, not mandatory).
  • Knowledge of Docker and Kubernetes (nice to have, not mandatory).
  • Familiarity with stream-processing systems such as Apache Kafka and Apache Spark Streaming (nice to have, not mandatory).
  • Experience with big data tools such as Apache Sqoop, Apache NiFi, and Apache Kafka (nice to have, not mandatory).
 
Behavioral characteristics:
  • Passion for what you do and for building things that are useful
  • Eagerness to be part of a large organization with a start-up spirit
  • Respect for your team members’ ideas, combined with a willingness to share your own
  • Curious and a true team player, eager to learn something new every day
  • Good communication skills
  • An analytical mind with problem-solving abilities
  • Positive attitude
 
What can you expect from us?
Integration into a professional, dynamic, and constantly growing team that:
- Values your professional and personal growth and offers you a range of training courses;
- Has a strong international presence and can offer you experience abroad;
- Is always by your side, supporting and valuing you every step of the way.