- Hybrid work model – Warsaw, Poland
- Stable, long-term cooperation within a global organization
- Opportunity to work with modern technologies including Apache Spark, Scala, and Big Data environments
- International, Agile work culture promoting continuous improvement and collaboration
- Design and develop scalable backend solutions to support global reporting systems
- Build and optimize Spark-based applications (Scala preferred) for high-performance data processing
- Create and maintain ETL/ELT data pipelines, covering extraction, validation, transformation, and delivery
- Integrate data from multiple internal and external sources within a microservices-based architecture
- Collaborate with Product Owners, Solution Architects, and cross-functional teams across regions
- Actively participate in Agile ceremonies, ensuring fast delivery and continuous feedback loops
- Contribute to CI/CD pipelines, focusing on automation, testing, and peer code reviews
- Document processes and solutions using Jira, Confluence, and ALM tools
- 3+ years of experience in backend or data engineering roles (ideally within financial services)
- Solid expertise in Apache Spark and Scala
- Proven experience developing ETL/ELT pipelines and data integration solutions
- Knowledge of APIs, microservices, and SQL-based databases
- Hands-on familiarity with version control and CI/CD tools (Git, GitHub, Jenkins)
- Bachelor’s degree in Computer Science or a related field
- English proficiency (min. B2+) for effective technical communication
- Experience with Python, Bash scripting, or workflow orchestration tools such as Airflow or Control-M
- Familiarity with cloud technologies (Azure, AWS, Databricks, Docker, Kubernetes, S3, EMR)
- Awareness of cybersecurity and quality assurance concepts (e.g. Sonar)
- Understanding of software development lifecycle tools (HP ALM)
- Strong analytical mindset, attention to detail, and proactive problem-solving attitude
Employment agency register entry number: 47
This job offer is intended for people over 18 years of age.
...