We are looking for an experienced Data Architect for our client in the technology sector. The person in this role will work with project teams developing a modern data platform in a cloud environment for global brands. They will work in an international environment and specialise in the architecture of cloud-based Big Data and Business Intelligence solutions using the latest technologies.
Apply and advance your professional career.
What we offer:
- Global projects in multiple clouds - working with clients from all over the world on modern cloud technologies
- Employment contract or B2B
- Certification reimbursement - funding for exams and Microsoft, AWS, Databricks, and Snowflake certifications
- Flexible approach - you can choose to work from home or meet at the office
- Personalised benefits - medical care, subsidised sports packages, language tuition, new employee referral bonus
- Learning time - 60 paid hours per year
Your responsibilities:
- Coming up with improvement initiatives in existing solutions and designing new solutions
- Coordinating architectural arrangements with Architects on the client side and other vendors
- Optimisation of the production process through process and tool changes
- Coordinating the work of a team of Data Engineers (responsible for the development of data platforms and ETL/ELT processes) and Data Analysts (responsible for the data model and report development) on the company side across multiple projects running in parallel for the same client; working closely with Project Managers
- Enforcing and improving platform development standards, as well as setting them (where gaps are identified)
- Ensuring quality of delivered solutions, conducting code review
- Working hands-on as a Data Engineer and Data Analyst (to stay in touch with the technology)
- Keeping up to date with current trends, tools, services in the Data area
Requirements:
- Minimum 5 years of experience in designing and building Business Intelligence, ETL/ELT, Data Warehouse, Data Lake, Big Data solutions
- Practical knowledge of various relational (e.g. SQL Server/SQL Database, Oracle, Redshift, PostgreSQL) and non-relational (e.g. MongoDB, CosmosDB, DynamoDB, Neo4j) database engines
- Very good knowledge of SQL and Python (min. 5 years' experience)
- Knowledge of process orchestration and data processing solutions, in particular: Spark/Databricks (including structured streaming mechanisms, Delta Lake, etc.), Snowflake, Azure Data Factory, Apache Airflow
- Understanding of the following areas: data governance, data quality, data visualisation
- Data modelling skills (Star schema / Lakehouse / Medallion / Data Vault / Data Mesh / Common Data Model / Corporate Data Model)
- Advanced skills in the use of git repositories (Bitbucket/GitHub)
- Familiarity and experience with data services offered by the Azure and/or AWS platforms
- Flexibility, self-reliance and efficiency in action, as well as responsibility for assigned tasks
- Practical knowledge of English at a level of min. B2 (C1+ preferred)
Employment agency entry number 47
This job offer is intended for people over 18 years of age.
...