Project Overview:

As part of the central BI team, you will work closely with the BI Developer and the DWH BI Engineering team. You will contribute specifically to data modelling and data warehousing, and help scale the end-to-end BI process. You will also enable the team to ingest data from third-party sources, deploy scheduling pipelines, and contribute to our data infrastructure to support more effective analytics.

Responsibilities:
  • Work closely with Analysts and Data Scientists to understand business requirements and provide analysis-ready data;
  • Design high-performance, reusable, and scalable data models for our data warehouse to ensure our end-users get consistent and reliable answers when running their own analyses;
  • Write complex yet optimised data transformations in SQL using dbt (see the model sketch after this list);
  • Schedule data transformation and analysis pipelines using Airflow;
  • Continuously discover, transform, test, deploy and document data sources;
  • Apply, help define, and champion data warehouse governance: data quality, testing, coding best practices, and peer reviews (see the test sketch after this list);
  • Apply advanced aggregations and data wrangling techniques, such as imputation for predictive analytics (illustrated in the model sketch after this list);
  • Keep our data efficient and consistent at a global level, maintaining the warehouse environments across multiple regions.
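
To make the dbt responsibility concrete, below is a minimal sketch of a transformation model, written in Snowflake SQL with dbt's Jinja templating. The model and column names (stg_orders, order_value, region) are hypothetical, and the per-region median imputation is one example of the data wrangling mentioned above:

    -- models/marts/fct_orders.sql
    -- Hypothetical dbt model: all model and column names are illustrative.
    with orders as (
        select * from {{ ref('stg_orders') }}
    ),

    -- Impute missing order values with the per-region median
    -- (Snowflake's MEDIAN aggregate), a simple wrangling step.
    region_medians as (
        select region, median(order_value) as median_order_value
        from orders
        group by region
    )

    select
        o.order_id,
        o.region,
        o.ordered_at,
        coalesce(o.order_value, m.median_order_value) as order_value
    from orders o
    left join region_medians m on o.region = m.region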
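
On the governance side, dbt supports "singular" tests: plain SQL files under tests/ that fail if they return any rows. A sketch against the hypothetical model above:

    -- tests/assert_order_value_positive.sql
    -- The test fails if any row comes back.
    select order_id, order_value
    from {{ ref('fct_orders') }}
    where order_value is null
       or order_value <= 0
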
Requirements:
  • Hands-on experience with SQL, Snowflake, dbt, Apache Airflow, Python, Fivetran, AWS, Git, and Looker;
  • Strong hands-on data modelling and data warehousing skills, and a passion for the craft (the data warehouse is based on Snowflake, with transformations orchestrated through dbt);
  • Power-user expertise in at least one of these cloud technologies: Snowflake, AWS, Google Cloud, Microsoft Azure;
  • Solid experience with ETL and scheduling tools (e.g. Talend, Airflow).
Nice to have:
  • Experience in dimensional modelling, specifically designing, developing, and maintaining star and/or snowflake schemas (see the schema sketch after this list);
  • Experience with SDLC within a Data Warehouse context, including experience using Git and familiarity with Continuous Integration and Deployment;
  • Extensive experience in SQL query optimisation and troubleshooting (see the query sketch after this list);
  • Experience with data visualisation tools and packages (e.g. Looker, Tableau, matplotlib);
  • Strong attention to detail to highlight and address data quality issues;
  • Proven project management skills: roadmapping, prioritisation, and stakeholder management;
  • Strong communication skills for working with non-technical stakeholders, and an excellent understanding of the commercial value and monetisation of data;
  • Ability to deal with ambiguity and competing objectives in a fast-paced environment.
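
As a sketch of the star schema item above, a minimal dimensional model in Snowflake SQL: one fact table carrying surrogate foreign keys into two dimension tables. All table and column names are illustrative, not the actual warehouse schema:

    -- Hypothetical star schema: a fact table plus two dimensions.
    create table dim_customer (
        customer_key   integer identity primary key,  -- surrogate key
        customer_id    varchar,                       -- natural key from source
        customer_name  varchar,
        region         varchar
    );

    create table dim_date (
        date_key   integer primary key,  -- e.g. 20240131
        full_date  date,
        year       smallint,
        month      smallint
    );

    create table fct_orders (
        order_key     integer identity primary key,
        customer_key  integer references dim_customer (customer_key),
        date_key      integer references dim_date (date_key),
        order_value   number(12, 2)
    );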
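
And for the query optimisation item, one common rewrite against that hypothetical schema: aggregate the large fact table first and only then join the small dimension, so far fewer rows flow through the join:

    -- Aggregate fct_orders before joining dim_customer, instead of
    -- joining first and aggregating the much larger joined result.
    select
        c.region,
        sum(agg.total_value) as total_value
    from (
        select customer_key, sum(order_value) as total_value
        from fct_orders
        where date_key between 20240101 and 20240131  -- prune early
        group by customer_key
    ) agg
    join dim_customer c on agg.customer_key = c.customer_key
    group by c.region;
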
Higher Education:
  • Bachelor’s or Master’s degree.
