Project Overview:

We are seeking a Staff Data Engineer to build and scale the data storage and processing pipeline for the world’s best long-range perception solution. You will work closely with the entire software team to ensure our data infrastructure scales to support a global team and continually supplies all stakeholders with actionable data to accelerate software development and machine learning modeling.

The product consists of a 120° FoV high-resolution lidar sensor (hardware) and perception packages that segment the point cloud delivered by the lidar into interpretable structures such as vehicles, pedestrians, and free space (software). The project includes systems and safety development (functional safety, safety of the intended functionality, and cybersecurity).

Recruiting Lead: Kateryna Lespukh
Responsibilities:
  • Implementation of large-scale data infrastructure and services;
  • Quality control of data infrastructure and data service development;
  • Management of data pipelines used to train and evaluate machine learning models;
  • Automation of data infrastructure;
  • Implementation of data transformation and streaming services;
  • Site reliability of data services.
Requirements:
  • 7+ years of industry experience;
  • Expert-level Python coding abilities;
  • Experience working with autonomous and/or ADAS vehicle data, or enterprise big data experience;
  • Have shipped and operated big data systems in production environments;
  • Experience in scalability of data systems;
  • Expert knowledge in relational, NoSQL and distributed data stores;
  • Hands-on big data software and infrastructure development skills;
  • Hands-on experience with data storage formats and schemas such as Avro and Parquet;
  • Enjoy working within a dynamic and continuously evolving environment;
  • Able to work successfully in cross-functional teams, especially across organizational and geographical boundaries;
  • Demonstrate a high degree of software craftsmanship and the ability to own a complex system end to end.
Nice to have:
  • Strong coding skills in one of the following: C++, Java, Go, or Rust;
  • Knowledge of machine learning modeling best practices;
  • Experience working with data technologies such as Hadoop, Spark, Airflow, and Kibana/Elastic;
  • Experience working with ROS;
  • Operational experience with on-prem and cloud data systems;
  • Experience with large-scale ingestion architectures;
  • Hands-on experience in data streaming technologies;
  • 3D perception modeling experience;
  • High-energy personality with a strong and demonstrable work ethic;
  • Demonstrate out-of-the-box thinking to develop creative solutions to challenging problems.
