Project Overview:

The Data Engineering team builds the apps and infrastructure to leverage data from our vehicles and operations. We are building the next-generation data lake and pipelines as we scale our offerings. You will develop and implement the systems that power Luminar's data-driven improvement and growth.

Recruiter: Daniela Riabova
Responsibilities:
  • Designing, building, and maintaining the infrastructure that transforms data at scale into insights:
    • Ingestion of real-world vehicle data.
    • Automated labeling and data enrichment.
    • Generation of ground truth information.
    • Analysis of data quality.
  • Partnering with engineering and ML teams to define data consumption patterns and establish best practices.
  • Establishing robust data integrity and systems monitoring.
Requirements:
  • 4+ years of relevant industry experience:
    • Building backend / data services at scale.
    • Hands-on Big Data software and infrastructure development skills.
  • Experience with NoSQL databases and/or Relational databases.
  • Accomplished in Python.
  • Demonstrated ability to work independently.
  • Timezone flexibility: Willingness to sync with colleagues working in the Palo Alto office for up to 1-2 hours a day.
  • Strong communication skills and the ability to collaborate with the wider software organization.
  • Bachelor’s Degree (or higher) in Computer Science.
  • Experience with MongoDB.
  • Experience with pipeline / workflow frameworks: Argo Workflows or Airflow.
  • Hands-on experience with cloud computing: AWS / GCP / Azure / Alibaba Cloud.
  • Experience with automation of data infrastructure: Docker / Kubernetes / Terraform / Linux.
  • Experience with SRE practices (running services reliably in production environments):
    • Incident management and root cause analysis.
    • Performance engineering.
  • Previous experience in the Automotive / Robotics industry.
Nice to have:
  • Proficiency in front-end development, preferably using React.
  • Experience with processing frameworks like Spark / Hadoop.
  • Experience with big data / data warehouse tools such as Redshift, Presto/Athena, BigQuery, and Snowflake.
  • Experience with streaming and message queueing technologies: Kafka / Kinesis / RabbitMQ.
  • Hands-on experience with data storage and transfer formats such as Avro, Parquet, and Protobuf.
  • Experience with rosbags.
Higher Education:
  • Bachelor's Degree.
