
Data Engineer


Storm4

2021-12-03 08:58:20

Job location: Aurora, Colorado, United States

Job type: Full-time

Job industry: HR / Recruitment

Job description

⚡ Data Engineer - Brighton, Colorado


The Role


The Data Engineer will be responsible for the design, implementation, and continuous improvement of the data pipeline that delivers product performance data, from both the test site and customer sites, to the Engineering teams quickly and accurately. This position plays an essential role in helping deliver a safety-critical system for industrial automation. The opportunity offers a talented, self-starting engineer the chance to help develop a market-defining enterprise product that combines autonomous vehicle technology with a software-as-a-service (SaaS) business model.


Duties and responsibilities


  • Design, implement, and improve all aspects of the data pipeline, from ingestion through analysis to visualization.
  • Work cross-functionally with other teams to ensure the data pipeline (including on the cloud) serves the various internal and external consumers of data.
  • Drive and advise the Engineering teams on how to approach Data Engineering and Analytics, and advocate for data-driven decision-making at all levels by providing clean, reliable data tools.
  • Work with the team to understand and prioritize development and sustainment projects, and drive toward a product that meets all customer, business, and technical expectations.


Required qualifications


  • 1 or more years' experience working with and integrating AWS tools such as Lambda, Athena, API Gateway, Batch, Glue, and RDS
  • Proficient with at least one general-purpose programming language, such as Python or Scala
  • Comfortable working with APIs to access data from a variety of sources
  • Experience with the software development life cycle, modern software/hardware/system testing methods, and continuous integration
  • Self-motivated and able to identify and fix issues before they become blockers
  • BS or MS in Engineering or Computer Science or a similar discipline
  • Excellent written and verbal communication skills
  • A strong analytical and curious mindset
  • Sterling references


Ideal qualifications


  • Experience with Spark/PySpark, Apache Arrow, and other big data tools
  • Experience with Docker
  • Experience creating cloud-based ETL workflows
  • Experience with infrastructure-as-code platforms
  • Experience with autonomous trucks or other automated distribution yard products
  • Experience in ROS and ROS bag format
  • Experience with C++ and Python


Sound like you? Please click the 'Easy Apply' button. You can also send your resume directly to or message me directly!

If you have any extra requirements to support your application, then please just add a note along with your CV to let us know.

