Who we are

Gatik, the leader in autonomous middle-mile logistics, delivers goods safely and efficiently using its fleet of light- and medium-duty trucks. The company focuses on short-haul, B2B logistics for Fortune 500 customers including Kroger, Walmart, Tyson Foods, Loblaw, Pitney Bowes, Georgia-Pacific, and KBX, enabling them to optimize their hub-and-spoke supply chain operations, enhance service levels and product flow across multiple locations, reduce labor costs, and meet unprecedented expectations for faster deliveries. Gatik’s Class 3-7 autonomous box trucks are commercially deployed in multiple markets including Texas, Arkansas, and Ontario, Canada.

About the role

We are looking for a skilled and experienced Software Engineer to join our AV Infrastructure & DataOps team, focusing on critical automation tools and pipelines for our autonomous vehicle software stack. This team is at the forefront of our efforts to streamline and optimize the development, validation, and deployment of Gatik’s autonomous vehicle software. As a Data Engineer, you will be responsible for developing and maintaining scalable data pipelines, ensuring efficient data collection, processing, and storage. You will optimize workflows, maintain data security and compliance, and support the data needs of the organization. This role requires strong Python skills, experience with data storage solutions and processing frameworks, and familiarity with cloud platforms. Strong problem-solving, communication, and collaboration abilities are essential.
As a member of our team, you'll have the opportunity to work on cutting-edge technologies and contribute to groundbreaking projects that are shaping the future of transportation. If you're passionate about automation, innovation, and making a meaningful impact, we invite you to join us on this exciting journey. This role is onsite 5 days a week at our Mountain View, CA office!

What you'll do

  • Create, optimize, and maintain scalable data pipelines to collect, process, and store large volumes of data from various sources.
  • Implement ETL (Extract, Transform, Load) processes to ensure data quality and integrity.
  • Design and implement data architectures that support efficient data storage, retrieval, and analysis.
  • Integrate data from multiple sources, including APIs, databases, and third-party services.
  • Optimize data processing workflows for maximum efficiency and speed.
  • Implement techniques such as data partitioning and indexing to improve query performance.
  • Ensure data security, privacy, and compliance with relevant regulations.
  • Implement access controls, encryption, and other security measures.
  • Work closely with data scientists, analysts, and other stakeholders to understand data needs and provide necessary support.
  • Develop tools and frameworks to enable data analysis and reporting.
  • Monitor the performance and health of data systems and pipelines.
  • Troubleshoot and resolve issues to ensure data availability and reliability.
  • Maintain comprehensive documentation of data systems, pipelines, and processes.
  • Ensure documentation is up to date and accessible to relevant team members.
  • Stay updated on the latest data engineering tools and technologies.
  • Evaluate and recommend new tools and technologies to improve data engineering processes.

What we're looking for

  • 5-7 years of industry experience in data engineering or related roles.
  • Strong proficiency in Python, with experience in developing data pipelines and ETL processes.
  • Experience with data storage solutions (e.g., SQL, NoSQL, cloud storage).
  • Proficiency with data processing frameworks (e.g., Apache Spark, Apache Beam).
  • Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud).
  • Strong problem-solving skills and the ability to troubleshoot complex data issues.
  • Ability to optimize data workflows for performance and scalability.
  • Excellent communication and collaboration skills.
  • Ability to work effectively in a fast-paced, dynamic environment.
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Bonus

  • Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
  • Familiarity with data visualization and business intelligence tools (e.g., Tableau, Power BI).
  • Knowledge of machine learning and data science concepts.

More about Gatik

With headquarters in Mountain View, CA and offices in Canada, Texas, and Arkansas, Gatik is establishing new standards of success for the autonomous trucking industry every day. Visit us at Gatik for more company information and Jobs @ Gatik for more open roles.
Taking care of our team

At Gatik, we connect people of extraordinary talent and experience to an opportunity to create a more resilient supply chain and contribute to our environment’s sustainability. We are diverse in our backgrounds and perspectives yet united by a bold vision and a shared commitment to our values. Our culture emphasizes the importance of collaboration, respect, and agility.

We at Gatik strive to create a diverse and inclusive environment where everyone feels they have opportunities to succeed and grow, because we know that together we can do great things. We are committed to an inclusive and diverse team. We do not discriminate based on race, color, ethnicity, ancestry, national origin, religion, sex, gender, gender identity, gender expression, sexual orientation, age, disability, veteran status, genetic information, marital status, or any legally protected status.

Location

Mountain View, CA

Job Overview

Job Posted: 5 months ago
Job Type: Full Time
