Software Engineer, Perception

Company: Agtonomy

Details of the offer

About Us

Agtonomy is pioneering advanced automation and AI solutions to transform agriculture and beyond. Initially focused on specialty crops, our TeleFarmer platform addresses labor-intensive needs with automation, turning conventional equipment into autonomous machines. By partnering with leading manufacturers like Doosan Bobcat, we integrate smart technology into tractors and other machinery, enhancing safety and efficiency. As we expand into ground maintenance and other industrial applications, our expert team continues to address key challenges in labor shortages, sustainability, and profitability across various industries.
About the Role

As a Perception / Machine Learning Engineer on the Autonomy Team, you will play a key role in solving challenging perception problems in outdoor vehicle automation. Leveraging your experience, you will implement state-of-the-art ML perception techniques to improve how Agtonomy's tractors perceive and understand the environments where they operate. You will work closely with embedded, localization, and planning engineers on the team to design and evolve the upstream and downstream interfaces of the perception system. This role is perfect for someone who loves implementing ML to tackle real-world problems and is excited to make robots perceive in rugged agricultural environments.
What You'll Do

- Applying machine learning to solve challenging perception problems for autonomous systems (e.g., object detection, semantic segmentation, instance segmentation, dense depth, optical flow, tracking).
- Driving the architecture, deployment, and performance characterization of our deep learning models.
- Refining and optimizing models for low-latency inference on embedded hardware.
- Designing and building cloud-based training and labeling pipelines.
- Collaborating with the hardware and embedded teams on sensor selection and vehicle packaging given safety requirements.
- Writing performant, well-tested software and improving code quality across the entire Autonomy team through code and design reviews.

What You'll Bring

- 5+ years of experience in software development for problems involving computer vision, machine learning, and robotic perception techniques.
- Foundational understanding of deep learning: model layer design, loss function intuition, and training best practices.
- Experience handling large datasets efficiently and organizing them for training and evaluation.
- Experience curating synthetic and real-world image datasets for training.
- Strong proficiency in modern C++ and Python, and experience writing efficient algorithms for resource-constrained embedded systems.
- Ability to thrive in a fast-moving, collaborative, small-team environment with lots of ownership.
- Excellent analytical, communication, and documentation skills, with a demonstrated ability to collaborate with interdisciplinary stakeholders outside of Autonomy.
- An eagerness to get your hands dirty by testing your code on real robots at real customer farms (gives "field testing" a whole new meaning!).

What Makes You a Strong Fit

- Experience architecting multi-sensor ML systems from scratch.
- Experience with compute-constrained pipelines: optimizing models to balance the accuracy vs. performance tradeoff, leveraging TensorRT, model quantization, etc.
- Experience implementing custom operations in CUDA.
- MS or PhD in Robotics, Computer Science, Computer Engineering, or a related field.
- Publications at top-tier perception/robotics conferences (e.g., CVPR, ICRA).
- Passion for sustainable agriculture and electric vehicles.

Salary and Benefits

The US base salary range for this full-time position is $160,000 to $220,000 + equity + benefits + unlimited PTO.
Benefits:
- 100% covered medical, dental, and vision for the employee (coverage for a partner, children, or family is at additional cost)
- Commuter Benefits
- Flexible Spending Account (FSA)
- Life Insurance
- Short- and Long-Term Disability
- 401k Plan
- Stock Options
- Collaborative work environment working alongside passionate, mission-driven folks!

Our interview process is generally conducted in five (5) phases:
1. Phone Screen with People Operations (30 minutes)
2. Video Interview with the Hiring Manager (45 minutes)
3. Coding Challenge and Technical Challenge (1 hour with an Autonomy Engineer)
4. Panel Interview (video interviews scheduled with key stakeholders; each interview will be 30 to 45 minutes)
5. Final Interviews (CEO, CFO, VP of Engineering; 30 minutes each)
