What are the responsibilities and job description for the Perception Engineer position at Stealth Mode?
An Aerospace & Defense startup currently in stealth, based in El Segundo and made up of former SpaceX, Anduril, and Lockheed engineers, applying the SpaceX model to mass-produce life-saving systems. Backed by joint-staff-level military leadership and several major Silicon Valley firms, their first product is designed for immediate deployment around the world, starting with Ukraine, where the need is critical.
Job Overview
We are seeking a Perception Engineer with expertise in sensor fusion and computer vision to develop target detection, tracking, and classification capabilities for a variety of systems. This role requires a deep understanding of optical, infrared, and radar-based perception, as well as geometric vision, image processing, and real-time tracking algorithms.
The ideal candidate will have strong experience in sensor modeling, multi-modal data fusion, and computational imaging techniques that enhance target recognition and tracking under real-world conditions, and will use machine learning classification techniques alongside physics-based approaches to ensure high reliability and robustness in contested environments.
Key Responsibilities
- Develop and implement computer vision algorithms for target detection, tracking, and classification using EO/IR, radar, and LIDAR sensor data.
- Design multi-sensor fusion algorithms to improve detection accuracy and reduce ambiguity in real-world scenarios.
- Utilize geometric vision techniques (e.g., epipolar geometry, 3D reconstruction, optical flow, structure from motion) to enhance tracking and target identification.
- Develop and refine atmospheric and environmental compensation models to improve EO/IR sensor performance in adverse conditions (e.g., fog, smoke, turbulence, low-light).
- Implement real-time image processing techniques, including contrast enhancement, noise reduction, feature extraction, and edge detection, to improve sensor data fidelity.
- Develop AI/ML models for friend/foe classification and target type identification (e.g., vehicle, personnel, aircraft).
- Design object tracking algorithms based on state-estimation methods such as Kalman filtering and particle filtering, as well as optical flow-based tracking.
- Model target kinematics and dynamics to enhance predictive tracking and classification.
- Work with hardware engineers to integrate real-time perception algorithms into embedded systems and ensure low-latency processing.
- Perform simulation-based testing and field validation using high-fidelity sensor models and real-world datasets.
- Optimize algorithms for real-time performance on low-SWaP (Size, Weight, and Power) platforms such as FPGA or GPU-accelerated embedded systems.
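As a flavor of the tracking work described above, here is a minimal sketch of a constant-velocity Kalman filter following a 2D target from noisy position detections. All parameters (time step, noise levels, target motion) are illustrative assumptions, not specifics of the role.

```python
import numpy as np

dt = 0.1
F = np.array([[1, 0, dt, 0],      # state transition: constant-velocity model
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],       # measurement model: position is observed directly
              [0, 1, 0, 0]], dtype=float)
Q = 0.05**2 * np.eye(4)           # simplified process noise
R = 0.1**2 * np.eye(2)            # measurement noise (10 cm std)

x = np.zeros(4)                   # state [x, y, vx, vy], initially unknown
P = 10.0 * np.eye(4)              # large initial uncertainty

rng = np.random.default_rng(0)
for k in range(1, 51):
    truth = np.array([2.0, 1.0]) * k * dt        # target moving at (2, 1) m/s
    z = truth + rng.normal(0.0, 0.1, size=2)     # noisy detection
    # Predict step
    x = F @ x
    P = F @ P @ F.T + Q
    # Update step
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P

print("estimated velocity:", x[2:])  # converges toward the true (2, 1) m/s
```

In practice the motion model, gating, and data association would be considerably richer, but the predict/update structure shown here is the core of the filtering approaches the role calls for.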
Qualifications
- M.S. or Ph.D. in Computer Science, Electrical Engineering, Applied Physics, Aerospace Engineering, or a related field.
- 3 years of experience in computer vision, sensor fusion, and real-time image processing for defense, robotics, or aerospace applications.
- Strong background in optics, infrared imaging, radar signal processing, and multispectral/hyperspectral imaging.
- Experience with physics-based image formation models, including camera calibration, radiometric corrections, and optical distortion modeling.
- Expertise in object tracking techniques, including motion models, optical flow, and deterministic state estimation.
- Proficiency in programming languages such as C++, Python, and MATLAB for real-time applications.
- Hands-on experience with embedded systems and real-time optimization for edge computing platforms.
- Experience integrating multiple sensing modalities (EO, IR, LiDAR, radar, RF) into a cohesive perception system.
- Strong mathematical foundation in linear algebra, signal processing, machine learning, and probabilistic estimation.
Preferred Qualifications
- Experience in missile guidance, UAV autonomy, or defense-related sensor systems.
- Understanding of signature-based target classification methods, including thermal and radar cross-section (RCS) analysis.
- Hands-on experience with ROS (Robot Operating System) for perception and sensor integration.
- Familiarity with synthetic aperture radar (SAR) processing and passive RF-based target detection.
- Familiarity with synthetic data generation and simulation tools such as CARLA, Gazebo, or Unreal Engine.
- Experience with numerical modeling of sensor behavior and adaptive optics techniques.
- Background in high-speed image processing and real-time embedded vision system design.
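One building block of the multi-modal integration mentioned above is combining independent position estimates from different modalities. The sketch below fuses a hypothetical EO/IR track with a hypothetical radar track by inverse-covariance (information-form) weighting; the sensor values and covariances are purely illustrative.

```python
import numpy as np

def fuse(estimates):
    """Fuse independent (mean, covariance) estimates by inverse-covariance weighting."""
    info = sum(np.linalg.inv(P) for _, P in estimates)            # total information
    mean = np.linalg.inv(info) @ sum(np.linalg.inv(P) @ m for m, P in estimates)
    return mean, np.linalg.inv(info)

# Illustrative tracks: a tight EO/IR position estimate and a coarser radar one
eo_ir = (np.array([10.2, 5.1]), np.diag([0.04, 0.04]))
radar = (np.array([9.5, 5.6]), np.diag([1.0, 1.0]))

m, P = fuse([eo_ir, radar])
print("fused position:", m)          # pulled strongly toward the tighter EO/IR track
print("fused variance:", np.diag(P)) # smaller than either input's variance
```

The fused estimate weights each modality by its confidence, which is the basic reason multi-sensor fusion reduces ambiguity relative to any single sensor.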