The Guidance Navigation and Control Department of the Leidos Innovation Center develops and deploys cutting-edge alternative positioning systems for customers who can't rely on GPS alone, and we need your help in the Rocket City (Huntsville, AL)! In this Autonomous Navigation Engineer position you'll work on a mix of analytical and hands-on tasks, including algorithm development, analysis, simulation, and field testing of autonomous navigation systems. Your group primarily supports research and development as well as advanced prototyping for customers including DARPA, NASA, the Air Force Research Laboratory, the Office of Naval Research, and the Army Aviation and Missile Research, Development and Engineering Center. These customers need to get our advanced technology into the field! You'll be part of one or more small teams of engineers (typically 3-5) working together in spiral or agile development environments to meet aggressive customer schedules aimed at near-term field demonstrations of advanced capabilities. You'll get to demo your work in the field in front of the customer! Are you excited about making advanced technologies work in the real world? Then we want you on our team!
Fun stuff on the job you will get to do:
- Develop, integrate, and analyze navigation fusion algorithms, including Extended Kalman Filters, Unscented Kalman Filters, and Particle Filters
- Develop, integrate, and analyze computer vision techniques (including feature extraction, feature matching, optical flow, visual odometry, and Simultaneous Localization and Mapping (SLAM) functions)
- Develop and utilize simulation testbeds to support algorithm testing and assessment
- Travel with your team to demo your system to the customer on a variety of platforms, including aircraft, ground vehicles, pedestrians, and naval ships. Testing typically lasts for 1-2 weeks.
Skills you will need to be successful in this role:
- Bachelor's Degree in Computer Engineering, Electrical Engineering, Computer Science, Aerospace Engineering, Mechanical Engineering, or equivalent and 4+ years of relevant experience
- Must be a US Citizen with the ability to obtain and maintain a Secret security clearance. You don't have to have a clearance currently, but we'll put you in for one after you begin, and you'll need to be "clear-able".
- Experience programming in MATLAB. You'll do most of your analysis and algorithm development in MATLAB. Strong programming experience in other languages usually makes the transition to MATLAB pretty easy, so that can work as well.
- Experience with either sensor fusion/estimation algorithms (e.g. Kalman Filters, Particle Filters, or Factor Graphs) or with computer vision algorithms (for instance, using OpenCV). If you have experience with both sensor fusion and computer vision, that's even better! If you don't have experience with either of these, a strong background in system dynamics and controls theory (e.g. PID control, state-space control, etc.) can suffice.
You will wow us even more if you have these skills:
- Master's or PhD in one of the majors listed above
- Experience working with various sensor types (Inertial Measurement Units, GPS, cameras, LiDAR, etc.)
- Experience with C++ and/or Python
- Experience with the Robot Operating System (ROS) and the OpenCV computer vision library
- Experience with interface communication standards / protocols including RS232, RS422, TCP/IP, and UDP to communicate with various sensors
- Experience with autonomous system development through robotics competitions
- Experience with the Linux operating environment
- Experience using version control tools (Git, Subversion, etc.)