The Guidance, Navigation and Control Department of the Leidos Innovation Center develops and deploys cutting-edge alternative positioning systems for customers who can't rely on GPS alone, and we need your help in the Rocket City (Huntsville, AL)! In this position you'll work on a mix of analytical and hands-on tasks, including algorithm development, analysis, simulation, and field testing of autonomous navigation systems. Your group primarily supports research and development, as well as advanced prototyping, for customers including DARPA, NASA, the Air Force Research Laboratory, the Office of Naval Research, and the Army Aviation and Missile Research, Development and Engineering Center. These customers need to get our advanced technology into the field! You'll be part of small teams of engineers (typically 3-5) working in spiral or agile development environments to meet aggressive customer schedules aimed at near-term field demonstrations of advanced capabilities. You'll get to demo your work in the field, in front of the customer! Are you excited about making advanced technologies work in the real world? Then we want you on our team!
Fun stuff you'll get to do on the job:
- Develop, integrate, and analyze navigation fusion algorithms, including Extended Kalman Filters, Unscented Kalman Filters, and Particle Filters
- Develop, integrate, and analyze computer vision techniques (including feature extraction, feature matching, optical flow, visual odometry, and Simultaneous Localization and Mapping (SLAM) functions)
- Develop and utilize simulation testbeds to support algorithm testing and assessment
- Travel with your team to demo your system to the customer on a variety of platforms, including aircraft, ground vehicles, pedestrians, and naval ships. Testing typically lasts for 1-2 weeks.
What you'll need to join us:
- Bachelor's Degree in Computer Engineering, Electrical Engineering, Computer Science, Aerospace Engineering, Mechanical Engineering, or an equivalent field
- US citizenship, with eligibility to obtain a security clearance
- Experience programming in MATLAB. You'll do most of your analysis and algorithm development in MATLAB, but strong programming experience in other languages usually makes the transition easy, so that can work as well.
- A strong background in System Dynamics and Feedback Control theory
You will wow us even more if you have these skills:
- Experience with sensor fusion/estimation algorithms (e.g. Kalman Filters, Particle Filters, or Factor Graphs) or with computer vision algorithms (for instance, using OpenCV). Experience with both sensor fusion and computer vision is even better!
- Master's Degree in one of the majors listed above
- Experience with the Robot Operating System (ROS) and the OpenCV computer vision library
- Experience with interface communication standards / protocols including RS232, RS422, TCP/IP, and UDP to communicate with various sensors
- Experience with C++ and/or Python
- Experience with autonomous system development through robotics competitions
- Experience with real-time operating systems such as VxWorks, INTEGRITY, or LynxOS
- Experience with the Linux operating environment
- Experience working with various sensor types (Inertial Measurement Units, GPS, cameras, LiDAR, etc.)
- Experience using version control tools (Git, Subversion, etc.)