
Job #: R-00005231
Location: Bethesda, MD
Category: Software Development
Schedule (FT/PT): Full time
Travel Required: Yes, 10% of the time
Shift: Day
Potential for Telework: No
Clearance Required: Top Secret
Referral Eligibility: Eligible
Group: Leidos Innovations Center (LInC)

Job Description:

Looking for pathfinders. Carve your own path at Leidos in our Innovations Center as our newest Senior Data Engineer, solving national-security data problems in Bethesda, MD by building data pipelines and search systems that analyze and expose multiple data sources. Do you have the intellectual curiosity, quantitative acumen, and collaborative mission focus to identify novel sources of data across a range of fields, improve the performance of analytic systems, and encourage user adoption of high-end data analytics platforms in partnership with a highly qualified, highly motivated team? If so, keep reading.

We're building a next-generation data analysis and exploitation platform for video, audio, documents, and social media data. The platform will help users identify, discover, and triage information via user interfaces that leverage best-in-class speech-to-text, machine translation, image recognition, OCR, and entity extraction services. We're looking for data engineers to improve the infrastructure and systems behind this platform. The ideal contributor has experience building and maintaining large Elasticsearch clusters and highly parallelized ETL pipelines, works well in a collaborative environment, and communicates clearly with teammates and customers. This is a great opportunity to work with a high-performing team in a fun environment.

If you love work like this, it could be a great job for you:
- Architect and lead the technology of a large-scale, high-traffic data warehouse platform
- Write code on the ETL platform to transform data into a common data model focused on data exploitation in support of customer mission objectives
- Develop and maintain code, and integrate software into a fully functional system delivered as Docker containers
- Conceive, design, develop, and deploy data architectures and data models for new data sources
- Collaborate across teams to develop data warehouse platform vision, strategy, and roadmap
- Train and mentor the team in data modeling and data quality related processes and standards
- Add features to ETL platform to shorten timelines for future data integration efforts
- Participate in daily scrum meetings, sprint retrospectives, and other agile processes
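The ETL responsibilities above could be sketched roughly as follows. This is a hypothetical illustration, not the platform's actual code: the record shapes, field names (`doc_id`, `transcript`, `post_id`), and source kinds are all assumptions for the sake of example.

```python
# Hypothetical sketch: normalizing heterogeneous source records into a
# common data model, one core task of the ETL responsibilities above.
from dataclasses import dataclass, asdict

@dataclass
class CommonRecord:
    doc_id: str
    source: str
    body: str

def transform_video(raw: dict) -> CommonRecord:
    # Map a video transcript (e.g., speech-to-text output) onto the model.
    return CommonRecord(doc_id=raw["id"], source="video", body=raw["transcript"])

def transform_social(raw: dict) -> CommonRecord:
    # Map a social-media post onto the same model.
    return CommonRecord(doc_id=raw["post_id"], source="social", body=raw["text"])

# Registering a transform per source is what makes adding a new data
# source cheap, which is the point of the "shorten timelines" bullet.
TRANSFORMS = {"video": transform_video, "social": transform_social}

def run_etl(batch):
    # Dispatch each raw record to its source-specific transform.
    return [asdict(TRANSFORMS[kind](raw)) for kind, raw in batch]

rows = run_etl([("video", {"id": "v1", "transcript": "hello"}),
                ("social", {"post_id": "s9", "text": "update"})])
```

In practice a dispatch table like this would sit inside an orchestration tool such as Airflow or NiFi rather than a bare function.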

Skills you need to be successful in this job:
- Bachelor's Degree with 8 or more years of experience, or Master's Degree with 4 or more years of experience, from an accredited course of study in computer science, engineering, operations research, mathematics, or physics
- Must possess a TS security clearance and be eligible to obtain a TS/SCI with Polygraph
- 6+ years of software development experience focused on a variety of data structures and types
- Expertise in data ingestion, data transformation (ETL), and data modeling
- Experience with performance optimization of large Elasticsearch clusters
- Experience identifying external data specifications for common data representations
- Experience building monitoring and alerting mechanisms for data pipelines
- Experience with scripting languages (Python, Bash, etc.)
- Experience with object-oriented software development
- Experience working within a UNIX/Linux environment
- Experience with the design, development, and deployment of data architectures and data models
- Familiarity with data pipeline batch processing and orchestration tools (Apache Airflow, Apache NiFi, Apache Oozie, etc.)
- Experience with Kubernetes and/or Docker container environment
- Strong verbal and written communication skills
- Strong analytical skills, with excellent problem solving abilities in the face of ambiguity
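The monitoring-and-alerting skill above might look, in minimal sketch form, like the check below. The metric names and thresholds are illustrative assumptions, not anything specified by this role:

```python
# Hypothetical sketch of pipeline monitoring: flag stages whose failure
# rate or processing lag exceeds a threshold. Real deployments would
# pull these metrics from a monitoring system rather than a dict.
def check_pipeline(metrics, max_failure_rate=0.05, max_lag_seconds=300):
    alerts = []
    for stage, m in metrics.items():
        rate = m["failed"] / max(m["processed"], 1)
        if rate > max_failure_rate:
            alerts.append(f"{stage}: failure rate {rate:.1%}")
        if m["lag_seconds"] > max_lag_seconds:
            alerts.append(f"{stage}: lag {m['lag_seconds']}s")
    return alerts

sample = {
    "ingest":    {"processed": 1000, "failed": 2,  "lag_seconds": 30},
    "transform": {"processed": 1000, "failed": 80, "lag_seconds": 600},
}
alerts = check_pipeline(sample)  # both alerts come from the transform stage
```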

You will wow us even more if you have these skills:
- Experience working with a message-driven architecture (JMS, Kafka, Kinesis, SNS/SQS, etc.)
- Experience developing data connectors to third party APIs
- Experience with Web scraping technologies
- Experience in Agile/SCRUM enterprise-scale software development
- Experience with RESTful web services, including code development, deployment, versioning, and build tools (e.g., Eclipse, Git, SVN, Maven, Jenkins)
- Experience with stream-processing tools (e.g., Apache Storm)
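The message-driven architecture mentioned above can be sketched with a stdlib queue standing in for a real broker like Kafka or JMS; the document shape and the `normalized` field are assumptions for illustration only:

```python
# Hypothetical sketch of a message-driven worker: consume raw documents
# from an inbound queue, transform them, and emit to an outbound queue.
# queue.Queue stands in for a real broker (Kafka, JMS, SQS, ...).
import json
import queue
import threading

def worker(inbox: queue.Queue, outbox: queue.Queue):
    while True:
        msg = inbox.get()
        if msg is None:  # sentinel: shut down cleanly
            break
        doc = json.loads(msg)
        doc["normalized"] = doc["text"].strip().lower()
        outbox.put(json.dumps(doc))
        inbox.task_done()

inbox, outbox = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(inbox, outbox))
t.start()
inbox.put(json.dumps({"text": "  Hello World  "}))
inbox.put(None)
t.join()
result = json.loads(outbox.get())
```

Decoupling producers from consumers this way is what lets the ETL stages scale independently under load.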


External Referral Eligible


Pay and benefits are fundamental to any career decision. That's why we craft compensation packages that reflect the importance of the work we do for our customers. Employment benefits include competitive compensation, Health and Wellness programs, Income Protection, Paid Leave and Retirement. More details are available here.

Leidos will never ask you to provide payment-related information at any point in the employment application process, and will communicate with you only through emails sent from a Leidos email address. If you receive an email purporting to be from Leidos that asks for payment-related information or any other personal information, please report the email to .

All qualified applicants will receive consideration for employment without regard to sex, race, ethnicity, age, national origin, citizenship, religion, physical or mental disability, medical condition, genetic information, pregnancy, family structure, marital status, ancestry, domestic partner status, sexual orientation, gender identity or expression, veteran or military status, or any other basis prohibited by law. Leidos will also consider for employment qualified applicants with criminal histories consistent with relevant laws.

Talent Community

Join our Talent Community to create a profile, enabling a streamlined application process and to help our recruiters better understand your areas of expertise and interest.
