Looking for pathfinders. Carve your own path at Leidos in our Advanced Solutions Group as our newest Senior Data Engineer, applying data transformation (ETL) experience with third-party data feeds and the latest technologies to develop an enterprise data collection and analytic pipeline capability in Bolling, DC.
Are you someone with the mix of intellectual curiosity, quantitative acumen, and collaborative mission focus needed to identify novel sources of data across a range of fields, improve the performance of analytic systems, and encourage user adoption of high-end data analytics platforms in partnership with a highly qualified, highly motivated team? If so, keep reading.
If you love doing work like this, it could be a great job for you:
• Write code on an ETL platform to transform data into a common data model focused on data exploitation in support of customer mission objectives.
• Develop and maintain code, and integrate software into a fully functional software system using Docker containers.
• Develop and implement data connectors with third party APIs and web scraping technologies.
• Work with external teams to define data collection and integration requirements, and validate ingest.
• Add features to the ETL platform to shorten timelines for future data integration efforts.
• Participate in daily scrum meetings, sprint retrospectives, and other agile processes.
• Provide and maintain documentation of system architecture, development, and enhancements.
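The core ETL duty above — pulling records from a third-party feed and mapping them onto a common data model — could be sketched roughly as follows. This is a minimal illustration only: the `raw_feed` records, field names, and `to_common_model` helper are all hypothetical, standing in for whatever provider-specific schema and shared schema a real pipeline would define.

```python
from datetime import datetime

# Hypothetical raw records as they might arrive from a third-party feed;
# the field names (evt_ts, src, val) are illustrative, not from any real API.
raw_feed = [
    {"evt_ts": "2021-03-01T12:00:00+00:00", "src": "feedA", "val": "42"},
    {"evt_ts": "2021-03-01T12:05:00+00:00", "src": "feedA", "val": "17"},
]

def to_common_model(record):
    """Map one provider-specific record onto an assumed shared schema."""
    return {
        "timestamp": datetime.fromisoformat(record["evt_ts"]),  # tz-aware
        "source": record["src"],
        "value": int(record["val"]),  # normalize string payloads to ints
    }

# The "transform" step of ETL: every feed gets its own mapper, but all
# output lands in the same common model for downstream analytics.
transformed = [to_common_model(r) for r in raw_feed]
print(transformed[0]["source"], transformed[0]["value"])
```

In practice the extract step would call a third-party API (or a scraper) and the load step would write into a store such as Elasticsearch or MongoDB; the per-feed mapper is what keeps new integrations cheap to add.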
Skills you need to be successful in this job:
• Bachelor's degree with 8+ years of experience, or Master's degree with 4+ years of experience, from an accredited course of study in computer science, engineering, operations research, mathematics, or physics.
• Must possess a TS/SCI security clearance and be able to maintain the clearance.
• 6+ years of software development experience focused on a variety of data structures and types
• Demonstrated understanding of large-scale cloud architecture
• Expertise in data ingestion, data transformation (ETL), and data modeling.
• Scripting experience with Java, Ruby, and Python
• Experience developing data connectors to third party APIs
• Experience developing applications that work with NoSQL stores (e.g., Elasticsearch, HBase, Cassandra, MongoDB, CouchDB)
• Experience working in cloud architecture with AWS EC2, RDS, S3, VPC, and Elasticsearch
• Linux/Unix experience
• Experience with an object-oriented programming language
• Strong verbal and written communication skills
• Strong analytical skills, with excellent problem solving abilities in the face of ambiguity
You will wow us even more if you have these skills:
• Experience with web scraping technologies
• Experience in Agile/SCRUM enterprise-scale software development
• Experience with batch-processing frameworks and tools (e.g., NiFi, Midpoint, MapReduce, YARN, Pig, Hive, HDFS, Oozie)
• Experience with RESTful web services, including code development, deployment, versioning, and build tools (e.g., Eclipse, Git, SVN, Maven, Jenkins)
• Experience with stream-processing tools (e.g., Storm)
External Referral Eligible