This role is referral eligible: a $3,000 bonus is available to the person who refers a candidate we hire for this role, provided eligibility requirements are met. Reach out to ZapataRecruiting@ZapataTechnology.com with questions or for details.
To support new team members in joining us, relocation assistance may be offered to qualified candidates.
Summary:
The Data Engineer is part of a skilled IT team, working both independently and alongside other engineers to identify and test potential enhancements and to resolve technical issues arising from the rapid scaling of an intelligence application.
Essential Functions:
Perform software integration tasks, including scripting, systems analysis, requirements gathering, testing, and documentation. Provide support to technical writers and testers, manage configuration and system maintenance, utilize source code control to track and safeguard changes to the codebase, and apply software updates.
Collaboratively design, develop, and implement custom ETL scripts while supporting software products as required. Ensure data quality, consistency, and integrity across the entire pipeline. Engage directly with customers to create ingestion feeds for unique systems, demonstrating the ability to devise solutions with minimal prior documentation or details about these systems.
Perform infrastructure management functions, including designing, architecting, and implementing scalable infrastructure across multiple IC networks and application nodes using DevSecOps practices. Oversee containerization efforts with tools such as Docker and Podman, and manage orchestration processes. Upgrade and migrate applications to containerized environments to enable scalability and transition to a microservices architecture. Document infrastructure administration, including creation, management, and decommissioning procedures.
Qualifications:
Knowledge of DevSecOps processes including Infrastructure as Code using Puppet or Ansible.
Knowledge of containerization and container orchestration, including container networking, using Docker, Podman, or Kubernetes.
Experience with the full DataOps lifecycle including ingestion and ETL processes into databases.
Linux system administration knowledge: proficiency with the Linux command line, service management, and software configuration concepts within virtual machine constraints.
Ability to convert data and repair malformed data, which requires an understanding of XML, JSON, and regular expressions.
Preferred Qualifications:
Experience with All-Source Intelligence reporting standards such as USMTF, and with lifecycle requirements management.
Experience with NiFi data flows, from configuring existing flows to creating new flows and custom processors.
Ability to write JavaScript, Groovy, and Python within NiFi and Linux environments.
Certifications Required:
An IAT Level III certification is required for this role. Candidates holding an IAT Level II certification (e.g., CompTIA Security+) who can obtain Level III within an agreed-upon timeframe may be considered.
Education/Experience:
A bachelor's degree and 9 to 12 years of experience performing similar or related work are required. Four additional years of experience may be substituted for a bachelor's degree.
Clearance Type:
TS//SCI clearance required.
Travel:
Travel may be required for this role, estimated at two weeks or less per TDY, averaging one to three TDYs per year.
AAP/EEO Statement:
Zapata Technology is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, genetic information, creed, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.