Full-time

Job Description: COMPANY is seeking a Data Engineer to join a cutting-edge initiative in support of the U.S. Department of Defense (DoD) in Germany. This role plays a vital part in delivering critical engineering and technical support services, with a strong emphasis on leveraging advanced data science and engineering solutions to drive mission success.

Position: Data Engineer (Engineer IV)
Location: Kaiserslautern, Germany
Clearance: Secret

Responsibilities include, but are not limited to, the following:

Configure Data Pipelines:
- Design ETL Processes: Extract, transform, and load (ETL) data from various sources, including but not limited to DoD Advana and Army Vantage, into data warehouses or data lakes. Ensure data is cleaned, normalized, and enriched during transformation.
- Build Data Pipelines: Configure automated workflows to move data between systems using tools like Apache Airflow, AWS Glue, or Azure Data Factory. Optimize pipelines for scalability and reliability.
- Implement Real-Time Data Processing: Implement streaming data pipelines using tools like Apache Kafka, Apache Flink, or Spark Streaming. Process and analyze data in real time for applications such as fraud detection or IoT analytics.

Manage Data Storage:
- Design Databases: Design and implement relational databases (e.g., MySQL, PostgreSQL) or NoSQL databases (e.g., MongoDB, Cassandra) based on project requirements. Ensure efficient schema design and indexing for performance.
- Warehouse Data: Build and maintain data warehouses using platforms like Snowflake, Amazon Redshift, or Google BigQuery. Optimize storage and query performance for analytical workloads.
- Support the Management of Data Lakes: Understand and interpret organizational policies to better support the command with regard to integrating products into existing data lakes, ensuring long-term sustainability.
Integrate Data:
- Connect Data Sources: Integrate data from APIs, databases, flat files, and third-party systems. Handle diverse data formats like JSON, XML, CSV, and Parquet.
- Migrate Data: Migrate data between systems during upgrades or transitions (e.g., moving from on-premises to cloud).
- Configure APIs: Configure APIs to expose data for consumption by other systems or applications.

Manage Data Quality and Governance at the 21st TSC level in accordance with USAREUR-AF policies and guidance:
- Validate Data: Implement checks to ensure data accuracy, completeness, and consistency. Identify and resolve data anomalies or errors.
- Manage Metadata: Document data lineage, definitions, and transformations to ensure transparency and traceability.
- Implement and Manage Compliance and Security Measures: Ensure data systems comply with regulations like GDPR, HIPAA, or CCPA. Implement security measures such as encryption, access control, and auditing.
- Work with internal teams within the Group to rectify or otherwise mitigate data quality issues in analyses and tool configurations being performed within the Group.
- Work with PEOs and PMs of Systems of Record that feed the multiple data platforms to understand and correct data issues.

Job Requirements:

Years of Experience (any one of the following combinations):
- Eighteen (18) years of recent specialized experience
- Major IT certification AND fourteen (14) years of recent specialized experience
- Associate's degree in computer science, information management, or a related discipline AND fourteen (14) years of recent specialized experience
- Bachelor's degree in computer science, information management, or a related discipline AND ten (10) years of recent specialized experience
- Master's degree in computer science, information management, or a related discipline AND seven (7) years of recent specialized experience

IA Technical Level II certification or higher:
- CCNA Security
- CompTIA CySA+
- Global Industrial Cyber Security Professional (GICSP)
- GIAC Security Essentials (GSEC)
- CompTIA Security+ CE
- EC-Council Certified Network Defender (CND)
- (ISC)² Systems Security Certified Practitioner (SSCP)