AI Engineer
Requirements
- Minimum of 3 years of professional experience in data and AI engineering.
- This position requires US citizenship and the ability to pass or obtain any required government background check or security clearance. An active security clearance is preferred.
- Proven experience providing operational support for AI/ML systems or data platforms.
- Hands-on experience with CI/CD pipelines and Agile development methodologies.
- Proficiency with GitHub Enterprise or similar version control and collaboration tools.
- Experience implementing and maintaining Infrastructure as Code (IaC) solutions using Terraform or equivalent tools.
- Strong understanding of cloud computing, data integration, and machine learning lifecycle concepts.
- Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams.

Preferred Qualifications
- Experience supporting federal data modernization or digital transformation programs.
- Familiarity with AWS, Azure, or Google Cloud AI/ML services.
- Exposure to containerization (Docker, Kubernetes) and DevSecOps practices.
- Background in Python, SQL, or similar languages for data and AI development.
Plateau GRP is seeking a skilled AI Engineer to support a large-scale data modernization initiative for the Federal Deposit Insurance Corporation (FDIC). The ideal candidate will apply artificial intelligence and data engineering expertise to enhance data accessibility, automate processes, and support advanced analytics capabilities within a modernized cloud environment.

The AI Engineer will work as part of an Agile delivery team, collaborating closely with data scientists, cloud engineers, and system architects to design, develop, and operationalize AI-driven data solutions.

Key Responsibilities
- Design, implement, and maintain AI and data-driven solutions supporting FDIC's data modernization goals.
- Develop and optimize machine learning models, pipelines, and automation workflows for data ingestion, transformation, and analysis.
- Support production operations, monitoring, and troubleshooting of AI/ML and data integration systems.
- Contribute to the design and implementation of CI/CD pipelines to ensure reliable deployment of AI and data applications.
- Use Infrastructure as Code (IaC) tools (such as Terraform) to manage and automate infrastructure provisioning and configuration.
- Collaborate within an Agile framework to deliver high-quality, iterative solutions using tools such as GitHub Enterprise, Jira, and Confluence.
- Partner with data engineers and analysts to enhance data governance, security, and model lifecycle management processes.
- Document workflows, maintain version control, and follow DevSecOps best practices.