Associate Data and AI Engineer
initiatives.

Key Responsibilities

The Associate Data and AI Engineer is responsible for implementing data pipelines that bring together structured, semi-structured, and unstructured data to support predictive, AI, and agentic solutions. Responsibilities can include, but are not limited to, the following:

+ Build and maintain scalable and robust AI-driven systems using technologies such as Snowflake, REST APIs, Apache Kafka, AWS Kinesis, Spark Streaming, or similar.
+ Collaborate with partners such as the Enterprise Data, Data Science, Business, Cloud Enablement, and Enterprise Architecture teams.
+ Translate business requests into technical requirements for small- and medium-complexity engineering tasks.
+ Validate internal and external data sources for availability and quality when asked. Work with SMEs to describe and understand data lineage and suitability for a use case.
+ Assist in developing code that enables real-time modeling solutions to be ingested into front-end systems.
+ Produce code artifacts and documentation using GitHub for reproducible results and hand-off to other teams as needed.
+ Ensure the reliability, availability, and scalability of AI solutions and systems through effective monitoring, alerting, and incident management.
+ Collaborate closely with AI Ops and infrastructure teams to ensure seamless deployment, operation, and maintenance of AI/ML systems.

Required Skills & Experience

+ Bachelor's degree in Computer Science, Engineering, IT, Management Information Systems, or a related discipline
+ Experience developing solutions using Python and SQL, and using AI code assistants as part of a daily workflow
+ Familiarity working with GitHub
+ Familiarity working in an Agile team
+ Exposure to data ingestion from a variety of sources, including relational databases, Hadoop/Spark, cloud data sources, XML, and JSON
+ Exposure to ETL concepts, metadata management, and data validation
+ Exposure to automation tools (Autosys, cron, Airflow, etc.) required
+ Exposure to GCP or AWS services (e.g., S3, EMR) a plus
+ Exposure to CloudFormation, Terraform, TeamCity, Jenkins, Octopus, and other IaC tools and platforms a plus
+ Exposure to cloud data warehouses, automation, and data pipelines (e.g., Snowflake, Redshift) a plus
+ Able to communicate effectively with both technical and non-technical teams
+ Able to translate complex technical topics into business solutions and strategies

Candidate must be authorized to work in the US without company sponsorship. The company will not support the STEM OPT I-983 Training Plan endorsement for this position.

Compensation

The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency, and demonstration
of competencies required for the role. The base pay is just one component of The Hartford's total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition.

The annualized base pay range for this role is: $74,000 - $111,000. The posted salary range reflects our ability to hire at different position titles and levels depending on background and experience.

Equal Opportunity Employer/Sex/Race/Color/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age