Job Description
About the Role
As a Senior AI Solutions Engineer, you will drive the end-to-end design, development, and deployment of cutting-edge AI solutions.
You'll architect and build scalable microservices and APIs using Python and Java, while leveraging your expertise in large language models, vector databases and semantic similarity search.
In addition, you'll develop and maintain data pipelines with Airflow, ensure robust CI/CD and IaC practices (Terraform, CloudFormation), and containerise solutions using Docker and Kubernetes.
Your work will span multiple cloud environments, including AWS (EC2, S3, Lambda, CodePipeline, AWS Bedrock) and Azure (Azure AI Foundry), and integrate front-end customisations in JavaScript/TypeScript for chatbots and interactive interfaces.
You'll also contribute to modular, scalable RAG platform development, integrating structured and unstructured data sources while working closely with the Lead AI Solutions Engineer.
By collaborating with cross-functional teams, you'll transform raw data into actionable insights, delivering AI-driven innovations that power our analytics applications and shape the future of data-centric decision-making across the organisation.
Key Responsibilities
- AI System Architecture & Development:
- Design, implement, and optimise AI-driven applications using Python (Hugging Face Transformers, LangChain) and Java.
- Incorporate advanced NLP and LLM techniques (e.g., embeddings, semantic search) to enhance data insights and user experiences.
- AI & RAG System Development:
- Design and build scalable RAG systems, including chatbots and AI assistants, leveraging cutting-edge ML/LLM techniques.
- Develop backend APIs and microservices using Python frameworks and Java for seamless integration with our existing services.
- Data Engineering & Integration:
- Design and maintain efficient data pipelines for real-time and batch processing, integrating structured data (SQL, data warehousing) and unstructured sources (documents, reports).
- Utilise Airflow to orchestrate ETL workflows and ensure robust data processing.
- Implement pipelines for LLM-based querying, embedding extraction, and metadata processing.
- Ensure compliance with data governance, privacy, and security policies.
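To give candidates a feel for the embedding and metadata pipelines mentioned above, here is a minimal sketch in plain Python. The chunk size and the `embed` function are illustrative assumptions only: in a real pipeline, embeddings would come from a model such as Sentence Transformers rather than a toy hash.

```python
import hashlib

def chunk_text(text, size=200):
    # Split a document into fixed-size character chunks (illustrative;
    # production pipelines typically chunk on tokens or sentence boundaries).
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(chunk):
    # Hypothetical stand-in for a real embedding model: a deterministic
    # toy vector derived from a hash, NOT a semantic embedding.
    digest = hashlib.sha256(chunk.encode("utf-8")).digest()
    return [b / 255.0 for b in digest[:4]]

def build_records(doc_id, text):
    # One record per chunk: embedding plus metadata for later
    # filtering, governance checks, and LLM-based querying.
    records = []
    for pos, chunk in enumerate(chunk_text(text)):
        records.append({
            "doc_id": doc_id,
            "chunk_index": pos,
            "embedding": embed(chunk),
            "text": chunk,
        })
    return records

records = build_records("report-2024", "Quarterly results improved...")
```

In production, these records would be written to a vector store (e.g., PGVector or ChromaDB) by an Airflow task rather than kept in memory.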
- Cloud & Infrastructure Management:
- Deploy and manage applications on AWS (EC2, S3, Lambda, CloudFormation, Terraform, CodePipeline, AWS Bedrock) and Azure (Azure AI Foundry, including Functions and agentic workflows).
- Collaborate with Platform Engineering to support Kubernetes deployments on FDJ's Cloud.
- Ensure security integration with SSO platforms (e.g., Azure SSO, SailPoint).
- Implement containerisation with Docker and Kubernetes while using CI/CD best practices and infrastructure as code (IaC) to ensure seamless deployments.
- Vector Databases & Semantic Search:
- Manage and optimise vector databases (PGVector, ChromaDB) to support semantic similarity search using cosine similarity techniques.
- Integrate embedding models (Sentence Transformers, OpenAI Embeddings) to enhance data retrieval quality.
- Ensure indexing performance and accuracy in semantic search systems.
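As an illustration of the cosine-similarity retrieval described above, here is a minimal sketch in plain Python, with no PGVector or ChromaDB dependency. The three-dimensional vectors are toy examples, not real embeddings; a production system would use model-generated embeddings and a database-side index.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, index, k=2):
    # Rank stored vectors by similarity to the query, highest first.
    scored = [(doc_id, cosine_similarity(query_vec, vec))
              for doc_id, vec in index.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

# Toy "index" of document embeddings (in production these would come
# from an embedding model and live in PGVector or ChromaDB).
index = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}

results = top_k([1.0, 0.05, 0.0], index, k=2)
# results ranks doc_a and doc_b above doc_c for this query.
```

PGVector exposes the same ranking server-side via its distance operators, so the application only ships the query vector, not the whole index.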
- Frontend & API Development:
- Collaborate on developing interactive UI components (using JavaScript/TypeScript) for chatbot customisation and data visualisation.
- Ensure that APIs are well-documented, secure, and scalable to support self-service analytics.
- Team Collaboration & Mentorship:
- Work with data analysts, data engineers, and platform engineers to create a cohesive AI ecosystem.
- Mentor junior team members and lead knowledge transfer sessions to promote best practices across the organisation.
Required Skills & Qualifications
- 5+ years of experience in data engineering, backend development, or AI/ML integration.
- Strong proficiency in Python and Java.
- Extensive hands-on experience with AWS services (EC2, S3, Lambda, CloudFormation, Terraform, CodePipeline, AWS Bedrock) and Azure AI Foundry.
- Data & AI Technologies:
- Expertise in vector databases (PGVector/ChromaDB) and semantic similarity search (cosine similarity).
- Proficient with embedding techniques using Sentence Transformers and OpenAI Embeddings.
- DevOps & CI/CD: Demonstrated experience in containerisation (Docker, Kubernetes), CI/CD pipelines, and IaC practices.
- API & Data Integration: Proven track record in API development and managing data pipelines & warehousing solutions.
- Frontend Exposure: Experience with JavaScript/TypeScript for enhancing chatbot customisations and interactive data interfaces.
- Strong problem-solving skills, excellent communication, and a collaborative, agile mindset.
Nice-to-Have Skills
- Experience fine-tuning LLMs for domain-specific applications.
- Prior work in self-service analytics or AI-powered business intelligence solutions.
- Additional experience with Airflow for advanced ETL orchestration.
What We Offer
- A dynamic, innovative work environment where bold ideas are celebrated.
- Opportunity to work with cross-functional experts in AI, data engineering, and cloud technologies.
- Competitive salary and benefits, with opportunities for professional growth and continuous learning.