Contribute to building AI-powered workflows end-to-end, finding the right balance between research and development
Work with very high-volume structured and unstructured data; create training and evaluation datasets
Propose and systematically evaluate AI performance and cost improvement strategies
Qualifications
In-depth Experience with LLM Frameworks: Demonstrated deep expertise in working with Large Language Models such as OpenAI's GPT series (GPT-3, GPT-4), including model integration, customization, and optimization within diverse applications.
LLM Fine-Tuning and Transfer Learning: Proven experience in fine-tuning LLMs for specific tasks or industries, including the use of transfer learning techniques to adapt pre-trained models to new domains or requirements.
Data Handling Skills: Proficiency in handling, processing, and analyzing large datasets. Familiarity with data preprocessing techniques and tools (e.g., Pandas in Python) is crucial.
Experience: 4+ years of relevant industry experience with a bachelor's degree, or 2+ years with a master's degree.
Required Qualifications, Capabilities, And Skills
Formal training or certification on software engineering concepts and 3+ years of applied experience.
Ability to write robust code in Python, Java, or another relevant language.
Experience with web development frameworks (e.g., React, Vue.js, or similar).
Familiarity with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn).
Passion for building great user experiences for clients, with strong attention to detail.
3+ years of experience working with the AWS stack.
Hands-on experience with software design, problem solving, and debugging.
Strong interpersonal skills; able to work independently as well as in a team.
Experience in containerization and infrastructure as code (Docker, Kubernetes, Terraform).
Experience with distributed systems.
Proficiency in generating data embeddings with LangChain and storing them in vector databases.
Preferred Qualifications, Capabilities, And Skills
Experience in designing and implementing pipelines using Retrieval-Augmented Generation (RAG).
Ability to construct batch and streaming microservices exposed as gRPC and/or GraphQL endpoints.
Hands-on knowledge of Chain-of-Thought, Tree-of-Thoughts, and Graph-of-Thoughts prompting strategies.
Familiarity with agile development methodologies and project management tools (e.g., JIRA, Confluence).