Call : (+91) 968636 4243
Mail : info@EncartaLabs.com
EncartaLabs

Open Source Generative AI

( Duration: 4 Days )

This Generative AI training course teaches practical applications of AI in the business environment. It combines lectures and hands-on labs, giving participants a solid understanding of AI concepts and the skills to design and implement AI solutions.

Throughout the course, you will learn about transformer-based AI architectures, the fundamentals of Python programming for AI deployments, and the deployment of open-source Transformer models. You will also explore the importance of hardware in AI performance, comparing different GPU architectures and learning how to match AI requirements with suitable hardware. The course delves into training techniques, including backpropagation and gradient descent, and AI tasks such as classification, regression, and clustering.

You will gain practical experience through hands-on exercises with open-source LLM (Large Language Model) frameworks, allowing you to work with fine-tuned models and run workloads on different models to understand their strengths and weaknesses. Additionally, the course covers the conversion of model formats and provides in-depth exploration of AI programming environments such as PyTorch + transformers and transformers' low-level interactive inspection.

Towards the end of the course, you will delve into advanced topics such as context extension through fine-tuning and quantization for specific application target environments.

By attending this Generative AI workshop, delegates will learn:

  • The Mechanics of Deep Learning
  • Transformer Architecture
  • Hardware Requirements
  • Using Llama to perform Natural Language Processing Tasks
  • Deploy a Natural Language Model
  • Model Fine Tuning

Prerequisite:

  • Basic Python skills

The Generative AI class is ideal for:

  • Project Managers
  • Architects
  • CKA Developers
  • Data Acquisition Specialists

COURSE AGENDA

1. The Mechanics of Deep Learning

  • Choosing the latest AI models
    • What is current and what has already gone extinct?
  • Neural network architecture essentials
    • Tokenization
    • Embedding
    • Parameters: weights and bias
    • Nodes
    • Fully connected/Partially connected
    • Prompts and prompt engineering
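
The essentials above can be sketched in a few lines. This is a toy illustration of tokenization and embedding lookup: the vocabulary, token ids, and 4-dimensional vectors are invented for the sketch; real models use learned subword vocabularies and far larger embedding tables.

```python
import random

random.seed(0)

# Invented toy vocabulary; real tokenizers use learned subword vocabularies.
vocab = {"<unk>": 0, "the": 1, "cat": 2, "sat": 3}

def tokenize(text):
    """Whitespace tokenizer: map each word to its vocabulary id."""
    return [vocab.get(word, vocab["<unk>"]) for word in text.lower().split()]

# One embedding vector (learned parameters in a real model) per vocab entry.
embedding_table = [[random.uniform(-1, 1) for _ in range(4)] for _ in vocab]

ids = tokenize("The cat sat")
vectors = [embedding_table[i] for i in ids]
```

The embedding vectors, not the raw ids, are what the network's fully or partially connected layers actually consume.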

2. The Transformer Model

  • Neural Network Architectures
  • Word embeddings
    • Importance in NLP
    • Word2Vec and GloVe embeddings
    • Contextualized embeddings (BERT, ELMo, etc.)
  • Self-attention mechanism and multi-head attention
    • Input Representation (Query, Key, and Value)
    • Computing Similarities
    • Attention Weights
    • Weighted Sum
    • Final Output
  • Positional encoding for attention
  • Transformer layers and stacking
  • Quantization
  • Effective LLM selection criteria and use-cases
    • Models: size, datasets, quantization, etc.
    • Libraries
    • Frameworks
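
The Query/Key/Value steps listed above can be sketched as single-head scaled dot-product attention. Shapes and values here are made up; real transformers compute Q, K, and V with learned projection matrices and run many heads in parallel.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # computing similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax -> attention weights
    return weights @ V, weights                       # weighted sum -> final output

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))   # 3 tokens, head dimension 4
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
output, weights = scaled_dot_product_attention(Q, K, V)
```

Each row of `weights` sums to 1, so the output for each token is a weighted average of all value vectors.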

3. Hardware Requirements

  • The GPU's role in AI performance (CPU vs GPU)
  • GPU architecture history
    • Pascal, Turing, Ampere
  • Tensor core vs older GPU architectures
  • GPU vocabulary and the transformer
    • SM, tensors, tensor core, CUDA core, FP and INT cores, warp, threads
  • Current GPUs and cost vs value
    • Analysis of GPU specification sheets
  • GPU selection for models and workload
  • Quantizing for hardware performance and cost
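
The cost-vs-value question often starts with a back-of-the-envelope VRAM estimate for model weights at different precisions. The 7-billion-parameter figure below is just an example; activations, KV cache, and framework overhead add to the real requirement.

```python
def weight_memory_gb(n_params, bits_per_weight):
    """Approximate memory for model weights alone, in gigabytes."""
    return n_params * bits_per_weight / 8 / 1e9

params = 7e9                           # e.g. a 7B-parameter model
fp16 = weight_memory_gb(params, 16)    # 14.0 GB at half precision
int4 = weight_memory_gb(params, 4)     # 3.5 GB at 4-bit quantization
```

This arithmetic is why 4-bit quantization can move a model from a data-center GPU onto a consumer card.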

4. Pre-trained LLM Model Essentials

  • Select and use pre-trained models to produce immediate results
  • Model ratings, metrics, leaderboards
  • Model licensing and commercial use
  • Numbers every AI developer should know (cost, time, dataset size, etc.)
  • Synthetic data generation for model training
  • Model training through feature extraction
    • Derive features from provided dataset
    • Use annotated data for fine-tuning training

5. Pre-trained LLM Hands-on

  • Frameworks
    • llama.cpp, exllama, GPT4All
  • Evaluate multiple models, fine-tunings, and quantizations
    • Falcon, Orca Mini, OpenLlama, Alpaca, MPT
  • Parameters
  • Fine tuned models
  • Prompts
    • Model expectations (instruct, chat, etc.)
    • Extension improvement
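
Different fine-tunes expect different prompt layouts, so matching the template to the model matters. The Alpaca-style template below is one common instruct format; chat models typically use their own role tags, so check each model's card before prompting.

```python
# Alpaca-style instruct template; other fine-tunes use different layouts.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def build_prompt(instruction):
    """Wrap a bare instruction in the template the fine-tune was trained on."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

prompt = build_prompt("Summarize the transformer architecture in one sentence.")
```

Sending a bare question to an instruct model, or an instruct template to a chat model, is a common cause of weak output.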

6. Transformer Model Training Essentials

  • Overfitting
  • Regularization (Avoiding Overfitting)
  • Backpropagation
  • Gradient Descent
  • Embedding
  • Learning Rate
  • Perplexity
  • Batch Normalization
  • Warm-up and Learning Rate Decay
  • Loss Functions
  • Data Augmentation
  • Training Strategies
  • Evaluation Metrics
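
Gradient descent and the learning rate can be illustrated on a one-dimensional quadratic loss, (w - 3)^2, whose minimum is at w = 3. Real training computes gradients over batches via backpropagation, but the update rule is the same shape.

```python
def loss(w):
    """Toy loss with its minimum at w = 3."""
    return (w - 3.0) ** 2

def grad(w):
    """Analytic derivative of the loss above."""
    return 2.0 * (w - 3.0)

w = 0.0
learning_rate = 0.1
for step in range(100):
    w -= learning_rate * grad(w)   # move against the gradient
```

Too small a learning rate converges slowly; too large a one overshoots and diverges — which is why warm-up and learning-rate decay schedules exist.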

7. Conversion of Model Formats

  • PyTorch to ggml
  • JAX to ggml
  • F16 to 4bit quantization
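
The idea behind F16-to-4-bit conversion can be shown with a toy symmetric quantizer: store one scale per block of weights plus small integers. This is only a sketch — ggml's actual block formats (e.g. Q4_0) differ in layout and detail.

```python
def quantize_4bit(weights):
    """Symmetric quantization to integers in -7..7 with one shared scale."""
    scale = max(abs(w) for w in weights) / 7.0
    q = [round(w / scale) for w in weights]
    return scale, q

def dequantize(scale, q):
    """Recover approximate floats from the scale and integer codes."""
    return [scale * v for v in q]

weights = [0.12, -0.7, 0.33, 0.05]
scale, q = quantize_4bit(weights)
restored = dequantize(scale, q)
```

The reconstruction error per weight is bounded by half the scale, which is the quality/size trade-off quantization buys.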

8. Hands-on with AI Programming Environments

  • PyTorch and transformers
  • Transformers' low-level interactive inspection

9. Model Fine Tuning

  • Perform fine-tuning in a hands-on environment
  • Demonstration project with good fine-tuning data to highlight frameworks
    • with llama.cpp
    • with PyTorch
  • Understanding fine tuning dataset formats
  • Data cleaning skills
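
One common fine-tuning dataset format is JSON Lines: one instruction/response pair per line. The field names below ("instruction"/"output") are just one convention — frameworks also use "prompt"/"completion" or chat-style "messages", so match your trainer's spec.

```python
import json

# Two example records; a real fine-tuning set needs thousands of clean pairs.
records = [
    {"instruction": "Translate to French: Hello", "output": "Bonjour"},
    {"instruction": "What is 2 + 2?", "output": "4"},
]

# Serialize to JSONL: one JSON object per line.
jsonl = "\n".join(json.dumps(r) for r in records)

# Reading it back is symmetric, which makes cleaning passes easy to script.
parsed = [json.loads(line) for line in jsonl.splitlines()]
```

Because each line is independent, data-cleaning scripts can filter or fix records line by line without loading the whole set.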

10. Introduce Application Interfacing to Models

  • LangChain
  • Guidance for structured LLM output

11. Application Augmentation with LangChain and Guidance

  • LangChain
  • Guidance for structured LLM output
  • Prompt engineering

12. Advanced Topics

  • Optimize your model through fine-tuning and quantization
  • Realize high-quality, fast inference on specific workloads
  • Context extension through fine tuning
  • LLM deployment

13. Use Llama to Perform Natural Language Processing Tasks

  • Rewrite a classic poem of a well-known author in the tone of another well-known author
  • Use Named Entity Recognition (NER) to identify Cajun food in recipes
  • Use domain specific models to improve project accuracy

14. Deploy a Natural Language Model Capstone

  • Build a full NLP project
  • Download, install and implement a trained NLP model
  • Deploy your NLP model to perform the following tasks:
    • text classification
    • sentiment analysis
    • machine translation
    • chatbots
    • question answering
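
To show the shape of one capstone task, here is a deliberately naive lexicon-based sentiment scorer; the lexicon is invented for the sketch, and the capstone itself uses a trained NLP model rather than hand-picked word lists.

```python
# Tiny hand-picked lexicons, for illustration only.
POSITIVE = {"great", "good", "excellent", "love"}
NEGATIVE = {"bad", "terrible", "awful", "hate"}

def sentiment(text):
    """Score text by counting positive vs negative lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"
```

A trained model replaces the word lists with learned representations, which is exactly the gap the capstone project closes.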

Encarta Labs Advantage

  • One-Stop Corporate Training Solution Provider for over 6,000 courses on a variety of subjects
  • All courses are delivered by Industry Veterans
  • Get jumpstarted from newbie to production-ready in a matter of a few days
  • Trained more than 50,000 Corporate executives across the Globe
  • All our training is conducted in workshop mode with a strong focus on hands-on sessions

View our other course offerings by visiting https://www.encartalabs.com/course-catalogue-all.php

Contact us for delivering this course as a public/open-house workshop/online training for a group of 10+ candidates.
