Explore the exciting world of Large Language Models (LLMs) in this hands-on course, covering both no-code and low-code solutions with Python. Learn to set up the software needed to run open-source models such as Google’s Gemma:7b and Meta’s Llama3 locally, without internet access. Advance your skills by streaming responses from LLMs, integrating with the OpenAI API, and using function calling in ChatGPT models. Gain practical experience fetching responses in both batch and streaming modes, and build a small Streamlit web frontend for your LLM API. By the end, you’ll be able to run open-source LLMs locally and harness their full potential.
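To give a flavour of the local-model portion, here is a minimal sketch of streaming a reply from a model running on your own machine. It assumes an Ollama-style setup with the `ollama` Python package installed and the gemma:7b model already pulled; the course description itself does not prescribe a particular runtime, so treat the tooling here as one possible choice.

```python
# Minimal sketch: stream a reply from a locally running open-source model.
# Assumes an Ollama-style setup (`pip install ollama`, `ollama pull gemma:7b`);
# the course does not mandate this specific runtime.
import ollama

stream = ollama.chat(
    model="gemma:7b",
    messages=[{"role": "user", "content": "Explain what a large language model is in one sentence."}],
    stream=True,  # yield partial chunks instead of waiting for the full reply
)

# Print tokens as they arrive, giving the familiar "typing" effect.
for chunk in stream:
    print(chunk["message"]["content"], end="", flush=True)
print()
```

Dropping `stream=True` returns the complete reply in one go instead, which mirrors the batch-versus-streaming comparison mentioned above.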
This course is designed for AI developers, machine learning enthusiasts, and researchers who want to explore and work with open-source large language models alongside ChatGPT. It’s also ideal for professionals aiming to build custom AI solutions using open-source tools and frameworks.
Setting Up and Exploring Local LLMs
Advanced Usage and API Integrations
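To illustrate the API-integration side covered in the advanced module, below is a minimal sketch of function calling with the official `openai` Python client. The `get_weather` tool, its schema, and the model name are illustrative assumptions, not material taken from the course.

```python
# Sketch: function calling via the official `openai` client (v1.x).
# The get_weather tool and its schema are hypothetical examples.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any tool-capable ChatGPT model works here
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# Assumes the model chose to call the tool; inspect its name and arguments.
call = response.choices[0].message.tool_calls[0]
print(call.function.name)                  # -> "get_weather"
print(json.loads(call.function.arguments))  # -> {"city": "Paris"}
```

In a real application you would execute the requested function and send its result back to the model in a follow-up message; the same client also supports streaming responses via `stream=True`.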