
Function Calling Local LLMs!? LLaMa 3 Web Search Agent Breakdown (With Code!)

Adam Lucek

With the power of LangGraph, we can add tool use and function calling to local LLMs! In this video, we go over my entire process and break down all of the code needed to do this yourself, using Llama 3 8B as a research agent with a web search tool. This idea can be extended further to more advanced workflows and additional tools beyond just web search. Enjoy!
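Since Llama 3 8B has no native function-calling API, the agent simulates it: the model is prompted to reply with a small JSON object naming which tool to use, and that reply is parsed to route the workflow. A minimal sketch of the routing step (the prompt wording, tool names, and `route` helper here are illustrative, not the exact code from the video):

```python
import json

# Tools the router is allowed to pick between (illustrative names)
TOOLS = {"web_search", "generate"}

ROUTER_PROMPT = (
    "Given the user question, decide whether it needs a web search.\n"
    'Respond ONLY with JSON like {"choice": "web_search"} '
    'or {"choice": "generate"}.\n'
    "Question: {question}"
)

def route(llm_output: str) -> str:
    """Parse the model's JSON reply and return the chosen tool.

    Falls back to plain generation if the JSON is malformed or names
    an unknown tool, so a bad model reply can't crash the graph.
    """
    try:
        choice = json.loads(llm_output).get("choice", "")
    except json.JSONDecodeError:
        return "generate"
    return choice if choice in TOOLS else "generate"
```

For example, `route('{"choice": "web_search"}')` returns `"web_search"`, while any malformed reply falls back to `"generate"`.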

Code: https://github.com/ALucek/llama3webs...
Ollama Tutorial:    • How To Easily Run & Use LLMs Locally ...  
Ollama Llama 3: https://ollama.com/library/llama3

FIX FOR ERROR 202 WITH DUCKDUCKGO_SEARCH PACKAGE:
pip install -U duckduckgo_search==5.3.0b4

Chapters:
00:00 Intro
00:33 Prerequisite: Ollama
00:59 Workflow Diagram Overview
02:11 How This Simulates Function Calling
03:17 Code Walkthrough Starts
04:10 Defining LLMs
05:15 Setting Up DuckDuckGo API
06:52 Prompting Overview
07:21 Llama 3 Special Token Explanation
09:07 Generation Prompt
11:29 Routing Prompt
13:23 Transform Query Prompt
14:45 Putting It Together With LangGraph
15:40 Defining the State
17:40 Router Node
18:48 Transform Query Node
20:13 Web Search Node
21:15 Generation Node
22:17 Adding Nodes & Edges
25:02 Invoking the Agent!
27:15 LangSmith Trace Overview
29:34 Outro
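The node flow the chapters walk through (router → transform query → web search → generation, tied together by a shared state) can be sketched as a tiny pure-Python state machine. In the video this is built with LangGraph's StateGraph; here the `llm` and `search` callables stand in for Ollama and DuckDuckGo, and all prompt strings and names are assumptions for illustration:

```python
from typing import Callable

def run_agent(question: str,
              llm: Callable[[str], str],
              search: Callable[[str], str]) -> dict:
    """Run the router -> transform-query -> web-search -> generate flow.

    `state` mirrors the shared TypedDict state a LangGraph graph
    would pass between nodes.
    """
    state = {"question": question, "context": "", "answer": ""}

    # Router node: ask the LLM whether the question needs a web search
    needs_search = "web_search" in llm(f"route: {state['question']}")

    if needs_search:
        # Transform-query node: rewrite the question into a search query
        query = llm(f"to search query: {state['question']}")
        # Web-search node: fetch context for the generator
        state["context"] = search(query)

    # Generation node: answer, grounded in any retrieved context
    state["answer"] = llm(
        f"answer: {state['question']}\ncontext: {state['context']}"
    )
    return state
```

Swapping the stubs for real Ollama calls and a DuckDuckGo search function recovers the same control flow the LangGraph version expresses with nodes and conditional edges.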
