
LangGraph Installation & Setup: Build Your First Graph

Complete your LangGraph installation setup in minutes, then build and run your first StateGraph that calls an LLM and returns a response.

Written by Selva Prabhakaran | 15 min read

LangGraph is a Python framework for building stateful AI workflows as graphs — you wire together nodes and edges, and it handles state, loops, and branching for you. In this tutorial, I’ll help you install it, set it up, and build a working graph that calls an LLM.

Maybe you’ve watched demos where AI agents loop, branch, and reason on their own. Looks great — until you sit down and try to code one. Suddenly questions pile up. What do I install? Where does state live? What’s a node, and how is it different from an edge?

This tutorial answers all of that. By the time you’re done, you’ll have a running LangGraph app that takes a question, talks to an LLM, and gives you back an answer. Everything here runs — nothing is pseudocode.

What Is LangGraph?

LangGraph is a Python framework for building stateful, multi-step AI workflows shaped as graphs. The LangChain team built it, but you don’t need LangChain itself to use it.

The concept is straightforward. You create nodes — regular Python functions that carry out tasks. You connect them with edges — arrows that control which function fires next. And you pass around a state dict that every node can read and update.

Why bother with graphs when simple chains exist? Because real AI work isn’t a straight line. A chatbot that uses tools needs to circle back after each tool call. A research agent has to pick a different path based on what it discovers. Graphs handle that loop-and-branch pattern without messy hacks.

python
from importlib.metadata import version
print(f"This tutorial uses LangGraph version: {version('langgraph')}")
python
This tutorial uses LangGraph version: 0.3.34

How Do You Install LangGraph and Its Dependencies?

Bad installs are the top reason beginners get stuck. Let’s get yours right on the first try.

Prerequisites

  • Python version: 3.10+
  • Required libraries: langgraph (0.3+), langchain-openai (0.3+), python-dotenv (1.0+)
  • Install: pip install langgraph langchain-openai python-dotenv
  • API key: An OpenAI API key (get one at platform.openai.com)
  • Time to complete: 20 minutes

Start by making a virtual environment. This keeps your LangGraph packages away from your other projects.

bash
python -m venv langgraph-env
source langgraph-env/bin/activate   # On Windows: langgraph-env\Scripts\activate
pip install langgraph langchain-openai python-dotenv
Tip: Always spin up a fresh virtual environment for LangGraph work. Both LangGraph and LangChain ship updates often. Isolating each project saves you from version clashes that are a pain to untangle.

Make sure everything landed by running these imports. If any line throws an error, that package didn’t install right.

python
from langgraph.graph import StateGraph, START, END
from langchain_openai import ChatOpenAI
from dotenv import load_dotenv
print("All imports successful")
python
All imports successful
Warning: See `ModuleNotFoundError: No module named 'langgraph'`? You’re likely running a different Python than the one you installed to. Run `which python` (or `where python` on Windows) and check. Make sure your virtual environment is active.

How Do You Set Up Your API Key?

The graph framework itself has no need for credentials. The LLM provider does. I’m going with OpenAI in this guide, but you can switch to Anthropic, Google, or a local model later.

Create a .env file at your project root. Hardcoding secrets in source files is risky — one accidental commit and your key is public.

bash
# Create .env file (run once)
echo "OPENAI_API_KEY=sk-your-key-here" > .env

At the top of your script, run load_dotenv(). It scans the .env file and loads each key-value pair as an environment variable.

python
import os
from dotenv import load_dotenv

load_dotenv()
api_key = os.getenv("OPENAI_API_KEY")
print(f"API key loaded: {'Yes' if api_key else 'No -- check your .env file'}")
python
API key loaded: Yes
Key Insight: The graph layer and the LLM layer are fully decoupled. LangGraph handles state and flow. The provider handles text generation. Because they’re independent, you can switch from OpenAI to Anthropic (or a local model) without changing any graph code.
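If you are curious what load_dotenv() actually does, here is a rough, simplified sketch of the mechanism: read KEY=VALUE lines and copy them into environment variables. The `load_env_lines` helper and the `DEMO_API_KEY` name are made up for illustration; the real python-dotenv library also handles quoting, comments, and variable interpolation.

```python
import os

def load_env_lines(lines):
    """Parse simple KEY=VALUE lines into environment variables."""
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        # setdefault mirrors load_dotenv's default behavior: don't
        # override variables already set in the environment.
        os.environ.setdefault(key.strip(), value.strip())

load_env_lines(["# demo secrets", "DEMO_API_KEY=sk-demo-123"])
print(os.environ["DEMO_API_KEY"])  # sk-demo-123
```

That "don't override what's already set" default matters: it lets you shadow a .env value with a real environment variable in production without touching the file.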

What Are the Three Building Blocks: State, Nodes, and Edges?

Every LangGraph app stands on three ideas. Get these down and the rest is just details.

State — a typed Python dict (TypedDict) that flows through the graph. Picture a shared notepad: each node can read it and scribble updates.

Nodes — ordinary Python functions. A node receives the state, does its thing, and returns a dict with the fields it wants to change. That’s the whole deal.

Edges — wires between nodes that decide the order of execution. The simplest edge says: “after A finishes, run B.”

Let me show you how these look in code. Below, I define a state with a single messages field and use Annotated to tell LangGraph how updates should merge.

python
from typing import TypedDict, Annotated
from langgraph.graph import StateGraph, START, END
import operator

class State(TypedDict):
    messages: Annotated[list, operator.add]

That Annotated[list, operator.add] line deserves a closer look. It says: “When a node sends back new messages, append them to whatever is already there.” Drop this tag and every node would overwrite the whole list. This one annotation is probably the most critical detail in LangGraph state design.
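You can see the reducer's behavior with plain Python, no graph required. `operator.add` on two lists is just concatenation, which is exactly the merge LangGraph applies to the annotated field:

```python
import operator

existing = ["first message"]
update = ["second message"]

# The merge LangGraph performs when messages is annotated with
# operator.add: concatenate the new items onto the existing list.
merged = operator.add(existing, update)
print(merged)  # ['first message', 'second message']

# Without the reducer, the update would simply replace the field:
overwritten = update
print(overwritten)  # ['second message']
```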

How Do You Create Your First StateGraph?

Time to write some code. Let’s make a graph that takes a user’s name and prints a greeting. No LLM for now — just pure Python functions linked by edges.

I’ll use two nodes. One cleans up the raw name. The other builds a greeting from the cleaned version. State moves data between them.

python
from typing import TypedDict
from langgraph.graph import StateGraph, START, END


class GreetingState(TypedDict):
    name: str
    greeting: str


def format_name(state: GreetingState) -> dict:
    """Capitalize the user's name."""
    return {"name": state["name"].strip().title()}


def generate_greeting(state: GreetingState) -> dict:
    """Create a greeting using the formatted name."""
    return {"greeting": f"Hello, {state['name']}! Welcome to LangGraph."}

Notice the pattern: each function gets the full state as input but only sends back the keys it modified. format_name edits name. generate_greeting sets greeting. LangGraph merges those partial updates into the main state behind the scenes.
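To make that merge concrete, here is a plain-Python simulation of the same flow: run each node in order and fold its partial return dict back into the shared state. This is a sketch of the mechanics, not LangGraph's actual internals.

```python
def format_name(state: dict) -> dict:
    """Capitalize the user's name."""
    return {"name": state["name"].strip().title()}

def generate_greeting(state: dict) -> dict:
    """Create a greeting using the formatted name."""
    return {"greeting": f"Hello, {state['name']}! Welcome to LangGraph."}

state = {"name": "  alice  ", "greeting": ""}

# What the runtime does conceptually: run each node in edge order,
# then merge its partial update back into the shared state.
for node in (format_name, generate_greeting):
    state.update(node(state))

print(state)
# {'name': 'Alice', 'greeting': 'Hello, Alice! Welcome to LangGraph.'}
```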

How Do You Add Nodes, Edges, and Run the Graph?

Now let’s connect the pieces. Call add_node to attach a function to a label. Call add_edge to link one label to another.

Two special constants matter here: START is the entry point and END is the exit. Every graph must include both.

python
graph = StateGraph(GreetingState)

graph.add_node("format_name", format_name)
graph.add_node("generate_greeting", generate_greeting)

graph.add_edge(START, "format_name")
graph.add_edge("format_name", "generate_greeting")
graph.add_edge("generate_greeting", END)

print("Graph built with 2 nodes and 3 edges")
python
Graph built with 2 nodes and 3 edges

The route is linear: START → format_name → generate_greeting → END. Input goes in, walks through both nodes in order, and comes out.

But a StateGraph is still a blueprint — you can’t execute it directly. Convert it to a runnable form with .compile(), which hands you a compiled graph object. Then call .invoke() with your starting state.

python
app = graph.compile()

result = app.invoke({"name": "  alice  ", "greeting": ""})
print(result)
python
{'name': 'Alice', 'greeting': 'Hello, Alice! Welcome to LangGraph.'}

Follow the data: the input was " alice " — extra spaces, all lowercase. format_name stripped the spaces and capitalized it to "Alice". Then generate_greeting used that polished name. One job per node, state as the messenger.

Tip: `.compile()` is a one-time step. Build the graph, compile it, and then reuse the compiled object for every `.invoke()` call. Re-compiling before each run adds overhead for no gain.

How Do You Build a Graph That Calls an LLM?

The greeting example taught you the mechanics. Now let’s tap into LangGraph’s real strength — orchestrating LLM calls. We’ll build a graph that accepts a question, feeds it to a model, and returns the response.

The graph needs just one node: call_llm. It grabs messages from state, passes them to OpenAI’s GPT, and writes the reply back. Annotated[list, operator.add] makes sure messages stack up rather than overwrite.

python
import operator
from typing import Annotated, TypedDict
from langgraph.graph import StateGraph, START, END
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

class ChatState(TypedDict):
    messages: Annotated[list, operator.add]

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

def call_llm(state: ChatState) -> dict:
    """Send messages to the LLM and append the response."""
    response = llm.invoke(state["messages"])
    return {"messages": [response]}

ChatOpenAI grabs OPENAI_API_KEY from the environment by itself — no need to pass it explicitly. And call_llm wraps the response in a list because the operator.add reducer expects a list to concatenate.

Wire the node into a graph: one node, two edges. You can’t get simpler.

python
chat_graph = StateGraph(ChatState)
chat_graph.add_node("call_llm", call_llm)
chat_graph.add_edge(START, "call_llm")
chat_graph.add_edge("call_llm", END)

chat_app = chat_graph.compile()

Run it with a question. The starting state holds one HumanMessage.

python
result = chat_app.invoke({
    "messages": [HumanMessage(content="What is LangGraph in one sentence?")]
})

print(result["messages"][-1].content)
python
LangGraph is a Python framework for building stateful, multi-step AI agent workflows using a graph-based architecture with nodes and edges.

And there you have it — a complete LangGraph application in a few lines. A question goes in, the model thinks, and the answer comes back. The exciting part? Growing this is easy. Add more nodes for retrieval, tool calls, or validation — nothing you built so far needs to change.

Key Insight: Five steps, every time: define state → write nodes → draw edges → compile → invoke. Once this rhythm becomes second nature, building any new workflow is just a matter of plugging in extra nodes.

How Do You Visualize Your Graph?

Pipelines get confusing fast once you add more nodes. LangGraph ships with drawing tools so you can see the full flow in seconds.

Grab the structure with .get_graph(), then call draw_ascii() for a quick text diagram that works anywhere.

python
print(chat_app.get_graph().draw_ascii())
python
+-----------+
| __start__ |
+-----------+
      *
      *
      *
+----------+
| call_llm |
+----------+
      *
      *
      *
 +---------+
 | __end__ |
 +---------+

Need a nicer visual? draw_mermaid() spits out Mermaid markup. Copy it into mermaid.live for a polished flowchart.

python
print(chat_app.get_graph().draw_mermaid())
python
%%{init: {'flowchart': {'curve': 'linear'}}}%%
graph TD;
    __start__([<p>__start__</p>]):::first
    call_llm(call_llm)
    __end__([<p>__end__</p>]):::last
    __start__ --> call_llm;
    call_llm --> __end__;
    classDef default fill:#f2f0ff,line-height:1.2
    classDef first fill:#DAE8FC
    classDef last fill:#baffc9
Tip: Working in a Jupyter notebook? Render a PNG inline with `from IPython.display import Image, display` then `display(Image(chat_app.get_graph().draw_mermaid_png()))`. This hits the Mermaid.ink API, so you’ll need internet access.

What Are the Best Ways to Debug a LangGraph App?

Bugs are part of the process. Here are the tactics I reach for first when a graph misbehaves.

Pass debug=True to .compile(). This prints every state transition, so you can trace exactly what each node received and returned.

python
debug_app = chat_graph.compile(debug=True)

Add print statements inside nodes. When the final output looks wrong, the culprit is almost always one node’s return dict. A quick print() during dev makes the issue obvious.

python
def call_llm_debug(state: ChatState) -> dict:
    """Send messages to LLM with debug printing."""
    print(f"[DEBUG] Input messages: {len(state['messages'])}")
    response = llm.invoke(state["messages"])
    print(f"[DEBUG] Response type: {type(response)}")
    return {"messages": [response]}

Verify your dict keys. A subtle trap: if a node returns a key that isn’t in your TypedDict, LangGraph silently drops it. When an update seems to vanish, match the key spelling against your state definition.
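One quick dev-time check for that trap is to compare a node's return keys against the fields your TypedDict declares — they live in `__annotations__`. This is a small sanity check of my own, not a LangGraph feature, and the misspelled `greetting` key is a deliberate example typo:

```python
from typing import TypedDict

class GreetingState(TypedDict):
    name: str
    greeting: str

# The declared state fields — any update key must be one of these.
expected_keys = set(GreetingState.__annotations__)
print(expected_keys)  # {'name', 'greeting'}

# A misspelled key would never make it into the final state:
update = {"greetting": "Hello!"}  # typo: extra 't'
unknown = set(update) - expected_keys
print(f"Unknown keys in update: {unknown}")  # {'greetting'}
```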

What Are the Most Common Setup Errors (and How Do You Fix Them)?

Error 1: ModuleNotFoundError: No module named 'langgraph'

Almost every time, this means the venv isn’t active — or you installed the package in one environment but you’re running code in another.

bash
# Check which Python you're using
which python        # Linux/Mac
where python        # Windows

# Verify langgraph is installed in THAT environment
pip show langgraph

Error 2: AuthenticationError from OpenAI

The key is either missing or invalid. Verify three things: (1) .env lives in the project root, (2) load_dotenv() runs before you create the ChatOpenAI object, and (3) the key starts with sk-.

python
import os
from dotenv import load_dotenv

load_dotenv()
key = os.getenv("OPENAI_API_KEY", "NOT SET")
print(f"Key starts with: {key[:5]}...")

Error 3: State updates vanish without a trace

This pops up when a node returns the wrong type. If your state declares messages: list but a node hands back {"messages": "hello"} (a string instead of a list), the merge logic breaks silently.

Wrong:

python
def bad_node(state):
    return {"messages": "hello"}

Correct:

python
def good_node(state):
    return {"messages": ["hello"]}
Warning: Missing `operator.add` is the top beginner mistake. If you skip `Annotated[list, operator.add]`, every `{"messages": ["new"]}` return replaces the full list instead of extending it. All prior messages disappear with no error.

Complete Code

Click to expand the full script (copy-paste and run)
python
# Complete code from: LangGraph Installation, Setup, and Your First Graph
# Requires: pip install langgraph langchain-openai python-dotenv
# Python 3.10+

import os
import operator
from typing import Annotated, TypedDict
from dotenv import load_dotenv
from langgraph.graph import StateGraph, START, END
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

# --- Setup ---
load_dotenv()

# --- Example 1: Simple Greeting Graph (no LLM) ---

class GreetingState(TypedDict):
    name: str
    greeting: str


def format_name(state: GreetingState) -> dict:
    """Capitalize the user's name."""
    return {"name": state["name"].strip().title()}


def generate_greeting(state: GreetingState) -> dict:
    """Create a greeting using the formatted name."""
    return {"greeting": f"Hello, {state['name']}! Welcome to LangGraph."}


graph = StateGraph(GreetingState)
graph.add_node("format_name", format_name)
graph.add_node("generate_greeting", generate_greeting)
graph.add_edge(START, "format_name")
graph.add_edge("format_name", "generate_greeting")
graph.add_edge("generate_greeting", END)

app = graph.compile()
result = app.invoke({"name": "  alice  ", "greeting": ""})
print("--- Greeting Graph ---")
print(result)

# --- Example 2: LLM Chat Graph ---

class ChatState(TypedDict):
    messages: Annotated[list, operator.add]


llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)


def call_llm(state: ChatState) -> dict:
    """Send messages to the LLM and append the response."""
    response = llm.invoke(state["messages"])
    return {"messages": [response]}


chat_graph = StateGraph(ChatState)
chat_graph.add_node("call_llm", call_llm)
chat_graph.add_edge(START, "call_llm")
chat_graph.add_edge("call_llm", END)

chat_app = chat_graph.compile()

print("\n--- Chat Graph ---")
result = chat_app.invoke({
    "messages": [HumanMessage(content="What is LangGraph in one sentence?")]
})
print(result["messages"][-1].content)

# --- Visualization ---
print("\n--- Graph Structure ---")
print(chat_app.get_graph().draw_ascii())

print("\nScript completed successfully.")

Summary

Here’s what you built in this tutorial:

  1. Installed LangGraph with pip install langgraph langchain-openai python-dotenv inside a fresh virtual environment.
  2. Set up API key handling with a .env file and load_dotenv().
  3. Learned the three core pieces — state (TypedDict), nodes (functions), and edges (wires between them).
  4. Built a greeting graph using StateGraph, add_node, add_edge, .compile(), and .invoke().
  5. Built an LLM graph that sends user messages to GPT and brings back answers.
  6. Drew your graph with draw_ascii() and draw_mermaid().
  7. Fixed common bugs — import errors, key issues, and state gotchas.

Practice exercise: Build a two-node graph where the first node takes a topic string and lowercases it, and the second node calls the LLM to produce a one-sentence definition. Test it with {"topic": " MACHINE LEARNING ", "definition": ""}.

Click to see the solution
python
class DefState(TypedDict):
    topic: str
    definition: str

def clean_topic(state: DefState) -> dict:
    return {"topic": state["topic"].strip().lower()}

def define_topic(state: DefState) -> dict:
    msg = HumanMessage(content=f"Define {state['topic']} in one sentence.")
    response = llm.invoke([msg])
    return {"definition": response.content}

def_graph = StateGraph(DefState)
def_graph.add_node("clean_topic", clean_topic)
def_graph.add_node("define_topic", define_topic)
def_graph.add_edge(START, "clean_topic")
def_graph.add_edge("clean_topic", "define_topic")
def_graph.add_edge("define_topic", END)

def_app = def_graph.compile()
result = def_app.invoke({"topic": "  MACHINE LEARNING  ", "definition": ""})
print(f"Topic: {result['topic']}")
print(f"Definition: {result['definition']}")

`clean_topic` strips the spaces and lowercases the text to “machine learning”. `define_topic` asks the LLM for a one-line definition and saves the reply. State shuttles data between the two nodes — same pattern as the greeting graph.

Up next, you’ll learn about conditional edges — branches where the graph picks which node to run based on the current state. That’s where LangGraph starts to feel like a real agent framework.

Frequently Asked Questions

Do You Need LangChain to Use LangGraph?

Nope. LangGraph depends on langchain-core for message types and model wrappers, but the full langchain bundle isn’t required. Install langgraph plus your provider package (e.g., langchain-openai) and you’re set.

Can You Use LangGraph with Anthropic, Google, or Local Models?

Yes — swap langchain-openai for langchain-anthropic, langchain-google-genai, or langchain-ollama for local models. The graph code doesn’t change at all; only the line where you create the LLM object differs.

python
# Anthropic
from langchain_anthropic import ChatAnthropic
llm = ChatAnthropic(model="claude-sonnet-4-20250514")

# Local via Ollama
from langchain_ollama import ChatOllama
llm = ChatOllama(model="llama3")

What’s the Difference Between StateGraph and MessageGraph?

MessageGraph was a convenience class where the entire state was a flat message list. It’s deprecated now. The recommended path is StateGraph with Annotated[list, operator.add] on your messages field — more flexible and actively maintained.

python
# Recommended approach
class State(TypedDict):
    messages: Annotated[list, operator.add]

graph = StateGraph(State)

How Do You Add Conditional Branching?

Replace add_edge with add_conditional_edges. You provide a routing function that inspects the current state and returns the label of the node to run next. I walk through this fully in the next post on conditional edges and routing.
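As a small taste, the routing function itself is just a plain function of the state that returns the label of the next node. The `should_continue` name, the dict-shaped messages, and the `run_tool`/`finish` node names below are all made up for illustration (real LangChain messages are objects, not dicts), and the wiring call is left as a comment since the full branching graph belongs to the next tutorial:

```python
def should_continue(state: dict) -> str:
    """Route based on whether the last message requested a tool call."""
    last = state["messages"][-1]
    if last.get("tool_call"):
        return "run_tool"   # hypothetical tool-execution node
    return "finish"         # hypothetical terminal node

# Wiring it in would look roughly like:
# graph.add_conditional_edges("call_llm", should_continue,
#                             {"run_tool": "run_tool", "finish": "finish"})

print(should_continue({"messages": [{"tool_call": True}]}))   # run_tool
print(should_continue({"messages": [{"content": "done"}]}))   # finish
```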


Reviewed: March 2026 | LangGraph 0.3+ | Python 3.10+
