LangGraph Installation, Setup, and Your First Graph

Written by Selva Prabhakaran | 14 min read

You’ve heard about LangGraph. Maybe you’ve seen demos of AI agents that loop, branch, and make choices on their own. Cool stuff. But when you sat down to build one, the setup felt murky. Which packages do you need? How does state work? What’s a node versus an edge?

This tutorial clears all of that up. By the end, you’ll have a running LangGraph app that takes user input, sends it to an LLM, and returns a response. Every line of code runs — no hand-waving.

What Is LangGraph?

LangGraph is a Python framework for building stateful, multi-step AI workflows as graphs. The LangChain team built it, but you don’t need LangChain to use it.

The core idea is simple. You define nodes (Python functions that do work) and edges (links that control the flow). LangGraph passes a shared state object between nodes. Each node reads the state, does its job, and sends back updates.

Why graphs instead of simple chains? Because real AI workflows need loops, branches, and if-this-then-that logic. A chatbot that calls tools needs to loop back after each tool call. A research agent needs to branch based on what it finds. Graphs handle that with ease.

python
import langgraph
print(f"This tutorial uses LangGraph version: {langgraph.__version__}")
Output:
This tutorial uses LangGraph version: 0.3.34

Installing LangGraph and Its Packages

Getting the install wrong is the top reason beginners get stuck. Let’s get it right the first time.

What You Need

  • Python version: 3.10+

  • Required libraries: langgraph (0.3+), langchain-openai (0.3+), python-dotenv (1.0+)

  • Install: pip install langgraph langchain-openai python-dotenv

  • API key: An OpenAI API key (get one at platform.openai.com)

  • Time to complete: 20 minutes

Create a virtual environment first. This keeps your LangGraph packages separate from other projects.

shell
python -m venv langgraph-env
source langgraph-env/bin/activate   # On Windows: langgraph-env\Scripts\activate
pip install langgraph langchain-openai python-dotenv

Tip: Always use a virtual environment for LangGraph projects. LangGraph and LangChain get updates often. Keeping each project on its own avoids version clashes that are hard to debug.

Check that everything works by running these imports. If any line throws an error, that package didn’t install right.

python
from langgraph.graph import StateGraph, START, END
from langchain_openai import ChatOpenAI
from dotenv import load_dotenv
print("All imports successful")
Output:
All imports successful

Warning: If you see ModuleNotFoundError: No module named 'langgraph', you’re likely running a different Python than the one where you installed the package. Run which python (or where python on Windows) to check. Make sure your virtual environment is active.
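
You can also ask Python itself which interpreter is running — a minimal stdlib check, no LangGraph required:

```python
import sys

# Path of the interpreter currently running -- it should point inside
# your virtual environment (e.g. .../langgraph-env/bin/python)
print(f"Interpreter: {sys.executable}")

# sys.prefix differs from sys.base_prefix when a virtual environment is active
in_venv = sys.prefix != sys.base_prefix
print(f"Virtual environment active: {in_venv}")
```

If this prints False, activate your environment and reinstall before going further.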

Setting Up Your API Key

LangGraph itself doesn’t need an API key. But calling an LLM does. We’ll use OpenAI here, though you can swap in Anthropic, Google, or any other provider later.

Create a .env file in your project root. Never hard-code API keys in your Python files — one bad git push and your key is public.

shell
# Create .env file (run once)
echo "OPENAI_API_KEY=sk-your-key-here" > .env

Load it at the start of your script with load_dotenv(). This reads the .env file and sets each line as an environment variable.

python
import os
from dotenv import load_dotenv

load_dotenv()
api_key = os.getenv("OPENAI_API_KEY")
print(f"API key loaded: {'Yes' if api_key else 'No -- check your .env file'}")
Output:
API key loaded: Yes

Key Insight: LangGraph keeps flow and LLM calls separate. The graph framework handles state and routing. The LLM provider handles text output. You can switch providers without touching your graph logic.

The Three Building Blocks: State, Nodes, and Edges

Every LangGraph app has exactly three parts. Get these right and the rest falls into place.

State is a Python dict (defined with TypedDict) that flows through the graph. Think of it as a shared notebook that every node can read and write to.

Nodes are Python functions. Each one takes the current state as input and returns a dict of updates. That’s it — a function that reads state and returns changes.

Edges link nodes and control flow. They tell LangGraph which node runs next. The simplest edge is a direct link: “after node A, run node B.”

Here’s how these three pieces look in code. We define a state with a single messages field, using Annotated to tell LangGraph how to handle updates.

python
from typing import TypedDict, Annotated
from langgraph.graph import StateGraph, START, END
import operator

class State(TypedDict):
    messages: Annotated[list, operator.add]

That Annotated[list, operator.add] line is worth a closer look. It tells LangGraph: “When a node sends back new messages, append them to the list.” Without this, each node would replace the whole messages list. This is the single most important detail in LangGraph state design.
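
You can see the reducer's behavior with plain Python. operator.add on two lists is just list concatenation, which is exactly the merge LangGraph applies for each update:

```python
import operator

existing = ["first message"]
update = ["second message"]

# The merge LangGraph performs for Annotated[list, operator.add]:
merged = operator.add(existing, update)   # same as existing + update
print(merged)   # ['first message', 'second message']
# Without a reducer, the update would simply replace the old value instead.
```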

Creating Your First StateGraph

Enough theory. Let’s build a graph that does something real: take a user’s name and create a greeting. No LLM yet — just plain Python functions wired together.

We need two nodes. The first cleans the user’s name. The second builds a greeting from the clean name. The state carries data between them.

python
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

class GreetingState(TypedDict):
    name: str
    greeting: str

def format_name(state: GreetingState) -> dict:
    """Capitalize the user's name."""
    return {"name": state["name"].strip().title()}

def generate_greeting(state: GreetingState) -> dict:
    """Create a greeting using the formatted name."""
    return {"greeting": f"Hello, {state['name']}! Welcome to LangGraph."}

Each function takes the full state and returns only the fields it wants to change. format_name updates the name field. generate_greeting updates greeting. LangGraph merges these changes into the state on its own.
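
Because nodes are ordinary functions, you can call them directly with a dict — handy for checking each piece before wiring the graph. The two functions are repeated here so the snippet is self-contained:

```python
from typing import TypedDict

class GreetingState(TypedDict):
    name: str
    greeting: str

def format_name(state: GreetingState) -> dict:
    """Capitalize the user's name."""
    return {"name": state["name"].strip().title()}

def generate_greeting(state: GreetingState) -> dict:
    """Create a greeting using the formatted name."""
    return {"greeting": f"Hello, {state['name']}! Welcome to LangGraph."}

# Call each node by hand -- no graph, no compile, no invoke
step1 = format_name({"name": "  alice  ", "greeting": ""})
print(step1)              # {'name': 'Alice'}
step2 = generate_greeting({"name": step1["name"], "greeting": ""})
print(step2["greeting"])  # Hello, Alice! Welcome to LangGraph.
```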

Adding Nodes, Edges, and Running the Graph

With the state and functions ready, we wire them into a graph. The add_node method signs up a function under a name. The add_edge method creates a link between two nodes.

START and END are special constants from LangGraph. START marks where things kick off. END marks where they stop. Every graph needs both.

python
graph = StateGraph(GreetingState)

graph.add_node("format_name", format_name)
graph.add_node("generate_greeting", generate_greeting)

graph.add_edge(START, "format_name")
graph.add_edge("format_name", "generate_greeting")
graph.add_edge("generate_greeting", END)

print("Graph built with 2 nodes and 3 edges")
Output:
Graph built with 2 nodes and 3 edges

The flow is straight: START -> format_name -> generate_greeting -> END. Data enters at START, passes through both nodes in order, and exits at END.

A StateGraph is a blueprint — you can’t run it as-is. Calling .compile() turns it into a CompiledGraph you can invoke. Then .invoke() runs the graph with a starting state.

python
app = graph.compile()

result = app.invoke({"name": "  alice  ", "greeting": ""})
print(result)
Output:
{'name': 'Alice', 'greeting': 'Hello, Alice! Welcome to LangGraph.'}

See what happened? The input had messy whitespace: " alice ". The format_name node cleaned it to "Alice". Then generate_greeting used the clean name. Each node did one job, and the state carried the result forward.

Tip: Think of .compile() like building an app. Define the graph once, compile once, then call .invoke() as many times as you like with different inputs. Don’t recompile between calls — that’s wasted work.

Building a Graph That Calls an LLM

The greeting graph showed you the basics. But the real power of LangGraph is running LLM calls. Let’s build a graph that sends user input to an LLM and returns what it says.

This graph has one node: call_llm. It reads the user’s messages from state, sends them to OpenAI’s GPT model, and stores the response back in state. We use Annotated[list, operator.add] so messages pile up across calls.

python
import operator
from typing import Annotated, TypedDict
from langgraph.graph import StateGraph, START, END
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

class ChatState(TypedDict):
    messages: Annotated[list, operator.add]

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

def call_llm(state: ChatState) -> dict:
    """Send messages to the LLM and append the response."""
    response = llm.invoke(state["messages"])
    return {"messages": [response]}

ChatOpenAI reads the OPENAI_API_KEY from the environment on its own — you don’t pass it by hand. The call_llm function wraps the response in a list because operator.add needs a list to append.
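
If you want to exercise the node's logic without an API key, you can swap in a stub with the same .invoke interface. FakeLLM and FakeResponse below are hypothetical stand-ins, not part of LangChain — a sketch for offline testing only:

```python
class FakeResponse:
    """Stands in for the message object a real model returns."""
    def __init__(self, content: str):
        self.content = content

class FakeLLM:
    """Hypothetical stand-in with the same .invoke(messages) shape as ChatOpenAI."""
    def invoke(self, messages):
        return FakeResponse(f"(fake reply to {len(messages)} message(s))")

llm = FakeLLM()

def call_llm(state: dict) -> dict:
    """Same node logic as above, now hitting the stub instead of OpenAI."""
    response = llm.invoke(state["messages"])
    return {"messages": [response]}

result = call_llm({"messages": ["hello"]})
print(result["messages"][-1].content)   # (fake reply to 1 message(s))
```

Because the graph never cares what llm is, the node works identically with the real ChatOpenAI or the stub.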

Wire the node into a graph. One node, two edges. It doesn’t get simpler.

python
chat_graph = StateGraph(ChatState)
chat_graph.add_node("call_llm", call_llm)
chat_graph.add_edge(START, "call_llm")
chat_graph.add_edge("call_llm", END)

chat_app = chat_graph.compile()

Run it with a question. The input state holds a single HumanMessage.

python
result = chat_app.invoke({
    "messages": [HumanMessage(content="What is LangGraph in one sentence?")]
})

print(result["messages"][-1].content)
Output:
LangGraph is a Python framework for building stateful, multi-step AI agent workflows using a graph-based architecture with nodes and edges.

That’s a full LangGraph app. User input goes in, the LLM handles it, and the answer comes back. What makes this exciting is how easy it is to grow. You can drop in more nodes — retrieval, tool calls, checks — without changing the shape of anything.

Key Insight: Every LangGraph app follows the same five steps: define state, write node functions, link with edges, compile, invoke. Once this clicks, building any workflow is just a matter of adding more nodes.

Drawing Your Graph

Ever built a pipeline and lost track of which step links to which? LangGraph’s built-in drawing tools fix that. You can render your graph as a diagram with one method call.

The .get_graph() method returns the graph layout. From there, draw_ascii() prints a text diagram that works in any terminal.

python
print(chat_app.get_graph().draw_ascii())
Output:
+-----------+
| __start__ |
+-----------+
      *
      *
      *
 +----------+
 | call_llm |
 +----------+
      *
      *
      *
 +---------+
 | __end__ |
 +---------+

For nicer visuals, draw_mermaid() outputs Mermaid syntax. Paste it into mermaid.live to see a styled flowchart.

python
print(chat_app.get_graph().draw_mermaid())
Output:
%%{init: {'flowchart': {'curve': 'linear'}}}%%
graph TD;
    __start__([<p>__start__</p>]):::first
    call_llm(call_llm)
    __end__([<p>__end__</p>]):::last
    __start__ --> call_llm;
    call_llm --> __end__;
    classDef default fill:#f2f0ff,line-height:1.2
    classDef first fill:#DAE8FC
    classDef last fill:#baffc9

Tip: In Jupyter notebooks, render a PNG right away with from IPython.display import Image, display then display(Image(chat_app.get_graph().draw_mermaid_png())). This calls the Mermaid.ink API, so you need web access.

Debugging Tips for LangGraph

When things go wrong — and they will — LangGraph gives you tools to trace what happened. Here are the tricks that save the most time.

Use debug=True when compiling. This prints every state change to the console, so you can see what each node got and what it sent back.

python
debug_app = chat_graph.compile(debug=True)

Print state inside your nodes. If the final output looks wrong, the bug is usually in one node’s return value. Add print lines during dev.

python
def call_llm_debug(state: ChatState) -> dict:
    """Send messages to LLM with debug printing."""
    print(f"[DEBUG] Input messages: {len(state['messages'])}")
    response = llm.invoke(state["messages"])
    print(f"[DEBUG] Response type: {type(response)}")
    return {"messages": [response]}

Check your key names. A sneaky bug: if you return a key that doesn’t exist in your TypedDict, LangGraph just ignores it. If a node’s update isn’t taking effect, make sure the key name matches your state exactly.
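
One way to catch mismatched keys early is a small wrapper that compares a node's return keys against the state's declared fields. check_keys below is a hypothetical helper built on TypedDict's __annotations__ — not a LangGraph feature:

```python
from typing import TypedDict

class ChatState(TypedDict):
    messages: list

def check_keys(node_fn, state_cls):
    """Wrap a node so unknown return keys raise instead of vanishing silently."""
    def wrapped(state):
        updates = node_fn(state)
        unknown = set(updates) - set(state_cls.__annotations__)
        if unknown:
            raise KeyError(f"{node_fn.__name__} returned unknown keys: {unknown}")
        return updates
    return wrapped

def typo_node(state):
    return {"mesages": ["oops"]}   # misspelled key -- LangGraph would drop it

safe_node = check_keys(typo_node, ChatState)
try:
    safe_node({"messages": []})
    caught = None
except KeyError as exc:
    caught = exc
print(caught)   # names the misspelled 'mesages' key
```

Wrap nodes this way during development, then drop the wrapper once the graph behaves.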

Common Setup Errors and How to Fix Them

Error 1: ModuleNotFoundError: No module named 'langgraph'

This almost always means your virtual environment isn’t active. Or you installed LangGraph in a different environment than the one you’re running.

shell
# Check which Python you're using
which python        # Linux/Mac
where python        # Windows

# Verify langgraph is installed in THAT environment
pip show langgraph

Error 2: AuthenticationError from OpenAI

Your API key is missing or bad. Check three things: the .env file exists in your project root, load_dotenv() runs before ChatOpenAI(), and the key starts with sk-.

python
import os
from dotenv import load_dotenv

load_dotenv()
key = os.getenv("OPENAI_API_KEY", "NOT SET")
print(f"Key starts with: {key[:5]}...")

Error 3: State updates that vanish

This happens when a node returns the wrong type for a field. If your state expects messages: list and a node returns {"messages": "hello"}, LangGraph won’t merge it right.

Wrong:

python
def bad_node(state):
    return {"messages": "hello"}

Right:

python
def good_node(state):
    return {"messages": ["hello"]}
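
You can trigger the mismatch by hand with the reducer itself — concatenating a list with a string raises TypeError, which is why the string form never merges cleanly:

```python
import operator

# What the reducer would attempt with the wrong return type:
try:
    operator.add(["earlier message"], "hello")   # list + str
    reducer_error = None
except TypeError as e:
    reducer_error = e
print(f"Reducer would fail: {reducer_error}")

# The correct shape concatenates cleanly:
merged = operator.add(["earlier message"], ["hello"])
print(merged)   # ['earlier message', 'hello']
```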

Warning: Missing operator.add is the #1 beginner bug. Without Annotated[list, operator.add], returning {"messages": ["new"]} wipes out the whole messages list instead of appending. You lose all earlier messages with no warning.

Complete Code

Full script (copy-paste and run):
python
# Complete code from: LangGraph Installation, Setup, and Your First Graph
# Requires: pip install langgraph langchain-openai python-dotenv
# Python 3.10+

import os
import operator
from typing import Annotated, TypedDict
from dotenv import load_dotenv
from langgraph.graph import StateGraph, START, END
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

# --- Setup ---
load_dotenv()

# --- Example 1: Simple Greeting Graph (no LLM) ---

class GreetingState(TypedDict):
    name: str
    greeting: str

def format_name(state: GreetingState) -> dict:
    """Capitalize the user's name."""
    return {"name": state["name"].strip().title()}

def generate_greeting(state: GreetingState) -> dict:
    """Create a greeting using the formatted name."""
    return {"greeting": f"Hello, {state['name']}! Welcome to LangGraph."}

graph = StateGraph(GreetingState)
graph.add_node("format_name", format_name)
graph.add_node("generate_greeting", generate_greeting)
graph.add_edge(START, "format_name")
graph.add_edge("format_name", "generate_greeting")
graph.add_edge("generate_greeting", END)

app = graph.compile()
result = app.invoke({"name": "  alice  ", "greeting": ""})
print("--- Greeting Graph ---")
print(result)

# --- Example 2: LLM Chat Graph ---

class ChatState(TypedDict):
    messages: Annotated[list, operator.add]

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

def call_llm(state: ChatState) -> dict:
    """Send messages to the LLM and append the response."""
    response = llm.invoke(state["messages"])
    return {"messages": [response]}

chat_graph = StateGraph(ChatState)
chat_graph.add_node("call_llm", call_llm)
chat_graph.add_edge(START, "call_llm")
chat_graph.add_edge("call_llm", END)

chat_app = chat_graph.compile()

print("\n--- Chat Graph ---")
result = chat_app.invoke({
    "messages": [HumanMessage(content="What is LangGraph in one sentence?")]
})
print(result["messages"][-1].content)

# --- Visualization ---
print("\n--- Graph Structure ---")
print(chat_app.get_graph().draw_ascii())

print("\nScript completed successfully.")

Summary

Here’s what you built in this tutorial:

  • Installed LangGraph with pip install langgraph langchain-openai python-dotenv inside a virtual environment.

  • Set up API key handling using a .env file and load_dotenv().

  • Learned the three building blocks — state (TypedDict), nodes (functions), and edges (links).

  • Built a greeting graph with StateGraph, add_node, add_edge, .compile(), and .invoke().

  • Built an LLM-calling graph that sends user messages to GPT and returns responses.

  • Drew graphs with draw_ascii() and draw_mermaid().

  • Fixed common bugs like import errors, API key problems, and state mix-ups.

Practice exercise: Build a two-node graph where the first node takes a topic string and lowercases it, and the second node calls the LLM to create a one-line definition. Test it with {"topic": " MACHINE LEARNING ", "definition": ""}.

Solution:
python
class DefState(TypedDict):
    topic: str
    definition: str

def clean_topic(state: DefState) -> dict:
    return {"topic": state["topic"].strip().lower()}

def define_topic(state: DefState) -> dict:
    msg = HumanMessage(content=f"Define {state['topic']} in one sentence.")
    response = llm.invoke([msg])
    return {"definition": response.content}

def_graph = StateGraph(DefState)
def_graph.add_node("clean_topic", clean_topic)
def_graph.add_node("define_topic", define_topic)
def_graph.add_edge(START, "clean_topic")
def_graph.add_edge("clean_topic", "define_topic")
def_graph.add_edge("define_topic", END)

def_app = def_graph.compile()
result = def_app.invoke({"topic": "  MACHINE LEARNING  ", "definition": ""})
print(f"Topic: {result['topic']}")
print(f"Definition: {result['definition']}")

The clean_topic node strips whitespace and lowercases to “machine learning”. The define_topic node sends a prompt to the LLM and stores what it says. The state carries data between nodes, just like the greeting graph.

The next step in the learning path is adding conditional edges — branches where the graph picks which node to run based on the current state. That’s where LangGraph starts to feel like a real agent framework.

Frequently Asked Questions

Do I need LangChain to use LangGraph?

No. LangGraph depends on langchain-core for message types and LLM hooks, but you don’t need the full langchain package. Install langgraph plus a provider package like langchain-openai and you’re good to go.

Can I use LangGraph with Anthropic, Google, or local models?

Yes. Swap langchain-openai for langchain-anthropic or langchain-google-genai. For local models, use langchain-ollama. The graph stays the same — only the LLM setup line changes.

python
# Anthropic
from langchain_anthropic import ChatAnthropic
llm = ChatAnthropic(model="claude-sonnet-4-20250514")

# Local via Ollama
from langchain_ollama import ChatOllama
llm = ChatOllama(model="llama3")

What’s the difference between StateGraph and MessageGraph?

MessageGraph was a shortcut where the whole state was a message list. It’s deprecated now. Use StateGraph with Annotated[list, operator.add] for the messages field instead. It’s more flexible and is the way to go from here on.

python
# Recommended approach
class State(TypedDict):
    messages: Annotated[list, operator.add]

graph = StateGraph(State)

How do I add conditional branching to my graph?

Use add_conditional_edges instead of add_edge. You pass a routing function that looks at the state and returns the name of the next node. We cover this in the next tutorial on conditional edges and routing.
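
As a preview, the routing function itself is plain Python: it inspects the state and returns the name of the next node. A hypothetical example, with the wiring call shown as a comment since the full pattern comes in the next tutorial:

```python
def route_after_llm(state: dict) -> str:
    """Pick the next node from the last message -- a hypothetical example
    using plain string messages for illustration."""
    last = state["messages"][-1]
    if "tool_call" in last:
        return "run_tools"
    return "finish"

# Wiring would look like this (not run here):
# graph.add_conditional_edges("call_llm", route_after_llm,
#                             {"run_tools": "run_tools", "finish": END})

print(route_after_llm({"messages": ["tool_call: search"]}))   # run_tools
print(route_after_llm({"messages": ["plain answer"]}))        # finish
```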


Reviewed: March 2026 | LangGraph 0.3+ | Python 3.10+
