Abso
Abso is an open-source LLM proxy that automatically routes requests between fast and slow models based on prompt complexity. It uses a set of heuristics to choose the appropriate model, which keeps latency low.
Installation and setup
pip install langchain-abso
Chat Model
See usage details here.
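A minimal sketch of how the chat model might be invoked, assuming the `ChatAbso` class takes `fast_model` and `slow_model` parameters (the model names below are illustrative, and an OpenAI API key is assumed to be set in the environment):

```python
from langchain_abso import ChatAbso

# Abso routes each prompt to the fast or slow model based on its complexity.
llm = ChatAbso(fast_model="gpt-4o", slow_model="o3-mini")

# A simple prompt like this would typically be routed to the fast model.
response = llm.invoke("What is the capital of France?")
print(response.content)
```

Because routing happens inside the proxy, the calling code stays the same regardless of which underlying model ends up serving the request.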