Langchain Postgres
| Entity Passport | |
|---|---|
| Registry ID | gh-tool--langchain-ai--langchain-postgres |
| License | MIT |
| Provider | github |
Cite this tool
Academic & Research Attribution
@misc{gh_tool__langchain_ai__langchain_postgres,
  author = {Langchain Ai},
  title = {Langchain Postgres Tool},
  year = {2026},
  howpublished = {\url{https://free2aitools.com/tool/gh-tool--langchain-ai--langchain-postgres}},
  note = {Accessed via Free2AITools Knowledge Fortress}
}

Technical Deep Dive
Quick Commands
pip install langchain-postgres
Specs
- Language: Python
- License: MIT
- Version: 1.0.0
Technical Documentation
langchain-postgres
The langchain-postgres package provides implementations of core LangChain abstractions backed by Postgres.
The package is released under the MIT license.
Feel free to use the abstractions as provided, or modify and extend them as appropriate for your own application.
Requirements
The package supports the asyncpg and psycopg3 drivers.
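The driver choice surfaces in the SQLAlchemy-style connection URL used later in the examples: the `+driver` suffix on the scheme selects psycopg3 or asyncpg. A minimal sketch below, with placeholder hosts and credentials (nothing here comes from the package itself):

```python
# Sketch: SQLAlchemy-style connection URLs for the two supported drivers.
# Host, port, user, password, and database names are placeholders.
from urllib.parse import urlsplit

PSYCOPG_URL = "postgresql+psycopg://user:password@localhost:5432/mydb"  # psycopg3 (sync)
ASYNCPG_URL = "postgresql+asyncpg://user:password@localhost:5432/mydb"  # asyncpg (async)

# The driver is selected by the "+driver" suffix on the URL scheme.
for url in (PSYCOPG_URL, ASYNCPG_URL):
    dialect, _, driver = urlsplit(url).scheme.partition("+")
    print(dialect, driver)  # postgresql psycopg / postgresql asyncpg
```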
Installation

```bash
pip install -U langchain-postgres
```
Vectorstore
[!WARNING] In v0.0.14+, `PGVector` is deprecated. Please migrate to `PGVectorStore` for improved performance and manageability. See the migration guide for details on how to migrate from `PGVector` to `PGVectorStore`.
Documentation
[!TIP] For developing, debugging, and deploying AI agents and LLM applications, see LangSmith.
Example
```python
from langchain_core.documents import Document
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_postgres import PGEngine, PGVectorStore

# Replace the connection string with your own Postgres connection string
CONNECTION_STRING = "postgresql+psycopg://langchain:langchain@localhost:6024/langchain"
engine = PGEngine.from_connection_string(url=CONNECTION_STRING)

# Replace the vector size with your own vector size
VECTOR_SIZE = 768
embedding = DeterministicFakeEmbedding(size=VECTOR_SIZE)

TABLE_NAME = "my_doc_collection"

engine.init_vectorstore_table(
    table_name=TABLE_NAME,
    vector_size=VECTOR_SIZE,
)

store = PGVectorStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
    embedding_service=embedding,
)

docs = [
    Document(page_content="Apples and oranges"),
    Document(page_content="Cars and airplanes"),
    Document(page_content="Train"),
]
store.add_documents(docs)

query = "I'd like a fruit."
docs = store.similarity_search(query)
print(docs)
```
[!TIP] All synchronous functions have corresponding asynchronous functions
Hybrid Search with PGVectorStore
With PGVectorStore you can use hybrid search for more comprehensive and relevant search results.
```python
# HybridSearchConfig and reciprocal_rank_fusion are provided by
# langchain-postgres's hybrid search configuration module.
vector_store = PGVectorStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
    embedding_service=embedding,
    hybrid_search_config=HybridSearchConfig(
        fusion_function=reciprocal_rank_fusion
    ),
)
hybrid_docs = vector_store.similarity_search("products", k=5)
```
For a detailed guide on how to use hybrid search, see the documentation.
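To see what a fusion function like `reciprocal_rank_fusion` does conceptually, here is a simplified re-implementation of the RRF idea (not the library's code): each document scores the sum of 1/(k + rank) over the ranked lists it appears in, so documents ranked well by both vector and keyword search rise to the top.

```python
# Illustrative sketch of reciprocal rank fusion (RRF) -- NOT the library's
# implementation. Each document's score is the sum over result lists of
# 1 / (k + rank), with 1-based rank and smoothing constant k (often 60).

def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)

# "b" ranks highly in both lists, so it leads the fused result.
vector_ranking = ["a", "b", "c"]
keyword_ranking = ["b", "d", "a"]
print(rrf([vector_ranking, keyword_ranking]))  # ['b', 'a', 'd', 'c']
```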
ChatMessageHistory
The chat message history abstraction helps persist chat message history in a Postgres table.

`PostgresChatMessageHistory` is parameterized using a `table_name` and a `session_id`.

The `table_name` is the name of the database table where the chat messages will be stored.

The `session_id` is a unique identifier for the chat session. It can be assigned by the caller using `uuid.uuid4()`.
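A quick standard-library illustration of generating such a session id and validating it (all names here are local to the sketch):

```python
import uuid

# Generate a unique session id for a new chat session.
session_id = str(uuid.uuid4())

# uuid4 ids are 36-character strings (32 hex digits plus 4 hyphens)
# and can be parsed back for validation.
parsed = uuid.UUID(session_id)
print(len(session_id), parsed.version)  # 36 4
```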
```python
import uuid

from langchain_core.messages import SystemMessage, AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg

# Establish a synchronous connection to the database
# (or use psycopg.AsyncConnection for async)
conn_info = ...  # Fill in with your connection info
sync_connection = psycopg.connect(conn_info)

# Create the table schema (only needs to be done once)
table_name = "chat_history"
PostgresChatMessageHistory.create_tables(sync_connection, table_name)

session_id = str(uuid.uuid4())

# Initialize the chat history manager
chat_history = PostgresChatMessageHistory(
    table_name,
    session_id,
    sync_connection=sync_connection
)

# Add messages to the chat history
chat_history.add_messages([
    SystemMessage(content="Meow"),
    AIMessage(content="woof"),
    HumanMessage(content="bark"),
])

print(chat_history.messages)
```
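Conceptually, the history is stored as rows keyed by `session_id` and read back in insertion order. The sketch below illustrates that storage pattern with stdlib `sqlite3` standing in for Postgres; the actual schema created by `PostgresChatMessageHistory.create_tables` is different, and the table and column names here are invented for illustration.

```python
# Conceptual sketch of the storage pattern using sqlite3 (stdlib) in place
# of Postgres: messages are rows keyed by session_id, fetched back in the
# order they were inserted. Not the real langchain-postgres schema.
import json
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE chat_history ("
    "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
    "  session_id TEXT NOT NULL,"
    "  message TEXT NOT NULL)"
)

session_id = str(uuid.uuid4())
for role, content in [("system", "Meow"), ("ai", "woof"), ("human", "bark")]:
    conn.execute(
        "INSERT INTO chat_history (session_id, message) VALUES (?, ?)",
        (session_id, json.dumps({"role": role, "content": content})),
    )

# Fetching a session's history returns messages in the order they were added.
rows = conn.execute(
    "SELECT message FROM chat_history WHERE session_id = ? ORDER BY id",
    (session_id,),
).fetchall()
messages = [json.loads(r[0]) for r in rows]
print([m["role"] for m in messages])  # ['system', 'ai', 'human']
```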
Google Cloud Integrations
Google Cloud provides Vector Store, Chat Message History, and Data Loader integrations for AlloyDB and Cloud SQL for PostgreSQL databases via the following PyPI packages:
Using the Google Cloud integrations provides the following benefits:
- Enhanced Security: Securely connect to Google Cloud databases utilizing IAM for authorization and database authentication without needing to manage SSL certificates, configure firewall rules, or enable authorized networks.
- Simplified and Secure Connections: Connect to Google Cloud databases effortlessly using the instance name instead of complex connection strings. The integrations create a secure connection pool that can be easily shared across your application using the `engine` object.
| Vector Store | Metadata filtering | Async support | Schema Flexibility | Improved metadata handling | Hybrid Search |
|---|---|---|---|---|---|
| Google AlloyDB | ✓ | ✓ | ✓ | ✓ | ✓ |
| Google Cloud SQL Postgres | ✓ | ✓ | ✓ | ✓ | ✓ |
Quick Start

```bash
pip install -U langchain-postgres
```
Tool Transparency Report
Technical metadata sourced from upstream repositories.
Identity & Source
- id: gh-tool--langchain-ai--langchain-postgres
- slug: langchain-ai--langchain-postgres
- source: github
- author: Langchain Ai
- license: MIT
- tags: langchain, langchain-python, postgres, postgresql, python
Engagement & Metrics
- downloads: 0
- stars: 273
- forks: 0
Data indexed from public sources. Updated daily.