LangChain

Revision as of 22:10, 9 April 2023




LangChain is a Python framework built around Large Language Models (LLMs) that can be used for chatbots, Generative Question-Answering (GQA), summarization, and more. The core idea of the library is that different components can be "chained" together to create more advanced use cases around LLMs. LLMs are emerging as a transformative technology, enabling developers to build applications that were previously impossible. But using an LLM in isolation is often not enough to create a truly powerful app; the real power comes from combining it with other sources of computation or knowledge.
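The "chaining" idea above can be sketched in plain Python. This is a conceptual illustration only, not the actual LangChain API: the class names (PromptTemplate, FakeLLM, Chain) are illustrative stand-ins, and the fake model simply echoes its prompt instead of calling a real LLM.

```python
# Conceptual sketch of LangChain-style "chaining" in plain Python.
# These classes are illustrative stand-ins, NOT the LangChain API.

class PromptTemplate:
    """Fills user input into a fixed prompt string."""
    def __init__(self, template: str):
        self.template = template

    def format(self, **kwargs) -> str:
        return self.template.format(**kwargs)


class FakeLLM:
    """Stand-in for a real model call; returns a canned completion."""
    def __call__(self, prompt: str) -> str:
        return f"[completion for: {prompt}]"


class Chain:
    """Pipes a prompt template into a model -- the core 'chain' idea."""
    def __init__(self, template: PromptTemplate, llm):
        self.template = template
        self.llm = llm

    def run(self, **kwargs) -> str:
        # Step 1: render the prompt; Step 2: pass it to the model.
        return self.llm(self.template.format(**kwargs))


chain = Chain(PromptTemplate("Summarize: {text}"), FakeLLM())
print(chain.run(text="LangChain chains components around LLMs."))
```

In the real library, the template and model would be LangChain objects and further components (memory, retrievers such as Pinecone, tools such as Zapier) would be composed into the same pipeline.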


Contents:

* Getting Started (Data Independent tutorial videos; reference videos appear throughout the page)
* My Documents
* Long documents
* Memory
* Emails
* Zapier
* Tabular Data
* JavaScript
* Pinecone
* Supabase
* Water
* Visual ChatGPT
* Summarization
* Hugging Face
* LLaMA
* GPT-Index
* Comparing Large Language Models (LLMs)
* Gradio
* Filtering LLM
* Taxes
* More