Chain of Thought (CoT)

AI can generate text that follows a logical and coherent sequence of ideas, building on previous statements to form a chain of thought.
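
For illustration, a chain-of-thought prompt typically spells out the intermediate reasoning in a worked example so the model continues answering in the same step-by-step style. The short Python sketch below builds such a prompt; the example questions and the few-shot format are illustrative assumptions rather than a prescribed template.

<pre>
# Illustrative chain-of-thought prompt: a worked example shows its reasoning
# step by step, then the model is asked to answer a new question the same way.
cot_prompt = (
    "Q: A shop has 3 boxes with 12 apples each and sells 9 apples. How many are left?\n"
    "A: Each box has 12 apples, so 3 * 12 = 36 apples in total. "
    "After selling 9, 36 - 9 = 27. The answer is 27.\n"
    "\n"
    "Q: A train travels at 60 km/h for 2.5 hours. How far does it go?\n"
    "A:"  # the model is expected to continue with the same step-by-step reasoning
)
print(cot_prompt)
</pre>
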
== Tree of Thoughts (ToT) ==
 
"Tree of Thoughts" is a new framework for inferencing language models like [[GPT-4]], inspired by prompt engineering methods like Chain of Thought. It is a novel approach aimed at improving the problem-solving capabilities of auto-regressive [[Large Language Model (LLM)]]s by allowing them to explore multiple reasoning paths over thoughts. To implement ToT as a software system, an [[Large Language Model (LLM)|LLM]] is augmented with additional modules including a prompter agent, a checker module, a memory module, and a ToT controller. These modules engage in a multi-round conversation with the [[Large Language Model (LLM)|LLM]] to solve a given problem. The memory module records the conversation and state history of the problem-solving process, which allows the system to backtrack to previous steps of the thought-process and explore other directions from there.
 
"Tree of Thoughts" is a new framework for inferencing language models like [[GPT-4]], inspired by prompt engineering methods like Chain of Thought. It is a novel approach aimed at improving the problem-solving capabilities of auto-regressive [[Large Language Model (LLM)]]s by allowing them to explore multiple reasoning paths over thoughts. To implement ToT as a software system, an [[Large Language Model (LLM)|LLM]] is augmented with additional modules including a prompter agent, a checker module, a memory module, and a ToT controller. These modules engage in a multi-round conversation with the [[Large Language Model (LLM)|LLM]] to solve a given problem. The memory module records the conversation and state history of the problem-solving process, which allows the system to backtrack to previous steps of the thought-process and explore other directions from there.
  
<youtube>PFK5g_kxhVM</youtube>

<youtube>RndhsZvr-cI</youtube>

<youtube>ut5kp56wW_4</youtube>

<youtube>BrjAt-wvEXI</youtube>

<youtube>bjnTy2TdmYw</youtube>

== May 2023 ==
 
<youtube>G9dRK9TNAeg</youtube>
 