langchaingo

package
v0.0.0-...-7871f83
Published: Dec 23, 2025 License: Apache-2.0 Imports: 9 Imported by: 0

Documentation

Overview

Package langchaingo provides an adapter for using LangChainGo (github.com/tmc/langchaingo) models within AgentMesh workflows.

This adapter enables integration with LangChainGo's 50+ model providers (OpenAI, Anthropic, Google AI, Cohere, local models, etc.) while using the AgentMesh model interface for agents and graph execution.

Example usage:

import (
    "log"

    "github.com/hupe1980/agentmesh/pkg/model/langchaingo"
    "github.com/tmc/langchaingo/llms/openai"
)

// Create a LangChainGo model.
llm, err := openai.New(openai.WithModel("gpt-4"))
if err != nil {
    log.Fatal(err)
}

// Wrap it as an AgentMesh model. NewModel returns an error if llm is nil.
model, err := langchaingo.NewModel(llm)
if err != nil {
    log.Fatal(err)
}

// Use with AgentMesh agents.
agent, err := agent.NewReAct(model, tools)

Package langchaingo provides sentinel errors for the LangChain Go model package.

Index

Constants

This section is empty.

Variables

var (
	// ErrNoMessages is returned when generate is called without messages.
	ErrNoMessages = errors.New("model/langchaingo: generate requires at least one message")
)

Functions

This section is empty.

Types

type Model

type Model struct {
	// contains filtered or unexported fields
}

Model wraps a LangChainGo llms.Model to implement the AgentMesh model.Model interface.

func MustNewModel

func MustNewModel(llm llms.Model, optFns ...Option) *Model

MustNewModel creates a new model adapter, panicking on error. Use this only when you can guarantee the llm is non-nil.

func NewModel

func NewModel(llm llms.Model, optFns ...Option) (*Model, error)

NewModel creates a new AgentMesh model adapter from a LangChainGo model. Returns an error if the llm parameter is nil.

func (*Model) BindTools

func (m *Model) BindTools(tools ...tool.Tool) *Model

BindTools returns a new model with the specified tools bound for function calling.

func (*Model) Capabilities

func (m *Model) Capabilities() model.Capabilities

Capabilities returns the features supported by this model adapter. Note: LangChainGo doesn't expose capability introspection, so we return conservative defaults. Streaming support depends on the underlying model.

func (*Model) Generate

func (m *Model) Generate(ctx context.Context, req *model.Request) iter.Seq2[*model.Response, error]

Generate executes a generation request against the wrapped LangChainGo model. Returns an iterator that yields model.Response as generation progresses.

type Option

type Option func(*Options)

Option is a function that configures Options.

func WithMaxTokens

func WithMaxTokens(maxTokens int) Option

WithMaxTokens sets the maximum tokens to generate.

func WithStopWords

func WithStopWords(stopWords ...string) Option

WithStopWords sets the stop sequences for generation.

func WithStreaming

func WithStreaming(enabled bool) Option

WithStreaming enables or disables streaming mode.

func WithTemperature

func WithTemperature(temperature float64) Option

WithTemperature sets the temperature for generation.

type Options

type Options struct {
	// Temperature controls randomness in output (0.0 to 1.0).
	// Higher values produce more random output.
	Temperature float64

	// MaxTokens is the maximum number of tokens to generate.
	MaxTokens int

	// StopWords are sequences that will stop generation when encountered.
	StopWords []string

	// Streaming enables streaming mode when true.
	Streaming bool
}

Options configures the LangChainGo model adapter behavior.
