
LangChain integration

note

Integrations with LangChain can be done in Python or JavaScript.

LangChain is a framework for developing applications powered by large language models (LLMs).

You can configure LangChain to use any Gaia node as the LLM backend, so you can build AI agents and AI-powered applications that use Gaia for inference.

Prerequisites

You will need a Gaia node ready to provide LLM services through a public URL. You can either run your own node or use a public node.

If you are using a public node, you will need an API key. Gaia offers 50,000 free API credits to use with available services, such as public nodes, when you apply for a developer account.

Setup

  • Set up a project on your machine (JavaScript or Python)

  • Install LangChain:

npm install @langchain/openai @langchain/core dotenv
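
The JavaScript snippets in this guide use ES module imports and top-level await, which require Node to treat the project as an ES module. A minimal sketch of the relevant package.json setting (assuming you are not using .mjs files) looks like:

{
  "type": "module"
}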

Integration with Gaia

To get started with running your own Gaia node, follow the quickstart guide on the Setting up your own node page.
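
As a rough sketch, once the node software from that page is installed, bringing up a local node typically comes down to initializing it and then starting it with the gaianet CLI (see that page for the exact commands and configuration options):

gaianet init
gaianet start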

In this guide, we will run our Gaia node locally, so we do not need a real API key; you can use a placeholder string such as "Gaia". Create a .env file and store the key:

GAIANET_API_KEY="Gaia"

Integration between LangChain and Gaia can be done in either JavaScript or Python. The code snippet below shows what the integration looks like in JavaScript:

import { ChatOpenAI } from "@langchain/openai";
import dotenv from "dotenv";

// Load GAIANET_API_KEY from the .env file
dotenv.config();

const model = new ChatOpenAI({
  // Model served by your Gaia node
  model: "Llama-3-Groq-8B-Tool",
  apiKey: process.env.GAIANET_API_KEY,
  configuration: {
    // Replace with your Gaia node's base URL (it must end in /v1)
    baseURL: "gaia-node-url/v1",
  },
});

const response = await model.invoke("Hello, world!");

console.log(response);
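
Note that invoke on a chat model returns an AIMessage object rather than a plain string; the generated text is available on its content property:

// Print only the generated text instead of the whole AIMessage
console.log(response.content);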

Invoking Gaia models

Once the basic connection is established, you can start using LangChain's powerful features. Start by invoking the model.


// ...
const response = await model.invoke("Hello, world!");

console.log(response);
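
As an example of those features, here is a minimal sketch that chains a prompt template to the Gaia-backed model; the system prompt and question are illustrative placeholders:

import { ChatPromptTemplate } from "@langchain/core/prompts";

// A reusable prompt with a placeholder for the user's question
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["human", "{question}"],
]);

// Pipe the prompt into the Gaia-backed model to form a runnable chain
const chain = prompt.pipe(model);

const answer = await chain.invoke({ question: "What is a Gaia node?" });
console.log(answer.content);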

LangChain support also opens up integrations with LangGraph and LangSmith.