This repo contains libraries and examples showing how to use the LLM canister on the Internet Computer (IC).
### Rust (`ic-llm` crate)

The `ic-llm` crate can be used to deploy Rust agents on the Internet Computer with a few lines of code.
#### Example: Prompting

```rust
use ic_llm::Model;

ic_llm::prompt(Model::Llama3_1_8B, "What's the speed of light?").await;
```
#### Example: Chatting with multiple messages

```rust
use ic_llm::{Model, ChatMessage, Role};

ic_llm::chat(
    Model::Llama3_1_8B,
    vec![
        ChatMessage {
            role: Role::System,
            content: "You are a helpful assistant".to_string(),
        },
        ChatMessage {
            role: Role::User,
            content: "How big is the sun?".to_string(),
        },
    ],
)
.await;
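```

These calls are inter-canister requests to the LLM canister, so they must run inside a canister method. As a minimal sketch of how a canister might expose prompting (assuming the `ic-cdk` crate; the endpoint name is illustrative), the same pattern applies to the Motoko and TypeScript packages below:

```rust
use ic_cdk::update;
use ic_llm::Model;

// Illustrative endpoint: forwards the caller's text to the LLM canister
// and returns the model's reply as a plain string.
#[update]
async fn prompt(prompt_text: String) -> String {
    ic_llm::prompt(Model::Llama3_1_8B, prompt_text).await
}
```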
### Motoko (`mo:llm` package)

Similarly, the `mo:llm` package can be used to deploy Motoko agents on the Internet Computer with a few lines of code.
#### Example: Prompting

```motoko
import LLM "mo:llm";

await LLM.prompt(#Llama3_1_8B, "What's the speed of light?");
```
#### Example: Chatting with multiple messages

```motoko
import LLM "mo:llm";

await LLM.chat(#Llama3_1_8B, [
  {
    role = #system_;
    content = "You are a helpful assistant.";
  },
  {
    role = #user;
    content = "How big is the sun?";
  },
]);
```
### TypeScript (`@dfinity/llm` npm package)

The `@dfinity/llm` npm package can be used to deploy TypeScript agents on the Internet Computer with a few lines of code.
#### Example: Prompting

```typescript
import * as llm from "@dfinity/llm";

await llm.prompt(llm.Model.Llama3_1_8B, "What's the speed of light?");
```
#### Example: Chatting with multiple messages

```typescript
import * as llm from "@dfinity/llm";

await llm.chat(llm.Model.Llama3_1_8B, [
  {
    role: llm.Role.System,
    content: "You are a helpful assistant.",
  },
  {
    role: llm.Role.User,
    content: "How big is the sun?",
  },
]);
```
### Quickstart Agent

This is a simple agent that relays whatever messages the user sends to the underlying model without modification. It's meant to serve as a boilerplate project for those who want to get started building agents on the IC. A minimal sketch of the relay pattern is shown below.

A Rust and a Motoko implementation are provided in the `examples` folder. Additionally, a live deployment of this agent can be accessed here.
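The core of such a relay agent can be a single endpoint that forwards the caller's messages verbatim. A minimal Rust sketch, assuming the `ic-cdk` crate and the `chat` API shown above (the endpoint name is illustrative, not necessarily the exact code in `examples`):

```rust
use ic_cdk::update;
use ic_llm::{ChatMessage, Model};

// Relay pattern: pass the conversation through unchanged and return
// the model's reply. No system prompt or tooling is added.
#[update]
async fn chat(messages: Vec<ChatMessage>) -> String {
    ic_llm::chat(Model::Llama3_1_8B, messages).await
}
```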
### ICP Lookup Agent

This agent showcases what it takes to build an agent that specializes in a specific task. In this case, the task is to look up ICP prices. A sketch of one way to structure such an agent is shown below.

A Rust and a Motoko implementation are provided in the `examples` folder. Additionally, a live deployment of this agent can be accessed here.
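One plausible shape for such an agent, sketched in Rust under stated assumptions: `fetch_icp_price_usd` is a hypothetical helper (a real agent might obtain the price via an HTTPS outcall or another canister), and the endpoint name is illustrative, not the exact code in `examples`:

```rust
use ic_cdk::update;
use ic_llm::{ChatMessage, Model, Role};

// Hypothetical helper: stands in for whatever price-fetching mechanism
// the real agent uses (e.g. an HTTPS outcall).
async fn fetch_icp_price_usd() -> f64 {
    // ... price-fetching logic elided ...
    unimplemented!()
}

#[update]
async fn chat(question: String) -> String {
    // Inject the freshly fetched price into a system message so the
    // model can answer price questions with current data.
    let price = fetch_icp_price_usd().await;
    ic_llm::chat(
        Model::Llama3_1_8B,
        vec![
            ChatMessage {
                role: Role::System,
                content: format!(
                    "You answer questions about ICP. The current ICP price is {price} USD."
                ),
            },
            ChatMessage {
                role: Role::User,
                content: question,
            },
        ],
    )
    .await
}
```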