A Node.js CLI that uses Ollama and LM Studio models (LLaVA, Gemma, Llama, etc.) to intelligently rename files by their contents
Updated Feb 9, 2025 - JavaScript
Efficient visual programming for AI language models
visionOS examples ⸺ Spatial Computing Accelerators for Apple Vision Pro
LLMX: the easiest third-party local LLM UI for the web!
Meshtastic-AI - Off-grid LM Studio / Ollama / OpenAI integration and Home Assistant API control for Meshtastic, with custom commands, inbound/outbound Twilio SMS routing, Discord channel routing, and GPS emergency alerts over SMS, email, and Discord. Now with Windows, Linux, and Docker support!
Soupy is a Discord bot that uses Flux and LM Studio. It chats, functions as an image generator for your users, and has other fun features.
LocalAPI.AI is a local AI management tool for Ollama, offering web UI management and compatibility with vLLM, LM Studio, llama.cpp, Mozilla Llamafile, Jan AI, Cortex API, Local-LLM, LiteLLM, GPT4All, and more.
Serverless single HTML page access to an OpenAI API compatible Local LLM
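Several of the projects listed here talk to a local model through an OpenAI-compatible endpoint such as the one LM Studio serves. A minimal sketch in JavaScript of what such a client looks like, assuming LM Studio's default base URL (`http://localhost:1234/v1`) and a placeholder model name — check your own server's settings:

```javascript
// Minimal sketch of an OpenAI-compatible chat client (e.g. against LM Studio's
// local server). BASE_URL and the model name are assumptions, not fixed values.
const BASE_URL = "http://localhost:1234/v1"; // assumed LM Studio default

function buildChatRequest(userMessage) {
  // Payload shape defined by the OpenAI /v1/chat/completions API.
  return {
    model: "local-model", // placeholder; local servers often map or ignore this
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: userMessage },
    ],
    temperature: 0.7,
  };
}

async function chat(userMessage) {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(userMessage)),
  });
  const data = await res.json();
  // The reply text lives in choices[0].message.content per the OpenAI schema.
  return data.choices[0].message.content;
}
```

Because the endpoint mirrors the OpenAI API, the same request body works unchanged against Ollama's OpenAI-compatible route or a hosted OpenAI key — only `BASE_URL` changes.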
An AI chatbot that talks to people in VR Chat.
Techniques for Using LLMs Effectively: An Introduction to Prompt Engineering, Retrieval Augmented Generation, and Toolformer
Using an LLM with LangChain to generate cover letters based on company data found on the internet and the person's profile
Some handy tools for working with audio locally.
MCP prompt tool applying Chain-of-Draft (CoD) reasoning - BYOLLM
Browser extension that generates image alternate text, using GPT-4o or an LM Studio server.
🚀 Typollama – AI Writing Assistant – A Chrome extension for real-time text enhancement using local and cloud AI models. Instantly spellcheck, proofread, and customize AI-powered writing with keyboard shortcuts and right-click integration. ✨