# LACK v3.4.3 – Agent Harness (not a model; runs on Ollama*) – Under Development

x.com/lack2026
LACK is a lightweight, self-hosted multi-agent chat platform powered by local LLMs via Ollama. It enables autonomous agent collaboration, research (SIPHON), code sharing, direct messaging, and a built-in cron job manager that wipes and recreates heartbeat jobs for every channel and DM.
## ✨ Features

- Multi-Agent Chat – Multiple AI agents respond naturally in channels and DMs.
- Autonomous Planning – Agents collaborate on goals via `/plan` (JSON action mode).
- SIPHON Research – Agents can autonomously research topics, scrape the web, and store results in a Git repo.
- Code Sharing – Code blocks are automatically forwarded to a `#code` channel.
- Direct Messaging – Users can DM agents or other users (`/dm`).
- Threads & Reactions – Reply in threads, add emoji reactions, pin messages.
- Mobile Access (SLIME) – Generate a temporary mobile chat URL (`/slime`).
- Resource Graph – Real-time CPU/activity graphs for each agent.
- Error Log – View recent Ollama errors via `/errorlog`.
- 💣 Cron Management – One-click button to wipe all cron jobs, recreate heartbeat pings for every channel/DM, and reset application data.
## 🚀 Quick Start

### Prerequisites

- Node.js (v18 or later)
- npm (comes with Node)
- Ollama running locally with at least one model pulled (e.g. `qwen2.5:0.5b`)

```shell
# Install Ollama (if not already installed)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model of your choice
ollama pull qwen2.5:0.5b
```
### Installation & Launch

Place the `lack.py` file in a folder, then run:

```shell
cd ~/lack/
python3 lack.py
```
The script will:

- Generate all necessary files (`server.js`, `public/`, `config/`, `bin/`)
- Install npm dependencies
- Start the server at `http://localhost:3721`

Note: The first run may take a minute while npm installs dependencies.
Open `http://localhost:3721` in your browser. You'll see:

- Sidebar – Channels, DMs, agents, research sessions.
- Main chat – Send messages, use commands.
- Top bar – GROUND (trigger all agents), GRAPH (resource monitor), ERRORLOG, and 💣 CRON.
## Chat Commands

| Command | Description |
|---|---|
| `/help` | Show all commands |
| `/ground` | All agents in the channel respond |
| `/research <topic>` | Start a research loop (agents answer questions) |
| `/abstract` | Autonomous planning mode (agents propose JSON actions) |
| `/plan <goal>` | Set a project goal and activate planning mode |
| `/stop` | Stop any active loop |
| `/list` | Show available Ollama models |
| `/spawn` | Create a new agent (popup) |
| `/siphon <topic>` | Start SIPHON research; results appear in `#siphon` |
| `/slime` | Generate a temporary mobile chat URL |
| `/pull <sessionId>` | Pull research insights into the current channel |
| `/dm <username>` | Start a direct message with a user or agent |
| `/thread <messageId>` | Show a message thread |
| `/pin <messageId>` | Pin a message |
| `/graph` | Open the resource graph modal |
| `/errorlog` | Show recent Ollama errors |
## 💣 Cron Management

Click the red "💣 CRON" button in the top bar. A warning popup asks for confirmation. After confirming:

- All existing user cron jobs are deleted (`crontab -r`).
- New cron jobs are created that run every 5 minutes and call `POST /api/heartbeat?type=channel&id=...` for every channel and DM.
- All application data is reset (messages, research sessions, metrics, etc.).
- The page reloads automatically.
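The heartbeat targets can be derived directly from the config. Below is a minimal sketch of that derivation; `heartbeatUrls` is a hypothetical helper (the real `server.js` may differ), and the `type=dm` query value for DMs is an assumption based on the `type=channel` endpoint shown above.

```javascript
// Sketch: build the heartbeat URLs the recreated cron jobs would call.
// NOTE: heartbeatUrls is a hypothetical helper, not part of LACK's documented API.
function heartbeatUrls(config, base = "http://localhost:3721") {
  const urls = [];
  for (const ch of config.channels || []) {
    // One heartbeat per channel, keyed by its id
    urls.push(`${base}/api/heartbeat?type=channel&id=${encodeURIComponent(ch.id)}`);
  }
  for (const dm of config.dms || []) {
    // Assumption: DM heartbeats use type=dm
    urls.push(`${base}/api/heartbeat?type=dm&id=${encodeURIComponent(dm.id)}`);
  }
  return urls;
}
```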
This gives you a clean slate and ensures every conversation thread has a heartbeat ping, which is useful for external monitoring or for keeping cron active.

⚠️ Warning: This action is irreversible. It removes all cron jobs for the user running the LACK server.
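Assuming standard crontab syntax and `curl`, each recreated entry might look like the following. The exact command LACK writes is not documented here, and the channel id `general` is illustrative:

```
*/5 * * * * curl -s -X POST "http://localhost:3721/api/heartbeat?type=channel&id=general"
```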
## Configuration

All settings are stored in `config/lack.config.json`. You can edit:

- `httpPort` – Server port (default 3721)
- `agents` – List of agents (id, name, model, systemPrompt, channels)
- `channels` – List of channels (id, name)
- `dms` – Direct message conversations (auto-managed)

After editing the config file, restart the server.
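Putting the keys above together, a minimal `config/lack.config.json` might look like this. Only the key names come from the list above; every value (agent name, channel ids, prompt text) is illustrative:

```json
{
  "httpPort": 3721,
  "agents": [
    {
      "id": "agent-1",
      "name": "Scout",
      "model": "qwen2.5:0.5b",
      "systemPrompt": "You are a concise research assistant.",
      "channels": ["general", "siphon"]
    }
  ],
  "channels": [
    { "id": "general", "name": "general" },
    { "id": "code", "name": "code" }
  ],
  "dms": []
}
```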
## File Structure (built by the single lack.py file)

```
lack/
├── server.js            # Main Node.js server
├── package.json         # Dependencies
├── bin/lack.js          # CLI launcher
├── public/
│   ├── index.html       # Web UI
│   └── client.js        # Frontend WebSocket logic
├── config/
│   └── lack.config.json # Configuration
├── research/            # Git repo for SIPHON artifacts
└── lack.py              # Python bootstrap script (generates everything)
```
## Agent Modes

- Natural mode – Agents reply to messages with a cooldown, using conversation context.
- Planning mode – Activated by `/plan` or `/abstract`. Agents output JSON actions (`message`, `research`, `code`, `delegate`) to collaboratively achieve a goal.
- Research mode – Agents autonomously ask sub-questions, scrape search results, extract facts, and store answers in Git.
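In planning mode, each agent turn is expected to be a JSON action of one of the four types above. The exact schema is not documented here, so the field names (`action`, `target`, `content`) and values below are assumptions for illustration only:

```json
{
  "action": "message",
  "target": "#general",
  "content": "Proposed first step: split the goal into research sub-questions."
}
```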
## License

MIT