⚡ LightPhon Node Client

Share your GPU and earn Bitcoin over the Lightning Network

📥 Download for Windows

Version: 1.0.11

Release Date: February 12, 2026

Platform: Windows 10/11 (64-bit)

📥 Download LightPhon-Node.exe

File size: ~38 MB

📋 System Requirements

  • Windows 10 or Windows 11 (64-bit)
  • NVIDIA GPU with CUDA support (recommended)
  • At least 8 GB RAM
  • llama.cpp / llama-server installed
  • Internet connection

📋 Prerequisites

Before running LightPhon, you need to install the following dependencies:

1. Visual C++ Redistributable

Download and install the latest Microsoft Visual C++ Redistributable:

👉 Download VC++ Redistributable (x64)

Or install via winget:

winget install Microsoft.VCRedist.2015+.x64

2. llama-server (llama.cpp)

Install llama.cpp which includes llama-server:

winget install llama.cpp

After installation, verify it's installed correctly:

llama-server --version

🚀 Installation

  1. Download the latest LightPhon-Node.exe from the button above
  2. Run the executable. No installation is required: the app is fully portable.

⚙️ Configuration

On first run, you'll need to:

  1. Select the folder containing your GGUF models
  2. Choose which models to make available on the network
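Model discovery boils down to scanning the chosen folder for `.gguf` files. A hypothetical sketch of that step (the app's actual logic may differ):

```python
from pathlib import Path

def list_gguf_models(folder: str) -> list[str]:
    """Return the GGUF model filenames found in a folder, sorted by name."""
    return sorted(p.name for p in Path(folder).glob("*.gguf"))

# Example: list_gguf_models(r"C:\models") might return
# ["llama-3-8b.Q4_K_M.gguf", "mistral-7b.Q5_K_S.gguf"]
```

Files with any other extension are ignored, so keep only the models you want to share in that folder.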

💡 How it Works

  • Connect — The app connects to the AI Lightning network
  • Share Models — Your selected AI models become available for users
  • Earn Sats — When users run inference on your models, you earn Bitcoin via Lightning Network

🔧 Troubleshooting

"llama-server not found"

Make sure llama-server is installed and in your PATH:

winget install llama.cpp

"VCRUNTIME140.dll not found"

Install Visual C++ Redistributable:

winget install Microsoft.VCRedist.2015+.x64

"Connection failed"

  • Check your internet connection
  • Verify the server URL is correct
  • Make sure your firewall allows the connection
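To narrow down a "Connection failed" error, you can test TCP reachability directly before blaming the app. A minimal sketch; the host and port below are examples based on the server URL mentioned in the changelog, so substitute whatever your node is configured to use:

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 443 is the standard HTTPS port.
    print("reachable" if is_reachable("lightphon.com", 443) else "unreachable")
```

If this succeeds but the app still fails to connect, the problem is likely a per-application firewall rule rather than general network access.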

📝 Changelog

v1.0.11 (2026-02-12)

  • Fixed inference timeout for slow GPUs
  • Increased read timeout from 2s to 10 minutes

v1.0.10 (2026-02-11)

  • Switched to /v1/chat/completions for proper chat template support
  • Fixed LLM hallucinations caused by raw prompt without chat template
  • Added conversation history (multi-turn chat)
  • Added system prompt with UI toggle
  • RAG context now injected as system message

v1.0.9 (2026-02-11)

  • Added password reset via email
  • Fixed RAG document clearing on model reload
  • Fixed status_callback AttributeError
  • Improved RAG logging and debugging

v1.0.8 (2026-02-09)

  • RAG enabled by default
  • Context length now controlled from web client
  • Lowered RAG similarity threshold for better matches

v1.0.7 (2026-02-08)

  • Updated server URL to https://lightphon.com
  • Updated GitHub links to LightPhon repository

v1.0.6 (2026-02-08)

  • Added option to start automatically when Windows starts
  • Fixed tray exit not properly killing the process

v1.0.5 (2026-02-08)

  • Fixed login issues in compiled exe
  • UI improvements

v1.0.4 (2026-02-08)

  • Added minimize to system tray functionality
  • Window minimizes to tray icon instead of closing