LM Studio Uptime Monitor
Discover, download, and run local LLMs on your computer
Last 30 Days Performance
Average Uptime: 100% (over the 30-day monitoring period)
Average Response Time: 368.7 ms (mean across all checks)
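Each check behind these figures presumably amounts to a timed HTTP request against the monitored endpoint. A minimal sketch of such a check (the URL, timeout, and success criterion here are assumptions, not details from this monitor):

```python
import time
import urllib.request

def check(url: str, timeout: float = 10.0) -> tuple[bool, float]:
    """Perform one uptime check: request the URL and time the response.

    Returns (ok, latency in ms); ok is False on any error or non-2xx status.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            ok = 200 <= resp.status < 300
    except Exception:
        ok = False
    latency_ms = (time.monotonic() - start) * 1000.0
    return ok, latency_ms

# Hypothetical usage:
# ok, ms = check("https://lmstudio.ai")
```

A real monitor would run this on a schedule (e.g. every minute) and persist each result for later aggregation.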
Daily Status Overview

Historical Performance
Month     Uptime   Avg. Response Time
Dec 2025  100%     707 ms
Nov 2025  99.57%   625 ms
Oct 2025  99.87%   724 ms
Sep 2025  99.84%   652 ms
Aug 2025  99.57%   823 ms
Jul 2025  99.86%   1439 ms
Jun 2025  100%     1405 ms
May 2025  100%     1406 ms
Apr 2025  98.93%   1263 ms
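The uptime and response-time figures above can be derived from raw check results. A minimal sketch of that aggregation, using hypothetical data (the page itself does not specify whether failed checks are included in the latency mean; this sketch averages successful checks only):

```python
from dataclasses import dataclass

@dataclass
class Check:
    ok: bool           # did the endpoint respond successfully?
    latency_ms: float  # measured response time for this check

def summarize(checks: list[Check]) -> tuple[float, float]:
    """Return (uptime %, mean response time in ms over successful checks)."""
    up = [c for c in checks if c.ok]
    uptime = 100.0 * len(up) / len(checks)
    mean_latency = sum(c.latency_ms for c in up) / len(up)
    return round(uptime, 2), round(mean_latency, 1)

# Hypothetical sample: one failed check out of four
sample = [Check(True, 600), Check(True, 700), Check(False, 0), Check(True, 800)]
print(summarize(sample))  # (75.0, 700.0)
```

With roughly one check per minute, a month of 99.57% uptime corresponds to about three hours of cumulative downtime.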
Related Uptime Monitors
Explore uptime status for similar tools that also have monitoring enabled.
- lm-studio.me (Operational)
  Local LLM Running & Download Platform
  LM Studio is a user-friendly desktop application that lets users run various large language models (LLMs) locally and offline, including Llama 2, Phi-3, Falcon, Mistral, StarCoder, and Gemma models from Hugging Face.
  Last checked: 2 hours ago
- Ollama (Operational)
  Get up and running with large language models locally
  Ollama is a platform that enables users to run powerful language models like Llama 3.3, DeepSeek-R1, Phi-4, Mistral, and Gemma 2 on their local machines.
  Last checked: 2 hours ago
- Kolosal AI (Operational)
  The Ultimate Local LLM Platform
  Kolosal AI is a lightweight, open-source application enabling users to train, run, and chat with local Large Language Models (LLMs) directly on their devices, ensuring complete privacy and control.
  Last checked: 2 hours ago
- Bodhi (Operational)
  Run LLMs locally, powered by Open Source
  Bodhi is a free, privacy-focused application allowing users to run Large Language Models (LLMs) locally on their macOS devices without technical setup.
  Last checked: 2 hours ago
- Msty (Operational)
  The easiest way to use local and online AI models
  Msty is a user-friendly application that simplifies using local and online AI models, offering offline functionality, privacy, and advanced features like parallel multiverse chats.
  Last checked: 2 hours ago
- LlamaChat (Operational)
  Chat with LLaMA, Alpaca, and GPT4All models locally on your Mac
  LlamaChat is a free, open-source macOS application that enables users to run and chat with various LLaMA-based AI models locally on their computers, supporting both Intel and Apple Silicon processors.
  Last checked: 3 hours ago