DeepSeek + ASTER: Run Powerful Local AI Simultaneously for Multiple Users on One PC
DeepSeek (particularly the DeepSeek-R1 and DeepSeek-V3 families and their distilled versions) became one of the most discussed open-source AI models in 2025–2026. On many reasoning, math, and coding benchmarks these models match or approach o1, Claude 3.5, and GPT-4o — and they can run completely locally, with no cloud involved.

The easiest way to run DeepSeek on Windows is through Ollama, LM Studio, or Open WebUI + Ollama.
Hardware requirements examples (for distilled 7B–32B versions that work well on home PCs):
- 7B–8B model → 8–16 GB VRAM (RTX 3060/4060) or 16–32 GB RAM (CPU mode is slower)
- 32B → 24–48 GB VRAM (RTX 4090 / A6000) or 64+ GB RAM + offload
- Full 671B variants → require server-grade hardware (hundreds of GB VRAM/RAM), but distilled versions run on regular gaming PCs
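The hardware figures above follow a common rule of thumb (an estimate, not an official specification): a quantized model needs roughly its parameter count times the bytes per weight, plus overhead for the KV cache and runtime buffers. A minimal sketch, assuming 4-bit quantization and ~25% overhead:

```python
# Rough rule of thumb (estimate only): memory ≈ params × bytes-per-weight,
# plus ~25% overhead for the KV cache and runtime buffers.

def est_memory_gb(params_billion: float, bits_per_weight: int = 4,
                  overhead: float = 1.25) -> float:
    """Estimate the memory footprint in GB of a quantized model."""
    bytes_total = params_billion * 1e9 * (bits_per_weight / 8) * overhead
    return bytes_total / 1e9

if __name__ == "__main__":
    for size_b in (7, 8, 32):
        print(f"{size_b}B @ 4-bit ≈ {est_memory_gb(size_b):.1f} GB")
```

By this estimate a 7B model needs roughly 4–5 GB and a 32B model roughly 20 GB, which is consistent with the VRAM ranges listed above (higher-precision quantizations need proportionally more).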
The problem: while one user runs the model, family members, classmates, or office colleagues have to wait their turn.
The solution — ASTER from IBIK LLC (ibiksoft.com).
ASTER turns one powerful PC into up to 12 independent workstations without virtualization or thin clients. Each user gets:
- Their own monitor, keyboard, mouse
- A fully separate Windows desktop
- Independent launch of browser, Ollama/LM Studio, and DeepSeek chat
In practice:
- Install Ollama / LM Studio on the main (first) workstation.
- Download a DeepSeek model (e.g., deepseek-r1:7b or distilled 32B).
- ASTER allocates resources: each user either connects to the shared Ollama server (http://localhost:11434) from their own session via browser/WebUI, or runs a separate Ollama/LM Studio instance (note that each separate instance loads its own copy of the model into memory).
Performance: on a PC with Ryzen 9 / RTX 4090 / 64 GB RAM, 4–6 people can work comfortably at the same time (each with their own DeepSeek chat, coding, text generation).
Advantages of ASTER + DeepSeek combo:
- Up to 80% savings — one powerful system instead of 4–6 PCs/laptops
- Complete privacy — all data stays local, no cloud upload
- Perfect for schools, universities, AI clubs, families, coworking spaces, small companies
- No subscriptions — the models are free, and ASTER is a one-time license after the 14-day trial
Real-world example configuration (works in 2026):
- CPU: Ryzen 7/9 7700X–9950X
- GPU: RTX 4080/4090 (16–24 GB VRAM)
- RAM: 64–128 GB
→ 4–8 simultaneous users with DeepSeek 7B–32B at acceptable speed
Want to test it? Download the 14-day full trial of ASTER from ibiksoft.com, install Ollama (ollama.com), load any DeepSeek version — and see for yourself how multiple people can use powerful local AI on a single computer at the same time.
DeepSeek + ASTER = affordable, private, and truly multi-user local AI right at home or in the classroom.

