# Device Compatibility
Local Dream supports two inference paths: NPU (fastest, requires a supported Snapdragon chip) and CPU/GPU (fallback, runs on most modern Android devices).
## NPU Acceleration
NPU inference uses the Qualcomm QNN SDK and runs on the Hexagon NPU built into Snapdragon chips.
### SD1.5 — Hexagon V68 or newer
| Chip | Supported |
|---|---|
| Snapdragon 8 Gen 1 / 8+ Gen 1 | ✅ |
| Snapdragon 8 Gen 2 | ✅ |
| Snapdragon 8 Gen 3 | ✅ |
| Snapdragon 8 Elite | ✅ |
| Snapdragon 8 Elite Gen 5 / 8 Gen 5 | ✅ |
| Non-flagship chips with Hexagon V68 or newer (e.g. Snapdragon 7 Gen 1, 8s Gen 3) | ✅ |
### SDXL — Snapdragon 8 Gen 3 or newer
| Chip | Supported |
|---|---|
| Snapdragon 8 Gen 3 | ✅ |
| Snapdragon 8 Elite | ✅ |
| Snapdragon 8 Elite Gen 5 / 8 Gen 5 | ✅ |
> **Tip:** On unsupported chips, NPU models simply will not appear in the in-app download list.
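If you need to replicate this gating outside the app, one approach is a lookup against SoC model strings as reported by `android.os.Build.SOC_MODEL` (available on Android 12+). This is a minimal sketch: the `SM…` strings below are assumptions for illustration and should be verified against real devices, and the sets are deliberately incomplete.

```java
import java.util.Set;

public class NpuSupport {
    // Assumed SoC model strings (the values android.os.Build.SOC_MODEL is
    // believed to report); confirm on hardware before relying on them.
    static final Set<String> SD15_NPU = Set.of(
        "SM8450", // Snapdragon 8 Gen 1 (assumed)
        "SM8475", // Snapdragon 8+ Gen 1 (assumed)
        "SM8550", // Snapdragon 8 Gen 2 (assumed)
        "SM8650", // Snapdragon 8 Gen 3 (assumed)
        "SM8750"  // Snapdragon 8 Elite (assumed)
    );
    // SDXL requires Snapdragon 8 Gen 3 or newer.
    static final Set<String> SDXL_NPU = Set.of("SM8650", "SM8750");

    static boolean supportsSd15Npu(String socModel) {
        return SD15_NPU.contains(socModel);
    }

    static boolean supportsSdxlNpu(String socModel) {
        return SDXL_NPU.contains(socModel);
    }

    public static void main(String[] args) {
        // 8 Gen 2 class chip: SD1.5 NPU yes, SDXL NPU no.
        System.out.println(supportsSd15Npu("SM8550"));
        System.out.println(supportsSdxlNpu("SM8550"));
    }
}
```

In practice the chip list would come from a maintained manifest rather than being hard-coded, so new Snapdragon generations can be added without an app update.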
## CPU/GPU Inference
The CPU/GPU path uses the MNN framework with 8-bit weight (W8) dynamic quantization. It runs on most Android phones released in recent years.
- RAM: ~2 GB available memory recommended
- Resolution: Flexible — 128×128, 256×256, 384×384, 512×512
- Performance: Moderate speed; high compatibility
If your chip is outside the NPU support list, you can still run SD1.5 in CPU/GPU mode.
## Quick Decision Tree
- Snapdragon 8 Gen 3 or newer flagship? → SDXL via NPU is available.
- Hexagon V68 or newer (any Snapdragon NPU listed above)? → SD1.5 via NPU is available.
- Otherwise → SD1.5 via CPU/GPU is your only option.
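The decision tree above can be sketched as a small tiered selector. The capability flags and return strings here are hypothetical names chosen for illustration, not part of the app:

```java
public class InferencePath {
    // Mirrors the three tiers of the decision tree: the first capability
    // that matches wins, and CPU/GPU is the universal fallback.
    static String choosePath(boolean sdxlCapableNpu, boolean sd15CapableNpu) {
        if (sdxlCapableNpu) {
            return "SDXL or SD1.5 via NPU"; // 8 Gen 3 or newer
        }
        if (sd15CapableNpu) {
            return "SD1.5 via NPU"; // any Hexagon V68+ chip
        }
        return "SD1.5 via CPU/GPU"; // everything else
    }

    public static void main(String[] args) {
        // e.g. a Snapdragon 7 Gen 1: SD1.5 NPU capable, not SDXL capable.
        System.out.println(choosePath(false, true));
    }
}
```

Note that an SDXL-capable chip also satisfies the SD1.5 tier, which is why the checks must run in this order.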