A list of Edge optimized Ollama models
Updated Jan 7, 2026
Fine-tuning LiquidAI/LFM2-VL-1.6B in Colab (LoRA/4-bit) + dataset template + probe test.
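A minimal sketch of the kind of LoRA/4-bit setup this repo describes, assuming LiquidAI/LFM2-VL-1.6B loads through Hugging Face transformers with bitsandbytes quantization and PEFT; the `target_modules` layer names are assumptions and should be checked against `model.named_modules()`.

```python
# Hypothetical QLoRA-style setup for LiquidAI/LFM2-VL-1.6B (not the repo's exact code).
import torch
from transformers import AutoProcessor, AutoModelForImageTextToText, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_id = "LiquidAI/LFM2-VL-1.6B"

# 4-bit NF4 quantization so the 1.6B model fits on a free Colab GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForImageTextToText.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# LoRA adapters on the attention projections; layer names are an assumption.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```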
🧠 AI Developer based in Tokyo 🇯🇵 | Building creative, human-centered AI experiences. 🥈 2nd Place — Liquid AI x W&B x Lambda Hackathon (Tokyo) 💻 Focused on multimodal LLMs, voice interaction & fast prototyping (vibe coding). 🌸 Exploring the intersection of Japanese culture and AI innovation.
Anywhere-LFM is a lightweight desktop application designed to make it easy to run LiquidAI/LFM2 models (350M to 1.2B) locally. It lets anyone chat with a capable generative AI without needing a command line or complex configuration.