Microsoft has launched Fara-7B, a compact 7-billion-parameter small language model (SLM) designed to operate computers the way humans do—clicking, typing, navigating and completing multi-step tasks directly from on-screen content.

Unlike typical AI agents that chain together multiple large cloud models, Fara-7B runs entirely on-device, reducing latency and improving privacy by keeping user data local. Microsoft describes it as its first fully agentic SLM, able to interpret screenshots and act without assistance from external models.

The model was trained with FaraGen, an AI-driven synthetic data pipeline that produced more than 145,000 verified task trajectories covering real-world websites and interfaces across roughly 70,000 domains.

Early benchmarks show Fara-7B outperforming other lightweight agentic systems and achieving competitive results against larger models—while costing as little as 2.5 cents per task, compared to 30 cents for GPT-4-class agents.
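To put the quoted per-task prices in perspective, here is a minimal back-of-the-envelope sketch in Python. The per-task costs are the figures cited above; the task volume is a hypothetical example, not a number from Microsoft.

```python
# Illustrative arithmetic only: per-task costs come from the figures
# cited above; the 10,000-task workload is a hypothetical example.
FARA_COST_PER_TASK = 0.025   # dollars (~2.5 cents per task)
GPT4_COST_PER_TASK = 0.30    # dollars (~30 cents per task)

def total_cost(cost_per_task: float, tasks: int) -> float:
    """Total dollar cost for running a given number of agent tasks."""
    return cost_per_task * tasks

tasks = 10_000  # hypothetical monthly workload
fara = total_cost(FARA_COST_PER_TASK, tasks)
gpt4 = total_cost(GPT4_COST_PER_TASK, tasks)

print(f"Fara-7B: ${fara:,.2f}")          # $250.00
print(f"GPT-4-class: ${gpt4:,.2f}")      # $3,000.00
print(f"Cost ratio: {gpt4 / fara:.0f}x") # 12x
```

At those per-task prices the larger agents cost about twelve times more, which is the kind of gap that matters once agentic tasks run at volume.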

Fara-7B is available on Microsoft Foundry and Hugging Face, with an MIT license and device-ready versions optimized for Windows 11 Copilot+ PCs.