News
The BitNet b1.58 2B4T model was developed by Microsoft's General Artificial Intelligence group and contains two billion ...
Gadget Review on MSN: "BitNet: Microsoft's Compact AI Challenges Industry Giants with Radical Efficiency"
Microsoft's BitNet challenges industry norms with a minimalist approach using ternary weights that require just 400MB of ...
ExtremeTech on MSN: "Microsoft's New Compact 1-Bit LLM Needs Just 400MB of Memory"
Microsoft’s new large language model (LLM) puts significantly less strain on hardware than other LLMs—and it’s free to ...
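The ~400MB figure quoted in these reports is roughly what you would expect for about two billion ternary weights. A minimal back-of-the-envelope check, assuming ~2B parameters and ~1.58 bits per weight (real packing schemes and any non-quantized layers will shift the exact number):

```python
# Rough check of the ~400MB memory figure reported for BitNet b1.58 2B4T.
# Assumes ~2 billion ternary weights at log2(3) ≈ 1.585 bits of information each;
# actual packing formats and unquantized layers are not accounted for here.
import math

params = 2_000_000_000            # ~2B parameters, per the coverage above
bits_per_weight = math.log2(3)    # a {-1, 0, +1} value carries log2(3) bits

total_bytes = params * bits_per_weight / 8
print(f"~{total_bytes / 1e6:.0f} MB")   # ≈ 396 MB, consistent with the ~400MB claim
```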
Microsoft researchers have developed — and released — a hyper-efficient AI model that can run on CPUs, including Apple's M2.
Memory requirements are the most obvious advantage of reducing the complexity of a model's internal weights. The BitNet b1.58 ...
Microsoft’s model BitNet b1.58 2B4T is available on Hugging Face, but it does not run on GPUs and requires a proprietary framework.
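For readers who want to inspect the released checkpoint, a minimal sketch using the standard Hugging Face transformers API is below. The repo id is an assumption, and loading may require a recent transformers release with BitNet support; as the coverage notes, efficient inference relies on Microsoft's dedicated framework rather than a plain GPU/transformers setup, so treat this as a way to download and poke at the model, not a performance recipe.

```python
# Sketch: pull the published BitNet checkpoint from Hugging Face and run a
# short CPU generation. The model id below is an assumption for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/bitnet-b1.58-2B-4T"   # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # loads on CPU by default

prompt = "Explain what a 1-bit language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```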
Microsoft’s new BitNet b1.58 model significantly reduces memory and energy requirements while matching the capabilities of ...
Microsoft released what it describes as the largest-scale 1-bit AI model to date, BitNet b1.58 2B4T. Unlike traditional ...
Microsoft’s General Artificial Intelligence group has introduced a groundbreaking large language model (LLM) that drastically ...
Microsoft researchers have created BitNet b1.58 2B4T, a large-scale 1-bit AI model that can efficiently run on CPUs, ...
Microsoft Research has introduced BitNet b1.58 2B4T, a new 2-billion parameter language model that uses only 1.58 bits per ...
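The "1.58 bits" figure comes from restricting each weight to the three values {-1, 0, +1}, which carry log2(3) ≈ 1.58 bits of information. Below is an illustrative sketch of ternary weight quantization in the spirit of the absmean scheme described in the BitNet b1.58 papers; the exact constants and where in the network this is applied are assumptions for illustration, not Microsoft's released implementation.

```python
# Illustrative ternary ("1.58-bit") weight quantization sketch:
# scale a weight matrix by its mean absolute value, then round and clip
# every entry to {-1, 0, +1}. Details here are assumptions for illustration.
import numpy as np

def ternary_quantize(w: np.ndarray, eps: float = 1e-6):
    scale = np.abs(w).mean() + eps            # per-matrix "absmean" scale
    q = np.clip(np.round(w / scale), -1, 1)   # each entry becomes -1, 0, or +1
    return q, scale                           # dequantize later as q * scale

w = np.random.randn(4, 4).astype(np.float32)
q, scale = ternary_quantize(w)
print(q)       # matrix of -1/0/+1 values
print(scale)   # scaling factor kept alongside the ternary weights
```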