IBM Releases Granite 3.1 Large Language Models with Enhanced Capabilities
Outperforming the Competition
IBM’s latest release, Granite 3.1, features a 128K token context window, a significant increase over its predecessors. According to IBM, the models outperform their rivals on Hugging Face’s OpenLLM Leaderboard benchmarks. The expanded window lets the models process and reason over much larger inputs, roughly 85,000 English words, enabling more comprehensive analysis and generation tasks.
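As a rough back-of-the-envelope check, assuming the common heuristic of about 1.5 tokens per English word (not an exact figure for any particular tokenizer), the 128K window works out to roughly 85,000 words:

```python
# Back-of-the-envelope estimate: English text averages roughly 1.5 tokens per word
# (a common heuristic, not an exact figure for Granite's tokenizer).
context_window_tokens = 128_000
tokens_per_word = 1.5

approx_words = context_window_tokens / tokens_per_word
print(f"~{approx_words:,.0f} English words")  # ~85,333 English words
```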
Improved Image and Language Capabilities
The new release introduces image-in/text-out functionality, broadening the models’ applicability for businesses working with visual content. Granite 3.1 also offers improved multilingual proficiency, covering a dozen languages: English, German, Spanish, French, Japanese, Portuguese, Arabic, Czech, Italian, Korean, Dutch, and Simplified Chinese.
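As a minimal sketch of multilingual prompting, assuming the Hugging Face transformers library and the ibm-granite/granite-3.1-8b-instruct checkpoint published on Hugging Face (the model id is an assumption and should be verified), a prompt in one of the supported languages can be sent like this:

```python
# Minimal sketch: multilingual chat with a Granite 3.1 instruct model via Hugging Face
# transformers. The model id below is an assumption based on IBM's Hugging Face naming.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3.1-8b-instruct"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# A Spanish prompt, one of the dozen supported languages
messages = [{"role": "user",
             "content": "Resume en dos frases las ventajas de un contexto de 128K tokens."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```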
MoE Models and Dense Models
The Granite 3.1 family includes both dense models and Mixture of Experts (MoE) variants. IBM states that its Granite 2B and 8B models are text-only dense LLMs trained on more than 12 trillion tokens. The dense models are designed for tool-based use cases and retrieval-augmented generation (RAG), streamlining code generation, translation, and bug fixing. The MoE variants are trained on more than 10 trillion tokens and are well suited to on-device applications where low latency is critical.
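As an illustrative sketch of the RAG pattern mentioned above (prompt construction only; the retrieve() and generate() helpers are hypothetical placeholders you would swap for your own retriever and a Granite 3.1 inference call), retrieved passages are simply prepended to the user’s question before generation:

```python
# Illustrative RAG prompt assembly: place retrieved passages into the context window
# before asking the model. retrieve() and generate() are placeholders for your own
# retriever (e.g. a vector store) and a Granite 3.1 inference endpoint.
def retrieve(query: str, k: int = 3) -> list[str]:
    # Placeholder: return the k most relevant passages from your document store.
    raise NotImplementedError

def build_rag_prompt(query: str, passages: list[str]) -> str:
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer the question using only the context below. "
        "Cite passage numbers where relevant.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

# Usage, once a real retriever and model are in place:
# passages = retrieve("What changed in our Q3 refund policy?")
# prompt = build_rag_prompt("What changed in our Q3 refund policy?", passages)
# answer = generate(prompt)  # any Granite 3.1 endpoint or local model
```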
Powerful, Trustworthy AI for Enterprises
Granite 3.1 models are available on IBM’s watsonx platform, through cloud providers such as Google Cloud’s Vertex AI, and on AI platforms including Hugging Face, NVIDIA (as NIM microservices), Ollama, and Replicate. The release is poised to accelerate AI adoption in enterprise settings, giving businesses efficient, enterprise-ready tools to drive innovation and solve complex business challenges.
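For local experimentation, a minimal sketch of calling a Granite 3.1 model through Ollama’s local REST API might look like the following; the model tag shown is an assumption, so check the Ollama model library for the exact name before use:

```python
# Minimal sketch: query a locally pulled Granite 3.1 model via Ollama's REST API
# (default endpoint http://localhost:11434). The model tag is an assumption; confirm
# the exact name in the Ollama model library.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "granite3.1-dense:8b",  # assumed tag
        "prompt": "Summarize the benefits of a 128K context window in one sentence.",
        "stream": False,
    },
    timeout=120,
)
print(response.json()["response"])
```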
Conclusion
IBM’s Granite 3.1 represents a significant step forward in providing enterprises with powerful, efficient, and trustworthy AI tools. By combining these models with proprietary data using techniques like IBM’s InstructLab, businesses can potentially achieve task-specific performance rivaling larger models at a fraction of the cost.
Frequently Asked Questions
Q: What are the key features of Granite 3.1?
A: Key features include a 128K token context window, new image-in/text-out functionality, support for a dozen languages, and a family of both dense and Mixture of Experts (MoE) models.
Q: How does Granite 3.1 outperform its rivals?
A: According to IBM, Granite 3.1 outperforms its rivals on Hugging Face’s OpenLLM Leaderboard benchmarks.
Q: What is the context window size of Granite 3.1?
A: Granite 3.1 boasts an impressive 128K token context window, allowing it to process and understand much larger amounts of text.
Q: Which languages does Granite 3.1 support?
A: Granite 3.1 supports a dozen languages, including English, German, Spanish, French, Japanese, Portuguese, Arabic, Czech, Italian, Korean, Dutch, and Simplified Chinese.

