World Lifestyler
Nota AI Reduces Memory Usage of Upstage’s Solar LLM by 72%, Demonstrating Proprietary Quantization Technology

By Cision PR Newswire
March 6, 2026

New “Nota AI MoE Quantization” approach preserves model performance while significantly improving memory efficiency

SEOUL, South Korea, March 5, 2026 /PRNewswire/ — Nota AI, an AI model optimization technology company, announced that it has developed a next-generation quantization technology that significantly compresses Solar, a high-performance large language model (LLM) developed by Upstage, while maintaining high accuracy. The breakthrough reduces inference costs and improves processing speed without sacrificing performance.


Nota AI Reduces Memory Usage of Upstage’s Solar LLM by 72%, Demonstrating Proprietary Quantization Technology (PRNewsfoto/Nota AI)

The development was carried out as part of the “Sovereign AI Foundation Model Project” led by South Korea’s Ministry of Science and ICT. By applying Nota AI’s lightweighting and optimization technologies to Solar Open 100B, the company significantly improved memory efficiency while preserving model performance. The achievement lowers the memory requirements of the 100B-parameter model while maintaining its capabilities, enabling more practical deployment of Korean AI foundation models in physical AI environments such as mobility and robotics.

The newly developed technology focuses on addressing technical challenges associated with the Mixture of Experts (MoE) architecture, which is rapidly gaining adoption in next-generation LLMs. Conventional quantization methods typically compress the entire model uniformly without considering the distinct characteristics of individual expert models. To overcome this limitation, Nota AI developed a proprietary algorithm optimized for MoE architectures, called “Nota AI MoE Quantization.”

The approach is designed to minimize quantization distortion during the inference process of MoE models. Unlike conventional methods that uniformly reduce precision across all operations, Nota AI’s algorithm selectively preserves precision in critical components while compressing less sensitive parts of the model. This enables effective model compression while minimizing performance loss.
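The release does not publish the algorithm itself, so the following is only a minimal illustrative sketch of the general idea of selective (mixed-precision) quantization: rank components by a sensitivity score and keep the most sensitive ones in full precision while round-to-nearest 4-bit quantizing the rest. The sensitivity metric, the 4-bit scheme, and the `keep_top` fraction are all assumptions for illustration, not Nota AI's method.

```python
import numpy as np

def quantize_int4(w):
    """Symmetric round-to-nearest 4-bit quantization, returned dequantized
    so the approximation error can be inspected directly."""
    scale = np.abs(w).max() / 7.0          # map the largest weight to int4 value 7
    q = np.clip(np.round(w / scale), -8, 7)
    return (q * scale).astype(w.dtype)

def selective_quantize(experts, sensitivity, keep_top=0.125):
    """Quantize expert weight tensors, keeping the most sensitive fraction
    in full precision (sensitivity could be e.g. an activation or Hessian proxy)."""
    n_keep = max(1, int(len(experts) * keep_top))
    keep = set(np.argsort(sensitivity)[-n_keep:])  # indices of most sensitive experts
    return [w if i in keep else quantize_int4(w) for i, w in enumerate(experts)]

rng = np.random.default_rng(0)
experts = [rng.normal(size=(64, 64)).astype(np.float32) for _ in range(8)]
sensitivity = [float(np.abs(w).mean()) for w in experts]
compressed = selective_quantize(experts, sensitivity)
```

Under this toy scheme, one of the eight experts survives at full precision while the other seven shrink to roughly a quarter of their storage, which is the basic trade the press release describes: compress the insensitive parts, protect the critical ones.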

Applying the technology to the Solar 100B model yielded significant improvements compared with conventional quantization methods. Nota AI successfully reduced Solar’s memory usage from 191.2GB to 51.9GB, representing a 72.8% reduction. At the same time, the model maintained performance levels comparable to the original version, achieving a Perplexity (PPL) score of 6.81, close to the baseline model’s 6.06. In contrast, some generic quantization approaches resulted in performance degradation exceeding fivefold. Nota AI has filed a patent application for the technology to strengthen its intellectual property portfolio.
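The reported figures are internally consistent. Assuming memory is dominated by the 100B weights (an assumption; activations and KV cache are ignored here), a quick back-of-the-envelope check:

```python
orig_gb, quant_gb, params = 191.2, 51.9, 100e9

reduction = (orig_gb - quant_gb) / orig_gb      # ~0.728, matching the reported ~72.8%
bits_orig = orig_gb * 8e9 / params              # ~15.3 bits/param, consistent with 16-bit weights
bits_quant = quant_gb * 8e9 / params            # ~4.2 bits/param, in line with ~4-bit quantization
```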

While conventional quantization techniques often sacrifice model performance to reduce memory usage, Nota AI’s technology demonstrates that it is possible to maintain performance while delivering AI services faster and to more users on limited GPU infrastructure. As a result, enterprises can deploy large-scale LLMs more easily on their own devices—models that were previously difficult to implement due to hardware constraints.

The significant reduction in Solar 100B’s memory footprint while preserving performance also creates new opportunities for deploying high-performance AI in real-world on-device environments, including robotics and automotive systems. Additionally, the technology enables organizations facing limited access to high-end GPU infrastructure to serve more users on the same hardware, directly contributing to lower operational costs.

“This achievement is meaningful because we were able to apply Nota AI’s proprietary quantization technology to Solar 100B, a Korean AI foundation model, significantly reducing memory usage while maintaining performance,” said Myungsu Chae, CEO of Nota AI. “As demand grows for deploying large-scale models directly on devices, Nota AI’s lightweighting and optimization technologies will play a critical role in enabling high-performance AI.”

View original content to download multimedia: https://www.prnewswire.com/news-releases/nota-ai-reduces-memory-usage-of-upstages-solar-llm-by-72-demonstrating-proprietary-quantization-technology-302706619.html

SOURCE Nota AI
