
New Hugging Face Vulnerability Exposes AI Models to Supply Chain Attacks


Cybersecurity researchers have found that it's possible to compromise the Hugging Face Safetensors conversion service to ultimately hijack the models submitted by users and result in supply chain attacks.

“It's possible to send malicious pull requests with attacker-controlled data from the Hugging Face service to any repository on the platform, as well as hijack any models that are submitted through the conversion service,” HiddenLayer said in a report published last week.

This, in turn, can be accomplished using a hijacked model that's meant to be converted by the service, thereby allowing malicious actors to request changes to any repository on the platform by masquerading as the conversion bot.

Hugging Face is a popular collaboration platform that helps users host pre-trained machine learning models and datasets, as well as build, deploy, and train them.

Safetensors is a format devised by the company to store tensors with security in mind, as opposed to pickles, which have likely been weaponized by threat actors to execute arbitrary code and deploy Cobalt Strike, Mythic, and Metasploit stagers.
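The pickle risk boils down to the fact that unpickling can invoke arbitrary callables. A minimal, benign illustration (the class name and the `eval` payload are hypothetical stand-ins for what a real attacker would replace with a shell command or stager download):

```python
import pickle

class MaliciousModel:
    def __reduce__(self):
        # pickle calls the returned callable with these args during load;
        # a real attacker would substitute os.system or a stager here.
        return (eval, ("40 + 2",))

blob = pickle.dumps(MaliciousModel())
result = pickle.loads(blob)  # arbitrary code runs during deserialization
print(result)                # 42 -- loading the "model" executed the payload
```

Loading a safetensors file, by contrast, involves only parsing metadata and copying raw bytes, which is why the format was introduced in the first place.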

It also comes with a conversion service that enables users to convert any PyTorch model (i.e., pickle) to its Safetensors equivalent via a pull request.
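To see why the converted format is safe to load, here is a minimal pure-Python sketch of the safetensors layout (illustrative only, not the official library): an 8-byte little-endian header length, a JSON header describing each tensor, then raw bytes. Loading is plain parsing, with no code execution:

```python
import json
import struct

def save_safetensors(tensors):
    # tensors: name -> (dtype, shape, raw_bytes)
    body, header = b"", {}
    for name, (dtype, shape, raw) in tensors.items():
        header[name] = {"dtype": dtype, "shape": shape,
                        "data_offsets": [len(body), len(body) + len(raw)]}
        body += raw
    hdr = json.dumps(header).encode()
    return struct.pack("<Q", len(hdr)) + hdr + body

def load_safetensors(blob):
    (n,) = struct.unpack("<Q", blob[:8])
    header = json.loads(blob[8:8 + n])  # plain JSON: safe to parse
    data = blob[8 + n:]
    return {k: (v["dtype"], v["shape"], data[slice(*v["data_offsets"])])
            for k, v in header.items()}

raw = struct.pack("<4f", 1.0, 2.0, 3.0, 4.0)  # a 2x2 float32 tensor
blob = save_safetensors({"weight": ("F32", [2, 2], raw)})
restored = load_safetensors(blob)["weight"]
```

The catch the researchers identified is not the format itself but the service performing the conversion: it must first *unpickle* the untrusted PyTorch input, which is exactly the dangerous step.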

HiddenLayer's analysis of this module found that it's hypothetically possible for an attacker to hijack the hosted conversion service using a malicious PyTorch binary and compromise the system hosting it.

What's more, the token associated with SFConvertbot, an official bot designed to generate the pull request, could be exfiltrated to send a malicious pull request to any repository on the site, leading to a scenario in which a threat actor could tamper with the model and implant neural backdoors.

“An attacker could run any arbitrary code any time someone attempted to convert their model,” researchers Eoin Wickens and Kasimir Schulz noted. “With no indication to the users themselves, their models could be hijacked upon conversion.”

Should a user attempt to convert their own private repository, the attack could pave the way for the theft of their Hugging Face token, access to otherwise internal models and datasets, and even their poisoning.

Complicating matters further, an adversary could take advantage of the fact that any user can submit a conversion request for a public repository in order to hijack or alter a widely used model, potentially resulting in a considerable supply chain risk.

“Despite the best intentions to secure machine learning models in the Hugging Face ecosystem, the conversion service has proven to be vulnerable and has had the potential to cause a widespread supply chain attack via the Hugging Face official service,” the researchers said.

“An attacker could gain a foothold into the container running the service and compromise any model converted by the service.”

The development comes a little over a month after Trail of Bits disclosed LeftoverLocals (CVE-2023-4969, CVSS score: 6.5), a vulnerability that allows the recovery of data from Apple, Qualcomm, AMD, and Imagination general-purpose graphics processing units (GPGPUs).

The memory leak flaw, which stems from a failure to adequately isolate process memory, enables a local attacker to read memory from other processes, including another user's interactive session with a large language model (LLM).
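The failure mode is analogous to reusing an uninitialized scratch buffer across workloads. A toy Python simulation (the kernel functions and the shared buffer are hypothetical stand-ins; the real flaw concerns GPU local memory shared between dispatches):

```python
# Toy model of the LeftoverLocals failure mode: two "kernels" share a
# scratch buffer that is never zeroed between dispatches.
SCRATCH = bytearray(16)  # stands in for GPU local memory

def victim_kernel(secret: bytes):
    # A victim workload (e.g. an LLM inference pass) stages data in
    # local memory and leaves the buffer dirty when it returns.
    SCRATCH[:len(secret)] = secret

def listener_kernel() -> bytes:
    # A co-resident attacker kernel simply reads the uninitialized
    # buffer and recovers whatever the previous kernel left behind.
    return bytes(SCRATCH)

victim_kernel(b"llm-secret")
leak = listener_kernel()
```

Zeroing local memory between dispatches, the mitigation vendors shipped, corresponds to clearing `SCRATCH` before each kernel launch.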

“This data leakage can have severe security consequences, especially given the rise of ML systems, where local memory is used to store model inputs, outputs, and weights,” security researchers Tyler Sorensen and Heidy Khlaaf said.


Some parts of this article are sourced from:
thehackernews.com


Copyright © 2025 · AllTech.News, All Rights Reserved.