Table of Contents
1. From the perspective of challenges, why can TransformerFAM help large models “remember more”?
2. With working memory, large models continue the march toward AGI

Google takes action to rectify the 'amnesia' of large models! The feedback attention mechanism helps you 'update' the context, and the era of unlimited memory for large models is coming.

Apr 17, 2024, 03:40 PM

Editor | Yi Feng

produced | 51CTO technology stack (WeChat ID: blog51cto)

Google has finally taken action: we will no longer have to suffer the "amnesia" of large models.

TransformerFAM has arrived, promising to give large models unlimited memory!

Without further ado, let’s take a look at the “efficacy” of TransformerFAM:


Large-model performance on long-context tasks improves significantly!

In the figure above, tasks such as Isabelle and NarrativeQA require the model to understand and process large amounts of contextual information and to give accurate answers or summaries for specific questions. On all tasks, the model configured with FAM outperforms every BSWA configuration, and it can be seen that beyond a certain point, increasing the number of BSWA memory segments no longer improves its memory capabilities.

It seems that on the road to long texts and long conversations, FAM's "unforgettable" memory for large models really does have something to it.

Google researchers introduced FAM, a novel Transformer architecture: Feedback Attention Memory. It uses a feedback loop to let the network attend to its own latent representations, fostering the emergence of working memory inside the Transformer and enabling it to process indefinitely long sequences.

To put it simply, this strategy is a bit like our own manual workaround for large-model "amnesia": re-entering the prompt before each new conversation with the model. FAM's approach is just more sophisticated. When the model processes a new block of data, it takes the previously processed information (that is, the FAM) as a dynamically updated context and integrates it back into the current processing step.

In this way, the problem of "forgetting things" is handled well. Even better, although a feedback mechanism is introduced to maintain long-term working memory, FAM is designed to stay compatible with pre-trained models and requires no additional weights. So in theory, the large model's stronger memory neither slows it down nor consumes extra computing resources.

So how was such a remarkable TransformerFAM devised? What are the technologies behind it?

1. From the perspective of challenges, why can TransformerFAM help large models “remember more”?

The concept of Sliding Window Attention (SWA) is crucial to the design of TransformerFAM.

In the traditional Transformer model, the complexity of self-attention grows quadratically with sequence length, which limits the model's ability to handle long sequences.
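To make that quadratic cost concrete, here is a minimal NumPy sketch of single-head self-attention. It is a toy illustration (identity projections, no learned weights), not a real Transformer layer: the score matrix alone has shape (L, L), so compute and memory grow as O(L²) in the sequence length L.

```python
import numpy as np

def self_attention(x):
    """Naive single-head self-attention over a sequence of shape (L, d).

    The score matrix q @ k.T has shape (L, L), so both compute and
    memory grow quadratically with the sequence length L.
    """
    q, k, v = x, x, x  # identity projections, for illustration only
    scores = q @ k.T / np.sqrt(x.shape[-1])          # (L, L) -- the O(L^2) term
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v

x = np.random.randn(128, 16)
out = self_attention(x)
print(out.shape)  # (128, 16)
```

Doubling L from 128 to 256 quadruples the size of the score matrix, which is exactly why long sequences become impractical.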

"In the movie Memento (2000), the main character suffers from anterograde amnesia, which means he cannot remember what happened in the past 10 minutes, but his long-term memory is intact , he had to tattoo important information on his body to remember them, similar to the current state of large language models (LLMs)," the paper reads.

[Image: screenshot from the movie Memento; picture from the Internet]

Sliding Window Attention (SWA) is an improved attention mechanism for processing long-sequence data, inspired by the sliding-window technique from computer science. In natural language processing (NLP) tasks, SWA lets the model attend, at each step, to only a fixed-size window of the input sequence rather than the entire sequence. Its advantage is therefore a significant reduction in computational cost.
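One way to see how SWA cuts the cost is through its attention mask. The helper below is an illustrative NumPy sketch (not from the paper): it builds a causal mask in which each query position attends only to the most recent `window` keys, so the work per position is O(window) instead of O(L).

```python
import numpy as np

def sliding_window_mask(seq_len, window):
    """Boolean causal mask: position i may attend only to positions
    i - window + 1 .. i (the most recent `window` tokens)."""
    idx = np.arange(seq_len)
    rel = idx[:, None] - idx[None, :]   # distance from query i to key j
    return (rel >= 0) & (rel < window)  # causal AND within the window

mask = sliding_window_mask(6, 3)
print(mask.astype(int))
```

Each row of the mask has at most `window` ones, so the total attention cost is O(L * window) rather than O(L²).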


However, SWA has a limitation: because its attention span is confined to the window size, the model cannot take important information outside the window into account.

TransformerFAM achieves integrated attention, block-level updates, information compression, and global contextual storage by adding feedback activations that feed contextual representations back into each block of sliding-window attention.

In TransformerFAM, the improvement comes from a feedback loop. Specifically, when processing the current block of the sequence, the model attends not only to the elements within the current window but also reintroduces previously processed contextual information (the earlier "feedback activations") as additional input to the attention mechanism. In this way, even as the model's attention window slides over the sequence, it maintains memory and understanding of earlier information.
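The feedback loop described above can be sketched as follows. This is a heavily simplified, hypothetical NumPy illustration of the idea, not the paper's architecture: projections, multiple heads, and training are omitted, and `fam_forward`, `block_size`, and `fam_len` are names invented for the example. Each block attends to [FAM; block], and the FAM is then updated by attending from the old FAM to the block's outputs, compressing the block into a fixed-size memory that is carried forward to the next block.

```python
import numpy as np

def softmax(s):
    e = np.exp(s - s.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attend(q, kv):
    """Cross-attention: queries q (m, d) attend over keys/values kv (n, d)."""
    return softmax(q @ kv.T / np.sqrt(q.shape[-1])) @ kv

def fam_forward(seq, block_size=4, fam_len=2):
    """Process seq (L, d) block by block with a fixed-size feedback memory."""
    d = seq.shape[-1]
    fam = np.zeros((fam_len, d))                # feedback activations (working memory)
    outputs = []
    for start in range(0, len(seq), block_size):
        block = seq[start:start + block_size]
        ctx = np.concatenate([fam, block])      # prepend memory to the current window
        block_out = attend(block, ctx)          # block attends to memory + itself
        # compress the block's outputs into the fixed-size memory for the next block
        fam = attend(fam, np.concatenate([fam, block_out]))
        outputs.append(block_out)
    return np.vstack(outputs), fam

x = np.random.randn(10, 8)
out, fam = fam_forward(x)
print(out.shape, fam.shape)  # (10, 8) (2, 8)
```

The key design point this sketch tries to capture is that the memory stays a fixed size (`fam_len` rows) no matter how long the sequence grows, so cost per block is constant and, in principle, the loop can run over arbitrarily long inputs.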

So, after these improvements, TransformerFAM gives LLMs the potential to handle sequences of unlimited length!

2. With working memory, large models continue the march toward AGI

TransformerFAM has shown promising results in research, and it will undoubtedly improve AI's ability to understand and generate long texts, boosting performance on tasks such as document summarization, story writing, and Q&A.


At the same time, whether as an intelligent assistant or an emotional companion, an AI with unlimited memory sounds far more appealing.

Interestingly, the design of TransformerFAM is inspired by memory mechanisms in biology, which aligns with the natural-intelligence simulation that AGI pursues. The paper is an attempt to integrate attention-based working memory, a concept from neuroscience, into the field of deep learning.

TransformerFAM introduces working memory into large models through a feedback loop, enabling the model not only to remember short-term information but also to retain important information across long sequences.

Through bold imagination, the researchers build a hypothetical bridge between the real world and abstract concepts. As innovations like TransformerFAM continue to emerge, technical bottlenecks are broken through again and again, and a more intelligent, interconnected future slowly unfolds before us.

For more on AIGC, visit:

51CTO AI.x Community

https://www.51cto.com/aigc/

