    Ztoog
    AI

    Researchers from UC Berkeley Propose RingAttention: A Memory-Efficient Artificial Intelligence Approach to Reduce the Memory Requirements of Transformers


    Transformers are a deep learning model architecture used in many state-of-the-art AI systems. They have revolutionized the field of artificial intelligence, particularly natural language processing, along with a wide range of other machine learning tasks. The architecture relies on a self-attention mechanism, in which the model weighs the importance of different parts of the input sequence when making predictions, and it consists of an encoder and a decoder to process the inputs.
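
    To make the mechanism concrete, here is a minimal single-head self-attention sketch in NumPy. The function name, shapes, and random weights are illustrative, not from the paper; it omits batching, masking, and multiple heads.

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention (no masking, no batching)."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)           # (seq, seq): one weight per pair of positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over key positions
    return weights @ v                      # each output mixes all value vectors

rng = np.random.default_rng(0)
seq_len, d_model = 8, 16
x = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (8, 16)
```

    The `(seq, seq)` score matrix in this sketch is exactly the object whose size becomes the bottleneck as context length grows.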

    However, scaling up the context length of Transformers is difficult, precisely because of that self-attention mechanism: its memory cost is quadratic in the input sequence length, which makes longer input sequences hard to handle. Researchers at UC Berkeley developed a technique called Ring Attention to address this, based on a simple observation: when the self-attention and feedforward network computations are performed blockwise, the sequence can be distributed across multiple devices and processed efficiently.

    They distribute the outer loop of the blockwise attention computation among hosts, with each device managing its respective input block. In the inner loop, every device computes the blockwise attention and feedforward operations for its designated input block. The host devices form a conceptual ring: each device sends a copy of the key-value blocks it is currently using to the next device in the ring, while simultaneously receiving key-value blocks from the previous one.
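
    The ring schedule can be simulated on one machine. The sketch below is my own simplified illustration, not the authors' code: each "device" owns one query block, key-value blocks rotate around the ring, and a FlashAttention-style online softmax accumulates exact attention blockwise.

```python
import numpy as np

def ring_attention_sim(q_blocks, k_blocks, v_blocks):
    """Simulate Ring Attention: 'device' i owns q_blocks[i] and starts with
    (k_blocks[i], v_blocks[i]); KV pairs rotate one position per step."""
    n = len(q_blocks)
    d = q_blocks[0].shape[-1]
    # Per-device online-softmax state: running max, denominator, numerator.
    m = [np.full((q.shape[0], 1), -np.inf) for q in q_blocks]
    l = [np.zeros((q.shape[0], 1)) for q in q_blocks]
    acc = [np.zeros_like(q) for q in q_blocks]
    kv = list(zip(k_blocks, v_blocks))

    for _ in range(n):                     # n ring steps: every device sees every KV block
        for i in range(n):                 # devices run in parallel in the real system
            k, v = kv[i]
            s = q_blocks[i] @ k.T / np.sqrt(d)
            m_new = np.maximum(m[i], s.max(axis=-1, keepdims=True))
            scale = np.exp(m[i] - m_new)   # rescale old accumulators to the new max
            p = np.exp(s - m_new)
            l[i] = l[i] * scale + p.sum(axis=-1, keepdims=True)
            acc[i] = acc[i] * scale + p @ v
            m[i] = m_new
        kv = kv[1:] + kv[:1]               # pass KV blocks to the next device in the ring
    return np.concatenate([a / li for a, li in zip(acc, l)])

# Check against full (non-blockwise) attention on a toy example.
rng = np.random.default_rng(1)
q, k, v = (rng.normal(size=(16, 8)) for _ in range(3))
out = ring_attention_sim(np.split(q, 4), np.split(k, 4), np.split(v, 4))
s = q @ k.T / np.sqrt(8)
w = np.exp(s - s.max(-1, keepdims=True))
ref = (w / w.sum(-1, keepdims=True)) @ v
print(np.allclose(out, ref))  # True
```

    The key point the simulation shows is that the result is exact, identical to full attention, even though no device ever holds more than one KV block at a time.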

    Block computations take longer than block transfers, so the team overlapped the two, resulting in no added overhead compared with standard Transformers. As a consequence, each device needs memory proportional only to the block size, independent of the original input sequence length. This effectively eliminates the memory constraints imposed by individual devices.
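
    A hypothetical per-device estimate (my illustrative numbers, not the paper's) makes the point: the activation footprint depends on the block size and hidden dimension, and the total sequence length never appears in the formula.

```python
def per_device_bytes(block, d, bytes_per_el=2):
    """Rough fp16 activation footprint per device under blockwise attention:
    one (block x block) score tile plus Q/K/V/output blocks of width d.
    Note: the total sequence length does not appear anywhere."""
    return (block * block + 4 * block * d) * bytes_per_el

# e.g. 1M total tokens split across 256 devices (4,096-token blocks, d=128):
print(f"{per_device_bytes(4_096, 128) / 2**20:.0f} MiB per device")
```

    Under these assumptions each device needs on the order of tens of MiB of attention activations, whether the full sequence is one million tokens or one hundred million; only the device count changes.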

    Their experiments show that Ring Attention reduces the memory requirements of Transformers, enabling training on sequences more than 500 times longer than prior memory-efficient state-of-the-art methods. The technique also permits training on sequences exceeding 100 million tokens without making approximations to attention. Because Ring Attention removes the memory constraints imposed by individual devices, one can in principle achieve near-infinite context sizes; however, many devices are required, since the supported sequence length is proportional to the number of devices.

    The study only evaluates the effectiveness of the technique, without large-scale model training. Since the achievable context length depends on the number of devices, the model's efficiency depends on optimization; so far the researchers have worked only on the low-level operations required to reach optimal compute performance. They say they intend to work on both maximum sequence length and maximum compute performance in the future. The possibility of near-infinite context opens up many exciting opportunities, such as large video-audio-language models, learning from extended feedback and trial and error, understanding and generating codebases, and adapting AI models to understand scientific data such as gene sequences.


    Check out the Paper. All credit for this research goes to the researchers on this project. Also, don't forget to join our 31k+ ML SubReddit, 40k+ Facebook community, Discord channel, and email newsletter, where we share the latest AI research news, cool AI projects, and more.



    Arshad is an intern at MarktechPost. He is currently pursuing his Int. MSc in Physics at the Indian Institute of Technology Kharagpur. He believes that understanding things at a fundamental level leads to new discoveries, which in turn lead to advancements in technology, and he is passionate about understanding nature with the help of tools like mathematical models, ML models, and AI.


