    A New AI Research from Apple and Equall AI Uncovers Redundancies in Transformer Architecture: How Streamlining the Feed Forward Network Boosts Efficiency and Accuracy


    The Transformer architecture, which has become widespread in recent years, has established itself as the standard approach for Natural Language Processing (NLP) tasks, particularly Machine Translation (MT). The architecture has shown impressive scaling properties: adding more model parameters yields better performance across a wide range of NLP tasks, an observation validated by numerous studies. Although Transformers excel at scaling, there is a parallel push to make these models more efficient and deployable in the real world, which means addressing latency, memory use, and disk space.

    Researchers have been actively investigating techniques to address these issues, including component pruning, parameter sharing, and dimensionality reduction. The widely used Transformer architecture comprises several essential components, two of the most important being the Feed Forward Network (FFN) and Attention.

    1. Attention – The Attention mechanism allows the model to capture relationships and dependencies between words in a sentence, regardless of their positions. It helps the model determine which parts of the input text are most relevant to the word it is currently processing, and understanding the context and connections between words in a sentence depends on it.
    1. Feed Forward Network (FFN): The FFN transforms each input token independently and non-linearly. It adds complexity and expressiveness to the model's representation of each word by applying a position-wise transformation to each word's representation. A sketch of how the two components fit together in an encoder layer follows this list.
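    To make the two components concrete, here is a minimal, illustrative PyTorch sketch of a standard Transformer encoder layer. The dimensions and layer choices (d_model=512, d_ff=2048, ReLU, post-layer-norm) are generic assumptions for illustration, not the configuration used in the paper.

    ```python
    import torch
    import torch.nn as nn

    class EncoderLayer(nn.Module):
        """A standard encoder layer: self-attention followed by a position-wise FFN."""
        def __init__(self, d_model=512, n_heads=8, d_ff=2048):
            super().__init__()
            # Attention: lets every token attend to every other token in the sequence.
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            # FFN: a two-layer MLP applied to each token representation independently.
            self.ffn = nn.Sequential(
                nn.Linear(d_model, d_ff),
                nn.ReLU(),
                nn.Linear(d_ff, d_model),
            )
            self.norm1 = nn.LayerNorm(d_model)
            self.norm2 = nn.LayerNorm(d_model)

        def forward(self, x):                      # x: (batch, seq_len, d_model)
            attn_out, _ = self.attn(x, x, x)       # self-attention
            x = self.norm1(x + attn_out)           # residual + norm
            x = self.norm2(x + self.ffn(x))        # position-wise FFN, residual + norm
            return x
    ```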

    In recent research, a team of researchers focused on the role of the FFN within the Transformer architecture. They found that, despite being a large component of the model that consumes a significant share of its parameters, the FFN exhibits a high degree of redundancy. They discovered that they could reduce the model's parameter count without significantly compromising accuracy by removing the FFN from the decoder layers and instead using a single FFN shared across the encoder layers.

    1. Decoder Layers: In a standard Transformer model, every encoder and decoder layer has its own FFN. The researchers eliminated the FFN from the decoder layers.
    1. Encoder Layers: Rather than giving each encoder layer its own FFN, they used a single FFN shared by all encoder layers, as in the sketch after this list.
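    The sketch below illustrates the idea under the same illustrative assumptions as before; it is a simplification of the approach described in the article, not the authors' implementation. One FFN object is reused by every encoder layer, and the decoder layers keep only self-attention and cross-attention, with no FFN.

    ```python
    import torch
    import torch.nn as nn

    class SharedFFNEncoder(nn.Module):
        """Encoder stack in which all layers reuse one shared FFN."""
        def __init__(self, num_layers=6, d_model=512, n_heads=8, d_ff=2048):
            super().__init__()
            self.attn_layers = nn.ModuleList(
                [nn.MultiheadAttention(d_model, n_heads, batch_first=True)
                 for _ in range(num_layers)]
            )
            # A single FFN whose parameters are reused at every encoder layer.
            self.shared_ffn = nn.Sequential(
                nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
            )
            self.norms1 = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(num_layers)])
            self.norms2 = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(num_layers)])

        def forward(self, x):
            for attn, n1, n2 in zip(self.attn_layers, self.norms1, self.norms2):
                a, _ = attn(x, x, x)
                x = n1(x + a)
                x = n2(x + self.shared_ffn(x))   # same FFN weights at every layer
            return x

    class NoFFNDecoderLayer(nn.Module):
        """Decoder layer with the FFN removed: self-attention + cross-attention only."""
        def __init__(self, d_model=512, n_heads=8):
            super().__init__()
            self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.norm1 = nn.LayerNorm(d_model)
            self.norm2 = nn.LayerNorm(d_model)

        def forward(self, y, memory):            # memory: encoder output
            a, _ = self.self_attn(y, y, y)
            y = self.norm1(y + a)
            c, _ = self.cross_attn(y, memory, memory)
            y = self.norm2(y + c)
            return y
    ```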

    The researchers report the following benefits of this approach.

    1. Parameter Reduction: Deleting and sharing the FFN components drastically reduced the number of parameters in the model.
    1. Modest Accuracy Loss: The model's accuracy dropped only slightly despite the removal of a large number of parameters, which shows that the encoder's many FFNs and the decoder's FFN are functionally redundant to a degree.
    1. Scaling Back: They then expanded the hidden dimension of the shared FFN to restore the architecture to its previous size while maintaining, or even improving, model performance. Compared with the previous large-scale Transformer model, this yielded appreciable gains in both accuracy and processing speed, i.e., latency. The back-of-the-envelope arithmetic after this list illustrates the parameter trade-off.
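    The following rough calculation makes the trade-off concrete. The dimensions (d_model=512, d_ff=2048, 6 encoder and 6 decoder layers) are illustrative assumptions for a base-sized model, not the paper's actual configurations.

    ```python
    # Back-of-the-envelope FFN parameter counts (biases ignored for simplicity).
    d_model, d_ff, enc_layers, dec_layers = 512, 2048, 6, 6

    ffn_params = 2 * d_model * d_ff                        # two weight matrices per FFN
    baseline   = (enc_layers + dec_layers) * ffn_params    # one FFN in every layer
    shared     = 1 * ffn_params                            # one shared encoder FFN, no decoder FFNs

    print(f"baseline FFN params: {baseline / 1e6:.1f}M")   # ~25.2M
    print(f"shared-FFN params:   {shared / 1e6:.1f}M")     # ~2.1M

    # "Scaling back": widen the single shared FFN until its parameter budget matches
    # the baseline, which the article says recovers or improves accuracy at lower latency.
    d_ff_wide = d_ff * (enc_layers + dec_layers)           # 2048 * 12 = 24576
    print(f"widened shared FFN:  {2 * d_model * d_ff_wide / 1e6:.1f}M")  # ~25.2M
    ```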

    In conclusion, this research shows that the Feed Forward Network in the Transformer design, particularly in the decoder layers, can be streamlined and shared without significantly affecting model performance. This not only reduces the model's computational load but also improves its efficiency and applicability across NLP applications.


    Check out the Paper. All credit for this research goes to the researchers on this project. Also, don't forget to join our 30k+ ML SubReddit, 40k+ Facebook Community, Discord Channel, and Email Newsletter, where we share the latest AI research news, cool AI projects, and more.



    Tanya Malhotra is a final-year undergraduate at the University of Petroleum & Energy Studies, Dehradun, pursuing a BTech in Computer Science Engineering with a specialization in Artificial Intelligence and Machine Learning.
    She is a Data Science enthusiast with strong analytical and critical thinking skills, along with a keen interest in acquiring new skills, leading teams, and managing work in an organized manner.


