

Paper author hankai (hankaixyz) commented on Mar 7, 2024:

The code is released: https://github.com/WailordHe/DenseSSM
Papers
arxiv:2403.00818

DenseMamba: State Space Models with Dense Hidden Connection for Efficient Large Language Models

Published on Feb 26, 2024
· Submitted by AK on Mar 5, 2024
Authors: Wei He, Kai Han, Yehui Tang, Chengcheng Wang, Yujie Yang, Tianyu Guo, Yunhe Wang
Abstract

DenseSSM enhances state space model architectures by integrating shallow-layer hidden states into deeper layers, improving performance without increasing model size.

AI-generated summary

Large language models (LLMs) face a daunting challenge due to the excessive computational and memory requirements of the commonly used Transformer architecture. While state space models (SSMs) are a newer class of foundational network architecture offering lower computational complexity, their performance has yet to fully rival that of Transformers. This paper introduces DenseSSM, a novel approach to enhancing the flow of hidden information between layers in SSMs. By selectively integrating shallow-layer hidden states into deeper layers, DenseSSM retains fine-grained information crucial for the final output. Despite the added dense connections, DenseSSM maintains training parallelizability and inference efficiency. The proposed method is widely applicable to various SSM types such as RetNet and Mamba. At similar model sizes, DenseSSM achieves significant improvements; for example, DenseRetNet outperforms the original RetNet by up to 5% in accuracy on public benchmarks.
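The core idea, fusing shallow-layer hidden states into deeper layers of an SSM, can be sketched with a toy NumPy recurrence. Everything below (the tensor sizes, the simple additive fusion via `W_dense`, and the parameter shapes) is an illustrative assumption for exposition, not the paper's actual implementation; see the released code for the real architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def ssm_step(h_prev, x, A, B):
    # Standard diagonal SSM recurrence: h_t = A * h_{t-1} + B @ x_t
    return A * h_prev + B @ x

# Toy configuration (sizes chosen for illustration only)
d_model, d_state, n_layers = 8, 16, 4
n_dense = 2   # how many preceding shallow layers to fuse into each layer
T = 5         # sequence length

# Per-layer SSM parameters (random for the sketch)
A = [rng.uniform(0.5, 0.9, d_state) for _ in range(n_layers)]
B = [rng.normal(0, 0.1, (d_state, d_model)) for _ in range(n_layers)]
C = [rng.normal(0, 0.1, (d_model, d_state)) for _ in range(n_layers)]
# Hypothetical projections mapping a shallow layer's hidden state
# into the current layer's state space
W_dense = [rng.normal(0, 0.1, (d_state, d_state)) for _ in range(n_layers)]

x_seq = rng.normal(0, 1, (T, d_model))

h = [np.zeros(d_state) for _ in range(n_layers)]
for t in range(T):
    x = x_seq[t]
    new_h = []
    for l in range(n_layers):
        h_l = ssm_step(h[l], x, A[l], B[l])
        # Dense hidden connection: add projected hidden states from up to
        # n_dense shallower layers at the same time step
        for j in range(max(0, l - n_dense), l):
            h_l = h_l + W_dense[l] @ new_h[j]
        new_h.append(h_l)
        x = C[l] @ h_l  # layer output feeds the next layer
    h = new_h

print(x.shape)  # (8,)
```

Because the fusion only reads hidden states already computed at the same time step, the recurrence stays a per-step scan, which is why (as the abstract notes) dense connections need not break training parallelizability or inference efficiency.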

Community

This is an automated message from the Librarian Bot. I found the following papers similar to this paper.

The following papers were recommended by the Semantic Scholar API

Please give a thumbs up to this comment if you found it helpful!

If you want recommendations for any paper on Hugging Face, check out this Space

You can directly ask Librarian Bot for paper recommendations by tagging it in a comment: @librarian-bot recommend



Models citing this paper 2

Datasets citing this paper 0

No dataset linking this paper

Cite arxiv.org/abs/2403.00818 in a dataset README.md to link it from this page.

Spaces citing this paper 0

No Space linking this paper

Cite arxiv.org/abs/2403.00818 in a Space README.md to link it from this page.

Collections including this paper 10
