
Papers
arxiv:2401.08740

SiT: Exploring Flow and Diffusion-based Generative Models with Scalable Interpolant Transformers

Published on Jan 16, 2024
· Submitted by AK on Jan 18, 2024
Authors:
Nanye Ma, Mark Goldstein, Michael S. Albergo, Nicholas M. Boffi, Eric Vanden-Eijnden, Saining Xie
Abstract

Scalable Interpolant Transformers (SiT), built on Diffusion Transformers (DiT), enhance generative models through a flexible interpolant framework, achieving superior performance on ImageNet 256x256 with tunable diffusion coefficients.

AI-generated summary

We present Scalable Interpolant Transformers (SiT), a family of generative models built on the backbone of Diffusion Transformers (DiT). The interpolant framework, which allows for connecting two distributions in a more flexible way than standard diffusion models, makes possible a modular study of various design choices impacting generative models built on dynamical transport: using discrete vs. continuous time learning, deciding the objective for the model to learn, choosing the interpolant connecting the distributions, and deploying a deterministic or stochastic sampler. By carefully introducing the above ingredients, SiT surpasses DiT uniformly across model sizes on the conditional ImageNet 256x256 benchmark using the exact same backbone, number of parameters, and GFLOPs. By exploring various diffusion coefficients, which can be tuned separately from learning, SiT achieves an FID-50K score of 2.06.
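The abstract describes three of the design choices SiT makes modular: the interpolant connecting the noise and data distributions, the velocity objective the model learns, and the sampler used at generation time. As a minimal sketch (not the paper's implementation), the snippet below illustrates a linear interpolant with a continuous-time velocity-matching loss and a deterministic Euler sampler; the `model` callable is a stand-in for the SiT transformer backbone, and all function names here are illustrative.

```python
import random

def interpolant(x0, x1, t):
    # Linear interpolant x_t = (1 - t) * x0 + t * x1 (one choice among many);
    # its time derivative, x1 - x0, is the velocity target.
    xt = [(1.0 - t) * a + t * b for a, b in zip(x0, x1)]
    velocity = [b - a for a, b in zip(x0, x1)]
    return xt, velocity

def velocity_matching_loss(model, x1, rng):
    # Sample a Gaussian noise endpoint and a continuous time t in [0, 1],
    # then compare the predicted velocity to the true one (MSE).
    x0 = [rng.gauss(0.0, 1.0) for _ in x1]
    t = rng.random()
    xt, v_target = interpolant(x0, x1, t)
    v_pred = model(xt, t)
    return sum((p - q) ** 2 for p, q in zip(v_pred, v_target)) / len(x1)

def sample_euler(model, x, steps):
    # Deterministic sampler: integrate dx/dt = v(x, t) with Euler steps.
    dt = 1.0 / steps
    t = 0.0
    for _ in range(steps):
        v = model(x, t)
        x = [xi + dt * vi for xi, vi in zip(x, v)]
        t += dt
    return x

# Toy usage: a zero-velocity "model" stands in for the trained backbone.
rng = random.Random(0)
data = [rng.gauss(0.0, 1.0) for _ in range(8)]
loss = velocity_matching_loss(lambda xt, t: [0.0] * len(xt), data, rng)
samples = sample_euler(lambda x, t: [0.0] * len(x), data, steps=4)
```

A stochastic sampler would instead integrate an SDE whose diffusion coefficient, as the abstract notes, can be tuned independently of training.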

Community

This is an automated message from the Librarian Bot. I found the following papers similar to this paper.

The following papers were recommended by the Semantic Scholar API

Please give a thumbs up to this comment if you found it helpful!

If you want recommendations for any paper on Hugging Face, check out this Space

Sign up or log in to comment

Models citing this paper 0

No model linking this paper

Cite arxiv.org/abs/2401.08740 in a model README.md to link it from this page.

Datasets citing this paper 1

Spaces citing this paper 0

No Space linking this paper

Cite arxiv.org/abs/2401.08740 in a Space README.md to link it from this page.

Collections including this paper 5
