Papers
arxiv:2406.08464

Magpie: Alignment Data Synthesis from Scratch by Prompting Aligned LLMs with Nothing

Published on Jun 12, 2024
· Submitted by AK on Jun 13, 2024
#1 Paper of the day
Authors: Zhangchen Xu, Fengqing Jiang, Luyao Niu, Yuntian Deng, Radha Poovendran, Yejin Choi, Bill Yuchen Lin

Abstract

High-quality instruction data is critical for aligning large language models (LLMs). Although some models, such as Llama-3-Instruct, have open weights, their alignment data remain private, which hinders the democratization of AI. High human labor costs and a limited, predefined scope for prompting prevent existing open-source data creation methods from scaling effectively, potentially limiting the diversity and quality of public alignment datasets. Is it possible to synthesize high-quality instruction data at scale by extracting it directly from an aligned LLM? We present a self-synthesis method for generating large-scale alignment data named Magpie. Our key observation is that aligned LLMs like Llama-3-Instruct can generate a user query when we input only the left-side templates up to the position reserved for user messages, thanks to their auto-regressive nature. We use this method to prompt Llama-3-Instruct and generate 4 million instructions along with their corresponding responses. We perform a comprehensive analysis of the extracted data and select 300K high-quality instances. To compare Magpie data with other public instruction datasets, we fine-tune Llama-3-8B-Base with each dataset and evaluate the performance of the fine-tuned models. Our results indicate that in some tasks, models fine-tuned with Magpie perform comparably to the official Llama-3-8B-Instruct, despite the latter being enhanced with 10 million data points through supervised fine-tuning (SFT) and subsequent feedback learning. We also show that using Magpie solely for SFT can surpass the performance of previous public datasets utilized for both SFT and preference optimization, such as direct preference optimization with UltraFeedback. This advantage is evident on alignment benchmarks such as AlpacaEval, ArenaHard, and WildBench.

AI-generated summary

Magpie, a self-synthesis method, generates high-quality instruction data from aligned LLMs, enabling competitive performance in alignment benchmarks compared to models fine-tuned with larger, supervised datasets.
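
The pre-query trick the abstract describes is easy to reproduce. Below is a minimal sketch, assuming the meta-llama/Meta-Llama-3-8B-Instruct chat template; the sampling settings are illustrative, not the paper's exact configuration:

```python
# Magpie-style sketch: prompt an aligned chat model with only the
# "left-side" template up to the user slot, let it invent a user query,
# then answer that query through the normal chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # gated; any aligned chat model works
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
eot_id = tokenizer.convert_tokens_to_ids("<|eot_id|>")

# Step 1: the pre-query template ends exactly where a user message would start,
# so the auto-regressive model fills in a plausible user query.
pre_query = "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
inputs = tokenizer(pre_query, return_tensors="pt", add_special_tokens=False).to(model.device)
out = model.generate(
    **inputs, max_new_tokens=128, do_sample=True, temperature=1.0, eos_token_id=eot_id
)
instruction = tokenizer.decode(
    out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
).strip()

# Step 2: feed the synthesized instruction back in as a normal user turn
# to collect the paired response.
chat = [{"role": "user", "content": instruction}]
prompt_ids = tokenizer.apply_chat_template(
    chat, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
out = model.generate(prompt_ids, max_new_tokens=512, do_sample=True, temperature=0.6, top_p=0.9)
response = tokenizer.decode(out[0][prompt_ids.shape[1]:], skip_special_tokens=True).strip()

print({"instruction": instruction, "response": response})
```

The paper runs this extraction at scale to produce 4 million instruction-response pairs and then filters them down to 300K high-quality instances; the sketch above covers only the extraction step, not the filtering.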

Community

Paper author

https://huggingface.co/Magpie-Align

Very cool! FYI, you can add a link to the paper in the model cards so they are linked from here automatically.

·
Paper author

yes, will do! thanks!
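
For reference, a hedged sketch of doing this programmatically with huggingface_hub (the repo id below is a placeholder, and appending the paper link to the card body is just one way for the Hub to pick up the association):

```python
# Append the paper link to a model card so the Hub can associate the
# model with the paper page. The repo id is hypothetical.
from huggingface_hub import ModelCard

repo_id = "your-org/your-magpie-model"  # placeholder, not a real repo
card = ModelCard.load(repo_id)
paper_link = "[Magpie (arXiv:2406.08464)](https://arxiv.org/abs/2406.08464)"
if "2406.08464" not in card.text:
    card.text += f"\n\n{paper_link}\n"
    card.push_to_hub(repo_id)  # requires write access to the repo
```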

For anyone interested in generating multilingual instruct datasets with Magpie,
I have written a simple article on this topic:
🇮🇹🇯🇵🇧🇷 https://huggingface.co/blog/anakin87/multilingual-magpie
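
A rough sketch of the idea, as an assumption about the general approach rather than the article's exact recipe: steer the invented query toward a target language by placing a system turn in that language ahead of the empty user header.

```python
# Multilingual variant of the pre-query trick; the system prompt wording
# is illustrative, and the linked article's exact recipe may differ.
def multilingual_pre_query(system_prompt: str) -> str:
    """Build a Llama-3-style pre-query template with a language-steering system turn."""
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system_prompt}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
    )

# e.g. Italian; feed this string into the same two-step loop sketched above.
print(multilingual_pre_query("Sei un assistente utile. Rispondi sempre in italiano."))
```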

Models citing this paper 77

Datasets citing this paper 198

Spaces citing this paper 45

Collections including this paper 24
