arxiv:2009.02085

About Graph Degeneracy, Representation Learning and Scalability

Published on Sep 4, 2020
Authors: Simon Brandeis, Adrian Jarret, Pierre Sevestre

AI-generated summary

The use of K-Core Decomposition reduces the time and memory consumption of walk-based Graph Representation Learning algorithms, as evaluated on academic datasets.

Abstract

Graphs or networks are a very convenient way to represent data with lots of interactions. Recently, machine learning on graph data has gained a lot of traction. In particular, vertex classification and missing edge detection have very interesting applications, ranging from drug discovery to recommender systems. To achieve such tasks, tremendous work has been accomplished to learn embeddings of nodes and edges into finite-dimensional vector spaces. This task is called Graph Representation Learning. However, Graph Representation Learning techniques often display prohibitive time and memory complexities, preventing their use in real time on business-size graphs. In this paper, we address this issue by leveraging a degeneracy property of graphs: the K-Core Decomposition. We present two techniques taking advantage of this decomposition to reduce the time and memory consumption of walk-based Graph Representation Learning algorithms. We evaluate the performance, expressed in terms of embedding quality and computational resources, of the proposed techniques on several academic datasets. Our code is available at https://github.com/SBrandeis/kcore-embedding
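For intuition, here is a minimal sketch in Python of the general idea, assuming networkx: compute the graph's k-core to prune peripheral nodes, then generate random walks on the reduced graph (the sequences a DeepWalk-style embedding model would consume). The helper names and parameters below are illustrative assumptions, not the paper's implementation; see the linked repository for the actual code.

    import random
    import networkx as nx

    def k_core_reduce(G, k):
        # Keep the k-core subgraph: nx.k_core iteratively strips nodes
        # of degree < k until every remaining node has degree >= k.
        return nx.k_core(G, k=k)

    def random_walks(G, num_walks=10, walk_length=20):
        # Uniform random walks over the (reduced) graph; walk-based
        # methods feed these sequences to a skip-gram-style model.
        walks = []
        nodes = list(G.nodes())
        for _ in range(num_walks):
            random.shuffle(nodes)
            for start in nodes:
                walk = [start]
                while len(walk) < walk_length:
                    neighbors = list(G.neighbors(walk[-1]))
                    if not neighbors:
                        break
                    walk.append(random.choice(neighbors))
                walks.append(walk)
        return walks

    G = nx.karate_club_graph()
    core = k_core_reduce(G, k=2)  # prune the periphery before walking
    print(G.number_of_nodes(), "nodes reduced to", core.number_of_nodes())
    walks = random_walks(core)    # walks now run on the smaller graph

Walking the smaller core rather than the full graph is where the time and memory savings come from; the sketch omits the follow-up step of assigning embeddings to the pruned, low-core nodes, which is the part the paper's two techniques address.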

Community

sbrandeis (Paper author): @julien-c what do you think of this?

julien-c: @sbrandeis it's just 🤯


Models citing this paper 0

No model linking this paper

Cite arxiv.org/abs/2009.02085 in a model README.md to link it from this page.

Datasets citing this paper 0

No dataset linking this paper

Cite arxiv.org/abs/2009.02085 in a dataset README.md to link it from this page.

Spaces citing this paper 0

No Space linking this paper

Cite arxiv.org/abs/2009.02085 in a Space README.md to link it from this page.

Collections including this paper 0

No Collection including this paper

Add this paper to a collection to link it from this page.