
[R] A PyTorch implementation of "A Higher-Order Graph Convolutional Layer" (NeurIPS 2018).

PyTorch: https://github.com/benedekrozemberczki/NGCN

Paper: http://sami.haija.org/papers/high-order-gc-layer.pdf

Abstract:

Recent methods generalize convolutional layers from Euclidean domains to graph-structured data by approximating the eigenbasis of the graph Laplacian. The computationally efficient and broadly used Graph ConvNet of Kipf & Welling over-simplifies the approximation, effectively rendering graph convolution as a neighborhood-averaging operator. This simplification restricts the model from learning delta operators, the very premise of the graph Laplacian. In this work, we propose a new Graph Convolutional layer which mixes multiple powers of the adjacency matrix, allowing it to learn delta operators. Our layer exhibits the same memory footprint and computational complexity as a GCN. We illustrate the strength of our proposed layer on both synthetic graph datasets, and on several real-world citation graphs, setting the record state-of-the-art on Pubmed.
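The core idea from the abstract, mixing multiple powers of the normalized adjacency matrix so the layer can learn difference ("delta") operators rather than just neighborhood averaging, can be sketched in a few lines of PyTorch. This is an illustrative sketch, not code from the linked repo; the class name, the `powers` parameter, and the choice of concatenating branch outputs are my assumptions:

```python
import torch
import torch.nn as nn

class MixedPowerGCNLayer(nn.Module):
    """Illustrative higher-order graph convolution: each branch propagates
    features through a different power of the normalized adjacency matrix,
    and the branch outputs are concatenated. Hypothetical sketch, not the
    authors' implementation."""

    def __init__(self, in_features, out_features, powers=(0, 1, 2)):
        super().__init__()
        self.powers = powers
        # One weight matrix per adjacency power j in the mix.
        self.weights = nn.ModuleList(
            nn.Linear(in_features, out_features, bias=False) for _ in powers
        )

    def forward(self, x, adj_norm):
        # x: node features (n x in_features)
        # adj_norm: dense symmetrically normalized adjacency matrix (n x n)
        outputs = []
        for j, linear in zip(self.powers, self.weights):
            h = linear(x)
            for _ in range(j):
                # Apply adj_norm^j by repeated multiplication; j = 0 keeps
                # the raw (transformed) features, so the model can subtract
                # a node's own signal from its neighborhood average.
                h = adj_norm @ h
            outputs.append(h)
        # Concatenating branches at different powers lets downstream weights
        # form differences between neighborhood scales (delta operators).
        return torch.relu(torch.cat(outputs, dim=1))
```

With `powers=(0, 1, 2)` the output width is three times `out_features`; a subsequent linear layer can then combine the scales. Memory and compute stay close to a plain GCN because each power is applied by sparse/dense matrix products rather than materializing adj_norm^j.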


Posted
5 years ago