
Neural Graph Collaborative Filtering

This is my PyTorch implementation for the paper:

Xiang Wang, Xiangnan He, Meng Wang, Fuli Feng, and Tat-Seng Chua (2019). Neural Graph Collaborative Filtering, Paper in ACM DL or Paper in arXiv. In SIGIR'19, Paris, France, July 21-25, 2019.

The TensorFlow implementation can be found here.

Introduction

My implementation mainly refers to the original TensorFlow implementation and uses the same evaluation metrics as the original project. Here is an example result on the Gowalla dataset:

Best Iter=[38]@[32904.5]	recall=[0.15571	0.21793	0.26385	0.30103	0.33170], precision=[0.04763	0.03370	0.02744	0.02359	0.02088], hit=[0.53996	0.64559	0.70464	0.74546	0.77406], ndcg=[0.22752	0.26555	0.29044	0.30926	0.32406]
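For clarity, here is a minimal, illustrative sketch of how recall@K and NDCG@K can be computed for a single user's ranked list; the function and variable names are hypothetical and not necessarily those used in this repository's evaluation code.

    import numpy as np

    def recall_and_ndcg_at_k(ranked_items, ground_truth, k):
        # ranked_items: item ids sorted by predicted score, highest first.
        # ground_truth: set of test items the user actually interacted with.
        top_k = ranked_items[:k]
        hits = np.array([1.0 if item in ground_truth else 0.0 for item in top_k])

        # recall@K: fraction of the user's test items recovered in the top K.
        recall = hits.sum() / len(ground_truth)

        # NDCG@K: DCG of the predicted ranking divided by the ideal DCG
        # (all relevant items ranked first).
        dcg = np.sum(hits / np.log2(np.arange(2, k + 2)))
        ideal_hits = min(len(ground_truth), k)
        idcg = np.sum(1.0 / np.log2(np.arange(2, ideal_hits + 2)))
        ndcg = dcg / idcg if idcg > 0 else 0.0

        return recall, ndcg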

Hope it can help you!

Environment Requirement

The code has been tested under Python 3.6.9. The required packages are as follows:

  • pytorch == 1.3.1
  • numpy == 1.18.1
  • scipy == 1.3.2
  • sklearn == 0.21.3

Example to Run the Codes

The supported command-line arguments are documented in the code (see the parser function in NGCF/utility/parser.py); a sketch of how they are parsed follows the example commands below.

  • Gowalla dataset
python main.py --dataset gowalla --regs [1e-5] --embed_size 64 --layer_size [64,64,64] --lr 0.0001 --save_flag 1 --pretrain 0 --batch_size 1024 --epoch 400 --verbose 1 --node_dropout [0.1] --mess_dropout [0.1,0.1,0.1]
  • Amazon-book dataset
python main.py --dataset amazon-book --regs [1e-5] --embed_size 64 --layer_size [64,64,64] --lr 0.0005 --save_flag 1 --pretrain 0 --batch_size 1024 --epoch 200 --verbose 50 --node_dropout [0.1] --mess_dropout [0.1,0.1,0.1]
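The bracketed arguments such as --layer_size [64,64,64] are passed as strings and turned into Python lists inside the program. Below is a minimal, hypothetical sketch of that pattern with argparse; the actual flags and defaults are defined in NGCF/utility/parser.py.

    import argparse
    import ast

    # Illustrative subset of the arguments used in the commands above.
    parser = argparse.ArgumentParser(description='NGCF arguments (sketch).')
    parser.add_argument('--dataset', default='gowalla')
    parser.add_argument('--regs', default='[1e-5]')
    parser.add_argument('--embed_size', type=int, default=64)
    parser.add_argument('--layer_size', default='[64,64,64]')
    parser.add_argument('--lr', type=float, default=0.0001)
    parser.add_argument('--node_dropout', default='[0.1]')
    parser.add_argument('--mess_dropout', default='[0.1,0.1,0.1]')
    args = parser.parse_args()

    # Bracketed strings become Python lists before they are used.
    layer_size = ast.literal_eval(args.layer_size)      # [64, 64, 64]
    node_dropout = ast.literal_eval(args.node_dropout)  # [0.1]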

Supplement

  • The parameter negative_slope of LeakyReLU was set to 0.2, since the default values in PyTorch and TensorFlow differ (see the snippet below).
  • If the argument node_dropout_flag is set to 1, the computational cost will be higher.
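As a small illustration of the first point above (not the repository's code): PyTorch's nn.LeakyReLU defaults to negative_slope=0.01, while TensorFlow's tf.nn.leaky_relu defaults to alpha=0.2, so the slope is set explicitly to match the original implementation.

    import torch
    import torch.nn as nn

    # PyTorch defaults to negative_slope=0.01; use 0.2 to match
    # TensorFlow's tf.nn.leaky_relu default used by the original NGCF code.
    leaky_relu = nn.LeakyReLU(negative_slope=0.2)

    x = torch.tensor([-1.0, 0.5])
    print(leaky_relu(x))  # tensor([-0.2000, 0.5000])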
