
pytorch-attention-mechanism

My code for learning attention mechanisms.

CNN with attention

Applies spatial attention to the CIFAR-100 dataset. A minimal illustrative sketch of such a model appears after the usage commands below.

Usage

Train the model:

$ python cnn-with-attention.py --train

Visualize attention map:

$ python cnn-with-attention.py --visualize
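
The sketch below shows one way a CNN with spatial attention could be wired up. The layer sizes, module names (SpatialAttention, AttnCNN), and the softmax-over-pixels scoring are assumptions for illustration, not the repository's exact implementation.

    # Minimal sketch of a CNN with spatial attention (illustrative only;
    # module names and sizes are assumptions, not this repo's exact code).
    import torch
    import torch.nn as nn

    class SpatialAttention(nn.Module):
        """Computes a per-pixel attention map and reweights the feature map."""
        def __init__(self, in_channels):
            super().__init__()
            # 1x1 conv projects features to a single-channel score map
            self.score = nn.Conv2d(in_channels, 1, kernel_size=1)

        def forward(self, x):                       # x: (B, C, H, W)
            scores = self.score(x)                  # (B, 1, H, W)
            attn = torch.softmax(scores.flatten(2), dim=-1).view_as(scores)
            return x * attn, attn                   # reweighted features + map

    class AttnCNN(nn.Module):
        """Small CNN for CIFAR-100 with a spatial attention block."""
        def __init__(self, num_classes=100):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),                    # 32x32 -> 16x16
                nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),                    # 16x16 -> 8x8
            )
            self.attention = SpatialAttention(128)
            self.classifier = nn.Linear(128, num_classes)

        def forward(self, x):
            feats = self.features(x)                # (B, 128, 8, 8)
            attended, attn_map = self.attention(feats)
            pooled = attended.mean(dim=(2, 3))      # global average pooling
            return self.classifier(pooled), attn_map

    if __name__ == "__main__":
        model = AttnCNN()
        logits, attn = model(torch.randn(2, 3, 32, 32))
        print(logits.shape, attn.shape)  # torch.Size([2, 100]) torch.Size([2, 1, 8, 8])

The per-location attention map returned alongside the logits is the kind of output a --visualize flag would overlay on the input image.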

RNN with attention

Applies temporal attention to sequential data, e.g. a sequence of length 20 whose output depends only on the 5th and the 13th positions. A minimal illustrative sketch appears after the usage commands below.

Usage

Train the model:

$ python rnn-with-attention.py --train

Visualize attention map:

$ python rnn-with-attention.py --visualize
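
Below is a minimal sketch of temporal attention over RNN hidden states. The GRU encoder, hidden size, and single-linear-layer scoring are assumptions for illustration rather than the repository's exact code.

    # Minimal sketch of an RNN with temporal attention (illustrative only;
    # the GRU encoder and layer sizes are assumptions, not this repo's code).
    import torch
    import torch.nn as nn

    class AttnRNN(nn.Module):
        """Encodes a sequence with a GRU, then attends over the time steps."""
        def __init__(self, input_size=1, hidden_size=64, output_size=1):
            super().__init__()
            self.rnn = nn.GRU(input_size, hidden_size, batch_first=True)
            self.attn_score = nn.Linear(hidden_size, 1)  # score per time step
            self.out = nn.Linear(hidden_size, output_size)

        def forward(self, x):                        # x: (B, T, input_size)
            h, _ = self.rnn(x)                       # h: (B, T, hidden_size)
            scores = self.attn_score(h).squeeze(-1)  # (B, T)
            attn = torch.softmax(scores, dim=1)      # attention weights over time
            context = (attn.unsqueeze(-1) * h).sum(dim=1)  # (B, hidden_size)
            return self.out(context), attn

    if __name__ == "__main__":
        model = AttnRNN()
        x = torch.randn(8, 20, 1)                    # batch of length-20 sequences
        y, attn = model(x)
        print(y.shape, attn.shape)  # torch.Size([8, 1]) torch.Size([8, 20])

For a target that depends only on two positions, the learned weights should concentrate there, which is what the attention-map visualization is meant to show.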

Todos

  • CNN+attention
  • RNN+attention
