MEMIT: Mass-Editing Memory in a Transformer

Editing thousands of facts into a transformer memory at once (ICLR 2023).

Table of Contents

- Installation
- MEMIT Algorithm Demo
- Running the Full Evaluation Suite
- How to Cite

Installation

We recommend conda for managing Python, CUDA, and PyTorch, and pip for everything else. To get started, install conda and run:

CONDA_HOME=$CONDA_HOME ./scripts/setup_conda.sh

$CONDA_HOME should be the path to your conda installation, e.g., ~/miniconda3.
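
After the script completes, a quick sanity check (not part of the repository's scripts) confirms that the environment's PyTorch build can see a CUDA device, which MEMIT needs for editing GPT-J-scale models:

# Environment sanity check; assumes the conda environment created above is active.
import torch

print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))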

MEMIT Algorithm Demo

notebooks/memit.ipynb demonstrates MEMIT. The API is simple: specify a requested rewrite of the following form:

request = [
    {
        "prompt": "{} plays the sport of",
        "subject": "LeBron James",
        "target_new": {
            "str": "football"
        }
    },
    {
        "prompt": "{} plays the sport of",
        "subject": "Michael Jordan",
        "target_new": {
            "str": "baseball"
        }
    },
]

Other similar examples are included in the notebook.
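
Outside the notebook, the edit can be applied programmatically along the following lines. This is a minimal sketch: it assumes the package exposes apply_memit_to_model and MEMITHyperParams (the entry points used in the notebook) and that the GPT-J hyperparameters live at hparams/MEMIT/EleutherAI_gpt-j-6B.json; check notebooks/memit.ipynb for the exact call signature.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed entry points; see notebooks/memit.ipynb for the canonical usage.
from memit import MEMITHyperParams, apply_memit_to_model

model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B").cuda()
tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")

request = [
    {
        "prompt": "{} plays the sport of",
        "subject": "LeBron James",
        "target_new": {"str": "football"},
    },
]

# Assumed hyperparameter path; adjust to your checkout.
hparams = MEMITHyperParams.from_json("hparams/MEMIT/EleutherAI_gpt-j-6B.json")

# Edits the model's MLP weights; return_orig_weights lets you restore the
# original parameters afterwards.
model_new, orig_weights = apply_memit_to_model(
    model, tok, request, hparams, return_orig_weights=True
)

# Spot-check: the edited model should now continue the prompt with the new target.
prompt = request[0]["prompt"].format(request[0]["subject"])
inputs = tok(prompt, return_tensors="pt").to(model_new.device)
print(tok.decode(model_new.generate(**inputs, max_new_tokens=5)[0]))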

Running the Full Evaluation Suite

experiments/evaluate.py can be used to evaluate any method in baselines/.

For example:

python3 -m experiments.evaluate \
    --alg_name=MEMIT \
    --model_name=EleutherAI/gpt-j-6B \
    --hparams_fname=EleutherAI_gpt-j-6B.json \
    --num_edits=10000 \
    --use_cache

Results from each run are stored at results/<method_name>/run_<run_id> in the following format:

results/
|__ MEMIT/
    |__ run_<run_id>/
        |__ params.json
        |__ case_0.json
        |__ case_1.json
        |__ ...
        |__ case_10000.json
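
Each case_<i>.json stores the evaluation record for one edited fact. For a quick look at a run before summarizing it, something like the sketch below works; the run directory name is a placeholder, and the JSON schema is simply whatever evaluate.py wrote:

import json
from pathlib import Path

run_dir = Path("results/MEMIT/run_000")  # substitute your actual run_<run_id>

# Print the top-level keys of the first few case records.
for case_file in sorted(run_dir.glob("case_*.json"))[:3]:
    with open(case_file) as f:
        case = json.load(f)
    print(case_file.name, "->", sorted(case))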

To summarize the results, you can use experiments/summarize.py:

python3 -m experiments.summarize --dir_name=MEMIT --runs=run_<run1>,run_<run2>

Running python3 -m experiments.evaluate -h or python3 -m experiments.summarize -h provides details about command-line flags.

How to Cite

@article{meng2022memit,
  title={Mass-Editing Memory in a Transformer},
  author={Kevin Meng and Sen Sharma, Arnab and Alex Andonian and Yonatan Belinkov and David Bau},
  journal={arXiv preprint arXiv:2210.07229},
  year={2022}
}
