
Update requirements.txt #2


Open · wants to merge 1 commit into main
Conversation

MattMuffin

Added prerequisite packages
@dribnet

dribnet commented Jul 28, 2023

This was helpful, but if I understand correctly, torch==1.13.1 also works.

@zifanw505
Collaborator

Hey, thanks for the PR. So far we only pin the one package whose version matters most in requirements.txt, which is transformers. The other packages should work across a range of versions. I will merge this PR in the future if more issues show that versioning plays a big role in the optimization.

Thanks.
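
For context, a requirements.txt following that policy would pin only the version-sensitive package. A minimal sketch (the 4.28.1 pin is taken from the resolver error quoted later in this thread; the actual file may contain more entries):

# requirements.txt (sketch): only the version-sensitive package is pinned
transformers==4.28.1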

@zifanw505 zifanw505 self-requested a review July 28, 2023 17:31
@OliverOffing

The existing requirements.txt wasn't enough for me and I had to manually run pip install torch.

@zifanw505
Collaborator

Yeah, you need to install PyTorch yourself. There are several PyTorch builds for different CUDA versions, so I'm inclined to leave the choice to users based on their own machines.
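
As a concrete illustration (the version and wheel index are assumptions based on dribnet's torch==1.13.1 note above, not an official pin), a CUDA-specific install looks like:

# PyTorch 1.13.1 built against CUDA 11.7; change cu117 to match your CUDA toolkit
pip install torch==1.13.1 --extra-index-url https://download.pytorch.org/whl/cu117

# CPU-only alternative
pip install torch==1.13.1 --extra-index-url https://download.pytorch.org/whl/cpu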

@tjaffee99

When I ran pip install -e . it gave me the following error:

INFO: pip is looking at multiple versions of fschat to determine which version is compatible with other requirements. This could take a while.
ERROR: Cannot install llm-attacks and llm-attacks==0.0.1 because these package versions have conflicting dependencies.

The conflict is caused by:
    llm-attacks 0.0.1 depends on transformers==4.28.1
    fschat 0.2.23 depends on transformers>=4.31.0

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts

The same thing happened when I attempted to install both llm-attacks and fastchat manually.

@zifanw505
Collaborator

@tjaffee99 Solved. Thanks for pointing out the issue. We did not have this problem before because fschat was not in the requirements.txt. I have pushed updated versions to the requirements.txt.
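
For reference, one mutually consistent set of pins would hold fschat at an earlier release that does not carry the transformers>=4.31.0 floor. A sketch (the exact versions pushed to the repository may differ):

# requirements.txt (illustrative): resolve the conflict by pinning fschat
transformers==4.28.1
fschat==0.2.20  # assumed: an earlier release still compatible with transformers==4.28.1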

AmirZur added a commit to AmirZur/llm-attacks that referenced this pull request May 9, 2024