
Cannot apply pipeline after installing transformers #30

Open
@SongYuan205

Description


Could someone advise how to resolve this issue?
Error message:

from transformers import pipeline
transcriber = pipeline(task="automatic-speech-recognition")
No model was supplied, defaulted to facebook/wav2vec2-base-960h and revision 55bb623 (https://huggingface.co/facebook/wav2vec2-base-960h).
Using a pipeline without specifying a model name and revision in production is not recommended.
D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\file_download.py:1150: FutureWarning: resume_download is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use force_download=True.
warnings.warn(
urllib3.exceptions.SSLError: TLS/SSL connection has been closed (EOF) (_ssl.c:1149)

The above exception was the direct cause of the following exception:

urllib3.exceptions.ProxyError: ('Unable to connect to proxy', SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1149)')))

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\requests\adapters.py", line 667, in send
resp = conn.urlopen(
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\urllib3\connectionpool.py", line 843, in urlopen
retries = retries.increment(
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\urllib3\util\retry.py", line 519, in increment
raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /facebook/wav2vec2-base-960h/resolve/main/config.json (Caused by ProxyError('Unable to connect to proxy', SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1149)'))))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "", line 1, in
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\transformers\pipelines_init_.py", line 750, in pipeline
config = AutoConfig.from_pretrained(model, _from_pipeline=task, **hub_kwargs, **model_kwargs)
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\transformers\models\auto\configuration_auto.py", line 916, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\transformers\configuration_utils.py", line 573, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\transformers\configuration_utils.py", line 628, in _get_config_dict
resolved_config_file = cached_file(
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\transformers\utils\hub.py", line 409, in cached_file
resolved_file = hf_hub_download(
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\utils_deprecation.py", line 101, in inner_f
return f(*args, **kwargs)
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\utils_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\file_download.py", line 1240, in hf_hub_download
return _hf_hub_download_to_cache_dir(
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\file_download.py", line 1303, in _hf_hub_download_to_cache_dir
(url_to_download, etag, commit_hash, expected_size, head_call_error) = _get_metadata_or_catch_error(
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\file_download.py", line 1751, in _get_metadata_or_catch_error
metadata = get_hf_file_metadata(
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\utils_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\file_download.py", line 1673, in get_hf_file_metadata
r = _request_wrapper(
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\file_download.py", line 376, in _request_wrapper
response = _request_wrapper(
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\file_download.py", line 399, in _request_wrapper
response = get_session().request(method=method, url=url, **params)
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\huggingface_hub\utils_http.py", line 66, in send
return super().send(request, *args, **kwargs)
File "D:\Anaconda3\envs\diag_extract\lib\site-packages\requests\adapters.py", line 694, in send
raise ProxyError(e, request=request)
requests.exceptions.ProxyError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /facebook/wav2vec2-base-960h/resolve/main/config.json (Caused by ProxyError('Unable to connect to proxy', SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1149)'))))"), '(Request ID: c10d4da6-29d7-40d8-b03c-66c843815d1a)')
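
The traceback shows that the pipeline itself is fine; it fails while downloading facebook/wav2vec2-base-960h from huggingface.co, because the configured system proxy drops the TLS connection. Below is a minimal sketch of two common workarounds, not a confirmed fix: clearing the broken proxy settings (optionally pointing HF_ENDPOINT at a mirror) before importing transformers, or loading the model from a local directory. The mirror URL and the local path ./wav2vec2-base-960h are assumptions for illustration, not part of the original report.

import os

# Workaround 1 (assumption): remove the broken proxy variables, or point
# HF_ENDPOINT at a reachable mirror, BEFORE importing transformers /
# huggingface_hub so the new settings are picked up.
os.environ.pop("HTTP_PROXY", None)
os.environ.pop("HTTPS_PROXY", None)
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"  # example mirror, an assumption

from transformers import pipeline

# Pin the model and revision explicitly instead of relying on the default,
# as the warning in the log also recommends.
transcriber = pipeline(
    task="automatic-speech-recognition",
    model="facebook/wav2vec2-base-960h",
    revision="55bb623",
)

# Workaround 2 (assumption): download the model once on a machine with
# working network access and load it from a local directory, which needs
# no network at all.
# transcriber = pipeline(
#     task="automatic-speech-recognition",
#     model="./wav2vec2-base-960h",  # hypothetical local path
# )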
