
Error occurred when running the example #1865

Closed
1 task done
lianggx opened this issue Dec 21, 2023 · 1 comment
lianggx commented Dec 21, 2023

Describe the bug
Failed to load https://file.hankcs.com/hanlp/dep/ctb9_dep_electra_small_20220216_100306.zip
If the problem still persists, please submit an issue to https://github.com/hankcs/HanLP/issues
When reporting an issue, make sure to paste the FULL ERROR LOG below.

import hanlp
HanLP = hanlp.pipeline() \
    .append(hanlp.utils.rules.split_sentence, output_key='sentences') \
    .append(hanlp.load('FINE_ELECTRA_SMALL_ZH'), output_key='tok') \
    .append(hanlp.load('CTB9_POS_ELECTRA_SMALL'), output_key='pos') \
    .append(hanlp.load('MSRA_NER_ELECTRA_SMALL_ZH'), output_key='ner', input_key='tok') \
    .append(hanlp.load('CTB9_DEP_ELECTRA_SMALL', conll=0), output_key='dep', input_key='tok')\
    .append(hanlp.load('CTB9_CON_ELECTRA_SMALL'), output_key='con', input_key='tok')
text = HanLP('2021年HanLPv2.1为生产环境带来次世代最先进的多语种NLP技术。阿婆主来到北京立方庭参观自然语义科技公司。')
print(text)
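(For reference, `hanlp.pipeline()` chains each component over a shared document dict: every step reads the previous step's `output_key` unless `input_key` overrides it, and writes its result under its own `output_key`. A rough pure-Python sketch of that chaining semantics — not HanLP's actual implementation:)

```python
class Pipeline:
    """Minimal sketch of hanlp.pipeline()-style chaining over a shared dict."""

    def __init__(self):
        self.steps = []
        self._last_key = "text"  # the raw input is stored under 'text'

    def append(self, fn, output_key, input_key=None):
        # Default input is the previous step's output, as in HanLP's pipeline.
        self.steps.append((fn, output_key, input_key or self._last_key))
        self._last_key = output_key
        return self  # enable method chaining

    def __call__(self, text):
        doc = {"text": text}
        for fn, out_key, in_key in self.steps:
            doc[out_key] = fn(doc[in_key])
        return doc


# Toy usage with plain functions instead of loaded models:
p = (Pipeline()
     .append(str.split, output_key="tok")
     .append(len, output_key="n", input_key="tok"))
doc = p("hello world")
print(doc)  # {'text': 'hello world', 'tok': ['hello', 'world'], 'n': 2}
```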

Describe the current behavior
When I run the example, the error above occurs.

Expected behavior
The terminal should print the content of the variable 'text'.

System information
-- OS: Windows-10-10.0.22621-SP0
-- Python: 3.10.4
-- PyTorch: 2.1.2+cpu
-- HanLP: 2.1.0-beta.54

Other info / logs
================================ERROR LOG BEGINS================================
OS: Windows-10-10.0.22621-SP0
Python: 3.10.4
PyTorch: 2.1.2+cpu
HanLP: 2.1.0-beta.54
Traceback (most recent call last):
File "C:\TestProject\python\hanlptest\main.py", line 17, in
.append(hanlp.load('CTB9_DEP_ELECTRA_SMALL', conll=0), output_key='dep', input_key='tok')
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\__init__.py", line 43, in load
return load_from_meta_file(save_dir, 'meta.json', verbose=verbose, **kwargs)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\utils\component_util.py", line 186, in load_from_meta_file
raise e from None
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\utils\component_util.py", line 106, in load_from_meta_file
obj.load(save_dir, verbose=verbose, **kwargs)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\common\torch_component.py", line 173, in load
self.load_config(save_dir, **kwargs)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\common\torch_component.py", line 126, in load_config
self.on_config_ready(**self.config, save_dir=save_dir)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\components\parsers\biaffine\biaffine_dep.py", line 562, in on_config_ready
self.build_transformer_tokenizer() # We have to build tokenizer before building the dataloader and model
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\components\parsers\biaffine\biaffine_dep.py", line 285, in build_transformer_tokenizer
transformer_tokenizer: PreTrainedTokenizer = AutoTokenizer.from_pretrained(transformer, use_fast=True)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 752, in from_pretrained
config = AutoConfig.from_pretrained(
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1082, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\configuration_utils.py", line 644, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\configuration_utils.py", line 699, in _get_config_dict
resolved_config_file = cached_file(
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\hub.py", line 429, in cached_file
raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like hfl/chinese-electra-180g-small-discriminator is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
=================================ERROR LOG ENDS=================================

Code to reproduce the issue
Provide a reproducible test case that is the bare minimum necessary to generate the problem.

  • I've completed this form and searched the web for solutions.

hankcs commented Dec 22, 2023

Hi, this is due to an unstable internet connection to huggingface.co. To work around it, we provide a CDN server so you can download these files from our servers instead.

@hankcs hankcs closed this as completed Dec 22, 2023
@hankcs hankcs added improvement and removed bug labels Dec 22, 2023