Describe the current behavior
When I run the example, loading the model fails with the error below.
Expected behavior
The terminal should print the content of the 'text' variable.
System information
-- OS: Windows-10-10.0.22621-SP0
-- Python: 3.10.4
-- PyTorch: 2.1.2+cpu
-- HanLP: 2.1.0-beta.54
Other info / logs
================================ERROR LOG BEGINS================================
OS: Windows-10-10.0.22621-SP0
Python: 3.10.4
PyTorch: 2.1.2+cpu
HanLP: 2.1.0-beta.54
Traceback (most recent call last):
File "C:\TestProject\python\hanlptest\main.py", line 17, in <module>
.append(hanlp.load('CTB9_DEP_ELECTRA_SMALL', conll=0), output_key='dep', input_key='tok')
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\__init__.py", line 43, in load
return load_from_meta_file(save_dir, 'meta.json', verbose=verbose, **kwargs)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\utils\component_util.py", line 186, in load_from_meta_file
raise e from None
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\utils\component_util.py", line 106, in load_from_meta_file
obj.load(save_dir, verbose=verbose, **kwargs)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\common\torch_component.py", line 173, in load
self.load_config(save_dir, **kwargs)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\common\torch_component.py", line 126, in load_config
self.on_config_ready(**self.config, save_dir=save_dir)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\components\parsers\biaffine\biaffine_dep.py", line 562, in on_config_ready
self.build_transformer_tokenizer() # We have to build tokenizer before building the dataloader and model
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\hanlp\components\parsers\biaffine\biaffine_dep.py", line 285, in build_transformer_tokenizer
transformer_tokenizer: PreTrainedTokenizer = AutoTokenizer.from_pretrained(transformer, use_fast=True)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 752, in from_pretrained
config = AutoConfig.from_pretrained(
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1082, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\configuration_utils.py", line 644, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\configuration_utils.py", line 699, in _get_config_dict
resolved_config_file = cached_file(
File "C:\Users\jsyyb\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\hub.py", line 429, in cached_file
raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like hfl/chinese-electra-180g-small-discriminator is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
=================================ERROR LOG ENDS=================================
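The last line of the log points at transformers' offline mode. As a sketch (assuming the ELECTRA files are already in the local Hugging Face cache), it can be enabled by setting two environment variables before anything imports transformers:

```python
import os

# These must be set before transformers/hanlp are imported; with them in
# place, transformers reads only from the local cache instead of trying
# to reach huggingface.co.
os.environ["TRANSFORMERS_OFFLINE"] = "1"
os.environ["HF_HUB_OFFLINE"] = "1"
```

Both variables are documented at the offline-mode link in the error message; they only help if the model files were already downloaded on a machine with network access.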
Code to reproduce the issue
Provide a reproducible test case that is the bare minimum necessary to generate the problem.
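Only line 17 of main.py appears in the traceback, so the sketch below reconstructs a plausible pipeline around it; the tokenizer component and the input sentence are placeholders, not the exact original code:

```python
def main():
    import hanlp  # requires the hanlp package; downloads models on first run

    pipeline = (
        hanlp.pipeline()
        # Placeholder tokenizer stage; the original tok component is unknown.
        .append(hanlp.load(hanlp.pretrained.tok.COARSE_ELECTRA_SMALL_ZH),
                output_key='tok')
        # This is the call from main.py line 17 that raises the OSError:
        .append(hanlp.load('CTB9_DEP_ELECTRA_SMALL', conll=0),
                output_key='dep', input_key='tok')
    )
    text = pipeline('商品和服务')
    print(text)

# Calling main() on a machine that cannot reach huggingface.co should
# reproduce the OSError from the log.
```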
I've completed this form and searched the web for solutions.
Describe the bug
Failed to load https://file.hankcs.com/hanlp/dep/ctb9_dep_electra_small_20220216_100306.zip
If the problem still persists, please submit an issue to https://github.com/hankcs/HanLP/issues
When reporting an issue, make sure to paste the FULL ERROR LOG below.