Avoid contention when loading models #1034

Open · wants to merge 2 commits into main
Conversation

@Btlmd (Member) commented May 16, 2023

Model loading is very likely to trigger this error:

Traceback (most recent call last):
  File "main.py", line 434, in <module>
    main()
  File "main.py", line 114, in main
    tokenizer = AutoTokenizer.from_pretrained(model_args.model_name_or_path, trust_remote_code=True)
  File "/opt/conda/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 663, in from_pretrained
    tokenizer_class = get_class_from_dynamic_module(
  File "/opt/conda/lib/python3.8/site-packages/transformers/dynamic_module_utils.py", line 388, in get_class_from_dynamic_module
    final_module = get_cached_module_file(
  File "/opt/conda/lib/python3.8/site-packages/transformers/dynamic_module_utils.py", line 279, in get_cached_module_file
    shutil.copy(resolved_module_file, submodule_path / module_file)
  File "/opt/conda/lib/python3.8/shutil.py", line 419, in copy
    copymode(src, dst, follow_symlinks=follow_symlinks)
  File "/opt/conda/lib/python3.8/shutil.py", line 308, in copymode
    chmod_func(dst, stat.S_IMODE(st.st_mode))
FileNotFoundError: [Errno 2] No such file or directory: '/root/.cache/huggingface/modules/transformers_modules/chatglm-6b/tokenization_chatglm.py'

because the model is loaded simultaneously in each process: every process tries to copy the remote-code files (e.g. tokenization_chatglm.py) into the same Hugging Face module cache at once, and one process's copy races with another's.

Add a lock to prevent this.
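
A minimal sketch of the idea, assuming an inter-process file lock via the filelock package (already a transformers dependency). The lock path and the model name below are illustrative, not taken from this PR's diff:

    from filelock import FileLock
    from transformers import AutoModel, AutoTokenizer

    model_name_or_path = "THUDM/chatglm-6b"  # hypothetical; main.py uses model_args.model_name_or_path
    lock_path = "/tmp/chatglm_load.lock"     # hypothetical lock file shared by all local processes

    # Only one process at a time enters this block, so the copy of
    # tokenization_chatglm.py into ~/.cache/huggingface/modules can finish
    # before any other process reads or rewrites the same file.
    with FileLock(lock_path):
        tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, trust_remote_code=True)
        model = AutoModel.from_pretrained(model_name_or_path, trust_remote_code=True)

With a torchrun launch, each local rank then takes the lock in turn, so the cached remote-code modules are fully written before the next rank touches them.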

@roki1031 commented Jun 6, 2023

I applied your changes to my local code, but I still get the same error.
