Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

PermissionError when i use OVModelForCausalLM

MansonHua
Beginner

Hi,

I want to convert a Hugging Face model to an OpenVINO model, as shown in this tutorial: https://docs.openvino.ai/2024/learn-openvino/llm_inference_guide/llm-inference-hf.html

 

When I run the following code:

from optimum.intel import OVModelForCausalLM
model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
model = OVModelForCausalLM.from_pretrained(model_id, export=True)

the following error is reported:

Compiling the model to CPU ...
Exception ignored in: <finalize object at 0x2728ab83e40; dead>
Traceback (most recent call last):
File "D:\miniconda3\envs\npu-cu121\Lib\weakref.py", line 590, in __call__
return info.func(*info.args, **(info.kwargs or {}))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\miniconda3\envs\npu-cu121\Lib\tempfile.py", line 933, in _cleanup
cls._rmtree(name, ignore_errors=ignore_errors)
File "D:\miniconda3\envs\npu-cu121\Lib\tempfile.py", line 929, in _rmtree
_shutil.rmtree(name, onerror=onerror)
File "D:\miniconda3\envs\npu-cu121\Lib\shutil.py", line 787, in rmtree
return _rmtree_unsafe(path, onerror)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\miniconda3\envs\npu-cu121\Lib\shutil.py", line 634, in _rmtree_unsafe
onerror(os.unlink, fullname, sys.exc_info())
File "D:\miniconda3\envs\npu-cu121\Lib\tempfile.py", line 893, in onerror
_os.unlink(path)
PermissionError: [WinError 32] The process cannot access the file "C:\\Users\\super\\AppData\\Local\\Temp\\tmp8iwe25fe\\openvino_model.bin" because it is being used by another process

Based on the output, it seems that compression of the model has been completed:

[screenshot: MansonHua_0-1717659726183.png]

 

What should I do to successfully run the code?

2 Replies
Peh_Intel
Moderator

Hi MansonHua,

 

For your information, I am able to load the TinyLlama-1.1B-Chat-v1.0 model from Hugging Face into Optimum Intel without any issue. I am running the code in Command Prompt with Python 3.11.5.

 

[screenshot: llma_model.jpeg]

 

Please have a look at the directory:

C:\Users\super\AppData\Local\Temp\tmp8iwe25fe\

 

If the temp folder contains the config file and the converted IR files, you can load the model from that directory and save it into a new folder called llma_model, which will then hold the LLM in OpenVINO IR format.

model = OVModelForCausalLM.from_pretrained("C:\\Users\\super\\AppData\\Local\\Temp\\tmp8iwe25fe")

 

model.save_pretrained("llma_model")
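
 

Putting it together, here is a minimal sketch of the full workaround, assuming the temp folder from your error message still exists and contains the exported files. The tokenizer is loaded from the original TinyLlama model id, since the temp folder may not include the tokenizer files.

from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

# Load the already-converted IR files from the temp folder (path taken from the error message)
model = OVModelForCausalLM.from_pretrained("C:\\Users\\super\\AppData\\Local\\Temp\\tmp8iwe25fe")

# Save the model into a permanent folder, and save the tokenizer alongside it
# (assumption: the tokenizer is fetched from the original model id)
model.save_pretrained("llma_model")
tokenizer = AutoTokenizer.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0")
tokenizer.save_pretrained("llma_model")

# Afterwards, load directly from the saved folder and run a quick generation test
model = OVModelForCausalLM.from_pretrained("llma_model")
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Once the model has been saved with save_pretrained(), you can reload it from the llma_model folder without repeating the export step.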

 

 

Regards,

Peh

 

 

 

Peh_Intel
Moderator

Hi MansonHua,


This thread will no longer be monitored since we have provided a suggestion. If you need any additional information from Intel, please submit a new question. 



Regards,

Peh

