
using a model of type deepseek_vl_v2 to instantiate a model of type DeepseekOCR #5

@just-bean

Description


Python server is ready!
Python: 2025-10-25 12:55:59,379 - INFO - Progress: loading - init - Initializing model loading... (0%)
2025-10-25 12:55:59,380 - INFO - Loading DeepSeek OCR model from deepseek-ai/DeepSeek-OCR...
2025-10-25 12:55:59,380 - INFO - Model will be cached in: H:\Temp\deepseek-ocr-client\backend..\cache\models

Python: 2025-10-25 12:55:59,385 - INFO - GPU available: NVIDIA GeForce RTX 4060 Ti
2025-10-25 12:55:59,385 - INFO - Progress: loading - tokenizer - Loading tokenizer... (10%)
2025-10-25 12:55:59,385 - INFO - Loading tokenizer...

Python: 2025-10-25 12:56:05,132 - INFO - Progress: loading - tokenizer - Tokenizer loaded (20%)

Python: 2025-10-25 12:56:05,133 - INFO - Progress: loading - model - Loading model from cache... (25%)

Python: 2025-10-25 12:56:05,133 - INFO - Loading model from cache...

Python: 2025-10-25 12:56:07,134 - INFO - Progress: loading - model - Loading model from cache... (25%)

Python: 2025-10-25 12:56:09,135 - INFO - Progress: loading - model - Loading model from cache... (25%)

Python: 2025-10-25 12:56:11,184 - INFO - Progress: loading - model - Loading model from cache... (25%)

Python: You are using a model of type deepseek_vl_v2 to instantiate a model of type DeepseekOCR. This is not supported for all configurations of models and can yield errors.

Python: 2025-10-25 12:56:13,186 - INFO - Progress: loading - model - Loading model from cache... (25%)

Python: 2025-10-25 12:56:17,547 - WARNING - Flash attention not available: FlashAttention2 has been toggled on, but it cannot be used due to the following error: the package flash_attn seems to be not installed. Please refer to the documentation of https://huggingface.co/docs/transformers/perf_infer_gpu_one#flashattention-2 to install Flash Attention 2., using default attention
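The flash-attention warning above is benign on its own: transformers falls back to the default attention implementation when the flash_attn package is missing. A minimal sketch of choosing the implementation explicitly up front instead of relying on the fallback (the `pick_attn_implementation` helper is hypothetical, not part of this project):

```python
import importlib.util

def pick_attn_implementation() -> str:
    """Return "flash_attention_2" only when the flash_attn package
    is actually importable; otherwise fall back to "eager"."""
    if importlib.util.find_spec("flash_attn") is not None:
        return "flash_attention_2"
    return "eager"

attn_impl = pick_attn_implementation()
```

The chosen value could then be passed as `attn_implementation=attn_impl` to `AutoModel.from_pretrained(..., trust_remote_code=True)`, so FlashAttention2 is never requested on a machine where the package is absent.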

Python: You are using a model of type deepseek_vl_v2 to instantiate a model of type DeepseekOCR. This is not supported for all configurations of models and can yield errors.

Python process exited with code 3221225477
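The exit code is the most useful signal here: 3221225477 is the decimal form of the Windows NTSTATUS value 0xC0000005 (STATUS_ACCESS_VIOLATION), meaning the Python process died in native code rather than raising a Python exception. A quick sketch of the decoding:

```python
# Windows reports process exit status as an unsigned 32-bit NTSTATUS.
exit_code = 3221225477
status = f"0x{exit_code:08X}"
# status == "0xC0000005", i.e. STATUS_ACCESS_VIOLATION (a native crash,
# not a Python-level error), consistent with a crash inside the model's
# custom CUDA/attention code during loading.
```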

Metadata


Labels: bug (Something isn't working), can't reproduce (Can't reproduce the bug as described), help wanted (Extra attention is needed), question (Further information is requested)
