
Conversation

@sgonorov commented Sep 30, 2025

Description

Ticket:
Part of CVS-173285

@sgonorov sgonorov requested a review from as-suvorov September 30, 2025 05:41
@github-actions bot added the category: continuous batching, category: LLM, category: sampling, category: tokenizers, category: GGUF, and category: RAG labels Sep 30, 2025

@apaniukov (Contributor) left a comment

ov_pipe should not be created inside the test if possible - it incurs significant overhead. You should pass initialized pipelines to the tests, not models_path.

@sgonorov (Author)

> ov_pipe should not be created inside the test if possible - it incurs significant overhead. You should pass initialized pipelines to the tests, not models_path.

Yes, that's already done in the LLM pipeline tests. I'm currently looking into how to make it work properly for the static LLM pipeline.
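For context, here is a minimal sketch of the fixture approach being suggested, assuming pytest and the openvino_genai Python API; the fixture names, scope, and the source of models_path below are hypothetical, not the PR's actual test code:

```python
import pytest
import openvino_genai as ov_genai


@pytest.fixture(scope="module")
def ov_pipe(models_path):
    # Hypothetical fixture: build the pipeline once per module so tests
    # receive a ready LLMPipeline instead of constructing one from models_path.
    return ov_genai.LLMPipeline(models_path, "CPU")


def test_generate_smoke(ov_pipe):
    # Tests depend on the initialized pipeline, avoiding repeated model loads.
    result = ov_pipe.generate("Hello", max_new_tokens=8)
    assert len(str(result)) > 0
```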

@sgonorov sgonorov force-pushed the testing_llm_pipeline_speedup branch from 8c2c779 to 1f12a03 Compare October 1, 2025 21:08
@sgonorov sgonorov self-assigned this Oct 1, 2025
@sgonorov sgonorov force-pushed the testing_llm_pipeline_speedup branch from 85d9db9 to 1e323b2 Compare October 5, 2025 19:07
@sgonorov sgonorov marked this pull request as ready for review October 5, 2025 19:22
@sgonorov sgonorov force-pushed the testing_llm_pipeline_speedup branch from 1e323b2 to 7820744 Compare October 6, 2025 12:15
@sgonorov sgonorov force-pushed the testing_llm_pipeline_speedup branch from ab9bc74 to 6933c81 Compare October 7, 2025 07:42
Comment on lines +210 to +215
from pathlib import Path
media_path = (Path(download_test_content) / media_file).as_posix()
Contributor

Can we make download_test_content return a Path object? And/or move the import to the top?

Author

We can do that, but most use cases take this fixture as a string parameter, so returning a Path would just add more conversions.
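To illustrate the trade-off, a hypothetical sketch (the fixture body and file name are made up; only download_test_content and the Path usage come from the diff): keep the fixture returning a str, move the pathlib import to the top of the test module, and convert at the call sites that need path arithmetic.

```python
from pathlib import Path

import pytest


@pytest.fixture(scope="session")
def download_test_content():
    # Hypothetical stand-in for the real download logic; returns a plain str
    # because most existing call sites use it as a string parameter.
    return "/tmp/ov_genai_test_content"


def test_media_path(download_test_content):
    media_file = "video.mp4"  # hypothetical file name
    media_path = (Path(download_test_content) / media_file).as_posix()
    assert media_path.endswith(media_file)
```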

@sgonorov sgonorov force-pushed the testing_llm_pipeline_speedup branch from ffd8269 to c4bfe29 Compare October 7, 2025 22:16
@sgonorov sgonorov force-pushed the testing_llm_pipeline_speedup branch 3 times, most recently from 2d04bc3 to 0a51896 Compare October 10, 2025 08:59
@sgonorov sgonorov force-pushed the testing_llm_pipeline_speedup branch from 0a51896 to af1ab81 Compare October 10, 2025 09:02
@sgonorov sgonorov requested a review from apaniukov October 12, 2025 14:10