Conversation

@Chibukach (Collaborator)

No description provided.

@Chibukach requested a review from shubhra on October 23, 2025, 11:30
@anmarques (Member) left a comment

Overall it looks good. It needs some small fixes.

```python
config_semantic_similarity_args.update(semantic_similarity_args)
self.semantic_similarity_args = config_semantic_similarity_args

self.num_samples_per_dataset = config_kwargs.pop("num_samples_per_dataset", num_samples_per_dataset)
```

@anmarques (Member)

This implementation allows arguments set in the config to overwrite arguments passed in by the user. This should not be the case. Config arguments should follow one of these two options (see the sketch below):

  1. They can be overridden by the user. In this case the user-provided value should be used.
  2. They can't be overridden by the user. If the user provides a value for an argument that can't be overridden, argument parsing should raise an error.
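
A minimal sketch of the two options, assuming hypothetical config_kwargs/user_kwargs dicts and an illustrative whitelist of overridable names (none of these identifiers are the actual ones in this PR):

```python
# Sketch only: all names here are illustrative, not the actual arguments in this PR.
USER_OVERRIDABLE = {"num_samples_per_dataset"}  # hypothetical set of user-overridable args

def resolve_args(config_kwargs: dict, user_kwargs: dict) -> dict:
    """Merge config and user arguments with explicit precedence."""
    resolved = dict(config_kwargs)
    for name, value in user_kwargs.items():
        if name not in config_kwargs or name in USER_OVERRIDABLE:
            # Option 1: the user-provided value wins.
            resolved[name] = value
        else:
            # Option 2: the config fixes this argument, so a user-supplied value is an error.
            raise ValueError(f"'{name}' is set by the config and cannot be overridden")
    return resolved
```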

@Chibukach (Collaborator, author)

Fair point, will fix.


```python
try:
    print(">>> Initializing vLLM...")
    llm = LLM(
```
@anmarques (Member)

Why are we using the LLM class instead of vllm serve?

@Chibukach (Collaborator, author)

The main branch has an old src/automation/vllm/server.py file with the VLLMServer class, but other branches use start_vllm_server.
Also, shouldn't the output of the LLM class be identical to the vllm serve API endpoint?
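
For reference, a minimal sketch of the two approaches being compared (the model name and prompt are placeholders); whether both paths return identical output for the same sampling parameters is exactly the question above:

```python
from vllm import LLM, SamplingParams

# In-process engine, as used in this PR.
llm = LLM(model="Qwen/Qwen2.5-0.5B-Instruct")  # placeholder model
params = SamplingParams(temperature=0.0, max_tokens=64)
outputs = llm.generate(["Hello, world"], params)
print(outputs[0].outputs[0].text)

# Server alternative: `vllm serve Qwen/Qwen2.5-0.5B-Instruct` starts an
# OpenAI-compatible endpoint (default http://localhost:8000/v1) that a
# client would query over HTTP instead of calling generate() in-process.
```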
