[Benchmarks] add Record and Replay benchmarks #20481
base: sycl
Conversation
Force-pushed from fb1858a to 518f3b8
| "./devops/scripts/benchmarks/main.py", | ||
| self.WORKDIR_DIR, | ||
| "--sycl", | ||
| os.environ.get("ONEAPI_ROOT"), |
Add info that oneAPI has to be installed and sourced for the SYCL benchmark tests.
added
I see that you've changed the code so that oneAPI is no longer needed. Please remove the comment; the CMPLR_ROOT env var is set directly in the workflow file.
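A minimal sketch of how the invocation could rely on CMPLR_ROOT exported by the workflow instead of a sourced oneAPI environment; the helper name and exact argument list are illustrative, not the PR's final code:

import os

# Hypothetical helper: build the benchmark command from CMPLR_ROOT,
# which the workflow file is expected to export (no setvars.sh sourcing needed).
def build_benchmark_cmd(workdir: str) -> list:
    sycl_root = os.environ.get("CMPLR_ROOT")
    if sycl_root is None:
        raise RuntimeError("CMPLR_ROOT is not set; the workflow should export it")
    return [
        "./devops/scripts/benchmarks/main.py",
        workdir,
        "--sycl",
        sycl_root,
    ]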
    metadata = out.metadata[testName]
    self.assertEqual(metadata["type"], "benchmark")
    self.assertEqual(set(metadata["tags"]), {"L0", "latency", "micro", "submit"})
Also check that group metadata for this benchmark has been created in data.json.
Please specify which path in the JSON I need to verify.
benchmark_group = metadata["explicit_group"]
group_metadata = out.metadata[benchmark_group]
self.assertEqual(group_metadata["type"], "group")
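For reference, the combined check inside the existing test could look like the lines below; out and testName are taken from the surrounding test shown in the diff above, and explicit_group is the key the suggestion relies on:

# Benchmark-level metadata (existing assertions).
metadata = out.metadata[testName]
self.assertEqual(metadata["type"], "benchmark")
self.assertEqual(set(metadata["tags"]), {"L0", "latency", "micro", "submit"})

# Group-level metadata, keyed by the benchmark's explicit group,
# as suggested in the review.
benchmark_group = metadata["explicit_group"]
group_metadata = out.metadata[benchmark_group]
self.assertEqual(group_metadata["type"], "group")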
    Returns:
        bool: True if the repository was cloned or updated, False if it was already up-to-date.
    """
    if os.environ.get("LLVM_BENCHMARKS_UNIT_TESTING") == "1":
Should we describe this env var somewhere?
All new env vars should be described in the tests' README.
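A minimal sketch of how the variable could be set only for the unit tests, using standard unittest facilities; the test class name is hypothetical, the variable name comes from the diff:

import os
import unittest
from unittest import mock

class BenchmarksScriptTest(unittest.TestCase):
    def setUp(self):
        # Export LLVM_BENCHMARKS_UNIT_TESTING only for the duration of each test,
        # so the guarded repository clone/update path is skipped.
        patcher = mock.patch.dict(os.environ, {"LLVM_BENCHMARKS_UNIT_TESTING": "1"})
        patcher.start()
        self.addCleanup(patcher.stop)

Documenting the variable in the tests' README, as requested above, keeps that expectation discoverable.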
Force-pushed from fd812e1 to 269c808
Force-pushed from 269c808 to 4ac4f56
Force-pushed from f7a16dc to 5b25f63
with unittest support