Commit 63b8cfb
feat: [AiPlatform] add evaluation metrics and autorater configuration to the AI Platform v1 API (#9074)
* feat: add evaluation metrics and autorater configuration to the AI Platform v1 API
feat: add evaluation configuration and dataset runs to TuningJob
feat: add multimodal input support and custom output formats to evaluation metrics
PiperOrigin-RevId: 893059415
Source-Link: googleapis/googleapis@582172d
Source-Link: googleapis/googleapis-gen@b8fd776
Copy-Tag: eyJwIjoiQWlQbGF0Zm9ybS8uT3dsQm90LnlhbWwiLCJoIjoiYjhmZDc3Njc2ZGUwMDYzZTQ2MjY1NGE5NGYzOGM5YjE3YzQwNDJmNyJ9
* 🦉 Updates from OwlBot post-processor
See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md
---------
Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
1 parent: ceeb2e2
32 files changed: 3,584 additions and 2 deletions
File tree
- AiPlatform
- metadata/V1
- src/V1
- ComputationBasedMetricSpec
- ContentMap
- Metric
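The new message classes under `src/V1` (such as `ComputationBasedMetricSpec`) are generated protobuf-PHP messages. As a hypothetical sketch only — the rendered diff does not show the individual fields, so none are assumed here — such a message follows the library's usual pattern: a constructor that accepts an associative array of fields, generated `getFoo()`/`setFoo()` accessors, and serialization helpers inherited from the protobuf runtime:

```php
<?php
// Hypothetical sketch: the namespace follows the library's existing
// Google\Cloud\AIPlatform\V1 convention; the specific fields added by this
// commit are not visible in the rendered diff, so none are set here.
use Google\Cloud\AIPlatform\V1\ComputationBasedMetricSpec;

// Generated protobuf-PHP messages take an associative array of fields.
$spec = new ComputationBasedMetricSpec([]);

// Serialization helpers come from Google\Protobuf\Internal\Message.
$bytes = $spec->serializeToString();
$copy  = new ComputationBasedMetricSpec();
$copy->mergeFromString($bytes);
```

The same pattern applies to the other generated classes in this commit (`ContentMap`, `Metric`); consult the library's generated reference documentation for the concrete field names.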
Some generated files are not rendered by default.