Commit 63b8cfb

feat: [AiPlatform] add evaluation metrics and autorater configuration to the AI Platform v1 API (#9074)
* feat: add evaluation metrics and autorater configuration to the AI Platform v1 API
* feat: add evaluation configuration and dataset runs to TuningJob
* feat: add multimodal input support and custom output formats to evaluation metrics

PiperOrigin-RevId: 893059415
Source-Link: googleapis/googleapis@582172d
Source-Link: googleapis/googleapis-gen@b8fd776
Copy-Tag: eyJwIjoiQWlQbGF0Zm9ybS8uT3dsQm90LnlhbWwiLCJoIjoiYjhmZDc3Njc2ZGUwMDYzZTQ2MjY1NGE5NGYzOGM5YjE3YzQwNDJmNyJ9

* 🦉 Updates from OwlBot post-processor. See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Parent: ceeb2e2

32 files changed: 3,584 additions and 2 deletions

AiPlatform/metadata/V1/EvaluationService.php

Lines changed: 5 additions & 1 deletion

AiPlatform/metadata/V1/TuningJob.php

Lines changed: 2 additions & 1 deletion

AiPlatform/src/V1/AggregationOutput.php

Lines changed: 111 additions & 0 deletions
