Add max batch size config option#351

Merged
mikoarce merged 4 commits into master from add_maxBatchSize_config_option on May 6, 2025
Conversation

@mikoarce (Contributor) commented May 2, 2025

This PR exposes the maxBatchSize option so we can break up large requests if we have to.
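The effect of the option can be sketched in isolation: DataLoader's maxBatchSize caps how many keys a single batch-function call receives, so a large set of queued keys is split into several smaller calls. The helper below is a standalone illustration of that splitting, not code from this PR or from the dataloader library itself.

```typescript
// Standalone sketch of how a maxBatchSize cap splits queued keys into
// chunks, each of which would become a separate batch-function call.
// Assumes maxBatchSize >= 1.
function chunkKeys<K>(keys: readonly K[], maxBatchSize: number): K[][] {
  const chunks: K[][] = [];
  for (let i = 0; i < keys.length; i += maxBatchSize) {
    chunks.push(keys.slice(i, i + maxBatchSize));
  }
  return chunks;
}

// Five queued keys with maxBatchSize 2 become three underlying requests.
console.log(chunkKeys(["a", "b", "c", "d", "e"], 2)); // [ [ 'a', 'b' ], [ 'c', 'd' ], [ 'e' ] ]
```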

@mikoarce mikoarce requested a review from Copilot May 2, 2025 17:11

Copilot AI left a comment


Pull Request Overview

This PR introduces a new configuration option for controlling the maximum batch size for property batch loaders.

  • Adds an optional maxBatchSize option to BatchResourceConfig in config.ts.
  • Updates the property batch loader to conditionally include maxBatchSize in DataLoader options in implementation.ts.
  • Provides a new test to verify that batching is handled properly when maxBatchSize is set.
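The "conditionally include" step in the second bullet can be sketched as follows. The interfaces here are simplified stand-ins (the real BatchResourceConfig in src/config.ts has more fields), but the pattern matches what the overview describes: add maxBatchSize to the DataLoader options only when the config sets it.

```typescript
// Simplified stand-ins for the PR's types; the real BatchResourceConfig
// in src/config.ts has more fields.
interface BatchResourceConfig {
  maxBatchSize?: number;
}

interface DataLoaderOptions {
  maxBatchSize?: number;
}

// Include maxBatchSize only when the config sets it, so that
// DataLoader's own default applies otherwise.
function buildOptions(config: BatchResourceConfig): DataLoaderOptions {
  return {
    ...(config.maxBatchSize !== undefined
      ? { maxBatchSize: config.maxBatchSize }
      : {}),
  };
}

console.log(buildOptions({ maxBatchSize: 50 })); // { maxBatchSize: 50 }
console.log(buildOptions({}));                   // {}
```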

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 1 comment.

File descriptions:
  • src/implementation.ts: Injects maxBatchSize into DataLoader options for property batch loaders
  • src/config.ts: Defines the new optional maxBatchSize property in BatchResourceConfig
  • tests/implementation.test.js: Adds a new test to validate the batching behavior with maxBatchSize

Comment thread: src/implementation.ts

  }
- ...cacheKeyOptions
+ ...cacheKeyOptions,
+ ${resourceConfig.maxBatchSize ? `maxBatchSize: ${resourceConfig.maxBatchSize},` : ''}

Copilot AI May 2, 2025


The conditional check uses a truthy test for resourceConfig.maxBatchSize, which will skip valid values like 0. Consider explicitly checking for undefined (e.g. resourceConfig.maxBatchSize !== undefined) to ensure all valid numeric values are included.

Suggested change
${resourceConfig.maxBatchSize ? `maxBatchSize: ${resourceConfig.maxBatchSize},` : ''}
${resourceConfig.maxBatchSize !== undefined ? `maxBatchSize: ${resourceConfig.maxBatchSize},` : ''}
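The pitfall Copilot is pointing at can be shown with plain values. The array below is hypothetical, not from the PR: a truthy check discards 0 along with undefined, while an explicit undefined check keeps 0.

```typescript
// Hypothetical config values illustrating the review comment: a plain
// truthy check drops 0, while an explicit undefined check keeps it.
const values: Array<number | undefined> = [5, 0, undefined];

const truthyKept = values.filter((v) => Boolean(v));       // drops 0 and undefined
const definedKept = values.filter((v) => v !== undefined); // drops only undefined

console.log(truthyKept);  // [ 5 ]
console.log(definedKept); // [ 5, 0 ]
```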

@mikoarce (Contributor, author) May 2, 2025


Checking this as a falsy conditional because I don't think we'd want to set a max batch size of 0 anyway.

But I can swap this to fall back to Infinity instead since that seems like what it defaults to under the hood anyway.
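The fallback the author describes can be sketched with a logical OR, which sends both undefined and 0 to Infinity; DataLoader's documented default for maxBatchSize is indeed Infinity. The function name here is illustrative, not from the PR.

```typescript
// Sketch of the described fallback: an unset (or falsy, e.g. 0)
// maxBatchSize falls back to Infinity, matching DataLoader's
// documented default for this option.
function resolveMaxBatchSize(configured?: number): number {
  return configured || Infinity;
}

console.log(resolveMaxBatchSize(100));       // 100
console.log(resolveMaxBatchSize(undefined)); // Infinity
console.log(resolveMaxBatchSize(0));         // Infinity (0 treated as "no limit")
```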


0 is probably not a valid value tbf here @mikoarce so we should be good to ignore this

@mikoarce marked this pull request as ready for review May 2, 2025 17:21

@robertroeder left a comment


Looks great! Love the test :)

Comment thread: src/implementation.ts

  }
- ...cacheKeyOptions
+ ...cacheKeyOptions,
+ ${resourceConfig.maxBatchSize ? `maxBatchSize: ${resourceConfig.maxBatchSize},` : ''}

Do we need to add this to getBatchLoader too?

@mikoarce (Contributor, author)

I wasn't sure if we had to, but we probably should since it's also about batched calls. I can add that in there too!

@mikoarce (Contributor, author)

Made the change here: fec60be. Added a test for this one as well.

Comment thread: src/config.ts

  isBatchKeyASet?: boolean;
  propertyBatchKey?: string;
  responseKey?: string;
+ maxBatchSize?: number;
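For context, a resource entry using the new field might look like the object below. The neighboring field names come from the diff context above; the values and the shape of the surrounding object are made up for illustration.

```typescript
// Hypothetical resource entry showing where the new optional field sits
// alongside the existing BatchResourceConfig fields from the diff.
// All values here are invented for illustration.
const resourceConfig = {
  isBatchKeyASet: false,
  propertyBatchKey: "properties",
  responseKey: "id",
  maxBatchSize: 25, // cap each underlying request at 25 keys
};

console.log(resourceConfig.maxBatchSize); // 25
```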

@mikoarce (Contributor, author)

++ good catch, totally forgot about that.


Copilot AI left a comment


Pull Request Overview

This PR adds a new configuration option, maxBatchSize, to allow large requests to be broken up into smaller batches. The changes include propagating the new configuration through the DataLoader creation logic (in both batch and property batch loaders), updating the configuration interface, and adding tests and documentation.

Reviewed Changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated no comments.

File descriptions:
  • src/implementation.ts: Updated DataLoader initialization to include maxBatchSize when set
  • src/config.ts: Extended the BatchResourceConfig interface with maxBatchSize
  • tests/implementation.test.js: Added tests to validate batching behavior based on maxBatchSize
  • API_DOCS.md: Documented the new maxBatchSize configuration option


@robertroeder left a comment


Changes LGTM!

@mikoarce merged commit 42acd25 into master May 6, 2025
1 check failed
5 participants