
Commit a8f8ce4

Merge pull request #13 from microsoft/users/yohuan/fixreadme2
fix readme codes
2 parents: 298854a + edffffd

1 file changed: 3 additions & 2 deletions

README.md
@@ -1,6 +1,6 @@
 # Batch Inference Toolkit
 
-Batch Inference Toolkit(batch-inference) is a Python package that batches model input tensors coming from multiple users dynamically, executes the model, un-batches output tensors and then returns them back to each user respectively. This will improve system throughput because of better compute parallelism and better cache locality. The entire process is transparent to developers.
+Batch Inference Toolkit(batch-inference) is a Python package that batches model input tensors coming from multiple requests dynamically, executes the model, un-batches output tensors and then returns them back to each request respectively. This will improve system throughput because of better compute parallelism and better cache locality. The entire process is transparent to developers.
 
 ## When to use
 
@@ -59,7 +59,7 @@ from batch_inference.batcher.concat_batcher import ConcatBatcher
 @batching(batcher=ConcatBatcher(), max_batch_size=32)
 class MyModel:
     def __init__(self, k, n):
-        self.weights = np.random.randn((k, n)).astype("f")
+        self.weights = np.random.randn(k, n).astype("f")
 
     # shape of x: [batch_size, m, k]
     def predict_batch(self, x):
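The fix in this hunk reflects that `np.random.randn` takes each dimension as a separate positional argument rather than a shape tuple (passing a tuple raises a `TypeError`; the tuple form belongs to functions like `np.random.standard_normal`). A minimal check of the corrected call:

```python
import numpy as np

k, n = 4, 3

# Correct: dimensions as separate arguments, then cast to float32 ("f")
weights = np.random.randn(k, n).astype("f")
print(weights.shape)  # (4, 3)
print(weights.dtype)  # float32
```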
@@ -75,6 +75,7 @@ def process_request(x):
     y = host.predict(x)
     return y
 
+host.stop()
 ```
 
 **Batcher** is responsible to merge queries and split outputs. In this case ConcatBatcher will concat input tensors into a batched tensors at first dimension. We provide a set of built-in Batchers for common scenarios, and you can also implement your own Batcher. See [What is Batcher](https://microsoft.github.io/batch-inference/batcher/what_is_batcher.html) for more information.
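The merge/split behavior the README describes can be sketched with plain NumPy. This is only an illustration of what a concat-style batcher does, not the library's actual `ConcatBatcher` implementation; the `batch`/`unbatch` names here are hypothetical:

```python
import numpy as np

def batch(inputs):
    # Merge per-request tensors into one batched tensor along the first
    # dimension, recording each request's batch size for later splitting.
    sizes = [x.shape[0] for x in inputs]
    return np.concatenate(inputs, axis=0), sizes

def unbatch(output, sizes):
    # Split the batched model output back into per-request tensors
    # at the recorded boundaries.
    return np.split(output, np.cumsum(sizes)[:-1], axis=0)

# Three requests with batch sizes 1, 2, and 3
requests = [np.ones((1, 4)), np.ones((2, 4)), np.ones((3, 4))]
merged, sizes = batch(requests)
outputs = unbatch(merged, sizes)  # model call elided in this sketch
print(merged.shape)                      # (6, 4)
print([o.shape[0] for o in outputs])     # [1, 2, 3]
```

The real library hides this merge/execute/split cycle behind `host.predict`, which is why the process is transparent to callers.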
