This repository was archived by the owner on Jan 10, 2025. It is now read-only.

Model generated by model_generation cannot be invoked #5

@o20021106

Description


I tried to convert a TensorFlow model to TFLite using model_generation/distilbert.py.

The conversion and saving completed without error, but calling allocate_tensors() through the Python API failed, and invoking the model with the interpreter raised a RuntimeError:

RuntimeError: Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.Node number 0 (FlexShape) failed to prepare.

What should I do to fix this error?

Here's my Colab notebook to reproduce the error: colab (tf-nightly-gpu==2.2.0.dev20200115)
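For reference, the usual fix suggested by this error message is to re-convert the model with SELECT_TF_OPS enabled, so that ops without built-in TFLite kernels (here, the FlexShape node) are kept as Flex ops, and then to run the model with the tf.lite.Interpreter from the full TensorFlow package, which links the Flex delegate automatically. A minimal sketch, using a tiny Keras model as a hypothetical stand-in for the distilbert.py model:

```python
import tensorflow as tf

# Hypothetical stand-in for the model built by model_generation/distilbert.py.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # ops with native TFLite kernels
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TF (Flex) for the rest
]
tflite_model = converter.convert()

# The interpreter bundled with the full TensorFlow pip package applies
# the Flex delegate automatically, so allocate_tensors() succeeds.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
```

Note that a standalone tflite_runtime interpreter does not ship the Flex delegate; for models converted with SELECT_TF_OPS, use the interpreter from the full TensorFlow package (or link the Flex delegate explicitly in C++/Android builds).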
