How to use TFLite model signatures with InterpreterApi in LiteRT from Google Play Services?


I’m migrating my Android project from the bundled TensorFlow Lite to TensorFlow Lite via Google Play Services (LiteRT).

I use Movinet, which relies on model signatures, and I feed it individual video frames one by one for streaming inference.

With the old bundled Interpreter class, I could use:
interpreter.runSignature(inputs, outputs, "serving_default")
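For context, my setup after the migration looks roughly like this (a sketch; loadModelFile is a placeholder for however the model buffer is actually loaded):

```kotlin
import android.content.Context
import com.google.android.gms.tflite.java.TfLite
import org.tensorflow.lite.InterpreterApi
import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime
import java.nio.ByteBuffer

// Create an interpreter backed by the TFLite runtime from Google Play Services.
// TfLite.initialize returns a Task<Void>; in real code you should wait for it
// to complete before creating the interpreter, rather than calling it inline.
fun createInterpreter(context: Context, modelBuffer: ByteBuffer): InterpreterApi {
    TfLite.initialize(context)
    return InterpreterApi.create(
        modelBuffer,
        InterpreterApi.Options().setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY)
    )
}
```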

However, after migrating to InterpreterApi from Play Services, this method does not appear to be available: InterpreterApi only exposes run() and runForMultipleInputsOutputs().
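The only workaround I can think of is falling back to runForMultipleInputsOutputs, assuming the positional tensor order happens to match the "serving_default" signature — an assumption that may not hold for Movinet's many state tensors:

```kotlin
import org.tensorflow.lite.InterpreterApi

// Hypothetical fallback: drive the model positionally instead of by signature.
// This only works if the interpreter's input/output tensor order matches the
// order of the signature's named tensors, which is not guaranteed.
fun runFrame(
    interpreter: InterpreterApi,
    inputs: Array<Any>,           // e.g. [frame, state0, state1, ...] in tensor order
    outputs: MutableMap<Int, Any> // output tensor index -> pre-allocated buffer
) {
    interpreter.runForMultipleInputsOutputs(inputs, outputs)
}
```

Managing Movinet's recurrent state buffers by index like this feels fragile compared to runSignature's named inputs/outputs.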

How can I use TFLite model signatures with InterpreterApi?
