
While OP Issue in TFLM #2674

Closed
HemanthSai7 opened this issue Aug 28, 2024 · 2 comments

@HemanthSai7
System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 22.04):
  • TensorFlow installed from (source or binary): pip
  • TensorFlow version (or github SHA if from source): 2.15.0.post1

Background of the issue
I created a Conformer transducer model following the Conformer paper. The model consists of an encoder, a decoder, and a joiner network; the prediction network contains a single LSTM layer. I converted the model to INT8, starting from get_concrete_function():

concrete_func = model.get_concrete_function()
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
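For context, a full INT8 conversion from a concrete function typically also attaches a representative dataset and quantization options. A minimal sketch, assuming a calibration iterable named calibration_samples (illustrative only, not the conformer pipeline described above):

import tensorflow as tf

# Illustrative calibration source; calibration_samples is a placeholder
# iterable of input tensors matching the concrete function's signature.
def representative_dataset():
    for sample in calibration_samples:
        yield [sample]

concrete_func = model.get_concrete_function()
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func], model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
tflite_model = converter.convert()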

The model converted successfully, and after visualizing it in Netron I verified that none of the ops were missing. I then flashed the model onto an ESP32-S3 and kept getting this error in an endless loop from the WHILE op (while.cc). I have pasted a screenshot of it below.
[screenshot: error output from the WHILE op (while.cc)]

I then created a simple model consisting of only a while loop and flashed it. I faced the same error.
NOTE: AllocateTensors() succeeded.
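Before flashing, one quick cross-check is to run the converted flatbuffer with the desktop TFLite interpreter; if the WHILE op also fails to terminate there, the problem is in the converted graph rather than in the TFLM port. A minimal sketch, assuming the converted file is model.tflite with a scalar int32 input:

import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a small loop limit and invoke once.
value = np.array(5, dtype=input_details[0]["dtype"]).reshape(input_details[0]["shape"])
interpreter.set_tensor(input_details[0]["index"], value)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))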

Provide the text output from tflite_convert

W tensorflow/compiler/mlir/lite/python/tf_tfl_flatbuffer_helpers.cc:378] Ignored output_format.
W tensorflow/compiler/mlir/lite/python/tf_tfl_flatbuffer_helpers.cc:381] Ignored drop_control_dependency.
I tensorflow/cc/saved_model/reader.cc:83] Reading SavedModel from: /tmp/tmp839tyn2a
I tensorflow/cc/saved_model/reader.cc:51] Reading meta graph with tags { serve }
I tensorflow/cc/saved_model/reader.cc:146] Reading SavedModel debug info (if present) from: /tmp/tmp839tyn2a
I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:388] MLIR V1 optimization pass is not enabled
I tensorflow/cc/saved_model/loader.cc:233] Restoring SavedModel bundle.
I tensorflow/cc/saved_model/loader.cc:217] Running initialization op on SavedModel bundle at path: /tmp/tmp839tyn2a
I tensorflow/cc/saved_model/loader.cc:316] SavedModel load for tags { serve }; Status: success: OK. Took 13069 microseconds.
I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
Summary on the non-converted ops:
---------------------------------
 * Accepted dialects: tfl, builtin, func
 * Non-Converted Ops: 2, Total Ops 18, % non-converted = 11.11 %
 * 2 ARITH ops

- arith.constant:    2 occurrences  (i32: 2)

  (i1: 1, i32: 1)


  (i32: 2)
  (i1: 1)
  (i32: 1)
  (i32: 1)

I tensorflow/compiler/mlir/lite/flatbuffer_export.cc:2989] Estimated count of arithmetic ops: 3  ops, equivalently 1  MACs
fully_quantize: 0, inference_type: 6, input_inference_type: FLOAT32, output_inference_type: FLOAT32
I tensorflow/compiler/mlir/lite/flatbuffer_export.cc:2989] Estimated count of arithmetic ops: 3  ops, equivalently 1  MACs

Standalone code to reproduce the issue
This is how I converted the sample model consisting of only a while loop:

import tensorflow as tf

def convert_tflite(model, rep_data):
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.experimental_new_converter = True
    converter.representative_dataset = rep_data
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.target_spec.supported_ops = [
        tf.lite.OpsSet.TFLITE_BUILTINS_INT8,
        tf.lite.OpsSet.SELECT_TF_OPS,
    ]
    converter.allow_custom_ops = True
    converter._experimental_lower_tensor_list_ops = False
    tflite_model = converter.convert()
    with open("model.tflite", "wb") as f:
        f.write(tflite_model)

class SimpleWhileLoopModel(tf.keras.Model):
    def __init__(self):
        super().__init__()

    @tf.function
    def call(self, inputs):
        # AutoGraph lowers this Python while loop to a tf.while_loop,
        # which the converter emits as a WHILE op with cond/body subgraphs.
        i, result = tf.constant(0), tf.constant(0)
        while i < inputs:
            result += i * i
            i += 1
        return result

model = SimpleWhileLoopModel()
model(tf.constant(5))  # build the subclassed model once so summary() works
model.summary()

def representative_dataset_gen():
    for limit in range(1, 11):
        yield [tf.constant(limit, dtype=tf.int32)]

convert_tflite(model, rep_data=representative_dataset_gen)
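To double-check what actually ended up in the flatbuffer (the WHILE builtin plus its separate condition and body subgraphs), the model can also be inspected from Python with the TFLite analyzer. A short sketch, assuming the converted file is model.tflite:

import tensorflow as tf

# Prints the subgraphs, ops, and tensors contained in the converted model;
# the WHILE op and its cond/body subgraphs should be listed here.
tf.lite.experimental.Analyzer.analyze(model_path="model.tflite")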

Also, please include a link to a GraphDef or the model if possible.
[attached screenshot of the converted model]

Any other info / logs

This is what I meant by the loop getting stuck:

E (331202) task_wdt: Task watchdog got triggered. The following tasks did not reset the watchdog in time:
E (331202) task_wdt:  - IDLE (CPU 0)
E (331202) task_wdt: Tasks currently running:
E (331202) task_wdt: CPU 0: main
E (331202) task_wdt: CPU 1: IDLE
E (331202) task_wdt: Print CPU 0 (current core) backtrace


Backtrace: 0x4204E152:0x3FC92460 0x403770C9:0x3FC92480 0x42041845:0x3FCF39E0 0x4201AA06:0x3FCF3A00 0x42030772:0x3FCF3A40 0x4200927E:0x3FCF3A70 0x42008F5E:0x3FCF3AA0 0x42006CA3:0x3FCF3AC0 0x42006B36:0x3FCF3AE0 0x42006B2B:0x3FCF3B00 0x4206162C:0x3FCF3B20 0x4037C795:0x3FCF3B40
0x4204e152: task_wdt_isr at esp-adf/esp-idf/components/esp_system/task_wdt.c:183 (discriminator 3)

0x403770c9: _xt_lowint1 at esp-adf/esp-idf/components/freertos/port/xtensa/xtensa_vectors.S:1114

0x42041845: tflite::TfLiteTypeSizeOf(TfLiteType, unsigned int*) at dependency/esp-tflite-micro/tensorflow/lite/micro/memory_helpers.cc:87
 (inlined by) tflite::TfLiteEvalTensorByteLength(TfLiteEvalTensor const*, unsigned int*) at dependency/esp-tflite-micro/tensorflow/lite/micro/memory_helpers.cc:135

0x4201aa06: tflite::micro::ValidateAndGetTensorSizes(TfLiteEvalTensor const*, TfLiteEvalTensor const*) at dependency/esp-tflite-micro/tensorflow/lite/micro/kernels/kernel_util.cc:156
 (inlined by) tflite::micro::CopySubgraphOutputsToOpOutputs(TfLiteContext*, TfLiteNode*, tflite::MicroGraph*, int) at dependency/esp-tflite-micro/tensorflow/lite/micro/kernels/kernel_util.cc:262

0x42030772: tflite::(anonymous namespace)::WhileEval(TfLiteContext*, TfLiteNode*) at dependency/esp-tflite-micro/tensorflow/lite/micro/kernels/while.cc:110

0x4200927e: tflite::MicroInterpreterGraph::InvokeSubgraph(int) at dependency/esp-tflite-micro/tensorflow/lite/micro/micro_interpreter_graph.cc:194

0x42008f5e: tflite::MicroInterpreter::Invoke() at dependency/esp-tflite-micro/tensorflow/lite/micro/micro_interpreter.cc:294

0x42006ca3: loop at application/ml_testing/src/main_functions.cc:259

0x42006b36: app_init at application/ml_testing/src/ml_testing.cpp:11 (discriminator 1)

0x42006b2b: app_main at main/src/audio_frame_work.cpp:10

0x4206162c: main_task at esp-adf/esp-idf/components/freertos/port/port_common.c:141 (discriminator 2)

0x4037c795: vPortTaskWrapper at esp-adf/esp-idf/components/freertos/port/xtensa/port.c:142


E (331202) task_wdt: Print CPU 1 backtrace


Backtrace: 0x40378659:0x3FC92A60 0x403770C9:0x3FC92A80 0x400559DD:0x3FCF4940 |<-CORRUPTED
0x40378659: esp_crosscore_isr at esp-adf/esp-idf/components/esp_system/crosscore_int.c:92

0x403770c9: _xt_lowint1 at esp-adf/esp-idf/components/freertos/port/xtensa/xtensa_vectors.S:1114


E (336202) task_wdt: Task watchdog got triggered. The following tasks did not reset the watchdog in time:
E (336202) task_wdt:  - IDLE (CPU 0)
E (336202) task_wdt: Tasks currently running:
E (336202) task_wdt: CPU 0: main
E (336202) task_wdt: CPU 1: IDLE
E (336202) task_wdt: Print CPU 0 (current core) backtrace


Backtrace: 0x4204E152:0x3FC92460 0x403770C9:0x3FC92480 0x420326B6:0x3FCF39F0 0x4200927E:0x3FCF3A10 0x42030762:0x3FCF3A40 0x4200927E:0x3FCF3A70 0x42008F5E:0x3FCF3AA0 0x42006CA3:0x3FCF3AC0 0x42006B36:0x3FCF3AE0 0x42006B2B:0x3FCF3B00 0x4206162C:0x3FCF3B20 0x4037C795:0x3FCF3B40
0x4204e152: task_wdt_isr at esp-adf/esp-idf/components/esp_system/task_wdt.c:183 (discriminator 3)

0x403770c9: _xt_lowint1 at esp-adf/esp-idf/components/freertos/port/xtensa/xtensa_vectors.S:1114

0x420326b6: tflite::AddEval(TfLiteContext*, TfLiteNode*) at dependency/esp-tflite-micro/tensorflow/lite/micro/kernels/esp_nn/add.cc:213

0x4200927e: tflite::MicroInterpreterGraph::InvokeSubgraph(int) at dependency/esp-tflite-micro/tensorflow/lite/micro/micro_interpreter_graph.cc:194

0x42030762: tflite::(anonymous namespace)::WhileEval(TfLiteContext*, TfLiteNode*) at dependency/esp-tflite-micro/tensorflow/lite/micro/kernels/while.cc:107

0x4200927e: tflite::MicroInterpreterGraph::InvokeSubgraph(int) at dependency/esp-tflite-micro/tensorflow/lite/micro/micro_interpreter_graph.cc:194

0x42008f5e: tflite::MicroInterpreter::Invoke() at dependency/esp-tflite-micro/tensorflow/lite/micro/micro_interpreter.cc:294

0x42006ca3: loop at application/ml_testing/src/main_functions.cc:259

0x42006b36: app_init at application/ml_testing/src/ml_testing.cpp:11 (discriminator 1)

0x42006b2b: app_main at main/src/audio_frame_work.cpp:10

0x4206162c: main_task at esp-adf/esp-idf/components/freertos/port/port_common.c:141 (discriminator 2)

0x4037c795: vPortTaskWrapper at esp-adf/esp-idf/components/freertos/port/xtensa/port.c:142


E (336202) task_wdt: Print CPU 1 backtrace


Backtrace: 0x40378659:0x3FC92A60 0x403770C9:0x3FC92A80 0x400559DD:0x3FCF4940 |<-CORRUPTED
0x40378659: esp_crosscore_isr at esp-adf/esp-idf/components/esp_system/crosscore_int.c:92

0x403770c9: _xt_lowint1 at esp-adf/esp-idf/components/freertos/port/xtensa/xtensa_vectors.S:1114


E (341202) task_wdt: Task watchdog got triggered. The following tasks did not reset the watchdog in time:
E (341202) task_wdt:  - IDLE (CPU 0)
E (341202) task_wdt: Tasks currently running:
E (341202) task_wdt: CPU 0: main
E (341202) task_wdt: CPU 1: IDLE
E (341202) task_wdt: Print CPU 0 (current core) backtrace


Backtrace: 0x4204E152:0x3FC92460 0x403770C9:0x3FC92480 0x4200924C:0x3FCF3A10 0x42030762:0x3FCF3A40 0x4200927E:0x3FCF3A70 0x42008F5E:0x3FCF3AA0 0x42006CA3:0x3FCF3AC0 0x42006B36:0x3FCF3AE0 0x42006B2B:0x3FCF3B00 0x4206162C:0x3FCF3B20 0x4037C795:0x3FCF3B40
0x4204e152: task_wdt_isr at esp-adf/esp-idf/components/esp_system/task_wdt.c:183 (discriminator 3)

0x403770c9: _xt_lowint1 at esp-adf/esp-idf/components/freertos/port/xtensa/xtensa_vectors.S:1114

0x4200924c: tflite::EnumNameBuiltinOperator(tflite::BuiltinOperator) at dependency/esp-tflite-micro/tensorflow/lite/schema/schema_generated.h:1641
 (inlined by) OpNameFromRegistration at dependency/esp-tflite-micro/tensorflow/lite/micro/micro_interpreter_graph.cc:34
 (inlined by) tflite::MicroInterpreterGraph::InvokeSubgraph(int) at dependency/esp-tflite-micro/tensorflow/lite/micro/micro_interpreter_graph.cc:190

0x42030762: tflite::(anonymous namespace)::WhileEval(TfLiteContext*, TfLiteNode*) at dependency/esp-tflite-micro/tensorflow/lite/micro/kernels/while.cc:107

0x4200927e: tflite::MicroInterpreterGraph::InvokeSubgraph(int) at dependency/esp-tflite-micro/tensorflow/lite/micro/micro_interpreter_graph.cc:194

0x42008f5e: tflite::MicroInterpreter::Invoke() at dependency/esp-tflite-micro/tensorflow/lite/micro/micro_interpreter.cc:294

0x42006ca3: loop at application/ml_testing/src/main_functions.cc:259

0x42006b36: app_init at application/ml_testing/src/ml_testing.cpp:11 (discriminator 1)

0x42006b2b: app_main at main/src/audio_frame_work.cpp:10

0x4206162c: main_task at esp-adf/esp-idf/components/freertos/port/port_common.c:141 (discriminator 2)

0x4037c795: vPortTaskWrapper at esp-adf/esp-idf/components/freertos/port/xtensa/port.c:142


E (341202) task_wdt: Print CPU 1 backtrace


Backtrace: 0x40378659:0x3FC92A60 0x403770C9:0x3FC92A80 0x400559DD:0x3FCF4940 |<-CORRUPTED
0x40378659: esp_crosscore_isr at esp-adf/esp-idf/components/esp_system/crosscore_int.c:92

0x403770c9: _xt_lowint1 at esp-adf/esp-idf/components/freertos/port/xtensa/xtensa_vectors.S:1114
github-actions bot (Contributor) commented:

"This issue is being marked as stale due to inactivity. Remove label or comment to prevent closure in 5 days."

github-actions bot added the Stale label Oct 17, 2024
github-actions bot (Contributor) commented:

"This issue is being closed because it has been marked as stale for 5 days with no further activity."

github-actions bot closed this as not planned Oct 22, 2024