runtime error

Exit code: 1. Reason:
Running on device: cpu
processor_config.json: 100%|██████████| 371/371 [00:00<00:00, 2.42MB/s]
tokenizer_config.json: 100%|██████████| 5.12k/5.12k [00:00<00:00, 25.3MB/s]
preprocessor_config.json: 100%|██████████| 284/284 [00:00<00:00, 1.78MB/s]
Using a slow image processor as `use_fast` is unset and a slow processor was saved with this model. `use_fast=True` will be the default behavior in v4.52, even if the model was saved with a slow processor. This will result in minor differences in outputs. You'll still be able to use a slow processor with `use_fast=False`.
Error loading MedASR: Unrecognized processing class in google/medasr. Can't instantiate a processor, a tokenizer, an image processor or a feature extractor for this model. Make sure the repository contains the files of at least one of those processing classes.
Traceback (most recent call last):
  File "/app/app.py", line 34, in <module>
    raise e
  File "/app/app.py", line 20, in <module>
    processor = AutoProcessor.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/processing_auto.py", line 424, in from_pretrained
    raise ValueError(
ValueError: Unrecognized processing class in google/medasr. Can't instantiate a processor, a tokenizer, an image processor or a feature extractor for this model. Make sure the repository contains the files of at least one of those processing classes.
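The failure happens when app.py calls AutoProcessor.from_pretrained on google/medasr: the config files download, but transformers cannot map them to a processor, tokenizer, image processor, or feature extractor. Below is a minimal sketch of the failing call with one possible fallback. It assumes (not confirmed by the log) that the repository may define a custom processing class as remote code on the Hub, in which case passing trust_remote_code=True could let transformers load it; if the repo simply lacks the processing-class files, this fallback will fail the same way.

```python
from transformers import AutoProcessor

MODEL_ID = "google/medasr"  # repository name taken from the traceback above

try:
    # The call that raises ValueError at /app/app.py, line 20 in the log.
    processor = AutoProcessor.from_pretrained(MODEL_ID)
except ValueError as err:
    print(f"AutoProcessor failed: {err}")
    # Hypothetical fallback: only helps if the repo actually ships a custom
    # processor implementation as remote code on the Hub.
    processor = AutoProcessor.from_pretrained(MODEL_ID, trust_remote_code=True)
```

If both attempts fail, the remaining options are checking the repository's file listing for a processing class or pinning a transformers version that recognizes it, both of which need to be verified against the model card.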
