### System Info

- `transformers` version: 4.34.1
- Platform: Windows-10-10.0.22621-SP0
- Python version: 3.9.12
- Huggingface_hub version: 0.16.4
- Safetensors version: 0.3.2
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): 1.13.1+cpu (False)
- Tensorflow version (GPU?): 2.13.0 (False)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: no
- Using distributed or parallel set-up in script?: no

### Who can help?

@ArthurZucker @amyeroberts

### Information

- [ ] The official example scripts
- [ ] My own modified scripts

### Tasks

- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)

### Reproduction

As described [here](https://github.com/huggingface/transformers/pull/23909#issuecomment-1785240174), trying to load this tokenizer:

```python
from transformers import AutoTokenizer

AutoTokenizer.from_pretrained("iarfmoose/t5-base-question-generator")
```

raises [this exception](https://github.com/huggingface/transformers/blob/e4dad4fe32525c26eccb5790c258aa271476ac33/src/transformers/models/t5/tokenization_t5_fast.py#L127).

### Expected behavior

The tokenizer should load without error, as it did in versions < 4.34.
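For context, here is a simplified, hypothetical sketch (not the actual `transformers` source) of the kind of consistency check between `extra_ids` and `additional_special_tokens` that the linked line in `tokenization_t5_fast.py` appears to perform; a repo whose `additional_special_tokens` omit the `<extra_id_N>` sentinel tokens would trip a check like this:

```python
# Hypothetical, simplified reimplementation of the consistency check that
# T5TokenizerFast appears to run at load time; the real logic lives in
# tokenization_t5_fast.py and may differ in detail.
def check_extra_ids(extra_ids, additional_special_tokens):
    if extra_ids > 0 and additional_special_tokens:
        # Count how many provided special tokens look like <extra_id_N> sentinels
        sentinel_count = len(
            {t for t in additional_special_tokens if "extra_id_" in str(t)}
        )
        if sentinel_count != extra_ids:
            raise ValueError(
                f"Both extra_ids ({extra_ids}) and additional_special_tokens "
                f"({additional_special_tokens}) are provided, but the special "
                f"tokens do not include the expected sentinel tokens."
            )

# Custom special tokens without the sentinel tokens trigger the error:
try:
    check_extra_ids(100, ["<hl>", "<sep>"])
except ValueError as e:
    print("raised:", e)
```

This is only meant to illustrate why a tokenizer repo with custom `additional_special_tokens` can fail the load in 4.34 while loading fine in earlier versions.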