RuntimeError: Failed to import transformers ('NoneType' object has no attribute 'split') with Python 3.11, TensorFlow 2.15 & Docker

I am encountering a critical RuntimeError during the startup of my Dockerized FastAPI application. The error occurs specifically when sentence_transformers attempts to import transformers.models.auto.modeling_auto.

It seems related to an environment-variable parsing issue ('NoneType' object has no attribute 'split'), but I cannot pinpoint which variable is causing the crash inside the transformers library.
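To get at the exception the lazy loader hides, a minimal script run inside the same container should help: importing the failing submodule directly raises the original error with its full traceback, instead of the wrapped RuntimeError shown below.

import importlib
import traceback

try:
    # Bypass transformers' lazy module wrapper and import the submodule
    # named in the RuntimeError directly.
    importlib.import_module("transformers.models.auto.modeling_auto")
except Exception:
    # Prints the full chained traceback, including the exact line inside
    # transformers where .split() is called on a None value.
    traceback.print_exc()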

Environment Details:

Python: 3.11 (running in Docker)

TensorFlow: 2.15.0

tf-keras: 2.15.1

Transformers: 4.39.3

Sentence-Transformers: 2.7.0

Keras (standalone): uninstalled (I verified that pip list only shows tf-keras; see the version check below).
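A quick way to confirm what the container's interpreter actually resolves (a small check using the standard library's importlib.metadata):

from importlib.metadata import PackageNotFoundError, version

# Confirm the package set as the container's interpreter sees it.
for pkg in ("transformers", "sentence-transformers", "tensorflow", "tf-keras", "keras"):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")  # expected for keras in this setup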

The Error Traceback:

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 694, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/usr/local/lib/python3.11/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/app/api_server.py", line 138, in lifespan
    app.state.rag = get_rag()
                    ^^^^^^^^^
  File "/app/api_server.py", line 120, in get_rag
    from modules.rag.graph import RAGPipeline
  File "/app/modules/rag/__init__.py", line 3, in <module>
    from .reranker import CrossEncoderReranker
  File "/app/modules/rag/reranker.py", line 8, in <module>
    from sentence_transformers import CrossEncoder
  File "/usr/local/lib/python3.11/site-packages/sentence_transformers/__init__.py", line 3, in <module>
    from .datasets import SentencesDataset, ParallelSentencesDataset
  File "/usr/local/lib/python3.11/site-packages/sentence_transformers/datasets/__init__.py", line 3, in <module>
    from .ParallelSentencesDataset import ParallelSentencesDataset
  File "/usr/local/lib/python3.11/site-packages/sentence_transformers/datasets/ParallelSentencesDataset.py", line 4, in <module>
    from .. import SentenceTransformer
  File "/usr/local/lib/python3.11/site-packages/sentence_transformers/SentenceTransformer.py", line 38, in <module>
    from .models import Transformer, Pooling, Normalize
  File "/usr/local/lib/python3.11/site-packages/sentence_transformers/models/__init__.py", line 1, in <module>
    from .Transformer import Transformer
  File "/usr/local/lib/python3.11/site-packages/sentence_transformers/models/Transformer.py", line 2, in <module>
    from transformers import AutoModel, AutoTokenizer, AutoConfig, T5Config, MT5Config
  File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
  File "/usr/local/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1463, in __getattr__
    value = getattr(module, name)
            ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1462, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1474, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.auto.modeling_auto because of the following error (look up to see its traceback):
'NoneType' object has no attribute 'split'
ERROR: Application startup failed. Exiting.

What I have tried so far:

I suspected a Keras 2 vs. Keras 3 conflict, so I uninstalled keras and kept tf-keras.

I set the following environment variables both in docker-compose.yml and explicitly in my Python script via os.environ (see the placement sketch below):

os.environ["TRANSFORMERS_NO_TF"] = "1"
os.environ["TF_USE_LEGACY_KERAS"] = "1"
os.environ["CUDA_VISIBLE_DEVICES"] = "-1"  # Suspected this was None, so I set it to -1

I verified that pip list shows tensorflow==2.15.0 and tf-keras==2.15.1.

Despite setting CUDA_VISIBLE_DEVICES to "-1", the error persists at the exact same import line.

Does anyone know which specific environment variable transformers v4.39.3 parses with .split() that might be returning None in a Docker environment?
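Since transformers reads these variables at import time, they only take effect if they are set before the first transformers import anywhere in the process. A minimal sketch of the placement (hypothetical top of api_server.py, assuming nothing imports transformers earlier):

import os

# These must be set BEFORE anything imports transformers (directly or via
# sentence_transformers), because transformers reads them at import time.
os.environ["TRANSFORMERS_NO_TF"] = "1"
os.environ["TF_USE_LEGACY_KERAS"] = "1"
os.environ["CUDA_VISIBLE_DEVICES"] = "-1"

# Imports that pull in transformers come only after the environment is set.
from sentence_transformers import CrossEncoder  # noqa: E402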