Description
Describe the bug
If I try to use ModelBuilder to create a deployable model from a trained PyTorch estimator and then save it, I get the following error:
sagemaker.config INFO - Applied value from config key = SageMaker.PythonSDK.Modules.Session.DefaultS3Bucket
sagemaker.config INFO - Applied value from config key = SageMaker.PythonSDK.Modules.Session.DefaultS3ObjectKeyPrefix
ModelBuilder: INFO: Save path: /tmp/sagemaker/save/2025-03-13-10-03-56
ModelBuilder: INFO: Saving model to /tmp/sagemaker/save/2025-03-13-10-03-56
ModelBuilder: INFO: Inferred Framework string: <class 'sagemaker.model.FrameworkModel'>
ModelBuilder: INFO: Inferred Python version tuple: ('3', '11', '11')
TypeError Traceback (most recent call last)
Cell In[22], line 1
----> 1 modelBuilder.save()
File /opt/conda/lib/python3.11/site-packages/sagemaker/serve/builder/model_builder.py:986, in ModelBuilder.save(self, save_path, s3_path, sagemaker_session, role_arn)
972 self.role_arn = role_arn
974 save_handler = SaveHandler(
975 model=self.model,
976 schema_builder=self.schema_builder,
(...)
983 metadata=Metadata(),
984 )
--> 986 return save_handler.save()
File /opt/conda/lib/python3.11/site-packages/sagemaker/serve/save_retrive/version_1_0_0/save/save_handler.py:170, in SaveHandler.save(self)
167 if not Path(self.save_path).exists():
168 Path(self.save_path).mkdir(parents=True, exist_ok=True)
--> 170 inferred = detect_framework_and_its_versions(
171 self.model if self.model else self.inference_spec.load(self.model_loader_path)
172 )
173 self.framework = inferred[0][0]
174 self.framework_version = inferred[0][1]
File /opt/conda/lib/python3.11/site-packages/sagemaker/serve/save_retrive/version_1_0_0/save/utils.py:178, in detect_framework_and_its_versions(model)
176 raise Exception("Unable to import xgboost, check if pytorch is installed")
177 else:
--> 178 raise Exception("Unable to determine framework for tht base model" % framework_string)
180 logger.info("Inferred framework and its version: %s %s", fw, vs)
182 return [(fw, vs), py_tuple]
TypeError: not all arguments converted during string formatting
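(For what it's worth, the TypeError itself looks like a secondary bug: the raise at utils.py line 178 applies % to a message string that contains no %s placeholder, so Python raises "not all arguments converted during string formatting" instead of the intended Exception. A minimal sketch outside the SDK:)

# Assumed minimal reproduction of the secondary formatting bug, outside the SDK:
# "%" applied to a string with no "%s" placeholder raises the same TypeError.
framework_string = "<class 'sagemaker.model.FrameworkModel'>"
"Unable to determine framework for tht base model" % framework_string
# TypeError: not all arguments converted during string formatting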
To reproduce
from sagemaker.pytorch import PyTorch

# Train a simple PyTorch estimator
hyperparameters = {'epochs': 5}
mnistfashion_estimator = PyTorch(entry_point='Training.py',
                                 source_dir='model',
                                 role=role,
                                 framework_version='2.3.0',
                                 py_version='py311',
                                 hyperparameters=hyperparameters,
                                 instance_count=1,
                                 instance_type="ml.p3.2xlarge")
mnistfashion_estimator.fit(inputs)
model = mnistfashion_estimator.create_model()

# Build and save the model with ModelBuilder
model_path = "s3://{0}/{1}/{2}/dev/{3}".format(bucket, project.domain_id, project.id, 'model')

import sagemaker
from sagemaker import Model
from sagemaker.serve import ModelBuilder
#from sagemaker.model import ModelBuilder
modelBuilder = ModelBuilder(model=model, s3_model_data_url=model_path)
modelBuilder.save()
Expected behavior
I would expect the model to be saved without error.
Screenshots or logs
See the traceback above under "Describe the bug".
System information
- SageMaker Python SDK version: 2.227.0
- Framework: PyTorch
- Framework version: 2.3.0
- Python version: py311
- CPU or GPU: GPU
- Custom Docker image (Y/N): N
Additional context
Looking at the code in sagemaker/serve/save_retrive/version_1_0_0/save/utils.py, in detect_framework_and_its_versions:
def detect_framework_and_its_versions(model: object) -> bool:
    """Placeholder docstring"""
    model_base = model.__class__.__base__
    if object == model_base:
        model_base = model.__class__
    framework_string = str(model_base)
    logger.info("Inferred Framework string: %s", framework_string)
    py_tuple = platform.python_version_tuple()
    logger.info("Inferred Python version tuple: %s", py_tuple)
    fw = ""
    vs = ""
    if "torch" in framework_string:
        ...
The model's class here is PyTorchModel, but its immediate base class (model.__class__.__base__) is sagemaker.model.FrameworkModel, so the "torch" check never matches for a valid framework model and the function falls through to the final raise. That raise is itself broken: the message string has no %s placeholder, so the % formatting triggers the TypeError shown above instead of the intended Exception.
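A minimal sketch of the mismatch (run against the installed SDK; framework_string_from_mro is only a suggested alternative, not existing SDK code):

from sagemaker.pytorch import PyTorchModel

# The immediate base class of PyTorchModel is FrameworkModel, so the
# "torch" substring check in detect_framework_and_its_versions never matches.
print(PyTorchModel.__base__)                    # <class 'sagemaker.model.FrameworkModel'>
print("torch" in str(PyTorchModel.__base__))    # False

# Suggested fix (assumption, not current SDK behavior): inspect the full MRO
# so the concrete class, PyTorchModel, is also considered.
def framework_string_from_mro(model_cls):
    return " ".join(str(c) for c in model_cls.__mro__)

print("torch" in framework_string_from_mro(PyTorchModel))  # True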