feature: separating sagemaker dependencies into more use case specific installable components. #1130


Merged
merged 10 commits on Nov 21, 2019
12 changes: 12 additions & 0 deletions doc/overview.rst
@@ -661,6 +661,12 @@ For example, the ``dataframe`` method gets a pandas dataframe summarizing the as
# Look at summary of associated training jobs
my_dataframe = my_tuner_analytics.dataframe()

You can install all of the dependencies necessary for this feature using pip:

::

pip install 'sagemaker[analytics]' --upgrade
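
As a minimal sketch of where that dependency is used (the tuning job name below is a placeholder, not taken from the original example):

.. code:: python

    import sagemaker

    # Placeholder job name; substitute one of your own tuning jobs.
    my_tuner_analytics = sagemaker.HyperparameterTuningJobAnalytics("my-tuning-job")

    # dataframe() builds a pandas DataFrame, which is why the
    # 'analytics' extra pulls in pandas.
    my_dataframe = my_tuner_analytics.dataframe()
    print(my_dataframe.head())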

For more detailed examples of running hyperparameter tuning jobs, see:

- `Using the TensorFlow estimator with hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/tensorflow_mnist/hpo_tensorflow_mnist.ipynb>`__
@@ -718,6 +724,12 @@ The SageMaker Python SDK supports local mode, which allows you to create estimat
This is a great way to test your deep learning scripts before running them in SageMaker's managed training or hosting environments.
Local Mode is supported for framework images (TensorFlow, MXNet, Chainer, PyTorch, and Scikit-Learn) and images you supply yourself.

You can install all of the dependencies necessary for this feature using pip:

::

pip install 'sagemaker[local]' --upgrade

We can take the example in `Using Estimators <#using-estimators>`__, and use either ``local`` or ``local_gpu`` as the instance type.

.. code:: python
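
    # Minimal local mode sketch; entry_point, role and framework_version are
    # placeholder values, not taken from the truncated example above.
    from sagemaker.mxnet import MXNet

    mxnet_estimator = MXNet(
        entry_point="train.py",
        role="SageMakerRole",
        train_instance_count=1,
        train_instance_type="local",  # or "local_gpu" to use a GPU container
        framework_version="1.4.1",
    )

    # In local mode, file:// URLs let you train on data stored on this machine.
    mxnet_estimator.fit("file:///tmp/my_training_data")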
52 changes: 31 additions & 21 deletions setup.py
@@ -38,12 +38,40 @@ def read_version():
"numpy>=1.9.0",
"protobuf>=3.1",
"scipy>=0.19.0",
"urllib3>=1.21, <1.25",
"urllib3>=1.21, <1.25", # local mode dependencies -> remove in the next release
"protobuf3-to-dict>=0.1.5",
"requests>=2.20.0, <2.21",
Contributor:

I realize this isn't strictly in scope, but the requests bounds were somewhat defined by docker-compose - we could probably change it to requests>=2.20.0?

Contributor (author):

Maybe - I don't have the bandwidth to do a proper check for it as part of this PR, knowing how many problems it caused before.

Contributor:

fair enough 😂

"fabric>=2.0",
"docker-compose>=1.23.0", # local mode dependencies -> remove in the next release
]

# Specific use case dependencies
extras = {
"analytics": ["pandas"],
"local": ["urllib3>=1.21, <1.25", "docker-compose>=1.23.0"],
"tensorflow": ["tensorflow>=1.3.0"],
}
# Meta dependency groups
extras["all"] = [item for group in extras.values() for item in group]
# Tests specific dependencies (do not need to be included in 'all')
extras["test"] = (
[
extras["all"],
"tox==3.13.1",
"flake8",
"pytest==4.4.1",
"pytest-cov",
"pytest-rerunfailures",
"pytest-xdist",
"mock",
"contextlib2",
"awslogs",
"black==19.3b0 ; python_version >= '3.6'",
"stopit==1.1.2",
"apache-airflow==1.10.5",
"fabric>=2.0",
],
)

# enum is introduced in Python 3.4. Installing enum back port
if sys.version_info < (3, 4):
required_packages.append("enum34>=1.1.6")
@@ -70,24 +98,6 @@ def read_version():
"Programming Language :: Python :: 3.6",
],
install_requires=required_packages,
extras_require={
"test": [
"tox==3.13.1",
"flake8",
"pytest==4.4.1",
"pytest-cov",
"pytest-rerunfailures",
"pytest-xdist",
"mock",
"tensorflow>=1.3.0",
"contextlib2",
"awslogs",
"pandas",
"black==19.3b0 ; python_version >= '3.6'",
"stopit==1.1.2",
"apache-airflow==1.10.5",
"docker-compose>=1.23.0",
]
},
extras_require=extras,
entry_points={"console_scripts": ["sagemaker=sagemaker.cli.main:main"]},
)
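
As a side note, the ``extras["all"]`` comprehension above simply flattens every per-feature group into one list. A standalone sketch (not part of the diff) of what that produces:

.. code:: python

    extras = {
        "analytics": ["pandas"],
        "local": ["urllib3>=1.21, <1.25", "docker-compose>=1.23.0"],
        "tensorflow": ["tensorflow>=1.3.0"],
    }

    # One flat list containing every requirement from every group.
    extras["all"] = [item for group in extras.values() for item in group]

    print(extras["all"])
    # On Python 3.7+ (insertion-ordered dicts) this prints:
    # ['pandas', 'urllib3>=1.21, <1.25', 'docker-compose>=1.23.0', 'tensorflow>=1.3.0']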
11 changes: 9 additions & 2 deletions src/sagemaker/local/entities.py
@@ -19,12 +19,19 @@
import os
import tempfile
import time
import urllib3

import sagemaker.local.data
from sagemaker.local.image import _SageMakerContainer
from sagemaker.local.utils import copy_directory_structure, move_to_destination
from sagemaker.utils import get_config_value
from sagemaker.utils import DeferredError, get_config_value

try:
import urllib3
except ImportError as e:
logging.warning("urllib3 failed to import. Local mode features will be impaired or broken.")
# Any subsequent attempt to use urllib3 will raise the ImportError
urllib3 = DeferredError(e)


logger = logging.getLogger(__name__)

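The try/except above guards the ``urllib3`` import with ``DeferredError`` from ``sagemaker.utils``, so the ``ImportError`` only surfaces when ``urllib3`` is actually used. A minimal sketch of that pattern (the real implementation in ``sagemaker.utils`` may differ in detail):

.. code:: python

    class DeferredError(object):
        """Placeholder for a module that failed to import.

        The stored exception is re-raised only when the missing module is
        actually touched, i.e. on any attribute access.
        """

        def __init__(self, exception):
            self.exc = exception

        def __getattr__(self, name):
            # e.g. urllib3.exceptions -> raises the original ImportError
            raise self.exc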
10 changes: 8 additions & 2 deletions src/sagemaker/local/local_session.py
@@ -17,7 +17,6 @@
import platform

import boto3
import urllib3
from botocore.exceptions import ClientError

from sagemaker.local.image import _SageMakerContainer
@@ -29,7 +28,14 @@
_LocalTransformJob,
)
from sagemaker.session import Session
from sagemaker.utils import get_config_value
from sagemaker.utils import DeferredError, get_config_value

try:
import urllib3
except ImportError as e:
logging.warning("urllib3 failed to import. Local mode features will be impaired or broken.")
# Any subsequent attempt to use urllib3 will raise the ImportError
urllib3 = DeferredError(e)

logger = logging.getLogger(__name__)

Expand Down