
Strip not working as expected #345

Closed

Description

@aodj

I recently had to add the Pillow library to my project, but when packaging with slim enabled I get the usual ELF alignment issue.

Looking over the documentation, I saw that strip: false is supposed to handle this case by not stripping the .so files, but I still get the error once the service is deployed to AWS Lambda:

$ sls deploy
Serverless: Generated requirements from /Users/alexander/Documents/github/multiplechoice/spiders/requirements.txt in /Users/alexander/Documents/github/multiplechoice/spiders/.serverless/requirements.txt...
Serverless: Installing requirements from /Users/alexander/Documents/github/multiplechoice/spiders/.serverless/requirements/requirements.txt ...
Serverless: Docker Image: lambci/lambda:build-python3.7
Serverless: Using download cache directory /Users/alexander/Library/Caches/serverless-python-requirements/downloadCacheslspyc
Serverless: Running docker run --rm -v /Users/alexander/Documents/github/multiplechoice/spiders/.serverless/requirements\:/var/task\:z -v /Users/alexander/Library/Caches/serverless-python-requirements/downloadCacheslspyc\:/var/useDownloadCache\:z -u 0 lambci/lambda\:build-python3.7 /bin/sh -c 'python3.7 -m pip install -t /var/task/ -r /var/task/requirements.txt --cache-dir /var/useDownloadCache && find /var/task -name \\*.so -exec strip \\{\\} \\;'...
Serverless: Packaging service...
Serverless: Excluding development dependencies...
Serverless: Injecting required Python packages to package...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading artifacts...
Serverless: Uploading service spiders.zip file to S3 (16.11 MB)...
Serverless: Validating template...
Serverless: Updating Stack...
Serverless: Checking Stack update progress...
...........................
Serverless: Stack update finished...
Service Information
service: spiders
stage: dev
region: eu-central-1
stack: spiders-dev
resources: 22
api keys:
  None
endpoints:
  None
functions:
  tvinna: spiders-dev-tvinna
  mbl: spiders-dev-mbl
  alfred: spiders-dev-alfred
  job: spiders-dev-job
layers:
  None
Serverless: Removing old service artifacts from S3...

Once the deploy is complete and I invoke the function, I get the following output:

$ sls invoke -f tvinna --log
START RequestId: 6730c6ac-217b-4fa8-a18f-ffd732473f1c Version: $LATEST
Unhandled error in Deferred:
Traceback (most recent call last):
  File "/var/task/scrapy/crawler.py", line 172, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "/var/task/scrapy/crawler.py", line 176, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "/var/task/twisted/internet/defer.py", line 1613, in unwindGenerator
    return _cancellableInlineCallbacks(gen)
  File "/var/task/twisted/internet/defer.py", line 1529, in _cancellableInlineCallbacks
    _inlineCallbacks(None, g, status)
--- <exception caught here> ---
  File "/var/task/twisted/internet/defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "/var/task/scrapy/crawler.py", line 80, in crawl
    self.engine = self._create_engine()
  File "/var/task/scrapy/crawler.py", line 105, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/var/task/scrapy/core/engine.py", line 70, in __init__
    self.scraper = Scraper(crawler)
  File "/var/task/scrapy/core/scraper.py", line 71, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "/var/task/scrapy/middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/var/task/scrapy/middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "/var/task/scrapy/utils/misc.py", line 44, in load_object
    mod = import_module(module)
  File "/var/lang/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/var/task/jobs/pipelines.py", line 7, in <module>
    from scrapy.pipelines.images import ImagesPipeline
  File "/var/task/scrapy/pipelines/images.py", line 15, in <module>
    from PIL import Image
  File "/var/task/PIL/Image.py", line 93, in <module>
    from . import _imaging as core
builtins.ImportError: /var/task/PIL/_imaging.cpython-37m-x86_64-linux-gnu.so: ELF load command address/offset not properly aligned

END RequestId: a53f34b0-6e07-4358-9f4b-9868b30be703
REPORT RequestId: a53f34b0-6e07-4358-9f4b-9868b30be703	Duration: 131.01 ms	Billed Duration: 200 ms 	Memory Size: 1024 MB	Max Memory Used: 135 MB
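For what it's worth, the misalignment can be confirmed offline, without deploying, by parsing the program headers of the packaged .so. This is a quick diagnostic helper I put together (my own code, not part of the plugin; it assumes little-endian x86-64 ELF binaries, which is what the Lambda build image produces):

```python
import struct

def elf_load_alignment_errors(path):
    """Return indices of PT_LOAD program headers whose file offset and
    virtual address disagree modulo the segment alignment -- the condition
    behind 'ELF load command address/offset not properly aligned'."""
    with open(path, "rb") as f:
        data = f.read()
    assert data[:4] == b"\x7fELF", "not an ELF file"
    is64 = data[4] == 2  # EI_CLASS: 2 = ELFCLASS64
    if is64:
        e_phoff = struct.unpack_from("<Q", data, 0x20)[0]
        e_phentsize = struct.unpack_from("<H", data, 0x36)[0]
        e_phnum = struct.unpack_from("<H", data, 0x38)[0]
    else:
        e_phoff = struct.unpack_from("<I", data, 0x1C)[0]
        e_phentsize = struct.unpack_from("<H", data, 0x2A)[0]
        e_phnum = struct.unpack_from("<H", data, 0x2C)[0]
    bad = []
    for i in range(e_phnum):
        off = e_phoff + i * e_phentsize
        p_type = struct.unpack_from("<I", data, off)[0]
        if p_type != 1:  # only PT_LOAD segments are mapped by the loader
            continue
        if is64:
            p_offset = struct.unpack_from("<Q", data, off + 0x08)[0]
            p_vaddr = struct.unpack_from("<Q", data, off + 0x10)[0]
            p_align = struct.unpack_from("<Q", data, off + 0x30)[0]
        else:
            p_offset = struct.unpack_from("<I", data, off + 0x04)[0]
            p_vaddr = struct.unpack_from("<I", data, off + 0x08)[0]
            p_align = struct.unpack_from("<I", data, off + 0x1C)[0]
        if p_align > 1 and (p_offset % p_align) != (p_vaddr % p_align):
            bad.append(i)
    return bad
```

Running it against the .so inside .serverless/requirements (e.g. elf_load_alignment_errors(".serverless/requirements/PIL/_imaging.cpython-37m-x86_64-linux-gnu.so")) returns a non-empty list for the broken binary, so the damage happens at packaging time, before anything reaches Lambda.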

The serverless config has the following setup:

custom:
  pythonRequirements:
    dockerizePip: true
    useDownloadCache: true
    slim: true
    strip: false

As you can see from the deploy log output, the command executed in the Docker container still runs the strip command: find /var/task -name \\*.so -exec strip \\{\\} \\;
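In the meantime, disabling slim entirely does avoid the problem for me (at the cost of a larger package), since the strip step is only appended as part of slimming. A sketch of that workaround config, same options as above with slim toggled off:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    useDownloadCache: true
    slim: false  # skip slimming entirely so the .so files are never stripped
```

That obviously isn't a fix, though: it suggests the bug is specific to how strip: false is handled when slim: true is set.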

Any ideas?
