Add --only-deps (and --only-build-deps) option(s) #11440
Thanks for filing this @flying-sheep! I wonder if it would be better for …

If I interpret the spartan docs for …

There’s actually yet another possibility: make … I think I personally like …
I see, those options are modifiers that pick out individual packages from the flattened dependency list and modify pip’s behavior toward those packages. So one would do:

cd mypkg  # project dir
pip install --only-deps=mypkg .[floob]

I think it makes sense regarding consistency with …
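The semantics described above can be sketched as a post-resolution filter. This is an illustrative helper, not pip internals, and the package names are made up:

```python
# Sketch (not pip internals): after the resolver flattens the dependency
# graph, --only-deps=<name> would drop <name> itself from the install set
# while keeping everything it pulled in.
def apply_only_deps(resolved_packages, only_deps_names):
    return [pkg for pkg in resolved_packages if pkg not in only_deps_names]

# Hypothetical resolution of `pip install --only-deps=mypkg .[floob]`:
resolved = ["mypkg", "requests", "urllib3", "certifi"]
print(apply_only_deps(resolved, {"mypkg"}))  # → ['requests', 'urllib3', 'certifi']
```

The point of the sketch is that the flag names a package to exclude, rather than globally changing what pip installs.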
It may make sense to create wildcard names, e.g. …
This would be extremely useful to prepare lambda layers, or any kind of "pre-provided" environment, whilst keeping the exact requirements (including locks) properly versioned in a git repository. The target environment could then be replicated with ease, e.g. when developing locally, or when testing. An example follows:

[build-system]
requires = [
    "setuptools >= 45",
    "wheel",
]
build-backend = "setuptools.build_meta"

[project]
name = "my-lambda"
requires-python = ">= 3.7"
version = "0.1.0"

# Again, this is just an example!
[project.optional-dependencies]
provided = [
    "typing-extensions >= 4",
    "requests ~= 2.23.0",
    "requests_aws4auth ~= 0.9",
    "boto3 ~= 1.13.14",
    "certifi >= 2020.4.5.1",
    "elasticsearch ~= 7.7.0",
    "elasticsearch_dsl ~= 7.2.0",
    "aws_requests_auth ~= 0.4.2",
]
pre-commit = [
    'nox >= 2022.1',
    'pytest >= 7.1.2',
    'black[d] >= 22',
    'mypy >= 0.950',
    'pre-commit >= 2.17.0',
    'flake8 >= 4; python_version >= "3.8"',
    'flake8 < 4; python_version < "3.8"',
    'pydocstyle[toml] >= 6.1.1',
    'isort >= 5.10.1',
]

Then, when creating a new "provided" environment (e.g. a lambda layer):

# Must be run in a similar environment as the target one.
# Advantage of this over `pip download` is the ability
# to mix source and binary distributions, whenever
# necessary (e.g. downloading both numpy and pyspark).
# Could also add locks/pinning, via `--constraint`.
mkdir -p dist/python
pip3 install \
    .[provided] \
    --target dist/python \
    --only-deps=:requested:
( cd dist && zip ../dist.provided.zip ./python )

And in a development or CI-like environment:

# May be cached.
python3 -m venv venv
source ./venv/bin/activate
# Gets all development tools, and anything used to run
# automated tasks.
# Could also add locks/pinning, via `--constraint`.
./venv/bin/pip3 install -e .[provided,pre-commit]
@uranusjr Sure, I’m not married to the semantics I suggested. I’m fine with your design. Now we just need someone to implement it lol.

Would it be reasonable to have …
Yes.
I'm not sure I understand the …

pip install . --only-deps dependencies
pip install . --only-deps requires
pip install . --only-deps doc  # for a "doc" key under optional-dependencies

And the two comments above say that if you need two of those, you need multiple invocations of …

IMO, … Nonetheless, I agree that we don't need the additional complexity here.
Okay, so: …

And just to confirm, …
I'd suggest a small variation, to avoid overloading the [extras] syntax.

Maybe there are situations where you only want to install the optional dependencies without the project dependencies.

I'm not comfortable with that. Indeed, extras are additive to the base dependencies by definition, so such a mechanism sounds a bit awkward to me.
The desire for this feature just came up in a discussion at work around whether dependencies should be recorded in …

The motivating scenario is beginners who have started coding, have some script or package going (i.e., not doing a …

Without a build system there's no way for pip to determine what the dependencies are; pip doesn't (AFAIK, and it shouldn't) read dependencies from …

@brettcannon In the use case, what is the motivation behind not including …

If …

Are you saying we need to come up with a separate standard to list arbitrary dependencies like requirements files allow for? (This is not the same as a lock file to me; think of it as the input file of your top-level dependencies for a lock-file generation tool.)

Beginners don't typically need it (i.e., they aren't doing a …
Yes, it's technically true that if the …

I understand the use case, and in a practical sense, getting the data from …

Is there a reason this couldn't be an external tool?

# Warning: code has only been lightly tested!
import subprocess
import sys
import tomllib

FILE = "pyproject.toml"  # path assumed from context

with open(FILE, "rb") as f:
    data = tomllib.load(f)
if "project" not in data:
    raise ValueError("No PEP 621 metadata in pyproject.toml")
if "dependencies" in data["project"].get("dynamic", []):
    raise ValueError("Dependencies cannot be dynamic")
deps = data["project"].get("dependencies")
if deps:
    cmd = [sys.executable, "-m", "pip", "install", *deps]
    subprocess.run(cmd)
Everything can be an external tool 😉, but at least for VS Code we have a general policy of trying not to introduce our own custom tooling, so people can operate outside of VS Code without issue. So if we created an …

If the answer from pip is, "use requirements files," then I totally understand and accept that, as that's pip's mechanism for this sort of thing. But it also means that I will probably have to develop some new standard for feeding dependencies into a lock-file tool, since right now it seems like all that tool could take is what's provided on the command line (although, ironically, pip-tools now works with …

My impression reading the above is that the real hurdle is actually making the code a proper Python package (with the build system etc. defined), not installing only the dependencies of a Python package. The latter is in itself an entirely separate valid use case, but can be more properly covered by external tooling. So maybe what we really need is an accompanying file format that mirrors PEP 621, but does not express a Python package by forbidding …
I was thinking about this more, and it occurred to me that I don't think it actually exists to have a …

PEP 518 says that it is expected if the …

PEP 517 says that if …

Thus it is my assertion that any directory that has a …

Likewise, since setuptools implements PEP 621, the following …

[project]
name = "test"
version = "1.0"
dependencies = ["requests"]

So I guess, in a way, what @brettcannon wants exists already (other than the …). Also note that it's not a valid PEP 621 file if it doesn't have a name and version specified (either dynamically or statically for version, and only statically for name). This means that it's not possible to create a valid …

pip-tools isn't reading the …
There are some references in here to reading metadata from …

For example, in Hatch, you do:

[project]
dependencies = [
    "black @ {root:uri}/black_editable"
]

...to declare a dependency at a path relative to the current project. This is called the Context formatting API. Similarly, PDM injects a magic …

[project]
dependencies = [
    "sub-package @ file:///${PROJECT_ROOT}/sub-package",
    "first @ file:///${PROJECT_ROOT}/first-1.0.0-py2.py3-none-any.whl",
]

I think we actually support that in uv too (it falls out of supporting environment variable expansion in …
If I remember correctly, the goal of static/dynamic in PEP 621 was to allow for backend interoperability and to allow decision-making on the metadata without invoking the backend.

So I share this view, as they prevent these optimizations. Actually, I think frontends should check that backends honor static metadata in the wheels they build.

So do I.
@pietakio your request doesn't sound reasonable to me. What's the justification? You probably just want dependency groups.

@webknjaz disk is cheap these days, so the default behavior of …

However, the speed-up would be refreshing if you could optimize …

@evbo that doesn't seem like a valid use case. If you don't want a package, why do you package it in the first place?

@webknjaz name another tool that installs Python packages into a virtual environment as defined by a pyproject.toml file and I'll use that instead, if it beats the simplicity of setuptools.

Setuptools is not an installer but a build backend. But that's not the point, and it is out of scope. The point is that you're asking for dependency groups and, instead of relying on them, demand that pip stop following standards, it seems.

Unfortunately, my read is that the git repository@branch syntax is not supported by dependency groups? If I'm wrong, though, then I would use them.

@evbo pretty sure they would not be any different from the PEP 508 requirement specs in other places. They'd be fed into the same dependency resolver.

Use dep2req to make a requirements.txt and then use …

@gsmethells while there might be tools for extracting that from the metadata, the desire to use one typically indicates a fundamentally flawed structure of a project's dependency-management strategy. I'd recommend first finding out the root cause and addressing that rather than relying on hacks.

That would be …

It solves the original problem by simply running …

As for the matter of container deployment, Mr. @hynek wrote a pretty handy piece about it. It is a good starting point from which you can build your own base container and decide whether you want to use caches, locks, or virtual environments.

It is possible with …

@Aeron I primarily develop in Rust, where Cargo is a one-stop shop. This is refreshing, thank you.
#11440 (comment) is the currently agreed upon user-facing design for this feature.
What's the problem this feature will solve?
In #8049, we identified a use case for installing just the dependencies from pyproject.toml. As described in the solution section below, --only-deps=<spec> would determine all dependencies of <spec>, excluding that package itself, and install those without installing the package. It could be used to …

This example shows both use cases:

Instead of the solution from #8049, @pradyunsg prefers a solution similar to the one below: #8049 (comment)
Describe the solution you'd like
One of those two, or similar:

(used in the example above) --only-deps would work like -r in that it's not a flag globally modifying pip's behavior, but a CLI option with one argument that can be specified multiple times. Unlike -r, it accepts a dependency spec and not a path to a file containing dependency specs. Where pip install <spec> first installs all dependencies and then (builds and) installs the package referred to by the spec itself, pip install --only-deps=<spec> would only install the dependencies.

--only-deps would work like --[no|only]-binary, in that it requires an argument specifying what package not to install. A placeholder like :requested: could be used, e.g.: …

Alternative Solutions
Re-using -r instead of adding --only-deps. I don't think this is a good idea, since people would be tempted to do -r pyproject.toml, which would be wrong (dependency specs including file paths look like ./path/to/pkg[extra1,extra2]).

Making --only-deps a global switch modifying pip's behavior, like e.g. --pre. I have found that global switches like that are dangerous and not very intuitive. To install a dev version of your package, doing pip install --pre mypkg seems innocuous but will actually install dev versions of mypkg and of all its dependencies that have any dev versions. It's safer to do something like pip install "mypkg>=0.1.post0.dev0" to limit dev-version installation to one package. Similarly, it's unclear what an --only-deps switch would apply to. Would pip install -r reqs.txt --only-deps install the dependencies of every package specified in the file, but none of those packages?

Using e.g. beni to convert PEP 621 dependencies to a requirements.txt. This works even today, but it feels like it shouldn't be necessary, as it involves quite a few steps, including writing a file to disk.
Additional context
NA