Discussion:
[openstack-dev] [all][requirements] a plan to stop syncing requirements into projects
Doug Hellmann
2018-03-15 11:03:11 UTC
Back in Barcelona for the Ocata summit I presented a rough outline
of a plan for us to change the way we manage dependencies across
projects so that we can stop syncing them [1]. We've made some
progress, and I think it's time to finish the work so I'm volunteering
to take some of it up during Rocky. This email is meant to rehash
and update the proposal, and fill in some of the missing details.

[1] https://etherpad.openstack.org/p/ocata-requirements-notes

TL;DR
-----

Let's stop copying exact dependency specifications into all our
projects to allow them to reflect the actual versions of things
they depend on. The constraints system in pip makes this change
safe. We still need to maintain some level of compatibility, so the
existing requirements-check job (run for changes to requirements.txt
within each repo) will change a bit rather than going away completely.
We can enable unit test jobs to verify the lower constraint settings
at the same time that we're doing the other work.

Some History
------------

Back in the dark ages of OpenStack development we had a lot of
trouble keeping the dependencies of all of our various projects
configured so they were co-installable. Usually, but not always,
the problems were caused by caps or "exclusions" (version != X) on
dependencies in one project but not in another. Because pip's
dependency resolver does not take into account the versions of
dependencies needed by existing packages, it was quite easy to
install things in the "wrong" order and end up with incompatible
libraries so services wouldn't start or couldn't import plugins.

The first (working) solution to the problem was to develop a
dependency management system based on the openstack/requirements
repository. This system and our policies required projects to copy
exactly the settings for all of their dependencies from a global
list managed by a team of reviewers (first the release team, and
later the requirements team). By copying exactly the same settings
into all projects we ensured that they were "co-installable" without
any dependency conflicts. Having a centralized list of dependencies
with a review team also gave us an opportunity to look for duplicates,
packages using incompatible licenses, and otherwise curate the list
of dependencies. More on that later.

Some time after we had the centralized dependency management system
in place, Robert Collins worked with the PyPA folks to add a feature
to pip that constrains the versions of packages actually installed,
while still allowing a range of versions to be specified in the
dependency list. We were then able to create a list of
"upper constraints" -- the highest, or newest, versions -- of all
of the packages we depend on and set up our test jobs to use that
list to control what is actually installed. This gives us the ability
to say that we need at least version X.Y.Z of a package and to force
the selection of X.Y+1.0 because we want to test with that version.

The constraint feature means that we no longer need to have all of
the dependency specifications match exactly, since we basically
force the installation of a specific version anyway. We've been
running with both constraints and requirements syncing enabled for
a while now, and I think we should stop syncing the settings to
allow projects to let their lower bounds (the minimum versions of
their dependencies) diverge.

That divergence is useful to folks creating packages for just some
of the services, especially when they are going to be deployed in
isolation where co-installability is not required. Skipping the
syncs will also mean we end up releasing fewer versions of stable
libraries, because we won't be raising the minimum supported versions
of their dependencies automatically. That second benefit is my
motivation for focusing on this right now.

Our Requirements
----------------

We have three primary requirements for managing the dependency list:

1. Maintain a list of co-installable versions of all of our
dependencies.

2. Avoid breaking or deadlocking any of our gate jobs due to
dependency conflicts.

3. Continue to review new dependencies for licensing, redundancy,
etc.

I believe the upper-constraints.txt file in openstack/requirements
satisfies the first two of these requirements. The third means we
need to continue to *have* a global requirements list, but we can
change how we manage it.

In addition to these hard requirements, it would be nice if we could
test the lower bounds of dependencies in projects to detect when a
project is using a feature of a newer version of a library than
their dependencies indicate. Although that is a bit orthogonal to
the syncing issue, I'm going to describe one way we could do that,
because the original plan of keeping a global list of "lower
constraints" would somewhat undermine our ability to stop syncing
the same lower bounds into all of the projects.

What I Want to Do
-----------------

1. Update the requirements-check test job to change the check for
an exact match to be a check for compatibility with the
upper-constraints.txt value.

We would check the value for the dependency from upper-constraints.txt
against the range of allowed values in the project. If the
constraint version is compatible, the dependency range is OK.

This rule means that in order to change the dependency settings
for a project in a way that is incompatible with the constraint,
the constraint (and probably the global requirements list) would
have to be changed first in openstack/requirements. However, if
the change to the dependency is still compatible with the
constraint, no change would be needed in openstack/requirements.
For example, if the global list constrains a library to X.Y.Z
and a project lists X.Y.Z-2 as the minimum version but then needs
to raise that because it needs a feature in X.Y.Z-1, it can do
that with a single patch in-tree.
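To make the rule concrete, here is a toy sketch of that compatibility
check (the function names are illustrative and the version parser only
handles plain X.Y.Z versions; the real requirements-check job would use
proper specifier handling, e.g. the packaging library):

```python
# Toy sketch of the proposed requirements-check rule: the version
# pinned in upper-constraints.txt must satisfy the range a project
# declares in its requirements.txt. Illustrative only.

def parse(v):
    return tuple(int(part) for part in v.split("."))

def satisfies(version, range_spec):
    ops = {
        ">=": lambda a, b: a >= b, "<=": lambda a, b: a <= b,
        "!=": lambda a, b: a != b, "==": lambda a, b: a == b,
        ">": lambda a, b: a > b, "<": lambda a, b: a < b,
    }
    ver = parse(version)
    for clause in range_spec.split(","):
        clause = clause.strip()
        for op in (">=", "<=", "!=", "==", ">", "<"):  # longest first
            if clause.startswith(op):
                if not ops[op](ver, parse(clause[len(op):])):
                    return False
                break
    return True

# The project's lower bound may sit anywhere below the constraint...
print(satisfies("1.4.0", ">=1.2.0"))         # True: range is OK
# ...but a range that excludes the constrained version fails the
# check, and would need a change in openstack/requirements first.
print(satisfies("1.4.0", ">=1.2.0,<1.4.0"))  # False
```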

We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.

We also need to verify that projects do not cap dependencies for
the same reason. Caps prevent us from advancing to versions of
dependencies that are "too new" and possibly incompatible. We
can manage caps in the global requirements list instead, so that
the upper constraints are calculated correctly from it.
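The two extra checks described in the previous paragraphs could be
sketched roughly like this (hypothetical helper names, not the actual
requirements-check implementation):

```python
# Rough sketch of the exclusion and cap checks described above: a
# project's exclusions must be a subset of the global list's, and
# projects must not set upper bounds themselves. Illustrative only.

def clauses(range_spec):
    return [c.strip() for c in range_spec.split(",") if c.strip()]

def exclusions(range_spec):
    return {c for c in clauses(range_spec) if c.startswith("!=")}

def exclusions_ok(project_spec, global_spec):
    # Local exclusions must all appear globally, but the local list
    # does not have to match the global list exactly.
    return exclusions(project_spec) <= exclusions(global_spec)

def has_cap(project_spec):
    # Any upper bound ("<" or "<=") counts as a cap.
    return any(c.startswith("<") for c in clauses(project_spec))

print(exclusions_ok(">=1.2.0,!=1.3.0", ">=1.0.0,!=1.3.0,!=1.5.0"))  # True
print(exclusions_ok(">=1.2.0,!=1.4.0", ">=1.0.0,!=1.3.0"))          # False
print(has_cap(">=1.2.0,<2.0.0"))                                    # True
```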

This change would immediately allow all projects currently
following the global requirements lists to specify different
lower bounds from that global list, as long as those lower bounds
still allow the dependencies to be co-installable. (The upper
bounds, managed through the upper-constraints.txt list, would
still be built by selecting the newest compatible version because
that is how pip's dependency resolver works.)

2. We should stop syncing dependencies by turning off the
propose-update-requirements job entirely.

Turning off the job will stop the bot from proposing more
dependency updates to projects.

As part of deleting the job we can also remove the "requirements"
case from playbooks/proposal/propose_update.sh, since it won't
need that logic any more. We can also remove the update-requirements
command from the openstack/requirements repository, since that
is the tool that generates the updated list and it won't be
needed if we aren't proposing updates any more.

3. Remove the minimum specifications from the global requirements
list to make clear that the global list is no longer expressing
minimums.

This clean-up step has been a bit more controversial among the
requirements team, but I think it is a key piece. As the minimum
versions of dependencies diverge within projects, there will no
longer *be* a real global set of minimum values. Tracking a list of
"highest minimums", would either require rebuilding the list from the
settings in all projects, or requiring two patches to change the
minimum version of a dependency within a project.

Maintaining a global list of minimums also implies that we
consider it OK to run OpenStack as a whole with that list. This
message conflicts with the message we've been sending about the
upper constraints list since that was established, which is that
we have a known good list of versions and deploying all of
OpenStack with different versions of those dependencies is
untested.

After these 3 steps are done, the requirements team will continue
to maintain the global-requirements.txt and upper-constraints.txt
files, as before. Adding a new dependency to a project will still
involve a review step to add it to the global list so we can monitor
licensing, duplication, python 3 support, etc. But adjusting the
version numbers once that dependency is in the global list will be
easier.

Testing Lower Bounds of Dependencies
------------------------------------

I don't have any personal interest in us testing against "old"
versions of dependencies, but since the requirements team feels at
least having a plan for such testing in place is a prerequisite for
the other work, here is what I've come up with.

We can define a new test job to run the unit tests under python 3
using a tox environment called "lower-constraints" that is configured
to install the dependencies for the repo using a file lower-constraints.txt
that lives in the project repository.

Then, for each repository listed in projects.txt (~325 today), we
need to add the job to the zuul configuration within the repo. We
don't want to turn the job on as voting by default globally for
all of those projects, because it would break until the tox
environment was configured. We also don't want to turn it on as
non-voting, because then the infra team would have ~325 patches to
review as it was made voting for each repository individually.
future, after all of the projects have it enabled in-repo, we can
move that configuration to the project-config repo in 1 patch and
make the lower-constraints job part of the check-requirements job
template.

To configure the job in a given repo we will need to run a few
separate steps to prepare a single patch like
https://review.openstack.org/#/c/550603/ (that patch is experimental
and contains the full job definition, which won't be needed
everywhere).

1. Set up a new tox environment called "lower-constraints" with
base-python set to "python3" and with the deps setting configured
to include a copy of the existing global lower constraints file
from the openstack/requirements repo.

2. Run "tox -e lower-constraints —notest" to build a virtualenv
using the lower constraints.

3. Run ".tox/lower-constraints/bin/pip freeze > lower-constraints.txt"
to create the initial version of the lower-constraints.txt file for
the current repo.

4. Modify the tox settings for lower-constraints to point to the
new file that was generated instead of the global list.

5. Update the zuul configuration to add the new job defined in
project-config.
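Put together, the tox environment from steps 1, 2, and 4 might look
roughly like this in tox.ini (a sketch only; the exact deps layout
varies per project, and after step 4 the constraints line points at
the newly generated in-repo file):

```ini
# Sketch of the tox environment described above; settings vary by repo.
[testenv:lower-constraints]
basepython = python3
deps =
  -c{toxinidir}/lower-constraints.txt
  -r{toxinidir}/test-requirements.txt
  -r{toxinidir}/requirements.txt
```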

The results of those steps can be combined into a single patch and
proposed to the project. To avoid overwhelming zuul's job configuration
resolver, we need to propose the patches in separate batches of
about 10 repos at a time. This is all mostly scriptable, so I will
write a script and propose the patches (unless someone else wants to do
it all -- we need a single person to keep up with how many patches we're
proposing at one time).

The point of creating the initial lower-constraints.txt file is not
necessarily to be "accurate" with the constraints immediately, but
to have something to work from. After the patches are proposed,
please either plan to land them or vote -2 indicating that you don't
want a job like that on that repo. If you want to change the
constraints significantly, please do that in a separate patch. With
~325 of them, I'm not going to be able to keep up with everyone's
separate needs and this is all meant to just establish the initial
version of the job anyway.

For projects that currently only support python 2 we can modify the
proposed patches so they do not set base-python to python3.

You will have noticed that this will only apply to unit test jobs.
Projects are free to use the results to add their own functional
test jobs using the same lower-constraints.txt files, but that's
up to them to do.

For the reasons outlined above about why we want divergence, I don't
think it makes much sense to run a full integration job with the
other projects, since their dependency lists may differ.

Sorry for the length of this email, but we don't have a specs repo
for the requirements team and I wanted to put all of the details
of the proposal down in one place for discussion.

Let me know what you think,
Doug
Thierry Carrez
2018-03-15 13:34:50 UTC
Post by Doug Hellmann
[...]
TL;DR
-----
Let's stop copying exact dependency specifications into all our
projects to allow them to reflect the actual versions of things
they depend on. The constraints system in pip makes this change
safe. We still need to maintain some level of compatibility, so the
existing requirements-check job (run for changes to requirements.txt
within each repo) will change a bit rather than going away completely.
We can enable unit test jobs to verify the lower constraint settings
at the same time that we're doing the other work.
Thanks for the very detailed plan, Doug. It all makes sense to me,
although I have a precision question (see below).
Post by Doug Hellmann
[...]
We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.
[...]
2. We should stop syncing dependencies by turning off the
propose-update-requirements job entirely.
Turning off the job will stop the bot from proposing more
dependency updates to projects.
[...]
After these 3 steps are done, the requirements team will continue
to maintain the global-requirements.txt and upper-constraints.txt
files, as before. Adding a new dependency to a project will still
involve a review step to add it to the global list so we can monitor
licensing, duplication, python 3 support, etc. But adjusting the
version numbers once that dependency is in the global list will be
easier.
How would you set up an exclusion in that new world order ? We used to
add it to the global-requirements file and the bot would automatically
sync it to various consuming projects.

Now since any exclusion needs to also appear on the global file, you
would push it first in the global-requirements, then to the project
itself, is that correct ? In the end the global-requirements file would
only contain those exclusions, right ?
--
Thierry Carrez (ttx)

__________________________________________________________________________
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: OpenStack-dev-***@lists.openstack.org?subject:unsubscribe
http://lists.openstac
Doug Hellmann
2018-03-15 13:45:38 UTC
Post by Thierry Carrez
Post by Doug Hellmann
[...]
TL;DR
-----
Let's stop copying exact dependency specifications into all our
projects to allow them to reflect the actual versions of things
they depend on. The constraints system in pip makes this change
safe. We still need to maintain some level of compatibility, so the
existing requirements-check job (run for changes to requirements.txt
within each repo) will change a bit rather than going away completely.
We can enable unit test jobs to verify the lower constraint settings
at the same time that we're doing the other work.
Thanks for the very detailed plan, Doug. It all makes sense to me,
although I have a precision question (see below).
Post by Doug Hellmann
[...]
We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.
[...]
2. We should stop syncing dependencies by turning off the
propose-update-requirements job entirely.
Turning off the job will stop the bot from proposing more
dependency updates to projects.
[...]
After these 3 steps are done, the requirements team will continue
to maintain the global-requirements.txt and upper-constraints.txt
files, as before. Adding a new dependency to a project will still
involve a review step to add it to the global list so we can monitor
licensing, duplication, python 3 support, etc. But adjusting the
version numbers once that dependency is in the global list will be
easier.
How would you set up an exclusion in that new world order ? We used to
add it to the global-requirements file and the bot would automatically
sync it to various consuming projects.
Now since any exclusion needs to also appear on the global file, you
would push it first in the global-requirements, then to the project
itself, is that correct ? In the end the global-requirements file would
only contain those exclusions, right ?
The first step would need to be adding it to the global-requirements.txt
list. After that, it would depend on how picky we want to be. If the
upper-constraints.txt list is successfully updated to avoid the release,
we might not need anything in the project. If the project wants to
provide detailed guidance about compatibility, then they could add the
exclusion. For example, if a version of oslo.config breaks cinder but
not nova, we might only put the exclusion in global-requirements.txt and
the requirements.txt for cinder.

Doug

Matthew Thode
2018-03-15 15:05:50 UTC
Post by Doug Hellmann
Post by Thierry Carrez
Post by Doug Hellmann
[...]
TL;DR
-----
Let's stop copying exact dependency specifications into all our
projects to allow them to reflect the actual versions of things
they depend on. The constraints system in pip makes this change
safe. We still need to maintain some level of compatibility, so the
existing requirements-check job (run for changes to requirements.txt
within each repo) will change a bit rather than going away completely.
We can enable unit test jobs to verify the lower constraint settings
at the same time that we're doing the other work.
Thanks for the very detailed plan, Doug. It all makes sense to me,
although I have a precision question (see below).
Post by Doug Hellmann
[...]
We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.
[...]
2. We should stop syncing dependencies by turning off the
propose-update-requirements job entirely.
Turning off the job will stop the bot from proposing more
dependency updates to projects.
[...]
After these 3 steps are done, the requirements team will continue
to maintain the global-requirements.txt and upper-constraints.txt
files, as before. Adding a new dependency to a project will still
involve a review step to add it to the global list so we can monitor
licensing, duplication, python 3 support, etc. But adjusting the
version numbers once that dependency is in the global list will be
easier.
How would you set up an exclusion in that new world order ? We used to
add it to the global-requirements file and the bot would automatically
sync it to various consuming projects.
Now since any exclusion needs to also appear on the global file, you
would push it first in the global-requirements, then to the project
itself, is that correct ? In the end the global-requirements file would
only contain those exclusions, right ?
The first step would need to be adding it to the global-requirements.txt
list. After that, it would depend on how picky we want to be. If the
upper-constraints.txt list is successfully updated to avoid the release,
we might not need anything in the project. If the project wants to
provide detailed guidance about compatibility, then they could add the
exclusion. For example, if a version of oslo.config breaks cinder but
not nova, we might only put the exclusion in global-requirements.txt and
the requirements.txt for cinder.
I wonder if we'd be able to have projects decide via a flag in their tox
or zuul config if they'd like to opt into auto-updating exclusions only.
--
Matthew Thode (prometheanfire)
Doug Hellmann
2018-03-15 23:15:23 UTC
Post by Matthew Thode
Post by Doug Hellmann
Post by Thierry Carrez
Post by Doug Hellmann
[...]
TL;DR
-----
Let's stop copying exact dependency specifications into all our
projects to allow them to reflect the actual versions of things
they depend on. The constraints system in pip makes this change
safe. We still need to maintain some level of compatibility, so the
existing requirements-check job (run for changes to requirements.txt
within each repo) will change a bit rather than going away completely.
We can enable unit test jobs to verify the lower constraint settings
at the same time that we're doing the other work.
Thanks for the very detailed plan, Doug. It all makes sense to me,
although I have a precision question (see below).
Post by Doug Hellmann
[...]
We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.
[...]
2. We should stop syncing dependencies by turning off the
propose-update-requirements job entirely.
Turning off the job will stop the bot from proposing more
dependency updates to projects.
[...]
After these 3 steps are done, the requirements team will continue
to maintain the global-requirements.txt and upper-constraints.txt
files, as before. Adding a new dependency to a project will still
involve a review step to add it to the global list so we can monitor
licensing, duplication, python 3 support, etc. But adjusting the
version numbers once that dependency is in the global list will be
easier.
How would you set up an exclusion in that new world order ? We used to
add it to the global-requirements file and the bot would automatically
sync it to various consuming projects.
Now since any exclusion needs to also appear on the global file, you
would push it first in the global-requirements, then to the project
itself, is that correct ? In the end the global-requirements file would
only contain those exclusions, right ?
The first step would need to be adding it to the global-requirements.txt
list. After that, it would depend on how picky we want to be. If the
upper-constraints.txt list is successfully updated to avoid the release,
we might not need anything in the project. If the project wants to
provide detailed guidance about compatibility, then they could add the
exclusion. For example, if a version of oslo.config breaks cinder but
not nova, we might only put the exclusion in global-requirements.txt and
the requirements.txt for cinder.
I wonder if we'd be able to have projects decide via a flag in their tox
or zuul config if they'd like to opt into auto-updating exclusions only.
We could just change the job that does the sync and use the existing
projects.txt file, couldn't we?

Doug

Jeremy Stanley
2018-03-15 14:28:49 UTC
On 2018-03-15 07:03:11 -0400 (-0400), Doug Hellmann wrote:
[...]
Post by Doug Hellmann
1. Update the requirements-check test job to change the check for
an exact match to be a check for compatibility with the
upper-constraints.txt value.
[...]

I thought it might be possible to even just do away with this job
entirely, but some cursory testing shows that if you supply a
required versionspec which excludes your constrained version of the
same package, you'll still get the constrained version installed
even though you indicated it wasn't in your "supported" range. Might
be a nice patch to work on upstream in pip, making it explicitly
error on such a mismatch (and _then_ we might be able to stop
bothering with this job).
Post by Doug Hellmann
We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.
[...]

At first it seems like this wouldn't end up being necessary; as long
as you're not setting an upper bound or excluding the constrained
version, there shouldn't be a coinstallability problem right? Though
I suppose there are still a couple of potential pitfalls if we don't
check exclusions: setting an exclusion for a future version which
hasn't been released yet or is otherwise higher than the global
upper constraint; situations where we need to roll back a constraint
to an earlier version (e.g., we discover a bug in it) and some
project has that earlier version excluded. So I suppose there is
some merit to centrally coordinating these, making sure we can still
pick sane constraints which work for all projects (mental
exercise: do we also need to build a tool which can make sure that
proposed exclusions don't eliminate all possible version numbers?).
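That mental exercise could be sketched roughly like this (hypothetical
helper, assuming we know the set of versions actually released on
PyPI for the package in question):

```python
# Sketch of a tool that verifies a proposed set of exclusions still
# leaves at least one installable candidate above the minimum version.
# Hypothetical names; only handles plain X.Y.Z versions.

def parse(v):
    return tuple(int(p) for p in v.split("."))

def remaining_candidates(released, excluded, minimum):
    floor = parse(minimum)
    banned = {parse(v) for v in excluded}
    return [v for v in released
            if parse(v) >= floor and parse(v) not in banned]

released = ["1.2.0", "1.2.1", "1.3.0"]
# Excluding everything at or above the minimum leaves nothing to install.
print(remaining_candidates(released, ["1.2.1", "1.3.0"], "1.2.1"))  # []
# A single exclusion still leaves valid candidates.
print(remaining_candidates(released, ["1.2.1"], "1.2.0"))  # ['1.2.0', '1.3.0']
```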
Post by Doug Hellmann
As the minimum
versions of dependencies diverge within projects, there will no
longer *be* a real global set of minimum values. Tracking a list of
"highest minimums", would either require rebuilding the list from the
settings in all projects, or requiring two patches to change the
minimum version of a dependency within a project.
[...]

It's also been suggested in the past that package maintainers for
some distributions relied on the ranges in our global requirements
list to determine what the minimum acceptable version of a
dependency is so they know whether/when it needs updating (fairly
critical when you consider that within a given distro some
dependencies may be shared by entirely unrelated software outside
our ecosystem and may not be compatible with new versions as soon as
we are). On the other hand, we never actually _test_ our lower
bounds, so this was to some extent a convenient fiction anyway.
Post by Doug Hellmann
1. Set up a new tox environment called "lower-constraints" with
base-python set to "python3" and with the deps setting configured
to include a copy of the existing global lower constraints file
from the openstack/requirements repo.
[...]

I didn't realize lower-constraints.txt already existed (looks like
it got added a little over a week ago). Reviewing the log it seems
to have been updated based on individual projects' declared minimums
so far which seems to make it a questionable starting point for a
baseline. I suppose the assumption is that projects have been
merging requirements proposals which bump their declared
lower-bounds, though experience suggests that this doesn't happen
consistently in projects receiving g-r updates today (they will
either ignore the syncs or amend them to undo the lower-bounds
changes before merging). At any rate, I suppose that's a separate
conversation to be had, and as you say it's just a place to start
from but projects will be able to change it to whatever values they
want at that point.
--
Jeremy Stanley
Doug Hellmann
2018-03-15 23:22:03 UTC
Post by Jeremy Stanley
[...]
Post by Doug Hellmann
1. Update the requirements-check test job to change the check for
an exact match to be a check for compatibility with the
upper-constraints.txt value.
[...]
I thought it might be possible to even just do away with this job
entirely, but some cursory testing shows that if you supply a
required versionspec which excludes your constrained version of the
same package, you'll still get the constrained version installed
even though you indicated it wasn't in your "supported" range. Might
be a nice patch to work on upstream in pip, making it explicitly
error on such a mismatch (and _then_ we might be able to stop
bothering with this job).
Post by Doug Hellmann
We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.
[...]
At first it seems like this wouldn't end up being necessary; as long
as you're not setting an upper bound or excluding the constrained
version, there shouldn't be a coinstallability problem right? Though
That second case is what this prevents. There's a race condition between
updating the requirements range (and exclusions) in a project tree and
updating the upper-constraints.txt list. The check forces those lists to
be updated in an order that avoids a case where the version in
constraints is not compatible with an app installed in an integration
test job.
Post by Jeremy Stanley
I suppose there are still a couple of potential pitfalls if we don't
check exclusions: setting an exclusion for a future version which
hasn't been released yet or is otherwise higher than the global
upper constraint; situations where we need to roll back a constraint
to an earlier version (e.g., we discover a bug in it) and some
project has that earlier version excluded. So I suppose there is
some merit to centrally coordinating these, making sure we can still
pick sane constraints which work for all projects (mental
exercise: do we also need to build a tool which can make sure that
proposed exclusions don't eliminate all possible version numbers?).
Yes, those are all good failure cases that this prevents, too.
Post by Jeremy Stanley
Post by Doug Hellmann
As the minimum
versions of dependencies diverge within projects, there will no
longer *be* a real global set of minimum values. Tracking a list of
"highest minimums", would either require rebuilding the list from the
settings in all projects, or requiring two patches to change the
minimum version of a dependency within a project.
[...]
It's also been suggested in the past that package maintainers for
some distributions relied on the ranges in our global requirements
list to determine what the minimum acceptable version of a
dependency is so they know whether/when it needs updating (fairly
critical when you consider that within a given distro some
dependencies may be shared by entirely unrelated software outside
our ecosystem and may not be compatible with new versions as soon as
we are). On the other hand, we never actually _test_ our lower
bounds, so this was to some extent a convenient fiction anyway.
The lack of testing is an issue, but the tight coupling of those
lower bounds is a bigger problem to me. I know that distros don't
necessarily package exactly what we have in the upper-constraints.txt
list, but they're doing their own testing with those alternatives.
Post by Jeremy Stanley
Post by Doug Hellmann
1. Set up a new tox environment called "lower-constraints" with
basepython set to "python3" and with the deps setting configured
to include a copy of the existing global lower constraints file
from the openstack/requirements repo.
[...]
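The environment described in step 1 might look roughly like this in a project's tox.ini (a sketch; the exact file layout is an assumption, not the final job definition):

```ini
[testenv:lower-constraints]
basepython = python3
deps =
  -c{toxinidir}/lower-constraints.txt
  -r{toxinidir}/test-requirements.txt
  -r{toxinidir}/requirements.txt
```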
I didn't realize lower-constraints.txt already existed (looks like
it got added a little over a week ago). Reviewing the log it seems
Yes, Dirk did that work.
Post by Jeremy Stanley
to have been updated based on individual projects' declared minimums
so far which seems to make it a questionable starting point for a
baseline. I suppose the assumption is that projects have been
merging requirements proposals which bump their declared
lower-bounds, though experience suggests that this doesn't happen
consistently in projects receiving g-r updates today (they will
either ignore the syncs or amend them to undo the lower-bounds
changes before merging). At any rate, I suppose that's a separate
conversation to be had, and as you say it's just a place to start
from but projects will be able to change it to whatever values they
want at that point.
Right. The fact that the values aren't necessarily accurate won't
be affected by the change to stop syncing, and the additional unit
tests should help us catch at least some of the issues.

Doug

__________________________________________________________________________
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: OpenStack-dev-***@lists.openstack.org?subject:unsubscribe
http://li
Matthew Thode
2018-03-15 15:24:10 UTC
Permalink
Post by Doug Hellmann
What I Want to Do
-----------------
1. Update the requirements-check test job to change the check for
an exact match to be a check for compatibility with the
upper-constraints.txt value.
We would check the value for the dependency from upper-constraints.txt
against the range of allowed values in the project. If the
constraint version is compatible, the dependency range is OK.
This rule means that in order to change the dependency settings
for a project in a way that is incompatible with the constraint,
the constraint (and probably the global requirements list) would
have to be changed first in openstack/requirements. However, if
the change to the dependency is still compatible with the
constraint, no change would be needed in openstack/requirements.
For example, if the global list constrains a library to X.Y.Z
and a project lists X.Y.Z-2 as the minimum version but then needs
to raise that because it needs a feature in X.Y.Z-1, it can do
that with a single patch in-tree.
I think what may be better is for global-requirements to become a
gathering place where the projects the requirements team watches have
their smallest constrained installable set defined.

Upper-constraints has a req of foo===2.0.3
Project A has a req of foo>=1.0.0,!=1.6.0
Project B has a req of foo>=1.4.0
Global reqs would be updated with foo>=1.4.0,!=1.6.0
Project C comes along and sets foo>=2.0.0
Global reqs would be updated with foo>=2.0.0

This would make global-reqs descriptive rather than prescriptive for
versioning and would represent the 'true' version constraints of
openstack.
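This merge could be sketched as follows; merge_ranges is a made-up helper, and real specifiers would need more robust parsing than this:

```python
# Illustrative sketch of deriving a "descriptive" global range from
# per-project ranges: take the highest minimum, and keep only the
# exclusions that still fall inside the merged range.
from packaging.version import Version

def merge_ranges(project_specs):
    mins, excls = [], set()
    for spec in project_specs:
        for part in spec.split(","):
            if part.startswith(">="):
                mins.append(Version(part[2:]))
            elif part.startswith("!="):
                excls.add(part[2:])
    floor = max(mins)
    kept = sorted((e for e in excls if Version(e) >= floor), key=Version)
    return ">=" + str(floor) + "".join(",!=" + e for e in kept)

# Projects A and B from the example above:
print(merge_ranges([">=1.0.0,!=1.6.0", ">=1.4.0"]))             # >=1.4.0,!=1.6.0
# After project C raises its minimum, the stale exclusion drops out:
print(merge_ranges([">=1.0.0,!=1.6.0", ">=1.4.0", ">=2.0.0"]))  # >=2.0.0
```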
Post by Doug Hellmann
We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.
How would this happen when using constraints? A project is not allowed
to have a requirement that masks a constraint (and this would be
verified via the requirements-check job).

There's a failure mode not covered: a project could add a mask (!=) to
its requirements before we update constraints. The project that was
passing the requirements-check job would then become incompatible. This
means that requirements-check would need to be run for each changeset
to catch this as soon as it happens, instead of running only on
requirements changes.
Post by Doug Hellmann
We also need to verify that projects do not cap dependencies for
the same reason. Caps prevent us from advancing to versions of
dependencies that are "too new" and possibly incompatible. We
can manage caps in the global requirements list, which would
cause that list to calculate the constraints correctly.
This change would immediately allow all projects currently
following the global requirements lists to specify different
lower bounds from that global list, as long as those lower bounds
still allow the dependencies to be co-installable. (The upper
bounds, managed through the upper-constraints.txt list, would
still be built by selecting the newest compatible version because
that is how pip's dependency resolver works.)
2. We should stop syncing dependencies by turning off the
propose-update-requirements job entirely.
Turning off the job will stop the bot from proposing more
dependency updates to projects.
As part of deleting the job we can also remove the "requirements"
case from playbooks/proposal/propose_update.sh, since it won't
need that logic any more. We can also remove the update-requirements
command from the openstack/requirements repository, since that
is the tool that generates the updated list and it won't be
needed if we aren't proposing updates any more.
3. Remove the minimum specifications from the global requirements
list to make clear that the global list is no longer expressing
minimums.
This clean-up step has been a bit more controversial among the
requirements team, but I think it is a key piece. As the minimum
versions of dependencies diverge within projects, there will no
longer *be* a real global set of minimum values. Tracking a list of
"highest minimums" would require either rebuilding the list from the
settings in all projects or two patches to change the minimum version
of a dependency within a project.
Maintaining a global list of minimums also implies that we
consider it OK to run OpenStack as a whole with that list. This
message conflicts with the message we've been sending about the
upper constraints list since that was established, which is that
we have a known good list of versions and deploying all of
OpenStack with different versions of those dependencies is
untested.
As noted above, I think gathering the min versions/maskings from
OpenStack projects would be valuable (especially to packagers, who
already use our likely-invalid values).
Post by Doug Hellmann
After these 3 steps are done, the requirements team will continue
to maintain the global-requirements.txt and upper-constraints.txt
files, as before. Adding a new dependency to a project will still
involve a review step to add it to the global list so we can monitor
licensing, duplication, python 3 support, etc. But adjusting the
version numbers once that dependency is in the global list will be
easier.
Thanks for writing this up, I think it looks good in general, but like
you mentioned before, there is some discussion to be had about gathering
and creating a versionspec from all of openstack for requirements.
--
Matthew Thode (prometheanfire)
Doug Hellmann
2018-03-15 23:29:37 UTC
Permalink
Post by Matthew Thode
Post by Doug Hellmann
What I Want to Do
-----------------
1. Update the requirements-check test job to change the check for
an exact match to be a check for compatibility with the
upper-constraints.txt value.
We would check the value for the dependency from upper-constraints.txt
against the range of allowed values in the project. If the
constraint version is compatible, the dependency range is OK.
This rule means that in order to change the dependency settings
for a project in a way that is incompatible with the constraint,
the constraint (and probably the global requirements list) would
have to be changed first in openstack/requirements. However, if
the change to the dependency is still compatible with the
constraint, no change would be needed in openstack/requirements.
For example, if the global list constrains a library to X.Y.Z
and a project lists X.Y.Z-2 as the minimum version but then needs
to raise that because it needs a feature in X.Y.Z-1, it can do
that with a single patch in-tree.
I think what may be better is for global-requirements to become a
gathering place where the projects the requirements team watches have
their smallest constrained installable set defined.
Upper-constraints has a req of foo===2.0.3
Project A has a req of foo>=1.0.0,!=1.6.0
Project B has a req of foo>=1.4.0
Global reqs would be updated with foo>=1.4.0,!=1.6.0
Project C comes along and sets foo>=2.0.0
Global reqs would be updated with foo>=2.0.0
This would make global-reqs descriptive rather than prescriptive for
versioning and would represent the 'true' version constraints of
openstack.
It sounds like you're suggesting syncing in the other direction, which
could be useful. I think we can proceed with what I've described and
consider the work to build what you describe as a separate project.
Post by Matthew Thode
Post by Doug Hellmann
We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.
How would this happen when using constraints? A project is not allowed
to have a requirement that masks a constraint (and this would be
verified via the requirements-check job).
If project A excludes version X before the constraint list is updated to
use it, and then project B starts trying to depend on version X, they
become incompatible.

We need to continue to manage our declarations of incompatible versions
to ensure that the constraints list is a good list of versions to test
everything under.
Post by Matthew Thode
There's a failure mode not covered: a project could add a mask (!=) to
its requirements before we update constraints. The project that was
passing the requirements-check job would then become incompatible. This
means that requirements-check would need to be run for each changeset
to catch this as soon as it happens, instead of running only on
requirements changes.
I'm not clear on what you're describing here, but it sounds like a
variation of the failure modes that would be prevented if we require
exclusions to exist in the global list before they could be added to the
local list.
Post by Matthew Thode
Post by Doug Hellmann
We also need to verify that projects do not cap dependencies for
the same reason. Caps prevent us from advancing to versions of
dependencies that are "too new" and possibly incompatible. We
can manage caps in the global requirements list, which would
cause that list to calculate the constraints correctly.
This change would immediately allow all projects currently
following the global requirements lists to specify different
lower bounds from that global list, as long as those lower bounds
still allow the dependencies to be co-installable. (The upper
bounds, managed through the upper-constraints.txt list, would
still be built by selecting the newest compatible version because
that is how pip's dependency resolver works.)
2. We should stop syncing dependencies by turning off the
propose-update-requirements job entirely.
Turning off the job will stop the bot from proposing more
dependency updates to projects.
As part of deleting the job we can also remove the "requirements"
case from playbooks/proposal/propose_update.sh, since it won't
need that logic any more. We can also remove the update-requirements
command from the openstack/requirements repository, since that
is the tool that generates the updated list and it won't be
needed if we aren't proposing updates any more.
3. Remove the minimum specifications from the global requirements
list to make clear that the global list is no longer expressing
minimums.
This clean-up step has been a bit more controversial among the
requirements team, but I think it is a key piece. As the minimum
versions of dependencies diverge within projects, there will no
longer *be* a real global set of minimum values. Tracking a list of
"highest minimums" would require either rebuilding the list from the
settings in all projects or two patches to change the minimum version
of a dependency within a project.
Maintaining a global list of minimums also implies that we
consider it OK to run OpenStack as a whole with that list. This
message conflicts with the message we've been sending about the
upper constraints list since that was established, which is that
we have a known good list of versions and deploying all of
OpenStack with different versions of those dependencies is
untested.
As noted above, I think gathering the min versions/maskings from
OpenStack projects would be valuable (especially to packagers, who
already use our likely-invalid values).
OK. I don't feel that strongly about the cleanup work, so if we want to
keep the lower bounds in place I think that's OK.
Post by Matthew Thode
Post by Doug Hellmann
After these 3 steps are done, the requirements team will continue
to maintain the global-requirements.txt and upper-constraints.txt
files, as before. Adding a new dependency to a project will still
involve a review step to add it to the global list so we can monitor
licensing, duplication, python 3 support, etc. But adjusting the
version numbers once that dependency is in the global list will be
easier.
Thanks for writing this up, I think it looks good in general, but like
you mentioned before, there is some discussion to be had about gathering
and creating a versionspec from all of openstack for requirements.
Matthew Thode
2018-03-15 23:36:50 UTC
Permalink
Post by Doug Hellmann
Post by Matthew Thode
Post by Doug Hellmann
What I Want to Do
-----------------
1. Update the requirements-check test job to change the check for
an exact match to be a check for compatibility with the
upper-constraints.txt value.
We would check the value for the dependency from upper-constraints.txt
against the range of allowed values in the project. If the
constraint version is compatible, the dependency range is OK.
This rule means that in order to change the dependency settings
for a project in a way that is incompatible with the constraint,
the constraint (and probably the global requirements list) would
have to be changed first in openstack/requirements. However, if
the change to the dependency is still compatible with the
constraint, no change would be needed in openstack/requirements.
For example, if the global list constrains a library to X.Y.Z
and a project lists X.Y.Z-2 as the minimum version but then needs
to raise that because it needs a feature in X.Y.Z-1, it can do
that with a single patch in-tree.
I think what may be better is for global-requirements to become a
gathering place where the projects the requirements team watches have
their smallest constrained installable set defined.
Upper-constraints has a req of foo===2.0.3
Project A has a req of foo>=1.0.0,!=1.6.0
Project B has a req of foo>=1.4.0
Global reqs would be updated with foo>=1.4.0,!=1.6.0
Project C comes along and sets foo>=2.0.0
Global reqs would be updated with foo>=2.0.0
This would make global-reqs descriptive rather than prescriptive for
versioning and would represent the 'true' version constraints of
openstack.
It sounds like you're suggesting syncing in the other direction, which
could be useful. I think we can proceed with what I've described and
consider the work to build what you describe as a separate project.
Yes, this would be a follow-on thing.
Post by Doug Hellmann
Post by Matthew Thode
Post by Doug Hellmann
We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.
How would this happen when using constraints? A project is not allowed
to have a requirement that masks a constraint (and this would be
verified via the requirements-check job).
If project A excludes version X before the constraint list is updated to
use it, and then project B starts trying to depend on version X, they
become incompatible.
We need to continue to manage our declarations of incompatible versions
to ensure that the constraints list is a good list of versions to test
everything under.
Post by Matthew Thode
There's a failure mode not covered: a project could add a mask (!=) to
its requirements before we update constraints. The project that was
passing the requirements-check job would then become incompatible. This
means that requirements-check would need to be run for each changeset
to catch this as soon as it happens, instead of running only on
requirements changes.
I'm not clear on what you're describing here, but it sounds like a
variation of the failure modes that would be prevented if we require
exclusions to exist in the global list before they could be added to the
local list.
Yes, that'd work (require exclusions to be global before local).
Post by Doug Hellmann
Post by Matthew Thode
Post by Doug Hellmann
We also need to verify that projects do not cap dependencies for
the same reason. Caps prevent us from advancing to versions of
dependencies that are "too new" and possibly incompatible. We
can manage caps in the global requirements list, which would
cause that list to calculate the constraints correctly.
This change would immediately allow all projects currently
following the global requirements lists to specify different
lower bounds from that global list, as long as those lower bounds
still allow the dependencies to be co-installable. (The upper
bounds, managed through the upper-constraints.txt list, would
still be built by selecting the newest compatible version because
that is how pip's dependency resolver works.)
2. We should stop syncing dependencies by turning off the
propose-update-requirements job entirely.
Turning off the job will stop the bot from proposing more
dependency updates to projects.
As part of deleting the job we can also remove the "requirements"
case from playbooks/proposal/propose_update.sh, since it won't
need that logic any more. We can also remove the update-requirements
command from the openstack/requirements repository, since that
is the tool that generates the updated list and it won't be
needed if we aren't proposing updates any more.
3. Remove the minimum specifications from the global requirements
list to make clear that the global list is no longer expressing
minimums.
This clean-up step has been a bit more controversial among the
requirements team, but I think it is a key piece. As the minimum
versions of dependencies diverge within projects, there will no
longer *be* a real global set of minimum values. Tracking a list of
"highest minimums" would require either rebuilding the list from the
settings in all projects or two patches to change the minimum version
of a dependency within a project.
Maintaining a global list of minimums also implies that we
consider it OK to run OpenStack as a whole with that list. This
message conflicts with the message we've been sending about the
upper constraints list since that was established, which is that
we have a known good list of versions and deploying all of
OpenStack with different versions of those dependencies is
untested.
As noted above, I think gathering the min versions/maskings from
OpenStack projects would be valuable (especially to packagers, who
already use our likely-invalid values).
OK. I don't feel that strongly about the cleanup work, so if we want to
keep the lower bounds in place I think that's OK.
Post by Matthew Thode
Post by Doug Hellmann
After these 3 steps are done, the requirements team will continue
to maintain the global-requirements.txt and upper-constraints.txt
files, as before. Adding a new dependency to a project will still
involve a review step to add it to the global list so we can monitor
licensing, duplication, python 3 support, etc. But adjusting the
version numbers once that dependency is in the global list will be
easier.
Thanks for writing this up, I think it looks good in general, but like
you mentioned before, there is some discussion to be had about gathering
and creating a versionspec from all of openstack for requirements.
--
Matthew Thode (prometheanfire)
Doug Hellmann
2018-03-15 23:43:49 UTC
Permalink
Post by Matthew Thode
Post by Doug Hellmann
Post by Matthew Thode
Post by Doug Hellmann
What I Want to Do
-----------------
1. Update the requirements-check test job to change the check for
an exact match to be a check for compatibility with the
upper-constraints.txt value.
We would check the value for the dependency from upper-constraints.txt
against the range of allowed values in the project. If the
constraint version is compatible, the dependency range is OK.
This rule means that in order to change the dependency settings
for a project in a way that is incompatible with the constraint,
the constraint (and probably the global requirements list) would
have to be changed first in openstack/requirements. However, if
the change to the dependency is still compatible with the
constraint, no change would be needed in openstack/requirements.
For example, if the global list constrains a library to X.Y.Z
and a project lists X.Y.Z-2 as the minimum version but then needs
to raise that because it needs a feature in X.Y.Z-1, it can do
that with a single patch in-tree.
I think what may be better is for global-requirements to become a
gathering place where the projects the requirements team watches have
their smallest constrained installable set defined.
Upper-constraints has a req of foo===2.0.3
Project A has a req of foo>=1.0.0,!=1.6.0
Project B has a req of foo>=1.4.0
Global reqs would be updated with foo>=1.4.0,!=1.6.0
Project C comes along and sets foo>=2.0.0
Global reqs would be updated with foo>=2.0.0
This would make global-reqs descriptive rather than prescriptive for
versioning and would represent the 'true' version constraints of
openstack.
It sounds like you're suggesting syncing in the other direction, which
could be useful. I think we can proceed with what I've described and
consider the work to build what you describe as a separate project.
Yes, this would be a follow-on thing.
Post by Doug Hellmann
Post by Matthew Thode
Post by Doug Hellmann
We also need to change requirements-check to look at the exclusions
to ensure they all appear in the global-requirements.txt list
(the local list needs to be a subset of the global list, but
does not have to match it exactly). We can't have one project
excluding a version that others do not, because we could then
end up with a conflict with the upper constraints list that could
wedge the gate as we had happen in the past.
How would this happen when using constraints? A project is not allowed
to have a requirement that masks a constraint (and this would be
verified via the requirements-check job).
If project A excludes version X before the constraint list is updated to
use it, and then project B starts trying to depend on version X, they
become incompatible.
We need to continue to manage our declarations of incompatible versions
to ensure that the constraints list is a good list of versions to test
everything under.
Post by Matthew Thode
There's a failure mode not covered: a project could add a mask (!=) to
its requirements before we update constraints. The project that was
passing the requirements-check job would then become incompatible. This
means that requirements-check would need to be run for each changeset
to catch this as soon as it happens, instead of running only on
requirements changes.
I'm not clear on what you're describing here, but it sounds like a
variation of the failure modes that would be prevented if we require
exclusions to exist in the global list before they could be added to the
local list.
Yes, that'd work (require exclusions to be global before local).
OK. That's what I was trying to describe as the new rules.
Post by Matthew Thode
Post by Doug Hellmann
Post by Matthew Thode
Post by Doug Hellmann
We also need to verify that projects do not cap dependencies for
the same reason. Caps prevent us from advancing to versions of
dependencies that are "too new" and possibly incompatible. We
can manage caps in the global requirements list, which would
cause that list to calculate the constraints correctly.
This change would immediately allow all projects currently
following the global requirements lists to specify different
lower bounds from that global list, as long as those lower bounds
still allow the dependencies to be co-installable. (The upper
bounds, managed through the upper-constraints.txt list, would
still be built by selecting the newest compatible version because
that is how pip's dependency resolver works.)
2. We should stop syncing dependencies by turning off the
propose-update-requirements job entirely.
Turning off the job will stop the bot from proposing more
dependency updates to projects.
As part of deleting the job we can also remove the "requirements"
case from playbooks/proposal/propose_update.sh, since it won't
need that logic any more. We can also remove the update-requirements
command from the openstack/requirements repository, since that
is the tool that generates the updated list and it won't be
needed if we aren't proposing updates any more.
3. Remove the minimum specifications from the global requirements
list to make clear that the global list is no longer expressing
minimums.
This clean-up step has been a bit more controversial among the
requirements team, but I think it is a key piece. As the minimum
versions of dependencies diverge within projects, there will no
longer *be* a real global set of minimum values. Tracking a list of
"highest minimums" would require either rebuilding the list from the
settings in all projects or two patches to change the minimum version
of a dependency within a project.
Maintaining a global list of minimums also implies that we
consider it OK to run OpenStack as a whole with that list. This
message conflicts with the message we've been sending about the
upper constraints list since that was established, which is that
we have a known good list of versions and deploying all of
OpenStack with different versions of those dependencies is
untested.
As noted above, I think gathering the min versions/maskings from
OpenStack projects would be valuable (especially to packagers, who
already use our likely-invalid values).
OK. I don't feel that strongly about the cleanup work, so if we want to
keep the lower bounds in place I think that's OK.
Post by Matthew Thode
Post by Doug Hellmann
After these 3 steps are done, the requirements team will continue
to maintain the global-requirements.txt and upper-constraints.txt
files, as before. Adding a new dependency to a project will still
involve a review step to add it to the global list so we can monitor
licensing, duplication, python 3 support, etc. But adjusting the
version numbers once that dependency is in the global list will be
easier.
Thanks for writing this up, I think it looks good in general, but like
you mentioned before, there is some discussion to be had about gathering
and creating a versionspec from all of openstack for requirements.
Andreas Jaeger
2018-03-16 09:59:34 UTC
Permalink
thanks for the proposal, Doug. I need an example to understand how
things will work out...

so, let me use a real-life example (version numbers are made up):

openstackdocstheme uses sphinx and needs sphinx 1.6.0 or higher but
knows version 1.6.7 is broken.

So, openstackdocstheme would add to its requirements file:
sphinx>=1.6.0,!=1.6.7

Any project might assume they work with an older version, and have in
their requirements file:
Sphinx>=1.4.0
openstackdocstheme

The global requirements file would just contain:
openstackdocstheme
sphinx!=1.6.7

The upper-constraints file would contain:
sphinx===1.7.1

If we need to block sphinx 1.7.x - as we do right now - we only update
requirements repo to have in global requirements file:
openstackdocstheme
sphinx!=1.6.7,<1.7.0

and have in upper-constraints:
sphinx===1.6.6

But projects should *not* add the cap in their own requirements like:
sphinx>=1.6.0,!=1.6.7,<1.7.0

Is that all correct?
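The walk-through above can be sanity-checked with the packaging library (a sketch using the made-up version numbers from the example):

```python
# Sanity-checking the sphinx walk-through (version numbers are the
# made-up ones from the example above).
from packaging.specifiers import SpecifierSet

project = SpecifierSet(">=1.4.0")        # a project's declared range
theme = SpecifierSet(">=1.6.0,!=1.6.7")  # openstackdocstheme's range

# Before the cap: the pin sphinx===1.7.1 satisfies both ranges.
print("1.7.1" in project and "1.7.1" in theme)   # True

# After rolling constraints back to sphinx===1.6.6, both ranges still
# admit the pin, so neither project needs to add a cap in-tree.
print("1.6.6" in project and "1.6.6" in theme)   # True
```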

Andreas
--
Andreas Jaeger aj@{suse.com,opensuse.org} Twitter: jaegerandi
SUSE LINUX GmbH, Maxfeldstr. 5, 90409 NĂ¼rnberg, Germany
GF: Felix Imendörffer, Jane Smithard, Graham Norton,
HRB 21284 (AG NĂ¼rnberg)
GPG fingerprint = 93A3 365E CE47 B889 DF7F FED1 389A 563C C272 A126


Doug Hellmann
2018-03-21 20:02:06 UTC
Permalink
Post by Doug Hellmann
TL;DR
-----
Let's stop copying exact dependency specifications into all our
projects to allow them to reflect the actual versions of things
they depend on. The constraints system in pip makes this change
safe. We still need to maintain some level of compatibility, so the
existing requirements-check job (run for changes to requirements.txt
within each repo) will change a bit rather than going away completely.
We can enable unit test jobs to verify the lower constraint settings
at the same time that we're doing the other work.
The new job definition is in https://review.openstack.org/555034 and I
have updated the oslo.config patch I mentioned before to use the new job
instead of one defined in the oslo.config repo (see
https://review.openstack.org/550603).

I'll wait for that job patch to be reviewed and approved before I start
adding the job to a bunch of other repositories.

Doug
Doug Hellmann
2018-03-22 20:16:06 UTC
Permalink
Post by Doug Hellmann
Post by Doug Hellmann
TL;DR
-----
Let's stop copying exact dependency specifications into all our
projects to allow them to reflect the actual versions of things
they depend on. The constraints system in pip makes this change
safe. We still need to maintain some level of compatibility, so the
existing requirements-check job (run for changes to requirements.txt
within each repo) will change a bit rather than going away completely.
We can enable unit test jobs to verify the lower constraint settings
at the same time that we're doing the other work.
The new job definition is in https://review.openstack.org/555034 and I
have updated the oslo.config patch I mentioned before to use the new job
instead of one defined in the oslo.config repo (see
https://review.openstack.org/550603).
I'll wait for that job patch to be reviewed and approved before I start
adding the job to a bunch of other repositories.
Doug
The job definition for openstack-tox-lower-constraints [1] was approved
today (thanks AJaegar and pabelenger).

I have started proposing the patches to add that job to the repos listed
in openstack/requirements/projects.txt using the topic
"requirements-stop-syncing" [2]. I hope to have the rest of those
proposed by the end of the day tomorrow, but since they have to run in
batches I don't know if that will be possible.

The patch to remove the update proposal job is ready for review [3].

As is the patch to allow project requirements to diverge by changing the
rules in the requirements-check job [4].

We ran into a snag with a few of the jobs for projects that rely on
having service projects installed. There have been a couple of threads
about that recently, but Monty has promised to start another one to
provide all of the necessary context so we can fix the issues and move
ahead.

Doug

[1] https://review.openstack.org/555034
[2] https://review.openstack.org/#/q/topic:requirements-stop-syncing+(status:open+OR+status:merged)
[3] https://review.openstack.org/555426
[4] https://review.openstack.org/555402

Doug Hellmann
2018-03-25 20:04:11 UTC
Permalink
Post by Doug Hellmann
[...]
The job definition for openstack-tox-lower-constraints [1] was approved
today (thanks AJaegar and pabelenger).
I have started proposing the patches to add that job to the repos listed
in openstack/requirements/projects.txt using the topic
"requirements-stop-syncing" [2]. I hope to have the rest of those
proposed by the end of the day tomorrow, but since they have to run in
batches I don't know if that will be possible.
The patch to remove the update proposal job is ready for review [3].
As is the patch to allow project requirements to diverge by changing the
rules in the requirements-check job [4].
We ran into a snag with a few of the jobs for projects that rely on
having service projects installed. There have been a couple of threads
about that recently, but Monty has promised to start another one to
provide all of the necessary context so we can fix the issues and move
ahead.
Doug
All of the patches to define the lower-constraints test jobs have been
proposed [1], and many have already been approved and merged (thank you
for your quick reviews).

A few of the jobs are failing because the projects depend on installing
some other service from source. We will work out what to do with those
when we solve that problem in a more general way.

A few of the jobs failed because the dependencies were wrong. In a few
cases I was able to figure out what was wrong, but I can use some help
from project teams more familiar with the code bases to debug the
remaining failures.

In a few cases projects didn't have python 3 unit test jobs, so I
configured the new job to use python 2. Teams should add a step to their
python 3 migration plan to update the version of python used in the new
job, when that is possible.

I believe we are now ready to proceed with updating the
requirements-check job to relax the rules about which changes are
allowed [2].

Doug

[1] https://review.openstack.org/#/q/topic:requirements-stop-syncing+status:open
[2] https://review.openstack.org/555402
Doug Hellmann
2018-03-25 20:08:45 UTC
Permalink
Post by Doug Hellmann
A few of the jobs failed because the dependencies were wrong. In a few
cases I was able to figure out what was wrong, but I can use some help
from project teams more familiar with the code bases to debug the
remaining failures.
If you need to raise the lower bounds in a requirements file, please
update that file as well as lower-constraints.txt in the patch. You may
need to add a Depends-On for https://review.openstack.org/555402 in
order to have a version specifier that is different from the value in
the global requirements list.
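
For anyone who has not used it: Depends-On is just a footer in the Gerrit commit message. A hypothetical example (the subject line and Change-Id are placeholders; the review URL is the one above):

```
Raise lower bounds to match lower-constraints.txt

Depends-On: https://review.openstack.org/555402
Change-Id: I0000000000000000000000000000000000000000
```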

Doug

Robert Collins
2018-03-25 21:46:22 UTC
Permalink
Post by Doug Hellmann
[...]
Nice stuff; I'm so glad to see this evolution happening.

-Rob

Doug Hellmann
2018-03-25 23:21:45 UTC
Permalink
Post by Robert Collins
[...]
Nice stuff; I'm so glad to see this evolution happening.
-Rob
Thanks for laying such a firm foundation for us, Robert!

Doug

Doug Hellmann
2018-05-14 12:52:08 UTC
Permalink
Post by Doug Hellmann
[...]
We still have about 50 open patches related to adding the
lower-constraints test job. I'll keep those open until the third
milestone of the Rocky development cycle, and then abandon the rest to
clear my gerrit view so it is usable again.

If you want to add lower-constraints tests to your project and have
an open patch in the list [1], please take it over and fix the
settings then approve the patch (the fix usually involves making
the values in lower-constraints.txt match the values in the various
requirements.txt files).
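
If it helps, the usual flow for taking over one of these open reviews is roughly the following; the change number is a placeholder, and `tox -e lower-constraints` assumes the conventional mapping from the openstack-tox-lower-constraints job name to a tox environment of the same name:

```
git review -d 123456            # download the open change (placeholder number)
# edit lower-constraints.txt so the pins match the requirements files, then:
tox -e lower-constraints        # run the new job's test environment locally
git commit -a --amend           # amend, keeping the same Change-Id
git review                      # upload the updated patch set
```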

If you don't want the job, please leave a comment on the patch to
tell me and I will abandon it.

Doug

[1] https://review.openstack.org/#/q/topic:requirements-stop-syncing+status:open


Doug Hellmann
2018-03-28 22:53:03 UTC
Permalink
We're making good progress. Some of the important parts of the
global job changes are in place. There are still a lot of open
patches to add the lower-constraints jobs to repos, however.

Excerpts from Doug Hellmann's message of 2018-03-15 07:03:11 -0400:

[...]
Post by Doug Hellmann
What I Want to Do
-----------------
1. Update the requirements-check test job to change the check for
an exact match to be a check for compatibility with the
upper-constraints.txt value.
This change has merged: https://review.openstack.org/#/c/555402/

There are some additional changes to that job still in the queue.
In particular, the change in https://review.openstack.org/#/c/557034/3
will start enforcing some rules to ensure the lower-constraints.txt
settings stay at the bottom of the requirements files.
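
To illustrate the relaxed rule: instead of demanding an exact match with the global file, the check only needs the pinned upper-constraints version to satisfy whatever range the project declares. A minimal sketch using the `packaging` library (the package and version numbers are invented, not taken from the real files):

```python
from packaging.requirements import Requirement
from packaging.version import Version

def compatible_with_upper_constraint(req_line: str, pinned: str) -> bool:
    """Return True if the upper-constraints pin satisfies the project's range."""
    req = Requirement(req_line)
    return Version(pinned) in req.specifier

# A project may now declare its own (divergent) lower bound ...
print(compatible_with_upper_constraint("oslo.config>=5.1.0", "6.0.0"))          # True
# ... but a range that excludes the tested pin would still be rejected.
print(compatible_with_upper_constraint("oslo.config>=5.1.0,<6.0.0", "6.0.0"))   # False
```
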

Because we had some communication issues and did a few steps out
of order, when this patch lands projects that have approved
bot-proposed requirements updates may find that their requirements
and lower-constraints files no longer match, which may lead to job
failures. It should be easy enough to fix the problems by making
the values in the constraints files match the values in the
requirements files (by editing either set of files, depending on
what is appropriate). I apologize for any inconvenience this causes.
Post by Doug Hellmann
2. We should stop syncing dependencies by turning off the
propose-update-requirements job entirely.
This is also done: https://review.openstack.org/#/c/555426/
Post by Doug Hellmann
3. Remove the minimum specifications from the global requirements
list to make clear that the global list is no longer expressing
minimums.
This clean-up step has been a bit more controversial among the
requirements team, but I think it is a key piece. As the minimum
versions of dependencies diverge within projects, there will no
longer *be* a real global set of minimum values. Tracking a list of
"highest minimums" would require either rebuilding the list from the
settings in all projects or submitting two patches to change the
minimum version of a dependency within a project.
Maintaining a global list of minimums also implies that we
consider it OK to run OpenStack as a whole with that list. This
message conflicts with the message we've been sending about the
upper constraints list since that was established, which is that
we have a known good list of versions and deploying all of
OpenStack with different versions of those dependencies is
untested.
We've decided not to do this step, because some of the other
requirements team members want to use those lower bound values.
Projects are no longer required to be consistent with the lower
bounds in that global file, however.
Post by Doug Hellmann
Testing Lower Bounds of Dependencies
------------------------------------
[...]
Post by Doug Hellmann
The results of those steps can be combined into a single patch and
proposed to the project. To avoid overwhelming zuul's job configuration
resolver, we need to propose the patches in separate batches of
about 10 repos at a time. This is all mostly scriptable, so I will
write a script and propose the patches (unless someone else wants to do
it all -- we need a single person to keep up with how many patches we're
proposing at one time).
The point of creating the initial lower-constraints.txt file is not
necessarily to be "accurate" with the constraints immediately, but
to have something to work from. After the patches are proposed,
please either plan to land them or vote -2 indicating that you don't
want a job like that on that repo. If you want to change the
constraints significantly, please do that in a separate patch. With
~325 of them, I'm not going to be able to keep up with everyone's
separate needs and this is all meant to just establish the initial
version of the job anyway.
I ended up needing fewer patches than expected because many of the
projects receiving requirements syncs didn't have unit test jobs
(ansible roles and some other packaging-related things that are tested
in other ways).

Approvals have been making good progress. As I say above, if you
have minor issues with the patch, either propose a fix on top of
it or take it over and fix it directly. Even though there are fewer
patches than I expected, I'm still not going to be able to keep up
with lots of individual differences or merge conflicts
in projects. Help wanted.
Post by Doug Hellmann
For projects that currently only support python 2 we can modify the
proposed patches so they do not set base-python to python3.
You will have noticed that this will only apply to unit test jobs.
Projects are free to use the results to add their own functional
test jobs using the same lower-constraints.txt files, but that's
up to them to do.
I'm not aware of anyone trying to do this, yet. If you are, please let
us know how it's going.

Doug
Doug Hellmann
2018-04-04 13:58:48 UTC
Permalink
Post by Doug Hellmann
Because we had some communication issues and did a few steps out
of order, when this patch lands projects that have approved
bot-proposed requirements updates may find that their requirements
and lower-constraints files no longer match, which may lead to job
failures. It should be easy enough to fix the problems by making
the values in the constraints files match the values in the
requirements files (by editing either set of files, depending on
what is appropriate). I apologize for any inconvenience this causes.
In part because of this, and in part because of some issues calculating
the initial set of lower-constraints, we have several projects where
their lower-constraints don't match the lower bounds in the requirements
file(s). Now that the check job has been updated with the new rules,
this is preventing us from landing the patches to add the
lower-constraints test job (so those rules are working!).

I've prepared a script to help fix up the lower-constraints.txt
based on values in requirements.txt and test-requirements.txt.
That's not everything, but it should make it easier to fix the rest.

See https://review.openstack.org/#/c/558610/ for the script. I'll work
on those pep8 errors later today so we can hopefully land it soon, but
in the mean time you'll need to check out that commit and follow the
instructions for setting up a virtualenv to run the script.
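
The review above has the real script; as a rough sketch of the idea, assuming simple one-line specifiers, it boils down to extracting each `>=` lower bound and emitting an `==` pin:

```python
import re

def lower_bounds(requirements_text: str) -> dict:
    """Map package name -> the version named in its >= specifier."""
    bounds = {}
    for line in requirements_text.splitlines():
        line = line.split("#")[0].strip()   # drop trailing license comments
        m = re.match(r"([A-Za-z0-9._-]+).*?>=([0-9][0-9a-zA-Z.]*)", line)
        if m:
            bounds[m.group(1)] = m.group(2)
    return bounds

def render_lower_constraints(requirements_text: str) -> str:
    """Produce lower-constraints.txt pins matching the declared minimums."""
    bounds = lower_bounds(requirements_text)
    return "\n".join(f"{name}=={version}" for name, version in sorted(bounds.items()))

reqs = """\
pbr!=2.1.0,>=2.0.0  # Apache-2.0
oslo.config>=5.2.0  # Apache-2.0
"""
print(render_lower_constraints(reqs))
```

The real tool also has to cope with markers, extras, and test-requirements.txt, which is why checking out the review is the better route.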

Doug

super user
2018-04-06 08:10:32 UTC
Permalink
Hope you fix this soon; there are many patches blocked by the 'match the
minimum version' problem, which causes requirements-check to fail.
Post by Doug Hellmann
[...]
Doug Hellmann
2018-04-06 11:42:14 UTC
Permalink
Post by super user
Hope you fix this soon, there are many patches depend on the 'match the
minimum version' problem which causes requirements-check fail.
The problem is with *those patches* and not the check.

I've been trying to update some, but my time has been limited this week
for personal reasons. I encourage project teams to run the script I
provided or edit their lower-constraints.txt file by hand to fix the
issues.

Doug

super user
2018-04-06 15:47:11 UTC
Permalink
I will help to update some.
Post by Doug Hellmann
[...]