# Package change workflow

This page describes the workflow for implementing change requests that have been detailed in an SCCB ticket.
## Adding/updating packages
### External

To add/update an External package, add or modify a single-line reference to the package in the `packages/upstream.yaml` file.

Please remember to follow the guidance rules regarding external packages.
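For illustration, a single-line entry might look like the following (the package name and version pin are hypothetical; the exact syntax should follow the existing entries in `packages/upstream.yaml`):

```yaml
# hypothetical single-line entry in packages/upstream.yaml
- my-external-package=1.2.3
```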
### Internal

To add/update an Internal package, add or modify a YAML file for that package. This will typically require three things:

- Identify the right `build_string` lines for each Python version, platform, and architecture.
- Work out what platforms, if any, need to be skipped.
- Configure some integration tests.
An in-place description of the YAML syntax can be found in the `packages/template.yaml` exemplar.
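As a sketch, an internal package file combining the keys described in the sections below might look like the following (all field values are hypothetical; `packages/template.yaml` is the authoritative reference for the syntax):

```yaml
# hypothetical internal package file, e.g. packages/mypackage.yaml
package:
  name: mypackage
  version: 1.2.3
  build_string: py38h8790de6_0  # [linux and x86_64 and py==38]
skip: true  # [win]
requirements:
  - pytest-freezegun
test:
  imports:
    - mypackage
```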
### Determining version numbers and build strings

In order to add a new package, or upgrade an existing package, the new package version and build string(s) need to be determined.

The version number should be declared in the SCCB ticket, and can then be used to determine the build string(s) with the `search.py` script (from this repo):

```shell
python ./scripts/search.py <package> <version>
```
**`search.py` output**

```text
> python ./scripts/search.py python-framel 8.40.1
package:
  name: python-framel
  version: 8.40.1
  build_string: py36h785e9b2_0  # [linux and x86_64 and py==36]
  build_string: py37h03ebfcd_0  # [linux and x86_64 and py==37]
  build_string: py38h8790de6_0  # [linux and x86_64 and py==38]
  build_string: py36h255dfe6_0  # [osx and x86_64 and py==36]
  build_string: py37h10e2902_0  # [osx and x86_64 and py==37]
  build_string: py38h65ad66c_0  # [osx and x86_64 and py==38]
  build_string: py36h9902d54_0  # [win and x86_64 and py==36]
  build_string: py37h9dff50a_0  # [win and x86_64 and py==37]
  build_string: py38h377fac3_0  # [win and x86_64 and py==38]
```
**Support multiple platforms using selectors**

Specifying a unique `build_string` for multiple platforms, architectures, and Python versions is achieved using conda-build's pre-processing selectors. These selectors can be used on any line of the YAML file, and only when the selector evaluates to true will that line actually be used to define a package.
**Restricting the build**

You can use the `--build` option to specify a build string (or wildcard) to restrict the results to a single build.
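For example, to restrict the results above to the Python 3.8 builds (the wildcard below is illustrative):

```shell
python ./scripts/search.py python-framel 8.40.1 --build "py38*"
```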
### Skipping a package on certain platforms

While we strive to provide the IGWN Conda Distribution on a wide variety of platforms and architectures, not all packages are built for all platforms, architectures, or Python versions, so it may be necessary to declare `skip: true` for some packages. This should be done as a top-level key in the YAML file, using selectors to restrict when a package is skipped.

**Common skip examples**

```yaml
skip: true  # [py<39]
skip: true  # [win or (linux and not x86_64)]
```
### Optional requirements

An internal package should declare its own optional requirements as a list specified by the `requirements` key:

```yaml
requirements:
  - pytest-freezegun
```
Use cases for optional requirements include (but are not limited to):
- supporting an optional feature
- test requirements that aren't runtime requirements
- constraining a third-party package that is known to conflict with the internal package
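For instance, the last use case might look like the following (the package name and version pin are hypothetical):

```yaml
requirements:
  - some-conflicting-package <2.0  # hypothetical upper pin on a third-party package
```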
### Integration tests

Each internal package should be tested to ensure its integration into the distribution. In most cases this will mean adding tests directly to the YAML file for that package; however, in some cases a package is best tested by running tests defined for a downstream package.
There are a number of different types of tests, all of which can be used on their own, or together for a package.
#### Import tests

`import` tests specify one or more Python modules that are imported in a Python session as a basic sanity check.

Example:

```yaml
test:
  imports:
    - package
    - package.module
```
#### `pytest` tests

`pytest` tests declare a script or module that should be executed using pytest.

For a custom Python script, the argument for the `test/pytest` option should just be the name of the script, which should then be added to the `computing/conda` project alongside the YAML file:

```yaml
test:
  pytest: mypackage-tests.py
```
For packages that bundle their own test suite as a subpackage that is installed, you can declare the test using `--pyargs`:

```yaml
test:
  pytest: --pyargs mypackage
```
The `test/pytest` option can also specify a list of independent test scripts/modules to execute:

```yaml
test:
  pytest:
    - mypackage-tests.py
    - --pyargs mypackage
```
**`pytest` command execution**

`pytest` tests are executed using the following invocation:

```shell
python -m pytest --cache-clear --no-header -r a <args>
```

where `<args>` is the literal content of the `test/pytest` entry.
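For example, given the entry `pytest: --pyargs mypackage`, the executed command would be (the package name is hypothetical):

```shell
python -m pytest --cache-clear --no-header -r a --pyargs mypackage
```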
#### Script tests

`script` tests specify a Python, Bash, or PowerShell (depending on the platform) script file that is executed to test a package.

Example:

```yaml
test:
  script: mything-tests.sh
```
Each `script` should be added to the `computing/conda` project alongside the YAML file in the `packages/` directory.
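As an illustration, a minimal `mything-tests.sh` might look like the following sketch (the checks are hypothetical stand-ins; a real script would exercise the package's own tools):

```shell
#!/bin/bash
# hypothetical contents of mything-tests.sh; scripts are run with `bash -e`,
# so any failing command aborts the test
set -e
python3 --version          # check the interpreter is on the path
python3 -c "import os"     # stand-in check: a required module is importable
echo "all checks passed"
```

Because the script is run from a temporary directory (see below), it should not rely on any files from the `computing/conda` checkout.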
**Script execution**

`script` tests are executed using one of the following invocations, chosen based on the file extension of the `script` option:

```shell
bash -e <args>
bash.exe -el <args>
perl <args>
python <args>
```

All other file extensions will result in an error; however, it should be straightforward to modify `scripts/render.py` to support new extensions.
**Tests run in unique, temporary directories**

Each `script` is copied at runtime into a unique temporary directory, so must be able to run independently of the contents of the `computing/conda` project. The temporary directory is automatically cleaned up once the test completes, so manual clean-up of created/downloaded files is not required.
#### Command tests

`command` tests specify lines of shell code that are executed to test a package.

For example, to run a single command:

```yaml
test:
  command: mything --help
```

or to run multiple commands, specify a list:

```yaml
test:
  commands:
    - mything --help
    - myotherthing --help
```
Each line is treated as an independent test that is executed literally in a unique process, so multiple lines cannot be chained to form a script. In that case, use a Script test.
## Getting packages into testing

To get packages into testing, open a merge request against the `testing` branch. The CI will run to validate the changes. Once the CI passes, the MR can be merged immediately in order to deploy the new packages into the testing environments.
Migrating packages from testing to staging¶
To migrate packages from testing
into staging
:
- go back to the merge request you posted for
testing
, and click the Cherry-pick button, - in the popup, under Pick into branch, select
staging
- ensure that the Start a new merge request with these changes radio box is checked
- Click Cherry-pick.
**Apply the right labels**

Please then apply the `sccb::pending` label to indicate that the SCCB is still considering this request. Please also apply any other labels that seem appropriate.
**Wait for SCCB approval**

Once the CI passes, wait until the SCCB approves the request. As soon as the SCCB approves the request, the MR should then be merged to deploy the new packages into the staging environments.
## Creating a new stable release

When deployment time arrives, a new stable distribution should be created; see Tagging a release.
## Troubleshooting

### Determining why a package isn't available
If `search.py` errors, it may be because a package (at a particular version) isn't available in conda-forge.

You can perform a basic search to see if any version of that package is available (useful if the requested package is new):

```shell
conda search my-new-package
```
From here there are three likely outcomes:

1. The package isn't available at any version; it is likely that the requested package simply isn't packaged for conda yet. In that case you can check the conda-forge staged-recipes repository's Pull Requests list to see if there's a PR marked *Add my-new-package*. If there is a PR, just keep track of it and add the package when it becomes available. If there isn't a PR, you should contact the developer to ask whether they plan to add one; if they don't have a plan, give them a stern look and ask them to make one.

2. The package isn't available at the requested version, only earlier ones; it is likely that the conda-forge feedstock hasn't yet been updated to reflect the new upstream release. In this case you can check the feedstock repo Pull Requests list at https://github.com/conda-forge/my-new-package-feedstock/pulls/ (replace `my-new-package` with an actual package name). If the feedstock is well formed, the conda-forge autotick bot should make a PR automatically when an upstream tarball is uploaded (e.g. to pypi.python.org); otherwise you might have to open a ticket on the feedstock, or just directly contact the feedstock maintainers (likely the developers of the package themselves) to see what's going on.

3. The package isn't available at the specifically requested version, but is available at newer versions. In this case it's more trouble than it's worth to attempt to backport the given version for conda-forge; you should go back to the requester and see if the closest matching version that does exist in conda-forge will fulfil their request.
### Resolving conflicts

It is relatively common, when updating a package, that the new/updated package conflicts with one or more existing packages in the distribution. In my experience there are two reasons for this:

1. Direct conflict, where the requirements for the new/updated package and one of the existing packages form a mutually exclusive set. In this case the only thing you can really do is go back to the requester (of the new/updated package) and tell them there's a conflict. You may be able to convince the maintainer of the conflicting packages to update their requirements to resolve things.

2. Migration-based conflict, where the new/updated package has been built against a new version of some upstream library while existing packages were built against an older version. This is typically the case when conda-forge has migrated the upstream package to a new ABI version. In this case all immediate dependents of the relevant migrated upstream package must be updated to a new build that was made against the new upstream package. Examples of upstream packages for which migrations may require this include `gsl` and `root_base`. The full list of pinned packages that are subject to migration rules can be found in the conda-forge-pinning-feedstock.
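For illustration, resolving a migration-based conflict typically means updating each dependent's `build_string` line to one built against the new upstream, as reported by `search.py` (the hashes and build numbers below are hypothetical):

```yaml
# before: build made against the old upstream ABI
build_string: py38h8790de6_0  # [linux and x86_64 and py==38]
# after: rebuilt against the migrated upstream
build_string: py38hab2c0dc_1  # [linux and x86_64 and py==38]
```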