Using IGWN credentials with HTCondor
If your workflow jobs require access to restricted IGWN services, you may need to configure your job to include an authorisation credential.
There are two types of credentials you can use with HTCondor jobs: SciTokens and X.509 certificates.
Work in progress
Documenting best-practice for using Kerberos credentials in an HTCondor workflow is a work in progress.
Please consider contributing to help us complete this section.
SciTokens are capability tokens that inform services that the bearer of the token should be allowed access to a specific capability (e.g. read) on a specific service. See SciTokens for more details on what SciTokens are and how to use them in general.
For full details on specifying credentials in a job, please see the HTCondor documentation.
The basic usage valid for the majority of IGWN contexts is as follows:
Using a token in a job
To use a SciToken in an HTCondor job, add these commands to your HTCondor submit instructions:
```
use_oauth_services = <issuer>
<issuer>_oauth_resource = <service-url>
<issuer>_oauth_permissions = <capability>
```
- `<issuer>` is the name of the token issuer to use. In all cases for production workflows this should be `igwn`. For some testing applications, you can use a different issuer.
- `<service-url>` is the fully-qualified URL of the service to access. This is also referred to as the 'audience' (`aud`) of the token. This is optional, and will default to `igwn_oauth_resource = ANY`.
- `<capability>` is the access level that is needed. This is also referred to as the 'scope' (`scope`) of the token. This can be a space-separated list of multiple scopes. This is optional, and will default to the default set of scopes for all users (as defined by vault.ligo.org).
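The `<service-url>` and `<capability>` values end up inside the generated token as the `aud` and `scope` claims of a JWT. As a quick sanity check you can decode a token's payload without verifying it; the snippet below is an illustrative sketch only (the helper name is ours, and for real validation you should use a proper SciTokens library):

```python
import base64
import json


def token_claims(token):
    """Decode the payload of a serialized SciToken (a JWT) WITHOUT
    verifying its signature, to inspect claims such as 'aud' and 'scope'.
    Illustrative only: use a SciTokens library for real validation.
    """
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))
```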
For example, to enable queries to the GWDataFind service located at `https://datafind.igwn.org` you would use:

```
use_oauth_services = igwn
igwn_oauth_resource = https://datafind.igwn.org
igwn_oauth_permissions = gwdatafind.read
```
With these instructions, HTCondor would automatically generate a new token for you, and would transfer it to the execute point into the `.condor_creds` directory. This is typically a subdirectory of the job scratch directory, but the value is stored in the `$_CONDOR_CREDS` environment variable.
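From inside a job's executable, the transferred token files can be found by listing that directory. A minimal Python sketch (the `find_condor_tokens` helper name and its explicit-directory argument are illustrative, not part of any IGWN API):

```python
import os
from pathlib import Path


def find_condor_tokens(creds_dir=None):
    """Return a mapping of token name -> path for every '.use' token
    file in the HTCondor credentials directory.

    Defaults to the directory named by the $_CONDOR_CREDS environment
    variable, which HTCondor sets for jobs that request OAuth tokens.
    """
    creds_dir = Path(creds_dir or os.environ["_CONDOR_CREDS"])
    return {path.stem: path for path in sorted(creds_dir.glob("*.use"))}
```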
For a single-token job like the one above, the token filename will be `<issuer>.use`, i.e. `igwn.use`.
Single token path
For single-token jobs, the token will be generated on the execute machine as `$_CONDOR_CREDS/igwn.use`, which is typically the same as `$(CondorScratchDir)/.condor_creds/igwn.use`.
Most IGWN SciToken clients should be able to automatically discover the appropriate token file inside the `$_CONDOR_CREDS` directory, so you shouldn't actually need to care where the token file exists at the execute point.
To use this token with client tools that do not support discovering tokens inside `$_CONDOR_CREDS`, you can set the `BEARER_TOKEN_FILE` environment variable in your HTCondor submit file:

```
environment = "BEARER_TOKEN_FILE=$$(CondorScratchDir)/.condor_creds/igwn.use"
```
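Client tools that do support token discovery typically follow the WLCG bearer-token discovery convention: check `BEARER_TOKEN`, then the file named by `BEARER_TOKEN_FILE`, then a default `bt_u<uid>` location. A rough sketch of that lookup order (illustrative only; use a real discovery library such as `igwn-auth-utils` in practice):

```python
import os


def discover_bearer_token(environ=None):
    """Sketch of WLCG bearer-token discovery: return the token string
    from $BEARER_TOKEN, else the contents of $BEARER_TOKEN_FILE, else
    the default bt_u<uid> file, else None.
    """
    environ = os.environ if environ is None else environ
    token = environ.get("BEARER_TOKEN")
    if token:
        return token.strip()
    path = environ.get("BEARER_TOKEN_FILE")
    if path and os.path.exists(path):
        with open(path) as fobj:
            return fobj.read().strip()
    runtime_dir = environ.get("XDG_RUNTIME_DIR", "/tmp")
    default = os.path.join(runtime_dir, "bt_u{}".format(os.getuid()))
    if os.path.exists(default):
        with open(default) as fobj:
            return fobj.read().strip()
    return None
```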
Tokens are shared across all jobs for a user
Tokens are generated and stored on the access point independently of the jobs that request them, so concurrent or consecutive jobs for the same user cannot request different token permissions without special consideration.
Submitting a second job that requires different token permissions from an existing job may result in a submission failure that looks something like this:
```
$ condor_submit science.sub
Submitting job(s)
condor_vault_storer: Credentials exist that do not match the request.
They can be removed by
    condor_store_cred delete-oauth -s igwn
but make sure no other job is using them.
More details might be available by running
    condor_vault_storer -v "igwn&scopes=gwdatafind.read&audience=https://datafind.igwn.org"
ERROR: (0) invoking /usr/bin/condor_vault_storer
```
The solution for this is to use token handles.
Tokens may be given a 'handle' to allow HTCondor to distinguish between different sets of permissions and resources.
Handles are specified as a suffix to the `<issuer>_oauth_resource` and `<issuer>_oauth_permissions` commands, as follows:

```
use_oauth_services = igwn
igwn_oauth_resource_gracedb = https://gracedb.ligo.org
igwn_oauth_permissions_gracedb = gracedb.read
```
In the above example the handle is `gracedb`.
Tokens with handles are stored using the filename `<issuer>_<handle>.use`. In the above example, the token will be generated on the access point as `igwn_gracedb.use` and made available on the execute point as `$_CONDOR_CREDS/igwn_gracedb.use`.
Using handles for tokens enables submitting multiple different jobs with different token capabilities without clashes.
Using handles for tokens also enables submitting a job that requires multiple tokens.
To use multiple SciTokens in an HTCondor job, specify multiple tokens with unique handles in the same set of submit commands:
```
use_oauth_services = igwn
igwn_oauth_resource_token1 = <service-url-1>
igwn_oauth_permissions_token1 = <capability-1>
igwn_oauth_resource_token2 = <service-url-2>
igwn_oauth_permissions_token2 = <capability-2>
```

This will generate multiple token files with the following names: `igwn_token1.use` and `igwn_token2.use`.
For example, to enable queries to GWDataFind at `https://datafind.igwn.org` and to GraceDB at `https://gracedb.ligo.org` in the same job:

```
use_oauth_services = igwn
igwn_oauth_resource_gwdatafind = https://datafind.igwn.org
igwn_oauth_permissions_gwdatafind = gwdatafind.read
igwn_oauth_resource_gracedb = https://gracedb.ligo.org
igwn_oauth_permissions_gracedb = gracedb.read
```
Simpler to use a single, multi-capability token
While using multiple tokens is valid usage, it is probably simpler to use a single token with multiple capabilities.
Refactoring the above example:
```
use_oauth_services = igwn
igwn_oauth_resource = https://datafind.igwn.org https://gracedb.ligo.org
igwn_oauth_permissions = gwdatafind.read gracedb.read
```
1. Downloading proprietary IGWN data via OSDF
The following HTCondor submit commands can be used to configure a job with the necessary permissions to transfer IGWN proprietary h(t) data from OSDF to a job:
```
use_oauth_services = igwn
igwn_oauth_permissions = read:/ligo read:/virgo
should_transfer_files = yes
transfer_input_files = igwn+osdf:///igwn/ligo/frames/O4/hoft_C00/H1/H-H1_HOFT_C00-137/H-H1_HOFT_C00-1373577216-4096.gwf
```
In the above example, the `igwn+osdf` URL 'scheme' includes the name of the token (`igwn` in this case), to tell the appropriate HTCondor file-transfer plugin to use that access token when attempting to download the data.
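The token-name prefix can be picked out of such a URL with a one-line scheme split; this sketch (the helper name is ours) mirrors the convention described above:

```python
from urllib.parse import urlparse


def token_name_from_url(url):
    """Extract the token name from a '<token>+osdf://...' style URL,
    mirroring the scheme convention described above.
    Returns None for a plain 'osdf://' URL with no token prefix.
    """
    scheme = urlparse(url).scheme
    if "+" in scheme:
        return scheme.split("+", 1)[0]
    return None
```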
Currently token handles are unsupported
Using token handles with OSDF URLs is currently unsupported.
Please follow https://opensciencegrid.atlassian.net/browse/HTCONDOR-1731 for a resolution to this issue.
2. Reading proprietary IGWN data from CVMFS
The following condor submit commands can be used to configure a job with the necessary permissions to read IGWN proprietary h(t) data from CVMFS:
```
use_oauth_services = igwn
igwn_oauth_permissions = read:/ligo read:/virgo
environment = "BEARER_TOKEN_FILE=$$(CondorScratchDir)/.condor_creds/igwn.use"
```
Authenticated CVMFS requires `BEARER_TOKEN_FILE`
The helper tool that does credential handling for CVMFS does not know to look inside the `$_CONDOR_CREDS` directory of an HTCondor job, so you must set the `BEARER_TOKEN_FILE` environment variable to enable CVMFS to use the token transferred with the job.
X.509 is no longer fully supported
Identity-based X.509 credentials are deprecated in favour of capability-based SciTokens in almost all cases, so please consider using the instructions for tokens above.
For details on the timescale on which support for X.509 certificates will be fully dropped, please see the IGWN computing documentation.
For details on which use cases still require X.509 over SciTokens, please contact the Computing and Software Working Group (email@example.com).
X.509 is a credential standard used to encode an identity so that a service can authenticate a request and enable capabilities based on its own records of what users should be allowed to do.
See X.509 for more details on what X.509 is and how to use it in general.
Using an X.509 credential in a job
Generate the X.509 credential manually
Using X.509 with HTCondor requires manually generating the credential before submitting the job.
Please see How to generate a credential for documentation on how to generate an X.509 credential.
To use an X.509 credential file in an HTCondor job, add one of the following commands to your submit instructions:
To automatically discover the credential file based on your environment:
```
use_x509userproxy = true
```
To manually specify the path of the credential file:
```
x509userproxy = /path/to/myproxy.pem
```
In either case, the credential will be transferred onto the execute machine with your job, and its path encoded in the `$X509_USER_PROXY` environment variable.
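Inside the job, scripts can locate the proxy in the usual grid fashion: prefer `$X509_USER_PROXY`, then fall back to the conventional `/tmp/x509up_u<uid>` location used by grid tools. A minimal sketch (the helper name is illustrative):

```python
import os


def find_x509_proxy(environ=None):
    """Return the path of an X.509 proxy file: $X509_USER_PROXY if it
    names an existing file, else the conventional /tmp/x509up_u<uid>
    path if that exists, else None.
    """
    environ = os.environ if environ is None else environ
    path = environ.get("X509_USER_PROXY")
    if path and os.path.exists(path):
        return path
    default = "/tmp/x509up_u{}".format(os.getuid())
    return default if os.path.exists(default) else None
```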