Metadata-Version: 2.1
Name: relval
Version: 2.5.9
Summary: Fedora QA wiki test validation event management tool
Home-page: https://pagure.io/fedora-qa/relval
Author: Adam Williamson
Author-email: awilliam@redhat.com
License: GPLv3+
Keywords: fedora qa mediawiki validation
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Topic :: Utilities
Classifier: License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Requires-Python: !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, <4
Description-Content-Type: text/markdown
License-File: COPYING

# relval

relval is a CLI tool, using [python-wikitcms][1], which helps create the wiki pages used to track the results of Fedora [release validation][2] [events][3], generates statistics such as [heroes of Fedora testing][4] and [test coverage][5], and can also report test results. If you're interested in relval, you may also be interested in [testdays][6], which is to Test Day pages as relval is to release validation pages.

Put simply, you can run `relval compose --release 25 --milestone Final --compose 1.1` and all the wiki pages that are needed for the Fedora 25 Final 1.1 release validation test event will be created (if they weren't already). The `user-stats` and `testcase-stats` sub-commands handle statistics generation, and `report-results` can report test results.

## Installation and use

relval is packaged in the official Fedora and EPEL 7+ repositories. To install on Fedora, run `dnf install relval`; on RHEL / CentOS with EPEL enabled, run `yum install relval`. You may need to enable the *updates-testing* repository to get the latest version. To install on other distributions, you can run `python setup.py install`.

You can visit [the relval project page on Pagure][7], and clone with `git clone https://pagure.io/fedora-qa/relval.git`. Tarballs are also [available][8].

You can use the relval CLI from the tarball without installing it, as `./run-relval.py` from the root of the tarball. You will need all its dependencies, which are listed in `setup.py`.
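The steps above can be combined like this (assuming `git` and the dependencies listed in `setup.py` are already available):

```shell
# Clone the repository and run the CLI in place, without installing
git clone https://pagure.io/fedora-qa/relval.git
cd relval
./run-relval.py -h    # show the available sub-commands and options
```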

## Bugs, pull requests etc.

You can file issues and pull requests on [Pagure][7].

## Usage

The [validation event SOP][3] provides the correct invocation of relval to use when you simply wish to create the pages for a new compose (the most common use case).

### User authentication (for commands requiring login)

The following applies to all commands that require login: anything that writes to the wiki, currently `compose`, `report-results`, and `size-check`.

Since early 2018, the Fedora wikis use OpenID Connect-based authentication. The first time you use any command that requires login, a browser window will open and walk you through the authentication process; this creates a login token that remains valid for some time, so subsequent use of these commands works transparently. Once the token expires, the next use of one of these commands will take you through the authentication process again.

The old `--username` and `--password` arguments, and the `~/.fedora/credentials` file which used to be available for storing your username and password for 'non-interactive' login, no longer do anything. It would be a good idea to remove any remaining credentials files, as they are now only a potential security risk. For long-term non-interactive usage of the wiki via relval or any other system, you must request a permanent auth token from the wiki administrators.

### Common options

All sub-commands honor the option `--test`, to operate on the staging wiki instead of the production wiki, which can be useful for testing. **Please** use this option if you are experimenting with the result page creation or result reporting sub-commands, especially if you also pass `--force`.
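For instance, to experiment with page creation safely, you can combine the example from the introduction with `--test` so everything is written to the staging wiki:

```shell
# Create the Fedora 25 Final 1.1 validation pages on the *staging* wiki
relval compose --release 25 --milestone Final --compose 1.1 --test
```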

All options mentioned here have short names (e.g. `-r` for `--release`), but the long names are given here for clarity. Usually the short name is the first letter of the long name. The help pages (`relval <sub-command> -h`) list all options with both their long and short names.

### `compose`

For validation event page creation, use the `compose` sub-command: `relval compose`. You must pass *either* the parameters `--milestone` and `--compose` (and optionally `--release`) *or* the parameter `--cid` to identify the compose for which pages will be created. When using `--milestone` and `--compose` you may also pass `--release` to specify the release to operate on; otherwise, relval will attempt to discover the 'next' release, and use that.

Several options modify what `compose` writes:

* `--testtype`: write only the page for a particular 'test type' (e.g. Base or Desktop), plus the summary page and category pages. Without it, the pages for all test types are written.
* `--no-current`: do not update the Test_Results:Current redirect pages to point to the newly-created pages (by default, they are updated).
* `--force`: force the creation of pages that already exist. This applies to the results pages, category page contents, and summary page, but not to the Current redirects, which are always written if page creation succeeds (unless `--no-current` is passed).
* `--download-only`: write only the Download template (which provides the table included in the instructions section of all the results pages). This is handy if you need to create or update the Download page for an existing event.
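A few illustrative invocations (the compose ID below is invented for illustration; substitute a real one):

```shell
# Identify the compose by release/milestone/compose...
relval compose --release 25 --milestone Final --compose 1.1

# ...or by compose ID (this ID is a made-up example)
relval compose --cid Fedora-25-20161115.n.0

# Rewrite only the Base page for an existing event, leaving the
# Test_Results:Current redirects untouched
relval compose --release 25 --milestone Final --compose 1.1 --testtype Base --force --no-current
```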

### `user-stats`

For user statistics generation, use `relval user-stats`. It has no required options.

* `--release`: specify the release to operate on; otherwise, relval will attempt to discover the 'next' release and use that.
* `--milestone (Alpha|Beta|Final)`: restrict the statistics to a single milestone. It does not accept Branched or Rawhide, but if you do not pass `--milestone` at all, Branched and Rawhide result pages will be included.
* `--filter`: may be passed as many times as you like. If passed, only pages whose names match *any* of the `--filter` parameters will be included. For instance, `relval user-stats --release 21 --milestone Beta --filter TC3 --filter Desktop` operates on all Fedora 21 Beta pages with "TC3" *or* "Desktop" in their names.
* `--bot`: include 'bot' results (those from automated test systems) in the statistics; by default they are excluded.

The result is the source of a simple HTML page, printed directly to the console, which you can save or paste into, for example, a blog post; it contains statistics on the users who contributed results to the chosen set of pages.
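Since the HTML is printed to the console, a typical invocation redirects it to a file:

```shell
# Stats for all Fedora 21 Beta pages matching "TC3" or "Desktop",
# saved for later publication
relval user-stats --release 21 --milestone Beta --filter TC3 --filter Desktop > user-stats.html
```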

### `testcase-stats`

For test coverage statistics generation, use `relval testcase-stats`. The parameters are the same as those for `user-stats`. The output is an entire directory of HTML pages in `/tmp`, with a top-level `index.html` that links to summary pages for each "test type", and detailed pages for each "unique test" linked from the summary pages. You can also pass `--out` to specify an output directory, *which will be deleted if it already exists*. You can simply place the entire directory on your web server in a sensible location. Note that the top-level directory will have 0700 permissions by default, so you may have to change this before the content is visible on the server.
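For example (the output path is just an illustration; remember that an existing directory at that path will be deleted):

```shell
# Generate coverage stats for Fedora 21 into a chosen directory,
# then loosen the default 0700 permissions so a web server can read it
relval testcase-stats --release 21 --out ./testcase_stats_21
chmod 755 ./testcase_stats_21
```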

### `report-results`

report-results lets you...report results. It edits the result pages in the wiki for you. Why yes, a hacky TUI that pretends MediaWiki is a structured data store *is* a deeply ridiculous thing, thank you for asking.

You may pass `--release`, `--milestone`, `--compose` and `--testtype` if you like. If you don't fully specify a compose version, it will first attempt to detect the 'current' compose and offer to let you report results against that; if you want to report against a different compose, it will prompt you for the details.

Once you've chosen a compose to report against one way or another, it will then ask you which page section to report a result in, and then which test to report a result for, then what type of result to submit, then whether you want to specify associated bug IDs and/or a comment. And then it will submit the result. Once you're done, you can submit another result for the same section, page, or test type (avoiding the need to re-input those choices).
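A typical session starts with something like the following; any details you leave out are prompted for interactively:

```shell
# Report results for the Base page of Fedora 25 Final 1.1; the tool
# then walks you through section, test, result type, bugs and comment
relval report-results --release 25 --milestone Final --compose 1.1 --testtype Base
```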

Please do keep an eye on the actual result wiki pages and make sure the tool edited them correctly.

### `size-check`

size-check checks the size of the image files for a given compose, and reports the results to the wiki.

You may pass `--release`, `--milestone`, and `--compose` to specify the compose to operate on. If you pass none of them, relval will check the 'current' compose. If you pass only some, wikitcms will try to guess which compose you meant, and the command will fail if it cannot.

You may also pass `--bugzilla`, which will report bugs to Bugzilla for oversize images. If `--test` is also passed, the bugs will be reported to partner-bugzilla.redhat.com (which is effectively a sandbox instance); otherwise they will be reported to bugzilla.redhat.com, so please do not do this unless you're really sure it's necessary. This uses [python-bugzilla][9]: please see its documentation for information on authentication. If you do not provide some form of authentication information in a python-bugzilla configuration file and no valid tokens are stored locally from a recent successful login, you will be prompted to enter a username and password interactively.
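For example:

```shell
# Check the current compose and write the results to the staging wiki
relval size-check --test

# Also file bugs for oversize images; with --test they go to
# partner-bugzilla.redhat.com rather than the production Bugzilla
relval size-check --release 25 --milestone Final --compose 1.1 --test --bugzilla
```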

Note that there is now automation in place to run `size-check` automatically when validation events are created, so it is unusual for it to be necessary to run it manually any more.

## Credits

The `user-stats` and `testcase-stats` sub-commands are re-implementations of work originally done by [Kamil Paral][10] and [Josef Skladanka][11], and incorporate sections of the original implementations, which can be found in the history of the [qa-stats git repository][12].

## License

relval is released under the [GPL][13], version 3 or later.

[1]: https://pagure.io/fedora-qa/python-wikitcms
[2]: https://fedoraproject.org/wiki/QA:Release_validation_test_plan
[3]: https://fedoraproject.org/wiki/QA/SOP_Release_Validation_Test_Event
[4]: https://roshi.fedorapeople.org/heroes-of-fedora-hof-f20-final.html
[5]: https://www.happyassassin.net/testcase_stats/
[6]: https://pagure.io/fedora-qa/testdays
[7]: https://pagure.io/fedora-qa/relval
[8]: https://www.happyassassin.net/relval/releases/
[9]: https://github.com/python-bugzilla/python-bugzilla
[10]: http://kparal.wordpress.com/
[11]: http://jskladan.wordpress.com/
[12]: https://pagure.io/fedora-qa/qa-stats
[13]: https://www.gnu.org/licenses/gpl.txt


