diff --git a/.github/workflows/branch.yml b/.github/workflows/branch.yml index fabf219aea..9463a5130f 100644 --- a/.github/workflows/branch.yml +++ b/.github/workflows/branch.yml @@ -24,7 +24,7 @@ jobs: message: | Hi @${{ github.event.pull_request.user.login }}, - It looks like this pull-request has been made against the ${{github.event.pull_request.head.repo.full_name}} `master` branch. The `master` branch on nf-core repositories should always contain code from the latest release. Beacuse of this, PRs to `master` are only allowed if they come from the ${{github.event.pull_request.head.repo.full_name}} `dev` branch. + It looks like this pull-request has been made against the ${{github.event.pull_request.head.repo.full_name}} `master` branch. The `master` branch on nf-core repositories should always contain code from the latest release. Because of this, PRs to `master` are only allowed if they come from the ${{github.event.pull_request.head.repo.full_name}} `dev` branch. You do not need to close this PR, you can change the target branch to `dev` by clicking the _"Edit"_ button at the top of this page. 
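The `branch.yml` workflow above only comments on offending PRs; the underlying branch rule (which the lint tests later verify as a bash condition) can be sketched in Python. `master_pr_allowed` and its arguments are hypothetical names for illustration, not part of the codebase:

```python
def master_pr_allowed(head_repo: str, pipeline_repo: str, head_ref: str) -> bool:
    """Sketch of the nf-core branch-protection rule: PRs to master are
    allowed only from the pipeline's own dev branch, or from a patch branch."""
    return (head_repo == pipeline_repo and head_ref == "dev") or head_ref == "patch"

# A fork's feature branch may not target master:
assert not master_pr_allowed("someuser/rnaseq", "nf-core/rnaseq", "feature-x")
# The pipeline's own dev branch may:
assert master_pr_allowed("nf-core/rnaseq", "nf-core/rnaseq", "dev")
# Release hotfixes come in via a patch branch, even from forks:
assert master_pr_allowed("someuser/rnaseq", "nf-core/rnaseq", "patch")
```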
diff --git a/.github/workflows/sync.yml b/.github/workflows/sync.yml index 764649e5cc..9e67087def 100644 --- a/.github/workflows/sync.yml +++ b/.github/workflows/sync.yml @@ -2,6 +2,7 @@ name: Sync template on: release: types: [published] + workflow_dispatch: jobs: get-pipelines: @@ -21,7 +22,6 @@ jobs: matrix: ${{fromJson(needs.get-pipelines.outputs.matrix)}} fail-fast: false steps: - - uses: actions/checkout@v2 name: Check out nf-core/tools @@ -32,6 +32,7 @@ jobs: ref: dev token: ${{ secrets.nf_core_bot_auth_token }} path: nf-core/${{ matrix.pipeline }} + fetch-depth: "0" - name: Set up Python 3.8 uses: actions/setup-python@v1 @@ -63,7 +64,6 @@ jobs: --username nf-core-bot \ --repository nf-core/${{ matrix.pipeline }} - - name: Upload sync log file artifact if: ${{ always() }} uses: actions/upload-artifact@v2 diff --git a/.gitignore b/.gitignore index bc2d21a3d5..84ddfd3a08 100644 --- a/.gitignore +++ b/.gitignore @@ -109,3 +109,6 @@ ENV/ # backup files *~ *\? + +# JetBrains IDEs +.idea diff --git a/CHANGELOG.md b/CHANGELOG.md index d959b2d246..d91c1f5cb8 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,26 @@ # nf-core/tools: Changelog +## [v1.11 - Iron Tiger](https://github.com/nf-core/tools/releases/tag/1.11) - [2020-10-27] + +### Template + +* Fixed a command error in the `awstest.yml` GitHub Actions workflow. +* Allowed manual triggering of the AWS test GitHub Actions workflows. +* Removed a TODO item that proposed using additional files besides `usage.md` and `output.md` for documentation. +* Added a Podman profile, which enables Podman as the container engine. +* Updated linting for the GitHub Actions AWS test workflows.
+ +### Linting + +* Made a missing base-level `Dockerfile` a warning instead of a failure +* Added a lint failure if the old `bin/markdown_to_html.r` script is found +* Updated the `rich` package dependency and used its new markup escaping to change `[[!]]` back to `[!]` again + +### Other + +* Pipeline sync - fetch the full repo when checking out before syncing +* Sync - added a GitHub Actions manual trigger option + ## [v1.10.2 - Copper Camel _(brought back from the dead)_](https://github.com/nf-core/tools/releases/tag/1.10.2) - [2020-07-31] Second patch release to address some small errors discovered in the pipeline template. diff --git a/docs/lint_errors.md b/docs/lint_errors.md index 39794dc29c..d42e85acfb 100644 --- a/docs/lint_errors.md +++ b/docs/lint_errors.md @@ -10,8 +10,8 @@ The lint test looks for the following required files: * `nextflow.config` * The main nextflow config file -* `Dockerfile` - * A docker build script to generate a docker image with the required software +* `nextflow_schema.json` + * A JSON schema describing pipeline parameters, generated using `nf-core schema build` * Continuous integration tests with [GitHub Actions](https://github.com/features/actions) * GitHub Actions workflows for CI of your pipeline (`.github/workflows/ci.yml`), branch protection (`.github/workflows/branch.yml`) and nf-core best practice linting (`.github/workflows/linting.yml`) * `LICENSE`, `LICENSE.md`, `LICENCE` or `LICENCE.md` @@ -27,8 +27,14 @@ The following files are suggested but not a hard requirement.
If they are missing * `main.nf` * It's recommended that the main workflow script is called `main.nf` +* `environment.yml` + * A conda environment file describing the required software +* `Dockerfile` + * A docker build script to generate a docker image with the required software * `conf/base.config` * A `conf` directory with at least one config called `base.config` +* `.github/workflows/awstest.yml` and `.github/workflows/awsfulltest.yml` + * GitHub workflow scripts used for automated tests on AWS The following files will cause a failure if they _are_ present (to fix, delete them): @@ -39,13 +45,18 @@ The following files will cause a failure if the _are_ present (to fix, delete th * `parameters.settings.json` * The syntax for pipeline schema has changed - old `parameters.settings.json` should be deleted and new `nextflow_schema.json` files created instead. +* `bin/markdown_to_html.r` + * The old markdown to HTML conversion script, now replaced by `markdown_to_html.py` ## Error #2 - Docker file check failed ## {#2} -Pipelines should have a files called `Dockerfile` in their root directory. +DSL1 pipelines should have a file called `Dockerfile` in their root directory. The file is used for automated docker image builds. This test checks that the file exists and contains at least the string `FROM` (`Dockerfile`). +Some pipelines, especially DSL2, may not have a `Dockerfile`. In this case a warning +will be generated, which can be safely ignored. + ## Error #3 - Licence check failed ## {#3} nf-core pipelines must ship with an open source [MIT licence](https://choosealicense.com/licenses/mit/). @@ -216,22 +227,25 @@ This test will fail if the following requirements are not met in these files: { [[ ${{github.event.pull_request.head.repo.full_name}} == / ]] && [[ $GITHUB_HEAD_REF = "dev" ]]; } || [[ $GITHUB_HEAD_REF == "patch" ]] ``` -4. `awstest.yml`: Triggers tests on AWS batch.
As running tests on AWS incurs costs, they should be only triggered on `push` to `master` and `release`. - * Must be turned on for `push` to `master` and `release`. - * Must not be turned on for `pull_request` or other events. +4. `awstest.yml`: Triggers tests on AWS batch. As running tests on AWS incurs costs, they should only be triggered on `workflow_dispatch`. +This allows for manual triggering of the workflow when testing on AWS is desired. +You can trigger the tests by going to the `Actions` tab on the pipeline GitHub repository and selecting the `nf-core AWS test` workflow on the left. + * Must not be turned on for `push` or `pull_request`. + * Must be turned on for `workflow_dispatch`. ### GitHub Actions AWS full tests Additionally, we provide the possibility of testing the pipeline on full size datasets on AWS. This should ensure that the pipeline runs as expected on AWS and provide a resource estimation. -The GitHub Actions workflow is: `awsfulltest.yml`, and it can be found in the `.github/workflows/` directory. -This workflow incurrs higher AWS costs, therefore it should only be triggered on `release`. -For tests on full data prior to release, [https://tower.nf](Nextflow Tower's launch feature) can be employed. +The GitHub Actions workflow is `awsfulltest.yml`, and it can be found in the `.github/workflows/` directory. +This workflow incurs higher AWS costs, so it should only be triggered on `release` and `workflow_dispatch`. +You can trigger the tests by going to the `Actions` tab on the pipeline GitHub repository and selecting the `nf-core AWS full size tests` workflow on the left. +For tests on full data prior to release, the [Nextflow Tower](https://tower.nf) launch feature can be employed. `awsfulltest.yml`: Triggers full sized tests run on AWS batch after releasing. -* Must be only turned on for `release`. -* Should run the profile `test_full`. If it runs the profile `test` a warning is given.
+* Must only be turned on for `release` and `workflow_dispatch`. +* Should run the profile `test_full`, which should be edited to provide links to full-size datasets. If it runs the profile `test` a warning is given. ## Error #6 - Repository `README.md` tests ## {#6} @@ -298,7 +312,7 @@ If a workflow has a conda `environment.yml` file (see above), the `Dockerfile` s to create the container. Such `Dockerfile`s can usually be very short, eg: ```Dockerfile -FROM nfcore/base:1.7 +FROM nfcore/base:1.11 LABEL authors="your@email.com" \ description="Docker image containing all requirements for the nf-core mypipeline pipeline" diff --git a/nf_core/__main__.py b/nf_core/__main__.py index be9bdfaac9..6eebbe8815 100755 --- a/nf_core/__main__.py +++ b/nf_core/__main__.py @@ -35,11 +35,11 @@ def run_nf_core(): # Print nf-core header to STDERR stderr = rich.console.Console(file=sys.stderr) - stderr.print("\n[green]{},--.[grey39]/[green],-.".format(" " * 42)) - stderr.print("[blue] ___ __ __ __ ___ [green]/,-._.--~\\") - stderr.print("[blue] |\ | |__ __ / ` / \ |__) |__ [yellow] } {") - stderr.print("[blue] | \| | \__, \__/ | \ |___ [green]\`-._,-`-,") - stderr.print("[green] `._,._,'\n") + stderr.print("\n[green]{},--.[grey39]/[green],-.".format(" " * 42), highlight=False) + stderr.print("[blue] ___ __ __ __ ___ [green]/,-._.--~\\", highlight=False) + stderr.print("[blue] |\ | |__ __ / ` / \ |__) |__ [yellow] } {", highlight=False) + stderr.print("[blue] | \| | \__, \__/ | \ |___ [green]\`-._,-`-,", highlight=False) + stderr.print("[green] `._,._,'\n", highlight=False) stderr.print("[grey39] nf-core/tools version {}".format(nf_core.__version__), highlight=False) try: is_outdated, current_vers, remote_vers = nf_core.utils.check_if_outdated() @@ -503,7 +503,7 @@ def lint(schema_path): try: schema_obj.validate_schema_title_description() except AssertionError as e: - log.warn(e) + log.warning(e) except AssertionError as e: sys.exit(1) diff --git
a/nf_core/create.py b/nf_core/create.py index 2fdeeecff2..717042b517 100644 --- a/nf_core/create.py +++ b/nf_core/create.py @@ -67,8 +67,7 @@ def init_pipeline(self): ) def run_cookiecutter(self): - """Runs cookiecutter to create a new nf-core pipeline. - """ + """Runs cookiecutter to create a new nf-core pipeline.""" log.info("Creating new nf-core pipeline: {}".format(self.name)) # Check if the output directory exists @@ -113,8 +112,7 @@ def run_cookiecutter(self): shutil.rmtree(self.tmpdir) def make_pipeline_logo(self): - """Fetch a logo for the new pipeline from the nf-core website - """ + """Fetch a logo for the new pipeline from the nf-core website""" logo_url = "https://nf-co.re/logo/{}".format(self.short_name) log.debug("Fetching logo from {}".format(logo_url)) @@ -135,8 +133,7 @@ def make_pipeline_logo(self): fh.write(r.content) def git_init_pipeline(self): - """Initialises the new pipeline as a Git repository and submits first commit. - """ + """Initialises the new pipeline as a Git repository and submits first commit.""" log.info("Initialising pipeline git repository") repo = git.Repo.init(self.outdir) repo.git.add(A=True) diff --git a/nf_core/download.py b/nf_core/download.py index a6b2cfc2d3..db570231e4 100644 --- a/nf_core/download.py +++ b/nf_core/download.py @@ -197,8 +197,7 @@ def fetch_workflow_details(self, wfs): raise LookupError("Not able to find pipeline '{}'".format(self.pipeline)) def download_wf_files(self): - """Downloads workflow files from GitHub to the :attr:`self.outdir`. - """ + """Downloads workflow files from GitHub to the :attr:`self.outdir`.""" log.debug("Downloading {}".format(self.wf_download_url)) # Download GitHub zip file into memory and extract @@ -216,8 +215,7 @@ def download_wf_files(self): os.chmod(os.path.join(dirpath, fname), 0o775) def download_configs(self): - """Downloads the centralised config profiles from nf-core/configs to :attr:`self.outdir`. 
- """ + """Downloads the centralised config profiles from nf-core/configs to :attr:`self.outdir`.""" configs_zip_url = "https://github.com/nf-core/configs/archive/master.zip" configs_local_dir = "configs-master" log.debug("Downloading {}".format(configs_zip_url)) @@ -236,8 +234,7 @@ def download_configs(self): os.chmod(os.path.join(dirpath, fname), 0o775) def wf_use_local_configs(self): - """Edit the downloaded nextflow.config file to use the local config files - """ + """Edit the downloaded nextflow.config file to use the local config files""" nfconfig_fn = os.path.join(self.outdir, "workflow", "nextflow.config") find_str = "https://raw.githubusercontent.com/nf-core/configs/${params.custom_config_version}" repl_str = "../configs/" @@ -294,8 +291,7 @@ def pull_singularity_image(self, container): raise e def compress_download(self): - """Take the downloaded files and make a compressed .tar.gz archive. - """ + """Take the downloaded files and make a compressed .tar.gz archive.""" log.debug("Creating archive: {}".format(self.output_filename)) # .tar.gz and .tar.bz2 files diff --git a/nf_core/licences.py b/nf_core/licences.py index 1637367427..08e3ac8b42 100644 --- a/nf_core/licences.py +++ b/nf_core/licences.py @@ -50,8 +50,7 @@ def run_licences(self): return self.print_licences() def get_environment_file(self): - """Get the conda environment file for the pipeline - """ + """Get the conda environment file for the pipeline""" if os.path.exists(self.pipeline): env_filename = os.path.join(self.pipeline, "environment.yml") if not os.path.exists(self.pipeline): @@ -68,8 +67,7 @@ def get_environment_file(self): self.conda_config = yaml.safe_load(response.text) def fetch_conda_licences(self): - """Fetch package licences from Anaconda and PyPi. 
- """ + """Fetch package licences from Anaconda and PyPi.""" lint_obj = nf_core.lint.PipelineLint(self.pipeline) lint_obj.conda_config = self.conda_config diff --git a/nf_core/lint.py b/nf_core/lint.py index fcc47ae3d6..9b62da3209 100755 --- a/nf_core/lint.py +++ b/nf_core/lint.py @@ -148,6 +148,7 @@ class PipelineLint(object): def __init__(self, path): """ Initialise linting object """ self.release_mode = False + self.version = nf_core.__version__ self.path = path self.git_sha = None self.files = [] @@ -253,7 +254,6 @@ def check_files_exist(self): 'nextflow.config', 'nextflow_schema.json', - 'Dockerfile', ['LICENSE', 'LICENSE.md', 'LICENCE', 'LICENCE.md'], # NB: British / American spelling 'README.md', 'CHANGELOG.md', @@ -268,13 +268,16 @@ def check_files_exist(self): 'main.nf', 'environment.yml', + 'Dockerfile', 'conf/base.config', '.github/workflows/awstest.yml', '.github/workflows/awsfulltest.yml' Files that *must not* be present:: - 'Singularity' + 'Singularity', + 'parameters.settings.json', + 'bin/markdown_to_html.r' Files that *should not* be present:: @@ -289,7 +292,6 @@ def check_files_exist(self): files_fail = [ ["nextflow.config"], ["nextflow_schema.json"], - ["Dockerfile"], ["LICENSE", "LICENSE.md", "LICENCE", "LICENCE.md"], # NB: British / American spelling ["README.md"], ["CHANGELOG.md"], @@ -303,13 +305,14 @@ def check_files_exist(self): files_warn = [ ["main.nf"], ["environment.yml"], + ["Dockerfile"], [os.path.join("conf", "base.config")], [os.path.join(".github", "workflows", "awstest.yml")], [os.path.join(".github", "workflows", "awsfulltest.yml")], ] # List of strings. Dails / warns if any of the strings exist. 
- files_fail_ifexists = ["Singularity", "parameters.settings.json"] + files_fail_ifexists = ["Singularity", "parameters.settings.json", os.path.join("bin", "markdown_to_html.r")] files_warn_ifexists = [".travis.yml"] def pf(file_path): @@ -357,6 +360,9 @@ def pf(file_path): def check_docker(self): """Checks that Dockerfile contains the string ``FROM``.""" + if "Dockerfile" not in self.files: + return + fn = os.path.join(self.path, "Dockerfile") content = "" with open(fn, "r") as fh: @@ -601,7 +607,10 @@ def check_nextflow_config(self): ) else: self.warned.append( - (4, "Config `manifest.version` should end in `dev`: `{}`".format(self.config["manifest.version"]),) + ( + 4, + "Config `manifest.version` should end in `dev`: `{}`".format(self.config["manifest.version"]), + ) ) elif "manifest.version" in self.config: if "dev" in self.config["manifest.version"]: @@ -660,11 +669,19 @@ def check_actions_branch_protection(self): "PIPELINENAME", self.pipeline_name.lower() ) if has_name and has_if and has_run: - self.passed.append((5, "GitHub Actions 'branch' workflow looks good: `{}`".format(fn),)) + self.passed.append( + ( + 5, + "GitHub Actions 'branch' workflow looks good: `{}`".format(fn), + ) + ) break else: self.failed.append( - (5, "Couldn't find GitHub Actions 'branch' check for PRs to master: `{}`".format(fn),) + ( + 5, + "Couldn't find GitHub Actions 'branch' check for PRs to master: `{}`".format(fn), + ) ) def check_actions_ci(self): @@ -683,7 +700,12 @@ def check_actions_ci(self): # NB: YAML dict key 'on' is evaluated to a Python dict key True assert ciwf[True] == expected except (AssertionError, KeyError, TypeError): - self.failed.append((5, "GitHub Actions CI is not triggered on expected events: `{}`".format(fn),)) + self.failed.append( + ( + 5, + "GitHub Actions CI is not triggered on expected events: `{}`".format(fn), + ) + ) else: self.passed.append((5, "GitHub Actions CI is triggered on expected events: `{}`".format(fn))) @@ -699,7 +721,10 @@ def 
check_actions_ci(self): assert any([docker_build_cmd in step["run"] for step in steps if "run" in step.keys()]) except (AssertionError, KeyError, TypeError): self.failed.append( - (5, "CI is not building the correct docker image. Should be: `{}`".format(docker_build_cmd),) + ( + 5, + "CI is not building the correct docker image. Should be: `{}`".format(docker_build_cmd), + ) ) else: self.passed.append((5, "CI is building the correct docker image: `{}`".format(docker_build_cmd))) @@ -790,32 +815,27 @@ def check_actions_awstest(self): with open(fn, "r") as fh: wf = yaml.safe_load(fh) - # Check that the action is only turned on for push + # Check that the action is only turned on for workflow_dispatch try: - assert "push" in wf[True] + assert "workflow_dispatch" in wf[True] + assert "push" not in wf[True] assert "pull_request" not in wf[True] except (AssertionError, KeyError, TypeError): self.failed.append( - (5, "GitHub Actions AWS test should be triggered on push and not PRs: `{}`".format(fn)) - ) - else: - self.passed.append((5, "GitHub Actions AWS test is triggered on push and not PRs: `{}`".format(fn))) - - # Check that the action is only turned on for push to master - try: - assert "master" in wf[True]["push"]["branches"] - assert "dev" not in wf[True]["push"]["branches"] - except (AssertionError, KeyError, TypeError): - self.failed.append( - (5, "GitHub Actions AWS test should be triggered only on push to master: `{}`".format(fn)) + ( + 5, + "GitHub Actions AWS test should be triggered on workflow_dispatch and not on push or PRs: `{}`".format( + fn + ), + ) ) else: - self.passed.append((5, "GitHub Actions AWS test is triggered only on push to master: `{}`".format(fn))) + self.passed.append((5, "GitHub Actions AWS test is triggered on workflow_dispatch: `{}`".format(fn))) def check_actions_awsfulltest(self): """Checks the GitHub Actions awsfulltest is valid. - Makes sure it is triggered only on ``release``. 
+ Makes sure it is triggered only on ``release`` and workflow_dispatch. """ fn = os.path.join(self.path, ".github", "workflows", "awsfulltest.yml") if os.path.isfile(fn): @@ -828,15 +848,26 @@ def check_actions_awsfulltest(self): try: assert "release" in wf[True] assert "published" in wf[True]["release"]["types"] + assert "workflow_dispatch" in wf[True] assert "push" not in wf[True] assert "pull_request" not in wf[True] except (AssertionError, KeyError, TypeError): self.failed.append( - (5, "GitHub Actions AWS full test should be triggered only on published release: `{}`".format(fn)) + ( + 5, + "GitHub Actions AWS full test should be triggered only on published release and workflow_dispatch: `{}`".format( + fn + ), + ) ) else: self.passed.append( - (5, "GitHub Actions AWS full test is triggered only on published release: `{}`".format(fn)) + ( + 5, + "GitHub Actions AWS full test is triggered only on published release and workflow_dispatch: `{}`".format( + fn + ), + ) ) # Warn if `-profile test` is still unchanged @@ -1130,11 +1161,11 @@ def check_conda_dockerfile(self): * dependency versions are pinned * dependency versions are the latest available """ - if "environment.yml" not in self.files or len(self.dockerfile) == 0: + if "environment.yml" not in self.files or "Dockerfile" not in self.files or len(self.dockerfile) == 0: return expected_strings = [ - "FROM nfcore/base:{}".format("dev" if "dev" in nf_core.__version__ else nf_core.__version__), + "FROM nfcore/base:{}".format("dev" if "dev" in self.version else self.version), "COPY environment.yml /", "RUN conda env create --quiet -f /environment.yml && conda clean -a", "RUN conda env export --name {} > {}.yml".format(self.conda_config["name"], self.conda_config["name"]), @@ -1299,7 +1330,8 @@ def _s(some_list): if len(self.passed) > 0 and show_passed: table = Table(style="green", box=rich.box.ROUNDED) table.add_column( - "[[\u2714]] {} Test{} Passed".format(len(self.passed), _s(self.passed)), no_wrap=True, + 
r"\[✔] {} Test{} Passed".format(len(self.passed), _s(self.passed)), + no_wrap=True, ) table = format_result(self.passed, table) console.print(table) @@ -1307,7 +1339,7 @@ def _s(some_list): # Table of warning tests if len(self.warned) > 0: table = Table(style="yellow", box=rich.box.ROUNDED) - table.add_column("[[!]] {} Test Warning{}".format(len(self.warned), _s(self.warned)), no_wrap=True) + table.add_column(r"\[!] {} Test Warning{}".format(len(self.warned), _s(self.warned)), no_wrap=True) table = format_result(self.warned, table) console.print(table) @@ -1315,7 +1347,8 @@ def _s(some_list): if len(self.failed) > 0: table = Table(style="red", box=rich.box.ROUNDED) table.add_column( - "[[\u2717]] {} Test{} Failed".format(len(self.failed), _s(self.failed)), no_wrap=True, + r"\[✗] {} Test{} Failed".format(len(self.failed), _s(self.failed)), + no_wrap=True, ) table = format_result(self.failed, table) console.print(table) @@ -1325,10 +1358,11 @@ def _s(some_list): table = Table(box=rich.box.ROUNDED) table.add_column("[bold green]LINT RESULTS SUMMARY".format(len(self.passed)), no_wrap=True) table.add_row( - "[[\u2714]] {:>3} Test{} Passed".format(len(self.passed), _s(self.passed)), style="green", + r"\[✔] {:>3} Test{} Passed".format(len(self.passed), _s(self.passed)), + style="green", ) - table.add_row("[[!]] {:>3} Test Warning{}".format(len(self.warned), _s(self.warned)), style="yellow") - table.add_row("[[\u2717]] {:>3} Test{} Failed".format(len(self.failed), _s(self.failed)), style="red") + table.add_row(r"\[!] 
{:>3} Test Warning{}".format(len(self.warned), _s(self.warned)), style="yellow") + table.add_row(r"\[✗] {:>3} Test{} Failed".format(len(self.failed), _s(self.failed)), style="red") console.print(table) def get_results_md(self): @@ -1485,7 +1519,7 @@ def github_comment(self): log.info("Posted GitHub comment: {}".format(r_json["html_url"])) log.debug(response_pp) else: - log.warn("Could not post GitHub comment: '{}'\n{}".format(r.status_code, response_pp)) + log.warning("Could not post GitHub comment: '{}'\n{}".format(r.status_code, response_pp)) except Exception as e: log.warning("Could not post GitHub comment: {}\n{}".format(os.environ["GITHUB_COMMENTS_URL"], e)) diff --git a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/CONTRIBUTING.md b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/CONTRIBUTING.md index 3836aa7637..8ab3b9bd2e 100644 --- a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/CONTRIBUTING.md +++ b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/CONTRIBUTING.md @@ -54,4 +54,4 @@ These tests are run both with the latest available version of `Nextflow` and als ## Getting help -For further information/help, please consult the [{{ cookiecutter.name }} documentation](https://nf-co.re/{{ cookiecutter.short_name }}/docs) and don't hesitate to get in touch on the nf-core Slack [#{{ cookiecutter.short_name }}](https://nfcore.slack.com/channels/{{ cookiecutter.short_name }}) channel ([join our Slack here](https://nf-co.re/join/slack)). +For further information/help, please consult the [{{ cookiecutter.name }} documentation](https://nf-co.re/{{ cookiecutter.short_name }}/usage) and don't hesitate to get in touch on the nf-core Slack [#{{ cookiecutter.short_name }}](https://nfcore.slack.com/channels/{{ cookiecutter.short_name }}) channel ([join our Slack here](https://nf-co.re/join/slack)). 
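In the `lint.py` changes above, `files_fail` and `files_warn` nest alternative filenames in sub-lists (e.g. the four licence spellings), so a check passes if any one alternative exists. A minimal sketch of that pattern, with illustrative names (not the actual `check_files_exist` implementation):

```python
import os
import tempfile

def check_files_exist(root, required):
    """Each entry in ``required`` is a list of alternative paths;
    the check passes if any one of them exists."""
    failed = []
    for alternatives in required:
        if not any(os.path.exists(os.path.join(root, f)) for f in alternatives):
            failed.append(alternatives)
    return failed

root = tempfile.mkdtemp()
open(os.path.join(root, "LICENCE"), "w").close()  # British spelling satisfies the check
required = [["nextflow.config"], ["LICENSE", "LICENSE.md", "LICENCE", "LICENCE.md"]]
missing = check_files_exist(root, required)
assert missing == [["nextflow.config"]]  # only the config is reported missing
```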
diff --git a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/ISSUE_TEMPLATE/bug_report.md b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/ISSUE_TEMPLATE/bug_report.md index 3b98a62ca1..30a5ef8b47 100644 --- a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/ISSUE_TEMPLATE/bug_report.md +++ b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/ISSUE_TEMPLATE/bug_report.md @@ -36,7 +36,7 @@ Steps to reproduce the behaviour: ## Container engine -- Engine: +- Engine: - version: - Image tag: diff --git a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/workflows/awsfulltest.yml b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/workflows/awsfulltest.yml index 07b164e6bf..3054b61dbf 100644 --- a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/workflows/awsfulltest.yml +++ b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/workflows/awsfulltest.yml @@ -1,10 +1,12 @@ name: nf-core AWS full size tests -# This workflow is triggered on push to the master branch. +# This workflow is triggered on published releases. +# It can be additionally triggered manually with GitHub actions workflow dispatch. # It runs the -profile 'test_full' on AWS batch on: release: types: [published] + workflow_dispatch: jobs: run-awstest: diff --git a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/workflows/awstest.yml b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/workflows/awstest.yml index 6a2759edbe..d261c7878b 100644 --- a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/workflows/awstest.yml +++ b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/.github/workflows/awstest.yml @@ -1,11 +1,10 @@ name: nf-core AWS test # This workflow is triggered on push to the master branch. -# It runs the -profile 'test' on AWS batch +# It can be additionally triggered manually with GitHub actions workflow dispatch. 
+# It runs the -profile 'test' on AWS batch. on: - push: - branches: - - master + workflow_dispatch: jobs: run-awstest: @@ -37,4 +36,4 @@ jobs: --job-name nf-core-{{ cookiecutter.short_name }} \ --job-queue $AWS_JOB_QUEUE \ --job-definition $AWS_JOB_DEFINITION \ - --container-overrides '{"command": ["{{ cookiecutter.name }}", "-r '"${GITHUB_SHA}"' -profile test --outdir s3://'"${AWS_S3_BUCKET}"'/{{ cookiecutter.short_name }}/results-'"${GITHUB_SHA}"' -w s3://'"${AWS_S3_BUCKET}"'/{{ cookiecutter.short_name }}/work-'"${GITHUB_SHA}"' -with-tower"], "environment": [{"name": "TOWER_ACCESS_TOKEN", "value": "'"$TOWER_ACCESS_TOKEN"'"}]}'{{ cookiecutter.short_name }}/work-'"${GITHUB_SHA}"' -with-tower"], "environment": [{"name": "TOWER_ACCESS_TOKEN", "value": "'"$TOWER_ACCESS_TOKEN"'"}]}' + --container-overrides '{"command": ["{{ cookiecutter.name }}", "-r '"${GITHUB_SHA}"' -profile test --outdir s3://'"${AWS_S3_BUCKET}"'/{{ cookiecutter.short_name }}/results-'"${GITHUB_SHA}"' -w s3://'"${AWS_S3_BUCKET}"'/{{ cookiecutter.short_name }}/work-'"${GITHUB_SHA}"' -with-tower"], "environment": [{"name": "TOWER_ACCESS_TOKEN", "value": "'"$TOWER_ACCESS_TOKEN"'"}]}' diff --git a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/README.md b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/README.md index 169587892b..28a7317990 100644 --- a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/README.md +++ b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/README.md @@ -18,12 +18,12 @@ The pipeline is built using [Nextflow](https://www.nextflow.io), a workflow tool 1. Install [`nextflow`](https://nf-co.re/usage/installation) -2. Install either [`Docker`](https://docs.docker.com/engine/installation/) or [`Singularity`](https://www.sylabs.io/guides/3.0/user-guide/) for full pipeline reproducibility _(please only use [`Conda`](https://conda.io/miniconda.html) as a last resort; see [docs](https://nf-co.re/usage/configuration#basic-configuration-profiles))_ +2. 
Install any of [`Docker`](https://docs.docker.com/engine/installation/), [`Singularity`](https://www.sylabs.io/guides/3.0/user-guide/) or [`Podman`](https://podman.io/) for full pipeline reproducibility _(please only use [`Conda`](https://conda.io/miniconda.html) as a last resort; see [docs](https://nf-co.re/usage/configuration#basic-configuration-profiles))_ 3. Download the pipeline and test it on a minimal dataset with a single command: ```bash - nextflow run {{ cookiecutter.name }} -profile test, + nextflow run {{ cookiecutter.name }} -profile test, ``` > Please check [nf-core/configs](https://github.com/nf-core/configs#documentation) to see if a custom config file to run nf-core pipelines already exists for your Institute. If so, you can simply use `-profile ` in your command. This will enable either `docker` or `singularity` and set the appropriate execution settings for your local compute environment. @@ -33,14 +33,14 @@ The pipeline is built using [Nextflow](https://www.nextflow.io), a workflow tool ```bash - nextflow run {{ cookiecutter.name }} -profile --input '*_R{1,2}.fastq.gz' --genome GRCh37 + nextflow run {{ cookiecutter.name }} -profile --input '*_R{1,2}.fastq.gz' --genome GRCh37 ``` -See [usage docs](docs/usage.md) for all of the available options when running the pipeline. +See [usage docs](https://nf-co.re/{{ cookiecutter.short_name }}/usage) for all of the available options when running the pipeline. ## Documentation -The {{ cookiecutter.name }} pipeline comes with documentation about the pipeline which you can read at [https://nf-core/{{ cookiecutter.short_name }}/docs](https://nf-core/{{ cookiecutter.short_name }}/docs) or find in the [`docs/` directory](docs). +The {{ cookiecutter.name }} pipeline comes with documentation about the pipeline: [usage](https://nf-co.re/{{ cookiecutter.short_name }}/usage) and [output](https://nf-co.re/{{ cookiecutter.short_name }}/output). 
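The template README above tells users to install any of Docker, Singularity or Podman, with Conda as a last resort. A hypothetical helper (not part of the template) sketching how that preference order could be probed on a local machine:

```python
import shutil

def detect_engine():
    """Return the first available container engine, mirroring the order
    suggested in the README; falls back to conda as a last resort."""
    for engine in ("docker", "singularity", "podman"):
        if shutil.which(engine):  # check the executable is on PATH
            return engine
    return "conda"

profile = detect_engine()
assert profile in ("docker", "singularity", "podman", "conda")
```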
diff --git a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/README.md b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/README.md index ef2bb5200a..191f199dc5 100644 --- a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/README.md +++ b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/README.md @@ -2,8 +2,6 @@ The {{ cookiecutter.name }} documentation is split into the following pages: - - * [Usage](usage.md) * An overview of how the pipeline works, how to run it and a description of all of the different command-line flags. * [Output](output.md) diff --git a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/output.md b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/output.md index 4722747c35..966fefb2a7 100644 --- a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/output.md +++ b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/output.md @@ -1,5 +1,11 @@ # {{ cookiecutter.name }}: Output +## :warning: Please read this documentation on the nf-core website: [https://nf-co.re/{{ cookiecutter.short_name }}/output](https://nf-co.re/{{ cookiecutter.short_name }}/output) + +> _Documentation of pipeline parameters is generated automatically from the pipeline schema and can no longer be found in markdown files._ + +## Introduction + This document describes the output produced by the pipeline. Most of the plots are taken from the MultiQC report, which summarises results at the end of the pipeline. The directories listed below will be created in the results directory after the pipeline has finished. All paths are relative to the top-level results directory. @@ -40,7 +46,7 @@ For more information about how to use MultiQC reports, see [https://multiqc.info **Output files:** -* `multiqc/` +* `multiqc/` * `multiqc_report.html`: a standalone HTML file that can be viewed in your web browser. 
* `multiqc_data/`: directory containing parsed statistics from the different tools used in the pipeline. * `multiqc_plots/`: directory containing static images from the report in various formats. diff --git a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/usage.md b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/usage.md index b0111a0afd..737d9ea20a 100644 --- a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/usage.md +++ b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/docs/usage.md @@ -1,5 +1,9 @@ # {{ cookiecutter.name }}: Usage +## :warning: Please read this documentation on the nf-core website: [https://nf-co.re/{{ cookiecutter.short_name }}/usage](https://nf-co.re/{{ cookiecutter.short_name }}/usage) + +> _Documentation of pipeline parameters is generated automatically from the pipeline schema and can no longer be found in markdown files._ + ## Introduction @@ -47,7 +51,7 @@ This version number will be logged in reports when you run the pipeline, so that Use this parameter to choose a configuration profile. Profiles can give configuration presets for different compute environments. -Several generic profiles are bundled with the pipeline which instruct the pipeline to use software packaged using different methods (Docker, Singularity, Conda) - see below. +Several generic profiles are bundled with the pipeline which instruct the pipeline to use software packaged using different methods (Docker, Singularity, Podman, Conda) - see below. > We highly recommend the use of Docker or Singularity containers for full pipeline reproducibility, however when this is not possible, Conda is also supported. 
@@ -64,8 +68,11 @@ If `-profile` is not specified, the pipeline will run locally and expect all sof * `singularity` * A generic configuration profile to be used with [Singularity](https://sylabs.io/docs/) * Pulls software from Docker Hub: [`{{ cookiecutter.name_docker }}`](https://hub.docker.com/r/{{ cookiecutter.name_docker }}/) +* `podman` + * A generic configuration profile to be used with [Podman](https://podman.io/) + * Pulls software from Docker Hub: [`{{ cookiecutter.name_docker }}`](https://hub.docker.com/r/{{ cookiecutter.name_docker }}/) * `conda` - * Please only use Conda as a last resort i.e. when it's not possible to run the pipeline with Docker or Singularity. + * Please only use Conda as a last resort i.e. when it's not possible to run the pipeline with Docker, Singularity or Podman. * A generic configuration profile to be used with [Conda](https://conda.io/docs/) * Pulls most software from [Bioconda](https://bioconda.github.io/) * `test` @@ -98,7 +105,7 @@ process { See the main [Nextflow documentation](https://www.nextflow.io/docs/latest/config.html) for more information. -If you are likely to be running `nf-core` pipelines regularly it may be a good idea to request that your custom config file is uploaded to the `nf-core/configs` git repository. Before you do this please can you test that the config file works with your pipeline of choice using the `-c` parameter (see definition below). You can then create a pull request to the `nf-core/configs` repository with the addition of your config file, associated documentation file (see examples in [`nf-core/configs/docs`](https://github.com/nf-core/configs/tree/master/docs)), and amending [`nfcore_custom.config`](https://github.com/nf-core/configs/blob/master/nfcore_custom.config) to include your custom profile. +If you are likely to be running `nf-core` pipelines regularly it may be a good idea to request that your custom config file is uploaded to the `nf-core/configs` git repository. 
Before you do this please can you test that the config file works with your pipeline of choice using the `-c` parameter (see definition above). You can then create a pull request to the `nf-core/configs` repository with the addition of your config file, associated documentation file (see examples in [`nf-core/configs/docs`](https://github.com/nf-core/configs/tree/master/docs)), and amending [`nfcore_custom.config`](https://github.com/nf-core/configs/blob/master/nfcore_custom.config) to include your custom profile. If you have any questions or issues please send us a message on [Slack](https://nf-co.re/join/slack) on the [`#configs` channel](https://nfcore.slack.com/channels/configs). diff --git a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/nextflow.config b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/nextflow.config index 30f5260f67..122a916e24 100644 --- a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/nextflow.config +++ b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/nextflow.config @@ -71,6 +71,9 @@ profiles { singularity.enabled = true singularity.autoMounts = true } + podman { + podman.enabled = true + } test { includeConfig 'conf/test.config' } } diff --git a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/nextflow_schema.json b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/nextflow_schema.json index b12212e80b..0a6e83a49e 100644 --- a/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/nextflow_schema.json +++ b/nf_core/pipeline-template/{{cookiecutter.name_noslash}}/nextflow_schema.json @@ -1,5 +1,5 @@ { - "$schema": "https://json-schema.org/draft-07/schema", + "$schema": "http://json-schema.org/draft-07/schema", "$id": "https://raw.githubusercontent.com/{{ cookiecutter.name }}/master/nextflow_schema.json", "title": "{{ cookiecutter.name }} pipeline parameters", "description": "{{ cookiecutter.description }}", @@ -101,7 +101,7 @@ "link", "copy", "copyNoFollow", - "mov" + "move" ] }, "name": { 
diff --git a/nf_core/schema.py b/nf_core/schema.py index 177de0a0f9..0df35e3ea3 100644 --- a/nf_core/schema.py +++ b/nf_core/schema.py @@ -23,8 +23,8 @@ class PipelineSchema(object): - """ Class to generate a schema object with - functions to handle pipeline JSON Schema """ + """Class to generate a schema object with + functions to handle pipeline JSON Schema""" def __init__(self): """ Initialise the object """ @@ -79,13 +79,13 @@ def load_lint_schema(self): self.load_schema() num_params = self.validate_schema() self.get_schema_defaults() - log.info("[green][[✓]] Pipeline schema looks valid[/] [dim](found {} params)".format(num_params)) + log.info("[green]\[✓] Pipeline schema looks valid[/] [dim](found {} params)".format(num_params)) except json.decoder.JSONDecodeError as e: error_msg = "[bold red]Could not parse schema JSON:[/] {}".format(e) log.error(error_msg) raise AssertionError(error_msg) except AssertionError as e: - error_msg = "[red][[✗]] Pipeline schema does not follow nf-core specs:\n {}".format(e) + error_msg = "[red]\[✗] Pipeline schema does not follow nf-core specs:\n {}".format(e) log.error(error_msg) raise AssertionError(error_msg) @@ -127,7 +127,7 @@ def save_schema(self): json.dump(self.schema, fh, indent=4) def load_input_params(self, params_path): - """ Load a given a path to a parameters file (JSON/YAML) + """Load a given a path to a parameters file (JSON/YAML) These should be input parameters used to run a pipeline with the Nextflow -params-file option. 
@@ -159,12 +159,12 @@ def validate_params(self): assert self.schema is not None jsonschema.validate(self.input_params, self.schema) except AssertionError: - log.error("[red][[✗]] Pipeline schema not found") + log.error("[red]\[✗] Pipeline schema not found") return False except jsonschema.exceptions.ValidationError as e: - log.error("[red][[✗]] Input parameters are invalid: {}".format(e.message)) + log.error("[red]\[✗] Input parameters are invalid: {}".format(e.message)) return False - log.info("[green][[✓]] Input parameters look valid") + log.info("[green]\[✓] Input parameters look valid") return True def validate_schema(self, schema=None): @@ -225,7 +225,7 @@ def validate_schema_title_description(self, schema=None): return None assert "$schema" in self.schema, "Schema missing top-level `$schema` attribute" - schema_attr = "https://json-schema.org/draft-07/schema" + schema_attr = "http://json-schema.org/draft-07/schema" assert self.schema["$schema"] == schema_attr, "Schema `$schema` should be `{}`\n Found `{}`".format( schema_attr, self.schema["$schema"] ) diff --git a/nf_core/sync.py b/nf_core/sync.py index 0f48eb5161..959648ae03 100644 --- a/nf_core/sync.py +++ b/nf_core/sync.py @@ -22,15 +22,13 @@ class SyncException(Exception): - """Exception raised when there was an error with TEMPLATE branch synchronisation - """ + """Exception raised when there was an error with TEMPLATE branch synchronisation""" pass class PullRequestException(Exception): - """Exception raised when there was an error creating a Pull-Request on GitHub.com - """ + """Exception raised when there was an error creating a Pull-Request on GitHub.com""" pass @@ -57,7 +55,12 @@ class PipelineSync(object): """ def __init__( - self, pipeline_dir, from_branch=None, make_pr=False, gh_repo=None, gh_username=None, + self, + pipeline_dir, + from_branch=None, + make_pr=False, + gh_repo=None, + gh_username=None, ): """ Initialise syncing object """ @@ -73,8 +76,7 @@ def __init__( self.gh_repo = gh_repo def 
sync(self): - """ Find workflow attributes, create a new template pipeline on TEMPLATE - """ + """Find workflow attributes, create a new template pipeline on TEMPLATE""" log.info("Pipeline directory: {}".format(self.pipeline_dir)) if self.from_branch: @@ -211,8 +213,7 @@ def make_template_pipeline(self): ).init_pipeline() def commit_template_changes(self): - """If we have any changes with the new template files, make a git commit - """ + """If we have any changes with the new template files, make a git commit""" # Check that we have something to commit if not self.repo.is_dirty(untracked_files=True): log.info("Template contains no changes - no new commit created") @@ -294,7 +295,8 @@ def update_existing_pull_request(self, pr_title, pr_body_text): self.gh_repo, self.from_branch ) r = requests.get( - url=list_prs_url, auth=requests.auth.HTTPBasicAuth(self.gh_username, os.environ.get("GITHUB_AUTH_TOKEN")), + url=list_prs_url, + auth=requests.auth.HTTPBasicAuth(self.gh_username, os.environ.get("GITHUB_AUTH_TOKEN")), ) try: r_json = json.loads(r.content) @@ -335,12 +337,12 @@ def update_existing_pull_request(self, pr_title, pr_body_text): return True # Something went wrong else: - log.warn("Could not update PR ('{}'):\n{}\n{}".format(r.status_code, pr_update_api_url, r_pp)) + log.warning("Could not update PR ('{}'):\n{}\n{}".format(r.status_code, pr_update_api_url, r_pp)) return False # Something went wrong else: - log.warn("Could not list open PRs ('{}')\n{}\n{}".format(r.status_code, list_prs_url, r_pp)) + log.warning("Could not list open PRs ('{}')\n{}\n{}".format(r.status_code, list_prs_url, r_pp)) return False def submit_pull_request(self, pr_title, pr_body_text): diff --git a/nf_core/utils.py b/nf_core/utils.py index 87c2a4eba3..c0eb461e33 100644 --- a/nf_core/utils.py +++ b/nf_core/utils.py @@ -142,7 +142,9 @@ def setup_requests_cachedir(): if not os.path.exists(cachedir): os.makedirs(cachedir) requests_cache.install_cache( - os.path.join(cachedir, 
"github_info"), expire_after=datetime.timedelta(hours=1), backend="sqlite", + os.path.join(cachedir, "github_info"), + expire_after=datetime.timedelta(hours=1), + backend="sqlite", ) diff --git a/pyproject.toml b/pyproject.toml index 2d9759a7fb..bd60056e92 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,3 +1,8 @@ [tool.black] line-length = 120 target_version = ['py36','py37','py38'] + +[tool.pytest.ini_options] +markers = [ + "datafiles: load datafiles" +] diff --git a/setup.py b/setup.py index cbb355615e..afe3ae7db6 100644 --- a/setup.py +++ b/setup.py @@ -3,7 +3,7 @@ from setuptools import setup, find_packages import sys -version = "1.10.2" +version = "1.11" with open("README.md") as f: readme = f.read() @@ -40,7 +40,7 @@ "pyyaml", "requests", "requests_cache", - "rich>=4.2.1", + "rich>=9", "tabulate", ], setup_requires=["twine>=1.11.0", "setuptools>=38.6."], diff --git a/tests/lint_examples/failing_example/.github/workflows/awstest.yml b/tests/lint_examples/failing_example/.github/workflows/awstest.yml index 8e72d862db..a4bf436da0 100644 --- a/tests/lint_examples/failing_example/.github/workflows/awstest.yml +++ b/tests/lint_examples/failing_example/.github/workflows/awstest.yml @@ -8,8 +8,6 @@ on: - master - dev pull_request: - release: - types: [published] jobs: run-awstest: diff --git a/tests/lint_examples/minimalworkingexample/.github/workflows/awsfulltest.yml b/tests/lint_examples/minimalworkingexample/.github/workflows/awsfulltest.yml index 99c9ab9165..ef81b9cd54 100644 --- a/tests/lint_examples/minimalworkingexample/.github/workflows/awsfulltest.yml +++ b/tests/lint_examples/minimalworkingexample/.github/workflows/awsfulltest.yml @@ -5,6 +5,7 @@ name: nf-core AWS full size tests on: release: types: [published] + workflow_dispatch: jobs: run-awstest: diff --git a/tests/lint_examples/minimalworkingexample/.github/workflows/awstest.yml b/tests/lint_examples/minimalworkingexample/.github/workflows/awstest.yml index 3d39c4505a..2347f7d019 100644 --- 
a/tests/lint_examples/minimalworkingexample/.github/workflows/awstest.yml +++ b/tests/lint_examples/minimalworkingexample/.github/workflows/awstest.yml @@ -3,9 +3,7 @@ name: nf-core AWS tests # It runs the -profile 'test' on AWS batch on: - push: - branches: - - master + workflow_dispatch: jobs: run-awstest: diff --git a/tests/lint_examples/minimalworkingexample/Dockerfile b/tests/lint_examples/minimalworkingexample/Dockerfile index 7f033b0331..d5c8005c47 100644 --- a/tests/lint_examples/minimalworkingexample/Dockerfile +++ b/tests/lint_examples/minimalworkingexample/Dockerfile @@ -1,6 +1,5 @@ -FROM nfcore/base:dev -MAINTAINER Phil Ewels -LABEL authors="phil.ewels@scilifelab.se" \ +FROM nfcore/base:1.11 +LABEL authors="Phil Ewels phil.ewels@scilifelab.se" \ description="Docker image containing all requirements for the nf-core tools pipeline" COPY environment.yml / diff --git a/tests/lint_examples/minimalworkingexample/nextflow_schema.json b/tests/lint_examples/minimalworkingexample/nextflow_schema.json index bbf2bbe9eb..9340e60113 100644 --- a/tests/lint_examples/minimalworkingexample/nextflow_schema.json +++ b/tests/lint_examples/minimalworkingexample/nextflow_schema.json @@ -1,5 +1,5 @@ { - "$schema": "https://json-schema.org/draft-07/schema", + "$schema": "http://json-schema.org/draft-07/schema", "$id": "https://raw.githubusercontent.com/nf-core/tools/master/nextflow_schema.json", "title": "nf-core/tools pipeline parameters", "description": "Minimal working example pipeline", diff --git a/tests/test_licenses.py b/tests/test_licenses.py index 70e0ea461a..5458b6b1ce 100644 --- a/tests/test_licenses.py +++ b/tests/test_licenses.py @@ -13,7 +13,7 @@ class WorkflowLicensesTest(unittest.TestCase): - """ A class that performs tests on the workflow license + """A class that performs tests on the workflow license retrieval functionality of nf-core tools.""" def setUp(self): diff --git a/tests/test_lint.py b/tests/test_lint.py index 707c730e71..c3243018ea 100644 --- 
a/tests/test_lint.py +++ b/tests/test_lint.py @@ -75,14 +75,17 @@ def test_call_lint_pipeline_pass(self): """Test the main execution function of PipelineLint (pass) This should not result in any exception for the minimal working example""" + old_nfcore_version = nf_core.__version__ + nf_core.__version__ = "1.11" lint_obj = nf_core.lint.run_linting(PATH_WORKING_EXAMPLE, False) + nf_core.__version__ = old_nfcore_version expectations = {"failed": 0, "warned": 5, "passed": MAX_PASS_CHECKS - 1} self.assess_lint_status(lint_obj, **expectations) @pytest.mark.xfail(raises=AssertionError, strict=True) def test_call_lint_pipeline_fail(self): """Test the main execution function of PipelineLint (fail) - This should fail after the first test and halt execution """ + This should fail after the first test and halt execution""" lint_obj = nf_core.lint.run_linting(PATH_FAILING_EXAMPLE, False) expectations = {"failed": 4, "warned": 2, "passed": 7} self.assess_lint_status(lint_obj, **expectations) @@ -90,6 +93,7 @@ def test_call_lint_pipeline_fail(self): def test_call_lint_pipeline_release(self): """Test the main execution function of PipelineLint when running with --release""" lint_obj = nf_core.lint.PipelineLint(PATH_WORKING_EXAMPLE) + lint_obj.version = "1.11" lint_obj.lint_pipeline(release_mode=True) expectations = {"failed": 0, "warned": 4, "passed": MAX_PASS_CHECKS + ADD_PASS_RELEASE} self.assess_lint_status(lint_obj, **expectations) @@ -97,6 +101,7 @@ def test_call_lint_pipeline_release(self): def test_failing_dockerfile_example(self): """Tests for empty Dockerfile""" lint_obj = nf_core.lint.PipelineLint(PATH_FAILING_EXAMPLE) + lint_obj.files = ["Dockerfile"] lint_obj.check_docker() self.assess_lint_status(lint_obj, failed=1) @@ -109,7 +114,7 @@ def test_failing_missingfiles_example(self): """Tests for missing files like Dockerfile or LICENSE""" lint_obj = nf_core.lint.PipelineLint(PATH_FAILING_EXAMPLE) lint_obj.check_files_exist() - expectations = {"failed": 6, "warned": 2, 
"passed": 12} + expectations = {"failed": 6, "warned": 2, "passed": 13} self.assess_lint_status(lint_obj, **expectations) def test_mit_licence_example_pass(self): @@ -210,14 +215,14 @@ def test_actions_wf_awstest_pass(self): """Tests that linting for GitHub Actions AWS test wf works for a good example""" lint_obj = nf_core.lint.PipelineLint(PATH_WORKING_EXAMPLE) lint_obj.check_actions_awstest() - expectations = {"failed": 0, "warned": 0, "passed": 2} + expectations = {"failed": 0, "warned": 0, "passed": 1} self.assess_lint_status(lint_obj, **expectations) def test_actions_wf_awstest_fail(self): """Tests that linting for GitHub Actions AWS test wf fails for a bad example""" lint_obj = nf_core.lint.PipelineLint(PATH_FAILING_EXAMPLE) lint_obj.check_actions_awstest() - expectations = {"failed": 2, "warned": 0, "passed": 0} + expectations = {"failed": 1, "warned": 0, "passed": 0} self.assess_lint_status(lint_obj, **expectations) def test_actions_wf_awsfulltest_pass(self): @@ -277,6 +282,7 @@ def test_readme_fail(self): def test_dockerfile_pass(self): """Tests if a valid Dockerfile passes the lint checks""" lint_obj = nf_core.lint.PipelineLint(PATH_WORKING_EXAMPLE) + lint_obj.files = ["Dockerfile"] lint_obj.check_docker() expectations = {"failed": 0, "warned": 0, "passed": 1} self.assess_lint_status(lint_obj, **expectations) @@ -384,7 +390,8 @@ def test_conda_env_skip(self): def test_conda_dockerfile_pass(self): """ Tests the conda Dockerfile test works with a working example """ lint_obj = nf_core.lint.PipelineLint(PATH_WORKING_EXAMPLE) - lint_obj.files = ["environment.yml"] + lint_obj.version = "1.11" + lint_obj.files = ["environment.yml", "Dockerfile"] with open(os.path.join(PATH_WORKING_EXAMPLE, "Dockerfile"), "r") as fh: lint_obj.dockerfile = fh.read().splitlines() lint_obj.conda_config["name"] = "nf-core-tools-0.4" @@ -395,7 +402,8 @@ def test_conda_dockerfile_pass(self): def test_conda_dockerfile_fail(self): """ Tests the conda Dockerfile test fails with a bad 
example """ lint_obj = nf_core.lint.PipelineLint(PATH_WORKING_EXAMPLE) - lint_obj.files = ["environment.yml"] + lint_obj.version = "1.11" + lint_obj.files = ["environment.yml", "Dockerfile"] lint_obj.conda_config["name"] = "nf-core-tools-0.4" lint_obj.dockerfile = ["fubar"] lint_obj.check_conda_dockerfile() @@ -433,8 +441,8 @@ def test_pip_package_not_latest_warn(self): @mock.patch("requests.get") def test_pypi_timeout_warn(self, mock_get): - """ Tests the PyPi connection and simulates a request timeout, which should - return in an addiional warning in the linting """ + """Tests the PyPi connection and simulates a request timeout, which should + result in an additional warning in the linting""" # Define the behaviour of the request get mock mock_get.side_effect = requests.exceptions.Timeout() # Now do the test @@ -449,8 +457,8 @@ def test_pypi_timeout_warn(self, mock_get): @mock.patch("requests.get") def test_pypi_connection_error_warn(self, mock_get): - """ Tests the PyPi connection and simulates a connection error, which should - result in an additional warning, as we cannot test if dependent module is latest """ + """Tests the PyPi connection and simulates a connection error, which should + result in an additional warning, as we cannot test if dependent module is latest""" # Define the behaviour of the request get mock mock_get.side_effect = requests.exceptions.ConnectionError() # Now do the test @@ -475,7 +483,7 @@ def test_pip_dependency_fail(self): self.assess_lint_status(lint_obj, **expectations) def test_conda_dependency_fails(self): - """ Tests that linting fails, if conda dependency + """Tests that linting fails, if conda dependency package version is not available on Anaconda. 
""" lint_obj = nf_core.lint.PipelineLint(PATH_WORKING_EXAMPLE) @@ -488,7 +496,7 @@ def test_conda_dependency_fails(self): self.assess_lint_status(lint_obj, **expectations) def test_pip_dependency_fails(self): - """ Tests that linting fails, if conda dependency + """Tests that linting fails, if conda dependency package version is not available on Anaconda. """ lint_obj = nf_core.lint.PipelineLint(PATH_WORKING_EXAMPLE) diff --git a/tests/test_list.py b/tests/test_list.py index 1b7920475d..97be0771be 100644 --- a/tests/test_list.py +++ b/tests/test_list.py @@ -57,14 +57,14 @@ def test_pretty_datetime(self): @pytest.mark.xfail(raises=AssertionError, strict=True) def test_local_workflows_and_fail(self): - """ Test the local workflow class and try to get local - Nextflow workflow information """ + """Test the local workflow class and try to get local + Nextflow workflow information""" loc_wf = nf_core.list.LocalWorkflow("myWF") loc_wf.get_local_nf_workflow_details() def test_local_workflows_compare_and_fail_silently(self): - """ Test the workflow class and try to compare local - and remote workflows """ + """Test the workflow class and try to compare local + and remote workflows""" wfs = nf_core.list.Workflows() lwf_ex = nf_core.list.LocalWorkflow("myWF") lwf_ex.full_name = "my Workflow"
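Several hunks above add `workflow_dispatch:` to GitHub Actions workflows (`sync.yml`, `awsfulltest.yml`, and the lint examples' `awstest.yml`) so that they can be started manually from the Actions tab or via the GitHub API, instead of only on a published release. A minimal sketch of that trigger pattern follows; the workflow and job names here are hypothetical and not part of this diff:

```yaml
# Hypothetical example workflow illustrating the trigger style added above:
# the workflow still fires when a release is published, and `workflow_dispatch`
# additionally exposes a manual "Run workflow" button (and an API/CLI trigger).
name: example-manual-trigger

on:
  release:
    types: [published]
  workflow_dispatch:

jobs:
  demo:
    runs-on: ubuntu-latest
    steps:
      - name: Show how the run was started
        run: echo "Triggered by ${{ github.event_name }}"
```

Note that for `awstest.yml` the diff goes one step further: `workflow_dispatch` becomes the *only* trigger, replacing the old push-to-`master` trigger, so the AWS tests run only when requested explicitly.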