
Commit

WIP: PR GHAction
gmfrasca committed Nov 18, 2023
1 parent 0609af1 commit 62be4c1
Showing 2 changed files with 43 additions and 41 deletions.
83 changes: 42 additions & 41 deletions .github/workflows/build-prs.yml
@@ -61,26 +61,26 @@ jobs:
if: needs.fetch-data.outputs.pr_state == 'open'
runs-on: ubuntu-latest
needs: fetch-data
concurrency:
group: ${{ github.workflow }}-build-pr-image-${{ needs.fetch-data.outputs.pr_number }}
cancel-in-progress: true
# concurrency:
# group: ${{ github.workflow }}-build-pr-images-${{ needs.fetch-data.outputs.pr_number }}
# cancel-in-progress: true
env:
SOURCE_BRANCH: ${{ needs.fetch-data.outputs.head_sha }}
TARGET_IMAGE_TAG: pr-${{ needs.fetch-data.outputs.pr_number }}
strategy:
fail-fast: false
matrix:
include:
- image: ds-pipelines-api-server
dockerfile: backend/Dockerfile
- image: ds-pipelines-frontend
dockerfile: frontend/Dockerfile
- image: ds-pipelines-cacheserver
dockerfile: backend/Dockerfile.cacheserver
- image: ds-pipelines-persistenceagent
dockerfile: backend/Dockerfile.persistenceagent
- image: ds-pipelines-scheduledworkflow
dockerfile: backend/Dockerfile.scheduledworkflow
# - image: ds-pipelines-api-server
# dockerfile: backend/Dockerfile
# - image: ds-pipelines-frontend
# dockerfile: frontend/Dockerfile
# - image: ds-pipelines-cacheserver
# dockerfile: backend/Dockerfile.cacheserver
# - image: ds-pipelines-persistenceagent
# dockerfile: backend/Dockerfile.persistenceagent
# - image: ds-pipelines-scheduledworkflow
# dockerfile: backend/Dockerfile.scheduledworkflow
- image: ds-pipelines-viewercontroller
dockerfile: backend/Dockerfile.viewercontroller
- image: ds-pipelines-artifact-manager
@@ -105,12 +105,13 @@ jobs:
runs-on: ubuntu-latest
needs: [fetch-data, build-pr-images]
concurrency:
group: ${{ github.workflow }}-build-pr-image-${{ needs.fetch-data.outputs.pr_number }}
group: ${{ github.workflow }}-comment-on-pr-${{ needs.fetch-data.outputs.pr_number }}
cancel-in-progress: true
env:
SOURCE_BRANCH: ${{ needs.fetch-data.outputs.head_sha }}
TARGET_IMAGE_TAG: pr-${{ needs.fetch-data.outputs.pr_number }}
steps:
- uses: actions/checkout@v3
- name: Echo PR metadata
shell: bash
env:
@@ -150,35 +151,28 @@ jobs:
cat <<"EOF" >> /tmp/body-file.txt
A set of new images have been built to help with testing out this PR:
- ds-pipelines-api-server: `${{ env.FULLIMG_API_SERVER }}`
- ds-pipelines-frontend: `${{ env.FULLIMG_FRONTEND }}`
- ds-pipelines-cacheserver: `${{ env.FULLIMG_CACHESERVER }}`
- ds-pipelines-persistenceagent: `${{ env.FULLIMG_PERSISTENCEAGENT }}`
- ds-pipelines-scheduledworkflow: `${{ env.FULLIMG_SCHEDULEDWORKFLOW }}`
- ds-pipelines-viewercontroller: `${{ env.FULLIMG_VIEWERCONTROLLER }}`
- ds-pipelines-artifact-manager: `${{ env.FULLIMG_ARTIFACT_MANAGER }}`
- ds-pipelines-metadata-writer: `${{ env.FULLIMG_METADATA_WRITER }}`
- ds-pipelines-metadata-envoy: `${{ env.FULLIMG_METADATA_ENVOY }}`
- ds-pipelines-metadata-grpc: `${{ env.FULLIMG_METADATA_GRPC }}`
**API Server**: `${{ env.FULLIMG_API_SERVER }}`
**Persistence Agent**: `${{ env.FULLIMG_PERSISTENCEAGENT }}`
**Scheduled Workflow Manager**: `${{ env.FULLIMG_SCHEDULEDWORKFLOW }}`
**CRD Viewer Controller**: `${{ env.FULLIMG_VIEWERCONTROLLER }}`
**Artifact Manager**: `${{ env.FULLIMG_ARTIFACT_MANAGER }}`
**MLMD Server**: `${{ env.FULLIMG_METADATA_GRPC }}`
**MLMD Writer**: `${{ env.FULLIMG_METADATA_WRITER }}`
**MLMD Envoy Proxy**: `${{ env.FULLIMG_METADATA_ENVOY }}`
**Cache Server**: `${{ env.FULLIMG_CACHESERVER }}`
**UI**: `${{ env.FULLIMG_FRONTEND }}`
EOF
gh pr comment ${{ needs.fetch-data.outputs.pr_number }} --body-file /tmp/body-file.txt
if [[ "$action" == "opened" || "$action" == "reopened" ]]; then
cat <<"EOF" >> /tmp/body-file.txt
cat <<"EOF" >> /tmp/additional-comment.txt
An OCP cluster where you are logged in as cluster admin is required.
The Data Science Pipelines team recommends testing this using the Data Science Pipelines Operator.
Check [here](https://github.com/opendatahub-io/data-science-pipelines-operator) for more information on using the DSPO.
To use and deploy a DSP stack with these images using this Operator, after deploying the DSPO above, run the following:
The Data Science Pipelines team recommends testing this using the Data Science Pipelines Operator. Check [here](https://github.com/opendatahub-io/data-science-pipelines-operator) for more information on using the DSPO.
```bash
cd $(mktemp -d)
git clone git@github.com:opendatahub-io/data-science-pipelines.git
cd data-science-pipelines/
git fetch origin pull/${{ needs.fetch-data.outputs.pr_number }}/head
git checkout -b pullrequest ${{ env.SOURCE_BRANCH }}
cat << "DSPA" >> dspa.pr-${{ needs.fetch-data.outputs.pr_number}}.yaml
To use and deploy a DSP stack with these images (assuming the DSPO is deployed), first save the following YAML to a file named `dspa.pr-${{ needs.fetch-data.outputs.pr_number}}.yaml`:
```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
@@ -209,17 +203,24 @@
minio:
deploy: true
image: 'quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance'
DSPA
```
Then run the following:
```bash
cd $(mktemp -d)
git clone git@github.com:opendatahub-io/data-science-pipelines.git
cd data-science-pipelines/
git fetch origin pull/${{ needs.fetch-data.outputs.pr_number }}/head
git checkout -b pullrequest ${{ env.SOURCE_BRANCH }}
oc apply -f dspa.pr-${{ needs.fetch-data.outputs.pr_number}}.yaml
```
More instructions [here](https://github.com/opendatahub-io/data-science-pipelines-operator#deploy-dsp-instance) on how to deploy and test a Data Science Pipelines Application.
EOF
gh pr comment ${{ needs.fetch-data.outputs.pr_number }} --body-file /tmp/additional-comment.txt
fi
gh pr comment ${{ needs.fetch-data.outputs.pr_number }} --body-file /tmp/body-file.txt
clean-pr-images:
if: needs.fetch-data.outputs.pr_state == 'closed'
runs-on: ubuntu-latest
1 change: 1 addition & 0 deletions foo
@@ -0,0 +1 @@
bar1
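
Outside the diff itself, the comment-on-pr step reduces to a small shell pattern: assemble a Markdown body with a heredoc, then post it with the GitHub CLI. Below is a minimal standalone sketch of that pattern; the PR number, image reference, and temp-file path are placeholders rather than values from this commit, and it assumes `gh` is authenticated (for example via `GH_TOKEN`).

```bash
#!/usr/bin/env bash
set -euo pipefail

PR_NUMBER=123                                                  # placeholder PR number
IMG="quay.io/example/ds-pipelines-api-server:pr-${PR_NUMBER}"  # placeholder image reference

# Build the comment body in a file. An unquoted EOF lets ${IMG} expand here;
# the workflow quotes its "EOF" instead, because its ${{ ... }} expressions are
# substituted by GitHub Actions before the shell ever runs.
cat <<EOF > /tmp/body-file.txt
A set of new images has been built to help with testing out this PR:
**API Server**: \`${IMG}\`
EOF

# --body-file avoids the quoting pitfalls of passing multi-line Markdown inline.
gh pr comment "${PR_NUMBER}" --body-file /tmp/body-file.txt
```

Run this from a clone of the target repository so `gh` can resolve which repo the PR number refers to, or pass `--repo` explicitly.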
