From f9efccf6d8f3b61af58adadc18233d4b3db0e245 Mon Sep 17 00:00:00 2001
From: vonbraunbates
Date: Mon, 6 Jan 2025 15:05:51 +0000
Subject: [PATCH 1/7] Update CONTRIBUTING.md

Misc updates to documentation
---
 CONTRIBUTING.md | 27 +++++++++++++++++++++------
 1 file changed, 21 insertions(+), 6 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index d77314a..981c9d5 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -21,18 +21,33 @@ or
 ## Pre-requisites
 
-Either set up a new virtual environment:
+1. Clone the repo locally and `cd` to this folder.
+
+2. Set up a new virtual environment:
 
 `python3 -m venv `
 
-or activate the existing one:
+We'll call our environment `env` from here onwards.
+
+3. Activate it:
+
 `source ./env/bin/activate`
 
-Install the packages. If `which pip` returns an error message about pip not found, use `pip3` instead of `pip` in the followng line:
-`pip install -r requirements.txt && pip install -r requirements-dev.txt`. Use the `--force-reinstall` flag to replace an existing version if necessary.
-Then install the pre-commit hooks with `pre-commit install`.
+Check that it is activated:
+
+`which python3`
+
+should return `/env/bin/python3` instead of your OS's global python executable (which will be something like `/opt/bin/python3`).
+
+4. Install the packages:
+
+`pip install -r requirements.txt && pip install -r requirements-dev.txt`
+
+Use the `--force-reinstall` flag to replace an existing version if necessary. This may take several minutes depending upon your internet connection.
+
+5. Then install the pre-commit hooks with `pre-commit install`.
 
 ## Access the Pulumi stack
 
-1. Activate the `restricted-admin-data` AWS role (see [instructions](https://dsdmoj.atlassian.net/wiki/spaces/DE/pages/3862331895/Set+up+AWS+access#Config-file-for-prisons-and-probation) if this isn't already configured).
+1. Activate the AWS SSO role you use for data engineering: `aws-vault exec ` (the details should be in your `~/.aws` folder).
 2. Log in to the Pulumi backend with `pulumi login s3://data-engineering-pulumi.analytics.justice.gov.uk`.
 3. Run `pulumi stack select` and pick `data-engineering-exports`.
 4. Run `pulumi stack` to check you can see what's currently deployed.

From 1cc0e45f7e44172ff216e6ae0f213053bf3b1ee6 Mon Sep 17 00:00:00 2001
From: vonbraunbates
Date: Mon, 6 Jan 2025 15:09:08 +0000
Subject: [PATCH 2/7] Update README.md

---
 README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 764810a..6fecf2f 100644
--- a/README.md
+++ b/README.md
@@ -52,7 +52,8 @@ Only use lower case and underscores in your dataset name.
 5. Commit the file and push it to GitHub
 6. Create a new pull request and request a review from the data engineering team. Once this is approved, you can merge your PR: this doesn't happen automatically, so don't forget.
-7. Once your changes are in the `main` branch, request a data engineer to `pulumi up` which deploys your changes to the infrastructure. They will tell you when it's ready. If you can't see your new role in IAM (in our example it's `export_<>-move`) then your changes haven't been deployed.
+7. Once your changes are in the `main` branch, ask a data engineer to run `pulumi up`, which deploys your changes to the infrastructure. They will tell you when it's ready. If you have access to the data engineering SSO role, then you can do this yourself [following the instructions](./CONTRIBUTING.md).
+8. If you can't see your new role in IAM (in our example it's `export_new_project-move`), then your changes haven't been deployed. You may need to wait 24 hours.
 
 ## Exporting from your bucket
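Taken together, the setup steps introduced in [PATCH 1/7] amount to the shell session below. This is a sketch rather than part of the patches: it assumes the virtual environment is named `env` (as the new wording suggests) and that `pip` resolves to the environment's interpreter; if it does not, use `pip3`, as a later patch in this series does.

```bash
# Run from the root of your local clone of the repository.
python3 -m venv env                   # create the virtual environment (name "env" assumed)
source ./env/bin/activate             # activate it
which python3                         # should point at ./env/bin/python3, not the OS-wide interpreter

pip install -r requirements.txt       # runtime dependencies
pip install -r requirements-dev.txt   # development dependencies
# Add --force-reinstall to either command to replace packages already installed.

pre-commit install                    # install the git pre-commit hooks
```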
From 1d406cd61b8b0bf46bee6d496817fae8e7ee317 Mon Sep 17 00:00:00 2001
From: vonbraunbates
Date: Mon, 6 Jan 2025 15:12:56 +0000
Subject: [PATCH 3/7] Update CONTRIBUTING.md

---
 CONTRIBUTING.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 981c9d5..3b9d667 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -39,7 +39,7 @@ should return `/env/bin/python3` instead of your OS's global
 
 4. Install the packages:
 
-`pip install -r requirements.txt && pip install -r requirements-dev.txt`
+`pip3 install -r requirements.txt && pip3 install -r requirements-dev.txt`
 
 Use the `--force-reinstall` flag to replace an existing version if necessary. T
 
 5. Then install the pre-commit hooks with `pre-commit install`.
 
 ## Access the Pulumi stack
 
-1. Activate the AWS SSO role you use for data engineering: `aws-vault exec ` (the details should be in your `~/.aws` folder).
+1. Activate the AWS SSO role you use for data engineering: `aws-vault exec `
+   If you're not sure which profile to use, consult `aws-vault list` or for even more detail, your `~/.aws` folder.
 2. Log in to the Pulumi backend with `pulumi login s3://data-engineering-pulumi.analytics.justice.gov.uk`.
 3. Run `pulumi stack select` and pick `data-engineering-exports`.
 4. Run `pulumi stack` to check you can see what's currently deployed.

From 23b5aab06d00973b81b40d563b07b552eafa632b Mon Sep 17 00:00:00 2001
From: vonbraunbates
Date: Mon, 6 Jan 2025 16:01:19 +0000
Subject: [PATCH 4/7] Update CONTRIBUTING.md

Updated the role needed to access the stacks
---
 CONTRIBUTING.md | 13 +++++++------
 1 file changed, 7 insertions(+), 6 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 3b9d667..06cdf82 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -47,12 +47,13 @@ Use the `--force-reinstall` flag to replace an existing version if necessary. T
 
 ## Access the Pulumi stack
 
-1. Activate the AWS SSO role you use for data engineering: `aws-vault exec `
-   If you're not sure which profile to use, consult `aws-vault list` or for even more detail, your `~/.aws` folder.
-2. Log in to the Pulumi backend with `pulumi login s3://data-engineering-pulumi.analytics.justice.gov.uk`.
-3. Run `pulumi stack select` and pick `data-engineering-exports`.
-4. Run `pulumi stack` to check you can see what's currently deployed.
-5. Run `pulumi preview` to check the resources look correct. Use the `--diff` flag to see details.
+1. Activate the AWS SSO role you use to access `analytical-platform-data-production`:
+   `aws-vault exec `
+   If you're not sure which profile to use, consult `aws-vault list` or for even more detail, your `~/.aws/config` file.
+3. Log in to the Pulumi backend with `pulumi login --cloud-url s3://data-engineering-pulumi.analytics.justice.gov.uk`.
+4. Run `pulumi stack select` and pick `data-engineering-exports`.
+5. Run `pulumi stack` to check you can see what's currently deployed.
+6. Run `pulumi preview` to check the resources look correct. Use the `--diff` flag to see details.
 
 You may see changes to update the local archive path, which can be ignored. If you are using a different version of `pulumi-aws` to the current deployment you may see changes relating to the provider, you can avoid these by installing the specific version currently in use, for example, `pip install --force-reinstall pulumi-aws==5.40.0`.
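The preview workflow that patches 3 and 4 converge on looks roughly like the session below. It is a sketch under assumptions: `<profile_name>` is a placeholder for whatever `aws-vault list` shows for the `analytical-platform-data-production` account, and the stack name can instead be chosen interactively by running `pulumi stack select` with no argument.

```bash
# Get AWS credentials for the data-production account (placeholder profile name).
aws-vault exec <profile_name>

# In the credentialed shell, point Pulumi at the S3 backend and pick the stack.
pulumi login --cloud-url s3://data-engineering-pulumi.analytics.justice.gov.uk
pulumi stack select data-engineering-exports   # or run bare and pick from the list
pulumi stack                                   # show what is currently deployed
pulumi preview --diff                          # inspect the planned changes in detail
```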
From 1dfab60c4eb3968887505842024359a557b29b28 Mon Sep 17 00:00:00 2001
From: vonbraunbates
Date: Mon, 6 Jan 2025 16:20:00 +0000
Subject: [PATCH 5/7] Update CONTRIBUTING.md

---
 CONTRIBUTING.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 06cdf82..f4ef9dc 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -51,8 +51,8 @@ Use the `--force-reinstall` flag to replace an existing version if necessary. T
    `aws-vault exec `
    If you're not sure which profile to use, consult `aws-vault list` or for even more detail, your `~/.aws/config` file.
 3. Log in to the Pulumi backend with `pulumi login --cloud-url s3://data-engineering-pulumi.analytics.justice.gov.uk`.
-4. Run `pulumi stack select` and pick `data-engineering-exports`.
-5. Run `pulumi stack` to check you can see what's currently deployed.
+4. Run `pulumi stack select` and pick `data-engineering-exports` (which may be out of sight; just keep hitting the up arrow).
+5. Run `pulumi stack` to check you can see what's currently deployed. If asked for a passphrase, hit return.
 6. Run `pulumi preview` to check the resources look correct. Use the `--diff` flag to see details.
 
 You may see changes to update the local archive path, which can be ignored. If you are using a different version of `pulumi-aws` to the current deployment you may see changes relating to the provider, you can avoid these by installing the specific version currently in use, for example, `pip install --force-reinstall pulumi-aws==5.40.0`.

From da87ffd7a29323e46cc488c099215a40783cf27e Mon Sep 17 00:00:00 2001
From: vonbraunbates
Date: Fri, 17 Jan 2025 13:54:02 +0000
Subject: [PATCH 6/7] Update CONTRIBUTING.md

Updating to account for new permissions.
---
 CONTRIBUTING.md | 11 ++++++-----
 1 file changed, 6 insertions(+), 5 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index f4ef9dc..244187e 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -19,7 +19,7 @@ or
 6. Merge to main **and pull to your local machine** (since pulumi will operate on the locally-held version of your code).
 
-## Pre-requisites
+## Pulumi pre-requisites
 
 1. Clone the repo locally and `cd` to this folder.
 
@@ -57,14 +57,15 @@ Use the `--force-reinstall` flag to replace an existing version if necessary. T
 You may see changes to update the local archive path, which can be ignored. If you are using a different version of `pulumi-aws` to the current deployment you may see changes relating to the provider, you can avoid these by installing the specific version currently in use, for example, `pip install --force-reinstall pulumi-aws==5.40.0`.
 
+## Deploying changes
 
-6. Deploy the changes with `pulumi up` (there's a ticket to [automate the deployment](https://dsdmoj.atlassian.net/browse/PDE-1441)).
+Pre-SSO, data enigneers had the permissions to deploy changes. Now you will need to ask someone from the Analytical Platform team to do so in `#ask-analytical-platform` on Slack. As usual, they will deploy the changes with `pulumi up` (there's a ticket to [automate the deployment](https://dsdmoj.atlassian.net/browse/PDE-1441)).
 
-You may also need to set `export PULUMI_CONFIG_PASSPHRASE=""` if you've changed this for other projects.
+## QA
 
-7. Ask the user to test the export. This should include making sure the destination system gets the test file, as we can't see the destination buckets ourselves.
 
-## Running tests
+## Running Pulumi tests
 
 There are normal unit tests that mock Pulumi resources. There is also an end-to-end test that uses Localstack, which creates a mock AWS environment. The test infrastructure should behave like real resources, but none of it needs access to a real AWS account.
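If `pulumi preview` reports provider-level changes or prompts for a passphrase, the workarounds mentioned in patches 5 and 6 can be scripted as below. This is a sketch: `5.40.0` is only the example version quoted in the text, and `pulumi up` now has to be run by someone with deployment permissions (the Analytical Platform team).

```bash
# Pin pulumi-aws to the version already deployed so the provider itself shows no diff
# (5.40.0 is the example version from the text; check what the stack actually uses).
pip install --force-reinstall pulumi-aws==5.40.0

# If Pulumi prompts for a passphrase, an empty one is expected for this project.
export PULUMI_CONFIG_PASSPHRASE=""

pulumi preview --diff   # re-check the plan
pulumi up               # deploy, only if you have permission to do so
```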
From 0c0652b39064f20e5252b2b5105a1bd05c9ff71b Mon Sep 17 00:00:00 2001
From: vonbraunbates
Date: Fri, 17 Jan 2025 13:55:53 +0000
Subject: [PATCH 7/7] Update CONTRIBUTING.md

spelling
---
 CONTRIBUTING.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 244187e..5d280e0 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -45,7 +45,7 @@ Use the `--force-reinstall` flag to replace an existing version if necessary. T
 5. Then install the pre-commit hooks with `pre-commit install`.
 
-## Access the Pulumi stack
+## Previewing changes to the Pulumi stack
 
 1. Activate the AWS SSO role you use to access `analytical-platform-data-production`:
    `aws-vault exec `
    If you're not sure which profile to use, consult `aws-vault list` or for even more detail, your `~/.aws/config` file.
 3. Log in to the Pulumi backend with `pulumi login --cloud-url s3://data-engineering-pulumi.analytics.justice.gov.uk`.
@@ -59,9 +59,9 @@ You may see changes to update the local archive path, which can be ignored. If y
 ## Deploying changes
 
-Pre-SSO, data enigneers had the permissions to deploy changes. Now you will need to ask someone from the Analytical Platform team to do so in `#ask-analytical-platform` on Slack. As usual, they will deploy the changes with `pulumi up` (there's a ticket to [automate the deployment](https://dsdmoj.atlassian.net/browse/PDE-1441)).
+Pre-SSO, data engineers had the permissions to deploy changes. Now you will need to ask someone from the Analytical Platform team to do so in `#ask-analytical-platform` on Slack. As usual, they will deploy the changes with `pulumi up` (there's a ticket to [automate the deployment](https://dsdmoj.atlassian.net/browse/PDE-1441)).
 
-## QA
+## Testing deployment
 
 After the stack is live, ask the user to test the export. This should include making sure the destination system gets the test file, as we can't see the destination buckets ourselves.
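For the "Running Pulumi tests" section renamed in patch 6, a typical local invocation might look like the following. This is only a sketch built on assumptions: the patches name neither the test runner nor the Localstack setup, so `pytest` and the Docker command here are guesses and the repository's actual commands may differ.

```bash
source ./env/bin/activate
pip install -r requirements-dev.txt     # assumed to include the test dependencies

# The end-to-end test relies on Localstack for a mock AWS environment;
# one common way to provide it is a disposable Docker container (assumed setup).
docker run --rm -d -p 4566:4566 localstack/localstack

pytest                                  # assumed test runner; no real AWS account is needed
```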