cvs-tsk-update-test-stations

AWS Lambda function that updates the list of test stations in the ATF Sites DynamoDB from the master list in Dynamics CE.

Designed to be invoked by a timer every night to pick up the previous day's changes, but can be invoked manually.

Description

The function authenticates with Azure AD and uses the returned token to retrieve, via OData, the details of test stations that have changed in Dynamics CE. Each updated test station is added to EventBridge as a separate event, and EventBridge is then responsible for invoking the Test Station API to perform the actual update in DynamoDB.

The test stations are queried on their modifiedon property; by default this is midnight yesterday, but any valid ISO-formatted date can be passed in.

For example, to update all test stations that have been modified on or since the 1st of December 2021:

{
  "detail": {
    "lastModifiedDate": "2021-12-01"
  }
}
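
As a rough illustration of that flow, the sketch below shows how the handler might build the OData filter and publish one EventBridge event per changed test station. The entity set, URLs, environment variables, event source/detail-type strings and the token helper are assumptions for illustration only, not the actual implementation.

// Illustrative sketch only: entity set, URLs, env vars and event names are assumptions.
import { EventBridgeClient, PutEventsCommand } from '@aws-sdk/client-eventbridge';

const eventBridge = new EventBridgeClient({});

// Placeholder for the Azure AD client-credentials flow described above.
const getAzureAdToken = async (): Promise<string> => process.env.AZURE_AD_TOKEN ?? '';

export const handler = async (event: { detail?: { lastModifiedDate?: string } }): Promise<void> => {
  // Default to yesterday's date unless one is supplied in the event detail.
  const yesterday = new Date(Date.now() - 24 * 60 * 60 * 1000);
  const lastModifiedDate = event.detail?.lastModifiedDate ?? yesterday.toISOString().split('T')[0];

  // Query Dynamics CE via OData for test stations modified on or after that date.
  const filter = encodeURIComponent(`modifiedon ge ${lastModifiedDate}`);
  const response = await fetch(`${process.env.CE_ODATA_URL}/accounts?$filter=${filter}`, {
    headers: { Authorization: `Bearer ${await getAzureAdToken()}` },
  });
  const { value: testStations } = (await response.json()) as { value: Record<string, unknown>[] };

  // Publish one event per updated test station; EventBridge then invokes the
  // Test Station API to apply the change in DynamoDB.
  for (const testStation of testStations) {
    await eventBridge.send(
      new PutEventsCommand({
        Entries: [
          {
            Source: 'cvs.tsk.update.test.stations', // assumed source name
            DetailType: 'Test station updated', // assumed detail type
            Detail: JSON.stringify(testStation),
          },
        ],
      }),
    );
  }
};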

The solution design can be found in Confluence.

Dependencies

The project runs on Node 18.x with TypeScript. For further details about project dependencies, please refer to the package.json file. nvm is used to manage Node versions, and configuration is done explicitly per project using an .npmrc file.
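
For illustration, the per-project pinning described above would typically amount to file contents along these lines (assumed, not copied from the repository):

.nvmrc
18

.npmrc
engine-strict=true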

Running the project

Once the dependencies are installed (npm install), you will need to rename the /config/env.example file to .env.local, as we use dotenv files for local development configuration. Please refer to the serverless documentation for further information about variables and environment variables. Note that multiple .env files can be created, one per environment. Our current development environment is 'local'.

The application runs on port :3001 by default when no stage is provided.

The service has local environment variables set locally (please see the env placeholder file). Should we wish to further extend the service, the environment variables will need to be ported over to the CI/CD pipeline, which currently uses BRANCH and BUCKET.

Environments

We use the NODE_ENV environment variable to set up multi-stage builds (region, stages). With the help of dotenv, the npm scripts load the relevant .env.<NODE_ENV> file from the ./config folder into serverless.yml, as we don't rely on serverless for deployment. If no NODE_ENV value is provided when running the scripts, NODE_ENV defaults to 'development' and the .env.development config file is used.

The default value for both 'stage' and 'region' is 'local'. Please refer to the values provided in the serverless.yml file.

The following values can be provided when running the scripts with NODE_ENV:

// ./config/.env.<NODE_ENV> files
'local'; // used for local development
'development'; // used for the development stage, should we wish to call external services
'test'; // used during test scripts, where local services and mocks can be used in conjunction
/** Running serverless offline for a specific stage - 'local'.
 * Stage 'local' will be injected into the serverless.yml
 **/
NODE_ENV=local serverless offline
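
As a rough sketch of what that loading amounts to (the real wiring lives in the npm scripts and serverless.yml; the PORT variable below is only an assumption):

// Illustrative only: load ./config/.env.<NODE_ENV>, falling back to 'development'.
import * as dotenv from 'dotenv';
import * as path from 'path';

const stage = process.env.NODE_ENV ?? 'development';
dotenv.config({ path: path.resolve('config', `.env.${stage}`) });

console.log(`Loaded '.env.${stage}' config, port ${process.env.PORT ?? 3001}`);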

Further details about environment setup can be found in the provided documentation and env.example file.

All secrets are stored in AWS Secrets Manager.

Scripts

The following scripts are available, for further information please refer to the project package.json file:

  • start: npm start - launch serverless offline service
  • dev: npm run dev - run the service and unit tests in parallel, in --watch mode with live reload
  • test: npm run test - execute the unit test suite
  • build: npm run build - bundle the project for production
  • production build: npm run build:production - bundle the project with its libraries, minified and concatenated

Offline

Serverless-offline is used to run the project locally; please use the npm run dev script to do so. Go to http://localhost:3001/local/version to confirm that everything has loaded correctly; you should see that the version matches the version in the package.json file.
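
As an illustrative check of that comparison (the response shape is assumed here, and resolveJsonModule must be enabled in tsconfig):

// Illustrative only: confirm the running service reports the package.json version.
import { version } from './package.json';

(async () => {
  const res = await fetch('http://localhost:3001/local/version');
  const body = (await res.json()) as { version: string };
  console.log(body.version === version ? 'versions match' : `mismatch: ${body.version} vs ${version}`);
})();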

The routes below are available as default routes from this scaffolding:

(GET) http://localhost:3009/local-stage/version
(GET) http://localhost:3009/local-stage/*/service-name/
(POST) http://localhost:3009/local-stage/*/service-name/:id/something

Lambda locally

Serverless can invoke lambda functions locally, which provides a close experience to the real service if you decide not to use the offline mode. Events and paths can be found under the /local folder. For further details on invoking lambdas locally, please refer to the serverless documentation.
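
For example, assuming an event file under /local (the function name and file name below are only placeholders):

serverless invoke local --function <function-name> --path local/event.json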

Debugging

Configuration to debug the running service is available for VS Code; please refer to the .vscode/launch.json file. Serverless offline will be available on port :4000. Two Jest configurations are also provided, allowing you to run a single test or multiple tests. Should you wish to change the ports when debugging, please change the config args accordingly.

For further information about debugging, please refer to the serverless and VS Code debugging documentation.

Testing

Unit

Jest is used for unit testing. Please refer to the Jest documentation for further details.
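
A minimal test follows the usual Jest pattern, for example (purely illustrative, not taken from this repository):

// Purely illustrative Jest test, not taken from this repository.
describe('lastModifiedDate default', () => {
  it('falls back to a valid ISO date string when none is supplied', () => {
    const fallback = new Date().toISOString().split('T')[0];
    expect(fallback).toMatch(/^\d{4}-\d{2}-\d{2}$/);
  });
});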

Release

Releases (tag, release notes, changelog, GitHub release, assets) are automatically managed by semantic-release when pushing (or merging) to the protected develop branch. The semver convention is followed.

Please be familiar with the conventional commit format, as described in the Contributing section below.

The default preset used for conventional commits is angular; please see the angular conventions.

The <type> 'breaking' in the commit message will trigger a major version bump, as will any of the following text contained in the commit body: "BREAKING CHANGE", "BREAKING CHANGES", "BREAKING_CHANGES", "BREAKING", "BREAKING_CHANGE". Please refer to the .releaserc.json file for the full configuration.
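
For example, either of the following commit messages (hypothetical ticket numbers) would produce a major release:

// examples - hypothetical ticket numbers
'breaking(cb2-1234): drop the legacy payload format' // type 'breaking' bumps the major version
'feat(cb2-1234): replace the payload format' // with "BREAKING CHANGE: ..." in the commit body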

The npm run release script will automatically trigger the release in CI. To test the release manually, the --dry-run and --no-ci flags can be passed to the release script.

Publishing and artifacts are managed separately by the pipeline.

Contributing

To facilitate the standardisation of the code, a few helpers and tools have been adopted for this repository.

External dependencies

The project has multiple hooks configured using husky, which will execute the following scripts: audit, lint, build, test, and format your code with eslint and prettier.

You will be required to install git-secrets (the brew approach is recommended) and the DVSA repo-security-scanner, which runs against your git log history to find accidentally committed passwords or private keys.
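
A typical git-secrets setup with brew looks like the following (the exact registration you need may differ):

brew install git-secrets
git secrets --install
git secrets --register-aws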

We follow the conventional commit format, with the angular convention, when committing code to the repository.

The type is mandatory and must be all lowercase. The scope of your commit is also mandatory; it must include your ticket number and be all lowercase. The format for the ticket number can be set in the commitlint.config.js file.

// Please see /commitlint.config.js for customised format

type(scope?): subject

// examples
'chore(cb2-1234): my commit msg' // pass
'CHORE(cb2-1234): my commit msg' // will fail

Code standards

Code structure

Domain-Driven Design diagram with Interfaces, Application and Domain layers, and Infrastructure across the layers (see the diagram in the repository).

Tooling

The code uses eslint and typescript clean code standards, as well as SonarQube for static analysis. SonarQube is available locally; please follow the instructions below if you wish to run SonarQube locally (docker is the preferred approach):

  • Docker:

    • Run docker run -d -p 9000:9000 --name sonarqube sonarqube:latest
    • The SonarQube container won't start automatically with your PC. To start it run docker start sonarqube
    • Login with admin/admin - http://localhost:9000 and create a Project with name and key found in ./sonar-project.properties. There you'll also find instructions for creating and configuring an authentication token.
    • Run the analysis with npm run sonar-scanner
  • Brew:

    • Install SonarQube using brew
    • Change sonar.host.url to point to localhost; by default, sonar runs on http://localhost:9000
    • Run the sonar server with sonar start, then perform your analysis with npm run sonar-scanner
  • Manual:

    • Add sonar-scanner to your environment variables: in your _profile file, add the line export PATH=<PATH_TO_SONAR_SCANNER>/sonar-scanner-3.3.0.1492-macosx/bin:$PATH
    • Start the SonarQube server: cd <PATH_TO_SONARQUBE_SERVER>/bin/macosx-universal-64 && ./sonar.sh start
    • In the microservice folder run the command: npm run sonar-scanner