Publishing NPM Libraries using NX and GitHub Actions

Published: May 7, 2020 (last updated: Feb 12, 2021) · ~2300 words · ~11 minutes reading time

This week I released version 1.0 of RxJS Primitives to NPM - but the journey to get there was not as easy as I hoped.

In the past I’ve used CircleCI, and at work I use Jenkins, but with this project I wanted to try out GitHub Actions.

After some trial and error (and many failed builds) I managed to get the workflow working, and I’ve decided to share the steps taken here to hopefully save you from the same pain.

Setting up your monorepo

The NX CLI is a set of tools that makes managing your repository of JavaScript and TypeScript code easy. Originally built on top of the Angular CLI, the tooling now supports more project types including Node, React and Web Components.

In my day job most of my work is building enterprise Angular applications, but I also work on my own open source software, and in both cases I use NX - including using it to publish Angular and TypeScript libraries to NPM.

Each way of setting this up is subtly different. In this example I’ll show how I built a pipeline for releasing my TypeScript RxJS library.

The easiest way to start is to create your repo using create-nx-workspace:

npx create-nx-workspace <project-name>

When running this, you will be asked a few questions depending on your setup. For RxJS Primitives I used a plain workspace, and once it was created I added @nrwl/node to the project. The default @nrwl/workspace plugin allows the creation of libraries, but does not provide a publishable output (adding a package.json) as an option.

Once set up, you can create libraries that can be built and published as NPM modules, using the @nrwl/node plugin to create them. For rxjs-string I used:

> nx g @nrwl/node:library --name=string --directory=rxjs --publishable

If it’s an Angular library use:

> nx g @nrwl/angular:library --name=my-component --directory=ngx --publishable

After a few seconds, inside the libs folder will be the default output of a TypeScript library, with various tsconfig.json files for testing and building, a Jest config for unit testing, an index.ts file as the entry point, and a package.json for publishing.

One issue with NX in this configuration is that in the package.json you’ll find a two-level name for your library (e.g. @tinynodes/rxjs-string), however in the root tsconfig.json file you’ll see the following in the paths property:

{
  "paths": {
    "@tinynodes/rxjs/string": [
      "libs/rxjs/string/src/index.ts"
    ]
  }
}

If you are only using this library internally it’s not really an issue, but if you intend to publish the library I recommend changing the path to @tinynodes/rxjs-string to match the NPM import path.
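After that change, the paths entry matches the published package name:

{
  "paths": {
    "@tinynodes/rxjs-string": [
      "libs/rxjs/string/src/index.ts"
    ]
  }
}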

Preparing for publishing

Once you have developed your library, it’s time to publish to NPM! First of all, make sure your public API (functions, classes and types) is exported in the library’s index.ts:

// index.ts
export { myFactoryFunction } from './libs/factory';
export { MyThing } from './classes/my-thing';
export { MyInterface, SOME_CONSTANT } from './types/thing-types';

To see the output of this library you can run nx build library-name - this will output an NPM module to the dist folder.
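For example, for the rxjs-string library generated earlier:

> nx build rxjs-string

The compiled module, including its package.json, is written to dist/libs/rxjs/string - mirroring the folder layout under libs (this is the layout the publish script later relies on).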

Setting up GitHub Actions for Pull Requests

For an open-source library on GitHub, it’s good practice to accept pull requests from other developers, and to have confidence in those PRs a pull request checker is ideal.

First create a .github folder at the root of your workspace, and inside it a workflows folder. This is the folder that GitHub checks for YAML files containing action workflows.

Create a file pr_on_master.yml and inside this file set up the following steps:

# File for Pull Request on master branch
name: PR on master

# When a PR is opened to master
on:
  pull_request:
    branches:
      - master
    types: [ opened, reopened, synchronize ]

This first section provides the name and the triggers for when the action runs: when a pull request is opened, re-opened or synchronized against master.

Next, the job needs to be set up:

jobs:
  build:
    # Setup OS and Node Version
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # Latest nodes only
        node-version: [ 13.x ]

This sets up a GitHub Actions runner and makes sure Node.js 13 is installed. Here you can add more versions and run parallel tests against different versions if you plan to support more than one.
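For example, to also run the suite against Node.js 12 in parallel, the matrix could be extended to:

strategy:
  matrix:
    node-version: [ 12.x, 13.x ]

Each entry in the matrix runs the steps below as a separate parallel job.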

# Define Steps
steps:
  # Checkout code
  - uses: actions/checkout@v2
    with:
      ref: ${{ github.event.pull_request.head.ref }}
      fetch-depth: 0

  - name: Use Node.js ${{ matrix.node-version }}
    uses: actions/setup-node@v1
    with:
      node-version: ${{ matrix.node-version }}

The first two steps check out the code from the pull request branch, then set up Node.js to run.

The next step was the part of setting up the pipeline that took the longest to fix.

To use the nx affected commands we need a base branch to compare changes against - by default the GitHub checkout does not pull all the branches, only the current one - and this includes the master branch. This command tells git to pull the master branch from the origin:

  # Make sure we have all branches
  - name: Fetch other branches
    run: git fetch --no-tags --prune --depth=5 origin master

Finally, we run some npm commands for installing the dependencies and running linting and test coverage.

  - name: Install environment
    run: npm ci

  - name: Run lint
    run: npm run affected:lint -- --base="origin/master"

  - name: Tests coverage
    run: npm run affected:test -- --base="origin/master" --codeCoverage

Using the affected commands, the pipeline will only run linting and testing against libraries that have changed relative to the master branch. More steps, such as SonarQube or other webhooks, can be added here.
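These rely on the affected:lint and affected:test npm scripts that an NX workspace adds to package.json - they simply delegate to the NX CLI, roughly:

{
  "scripts": {
    "affected:lint": "nx affected:lint",
    "affected:test": "nx affected:test"
  }
}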

Publishing to NPM

When publishing libraries, there are a few additional steps needed before we write the pipeline. First of all we need two access tokens: one for NPM and one for GitHub.

NPM Token

The NPM token will be used to publish the library to the public NPM registry. Log into NPM and under your profile go to the “Auth Tokens” section and create a new token with “Publish” access. Next, go to your GitHub repository and under “Settings -> Secrets” add a new secret called NPM_AUTH_TOKEN and paste in the value.

If publishing to a private registry, follow its instructions on generating an API token.

Github Token

To publish changes back to GitHub from the pipeline you also need a personal access token - this can be created under your account settings in “Developer Settings -> Personal Access Tokens”. The only permissions you need to give this token are the repo permissions.

Add this under “Settings -> Secrets” as ACTION_AUTH_TOKEN (it seems you cannot use GITHUB_ in secret names at all) 🤷‍♂️

Setting up the action

Like before, we are going to add a YAML file to the .github/workflows folder - this time called publish.yml.

First set the action up to trigger only when a PR is closed on master:

name: Merge on master
on:
  pull_request:
    branches:
      - master
    types: [ closed ]

The job section is the same as above, but for the steps there is a slight change - to allow merges to master that don’t trigger a release, each step is wrapped in an if condition that checks the commit message for the string [skip-ci] and skips the task if it’s found (unfortunately it seems you can’t put a condition around the entire set of steps, so it has to be added to each one).

steps:
  # Checkout code
  - name: Checkout Code
    if: github.event.pull_request.merged == true && contains(github.event.commits[0].message, '[skip-ci]') == false
    uses: actions/checkout@v2

Instead of running lint and test, we’ll now run a set of different tasks - hold on tight because we’re about to dive into some bash code!

Deployment Step

In our publish.yml add the following step, which first configures the registry with the auth token we set up earlier, then calls our publish bash script:

- name: Deploy
  if: github.event.pull_request.merged == true && contains(github.event.commits[0].message, '[skip-ci]') == false
  env:
    NPM_AUTH_TOKEN: ${{ secrets.NPM_AUTH_TOKEN }}
  run: |
    npm config set //registry.npmjs.org/:_authToken=$NPM_AUTH_TOKEN
    ./.github/scripts/publish-libraries.sh

Create the .github/scripts/publish-libraries.sh file and make sure it’s set to executable by running chmod +x .github/scripts/publish-libraries.sh before committing the script.

The first part of the script sets up our variables and a getBuildType function to check what type of release we are doing. It’s good to use SemVer to publish based on the changes that happen in the library, and with this function the following words in the merge commit message determine the release type.

For example, (fix): Fixed that annoying bug in issue #123 will set the release type to patch. The default is minor, and using (feat) will make a major change.

#!/usr/bin/env bash
set -o errexit -o noclobber -o nounset -o pipefail

# This script uses the parent version as the version to publish a library with

getBuildType() {
  local release_type="minor"
  if [[ "$1" == *"feat"* ]]; then
    release_type="major"
  elif [[ "$1" == *"fix"* || "$1" == *"docs"* || "$1" == *"chore"* ]]; then
    release_type="patch"
  fi
  echo "$release_type"
}

PARENT_DIR="$PWD"
ROOT_DIR="."
echo "Removing Dist"
rm -rf "${ROOT_DIR:?}/dist"
COMMIT_MESSAGE="$(git log -1 --pretty=format:"%s")"
RELEASE_TYPE=${1:-$(getBuildType "$COMMIT_MESSAGE")}
DRY_RUN=${DRY_RUN:-"False"}

The script also cleans up the dist folder, and has an optional DRY_RUN flag you can set if you want to test the pipeline before a release.
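For example, to exercise the script locally without publishing anything (assuming origin/master is available in your checkout):

DRY_RUN=True ./.github/scripts/publish-libraries.sh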

After this add the following:

AFFECTED=$(node node_modules/.bin/nx affected:libs --plain --base=origin/master~1)
if [ "$AFFECTED" != "" ]; then
  cd "$PARENT_DIR"
  echo "Copy Environment Files"

  while IFS= read -r -d $' ' lib; do
    echo "Setting version for $lib"
    cd "$PARENT_DIR"
    cd "$ROOT_DIR/libs/${lib/-//}"
    npm version "$RELEASE_TYPE" -f -m "RxJS Primitives $RELEASE_TYPE"
    echo "Building $lib"
    cd "$PARENT_DIR"
    npm run build "$lib" -- --prod --with-deps
    wait
  done <<<"$AFFECTED " # leave space on end to generate correct output

  cd "$PARENT_DIR"
  while IFS= read -r -d $' ' lib; do
    if [ "$DRY_RUN" == "False" ]; then
      echo "Publishing $lib"
      npm publish "$ROOT_DIR/dist/libs/${lib/-//}" --access=public
    else
      echo "Dry Run, not publishing $lib"
    fi
    wait
  done <<<"$AFFECTED " # leave space on end to generate correct output
else
  echo "No Libraries to publish"
fi

This part of the script allows us to control the build and release - the first line gets a plain, space-separated list of the libraries affected between the previous commit on master (origin/master~1) and the current HEAD. If you only use PRs to make changes to master this is very effective - but it can break if you make changes directly to master.

This is done instead of using nx affected directly, as there is currently no task for publishing - this gives us a way to loop over the libraries for both building and publishing.

Both while loops parse this string, splitting on spaces so each library can be handled in turn.

In each loop there is a string replacement, ${lib/-//} - this changes the library name (e.g. rxjs-string) into a path (rxjs/string).
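You can see the substitution in isolation in a shell:

lib="rxjs-string"
echo "${lib/-//}" # rxjs/string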

The first while loop ensures we are in the correct directory and runs npm version - bumping the package.json for release - then runs the build task for production, ensuring any dependencies are also built.

The second while loop iterates over the same list but runs the npm publish command in each directory, setting the access to public.

Congratulations

If you’ve made it this far, well done - this is quite a long post! At this point your pipeline should have successfully published your NPM module for other developers to use! We’re not done yet though!

Before running this we are going to add some additional steps so that we can publish documentation and changes back to Github:

  - name: Build Docs
    if: github.event.pull_request.merged == true && contains(github.event.commits[0].message, '[skip-ci]') == false
    run: npm run docs

  - name: Commit files
    if: github.event.pull_request.merged == true && contains(github.event.commits[0].message, '[skip-ci]') == false
    run: |
      git config --local user.email "action@github.com"
      git config --local user.name "GitHub Action"
      git add .
      git commit -m "Release [skip-ci]" -a || true      

  - name: Push changes
    if: github.event.pull_request.merged == true && contains(github.event.commits[0].message, '[skip-ci]') == false
    uses: ad-m/github-push-action@master
    with:
      github_token: ${{ secrets.ACTION_AUTH_TOKEN }}
      tags: true
      force: true

For RxJS Primitives I used TypeDoc to generate static content to the docs folder, which is then used by GitHub Pages, but here you can set up whatever documentation system you prefer.
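The docs script itself is just an npm script wrapping the TypeDoc CLI - something like the following, though the entry point here (libs) is an assumption and will depend on your workspace layout:

{
  "scripts": {
    "docs": "typedoc --out docs libs"
  }
}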

Once the docs have been generated, we commit all the changes, including the package.json version bumps, back to git and finally push them to master using the GitHub token generated earlier.

Within a few seconds your GitHub Page should update with the latest content, and the master branch will reflect all the changes made.

That’s a wrap! You’ve now successfully published your TypeScript library for other developers to use, along with documentation.

If you’ve found this article useful, let me know - and if you find any issues or improvements please get in touch!

RxJS Primitives - Operators for mutating and filtering primitives

Published: Apr 23, 2020 (last updated: Apr 23, 2020) · ~400 words · ~2 minutes reading time

Today I’ve published a new set of libraries to NPM - RxJS Primitives .

These are based on some operators I’ve collected over the last year, and some additional ones I’ve started adding. Most are based around ECMAScript objects such as String, Number and Boolean, but they also include some useful utility operators.

Over the coming weeks I’ll add more operators, both based on ECMAScript methods and custom functions that I have found useful.

The following modules are on NPM:

rxjs-string

@tinynodes/rxjs-string provides operators built around the String object in ECMAScript, for example toUpperCase:

from(['hello', 'world']).pipe(
  toUpperCase()
).subscribe(value => console.log(value)) // ['HELLO', 'WORLD']

There are also some boolean-value operators such as endsWith, and extraction operators like charAt, with plans to add more useful utilities. For example, endsWith returns a boolean value, but I also want to include a variant that returns the original value instead (like filter).
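As a sketch of how the boolean form behaves (assuming endsWith takes the search string as its argument):

from(['hello', 'world']).pipe(
  endsWith('lo')
).subscribe(value => console.log(value)) // [true, false]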

rxjs-number

@tinynodes/rxjs-number provides operators built around the Number object in ECMAScript, for example parseFloat/parseInt and isNaN.

from(['1', '1.2', '3.14']).pipe(parseInt()).subscribe(value => console.log(value)) // [1, 1, 3]

This also includes toString, which uses Number.prototype.toLocaleString and supports formatting such as currency.

rxjs-boolean

@tinynodes/rxjs-boolean provides operators built around the Boolean object in ECMAScript, designed to help with filtering content from observables. Currently, there are two operators: firstTruthy and filterTruthy.

In both cases these return the underlying value only if it’s a truthy value in JavaScript; firstTruthy returns only the first truthy value, while filterTruthy returns all truthy values.
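For example, a sketch of filterTruthy based on that description:

from(['', 'Hello', 0, 'RxJS']).pipe(
  filterTruthy()
).subscribe(value => console.log(value)) // ['Hello', 'RxJS']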

rxjs-utility

@tinynodes/rxjs-utility is a custom module that provides some additional operators that don’t fit into the other packages but are still useful.

Currently, there are two operators:

  • startWithTap - Will fire a callback method only on the first emission from an Observable
form.valueChanges.pipe(
  startWithTap(() => form.markAsTouched()),
).subscribe()
  • debounceWithQuery - Debounces an input such as a text input and passes it to a method that returns a value from a query (such as a search)
searchField.valueChanges.pipe(
  debounceWithQuery(1000, (search) => http.get(`/search?query=${search}`))
).subscribe()

Validating data with JSON Schema, Angular and TypeScript

Published: Sep 18, 2019 (last updated: Feb 12, 2021) · ~1900 words · ~9 minutes reading time

One common question I see with a lot of new TypeScript developers is how to handle runtime validation of data, using the types they have built.

The issue is that the web platform, as yet, does not support types. TypeScript itself is a higher-level language built on top of JavaScript that uses a compiler to create compatible code for the web, node or other JS platforms - this means that types are only available at design time.

Most developers have a method or form in their code where they want to validate that the data being passed in is correct before sending it to another API. This works for hard-coded data in TypeScript, but not for dynamic data from sources such as a form or an API.

The good news is the problem itself has been solved, and there are several solutions for TypeScript such as io-ts or joi - but I find these solutions encourage duplication of types across different domains, as you have to maintain both your types and your validation objects.

Introducing JSON Schema

A much simpler way to maintain both types and validation within a project is to use a single source of truth. The main option for this is JSON Schema.

A JSON Schema file allows you to define types using a JSON file, using a specification defined by the selected draft (at the time of writing, draft 7).

This file can be used to generate types for design-time coding using CLI tools and can be used for data validation at runtime using another library that can consume a schema to generate a validation method.

Schema Example

For this demo, I’ve created a simple schema object defining a customer within a system. The customer’s properties are:

  • ID
  • firstName
  • lastName
  • dateOfBirth
  • email

In this example we set "additionalProperties": false to keep the example simple, but it’s a very flexible option!

If set to true or not included, the outputted types will include an indexable type with a [key: string]: any at the end of the type properties.

You can also pass it properties such as "additionalProperties": { "type": "string" } which will allow only string additional properties to be added.

By setting it to false, only the properties defined will be available on the type, which is what I’ll do for this example:

{
  "$id": "https://tane.dev/customer.json",
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "Customer Record",
  "type": "object",
  "properties": {
    "id": {
      "type": "string",
      "description": "The Customers ID in our system"
    },
    "firstName": {
      "type": "string",
      "description": "The customer's first name."
    },
    "lastName": {
      "type": "string",
      "description": "The customer's last name."
    },
    "email": {
      "type": "string",
      "format": "email",
      "description": "The customers email address"
    },
    "dateOfBirth": {
      "type": "string",
      "format": "date",
      "description": "The customer's date of birth."
    }
  },
  "additionalProperties": false,
  "required": [
    "id",
    "firstName",
    "lastName",
    "dateOfBirth",
    "email"
  ]
}

The first tool we will run this through is the imaginatively titled json-schema-to-typescript! This project will take a valid schema file and generate a file containing the types. From the example above the output is:

json2ts -i customer.json -o customer.d.ts --style.singleQuote

/* tslint:disable */
/**
 * This file was automatically generated by json-schema-to-typescript.
 * DO NOT MODIFY IT BY HAND. Instead, modify the source JSONSchema file,
 * and run json-schema-to-typescript to regenerate this file.
 */

export interface CustomerRecord {
  /**
   * The Customers ID in our system
   */
  id: string;
  /**
   * The customer's first name.
   */
  firstName: string;
  /**
   * The customer's last name.
   */
  lastName: string;
  /**
   * The customers email address
   */
  email: string;
  /**
   * The customer's date of birth.
   */
  dateOfBirth: string;
}

One thing to note is that email and dateOfBirth are string in our type - the format is only used during validation. It is possible to create types for these fields and reference them using a more complex schema.
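For example, a shared definition can be declared once and referenced from a property (a draft-07 sketch):

{
  "definitions": {
    "email": {
      "type": "string",
      "format": "email"
    }
  },
  "properties": {
    "email": { "$ref": "#/definitions/email" }
  }
}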

This type can now be imported into other types, and json-schema-to-typescript will do this when you use complex references. For example, if we define an entire customer order type it might look like this:

import { CustomerRecord } from './customer';
import { OrderItem, Checkout, Address } from './shop-front';

export interface CustomerOrder {
  customer: CustomerRecord;
  deliveryAddress: Address;
  billingAddress: Address;
  items: OrderItem[];
  checkout: Checkout;
}

Also, all the properties have been added to the required array. When creating a new customer the data may not yet contain an ID, so you can use the Partial type to accept an incomplete object - if you expect your API to give back a full object you can return a CustomerRecord. You can also use Required where you need to ensure all fields are passed.

import { CustomerRecord } from './customer';

class CustomerClass {
  // `api` is assumed to be an injected service that performs the HTTP calls

  // Return an API request with a customer object
  async addCustomer(customer: Partial<CustomerRecord>): Promise<CustomerRecord> {
    return this.api.save(customer);
  }

  // Return an API request with a customer object
  async updateCustomer(customer: Required<CustomerRecord>): Promise<CustomerRecord> {
    return this.api.update(customer);
  }
}

Validating with the Schema

Now you have types, making the development of your application easier - but we still need to validate that data entered is correct.

One way is to use the same schema on the server side, using your language’s JSON Schema validator, but in this example I’ll use ajv - a JavaScript library that allows a schema to be loaded and data validated against it. The documentation on using it in a JavaScript environment is quite complete, so I won’t repeat it too much here; instead, I’ll build an Angular module that can be provided as a schema validation service.
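Outside of Angular, the core ajv flow is small: add a schema under a key, then validate data against that key. A minimal sketch (customerSchema and data are assumed to be defined elsewhere):

import Ajv from 'ajv';

const ajv = new Ajv({ allErrors: true });
// customerSchema is the schema JSON from earlier; 'customer' is the lookup key
ajv.addSchema(customerSchema, 'customer');

// data is the object being validated
const isValid = ajv.validate('customer', data) as boolean;
if (!isValid) {
  console.log(ajv.errorsText());
}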

First we’ll create the Angular module, into which we inject the Ajv class and allow the user to provide a configuration; the service is provided below. This allows the module to be imported with a configuration, with a service that is injectable throughout your application.

import { NgModule, InjectionToken } from '@angular/core';
import { HttpClientModule } from '@angular/common/http'
import { JSONSchemaService, AJV_INSTANCE } from './json-schema.service';
import ajv, { Ajv, Options } from 'ajv';

export const AJV_CLASS = new InjectionToken<Ajv>('The AJV Class Instance');
export const AJV_CONFIG = new InjectionToken<Ajv>('The AJV Class config');

export function createAjvInstance(AjvClass: any, config: Options) {
  return new AjvClass(config);
}

@NgModule({
  imports: [HttpClientModule],
  providers: [
    JSONSchemaService,
    { provide: AJV_CLASS, useValue: ajv },
    { provide: AJV_CONFIG, useValue: {} },
    {
      provide: AJV_INSTANCE,
      useFactory: createAjvInstance,
      deps: [AJV_CLASS, AJV_CONFIG]
   }
  ]
})
export class JSONSchemaModule {}

Now we create a service - within it we have access to the Ajv instance, which allows the service to be provided with schemas via an Angular HTTP call. Each parsed schema is assigned a name and can be used throughout the app using dependency injection - this service is a good use case for a root service too, which creates a singleton shared within the same application.

import { Injectable, Inject, InjectionToken } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Ajv } from 'ajv';

export const AJV_INSTANCE = new InjectionToken<Ajv>('The AJV Class Instance');

/**
 * The response of a validation result
 */
export interface ValidateResult {
  /**
   * If the result is valid or not
   */
  isValid: boolean;

  /**
   * Error text from the validator
   */
  errorsText: string;
}


@Injectable({
  providedIn: 'root'
})
export class JSONSchemaService {
  constructor(private readonly http: HttpClient, @Inject(AJV_INSTANCE) private readonly ajv: Ajv) {}

  /**
   * Fetches the Schema and adds it to the validator schema set
   * @param name The name of the schema, this will be used as the key to store it
   * @param urlPath The URL path of the schema to load
   */
  public loadSchema(name: string, urlPath: string): void {
    this.http.get(urlPath).subscribe(result => this.ajv.addSchema(result, name));
  }

  /**
   * Validate data against a schema
   * @param name The name of the schema to validate
   * @param data The data to validate
   */
  public validateData<T>(name: string, data: T): ValidateResult {
    const isValid = this.ajv.validate(name, data) as boolean;
    return { isValid, errorsText: this.ajv.errorsText() };
  }
}

Now we can use our service to load JSON schemas into an internal Ajv map and, using the key, validate a data object against a schema. The service could be used alongside a form, in any method on a service, or to check the result of one API before passing data to another.

Here is a simple example of how it could be used in a form component (the example is shortened - most likely you would load your schemas from another service), and how you could validate the parameters passed to a method:

@Component({
  selector: 'my-form-component',
  template: `
    <errors-component *ngIf="error$ | async as errors"></errors-component>
    <form [formGroup]="customerForm" (ngSubmit)="submit()">
      <!-- Customer form in here -->
    </form>
  `
})
export class FormComponent {
  
  error$ = new BehaviorSubject<string>('');

  customerForm = this.fb.group({
    id: [''],
    firstName: [''],
    lastName: [''],
    email: [''],
    dateOfBirth: ['']
  });

  constructor(private readonly fb: FormBuilder, private readonly schema: JSONSchemaService, private readonly app: AppService) {
    this.schema.loadSchema('customer', 'https://tane.dev/customer.json')
  }

  /**
   * In the submit method, we validate the input of a form - this can be on top of, or instead
   * of Angular form validation
   */
  submit() {
    const result = this.schema.validateData('customer', this.customerForm.value);
    if (result.isValid) {
       this.app.updateCustomer(this.customerForm.value);
    } else {
      this.error$.next(result.errorsText);
    }
  }
 
  /**
   * This custom method can take a type of T (which in this case is an `any`) and validate
   * that the data is valid
   */
  customMethod<T = any>(data: T) {
    const result = this.schema.validateData('customer', data);
    if (result.isValid) {
       // Do custom logic
    } else {
      this.error$.next(result.errorsText);
    }
  }
}

Conclusion

I hope you’ve found this article useful in understanding how and where TypeScript can be used to validate data within an application, and how JSON Schema can validate dynamic data.

Please feel free to leave feedback on any issues or improvements - hopefully these examples give a clearer understanding.

For full documentation of JSON Schema, check out the Understanding JSON Schema pages for examples of using allOf, anyOf, oneOf and definitions.

Ngx-EditorJS library for Angular

Published: May 10, 2019 (last updated: May 10, 2019) · ~300 words · ~2 minutes reading time

Today I have published my first full Angular module - @tinynodes/ngx-editorjs - on NPM.

The module is a set of features for Angular (7+) to create and control EditorJS instances.

A demo application is available to see the editor in action, and the source is also available in the @tinynodes monorepo.

Included Features

The library exports several features once the NgxEditorJSModule has been included in your project.

NgxEditorJSDirective

This is the main directive, which can be used on any element with the [ngxEditorJS] selector and an id attribute.

<div id="my-editor" ngxEditorJS></div>

This will handle creating the editor instance via the NgxEditorJSService.

NgxEditorJSComponent

This component can be used in any Angular component using the <ngx-editorjs> tag. Again, this component can take a set of blocks; it also provides a holder input for overriding the ID.

<ngx-editorjs [holder]="holderProperty"></ngx-editorjs>

NgxEditorJSService

This service handles the life-cycle of the EditorJS instances and exposes the underlying EditorJS API - in future releases more of the API will be exposed via service methods to make controlling the editor easier.

The module is configurable and allows EditorJS plugins to be injected into the library - see the README.md for how to do this in an AOT-friendly way.

Next Steps

Before release I had to resolve an issue with AOT compiling and injecting EditorJS plugins - this means version 1.1.0 has already been released. Now this is resolved, I want to support additional features in the editor, as well as provide unit test coverage, better documentation and a better demo.

If you find any issues or have feature requests these can be left on the GitHub Issues page.

Providing injectable features to Angular modules

Published: Mar 14, 2019 (last updated: Mar 16, 2019) · ~1700 words · ~8 minutes reading time

Working with Angular and TypeScript, as I have refactored and re-written components I’ve been learning to take advantage of one of Angular’s more powerful features - dependency injection.

Building Angular components, most developers will have already used this to inject features like the HTTP client or the FormBuilder into a component. A common service example might look like this:

import { Injectable } from '@angular/core';
import { HttpClient, HttpResponse } from '@angular/common/http';
import { Observable } from 'rxjs';

@Injectable()
export class RequestService {
  constructor(private readonly http: HttpClient) {}

  // observe: 'response' returns the full HttpResponse rather than just the body
  public getExample(): Observable<HttpResponse<unknown>> {
    return this.http.get('/api/example', { observe: 'response' });
  }
}

Angular’s own features can be injected via their class identity (e.g. HttpClient). You can also inject any components, services or pipes that are registered within a module.

Angular also provides a mechanism for custom injections, this means items like configurations or instances of external libraries can be passed into any constructor within your module or any item within the dependency tree. To do this the library needs to provide an InjectionToken which represents the key in the dependency tree, and use the Inject decorator to provide an instance into a class constructor (this is covered later in this post).

Creating The Service

I recently worked on refactoring some code where a component handled getting a session object from sessionStorage. When working with Angular’s ng serve, making changes will often result in a refresh, destroying any existing session. This feature allowed session storage to be turned on and kept between refreshes, by letting the component check if a value already existed in the sessionStorage object.

Using the principles in SOLID I moved the code into a service. Having a service do one thing made the code easier to understand. The unit test for it is also smaller and can cover most of the component.

The service has one additional property, storagePrefix, used to make each instance of the service unique to its component.

import { Injectable } from '@angular/core';

@Injectable()
export class ComponentService {
  private storagePrefix = 'my-app';

  public init(storagePrefix: string) {
    this.storagePrefix = storagePrefix;
  }

  public load(key: string): string | null {
    return sessionStorage.getItem(`${this.storagePrefix}-${key}`);
  }

  public save(key: string, value: string): void {
    sessionStorage.setItem(`${this.storagePrefix}-${key}`, value);
  }
}

In its current implementation it’s usable, but it has a few issues. The main one is that it uses sessionStorage but trusts that it exists on the global object - always expecting it to be available is bad practice. It also limits this service to sessionStorage and no other key/value store. It would be better if we could provide the storage to the service instead.

We can also set the storagePrefix via our app configuration, which means our component doesn’t need to understand this configuration (yet!) and avoids us having to call the init method within a component’s ngOnInit.

By making these injectable, we will not only be able to provide our sessionStorage but we can support any module or library that conforms to the Web Storage API. This means we no longer need to rely on the global object, and we have more control over what we inject.
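As a sketch, anything that fulfils the Storage contract will do - for example, a minimal in-memory implementation (hypothetical, useful for tests or environments without sessionStorage):

// Implements the same contract as sessionStorage/localStorage
export class MemoryStorage implements Storage {
  [name: string]: any;
  private store = new Map<string, string>();

  get length(): number {
    return this.store.size;
  }

  clear(): void {
    this.store.clear();
  }

  getItem(key: string): string | null {
    const value = this.store.get(key);
    return value === undefined ? null : value;
  }

  key(index: number): string | null {
    const keys = Array.from(this.store.keys());
    return index < keys.length ? keys[index] : null;
  }

  removeItem(key: string): void {
    this.store.delete(key);
  }

  setItem(key: string, value: string): void {
    this.store.set(key, String(value));
  }
}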

A powerful feature of dependency injection is the ability to pass in instances of non-Angular applications. For example, you could pass the instance of a jQuery application and call its methods from your application (you’ll also want to check out NgZone when running these, but I’ll discuss that in a future post).

This allows for the transition of applications over time within an organisation, rather than having to replace all applications in one go. It’s a good way to safely inject features from third-party libraries and means passing mock versions is easier for tests.

Creating an injectable service and module

Making features injectable

First we need to define the interface of the object we want to use for the configuration. This interface provides a contract for the forRoot method described below, and tells developers what options they can provide to the application.

Our configuration will require a storagePrefix as a string - this is needed to set the name of the store. We also pass an optional Storage instance, for which we can set a default value in our module:

export interface ComponentServiceConfig {
  storagePrefix: string;
  storageInterface?: Storage;
}

Next we will create an InjectionToken - usually a constant representing something we want to provide via dependency injection. You can use any type of object as your token type, and can even provide a factory method to provide dependencies. Check out Tree-shakeable Injection Tokens to see some more advanced uses. In our case we will just have the component config object.

import { InjectionToken } from '@angular/core';
export const COMPONENT_SERVICE_CONFIG = new InjectionToken<ComponentServiceConfig>('COMPONENT_SERVICE_CONFIG');

Now we have the token, we will provide a simple configuration object with it - but a token can be used to inject anything from a string to a function, or even a class instance.
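As an aside, the tree-shakeable form mentioned above lets the token carry its own default factory, so no module provider is needed (a sketch, not used in the rest of this post):

export const COMPONENT_SERVICE_CONFIG = new InjectionToken<ComponentServiceConfig>(
  'COMPONENT_SERVICE_CONFIG',
  {
    providedIn: 'root',
    // The default configuration returned when nothing else provides the token
    factory: () => ({ storagePrefix: 'my-app' }),
  }
);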

Making the Angular module configurable

To make the service configurable, it should be provided via an NgModule - a regular module might look something like this:

import { NgModule } from '@angular/core';
import { CommonModule } from '@angular/common';
import { ReactiveFormsModule } from '@angular/forms';
// Module specific imports ...

@NgModule({
  imports: [CommonModule, ReactiveFormsModule],
  declarations: [FeatureFormComponent, FeatureViewComponent, FeaturePipe],
  exports: [FeatureFormComponent, FeatureViewComponent],
  providers: [ComponentService]
})
export class LibraryModule {}

To make this module accept a configuration we need to add a method that returns an instance of our module with custom providers. Providers are an important part of Angular’s dependency injection, as they are the instances that child components can inject - all will hopefully become clear through the code example.

First we will add a static forRoot method to the class. This method provides a factory function where we can pass a configuration in the application imports.

The factory returns an instance of the module with the provided configuration, and this is passed into the items that use Inject. The module can then be used without forRoot, as long as it’s configured at the root of the application (see SkipSelf).

import { NgModule, ModuleWithProviders, Optional, SkipSelf } from '@angular/core';

// Here we can create a default object
const defaultConfig: ComponentServiceConfig = {
  storagePrefix: '',
  storageInterface: sessionStorage, // In our default value we will trust the global is there
}

@NgModule({
  ...
})
export class LibraryModule {
  constructor(
    @Optional()
    @SkipSelf()
    parentModule: LibraryModule,
  ) {}
  static forRoot(config?: ComponentServiceConfig): ModuleWithProviders {
    return {
      ngModule: LibraryModule,
      providers: [
        {
          provide: COMPONENT_SERVICE_CONFIG,
          // Spreading allows the provided config to be merged over the defaults
          // (this is a shallow operation, so keep the objects simple)
          useValue: { ...defaultConfig, ...config },
        },
      ],
    };
  }
}

Before we can use this in our application we also need to update the ComponentService itself to use this configuration. Here the Inject decorator tells the constructor that the ComponentServiceConfig this service needs will be available via the COMPONENT_SERVICE_CONFIG token.

import { Inject, Injectable } from '@angular/core';

@Injectable()
export class ComponentService {
  constructor(@Inject(COMPONENT_SERVICE_CONFIG) private readonly config: ComponentServiceConfig) {}

  public load(key: string): string | null {
    return this.config.storageInterface.getItem(`${this.config.storagePrefix}-${key}`);
  }

  public save(key: string, value: string): void {
    this.config.storageInterface.setItem(`${this.config.storagePrefix}-${key}`, value);
  }
}

As the config is marked private in the constructor, TypeScript knows to set it as a property on the class. This is the same as writing the following code, and provides a clean shorthand for setting up properties:

export class ComponentService {
  private config: ComponentServiceConfig;

  constructor(config: ComponentServiceConfig) {
    this.config = config;
  }
}

Providing the features in the app

Now that we have our configurable module, the Angular application can be provided with a configuration when importing.

As the module now accepts any Storage type, we’ll instead provide localStorage for this application, alongside the storagePrefix:

@NgModule({
  declarations: [AppComponent],
  imports: [
    BrowserModule,
    RouterModule.forRoot([], routerConfig),
    LibraryModule.forRoot({
      storageInterface: localStorage,
      storagePrefix: 'my-new-app'
    })
  ],
  providers: [...environment.mockProviders],
  bootstrap: [AppComponent]
})
export class AppModule {}

Another example: an NPM library that provides the same interface, such as localStorage-ponyfill, could be used in your application - or you may want to provide a fake to a TestBed configuration:

// Using it in our application
import { createLocalStorage } from "localstorage-ponyfill";
const fakeLocalStorage = createLocalStorage({ mode : "memory" });

@NgModule({
  imports: [
    ...
    LibraryModule.forRoot({
      storageInterface: fakeLocalStorage,
      storagePrefix: 'my-new-app'
    })
    ...
  ]
})
export class AppModule {}

// Or writing some tests

TestBed.configureTestingModule({
  declarations: [TestHostComponent],
  providers: [
    {
      provide: COMPONENT_SERVICE_CONFIG,
      useValue: {
        storageInterface: fakeLocalStorage,
      },
    },
  ]
});

With this technique, any kind of primitive or complex object, method or class can now be passed into your feature modules - it’s a useful way to provide third-party libraries to your modules, configuring not just your services, but also components, pipes and stores.

Any class with a constructor (provided it’s in the same dependency tree) can use Inject with your exported token to access this configuration.
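For example, a component declared in the module could read the configuration directly (a sketch using the names from this post):

import { Component, Inject } from '@angular/core';

@Component({
  selector: 'feature-view',
  template: '<p>Prefix: {{ prefix }}</p>'
})
export class FeatureViewComponent {
  public prefix: string;

  constructor(@Inject(COMPONENT_SERVICE_CONFIG) config: ComponentServiceConfig) {
    // The merged configuration provided by LibraryModule.forRoot
    this.prefix = config.storagePrefix;
  }
}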

Hopefully you will also find this technique useful for improving the configurability and testability of your modules. Feel free to respond on Twitter if you spot any typos or errors.