Sekrab Garage

Utilizing Angular Tokens

Loading external configurations in Angular Universal

Angular, Design · April 2, 2022

In my post Loading external configurations via http using APP_INITIALIZER, I attempted to load an external configuration via HTTP on the client side. In this post, I am exploring options for SSR.

Find the final result here: StackBlitz

External Remote Configuration

Expanding on the StackBlitz Token Test Project, where the configuration URL had to point to a remote HTTP endpoint: building locally and testing on the server produced identical results, and the project resolve worked as expected. The only issue: a failure of the remote URL blocked the whole app. This is a pitfall of having a remote config. One way to fix that is as follows:

Slight fix to configuration

We want to distinguish a served configuration, but we do not want to block the UI in case of failure. The project resolve, for example, should decide what to do with the error:

return this.configService.config$.pipe(
  first((n) => n.isServed),
  map((n) => {
    // if served with an error, reroute or notify the user, but do not block them
    console.log(n.withError); // let's introduce this property
    return true;
  })
);

In ConfigService, I will stop distinguishing between success and failure; both count as served. Then, by introducing a withError property, I will set it to true on failure.
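As a sketch, the config model and merge step might look like this (isServed and withError come from the post; apiUrl and the standalone createConfig helper are hypothetical, for illustration only):

```typescript
// Sketch of the config model; apiUrl is a hypothetical example property.
interface IConfig {
  isServed: boolean;
  withError: boolean;
  apiUrl?: string;
}

// Fallback defaults (the static `Config` used by _createConfig below)
const Config: IConfig = { isServed: false, withError: false };

// Mimics _createConfig: merge the response over defaults, then stamp the flags
function createConfig(config: Partial<IConfig>, withError: boolean): IConfig {
  const merged = { ...Config, ...config };
  merged.isServed = true;       // it is served, always
  merged.withError = withError; // true only on failure
  return merged;
}
```

Whether the HTTP call succeeds or fails, the consumer only ever waits for isServed, and inspects withError to decide what to do.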

// after defining withError property in IConfig...
private _createConfig(config: any, withError: boolean): void {
    // cast all keys as are
    const _config = { ...Config, ...(<IConfig>config) };

    // it is served, always
    _config.isServed = true;

    // with error
    _config.withError = withError;

    // set static member
    ConfigService._config = _config;

    // always push the merged config to the subject
    this.config.next(_config);
  }

  loadAppConfig(): Observable<boolean> {
    return this.http.get(environment.configUrl).pipe(
      map((response) => {
        // create with no errors
        this._createConfig(response, false);
        return true;
      }),
      catchError((error) => {
        // if in error, return set fall back from environment
        // and create with errors
        this._createConfig(Config, true);
        return of(false);
      })
    );
  }
This works as expected. However, if the HTTP request fails on the server, Angular will attempt the request again on the client, after rehydration.

External Local Configuration

Move the config files into a localdata folder using the angular.json assets configuration:

"assets": [
  {
    "glob": "*.json",
    "input": "configs",
    "output": "/localdata"
  }
]

The config URL now looks like this: localdata/config.json. It is relative.
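For completeness, a sketch of what the environment file might look like under this setup (the exact shape of your environment object may differ; only configUrl is assumed by this post):

```typescript
// environment.ts (sketch): the config URL is now relative, with no host or protocol
export const environment = {
  production: false,
  configUrl: 'localdata/config.json',
};
```

The production environment would point to config.prod.json instead.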

According to Angular Docs:

If you are using one of the @nguniversal/*-engine packages (such as @nguniversal/express-engine), this is taken care of for you automatically. You don't need to do anything to make relative URLs work on the server.

Well, I get:

GET localdata/config.prod.json NetworkError

I guess what they mean is that, if you go their way of rendering, you are covered. That is, if you use this:

server.get('*', (req, res) => {
  res.render(indexHtml, { req, providers: [{ provide: APP_BASE_HREF, useValue: req.baseUrl }] });
});

But I don't. And I will tell you why, and how. Then I will go through the solution for relative URLs.

Isolating the server

If we follow the documentation, Server-side rendering (SSR) with Angular Universal walks you through building the server inside the src folder and generating it as part of the build process. I find that too obtrusive. Coming from the old school, I cannot sleep well while my server lives inside my development source code. If something on the server goes wrong, I have to build and test? Every time? Not cool.

One good scenario I might post about soon, is serving multilingual Angular app, using the same build.

Let's first reduce the server.ts suggested by the Angular docs to contain only the ngExpressEngine, export it, and create a separate express app.

// server.ts
import { ngExpressEngine } from '@nguniversal/express-engine';
// adjust the import path to wherever your AppServerModule lives
import { AppServerModule } from './src/main.server';

// export the ngExpressEngine
export const AppEngine = ngExpressEngine({
  bootstrap: AppServerModule
});

Building for SSR uses the following angular.json settings:

// ... angular.json
"architect": {
     // ... 
    "server": {
        "builder": "@angular-devkit/build-angular:server",
        "options": {
            // choose the output path where the main.js will end up
            "outputPath": "./host/server", 
            "main": "server.ts",
            "tsConfig": "tsconfig.server.json"
        },
        "configurations": {
            "production": {
                // don't delete because there will be other files
                "deleteOutputPath": false
                // ...
            }
        }
    }
}

The generated main.js will end up in outputPath; let's create a server there and use the exported AppEngine.

// host/server.js
const express = require('express');

// express app
var app = express();

// setup express
require('./server/express')(app);

// setup routes
require('./server/routes')(app);

// other stuff is up to you

// listen
var port = process.env.PORT || 1212;
app.listen(port, function (err) {
  if (err) {
    console.log(err);
    return;
  }
  console.log('started listening on port: ' + port);
});

The express module is basic; you can have a look at it on StackBlitz. The routes.js is where the cooking happens:

  • PS: I cannot test this on StackBlitz; you may want to use __dirname to get accurate paths.

const express = require('express');

// ngExpressEngine from compiled main.js
const ssr = require('./main');

// setup the routes
module.exports = function (app) {
  // set engine, we called it AppEngine in server.ts
  app.engine('html', ssr.AppEngine);

  // set view engine
  app.set('view engine', 'html');

  // set views directory
  app.set('views', '../client');

  // expose the configs path as localdata (or whatever you choose to name it)
  app.use('/localdata', express.static('../localdata', { fallthrough: false }));

  // expose client folder
  app.use(express.static('../client'));

  // now THIS
  app.get('/*', (req, res) => {
    // point to your index.html
    res.render(`../client/index.html`, {
      req, // pass request
      res, // pass response
      // here, we can provide things for ssr
    });
  });
};

In res.render, I passed the response and request back in case I want to use them in Angular. (It's rare, but it happens.) So that's the why, and the how.

Provide absolute URLs for local requests

A local request is one like our localdata/config.prod.json. To fix it, it must be prefixed with the server URL. The final result in ConfigService should look like this:

  loadAppConfig(): Observable<boolean> {
    // fix the url first if we are on the server (pseudocode)
    let url = environment.configUrl;
    if (serverUrlExists) {
      url = serverUrl + url;
    }
    return this.http.get(url).pipe(
     // ... etc
    );
  }

The URL on the server is constructed using the REQUEST injection token, as documented in the NPM packages.

// change ConfigService
// for this line to work, install @types/express
import { Request } from 'express'; 
import { REQUEST } from '@nguniversal/express-engine/tokens';

@Injectable()
export class ConfigService {
  // make it Optional to work on browser platform as well
  constructor(@Optional() @Inject(REQUEST) private request: Request) {}

 loadAppConfig(): Observable<boolean> {
    // fix the url first if it's on the server
    let url = environment.configUrl;
    if (this.request) {
      // on ssr get a full url of current server
      url = `${this.request.protocol}://${this.request.get('host')}/${url}`;
    }
 // ... etc
  } 
}

Since we already provided req in the res.render call, this is sufficient. But it looks ugly. We can create an HTTP interceptor for localdata requests, so that any other localdata call benefits as well. But first:

The curious case of reverse proxy

Without digressing beyond the scope of this post: reverse proxies and load balancers on production servers usually turn https into http, and real.host.com into localhost. The latter we fixed by using req.get('host'), which reads the header. To fix the protocol, we read another header value: x-forwarded-proto.
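The fix can be distilled into a small helper (a sketch; the header names are de-facto proxy conventions, and externalOrigin is a hypothetical function name, not part of any Angular or Express API):

```typescript
// Derive the externally visible origin behind a reverse proxy:
// prefer x-forwarded-proto over the plain protocol, and read the host header.
function externalOrigin(
  headers: Record<string, string | undefined>,
  fallbackProtocol: string
): string {
  const proto = headers['x-forwarded-proto'] ?? fallbackProtocol;
  return `${proto}://${headers['host']}`;
}
```

With no proxy headers present, it simply falls back to the plain protocol, so the same code works locally and behind a load balancer.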

Here is an Azure website example I set up. Notice how the header values differ from the plain request values, because of the cloud hosting setup:

aumet.azurewebsites.net/webinfo

{
    "request": {
        "headers": {
            "host": "aumet.azurewebsites.net",
            "disguised-host": "aumet.azurewebsites.net",
            "x-original-url": "/webinfo",
            "x-forwarded-for": "client-ip-address-here",
            "x-forwarded-proto": "https"
        },
        // on other servers this could be localhost
        "hostname": "aumet.azurewebsites.net",
        "path": "/webinfo",
        // don't read this value
        "protocol": "http"
    }
}

But before I add that to my Angular app: back to being obsessive about separation of concerns, this is not an Angular issue, so it does not belong in the app. I would rather construct the right URL on the server and provide it. Like this:

// in host/server/routes.js
// change the final get
  app.get('/*', (req, res) => {

    // fix and provide actual url
    let proto = req.protocol;
    if (req.headers && req.headers['x-forwarded-proto']) {
        // use this instead
        proto = req.headers['x-forwarded-proto'].toString();
    }
    // also, always use req.get('host')
    const url = `${proto}://${req.get('host')}`;

    res.render(`../client/index.html`, {
      req,
      res,
      // here, provide it
      providers: [
        {
          provide: 'serverUrl',
          useValue: url,
        },
      ],
    });
  });

Back to our Angular app, let's create a proper HTTP interceptor to intercept localdata calls:

// Angular inteceptor
@Injectable()
export class LocalInterceptor implements HttpInterceptor {
  constructor(
    // inject our serverURL
    @Optional() @Inject('serverUrl') private serverUrl: string
  ) {}
  intercept(req: HttpRequest<any>, next: HttpHandler): Observable<HttpEvent<any>> {
    // if request does not have 'localdata' ignore
    if (req.url.indexOf('localdata') < 0) {
      return next.handle(req);
    }

    let url = req.url;
    if (this.serverUrl) {
      // use the serverUrl if it exists
      url = `${this.serverUrl}/${req.url}`;
    }

    const adjustedReq = req.clone({ url: url });
    return next.handle(adjustedReq);
  }
}

Provide the HttpInterceptor in AppModule:

// app.module.ts
providers: [
    {
      provide: APP_INITIALIZER,
      useFactory: configFactory,
      multi: true,
      deps: [ConfigService],
    },
    // provide http interceptor here
    {
      provide: HTTP_INTERCEPTORS,
      useClass: LocalInterceptor,
      multi: true,
    },
  ],

Then clean up ConfigService from any reference to our server. Building and testing: it works.

And what is so nice about this: you can change the server's config.prod.json without restarting the server, and without worrying about polluting other environments and servers. Now I can sleep better.

Providing the config on server

Now that we have a separate server, and the configuration file is no longer remote, why not provide the config directly and inject it into the ConfigService?

// host/server/routes.js
// require the json file sitting in localdata
const localConfig = require('../localdata/config.prod.json');

// setup the routes
module.exports = function (app) {
  // ... inside the final catch-all route:
  app.get('/*', (req, res) => {
    res.render(`../client/index.html`, {
      req,
      res,
      // also provide the localConfig
      providers: [
        {
          provide: 'localConfig',
          useValue: localConfig
        }
        // though don't lose the serverUrl, it's quite handy
      ]
    });
  });
};

In ConfigService

 constructor(
    private http: HttpClient,
    // optional injector for localConfig
    @Optional() @Inject('localConfig') private localConfig: IConfig
  ) {}

  loadAppConfig(): Observable<boolean> {
    // if on the server, grab the config without an HTTP call
    if (this.localConfig) {
      // served directly, so no error
      this._createConfig(this.localConfig, false);
      return of(true);
    }

    return this.http.get(environment.configUrl).pipe(
     // ...
    );
  }

This is the fastest and least error-prone way for the server to get its configuration. But it might be overkill for some. May the force be with you.

Thank you for reading this far into my very long post. I must have made a mistake somewhere; let me know what it was.