Best way to implement shared, external lookup data?

I’m looking for advice on the best way to implement access to shared, external lookup data.

Our application will run as Docker containers, one (well, two: CUBA + Postgres) per client (here, “client” means a client of ours, a customer who pays for our software). But we need certain data shared that doesn’t vary by client: things like lists of procedure codes, national lookups of doctor IDs, and so forth.

These will be read-only for the application (though we’ll also code an application to maintain them). I’m not sure how best to implement this. Also, the connection parameters will differ between development time and runtime, just like the main database does now (which I’m currently handling with war-context.xml).
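
To illustrate, this is roughly how the main data source is declared today, and I assume the shared lookup database would get a similar second `<Resource>` (a sketch only: `jdbc/LookupDS`, the hosts, and the credentials are placeholders, while `jdbc/CubaDS` is CUBA’s default name):

```xml
<!-- war-context.xml (runtime); the development-time context.xml carries
     different hosts/credentials. All values below are placeholders. -->
<Context>
    <!-- main per-client database -->
    <Resource name="jdbc/CubaDS"
              type="javax.sql.DataSource"
              maxTotal="20"
              driverClassName="org.postgresql.Driver"
              url="jdbc:postgresql://db:5432/client_db"
              username="cuba"
              password="cuba"/>

    <!-- shared, read-only lookup database -->
    <Resource name="jdbc/LookupDS"
              type="javax.sql.DataSource"
              maxTotal="10"
              driverClassName="org.postgresql.Driver"
              url="jdbc:postgresql://shared-db:5432/lookups"
              username="lookup_ro"
              password="changeme"/>
</Context>
```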

Ideas/help? 🙂

A few things come to mind; which is best depends on your needs, scale, and scope.

  1. Make your applications connect to an additional data store: one more Postgres database that contains the shared data. Additionally, build an admin application to manage that data. This is the fastest and most straightforward option (see the first sketch below).
  2. A separate database and an application to manage it, but your client applications are not connected to it directly; instead, the data is replicated by a database layer or something running in the background. You need tables of the same structure in your clients’ databases, but if connectivity to the shared database is lost, the client app will still work (see the replication sketch below).
  3. A separate database, a management application, and a published REST interface; your client applications consume the data via the REST API. The advantage of this approach is that other things can read the data too, and you will not need to change anything for a long time. It is more time-consuming than 1 and 2 (see the REST sketch below).
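
For option 1, CUBA can register the shared database as an additional data store out of the box. A minimal sketch of the `app.properties` entries, assuming a store named `lookup` backed by a `jdbc/LookupDS` JNDI resource (the store name, JNDI name, and persistence.xml path are placeholders):

```properties
# Register a second data store named "lookup"
cuba.additionalStores = lookup
cuba.dbmsType_lookup = postgres
cuba.dataSourceJndiName_lookup = jdbc/LookupDS
# Entities mapped to this store live in their own persistence.xml
cuba.persistenceConfig_lookup = com/company/app/lookup-persistence.xml
```

Entities assigned to that store are then read through the usual DataManager; granting the application’s database user only SELECT on the lookup schema keeps it effectively read-only.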
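For option 2, one way to do the background replication is Postgres’s built-in logical replication (available since Postgres 10). A sketch, assuming a shared database named `lookups` and placeholder table names:

```sql
-- On the shared lookup database: publish the lookup tables
CREATE PUBLICATION lookup_pub FOR TABLE procedure_code, doctor_registry;

-- On each client database: subscribe (tables of the same structure
-- must already exist locally)
CREATE SUBSCRIPTION lookup_sub
    CONNECTION 'host=shared-db dbname=lookups user=repl password=changeme'
    PUBLICATION lookup_pub;
```

If connectivity to the shared database drops, the subscription simply catches up once it returns, and the client app keeps reading its local copies in the meantime.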
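For option 3, the client side can be as small as one HTTP call. A minimal consumer sketch using Java 11’s `java.net.http` (the URL, entity name, and token are placeholders; with CUBA’s generic REST API the path would look roughly like the one shown; error handling omitted):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LookupClient {
    public static void main(String[] args) throws Exception {
        HttpClient http = HttpClient.newHttpClient();

        // Hypothetical endpoint of the shared lookup service
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(
                        "https://lookup.example.com/app/rest/v2/entities/lookup$ProcedureCode"))
                .header("Authorization", "Bearer <token>")
                .GET()
                .build();

        HttpResponse<String> response =
                http.send(request, HttpResponse.BodyHandlers.ofString());

        // JSON array of lookup records; parse and cache locally as needed
        System.out.println(response.body());
    }
}
```

Caching the responses locally would keep the client apps usable if the shared service is briefly unreachable, which narrows the gap with option 2.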