I have a system with 5 source data stores (mostly PostgreSQL databases), all of which I want to connect to containers running on OpenShift: a library of React applications whose components make AJAX requests for data feeds from a thin server instance that is also running on OpenShift.
So far I have created the following:
- A test server (Flask with `flask_restful` and `sqlalchemy`) that has a direct connection to one of the data stores and can return REST JSON responses for any of the tables in the database being requested (a sketch follows this list)
- A node.js app consisting of two React components (a D3.js visual and a data table) which, on mount/update, consume the JSON response for one of the database's tables
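For concreteness, this is roughly the shape of the test server (it assumes SQLAlchemy 1.4+ reflection; the DSN and the type coercion are simplified placeholders, not my exact code):

```python
# Rough shape of the test server: reflect the database so any table can be
# requested by name, then return its rows as JSON. The DSN is a placeholder.
from flask import Flask
from flask_restful import Api, Resource
from sqlalchemy import create_engine, MetaData, select

app = Flask(__name__)
api = Api(app)

engine = create_engine("postgresql://user:pass@host:5432/feeds")
metadata = MetaData()
metadata.reflect(bind=engine)  # discover every table in the data store

class TableResource(Resource):
    def get(self, table_name):
        table = metadata.tables[table_name]
        with engine.connect() as conn:
            rows = conn.execute(select(table)).mappings().all()
        # Coerce non-JSON types (dates, Decimals) to strings for the response.
        return [
            {k: (v if isinstance(v, (int, float, str, type(None))) else str(v))
             for k, v in row.items()}
            for row in rows
        ]

api.add_resource(TableResource, "/tables/<string:table_name>")
```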
In the production system, my plan is to add in:
- GraphQL, to avoid the over-fetching that I'm currently doing
- A Redis data cache in front of my server, to reduce demand on the data stores (sketched after this list)
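For the Redis piece, what I have in mind is a simple read-through cache in front of the table endpoint. This is only a sketch with redis-py; the key scheme, TTL, and the `fetch_rows` helper are placeholder assumptions:

```python
# Read-through cache sketch with redis-py: serve the cached JSON payload if
# present, otherwise hit the data store once and cache the result.
import json
import redis

cache = redis.Redis(host="redis", port=6379)  # the Redis service on OpenShift

def cached_table_json(table_name, fetch_rows, ttl_seconds=300):
    key = f"feed:{table_name}"  # placeholder key scheme
    hit = cache.get(key)
    if hit is not None:
        return hit.decode("utf-8")
    payload = json.dumps(fetch_rows(table_name))  # fetch_rows hits PostgreSQL
    cache.setex(key, ttl_seconds, payload)  # expire so stale feeds refresh
    return payload
```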
My Question:
As some of the data coming from the data stores can be large (>1M records), what would be the optimal way to get it to the React components? (A keyset-pagination sketch follows the list below.)
- Should I rewrite my server in node.js? (My thinking is that JavaScript is 'native' to JSON, so processing should be faster.)
- I have been looking into BSON and MessagePack, but I'm unclear whether these would play nicely with React components (see the MessagePack sketch after this list).
- Is there a scenario where I could pass data between the OpenShift applications directly, rather than sending large JSON response objects?
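To make the >1M-record question concrete, one option I'm weighing is keyset pagination, so the React components pull pages instead of one huge payload. A minimal sketch, assuming an integer `id` column; the table and endpoint names are illustrative:

```python
# Keyset-pagination sketch: the client passes the last id it saw and a page
# size, and the server returns the next page plus a cursor for the following one.
from flask import Flask, jsonify, request
from sqlalchemy import create_engine, MetaData, select

app = Flask(__name__)
engine = create_engine("postgresql://user:pass@host:5432/feeds")
metadata = MetaData()
metadata.reflect(bind=engine, only=["measurements"])  # illustrative table
measurements = metadata.tables["measurements"]

@app.route("/tables/measurements")
def measurements_page():
    after_id = request.args.get("after_id", 0, type=int)
    limit = min(request.args.get("limit", 1000, type=int), 10000)
    with engine.connect() as conn:
        rows = conn.execute(
            select(measurements)
            .where(measurements.c.id > after_id)
            .order_by(measurements.c.id)
            .limit(limit)
        ).mappings().all()
    return jsonify(
        rows=[dict(r) for r in rows],
        next_after_id=rows[-1]["id"] if rows else None,  # cursor for next page
    )
```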
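And for the binary-format question, serving MessagePack from Flask looks straightforward with msgpack-python; what I can't judge is the decoder on the React side (presumably something like the `@msgpack/msgpack` npm package). A sketch, with `fetch_rows` again a hypothetical helper:

```python
# MessagePack sketch with msgpack-python: same rows, binary encoding, which is
# typically smaller on the wire than the equivalent JSON text.
import msgpack
from flask import Flask, Response

app = Flask(__name__)

@app.route("/tables/<table_name>.msgpack")
def table_msgpack(table_name):
    rows = fetch_rows(table_name)  # hypothetical: returns a list of dicts
    return Response(msgpack.packb(rows), mimetype="application/msgpack")
```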