We have a SQL DB that collects analog and digital sensor data from several hundred devices. The archive table is quite simple: just a (datetime) timestamp, (int) stationId, (int) datapointId and (double) value, plus three bool flags.
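For concreteness, the table looks roughly like this (a minimal sketch from memory; the table name and exact column definitions may differ slightly):

```sql
CREATE TABLE dbo.Archive
(
    [Timestamp]  datetime NOT NULL,
    StationId    int      NOT NULL,
    DatapointId  int      NOT NULL,
    [Value]      float    NOT NULL,  -- double precision
    Flag1        bit      NOT NULL,  -- the three bool flags
    Flag2        bit      NOT NULL,
    Flag3        bit      NOT NULL
);
```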
The DB collects about 12 million entries per month. The table only ever grows; no rollbacks, updates or deletions are necessary.
The main purpose of this table is to show archive values in our web visualization, ordered by timestamp and filtered by one or more of the ids, so I created indexes on the fields I filter on. It works, but performance is quite poor: a query filtering by a start/end time, stationId and datapointId takes about 5 seconds. A sketch of a typical query follows below.
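To make the access pattern concrete, a typical visualization query looks roughly like this (a sketch only; the real statement is generated by the application, and the table and parameter names are placeholders matching the DDL above):

```sql
SELECT [Timestamp], [Value]
FROM dbo.Archive
WHERE StationId   = @StationId
  AND DatapointId = @DatapointId
  AND [Timestamp] >= @StartTime
  AND [Timestamp] <  @EndTime
ORDER BY [Timestamp];
```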
What would be a better way to store and query this kind of data? Is SQL Server Standard 2016 perhaps not the right DBMS for such an archive? The original decision to use SQL Server dates back several years, when the amount of data was much smaller than it is today. Is there some magic in SQL Server that I can apply to trade reliability for performance?