In order to update a database (Sybase Adaptive Server Anywhere 8), I'm taking all the data from a database on the client's computer with
SELECT * FROM tabelName OUTPUT TO 'C:\path' FORMAT ASCII DELIMITED BY ';';
Then I use those files to update another database that is connected to a server: I load each file into a temporary table, insert any rows that are missing from the main table, and then drop the temporary table.
INPUT INTO tempTable FROM 'C:\path' FORMAT ASCII DELIMITED BY ';';
INSERT INTO mainTable
SELECT *
FROM tempTable
WHERE id NOT IN (SELECT id FROM mainTable);
DROP TABLE tempTable;
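As an aside, newer versions of SQL Anywhere have an ON EXISTING clause on INSERT that expresses the same "insert only if missing" logic without the subquery; I'm not sure it goes back as far as version 8, and it requires a primary key on mainTable, so this is only a sketch:

-- Sketch, assuming a SQL Anywhere version with ON EXISTING
-- and a primary key defined on mainTable(id).
INSERT INTO mainTable ON EXISTING SKIP
SELECT * FROM tempTable;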
So I am wondering if there is a better way to do this, since two of the tables take a very long time to export and import. They do hold a lot of data (~100 columns and ~100,000 rows), but another table with a similar amount of data (though fewer columns) goes very quickly. The two slow tables read roughly 1,000 rows per second, while the others are much faster.
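One thing I suspect is the NOT IN subquery: if mainTable.id is not indexed, the subquery may be re-evaluated per inserted row. A sketch of what I mean (the index name is made up, and this assumes id is not already a primary key):

-- Hypothetical index so the duplicate check becomes an index lookup
-- instead of a scan of mainTable for every row of tempTable.
CREATE INDEX idx_mainTable_id ON mainTable (id);

INSERT INTO mainTable
SELECT t.*
FROM tempTable t
WHERE NOT EXISTS (SELECT 1 FROM mainTable m WHERE m.id = t.id);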
Each client has the database.db file on their laptop. They collect some data and need to submit it, but they have no connection at the time, so they work against their own .db file and update the one on the server later.
Getting the SQL definitions might be hard, since the tables were made a while back and I have no clue how to retrieve them. The destination database is the one that holds all the data, while the source databases are the offline copies that can be used when there is no connection.
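For the record, I've read that SQL Anywhere ships a dbunload utility that can extract the schema as a reload script; something along these lines, where the credentials and paths are placeholders I'd have to fill in for my setup:

rem Sketch: -n should unload the schema only (no data) into reload.sql
dbunload -c "UID=dba;PWD=sql;DBF=C:\path\database.db" -n -r reload.sql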
Any recommendations or tips would be greatly appreciated.