We have a web service that pumps data into 3 database tables, and a web application that reads that data back in a consolidated format. SQL Server + ASP.NET environment.
So much data is being written to and read from those tables, and at such a high rate, that the system has started to fail.
The tables are indexed, one of the indexes unique. One of the tables holds billions of records and occupies a few hundred gigabytes of disk space; the second table is small, only a few million records, and is emptied daily.
Is there an obvious solution to the problem of heavy reads and writes happening at the same time, against the same database tables?
I am open to any optimization suggestion, although it feels like we have already tried every obvious move.
We do not have the option of installing SQL Server Enterprise Edition and using its features.
Edit: the system collects fitness-tracker data from thousands of devices and displays that data to thousands of users on their dashboards in near real time.
The requirements and characteristics given here are too broad for a solid answer, but one suggestion would be to set up a second database and configure log shipping to it. The original database then becomes the "write" database and the new one the "read" database.
Cons:
- Double the disk space
- Reads from the "read" database lag behind the "write" database by the log-shipping interval
Pros:
- You may be able to drop some indexes on the "write" database, which can speed up inserts
- You can then build summary tables in the "read" database to improve query performance
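On the application side, the split above means routing each statement to the right connection string: writes to the primary, reads to the log-shipped copy. A minimal sketch of that routing, using two sqlite3 in-memory databases as stand-ins for the two SQL Server instances (the class name and the `backup()` call simulating a log-shipping restore cycle are illustrative assumptions, not part of the original answer):

```python
import sqlite3

class ReadWriteRouter:
    """Route writes to the primary and reads to the log-shipped copy.

    The two connections are stand-ins for the real "write" (primary)
    and "read" (secondary) SQL Server connection strings.
    """

    def __init__(self, write_conn, read_conn):
        self.write_conn = write_conn
        self.read_conn = read_conn

    def execute(self, sql, params=()):
        # SELECTs go to the secondary; everything else hits the primary.
        is_read = sql.lstrip().upper().startswith("SELECT")
        conn = self.read_conn if is_read else self.write_conn
        cur = conn.execute(sql, params)
        if not is_read:
            conn.commit()
        return cur

primary = sqlite3.connect(":memory:")
secondary = sqlite3.connect(":memory:")
router = ReadWriteRouter(primary, secondary)

router.execute("CREATE TABLE readings (device_id INTEGER, steps INTEGER)")
router.execute("INSERT INTO readings VALUES (?, ?)", (1, 5000))

# Stand-in for one log-shipping restore cycle: copy the primary into the
# secondary (real log shipping restores transaction-log backups instead).
primary.backup(secondary)

rows = router.execute(
    "SELECT steps FROM readings WHERE device_id = ?", (1,)
).fetchall()
print(rows)  # → [(5000,)]
```

Note that, as in the "cons" list, any row written after the last restore cycle is invisible to the router's reads until the next cycle runs, so the dashboard has to tolerate that staleness.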