Hi Guys,
I'd like to hear your suggestions and advice on a DB design problem in MySQL.
The situation is that I need to come up with a design that delivers high performance under the following conditions:
+ There will be 10,000 users consuming and generating data.
+ Each user generates one record per second, 24 hours a day.
+ The DB must keep records from the current time back to 90 days in the past.
ALL of this will take place in a single table (there is a rough sketch of what I have in mind after the numbers), which leads to the following figures:
+ 86,400 rows per user per day
+ 2,592,000 rows per user per month (30 days)
+ 7,776,000 rows per user over the full 90 days
+ 77,760,000,000 rows total in the main table
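
For reference, this is roughly the kind of table I have in mind. Column names, types, and the partition dates are just placeholders, not a final design:

-- Main table, partitioned by day so that old data can be dropped cheaply.
-- One record per user per second, so (user_id, created_at) is unique.
CREATE TABLE user_records (
    user_id    INT UNSIGNED  NOT NULL,
    created_at DATETIME      NOT NULL,
    payload    VARBINARY(64) NOT NULL,  -- whatever each record carries
    PRIMARY KEY (user_id, created_at)   -- also serves "last N rows for a user" reads
)
ENGINE = InnoDB
PARTITION BY RANGE COLUMNS (created_at) (
    PARTITION p20240101 VALUES LESS THAN ('2024-01-02'),
    PARTITION p20240102 VALUES LESS THAN ('2024-01-03'),
    PARTITION pmax      VALUES LESS THAN (MAXVALUE)
);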
So the question is: what design/tuning strategies would make the scenario described above work while keeping "real-time" response times? Specifically (there is a sketch of the statements I mean right after this list):
+ The DB should let me retrieve the last 1-300 rows inserted by any given user with a response time in the milliseconds.
+ The DB should be able to sustain up to 10,000 inserts per second.
+ There should be some mechanism that automatically deletes rows older than 90 days, to make room for new data and keep the table size bounded.
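
To make that concrete, these are the kinds of statements I expect to run constantly. Again, just a sketch against the placeholder table above, with made-up partition names:

-- "Real-time" read: last N rows for one user. With the (user_id, created_at)
-- primary key this should not have to touch other users' data.
SELECT *
FROM   user_records
WHERE  user_id = 1234
ORDER  BY created_at DESC
LIMIT  300;

-- Retention: rather than DELETE-ing millions of old rows every day, drop the
-- partition holding the oldest day and carve a new one out of pmax.
ALTER TABLE user_records DROP PARTITION p20240101;
ALTER TABLE user_records REORGANIZE PARTITION pmax INTO (
    PARTITION p20240103 VALUES LESS THAN ('2024-01-04'),
    PARTITION pmax      VALUES LESS THAN (MAXVALUE)
);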
I'd really like to hear your advice on this.