I've created a system for monitoring software license usage. It stores the usage info in a database and queries it through an ASP.NET website. The components are:

1. MySQL Database (5.5 or so)
2. "Data Capture" Windows service (C#) that queries license servers and inserts usage info into MySQL database
3. ASP.NET Website that queries the database and displays results.

My question is about how I should structure my database. Every 10 minutes, the Data Capture service queries the license servers and adds rows to these two tables:

license_usage
license_pool

Each polling cycle adds roughly 200 to 300 rows to each of those tables. The license_usage table has a foreign key referencing the license_pool table.
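For reference, here's roughly what the two tables look like. This is a simplified sketch, not my exact DDL — the column names are illustrative and the real tables have a few more fields:

```sql
-- Simplified, illustrative schema (actual column names differ).
CREATE TABLE license_pool (
    pool_id      INT UNSIGNED NOT NULL AUTO_INCREMENT,
    feature_name VARCHAR(100) NOT NULL,
    total_seats  INT NOT NULL,
    sample_time  DATETIME NOT NULL,  -- when the Data Capture service polled
    PRIMARY KEY (pool_id)
) ENGINE=InnoDB;

CREATE TABLE license_usage (
    usage_id    INT UNSIGNED NOT NULL AUTO_INCREMENT,
    pool_id     INT UNSIGNED NOT NULL,
    user_name   VARCHAR(50) NOT NULL,
    sample_time DATETIME NOT NULL,
    PRIMARY KEY (usage_id),
    FOREIGN KEY (pool_id) REFERENCES license_pool (pool_id)
) ENGINE=InnoDB;
```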

The problem is that these tables have grown very large, and some of the queries I write for the ASP.NET website take far too long to run. The Data Capture service has been running for about 5 or 6 months; at 200-300 rows every 10 minutes, that works out to roughly 30,000-45,000 rows per day, so each table now holds several million rows.
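A typical slow query is an aggregate over a date range. Again, this is illustrative rather than my exact SQL (the column names are placeholders), but it shows the general shape:

```sql
-- Example of the kind of query that has become slow:
-- seats in use per feature per day over the last 30 days.
SELECT p.feature_name,
       DATE(u.sample_time) AS day,
       COUNT(*) AS seats_in_use
FROM license_usage u
JOIN license_pool p ON p.pool_id = u.pool_id
WHERE u.sample_time >= NOW() - INTERVAL 30 DAY
GROUP BY p.feature_name, DATE(u.sample_time);
```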

Are there any best practices I should follow to mitigate the effects of tables that grow this large? I'm also willing to buy a book or two to educate myself in this area.

Any help is greatly appreciated.