By “burnt to a crisp”, I mean that MediaMonkey bogs down after about 60K tracks or so. By bogs down, I mean it takes multiple seconds per keypress when searching for tracks or artists or what-have-you.
Thanks for explaining. I am not a native English speaker, so sometimes I have to ask.
I don’t know if it is possible at all to find anything faster than MediaMonkey for that purpose. MM’s database is based on MS Access, and for complex hierarchical data I don’t know of any database system that is faster when used properly, even with a lot more than 60k entries.
Actually, I find the speed with my amount of tracks already amazing. MM does a within-search in real time, while you are typing in the search keywords, with delays of around one second until the results table appears. And with that amount of data, a within-search plus building the results table simply does take some time.
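To see why a within-search gets slower with library size, here is a minimal sketch (illustrative Python only, not MediaMonkey’s actual code): a substring match, like Access’s `LIKE '*word*'`, generally cannot use an ordinary index, so every keystroke has to scan all the rows.

```python
# Hypothetical stand-in library of 60,000 track titles.
tracks = [f"Track number {i}" for i in range(60_000)]

def within_search(keyword, titles):
    """Return all titles containing the keyword, case-insensitively.

    A substring ("within") match has to touch every row, so the work
    grows linearly with the number of tracks -- 60k titles means 60k
    comparisons per keystroke.
    """
    kw = keyword.lower()
    return [t for t in titles if kw in t.lower()]

hits = within_search("59999", tracks)
print(hits)  # → ['Track number 59999']
```

Even this naive scan is fast on a modern machine for a single pass; what costs time in practice is repeating it on every keystroke while also rebuilding the results table.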
I mean, for a really bad example of how to do it, have a look at MusicMatch: with 4,000 tracks, search times go up to minutes almost regardless of system performance! So a few seconds with 60k tracks on a 1.8 GHz machine with only 512 MB RAM is rather good in my eyes.
So I guess the best you can do is to boost your system’s performance in processor speed and memory according to the amount of data you are processing.
Just looked up your system configuration. 512 MB RAM? I would start upgrading there and add another 512 MB; I guess that will help a lot.
My 12,500 tracks have a database that is 20 MB in size, so 60k tracks will have roughly 100 MB of data. Add the RAM needed to keep that data in memory for processing and you may very well end up needing much more. So make absolutely sure you have enough RAM that the Access database can be cached in RAM completely. Windows will still swap, but I believe performance will be a lot better with more RAM in your case.
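The estimate above is just linear scaling of the numbers I gave; spelled out (a back-of-envelope sketch, assuming database size grows roughly proportionally with track count):

```python
# Known point: 12,500 tracks -> ~20 MB database file.
tracks_known = 12_500
db_mb_known = 20

# Scale linearly to the 60,000 tracks mentioned above.
tracks_target = 60_000
mb_per_track = db_mb_known / tracks_known      # ~0.0016 MB, i.e. ~1.6 KB per track
db_mb_target = mb_per_track * tracks_target

print(round(db_mb_target))  # → 96, so roughly 100 MB
```

That 100 MB is only the database file itself; the working memory for queries and result tables comes on top, which is why I would not skimp on RAM here.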