It's not always about how many records the query is pulling. Do you have partitioning set up on the machine? How much storage is available?
If it is required, that is one thing, but if it's just nice to have, then it's worth discussing exactly what you need to keep. You really want most tags set up to record history on change, meaning the max time between records is unlimited. As long as your deadband is set correctly, you will still record the changes that are interesting. As it is configured now, you record a row every minute whether or not the value has changed. Perhaps that is needed; in my experience it usually is not.
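To illustrate the difference, here is a rough sketch in plain Python (not the actual historian logic) of why an on-change deadband strategy produces far fewer rows than a fixed 1-minute sample rate:

```python
# Rough sketch (not Ignition's implementation) of on-change recording with
# a deadband: a row is only stored when the value moves far enough.

def should_record(new_value, last_recorded_value, deadband):
    """Record only when the value has moved outside the deadband."""
    if last_recorded_value is None:
        return True
    return abs(new_value - last_recorded_value) > deadband

# Example: a value that sits flat for an hour produces one row, not sixty.
samples = [50.0] * 60          # one reading per minute, value never moves
deadband = 0.5
recorded = []
last = None
for v in samples:
    if should_record(v, last, deadband):
        recorded.append(v)
        last = v
print(len(recorded))           # -> 1 row instead of 60
```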
60 days × 24 hours × 60 minutes × 3 tags is a minimum of 259,200 records in the database (more if a value changes within a one-minute interval). That is not an insubstantial number.
Hopefully you have some type of data pruning configured for the historian. Assuming the worst configuration of no data pruning and no partitioning, roughly five years of collection at that rate works out to >= 7.8M records for just those three tags in a single table. Even at a fairly low estimate of around 100 bytes per row (value, timestamp, tag id, quality, plus index overhead), that is on the order of 750 MB of storage, again for just those three tags.
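The back-of-the-envelope math, with the bytes-per-row figure clearly an assumption:

```python
# Sizing estimate, assuming a worst case of one row per tag per minute.
# BYTES_PER_ROW is an assumption; the real row also carries a timestamp,
# tag id, and quality code plus index overhead, so adjust to your schema.

TAGS = 3
ROWS_PER_DAY = 24 * 60          # one row per minute

sixty_days = TAGS * ROWS_PER_DAY * 60
print(sixty_days)               # 259,200 rows in the 2-month query window

five_years_unpruned = TAGS * ROWS_PER_DAY * 365 * 5
print(five_years_unpruned)      # ~7.9M rows if nothing is ever pruned

BYTES_PER_ROW = 100             # assumed average row + index footprint
print(five_years_unpruned * BYTES_PER_ROW / 1024 ** 2)  # rough MB on disk
```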
The performance issues come in when no indexes have been added to the tables (they aren't there by default in 7.9): the database engine has to trudge through all of that data to gather what's being requested, and that can kill a server that doesn't have enough memory available.
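If you do end up adding indexes by hand, the usual advice is a composite index on the tag id and timestamp columns of each partition table. A small sketch that just prints the statements; the table and column names below follow the typical Ignition historian layout (sqlt_data_<driver>_<year>_<month>, with tagid and t_stamp columns) and should be verified against your own schema before running anything:

```python
# Generate CREATE INDEX statements for the historian partition tables.
# Table and column names are assumptions based on the typical Ignition
# historian schema; confirm them in your database first.

partitions = [
    "sqlt_data_1_2023_01",
    "sqlt_data_1_2023_02",   # one table per month when partitioning is on
]

for table in partitions:
    print(
        f"CREATE INDEX {table}_tagid_tstamp_ndx "
        f"ON {table} (tagid, t_stamp);"
    )
```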
Does the database have data pruning and partitioning configured? Has proper indexing been set up on the tables?
All of that being said, 2 months of data for a client on a machine that is already running a lot of memory-hungry applications might just be too much.