Hi, we are dealing with a large project which is expected to grow to 360,000 tags by 2014. I'd like to know the optimal software and hardware architecture, taking into account that redundancy is a must, that the servers will have to establish many connections with Remote Terminal Units (Modbus TCP over GPRS), and that there will be over 20 small projects running on the gateways.
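For context on what each RTU poll actually costs over GPRS: a Modbus TCP request is a tiny fixed-size frame (a 7-byte MBAP header plus a short PDU), so per-poll bandwidth is rarely the problem — latency and connection count are. A minimal sketch of building a Read Holding Registers request with only the Python standard library (function and parameter names are mine, not from any particular SCADA package):

```python
import struct

def read_holding_registers_request(transaction_id: int, unit_id: int,
                                   start_addr: int, count: int) -> bytes:
    """Build a Modbus TCP ADU for function 0x03 (Read Holding Registers)."""
    # PDU: function code (1 byte), starting address (2), register count (2)
    pdu = struct.pack(">BHH", 0x03, start_addr, count)
    # MBAP header: transaction id, protocol id (0 = Modbus), remaining length, unit id
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, len(pdu) + 1, unit_id)
    return mbap + pdu

# Poll unit 17 for 3 registers starting at address 0x006B:
frame = read_holding_registers_request(1, 17, 0x006B, 3)
# The whole request is 12 bytes on the wire.
```

Multiply that by your scan rate and RTU count to size the GPRS links and the server's socket load.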
How often will the tags be changing?
How often will you scan them?
How many simultaneous clients do you need?
Can you offload the data collection from the visualization? i.e., use a FEP (front-end processor)?
History? How much? How often?
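On the FEP question: the point is to decouple the slow, many-connection polling side from the clients, typically through a buffer so the visualization layer never waits on a GPRS link. A toy sketch of that decoupling (names and canned data are illustrative only):

```python
import queue
import threading

# Shared buffer between the polling (FEP) side and the consumer side.
tag_updates: "queue.Queue[tuple[str, float]]" = queue.Queue()

def poll_rtus(reads):
    """Stand-in for the Modbus polling loop; 'reads' is canned sample data."""
    for tag, value in reads:
        tag_updates.put((tag, value))

def consume(n):
    """Visualization/history side: drain n samples from the buffer."""
    return [tag_updates.get() for _ in range(n)]

# One polling thread feeding the queue; the consumer reads at its own pace.
producer = threading.Thread(
    target=poll_rtus,
    args=([("Pump1.Flow", 12.5), ("Pump1.PSI", 87.0)],))
producer.start()
producer.join()
samples = consume(2)
```

In a real deployment the queue boundary would be a network hop (the FEP on its own box), but the shape of the architecture is the same.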
At the end of the day, the hardware costs are tiny compared to the development costs. I'd just get the biggest server I could buy.
A quad-core or six-core Xeon (or two), 8–16 GB RAM, on actual server hardware, would be a safe bet.
In addition to what Kevin said: (Linux + MySQL) or (Windows Server 2012 + SQL Server). Use a hardware RAID card. Enterprise SSD drives if you can afford them, SSD cache if you can't. Tune the stack for your hardware.
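By "tune the stack" I mean things like the InnoDB knobs if you go the MySQL route for history. A rough my.cnf fragment — the values below are placeholders to show which variables matter, not recommendations for your hardware:

```
# my.cnf fragment -- illustrative starting points only
[mysqld]
innodb_buffer_pool_size = 8G        # bulk of RAM on a dedicated DB box
innodb_log_file_size    = 1G        # larger redo log helps sustained inserts
innodb_flush_method     = O_DIRECT  # skip double-buffering through the OS cache
innodb_flush_log_at_trx_commit = 2  # trade ~1s of durability for insert throughput
```

Benchmark with your actual tag-history insert rate before settling on numbers.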