What is the storage limit of a table in database

Hi everyone, I have created a table in a database that does nothing more than record the timestamp when a Boolean value goes high. I'd like to record as many rows as possible, but what happens if the storage is full? Would it crash the system, or just stop updating the table? Should I create a script to delete or clean up the table periodically?

Most databases don't have specific limits. You just run out of disk space and the whole database stops working. Depending on the brand, queries may or may not still work.

Most brands document how much disk space a bare row takes on disk. IIRC, the table above, with a 32-bit integer key, would be roughly 21 to 24 bytes per bare row in PostgreSQL. Indices (which are effectively necessary for decent query performance) add more.

Just how many events per year do you expect?
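To put those figures in perspective, here is a back-of-envelope estimate. The per-row size and index overhead below are illustrative assumptions based on the PostgreSQL estimate above, not exact numbers:

```python
# Back-of-envelope storage estimate for a timestamp-logging table.
# Assumes ~24 bytes per bare heap row (32-bit integer key plus an
# 8-byte timestamp, per the estimate above) and that an index roughly
# doubles the footprint -- both figures are assumptions.

BYTES_PER_ROW = 24    # bare row size (assumption)
INDEX_FACTOR = 2.0    # row + index, combined (assumption)

def yearly_storage_mb(events_per_day: float) -> float:
    """Approximate disk usage per year, in megabytes."""
    rows_per_year = events_per_day * 365
    total_bytes = rows_per_year * BYTES_PER_ROW * INDEX_FACTOR
    return total_bytes / (1024 * 1024)

# Even logging one event per second, all year, stays modest:
print(f"{yearly_storage_mb(86_400):.1f} MB")  # about 1.4 GB/year
```

Unless the Boolean is toggling many times per second, it would take decades to make a dent in a modern disk.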


I'm glad you're thinking ahead. Most people don't think about storage space until they hit the limits @pturmel described:

> Most databases don't have specific limits. You just run out of disk space and the whole database stops working. Depending on the brand, queries may or may not still work.

If you can define a practical limit beyond which the data no longer has value, meaning, or a legal retention requirement (e.g., 1 million rows, 6 months, 7 years, 20 years), then build the delete in now, with a configurable retention setting, while you are still in development, rather than leaving a built-in downtime problem for someone else to figure out sometime in the future.
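As a sketch of that configurable cleanup, here is one way it might look. The table name `events` and column name `ts` are hypothetical, and SQLite is used only to keep the example self-contained; the same `DELETE ... WHERE ts < cutoff` pattern applies to any SQL database, run from whatever scheduler you have available (cron, a gateway timer script, etc.):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 180  # configurable retention window (assumption: 6 months)

def purge_old_events(conn: sqlite3.Connection,
                     retention_days: int = RETENTION_DAYS) -> int:
    """Delete rows older than the retention window; return rows removed."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=retention_days)
    cur = conn.execute("DELETE FROM events WHERE ts < ?",
                       (cutoff.isoformat(),))
    conn.commit()
    return cur.rowcount

# Demo with an in-memory database and the hypothetical `events` table:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, ts TEXT)")
old_ts = (datetime.now(timezone.utc) - timedelta(days=400)).isoformat()
new_ts = datetime.now(timezone.utc).isoformat()
conn.executemany("INSERT INTO events (ts) VALUES (?)",
                 [(old_ts,), (new_ts,)])
deleted = purge_old_events(conn)
print(deleted)  # the 400-day-old row is removed; the recent one stays
```

Keeping `RETENTION_DAYS` in one obvious place (or in a config table) is the point: whoever inherits the system can change the policy without hunting through code.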
