I have not seen the slowdown myself, but based on the description given to me…
After filling in a number of tables, the data entry step will slow down. As you know from working with the table, whenever the user presses enter there is a cell edit event. Mine is a bit more complicated in that I do an Information_Schema query to determine the data type being written, and then use that info plus some bounds info that I have in a dataset to determine if the data is acceptable. If the data is acceptable, it is written to the table; if not, it is rejected.
I am wondering if the Information_Schema query could be loading it down. As I have been working on this, I found I had one other Information_Schema query that was running on a relative poll for each table. I have fixed that, but I have no data yet to tell me whether that helped.
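One quick way to get that data would be to time the Information_Schema query itself from the cell edit script. A minimal sketch (the `run_scalar_query` parameter is just a stand-in for `system.db.runScalarQuery`, so the helper can also be tried outside the client):

```python
import time

def timed_scalar_query(query, run_scalar_query):
    # Run the scalar query and report how long it took. In the client,
    # the timing could go to the console or a logger instead of print.
    start = time.time()
    result = run_scalar_query(query)
    elapsed_ms = (time.time() - start) * 1000.0
    print("%.1f ms for: %s" % (elapsed_ms, query))
    return result
```

If the reported times climb as more tables are filled in, that would point at the query rather than the rest of the event script.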
I have been watching the Database Connections status screen on the gateway and see nothing unusual, except that every so often (once every couple of minutes) it will flash up the text of an update query. Should I be able to see the text of a query under Active Queries? I am assuming the transaction history uses an update query to write data, and that is what I am seeing.
Looking at the Gateway status page, it is using 1% of the CPU and 167 MB of RAM. I am still using Java 1.6.0_26, is that okay?
My console has some errors regarding one PLC, something like "Key equals 0: length=378". I will do a search on that.
The Event script I use:
table = event.source.parent.TableName
worktable = event.source
pk = event.source.parent.PrimaryKey
colName = event.source.data.getColumnName(event.column)
data = event.source.data
att_tabledata = event.source.Attributes
att_rowcount = att_tabledata.rowCount

# determine the data type of the field being edited
query = "SELECT DATA_TYPE FROM [Ignition].[INFORMATION_SCHEMA].[COLUMNS] WHERE TABLE_NAME = '%s' AND COLUMN_NAME = '%s'" % (table, colName)
datatype = system.db.runScalarQuery(query)

# we don't have limits for strings, and if the value is -1000 we jump to below
if datatype != 'varchar' and event.newValue != -1000:
    for x in range(att_rowcount):
        rowName = att_tabledata.getValueAt(x, "Variable")
        if rowName == colName:
            break
    # acquire the values needed for boundary checking
    minVal = att_tabledata.getValueAt(x, "Minimum")
    maxVal = att_tabledata.getValueAt(x, "Maximum")
    unit = att_tabledata.getValueAt(x, "Unit")
    # if the entered value is outside the boundaries, reject the entry and pop up a message
    if event.newValue < minVal or event.newValue > maxVal:
        newVal = event.oldValue
        system.gui.messageBox("Valid range is %g to %g (%s)" % (minVal, maxVal, unit))
        system.db.refresh(worktable, "data")
    else:
        newVal = event.newValue
# if the value is -1000, set the field back to Null
elif event.newValue == -1000 or event.newValue == '-1000':
    newVal = None
    system.db.refresh(worktable, "data")
else:
    newVal = event.newValue

# write the data to the database,
# handling the case where there are multiple primary keys
pks = [x.strip() for x in pk.split(",")]
where = ""
for key in pks:
    where += key + "=? AND "
# remove trailing AND
where = where[:-5]
query = "UPDATE %s SET %s=? WHERE %s" % (table, colName, where)
params = [newVal]
for key in pks:
    params.append(event.source.data.getValueAt(event.row, key))
system.db.runPrepStmt(query, params)
system.db.refresh(worktable, "data")
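If the per-edit Information_Schema query does turn out to be the cost, one option would be to cache its result per (table, column), since column data types don't change while the window is open. A minimal sketch, assuming a plain dictionary cache and passing the query function in (`run_scalar_query` stands in for `system.db.runScalarQuery`):

```python
# Cache of (table, column) -> DATA_TYPE, so the INFORMATION_SCHEMA
# query runs at most once per column instead of on every cell edit.
_datatype_cache = {}

def get_datatype(table, col_name, run_scalar_query):
    key = (table, col_name)
    if key not in _datatype_cache:
        query = ("SELECT DATA_TYPE FROM [Ignition].[INFORMATION_SCHEMA].[COLUMNS] "
                 "WHERE TABLE_NAME = '%s' AND COLUMN_NAME = '%s'" % (table, col_name))
        _datatype_cache[key] = run_scalar_query(query)
    return _datatype_cache[key]
```

In the event script, `datatype = get_datatype(table, colName, system.db.runScalarQuery)` would replace the inline query; in Ignition the cache and function would live in a project script module so the dictionary persists between edits.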