I want to insert a large amount of data into a database, reflecting multiple values from a view (2 values from two different tables, 1 int from a dropdown menu, and so on).
My question is whether it is possible to run a named update query where I loop through the data, collect all the values I need, and add them to the parameters for a named update query that looks like:
insert into table (date, value, name) values (Date, int, string)
without having to call the named query for each of the 16k updates. Instead, I am looking for a way to hand over multiple parameters so that the query looks like:
A named query cannot do that (with properly secure parameters). A scripted "Prep" update can construct such an insert with many question-mark parameter placeholders, but typically not 16k at a time. Most drivers have a limit of roughly 2000 to 4000 question marks. But you can definitely send hundreds of rows at a time.
The number of inserts would be dynamic, so that would not get me the result I want. My main goal in avoiding so many separate executions of a named query is to keep some performance.
To reduce the work for the database, is some kind of singleton connection possible, so that there is only one connection for the queries made to the DB?
Otherwise I would have to go with a very ugly solution for this problem.
def build_insert_string(table, data, columns=None):
    """
    Build the query string and the values list for a multi-insert query.
    Use with `system.db.runPrepUpdate`.
    params:
        table (string): The table to insert into
        data (list of dicts): A list containing a dict for each row to insert, where the keys are the column names
        columns (list of strings): The columns to insert. If None, they're deduced from the first row's keys.
    return:
        None if `data` is empty, otherwise a tuple where:
            - the first element is the query string formatted for `runPrepUpdate` (with question marks as placeholders)
            - the second element is the list of values
    """
    if not data:
        return None
    if columns is None:
        columns = data[0].keys()
    marks = "({})".format(','.join("?" for _ in columns))
    marks = ",".join(marks for _ in data)
    col_names = ','.join(columns)
    q = "insert into {} ({}) values {}".format(table, col_names, marks)
    values = [row[c] for row in data for c in columns]
    return q, values
Use it like this:
q, v = build_insert_string('table_name', list_of_row_dicts)
system.db.runPrepUpdate(q, v)
You can dynamically create the SQL for a scripted query. The hundreds of rows in each batch would go through a single connection, and you can run the queries needed to cover all 16k rows sequentially.
Pascal's example just needs a limiter for the total number of question marks.
As long as you use runPrepUpdate (or a named query with a DateTime parameter, though that doesn't really meet your needs in this use case) and the object is a java.util.Date or a subclass thereof, you don't need to do anything to it; Ignition will handle it for you. Simply call system.db.runPrepUpdate("INSERT INTO table (someDateColumn) VALUES (?)", [someJavaUtilDateVariable]) and it will do it all for you.
SQL Server is known to have a limit of just over 2000 question-mark parameters per query (2100, to be precise). Use that to break your complete list of rows into suitably sized batches.
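A minimal sketch of such a limiter (the `batches` name and the conservative `param_limit` default are my own choices, not anything from Ignition itself):

```python
def batches(rows, n_columns, param_limit=2000):
    """Split `rows` into batches whose total '?' placeholder count
    stays at or under `param_limit`. The default of 2000 is an assumed
    conservative value; SQL Server's actual cap is 2100 parameters."""
    rows_per_batch = max(1, param_limit // n_columns)
    for i in range(0, len(rows), rows_per_batch):
        yield rows[i:i + rows_per_batch]
```

Each batch can then be handed to the build_insert_string helper above and executed with a single system.db.runPrepUpdate call, so the 16k rows go through in a few dozen statements over one connection instead of 16k separate executions.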