Hi all,
I have tried to restore the IA Demo from Ignition Exchange. As part of the instructions, I first set up the SQL installation and tried to import the “all_databases.sql” file. While executing the query, the connection is always lost while it is creating ‘Documents’.
Hello @Thaha_Mutheth! The all_databases.sql file is a fairly large self-contained file, so the wait_timeout and max_allowed_packet parameters need to be increased in order to import it successfully.
I suggest running these two lines first before importing the all_databases.sql file:
SET GLOBAL max_allowed_packet=1073741824;
SET GLOBAL wait_timeout = 300;
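Note that SET GLOBAL only affects connections opened after the statement runs, so reconnect (or restart Workbench) before importing. You can confirm the new values took effect with:

```sql
-- Verify the global values before starting the import
SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet';
SHOW GLOBAL VARIABLES LIKE 'wait_timeout';
```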
I have already tried this, but I am still facing the issue. Something interrupts the query execution while it is creating the "document" table under IADemo. I am using MySQL Workbench 8.0.
First, let’s make sure that the process isn’t still running on the client even though the connection timed out. You can check this by going to Server > Client Connections
in MySQL Workbench and making sure there are no processes/queries still running that pertain to the import.
Next, I would recommend increasing the timeout intervals for your MySQL session. That can be found in MySQL Workbench under Edit > Preferences... > SQL Editor > MySQL Sessions;
set the timeout intervals for each section to 600 seconds.
It is also possible that you are running into a timeout after 5 minutes still, if so, I would suggest increasing the timeout to 10 minutes by executing these lines in MySQL Workbench:
SET GLOBAL max_allowed_packet=1073741824;
SET GLOBAL wait_timeout = 600;
SET GLOBAL net_read_timeout = 600;
SET GLOBAL connect_timeout = 600;
After executing the lines above, try importing the all_databases.sql file again and monitor the Client Connections page while the file is being imported. If we can determine that Workbench is losing its connection to your database, we may need to do a command-line restore.
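If Workbench does keep dropping the connection, the command-line restore can be done with the stock mysql client. A minimal sketch, shown as a dry run so you can review it first (the root user and the file path are placeholders for your own credentials and download location):

```shell
# Dry run: print the restore command rather than executing it, since it
# needs a running MySQL server and valid credentials.
# --max-allowed-packet raises the CLIENT-side limit to match the 1 GiB
# server setting; without it, large INSERT statements can drop the connection.
RESTORE_CMD='mysql --max-allowed-packet=1073741824 -u root -p < all_databases.sql'
echo "$RESTORE_CMD"
# When it looks right, run it for real with:  eval "$RESTORE_CMD"
```

Restoring from the command line avoids Workbench’s own session timeouts entirely, which is why it is the fallback when the GUI import keeps failing.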
It’s also worth noting that we’re working on reducing the size of those files. This problem normally happens when you’re installing locally without a fast database server to process the large amount of data in that file. Our update will simply reduce the size of the examples to make them quicker to import, which should bypass the problem entirely.
I hope this is the reason. I was trying to install locally. It would be good if you could split the file by database. I tried locally, but my machine ended up freezing.
It works perfectly. Thanks