Use base64 --decode on the command line? It’s part of coreutils, so it should be on just about every Linux box on the planet. If you’re not using Linux… somebody else might pipe up. /-:
base64 --help
Usage: base64 [OPTION]... [FILE]
Base64 encode or decode FILE, or standard input, to standard output.

With no FILE, or when FILE is -, read standard input.

Mandatory arguments to long options are mandatory for short options too.
  -d, --decode          decode data
  -i, --ignore-garbage  when decoding, ignore non-alphabet characters
  -w, --wrap=COLS       wrap encoded lines after COLS character (default 76).
                          Use 0 to disable line wrapping
      --help     display this help and exit
      --version  output version information and exit

The data are encoded as described for the base64 alphabet in RFC 4648.
When decoding, the input may contain newlines in addition to the bytes of
the formal base64 alphabet. Use --ignore-garbage to attempt to recover
from any other non-alphabet bytes in the encoded stream.

GNU coreutils online help: <http://www.gnu.org/software/coreutils/>
Full documentation at: <http://www.gnu.org/software/coreutils/base64>
or available locally via: info '(coreutils) base64 invocation'
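For example, a quick round trip in the shell (the sample string is just illustrative):

```shell
# Encode a sample string (echo -n avoids encoding a trailing newline).
echo -n 'hello world' | base64
# -> aGVsbG8gd29ybGQ=

# Decode it back to the original bytes.
echo 'aGVsbG8gd29ybGQ=' | base64 --decode
# -> hello world
```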
Unfortunately, if you’re already getting base64 data in the quarantine export, you’re not going to be able to turn it into human-readable data. Most data types should be converted before going into the export in 7.9.4, but it looks like this might be a generic INSERT/UPDATE that we’re not sure how to deserialize?
You can create a new DB connection, copy over the HSQL db files (in the /datacache/<database name> directory in the installation folder), then retry the records and see what it tries to push into the database.
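A rough sketch of the copy step. The install path and connection name below are placeholders, not real values, and you may want to stop the gateway first so the HSQL files aren’t mid-write:

```shell
# Placeholders -- substitute your actual install directory and connection name.
INSTALL_DIR="${IGNITION_DIR:-/usr/local/ignition}"
CONN_NAME="MyConnection"

SRC="$INSTALL_DIR/data/datacache/$CONN_NAME"
DST="/tmp/datacache-copy"

# Copy the HSQL datacache files for that connection to a working directory.
if [ -d "$SRC" ]; then
  cp -r "$SRC" "$DST"
fi
```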
That’s the only way to force these records to deserialize and unpack. Even the data stored in the HSQL db is an (encrypted) BLOB, so it’s not human readable.
Thanks a lot guys, it's been enlightening.
Just out of curiosity, any idea why some data gets saved as a BLOB and some other data gets saved in human-readable form when exporting?
Is there any way to see where these queries are coming from or going to (table in addition to database)? I have a hard time troubleshooting these in a large system.
No. The BLOB itself represents a Java object of some kind - we have to manually register them for deserialization in the export mechanism. If you get in contact with support, they may be able to help - or, at the very least, add some urgency to the ticket to make all first-party quarantine data types export-able in a friendly format.
Bumping this topic from the grave since it's my exact issue.
I have some quarantines stored in the SF system coming from a single named query, and I'm failing to see which values are causing the insert to fail. I understand it's an FK mismatch, but not the specific values or where they are coming from.
Where:
OpID is a UUID string (example '41844571-CBCD-4FDE-B632-BA4857DA100C'). NOTE: the DB stores this as type 'uniqueidentifier' and maybe this is part of the issue.
Code is a positive integer (example 2201)
Timestamp is a datetime (e.g. 2023-08-21 9:53:14 AM)
The database is MSSQL. There's a primary key in the table 'id' that indexes every entry, too.
Is there anything sticking out that would cause the SF quarantine to be base64 instead of human-readable?
You need to take the entire datacache/ folder (or a .zip of it) and open that in Kindling, not the export from the web UI; that web export has already lost too much information to be useful. In the Ignition install directory, under data/datacache/, there will be a folder with the same name as your database connection. Either copy that whole directory somewhere and point Kindling at it (or the .script file within), or .zip it all up and open that with Kindling.
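To find the file to point Kindling at, you can list the HSQL .script files under the datacache (the install path here is a placeholder; there is one folder per DB connection):

```shell
# Placeholder install path -- substitute your own.
INSTALL_DIR="${IGNITION_DIR:-/usr/local/ignition}"

# List the HSQL script files Kindling can open.
find "$INSTALL_DIR/data/datacache" -name '*.script' 2>/dev/null || true
```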
For anyone in the future: another way that may work to find the failed SF query is to search the logs for Store & Forward errors. The log entry has the SQLException included. Not sure if it works every time, but it may be easier than downloading Kindling, pulling the DB script file, etc.
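A sketch of that log search. The wrapper.log path assumes a default install directory; adjust it to wherever your gateway logs actually live:

```shell
LOG="${LOG:-/usr/local/ignition/logs/wrapper.log}"  # placeholder path

# Case-insensitive search for store-and-forward entries.
grep -i 'store.*forward' "$LOG" 2>/dev/null || true

# Or pull out the SQL exceptions directly.
grep -i 'SQLException' "$LOG" 2>/dev/null || true
```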