Lots in this thread, so I'll just note the Jupyter kernel is essentially just another scripting execution context in Ignition. When it's spun up, it jams the desired context into globals and evals whatever the Jupyter environment hollers at it. There's more to it, but that's the gist. The upside is it gives (IMO) easy, near-full control over that context.
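Very roughly, the idea looks like this. This is just an illustrative sketch, not the module's actual code - `ignition_context` stands in for whatever scripting namespace actually gets injected:

```python
# Illustrative sketch only -- not the module's actual implementation.
# The idea: build a globals dict seeded with the desired Ignition context,
# then exec whatever code the Jupyter side sends.

def make_execution_context(ignition_context):
    # 'ignition_context' is a stand-in for whatever the module injects
    # (e.g. the scripting namespace for the chosen scope).
    ctx_globals = {"__name__": "__jupyter__"}
    ctx_globals.update(ignition_context)
    return ctx_globals

def execute(code, ctx_globals):
    # Statements are exec'd against the shared globals; a real kernel also
    # captures display values, streams, errors, etc.
    exec(compile(code, "<jupyter-cell>", "exec"), ctx_globals)

ctx = make_execution_context({"answer": 42})
execute("print(answer * 2)", ctx)  # -> 84
```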
If you can speak the Jupyter messaging protocol the way a Jupyter server does, you can talk to the kernel. The protocol isn't terribly involved - it's basically a souped-up REPL: you type commands, Jupyter hollers them at the kernel, the kernel executes them and emits replies. The catch is that Jupyter uses ZeroMQ (for now), so if your plugin or environment doesn't natively support that, you'd need to add ZeroMQ support yourself to drive the kernel running in Ignition.
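If you're driving it from Python, the `jupyter_client` package already handles the ZeroMQ transport and message framing. A minimal sketch (the connection-file path here is hypothetical - point it at whatever connection info the Ignition kernel exposes):

```python
# Sketch of driving a Jupyter kernel from plain Python with jupyter_client.
from jupyter_client import BlockingKernelClient

client = BlockingKernelClient()
client.load_connection_file("/path/to/ignition-kernel.json")  # hypothetical path
client.start_channels()
client.wait_for_ready(timeout=10)

msg_id = client.execute("print(1 + 1)")

# Replies come back on two channels: execution status on shell,
# streams (stdout etc.) on iopub.
reply = client.get_shell_msg(timeout=10)
print(reply["content"]["status"])  # e.g. 'ok'

while True:
    msg = client.get_iopub_msg(timeout=10)
    if msg["parent_header"].get("msg_id") != msg_id:
        continue
    if msg["msg_type"] == "stream":
        print(msg["content"]["text"], end="")  # '2'
    if msg["msg_type"] == "status" and msg["content"]["execution_state"] == "idle":
        break
```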
But you can also hook into and extend the kernel for other, non-Jupyter commands if you like - I think it should be as simple as registering another message handler.
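To show what I mean by "another handler" - this is a generic sketch of the dispatch pattern, not the module's actual API. A kernel like this routes each incoming message by its type, so a custom, non-Jupyter command is just one more entry in the routing table:

```python
# Generic sketch of handler dispatch; names and message shapes are illustrative.
from typing import Callable, Dict

class TinyKernel:
    def __init__(self):
        # msg_type -> handler routing table.
        self.handlers: Dict[str, Callable[[dict], dict]] = {
            "execute_request": self.handle_execute,
        }

    def register(self, msg_type: str, handler: Callable[[dict], dict]) -> None:
        # Extending the kernel = adding one more entry here.
        self.handlers[msg_type] = handler

    def dispatch(self, msg: dict) -> dict:
        handler = self.handlers.get(msg["msg_type"])
        if handler is None:
            return {"msg_type": "error", "content": {"evalue": "unknown msg_type"}}
        return handler(msg)

    def handle_execute(self, msg: dict) -> dict:
        # Standard Jupyter-style behaviour: run the code, reply with a status.
        exec(msg["content"]["code"], {})
        return {"msg_type": "execute_reply", "content": {"status": "ok"}}

# A hypothetical non-Jupyter command.
def handle_ping(msg: dict) -> dict:
    return {"msg_type": "ping_reply", "content": {"text": "pong"}}

kernel = TinyKernel()
kernel.register("ping_request", handle_ping)
print(kernel.dispatch({"msg_type": "ping_request", "content": {}}))
```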