Occasional Clock Drifts

Version: 8.0.7

Any idea what might cause these occasional clock drifts? We didn’t see any during our debugging, but when we moved to the final hardware, they started occurring every now and then. Resources all seem fairly low: Windows shows everything under 50%, and the gateway doesn’t even reach half of the RAM we’ve dedicated to it, with roughly 5% CPU usage.

Is this real hardware or a VM?

Is it possible these are actual adjustments to the system time and not a performance issue?

Real hardware, and it is very similar to the other hardware.

It could be, but I did disable Windows’ automatic clock sync, so I’m hoping that was the only thing changing the system clock.
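If you want to double-check that, the Windows Time service can be queried directly from a command prompt (standard w32tm commands):

w32tm /query /status
w32tm /query /source

If the service is stopped, or the source comes back as the local CMOS clock / free-running system clock, Windows itself shouldn’t be stepping the time.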

You can enable GC logging and see if maybe some kind of GC event corresponds to the clock drift entries.

Add this as an additional param in ignition.conf and restart:

-Xlog:gc*:/path/to/gc.log:time,level,tags

Make sure “/path/to/gc.log” actually points somewhere that exists (minus the gc.log of course, that should get created).
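In ignition.conf that usually means adding a wrapper.java.additional entry; a minimal sketch (the index N is a placeholder, use the next unused number in your file, and keep the path pointed at a real directory):

wrapper.java.additional.N=-Xlog:gc*:/path/to/gc.log:time,level,tags

After a gateway restart, a gc.log should start growing in that directory.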


These look very consistent, clustered around 20 minutes past the hour, with a small drift forward.

Is there a gateway script you’re running hourly that may need more time than before? Perhaps a synchronous network access?

EDIT: it could also be something from outside Ignition that’s hogging the CPU or disk in an hourly schedule.

Yeah, I had noticed that too.

I didn’t see anything else outside of Ignition. I don’t have any timed scripts that seem to be hourly.

auto backups maybe? I think those default to every 15 minutes though.

Not enabled on this computer currently.

I did replace the CMOS battery this morning. It’s a long shot, but I figured it was worth trying. The computer was supposedly new, but maybe the customer bought a refurb for us and replacing the battery got missed.

Another thing people do is historize the CPU and memory system tags. This lets you throw them into an easy chart or something and see if the drifts correlate with big memory collections or CPU spikes.
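If historizing them is inconvenient, a gateway timer script can log the same values; a rough sketch for an 8.x gateway (the [System] tag paths below are from memory, so verify them in your own tag browser):

# Log gateway CPU and heap usage so spikes can be lined up with the drift entries.
paths = [
    "[System]Gateway/Performance/CPU Usage",
    "[System]Gateway/Performance/Memory Usage",
    "[System]Gateway/Performance/Max Memory",
]
values = [qv.value for qv in system.tag.readBlocking(paths)]
system.util.getLogger("PerfWatch").info("CPU=%s, MemUsed=%s, MemMax=%s" % tuple(values))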

No scheduled reports or anything? The spacing on the messages is too much to be a coincidence. Something scheduled, either inside Ignition or out, has got to be running…


You get a chance every hour, and you know the minute it will happen, to watch the system and see which component is under heavy load (CPU, disk, RAM or network), and which program it’s coming from.
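If you’d rather not sit at Task Manager waiting for it, a small standalone script can capture the top CPU consumers around the expected minute; a sketch assuming Python 3 with the third-party psutil package installed on the gateway box:

import time
import psutil

INTERVAL = 5    # seconds between samples
SAMPLES = 36    # roughly 3 minutes of coverage around the expected drift time

# Prime the per-process CPU counters; the first cpu_percent() call always returns 0.0.
procs = list(psutil.process_iter(['name']))
for p in procs:
    try:
        p.cpu_percent(interval=None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

for _ in range(SAMPLES):
    time.sleep(INTERVAL)
    usage = []
    for p in procs:
        try:
            usage.append((p.cpu_percent(interval=None), p.info['name']))
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass
    usage.sort(key=lambda t: t[0], reverse=True)
    print(time.strftime('%H:%M:%S'), usage[:5])    # top five CPU users this interval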

Might be worth taking a close look for malware trying to “phone home” every hour.

Hi, I am having a similar issue, but consistently every 15 minutes.

Automatic backups are set to every hour, I have gateway timer scripts at every minute and every hour and they work fine, and I have reports scheduled for twice a day and every hour, and they also work fine.

I don't have any scheduled activity that happens every 15 minutes.

Any suggestions on where I could look for root causes?

Thanks!

Interesting. Something is running every fifteen minutes. Is this a VM, where something else on the same host could be bogging down the physical hardware every fifteen minutes?

This looks a lot like what I've seen elsewhere with hypervisors stealing CPU time from "idle" Ignition VMs.

Physical box. Performance monitoring all looks good. Only the gateway service is running on that box.

Huh. Detailed logging is in order, then. You should turn on GC detail logging via additional Java parameters in ignition.conf.

Consider also reducing the amount of memory Ignition is allowed. The more memory there is churning, the longer GC pauses can be. Find the time of day where the memory sawtooth spikes highest, and set the limit ~20% above that. Make sure both initial heap memory and max heap memory are set to the same value.
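For reference, a sketch of where those two heap values live in ignition.conf (sizes are in megabytes; 2048 here is just an illustration, size it to your own memory sawtooth):

wrapper.java.initmemory=2048
wrapper.java.maxmemory=2048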


I did both things you suggested at 15:30 my time (brought the memory down and added detailed GC logs).

Since then, no clock drifts so far. No gc.log file either, which I am not sure is OK or not, but it looks like it's running with no issues now.

Thanks a lot!

Not OK. There's been some churn in the Java settings for that, so you might need to do some research.
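To spell out the churn (hedged, since the exact behavior depends on the Java version bundled with your gateway): the -Xlog:gc* form above is the unified logging syntax introduced in Java 9, which is what the Java 11 runtime shipped with Ignition 8.x expects; the older Java 8 flags looked like this and shouldn't be mixed in:

-Xloggc:/path/to/gc.log
-XX:+PrintGCDetails

If no gc.log appeared, it may also be worth confirming that the directory in the path exists and checking wrapper.log for a complaint about the parameter.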