Submitted by gwplayer1 t3_11boje4 in askscience
ziptested t1_ja0dw1j wrote
GPS clocks are not reset; they run at an adjusted frequency. In general, on a satellite mission you want to avoid resets and instead be prepared to make adjustments, and not only for time dilation. On a satellite mission I worked on, we had an onboard clock that was the source of mission time in "milliseconds" since the satellite was powered on, shortly before launch. The clock was never adjusted or reset, so time dilation and other errors accumulated. The satellite periodically transmitted a GPS timestamp along with the mission time to mission control, and based on that, mission control uploaded a schedule of actions expressed in mission time.
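The scheme described above — a free-running onboard counter mapped to GPS time on the ground via downlinked (mission time, GPS timestamp) pairs — could be sketched like this. All names and numbers are illustrative, not from any real mission:

```python
# Hypothetical sketch: estimate the onboard clock's rate and offset
# relative to GPS time from telemetry pairs, so ground control can
# translate a desired GPS-time action into onboard "mission time".

def fit_clock_model(samples):
    """Least-squares fit of mission_ms ~ rate * gps_s + offset.

    samples: list of (mission_ms, gps_s) downlinked telemetry pairs.
    Returns (rate, offset). The fitted rate absorbs drift and time
    dilation together; no onboard reset is ever needed.
    """
    n = len(samples)
    sum_g = sum(g for _, g in samples)
    sum_m = sum(m for m, _ in samples)
    sum_gg = sum(g * g for _, g in samples)
    sum_gm = sum(g * m for m, g in samples)
    rate = (n * sum_gm - sum_g * sum_m) / (n * sum_gg - sum_g ** 2)
    offset = (sum_m - rate * sum_g) / n
    return rate, offset

def gps_to_mission_ms(gps_s, rate, offset):
    """Predict the onboard mission-time reading at a given GPS time."""
    return rate * gps_s + offset
```

With a handful of recent telemetry pairs, ground control can schedule an action at any GPS time by uploading the predicted mission-time value, letting the onboard clock drift freely.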
Sammy81 t1_ja33zcs wrote
That’s unusual. Usually the bird has a GPS receiver. In your case, the satellite clock usually has an epoch from which it counts time (time zero, set to something like 1958). This time is then adjusted by a parameter called deltaUT1, a correction with tenth-of-a-second resolution based on Earth’s rotation. The time dilation due to relativistic effects is minor, but an onboard clock that is never synced or corrected will drift a noticeable amount within just a few months from ordinary clock error. I would think mission control would have constant issues commanding it to take data at certain times (if it were an imaging bird, for example) unless they synced the clock periodically.
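The epoch-plus-deltaUT1 arrangement described above might look roughly like this. This is an illustrative sketch, not actual flight software; the 1958 epoch comes from the comment, and leap-second handling is deliberately ignored:

```python
from datetime import datetime, timedelta, timezone

# Example epoch from the comment above; real systems vary.
EPOCH = datetime(1958, 1, 1, tzinfo=timezone.utc)

def onboard_to_ut1(seconds_since_epoch, delta_ut1):
    """Convert onboard elapsed seconds to an approximate UT1 timestamp.

    delta_ut1: uploaded UT1 - UTC correction in seconds (kept within
    +/-0.9 s, at 0.1 s resolution). Leap seconds are ignored here.
    """
    utc = EPOCH + timedelta(seconds=seconds_since_epoch)
    return utc + timedelta(seconds=delta_ut1)
```

Ground control would refresh delta_ut1 periodically, since Earth's rotation rate is not predictable far in advance.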
parikuma t1_ja3olmd wrote
Pardon my ignorance, but how come they don't use TAI for precision?
Sammy81 t1_ja6ej2y wrote
I was not familiar with TAI until your question, but I googled it. It looks to me like TAI is the most accurate GPS time. The problem with GPS time is that it is not “Earth time”: GPS time is independent of the position of the Earth. Since the Earth’s rotation is slowing over time, GPS time deviates from it, running ahead as the Earth slows. UTC takes this into account: leap seconds slow GPS time to align it with “Earth time”, the time it takes the Earth to rotate once. Since satellites are often concerned with observing the Earth, or communicating with it, it’s important for them to stay aligned with the actual Earth rotation, so UTC is more useful. One of my first assignments was making it easy for ground control to upload deltaUT1 and leap seconds to our satellite (Calipso) so that its science data was accurately tied to the ECEF reference frame.
Coomb t1_jadvbz3 wrote
As a matter of technical fact, it’s not GPS time that gets adjusted with leap seconds; it’s UTC. From a user perspective the difference usually isn’t meaningful, because you probably want to convert between GPS time and UTC, and for that use case it doesn’t matter which parameter you add or subtract the offset from. But the satellites don’t periodically update the time they broadcast to align with UTC; they’ve been counting seconds as accurately as they can since they started broadcasting. Instead, they broadcast, in the GPS navigation message, the offset, in integer seconds, from UTC. If you are reading time directly from a GPS message, you never have to worry about it repeating or skipping an increment, whereas UTC technically could do either.
E: to be clear, the GPS control segment routinely updates the clocks on the satellites to maintain synchronization tight enough to meet the GPS specified error budget, but these adjustments are transparent to users and never anywhere close to an entire second.
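The GPS-to-UTC conversion described above — continuous GPS time plus a broadcast integer leap-second offset — can be sketched as follows. The function names are illustrative; only the GPS epoch (1980-01-06) and the week/time-of-week convention are standard:

```python
from datetime import datetime, timedelta, timezone

# GPS time starts at this epoch and never skips or repeats a second.
GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)

def gps_to_utc(week, tow_s, delta_t_ls):
    """Convert GPS week + time-of-week to UTC.

    delta_t_ls: the integer leap-second offset (GPS - UTC) broadcast
    in the navigation message. When a leap second is added to UTC,
    only this broadcast value changes; GPS time itself is untouched.
    """
    gps_time = GPS_EPOCH + timedelta(weeks=week, seconds=tow_s)
    return gps_time - timedelta(seconds=delta_t_ls)
```

Because the offset is carried in the message itself, a receiver needs no external leap-second table to recover UTC.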