The second […] is defined by taking the fixed numerical value of the caesium frequency, ΔνCs, the unperturbed ground-state hyperfine transition frequency of the caesium 133 atom, to be 9192631770 when expressed in the unit Hz, which is equal to s−1.
Except the length of a second is different on the Moon because of relativity, so even UTC is wrong.
UTC doesn’t become wrong; you can either just accept a different pace of the clock (i.e., Earth people will be ever so slightly late to a meeting), or treat it as just a different kind of timezone conversion. Better would be to have a single time based on the reference frame of the center of the galaxy and have everyone keep their time relative to that.
Just use a time based on light? Like how the meter is based on the speed of light in a vacuum. Or use an atomic-based time, like how long it takes a hydrogen atom to do something.
That’s actually what’s different on the moon. Relativity and all that means that time itself actually flows differently on the moon than it does on earth.
The actual problem they’re working to solve is around timekeeping and GPS applications in different reference frames, but that’s hard to fit into a short headline.
When I first saw the news I was thinking “there’s no way atoms vibrate differently on the moon,” but you’re right, it’s about perspective, and I’ve realized there’s no way I’m smart enough to handle timezones on an interplanetary scale. I can only hope that the difference between Earth seconds and moon seconds can be expressed as a consistent ratio.
I will gladly use some programming library invented in the basement of a university powered by coffee, and rage.
It’s well understood math, but it’s “only” relativistic orbital mechanics.
It boils down to a pretty consistent number, but how you get there is related to the mass of the moon, how far it is from Earth, and how fast it’s going.
Since the moon is going different speeds at different places in its orbit, the number actually changes slightly over the month.
They’re just using the average though, since it makes life far easier. We use the average for earth too, since clocks move differently at different altitudes or distances from the equator.
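The size of that monthly wobble can be estimated with back-of-the-envelope orbital mechanics. This is only a rough sketch, not any agency’s actual method: it takes standard values for Earth’s gravitational parameter and the Moon’s orbit, uses the vis-viva equation for the Moon’s speed at perigee and apogee, and compares just the velocity (special-relativistic) part of the clock-rate offset, ignoring the gravitational-potential part.

```python
# Back-of-the-envelope: how much the velocity part of the Moon's clock-rate
# offset swings between perigee and apogee. Standard orbital constants;
# the gravitational-potential term is ignored for simplicity.
GM_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
C = 299_792_458.0          # speed of light, m/s
A = 3.844e8                # Moon's semi-major axis, m
E = 0.0549                 # Moon's orbital eccentricity
SECONDS_PER_DAY = 86_400

def v_squared(r):
    """Vis-viva: orbital speed squared at distance r from Earth."""
    return GM_EARTH * (2 / r - 1 / A)

r_perigee = A * (1 - E)
r_apogee = A * (1 + E)

# Fractional clock-rate term from velocity alone: v^2 / (2 c^2).
rate_swing = (v_squared(r_perigee) - v_squared(r_apogee)) / (2 * C**2)
swing_us_per_day = rate_swing * SECONDS_PER_DAY * 1e6  # microseconds per day
# roughly a tenth of a microsecond per day
```

The swing is on the order of 0.1 µs per day, a couple of orders of magnitude smaller than the offset itself, which is why averaging over the month is good enough for most purposes.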
It’s not too bad. Relativity says that no frame of reference is special.
On earth, a second looks like a second, but a second on the moon looks too quick.
On the moon, the second looks like a second, but a second on earth looks too slow.
Both are actually correct. The simplest solution is to declare one of them to be the base reference; in this case, the Earth second. Any lunar colonies will just have to accept that their second is slightly longer than they think it should be.
If it helps, the difference is tiny: the second is about 6.5x10^-10 seconds longer, which works out to roughly 56 microseconds per 24 hours. It won’t affect much for a long time. About the only thing affected would be a lunar GPS.
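Those two figures are consistent with each other; multiplying the quoted rate offset by the number of seconds in a day gives the daily drift:

```python
# Fractional rate offset between lunar and terrestrial clocks
# (the figure quoted above; treat it as approximate).
RATE_OFFSET = 6.5e-10
SECONDS_PER_DAY = 86_400

drift_us_per_day = RATE_OFFSET * SECONDS_PER_DAY * 1e6  # microseconds per day
# about 56 microseconds per 24 hours
```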
Galactic center is the frame to use for any space travel.
Unfortunately, it’s not a useful one. While we know approximately where it is, we don’t know how deep the gravity well is. That gravity well slows the passage of time, just as Earth’s does. Without an exact mass and mass distribution, we can’t calculate the correction factor.
I guess it’s fine, just keep updating it, like the speed of light, which was measured more precisely after the creation of the meter, and the definition got updated too.
https://en.wikipedia.org/wiki/Second
Doesn’t matter for relativity though; whatever atom you base it on, the relativistic change is the same.
So are you saying that a caesium-133 atom observed on both the Earth and the Moon to oscillate 9,192,631,770 times will not represent the same absolute span of time?
So, one observer will see those oscillations happen faster than the other?
Does this have to do with the specific gravity fields of both observers, in that those fields affect how the atom oscillates?
Or is there something else I’m missing?
If special relativity is the answer, all good. I’m an electrical engineer trained in classical physics, so I’ll rest knowing that I’d probably need to study that to understand the time differences.
Not quite. In each observer’s frame of reference, time appears to pass the same; it’s only when you try to reconcile the time between two objects that are not at rest with respect to each other that relativity shows up.
Basically, when you bring someone back to Earth, the observers will find that their watches don’t match up, even though both observers experience time passing normally (because each observer is by definition at rest with respect to their own frame of reference).
TL; DR: Relativity is a pain in the ass and makes no sense in everyday terms.
edit: disclaimer - I am not a physicist and have not taken physics classes in a decade plus, but I do teach science at a college. I’m going mostly on half-remembered lectures and some random one-off discussions I’ve had with my buddy in the physics department over the past few years.
It’s that relativity thing where each person will see the oscillations happening correctly, but when they look at what the other person did, the answer will seem wrong.
The difference is small enough that it really only matters if you’re NASA and building moon GPS. MPS?
I vote for LPS, Lunar Positioning System, vs our Global one.
Yep, and the math gives different results based on if you’re on the moon or on earth.
No, the second is still 9,192,631,770 hyperfine transitions of Cs-133 on the moon, and that’s the same length of time, at least unless you want to severely annoy physicists by implying that the laws of nature aren’t constant throughout the universe. It’s just that from our perspective it looks like time is flowing differently there.
You are correct that if you are on the moon and have a Cs-133 atom with you, a second will take that many transitions. And if you do the same thing on Earth, a second will take the same number of transitions.
But things get weird when you are on Earth and observe a Cs-133 atom that is on the moon. Because you are in different reference frames (traveling at different speeds and sitting in different gravity wells), time is moving at different rates. This means that, from your point of view on Earth, a local Cs atom will transition a different number of times in a second than one you are observing on the moon.
And it would all be reversed if you were on the Moon observing a clock back on the Earth.
They already have to account for this with GPS satellites. They all have atomic clocks on them, but they don’t run at the same speed as clocks on the ground. The satellites are moving at great speed and are further from the center of the Earth than us, so the software that calculates the distance from your phone to a satellite has to use Einstein’s equations to account for the change in the rate of time.
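The two GPS corrections pull in opposite directions, and both can be estimated from first principles. This sketch assumes a circular orbit and ignores Earth’s rotation, so the numbers are approximate; the commonly cited figures are about -7, +45, and +38 microseconds per day.

```python
# Estimate the two relativistic clock corrections for a GPS satellite.
# Circular orbit assumed, Earth's rotation ignored, so these are rough.
GM_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
C = 299_792_458.0          # speed of light, m/s
R_EARTH = 6.371e6          # mean Earth radius, m
R_GPS = 2.6571e7           # GPS orbital radius, m
SECONDS_PER_DAY = 86_400

v_sq = GM_EARTH / R_GPS                            # circular-orbit speed squared
sr = -v_sq / (2 * C**2)                            # velocity term: satellite clock runs slow
gr = GM_EARTH * (1 / R_EARTH - 1 / R_GPS) / C**2   # gravity term: satellite clock runs fast

sr_us, gr_us, net_us = (x * SECONDS_PER_DAY * 1e6 for x in (sr, gr, sr + gr))
# sr_us ~ -7 us/day, gr_us ~ +46 us/day, net_us ~ +38 us/day
```

Because the gravitational term dominates at GPS altitude, the satellite clocks run fast overall and are deliberately tuned slow before launch to compensate.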
Relativity is weird.
It’s like it’s relative…