The way I understand it is the farther away an object is, the faster it is moving away from us, but also the farther away something is, the older it is. So could that mean things were moving apart faster in the past but are slowing down?
Yes, that is what I've always heard, which is why I asked this question. I was hoping for a bit more detail. My assumption is that we measure expansion through red-shift, and distance doesn't matter in that measurement? My thought was that red-shift tells you that light waves have been stretched out, but how do you know when or where most of that stretching occurred?
Think of redshift as:
“Total miles added to your car trip.”
But a distance measurement adds:
“How long you were on the road and how fast you drove at each stage.”
With only mileage, you can’t say whether you sped up or slowed down. With mileage and timestamps, the full story emerges.
I guess that's what I'm getting at. What are the timestamps?
If we see light from really far away, and it has red-shifted, are we assuming the stretching was consistent over the whole distance, or is there more stretching at the beginning or end of the trip? If the expansion is accelerating, then more of the red-shift happened recently, as the light got closer to us, since farther away also means older.
My thought was: what if most of the red-shift happened while the light was still far away from us, meaning it happened a long time ago?
Like you mentioned, some kind of timestamp would tell us; I just don't know what those timestamps would be. I'm sure there's something obvious I'm missing, but the only way I can think of to measure the difference in red-shift from the same light source at different points in its journey is to measure it from two vastly distant perspectives.
I guess you could measure red-shift for closer objects and compare it to farther objects, and check whether it scales linearly with distance. Would accelerated expansion mean closer objects have a higher ratio of red-shift to distance than the farther objects?
They are cosmic time, inferred from redshift and anchored by independently measured distances.

Redshift does two things at once. It tells us how much the universe has expanded since emission: the scale factor back then was 1 / (1 + z). And it tells us when the light was emitted, because we know how old the universe was at each scale factor, if we assume a specific expansion history. That sounds circular, but it isn't, because the expansion history is solved for by fitting many observations at once.

So the timestamps are: emission time = age of the universe at that redshift, and arrival time = now. These timestamps are inferred consistently across thousands of objects.

Redshift by itself does not tell us where along the path the stretching occurred. That comes from the distance-versus-redshift curve: if most of the redshift had occurred early, the curve would bend the opposite way from what we see. The curve only fits if expansion accelerated late, not early. The missing piece isn't intuition, it's realizing that the distance measurements are the timestamps.

(Dictated this in public with Siri, so forgive any typos.)
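To make "emission time = age of the universe at that redshift" concrete, here is a minimal sketch, assuming a flat Lambda-CDM expansion history with placeholder parameters (H0 = 70 km/s/Mpc, Omega_m = 0.3) picked purely for illustration, not values quoted anywhere in this thread:

```python
# Minimal sketch: the "timestamp" attached to a redshift z is the age of the
# universe when that light was emitted, for an assumed expansion history.
# Flat LambdaCDM with placeholder parameters (not fitted values).
import numpy as np
from scipy.integrate import quad

H0_KM_S_MPC = 70.0            # Hubble constant, km/s/Mpc (assumed)
OMEGA_M = 0.3                 # matter density parameter (assumed)
OMEGA_L = 1.0 - OMEGA_M       # flat universe: dark energy fills the rest

KM_PER_MPC = 3.0857e19        # kilometres in a megaparsec
S_PER_GYR = 3.156e16          # seconds in a gigayear

H0_PER_S = H0_KM_S_MPC / KM_PER_MPC   # H0 in 1/s

def hubble(z):
    """Expansion rate H(z) in 1/s for flat LambdaCDM."""
    return H0_PER_S * np.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)

def age_at_redshift(z):
    """Age of the universe (in Gyr) at emission:
    t(z) = integral from z to infinity of dz' / [(1 + z') * H(z')]."""
    t_seconds, _ = quad(lambda zp: 1.0 / ((1 + zp) * hubble(zp)), z, np.inf)
    return t_seconds / S_PER_GYR

for z in [0.0, 0.5, 1.0, 2.0]:
    a = 1.0 / (1 + z)         # scale factor at emission
    print(f"z = {z:3.1f}  scale factor = {a:.2f}  "
          f"universe was {age_at_redshift(z):5.2f} Gyr old at emission")
```

The point is the dependence on an assumed expansion history: change OMEGA_M or OMEGA_L and every timestamp shifts, which is exactly why the history itself has to be pinned down by independent distance measurements.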
The distance does matter. There are ways of measuring/estimating distances other than red-shift. So basically you plot the distances against the red-shift and look at the shape of the relation: a constant expansion rate predicts one particular curve, and the data at large distances bend away from it. Interestingly, lately the rate comes out different depending on which way of measuring you use (the so-called Hubble tension). Something is probably wrong and nobody knows what. That is exciting, because this is how you discover new things.
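Here is a rough sketch of what that plot comparison looks like, again with placeholder numbers I chose for illustration (flat universe, H0 = 70 km/s/Mpc, Omega_m = 0.3 for an accelerating model versus 1.0 for a matter-only, decelerating one). At the same redshift the accelerating model predicts larger distances, so standard candles look dimmer, and that is the bend in the curve:

```python
# Minimal sketch of "plot distance against redshift": compare the luminosity
# distance predicted by an accelerating model (flat LambdaCDM, Omega_m = 0.3)
# with a decelerating matter-only model (Omega_m = 1.0). Placeholder values.
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458           # speed of light, km/s
H0 = 70.0                     # Hubble constant, km/s/Mpc (assumed)

def luminosity_distance(z, omega_m):
    """Luminosity distance in Mpc for a flat universe with matter fraction omega_m."""
    omega_l = 1.0 - omega_m
    ez = lambda zp: np.sqrt(omega_m * (1 + zp) ** 3 + omega_l)
    comoving, _ = quad(lambda zp: 1.0 / ez(zp), 0.0, z)   # dimensionless integral
    return (1 + z) * (C_KM_S / H0) * comoving

for z in [0.1, 0.5, 1.0]:
    d_acc = luminosity_distance(z, omega_m=0.3)   # accelerating (dark energy)
    d_dec = luminosity_distance(z, omega_m=1.0)   # decelerating (matter only)
    print(f"z = {z:3.1f}   accelerating: {d_acc:6.0f} Mpc   matter-only: {d_dec:6.0f} Mpc")
```

Real analyses do the same comparison with thousands of supernovae and fit the model parameters instead of plugging them in by hand.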
As far as I know, the expansion rate is more or less consistent throughout space.
https://en.wikipedia.org/wiki/Cosmic_distance_ladder
https://en.wikipedia.org/wiki/Distance_measure 😊 (tell me you are new to this field of knowledge)