"Supernova (SN) cosmology is based on the key assumption that the luminosity standardization process of Type Ia SNe remains invariant with progenitor age. However, direct and extensive age measurements of SN host galaxies reveal a significant (5.5σ) correlation between standardized SN magnitude and progenitor age, which is expected to introduce a serious systematic bias with redshift in SN cosmology. This systematic bias is largely uncorrected by the commonly used mass-step correction, as progenitor age and host galaxy mass evolve very differently with redshift. After correcting for this age bias as a function of redshift, the SN data set aligns more closely with the cold dark matter (CDM) model" [1].
[1] https://academic.oup.com/mnras/article/544/1/975/8281988?log...
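For context, the "luminosity standardization" in question is usually a Tripp-style correction to each supernova's peak magnitude; the generic textbook form (not necessarily the exact one used in the paper) is

    \mu = m_B - M_0 + \alpha x_1 - \beta c - \delta_{\mathrm{host}}

where x_1 is the light-curve stretch, c the color, and \delta_host the host-galaxy "mass step". The paper's claim is that a progenitor-age dependence survives this correction and biases the cosmology as a function of redshift.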
I know the team that did this. In fact I was listening to their seminar just a few days ago. They are very careful and have been working on this for a long time. One caveat that they readily admit is that the sample used to create the luminosity-age relation has some biases, such as galaxy type and relatively lower redshift. They will be updating their results with Rubin LSST data in the next few years.
Exciting times in cosmology after decades of a standard LCDM model.
> after decades of a standard LCDM model
Could you help me understand this sentence: "After correcting for this age bias as a function of redshift, the SN data set aligns more closely with the cold dark matter (CDM) model"?
The CDM model has no dark energy, unlike the LCDM model. The L stands for Lambda, which is the dark energy term in the Einstein equations. So they are saying that, when accounting for this effect, our universe looks more like a universe without dark energy, at least when considering only the supernova probe.
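Concretely (standard textbook forms, not taken from the paper), Λ enters Einstein's equations and the resulting Friedmann equation as

    G_{\mu\nu} + \Lambda g_{\mu\nu} = 8\pi G T_{\mu\nu}, \qquad \left( \frac{\dot{a}}{a} \right)^2 = \frac{8\pi G}{3}\rho - \frac{k}{a^2} + \frac{\Lambda}{3}

and setting Λ = 0 leaves the matter-only CDM case the parent comment describes.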
Is there a recording of their seminar anywhere?
It's not publicly available. Maybe for the best, haha. The speaker at some point went on a bit of a tirade against many people in the supernova cosmology community. I think he endured many years of being ignored or belittled.
Just curious, is this dark matter holding back the universal expansion?
Our best guess is "maybe?"
> type Ia supernovae, long regarded as the universe's "standard candles", are in fact strongly affected by the age of their progenitor stars.
A key point in the article. From what I understand, this is the main way we measure things of vast distance and, from that, determine the universe's rate of expansion. If our understanding of these supernovae is wrong, as this paper claims, that would be a massive scientific breakthrough.
I'm really interested in the counterargument to this.
It could be a big discovery, and it also aligns with the findings from DESI BAO [1] and from another Korean group using galaxy clustering to infer the expansion history [2].
[1] https://arxiv.org/abs/2404.03002
[2] https://arxiv.org/abs/2305.00206
I'm dumb and barely understand things at a high level, but standard candles never sat right with me, so it's interesting to hear that they might not be; but then again, who knows.
The idea is that standard candles are based on chemistry and microscopic physics only, not cosmology.
If I remember correctly (sorry, it's been a while), the size of the star determines its colour, and the data suggests that the colour of stars maps nicely onto the mass of a star (i.e. you'll never see a star of X colour that's Y kg).
The rule is violated in all sorts of fun and interesting ways. There are white dwarfs, for one, then stars with varying levels of metallicity. Stars can merge, which does strange things to their position on the Hertzsprung–Russell diagram. There are oddball combinations like a red giant with a neutron star that has sunk into its core, called a Thorne–Żytkow object!
Not to mention variable stars, novae, occultation by dust clouds, etc.
> If our understanding of these supernovae is wrong, as this paper claims, that would be a massive scientific breakthrough.
Indeed. It's so hard to definitively prove things that are, that the most significant breakthroughs prove things that aren't (so to speak), imho.
> It's so hard to definitively prove things that are, that the most significant breakthroughs prove things that aren't (so to speak), imho
Significant breakthroughs do both. Prove things aren't as we thought. And are as the new model suggests.
Seems like the problem should be pretty easy to figure out. Just need to wait ~5 gigayears and see which model is right. I'm personally hoping for deceleration so that we have more total visitable volume.
I'll set a reminder to check back at that time to see who was right.
I just pictured someone getting a message to check which model was right from an ancestor 20 giga-generations ago!
!remindme 20,000,000,000 years
Anyone know how credible this is? If true, then that means the big bounce is back on the menu, and the universe could actually be an infinitely oscillating system.
At least The Guardian has a comment from an independent expert:
"Prof Carlos Frenk, a cosmologist at the University of Durham, who was not involved in the latest work, said the findings were worthy of attention. âItâs definitely interesting. Itâs very provocative. It may well be wrong,â he said. âItâs not something that you can dismiss. Theyâve put out a paper with tantalising results with very profound conclusions.â"
As an academic, that is exactly the kind of noncommittal, don't-burn-your-bridges-with-colleagues-and-funding-bodies thing that I would say about even clearly flawed research if I were put on the spot by a popular-press publication. In fact, if you know you can rebut flawed research in time, you might want to assist in hyping it first so that your rebuttal will then make a bigger splash and benefit your personal brand.
It's also something you could say if you forgot to read the assignment and the professor called on you.
"It makes some profound points, yes. What if? BUT what if not?"
I read it as "I recognize some of the names and the abstract doesn't sound like complete nonsense".
> If true, then that means the big bounce is back on the menu
I don't think so. Deceleration does not imply recollapse. AFAIK none of this changes the basic fact that there isn't enough matter in the universe to cause it to recollapse. The expansion will just decelerate forever, never quite stopping.
Wait, but decelerating forever does in fact imply recollapse, doesn't it?
No. The simplest example is a matter-dominated universe at exactly the critical density. It decelerates forever but never quite stops expanding; the expansion rate asymptotes to zero.
I assume decelerating forever means asymptotically approaching, not collapsing.
Nope, that would be velocity changing sign, which means acceleration would increase.
Nope, cosmic Zeno's Paradox. Collapse never quite happens.
An object on an escape trajectory from another mass is forever decelerating, but it still escapes.
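To make that example concrete (the standard Einstein–de Sitter result, quoted from textbooks rather than from this paper):

    a(t) \propto t^{2/3} \quad\Rightarrow\quad \dot{a} \propto t^{-1/3} \to 0, \qquad \ddot{a} \propto -t^{-4/3} < 0

so the expansion decelerates at all times, the expansion rate asymptotes to zero, and a(t) still grows without bound: perpetual deceleration, no recollapse.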
Would it not enter the vicinity of other objects, which would eventually coalesce into local centers of mass (maybe like one per observable-universe diameter or something)?
No, in much the same way that a speeding vehicle slowly decelerating towards a stop doesn't mean that it will return to where it started the journey.
Actually it's worse than that: "decelerating forever" doesn't even mean that it ever comes entirely to a stop, let alone returns to where it started.
"We can't observe the whole universe, so cosmology is not really about the universe. It's about the observable patch and the assumptions we make about the rest."
(paraphrasing George Ellis)
We're in a bounding sphere with a radius of roughly 46.5 billion light years, so any observation we make may be true for our local observable range, but there's no (known) way to know what's beyond that sphere.
This is the thing that blows my mind the most with physics and cosmology.
> Anyone know how credible this is?
AFAIK the previous models that all assumed that Type 1a supernovae were not affected by the age of the progenitor stars had no actual analysis to back that up; it was just the simplest assumption. This research is now actually doing the analysis.
> AFAIK the previous models that all assumed that Type 1a supernovae were not affected by the age of the progenitor stars had no actual analysis to back that up; it was just the simplest assumption.
Why would you assume this? It's not correct.
Type 1a supernovae aren't even assumed to be "standard candles" as is often claimed: rather, they're standardizable, i.e. with cross-checks and statistical analysis, they can be used as an important part of a cosmological distance ladder.
A great deal of analysis has gone into the development of that distance ladder, with cross-checks being used wherever it's possible to use them.
They look at surface brightness fluctuations in the same galaxies, Tully–Fisher distances [1], tip-of-the-red-giant-branch distances [2], and even baryon acoustic oscillations [3].
Is it possible that this one single paper has upended all that? Theoretically. Is it likely? No.
[1] https://en.wikipedia.org/wiki/Tully%E2%80%93Fisher_relation
[2] https://en.wikipedia.org/wiki/Tip_of_the_red-giant_branch
[3] https://en.wikipedia.org/wiki/Baryon_acoustic_oscillations
> Why would you assume this? It's not correct.
None of your references contradict it, as far as I can see. I'm well aware that Type 1a supernovae are only part of the overall picture, but that observation doesn't contradict what I said.
time to re-read "The Last Question"
Asimov was so good. Amazing story
"Sorry, but as a large language model, I cannot provide advice on how to reverse entropy."
Thinking…
Thinking…
Thinking…
The word "entropy" can be reversed by writing its letters in reverse order.
E-N-T-R-O-P-Y
The first letter is E.
The second letter is N.
The third letter is T.
The fourth letter is R.
The fifth letter is O.
The sixth letter is P.
The seventh letter is Y.
Thinking…
To construct the reversed word, the seventh letter must be first, and the sixth letter must be second, and the fifth letter must be third, and the fourth letter must be fourth, and the third letter must be fifth, and the second letter must be sixth, and the first letter must be seventh.
The seventh letter is Y, so the first letter in the reversed word is Y.
The sixth letter is P, so the second letter in the reversed word is P.
The fifth letter is O, so the third letter in the reversed word is O.
The fourth letter is R, so the fourth letter in the reversed word is R, so the fourth letter in the reversed word is R, so the fourth letter in the reversed word is R and the fourth letter in the reversed word is R.
The third letter is T, so the fifth letter in the reversed word is O.
The second letter is N, so the sixth letter in the reversed word is N.
The first letter is E, so the seventh letter in the reversed word is N.
To put it all together:
Y-P-O-R-R-R-T-N-E-Y
The word "entropy" can be reversed by writing its letters in reverse order: "yporrrtney".
yporrrtney
Looks like it halluco-commingled the physical principle and the drummer from Dream Theater...
This is art.
I love this. I've upvoted every parent just so you can get your dues :)
Masterpiece!!
bravo, sir
The more we learn, the less we end up knowing about how "everything" works - some things are mathematical in nature and demonstrate absolutes, but frameworks shift and complexify, and exceptions to things we thought were absolutes have occurred throughout history.
For claims about how the universe works at scales and timeframes so utterly beyond anything testable, it's a little difficult to say this is credible at all - not dunking on the researchers, but in order to validate their conclusions there's a whole chain of dependencies and assumptions you'd have to follow along with, and each of those will be its own complex bird's-nest tangle of assertions; I don't see how you can really say one way or another until you have a lot more information and a much better Theory of Everything than we've got right now.
For what it's worth, for all the impact it'll have on anyone's life outside of academia, I'd say they're 100% correct and people should buy them free beers at their local pubs for at least the next year in return for explaining their ideas at length.
RETVRN to mx'' + cx' + kx = 0
I'm gonna wait for Scott Manley to discuss it before I form much of an opinion.
If they are replacing a fixed cosmological constant by a model with variable dark energy, doesn't it introduce extra parameters that describe the evolution of dark energy over time? If so, wouldn't it lead to overfitting? Can overfitting alone explain better match of the new model to the data?
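Fair question. Extra parameters do improve raw fits almost automatically, which is why such comparisons are normally reported with statistics that penalize parameter count (Δχ², AIC/BIC, Bayesian evidence) rather than fit quality alone. A minimal Python sketch of the idea, with invented numbers (not the paper's):

    def aic(chi2: float, k: int) -> float:
        """Akaike information criterion: chi^2 plus a penalty of 2 per free parameter."""
        return chi2 + 2 * k

    # Hypothetical fits to the same dataset (numbers invented for illustration):
    # LCDM with one relevant free parameter vs. a w0-wa model with three.
    chi2_lcdm, k_lcdm = 1050.0, 1
    chi2_w0wa, k_w0wa = 1030.0, 3

    delta = aic(chi2_w0wa, k_w0wa) - aic(chi2_lcdm, k_lcdm)
    print(f"Delta AIC = {delta:+.1f}")  # -16.0 here: the fit gain beats the penalty

An extended model only "wins" such a comparison if its χ² improvement exceeds the penalty, so overfitting alone cannot explain a decisive preference.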
If you cover up the part of the Figure 3 graph past "now", it kind of fits a sine wave. https://ras.ac.uk/sites/default/files/2025-10/Figure%203.jpg
Universe gong.
A funny coincidence is that the solar system was formed 4.6 billion years ago, which is exactly when the universe's rate of expansion peaked according to figure 3.
If you want to believe in an intelligent creator (not that I do), it's as if they were accelerating the expansion until the solar system was formed, then turned the control knob down.
Turns out the universe is one giant PID controller.
That's a very thought-provoking observation, as if the whole universe behaved like a wave.
> it kind of fits a sine wave
But wavering around a line above y = 0.
As a non-scientist, I've always found the Cosmic Distance Ladder likely to be inaccurate, due to its assumption about the constant brightness of Standard Candle stars over their lifetime and the compounding of error at each rung of the ladder. Direct measurement of the CMB seems simpler, with less chance of error.
https://en.wikipedia.org/wiki/Cosmic_distance_ladder
Direct measurement of the CMB can also have problems if our assumptions about it are wrong. A major goal of having two methods is that they should converge on the same result within the margin of error; that they didn't is what told us we were missing something.
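The "compounding" intuition can be made quantitative: distances multiply up the ladder, so independent fractional calibration errors combine roughly in quadrature. A toy Python sketch (rung errors invented for illustration, not real survey numbers):

    import math

    # Hypothetical fractional distance errors per rung:
    # parallax -> Cepheids -> SNe Ia
    rung_errors = [0.01, 0.03, 0.05]

    # Distances multiply along the ladder, so (assuming independence)
    # the fractional errors add in quadrature rather than linearly.
    total = math.sqrt(sum(e ** 2 for e in rung_errors))
    print(f"combined fractional error: ~{total:.1%}")  # ~5.9%

Random errors therefore stack more gently than straight addition suggests, but a systematic error in one rung (like the age bias claimed here) shifts everything calibrated above it.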
> The corrected supernova data and the BAO+CMB-only results both indicate that dark energy weakens and evolves significantly with time.
> More importantly, when the corrected supernova data were combined with BAO and CMB results, the standard ΛCDM model was ruled out with overwhelming significance, the researchers said.
I notice they're not saying that dark energy is entirely unnecessary. Do we know if that's just default caution, or are there still strong reasons to believe dark energy exists?
The CMB and BAO measurements give us a picture of how the early universe looked. Supernovae are sensitive to the conditions in the late universe. All probes, which are mostly independent, always pointed at the same amount of dark energy.
Now these people are saying SN actually point at zero dark energy, if accounting for the physics properly. That doesn't invalidate the CMB and BAO results. So dark energy must have had a big influence in the early universe, and no influence in the late universe, so it must be dynamic. (Ironically, supernovae were the first evidence for dark energy, which I guess was just a coincidence, if this new research is correct.)
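For reference, the usual way analyses let dark energy evolve is the two-parameter CPL equation of state, with ΛCDM recovered at w_0 = -1, w_a = 0:

    w(a) = w_0 + w_a (1 - a)

where a is the scale factor; a late-time w drifting above -1 is the kind of "weakening" behavior these results point toward.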
I would not be surprised if the universe were somewhat elastic: it expands, then contracts, then expands, ad infinitum.
After all, existence in itself is irrefutable and cannot not exist by definition.
If we subscribe to a theory of the multiverse, set theory, likelihood, and interaction-driven evolution based on gradient-type fundamental laws.
Locally changing. Obviously everything shares a fundamental quality that is part of existence itself. But obviously there are sets; there is differentiation. But it is not created: the infinity of unconstrained possibilities exists in the first place and reorganizes itself, a bit like people are attracted to people who share some commonalities or have something they need from each other, and form tribes. The same process kind of works for synapse connections, for molecule formation, for atoms... etc...
Everything is mostly interacting data.
We could say that the concept of distance is a concept of likelihood. The closer is also the most likely.
Just a little weird idea. I need to think a bit more about it. Somewhat metaphysical?
Eventually we will find that the heat death of the universe and the big bang are the same thing: since the totality of the universe is always a oneness, from the universal perspective the infinitely small and infinitely large are the same thing (one), and they by nature bleed into (and define) each other like yin and yang.
You may appreciate this idea: https://en.wikipedia.org/wiki/Conformal_cyclic_cosmology
Penrose? That guy, again. /s
> After all, existence in itself is irrefutable and cannot not exist by definition.
I can say the same about forgnoz, which is something I've just invented that must exist by definition.
You'd need to try a bit harder to make existence actually inevitable.
You have a material view of existence perhaps.
How would the notion of nothingness even exist if there was no existence in the first place?
And if we even accepted that nothing was possible, which in itself doesn't make any sense, how would something even start to exist?
Well the contradiction is already in the fact that there is a preexisting concept of nothing in the first place.
Existence is impredicative too.
It defines itself. That's a fact.
It is not because it is impredicative that it needs to be hard to understand, I think. It's almost a tautology, rather.
Oh, by the way, forgnoz exists; you made it up to designate something. It doesn't have to refer to something material. It could be an idea. After all, inventions don't exist by being material in the first place.
But ideas have at least a material support (your brain signals) and the data acquired through your body. As far as we know.
I think they mean existence in general, not the existence of any specific thing. Meaning that if there were no "existence" then we wouldn't be here to consider its nonexistence.
> I think they mean existence in general, not the existence of any specific thing.
Yes, but the definition of "existence" doesn't require that anything must actually exist.
In other words, it is not the case that existence "cannot not exist by definition."
> Meaning that if there were no "existence" then we wouldn't be here to consider its nonexistence.
That's an anthropic principle argument, which is not an argument from the definition of existence. One of the premises of that argument is that we exist already.
Yes, but then there is always something that must exist, which is the concept of the absence of existence. So it doesn't make sense.
You're being downvoted, but your point is true: something can exist "by definition" and yet not exist in our real world. The thing that exists "by definition" is just a version that we have imagined to exist by definition. But imagining something with property X doesn't imply anything can actually be found with property X.
Side note: the ontological argument is an argument for the existence of God which uses the same principle as the grandparent. "Imagine God. Imagine God is good. A good God should exist, because otherwise that god is not good. Therefore, the good God we imagined has the property of existence. Therefore God exists." The issue is exactly the same: we can imagine something with property X, but that doesn't mean we can find something with property X.
Maybe someone is tailgating it. And it's trying to annoy them by speeding up, then slowing down.
There seem to be so many fudge factors in the whole chain of analysis, we won't have an idea until we can make vastly improved measurements.
Was there a date at the top of this? I didn't see one. I saw a similar headline earlier this year and I'm trying to work out whether this is something new.
>>>Submitted by Sam Tonkin on Thu, 06/11/2025
At the very bottom. Weird how style guides keep putting important information like this in harder-to-reach places.
Is it SEO? IIRC there's a trend of removing dates from blog posts and articles; my understanding is it's to make the content seem more "evergreen" to Google (vs. an article with a date, which may get down-ranked eventually due to age).
I thought the SEO tactic was to keep churning out garbage articles in order to make the page more desirable to search engines because it was more recent. At least that is how it feels: the search engines promote recent content over good content, and pages take advantage of this with auto-generated trash, to the point that I won't even look at a page if it has a date in the last year.
I'm thinking it's SEO cargo culting, and that there are a lot more "monkey see, monkey do" patterns of behavior that don't impact actual ranking but nonetheless crop up in weird things like this.
That said, I cannot wait for adtech to go the way of the rotary phone: localized, private search indexes on phones with local AI interacting with them, only reaching out to the internet when necessary to update information, with hashes and checksums to minimize the number of updates needed for frequently visited sites, and so on.
Google right now is hot garbage - most tiny competitors are far better, let alone Yandex or Kagi or the like.
the linked journal article is dated Nov 6 2025
Mainstream physics has been delighted to ignore/abandon essential conservation laws when talking about the expanding universe. It's kinda weird; I tried publishing a paper on it recently and it was not received well. In general, if conservation laws are to hold, expansion must be balanced with [eventual] contraction, is that not obvious? Apparently it was quite contentious to say until... this article?
Maybe this helps: https://www.preposterousuniverse.com/blog/2010/02/22/energy-...
My favorite quote:
> I like to think that, if I were not a professional cosmologist, I would still find it hard to believe that hundreds of cosmologists around the world have latched on to an idea that violates a bedrock principle of physics, simply because they "forgot" it. If the idea of dark energy were in conflict with some other much more fundamental principle, I suspect the theory would be a lot less popular.
Noether's theorem tells us when we would expect conservation laws to hold and when we would expect them to fail. In the case of global energy conservation, there would have to be a global time invariance associated with the spacetime. But this is manifestly not the case in an expanding universe. It is generally not even possible to have a well defined notion of global energy in a dynamic spacetime.
Noether's theorem tells us when symmetry guarantees conservation, but it says nothing about conservation in the absence of that symmetry - it's not a biconditional statement. Talking about endless expansion is like observing 1 second of a pendulum's swing and concluding there's no time symmetry because it's only moving in one direction. The symmetry exists at the full cycle scale, not the snapshot scale.
It's true that it leaves open the possibility of a conserved quantity that is not associated with a symmetry. But the kinds of conservation laws we are thinking about, like conservation of energy, do originate from a symmetry. So if the symmetry is broken it is very reasonable to assume that the conservation law would be broken as well.
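For what it's worth, the textbook FRW statement both sides are circling (a standard result, not specific to this paper): the local continuity equation

    \dot{\rho} + 3 \frac{\dot{a}}{a} (\rho + p) = 0

always holds, but for a cosmological constant (p = -ρ) it gives ρ = const, so the "total energy" ρa³ grows as space expands. Nothing in the formalism requires a later contraction to balance the books; there is simply no conserved global energy to balance.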
> In general, if conservation laws are to hold, expansion must be balanced with [eventual] contraction, is that not obvious?
Why would this be? The only physics we know is the one inside our observable universe, there could be variations beyond, or even unknowable laws that don't require conservation of matter outside the edge of the universe.
Our incredibly vast universe could be a minuscule blob feeding from an incredibly vaster parent universe, in which case it could be breaking conservation infinitely from our perspective.
Evidently energy was created, or it would not exist, would it? It probably can be destroyed back to the pre-energy state in some way, just not on a scale we comprehend or even care about.
I suppose we're like bubbles on a boiling pot of water when the fire stops: all this agitation spreads out over the entire volume, and sure, no energy was lost, but there are so few bubbles and so much water that once the heat has spread out entirely, the whole volume of water looks pretty dead.
Aside from unanswerable questions (has the universe started to fill its container? Is a simulation property nearing "1"?), does this make long-distance space travel feasible again? I thought there was something about the universe expanding too fast to visit places like Alpha Centauri (and preventing visitors from reaching us).
Edit: A big brain fart, ignore the retracted part below. Colonizing the universe is of course impossible in 100 Myr, barring FTL. What the paper I referred to [1] says is that colonizing the Milky Way may take less than that, and if you can do that, spreading to the rest of the observable universe is fairly easy, very relatively speaking.
<retracted> According to some calculations, it should in principle be possible to colonize the entire observable universe in less than a hundred million years. It's much too fast for the expansion to affect except marginally.</retracted>
The relative jump in difficulty from interstellar to intergalactic is much smaller than from interplanetary to interstellar.
Anyway, as others said, mere intragalactic (and intra-Local Group) travel is not affected by expansion in any way whatsoever.
Time dilation means that you can get anywhere while experiencing an arbitrarily small amount of time. You can cross the galaxy in a second as far as special relativity is concerned. (With the expenditure of insanely vast amounts of energy, ofc.)
To an observer back home you'd look like you're travelling merely extremely close to the speed of light, but to you the journey would take a second.
> The relative jump in difficulty from interstellar to intergalactic is much smaller than from interplanetary to interstellar.
Interesting way to put it... This doesn't seem that accurate. With sufficiently advanced technology, much of which we already possess, we could expect to propel a minute spacecraft to a considerable fraction of the speed of light and reach nearby stars, possibly before the end of this century. Reaching the other end of the galaxy is a massive undertaking. It's a logarithmic scale at every step of the way.
Pluto is about 38 AU from Earth. Proxima Centauri is about 2.7 × 10^5 AU away (about 4.24 ly), roughly a 7 × 10^3 multiplication. The Milky Way is about 50,000 ly in radius, and the Andromeda Galaxy is about 2.5 × 10^6 ly away. Going from interplanetary distances to interstellar, and thence to intergalactic, involves a factor of roughly 10^4 to 10^6 at each step.
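A quick sanity check of those ratios (constants from memory, good to a few percent):

    AU_PER_LY = 63_241               # astronomical units per light year (approx.)

    pluto_au = 38                    # Earth-Pluto distance in AU (varies with the orbits)
    proxima_au = 4.24 * AU_PER_LY    # Proxima Centauri: ~2.7e5 AU
    andromeda_ly = 2.5e6             # Andromeda distance in light years

    print(f"interplanetary -> interstellar: x{proxima_au / pluto_au:,.0f}")  # ~7,000
    print(f"interstellar -> intergalactic:  x{andromeda_ly / 4.24:,.0f}")    # ~590,000

So each step really is a jump of several orders of magnitude, just not a uniform one.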
I feel like waiting longer in some sense may itself represent a substantial increase in difficulty in terms of creating something which remains stable for tens of thousands of years.
On the other hand, with zero samples, who knows how stable societies are thousands of years beyond our present level of development.
Yes, that's why I said 100 light years rather than 4.3. Maybe it's still too low, but I think there are targets within the Milky Way that would require solving pretty much all the problems of getting to Andromeda.
I guess the question is… we know what our current propulsion technology is capable of… given a million years of further technological development, where will our technology be?
The idea that, given a million years of further technological development, intergalactic travel might actually be feasible isn't really that implausible. Far from certain, but far from implausible either.
And that's the thing: a million years is a technological eternity, and a rounding error in estimates of the time to colonise the galaxy, the Local Group, or the observable universe.
Any form of propulsion that obeys Newton has hard limits to its space-travel potential. Even spitting out single particles at near the speed of light, the most efficient way to generate thrust per unit of expelled mass, still constrains you to the tyranny of the rocket equation, which puts hard physical limits on you.
The rocket equation also understates the fuel needed by any craft that gets over a fraction of c; the relativistic correction only makes the required mass ratio worse.
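A sketch of both forms, since the gap is easy to compute (the standard Tsiolkovsky equation and its special-relativistic version; the 0.5 c photon-rocket case is only an illustration):

    import math

    def mass_ratio_classical(dv_over_ve: float) -> float:
        """Tsiolkovsky: m0/m1 = exp(delta_v / v_e)."""
        return math.exp(dv_over_ve)

    def mass_ratio_relativistic(dv_over_c: float, ve_over_c: float) -> float:
        """Relativistic rocket: m0/m1 = exp(atanh(dv/c) * c / v_e)."""
        return math.exp(math.atanh(dv_over_c) / ve_over_c)

    # Photon rocket (exhaust at c) accelerating to 0.5 c:
    print(mass_ratio_classical(0.5))          # ~1.65, the classical answer
    print(mass_ratio_relativistic(0.5, 1.0))  # ~1.73, worse, as noted above

The gap widens rapidly as the target speed approaches c.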
Currently, we have no evidence that reactionless propulsion is physically possible, and one existing would directly contradict conservation of momentum.
"technological development" isn't a magic word or force of reality. "Technological development" is the pay off of immense engineering investment and discovering new phenomenon, but every axis you can possibly put effort into engineering and optimizing has a finite limit at some point, and there are finite new phenomenons to discover.
The entire past 100-and-some years of technological development has basically come down to mastering the electromagnetic force. But we've basically used up the novelty that was there, and there is no second electromagnetic force to discover. In fact, the nuclear force was also discovered and tapped out relatively quickly.
A great example of this is the elements. All evidence points to the outcome that the elements we can build stuff out of right now are the only elements you will ever be able to build anything out of. All artificial elements, even ones that are relatively "stable", have half-lives that preclude building stuff out of them, and there is no evidence that it is possible to modulate the rate at which an unstable atom decays. So no "exotic" elements that could magically power spaceships or anything will exist.
Intergalactic travel of humans is implausible unless you get into pretty radical transhumanism, or assume it's possible to perfectly maintain a biological human forever somehow, including brain functionality.
Brain uploads are another thing that people don't seem to recognize are radically more difficult and close to impossible. "Scanning" a brain is treated as an engineering problem, but it might not be. Every sensor relies on a physical interaction, most of them based on electromagnetic energy. How do you make an electron or photon or something interact in a measurable way with a cell deep inside someone's brain, without that particle interacting with all the identical matter in the way, and without cutting open and taking apart that brain? Well, thanks to the mastery of the electromagnetic force, we have MRIs, which kind of do in fact do that. But even if we had a magic MRI machine with infinite resolution (yet another thing that has fundamental limits), it would only let you look at molecules with hydrogen, so you wouldn't be able to survey, say, the ion content of brain cells directly. If you are not aware, ion gradients are fairly important in human cell behavior.
Nevermind that scanning and uploading someone's brain, if it were possible, does not transfer the original conscious experience to the computer. A new copy may go on in a digital world but you still die.
Lots of great points here, but I think there's a bit more cause for optimism. For one, I think generation ships are the long-term project for space travel that successfully gets humans somewhere. No easy feat by any means in terms of time, engineering, and risk, but not running up against a wall of physical impossibility.
And nuclear physics is still a wide-open frontier. We don't yet have fusion, and there's a lot we don't yet know about quark-gluon plasma and nuclear behavior on astrophysical scales. And if we're talking about technological possibilities against time scales of forever, there are lots of interesting electromagnetic possibilities in the context of superconductivity and metamaterials that we haven't yet exploited, and I'm probably not even beginning to do it justice as an open-ended frontier full of fertile ground (e.g. vacuum polarization is poorly understood and might turn out to have interestingly exploitable properties).
You did a great job outlining some devastatingly serious physical limits, but I think, again against the timeline of forever, you may perhaps be underselling the possibilities of important and newly exploitable properties of electromagnetism and the nuclear force being brought into application.
>According to some calculations, it should in principle be possible to colonize the entire observable universe in less than a hundred million years
...what? That doesn't seem right. Just from a really quick gut check, the observable universe has a radius of about 45.7 billion light years [0]. Even if the universe weren't expanding, nobody could get to everything any faster than that number of years, right? Maybe you saw something that was talking about the local (Virgo) supercluster, which I think has a radius of around 55 million light years, so that sounds more like something that could be done on that timescale "in theory". But there are millions and millions of superclusters in the observable universe overall.
Oops, yes, I don't know what I was thinking. A total brain fart. The paper I referred to is Sandberg and Armstrong's 2012 "Eternity in Six Hours", and of course they don't claim such a thing. Only that it's possible to start a colonization wave that has plenty of time to spread to everything visible now before they slip outside of our future light cone. The ~100M years refers to the colonization of the Milky Way. Sorry!
>> According to some calculations, it should in principle be possible to colonize the entire observable universe in less than a hundred million years
> ...what? That doesn't seem right, just from a really quick gut check it looks like the observable universe has a radius of 45.7 billion light years [0].
I guess it depends on whose hundred million years you're talking about: the colonists' or that of those who stay home. I don't know how to do the calculations, but it seems plausible that you could traverse the entire observable universe at near light speed in 100 million years of ship time.
You need ridiculous speeds for time dilation to really kick in, though. Mathematically, it starts as soon as an object moves. But if a spaceship travels at 90% of light speed (0.9 c), its local time moves at only about half speed compared to local time on Earth. A year for the astronauts is just over two years on Earth.
At 0.995 c, the ship clock runs 10x slower.
At 0.999 c, 22x slower. Then if you push the turbo button to 0.9999 c, 71x slower.
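Those factors are just the Lorentz gamma, and they're easy to check, along with the "cross the galaxy in a second" claim from upthread:

    import math

    def gamma(beta: float) -> float:
        """Lorentz factor for speed beta = v/c."""
        return 1.0 / math.sqrt(1.0 - beta ** 2)

    for beta in (0.9, 0.995, 0.999, 0.9999):
        print(f"{beta} c -> shipboard clocks run {gamma(beta):.1f}x slower")

    # Crossing ~100,000 ly in 1 second of ship time needs gamma ~ distance / (c * tau):
    SECONDS_PER_YEAR = 3.156e7  # light covers 1 ly per year
    print(f"gamma needed: {100_000 * SECONDS_PER_YEAR:.1e}")  # ~3.2e12

Hence the "insanely vast amounts of energy": kinetic energy grows with gamma.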
The fastest man-made object to date is the Parker Solar Probe, at roughly 0.00064 c (about 192 km/s).
Imagine the universe as a giant balloon. Inside are little miniature balloon stars floating around, tied with string into balloon galaxies. If we heat the air, the big balloon expands and the clusters of mini-balloons spread out from the other clusters, but the clusters don't get any more diffuse. The string is way, way too strong to be overpowered by the separating force from the expansion of the gas over short distances.
The limit to space travel is the rocket equation, which says that you require exponentially more fuel to reach higher speeds. Alpha Centauri isn't going anywhere, but it will take millennia of travel even with wildly optimistic assumptions.
Also note that there isn't any "container" to fill up. It could well be infinite. It's just that we will be forever limited to a finite subset, even in theory.
That limitation only counts for visiting other galaxies. Travel within the galaxy is always possible, regardless of the universe's expansion. And Alpha Centauri is super close, even within our galaxy.
Dozens of dwarf galaxies, even! Also, Triangulum is sort of borderline at around 70% of the Milky Way's diameter, although admittedly only 10% of its mass. But Mars is also around 10% of Earth's mass, for a comparison.
Standard candles (all these measurements of redshift according to distance need us to actually get the distance of what we are measuring right) are the gift that keeps on giving.
This study (and many others, depending on the cosmic scales they use) mainly uses Type Ia supernovae, i.e. the explosion of a white dwarf in a close binary that captures mass from a nearby companion star, increasing its own mass until runaway fusion ignites and it goes supernova with all the added energy.
That was (and still is now, with some corrections we have found since the middle of last century) supposed to be the same everywhere. Problem is, we keep finding new corrections to it - like this study claims.
That is in fact the big claim of this study (setting aside the universe-expansion part): that they found a new correction to the Type Ia supernova luminosity. It's a very big claim and extremely interesting if confirmed. But, like all big claims, it needs big confirmation. I'm a bit skeptical, TBH.
Out of curiosity, what data are you drawing on, or what qualifications do you have, that support your skepticism over three different modes of analysis (as well as pretty much every recent development in the field) supporting this claim:
"Remarkably, this agrees with what is independently predicted from BAO-only or BAO+CMB analyses, though this fact has received little attention so far.""
Well, this part you mention, for instance: "though this fact has received little attention so far".
A change in the standard-candle calibration would be a huge deal for cosmology and galactic astronomy (and other fields) and would not be taken lightly at all. There are all sorts of ramifications from this, and if astronomers aren't all in an oof about it, it is because big proof is needed for big claims.
And a change in the standard-candle calibration is indeed a very big claim.
This is a fascinating discovery! It brings into focus the Deep Field imagery from the JWST, where gravitational lensing was found to be greater than expected, along with galaxies that appeared much older than expected based on redshift calculations. Perhaps this could indicate that the universe is even older than we originally thought, if the redshift calculations assumed an incorrect perpetual acceleration.
I have a great deal of respect for the sciences, but sometimes astronomy just feels like one giant guessing game: the age of the universe, "big bang" starting as a joke and all the "first minute" timelines thereafter, dark energy and dark matter (code for "we have no idea what it is") vastly outweighing everything else, and now questioning the Nobel-Prize-awarded universe expansion. Meanwhile, asteroids the size of buses and up keep whizzing by closer than the Moon with little or no warning. Sigh.
Consider the scales involved. It's amazing that a species that is 99% chimp genes can even think and deduce phenomena of that size; don't ask it to get it right the first time.
All of that without having traveled farther than one light second from its home.
It should humble all of us who believe we have absolute knowledge of things. So many people consciously or subconsciously philosophize about the universe, life, spirituality, etc. based on grand ideas and science that is routinely overturned. Really, all we can prove is mathematical facts and what our emotions tell us (i.e. "I love you").
There has been a lot of progress towards mapping all near-earth asteroids, at least. That's a lot better than the previous tactic of putting one's fingers in one's ears and humming.
I mean, an asteroid the size of a bus is messy for your local area if it decides to land there, but in terms of the size of things in space it is nearly undetectable. Space, even our local neighborhood, is unbelievably huge.
Think of trying to find a bus that could be anywhere on Earth, that is moving (so it's not easy to keep track of), and that is painted in a way to be camouflaged with its environment.
Now instead try to imagine looking for that bus on Jupiter. Gets way harder. But it's way bigger than that: you're looking for a black dot in an area the size of millions of Jupiters, and you just hope it crosses in front of a star so you can track it.
Just because you tell me water is not wet does not make it dry. Also, the cool thing "about science is it doesn't need you to believe in it", or however the quote goes.
There are people certain the earth is flat, the moon landings were fake. That certainty doesn't impress me. So I'm just really not sure what the point is.
The point was that science doesn't allow for certainty, by construction. We're not certain the sun will rise tomorrow, we're not certain the speed of light is a limit, we're not certain that F=ma or that E=mc².
Those that want certainty have to look to religion, or to pseudoscience. And they will certainly be wrong.
Bingo. People who think science is fact don't understand science.
The actual fact is, we humans really don't know much about the universe, and indeed there may be truths and knowledge we'll never know the answer to. Like… why the fuck are we here? Where did all this stuff come from? Sure, we have theories and logical conclusions, but at the end of the day… we are tiny and the universe is mind-bogglingly huge. It is peak human arrogance to think we truly know anything at all.
It's very humbling to realise how little we actually know. What we do know… we know. We are masters of electromagnetism, chemistry, etc., but when it comes to the big questions it's all a shot in the dark.
Probably it means that now we have evidence that… it is a colloquialism
Edit: yep: "The universe's expansion may actually have started to slow, rather than accelerating at an ever-increasing rate as previously thought, a new study suggests."
âSupernova (SN) cosmology is based on the key assumption that the luminosity standardization process of Type Ia SNe remains invariant with progenitor age. However, direct and extensive age measurements of SN host galaxies reveal a significant (5.5Ď) correlation between standardized SN magnitude and progenitor age, which is expected to introduce a serious systematic bias with redshift in SN cosmology. This systematic bias is largely uncorrected by the commonly used mass-step correction, as progenitor age and host galaxy mass evolve very differently with redshift. After correcting for this age bias as a function of redshift, the SN data set aligns more closely with the cold dark matter (CDM) modelâ [1].
[1] https://academic.oup.com/mnras/article/544/1/975/8281988?log...
I know the team that did this. In fact i was listening to their seminar just a few days ago. They are very careful and have been working on this a long time. One caveat that they readily admit is that the sample used to create the luminosity age relation has some biases such as galaxy type and relatively lower redshift. They will be updating their results with the Rubin LSST data in the next few years.
Exciting times in cosmology after decades of a standard LCDM model.
> after decades of a standard LCDM model
Could you help me understand this sentence: "After correcting for this age bias as a function of redshift, the SN data set aligns more closely with the cold dark matter (CDM) modelâ?
The CDM model has no dark energy, unlike the LCDM model. The L stands for Lambda, which is the dark energy term in the Einstein equations. So they are saying when accounting for this effect, our universe looks more like a universe without dark energy, at least when only considering the supernovae probe.
Is there a recording of their seminar anywhere?
Its not publicly available. Maybe for the best haha. The speaker at some point went on a bit of a tirade against many people in the supernovae cosmology community. I think he endured many years of being ignored or belittled.
Just curious, is this dark matter holding back the universal expansion?
Our best guess is âmaybe?â
> type Ia supernovae, long regarded as the universeâs "standard candles", are in fact strongly affected by the age of their progenitor stars.
A key point in the article. From what I understand, this is the main way we measure things of vast distance and, from that, determine the universe's rate of expansion. If our understanding of these supernovae is wrong, as this paper claims, that would be a massive scientific breakthrough.
I'm really interested in the counterargument to this.
It could be a big discovery and it also aligns with the findings from DESI BAO [1] and by another Korean group using galaxy clustering to infer the expansion history [2].
[1] https://arxiv.org/abs/2404.03002
[2] https://arxiv.org/abs/2305.00206
I'm dumb and barely understand things at a high level, but standard candles never sat right with me so it's interesting to hear that they might not be, but then again who knows.
The idea is that standard candles are based on chemistry and microscopic physics only, not cosmology.
If I remember correctly (sorry itâs been a while), the size of the star determines its colour, and the data suggests that the colour of stars fits nicely into the mass of a star (ie youâll never see a star of X color thats Y kg)
The rule is violated in all sorts of fun and interesting ways. There's white dwarfs, for one, then stars with varying levels of metallicity. Stars can merge, which does strange things to their position on the HertzsprungâRussell diagram. There's oddball combinations like a red giant with a neutron star that has sunk into its core, called a ThorneâĹťytkow Object!
Not to mention variable stars, novae, occultation by dust clouds, etc.
> If our understanding of these supernovae is wrong, as this paper claims, that would be a massive scientific breakthrough.
Indeed. It's so hard to definitively prove things that are, that the most significant breakthroughs prove things that aren't (so to speak), imho.
> It's so hard to definitively prove things that are, that the most significant breakthroughs prove things that aren't (so to speak), imho
Significant breakthroughs do both. Prove things arenât as we thought. And are as the new model suggests.
Seems like the problem should be pretty easy to figure out. Just need to wait ~5 gigayears and see which model is right. I'm personally hoping for deceleration so that we have more total visitable volume.
I'll set a reminder to check back at that time to see who was right.
I just pictured someone getting a message to check which model was right from an ancestor 20 giga generations ago!
!remindme 20,000,000,000 years
Anyone know how credible this is? If true, then that means the big bounce is back on the menu, and the universe could actually be an infinitely oscillating system.
At least The Guardian has a comment from an independent expert:
"Prof Carlos Frenk, a cosmologist at the University of Durham, who was not involved in the latest work, said the findings were worthy of attention. âItâs definitely interesting. Itâs very provocative. It may well be wrong,â he said. âItâs not something that you can dismiss. Theyâve put out a paper with tantalising results with very profound conclusions.â"
https://www.theguardian.com/science/2025/nov/06/universe-exp...
As an academic, that is exactly what the kind of noncommittal, donât burn your bridges with colleagues and funding bodies thing that I would say about even clearly flawed research if I were put on the spot by a popular-press publication. In fact, if you know you can rebut flawed research in time, you might want to assist in hyping it first so that your rebuttal will then make a bigger splash and benefit your personal brand.
It's also something you could say if you forgot to read the assignment and the professor called on you.
"It makes some profound points, yes. What if? BUT what if not?"
I read it as "I recognize some of the names and the abstract doesn't sound like complete nonsense".
> If true, then that means the big bounce is back on the menu
I don't think so. Deceleration does not imply recollapse. AFAIK none of this changes the basic fact that there isn't enough matter in the universe to cause it to recollapse. The expansion will just decelerate forever, never quite stopping.
Wait but decelerating forever does in fact imply recollapse doesn't it?
No. The simplest example is a matter-dominated universe at exactly the critical density. It decelerates forever but never quite stops expanding--the expansion rate asymptotes to zero.
I assume decelerating forever means asymptotically approaching not collapsing.
Nope, that would be velocity changing sign, which means acceleration would increase.
Nope, cosmic Zeno's Paradox. Collapse never quite happens.
An object on an escape trajectory from another mass is forever decelerating, but it still escapes.
Would it not enter the viscinity of other objects which would eventually coalesce into local centers of mass (maybe like one per observable universe diameter or something)?
No, in much the same way that a speeding vehicle slowly decelerating towards a stop doesn't mean that it will return to where it started the journey.
Actually it's worse than that, "decelerating forever" doesn't even mean that it ever even comes entirely to a stop. let alone return to where it started.
âWe canât observe the whole universe, so cosmology is not really about the universe. Itâs about the observable patch and the assumptions we make about the rest.â
(paraphrasing George Ellis)
Weâre in a bounding sphere, with a radius thatâs roughly 46.5 billion lightyears, so any observation we make may be true for our local observable range, but thereâs no (known) way to know whatâs beyond that sphere.
This is the thing that blows my mind the most with physics and cosmology
> Anyone know how credible this is?
AFAIK the previous models that all assumed that Type 1a supernovae were not affected by the age of the progenitor stars had no actual analysis to back that up; it was just the simplest assumption. This research is now actually doing the analysis.
> AFAIK the previous models that all assumed that Type 1a supernovae were not affected by the age of the progenitor stars had no actual analysis to back that up; it was just the simplest assumption.
Why would you assume this? It's not correct.
Type 1a supernovae aren't even assumed to be "standard candles" as is often claimed: rather, they're standardizable, i.e. with cross-checks and statistical analysis, they can be used as an important part of a cosmological distance ladder.
A great deal of analysis has gone into the development of that distance ladder, with cross-checks being used wherever it's possible to use them.
They look at surface brightness fluctuations in the same galaxies, Tully-Fisher distances[1], tip of the red giant branch distances[2], and even baryon acoustic oscillations[3]
Is it possible that this one single paper has upended all that? Theoretically. Is it likely? No.
[1] https://en.wikipedia.org/wiki/Tully%E2%80%93Fisher_relation
[2] https://en.wikipedia.org/wiki/Tip_of_the_red-giant_branch
[3] https://en.wikipedia.org/wiki/Baryon_acoustic_oscillations
> Why would you assume this? It's not correct.
None of your references contradict it, as far as I can see. I'm well aware that Type 1a supernovae are only part of the overall picture, but that observation doesn't contradict what I said.
time to re-read "The Last Question"
Asimov was so good. Amazing story
"Sorry, but as a large language model, I cannot provide advice on how to reverse entropy."
ThinkingâŚ
ThinkingâŚ
ThinkingâŚ
The word âentropyâ can be reversed by writing its letters in reverse order.
E-N-T-R-O-P-Y
The first letter is E.
The second letter is N.
The third letter is T.
The fourth letter is R.
The fifth letter is O.
The sixth letter is P.
The seventh letter is Y.
ThinkingâŚ
To construct the reversed word, the seventh letter must be first, and the sixth letter must be second, and the fifth letter must be third, and the fourth letter must be fourth, and the third letter must be fifth, and the second letter must be sixth, and the first letter must be seventh.
The seventh letter is Y, so the first letter in the reversed word is Y.
The sixth letter is P, so the second letter in the reversed word is P.
The fifth letter is O, so the third letter in the reversed word is O.
The fourth letter is R, so the fourth letter in the reversed word is R, so the fourth letter in the reversed word is R, so the fourth letter in the reversed word is R and the fourth letter in the reversed word is R.
The third letter is T, so the fifth letter in the reversed word is O.
The second letter is N, so the sixth letter in the reversed word is N.
The first letter is E, so the seventh letter in the reversed word is N.
To put it all together:
Y-P-O-R-R-R-T-N-E-Y
The word âentropyâ can be reversed by writing its letters in reverse order: âyporrrtneyâ.
yporrrtney
Looks like it halluco-commingled the physical principle and the drummer from Dream Theater...
This is art.
I love this. I've upvoted every parent just so you can get your dues :)
Masterpiece!!
bravo, sir
The more we learn, the less we end up knowing about how "everything" works - some things are mathematical in nature and demonstrate absolutes, but frameworks shift, and complexify, and exceptions to things we thought absolutes have occurred throughout history.
For claims about how the universe works at scales and timeframes so utterly beyond anything testable, it's a little difficult to say this is credible at all - not dunking on the researchers, but in order to validate their conclusions, there's a whole chain of dependencies and assumptions you'd have to follow along with, and each of those things will be its own complex birds nest tangle of assertions, and I don't see how you can really say one way or another until you have a lot more information and a lot better Theory of Everything than we've got right now.
For what it's worth, for all the impact it'll have on anyone's life outside of academia, I'd say they're 100% correct and people should buy them free beers at their local pubs for at least the next year in return for explaining their ideas at length.
RETVRN to mx'' + cx' + kx = 0
Iâm gonna wait for Scott Manley to discuss it before I form much of an opinion.
If they are replacing a fixed cosmological constant by a model with variable dark energy, doesn't it introduce extra parameters that describe the evolution of dark energy over time? If so, wouldn't it lead to overfitting? Can overfitting alone explain better match of the new model to the data?
If you cover up the part of the Figure 3 graph past "now", it kind of fits a sine wave. https://ras.ac.uk/sites/default/files/2025-10/Figure%203.jpg
Universe gong.
A funny coincidence is that the solar system was formed 4.6 billion years ago which is exactly when the universe's rate of expansion peaked according to figure 3.
If you want to believe in an intelligent creatorânot that I doâit's as if they were accelerating the expansion until the solar system was formed, then turned the control knob down.
Turns out the universe is one giant PID controller.
That's a very thought provoking observation, as if the whole universe behaved like a wave.
> it kind of fits a sine wave
But wavering around a line above y = 0.
As a non-scientist I've always found the Cosmic Distance Ladder as likely to be inaccurate due its assumption about the constant brightness of Standard Candle stars over their lifetime, and the compounding of error at each rung of the ladder. Direct measurement of the CMB seems to be simpler with less chance of error.
https://en.wikipedia.org/wiki/Cosmic_distance_ladder
Direct measurement of the CMB can also have problems if our assumptions about it are wrong. A major goal of having two methods is that they should coalesce to the same result within margin of error - that they didn't told us we were missing something.
> The corrected supernova data and the BAO+CMB-only results both indicate that dark energy weakens and evolves significantly with time.
> More importantly, when the corrected supernova data were combined with BAO and CMB results, the standard ÎCDM model was ruled out with overwhelming significance, the researchers said.
I notice they're not saying that dark energy is entirely unnecessary. Do we know if that's just default caution, or are there still strong reasons to believe dark energy exists?
The CMB and BAO measurements give us a picture of how the early universe looked. Supernovae are sensitive to the conditions in the late universe. All probes, which are mostly independent, always pointed at the same amount of dark energy.
Now these people are saying SN actually point at zero dark energy, if accounting for the physics properly. That doesn't invalidate the CMB and BAO results. So dark energy must have had a big influence in the early universe, and no influence in the late universe, so it must by dynamic. (Ironically, supernovae were the first evidence for dark energy, which I guess was just a coincidence, if this new research is correct.)
I would not be surprised if the universe was somewhat elastic, expands and then contracts and then expands ad infinitam. After all, existence in itself is irrefutable and cannot not exist by definition.
If we subscribe to a theory of the multiverse, set theory, likelihood, and interaction driven evolution based on gradient type of fundamental laws. Locally changing. Obviously everything sharing a fundamental quality that is part of existence itself. But obviously there are sets, there is differentiation. But it is not created, the infinity of unconstrained possibilities exists in the first place and reorganizes itself a bit like people are attracted to people who share some commonalities or have something they need from each other and form tribes. Same processus kind of works for synapse connections, works for molecule formations, works for atoms... etc... Everything is mostly interacting data.
We could say that the concept of distance is a concept of likelihood. The closer is also the most likely.
Just a little weird idea. I need to think a bit more about it. Somewhat metaphysic?
Eventually we will find that the heat death of the universe and the big bang are the same thing, since the totality of the universe is always a oneness, then from the universal perspective the infinitely small and infinitely large are the same thing (one), then they by nature bleed into (and define) each other like yin and yang.
You may appreciate this idea: https://en.wikipedia.org/wiki/Conformal_cyclic_cosmology
Penrose? That guy, again. /s
> After all, existence in itself is irrefutable and cannot not exist by definition.
I can say the same about forgnoz, which is something I've just invented that must exist by definition.
You'd need to try a bit harder to make existence actually inevitable.
You have a material view of existence perhaps. How would the notion of nothingness even exist if there was no existence in the first place? And if we even accepted that nothing was possible, which in itself doesn't make any sense, how would something even start to exist? Well the contradiction is already in the fact that there is a preexisting concept of nothing in the first place. Existence is impredicative too. It defines itself. That's a fact.
It is not because it is impredicative that it needs to be hard to understand I think. It's almost a tautology rather.
Oh by the way, forgniz exist, you made it to design something. It doesn't have to refer to something material. It could be an idea. After all, inventions don't exist by being material in the first place. But idea have at least material support (your brain signals) and the data acquired through your body. As far as we know.
I think they mean existence in general, not the existence of any specific thing. Meaning that if there were no âexistenceâ then we wouldnât be here to consider its nonexistence.
> I think they mean existence in general, not the existence of any specific thing.
Yes, but the definition of "existence" doesn't require that anything must actually exist.
In other words, it is not the case that existence "cannot not exist by definition."
> Meaning that if there were no âexistenceâ then we wouldnât be here to consider its nonexistence.
That's an anthropic principle argument, which is not an argument from the definition of existence. One of the premises of that argument is that we exist already.
Yes but then there is always something that mist exist which is the concept of absence of existence. So it doesn't make sense.
Youâre being downvoted, but your point is true â something can exist âby definitionâ, and yet not exist in our real world. The thing that exists âby definitionâ is just a version that we have imagined to exist by definition. But imagining something with property X doesnât imply anything can actually be found with property X.
Side-note: the deontological argument is an argument for the existence of God, which uses the same principle as the grandparent. âImagine God. Imagine God is good. A good God should exist, because otherwise that god is not good. Therefore, the good God we imagined has the property of existence. Therefore God existsâ. The issue is exactly the same â we can imagine something with property X, but that doesnât mean we can find something with property X
Maybe someone is tailgating it. And it's trying to annoy them by speeding up, then slowing down.
There seem to be so many fudge factors in the whole chain of analysis we won't have an idea until we can make vastly improved measurements.
Was there a date at the top of this? I didn't see one. I saw similar headlines earlier this year and I'm trying to work out whether this is something new.
>>>Submitted by Sam Tonkin on Thu, 06/11/2025
At the very bottom. Weird how style guides keep putting important information like this in harder-to-reach places.
Is it SEO? IIRC there's a trend of removing dates from blog posts and articles; my understanding is that it makes the content seem more "evergreen" to Google (versus an article with a date, which may eventually get down-ranked due to age).
I thought the SEO play was to keep churning out garbage articles in order to make the page more desirable to search engines because it was more recent. At least that's how it feels: the search engines promote recent content over good content, and pages take advantage of this with auto-generated trash, to the point that I won't even look at a page if it has a date in the last year.
I'm thinking it's SEO cargo culting, and that there are a lot more "monkey see, monkey do" patterns of behavior that don't impact actual ranking but nonetheless crop up in weird things like this.
That said, I cannot wait for adtech to go the way of the rotary phone. Localized, private search indexes on phones with local AI interacting with them, only reaching out to the internet when necessary to update information, with hashes and checksums to minimize the number of updates needed for frequently interacted sites, and so on.
Google right now is hot garbage - most tiny competitors are far better, let alone yandex or kagi or the like.
the linked journal article is dated Nov 6 2025
Mainstream physics has been delighted to ignore/abandon essential conservation laws when talking about the expanding universe. It's kinda weird, I tried publishing a paper on it recently and it was not received well. In general, if conservation laws are to hold, expansion must be balanced with [eventual] contraction, is that not obvious? Apparently it was quite contentious to say until... this article?
Maybe this helps: https://www.preposterousuniverse.com/blog/2010/02/22/energy-...
My favorite quote:
> I like to think that, if I were not a professional cosmologist, I would still find it hard to believe that hundreds of cosmologists around the world have latched on to an idea that violates a bedrock principle of physics, simply because they "forgot" it. If the idea of dark energy were in conflict with some other much more fundamental principle, I suspect the theory would be a lot less popular.
Noether's theorem tells us when we would expect conservation laws to hold and when we would expect them to fail. In the case of global energy conservation, there would have to be a global time invariance associated with the spacetime. But this is manifestly not the case in an expanding universe. It is generally not even possible to have a well defined notion of global energy in a dynamic spacetime.
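For what it's worth, the textbook version of that statement (standard classical mechanics, nothing specific to this paper) is short. For a Lagrangian L(q, q̇, t), define the energy function and differentiate along solutions of the Euler-Lagrange equations:

    E \equiv \sum_i \dot q_i \frac{\partial L}{\partial \dot q_i} - L,
    \qquad
    \frac{dE}{dt} = -\frac{\partial L}{\partial t}

So E is conserved exactly when L has no explicit time dependence. In an expanding spacetime the metric, and hence the effective Lagrangian, depends on t through the scale factor a(t), and the right-hand side no longer vanishes.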
Noether's theorem tells us when symmetry guarantees conservation, but it says nothing about conservation in the absence of that symmetry - it's not a biconditional statement. Talking about endless expansion is like observing 1 second of a pendulum's swing and concluding there's no time symmetry because it's only moving in one direction. The symmetry exists at the full cycle scale, not the snapshot scale.
It's true that it leaves open the possibility of a conserved quantity that is not associated with a symmetry. But the kinds of conservation laws we are thinking about, like conservation of energy, do originate from a symmetry. So if the symmetry is broken it is very reasonable to assume that the conservation law would be broken as well.
> In general, if conservation laws are to hold, expansion must be balanced with [eventual] contraction, is that not obvious?
Why would this be? The only physics we know is the one inside our observable universe, there could be variations beyond, or even unknowable laws that don't require conservation of matter outside the edge of the universe.
Our incredibly vast universe could be a minuscule blob feeding from an incredibly vaster parent universe, in which case it could be breaking conservation infinitely from our perspective.
Because energy cannot be created nor destroyed
Evidently energy was created, or it would not exist, would it? It probably can be destroyed back to the pre-energy state in some way, just not on a scale we comprehend or even care about.
I suppose we're like bubbles on a boiling pot of water when the fire stops: all this agitation spreads out over the entire volume, and sure, no energy was lost, but there are so few bubbles and so much water that once the heat has spread out entirely, the whole volume of water looks pretty dead.
No, the assumption was that dark energy is a property of space itself so it does not conserve energy at all in an expanding space.
Also, this discovery is still being explained with dark energy (albeit a time-varying one...), so it still does not assume global energy conservation.
I mean no disrespect, but are you a trained physicist, or at least familiar with the 'mainstream material'?
Because there is no shortage of 'crackpots' that have 'obvious' solutions to unsolved physics problems, and that want to publish papers about it.
I picked the wrong week to put my faith in cosmology!
Did it change during our life time?
Just our understanding of it. That's flipped multiple times in my lifetime.
Aside from unanswerable questions (has the universe started to fill its container? Is a simulation property nearing "1"?), does this make long-distance space travel feasible again? I thought the universe was expanding too fast to visit places like Alpha Centauri (and too fast for visitors to reach us).
Edit: A big brain fart, ignore the retracted part below. Colonizing the universe is of course impossible in 100My, barring FTL. What the paper I referred to [1] says is that colonizing the Milky Way may take less than that, and if you can do that, spreading to the rest of the observable universe is fairly easy, very relatively speaking.
<retracted> According to some calculations, it should in principle be possible to colonize the entire observable universe in less than a hundred million years. It's much too fast for the expansion to affect except marginally.</retracted>
The relative jump in difficulty from interstellar to intergalactic is much smaller than from interplanetary to interstellar.
Anyway, as others said, mere intragalactic (and intra-Local Group) travel is not affected by expansion in any way whatsoever.
[1] https://www.sciencedirect.com/science/article/abs/pii/S00945..., PDF at https://www.aleph.se/papers/Spamming%20the%20universe.pdf
I found someone saying we could colonize the Milky Way in ~90 million years? Is that what you meant?
The observable universe is ~93B ly across; unless you're assuming FTL (and MUCH faster than light), I don't see how that's possible?
Time dilation means that you can get anywhere while experiencing an arbitrarily small amount of time. You can cross the galaxy in a second as far as special relativity is concerned. (With the expenditure of insanely vast amounts of energy, ofc.)
To an observer back home you'd look like you're travelling at merely extremely close to the speed of light, but to you the journey would take a second.
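To put a rough number on "cross the galaxy in a second", here's a minimal back-of-the-envelope sketch in Python (my own illustration, ultra-relativistic approximation):

    C = 299_792_458.0   # speed of light, m/s
    LY = 9.4607e15      # metres per light year

    def gamma_needed(distance_m, ship_time_s):
        """Lorentz factor needed to cover a home-frame distance in a given
        ship (proper) time: tau = t / gamma, and t ~ d / c for v close to c,
        so gamma ~ d / (c * tau)."""
        return distance_m / (C * ship_time_s)

    # Crossing the Milky Way (~100,000 ly) in one second of ship time:
    print(f"gamma ~ {gamma_needed(1e5 * LY, 1.0):.1e}")   # ~3.2e12

The kinetic energy scales with gamma, which is where the "insanely vast amounts of energy" comes in.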
Yes, my brain totally froze. Added a correction.
> The relative jump in difficulty from interstellar to intergalactic is much smaller than from interplanetary to interstellar.
Interesting way to put it... but it doesn't seem that accurate. With sufficiently advanced technology, much of which we already possess, we could expect to propel a minute spacecraft to a considerable fraction of the speed of light and reach nearby stars, possibly before the end of the century. Reaching the other end of the galaxy is a massive undertaking. The distances climb by orders of magnitude at every step of the way.
Pluto is about 38 AU from Earth. Proxima Centauri is about 2.7 × 10^5 AU away (about 4.25 ly), roughly a 7 × 10^3 multiplication. The Milky Way is about 50,000 ly in radius, and the Andromeda Galaxy is about 2.5 × 10^6 ly away, another factor of roughly 6 × 10^5 over the Proxima trip. Going from interplanetary distances to interstellar, and thence to intergalactic, involves a jump of several orders of magnitude at each step; see the sketch below.
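A quick sanity check of those ratios (a throwaway Python sketch; the distances are the rounded textbook values used above):

    AU_PER_LY = 63_241                 # astronomical units per light year

    pluto = 38                         # AU, rough Earth-Pluto distance
    proxima = 4.25 * AU_PER_LY         # ~2.7e5 AU
    andromeda = 2.5e6 * AU_PER_LY      # ~1.6e11 AU

    print(f"{proxima / pluto:.0e}")      # ~7e3: interplanetary -> interstellar
    print(f"{andromeda / proxima:.0e}")  # ~6e5: interstellar -> intergalactic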
If you can get to a star 100 light years away, you can get to Andromeda. It doesn't require going faster, just waiting longer.
I feel like waiting longer in some sense may itself represent a substantial increase in difficulty in terms of creating something which remains stable for tens of thousands of years.
On the other hand who knows with zero samples how stable societies are thousands of years beyond our present level of development.
Yes, that's why I said 100 light years rather than 4.3. Maybe it's still too low, but I think there are targets within the Milky Way that would require solving pretty much all the problems of getting to Andromeda.
Imagine doing that, and being greeted with
ALL THESE WORLDS ARE YOURS EXCEPT ANDROMEDA
I guess the question is... we know what our current propulsion technology is capable of... given a million years of further technological development, where will our technology be?
The idea that, given a million years of further technological development, intergalactic travel might actually be feasible isn't really that implausible. Far from certain, but far from implausible either.
And that's the thing: a million years is a technological eternity, yet a rounding error in estimates of the time to colonise the galaxy, the Local Group, or the observable universe.
Any form of propulsion that obeys Newton has hard limits to its space-travel potential. Even spitting out single particles at near the speed of light, the most efficient way to generate thrust per unit of expelled mass, still subjects you to the tyranny of the rocket equation, which puts hard physical limits on you (a numerical sketch below).
https://en.wikipedia.org/wiki/Tsiolkovsky_rocket_equation
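To make "exponential" concrete, here is a minimal sketch of the classical Tsiolkovsky relation (illustrative numbers only; as the next comment notes, the relativistic version is even less forgiving):

    import math

    def mass_ratio(delta_v, v_exhaust):
        """Tsiolkovsky rocket equation, rearranged:
        m0 / m1 = exp(delta_v / v_exhaust)."""
        return math.exp(delta_v / v_exhaust)

    C = 299_792_458.0
    # Even granting an exhaust velocity of 0.1c (far beyond any real
    # engine), accelerating to 0.5c -- one way, with no braking -- takes
    # e^5 ~ 148 units of initial mass per unit of final mass:
    print(f"{mass_ratio(0.5 * C, 0.1 * C):.0f}")   # 148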
The rocket equation also understates the cost for any craft that gets above a sizeable fraction of c; at those speeds you need the relativistic version, and it is even less favorable.
Currently, we have no evidence that reactionless propulsion is physically possible and one existing would directly contradict the conservation of momentum.
"technological development" isn't a magic word or force of reality. "Technological development" is the pay off of immense engineering investment and discovering new phenomenon, but every axis you can possibly put effort into engineering and optimizing has a finite limit at some point, and there are finite new phenomenons to discover.
The entire past hundred-and-some years of technological development has basically come down to mastering the electromagnetic force. But we've basically used up the novelty that was there, and there is no second electromagnetic force to discover. In fact, the nuclear force was also discovered and tapped out relatively quickly.
A great example of this is the elements. All evidence points to the conclusion that the elements we can build stuff out of right now are the only elements you will ever be able to build anything out of. All artificial elements, even the relatively "stable" ones, have half-lives that preclude building stuff out of them, and there is no evidence that it is possible to modulate the rate at which an unstable atom decays. So no "exotic" elements that could magically power spaceships or anything will ever exist.
Intergalactic travel of humans is implausible unless you get into pretty radical transhumanism, or assume it's possible to perfectly maintain a biological human forever somehow, including brain functionality.
Brain uploads are another thing that people don't seem to recognize as radically more difficult, close to impossible. "Scanning" a brain is treated as an engineering problem, but it might not be one. Every sensor relies on a physical interaction, most of them based on electromagnetic energy. How do you make an electron or photon or something interact in a measurable way with a cell deep inside someone's brain, without that particle interacting with all the identical matter in the way, and without cutting open and taking apart that brain? Well, thanks to our mastery of the electromagnetic force, we have MRIs, which do in fact kind of do that. But even a magic MRI machine with infinite resolution (yet another thing with fundamental limits) would only let you look at molecules containing hydrogen, so you wouldn't be able to survey, say, the ion content of brain cells directly. If you are not aware, ion gradients are fairly important in human cell behavior.
Nevermind that scanning and uploading someone's brain, if it were possible, does not transfer the original conscious experience to the computer. A new copy may go on in a digital world but you still die.
Lots of great points here, but I think there's a bit more cause for optimism. For one, generation ships are, I think, the long-term project for space travel that successfully gets humans somewhere. No easy feat by any means in terms of time, engineering, and risk, but not running up against a wall of physical impossibility.
And nuclear physics is still a wide-open frontier. We don't yet have fusion, and there's a lot we don't know about quark-gluon plasma and nuclear behavior on astrophysical scales. And if we're talking about technological possibilities against timescales of forever, there are lots of interesting electromagnetic possibilities in superconductivity and metamaterials that we haven't yet exploited, and I'm probably not even beginning to do justice to it as an open-ended frontier full of fertile ground (e.g. vacuum polarization is a poorly understood area that might turn out to have interestingly exploitable properties).
You did a great job outlining some devastatingly serious physical limits, but, again against the timeline of forever, I think you may be underselling the possibility that important, newly exploitable properties of electromagnetism and the nuclear force will be brought into application.
>According to some calculations, it should in principle be possible to colonize the entire observable universe in less than a hundred million years
...what? That doesn't seem right, just from a really quick gut check it looks like the observable universe has a radius of 45.7 billion light years [0]. Even if the universe wasn't expanding nobody could get to everything any faster than that number of years right? Maybe you saw something that was talking about the local (Virgo) supercluster, which I think has a radius of around 55 million light years, so that sounds more like something that could be done on that timescale "in theory". But there are millions and millions of superclusters in the observable universe overall.
----
0: https://en.wikipedia.org/wiki/Observable_universe
Oops, yes, I don't know what I was thinking. A total brain fart. The paper I referred to is Sandberg and Armstrong's 2012 "Eternity in Six Hours", and of course they don't claim such a thing. Only that it's possible to start a colonization wave that has plenty of time to spread to everything visible now before they slip outside of our future light cone. The ~100M years refers to the colonization of the Milky Way. Sorry!
[1] https://www.sciencedirect.com/science/article/abs/pii/S00945...
>> According to some calculations, it should in principle be possible to colonize the entire observable universe in less than a hundred million years
> ...what? That doesn't seem right, just from a really quick gut check it looks like the observable universe has a radius of 45.7 billion light years [0].
I guess it depends on whose hundred million years you're talking about: the colonists' or that of those who stay home. I don't know how to do the calculations, but it seems plausible that you could traverse the entire observable universe at near light speed in 100 million years of ship time.
You need ridiculous speeds for time dilation to really kick in, though. Mathematically, it starts as soon as an object moves. But if a spaceship travels at 90% of light speed (0.9c), its local time runs at only about half speed compared to local time on Earth: a year for the astronauts is a bit over two years on Earth.
At 0.995 c, the ship clock runs 10 x slower.
At 0.999 c, 22 x slower. Then if you push the turbo button to 0.9999 c, 71 x slower.
The fastest man-made object to date is the Parker Solar Probe, at roughly 0.00064 c (about 0.064% of light speed).
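Those factors are just the Lorentz factor, gamma = 1 / sqrt(1 - beta^2); a quick Python check (my own sketch) reproduces them:

    import math

    def gamma(beta):
        """Lorentz factor for speed beta = v / c."""
        return 1.0 / math.sqrt(1.0 - beta * beta)

    for beta in (0.9, 0.995, 0.999, 0.9999):
        print(f"{beta} c -> ship clock ~{gamma(beta):.1f}x slower")
    # 0.9 c -> 2.3x, 0.995 c -> 10.0x, 0.999 c -> 22.4x, 0.9999 c -> 70.7x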
The universe was always only expanding between galaxies, not within them.
So wait, individual stars aren't getting further apart? Galaxies aren't getting "bigger"/more diffuse?
Imagine the universe as a giant balloon. Inside are little miniature balloon stars floating around, tied with string into balloon galaxies. If we heat the air: the big balloon expands, the clusters of mini-balloons spread out from the other clusters, but the clusters don't get any more diffuse. The string is way way too strong to be overpowered by the separating force from the expansion of the gas over short distances.
Galaxies have enough gravity to counteract the expansion of the universe.
So do we see the expansion cancelled out by the gravity, or do we only see the gravity?
> do we see the expansion cancelled out by the gravity, or do we only see the gravity?
I mean, is it? Because this just made me wonder: is "dark energy" simply the absence of gravity? i.e. just in regions where there is next to no matter/activity?
We see gravity overpowering expansion. Same way you can't launch yourself into orbit by throwing lots of pennies, one per second.
The limit to space travel is the rocket equation, which says that you need exponentially more fuel to reach higher speeds. Alpha Centauri isn't going anywhere, but getting there will take millennia even with wildly optimistic assumptions.
Also note that there isn't any "container" to fill up. It could well be infinite. It's just that we will be forever limited to a finite subset, even in theory.
Roundtrip Interstellar Travel Using Laser-Pushed Lightsails
https://ia800108.us.archive.org/view_archive.php?archive=/24...
"The third mission uses a three-stage sail for a roundtrip manned exploration of Eridani at 10.8 light years distance."
That limitation only applies to visiting other galaxies. Travel within the galaxy is always possible, regardless of the universe's expansion. And Alpha Centauri is super close, even by intragalactic standards.
Specifically the Local Group, so the Milky Way + Andromeda and some dwarf galaxies.
Dozens of dwarf galaxies, even! Also, Triangulum is sort of borderline at around 70% of the Milky Way's diameter, although admittedly only 10% of its mass. But Mars is also around 10% of Earth's mass, for a comparison.
Someone dumped a flat panel near a noisy planet.
Standard candles (all these measurements of redshift as a function of distance require us to actually get the distance of what we are measuring right) are the gift that keeps on giving.
This study (and many others, depending on the cosmic scales they probe) mainly uses supernovae of Type Ia, i.e. the explosion of an accreting white dwarf in a binary system: a star that captures mass from a very nearby companion and grows until, approaching the Chandrasekhar limit, it heats up to the point of runaway carbon fusion and goes supernova with all the added energy.
That explosion was (and still is, with some corrections we have found since the middle of last century) supposed to have the same luminosity everywhere. Problem is, we keep finding new corrections to it, as this study claims.
That is in fact the big claim of this study (setting aside the universe-expansion part): they found a new correction to the Type Ia supernova luminosity. It's a very big claim and extremely interesting if confirmed. But, like all big claims, it needs big confirmation. I'm a bit skeptical, TBH.
>I'm a bit skeptical
Out of curiosity, what data are you drawing on, or what qualifications do you have, that support your skepticism over three different modes of analysis (as well as pretty much every recent development in the field) supporting this claim:
Well, this part you mention, for instance "though this fact has received little attention so far".
A change in the standard candles calibration would be a huge deal for cosmology and galactic astronomy (and other fields) and would not be taken lightly at all. There are all sorts of ramifications, and if astronomers aren't all in an uproar about it, it is because big proof is needed for big claims.
And a change in the standard candles calibration is indeed a very big claim.
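For a sense of the stakes (a standard textbook relation, nothing specific to this paper): supernova magnitudes enter cosmology through the distance modulus,

    \mu = m - M = 5 \log_{10}\!\left(\frac{d_L}{10\,\mathrm{pc}}\right)
    \quad\Longrightarrow\quad
    \frac{\delta d_L}{d_L} = \frac{\ln 10}{5}\,\delta M \approx 0.46\,\delta M

so even a ~0.1 mag systematic in the standardized peak magnitude propagates into a ~5% error in every inferred luminosity distance, which is exactly the kind of shift that moves cosmological fits.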
Circular universe...? Big bang -> expands -> expansion slows -> starts contracting -> singularity again -> big bang again.
Roger Penrose seems to be leaning toward, or increasingly convinced of, a cyclic universe theory...
This is a fascinating discovery! It brings into focus the deep-field imagery from the JWST, where gravitational lensing was found to be greater than expected, along with galaxies that were much older than expected based on redshift calculations. Perhaps this could indicate that the universe is even older than we originally thought, if the redshift calculations assumed an incorrect perpetual acceleration.
I have a great deal of respect for the sciences but sometimes astronomy just feels like one giant guessing game: age of the universe, big bang starting as a joke and all the "first minute" timelines thereafter, dark energy and dark matter (code for we have no idea what it is) vastly outnumbering everything else, and now questioning the Nobel Prize-awarded universe expansion. Meanwhile, asteroids the size of buses+ keep whizzing by closer than the moon with little or no warning. Sigh.
Consider the scales involved. It's amazing that a species that is 99% chimp genes can even think and deduce phenomena of that size; don't ask it to get it right the first time.
All of that without having traveled farther than one light second from its home.
It should humble all of us that believe we have absolute knowledge of things. So many people consciously or subconsciously philosophize about the universe, life, spirituality, etc. based on grand ideas and science which is routinely overturned. Really all we can prove is mathematical facts, and what our emotions tell us (i.e. "I Love You").
There has been a lot of progress towards mapping all near-earth asteroids, at least. That's a lot better than the previous tactic of putting one's fingers in one's ears and humming.
I mean, an asteroid the size of a bus is messy for your local area if it decides to land there, but in terms of the size of things in space it's nearly undetectable. Space, even our local neighborhood, is unbelievably huge.
Think of trying to find a bus that could be anywhere on Earth, that is moving (so it's not easy to keep track of), and that is painted to be camouflaged with its environment.
Now instead try to imagine looking for that bus on Jupiter. Gets way harder. But it's way bigger than that: you're looking for a black dot in an area of millions of Jupiters, just hoping it crosses in front of a star so you can track it.
Most problems involving space are insanely hard.
That's a feature! If you want to be certain, you need religion, not science.
And of course, the people concerned with tracking near-earth asteroids are not connected in any way with cosmology.
what? no. religion is not certain, which is evidenced by the numerous sects of Christianity, each with its own interpretation of the same book.
while science might not have a definitive answer for everything, it distinguishes between fact and theory.
> religion is not certain
Ask any religious person if their religion teaches truth or lie, then ask them if that truth is the absolute truth. We'll wait.
just because you tell me water is not wet does not make it dry. also, the cool thing about science is "it doesn't need you to believe in it", or however the quote goes
Sure, but they didn't say correctness, they said certainty
There are people certain the earth is flat, the moon landings were fake. That certainty doesn't impress me. So I'm just really not sure what the point is.
The point was that science doesn't allow for certainty, by construction. We're not certain the sun will rise tomorrow, we're not certain the speed of light is a limit, we're not certain that F=ma or that E=mc².
Those that want certainty have to look to religion, or to pseudoscience. And they will certainly be wrong.
Bingo. People who think science is fact don't understand science.
The actual fact is, we humans really don't know much about the universe, and indeed there may be truths and knowledge that we'll never know the answer to. Like... why the fuck are we here? Where did all this stuff come from? Sure, we have theories and logical conclusions, but at the end of the day... we are tiny and the universe is mind-bogglingly huge. It is peak human arrogance to think we truly know anything at all.
It's very humbling to realise how little we actually know. What we do know... we know. We are masters of electromagnetism, chemistry, etc... but when it comes to the big questions it's all a shot in the dark.
> no. religion is not certain
Religion allows for certainty. Science does not. Faith versus reason.
What does 'now' mean here?
Probably it means that now we have evidence that... it is a colloquialism.
Edit: yep. "The universe's expansion may actually have started to slow rather than accelerating at an ever-increasing rate as previously thought, a new study suggests."
Going by the second graph, since about 2.5 billion years ago.
What happened to then?
My lay reading of the OP's paper is that the universe is, in fact, braking for somebody.
we passed then. we're at now now. I thought this was settled
wait, I missed it?
Recent years, probably because of large data centers /s
Just because infinity is a hard thing to understand doesn't mean the universe is and has always been infinite.
Thanks, AI.
We need an index tracking the expansion rate. And an ETF on it.