It’s “sustainable” as in “we can keep producing methane even when oil fields are empty”, not as in “methane does not produce greenhouse gases when burned”
Important distinctions. Fortunately there are lots of other renewable innovations to celebrate. I like to believe we are getting closer to the solarpunk future humanity deserves!
For anyone downvoting: how many species have humans caused to go extinct? Is our wellbeing worth the permanent deletion of every other species on the planet? Fuck humanity.
If it's getting all of the carbon for that methane from atmospheric carbon dioxide, then it should at least be carbon-neutral: the production should remove as much carbon from the atmosphere as burning the product releases. That would make it a hell of a lot better than fossil extraction, which takes carbon not currently in the atmosphere and releases it into the atmosphere.
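A toy mass balance makes the point above concrete (the reactions are standard chemistry, but the framing as code is just an illustration, not anything from the article):

```python
# Toy carbon mass balance for air-captured methane.
# CH4 has one carbon atom, so the CO2 captured to synthesize it equals
# the CO2 released when it burns:
#   synthesis (Sabatier):  CO2 + 4 H2 -> CH4 + 2 H2O
#   combustion:            CH4 + 2 O2 -> CO2 + 2 H2O
co2_captured_per_mol_ch4 = 1  # mol CO2 pulled from the air per mol CH4 made
co2_emitted_per_mol_ch4 = 1   # mol CO2 released per mol CH4 burned
net = co2_emitted_per_mol_ch4 - co2_captured_per_mol_ch4
print(net)  # 0 -> atmospherically neutral, ignoring process energy losses
```

Fossil methane skips the capture step, so its net is +1 mol of CO2 added to the atmosphere per mol burned.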
Those types of e-fuels are inefficient and expensive, especially when we can use the electricity directly, without stupid conversion methods. There might be the odd edge case here and there, but this is not going to be a "sustainable" alternative to fossil fuels.
Unlike traditional production methods, this works in a cycle and doesn't require extraction from underground. Fuels are useful because they carry large amounts of energy at low weight and volume; that makes them a vastly superior storage medium to lithium batteries.
This is a cool idea, hopefully it can scale up. Could it be used for large scale carbon removal? How does it compare to current carbon removal methods?
Many of the references are 404, and the key papers according to the article are by completely different authors than claimed with different titles than claimed, and nothing to do with “information thermodynamics”. I suspect that this is AI generated nonsense.
Yeah no theory of science, philosophy or computers is going to save us from the “prime mover” paradox.
But if we can prove that a universe can be "created" via simulation within our own universe, then statistically speaking simulations are "cheap", and it therefore becomes likely that we live in one.
There’s a reality somewhere. A “physical” universe in which a computer can be created. With enough time, a computer can be created in this “top level” universe that is capable of simulating one universe. Then, assuming computational power and efficiency can be improved, multiple universes can be simulated at one time. You can also achieve better simulation fidelity by slowing down the simulated universe, like 1 second of simulation time taking 1 minute of top level universe time.
If we stop there, the probability that we’re living in a simulated universe instead of the top level universe is already pretty high (or inversely the chance that we’re living in the top level universe is pretty low).
Now, if the computers are powerful enough, the simulated universes can probably have computers in them, and those computers will eventually be able to simulate universes, too. Probably in about the same ratio.
So it’s not simulations ALL the way up, but if those postulates are rational, the chance that any randomly chosen universe in the set of all universes is the top level universe becomes vanishingly small, but non zero.
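The counting argument above can be sketched as a toy model (the branching factor and depth here are made-up illustrative numbers, not claims about any real universe's capacity):

```python
# Toy model of the nesting argument: each universe simulates k child
# universes, and nesting continues down to depth d (root = top level).
# k and d are arbitrary; the point is only that the top level's share
# shrinks fast as the tree grows.
def universe_count(k: int, d: int) -> int:
    """Total universes in a k-ary simulation tree of depth d."""
    return sum(k ** level for level in range(d + 1))

k, d = 3, 5
total = universe_count(k, d)
p_top = 1 / total  # chance a randomly chosen universe is the top level
print(total, p_top)  # 364 universes; p_top ~ 0.0027, small but non-zero
```

Increasing either k or d drives the top-level probability toward zero without ever reaching it, which is exactly the "vanishingly small, but non-zero" conclusion.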
I think it’s definitely the case that if the top level simulation(s) stop, all the downstream ones would, too.
> Now, if the computers are powerful enough, the simulated universes can probably have computers in them, and those computers will eventually be able to simulate universes, too
There’s one minor problem with this step of the idea: where are those simulated computers running? For example, let’s say I spin up a virtual machine on my computer, and then inside that machine I spin up a sub-virtual machine. The processing power to run that sub-machine doesn’t appear out of nowhere; it still all comes from the top-level machine, and a bit less efficiently than just running a second VM on the top-level machine directly.
This would be the same for universe simulations. Let’s say Universe A simulates Universe B. Now Universe B tries to simulate universe C. But, in order for Universe B to run that simulation, Universe A actually has to run that simulation. The simulation doesn’t get run for free. If anything, it’s probably less efficient for Universe A to simulate Universe B simulating Universe C. So, Universe A would make better use of resources to just run the Universe C simulation themselves and just let Universe B see the results and think they are the ones running it.
No matter how deep the universes nest, every simulation must be run by the resources of the top-level universe, either directly or through several levels of abstraction. There’s no getting around that. Now, it could be that the top-level universe has a lot of resources and can run sub-universes pretty efficiently. But there will never be any more sub-universes than the top-level universe can run by itself.
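A toy cost-accounting sketch of the point above (the 10% per-level overhead is an arbitrary illustrative figure; the only assumption carried over from the argument is that indirection is never free):

```python
# Toy cost accounting for nested simulations: every cycle ultimately
# executes on the top-level machine, and each layer of indirection adds
# some overhead. The 10% overhead per level is made up for illustration.
def host_cost(base_cost: float, depth: int, overhead: float = 0.10) -> float:
    """Top-level cost of running one simulation nested `depth` levels deep."""
    return base_cost * (1 + overhead) ** depth

direct = host_cost(1.0, 1)  # Universe A simulates Universe C directly
nested = host_cost(1.0, 2)  # A simulates B, which "simulates" C
print(direct, nested)       # nested costs more: indirection is never free
```

This is why, resource-wise, Universe A is better off running Universe C itself and letting Universe B merely see the results.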
While that makes sense if the top-level universe follows our laws of physics, we can’t guarantee that the top-level actually has to follow the same rules as simulations. Perhaps energy and/or matter can be created from thin air, meaning that there are no issues of conservation of energy or matter. A universe like this has literally limitless energy, and so the amount of energy it uses to simulate universes, either separately or within other universes, becomes negligible or a moot point. Perhaps the simulators are more interested in what their simulations simulate rather than their own simulations; perhaps they’re trying to create new patterns of thought that they can’t imagine to create themselves. There are just a lot of questions with Simulation Theory.