Let's say we have completely automated fishing boats that can trap every fish in the sea. We give everyone UBI, and they all decide to eat fish. No human has to work or do anything to completely empty the sea of fish. Is this a good idea? In previous eras we were constrained by the need for human labor to do all these things, but now AI does the work, so we can have as much of anything as we want until the natural resources run out. That creates a sustainability problem, so how is it controlled?
This is a problem we already have to deal with: people got rich enough that they could afford to pay people to overfish the oceans, and we responded by limiting how much people are allowed to fish.
That is, I don't think UBI adds a new problem beyond "how do we make sure that humanity properly accounts for externalities" and "how do we make sure that AI does what we want it to do".
I was using fish as an obvious example. The answer is regulation, but there are so many things like fish in the world. Do we have to have a regulation for every single one? It seems like it will end in whack-a-mole micromanagement of everything. It almost seems like we'll eventually get communism out of it, except without the premise that all labor is of equal value, because there's no labor. I wish there were some alternative.
I'm sorry, I just still don't see the path by which UBI leads to a dramatic increase in how much regulation we need. Could you give an example that is specific to UBI, one that describes a problem we wouldn't have without it?