Most power supplies people are used to dealing with are constant-voltage supplies. That means they always (try to) supply the same voltage, no matter what the load is ("voltage regulation"). If you have a 5V supply and attach a 1k ohm resistor to it, it will put 5V of potential across the resistor, and (by Ohm's law, I = V / R) that will result in 5 mA of current flowing through it. If you attach a 100 ohm resistor instead, it will still put the same 5V of potential across it, but now 50 mA of current will flow.
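To make that arithmetic concrete, here's a tiny Python sketch of the constant-voltage case, using just the example numbers above (5V supply, 1k ohm and 100 ohm loads):

```python
# Ohm's law: the current drawn from an ideal constant-voltage supply is I = V / R.
SUPPLY_VOLTAGE = 5.0  # volts, held constant by the regulator

for load_ohms in (1000.0, 100.0):
    current_ma = SUPPLY_VOLTAGE / load_ohms * 1000.0
    print(f"{load_ohms:6.0f} ohm load -> {SUPPLY_VOLTAGE} V across it, {current_ma:.0f} mA drawn")

# 1k ohm -> 5 mA, 100 ohm -> 50 mA: the voltage stays fixed while the current varies.
```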
However, every power supply has a limit to how much current it can actually supply. If the load tries to pull too much current (assuming something doesn't just go "poof" somewhere), generally what happens is that the power supply can no longer maintain the voltage it's supposed to, and the voltage will drop (so it might only supply 4V or 3V instead of the 5V it's supposed to). Essentially, the voltage will drop as far as it needs to until the amount of current being drawn is no more than what the supply can provide.
And at this point, our "constant voltage" (voltage stays the same, current may vary) supply is actually functioning like a "constant current" (current stays the same, voltage may vary) supply instead. Most power supplies aren't designed to operate in this mode, though, and doing so can cause them to fail or behave in other weird ways. However, it is possible to design (or discover) power sources which do operate just fine in this mode, which can sometimes actually be useful.
So imagine a supply that has a maximum current it can supply of, say, 50 mA, but no voltage regulation at all (so its voltage can go very high). If you don't connect anything to the output, there's nothing to cause any current to flow, so the voltage between the output pins might actually go up to hundreds of volts or more (but for now let's just say it tops out at 100V, for the sake of example).
But what happens when you connect something to it, like say a 100 ohm resistor? Well, initially, there would be 100V across 100 ohms, which works out to a current of 1 A. But the supply can't physically provide 1 A of current, it can only do 50 mA, so that's just not gonna happen. So the voltage immediately drops instead: 50V? That's 500 mA, still too much, so it keeps dropping. 10V (100 mA)? Nope, still dropping. 5V? Hey, now we're at 50 mA, and we can actually supply that, so the power supply stabilizes at that point, and only provides a voltage of 5V to the load.
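Here's a rough Python sketch of that settling process, using the made-up numbers from the example (100V open-circuit, 50 mA limit, 100 ohm load). The 1V steps are purely for illustration; a real supply does this continuously:

```python
# Toy model of how a current-limited supply settles on an output voltage.
V_OPEN = 100.0   # volts with nothing connected to the output
I_MAX = 0.050    # amps; the most current the supply can source
R_LOAD = 100.0   # ohms; the attached load

voltage = V_OPEN
# As long as the load would draw more current than the supply can provide,
# the output voltage keeps sagging.
while voltage / R_LOAD > I_MAX and voltage > 0:
    voltage -= 1.0

current_ma = voltage / R_LOAD * 1000
print(f"settles at {voltage:.0f} V, with {current_ma:.0f} mA flowing")  # 5 V, 50 mA
```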
(This all happens very very quickly, so it's not something you would usually see, unless you were looking at it on an oscilloscope, etc, but it is basically exactly what happens with constant current supplies.)
If the load changes to, say, 200 ohms, well, then the power supply won't be "pulled down" as much, so the voltage will naturally rise, and it can "bounce back" up to 10V instead (which balances out again at 50 mA, the same current as before, and everybody's happy).
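Another way to see the same thing (still assuming the made-up 100V / 50 mA supply): for any load the supply can't fully drive, the output voltage is just I_max * R, so the current stays pinned at 50 mA while the voltage tracks the load:

```python
# Closed-form version of the same toy model: the output is either the open-circuit
# voltage (if the load is light enough) or whatever voltage gives exactly I_MAX.
V_OPEN = 100.0   # volts, open-circuit
I_MAX = 0.050    # amps, current limit

for r_load in (100.0, 200.0):
    v_out = min(V_OPEN, I_MAX * r_load)
    i_out_ma = v_out / r_load * 1000
    print(f"{r_load:6.0f} ohm -> {v_out:5.1f} V, {i_out_ma:.0f} mA")

# 100 ohm -> 5 V, 200 ohm -> 10 V: the current stays at 50 mA in both cases.
```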
Photovoltaic / betavoltaic cells basically work exactly the same way (except it's a natural property of the material/process, not something that was designed into it). They can potentially supply very high (static, open-circuit) voltages, but the moment current starts flowing, the voltage they supply will drop until the current is at a level they can sustain, and they will naturally settle at whatever voltage keeps that level of current flowing.