Amps, etc. are a different beast. Their maximum wattage is fixed (unlike our bulb example). Here the equipment's power draw cannot simply rise with the voltage, because the semiconductors are designed to deliver a specific power; pushed beyond that, they can fail. There are usually 4 types:
1. Voltage selector to pick the transformer tap. You simply select the correct voltage range for your country. This chooses the transformer tap so that you get the same output voltage from the transformer either way.
2. SMPS-based power supply. These usually operate anywhere from 110 V to 240 V. The controller adjusts how long the switching element stays on in each cycle (the duty cycle): at 110 V the switch conducts for a larger fraction of each cycle, at 240 V for a much smaller fraction. The result is a constant output voltage.
3. Linear power supplies. These are transformer based: a step-down transformer whose output is rectified, with a capacitor bank to smooth the DC ripple, followed by a regulator circuit that throws away the extra voltage as heat. At 110 V relatively little energy is dissipated; at 240 V the regulator dumps much more excess energy as heat.
4. Voltage stabilizer before the equipment. This is self explanatory.
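To make type 2 concrete, here is a minimal sketch of the ideal buck-converter relation an SMPS controller relies on: V_out = D × V_in, where D is the duty cycle. The function name and the 12 V output target are illustrative assumptions, not something from the post.

```python
def buck_duty_cycle(v_in: float, v_out: float) -> float:
    """Ideal buck converter: V_out = D * V_in, so D = V_out / V_in.

    Illustrative sketch only; real controllers close a feedback loop
    and account for losses, but the proportionality is the same.
    """
    if not 0 < v_out <= v_in:
        raise ValueError("output must be positive and below input")
    return v_out / v_in

# Same 12 V output from a rectified 110 V or 240 V mains input:
# at 110 V the switch stays on for a larger fraction of each cycle.
d_110 = buck_duty_cycle(110.0, 12.0)   # ~0.109
d_240 = buck_duty_cycle(240.0, 12.0)   # 0.05
```

This is why the output stays constant across the whole input range: the on-time fraction shrinks as the input voltage grows.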
I appreciate the time you spent writing these useful posts and explaining things.
I was a bit confused as well, as I was thinking from a system design/cable sizing point of view.
I agree that with a purely resistive load like an incandescent lamp or heater, voltage and current are directly proportional, so the current increases with voltage and the power increases too.
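A quick worked check of that resistive-load point. The 60 W / 230 V bulb rating below is an illustrative assumption; in reality a filament's resistance also rises with temperature, so the actual increase would be somewhat smaller than this cold-resistance estimate.

```python
def power_at_voltage(rated_watts: float, rated_volts: float,
                     actual_volts: float) -> float:
    """Power drawn by a fixed resistance at a different supply voltage.

    R = V_rated^2 / P_rated (from the nameplate), then P = V^2 / R.
    """
    resistance = rated_volts ** 2 / rated_watts
    return actual_volts ** 2 / resistance

# A nominal 60 W bulb rated for 230 V, fed 240 V instead:
p = power_at_voltage(60.0, 230.0, 240.0)   # ~65.3 W: brighter, and hungrier
```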
But a few counterarguments:
If the lamp is glowing brighter than you need, consider using a lower-wattage lamp. I am not sure how many households still have incandescent lamps, though.
For the heater case, the water heats up more quickly and the thermostat cuts off sooner.
For a fan running faster, simply reduce the speed to your liking with the speed controls on the fan.
So the argument is: while a higher voltage can indeed mean a higher energy bill, how much of that increased power draw actually shows up in a real-life bill is questionable.
Let us also discuss the benefits of higher voltage:
Higher voltage means lower transmission losses, hence theoretically lower energy bills (if the energy company passes the savings on to its customers).
The other benefit is better voltage for rural customers and customers at the ends of the line.
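The transmission-loss point can be sketched with the I²R relation: for a given delivered power, current falls as voltage rises, and line loss falls with the square of the current. The 2 kW load and 0.5 Ω feeder resistance are illustrative assumptions.

```python
def line_loss(load_watts: float, volts: float, line_ohms: float) -> float:
    """I^2 * R loss in the feeder for a given delivered power and voltage.

    Sketch only: assumes a resistive feeder and ignores power factor.
    """
    current = load_watts / volts          # I = P / V
    return current ** 2 * line_ohms       # P_loss = I^2 * R

# Same 2 kW load over a 0.5-ohm feeder at two supply voltages:
loss_low = line_loss(2000.0, 110.0, 0.5)    # ~165 W lost in the wires
loss_high = line_loss(2000.0, 220.0, 0.5)   # ~41 W: doubling V quarters the loss
```

That quartering of losses is also why customers far down the line see less voltage sag when the feeder runs at a higher voltage.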