You've fallen victim to one of the first mistakes someone new to the computer world makes.
A computer will not use a single watt more power if driven by a 1200 watt power supply than it will if driven by a 300 watt power supply. The wattage of a power supply denotes its maximum capacity, not what it actually uses all the time. If your hardware needs 250 watts of power, it will get the same 250 watts no matter the capacity of the power supply.
As an example, my computer (the one in my sig) is currently in deep idle with all power-saving features maxed out, since I'm only browsing SweC at the moment. It's powered by a Corsair AX850, an 850 watt 80+ Gold certified power supply.
According to the watt meter I have in the power outlet it's only using a total of 135 watts on the primary side. Since it's an 80+ gold supply, the actual computer is barely using 120 watts (88% efficiency at <20% load).
I don't count the displays in that, of course (they're on another outlet as well), nor should I, since they're not driven by the power supply. I might also add that the computer is entirely silent: the only active fan at the moment is the side fan, since the power supply shuts off its own fan below 50% load.
What's more important is the quality of the power supply and whether it's 80+ certified or not. A non-80+ certified no-name power supply will usually land around 75% efficiency, making a 250 watt hardware draw (secondary side) use ~335 watts of actual power (primary side). The 85 watt difference becomes heat inside the power supply. An 80+ Gold certified power supply with the same 250 watt hardware draw will only draw 275 watts of actual power, so you save 60 REAL watts by having a better power supply.
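To make the arithmetic above concrete, here's a quick sketch of the calculation. The efficiency figures (75% for a no-name unit, ~91% for 80+ Gold at this load) are the assumed round numbers from the example, not measurements of any specific unit:

```python
# Primary-side (wall) draw is the secondary-side (hardware) draw
# divided by the supply's efficiency at that load.

def primary_draw(hardware_watts: float, efficiency: float) -> float:
    """Watts pulled from the wall for a given hardware load."""
    return hardware_watts / efficiency

load = 250.0                          # watts drawn by the hardware
noname = primary_draw(load, 0.75)     # assumed ~75% efficient no-name unit
gold = primary_draw(load, 0.91)       # assumed ~91% efficient 80+ Gold unit

print(f"no-name:  ~{noname:.0f} W from the wall")   # ~333 W
print(f"80+ Gold: ~{gold:.0f} W from the wall")     # ~275 W
print(f"saved:    ~{noname - gold:.0f} W")
```

The difference between the two wall-side figures is pure waste heat inside the cheaper supply, which is exactly the ~60 "real" watts saved.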
The difference will be exactly the same no matter if you're comparing a 450 watt, 850 watt or even 1200 watt 80+ Gold power supply. If the hardware draws 250 watts, it'll still end up drawing somewhere around 275 watts from the power outlet. The difference is of course that the more powerful supply can handle a lot more, doesn't run as hot (has better cooling) and has heavier-duty parts. In essence, it'll last longer and leave you more headroom for future upgrades.
Also, running a low-power system on a high-capacity power supply typically just means the supply is basically idling. You're so far below its capacity that it can in some cases shut off its fan, or at least drop it to its lowest speed, since it isn't producing much heat. A 300 watt power supply delivering 250 watts, however, will be running at over 80% of capacity, run hot, and usually not live up to what people call "quiet".
In terms of efficiency it's actually the reverse of what you believe. If you have a computer which uses 200 watts of power, a likely scenario for a low-power PC, you'd typically want a 400 watt power supply, since power supplies hit their peak efficiency at around 50% load. That said, the differences in efficiency between load levels for an 80+ certified power supply are usually very slim, and even at 20% load you'll rarely see a great dip.
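As a toy illustration of that load/efficiency point: the curve below is an assumed, simplified shape (peaking near 50% load, dipping slightly toward the extremes, roughly matching 80+ Gold's 87%/90%/87% checkpoints), not measured data for any real supply:

```python
# Assumed 80+ Gold-like efficiency curve: a simple parabola
# peaking at ~90% efficiency at 50% load.
def efficiency(load_fraction: float) -> float:
    return 0.90 - 0.35 * (load_fraction - 0.50) ** 2

hardware = 200.0  # watts drawn by the hardware

# Same hardware on a "right-sized" 400 W unit vs an oversized 850 W unit.
for capacity in (400.0, 850.0):
    frac = hardware / capacity
    eff = efficiency(frac)
    print(f"{capacity:.0f} W unit: {frac:.0%} load, "
          f"~{eff:.1%} efficient, ~{hardware / eff:.0f} W from the wall")
```

With this curve the 400 watt unit runs at exactly 50% load and peak efficiency, while the 850 watt unit sits around 24% load at only slightly lower efficiency, so the wall draw differs by just a handful of watts, which is the "very slim" difference described above.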
In other words, if there is no difference in price between a 300 watt and a 400 watt power supply, there's no reason for the 300 watt power supply to exist.
No reason whatsoever.