Power (measured in watts) is the product of voltage and current (how much electricity flows through the wires).
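That formula is easy to play with in a couple of lines; this is just a sketch with made-up example numbers, not anything specific to your power supply:

```python
# Power (watts) = voltage (volts) x current (amps)
def power_watts(volts, amps):
    return volts * amps

# Example: a 12v rail delivering 10 amps carries 120 watts
print(power_watts(12, 10))  # 120
```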
The power supply in your computer takes the 220v AC from the wall (European countries have standardized on 230v +/- 10%) and converts it internally to around 400v DC. From there, the power supply produces the voltages the computer needs (12v DC, 5v DC, 3.3v DC), each in the amount the parts need at that moment in time... the power supply only pulls as much power from the mains socket as the components require at that moment.
If a power supply has 500w written on it, that doesn't mean your computer consumes 500 watts all the time. It simply means that when components inside your PC demand power, the power supply can deliver it, up to a maximum of 500 watts.
So for example, if your CPU wants 100w while you're gaming, your PSU can produce those 100 watts and send them to the CPU. If your video card wants 200w, it can handle that too, because 100w + 200w is less than 500w. If you add a 2nd video card which wants 300w, then your power supply can no longer keep up and will most likely shut off, because 100w + 200w + 300w is more than 500w and the power supply can only produce up to 500w.
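The bookkeeping above can be sketched like this (the component wattages are just the example numbers from the text, real parts vary):

```python
PSU_RATING_W = 500  # the number printed on the power supply

def psu_can_handle(component_draws_w):
    """Return True if the PSU can supply the total demand at once."""
    return sum(component_draws_w) <= PSU_RATING_W

print(psu_can_handle([100, 200]))       # CPU + one video card -> True
print(psu_can_handle([100, 200, 300]))  # add a 2nd card -> False, PSU shuts off
```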
Your CPU and your video card constantly adjust their frequencies and shut down or slow down parts of themselves to save power when there's no work for those parts, which is why your computer won't draw the same amount of power all the time. When you're watching a YouTube video, only a tiny part of your video card is busy displaying it, so the card uses little power. When you're gaming, the video card goes all out.
This next part may not matter to you, but maybe others will appreciate it.
When the power supply takes the 220v AC (or 110v AC in the US and other such countries), converts it to around 400v DC, and then converts that again to the smaller voltages the parts want... these conversions aren't perfect. There are some losses, which turn into heat, and that's why the power supply has fans.
What I'm trying to say is that these conversion losses can be up to 10-15%. So for example, if the components inside your computer consume 300 watts while you're gaming, your power supply may need to pull around 350 watts from the mains socket in order to "create" those 300 watts for the parts in your computer.
If your power supply says 500w on it and the parts actually draw that much, the power supply may need to pull around 600 watts from the mains socket in order to produce those three voltages and deliver all 500w to the parts in your computer.
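As a rough sketch, if you know the efficiency you can estimate the wall draw. The ~85% figure here just matches the example numbers above; real units publish their own efficiency curves:

```python
def wall_draw_w(component_draw_w, efficiency=0.85):
    """Watts pulled from the mains socket to deliver component_draw_w to the parts."""
    return component_draw_w / efficiency

print(round(wall_draw_w(300)))  # ~353 watts from the wall for 300w of parts
print(round(wall_draw_w(500)))  # ~588 watts, close to the 600 mentioned above
```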
In most 230v +/- 10% countries, the mains sockets are protected by 16 or 20A fuses and the wiring is thick enough to handle that... that's 230v x 16A = ~3680 watts, so a 500-600w draw is not a big deal. It starts to matter if you have 4-5 computers and plug them all into one extension cord.
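You can sanity-check a circuit the same way; the 16A fuse and the per-PC wall draw below are just the example numbers from this answer:

```python
MAINS_VOLTS = 230
FUSE_AMPS = 16
circuit_capacity_w = MAINS_VOLTS * FUSE_AMPS  # 3680 watts on one circuit

pcs_on_extension_cord = [600, 600, 600, 600, 600]  # five PCs at full load
total = sum(pcs_on_extension_cord)
print(total, "of", circuit_capacity_w)  # 3000 of 3680 - fits, but little headroom left
```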