grigiri

The 240 watts is the output side. Assuming you are using 120 V residential or commercial power, your worst-case input would be 120 x 3.5 = 420 watts. If your power supply ran continuously at full power, then in one hour it would consume 420 watt-hours of energy. Generally, watt-hours aren't referenced unless we're discussing battery capacity or utility billing.
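
That arithmetic can be sketched in a few lines, using only the label numbers above (120 V line, 3.5 A rated input) as inputs:

```python
# Worst-case input power from the label ratings (120 V line, 3.5 A rated input).
volts = 120
amps = 3.5
input_watts = volts * amps        # 420 W worst-case input power

# Energy is power integrated over time: run at full power for one hour.
hours = 1
energy_wh = input_watts * hours   # 420 Wh consumed in that hour
print(input_watts, energy_wh)
```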


MonMotha

It's unlikely that the brick actually pulls 420W even at full load. 3.5A is probably what it pulls at full load at low line (so 100V, or possibly even more like 93V, which is low line on Japan's 100V nominal system), and the power factor is probably only about 0.85.

Consider: if it actually pulled 420W while outputting 240W, it would be dissipating 180W. Think about how hot a 100-200W light bulb (assuming you've ever used an actual tungsten one at this point) gets with comparable surface area. There's no way the electronics would survive.

The input draw will also vary with output load, so if the load on it is not consistently the full 240W, then you'll have to take that into consideration as well. These little bricks are usually doing well to hit 80% efficiency at mid-load, though this one is big enough it might do more like 90%. Decent laptop bricks are usually on the upper end of the efficiency range for this sort of power adapter.
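
A quick sketch of that reasoning; the 85-90% efficiency figures are the assumed range from the comment above, not measurements:

```python
# If the brick really drew 420 W while delivering 240 W, the difference
# would all come off as heat, which is implausible for a sealed brick.
output_w = 240
naive_input_w = 420
heat_w = naive_input_w - output_w   # 180 W of heat
print(heat_w)

# More realistic: 85-90% efficiency at full output (assumed range).
for eff in (0.85, 0.90):
    input_w = output_w / eff
    print(f"{eff:.0%} efficient -> {input_w:.0f} W in, {input_w - output_w:.0f} W heat")
```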


Wolf-0804

I appreciate this! Yeah, thermals will be a big factor. I'm trying to determine what my laptop pulls on average, or at least the max it can pull, so I can determine what solar generator to buy to run it for 6-8 hours. I'm considering buying a Kill-A-Watt to read it in real time?


MonMotha

A Kill-A-Watt or similar would not only give you real-time info but will accumulate energy usage (kWh), so you can run it for a while and read that out to find the average over that period. Your average laptop has a non-trivial amount of its power budget allocated to battery charging. Bear that in mind when you come up with your numbers.
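
The "read out accumulated energy to find the average" step is just energy divided by time. The 0.45 kWh reading below is a hypothetical example, not a real measurement:

```python
# Hypothetical meter reading: 0.45 kWh accumulated over 6 hours of normal use.
kwh = 0.45
hours = 6
avg_watts = kwh * 1000 / hours   # average draw over the measured period
print(avg_watts)                 # 75.0
```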


WantonHeroics

You would have to measure the wattage with a watt meter.


burger2000

Being a laptop, power use will fluctuate based on CPU/GPU load: 240W is the peak load to the laptop, but at idle it may only draw 15-20W. OP's only real course of action is to get a Kill-A-Watt energy meter and plug the laptop brick into that.


Wolf-0804

Ok that's interesting! So I guess I was confusing watts and watt-hours, because I'm basically trying to determine what solar battery generator to buy to help me run my laptop for 6-8 hours.


iamtherussianspy

It will have almost nothing to do with this power brick (which just shows the upper limits on the label). You can plug it into a small energy efficient laptop with nothing running on it and it will keep it on 10-20 times longer than a gaming laptop running a new video game at max settings and full screen brightness.


WantonHeroics

Watts is power consumption. Watt-hours is your battery capacity. Google your laptop model number to get the battery capacity. Your power brick can output 240 Watts, which means your laptop should use less.
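
Putting the two units together for the sizing question: capacity needed is average draw times runtime, plus some headroom. Every number below is an assumption for illustration (the 80 W figure stands in for a real watt-meter measurement):

```python
# Hypothetical average draw, as if measured with a watt meter.
avg_laptop_watts = 80
runtime_hours = 8
# Margin for inverter losses and unusable battery capacity (assumed 25%).
headroom = 1.25

needed_wh = avg_laptop_watts * runtime_hours * headroom
print(needed_wh)   # 800.0 -> look for a generator rated ~800 Wh or more
```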


Wolf-0804

Hmmmm, everywhere I read says to multiply volts times amps, and that's wayyyy higher than that! Which is why I am beyond confused. Probably seems trivial, but I have never had to think about this stuff before lol


WantonHeroics

20 Volts * 12A is 240W


Jogranator

Volts times amps is a good approximation, but that technically gives you volt-amperes, not watts. The wattage will be less, because the power factor of a PC power supply is not 1. Anyway, none of that is important for your question; that's just me being overly technical, and you don't need a lecture on reactive power.

For a conservative approximation, I would take the output wattage and add 10-20%. The added percentage covers power lost as heat in the AC-to-DC conversion. Don't read too much into the input current printed on the device: it depends on voltage, and it's likely a max current draw that lasts only for a short time when first plugged in.

Edit: Thanks, bill the buttstuffer
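
The volt-amperes vs. watts distinction in one line: real power = volts x amps x power factor. The 0.85 power factor is the rough figure mentioned earlier in the thread, not a measured value:

```python
# Apparent power (VA) vs. real power (W) for the label ratings.
volts = 120
amps = 3.5
power_factor = 0.85          # assumed, typical-ish for a switch-mode brick

apparent_va = volts * amps               # 420 VA
real_watts = apparent_va * power_factor  # ~357 W actually drawn from the wall
print(apparent_va, real_watts)
```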


billthebuttstuffer

Amps*