
How much Power does a Comp' use?

Discussion in 'Windows XP' started by Eddie, Jun 7, 2010.

  1. Eddie

    Eddie Flightless Bird

    Hi,

    Need to settle an argument twixt buddy and myself.

    I have this computer and another with XP on it; the XP comp has been
    left on for months and months, and I go over to use it often.
    My buddy says I should switch it off as it is sucking electricity, but I
    said that it would be so minimal as to not fuss about it... we are
    both stubborn and won't change points of view.
    So, in the case of a comp' with a 350-watt power supply, just how much
    power is my computer actually drawing, and would it affect my electricity
    bill to the extent that he claims?

    I used Everest to look but couldn't find anything useful in there.

    Ed
     
  2. Paul

    Paul Flightless Bird

    Eddie wrote:
    > Hi,,
    >
    > Need to settle an arguement twixt buddy and myself.
    >
    > I have this computer and another with xp on it; the xp comp has been
    > left on for months and months, and I go over to use it often.
    > My buddy says I should switch it off as it is sucking electricity, but I
    > said that it would be so minimal as to not fuss about it... we are
    > both stubborn and wont change points of view.
    > So, in the case of a comp' with a 350watt power-supply, just how much
    > power is my computer actually drawing and would it affect my electricity
    > bill to the extent that he claims?
    >
    > I used Everest to look but couldnt find anything useful in there.
    >
    > Ed


    One of these will help answer the question.

    "Kill A Watt meter"
    http://www.p3international.com/products/special/P4400/P4400-CE.html

    It plugs into the wall, then the computer plugs into the outlet
    on the front. That will settle the argument about how much power,
    once and for all.

    *******

    Or you can estimate the power, by making a list of the components, and
    using known figures for the components. But the results will be in
    error by a bit (on the high side).

    There is a fair range of power consumption, depending on whether the
    computer has fancy video cards and a hot processor in it or not. So
    not all computers have exactly the same idle power consumption.
    For example, the Prescott processors from Intel used to waste about
    25% of their electricity as leakage current: heat that wasn't
    doing any useful work.

    The idle consumption could be 60 watts up to perhaps 150 watts,
    depending on the vintage of the computer (those are numbers I've seen here).
    Some older computers will be drawing more power than the new ones,
    even though the new computer computes faster.

    *******

    These are some gamer computer results.

    http://www.anandtech.com/show/2977/...x-470-6-months-late-was-it-worth-the-wait-/19

    Idle power is 160 to 260 watts.

    Load power is 220 to 680 watts, playing Crysis.

    This article covers low-end systems, used for things like perhaps
    an HTPC. These use graphics (GPU) on the motherboard, rather than a hot
    video card.

    http://www.anandtech.com/show/2505/3

    Idle power is 59 to 79 watts (using weaker processor choices)

    http://www.anandtech.com/show/2505/4

    Maybe 78 to 115 watts playing a movie.

    The computer without the separate video card draws less power
    when playing a movie than the gamer computer draws doing
    absolutely nothing.

    So there are some examples for you.

    The size of the power supply has nothing to do with it.
    You can buy a 700W power supply if you want, but if the
    hardware draws 60W, then it's using pretty close to 60W
    from the wall. Power supplies have an efficiency figure,
    which determines how much waste heat they kick out,
    and that waste heat is a function of the computer load inside.
    The 700W rating is the absolute maximum power you can draw from
    it before it shuts down on its own. The 60W figure is what
    it draws from the wall when the computer is idling. Whether you
    had a 350W or a 700W supply wouldn't radically affect the 60W
    figure. It's still going to be in that ballpark. The very
    best power supplies now can manage about 87% efficiency when
    converting electricity.

    http://images17.newegg.com/is/image/newegg/17-194-058-Z05?$S640W$

    60 watts internal load
    --------------------- = 69 watts
    0.87 efficiency

    In that example, the power supply delivers 60W to the load, and
    kicks out 9 watts of heat from the power supply fan hole. The total
    heat exhausted is 69W, 9W coming from the PSU hole and 60W coming
    from the other ventilation fans.

    Not that many years ago, power supplies were 68% efficient. (The
    68% efficient ones don't state the efficiency on the label, which
    is how you guess at it.) For the same 60W of internal component
    loading...

    60 watts internal load
    --------------------- = 88 watts
    0.68 efficiency

    Spending a couple hundred dollars on a new power supply to save
    19 watts of electricity has a pretty long payback period. There
    are some cheaper ones that manage 80% efficiency, which might make
    a more reasonable choice.
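
    If you'd rather not do the division by hand, the efficiency arithmetic
    above can be sketched in a few lines (Python here just for illustration;
    the function names are mine, not from any tool in this thread). It
    reproduces the 69 W and 88 W figures worked out above:

```python
# A rough sketch (formulas only, not a measurement): wall draw and
# PSU waste heat for a given internal DC load and supply efficiency.

def wall_draw(load_watts, efficiency):
    """Watts pulled from the wall to deliver load_watts to the components."""
    return load_watts / efficiency

def psu_waste_heat(load_watts, efficiency):
    """Watts dissipated as heat inside the power supply itself."""
    return wall_draw(load_watts, efficiency) - load_watts

for eff in (0.87, 0.80, 0.68):
    print(f"{eff:.0%} efficient, 60 W load: "
          f"{wall_draw(60, eff):.0f} W from the wall, "
          f"{psu_waste_heat(60, eff):.0f} W of PSU heat")
```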

    The savings go up if you're a high-end gamer, since there the waste
    is large, and the power supply heat can be large as well, especially
    when the computer is drawing that 680-watt number. But not too
    many people can afford the video cards that draw that kind of load.
    My crappy video card draws maybe 48.4 watts flat out, 22.6 watts
    idling (like, while I'm typing this). This site has measured all
    sorts of video cards. This is a small sampling.

    http://www.xbitlabs.com/articles/video/display/powercolor-x1900gt_5.html#sect0

    The Kill A Watt meter will give an exact answer. If you're lucky,
    perhaps you can borrow one from somewhere, long enough to measure it.

    http://library.bowdoin.edu/news/kill-a-watt.shtml

    "By presenting your Bowdoin ID Card, students, faculty, and staff
    can check out Kill-A-Watt meters (and instructions) at both H&L
    and Hatch Libraries for up to four days"

    It's amazing the things they have at libraries these days.

    Have fun,
    Paul
     
  3. tango@noplace.com

    tango@noplace.com Flightless Bird

    On Tue, 08 Jun 2010 03:04:00 -0400, Paul <nospam@needed.com> wrote:

    >The Kill A Watt meter will give an exact answer. If you're lucky,
    >perhaps you can borrow one from somewhere, long enough to measure it.
    >
    >http://library.bowdoin.edu/news/kill-a-watt.shtml
    >
    > "By presenting your Bowdoin ID Card, students, faculty, and staff
    > can check out Kill-A-Watt meters (and instructions) at both H&L
    > and Hatch Libraries for up to four days"
    >
    >It's amazing the things they have at libraries these days.
    >
    >Have fun,
    > Paul


    Having been an electrician before I retired, I'd go along with what
    you're saying. I don't think the average computer would exceed 100
    watts from the outlet, and likely less most of the time. I do believe
    the computers from before 1990 used more power than those after '90,
    but now the newest computers, such as the dual-core and quad-core
    ones, use more power than any of the older ones. Just look at how much
    cooling they need and that's the answer. Most power is lost as heat.

    Of course we are just talking about the computer, not the monitor. The
    monitor is plugged into an outlet separately. The monitor can use MORE
    power than the computer itself, particularly the CRT monitors. Then
    add to that the printer. If the printer is left on, it uses a little
    too, but if it's a laser printer it can actually use a lot of power,
    since there is a heating element that fuses the toner to the paper,
    and that heating element sucks lots of power. I don't really know if
    that heating element is always on, or just turns on before a print
    job, but they do suck power, just like a photocopier. I once had to
    install a dedicated 20-amp outlet in an office for a photocopier
    because it drew that much power and was knocking out the breakers all
    the time. That means it was using upward of 2000 watts. Laser
    computer printers are generally smaller, but they are still a large
    draw. (Of course it's simple to switch them off; how often do you use
    a printer?)

    Anyhow, to save power I'd suggest shutting off at least the monitor
    when not in use, and always keeping the printer turned off except when
    you do print jobs. Why waste money and energy?

    Personally, I keep my computer on when I plan to use it in the next
    two or three hours or less. But I generally switch off my monitor
    whenever I walk away from the computer, unless I'm just going to the
    bathroom or to grab a bite to eat. My printer is always off. I
    probably use it twice a month, and only turn it on when I intend to
    print something. Before bed, I ALWAYS shut off the whole system
    unless I'm downloading a huge file.

    This is just a rough guess, but a computer and monitor left on are
    likely using at minimum 100W (probably more). 100W x 24 hours is 2400
    watt-hours, roughly 2.4 kWh. If you pay $0.14 per kWh, that's about
    34 cents per day. Multiply that by 30 days and you spend about $10 per
    month. However, I'd estimate the cost to be more like $15 to $20 per
    month because I think most newer computers use more power than 100W
    (with monitor, but NOT printer).

    So, if it's $15 a month and you shut it off half the day, you save
    $7.50, which in a year would be a savings of $90. I don't know about
    you, but I don't have $90 to throw away.

    Another thing. If you run air conditioning, the extra heat from the
    computer makes the AC work harder, and that means more electricity
    too. Now, in winter, the computer cuts down on the heating bill
    slightly, so in that case it makes more sense to leave the computer on
    in winter.

    One last thing. Hard drives have bearings. The longer they spin, the
    more they wear. The same goes for the fan motors in your power supply
    and on your CPU. Why wear out your computer when it's not in use? Finally,
    ALWAYS turn it off during storms. Lightning will wipe out a computer
    in a second. And if you don't have AC, the comp can overheat in a
    hot house in summer. Even the best CPU fan can't keep up if the house
    is close to 100 degrees, and on hot days that happens in homes without
    AC.

    One of these days, I'm going to get one of those Kill A Watt meters.
     
  4. Doum

    Doum Flightless Bird

    Eddie <albert@greenacres.far> wrote in news:-OCC1yFtBLHA.4388
    @TK2MSFTNGP04.phx.gbl:

    > Hi,,
    >
    > Need to settle an arguement twixt buddy and myself.
    >
    > I have this computer and another with xp on it; the xp comp has been
    > left on for months and months, and I go over to use it often.
    > My buddy says I should switch it off as it is sucking electricity, but I
    > said that it would be so minimal as to not fuss about it... we are
    > both stubborn and wont change points of view.
    > So, in the case of a comp' with a 350watt power-supply, just how much
    > power is my computer actually drawing and would it affect my electricity
    > bill to the extent that he claims?
    >
    > I used Everest to look but couldnt find anything useful in there.
    >
    > Ed


    My Core2Quad PC is powered through an APC UPS. That computer has 6 hard
    drives and 8 GB of RAM; there is also a 24" LCD monitor, an external
    audio interface, and powered speakers connected to the UPS.

    APC PowerChute software claims that the devices connected to the UPS
    consume between 222 and 228 watts.

    HTH
     
  5. Leythos

    Leythos Flightless Bird

    In article <OCC1yFtBLHA.4388@TK2MSFTNGP04.phx.gbl>,
    albert@greenacres.far says...
    >
    > Hi,,
    >
    > Need to settle an arguement twixt buddy and myself.
    >
    > I have this computer and another with xp on it; the xp comp has been
    > left on for months and months, and I go over to use it often.
    > My buddy says I should switch it off as it is sucking electricity, but I
    > said that it would be so minimal as to not fuss about it... we are
    > both stubborn and wont change points of view.
    > So, in the case of a comp' with a 350watt power-supply, just how much
    > power is my computer actually drawing and would it affect my electricity
    > bill to the extent that he claims?
    >
    > I used Everest to look but couldnt find anything useful in there.
    >
    > Ed


    Your battery backup (UPS) should be able to tell you how many watts your
    computer is drawing - in my case, I draw about 300W all the time, but I
    have dual drives, dual 24" LCD monitors, etc...

    When I look at my servers, I have an APC SU2200 UPS; it shows that two
    of them, their network switch, firewall, and tape drive, draw about 480W
    constantly - that's about $40/month in electric costs.

    --
    You can't trust your best friends, your five senses, only the little
    voice inside you that most civilians don't even hear -- Listen to that.
    Trust yourself.
    spam999free@rrohio.com (remove 999 for proper email address)
     
  6. John John - MVP

    John John - MVP Flightless Bird

    Doum wrote:
    > Eddie <albert@greenacres.far> écrivait news:-OCC1yFtBLHA.4388
    > @TK2MSFTNGP04.phx.gbl:
    >
    >> Hi,,
    >>
    >> Need to settle an arguement twixt buddy and myself.
    >>
    >> I have this computer and another with xp on it; the xp comp has been
    >> left on for months and months, and I go over to use it often.
    >> My buddy says I should switch it off as it is sucking electricity, but I
    >> said that it would be so minimal as to not fuss about it... we are
    >> both stubborn and wont change points of view.
    >> So, in the case of a comp' with a 350watt power-supply, just how much
    >> power is my computer actually drawing and would it affect my electricity
    >> bill to the extent that he claims?
    >>
    >> I used Everest to look but couldnt find anything useful in there.
    >>
    >> Ed

    >
    > My Core2Quad PC is powered through an APC ups. That computer has 6 hard
    > drives, 8 GB ram, there is also a 24" LCD monitor, an external audio
    > interface and powered speakers connected to the ups.
    >
    > APC PowerChute software claims that the devices connected to the ups
    > consume between 222 and 228 watts.


    Which translates to approximately 5.5 kilowatt-hours per day if the
    computer runs 24/7. At 10¢/kWh that is 55 cents a day, at 15¢/kWh that
    is about 83 cents per day. Per month $16.50 to $24.75.

    John
     
  7. Eddie

    Eddie Flightless Bird

    Think I'll turn it off then.

    To all repliers, thanks.

    The comp' in question is an AMD 2.4GHz with 1 GB of RAM and an average-
    to-good video card; it also has a CRT monitor. Sounds like I might be
    wasting quite a few bucks.
    The monies that you guys were quoting, was that in US $$'s? Were any in
    AU $$'s? (Would give me a rough idea how much I've been paying.)

    Ed
     
  8. No Mo

    No Mo Flightless Bird

    Re: Think I'll turn it off then.

    And, since you changed the subject line and didn't quote any of your
    previous message, who knows what you're talking about.

    "Eddie" <albert@greenacres.far> wrote in message
    news:-OHwEbywBLHA.5464@TK2MSFTNGP05.phx.gbl...
    : To all repiers, Thanks.
    :
    : The comp' in question is an AMD2.4g with 1gig Ram and an average to good
    : video card,, it also has a CRT monitor. Sounds like I might be
    : wasting quite a few bucks.
    : The monies that you guys were quoting, was that in US $$'s? were any in
    : AU $$'s? (would give me a rough idea how much I been paying.)
    :
    : Ed
     
  9. John John - MVP

    John John - MVP Flightless Bird

    Re: Think I'll turn it off then.

    A kilowatt-hour is a kilowatt-hour regardless of where you are in the
    world. Just look at your power bill and you will see your cost per kWh;
    add applicable taxes, if any, and just do the math.

    1 kWh = using 1000 watts for 1 hour.
    Using ten 100-watt light bulbs for 1 hour = 1 kWh.


    Computer usage 24 hours/day, monthly cost:

    Assuming 300 watt power draw:

    (300 watts x 24 hours x 30 days)/1000 = 216 kWh

    Where I live residential power is about 10 cents/kWh + 15% tax which
    equals about 11.5 cents per kWh. Running the above 300w computer would
    cost me about $24.84/month, if I turn it off 12 hours/day I would save
    about $12.42/month... or $149.04/year.

    Of course, if you use your power saving options to turn off the monitor
    or other components in your computer, your usage won't continuously be
    300 watts. Also keep in mind that many utilities have staggered
    power rates; the first x kWh might cost more than the next x kWh.
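
    For anyone who wants to plug in their own wattage and rates, the
    arithmetic above can be sketched in a few lines of Python (a throwaway
    helper I made up, not from any utility or tool mentioned in this
    thread). It reproduces the $24.84 and $12.42 figures above:

```python
def monthly_cost(watts, hours_per_day, cents_per_kwh, tax_rate=0.0, days=30):
    """Dollars per month for a constant draw at the given electricity rate."""
    kwh = watts * hours_per_day * days / 1000.0      # energy used per month
    return kwh * cents_per_kwh / 100.0 * (1.0 + tax_rate)

# The 300 W example: 10 cents/kWh plus 15% tax, running 24 h vs 12 h a day.
always_on = monthly_cost(300, 24, 10, tax_rate=0.15)
half_day = monthly_cost(300, 12, 10, tax_rate=0.15)
print(f"24/7: ${always_on:.2f}/month, 12 h/day: ${half_day:.2f}/month")
```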

    John

    Eddie wrote:
    > To all repiers, Thanks.
    >
    > The comp' in question is an AMD2.4g with 1gig Ram and an average to good
    > video card,, it also has a CRT monitor. Sounds like I might be
    > wasting quite a few bucks.
    > The monies that you guys were quoting, was that in US $$'s? were any in
    > AU $$'s? (would give me a rough idea how much I been paying.)
    >
    > Ed
     
  10. Jim

    Jim Flightless Bird

    On Tue, 08 Jun 2010 15:42:16 +0930, Eddie <albert@greenacres.far>
    wrote:

    >Hi,,
    >
    >Need to settle an arguement twixt buddy and myself.
    >
    >I have this computer and another with xp on it; the xp comp has been
    >left on for months and months, and I go over to use it often.
    >My buddy says I should switch it off as it is sucking electricity, but I
    >said that it would be so minimal as to not fuss about it... we are
    >both stubborn and wont change points of view.
    >So, in the case of a comp' with a 350watt power-supply, just how much
    >power is my computer actually drawing and would it affect my electricity
    >bill to the extent that he claims?
    >
    >I used Everest to look but couldnt find anything useful in there.
    >
    >Ed



    So long as they both work, why worry?
     
  11. Eddie

    Eddie Flightless Bird

    Re: Think I'll turn it off then.

    John John - MVP wrote:
    > A kilowatt-hour is a kilowatt-hour regardless of where you are in the
    > world. Just look at your power bill and you will see your cost per kWh,
    > add applicable taxes, if any, just do the math.
    >
    > 1 kWh = using 1000 watts for 1 hour.
    > Using 10 100 watt light bulbs for 1 hour = 1 kWh.
    >
    >
    > Computer usage 24 hours/day, monthly cost:
    >
    > Assuming 300 watt power draw:
    >
    > (300 watts x 24 hours x 30 days)/1000 = 216 kWh
    >
    > Where I live residential power is about 10 cents/kWh + 15% tax which
    > equals about 11.5 cents per kWh. Running the above 300w computer would
    > cost me about $24.84/month, if I turn it off 12 hours/day I would save
    > about $12.42/month... or $149.04/year.
    >
    > Of course, if you use your power saving options to turn off the monitor
    > or other components in your computer your usage won't continuously be
    > 300 watts/hr. Also keep in mind that many utilities have staggered
    > power rates, the first x kWh might cost more than the next x kWh.
    >
    > John
    >
    > Eddie wrote:
    >> To all repiers, Thanks.
    >>
    >> The comp' in question is an AMD2.4g with 1gig Ram and an average to
    >> good video card,, it also has a CRT monitor. Sounds like I might be
    >> wasting quite a few bucks.
    >> The monies that you guys were quoting, was that in US $$'s? were any
    >> in AU $$'s? (would give me a rough idea how much I been paying.)
    >>
    >> Ed




    Thanks, John. I'll check my next bill using your info as a guide.

    Ed

    PS to No Mo: sorry about changing the subject line.
     
  12. Bill in Co.

    Bill in Co. Flightless Bird

    Re: Think I'll turn it off then.

    Also, if you're reading the computer power supply specs to get that figure
    of 300W, that only means it is capable of supplying that amount of power
    under full rated load. And I believe a computer normally draws much less
    power than that.

    John John - MVP wrote:
    > A kilowatt-hour is a kilowatt-hour regardless of where you are in the
    > world. Just look at your power bill and you will see your cost per kWh,
    > add applicable taxes, if any, just do the math.
    >
    > 1 kWh = using 1000 watts for 1 hour.
    > Using 10 100 watt light bulbs for 1 hour = 1 kWh.
    >
    >
    > Computer usage 24 hours/day, monthly cost:
    >
    > Assuming 300 watt power draw:
    >
    > (300 watts x 24 hours x 30 days)/1000 = 216 kWh
    >
    > Where I live residential power is about 10 cents/kWh + 15% tax which
    > equals about 11.5 cents per kWh. Running the above 300w computer would
    > cost me about $24.84/month, if I turn it off 12 hours/day I would save
    > about $12.42/month... or $149.04/year.
    >
    > Of course, if you use your power saving options to turn off the monitor
    > or other components in your computer your usage won't continuously be
    > 300 watts/hr. Also keep in mind that many utilities have staggered
    > power rates, the first x kWh might cost more than the next x kWh.
    >
    > John
    >
    > Eddie wrote:
    >> To all repiers, Thanks.
    >>
    >> The comp' in question is an AMD2.4g with 1gig Ram and an average to good
    >> video card,, it also has a CRT monitor. Sounds like I might be
    >> wasting quite a few bucks.
    >> The monies that you guys were quoting, was that in US $$'s? were any in
    >> AU $$'s? (would give me a rough idea how much I been paying.)
    >>
    >> Ed
     
  13. Andy

    Andy Flightless Bird

    Re: Think I'll turn it off then.

    You are correct; it is the peak power output that the maker lists.
    :)
    AL'S COMPUTERs

    "Bill in Co." <not_really_here@earthlink.net> wrote in message
    news:-OWp6E9xCLHA.980@TK2MSFTNGP04.phx.gbl...
    > Also, if you're reading the computer power supply specs to get that figure
    > of 300W, that only means it is capable of supplying that amount of power
    > under full rated load. And I believe a computer normally draws much
    > less
    > power than that.
    >
    > John John - MVP wrote:
    >> A kilowatt-hour is a kilowatt-hour regardless of where you are in the
    >> world. Just look at your power bill and you will see your cost per kWh,
    >> add applicable taxes, if any, just do the math.
    >>
    >> 1 kWh = using 1000 watts for 1 hour.
    >> Using 10 100 watt light bulbs for 1 hour = 1 kWh.
    >>
    >>
    >> Computer usage 24 hours/day, monthly cost:
    >>
    >> Assuming 300 watt power draw:
    >>
    >> (300 watts x 24 hours x 30 days)/1000 = 216 kWh
    >>
    >> Where I live residential power is about 10 cents/kWh + 15% tax which
    >> equals about 11.5 cents per kWh. Running the above 300w computer would
    >> cost me about $24.84/month, if I turn it off 12 hours/day I would save
    >> about $12.42/month... or $149.04/year.
    >>
    >> Of course, if you use your power saving options to turn off the monitor
    >> or other components in your computer your usage won't continuously be
    >> 300 watts/hr. Also keep in mind that many utilities have staggered
    >> power rates, the first x kWh might cost more than the next x kWh.
    >>
    >> John
    >>
    >> Eddie wrote:
    >>> To all repiers, Thanks.
    >>>
    >>> The comp' in question is an AMD2.4g with 1gig Ram and an average to good
    >>> video card,, it also has a CRT monitor. Sounds like I might be
    >>> wasting quite a few bucks.
    >>> The monies that you guys were quoting, was that in US $$'s? were any in
    >>> AU $$'s? (would give me a rough idea how much I been paying.)
    >>>
    >>> Ed

    >
    >
     
