How many watts does a TV use?

By Jon Franke, Content Marketing Manager
December 3rd, 2025

Most modern LED TVs use between 50 and 200 watts (W) of electricity, depending on their size, display type, and features. While that might not seem like much on its own, your TV’s power draw adds up over time, especially with bigger screens and more streaming hours. Understanding how much energy your TV uses helps you estimate its impact on your electric bill while identifying opportunities to save.

For homeowners considering solar energy, knowing your TV’s wattage is one small but important part of assessing solar readiness. In this guide, we’ll break down how to calculate your TV’s electricity use and cost, compare wattage by TV type, and even explore how a home solar system can offset your entertainment energy needs while reducing overall power costs.


What is wattage, and why does it matter? 

A TV’s wattage tells you how much electrical power it uses while running. You can usually find this number on a label on the back or bottom of the TV under “power input” or “power consumption,” or in the manufacturer’s manual.

Wattage is measured in, you guessed it, watts, which you calculate by multiplying voltage (V) by amperage (A): volts x amps = watts. A volt measures the electrical pressure pushing current through a circuit, while an amp measures the rate at which electricity flows through a wire.
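For example, a TV drawing roughly 1.25 amps from a standard 120-volt U.S. outlet would use about 120 V x 1.25 A = 150 W.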

How to calculate your TV’s electricity use

On your utility bill, you’ll notice you’re charged for kilowatt-hours (kWh). This number comes from multiplying wattage by the number of hours of use, which gives you watt-hours (Wh). A kilowatt-hour is simply 1,000 watt-hours, so you divide watt-hours by 1,000 to get kilowatt-hours (kWh).

So, to estimate how much your TV costs to run, just multiply its kWh by your local electricity rate.

For example: 100 watts x 10 hours = 1,000 Wh

1,000 Wh ÷ 1,000 = 1 kWh

1 kWh x $0.13* (the average U.S. electricity rate per kWh) = $0.13 per day

(*It’s actually $0.1294, but electricity rates vary widely, so be sure to use your own electricity rate here.)
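If you’d rather let a script do the math, here’s a minimal sketch of the same calculation in Python. The function name and the input values (100 W, 10 hours, $0.1294/kWh) are just illustrative; swap in your own TV’s wattage, viewing hours, and local rate.

```python
# Estimate what a TV costs to run, using the formula above:
# watts x hours = Wh, Wh / 1,000 = kWh, kWh x rate = cost.

def tv_cost_per_day(watts: float, hours_per_day: float, rate_per_kwh: float) -> float:
    """Return the estimated daily cost (in dollars) of running a TV."""
    kwh_per_day = watts * hours_per_day / 1000  # convert watt-hours to kilowatt-hours
    return kwh_per_day * rate_per_kwh

# Example: a 100 W TV watched 10 hours a day at the $0.1294/kWh average rate.
daily = tv_cost_per_day(watts=100, hours_per_day=10, rate_per_kwh=0.1294)
print(f"About ${daily:.2f} per day, or about ${daily * 365:.2f} per year")
```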

For a clearer picture of your TV’s electricity use, you can use a watt meter or a smart plug to monitor its actual power draw while it’s running. As you can see, the higher the wattage, the higher the electricity usage and, therefore, the higher the cost to power the TV. Understanding these numbers helps you see how your TV fits into your home’s overall energy footprint and potential solar offset.

It’s also important to note that many smart TVs use electricity even when they’re turned off.

Typical wattage by TV size and type (with estimated annual TV costs)

Several factors contribute to a TV’s wattage and operating costs, including the brand, age, type, and your usage habits, so one TV can be noticeably cheaper or pricier to run than another. While TV costs are modest compared to other appliances, such as an HVAC system, fridge, or space heater, they are still important to consider when calculating your home’s total energy usage.

The table below compares annual energy use and cost by TV size and type. These estimates were calculated using the current average U.S. utility rate ($0.1294 per kWh) and the EIA’s Appliance Energy Calculator, and they assume the TV is on for 2 hours per day, 365 days per year.

TV Type | Wattage | Annual Energy Use | Annual Cost
DLP | 175 W | 127.8 kWh | $16.53
ED/HD, <40” | 150 W | 109.5 kWh | $14.17
ED/HD, >40” | 234 W | 170.8 kWh | $22.10
LCD | 150 W | 109.5 kWh | $14.17
Plasma | 300 W | 219.0 kWh | $28.34
Set-top box | 20 W | 14.6 kWh | $1.89
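To see where these numbers come from, take the plasma row: 300 W x 2 hours x 365 days = 219,000 Wh, or 219 kWh per year, and 219 kWh x $0.1294 = $28.34.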

Factors that affect TV power consumption

Understanding what drives your TV’s energy use can help you choose the right model and optimize how you use it. Here are some of the main factors that influence electricity consumption.

Screen size & resolution

Larger screens and higher resolutions require more pixels — and more power. A 75-inch 4K TV, for instance, can use two to three times as much electricity as a smaller 40-inch HD model. The jump to 4K or 8K resolution means millions of additional pixels drawing energy to deliver sharper images.

TV type

Currently, the most energy-efficient TVs are LEDs, largely because of the efficient backlighting used to illuminate the pixels. Older TVs, such as plasmas, as well as newer technologies like OLED and QLED, tend to draw more watts due to their higher pixel counts and brightness output. Energy Star-rated TVs, which must meet stringent efficiency guidelines set by the U.S. Environmental Protection Agency (EPA) and Department of Energy (DOE), are at least 34% more efficient than non-rated models.

Settings & usage mode

Activating energy-saving or eco mode, reducing brightness, or turning off “game mode” and HDR will reduce the device’s power consumption and operating costs. And, as many parents will tell you, turning the TV off entirely and picking up a book or heading outside cuts electricity use even further.

Smart features & standby power

Smart TVs use power even when turned off to stay connected to Wi-Fi, apps, or voice assistants, especially when “standby mode” is on. This “phantom load” can add up across multiple devices. The easiest way to minimize standby power is to use a smart power strip that cuts power completely when the TV is off, or to unplug the TV when it’s not in use (assuming the outlet is reachable without too much crawling).
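Even a small standby draw adds up over a year. A TV pulling, say, 2 W around the clock uses 2 W x 24 hours x 365 days = 17,520 Wh, or about 17.5 kWh per year, which works out to roughly $2.27 at the average U.S. rate.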

Understanding your TV’s impact on home energy use

Although they are a small part of a home’s overall energy consumption, TVs still account for between 4% and 7% of a household’s yearly electricity use. Pair them with other devices, such as game consoles or streaming boxes, and your entertainment setup’s total draw climbs further. Fortunately, small adjustments like reducing standby “phantom” loads can make your setup more efficient.

When you know how your TV and other devices impact your electricity bill, you’re in a better position to find ways to save money on those energy costs. One big way to save is, of course, solar.

Curious how much of your home’s energy use solar could offset? Use Aurora’s free solar estimate tool to see personalized savings based on your actual energy habits.

Frequently asked questions

How many watts does a smart TV use on standby?

Energy Star-rated smart TVs on standby or passive mode use no more than 0.5 W. If you want to track your TV’s exact standby consumption, you can use a watt meter or a smart plug with energy monitoring. These tools show you real-time power usage, helping you identify phantom loads. 
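At 0.5 W, that standby draw amounts to about 0.5 W x 24 hours x 365 days = 4,380 Wh, or roughly 4.4 kWh per year, which costs well under a dollar at average U.S. rates.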

Will an OLED TV consume more energy than an LED?

Generally, yes. An OLED TV may consume more energy than an LED TV primarily due to the brighter screen and higher resolution. Brighter screens require more power to light up the pixels, while higher screen resolution requires more pixels to make the picture clearer. That said, there are ways to reduce energy usage for OLED TVs, such as enabling eco mode or an auto brightness limiter, which adjusts brightness automatically based on what’s shown on screen. 

Can I estimate my TV’s energy use without a watt meter?

Yes. You can estimate your TV’s energy use with a simple calculation, no watt meter required. You can usually find your TV’s wattage listed on the back or bottom under “power input” or “power consumption.” If not, this information should be readily available in the user manual or online. Once you have the wattage, you can use this calculation to determine the approximate kilowatt-hours (kWh), which is how the utility company bills for use:

Wattage x hours of use = watt-hours.
Then, watt-hours ÷ 1,000 = kWh.
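For example, a 150 W TV watched 3 hours a day uses 150 x 3 = 450 Wh, or 0.45 kWh, per day.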

How does TV usage affect my solar energy system?

While TVs use a relatively small share of household energy (typically 4–7% of total electricity), frequent use can slightly impact how much solar energy your home consumes versus exports. 

Does resolution (4K vs 8K) significantly increase power consumption?

Yes, the higher the resolution, the more energy will be used to power the TV. An 8K TV has four times the number of pixels as a 4K TV and uses more than double the electricity. All that extra pixel density requires more processing power and brighter backlighting. Backlight intensity is a significant factor in the power use of a high-resolution TV.
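For reference, a 4K panel packs roughly 8.3 million pixels, while an 8K panel has about 33.2 million.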
