Maximum Light Fixture Wattage

Talk Electrician Forum

vladfi

Member
Joined
Oct 23, 2023
Messages
11
Reaction score
2
I have a pendant light fixture which is rated at 80W, and I would like to understand why it is rated this way. From what I've read online, this is largely to prevent damage to the fixture from heat dissipated from the bulb. However, if the bulb is just hanging (see attached picture), isn't most of the heat dissipated into the air? And if I use a bulb splitter, I imagine the same will hold true. Is it safe in this case to go above the 80W rating?
 

Attachment: 20231128_181131.jpg
No.
Electricity always generates heat as it passes through a circuit:
higher wattage = more current,
more current = more heat.

Heat is produced in the cables, connectors and everywhere else that the electricity flows.

The heat you are referring to (the wasted energy from producing the light) is not the only heat that needs to be considered by manufacturers when they specify what their products are designed for.

The contacts inside the pendant connector itself will get warmer during use, and these are obviously enclosed to prevent people touching live parts, so they are not in free air.
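To put rough numbers on that contact heating, here is a minimal Python sketch, assuming 230 V mains and purely illustrative contact resistances (a healthy contact in the tens of milliohms versus a worn, oxidised one of around an ohm); the squared current term is why running above the rated wattage speeds up the deterioration:

```python
# Rough I^2 * R sketch of the heat generated inside a lamp-holder contact.
# The 230 V mains figure and both contact resistances are assumptions for
# illustration, not measured values.

MAINS_VOLTAGE = 230.0  # volts

def contact_heat_w(lamp_watts: float, contact_ohms: float) -> float:
    """Watts dissipated in the contact itself for a given lamp load."""
    current = lamp_watts / MAINS_VOLTAGE   # I = P / V
    return current ** 2 * contact_ohms     # P = I^2 * R

for lamp_watts in (80, 105, 150):
    for contact_ohms in (0.05, 1.0):       # healthy vs. worn/oxidised contact (assumed)
        heat_mw = contact_heat_w(lamp_watts, contact_ohms) * 1000
        print(f"{lamp_watts:>3} W lamp, {contact_ohms:.2f} ohm contact: "
              f"{heat_mw:.0f} mW heating the contact")
```

The heat in a good contact is tiny, but once the contact degrades it is concentrated in a very small enclosed volume, which is where the long-term damage tends to come from.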
 
Just to add to the information above: I have seen and dealt with bayonet cap lamp holders where the maximum wattage had been exceeded, and the parts that failed were the wee springs that press the brass contacts onto the lamp (bulb); the springs were no longer springy.
 
I actually just replaced the old Bayonet pendant fixture with an E27 one a few days ago. I'm testing now by drawing 105W using a 7-way bulb splitter (replacing the corn bulb in the previous picture). The bulbs are warm to the touch but the fixture is room temperature.
 

Not really sure what you are trying to test or prove by doing that?

What measurement tools do you have for checking the integrity of the electrical contacts? (These will deteriorate faster if continually run at a load current greater than they were designed for.)

Feeling the outside of the lamp holder proves nothing.
 
Isn't it OK to fit a larger LED bulb? I'd always assumed (I may well be the first three letters here) that the 80 W rating is in case someone manages to dig out an incandescent or a halogen and fit it… I didn't think an LED with, say, a 150 W-equivalent output would be a problem… 😬
 
I also opened up the fixture and touched all the contacts after I'd had the light on for a couple of hours. All of them were barely warmer than room temperature.

Humourme: to be clear, I'm only ever talking about actual power drawn, not "equivalent" output.
 

Manufacturers' ratings are typically for long-term continuous use, not just for a couple of hours!

And fingers would not be classed as industry-standard test equipment. As per my previous observation, you are proving nothing.

Overloading any electrical accessory will work for a period of time, but that is not how it is designed to work continually.
 
A couple of questions:
- Do you think the circuits wouldn't have reached steady-state temperature after two hours of use? I can certainly leave it on for longer if that's the case.
- What other tests do you think I should do, and with what equipment?
 
The rating of a lamp holder is almost certainly based on a filament lamp (remember those old-fashioned things, almost impossible to buy now), which was perhaps 10% efficient: 100 W in, 10 W of light and 90 W of heat out.

If you put an LED in, as you appear to have, I don't know the efficiency but it will be a LOT better than 10%, so your 100 W LED light might, at a guess, produce 10 W of heat. You could therefore have a lot more LED power without overheating the fitting.

I doubt the actual power is a consideration: a 240 V, 100 W light will draw about 0.4 A. It would have to be a VERY badly designed light fitting to get bothered by 0.4 A of current.

What are you illuminating that wants that much power? When I put all-LED lights in our house, I worked out that with every single light in the house on it was under 200 W.
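For what it's worth, here is a quick sketch of the arithmetic behind those figures, taking 240 V mains, the 10% filament efficiency above and the guessed ~90% for the LED (real LED wall-plug efficiency is lower, nearer the ~50% mentioned further down the thread):

```python
# Back-of-the-envelope current and heat figures for a 100 W lamp.
# The efficiencies are the guesses from the post above, not manufacturer data.

MAINS_VOLTAGE = 240.0  # volts

def lamp_figures(input_watts: float, efficiency: float):
    current = input_watts / MAINS_VOLTAGE   # I = P / V
    light_watts = input_watts * efficiency
    heat_watts = input_watts - light_watts  # whatever isn't light ends up as heat
    return current, light_watts, heat_watts

for name, efficiency in (("filament (10%)", 0.10), ("LED (guessed 90%)", 0.90)):
    amps, light, heat = lamp_figures(100, efficiency)
    print(f"100 W {name:<17}: {amps:.2f} A, {light:.0f} W light, {heat:.0f} W heat")
```

Either way the current through the holder is the same ~0.42 A; what changes with the lamp technology is how much of the input power comes back out as heat around the fitting.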
 
A couple of questions:
- Do you think the circuits wouldn't have reached steady-state temperature after two hours of use? I can certainly leave it on for longer if that's the case.
- What other tests do you think I should do, and with what equipment?

A couple of questions in return:
- Do you think the manufacturer's guidance takes no account of normal service usage?
- Do you think that manufacturers have less understanding than you do of the tests, duration and equipment needed to verify the safe operation of their products?

If you want to connect greater loads, just go ahead and do it. I am really not interested in debating the exact detail of why a specific product has a maximum power rating specified by the manufacturer.

I have far more important things to do, AND after seeing far too many overloaded accessories where overheating is clearly evident at the cable terminations, I would never recommend exceeding the manufacturer's maximum power ratings!!
 
Thanks for chiming in on this Dave, it sounds like you're confirming my initial guess that the power ratings are about heat dissipation from the bulb rather than electrical resistance in the fitting itself generating too much heat. That would imply that the thermal properties of the bulb matter a lot; e.g. my latest iteration uses seven 15 W bulbs on a splitter, which means almost all of the heat from the bulbs is dissipated into the air. Also, LEDs are apparently ~50% efficient these days, which gives some more headroom.

I am illuminating my room and have found that I am much more energetic when the room is bright during the day, particularly in the winter. See e.g. https://meaningness.com/sad-light-lumens.
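As a sanity check on that heat budget, here is a minimal sketch of the numbers for the splitter setup, assuming 230 V mains, the ~50% wall-plug efficiency quoted above and an even split of the load across the seven bulbs:

```python
# Rough heat/current budget for seven 15 W LED bulbs on one pendant splitter.
# 230 V mains, ~50% wall-plug efficiency and an even load split are assumptions.

MAINS_VOLTAGE = 230.0   # volts
BULBS = 7
WATTS_PER_BULB = 15.0
LED_EFFICIENCY = 0.5    # fraction of input power leaving as light

total_watts = BULBS * WATTS_PER_BULB          # 105 W total load
holder_amps = total_watts / MAINS_VOLTAGE     # all of it flows through the E27 contacts
heat_per_bulb = WATTS_PER_BULB * (1 - LED_EFFICIENCY)

print(f"Total load:             {total_watts:.0f} W")
print(f"Current through holder: {holder_amps:.2f} A")
print(f"Heat per bulb:          {heat_per_bulb:.1f} W, dissipated in free air")
```

The heat is indeed spread across seven bulbs hanging in free air, but the full ~0.46 A still passes through the single set of pendant contacts, which is the part the earlier replies are warning about.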
 

Attachment: 20231201_120809.jpg
If you really want such a high level of illumination there are plenty of floodlight fittings designed for the purpose.
I am actually using some floodlights already, mounted on a tripod next to my desk and aimed towards the wall/ceiling because they are too bright to look at directly.
 

Attachment: 20231202_095235.jpg
My neighbours don't believe in climate change or in using LED lamps. They have bought a job lot of 150 W tungsten filament lamps. All their lamp holders are rated at 80 W; they last the abuse for several years before actually burning out from the excess heat.
 