One of the elements of the regs which I find difficult to agree with (that's diplomatic enough, I think ;\ ): cables in insulation can overheat if not derated.
Insulation prevents or inhibits the passage of heat. Under normal use a cable can get lukewarm, maybe slightly warm, but hot? For heat to be generated in a cable to the point where it causes concern, the cable has to be overloaded, or under continuous load over a long period. If it isn't, then I cannot see how excessive heat, hot enough to cause damage, whether in insulation or not, can create the need to increase the size of a cable (derate it).

If you pass 30A through a 1.0mm2 cable it would clearly overheat. When you increase the cable size there is more copper and less resistance: the current flows nicely, with little heat. Take the same 30A in a 1.0mm2 cable in insulation and the heat generated cannot dissipate, so the damage is catastrophic and happens more quickly.

But say the 30A is flowing in a 4.0mm2 cable which merely gets warm: does the presence of insulation cause the heating to increase, like an oven that keeps getting hotter until the stat cuts the power? Does insulation trap heat and keep raising the temperature, or does the cable settle at a constant temperature?
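To put some rough numbers on the "more copper, less resistance" point, here is a minimal sketch of the I²R heat produced per metre of conductor. It assumes a nominal copper resistivity of 1.72e-8 ohm·m at 20 °C and ignores the rise in resistance as the conductor warms; the figures are illustrative, not tabulated values from the regs.

```python
# Rough I^2*R heat per metre of a single copper conductor.
# Assumption: nominal resistivity of copper at ~20 C, constant with temperature.
RHO_CU = 1.72e-8  # ohm * metre

def watts_per_metre(current_a, csa_mm2):
    """Power dissipated per metre of one conductor: P = I^2 * R."""
    r_per_m = RHO_CU / (csa_mm2 * 1e-6)  # convert mm^2 to m^2
    return current_a ** 2 * r_per_m

for csa in (1.0, 2.5, 4.0):
    print(f"{csa} mm2 at 30 A: {watts_per_metre(30, csa):.1f} W per metre")
```

At 30 A the 1.0mm2 conductor makes roughly four times the heat per metre of the 4.0mm2 one; the insulation around either doesn't change how much heat is made, only how fast it can escape.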
You are clearly not understanding the basic principles that set the rating of a cable.
Every cable, under a particular load, will dissipate a certain amount of power, so it will heat up.
It's all about keeping that heat rise within acceptable limits so the cable is not damaged.
Clearly, to most people, a single cable, clipped to a wall, with nothing around it and no insulation anywhere, will dissipate heat quite well, so it will not warm up much and it will be fine.
Put the SAME cable inside a conduit with other cables also carrying current, and it's obvious that the cable will heat up a lot more. So installed like that it has to be derated, i.e. allowed to pass only a smaller current, so it does not heat up to the point of damaging the cable.
Now cover the whole lot in insulation, and its heat dissipation becomes lower again, so it can carry an even smaller current before it would heat up to the point of damaging the cable. Note the insulation doesn't create extra heat: the heat produced is fixed by the current and the resistance. The insulation just slows its escape, so the cable settles at a higher steady temperature; pass too much current and that steady temperature is more than the cable can withstand.
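The derating steps above can be sketched as a chain of rating factors applied to a cable's tabulated current-carrying capacity. The factor values below (0.8 for grouping, 0.75 for thermal insulation) and the 36 A tabulated rating are round numbers made up for illustration, not figures from BS 7671 or any rating table.

```python
# Illustrative derating sketch: each adverse installation condition
# multiplies the tabulated current-carrying capacity by a factor < 1.
# All numbers here are invented for illustration only.
def derated_capacity(tabulated_a, *factors):
    """Apply a series of rating factors to a tabulated capacity."""
    capacity = tabulated_a
    for f in factors:
        capacity *= f
    return capacity

clipped_direct = derated_capacity(36.0)             # nothing to derate
in_conduit     = derated_capacity(36.0, 0.8)        # grouped with other cables
buried_in_ins  = derated_capacity(36.0, 0.8, 0.75)  # grouped AND in insulation

print(f"clipped direct:               {clipped_direct:.1f} A")
print(f"in a shared conduit:          {in_conduit:.1f} A")
print(f"conduit buried in insulation: {buried_in_ins:.1f} A")
```

Each extra barrier to heat escaping shaves the allowable current down further, which is exactly why the same cable is rated differently for different installation methods.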
If you can't grasp that concept and think heat can't possibly be a problem, then you might not be the best person to be designing circuits and selecting the correct cable size for a particular installation method.
See Step's very good example in post #21