Why does a transformer supposedly burn out when supplied with DC? I did some searching and found explanations based on the EMF equation, e = dΦ/dt, i.e. an EMF requires a changing flux, and with DC the flux doesn't change. But why does that explain why a transformer doesn't burn out (or wear out) on AC, yet does on DC? Is it because with DC all the current is drawn into the primary itself, whereas with AC part of the current does the work of linking flux to the secondary? Is my understanding right?
I believe you are on the right track. When you apply a signal to a transformer, it goes to the primary winding, which is essentially just an inductor. One of the things we know about inductors is that the current lags the voltage: when we apply a signal, it takes some amount of time for the current to build up. If the applied signal is AC, then before the current has a chance to build all the way up, the voltage changes and the current begins to decline. In this scenario the current is always changing, is never left to increase continually, and stays relatively small.
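To put some illustrative numbers on the AC case: the winding's inductance presents a reactance X_L = 2πfL that dominates the tiny wire resistance and keeps the AC current small. The values below (inductance, resistance, supply voltage) are hypothetical, chosen only to show the effect:

```python
import math

# Hypothetical primary winding (values are illustrative, not from the post):
V = 120.0   # supply voltage, volts RMS
f = 60.0    # line frequency, hertz
L = 10.0    # primary inductance, henries
R = 5.0     # wire resistance of the winding, ohms

X_L = 2 * math.pi * f * L            # inductive reactance, ohms
Z = math.sqrt(R**2 + X_L**2)         # total impedance seen by the AC source
I_ac = V / Z                         # AC current, limited mostly by X_L
I_dc = V / R                         # DC current, limited only by the wire R

print(f"X_L = {X_L:.0f} ohm, Z = {Z:.0f} ohm")
print(f"AC current: {I_ac:.3f} A   DC current: {I_dc:.1f} A")
```

With these numbers the reactance is a few thousand ohms, so the AC current is tens of milliamps, while the same winding on DC would draw tens of amps through its few ohms of wire resistance.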
Now take the case of applying DC to the transformer's primary winding. The DC voltage suddenly switches on, and the current lags behind: it takes time for the current to build up. However, once the current reaches its maximum, it stays constant at that level. That maximum is the current that would flow through a resistor equal to the resistance of the primary winding. And that winding is just a long piece of wire, which has very little resistance. With very little resistance, the current is very high. Using Ohm's law, you can see that the smaller R is, the larger I becomes:
I = V/R
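The build-up described above is the standard RL step response, i(t) = (V/R)(1 − e^(−t/τ)) with τ = L/R: the current rises gradually but then sits at the full V/R forever. A minimal sketch, using the same kind of hypothetical winding values as an example:

```python
import math

# Hypothetical winding on a DC supply (illustrative values):
V, R, L = 120.0, 5.0, 10.0   # volts, ohms, henries
tau = L / R                  # RL time constant, seconds

def i(t):
    """Current a time t after the DC step is applied (RL step response)."""
    return (V / R) * (1 - math.exp(-t / tau))

# The current climbs toward V/R and then just stays there:
for t in [0.0, tau, 3 * tau, 10 * tau]:
    print(f"t = {t:5.1f} s: I = {i(t):6.2f} A")
```

After a few time constants the exponential term is negligible and the winding carries the full V/R continuously, which is why the thin wire overheats on DC.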