I'd like to learn a bit more about this, since every analysis I've
seen assumes a uniform flux density in the core material. And I've
always taken that to be a good assumption, at least for
high-permeability materials.
According to conventional analysis, for a given permeability, current,
and number of turns, the flux density is inversely proportional to the
magnetic path length. For a toroidal core, this path length is taken to
be pi times the average of the core ID and OD, so the net result is
that the flux density will be higher in a bead than in a larger core,
all else being equal. But that's due to the bead's shorter magnetic
path length rather than to the distance of the core material from the
wire passing through the middle. Other core shapes, such as EI and pot
cores, also obey the path-length rule.
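As a rough numerical sketch of the path-length rule (the dimensions and permeability below are illustrative, not taken from any real part's datasheet), the conventional formula B = mu0 * mu_r * N * I / l_e with l_e = pi * (ID + OD) / 2 gives:

```python
import math

MU0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def flux_density(mu_r, turns, current_a, id_m, od_m):
    """Average flux density (tesla) from B = mu0*mu_r*N*I / l_e,
    with the toroidal magnetic path length l_e = pi*(ID + OD)/2."""
    path_len = math.pi * (id_m + od_m) / 2
    return MU0 * mu_r * turns * current_a / path_len

# Hypothetical dimensions: a small bead (ID 1 mm, OD 3.5 mm) vs. a
# larger toroid (ID 14 mm, OD 24 mm), same permeability, current,
# and number of turns.
b_bead = flux_density(mu_r=850, turns=1, current_a=0.1,
                      id_m=1e-3, od_m=3.5e-3)
b_toroid = flux_density(mu_r=850, turns=1, current_a=0.1,
                        id_m=14e-3, od_m=24e-3)

# The bead's shorter path length gives the higher average flux density.
print(b_bead > b_toroid)
```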
Other parameters, such as the inductance or impedance per turn
squared, also depend on the path length, as well as on the core
permeability and the cross-sectional area of the magnetic path. So if
you hold one of these other parameters constant as you vary the core
size or shape, you can reach different conclusions about the effect of
the variations.
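For instance, holding target inductance constant instead of turns changes the comparison, since the turn count then depends on the core's A_L value. A small sketch (all numbers hypothetical):

```python
import math

MU0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def inductance_per_turn_sq(mu_r, area_m2, path_len_m):
    """A_L = mu0*mu_r*A / l_e, in henries per turn squared."""
    return MU0 * mu_r * area_m2 / path_len_m

# Two hypothetical cores with the same permeability and cross
# section: a short-path bead and a longer-path toroid.
al_bead   = inductance_per_turn_sq(850, 4e-6, 0.007)
al_toroid = inductance_per_turn_sq(850, 4e-6, 0.060)

# Holding inductance constant (10 uH here), the turns needed are
# N = sqrt(L / A_L), so the bead gets by with fewer turns.
turns_bead   = math.sqrt(10e-6 / al_bead)
turns_toroid = math.sqrt(10e-6 / al_toroid)
print(turns_bead < turns_toroid)
```

So a constant-turns comparison and a constant-inductance comparison weight the path length differently, which is the point about reaching different conclusions.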
Roy Lewallen, W7EL
Rick Karlquist N6RK wrote:
I have observed this phenomenon as well but it seems
to be mainly an issue with beads or shapes with small
holes. The bead magnetizes from the inside out.
Larger "toroids" with sizeable center holes don't magnetize
as easily. This makes sense physics-wise, since Ampere's
law says H is inversely proportional to radius.
Rick N6RK
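Rick's Ampere's-law point can be put in numbers: for a wire through the center, H = N*I / (2*pi*r), so the inner-to-outer variation of H across the core scales with the OD/ID ratio. With the same hypothetical dimensions as above:

```python
import math

def h_field(turns, current_a, radius_m):
    """Ampere's law for a straight wire through the hole:
    H = N*I / (2*pi*r), in A/m."""
    return turns * current_a / (2 * math.pi * radius_m)

# Ratio of H at the inner wall to H at the outer wall.
# Bead: ID 1 mm, OD 3.5 mm -> ratio 3.5.
ratio_bead = h_field(1, 0.1, 0.5e-3) / h_field(1, 0.1, 1.75e-3)
# Larger toroid: ID 14 mm, OD 24 mm -> ratio about 1.7.
ratio_toroid = h_field(1, 0.1, 7e-3) / h_field(1, 0.1, 12e-3)

# H is far more nonuniform across a small-holed bead than across
# a larger toroid, consistent with the bead magnetizing from the
# inside out.
print(ratio_bead > ratio_toroid)
```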