#21, May 11th 10, 08:41 PM, posted to rec.radio.amateur.antenna, Computer model experiment

Whatever program you use, let me know the result for a fat dipole.
Walk the walk! Forget the talk!


Are you sure you have not violated the segment length/wire diameter
ratio? From Cebik, Intermediate Antenna Modeling: "In NEC-2 it
is especially important to keep the segment length greater than
about 4 times the wire diameter. You may reduce this value by half
by invoking the EK command." Also, what does your "Average Gain
Test" report show?

73,

Frank
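Cebik's rule of thumb quoted above is easy to turn into a quick pre-flight check on a wire geometry before feeding it to NEC-2. A minimal Python sketch; the function name and the tuple layout are illustrative, not part of NEC or any modeling package:

```python
def check_segments(wires, ek_enabled=False):
    """Flag wires whose segment length/diameter ratio is too small for NEC-2.

    wires: list of (wire_length_m, n_segments, diameter_m) tuples.
    Rule of thumb (Cebik): segment length > ~4 x wire diameter,
    or > ~2 x diameter when the EK (extended thin-wire kernel) card is used.
    Returns a list of (wire_index, seg_len/diameter) for offending wires.
    """
    min_ratio = 2.0 if ek_enabled else 4.0
    warnings = []
    for i, (length, nseg, dia) in enumerate(wires):
        seg_len = length / nseg
        if seg_len < min_ratio * dia:
            warnings.append((i, seg_len / dia))
    return warnings

# A 10.1 m dipole with 11 segments is fine at 2 mm diameter
# (seg_len/dia ~ 459), but a 1000-inch (25.4 m) "fat" diameter
# grossly violates the rule (ratio ~ 0.036):
print(check_segments([(10.1, 11, 0.002)]))   # []
print(check_segments([(10.1, 11, 25.4)]))    # one warning
```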


#22, May 11th 10, 08:53 PM, posted to rec.radio.amateur.antenna

Increase diameter incrementally, on the order of 1000
inches or so.


As stated earlier the above is a gross violation of the
segment length/diameter ratio. Again; what does
your "Average Gain Test" report say under these
conditions?

Frank
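The Average Gain Test Frank keeps asking about integrates the modeled power gain over a full sphere; for a lossless model in free space the average should come out at 1.0 (0 dB), and a large deviation means the model is numerically broken. A minimal numerical sketch of the idea (not NEC's own implementation), checked against the known lossless short-dipole pattern G(theta) = 1.5 sin^2(theta):

```python
import numpy as np

def average_gain(gain_fn, n_theta=180, n_phi=72):
    """Average power gain over the sphere: (1/4 pi) * integral of G dOmega.
    ~1.0 (0 dB) indicates a lossless, well-conditioned model; a value
    like 16.7 dB means the geometry is violating the code's assumptions."""
    # midpoint samples in theta (0..pi) and phi (0..2 pi)
    theta = (np.arange(n_theta) + 0.5) * np.pi / n_theta
    phi = (np.arange(n_phi) + 0.5) * 2 * np.pi / n_phi
    t, p = np.meshgrid(theta, phi, indexing="ij")
    g = gain_fn(t, p)
    d_omega = np.sin(t) * (np.pi / n_theta) * (2 * np.pi / n_phi)
    return np.sum(g * d_omega) / (4 * np.pi)

# Ideal lossless short dipole, directivity 1.5:
agt = average_gain(lambda t, p: 1.5 * np.sin(t) ** 2)
print(round(agt, 3))  # ~1.0, i.e. the pattern passes the test
```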


#23, May 11th 10, 09:17 PM, posted to rec.radio.amateur.antenna

On May 11, 1:38 pm, Jim Lux wrote:
Ralph Mowery wrote:
"tom" wrote in message
et...
On 5/10/2010 3:12 PM, wrote:
As Clint said in the wonderful old movie, "A man's gotta know his limits".
For antenna modelers it should read, "A man's gotta know the program's
limits".


Of course, Art thinks things have changed and the computer modelers have a
better grasp upon reality than the ones even he calls "the masters". He is
an example of the blind man leading himself.


tom
K0TAR


The computer program should know its limits.


yes and no. For EM modeling codes originally intended for use by
sophisticated users with a knowledge of the limitations of numerical
analysis, they might assume the user knows enough to formulate models
that are "well conditioned", or how to experiment to determine this.
NEC is the leading example here. It doesn't do much checking of the
inputs, and assumes you know what you are doing.

There were modeling articles in ARRL pubs 20 years ago that described
one way to do this at a simple level: changing the number of segments in
the model and seeing if the results change. The "average gain test" is
another way.
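The convergence test described here (refine the discretization and watch whether the answer settles) applies to any numerical code, not just NEC. The same logic in miniature, applied to a toy quadrature instead of an antenna model; the function, tolerance, and names are all illustrative:

```python
import math

def converge(f, a, b, tol=1e-6, max_doublings=20):
    """Keep doubling the number of subdivisions until the answer stops
    changing -- the same procedure as re-running an antenna model with
    more segments and comparing the feedpoint impedance or gain."""
    def midpoint(n):
        h = (b - a) / n
        return h * sum(f(a + (i + 0.5) * h) for i in range(n))
    n = 8
    prev = midpoint(n)
    for _ in range(max_doublings):
        n *= 2
        cur = midpoint(n)
        if abs(cur - prev) < tol * max(1.0, abs(cur)):
            return cur, n   # converged: successive refinements agree
        prev = cur
    raise RuntimeError("did not converge; the model may be ill-conditioned")

val, n = converge(math.sin, 0.0, math.pi)
print(val, n)   # approximately 2.0, the exact integral of sin on [0, pi]
```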

In many cases, the constraints on the model are not simply representable
(a lot of "it depends"), so that raises an issue for a "design rule
checker" that is reasonably robust. Some products that use NEC as the
backend put a checker on the front (4nec2, for instance, warns you about
length/diameter ratios, near-intersections, and the like).

It's sort of like power tools vs hand tools. The assumption is that the
user of the power tool knows how to use it.

Anytime a program allows the
data entered to be too large or small for the calculations, it should be
flagged as being out of range. Also, many computer programs use
simplified formulas that can mask the true outcome. Usually it is not very
much, but as all the errors add up the end results may be way off.


There are whole books written on this for NEC. Part I of the NEC
documents, in particular, discusses this. There's also a huge
professional literature on various FEM computational techniques and
their limitations. NEC, like most numerical codes (for mechanics,
thermal, as well as EM), is very much a chainsaw without safety guards.
It's up to the user to wear gloves and goggles and not cut their leg off.


Jim Lux of NASA no less!
All of the programs clearly state that they are based on Maxwell's
equations. The bottom line of those equations is that accountability
for all forces involved is required, where the summation of all
equals zero. This is nothing new and has been followed through for
centuries. The equations require first and foremost equilibrium, and
what the program supplies is easily checked to see that it meets these
requirements. It is very simple: showing that the solution is that
inside an arbitrary boundary, all within, as with the whole, must be
resonant and in equilibrium. It requires no more than that to show if
the program has achieved its object. I understand your preachings, but
you presented no point that can be discussed.
Now you will respond that I must do such and such to back the
statement above, despite that those requirements are the basis of
physics. So to you I will supply the same that I have supplied to
others, which they reject; no one has stated why.
An arbitrary gaussian border containing static particles
(not waves, as many surmise; Gauss was very clear about the presence
of static particles) in equilibrium may be made dynamic by the
addition of a time varying field such that Maxwell's equations can be
applied to solve. I have stated the over-checks that can be applied to
verify the correctness of this procedure. You may, of course, join the
poll that swells on behalf of NASA in opposition to the above, but it
would provide me a great deal of delight if you provided more than to
just say "I am wrong". Nobody has yet provided one mathematical reason
that disputes the above, so in the absence of such you will not be
alone; only your credibility suffers, but you will remain in the
majority of the poll in the eyes of the ham radio World.
Regards
Art Unwin
#24, May 11th 10, 10:02 PM, posted to rec.radio.amateur.antenna

Art Unwin wrote:
On May 11, 1:38 pm, Jim Lux wrote:

The computer program should know its limits.

yes and no. For EM modeling codes originally intended for use by
sophisticated users with a knowledge of the limitations of numerical
analysis, they might assume the user knows enough to formulate models
that are "well conditioned", or how to experiment to determine this.
NEC is the leading example here. It doesn't do much checking of the
inputs, and assumes you know what you are doing.

Jim Lux of NASA no less!

Speaking, however, as Jim Lux, engineer, not necessarily on NASA's behalf.

All of the programs clearly state that they are based on Maxwells
equations.

snip
I understand your preachings but
you presented no point that can be discussed.



While NEC and its ilk are clearly based on Maxwell's equations, one
should realize that they do not provide an analytical closed form
solution, but, rather, are numerical approximations, and are subject to
all the limitations inherent in that. They solve for the currents by
the method of moments, which is but one way to find a solution, and one
that happens to work quite well with things made of wires.
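To make "method of moments" concrete: expand the unknown quantity in basis functions, enforce the governing integral equation at matching points, and solve the resulting dense linear system. A deliberately simplified electrostatic analogue, the classic Harrington-style problem of charge on a thin wire held at fixed potential, with pulse basis and point matching; nothing here is NEC's actual electrodynamic kernel:

```python
import numpy as np

EPS0 = 8.854e-12  # permittivity of free space, F/m

def wire_capacitance(L=1.0, a=1e-3, N=50, V=1.0):
    """MoM in miniature: straight wire of length L, radius a, at potential V.
    Unknown: piecewise-constant charge density rho on N segments.
    Z[m, n] = potential at segment m's center due to unit density on
    segment n, via the exact integral of 1/(4 pi eps0 sqrt(dz^2 + a^2))."""
    edges = np.linspace(0.0, L, N + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    zm = centers[:, None]                       # match points (rows)
    z1 = edges[:-1][None, :]                    # segment starts (cols)
    z2 = edges[1:][None, :]                     # segment ends (cols)
    # analytic integral of the 1/r kernel over each source segment
    num = (zm - z1) + np.sqrt((zm - z1) ** 2 + a ** 2)
    den = (zm - z2) + np.sqrt((zm - z2) ** 2 + a ** 2)
    Z = np.log(num / den) / (4 * np.pi * EPS0)
    rho = np.linalg.solve(Z, np.full(N, V))     # solve Z rho = V
    Q = np.sum(rho * (L / N))                   # total charge on the wire
    return Q / V                                # capacitance, farads

C = wire_capacitance()
print(C * 1e12, "pF")   # on the order of 8 pF for a 1 m x 1 mm wire
```

NEC does the analogous thing with currents and the electric-field integral equation; the matrix fill and linear solve are the same structural steps.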

Within the limits of computational precision, for simple cases, where
analytical solutions are known to exist, the results of NEC and the
analytical solution are identical. That's what validation of the code
is all about.

Further, where there is no analytical solution available, measured data
on an actual antenna matches that predicted by the model, within
experimental uncertainty.

In both of the above situations, the validation has been done many
times, by many people, other than the original authors of the software,
so NEC fits in the category of "high quality validated modeling tools".

This does not mean, however, that just because NEC is based on Maxwell's
equations that you can take anything that is solvable with Maxwell and
it will be equally solvable in NEC.

I suspect that one could take the NEC algorithms, and implement a
modeling code for, say, a dipole, using an arbitrary precision math
package and get results that are accurate to any desired degree. This
would be a lot of work.

It's unclear that this would be useful, except perhaps as an
extraordinary proof for an extraordinary claim (e.g. a magic antenna
that "can't be modeled in NEC"). However, once you've done all that
software development, you'd need independent verification that you
correctly implemented it.

This is where a lot of the newer modeling codes come from (e.g. FDTD):
they are designed to model things that a method of moments code can't do
effectively.
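FDTD takes the opposite tack from the method of moments: discretize space and time directly and march Maxwell's curl equations forward on a staggered grid. A bare-bones 1-D vacuum example in normalized units, with a Courant number of 0.5 and no absorbing boundary; purely illustrative, not any particular FDTD package:

```python
import numpy as np

N, STEPS, SRC, S = 400, 200, 200, 0.5   # cells, time steps, source cell, Courant number

ez = np.zeros(N)   # E field at integer grid points
hy = np.zeros(N)   # H field at half-integer points (Yee staggering)

for n in range(STEPS):
    hy[:-1] += S * (ez[1:] - ez[:-1])              # update H from curl of E
    ez[1:] += S * (hy[1:] - hy[:-1])               # update E from curl of H
    ez[SRC] += np.exp(-((n - 30.0) / 10.0) ** 2)   # soft Gaussian source

# At 0.5 cells per step, the two pulses launched by the source have each
# travelled roughly 100 cells away from cell SRC by now.
print(np.abs(ez[:SRC - 20]).max(), np.abs(ez[SRC + 20:]).max())
```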


#25, May 12th 10, 01:30 AM, posted to rec.radio.amateur.antenna

On May 11, 4:02 pm, Jim Lux wrote:

snip

Again you preach, but obviously you are not qualified to address the
issue.
Maxwell's equations are such that all forces are accounted for when the
array is in a state of equilibrium. To use such an equation for an
array that is not in equilibrium requires additional input
(proximity equations), which is where errors creep in. When an array is
in equilibrium then Maxwell's equations are exact. The proof of the
pudding is that the resulting array is in equilibrium, as are its parts.
AO pro by Beezley consistently produces an array in equilibrium when
the optimizer is used, as well as including the presence of particles
dictated by Gauss. The program is of MININEC foundation, which
obviously does not require the patchwork approach that NEC has. On top
of all that, it sees an element as one in encapsulation, as foreseen by
Gauss, by removing the resistance of the element, which produces a
loss, and thus allows dealing only with all vectors as they deal with
propagation. It is only because hams use Maxwell's equations on
occasions when equilibrium does not exist, such as the yagi, that errors
start to creep in. Any array produced solely by the use of Maxwell's
equations provides proof of association by producing an array in
equilibrium, which can be seen as an over-check. Like you, I speak only
as an engineer on behalf of myself. Clearly, Maxwell had taken
advantage of the presence of particles when he added displacement
current so that the principle of equilibrium would be adhered to. This
being exactly the same that Faraday did when explaining the
transference from a particle to a time varying current when describing
the workings of the cage.
Regards
Art


#26, May 12th 10, 03:23 AM, posted to rec.radio.amateur.antenna (tom)

On 5/11/2010 7:35 AM, Cecil Moore wrote:
On May 10, 10:40 pm, wrote:
And almost everything you claim about it, now that I know what you're
making claims against, is either wrong or inaccurate.


Here's my super-gain antenna with 24 dBi gain at a TOA of 23 degrees.

http://www.w5dxp.com/SUPRGAIN.EZ
--
73, Cecil, w5dxp.com



I don't know what the problem is, Cecil, it looks perfectly normal to
me. And it's great, effectively an omnidirectional super yagi on 40m
kind of thing.

You patented it, right?

tom
K0TAR
#27, May 12th 10, 03:45 AM, posted to rec.radio.amateur.antenna (tom)

On 5/10/2010 10:40 PM, tom wrote:
On 5/10/2010 10:21 PM, Art Unwin wrote:
Ralph, the computer program I use is AO pro, which is equipped with an
optimizer and based on Maxwell's equations. It is required to provide


Art

I was an alpha tester on AO. Do you know what an alpha tester is?

I am sure that I know much more about this program's capabilities and
especially its limitations than you.

And almost everything you claim about it, now that I know what you're
making claims against, is either wrong or inaccurate.

tom
K0TAR


Art?

No comment?

tom
K0TAR
#28, May 12th 10, 04:12 PM, posted to rec.radio.amateur.antenna

As given, the average gain is about 16.7 dB, so one knows that
something is afoot . . .
The driven element (wire 1) is essentially touching wire 4. Current in wire
4 is unbelievably high. With #30 wire things improve, but the wires are
still too close.

Thanks for the example. Will use it when next talking about NEC as an
example of what not to do.

73, Mac N8TT
--
J. McLaughlin; Michigan, USA
Home:

"Cecil Moore" wrote in message
...
snip

Here's my super-gain antenna with 24 dBi gain at a TOA of 23 degrees.

http://www.w5dxp.com/SUPRGAIN.EZ
--
73, Cecil, w5dxp.com


#29, May 12th 10, 06:10 PM, posted to rec.radio.amateur.antenna

On May 11, 8:30 pm, Art Unwin wrote:
When an array is
in equilibrium then Maxwell's equations are exact.


Maxwell's equations are ALWAYS exact; it is digital models that are
inexact and have limitations due to the approximations made and the
numeric representations used.
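The distinction drawn here, exact continuous equations versus approximate numerical representation, is easy to demonstrate in a few lines: a finite-difference derivative of a function we can differentiate exactly has an error controlled entirely by the step size and floating-point precision, not by the calculus itself.

```python
import math

def central_diff(f, x, h):
    """Central-difference approximation to f'(x); truncation error ~ h**2."""
    return (f(x + h) - f(x - h)) / (2 * h)

exact = math.cos(1.0)   # d/dx sin(x) at x = 1, known in closed form
for h in (1e-1, 1e-2, 1e-3):
    err = abs(central_diff(math.sin, 1.0, h) - exact)
    print(h, err)   # error shrinks ~100x for each 10x step reduction
```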
#30, May 12th 10, 06:42 PM, posted to rec.radio.amateur.antenna

Art Unwin wrote:
On May 11, 4:02 pm, Jim Lux wrote:


Again you preach but obviously you are not qualified to address the
issue.


Opinions on qualification differ.

AO pro by Beezley consistently produces an array in equilibrium when
the optimizer is used, as well as including the presence of particles
dictated by Gauss. The program is of MININEC foundation, which
obviously does not require the patchwork approach that NEC has.


Interestingly, MININEC uses the very same method of moments that NEC
does, but, because it's "mini" it has substantial limitations. It was
developed to fit in small microcomputers of the day. I'd hardly call
NEC "patchwork". The two programs do use different formulations for the
basis function defining the current on the segment.



There are several papers out there that compare the mechanism of MININEC
vs NEC. One might start with the report by Burke and Poggio (for NEC)
and the report by Julian, Logan, and Rockway (which covers
MININEC). John Rockway published a paper in 1995 describing the history
and differences:
"Advances in MININEC,"
John Rockway and James Logan,
IEEE Antennas and Propagation Magazine, vol. 37, no. 4, August 1995, pp. 7-12.

