EZNEC and Linux
Michael Coslo wrote:
... Visual Basic being a sucky language, give example? Krist, VB makes calls into all the windows api's, it supports COM, windows scripting, and all the other silly stuff windows does. In other words, IT IS SLOW! And HEAVY on dependence on windows and all windows flaws ... It cannot be cross compiled for linux or most other os's without MAJOR programming efforts. C/C++ is where it is at ... VB is for babies and web site designers ... -- http://assemblywizard.tekcities.com
EZNEC and Linux
Cecil Moore wrote:
John Smith I wrote: T-bird is the bomb! Is a bomb good or bad? :-) "The Bomb!" = "BAD" (as in, that '57 Chevy is BAD!) --and, in turn-- BAD = GOOD! Regards, JS -- http://assemblywizard.tekcities.com
EZNEC and Linux
The bottom line should be is EZNEC accurate? Has the programming been held within the confines provided by the original provider, the U.S. government?

What you are referring to is called "validation" of the modeling code. Since EZNEC uses the NEC2 engine (or the NEC4 engine) underneath, the inherent modeling accuracy is that of those engines.

Who oversees the content of this so-called program.

The same is true of virtually every other RF and EM modeling program out there. The manufacturer of the program oversees it. You, as a customer, get to run whatever validation suites make you happy. If you don't like the results, don't use the program. If you find a bug, you report it to the mfr, and usually they roll out a new version sooner or later that fixes the problem. This is the advantage of using a provider that has been in business a while: they've gone through the release cycle more than once, and they've got lots of eyes looking at the product. While looking at source code sounds nice, I suspect that very, very few of the people who use modeling codes would be willing to take the time needed to go through them and understand how they work, much less try to find bugs. But it does happen. Every few years, you see Jerry Burke at LLNL announce some minor change in the NEC code base to address some bizarre corner case.

If it has a patent then all would or should be revealed in the patent disclosure. Has anybody taken this for his own use for the advancement of science, which is the reason for patents?

Or, even better, they could reveal all in a peer-reviewed paper in the technical literature or in a technical report. Gosh... isn't that what the NEC folks did? There's hundreds of pages of documentation explaining the theoretical underpinnings of NEC, how it was validated, etc. If you're interested in the optimizers, well, there's a raft of papers and books on the virtues and problems of various optimizers, both in general and in combination with NEC or other modeling codes. Yes, in most cases, you have to wonder if the implementation of the algorithm was properly done; e.g. Excel uses the Nelder-Mead algorithm in its "solver", and MS doesn't tell you exactly how they did it, so it's up to you to find some good test cases and see if it works like you think it should. Or, do reasonableness checks on the output, but that's something you should do with an optimizer in any case.

Has anybody upgraded the assigned patent for the sake of science, or has something not been disclosed to prevent true examination and as such invalidates the patent? Does the government have the option of review of all algorithms or are they in the same position the country is with voting machines? Basically the purchaser is really in the position of caveat emptor, especially since all programs provide different results!

You betcha... You gets to pay yer money and you gets to take yer chances. But in practice, the reputable mfrs of modeling codes tend to provide validation examples, if only because that's how they do their own internal testing. For example, everybody runs the examples in the NEC manual, because those have been extensively validated, so if your code gives the same result as the NEC output (subject to the limitations of NEC and your code), then you say, "yes indeedy, my program works ok!"

Jim
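For readers who want to try Jim's advice about good test cases in practice, here is a minimal sketch (my own illustration, not anything posted in this thread; it assumes Python with NumPy and SciPy available). It runs SciPy's Nelder-Mead implementation on the Rosenbrock function, whose minimum is known exactly, and checks the result against that known answer, which is the sort of known-answer reasonableness check Jim describes.

```python
# Hypothetical validation check for an optimizer, in the spirit of Jim's advice:
# run a case whose answer is known exactly and compare.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    """Classic test function; global minimum is 0 at x = (1, 1)."""
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

result = minimize(rosenbrock, x0=np.array([-1.2, 1.0]), method="Nelder-Mead",
                  options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 2000})

expected = np.array([1.0, 1.0])
error = np.max(np.abs(result.x - expected))
print(f"found minimum at {result.x}, worst-case error {error:.2e}")

# If the error is not small, either the optimizer implementation or the way
# it is being driven (starting point, tolerances) deserves a closer look.
assert error < 1e-4, "optimizer failed the known-answer test case"
```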
EZNEC and Linux
Among EZNEC users are aerospace companies, U.S. and foreign government
agencies, universities, domestic and international broadcasters, cell phone providers, space agencies, telecommunications companies, and a very wide range of others. They are designing antennas which you make use of daily. The purchaser truly is in the position of caveat emptor, and the legal notice included with EZNEC carefully spells this out in legalese. But EZNEC also has an unconditional full money back satisfaction guarantee, and none of the above users have asked for a refund. Roy Lewallen, W7EL
EZNEC and Linux
On 5 Mar, 09:38, Jim Lux wrote:
... For example, everybody runs the examples in the NEC manual, because those have been extensively validated, so if your code gives the same result as the NEC output (subject to the limitations of NEC and your code), then you say, "yes indeedy, my program works ok!" Jim

Well put together Jim, but it doesn't address what I am talking about. People generally consider antenna programs as being accurate (though they are not, actually, depending on the programmer).
I explained how statics theorems can also include electromagnetics, but people look for a book without using their own brain, and yet they will accept a computer program. Now I put the burden back on them by asking them to place random numbers into a program with variables to determine the best array for a particular benefit, and where I state that the computer will not provide them with a yagi. The program confirms my teachings, yet nobody can fault it, and they are not willing to either agree or fault a computer program. I have applied for a patent after many years of work, and it is the PTO that matters to me now if amateurs wish to wave their hands. See my other thread where I asked readers to check out a particular program for me to further prove my case. Thanks Jim for guiding me to the Rutgers antenna book that is on the web. Chapter 21 finally gave me the go-ahead to file. Regards Art
EZNEC and Linux
art wrote:
Even asked Arie to check his, but only silence reigned, which emphasises that people are just lazy or choose to remain silent when an unsuitable answer occurs.

Aha, could be so. I can't remember, or maybe I have banned it from my memory :-) But I agree I am also lazy, and there are a lot of questions for which I do not have an answer (yet), but (I hope) I have an open mind and I am never too old to learn.

This also emphasises what a great job W4RNL is doing for ham radio, where he points out where all the programs differ and who he perceives as correct.

I agree completely. Maybe the future will learn that we were fooled by Nec2/4 and/or other software, but as long as I do not have (or am not willing to spend the funds for) a more accurate method of predicting behaviour or performance, I am afraid I will have to stick with it.

To me it shows that the human mind really only believes what it wants to believe, so a program with high gain results is the best seller even though inaccurate.

Hmm, is it so? I don't know. If you ask me, every person has his own motives to decide whether he buys or uses one program or the other, and how much he trusts the results obtained with the method(s) or underlying software used by the program. Arie.
EZNEC and Linux
On 6 Mar, 01:54, "4nec2" wrote:
... If you ask me, every person has his own motives to decide whether he buys or uses one program or the other, and how much he trusts the results obtained with the method(s) or underlying software used by the program. Arie.

So what does your program provide? Art
EZNEC and Linux
"J. Mc Laughlin" wrote in
: ... and many other things exist to frustrate accurate computing. Long, long ago, when translating antenna modeling code (written in FORTRAN) that had given reasonable results on a 60 bit/word CDC computer to an IBM 32 bit/word computer, I found the code for one antenna type to be unsalvageable. No matter what was done with concatenating words together, garbage resulted. A close look found that the algorithms used were much too sensitive to significant figures.

Though the 60 bit CDC machines were regarded as the ants pants by engineers and scientists, the IBM 370 machines (and later) using double precision were better. The tricky bit was (IIRC) that the representation of reals on CDC machines used a base of 2 for the exponent, whereas the IBM format used 2^16, and obviously the two machines allocated a different number of bits to the mantissa and exponent. It was hard to state the extent of improvement in precision in the IBM format due to the use of the larger number for the exponent base. On occasions, this gave rise to different results from programs ported from one to the other. It might have seemed like splitting hairs, but it showed how close to the wind some of the programs ran in terms of numerical stability.

I recall in the early days of Excel (V2???), when Microsoft first allowed user-developed add-ins (DLL only, they hadn't thought of VBA), I wrote a function library for Erlang functions (and some other traffic functions). A chap I was doing some work for asked for a spreadsheet to resemble a set of printed Erlang tables, and he went through checking them. When challenged about small differences, I offered "well see, the engineer who developed that set of tables as a major project probably used a CDC machine with a mere 60 bit real representation (which was thought to be the ducks guts in its heyday), but I have used the Intel 80 bit reals inside the routines, and although Excel only uses 64 bit reals, accumulated rounding errors inside the function library are reduced (Erlang is an iterative calculation, but can be optimised to reduce effects of rounding and overflow)". He was convinced, but I think somewhat disappointed to see a low cost desktop computer providing a more accurate solution than the iconic CDC. Our mobile phones have probably got more powerful processors now than the 386/SX16 that I used to develop that library!

Owen
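Owen's aside that Erlang is an iterative calculation which can be organised to limit rounding and overflow refers to how the Erlang B blocking probability is usually computed: with a recurrence rather than the direct formula, whose powers and factorials overflow quickly. The sketch below is my own illustration in Python, not Owen's original DLL code; the function name and structure are invented for the example.

```python
# Erlang B blocking probability via the standard recurrence:
#   B(0, A) = 1
#   B(n, A) = A*B(n-1, A) / (n + A*B(n-1, A))
# Working with the recurrence keeps every intermediate value in [0, 1],
# avoiding the factorials and powers that overflow in the direct formula.
def erlang_b(servers: int, traffic_erlangs: float) -> float:
    b = 1.0
    for n in range(1, servers + 1):
        b = (traffic_erlangs * b) / (n + traffic_erlangs * b)
    return b

if __name__ == "__main__":
    # Example: 10 circuits offered 5 erlangs of traffic.
    print(f"blocking probability: {erlang_b(10, 5.0):.6f}")
```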
EZNEC and Linux
Owen Duffy wrote in
: Ah, the clarity after the send button is pressed:

exponent, whereas the IBM format used 2^16, and obviously the two

should be

exponent, whereas the IBM format used 2^4 (16), and obviously the two
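The base-16 exponent being corrected here is also why Owen found it hard to state the extent of the precision improvement: with a hexadecimal exponent base, normalisation only guarantees a non-zero leading hex digit, so up to three leading bits of the fraction can be zero and the effective precision wobbles. A rough comparison, using commonly cited field widths rather than figures from this thread, might look like this:

```python
import math

# Rough comparison of effective precision. The field widths below are the
# commonly cited figures for these machines (an assumption of this example,
# not taken from the thread):
#   CDC 6600 single:        48-bit binary mantissa, base-2 exponent
#   IBM S/370 double (hex): 56-bit fraction, base-16 exponent, so normalisation
#                           can leave up to 3 leading zero bits in the fraction.
cdc_bits = 48
ibm_best, ibm_worst = 56, 53   # 56 minus up to 3 wasted leading bits

for label, bits in [("CDC 6600", cdc_bits),
                    ("IBM hex double (best case)", ibm_best),
                    ("IBM hex double (worst case)", ibm_worst)]:
    print(f"{label:28s} ~{bits * math.log10(2):.1f} decimal digits")
```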