Page 3 of 4
Results 31 to 45 of 53
  1. #31
    Join Date
    Apr 2000
    Location
    Belgium (Europe)
    Posts
    4,626

    Re: IF-free conception.

    Depends which "navy" you refer to. If you mean the American one, then they're like most of the other US armed forces (and the militaries of other NATO countries, for that matter): they increasingly contract development work out to public companies (or expect it built into the specific hardware). Public companies have no interest in "magic tools" or unfamiliar development tools they can't find developers for.

    I've done work for companies that have military contracts, and I've never even seen anything written in Ada.

  2. #32
    Join Date
    Feb 2013
    Posts
    58

    Re: IF-free conception.

    Sure, performance no longer comes from an ever increasing clock rate but from an increasing number of cores. (#1)And that puts concurrent programming at the centre stage.

    But the solution is not a holy grail or a fifth element (as the proposal you referred to suggests). (#2)The solution is much more mundane than that, namely skilled programmers.
    Razzle, through the prism of HPC, you need to keep the number of threads as low as you can: even in the best scenario, MT algorithms only boost performance linearly. For instance, the performance of 4,000,000 threads could be X + X/BSDF with BSDF >= 100 (X being the performance of, say, 1,000,000 threads, and BSDF the boost slowdown factor). Moreover, BSDF could be a negative number as well. In other words, the more threads, the more management losses we get.

    #2: I never argued that.
    If the Navy has money to spare they should put it into Ada, a "magic tool" style project the military already started. Most of the existing Navy software should be in Ada already, at least if they live by their own rules. Any advances should be made public and freely available for the benefit of everyone.

    The Second Coming of Ada would be a much better project, and I think the language is worth it.
    I don't see the least reason to share your optimism: high-level programming (no matter whether C/C++, Ada, Fortran, and so on) depends upon the compiler's capability to generate efficient output. Actually, all compilers sing the same song to optimize output. The only difference between HL languages is specialization for different spheres: C/C++ is more suitable for systems programming, Fortran for math stuff. By the way, this seems the ideally iconic quote about HL languages:
    John Backus said during a 1979 interview with Think, the IBM employee magazine, "Much of my work has come from being lazy. I didn't like writing programs, and so, when I was working on the IBM 701, writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs."[6]

    http://en.wikipedia.org/wiki/Fortran
    Nothing can illustrate the very roots of HL programming better.
    Last edited by S@rK0Y; July 31st, 2014 at 03:40 PM.

  3. #33
    Join Date
    Jul 2013
    Posts
    576

    Re: IF-free conception.

    Quote Originally Posted by OReubens View Post
    Depends what "navy" you refer to.
    I'm referring to the Navy of the proposal the OP linked to:

    http://www.navysbir.com/n13_1/N131-061.htm

  4. #34
    Join Date
    Jul 2013
    Posts
    576

    Re: IF-free conception.

    Quote Originally Posted by S@rK0Y View Post
    through prism of HPC,
    I've stopped arguing with you since I don't understand your point.

    In my last post I just put in my two cents on this proposal you referred to,

    http://www.navysbir.com/n13_1/N131-061.htm

    My point was that it looks very much like a bureaucrat's wet dream. A magic tool you apply to any crappy code and it will turn into a wonder of excellence. It's an unrealistic dream.

    Instead the Navy could put its money into Ada, a magic-tool-style project the US military already started. It's time to finish up.

    Ada is a sad story. Many authority figures of computer science who were around in the 1980s were highly critical of Ada. In hindsight this is hard to understand, but the result was that Ada never got the chance it deserved.

    Anyway, the results of the research & development effort that would go into improving Ada could then be used to improve other high-level languages as well. Ada gets a second chance and programming languages in general may benefit.

    But as I said, that's just my two cents.

    Thank you for this discussion. I think we reached a consensus. We don't agree on anything!
    Last edited by razzle; August 1st, 2014 at 09:47 AM.

  5. #35
    Join Date
    Oct 2008
    Posts
    1,456

    Re: IF-free conception.

    Quote Originally Posted by S@rK0Y
    high-level programming [...] depends upon compiler's capability to gen efficient output.
    ... and asm depends upon the programmer's ability to generate efficient code. Are you really claiming that a programmer is better than a compiler solely on the assumption that the former can "potentially" outperform the latter? it's like saying that gambling is better than a regular job because the former can give "potentially" unlimited earnings whilst the latter cannot ... it's a meaningless (and wrong) statement unless some strong numbers are produced ...

    Quote Originally Posted by S@rK0Y
    Nothing can illustrate the very roots of hl programming better
    this is unfair; the "no more than 10% slowdown" philosophy has been adopted by the C++ community since day one, and performance has always been a primary concern for many high-level language designers.

    In fact, the compiler is never left alone in translating HL code into machine code; it's continuously supervised by the programmer in all phases of software development. Others have already mentioned the choice of algorithms and data structures (which obviously do take into consideration the potential hardware they'll run on), but modern compilers and compiled languages allow more and more sophisticated ways of "telling" the compiler what the programmer wants during the compilation stage (consider generic programming in modern C++). Sub-language-dependent optimizations like the one you suggest (e.g. those depending on hardware, OS, usage patterns, etc.) are encapsulated into library facilities capable of translating into nearly optimal code (modulo the corner cases OReubens already spoke about).

    Indeed, "high levelness" does not (always) mean "complexity hiding" as you seem to think; it means abstraction, that is, just a different way of thinking about and representing the same thing, which may or may not incur a (relevant) performance penalty (or may even give better performance, if a broader view of the development process is assumed, taking into account its social (on both the producer and consumer sides) and business nature).

    For example, in this post someone claimed that qsort+MFC strings was better than std::sort+MFC strings and that this was somehow a consequence of the "higher levelness" of the STL algorithm. Actually, the comparison not only turned out to be unfair (qsort essentially worked on C-string pointers via some Microsoft-specific behavior whilst std::sort worked on the actual class), but if you asked the STL routine to perform the truly semantically equivalent version of the qsort code, then std::sort eventually turned out faster (while retaining better type safety and expressivity). This is just a warning against taking for granted that abstraction automatically translates into performance degradation (especially with modern inlining compilers and compile-time language facilities).

    So, whilst I agree that no programming language will ever automagically optimize itself, I still believe that there's nothing preventing "abstract" code from producing optimal machine bytes as long as the abstraction is sufficiently good at representing the actual problem domain (again, take a look at the performance-oriented generic C++ libraries out there ...).
    Last edited by superbonzo; August 1st, 2014 at 03:39 AM.

  6. #36
    Join Date
    Apr 2000
    Location
    Belgium (Europe)
    Posts
    4,626

    Re: IF-free conception.

    Quote Originally Posted by superbonzo View Post
    ...are incapsulated into library facilities capable of translating into nearly optimal code
    Correct, but a somewhat incomplete view of things (which, granted, is sometimes hard to get people to wrap their heads around).

    Compilers are getting better, CPUs are getting better. This means that the opportunity cost of leaving code unoptimized becomes lower and lower, while the investment cost of optimizing remains the same (or, on the contrary, rises: more complex CPUs are making hand-optimizing increasingly harder).
    A library doesn't even have to provide 'nearly optimal' code. Often it's more about 'acceptable compromises', and some of those can be quite far from the ideal (but have benefits elsewhere that make them an acceptable compromise).

    Allow me to elaborate.
    Suppose I have a chunk of code. This chunk of code is "considered to be slow". The result takes 11 seconds, when it is expected to be near instant (or let's say 1 second).
    100 people use that software daily to get 100 results; your opportunity cost is 10 x 100 x 100 seconds per year.

    Assuming you can get it to run at 1 sec, that means either...
    the company accepts that it takes 10 seconds and loses 100,000 seconds of worker time per year (spread over a mix of junior, mid-level and maybe a few upper-level employees),
    or it decides it might be worth letting an expert programmer have a look at it... if that programmer spends more than 100,000 sec of time (about 27 hours, which isn't a lot for a dedicated optimization effort of this degree), they lose out. In fact they probably lose out a lot sooner than that, because the expert programmer will have a bigger salary than the junior employees.

    Hardware gets upgraded, so the 10 sec becomes 5, while the expectation is still 1 sec.
    As hardware becomes better, the need to optimize lessens, but the effort to recode stays the same.


    This isn't a reason to just go write sloppy code; it just means that the "normal" programming effort is usually going to be enough. And yes, there are cases where the opportunity cost is very high and eking out even nanoseconds of performance matters. The fact that I get far more requests to take on optimization jobs than I can handle (and it's not cheap, partly because of that) is proof of that.
    Last edited by OReubens; August 1st, 2014 at 08:49 AM.

  7. #37
    Join Date
    Oct 2008
    Posts
    1,456

    Re: IF-free conception.

    Quote Originally Posted by OReubens View Post
    Correct, but a somewhat incorrect view of things
    I never said libraries are always nearly optimal, nor that they should always be, nor that the broader picture should be ignored ...

    the issue here is whether it's possible to write high-level language libraries that are simultaneously nearly optimal and sufficiently general ... I think so; the OP thinks not (at least that's how I interpreted the OP's stance ... sorry if I got it wrong).

  8. #38
    Join Date
    Jul 2013
    Posts
    576

    Re: IF-free conception.

    Quote Originally Posted by superbonzo View Post
    the "no more than 10% slowdown" philosophy has been adopted by the c++ community since day one
    Certainly there is no such thing as an "acceptance of a 10% slowdown" philosophy in C++.

    The C++ philosophy is that you only pay for what you use.

    The figure 10% comes from the anti-OO camp's claim that using OO will set you back 10%.

    This is not true but that's another discussion.
    Last edited by razzle; August 1st, 2014 at 04:13 PM.

  9. #39
    Join Date
    Feb 2013
    Posts
    58

    Re: IF-free conception.

    Quote Originally Posted by superbonzo View Post
    .. and asm depends upon programmer's ability to gen efficient code. Are you really claiming that a programmer is better than a compiler solely on the assumption that the latter can "potentially" outperform the former ?
    Superbonzo, optimized code isn't gambling at all. It's just a clear understanding of your very goal and the methods to achieve it. Theoretically, we could use ATP (http://en.wikipedia.org/wiki/Automated_theorem_proving) systems; however, they're far from a silver bullet too.

    P.S.
    Dear Friends, I'd like to stop such disputes. IMHO, it's a straight road to nowhere. I'd appreciate it if someone posted their benchmarks of fsort (no-if). Such posts would be helpful for researching the subject.

  10. #40
    Join Date
    Oct 2008
    Posts
    1,456

    Re: IF-free conception.

    Quote Originally Posted by S@rK0Y View Post
    Superbonzo, optimized code ain't gambling at all.
    uh, did I say that optimization is like gambling!? sometimes I really feel my English is total crap ...

    Quote Originally Posted by S@rK0Y View Post
    Dear Friends, i'd like to stop such disputes. imho, it's straight road to nowhere
    sure, but it's a pity that adult people cannot explain themselves (and carefully read other people's replies) and come to some form of educated agreement; after all, we're here to enrich each other by sharing opinions, aren't we?

    Quote Originally Posted by S@rK0Y View Post
    i'll appreciate, if someone(s) gonna post their benchmarks onto fsort(no-if). such posts shall be helpful to research subj.
    well, the point is that many here think those benchmarks have poor practical and theoretical meaning, for the reasons explained before; so if you refuse to address those issues, it's unlikely anybody will take the time to post them just for curiosity's sake ...

  11. #41
    Join Date
    Feb 2013
    Posts
    58

    Re: IF-free conception.

    Quote Originally Posted by superbonzo View Post
    well, the point is that many here think that those benchmarks have poor practical and theoretical meaning for the reasons explained before, so if you refuse to address those issues then it's unlikely anybody will take their time to post them just for curiosity sake ...
    Superbonzo, well, I'd like to share my view: I think practice always has the last word. Everyone can say anything, but only Goddess Practice can determine who is right or wrong in practice, and I've met no serious evidence to make me believe in the power of compilers. At most, compilers are just stuffed with standard templates to translate pseudo-code into asm output. Automatic optimization only chooses the set of those templates (according to your choice or the default state). No magic, isn't it? Yes, it's possible to gear a compiler up with the capability to generate dynamic solutions; however, that way is very restricted, because it's extremely time-consuming and memory-consuming as well. The combinatorial pit (explosion) is a very Beast to fight against if you want really sophisticated automation.

  12. #42
    Join Date
    Oct 2008
    Posts
    1,456

    Re: IF-free conception.

    >>> No magic, ain't it?

    some credit, please; everybody's aware of the limits of compilers and of the computational complexity involved in optimization in general (BTW, modern compilers really are more sophisticated than you seem to suggest, but that's another story ...).

    >>> only Goddess Practice can determine who is right or wrong

    it's not that easy, because practical results are not always easy to compare, and "false winners" are just too common.


    There are three separate points, I think: a practical issue, 1) why/when one should bother micro-optimizing; a theoretical issue, 2) whether there exists an intrinsic limit preventing high-level languages from reaching near-optimal code; and finally a specific issue, 3) concerning your qsort/fsort comparison.


    regarding 1), I hope you agree that the decision to micro-optimize is like any other decision involving the adoption of some technology/labour/machinery in a production process. It's a balance of factors that may end up being positive or negative, since it depends on the nature/value of all the production factors involved in the process. So it doesn't matter whether compilers are evil and programmers are geniuses or not; what matters is how these compare to each other in terms of cost of production and value produced, both often evolving in time.

    Now, you said a programmer can potentially beat a compiler by orders of magnitude (true in some cases; I really doubt it in all generality), but you didn't explain how such an advantage should translate into actual business value (taking into account the time and risks associated with such an endeavor), hence making the claim bogus. Note that the programmer faces the same complexity problems a compiler faces when systematically attacking those optimization problems, and there's no guarantee the former will come up with an (orders of magnitude) better solution. As a somewhat related example, consider that it took four centuries to solve the seemingly easy 3D sphere-packing optimization problem (the Kepler conjecture), ironically with the aid of a computer-assisted proof ... !


    concerning 2), I do agree with you that no "universal" optimizing compiler could ever exist, BUT I do believe it's possible to generate optimal code via domain-specific facilities; that is, good-quality abstractions should be able to "guide" the compiler to generate optimal bytes (so, in a sense, it's a form of high-level micro-optimizing).


    regarding 3), others have already pointed out the issues. The problem is that sorting is not a good candidate for testing the effectiveness of micro-optimizations, for the aforementioned reasons. Maybe it would be better if you implemented some other, less data-sensitive, simpler algorithm, showing predictively and unambiguously how the asm-level optimization impacts the actual execution time ...

  13. #43
    Join Date
    Jan 2006
    Location
    Fox Lake, IL
    Posts
    15,007

    Re: IF-free conception.

    They'll be laughing at this thread in 50 years, I'd bet
    David


  14. #44
    Join Date
    Oct 2008
    Posts
    1,456

    Re: IF-free conception.

    wow, that's half a century of free entertainment! should we ask for royalties?

  15. #45
    Join Date
    Feb 2013
    Posts
    58

    Re: IF-free conception.

    Quote Originally Posted by superbonzo View Post
    regarding 1), I hope you agree that the decision of microoptimizing it's like any other decision involving the adoption of some technology/labour/machine in a production process. [...] what matters is how these compare to each other in terms of cost of production and value produced, often evolving in time.
    Superbonzo, at most, optimization doesn't need a genius programmer; it needs long, standard routines to analyze existing bottlenecks and possible bypasses. And I'll repeat yet another time: modern compilers are very poor at true HPC. If you had checked out the fsort (no-if) code, you'd have seen some minor tricks compilers haven't done, but that "magic" significantly boosts performance, and it's portable across Intel-based "stones". For instance, several "ifs" can be substituted by just one! No magic, just good asming does it.

