
February 5th, 2004, 10:27 PM
#1
The most Important factor in a certain algorithm
I have read many posts on these forums about algorithms, and have looked at some material about how to analyze simple algorithms. However, I still do not understand how to determine the most important factor in a given algorithm, especially when there are many loops and many different computations inside those loops. How do I know which one to choose among them?
Can anyone please help me?
Thanks a lot,
Last edited by Charleston; February 5th, 2004 at 10:29 PM.

February 6th, 2004, 08:34 AM
#2
Charleston,
I assume you are talking about the most important factors influencing the runtime performance of the algorithms.
For many common problems, there are several existing algorithms, and it is often known which algorithms are faster or better suited to computer implementation.
After you have chosen an algorithm and implemented it, you can study the runtime characteristics in detail using a code profiler.
Search the forum for the keyword 'profiler'. Also have a look at the link below; I think you will be able to find some useful additional information.
http://www.codeguru.com/forum/showth...ht=performance
Good luck with the analyses.
Sincerely, Chris.
You're gonna go blind staring into that box all day.

February 6th, 2004, 09:24 AM
#3
Each algorithm has its own time complexity (which is the most important parameter of an algorithm). Unfortunately, what usually happens is that faster algorithms are harder to implement. Of course, your selection may depend on several things.
For example, if you want to sort an array of 100 numbers, you don't have to use the fastest algorithm, because even an algorithm of O(n^2) (slow) complexity would execute in a few milliseconds. You shouldn't bother implementing a faster algorithm (e.g. O(n log n)). On the other hand, if your array holds some millions of elements, then you probably should...
As I said, faster algorithms are sometimes harder to implement. For example, recursive algorithms are usually slower but easier to implement. So in general you have to balance time complexity (and perhaps memory complexity) against implementation difficulty.
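To make the trade-off concrete, here is a minimal sketch of my own (not code from this thread): an O(n^2) insertion sort, which is perfectly adequate for ~100 elements, next to the library's O(n log n) sort, which you would want for millions of elements.

```cpp
#include <algorithm>
#include <vector>

// O(n^2) insertion sort: fine for ~100 elements,
// far too slow once n reaches the millions.
void insertion_sort(std::vector<int>& a) {
    for (std::size_t i = 1; i < a.size(); ++i) {
        int key = a[i];
        std::size_t j = i;
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];  // shift larger elements to the right
            --j;
        }
        a[j] = key;
    }
}

// For large arrays, the O(n log n) library sort is the easier
// and faster choice.
void fast_sort(std::vector<int>& a) {
    std::sort(a.begin(), a.end());
}
```

Both produce the same result; the point is that the simpler quadratic version only becomes a problem as n grows.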

February 6th, 2004, 09:52 AM
#4
Thank you, Chris and yianakop for your answers,
Yes, algorithmic complexity is what I am teaching myself, and I have been stuck on how to compute a given algorithm's running time, which mainly comes down to "the most important factor in that algorithm". This factor will help me determine whether it is O(n^2), O(n), O(n log n), etc...
Would you please give me some more ideas about this?
Thanks,

February 6th, 2004, 10:02 AM
#5
Charleston,
I am certainly no expert and only have some general background information.
In general:
The order of computational complexity scales with the dimensions of the problem, i.e. one-dimensional corresponds to O(N), two-dimensional to O(N^2), three-dimensional to O(N^3), etc.
Many algorithms profit from logarithmic scaling, as pointed out by yiannakop. This generally applies to anything that can be cut up using interval-halving methods. That kind of thing is very common in search algorithms and sorting, as well as in certain mathematical problems such as FFTs.
I think there must be good web-based tutorials or books discussing this topic. As a matter of fact, one of the well-known Knuth books has a decent treatise on computational complexity.
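As a concrete instance of interval halving, a binary search over a sorted array discards half of the remaining range at every step, so it needs at most about log2(N) probes. This is a sketch of my own, not taken from any particular book:

```cpp
#include <vector>

// Binary search on a sorted vector: each iteration halves the
// remaining interval [lo, hi], giving O(log N) probes overall.
// Returns the index of target, or -1 if it is not present.
int binary_search_idx(const std::vector<int>& a, int target) {
    int lo = 0;
    int hi = static_cast<int>(a.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;  // avoids overflow of lo + hi
        if (a[mid] == target) return mid;
        if (a[mid] < target)
            lo = mid + 1;  // target can only be in the right half
        else
            hi = mid - 1;  // target can only be in the left half
    }
    return -1;
}
```

Doubling the array size adds only one extra probe, which is exactly the logarithmic scaling described above.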
Sincerely, Chris.

February 6th, 2004, 10:13 AM
#6
Originally posted by dude_1967
Charleston,
I am certainly no expert and only have some general background information.
No, please don't say that. What you know is really much more than the knowledge I have about this topic (and many others). I am very grateful for your ideas and suggestions...
In general:
The order of computational complexity scales with the dimensions of the problem, i.e. one-dimensional corresponds to O(N), two-dimensional to O(N^2), three-dimensional to O(N^3), etc.
Many algorithms profit from logarithmic scaling, as pointed out by yiannakop. This generally applies to anything that can be cut up using interval-halving methods. That kind of thing is very common in search algorithms and sorting, as well as in certain mathematical problems such as FFTs.
I think there must be good web-based tutorials or books discussing this topic. As a matter of fact, one of the well-known Knuth books has a decent treatise on computational complexity.
Sincerely, Chris.
I will try to remember that...
Once again, thanks a lot, Chris,

February 6th, 2004, 10:32 AM
#7
Re: The most Important factor in a certain algorithm
Originally posted by Charleston: However, I still do not understand how to determine the most important factor in a given algorithm, especially when there are many loops and many different computations inside those loops. How do I know which one to choose among them?
Conceptually, for many cases it's not that hard. You start by analyzing the program flow. If there are recursive function calls, you need to (at least in your mind or on paper) unroll the recursion and rewrite it as a loop.
Then you have to quantify your input. Which parameters influence how the algorithm scales? In some cases this is easy, in other cases a bit harder. Most of the time it is a single number, say N, but sometimes it depends on several numbers. For the classic sorting example, N would be the number of elements in the array.
Then, when your code structure consists only of loops and ifs, you start from the outside. How many times is the outer loop executed (as a function of N)? Then check how many times the inner loops are executed. This requires either intuition (and experience) or a very careful analysis.
Finally, you multiply the inner loop's complexity by the outer loop's complexity (and so on, if there are more nested loops).
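That outside-in counting can be sketched with a hypothetical all-pairs loop of my own (not code from this thread): the outer loop runs N times, the inner loop runs i times on iteration i, so the body executes N(N-1)/2 times in total, which is O(N^2).

```cpp
// Count how often the innermost body of a classic doubly nested
// loop (e.g. comparing all pairs of N elements) actually executes.
long long inner_body_runs(int n) {
    long long runs = 0;
    for (int i = 0; i < n; ++i)      // outer loop: n iterations
        for (int j = 0; j < i; ++j)  // inner loop: i iterations
            ++runs;                  // the "work" being counted
    return runs;                     // equals n * (n - 1) / 2
}
```

For n = 10 the body runs 45 times and for n = 100 it runs 4950 times: roughly 100x the work for 10x the input, the signature of O(N^2).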

February 12th, 2004, 12:14 PM
#9
It's important to note that the 'old' practice of counting cycles/instructions, or counting how many compares/moves you're going to need, isn't going to work reliably on today's computers and OSes anymore.
Too many 'outside influences' affect how a particular program/algorithm will run in a real-life situation.
While counting will still give you a rough estimate of how two algorithms compare, actually implementing the algorithm is a whole different thing.
Example: while many programmers would state that quicksort is the fastest sorting algorithm on large datasets, I have actually seen quicksort run 10+ times slower than a much simpler sorting algorithm (along the lines of a heapsort) on a very large array.
Even though the quicksort did considerably fewer compares/moves than the heapsort, it was still 10x slower... It turned out that the somewhat 'random' way quicksort accessed the array was causing much more paging than the other algorithm, which worked on much more localised parts of the array.
It turned out that the programmer of that particular piece of software had actually spent time devising an algorithm that would efficiently sort a large array in a low-memory situation (which was always the case on that particular machine).
Don't assume one algorithm is better just because 'it seems to do less'. You also need to take into account everything that goes on behind the scenes at the low level. The _ONLY_ way to really know one method is better than another is to implement both and do an actual real-life comparison.
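A minimal way to do such a real-life comparison is to time each candidate on the actual data. This sketch uses std::chrono, which comes from later C++ standards than this 2004 thread, and the helper name is my own invention:

```cpp
#include <chrono>

// Measure the wall-clock time of an arbitrary callable in
// milliseconds. Run each candidate algorithm on realistic data
// and compare the numbers instead of guessing from theory.
template <typename F>
double time_ms(F&& work) {
    auto t0 = std::chrono::steady_clock::now();
    work();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}
```

For example, `time_ms([&]{ std::sort(v.begin(), v.end()); })` against the same call wrapping the custom sort, on a copy of the same data, would have revealed the paging problem described above immediately.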
Note that 'generic' solutions like the MFC/ATL or STL containers provide a good baseline for most programs. They have been made to provide good-to-adequate performance for a reasonable amount of code size, and they work well for 'generic' problems, which is what 99% of programmers need 99% of the time.
If you're dealing with extremely large amounts of data, or need a very specific way of dealing with the data, chances are MFC/STL will still do the job, but with horrible performance or memory overhead.
Generic type problems can use generic solutions.
Specific problems need specific solutions.

February 12th, 2004, 03:44 PM
#10
Thanks a lot, OReubens, for the advice and instructions.
You seem to know those algorithms thoroughly; I will follow your advice.

February 12th, 2004, 03:56 PM
#11
You seem to know those algorithms thoroughly
Well... I have been programming for just over 20 years now, so I would be a bad excuse for a programmer if I hadn't learnt a thing or two by now ;)
Besides, 75% of my job consists of finding bottlenecks in existing software and thinking of other/better ways of achieving the same result in less time, less memory, or both.
The first step in trying to optimise something is always trying to find a better algorithm.

February 12th, 2004, 04:31 PM
#12
Originally posted by OReubens
The first step in trying to optimise something is always trying to find a better algorithm.
Thanks, I will try to find a better algorithm for my problem...
