Analysis of Algorithms (Mike Jacobson, CPSC 331)
In this course we will consider the amount of resources, including running time and storage space, that an algorithm uses to solve a problem.
We will be primarily interested in the resources used when an algorithm is used with large inputs. We will also try to focus on aspects of the behaviour of algorithms that are (largely) independent of the computer on which a program is executed and other implementation details.
We will begin with a discussion of issues that must be addressed if you wish to measure the resources used by an algorithm as a function of the size of the input.
The second class on this topic will present techniques for discovering a useful bound on the time used by an algorithm, working from the algorithm's pseudocode.
At this point we will have discovered a function of the input size that describes the amount of resources used by a given algorithm.
In order to make use of this, we need notation that
focuses on the most significant information about these functions that we will need, and
helps us to choose among the algorithms that can be used to solve a given problem.
This notation for describing functions will be introduced during the final lecture on this topic this week. We will also discuss a few of the functions that frequently need to be considered when the running times of algorithms are examined and compared.
The textbook is a very good reference for this topic. It includes the material that will be covered in lectures as well as a significant amount of additional material that students might find to be useful.
Section 2.2 includes the introductory material that will be covered in the first lecture.
Appendix A includes material on “summations” that is needed to analyze algorithms with loops, while Section 2.3 includes the analysis of a recursive “divide and conquer” algorithm. These can serve as references for the subject of the second lecture.
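As a small illustration of how a loop gives rise to a summation, the sketch below (the class and method names are illustrative, not taken from the course materials) counts the comparisons performed by a selection-sort-style pair of nested loops. The inner loop runs n-1-i times for each value of i, so the total is the summation from i = 0 to n-2 of (n-1-i), which has the closed form n(n-1)/2.

```java
// Illustrative sketch: counting the basic operations of a nested loop,
// so that the count can be compared against the summation's closed form.
public class LoopCount {

    // Counts the comparisons a selection-sort-style double loop would make
    // on an input of size n. Each inner iteration performs one comparison.
    static long countComparisons(int n) {
        long count = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = i + 1; j < n; j++) {
                count++;   // one comparison per inner iteration
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 10;
        // The summation sum_{i=0}^{n-2} (n-1-i) = n(n-1)/2 predicts the count.
        System.out.println("measured:    " + countComparisons(n));
        System.out.println("closed form: " + ((long) n * (n - 1) / 2));
    }
}
```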
Asymptotic notation (the subject of the final of this week’s lectures on the analysis of algorithms) is covered in Chapter 3.
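As a preview of that notation, the standard definition of Big-Oh (as it is usually stated in textbooks, including in Chapter 3) is the following:

```latex
f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 0 \ \text{such that}\ 0 \le f(n) \le c \cdot g(n) \ \text{for all}\ n \ge n_0 .
```

Informally, g is an upper bound on f's rate of growth, once small inputs and constant factors are ignored.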
The textbook also includes more advanced material on the analysis of algorithms that students will see later on. In particular, Chapter 4 includes material that is useful for the analysis of divide and conquer algorithms, as well as other recursive algorithms. This is generally covered in CPSC 413.
Finally, the textbook includes material on this topic that is sometimes covered in more advanced optional courses, including introductions to different kinds of analyses of algorithms: Chapter 5 is a good reference for average-case analysis, while Chapter 17 provides information about amortized analysis.
After studying this topic in this course (including working through exercises that are provided) students should understand and be able to do the following.
Students should understand the objectives and limitations of the kind of “asymptotic analysis” that has been discussed in the lectures on this topic.
Students should understand and be able to define several of the types of analysis that have been mentioned, including “worst-case” analysis and “average-case” analysis. Students should also understand the difference between these, as well as the strengths and limitations of each type of analysis.
Students should know what “summations” and “recurrences” are, and they should understand how these arise when you are trying to estimate the worst-case running time of an algorithm.
Students should be familiar with “asymptotic notation” and the use of this notation to compare the rates of growth of running times.
Students should be familiar with functions that are frequently used to state the asymptotic rates of growth of running times, including polynomials of various degrees, logarithms, and exponential functions. They should understand the relative rates of growth of these functions and they should be able to use asymptotic notation to state relationships between these rates of growth.
Students should be able to use information like the above to choose between different algorithms for the same problems, given information about resource constraints that must be satisfied.
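The choice described above can be made concrete with a small sketch (the class and method names are illustrative, and the operation budget is an assumed figure, not one from the course): given a fixed budget of basic operations, compute the largest input size that an algorithm with each running-time function can handle.

```java
// Illustrative sketch: how large an input each running-time function
// allows under a fixed budget of basic operations.
public class GrowthBudget {

    // Largest n with n^2 <= budget.
    static long maxQuadratic(long budget) {
        return (long) Math.floor(Math.sqrt((double) budget));
    }

    // Largest n with n * log2(n) <= budget, found by binary search.
    static long maxNLogN(long budget) {
        long lo = 1, hi = budget;
        while (lo < hi) {
            long mid = lo + (hi - lo + 1) / 2;
            double cost = mid * (Math.log(mid) / Math.log(2));
            if (cost <= budget) {
                lo = mid;   // mid is affordable; try larger inputs
            } else {
                hi = mid - 1;
            }
        }
        return lo;
    }

    public static void main(String[] args) {
        long budget = 100_000_000L;   // assumed budget: 10^8 basic operations
        System.out.println("n^2 algorithm handles n up to     " + maxQuadratic(budget));
        System.out.println("n log n algorithm handles n up to " + maxNLogN(budget));
    }
}
```

Under this (assumed) budget, a quadratic-time algorithm handles inputs of size about ten thousand, while an n log n algorithm handles inputs in the millions; this is the kind of comparison asymptotic notation makes possible without running either program.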
Exercises concerning the analysis of running times of algorithms and asymptotic notation will be made available during the week.
As usual in this course, students will be expected to have read these exercises, and to have spent a considerable amount of time trying to solve the problems on them, before attending the tutorials where they will be discussed.
What is the objective of the “asymptotic analysis” of an algorithm (that is, what kind of property does this try to measure)? Why is this useful?
What is the “worst-case” running time of an algorithm?
What is the “average-case” (or “expected”) running time of an algorithm? What do you need to know (or assume) before you can determine this?
Why (and when) might this be more useful than the “worst-case” running time?
What should you do if you wish to approximate the worst-case running time of an algorithm that has a loop? What kind of expression should you expect to see as a result of this?
What should you do if you wish to approximate the worst-case running time of a simple recursive algorithm (in particular, a “divide and conquer” algorithm)? What kind of expression should you expect to see as a result of this?
What are the definitions of O(f), Ω(f), Θ(f), o(f), and ω(f), if f is a function? How are these things related?
Name some functions that you might expect to see when using “Big-Oh” notation to state worst-case running times of algorithms. Make sure to include polynomial functions with several different degrees, logarithms, and exponential functions.
How are the rates of growth of these functions related?
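As an illustration of the kind of expression mentioned in the questions above (a standard textbook example, not one specific to this course's lectures): a divide and conquer algorithm such as mergesort, which splits its input in half and spends linear time combining the results, gives rise to the recurrence

```latex
T(n) = 2\,T(n/2) + \Theta(n), \qquad T(1) = \Theta(1),
```

whose solution is T(n) ∈ Θ(n log n).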
http://www.cpsc.ucalgary.ca/~jacobs/cpsc331/W07/topics/analysis_of_algorithms.html