These assignments aim to prepare you for the lectures and exercises on algorithm analysis. We will cover this material in the lectures too, but it is important that you come prepared. Use the web to find the answers!

## 1. Models of computation

To analyse algorithms, one has to decide which aspects one is interested in and what the real costs are. An important tool for such analysis is a *model of computation*. One important model is the *Random Access Machine* (abbreviated RAM; an unfortunate abbreviation collision with the memory type, but a completely different concept).

- Describe the RAM model. What assumptions are made and how well does it match an actual modern computer?
- Name and describe two other important models of computation.

## 2. Complexity analysis

When a model is chosen, one can start estimating the number of steps needed, or how much memory used, by an algorithm on a particular instance of the problem (i.e., given some data). Usually, the objective is to understand an algorithm, not to know exactly how many steps are needed for one particular problem instance; hence, one parameterises on input data size.

As an example, consider the problem of computing the sum of two *n*-digit numbers by hand (like “19+91”), counting the number of elementary single-digit additions you make. For *n*=2, one can easily find that 3 additions may be required (2 for the single digits, and 1 for the carry), but it is stronger to say that, in general, at most 2*n*-1 single-digit additions are needed for *n*-digit addition.
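One way to make this counting concrete (a sketch, not part of the assignment; the function name and digit-string representation are my own choices) is to instrument schoolbook addition so it reports how many single-digit additions it performs:

```python
def add_with_count(a: str, b: str) -> tuple[str, int]:
    """Schoolbook addition of two equal-length digit strings,
    counting every elementary single-digit addition."""
    n = len(a)
    assert len(b) == n
    count = 0
    carry = 0
    digits = []
    for i in range(n - 1, -1, -1):
        s = int(a[i]) + int(b[i])   # one single-digit addition
        count += 1
        if carry:
            s += carry              # one more addition for the carry
            count += 1
        carry, digit = divmod(s, 10)
        digits.append(str(digit))
    if carry:
        digits.append(str(carry))
    return "".join(reversed(digits)), count

print(add_with_count("19", "91"))  # ('110', 3)
```

In the worst case every column produces a carry, giving *n* digit additions plus *n*-1 carry additions, i.e. the 2*n*-1 bound above.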

The addition problem is easy to analyse, but many algorithms require more care and have computational paths that are sensitive to the input. A precise claim like “2*n*-1” is then very hard or even impossible to derive. Instead, *asymptotic analysis* is used, in which an upper bound, described by a function, is sought. An algorithm's *time complexity* is the asymptotic estimate of how many steps are needed. Similarly, its *space complexity* is the amount of memory, asymptotically as a function of the input size, the algorithm would need.

Asymptotic analysis is, indeed, sometimes quite complicated, and it is not always possible to derive details about the bounding function (such as the constants in the time complexity for addition). The work-around, which has become standard in algorithm analysis, is to express the asymptotic complexity using “Big-Oh notation”. For example, the asymptotic time complexity for addition can be written *O(n)*, and we say that “2*n*-1 is in *O(n)*”.
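To preview what the mathematician's definition asks for (a sketch; the constants chosen here are just one valid pair, not the only one): a function *g(n)* is in *O(f(n))* if there are constants *c* and *n₀* with *g(n)* ≤ *c·f(n)* for all *n* ≥ *n₀*. For the addition bound 2*n*-1 this can be sanity-checked empirically:

```python
# Claim: 2n - 1 is in O(n).
# By definition we need constants c and n0 with 2n - 1 <= c*n for all n >= n0.
# Here c = 2 and n0 = 1 work, since 2n - 1 <= 2n for every positive n.
c, n0 = 2, 1
assert all(2 * n - 1 <= c * n for n in range(n0, 10_000))
print("2n - 1 <= {}*n holds for all checked n >= {}".format(c, n0))
```

Such a loop is of course not a proof, only a check over a finite range; the inequality itself is what a proof would establish.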

- What does the Big-Oh notation mean? You should both have an intuitive understanding of what *O(f(n))* means and know what it means to a mathematician.
- Which of the following statements are true?
  - 2*n* + 17 is in *O(n)*.
  - 10*n* is in *O(n^2)*.
  - *n*^2 + log *n* is in *O(n^2)*.
- Why do we speak of “time complexity” but report the (asymptotic) number of steps of an algorithm?
- Consider the problem of multiplying two *n*-digit integers. What is the time complexity of this problem, if we take single-digit multiplication and addition as the elementary operations?
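As a starting point for the last question (a simplified sketch, not a full answer: carry handling is avoided by accumulating partial products into a Python integer, and each accumulation is charged as one addition), schoolbook multiplication can be instrumented in the same style as the addition example:

```python
def schoolbook_multiply(a: str, b: str) -> tuple[int, int]:
    """Schoolbook multiplication of two digit strings, counting
    elementary single-digit multiplications and additions.
    Returns (product, operation_count)."""
    count = 0
    total = 0
    for i, da in enumerate(reversed(a)):
        for j, db in enumerate(reversed(b)):
            count += 1  # one single-digit multiplication
            total += int(da) * int(db) * 10 ** (i + j)
            count += 1  # charge one addition for accumulating the partial product
    return total, count

# Every digit of one factor meets every digit of the other,
# so for two n-digit numbers the count grows as 2*n^2.
for n in (1, 2, 4, 8):
    a = "9" * n
    _, ops = schoolbook_multiply(a, a)
    print(n, ops)  # ops == 2 * n**2
```

The doubly nested loop is the essential observation for the complexity question; how the carries are charged only changes the constant factor.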