Commit 9d413d7f by Michael Lundquist

### Almost done with ch 4!

parent 9dba162a
#### Counting Primitive Operations

- Algorithms are assessed by how many __primitive operations__ they perform.
- A primitive operation is a low-level instruction with a constant execution time.
- Some examples of primitive operations:
  - Assigning a value to a variable
  - Following an object reference
  - Performing an arithmetic operation
  - Comparing two values
  - Accessing an array element
  - Calling a method
  - Returning from a method
- The number of primitive operations an algorithm performs, represented with *t*, is used to measure its running time.
- Counting primitive operations is a good measure of time complexity across machines, assuming the primitive operations all take roughly the same amount of time relative to each other on the different machines.

#### Measuring Operations as a Function of Input Size

- A function \$f(n)\$ measures the number of primitive operations an algorithm performs on an input of size \$n\$.
- 4.2 covers the 7 common forms \$f(n)\$ takes.
- 4.3 introduces a mathematical framework for comparing these functions.

#### Focusing on the Worst-Case Input

- Focusing on an algorithm's worst-case input (the input of size \$n\$ that maximizes the running time) is more conservative and generally easier than analyzing the average case.
- Although the worst case is easier to identify than the average case, designing algorithms that perform well in the worst case is generally harder and requires better algorithm design.

## 4.2 The Seven Functions Used in this Book

- The book references an [appendix of useful math facts](./umf.pdf) that I downloaded from [the author's website](http://bcs.wiley.com/he-bcs/Books?action=index&itemId=1118771338&bcsId=8635). It contains functions useful for algorithm analysis.
- This section discusses 7 common forms of \$f(n)\$.

#### The Constant Function

- In a __constant function__, \$f(n) = c\$. The constant \$c\$ is independent of \$n\$.
#### The Logarithm Function

- In a __logarithm function__, \$f(n) = {\log_b n}\$
  - \$b\$ is known as the __base__ of the log.
  - In computer science, usually \$b = 2\$ because we use binary so much. Assume \$b = 2\$ when \$b\$ is omitted.
- The log function is the inverse of exponentiation with the same base.
  - Formally, \$x = {\log_b n}\$ if and only if \$b^x = n\$
- Some properties of the log function, assuming \$a > 0\$, \$b > 1\$, \$c > 0\$:
  - \${\log_b 1} = 0\$
    - This actually holds for any \$b > 0\$ except \$b = 1\$, where the log is undefined
  - \${\log_b (ac)} = {\log_b a} + {\log_b c}\$
  - \${\log_b (a/c)} = {\log_b a} - {\log_b c}\$
  - \${\log_b (a^c)} = c \cdot {\log_b a}\$
    - By convention, \${\log_d a^c} = {\log_d (a^c)}\$, not \$({\log_d a})^c\$
  - \${\log_b a} = {\log_d a} / {\log_d b}\$
  - \$b^{\log_d a} = a^{\log_d b}\$
- \${\log_b n}\$ is usually irrational, but an integer approximation can be computed by repeatedly dividing \$n\$ by \$b\$ until \$n \leq 1\$. Note that because Java's `/` is integer division here, the method below computes \$\lfloor{\log_b n}\rfloor\$ rather than \$\lceil{\log_b n}\rceil\$ (e.g. `log(10, 3)` returns 2, not 3):

```java
public static int log(int n, int b) {
    int x = 0;
    while (n > 1) {
        n /= b;   // integer division, so this yields the floor of log_b(n)
        x++;
    }
    return x;
}
```

#### The Linear Function

- In a __linear function__, \$f(n) = n\$
- Linear time is common when processing data for the first time to map out the topography of the data.

#### The N-Log-N Function

- In an __n-log-n function__, \$f(n) = n \log n\$
- Many sorting algorithms run in \$n \log n\$ time.
- \$n \log n\$ is much faster than quadratic time.

#### The Quadratic Function

- In a __quadratic function__, \$f(n) = n^2\$
- Quadratic running times often come from a nested loop that performs \$n\$ operations for each step of an outer loop that performs \$n\$ steps.
- Often, the number of operations the inner loop performs depends on the outer loop. In these situations, you'll have to do some thinking.
- For example, if each inner loop does 1 more iteration than the last inner loop, the total number of operations is \$n(n+1)/2\$.
  This [arithmetic progression](https://en.wikipedia.org/wiki/Arithmetic_progression) for summing the numbers between 1 and \$n\$ was famously used by Carl Gauss as a young schoolchild.
- \$f(n) = \dfrac{1}{2}n^2 + \dfrac{1}{2}n\$ is quadratic, \$O(n^2)\$, because its degree is \$2\$. This is covered in more detail in section 4.3 and the upcoming discussion of polynomials.

#### The Cubic Function and Other Polynomials

- In a __cubic function__, \$f(n) = n^3\$
- Cubic functions are less common than the other functions, but do occur.

##### Polynomials

- __Polynomial functions__ have the form \$f(n) = a_0 + a_1n + a_2n^2 + a_3n^3 + ... + a_dn^d\$
  - \$a_0, a_1, ..., a_d\$ are __coefficients__
  - \$a_d \neq 0\$
  - \$d\$, the highest power in the polynomial, is its __degree__.
- Although constant, linear, quadratic, and cubic functions are technically all polynomial functions, the special attention paid to them above is warranted because they're so common.

##### Summations

- The __summation notation__ \$\sum\limits_{i=a}^{b} f(i)\$ is a shorthand way of writing \$f(a) + f(a+1) + f(a+2) + ... + f(b)\$
- For the summation to be nonempty, \$a \leq b\$
- Summation notation is used often with data structures. For example:
  - [Polynomials](#polynomials) can be written as \$\sum\limits_{i=0}^{d} a_in^i\$ instead of \$f(n) = a_0 + a_1n + a_2n^2 + a_3n^3 + ... + a_dn^d\$
  - [Gauss's arithmetic progression](#The-Quadratic-Function) can be expressed as \$\sum\limits_{i=1}^{n} i = \dfrac{n(n+1)}{2}\$

#### The Exponential Function

- __Exponential functions__ have the form \$f(n) = b^n\$, where \$b\$ is multiplied by itself \$n\$ times.
  - \$b\$ is called the __base__; it's usually 2.
  - \$n\$ is called the __exponent__.
- Brute-force approaches to problems like shortest paths (e.g. enumerating every path, rather than using [Dijkstra's algorithm](https://en.wikipedia.org/wiki/Dijkstra%27s_algorithm)) can take exponential time.
- Given positive integers \$a\$, \$b\$, and \$c\$, we have the following properties:
  - \$(b^a)^c = b^{ac}\$
  - \$b^a b^c = b^{a+c}\$
  - \$b^a / b^c = b^{a-c}\$
  - Note, the [log function properties](#The-Logarithm-Function) are derived from these rules.
- The \$k\$th __root__ of \$b\$ is a number \$r\$ that equals \$b\$ when raised to the \$k\$th power.
  - For example, the 5th root of \$b\$ is a number \$r\$ such that \$r^5 = b\$.
  - Roots are often expressed as \$b^{1/k}\$ because \$r = b^{1/k}\$
    - *Proof*: using the first property, \$(b^{1/k})^k = b^{k/k} = b\$, so \$r = b^{1/k}\$ satisfies \$r^k = b\$.
  - Roots are also expressed as \$\sqrt[k]{b} = r\$
- Exponents aren't limited to integers: any rational exponent \$a/c\$ is defined by \$b^{a/c} = (b^a)^{1/c} = \sqrt[c]{b^a}\$, and this extends to arbitrary real exponents \$x\$ in \$b^x\$.

##### Geometric Sums

- __Geometric sums__ are of the form \$1 + a + a^2 + ... + a^n\$, in other words, \$\sum\limits_{i=0}^{n}a^i\$.
- For \$a \neq 1\$, the formula for a geometric sum is \$\dfrac{a^{n+1}-1}{a-1}\$
- For binary numbers, where \$a = 2\$: \$\sum\limits_{i=0}^{n}2^i = 2^{n+1}-1\$, the largest integer representable in \$n+1\$ bits.
- Geometric sums occur when each loop iteration takes a factor of \$a\$ longer than the previous one.

### 4.2.1 Comparing Growth Rates
```java
public class Examples {
    public static void main(String[] args) {
        System.out.println("log 9, base 3 is " + log(9, 3));
    }

    /**
     * Example from section 4.2, The Logarithm Function.
     */
    public static int log(int n, int b) {
        int x = 0;
        while (n > 1) {
            n /= b;
            x++;
        }
        return x;
    }
}
```