Design and Analysis of Algorithms Final
Name: _______________________________________
NYU NetID: __________________________________
Multiple Choice
- Employing the master theorem, the solution to the recurrence T(n) = 2T(n/4) + n^0.51 is
- Θ(n^2)
- Θ(n^2 log n)
- Θ(n^0.51) *
- the master theorem can't be applied here.
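One way to verify the marked answer, sketched with the CLRS form of the theorem: here a = 2 and b = 4, so n^{\log_b a} = n^{\log_4 2} = n^{0.5}. Since f(n) = n^{0.51} = \Omega(n^{0.5 + \epsilon}) with \epsilon = 0.01, and the regularity condition a f(n/b) = 2(n/4)^{0.51} = 2 \cdot 4^{-0.51} n^{0.51} \approx 0.99 f(n) holds with c \approx 0.99 < 1, case 3 gives T(n) = \Theta(n^{0.51}).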
- When a problem has optimal substructure and only the locally optimal choice matters, we should probably use:
- a straightforward recursive solution.
- dynamic programming.
- a memoized recursive algorithm.
- a greedy algorithm. *
- If we have a binary search tree of height h, then
all searching functions such as min, max, and successor
will run in
- O(h^2) time.
- O(ln h) time.
- O(h) time. *
- O(lg h) time.
- A depth-first search of this tree:
Will cover the nodes in what order?
- A-C-B-H-E-F-I-K-G-J-D
- A-B-H-C-F-I-K-D-E-G-J
- A-B-C-D-E-F-G-H-I-J-K
- A-B-C-H-F-I-K-D-E-G-J
- According to the binary-search-tree property,
- for all nodes y higher in the tree than x, x.key ≤ y.key.
- for all nodes y in the left subtree of x, y.key ≤ x.key. *
- for all nodes y lower in the tree than x, x.key ≤ y.key.
- all nodes in the right subtree of x will be the successors of x.
- Employing the master theorem,
the solution to the recurrence T(n) = 0.25T(n/2) + 8n is
- Θ(n^-2)
- Θ(n^-2 log n)
- Θ(8n)
- the master theorem can't be applied here. *
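A brief sketch of why the theorem does not apply: the CLRS statement covers T(n) = aT(n/b) + f(n) only for constants a \ge 1 and b > 1, and here a = 0.25 < 1, so its hypotheses are not met.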
- For large n, an algorithm will run the slowest if its time complexity is:
- O(n^3)
- O((1.02)^n) *
- O(n^5)
- O(500n log n)
- Matroids are essentially a theory of
- independence. *
- connectivity.
- resilience.
- linearity.
- Consider the following alphabet with frequencies:
Symbol | Frequency
A | 24
B | 12
C | 10
D | 8
E | 8
Which of the following is a possible Huffman coding for the
string "ABBCDAED"?
- 11111101011100111110
- 11001001011100111110
- 01001001011100111110 *
- 11001001011100111110
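A quick way to check the marked choice: every optimal code for these frequencies assigns A a 1-bit codeword and B, C, D, and E 3-bit codewords, so any valid encoding of "ABBCDAED" is 20 bits long; the marked string appears to correspond to the assignment A = 0, B = 100, C = 101, D = 110, E = 111. The sketch below builds one such optimal code with heapq; tie-breaking may produce a different but equally optimal assignment, so the exact bits can differ.

    import heapq

    # Frequencies from the table in the question.
    freqs = {'A': 24, 'B': 12, 'C': 10, 'D': 8, 'E': 8}

    # Heap entries are (frequency, tie-breaker, tree); a tree is either a
    # symbol (leaf) or a (left, right) pair (internal node).
    heap = [(f, i, s) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (t1, t2)))
        counter += 1

    def codes(tree, prefix=""):
        """Walk the tree, assigning 0 to left branches and 1 to right ones."""
        if isinstance(tree, str):
            return {tree: prefix or "0"}
        left, right = tree
        table = codes(left, prefix + "0")
        table.update(codes(right, prefix + "1"))
        return table

    table = codes(heap[0][2])
    print(table)                                  # one optimal code
    print("".join(table[s] for s in "ABBCDAED"))  # 20 bits in any optimal code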
- When a problem has optimal substructure and overlapping sub-problems, we should probably use:
- a straightforward recursive solution.
- dynamic programming. *
- a merge sort.
- a greedy algorithm.
- Employing the master theorem,
the solution to the recurrence T(n) = 4T(n/2) + cn is
- Θ(n^2) *
- Θ(n^2 log n)
- Θ(cn)
- the master theorem can't be applied here.
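Sketch of the calculation behind the marked answer: a = 4, b = 2, so n^{\log_b a} = n^{\log_2 4} = n^2. Since f(n) = cn = O(n^{2 - \epsilon}) with \epsilon = 1, case 1 applies and T(n) = \Theta(n^2).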
- We have one algorithm for processing customer records with a run time of O(n), and another with a run time of O(lg n) + 2500.
In what circumstances might we want
to choose the O(n) algorithm?
- The algorithm will be distributed across a network.
- We believe our program will
always be dealing with a
number of records less than 2500. *
- We are running on a slow CPU.
- If n is very large.
- We can sometimes be loose with our analysis of
divide-and-conquer algorithms,
in that we might omit details like
- recursions and function calls.
- exponential factors.
- floors, ceilings, and boundary conditions. *
- any n raised to a power less than 3.
- Under what circumstances might we
most reasonably expect to
receive worst-case input to our binary search tree?
- We insert data into the tree as it comes along to us.
- The data has been randomly shuffled.
- A user types in the data from a sorted list. *
- All of the above.
- When we have overlapping sub-problems and we want to solve
them from the bottom up, we should employ:
- dynamic programming *
- a recursive algorithm
- a recursive algorithm with memoization
- a greedy algorithm
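As a small illustration of the bottom-up pattern (Fibonacci is used here purely as a familiar example of overlapping sub-problems, not as part of the question):

    def fib_bottom_up(n):
        """Bottom-up dynamic programming: solve the smallest sub-problems
        first and reuse their results instead of recomputing them."""
        if n < 2:
            return n
        prev, curr = 0, 1
        for _ in range(2, n + 1):
            prev, curr = curr, prev + curr
        return curr

    print(fib_bottom_up(10))  # 55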
- When we randomize an algorithm, we then speak of its
- necessary running time.
- best-case running time.
- expected running time. *
- unbounded running time.
- For large n, an algorithm will run the fastest if its time complexity is:
- O(n!)
- O(n^3) *
- O(n^4)
- O((1.02)^n)
- Strongly connected components of a graph are
- components of a directed graph that can
each be reached from each other. *
- components in a weighted graph where the
connections have the highest weights.
- components of a spanning tree.
- any vertices with edges between them.
- Which of these functions grows most slowly?
- lg n
- lg* n *
- ln n
- log_10 n
- For a greedy algorithm to work, the optimal choice
- must depend on many overlapping sub-problems.
- must not depend on any future choices. *
- must be dependent on a global optimal.
- must only be available after considering
all sub-problems.
- Ω-notation applied to a function implies
- it is a function we know little about.
- it is asymptotically bound from above and below.
- only that it is asymptotically bound from below. *
- only that it is asymptotically bound from above.
- In algorithm analysis, we usually analyze algorithms
in terms of
- actual running time in nanoseconds.
- the number of disk operations.
- the number of basic operations. *
- CPU time used.
- In proving that a problem has the greedy-choice property,
we often use a proof technique called:
- cut-and-paste. *
- the master theorem.
- substitution.
- recursion-tree analysis.
- The chief advantage of a doubly-linked list over a singly-linked list is
- inserts are faster.
- there are fewer pointers to maintain.
- we can back up as well as go forward in the list. *
- less memory is used.
- Often, a good alternative to dynamic programming is
- a simple recursive solution.
- a recursive solution with memoization. *
- a brute-force solution.
- all of the above.
- A random variable is actually
- the mean of all possible values.
- an algorithm.
- a precise real number value.
- a function. *
- In the substitution method for solving recurrences,
we
- substitute a different recurrence for
the one characterizing our algorithm.
- guess a solution and then
use induction to prove it.
- substitute a polynomial factor for
an exponential one.
- substitute an n^2 wherever we see a 2^n.
- Consider the following recursion tree.
What closed form best characterizes this tree?
- O(n/8)
- O(n log n)
- O(n^2)
- O(log n)
- The worst case of hashing with chaining occurs when
- the input comes in sorted already.
- all inputs hash to different values.
- the input is purely random.
- all inputs hash to the same value. *
- Consider this recurrence:
What is the solution to this recurrence?
- T(n) = Θ(n^3)
- T(n) = Θ(n^2)
- T(n) = Θ(n^8)
- T(n) = Θ(n^(8/3))
- If you have available a simple algorithm with some
acceptable run time, and a more complex algorithm
with slightly faster run time, you should
- always choose the faster algorithm.
- always choose the simpler algorithm.
- consider how big your input is likely
to be before choosing either algorithm.
- ask your boss what to do.
- In graph theory, a tree is
- a forest with no edges.
- a cyclical graph with no forests.
- any walk on a graph.
- a connected graph with no cycles.
- Graph theory is useful for
- task management.
- map coloring.
- cellular telephone networks.
- all of the above.
- A problem with linear probing to resolve hash collisions is
- the code is extremely complex.
- clusters of keys will build up in a linked list.
- the hash table clusters around its central value.
- once inputs start to cluster in one area of the
hash table, they will become more likely
to do so. *
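A minimal sketch of the clustering effect, using a toy hash function h(k) = k mod m (the function and the keys are illustrative only):

    def insert_linear_probe(table, key):
        """Open addressing with linear probing: start at the key's hash slot
        and scan forward (wrapping around) until an empty slot is found.
        Assumes the table still has at least one empty slot."""
        m = len(table)
        i = key % m
        while table[i] is not None:
            i = (i + 1) % m   # any key landing in a cluster extends that cluster
        table[i] = key

    table = [None] * 8
    for k in (0, 8, 16, 1, 9):     # these keys all hash to slots 0 or 1
        insert_linear_probe(table, k)
    print(table)   # [0, 8, 16, 1, 9, None, None, None]: one growing cluster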
- Consider this graph:
Which of the following would be an adjacency list
representation of this graph?
- 1 (2, 3, 4) 2 (1, 4) 3 (1, 4) 4 (2, 3, 5) 5 (4, 6)
6 (4, 5)
- 1 (2, 3) 2 (1, 4) 3 (1, 4) 4 (2, 5, 6) 5 (4, 6)
6 (4, 5)
- 1 (2, 3) 2 (1, 4) 3 (1, 4) 4 (2, 3, 5, 6) 5 (4, 6)
6 (4, 5)
- 1 (2, 3) 2 (1, 4) 3 (1, 4) 4 (1, 3, 5, 6) 5 (4, 6)
6 (4, 5)
- Taking the graph from the previous question,
which of the following would be an adjacency matrix
representation of this graph?
-
0 | 1 | 1 | 0 | 0 | 0 |
1 | 0 | 0 | 1 | 0 | 0 |
1 | 0 | 0 | 1 | 0 | 0 |
0 | 1 | 1 | 0 | 1 | 1 |
0 | 0 | 0 | 1 | 0 | 1 |
0 | 0 | 0 | 1 | 1 | 0 |
-
0 | 1 | 0 | 0 | 0 | 0 |
1 | 0 | 0 | 1 | 0 | 0 |
1 | 0 | 0 | 1 | 0 | 0 |
0 | 1 | 1 | 1 | 1 | 1 |
0 | 0 | 0 | 1 | 0 | 1 |
0 | 0 | 0 | 1 | 1 | 0 |
-
0 | 1 | 1 | 0 | 0 | 0 |
1 | 0 | 0 | 1 | 0 | 0 |
1 | 0 | 0 | 1 | 0 | 1 |
0 | 1 | 1 | 0 | 1 | 1 |
0 | 0 | 1 | 1 | 0 | 1 |
0 | 0 | 0 | 1 | 1 | 0 |
-
0 | 1 | 1 | 0 | 0 | 0 |
1 | 0 | 0 | 1 | 0 | 0 |
0 | 0 | 0 | 1 | 0 | 0 |
0 | 1 | 1 | 0 | 1 | 1 |
0 | 0 | 0 | 1 | 0 | 1 |
1 | 0 | 0 | 1 | 1 | 0 |
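Since the figure is not reproduced here, the sketch below assumes the graph given by the internally consistent adjacency list from the previous question (1: 2,3; 2: 1,4; 3: 1,4; 4: 2,3,5,6; 5: 4,6; 6: 4,5) and converts it to an adjacency matrix; under that assumption it reproduces the first matrix option above.

    # Assumed adjacency list (the original figure is not shown here).
    adj_list = {1: [2, 3], 2: [1, 4], 3: [1, 4],
                4: [2, 3, 5, 6], 5: [4, 6], 6: [4, 5]}

    n = len(adj_list)
    matrix = [[0] * n for _ in range(n)]
    for u, neighbors in adj_list.items():
        for v in neighbors:
            matrix[u - 1][v - 1] = 1   # vertices are 1-indexed in the list

    for row in matrix:
        print(row)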
- Besides running time, we can also measure algorithm performance
by:
- amount of disk-memory communication
- number of processors used
- amount of network bandwidth used
- all of the above *
- An advantage of the adjacency matrix representation
over the adjacency list representation is
- it is smaller.
- it allows quicker searches for determining whether some
edge exists. *
- it can represent weighted graphs.
- it can represent directed graphs.
- Which of the following matrices represents this directed
graph:
-
0 | 0 | 1 | 0 |
0 | 1 | 1 | 0 |
0 | 1 | 1 | 0 |
0 | 0 | 1 | 0 |
-
0 | 1 | 1 | 0 |
0 | 1 | 1 | 0 |
0 | 1 | 1 | 0 |
0 | 1 | 1 | 0 |
-
0 | 0 | 1 | 0 |
1 | 0 | 0 | 0 |
0 | 1 | 1 | 0 |
1 | 0 | 0 | 1 |
-
0 | 0 | 1 | 0 |
1 | 0 | 0 | 0 |
0 | 1 | 0 | 1 |
1 | 0 | 0 | 0 |
- A breadth-first search of this graph:
Will cover the nodes in what order?
- A-B-C-D-E-F-G-H-I-J-K
- A-C-B-H-E-F-I-K-G-J-D
- A-B-H-C-F-I-K-D-E-G-J *
- A-B-C-H-F-I-K-D-E-G-J
- Θ-notation applied to a function implies
it is
- a function we know little about.
- asymptotically bound from above and below. *
- asymptotically bound from below only.
- asymptotically bound from above only.
- Using the master theorem, the solution to the recurrence
T(n) = 3T(n/3) + n/2 is:
- Θ(n/3)
- Θ(n log n) *
- Θ(n)
- the master theorem can't be applied here.
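Sketch of the calculation behind the marked answer: a = 3, b = 3, so n^{\log_b a} = n^{\log_3 3} = n. Since f(n) = n/2 = \Theta(n), case 2 applies and T(n) = \Theta(n \log n).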
- Dynamic programming is preferred to simple recursive
solutions when
- the recursive solution is hard to grasp.
- all of the sub-problems are completely
independent of each other.
- the recursion goes beyond 4 levels.
- the same sub-problems must be solved multiple
times. *
- Direct dictionaries are only practical when
- the number of keys is small. *
- we don't care how slow lookups will be.
- we aren't worried about the large number of collisions
hashing will produce.
- linear probing is appropriate.
- A greedy algorithm is appropriate when
- we need to consider the global situation
when making any choice.
- there is extensive overlap among sub-problems.
- only the locally best choice matters. *
- greed is the goal of the algorithm choice.
- We can use a greedy algorithm to solve
- a rod-cutting problem.
- a minimum-spanning-tree problem.
- a matrix-parenthesization problem.
- a shortest-path problem.
- If f(n) = ω(g(n)), that means
- g dominates f asymptotically
- f is bounded below by g asymptotically
- f is bounded above by g asymptotically
- f dominates g asymptotically *
- Using the master theorem, the solution to the recurrence T(n) = 64T(n/8) - n^2 log n is:
- Θ(n^2)
- Θ(n^4 log n)
- Θ(n^2 log n)
- the master theorem can't be applied here.
- The worst-case running time of a search on a binary search
tree is
- Θ(n^2)
- Θ(n) *
- Θ(lg n)
- Θ(1)
- A topological sort on a tree makes use of
- breadth-first search.
- finding the minimum spanning tree.
- depth-first search.
- finding the shortest path.
Problems
- Let us say we have an algorithm that runs in 10n^2 + 10 time. We want to show its order of complexity is Θ(n^2). Give an example of a set of k_1, k_2, and n_0 that would do this.
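One possible choice (a sketch; many other constants work): take k_1 = 10, k_2 = 20, and n_0 = 1. For all n \ge 1 we have k_1 n^2 = 10n^2 \le 10n^2 + 10, and 10n^2 + 10 \le 20n^2 = k_2 n^2 because 10 \le 10n^2 once n \ge 1; hence 10n^2 + 10 = \Theta(n^2).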
- How will a depth-first search process the following tree?
Write a number above each vertex indicating the order in which
it will be visited:
- Use an indicator random variable to show how many numbers we have to pick (with replacement) from a set of n numbers before we are more likely than not to have drawn the same number twice.
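A sketch of the expected-value argument (one way to set this up; picks are assumed independent and uniform): let X_{ij} be the indicator random variable for the event that the i-th and j-th picks are equal, so E[X_{ij}] = 1/n. After k picks, the expected number of coinciding pairs is E[X] = \sum_{i < j} E[X_{ij}] = \binom{k}{2} \cdot \frac{1}{n} = \frac{k(k-1)}{2n}, which reaches 1 once k(k-1) \ge 2n, i.e. at roughly k \approx \sqrt{2n} picks. (The direct probability calculation puts the "more likely than not" threshold near k \approx 1.18\sqrt{n}; both bounds are \Theta(\sqrt{n}).)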
- Describe why the fractional knapsack problem can be solved with a greedy algorithm, but the 0-1 (whole-item) knapsack problem cannot.
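A minimal sketch of the greedy solution for the fractional version (the item values and weights are made up for illustration); the comment at the end points at the standard counterexample showing the same greedy rule fails for the 0-1 version.

    def fractional_knapsack(items, capacity):
        """Greedy: take items in decreasing value-per-weight order, and take
        a fraction of the last item if it does not fit entirely. Optimal for
        the fractional problem, but not for the 0-1 (whole-item) problem."""
        items = sorted(items, key=lambda it: it[0] / it[1], reverse=True)
        total = 0.0
        for value, weight in items:
            if capacity <= 0:
                break
            take = min(weight, capacity)
            total += value * (take / weight)
            capacity -= take
        return total

    # (value, weight) pairs, capacity 50: greedy-by-ratio fills the knapsack
    # fractionally for value 240. In the 0-1 version the same rule takes the
    # 10-kg and then the 20-kg item (value 160), missing the optimal 220.
    print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0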
- Prove that Kruskal's algorithm can correctly find a minimum
spanning tree.
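The proof itself is the cut-and-paste / cut-property argument; as a companion, here is a minimal sketch of the algorithm being proved (the edge weights in the example are made up):

    def kruskal(n, edges):
        """Kruskal's algorithm: scan edges in nondecreasing weight order and
        keep an edge iff it joins two different components (union-find)."""
        parent = list(range(n))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x

        mst = []
        for w, u, v in sorted(edges):
            ru, rv = find(u), find(v)
            if ru != rv:              # u and v are in different trees: safe edge
                parent[ru] = rv
                mst.append((u, v, w))
        return mst

    # Tiny example on vertices 0..3 with made-up (weight, u, v) edges.
    edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 2, 3), (5, 1, 3)]
    print(kruskal(4, edges))   # [(0, 1, 1), (2, 3, 2), (1, 2, 3)]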