DAA Answer Key
B.TECH DEGREE EXAMINATION, MAY 2014
CS010 601 DESIGN AND ANALYSIS OF ALGORITHMS
ANSWER KEY

PART A

1. A recursive algorithm is an algorithm which calls itself with smaller (or simpler) input values, and which obtains the result for the current input by applying simple operations to the returned value for the smaller (or simpler) input.

    int factorial(int n) {
        if (n == 0)
            return 1;
        else
            return n * factorial(n - 1);
    }

2. Control abstraction means a procedure whose flow of control is clear but whose primary operations are specified by other procedures whose precise meanings are left undefined.

3. Monte Carlo method.

4. A minimum spanning tree is a least-cost subset of the edges of a graph that connects all the nodes.
   - Start by picking any node and adding it to the tree.
   - Repeatedly pick any least-cost edge from a node in the tree to a node not in the tree, and add the edge and the new node to the tree.
   - Stop when all nodes have been added to the tree.

5. A topological ordering is an ordering of the vertices in a directed acyclic graph such that, if there is a path from u to v, then v appears after u in the ordering.
   1. Compute the indegrees of all vertices.
   2. Find a vertex U with indegree 0 and print it (store it in the ordering). If there is no such vertex, then there is a cycle and the vertices cannot be ordered; stop.
   3. Remove U and all its edges (U, V) from the graph.
   4. Update the indegrees of the remaining vertices.
   5. Repeat steps 2 through 4 while there are vertices to be processed.

PART B

6. Space complexity: the space complexity of an algorithm is the amount of memory needed by the program for its completion. The space requirement S(P) of any algorithm P may therefore be written as S(P) = c + Sp, where c is a constant and Sp is the instance-dependent part.
   Time complexity: the time complexity of an algorithm is the amount of time needed by the program for its completion. It is estimated by a step count (steps per execution), or by building a step-count table.

7. Finding the maximum and minimum.
To find the minimum and maximum items in a list of numbers:

    Algorithm StraightMaxMin(a, n, max, min)
    // Set max to the maximum and min to the minimum of a[1:n].
    {
        max := min := a[1];
        for i := 2 to n do
        {
            if (a[i] > max) then max := a[i];
            else if (a[i] < min) then min := a[i];
        }
    }

   The best case occurs when the elements are in increasing order; the number of element comparisons is (n-1). The worst case occurs when the elements are in decreasing order; the number of element comparisons is 2(n-1). The average number of comparisons is [(n-1) + (2n-2)]/2 = 3(n-1)/2.

8. Divide-and-conquer algorithms split a problem into separate subproblems, solve the subproblems, and combine the results for a solution to the original problem.
   - Examples: Quicksort, Mergesort, Binary search.
   - Divide-and-conquer algorithms can be thought of as top-down algorithms.
   - In divide and conquer, the subproblems are independent.
   - Divide-and-conquer solutions are simple compared to dynamic programming.
   - Divide and conquer can be used for any kind of problem.
   - Only one decision sequence is ever generated.

   Dynamic programming splits a problem into subproblems, some of which are common, solves the subproblems, and combines the results for a solution to the original problem.
   - Examples: Matrix Chain Multiplication, Longest Common Subsequence.
   - Dynamic programming can be thought of as bottom-up.
   - In dynamic programming, the subproblems are not independent.
   - Dynamic programming solutions can often be quite complex and tricky.
   - Dynamic programming is generally used for optimization problems.
   - Many decision sequences may be generated.

9.

10. Deterministic algorithms: algorithms with the property that the result of every operation is uniquely defined are termed deterministic. Such algorithms agree with the way programs are executed on a computer.

   Nondeterministic algorithms:
   - In a theoretical framework we can allow algorithms to contain operations whose outcome is not uniquely defined but is limited to a specified set of possibilities.
   - The machine executing such operations is allowed to choose any one of these outcomes, subject to a termination condition.
   - This leads to the concept of nondeterministic algorithms.

PART C

2. ASYMPTOTIC NOTATIONS

   The asymptotic efficiency of an algorithm is concerned with how the running time of the algorithm increases with the size of the input in the limit. The notations used to describe the asymptotic efficiency of an algorithm are called asymptotic notations. Asymptotic complexity is a way of expressing the main component of the cost of an algorithm, using idealized units of computational work. Note that we are speaking about bounds on the performance of algorithms, rather than giving exact speeds. The different asymptotic notations are:
   - Big Oh
   - Omega
   - Theta
   - Little oh
   - Little omega

BIG-OH NOTATION (O)
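The Big-Oh bound named above can be stated formally. The following definition is standard; the worked instance (the constants c = 4 and n0 = 5) is our own illustration:

```latex
% Definition: f(n) = O(g(n)) iff there exist constants c > 0 and n_0 >= 1
% such that f(n) <= c * g(n) for all n >= n_0.
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 1 \ \text{such that}\ f(n) \le c\, g(n)\ \text{for all } n \ge n_0

% Worked instance: 3n^2 + 5n = O(n^2), taking c = 4 and n_0 = 5,
% since 4n^2 - (3n^2 + 5n) = n(n - 5) >= 0 whenever n >= 5.
3n^2 + 5n \le 4n^2 \quad \text{for all } n \ge 5, \qquad \text{so}\quad 3n^2 + 5n = O(n^2)
```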