Vitali Kremez

BLOG

Algorithms: Design and Analysis: About the Course

10/2/2015


 
About the Course
 
Course Topics
Vocabulary for design and analysis of algorithms
Divide and conquer algorithm design paradigm
Randomization in algorithm design
Primitives for reasoning about graphs
Use and implementation of data structures
 
 
Vocabulary for design and analysis of algorithms
-E.g., “Big-Oh” notation
-“Sweet spot” for high-level reasoning about algorithms
 
Divide and conquer algorithm design paradigm
-Will apply to: Integer multiplication, sorting, matrix multiplication, closest pair
-General analysis methods (“Master Method/Theorem”)
 
Randomization in algorithm design
-Will apply to: QuickSort, primality testing, graph partitioning, hashing
 
Primitives for reasoning about graphs
-Connectivity information, shortest paths, structure of information and social networks
 
Use and implementation of data structures
-Heaps, balanced binary search trees, hashing and some variants (e.g., bloom filters)
 
Topics in Sequel Course
-Greedy algorithm design paradigm
-Dynamic programming algorithm design paradigm
-NP-complete problems and what to do about them
-Fast heuristics with provable guarantees
-Fast exact algorithms for special cases
-Exact algorithms that beat brute-force search
 
Skills You’ll Learn
-Become a better programmer
-Sharpen your mathematical and analytical skills
-Start “thinking algorithmically”
-Literacy with computer science’s “greatest hits”
-Ace your technical interviews
 
Excellent free reference: “Mathematics for Computer Science” by Eric Lehman and Tom Leighton
 
Merge Sort: Motivation and Example
 
*Good introduction to divide & conquer
-Improves over Selection, Insertion, Bubble sorts
*Calibrate your preparation
*Motivates guiding principles for algorithm analysis (worst-case and asymptotic analysis)
*Analysis generalizes to “Master Method”
 
The Sorting Problem
Input: array of n numbers, unsorted
Output: same numbers, sorted in increasing order
 
Merge Sort: Example
Recursive algorithm
 
Input: 5 4 1 8 7 2 6 3

1 – recursively sort 1st half: 5 4 1 8 -> 1 4 5 8
2 – recursively sort 2nd half: 7 2 6 3 -> 2 3 6 7

===MERGE=== halves 1 and 2

Output: 1 2 3 4 5 6 7 8
 
Merge Sort: Pseudocode
 
Recursively sort 1st half of input array, recursively sort 2nd half -> merge the two sorted sublists into one
 
 
Pseudocode for Merge:
C = output array [length = n]
A = 1st sorted array [length = n/2]
B = 2nd sorted array [length = n/2]

i = 1
j = 1

for k = 1 to n
    if A(i) < B(j)
        C(k) = A(i)
        i++
    else [B(j) < A(i)]
        C(k) = B(j)
        j++
end

(ignores the end cases: once one array is exhausted, the rest of the other is copied straight into C)
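A minimal Python sketch of the routine above (function names are illustrative); unlike the pseudocode, it also handles the leftover elements once one input array is exhausted:

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list."""
    c = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            c.append(a[i])
            i += 1
        else:
            c.append(b[j])
            j += 1
    # One input is exhausted; copy over the leftovers of the other.
    c.extend(a[i:])
    c.extend(b[j:])
    return c

def merge_sort(arr):
    """Recursively sort each half, then merge the results."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    return merge(merge_sort(arr[:mid]), merge_sort(arr[mid:]))
```

On the example above, `merge_sort([5, 4, 1, 8, 7, 2, 6, 3])` returns `[1, 2, 3, 4, 5, 6, 7, 8]`.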
 
Merge Sort Running Time?
 
Key Question: running time of MergeSort on array of n numbers
Running time ~ # of lines of code executed
 
Running Time of Merge
Upshot: running time of Merge on an array of m numbers is <= 4m + 2 <= 6m (for m >= 1)
 
Claim: Merge Sort requires <= 6n*log2(n) + 6n operations to sort n numbers
 
Recall: 2 * 2 * 2 = 8, i.e., 2**3 = 8
so log2(8) = 3
 
Lesson: Merge Sort: Analysis
 
Proof of claim (assuming  n = power of 2)
[will use “recursion tree”]
 
The recursion tree has log2(n) + 1 levels
(levels j = 0, 1, ..., log2(n))
 
base cases = single-element arrays
 
2**j and n/(2**j) -> number and size of subproblems
At each level j = 0, 1, 2, ..., log2(n), there are 2**j subproblems, each of size n/(2**j)
Total # of operations at level j:
[each j = 0, 1, 2, ..., log2(n)]
<= 2**j * 6*(n/(2**j)) = 6n

6n -> independent of level j
 
Total
 
<= 6n*(log2(n) + 1) = 6n*log2(n) + 6n
 
Running Time of Merge Sort:
 
Claim: For every input array of n numbers, Merge Sort produces a sorted output array and uses at most 6n*log2(n) + 6n operations
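As a sanity check on the claim, here is a sketch that counts operations the way the lecture does (4m + 2 per merge of m elements, 0 for a single-element base case) and compares the total against 6n*log2(n) + 6n; the counting convention is the lecture's, but the code itself is only an illustration (it delegates the actual merging to `sorted`, since only the count matters here):

```python
import math

def merge_sort_ops(arr):
    """Return (sorted list, op count), charging 4m + 2 per merge of m elements."""
    n = len(arr)
    if n <= 1:
        return arr, 0          # base case: single-element array, no merge
    mid = n // 2
    left, ops_l = merge_sort_ops(arr[:mid])
    right, ops_r = merge_sort_ops(arr[mid:])
    merged = sorted(left + right)  # stand-in for merge; we only count ops
    return merged, ops_l + ops_r + 4 * n + 2

# The counted total stays under 6n*log2(n) + 6n for every power of two tried.
for k in range(1, 11):             # n = 2, 4, ..., 1024
    n = 2 ** k
    _, ops = merge_sort_ops(list(range(n, 0, -1)))
    assert ops <= 6 * n * math.log2(n) + 6 * n
```

For n = 8 the count works out to 110 operations, comfortably below the bound of 6*8*3 + 6*8 = 192.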
 
 
Guiding Principles for Analysis of Algorithms
 
 
Guiding Principle #1:
 
“Worst-case analysis”: our running time bound holds for every input of every length (particularly appropriate for “general-purpose” routines)
 
As opposed to
-“Average-case” analysis (requires domain knowledge)
-Benchmarks (requires domain knowledge)
 
Guiding Principle #2:
 
Won’t pay much attention to constant factors, lower-order terms
 
Justifications:
1 – way easier
2 – constants depend on architecture /compiler /programmer anyways
3 – lose very little predictive power
 
Guiding Principle #3:
 
Asymptotic analysis: focus on running time for large input sizes n.
 
E.g., 6n*log2(n) + 6n (merge sort) “better than” (1/2)*n**2 (insertion sort)
 
Justification: only big problems are interesting!
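To see where the example bounds above actually cross over, one can just compare the two formulas at powers of two (a pure-arithmetic sketch, independent of any implementation):

```python
import math

def merge_bound(n):
    """Merge sort's operation bound from the analysis above."""
    return 6 * n * math.log2(n) + 6 * n

def insertion_bound(n):
    """Insertion sort's (1/2)*n**2 count."""
    return n ** 2 / 2

# Find the first power of two where merge sort's bound is smaller.
n = 2
while merge_bound(n) >= insertion_bound(n):
    n *= 2
print(n)  # -> 128
```

For small n the quadratic count is actually smaller, which illustrates why constant factors only stop mattering once inputs get large.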
 
 
What is a “Fast” Algorithm?
 
Adopt these 3 guiding principles:
 
Fast algorithm – worst-case running time grows slowly with input size.
 
“Sweet Spot”: mathematical tractability and predictive power
 
Holy grail: linear running time (or close to it).
 
 
 II. ASYMPTOTIC ANALYSIS
The Gist 
Importance: vocabulary for the design and analysis of algorithms (e.g., “big-oh” notation)
  • “Sweet spot” for high-level reasoning about algorithms
  • Coarse enough to suppress architecture-/language-/compiler-dependent details
  • Sharp enough to make useful comparisons between different algorithms, especially on large inputs (e.g., sorting or integer multiplication)
 
Asymptotic Analysis
 
Idea: suppress constant factors and lower-order terms
 
Lower-order terms become irrelevant for large inputs
 
Constant factors – too system-dependent
 
Example: equate 6n*log2(n) + 6n with just n*log(n)
 
Terminology: running time is O(n*log(n)) [“big-oh” of n*log(n)]
where n = input size (e.g., length of the input array)
 
Example: One Loop
Problem: does array A contain the integer t?
Given A (array of length n)
and t (an integer)
for i = 1 to n
    if A[i] == t return True
return False
 
Question: What is the running time?
  1. O(1)
  2. O(logn)
  3. O(n). Correct: linear in the input size n
  4. O(n**2)
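The loop above as a minimal Python sketch (the name `contains` is illustrative); one pass over the array, so O(n):

```python
def contains(a, t):
    """Linear scan for t in a: O(n) in the length of a."""
    for x in a:
        if x == t:
            return True
    return False
```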
 
Example: Two Loops
Given A, B (arrays of length n)
and t (an integer)

for i = 1 to n
    if A[i] == t return True
for i = 1 to n
    if B[i] == t return True
return False
 
Question: Running Time?
  1. O(1)
  2. O(logn)
  3. O(n). Correct
  4. O(n**2)
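A quick sketch of the two-loop version (name illustrative); two sequential scans are O(n) + O(n), which is still O(n):

```python
def contains_either(a, b, t):
    """Scan a, then scan b: O(n) + O(n) = O(n)."""
    for x in a:
        if x == t:
            return True
    for x in b:
        if x == t:
            return True
    return False
```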
 
Example: Two Nested Loops
Problem: do arrays A,B have a number in common?
 
Given arrays A, B of length n
for i = 1 to n
    for j = 1 to n
        if A[i] == B[j] return True
return False
Question: Running Time?
  1. O(1)
  2. O(logn)
  3. O(n)
  4. O(n**2). Correct. Quadratic time algorithm
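The nested loops as a Python sketch (name illustrative); every pair is compared, hence n * n = O(n**2):

```python
def have_common(a, b):
    """Compare every pair (x, y): O(n**2)."""
    for x in a:
        for y in b:
            if x == y:
                return True
    return False
```

(With a hash set the same problem drops to expected linear time, which is one reason hashing appears in the course topics.)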
 
Example: Two Nested Loops (II)
Problem: does array A have duplicate entries?
Given array A of length n
for i = 1 to n
    for j = i + 1 to n
        if A[i] == A[j] return True
return False
 
Question: Running Time?
O(n**2). Correct. Quadratic time algorithm
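The duplicate check as a Python sketch (name illustrative); starting j at i + 1 examines each pair only once, roughly halving the comparisons, but the count is still proportional to n**2:

```python
def has_duplicates(a):
    """Check each pair once (j > i): still O(n**2)."""
    n = len(a)
    for i in range(n):
        for j in range(i + 1, n):
            if a[i] == a[j]:
                return True
    return False
```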