Greedy Techniques
- In the Fractional Knapsack problem with capacity 50 and items {(value,weight): (60,10),(100,20),(120,30)}, the greedy strategy by value/weight picks fractionally and yields optimal value =
A) 220
B) 240
C) 180
D) 200
Answer: B
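The computation can be verified with a short sketch (function name and structure are illustrative, not from the source):

```python
# Fractional knapsack: sort by value/weight ratio, take items greedily,
# splitting the last item if it does not fit entirely.
def fractional_knapsack(items, capacity):
    # items: list of (value, weight) pairs
    total = 0.0
    for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        take = min(weight, capacity)          # whole item, or the remaining fraction
        total += value * (take / weight)
        capacity -= take
        if capacity == 0:
            break
    return total

print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```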
Explanation: Ratios are 6, 5, 4. Take all of item 1 (60) and item 2 (100), then 20/30 of item 3 (120 × 20/30 = 80) → 60 + 100 + 80 = 240.
- For Activity Selection, with activities sorted by finish time, picking the earliest-finishing activity each time is correct because:
A) It maximizes start times
B) Of greedy-choice property and optimal substructure
C) It minimizes total durations
D) It minimizes number of activities
Answer: B
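A minimal sketch of the earliest-finish rule (the sample activities are illustrative):

```python
# Activity selection: sort by finish time, then keep every activity
# that starts no earlier than the last chosen finish time.
def select_activities(activities):
    # activities: list of (start, finish) pairs
    chosen, last_finish = [], float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:
            chosen.append((start, finish))
            last_finish = finish
    return chosen

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(select_activities(acts))  # [(1, 4), (5, 7), (8, 11)]
```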
Explanation: Earliest finish leaves maximum room for others — an exchange argument proves optimality.
- Which problem is not always solvable optimally by a simple greedy rule?
A) Fractional Knapsack
B) Huffman Coding
C) 0/1 Knapsack
D) Activity Selection
Answer: C
Explanation: 0/1 Knapsack requires DP; greedy by ratio can fail.
- Huffman coding builds an optimal prefix code by repeatedly:
A) Merging two largest-frequency nodes
B) Merging two smallest-frequency nodes
C) Selecting median frequency nodes
D) Building a balanced binary tree only
Answer: B
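The smallest-two merge can be sketched with a min-heap; the accumulated merge costs equal the total weighted path length of the code tree (frequencies below are illustrative):

```python
import heapq

# Huffman-style greedy: repeatedly pop the two smallest weights,
# merge them, and push the sum back onto the heap.
def huffman_cost(freqs):
    heap = list(freqs)
    heapq.heapify(heap)
    cost = 0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        cost += a + b          # each merge contributes its combined weight
        heapq.heappush(heap, a + b)
    return cost

print(huffman_cost([5, 7, 10, 15, 20, 45]))  # 228
```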
Explanation: Combining the two smallest weights minimizes total weighted path length.
- Kruskal’s algorithm for MST sorts edges by weight and adds an edge if it creates no cycle. Its time complexity using union-find (with m edges, n nodes) is:
A) O(m log m)
B) O(m + n)
C) O(n log n)
D) O(m²)
Answer: A
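A compact sketch of Kruskal with union-find (graph data is illustrative):

```python
# Kruskal's MST: sort edges by weight, add each edge that joins two
# different union-find components. Sorting dominates: O(m log m).
def kruskal(n, edges):
    # edges: list of (weight, u, v); vertices are 0..n-1
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    total = 0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                       # no cycle: safe to add
            parent[ru] = rv
            total += w
    return total

print(kruskal(4, [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]))  # 7
```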
Explanation: Sorting the edges dominates: O(m log m); union-find is nearly linear.
- Prim’s algorithm using a binary heap to build the MST has complexity:
A) O(m log n)
B) O(n²)
C) O(m + n)
D) O(m log m)
Answer: A
Explanation: At most one decrease-key/extract-min per edge → O(m log n).
- Dijkstra’s algorithm is greedy because it:
A) Always relaxes all edges each round
B) Extracts the closest unvisited vertex (greedy choice)
C) Uses divide-and-conquer
D) Sorts vertices by degree
Answer: B
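The greedy choice is visible in a heap-based sketch (adjacency data is illustrative):

```python
import heapq

# Dijkstra: repeatedly extract the unvisited vertex with the smallest
# tentative distance (the greedy choice) and relax its outgoing edges.
def dijkstra(adj, src):
    # adj: {u: [(v, weight), ...]} with nonnegative weights
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                       # stale heap entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

adj = {0: [(1, 4), (2, 1)], 2: [(1, 2)], 1: [(3, 5)]}
print(dijkstra(adj, 0))  # {0: 0, 1: 3, 2: 1, 3: 8}
```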
Explanation: It picks the minimum tentative distance and finalizes it — the greedy choice.
- Which shortest-path algorithm fails with negative-weight edges?
A) Bellman-Ford
B) Dijkstra
C) Johnson’s (with reweight)
D) SPFA
Answer: B
Explanation: Dijkstra assumes nonnegative edges; negative weights can yield wrong results.
- In Job Sequencing with Deadlines and Profits (n unit-time jobs), a greedy algorithm that schedules each job in the latest free slot works because:
A) It sorts by deadline ascending
B) It sorts by profit descending and places each as late as possible
C) It always schedules shortest jobs first
D) It schedules random jobs
Answer: B
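A sketch of profit-descending, latest-slot placement (jobs below are illustrative):

```python
# Job sequencing with deadlines: sort by profit descending, then place
# each job in the latest free unit slot at or before its deadline.
def job_sequencing(jobs):
    # jobs: list of (profit, deadline); each job takes one unit of time
    max_deadline = max(d for _, d in jobs)
    slot = [None] * (max_deadline + 1)     # slot[t] holds the job run at time t
    total = 0
    for profit, deadline in sorted(jobs, reverse=True):
        for t in range(deadline, 0, -1):   # try the latest free slot first
            if slot[t] is None:
                slot[t] = profit
                total += profit
                break
    return total

print(job_sequencing([(100, 2), (19, 1), (27, 2), (25, 1), (15, 3)]))  # 142
```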
Explanation: Sorting by profit ensures maximum profit; placing late leaves earlier slots for other jobs.
- The greedy algorithm for optimal merge patterns (merging files with given lengths) uses:
A) Max-heap of file lengths
B) Min-heap of file lengths; repeatedly merge two smallest
C) Sort descending and merge largest first
D) Pair files arbitrarily
Answer: B
Explanation: Repeatedly merging the two smallest minimizes total merge cost (Huffman-like).
- Which greedy rule gives an optimal solution for scheduling on identical machines to minimize makespan?
A) Longest Processing Time (LPT) first is optimal in general
B) LPT is approximation; exact is NP-hard (partition)
C) Shortest Processing Time (SPT) is optimal for makespan
D) Random assignment is optimal
Answer: B
Explanation: Scheduling on identical machines (P||Cmax) is NP-hard; LPT is a good heuristic but not always optimal.
- Consider the coin system {1,3,4}. For amount 6, greedy with the largest coin gives 4+1+1 = 3 coins. The optimum is 3+3 = 2 coins. So coin-change greedy is:
A) Always optimal for canonical systems only
B) Always optimal for any coin system
C) Never optimal
D) Optimal only for prime coin values
Answer: A
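The counterexample can be checked by pairing greedy against an exact DP (helper names are illustrative):

```python
# Greedy coin change vs. exact unbounded coin-change DP,
# for the non-canonical system {1, 3, 4} and amount 6.
def greedy_coins(coins, amount):
    count = 0
    for c in sorted(coins, reverse=True):  # largest coin first
        count += amount // c
        amount %= c
    return count

def optimal_coins(coins, amount):
    best = [0] + [float("inf")] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a:
                best[a] = min(best[a], best[a - c] + 1)
    return best[amount]

print(greedy_coins([1, 3, 4], 6), optimal_coins([1, 3, 4], 6))  # 3 2
```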
Explanation: Greedy works for canonical systems (e.g., US coins), but counterexamples exist.
- To prove greedy correctness, one commonly uses:
A) Induction and exchange argument
B) Randomization
C) Recurrence relations
D) Dynamic programming
Answer: A
Explanation: The exchange argument shows any optimal solution can be transformed into the greedy solution without worsening it.
- Huffman coding is optimal because it:
A) Minimizes maximum code length only
B) Minimizes total weighted code length via greedy merges
C) Maximizes entropy
D) Uses dynamic programming
Answer: B
Explanation: Huffman’s greedy merge yields minimal weighted path length.
- For interval scheduling to maximize the number of non-overlapping intervals, the greedy rule is:
A) Sort by start time ascending
B) Sort by finish time ascending
C) Sort by duration ascending
D) Sort by finish time descending
Answer: B
Explanation: Earliest finish frees space for subsequent activities, ensuring an optimal count.
- In a matroid, the greedy algorithm yields an optimal solution for:
A) Any weight function if independence system is a matroid
B) Only spanning trees
C) Only Huffman-like problems
D) Only when elements are integers
Answer: A
Explanation: Greedy is optimal for matroids due to the exchange property.
- For the minimum number of coins with the canonical US coin system {1,5,10,25}, greedy always yields the optimum. True or False?
A) True
B) False
C) Only for amounts <100
D) Depends on currency
Answer: A
Explanation: US coin denominations form a canonical system where greedy works.
- In set cover, greedy iteratively picks the set with the most uncovered elements. That greedy achieves approximation ratio:
A) 1 (optimal)
B) O(log n) approximation
C) 2-approximation
D) O(n) approximation
Answer: B
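A small sketch of the greedy set cover rule (universe and sets are illustrative):

```python
# Greedy set cover: repeatedly pick the set covering the most
# still-uncovered elements (H_n ~ ln n approximation).
def greedy_set_cover(universe, sets):
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(sets, key=lambda s: len(uncovered & s))
        if not uncovered & best:
            break                          # remaining elements are uncoverable
        chosen.append(best)
        uncovered -= best
    return chosen

universe = {1, 2, 3, 4, 5}
sets = [{1, 2, 3}, {2, 4}, {3, 4}, {4, 5}]
print(len(greedy_set_cover(universe, sets)))  # 2
```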
Explanation: Greedy yields an H_n ≈ ln n approximation.
- Huffman coding with frequencies {5,7,10,15,20,45} achieves minimal total weighted path length — this is accomplished by:
A) Always merging largest two (wrong)
B) Always merging smallest two (correct)
C) Using DP
D) Using greedy descending frequency
Answer: B
Explanation: Huffman merges the smallest weights at each step.
- The greedy algorithm for minimum-cost connection of ropes of lengths {4,3,2,6} yields merges:
A) 6+4=10, 10+3=13, 13+2=15 → total cost 38 (merge largest first)
B) 2+3=5, 4+5=9, 6+9=15 → total cost 29 (optimal)
C) Merge largest first is optimal
D) There’s no greedy solution
Answer: B
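The min-heap strategy can be checked directly; the intermediate merges for {4,3,2,6} are 2+3=5, then 4+5=9, then 6+9=15:

```python
import heapq

# Minimum cost to connect ropes: always merge the two shortest.
def connect_ropes(lengths):
    heap = list(lengths)
    heapq.heapify(heap)
    cost = 0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        cost += a + b                      # each merge costs the combined length
        heapq.heappush(heap, a + b)
    return cost

print(connect_ropes([4, 3, 2, 6]))  # 5 + 9 + 15 = 29
```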
Explanation: A min-heap merges the smallest ropes — identical to the optimal merge pattern.
- For scheduling unit-time tasks with deadlines, if tasks are sorted by profit and placed in the latest possible slot, the complexity is:
A) O(n²) naive; O(n log n + n α(n)) with DSU optimization
B) O(n!)
C) O(log n)
D) O(n) always
Answer: A
Explanation: Sorting is O(n log n); scheduling with DSU (disjoint set union) makes finds near-constant amortized.
- Consider maximizing the number of completed tasks with release times and deadlines on a single machine — greedy by earliest deadline first (EDF) is:
A) Optimal for feasibility (no preemption) when tasks have equal processing times
B) Always optimal for arbitrary lengths without preemption
C) Never optimal
D) Only works with preemption
Answer: A
Explanation: EDF is optimal under specific conditions; with arbitrary lengths, scheduling is harder.
- A greedy algorithm selecting minimal slack (deadline − remaining time) for scheduling is:
A) Always optimal
B) May fail — not generally optimal
C) Equivalent to EDF
D) Solves knapsack
Answer: B
Explanation: Slack-based greedy can be suboptimal in many instances.
- For the minimum number of platforms problem (given arrivals/departures), greedy sorting by time and counting overlapping intervals gives complexity:
A) O(n log n) due to sorting
B) O(n²) always
C) O(n) always
D) O(log n)
Answer: A
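The sort-then-sweep idea can be sketched as follows (times are illustrative):

```python
# Minimum platforms: sort arrival and departure times separately,
# then sweep, tracking how many trains are present at once.
def min_platforms(arrivals, departures):
    arrivals, departures = sorted(arrivals), sorted(departures)
    i = j = current = best = 0
    while i < len(arrivals):
        if arrivals[i] < departures[j]:    # a train arrives before the next departure
            current += 1
            best = max(best, current)
            i += 1
        else:                              # a train leaves first, freeing a platform
            current -= 1
            j += 1
    return best

arr = [900, 940, 950, 1100, 1500, 1800]
dep = [910, 1200, 1120, 1130, 1900, 2000]
print(min_platforms(arr, dep))  # 3
```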
Explanation: Sort the events in O(n log n), then do a linear sweep.
- In fractional knapsack, if the next-best item’s value/weight ratio equals the previous one’s, choosing any fraction yields the same optimum. True or False?
A) True
B) False
C) Only if weights equal
D) Only if values equal
Answer: A
Explanation: Equal-ratio items are interchangeable with respect to total value for a given weight.
- The greedy algorithm for selecting gas stations to minimize stops to a destination — go to the farthest reachable station at each step — is:
A) Optimal (classic greedy interval cover)
B) Not optimal
C) Only works if stations at equal distances
D) Equivalent to Dijkstra
Answer: A
Explanation: Choosing the farthest reachable station extends range maximally; an exchange argument proves optimality.
- In weighted interval scheduling, where each interval carries a profit and overlapping intervals conflict, greedy by earliest finish is:
A) Always optimal for weighted case
B) Not optimal — needs DP (weighted interval scheduling)
C) Works if weights monotone
D) Equivalent to unweighted case
Answer: B
Explanation: The weighted case requires DP for an optimal solution.
- Greedy matching in general graphs (pick any available edge and remove its incident vertices) yields:
A) Maximum matching always
B) Approximation (1/2)-approx for maximum matching
C) Exact maximum cardinality matching
D) Works only for bipartite graphs
Answer: B
Explanation: Simple greedy matching is a 1/2-approximation.
- For coin denominations {1,7,10} and amount 14, greedy (largest first) gives 10+1+1+1+1 = 5 coins, but the optimum is 7+7 = 2 coins — thus greedy fails. Which property guarantees greedy coin optimality?
A) Canonical coin system (greedy-stable)
B) Primality of coin values
C) Coin values are powers of two
D) Coin values are consecutive
Answer: A
Explanation: Canonical systems ensure greedy optimality; general coin sets do not.
- For partitioning a set into two equal-sum subsets, greedy by largest-first (LPT-like) is:
A) Always optimal
B) Heuristic only; partition is NP-complete
C) Exact polynomial-time algorithm
D) Equivalent to Huffman
Answer: B
Explanation: Partition is NP-complete; greedy is a heuristic.
- The exact algorithm for minimum vertex cover on bipartite graphs via Kőnig’s theorem uses:
A) Greedy edge picking
B) Maximum matching then derive minimum vertex cover (exact)
C) Random selection
D) Approximation only
Answer: B
Explanation: For bipartite graphs, the minimum vertex cover size equals the maximum matching size; the cover is derived from the matching.
- For set packing, greedy by largest set first yields:
A) Optimal solution always
B) Approximation at best; NP-hard
C) Equivalent to set cover
D) Exact in polynomial time
Answer: B
Explanation: Set packing is NP-hard; greedy provides a heuristic.
- In interval partitioning (minimum number of resources so that intervals on each resource don’t overlap), greedy by start time, assigning each interval to the first free resource, yields:
A) Optimal number of resources equal to max depth (max concurrent intervals)
B) Approximate only
C) Always uses more than optimal by factor 2
D) Only works for unit lengths
Answer: A
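A min-heap sketch of the assignment (intervals are illustrative):

```python
import heapq

# Interval partitioning: sort by start time; reuse a resource whose
# last interval has already finished (tracked in a min-heap of finish
# times), otherwise open a new resource.
def min_resources(intervals):
    finish_heap = []                       # earliest-finishing resource on top
    for start, finish in sorted(intervals):
        if finish_heap and finish_heap[0] <= start:
            heapq.heappop(finish_heap)     # that resource is free again
        heapq.heappush(finish_heap, finish)
    return len(finish_heap)                # equals the maximum depth

lectures = [(0, 3), (1, 4), (2, 5), (4, 7), (5, 8)]
print(min_resources(lectures))  # 3
```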
Explanation: The greedy algorithm assigning to the earliest available resource is optimal and uses exactly depth resources.
- When building a Huffman tree, the two smallest frequencies 3 and 3 merge to 6; merging smallest first is always safe because:
A) Greedy choice property holds due to optimal substructure
B) It is a heuristic only
C) Because frequencies are identical only
D) Only works with powers of two
Answer: A
Explanation: An exchange argument shows merging the smallest is safe.
- The greedy approach for minimizing the sum of completion times on a single machine with given processing times is:
A) Shortest Processing Time (SPT) first is optimal
B) Longest Processing Time first
C) Random order
D) Earliest due date first
Answer: A
Explanation: SPT minimizes average completion time (Smith’s rule).
- For sequencing jobs with weights to minimize the weighted sum of completion times, the greedy rule is:
A) Sort by processing time / weight (p_i/w_i) ascending (Smith’s ratio rule)
B) Sort by shortest p only
C) Sort by largest weight only
D) Random schedule
Answer: A
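Smith's rule can be sketched by sorting on p_i/w_i and accumulating weighted completion times (jobs are illustrative):

```python
# Smith's rule: order jobs by p_i / w_i ascending to minimize the
# weighted sum of completion times on a single machine.
def smith_order_cost(jobs):
    # jobs: list of (processing_time, weight)
    t = total = 0
    for p, w in sorted(jobs, key=lambda pw: pw[0] / pw[1]):
        t += p                 # completion time of this job
        total += w * t
    return total

print(smith_order_cost([(3, 1), (1, 2), (2, 4)]))  # 20 (vs. 35 for the given order)
```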
Explanation: Smith’s rule — ordering by the ratio p_i/w_i — minimizes the weighted sum of completion times.
- For minimizing maximum lateness on a single machine, greedy by earliest due date (EDD) is:
A) Optimal
B) Fails sometimes
C) Equivalent to SPT
D) Only for unit times
Answer: A
Explanation: EDD minimizes maximum lateness; a simple exchange argument proves it.
- In constructing a minimum spanning tree (MST), both Kruskal and Prim are greedy; Kruskal grows a forest by adding safe edges while Prim:
A) Grows a single tree by adding cheapest edge crossing cut — both are greedy with cut property
B) Grows multiple trees randomly
C) Is DP-based
D) Is suboptimal for dense graphs
Answer: A
Explanation: Both exploit the cut and cycle properties to make safe greedy choices.
- A greedy algorithm to find a subset with maximum sum ≤ S by taking the largest elements first:
A) Works always
B) Fails in general (subset sum NP-hard)
C) Works if all numbers equal
D) Equivalent to knapsack fractional
Answer: B
Explanation: Subset sum is NP-hard; greedy isn’t reliable.
- For scheduling with release times and preemption allowed, which greedy rule minimizes maximum lateness optimally?
A) Earliest deadline first (EDF) with preemption is optimal
B) SPT is optimal
C) Greedy fails
D) Random preemption
Answer: A
Explanation: EDF with preemption minimizes maximum lateness under preemptive scheduling.
- The greedy algorithm for maximizing profit under a capacity constraint in the continuous case is fractional knapsack; the integer (0/1) case is:
A) NP-hard → no greedy guarantee
B) Solvable by greedy always
C) Solved with Kruskal
D) Equivalent to Huffman
Answer: A
Explanation: 0/1 knapsack requires DP; greedy fails.
- For graph coloring, greedy with degree-based vertex orderings (e.g., DSatur) gives:
A) Optimal coloring always
B) Heuristic that can be good but not always optimal
C) Equivalent to bipartite test
D) Always gives chromatic number +1
Answer: B
Explanation: The greedy coloring heuristic depends on the order; it can be suboptimal.
- For subset-sum/knapsack approximation, rounding item values and combining with DP yields:
A) A FPTAS (fully polynomial-time approximation scheme) for knapsack via DP rounding, not pure greedy
B) Exact greedy optimal for all instances
C) No approximation exists
D) Equivalent to Huffman
Answer: A
Explanation: Rounding + DP gives an FPTAS; pure greedy is not sufficient.
- For maximum independent set on interval graphs, greedy by earliest finish yields:
A) Optimal (interval graphs are perfect)
B) Not optimal
C) Works only if intervals equal length
D) Equivalent to coloring
Answer: A
Explanation: Interval graphs allow an optimal greedy MIS via interval scheduling.
- In fractional matching on bipartite graphs, greedy rounding from fractional to integral may:
A) Produce feasible solution with approximation guarantees
B) Always produce optimum integral solution
C) Fail to be feasible
D) Be irrelevant
Answer: A
Explanation: Rounding strategies can give approximations; integral maximum matching requires augmenting-path algorithms.
- The greedy approach of covering points with a minimum number of intervals of fixed length L — sort the points and place each interval starting at the leftmost uncovered point — is:
A) Optimal
B) Suboptimal
C) Works only for L=1
D) Equivalent to set cover NP-hard case
Answer: A
Explanation: Greedy placement covers the leftmost point and maximizes coverage; an exchange proof ensures optimality.
- In online algorithms, the greedy choice (serve the earliest request) may be:
A) Competitive ratio bounded by constant for some problems (like paging LRU)
B) Always optimal against offline optimum
C) Equivalent to DP
D) Never useful
Answer: A
Explanation: Greedy online algorithms can be competitive; e.g., LRU has a proven ratio under certain models.
- For minimum sum of completion times with release dates, plain SPT is not always feasible; the right approach is:
A) Smith’s rule with careful handling — schedule feasible SPT among available jobs (list scheduling)
B) Always SPT plain order
C) Greedy fails completely
D) Use Kruskal
Answer: A
Explanation: Use list scheduling, selecting the shortest among available jobs — greedy, but with dynamic selection.
- A greedy algorithm approximates metric TSP by building a minimum spanning tree and performing a preorder traversal (twice-around MST). The approximation ratio is:
A) 2-approximation for metric TSP (triangle inequality)
B) 1-approx (optimal)
C) 3-approx
D) Unbounded
Answer: A
Explanation: Twice-around MST gives ≤ 2 × OPT under the triangle inequality.
- For minimizing the number of late jobs on a single machine, EDD with iterative removal of the largest-processing-time job among those causing lateness yields:
A) Moore-Hodgson algorithm — optimal
B) Heuristic only
C) Equivalent to SPT
D) Not related to greedy
Answer: A
Explanation: Moore–Hodgson maximizes the number of on-time jobs via greedy removal.
- The greedy approach to finding a minimum-cardinality dominating set in general graphs:
A) Is heuristic with O(log n) approximation (set cover reduction)
B) Yields exact minimal dominating set
C) Equivalent to maximum clique
D) Runs in linear time and exact
Answer: A
Explanation: Dominating set is approximable by greedy via set cover techniques.
- For weighted interval scheduling, greedy by weight/length ratio is:
A) Not always optimal; need DP
B) Always optimal
C) Equivalent to fractional knapsack
D) Only if intervals non-overlap
Answer: A
Explanation: Overlaps complicate greedy; weighted intervals are solved by DP.
- For minimum-cost arborescence (directed MST), the greedy-based Edmonds’ algorithm is:
A) More complex but greedy-rooted; works for directed graphs
B) Same as Kruskal
C) Fails always
D) Equivalent to Prim
Answer: A
Explanation: Edmonds’ algorithm uses greedy selection of minimum incoming edges plus contraction.
- The greedy algorithm for maximum coverage with k sets (repeatedly pick the set covering the most uncovered elements) yields:
A) (1 − 1/e)-approximation
B) Exact optimum
C) 1/2-approximation only
D) No guarantee
Answer: A
Explanation: Greedy maximum k-cover has a (1 − 1/e) approximation guarantee.
- In matroid intersection, greedy alone is insufficient; but greedy is optimal for single-matroid optimization. True or False?
A) True (single matroid optimal)
B) False (even single matroid fails)
C) Only if matroid uniform
D) Only for graphic matroids
Answer: A
Explanation: Greedy works for a single matroid; intersection requires more sophisticated algorithms.
- For minimum-weight feedback edge set in directed graphs, greedily removing the heaviest edges is:
A) Not guaranteed optimal
B) Always optimal
C) Equivalent to MST
D) Unrelated
Answer: A
Explanation: Feedback edge set problems are complex; naive greedy may fail.
- The greedy algorithm for packing intervals into the minimum number of tracks (interval partitioning) is optimal by:
A) Sorting by start times and assigning to earliest finishing resource — yields depth
B) Sorting by finish time only
C) Random assignment
D) NP-hardness prevents greedy
Answer: A
Explanation: Earliest-available resource assignment attains the minimum, equal to the depth.
- In image quantization (color palette reduction), greedy merging of closest colors may get stuck in local minima — thus:
A) Greedy is heuristic; clustering (k-means) may do better
B) Greedy always optimal
C) Greedy equivalent to Huffman
D) Greedy gives global optimum always
Answer: A
Explanation: Color-merging heuristics are local; a global optimum is not guaranteed.
- A greedy algorithm picking the smallest-weight edge crossing any cut (cut property) always yields an MST because:
A) Cut property ensures chosen edge is safe
B) Cycle property ensures any edge can be chosen
C) Sorting edges descending is required instead
D) Only for complete graphs
Answer: A
Explanation: The cut property proves the edge is safe to add for the MST.
- For maximizing throughput in weighted interval scheduling, greedy fails; dynamic programming with binary search over compatible jobs gives O(n log n). True or False?
A) True
B) False
C) Only for unweighted case
D) Only if intervals length equal
Answer: A
Explanation: Weighted interval scheduling uses DP over intervals sorted by finish time, with binary search.
- In bucket scheduling, where tasks are grouped and greedy assigns groups by earliest deadline, it is:
A) Heuristic — may fail
B) Always optimal
C) Equivalent to EDF per task
D) Identical to Moore-Hodgson
Answer: A
Explanation: Grouping can break greedy optimality; more care is required.
- For maximum coverage with budgeted costs, greedy by benefit/cost ratio is:
A) A known approximation (similar to knapsack cover approximations) but not exact
B) Exact optimum
C) Fails completely
D) Equivalent to set packing
Answer: A
Explanation: Ratio greedy gives good approximations, but not always the optimum.
- Greedy for minimum sensor placement along a line to cover points with unit radius — placing each sensor at the farthest location that still covers the leftmost uncovered point — is:
A) Optimal
B) Suboptimal
C) NP-hard
D) Works only for integer points
Answer: A
Explanation: This is the classic interval covering greedy.
- In designing a Huffman code with ties (equal frequencies), the choice of which pair to merge:
A) Any tie-breaking yields some optimal tree (all minimal)
B) Only specific tie-breaking yields optimal
C) Ties imply failure
D) Must reorder alphabetically only
Answer: A
Explanation: Many optimal Huffman trees exist; tie-breaks preserve optimality.
- The greedy algorithm for finding k centers (k-center problem) by farthest-first traversal:
A) Gives 2-approximation for metric k-center
B) Exact solution in polynomial time
C) No guarantee
D) Yields optimal for all metrics
Answer: A
Explanation: Farthest-first greedy is a 2-approximation for metric k-center.
- For scheduling unit-time tasks with deadlines and profits, if two jobs have equal profit, greedy tie-breaking by shorter deadline:
A) Can be used — many tie-breaks still yield optimal profit
B) Always fails
C) Requires sorting by deadline only
D) Equivalent to SPT
Answer: A
Explanation: Tie-breakers among equal-profit jobs are often arbitrary.
- Greedy edge-picking for maximum matching in bipartite graphs (without augmenting paths) quickly achieves:
A) 1/2-approximation for maximum matching size
B) Exact max matching always
C) Works only for trees
D) NP-hard
Answer: A
Explanation: Greedy matching, even picking edges arbitrarily, yields at least half the optimum.
- A greedy algorithm that repeatedly picks the element minimizing incremental cost can fail because:
A) Local optimum may not extend to global optimum
B) Always yields global optimum
C) Equivalent to dynamic programming
D) Only applies to graphs
Answer: A
Explanation: Greedy that locally minimizes incremental cost can be trapped in local minima.
- For weighted set cover, greedily picking the set with minimum cost per uncovered element yields:
A) ln n approximation factor (H_n)
B) Exact polynomial solution
C) No approximation known
D) 2-approximation only
Answer: A
Explanation: The weighted set cover greedy gives an H_n approximation.
- The greedy approach of constructing a near-optimal binary search tree by choosing the highest-frequency key as root is:
A) Not optimal — optimal BST uses DP (Knuth optimization can speed up)
B) Always optimal
C) Equivalent to Huffman
D) Works only if frequencies equal
Answer: A
Explanation: Greedy by highest frequency fails in general; optimal BST requires DP.
- In edge-weighted bipartite matching with unit capacities, greedy by lightest edges is:
A) Not optimal for maximum cardinality or minimum weight matching in general
B) Always optimal
C) Equivalent to Hungarian algorithm
D) Always yields perfect matching
Answer: A
Explanation: Minimum-weight matching needs more than greedy; the Hungarian algorithm gives the optimum.
- Greedy selection of k items with maximum diversity (max-sum of pairwise distances) by iterative farthest selection is:
A) Heuristic; approximation guarantees are problem-dependent
B) Always optimal
C) Equivalent to k-means
D) Always fails
Answer: A
Explanation: Farthest-first is a heuristic; quality varies with the objective.
- For constructing prefix-free codes beyond the standard Huffman setting, greedy merging of the smallest weights may fail if additional constraints exist (e.g., unequal symbol costs). True or False?
A) True
B) False
C) Only for equal costs
D) Only for integer weights
Answer: A
Explanation: Huffman optimality assumes uniform symbol cost per unit of leaf depth; other constraints need different methods.
- The greedy algorithm for minimizing the number of lecture rooms given class intervals uses:
A) Priority queue of current end times — assign to earliest finishing room — optimal
B) Sorting by start time and stacking naïvely — optimal
C) Sorting by end time only — fails
D) Random assignment
Answer: A
Explanation: A sweep with a min-heap tracks the earliest free room, producing optimal counts.
- For maximizing the sum of selected non-overlapping weighted intervals, DP over sorted finish times is O(n log n) due to binary search for predecessors; greedy fails. True or False?
A) True
B) False
C) Only for unweighted
D) Only if weights positive
Answer: A
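A compact sketch of that DP, using binary search for the latest compatible predecessor (intervals are illustrative):

```python
import bisect

# Weighted interval scheduling: DP over intervals sorted by finish
# time; bisect finds the latest interval finishing at or before the
# current interval's start.
def max_weight_schedule(intervals):
    # intervals: list of (start, finish, weight)
    intervals = sorted(intervals, key=lambda x: x[1])
    finishes = [f for _, f, _ in intervals]
    best = [0] * (len(intervals) + 1)      # best[i]: optimum over first i intervals
    for i, (s, f, w) in enumerate(intervals):
        p = bisect.bisect_right(finishes, s, 0, i)  # count of compatible predecessors
        best[i + 1] = max(best[i], best[p] + w)     # skip vs. take interval i
    return best[-1]

jobs = [(1, 3, 5), (2, 5, 6), (4, 6, 5), (6, 7, 4), (5, 8, 11)]
print(max_weight_schedule(jobs))  # 17
```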
Explanation: Weighted interval scheduling needs DP; greedy by weight or ratio can fail.
- A greedy algorithm that always takes the next cheapest edge for the MST, skipping any edge that would form a cycle, will:
A) Work — taking the cheapest remaining edge and skipping cycle-forming edges is essentially Kruskal’s algorithm
B) Always create cycles and fail
C) Equivalent to Prim
D) Equivalent to Dijkstra
Answer: A
Explanation: Kruskal chooses the cheapest edges and skips those forming cycles; that is safe.
- The greedy algorithm for optimal checkpoint placement, minimizing maximum rollback distance by placing checkpoints periodically:
A) Simple periodic placement may not be optimal under variable costs — problem-specific
B) Always optimal if period fixed
C) Equivalent to weighted set cover
D) NP-hard
Answer: A
Explanation: Periodic greedy placement may be suboptimal when costs vary; more analysis is needed.
- Greedy selection of the most-connected node (dominating set heuristic) gives approximation:
A) O(log n) factor (similar to set cover)
B) Exact minimal dominating set
C) No relation to set cover
D) Always fails
Answer: A
Explanation: Dominating set is approximable within roughly ln Δ by greedy, set-cover-like methods.
- For online interval scheduling, where jobs arrive over time and must be accepted or rejected immediately without future knowledge, greedy by earliest finish among seen jobs:
A) May be competitive with bounded ratio for some models
B) Gives optimal offline solution
C) Is NP-hard online
D) Equivalent to offline DP
Answer: A
Explanation: Online greedy can be competitive; the exact ratio depends on the model.
- The greedy approach for minimizing the sum of weighted completion times on a single machine with release dates is:
A) More complex — needs priority rule with dynamic selection (e.g., Smith’s rule among available jobs) but not trivial
B) Plain Smith’s rule globally
C) Equivalent to EDF
D) Always SPT
Answer: A
Explanation: With release dates, greedy must be dynamic (list scheduling) and may require more than a simple static order.
- For the minimum-cost circulation problem, greedy local improvements:
A) May not converge to global optimum; network flow algorithms needed
B) Always optimal
C) Equivalent to MST
D) Equivalent to matching
Answer: A
Explanation: Local greedy adjustments can get stuck; specialized algorithms solve min-cost flows.
- For approximate set packing, greedily picking the largest set reduces uncovered items quickly but may:
A) Lead to poor overall packing size in worst-case (heuristic only)
B) Always maximize packing size
C) Equivalent to perfect matching
D) Provide exact polynomial solution
Answer: A
Explanation: Set packing is NP-hard; greedy is a heuristic.
- In binary search tree (BST) construction, choosing the root greedily by maximum frequency yields:
A) Not necessarily optimal search cost; optimal BST needs DP
B) Always optimal
C) Equivalent to AVL balancing
D) Equivalent to Huffman
Answer: A
Explanation: Optimal BST uses DP; greedy by frequency fails.
- Greedy for the minimum number of cameras to monitor highway segments — place each camera at the rightmost point of the leftmost uncovered segment — is:
A) Optimal (classic interval covering)
B) Heuristic only
C) NP-hard
D) Equivalent to vertex cover
Answer: A
Explanation: Greedy works for interval covering problems.
- For resource allocation under matroid constraints, greedy by highest weight yields:
A) Optimal solution because matroid property guarantees greedy optimality
B) Suboptimal in general
C) Only works for graphic matroids
D) Only if weights distinct
Answer: A
Explanation: Greedy is optimal for matroid-weighted selection.
- The greedy algorithm for building a shortest superstring by repeatedly merging the pair with largest overlap is:
A) Heuristic; not guaranteed optimal (shortest common superstring NP-hard)
B) Always optimal
C) Equivalent to Huffman
D) Works for DNA only
Answer: A
Explanation: Greedy overlap merging is a heuristic with no general optimality guarantee.
- For planar convex hull, Graham scan is a greedy-like angular sort followed by a stack pass — its complexity is:
A) O(n log n) due to sort
B) O(n²)
C) O(n) always
D) O(log n)
Answer: A
Explanation: Sorting by angle dominates; the hull is built in linear time after sorting.
- The greedy algorithm for compressing an image by quadtree splitting based on variance is:
A) Heuristic — optimality depends on cost model
B) Always optimal
C) Equivalent to DP
D) Exact minimal representation always
Answer: A
Explanation: Greedy quadtree splitting is local; globally optimal compression may require other criteria.
- For the minimum-latency problem (traveling repairman), greedy nearest-neighbor ordering is:
A) Not optimal; problem is NP-hard and needs specialized approximations
B) Always optimal
C) Equivalent to MST traversal
D) Identical to TSP optimum
Answer: A
Explanation: Greedy nearest-neighbor may be poor; the best known approximations use MST-based tours.
- In compressive sensing, greedy algorithms like Orthogonal Matching Pursuit (OMP):
A) Work well under sparsity and RIP conditions with provable guarantees
B) Always optimal without conditions
C) Equivalent to convex relaxation always
D) Never work
Answer: A
Explanation: OMP is greedy, with provable recovery under restricted isometry and sparsity assumptions.
- The method for constructing the median-of-medians pivot in the selection algorithm is:
A) Deterministic linear-time selection uses grouping/median-of-medians (not purely greedy)
B) Simple greedy pivot always yields linear time
C) Random pivot always superior
D) Greedy pivot fails always
Answer: A
Explanation: Median-of-medians is deterministic selection with careful grouping; not simple greedy.
- For maximum subsequence sum with nonnegative weights only, greedy by taking all positive elements is:
A) Optimal
B) Fails
C) Equivalent to Kadane for negatives too
D) NP-hard
Answer: A
Explanation: If all weights are nonnegative, the entire array’s sum is the maximum; for mixed signs, Kadane’s algorithm (DP/greedy) applies.
- A greedy algorithm that merges files largest-first (instead of smallest-first) in the optimal merge pattern will:
A) Be suboptimal (grows cost)
B) Be equivalent to Huffman
C) Be optimal for special cases only
D) Always optimal
Answer: A
Explanation: Merging large files first increases total cost; merging small first is optimal.
- For approximate facility location, greedily adding the new facility with the best cost decrease:
A) Yields logarithmic approximation under some formulations
B) Exact polynomial algorithm
C) No better than random
D) Equivalent to set cover
Answer: A
Explanation: Greedy gives provable approximations for facility location variants.
- In resource allocation where tasks must be packed into bins, greedy first-fit packing:
A) Is heuristic; first-fit-decreasing can give good approximations but not optimal in general (bin packing NP-hard)
B) Always optimal
C) Equivalent to fractional knapsack
D) Works only for identical sizes
Answer: A
Explanation: Bin packing needs sophisticated approximations; FFD uses at most (11/9)·OPT + O(1) bins.
- For maximum flow in networks, greedily augmenting along any augmenting path leads to:
A) Ford-Fulkerson which may be exponential unless augmenting path chosen cleverly (e.g., Edmonds–Karp shortest augmenting path)
B) Always polynomial greedy
C) Equivalent to Prim
D) Always optimal in one augment
Answer: A
Explanation: Ford–Fulkerson with arbitrary paths can be slow; Edmonds–Karp uses BFS for a polynomial bound.
- The greedy strategy for channel allocation of assigning the lowest-numbered available channel may be:
A) Heuristic; can be suboptimal depending on interference model
B) Always optimal
C) Equivalent to maximum matching
D) NP-hard only for 2 channels
Answer: A
Explanation: Allocation depends on the interference graph; the greedy heuristic is sometimes good, not guaranteed.
- The greedy algorithm choosing the highest-degree vertex for the vertex cover and removing its incident edges:
A) Gives O(log n)-approx (via reduction to set cover)
B) Always optimal
C) 1/2-approx only
D) Equivalent to matching
Answer: A
Explanation: The heuristic approximates; the set cover reduction yields an ln n factor.
- For sensor placement in tree networks to monitor all nodes, with sensors at nodes covering their neighbors, greedy bottom-up placement at parents of leaves yields:
A) Optimal for trees (tree vertex cover via simple greedy)
B) Heuristic only
C) Fails always
D) NP-hard on trees
Answer: A
Explanation: Tree vertex cover is solved optimally by greedy bottom-up selection of parents of leaves.
- Finally, to prove greedy correctness, which sequence is most typical?
A) Show greedy choice is safe (exchange argument) → show optimal substructure and conclude optimality
B) Use randomization only
C) Show dynamic programming equivalence for all problems
D) Empirically test on random instances
Answer: A
Explanation: Typical proof: greedy choice property + optimal substructure (via exchange) → combined guarantee.