IM21204 – Operations Research II

Credits: 4
Duration: One Semester (12–14 weeks)
Level: Undergraduate (Core)

This course builds on the foundations of Operations Research I and focuses on dynamic and stochastic optimization techniques. The emphasis is on modeling, analytical solution methods, and interpretation of results for decision-making under uncertainty and strategic interaction.

Course Objectives

To enable students to:

  • Formulate unconstrained and constrained nonlinear programs and characterize their solutions via first- and second-order optimality conditions, Lagrange multipliers, and the KKT conditions.
  • Apply the principle of optimality to model and solve deterministic dynamic programming problems such as resource allocation, shortest path, inventory, and replacement models.
  • Analyze discrete-time Markov chains (classification, steady-state behavior) and formulate Markov decision processes for sequential decision-making under uncertainty.
  • Model strategic interaction using static, dynamic, and evolutionary games and interpret equilibrium concepts such as Nash equilibrium.

Weekly Course Outline

| Week | Topics | No. of Lectures | Assignments / Materials |
|------|--------|-----------------|-------------------------|
| 1 | Review of Optimization Concepts: Convexity, Optimality Conditions | 3 | Reading: Bazaraa et al., Ch. 1–2 |
| 2 | Unconstrained Nonlinear Programming: First- and Second-Order Conditions | 4 | Problem Set 1 |
| 3 | Constrained Nonlinear Programming: Lagrange Multipliers, KKT Conditions | 4 | Numerical examples |
| 4 | Introduction to Dynamic Programming: Principle of Optimality | 3 | Reading: Bertsekas, Ch. 1 |
| 5 | Deterministic Dynamic Programming: Resource Allocation, Shortest Path | 4 | Problem Set 2 |
| 6 | Dynamic Programming Applications: Inventory and Replacement Models | 3 | Test 1 |
| 7 | Stochastic Processes Review: Discrete-Time Markov Chains | 3 | Reading: Ross, Ch. 4 |
| 8 | Markov Chains: Classification, Steady-State Analysis | 4 | Problem Set 3 |
| 9 | Markov Decision Processes (MDPs): Policy Evaluation and Optimization | 4 | Numerical examples |
| Bonus | Infinite-Horizon MDPs: Discounted and Average Reward Criteria | 2 | Reading notes |
| Optional | Game Theory Basics: Static Games, Nash Equilibrium | 3 | Ref: Osborne & Rubinstein |
| Optional | Dynamic and Evolutionary Games: Repeated Games, Replicator Dynamics | 3 | Ref: L.C. Thomas |

Textbooks and References

  • Bazaraa, M. S., Sherali, H. D., and Shetty, C. M., Nonlinear Programming: Theory and Algorithms, Wiley.
  • Bertsekas, D. P., Dynamic Programming and Optimal Control, Athena Scientific.
  • Ross, S. M., Introduction to Probability Models, Academic Press.
  • Osborne, M. J. and Rubinstein, A., A Course in Game Theory, MIT Press.
  • Thomas, L. C., Games, Theory and Applications, Dover.
Assessment Scheme

| Component | Weight |
|-----------|--------|
| Continuous Evaluation (Assignments 20% + Quizzes 20%) | 40% |
| Midterm Exam | 30% |
| Final Exam | 30% |

Course Materials

Lecture notes, problem sets, and other materials are provided. Check the course Moodle page for the latest updates.
