Chapters 1–4 are devoted to introducing optimization and optimization modeling. Convex models appear later with the other material on convex optimization. In my experience teaching mathematics students, they find modeling challenging. These chapters assume technology is available to solve problems, so that the focus can stay on formulation, as well as on interpreting solutions. They build steadily in sophistication, starting with numerical instances but soon moving to algebraic formulations to make clear the distinction between model structure and data. The features of the models also build, particularly when logical variables are used. In contrast with the business case study approach, each model has a limited number of features and focuses on one novel aspect. I have found that mathematics students relate better to a succession of simpler models, from which they learn different modeling principles, than to a long case study.
Chapters 5–8 discuss iterative algorithms, giving some easily explained examples from discrete optimization, and the theoretical background for linear programming. This includes a little computational complexity, convexity and the study of polyhedra, optimality conditions for linear programming, and duality theory for linear programming. It is unorthodox to cover all of these before introducing the simplex method. However, conceptually these topics fit together and do not depend on the simplex method; putting them together emphasizes this fact. Chapter 8 on duality is independent of Chapter 9 on the simplex method, so they can be covered in either order. I typically skip the topics in Sections 5.3, 7.5, 8.4, and 8.5 to arrive at the simplex method about the middle of the semester.
Chapters 9–11 present the simplex method, including sensitivity analysis, and other algorithms for linear programming. A developmental approach is taken to presenting the simplex method. After geometric reasoning about why it works, the method is presented in its “naive” form first, where the so‐called inverse basis matrix is computed from scratch each iteration. While this form is computationally inefficient, it is very easy to explain to a mathematics student, both for performing computations and for justifying that the algorithm works. When working examples, technology can be used to invert and multiply matrices. After completing the picture of why the method works (degeneracy, two‐phase simplex), Section 9.4 takes up the issue of making the simplex method more efficient, including the tableau form and revised simplex method. Instructors who wish to start with the tableau can use the material found here. Chapter 10 on sensitivity analysis, which depends on Chapter 8, can be skipped without loss of continuity; however, the interpretation of dual variables as shadow prices and partial derivatives is enriching, even in an age when sensitivity analysis can be done quickly by solving modified linear programs. An interpretation of strong duality in terms of average costs is also given in Section 10.3. Chapter 11 presents three more algorithms for linear programming, all of which rely on duality: the dual simplex, transportation simplex, and a primal‐dual, or path following, interior point method. The transportation simplex method is presented first as a minimum cost flow problem, then specialized to transportation problems.
Chapters 12 and 13 present integer programming algorithms. These relate to the earlier material because of the importance of linear programming in establishing bounds when solving an integer program. Integer programming also has greater modeling power, as demonstrated by the many applications in Chapter 4. Chapters 14 and 15 introduce convex programming, including some motivating applications. The earlier chapters are designed to prepare the reader to understand convex programming more readily. The KKT optimality conditions and duality theorems are a generalization of Lagrangian duality (Section 8.4). Necessary and sufficient conditions for a global optimum follow from convexity theory, already applied to linear programs in Chapter 6. Chapter 15 culminates in the primal‐dual interior point method, which was presented for linear programs in Section 11.3. Quadratic programming is also introduced, and the connection between the primal‐dual interior point method and sequential quadratic programming is made.
Supplemental material for the book will be available at the web site www.gordon.edu/michaelveatch/optimization. A full solution manual will be made available to instructors.
The book contains the amount of material covered in a typical two‐semester sequence of undergraduate classes. A semester course focusing on linear programming could cover Chapters 1 and 2, Sections 3.1–3.2, Chapters 5 and 6, Sections 7.1–7.4 and 8.1–8.3, and Chapters 9 and 10, plus some other topics from these chapters and Chapter 11. A course on linear and integer programming could cover Chapters 1 and 2, Sections 3.1–3.2, Chapters 4–6, Sections 7.1–7.4 and 8.1–8.3, and Chapters 9, 12, and 13. A somewhat more advanced course on linear and convex programming could cover Chapters 1–3, 5, and 6, Sections 7.1–7.4, Chapters 8 and 9, Sections 11.1–11.3, and Chapters 14 and 15.
Several more advanced or specialized topics, which are optional and can be easily skipped, have been included at the end of chapters or sections. Section 3.3 shows that a dynamic program can be solved as a linear program, an approach that relates to machine learning. Section 5.3 on computational complexity, while not difficult, is only occasionally mentioned in the later chapters. Section 7.5 extends the optimality conditions needed to solve linear programs to general polyhedra. Section 8.4 introduces Lagrangian duality for linear programs and shows that it is equivalent to the usual dual; it is only needed if convex programming (Chapter 14) is being covered. Farkas' lemma is presented in Section 8.5, providing another approach to duality theorems. The computational strategies in Section 9.4 are important to the simplex method but are not used in the sequel. The interior point algorithm in Section 11.4 is computationally more involved. It is closely related to Section 15.5.
I want to express my deep appreciation to the many people who helped make this book possible. First, I want to thank David Rader, Larry Leemis, and Susan Martonosi for encouraging me to undertake the project. I am grateful to my former and current students Isaac Bleecker, Mackenzie Hinds, Joe Iriana, Stephen Rizzo, and Michael Yee for reviewing portions of the draft. I also thank my friend John Sanderson for drawing the figures, my colleague Jonathan Senning for his technical advice, and students Isaac Bleecker, Jessica Guan, Seth McKinney, and Yi Zhou for their help with the exercises and figures.
Most of all, I am grateful for my wife Cindy's confidence in me and acceptance of my working odd hours on the project. Now we are both authors.
Michael H. Veatch
Wenham, MA
March, 2020
About the Companion Website
This book is accompanied by a companion website:
www.wiley.com/go/veatch/convexandlinearoptimization
The website includes the instructor solutions manual.
1 Introduction to Optimization Modeling
1.1 Who Uses Optimization?
Data‐driven decision‐making is on the rise. Many businesses, governmental organizations, and nonprofits collect large amounts of data and seek to use them to inform the decisions they make. In addition to data and statistical analysis, a mathematical model of the system is often needed to find the best or even a viable option. Examples include planning the flow of goods through a supply chain, scheduling personnel shifts, or choosing an investment portfolio. The approach of optimization is to develop a model describing which potential decisions could be taken, in light of physical, logical, financial, or other limitations, and assess them using a single objective. This objective could be profit or loss in a business setting, expected return on an investment, a physical measure such as minimizing time to complete a task, a health measure such as expected lives saved, or in the government and nonprofit sector, a measure of social welfare. The mathematical model is an optimization problem; the study of these problems is called optimization.
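As a small concrete illustration of this approach (not an example from the book), consider a planner choosing production quantities of two products to maximize profit subject to resource limits. The data below are invented for illustration; the model is formulated and solved with SciPy's linear programming routine.

```python
# A tiny linear program: maximize 3x + 5y subject to
#   x + 2y <= 14,  3x >= y,  x - y <= 2,  x, y >= 0.
# The decisions are (x, y), the limitations are the inequalities,
# and the single objective is profit. Data here are hypothetical.
from scipy.optimize import linprog

# linprog minimizes, so negate the objective coefficients to maximize.
c = [-3, -5]
# Inequalities in the form A_ub @ [x, y] <= b_ub (3x >= y becomes -3x + y <= 0).
A_ub = [[1, 2], [-3, 1], [1, -1]]
b_ub = [14, 0, 2]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)     # optimal decision: [6. 4.]
print(-res.fun)  # maximum profit: 38.0
```

Even in this two-variable case, the solver does the search; the modeler's work is translating the decisions, limitations, and objective into the formulation, which is the focus of the early chapters.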