Abdelkhalak El Hami - Optimizations and Programming


Optimizations and Programming: summary, description and annotation


This book is a general presentation of complex systems, examined from the point of view of management. There is no standard formula to govern such systems, nor to effectively understand and respond to them. The interdisciplinary theory of self-organization is teeming with examples of living systems that can reorganize at a higher level of complexity when confronted with an external challenge of a certain magnitude. Modern businesses, considered as complex systems, ideally know how to flexibly and resiliently adapt to their environment, and also how to prepare for change via self-organization. Understanding sources of potential crisis is essential for leaders, though not all crises are necessarily bad news, as creative firms know how to respond to challenges through innovation: new products and markets, organizational learning for collective intelligence, and more.

Optimizations and Programming — read the excerpt online



We will first perform Phase I of the simplex algorithm. After adding artificial variables, we obtain the tableau as follows:

[Phase I tableau with artificial variables]

There are no longer any artificial variables in the basis. We have found the optimal tableau of the auxiliary problem, and therefore an admissible basic solution for the initial problem. Since Phase I is now complete, delete the columns corresponding to the artificial variables and compute the reduced costs to obtain the following tableau:

[Optimal tableau after removing the artificial variables]

This tableau is optimal. The optimal solution is x1 = 1, x2 = 0, x3 = 0 and x4 = 1, giving an optimal value of z = −1.
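The numerical tableaux above appear only as figures, so the following sketch illustrates the same two-phase mechanics on placeholder data (the matrix A, right-hand side b and cost c below are not the ones from the text), using scipy's linprog rather than hand pivoting:

# Two-phase illustration (sketch): Phase I minimizes the sum of the artificial
# variables to find a basic feasible solution; Phase II then optimizes the true
# cost. Placeholder data -- not the example solved in the text.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 2.0, 1.0, 0.0],
              [2.0, 1.0, 0.0, 1.0]])
b = np.array([3.0, 3.0])
c = np.array([-1.0, -1.0, 0.0, 0.0])      # minimize c^T x

m, n = A.shape

# Phase I: min 1^T a  subject to  A x + I a = b,  x >= 0,  a >= 0
c_phase1 = np.concatenate([np.zeros(n), np.ones(m)])
A_phase1 = np.hstack([A, np.eye(m)])
phase1 = linprog(c_phase1, A_eq=A_phase1, b_eq=b, method="highs")
assert phase1.fun < 1e-9, "Phase I optimum > 0: the original problem is infeasible"

# Phase II: with feasibility established, optimize the true cost over the
# original variables (a textbook implementation would instead continue the
# pivots from the basis produced by Phase I).
phase2 = linprog(c, A_eq=A, b_eq=b, method="highs")
print("optimal x:", phase2.x, "optimal value:", phase2.fun)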

1.6.3. Degeneracy and cycling

Suppose that the current basic admissible solution is degenerate (i.e. at least one basic variable is zero), and let us recall a possible unfavorable scenario that can occur when the simplex algorithm is iterated. Performing a change of basis yields a value of the cost function that is less than or equal to the previous value. The method is only guaranteed to converge if z* < z0; in fact, if the previous basic solution is degenerate, it might be the case that z* = z0.

If this phenomenon occurs multiple times over consecutive iterations, the algorithm can return to a basis that was already considered earlier. In other words, it returns to the same vertex twice. This is called cycling.

Although degeneracy is a frequently encountered phenomenon in linear programming, cycling is not. Some examples have been given in the literature [GAS 75]. Despite the rarity of cycling, it would be desirable to specify a method that avoids it and therefore guarantees that the simplex method will converge. There are several possible choices for a rule that avoids the inconvenience of cycling.

In the past, the most common technique was the so-called method of small perturbations. This method applies an infinitesimal modification to the vector b in order to force it to be expressible as a linear combination of m vectors with strictly positive coefficients in every basis. Each vertex is then defined as the intersection of n hyperplanes [HIL 90].
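As a rough sketch of the idea (assuming, as is usual for this technique, that the perturbation acts on the right-hand side), one replaces b by b(ε) = b + (ε, ε², ..., ε^m) for a small ε > 0, which breaks the ties responsible for degeneracy:

# Small-perturbation idea (sketch): perturb the right-hand side so that no
# basic solution has a zero component and the ratio test has a unique winner.
import numpy as np

def perturb_rhs(b, eps=1e-6):
    m = len(b)
    return b + np.array([eps ** (i + 1) for i in range(m)])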

More recently, Bland [BLA 77] suggested modifying the rules for choosing the change of basis. Bland’s rule proceeds as follows:

– for the variable entering the basis, choose the smallest index j such that the reduced cost is negative;

– for the variable leaving the basis, if there is equality in θ*, choose the variable with the lowest index.
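A minimal sketch of these two choices, assuming the reduced costs, the pivot column and the current basis indices are already available as arrays (the helper functions below are illustrative, not taken from the book):

# Bland's anti-cycling rule (sketch): the smallest eligible index wins, both
# for the entering variable and for the leaving variable.
import numpy as np

def bland_entering(reduced_costs, tol=1e-9):
    # Smallest index j whose reduced cost is negative; None means optimal.
    candidates = np.where(reduced_costs < -tol)[0]
    return int(candidates[0]) if candidates.size else None

def bland_leaving(column, rhs, basis, tol=1e-9):
    # Ratio test on the pivot column; among the rows attaining the minimum
    # ratio (equality in theta*), pick the one whose basic variable has the
    # smallest index. None means the problem is unbounded in this direction.
    rows = np.where(column > tol)[0]
    if rows.size == 0:
        return None
    ratios = rhs[rows] / column[rows]
    ties = rows[np.abs(ratios - ratios.min()) <= tol]
    return int(min(ties, key=lambda i: basis[i]))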

1.6.4. Geometric structure of realizable solutions

Earlier, when we solved linear programs graphically, the optimal solutions were on the boundary of the convex set of realizable solutions. If there was a unique optimal solution, it was an extreme point.

Recall the canonical form of a linear program:

[Canonical form (P)]

where the matrix A is assumed to be an m × n matrix of rank m (m < n). The standard form is then given by

[Standard form (Ps)]
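For reference, since both forms are displayed as figures in the original, they can be written as follows under the usual convention (the minimization direction is an assumption here; the standard form is obtained from the canonical one, for example by adding slack variables):

\[
(P)\colon\ \min_{x}\ c^{\top}x \quad \text{s.t.}\quad Ax \le b,\ x \ge 0,
\qquad
(P_s)\colon\ \min_{x}\ c^{\top}x \quad \text{s.t.}\quad Ax = b,\ x \ge 0.
\]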

It is possible to show the following result.

Let Γ (respectively, Σ) be the set of feasible solutions of (P) (respectively, of (Ps)):

[Definitions of the feasible sets Γ and Σ]

Then Γ and Σ are closed and convex sets.

Recall that the simplex method only uses the basic solutions of (Ps). The next theorem tells us that these are in fact the extreme points of Σ.

THEOREM 1.5.– A point x is a basic feasible solution of (Ps) if and only if x is an extreme point of Σ.

Examples where the solution is not unique show that the optimal solutions lie on the boundary:

THEOREM 1.6.– Every optimal solution of (P) (respectively, of (Ps)) belongs to the boundary of Γ (respectively, of Σ).

A question presents itself: can there be optimal solutions but no optimal basic feasible solutions? If so, the simplex method, which only considers basic solutions, would not be able to find an optimal solution. The following theorem tells us that this does not happen.

THEOREM 1.7.– If (Ps) has an optimal solution, then (Ps) has an optimal basic feasible solution.
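To make Theorems 1.5 and 1.7 concrete, the brute-force sketch below enumerates every basic solution of a small standard-form problem (placeholder data, not taken from the book), keeps the feasible ones (by Theorem 1.5, exactly the extreme points of Σ) and selects the best of them, which by Theorem 1.7 suffices to find an optimum whenever one exists:

# Enumerate the basic solutions of  Ax = b, x >= 0  by picking m columns at a
# time, keep the feasible ones (the extreme points of Sigma), and take the best
# objective value among them. Placeholder data -- not a problem from the book.
import itertools
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
c = np.array([-2.0, -3.0, 0.0, 0.0])      # minimize c^T x

m, n = A.shape
best = None
for cols in itertools.combinations(range(n), m):
    B = A[:, list(cols)]
    if abs(np.linalg.det(B)) < 1e-12:
        continue                          # the columns do not form a basis
    x_B = np.linalg.solve(B, b)
    if np.any(x_B < -1e-9):
        continue                          # basic solution, but not feasible
    x = np.zeros(n)
    x[list(cols)] = x_B
    value = float(c @ x)
    if best is None or value < best[0]:
        best = (value, x)

print("best basic feasible solution:", best[1], "value:", best[0])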

1.7. Interior-point algorithm

The simplex algorithm moves along a sequence of vertices of the polyhedron and converges rapidly on average. However, Klee and Minty [KLE 72] found pathological examples where the simplex algorithm visits almost every vertex, which can potentially be an enormous number. The existence of theoretical linear programs that derail the simplex algorithm motivated research to find an algorithm that has polynomial complexity in m and n, even in the worst-case scenarios. Khachian [KHA 79] found the first such algorithm, based on the ellipsoid method from nonlinear optimization. Although it is faster than the simplex method on the Klee–Minty problems, it never succeeded in establishing itself because it tended to be much slower than the simplex algorithm on real problems in practice. Nevertheless, as a mathematical result, it inspired research into interior-point algorithms.

In 1984, Karmarkar [KAR 84] found a new algorithm. The computation time in the worst-case scenario is proportional to n^3.4 L, where L denotes the number of bits required to encode the data A, b and c that define the linear program. The algorithm is too complicated for computations by hand; known as the interior-point algorithm, it moves through the interior of the polyhedron to converge on the optimal vertex more quickly.

Today, some variants of this algorithm are able to compete with the simplex algorithm. Some software programs already include an interior-point algorithm in addition to the simplex algorithm. On some types of linear programs, interior-point algorithms can be 10 times faster than the simplex algorithm, but nevertheless, there are still many cases where the simplex algorithm is quicker [PRI 11].
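For instance, scipy's linprog exposes both families through the HiGHS solvers, which makes it easy to compare a simplex-type method with an interior-point method on the same model (the data below are placeholders):

# Compare a dual-simplex solver with an interior-point solver on one LP:
# minimize c^T x subject to A_ub x <= b_ub, x >= 0. Placeholder data.
import numpy as np
from scipy.optimize import linprog

c = np.array([-3.0, -5.0])
A_ub = np.array([[1.0, 0.0],
                 [0.0, 2.0],
                 [3.0, 2.0]])
b_ub = np.array([4.0, 12.0, 18.0])

for method in ("highs-ds", "highs-ipm"):   # dual simplex vs interior point
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, method=method)
    print(method, "->", "x =", res.x, "value =", res.fun)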

EXAMPLE 1.9.– Consider the LP

[Linear program of Example 1.9]

The optimum is x* = (2, 2)^T with cost z* = 6. Starting from the interior point x0 = (1, 1)^T and without a safety factor, the algorithm performs two iterations: it considers the point x1 = (2, 1.591)^T, followed by the point x*.

1.8. Duality


