A leading role in optimization theory, from both theoretical and practical points of view, is played by duality theory. A challenging and difficult undertaking in this field is to guarantee strong duality, the situation in which the optimal objective values of the primal and dual problems are equal and the dual problem has an optimal solution.
In this talk we deal with two classical duality concepts in convex optimization, the so-called Lagrange and Fenchel duality approaches, addressing some important questions usually raised in this context: regularity conditions, strong duality theorems, and the formulation of necessary and sufficient optimality conditions.
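For orientation, the standard Lagrange dual pair referred to above can be sketched as follows (this is textbook material, not the specific framework of the talk). For a convex program

\begin{equation*}
\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad g_i(x) \le 0,\ i = 1,\dots,m,
\end{equation*}

with $f, g_1, \dots, g_m$ convex, the Lagrangian and the dual problem are

\begin{align*}
L(x,\lambda) &= f(x) + \sum_{i=1}^m \lambda_i g_i(x), \\
q(\lambda) &= \inf_{x \in \mathbb{R}^n} L(x,\lambda), \qquad
\max_{\lambda \ge 0}\ q(\lambda).
\end{align*}

Weak duality, $\sup_{\lambda \ge 0} q(\lambda) \le \inf\{f(x) : g_i(x) \le 0\}$, always holds; a regularity condition of Slater type (the existence of some $\bar{x}$ with $g_i(\bar{x}) < 0$ for all $i$) is a classical sufficient condition for strong duality, i.e., equality of the two optimal values with the dual value attained.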
We also investigate some applications of Fenchel duality to regularization problems in machine learning. Moreover, we report on a successful real-world application of support vector machines with soft margins to the classification of image data sets.
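To make the soft-margin support vector machine concrete, the following is a minimal sketch of a linear soft-margin SVM trained by subgradient descent on the regularized hinge loss. The function names, the toy data, and the hyperparameters (`C`, `lr`, `epochs`) are illustrative choices, not taken from the talk; production work would use a dedicated solver.

```python
def train_soft_margin_svm(points, labels, C=1.0, lr=0.01, epochs=200):
    """Subgradient descent on the soft-margin SVM objective
    (1/2)||w||^2 + C * sum_i max(0, 1 - y_i * (w.x_i + b)),
    where labels y_i are +1 or -1."""
    dim = len(points[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        # Subgradient of the regularizer (1/2)||w||^2 is w itself.
        grad_w = list(w)
        grad_b = 0.0
        for x, y in zip(points, labels):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:
                # Point violates the margin: add the hinge-loss subgradient.
                grad_w = [gw - C * y * xi for gw, xi in zip(grad_w, x)]
                grad_b -= C * y
        w = [wi - lr * gw for wi, gw in zip(w, grad_w)]
        b -= lr * grad_b
    return w, b


def predict(w, b, x):
    """Classify a point by the sign of the decision function w.x + b."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
```

The parameter `C` is the usual soft-margin trade-off: large `C` penalizes margin violations heavily, small `C` tolerates them in exchange for a wider margin.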