
Global Journal of Technology and Optimization

ISSN: 2229-8711

Open Access

Mini Review - (2022) Volume 13, Issue 7

Solutions to Global Optimization Problems

Shaaban A. Abdallah*
*Correspondence: Shaaban A. Abdallah, Department of Aerospace Engineering and Engineering Mechanics, University of Cincinnati, Woodside Drive, Cincinnati, USA, Email:
Department of Aerospace Engineering and Engineering Mechanics, University of Cincinnati, Woodside Drive, Cincinnati, USA

Received: 02-Jul-2022, Manuscript No. gjto-22-78586; Editor assigned: 04-Jul-2022, Pre QC No. P-78586; Reviewed: 16-Jul-2022, QC No. Q-78586; Revised: 21-Jul-2022, Manuscript No. R-78586; Published: 28-Jul-2022, DOI: 10.37421/2229-8711.2022.13.304
Citation: Abdallah, Shaaban A. “Solutions to Global Optimization Problems.” Glob J Tech Optim 13 (2022): 304.
Copyright: © 2022 Abdallah SA. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Abstract

A solution is globally optimal when no other feasible solution has a better objective function value. A solution is locally optimal when no other feasible solution "close by" has a better objective function value. The objective function and/or constraints may place such a point at the top of a "peak" or at the bottom of a "valley", yet a higher peak or a deeper valley may exist far away from the current point.

Keywords

Valley • Optimal • Superior • Constraints

Introduction

To create highly effective technical systems and technological processes, researchers must select the best combination of the object's parameters (geometric dimensions, electrical characteristics, etc.), in addition to applying new principles, new materials, new physical effects and other solutions that determine the overall structure of the object being created. Scientists from UNN have also established a uniform convergence theorem for the new algorithm. The experimental portion of the work involved solving several hundred test problems of various dimensions, and the outcomes have persuasively shown that uniform convergence holds [1].

Additionally, UNN researchers consider computationally demanding global optimization problems whose solution may require exaflops-capable supercomputers. To tackle such computational complexity, they propose generalised parallel computational schemes that can incorporate multiple efficient parallel global optimization algorithms. The proposed schemes include techniques for the multilevel decomposition of parallel computations, addressing supercomputing systems with shared- and distributed-memory multiprocessors that employ thousands of processors.

These techniques heuristically "imitate" the principles of "survival of the fittest" and natural selection, which govern biological evolution. An adaptive search is performed using a "population" of candidate solution points. At each iteration, the weaker candidates are eliminated through a competitive selection process [2]. The remaining candidates with the highest "fitness value" are then "recombined" with other solutions by swapping out components; they can also be "mutated" by making minor changes to a candidate. Applying the recombination and mutation moves sequentially aims to produce new solutions that are biased towards subsets of D where good, albeit not necessarily globally optimal, solutions have previously been found.
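As an illustration of this scheme, a minimal genetic-style search sketch follows. It is not the algorithm of any cited work: the population size, mutation rate, crossover rule and two-variable test function are all arbitrary choices, and because the example minimises an objective, "fitter" here means a lower function value.

```python
import random

def evolve(objective, bounds, pop_size=30, generations=100, mut_rate=0.1):
    lo, hi = bounds
    # Initialise a random "population" of two-dimensional candidate points.
    pop = [[random.uniform(lo, hi) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        # Competitive selection: eliminate the weaker half of the population.
        pop.sort(key=objective)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            # Recombination: build a child by swapping out components.
            child = [random.choice(pair) for pair in zip(a, b)]
            # Mutation: a minor random change to one component, kept in bounds.
            if random.random() < mut_rate:
                i = random.randrange(len(child))
                child[i] = min(hi, max(lo, child[i] + random.gauss(0, 0.1)))
            children.append(child)
        pop = survivors + children
    return min(pop, key=objective)

# Illustrative multi-extremal objective with two global minima at (±√10, 0).
print(evolve(lambda x: (x[0] ** 2 - 10) ** 2 + x[1] ** 2, (-5.0, 5.0)))
```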

Literature Review

When searching for numerical solutions, multi-extremal optimization problems offer limited opportunities for analytic study and are computationally expensive, because costs grow exponentially with the problem's dimension: a uniform grid with 100 nodes per coordinate, for example, already contains 100^10 = 10^20 points in ten dimensions. Konstantin Barkalov, an associate professor in the UNN Department of Software and Supercomputer Technologies, notes that contemporary parallel computing systems broaden the range of techniques available for global optimization, but also pose the challenge of efficiently parallelizing the search procedure [3].

In the (possible or known) presence of many local optima, the goal of global optimization is to identify the globally optimal solution of a (potentially nonlinear) model. Formally, global optimization seeks one or more global solutions to a constrained optimization model. Nonlinear models are used in a wide range of fields, including scientific modelling, advanced engineering design, biotechnology, data analysis, environmental management, financial planning, process control and risk management, and they frequently require a global search strategy to be solved.
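As one concrete, hedged illustration of a global search strategy, the sketch below applies SciPy's differential_evolution, a population-based global optimizer, to the Rastrigin function, a standard multi-extremal test case. This is a representative technique for such models, not the specific method of the cited work.

```python
import numpy as np
from scipy.optimize import differential_evolution

def rastrigin(x):
    # Many local minima; the unique global minimum is f(0, ..., 0) = 0.
    x = np.asarray(x)
    return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

# A local method started in the wrong basin would get stuck; a global
# search strategy explores the whole box [-5.12, 5.12]^2.
result = differential_evolution(rastrigin, bounds=[(-5.12, 5.12)] * 2, seed=1)
print(result.x, result.fun)  # expected: close to [0, 0] and 0
```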

Some examples of applications include designing acoustics equipment, planning cancer therapies, modelling chemical processes, data analysis, classification, and visualisation, economic and financial forecasting, managing environmental risks, designing industrial products, designing laser equipment, optimising numerical mathematics, and packing [4].

These techniques are founded on statistical assumptions that allow an a priori stochastic description of the function class being modelled. The features of the problem instance are adaptively estimated and updated during optimization. Note that, typically, only the corresponding one-dimensional model is developed exactly; moreover, in most practical situations the search is guided by "myopic" approximations. This general strategy can be applied to (essentially) continuous global optimization problems. In theory, convergence to the optimal solution set is guaranteed only by generating an everywhere-dense set of search points. One evident challenge of employing statistical methods is the selection and validation of a "suitable" statistical model for the class of problems to which they are applied [5].
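A minimal sketch of such a statistically guided, "myopic" one-dimensional search follows. It assumes, as my illustrative choices rather than the source's, a Gaussian-process model as the stochastic description and expected improvement as the myopic sampling criterion; the objective, candidate grid and kernel are likewise illustrative.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):
    # Stand-in for an expensive multi-extremal objective on [0, 2*pi].
    return np.sin(3 * x) + 0.5 * x

grid = np.linspace(0, 2 * np.pi, 400).reshape(-1, 1)   # candidate points
X = np.array([[0.5], [3.0], [5.5]])                    # initial design
y = f(X).ravel()

for _ in range(15):
    # Statistical model of the function, refitted after each evaluation.
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                  normalize_y=True).fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.min()
    # Myopic criterion: expected improvement over the incumbent best value.
    z = (best - mu) / np.maximum(sigma, 1e-12)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = grid[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[0]))

print(X[np.argmin(y)], y.min())   # estimate of the global minimiser
```

The everywhere-dense-sampling caveat from the text applies here too: with a finite grid and a purely myopic criterion, the loop can stall in a good local basin rather than the global one.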

Discussion

Consider a parallel system in which each process addresses one problem from a series. First, there will be an imbalance in the workload between the processors: if addressing the i-th problem requires significantly fewer iterations of the procedure than solving the j-th problem, the processor responsible for the i-th problem will sit idle after completing its task. Second, the estimates of the optima will be produced for the various problems with varying degrees of certainty: simpler problems will be solved with greater precision than more complex ones [6].
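A small sketch of the first issue, load imbalance, is shown below; it is an illustrative remedy, not the UNN scheme, and the problem family is a stand-in. Rather than statically assigning one problem per processor, a dynamic worker pool hands each idle process the next unsolved problem from the series, so a processor that finishes an easy problem immediately picks up more work.

```python
from multiprocessing import Pool
from scipy.optimize import differential_evolution

def solve(problem_id):
    # Stand-in problem family: instances of varying difficulty.
    f = lambda x: sum((xi ** 2 - problem_id) ** 2 for xi in x)
    res = differential_evolution(f, bounds=[(-5, 5)] * 3, seed=problem_id)
    return problem_id, res.fun

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # imap_unordered yields results as workers finish, so cheap problems
        # do not leave their processor idle while expensive ones still run.
        for pid, fval in pool.imap_unordered(solve, range(12)):
            print(f"problem {pid}: optimum estimate {fval:.3g}")
```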

In contrast to Multistart methods, which rely on random sampling of starting points, continuous Branch and Bound methods systematically divide the feasible region into successively smaller subregions and find locally optimal solutions in each subregion. The best of these locally optimal solutions is proposed as the global optimum. Although continuous Branch and Bound methods carry a theoretical guarantee of convergence to the globally optimal solution, for problems with more than a few variables this guarantee typically cannot be realised in a reasonable length of time. Therefore, to increase performance, many continuous Branch and Bound algorithms also include some form of random or statistical sampling [7].
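A minimal one-dimensional continuous Branch and Bound sketch follows, assuming a known Lipschitz constant so that each subregion [a, b] gets the lower bound f(mid) − L·(b − a)/2. Real solvers use far more sophisticated bounding; the tolerance, objective and constant here are illustrative.

```python
import heapq
import math

def branch_and_bound(f, lo, hi, lipschitz, tol=1e-4):
    # Lower bound on f over [a, b], valid whenever |f'| <= lipschitz.
    def lower_bound(a, b):
        return f((a + b) / 2) - lipschitz * (b - a) / 2

    best_x, best_f = (lo + hi) / 2, f((lo + hi) / 2)
    heap = [(lower_bound(lo, hi), lo, hi)]        # subregions keyed by bound
    while heap:
        bound, a, b = heapq.heappop(heap)
        if bound > best_f - tol:                  # prune: cannot improve
            continue
        mid = (a + b) / 2
        if f(mid) < best_f:                       # update the incumbent
            best_x, best_f = mid, f(mid)
        for sub in ((a, mid), (mid, b)):          # branch: split the region
            heapq.heappush(heap, (lower_bound(*sub), *sub))
    return best_x, best_f

# |d/dx (sin(3x) + 0.5x)| <= 3.5, so lipschitz=3.5 gives valid bounds.
print(branch_and_bound(lambda x: math.sin(3 * x) + 0.5 * x, 0.0, 6.0, 3.5))
```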

The fundamental concept behind this broad category of meta-heuristics is to "forbid" search moves to recently visited points in the (often discrete) search space, at least for the next few steps. As a consequence, the search may temporarily accept inferior new solutions in order to avoid retracing already explored paths. This strategy drives the exploration of new regions of D, with the intention of conducting a "globalised" search for a solution. Traditional applications of tabu search in combinatorial optimization include scheduling, routing and travelling salesman problems. A discrete approximation (encoding) of the problem can, at least in principle, make the technique directly applicable to continuous global optimization problems, though other extensions are also feasible.
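The sketch below illustrates the idea on a discretised one-dimensional search space. The neighbourhood, tabu-list length and objective are arbitrary illustrative choices; practical tabu search implementations add aspiration criteria and longer-term memory.

```python
from collections import deque

def tabu_search(f, start, steps=200, tabu_len=10):
    current = best = start
    tabu = deque([start], maxlen=tabu_len)   # recently visited, "forbidden"
    for _ in range(steps):
        # Integer-grid neighbourhood of the current point.
        neighbours = [current + d for d in (-2, -1, 1, 2)]
        allowed = [n for n in neighbours if n not in tabu]
        if not allowed:
            break
        # Move to the best non-tabu neighbour even if it is worse than the
        # current point; this is what lets the search escape local optima.
        current = min(allowed, key=f)
        tabu.append(current)
        if f(current) < f(best):
            best = current
    return best

# Two basins near n = -5 and n = +5; the global optimum is at n = -5.
print(tabu_search(lambda n: (n * n - 25) ** 2 + n, start=0))
```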

Conclusion

Parallel tempering, also known as replica exchange MCMC sampling, is a simulation technique that aims to improve the dynamic properties of Markov chain Monte Carlo (MCMC) sampling methods and of Monte Carlo simulations of physical systems. The replica exchange method was originally devised by Swendsen, then extended by Geyer and later developed by, among others, Giorgio Parisi. Sugita and Okamoto also formulated a parallel tempering approach for molecular dynamics, commonly referred to as replica-exchange molecular dynamics, or REMD. Essentially, N randomly initialised replicas of the system are run at different temperatures; configurations at different temperatures are then swapped in accordance with the Metropolis criterion.
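A minimal parallel tempering sketch is given below, with illustrative temperatures, step sizes and energy landscape. Each replica takes ordinary Metropolis steps at its own temperature, and neighbouring replicas periodically attempt a configuration swap accepted with probability min(1, exp[(β_i − β_j)(E_i − E_j)]), where β = 1/T.

```python
import math
import random

def energy(x):
    # Rugged one-dimensional landscape with several wells.
    return (x * x - 1) ** 2 + 0.3 * math.sin(8 * x)

temps = [0.05, 0.2, 0.8, 3.0]                 # temperature ladder
replicas = [random.uniform(-2, 2) for _ in temps]

for step in range(20000):
    # Ordinary Metropolis move within each replica.
    for i, T in enumerate(temps):
        prop = replicas[i] + random.gauss(0, 0.3)
        dE = energy(prop) - energy(replicas[i])
        if random.random() < math.exp(min(0.0, -dE / T)):
            replicas[i] = prop
    # Periodically attempt a swap between neighbouring temperatures.
    if step % 10 == 0:
        i = random.randrange(len(temps) - 1)
        d = (1 / temps[i] - 1 / temps[i + 1]) * \
            (energy(replicas[i]) - energy(replicas[i + 1]))
        if random.random() < math.exp(min(0.0, d)):
            replicas[i], replicas[i + 1] = replicas[i + 1], replicas[i]

# The coldest replica should settle near a deep (ideally global) well.
print(replicas[0])
```

The hot replicas cross energy barriers easily, and the swaps hand those configurations down the ladder, which is how the method improves the dynamic behaviour of plain MCMC sampling.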

Acknowledgement

None.

Conflict of Interest

The author declares that there are no conflicts of interest.

References

1. Pardalos, Panos M., H. Edwin Romeijn and Hoang Tuy. "Recent developments and trends in global optimization." J Comput Appl Math 124 (2000): 209-228.

2. Ryoo, Hong S. and Nikolaos V. Sahinidis. "A branch-and-reduce approach to global optimization." J Glob Optim 8 (1996): 107-138.

3. Zhang, Yiying and Zhigang Jin. "Group teaching optimization algorithm: A novel metaheuristic method for solving global optimization problems." Expert Syst Appl 148 (2020): 113246.

4. Dua, Vivek, Katerina P. Papalexandri and Efstratios N. Pistikopoulos. "Global optimization issues in multiparametric continuous and mixed-integer optimization problems." J Glob Optim 8 (2004): 59-89.

5. Abdel-Raouf, Osama, Mohamed Abdel-Baset and Ibrahim El-henawy. "A new hybrid flower pollination algorithm for solving constrained global optimization problems." Adv Eng Technol & App 3 (2014): 1-9.

6. Hartman, James K. "Some experiments in global optimization." Nav Res Logist Q 20 (1973): 569-576.

7. Papamichail, Ioannis and Claire S. Adjiman. "Global optimization of dynamic systems." Comput Chem Eng (2004): 403-415.
