
A New Method of Locating the Maximum Point of an Arbitrary Multipeak Curve in the Presence of Noise

Published on Jan 1, 1964 in Journal of Basic Engineering · DOI: 10.1115/1.3653121
Harold J. Kushner (Estimated H-index: 6)
📖 Papers frequently viewed together: among them, a 1974 single-author paper by Jonas Mockus.
References (0)

Cited by (352)
#1 R. Fuentes (University of Sheffield), H-index: 2
#2 Paul Gardner (University of Sheffield), H-index: 2
Last: Elizabeth Cross (University of Sheffield), H-index: 10
(8 authors in total)
Abstract: The use of robotics is beginning to play a key role in automating the data collection process in Non-Destructive Testing (NDT). Increased automation quickly leads to the gathering of large quantities of data, which makes it inefficient, perhaps even infeasible, for a human to parse the information they contain. This paper presents a solution to this problem by making the process of NDT data acquisition autonomous rather than merely automatic. In order to achiev...
Source
#1 Guillaume Briffoteaux (University of Mons)
#2 Maxime Gobert (University of Mons)
Last: Daniel Tuyttens (University of Mons), H-index: 14
(7 authors in total)
Abstract: Surrogate-based optimization is widely used to deal with long-running, black-box, simulation-based objective functions. The use of a surrogate model such as Kriging or an artificial neural network reduces the number of calls to the CPU-time-intensive simulator. Bayesian optimization exploits the surrogate's ability to provide useful information for guiding the optimization process effectively. In this paper, the Efficient Global Optimization (EGO) reference framework is ...
Source
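The abstract above describes the standard surrogate/acquisition loop behind EGO-style Bayesian optimization: fit a cheap model to the points evaluated so far, maximize an acquisition function such as Expected Improvement to choose the next expensive evaluation, and repeat. The sketch below illustrates only that generic loop (it is not the authors' code); it assumes scikit-learn's Gaussian process as the Kriging surrogate, a random-candidate acquisition search, and a toy expensive_objective standing in for the long-running simulator.

```python
# Minimal EGO-style Bayesian optimization sketch (illustrative, not the paper's code).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_objective(x):
    # Stand-in for a long-running black-box simulator (to be minimized).
    return np.sin(3.0 * x) + 0.1 * x ** 2

def expected_improvement(X_cand, gp, y_best):
    # EI for minimization: expected amount by which a candidate beats y_best.
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(5, 1))            # initial design
y = np.array([expensive_objective(x[0]) for x in X])

for _ in range(15):                                 # optimization budget
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    X_cand = rng.uniform(-2.0, 2.0, size=(500, 1))  # cheap acquisition search
    ei = expected_improvement(X_cand, gp, y.min())
    x_next = X_cand[np.argmax(ei)]
    y_next = expensive_objective(x_next[0])         # one expensive simulator call
    X, y = np.vstack([X, x_next]), np.append(y, y_next)

print("best x:", X[np.argmin(y)].item(), "best y:", y.min())
```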
Abstract: This work presents new software, implemented as a Python class, that provides a multi-objective Bayesian optimization algorithm. The proposed method is able to compute a Pareto front approximation of optimization problems with fewer objective function evaluations than other methods, which makes it appropriate for costly objectives. The software was extensively tested on benchmark optimization functions, and it was able to obtain Pareto front approximations for the benchma...
Source
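A core step in any multi-objective method like the one described above is deciding which evaluated points are non-dominated, i.e. lie on the Pareto front approximation. The snippet below is a small, generic non-dominated filter over already-evaluated objective vectors, shown only as an illustration of that step (it is not the software from the abstract) and assumes every objective is to be minimized.

```python
# Generic Pareto (non-dominated) filter for minimization (illustrative only).
import numpy as np

def pareto_front(Y):
    """Return a boolean mask of non-dominated rows of Y (each row holds one
    point's objective values; smaller is better for every objective)."""
    Y = np.asarray(Y)
    mask = np.ones(Y.shape[0], dtype=bool)
    for i in range(Y.shape[0]):
        # A point is dominated if some other point is <= in every objective
        # and strictly < in at least one.
        dominates_i = np.all(Y <= Y[i], axis=1) & np.any(Y < Y[i], axis=1)
        if dominates_i.any():
            mask[i] = False
    return mask

# Usage: three candidate designs evaluated on two costly objectives.
Y = np.array([[1.0, 4.0],
              [2.0, 2.0],
              [3.0, 3.0]])    # dominated by [2.0, 2.0]
print(pareto_front(Y))         # -> [ True  True False]
```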
#1 Xuxi Yang (Iowa State University)
#2 M. Egorov (EADS: Airbus), H-index: 13
Last: Peng Wei (Iowa State University)
(5 authors in total)
Source
We propose a novel Bayesian method to solve the maximization of a time-dependent, expensive-to-evaluate oracle. We are interested in the decision that maximizes the oracle at a finite time horizon, when relatively few noisy evaluations can be performed before the horizon. Our recursive, two-step lookahead expected payoff (r2LEY) acquisition function makes nonmyopic decisions at every stage by maximizing the estimated expected value of the oracle at the horizon. r2LEY circumv...
In this study, self-optimization of a grinding machine is demonstrated with respect to production costs, while fulfilling quality and safety constraints. The quality requirements of the final workpiece are defined with respect to grinding burn and surface roughness, and the safety constraints are defined with respect to the temperature at the grinding surface. Grinding temperature is measured at the contact zone between the grinding wheel and workpiece using a pyrometer and an optical fiber, whi...
Source
#2 Tim Wildey (SNL: Sandia National Laboratories), H-index: 12
Last: Scott McCann, H-index: 5
(3 authors in total)
Source
#1 Alonso Marco (MPG: Max Planck Society), H-index: 5
#2 Alexander von Rohr, H-index: 1
Last: Sebastian Trimpe, H-index: 13
(5 authors in total)
When learning to ride a bike, a child falls down a number of times before achieving the first success. As falling down usually has only mild consequences, it can be seen as a tolerable failure in exchange for a faster learning process, as it provides rich information about an undesired behavior. In the context of Bayesian optimization under unknown constraints (BOC), typical strategies for safe learning explore conservatively and avoid failures by all means. On the other side of the spectrum, no...
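A common reference point in Bayesian optimization under unknown constraints, against which safe or failure-tolerant strategies such as the one above are usually contrasted, is to weight a candidate's expected improvement by its modeled probability of being feasible. The sketch below shows only that generic baseline acquisition, not the authors' method; gp_obj and gp_con are assumed to be already-fitted scikit-learn Gaussian process models of the objective and of a constraint g(x) that must stay at or below zero.

```python
# Constraint-weighted expected improvement: a standard baseline acquisition
# for BO under unknown constraints (illustrative, not the paper's method).
import numpy as np
from scipy.stats import norm

def constrained_ei(X_cand, gp_obj, gp_con, y_best):
    """Expected improvement (minimization) times the probability that the
    modeled constraint g(x) <= 0 is satisfied at each candidate."""
    mu, sigma = gp_obj.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    ei = (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    mu_c, sigma_c = gp_con.predict(X_cand, return_std=True)
    sigma_c = np.maximum(sigma_c, 1e-9)
    prob_feasible = norm.cdf((0.0 - mu_c) / sigma_c)   # P[g(x) <= 0]

    return ei * prob_feasible
```

Here y_best is the best objective value observed so far at a point known to be feasible.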
#1 Paul Kent (Warw.: University of Warwick)
#2 Jürgen Branke (Warw.: University of Warwick), H-index: 41
(2 authors in total)
Quality Diversity (QD) algorithms such as MAP-Elites are a class of optimisation techniques that attempt to find a set of high-performing points from an objective function while enforcing behavioural diversity of the points over one or more interpretable, user-chosen feature functions. In this paper we propose the Bayesian Optimisation of Elites (BOP-Elites) algorithm, which uses techniques from Bayesian Optimisation to explicitly model both quality and diversity with Gaussian Processes. By consi...
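For readers unfamiliar with the MAP-Elites family that BOP-Elites builds on, the loop below is a deliberately plain, non-Bayesian MAP-Elites sketch: discretize a user-chosen feature into niches and keep the best point found in each niche. The objective and feature functions are made up for illustration; BOP-Elites itself replaces the random proposals with Gaussian-process-guided selection, which this sketch does not attempt.

```python
# Plain MAP-Elites archive over one feature dimension (illustrative only;
# BOP-Elites replaces the random proposals with GP-based acquisition).
import numpy as np

def objective(x):            # quality to maximize (toy example)
    return -np.sum((x - 0.3) ** 2)

def feature(x):              # user-chosen behavioural descriptor in [0, 1]
    return float(np.mean(x))

n_niches = 10
archive = {}                 # niche index -> (fitness, point)
rng = np.random.default_rng(0)

for _ in range(2000):
    x = rng.uniform(0.0, 1.0, size=2)                       # random proposal
    fit = objective(x)
    niche = min(int(feature(x) * n_niches), n_niches - 1)   # which niche it lands in
    # Keep the elite: replace the niche occupant only if this point is better.
    if niche not in archive or fit > archive[niche][0]:
        archive[niche] = (fit, x)

for niche in sorted(archive):
    print(f"niche {niche}: best fitness {archive[niche][0]:.3f}")
```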
Apr 30, 2020 in ICLR (International Conference on Learning Representations)
#1 Michael Volpp (Bosch), H-index: 1
Last: Christian Daniel (Bosch), H-index: 10
(7 authors in total)
Transferring knowledge across tasks to improve data-efficiency is one of the key open challenges in the area of global optimization algorithms. Readily available algorithms are typically designed to be universal optimizers and are thus often suboptimal for specific tasks. We propose a novel transfer-learning method to obtain customized optimizers within the well-established framework of Bayesian optimization, allowing our algorithm to utilize the proven generalization capabilities of Gaussian proc...
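The simplest way to move knowledge between related tasks in this setting, and a natural point of comparison for the learned optimizers proposed above, is to warm-start the surrogate of a new task with evaluations collected on a related source task. The fragment below sketches only that naive warm-start baseline, not the paper's transfer-learning method; X_source and y_source are hypothetical evaluations of a related task, assumed to be rescaled to the target task's domain.

```python
# Naive warm-start transfer for BO: seed the target task's GP with data
# from a related source task (a simple baseline, not the paper's method).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Hypothetical evaluations from a related, already-solved task, rescaled
# to the target task's input domain and output scale.
X_source = np.linspace(-2.0, 2.0, 20).reshape(-1, 1)
y_source = np.sin(3.0 * X_source).ravel()

# A handful of (expensive) evaluations on the new target task.
X_target = np.array([[-1.5], [0.0], [1.2]])
y_target = np.sin(3.0 * X_target).ravel() + 0.1 * X_target.ravel() ** 2

# Fit one surrogate on the pooled data so the target-task search starts
# from a shaped prior instead of a flat one.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(np.vstack([X_source, X_target]), np.concatenate([y_source, y_target]))

mu, std = gp.predict(np.array([[0.5]]), return_std=True)
print(f"posterior at x=0.5: mean {mu[0]:.3f}, std {std[0]:.3f}")
```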