Abstract: The degree of numerical linear independence is proposed and discussed. Based on this linear independence theory, a modified limited memory BFGS method is developed. Like the standard limited memory method, the new method determines the update by applying the updating formula m times to an initial positive diagonal matrix, using m previous pairs of changes in the iterate and the gradient. Besides the most recent pair, which guarantees quadratic termination, the choice of the other (m-1) pairs in the new method depends on the degree of numerical linear independence of the previous search directions. In addition, the numerical linear independence theory is discussed further and the computation of the degree of linear independence is simplified. Theoretical and numerical results show that the new modified method efficiently improves the standard limited memory method.
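The abstract does not spell out how the degree of numerical linear independence is computed, so the sketch below only illustrates the general idea under an assumed, simple orthogonality-based score over the stored search directions: besides the most recent correction pair, the remaining slots are filled with the pairs whose directions look most linearly independent of those already chosen, and the standard two-loop recursion is then applied to the selected pairs. All function names and the scoring rule are illustrative, not the authors' actual criterion.

```python
import numpy as np

def independence_scores(candidates, selected):
    """Score each candidate direction by the relative norm of its component
    orthogonal to the span of the already selected directions (an
    illustrative stand-in for a degree of numerical linear independence)."""
    Q, _ = np.linalg.qr(np.column_stack(selected))
    scores = []
    for d in candidates:
        resid = d - Q @ (Q.T @ d)
        scores.append(np.linalg.norm(resid) / (np.linalg.norm(d) + 1e-16))
    return scores

def select_pairs(s_list, y_list, d_list, m):
    """Keep the most recent (s, y) pair unconditionally, then fill the
    remaining m - 1 slots with the pairs whose search directions are the
    most linearly independent of those already chosen."""
    keep, chosen_dirs = [len(s_list) - 1], [d_list[-1]]
    candidates = list(range(len(s_list) - 1))
    while len(keep) < min(m, len(s_list)) and candidates:
        scores = independence_scores([d_list[i] for i in candidates], chosen_dirs)
        best = candidates[int(np.argmax(scores))]
        keep.append(best)
        chosen_dirs.append(d_list[best])
        candidates.remove(best)
    keep.sort()                                   # oldest-to-newest for the recursion
    return [s_list[i] for i in keep], [y_list[i] for i in keep]

def lbfgs_direction(grad, s_sel, y_sel, gamma=1.0):
    """Standard L-BFGS two-loop recursion applied to the selected pairs,
    starting from the initial matrix H0 = gamma * I."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_sel, y_sel)]
    alphas = []
    for s, y, rho in zip(reversed(s_sel), reversed(y_sel), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    r = gamma * q
    for (s, y, rho), a in zip(zip(s_sel, y_sel, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r = r + (a - b) * s
    return -r                                     # quasi-Newton search direction
```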
Funding: Financially supported by the National Important and Special Project on Science and Technology (2011ZX05005-005-007HZ) and the National Natural Science Foundation of China (No. 41274116).
Abstract: In full waveform inversion (FWI), Hessian information of the misfit function is of vital importance for accelerating the convergence of the inversion; however, it is usually not feasible to calculate the Hessian matrix and its inverse directly. Although the limited memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) and Hessian-free inexact Newton (HFN) methods are able to use approximate Hessian information, the information they collect is limited. The two methods can be interlaced because they are able to provide Hessian information for each other; however, the performance of the hybrid iterative method depends on an effective switch between the two methods. We have designed a new scheme to realize a dynamic switch between the two methods based on the decrease ratio (DR) of the misfit function (objective function), and we propose a modified hybrid iterative optimization method. In the new scheme, we compare the DR of the two methods for a given computational cost and choose the method with the faster DR, so the modified method always runs the more efficient of the two. Tests on the Marmousi and overthrust models indicate that our modified method converges significantly faster than the L-BFGS method, with no loss of inversion quality. Moreover, our modified method slightly outperforms the enriched method in convergence speed and is also more efficient than the HFN method.
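The abstract describes the switching rule only at a high level; the following minimal sketch, in the same spirit, tracks the most recent decrease ratio (misfit reduction per unit of computational cost) of each method and continues with whichever method has recently been more efficient. The callbacks lbfgs_step, hfn_step, and misfit stand in for a real FWI workflow, and the cost accounting and initial probing of both methods are assumptions of the example, not the authors' exact scheme.

```python
def hybrid_inversion(model, misfit, lbfgs_step, hfn_step, max_outer=30):
    """Dynamic switch between an L-BFGS step and a Hessian-free inexact
    Newton (HFN) step based on the decrease ratio (DR) of the misfit per
    unit of computational cost.  Each *_step callback returns
    (updated_model, cost)."""
    last_dr = {}
    # Probe each method once so that both have a DR on record.
    for name, step in (("lbfgs", lbfgs_step), ("hfn", hfn_step)):
        f_old = misfit(model)
        model, cost = step(model)
        last_dr[name] = (f_old - misfit(model)) / max(cost, 1e-12)
    current = max(last_dr, key=last_dr.get)       # start with the faster method
    for _ in range(max_outer):
        step = lbfgs_step if current == "lbfgs" else hfn_step
        f_old = misfit(model)
        model, cost = step(model)
        last_dr[current] = (f_old - misfit(model)) / max(cost, 1e-12)
        other = "hfn" if current == "lbfgs" else "lbfgs"
        if last_dr[other] > last_dr[current]:     # the other method was recently more efficient
            current = other
    return model
```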
Funding: Supported by NSF of China grant 10471036 and a 973 project.
Abstract: In this paper, we propose an algorithm for solving nonlinear monotone equations by combining the limited memory BFGS method (L-BFGS) with a projection method. We show that the method is globally convergent if the equation involves a Lipschitz continuous monotone function. We also present some preliminary numerical results.
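The abstract does not reproduce the algorithm's details, but L-BFGS/projection hybrids for monotone equations typically follow the hyperplane-projection framework; the sketch below shows that projection part, assuming the search direction d is supplied by an L-BFGS-type update of F. The line-search condition and parameter values are illustrative defaults, not the paper's specific choices.

```python
def projection_line_search(F, x, d, sigma=1e-4, beta=0.5, max_backtracks=30):
    """Backtracking search for a trial point z = x + t*d satisfying
    -F(z) @ d >= sigma * t * ||d||**2, the usual acceptance condition in
    projection methods for monotone equations."""
    t = 1.0
    for _ in range(max_backtracks):
        z = x + t * d
        if -(F(z) @ d) >= sigma * t * (d @ d):
            return z, t
        t *= beta
    return x + t * d, t

def hyperplane_projection(x, z, Fz):
    """Project the current iterate x onto the hyperplane
    {u : Fz @ (u - z) = 0}; by monotonicity this hyperplane separates x
    from the solution set, so the projection moves closer to a solution."""
    return x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
```

A full iteration would compute an L-BFGS-type direction d from stored correction pairs of F, call projection_line_search to obtain the trial point z, and take the next iterate as hyperplane_projection(x, z, F(z)).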
Funding: Supported by the National Natural Science Foundation of China (Grants 11001075 and 11161003), the Post-doctoral Foundation of China (grant 20090461094), and the Natural Science Foundation of the Henan Province Education Department (grant 2010B110004).
Abstract: In this paper, a modified limited memory BFGS method for solving large-scale unconstrained optimization problems is proposed. A remarkable feature of the proposed method is that it possesses the global convergence property without any convexity assumption on the objective function. Under some suitable conditions, the global convergence of the proposed method is proved. Some numerical results are reported which illustrate that the proposed method is efficient.
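The abstract does not state which modification removes the convexity requirement. A common safeguard with the same goal is a cautious memory update that only stores correction pairs satisfying a curvature condition; it is sketched below purely as an illustration of that idea, not as the authors' actual rule, and the threshold eps is an assumed parameter.

```python
def cautious_memory_update(s_list, y_list, s, y, m, eps=1e-6):
    """Store the new correction pair (s, y) only if it satisfies the
    curvature condition y @ s >= eps * ||s||**2, so that the implicit
    inverse Hessian approximation stays positive definite even when the
    objective is nonconvex; otherwise the pair is skipped.  Only the m
    most recent accepted pairs are kept."""
    if y @ s >= eps * (s @ s):
        s_list.append(s)
        y_list.append(y)
        if len(s_list) > m:
            s_list.pop(0)
            y_list.pop(0)
    return s_list, y_list
```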
Abstract: This paper presents a stochastic modification of a limited memory BFGS method to solve bound-constrained global minimization problems whose cost function is differentiable but has no further smoothness. The approach is a stochastic descent method in which the deterministic sequence generated by a limited memory BFGS method is replaced by a sequence of random variables. To enhance the performance of the proposed algorithm and make sure the perturbations lie within the feasible domain, we have developed a novel perturbation technique based on truncating a multivariate double exponential distribution to deal with bound-constrained problems; the theoretical study and the simulation of the developed truncated distribution are also presented. Theoretical results ensure that the proposed method converges almost surely to the global minimum. The performance of the algorithm is demonstrated through numerical experiments on some typical test functions as well as on some further engineering problems. Numerical comparisons with stochastic and meta-heuristic methods indicate that the suggested algorithm is promising.
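The paper develops its own truncated multivariate double exponential distribution; as a rough illustration of the underlying idea, the sketch below draws a feasible perturbation coordinate-wise from a Laplace (double exponential) distribution truncated to the bound constraints via the inverse-CDF method. The function names and the scale parameter are assumptions for the example, not the authors' construction.

```python
import numpy as np

def laplace_cdf(x, mu, b):
    """CDF of a univariate Laplace (double exponential) distribution."""
    z = (x - mu) / b
    return np.where(z < 0, 0.5 * np.exp(z), 1.0 - 0.5 * np.exp(-z))

def laplace_ppf(p, mu, b):
    """Inverse CDF (quantile function) of a univariate Laplace distribution."""
    q = p - 0.5
    return mu - b * np.sign(q) * np.log1p(-2.0 * np.abs(q))

def truncated_laplace_perturbation(x, lower, upper, scale, rng=None):
    """Draw a perturbed point around the current iterate x from coordinate-wise
    Laplace distributions truncated to the box [lower, upper], so the
    perturbation always stays within the feasible domain."""
    rng = np.random.default_rng() if rng is None else rng
    lo = laplace_cdf(lower, x, scale)
    hi = laplace_cdf(upper, x, scale)
    u = rng.uniform(lo, hi)                       # uniform between the truncated CDF bounds
    return laplace_ppf(u, x, scale)               # map back through the inverse CDF
```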