Localized Solutions of Nonlinear Stationary Problems

Author(s): S. N. Antontsev, J. I. Díaz, S. Shmarev

2020, Vol 10 (1), pp. 522-533
Author(s): Amanda S. S. Correa Leão, Joelma Morbach, Andrelino V. Santos, João R. Santos Júnior

Abstract: Some classes of generalized stationary Schrödinger problems are studied. Under appropriate conditions, the existence of at least $1 + \sum_{i=2}^{m} \dim V_{\lambda_i}$ pairs of nontrivial solutions is proved when a parameter involved in the equation is large enough, where $V_{\lambda_i}$ denotes the eigenspace associated with the $i$-th eigenvalue $\lambda_i$ of the Laplacian operator with homogeneous Dirichlet boundary conditions.
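For context, the eigenvalue problem referenced by the abstract is the classical Dirichlet eigenproblem; a minimal sketch in LaTeX, with the domain $\Omega$ assumed bounded (an assumption, since the abstract does not specify it):

    \[
      -\Delta \varphi = \lambda \varphi \ \text{in } \Omega,
      \qquad \varphi = 0 \ \text{on } \partial\Omega,
    \]
    with eigenvalues $\lambda_1 < \lambda_2 \le \lambda_3 \le \cdots \to \infty$
    and eigenspaces $V_{\lambda_i} = \ker(-\Delta - \lambda_i I)$, so that the
    multiplicity count in the solution bound is $1 + \sum_{i=2}^{m} \dim V_{\lambda_i}$.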


1963, Vol 18 (4), pp. 531-538
Author(s): Dallas T. Hayes

Localized solutions of the Bethe–Goldstone equation for two nucleons in nuclear matter are examined as a function of the center-of-mass momentum (c.m.m.) of the two nucleons. The equation depends on the c.m.m. as a parameter because the projection operator appearing in the equation depends on it. An analytical solution of the equation is obtained for a non-local but separable potential, from which a numerical solution is also obtained. An approximate solution for small c.m.m. is calculated for a square-well potential. Within the range of validity of the approximation, the two analytical solutions agree exactly.
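Schematically, the equation in question is commonly written in the following integral form (a textbook sketch; the notation here is assumed, not taken from the abstract):

    \[
      \psi_{\mathbf{P}} = \phi + \frac{Q_{\mathbf{P}}}{E - H_0}\, V\, \psi_{\mathbf{P}},
    \]
    where $\phi$ is the unperturbed two-nucleon state, $V$ the two-body
    interaction, and $Q_{\mathbf{P}}$ the Pauli projection operator onto states
    outside the filled Fermi sea; it is $Q_{\mathbf{P}}$ that carries the
    dependence on the center-of-mass momentum $\mathbf{P}$ noted above.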


Author(s): Andrew Jacobsen, Matthew Schlegel, Cameron Linke, Thomas Degris, Adam White, ...

This paper investigates different vector step-size adaptation approaches for non-stationary, online, continual prediction problems. Vanilla stochastic gradient descent can be considerably improved by scaling the update with a vector of appropriately chosen step-sizes. Many methods, including AdaGrad, RMSProp, and AMSGrad, keep statistics about the learning process to approximate a second-order update: a vector approximation of the inverse Hessian. Another family of approaches uses meta-gradient descent to adapt the step-size parameters to minimize prediction error. These meta-descent strategies are promising for non-stationary problems, but have not been explored as extensively as quasi-second-order methods. We first derive a general, incremental meta-descent algorithm, called AdaGain, designed to be applicable to a much broader range of algorithms, including those with semi-gradient updates or even those with accelerations, such as RMSProp. We provide an empirical comparison of methods from both families. We conclude that methods from both families can perform well, but in non-stationary prediction problems the meta-descent methods exhibit advantages. Our method is particularly robust across several prediction problems, and is competitive with the state-of-the-art method on a large-scale, time-series prediction problem on real data from a mobile robot.
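To make the two families concrete, here is a minimal Python/NumPy sketch (an illustration only, not the paper's AdaGain update, which the abstract does not specify): RMSProp as a representative quasi-second-order method, and IDBD (Sutton, 1992) as a classic meta-descent method for linear prediction.

    import numpy as np

    # Quasi-second-order family: RMSProp keeps a running average of squared
    # gradients and divides each coordinate's step by its square root, a
    # diagonal (vector) stand-in for inverse-curvature information.
    def rmsprop_step(w, grad, v, lr=1e-3, beta=0.999, eps=1e-8):
        v = beta * v + (1.0 - beta) * grad ** 2
        return w - lr * grad / (np.sqrt(v) + eps), v

    # Meta-descent family: IDBD (Sutton, 1992) does gradient descent on
    # per-weight log step-sizes to reduce squared prediction error.
    def idbd_step(w, h, log_alpha, x, target, theta=0.01):
        delta = target - w @ x                          # prediction error
        log_alpha = log_alpha + theta * delta * x * h   # meta-gradient step
        alpha = np.exp(log_alpha)                       # per-weight step-sizes
        w = w + alpha * delta * x                       # vector step-size update
        h = h * np.clip(1.0 - alpha * x ** 2, 0.0, None) + alpha * delta * x
        return w, h, log_alpha

    # Toy non-stationary usage: track a linear target whose true weights drift.
    rng = np.random.default_rng(0)
    w_true = np.ones(5)
    w, h, log_alpha = np.zeros(5), np.zeros(5), np.full(5, np.log(0.05))
    for t in range(2000):
        x = rng.normal(size=5)
        w_true += 0.001 * rng.normal(size=5)            # slow drift in the target
        w, h, log_alpha = idbd_step(w, h, log_alpha, x, w_true @ x)

The contrast the abstract draws is visible in the update rules: RMSProp adapts its vector step-size from gradient magnitudes alone, while IDBD adapts it in the direction that most reduces the prediction error itself, which is what makes meta-descent attractive when the target drifts.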


2013, Vol 110 (22)
Author(s): M. Avila, F. Mellibovsky, N. Roland, B. Hof
