optim: General-purpose Optimization

stats (version 3.6.2)


Description


General-purpose optimization based on Nelder--Mead, quasi-Newton and conjugate-gradient algorithms. It includes an option for box-constrained optimization and simulated annealing.

Usage

optim(par, fn, gr = NULL, ...,
      method = c("Nelder-Mead", "BFGS", "CG", "L-BFGS-B", "SANN",
                 "Brent"),
      lower = -Inf, upper = Inf,
      control = list(), hessian = FALSE)

optimHess(par, fn, gr = NULL, ..., control = list())

Arguments

par

Initial values for the parameters to be optimized over.

fn

A function to be minimized (or maximized), with first argument the vector of parameters over which minimization is to take place. It should return a scalar result.

gr

A function to return the gradient for the "BFGS", "CG" and "L-BFGS-B" methods. If it is NULL, a finite-difference approximation will be used.

For the "SANN" method it specifies a function to generate a new candidate point. If it is NULL a default Gaussian Markov kernel is used.

...

Further arguments to be passed to fn and gr.

method

The method to be used. See ‘Details’. Can be abbreviated.

lower, upper

Bounds on the variables for the "L-BFGS-B" method, or bounds in which to search for method "Brent".

control

a list of control parameters. See ‘Details’.

hessian

Logical. Should a numerically differentiated Hessian matrix be returned?

Value

For optim, a list with components:

par

The best set of parameters found.

value

The value of fn corresponding to par.

counts

A two-element integer vector giving the number of calls to fn and gr respectively. This excludes those calls needed to compute the Hessian, if requested, and any calls to fn to compute a finite-difference approximation to the gradient.

convergence

An integer code. 0 indicates successful completion (which is always the case for "SANN" and "Brent"). Possible error codes are

1

indicates that the iteration limit maxit had been reached.

10

indicates degeneracy of the Nelder--Mead simplex.

51

indicates a warning from the "L-BFGS-B" method; see component message for further details.

52

indicates an error from the "L-BFGS-B" method; see component message for further details.

message

A character string giving any additional information returned by the optimizer, or NULL.

hessian

Only if argument hessian is true. A symmetric matrix giving an estimate of the Hessian at the solution found. Note that this is the Hessian of the unconstrained problem even if the box constraints are active.

For optimHess, the description of the hessian component applies.

Details

Note that arguments after ... must be matched exactly.

By default optim performs minimization, but it will maximize if control$fnscale is negative. optimHess is an auxiliary function to compute the Hessian at a later stage if hessian = TRUE was forgotten.
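For example, a maximum can be found by flipping the sign via fnscale, and the Hessian recovered afterwards with optimHess. A minimal sketch (the function f below is illustrative, not from this page):

## Maximize a concave quadratic by setting fnscale = -1
f <- function(x) -(x[1] - 1)^2 - (x[2] + 2)^2
res <- optim(c(0, 0), f, control = list(fnscale = -1))
res$par  # close to c(1, -2), the maximizer

## Hessian of f at the solution, computed after the fact
optimHess(res$par, f)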

The default method is an implementation of that of Nelder and Mead (1965), that uses only function values and is robust but relatively slow. It will work reasonably well for non-differentiable functions.
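As a sketch (the objective below is illustrative), the default method copes with a kinked, non-differentiable objective:

## Nelder-Mead uses no gradients, so an absolute-value objective is acceptable
f_abs <- function(x) sum(abs(x - c(2, -3)))
optim(c(0, 0), f_abs)$par  # approximately c(2, -3)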

Method "BFGS" is a quasi-Newton method (also known as a variable metric algorithm), specifically that published simultaneously in 1970 by Broyden, Fletcher, Goldfarb and Shanno. This uses function values and gradients to build up a picture of the surface to be optimized.

Method "CG" is a conjugate gradients method based on that by Fletcher and Reeves (1964) (but with the option of Polak--Ribiere or Beale--Sorenson updates). Conjugate gradient methods will generally be more fragile than the BFGS method, but as they do not store a matrix they may be successful in much larger optimization problems.

Method "L-BFGS-B" is that of Byrd et. al. (1995) which allows box constraints, that is each variable can be given a lower and/or upper bound. The initial value must satisfy the constraints. This uses a limited-memory modification of the BFGS quasi-Newton method. If non-trivial bounds are supplied, this method will be selected, with a warning.

Nocedal and Wright (1999) is a comprehensive reference for the previous three methods.

Method "SANN" is by default a variant of simulated annealing given in Belisle (1992). Simulated-annealing belongs to the class of stochastic global optimization methods. It uses only function values but is relatively slow. It will also work for non-differentiable functions. This implementation uses the Metropolis function for the acceptance probability. By default the next candidate point is generated from a Gaussian Markov kernel with scale proportional to the actual temperature. If a function to generate a new candidate point is given, method "SANN" can also be used to solve combinatorial optimization problems. Temperatures are decreased according to the logarithmic cooling schedule as given in Belisle (1992, p.890); specifically, the temperature is set to temp / log(((t-1) %/% tmax)*tmax + exp(1)), where t is the current iteration step and temp and tmax are specifiable via control, see below. Note that the "SANN" method depends critically on the settings of the control parameters. It is not a general-purpose method but can be very useful in getting to a good value on a very rough surface.

Method "Brent" is for one-dimensional problems only, using optimize(). It can be useful in cases where optim() is used inside other functions where only method can be specified, such as in mle from package stats4.

Function fn can return NA or Inf if the function cannot be evaluated at the supplied value, but the initial value must have a computable finite value of fn. (Except for method "L-BFGS-B" where the values should always be finite.)
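For instance (an illustrative sketch), an objective defined only for positive arguments can return Inf elsewhere, as long as the starting value lies in the domain:

## x - log(x) is minimized at x = 1 and undefined for x <= 0
f_pos <- function(x) if (x <= 0) Inf else x - log(x)
optim(2, f_pos, method = "BFGS")$par  # close to 1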

optim can be used recursively, and for a single parameter as well as many. It also accepts a zero-length par, and just evaluates the function with that argument.

The control argument is a list that can supply any of the following components (a combined usage sketch follows at the end of this section):

trace

Non-negative integer. If positive, tracing information on the progress of the optimization is produced. Higher values may produce more tracing information: for method "L-BFGS-B" there are six levels of tracing. (To understand exactly what these do see the source code: higher levels give more detail.)

fnscale

An overall scaling to be applied to the value of fn and gr during optimization. If negative, turns the problem into a maximization problem. Optimization is performed on fn(par)/fnscale.

parscale

A vector of scaling values for the parameters. Optimization is performed on par/parscale and these should be comparable in the sense that a unit change in any element produces about a unit change in the scaled value. Not used (nor needed) for method = "Brent".

ndeps

A vector of step sizes for the finite-difference approximation to the gradient, on par/parscale scale. Defaults to 1e-3.

maxit

The maximum number of iterations. Defaults to 100 for the derivative-based methods, and 500 for "Nelder-Mead".

For "SANN" maxit gives the total number of function evaluations: there is no other stopping criterion. Defaults to 10000.

abstol

The absolute convergence tolerance. Only useful for non-negative functions, as a tolerance for reaching zero.

reltol

Relative convergence tolerance. The algorithm stops if it is unable to reduce the value by a factor of reltol * (abs(val) + reltol) at a step. Defaults to sqrt(.Machine$double.eps), typically about 1e-8.

alpha, beta, gamma

Scaling parameters for the "Nelder-Mead" method. alpha is the reflection factor (default 1.0), beta the contraction factor (0.5) and gamma the expansion factor (2.0).

REPORT

The frequency of reports for the "BFGS", "L-BFGS-B" and "SANN" methods if control$trace is positive. Defaults to every 10 iterations for "BFGS" and "L-BFGS-B", or every 100 temperatures for "SANN".

warn.1d.NelderMead

A logical indicating whether the (default) "Nelder-Mead" method should signal a warning when used for one-dimensional minimization. As the warning is sometimes inappropriate, you can suppress it by setting this option to FALSE.

type

for the conjugate-gradients method. Takes value 1 for the Fletcher--Reeves update, 2 for Polak--Ribiere and 3 for Beale--Sorenson.

lmm

is an integer giving the number of BFGS updates retained in the "L-BFGS-B" method. It defaults to 5.

factr

controls the convergence of the "L-BFGS-B" method. Convergence occurs when the reduction in the objective is within this factor of the machine tolerance. Default is 1e7, that is a tolerance of about 1e-8.

pgtol

helps control the convergence of the "L-BFGS-B" method. It is a tolerance on the projected gradient in the current search direction. This defaults to zero, when the check is suppressed.

temp

controls the "SANN" method. It is the starting temperature for the cooling schedule. Defaults to 10.

tmax

is the number of function evaluations at each temperature for the "SANN" method. Defaults to 10.

Any names given to par will be copied to the vectors passed to fn and gr. Note that no other attributes of par are copied over.

The parameter vector passed to fn has special semantics and may be shared between calls: the function should not change or copy it.
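A combined sketch of these points, using an illustrative Rosenbrock-style objective: names given to par survive in the result, and control tunes tracing and convergence:

## Named parameters plus several control settings
rosen <- function(x) 100 * (x["b"] - x["a"]^2)^2 + (1 - x["a"])^2
res <- optim(c(a = -1.2, b = 1), rosen, method = "BFGS",
             control = list(trace = 1, REPORT = 5,
                            maxit = 200, reltol = 1e-10))
res$par  # named vector: names "a" and "b" are preserved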

References

Belisle, C. J. P. (1992). Convergence theorems for a class of simulated annealing algorithms on R^d. Journal of Applied Probability, 29, 885--895. doi:10.2307/3214721.

Byrd, R. H., Lu, P., Nocedal, J. and Zhu, C. (1995). A limited memory algorithm for bound constrained optimization. SIAM Journal on Scientific Computing, 16, 1190--1208. doi:10.1137/0916069.

Fletcher, R. and Reeves, C. M. (1964). Function minimization by conjugate gradients. Computer Journal, 7, 148--154. doi:10.1093/comjnl/7.2.149.

Nash, J. C. (1990). Compact Numerical Methods for Computers. Linear Algebra and Function Minimisation. Adam Hilger.

Nelder, J. A. and Mead, R. (1965). A simplex algorithm for function minimization. Computer Journal, 7, 308--313. doi:10.1093/comjnl/7.4.308.

Nocedal, J. and Wright, S. J. (1999). Numerical Optimization. Springer.

See Also

nlm, nlminb.

optimize for one-dimensional minimization and constrOptim for constrained optimization.

Examples

require(graphics)

fr <- function(x) {   ## Rosenbrock Banana function
    x1 <- x[1]
    x2 <- x[2]
    100 * (x2 - x1 * x1)^2 + (1 - x1)^2
}
grr <- function(x) { ## Gradient of 'fr'
    x1 <- x[1]
    x2 <- x[2]
    c(-400 * x1 * (x2 - x1 * x1) - 2 * (1 - x1),
       200 *      (x2 - x1 * x1))
}
optim(c(-1.2,1), fr)
(res <- optim(c(-1.2,1), fr, grr, method = "BFGS"))
optimHess(res$par, fr, grr)
optim(c(-1.2,1), fr, NULL, method = "BFGS", hessian = TRUE)
## These do not converge in the default number of steps
optim(c(-1.2,1), fr, grr, method = "CG")
optim(c(-1.2,1), fr, grr, method = "CG", control = list(type = 2))
optim(c(-1.2,1), fr, grr, method = "L-BFGS-B")

flb <- function(x)
    { p <- length(x); sum(c(1, rep(4, p-1)) * (x - c(1, x[-p])^2)^2) }
## 25-dimensional box constrained
optim(rep(3, 25), flb, NULL, method = "L-BFGS-B",
      lower = rep(2, 25), upper = rep(4, 25)) # par[24] is *not* at boundary


## "wild" function , global minimum at about -15.81515
fw <- function (x)
    10*sin(0.3*x)*sin(1.3*x^2) + 0.00001*x^4 + 0.2*x+80
plot(fw, -50, 50, n = 1000, main = "optim() minimising 'wild function'")

res <- optim(50, fw, method = "SANN",
             control = list(maxit = 20000, temp = 20, parscale = 20))
res
## Now improve locally {typically only by a small bit}:
(r2 <- optim(res$par, fw, method = "BFGS"))
points(r2$par,  r2$value,  pch = 8, col = "red", cex = 2)

## Combinatorial optimization: Traveling salesman problem
library(stats) # normally loaded

eurodistmat <- as.matrix(eurodist)

distance <- function(sq) {  # Target function
    sq2 <- embed(sq, 2)
    sum(eurodistmat[cbind(sq2[,2], sq2[,1])])
}

genseq <- function(sq) {  # Generate new candidate sequence
    idx <- seq(2, NROW(eurodistmat)-1)
    changepoints <- sample(idx, size = 2, replace = FALSE)
    tmp <- sq[changepoints[1]]
    sq[changepoints[1]] <- sq[changepoints[2]]
    sq[changepoints[2]] <- tmp
    sq
}

sq <- c(1:nrow(eurodistmat), 1)  # Initial sequence: alphabetic
distance(sq)
# rotate for conventional orientation
loc <- -cmdscale(eurodist, add = TRUE)$points
x <- loc[,1]; y <- loc[,2]
s <- seq_len(nrow(eurodistmat))
tspinit <- loc[sq,]

plot(x, y, type = "n", asp = 1, xlab = "", ylab = "",
     main = "initial solution of traveling salesman problem", axes = FALSE)
arrows(tspinit[s,1], tspinit[s,2], tspinit[s+1,1], tspinit[s+1,2],
       angle = 10, col = "green")
text(x, y, labels(eurodist), cex = 0.8)

set.seed(123) # chosen to get a good soln relatively quickly
res <- optim(sq, distance, genseq, method = "SANN",
             control = list(maxit = 30000, temp = 2000, trace = TRUE,
                            REPORT = 500))
res  # Near optimum distance around 12842

tspres <- loc[res$par,]
plot(x, y, type = "n", asp = 1, xlab = "", ylab = "",
     main = "optim() 'solving' traveling salesman problem", axes = FALSE)
arrows(tspres[s,1], tspres[s,2], tspres[s+1,1], tspres[s+1,2],
       angle = 10, col = "red")
text(x, y, labels(eurodist), cex = 0.8)

## 1-D minimization: "Brent" or optimize() is preferred, but Nelder-Mead may be
## "unavoidable" (e.g. inside other functions), so we suppress its check + warning:
system.time(rO <- optimize(function(x) (x-pi)^2, c(0, 10)))
system.time(ro <- optim(1, function(x) (x-pi)^2, control=list(warn.1d.NelderMead = FALSE)))
rO$minimum - pi # 0 (perfect), on one platform
ro$par - pi     # ~= 1.9e-4    on one platform
utils::str(ro)