๐ŸงƒRoot Finding - Newton Raphson Method

MURRAIYA · December 3, 2022

๋‚˜๋Š” ์š”๊ทผ๋ž˜ ์„ ํ˜•์‹œ์Šคํ…œ์— ๋Œ€ํ•ด ๊ด€์‹ฌ์ด ๋งŽ์•˜๋Š”๋ฐ ์ด๋ฒˆ์—๋Š” linear๋ผ๋Š” ์กฐ๊ฑด์„ ํ•ด์ œํ•˜๊ณ  ์ •๋ง equation์˜ ํ•ด๋ฅผ ์ฐพ๋Š”๋‹ค๋Š” ๊ฒƒ์— ๋Œ€ํ•œ ์ด์•ผ๊ธฐ๋ฅผ ํ•ด๋ณด๊ณ ์ž ํ•œ๋‹ค. Optimization๊ณผ ๋งˆ์ฐฌ๊ฐ€์ง€๋กœ ํ•˜๋‚˜์˜ ๋ณด์žฅ๋œ ๋ฐฉ๋ฒ•์ด๋ž„ ๊ฒŒ ์—†๋‹ค.

  • Anything that takes an input $x$ and returns an output $y$ is a function. It may be the kind of mathematical formula we are used to, but even something made up of thousands of lines of code is just as much a function.
  • An equation may have no root, exactly one, finitely many, or infinitely many. The roots may also be real or complex.

๐ŸGOAL
How to find $x : f(x)=0$ for $f:\mathbb R^n \rightarrow \mathbb R^n$

The goal is to find the $x$ that satisfies $f(x)=0$ for a multivariable, vector-valued function $f$.

Iterative Method - Initial Guess & Stopping Criterion

Iterative methods are used for root finding.

An iterative method always needs a starting point (initial guess), and choosing it well is important. Starting from a bad initial guess, the iteration may fail to converge or may converge to a wrong solution.

์‹œ์Šคํ…œ ํŠน์„ฑ์„ ์ž˜ ํŒŒ์•…ํ•˜๋“  ๋ญ˜ ํ•˜๋“ ๊ฐ„์— ์“ธ ์ˆ˜ ์žˆ๋Š” ์กฐ๊ฑด์„ ๋ชจ๋‘ ๋ชจ์•„ ๊ทผ์ด ์กด์žฌํ•  ๊ฐ€๋Šฅ์„ฑ์ด ๋†’์€ ๊ณณ์—์„œ startํ•˜๋Š” ๊ฒƒ์ด ์ข‹๋‹ค.

Iterate๋ฅผ ํ‰์ƒ ํ•  ์ˆ˜๋Š” ์—†์œผ๋‹ˆ ์–ธ์ œ ๋ฉˆ์ถœ์ง€ ๊ฒฐ์ •ํ•˜๋Š” stopping criterion๋„ ํ•„์š”ํ•˜๋‹ค.

Newton-Raphson Method

equation์˜ root๋ฅผ ์ฐพ๋Š” ๋ฐ ๋ณด์žฅ๋œ ๋ฐฉ๋ฒ•์ด๋ž„ ๊ฒƒ์€ ์—†์ง€๋งŒ ๊ทธ๋ž˜๋„ ์•„์ฃผ ๊ดœ์ฐฎ์€ ๋ฐฉ๋ฒ•์€ ์žˆ๋‹ค. ๊ทธ๊ฒŒ Newton-Rapson Method์ด๋‹ค. ๋‹จ์ ์ด๋ผ๋ฉด f(x)f(x)์™€ fโ€ฒ(x)f^\prime(x)๋ฅผ ์ด์šฉํ•˜๊ธฐ ๋•Œ๋ฌธ์— fโ€ฒ(x)f^\prime(x)๋ฅผ ์ •์˜ํ•  ์ˆ˜ ์žˆ์–ด์•ผ ํ•œ๋‹ค๋Š” ๊ฒƒ์ด๋‹ค. ์œ„์—์„œ ๋งํ–ˆ๋“ฏ์ด functioin์ด๋ผ๋Š” ๊ฒƒ์€ ์ž…๋ ฅ๊ณผ ๊ทธ์— ๋”ฐ๋ฅธ ์ถœ๋ ฅ๋งŒ ๋ถ„๋ช…ํžˆ ์ •์˜๋˜๋ฉด ๋˜๊ณ , ํ•จ์ˆ˜๋ผ๋Š” ๊ฒƒ์ด ๋ฏธ๋ถ„๊ฐ€๋Šฅํ•˜๋‹ค๋Š” ์–˜๊ธฐ๋Š” ์•„๋‹ˆ๋‹ค.

๊ธฐ๋ณธ์ ์œผ๋กœ iteration method์ด๊ธฐ ๋•Œ๋ฌธ์— update๋ฅผ ํ•ด ๊ฐ„๋‹ค.

x(0)โ†’x(1)โ†’โ‹ฏโ†’x(k)โ†’x(k+1)โ†’โ€ฆx^{(0)}\rightarrow x^{(1)}\rightarrow \dots \rightarrow x^{(k)}\rightarrow x^{(k+1)} \rightarrow \dots

The question is how to update from $x^{(k)}$ to $x^{(k+1)}$, and there are many variations.


iteration์˜ basic form์€ ์•„๋ž˜์™€ ๊ฐ™๋‹ค.

x(k+1)=x(k)+ฮฑ(k)ฮ”x(k)x^{(k+1)}=x^{(k)} +\alpha^{(k)}\Delta x^{(k)}

ฮฑ(k)\alpha^{(k)}๋Š” step size, ฮ”x(k)\Delta x^{(k)}๋Š” update direction์ด๋‹ค.

Exact Gradient Method

Newton Raphson Formula:
$$x^{(k+1)}=x^{(k)} -{f(x^{(k)}) \over f^\prime(x^{(k)})}$$

This comes from the first-order Taylor series expansion $f(x^{(k+1)}) \approx f(x^{(k)}) + f^\prime(x^{(k)})\,(x^{(k+1)}-x^{(k)})$: setting the right-hand side to zero and solving for $x^{(k+1)}$ gives the formula above.
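A minimal one-dimensional sketch of the formula above (the example function $x^2-2$, the starting point, and the tolerances are my own illustrative choices):

```python
def newton_raphson(f, f_prime, x0, tol=1e-12, max_iter=50):
    """1D Newton-Raphson: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:           # residual is small enough: stop
            break
        x = x - fx / f_prime(x)     # Newton-Raphson update
    return x

# Example: root of x^2 - 2 = 0 starting from x0 = 1.0
print(newton_raphson(lambda x: x**2 - 2, lambda x: 2 * x, 1.0))  # ~1.41421356...
```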

Approximate Gradient Method - Secant Method

Nonlinear Equations

Newton Raphson Method

Variations

Approximation of the Jacobian

f:Rโ†’Rf:\mathbb R\rightarrow\mathbb R ์ผ ๋•Œ derivative๋ฅผ ์ด์šฉํ•˜๋Š” ๊ฒƒ์ฒ˜๋Ÿผ ์ง€๊ธˆ ๋‹ค๋ฃจ๋Š” ํ•จ์ˆ˜๋Š” multi variable vector valued์ด๊ธฐ ๋•Œ๋ฌธ์— ๊ฐ™์€ ์—ญํ• ์ธ Jacobian์„ ์ด์šฉํ•œ๋‹ค.

Conjugate Gradient Method

Linear Algebra for Computing Zeros of Polynomials

Root Finding as Optimization

์ฐธ๊ณ ์ž๋ฃŒ:
์ธํ•˜๋Œ€ ๊น€๊ด‘๊ธฐ ๊ต์ˆ˜๋‹˜์˜ ์ˆ˜์น˜ํ•ด์„ ๊ฐ•์˜
