
Posts

Showing posts with the label Calculus

Hessian Matrix

Introduction (Japanese ver) Today I will talk about the Taylor expansion with the Hessian matrix. Understanding this expansion is important for optimization, and machine learning in particular constantly involves optimization problems, so I am writing this post to record what I know about the Hessian matrix. Overview: Definition of the Hessian matrix; Expression; Taylor expansion with vectors; Optimality of the function. Definition of the Hessian matrix. Assumption: f is a function that satisfies the following conditions. f returns a real value given an n-dimensional vector, written \[x = [x_1, x_2, \dots, x_n]\] and for all \(x_i,\ i \in \{1,2,\dots,n\}\), f is twice partially differentiable. Definition: the Hessian matrix has \(\frac{\partial^2 f(x)}{\partial x_i \partial x_j}\) as its (i,j) element, so it is expressed as follows. \[ H(f) = \left( \begin{array}{cccc} \frac{\partial^2}{\partial x_1^2} & \frac{\parti...
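As a quick illustration of the (i,j)-element definition quoted above (the function here is my own choice, not one taken from the full post), consider the two-variable function \(f(x_1,x_2) = x_1^2 + x_1 x_2 + x_2^2\):

\[ H(f) = \left( \begin{array}{cc} \frac{\partial^2 f}{\partial x_1^2} & \frac{\partial^2 f}{\partial x_1 \partial x_2} \\ \frac{\partial^2 f}{\partial x_2 \partial x_1} & \frac{\partial^2 f}{\partial x_2^2} \end{array} \right) = \left( \begin{array}{cc} 2 & 1 \\ 1 & 2 \end{array} \right) \]

This matrix is positive definite (its eigenvalues are 3 and 1), which is the kind of optimality information the overview refers to: the stationary point at the origin is a minimum.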

Rolle’s theorem

Introduction (Japanese ver) This post is about Rolle's theorem. The mean-value theorem is proved using Rolle's theorem, and I will write about the mean-value theorem later. I first introduce the maximum principle, because the proof of Rolle's theorem needs it. Maximum principle It is very simple: f is a continuous function on a bounded closed interval \(\implies\) f has a maximum value. Proof This proof is difficult, so I give it in another post: Maximum Principle. Rolle's theorem f is a continuous function on [a,b] and a differentiable function on (a,b). \[f(a) = f(b) \implies \exists ~~c ~~s.t.~~ f'(c) = 0,\ a<c<b\] Proof If f(x) is a constant function, then \[\forall c \in (a,b),\ f'(c) = 0\] Otherwise, when \(\exists t ~~s.t.~~ f(a) < f(t)\), there exists c such that \(\max f(x) = f(c)\) by the maximum principle. I prove that \(f'(c)=0\). f is differentiable at \(x = c\) and \(f(c) \geq f(c+h)\). Thus \[f'(c) = \lim_{h \rightarrow +0} \frac{f(c+h) - f(c)}{h} \leq 0\] \[f'(c) = \lim...
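A small worked instance of the theorem stated above (the function is my own illustrative choice, not taken from the post): take \(f(x) = x(1-x)\) on \([0,1]\).

\[ f(0) = f(1) = 0, \qquad f'(x) = 1 - 2x, \qquad f'(c) = 0 \iff c = \frac{1}{2} \in (0,1) \]

The maximum of f on [0,1] is attained at the interior point \(c = \frac{1}{2}\), which is exactly the point whose existence the maximum principle supplies in the proof.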

Taylor Expansion

Introduction (Japanese ver) Today I introduce the Taylor expansion. I write about not only the one-dimensional Taylor expansion but also the multi-dimensional Taylor expansion. One-dimensional Taylor expansion If f is n-times continuously differentiable on (a,b), then f is expressed as follows. \[\exists c ~~s.t.~~ f(b) = \sum_{k=0}^{n-1} f^{(k)}(a)\frac{(b-a)^k}{k!} + f^{(n)}(c) \frac{(b-a)^n}{n!},\quad c \in (a,b)\] This is the Taylor expansion; the special case a = 0 is called the Maclaurin expansion. The last term is called the remainder term. Multi-dimensional Taylor expansion The multi-dimensional Taylor expansion is more complex. f is an n-variable function which is m-times continuously differentiable. \(f(x_1+h_1,x_2+h_2,\dots,x_n+h_n)\) is expressed as follows. \[\exists \theta ~~s.t.~~\] \[f(x_1+h_1,x_2+h_2,\dots,x_n+h_n)=f(x_1,x_2,\dots,x_n) + \] \[\sum_{j=1}^{m-1} \frac{1}{j!} \sum_{k_1=1}^{n} \sum_{k_2=1}^{n} \cdots \sum_{k_j=1}^{n} \frac{\partial^{j} f}{\partial x_{k_1} \partial x_{k_2} \cdots \partial x_{k_j}}(x_1,x_2,\dots,x_n)\, h_{k_1} h_{k_2} ...
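To make the general formula above concrete, here is the second-order case written out (my own restatement, assuming f is twice continuously differentiable; \(\theta\) plays the same role as c in the one-dimensional statement). This is the "Taylor expansion with vectors" that the Hessian Matrix post's overview refers to:

\[ f(x+h) = f(x) + \sum_{k=1}^{n} \frac{\partial f}{\partial x_k}(x)\, h_k + \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \frac{\partial^2 f}{\partial x_i \partial x_j}(x+\theta h)\, h_i h_j, \quad 0 < \theta < 1 \]

In matrix notation the last term is \(\frac{1}{2} h^T H(f)(x+\theta h)\, h\), where \(H(f)\) is the Hessian matrix defined in the first post on this page.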