๋ณธ๋ฌธ ๋ฐ”๋กœ๊ฐ€๊ธฐ

[Deep-Special] [Lec1] Week2. Logistic Regression as a Neural Network (Last Update-20.11.02.Mon)

by Steve-Lee 2020. 11. 2.

๐Ÿ“” [Lec1] Neural Networks & Deep Learning

Deep Learning Specialization Course์˜ ์ฒซ ๋ฒˆ์งธ ๊ฐ•์˜ 'Neural Networks & Deep Learning'์˜ 2์ฃผ์ฐจ ๊ณผ์ •์ž…๋‹ˆ๋‹ค.

2์ฃผ์ฐจ ๋ชฉํ‘œ

  • ์ˆœ์ „ํŒŒ, ์—ญ์ „ํŒŒ
  • ๋กœ์ง€์Šคํ‹ฑ ํšŒ๊ท€๋ถ„์„
  • ๋น„์šฉ ํ•จ์ˆ˜(Cost Function)
  • ๊ฒฝ์‚ฌํ•˜๊ฐ•๋ฒ•(gradient descent)

Goal

  • Build a logistic regression model structured as a shallow neural network
  • Build the general architecture of a learning algorithm, including parameter initialization, cost function and gradient calculation, and optimization implementation (gradient descent)
  • Implement computationally efficient and highly vectorized versions of models
  • Compute derivatives for logistic regression, using a backpropagation mindset
  • Use Numpy functions and Numpy matrix/vector operations
  • Work with iPython Notebooks
  • Implement vectorization across multiple training examples

 

Logistic Regression

  • Why? - Why do we use logistic regression?
  • What? - What is logistic regression?
  • How? - How do we compute it?

  • Answer
    • Why?
      • Logistic regression was devised to solve a supervised learning problem: binary classification.
    • What?
      • What is the goal of logistic regression?
      • Given a problem of classifying into 0 and 1 (y ∈ {0, 1}), the goal is to predict the class. For example: cat vs. not-cat, spam vs. ham, infected vs. not infected, and so on.
      • Equivalently, it minimizes the error between the training labels and the model's predictions.
    • How?
      • Apply the sigmoid function on top of a linear function so that the output falls between 0 and 1.

 

Logistic Regression Cost Function

What is the difference between the cost function and the loss function for logistic regression?

โœ… Loss Function๊ณผ Cost Fuction์˜ ์ฐจ์ด๋ฅผ ๋งํ•  ์ˆ˜ ์žˆ์–ด์•ผํ•œ๋‹ค

 

์œ„์˜ ์ˆ˜์‹๊ณผ ๊ฐ™์ด Loss Function๊ณผ Cost Function์˜ ๊ฐ€์žฅ ํฐ ์ฐจ์ด๋Š” ์‹ค์ œ๊ฐ’๊ณผ ์˜ˆ์ธก๊ฐ’์˜ ์ฐจ์ด๋ฅผ ํ•˜๋‚˜์˜ ๋ฐ์ดํ„ฐ์— ๋Œ€ํ•ด์„œ๋งŒ ์ƒ๊ฐํ•˜๋Š๋ƒ ๋˜๋Š” ์ „์ฒด ํŠธ๋ ˆ์ด๋‹ ๋ฐ์ดํ„ฐ์— ๋Œ€ํ•ด์„œ ์ƒ๊ฐํ•˜๋Š๋ƒ์— ๋‹ฌ๋ ค์žˆ๋‹ค. ์ •๋ฆฌํ•˜๋ฉด

 

  • Loss Function์ด Single Training Example์— ๋Œ€ํ•œ ๊ณ„์‚ฐ์ด๋ผ๋ฉด
  • Cost Function์€ ์ „์ฒด ํŠธ๋ ˆ์ด๋‹ ์„ธํŠธ์— ๋Œ€ํ•œ ํ‰๊ท ๊ฐ’์„ ๊ณ„์‚ฐํ•œ ๊ฒƒ์ด๋‹ค

So the terminology I'm going to use is that the loss function is applied to just a single training example like so. And the cost function is the cost of your parameters. So in training your logistic regression model, we're going to try to find parameters W and B that minimize the overall cost function J written at the bottom.
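The loss-versus-cost distinction can be made concrete in NumPy; the predictions and labels below are made-up values for illustration:

```python
import numpy as np

def loss(y_hat, y):
    # Cross-entropy loss for ONE training example
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

def cost(Y_hat, Y):
    # Cost = average of the per-example losses over the whole training set
    return np.mean(loss(Y_hat, Y))

# Hypothetical predictions and labels for m = 4 examples
Y_hat = np.array([0.9, 0.2, 0.7, 0.4])
Y     = np.array([1,   0,   1,   0])

print(loss(Y_hat[0], Y[0]))  # loss on a single example: -log(0.9) ≈ 0.105
print(cost(Y_hat, Y))        # average over all m examples
```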

 

Derivatives(๋ฏธ๋ถ„)

โœ… ๋ฏธ๋ถ„

  • ๋ฏธ๋ถ„์˜ ์ง๊ด€ ⇒ Slope(๊ธฐ์šธ๊ธฐ) ๋ผ๊ณ  ์ƒ๊ฐํ•˜์ž
  • ํ•จ์ˆ˜๊ฐ’์ด ์ง์„ ์ผ ๊ฒฝ์šฐ์—๋Š” ์–ด๋Š ์œ„์น˜์—์„œ๋“  ๋ฏธ๋ถ„๊ฐ’์ด ๊ฐ™์ง€๋งŒ ๊ณก์„  ๋˜๋Š” ๋‹ค๋ฅธ ๊ฒฝ์šฐ์—๋Š” ์œ„์น˜์— ๋”ฐ๋ผ ๋ฏธ๋ถ„๊ฐ’์ด ๋‹ฌ๋ผ์งˆ ์ˆ˜ ์žˆ๋‹ค

 

Logistic Regression Gradient Descent

Logistic Regression on m examples
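Following the lecture's per-example formulation, one gradient-descent step over m examples can be sketched with an explicit loop; the toy data and learning rate here are assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(w, b, X, Y, lr=0.1):
    # One gradient-descent step, looping over the m examples explicitly.
    # X: (n_features, m), Y: (m,) -- one column per example, as in the lecture.
    n, m = X.shape
    dw = np.zeros(n)
    db = 0.0
    for i in range(m):
        z = np.dot(w, X[:, i]) + b
        a = sigmoid(z)
        dz = a - Y[i]           # dL/dz for the cross-entropy loss
        dw += X[:, i] * dz      # accumulate dL/dw over examples
        db += dz                # accumulate dL/db over examples
    dw /= m
    db /= m
    return w - lr * dw, b - lr * db

# Toy data: 2 features, 4 examples (assumed values)
X = np.array([[0.0, 1.0, 2.0, 3.0],
              [1.0, 0.0, 1.0, 0.0]])
Y = np.array([0, 0, 1, 1])
w, b = np.zeros(2), 0.0
for _ in range(100):
    w, b = gradient_step(w, b, X, Y)
```

Because every step averages gradients over all m examples, the cost decreases from its starting value of log 2 (where w = 0, b = 0 gives every prediction 0.5). The next sections replace the inner loop with vectorized operations.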

 

More Vectorization Examples
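A minimal sketch of why vectorization matters, in the spirit of the lecture's loop-versus-np.dot comparison (array sizes are arbitrary):

```python
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Explicit Python for-loop version of the dot product
tic = time.perf_counter()
c_loop = 0.0
for i in range(n):
    c_loop += a[i] * b[i]
loop_ms = (time.perf_counter() - tic) * 1000

# Vectorized version: the same dot product in one NumPy call
tic = time.perf_counter()
c_vec = np.dot(a, b)
vec_ms = (time.perf_counter() - tic) * 1000

print(f"loop: {loop_ms:.1f} ms, vectorized: {vec_ms:.3f} ms")
```

Both compute the same number, but the vectorized call runs orders of magnitude faster because the work happens in optimized compiled code rather than the Python interpreter.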

Vectorizing Logistic Regression
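The fully vectorized forward and backward pass, following the course's column-per-example layout (X of shape (n, m)), might look like this; the toy data is assumed:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def propagate(w, b, X, Y):
    # Vectorized forward and backward pass over ALL m examples at once.
    # X: (n_features, m), Y: (1, m), w: (n_features, 1), b: scalar.
    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)                       # (1, m): all activations
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    dw = np.dot(X, (A - Y).T) / m                         # (n, 1): gradient w.r.t. w
    db = np.sum(A - Y) / m                                # scalar: gradient w.r.t. b
    return dw, db, cost

X = np.array([[0.0, 1.0, 2.0, 3.0],
              [1.0, 0.0, 1.0, 0.0]])
Y = np.array([[0, 0, 1, 1]])
w, b = np.zeros((2, 1)), 0.0
dw, db, cost = propagate(w, b, X, Y)
print(cost)  # with zero parameters every activation is 0.5, so cost = log 2
```

No loop over examples remains: the matrix product w.T X computes every z at once, and X (A − Y).T accumulates all per-example gradients in a single call.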

 

Steve-YJ/Data-Science-scratch - Learning Data Science from scratch (github.com)

Reference

 

์‹ฌ์ธต ํ•™์Šต

Learn Deep Learning from deeplearning.ai. If you want to break into Artificial intelligence (AI), this Specialization will help you. Deep Learning is one of the most highly sought after skills in tech. We will help you become good at Deep Learning.

www.coursera.org

 

โœ… End!  -20.11.03.Tue- :)

โœ… Update!  -20.11.15.Sun am 7:00 - :)

 

๋Œ“๊ธ€