🙃 Logistic Regression, SVM 🙃

parkeu · September 26, 2022

ABC Bootcamp


๐Ÿ† ์ง€๋„ํ•™์Šต ์•Œ๊ณ ๋ฆฌ์ฆ˜

Linear models for classification

  • Linear classifiers differ in how they measure how well a particular combination of coefficients and intercept fits the training data
  • and in whether they use regularization, and if so, which kind
  • However, adjusting w and b to directly minimize the number of misclassifications these algorithms make is computationally infeasible, so a surrogate loss is minimized instead

y-hat : the predicted value
y : the true label
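The two quantities relate through the decision rule of a linear binary classifier: y-hat is determined by the sign of w·x + b. A minimal NumPy sketch (the weights w and intercept b below are made-up values for illustration, not fitted ones):

```python
import numpy as np

# hypothetical parameters, for illustration only (a real model learns these from data)
w = np.array([0.5, -1.2])  # one weight per feature
b = 0.3                    # intercept

def predict(X):
    """y-hat = 1 if w.x + b > 0, else 0."""
    return (X @ w + b > 0).astype(int)

X = np.array([[1.0, 0.2],   # 0.5 - 0.24 + 0.3 = 0.56 > 0 -> class 1
              [0.1, 2.0]])  # 0.05 - 2.4 + 0.3 < 0        -> class 0
print(predict(X))  # [1 0]
```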


Decision boundaries of linear classification models for different settings of C (regularization)

import mglearn
import matplotlib.pyplot as plt
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

import warnings
warnings.filterwarnings('ignore')

plt.rc('font', family='NanumBarunGothic')
plt.rcParams['axes.unicode_minus'] = False
plt.rcParams['figure.dpi']  = 100

X, y = mglearn.datasets.make_forge()

fig, axes = plt.subplots(1, 2, figsize=(10, 3))

for model, ax in zip([LinearSVC(max_iter=5000), LogisticRegression()], axes):
    clf = model.fit(X, y)
    mglearn.plots.plot_2d_separator(clf, X, fill=False, eps=0.5,
                                    ax=ax, alpha=.7)
    mglearn.discrete_scatter(X[:, 0], X[:, 1], y, ax=ax)
    ax.set_title(clf.__class__.__name__)
    ax.set_xlabel("Feature 0")
    ax.set_ylabel("Feature 1")
axes[0].legend()
plt.show()


C is the regularization setting ("make the model study less" -> avoid overfitting); the default is C=1.
Lowering C, e.g. 0.01, 0.001 -> stronger regularization -> better generalization -> risk of underfitting
Raising C, e.g. 10, 100, 1000 -> weaker regularization -> risk of overfitting

mglearn.plots.plot_linear_svc_regularization()


๐Ÿ—’๏ธ ์œ ๋ฐฉ์•” ๋ฐ์ดํ„ฐ์…‹์„ ์‚ฌ์šฉํ•œ ๋กœ์ง€์Šคํ‹ฑ ํšŒ๊ท€(LogisticRegression) ์„ฑ๋Šฅํ‰๊ฐ€

  • Logistic regression: both L2 and L1 regularization are available (L2 is the default)
    • Compare performance for different settings of C, which controls the regularization strength
  • Default C=1; e.g. stronger regularization C=0.01, weaker regularization C=100

๋ฐ์ดํ„ฐ ์ค€๋น„ํ•˜๊ธฐ

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

# ๋ฐ์ดํ„ฐ ๊ฐ€์ ธ์˜ค๊ธฐ
cancer = load_breast_cancer()

# ๋ฐ์ดํ„ฐ์…‹ ๋ถ„๋ฆฌํ•˜๊ธฐ
X_train, X_test, y_train, y_test = train_test_split(cancer.data, cancer.target, stratify=cancer.target, random_state=7)

👀 stratify=cancer.target : keeps the class ratio the same in both splits
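What stratify buys can be checked directly: the class ratio of the full dataset is preserved (up to rounding) in both splits. A quick sketch using the same split:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

cancer = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    cancer.data, cancer.target, stratify=cancer.target, random_state=7)

# fraction of each class in the full set vs. the stratified splits
for name, part in [('full', cancer.target), ('train', y_train), ('test', y_test)]:
    print(name, np.bincount(part) / len(part))
```

All three printed ratios come out (almost) identical; without stratify they can drift apart on small datasets.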

๋ชจ๋ธ ์„ค์ •, ํ•™์Šต

from sklearn.linear_model import LogisticRegression

# C = 1
logreg = LogisticRegression()

logreg.fit(X_train, y_train)

Evaluating the L2-regularized LogisticRegression model for different values of C

print('---------default---------')
print('logreg train set score : {:.8f}'.format(logreg.score(X_train, y_train)))
print('logreg test set score : {:.8f}'.format(logreg.score(X_test, y_test)))

print('---stronger regularization---')
logreg001 = LogisticRegression(C=0.01).fit(X_train, y_train)
print('logreg001 train set score : {:.8f}'.format(logreg001.score(X_train, y_train)))
print('logreg001 test set score : {:.8f}'.format(logreg001.score(X_test, y_test)))

print('----weaker regularization----')
logreg100 = LogisticRegression(C=100).fit(X_train, y_train)
print('logreg100 train set score : {:.8f}'.format(logreg100.score(X_train, y_train)))
print('logreg100 test set score : {:.8f}'.format(logreg100.score(X_test, y_test)))

๐Ÿ‘€ ์ด ๊ฒฐ๊ณผ๋ฅผ ๋ณธ๋‹ค๋ฉด, ๊ทœ์ œ ์™„ํ™”๋œ ๋ชจ๋ธ์„ ์„ ํƒํ•˜๋Š”๊ฒƒ์ด ์ข‹์Œ!

# inspect the feature weights under L2 regularization
plt.plot(logreg100.coef_.T, '^', label="C=100")
plt.plot(logreg.coef_.T, 'o', label="C=1")
plt.plot(logreg001.coef_.T, 'v', label="C=0.01")
plt.xticks(range(cancer.data.shape[1]), cancer.feature_names, rotation=90)
xlims = plt.xlim()
plt.hlines(0, xlims[0], xlims[1])
plt.xlim(xlims)
plt.ylim(-5, 5)
plt.xlabel("Feature")
plt.ylabel("Coefficient magnitude")
plt.legend()
plt.show()

Evaluating the L1-regularized LogisticRegression model for different values of C

print('---------default---------')
logreg_l1 = LogisticRegression(penalty='l1', solver='liblinear').fit(X_train, y_train)
print('logreg_l1 train set score : {:.8f}'.format(logreg_l1.score(X_train, y_train)))
print('logreg_l1 test set score : {:.8f}'.format(logreg_l1.score(X_test, y_test)))

print('---stronger regularization---')
logreg001 = LogisticRegression(C=0.01, penalty='l1', solver='liblinear').fit(X_train, y_train)
print('logreg001 train set score : {:.8f}'.format(logreg001.score(X_train, y_train)))
print('logreg001 test set score : {:.8f}'.format(logreg001.score(X_test, y_test)))

print('----weaker regularization----')
logreg100 = LogisticRegression(C=100, penalty='l1', solver='liblinear').fit(X_train, y_train)
print('logreg100 train set score : {:.8f}'.format(logreg100.score(X_train, y_train)))
print('logreg100 test set score : {:.8f}'.format(logreg100.score(X_test, y_test)))

# inspect the feature weights under L1 regularization
plt.plot(logreg100.coef_.T, '^', label="C=100")
plt.plot(logreg_l1.coef_.T, 'o', label="C=1")
plt.plot(logreg001.coef_.T, 'v', label="C=0.01")
plt.xticks(range(cancer.data.shape[1]), cancer.feature_names, rotation=90)
xlims = plt.xlim()
plt.hlines(0, xlims[0], xlims[1])
plt.xlim(xlims)
plt.ylim(-5, 5)
plt.xlabel("Feature")
plt.ylabel("Coefficient magnitude")
plt.legend()
plt.show()


๐Ÿ† ์„œํฌํŠธ ๋ฒกํ„ฐ ๋จธ์‹ (SVM)

- ๋ฐ์ดํ„ฐ์…‹์˜ ์—ฌ๋Ÿฌ ์†์„ฑ์„ ๋‚˜ํƒ€๋‚ด๋Š” ๋ฐ์ดํ„ฐํ”„๋ ˆ์ž„์˜ ๊ฐ ์—ด์€ ์—ด ๋ฒกํ„ฐ ํ˜•ํƒœ๋กœ ๊ตฌํ˜„๋จ

  • ์—ด ๋ฒกํ„ฐ๋“ค์ด ๊ฐ๊ฐ ๊ณ ์œ ์˜ ์ถ•์„ ๊ฐ–๋Š” ๋ฒกํ„ฐ ๊ณต๊ฐ„ ๋งŒ๋“ฆ -> ๋ถ„์„ ๋Œ€์ƒ์ด ๋˜๋Š” ๊ฐœ๋ณ„ ๊ด€์ธก๊ฐ’์€ ๋ชจ๋“  ์†์„ฑ(์—ด๋ฒกํ„ฐ) ๊ด€ํ•œ ๊ฐ’์„ ํ•ด๋‹น ์ถ•์˜ ์ขŒํ‘œ๋กœ ํ‘œ์‹œํ•˜์—ฌ ๋ฒกํ„ฐ ๊ณต๊ฐ„์—์„œ ์œ„์น˜๋ฅผ ๋‚˜ํƒ€๋ƒ„
  • ์†์„ฑ์ด 2๊ฐœ ์กด์žฌํ•˜๋Š” ๋ฐ์ดํ„ฐ์…‹ - 2์ฐจ์› ํ‰๋ฉด ๊ณต๊ฐ„ ์ขŒํ‘œ, 3๊ฐœ์ด๋ฉด 3์ฐจ์›, 4๊ฐœ์ด๋ฉด 4์ฐจ์›
  • ๋ฒกํ„ฐ ๊ณต๊ฐ„์— ์œ„์น˜ํ•œ ํ›ˆ๋ จ ๋ฐ์ดํ„ฐ์˜ ์ขŒํ‘œ์™€ ๊ฐ ๋ฐ์ดํ„ฐ๊ฐ€ ์–ด๋–ค ๋ถ„๋ฅ˜ ๊ฐ’์„ ๊ฐ€์ ธ์•ผํ•˜๋Š”์ง€ ์ •๋‹ต์„ ์ž…๋ ฅ๋ฐ›์•„ ํ•™์Šต -> ๊ฐ™์€ ๋ถ„๋ฅ˜ ๊ฐ’์„ ๊ฐ–๋Š” ๋ฐ์ดํ„ฐ๋ผ๋ฆฌ ๊ฐ™์€ ๊ณต๊ฐ„์— ์œ„์น˜ํ•˜๋„๋ก ํ•จ
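The "support vectors" the method is named after are the training points that lie on or inside the margin around the decision boundary; only these points determine where the boundary sits. A small sketch on toy 2-D data (the blob centers and random seed are arbitrary choices):

```python
import numpy as np
from sklearn.svm import SVC

# two well-separated blobs of 20 points each
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(20, 2) - 2, rng.randn(20, 2) + 2])
y = np.array([0] * 20 + [1] * 20)

svm = SVC(kernel='linear').fit(X, y)

# only a handful of the 40 points end up as support vectors
print('support vectors per class :', svm.n_support_)
print('total support vectors     :', len(svm.support_vectors_))
```

Moving any non-support point slightly would leave the boundary unchanged; that is what makes SVMs memory-efficient at prediction time.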

🚣‍♀️ SVM (Support Vector Machine): predicting Titanic survivors

๐Ÿผ ์ค€๋น„

Problem definition: use an SVM to build a binary classification model that predicts Titanic survivors (1) vs. non-survivors (0)

๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ ์ž„ํฌํŠธ

import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

Preventing broken Korean characters in plots

import matplotlib as mpl
import matplotlib.pyplot as plt

%config InlineBackend.figure_format = 'retina'

!apt -qq -y install fonts-nanum

import matplotlib.font_manager as fm
fontpath = '/usr/share/fonts/truetype/nanum/NanumBarunGothic.ttf'
font = fm.FontProperties(fname=fontpath, size=9)
plt.rc('font', family='NanumBarunGothic') 
mpl.font_manager._rebuild()  # note: _rebuild() is a private API removed in newer matplotlib; there, fm.fontManager.addfont(fontpath) is the replacement

๋ฐ์ดํ„ฐ ์ค€๋น„ํ•˜๊ณ  ํ™•์ธํ•˜๊ธฐ

# ๋ฐ์ดํ„ฐ ์ค€๋น„ํ•˜๊ธฐ
df = sns.load_dataset("titanic")

# ๋ฐ์ดํ„ฐ ํ™•์ธํ•˜๊ธฐ
df.head()

๋ฐ์ดํ„ฐ ๋ถ„์„ํ•˜๊ธฐ


attrs = df.columns

plt.figure(figsize=(20,20), dpi=200)

# for each column, plot its value counts split by survival
for i, feature in enumerate(attrs):
  plt.subplot(5, 5, i+1)
  sns.countplot(data=df, x=feature, hue='survived')
sns.despine()


👀 Use these plots to decide which columns to keep and which to drop

๋ฐ์ดํ„ฐ ์ „์ฒ˜๋ฆฌ

df.isna().sum()
# 1) drop columns with many NaNs or duplicated information -> deck (mostly NaN), embark_town (duplicates embarked)
rdf = df.drop(['deck','embark_town'], axis=1)
rdf.info()


👀 The two columns are gone

# 2) drop rows missing age -> removes only the 177 rows where age is NaN
rdf = rdf.dropna(subset=['age'], how='any', axis=0)
rdf.info()

👀 The row count has dropped accordingly

# 3) how should the 2 NaN values in embarked (port of embarkation) be filled? -> use the most frequent port
most_freq = rdf['embarked'].value_counts(dropna=True).idxmax()
most_freq # -> 'S'
# since the result is 'S', fill the NaN values with 'S'

# fill the NaN values in the embarked column with the most frequent port
rdf['embarked'].fillna(most_freq, inplace=True)
rdf.isna().sum()

👀 No NaN values remain!

# 4) select the columns needed for training
# survival, cabin class, sex, age, # of siblings/spouses, # of parents/children, port of embarkation
ndf = rdf[['survived','pclass','sex','age','sibsp','parch','embarked']]
ndf.head()

# 5) convert string-valued columns -> encoding -> one-hot encoding (turns categorical data into numbers a machine learning model can work with)
# e.g. male [1,0], female [0,1]
# e.g. S [1,0,0], C [0,1,0], Q [0,0,1]

# 5-1) one-hot encode
onehot_sex = pd.get_dummies(ndf['sex'])
onehot_embarked = pd.get_dummies(ndf['embarked'])

# 5-2) concatenate onto the ndf dataframe
ndf = pd.concat([ndf, onehot_sex], axis=1)
ndf = pd.concat([ndf, onehot_embarked], axis=1)

# 6) drop the original columns
ndf.drop(['sex','embarked'], axis=1, inplace=True)

ndf


๋ฐ์ดํ„ฐ ๋ถ„๋ฆฌํ•˜๊ธฐ

X = ndf[['pclass', 'age', 'sibsp', 'parch', 'female', 'male', 'C', 'Q', 'S']]
y = ndf['survived']

# standardize the X (feature, independent variable) values -> StandardScaler rescales each feature to mean 0 and standard deviation 1 -> scaling (range adjustment)
from sklearn import preprocessing
X = preprocessing.StandardScaler().fit(X).transform(X)
X
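StandardScaler does not squeeze values into the 0–1 range (that would be MinMaxScaler); it standardizes each feature to mean 0 and standard deviation 1. Easy to verify on a tiny made-up array:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

data = np.array([[1.0, 200.0],
                 [2.0, 400.0],
                 [3.0, 600.0]])
scaled = StandardScaler().fit_transform(data)

print(scaled.mean(axis=0))  # ~[0. 0.]
print(scaled.std(axis=0))   # ~[1. 1.]
```

Putting all features on the same scale matters for SVMs, whose kernels depend on distances between points.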

# split into train and test sets (7:3)
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=7)

print('train shape : ', X_train.shape)
print('test shape : ', X_test.shape)

The SVM classification model

from sklearn import svm
# ๋ชจ๋ธ ๊ฐ์ฒด ์ƒ์„ฑ kernel = 'rbf'
# ๋ฒกํ„ฐ ๊ณต๊ฐ„์„ ๋งตํ•‘ํ•˜๋Š”ํ•จ์ˆ˜ -> ์„ ํ˜•(linear), ๋‹คํ•ญ์‹(poly), ๊ฐ€์šฐ์‹œ์•ˆ RBF(rbf), ์‹œ๊ทธ๋ชจ์ด๋“œ(sigmoid)
svm_model = svm.SVC(kernel = 'rbf')

# ๋ชจ๋ธ ํ•™์Šต
svm_model.fit(X_train, y_train)
print('ํ›ˆ๋ จ ์„ธํŠธ ์ ์ˆ˜ : {:.8f}'.format(svm_model.score(X_train, y_train)))
print('ํ…Œ์ŠคํŠธ ์„ธํŠธ ์ ์ˆ˜ : {:.8f}'.format(svm_model.score(X_test, y_test)))

# ๋ชจ๋ธ ํ•™์Šต ๊ฒฐ๊ณผ
from sklearn import metrics
y_pred = svm_model.predict(X_test) # ๋ฌธ์ œํ’€์–ด๋ด

print('accuracy : ', metrics.accuracy_score(y_test, y_pred))
print('precision : ', metrics.precision_score(y_test, y_pred))
print('recall : ', metrics.recall_score(y_test, y_pred))
print('f1s : ', metrics.f1_score(y_test, y_pred))

accuracy : 0.8046511627906977
precision : 0.873015873015873
recall : 0.6179775280898876
f1s : 0.7236842105263157
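All four scores above can be read off the confusion matrix. A sketch with made-up labels (for illustration only, not the actual Titanic predictions) showing how precision and recall come from its entries:

```python
from sklearn.metrics import confusion_matrix, precision_score, recall_score

# made-up labels, for illustration only
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print('TP, FP, FN, TN :', tp, fp, fn, tn)   # 3 1 2 4
print('precision      :', tp / (tp + fp))   # 0.75, matches precision_score
print('recall         :', tp / (tp + fn))   # 0.6,  matches recall_score
```

The gap between the model's precision (0.87) and recall (0.62) above means it rarely mislabels non-survivors as survivors, but misses a fair share of actual survivors.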


  • Linear models
    Key parameter - regression models -> alpha; LinearSVC and LogisticRegression -> C
    The larger alpha, or the smaller C, the simpler the model
    C and alpha are usually tuned on a log scale
  • You must decide between L1 and L2 regularization
    Few features expected to matter -> L1
    Many features matter -> L2
    L1 regularization is also useful when model interpretability matters:
    since only a few features are used, it is easy to explain which features matter to the model and how strong their effect is
  • Linear models are fast to train
  • Predictions follow the formulas seen for regression and classification, so it is relatively easy to understand how a prediction is made
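The log-scale search mentioned above is usually built with np.logspace, so each candidate is a fixed factor (here 10x) larger than the previous one (the bounds -3..3 are a common but arbitrary choice):

```python
import numpy as np

# candidate values for C (or alpha), spaced evenly on a log scale: 0.001 up to 1000
C_grid = np.logspace(-3, 3, 7)
print(C_grid)
```

A linear grid like [1, 2, 3, ...] would waste most of its points in one order of magnitude, while model behavior typically changes between orders of magnitude.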