[GAN] - MNIST data

egy · September 22, 2022

📄 MNIST dataset

  • A dataset of handwritten digits from 0 to 9.
  • Each image is 28 x 28 pixels with a single channel (grayscale).

Library Import

# Python standard library
import os
import time
import glob

# open-source libraries
import tensorflow as tf
from tensorflow.keras import layers
import imageio # used to turn the generated images into a GIF
import matplotlib.pyplot as plt
import numpy as np
import PIL

# the IPython display module is used to show the generated images inline
from IPython import display 

os
A module that exposes operating-system functionality to Python, such as getting paths, creating folders, and listing the files in a folder.
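
For instance, a few of the calls it provides (the folder name 'samples' is only an illustration here):

os.makedirs('samples', exist_ok=True)      # create a folder (no error if it already exists)
print(os.getcwd())                         # path of the current working directory
print(os.listdir('samples'))               # list the files inside a folder
print(os.path.join('samples', 'img.png'))  # build a path in an OS-independent way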

glob

  • Returns a list of every file and directory whose name matches the pattern passed as an argument.
  • Passing * as the pattern lists all files and directories, as in the sketch below.
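
For example, assuming the image_at_epoch_*.png files that are saved later in this post:

print(glob.glob('image*.png'))   # every file whose name starts with "image" and ends with ".png"
print(glob.glob('*'))            # every file and directory in the current folder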

Dataset preparation

(train_images, train_labels), (_, _) = tf.keras.datasets.mnist.load_data()

train_images has shape (60000, 28, 28): 60,000 images at 28 x 28 resolution.

train_images = train_images.reshape(train_images.shape[0], 28, 28, 1).astype('float32')
train_images = (train_images - 127.5) / 127.5 # normalize the images to [-1, 1]

reshape
Changes the dimensions of a NumPy array.
astype
Changes the data type of the array's elements.

  • Reshape the image data into 28 x 28 single-channel (grayscale) images and convert the data type to float32.
  • Each pixel currently holds a value from 0 to 255, so it is rescaled to lie between -1 and 1; a quick check is shown below.
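
A small sanity check of the preprocessing (not part of the original pipeline, just confirming the value range):

print(train_images.dtype, train_images.shape)   # float32 (60000, 28, 28, 1)
print(train_images.min(), train_images.max())   # -1.0 1.0
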
BUFFER_SIZE = 60000
BATCH_SIZE = 256
# batch and shuffle the data
train_dataset = tf.data.Dataset.from_tensor_slices(train_images).shuffle(BUFFER_SIZE).batch(BATCH_SIZE)

tf.data.Dataset.from_tensor_slices

  • Creates a tf.data.Dataset by slicing the input tensor along its first dimension.
  • For example, when the MNIST training data of shape (60000, 28, 28) is passed in, it produces 60,000 slices, each one a 28 × 28 image.

shuffle

  • Randomly shuffles the dataset.
  • Samples are drawn at random from a buffer of BUFFER_SIZE elements, and each drawn sample is replaced by a new one.
  • For a perfect shuffle, the buffer size must be greater than or equal to the size of the full dataset.

batch
Groups consecutive items of the dataset into single batches.
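
Putting the three calls together, drawing one batch from the pipeline shows the expected shape (a minimal check using the variables defined above, not part of the original training code):

for image_batch in train_dataset.take(1):
    print(image_batch.shape)   # (256, 28, 28, 1): BATCH_SIZE images of 28 x 28 x 1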

๋ชจ๋ธ ๊ตฌ์„ฑ

  • The Sequential class from the tf.keras.models module is used to stack the layers of the network in order.
  • Dense layers are added because the images are sensitive to pixel positions, so fully connecting the units can raise the training accuracy.
  • Flatten turns a multi-dimensional array into a one-dimensional one.

Generator

def make_generator_model():
    model = tf.keras.Sequential()
    model.add(layers.Dense(128, activation='relu', input_shape=(100,)))
    model.add(layers.Dense(256, activation='relu'))
    model.add(layers.Dense(512, activation='relu'))
    model.add(layers.Dense(28*28*1, activation='tanh'))
    model.add(layers.Reshape((28, 28, 1)))

    return model
  • The tanh activation in the last layer is used because the image data were rescaled to values between -1 and 1.

The structure of the generator model can be inspected by printing a layer summary, as sketched below.
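
model.summary() lists each layer with its output shape and parameter count (roughly 580,000 trainable parameters for this stack):

make_generator_model().summary()
# Dense (None, 128) -> Dense (None, 256) -> Dense (None, 512)
# -> Dense (None, 784) -> Reshape (None, 28, 28, 1)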

Discriminator

def make_discriminator_model():
    model = tf.keras.Sequential()
    model.add(layers.Flatten())
    model.add(layers.Dense(512, activation='relu'))
    model.add(layers.Dense(256, activation='relu'))
    model.add(layers.Dense(128, activation='relu'))
    model.add(layers.Dense(1))

    return model
  • The final layer is trained to output a value close to 1 when the discriminator judges an image to be real, and close to 0 when it judges it to be fake.

The discriminator mirrors the generator: a stack of Dense layers that maps a flattened 28 x 28 image down to a single output value.

generator = make_generator_model()
discriminator = make_discriminator_model()
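
Before training, the two untrained models can be run once as a sanity check; the discriminator returns a single raw logit per image (positive leans "real", negative leans "fake"), and its value is meaningless at this point:

noise = tf.random.normal([1, 100])
generated_image = generator(noise, training=False)
print(generated_image.shape)                      # (1, 28, 28, 1)

decision = discriminator(generated_image, training=False)
print(decision.shape)                             # (1, 1): one logit per input image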

Defining the training procedure

# define the loss function
cross_entropy = tf.keras.losses.BinaryCrossentropy(from_logits=True)
  • BinaryCrossentropy is used because the values fed to the loss function are the discriminator's decisions, which should converge to 0 or 1.
  • from_logits must be set to True so that BinaryCrossentropy first squashes its input into the 0-1 range (a sigmoid) before computing the loss; see the sketch below.
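
A small illustration of what from_logits=True does: the raw logit is passed through a sigmoid internally, so the two calls below produce the same loss value (a sketch for intuition only; the numbers are arbitrary):

logit = tf.constant([[2.0]])    # a raw discriminator output
label = tf.constant([[1.0]])    # "real"

loss_from_logits = tf.keras.losses.BinaryCrossentropy(from_logits=True)(label, logit)
loss_with_sigmoid = tf.keras.losses.BinaryCrossentropy()(label, tf.sigmoid(logit))
print(float(loss_from_logits), float(loss_with_sigmoid))   # both approximately 0.127
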
# generator loss function
# trained so that the discriminator outputs 1 when given a fake image
def generator_loss(fake_output):
    return cross_entropy(tf.ones_like(fake_output), fake_output)

The generator's loss takes fake_output, the discriminator's decision on the generated fake images, and measures how far that value is from 1.

tf.ones_like
Creates a tensor with the same shape (and dtype) as a given tensor, with every element set to 1.

# discriminator loss function
# trained to output 1 for real images and 0 for fake images
def discriminator_loss(real_output, fake_output):
    real_loss = cross_entropy(tf.ones_like(real_output), real_output)
    fake_loss = cross_entropy(tf.zeros_like(fake_output), fake_output)
    total_loss = real_loss + fake_loss
    return total_loss
  1. How far the discriminator's decision on the fake images produced by the Generator is from 0
  2. How far its decision on the real images is from 1

These two losses are computed separately, and their sum is returned as the discriminator's loss, as illustrated below.
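
For intuition, a confident and correct discriminator yields a small total loss (a rough numeric sketch using the functions defined above; the logit values are arbitrary):

real_output = tf.constant([[3.0]])    # real image judged strongly "real"
fake_output = tf.constant([[-3.0]])   # fake image judged strongly "fake"
print(float(discriminator_loss(real_output, fake_output)))   # approximately 0.097

Flipping the signs of the two logits makes the same call return a loss of about 6.1 instead.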

tf.zeros_like
Creates a tensor with the same shape (and dtype) as a given tensor, with every element set to 0.

# define the optimizers
generator_optimizer = tf.keras.optimizers.Adam(1e-4)
discriminator_optimizer = tf.keras.optimizers.Adam(1e-4)
EPOCHS = 300
noise_dim = 100 
num_examples_to_generate = 16

# We will reuse this seed over time
# (it makes it easier to visualize progress in the GIF animation)
seed = tf.random.normal([num_examples_to_generate, noise_dim])

noise_dim
Defines the size of the latent vector used as the generator's input as 100.

num_examples_to_generate
Defines how many generated images to inspect at a time during training.

seed

  • Training starts with the generator receiving a random seed as input, and that seed is used to generate an image.
  • num_examples_to_generate latent vectors are created and handed to the generator, as in the check below.
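
A quick check that the fixed seed has the expected shape and that the (still untrained) generator turns it into one image per latent vector:

print(seed.shape)                               # (16, 100)
print(generator(seed, training=False).shape)    # (16, 28, 28, 1): a 4 x 4 grid's worth of images
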
# Notice how `tf.function` is used here.
# This decorator "compiles" the function into a graph.
@tf.function
def train_step(images):
    noise = tf.random.normal([BATCH_SIZE, noise_dim])

    with tf.GradientTape() as gen_tape, tf.GradientTape() as disc_tape:
    	
        # feed noise into the generator to produce fake images
        generated_images = generator(noise, training=True)
        
        # feed the real and fake images into the discriminator and get its decisions
        real_output = discriminator(images, training=True)
        fake_output = discriminator(generated_images, training=True)
        
        # generator loss: push the discriminator's decision on fake images toward 1
        gen_loss = generator_loss(fake_output)
        # discriminator loss: sum of the real-image loss and the fake-image loss
        disc_loss = discriminator_loss(real_output, fake_output)
        
    # gen_tape.gradient(y, x) computes the gradient of y with respect to x
    gradients_of_generator = gen_tape.gradient(gen_loss, generator.trainable_variables)
    gradients_of_discriminator = disc_tape.gradient(disc_loss, discriminator.trainable_variables)
    
    # update the weights
    generator_optimizer.apply_gradients(zip(gradients_of_generator, generator.trainable_variables))
    discriminator_optimizer.apply_gradients(zip(gradients_of_discriminator, discriminator.trainable_variables))

@tf.function decorator

  • A feature added to solve the various problems that came from compiling with sessions in TensorFlow 1 and to make graph execution convenient to use.
  • Internally it builds a TensorFlow graph from the decorated function, which makes recording losses and recording and updating weights possible, as in the minimal example below.
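
A minimal illustration of the decorator (the function name square is just for demonstration): the Python function is traced into a TensorFlow graph on the first call, and later calls execute that graph.

@tf.function
def square(x):
    return x * x

print(square(tf.constant(3.0)))   # tf.Tensor(9.0, shape=(), dtype=float32)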

tf.GradientTape()
Records every operation executed inside its context onto a tape, then uses automatic differentiation to compute the gradients of the recorded operations.
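
For example, recording y = x * x and asking the tape for dy/dx:

x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x
print(tape.gradient(y, x))   # tf.Tensor(6.0, shape=(), dtype=float32)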

def train(dataset, epochs):
    for epoch in range(epochs):
        start = time.time()
        
        for image_batch in dataset:
            train_step(image_batch)
        
        # produce an image for the GIF as training progresses
        display.clear_output(wait=True)
        generate_and_save_images(generator,
                             epoch + 1,
                             seed)
        print ('Time for epoch {} is {} sec'.format(epoch + 1, time.time()-start))
    
    # generate one final image after the last epoch
    display.clear_output(wait=True)
    generate_and_save_images(generator,
                           epochs,
                           seed)
def generate_and_save_images(model, epoch, test_input):
    # Note that `training` is set to False.
    # This makes all layers run in inference mode (including batch normalization).
    predictions = model(test_input, training=False)
    
    fig = plt.figure(figsize=(4,4))
    
    for i in range(predictions.shape[0]):
        plt.subplot(4, 4, i+1)
        plt.imshow(predictions[i, :, :, 0] * 127.5 + 127.5, cmap='gray')
        plt.axis('off')
    
    plt.savefig('image_at_epoch_{:04d}.png'.format(epoch))
    plt.show()

๋ชจ๋ธ ํ›ˆ๋ จ

%%time
train(train_dataset, EPOCHS)


At every epoch you can check the images the generator produces and the time the epoch took.

# create a GIF
anim_file = 'gan.gif'

with imageio.get_writer(anim_file, mode='I') as writer:
    filenames = glob.glob('image*.png')
    filenames = sorted(filenames)
    last = -1
    for i,filename in enumerate(filenames):
        frame = 2*(i**0.5)
        if round(frame) > round(last):
            last = frame
        else:
            continue
        image = imageio.imread(filename)
        writer.append_data(image)
    image = imageio.imread(filename)
    writer.append_data(image)

import IPython
if IPython.version_info > (6,2,0,''):
    display.Image(filename=anim_file)


Using the imageio library, the images saved during training can be stitched together into a GIF file.

References

https://www.tensorflow.org/tutorials/generative/dcgan?hl=ko
https://velog.io/@wo7864/GAN-%EC%BD%94%EB%93%9C%EB%A5%BC-%ED%86%B5%ED%95%9C-%EC%9D%B4%ED%95%B41
