Advanced Learning Algorithm 7: TensorFlow and Keras

brandon · August 18, 2023

TensorFlow and Keras
TensorFlow is a machine learning package developed by Google. In 2019, Google integrated Keras into TensorFlow and released TensorFlow 2.0. Keras is a framework developed independently by François Chollet; it provides a simple, layer-centric interface to TensorFlow. This course uses the Keras interface.

Exercise 1: Implementing a Neural Network Using Keras & TensorFlow

# UNQ_C1
# GRADED CELL: Sequential model

import tensorflow as tf
from tensorflow.keras.models import Sequential

model = Sequential(
    [
        tf.keras.Input(shape=(400,)),    # specify input size: 400 features per example
        ### START CODE HERE ###
        tf.keras.layers.Dense(25, activation='sigmoid', name="L1"),   # hidden layer 1: 25 units
        tf.keras.layers.Dense(15, activation='sigmoid', name="L2"),   # hidden layer 2: 15 units
        tf.keras.layers.Dense(1,  activation='sigmoid', name="L3"),   # output layer: 1 unit
        ### END CODE HERE ###
    ], name="my_model"
)
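
To confirm that the three layers are wired up as intended, the model can be inspected and trained after it is built. The snippet below is not part of the graded cell; it assumes the training data X (shape (m, 400)) and binary labels y are already loaded, and the loss/optimizer settings are just one plausible choice for a single sigmoid output.

model.summary()   # per-layer params = (n_in + 1) * units: L1 = 10,025, L2 = 390, L3 = 16

model.compile(
    loss=tf.keras.losses.BinaryCrossentropy(),
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
)
model.fit(X, y, epochs=20)   # X: (m, 400) inputs, y: (m, 1) binary labels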

Exercise 2: Implementing a Neural Network Using NumPy

# UNQ_C2
# GRADED FUNCTION: my_dense

import numpy as np

def my_dense(a_in, W, b, g):
    """
    Computes a dense (fully connected) layer.
    Args:
      a_in (ndarray (n, )) : Data, 1 example
      W    (ndarray (n,j)) : Weight matrix, n features per unit, j units
      b    (ndarray (j, )) : bias vector, j units
      g                    : activation function (e.g. sigmoid, relu, ...)
    Returns
      a_out (ndarray (j,)) : j units
    """
    units = W.shape[1]
    a_out = np.zeros(units)
### START CODE HERE ###
    for j in range(units):              # compute one unit (one column of W) at a time
        w = W[:, j]                     # weights feeding into unit j
        z = np.dot(w, a_in) + b[j]      # pre-activation z_j = w . a_in + b_j
        a_out[j] = g(z)                 # apply the activation function
### END CODE HERE ###
    return a_out
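
To see the loop version in action outside the lab, it can be run on a tiny hand-made example. The sigmoid helper and the numbers below are purely illustrative and not from the assignment:

def sigmoid(z):
    # logistic activation, maps any real z into (0, 1)
    return 1 / (1 + np.exp(-z))

# 1 example with 2 features, a layer with 3 units
a_in = np.array([-2.0, 4.0])
W = np.array([[1.0, -3.0,  5.0],
              [2.0,  4.0, -6.0]])
b = np.array([-1.0, 1.0, 2.0])

print(my_dense(a_in, W, b, sigmoid))   # 3 activations, one per unit, each in (0, 1)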

Exercise 3: Faster Calculation with Matrix Multiplication

# UNQ_C3
# UNGRADED FUNCTION: my_dense_v

def my_dense_v(A_in, W, b, g):
    """
    Computes a dense (fully connected) layer for a batch of examples.
    Args:
      A_in (ndarray (m,n)) : Data, m examples, n features each
      W    (ndarray (n,j)) : Weight matrix, n features per unit, j units
      b    (ndarray (1,j)) : bias vector, j units
      g                    : activation function (e.g. sigmoid, relu, ...)
    Returns
      A_out (tf.Tensor or ndarray (m,j)) : m examples, j units
    """
### START CODE HERE ###
    Z = np.matmul(A_in, W) + b   # (m,n) @ (n,j) -> (m,j); b broadcasts across the m rows
    A_out = g(Z)                 # apply the activation element-wise
### END CODE HERE ###
    return A_out
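
The vectorized and loop versions should agree to floating-point precision, which makes for an easy sanity check. The sketch below reuses the W, b, and sigmoid defined in the earlier example and is likewise not part of the lab:

# A batch of 4 examples with 2 features each
A_in = np.array([[-2.0,  4.0],
                 [ 1.0,  0.5],
                 [ 0.0,  0.0],
                 [ 3.0, -1.0]])
b_row = b.reshape(1, -1)                               # my_dense_v expects b with shape (1, j)

A_vec  = my_dense_v(A_in, W, b_row, sigmoid)           # one matmul for the whole batch
A_loop = np.array([my_dense(x, W, b, sigmoid) for x in A_in])

print(np.allclose(A_vec, A_loop))                      # True: same (4, 3) activations

The speedup comes from replacing the per-unit Python loop with a single matrix multiplication, which NumPy dispatches to optimized routines over all m examples and j units at once.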