Wrong shape output with Conv1D layer

Question

I'm trying some experiments with a small autoencoder defined like this:

  input_layer = Input(shape=input_shape)
  x = Conv1D(8, kernel_size=5, activation='relu', padding='same')(input_layer)
  x = BatchNormalization()(x)
  encoded = Conv1D(4, kernel_size=5, activation='relu', padding='same')(x)
  x = Conv1D(8, kernel_size=5, activation='relu', padding='same')(encoded)
  x = BatchNormalization()(x)
  decoded = Conv1D(WINDOW_SIZE, kernel_size=5, activation='relu', padding='same')(x)

Here input_shape = 8192, and autoencoder.summary() returns:

  Layer (type)                                 Output Shape        Param #
  =================================================================
  input_1 (InputLayer)                         [(None, 8192, 1)]   0
  conv1d (Conv1D)                              (None, 8192, 8)     48
  batch_normalization (BatchNormalization)     (None, 8192, 8)     32
  conv1d_1 (Conv1D)                            (None, 8192, 4)     164
  conv1d_2 (Conv1D)                            (None, 8192, 8)     168
  batch_normalization_1 (BatchNormalization)   (None, 8192, 8)     32
  conv1d_3 (Conv1D)                            (None, 8192, 8192)  335,872
  =================================================================
  Total params: 336,316
  Trainable params: 336,284
  Non-trainable params: 32

Basically, I want an autoencoder that takes a vector of 8192 values and tries to predict another vector of 8192 values.

But when I run inference like this:

  predictions = autoencoder.predict(batch_data, batch_size=BATCH_SIZE)

I get this shape:

  >>> predictions[0].shape
  (8192, 8192)

What am I doing wrong? What can I change to get a vector of shape (8192,)?

Answer 1

Score: 1

Maybe something like this? In Conv1D, the first argument is the number of filters, which sets the size of the last (channel) dimension of the output, not the output length. With Conv1D(WINDOW_SIZE, ...) as the final layer, each of the 8192 timesteps gets 8192 channels, which is why predictions[0] comes out as (8192, 8192).
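
As a quick, standalone check (not part of the original answer) that the filters argument is what drives that last dimension:

  import tensorflow as tf

  x = tf.random.normal((1, 8192, 1))
  # filters=8192 reproduces the shape from the question: 8192 channels per timestep
  print(tf.keras.layers.Conv1D(8192, kernel_size=5, padding='same')(x).shape)  # (1, 8192, 8192)
  # filters=1 keeps a single value per timestep
  print(tf.keras.layers.Conv1D(1, kernel_size=5, padding='same')(x).shape)     # (1, 8192, 1)

With a single filter in the final layer, the decoder output becomes (None, 8192, 1):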

  import tensorflow as tf

  input_layer = tf.keras.layers.Input(shape=(8192, 1))
  x = tf.keras.layers.Conv1D(8, kernel_size=5, activation='relu', padding='same')(input_layer)
  x = tf.keras.layers.BatchNormalization()(x)
  encoded = tf.keras.layers.Conv1D(4, kernel_size=5, activation='relu', padding='same')(x)
  x = tf.keras.layers.Conv1D(8, kernel_size=5, activation='relu', padding='same')(encoded)
  x = tf.keras.layers.BatchNormalization()(x)
  decoded = tf.keras.layers.Conv1D(1, kernel_size=5, activation='relu', padding='same')(x)
  model = tf.keras.Model(input_layer, decoded)
  model.summary()

  Model: "model_1"
  _________________________________________________________________
  Layer (type)                                 Output Shape        Param #
  =================================================================
  input_2 (InputLayer)                         [(None, 8192, 1)]   0
  conv1d_4 (Conv1D)                            (None, 8192, 8)     48
  batch_normalization_2 (BatchNormalization)   (None, 8192, 8)     32
  conv1d_5 (Conv1D)                            (None, 8192, 4)     164
  conv1d_6 (Conv1D)                            (None, 8192, 8)     168
  batch_normalization_3 (BatchNormalization)   (None, 8192, 8)     32
  conv1d_7 (Conv1D)                            (None, 8192, 1)     41
  =================================================================
  Total params: 485
  Trainable params: 453
  Non-trainable params: 32
  _________________________________________________________________
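
For completeness, here is a minimal training/inference sketch (not from the original answer): BATCH_SIZE and batch_data below are hypothetical stand-ins with the assumed shape (num_samples, 8192, 1); replace them with the question's real arrays.

  import numpy as np

  BATCH_SIZE = 32                                              # hypothetical value
  batch_data = np.random.rand(64, 8192, 1).astype('float32')   # stand-in for real data

  # Train as an autoencoder: the input is also the target
  model.compile(optimizer='adam', loss='mse')
  model.fit(batch_data, batch_data, batch_size=BATCH_SIZE, epochs=1)

  predictions = model.predict(batch_data, batch_size=BATCH_SIZE)
  print(predictions.shape)                 # (64, 8192, 1)
  print(np.squeeze(predictions[0]).shape)  # (8192,)

If you prefer the model itself to return flat vectors, adding a Reshape((8192,)) layer after the final Conv1D would make predict return (batch, 8192) directly.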