
Trying out TensorNetwork + Keras

Yuichiro Minato

2021/01/14 05:55

Tensors and Tensor Networks

A tensor is a generalization of linear quantities: once a basis is chosen, it can be represented as a multidimensional array. A tensor network decomposes a large tensor into smaller tensors connected as a network.

Notation

The notation changes with the number of legs. Viewing * as a node and - as an edge:

・One leg gives a vector.

---*
v_j = \begin{bmatrix} v_1\\ v_2\\ \vdots\\ v_n \end{bmatrix}

・Two legs give a matrix.

---*---
M_{ij} = \begin{bmatrix} M_{11} & \cdots & M_{1n}\\ \vdots & \ddots & \vdots\\ M_{m1} & \cdots & M_{mn} \end{bmatrix}

・Three legs give an order-3 tensor.

---*---
   |
T_{ijk} = \begin{bmatrix} T_{111} & \cdots & T_{1n1}\\ \vdots & \ddots & \vdots\\ T_{m11} & \cdots & T_{mn1} \end{bmatrix}, \begin{bmatrix} T_{112} & \cdots & T_{1n2}\\ \vdots & \ddots & \vdots\\ T_{m12} & \cdots & T_{mn2} \end{bmatrix}, \cdots
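
In NumPy terms (a quick aside of my own, not part of the original article), an order-3 tensor is simply a three-dimensional array, with one index per leg:

import numpy as np

# An order-3 tensor: one index per leg, dimensions 2 x 3 x 4 here.
T = np.ones((2, 3, 4))
print(T.ndim)   # 3, i.e. three legs
print(T.shape)  # (2, 3, 4)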

Contraction

Nodes are connected to each other by edges, and each connected edge is contracted.

・Vector and matrix
Contracting a vector with a matrix gives a vector.

--*--* = ----*
v'_i = \sum_j M_{ij} v_j

・Matrix and matrix
Contracting two matrices gives another matrix; the number of free legs does not change. (A code check follows the formula below.)

--*--*-- = --*--
M_{ik} = \sum_j A_{ij} B_{jk}
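
This matrix-matrix contraction can be verified with the tensornetwork library used later in this post; the following is a minimal sketch of my own, checking it against ordinary matrix multiplication:

import numpy as np
import tensornetwork as tn

# Contract A_ij B_jk over the shared index j.
a = tn.Node(2 * np.eye(3))    # A
b = tn.Node(np.ones((3, 3)))  # B
edge = a[1] ^ b[0]            # connect the j legs
m = tn.contract(edge)         # M_ik = sum_j A_ij B_jk
print(np.allclose(m.tensor, (2 * np.eye(3)) @ np.ones((3, 3))))  # True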

TensorNetwork in Practice

Since this is hard to picture in the abstract, let's try the open-source software provided by Google. Installation is via pip:

!pip install tensornetwork
Collecting tensornetwork
  Downloading tensornetwork-0.4.4-py3-none-any.whl (360 kB)
     |████████████████████████████████| 360 kB 5.6 MB/s eta 0:00:01
Collecting graphviz>=0.11.1
  Downloading graphviz-0.16-py2.py3-none-any.whl (19 kB)
Requirement already satisfied: opt-einsum>=2.3.0 in /opt/conda/lib/python3.8/site-packages (from tensornetwork) (3.3.0)
Requirement already satisfied: numpy>=1.17 in /opt/conda/lib/python3.8/site-packages (from tensornetwork) (1.19.1)
Requirement already satisfied: scipy>=1.1 in /opt/conda/lib/python3.8/site-packages (from tensornetwork) (1.4.1)
Requirement already satisfied: h5py>=2.9.0 in /opt/conda/lib/python3.8/site-packages (from tensornetwork) (2.10.0)
Requirement already satisfied: six in /opt/conda/lib/python3.8/site-packages (from h5py>=2.9.0->tensornetwork) (1.15.0)
Installing collected packages: graphviz, tensornetwork
Successfully installed graphviz-0.16 tensornetwork-0.4.4

That completes the installation.

The Basic Pattern

The basic pattern is to decide the shape of the tensor network and connect the nodes. A reference example is below.

import numpy as np
import tensornetwork as tn

a = tn.Node(np.ones((10,))) 
b = tn.Node(np.ones((10,)))
edge = a[0] ^ b[0]
final_node = tn.contract(edge)
print(final_node.tensor)
10.0

Step by Step

First, import the libraries.

import numpy as np
import tensornetwork as tn

Next, define the nodes. Here we define two vectors, a and b.

a = tn.Node(np.ones((10,))) 
b = tn.Node(np.ones((10,)))

Connect a and b with an edge.

edge = a[0] ^ b[0]

Then take the tensor contraction along that edge and print the result.

final_node = tn.contract(edge)
print(final_node.tensor)
10.0

Here, a and b are both 10-element vectors whose entries are all 1,

\begin{bmatrix} 1\\1\\\vdots\\1 \end{bmatrix}

so the computation yields
1 \cdot 1 + 1 \cdot 1 + \cdots + 1 \cdot 1 = 10,
which is a scalar quantity.
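
For comparison (my own sanity check, not in the original), the same contraction is just the inner product in plain NumPy:

import numpy as np

# Contracting two one-legged nodes is an inner product.
print(np.inner(np.ones(10), np.ones(10)))  # 10.0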

Vector and Matrix

Contracting the vector a with the matrix b:

a = tn.Node(np.ones((5))) 
b = tn.Node(np.ones((5,5)))
edge = a[0] ^ b[0]
final_node = tn.contract(edge)
print(final_node.tensor)
[5. 5. 5. 5. 5.]

The result is a vector.

\begin{bmatrix} 1&1&1&1&1\\ 1&1&1&1&1\\ 1&1&1&1&1\\ 1&1&1&1&1\\ 1&1&1&1&1 \end{bmatrix} \begin{bmatrix} 1\\1\\1\\1\\1 \end{bmatrix} = \begin{bmatrix} 5\\5\\5\\5\\5 \end{bmatrix}
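
The index expression v'_i = \sum_j M_{ij} v_j from the notation section can be checked directly with np.einsum (again my own addition):

import numpy as np

# The same contraction written in index notation.
print(np.einsum("ij,j->i", np.ones((5, 5)), np.ones(5)))  # [5. 5. 5. 5. 5.]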

Speeding up neural networks using TensorNetwork in Keras
Apparently TensorNetwork can be combined with Keras, and I wanted to try it. Apologies that we just run the example here, from the TensorFlow blog:
https://blog.tensorflow.org/2020/02/speeding-up-neural-networks-using-tensornetwork-in-keras.html

The basic idea is to decompose the computationally heavy weight matrices.

# Load the tools. TensorFlow 2.0 is used here.
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
import tensornetwork as tn

# Change the default backend from numpy to tensorflow
tn.set_default_backend("tensorflow")
class TNLayer(tf.keras.layers.Layer):

  def __init__(self):
    super(TNLayer, self).__init__()
    # Define the trainable variables of this layer.
    self.a_var = tf.Variable(tf.random.normal(
            shape=(32, 32, 2), stddev=1.0/32.0),
             name="a", trainable=True)
    self.b_var = tf.Variable(tf.random.normal(shape=(32, 32, 2), stddev=1.0/32.0),
                             name="b", trainable=True)
    self.bias = tf.Variable(tf.zeros(shape=(32, 32)), name="bias", trainable=True)

  def call(self, inputs):
    # Define the contraction so it can be computed in parallel.
    def f(input_vec, a_var, b_var, bias_var):
      # Reshape the input vector into a matrix.
      input_vec = tf.reshape(input_vec, (32,32))

      # Define the network.
      a = tn.Node(a_var)
      b = tn.Node(b_var)
      x_node = tn.Node(input_vec)
      a[1] ^ x_node[0]
      b[1] ^ x_node[1]
      a[2] ^ b[2]

      # Decompose the weight matrix into a and b:
      #   |     |
      #   a --- b
      #    \   /
      #      x

      # Perform the contraction.
      c = a @ x_node
      result = (c @ b).tensor

      # Add the bias.
      return result + bias_var

    # Use tf.vectorized_map for batch processing.
    result = tf.vectorized_map(
        lambda vec: f(vec, self.a_var, self.b_var, self.bias), inputs)
    return tf.nn.relu(tf.reshape(result, (-1, 1024)))
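
As a quick smoke test (my own addition; the shapes follow from the variables defined in TNLayer above), you can feed a dummy batch through the layer and confirm the output shape matches a Dense(1024) layer:

# Feed a dummy batch of 1024-dimensional vectors through TNLayer.
layer = TNLayer()
dummy = tf.random.normal((4, 1024))  # batch of 4
print(layer(dummy).shape)            # expected: (4, 1024)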

The model before tensor decomposition is:

Dense = tf.keras.layers.Dense
fc_model = tf.keras.Sequential(
    [
     tf.keras.Input(shape=(2,)),
     Dense(1024, activation=tf.nn.relu),
     Dense(1024, activation=tf.nn.relu),
     Dense(1, activation=None)])
fc_model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 1024)              3072      
_________________________________________________________________
dense_1 (Dense)              (None, 1024)              1049600   
_________________________________________________________________
dense_2 (Dense)              (None, 1)                 1025      
=================================================================
Total params: 1,053,697
Trainable params: 1,053,697
Non-trainable params: 0
_________________________________________________________________

There are about a million parameters. Replacing the middle layer with the tensor-network layer:

tn_model = tf.keras.Sequential(
    [
     tf.keras.Input(shape=(2,)),
     Dense(1024, activation=tf.nn.relu),
     # the Dense layer rewritten as an MPS
     TNLayer(),
     Dense(1, activation=None)])
tn_model.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_3 (Dense)              (None, 1024)              3072      
_________________________________________________________________
tn_layer (TNLayer)           (None, 1024)              5120      
_________________________________________________________________
dense_4 (Dense)              (None, 1)                 1025      
=================================================================
Total params: 9,217
Trainable params: 9,217
Non-trainable params: 0
_________________________________________________________________

The parameter count drops to about 9,200.
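
The TNLayer's 5,120 parameters can be read off directly from the variable shapes defined above: a_var and b_var hold 32 × 32 × 2 = 2,048 values each, plus a 32 × 32 = 1,024-element bias, giving 2 × 2,048 + 1,024 = 5,120. The dense layer it replaces has 1024 × 1024 + 1024 = 1,049,600 parameters.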

Training

Let's train the model.

X = np.concatenate([np.random.randn(20, 2) + np.array([3, 3]), 
             np.random.randn(20, 2) + np.array([-3, -3]), 
             np.random.randn(20, 2) + np.array([-3, 3]), 
             np.random.randn(20, 2) + np.array([3, -3]),])

Y = np.concatenate([np.ones((40)), -np.ones((40))])

tn_model.compile(optimizer="adam", loss="mean_squared_error")
tn_model.fit(X, Y, epochs=300, verbose=1)
Epoch 1/300
/opt/conda/lib/python3.8/site-packages/tensorflow/python/framework/indexed_slices.py:433: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.
  warnings.warn(
3/3 [==============================] - 0s 3ms/step - loss: 0.9859
Epoch 2/300
3/3 [==============================] - 0s 2ms/step - loss: 0.9265
Epoch 3/300
3/3 [==============================] - 0s 2ms/step - loss: 0.8774
...
Epoch 299/300
3/3 [==============================] - 0s 2ms/step - loss: 2.2113e-06
Epoch 300/300
3/3 [==============================] - 0s 2ms/step - loss: 2.3474e-06
<tensorflow.python.keras.callbacks.History at 0x7f42ac72c1f0>

The loss falls steadily from about 0.99 to around 2e-06 over the 300 epochs.

Let's plot the result.

# Plotting code, feel free to ignore.
h = 1.0
x_min, x_max = X[:, 0].min() - 5, X[:, 0].max() + 5
y_min, y_max = X[:, 1].min() - 5, X[:, 1].max() + 5
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))

# here "model" is your model's prediction (classification) function
Z = tn_model.predict(np.c_[xx.ravel(), yy.ravel()]) 

# Put the result into a color plot
Z = Z.reshape(xx.shape)
plt.contourf(xx, yy, Z)
plt.axis('off')

# Plot also the training points
plt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired)
<matplotlib.collections.PathCollection at 0x7f4297ea3220>
[Figure: contour plot of tn_model's predictions over the plane, with the training points overlaid]
