
Purpose-Specific Execution of Tensor Networks

Yuichiro Minato

2023/12/07 07:03

#TensorNetwork

Running quantum computations with cuTensorNet takes a few tricks, so in this article I would like to go over them.

Tensors

A tensor network consists of nodes and edges, and each node and edge has a specific meaning.

A single node represents a scalar.

 *

A single node with one leg represents a vector.

 *-

A single node with two legs represents a matrix.

-*-

A node with more legs, say k of them, represents a rank-k tensor.

-*-
 |
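
As a quick NumPy sanity check (a minimal sketch of my own, not from the original article), the number of legs corresponds to the number of array axes, ndim:

import numpy as np

s = np.array(3.0)        # scalar: a node with no legs
v = np.ones(4)           # vector: one leg
m = np.ones((4, 4))      # matrix: two legs
t = np.ones((4, 4, 4))   # rank-3 tensor: three legs

print(s.ndim, v.ndim, m.ndim, t.ndim)  # 0 1 2 3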

Contraction

Computations are carried out by connecting different tensors together and contracting them.

Example: a computation involving a vector and a matrix yields a vector.

-*- -*
-*--*
-*

Example: a computation involving two matrices yields a matrix.

-*- -*-
-*--*-
-*-
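
The same two contractions can also be written with np.einsum, where each repeated index plays the role of a connected edge (a minimal sketch of my own; the shapes are arbitrary):

import numpy as np

M = np.ones((3, 4))
v = np.ones(4)
A = np.ones((3, 4))
B = np.ones((4, 5))

# matrix-vector contraction over the shared index j -> vector of shape (3,)
w = np.einsum('ij,j->i', M, v)

# matrix-matrix contraction over the shared index j -> matrix of shape (3, 5)
C = np.einsum('ij,jk->ik', A, B)

print(w.shape, C.shape)  # (3,) (3, 5)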

Decomposition

Tensors can also be decomposed using various algorithms; one of the most representative is the SVD (singular value decomposition).

-*-
-*- -*-
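
As a minimal NumPy sketch (my own illustration), the SVD splits a matrix into two connected tensors; absorbing the singular values into the left factor is an arbitrary choice here:

import numpy as np

M = np.random.randn(6, 6)

# SVD: M = U @ diag(S) @ Vh
U, S, Vh = np.linalg.svd(M)

# absorb the singular values into the left factor to obtain two nodes
left = U * S    # same as U @ np.diag(S)
right = Vh

print(np.allclose(left @ right, M))  # True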

Google TensorNetwork Library

Let's try this out using the library.

https://github.com/google/TensorNetwork

!pip install tensornetwork
Requirement already satisfied: tensornetwork in /opt/conda/lib/python3.10/site-packages (0.4.6)
Requirement already satisfied: numpy>=1.17 in /opt/conda/lib/python3.10/site-packages (from tensornetwork) (1.23.5)
Requirement already satisfied: graphviz>=0.11.1 in /opt/conda/lib/python3.10/site-packages (from tensornetwork) (0.20.1)
Requirement already satisfied: opt-einsum>=2.3.0 in /opt/conda/lib/python3.10/site-packages (from tensornetwork) (3.3.0)
Requirement already satisfied: h5py>=2.9.0 in /opt/conda/lib/python3.10/site-packages (from tensornetwork) (3.8.0)
Requirement already satisfied: scipy>=1.1 in /opt/conda/lib/python3.10/site-packages (from tensornetwork) (1.10.1)

First, load the library and prepare two vectors.

import numpy as np
import tensornetwork as tn

a = tn.Node(np.ones((10,))) 
b = tn.Node(np.ones((10,)))

Next, connect the two nodes with an edge between them.

edge = a[0] ^ b[0]

Finally, specify the edge and contract it, and the computation is complete.

final_node = tn.contract(edge)
print(final_node.tensor)
10.0

Contracting a vector and a matrix into a vector

a = tn.Node(np.ones((5))) 
b = tn.Node(np.ones((5,5)))
edge = a[0] ^ b[0]
final_node = tn.contract(edge)
print(final_node.tensor)
[5. 5. 5. 5. 5.]

Contracting a matrix and a matrix into a matrix

a = tn.Node(np.ones((5,3))) 
b = tn.Node(np.ones((5,2)))
edge = a[0] ^ b[0]
final_node = tn.contract(edge)
print(final_node.tensor)
[[5. 5.]
 [5. 5.]
 [5. 5.]]

Tensor Networks as a Backend for Quantum Circuits

We can simulate a quantum circuit by mapping qubits onto the vectors of a tensor network and applying matrices and tensors as quantum gates.
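
For intuition, here is a minimal sketch (my own, reusing the tensornetwork library from above) that applies a Hadamard gate to the |0> state by contracting a single edge:

import numpy as np
import tensornetwork as tn

# |0> state as a one-legged node
qubit = tn.Node(np.array([1.0, 0.0]))

# Hadamard gate as a two-legged node
H = tn.Node(np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2))

# connect the gate's input leg to the qubit and contract
edge = H[1] ^ qubit[0]
print(tn.contract(edge).tensor)  # [0.70710678 0.70710678]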

Here we will use quimb to simulate quantum circuits.

!pip install quimb
Requirement already satisfied: quimb in /opt/conda/lib/python3.10/site-packages (1.4.0)
Requirement already satisfied: numpy>=1.17 in /opt/conda/lib/python3.10/site-packages (from quimb) (1.23.5)
Requirement already satisfied: scipy>=1.0.0 in /opt/conda/lib/python3.10/site-packages (from quimb) (1.10.1)
Requirement already satisfied: numba>=0.39 in /opt/conda/lib/python3.10/site-packages (from quimb) (0.57.0)
Requirement already satisfied: psutil>=4.3.1 in /opt/conda/lib/python3.10/site-packages (from quimb) (5.9.5)
Requirement already satisfied: cytoolz>=0.8.0 in /opt/conda/lib/python3.10/site-packages (from quimb) (0.12.1)
Requirement already satisfied: tqdm>=4 in /opt/conda/lib/python3.10/site-packages (from quimb) (4.65.0)
Requirement already satisfied: toolz>=0.8.0 in /opt/conda/lib/python3.10/site-packages (from cytoolz>=0.8.0->quimb) (0.12.0)
Requirement already satisfied: llvmlite<0.41,>=0.40.0dev0 in /opt/conda/lib/python3.10/site-packages (from numba>=0.39->quimb) (0.40.0)
#%config InlineBackend.figure_formats = ['svg']
import quimb as qu
import quimb.tensor as qtn

from collections import Counter

We prepare 80 qubits and create a GHZ state.

# number of qubits
N = 80

# initialize the circuit
circ = qtn.Circuit(N)

# apply an H gate to the first qubit
circ.h(0)

# make a GHZ state using a chain of CX gates
for i in range(N-1):
  circ.cx(i, i+1)

# draw one sample from the quantum state
Counter(circ.sample(1))
Counter({'00000000000000000000000000000000000000000000000000000000000000000000000000000000': 1})

Now take samples using the quantum circuit we have built.

%%time

# get 100 samples
Counter(circ.sample(100))
CPU times: user 750 ms, sys: 12.1 ms, total: 762 ms
Wall time: 823 ms
Counter({'00000000000000000000000000000000000000000000000000000000000000000000000000000000': 54,
         '11111111111111111111111111111111111111111111111111111111111111111111111111111111': 46})

Let's build a larger circuit.

circ = qtn.Circuit(10)

for i in range(10):
  circ.apply_gate('H', i, gate_round=0)

for r in range(1, 9):
  # even pairs
  for i in range(0, 10, 2):
    circ.apply_gate('CNOT', i, i + 1, gate_round=r)
  # Z-rotations
  for i in range(10):
    circ.apply_gate('RZ', 1.234, i, gate_round=r)
  # odd pairs
  for i in range(1, 9, 2):
    circ.apply_gate('CZ', i, i + 1, gate_round=r)
  # X-rotations
  for i in range(10):
    circ.apply_gate('RX', 1.234, i, gate_round=r)

# h gate
for i in range(10):
  circ.apply_gate('H', i, gate_round=r + 1)
circ
<Circuit(n=10, num_gates=252, gate_opts={'contract': 'auto-split-gate', 'propagate_tags': 'register'})>

Next, let's draw the quantum circuit.

circ.psi.draw(color=['PSI0', 'H', 'CNOT', 'RZ', 'RX', 'CZ'])
<Figure size 600x600 with 1 Axes>

[Figure: tensor-network drawing of the circuit, with nodes colored by gate type (PSI0, H, CNOT, RZ, RX, CZ)]

Here one would expect each quantum gate to be represented by a matrix, but the unitary matrices are decomposed into rank-3 tensors (the 'auto-split-gate' option seen above); in a tensor network, the unitary matrices are not always used as-is. Also, the qubit state is not one big state vector: each qubit's vector is drawn independently.

State Vectors, Amplitudes, Expectation Values, and Sampling

When computing a quantum circuit with a tensor network, you need to decide on the objective in advance: the state vector, a probability amplitude, an expectation value, or samples.

State vector

Contracting all the nodes yields a single vector.

circ.to_dense()
[[ 0.022278+0.044826j]
 [ 0.047567+0.001852j]
 [-0.028239+0.01407j ]
 ...
 [ 0.016   -0.008447j]
 [-0.025437-0.015225j]
 [-0.033285-0.030653j]]

Amplitude

By specifying a bit string and connecting the corresponding tensors, we can obtain the probability amplitude for that bit string.

circ.amplitude('0000011111')
(0.004559038599179494+0.02661946964089579j)
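
The probability of observing this bit string is the squared magnitude of the amplitude; as a quick check:

prob = abs(circ.amplitude('0000011111'))**2
print(prob)  # probability of sampling '0000011111'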

Expectation value

By operating on the same quantum circuit with a single term of the corresponding Hamiltonian inserted, we can obtain its expectation value.

circ.local_expectation(qu.pauli('X') & qu.pauli('Z'), (4, 5))
(-0.07785735654723336+3.903127820947816e-17j)

Sampling

We can also perform sampling.

for item in circ.sample(1):
    print(item)
1110101110

Speeding Up Neural Networks Using TensorNetwork in Keras

From this Google article, we learn a technique for decomposing matrices with tensor networks and use it to speed up a neural network.

https://blog.tensorflow.org/2020/02/speeding-up-neural-networks-using-tensornetwork-in-keras.html

An ordinary fully connected neural network

!pip install tensorflow
Successfully installed absl-py-2.0.0 astunparse-1.6.3 flatbuffers-23.5.26 gast-0.5.4 google-auth-oauthlib-1.1.0 google-pasta-0.2.0 keras-2.15.0 libclang-16.0.6 markdown-3.5.1 ml-dtypes-0.2.0 requests-oauthlib-1.3.1 tensorboard-2.15.1 tensorboard-data-server-0.7.2 tensorflow-2.15.0 tensorflow-estimator-2.15.0 tensorflow-io-gcs-filesystem-0.34.0 termcolor-2.4.0 wrapt-1.14.1
import tensorflow as tf
import tensornetwork as tn
import matplotlib.pyplot as plt

tn.set_default_backend("tensorflow")

Next, we create a TNLayer that decomposes the layer's weight matrix into two nodes named "a" and "b".

class TNLayer(tf.keras.layers.Layer):
 
  def __init__(self):
    super(TNLayer, self).__init__()
    # Create the variables for the layer.
    self.a_var = tf.Variable(tf.random.normal(
            shape=(32, 32, 2), stddev=1.0/32.0),
             name="a", trainable=True)
    self.b_var = tf.Variable(tf.random.normal(shape=(32, 32, 2), stddev=1.0/32.0),
                             name="b", trainable=True)
    self.bias = tf.Variable(tf.zeros(shape=(32, 32)), name="bias", trainable=True)
 
  def call(self, inputs):
    # Define the contraction.
    # We break it out so we can parallelize a batch using
    # tf.vectorized_map (see below).
    def f(input_vec, a_var, b_var, bias_var):
      # Reshape to a matrix instead of a vector.
      input_vec = tf.reshape(input_vec, (32,32))
 
      # Now we create the network.
      a = tn.Node(a_var, backend="tensorflow")
      b = tn.Node(b_var, backend="tensorflow")
      x_node = tn.Node(input_vec, backend="tensorflow")
      a[1] ^ x_node[0]
      b[1] ^ x_node[1]
      a[2] ^ b[2]
 
      # The TN should now look like this
      #   |     |
      #   a --- b
      #    \   /
      #      x
 
      # Now we begin the contraction.
      c = a @ x_node
      result = (c @ b).tensor
 
      # To make the code shorter, we could also have used ncon.
      # The few lines above are equivalent to:
      # result = tn.ncon([input_vec, a_var, b_var], [[1, 2], [-1, 1, 3], [-2, 2, 3]])
 
      # Finally, add bias.
      return result + bias_var
  
    # To deal with a batch of items, we can use the tf.vectorized_map
    # function.
    # https://www.tensorflow.org/api_docs/python/tf/vectorized_map
    result = tf.vectorized_map(
        lambda vec: f(vec, self.a_var, self.b_var, self.bias), inputs)
    return tf.nn.relu(tf.reshape(result, (-1, 1024)))

First, let's look at a model with two dense layers of 1024 nodes each.

Dense = tf.keras.layers.Dense
fc_model = tf.keras.Sequential(
    [
     tf.keras.Input(shape=(2,)),
     Dense(1024, activation=tf.nn.swish),
     Dense(1024, activation=tf.nn.swish),
     Dense(1, activation=None)])
fc_model.summary()
Model: "sequential_5"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense_13 (Dense)            (None, 1024)              3072      
                                                                 
 dense_14 (Dense)            (None, 1024)              1049600   
                                                                 
 dense_15 (Dense)            (None, 1)                 1025      
                                                                 
=================================================================
Total params: 1053697 (4.02 MB)
Trainable params: 1053697 (4.02 MB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________

Now we replace the second dense layer with the TN layer.

tn_model = tf.keras.Sequential(
  [
    tf.keras.Input(shape=(2,)),
    Dense(1024, activation=tf.nn.relu),
    # Here use a TN layer instead of the dense layer.
    TNLayer(),
    Dense(1, activation=None)
  ]
)
tn_model.summary()

Model: "sequential_6"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense_16 (Dense)            (None, 1024)              3072      
                                                                 
 tn_layer_2 (TNLayer)        (None, 1024)              5120      
                                                                 
 dense_17 (Dense)            (None, 1)                 1025      
                                                                 
=================================================================
Total params: 9217 (36.00 KB)
Trainable params: 9217 (36.00 KB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________

We can confirm that the number of parameters has dropped sharply, from 1,053,697 to 9,217.
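
The counts in the two summaries can be reproduced by hand (a quick check; the shapes come from the TNLayer definition above):

# second dense layer of fc_model: 1024x1024 weights plus 1024 biases
dense_params = 1024 * 1024 + 1024        # 1,049,600

# TNLayer: two (32, 32, 2) cores plus a (32, 32) bias
tn_params = 2 * (32 * 32 * 2) + 32 * 32  # 5,120

print(dense_params, tn_params)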

Next, we proceed with training.

X = np.concatenate([np.random.randn(20, 2) + np.array([3, 3]),
                    np.random.randn(20, 2) + np.array([-3, -3]),
                    np.random.randn(20, 2) + np.array([-3, 3]),
                    np.random.randn(20, 2) + np.array([3, -3])])

Y = np.concatenate([np.ones((40)), -np.ones((40))])

First, train the fully connected model.

fc_model.compile(optimizer="adam", loss="mean_squared_error")
fc_model.fit(X, Y, epochs=300, verbose=1)
Epoch 1/300
3/3 [==============================] - 1s 18ms/step - loss: 0.2224
Epoch 2/300
3/3 [==============================] - 0s 27ms/step - loss: 0.1684
Epoch 3/300
3/3 [==============================] - 0s 31ms/step - loss: 0.1041
...
Epoch 299/300
3/3 [==============================] - 0s 20ms/step - loss: 0.0055
Epoch 300/300
3/3 [==============================] - 0s 20ms/step - loss: 0.0058
<keras.src.callbacks.History at 0x7f61271938e0>

Next, we train the tensor network model.

tn_model.compile(optimizer="adam", loss="mean_squared_error")
tn_model.fit(X, Y, epochs=300, verbose=1)
Epoch 1/300
3/3 [==============================] - 1s 6ms/step - loss: 0.0076
Epoch 2/300
3/3 [==============================] - 0s 5ms/step - loss: 0.0063
Epoch 3/300
3/3 [==============================] - 0s 6ms/step - loss: 0.0047
Epoch 4/300
3/3 [==============================] - 0s 5ms/step - loss: 0.0043
Epoch 5/300
3/3 [==============================] - 0s 6ms/step - loss: 0.0035
Epoch 6/300
3/3 [==============================] - 0s 5ms/step - loss: 0.0019
Epoch 7/300
3/3 [==============================] - 0s 6ms/step - loss: 0.0017
Epoch 8/300
3/3 [==============================] - 0s 5ms/step - loss: 0.0012
Epoch 9/300
3/3 [==============================] - 0s 8ms/step - loss: 7.3864e-04
Epoch 10/300
3/3 [==============================] - 0s 5ms/step - loss: 8.6364e-04
Epoch 11/300
3/3 [==============================] - 0s 5ms/step - loss: 6.0577e-04
Epoch 12/300
3/3 [==============================] - 0s 5ms/step - loss: 5.7394e-04
Epoch 13/300
3/3 [==============================] - 0s 5ms/step - loss: 5.4760e-04
Epoch 14/300
3/3 [==============================] - 0s 7ms/step - loss: 3.9549e-04
Epoch 15/300
3/3 [==============================] - 0s 5ms/step - loss: 3.9470e-04
Epoch 16/300
3/3 [==============================] - 0s 5ms/step - loss: 4.2155e-04
Epoch 17/300
3/3 [==============================] - 0s 4ms/step - loss: 2.8198e-04
Epoch 18/300
3/3 [==============================] - 0s 4ms/step - loss: 2.2104e-04
Epoch 19/300
3/3 [==============================] - 0s 4ms/step - loss: 1.9206e-04
Epoch 20/300
3/3 [==============================] - 0s 4ms/step - loss: 1.5597e-04
Epoch 21/300
3/3 [==============================] - 0s 4ms/step - loss: 1.1755e-04
Epoch 22/300
3/3 [==============================] - 0s 4ms/step - loss: 1.6133e-04
Epoch 23/300
3/3 [==============================] - 0s 4ms/step - loss: 1.2020e-04
Epoch 24/300
3/3 [==============================] - 0s 4ms/step - loss: 1.1710e-04
Epoch 25/300
3/3 [==============================] - 0s 4ms/step - loss: 1.3548e-04
Epoch 26/300
3/3 [==============================] - 0s 4ms/step - loss: 1.0694e-04
Epoch 27/300
3/3 [==============================] - 0s 4ms/step - loss: 8.0268e-05
Epoch 28/300
3/3 [==============================] - 0s 5ms/step - loss: 8.6918e-05
Epoch 29/300
3/3 [==============================] - 0s 4ms/step - loss: 8.8316e-05
Epoch 30/300
3/3 [==============================] - 0s 4ms/step - loss: 7.1958e-05
Epoch 31/300
3/3 [==============================] - 0s 7ms/step - loss: 6.2320e-05
Epoch 32/300
3/3 [==============================] - 0s 5ms/step - loss: 7.7821e-05
Epoch 33/300
3/3 [==============================] - 0s 9ms/step - loss: 5.1799e-05
Epoch 34/300
3/3 [==============================] - 0s 3ms/step - loss: 5.4159e-05
Epoch 35/300
3/3 [==============================] - 0s 4ms/step - loss: 3.0650e-05
Epoch 36/300
3/3 [==============================] - 0s 5ms/step - loss: 2.9692e-05
Epoch 37/300
3/3 [==============================] - 0s 5ms/step - loss: 2.6756e-05
Epoch 38/300
3/3 [==============================] - 0s 5ms/step - loss: 2.6074e-05
Epoch 39/300
3/3 [==============================] - 0s 5ms/step - loss: 2.8186e-05
Epoch 40/300
3/3 [==============================] - 0s 5ms/step - loss: 2.3854e-05
Epoch 41/300
3/3 [==============================] - 0s 4ms/step - loss: 1.6029e-05
Epoch 42/300
3/3 [==============================] - 0s 4ms/step - loss: 2.6330e-05
Epoch 43/300
3/3 [==============================] - 0s 4ms/step - loss: 2.0258e-05
Epoch 44/300
3/3 [==============================] - 0s 4ms/step - loss: 2.4014e-05
Epoch 45/300
3/3 [==============================] - 0s 5ms/step - loss: 1.8781e-05
Epoch 46/300
3/3 [==============================] - 0s 5ms/step - loss: 1.5471e-05
Epoch 47/300
3/3 [==============================] - 0s 5ms/step - loss: 1.2254e-05
Epoch 48/300
3/3 [==============================] - 0s 5ms/step - loss: 1.2828e-05
Epoch 49/300
3/3 [==============================] - 0s 4ms/step - loss: 9.6204e-06
Epoch 50/300
3/3 [==============================] - 0s 5ms/step - loss: 9.3080e-06
Epoch 51/300
3/3 [==============================] - 0s 5ms/step - loss: 2.0695e-05
Epoch 52/300
3/3 [==============================] - 0s 6ms/step - loss: 1.8972e-05
Epoch 53/300
3/3 [==============================] - 0s 5ms/step - loss: 1.0677e-05
Epoch 54/300
3/3 [==============================] - 0s 5ms/step - loss: 1.3067e-05
Epoch 55/300
3/3 [==============================] - 0s 7ms/step - loss: 1.3889e-05
Epoch 56/300
3/3 [==============================] - 0s 9ms/step - loss: 1.1804e-05
Epoch 57/300
3/3 [==============================] - 0s 9ms/step - loss: 1.0899e-05
Epoch 58/300
3/3 [==============================] - 0s 6ms/step - loss: 1.4515e-05
Epoch 59/300
3/3 [==============================] - 0s 6ms/step - loss: 1.3639e-05
Epoch 60/300
3/3 [==============================] - 0s 6ms/step - loss: 1.8318e-05
Epoch 61/300
3/3 [==============================] - 0s 7ms/step - loss: 1.6042e-05
Epoch 62/300
3/3 [==============================] - 0s 6ms/step - loss: 1.4876e-05
Epoch 63/300
3/3 [==============================] - 0s 5ms/step - loss: 1.4517e-05
Epoch 64/300
3/3 [==============================] - 0s 6ms/step - loss: 1.1614e-05
Epoch 65/300
3/3 [==============================] - 0s 7ms/step - loss: 1.6538e-05
Epoch 66/300
3/3 [==============================] - 0s 5ms/step - loss: 1.6504e-05
Epoch 67/300
3/3 [==============================] - 0s 6ms/step - loss: 1.8632e-05
Epoch 68/300
3/3 [==============================] - 0s 4ms/step - loss: 1.4959e-05
Epoch 69/300
3/3 [==============================] - 0s 5ms/step - loss: 1.7033e-05
Epoch 70/300
3/3 [==============================] - 0s 5ms/step - loss: 8.9530e-06
Epoch 71/300
3/3 [==============================] - 0s 5ms/step - loss: 1.4874e-05
Epoch 72/300
3/3 [==============================] - 0s 5ms/step - loss: 1.2183e-05
Epoch 73/300
3/3 [==============================] - 0s 6ms/step - loss: 7.0955e-06
Epoch 74/300
3/3 [==============================] - 0s 6ms/step - loss: 9.4228e-06
Epoch 75/300
3/3 [==============================] - 0s 6ms/step - loss: 1.1546e-05
Epoch 76/300
3/3 [==============================] - 0s 6ms/step - loss: 1.3303e-05
Epoch 77/300
3/3 [==============================] - 0s 7ms/step - loss: 1.2617e-05
Epoch 78/300
3/3 [==============================] - 0s 5ms/step - loss: 1.1051e-05
Epoch 79/300
3/3 [==============================] - 0s 4ms/step - loss: 1.3420e-05
Epoch 80/300
3/3 [==============================] - 0s 5ms/step - loss: 2.3272e-05
Epoch 81/300
3/3 [==============================] - 0s 5ms/step - loss: 1.7898e-05
Epoch 82/300
3/3 [==============================] - 0s 10ms/step - loss: 1.6887e-05
Epoch 83/300
3/3 [==============================] - 0s 5ms/step - loss: 1.8059e-05
Epoch 84/300
3/3 [==============================] - 0s 16ms/step - loss: 7.3858e-06
Epoch 85/300
3/3 [==============================] - 0s 5ms/step - loss: 8.7746e-06
Epoch 86/300
3/3 [==============================] - 0s 6ms/step - loss: 1.3572e-05
Epoch 87/300
3/3 [==============================] - 0s 6ms/step - loss: 1.7792e-05
Epoch 88/300
3/3 [==============================] - 0s 5ms/step - loss: 1.8233e-05
Epoch 89/300
3/3 [==============================] - 0s 6ms/step - loss: 3.0893e-05
Epoch 90/300
3/3 [==============================] - 0s 4ms/step - loss: 2.7913e-05
Epoch 91/300
3/3 [==============================] - 0s 4ms/step - loss: 2.5701e-05
Epoch 92/300
3/3 [==============================] - 0s 5ms/step - loss: 2.3548e-05
Epoch 93/300
3/3 [==============================] - 0s 4ms/step - loss: 2.0266e-05
Epoch 94/300
3/3 [==============================] - 0s 5ms/step - loss: 4.1517e-05
Epoch 95/300
3/3 [==============================] - 0s 5ms/step - loss: 4.6563e-05
Epoch 96/300
3/3 [==============================] - 0s 5ms/step - loss: 4.3216e-05
Epoch 97/300
3/3 [==============================] - 0s 5ms/step - loss: 4.8277e-05
Epoch 98/300
3/3 [==============================] - 0s 8ms/step - loss: 3.7540e-05
Epoch 99/300
3/3 [==============================] - 0s 5ms/step - loss: 4.3271e-05
Epoch 100/300
3/3 [==============================] - 0s 8ms/step - loss: 3.9400e-05
Epoch 101/300
3/3 [==============================] - 0s 5ms/step - loss: 2.1411e-05
Epoch 102/300
3/3 [==============================] - 0s 5ms/step - loss: 2.0404e-05
Epoch 103/300
3/3 [==============================] - 0s 6ms/step - loss: 1.6744e-05
Epoch 104/300
3/3 [==============================] - 0s 10ms/step - loss: 1.2537e-05
Epoch 105/300
3/3 [==============================] - 0s 17ms/step - loss: 6.8882e-06
Epoch 106/300
3/3 [==============================] - 0s 8ms/step - loss: 8.0056e-06
Epoch 107/300
3/3 [==============================] - 0s 4ms/step - loss: 4.5580e-06
Epoch 108/300
3/3 [==============================] - 0s 5ms/step - loss: 4.1776e-06
Epoch 109/300
3/3 [==============================] - 0s 5ms/step - loss: 3.1392e-06
Epoch 110/300
3/3 [==============================] - 0s 5ms/step - loss: 1.9955e-06
Epoch 111/300
3/3 [==============================] - 0s 5ms/step - loss: 2.9791e-06
Epoch 112/300
3/3 [==============================] - 0s 6ms/step - loss: 3.0803e-06
Epoch 113/300
3/3 [==============================] - 0s 6ms/step - loss: 3.8013e-06
Epoch 114/300
3/3 [==============================] - 0s 5ms/step - loss: 3.3294e-06
Epoch 115/300
3/3 [==============================] - 0s 8ms/step - loss: 3.0961e-06
Epoch 116/300
3/3 [==============================] - 0s 4ms/step - loss: 1.6956e-06
Epoch 117/300
3/3 [==============================] - 0s 10ms/step - loss: 1.9529e-06
Epoch 118/300
3/3 [==============================] - 0s 8ms/step - loss: 1.4603e-06
Epoch 119/300
3/3 [==============================] - 0s 6ms/step - loss: 1.4111e-06
Epoch 120/300
3/3 [==============================] - 0s 6ms/step - loss: 1.4436e-06
Epoch 121/300
3/3 [==============================] - 0s 6ms/step - loss: 2.0584e-06
Epoch 122/300
3/3 [==============================] - 0s 5ms/step - loss: 3.3954e-06
Epoch 123/300
3/3 [==============================] - 0s 5ms/step - loss: 5.7160e-06
Epoch 124/300
3/3 [==============================] - 0s 6ms/step - loss: 9.8115e-06
Epoch 125/300
3/3 [==============================] - 0s 6ms/step - loss: 9.5480e-06
Epoch 126/300
3/3 [==============================] - 0s 7ms/step - loss: 1.0243e-05
Epoch 127/300
3/3 [==============================] - 0s 4ms/step - loss: 7.0703e-06
Epoch 128/300
3/3 [==============================] - 0s 6ms/step - loss: 1.2463e-05
Epoch 129/300
3/3 [==============================] - 0s 6ms/step - loss: 1.9612e-05
Epoch 130/300
3/3 [==============================] - 0s 5ms/step - loss: 2.8438e-05
... (epochs 131-299 omitted; the loss fluctuates between roughly 3e-6 and 2e-3) ...
Epoch 300/300
3/3 [==============================] - 0s 4ms/step - loss: 5.9548e-06
<keras.src.callbacks.History at 0x7f6126a91750>

Let's check the prediction results.
Plot of the fully connected model's results

h = 1.0
x_min, x_max = X[:, 0].min() - 5, X[:, 0].max() + 5
y_min, y_max = X[:, 1].min() - 5, X[:, 1].max() + 5
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))

# here "model" is your model's prediction (classification) function
Z = fc_model.predict(np.c_[xx.ravel(), yy.ravel()])

# Put the result into a color plot
Z = Z.reshape(xx.shape)
plt.contourf(xx, yy, Z)
plt.axis('off')

# Plot also the training points
plt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired)
13/13 [==============================] - 0s 6ms/step
<matplotlib.collections.PathCollection at 0x7f6126d13730>
<Figure size 640x480 with 1 Axes>

Plot of the tensor network model's results

h = 1.0
x_min, x_max = X[:, 0].min() - 5, X[:, 0].max() + 5
y_min, y_max = X[:, 1].min() - 5, X[:, 1].max() + 5
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))

# here "model" is your model's prediction (classification) function
Z = tn_model.predict(np.c_[xx.ravel(), yy.ravel()])

# Put the result into a color plot
Z = Z.reshape(xx.shape)
plt.contourf(xx, yy, Z)
plt.axis('off')

# Plot also the training points
plt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired)
13/13 [==============================] - 0s 2ms/step
<matplotlib.collections.PathCollection at 0x7f6126ebbe80>
<Figure size 640x480 with 1 Axes>
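
Beyond visual inspection, it can be useful to compare the two models numerically. Below is a quick training-set error check, added here as a sketch; it assumes X and Y are the arrays used above and that both models were trained with a mean-squared-error style loss.

# Compare the training-set mean squared error of the two models (a sketch).
fc_pred = fc_model.predict(X)
tn_pred = tn_model.predict(X)
print('fully connected MSE:', np.mean((fc_pred.ravel() - np.asarray(Y).ravel())**2))
print('tensor network MSE: ', np.mean((tn_pred.ravel() - np.asarray(Y).ravel())**2))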

Quantum circuits with PyTorch (bonus)

!pip install torch
Collecting torch
  Downloading torch-2.1.1-cp310-cp310-manylinux1_x86_64.whl (670.2 MB)
Collecting filelock (from torch)
  Downloading filelock-3.13.1-py3-none-any.whl (11 kB)
Requirement already satisfied: typing-extensions in /opt/conda/lib/python3.10/site-packages (from torch) (4.5.0)
Requirement already satisfied: sympy in /opt/conda/lib/python3.10/site-packages (from torch) (1.12)
Requirement already satisfied: networkx in /opt/conda/lib/python3.10/site-packages (from torch) (2.8.8)
Requirement already satisfied: jinja2 in /opt/conda/lib/python3.10/site-packages (from torch) (3.1.2)
Requirement already satisfied: fsspec in /opt/conda/lib/python3.10/site-packages (from torch) (2023.5.0)
Collecting nvidia-cuda-nvrtc-cu12==12.1.105 (from torch)
  Downloading nvidia_cuda_nvrtc_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (23.7 MB)
Collecting nvidia-cuda-runtime-cu12==12.1.105 (from torch)
  Downloading nvidia_cuda_runtime_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (823 kB)
Collecting nvidia-cuda-cupti-cu12==12.1.105 (from torch)
  Downloading nvidia_cuda_cupti_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (14.1 MB)
Collecting nvidia-cudnn-cu12==8.9.2.26 (from torch)
  Downloading nvidia_cudnn_cu12-8.9.2.26-py3-none-manylinux1_x86_64.whl (731.7 MB)
Collecting nvidia-cublas-cu12==12.1.3.1 (from torch)
  Downloading nvidia_cublas_cu12-12.1.3.1-py3-none-manylinux1_x86_64.whl (410.6 MB)
Collecting nvidia-cufft-cu12==11.0.2.54 (from torch)
  Downloading nvidia_cufft_cu12-11.0.2.54-py3-none-manylinux1_x86_64.whl (121.6 MB)
Collecting nvidia-curand-cu12==10.3.2.106 (from torch)
  Downloading nvidia_curand_cu12-10.3.2.106-py3-none-manylinux1_x86_64.whl (56.5 MB)
Collecting nvidia-cusolver-cu12==11.4.5.107 (from torch)
  Downloading nvidia_cusolver_cu12-11.4.5.107-py3-none-manylinux1_x86_64.whl (124.2 MB)
Collecting nvidia-cusparse-cu12==12.1.0.106 (from torch)
  Downloading nvidia_cusparse_cu12-12.1.0.106-py3-none-manylinux1_x86_64.whl (196.0 MB)
Collecting nvidia-nccl-cu12==2.18.1 (from torch)
  Downloading nvidia_nccl_cu12-2.18.1-py3-none-manylinux1_x86_64.whl (209.8 MB)
Collecting nvidia-nvtx-cu12==12.1.105 (from torch)
  Downloading nvidia_nvtx_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (99 kB)
Collecting triton==2.1.0 (from torch)
  Downloading triton-2.1.0-0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (89.2 MB)
Collecting nvidia-nvjitlink-cu12 (from nvidia-cusolver-cu12==11.4.5.107->torch)
  Downloading nvidia_nvjitlink_cu12-12.3.101-py3-none-manylinux1_x86_64.whl (20.5 MB)
Requirement already satisfied: MarkupSafe>=2.0 in /opt/conda/lib/python3.10/site-packages (from jinja2->torch) (2.1.2)
Requirement already satisfied: mpmath>=0.19 in /opt/conda/lib/python3.10/site-packages (from sympy->torch) (1.3.0)
Installing collected packages: nvidia-nvtx-cu12, nvidia-nvjitlink-cu12, nvidia-nccl-cu12, nvidia-curand-cu12, nvidia-cufft-cu12, nvidia-cuda-runtime-cu12, nvidia-cuda-nvrtc-cu12, nvidia-cuda-cupti-cu12, nvidia-cublas-cu12, filelock, triton, nvidia-cusparse-cu12, nvidia-cudnn-cu12, nvidia-cusolver-cu12, torch
Successfully installed filelock-3.13.1 nvidia-cublas-cu12-12.1.3.1 nvidia-cuda-cupti-cu12-12.1.105 nvidia-cuda-nvrtc-cu12-12.1.105 nvidia-cuda-runtime-cu12-12.1.105 nvidia-cudnn-cu12-8.9.2.26 nvidia-cufft-cu12-11.0.2.54 nvidia-curand-cu12-10.3.2.106 nvidia-cusolver-cu12-11.4.5.107 nvidia-cusparse-cu12-12.1.0.106 nvidia-nccl-cu12-2.18.1 nvidia-nvjitlink-cu12-12.3.101 nvidia-nvtx-cu12-12.1.105 torch-2.1.1 triton-2.1.0
import matplotlib.pyplot as plt
import torch.optim as optim
import torch
import numpy as np
%matplotlib inline

# qubit |0> as a plain state vector
x = torch.tensor([1., 0.])

# variational parameter (the rotation angle)
a = torch.tensor([0.2], requires_grad=True)

# list for recording the expectation value at each step
arr = []

# the first argument is the list of parameters to optimize
op = optim.Adam([a], lr=0.05)
for _ in range(100):
    # RY(a) rotation matrix applied to the qubit
    y = [[torch.cos(a/2), -torch.sin(a/2)], [torch.sin(a/2), torch.cos(a/2)]]
    z = [x[0]*y[0][0] + x[1]*y[0][1], x[0]*y[1][0] + x[1]*y[1][1]]
    # expectation value of Z: |z0|^2 - |z1|^2 = cos(a)
    expt = torch.abs(z[0])**2 - torch.abs(z[1])**2

    arr.append(expt.item())  # record the current expectation value

    op.zero_grad()
    expt.backward()
    op.step()

plt.plot(arr)
plt.show()
<Figure size 640x480 with 1 Axes>
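
Since the quantity being minimized is $\langle Z \rangle = \cos(a)$, the optimizer should drive the parameter toward $\pi$. A quick check after the loop (a small sketch, not part of the original):

# <Z> = cos(a) is minimized at a = pi, so the optimized angle should end up close to pi.
print(f'final a = {a.item():.4f}, pi = {np.pi:.4f}')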


Understanding the basics of tensor operations helps deepen one's understanding of both quantum computing and machine learning.

How to use cuTensorNet

This time, we will look at how to load a circuit from Qiskit into cuTensorNet and run it.

import itertools

import cupy as cp
import numpy as np
import qiskit
from qiskit.circuit.random import random_circuit

from cuquantum import contract
from cuquantum import CircuitToEinsum

Generate a random quantum circuit

num_qubits = 7
depth = 6

circuit = random_circuit(num_qubits, depth, seed=3)
circuit.draw(output='mpl')
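
For a quick textual sanity check of the generated circuit (an optional addition, not in the original):

# Print the gate counts of the random circuit.
print(circuit.count_ops())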

Construct a converter object targeting double precision

In this example, the tensor operands are generated as CuPy arrays (by setting backend=cp).

myconverter = CircuitToEinsum(circuit, dtype='complex128', backend=cp)

Compute the state vector $|\psi\rangle$

expression, operands = myconverter.state_vector()
sv = contract(expression, *operands)
print(f'wavefunction coefficient shape: {sv.shape}')
print(type(operands[0]))
print(type(sv))
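
Note that the contracted state vector comes back as a rank-7 tensor with one axis of size 2 per qubit. If you prefer the usual flat $2^n$ vector, you can reshape it (a small sketch, not in the original notebook):

# Flatten the (2,)*num_qubits coefficient tensor into a length-2**num_qubits vector.
sv_flat = sv.reshape(-1)
print(sv_flat.shape)  # expected: (128,) for 7 qubits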

Compute the amplitude of a bitstring $\langle b|\psi\rangle$

bitstring = '0000000'

expression, operands = myconverter.amplitude(bitstring)
print(f'einsum expression:\n{expression} \n')

amplitude = contract(expression, *operands)
probability = abs(amplitude) ** 2
print(f'for bitstring {bitstring}, amplitude: {amplitude}, probability: {probability}\n')

amplitude_from_sv = sv[0,0,0,0,0,0,0]
amp_diff = abs(amplitude-amplitude_from_sv)
print(f'difference from state vector {amp_diff}')
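
Here the index [0,0,0,0,0,0,0] picks out the coefficient of the bitstring '0000000' from the coefficient tensor. For an arbitrary bitstring you could index with its bits instead (a sketch, assuming sv and bitstring from above):

# Convert the bitstring into one tensor index per qubit.
bits = tuple(int(b) for b in bitstring)
print(abs(amplitude - sv[bits]))  # should match amp_diff above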

Compute an expectation value $\langle \psi|\hat{O}|\psi\rangle$

In this example, we compute the expectation value of the Pauli string IXXZZII. For comparison, we compute the same value by contracting the operator with the reduced density matrix of the qubits on which it acts nontrivially.

pauli_string = 'IXXZZII'
expression, operands = myconverter.expectation(pauli_string, lightcone=True)
expec = contract(expression, *operands)
print(f'expectation value for {pauli_string}: {expec}')

# expectation value from the reduced density matrix of qubits 1-4,
# where the Pauli string IXXZZII acts nontrivially
qubits = myconverter.qubits
where = qubits[1:5]
rdm_expression, rdm_operands = myconverter.reduced_density_matrix(where, lightcone=True)
rdm = contract(rdm_expression, *rdm_operands)

pauli_x = cp.asarray([[0,1],[1,0]], dtype=myconverter.dtype)
pauli_z = cp.asarray([[1,0],[0,-1]], dtype=myconverter.dtype)
expec_from_rdm = cp.einsum('abcdABCD,aA,bB,cC,dD->', rdm, pauli_x, pauli_x, pauli_z, pauli_z)

print(f"is expectation value in agreement?", cp.allclose(expec, expec_from_rdm))
