At blueqat, we are applying quantum technology to improve existing deep learning models and have begun developing our own Large Language Model (LLM).
Practical application of quantum computers is still said to be some way off. At blueqat, we have long worked to popularize NVIDIA's high-performance quantum circuit simulation SDK, cuQuantum, and we now use that technology to improve existing deep learning models and to advance the development of our own model.
For several years, blueqat has been developing hybrid computations that combine GPUs and quantum computers. Starting last year, we have been working with clients to use GPUs together with parameter-reduction techniques such as tensor decomposition to shrink and accelerate existing deep learning models, and we are beginning to see practical, positive results.
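As a minimal sketch of the idea behind this kind of parameter reduction (not our production pipeline; the matrix sizes and rank below are purely illustrative), a dense layer can be replaced with a truncated low-rank factorization:

```python
import numpy as np

# Illustrative sizes only: a 1024 x 1024 dense layer compressed to rank 64.
d_in, d_out, rank = 1024, 1024, 64

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))   # original dense weight
x = rng.standard_normal(d_in)            # a single input vector

# Truncated SVD: W is approximated by (U * s) @ Vt, keeping the top `rank` singular values.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :rank] * s[:rank]               # shape (d_out, rank)
B = Vt[:rank, :]                         # shape (rank, d_in)

# The layer y = W @ x becomes two thin matrix multiplies.
y_full = W @ x
y_lowrank = A @ (B @ x)

params_full = W.size                     # 1,048,576 parameters
params_lowrank = A.size + B.size         # 131,072 parameters (~8x fewer)
print(params_full, params_lowrank)
print(np.linalg.norm(y_full - y_lowrank) / np.linalg.norm(y_full))
```

The two thin factors store far fewer parameters and can also be cheaper to multiply, which is where the acceleration comes from; in practice the rank is tuned so that the approximation error stays acceptable for the model.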
A quantum circuit can be rewritten as a tensor network, much like the tensors used in deep learning, so it fits naturally with AI workloads and the two technologies can share a common toolchain. It can handle not only linear computations but also nonlinear cases, and in the future it will be straightforward to feed results back into the quantum circuit.
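To illustrate how a circuit maps onto tensors (a toy NumPy sketch, not the cuQuantum API), the snippet below contracts a two-qubit Bell-state circuit, a Hadamard followed by a CNOT, as plain tensor contractions:

```python
import numpy as np

# Single-qubit Hadamard as a 2x2 tensor, CNOT as a 2x2x2x2 tensor
# with index order (control_out, target_out, control_in, target_in).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.zeros((2, 2, 2, 2))
CNOT[0, 0, 0, 0] = CNOT[0, 1, 0, 1] = 1   # control = 0: target unchanged
CNOT[1, 1, 1, 0] = CNOT[1, 0, 1, 1] = 1   # control = 1: target flipped

# The |00> state as a rank-2 tensor of shape (2, 2).
state = np.zeros((2, 2))
state[0, 0] = 1.0

# Apply H to qubit 0, then CNOT on (qubit 0, qubit 1) -- each gate is a contraction.
state = np.einsum('ia,ab->ib', H, state)
state = np.einsum('ijab,ab->ij', CNOT, state)

print(state)   # amplitudes of the Bell state (|00> + |11>) / sqrt(2)
```

This is the same picture tensor-network simulators exploit: gates and states become tensors, and simulating the circuit becomes a (GPU-friendly) contraction order problem.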
Around the world, there have been active proposals for compressing parameters through tensor decomposition in models such as Transformers, and it is a promising development that quantum computing can help realize the parameter scaling laws more efficiently.
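To make the scale of such savings concrete (using commonly cited GPT-2-style dimensions as an assumption, not figures from our own models), a back-of-the-envelope count for one Transformer feed-forward block before and after a rank-r factorization looks like this:

```python
# Assumed, GPT-2-style dimensions (illustrative only).
d_model, d_ff, rank = 768, 3072, 128

# Dense feed-forward block: W1 (d_model x d_ff) and W2 (d_ff x d_model).
dense = d_model * d_ff + d_ff * d_model

# Each matrix replaced by two rank-`rank` factors.
factored = (d_model * rank + rank * d_ff) * 2

print(f"dense:   {dense:,} params")                                   # 4,718,592
print(f"rank-{rank}: {factored:,} params ({dense / factored:.1f}x fewer)")
```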