Tensor Network
Nov 9, 2019
In the summer of 2019 I heard about a workshop Google X was organizing about their Tensor Network research and library. The 50 seats filled up fast and I didn't get in, but later in November (Nov. 6 - Nov. 7, 2019) there was another opportunity as part of the IEEE Rebooting Computing Conference in San Mateo. The workshop ran for two days, and this time I was able to snag a seat. I booked a cheap Airbnb and headed to Silicon Valley.
TensorNetwork is an open source library for efficient tensor calculations. Tensors are very generic mathematical constructs with broad applicability, from quantum physics simulations and quantum computer modeling all the way to neural network machine learning frameworks. It's no coincidence that TensorFlow has tensor in its name: it also uses tensors to represent the data flowing through the underlying neural network. However, the TensorFlow framework and the TensorNetwork library are separate projects.
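To give a taste of what "efficient tensor calculations" looks like in practice, here is a minimal sketch in the style of the example in the library's README: two matrices wrapped as nodes, joined by an edge, and contracted (the shapes and values here are my own, chosen for illustration).

```python
import numpy as np
import tensornetwork as tn

# Two rank-2 tensors (plain matrices) wrapped as network nodes.
a = tn.Node(np.ones((8, 8)))
b = tn.Node(np.ones((8, 8)))

# Connect a's second index to b's first index with an edge.
edge = a[1] ^ b[0]

# Contracting the shared edge performs the matrix multiplication.
result = tn.contract(edge)
print(result.tensor)  # an 8x8 array, every entry equal to 8.0
```

The appeal of the node/edge view is that the same two lines generalize from matrix multiplication to contractions of arbitrarily shaped tensor networks.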
To learn more about TensorNetwork you can check out the GitHub repository of the library or Google X's blog post. There's also a brilliant presentation I highly recommend: it introduces a very intuitive visualization of tensors, matrices, vectors, and operations like contraction between them.
On the first afternoon the notation and some simple networks were introduced. The second afternoon contained more demos and practice: we assembled simple tensor networks and performed some contractions and matrix factorizations, along the lines of the sketch below. Google X researchers also showed how simple quantum bits (qubits) can be modeled with the library.
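From memory, the factorization exercises looked roughly like the following sketch: splitting a node in two via a truncated SVD. The tensor values and the truncation rank here are made up, and I'm going from my recollection of the tn.split_node API, so treat this as an approximation rather than a faithful reproduction of the workshop material.

```python
import numpy as np
import tensornetwork as tn

# A rank-3 tensor with arbitrary values, purely for illustration.
t = tn.Node(np.random.randn(4, 4, 4))

# Split the node with an SVD, keeping only the 2 largest singular
# values: indices 0 and 1 go to the left factor, index 2 to the right.
left, right, trunc_error = tn.split_node(
    t,
    left_edges=[t[0], t[1]],
    right_edges=[t[2]],
    max_singular_values=2)

print(left.tensor.shape)   # (4, 4, 2)
print(right.tensor.shape)  # (2, 4)
# trunc_error holds the discarded singular values, a measure of
# how lossy the rank-2 truncation was.
```

This kind of truncated factorization is the basic move behind tensor network compression: you trade a small, quantifiable error for much smaller tensors.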
Later I looked up articles comparing TensorFlow and TensorNetwork. I plan to run measurements on how TensorNetwork could be used to speed up certain types of neural networks; there are some publications that have started to explore this. I'll follow up on the blog when I get to it.