Complex Number Deep Learning with Tensorflow

By Luke Wood

Recently while working on some research I wanted to perform deep learning in the complex number domain.

Tensorflow has built-in support for complex numbers. Unfortunately, the API is unstable, has some issues with backpropagation, and doesn't support tf.einsum.

Let's go over a quick solution to this.

Background

Quick review: a complex number is the sum of a real part and an imaginary part. An imaginary number is a real value multiplied by i, where i = √(-1).

Representation

First we need to figure out a way to transform tensorflow complex numbers to a workable representation.

I decided to settle on a representation that treats the real and imaginary components of each complex number as distinct channels of the tensor.

For example, a complex64 tensor of shape (1, 3, 5, 7) would be represented as a tensor of shape (1, 3, 5, 7, 2) with components of type float32.

Implementation

The implementation of this is pretty simple. The following tensorflow function transforms a complex tensor into the desired format.

import tensorflow as tf

@tf.function
def complex_to_channels(x):
    # Split the complex tensor into its real and imaginary parts and
    # stack them as two channels along a new trailing axis.
    real = tf.math.real(x)
    imag = tf.math.imag(x)
    return tf.stack([real, imag], axis=-1)

This can be used in a keras layer as follows:

Lambda(complex_to_channels, name='from_complex')

Let's examine the inverse:

@tf.function
def channels_to_complex(x):
    # Interpret the last axis as (real, imaginary) channels and
    # recombine them into a single complex tensor.
    real = x[..., 0]
    imag = x[..., 1]
    return tf.complex(real, imag)
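
As a quick sanity check, the two functions should be exact inverses of each other. Here is a minimal sketch, assuming eager execution and a small hand-made complex64 tensor:

x = tf.complex(
    tf.constant([[1.0, 2.0], [3.0, 4.0]]),
    tf.constant([[5.0, 6.0], [7.0, 8.0]]),
)

channels = complex_to_channels(x)          # shape (2, 2, 2), dtype float32
restored = channels_to_complex(channels)   # shape (2, 2), dtype complex64
print(tf.reduce_max(tf.math.abs(x - restored)).numpy())
# > 0.0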

And in a keras layer:

Lambda(channels_to_complex, name='to_complex')
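
To show how these layers might fit together, here is a hypothetical toy model (just a sketch, not the model from my research) that drops into the channel representation, applies an ordinary real-valued Dense layer, and converts back to complex:

from tensorflow.keras.layers import Input, Lambda, Dense

# Hypothetical example: complex input -> channel representation ->
# real-valued Dense layer -> back to a complex output.
inputs = Input(shape=(8,), dtype=tf.complex64)
x = Lambda(complex_to_channels, name='from_complex')(inputs)  # (batch, 8, 2)
x = Dense(2)(x)                                               # real-valued weights
outputs = Lambda(channels_to_complex, name='to_complex')(x)   # (batch, 8)
model = tf.keras.Model(inputs, outputs)
model.summary()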

Example Usage: Complex Multiplication

When working with complex numbers in this channel format, it is important not to simply apply existing keras layers or tensorflow operations and expect them to respect complex arithmetic; they treat the two channels as ordinary real values.

For example, when implementing pointwise multiplication we must account for how complex numbers multiply. The complex multiplication formula is:

(a+bi)(c+di) = (ac−bd) + (ad+bc)i
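
For example, (1+2i)(3+4i) = (1·3 − 2·4) + (1·4 + 2·3)i = −5 + 10i.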

Let's take a stab at implementing this.

Consider the complex tensors x and y of shape (1, 2, 3).

Let's begin by applying our transform:

x = complex_to_channels(x)
y = complex_to_channels(y)
print(x.shape, y.shape)
# > (1, 2, 3, 2) (1, 2, 3, 2)

Next, let's compute the real part of the result:

z_real = tf.math.multiply(x[..., 0], y[..., 0]) - tf.math.multiply(x[..., 1], y[..., 1])

And the imaginary...

z_imaginary = tf.math.multiply(x[..., 1], y[..., 0]) + tf.math.multiply(x[..., 0], y[..., 1])

Now we can place the result back into our representation space:

z = tf.stack([z_real, z_imaginary], axis=-1)
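
As a quick check (again, just a sketch), the channel-wise product should agree with TensorFlow's native complex multiplication on the original tensors:

z_complex = channels_to_complex(z)
expected = tf.math.multiply(channels_to_complex(x), channels_to_complex(y))
# The maximum absolute difference should be (approximately) zero.
print(tf.reduce_max(tf.math.abs(z_complex - expected)).numpy())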

Conclusions

Working with complex numbers in tensorflow is not yet thoroughly supported, but this should not stop potential research.

By using this method I have made substantial progress that I would not have made had I simply sat around and waited for official tensorflow support of complex64 to catch up to my needs.

I'm planning to publish a library with a set of complex tensorflow operations that follow this format in the near future, so keep an eye out!