Photonic Computing - The Optical Future

The speed of computation is limited by two factors: how fast information can be moved (data transfer) and how fast that information can be processed (data computation).

Currently, this limit is imposed by the properties of electricity, the flow of electrons.

There is, however, another field of computing focused on a different paradigm: optical computing, also called photonic computing.

This refers to the use of light, the flow of photons, in computing and electronics. Light, more specifically infrared light with wavelengths around 1500 nanometers, is already used for data transfer and communication over long distances; this is referred to as fiber optics.

Beyond long distance data transfer, the principle of using light for data manipulation extends to computing as well:

I work in the area of fiber optics, specifically building new types of logic that can compute all optically as opposed to electronically. I’m basically taking fiber optics and putting it into your computer to do logic and CPU components. The results of my research will be increasing the bandwidth of computers from gigahertz speeds to terabit per second speeds, which is 1000 times greater!

This type of advancement won’t be in your computer tomorrow, but it’s the building block for creating the next generation of technology, which you will see years down the road. Unlike the longer infrared wavelengths needed for long-distance communication to avoid signal degradation, the light used for computation will be in the visible part of the electromagnetic spectrum, with wavelengths in the range of 450 to 700 nanometers.

This is because at small-scale distances, signal degradation isn’t an issue but computing speed is. Electronic circuits operate based on millions, even billions, of switches that alternate between an on and an off state; this switching process alone induces latency.

Photonics simply uses wave propagation, the interference pattern of waves, to determine a result. This allows for near-instantaneous computations, without the delay induced by switch latency: Optical computing is actually the science, or the art, of using photons instead of electrons to do computation.

So what we do here is we try to process data signals in the optical domain instead of the electronic domain, and that’s actually what is cool about optical computing: we process data while it’s traveling; we don’t stop the data movement or the data flow, and we process it.

This is a bit similar to what we do in memory driven computing, where we bring the processor closer to the memory. Here we bring the processing closer to the data while it’s in-flight.

So if you compare that with what we do today: every time we send or receive information over an optical fiber, we need to convert it between the electronic domain and the optical domain. What is key here is that this technology allows us to avoid that. So what we have to build when we do optical computing is a logic gate, and making that using tabletop optics like the one you see over here, with a lot of lenses and mirrors and so on, becomes really difficult at a macroscopic level; the issue that we have there is interference.

Now the interesting thing is, if you go to the microscopic level, and that’s what you see here, then this interference effect actually becomes key to solving the problem, so that is the thing that we use.


Let me now give an example of how such a logic gate works; that’s what you see here, so let’s take this thing here. This is a logic AND gate: it has two inputs and one output, and we designed it to perform a Boolean operation where we have an output only when both of the inputs are on.

So we have a two-stage process for that. The first thing we do here is use interference in this optical combiner to make sure that we have a strong field only when both of the inputs are on. The next stage is this micro-ring, which allows us to make a strong distinction between the on and the off levels, so that the next gates listening to this gate can understand the signal.
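To make that two-stage description concrete, here is a minimal numerical sketch in Python. It assumes an idealised 50/50 combiner and models the micro-ring as a simple amplitude threshold; the function names and the threshold value are illustrative only, not a model of the actual device physics.

```python
# Toy model of the interference-based AND gate described above.
# Stage 1: a combiner where co-phased fields interfere constructively.
# Stage 2: an idealised micro-ring that sharpens the on/off distinction.
import numpy as np

def combiner(a: complex, b: complex) -> complex:
    """50/50 optical combiner: with both inputs on, constructive
    interference yields a field stronger than either input alone."""
    return (a + b) / np.sqrt(2)

def micro_ring(field: complex, threshold: float = 1.2) -> int:
    """Idealised discriminator: only a field stronger than the
    single-input level (~0.71 here) reads as logic 1."""
    return 1 if abs(field) > threshold else 0

def optical_and(in_a: int, in_b: int) -> int:
    # Logic 1 is a unit-amplitude optical field, logic 0 is no field.
    field_a = (1.0 + 0j) if in_a else 0j
    field_b = (1.0 + 0j) if in_b else 0j
    return micro_ring(combiner(field_a, field_b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", optical_and(a, b))
# Only the (1, 1) case produces a 1, matching the AND truth table.
```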

The ability to compute on data while it is being transferred, cutting out switch delay, will completely change how computer architecture is designed and thought about. When the computer industry moved from the vacuum tube to the transistor, latency decreased from the order of microseconds to nanoseconds. Photonics promises to reduce latency by orders of magnitude again, to the order of femtoseconds or less, one quadrillionth of a second!
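As a rough back-of-the-envelope check on those timescales (using the order-of-magnitude figures above, not measured values):

```python
# Order-of-magnitude latency comparison from the paragraph above.
vacuum_tube = 1e-6   # ~microseconds
transistor  = 1e-9   # ~nanoseconds
photonic    = 1e-15  # ~femtoseconds (projected)

print(f"vacuum tube -> transistor: ~{vacuum_tube / transistor:,.0f}x faster")  # ~1,000x
print(f"transistor  -> photonic:   ~{transistor / photonic:,.0f}x faster")     # ~1,000,000x
```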

This speed factor alone would radically transform the computing industry; however, optical computing has many other advantages as well. Classical computers operate in serial, with each calculation performed one after the other. Scaling to more complex problems requires more processors, which means greater power requirements and more complex data management.

Optical computers can operate in parallel to tackle complex problems through light reflection, and they offer increased bandwidth compared to electron-based computers due to the ability to transport multiple wavelengths of light at the same time.
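Here is a minimal sketch of the wavelength-division idea behind that bandwidth claim, assuming idealised carriers and noiseless detection; the carrier frequencies, sample count, and bit pattern are arbitrary toy values.

```python
# Wavelength-division multiplexing (WDM) in miniature: several data channels
# share one light path, each on its own wavelength (modelled here as a
# distinct carrier frequency over one symbol period).
import numpy as np

t = np.linspace(0, 1, 4096, endpoint=False)   # one symbol period, 4096 samples
channels = {5: 1, 9: 0, 13: 1}                # toy carrier frequency -> bit

# Multiplex: every "on" channel contributes its carrier; all share one waveguide.
signal = sum(bit * np.exp(2j * np.pi * f * t) for f, bit in channels.items())

# Demultiplex: project the combined field back onto each carrier and threshold.
for f, bit in channels.items():
    amplitude = abs(np.vdot(np.exp(2j * np.pi * f * t), signal)) / len(t)
    print(f"carrier {f:2d}: sent {bit}, recovered {1 if amplitude > 0.5 else 0}")
```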

Photons are also massless, meaning they require much less power to excite. These factors, increased parallelism and bandwidth, translate to extremely scalable systems that are much more powerful, use less power, and don’t come coupled with complex data management issues. This reduction in data management also has a major impact on security.

Our modern computers have data travel all over the place: from storage to different levels of memory to the cache, until it is finally read by the CPU, which determines whether it can process the data or has to send it to another computing device, and then the whole process repeats on the way back to storage. This exposes a lot of points in the computer where data is vulnerable.

With optical computers, the fact that data can be computed while it is in motion translates to increased security, as less data is exposed. As you can see, optical computers present many benefits over classical computers. Coming up, we’ll cover some of the various optical computing initiatives, as well as how this technology will be integrated with current computing systems.

Data transfer is one of the largest bottlenecks in computing performance; however, optical computing and the ability to compute on data in motion solve this problem.

Unfortunately, solving one problem creates another: computing devices will now end up becoming the bottleneck. To address this, there are various initiatives, currently in both research and development, pushing the field of optical computing devices forward. One of the most widely known is by a company called Optalysys.

They are designing an optical co-processor; referring back to the last video in this series, this is exactly what we discussed: new types of chips that will work within a heterogeneous system architecture. This optical co-processor will benefit many sectors, and due to the parallelism in photonics, many of the tasks this co-processor will excel at are the same as the tasks currently offloaded to GPUs.

Both this optical processor and GPUs working together will yield performance boosts like we’ve never seen before: Our approach is completely novel, because we use light to process data, not electricity. The Optalysys system connects to and turbocharges existing computer setups, whether they are standard desktop computers or a larger high performance computing cluster.

It essentially transforms a desktop machine into an HPC system and increases the processing capabilities of an existing supercomputer beyond what current and even future systems can perform. It does this by taking on certain mathematical processes that can be performed faster in the optical domain.
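As a conceptual sketch of how such a co-processor might slot into existing code, here is a heterogeneous dispatch pattern in Python. The class and function names are hypothetical and do not reflect any published Optalysys API, and the 2-D FFT is used only as an example of a transform-style workload.

```python
# Hypothetical offload pattern: run a suitable math operation on an optical
# co-processor when one is attached, otherwise fall back to the CPU.
import numpy as np

class OpticalCoprocessor:
    """Stand-in for an optical accelerator attached to an existing machine."""
    def available(self) -> bool:
        return False  # no real hardware in this sketch

    def transform(self, data: np.ndarray) -> np.ndarray:
        raise NotImplementedError("would run in the optical domain")

def run_transform(data: np.ndarray, accel: OpticalCoprocessor) -> np.ndarray:
    # Dispatch: use the accelerator when present, otherwise compute electronically.
    if accel.available():
        return accel.transform(data)
    return np.fft.fft2(data)  # conventional CPU fallback

result = run_transform(np.random.rand(64, 64), OpticalCoprocessor())
print(result.shape)  # (64, 64)
```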

Beyond optical computation devices, another current area of research and development is memory, more specifically optical RAM. While data can be computed in motion, accessing data would still be a significant bottleneck unless memory enters the optical domain as well. In R&D through a joint collaboration between various European nations, optical RAM promises to be over 30 times faster than SRAM (the CPU cache) and 1,000 times faster than DRAM.
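A quick sanity check on those figures, assuming ballpark baseline latencies of roughly a nanosecond for an SRAM cache access and tens of nanoseconds for DRAM (assumptions for illustration, not measurements):

```python
# Translate the "30x over SRAM" and "1,000x over DRAM" claims into latencies.
sram_latency_s = 1e-9    # ~1 ns on-chip cache access (assumed baseline)
dram_latency_s = 60e-9   # ~60 ns main-memory access (assumed baseline)

optical_vs_sram = sram_latency_s / 30      # "over 30 times faster than SRAM"
optical_vs_dram = dram_latency_s / 1000    # "1,000 times faster than DRAM"

print(f"~{optical_vs_sram * 1e12:.0f} ps (from the SRAM comparison)")   # ~33 ps
print(f"~{optical_vs_dram * 1e12:.0f} ps (from the DRAM comparison)")   # ~60 ps
# Both land in the picosecond range described next.
```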

This equates to memory latencies on the order of picoseconds, unheard-of speeds for memory! For more in-depth information on memory, GPUs, and heterogeneous system architecture, be sure to check out the previous videos in this computing series. Back on topic, there are many other optical and optoelectronic devices in research and development that we haven’t even covered, with optoelectronic devices mixing both electron- and photon-based computation.

For example, transistors that can switch between both the electron and photon domains, and Intel’s optical multiplexers that convert between optical and electronic signals.

The final topic in optical devices that we’ll cover revolves around silicon photonics, essentially fiber optics but on a smaller scale: The X1 photonic module, and just as a comparison, this is a 1.2 terabit photonic module and this is a 1.2 terabit electronic cable. You can see the difference, and it’s more than just the incredible weight and consumption of material resources it takes for the electronic connection; the thing that’s really amazing is that I can go 10 centimeters or I can go a thousand meters for exactly the same amount of energy, and that’s the breakthrough.

From a system-level designer’s point of view, that’s what’s so amazing about this kind of highly integrated, very low-cost technology.

Silicon photonics, like that demonstrated by HP, can transfer data at a rate of 1.2 terabits per second at distances up to 100 meters; in fact, even extending out to 50 kilometers, the transfer rate is still 200 gigabits per second. For comparison, Thunderbolt 3 tops out at 40 gigabits per second and the current fastest Ethernet connections in data centers at 100 gigabits per second.
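For a sense of scale, the ratios implied by those figures (all converted to bits per second):

```python
# Simple ratio check of the bandwidth figures quoted above.
photonic_module = 1.2e12   # silicon photonics module, 1.2 Tb/s
thunderbolt_3   = 40e9     # Thunderbolt 3 link, 40 Gb/s
datacenter_eth  = 100e9    # fast data-center Ethernet, 100 Gb/s

print(f"vs Thunderbolt 3:  {photonic_module / thunderbolt_3:.0f}x")   # 30x
print(f"vs 100G Ethernet:  {photonic_module / datacenter_eth:.0f}x")  # 12x
```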


While at first this technology will be limited to the enterprise side of computing in terms of data centers, we will immediately begin to see improvements through increased cloud computing speeds. In time, as silicon photonics improves, it will move down to the consumer level and terabit speeds will be as simple as plugging in a wire at the back of your computer.

In the grand scope of things, fiber-optic speeds will no longer be held back. From a fiber internet connection to photonic wiring and then to photonic computing, computing at the speed of light is the long-term goal of this field and will produce massive performance and efficiency gains.

Optical computing is a field that has been talked about for quite some time, since the 1960s, and has produced many advances in various technologies. Now, however, after decades of research and development, it is yielding tangible results that can accelerate computing performance. Photon-based computing will also play a significant role in quantum computing, due to the wave-particle duality of light.

 
