Optical Chip With 100x Parallel Processing

Jul 9, 2025 6:27 AM

Oktay74tn

Views 218352 | Likes 330 | Dislikes 14

Oktay Yürük aka Oktay74tn, science and tech content
https://imgur.com/user/Oktay74tn

This is Meteor-1, a 100-way parallel optical processor developed by the Shanghai Institute of Optics and Fine Mechanics. It processes information using light instead of electricity and can work on 100 tasks at the same time.

This is the design of the 100-channel wavelength-parallel optical computer Meteor-1. A microcomb-based spectral encoder produces over 100 phase-locked carriers across the telecommunication band. Matrix–matrix multiplication is performed in a broadband Mach-Zehnder interferometer (MZI) mesh. The output vectors are demultiplexed through a dense wavelength-division multiplexing (DWDM) filter with 50 GHz channel spacing for parallel photodetection.
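
To make the wavelength parallelism concrete, here is a minimal numerical sketch (not the authors' code; the 16-port mesh size is an illustrative assumption): one matrix is programmed into the MZI mesh, every wavelength channel carries its own input vector, and a single pass through the mesh therefore computes a full matrix–matrix product.

```python
import numpy as np

rng = np.random.default_rng(0)

n_ports = 16       # assumed mesh size (illustrative, not from the paper)
n_channels = 100   # microcomb carriers across the telecom band

W = rng.normal(size=(n_ports, n_ports))     # matrix set by the MZI phases
X = rng.normal(size=(n_ports, n_channels))  # one input vector per wavelength

# Electronically this would be 100 separate matrix-vector products;
# optically, all columns propagate through the same mesh at once.
Y = W @ X
print(Y.shape)  # (16, 100): one output vector per wavelength channel
```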

The MZI mesh optical computing chip was fabricated on a silicon-on-insulator (SOI) platform. It is driven by a 64-channel matrix voltage source that supplies 0 to 10 V with a precision of one thousandth of a volt (1 mV).
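
As a rough illustration of what that driver resolution means (my own assumption about the calibration, not a detail from the paper): for a thermo-optic phase shifter the phase is roughly proportional to heater power, i.e. to the square of the voltage, and each of the 64 channels can be set in 1 mV steps.

```python
import numpy as np

V_MAX = 10.0    # volts, driver range per the post
V_STEP = 1e-3   # 1 mV resolution -> 10,000 levels per channel
V_2PI = 8.0     # assumed voltage giving a full 2*pi phase shift (illustrative)

def drive_voltage(target_phase: float) -> float:
    """Quantized drive voltage for a phase in [0, 2*pi), assuming phase ~ V**2."""
    v = V_2PI * np.sqrt(target_phase / (2 * np.pi))
    return float(round(min(v, V_MAX) / V_STEP) * V_STEP)

print(drive_voltage(np.pi))  # ~5.657 V, snapped to the 1 mV grid
```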

These diagrams are port-to-port maps of an identity matrix configuration (top) and a random matrix configuration (bottom) at three different wavelengths. The fidelity without extra corrections is 0.9, which is very good.
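
For readers who want a feel for the 0.9 figure: one common way to score a programmable mesh (an assumed definition here, not necessarily the one used in the paper) is the normalized overlap between the measured port-to-port matrix M and the target matrix T.

```python
import numpy as np

def fidelity(M: np.ndarray, T: np.ndarray) -> float:
    """Normalized matrix overlap; 1.0 means M matches T exactly."""
    num = np.abs(np.trace(M.conj().T @ T)) ** 2
    den = np.trace(M.conj().T @ M).real * np.trace(T.conj().T @ T).real
    return float(num / den)

T = np.eye(4)  # identity configuration, as in the top diagram
M = T + 0.1 * np.random.default_rng(1).normal(size=(4, 4))  # noisy measurement
print(f"{fidelity(M, T):.3f}")  # close to 1 for small errors
```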

Meteor-1 can deliver a theoretical peak performance of 2,560 tera-operations per second (TOPS) at a 50 GHz optical clock rate, approaching the 3,352 TOPS of the latest NVIDIA RTX 5090. The power efficiency, defined as computation throughput divided by power consumption, is estimated by the scientists to be at least 3.2 TOPS per watt.
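
A back-of-envelope check of these numbers (assuming each multiply-accumulate counts as two operations and inferring a 16x16 mesh, neither of which is stated in this post):

```python
n = 16          # inferred mesh ports (assumption)
channels = 100  # wavelength channels
clock = 50e9    # 50 GHz optical clock rate

peak_tops = 2 * n * n * channels * clock / 1e12
print(peak_tops)   # 2560.0 TOPS, matching the quoted figure

print(2560 / 3.2)  # implied power draw at 3.2 TOPS/W: 800 W
print(3352 / 575)  # RTX 5090 at its 575 W TDP: ~5.8 TOPS/W
```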

The new optical chip has several advantages. It is scalable and does not require dedicated infrastructure of its own. Meteor-1 delivers high performance with low energy consumption.

New designs such as the parallel optical computer can continue Moore's Law, which predicts a doubling of computing power every two years. They can accelerate the artificial intelligence revolution.

Parallel optical computing capable of 100-wavelength multiplexing
Xiao Yu, Ziqi Wei, Fangyuan Sha, Xinyu Wang, Yanqi Chu, Zhen Wang, Xilin Han, Hongwei Wang, Shulan Yi, Yuhu Cheng, Guangwei Hu & Peng Xie
https://elight.springeropen.com/articles/10.1186/s43593-025-00088-8

Optical chip achieves 100-way parallel processing
https://www.futuretimeline.net/blog/2025/06/26-breakthrough-optical-chip-performs-100-tasks-at-once.htm

Wikipedia articles
https://en.wikipedia.org/wiki/Mach-Zehnder_interferometer
https://en.wikipedia.org/wiki/Moore's_law
https://en.wikipedia.org/wiki/Parallel_computing
https://en.wikipedia.org/wiki/Optical_computing

Check out https://imgur.com/user/Oktay74tn for many more astronomy and technology videos.

Tags: computer, physics, technology, computers, artificial_intelligence

Only 10 years ™ from development

3 weeks ago | Likes 1 Dislikes 0

Neat, but as someone who works with Photonics, this isn't as new as it seems. I see that it still uses thermal phase shifters in the MZI, which are slow to change. So, you can do 100 operations at once, and the same operation on different data very fast, but if you change the second matrix operation, you have to wait for the MZI elements to cool/heat to the right temp. Here is a review written by some people I work with: https://www.nature.com/articles/s41586-020-2764-0

3 weeks ago | Likes 6 Dislikes 0

Is this a quantum processor?

3 weeks ago | Likes 2 Dislikes 0

No, it is a classical processor. An optical parallel quantum computer would be the best that I can imagine. I am sure that this will come one day. AI is a powerful tool for new designs.

3 weeks ago | Likes 2 Dislikes 1

Oh hell yeah! This is technology that I read about quite a long time ago and was excited to see progress, but it seemed to have died down for a long time.

That kind of parallel processing opens the doors for some incredible computational advancement. Thank you so much for posting this!

3 weeks ago | Likes 12 Dislikes 1

You are welcome :) . I find this very fascinating too.

3 weeks ago | Likes 6 Dislikes 1

Yes but can it stop climate change

3 weeks ago | Likes 2 Dislikes 0

The internet infrastructure consumes about 2-3 % of the world's energy. Optical computers are superior to Si-based computers because photons don't generate heat. In my opinion, standard software like internet browsers could also be optimized for lower resource use.

3 weeks ago | Likes 2 Dislikes 0

We've been waiting for optronics and positronics for quite some time. It seems we missed matrix–matrix math functions as being the critical piece.

3 weeks ago | Likes 1 Dislikes 0

This is freaking awesome, thank you for sharing cool technology

3 weeks ago | Likes 8 Dislikes 1

Agree.

3 weeks ago | Likes 2 Dislikes 0

Top notch post. Thank you

3 weeks ago | Likes 1 Dislikes 0

"The fidelity without extra correction is 0.9". Well, consider me very whelmed

3 weeks ago | Likes 2 Dislikes 0

Sure, but can it run Doom?

3 weeks ago | Likes 3 Dislikes 0

No, only Tetris :) .

3 weeks ago | Likes 1 Dislikes 0

With the electric grid incredibly strained the way it is these days, when I hear about a new type of processor my first thought isn't about how powerful it's capable of being, it's about how much power it requires to operate.

3 weeks ago | Likes 4 Dislikes 0

That's why the datacentres that imgur loves to hate are put near generation - so you don't have overloaded transmission lines. And also a good reason to look into the cost of solar panels and batteries for your own stuff.

3 weeks ago | Likes 1 Dislikes 0

Woah woah guy do you think humanity will survive this?

3 weeks ago | Likes 1 Dislikes 0

Well, I dunno about you, but I'm not just going to lay down and die.

3 weeks ago | Likes 2 Dislikes 0

I've thought about that

3 weeks ago | Likes 1 Dislikes 0

With a consumption of one watt per 3.2 TOPS, this would still be 800 watts, without any memory. So, for now, this is more power-hungry than the "conventional" way of doing AI (RTX PRO 6000 96 GB => 600 watts).

If this wants to compete, the power draw has to come down by a large margin to offset the initial cost, which for new tech is usually several times that of established tech.

3 weeks ago | Likes 60 Dislikes 2

You sound like you know what you're talking about....

3 weeks ago | Likes 12 Dislikes 0

I mean, it's gen 1 (or hell, a prototype for all I know); I'm sure that given enough R&D the power consumption will come down. The tech is interesting. I wonder how miniaturized they can get it, and how light and optics behave at the angstrom level. I would assume at those scales signal bleed-through is still an issue, as at that point a photon and an electron interact similarly.

3 weeks ago | Likes 10 Dislikes 0

I'd imagine you'll have real issues at sub-nm scales since that's way below the wavelength of visible light. It's why we use electrons for very high resolution microscopes, where the thing you're looking at is smaller than the wavelength of any light you could use to look at it. Might be you need to increase the energy of the light you use here, but then would the components survive?

3 weeks ago | Likes 4 Dislikes 0

Yeah, I was going to say: the Nvidia 5090 is 3,352 TOPS at 575 watts, which is 5.83 TOPS per watt. 3.2 is not much more than half the efficiency of a 5090.

3 weeks ago | Likes 1 Dislikes 0

I've heard about light-based prototypes my entire life. It's not even my field, but it's often a story that gets coverage when something working comes along. But working prototypes and their press releases are always so cool. I wish them all the luck in the world, even if they can't yet compete with an incredibly mature and optimized technology.

3 weeks ago | Likes 7 Dislikes 0

And these things eventually reach a point where they have to convert the optical signal to electronic. Big chip makers (AMD, Intel, Nvidia) are working with a smaller company to get an optical chip that (I think) would let them treat a bunch of GPUs like a GPU pool that just gets work farmed out to it, with results coming back, to massively scale GPU clusters. Optical computing and chips are the near future, but they have to convert to an electronic signal for storage and such.

3 weeks ago | Likes 1 Dislikes 0

The computations themselves do not require an interposer. Data entering and leaving the chip do right now though, but when you're talking this kind of scale, it becomes irrelevant, especially with optical memory.

3 weeks ago | Likes 1 Dislikes 0

I think light based computing has drastically lower heat generation, so that might offset cost some.

3 weeks ago | Likes 2 Dislikes 0

You're trying to compare operations between 2 entirely different nodes of execution. This would be like comparing bead memory to HBM2, then comparing the CAS rate. It doesn't make sense.

3 weeks ago | Likes 2 Dislikes 0

They just need to use light of Eärendil's star. Problem solved!

3 weeks ago | Likes 4 Dislikes 0

(@Ngugi has entered the chat.)

3 weeks ago | Likes 2 Dislikes 0

Will also keep the spiders out

3 weeks ago | Likes 2 Dislikes 0

I know nothing about any of this except to ask the question, will this generate less heat? If so, and if it’s significantly less, then that’s possibly its strength.

3 weeks ago | Likes 1 Dislikes 0

Heat's going to be directly linked to power draw.

3 weeks ago | Likes 5 Dislikes 0

To expand on the other person's comment a little, ALL wattage becomes heat in any device/appliance. Even an oscillating fan that consumes 30 watts of energy, despite its apparent cooling effect, produces just as much heat as a 30-watt space heater (if such a thing existed).

3 weeks ago | Likes 4 Dislikes 0

Sounds like a law of thermodynamics. 😁

3 weeks ago | Likes 1 Dislikes 0

*that* old chestnut

3 weeks ago | Likes 2 Dislikes 0

Of course such a thing exists. It's me idling on the couch while staring holes into the air.

3 weeks ago | Likes 2 Dislikes 0

not entirely true with the fan, since some of the energy is being converted into kinetic energy. Most of it, actually, since simple electric motors are very efficient.

But in the case of a CPU or GPU yes, most of that energy becomes heat. Efficiency tends to be inversely proportional to heat generated.

3 weeks ago | Likes 1 Dislikes 0

Cool. I'm waiting for a technology that'll make it so people don't have to work 40, 50, 60+ hours a week for slave-wages just to make ends-meet while Jeff Bezos rents out half of Venice with our money to have a 50 million dollar wedding. Can anyone show me a technology like that?

3 weeks ago | Likes 25 Dislikes 3

May I present to you, the humble brick! Doing what peaceful protests can't since the dawn of throwing rocks.

3 weeks ago | Likes 2 Dislikes 0

Don't forget soup for our families.

3 weeks ago | Likes 1 Dislikes 0

Yes actually, the French perfected the technology in the late 1700s. For some reason we don't use them anymore though. We really should.

3 weeks ago | Likes 16 Dislikes 0

Modern Problems usually require modern solutions...

3 weeks ago | Likes 5 Dislikes 0

You don't need technology for that. You need politicians who actually work for the people instead of working for the corporations and the billionaires. And new tech that improves productivity can just as easily be misused again if the increase only goes into the pockets of the shareholders and upper management.

3 weeks ago | Likes 6 Dislikes 0

If only there were a way to make them make the right choices.

3 weeks ago | Likes 1 Dislikes 0

Sure can.

3 weeks ago | Likes 14 Dislikes 1

(Slow clap)

3 weeks ago | Likes 4 Dislikes 0

Every person working 40 hours a week should have a decent living. People like Bezos should pay their fair share, no more and no less.

3 weeks ago | Likes 7 Dislikes 1

Remember when all those people were going to storm Area 51? People should do that to Bezos' compound

3 weeks ago | Likes 3 Dislikes 0

The issue here is that billionaires are destructive parasites that shouldn't exist at all, so their "fair share" cannot even be expressed as a percentage. So even if Bezos' 233 billion were taxed @ 99.9%, he'd be left with 233 million. STILL UNACCEPTABLE. Therefore, society must instead institute some kind of "hard cap" on these parasites, perhaps 10 million, while ALL the remaining wealth goes to building a world that will mitigate the human misery caused by these vermin in the first place.

3 weeks ago | Likes 6 Dislikes 0

BuT tHe FReE mArKets!!

3 weeks ago | Likes 3 Dislikes 0

I'm of the opinion that there should be a wealth cap at 100 m. Once you've reached 100 million, every penny after that goes into social programs and back into wages of the company earning you said money.

We can even give them a plaque "congratulations! You beat capitalism."

But nobody, fucking nobody, has a legitimate need to possess 100 million or more. Companies might need it for large projects, but that's a different can of worms altogether

3 weeks ago | Likes 1 Dislikes 0

I watch the Jimmy Dore Show regularly and I agree with 95% of what is said there.

3 weeks ago | Likes 3 Dislikes 0

I mean 95% is 1000x better than what we've got, and if that could somehow be enforced as a first step it would certainly promote massive changes in quality of life for working class people. I'd take it in a second. Ultimately, however, there MUST be a cap to prevent the deep influence on politics held even by those with tens of millions. Alas, NONE of this matters anyway because, as Lucy Parsons correctly put it: "Never be deceived that the rich will allow you to vote away their wealth".

3 weeks ago | Likes 3 Dislikes 0

Moore's law misunderstood again. This optical chip cannot "continue Moore's law", because Moore's law has always been about doubling the number of *electronic transistors* on a chip every 2 years. https://en.wikipedia.org/wiki/Moore%27s_law

3 weeks ago | Likes 43 Dislikes 0

I thought Moore’s law was that if anything could go wrong, some idiot who mistakenly thinks they are a genius will come along and choose the explanation with the fewest assumed entities.

3 weeks ago | Likes 15 Dislikes 1

I think you're confusing Moore's Law with Murphy's Law and at the same time twisting Occam's Razor into it.

3 weeks ago | Likes 12 Dislikes 2

I won’t recommend shaving with any twisted razor

3 weeks ago | Likes 3 Dislikes 0

It does look like that, and I'm loving it :D

3 weeks ago | Likes 4 Dislikes 0

And Drumming-Kreuger, which is a cognitive bias stating that people who possess low ability in a particular subject area will often overestimate their ability to manifest positive results in their lives simply by focusing on positive thoughts.

3 weeks ago | Likes 5 Dislikes 0

Drumming-Cougar*

3 weeks ago | Likes 1 Dislikes 0

Dunning, mayhaps?

3 weeks ago | Likes 2 Dislikes 0

I thought it was Cole's Law, which I'm sure involves shredded cabbage somewhere along the line.

3 weeks ago | Likes 18 Dislikes 0

I think you're referring to Poe's law, the reason for the disclaimer in my bio.

3 weeks ago | Likes 3 Dislikes 0