
Optical Chip With 100x Parallel Processing
Oktay Yürük aka Oktay74tn, science and tech content
https://imgur.com/user/Oktay74tn
This is Meteor-1, a 100-way parallel optical processor developed by the Shanghai Institute of Optics and Fine Mechanics. It processes information with light instead of electricity and can work on 100 tasks at the same time.
This is the design of the 100-channel wavelength-parallel optical computer Meteor-1. A microcomb-based spectral encoder produces more than 100 phase-locked carriers across the telecommunication band. Matrix-matrix multiplication is performed in a broadband Mach-Zehnder interferometer (MZI) mesh. The output vectors are demultiplexed by dense wavelength-division multiplexing (WDM) with 50 GHz channel spacing for parallel photodetection.
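The core idea can be sketched in a few lines of NumPy (a conceptual illustration, not the authors' code; the mesh size and channel count are placeholders): one MZI mesh realizes a single fixed matrix M, and because each microcomb wavelength carries its own input vector, a single pass through the mesh computes a full matrix-matrix product.

```python
import numpy as np

rng = np.random.default_rng(0)

n_ports = 8          # hypothetical mesh size (ports of the MZI mesh)
n_wavelengths = 100  # parallel carriers from the microcomb

M = rng.normal(size=(n_ports, n_ports))        # matrix set by the MZI phases
X = rng.normal(size=(n_ports, n_wavelengths))  # one input vector per wavelength

# The WDM demultiplexer separates the channels again, so each
# photodetector array sees one column of the product:
Y = M @ X

# Every wavelength channel independently carries M @ x_k:
for k in range(n_wavelengths):
    assert np.allclose(Y[:, k], M @ X[:, k])
```

This is the sense in which the chip does "100 tasks at once": the same matrix is applied to 100 different inputs in a single optical pass.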
The MZI-mesh optical computing chip was fabricated on a silicon-on-insulator (SOI) platform. It is driven by a 64-channel matrix voltage source that supplies 0 to 10 V with a precision of one millivolt.
These diagrams are port-to-port maps of an identity matrix configuration (top) and a random matrix configuration (bottom) at three different wavelengths. The fidelity without extra correction is 0.9, which is very good.
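For readers wondering what "fidelity" means here: the post doesn't reproduce the paper's exact metric, but a common convention for comparing a measured transfer matrix against its target is a normalized matrix overlap, sketched below (the `matrix_fidelity` function and the 4x4 example are illustrative, not taken from the paper).

```python
import numpy as np

def matrix_fidelity(target: np.ndarray, measured: np.ndarray) -> float:
    """Normalized overlap between two transfer matrices.

    Returns 1.0 for a perfect match. This is one common convention;
    the paper may use a different exact definition.
    """
    num = np.abs(np.trace(target.conj().T @ measured))
    den = np.sqrt(np.trace(target.conj().T @ target).real *
                  np.trace(measured.conj().T @ measured).real)
    return float(num / den)

# Illustrative example: identity target vs. a slightly noisy measurement
rng = np.random.default_rng(1)
target = np.eye(4)
measured = target + 0.05 * rng.normal(size=(4, 4))
print(matrix_fidelity(target, measured))  # close to 1.0
```

By the Cauchy-Schwarz inequality this value never exceeds 1, so 0.9 without any calibration is indeed a respectable raw result.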
Meteor-1 can deliver a theoretical peak performance of 2,560 tera-operations per second (TOPS) at a 50 GHz optical clock rate, within reach of the latest NVIDIA RTX 5090 and its 3,352 TOPS. Power efficiency is computation throughput divided by power consumption; the researchers estimate it at 3.2 TOPS per watt or better.
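A quick back-of-the-envelope check of these figures (the 575 W RTX 5090 board power is a commonly quoted figure, not from the paper):

```python
# Meteor-1: implied power draw at peak throughput
meteor_peak_tops = 2560       # theoretical peak, TOPS
meteor_tops_per_watt = 3.2    # estimated efficiency
implied_power_w = meteor_peak_tops / meteor_tops_per_watt
print(implied_power_w)        # 800.0 W implied at full peak throughput

# RTX 5090 efficiency for comparison
rtx5090_tops = 3352
rtx5090_watts = 575           # commonly quoted board power (assumption)
print(round(rtx5090_tops / rtx5090_watts, 2))  # 5.83 TOPS per watt
```

So the quoted efficiency is not yet ahead of the best GPUs, which is exactly the point raised in the comments below.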
The new optical chip has several advantages: it is scalable, it does not require dedicated new infrastructure, and it combines high performance with low energy consumption.
New designs such as the parallel optical computer can continue Moore's Law, which predicts a doubling of computing power every two years. They can accelerate the artificial intelligence revolution.
Parallel optical computing capable of 100-wavelength multiplexing
Xiao Yu, Ziqi Wei, Fangyuan Sha, Xinyu Wang, Yanqi Chu, Zhen Wang, Xilin Han, Hongwei Wang, Shulan Yi, Yuhu Cheng, Guangwei Hu & Peng Xie
https://elight.springeropen.com/articles/10.1186/s43593-025-00088-8
Optical chip achieves 100-way parallel processing
https://www.futuretimeline.net/blog/2025/06/26-breakthrough-optical-chip-performs-100-tasks-at-once.htm
Wikipedia articles
https://en.wikipedia.org/wiki/Mach-Zehnder_interferometer
https://en.wikipedia.org/wiki/Moore's_law
https://en.wikipedia.org/wiki/Parallel_computing
https://en.wikipedia.org/wiki/Optical_computing
Check out https://imgur.com/user/Oktay74tn for many more astronomy and technology videos.
ThailandExpress
Only 10 years ™ from development
DevourerOfOtters
Neat, but as someone who works with Photonics, this isn't as new as it seems. I see that it still uses thermal phase shifters in the MZI, which are slow to change. So, you can do 100 operations at once, and the same operation on different data very fast, but if you change the second matrix operation, you have to wait for the MZI elements to cool/heat to the right temp. Here is a review written by some people I work with: https://www.nature.com/articles/s41586-020-2764-0
quadraspaz1
Is this a quantum processor?
Oktay74tn
No, it is a normal processor. An optical parallel quantum computer would be the best that I can imagine. I am sure that this will come one day. AI is a powerful tool for new designs.
BoobJiggle
Oh hell yeah! This is technology that I read about quite a long time ago and was excited to see progress, but it seemed to have died down for a long time.
That kind of parallel processing opens the doors for some incredible computational advancement. Thank you so much for posting this!
Oktay74tn
You are welcome :) . I find this very fascinating too.
BrickSprickly
Yes but can it stop climate change
Oktay74tn
The internet infrastructure consumes about 2-3% of the world's energy. Optical computers are superior to Si-based computers because photons don't generate heat. In my opinion, standard software like internet browsers could also be optimized for lower resource use.
GeorgeCostabaplaps
We've been waiting for optronics and positronics for quite some time. It seems we missed the m-m math functions as being critical
HoldThatTHOT
This is freaking awesome, thank you for sharing cool technology
WiiShaker
Agree.
dbox
Top notch post. Thank you
seheim
"The fidelity without extra correction is 0.9". Well, consider me very whelmed
YippeeKayakOB
Sure, but can it run Doom?
Oktay74tn
No, only Tetris :) .
CitrusyGarlic
With the electric grid incredibly strained the way it is these days, when I hear about a new type of processor my first thought isn't about how powerful it's capable of being, it's about how much power it requires to operate.
technicalfool
That's why the datacentres that imgur loves to hate are put near generation - so you don't have overloaded transmission lines. And also a good reason to look into the cost of solar panels and batteries for your own stuff.
BrickSprickly
Woah woah guy do you think humanity will survive this?
CitrusyGarlic
Well, I dunno about you, but I'm not just going to lay down and die.
BrickSprickly
I've thought about that
Adester
With a consumption of one watt per 3.2 TOPS, this would still be 800 watts, without any memory. So, for now this is more power hungry than the "conventional" way of doing AI (RTX Pro 6000 96 GB => 600 watts).
If this wants to compete, the power draw has to come down by a large margin to offset the initial cost, which is usually several times higher for new tech compared to established tech.
Whatdoyousaytoanicecupoftea
You sound like you know what you're talking about....
mak10z
I mean, it's gen 1 (or hell, a prototype for all I know). I'm sure given enough R&D the power consumption is going to come down. The tech is interesting; I wonder how miniaturized they can get it, and how light and optics work at the angstrom level. I would assume at those scales signal bleed-through is still an issue, as at that point a photon and an electron interact similarly.
Snowentific
I'd imagine you'll have real issues at sub-nm scales, since that's way below the wavelength of visible light. It's why we use electrons for very high resolution microscopes, where the thing you're looking at is smaller than the wavelength of any light you could use to look at it. Might be you need to increase the energy of the light you use here, but then would the components survive?
JamesFaction
Yeah, I was going to say: the Nvidia 5090 is 3,352 TOPS at 575 watts, which is 5.83 TOPS per watt. 3.2 is not much more than half the efficiency of a 5090.
nclu
I've heard about light based prototypes my entire life. It's not even my field, but it's often a story that gets coverage when something working comes along. But working prototypes and their press releases are always so cool. I wish them all the luck in the world, even if they can't yet compete with an incredibly mature and optimized technology, yet.
sadurdaynight
And these things eventually reach a point where they have to convert the optical signal to electronic. Big chip makers (AMD, Intel, Nvidia) are working with a smaller company to get an optical chip that (I think) would let them treat a bunch of GPUs like a GPU pool that just gets work farmed to it, with results coming back, to massively scale GPU clusters. Optical computing and chips are near future, but they have to convert to an electronic signal for storage and such.
XKSapphire
The computations themselves do not require an interposer. Data entering and leaving the chip do right now though, but when you're talking this kind of scale, it becomes irrelevant, especially with optical memory.
sadurdaynight
I think light based computing has drastically lower heat generation, so that might offset cost some.
XKSapphire
You're trying to compare operations between 2 entirely different nodes of execution. This would be like comparing bead memory to HBM2, then comparing the CAS rate. It doesn't make sense.
Ondradactyl
They just need to use light of Eärendil's star. Problem solved!
drduffer
(@Ngugi has entered the chat.)
Ngugi
Will also keep the spiders out
drduffer
I know nothing about any of this except to ask the question, will this generate less heat? If so, and if it’s significantly less, then that’s possibly its strength.
PlueRyvius
Heat's going to be directly linked to power draw.
darbythemiddle
To expand on the other person's comment a little, ALL wattage becomes heat in any device/appliance. Even an oscillating fan, which might consume 30 watts of energy, despite its apparent cooling effect, produces just as much heat as a 30-watt space heater (if such a thing existed).
drduffer
Sounds like a law of thermodynamics. 😁
ItHappenedInThe20thCentury
*that* old chestnut
Adester
Of course such a thing exists. It's me idling on the couch while staring holes into the air.
JamesFaction
Not entirely true with the fan, since some of the energy is being converted into kinetic energy. Most of it, actually, since simple electric motors are very efficient.
But in the case of a CPU or GPU yes, most of that energy becomes heat. Efficiency tends to be inversely proportional to heat generated.
NChomsky
Cool. I'm waiting for a technology that'll make it so people don't have to work 40, 50, 60+ hours a week for slave-wages just to make ends-meet while Jeff Bezos rents out half of Venice with our money to have a 50 million dollar wedding. Can anyone show me a technology like that?
Euclid11010
May I present to you, the humble brick! Doing what peaceful protests can't since the dawn of throwing rocks.
NChomsky
Don't forget soup for our families.
ShadeWilson2
https://media2.giphy.com/media/v1.Y2lkPWE1NzM3M2U1djE1ZHhoYW9jNHphcWhnMDF0OGtkbDh5MmVlYm9tdm5jd21yeDBxdSZlcD12MV9naWZzX3NlYXJjaCZjdD1n/12gxeCI1BGKAj6/200w.webp
KratArona
Yes actually, the French perfected the technology in the late 1700s. For some reason we don't use them anymore though. We really should.
NChomsky
Modern Problems usually require modern solutions...
RiverHawk
You don't need technology for that. You need politicians that actually work for the people instead of working for the corporations and the billionaires. And new tech that improves productivity can just as easily be misused again if the increase only goes into the pockets of the shareholders and upper management.
rbudrick
If only there were a way to make them make the right choices.
NChomsky
AyatollahBahloni
Sure can.
TonyBalogna
(Slow clap)
Oktay74tn
Every person working 40 hours a week should have a decent living. People like Bezos should pay their fair share, no more and no less.
BrickSprickly
Remember when all those people were going to storm Area 51? People should do that to Bezos' compound
NChomsky
The issue here is that billionaires are destructive parasites that shouldn't exist at all, so their "fair share" cannot even be expressed as a percentage. So even if Bezos' 233 billion were taxed @ 99.9%, he'd be left with 233 million. STILL UNACCEPTABLE. Therefore, society must instead institute some kind of "hard cap" on these parasites, perhaps 10 million, while ALL the remaining wealth goes to building a world that will mitigate the human misery caused by these vermin in the first place.
BrickSprickly
BuT tHe FReE mArKets!!
BoobJiggle
I'm of the opinion that there should be a wealth cap at 100 m. Once you've reached 100 million, every penny after that goes into social programs and back into wages of the company earning you said money.
We can even give them a plaque "congratulations! You beat capitalism."
But nobody, fucking nobody, has a legitimate need to possess 100 million or more. Companies might need it for large projects, but that's a different can of worms altogether
Oktay74tn
I watch the Jimmy Dore Show regularly and I agree with 95% of what is said there.
NChomsky
I mean 95% is 1000x better than what we've got, and if that could somehow be enforced as a first step it would certainly promote massive changes in quality of life for working class people. I'd take it in a second. Ultimately, however, there MUST be a cap to prevent the deep influence on politics held even by those with tens of millions. Alas, NONE of this matters anyway because, as Lucy Parsons correctly put it: "Never be deceived that the rich will allow you to vote away their wealth".
DrKonrad
Moore's law misunderstood again. This optical chip can not "continue Moore's law", because Moore's law has always been about doubling the number of *electronic transistors* on a chip every 2 years. https://en.wikipedia.org/wiki/Moore%27s_law
Imalwaysready
I thought Moore’s law was that if anything could go wrong, some idiot who mistakenly thinks they are a genius will come along and choose the explanation with the fewest assumed entities.
Kodan00
I think you're confusing Moore's Law with Murphy's Law and at the same time twisting Occam's Razor into it.
gumshoe99
I won’t recommend shaving with any twisted razor
MadHatter69
It does look like that, and I'm loving it :D
Preincarnated
And Drumming-Kreuger, which is a cognitive bias stating that people who possess low ability in a particular subject area will often overestimate their ability to manifest positive results in their lives simply by focusing on positive thoughts.
TheOncelor
Drumming-Cougar*
gobblinal
Dunning, mayhaps?
technicalfool
I thought it was Cole's Law, which I'm sure involves shredded cabbage somewhere along the line.
somethingsomethingwittyhere
I think you're referring to Poe's law, the reason for the disclaimer in my bio.