4.8 million frames per second is wild. Currently the Phantom can achieve, I think, 1.5M fps in black and white at a "postage stamp" resolution.
I wonder what resolution this camera could output, but I'd guess it's smaller than 120px.
They should probably have indicated how many frames they can capture too.
> The team created a DRUM camera with a sequence depth of seven frames, meaning that it captures seven frames in each short movie.
It seems their movies can only be 7 frames long, so they're not really comparable to Phantom-type cameras, which can capture far more content.
Still, it’s great to see the mirror tech used in DLP projectors finding other uses. Some 3D resin printers use it too, though that’s still projection, I guess. Those DLP mirror chips are amazing and would probably be fascinating to hack around with.
> He realized that rapidly changing the tilt angle of periodic facets on a diffraction grating, which can generate several replicas of the incident light traveling in different directions, could present a way to sweep through different spatial positions to gate out frames at different time points.
My guess is that the ‘sequence depth’ relates to the number of diffraction gates. So perhaps all that’s needed is an ultrafast shutter to string together a longer ‘movie’.
The diffraction grating (not "gate" — the process is gating, but the component is a grating) has a number of possible diffraction modes, dependent on the angular range of the DMD (an array of tiny mirrors that flip back and forth between two positions). In this case there are 7 possible diffraction modes, and the sequence depth (the term for the number of frames) is 7 in reflection of this.
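A back-of-envelope sketch of that counting argument (all numbers here are my assumptions for illustration, not from the paper): treating the DMD as a grating with mirror pitch d, the grating equation d·sin(θ_m) = m·λ at normal incidence gives the angle of each diffraction order m, and only the orders whose angles fall inside the mirrors' sweep range can be addressed:

```python
import math

def orders_in_sweep(pitch_um, wavelength_um, max_angle_deg):
    # Grating equation at normal incidence: sin(theta_m) = m * lambda / d.
    # Keep only the orders whose diffraction angle lies within the
    # angular range the tilting mirrors can sweep the beam through.
    sin_max = math.sin(math.radians(max_angle_deg))
    return [m for m in range(-50, 51)
            if abs(m * wavelength_um / pitch_um) <= sin_max]

# Hypothetical values: ~7.6 um mirror pitch, 800 nm light, and mirrors
# that sweep the reflected beam over roughly +/-24 degrees.
orders = orders_in_sweep(7.6, 0.8, 24.0)
print(len(orders), orders)  # 7 orders: m = -3 .. +3
```

With these made-up-but-plausible numbers you get exactly 7 addressable orders, matching the sequence depth of 7 — though the real system's numbers may differ.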
I didn't mean to conflate the two, it's gating through a... 'grating'. Though the article and the paper do use the term 'diffraction gating'.
> "Luckily, it is possible to accomplish this type of swept diffraction gate by using a digital micromirror device (DMD)—a common optical component in projectors—in an unconventional way," said Liang. "DMDs are mass-produced and require no mechanical movement to produce the diffraction gate, making the system cost-efficient and stable."
It's wild indeed. Maybe even wilder is that a camera capable of taking one million frames per second was developed in the 1940s. It was called the Rapatronic camera [1],[2]. I can't provide a link, but I read in the book "Dark Sun" by Richard Rhodes that they had a 3.5 million frames per second camera when they detonated the first thermonuclear bomb, Ivy Mike, in 1952. I found in a different place that the guy who built this camera was a certain Berlyn Brixner. The Wikipedia page for Berlyn Brixner [3] does not mention this detail, unfortunately.
> He realized that rapidly changing the tilt angle of periodic facets on a diffraction grating, which can generate several replicas of the incident light traveling in different directions, could present a way to sweep through different spatial positions to gate out frames at different time points.
Is this saying that light through a lenticular lens sheet comes out slightly early/late depending on angles, and the sheet could be used as an array of delay lines with different time constants?
I don't think so, I think it's more like sweeping an image across multiple sensors over time using the grating (probably via one of the piezo optical effects?)
But I'm not familiar with your terminology, so maybe that's what you said :)
Edit: Ah, they're using a DMD. The paper is published here: https://opg.optica.org/optica/fulltext.cfm?uri=optica-10-9-1.... IIUC, while the DMD mirrors are switching between on and off you get different diffraction patterns as they shift, and these different diffraction orders move the image on the camera.
Suppose you have two points, X and Y, each distance d from some origin O. At time 0, some light is emitted from O towards both X and Y. The earliest it can arrive at either of them is time d/c, where c is the speed of light in a vacuum.
If you want the light to arrive at different times you can either do something to make the paths different lengths, or to make the speed of light different on the two paths, or both. You can't make the speed faster than c on either of the paths but you can make it slower by making the light go through something with an index of refraction greater than 1.
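To put some numbers on that: in a medium with refractive index n, light travels at c/n, so the traversal time of a path of length L is n·L/c. A quick sketch (the 1 m path and n ≈ 1.5 for glass are just illustrative values):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def arrival_time(path_length_m, refractive_index=1.0):
    # Light travels at c/n in a medium of index n,
    # so a path of length L takes n * L / c to traverse.
    return refractive_index * path_length_m / C

# Two equal 1 m paths: one in vacuum, one through glass (n ~ 1.5).
t_vacuum = arrival_time(1.0)       # ~3.34 ns
t_glass = arrival_time(1.0, 1.5)   # ~5.00 ns
print(t_glass - t_vacuum)          # glass path is ~1.67 ns later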
Now suppose you've got some thing that you have built to process light, and you care about the differences in arrival time of the light to different parts of the thing. There's probably going to be some part of your thing that it makes the most sense to use as a reference, and then describe light arrival times elsewhere relative to that reference point.
There's usually no particular reason to pick as the reference the point on your thing that light first reaches. If you do indeed pick a reference that is not the earliest place of contact with the light, then you can have light arrive earlier at other parts of your thing.
Light either arrives exactly on time, or it arrives late. Whatever other convoluted description you want to come up with, light cannot be early based on your equation of d/c.
Consider light from a distant source, sufficiently far away that the wavefront at your instrument is planar. You have a planar sensor array parallel to the incoming wavefront, so the wavefront arrives at all sensors at the same time.
There are different kinds of sensors sensing different aspects of the incoming light. You need to combine them, making sure that you are combining readings that come from the same plane of the wavefront.
But one of your sensors takes a little longer than the others to process and produce a reading. You either need to make that sensor start processing earlier than the other sensors, or make the other sensors start processing later, or add some sort of delay between the other sensor outputs and the thing that combines all the sensor outputs.
If you fix this by moving that one sensor a little out of your sensor plane, moving it toward the light source, most people, including most scientists, would say that your fix was to make the light arrive at that sensor earlier, because the light in fact arrives at that sensor now before it arrives at the other sensors.
Let's say that the sensors process different wavelengths of light. Then another adjustment you might do to get the outputs synced would be to fill the space between where light enters and the sensor plane with a gas that happens to have a low index of refraction for the wavelengths the slow sensor uses and a high index for the wavelengths the other sensors use.
That would make the light arrive later at all the sensors than it would have if the gas had not been added, but less so at the slow sensor. But again, nearly everyone would say that your fix made the light arrive earlier at the slow sensor, because before the fix light arrived at the slow sensor at the same time it arrived at the other sensors, and after the fix it arrives at the slow sensor first.
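The geometric version of that fix is easy to quantify: to compensate a sensor whose processing lags by Δt, move it toward the source by c·Δt (assuming n ≈ 1 along the path). A minimal sketch with an illustrative 100 ps lag:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def offset_for_lag(extra_processing_s, n=1.0):
    # Move the slow sensor toward the source by this distance so the
    # light reaches it earlier by exactly its processing lag.
    # In a medium of index n, light covers (c/n) * dt in time dt.
    return (C / n) * extra_processing_s

# A sensor that is 100 ps slower must sit ~3 cm closer to the source.
print(offset_for_lag(100e-12))  # ~0.030 m
```

The numbers make the point concrete: at the speed of light, 100 ps of processing lag corresponds to about 3 cm of geometry, which is very much a buildable offset.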
So if light arrives through a fiber-optic cable, you will insist it's never on time? I don't think that's a great definition. Please stop acting like it's the only reasonable definition.
The speed of light isn't constant. Light travels slower in glass and water than in air. The difference in light speed in different materials is why refraction and lenses work. Additionally, light can bounce around inside a prism or a lens and take a longer path which takes more time.
Sure it can. If the material being used allows one speed normally, then by changing something like a voltage you can make the light go faster or take a longer route. Then you can speak of light being “earli(er)” or late compared to the default state.
Are you listening to the words you're saying? Seriously? Yes, we can make light go slower so that the sensor receives light at different times. I'm with you all the way to the point you make some ridiculous, not-thought-out comment about making light go faster. No. Just stop. We cannot do that. We can only slow light down.
Nobody is saying they are increasing c, everything here is relative. You are making many comments that are misinterpreting what people are saying to you.
Rather than be incredulous and faux-shocked, go look up cherenkov radiation.
It's electromagnetic radiation emitted when a charged particle (such as an electron) passes through a dielectric medium at a speed greater than the phase velocity (the speed of propagation of a wavefront in that medium) of light in that medium.
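The threshold condition is just v_particle > c/n, where the particle's speed is still below c itself. A quick check with water (n ≈ 1.33) as the example medium:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def cherenkov_threshold_speed(n):
    # A charged particle emits Cherenkov radiation when its speed
    # exceeds the phase velocity c/n of light in the medium --
    # still below c, so nothing outruns light in vacuum.
    return C / n

# In water (n ~ 1.33), the threshold is about 0.75 c.
v_threshold = cherenkov_threshold_speed(1.33)
print(v_threshold / C)  # ~0.752
```

So in a nuclear reactor pool, electrons moving faster than ~75% of c (but slower than c) outrun light *in the water* and produce the characteristic blue glow.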
If your relative time is based on measuring the event through a glass of water, then by removing the glass of water the light would get there faster, thus an earlier registration of the event.
Without having read more than can be gleaned from the article, it sounds like it's using similar principles. That is, the image is transformed into a domain (frequency) that is more amenable to rapidly capturing the necessary information without time-gating directly. It's not obvious to me what the setup is, but this type of approach is neat.
Pointing at several different sensors, or relying on the speed of light to time-shift arrival at the sensor? I still can't tell from the article, but either would be a neat trick.