zlacker

[return to "RealFill: Image completion using diffusion models"]
1. jawns+ao 2023-09-29 20:35:29
>>flavor+(OP)
There's definitely value in providing this functionality for photographs taken in the present.

But I think the real value -- and this is definitely in Google's favor -- is providing this functionality for photos you have taken in the past.

I have probably 30K+ photos in Google Photos that capture moments from the past 15 years. There are quite a lot of them where I've taken multiple shots of the same scene in quick succession, and it would be fairly straightforward for Google to detect such groupings and apply the technique to produce synthesized pictures that are better than the originals. It already does something similar for photo collages and "best in a series of rapid shots." They surface without my having to do anything.

2. Boppre+EP 2023-09-29 23:52:17
>>jawns+ao
That's exactly why I've been keeping all "duplicates" in my photo collections.

They do take up a lot of space, and just today I asked in photo.stackexchange for backup compression techniques that can exploit inter-image similarities: https://photo.stackexchange.com/questions/132609/backup-comp...

3. bick_n+Wf1 2023-09-30 06:06:10
>>Boppre+EP
The tiled/stacked approach others mention is probably the best option. You could also try an uncompressed format (even just uncompressed .png) or something simple like RLE, then 7zip the files together, since 7zip is the only archive format I'm aware of that does inter-file (as opposed to intra-file) compression.

Unfortunately, lossless video compression won't help here either, since lossless codecs compress each frame individually.
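To see why compressing similar files together pays off, here's a rough sketch using Python's stdlib lzma as a stand-in for 7zip's solid mode. The random blobs are hypothetical stand-ins for near-duplicate burst shots; the exact ratios will vary, but the shape of the result shouldn't:

```python
import os
import lzma

# Two "photos" that share most of their content: a common base plus a
# small per-file difference (stand-ins for two shots of the same scene).
base = os.urandom(200_000)
shot1 = base + os.urandom(1_000)
shot2 = base + os.urandom(1_000)

# Compressing each file on its own: random data barely shrinks, so this
# costs roughly the full size of both files.
separate = len(lzma.compress(shot1)) + len(lzma.compress(shot2))

# Compressing them in one stream: the second copy of `base` is encoded
# as back-references to the first, so the pair costs little more than one.
together = len(lzma.compress(shot1 + shot2))

print(f"separate: {separate}, together: {together}")
```

Same idea as `tar` + solid compression: the archiver's job is just to put the similar files next to each other in one stream.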

4. adrian+uh1 2023-09-30 06:29:05
>>bick_n+Wf1
Inter-file compression has been solved ever since tar | gz
5. beagle+3l1 2023-09-30 07:40:37
>>adrian+uh1
Not so. Gzip’s window is very small - 32K in the original gzip iirc - which means even two identical copies of a 33KB file can’t help each other.

Iirc it was bzip2 that bumped that up to about 1MB, and there are now compressors with larger windows - but files have also grown, so it’s not a solved problem for compression utilities.

It is solved for backup - bup, restic, and a few others will do that across a backup set with no “window size” limit.

…. And all of that is only true for lossless data - which typical image and video formats are not.
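The window-size point is easy to demonstrate with Python's stdlib, using zlib (DEFLATE, gzip's algorithm, 32KB window) against lzma (multi-megabyte dictionary by default). The sizes below are illustrative, not exact:

```python
import os
import zlib
import lzma

# ~100 KB of incompressible data, then an exact duplicate right after it.
blob = os.urandom(100_000)
doubled = blob + blob

# DEFLATE's 32 KB window can't reach back to the first copy, so two
# copies cost roughly twice as much as one.
print("zlib: ", len(zlib.compress(blob)), len(zlib.compress(doubled)))

# LZMA's dictionary is megabytes by default, so the second copy is
# encoded as back-references and is nearly free.
print("lzma: ", len(lzma.compress(blob)), len(lzma.compress(doubled)))
```

Deduplicating backup tools take this further by chunking and hashing content, which removes the window limit entirely.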
