The resolution I get is correct. For example, an image taken with my Nexus 6P has a resolution of 3648 × 2736 at 180 ppi (pixels per inch). When I use that photo with takeSnapshotAsync, I get an image of 1080x1080 - however, the ppi drops to 72, making the image a bit grainy. The result is for sharing on social media (SoMe), so that sucks.
My question is, is there any way to maintain PPI? I can’t see any options besides ‘quality’, so I find it strange that the quality is decreased. Tagging @ide , as you seemed knowledgeable about the issue when I asked how to correctly save to a desired resolution
Hi @jhalborg - I’m slightly confused by your question, since you said that the resulting image did have the correct resolution, 1080x1080 px. The ppi is not an inherent property of an image, but rather can change depending on how & where the image is displayed. You should be able to control the ppi (to some extent) when displaying an image.
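To illustrate why ppi is display metadata rather than a property of the pixel data itself, here is a quick sketch (plain arithmetic, using the Nexus 6P numbers from above):

```javascript
// ppi only matters when mapping pixels onto a physical size (print or screen).
// The same 3648-pixel-wide image "measures" differently depending on its ppi tag:
const widthPx = 3648;

const printWidthAt180 = widthPx / 180; // inches wide when tagged 180 ppi
const printWidthAt72 = widthPx / 72;   // inches wide when tagged 72 ppi

// The pixel data is identical in both cases;
// only the implied physical size changes.
console.log(printWidthAt180.toFixed(2), printWidthAt72.toFixed(2)); // 20.27 50.67
```

In other words, retagging the snapshot from 72 ppi back to 180 ppi would not add or remove a single pixel.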
You may want to try multiplying targetPixelCount and pixelRatio, rather than dividing them, to produce pixels. This would give you a resulting image that is actually larger than 1080x1080 px, but you could display it at 1080x1080 “logical”/“device-independent” pixels, so it renders at a higher pixel density.
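A minimal sketch of the divide-vs-multiply arithmetic, assuming a hypothetical `targetPixelCount` of 1080 and a device `pixelRatio` of 3 (what `PixelRatio.get()` returns on many recent iPhones):

```javascript
const pixelRatio = 3;          // e.g. PixelRatio.get() in React Native
const targetPixelCount = 1080; // desired output size in physical pixels

// Dividing: the snapshot is requested at 360 logical points,
// which a 3x device renders to roughly 1080 physical pixels.
const dividedPoints = targetPixelCount / pixelRatio;

// Multiplying, as suggested above: 3240 logical points,
// i.e. a snapshot much larger than 1080x1080 physical pixels.
const multipliedPoints = targetPixelCount * pixelRatio;

console.log(dividedPoints, multipliedPoints); // 360 3240
```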
The user can now place a logo as a ‘sticker’ on the image, let’s say Ryan Air.
I want to save the result by snapshotting the containing view for the user to share it on SoMe.
My question rephrased would therefore be: How can I make sure that no picture quality is lost in the snapshotting process? I just targeted 1080p, as that is what Instagram allows and it should look just fine, but the results are quite grainy.
EDIT: Better examples. Also, for a moment I thought maybe the phone’s screen resolution might have an impact, but that can’t be it, as the images above are generated on an iPhone 8, which has a pretty great screen.
One way of improving quality might be passing a higher pixel target to the method than 1080p, but that’d create a new problem AFAIK. If, for example, the user’s photo actually is 1080x1080 and I pass 2160x2160 to the takeSnapshotAsync method, wouldn’t that create a somewhat distorted image?
Just tried it with a targetPixelCount of 2000, and the resulting image is still exactly 72 PPI (measured with the Mac Preview app). I tried on both an iPhone 8 and a Nexus 6P, same result regardless.
So it seems to me that takeSnapshotAsync might be hardcoded to tag snapshots at a fixed 72 PPI? Maybe this should actually be a GitHub issue, @esamelson ?
The results you’re getting from quality are expected – quality does not change the pixel dimensions of an image, only how heavily its data is compressed.
If you want your image to look crisper, you need to either display the 1080x1080 screenshot at a smaller physical dimension (e.g. 540x540 “logical”/“device-independent” pixels) or take a screenshot with a larger actual pixel size (e.g. 2160x2160).
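As a sketch of the second option (capturing at a larger pixel size), here is a hypothetical helper that scales the requested snapshot dimensions before passing them along. The option names (`width`, `height`, `format`, `quality`, `result`) follow takeSnapshotAsync's options object, and the `viewRef` in the commented usage line is an assumed ref to the view being captured:

```javascript
// Hypothetical helper: build takeSnapshotAsync options for a square snapshot
// `factor` times larger than the base size, to be scaled down at display time.
function upscaledSnapshotOptions(baseSize, factor = 2) {
  const side = baseSize * factor;
  return { format: 'png', quality: 1, result: 'file', width: side, height: side };
}

// Usage inside a component (sketch, not runnable on its own):
// const uri = await takeSnapshotAsync(viewRef, upscaledSnapshotOptions(1080));
// ...then display the result at 1080x1080 (or smaller) logical pixels.

console.log(upscaledSnapshotOptions(1080)); // width/height of 2160
```

Note the trade-off mentioned earlier in the thread: if the source photo has fewer pixels than the requested snapshot size, the capture will upscale it, and upscaling cannot add detail.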
@esamelson - I see, that makes sense. PPI is the wrong metric.
However, there’s a clear downgrade in picture quality, no matter how many pixels I use.
Here is another example with a picture of my MacBook’s keyboard and speaker grill, complete with a lot of finger grease and a small hair. It was taken with an iPhone 8, and the original has a size of 4032x3024 pixels and is very sharp.
Both results are noticeably more blurry than the original, despite setting 2 being close to the original’s pixel count. Is there anything that can be done to preserve the quality of the original in the snapshot?
@jhalborg - if you compare the result of takeSnapshotAsync with a manual screenshot of the same screen on your device (e.g. on an iPhone, hold down power + press home), is it significantly different?
Hi @jhalborg - sorry for the silence here. Yes, if you could create a GitHub issue with the relevant information from this thread and an easy repro case, that would be great