Last week's post on improving SL image quality via a workaround from SLer Frenchbloke Vanmoer caused a lot of skepticism, mostly due to some poor technical wording on my part. (My bad and mea culpa!) However, Firestorm support and QA volunteer Whirly Fizzle came by the Comments and confirmed that this does, indeed, apparently work:
When I first read this blog post I thought it was a load of cobblers. But Frenchbloke is actually correct! The blog post just explains it badly. [Sorry! - ed]
This will also work just as well on the Linden Lab viewer & probably every other viewer. Those debug settings are from the Linden Lab viewer, they are not Firestorm specific.
It's easy enough to test for yourself. Get hold of a super high quality 8192 image with lots of fine detail & make sure it's saved in a lossless format - I used PNG in my test.
Set the debugs max_texture_dimension_X and max_texture_dimension_Y to 8192.
Upload your 8192x8192 image.
This image will upload and the uploader will resize the image to the maximum allowed size of 1024x1024 - this limit is enforced server side, there is no way around it.

Now open your 8192 PNG in Photoshop & reduce the image size to 12.5% using Bicubic Sharper (best for reduction) & save this image.
Set the debugs max_texture_dimension_X and max_texture_dimension_Y back to default & upload your image.
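The test steps above boil down to comparing two downscale paths, and why the path matters comes down to filtering. Here is a minimal 1-D sketch (illustration only - not the actual viewer or Photoshop code; both function names are mine) showing how a downscale that averages over the source preserves fine detail that naive point sampling destroys:

```python
def point_sample(signal, factor):
    """Downscale by keeping every factor-th sample (no filtering)."""
    return signal[::factor]

def box_average(signal, factor):
    """Downscale by averaging each block of `factor` samples."""
    return [sum(signal[i:i + factor]) / factor
            for i in range(0, len(signal), factor)]

# A checkerboard-like pattern: detail finer than the target resolution.
fine_detail = [0, 255] * 32                  # 64 samples of alternating texture
down_point = point_sample(fine_detail, 8)    # [0]*8 - the detail aliases away
down_box = box_average(fine_detail, 8)       # [127.5]*8 - averaged to mid-gray
```

The point-sampled result pretends the texture was solid black; the averaged result correctly renders the fine pattern as gray. Real resamplers use better kernels than a plain box filter, but the principle - more source pixels per output pixel gives the filter more to work with - is the same one the 8192 trick exploits.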
More from Whirly, including sample comparison images they took:
Compare these 2 uploaded textures side by side.
The 8192 texture resized by the uploader is much sharper, has less noise & the highlights pop more than the texture that was 1024 before upload.
I'm quite surprised the uploader does a better job of the resize than Photoshop tbh.
I asked one of the Firestorm developers what was going on here & they are still investigating, but initially the thought is that this trick has more validity than it first seems.
You give the viewer a texture to upload & it's converted to a 1024x1024 j2k texture (JPEG 2000).
If you upload a 1024, the j2k compression goes through and does lossy compression on the data you give it.
If you upload an 8192 (allowed by changing those debug settings) the viewer will apply the same lossy compression, except this time it has 8 times the detail on input to work with.

Also to note, using this trick will not cause any extra "lag" - the uploaded 8192x8192 image is still 1024x1024 after upload...
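The upload pipeline described above can be sketched as a simple dimension rule: whatever source size you hand the viewer, it is scaled down to the server-enforced 1024 cap before JPEG 2000 encoding. A minimal sketch (the function names are mine, and the snap-to-power-of-two detail is an assumption based on SL's power-of-two texture convention, not confirmed by the post):

```python
import math

def viewer_resize_dims(width, height, max_dim=1024):
    """Sketch: each dimension is snapped to SL's power-of-two grid,
    then clamped to the server-enforced cap, before j2k encoding."""
    def snap(d):
        p = 2 ** round(math.log2(d))   # nearest power of two (assumed)
        return min(p, max_dim)         # the hard server-side 1024 cap
    return snap(width), snap(height)

print(viewer_resize_dims(8192, 8192))  # the 8192 trick still lands at 1024x1024
print(viewer_resize_dims(512, 512))    # an already-legal size passes through
```

Either way, what reaches the server is at most 1024x1024 - the trick changes only what the viewer's own resizer gets as input, not what is stored or streamed.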
On the left is the original 8192 PNG image uploaded by changing the debug settings.
On the right is the same PNG reduced to 1024 in Photoshop using Bicubic Sharper (best for reduction) before upload.
It's pretty obvious the 8192 texture on the left is better quality even though both images after upload are 1024x1024.
Also note that, viewed in Windows photo viewer before upload, the Photoshop-resized 1024 image does not have the compression artifacts you can see in its uploaded texture on the right.
Photoshop did a fine job at resizing the texture.
The viewer upload/compression/resize clearly reduces quality much more when uploading the 1024 image than it does when uploading the 8192 image.

Whirly Fizzle
Firestorm support & QA.
And there's a whole forum thread on this from Beq Janus going into further detail, and there's even more detail and discussion in the original post's Comments thread.
I'm sure it won't be long until LL enforces an image file dimension check server side, the same way they are limiting the number of group tag changes due to scripted RLV devices that animate group tags.
Unfortunately the majority of the SL community does not know how to correctly use textures to make creations that are optimized to minimize "lag".
Posted by: Gattz Gilman | Wednesday, February 20, 2019 at 05:37 PM
Sadly, while the observations here are correct the post is just as full of misinformation as the earlier post that Whirly and I took time to try to correctly explain. It seems that misinformation rules the day.
*You cannot upload higher than 1024x1024 - try it yourself*
The lab strictly enforces the resolution of ALL image uploads. This is simple fact. All that is happening in this example is that the source image that you are allowed to specify in the uploader can be larger than the default 2048. It then gets resized to the target dimensions of 1024 BEFORE it is sent to the lab.
The "improvement" attained is not through the increased size but instead through the algorithm being used to do the resizing. It will give enhanced details in certain use cases and more grainy results if you have a smooth texture gradient.
This was explained in detail in my blog http://beqsother.blogspot.com/2019/02/compression-depression-tales-of.html and discussed with other creators in the SL forums https://community.secondlife.com/forums/topic/433164-bi-curious-you-should-be-or-why-your-assumed-wisdom-may-not-be-correct/
TL;DR when resizing your images always pick the best algorithm. In general, to preserve fine detail use the bilinear approach; to maintain smooth gradients use bicubic.
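The bilinear-vs-bicubic trade-off Beq describes can be seen in a tiny 1-D sketch. Cubic kernels have negative lobes, which sharpens edges but can overshoot past the input range (the "ringing" or grain visible on hard detail); linear interpolation never leaves the range of its two samples. This uses the Catmull-Rom cubic as a stand-in for a typical "bicubic" kernel - an illustrative choice, not necessarily the kernel any particular tool uses:

```python
def linear(p1, p2, t):
    """1-D linear interpolation: the result always stays within [p1, p2]."""
    return p1 + (p2 - p1) * t

def catmull_rom(p0, p1, p2, p3, t):
    """1-D Catmull-Rom cubic, a common 'bicubic'-style kernel.
    Its negative lobes sharpen edges but can overshoot the data."""
    return 0.5 * (2 * p1
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t * t
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t * t * t)

# Interpolating just before a hard edge (samples 0, 0, 0, 255):
lin = linear(0, 0, 0.5)               # stays at 0.0, clamped to the data
cub = catmull_rom(0, 0, 0, 255, 0.5)  # dips below 0: undershoot / ringing
```

On a smooth gradient the cubic's extra terms cancel and it reproduces the ramp exactly, which is why it wins on gradients while the overshoot shows up on fine detail.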
Beq
Posted by: Beq Janus | Sunday, February 24, 2019 at 03:26 AM
Actually, the wording of my first comment is a little strong. This post goes a long way to correcting the misinformation, thank you Hamlet.
There is still a misdirection here, though: the bottom line is to keep control of textures yourself, use the beta/test grid to verify texture uploads, and experiment with the best resizing for your use case. Much as I would love to think that in the viewer we can do a better job than Adobe, it is not the case; we simply do a different job than the Adobe defaults, and in some cases that works out better.
These posts and the resultant investigations and conversations have been useful and have led to some interesting subsequent discussion about whether the 2048 limit should be in place, and the answer, from my perspective, is in two parts.
1) The limit itself serves no real purpose. Ultimately the viewer will upload a 1024 so limiting the input size makes no sense....but...
2) As was raised in the past in this blog, one of the problems we face in SecondLife is inefficient content design, and a move that encourages the creation of high resolution (1024) textures as part of a default workflow is going to lead to a worsening of this situation.
To this end I have proposed an amendment to the viewer (and server) that would allow for the simplified creation of multiple resolutions in a single transaction. https://jira.secondlife.com/browse/BUG-226352
Ultimately I would like to see this generate a range of resolutions from a single (high resolution) source image, for a single upload fee. I think the feature has value without the single fee change, but the take up would be far lower.
The hope is that with such a feature, artists and creators can work with high-resolution images and the viewer will produce multiple resolution uploads allowing the creators to test the appearance and make better decisions about when a 1024x1024 image is really needed.
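The multi-resolution workflow Beq proposes could look something like the following sketch: one high-resolution source yields a chain of successively halved versions for the creator to compare. This is purely illustrative - the function name, the 2x2 box averaging, and the stopping size are my assumptions, not the design in BUG-226352:

```python
def mip_chain(image, min_dim=256):
    """Sketch of a multi-resolution generator: from one square source
    (a list of rows), emit successively halved versions by 2x2 averaging."""
    chain = [image]
    while len(image) > min_dim:
        half = len(image) // 2
        image = [[(image[2 * y][2 * x] + image[2 * y][2 * x + 1]
                   + image[2 * y + 1][2 * x] + image[2 * y + 1][2 * x + 1]) / 4
                  for x in range(half)]
                 for y in range(half)]
        chain.append(image)
    return chain

# A tiny 8x8 stand-in for a 1024 source: yields 8x8, 4x4, and 2x2 versions.
levels = mip_chain([[float(x * y) for x in range(8)] for y in range(8)], min_dim=2)
```

With something like this in the uploader, a creator could inspect each level in-world and pay for (and rez) only the resolution the object actually needs.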
Beq
Firestorm Viewer Dev.
Posted by: Beq Janus | Sunday, February 24, 2019 at 04:27 AM