The screenshots below were taken on May 13, 2013, after the patch to Tribble.
1. The winter jacket looks slightly worse after the texture changes. The difference is most noticeable on the lower edge of the lapel and on the second buckle from the top on the viewer's left (the character's right-hand side).
I usually run with low graphics settings. The specifics are provided in the following tickets:
ticket ID #41,044
ticket ID #41,091
I've taken some closer screenshots from the tailor. The screenshots below were taken on May 14, 2013, between 9am and 10am PST. Compared to the Holodeck image, the Tribble image is blurry near the bottom edge of the lapel and the buckles.
Is there any way to extract these new textures into a readable/editable format? Previously, .wtex textures were just regular .dds files with a custom header; now it seems they've migrated to a new format (.crn, perhaps?) with a new header...
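One way to check this guess is to sniff the magic bytes of the extracted payload. The sketch below is a hypothetical helper, not anything from the game's tools: standard DDS files begin with the four bytes "DDS ", and crunch-compressed (.crn) files from crnlib begin with the two-byte "Hx" signature; the layout of the custom .wtex header itself is unknown, so this assumes you have already stripped it.

```python
DDS_MAGIC = b"DDS "   # standard DirectDraw Surface magic
CRN_MAGIC = b"Hx"     # crunch (.crn) signature used by crnlib

def sniff_texture(payload: bytes) -> str:
    """Guess the container format of a raw texture payload by its magic bytes."""
    if payload[:4] == DDS_MAGIC:
        return "dds"
    if payload[:2] == CRN_MAGIC:
        return "crn"
    return "unknown"
```

If the payloads come back as "crn", the open-source crunch tools can usually transcode them back to .dds for editing.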
Each texture's mipmaps are automatically generated when we convert our Photoshop files into something the game can understand. When we do this conversion, it's called 'Processing a Texture.' When Zer0 says we 'updated every texture everywhere,' what she means is that we REPROCESSED every texture everywhere. It does not mean that we repainted everything by hand. We simply reprocessed them with some different configurations to yield a different result.
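The mip generation being described can be sketched in a few lines. This is only an illustration under simple assumptions (a grayscale image as nested lists, a 2x2 box filter); the engine's actual downsampling filter and pipeline are not documented here.

```python
# Toy "texture processing": build a mip chain by repeatedly halving the
# image, averaging each 2x2 block of the level above.

def downsample(img):
    """Halve each dimension by averaging 2x2 blocks."""
    h, w = len(img), len(img[0])
    return [
        [
            (img[2*y][2*x] + img[2*y][2*x+1] +
             img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
            for x in range(w // 2)
        ]
        for y in range(h // 2)
    ]

def build_mip_chain(img):
    """Return [full image, 1/2 size, 1/4 size, ...] down to a single row/column."""
    chain = [img]
    while len(chain[-1]) > 1 and len(chain[-1][0]) > 1:
        chain.append(downsample(chain[-1]))
    return chain
```

Reprocessing "with some different configurations" would then amount to rerunning a chain builder like this with different filter settings, rather than touching the source art.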
Originally Posted by tacofangs
This is probably the most noticeable of our 4 reprocessing techniques. If you look back at our MipChain, those smaller mips tend to get blurry and lose data as they get shrunk. So, much like a Photoshop sharpen filter, we sharpen up those mips a bit to make them pop out more at a distance and look better on lower-end systems. This sharpening does not happen to the full texture, only to the lower mips. Now, there are some textures that look worse with this sharpening, and those textures can be unsharpened on a case-by-case basis (and many already have been), but overall we believe the sharpening has drastically improved the look of the game, both at a distance and on low-end systems.
If I understood you correctly, each texture has a full-size, unprocessed image that is reduced to mips of various sizes. The smaller mips were sharpened, but not the larger ones. I have a question about how the sharpen filter was applied. I can think of two possibilities:
1. The sharpen filter was applied directly to the mips themselves.
2. The original image was sharpened and then re-reduced to create new mips.
Which method was used? Is it possible that method 2 might produce better results than method 1?
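The two possibilities can be contrasted with a toy 1D experiment. This is not the game's pipeline, just simple stand-in filters (a 2:1 box downsample and a 3-tap unsharp mask) applied to a hard edge to show that the order of operations matters:

```python
def halve(row):
    """2:1 box-filter downsample."""
    return [(row[2*i] + row[2*i+1]) / 2.0 for i in range(len(row) // 2)]

def sharpen(row, amount=0.5):
    """Simple unsharp mask with a 3-tap box blur."""
    out = []
    for i, p in enumerate(row):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, len(row) - 1)]
        out.append(p + amount * (p - (left + p + right) / 3.0))
    return out

full = [0, 0, 0, 0, 100, 100, 100, 100]  # a hard edge in the source image
method1 = sharpen(halve(full))           # 1: shrink first, sharpen the mip
method2 = halve(sharpen(full))           # 2: sharpen the source, then shrink
```

With these particular filters the two orders give different results, and method 2's overshoot at the edge is milder because the downsample averages it back down, which is at least consistent with the poster's hunch that method 2 might look better.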