On each iteration, the multiple decodes happen because the same image is used at different sizes, so it gets decoded once per size. On a real news site each of those placements would be a different image, so there would be no multiple decodes. That does not really seem like a problem; perhaps the test should be altered to make it more realistic?
Then, after an iteration, the presshell is destroyed, which discards the decoded copy immediately, so the next iteration has to decode again. We could be slightly less aggressive with this discard. I believe it was introduced because a large amount of decoded data was accumulating in the past. We could perhaps delay the discard a little to avoid the re-decodes, but the decodes are off the main thread, so the speedup would likely be small, and the benefit to real sites limited to less usual conditions.
The idea is that it would be more realistic to add a unique placeholder image for each placement on the page, rather than reusing a differently-sized resources/newssite/news-next/dist/placeholder_light.jpg for each.
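One way the suggestion could be sketched: give every placement its own uniquely named copy of the placeholder, so each `<img>` references a distinct URL and the browser performs one decode per image instead of several decodes of one shared file. This is a minimal illustration, not the benchmark's actual build step; the output filenames and the placement count here are assumptions.

```python
# Hypothetical sketch: copy the shared placeholder into one uniquely named
# file per placement, so each image slot on the page gets its own resource.
# The naming scheme and placement count are assumptions for illustration.
import shutil
import tempfile
from pathlib import Path

def make_unique_placeholders(src: Path, out_dir: Path, n_placements: int) -> list[Path]:
    """Copy the shared placeholder into n_placements uniquely named files."""
    out_dir.mkdir(parents=True, exist_ok=True)
    copies = []
    for i in range(n_placements):
        dest = out_dir / f"placeholder_light_{i}.jpg"
        shutil.copy(src, dest)  # each placement now references its own URL
        copies.append(dest)
    return copies

# Demo with a stand-in placeholder file in a temporary directory.
tmp = Path(tempfile.mkdtemp())
src = tmp / "placeholder_light.jpg"
src.write_bytes(b"\xff\xd8\xff\xe0 stand-in jpeg bytes")
copies = make_unique_placeholders(src, tmp / "dist", 5)
print(len(copies))  # 5
```

In a real change, the page templates would then be updated so each placement's `src` points at its own copy (ideally each copy being genuinely different image content, not just a renamed duplicate, so caches cannot collapse them).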
This was raised in a Gecko bug: https://bugzilla.mozilla.org/show_bug.cgi?id=1915251#c1