Enable blob cache when debugging
We've had several issues with shader caches, which are currently only
implemented on Android, making them challenging to debug.
The existing BlobCache object has a default implementation, but it is
disabled by default. When debug layers are enabled, resize the default
cache so that it gets used, providing behavior closer to Android's.
This caching is enabled when debug layers are enabled, i.e. when the
EGL_PLATFORM_ANGLE_DEBUG_LAYERS_ENABLED_ANGLE attribute is true.
This covers angle_end2end_tests and the dEQP tests.
Reland after the bugfix for angleproject:4535.
Bug: b:152292873
Bug: angleproject:4535
Change-Id: Icefa8c55e39985d653d8d8a8bc8c734210025b50
Reviewed-on: https://chromium-review.googlesource.com/c/angle/angle/+/2134449
Reviewed-by: Geoff Lang <geofflang@chromium.org>
Commit-Queue: Courtney Goeltzenleuchter <courtneygo@google.com>
diff --git a/src/libANGLE/Display.cpp b/src/libANGLE/Display.cpp
index f3b6ebf..b10711c 100644
--- a/src/libANGLE/Display.cpp
+++ b/src/libANGLE/Display.cpp
@@ -655,6 +655,14 @@
 ASSERT(mImplementation != nullptr);
 mImplementation->setBlobCache(&mBlobCache);
+ // Enable shader caching if debug layers are turned on. This allows us to test that shaders are
+ // properly saved & restored on all platforms. The cache won't allocate space until it's used
+ // and will be ignored entirely if the application / system sets its own cache functions.
+ if (rx::ShouldUseDebugLayers(mAttributeMap))
+ {
+ mBlobCache.resize(1024 * 1024);
+ }
+
 gl::InitializeDebugAnnotations(&mAnnotator);
 gl::InitializeDebugMutexIfNeeded();