Change force_discrete/integrated_gpu semantics to force_high_performance/low_power_gpu

This aligns with WebGL's powerPreference semantics.

More importantly, discrete/integrated GPUs might be a good description of the
Mac dual-GPU situation, but it's incorrect on Windows.

BUG=966251
TEST=bots
R=kbr@chromium.org

Change-Id: I2c7f073f06be3c2b2444bf0a81f56f71e656a713
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/1626458
Reviewed-by: Kenneth Russell <kbr@chromium.org>
Reviewed-by: Antoine Labour <piman@chromium.org>
Reviewed-by: Daniel Cheng <dcheng@chromium.org>
Commit-Queue: Zhenyao Mo <zmo@chromium.org>
Cr-Original-Commit-Position: refs/heads/master@{#662876}
Cr-Mirrored-From: https://chromium.googlesource.com/chromium/src
Cr-Mirrored-Commit: dd6173b081c20988cdf74d8137aad36ab35af92d
diff --git a/proxy/ppb_graphics_3d_proxy.cc b/proxy/ppb_graphics_3d_proxy.cc
index f42220f..e657093 100644
--- a/proxy/ppb_graphics_3d_proxy.cc
+++ b/proxy/ppb_graphics_3d_proxy.cc
@@ -217,8 +217,8 @@
         case PP_GRAPHICS3DATTRIB_GPU_PREFERENCE:
           attrib_helper.gpu_preference =
               (value == PP_GRAPHICS3DATTRIB_GPU_PREFERENCE_LOW_POWER)
-                  ? gl::PreferIntegratedGpu
-                  : gl::PreferDiscreteGpu;
+                  ? gl::GpuPreference::kLowPower
+                  : gl::GpuPreference::kHighPerformance;
           break;
         case PP_GRAPHICS3DATTRIB_SINGLE_BUFFER:
           attrib_helper.single_buffer = !!value;