[X86][AVX] Decode constant bits from insert_subvector(c1, c2, c3)

These nodes mostly arise from SimplifyDemandedVectorElts reducing a constant vector to insert_subvector(undef, c1, 0).
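
The decode itself is a splice: recover the constant bits of both operands, then overwrite the base vector's elements with the subvector's elements starting at offset c3, merging the per-element undef masks the same way. Below is a minimal standalone sketch of that splice in plain C++ (not the LLVM implementation; the ConstantBits struct and decodeInsertSubvector name are illustrative only):

  // Standalone sketch: merge the constant bits of
  // insert_subvector(Base, Sub, Idx).
  #include <cassert>
  #include <cstdint>
  #include <cstdio>
  #include <vector>

  struct ConstantBits {
    std::vector<uint64_t> Elts; // per-element constant value
    std::vector<bool> Undef;    // per-element undef flag
  };

  // Overwrite Base's elements [Idx, Idx + Sub.Elts.size()) with Sub's
  // elements, propagating the undef mask alongside the values.
  static ConstantBits decodeInsertSubvector(ConstantBits Base,
                                            const ConstantBits &Sub,
                                            unsigned Idx) {
    assert(Idx + Sub.Elts.size() <= Base.Elts.size() &&
           "subvector must fit inside the base vector");
    for (unsigned I = 0, E = Sub.Elts.size(); I != E; ++I) {
      Base.Elts[Idx + I] = Sub.Elts[I];
      Base.Undef[Idx + I] = Sub.Undef[I];
    }
    return Base;
  }

  int main() {
    // The common shape from SimplifyDemandedVectorElts: an all-undef
    // base with a constant subvector inserted at index 0.
    ConstantBits Base{{0, 0, 0, 0}, {true, true, true, true}};
    ConstantBits Sub{{42, 7}, {false, false}};
    ConstantBits Merged = decodeInsertSubvector(Base, Sub, /*Idx=*/0);
    for (unsigned I = 0; I != Merged.Elts.size(); ++I) {
      if (Merged.Undef[I])
        printf("elt %u: undef\n", I);
      else
        printf("elt %u: %llu\n", I, (unsigned long long)Merged.Elts[I]);
    }
    return 0;
  }

In the insert_subvector(undef, c1, 0) case above, the merged result is simply c1's constants padded out with undef elements, so the whole node folds to a known constant vector.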

git-svn-id: https://llvm.org/svn/llvm-project/llvm/trunk@363499 91177308-0d34-0410-b5e6-96231b3b80d8