Patchwork: TYPE_PRECISION for vectors

Submitter Marc Glisse
Date May 7, 2013, 9:26 p.m.
Message ID <alpine.DEB.2.02.1305072313070.21865@stedding.saclay.inria.fr>
Permalink /patch/242465/
State New

Comments

Marc Glisse - May 7, 2013, 9:26 p.m.
Hello,

this patch is about the use of TYPE_PRECISION on non-scalar types. For 
complex types the field is unused (always 0), and for vectors it stores 
the log2 of the number of elements, so in neither case should we use the 
macro. I tried to enforce this, see the SCALAR_TYPE_CHECK in tree.h 
(which I assume I should remove, along with the FIXME, before commit), 
and started fixing the resulting crashes, but there were too many and I 
gave up. The fixes I did write are included in the patch, since I believe 
we eventually want to get there.

For the hashing in gimple.c, including TYPE_UNSIGNED for complex and 
vector types seems questionable: I think it is the same as for the inner 
type, which is already hashed. I didn't change that in the patch.

Passes bootstrap+testsuite on x86_64-linux-gnu.


2013-05-07  Marc Glisse  <marc.glisse@inria.fr>

gcc/
 	* stor-layout.c (element_precision): New function.
 	* machmode.h (element_precision): Declare it.
 	* tree.c (build_minus_one_cst): New function.
 	(element_precision): Likewise.
 	* tree.h (SCALAR_TYPE_CHECK): New macro.
 	(build_minus_one_cst): Declare new function.
 	(element_precision): Likewise.
 	* fold-const.c (operand_equal_p): Use element_precision.
 	(fold_binary_loc): Handle vector types.
 	* convert.c (convert_to_integer): Use element_precision.
 	* gimple.c (iterative_hash_canonical_type): Handle complex and vectors
 	separately.

gcc/c-family/
 	* c-common.c (vector_types_convertible_p): No TYPE_PRECISION for
 	vectors.

gcc/testsuite/
 	* gcc.dg/vector-shift.c: New testcase.
Marc Glisse - May 7, 2013, 11:03 p.m.
On Tue, 7 May 2013, Marc Glisse wrote:

> 	* fold-const.c (operand_equal_p): Use element_precision.
> 	(fold_binary_loc): Handle vector types.

Note that this causes a regression in some cases: it now recognizes 
LROTATE_EXPR for vectors, but on x86_64 rotation is not a supported 
vector operation, so it is lowered to scalar rotations, which is worse 
than the shift+shift+or we started with. That only means the vector 
lowering is bad, though, not that the patch is wrong. I didn't check 
whether the vectorizer is clever enough to split rotations into 
shifts+or when that helps.
Jakub Jelinek - May 8, 2013, 6:22 a.m.
On Wed, May 08, 2013 at 01:03:41AM +0200, Marc Glisse wrote:
> On Tue, 7 May 2013, Marc Glisse wrote:
> 
> >	* fold-const.c (operand_equal_p): Use element_precision.
> >	(fold_binary_loc): Handle vector types.
> 
> Note that this causes a regression in some cases: it now recognizes
> LROTATE_EXPR for vectors, but then on x86_64 it isn't a supported
> vector operation so it is lowered to scalar rotations, which is
> worse than the shift+shift+or that we started with. But it only
> means that the vector lowering is bad, not that the patch is wrong.
> I didn't check if the vectorizer was clever enough to split
> rotations into shifts+or when that helps.

For rotations, if we don't have a vector rotate we just need to
pattern-detect it as shifts + or.  It is on my todo list.

	Jakub
Richard Guenther - May 10, 2013, 9:42 a.m.
On Tue, May 7, 2013 at 11:26 PM, Marc Glisse <marc.glisse@inria.fr> wrote:
> Hello,
>
> this patch is about the use of TYPE_PRECISION for non-scalar types. For
> complex types, it is unused (always 0) and for vectors, the field is used to
> store the log of the number of elements, so in both cases we shouldn't use
> the macro. I tried to enforce it, see the SCALAR_TYPE_CHECK in tree.h (I
> assume I should remove it and the FIXME before commit), and started fixing
> the crashes, but there were too many and I gave up. Those fixes I did write
> are included in the patch, since I believe we eventually want to get there.
>
> For hashing in gimple.c, including TYPE_UNSIGNED for complex and vector
> seems questionable: I think that it is the same as for the inner type, which
> is already hashed. I didn't change that in the patch.
>
> Passes bootstrap+testsuite on x86_64-linux-gnu.

Ok.

Thanks,
Richard.

>
> 2013-05-07  Marc Glisse  <marc.glisse@inria.fr>
>
> gcc/
>         * stor-layout.c (element_precision): New function.
>         * machmode.h (element_precision): Declare it.
>         * tree.c (build_minus_one_cst): New function.
>         (element_precision): Likewise.
>         * tree.h (SCALAR_TYPE_CHECK): New macro.
>         (build_minus_one_cst): Declare new function.
>         (element_precision): Likewise.
>         * fold-const.c (operand_equal_p): Use element_precision.
>         (fold_binary_loc): Handle vector types.
>         * convert.c (convert_to_integer): Use element_precision.
>         * gimple.c (iterative_hash_canonical_type): Handle complex and
> vectors
>         separately.
>
> gcc/c-family/
>         * c-common.c (vector_types_convertible_p): No TYPE_PRECISION for
>         vectors.
>
> gcc/testsuite/
>         * gcc.dg/vector-shift.c: New testcase.
>
> --
> Marc Glisse
> Index: gcc/convert.c
> ===================================================================
> --- gcc/convert.c       (revision 198685)
> +++ gcc/convert.c       (working copy)
> @@ -348,22 +348,22 @@ convert_to_real (tree type, tree expr)
>     fixed-point or vector; in other cases error is called.
>
>     The result of this is always supposed to be a newly created tree node
>     not in use in any existing structure.  */
>
>  tree
>  convert_to_integer (tree type, tree expr)
>  {
>    enum tree_code ex_form = TREE_CODE (expr);
>    tree intype = TREE_TYPE (expr);
> -  unsigned int inprec = TYPE_PRECISION (intype);
> -  unsigned int outprec = TYPE_PRECISION (type);
> +  unsigned int inprec = element_precision (intype);
> +  unsigned int outprec = element_precision (type);
>
>    /* An INTEGER_TYPE cannot be incomplete, but an ENUMERAL_TYPE can
>       be.  Consider `enum E = { a, b = (enum E) 3 };'.  */
>    if (!COMPLETE_TYPE_P (type))
>      {
>        error ("conversion to incomplete type");
>        return error_mark_node;
>      }
>
>    /* Convert e.g. (long)round(d) -> lround(d).  */
> Index: gcc/tree.c
> ===================================================================
> --- gcc/tree.c  (revision 198685)
> +++ gcc/tree.c  (working copy)
> @@ -1615,20 +1615,59 @@ build_one_cst (tree type)
>      case COMPLEX_TYPE:
>        return build_complex (type,
>                             build_one_cst (TREE_TYPE (type)),
>                             build_zero_cst (TREE_TYPE (type)));
>
>      default:
>        gcc_unreachable ();
>      }
>  }
>
> +/* Return a constant of arithmetic type TYPE which is the
> +   opposite of the multiplicative identity of the set TYPE.  */
> +
> +tree
> +build_minus_one_cst (tree type)
> +{
> +  switch (TREE_CODE (type))
> +    {
> +    case INTEGER_TYPE: case ENUMERAL_TYPE: case BOOLEAN_TYPE:
> +    case POINTER_TYPE: case REFERENCE_TYPE:
> +    case OFFSET_TYPE:
> +      return build_int_cst (type, -1);
> +
> +    case REAL_TYPE:
> +      return build_real (type, dconstm1);
> +
> +    case FIXED_POINT_TYPE:
> +      /* We can only generate 1 for accum types.  */
> +      gcc_assert (ALL_SCALAR_ACCUM_MODE_P (TYPE_MODE (type)));
> +      return build_fixed (type, fixed_from_double_int
> (double_int_minus_one,
> +                                                      TYPE_MODE (type)));
> +
> +    case VECTOR_TYPE:
> +      {
> +       tree scalar = build_minus_one_cst (TREE_TYPE (type));
> +
> +       return build_vector_from_val (type, scalar);
> +      }
> +
> +    case COMPLEX_TYPE:
> +      return build_complex (type,
> +                           build_minus_one_cst (TREE_TYPE (type)),
> +                           build_zero_cst (TREE_TYPE (type)));
> +
> +    default:
> +      gcc_unreachable ();
> +    }
> +}
> +
>  /* Build 0 constant of type TYPE.  This is used by constructor folding
>     and thus the constant should be represented in memory by
>     zero(es).  */
>
>  tree
>  build_zero_cst (tree type)
>  {
>    switch (TREE_CODE (type))
>      {
>      case INTEGER_TYPE: case ENUMERAL_TYPE: case BOOLEAN_TYPE:
> @@ -6921,20 +6960,33 @@ compare_tree_int (const_tree t, unsigned
>  bool
>  valid_constant_size_p (const_tree size)
>  {
>    if (! host_integerp (size, 1)
>        || TREE_OVERFLOW (size)
>        || tree_int_cst_sign_bit (size) != 0)
>      return false;
>    return true;
>  }
>
> +/* Return the precision of the type, or for a complex or vector type the
> +   precision of the type of its elements.  */
> +
> +unsigned int
> +element_precision (const_tree type)
> +{
> +  enum tree_code code = TREE_CODE (type);
> +  if (code == COMPLEX_TYPE || code == VECTOR_TYPE)
> +    type = TREE_TYPE (type);
> +
> +  return TYPE_PRECISION (type);
> +}
> +
>  /* Return true if CODE represents an associative tree code.  Otherwise
>     return false.  */
>  bool
>  associative_tree_code (enum tree_code code)
>  {
>    switch (code)
>      {
>      case BIT_IOR_EXPR:
>      case BIT_AND_EXPR:
>      case BIT_XOR_EXPR:
> Index: gcc/tree.h
> ===================================================================
> --- gcc/tree.h  (revision 198685)
> +++ gcc/tree.h  (working copy)
> @@ -929,20 +929,23 @@ extern void omp_clause_range_check_faile
>
>  #define RECORD_OR_UNION_CHECK(T)       \
>    TREE_CHECK3 (T, RECORD_TYPE, UNION_TYPE, QUAL_UNION_TYPE)
>  #define NOT_RECORD_OR_UNION_CHECK(T) \
>    TREE_NOT_CHECK3 (T, RECORD_TYPE, UNION_TYPE, QUAL_UNION_TYPE)
>
>  #define NUMERICAL_TYPE_CHECK(T)                                        \
>    TREE_CHECK5 (T, INTEGER_TYPE, ENUMERAL_TYPE, BOOLEAN_TYPE, REAL_TYPE,
> \
>                FIXED_POINT_TYPE)
>
> +#define SCALAR_TYPE_CHECK(T)   \
> +  TREE_NOT_CHECK2 (TYPE_CHECK (T), COMPLEX_TYPE, VECTOR_TYPE)
> +
>  /* Here is how primitive or already-canonicalized types' hash codes
>     are made.  */
>  #define TYPE_HASH(TYPE) (TYPE_UID (TYPE))
>
>  /* A simple hash function for an arbitrary tree node.  This must not be
>     used in hash tables which are saved to a PCH.  */
>  #define TREE_HASH(NODE) ((size_t) (NODE) & 0777777)
>
>  /* Tests if CODE is a conversion expr (NOP_EXPR or CONVERT_EXPR).  */
>  #define CONVERT_EXPR_CODE_P(CODE)                              \
> @@ -2108,20 +2111,22 @@ struct GTY(()) tree_block {
>     overloaded and used for different macros in different kinds of types.
>     Each macro must check to ensure the tree node is of the proper kind of
>     type.  Note also that some of the front-ends also overload these fields,
>     so they must be checked as well.  */
>
>  #define TYPE_UID(NODE) (TYPE_CHECK (NODE)->type_common.uid)
>  #define TYPE_SIZE(NODE) (TYPE_CHECK (NODE)->type_common.size)
>  #define TYPE_SIZE_UNIT(NODE) (TYPE_CHECK (NODE)->type_common.size_unit)
>  #define TYPE_POINTER_TO(NODE) (TYPE_CHECK (NODE)->type_common.pointer_to)
>  #define TYPE_REFERENCE_TO(NODE) (TYPE_CHECK
> (NODE)->type_common.reference_to)
> +//FIXME: we want
> +//#define TYPE_PRECISION(NODE) (SCALAR_TYPE_CHECK
> (NODE)->type_common.precision)
>  #define TYPE_PRECISION(NODE) (TYPE_CHECK (NODE)->type_common.precision)
>  #define TYPE_NAME(NODE) (TYPE_CHECK (NODE)->type_common.name)
>  #define TYPE_NEXT_VARIANT(NODE) (TYPE_CHECK
> (NODE)->type_common.next_variant)
>  #define TYPE_MAIN_VARIANT(NODE) (TYPE_CHECK
> (NODE)->type_common.main_variant)
>  #define TYPE_CONTEXT(NODE) (TYPE_CHECK (NODE)->type_common.context)
>
>  /* Vector types need to check target flags to determine type.  */
>  extern enum machine_mode vector_type_mode (const_tree);
>  #define TYPE_MODE(NODE) \
>    (VECTOR_TYPE_P (TYPE_CHECK (NODE)) \
> @@ -4759,20 +4764,21 @@ extern tree make_vector_stat (unsigned M
>  extern tree build_vector_stat (tree, tree * MEM_STAT_DECL);
>  #define build_vector(t,v) build_vector_stat (t, v MEM_STAT_INFO)
>  extern tree build_vector_from_ctor (tree, vec<constructor_elt, va_gc> *);
>  extern tree build_vector_from_val (tree, tree);
>  extern tree build_constructor (tree, vec<constructor_elt, va_gc> *);
>  extern tree build_constructor_single (tree, tree, tree);
>  extern tree build_constructor_from_list (tree, tree);
>  extern tree build_real_from_int_cst (tree, const_tree);
>  extern tree build_complex (tree, tree, tree);
>  extern tree build_one_cst (tree);
> +extern tree build_minus_one_cst (tree);
>  extern tree build_zero_cst (tree);
>  extern tree build_string (int, const char *);
>  extern tree build_tree_list_stat (tree, tree MEM_STAT_DECL);
>  #define build_tree_list(t,q) build_tree_list_stat(t,q MEM_STAT_INFO)
>  extern tree build_tree_list_vec_stat (const vec<tree, va_gc>
> *MEM_STAT_DECL);
>  #define build_tree_list_vec(v) build_tree_list_vec_stat (v MEM_STAT_INFO)
>  extern tree build_decl_stat (location_t, enum tree_code,
>                              tree, tree MEM_STAT_DECL);
>  extern tree build_fn_decl (const char *, tree);
>  #define build_decl(l,c,t,q) build_decl_stat (l,c,t,q MEM_STAT_INFO)
> @@ -4859,20 +4865,21 @@ tree_low_cst (const_tree t, int pos)
>  extern HOST_WIDE_INT size_low_cst (const_tree);
>  extern int tree_int_cst_sgn (const_tree);
>  extern int tree_int_cst_sign_bit (const_tree);
>  extern unsigned int tree_int_cst_min_precision (tree, bool);
>  extern bool tree_expr_nonnegative_p (tree);
>  extern bool tree_expr_nonnegative_warnv_p (tree, bool *);
>  extern bool may_negate_without_overflow_p (const_tree);
>  extern tree strip_array_types (tree);
>  extern tree excess_precision_type (tree);
>  extern bool valid_constant_size_p (const_tree);
> +extern unsigned int element_precision (const_tree);
>
>  /* Construct various nodes representing fract or accum data types.  */
>
>  extern tree make_fract_type (int, int, int);
>  extern tree make_accum_type (int, int, int);
>
>  #define make_signed_fract_type(P) make_fract_type (P, 0, 0)
>  #define make_unsigned_fract_type(P) make_fract_type (P, 1, 0)
>  #define make_sat_signed_fract_type(P) make_fract_type (P, 0, 1)
>  #define make_sat_unsigned_fract_type(P) make_fract_type (P, 1, 1)
> Index: gcc/gimple.c
> ===================================================================
> --- gcc/gimple.c        (revision 198685)
> +++ gcc/gimple.c        (working copy)
> @@ -3076,29 +3076,36 @@ iterative_hash_canonical_type (tree type
>       checked.  */
>    v = iterative_hash_hashval_t (TREE_CODE (type), 0);
>    v = iterative_hash_hashval_t (TREE_ADDRESSABLE (type), v);
>    v = iterative_hash_hashval_t (TYPE_ALIGN (type), v);
>    v = iterative_hash_hashval_t (TYPE_MODE (type), v);
>
>    /* Incorporate common features of numerical types.  */
>    if (INTEGRAL_TYPE_P (type)
>        || SCALAR_FLOAT_TYPE_P (type)
>        || FIXED_POINT_TYPE_P (type)
> -      || TREE_CODE (type) == VECTOR_TYPE
> -      || TREE_CODE (type) == COMPLEX_TYPE
>        || TREE_CODE (type) == OFFSET_TYPE
>        || POINTER_TYPE_P (type))
>      {
>        v = iterative_hash_hashval_t (TYPE_PRECISION (type), v);
>        v = iterative_hash_hashval_t (TYPE_UNSIGNED (type), v);
>      }
>
> +  if (VECTOR_TYPE_P (type))
> +    {
> +      v = iterative_hash_hashval_t (TYPE_VECTOR_SUBPARTS (type), v);
> +      v = iterative_hash_hashval_t (TYPE_UNSIGNED (type), v);
> +    }
> +
> +  if (TREE_CODE (type) == COMPLEX_TYPE)
> +    v = iterative_hash_hashval_t (TYPE_UNSIGNED (type), v);
> +
>    /* For pointer and reference types, fold in information about the type
>       pointed to but do not recurse to the pointed-to type.  */
>    if (POINTER_TYPE_P (type))
>      {
>        v = iterative_hash_hashval_t (TYPE_REF_CAN_ALIAS_ALL (type), v);
>        v = iterative_hash_hashval_t (TYPE_ADDR_SPACE (TREE_TYPE (type)), v);
>        v = iterative_hash_hashval_t (TYPE_RESTRICT (type), v);
>        v = iterative_hash_hashval_t (TREE_CODE (TREE_TYPE (type)), v);
>      }
>
> Index: gcc/testsuite/gcc.dg/vector-shift.c
> ===================================================================
> --- gcc/testsuite/gcc.dg/vector-shift.c (revision 0)
> +++ gcc/testsuite/gcc.dg/vector-shift.c (revision 0)
> @@ -0,0 +1,13 @@
> +/* { dg-do compile } */
> +/* { dg-options "-fdump-tree-original" } */
> +
> +typedef unsigned vec __attribute__ ((vector_size (4 * sizeof (int))));
> +
> +void
> +f (vec *x)
> +{
> +  *x = (*x << 4) << 3;
> +}
> +
> +/* { dg-final { scan-tree-dump "<< 7" "original" } } */
> +/* { dg-final { cleanup-tree-dump "original" } } */
>
> Property changes on: gcc/testsuite/gcc.dg/vector-shift.c
> ___________________________________________________________________
> Added: svn:eol-style
>    + native
> Added: svn:keywords
>    + Author Date Id Revision URL
>
> Index: gcc/c-family/c-common.c
> ===================================================================
> --- gcc/c-family/c-common.c     (revision 198685)
> +++ gcc/c-family/c-common.c     (working copy)
> @@ -2220,21 +2220,21 @@ vector_types_convertible_p (const_tree t
>    static bool emitted_lax_note = false;
>    bool convertible_lax;
>
>    if ((TYPE_VECTOR_OPAQUE (t1) || TYPE_VECTOR_OPAQUE (t2))
>        && tree_int_cst_equal (TYPE_SIZE (t1), TYPE_SIZE (t2)))
>      return true;
>
>    convertible_lax =
>      (tree_int_cst_equal (TYPE_SIZE (t1), TYPE_SIZE (t2))
>       && (TREE_CODE (TREE_TYPE (t1)) != REAL_TYPE ||
> -        TYPE_PRECISION (t1) == TYPE_PRECISION (t2))
> +        TYPE_VECTOR_SUBPARTS (t1) == TYPE_VECTOR_SUBPARTS (t2))
>       && (INTEGRAL_TYPE_P (TREE_TYPE (t1))
>          == INTEGRAL_TYPE_P (TREE_TYPE (t2))));
>
>    if (!convertible_lax || flag_lax_vector_conversions)
>      return convertible_lax;
>
>    if (TYPE_VECTOR_SUBPARTS (t1) == TYPE_VECTOR_SUBPARTS (t2)
>        && lang_hooks.types_compatible_p (TREE_TYPE (t1), TREE_TYPE (t2)))
>      return true;
>
> Index: gcc/fold-const.c
> ===================================================================
> --- gcc/fold-const.c    (revision 198685)
> +++ gcc/fold-const.c    (working copy)
> @@ -2438,21 +2438,22 @@ operand_equal_p (const_tree arg0, const_
>      return 0;
>
>    /* We cannot consider pointers to different address space equal.  */
>    if (POINTER_TYPE_P (TREE_TYPE (arg0)) && POINTER_TYPE_P (TREE_TYPE
> (arg1))
>        && (TYPE_ADDR_SPACE (TREE_TYPE (TREE_TYPE (arg0)))
>           != TYPE_ADDR_SPACE (TREE_TYPE (TREE_TYPE (arg1)))))
>      return 0;
>
>    /* If both types don't have the same precision, then it is not safe
>       to strip NOPs.  */
> -  if (TYPE_PRECISION (TREE_TYPE (arg0)) != TYPE_PRECISION (TREE_TYPE
> (arg1)))
> +  if (element_precision (TREE_TYPE (arg0))
> +      != element_precision (TREE_TYPE (arg1)))
>      return 0;
>
>    STRIP_NOPS (arg0);
>    STRIP_NOPS (arg1);
>
>    /* In case both args are comparisons but with different comparison
>       code, try to swap the comparison operands of one arg to produce
>       a match and compare that variant.  */
>    if (TREE_CODE (arg0) != TREE_CODE (arg1)
>        && COMPARISON_CLASS_P (arg0)
> @@ -9870,20 +9871,21 @@ exact_inverse (tree type, tree cst)
>     return NULL_TREE.  */
>
>  tree
>  fold_binary_loc (location_t loc,
>              enum tree_code code, tree type, tree op0, tree op1)
>  {
>    enum tree_code_class kind = TREE_CODE_CLASS (code);
>    tree arg0, arg1, tem;
>    tree t1 = NULL_TREE;
>    bool strict_overflow_p;
> +  unsigned int prec;
>
>    gcc_assert (IS_EXPR_CODE_CLASS (kind)
>               && TREE_CODE_LENGTH (code) == 2
>               && op0 != NULL_TREE
>               && op1 != NULL_TREE);
>
>    arg0 = op0;
>    arg1 = op1;
>
>    /* Strip any conversions that don't change the mode.  This is
> @@ -10140,35 +10142,35 @@ fold_binary_loc (location_t loc,
>
>           /* ~X + X is -1.  */
>           if (TREE_CODE (arg0) == BIT_NOT_EXPR
>               && !TYPE_OVERFLOW_TRAPS (type))
>             {
>               tree tem = TREE_OPERAND (arg0, 0);
>
>               STRIP_NOPS (tem);
>               if (operand_equal_p (tem, arg1, 0))
>                 {
> -                 t1 = build_int_cst_type (type, -1);
> +                 t1 = build_minus_one_cst (type);
>                   return omit_one_operand_loc (loc, type, t1, arg1);
>                 }
>             }
>
>           /* X + ~X is -1.  */
>           if (TREE_CODE (arg1) == BIT_NOT_EXPR
>               && !TYPE_OVERFLOW_TRAPS (type))
>             {
>               tree tem = TREE_OPERAND (arg1, 0);
>
>               STRIP_NOPS (tem);
>               if (operand_equal_p (arg0, tem, 0))
>                 {
> -                 t1 = build_int_cst_type (type, -1);
> +                 t1 = build_minus_one_cst (type);
>                   return omit_one_operand_loc (loc, type, t1, arg0);
>                 }
>             }
>
>           /* X + (X / CST) * -CST is X % CST.  */
>           if (TREE_CODE (arg1) == MULT_EXPR
>               && TREE_CODE (TREE_OPERAND (arg1, 0)) == TRUNC_DIV_EXPR
>               && operand_equal_p (arg0,
>                                   TREE_OPERAND (TREE_OPERAND (arg1, 0), 0),
> 0))
>             {
> @@ -10380,75 +10382,76 @@ fold_binary_loc (location_t loc,
>         code0 = TREE_CODE (arg0);
>         code1 = TREE_CODE (arg1);
>         if (((code0 == RSHIFT_EXPR && code1 == LSHIFT_EXPR)
>              || (code1 == RSHIFT_EXPR && code0 == LSHIFT_EXPR))
>             && operand_equal_p (TREE_OPERAND (arg0, 0),
>                                 TREE_OPERAND (arg1, 0), 0)
>             && (rtype = TREE_TYPE (TREE_OPERAND (arg0, 0)),
>                 TYPE_UNSIGNED (rtype))
>             /* Only create rotates in complete modes.  Other cases are not
>                expanded properly.  */
> -           && TYPE_PRECISION (rtype) == GET_MODE_PRECISION (TYPE_MODE
> (rtype)))
> +           && (element_precision (rtype)
> +               == element_precision (TYPE_MODE (rtype))))
>           {
>             tree tree01, tree11;
>             enum tree_code code01, code11;
>
>             tree01 = TREE_OPERAND (arg0, 1);
>             tree11 = TREE_OPERAND (arg1, 1);
>             STRIP_NOPS (tree01);
>             STRIP_NOPS (tree11);
>             code01 = TREE_CODE (tree01);
>             code11 = TREE_CODE (tree11);
>             if (code01 == INTEGER_CST
>                 && code11 == INTEGER_CST
>                 && TREE_INT_CST_HIGH (tree01) == 0
>                 && TREE_INT_CST_HIGH (tree11) == 0
>                 && ((TREE_INT_CST_LOW (tree01) + TREE_INT_CST_LOW (tree11))
> -                   == TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg0, 0)))))
> +                   == element_precision (TREE_TYPE (TREE_OPERAND (arg0,
> 0)))))
>               {
>                 tem = build2_loc (loc, LROTATE_EXPR,
>                                   TREE_TYPE (TREE_OPERAND (arg0, 0)),
>                                   TREE_OPERAND (arg0, 0),
>                                   code0 == LSHIFT_EXPR ? tree01 : tree11);
>                 return fold_convert_loc (loc, type, tem);
>               }
>             else if (code11 == MINUS_EXPR)
>               {
>                 tree tree110, tree111;
>                 tree110 = TREE_OPERAND (tree11, 0);
>                 tree111 = TREE_OPERAND (tree11, 1);
>                 STRIP_NOPS (tree110);
>                 STRIP_NOPS (tree111);
>                 if (TREE_CODE (tree110) == INTEGER_CST
>                     && 0 == compare_tree_int (tree110,
> -                                             TYPE_PRECISION
> +                                             element_precision
>                                               (TREE_TYPE (TREE_OPERAND
>                                                           (arg0, 0))))
>                     && operand_equal_p (tree01, tree111, 0))
>                   return
>                     fold_convert_loc (loc, type,
>                                       build2 ((code0 == LSHIFT_EXPR
>                                                ? LROTATE_EXPR
>                                                : RROTATE_EXPR),
>                                               TREE_TYPE (TREE_OPERAND (arg0,
> 0)),
>                                               TREE_OPERAND (arg0, 0),
> tree01));
>               }
>             else if (code01 == MINUS_EXPR)
>               {
>                 tree tree010, tree011;
>                 tree010 = TREE_OPERAND (tree01, 0);
>                 tree011 = TREE_OPERAND (tree01, 1);
>                 STRIP_NOPS (tree010);
>                 STRIP_NOPS (tree011);
>                 if (TREE_CODE (tree010) == INTEGER_CST
>                     && 0 == compare_tree_int (tree010,
> -                                             TYPE_PRECISION
> +                                             element_precision
>                                               (TREE_TYPE (TREE_OPERAND
>                                                           (arg0, 0))))
>                     && operand_equal_p (tree11, tree011, 0))
>                     return fold_convert_loc
>                       (loc, type,
>                        build2 ((code0 != LSHIFT_EXPR
>                                 ? LROTATE_EXPR
>                                 : RROTATE_EXPR),
>                                TREE_TYPE (TREE_OPERAND (arg0, 0)),
>                                TREE_OPERAND (arg0, 0), tree11));
> @@ -11750,22 +11753,21 @@ fold_binary_loc (location_t loc,
>             }
>         }
>
>        t1 = distribute_bit_expr (loc, code, type, arg0, arg1);
>        if (t1 != NULL_TREE)
>         return t1;
>        /* Simplify ((int)c & 0377) into (int)c, if c is unsigned char.  */
>        if (TREE_CODE (arg1) == INTEGER_CST && TREE_CODE (arg0) == NOP_EXPR
>           && TYPE_UNSIGNED (TREE_TYPE (TREE_OPERAND (arg0, 0))))
>         {
> -         unsigned int prec
> -           = TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg0, 0)));
> +         prec = TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg0, 0)));
>
>           if (prec < BITS_PER_WORD && prec < HOST_BITS_PER_WIDE_INT
>               && (~TREE_INT_CST_LOW (arg1)
>                   & (((HOST_WIDE_INT) 1 << prec) - 1)) == 0)
>             return
>               fold_convert_loc (loc, type, TREE_OPERAND (arg0, 0));
>         }
>
>        /* Convert (and (not arg0) (not arg1)) to (not (or (arg0) (arg1))).
>
> @@ -11819,21 +11821,21 @@ fold_binary_loc (location_t loc,
>             = tree_low_cst (arg1, TYPE_UNSIGNED (TREE_TYPE (arg1)));
>           unsigned HOST_WIDE_INT newmask, zerobits = 0;
>           tree shift_type = TREE_TYPE (arg0);
>
>           if (TREE_CODE (arg0) == LSHIFT_EXPR)
>             zerobits = ((((unsigned HOST_WIDE_INT) 1) << shiftc) - 1);
>           else if (TREE_CODE (arg0) == RSHIFT_EXPR
>                    && TYPE_PRECISION (TREE_TYPE (arg0))
>                       == GET_MODE_BITSIZE (TYPE_MODE (TREE_TYPE (arg0))))
>             {
> -             unsigned int prec = TYPE_PRECISION (TREE_TYPE (arg0));
> +             prec = TYPE_PRECISION (TREE_TYPE (arg0));
>               tree arg00 = TREE_OPERAND (arg0, 0);
>               /* See if more bits can be proven as zero because of
>                  zero extension.  */
>               if (TREE_CODE (arg00) == NOP_EXPR
>                   && TYPE_UNSIGNED (TREE_TYPE (TREE_OPERAND (arg00, 0))))
>                 {
>                   tree inner_type = TREE_TYPE (TREE_OPERAND (arg00, 0));
>                   if (TYPE_PRECISION (inner_type)
>                       == GET_MODE_BITSIZE (TYPE_MODE (inner_type))
>                       && TYPE_PRECISION (inner_type) < prec)
> @@ -11862,22 +11864,20 @@ fold_binary_loc (location_t loc,
>             }
>
>           /* ((X << 16) & 0xff00) is (X, 0).  */
>           if ((mask & zerobits) == mask)
>             return omit_one_operand_loc (loc, type,
>                                      build_int_cst (type, 0), arg0);
>
>           newmask = mask | zerobits;
>           if (newmask != mask && (newmask & (newmask + 1)) == 0)
>             {
> -             unsigned int prec;
> -
>               /* Only do the transformation if NEWMASK is some integer
>                  mode's mask.  */
>               for (prec = BITS_PER_UNIT;
>                    prec < HOST_BITS_PER_WIDE_INT; prec <<= 1)
>                 if (newmask == (((unsigned HOST_WIDE_INT) 1) << prec) - 1)
>                   break;
>               if (prec < HOST_BITS_PER_WIDE_INT
>                   || newmask == ~(unsigned HOST_WIDE_INT) 0)
>                 {
>                   tree newmaskt;
> @@ -12407,78 +12407,79 @@ fold_binary_loc (location_t loc,
>        if (integer_zerop (arg1))
>         return non_lvalue_loc (loc, fold_convert_loc (loc, type, arg0));
>        if (integer_zerop (arg0))
>         return omit_one_operand_loc (loc, type, arg0, arg1);
>
>        /* Since negative shift count is not well-defined,
>          don't try to compute it in the compiler.  */
>        if (TREE_CODE (arg1) == INTEGER_CST && tree_int_cst_sgn (arg1) < 0)
>         return NULL_TREE;
>
> +      prec = element_precision (type);
> +
>        /* Turn (a OP c1) OP c2 into a OP (c1+c2).  */
>        if (TREE_CODE (op0) == code && host_integerp (arg1, false)
> -         && TREE_INT_CST_LOW (arg1) < TYPE_PRECISION (type)
> +         && TREE_INT_CST_LOW (arg1) < prec
>           && host_integerp (TREE_OPERAND (arg0, 1), false)
> -         && TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1)) < TYPE_PRECISION
> (type))
> +         && TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1)) < prec)
>         {
>           HOST_WIDE_INT low = (TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1))
>                                + TREE_INT_CST_LOW (arg1));
>
>           /* Deal with a OP (c1 + c2) being undefined but (a OP c1) OP c2
>              being well defined.  */
> -         if (low >= TYPE_PRECISION (type))
> +         if (low >= prec)
>             {
>               if (code == LROTATE_EXPR || code == RROTATE_EXPR)
> -               low = low % TYPE_PRECISION (type);
Patch

Index: gcc/convert.c
===================================================================
--- gcc/convert.c	(revision 198685)
+++ gcc/convert.c	(working copy)
@@ -348,22 +348,22 @@  convert_to_real (tree type, tree expr)
    fixed-point or vector; in other cases error is called.
 
    The result of this is always supposed to be a newly created tree node
    not in use in any existing structure.  */
 
 tree
 convert_to_integer (tree type, tree expr)
 {
   enum tree_code ex_form = TREE_CODE (expr);
   tree intype = TREE_TYPE (expr);
-  unsigned int inprec = TYPE_PRECISION (intype);
-  unsigned int outprec = TYPE_PRECISION (type);
+  unsigned int inprec = element_precision (intype);
+  unsigned int outprec = element_precision (type);
 
   /* An INTEGER_TYPE cannot be incomplete, but an ENUMERAL_TYPE can
      be.  Consider `enum E = { a, b = (enum E) 3 };'.  */
   if (!COMPLETE_TYPE_P (type))
     {
       error ("conversion to incomplete type");
       return error_mark_node;
     }
 
   /* Convert e.g. (long)round(d) -> lround(d).  */
Index: gcc/tree.c
===================================================================
--- gcc/tree.c	(revision 198685)
+++ gcc/tree.c	(working copy)
@@ -1615,20 +1615,59 @@  build_one_cst (tree type)
     case COMPLEX_TYPE:
       return build_complex (type,
 			    build_one_cst (TREE_TYPE (type)),
 			    build_zero_cst (TREE_TYPE (type)));
 
     default:
       gcc_unreachable ();
     }
 }
 
+/* Return a constant of arithmetic type TYPE which is the
+   opposite of the multiplicative identity of the set TYPE.  */
+
+tree
+build_minus_one_cst (tree type)
+{
+  switch (TREE_CODE (type))
+    {
+    case INTEGER_TYPE: case ENUMERAL_TYPE: case BOOLEAN_TYPE:
+    case POINTER_TYPE: case REFERENCE_TYPE:
+    case OFFSET_TYPE:
+      return build_int_cst (type, -1);
+
+    case REAL_TYPE:
+      return build_real (type, dconstm1);
+
+    case FIXED_POINT_TYPE:
+      /* We can only generate 1 for accum types.  */
+      gcc_assert (ALL_SCALAR_ACCUM_MODE_P (TYPE_MODE (type)));
+      return build_fixed (type, fixed_from_double_int (double_int_minus_one,
+						       TYPE_MODE (type)));
+
+    case VECTOR_TYPE:
+      {
+	tree scalar = build_minus_one_cst (TREE_TYPE (type));
+
+	return build_vector_from_val (type, scalar);
+      }
+
+    case COMPLEX_TYPE:
+      return build_complex (type,
+			    build_minus_one_cst (TREE_TYPE (type)),
+			    build_zero_cst (TREE_TYPE (type)));
+
+    default:
+      gcc_unreachable ();
+    }
+}
+
 /* Build 0 constant of type TYPE.  This is used by constructor folding
    and thus the constant should be represented in memory by
    zero(es).  */
 
 tree
 build_zero_cst (tree type)
 {
   switch (TREE_CODE (type))
     {
     case INTEGER_TYPE: case ENUMERAL_TYPE: case BOOLEAN_TYPE:
@@ -6921,20 +6960,33 @@  compare_tree_int (const_tree t, unsigned
 bool
 valid_constant_size_p (const_tree size)
 {
   if (! host_integerp (size, 1)
       || TREE_OVERFLOW (size)
       || tree_int_cst_sign_bit (size) != 0)
     return false;
   return true;
 }
 
+/* Return the precision of the type, or for a complex or vector type the
+   precision of the type of its elements.  */
+
+unsigned int
+element_precision (const_tree type)
+{
+  enum tree_code code = TREE_CODE (type);
+  if (code == COMPLEX_TYPE || code == VECTOR_TYPE)
+    type = TREE_TYPE (type);
+
+  return TYPE_PRECISION (type);
+}
+
 /* Return true if CODE represents an associative tree code.  Otherwise
    return false.  */
 bool
 associative_tree_code (enum tree_code code)
 {
   switch (code)
     {
     case BIT_IOR_EXPR:
     case BIT_AND_EXPR:
     case BIT_XOR_EXPR:
Index: gcc/tree.h
===================================================================
--- gcc/tree.h	(revision 198685)
+++ gcc/tree.h	(working copy)
@@ -929,20 +929,23 @@  extern void omp_clause_range_check_faile
 
 #define RECORD_OR_UNION_CHECK(T)	\
   TREE_CHECK3 (T, RECORD_TYPE, UNION_TYPE, QUAL_UNION_TYPE)
 #define NOT_RECORD_OR_UNION_CHECK(T) \
   TREE_NOT_CHECK3 (T, RECORD_TYPE, UNION_TYPE, QUAL_UNION_TYPE)
 
 #define NUMERICAL_TYPE_CHECK(T)					\
   TREE_CHECK5 (T, INTEGER_TYPE, ENUMERAL_TYPE, BOOLEAN_TYPE, REAL_TYPE,	\
 	       FIXED_POINT_TYPE)
 
+#define SCALAR_TYPE_CHECK(T)	\
+  TREE_NOT_CHECK2 (TYPE_CHECK (T), COMPLEX_TYPE, VECTOR_TYPE)
+
 /* Here is how primitive or already-canonicalized types' hash codes
    are made.  */
 #define TYPE_HASH(TYPE) (TYPE_UID (TYPE))
 
 /* A simple hash function for an arbitrary tree node.  This must not be
    used in hash tables which are saved to a PCH.  */
 #define TREE_HASH(NODE) ((size_t) (NODE) & 0777777)
 
 /* Tests if CODE is a conversion expr (NOP_EXPR or CONVERT_EXPR).  */
 #define CONVERT_EXPR_CODE_P(CODE)				\
@@ -2108,20 +2111,22 @@  struct GTY(()) tree_block {
    overloaded and used for different macros in different kinds of types.
    Each macro must check to ensure the tree node is of the proper kind of
    type.  Note also that some of the front-ends also overload these fields,
    so they must be checked as well.  */
 
 #define TYPE_UID(NODE) (TYPE_CHECK (NODE)->type_common.uid)
 #define TYPE_SIZE(NODE) (TYPE_CHECK (NODE)->type_common.size)
 #define TYPE_SIZE_UNIT(NODE) (TYPE_CHECK (NODE)->type_common.size_unit)
 #define TYPE_POINTER_TO(NODE) (TYPE_CHECK (NODE)->type_common.pointer_to)
 #define TYPE_REFERENCE_TO(NODE) (TYPE_CHECK (NODE)->type_common.reference_to)
+//FIXME: we want
+//#define TYPE_PRECISION(NODE) (SCALAR_TYPE_CHECK (NODE)->type_common.precision)
 #define TYPE_PRECISION(NODE) (TYPE_CHECK (NODE)->type_common.precision)
 #define TYPE_NAME(NODE) (TYPE_CHECK (NODE)->type_common.name)
 #define TYPE_NEXT_VARIANT(NODE) (TYPE_CHECK (NODE)->type_common.next_variant)
 #define TYPE_MAIN_VARIANT(NODE) (TYPE_CHECK (NODE)->type_common.main_variant)
 #define TYPE_CONTEXT(NODE) (TYPE_CHECK (NODE)->type_common.context)
 
 /* Vector types need to check target flags to determine type.  */
 extern enum machine_mode vector_type_mode (const_tree);
 #define TYPE_MODE(NODE) \
   (VECTOR_TYPE_P (TYPE_CHECK (NODE)) \
@@ -4759,20 +4764,21 @@  extern tree make_vector_stat (unsigned M
 extern tree build_vector_stat (tree, tree * MEM_STAT_DECL);
 #define build_vector(t,v) build_vector_stat (t, v MEM_STAT_INFO)
 extern tree build_vector_from_ctor (tree, vec<constructor_elt, va_gc> *);
 extern tree build_vector_from_val (tree, tree);
 extern tree build_constructor (tree, vec<constructor_elt, va_gc> *);
 extern tree build_constructor_single (tree, tree, tree);
 extern tree build_constructor_from_list (tree, tree);
 extern tree build_real_from_int_cst (tree, const_tree);
 extern tree build_complex (tree, tree, tree);
 extern tree build_one_cst (tree);
+extern tree build_minus_one_cst (tree);
 extern tree build_zero_cst (tree);
 extern tree build_string (int, const char *);
 extern tree build_tree_list_stat (tree, tree MEM_STAT_DECL);
 #define build_tree_list(t,q) build_tree_list_stat(t,q MEM_STAT_INFO)
 extern tree build_tree_list_vec_stat (const vec<tree, va_gc> *MEM_STAT_DECL);
 #define build_tree_list_vec(v) build_tree_list_vec_stat (v MEM_STAT_INFO)
 extern tree build_decl_stat (location_t, enum tree_code,
 			     tree, tree MEM_STAT_DECL);
 extern tree build_fn_decl (const char *, tree);
 #define build_decl(l,c,t,q) build_decl_stat (l,c,t,q MEM_STAT_INFO)
@@ -4859,20 +4865,21 @@  tree_low_cst (const_tree t, int pos)
 extern HOST_WIDE_INT size_low_cst (const_tree);
 extern int tree_int_cst_sgn (const_tree);
 extern int tree_int_cst_sign_bit (const_tree);
 extern unsigned int tree_int_cst_min_precision (tree, bool);
 extern bool tree_expr_nonnegative_p (tree);
 extern bool tree_expr_nonnegative_warnv_p (tree, bool *);
 extern bool may_negate_without_overflow_p (const_tree);
 extern tree strip_array_types (tree);
 extern tree excess_precision_type (tree);
 extern bool valid_constant_size_p (const_tree);
+extern unsigned int element_precision (const_tree);
 
 /* Construct various nodes representing fract or accum data types.  */
 
 extern tree make_fract_type (int, int, int);
 extern tree make_accum_type (int, int, int);
 
 #define make_signed_fract_type(P) make_fract_type (P, 0, 0)
 #define make_unsigned_fract_type(P) make_fract_type (P, 1, 0)
 #define make_sat_signed_fract_type(P) make_fract_type (P, 0, 1)
 #define make_sat_unsigned_fract_type(P) make_fract_type (P, 1, 1)
Index: gcc/gimple.c
===================================================================
--- gcc/gimple.c	(revision 198685)
+++ gcc/gimple.c	(working copy)
@@ -3076,29 +3076,36 @@  iterative_hash_canonical_type (tree type
      checked.  */
   v = iterative_hash_hashval_t (TREE_CODE (type), 0);
   v = iterative_hash_hashval_t (TREE_ADDRESSABLE (type), v);
   v = iterative_hash_hashval_t (TYPE_ALIGN (type), v);
   v = iterative_hash_hashval_t (TYPE_MODE (type), v);
 
   /* Incorporate common features of numerical types.  */
   if (INTEGRAL_TYPE_P (type)
       || SCALAR_FLOAT_TYPE_P (type)
       || FIXED_POINT_TYPE_P (type)
-      || TREE_CODE (type) == VECTOR_TYPE
-      || TREE_CODE (type) == COMPLEX_TYPE
       || TREE_CODE (type) == OFFSET_TYPE
       || POINTER_TYPE_P (type))
     {
       v = iterative_hash_hashval_t (TYPE_PRECISION (type), v);
       v = iterative_hash_hashval_t (TYPE_UNSIGNED (type), v);
     }
 
+  if (VECTOR_TYPE_P (type))
+    {
+      v = iterative_hash_hashval_t (TYPE_VECTOR_SUBPARTS (type), v);
+      v = iterative_hash_hashval_t (TYPE_UNSIGNED (type), v);
+    }
+
+  if (TREE_CODE (type) == COMPLEX_TYPE)
+    v = iterative_hash_hashval_t (TYPE_UNSIGNED (type), v);
+
   /* For pointer and reference types, fold in information about the type
      pointed to but do not recurse to the pointed-to type.  */
   if (POINTER_TYPE_P (type))
     {
       v = iterative_hash_hashval_t (TYPE_REF_CAN_ALIAS_ALL (type), v);
       v = iterative_hash_hashval_t (TYPE_ADDR_SPACE (TREE_TYPE (type)), v);
       v = iterative_hash_hashval_t (TYPE_RESTRICT (type), v);
       v = iterative_hash_hashval_t (TREE_CODE (TREE_TYPE (type)), v);
     }
 
Index: gcc/testsuite/gcc.dg/vector-shift.c
===================================================================
--- gcc/testsuite/gcc.dg/vector-shift.c	(revision 0)
+++ gcc/testsuite/gcc.dg/vector-shift.c	(revision 0)
@@ -0,0 +1,13 @@ 
+/* { dg-do compile } */
+/* { dg-options "-fdump-tree-original" } */
+
+typedef unsigned vec __attribute__ ((vector_size (4 * sizeof (int))));
+
+void
+f (vec *x)
+{
+  *x = (*x << 4) << 3;
+}
+
+/* { dg-final { scan-tree-dump "<< 7" "original" } } */
+/* { dg-final { cleanup-tree-dump "original" } } */

Property changes on: gcc/testsuite/gcc.dg/vector-shift.c
___________________________________________________________________
Added: svn:eol-style
   + native
Added: svn:keywords
   + Author Date Id Revision URL
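The testcase above checks that two consecutive vector left shifts are merged into one. A minimal scalar model of that fold (hypothetical helper name `fold_double_lshift`; the real transformation lives in `fold_binary_loc` and operates on trees, not values) might look like:

```c
#include <assert.h>

/* Scalar model of the (x << c1) << c2 fold exercised by the testcase:
   the shift counts are added when their sum stays below the element
   precision; for an unsigned type the result is 0 once every bit has
   been shifted out.  This is an illustration, not code from the patch.  */
static unsigned
fold_double_lshift (unsigned x, unsigned c1, unsigned c2)
{
  const unsigned prec = 32;	/* element precision of 'unsigned' */
  unsigned low = c1 + c2;

  if (low >= prec)
    return 0;			/* whole value shifted out */
  return x << low;
}
```

For the testcase's `(*x << 4) << 3`, each vector element behaves like `fold_double_lshift (e, 4, 3)`, i.e. a single `<< 7`, which is what the tree dump scan expects.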

Index: gcc/c-family/c-common.c
===================================================================
--- gcc/c-family/c-common.c	(revision 198685)
+++ gcc/c-family/c-common.c	(working copy)
@@ -2220,21 +2220,21 @@  vector_types_convertible_p (const_tree t
   static bool emitted_lax_note = false;
   bool convertible_lax;
 
   if ((TYPE_VECTOR_OPAQUE (t1) || TYPE_VECTOR_OPAQUE (t2))
       && tree_int_cst_equal (TYPE_SIZE (t1), TYPE_SIZE (t2)))
     return true;
 
   convertible_lax =
     (tree_int_cst_equal (TYPE_SIZE (t1), TYPE_SIZE (t2))
      && (TREE_CODE (TREE_TYPE (t1)) != REAL_TYPE ||
-	 TYPE_PRECISION (t1) == TYPE_PRECISION (t2))
+	 TYPE_VECTOR_SUBPARTS (t1) == TYPE_VECTOR_SUBPARTS (t2))
      && (INTEGRAL_TYPE_P (TREE_TYPE (t1))
 	 == INTEGRAL_TYPE_P (TREE_TYPE (t2))));
 
   if (!convertible_lax || flag_lax_vector_conversions)
     return convertible_lax;
 
   if (TYPE_VECTOR_SUBPARTS (t1) == TYPE_VECTOR_SUBPARTS (t2)
       && lang_hooks.types_compatible_p (TREE_TYPE (t1), TREE_TYPE (t2)))
     return true;
 
Index: gcc/fold-const.c
===================================================================
--- gcc/fold-const.c	(revision 198685)
+++ gcc/fold-const.c	(working copy)
@@ -2438,21 +2438,22 @@  operand_equal_p (const_tree arg0, const_
     return 0;
 
   /* We cannot consider pointers to different address space equal.  */
   if (POINTER_TYPE_P (TREE_TYPE (arg0)) && POINTER_TYPE_P (TREE_TYPE (arg1))
       && (TYPE_ADDR_SPACE (TREE_TYPE (TREE_TYPE (arg0)))
 	  != TYPE_ADDR_SPACE (TREE_TYPE (TREE_TYPE (arg1)))))
     return 0;
 
   /* If both types don't have the same precision, then it is not safe
      to strip NOPs.  */
-  if (TYPE_PRECISION (TREE_TYPE (arg0)) != TYPE_PRECISION (TREE_TYPE (arg1)))
+  if (element_precision (TREE_TYPE (arg0))
+      != element_precision (TREE_TYPE (arg1)))
     return 0;
 
   STRIP_NOPS (arg0);
   STRIP_NOPS (arg1);
 
   /* In case both args are comparisons but with different comparison
      code, try to swap the comparison operands of one arg to produce
      a match and compare that variant.  */
   if (TREE_CODE (arg0) != TREE_CODE (arg1)
       && COMPARISON_CLASS_P (arg0)
@@ -9870,20 +9871,21 @@  exact_inverse (tree type, tree cst)
    return NULL_TREE.  */
 
 tree
 fold_binary_loc (location_t loc,
 	     enum tree_code code, tree type, tree op0, tree op1)
 {
   enum tree_code_class kind = TREE_CODE_CLASS (code);
   tree arg0, arg1, tem;
   tree t1 = NULL_TREE;
   bool strict_overflow_p;
+  unsigned int prec;
 
   gcc_assert (IS_EXPR_CODE_CLASS (kind)
 	      && TREE_CODE_LENGTH (code) == 2
 	      && op0 != NULL_TREE
 	      && op1 != NULL_TREE);
 
   arg0 = op0;
   arg1 = op1;
 
   /* Strip any conversions that don't change the mode.  This is
@@ -10140,35 +10142,35 @@  fold_binary_loc (location_t loc,
 
 	  /* ~X + X is -1.  */
 	  if (TREE_CODE (arg0) == BIT_NOT_EXPR
 	      && !TYPE_OVERFLOW_TRAPS (type))
 	    {
 	      tree tem = TREE_OPERAND (arg0, 0);
 
 	      STRIP_NOPS (tem);
 	      if (operand_equal_p (tem, arg1, 0))
 		{
-		  t1 = build_int_cst_type (type, -1);
+		  t1 = build_minus_one_cst (type);
 		  return omit_one_operand_loc (loc, type, t1, arg1);
 		}
 	    }
 
 	  /* X + ~X is -1.  */
 	  if (TREE_CODE (arg1) == BIT_NOT_EXPR
 	      && !TYPE_OVERFLOW_TRAPS (type))
 	    {
 	      tree tem = TREE_OPERAND (arg1, 0);
 
 	      STRIP_NOPS (tem);
 	      if (operand_equal_p (arg0, tem, 0))
 		{
-		  t1 = build_int_cst_type (type, -1);
+		  t1 = build_minus_one_cst (type);
 		  return omit_one_operand_loc (loc, type, t1, arg0);
 		}
 	    }
 
 	  /* X + (X / CST) * -CST is X % CST.  */
 	  if (TREE_CODE (arg1) == MULT_EXPR
 	      && TREE_CODE (TREE_OPERAND (arg1, 0)) == TRUNC_DIV_EXPR
 	      && operand_equal_p (arg0,
 				  TREE_OPERAND (TREE_OPERAND (arg1, 0), 0), 0))
 	    {
@@ -10380,75 +10382,76 @@  fold_binary_loc (location_t loc,
 	code0 = TREE_CODE (arg0);
 	code1 = TREE_CODE (arg1);
 	if (((code0 == RSHIFT_EXPR && code1 == LSHIFT_EXPR)
 	     || (code1 == RSHIFT_EXPR && code0 == LSHIFT_EXPR))
 	    && operand_equal_p (TREE_OPERAND (arg0, 0),
 			        TREE_OPERAND (arg1, 0), 0)
 	    && (rtype = TREE_TYPE (TREE_OPERAND (arg0, 0)),
 	        TYPE_UNSIGNED (rtype))
 	    /* Only create rotates in complete modes.  Other cases are not
 	       expanded properly.  */
-	    && TYPE_PRECISION (rtype) == GET_MODE_PRECISION (TYPE_MODE (rtype)))
+	    && (element_precision (rtype)
+		== element_precision (TYPE_MODE (rtype))))
 	  {
 	    tree tree01, tree11;
 	    enum tree_code code01, code11;
 
 	    tree01 = TREE_OPERAND (arg0, 1);
 	    tree11 = TREE_OPERAND (arg1, 1);
 	    STRIP_NOPS (tree01);
 	    STRIP_NOPS (tree11);
 	    code01 = TREE_CODE (tree01);
 	    code11 = TREE_CODE (tree11);
 	    if (code01 == INTEGER_CST
 		&& code11 == INTEGER_CST
 		&& TREE_INT_CST_HIGH (tree01) == 0
 		&& TREE_INT_CST_HIGH (tree11) == 0
 		&& ((TREE_INT_CST_LOW (tree01) + TREE_INT_CST_LOW (tree11))
-		    == TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg0, 0)))))
+		    == element_precision (TREE_TYPE (TREE_OPERAND (arg0, 0)))))
 	      {
 		tem = build2_loc (loc, LROTATE_EXPR,
 				  TREE_TYPE (TREE_OPERAND (arg0, 0)),
 				  TREE_OPERAND (arg0, 0),
 				  code0 == LSHIFT_EXPR ? tree01 : tree11);
 		return fold_convert_loc (loc, type, tem);
 	      }
 	    else if (code11 == MINUS_EXPR)
 	      {
 		tree tree110, tree111;
 		tree110 = TREE_OPERAND (tree11, 0);
 		tree111 = TREE_OPERAND (tree11, 1);
 		STRIP_NOPS (tree110);
 		STRIP_NOPS (tree111);
 		if (TREE_CODE (tree110) == INTEGER_CST
 		    && 0 == compare_tree_int (tree110,
-					      TYPE_PRECISION
+					      element_precision
 					      (TREE_TYPE (TREE_OPERAND
 							  (arg0, 0))))
 		    && operand_equal_p (tree01, tree111, 0))
 		  return
 		    fold_convert_loc (loc, type,
 				      build2 ((code0 == LSHIFT_EXPR
 					       ? LROTATE_EXPR
 					       : RROTATE_EXPR),
 					      TREE_TYPE (TREE_OPERAND (arg0, 0)),
 					      TREE_OPERAND (arg0, 0), tree01));
 	      }
 	    else if (code01 == MINUS_EXPR)
 	      {
 		tree tree010, tree011;
 		tree010 = TREE_OPERAND (tree01, 0);
 		tree011 = TREE_OPERAND (tree01, 1);
 		STRIP_NOPS (tree010);
 		STRIP_NOPS (tree011);
 		if (TREE_CODE (tree010) == INTEGER_CST
 		    && 0 == compare_tree_int (tree010,
-					      TYPE_PRECISION
+					      element_precision
 					      (TREE_TYPE (TREE_OPERAND
 							  (arg0, 0))))
 		    && operand_equal_p (tree11, tree011, 0))
 		    return fold_convert_loc
 		      (loc, type,
 		       build2 ((code0 != LSHIFT_EXPR
 				? LROTATE_EXPR
 				: RROTATE_EXPR),
 			       TREE_TYPE (TREE_OPERAND (arg0, 0)),
 			       TREE_OPERAND (arg0, 0), tree11));
@@ -11750,22 +11753,21 @@  fold_binary_loc (location_t loc,
 	    }
 	}
 
       t1 = distribute_bit_expr (loc, code, type, arg0, arg1);
       if (t1 != NULL_TREE)
 	return t1;
       /* Simplify ((int)c & 0377) into (int)c, if c is unsigned char.  */
       if (TREE_CODE (arg1) == INTEGER_CST && TREE_CODE (arg0) == NOP_EXPR
 	  && TYPE_UNSIGNED (TREE_TYPE (TREE_OPERAND (arg0, 0))))
 	{
-	  unsigned int prec
-	    = TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg0, 0)));
+	  prec = TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg0, 0)));
 
 	  if (prec < BITS_PER_WORD && prec < HOST_BITS_PER_WIDE_INT
 	      && (~TREE_INT_CST_LOW (arg1)
 		  & (((HOST_WIDE_INT) 1 << prec) - 1)) == 0)
 	    return
 	      fold_convert_loc (loc, type, TREE_OPERAND (arg0, 0));
 	}
 
       /* Convert (and (not arg0) (not arg1)) to (not (or (arg0) (arg1))).
 
@@ -11819,21 +11821,21 @@  fold_binary_loc (location_t loc,
 	    = tree_low_cst (arg1, TYPE_UNSIGNED (TREE_TYPE (arg1)));
 	  unsigned HOST_WIDE_INT newmask, zerobits = 0;
 	  tree shift_type = TREE_TYPE (arg0);
 
 	  if (TREE_CODE (arg0) == LSHIFT_EXPR)
 	    zerobits = ((((unsigned HOST_WIDE_INT) 1) << shiftc) - 1);
 	  else if (TREE_CODE (arg0) == RSHIFT_EXPR
 		   && TYPE_PRECISION (TREE_TYPE (arg0))
 		      == GET_MODE_BITSIZE (TYPE_MODE (TREE_TYPE (arg0))))
 	    {
-	      unsigned int prec = TYPE_PRECISION (TREE_TYPE (arg0));
+	      prec = TYPE_PRECISION (TREE_TYPE (arg0));
 	      tree arg00 = TREE_OPERAND (arg0, 0);
 	      /* See if more bits can be proven as zero because of
 		 zero extension.  */
 	      if (TREE_CODE (arg00) == NOP_EXPR
 		  && TYPE_UNSIGNED (TREE_TYPE (TREE_OPERAND (arg00, 0))))
 		{
 		  tree inner_type = TREE_TYPE (TREE_OPERAND (arg00, 0));
 		  if (TYPE_PRECISION (inner_type)
 		      == GET_MODE_BITSIZE (TYPE_MODE (inner_type))
 		      && TYPE_PRECISION (inner_type) < prec)
@@ -11862,22 +11864,20 @@  fold_binary_loc (location_t loc,
 	    }
 
 	  /* ((X << 16) & 0xff00) is (X, 0).  */
 	  if ((mask & zerobits) == mask)
 	    return omit_one_operand_loc (loc, type,
 				     build_int_cst (type, 0), arg0);
 
 	  newmask = mask | zerobits;
 	  if (newmask != mask && (newmask & (newmask + 1)) == 0)
 	    {
-	      unsigned int prec;
-
 	      /* Only do the transformation if NEWMASK is some integer
 		 mode's mask.  */
 	      for (prec = BITS_PER_UNIT;
 		   prec < HOST_BITS_PER_WIDE_INT; prec <<= 1)
 		if (newmask == (((unsigned HOST_WIDE_INT) 1) << prec) - 1)
 		  break;
 	      if (prec < HOST_BITS_PER_WIDE_INT
 		  || newmask == ~(unsigned HOST_WIDE_INT) 0)
 		{
 		  tree newmaskt;
@@ -12407,78 +12407,79 @@  fold_binary_loc (location_t loc,
       if (integer_zerop (arg1))
 	return non_lvalue_loc (loc, fold_convert_loc (loc, type, arg0));
       if (integer_zerop (arg0))
 	return omit_one_operand_loc (loc, type, arg0, arg1);
 
       /* Since negative shift count is not well-defined,
 	 don't try to compute it in the compiler.  */
       if (TREE_CODE (arg1) == INTEGER_CST && tree_int_cst_sgn (arg1) < 0)
 	return NULL_TREE;
 
+      prec = element_precision (type);
+
       /* Turn (a OP c1) OP c2 into a OP (c1+c2).  */
       if (TREE_CODE (op0) == code && host_integerp (arg1, false)
-	  && TREE_INT_CST_LOW (arg1) < TYPE_PRECISION (type)
+	  && TREE_INT_CST_LOW (arg1) < prec
 	  && host_integerp (TREE_OPERAND (arg0, 1), false)
-	  && TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1)) < TYPE_PRECISION (type))
+	  && TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1)) < prec)
 	{
 	  HOST_WIDE_INT low = (TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1))
 			       + TREE_INT_CST_LOW (arg1));
 
 	  /* Deal with a OP (c1 + c2) being undefined but (a OP c1) OP c2
 	     being well defined.  */
-	  if (low >= TYPE_PRECISION (type))
+	  if (low >= prec)
 	    {
 	      if (code == LROTATE_EXPR || code == RROTATE_EXPR)
-	        low = low % TYPE_PRECISION (type);
+	        low = low % prec;
 	      else if (TYPE_UNSIGNED (type) || code == LSHIFT_EXPR)
-		return omit_one_operand_loc (loc, type, build_int_cst (type, 0),
+		return omit_one_operand_loc (loc, type, build_zero_cst (type),
 					 TREE_OPERAND (arg0, 0));
 	      else
-		low = TYPE_PRECISION (type) - 1;
+		low = prec - 1;
 	    }
 
 	  return fold_build2_loc (loc, code, type, TREE_OPERAND (arg0, 0),
-			      build_int_cst (type, low));
+				  build_int_cst (TREE_TYPE (arg1), low));
 	}
 
       /* Transform (x >> c) << c into x & (-1<<c), or transform (x << c) >> c
          into x & ((unsigned)-1 >> c) for unsigned types.  */
       if (((code == LSHIFT_EXPR && TREE_CODE (arg0) == RSHIFT_EXPR)
            || (TYPE_UNSIGNED (type)
 	       && code == RSHIFT_EXPR && TREE_CODE (arg0) == LSHIFT_EXPR))
 	  && host_integerp (arg1, false)
-	  && TREE_INT_CST_LOW (arg1) < TYPE_PRECISION (type)
+	  && TREE_INT_CST_LOW (arg1) < prec
 	  && host_integerp (TREE_OPERAND (arg0, 1), false)
-	  && TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1)) < TYPE_PRECISION (type))
+	  && TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1)) < prec)
 	{
 	  HOST_WIDE_INT low0 = TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1));
 	  HOST_WIDE_INT low1 = TREE_INT_CST_LOW (arg1);
 	  tree lshift;
 	  tree arg00;
 
 	  if (low0 == low1)
 	    {
 	      arg00 = fold_convert_loc (loc, type, TREE_OPERAND (arg0, 0));
 
-	      lshift = build_int_cst (type, -1);
-	      lshift = int_const_binop (code, lshift, arg1);
+	      lshift = build_minus_one_cst (type);
+	      lshift = const_binop (code, lshift, arg1);
 
 	      return fold_build2_loc (loc, BIT_AND_EXPR, type, arg00, lshift);
 	    }
 	}
 
       /* Rewrite an LROTATE_EXPR by a constant into an
 	 RROTATE_EXPR by a new constant.  */
       if (code == LROTATE_EXPR && TREE_CODE (arg1) == INTEGER_CST)
 	{
-	  tree tem = build_int_cst (TREE_TYPE (arg1),
-				    TYPE_PRECISION (type));
+	  tree tem = build_int_cst (TREE_TYPE (arg1), prec);
 	  tem = const_binop (MINUS_EXPR, tem, arg1);
 	  return fold_build2_loc (loc, RROTATE_EXPR, type, op0, tem);
 	}
 
       /* If we have a rotate of a bit operation with the rotate count and
 	 the second operand of the bit operation both constant,
 	 permute the two operations.  */
       if (code == RROTATE_EXPR && TREE_CODE (arg1) == INTEGER_CST
 	  && (TREE_CODE (arg0) == BIT_AND_EXPR
 	      || TREE_CODE (arg0) == BIT_IOR_EXPR
@@ -12492,21 +12493,21 @@  fold_binary_loc (location_t loc,
 
       /* Two consecutive rotates adding up to the precision of the
 	 type can be ignored.  */
       if (code == RROTATE_EXPR && TREE_CODE (arg1) == INTEGER_CST
 	  && TREE_CODE (arg0) == RROTATE_EXPR
 	  && TREE_CODE (TREE_OPERAND (arg0, 1)) == INTEGER_CST
 	  && TREE_INT_CST_HIGH (arg1) == 0
 	  && TREE_INT_CST_HIGH (TREE_OPERAND (arg0, 1)) == 0
 	  && ((TREE_INT_CST_LOW (arg1)
 	       + TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1)))
-	      == (unsigned int) TYPE_PRECISION (type)))
+	      == prec))
 	return TREE_OPERAND (arg0, 0);
 
       /* Fold (X & C2) << C1 into (X << C1) & (C2 << C1)
 	      (X & C2) >> C1 into (X >> C1) & (C2 >> C1)
 	 if the latter can be further optimized.  */
       if ((code == LSHIFT_EXPR || code == RSHIFT_EXPR)
 	  && TREE_CODE (arg0) == BIT_AND_EXPR
 	  && TREE_CODE (arg1) == INTEGER_CST
 	  && TREE_CODE (TREE_OPERAND (arg0, 1)) == INTEGER_CST)
 	{
@@ -12905,22 +12906,22 @@  fold_binary_loc (location_t loc,
 	 C1 is a valid shift constant, and C2 is a power of two, i.e.
 	 a single bit.  */
       if (TREE_CODE (arg0) == BIT_AND_EXPR
 	  && TREE_CODE (TREE_OPERAND (arg0, 0)) == RSHIFT_EXPR
 	  && TREE_CODE (TREE_OPERAND (TREE_OPERAND (arg0, 0), 1))
 	     == INTEGER_CST
 	  && integer_pow2p (TREE_OPERAND (arg0, 1))
 	  && integer_zerop (arg1))
 	{
 	  tree itype = TREE_TYPE (arg0);
-	  unsigned HOST_WIDE_INT prec = TYPE_PRECISION (itype);
 	  tree arg001 = TREE_OPERAND (TREE_OPERAND (arg0, 0), 1);
+	  prec = TYPE_PRECISION (itype);
 
 	  /* Check for a valid shift count.  */
 	  if (TREE_INT_CST_HIGH (arg001) == 0
 	      && TREE_INT_CST_LOW (arg001) < prec)
 	    {
 	      tree arg01 = TREE_OPERAND (arg0, 1);
 	      tree arg000 = TREE_OPERAND (TREE_OPERAND (arg0, 0), 0);
 	      unsigned HOST_WIDE_INT log2 = tree_log2 (arg01);
 	      /* If (C2 << C1) doesn't overflow, then ((X >> C1) & C2) != 0
 		 can be rewritten as (X & (C2 << C1)) != 0.  */
Index: gcc/stor-layout.c
===================================================================
--- gcc/stor-layout.c	(revision 198685)
+++ gcc/stor-layout.c	(working copy)
@@ -446,20 +446,32 @@  mode_for_vector (enum machine_mode inner
 
 /* Return the alignment of MODE. This will be bounded by 1 and
    BIGGEST_ALIGNMENT.  */
 
 unsigned int
 get_mode_alignment (enum machine_mode mode)
 {
   return MIN (BIGGEST_ALIGNMENT, MAX (1, mode_base_align[mode]*BITS_PER_UNIT));
 }
 
+/* Return the precision of the mode, or for a complex or vector mode the
+   precision of the mode of its elements.  */
+
+unsigned int
+element_precision (enum machine_mode mode)
+{
+  if (COMPLEX_MODE_P (mode) || VECTOR_MODE_P (mode))
+    mode = GET_MODE_INNER (mode);
+
+  return GET_MODE_PRECISION (mode);
+}
+
 /* Return the natural mode of an array, given that it is SIZE bytes in
    total and has elements of type ELEM_TYPE.  */
 
 static enum machine_mode
 mode_for_array (tree elem_type, tree size)
 {
   tree elem_size;
   unsigned HOST_WIDE_INT int_size, int_elem_size;
   bool limit_p;
 
Index: gcc/machmode.h
===================================================================
--- gcc/machmode.h	(revision 198685)
+++ gcc/machmode.h	(working copy)
@@ -290,20 +290,24 @@  extern enum machine_mode get_best_mode (
 					enum machine_mode, bool);
 
 /* Determine alignment, 1<=result<=BIGGEST_ALIGNMENT.  */
 
 extern CONST_MODE_BASE_ALIGN unsigned char mode_base_align[NUM_MACHINE_MODES];
 
 extern unsigned get_mode_alignment (enum machine_mode);
 
 #define GET_MODE_ALIGNMENT(MODE) get_mode_alignment (MODE)
 
+/* Get the precision of the mode or its inner mode if it has one.  */
+
+extern unsigned int element_precision (enum machine_mode);
+
 /* For each class, get the narrowest mode in that class.  */
 
 extern const unsigned char class_narrowest_mode[MAX_MODE_CLASS];
 #define GET_CLASS_NARROWEST_MODE(CLASS) \
   ((enum machine_mode) class_narrowest_mode[CLASS])
 
 /* Define the integer modes whose sizes are BITS_PER_UNIT and BITS_PER_WORD
    and the mode whose class is Pmode and whose size is POINTER_SIZE.  */
 
 extern enum machine_mode byte_mode;
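The two `element_precision` overloads added by the patch both look through one level of structure before reading the precision. A toy model of the tree-level variant (a simplified struct, not GCC's real tree representation; all names here are hypothetical) shows why this matters for vectors, whose `TYPE_PRECISION` field actually stores the log2 of the element count:

```c
#include <assert.h>
#include <stddef.h>

/* Toy stand-in for GCC's tree type nodes (hypothetical layout).  */
enum toy_code { TOY_SCALAR, TOY_COMPLEX, TOY_VECTOR };

struct toy_type
{
  enum toy_code code;
  unsigned precision;		/* for vectors: log2 of the element count,
				   NOT a precision in bits */
  const struct toy_type *elem;	/* element type, or NULL for scalars */
};

/* Mirrors the shape of the patch's tree.c:element_precision: for a
   complex or vector type, report the precision of the element type
   instead of the raw precision field.  */
static unsigned
toy_element_precision (const struct toy_type *t)
{
  if (t->code == TOY_COMPLEX || t->code == TOY_VECTOR)
    t = t->elem;
  return t->precision;
}
```

Reading the precision field of a 4-element vector of 32-bit ints directly would yield 2 (log2 of 4); going through the helper yields the intended 32, which is exactly the confusion the patch's `SCALAR_TYPE_CHECK` on `TYPE_PRECISION` is meant to catch.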