[005/nnn] poly_int: rtx constants

Message ID: 87d15du7dh.fsf@linaro.org
State: New
Series: [005/nnn] poly_int: rtx constants

Commit Message

Richard Sandiford Oct. 23, 2017, 5 p.m. UTC
This patch adds an rtl representation of poly_int values.
There were three possible ways of doing this:

(1) Add a new rtl code for the poly_ints themselves and store the
    coefficients as trailing wide_ints.  This would give constants like:

      (const_poly_int [c0 c1 ... cn])

    The runtime value would be:

      c0 + c1 * x1 + ... + cn * xn

(2) Like (1), but use rtxes for the coefficients.  This would give
    constants like:

      (const_poly_int [(const_int c0)
                       (const_int c1)
                       ...
                       (const_int cn)])

    although the coefficients could be const_wide_ints instead
    of const_ints where appropriate.

(3) Add a new rtl code for the polynomial indeterminates,
    then use them in const wrappers.  A constant like c0 + c1 * x1
    would then look like:

      (const:M (plus:M (mult:M (const_param:M x1)
                               (const_int c1))
                       (const_int c0)))

There didn't seem to be that much to choose between them.  The main
advantage of (1) is that it's a more efficient representation and
that we can refer to the coefficients directly as wide_int_storage.
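
To give a flavour of how (1) is consumed (a sketch only, not part of the
patch: the function name below is invented and a two-coefficient target,
i.e. NUM_POLY_INT_COEFFS == 2, is assumed), callers can either test for a
polynomial constant with the new poly_int_rtx_p overload or read the
coefficients directly:

      /* Sketch: accept integer operands of the form c0 + c1 * x1 whose
         coefficients both fit in [0, 255], storing them in *C0 and *C1.  */
      static bool
      byte_poly_operand_p (rtx x, HOST_WIDE_INT *c0, HOST_WIDE_INT *c1)
      {
        poly_int64 value;
        if (!poly_int_rtx_p (x, &value))
          return false;
        *c0 = value.coeffs[0];
        *c1 = value.coeffs[1];
        return IN_RANGE (*c0, 0, 255) && IN_RANGE (*c1, 0, 255);
      }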


2017-10-23  Richard Sandiford  <richard.sandiford@linaro.org>
	    Alan Hayward  <alan.hayward@arm.com>
	    David Sherwood  <david.sherwood@arm.com>

gcc/
	* doc/rtl.texi (const_poly_int): Document.
	* gengenrtl.c (excluded_rtx): Return true for CONST_POLY_INT.
	* rtl.h (const_poly_int_def): New struct.
	(rtx_def::u): Add a cpi field.
	(CASE_CONST_UNIQUE, CASE_CONST_ANY): Add CONST_POLY_INT.
	(CONST_POLY_INT_P, CONST_POLY_INT_COEFFS): New macros.
	(wi::rtx_to_poly_wide_ref): New typedef.
	(const_poly_int_value, wi::to_poly_wide, rtx_to_poly_int64)
	(poly_int_rtx_p): New functions.
	(trunc_int_for_mode): Declare a poly_int64 version.
	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
	(immed_wide_int_const): Take a poly_wide_int_ref rather than
	a wide_int_ref.
	(strip_offset): Declare.
	(strip_offset_and_add): New function.
	* rtl.def (CONST_POLY_INT): New rtx code.
	* rtl.c (rtx_size): Handle CONST_POLY_INT.
	(shared_const_p): Use poly_int_rtx_p.
	* emit-rtl.h (gen_int_mode): Take a poly_int64 instead of a
	HOST_WIDE_INT.
	(gen_int_shift_amount): Likewise.
	* emit-rtl.c (const_poly_int_hasher): New class.
	(const_poly_int_htab): New variable.
	(init_emit_once): Initialize it when NUM_POLY_INT_COEFFS > 1.
	(const_poly_int_hasher::hash): New function.
	(const_poly_int_hasher::equal): Likewise.
	(gen_int_mode): Take a poly_int64 instead of a HOST_WIDE_INT.
	(immed_wide_int_const): Rename to...
	(immed_wide_int_const_1): ...this and make static.
	(immed_wide_int_const): New function, taking a poly_wide_int_ref
	instead of a wide_int_ref.
	(gen_int_shift_amount): Take a poly_int64 instead of a HOST_WIDE_INT.
	(gen_lowpart_common): Handle CONST_POLY_INT.
	* cse.c (hash_rtx_cb, equiv_constant): Likewise.
	* cselib.c (cselib_hash_rtx): Likewise.
	* dwarf2out.c (const_ok_for_output_1): Likewise.
	* expr.c (convert_modes): Likewise.
	* print-rtl.c (rtx_writer::print_rtx, print_value): Likewise.
	* rtlhash.c (add_rtx): Likewise.
	* explow.c (trunc_int_for_mode): Add a poly_int64 version.
	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
	Handle existing CONST_POLY_INT rtxes.
	* expmed.h (expand_shift): Take a poly_int64 instead of a
	HOST_WIDE_INT.
	* expmed.c (expand_shift): Likewise.
	* rtlanal.c (strip_offset): New function.
	(commutative_operand_precedence): Give CONST_POLY_INT the same
	precedence as CONST_DOUBLE and put CONST_WIDE_INT between that
	and CONST_INT.
	* rtl-tests.c (const_poly_int_tests): New struct.
	(rtl_tests_c_tests): Use it.
	* simplify-rtx.c (simplify_const_unary_operation): Handle
	CONST_POLY_INT.
	(simplify_const_binary_operation): Likewise.
	(simplify_binary_operation_1): Fold additions of symbolic constants
	and CONST_POLY_INTs.
	(simplify_subreg): Handle extensions and truncations of
	CONST_POLY_INTs.
	(simplify_const_poly_int_tests): New struct.
	(simplify_rtx_c_tests): Use it.
	* wide-int.h (storage_ref): Add default constructor.
	(wide_int_ref_storage): Likewise.
	(trailing_wide_ints): Use GTY((user)).
	(trailing_wide_ints::operator[]): Add a const version.
	(trailing_wide_ints::get_precision): New function.
	(trailing_wide_ints::extra_size): Likewise.

Comments

Jeff Law Nov. 17, 2017, 3:58 a.m. UTC | #1
On 10/23/2017 11:00 AM, Richard Sandiford wrote:
> This patch adds an rtl representation of poly_int values.
> There were three possible ways of doing this:
> 
> (1) Add a new rtl code for the poly_ints themselves and store the
>     coefficients as trailing wide_ints.  This would give constants like:
> 
>       (const_poly_int [c0 c1 ... cn])
> 
>     The runtime value would be:
> 
>       c0 + c1 * x1 + ... + cn * xn
> 
> (2) Like (1), but use rtxes for the coefficients.  This would give
>     constants like:
> 
>       (const_poly_int [(const_int c0)
>                        (const_int c1)
>                        ...
>                        (const_int cn)])
> 
>     although the coefficients could be const_wide_ints instead
>     of const_ints where appropriate.
> 
> (3) Add a new rtl code for the polynomial indeterminates,
>     then use them in const wrappers.  A constant like c0 + c1 * x1
>     would then look like:
> 
>       (const:M (plus:M (mult:M (const_param:M x1)
>                                (const_int c1))
>                        (const_int c0)))
> 
> There didn't seem to be that much to choose between them.  The main
> advantage of (1) is that it's a more efficient representation and
> that we can refer to the cofficients directly as wide_int_storage.
Well, and #1 feels more like how we handle CONST_INT :-)
> 
> 
> 2017-10-23  Richard Sandiford  <richard.sandiford@linaro.org>
> 	    Alan Hayward  <alan.hayward@arm.com>
> 	    David Sherwood  <david.sherwood@arm.com>
> 
> gcc/
> 	* doc/rtl.texi (const_poly_int): Document.
> 	* gengenrtl.c (excluded_rtx): Return true for CONST_POLY_INT.
> 	* rtl.h (const_poly_int_def): New struct.
> 	(rtx_def::u): Add a cpi field.
> 	(CASE_CONST_UNIQUE, CASE_CONST_ANY): Add CONST_POLY_INT.
> 	(CONST_POLY_INT_P, CONST_POLY_INT_COEFFS): New macros.
> 	(wi::rtx_to_poly_wide_ref): New typedef
> 	(const_poly_int_value, wi::to_poly_wide, rtx_to_poly_int64)
> 	(poly_int_rtx_p): New functions.
> 	(trunc_int_for_mode): Declare a poly_int64 version.
> 	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
> 	(immed_wide_int_const): Take a poly_wide_int_ref rather than
> 	a wide_int_ref.
> 	(strip_offset): Declare.
> 	(strip_offset_and_add): New function.
> 	* rtl.def (CONST_POLY_INT): New rtx code.
> 	* rtl.c (rtx_size): Handle CONST_POLY_INT.
> 	(shared_const_p): Use poly_int_rtx_p.
> 	* emit-rtl.h (gen_int_mode): Take a poly_int64 instead of a
> 	HOST_WIDE_INT.
> 	(gen_int_shift_amount): Likewise.
> 	* emit-rtl.c (const_poly_int_hasher): New class.
> 	(const_poly_int_htab): New variable.
> 	(init_emit_once): Initialize it when NUM_POLY_INT_COEFFS > 1.
> 	(const_poly_int_hasher::hash): New function.
> 	(const_poly_int_hasher::equal): Likewise.
> 	(gen_int_mode): Take a poly_int64 instead of a HOST_WIDE_INT.
> 	(immed_wide_int_const): Rename to...
> 	(immed_wide_int_const_1): ...this and make static.
> 	(immed_wide_int_const): New function, taking a poly_wide_int_ref
> 	instead of a wide_int_ref.
> 	(gen_int_shift_amount): Take a poly_int64 instead of a HOST_WIDE_INT.
> 	(gen_lowpart_common): Handle CONST_POLY_INT.
> 	* cse.c (hash_rtx_cb, equiv_constant): Likewise.
> 	* cselib.c (cselib_hash_rtx): Likewise.
> 	* dwarf2out.c (const_ok_for_output_1): Likewise.
> 	* expr.c (convert_modes): Likewise.
> 	* print-rtl.c (rtx_writer::print_rtx, print_value): Likewise.
> 	* rtlhash.c (add_rtx): Likewise.
> 	* explow.c (trunc_int_for_mode): Add a poly_int64 version.
> 	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
> 	Handle existing CONST_POLY_INT rtxes.
> 	* expmed.h (expand_shift): Take a poly_int64 instead of a
> 	HOST_WIDE_INT.
> 	* expmed.c (expand_shift): Likewise.
> 	* rtlanal.c (strip_offset): New function.
> 	(commutative_operand_precedence): Give CONST_POLY_INT the same
> 	precedence as CONST_DOUBLE and put CONST_WIDE_INT between that
> 	and CONST_INT.
> 	* rtl-tests.c (const_poly_int_tests): New struct.
> 	(rtl_tests_c_tests): Use it.
> 	* simplify-rtx.c (simplify_const_unary_operation): Handle
> 	CONST_POLY_INT.
> 	(simplify_const_binary_operation): Likewise.
> 	(simplify_binary_operation_1): Fold additions of symbolic constants
> 	and CONST_POLY_INTs.
> 	(simplify_subreg): Handle extensions and truncations of
> 	CONST_POLY_INTs.
> 	(simplify_const_poly_int_tests): New struct.
> 	(simplify_rtx_c_tests): Use it.
> 	* wide-int.h (storage_ref): Add default constructor.
> 	(wide_int_ref_storage): Likewise.
> 	(trailing_wide_ints): Use GTY((user)).
> 	(trailing_wide_ints::operator[]): Add a const version.
> 	(trailing_wide_ints::get_precision): New function.
> 	(trailing_wide_ints::extra_size): Likewise.
Do we need to define anything WRT structure sharing in rtl.texi for a
CONST_POLY_INT?



>  
> Index: gcc/rtl.c
> ===================================================================
> --- gcc/rtl.c	2017-10-23 16:52:20.579835373 +0100
> +++ gcc/rtl.c	2017-10-23 17:00:54.443002147 +0100
> @@ -257,9 +261,10 @@ shared_const_p (const_rtx orig)
>  
>    /* CONST can be shared if it contains a SYMBOL_REF.  If it contains
>       a LABEL_REF, it isn't sharable.  */
> +  poly_int64 offset;
>    return (GET_CODE (XEXP (orig, 0)) == PLUS
>  	  && GET_CODE (XEXP (XEXP (orig, 0), 0)) == SYMBOL_REF
> -	  && CONST_INT_P (XEXP (XEXP (orig, 0), 1)));
> +	  && poly_int_rtx_p (XEXP (XEXP (orig, 0), 1), &offset));
Did this just change structure sharing for CONST_WIDE_INT?



> +  /* Create a new rtx.  There's a choice to be made here between installing
> +     the actual mode of the rtx or leaving it as VOIDmode (for consistency
> +     with CONST_INT).  In practice the handling of the codes is different
> +     enough that we get no benefit from using VOIDmode, and various places
> +     assume that VOIDmode implies CONST_INT.  Using the real mode seems like
> +     the right long-term direction anyway.  */
Certainly my preference is to get the mode in there.  I see modeless
CONST_INTs as a long standing wart and I'm not keen to repeat it.



> Index: gcc/wide-int.h
> ===================================================================
> --- gcc/wide-int.h	2017-10-23 17:00:20.923835582 +0100
> +++ gcc/wide-int.h	2017-10-23 17:00:54.445999420 +0100
> @@ -613,6 +613,7 @@ #define SHIFT_FUNCTION \
>       access.  */
>    struct storage_ref
>    {
> +    storage_ref () {}
>      storage_ref (const HOST_WIDE_INT *, unsigned int, unsigned int);
>  
>      const HOST_WIDE_INT *val;
> @@ -944,6 +945,8 @@ struct wide_int_ref_storage : public wi:
>    HOST_WIDE_INT scratch[2];
>  
>  public:
> +  wide_int_ref_storage () {}
> +
>    wide_int_ref_storage (const wi::storage_ref &);
>  
>    template <typename T>
So doesn't this play into the whole question about initialization of
these objects?  I'll defer on this hunk until we settle that
question, but the rest is OK.


Jeff
Richard Sandiford Dec. 15, 2017, 1:25 a.m. UTC | #2
Jeff Law <law@redhat.com> writes:
> On 10/23/2017 11:00 AM, Richard Sandiford wrote:
>> This patch adds an rtl representation of poly_int values.
>> There were three possible ways of doing this:
>> 
>> (1) Add a new rtl code for the poly_ints themselves and store the
>>     coefficients as trailing wide_ints.  This would give constants like:
>> 
>>       (const_poly_int [c0 c1 ... cn])
>> 
>>     The runtime value would be:
>> 
>>       c0 + c1 * x1 + ... + cn * xn
>> 
>> (2) Like (1), but use rtxes for the coefficients.  This would give
>>     constants like:
>> 
>>       (const_poly_int [(const_int c0)
>>                        (const_int c1)
>>                        ...
>>                        (const_int cn)])
>> 
>>     although the coefficients could be const_wide_ints instead
>>     of const_ints where appropriate.
>> 
>> (3) Add a new rtl code for the polynomial indeterminates,
>>     then use them in const wrappers.  A constant like c0 + c1 * x1
>>     would then look like:
>> 
>>       (const:M (plus:M (mult:M (const_param:M x1)
>>                                (const_int c1))
>>                        (const_int c0)))
>> 
>> There didn't seem to be that much to choose between them.  The main
>> advantage of (1) is that it's a more efficient representation and
>> that we can refer to the cofficients directly as wide_int_storage.
> Well, and #1 feels more like how we handle CONST_INT :-)
>> 
>> 
>> 2017-10-23  Richard Sandiford  <richard.sandiford@linaro.org>
>> 	    Alan Hayward  <alan.hayward@arm.com>
>> 	    David Sherwood  <david.sherwood@arm.com>
>> 
>> gcc/
>> 	* doc/rtl.texi (const_poly_int): Document.
>> 	* gengenrtl.c (excluded_rtx): Return true for CONST_POLY_INT.
>> 	* rtl.h (const_poly_int_def): New struct.
>> 	(rtx_def::u): Add a cpi field.
>> 	(CASE_CONST_UNIQUE, CASE_CONST_ANY): Add CONST_POLY_INT.
>> 	(CONST_POLY_INT_P, CONST_POLY_INT_COEFFS): New macros.
>> 	(wi::rtx_to_poly_wide_ref): New typedef
>> 	(const_poly_int_value, wi::to_poly_wide, rtx_to_poly_int64)
>> 	(poly_int_rtx_p): New functions.
>> 	(trunc_int_for_mode): Declare a poly_int64 version.
>> 	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
>> 	(immed_wide_int_const): Take a poly_wide_int_ref rather than
>> 	a wide_int_ref.
>> 	(strip_offset): Declare.
>> 	(strip_offset_and_add): New function.
>> 	* rtl.def (CONST_POLY_INT): New rtx code.
>> 	* rtl.c (rtx_size): Handle CONST_POLY_INT.
>> 	(shared_const_p): Use poly_int_rtx_p.
>> 	* emit-rtl.h (gen_int_mode): Take a poly_int64 instead of a
>> 	HOST_WIDE_INT.
>> 	(gen_int_shift_amount): Likewise.
>> 	* emit-rtl.c (const_poly_int_hasher): New class.
>> 	(const_poly_int_htab): New variable.
>> 	(init_emit_once): Initialize it when NUM_POLY_INT_COEFFS > 1.
>> 	(const_poly_int_hasher::hash): New function.
>> 	(const_poly_int_hasher::equal): Likewise.
>> 	(gen_int_mode): Take a poly_int64 instead of a HOST_WIDE_INT.
>> 	(immed_wide_int_const): Rename to...
>> 	(immed_wide_int_const_1): ...this and make static.
>> 	(immed_wide_int_const): New function, taking a poly_wide_int_ref
>> 	instead of a wide_int_ref.
>> 	(gen_int_shift_amount): Take a poly_int64 instead of a HOST_WIDE_INT.
>> 	(gen_lowpart_common): Handle CONST_POLY_INT.
>> 	* cse.c (hash_rtx_cb, equiv_constant): Likewise.
>> 	* cselib.c (cselib_hash_rtx): Likewise.
>> 	* dwarf2out.c (const_ok_for_output_1): Likewise.
>> 	* expr.c (convert_modes): Likewise.
>> 	* print-rtl.c (rtx_writer::print_rtx, print_value): Likewise.
>> 	* rtlhash.c (add_rtx): Likewise.
>> 	* explow.c (trunc_int_for_mode): Add a poly_int64 version.
>> 	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
>> 	Handle existing CONST_POLY_INT rtxes.
>> 	* expmed.h (expand_shift): Take a poly_int64 instead of a
>> 	HOST_WIDE_INT.
>> 	* expmed.c (expand_shift): Likewise.
>> 	* rtlanal.c (strip_offset): New function.
>> 	(commutative_operand_precedence): Give CONST_POLY_INT the same
>> 	precedence as CONST_DOUBLE and put CONST_WIDE_INT between that
>> 	and CONST_INT.
>> 	* rtl-tests.c (const_poly_int_tests): New struct.
>> 	(rtl_tests_c_tests): Use it.
>> 	* simplify-rtx.c (simplify_const_unary_operation): Handle
>> 	CONST_POLY_INT.
>> 	(simplify_const_binary_operation): Likewise.
>> 	(simplify_binary_operation_1): Fold additions of symbolic constants
>> 	and CONST_POLY_INTs.
>> 	(simplify_subreg): Handle extensions and truncations of
>> 	CONST_POLY_INTs.
>> 	(simplify_const_poly_int_tests): New struct.
>> 	(simplify_rtx_c_tests): Use it.
>> 	* wide-int.h (storage_ref): Add default constructor.
>> 	(wide_int_ref_storage): Likewise.
>> 	(trailing_wide_ints): Use GTY((user)).
>> 	(trailing_wide_ints::operator[]): Add a const version.
>> 	(trailing_wide_ints::get_precision): New function.
>> 	(trailing_wide_ints::extra_size): Likewise.
> Do we need to define anything WRT structure sharing in rtl.texi for a
> CONST_POLY_INT?

Good catch.  Fixed in the patch below.

>> Index: gcc/rtl.c
>> ===================================================================
>> --- gcc/rtl.c	2017-10-23 16:52:20.579835373 +0100
>> +++ gcc/rtl.c	2017-10-23 17:00:54.443002147 +0100
>> @@ -257,9 +261,10 @@ shared_const_p (const_rtx orig)
>>  
>>    /* CONST can be shared if it contains a SYMBOL_REF.  If it contains
>>       a LABEL_REF, it isn't sharable.  */
>> +  poly_int64 offset;
>>    return (GET_CODE (XEXP (orig, 0)) == PLUS
>>  	  && GET_CODE (XEXP (XEXP (orig, 0), 0)) == SYMBOL_REF
>> -	  && CONST_INT_P (XEXP (XEXP (orig, 0), 1)));
>> +	  && poly_int_rtx_p (XEXP (XEXP (orig, 0), 1), &offset));
> Did this just change structure sharing for CONST_WIDE_INT?

No, we'd only use CONST_WIDE_INT for things that don't fit in
poly_int64.
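
To spell that out (a sketch only, with an invented helper name): the
poly_int64 overload of poly_int_rtx_p accepts exactly what CONST_INT_P
accepted before, plus CONST_POLY_INTs whose coefficients fit in a
HOST_WIDE_INT, so a CONST_WIDE_INT offset still fails the test and its
sharing behaviour is unchanged:

      /* Sketch: how the new shared_const_p condition classifies the offset
         operand of a (const (plus (symbol_ref ...) ...)).  True for
         CONST_INT (as before) and for CONST_POLY_INTs that fit poly_int64;
         false for CONST_WIDE_INT.  */
      static bool
      sharable_offset_p (rtx offset_rtx)
      {
        poly_int64 offset;
        return poly_int_rtx_p (offset_rtx, &offset);
      }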

>> +  /* Create a new rtx.  There's a choice to be made here between installing
>> +     the actual mode of the rtx or leaving it as VOIDmode (for consistency
>> +     with CONST_INT).  In practice the handling of the codes is different
>> +     enough that we get no benefit from using VOIDmode, and various places
>> +     assume that VOIDmode implies CONST_INT.  Using the real mode seems like
>> +     the right long-term direction anyway.  */
> Certainly my preference is to get the mode in there.  I see modeless
> CONST_INTs as a long standing wart and I'm not keen to repeat it.

Yeah.  I still regularly hit problems related to modeless CONST_INTs
today (including the gen_int_shift_amount patch).

>> Index: gcc/wide-int.h
>> ===================================================================
>> --- gcc/wide-int.h	2017-10-23 17:00:20.923835582 +0100
>> +++ gcc/wide-int.h	2017-10-23 17:00:54.445999420 +0100
>> @@ -613,6 +613,7 @@ #define SHIFT_FUNCTION \
>>       access.  */
>>    struct storage_ref
>>    {
>> +    storage_ref () {}
>>      storage_ref (const HOST_WIDE_INT *, unsigned int, unsigned int);
>>  
>>      const HOST_WIDE_INT *val;
>> @@ -944,6 +945,8 @@ struct wide_int_ref_storage : public wi:
>>    HOST_WIDE_INT scratch[2];
>>  
>>  public:
>> +  wide_int_ref_storage () {}
>> +
>>    wide_int_ref_storage (const wi::storage_ref &);
>>  
>>    template <typename T>
> So doesn't this play into the whole question about initialization of
> these objects?  I'll defer on this hunk until we settle that
> question, but the rest is OK.

Any more thoughts on this?  In the end the 001 patch went in with
the empty constructors.  Like I say, I'm happy to switch to C++-11
"= default;" once we require C++11, but I think having well-defined
implicit construction would make switching to "= default" harder
in future.
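
For background, the two spellings aren't interchangeable in standard C++:
a user-provided empty constructor and "= default" behave differently under
value-initialization, so the eventual switch is more than a cosmetic change.
A standalone sketch (plain C++11, not GCC code):

      #include <cassert>

      struct with_empty_ctor
      {
        with_empty_ctor () {}               /* user-provided */
        int val;
      };

      struct with_defaulted_ctor
      {
        with_defaulted_ctor () = default;   /* not user-provided */
        int val;
      };

      int
      main ()
      {
        /* Value-initialization zero-initializes members when the default
           constructor is not user-provided...  */
        with_defaulted_ctor a = with_defaulted_ctor ();
        assert (a.val == 0);
        /* ...but leaves them indeterminate when it is, so don't read
           b.val here.  */
        with_empty_ctor b = with_empty_ctor ();
        (void) b;
        return 0;
      }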

Thanks,
Richard


2017-11-15  Richard Sandiford  <richard.sandiford@linaro.org>
	    Alan Hayward  <alan.hayward@arm.com>
	    David Sherwood  <david.sherwood@arm.com>

gcc/
	* doc/rtl.texi (const_poly_int): Document.  Also document the
	rtl sharing behavior.
	* gengenrtl.c (excluded_rtx): Return true for CONST_POLY_INT.
	* rtl.h (const_poly_int_def): New struct.
	(rtx_def::u): Add a cpi field.
	(CASE_CONST_UNIQUE, CASE_CONST_ANY): Add CONST_POLY_INT.
	(CONST_POLY_INT_P, CONST_POLY_INT_COEFFS): New macros.
	(wi::rtx_to_poly_wide_ref): New typedef.
	(const_poly_int_value, wi::to_poly_wide, rtx_to_poly_int64)
	(poly_int_rtx_p): New functions.
	(trunc_int_for_mode): Declare a poly_int64 version.
	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
	(immed_wide_int_const): Take a poly_wide_int_ref rather than
	a wide_int_ref.
	(strip_offset): Declare.
	(strip_offset_and_add): New function.
	* rtl.def (CONST_POLY_INT): New rtx code.
	* rtl.c (rtx_size): Handle CONST_POLY_INT.
	(shared_const_p): Use poly_int_rtx_p.
	* emit-rtl.h (gen_int_mode): Take a poly_int64 instead of a
	HOST_WIDE_INT.
	(gen_int_shift_amount): Likewise.
	* emit-rtl.c (const_poly_int_hasher): New class.
	(const_poly_int_htab): New variable.
	(init_emit_once): Initialize it when NUM_POLY_INT_COEFFS > 1.
	(const_poly_int_hasher::hash): New function.
	(const_poly_int_hasher::equal): Likewise.
	(gen_int_mode): Take a poly_int64 instead of a HOST_WIDE_INT.
	(immed_wide_int_const): Rename to...
	(immed_wide_int_const_1): ...this and make static.
	(immed_wide_int_const): New function, taking a poly_wide_int_ref
	instead of a wide_int_ref.
	(gen_int_shift_amount): Take a poly_int64 instead of a HOST_WIDE_INT.
	(gen_lowpart_common): Handle CONST_POLY_INT.
	* cse.c (hash_rtx_cb, equiv_constant): Likewise.
	* cselib.c (cselib_hash_rtx): Likewise.
	* dwarf2out.c (const_ok_for_output_1): Likewise.
	* expr.c (convert_modes): Likewise.
	* print-rtl.c (rtx_writer::print_rtx, print_value): Likewise.
	* rtlhash.c (add_rtx): Likewise.
	* explow.c (trunc_int_for_mode): Add a poly_int64 version.
	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
	Handle existing CONST_POLY_INT rtxes.
	* expmed.h (expand_shift): Take a poly_int64 instead of a
	HOST_WIDE_INT.
	* expmed.c (expand_shift): Likewise.
	* rtlanal.c (strip_offset): New function.
	(commutative_operand_precedence): Give CONST_POLY_INT the same
	precedence as CONST_DOUBLE and put CONST_WIDE_INT between that
	and CONST_INT.
	* rtl-tests.c (const_poly_int_tests): New struct.
	(rtl_tests_c_tests): Use it.
	* simplify-rtx.c (simplify_const_unary_operation): Handle
	CONST_POLY_INT.
	(simplify_const_binary_operation): Likewise.
	(simplify_binary_operation_1): Fold additions of symbolic constants
	and CONST_POLY_INTs.
	(simplify_subreg): Handle extensions and truncations of
	CONST_POLY_INTs.
	(simplify_const_poly_int_tests): New struct.
	(simplify_rtx_c_tests): Use it.
	* wide-int.h (storage_ref): Add default constructor.
	(wide_int_ref_storage): Likewise.
	(trailing_wide_ints): Use GTY((user)).
	(trailing_wide_ints::operator[]): Add a const version.
	(trailing_wide_ints::get_precision): New function.
	(trailing_wide_ints::extra_size): Likewise.

Index: gcc/doc/rtl.texi
===================================================================
--- gcc/doc/rtl.texi	2017-12-15 01:16:50.894351263 +0000
+++ gcc/doc/rtl.texi	2017-12-15 01:16:51.235339239 +0000
@@ -1633,6 +1633,15 @@ is accessed with the macro @code{CONST_F
 data is accessed with @code{CONST_FIXED_VALUE_HIGH}; the low part is
 accessed with @code{CONST_FIXED_VALUE_LOW}.
 
+@findex const_poly_int
+@item (const_poly_int:@var{m} [@var{c0} @var{c1} @dots{}])
+Represents a @code{poly_int}-style polynomial integer with coefficients
+@var{c0}, @var{c1}, @dots{}.  The coefficients are @code{wide_int}-based
+integers rather than rtxes.  @code{CONST_POLY_INT_COEFFS} gives the
+values of individual coefficients (which is mostly only useful in
+low-level routines) and @code{const_poly_int_value} gives the full
+@code{poly_int} value.
+
 @findex const_vector
 @item (const_vector:@var{m} [@var{x0} @var{x1} @dots{}])
 Represents a vector constant.  The square brackets stand for the vector
@@ -4236,6 +4245,11 @@ referring to it.
 @item
 All @code{const_int} expressions with equal values are shared.
 
+@cindex @code{const_poly_int}, RTL sharing
+@item
+All @code{const_poly_int} expressions with equal modes and values
+are shared.
+
 @cindex @code{pc}, RTL sharing
 @item
 There is only one @code{pc} expression.
Index: gcc/gengenrtl.c
===================================================================
--- gcc/gengenrtl.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/gengenrtl.c	2017-12-15 01:16:51.240339063 +0000
@@ -157,6 +157,7 @@ excluded_rtx (int idx)
   return (strcmp (defs[idx].enumname, "VAR_LOCATION") == 0
 	  || strcmp (defs[idx].enumname, "CONST_DOUBLE") == 0
 	  || strcmp (defs[idx].enumname, "CONST_WIDE_INT") == 0
+	  || strcmp (defs[idx].enumname, "CONST_POLY_INT") == 0
 	  || strcmp (defs[idx].enumname, "CONST_FIXED") == 0);
 }
 
Index: gcc/rtl.h
===================================================================
--- gcc/rtl.h	2017-12-15 01:16:50.894351263 +0000
+++ gcc/rtl.h	2017-12-15 01:16:51.241339028 +0000
@@ -280,6 +280,10 @@ #define CWI_GET_NUM_ELEM(RTX)					\
 #define CWI_PUT_NUM_ELEM(RTX, NUM)					\
   (RTL_FLAG_CHECK1("CWI_PUT_NUM_ELEM", (RTX), CONST_WIDE_INT)->u2.num_elem = (NUM))
 
+struct GTY((variable_size)) const_poly_int_def {
+  trailing_wide_ints<NUM_POLY_INT_COEFFS> coeffs;
+};
+
 /* RTL expression ("rtx").  */
 
 /* The GTY "desc" and "tag" options below are a kludge: we need a desc
@@ -424,6 +428,7 @@ struct GTY((desc("0"), tag("0"),
     struct real_value rv;
     struct fixed_value fv;
     struct hwivec_def hwiv;
+    struct const_poly_int_def cpi;
   } GTY ((special ("rtx_def"), desc ("GET_CODE (&%0)"))) u;
 };
 
@@ -734,6 +739,7 @@ #define CASE_CONST_SCALAR_INT \
 #define CASE_CONST_UNIQUE \
    case CONST_INT: \
    case CONST_WIDE_INT: \
+   case CONST_POLY_INT: \
    case CONST_DOUBLE: \
    case CONST_FIXED
 
@@ -741,6 +747,7 @@ #define CASE_CONST_UNIQUE \
 #define CASE_CONST_ANY \
    case CONST_INT: \
    case CONST_WIDE_INT: \
+   case CONST_POLY_INT: \
    case CONST_DOUBLE: \
    case CONST_FIXED: \
    case CONST_VECTOR
@@ -773,6 +780,11 @@ #define CONST_INT_P(X) (GET_CODE (X) ==
 /* Predicate yielding nonzero iff X is an rtx for a constant integer.  */
 #define CONST_WIDE_INT_P(X) (GET_CODE (X) == CONST_WIDE_INT)
 
+/* Predicate yielding nonzero iff X is an rtx for a polynomial constant
+   integer.  */
+#define CONST_POLY_INT_P(X) \
+  (NUM_POLY_INT_COEFFS > 1 && GET_CODE (X) == CONST_POLY_INT)
+
 /* Predicate yielding nonzero iff X is an rtx for a constant fixed-point.  */
 #define CONST_FIXED_P(X) (GET_CODE (X) == CONST_FIXED)
 
@@ -1914,6 +1926,12 @@ #define CONST_WIDE_INT_VEC(RTX) HWIVEC_C
 #define CONST_WIDE_INT_NUNITS(RTX) CWI_GET_NUM_ELEM (RTX)
 #define CONST_WIDE_INT_ELT(RTX, N) CWI_ELT (RTX, N)
 
+/* For a CONST_POLY_INT, CONST_POLY_INT_COEFFS gives access to the
+   individual coefficients, in the form of a trailing_wide_ints structure.  */
+#define CONST_POLY_INT_COEFFS(RTX) \
+  (RTL_FLAG_CHECK1("CONST_POLY_INT_COEFFS", (RTX), \
+		   CONST_POLY_INT)->u.cpi.coeffs)
+
 /* For a CONST_DOUBLE:
 #if TARGET_SUPPORTS_WIDE_INT == 0
    For a VOIDmode, there are two integers CONST_DOUBLE_LOW is the
@@ -2227,6 +2245,84 @@ wi::max_value (machine_mode mode, signop
   return max_value (GET_MODE_PRECISION (as_a <scalar_mode> (mode)), sgn);
 }
 
+namespace wi
+{
+  typedef poly_int<NUM_POLY_INT_COEFFS,
+		   generic_wide_int <wide_int_ref_storage <false, false> > >
+    rtx_to_poly_wide_ref;
+  rtx_to_poly_wide_ref to_poly_wide (const_rtx, machine_mode);
+}
+
+/* Return the value of a CONST_POLY_INT in its native precision.  */
+
+inline wi::rtx_to_poly_wide_ref
+const_poly_int_value (const_rtx x)
+{
+  poly_int<NUM_POLY_INT_COEFFS, WIDE_INT_REF_FOR (wide_int)> res;
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    res.coeffs[i] = CONST_POLY_INT_COEFFS (x)[i];
+  return res;
+}
+
+/* Return true if X is a scalar integer or a CONST_POLY_INT.  The value
+   can then be extracted using wi::to_poly_wide.  */
+
+inline bool
+poly_int_rtx_p (const_rtx x)
+{
+  return CONST_SCALAR_INT_P (x) || CONST_POLY_INT_P (x);
+}
+
+/* Access X (which satisfies poly_int_rtx_p) as a poly_wide_int.
+   MODE is the mode of X.  */
+
+inline wi::rtx_to_poly_wide_ref
+wi::to_poly_wide (const_rtx x, machine_mode mode)
+{
+  if (CONST_POLY_INT_P (x))
+    return const_poly_int_value (x);
+  return rtx_mode_t (const_cast<rtx> (x), mode);
+}
+
+/* Return the value of X as a poly_int64.  */
+
+inline poly_int64
+rtx_to_poly_int64 (const_rtx x)
+{
+  if (CONST_POLY_INT_P (x))
+    {
+      poly_int64 res;
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	res.coeffs[i] = CONST_POLY_INT_COEFFS (x)[i].to_shwi ();
+      return res;
+    }
+  return INTVAL (x);
+}
+
+/* Return true if arbitrary value X is an integer constant that can
+   be represented as a poly_int64.  Store the value in *RES if so,
+   otherwise leave it unmodified.  */
+
+inline bool
+poly_int_rtx_p (const_rtx x, poly_int64_pod *res)
+{
+  if (CONST_INT_P (x))
+    {
+      *res = INTVAL (x);
+      return true;
+    }
+  if (CONST_POLY_INT_P (x))
+    {
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	if (!wi::fits_shwi_p (CONST_POLY_INT_COEFFS (x)[i]))
+	  return false;
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	res->coeffs[i] = CONST_POLY_INT_COEFFS (x)[i].to_shwi ();
+      return true;
+    }
+  return false;
+}
+
 extern void init_rtlanal (void);
 extern int rtx_cost (rtx, machine_mode, enum rtx_code, int, bool);
 extern int address_cost (rtx, machine_mode, addr_space_t, bool);
@@ -2764,7 +2860,8 @@ #define EXTRACT_ARGS_IN_RANGE(SIZE, POS,
 
 /* In explow.c */
 extern HOST_WIDE_INT trunc_int_for_mode	(HOST_WIDE_INT, machine_mode);
-extern rtx plus_constant (machine_mode, rtx, HOST_WIDE_INT, bool = false);
+extern poly_int64 trunc_int_for_mode (poly_int64, machine_mode);
+extern rtx plus_constant (machine_mode, rtx, poly_int64, bool = false);
 extern HOST_WIDE_INT get_stack_check_protect (void);
 
 /* In rtl.c */
@@ -3075,13 +3172,11 @@ extern void end_sequence (void);
 extern double_int rtx_to_double_int (const_rtx);
 #endif
 extern void cwi_output_hex (FILE *, const_rtx);
-#ifndef GENERATOR_FILE
-extern rtx immed_wide_int_const (const wide_int_ref &, machine_mode);
-#endif
 #if TARGET_SUPPORTS_WIDE_INT == 0
 extern rtx immed_double_const (HOST_WIDE_INT, HOST_WIDE_INT,
 			       machine_mode);
 #endif
+extern rtx immed_wide_int_const (const poly_wide_int_ref &, machine_mode);
 
 /* In varasm.c  */
 extern rtx force_const_mem (machine_mode, rtx);
@@ -3269,6 +3364,7 @@ extern HOST_WIDE_INT get_integer_term (c
 extern rtx get_related_value (const_rtx);
 extern bool offset_within_block_p (const_rtx, HOST_WIDE_INT);
 extern void split_const (rtx, rtx *, rtx *);
+extern rtx strip_offset (rtx, poly_int64_pod *);
 extern bool unsigned_reg_p (rtx);
 extern int reg_mentioned_p (const_rtx, const_rtx);
 extern int count_occurrences (const_rtx, const_rtx, int);
@@ -4203,6 +4299,21 @@ load_extend_op (machine_mode mode)
   return UNKNOWN;
 }
 
+/* If X is a PLUS of a base and a constant offset, add the constant to *OFFSET
+   and return the base.  Return X otherwise.  */
+
+inline rtx
+strip_offset_and_add (rtx x, poly_int64_pod *offset)
+{
+  if (GET_CODE (x) == PLUS)
+    {
+      poly_int64 suboffset;
+      x = strip_offset (x, &suboffset);
+      *offset += suboffset;
+    }
+  return x;
+}
+
 /* gtype-desc.c.  */
 extern void gt_ggc_mx (rtx &);
 extern void gt_pch_nx (rtx &);
Index: gcc/rtl.def
===================================================================
--- gcc/rtl.def	2017-12-15 01:16:50.894351263 +0000
+++ gcc/rtl.def	2017-12-15 01:16:51.240339063 +0000
@@ -348,6 +348,9 @@ DEF_RTL_EXPR(CONST_INT, "const_int", "w"
 /* numeric integer constant */
 DEF_RTL_EXPR(CONST_WIDE_INT, "const_wide_int", "", RTX_CONST_OBJ)
 
+/* An rtx representation of a poly_wide_int.  */
+DEF_RTL_EXPR(CONST_POLY_INT, "const_poly_int", "", RTX_CONST_OBJ)
+
 /* fixed-point constant */
 DEF_RTL_EXPR(CONST_FIXED, "const_fixed", "www", RTX_CONST_OBJ)
 
Index: gcc/rtl.c
===================================================================
--- gcc/rtl.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/rtl.c	2017-12-15 01:16:51.240339063 +0000
@@ -189,6 +189,10 @@ rtx_size (const_rtx x)
 	    + sizeof (struct hwivec_def)
 	    + ((CONST_WIDE_INT_NUNITS (x) - 1)
 	       * sizeof (HOST_WIDE_INT)));
+  if (CONST_POLY_INT_P (x))
+    return (RTX_HDR_SIZE
+	    + sizeof (struct const_poly_int_def)
+	    + CONST_POLY_INT_COEFFS (x).extra_size ());
   if (GET_CODE (x) == SYMBOL_REF && SYMBOL_REF_HAS_BLOCK_INFO_P (x))
     return RTX_HDR_SIZE + sizeof (struct block_symbol);
   return RTX_CODE_SIZE (GET_CODE (x));
@@ -257,9 +261,10 @@ shared_const_p (const_rtx orig)
 
   /* CONST can be shared if it contains a SYMBOL_REF.  If it contains
      a LABEL_REF, it isn't sharable.  */
+  poly_int64 offset;
   return (GET_CODE (XEXP (orig, 0)) == PLUS
 	  && GET_CODE (XEXP (XEXP (orig, 0), 0)) == SYMBOL_REF
-	  && CONST_INT_P (XEXP (XEXP (orig, 0), 1)));
+	  && poly_int_rtx_p (XEXP (XEXP (orig, 0), 1), &offset));
 }
 
 
Index: gcc/emit-rtl.h
===================================================================
--- gcc/emit-rtl.h	2017-12-15 01:16:50.894351263 +0000
+++ gcc/emit-rtl.h	2017-12-15 01:16:51.238339134 +0000
@@ -362,14 +362,14 @@ extern rtvec gen_rtvec (int, ...);
 extern rtx copy_insn_1 (rtx);
 extern rtx copy_insn (rtx);
 extern rtx_insn *copy_delay_slot_insn (rtx_insn *);
-extern rtx gen_int_mode (HOST_WIDE_INT, machine_mode);
+extern rtx gen_int_mode (poly_int64, machine_mode);
 extern rtx_insn *emit_copy_of_insn_after (rtx_insn *, rtx_insn *);
 extern void set_reg_attrs_from_value (rtx, rtx);
 extern void set_reg_attrs_for_parm (rtx, rtx);
 extern void set_reg_attrs_for_decl_rtl (tree t, rtx x);
 extern void adjust_reg_mode (rtx, machine_mode);
 extern int mem_expr_equal_p (const_tree, const_tree);
-extern rtx gen_int_shift_amount (machine_mode, HOST_WIDE_INT);
+extern rtx gen_int_shift_amount (machine_mode, poly_int64);
 
 extern bool need_atomic_barrier_p (enum memmodel, bool);
 
Index: gcc/emit-rtl.c
===================================================================
--- gcc/emit-rtl.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/emit-rtl.c	2017-12-15 01:16:51.238339134 +0000
@@ -148,6 +148,16 @@ struct const_wide_int_hasher : ggc_cache
 
 static GTY ((cache)) hash_table<const_wide_int_hasher> *const_wide_int_htab;
 
+struct const_poly_int_hasher : ggc_cache_ptr_hash<rtx_def>
+{
+  typedef std::pair<machine_mode, poly_wide_int_ref> compare_type;
+
+  static hashval_t hash (rtx x);
+  static bool equal (rtx x, const compare_type &y);
+};
+
+static GTY ((cache)) hash_table<const_poly_int_hasher> *const_poly_int_htab;
+
 /* A hash table storing register attribute structures.  */
 struct reg_attr_hasher : ggc_cache_ptr_hash<reg_attrs>
 {
@@ -257,6 +267,31 @@ const_wide_int_hasher::equal (rtx x, rtx
 }
 #endif
 
+/* Returns a hash code for CONST_POLY_INT X.  */
+
+hashval_t
+const_poly_int_hasher::hash (rtx x)
+{
+  inchash::hash h;
+  h.add_int (GET_MODE (x));
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    h.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]);
+  return h.end ();
+}
+
+/* Returns nonzero if CONST_POLY_INT X is an rtx representation of Y.  */
+
+bool
+const_poly_int_hasher::equal (rtx x, const compare_type &y)
+{
+  if (GET_MODE (x) != y.first)
+    return false;
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    if (CONST_POLY_INT_COEFFS (x)[i] != y.second.coeffs[i])
+      return false;
+  return true;
+}
+
 /* Returns a hash code for X (which is really a CONST_DOUBLE).  */
 hashval_t
 const_double_hasher::hash (rtx x)
@@ -520,9 +555,13 @@ gen_rtx_CONST_INT (machine_mode mode ATT
 }
 
 rtx
-gen_int_mode (HOST_WIDE_INT c, machine_mode mode)
+gen_int_mode (poly_int64 c, machine_mode mode)
 {
-  return GEN_INT (trunc_int_for_mode (c, mode));
+  c = trunc_int_for_mode (c, mode);
+  if (c.is_constant ())
+    return GEN_INT (c.coeffs[0]);
+  unsigned int prec = GET_MODE_PRECISION (as_a <scalar_mode> (mode));
+  return immed_wide_int_const (poly_wide_int::from (c, prec, SIGNED), mode);
 }
 
 /* CONST_DOUBLEs might be created from pairs of integers, or from
@@ -626,8 +665,8 @@ lookup_const_wide_int (rtx wint)
    a CONST_DOUBLE (if !TARGET_SUPPORTS_WIDE_INT) or a CONST_WIDE_INT
    (if TARGET_SUPPORTS_WIDE_INT).  */
 
-rtx
-immed_wide_int_const (const wide_int_ref &v, machine_mode mode)
+static rtx
+immed_wide_int_const_1 (const wide_int_ref &v, machine_mode mode)
 {
   unsigned int len = v.get_len ();
   /* Not scalar_int_mode because we also allow pointer bound modes.  */
@@ -714,6 +753,53 @@ immed_double_const (HOST_WIDE_INT i0, HO
 }
 #endif
 
+/* Return an rtx representation of C in mode MODE.  */
+
+rtx
+immed_wide_int_const (const poly_wide_int_ref &c, machine_mode mode)
+{
+  if (c.is_constant ())
+    return immed_wide_int_const_1 (c.coeffs[0], mode);
+
+  /* Not scalar_int_mode because we also allow pointer bound modes.  */
+  unsigned int prec = GET_MODE_PRECISION (as_a <scalar_mode> (mode));
+
+  /* Allow truncation but not extension since we do not know if the
+     number is signed or unsigned.  */
+  gcc_assert (prec <= c.coeffs[0].get_precision ());
+  poly_wide_int newc = poly_wide_int::from (c, prec, SIGNED);
+
+  /* See whether we already have an rtx for this constant.  */
+  inchash::hash h;
+  h.add_int (mode);
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    h.add_wide_int (newc.coeffs[i]);
+  const_poly_int_hasher::compare_type typed_value (mode, newc);
+  rtx *slot = const_poly_int_htab->find_slot_with_hash (typed_value,
+							h.end (), INSERT);
+  rtx x = *slot;
+  if (x)
+    return x;
+
+  /* Create a new rtx.  There's a choice to be made here between installing
+     the actual mode of the rtx or leaving it as VOIDmode (for consistency
+     with CONST_INT).  In practice the handling of the codes is different
+     enough that we get no benefit from using VOIDmode, and various places
+     assume that VOIDmode implies CONST_INT.  Using the real mode seems like
+     the right long-term direction anyway.  */
+  typedef trailing_wide_ints<NUM_POLY_INT_COEFFS> twi;
+  size_t extra_size = twi::extra_size (prec);
+  x = rtx_alloc_v (CONST_POLY_INT,
+		   sizeof (struct const_poly_int_def) + extra_size);
+  PUT_MODE (x, mode);
+  CONST_POLY_INT_COEFFS (x).set_precision (prec);
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    CONST_POLY_INT_COEFFS (x)[i] = newc.coeffs[i];
+
+  *slot = x;
+  return x;
+}
+
 rtx
 gen_rtx_REG (machine_mode mode, unsigned int regno)
 {
@@ -1517,7 +1603,8 @@ gen_lowpart_common (machine_mode mode, r
     }
   else if (GET_CODE (x) == SUBREG || REG_P (x)
 	   || GET_CODE (x) == CONCAT || const_vec_p (x)
-	   || CONST_DOUBLE_AS_FLOAT_P (x) || CONST_SCALAR_INT_P (x))
+	   || CONST_DOUBLE_AS_FLOAT_P (x) || CONST_SCALAR_INT_P (x)
+	   || CONST_POLY_INT_P (x))
     return lowpart_subreg (mode, x, innermode);
 
   /* Otherwise, we can't do this.  */
@@ -6124,6 +6211,9 @@ init_emit_once (void)
 #endif
   const_double_htab = hash_table<const_double_hasher>::create_ggc (37);
 
+  if (NUM_POLY_INT_COEFFS > 1)
+    const_poly_int_htab = hash_table<const_poly_int_hasher>::create_ggc (37);
+
   const_fixed_htab = hash_table<const_fixed_hasher>::create_ggc (37);
 
   reg_attrs_htab = hash_table<reg_attr_hasher>::create_ggc (37);
@@ -6517,7 +6607,7 @@ need_atomic_barrier_p (enum memmodel mod
    by VALUE bits.  */
 
 rtx
-gen_int_shift_amount (machine_mode mode, HOST_WIDE_INT value)
+gen_int_shift_amount (machine_mode mode, poly_int64 value)
 {
   /* ??? Using the inner mode should be wide enough for all useful
      cases (e.g. QImode usually has 8 shiftable bits, while a QImode
Index: gcc/cse.c
===================================================================
--- gcc/cse.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/cse.c	2017-12-15 01:16:51.234339275 +0000
@@ -2323,6 +2323,15 @@ hash_rtx_cb (const_rtx x, machine_mode m
 	hash += CONST_WIDE_INT_ELT (x, i);
       return hash;
 
+    case CONST_POLY_INT:
+      {
+	inchash::hash h;
+	h.add_int (hash);
+	for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	  h.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]);
+	return h.end ();
+      }
+
     case CONST_DOUBLE:
       /* This is like the general case, except that it only counts
 	 the integers representing the constant.  */
@@ -3781,6 +3790,8 @@ equiv_constant (rtx x)
       /* See if we previously assigned a constant value to this SUBREG.  */
       if ((new_rtx = lookup_as_function (x, CONST_INT)) != 0
 	  || (new_rtx = lookup_as_function (x, CONST_WIDE_INT)) != 0
+	  || (NUM_POLY_INT_COEFFS > 1
+	      && (new_rtx = lookup_as_function (x, CONST_POLY_INT)) != 0)
           || (new_rtx = lookup_as_function (x, CONST_DOUBLE)) != 0
           || (new_rtx = lookup_as_function (x, CONST_FIXED)) != 0)
         return new_rtx;
Index: gcc/cselib.c
===================================================================
--- gcc/cselib.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/cselib.c	2017-12-15 01:16:51.234339275 +0000
@@ -1128,6 +1128,15 @@ cselib_hash_rtx (rtx x, int create, mach
 	hash += CONST_WIDE_INT_ELT (x, i);
       return hash;
 
+    case CONST_POLY_INT:
+      {
+	inchash::hash h;
+	h.add_int (hash);
+	for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	  h.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]);
+	return h.end ();
+      }
+
     case CONST_DOUBLE:
       /* This is like the general case, except that it only counts
 	 the integers representing the constant.  */
Index: gcc/dwarf2out.c
===================================================================
--- gcc/dwarf2out.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/dwarf2out.c	2017-12-15 01:16:51.237339169 +0000
@@ -13781,6 +13781,16 @@ const_ok_for_output_1 (rtx rtl)
       return false;
     }
 
+  if (CONST_POLY_INT_P (rtl))
+    return false;
+
+  if (targetm.const_not_ok_for_debug_p (rtl))
+    {
+      expansion_failed (NULL_TREE, rtl,
+			"Expression rejected for debug by the backend.\n");
+      return false;
+    }
+
   /* FIXME: Refer to PR60655. It is possible for simplification
      of rtl expressions in var tracking to produce such expressions.
      We should really identify / validate expressions
Index: gcc/expr.c
===================================================================
--- gcc/expr.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/expr.c	2017-12-15 01:16:51.239339098 +0000
@@ -692,6 +692,7 @@ convert_modes (machine_mode mode, machin
       && is_int_mode (oldmode, &int_oldmode)
       && GET_MODE_PRECISION (int_mode) <= GET_MODE_PRECISION (int_oldmode)
       && ((MEM_P (x) && !MEM_VOLATILE_P (x) && direct_load[(int) int_mode])
+	  || CONST_POLY_INT_P (x)
           || (REG_P (x)
               && (!HARD_REGISTER_P (x)
 		  || targetm.hard_regno_mode_ok (REGNO (x), int_mode))
Index: gcc/print-rtl.c
===================================================================
--- gcc/print-rtl.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/print-rtl.c	2017-12-15 01:16:51.240339063 +0000
@@ -908,6 +908,17 @@ rtx_writer::print_rtx (const_rtx in_rtx)
       fprintf (m_outfile, " ");
       cwi_output_hex (m_outfile, in_rtx);
       break;
+
+    case CONST_POLY_INT:
+      fprintf (m_outfile, " [");
+      print_dec (CONST_POLY_INT_COEFFS (in_rtx)[0], m_outfile, SIGNED);
+      for (unsigned int i = 1; i < NUM_POLY_INT_COEFFS; ++i)
+	{
+	  fprintf (m_outfile, ", ");
+	  print_dec (CONST_POLY_INT_COEFFS (in_rtx)[i], m_outfile, SIGNED);
+	}
+      fprintf (m_outfile, "]");
+      break;
 #endif
 
     case CODE_LABEL:
@@ -1595,6 +1606,17 @@ print_value (pretty_printer *pp, const_r
       }
       break;
 
+    case CONST_POLY_INT:
+      pp_left_bracket (pp);
+      pp_wide_int (pp, CONST_POLY_INT_COEFFS (x)[0], SIGNED);
+      for (unsigned int i = 1; i < NUM_POLY_INT_COEFFS; ++i)
+	{
+	  pp_string (pp, ", ");
+	  pp_wide_int (pp, CONST_POLY_INT_COEFFS (x)[i], SIGNED);
+	}
+      pp_right_bracket (pp);
+      break;
+
     case CONST_DOUBLE:
       if (FLOAT_MODE_P (GET_MODE (x)))
 	{
Index: gcc/rtlhash.c
===================================================================
--- gcc/rtlhash.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/rtlhash.c	2017-12-15 01:16:51.241339028 +0000
@@ -55,6 +55,10 @@ add_rtx (const_rtx x, hash &hstate)
       for (i = 0; i < CONST_WIDE_INT_NUNITS (x); i++)
 	hstate.add_object (CONST_WIDE_INT_ELT (x, i));
       return;
+    case CONST_POLY_INT:
+      for (i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	hstate.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]);
+      break;
     case SYMBOL_REF:
       if (XSTR (x, 0))
 	hstate.add (XSTR (x, 0), strlen (XSTR (x, 0)) + 1);
Index: gcc/explow.c
===================================================================
--- gcc/explow.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/explow.c	2017-12-15 01:16:51.238339134 +0000
@@ -77,13 +77,23 @@ trunc_int_for_mode (HOST_WIDE_INT c, mac
   return c;
 }
 
+/* Likewise for polynomial values, using the sign-extended representation
+   for each individual coefficient.  */
+
+poly_int64
+trunc_int_for_mode (poly_int64 x, machine_mode mode)
+{
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    x.coeffs[i] = trunc_int_for_mode (x.coeffs[i], mode);
+  return x;
+}
+
 /* Return an rtx for the sum of X and the integer C, given that X has
    mode MODE.  INPLACE is true if X can be modified inplace or false
    if it must be treated as immutable.  */
 
 rtx
-plus_constant (machine_mode mode, rtx x, HOST_WIDE_INT c,
-	       bool inplace)
+plus_constant (machine_mode mode, rtx x, poly_int64 c, bool inplace)
 {
   RTX_CODE code;
   rtx y;
@@ -92,7 +102,7 @@ plus_constant (machine_mode mode, rtx x,
 
   gcc_assert (GET_MODE (x) == VOIDmode || GET_MODE (x) == mode);
 
-  if (c == 0)
+  if (known_eq (c, 0))
     return x;
 
  restart:
@@ -180,10 +190,12 @@ plus_constant (machine_mode mode, rtx x,
       break;
 
     default:
+      if (CONST_POLY_INT_P (x))
+	return immed_wide_int_const (const_poly_int_value (x) + c, mode);
       break;
     }
 
-  if (c != 0)
+  if (maybe_ne (c, 0))
     x = gen_rtx_PLUS (mode, x, gen_int_mode (c, mode));
 
   if (GET_CODE (x) == SYMBOL_REF || GET_CODE (x) == LABEL_REF)
Index: gcc/expmed.h
===================================================================
--- gcc/expmed.h	2017-12-15 01:16:50.894351263 +0000
+++ gcc/expmed.h	2017-12-15 01:16:51.239339098 +0000
@@ -712,8 +712,8 @@ extern unsigned HOST_WIDE_INT choose_mul
 #ifdef TREE_CODE
 extern rtx expand_variable_shift (enum tree_code, machine_mode,
 				  rtx, tree, rtx, int);
-extern rtx expand_shift (enum tree_code, machine_mode, rtx, int, rtx,
-			     int);
+extern rtx expand_shift (enum tree_code, machine_mode, rtx, poly_int64, rtx,
+			 int);
 extern rtx expand_divmod (int, enum tree_code, machine_mode, rtx, rtx,
 			  rtx, int);
 #endif
Index: gcc/expmed.c
===================================================================
--- gcc/expmed.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/expmed.c	2017-12-15 01:16:51.239339098 +0000
@@ -2541,7 +2541,7 @@ expand_shift_1 (enum tree_code code, mac
 
 rtx
 expand_shift (enum tree_code code, machine_mode mode, rtx shifted,
-	      int amount, rtx target, int unsignedp)
+	      poly_int64 amount, rtx target, int unsignedp)
 {
   return expand_shift_1 (code, mode, shifted,
 			 gen_int_shift_amount (mode, amount),
Index: gcc/rtlanal.c
===================================================================
--- gcc/rtlanal.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/rtlanal.c	2017-12-15 01:16:51.241339028 +0000
@@ -915,6 +915,28 @@ split_const (rtx x, rtx *base_out, rtx *
   *base_out = x;
   *offset_out = const0_rtx;
 }
+
+/* Express integer value X as some value Y plus a polynomial offset,
+   where Y is either const0_rtx, X or something within X (as opposed
+   to a new rtx).  Return the Y and store the offset in *OFFSET_OUT.  */
+
+rtx
+strip_offset (rtx x, poly_int64_pod *offset_out)
+{
+  rtx base = const0_rtx;
+  rtx test = x;
+  if (GET_CODE (test) == CONST)
+    test = XEXP (test, 0);
+  if (GET_CODE (test) == PLUS)
+    {
+      base = XEXP (test, 0);
+      test = XEXP (test, 1);
+    }
+  if (poly_int_rtx_p (test, offset_out))
+    return base;
+  *offset_out = 0;
+  return x;
+}
 
 /* Return the number of places FIND appears within X.  If COUNT_DEST is
    zero, we do not count occurrences inside the destination of a SET.  */
@@ -3406,13 +3428,15 @@ commutative_operand_precedence (rtx op)
 
   /* Constants always become the second operand.  Prefer "nice" constants.  */
   if (code == CONST_INT)
-    return -8;
+    return -10;
   if (code == CONST_WIDE_INT)
-    return -7;
+    return -9;
+  if (code == CONST_POLY_INT)
+    return -8;
   if (code == CONST_DOUBLE)
-    return -7;
+    return -8;
   if (code == CONST_FIXED)
-    return -7;
+    return -8;
   op = avoid_constant_pool_reference (op);
   code = GET_CODE (op);
 
@@ -3420,13 +3444,15 @@ commutative_operand_precedence (rtx op)
     {
     case RTX_CONST_OBJ:
       if (code == CONST_INT)
-        return -6;
+	return -7;
       if (code == CONST_WIDE_INT)
-        return -6;
+	return -6;
+      if (code == CONST_POLY_INT)
+	return -5;
       if (code == CONST_DOUBLE)
-        return -5;
+	return -5;
       if (code == CONST_FIXED)
-        return -5;
+	return -5;
       return -4;
 
     case RTX_EXTRA:
Index: gcc/rtl-tests.c
===================================================================
--- gcc/rtl-tests.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/rtl-tests.c	2017-12-15 01:16:51.240339063 +0000
@@ -228,6 +228,62 @@ test_uncond_jump ()
 		      jump_insn);
 }
 
+template<unsigned int N>
+struct const_poly_int_tests
+{
+  static void run ();
+};
+
+template<>
+struct const_poly_int_tests<1>
+{
+  static void run () {}
+};
+
+/* Test various CONST_POLY_INT properties.  */
+
+template<unsigned int N>
+void
+const_poly_int_tests<N>::run ()
+{
+  rtx x1 = gen_int_mode (poly_int64 (1, 1), QImode);
+  rtx x255 = gen_int_mode (poly_int64 (1, 255), QImode);
+
+  /* Test that constants are unique.  */
+  ASSERT_EQ (x1, gen_int_mode (poly_int64 (1, 1), QImode));
+  ASSERT_NE (x1, gen_int_mode (poly_int64 (1, 1), HImode));
+  ASSERT_NE (x1, x255);
+
+  /* Test const_poly_int_value.  */
+  ASSERT_KNOWN_EQ (const_poly_int_value (x1), poly_int64 (1, 1));
+  ASSERT_KNOWN_EQ (const_poly_int_value (x255), poly_int64 (1, -1));
+
+  /* Test rtx_to_poly_int64.  */
+  ASSERT_KNOWN_EQ (rtx_to_poly_int64 (x1), poly_int64 (1, 1));
+  ASSERT_KNOWN_EQ (rtx_to_poly_int64 (x255), poly_int64 (1, -1));
+  ASSERT_MAYBE_NE (rtx_to_poly_int64 (x255), poly_int64 (1, 255));
+
+  /* Test plus_constant of a symbol.  */
+  rtx symbol = gen_rtx_SYMBOL_REF (Pmode, "foo");
+  rtx offset1 = gen_int_mode (poly_int64 (9, 11), Pmode);
+  rtx sum1 = gen_rtx_CONST (Pmode, gen_rtx_PLUS (Pmode, symbol, offset1));
+  ASSERT_RTX_EQ (plus_constant (Pmode, symbol, poly_int64 (9, 11)), sum1);
+
+  /* Test plus_constant of a CONST.  */
+  rtx offset2 = gen_int_mode (poly_int64 (12, 20), Pmode);
+  rtx sum2 = gen_rtx_CONST (Pmode, gen_rtx_PLUS (Pmode, symbol, offset2));
+  ASSERT_RTX_EQ (plus_constant (Pmode, sum1, poly_int64 (3, 9)), sum2);
+
+  /* Test a cancelling plus_constant.  */
+  ASSERT_EQ (plus_constant (Pmode, sum2, poly_int64 (-12, -20)), symbol);
+
+  /* Test plus_constant on integer constants.  */
+  ASSERT_EQ (plus_constant (QImode, const1_rtx, poly_int64 (4, -2)),
+	     gen_int_mode (poly_int64 (5, -2), QImode));
+  ASSERT_EQ (plus_constant (QImode, x1, poly_int64 (4, -2)),
+	     gen_int_mode (poly_int64 (5, -1), QImode));
+}
+
 /* Run all of the selftests within this file.  */
 
 void
@@ -238,6 +294,7 @@ rtl_tests_c_tests ()
   test_dumping_rtx_reuse ();
   test_single_set ();
   test_uncond_jump ();
+  const_poly_int_tests<NUM_POLY_INT_COEFFS>::run ();
 
   /* Purge state.  */
   set_first_insn (NULL);
Index: gcc/simplify-rtx.c
===================================================================
--- gcc/simplify-rtx.c	2017-12-15 01:16:50.894351263 +0000
+++ gcc/simplify-rtx.c	2017-12-15 01:16:51.242338992 +0000
@@ -2038,6 +2038,26 @@ simplify_const_unary_operation (enum rtx
 	}
     }
 
+  /* Handle polynomial integers.  */
+  else if (CONST_POLY_INT_P (op))
+    {
+      poly_wide_int result;
+      switch (code)
+	{
+	case NEG:
+	  result = -const_poly_int_value (op);
+	  break;
+
+	case NOT:
+	  result = ~const_poly_int_value (op);
+	  break;
+
+	default:
+	  return NULL_RTX;
+	}
+      return immed_wide_int_const (result, mode);
+    }
+
   return NULL_RTX;
 }
 
@@ -2218,6 +2238,7 @@ simplify_binary_operation_1 (enum rtx_co
   rtx tem, reversed, opleft, opright, elt0, elt1;
   HOST_WIDE_INT val;
   scalar_int_mode int_mode, inner_mode;
+  poly_int64 offset;
 
   /* Even if we can't compute a constant result,
      there are some cases worth simplifying.  */
@@ -2530,6 +2551,12 @@ simplify_binary_operation_1 (enum rtx_co
 	    return simplify_gen_binary (MINUS, mode, tem, XEXP (op0, 0));
 	}
 
+      if ((GET_CODE (op0) == CONST
+	   || GET_CODE (op0) == SYMBOL_REF
+	   || GET_CODE (op0) == LABEL_REF)
+	  && poly_int_rtx_p (op1, &offset))
+	return plus_constant (mode, op0, trunc_int_for_mode (-offset, mode));
+
       /* Don't let a relocatable value get a negative coeff.  */
       if (CONST_INT_P (op1) && GET_MODE (op0) != VOIDmode)
 	return simplify_gen_binary (PLUS, mode,
@@ -4327,6 +4354,57 @@ simplify_const_binary_operation (enum rt
       return immed_wide_int_const (result, int_mode);
     }
 
+  /* Handle polynomial integers.  */
+  if (NUM_POLY_INT_COEFFS > 1
+      && is_a <scalar_int_mode> (mode, &int_mode)
+      && poly_int_rtx_p (op0)
+      && poly_int_rtx_p (op1))
+    {
+      poly_wide_int result;
+      switch (code)
+	{
+	case PLUS:
+	  result = wi::to_poly_wide (op0, mode) + wi::to_poly_wide (op1, mode);
+	  break;
+
+	case MINUS:
+	  result = wi::to_poly_wide (op0, mode) - wi::to_poly_wide (op1, mode);
+	  break;
+
+	case MULT:
+	  if (CONST_SCALAR_INT_P (op1))
+	    result = wi::to_poly_wide (op0, mode) * rtx_mode_t (op1, mode);
+	  else
+	    return NULL_RTX;
+	  break;
+
+	case ASHIFT:
+	  if (CONST_SCALAR_INT_P (op1))
+	    {
+	      wide_int shift = rtx_mode_t (op1, mode);
+	      if (SHIFT_COUNT_TRUNCATED)
+		shift = wi::umod_trunc (shift, GET_MODE_PRECISION (int_mode));
+	      else if (wi::geu_p (shift, GET_MODE_PRECISION (int_mode)))
+		return NULL_RTX;
+	      result = wi::to_poly_wide (op0, mode) << shift;
+	    }
+	  else
+	    return NULL_RTX;
+	  break;
+
+	case IOR:
+	  if (!CONST_SCALAR_INT_P (op1)
+	      || !can_ior_p (wi::to_poly_wide (op0, mode),
+			     rtx_mode_t (op1, mode), &result))
+	    return NULL_RTX;
+	  break;
+
+	default:
+	  return NULL_RTX;
+	}
+      return immed_wide_int_const (result, int_mode);
+    }
+
   return NULL_RTX;
 }
 
@@ -6370,13 +6448,27 @@ simplify_subreg (machine_mode outermode,
   scalar_int_mode int_outermode, int_innermode;
   if (is_a <scalar_int_mode> (outermode, &int_outermode)
       && is_a <scalar_int_mode> (innermode, &int_innermode)
-      && (GET_MODE_PRECISION (int_outermode)
-	  < GET_MODE_PRECISION (int_innermode))
       && byte == subreg_lowpart_offset (int_outermode, int_innermode))
     {
-      rtx tem = simplify_truncation (int_outermode, op, int_innermode);
-      if (tem)
-	return tem;
+      /* Handle polynomial integers.  The upper bits of a paradoxical
+	 subreg are undefined, so this is safe regardless of whether
+	 we're truncating or extending.  */
+      if (CONST_POLY_INT_P (op))
+	{
+	  poly_wide_int val
+	    = poly_wide_int::from (const_poly_int_value (op),
+				   GET_MODE_PRECISION (int_outermode),
+				   SIGNED);
+	  return immed_wide_int_const (val, int_outermode);
+	}
+
+      if (GET_MODE_PRECISION (int_outermode)
+	  < GET_MODE_PRECISION (int_innermode))
+	{
+	  rtx tem = simplify_truncation (int_outermode, op, int_innermode);
+	  if (tem)
+	    return tem;
+	}
     }
 
   return NULL_RTX;
@@ -6685,12 +6777,60 @@ test_vector_ops ()
     }
 }
 
+template<unsigned int N>
+struct simplify_const_poly_int_tests
+{
+  static void run ();
+};
+
+template<>
+struct simplify_const_poly_int_tests<1>
+{
+  static void run () {}
+};
+
+/* Test various CONST_POLY_INT properties.  */
+
+template<unsigned int N>
+void
+simplify_const_poly_int_tests<N>::run ()
+{
+  rtx x1 = gen_int_mode (poly_int64 (1, 1), QImode);
+  rtx x2 = gen_int_mode (poly_int64 (-80, 127), QImode);
+  rtx x3 = gen_int_mode (poly_int64 (-79, -128), QImode);
+  rtx x4 = gen_int_mode (poly_int64 (5, 4), QImode);
+  rtx x5 = gen_int_mode (poly_int64 (30, 24), QImode);
+  rtx x6 = gen_int_mode (poly_int64 (20, 16), QImode);
+  rtx x7 = gen_int_mode (poly_int64 (7, 4), QImode);
+  rtx x8 = gen_int_mode (poly_int64 (30, 24), HImode);
+  rtx x9 = gen_int_mode (poly_int64 (-30, -24), HImode);
+  rtx x10 = gen_int_mode (poly_int64 (-31, -24), HImode);
+  rtx two = GEN_INT (2);
+  rtx six = GEN_INT (6);
+  HOST_WIDE_INT offset = subreg_lowpart_offset (QImode, HImode);
+
+  /* These tests only try limited operation combinations.  Fuller arithmetic
+     testing is done directly on poly_ints.  */
+  ASSERT_EQ (simplify_unary_operation (NEG, HImode, x8, HImode), x9);
+  ASSERT_EQ (simplify_unary_operation (NOT, HImode, x8, HImode), x10);
+  ASSERT_EQ (simplify_unary_operation (TRUNCATE, QImode, x8, HImode), x5);
+  ASSERT_EQ (simplify_binary_operation (PLUS, QImode, x1, x2), x3);
+  ASSERT_EQ (simplify_binary_operation (MINUS, QImode, x3, x1), x2);
+  ASSERT_EQ (simplify_binary_operation (MULT, QImode, x4, six), x5);
+  ASSERT_EQ (simplify_binary_operation (MULT, QImode, six, x4), x5);
+  ASSERT_EQ (simplify_binary_operation (ASHIFT, QImode, x4, two), x6);
+  ASSERT_EQ (simplify_binary_operation (IOR, QImode, x4, two), x7);
+  ASSERT_EQ (simplify_subreg (HImode, x5, QImode, 0), x8);
+  ASSERT_EQ (simplify_subreg (QImode, x8, HImode, offset), x5);
+}
+
 /* Run all of the selftests within this file.  */
 
 void
 simplify_rtx_c_tests ()
 {
   test_vector_ops ();
+  simplify_const_poly_int_tests<NUM_POLY_INT_COEFFS>::run ();
 }
 
 } // namespace selftest
Index: gcc/wide-int.h
===================================================================
--- gcc/wide-int.h	2017-12-15 01:16:50.894351263 +0000
+++ gcc/wide-int.h	2017-12-15 01:16:51.242338992 +0000
@@ -613,6 +613,7 @@ #define SHIFT_FUNCTION \
      access.  */
   struct storage_ref
   {
+    storage_ref () {}
     storage_ref (const HOST_WIDE_INT *, unsigned int, unsigned int);
 
     const HOST_WIDE_INT *val;
@@ -944,6 +945,8 @@ struct wide_int_ref_storage : public wi:
   HOST_WIDE_INT scratch[2];
 
 public:
+  wide_int_ref_storage () {}
+
   wide_int_ref_storage (const wi::storage_ref &);
 
   template <typename T>
@@ -1323,7 +1326,7 @@ typedef generic_wide_int <trailing_wide_
    bytes beyond the sizeof need to be allocated.  Use set_precision
    to initialize the structure.  */
 template <int N>
-class GTY(()) trailing_wide_ints
+class GTY((user)) trailing_wide_ints
 {
 private:
   /* The shared precision of each number.  */
@@ -1340,9 +1343,14 @@ class GTY(()) trailing_wide_ints
   HOST_WIDE_INT m_val[1];
 
 public:
+  typedef WIDE_INT_REF_FOR (trailing_wide_int_storage) const_reference;
+
   void set_precision (unsigned int);
+  unsigned int get_precision () const { return m_precision; }
   trailing_wide_int operator [] (unsigned int);
+  const_reference operator [] (unsigned int) const;
   static size_t extra_size (unsigned int);
+  size_t extra_size () const { return extra_size (m_precision); }
 };
 
 inline trailing_wide_int_storage::
@@ -1414,6 +1422,14 @@ trailing_wide_ints <N>::operator [] (uns
 				    &m_val[index * m_max_len]);
 }
 
+template <int N>
+inline typename trailing_wide_ints <N>::const_reference
+trailing_wide_ints <N>::operator [] (unsigned int index) const
+{
+  return wi::storage_ref (&m_val[index * m_max_len],
+			  m_len[index], m_precision);
+}
+
 /* Return how many extra bytes need to be added to the end of the structure
    in order to handle N wide_ints of precision PRECISION.  */
 template <int N>
Jeff Law Dec. 19, 2017, 4:52 a.m. UTC | #3
On 12/14/2017 06:25 PM, Richard Sandiford wrote:
> Jeff Law <law@redhat.com> writes:
>> On 10/23/2017 11:00 AM, Richard Sandiford wrote:
>>> This patch adds an rtl representation of poly_int values.
>>> There were three possible ways of doing this:
>>>
>>> (1) Add a new rtl code for the poly_ints themselves and store the
>>>     coefficients as trailing wide_ints.  This would give constants like:
>>>
>>>       (const_poly_int [c0 c1 ... cn])
>>>
>>>     The runtime value would be:
>>>
>>>       c0 + c1 * x1 + ... + cn * xn
>>>
>>> (2) Like (1), but use rtxes for the coefficients.  This would give
>>>     constants like:
>>>
>>>       (const_poly_int [(const_int c0)
>>>                        (const_int c1)
>>>                        ...
>>>                        (const_int cn)])
>>>
>>>     although the coefficients could be const_wide_ints instead
>>>     of const_ints where appropriate.
>>>
>>> (3) Add a new rtl code for the polynomial indeterminates,
>>>     then use them in const wrappers.  A constant like c0 + c1 * x1
>>>     would then look like:
>>>
>>>       (const:M (plus:M (mult:M (const_param:M x1)
>>>                                (const_int c1))
>>>                        (const_int c0)))
>>>
>>> There didn't seem to be that much to choose between them.  The main
>>> advantage of (1) is that it's a more efficient representation and
>>> that we can refer to the cofficients directly as wide_int_storage.
>> Well, and #1 feels more like how we handle CONST_INT :-)
>>>
>>>
>>> 2017-10-23  Richard Sandiford  <richard.sandiford@linaro.org>
>>> 	    Alan Hayward  <alan.hayward@arm.com>
>>> 	    David Sherwood  <david.sherwood@arm.com>
>>>
>>> gcc/
>>> 	* doc/rtl.texi (const_poly_int): Document.
>>> 	* gengenrtl.c (excluded_rtx): Return true for CONST_POLY_INT.
>>> 	* rtl.h (const_poly_int_def): New struct.
>>> 	(rtx_def::u): Add a cpi field.
>>> 	(CASE_CONST_UNIQUE, CASE_CONST_ANY): Add CONST_POLY_INT.
>>> 	(CONST_POLY_INT_P, CONST_POLY_INT_COEFFS): New macros.
>>> 	(wi::rtx_to_poly_wide_ref): New typedef
>>> 	(const_poly_int_value, wi::to_poly_wide, rtx_to_poly_int64)
>>> 	(poly_int_rtx_p): New functions.
>>> 	(trunc_int_for_mode): Declare a poly_int64 version.
>>> 	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
>>> 	(immed_wide_int_const): Take a poly_wide_int_ref rather than
>>> 	a wide_int_ref.
>>> 	(strip_offset): Declare.
>>> 	(strip_offset_and_add): New function.
>>> 	* rtl.def (CONST_POLY_INT): New rtx code.
>>> 	* rtl.c (rtx_size): Handle CONST_POLY_INT.
>>> 	(shared_const_p): Use poly_int_rtx_p.
>>> 	* emit-rtl.h (gen_int_mode): Take a poly_int64 instead of a
>>> 	HOST_WIDE_INT.
>>> 	(gen_int_shift_amount): Likewise.
>>> 	* emit-rtl.c (const_poly_int_hasher): New class.
>>> 	(const_poly_int_htab): New variable.
>>> 	(init_emit_once): Initialize it when NUM_POLY_INT_COEFFS > 1.
>>> 	(const_poly_int_hasher::hash): New function.
>>> 	(const_poly_int_hasher::equal): Likewise.
>>> 	(gen_int_mode): Take a poly_int64 instead of a HOST_WIDE_INT.
>>> 	(immed_wide_int_const): Rename to...
>>> 	(immed_wide_int_const_1): ...this and make static.
>>> 	(immed_wide_int_const): New function, taking a poly_wide_int_ref
>>> 	instead of a wide_int_ref.
>>> 	(gen_int_shift_amount): Take a poly_int64 instead of a HOST_WIDE_INT.
>>> 	(gen_lowpart_common): Handle CONST_POLY_INT.
>>> 	* cse.c (hash_rtx_cb, equiv_constant): Likewise.
>>> 	* cselib.c (cselib_hash_rtx): Likewise.
>>> 	* dwarf2out.c (const_ok_for_output_1): Likewise.
>>> 	* expr.c (convert_modes): Likewise.
>>> 	* print-rtl.c (rtx_writer::print_rtx, print_value): Likewise.
>>> 	* rtlhash.c (add_rtx): Likewise.
>>> 	* explow.c (trunc_int_for_mode): Add a poly_int64 version.
>>> 	(plus_constant): Take a poly_int64 instead of a HOST_WIDE_INT.
>>> 	Handle existing CONST_POLY_INT rtxes.
>>> 	* expmed.h (expand_shift): Take a poly_int64 instead of a
>>> 	HOST_WIDE_INT.
>>> 	* expmed.c (expand_shift): Likewise.
>>> 	* rtlanal.c (strip_offset): New function.
>>> 	(commutative_operand_precedence): Give CONST_POLY_INT the same
>>> 	precedence as CONST_DOUBLE and put CONST_WIDE_INT between that
>>> 	and CONST_INT.
>>> 	* rtl-tests.c (const_poly_int_tests): New struct.
>>> 	(rtl_tests_c_tests): Use it.
>>> 	* simplify-rtx.c (simplify_const_unary_operation): Handle
>>> 	CONST_POLY_INT.
>>> 	(simplify_const_binary_operation): Likewise.
>>> 	(simplify_binary_operation_1): Fold additions of symbolic constants
>>> 	and CONST_POLY_INTs.
>>> 	(simplify_subreg): Handle extensions and truncations of
>>> 	CONST_POLY_INTs.
>>> 	(simplify_const_poly_int_tests): New struct.
>>> 	(simplify_rtx_c_tests): Use it.
>>> 	* wide-int.h (storage_ref): Add default constructor.
>>> 	(wide_int_ref_storage): Likewise.
>>> 	(trailing_wide_ints): Use GTY((user)).
>>> 	(trailing_wide_ints::operator[]): Add a const version.
>>> 	(trailing_wide_ints::get_precision): New function.
>>> 	(trailing_wide_ints::extra_size): Likewise.
>> Do we need to define anything WRT structure sharing in rtl.texi for a
>> CONST_POLY_INT?
> 
> Good catch.  Fixed in the patch below.
> 
>>> Index: gcc/rtl.c
>>> ===================================================================
>>> --- gcc/rtl.c	2017-10-23 16:52:20.579835373 +0100
>>> +++ gcc/rtl.c	2017-10-23 17:00:54.443002147 +0100
>>> @@ -257,9 +261,10 @@ shared_const_p (const_rtx orig)
>>>  
>>>    /* CONST can be shared if it contains a SYMBOL_REF.  If it contains
>>>       a LABEL_REF, it isn't sharable.  */
>>> +  poly_int64 offset;
>>>    return (GET_CODE (XEXP (orig, 0)) == PLUS
>>>  	  && GET_CODE (XEXP (XEXP (orig, 0), 0)) == SYMBOL_REF
>>> -	  && CONST_INT_P (XEXP (XEXP (orig, 0), 1)));
>>> +	  && poly_int_rtx_p (XEXP (XEXP (orig, 0), 1), &offset));
>> Did this just change structure sharing for CONST_WIDE_INT?
> 
> No, we'd only use CONST_WIDE_INT for things that don't fit in
> poly_int64.
> 
>>> +  /* Create a new rtx.  There's a choice to be made here between installing
>>> +     the actual mode of the rtx or leaving it as VOIDmode (for consistency
>>> +     with CONST_INT).  In practice the handling of the codes is different
>>> +     enough that we get no benefit from using VOIDmode, and various places
>>> +     assume that VOIDmode implies CONST_INT.  Using the real mode seems like
>>> +     the right long-term direction anyway.  */
>> Certainly my preference is to get the mode in there.  I see modeless
>> CONST_INTs as a long standing wart and I'm not keen to repeat it.
> 
> Yeah.  Still regularly hit problems related to modeless CONST_INTs
> today (including the gen_int_shift_amount patch).
> 
>>> Index: gcc/wide-int.h
>>> ===================================================================
>>> --- gcc/wide-int.h	2017-10-23 17:00:20.923835582 +0100
>>> +++ gcc/wide-int.h	2017-10-23 17:00:54.445999420 +0100
>>> @@ -613,6 +613,7 @@ #define SHIFT_FUNCTION \
>>>       access.  */
>>>    struct storage_ref
>>>    {
>>> +    storage_ref () {}
>>>      storage_ref (const HOST_WIDE_INT *, unsigned int, unsigned int);
>>>  
>>>      const HOST_WIDE_INT *val;
>>> @@ -944,6 +945,8 @@ struct wide_int_ref_storage : public wi:
>>>    HOST_WIDE_INT scratch[2];
>>>  
>>>  public:
>>> +  wide_int_ref_storage () {}
>>> +
>>>    wide_int_ref_storage (const wi::storage_ref &);
>>>  
>>>    template <typename T>
>> So doesn't this play into the whole question about initialization of
>> these objects.  So I'll defer on this hunk until we settle that
>> question, but the rest is OK.
> 
> Any more thoughts on this?  In the end the 001 patch went in with
> the empty constructors.  Like I say, I'm happy to switch to C++11
> "= default;" once we require C++11, but I think having well-defined
> implicit construction would make switching to "= default" harder
> in future.
I think we're good to go.  I would have slightly preferred to avoid the
empty ctor, but not strongly enough to object over Richi's ACK,
especially since the alternative would ultimately make the switch to
= default harder later.
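
As an illustration of the trade-off (hypothetical structs for the sake
of the example, not the wide-int classes themselves):

  struct ref_a
  {
    const HOST_WIDE_INT *val;
    unsigned int len;
    /* Members are left uninitialized; this ctor can later become
       "ref_a () = default;" without changing default-initialization.  */
    ref_a () {}
  };

  struct ref_b
  {
    const HOST_WIDE_INT *val;
    unsigned int len;
    /* Well-defined construction, but switching this to "= default"
       would silently stop zeroing the members.  */
    ref_b () : val (NULL), len (0) {}
  };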

And just to be clear, I'd like to propose we step forward to C++11 in
the gcc-9 timeframe.  I haven't run that by anyone, but that's the
timeframe I'd personally prefer.


jeff
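
As a reference point before the full patch, here is a minimal sketch of
how the new rtx-level interface is meant to be used.  The wrapper
function and the offset value are hypothetical; the entry points
(gen_int_mode, poly_int_rtx_p, plus_constant) come from the patch below.
The two-coefficient poly_int64 constructor assumes
NUM_POLY_INT_COEFFS > 1, as in the selftests further down, and MODE is
assumed to be a scalar integer mode.

  static rtx
  add_poly_offset (rtx base, machine_mode mode)
  {
    /* Runtime value 16 + 8 * x1; emitted as (const_poly_int:MODE [16 8]).  */
    poly_int64 offset (16, 8);
    rtx cst = gen_int_mode (offset, mode);

    /* Recover the poly_int64 value from the rtx and fold it into BASE,
       giving e.g. (plus BASE (const_poly_int ...)).  */
    poly_int64 value;
    if (poly_int_rtx_p (cst, &value))
      return plus_constant (mode, base, value);
    return base;
  }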

Patch

Index: gcc/doc/rtl.texi
===================================================================
--- gcc/doc/rtl.texi	2017-10-23 17:00:20.916834036 +0100
+++ gcc/doc/rtl.texi	2017-10-23 17:00:54.437007600 +0100
@@ -1621,6 +1621,15 @@  is accessed with the macro @code{CONST_F
 data is accessed with @code{CONST_FIXED_VALUE_HIGH}; the low part is
 accessed with @code{CONST_FIXED_VALUE_LOW}.
 
+@findex const_poly_int
+@item (const_poly_int:@var{m} [@var{c0} @var{c1} @dots{}])
+Represents a @code{poly_int}-style polynomial integer with coefficients
+@var{c0}, @var{c1}, @dots{}.  The coefficients are @code{wide_int}-based
+integers rather than rtxes.  @code{CONST_POLY_INT_COEFFS} gives the
+values of individual coefficients (which is mostly only useful in
+low-level routines) and @code{const_poly_int_value} gives the full
+@code{poly_int} value.
+
 @findex const_vector
 @item (const_vector:@var{m} [@var{x0} @var{x1} @dots{}])
 Represents a vector constant.  The square brackets stand for the vector
Index: gcc/gengenrtl.c
===================================================================
--- gcc/gengenrtl.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/gengenrtl.c	2017-10-23 17:00:54.442003055 +0100
@@ -157,6 +157,7 @@  excluded_rtx (int idx)
   return (strcmp (defs[idx].enumname, "VAR_LOCATION") == 0
 	  || strcmp (defs[idx].enumname, "CONST_DOUBLE") == 0
 	  || strcmp (defs[idx].enumname, "CONST_WIDE_INT") == 0
+	  || strcmp (defs[idx].enumname, "CONST_POLY_INT") == 0
 	  || strcmp (defs[idx].enumname, "CONST_FIXED") == 0);
 }
 
Index: gcc/rtl.h
===================================================================
--- gcc/rtl.h	2017-10-23 16:52:20.579835373 +0100
+++ gcc/rtl.h	2017-10-23 17:00:54.444001238 +0100
@@ -280,6 +280,10 @@  #define CWI_GET_NUM_ELEM(RTX)					\
 #define CWI_PUT_NUM_ELEM(RTX, NUM)					\
   (RTL_FLAG_CHECK1("CWI_PUT_NUM_ELEM", (RTX), CONST_WIDE_INT)->u2.num_elem = (NUM))
 
+struct GTY((variable_size)) const_poly_int_def {
+  trailing_wide_ints<NUM_POLY_INT_COEFFS> coeffs;
+};
+
 /* RTL expression ("rtx").  */
 
 /* The GTY "desc" and "tag" options below are a kludge: we need a desc
@@ -424,6 +428,7 @@  struct GTY((desc("0"), tag("0"),
     struct real_value rv;
     struct fixed_value fv;
     struct hwivec_def hwiv;
+    struct const_poly_int_def cpi;
   } GTY ((special ("rtx_def"), desc ("GET_CODE (&%0)"))) u;
 };
 
@@ -734,6 +739,7 @@  #define CASE_CONST_SCALAR_INT \
 #define CASE_CONST_UNIQUE \
    case CONST_INT: \
    case CONST_WIDE_INT: \
+   case CONST_POLY_INT: \
    case CONST_DOUBLE: \
    case CONST_FIXED
 
@@ -741,6 +747,7 @@  #define CASE_CONST_UNIQUE \
 #define CASE_CONST_ANY \
    case CONST_INT: \
    case CONST_WIDE_INT: \
+   case CONST_POLY_INT: \
    case CONST_DOUBLE: \
    case CONST_FIXED: \
    case CONST_VECTOR
@@ -773,6 +780,11 @@  #define CONST_INT_P(X) (GET_CODE (X) ==
 /* Predicate yielding nonzero iff X is an rtx for a constant integer.  */
 #define CONST_WIDE_INT_P(X) (GET_CODE (X) == CONST_WIDE_INT)
 
+/* Predicate yielding nonzero iff X is an rtx for a polynomial constant
+   integer.  */
+#define CONST_POLY_INT_P(X) \
+  (NUM_POLY_INT_COEFFS > 1 && GET_CODE (X) == CONST_POLY_INT)
+
 /* Predicate yielding nonzero iff X is an rtx for a constant fixed-point.  */
 #define CONST_FIXED_P(X) (GET_CODE (X) == CONST_FIXED)
 
@@ -1871,6 +1883,12 @@  #define CONST_WIDE_INT_VEC(RTX) HWIVEC_C
 #define CONST_WIDE_INT_NUNITS(RTX) CWI_GET_NUM_ELEM (RTX)
 #define CONST_WIDE_INT_ELT(RTX, N) CWI_ELT (RTX, N)
 
+/* For a CONST_POLY_INT, CONST_POLY_INT_COEFFS gives access to the
+   individual coefficients, in the form of a trailing_wide_ints structure.  */
+#define CONST_POLY_INT_COEFFS(RTX) \
+  (RTL_FLAG_CHECK1("CONST_POLY_INT_COEFFS", (RTX), \
+		   CONST_POLY_INT)->u.cpi.coeffs)
+
 /* For a CONST_DOUBLE:
 #if TARGET_SUPPORTS_WIDE_INT == 0
    For a VOIDmode, there are two integers CONST_DOUBLE_LOW is the
@@ -2184,6 +2202,84 @@  wi::max_value (machine_mode mode, signop
   return max_value (GET_MODE_PRECISION (as_a <scalar_mode> (mode)), sgn);
 }
 
+namespace wi
+{
+  typedef poly_int<NUM_POLY_INT_COEFFS,
+		   generic_wide_int <wide_int_ref_storage <false, false> > >
+    rtx_to_poly_wide_ref;
+  rtx_to_poly_wide_ref to_poly_wide (const_rtx, machine_mode);
+}
+
+/* Return the value of a CONST_POLY_INT in its native precision.  */
+
+inline wi::rtx_to_poly_wide_ref
+const_poly_int_value (const_rtx x)
+{
+  poly_int<NUM_POLY_INT_COEFFS, WIDE_INT_REF_FOR (wide_int)> res;
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    res.coeffs[i] = CONST_POLY_INT_COEFFS (x)[i];
+  return res;
+}
+
+/* Return true if X is a scalar integer or a CONST_POLY_INT.  The value
+   can then be extracted using wi::to_poly_wide.  */
+
+inline bool
+poly_int_rtx_p (const_rtx x)
+{
+  return CONST_SCALAR_INT_P (x) || CONST_POLY_INT_P (x);
+}
+
+/* Access X (which satisfies poly_int_rtx_p) as a poly_wide_int.
+   MODE is the mode of X.  */
+
+inline wi::rtx_to_poly_wide_ref
+wi::to_poly_wide (const_rtx x, machine_mode mode)
+{
+  if (CONST_POLY_INT_P (x))
+    return const_poly_int_value (x);
+  return rtx_mode_t (const_cast<rtx> (x), mode);
+}
+
+/* Return the value of X as a poly_int64.  */
+
+inline poly_int64
+rtx_to_poly_int64 (const_rtx x)
+{
+  if (CONST_POLY_INT_P (x))
+    {
+      poly_int64 res;
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	res.coeffs[i] = CONST_POLY_INT_COEFFS (x)[i].to_shwi ();
+      return res;
+    }
+  return INTVAL (x);
+}
+
+/* Return true if arbitrary value X is an integer constant that can
+   be represented as a poly_int64.  Store the value in *RES if so,
+   otherwise leave it unmodified.  */
+
+inline bool
+poly_int_rtx_p (const_rtx x, poly_int64_pod *res)
+{
+  if (CONST_INT_P (x))
+    {
+      *res = INTVAL (x);
+      return true;
+    }
+  if (CONST_POLY_INT_P (x))
+    {
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	if (!wi::fits_shwi_p (CONST_POLY_INT_COEFFS (x)[i]))
+	  return false;
+      for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	res->coeffs[i] = CONST_POLY_INT_COEFFS (x)[i].to_shwi ();
+      return true;
+    }
+  return false;
+}
+
 extern void init_rtlanal (void);
 extern int rtx_cost (rtx, machine_mode, enum rtx_code, int, bool);
 extern int address_cost (rtx, machine_mode, addr_space_t, bool);
@@ -2721,7 +2817,8 @@  #define EXTRACT_ARGS_IN_RANGE(SIZE, POS,
 
 /* In explow.c */
 extern HOST_WIDE_INT trunc_int_for_mode	(HOST_WIDE_INT, machine_mode);
-extern rtx plus_constant (machine_mode, rtx, HOST_WIDE_INT, bool = false);
+extern poly_int64 trunc_int_for_mode (poly_int64, machine_mode);
+extern rtx plus_constant (machine_mode, rtx, poly_int64, bool = false);
 extern HOST_WIDE_INT get_stack_check_protect (void);
 
 /* In rtl.c */
@@ -3032,13 +3129,11 @@  extern void end_sequence (void);
 extern double_int rtx_to_double_int (const_rtx);
 #endif
 extern void cwi_output_hex (FILE *, const_rtx);
-#ifndef GENERATOR_FILE
-extern rtx immed_wide_int_const (const wide_int_ref &, machine_mode);
-#endif
 #if TARGET_SUPPORTS_WIDE_INT == 0
 extern rtx immed_double_const (HOST_WIDE_INT, HOST_WIDE_INT,
 			       machine_mode);
 #endif
+extern rtx immed_wide_int_const (const poly_wide_int_ref &, machine_mode);
 
 /* In varasm.c  */
 extern rtx force_const_mem (machine_mode, rtx);
@@ -3226,6 +3321,7 @@  extern HOST_WIDE_INT get_integer_term (c
 extern rtx get_related_value (const_rtx);
 extern bool offset_within_block_p (const_rtx, HOST_WIDE_INT);
 extern void split_const (rtx, rtx *, rtx *);
+extern rtx strip_offset (rtx, poly_int64_pod *);
 extern bool unsigned_reg_p (rtx);
 extern int reg_mentioned_p (const_rtx, const_rtx);
 extern int count_occurrences (const_rtx, const_rtx, int);
@@ -4160,6 +4256,21 @@  load_extend_op (machine_mode mode)
   return UNKNOWN;
 }
 
+/* If X is a PLUS of a base and a constant offset, add the constant to *OFFSET
+   and return the base.  Return X otherwise.  */
+
+inline rtx
+strip_offset_and_add (rtx x, poly_int64_pod *offset)
+{
+  if (GET_CODE (x) == PLUS)
+    {
+      poly_int64 suboffset;
+      x = strip_offset (x, &suboffset);
+      *offset += suboffset;
+    }
+  return x;
+}
+
 /* gtype-desc.c.  */
 extern void gt_ggc_mx (rtx &);
 extern void gt_pch_nx (rtx &);
Index: gcc/rtl.def
===================================================================
--- gcc/rtl.def	2017-10-23 16:52:20.579835373 +0100
+++ gcc/rtl.def	2017-10-23 17:00:54.443002147 +0100
@@ -348,6 +348,9 @@  DEF_RTL_EXPR(CONST_INT, "const_int", "w"
 /* numeric integer constant */
 DEF_RTL_EXPR(CONST_WIDE_INT, "const_wide_int", "", RTX_CONST_OBJ)
 
+/* An rtx representation of a poly_wide_int.  */
+DEF_RTL_EXPR(CONST_POLY_INT, "const_poly_int", "", RTX_CONST_OBJ)
+
 /* fixed-point constant */
 DEF_RTL_EXPR(CONST_FIXED, "const_fixed", "www", RTX_CONST_OBJ)
 
Index: gcc/rtl.c
===================================================================
--- gcc/rtl.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/rtl.c	2017-10-23 17:00:54.443002147 +0100
@@ -189,6 +189,10 @@  rtx_size (const_rtx x)
 	    + sizeof (struct hwivec_def)
 	    + ((CONST_WIDE_INT_NUNITS (x) - 1)
 	       * sizeof (HOST_WIDE_INT)));
+  if (CONST_POLY_INT_P (x))
+    return (RTX_HDR_SIZE
+	    + sizeof (struct const_poly_int_def)
+	    + CONST_POLY_INT_COEFFS (x).extra_size ());
   if (GET_CODE (x) == SYMBOL_REF && SYMBOL_REF_HAS_BLOCK_INFO_P (x))
     return RTX_HDR_SIZE + sizeof (struct block_symbol);
   return RTX_CODE_SIZE (GET_CODE (x));
@@ -257,9 +261,10 @@  shared_const_p (const_rtx orig)
 
   /* CONST can be shared if it contains a SYMBOL_REF.  If it contains
      a LABEL_REF, it isn't sharable.  */
+  poly_int64 offset;
   return (GET_CODE (XEXP (orig, 0)) == PLUS
 	  && GET_CODE (XEXP (XEXP (orig, 0), 0)) == SYMBOL_REF
-	  && CONST_INT_P (XEXP (XEXP (orig, 0), 1)));
+	  && poly_int_rtx_p (XEXP (XEXP (orig, 0), 1), &offset));
 }
 
 
Index: gcc/emit-rtl.h
===================================================================
--- gcc/emit-rtl.h	2017-10-23 16:52:20.579835373 +0100
+++ gcc/emit-rtl.h	2017-10-23 17:00:54.440004873 +0100
@@ -362,14 +362,14 @@  extern rtvec gen_rtvec (int, ...);
 extern rtx copy_insn_1 (rtx);
 extern rtx copy_insn (rtx);
 extern rtx_insn *copy_delay_slot_insn (rtx_insn *);
-extern rtx gen_int_mode (HOST_WIDE_INT, machine_mode);
+extern rtx gen_int_mode (poly_int64, machine_mode);
 extern rtx_insn *emit_copy_of_insn_after (rtx_insn *, rtx_insn *);
 extern void set_reg_attrs_from_value (rtx, rtx);
 extern void set_reg_attrs_for_parm (rtx, rtx);
 extern void set_reg_attrs_for_decl_rtl (tree t, rtx x);
 extern void adjust_reg_mode (rtx, machine_mode);
 extern int mem_expr_equal_p (const_tree, const_tree);
-extern rtx gen_int_shift_amount (machine_mode, HOST_WIDE_INT);
+extern rtx gen_int_shift_amount (machine_mode, poly_int64);
 
 extern bool need_atomic_barrier_p (enum memmodel, bool);
 
Index: gcc/emit-rtl.c
===================================================================
--- gcc/emit-rtl.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/emit-rtl.c	2017-10-23 17:00:54.440004873 +0100
@@ -148,6 +148,16 @@  struct const_wide_int_hasher : ggc_cache
 
 static GTY ((cache)) hash_table<const_wide_int_hasher> *const_wide_int_htab;
 
+struct const_poly_int_hasher : ggc_cache_ptr_hash<rtx_def>
+{
+  typedef std::pair<machine_mode, poly_wide_int_ref> compare_type;
+
+  static hashval_t hash (rtx x);
+  static bool equal (rtx x, const compare_type &y);
+};
+
+static GTY ((cache)) hash_table<const_poly_int_hasher> *const_poly_int_htab;
+
 /* A hash table storing register attribute structures.  */
 struct reg_attr_hasher : ggc_cache_ptr_hash<reg_attrs>
 {
@@ -257,6 +267,31 @@  const_wide_int_hasher::equal (rtx x, rtx
 }
 #endif
 
+/* Returns a hash code for CONST_POLY_INT X.  */
+
+hashval_t
+const_poly_int_hasher::hash (rtx x)
+{
+  inchash::hash h;
+  h.add_int (GET_MODE (x));
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    h.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]);
+  return h.end ();
+}
+
+/* Returns nonzero if CONST_POLY_INT X is an rtx representation of Y.  */
+
+bool
+const_poly_int_hasher::equal (rtx x, const compare_type &y)
+{
+  if (GET_MODE (x) != y.first)
+    return false;
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    if (CONST_POLY_INT_COEFFS (x)[i] != y.second.coeffs[i])
+      return false;
+  return true;
+}
+
 /* Returns a hash code for X (which is really a CONST_DOUBLE).  */
 hashval_t
 const_double_hasher::hash (rtx x)
@@ -520,9 +555,13 @@  gen_rtx_CONST_INT (machine_mode mode ATT
 }
 
 rtx
-gen_int_mode (HOST_WIDE_INT c, machine_mode mode)
+gen_int_mode (poly_int64 c, machine_mode mode)
 {
-  return GEN_INT (trunc_int_for_mode (c, mode));
+  c = trunc_int_for_mode (c, mode);
+  if (c.is_constant ())
+    return GEN_INT (c.coeffs[0]);
+  unsigned int prec = GET_MODE_PRECISION (as_a <scalar_mode> (mode));
+  return immed_wide_int_const (poly_wide_int::from (c, prec, SIGNED), mode);
 }
 
 /* CONST_DOUBLEs might be created from pairs of integers, or from
@@ -626,8 +665,8 @@  lookup_const_wide_int (rtx wint)
    a CONST_DOUBLE (if !TARGET_SUPPORTS_WIDE_INT) or a CONST_WIDE_INT
    (if TARGET_SUPPORTS_WIDE_INT).  */
 
-rtx
-immed_wide_int_const (const wide_int_ref &v, machine_mode mode)
+static rtx
+immed_wide_int_const_1 (const wide_int_ref &v, machine_mode mode)
 {
   unsigned int len = v.get_len ();
   /* Not scalar_int_mode because we also allow pointer bound modes.  */
@@ -714,6 +753,53 @@  immed_double_const (HOST_WIDE_INT i0, HO
 }
 #endif
 
+/* Return an rtx representation of C in mode MODE.  */
+
+rtx
+immed_wide_int_const (const poly_wide_int_ref &c, machine_mode mode)
+{
+  if (c.is_constant ())
+    return immed_wide_int_const_1 (c.coeffs[0], mode);
+
+  /* Not scalar_int_mode because we also allow pointer bound modes.  */
+  unsigned int prec = GET_MODE_PRECISION (as_a <scalar_mode> (mode));
+
+  /* Allow truncation but not extension since we do not know if the
+     number is signed or unsigned.  */
+  gcc_assert (prec <= c.coeffs[0].get_precision ());
+  poly_wide_int newc = poly_wide_int::from (c, prec, SIGNED);
+
+  /* See whether we already have an rtx for this constant.  */
+  inchash::hash h;
+  h.add_int (mode);
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    h.add_wide_int (newc.coeffs[i]);
+  const_poly_int_hasher::compare_type typed_value (mode, newc);
+  rtx *slot = const_poly_int_htab->find_slot_with_hash (typed_value,
+							h.end (), INSERT);
+  rtx x = *slot;
+  if (x)
+    return x;
+
+  /* Create a new rtx.  There's a choice to be made here between installing
+     the actual mode of the rtx or leaving it as VOIDmode (for consistency
+     with CONST_INT).  In practice the handling of the codes is different
+     enough that we get no benefit from using VOIDmode, and various places
+     assume that VOIDmode implies CONST_INT.  Using the real mode seems like
+     the right long-term direction anyway.  */
+  typedef trailing_wide_ints<NUM_POLY_INT_COEFFS> twi;
+  size_t extra_size = twi::extra_size (prec);
+  x = rtx_alloc_v (CONST_POLY_INT,
+		   sizeof (struct const_poly_int_def) + extra_size);
+  PUT_MODE (x, mode);
+  CONST_POLY_INT_COEFFS (x).set_precision (prec);
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    CONST_POLY_INT_COEFFS (x)[i] = newc.coeffs[i];
+
+  *slot = x;
+  return x;
+}
+
 rtx
 gen_rtx_REG (machine_mode mode, unsigned int regno)
 {
@@ -1502,7 +1588,8 @@  gen_lowpart_common (machine_mode mode, r
     }
   else if (GET_CODE (x) == SUBREG || REG_P (x)
 	   || GET_CODE (x) == CONCAT || const_vec_p (x)
-	   || CONST_DOUBLE_AS_FLOAT_P (x) || CONST_SCALAR_INT_P (x))
+	   || CONST_DOUBLE_AS_FLOAT_P (x) || CONST_SCALAR_INT_P (x)
+	   || CONST_POLY_INT_P (x))
     return lowpart_subreg (mode, x, innermode);
 
   /* Otherwise, we can't do this.  */
@@ -6089,6 +6176,9 @@  init_emit_once (void)
 #endif
   const_double_htab = hash_table<const_double_hasher>::create_ggc (37);
 
+  if (NUM_POLY_INT_COEFFS > 1)
+    const_poly_int_htab = hash_table<const_poly_int_hasher>::create_ggc (37);
+
   const_fixed_htab = hash_table<const_fixed_hasher>::create_ggc (37);
 
   reg_attrs_htab = hash_table<reg_attr_hasher>::create_ggc (37);
@@ -6482,7 +6572,7 @@  need_atomic_barrier_p (enum memmodel mod
    by VALUE bits.  */
 
 rtx
-gen_int_shift_amount (machine_mode mode, HOST_WIDE_INT value)
+gen_int_shift_amount (machine_mode mode, poly_int64 value)
 {
   return gen_int_mode (value, get_shift_amount_mode (mode));
 }
Index: gcc/cse.c
===================================================================
--- gcc/cse.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/cse.c	2017-10-23 17:00:54.436008509 +0100
@@ -2323,6 +2323,15 @@  hash_rtx_cb (const_rtx x, machine_mode m
 	hash += CONST_WIDE_INT_ELT (x, i);
       return hash;
 
+    case CONST_POLY_INT:
+      {
+	inchash::hash h;
+	h.add_int (hash);
+	for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	  h.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]);
+	return h.end ();
+      }
+
     case CONST_DOUBLE:
       /* This is like the general case, except that it only counts
 	 the integers representing the constant.  */
@@ -3781,6 +3790,8 @@  equiv_constant (rtx x)
       /* See if we previously assigned a constant value to this SUBREG.  */
       if ((new_rtx = lookup_as_function (x, CONST_INT)) != 0
 	  || (new_rtx = lookup_as_function (x, CONST_WIDE_INT)) != 0
+	  || (NUM_POLY_INT_COEFFS > 1
+	      && (new_rtx = lookup_as_function (x, CONST_POLY_INT)) != 0)
           || (new_rtx = lookup_as_function (x, CONST_DOUBLE)) != 0
           || (new_rtx = lookup_as_function (x, CONST_FIXED)) != 0)
         return new_rtx;
Index: gcc/cselib.c
===================================================================
--- gcc/cselib.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/cselib.c	2017-10-23 17:00:54.436008509 +0100
@@ -1128,6 +1128,15 @@  cselib_hash_rtx (rtx x, int create, mach
 	hash += CONST_WIDE_INT_ELT (x, i);
       return hash;
 
+    case CONST_POLY_INT:
+      {
+	inchash::hash h;
+	h.add_int (hash);
+	for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	  h.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]);
+	return h.end ();
+      }
+
     case CONST_DOUBLE:
       /* This is like the general case, except that it only counts
 	 the integers representing the constant.  */
Index: gcc/dwarf2out.c
===================================================================
--- gcc/dwarf2out.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/dwarf2out.c	2017-10-23 17:00:54.439005782 +0100
@@ -13753,6 +13753,9 @@  const_ok_for_output_1 (rtx rtl)
       return false;
     }
 
+  if (CONST_POLY_INT_P (rtl))
+    return false;
+
   if (targetm.const_not_ok_for_debug_p (rtl))
     {
       expansion_failed (NULL_TREE, rtl,
Index: gcc/expr.c
===================================================================
--- gcc/expr.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/expr.c	2017-10-23 17:00:54.442003055 +0100
@@ -692,6 +692,7 @@  convert_modes (machine_mode mode, machin
       && is_int_mode (oldmode, &int_oldmode)
       && GET_MODE_PRECISION (int_mode) <= GET_MODE_PRECISION (int_oldmode)
       && ((MEM_P (x) && !MEM_VOLATILE_P (x) && direct_load[(int) int_mode])
+	  || CONST_POLY_INT_P (x)
           || (REG_P (x)
               && (!HARD_REGISTER_P (x)
 		  || targetm.hard_regno_mode_ok (REGNO (x), int_mode))
Index: gcc/print-rtl.c
===================================================================
--- gcc/print-rtl.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/print-rtl.c	2017-10-23 17:00:54.443002147 +0100
@@ -898,6 +898,17 @@  rtx_writer::print_rtx (const_rtx in_rtx)
       fprintf (m_outfile, " ");
       cwi_output_hex (m_outfile, in_rtx);
       break;
+
+    case CONST_POLY_INT:
+      fprintf (m_outfile, " [");
+      print_dec (CONST_POLY_INT_COEFFS (in_rtx)[0], m_outfile, SIGNED);
+      for (unsigned int i = 1; i < NUM_POLY_INT_COEFFS; ++i)
+	{
+	  fprintf (m_outfile, ", ");
+	  print_dec (CONST_POLY_INT_COEFFS (in_rtx)[i], m_outfile, SIGNED);
+	}
+      fprintf (m_outfile, "]");
+      break;
 #endif
 
     case CODE_LABEL:
@@ -1568,6 +1579,17 @@  print_value (pretty_printer *pp, const_r
       }
       break;
 
+    case CONST_POLY_INT:
+      pp_left_bracket (pp);
+      pp_wide_int (pp, CONST_POLY_INT_COEFFS (x)[0], SIGNED);
+      for (unsigned int i = 1; i < NUM_POLY_INT_COEFFS; ++i)
+	{
+	  pp_string (pp, ", ");
+	  pp_wide_int (pp, CONST_POLY_INT_COEFFS (x)[i], SIGNED);
+	}
+      pp_right_bracket (pp);
+      break;
+
     case CONST_DOUBLE:
       if (FLOAT_MODE_P (GET_MODE (x)))
 	{
Index: gcc/rtlhash.c
===================================================================
--- gcc/rtlhash.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/rtlhash.c	2017-10-23 17:00:54.444001238 +0100
@@ -55,6 +55,10 @@  add_rtx (const_rtx x, hash &hstate)
       for (i = 0; i < CONST_WIDE_INT_NUNITS (x); i++)
 	hstate.add_object (CONST_WIDE_INT_ELT (x, i));
       return;
+    case CONST_POLY_INT:
+      for (i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+	hstate.add_wide_int (CONST_POLY_INT_COEFFS (x)[i]);
+      break;
     case SYMBOL_REF:
       if (XSTR (x, 0))
 	hstate.add (XSTR (x, 0), strlen (XSTR (x, 0)) + 1);
Index: gcc/explow.c
===================================================================
--- gcc/explow.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/explow.c	2017-10-23 17:00:54.440004873 +0100
@@ -77,13 +77,23 @@  trunc_int_for_mode (HOST_WIDE_INT c, mac
   return c;
 }
 
+/* Likewise for polynomial values, using the sign-extended representation
+   for each individual coefficient.  */
+
+poly_int64
+trunc_int_for_mode (poly_int64 x, machine_mode mode)
+{
+  for (unsigned int i = 0; i < NUM_POLY_INT_COEFFS; ++i)
+    x.coeffs[i] = trunc_int_for_mode (x.coeffs[i], mode);
+  return x;
+}
+
 /* Return an rtx for the sum of X and the integer C, given that X has
    mode MODE.  INPLACE is true if X can be modified inplace or false
    if it must be treated as immutable.  */
 
 rtx
-plus_constant (machine_mode mode, rtx x, HOST_WIDE_INT c,
-	       bool inplace)
+plus_constant (machine_mode mode, rtx x, poly_int64 c, bool inplace)
 {
   RTX_CODE code;
   rtx y;
@@ -92,7 +102,7 @@  plus_constant (machine_mode mode, rtx x,
 
   gcc_assert (GET_MODE (x) == VOIDmode || GET_MODE (x) == mode);
 
-  if (c == 0)
+  if (known_zero (c))
     return x;
 
  restart:
@@ -180,10 +190,12 @@  plus_constant (machine_mode mode, rtx x,
       break;
 
     default:
+      if (CONST_POLY_INT_P (x))
+	return immed_wide_int_const (const_poly_int_value (x) + c, mode);
       break;
     }
 
-  if (c != 0)
+  if (maybe_nonzero (c))
     x = gen_rtx_PLUS (mode, x, gen_int_mode (c, mode));
 
   if (GET_CODE (x) == SYMBOL_REF || GET_CODE (x) == LABEL_REF)
Index: gcc/expmed.h
===================================================================
--- gcc/expmed.h	2017-10-23 16:52:20.579835373 +0100
+++ gcc/expmed.h	2017-10-23 17:00:54.441003964 +0100
@@ -712,8 +712,8 @@  extern unsigned HOST_WIDE_INT choose_mul
 #ifdef TREE_CODE
 extern rtx expand_variable_shift (enum tree_code, machine_mode,
 				  rtx, tree, rtx, int);
-extern rtx expand_shift (enum tree_code, machine_mode, rtx, int, rtx,
-			     int);
+extern rtx expand_shift (enum tree_code, machine_mode, rtx, poly_int64, rtx,
+			 int);
 extern rtx expand_divmod (int, enum tree_code, machine_mode, rtx, rtx,
 			  rtx, int);
 #endif
Index: gcc/expmed.c
===================================================================
--- gcc/expmed.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/expmed.c	2017-10-23 17:00:54.441003964 +0100
@@ -2541,7 +2541,7 @@  expand_shift_1 (enum tree_code code, mac
 
 rtx
 expand_shift (enum tree_code code, machine_mode mode, rtx shifted,
-	      int amount, rtx target, int unsignedp)
+	      poly_int64 amount, rtx target, int unsignedp)
 {
   return expand_shift_1 (code, mode, shifted,
 			 gen_int_shift_amount (mode, amount),
Index: gcc/rtlanal.c
===================================================================
--- gcc/rtlanal.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/rtlanal.c	2017-10-23 17:00:54.444001238 +0100
@@ -915,6 +915,28 @@  split_const (rtx x, rtx *base_out, rtx *
   *base_out = x;
   *offset_out = const0_rtx;
 }
+
+/* Express integer value X as some value Y plus a polynomial offset,
+   where Y is either const0_rtx, X or something within X (as opposed
+   to a new rtx).  Return the Y and store the offset in *OFFSET_OUT.  */
+
+rtx
+strip_offset (rtx x, poly_int64_pod *offset_out)
+{
+  rtx base = const0_rtx;
+  rtx test = x;
+  if (GET_CODE (test) == CONST)
+    test = XEXP (test, 0);
+  if (GET_CODE (test) == PLUS)
+    {
+      base = XEXP (test, 0);
+      test = XEXP (test, 1);
+    }
+  if (poly_int_rtx_p (test, offset_out))
+    return base;
+  *offset_out = 0;
+  return x;
+}
 
 /* Return the number of places FIND appears within X.  If COUNT_DEST is
    zero, we do not count occurrences inside the destination of a SET.  */
@@ -3406,13 +3428,15 @@  commutative_operand_precedence (rtx op)
 
   /* Constants always become the second operand.  Prefer "nice" constants.  */
   if (code == CONST_INT)
-    return -8;
+    return -10;
   if (code == CONST_WIDE_INT)
-    return -7;
+    return -9;
+  if (code == CONST_POLY_INT)
+    return -8;
   if (code == CONST_DOUBLE)
-    return -7;
+    return -8;
   if (code == CONST_FIXED)
-    return -7;
+    return -8;
   op = avoid_constant_pool_reference (op);
   code = GET_CODE (op);
 
@@ -3420,13 +3444,15 @@  commutative_operand_precedence (rtx op)
     {
     case RTX_CONST_OBJ:
       if (code == CONST_INT)
-        return -6;
+	return -7;
       if (code == CONST_WIDE_INT)
-        return -6;
+	return -6;
+      if (code == CONST_POLY_INT)
+	return -5;
       if (code == CONST_DOUBLE)
-        return -5;
+	return -5;
       if (code == CONST_FIXED)
-        return -5;
+	return -5;
       return -4;
 
     case RTX_EXTRA:
Index: gcc/rtl-tests.c
===================================================================
--- gcc/rtl-tests.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/rtl-tests.c	2017-10-23 17:00:54.443002147 +0100
@@ -228,6 +228,62 @@  test_uncond_jump ()
 		      jump_insn);
 }
 
+template<unsigned int N>
+struct const_poly_int_tests
+{
+  static void run ();
+};
+
+template<>
+struct const_poly_int_tests<1>
+{
+  static void run () {}
+};
+
+/* Test various CONST_POLY_INT properties.  */
+
+template<unsigned int N>
+void
+const_poly_int_tests<N>::run ()
+{
+  rtx x1 = gen_int_mode (poly_int64 (1, 1), QImode);
+  rtx x255 = gen_int_mode (poly_int64 (1, 255), QImode);
+
+  /* Test that constants are unique.  */
+  ASSERT_EQ (x1, gen_int_mode (poly_int64 (1, 1), QImode));
+  ASSERT_NE (x1, gen_int_mode (poly_int64 (1, 1), HImode));
+  ASSERT_NE (x1, x255);
+
+  /* Test const_poly_int_value.  */
+  ASSERT_MUST_EQ (const_poly_int_value (x1), poly_int64 (1, 1));
+  ASSERT_MUST_EQ (const_poly_int_value (x255), poly_int64 (1, -1));
+
+  /* Test rtx_to_poly_int64.  */
+  ASSERT_MUST_EQ (rtx_to_poly_int64 (x1), poly_int64 (1, 1));
+  ASSERT_MUST_EQ (rtx_to_poly_int64 (x255), poly_int64 (1, -1));
+  ASSERT_MAY_NE (rtx_to_poly_int64 (x255), poly_int64 (1, 255));
+
+  /* Test plus_constant of a symbol.  */
+  rtx symbol = gen_rtx_SYMBOL_REF (Pmode, "foo");
+  rtx offset1 = gen_int_mode (poly_int64 (9, 11), Pmode);
+  rtx sum1 = gen_rtx_CONST (Pmode, gen_rtx_PLUS (Pmode, symbol, offset1));
+  ASSERT_RTX_EQ (plus_constant (Pmode, symbol, poly_int64 (9, 11)), sum1);
+
+  /* Test plus_constant of a CONST.  */
+  rtx offset2 = gen_int_mode (poly_int64 (12, 20), Pmode);
+  rtx sum2 = gen_rtx_CONST (Pmode, gen_rtx_PLUS (Pmode, symbol, offset2));
+  ASSERT_RTX_EQ (plus_constant (Pmode, sum1, poly_int64 (3, 9)), sum2);
+
+  /* Test a cancelling plus_constant.  */
+  ASSERT_EQ (plus_constant (Pmode, sum2, poly_int64 (-12, -20)), symbol);
+
+  /* Test plus_constant on integer constants.  */
+  ASSERT_EQ (plus_constant (QImode, const1_rtx, poly_int64 (4, -2)),
+	     gen_int_mode (poly_int64 (5, -2), QImode));
+  ASSERT_EQ (plus_constant (QImode, x1, poly_int64 (4, -2)),
+	     gen_int_mode (poly_int64 (5, -1), QImode));
+}
+
 /* Run all of the selftests within this file.  */
 
 void
@@ -238,6 +294,7 @@  rtl_tests_c_tests ()
   test_dumping_rtx_reuse ();
   test_single_set ();
   test_uncond_jump ();
+  const_poly_int_tests<NUM_POLY_INT_COEFFS>::run ();
 
   /* Purge state.  */
   set_first_insn (NULL);
Index: gcc/simplify-rtx.c
===================================================================
--- gcc/simplify-rtx.c	2017-10-23 16:52:20.579835373 +0100
+++ gcc/simplify-rtx.c	2017-10-23 17:00:54.445000329 +0100
@@ -2039,6 +2039,26 @@  simplify_const_unary_operation (enum rtx
 	}
     }
 
+  /* Handle polynomial integers.  */
+  else if (CONST_POLY_INT_P (op))
+    {
+      poly_wide_int result;
+      switch (code)
+	{
+	case NEG:
+	  result = -const_poly_int_value (op);
+	  break;
+
+	case NOT:
+	  result = ~const_poly_int_value (op);
+	  break;
+
+	default:
+	  return NULL_RTX;
+	}
+      return immed_wide_int_const (result, mode);
+    }
+
   return NULL_RTX;
 }
 
@@ -2219,6 +2239,7 @@  simplify_binary_operation_1 (enum rtx_co
   rtx tem, reversed, opleft, opright, elt0, elt1;
   HOST_WIDE_INT val;
   scalar_int_mode int_mode, inner_mode;
+  poly_int64 offset;
 
   /* Even if we can't compute a constant result,
      there are some cases worth simplifying.  */
@@ -2531,6 +2552,12 @@  simplify_binary_operation_1 (enum rtx_co
 	    return simplify_gen_binary (MINUS, mode, tem, XEXP (op0, 0));
 	}
 
+      if ((GET_CODE (op0) == CONST
+	   || GET_CODE (op0) == SYMBOL_REF
+	   || GET_CODE (op0) == LABEL_REF)
+	  && poly_int_rtx_p (op1, &offset))
+	return plus_constant (mode, op0, trunc_int_for_mode (-offset, mode));
+
       /* Don't let a relocatable value get a negative coeff.  */
       if (CONST_INT_P (op1) && GET_MODE (op0) != VOIDmode)
 	return simplify_gen_binary (PLUS, mode,
@@ -4325,6 +4352,57 @@  simplify_const_binary_operation (enum rt
       return immed_wide_int_const (result, int_mode);
     }
 
+  /* Handle polynomial integers.  */
+  if (NUM_POLY_INT_COEFFS > 1
+      && is_a <scalar_int_mode> (mode, &int_mode)
+      && poly_int_rtx_p (op0)
+      && poly_int_rtx_p (op1))
+    {
+      poly_wide_int result;
+      switch (code)
+	{
+	case PLUS:
+	  result = wi::to_poly_wide (op0, mode) + wi::to_poly_wide (op1, mode);
+	  break;
+
+	case MINUS:
+	  result = wi::to_poly_wide (op0, mode) - wi::to_poly_wide (op1, mode);
+	  break;
+
+	case MULT:
+	  if (CONST_SCALAR_INT_P (op1))
+	    result = wi::to_poly_wide (op0, mode) * rtx_mode_t (op1, mode);
+	  else
+	    return NULL_RTX;
+	  break;
+
+	case ASHIFT:
+	  if (CONST_SCALAR_INT_P (op1))
+	    {
+	      wide_int shift = rtx_mode_t (op1, mode);
+	      if (SHIFT_COUNT_TRUNCATED)
+		shift = wi::umod_trunc (shift, GET_MODE_PRECISION (int_mode));
+	      else if (wi::geu_p (shift, GET_MODE_PRECISION (int_mode)))
+		return NULL_RTX;
+	      result = wi::to_poly_wide (op0, mode) << shift;
+	    }
+	  else
+	    return NULL_RTX;
+	  break;
+
+	case IOR:
+	  if (!CONST_SCALAR_INT_P (op1)
+	      || !can_ior_p (wi::to_poly_wide (op0, mode),
+			     rtx_mode_t (op1, mode), &result))
+	    return NULL_RTX;
+	  break;
+
+	default:
+	  return NULL_RTX;
+	}
+      return immed_wide_int_const (result, int_mode);
+    }
+
   return NULL_RTX;
 }
 
@@ -6317,13 +6395,27 @@  simplify_subreg (machine_mode outermode,
   scalar_int_mode int_outermode, int_innermode;
   if (is_a <scalar_int_mode> (outermode, &int_outermode)
       && is_a <scalar_int_mode> (innermode, &int_innermode)
-      && (GET_MODE_PRECISION (int_outermode)
-	  < GET_MODE_PRECISION (int_innermode))
       && byte == subreg_lowpart_offset (int_outermode, int_innermode))
     {
-      rtx tem = simplify_truncation (int_outermode, op, int_innermode);
-      if (tem)
-	return tem;
+      /* Handle polynomial integers.  The upper bits of a paradoxical
+	 subreg are undefined, so this is safe regardless of whether
+	 we're truncating or extending.  */
+      if (CONST_POLY_INT_P (op))
+	{
+	  poly_wide_int val
+	    = poly_wide_int::from (const_poly_int_value (op),
+				   GET_MODE_PRECISION (int_outermode),
+				   SIGNED);
+	  return immed_wide_int_const (val, int_outermode);
+	}
+
+      if (GET_MODE_PRECISION (int_outermode)
+	  < GET_MODE_PRECISION (int_innermode))
+	{
+	  rtx tem = simplify_truncation (int_outermode, op, int_innermode);
+	  if (tem)
+	    return tem;
+	}
     }
 
   return NULL_RTX;
@@ -6629,12 +6721,60 @@  test_vector_ops ()
     }
 }
 
+template<unsigned int N>
+struct simplify_const_poly_int_tests
+{
+  static void run ();
+};
+
+template<>
+struct simplify_const_poly_int_tests<1>
+{
+  static void run () {}
+};
+
+/* Test various CONST_POLY_INT properties.  */
+
+template<unsigned int N>
+void
+simplify_const_poly_int_tests<N>::run ()
+{
+  rtx x1 = gen_int_mode (poly_int64 (1, 1), QImode);
+  rtx x2 = gen_int_mode (poly_int64 (-80, 127), QImode);
+  rtx x3 = gen_int_mode (poly_int64 (-79, -128), QImode);
+  rtx x4 = gen_int_mode (poly_int64 (5, 4), QImode);
+  rtx x5 = gen_int_mode (poly_int64 (30, 24), QImode);
+  rtx x6 = gen_int_mode (poly_int64 (20, 16), QImode);
+  rtx x7 = gen_int_mode (poly_int64 (7, 4), QImode);
+  rtx x8 = gen_int_mode (poly_int64 (30, 24), HImode);
+  rtx x9 = gen_int_mode (poly_int64 (-30, -24), HImode);
+  rtx x10 = gen_int_mode (poly_int64 (-31, -24), HImode);
+  rtx two = GEN_INT (2);
+  rtx six = GEN_INT (6);
+  HOST_WIDE_INT offset = subreg_lowpart_offset (QImode, HImode);
+
+  /* These tests only try limited operation combinations.  Fuller arithmetic
+     testing is done directly on poly_ints.  */
+  ASSERT_EQ (simplify_unary_operation (NEG, HImode, x8, HImode), x9);
+  ASSERT_EQ (simplify_unary_operation (NOT, HImode, x8, HImode), x10);
+  ASSERT_EQ (simplify_unary_operation (TRUNCATE, QImode, x8, HImode), x5);
+  ASSERT_EQ (simplify_binary_operation (PLUS, QImode, x1, x2), x3);
+  ASSERT_EQ (simplify_binary_operation (MINUS, QImode, x3, x1), x2);
+  ASSERT_EQ (simplify_binary_operation (MULT, QImode, x4, six), x5);
+  ASSERT_EQ (simplify_binary_operation (MULT, QImode, six, x4), x5);
+  ASSERT_EQ (simplify_binary_operation (ASHIFT, QImode, x4, two), x6);
+  ASSERT_EQ (simplify_binary_operation (IOR, QImode, x4, two), x7);
+  ASSERT_EQ (simplify_subreg (HImode, x5, QImode, 0), x8);
+  ASSERT_EQ (simplify_subreg (QImode, x8, HImode, offset), x5);
+}
+
 /* Run all of the selftests within this file.  */
 
 void
 simplify_rtx_c_tests ()
 {
   test_vector_ops ();
+  simplify_const_poly_int_tests<NUM_POLY_INT_COEFFS>::run ();
 }
 
 } // namespace selftest
Index: gcc/wide-int.h
===================================================================
--- gcc/wide-int.h	2017-10-23 17:00:20.923835582 +0100
+++ gcc/wide-int.h	2017-10-23 17:00:54.445999420 +0100
@@ -613,6 +613,7 @@  #define SHIFT_FUNCTION \
      access.  */
   struct storage_ref
   {
+    storage_ref () {}
     storage_ref (const HOST_WIDE_INT *, unsigned int, unsigned int);
 
     const HOST_WIDE_INT *val;
@@ -944,6 +945,8 @@  struct wide_int_ref_storage : public wi:
   HOST_WIDE_INT scratch[2];
 
 public:
+  wide_int_ref_storage () {}
+
   wide_int_ref_storage (const wi::storage_ref &);
 
   template <typename T>
@@ -1323,7 +1326,7 @@  typedef generic_wide_int <trailing_wide_
    bytes beyond the sizeof need to be allocated.  Use set_precision
    to initialize the structure.  */
 template <int N>
-class GTY(()) trailing_wide_ints
+class GTY((user)) trailing_wide_ints
 {
 private:
   /* The shared precision of each number.  */
@@ -1340,9 +1343,14 @@  class GTY(()) trailing_wide_ints
   HOST_WIDE_INT m_val[1];
 
 public:
+  typedef WIDE_INT_REF_FOR (trailing_wide_int_storage) const_reference;
+
   void set_precision (unsigned int);
+  unsigned int get_precision () const { return m_precision; }
   trailing_wide_int operator [] (unsigned int);
+  const_reference operator [] (unsigned int) const;
   static size_t extra_size (unsigned int);
+  size_t extra_size () const { return extra_size (m_precision); }
 };
 
 inline trailing_wide_int_storage::
@@ -1414,6 +1422,14 @@  trailing_wide_ints <N>::operator [] (uns
 				    &m_val[index * m_max_len]);
 }
 
+template <int N>
+inline typename trailing_wide_ints <N>::const_reference
+trailing_wide_ints <N>::operator [] (unsigned int index) const
+{
+  return wi::storage_ref (&m_val[index * m_max_len],
+			  m_len[index], m_precision);
+}
+
 /* Return how many extra bytes need to be added to the end of the structure
    in order to handle N wide_ints of precision PRECISION.  */
 template <int N>