Patchwork wide int patch #6: Replacement of hwi extraction from int-csts.

Submitter Kenneth Zadeck
Date Oct. 17, 2012, 9:47 p.m.
Message ID <507F278C.4050305@naturalbridge.com>
Permalink /patch/192147/
State New

Comments

Kenneth Zadeck - Oct. 17, 2012, 9:47 p.m.
Richi,

I apologize for the size of this patch, but it only does one very small 
thing; it just does it all over the middle end.

This patch introduces a new API for extracting a signed or unsigned hwi 
from an integer cst.  There was an API for doing this before, but it had 
some problems, which are fixed here, and it was only used some of the time.

The new API consists of six functions: three for testing whether the 
constant in the int cst will fit in a hwi, and three for actually 
pulling out the value.

The ones for testing are tree_fits_uhwi_p, tree_fits_shwi_p, and 
tree_fits_hwi_p.  The first two are the unsigned and signed versions; 
the third takes a boolean parameter which is true if the value is 
positive.  That third function replaces host_integerp, which had 
basically the same functionality as tree_fits_hwi_p.  The reason for 
changing it is that there were about 400 calls to host_integerp, and all 
but 4 of them passed a constant parameter.  The new names are chosen to 
be similar to the corresponding functions in wide-int, and they are more 
mnemonic than the existing name because they use the more readable u and 
s prefixes that a lot of other places already do.

On the accessing side, there are tree_to_uhwi, tree_to_shwi, and 
tree_to_hwi.  The first two do checking when checking is enabled; the 
last does no checking.

The expectation is that, normally, an unchecked access follows an 
explicit check, or, if there is no explicit check, one of the checked 
accessors is used instead.  This is not the case for places that just 
want to look at the bottom bits of some large number, such as the 
several places that compute alignment; those just use the naked 
unchecked access.

There were a lot of places that either did no checking or did the 
checking inline.  This patch tracks down all of those places so that 
they now use the same API.

There are three places where I found what appear to be bugs in the 
existing code.  These are places where the code did an unsigned check 
and then followed it with a signed access (or vice versa).  They are 
marked with comments containing the string "THE NEXT LINE IS POSSIBLY 
WRONG".  With some guidance from the reviewer, I will fix these and 
remove those comments.

Aside from that, this patch is very boring, because it just makes this 
one transformation.  Look for the interesting stuff in tree.h; 
everything else is just forcing everywhere to use a single API.

This patch could go in, with a little work, without the other 5 patches, 
or it can wait for them.

kenny
2012-10-09  Kenneth Zadeck  <zadeck@naturalbridge.com>

	* tree.c (int_size_in_bytes, max_int_size_in_bytes,
	bit_position, byte_position, free_lang_data_in_decl,
	tree_int_cst_compare, compare_tree_int,
	build_nonstandard_integer_type, get_narrower,
	build_vector_type_for_mode, int_cst_value,
	widest_int_cst_value, get_binfo_at_offset): Replaced access of
	lower HWI from int cst with new api.
	* tree.h (PREDICT_EXPR_PREDICTOR, VL_EXP_OPERAND_LENGTH)
	(CHREC_VARIABLE): Replaced access of lower HWI from int cst with
	new api.
	(tree_fits_uhwi_p, tree_fits_shwi_p, tree_fits_hwi_p,
	tree_to_uhwi, tree_to_shwi, tree_to_hwi, tree_low_cst): New functions.
	* ada/gcc-interface/cuintp.c (UI_From_gnu): Replaced access of
	lower HWI from int cst with new api.
	* ada/gcc-interface/decl.c (gnat_to_gnu_entity,
	gnat_to_gnu_component_type, allocatable_size_p,
	gnat_to_gnu_field, create_field_decl_from): Ditto.
	* ada/gcc-interface/misc.c (gnat_type_max_size,
	default_pass_by_ref): Ditto.
	* ada/gcc-interface/trans.c (Call_to_gnu): Ditto.
	* ada/gcc-interface/utils.c (make_aligning_type,
	make_packable_type, make_type_from_size,
	rest_of_record_type_compilation, create_field_decl,
	invalidate_global_renaming_pointers, potential_alignment_gap,
	build_vms_descriptor32, build_vms_descriptor,
	convert_vms_descriptor64, convert_vms_descriptor32,
	unchecked_convert, get_nonnull_operand,
	handle_vector_size_attribute, handle_vector_type_attribute): Ditto.
	* ada/gcc-interface/utils2.c (known_alignment,
	nonbinary_modular_operation): Ditto.
	* alias.c (ao_ref_from_mem, adjust_offset_for_component_ref): Ditto.
	* builtins.c (get_object_alignment_2, get_pointer_alignment_1,
	c_strlen, c_getstr, target_char_cast,
	expand_builtin_mempcpy_args, expand_builtin_strncpy,
	expand_builtin_memset_args,
	expand_builtin_frame_address, expand_builtin_alloca,
	expand_builtin_atomic_compare_exchange, fold_builtin_powi,
	fold_builtin_memset, fold_builtin_memory_op,
	fold_builtin_memchr, fold_builtin_memcmp,
	fold_builtin_strncmp, fold_builtin_load_exponent,
	fold_builtin_snprintf, expand_builtin_object_size,
	expand_builtin_memory_chk, maybe_emit_chk_warning,
	maybe_emit_sprintf_chk_warning, fold_builtin_object_size,
	fold_builtin_memory_chk, fold_builtin_stxcpy_chk,
	fold_builtin_stxncpy_chk, fold_builtin_strcat_chk,
	fold_builtin_strncat_chk, fold_builtin_sprintf_chk_1,
	fold_builtin_snprintf_chk_1, do_mpfr_bessel_n): Ditto.
	* c-family/c-ada-spec.c (is_simple_enum, dump_generic_ada_node): Ditto.
	* c-family/c-common.c (finish_fname_decls, c_type_hash,
	match_case_to_enum_1, get_priority,
	handle_alloc_size_attribute, handle_vector_size_attribute,
	check_function_sentinel, get_nonnull_operand,
	parse_optimize_options, check_function_arguments_recurse,
	c_parse_error, fold_offsetof_1, complete_array_type,
	sync_resolve_size, get_atomic_generic_size,
	warn_for_sign_compare,
	convert_vector_to_pointer_for_subscript): Ditto.
	* c-family/c-cppbuiltin.c (cpp_atomic_builtins): Ditto.
	* c-family/c-format.c (check_format_string, check_format_arg): Ditto.
	* c-family/c-pragma.c (handle_pragma_pack): Ditto.
	* c-family/c-pretty-print.c (pp_c_direct_abstract_declarator,
	pp_c_integer_constant, pp_c_character_constant,
	pp_c_postfix_expression): Ditto.
	* c/c-decl.c (check_bitfield_type_and_width, grokdeclarator,
	finish_struct): Ditto.
	* c/c-parser.c (c_lex_one_token, c_parser_omp_clause_collapse,
	c_parser_omp_for_loop): Ditto.
	* c/c-typeck.c (push_init_level, process_init_element,
	c_finish_bc_stmt): Ditto.
	* cfgexpand.c (add_stack_var, expand_one_stack_var,
	defer_stack_allocation, expand_one_var,
	stack_protect_classify_type): Ditto.
	* cgraphunit.c (expand_function): Ditto.
	* config/alpha/alpha.c (va_list_skip_additions,
	alpha_stdarg_optimize_hook): Ditto.
	* config/alpha/predicates.md (): Ditto.
	* config/arm/arm.c (aapcs_vfp_sub_candidate,
	arm_have_conditional_execution): Ditto.
	* config/avr/avr.c (avr_fold_builtin): Ditto.
	* config/bfin/bfin.c (bfin_local_alignment): Ditto.
	* config/c6x/predicates.md (): Ditto.
	* config/darwin.c (darwin_mergeable_constant_section,
	machopic_select_section, darwin_asm_declare_object_name): Ditto.
	* config/epiphany/epiphany.c (
	epiphany_special_round_type_align,
	epiphany_adjust_field_align): Ditto.
	* config/i386/i386.c (ix86_function_regparm,
	ix86_keep_aggregate_return_pointer, classify_argument,
	ix86_data_alignment, ix86_local_alignment,
	ix86_builtin_tm_store, get_element_number): Ditto.
	* config/ia64/predicates.md (): Ditto.
	* config/iq2000/iq2000.c (iq2000_function_arg): Ditto.
	* config/m32c/m32c-pragma.c (m32c_pragma_address): Ditto.
	* config/m32c/m32c.c (function_vector_handler,
	current_function_special_page_vector): Ditto.
	* config/mep/mep-pragma.c (mep_pragma_coprocessor_width,
	mep_pragma_coprocessor_subclass): Ditto.
	* config/mep/mep.c (mep_attrlist_to_encoding,
	mep_insert_attributes, mep_output_aligned_common): Ditto.
	* config/mips/mips.c (mips_function_arg, r10k_safe_mem_expr_p): Ditto.
	* config/picochip/picochip.c (picochip_compute_arg_size): Ditto.
	* config/rs6000/rs6000-c.c (altivec_resolve_overloaded_builtin): Ditto.
	* config/rs6000/rs6000.c (
	rs6000_builtin_support_vector_misalignment,
	offsettable_ok_by_alignment,
	rs6000_darwin64_record_arg_advance_recurse,
	rs6000_darwin64_record_arg_recurse,
	rs6000_expand_binop_builtin, altivec_expand_predicate_builtin,
	rs6000_expand_ternop_builtin, altivec_expand_dst_builtin,
	get_element_number, altivec_expand_builtin,
	spe_expand_builtin, paired_expand_predicate_builtin,
	spe_expand_predicate_builtin): Ditto.
	* config/s390/s390.c (s390_encode_section_info): Ditto.
	* config/sh/sh.c (sh_print_operand,
	sh2a_handle_function_vector_handler_attribute,
	sh2a_get_function_vector_number): Ditto.
	* config/sol2-c.c (solaris_pragma_align): Ditto.
	* config/sparc/sparc.c (function_arg_record_value_1,
	function_arg_record_value_2, sparc_struct_value_rtx,
	sparc_handle_vis_mul8x16, sparc_fold_builtin): Ditto.
	* config/vms/vms-c.c (handle_pragma_pointer_size): Ditto.
	* coverage.c (build_fn_info): Ditto.
	* cp/call.c (build_array_conv): Ditto.
	* cp/class.c (layout_class_type, dump_class_hierarchy_r,
	dump_class_hierarchy_1, dump_array, dump_thunk): Ditto.
	* cp/cp-tree.h (TMPL_PARMS_DEPTH): Ditto.
	* cp/decl.c (check_array_designated_initializer,
	reshape_init_array_1, grokdeclarator): Ditto.
	* cp/dump.c (cp_dump_tree): Ditto.
	* cp/error.c (dump_type_suffix,
	resolve_virtual_fun_from_obj_type_ref, dump_expr): Ditto.
	* cp/init.c (build_vec_init): Ditto.
	* cp/mangle.c (write_integer_cst): Ditto.
	* cp/method.c (make_thunk, use_thunk): Ditto.
	* cp/parser.c (cp_lexer_get_preprocessor_token,
	cp_parser_userdef_string_literal,
	cp_parser_omp_clause_collapse, cp_parser_omp_for_loop): Ditto.
	* cp/semantics.c (cxx_eval_array_reference,
	cxx_eval_bit_field_ref, cxx_eval_vec_init_1,
	cxx_fold_indirect_ref): Ditto.
	* cp/tree.c (debug_binfo,
	handle_init_priority_attribute): Ditto.
	* cp/typeck2.c (digest_init_r): Ditto.
	* cppbuiltin.c (define_builtin_macros_for_type_sizes): Ditto.
	* dbxout.c (dbxout_type_fields, dbxout_type_method_1,
	dbxout_range_type, dbxout_type, dbxout_expand_expr,
	dbxout_symbol): Ditto.
	* dojump.c (do_jump): Ditto.
	* dwarf2out.c (simple_type_size_in_bits, dw_sra_loc_expr,
	loc_list_from_tree, add_data_member_location_attribute,
	fortran_common, native_encode_initializer, add_bound_info,
	add_bit_offset_attribute, add_bit_size_attribute,
	add_pure_or_virtual_attribute, descr_info_loc,
	add_descr_info_field, gen_descr_array_type_die,
	gen_enumeration_type_die): Ditto.
	* emit-rtl.c (get_mem_align_offset,
	set_mem_attributes_minus_bitpos, widen_memory_access): Ditto.
	* except.c (init_eh, expand_builtin_eh_common,
	expand_builtin_eh_return_data_regno, collect_one_action_chain): Ditto.
	* explow.c (int_expr_size): Ditto.
	* expr.c (get_bit_range, count_type_elements,
	categorize_ctor_elements_1, store_constructor,
	get_inner_reference, highest_pow2_factor,
	expand_constructor, expand_expr_real_2, expand_expr_real_1,
	is_aligning_offset, string_constant): Ditto.
	* expr.h (ADD_PARM_SIZE, SUB_PARM_SIZE): Ditto.
	* fold-const.c (negate_expr_p, fold_negate_expr,
	make_bit_field_ref, extract_muldiv_1, fold_single_bit_test,
	fold_plusminus_mult_expr, native_encode_string,
	fold_unary_loc, fold_comparison,
	get_pointer_modulus_and_residue, fold_binary_loc,
	fold_ternary_loc, multiple_of_p,
	tree_call_nonnegative_warnv_p, fold_read_from_constant_string,
	fold_indirect_ref_1): Ditto.
	* fortran/target-memory.c (encode_derived,
	gfc_interpret_derived, expr_to_char): Ditto.
	* fortran/trans-array.c (gfc_trans_array_constructor_value): Ditto.
	* fortran/trans-common.c (build_common_decl): Ditto.
	* fortran/trans-const.c (gfc_conv_string_init): Ditto.
	* fortran/trans-decl.c (gfc_can_put_var_on_stack): Ditto.
	* fortran/trans-expr.c (gfc_string_to_single_character,
	gfc_optimize_len_trim): Ditto.
	* fortran/trans-io.c (gfc_build_io_library_fndecls): Ditto.
	* fortran/trans-types.c (gfc_get_dtype): Ditto.
	* function.c (locate_and_pad_parm, pad_below): Ditto.
	* gimple-fold.c (gimple_extract_devirt_binfo_from_cst,
	gimple_fold_call, get_base_constructor,
	fold_const_aggregate_ref_1, gimple_get_virt_method_for_binfo,
	gimple_val_nonnegative_real_p): Ditto.
	* gimple-pretty-print.c (dump_gimple_call): Ditto.
	* gimple-ssa-strength-reduction.c (stmt_cost): Ditto.
	* gimple.c (gimple_compare_field_offset): Ditto.
	* gimplify.c (gimple_add_tmp_var, gimple_fold_indirect_ref): Ditto.
	* go/go-gcc.cc (Gcc_backend::type_size(Btype*)): Ditto.
	* go/gofrontend/expressions.cc (
	Type_conversion_expression::do_get_tree(Translate_context*)): Ditto.
	* godump.c (go_format_type, go_output_typedef): Ditto.
	* graphite-scop-detection.c (graphite_can_represent_init): Ditto.
	* graphite-sese-to-poly.c (pdr_add_data_dimensions): Ditto.
	* ipa-prop.c (ipa_print_node_jump_functions_for_edge,
	type_like_member_ptr_p, determine_known_aggregate_parts,
	ipa_analyze_virtual_call_uses): Ditto.
	* java/boehm.c (uses_jv_markobj_p): Ditto.
	* java/class.c (get_dispatch_vector): Ditto.
	* java/expr.c (build_newarray, build_anewarray, build_jni_stub): Ditto.
	* java/typeck.c (java_array_type_length): Ditto.
	* lto-streamer-out.c (write_symbol): Ditto.
	* lto/lto-lang.c (get_nonnull_operand): Ditto.
	* objc/objc-act.c (check_string_class_template,
	objc_decl_method_attributes, gen_declaration, gen_type_name_0): Ditto.
	* objc/objc-encoding.c (encode_array, encode_vector,
	encode_field): Ditto.
	* objc/objc-next-runtime-abi-01.c (
	generate_v1_objc_protocol_extension,
	generate_v1_property_table, build_v1_category_initializer,
	generate_objc_class_ext): Ditto.
	* objc/objc-next-runtime-abi-02.c (
	generate_v2_meth_descriptor_table, generate_v2_property_table,
	build_v2_protocol_initializer, generate_v2_dispatch_table,
	build_v2_ivar_list_initializer, generate_v2_ivars_list,
	generate_v2_class_structs): Ditto.
	* omp-low.c (expand_omp_atomic, lower_omp_for_lastprivate): Ditto.
	* predict.c (strips_small_constant,
	is_comparison_with_loop_invariant_p, predict_iv_comparison,
	predict_loops): Ditto.
	* rtlanal.c (rtx_addr_can_trap_p_1): Ditto.
	* sdbout.c (plain_type_1, sdbout_field_types, sdbout_one_type): Ditto.
	* simplify-rtx.c (delegitimize_mem_from_attrs): Ditto.
	* stmt.c (dump_case_nodes, expand_switch_as_decision_tree_p,
	emit_case_dispatch_table): Ditto.
	* stor-layout.c (mode_for_size_tree, mode_for_array,
	layout_decl, excess_unit_span, place_field,
	compute_record_mode, finish_bitfield_representative,
	finish_bitfield_layout): Ditto.
	* targhooks.c (default_mangle_decl_assembler_name): Ditto.
	* testsuite/gcc.dg/20020219-1.c: Fixed comment.
	* trans-mem.c (tm_log_add, tm_log_emit_stmt, build_tm_load,
	build_tm_store): Ditto.
	* tree-cfg.c (verify_expr, verify_expr,
	move_stmt_eh_region_tree_nr): Ditto.
	* tree-data-ref.c (gcd_of_steps_may_divide_p): Ditto.
	* tree-dfa.c (get_ref_base_and_extent): Ditto.
	* tree-flow-inline.h (get_addr_base_and_unit_offset_1): Ditto.
	* tree-inline.c (remap_eh_region_tree_nr): Ditto.
	* tree-object-size.c (compute_object_offset,
	addr_object_size, alloc_object_size, plus_stmt_object_size,
	compute_object_sizes): Ditto.
	* tree-pretty-print.c (dump_array_domain, dump_generic_node): Ditto.
	* tree-sra.c (type_internals_preclude_sra_p,
	completely_scalarize_record, compare_access_positions,
	make_fancy_name_1, build_user_friendly_ref_for_offset,
	maybe_add_sra_candidate, expr_with_var_bounded_array_refs_p,
	analyze_all_variable_accesses, sra_modify_expr,
	find_param_candidates, splice_param_accesses,
	decide_one_param_reduction): Ditto.
	* tree-ssa-address.c (copy_ref_info): Ditto.
	* tree-ssa-alias.c (ao_ref_init_from_ptr_and_size,
	stmt_kills_ref_p_1): Ditto.
	* tree-ssa-ccp.c (ccp_finalize, bit_value_assume_aligned,
	evaluate_stmt, fold_builtin_alloca_with_align): Ditto.
	* tree-ssa-forwprop.c (simplify_builtin_call,
	simplify_bitfield_ref, is_combined_permutation_identity,
	simplify_vector_constructor): Ditto.
	* tree-ssa-loop-ivcanon.c (try_unroll_loop_completely): Ditto.
	* tree-ssa-loop-ivopts.c (get_loop_invariant_expr_id,
	iv_period): Ditto.
	* tree-ssa-loop-niter.c (number_of_iterations_ne_max,
	number_of_iterations_ne): Ditto.
	* tree-ssa-loop-prefetch.c (analyze_ref, add_subscript_strides,
	self_reuse_distance): Ditto.
	* tree-ssa-math-opts.c (execute_cse_sincos, find_bswap_1,
	find_bswap): Ditto.
	* tree-ssa-phiopt.c (add_or_mark_expr, hoist_adjacent_loads): Ditto.
	* tree-ssa-reassoc.c (decrement_power, acceptable_pow_call): Ditto.
	* tree-ssa-sccvn.c (vn_reference_eq,
	copy_reference_ops_from_ref, ao_ref_init_from_vn_reference,
	vn_reference_fold_indirect,
	vn_reference_maybe_forwprop_address,
	fully_constant_vn_reference_p, vn_reference_lookup_3,
	simplify_binary_expression): Ditto.
	* tree-ssa-strlen.c (get_stridx, adjust_last_stmt,
	handle_builtin_memcpy, handle_pointer_plus): Ditto.
	* tree-ssa-structalias.c (process_constraint,
	get_constraint_for_1, push_fields_onto_fieldstack,
	create_variable_info_for_1): Ditto.
	* tree-stdarg.c (va_list_counter_bump,
	check_all_va_list_escapes): Ditto.
	* tree-switch-conversion.c (emit_case_bit_tests,
	create_temp_arrays): Ditto.
	* tree-vect-data-refs.c (vect_get_smallest_scalar_type,
	vect_drs_dependent_in_basic_block, vect_check_interleaving,
	vect_compute_data_ref_alignment,
	vect_verify_datarefs_alignment, vect_analyze_group_access,
	vect_analyze_data_ref_access, vect_check_gather,
	vect_supportable_dr_alignment): Ditto.
	* tree-vect-generic.c (expand_vector_piecewise,
	expand_vector_parallel, expand_vector_addition,
	expand_vector_divmod, vector_element, lower_vec_perm): Ditto.
	* tree-vect-loop.c (vect_analyze_loop_form,
	vect_model_reduction_cost, get_initial_def_for_reduction,
	vect_create_epilog_for_reduction): Ditto.
	* tree-vect-patterns.c (vect_recog_pow_pattern,
	vect_recog_divmod_pattern): Ditto.
	* tree-vect-stmts.c (vectorizable_load): Ditto.
	* tree-vectorizer.h (LOOP_VINFO_INT_NITERS, NITERS_KNOWN_P): Ditto.
	* tree-vrp.c (extract_range_from_binary_expr_1,
	register_edge_assert_for_2): Ditto.
	* var-tracking.c (prepare_call_arguments,
	emit_note_insn_var_location): Ditto.
	* varasm.c (get_block_for_decl, assemble_noswitch_variable,
	assemble_variable_contents, decode_addr_const, output_constant,
	output_constructor_array_range,
	output_constructor_regular_field, output_constructor_bitfield,
	place_block_symbol, output_object_block): Ditto.
	* varpool.c (varpool_remove_node): Ditto.
Richard Guenther - Oct. 18, 2012, 10:22 a.m.

Just a quick note here - the changelog mentions tree_low_cst (as a new
function!?) but not host_integerp.  You should probably get rid of both,
otherwise uses will creep back in as people are familiar with them
(I'm not sure these changes for consistency are always good ...)

I don't like that tree_to_hwi does not do any checking.  In fact I don't
like that it _exists_; after all, it has a return type whose signedness
does not magically change.  Unchecked conversions should use
TREE_LOW_CST.

Thus, my 2 cents before I really look at the patch (which will likely
be next week only, so maybe you can do a followup with the above
suggestions).

Thanks,
Richard.

Kenneth Zadeck - Oct. 18, 2012, 12:52 p.m.
On 10/18/2012 06:22 AM, Richard Biener wrote:
> Just a quick note here - the changelog mentions tree_low_cst (as new
> function!?) but not host_integerp.  You should probably get rid of both,
> otherwise uses will creap back as people are familiar with them
> (I'm not sure these changes for consistency are always good ...)
I will fix this.
> I don't like that tree_to_hwi does not do any checking.  In fact I don't
> like that it _exists_, after all it has a return type which signedness
> does not magically change.  Unchecked conversions should use
> TREE_LOW_CST.
The idea is that when wide-int goes in, there is actually no 
TREE_INT_CST_LOW.  The concepts of low and high go out the door; the 
int-cst will have an array in it that is big enough to hold the value, 
so tree_to_hwi becomes shorthand for just accessing the lowest element 
of the array.

You could argue that you should say tree_fits_uhwi_p followed by 
tree_to_uhwi (and the same for signed).  That is an easy fix; it just 
seemed a little redundant.

I should also point out that about 2/3 of this patch is going to die as 
the rest of the wide-int work goes in.  But I did not want to submit a 
patch that only converted 1/3 of the cases.  The truth is that most of 
the places doing this conversion do so only because people were too lazy 
to do the math at the full precision of the double-int.
Richard Guenther - Oct. 18, 2012, 12:58 p.m.
On Thu, Oct 18, 2012 at 2:52 PM, Kenneth Zadeck
<zadeck@naturalbridge.com> wrote:
>
> On 10/18/2012 06:22 AM, Richard Biener wrote:
>>
>> Just a quick note here - the changelog mentions tree_low_cst (as new
>> function!?) but not host_integerp.  You should probably get rid of both,
>> otherwise uses will creap back as people are familiar with them
>> (I'm not sure these changes for consistency are always good ...)
>
> i will fix this.
>
>> I don't like that tree_to_hwi does not do any checking.  In fact I don't
>> like that it _exists_, after all it has a return type which signedness
>> does not magically change.  Unchecked conversions should use
>> TREE_LOW_CST.
>
> the idea is that when wide-int goes in, there is actually no
> TREE_INT_CST_LOW.   The concept of low and high go out the door. the int-cst
> will have an array in it that is big enough to hold the value.
> so tree_to_hwi becomes a short hand for just accessing the lower element of
> the array.
>
> you could argue that you should say tree_fits_uhwi_p followed by
> tree_to_uhwi (and the same for signed).   This is an easy fix.   it just
> seemed a little redundant.

Well, if you want raw access to the lower element (when do you actually
want that, when not in wide-int.c/h?) ... you still need to care about the
signedness of the result.  And tree_fits_uhwi_p does not return the
same as tree_fits_shwi_p all the time.

I don't see any goodness in tree_to_hwi or tree_fits_hwi, really.  Because
if you just access the lower word, it still has a sign (either
HOST_WIDE_INT or unsigned HOST_WIDE_INT).  We should get rid
of those places - can you enumerate them?  I think you said it was just
a few callers with a variable signedness argument.

Richard.

Kenneth Zadeck - Oct. 18, 2012, 2 p.m.
You know, Richi, I did not know who I was actually talking to.  I said, 
"who is this Richard Biener person?" and then I saw the email address.


On 10/18/2012 08:58 AM, Richard Biener wrote:
> On Thu, Oct 18, 2012 at 2:52 PM, Kenneth Zadeck
> <zadeck@naturalbridge.com> wrote:
>> On 10/18/2012 06:22 AM, Richard Biener wrote:
>>> On Wed, Oct 17, 2012 at 11:47 PM, Kenneth Zadeck
>>> <zadeck@naturalbridge.com> wrote:
>>>> Richi,
>>>>
>>>> I apologize for the size of this patch, but it only does one very small
>>>> thing, it is just that it does it all over the middle end.
>>>>
>>>> This patch introduces a new api for extracting a signed or unsigned hwi
>>>> from
>>>> an integer cst.  There had been an abi for doing this, but it has some
>>>> problems that were fixed here, and it was only used sometimes.
>>>>
>>>> The new abi consists of 6 functions, three for testing if the constant in
>>>> the int cst will fit and three for actually pulling out the value.
>>>>
>>>> The ones for testing are tree_fits_uhwi_p, tree_fits_shwi_p, and
>>>> tree_fits_hwi_p.   The first two of these are the unsigned and signed
>>>> versions, and the third takes a boolean parameter which is true if the
>>>> value is positive.   The third replaces host_integerp, which has
>>>> basically the same functionality as tree_fits_hwi_p.   The reason for
>>>> changing this is that there were about 400 calls to host_integerp and
>>>> all but 4 of them took a constant parameter. These names are chosen to
>>>> be similar to the corresponding functions in wide-int and are more
>>>> mnemonic than the existing name, since they use the more readable u and
>>>> s prefixes that a lot of other places do.
>>>>
>>>> On the accessing side, there is tree_to_uhwi, tree_to_shwi, and
>>>> tree_to_hwi.
>>>> The first two do checking when checking is enabled. The last does no
>>>> checking.
>>> Just a quick note here - the changelog mentions tree_low_cst (as new
>>> function!?) but not host_integerp.  You should probably get rid of both,
>>> otherwise uses will creep back as people are familiar with them
>>> (I'm not sure these changes for consistency are always good ...)
>> i will fix this.
these are bugs in the changelog, not the code.   new changelog included.
>>
>>> I don't like that tree_to_hwi does not do any checking.  In fact I don't
>>> like that it _exists_, after all it has a return type which signedness
>>> does not magically change.  Unchecked conversions should use
>>> TREE_LOW_CST.
>> the idea is that when wide-int goes in, there is actually no
>> TREE_INT_CST_LOW.   The concept of low and high go out the door. the int-cst
>> will have an array in it that is big enough to hold the value.
>> so tree_to_hwi becomes a shorthand for just accessing the lower element of
>> the array.
>>
>> you could argue that you should say tree_fits_uhwi_p followed by
>> tree_to_uhwi (and the same for signed).   This is an easy fix.   it just
>> seemed a little redundant.
> Well, if you want raw access to the lower element (when do you actually
> want that, when not in wide-int.c/h?) ... you still need to care about the
> signedness of the result.  And tree_fits_uhwi_p does not return the
> same as tree_fits_shwi_p all the time.
>
> I don't see any goodness in tree_to_hwi nor tree_fits_hwi really.  Because
> if you just access the lower word then that still has a sign (either
> HOST_WIDE_INT or unsigned HOST_WIDE_INT).  We should get rid
> of those places - can you enumerate them?  I think you said it was just
> a few callers with variable signedness argument.
Note that tree_fits_hwi_p does check.   it just takes a parameter to say 
if it wants signed or unsigned checking (it is the old host_integerp, 
repackaged).   You really do need this function as it is for the 4 or 5 
places it is called.  The parameter to it is typically, but not always, 
the sign of the type of the int cst being passed to it.

it is tree_to_hwi that is unchecked.  Most of the places that use it are 
identified with comments, which, unlike a changelog entry, survive with 
the code.   (i happen to be in the group of people who think changelogs 
are useless, and that we should do a better job of commenting the code.)

I do not know if this is sloppiness or not, but the signedness that is 
checked rarely matches the sign of the variable that the value is 
assigned to.  I found this quite frustrating when i was making the 
patch, but this kind of thing is common in code where the original 
writer "knew what (s)he was doing."  Unless you are doing comparisons or 
shifting, the signedness of the target does not really make much difference.

if you want me to change the sequences of explicit checking and 
unchecked access to explicit checking followed by a checked access, then 
i am happy to do this.

kenny
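
To make the check-then-extract discipline being discussed concrete, here is a small standalone sketch of the six-function api. Everything below is a toy model built on illustrative assumptions (the `int_cst` struct mimicking the old double_int low/high layout, and `long` standing in for HOST_WIDE_INT); it is not GCC's actual tree representation.

```cpp
#include <cassert>
#include <climits>

typedef long shwi;            // stand-in for HOST_WIDE_INT (assumption)
typedef unsigned long uhwi;   // stand-in for unsigned HOST_WIDE_INT

// Toy stand-in for an INTEGER_CST: value = (high << word bits) | low,
// with 'high' sign-extended.  Names and layout are illustrative only.
struct int_cst {
  uhwi low;   // least-significant host word
  shwi high;  // most-significant host word, sign-extended
};

// Fits in an unsigned HWI iff the high word is zero.
static bool tree_fits_uhwi_p (const int_cst &c)
{ return c.high == 0; }

// Fits in a signed HWI iff the value is in [LONG_MIN, LONG_MAX].
static bool tree_fits_shwi_p (const int_cst &c)
{
  return (c.high == 0 && c.low <= (uhwi) LONG_MAX)
         || (c.high == -1 && c.low > (uhwi) LONG_MAX);
}

// The sign-parameterized test (the repackaged host_integerp).
static bool tree_fits_hwi_p (const int_cst &c, bool uns)
{ return uns ? tree_fits_uhwi_p (c) : tree_fits_shwi_p (c); }

// Checked extraction: assert the fit, then hand back the low word.
static uhwi tree_to_uhwi (const int_cst &c)
{ assert (tree_fits_uhwi_p (c)); return c.low; }

static shwi tree_to_shwi (const int_cst &c)
{ assert (tree_fits_shwi_p (c)); return (shwi) c.low; }  // modular reinterpretation

// Unchecked raw access to the low word, e.g. for alignment computations
// that only care about the bottom bits.
static uhwi tree_to_hwi (const int_cst &c)
{ return c.low; }
```

The idiom the patch enforces is `tree_fits_shwi_p (t)` followed by `tree_to_shwi (t)` (likewise for the unsigned pair), with the naked `tree_to_hwi` reserved for the low-bits-only callers.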
> Richard.
>
>> I should also point out that about 2/3 of this patch is going to die as the
>> rest of the wide int stuff goes in.   But i did not want to submit a patch
>> that only converted 1/3 of the cases.   The truth is that most of these
>> places where you are doing this conversion are just because the people were
>> too lazy to do the math at the full precision of the double int.
>>
>>> Thus, my 2 cents before I really look at the patch (which will likely
>>> be next week only, so maybe you can do a followup with the above
>>> suggestions).
>>>
>>> Thanks,
>>> Richard.
>>>
>>>> It is expected that, normally, an unchecked access follows an explicit
>>>> check, or, if there is no explicit check, then one of the checked
>>>> accessors is used.  This is not always the case for places that just
>>>> want to look at the bottom bits of some large number, such as several
>>>> places that compute alignment.
>>>> These just use the naked unchecked access.
>>>>
>>>> There were a lot of places that either did no checking or did the
>>>> checking
>>>> inline.   This patch tracks down all of those places and they now use the
>>>> same abi.
>>>>
>>>> There are three places where i found what appear to be bugs in the code.
>>>> These are places where the existing code did an unsigned check and then
>>>> followed it with a signed access (or vice versa). These places are marked
>>>> with comments that contain the string "THE NEXT LINE IS POSSIBLY WRONG".
>>>> With some guidance from the reviewer, i will fix these and remove the
>>>> unacceptable comments.
>>>>
>>>> Aside from that, this patch is very very boring because it just makes
>>>> this
>>>> one transformation. Look for interesting stuff in tree.h.   Everything
>>>> else
>>>> is just forcing everywhere to use a single abi.
>>>>
>>>> This patch could go in, with a little work, without the other 5 patches,
>>>> or it can wait for them.
>>>>
>>>> kenny
>>
2012-10-09  Kenneth Zadeck <zadeck@naturalbridge.com>

	* tree.c (int_size_in_bytes, max_int_size_in_bytes,
	bit_position, byte_position, free_lang_data_in_decl,
	tree_int_cst_compare, valid_constant_size_p,
	build_nonstandard_integer_type, get_narrower,
	build_vector_type_for_mode, int_cst_value,
	widest_int_cst_value, get_binfo_at_offset): Replaced access of
	lower HWI from int cst with new api.
	(host_integerp, tree_low_cst): Deleted.
	* tree.h (PREDICT_EXPR_PREDICTOR, VL_EXP_OPERAND_LENGTH)
	(CHREC_VARIABLE): Replaced access of lower HWI from int cst with
	new api.
	(tree_fits_uhwi_p, tree_fits_shwi_p, tree_fits_hwi_p,
	tree_to_uhwi, tree_to_shwi, tree_to_hwi): New functions.
	(host_integerp, tree_low_cst): Deleted.
	* ada/gcc-interface/cuintp.c (UI_From_gnu): Replaced access of
	lower HWI from int cst with new api.
	* ada/gcc-interface/decl.c (gnat_to_gnu_entity,
	gnat_to_gnu_component_type, allocatable_size_p,
	gnat_to_gnu_field, create_field_decl_from): Ditto.
	* ada/gcc-interface/misc.c (gnat_type_max_size,
	default_pass_by_ref): Ditto.
	* ada/gcc-interface/trans.c (Call_to_gnu): Ditto.
	* ada/gcc-interface/utils.c (make_aligning_type,
	make_packable_type, make_type_from_size,
	rest_of_record_type_compilation, create_field_decl,
	invalidate_global_renaming_pointers, potential_alignment_gap,
	build_vms_descriptor32, build_vms_descriptor,
	convert_vms_descriptor64, convert_vms_descriptor32,
	unchecked_convert, get_nonnull_operand,
	handle_vector_size_attribute, handle_vector_type_attribute): Ditto.
	* ada/gcc-interface/utils2.c (known_alignment,
	nonbinary_modular_operation): Ditto.
	* alias.c (ao_ref_from_mem, adjust_offset_for_component_ref): Ditto.
	* builtins.c (get_object_alignment_2, get_pointer_alignment_1,
	c_strlen, c_getstr, target_char_cast,
	expand_builtin_mempcpy_args, expand_builtin_strncpy,
	expand_builtin_memset_args, expand_builtin_memset_args,
	expand_builtin_frame_address, expand_builtin_alloca,
	expand_builtin_atomic_compare_exchange, fold_builtin_powi,
	fold_builtin_memset, fold_builtin_memory_op,
	fold_builtin_memchr, fold_builtin_memcmp,
	fold_builtin_strncmp, fold_builtin_load_exponent,
	fold_builtin_snprintf, expand_builtin_object_size,
	expand_builtin_memory_chk, maybe_emit_chk_warning,
	maybe_emit_sprintf_chk_warning, fold_builtin_object_size,
	fold_builtin_memory_chk, fold_builtin_stxcpy_chk,
	fold_builtin_stxncpy_chk, fold_builtin_strcat_chk,
	fold_builtin_strncat_chk, fold_builtin_sprintf_chk_1,
	fold_builtin_snprintf_chk_1, do_mpfr_bessel_n): Ditto.
	* c-family/c-ada-spec.c (is_simple_enum, dump_generic_ada_node): Ditto.
	* c-family/c-common.c (finish_fname_decls, c_type_hash,
	match_case_to_enum_1, get_priority,
	handle_alloc_size_attribute, handle_vector_size_attribute,
	check_function_sentinel, get_nonnull_operand,
	parse_optimize_options, check_function_arguments_recurse,
	c_parse_error, fold_offsetof_1, complete_array_type,
	sync_resolve_size, get_atomic_generic_size,
	warn_for_sign_compare,
	convert_vector_to_pointer_for_subscript): Ditto.
	* c-family/c-cppbuiltin.c (static, cpp_atomic_builtins): Ditto.
	* c-family/c-format.c (check_format_string, check_format_arg): Ditto.
	* c-family/c-pragma.c (handle_pragma_pack): Ditto.
	* c-family/c-pretty-print.c (pp_c_direct_abstract_declarator,
	pp_c_integer_constant, pp_c_character_constant,
	pp_c_postfix_expression): Ditto.
	* c/c-decl.c (check_bitfield_type_and_width, grokdeclarator,
	finish_struct): Ditto.
	* c/c-parser.c (c_lex_one_token, c_parser_omp_clause_collapse,
	c_parser_omp_for_loop): Ditto.
	* c/c-typeck.c (push_init_level, process_init_element,
	c_finish_bc_stmt): Ditto.
	* cfgexpand.c (add_stack_var, expand_one_stack_var,
	defer_stack_allocation, expand_one_var,
	stack_protect_classify_type): Ditto.
	* cgraphunit.c (expand_function): Ditto.
	* config/alpha/alpha.c (va_list_skip_additions,
	alpha_stdarg_optimize_hook): Ditto.
	* config/alpha/predicates.md (small_symbolic_operand): Ditto.
	* config/arm/arm.c (aapcs_vfp_sub_candidate,
	arm_have_conditional_execution): Ditto.
	* config/avr/avr.c (avr_fold_builtin): Ditto.
	* config/bfin/bfin.c (bfin_local_alignment): Ditto.
	* config/c6x/predicates.md (sdata_symbolic_operand): Ditto.
	* config/darwin.c (darwin_mergeable_constant_section,
	machopic_select_section, darwin_asm_declare_object_name): Ditto.
	* config/epiphany/epiphany.c (
	epiphany_special_round_type_align,
	epiphany_adjust_field_align): Ditto.
	* config/i386/i386.c (ix86_function_regparm,
	ix86_keep_aggregate_return_pointer, classify_argument,
	ix86_data_alignment, ix86_local_alignment,
	ix86_builtin_tm_store, get_element_number): Ditto.
	* config/ia64/predicates.md (sdata_symbolic_operand): Ditto.
	* config/iq2000/iq2000.c (iq2000_function_arg): Ditto.
	* config/m32c/m32c-pragma.c (m32c_pragma_address): Ditto.
	* config/m32c/m32c.c (function_vector_handler,
	current_function_special_page_vector): Ditto.
	* config/mep/mep-pragma.c (mep_pragma_coprocessor_width,
	mep_pragma_coprocessor_subclass): Ditto.
	* config/mep/mep.c (mep_attrlist_to_encoding,
	mep_insert_attributes, mep_output_aligned_common): Ditto.
	* config/mips/mips.c (mips_function_arg, r10k_safe_mem_expr_p): Ditto.
	* config/picochip/picochip.c (picochip_compute_arg_size): Ditto.
	* config/rs6000/rs6000-c.c (altivec_resolve_overloaded_builtin): Ditto.
	* config/rs6000/rs6000.c (
	rs6000_builtin_support_vector_misalignment,
	offsettable_ok_by_alignment,
	rs6000_darwin64_record_arg_advance_recurse,
	rs6000_darwin64_record_arg_recurse,
	rs6000_expand_binop_builtin, altivec_expand_predicate_builtin,
	rs6000_expand_ternop_builtin, altivec_expand_dst_builtin,
	get_element_number, altivec_expand_builtin,
	spe_expand_builtin, paired_expand_predicate_builtin,
	spe_expand_predicate_builtin): Ditto.
	* config/s390/s390.c (s390_encode_section_info): Ditto.
	* config/sh/sh.c (sh_print_operand,
	sh2a_handle_function_vector_handler_attribute,
	sh2a_get_function_vector_number): Ditto.
	* config/sol2-c.c (solaris_pragma_align): Ditto.
	* config/sparc/sparc.c (function_arg_record_value_1,
	function_arg_record_value_2, sparc_struct_value_rtx,
	sparc_handle_vis_mul8x16, sparc_fold_builtin): Ditto.
	* config/vms/vms-c.c (handle_pragma_pointer_size): Ditto.
	* coverage.c (build_fn_info): Ditto.
	* cp/call.c (build_array_conv): Ditto.
	* cp/class.c (layout_class_type, dump_class_hierarchy_r,
	dump_class_hierarchy_1, dump_array, dump_thunk): Ditto.
	* cp/cp-tree.h (TMPL_PARMS_DEPTH): Ditto.
	* cp/decl.c (check_array_designated_initializer,
	reshape_init_array_1, grokdeclarator): Ditto.
	* cp/dump.c (cp_dump_tree): Ditto.
	* cp/error.c (dump_type_suffix,
	resolve_virtual_fun_from_obj_type_ref, dump_expr): Ditto.
	* cp/init.c (build_vec_init): Ditto.
	* cp/mangle.c (write_integer_cst): Ditto.
	* cp/method.c (make_thunk, use_thunk): Ditto.
	* cp/parser.c (cp_lexer_get_preprocessor_token,
	cp_parser_userdef_string_literal,
	cp_parser_omp_clause_collapse, cp_parser_omp_for_loop): Ditto.
	* cp/semantics.c (cxx_eval_array_reference,
	cxx_eval_bit_field_ref, cxx_eval_vec_init_1,
	cxx_fold_indirect_ref): Ditto.
	* cp/tree.c (debug_binfo, debug_binfo,
	handle_init_priority_attribute): Ditto.
	* cp/typeck2.c (digest_init_r): Ditto.
	* cppbuiltin.c (define_builtin_macros_for_type_sizes): Ditto.
	* dbxout.c (dbxout_type_fields, dbxout_type_method_1,
	dbxout_range_type, dbxout_type, dbxout_expand_expr,
	dbxout_symbol): Ditto.
	* dojump.c (do_jump): Ditto.
	* dwarf2out.c (simple_type_size_in_bits, dw_sra_loc_expr,
	loc_list_from_tree, add_data_member_location_attribute,
	fortran_common, native_encode_initializer, add_bound_info,
	add_bit_offset_attribute, add_bit_size_attribute,
	add_pure_or_virtual_attribute, descr_info_loc,
	add_descr_info_field, gen_descr_array_type_die,
	gen_enumeration_type_die): Ditto.
	* emit-rtl.c (get_mem_align_offset,
	set_mem_attributes_minus_bitpos, widen_memory_access): Ditto.
	* except.c (init_eh, expand_builtin_eh_common,
	expand_builtin_eh_return_data_regno, collect_one_action_chain): Ditto.
	* explow.c (int_expr_size): Ditto.
	* expr.c (get_bit_range, count_type_elements,
	categorize_ctor_elements_1, store_constructor,
	get_inner_reference, highest_pow2_factor, highest_pow2_factor,
	expand_constructor, expand_expr_real_2, expand_expr_real_1,
	is_aligning_offset, string_constant): Ditto.
	* expr.h (ADD_PARM_SIZE, SUB_PARM_SIZE): Ditto.
	* fold-const.c (negate_expr_p, fold_negate_expr,
	make_bit_field_ref, extract_muldiv_1, fold_single_bit_test,
	fold_plusminus_mult_expr, native_encode_string,
	fold_unary_loc, fold_comparison,
	get_pointer_modulus_and_residue, fold_binary_loc,
	fold_ternary_loc, multiple_of_p,
	tree_call_nonnegative_warnv_p, fold_read_from_constant_string,
	fold_indirect_ref_1): Ditto.
	* fortran/target-memory.c (encode_derived,
	gfc_interpret_derived, expr_to_char): Ditto.
	* fortran/trans-array.c (gfc_trans_array_constructor_value): Ditto.
	* fortran/trans-common.c (build_common_decl): Ditto.
	* fortran/trans-const.c (gfc_conv_string_init): Ditto.
	* fortran/trans-decl.c (gfc_can_put_var_on_stack): Ditto.
	* fortran/trans-expr.c (gfc_string_to_single_character,
	gfc_optimize_len_trim): Ditto.
	* fortran/trans-io.c (gfc_build_io_library_fndecls): Ditto.
	* fortran/trans-types.c (gfc_get_dtype): Ditto.
	* function.c (locate_and_pad_parm, pad_below): Ditto.
	* gimple-fold.c (gimple_extract_devirt_binfo_from_cst,
	gimple_fold_call, get_base_constructor,
	fold_const_aggregate_ref_1, gimple_get_virt_method_for_binfo,
	gimple_val_nonnegative_real_p): Ditto.
	* gimple-pretty-print.c (dump_gimple_call): Ditto.
	* gimple-ssa-strength-reduction.c (stmt_cost): Ditto.
	* gimple.c (gimple_compare_field_offset): Ditto.
	* gimplify.c (gimple_add_tmp_var, gimple_fold_indirect_ref): Ditto.
	* go/go-gcc.cc (Gcc_backend::type_size(Btype*)): Ditto.
	* go/gofrontend/expressions.cc
	(Type_conversion_expression::do_get_tree(Translate_context*)): Ditto.
	* godump.c (go_format_type, go_output_typedef): Ditto.
	* graphite-scop-detection.c (graphite_can_represent_init): Ditto.
	* graphite-sese-to-poly.c (pdr_add_data_dimensions): Ditto.
	* ipa-prop.c (ipa_print_node_jump_functions_for_edge,
	type_like_member_ptr_p, determine_known_aggregate_parts,
	ipa_analyze_virtual_call_uses): Ditto.
	* java/boehm.c (uses_jv_markobj_p): Ditto.
	* java/class.c (get_dispatch_vector): Ditto.
	* java/expr.c (build_newarray, build_anewarray, build_jni_stub): Ditto.
	* java/typeck.c (java_array_type_length): Ditto.
	* lto-streamer-out.c (write_symbol): Ditto.
	* lto/lto-lang.c (get_nonnull_operand): Ditto.
	* objc/objc-act.c (check_string_class_template,
	objc_decl_method_attributes, gen_declaration, gen_type_name_0): Ditto.
	* objc/objc-encoding.c (encode_array, encode_vector,
	encode_field): Ditto.
	* objc/objc-next-runtime-abi-01.c (
	generate_v1_objc_protocol_extension,
	generate_v1_property_table, build_v1_category_initializer,
	generate_objc_class_ext): Ditto.
	* objc/objc-next-runtime-abi-02.c (
	generate_v2_meth_descriptor_table, generate_v2_property_table,
	build_v2_protocol_initializer, generate_v2_dispatch_table,
	build_v2_ivar_list_initializer, generate_v2_ivars_list,
	generate_v2_class_structs): Ditto.
	* omp-low.c (expand_omp_atomic, lower_omp_for_lastprivate): Ditto.
	* predict.c (strips_small_constant,
	is_comparison_with_loop_invariant_p, predict_iv_comparison,
	predict_loops): Ditto.
	* rtlanal.c (rtx_addr_can_trap_p_1): Ditto.
	* sdbout.c (plain_type_1, sdbout_field_types, sdbout_one_type): Ditto.
	* simplify-rtx.c (delegitimize_mem_from_attrs): Ditto.
	* stmt.c (dump_case_nodes, expand_switch_as_decision_tree_p,
	emit_case_dispatch_table): Ditto.
	* stor-layout.c (mode_for_size_tree, mode_for_array,
	layout_decl, excess_unit_span, place_field,
	compute_record_mode, finish_bitfield_representative,
	finish_bitfield_layout): Ditto.
	* targhooks.c (default_mangle_decl_assembler_name): Ditto.
	* testsuite/gcc.dg/20020219-1.c (fixed comment.): Ditto.
	* trans-mem.c (tm_log_add, tm_log_emit_stmt, build_tm_load,
	build_tm_store): Ditto.
	* tree-cfg.c (verify_expr, verify_expr,
	move_stmt_eh_region_tree_nr): Ditto.
	* tree-data-ref.c (gcd_of_steps_may_divide_p): Ditto.
	* tree-dfa.c (get_ref_base_and_extent): Ditto.
	* tree-flow-inline.h (get_addr_base_and_unit_offset_1): Ditto.
	* tree-inline.c (remap_eh_region_tree_nr): Ditto.
	* tree-object-size.c (static, compute_object_offset,
	addr_object_size, alloc_object_size, plus_stmt_object_size,
	compute_object_sizes): Ditto.
	* tree-pretty-print.c (dump_array_domain, dump_generic_node): Ditto.
	* tree-sra.c (type_internals_preclude_sra_p,
	completely_scalarize_record, compare_access_positions,
	make_fancy_name_1, build_user_friendly_ref_for_offset,
	maybe_add_sra_candidate, expr_with_var_bounded_array_refs_p,
	analyze_all_variable_accesses, sra_modify_expr,
	find_param_candidates, splice_param_accesses,
	decide_one_param_reduction): Ditto.
	* tree-ssa-address.c (copy_ref_info): Ditto.
	* tree-ssa-alias.c (ao_ref_init_from_ptr_and_size,
	stmt_kills_ref_p_1): Ditto.
	* tree-ssa-ccp.c (ccp_finalize, bit_value_assume_aligned,
	evaluate_stmt, fold_builtin_alloca_with_align): Ditto.
	* tree-ssa-forwprop.c (simplify_builtin_call,
	simplify_bitfield_ref, is_combined_permutation_identity,
	simplify_vector_constructor): Ditto.
	* tree-ssa-loop-ivcanon.c (try_unroll_loop_completely): Ditto.
	* tree-ssa-loop-ivopts.c (get_loop_invariant_expr_id,
	iv_period): Ditto.
	* tree-ssa-loop-niter.c (number_of_iterations_ne_max,
	number_of_iterations_ne): Ditto.
	* tree-ssa-loop-prefetch.c (analyze_ref, add_subscript_strides,
	self_reuse_distance): Ditto.
	* tree-ssa-math-opts.c (execute_cse_sincos, find_bswap_1,
	find_bswap): Ditto.
	* tree-ssa-phiopt.c (add_or_mark_expr, hoist_adjacent_loads): Ditto.
	* tree-ssa-reassoc.c (decrement_power, acceptable_pow_call): Ditto.
	* tree-ssa-sccvn.c (vn_reference_eq,
	copy_reference_ops_from_ref, ao_ref_init_from_vn_reference,
	vn_reference_fold_indirect,
	vn_reference_maybe_forwprop_address,
	fully_constant_vn_reference_p, vn_reference_lookup_3,
	simplify_binary_expression): Ditto.
	* tree-ssa-strlen.c (get_stridx, adjust_last_stmt,
	handle_builtin_memcpy, handle_pointer_plus): Ditto.
	* tree-ssa-structalias.c (process_constraint,
	get_constraint_for_1, push_fields_onto_fieldstack,
	create_variable_info_for_1): Ditto.
	* tree-stdarg.c (va_list_counter_bump,
	check_all_va_list_escapes): Ditto.
	* tree-switch-conversion.c (emit_case_bit_tests, static,
	create_temp_arrays): Ditto.
	* tree-vect-data-refs.c (vect_get_smallest_scalar_type,
	vect_drs_dependent_in_basic_block, vect_check_interleaving,
	vect_compute_data_ref_alignment,
	vect_verify_datarefs_alignment, vect_analyze_group_access,
	vect_analyze_data_ref_access, vect_check_gather,
	vect_supportable_dr_alignment): Ditto.
	* tree-vect-generic.c (static, expand_vector_piecewise,
	expand_vector_parallel, expand_vector_addition,
	expand_vector_divmod, vector_element, lower_vec_perm,
	lower_vec_perm): Ditto.
	* tree-vect-loop.c (vect_analyze_loop_form,
	vect_model_reduction_cost, get_initial_def_for_reduction,
	vect_create_epilog_for_reduction): Ditto.
	* tree-vect-patterns.c (vect_recog_pow_pattern,
	vect_recog_divmod_pattern): Ditto.
	* tree-vect-stmts.c (vectorizable_load): Ditto.
	* tree-vectorizer.h (LOOP_VINFO_INT_NITERS, NITERS_KNOWN_P): Ditto.
	* tree-vrp.c (extract_range_from_binary_expr_1,
	register_edge_assert_for_2): Ditto.
	* var-tracking.c (prepare_call_arguments,
	emit_note_insn_var_location): Ditto.
	* varasm.c (get_block_for_decl, assemble_noswitch_variable,
	assemble_variable_contents, decode_addr_const,
	decode_addr_const, output_constant,
	output_constructor_array_range,
	output_constructor_regular_field, output_constructor_bitfield,
	place_block_symbol, output_object_block): Ditto.
	* varpool.c (varpool_remove_node): Ditto.
Richard Guenther - Oct. 19, 2012, 8:12 a.m.
On Thu, Oct 18, 2012 at 4:00 PM, Kenneth Zadeck
<zadeck@naturalbridge.com> wrote:
> you know richi, i did not know who i was actually talking to.   i said who
> is this richard beiner person and then i saw the email address.

;)

> On 10/18/2012 08:58 AM, Richard Biener wrote:
>>
>> On Thu, Oct 18, 2012 at 2:52 PM, Kenneth Zadeck
>> <zadeck@naturalbridge.com> wrote:
>>>
>>> On 10/18/2012 06:22 AM, Richard Biener wrote:
>>>>
>>>> On Wed, Oct 17, 2012 at 11:47 PM, Kenneth Zadeck
>>>> <zadeck@naturalbridge.com> wrote:
>>>>>
>>>>> Richi,
>>>>>
>>>>> I apologize for the size of this patch, but it only does one very small
>>>>> thing, it is just that it does it all over the middle end.
>>>>>
>>>>> This patch introduces a new api for extracting a signed or unsigned hwi
>>>>> from
>>>>> an integer cst.  There had been an abi for doing this, but it has some
>>>>> problems that were fixed here, and it was only used sometimes.
>>>>>
>>>>> The new abi consists of 6 functions, three for testing if the constant
>>>>> in
>>>>> the int cst will fit and three for actually pulling out the value.
>>>>>
>>>>> The ones for testing are tree_fits_uhwi_p, tree_fits_shwi_p, and
>>>>> tree_fits_hwi_p.   The first two of these are the unsigned and signed
>>>>> versions, and the third takes a boolean parameter which is true if
>>>>> the value is positive.   The third replaces host_integerp, which has
>>>>> basically the same functionality as tree_fits_hwi_p.   The reason for
>>>>> changing this is that there were about 400 calls to host_integerp and
>>>>> all but 4 of them took a constant parameter. These names are chosen
>>>>> to be similar to the corresponding functions in wide-int and are more
>>>>> mnemonic than the existing name, since they use the more readable u
>>>>> and s prefixes that a lot of other places do.
>>>>>
>>>>> On the accessing side, there is tree_to_uhwi, tree_to_shwi, and
>>>>> tree_to_hwi.
>>>>> The first two do checking when checking is enabled. The last does no
>>>>> checking.
>>>>
>>>> Just a quick note here - the changelog mentions tree_low_cst (as new
>>>> function!?) but not host_integerp.  You should probably get rid of both,
>>>> otherwise uses will creep back as people are familiar with them
>>>> (I'm not sure these changes for consistency are always good ...)
>>>
>>> i will fix this.
>
> these are bugs in the changelog, not the code.   new changelog included.
>
>>>
>>>> I don't like that tree_to_hwi does not do any checking.  In fact I don't
>>>> like that it _exists_, after all it has a return type which signedness
>>>> does not magically change.  Unchecked conversions should use
>>>> TREE_LOW_CST.
>>>
>>> the idea is that when wide-int goes in, there is actually no
>>> TREE_INT_CST_LOW.   The concept of low and high go out the door. the
>>> int-cst
>>> will have an array in it that is big enough to hold the value.
>>> so tree_to_hwi becomes a shorthand for just accessing the lower element
>>> of
>>> the array.
>>>
>>> you could argue that you should say tree_fits_uhwi_p followed by
>>> tree_to_uhwi (and the same for signed).   This is an easy fix.   it just
>>> seemed a little redundant.
>>
>> Well, if you want raw access to the lower element (when do you actually
>> want that, when not in wide-int.c/h?) ... you still need to care about the
>> signedness of the result.  And tree_fits_uhwi_p does not return the
>> same as tree_fits_shwi_p all the time.
>>
>> I don't see any goodness in tree_to_hwi nor tree_fits_hwi really.  Because
>> if you just access the lower word then that still has a sign (either
>> HOST_WIDE_INT or unsigned HOST_WIDE_INT).  We should get rid
>> of those places - can you enumerate them?  I think you said it was just
>> a few callers with variable signedness argument.
>
> Note that tree_fits_hwi_p does check.   it just takes a parameter to say if
> it wants signed or unsigned checking (it is the old host_integerp,
> repackaged).   You really do need this function as it is for the 4 or 5
> places it is called.  The parameter to it is typically, but not always, the
> sign of the type of the int cst being passed to it.
>
> it is tree_to_hwi that is unchecked.  Most of the places that use it are
> identified with comments, which, unlike a changelog entry, survive with
> the code.   (i happen to be in the group of people who think changelogs
> are useless, and that we should do a better job of commenting the code.)
>
> I do not know if this is sloppiness or not, but the signedness that is
> checked rarely matches the sign of the variable that the value is assigned
> to.  I found this quite frustrating when i was making the patch, but this
> kind of thing is common in code where the original writer "knew what (s)he
> was doing."  Unless you are doing comparisons or shifting, the signedness
> of the target does not really make much difference.
>
> if you want me to change the sequences of explicit checking and unchecked
> access to explicit checking followed by a checked access, then i am happy to
> do this.

Disclaimer: not looking at the patch (again).

The existing tree_low_cst function performs checking, so tree_to_hwi
should as well.

I don't think mismatch of signedness of the variable assigned to with the
sign we use for hwi extraction is any good.  C++ isn't type-safe here
for the return value but if we'd use a reference as return slot we
could make it so ...
(in exchange for quite some ugliness IMNSHO):

void tree_to_shwi (const_tree tree, HOST_WIDE_INT &hwi);

vs.

void tree_to_uhwi (const_tree tree, unsigned HOST_WIDE_INT &hwi);

maybe more natural would be

void hwi_from_tree (HOST_WIDE_INT &hwi, const_tree tree);
void hwi_from_tree (unsigned HOST_WIDE_INT &hwi, const_tree tree);

let the C++ bikeshedding begin!  (the point is to do appropriate checking
for a conversion of (INTEGER_CST) tree to HOST_WIDE_INT vs.
unsigned HOST_WIDE_INT)
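
A minimal sketch of the overload-on-reference-slot idea Richard describes, using hypothetical toy types (and with a bool return added here so the check is visible; the signatures in the thread return void). The signedness of the destination variable selects the checked conversion at overload-resolution time:

```cpp
// Hypothetical stand-in for an INTEGER_CST holding a small value;
// not GCC's representation.
struct cst { long long v; };

// Overload set: the signedness of the reference slot picks the
// extraction, so the caller cannot silently mix signs.
static bool hwi_from_tree (long long &hwi, const cst &t)
{
  hwi = t.v;                    // toy: every modeled value fits a signed slot
  return true;
}

static bool hwi_from_tree (unsigned long long &hwi, const cst &t)
{
  if (t.v < 0)
    return false;               // a negative value has no unsigned HWI form
  hwi = (unsigned long long) t.v;
  return true;
}
```

The point of the design is that `hwi_from_tree (x, t)` checks against the signedness of `x` automatically, so the sign mismatches Kenny mentions become impossible to write without an explicit cast.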

No, I don't want you to do the above transform with this patch ;)

Thanks,
Richard.


> kenny
>
>> Richard.
>>
>>> I should also point out that about 2/3 of this patch is going to die as
>>> the
>>> rest of the wide int stuff goes in.   But i did not want to submit a
>>> patch
>>> that only converted 1/3 of the cases.   The truth is that most of these
>>> places where you are doing this conversion are just because the people
>>> were
>>> too lazy to do the math at the full precision of the double int.
>>>
>>>> Thus, my 2 cents before I really look at the patch (which will likely
>>>> be next week only, so maybe you can do a followup with the above
>>>> suggestions).
>>>>
>>>> Thanks,
>>>> Richard.
>>>>
>>>>> It is expected that, normally, an unchecked access follows an explicit
>>>>> check, or, if there is no explicit check, then one of the checked
>>>>> accessors is used.  This is not always the case for places that just
>>>>> want to look at the bottom bits of some large number, such as several
>>>>> places that compute
>>>>> alignment.
>>>>> These just use the naked unchecked access.
>>>>>
>>>>> There were a lot of places that either did no checking or did the
>>>>> checking
>>>>> inline.   This patch tracks down all of those places and they now use
>>>>> the
>>>>> same abi.
>>>>>
>>>>> There are three places where i found what appear to be bugs in the
>>>>> code.
>>>>> These are places where the existing code did an unsigned check and then
>>>>> followed it with a signed access (or vice versa). These places are
>>>>> marked
>>>>> with comments that contain the string "THE NEXT LINE IS POSSIBLY
>>>>> WRONG".
>>>>> With some guidance from the reviewer, i will fix these and remove the
>>>>> unacceptable comments.
>>>>>
>>>>> Aside from that, this patch is very very boring because it just makes
>>>>> this
>>>>> one transformation. Look for interesting stuff in tree.h.   Everything
>>>>> else
>>>>> is just forcing everywhere to use a single abi.
>>>>>
>>>>> This patch could go in, with a little work, without the other 5
>>>>> patches,
>>>>> or it can wait for them.
>>>>>
>>>>> kenny
>>>
>>>
>
Lawrence Crowl - Oct. 22, 2012, 9:58 p.m.
On 10/19/12, Richard Biener <richard.guenther@gmail.com> wrote:
> The existing tree_low_cst function performs checking, so
> tree_to_hwi should as well.
>
> I don't think mismatch of signedness of the variable assigned to
> with the sign we use for hwi extraction is any good.  C++ isn't
> type-safe here for the return value but if we'd use a reference
> as return slot we could make it so ...  (in exchange for quite
> some ugliness IMNSHO):
>
> void tree_to_shwi (const_tree tree, HOST_WIDE_INT &hwi);
>
> vs.
>
> void tree_to_uhwi (const_tree tree, unsigned HOST_WIDE_INT &hwi);
>
> maybe more natural would be
>
> void hwi_from_tree (HOST_WIDE_INT &hwi, const_tree tree);
> void hwi_from_tree (unsigned HOST_WIDE_INT &hwi, const_tree tree);
>
> let the C++ bikeshedding begin!  (the point is to do appropriate
> checking for a conversion of (INTEGER_CST) tree to HOST_WIDE_INT
> vs.  unsigned HOST_WIDE_INT)

We could add conversion operators to achieve the effect.  However,
we probably don't want to do so until we can make them explicit.
Unfortunately, explicit conversion operators are not available
until C++11.
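
A sketch of the conversion-operator alternative Lawrence mentions, on the same hypothetical toy type (GCC could not yet assume C++11, which is exactly his caveat):

```cpp
#include <cassert>

// Toy INTEGER_CST wrapper.  With C++11 'explicit' conversion operators,
// extraction requires a visible cast, and each operator can check the
// signedness it is asked for.  Names and layout are illustrative only.
struct int_cst
{
  long long v;

  explicit operator long long () const { return v; }

  explicit operator unsigned long long () const
  {
    assert (v >= 0);  // checked: refuse to read a negative value as unsigned
    return (unsigned long long) v;
  }
};
```

Without `explicit`, the wrapper would convert silently in arithmetic contexts and reintroduce the sign-mismatch problem, which is why the thread defers this until C++11 can be assumed.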

> No, I don't want you to do the above transform with this patch ;)

Patch

diff --git a/gcc/ada/gcc-interface/cuintp.c b/gcc/ada/gcc-interface/cuintp.c
index e077d9c..4b6f916 100644
--- a/gcc/ada/gcc-interface/cuintp.c
+++ b/gcc/ada/gcc-interface/cuintp.c
@@ -160,23 +160,23 @@  UI_From_gnu (tree Input)
   Int_Vector vec;
 
 #if HOST_BITS_PER_WIDE_INT == 64
-  /* On 64-bit hosts, host_integerp tells whether the input fits in a
+  /* On 64-bit hosts, tree_fits_shwi_p tells whether the input fits in a
      signed 64-bit integer.  Then a truncation tells whether it fits
      in a signed 32-bit integer.  */
-  if (host_integerp (Input, 0))
+  if (tree_fits_shwi_p (Input))
     {
-      HOST_WIDE_INT hw_input = TREE_INT_CST_LOW (Input);
+      HOST_WIDE_INT hw_input = tree_to_hwi (Input);
       if (hw_input == (int) hw_input)
 	return UI_From_Int (hw_input);
     }
   else
     return No_Uint;
 #else
-  /* On 32-bit hosts, host_integerp tells whether the input fits in a
+  /* On 32-bit hosts, tree_fits_shwi_p tells whether the input fits in a
      signed 32-bit integer.  Then a sign test tells whether it fits
      in a signed 64-bit integer.  */
-  if (host_integerp (Input, 0))
-    return UI_From_Int (TREE_INT_CST_LOW (Input));
+  if (tree_fits_shwi_p (Input))
+    return UI_From_Int (tree_to_hwi (Input));
   else if (TREE_INT_CST_HIGH (Input) < 0 && TYPE_UNSIGNED (gnu_type))
     return No_Uint;
 #endif
@@ -186,10 +186,9 @@  UI_From_gnu (tree Input)
 
   for (i = Max_For_Dint - 1; i >= 0; i--)
     {
-      v[i] = tree_low_cst (fold_build1 (ABS_EXPR, gnu_type,
+      v[i] = tree_to_shwi (fold_build1 (ABS_EXPR, gnu_type,
 					fold_build2 (TRUNC_MOD_EXPR, gnu_type,
-						     gnu_temp, gnu_base)),
-			   0);
+						     gnu_temp, gnu_base)));
       gnu_temp = fold_build2 (TRUNC_DIV_EXPR, gnu_type, gnu_temp, gnu_base);
     }
 
diff --git a/gcc/ada/gcc-interface/decl.c b/gcc/ada/gcc-interface/decl.c
index 3935bb3..bd4d78e 100644
--- a/gcc/ada/gcc-interface/decl.c
+++ b/gcc/ada/gcc-interface/decl.c
@@ -834,13 +834,13 @@  gnat_to_gnu_entity (Entity_Id gnat_entity, tree gnu_expr, int definition)
 		align_cap = get_mode_alignment (ptr_mode);
 	      }
 
-	    if (!host_integerp (TYPE_SIZE (gnu_type), 1)
+	    if (!tree_fits_uhwi_p (TYPE_SIZE (gnu_type))
 		|| compare_tree_int (TYPE_SIZE (gnu_type), size_cap) > 0)
 	      align = 0;
 	    else if (compare_tree_int (TYPE_SIZE (gnu_type), align_cap) > 0)
 	      align = align_cap;
 	    else
-	      align = ceil_pow2 (tree_low_cst (TYPE_SIZE (gnu_type), 1));
+	      align = ceil_pow2 (tree_to_uhwi (TYPE_SIZE (gnu_type)));
 
 	    /* But make sure not to under-align the object.  */
 	    if (align <= TYPE_ALIGN (gnu_type))
@@ -1482,10 +1482,10 @@  gnat_to_gnu_entity (Entity_Id gnat_entity, tree gnu_expr, int definition)
 	    && const_flag
 	    && gnu_expr && TREE_CONSTANT (gnu_expr)
 	    && AGGREGATE_TYPE_P (gnu_type)
-	    && host_integerp (TYPE_SIZE_UNIT (gnu_type), 1)
+	    && tree_fits_uhwi_p (TYPE_SIZE_UNIT (gnu_type))
 	    && !(TYPE_IS_PADDING_P (gnu_type)
-		 && !host_integerp (TYPE_SIZE_UNIT
-				    (TREE_TYPE (TYPE_FIELDS (gnu_type))), 1)))
+		 && !tree_fits_uhwi_p (TYPE_SIZE_UNIT
+				    (TREE_TYPE (TYPE_FIELDS (gnu_type))))))
 	  static_p = true;
 
 	/* Now create the variable or the constant and set various flags.  */
@@ -3410,7 +3410,7 @@  gnat_to_gnu_entity (Entity_Id gnat_entity, tree gnu_expr, int definition)
 			gnu_size = DECL_SIZE (gnu_old_field);
 			if (RECORD_OR_UNION_TYPE_P (gnu_field_type)
 			    && !TYPE_FAT_POINTER_P (gnu_field_type)
-			    && host_integerp (TYPE_SIZE (gnu_field_type), 1))
+			    && tree_fits_uhwi_p (TYPE_SIZE (gnu_field_type)))
 			  gnu_field_type
 			    = make_packable_type (gnu_field_type, true);
 		      }
@@ -4399,7 +4399,7 @@  gnat_to_gnu_entity (Entity_Id gnat_entity, tree gnu_expr, int definition)
 							NULL_TREE))
 		  {
 		    unsigned int size
-		      = TREE_INT_CST_LOW (TYPE_SIZE (gnu_return_type));
+		      = tree_to_uhwi (TYPE_SIZE (gnu_return_type));
 		    unsigned int i = BITS_PER_UNIT;
 		    enum machine_mode mode;
 
@@ -4770,22 +4770,22 @@  gnat_to_gnu_entity (Entity_Id gnat_entity, tree gnu_expr, int definition)
 
 	      /* Consider an alignment as suspicious if the alignment/size
 		 ratio is greater or equal to the byte/bit ratio.  */
-	      if (host_integerp (size, 1)
-		  && align >= TREE_INT_CST_LOW (size) * BITS_PER_UNIT)
+	      if (tree_fits_uhwi_p (size)
+		  && align >= tree_to_hwi (size) * BITS_PER_UNIT)
 		post_error_ne ("?suspiciously large alignment specified for&",
 			       Expression (Alignment_Clause (gnat_entity)),
 			       gnat_entity);
 	    }
 	}
       else if (Is_Atomic (gnat_entity) && !gnu_size
-	       && host_integerp (TYPE_SIZE (gnu_type), 1)
+	       && tree_fits_uhwi_p (TYPE_SIZE (gnu_type))
 	       && integer_pow2p (TYPE_SIZE (gnu_type)))
 	align = MIN (BIGGEST_ALIGNMENT,
-		     tree_low_cst (TYPE_SIZE (gnu_type), 1));
+		     tree_to_uhwi (TYPE_SIZE (gnu_type)));
       else if (Is_Atomic (gnat_entity) && gnu_size
-	       && host_integerp (gnu_size, 1)
+	       && tree_fits_uhwi_p (gnu_size)
 	       && integer_pow2p (gnu_size))
-	align = MIN (BIGGEST_ALIGNMENT, tree_low_cst (gnu_size, 1));
+	align = MIN (BIGGEST_ALIGNMENT, tree_to_uhwi (gnu_size));
 
       /* See if we need to pad the type.  If we did, and made a record,
 	 the name of the new type may be changed.  So get it back for
@@ -5431,7 +5431,7 @@  gnat_to_gnu_component_type (Entity_Id gnat_array, bool definition,
       && !Strict_Alignment (gnat_type)
       && RECORD_OR_UNION_TYPE_P (gnu_type)
       && !TYPE_FAT_POINTER_P (gnu_type)
-      && host_integerp (TYPE_SIZE (gnu_type), 1))
+      && tree_fits_uhwi_p (TYPE_SIZE (gnu_type)))
     gnu_type = make_packable_type (gnu_type, false);
 
   if (Has_Atomic_Components (gnat_array))
@@ -5930,7 +5930,7 @@  allocatable_size_p (tree gnu_size, bool static_p)
   /* We can allocate a fixed size if it hasn't overflowed and can be handled
      (efficiently) on the host.  */
   if (TREE_CODE (gnu_size) == INTEGER_CST)
-    return !TREE_OVERFLOW (gnu_size) && host_integerp (gnu_size, 1);
+    return !TREE_OVERFLOW (gnu_size) && tree_fits_uhwi_p (gnu_size);
 
   /* We can allocate a variable size if this isn't a static allocation.  */
   else
@@ -6387,7 +6387,7 @@  gnat_to_gnu_field (Entity_Id gnat_field, tree gnu_record_type, int packed,
   if (!needs_strict_alignment
       && RECORD_OR_UNION_TYPE_P (gnu_field_type)
       && !TYPE_FAT_POINTER_P (gnu_field_type)
-      && host_integerp (TYPE_SIZE (gnu_field_type), 1)
+      && tree_fits_uhwi_p (TYPE_SIZE (gnu_field_type))
       && (packed == 1
 	  || (gnu_size
 	      && (tree_int_cst_lt (gnu_size, TYPE_SIZE (gnu_field_type))
@@ -8110,7 +8110,7 @@  create_field_decl_from (tree old_field, tree field_type, tree record_type,
 {
   tree t = TREE_VALUE (purpose_member (old_field, pos_list));
   tree pos = TREE_VEC_ELT (t, 0), bitpos = TREE_VEC_ELT (t, 2);
-  unsigned int offset_align = tree_low_cst (TREE_VEC_ELT (t, 1), 1);
+  unsigned int offset_align = tree_to_uhwi (TREE_VEC_ELT (t, 1));
   tree new_pos, new_field;
   unsigned int i;
   subst_pair *s;
diff --git a/gcc/ada/gcc-interface/misc.c b/gcc/ada/gcc-interface/misc.c
index baa44c9..e90e119 100644
--- a/gcc/ada/gcc-interface/misc.c
+++ b/gcc/ada/gcc-interface/misc.c
@@ -569,7 +569,7 @@  gnat_type_max_size (const_tree gnu_type)
 
   /* If we don't have a constant, see what we can get from TYPE_ADA_SIZE,
      which should stay untouched.  */
-  if (!host_integerp (max_unitsize, 1)
+  if (!tree_fits_uhwi_p (max_unitsize)
       && RECORD_OR_UNION_TYPE_P (gnu_type)
       && !TYPE_FAT_POINTER_P (gnu_type)
       && TYPE_ADA_SIZE (gnu_type))
@@ -578,7 +578,7 @@  gnat_type_max_size (const_tree gnu_type)
 
       /* If we have succeeded in finding a constant, round it up to the
 	 type's alignment and return the result in units.  */
-      if (host_integerp (max_adasize, 1))
+      if (tree_fits_uhwi_p (max_adasize))
 	max_unitsize
 	  = size_binop (CEIL_DIV_EXPR,
 			round_up (max_adasize, TYPE_ALIGN (gnu_type)),
@@ -618,7 +618,7 @@  default_pass_by_ref (tree gnu_type)
     return true;
 
   if (AGGREGATE_TYPE_P (gnu_type)
-      && (!host_integerp (TYPE_SIZE (gnu_type), 1)
+      && (!tree_fits_uhwi_p (TYPE_SIZE (gnu_type))
 	  || 0 < compare_tree_int (TYPE_SIZE (gnu_type),
 				   8 * TYPE_ALIGN (gnu_type))))
     return true;
diff --git a/gcc/ada/gcc-interface/trans.c b/gcc/ada/gcc-interface/trans.c
index aac483c..5f1337a 100644
--- a/gcc/ada/gcc-interface/trans.c
+++ b/gcc/ada/gcc-interface/trans.c
@@ -4034,7 +4034,7 @@  Call_to_gnu (Node_Id gnat_node, tree *gnu_result_type_p, tree gnu_target,
 	    gnu_actual
 	      = unchecked_convert (DECL_ARG_TYPE (gnu_formal),
 				   convert (gnat_type_for_size
-					    (TREE_INT_CST_LOW (gnu_size), 1),
+					    (tree_to_uhwi (gnu_size), 1),
 					    integer_zero_node),
 				   false);
 	  else
diff --git a/gcc/ada/gcc-interface/utils.c b/gcc/ada/gcc-interface/utils.c
index d9121c1..08e37c4 100644
--- a/gcc/ada/gcc-interface/utils.c
+++ b/gcc/ada/gcc-interface/utils.c
@@ -754,7 +754,7 @@  make_aligning_type (tree type, unsigned int align, tree size,
 tree
 make_packable_type (tree type, bool in_record)
 {
-  unsigned HOST_WIDE_INT size = tree_low_cst (TYPE_SIZE (type), 1);
+  unsigned HOST_WIDE_INT size = tree_to_uhwi (TYPE_SIZE (type));
   unsigned HOST_WIDE_INT new_size;
   tree new_type, old_field, field_list = NULL_TREE;
   unsigned int align;
@@ -789,12 +789,12 @@  make_packable_type (tree type, bool in_record)
 
       /* Do not try to shrink the size if the RM size is not constant.  */
       if (TYPE_CONTAINS_TEMPLATE_P (type)
-	  || !host_integerp (TYPE_ADA_SIZE (type), 1))
+	  || !tree_fits_uhwi_p (TYPE_ADA_SIZE (type)))
 	return type;
 
       /* Round the RM size up to a unit boundary to get the minimal size
 	 for a BLKmode record.  Give up if it's already the size.  */
-      new_size = TREE_INT_CST_LOW (TYPE_ADA_SIZE (type));
+      new_size = tree_to_hwi (TYPE_ADA_SIZE (type));
       new_size = (new_size + BITS_PER_UNIT - 1) & -BITS_PER_UNIT;
       if (new_size == size)
 	return type;
@@ -815,7 +815,7 @@  make_packable_type (tree type, bool in_record)
 
       if (RECORD_OR_UNION_TYPE_P (new_field_type)
 	  && !TYPE_FAT_POINTER_P (new_field_type)
-	  && host_integerp (TYPE_SIZE (new_field_type), 1))
+	  && tree_fits_uhwi_p (TYPE_SIZE (new_field_type)))
 	new_field_type = make_packable_type (new_field_type, true);
 
       /* However, for the last field in a not already packed record type
@@ -898,10 +898,10 @@  make_type_from_size (tree type, tree size_tree, bool for_biased)
 
   /* If size indicates an error, just return TYPE to avoid propagating
      the error.  Likewise if it's too large to represent.  */
-  if (!size_tree || !host_integerp (size_tree, 1))
+  if (!size_tree || !tree_fits_uhwi_p (size_tree))
     return type;
 
-  size = tree_low_cst (size_tree, 1);
+  size = tree_to_uhwi (size_tree);
 
   switch (TREE_CODE (type))
     {
@@ -1726,10 +1726,10 @@  rest_of_record_type_compilation (tree record_type)
 	    pos = compute_related_constant (curpos, last_pos);
 
 	  if (!pos && TREE_CODE (curpos) == MULT_EXPR
-	      && host_integerp (TREE_OPERAND (curpos, 1), 1))
+	      && tree_fits_uhwi_p (TREE_OPERAND (curpos, 1)))
 	    {
 	      tree offset = TREE_OPERAND (curpos, 0);
-	      align = tree_low_cst (TREE_OPERAND (curpos, 1), 1);
+	      align = tree_to_uhwi (TREE_OPERAND (curpos, 1));
 
 	      /* An offset which is a bitwise AND with a negative power of 2
 		 means an alignment corresponding to this power of 2.  Note
@@ -1737,11 +1737,11 @@  rest_of_record_type_compilation (tree record_type)
 		 we don't directly use tree_int_cst_sgn.  */
 	      offset = remove_conversions (offset, true);
 	      if (TREE_CODE (offset) == BIT_AND_EXPR
-		  && host_integerp (TREE_OPERAND (offset, 1), 0)
+		  && tree_fits_shwi_p (TREE_OPERAND (offset, 1))
 		  && TREE_INT_CST_HIGH (TREE_OPERAND (offset, 1)) < 0)
 		{
 		  unsigned int pow
-		    = - tree_low_cst (TREE_OPERAND (offset, 1), 0);
+		    = - tree_to_shwi (TREE_OPERAND (offset, 1));
 		  if (exact_log2 (pow) > 0)
 		    align *= pow;
 		}
@@ -1752,13 +1752,11 @@  rest_of_record_type_compilation (tree record_type)
 	  else if (!pos && TREE_CODE (curpos) == PLUS_EXPR
 		   && TREE_CODE (TREE_OPERAND (curpos, 1)) == INTEGER_CST
 		   && TREE_CODE (TREE_OPERAND (curpos, 0)) == MULT_EXPR
-		   && host_integerp (TREE_OPERAND
-				     (TREE_OPERAND (curpos, 0), 1),
-				     1))
+		   && tree_fits_uhwi_p (TREE_OPERAND
+				     (TREE_OPERAND (curpos, 0), 1)))
 	    {
 	      align
-		= tree_low_cst
-		(TREE_OPERAND (TREE_OPERAND (curpos, 0), 1), 1);
+		= tree_to_uhwi (TREE_OPERAND (TREE_OPERAND (curpos, 0), 1));
 	      pos = compute_related_constant (curpos,
 					      round_up (last_pos, align));
 	    }
@@ -2374,8 +2372,8 @@  create_field_decl (tree field_name, tree field_type, tree record_type,
 	 that an alignment of 0 is taken as infinite.  */
       unsigned int known_align;
 
-      if (host_integerp (pos, 1))
-	known_align = tree_low_cst (pos, 1) & - tree_low_cst (pos, 1);
+      if (tree_fits_uhwi_p (pos))
+	known_align = tree_to_hwi (pos) & - tree_to_hwi (pos);
       else
 	known_align = BITS_PER_UNIT;
 
@@ -2385,7 +2383,7 @@  create_field_decl (tree field_name, tree field_type, tree record_type,
 
       layout_decl (field_decl, known_align);
       SET_DECL_OFFSET_ALIGN (field_decl,
-			     host_integerp (pos, 1) ? BIGGEST_ALIGNMENT
+			     tree_fits_uhwi_p (pos) ? BIGGEST_ALIGNMENT
 			     : BITS_PER_UNIT);
       pos_from_bit (&DECL_FIELD_OFFSET (field_decl),
 		    &DECL_FIELD_BIT_OFFSET (field_decl),
@@ -2536,8 +2534,8 @@  invalidate_global_renaming_pointers (void)
 bool
 value_factor_p (tree value, HOST_WIDE_INT factor)
 {
-  if (host_integerp (value, 1))
-    return tree_low_cst (value, 1) % factor == 0;
+  if (tree_fits_uhwi_p (value))
+    return tree_to_hwi (value) % factor == 0;
 
   if (TREE_CODE (value) == MULT_EXPR)
     return (value_factor_p (TREE_OPERAND (value, 0), factor)
@@ -2570,16 +2568,16 @@  potential_alignment_gap (tree prev_field, tree curr_field, tree offset)
   /* If the distance between the end of prev_field and the beginning of
      curr_field is constant, then there is a gap if the value of this
      constant is not null. */
-  if (offset && host_integerp (offset, 1))
+  if (offset && tree_fits_uhwi_p (offset))
     return !integer_zerop (offset);
 
   /* If the size and position of the previous field are constant,
      then check the sum of this size and position. There will be a gap
      iff it is not multiple of the current field alignment. */
-  if (host_integerp (DECL_SIZE (prev_field), 1)
-      && host_integerp (bit_position (prev_field), 1))
-    return ((tree_low_cst (bit_position (prev_field), 1)
-	     + tree_low_cst (DECL_SIZE (prev_field), 1))
+  if (tree_fits_uhwi_p (DECL_SIZE (prev_field))
+      && tree_fits_uhwi_p (bit_position (prev_field)))
+    return ((tree_to_hwi (bit_position (prev_field))
+	     + tree_to_hwi (DECL_SIZE (prev_field)))
 	    % DECL_ALIGN (curr_field) != 0);
 
   /* If both the position and size of the previous field are multiples
@@ -3212,7 +3210,7 @@  build_vms_descriptor32 (tree type, Mechanism_Type mech, Entity_Id gnat_entity)
     case ENUMERAL_TYPE:
     case BOOLEAN_TYPE:
       if (TYPE_VAX_FLOATING_POINT_P (type))
-	switch (tree_low_cst (TYPE_DIGITS_VALUE (type), 1))
+	switch (tree_to_uhwi (TYPE_DIGITS_VALUE (type)))
 	  {
 	  case 6:
 	    dtype = 10;
@@ -3252,7 +3250,7 @@  build_vms_descriptor32 (tree type, Mechanism_Type mech, Entity_Id gnat_entity)
     case COMPLEX_TYPE:
       if (TREE_CODE (TREE_TYPE (type)) == INTEGER_TYPE
 	  && TYPE_VAX_FLOATING_POINT_P (type))
-	switch (tree_low_cst (TYPE_DIGITS_VALUE (type), 1))
+	switch (tree_to_uhwi (TYPE_DIGITS_VALUE (type)))
 	  {
 	  case 6:
 	    dtype = 12;
@@ -3513,7 +3511,7 @@  build_vms_descriptor (tree type, Mechanism_Type mech, Entity_Id gnat_entity)
     case ENUMERAL_TYPE:
     case BOOLEAN_TYPE:
       if (TYPE_VAX_FLOATING_POINT_P (type))
-	switch (tree_low_cst (TYPE_DIGITS_VALUE (type), 1))
+	switch (tree_to_uhwi (TYPE_DIGITS_VALUE (type)))
 	  {
 	  case 6:
 	    dtype = 10;
@@ -3553,7 +3551,7 @@  build_vms_descriptor (tree type, Mechanism_Type mech, Entity_Id gnat_entity)
     case COMPLEX_TYPE:
       if (TREE_CODE (TREE_TYPE (type)) == INTEGER_TYPE
 	  && TYPE_VAX_FLOATING_POINT_P (type))
-	switch (tree_low_cst (TYPE_DIGITS_VALUE (type), 1))
+	switch (tree_to_uhwi (TYPE_DIGITS_VALUE (type)))
 	  {
 	  case 6:
 	    dtype = 12;
@@ -3807,7 +3805,7 @@  convert_vms_descriptor64 (tree gnu_type, tree gnu_expr, Entity_Id gnat_subprog)
       tree max_field = DECL_CHAIN (TYPE_FIELDS (template_type));
       tree template_tree, template_addr, aflags, dimct, t, u;
       /* See the head comment of build_vms_descriptor.  */
-      int iklass = TREE_INT_CST_LOW (DECL_INITIAL (klass));
+      int iklass = tree_to_shwi (DECL_INITIAL (klass));
       tree lfield, ufield;
       VEC(constructor_elt,gc) *v;
 
@@ -3961,7 +3959,7 @@  convert_vms_descriptor32 (tree gnu_type, tree gnu_expr, Entity_Id gnat_subprog)
       tree max_field = DECL_CHAIN (TYPE_FIELDS (template_type));
       tree template_tree, template_addr, aflags, dimct, t, u;
       /* See the head comment of build_vms_descriptor.  */
-      int iklass = TREE_INT_CST_LOW (DECL_INITIAL (klass));
+      int iklass = tree_to_shwi (DECL_INITIAL (klass));
       VEC(constructor_elt,gc) *v;
 
       /* Convert POINTER to the pointer-to-array type.  */
@@ -5258,7 +5256,7 @@  unchecked_convert (tree type, tree expr, bool notrunc_p)
 				     GET_MODE_BITSIZE (TYPE_MODE (type))))
     {
       tree rec_type = make_node (RECORD_TYPE);
-      unsigned HOST_WIDE_INT prec = TREE_INT_CST_LOW (TYPE_RM_SIZE (type));
+      unsigned HOST_WIDE_INT prec = tree_to_uhwi (TYPE_RM_SIZE (type));
       tree field_type, field;
 
       if (TYPE_UNSIGNED (type))
@@ -5287,7 +5285,7 @@  unchecked_convert (tree type, tree expr, bool notrunc_p)
 				     GET_MODE_BITSIZE (TYPE_MODE (etype))))
     {
       tree rec_type = make_node (RECORD_TYPE);
-      unsigned HOST_WIDE_INT prec = TREE_INT_CST_LOW (TYPE_RM_SIZE (etype));
+      unsigned HOST_WIDE_INT prec = tree_to_uhwi (TYPE_RM_SIZE (etype));
       VEC(constructor_elt,gc) *v = VEC_alloc (constructor_elt, gc, 1);
       tree field_type, field;
 
@@ -6006,10 +6004,10 @@  get_nonnull_operand (tree arg_num_expr, unsigned HOST_WIDE_INT *valp)
 {
   /* Verify the arg number is a constant.  */
   if (TREE_CODE (arg_num_expr) != INTEGER_CST
-      || TREE_INT_CST_HIGH (arg_num_expr) != 0)
+      || !tree_fits_uhwi_p (arg_num_expr))
     return false;
 
-  *valp = TREE_INT_CST_LOW (arg_num_expr);
+  *valp = tree_to_hwi (arg_num_expr);
   return true;
 }
 
@@ -6246,7 +6244,7 @@  handle_vector_size_attribute (tree *node, tree name, tree args,
 
   size = TREE_VALUE (args);
 
-  if (!host_integerp (size, 1))
+  if (!tree_fits_uhwi_p (size))
     {
       warning (OPT_Wattributes, "%qs attribute ignored",
 	       IDENTIFIER_POINTER (name));
@@ -6254,7 +6252,7 @@  handle_vector_size_attribute (tree *node, tree name, tree args,
     }
 
   /* Get the vector size (in bytes).  */
-  vecsize = tree_low_cst (size, 1);
+  vecsize = tree_to_hwi (size);
 
   /* We need to provide for vector pointers, vector arrays, and
      functions returning vectors.  For example:
@@ -6278,7 +6276,7 @@  handle_vector_size_attribute (tree *node, tree name, tree args,
       || (!SCALAR_FLOAT_MODE_P (orig_mode)
 	  && GET_MODE_CLASS (orig_mode) != MODE_INT
 	  && !ALL_SCALAR_FIXED_POINT_MODE_P (orig_mode))
-      || !host_integerp (TYPE_SIZE_UNIT (type), 1)
+      || !tree_fits_uhwi_p (TYPE_SIZE_UNIT (type))
       || TREE_CODE (type) == BOOLEAN_TYPE)
     {
       error ("invalid vector type for attribute %qs",
@@ -6286,7 +6284,7 @@  handle_vector_size_attribute (tree *node, tree name, tree args,
       return NULL_TREE;
     }
 
-  if (vecsize % tree_low_cst (TYPE_SIZE_UNIT (type), 1))
+  if (vecsize % tree_to_hwi (TYPE_SIZE_UNIT (type)))
     {
       error ("vector size not an integral multiple of component size");
       return NULL;
@@ -6299,7 +6297,7 @@  handle_vector_size_attribute (tree *node, tree name, tree args,
     }
 
   /* Calculate how many units fit in the vector.  */
-  nunits = vecsize / tree_low_cst (TYPE_SIZE_UNIT (type), 1);
+  nunits = vecsize / tree_to_hwi (TYPE_SIZE_UNIT (type));
   if (nunits & (nunits - 1))
     {
       error ("number of components of the vector not a power of two");
@@ -6354,7 +6352,7 @@  handle_vector_type_attribute (tree *node, tree name, tree ARG_UNUSED (args),
      bases, and this attribute is for binding implementors, not end-users, so
      we should never get there from legitimate explicit uses.  */
 
-  if (!host_integerp (rep_size, 1))
+  if (!tree_fits_uhwi_p (rep_size))
     return NULL_TREE;
 
   /* Get the element type/mode and check this is something we know
@@ -6369,7 +6367,7 @@  handle_vector_type_attribute (tree *node, tree name, tree ARG_UNUSED (args),
       || (!SCALAR_FLOAT_MODE_P (elem_mode)
 	  && GET_MODE_CLASS (elem_mode) != MODE_INT
 	  && !ALL_SCALAR_FIXED_POINT_MODE_P (elem_mode))
-      || !host_integerp (TYPE_SIZE_UNIT (elem_type), 1))
+      || !tree_fits_uhwi_p (TYPE_SIZE_UNIT (elem_type)))
     {
       error ("invalid element type for attribute %qs",
 	     IDENTIFIER_POINTER (name));
@@ -6378,9 +6376,9 @@  handle_vector_type_attribute (tree *node, tree name, tree ARG_UNUSED (args),
 
   /* Sanity check the vector size and element type consistency.  */
 
-  vec_bytes = tree_low_cst (rep_size, 1);
+  vec_bytes = tree_to_hwi (rep_size);
 
-  if (vec_bytes % tree_low_cst (TYPE_SIZE_UNIT (elem_type), 1))
+  if (vec_bytes % tree_to_hwi (TYPE_SIZE_UNIT (elem_type)))
     {
       error ("vector size not an integral multiple of component size");
       return NULL;
@@ -6392,7 +6390,7 @@  handle_vector_type_attribute (tree *node, tree name, tree ARG_UNUSED (args),
       return NULL;
     }
 
-  vec_units = vec_bytes / tree_low_cst (TYPE_SIZE_UNIT (elem_type), 1);
+  vec_units = vec_bytes / tree_to_hwi (TYPE_SIZE_UNIT (elem_type));
   if (vec_units & (vec_units - 1))
     {
       error ("number of components of the vector not a power of two");
diff --git a/gcc/ada/gcc-interface/utils2.c b/gcc/ada/gcc-interface/utils2.c
index 4578114..472f181 100644
--- a/gcc/ada/gcc-interface/utils2.c
+++ b/gcc/ada/gcc-interface/utils2.c
@@ -119,7 +119,7 @@  known_alignment (tree exp)
 
     case INTEGER_CST:
       {
-	unsigned HOST_WIDE_INT c = TREE_INT_CST_LOW (exp);
+	unsigned HOST_WIDE_INT c = tree_to_uhwi (exp);
 	/* The first part of this represents the lowest bit in the constant,
 	   but it is originally in bytes, not bits.  */
 	this_alignment = MIN (BITS_PER_UNIT * (c & -c), BIGGEST_ALIGNMENT);
@@ -626,7 +626,7 @@  nonbinary_modular_operation (enum tree_code op_code, tree type, tree lhs,
 static unsigned int
 resolve_atomic_size (tree type)
 {
-  unsigned HOST_WIDE_INT size = tree_low_cst (TYPE_SIZE_UNIT (type), 1);
+  unsigned HOST_WIDE_INT size = tree_to_uhwi (TYPE_SIZE_UNIT (type));
 
   if (size == 1 || size == 2 || size == 4 || size == 8 || size == 16)
     return size;
diff --git a/gcc/alias.c b/gcc/alias.c
index 9e67823..e854d96 100644
--- a/gcc/alias.c
+++ b/gcc/alias.c
@@ -359,8 +359,8 @@  ao_ref_from_mem (ao_ref *ref, const_rtx mem)
   if (MEM_EXPR (mem) != get_spill_slot_decl (false)
       && (ref->offset < 0
 	  || (DECL_P (ref->base)
-	      && (!host_integerp (DECL_SIZE (ref->base), 1)
-		  || (TREE_INT_CST_LOW (DECL_SIZE ((ref->base)))
+	      && (!tree_fits_uhwi_p (DECL_SIZE (ref->base))
+		  || ((unsigned HOST_WIDE_INT)tree_to_hwi (DECL_SIZE ((ref->base)))
 		      < (unsigned HOST_WIDE_INT)(ref->offset + ref->size))))))
     return false;
 
@@ -2283,13 +2283,13 @@  adjust_offset_for_component_ref (tree x, bool *known_p,
       tree xoffset = component_ref_field_offset (x);
       tree field = TREE_OPERAND (x, 1);
 
-      if (! host_integerp (xoffset, 1))
+      if (! tree_fits_uhwi_p (xoffset))
 	{
 	  *known_p = false;
 	  return;
 	}
-      *offset += (tree_low_cst (xoffset, 1)
-		  + (tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 1)
+      *offset += (tree_to_hwi (xoffset)
+		  + (tree_to_hwi (DECL_FIELD_BIT_OFFSET (field))
 		     / BITS_PER_UNIT));
 
       x = TREE_OPERAND (x, 0);
diff --git a/gcc/builtins.c b/gcc/builtins.c
index 67a9599..c1722ab 100644
--- a/gcc/builtins.c
+++ b/gcc/builtins.c
@@ -338,8 +338,8 @@  get_object_alignment_2 (tree exp, unsigned int *alignp,
       if (TREE_CODE (addr) == BIT_AND_EXPR
 	  && TREE_CODE (TREE_OPERAND (addr, 1)) == INTEGER_CST)
 	{
-	  align = (TREE_INT_CST_LOW (TREE_OPERAND (addr, 1))
-		    & -TREE_INT_CST_LOW (TREE_OPERAND (addr, 1)));
+	  align = (tree_to_hwi (TREE_OPERAND (addr, 1))
+		   & -tree_to_hwi (TREE_OPERAND (addr, 1)));
 	  align *= BITS_PER_UNIT;
 	  addr = TREE_OPERAND (addr, 0);
 	}
@@ -356,7 +356,7 @@  get_object_alignment_2 (tree exp, unsigned int *alignp,
 	    {
 	      unsigned HOST_WIDE_INT step = 1;
 	      if (TMR_STEP (exp))
-		step = TREE_INT_CST_LOW (TMR_STEP (exp));
+		step = tree_to_hwi (TMR_STEP (exp));
 	      align = MIN (align, (step & -step) * BITS_PER_UNIT);
 	    }
 	  if (TMR_INDEX2 (exp))
@@ -407,23 +407,23 @@  get_object_alignment_2 (tree exp, unsigned int *alignp,
 	}
       else
 	next_offset = NULL;
-      if (host_integerp (offset, 1))
+      if (tree_fits_uhwi_p (offset))
 	{
 	  /* Any overflow in calculating offset_bits won't change
 	     the alignment.  */
 	  unsigned offset_bits
-	    = ((unsigned) tree_low_cst (offset, 1) * BITS_PER_UNIT);
+	    = ((unsigned) tree_to_hwi (offset) * BITS_PER_UNIT);
 
 	  if (offset_bits)
 	    inner = MIN (inner, (offset_bits & -offset_bits));
 	}
       else if (TREE_CODE (offset) == MULT_EXPR
-	       && host_integerp (TREE_OPERAND (offset, 1), 1))
+	       && tree_fits_uhwi_p (TREE_OPERAND (offset, 1)))
 	{
 	  /* Any overflow in calculating offset_factor won't change
 	     the alignment.  */
 	  unsigned offset_factor
-	    = ((unsigned) tree_low_cst (TREE_OPERAND (offset, 1), 1)
+	    = ((unsigned) tree_to_hwi (TREE_OPERAND (offset, 1))
 	       * BITS_PER_UNIT);
 
 	  if (offset_factor)
@@ -514,7 +514,7 @@  get_pointer_alignment_1 (tree exp, unsigned int *alignp,
   else if (TREE_CODE (exp) == INTEGER_CST)
     {
       *alignp = BIGGEST_ALIGNMENT;
-      *bitposp = ((TREE_INT_CST_LOW (exp) * BITS_PER_UNIT)
+      *bitposp = ((tree_to_hwi (exp) * BITS_PER_UNIT)
 		  & (BIGGEST_ALIGNMENT - 1));
       return true;
     }
@@ -623,10 +623,10 @@  c_strlen (tree src, int only_value)
      a null character if we can represent it as a single HOST_WIDE_INT.  */
   if (offset_node == 0)
     offset = 0;
-  else if (! host_integerp (offset_node, 0))
+  else if (!tree_fits_shwi_p (offset_node))
     offset = -1;
   else
-    offset = tree_low_cst (offset_node, 0);
+    offset = tree_to_hwi (offset_node);
 
   /* If the offset is known to be out of bounds, warn, and call strlen at
      runtime.  */
@@ -664,11 +664,11 @@  c_getstr (tree src)
 
   if (offset_node == 0)
     return TREE_STRING_POINTER (src);
-  else if (!host_integerp (offset_node, 1)
+  else if (!tree_fits_uhwi_p (offset_node)
 	   || compare_tree_int (offset_node, TREE_STRING_LENGTH (src) - 1) > 0)
     return 0;
 
-  return TREE_STRING_POINTER (src) + tree_low_cst (offset_node, 1);
+  return TREE_STRING_POINTER (src) + tree_to_uhwi (offset_node);
 }
 
 /* Return a constant integer corresponding to target reading
@@ -723,7 +723,9 @@  target_char_cast (tree cst, char *p)
       || CHAR_TYPE_SIZE > HOST_BITS_PER_WIDE_INT)
     return 1;
 
-  val = TREE_INT_CST_LOW (cst);
+  /* Do not care if it fits or not right here.  */
+  val = tree_to_hwi (cst);
+
   if (CHAR_TYPE_SIZE < HOST_BITS_PER_WIDE_INT)
     val &= (((unsigned HOST_WIDE_INT) 1) << CHAR_TYPE_SIZE) - 1;
 
@@ -3171,7 +3173,7 @@  expand_builtin_mempcpy_args (tree dest, tree src, tree len,
 	return NULL_RTX;
 
       /* If LEN is not constant, call the normal function.  */
-      if (! host_integerp (len, 1))
+      if (! tree_fits_uhwi_p (len))
 	return NULL_RTX;
 
       len_rtx = expand_normal (len);
@@ -3406,7 +3408,7 @@  expand_builtin_strncpy (tree exp, rtx target)
       tree slen = c_strlen (src, 1);
 
       /* We must be passed a constant len and src parameter.  */
-      if (!host_integerp (len, 1) || !slen || !host_integerp (slen, 1))
+      if (!tree_fits_uhwi_p (len) || !slen || !tree_fits_uhwi_p (slen))
 	return NULL_RTX;
 
       slen = size_binop_loc (loc, PLUS_EXPR, slen, ssize_int (1));
@@ -3420,15 +3422,15 @@  expand_builtin_strncpy (tree exp, rtx target)
 	  const char *p = c_getstr (src);
 	  rtx dest_mem;
 
-	  if (!p || dest_align == 0 || !host_integerp (len, 1)
-	      || !can_store_by_pieces (tree_low_cst (len, 1),
+	  if (!p || dest_align == 0 || !tree_fits_uhwi_p (len)
+	      || !can_store_by_pieces (tree_to_uhwi (len),
 				       builtin_strncpy_read_str,
 				       CONST_CAST (char *, p),
 				       dest_align, false))
 	    return NULL_RTX;
 
 	  dest_mem = get_memory_rtx (dest, len);
-	  store_by_pieces (dest_mem, tree_low_cst (len, 1),
+	  store_by_pieces (dest_mem, tree_to_uhwi (len),
 			   builtin_strncpy_read_str,
 			   CONST_CAST (char *, p), dest_align, false, 0);
 	  dest_mem = force_operand (XEXP (dest_mem, 0), target);
@@ -3561,13 +3563,13 @@  expand_builtin_memset_args (tree dest, tree val, tree len,
        * the coefficients by pieces (in the required modes).
        * We can't pass builtin_memset_gen_str as that emits RTL.  */
       c = 1;
-      if (host_integerp (len, 1)
-	  && can_store_by_pieces (tree_low_cst (len, 1),
+      if (tree_fits_uhwi_p (len)
+	  && can_store_by_pieces (tree_to_hwi (len),
 				  builtin_memset_read_str, &c, dest_align,
 				  true))
 	{
 	  val_rtx = force_reg (val_mode, val_rtx);
-	  store_by_pieces (dest_mem, tree_low_cst (len, 1),
+	  store_by_pieces (dest_mem, tree_to_hwi (len),
 			   builtin_memset_gen_str, val_rtx, dest_align,
 			   true, 0);
 	}
@@ -3586,11 +3588,11 @@  expand_builtin_memset_args (tree dest, tree val, tree len,
 
   if (c)
     {
-      if (host_integerp (len, 1)
-	  && can_store_by_pieces (tree_low_cst (len, 1),
+      if (tree_fits_uhwi_p (len)
+	  && can_store_by_pieces (tree_to_hwi (len),
 				  builtin_memset_read_str, &c, dest_align,
 				  true))
-	store_by_pieces (dest_mem, tree_low_cst (len, 1),
+	store_by_pieces (dest_mem, tree_to_hwi (len),
 			 builtin_memset_read_str, &c, dest_align, true, 0);
       else if (!set_storage_via_setmem (dest_mem, len_rtx,
 					gen_int_mode (c, val_mode),
@@ -4495,7 +4497,7 @@  expand_builtin_frame_address (tree fndecl, tree exp)
   if (call_expr_nargs (exp) == 0)
     /* Warning about missing arg was already issued.  */
     return const0_rtx;
-  else if (! host_integerp (CALL_EXPR_ARG (exp, 0), 1))
+  else if (! tree_fits_uhwi_p (CALL_EXPR_ARG (exp, 0)))
     {
       if (DECL_FUNCTION_CODE (fndecl) == BUILT_IN_FRAME_ADDRESS)
 	error ("invalid argument to %<__builtin_frame_address%>");
@@ -4507,7 +4509,7 @@  expand_builtin_frame_address (tree fndecl, tree exp)
     {
       rtx tem
 	= expand_builtin_return_addr (DECL_FUNCTION_CODE (fndecl),
-				      tree_low_cst (CALL_EXPR_ARG (exp, 0), 1));
+				      tree_to_uhwi (CALL_EXPR_ARG (exp, 0)));
 
       /* Some ports cannot access arbitrary stack frames.  */
       if (tem == NULL)
@@ -4561,7 +4563,7 @@  expand_builtin_alloca (tree exp, bool cannot_accumulate)
 
   /* Compute the alignment.  */
   align = (alloca_with_align
-	   ? TREE_INT_CST_LOW (CALL_EXPR_ARG (exp, 1))
+	   ? tree_to_uhwi (CALL_EXPR_ARG (exp, 1))
 	   : BIGGEST_ALIGNMENT);
 
   /* Allocate the desired space.  */
@@ -5388,7 +5390,7 @@  expand_builtin_atomic_compare_exchange (enum machine_mode mode, tree exp,
 
   weak = CALL_EXPR_ARG (exp, 3);
   is_weak = false;
-  if (host_integerp (weak, 0) && tree_low_cst (weak, 0) != 0)
+  if (tree_fits_shwi_p (weak) && tree_to_hwi (weak) != 0)
     is_weak = true;
 
   oldval = expect;
@@ -8504,9 +8506,9 @@  fold_builtin_powi (location_t loc, tree fndecl ATTRIBUTE_UNUSED,
   if (real_onep (arg0))
     return omit_one_operand_loc (loc, type, build_real (type, dconst1), arg1);
 
-  if (host_integerp (arg1, 0))
+  if (tree_fits_shwi_p (arg1))
     {
-      HOST_WIDE_INT c = TREE_INT_CST_LOW (arg1);
+      HOST_WIDE_INT c = tree_to_hwi (arg1);
 
       /* Evaluate powi at compile-time.  */
       if (TREE_CODE (arg0) == REAL_CST
@@ -8603,7 +8605,7 @@  fold_builtin_memset (location_t loc, tree dest, tree c, tree len,
       || ! validate_arg (len, INTEGER_TYPE))
     return NULL_TREE;
 
-  if (! host_integerp (len, 1))
+  if (! tree_fits_uhwi_p (len))
     return NULL_TREE;
 
   /* If the LEN parameter is zero, return DEST.  */
@@ -8633,7 +8635,7 @@  fold_builtin_memset (location_t loc, tree dest, tree c, tree len,
   if (! var_decl_component_p (var))
     return NULL_TREE;
 
-  length = tree_low_cst (len, 1);
+  length = tree_to_uhwi (len);
   if (GET_MODE_SIZE (TYPE_MODE (etype)) != length
       || get_pointer_alignment (dest) / BITS_PER_UNIT < length)
     return NULL_TREE;
@@ -8648,7 +8650,7 @@  fold_builtin_memset (location_t loc, tree dest, tree c, tree len,
       if (CHAR_BIT != 8 || BITS_PER_UNIT != 8 || HOST_BITS_PER_WIDE_INT > 64)
 	return NULL_TREE;
 
-      cval = TREE_INT_CST_LOW (c);
+      cval = tree_to_hwi (c);
       cval &= 0xff;
       cval |= cval << 8;
       cval |= cval << 16;
@@ -8736,9 +8738,9 @@  fold_builtin_memory_op (location_t loc, tree dest, tree src,
 	  if (!dest_align || !src_align)
 	    return NULL_TREE;
 	  if (readonly_data_expr (src)
-	      || (host_integerp (len, 1)
+	      || (tree_fits_uhwi_p (len)
 		  && (MIN (src_align, dest_align) / BITS_PER_UNIT
-		      >= (unsigned HOST_WIDE_INT) tree_low_cst (len, 1))))
+		      >= (unsigned HOST_WIDE_INT) tree_to_uhwi (len))))
 	    {
 	      tree fn = builtin_decl_implicit (BUILT_IN_MEMCPY);
 	      if (!fn)
@@ -8761,8 +8763,8 @@  fold_builtin_memory_op (location_t loc, tree dest, tree src,
 	      destvar = TREE_OPERAND (dest, 0);
 	      dest_base = get_ref_base_and_extent (destvar, &dest_offset,
 						   &size, &maxsize);
-	      if (host_integerp (len, 1))
-		maxsize = tree_low_cst (len, 1);
+	      if (tree_fits_uhwi_p (len))
+		maxsize = tree_to_hwi (len);
 	      else
 		maxsize = -1;
 	      src_offset /= BITS_PER_UNIT;
@@ -8828,7 +8830,7 @@  fold_builtin_memory_op (location_t loc, tree dest, tree src,
 	  return NULL_TREE;
 	}
 
-      if (!host_integerp (len, 0))
+      if (!tree_fits_shwi_p (len))
 	return NULL_TREE;
       /* FIXME:
          This logic lose for arguments like (type *)malloc (sizeof (type)),
@@ -9116,7 +9118,7 @@  fold_builtin_memchr (location_t loc, tree arg1, tree arg2, tree len, tree type)
       const char *p1;
 
       if (TREE_CODE (arg2) != INTEGER_CST
-	  || !host_integerp (len, 1))
+	  || !tree_fits_uhwi_p (len))
 	return NULL_TREE;
 
       p1 = c_getstr (arg1);
@@ -9129,7 +9131,7 @@  fold_builtin_memchr (location_t loc, tree arg1, tree arg2, tree len, tree type)
 	  if (target_char_cast (arg2, &c))
 	    return NULL_TREE;
 
-	  r = (const char *) memchr (p1, c, tree_low_cst (len, 1));
+	  r = (const char *) memchr (p1, c, tree_to_uhwi (len));
 
 	  if (r == NULL)
 	    return build_int_cst (TREE_TYPE (arg1), 0);
@@ -9168,11 +9170,11 @@  fold_builtin_memcmp (location_t loc, tree arg1, tree arg2, tree len)
 
   /* If all arguments are constant, and the value of len is not greater
      than the lengths of arg1 and arg2, evaluate at compile-time.  */
-  if (host_integerp (len, 1) && p1 && p2
+  if (tree_fits_uhwi_p (len) && p1 && p2
       && compare_tree_int (len, strlen (p1) + 1) <= 0
       && compare_tree_int (len, strlen (p2) + 1) <= 0)
     {
-      const int r = memcmp (p1, p2, tree_low_cst (len, 1));
+      const int r = memcmp (p1, p2, tree_to_uhwi (len));
 
       if (r > 0)
 	return integer_one_node;
@@ -9184,7 +9186,7 @@  fold_builtin_memcmp (location_t loc, tree arg1, tree arg2, tree len)
 
   /* If len parameter is one, return an expression corresponding to
      (*(const unsigned char*)arg1 - (const unsigned char*)arg2).  */
-  if (host_integerp (len, 1) && tree_low_cst (len, 1) == 1)
+  if (tree_fits_uhwi_p (len) && tree_to_hwi (len) == 1)
     {
       tree cst_uchar_node = build_type_variant (unsigned_char_type_node, 1, 0);
       tree cst_uchar_ptr_node
@@ -9296,9 +9298,9 @@  fold_builtin_strncmp (location_t loc, tree arg1, tree arg2, tree len)
   p1 = c_getstr (arg1);
   p2 = c_getstr (arg2);
 
-  if (host_integerp (len, 1) && p1 && p2)
+  if (tree_fits_uhwi_p (len) && p1 && p2)
     {
-      const int i = strncmp (p1, p2, tree_low_cst (len, 1));
+      const int i = strncmp (p1, p2, tree_to_hwi (len));
       if (i > 0)
 	return integer_one_node;
       else if (i < 0)
@@ -9344,7 +9346,7 @@  fold_builtin_strncmp (location_t loc, tree arg1, tree arg2, tree len)
 
   /* If len parameter is one, return an expression corresponding to
      (*(const unsigned char*)arg1 - (const unsigned char*)arg2).  */
-  if (host_integerp (len, 1) && tree_low_cst (len, 1) == 1)
+  if (tree_fits_uhwi_p (len) && tree_to_hwi (len) == 1)
     {
       tree cst_uchar_node = build_type_variant (unsigned_char_type_node, 1, 0);
       tree cst_uchar_ptr_node
@@ -9793,7 +9795,7 @@  fold_builtin_load_exponent (location_t loc, tree arg0, tree arg1,
       /* If both arguments are constant, then try to evaluate it.  */
       if ((ldexp || REAL_MODE_FORMAT (TYPE_MODE (type))->b == 2)
 	  && TREE_CODE (arg0) == REAL_CST && !TREE_OVERFLOW (arg0)
-	  && host_integerp (arg1, 0))
+	  && tree_fits_shwi_p (arg1))
         {
 	  /* Bound the maximum adjustment to twice the range of the
 	     mode's valid exponents.  Use abs to ensure the range is
@@ -9803,7 +9805,7 @@  fold_builtin_load_exponent (location_t loc, tree arg0, tree arg1,
 		 - REAL_MODE_FORMAT (TYPE_MODE (type))->emin);
 
 	  /* Get the user-requested adjustment.  */
-	  const HOST_WIDE_INT req_exp_adj = tree_low_cst (arg1, 0);
+	  const HOST_WIDE_INT req_exp_adj = tree_to_shwi (arg1);
 
 	  /* The requested adjustment must be inside this range.  This
 	     is a preliminary cap to avoid things like overflow, we
@@ -12273,7 +12275,7 @@  fold_builtin_snprintf (location_t loc, tree dest, tree destsize, tree fmt,
   if (orig && !validate_arg (orig, POINTER_TYPE))
     return NULL_TREE;
 
-  if (!host_integerp (destsize, 1))
+  if (!tree_fits_uhwi_p (destsize))
     return NULL_TREE;
 
   /* Check whether the format is a literal string constant.  */
@@ -12287,7 +12289,7 @@  fold_builtin_snprintf (location_t loc, tree dest, tree destsize, tree fmt,
   if (!init_target_chars ())
     return NULL_TREE;
 
-  destlen = tree_low_cst (destsize, 1);
+  destlen = tree_to_hwi (destsize);
 
   /* If the format doesn't contain % args or %%, use strcpy.  */
   if (strchr (fmt_str, target_percent) == NULL)
@@ -12332,10 +12334,10 @@  fold_builtin_snprintf (location_t loc, tree dest, tree destsize, tree fmt,
 	return NULL_TREE;
 
       retval = c_strlen (orig, 1);
-      if (!retval || !host_integerp (retval, 1))  
+      if (!retval || !tree_fits_uhwi_p (retval))  
 	return NULL_TREE;
 
-      origlen = tree_low_cst (retval, 1);
+      origlen = tree_to_hwi (retval);
       /* We could expand this as
 	 memcpy (str1, str2, cst - 1); str1[cst - 1] = '\0';
 	 or to
@@ -12397,7 +12399,7 @@  expand_builtin_object_size (tree exp)
       return const0_rtx;
     }
 
-  object_size_type = tree_low_cst (ost, 0);
+  object_size_type = tree_to_shwi (ost);
 
   return object_size_type < 2 ? constm1_rtx : const0_rtx;
 }
@@ -12426,10 +12428,10 @@  expand_builtin_memory_chk (tree exp, rtx target, enum machine_mode mode,
   len = CALL_EXPR_ARG (exp, 2);
   size = CALL_EXPR_ARG (exp, 3);
 
-  if (! host_integerp (size, 1))
+  if (! tree_fits_uhwi_p (size))
     return NULL_RTX;
 
-  if (host_integerp (len, 1) || integer_all_onesp (size))
+  if (tree_fits_uhwi_p (len) || integer_all_onesp (size))
     {
       tree fn;
 
@@ -12560,22 +12562,22 @@  maybe_emit_chk_warning (tree exp, enum built_in_function fcode)
   if (!len || !size)
     return;
 
-  if (! host_integerp (size, 1) || integer_all_onesp (size))
+  if (! tree_fits_uhwi_p (size) || integer_all_onesp (size))
     return;
 
   if (is_strlen)
     {
       len = c_strlen (len, 1);
-      if (! len || ! host_integerp (len, 1) || tree_int_cst_lt (len, size))
+      if (! len || ! tree_fits_uhwi_p (len) || tree_int_cst_lt (len, size))
 	return;
     }
   else if (fcode == BUILT_IN_STRNCAT_CHK)
     {
       tree src = CALL_EXPR_ARG (exp, 1);
-      if (! src || ! host_integerp (len, 1) || tree_int_cst_lt (len, size))
+      if (! src || ! tree_fits_uhwi_p (len) || tree_int_cst_lt (len, size))
 	return;
       src = c_strlen (src, 1);
-      if (! src || ! host_integerp (src, 1))
+      if (! src || ! tree_fits_uhwi_p (src))
 	{
 	  warning_at (loc, 0, "%Kcall to %D might overflow destination buffer",
 		      exp, get_callee_fndecl (exp));
@@ -12584,7 +12586,7 @@  maybe_emit_chk_warning (tree exp, enum built_in_function fcode)
       else if (tree_int_cst_lt (src, size))
 	return;
     }
-  else if (! host_integerp (len, 1) || ! tree_int_cst_lt (size, len))
+  else if (! tree_fits_uhwi_p (len) || ! tree_int_cst_lt (size, len))
     return;
 
   warning_at (loc, 0, "%Kcall to %D will always overflow destination buffer",
@@ -12608,7 +12610,7 @@  maybe_emit_sprintf_chk_warning (tree exp, enum built_in_function fcode)
   size = CALL_EXPR_ARG (exp, 2);
   fmt = CALL_EXPR_ARG (exp, 3);
 
-  if (! host_integerp (size, 1) || integer_all_onesp (size))
+  if (! tree_fits_uhwi_p (size) || integer_all_onesp (size))
     return;
 
   /* Check whether the format is a literal string constant.  */
@@ -12636,7 +12638,7 @@  maybe_emit_sprintf_chk_warning (tree exp, enum built_in_function fcode)
 	return;
 
       len = c_strlen (arg, 1);
-      if (!len || ! host_integerp (len, 1))
+      if (!len || ! tree_fits_uhwi_p (len))
 	return;
     }
   else
@@ -12691,7 +12693,7 @@  fold_builtin_object_size (tree ptr, tree ost)
       || compare_tree_int (ost, 3) > 0)
     return NULL_TREE;
 
-  object_size_type = tree_low_cst (ost, 0);
+  object_size_type = tree_to_shwi (ost);
 
   /* __builtin_object_size doesn't evaluate side-effects in its arguments;
      if there are any side-effects, it returns (size_t) -1 for types 0 and 1
@@ -12757,17 +12759,17 @@  fold_builtin_memory_chk (location_t loc, tree fndecl,
 	}
     }
 
-  if (! host_integerp (size, 1))
+  if (! tree_fits_uhwi_p (size))
     return NULL_TREE;
 
   if (! integer_all_onesp (size))
     {
-      if (! host_integerp (len, 1))
+      if (! tree_fits_uhwi_p (len))
 	{
 	  /* If LEN is not constant, try MAXLEN too.
 	     For MAXLEN only allow optimizing into non-_ocs function
 	     if SIZE is >= MAXLEN, never convert to __ocs_fail ().  */
-	  if (maxlen == NULL_TREE || ! host_integerp (maxlen, 1))
+	  if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
 	    {
 	      if (fcode == BUILT_IN_MEMPCPY_CHK && ignore)
 		{
@@ -12839,18 +12841,18 @@  fold_builtin_stxcpy_chk (location_t loc, tree fndecl, tree dest,
   if (fcode == BUILT_IN_STRCPY_CHK && operand_equal_p (src, dest, 0))
     return fold_convert_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)), dest);
 
-  if (! host_integerp (size, 1))
+  if (! tree_fits_uhwi_p (size))
     return NULL_TREE;
 
   if (! integer_all_onesp (size))
     {
       len = c_strlen (src, 1);
-      if (! len || ! host_integerp (len, 1))
+      if (! len || ! tree_fits_uhwi_p (len))
 	{
 	  /* If LEN is not constant, try MAXLEN too.
 	     For MAXLEN only allow optimizing into non-_ocs function
 	     if SIZE is >= MAXLEN, never convert to __ocs_fail ().  */
-	  if (maxlen == NULL_TREE || ! host_integerp (maxlen, 1))
+	  if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
 	    {
 	      if (fcode == BUILT_IN_STPCPY_CHK)
 		{
@@ -12926,17 +12928,17 @@  fold_builtin_stxncpy_chk (location_t loc, tree dest, tree src,
          return build_call_expr_loc (loc, fn, 4, dest, src, len, size);
     }
 
-  if (! host_integerp (size, 1))
+  if (! tree_fits_uhwi_p (size))
     return NULL_TREE;
 
   if (! integer_all_onesp (size))
     {
-      if (! host_integerp (len, 1))
+      if (! tree_fits_uhwi_p (len))
 	{
 	  /* If LEN is not constant, try MAXLEN too.
 	     For MAXLEN only allow optimizing into non-_ocs function
 	     if SIZE is >= MAXLEN, never convert to __ocs_fail ().  */
-	  if (maxlen == NULL_TREE || ! host_integerp (maxlen, 1))
+	  if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
 	    return NULL_TREE;
 	}
       else
@@ -12975,7 +12977,7 @@  fold_builtin_strcat_chk (location_t loc, tree fndecl, tree dest,
   if (p && *p == '\0')
     return omit_one_operand_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)), dest, src);
 
-  if (! host_integerp (size, 1) || ! integer_all_onesp (size))
+  if (! tree_fits_uhwi_p (size) || ! integer_all_onesp (size))
     return NULL_TREE;
 
   /* If __builtin_strcat_chk is used, assume strcat is available.  */
@@ -13009,15 +13011,15 @@  fold_builtin_strncat_chk (location_t loc, tree fndecl,
   else if (integer_zerop (len))
     return omit_one_operand_loc (loc, TREE_TYPE (TREE_TYPE (fndecl)), dest, src);
 
-  if (! host_integerp (size, 1))
+  if (! tree_fits_uhwi_p (size))
     return NULL_TREE;
 
   if (! integer_all_onesp (size))
     {
       tree src_len = c_strlen (src, 1);
       if (src_len
-	  && host_integerp (src_len, 1)
-	  && host_integerp (len, 1)
+	  && tree_fits_uhwi_p (src_len)
+	  && tree_fits_uhwi_p (len)
 	  && ! tree_int_cst_lt (len, src_len))
 	{
 	  /* If LEN >= strlen (SRC), optimize into __strcat_chk.  */
@@ -13066,7 +13068,7 @@  fold_builtin_sprintf_chk_1 (location_t loc, int nargs, tree *args,
   if (!validate_arg (fmt, POINTER_TYPE))
     return NULL_TREE;
 
-  if (! host_integerp (size, 1))
+  if (! tree_fits_uhwi_p (size))
     return NULL_TREE;
 
   len = NULL_TREE;
@@ -13097,7 +13099,7 @@  fold_builtin_sprintf_chk_1 (location_t loc, int nargs, tree *args,
 	      if (validate_arg (arg, POINTER_TYPE))
 		{
 		  len = c_strlen (arg, 1);
-		  if (! len || ! host_integerp (len, 1))
+		  if (! len || ! tree_fits_uhwi_p (len))
 		    len = NULL_TREE;
 		}
 	    }
@@ -13174,17 +13176,17 @@  fold_builtin_snprintf_chk_1 (location_t loc, int nargs, tree *args,
   if (!validate_arg (fmt, POINTER_TYPE))
     return NULL_TREE;
 
-  if (! host_integerp (size, 1))
+  if (! tree_fits_uhwi_p (size))
     return NULL_TREE;
 
   if (! integer_all_onesp (size))
     {
-      if (! host_integerp (len, 1))
+      if (! tree_fits_uhwi_p (len))
 	{
 	  /* If LEN is not constant, try MAXLEN too.
 	     For MAXLEN only allow optimizing into non-_ocs function
 	     if SIZE is >= MAXLEN, never convert to __ocs_fail ().  */
-	  if (maxlen == NULL_TREE || ! host_integerp (maxlen, 1))
+	  if (maxlen == NULL_TREE || ! tree_fits_uhwi_p (maxlen))
 	    return NULL_TREE;
 	}
       else
@@ -13829,10 +13831,10 @@  do_mpfr_bessel_n (tree arg1, tree arg2, tree type,
   /* To proceed, MPFR must exactly represent the target floating point
      format, which only happens when the target base equals two.  */
   if (REAL_MODE_FORMAT (TYPE_MODE (type))->b == 2
-      && host_integerp (arg1, 0)
+      && tree_fits_shwi_p (arg1)
       && TREE_CODE (arg2) == REAL_CST && !TREE_OVERFLOW (arg2))
     {
-      const HOST_WIDE_INT n = tree_low_cst(arg1, 0);
+      const HOST_WIDE_INT n = tree_to_hwi(arg1);
       const REAL_VALUE_TYPE *const ra = &TREE_REAL_CST (arg2);
 
       if (n == (long)n
diff --git a/gcc/c-family/c-ada-spec.c b/gcc/c-family/c-ada-spec.c
index 4f38a63..badaa79 100644
--- a/gcc/c-family/c-ada-spec.c
+++ b/gcc/c-family/c-ada-spec.c
@@ -1773,9 +1773,9 @@  is_simple_enum (tree node)
       if (TREE_CODE (int_val) != INTEGER_CST)
 	int_val = DECL_INITIAL (int_val);
 
-      if (!host_integerp (int_val, 0))
+      if (!tree_fits_shwi_p (int_val))
 	return false;
-      else if (TREE_INT_CST_LOW (int_val) != count)
+      else if ((unsigned HOST_WIDE_INT)tree_to_hwi (int_val) != count)
 	return false;
 
       count++;
@@ -2169,10 +2169,10 @@  dump_generic_ada_node (pretty_printer *buffer, tree node, tree type,
     case INTEGER_CST:
       if (TREE_CODE (TREE_TYPE (node)) == POINTER_TYPE)
 	{
-	  pp_wide_integer (buffer, TREE_INT_CST_LOW (node));
+	  pp_wide_integer (buffer, tree_to_shwi (node));
 	  pp_string (buffer, "B"); /* pseudo-unit */
 	}
-      else if (!host_integerp (node, 0))
+      else if (!tree_fits_shwi_p (node))
 	{
 	  tree val = node;
 	  unsigned HOST_WIDE_INT low = TREE_INT_CST_LOW (val);
@@ -2190,7 +2190,7 @@  dump_generic_ada_node (pretty_printer *buffer, tree node, tree type,
 	  pp_string (buffer, pp_buffer (buffer)->digit_buffer);
 	}
       else
-	pp_wide_integer (buffer, TREE_INT_CST_LOW (node));
+	pp_wide_integer (buffer, tree_to_hwi (node));
       break;
 
     case REAL_CST:
diff --git a/gcc/c-family/c-common.c b/gcc/c-family/c-common.c
index 6de2f1c..930d5e5 100644
--- a/gcc/c-family/c-common.c
+++ b/gcc/c-family/c-common.c
@@ -826,7 +826,7 @@  finish_fname_decls (void)
       for (saved = TREE_PURPOSE (stack); saved; saved = TREE_CHAIN (saved))
 	{
 	  tree decl = TREE_PURPOSE (saved);
-	  unsigned ix = TREE_INT_CST_LOW (TREE_VALUE (saved));
+	  unsigned ix = tree_to_uhwi (TREE_VALUE (saved));
 
 	  *fname_vars[ix].decl = decl;
 	}
@@ -4578,7 +4578,7 @@  c_type_hash (const void *p)
   if (TREE_CODE (TYPE_SIZE (t)) != INTEGER_CST)
     size = 0;
   else
-    size = TREE_INT_CST_LOW (TYPE_SIZE (t));
+    size = tree_to_shwi (TYPE_SIZE (t));
   return ((size << 24) | (n_elements << shift));
 }
 
@@ -5850,14 +5850,14 @@  match_case_to_enum_1 (tree key, tree type, tree label)
 
   /* ??? Not working too hard to print the double-word value.
      Should perhaps be done with %lwd in the diagnostic routines?  */
-  if (TREE_INT_CST_HIGH (key) == 0)
+  if (tree_fits_uhwi_p (key))
     snprintf (buf, sizeof (buf), HOST_WIDE_INT_PRINT_UNSIGNED,
-	      TREE_INT_CST_LOW (key));
+	      tree_to_hwi (key));
   else if (!TYPE_UNSIGNED (type)
 	   && TREE_INT_CST_HIGH (key) == -1
 	   && TREE_INT_CST_LOW (key) != 0)
     snprintf (buf, sizeof (buf), "-" HOST_WIDE_INT_PRINT_UNSIGNED,
-	      -TREE_INT_CST_LOW (key));
+	      - tree_to_shwi (key));
   else
     snprintf (buf, sizeof (buf), HOST_WIDE_INT_PRINT_DOUBLE_HEX,
 	      (unsigned HOST_WIDE_INT) TREE_INT_CST_HIGH (key),
@@ -6733,11 +6733,11 @@  get_priority (tree args, bool is_destructor)
 
   arg = TREE_VALUE (args);
   arg = default_conversion (arg);
-  if (!host_integerp (arg, /*pos=*/0)
+  if (!tree_fits_shwi_p (arg)
       || !INTEGRAL_TYPE_P (TREE_TYPE (arg)))
     goto invalid;
 
-  pri = tree_low_cst (arg, /*pos=*/0);
+  pri = tree_to_shwi (arg);
   if (pri < 0 || pri > MAX_INIT_PRIORITY)
     goto invalid;
 
@@ -7646,9 +7646,9 @@  handle_alloc_size_attribute (tree *node, tree ARG_UNUSED (name), tree args,
       tree position = TREE_VALUE (args);
 
       if (TREE_CODE (position) != INTEGER_CST
-	  || TREE_INT_CST_HIGH (position)
-	  || TREE_INT_CST_LOW (position) < 1
-	  || TREE_INT_CST_LOW (position) > arg_count )
+	  || !tree_fits_uhwi_p (position)
+	  || tree_to_hwi (position) < 1
+	  || tree_to_hwi (position) > arg_count )
 	{
 	  warning (OPT_Wattributes,
 	           "alloc_size parameter outside range");
@@ -8050,14 +8050,14 @@  handle_vector_size_attribute (tree *node, tree name, tree args,
 
   size = TREE_VALUE (args);
 
-  if (!host_integerp (size, 1))
+  if (!tree_fits_uhwi_p (size))
     {
       warning (OPT_Wattributes, "%qE attribute ignored", name);
       return NULL_TREE;
     }
 
   /* Get the vector size (in bytes).  */
-  vecsize = tree_low_cst (size, 1);
+  vecsize = tree_to_hwi (size);
 
   /* We need to provide for vector pointers, vector arrays, and
      functions returning vectors.  For example:
@@ -8083,14 +8083,14 @@  handle_vector_size_attribute (tree *node, tree name, tree args,
       || (!SCALAR_FLOAT_MODE_P (orig_mode)
 	  && GET_MODE_CLASS (orig_mode) != MODE_INT
 	  && !ALL_SCALAR_FIXED_POINT_MODE_P (orig_mode))
-      || !host_integerp (TYPE_SIZE_UNIT (type), 1)
+      || !tree_fits_uhwi_p (TYPE_SIZE_UNIT (type))
       || TREE_CODE (type) == BOOLEAN_TYPE)
     {
       error ("invalid vector type for attribute %qE", name);
       return NULL_TREE;
     }
 
-  if (vecsize % tree_low_cst (TYPE_SIZE_UNIT (type), 1))
+  if (vecsize % tree_to_hwi (TYPE_SIZE_UNIT (type)))
     {
       error ("vector size not an integral multiple of component size");
       return NULL;
@@ -8103,7 +8103,7 @@  handle_vector_size_attribute (tree *node, tree name, tree args,
     }
 
   /* Calculate how many units fit in the vector.  */
-  nunits = vecsize / tree_low_cst (TYPE_SIZE_UNIT (type), 1);
+  nunits = vecsize / tree_to_hwi (TYPE_SIZE_UNIT (type));
   if (nunits & (nunits - 1))
     {
       error ("number of components of the vector not a power of two");
@@ -8265,7 +8265,7 @@  check_function_sentinel (const_tree fntype, int nargs, tree *argarray)
       if (TREE_VALUE (attr))
 	{
 	  tree p = TREE_VALUE (TREE_VALUE (attr));
-	  pos = TREE_INT_CST_LOW (p);
+	  pos = tree_to_shwi (p);
 	}
 
       /* The sentinel must be one of the varargs, i.e.
@@ -8340,10 +8340,10 @@  get_nonnull_operand (tree arg_num_expr, unsigned HOST_WIDE_INT *valp)
 {
   /* Verify the arg number is a constant.  */
   if (TREE_CODE (arg_num_expr) != INTEGER_CST
-      || TREE_INT_CST_HIGH (arg_num_expr) != 0)
+      || !tree_fits_uhwi_p (arg_num_expr))
     return false;
 
-  *valp = TREE_INT_CST_LOW (arg_num_expr);
+  *valp = tree_to_hwi (arg_num_expr);
   return true;
 }
 
@@ -8544,7 +8544,7 @@  parse_optimize_options (tree args, bool attr_p)
       if (TREE_CODE (value) == INTEGER_CST)
 	{
 	  char buffer[20];
-	  sprintf (buffer, "-O%ld", (long) TREE_INT_CST_LOW (value));
+	  sprintf (buffer, "-O%ld", (long) tree_to_shwi (value));
 	  VEC_safe_push (const_char_p, gc, optimize_args, ggc_strdup (buffer));
 	}
 
@@ -8768,9 +8768,9 @@  check_function_arguments_recurse (void (*callback)
 	    format_num_expr = TREE_VALUE (TREE_VALUE (attrs));
 
 	    gcc_assert (TREE_CODE (format_num_expr) == INTEGER_CST
-			&& !TREE_INT_CST_HIGH (format_num_expr));
+			&& tree_fits_uhwi_p (format_num_expr));
 
-	    format_num = TREE_INT_CST_LOW (format_num_expr);
+	    format_num = tree_to_hwi (format_num_expr);
 
 	    for (inner_arg = first_call_expr_arg (param, &iter), i = 1;
 		 inner_arg != 0;
@@ -9024,7 +9024,7 @@  c_parse_error (const char *gmsgid, enum cpp_ttype token_type,
 	   || token_type == CPP_CHAR16
 	   || token_type == CPP_CHAR32)
     {
-      unsigned int val = TREE_INT_CST_LOW (value);
+      unsigned int val = tree_to_uhwi (value);
       const char *prefix;
 
       switch (token_type)
@@ -9266,8 +9266,7 @@  fold_offsetof_1 (tree expr)
 	  return error_mark_node;
 	}
       off = size_binop_loc (input_location, PLUS_EXPR, DECL_FIELD_OFFSET (t),
-			    size_int (tree_low_cst (DECL_FIELD_BIT_OFFSET (t),
-						    1)
+			    size_int (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (t))
 				      / BITS_PER_UNIT));
       break;
 
@@ -9622,7 +9621,7 @@  complete_array_type (tree *ptype, tree initial_value, bool do_default)
     {
       error ("size of array is too large");
       /* If we proceed with the array type as it is, we'll eventually
-	 crash in tree_low_cst().  */
+	 crash in tree_to_uhwi ().  */
       type = error_mark_node;
     }
 
@@ -9680,7 +9679,7 @@  sync_resolve_size (tree function, VEC(tree,gc) *params)
   if (!INTEGRAL_TYPE_P (type) && !POINTER_TYPE_P (type))
     goto incompatible;
 
-  size = tree_low_cst (TYPE_SIZE_UNIT (type), 1);
+  size = tree_to_uhwi (TYPE_SIZE_UNIT (type));
   if (size == 1 || size == 2 || size == 4 || size == 8 || size == 16)
     return size;
 
@@ -9838,7 +9837,7 @@  get_atomic_generic_size (location_t loc, tree function, VEC(tree,gc) *params)
       return 0;
     }
 
-  size_0 = tree_low_cst (TYPE_SIZE_UNIT (TREE_TYPE (type_0)), 1);
+  size_0 = tree_to_uhwi (TYPE_SIZE_UNIT (TREE_TYPE (type_0)));
 
   /* Zero size objects are not allowed.  */
   if (size_0 == 0)
@@ -9863,7 +9862,7 @@  get_atomic_generic_size (location_t loc, tree function, VEC(tree,gc) *params)
 		    function);
 	  return 0;
 	}
-      size = tree_low_cst (TYPE_SIZE_UNIT (TREE_TYPE (type)), 1);
+      size = tree_to_uhwi (TYPE_SIZE_UNIT (TREE_TYPE (type)));
       if (size != size_0)
 	{
 	  error_at (loc, "size mismatch in argument %d of %qE", x + 1,
@@ -9878,7 +9877,7 @@  get_atomic_generic_size (location_t loc, tree function, VEC(tree,gc) *params)
       tree p = VEC_index (tree, params, x);
       if (TREE_CODE (p) == INTEGER_CST)
         {
-	  int i = tree_low_cst (p, 1);
+	  int i = tree_to_uhwi (p);
 	  if (i < 0 || i >= MEMMODEL_LAST)
 	    {
 	      warning_at (loc, OPT_Winvalid_memory_model,
@@ -10682,24 +10681,24 @@  warn_for_sign_compare (location_t location,
       if (TREE_CODE (op1) == BIT_NOT_EXPR)
 	op1 = c_common_get_narrower (TREE_OPERAND (op1, 0), &unsignedp1);
 
-      if (host_integerp (op0, 0) || host_integerp (op1, 0))
+      if (tree_fits_shwi_p (op0) || tree_fits_shwi_p (op1))
         {
           tree primop;
           HOST_WIDE_INT constant, mask;
           int unsignedp;
           unsigned int bits;
 
-          if (host_integerp (op0, 0))
+          if (tree_fits_shwi_p (op0))
             {
               primop = op1;
               unsignedp = unsignedp1;
-              constant = tree_low_cst (op0, 0);
+              constant = tree_to_hwi (op0);
             }
           else
             {
               primop = op0;
               unsignedp = unsignedp0;
-              constant = tree_low_cst (op1, 0);
+              constant = tree_to_hwi (op1);
             }
 
           bits = TYPE_PRECISION (TREE_TYPE (primop));
@@ -11140,8 +11139,8 @@  convert_vector_to_pointer_for_subscript (location_t loc,
       tree type1;
 
       if (TREE_CODE (index) == INTEGER_CST)
-        if (!host_integerp (index, 1)
-            || ((unsigned HOST_WIDE_INT) tree_low_cst (index, 1)
+        if (!tree_fits_uhwi_p (index)
+            || ((unsigned HOST_WIDE_INT) tree_to_hwi (index)
                >= TYPE_VECTOR_SUBPARTS (type)))
           warning_at (loc, OPT_Warray_bounds, "index value is out of bound");
 
diff --git a/gcc/c-family/c-cppbuiltin.c b/gcc/c-family/c-cppbuiltin.c
index 89a22a3..e39e606 100644
--- a/gcc/c-family/c-cppbuiltin.c
+++ b/gcc/c-family/c-cppbuiltin.c
@@ -107,7 +107,7 @@  static void
 builtin_define_type_sizeof (const char *name, tree type)
 {
   builtin_define_with_int_value (name,
-				 tree_low_cst (TYPE_SIZE_UNIT (type), 1));
+				 tree_to_uhwi (TYPE_SIZE_UNIT (type)));
 }
 
 /* Define the float.h constants for TYPE using NAME_PREFIX, FP_SUFFIX,
@@ -649,7 +649,7 @@  cpp_atomic_builtins (cpp_reader *pfile)
   /* Tell the source code about various types.  These map to the C++11 and C11
      macros where 2 indicates lock-free always, and 1 indicates sometimes
      lock free.  */
-#define SIZEOF_NODE(T) (tree_low_cst (TYPE_SIZE_UNIT (T), 1))
+#define SIZEOF_NODE(T) (tree_to_uhwi (TYPE_SIZE_UNIT (T)))
 #define SWAP_INDEX(T) ((SIZEOF_NODE (T) < SWAP_LIMIT) ? SIZEOF_NODE (T) : 0)
   builtin_define_with_int_value ("__GCC_ATOMIC_BOOL_LOCK_FREE", 
 			(have_swap[SWAP_INDEX (boolean_type_node)]? 2 : 1));
diff --git a/gcc/c-family/c-format.c b/gcc/c-family/c-format.c
index 2d1ed81..73b5720 100644
--- a/gcc/c-family/c-format.c
+++ b/gcc/c-family/c-format.c
@@ -249,13 +249,13 @@  check_format_string (tree fntype, unsigned HOST_WIDE_INT format_num,
 static bool
 get_constant (tree expr, unsigned HOST_WIDE_INT *value, int validated_p)
 {
-  if (TREE_CODE (expr) != INTEGER_CST || TREE_INT_CST_HIGH (expr) != 0)
+  if (TREE_CODE (expr) != INTEGER_CST || !tree_fits_uhwi_p (expr))
     {
       gcc_assert (!validated_p);
       return false;
     }
 
-  *value = TREE_INT_CST_LOW (expr);
+  *value = tree_to_hwi (expr);
 
   return true;
 }
@@ -1478,8 +1478,8 @@  check_format_arg (void *ctx, tree format_tree,
 	  res->number_non_literal++;
 	  return;
 	}
-      if (!host_integerp (arg1, 0)
-	  || (offset = tree_low_cst (arg1, 0)) < 0)
+      if (!tree_fits_shwi_p (arg1)
+	  || (offset = tree_to_hwi (arg1)) < 0)
 	{
 	  res->number_non_literal++;
 	  return;
@@ -1525,8 +1525,8 @@  check_format_arg (void *ctx, tree format_tree,
       return;
     }
   if (TREE_CODE (format_tree) == ARRAY_REF
-      && host_integerp (TREE_OPERAND (format_tree, 1), 0)
-      && (offset += tree_low_cst (TREE_OPERAND (format_tree, 1), 0)) >= 0)
+      && tree_fits_shwi_p (TREE_OPERAND (format_tree, 1))
+      && (offset += tree_to_hwi (TREE_OPERAND (format_tree, 1))) >= 0)
     format_tree = TREE_OPERAND (format_tree, 0);
   if (TREE_CODE (format_tree) == VAR_DECL
       && TREE_CODE (TREE_TYPE (format_tree)) == ARRAY_TYPE
@@ -1556,9 +1556,9 @@  check_format_arg (void *ctx, tree format_tree,
       /* Variable length arrays can't be initialized.  */
       gcc_assert (TREE_CODE (array_size) == INTEGER_CST);
 
-      if (host_integerp (array_size, 0))
+      if (tree_fits_shwi_p (array_size))
 	{
-	  HOST_WIDE_INT array_size_value = TREE_INT_CST_LOW (array_size);
+	  HOST_WIDE_INT array_size_value = tree_to_hwi (array_size);
 	  if (array_size_value > 0
 	      && array_size_value == (int) array_size_value
 	      && format_length > array_size_value)
diff --git a/gcc/c-family/c-pragma.c b/gcc/c-family/c-pragma.c
index 70d8748e..3a47441 100644
--- a/gcc/c-family/c-pragma.c
+++ b/gcc/c-family/c-pragma.c
@@ -153,7 +153,7 @@  handle_pragma_pack (cpp_reader * ARG_UNUSED (dummy))
     {
       if (TREE_CODE (x) != INTEGER_CST)
 	GCC_BAD ("invalid constant in %<#pragma pack%> - ignored");
-      align = TREE_INT_CST_LOW (x);
+      align = tree_to_shwi (x);
       action = set;
       if (pragma_lex (&x) != CPP_CLOSE_PAREN)
 	GCC_BAD ("malformed %<#pragma pack%> - ignored");
@@ -185,7 +185,7 @@  handle_pragma_pack (cpp_reader * ARG_UNUSED (dummy))
 	    {
 	      if (TREE_CODE (x) != INTEGER_CST)
 		GCC_BAD ("invalid constant in %<#pragma pack%> - ignored");
-	      align = TREE_INT_CST_LOW (x);
+	      align = tree_to_shwi (x);
 	      if (align == -1)
 		action = set;
 	    }
diff --git a/gcc/c-family/c-pretty-print.c b/gcc/c-family/c-pretty-print.c
index edeccce..d448e5a 100644
--- a/gcc/c-family/c-pretty-print.c
+++ b/gcc/c-family/c-pretty-print.c
@@ -582,8 +582,8 @@  pp_c_direct_abstract_declarator (c_pretty_printer *pp, tree t)
 	  tree maxval = TYPE_MAX_VALUE (TYPE_DOMAIN (t));
 	  tree type = TREE_TYPE (maxval);
 
-	  if (host_integerp (maxval, 0))
-	    pp_wide_integer (pp, tree_low_cst (maxval, 0) + 1);
+	  if (tree_fits_shwi_p (maxval))
+	    pp_wide_integer (pp, tree_to_hwi (maxval) + 1);
 	  else
 	    pp_expression (pp, fold_build2 (PLUS_EXPR, type, maxval,
 					    build_int_cst (type, 1)));
@@ -911,10 +911,10 @@  pp_c_integer_constant (c_pretty_printer *pp, tree i)
     ? TYPE_CANONICAL (TREE_TYPE (i))
     : TREE_TYPE (i);
 
-  if (host_integerp (i, 0))
-    pp_wide_integer (pp, TREE_INT_CST_LOW (i));
-  else if (host_integerp (i, 1))
-    pp_unsigned_wide_integer (pp, TREE_INT_CST_LOW (i));
+  if (tree_fits_shwi_p (i))
+    pp_wide_integer (pp, tree_to_hwi (i));
+  else if (tree_fits_uhwi_p (i))
+    pp_unsigned_wide_integer (pp, tree_to_hwi (i));
   else
     {
       unsigned HOST_WIDE_INT low = TREE_INT_CST_LOW (i);
@@ -950,10 +950,10 @@  pp_c_character_constant (c_pretty_printer *pp, tree c)
   if (type == wchar_type_node)
     pp_character (pp, 'L');
   pp_quote (pp);
-  if (host_integerp (c, TYPE_UNSIGNED (type)))
-    pp_c_char (pp, tree_low_cst (c, TYPE_UNSIGNED (type)));
+  if (tree_fits_hwi_p (c, TYPE_UNSIGNED (type)))
+    pp_c_char (pp, tree_to_hwi (c));
   else
-    pp_scalar (pp, "\\x%x", (unsigned) TREE_INT_CST_LOW (c));
+    pp_scalar (pp, "\\x%x", (unsigned) tree_to_uhwi (c));
   pp_quote (pp);
 }
 
@@ -1581,8 +1581,8 @@  pp_c_postfix_expression (c_pretty_printer *pp, tree e)
 	if (type
 	    && tree_int_cst_equal (TYPE_SIZE (type), TREE_OPERAND (e, 1)))
 	  {
-	    HOST_WIDE_INT bitpos = tree_low_cst (TREE_OPERAND (e, 2), 0);
-	    HOST_WIDE_INT size = tree_low_cst (TYPE_SIZE (type), 0);
+	    HOST_WIDE_INT bitpos = tree_to_shwi (TREE_OPERAND (e, 2));
+	    HOST_WIDE_INT size = tree_to_shwi (TYPE_SIZE (type));
 	    if ((bitpos % size) == 0)
 	      {
 		pp_c_left_paren (pp);
diff --git a/gcc/c/c-decl.c b/gcc/c/c-decl.c
index a4a8108..1679ac7 100644
--- a/gcc/c/c-decl.c
+++ b/gcc/c/c-decl.c
@@ -4788,7 +4788,7 @@  check_bitfield_type_and_width (tree *type, tree *width, tree orig_name)
       *width = build_int_cst (integer_type_node, w);
     }
   else
-    w = tree_low_cst (*width, 1);
+    w = tree_to_uhwi (*width);
 
   if (TREE_CODE (*type) == ENUMERAL_TYPE)
     {
@@ -5825,7 +5825,7 @@  grokdeclarator (const struct c_declarator *declarator,
       else
 	error_at (loc, "size of unnamed array is too large");
       /* If we proceed with the array type as it is, we'll eventually
-	 crash in tree_low_cst().  */
+	 crash in tree_to_shwi ().  */
       type = error_mark_node;
     }
 
@@ -7150,7 +7150,7 @@  finish_struct (location_t loc, tree t, tree fieldlist, tree attributes,
 
       if (DECL_INITIAL (x))
 	{
-	  unsigned HOST_WIDE_INT width = tree_low_cst (DECL_INITIAL (x), 1);
+	  unsigned HOST_WIDE_INT width = tree_to_uhwi (DECL_INITIAL (x));
 	  DECL_SIZE (x) = bitsize_int (width);
 	  DECL_BIT_FIELD (x) = 1;
 	  SET_DECL_C_BIT_FIELD (x);
@@ -7215,7 +7215,7 @@  finish_struct (location_t loc, tree t, tree fieldlist, tree attributes,
 	  && TREE_TYPE (*fieldlistp) != error_mark_node)
 	{
 	  unsigned HOST_WIDE_INT width
-	    = tree_low_cst (DECL_INITIAL (*fieldlistp), 1);
+	    = tree_to_uhwi (DECL_INITIAL (*fieldlistp));
 	  tree type = TREE_TYPE (*fieldlistp);
 	  if (width != TYPE_PRECISION (type))
 	    {
diff --git a/gcc/c/c-parser.c b/gcc/c/c-parser.c
index bea9791..73d6cb0 100644
--- a/gcc/c/c-parser.c
+++ b/gcc/c/c-parser.c
@@ -380,7 +380,7 @@  c_lex_one_token (c_parser *parser, c_token *token)
       break;
     case CPP_PRAGMA:
       /* We smuggled the cpp_token->u.pragma value in an INTEGER_CST.  */
-      token->pragma_kind = (enum pragma_kind) TREE_INT_CST_LOW (token->value);
+      token->pragma_kind = (enum pragma_kind) tree_to_uhwi (token->value);
       token->value = NULL;
       break;
     default:
@@ -8875,8 +8875,8 @@  c_parser_omp_clause_collapse (c_parser *parser, tree list)
   if (num == error_mark_node)
     return list;
   if (!INTEGRAL_TYPE_P (TREE_TYPE (num))
-      || !host_integerp (num, 0)
-      || (n = tree_low_cst (num, 0)) <= 0
+      || !tree_fits_shwi_p (num)
+      || (n = tree_to_hwi (num)) <= 0
       || (int) n != n)
     {
       error_at (loc,
@@ -9926,7 +9926,7 @@  c_parser_omp_for_loop (location_t loc,
 
   for (cl = clauses; cl; cl = OMP_CLAUSE_CHAIN (cl))
     if (OMP_CLAUSE_CODE (cl) == OMP_CLAUSE_COLLAPSE)
-      collapse = tree_low_cst (OMP_CLAUSE_COLLAPSE_EXPR (cl), 0);
+      collapse = tree_to_shwi (OMP_CLAUSE_COLLAPSE_EXPR (cl));
 
   gcc_assert (collapse >= 1);
 
diff --git a/gcc/c/c-typeck.c b/gcc/c/c-typeck.c
index 5b4ad28..fd9e4f6 100644
--- a/gcc/c/c-typeck.c
+++ b/gcc/c/c-typeck.c
@@ -6795,7 +6795,7 @@  push_init_level (int implicit, struct obstack * braced_init_obstack)
   else if (TREE_CODE (constructor_type) == ARRAY_TYPE)
     {
       constructor_type = TREE_TYPE (constructor_type);
-      push_array_bounds (tree_low_cst (constructor_index, 1));
+      push_array_bounds (tree_to_uhwi (constructor_index));
       constructor_depth++;
     }
 
@@ -8341,7 +8341,7 @@  process_init_element (struct c_expr value, bool implicit,
 	  /* Now output the actual element.  */
 	  if (value.value)
 	    {
-	      push_array_bounds (tree_low_cst (constructor_index, 1));
+	      push_array_bounds (tree_to_uhwi (constructor_index));
 	      output_init_element (value.value, value.original_type,
 				   strict_string, elttype,
 				   constructor_index, 1, implicit,
@@ -9074,7 +9074,7 @@  c_finish_bc_stmt (location_t loc, tree *label_p, bool is_break)
     }
   else if (TREE_CODE (label) == LABEL_DECL)
     ;
-  else switch (TREE_INT_CST_LOW (label))
+  else switch (tree_to_uhwi (label))
     {
     case 0:
       if (is_break)
diff --git a/gcc/cfgexpand.c b/gcc/cfgexpand.c
index 0404605..77749c1 100644
--- a/gcc/cfgexpand.c
+++ b/gcc/cfgexpand.c
@@ -270,7 +270,7 @@  add_stack_var (tree decl)
   * (size_t *)pointer_map_insert (decl_to_stack_part, decl) = stack_vars_num;
 
   v->decl = decl;
-  v->size = tree_low_cst (DECL_SIZE_UNIT (SSAVAR (decl)), 1);
+  v->size = tree_to_uhwi (DECL_SIZE_UNIT (SSAVAR (decl)));
   /* Ensure that all variables have size, so that &a != &b for any two
      variables that are simultaneously live.  */
   if (v->size == 0)
@@ -981,7 +981,7 @@  expand_one_stack_var (tree var)
   HOST_WIDE_INT size, offset;
   unsigned byte_align;
 
-  size = tree_low_cst (DECL_SIZE_UNIT (SSAVAR (var)), 1);
+  size = tree_to_uhwi (DECL_SIZE_UNIT (SSAVAR (var)));
   byte_align = align_local_variable (SSAVAR (var));
 
   /* We handle highly aligned variables in expand_stack_vars.  */
@@ -1077,7 +1077,7 @@  defer_stack_allocation (tree var, bool toplevel)
      other hand, we don't want the function's stack frame size to
      get completely out of hand.  So we avoid adding scalars and
      "small" aggregates to the list at all.  */
-  if (optimize == 0 && tree_low_cst (DECL_SIZE_UNIT (var), 1) < 32)
+  if (optimize == 0 && tree_to_uhwi (DECL_SIZE_UNIT (var)) < 32)
     return false;
 
   return true;
@@ -1191,7 +1191,7 @@  expand_one_var (tree var, bool toplevel, bool really_expand)
     {
       if (really_expand)
         expand_one_stack_var (origvar);
-      return tree_low_cst (DECL_SIZE_UNIT (var), 1);
+      return tree_to_uhwi (DECL_SIZE_UNIT (var));
     }
   return 0;
 }
@@ -1262,10 +1262,10 @@  stack_protect_classify_type (tree type)
 	  unsigned HOST_WIDE_INT len;
 
 	  if (!TYPE_SIZE_UNIT (type)
-	      || !host_integerp (TYPE_SIZE_UNIT (type), 1))
+	      || !tree_fits_uhwi_p (TYPE_SIZE_UNIT (type)))
 	    len = max;
 	  else
-	    len = tree_low_cst (TYPE_SIZE_UNIT (type), 1);
+	    len = tree_to_hwi (TYPE_SIZE_UNIT (type));
 
 	  if (len < max)
 	    ret = SPCT_HAS_SMALL_CHAR_ARRAY | SPCT_HAS_ARRAY;
diff --git a/gcc/cgraphunit.c b/gcc/cgraphunit.c
index 64460ac..b769448 100644
--- a/gcc/cgraphunit.c
+++ b/gcc/cgraphunit.c
@@ -1621,7 +1621,7 @@  expand_function (struct cgraph_node *node)
 				   larger_than_size))
 	{
 	  unsigned int size_as_int
-	    = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (ret_type));
+	    = tree_to_uhwi (TYPE_SIZE_UNIT (ret_type));
 
 	  if (compare_tree_int (TYPE_SIZE_UNIT (ret_type), size_as_int) == 0)
 	    warning (OPT_Wlarger_than_, "size of return value of %q+D is %u bytes",
diff --git a/gcc/config/alpha/alpha.c b/gcc/config/alpha/alpha.c
index d726b5a..367b007 100644
--- a/gcc/config/alpha/alpha.c
+++ b/gcc/config/alpha/alpha.c
@@ -5862,7 +5862,7 @@  va_list_skip_additions (tree lhs)
       if (!CONVERT_EXPR_CODE_P (code)
 	  && ((code != PLUS_EXPR && code != POINTER_PLUS_EXPR)
 	      || TREE_CODE (gimple_assign_rhs2 (stmt)) != INTEGER_CST
-	      || !host_integerp (gimple_assign_rhs2 (stmt), 1)))
+	      || !tree_fits_uhwi_p (gimple_assign_rhs2 (stmt))))
 	return stmt;
 
       lhs = gimple_assign_rhs1 (stmt);
@@ -5988,10 +5988,10 @@  alpha_stdarg_optimize_hook (struct stdarg_info *si, const_gimple stmt)
 	  else
 	    goto escapes;
 
-	  if (!host_integerp (gimple_assign_rhs2 (arg2_stmt), 0))
+	  if (!tree_fits_shwi_p (gimple_assign_rhs2 (arg2_stmt)))
 	    goto escapes;
 
-	  sub = tree_low_cst (gimple_assign_rhs2 (arg2_stmt), 0);
+	  sub = tree_to_hwi (gimple_assign_rhs2 (arg2_stmt));
 	  if (code2 == MINUS_EXPR)
 	    sub = -sub;
 	  if (sub < -48 || sub > -32)
diff --git a/gcc/config/alpha/predicates.md b/gcc/config/alpha/predicates.md
index 0a1885b..4f686aa 100644
--- a/gcc/config/alpha/predicates.md
+++ b/gcc/config/alpha/predicates.md
@@ -358,7 +358,7 @@ 
 	    && !SYMBOL_REF_TLS_MODEL (op))
     {
       if (SYMBOL_REF_DECL (op))
-        max_ofs = tree_low_cst (DECL_SIZE_UNIT (SYMBOL_REF_DECL (op)), 1);
+        max_ofs = tree_to_uhwi (DECL_SIZE_UNIT (SYMBOL_REF_DECL (op)));
     }
   else
     return false;
diff --git a/gcc/config/arm/arm.c b/gcc/config/arm/arm.c
index 1470602..8662dc8 100644
--- a/gcc/config/arm/arm.c
+++ b/gcc/config/arm/arm.c
@@ -4167,18 +4167,18 @@  aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
 	if (count == -1
 	    || !index
 	    || !TYPE_MAX_VALUE (index)
-	    || !host_integerp (TYPE_MAX_VALUE (index), 1)
+	    || !tree_fits_uhwi_p (TYPE_MAX_VALUE (index))
 	    || !TYPE_MIN_VALUE (index)
-	    || !host_integerp (TYPE_MIN_VALUE (index), 1)
+	    || !tree_fits_uhwi_p (TYPE_MIN_VALUE (index))
 	    || count < 0)
 	  return -1;
 
-	count *= (1 + tree_low_cst (TYPE_MAX_VALUE (index), 1)
-		      - tree_low_cst (TYPE_MIN_VALUE (index), 1));
+	count *= (1 + tree_to_hwi (TYPE_MAX_VALUE (index))
+		      - tree_to_hwi (TYPE_MIN_VALUE (index)));
 
 	/* There must be no padding.  */
-	if (!host_integerp (TYPE_SIZE (type), 1)
-	    || (tree_low_cst (TYPE_SIZE (type), 1)
+	if (!tree_fits_uhwi_p (TYPE_SIZE (type))
+	    || (tree_to_hwi (TYPE_SIZE (type))
 		!= count * GET_MODE_BITSIZE (*modep)))
 	  return -1;
 
@@ -4207,8 +4207,8 @@  aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
 	  }
 
 	/* There must be no padding.  */
-	if (!host_integerp (TYPE_SIZE (type), 1)
-	    || (tree_low_cst (TYPE_SIZE (type), 1)
+	if (!tree_fits_uhwi_p (TYPE_SIZE (type))
+	    || (tree_to_hwi (TYPE_SIZE (type))
 		!= count * GET_MODE_BITSIZE (*modep)))
 	  return -1;
 
@@ -4239,8 +4239,8 @@  aapcs_vfp_sub_candidate (const_tree type, enum machine_mode *modep)
 	  }
 
 	/* There must be no padding.  */
-	if (!host_integerp (TYPE_SIZE (type), 1)
-	    || (tree_low_cst (TYPE_SIZE (type), 1)
+	if (!tree_fits_uhwi_p (TYPE_SIZE (type))
+	    || (tree_to_hwi (TYPE_SIZE (type))
 		!= count * GET_MODE_BITSIZE (*modep)))
 	  return -1;
 
@@ -25147,7 +25147,7 @@  arm_have_conditional_execution (void)
 static HOST_WIDE_INT
 arm_vector_alignment (const_tree type)
 {
-  HOST_WIDE_INT align = tree_low_cst (TYPE_SIZE (type), 0);
+  HOST_WIDE_INT align = tree_to_shwi (TYPE_SIZE (type));
 
   if (TARGET_AAPCS_BASED)
     align = MIN (align, 64);
diff --git a/gcc/config/avr/avr.c b/gcc/config/avr/avr.c
index 1c5bab2..dea838d 100644
--- a/gcc/config/avr/avr.c
+++ b/gcc/config/avr/avr.c
@@ -11684,7 +11684,7 @@  avr_fold_builtin (tree fndecl, int n_args ATTRIBUTE_UNUSED, tree *arg,
             /* Inserting bits known at compile time is easy and can be
                performed by AND and OR with appropriate masks.  */
 
-            int bits = TREE_INT_CST_LOW (tbits);
+            int bits = tree_to_shwi (tbits);
             int mask_ior = 0, mask_and = 0xff;
 
             for (i = 0; i < 8; i++)
diff --git a/gcc/config/bfin/bfin.c b/gcc/config/bfin/bfin.c
index 2c01cf7..2cfe281 100644
--- a/gcc/config/bfin/bfin.c
+++ b/gcc/config/bfin/bfin.c
@@ -3286,8 +3286,8 @@  bfin_local_alignment (tree type, unsigned align)
      memcpy can use 32 bit loads/stores.  */
   if (TYPE_SIZE (type)
       && TREE_CODE (TYPE_SIZE (type)) == INTEGER_CST
-      && (TREE_INT_CST_LOW (TYPE_SIZE (type)) > 8
-	  || TREE_INT_CST_HIGH (TYPE_SIZE (type))) && align < 32)
+      && (tree_to_hwi (TYPE_SIZE (type)) > 8
+	  || !tree_fits_uhwi_p (TYPE_SIZE (type))) && align < 32)
     return 32;
   return align;
 }
diff --git a/gcc/config/c6x/predicates.md b/gcc/config/c6x/predicates.md
index c4ccf54..0920c96 100644
--- a/gcc/config/c6x/predicates.md
+++ b/gcc/config/c6x/predicates.md
@@ -210,9 +210,9 @@ 
         t = DECL_SIZE_UNIT (t);
       else
 	t = TYPE_SIZE_UNIT (TREE_TYPE (t));
-      if (t && host_integerp (t, 0))
+      if (t && tree_fits_shwi_p (t))
 	{
-	  size = tree_low_cst (t, 0);
+	  size = tree_to_hwi (t);
 	  if (size < 0)
 	    size = 0;
 	}
diff --git a/gcc/config/darwin.c b/gcc/config/darwin.c
index 5a9f50a..4921054 100644
--- a/gcc/config/darwin.c
+++ b/gcc/config/darwin.c
@@ -1243,20 +1243,21 @@  darwin_mergeable_constant_section (tree exp,
       && (align & (align -1)) == 0)
     {
       tree size = TYPE_SIZE_UNIT (TREE_TYPE (exp));
+      unsigned HOST_WIDE_INT usize;
 
-      if (TREE_CODE (size) == INTEGER_CST
-	  && TREE_INT_CST_LOW (size) == 4
-	  && TREE_INT_CST_HIGH (size) == 0)
+      if (TREE_CODE (size) != INTEGER_CST
+	  || !tree_fits_uhwi_p (size))
+	return readonly_data_section;
+
+      usize = tree_to_hwi (size);
+
+      if (usize == 4)
         return darwin_sections[literal4_section];
-      else if (TREE_CODE (size) == INTEGER_CST
-	       && TREE_INT_CST_LOW (size) == 8
-	       && TREE_INT_CST_HIGH (size) == 0)
+      else if (usize == 8)
         return darwin_sections[literal8_section];
       else if (HAVE_GAS_LITERAL16
 	       && TARGET_64BIT
-               && TREE_CODE (size) == INTEGER_CST
-               && TREE_INT_CST_LOW (size) == 16
-               && TREE_INT_CST_HIGH (size) == 0)
+               && usize == 16)
         return darwin_sections[literal16_section];
       else
         return readonly_data_section;
@@ -1464,7 +1465,7 @@  machopic_select_section (tree decl,
 
   zsize = (DECL_P (decl) 
 	   && (TREE_CODE (decl) == VAR_DECL || TREE_CODE (decl) == CONST_DECL) 
-	   && tree_low_cst (DECL_SIZE_UNIT (decl), 1) == 0);
+	   && tree_to_uhwi (DECL_SIZE_UNIT (decl)) == 0);
 
   one = DECL_P (decl) 
 	&& TREE_CODE (decl) == VAR_DECL 
@@ -1607,7 +1608,7 @@  machopic_select_section (tree decl,
       static bool warned_objc_46 = false;
       /* We shall assert that zero-sized objects are an error in ObjC 
          meta-data.  */
-      gcc_assert (tree_low_cst (DECL_SIZE_UNIT (decl), 1) != 0);
+      gcc_assert (tree_to_uhwi (DECL_SIZE_UNIT (decl)) != 0);
       
       /* ??? This mechanism for determining the metadata section is
 	 broken when LTO is in use, since the frontend that generated
@@ -2096,7 +2097,7 @@  darwin_asm_declare_object_name (FILE *file,
 	machopic_define_symbol (DECL_RTL (decl));
     }
 
-  size = tree_low_cst (DECL_SIZE_UNIT (decl), 1);
+  size = tree_to_uhwi (DECL_SIZE_UNIT (decl));
 
 #ifdef DEBUG_DARWIN_MEM_ALLOCATORS
 fprintf (file, "# dadon: %s %s (%llu, %u) local %d weak %d"
diff --git a/gcc/config/epiphany/epiphany.c b/gcc/config/epiphany/epiphany.c
index 65bb1c8..585ffef 100644
--- a/gcc/config/epiphany/epiphany.c
+++ b/gcc/config/epiphany/epiphany.c
@@ -2663,11 +2663,11 @@  epiphany_special_round_type_align (tree type, unsigned computed,
 	continue;
       offset = bit_position (field);
       size = DECL_SIZE (field);
-      if (!host_integerp (offset, 1) || !host_integerp (size, 1)
-	  || TREE_INT_CST_LOW (offset) >= try_align
-	  || TREE_INT_CST_LOW (size) >= try_align)
+      if (!tree_fits_uhwi_p (offset) || !tree_fits_uhwi_p (size)
+	  || tree_to_hwi (offset) >= try_align
+	  || tree_to_hwi (size) >= try_align)
 	return try_align;
-      total = TREE_INT_CST_LOW (offset) + TREE_INT_CST_LOW (size);
+      total = tree_to_hwi (offset) + tree_to_hwi (size);
       if (total > max)
 	max = total;
     }
@@ -2690,7 +2690,7 @@  epiphany_adjust_field_align (tree field, unsigned computed)
     {
       tree elmsz = TYPE_SIZE (TREE_TYPE (TREE_TYPE (field)));
 
-      if (!host_integerp (elmsz, 1) || tree_low_cst (elmsz, 1) >= 32)
+      if (!tree_fits_uhwi_p (elmsz) || tree_to_uhwi (elmsz) >= 32)
 	return 64;
     }
   return computed;
diff --git a/gcc/config/i386/i386.c b/gcc/config/i386/i386.c
index c10e49458..4fd1dd5 100644
--- a/gcc/config/i386/i386.c
+++ b/gcc/config/i386/i386.c
@@ -5365,7 +5365,7 @@  ix86_function_regparm (const_tree type, const_tree decl)
       attr = lookup_attribute ("regparm", TYPE_ATTRIBUTES (type));
       if (attr)
 	{
-	  regparm = TREE_INT_CST_LOW (TREE_VALUE (TREE_VALUE (attr)));
+	  regparm = tree_to_shwi (TREE_VALUE (TREE_VALUE (attr)));
 	  return regparm;
 	}
     }
@@ -5492,7 +5492,7 @@  ix86_keep_aggregate_return_pointer (tree fntype)
       attr = lookup_attribute ("callee_pop_aggregate_return",
 			       TYPE_ATTRIBUTES (fntype));
       if (attr)
-	return (TREE_INT_CST_LOW (TREE_VALUE (TREE_VALUE (attr))) == 0);
+	return (tree_to_hwi (TREE_VALUE (TREE_VALUE (attr))) == 0);
 
       /* For 32-bit MS-ABI the default is to keep aggregate
          return pointer.  */
@@ -6178,7 +6178,7 @@  classify_argument (enum machine_mode mode, const_tree type,
 		      for (i = (int_bit_position (field)
 				+ (bit_offset % 64)) / 8 / 8;
 			   i < ((int_bit_position (field) + (bit_offset % 64))
-			        + tree_low_cst (DECL_SIZE (field), 0)
+			        + tree_to_shwi (DECL_SIZE (field))
 				+ 63) / 8 / 8; i++)
 			classes[i] =
 			  merge_classes (X86_64_INTEGER_CLASS,
@@ -24886,8 +24886,8 @@  ix86_data_alignment (tree type, int align)
   if (AGGREGATE_TYPE_P (type)
       && TYPE_SIZE (type)
       && TREE_CODE (TYPE_SIZE (type)) == INTEGER_CST
-      && (TREE_INT_CST_LOW (TYPE_SIZE (type)) >= (unsigned) max_align
-	  || TREE_INT_CST_HIGH (TYPE_SIZE (type)))
+      && (tree_to_hwi (TYPE_SIZE (type)) >= (unsigned) max_align
+	  || !tree_fits_uhwi_p (TYPE_SIZE (type)))
       && align < max_align)
     align = max_align;
 
@@ -24898,8 +24898,8 @@  ix86_data_alignment (tree type, int align)
       if (AGGREGATE_TYPE_P (type)
 	   && TYPE_SIZE (type)
 	   && TREE_CODE (TYPE_SIZE (type)) == INTEGER_CST
-	   && (TREE_INT_CST_LOW (TYPE_SIZE (type)) >= 128
-	       || TREE_INT_CST_HIGH (TYPE_SIZE (type))) && align < 128)
+	   && (tree_to_hwi (TYPE_SIZE (type)) >= 128
+	       || !tree_fits_uhwi_p (TYPE_SIZE (type))) && align < 128)
 	return 128;
     }
 
@@ -25008,8 +25008,8 @@  ix86_local_alignment (tree exp, enum machine_mode mode,
 		   != TYPE_MAIN_VARIANT (va_list_type_node)))
 	   && TYPE_SIZE (type)
 	   && TREE_CODE (TYPE_SIZE (type)) == INTEGER_CST
-	   && (TREE_INT_CST_LOW (TYPE_SIZE (type)) >= 16
-	       || TREE_INT_CST_HIGH (TYPE_SIZE (type))) && align < 128)
+	   && (tree_to_hwi (TYPE_SIZE (type)) >= 16
+	       || !tree_fits_uhwi_p (TYPE_SIZE (type))) && align < 128)
 	return 128;
     }
   if (TREE_CODE (type) == ARRAY_TYPE)
@@ -28000,7 +28000,7 @@  ix86_builtin_tm_load (tree type)
 {
   if (TREE_CODE (type) == VECTOR_TYPE)
     {
-      switch (tree_low_cst (TYPE_SIZE (type), 1))
+      switch (tree_to_uhwi (TYPE_SIZE (type)))
 	{
 	case 64:
 	  return builtin_decl_explicit (BUILT_IN_TM_LOAD_M64);
@@ -28020,7 +28020,7 @@  ix86_builtin_tm_store (tree type)
 {
   if (TREE_CODE (type) == VECTOR_TYPE)
     {
-      switch (tree_low_cst (TYPE_SIZE (type), 1))
+      switch (tree_to_uhwi (TYPE_SIZE (type)))
 	{
 	case 64:
 	  return builtin_decl_explicit (BUILT_IN_TM_STORE_M64);
@@ -30308,8 +30308,8 @@  get_element_number (tree vec_type, tree arg)
 {
   unsigned HOST_WIDE_INT elt, max = TYPE_VECTOR_SUBPARTS (vec_type) - 1;
 
-  if (!host_integerp (arg, 1)
-      || (elt = tree_low_cst (arg, 1), elt > max))
+  if (!tree_fits_uhwi_p (arg)
+      || (elt = tree_to_hwi (arg), elt > max))
     {
       error ("selector must be an integer constant in the range 0..%wi", max);
       return 0;
diff --git a/gcc/config/ia64/predicates.md b/gcc/config/ia64/predicates.md
index 6622b20..89f95f1 100644
--- a/gcc/config/ia64/predicates.md
+++ b/gcc/config/ia64/predicates.md
@@ -72,9 +72,9 @@ 
 	    t = DECL_SIZE_UNIT (t);
 	  else
 	    t = TYPE_SIZE_UNIT (TREE_TYPE (t));
-	  if (t && host_integerp (t, 0))
+	  if (t && tree_fits_shwi_p (t))
 	    {
-	      size = tree_low_cst (t, 0);
+	      size = tree_to_hwi (t);
 	      if (size < 0)
 		size = 0;
 	    }
diff --git a/gcc/config/iq2000/iq2000.c b/gcc/config/iq2000/iq2000.c
index da0b43d..d80cf00 100644
--- a/gcc/config/iq2000/iq2000.c
+++ b/gcc/config/iq2000/iq2000.c
@@ -1281,7 +1281,7 @@  iq2000_function_arg (cumulative_args_t cum_v, enum machine_mode mode,
 
       if (! type || TREE_CODE (type) != RECORD_TYPE
 	  || ! named  || ! TYPE_SIZE_UNIT (type)
-	  || ! host_integerp (TYPE_SIZE_UNIT (type), 1))
+	  || ! tree_fits_uhwi_p (TYPE_SIZE_UNIT (type)))
 	ret = gen_rtx_REG (mode, regbase + *arg_words + bias);
       else
 	{
@@ -1291,7 +1291,7 @@  iq2000_function_arg (cumulative_args_t cum_v, enum machine_mode mode,
 	    if (TREE_CODE (field) == FIELD_DECL
 		&& TREE_CODE (TREE_TYPE (field)) == REAL_TYPE
 		&& TYPE_PRECISION (TREE_TYPE (field)) == BITS_PER_WORD
-		&& host_integerp (bit_position (field), 0)
+		&& tree_fits_shwi_p (bit_position (field))
 		&& int_bit_position (field) % BITS_PER_WORD == 0)
 	      break;
 
@@ -1309,7 +1309,7 @@  iq2000_function_arg (cumulative_args_t cum_v, enum machine_mode mode,
 	      /* ??? If this is a packed structure, then the last hunk won't
 		 be 64 bits.  */
 	      chunks
-		= tree_low_cst (TYPE_SIZE_UNIT (type), 1) / UNITS_PER_WORD;
+		= tree_to_hwi (TYPE_SIZE_UNIT (type)) / UNITS_PER_WORD;
 	      if (chunks + *arg_words + bias > (unsigned) MAX_ARGS_IN_REGISTERS)
 		chunks = MAX_ARGS_IN_REGISTERS - *arg_words - bias;
 
diff --git a/gcc/config/m32c/m32c-pragma.c b/gcc/config/m32c/m32c-pragma.c
index 517c601..f15f2c7 100644
--- a/gcc/config/m32c/m32c-pragma.c
+++ b/gcc/config/m32c/m32c-pragma.c
@@ -46,9 +46,9 @@  m32c_pragma_memregs (cpp_reader * reader ATTRIBUTE_UNUSED)
   type = pragma_lex (&val);
   if (type == CPP_NUMBER)
     {
-      if (host_integerp (val, 1))
+      if (tree_fits_uhwi_p (val))
 	{
-	  i = tree_low_cst (val, 1);
+	  i = tree_to_hwi (val);
 
 	  type = pragma_lex (&val);
 	  if (type != CPP_EOF)
@@ -95,7 +95,7 @@  m32c_pragma_address (cpp_reader * reader ATTRIBUTE_UNUSED)
 	{
 	  if (var != error_mark_node)
 	    {
-	      unsigned uaddr = tree_low_cst (addr, 1);
+	      unsigned uaddr = tree_to_uhwi (addr);
 	      m32c_note_pragma_address (IDENTIFIER_POINTER (var), uaddr);
 	    }
 
diff --git a/gcc/config/m32c/m32c.c b/gcc/config/m32c/m32c.c
index 100c16f..0788611 100644
--- a/gcc/config/m32c/m32c.c
+++ b/gcc/config/m32c/m32c.c
@@ -2936,8 +2936,8 @@  function_vector_handler (tree * node ATTRIBUTE_UNUSED,
                 name);
       *no_add_attrs = true;
     }
-  else if (TREE_INT_CST_LOW (TREE_VALUE (args)) < 18
-           || TREE_INT_CST_LOW (TREE_VALUE (args)) > 255)
+  else if (tree_to_shwi (TREE_VALUE (args)) < 18
+           || tree_to_hwi (TREE_VALUE (args)) > 255)
     {
       /* The argument value must be between 18 to 255.  */
       warning (OPT_Wattributes,
@@ -2969,7 +2969,7 @@  current_function_special_page_vector (rtx x)
         {
           if (is_attribute_p ("function_vector", TREE_PURPOSE (list)))
             {
-              num = TREE_INT_CST_LOW (TREE_VALUE (TREE_VALUE (list)));
+              num = tree_to_shwi (TREE_VALUE (TREE_VALUE (list)));
               return num;
             }
 
diff --git a/gcc/config/mep/mep-pragma.c b/gcc/config/mep/mep-pragma.c
index 9f51620..790542b 100644
--- a/gcc/config/mep/mep-pragma.c
+++ b/gcc/config/mep/mep-pragma.c
@@ -232,9 +232,9 @@  mep_pragma_coprocessor_width (void)
   switch (type)
     {
     case CPP_NUMBER:
-      if (! host_integerp (val, 1))
+      if (! tree_fits_uhwi_p (val))
 	break;
-      i = tree_low_cst (val, 1);
+      i = tree_to_uhwi (val);
       /* This pragma no longer has any effect.  */
 #if 0
       if (i == 32)
@@ -273,7 +273,7 @@  mep_pragma_coprocessor_subclass (void)
   type = mep_pragma_lex (&val);
   if (type != CPP_CHAR)
     goto syntax_error;
-  class_letter = tree_low_cst (val, 1);
+  class_letter = tree_to_uhwi (val);
   if (class_letter >= 'A' && class_letter <= 'D')
     switch (class_letter)
       {
diff --git a/gcc/config/mep/mep.c b/gcc/config/mep/mep.c
index 79611a8..3614ba2 100644
--- a/gcc/config/mep/mep.c
+++ b/gcc/config/mep/mep.c
@@ -4211,7 +4211,7 @@  mep_attrlist_to_encoding (tree list, tree decl)
 	      && TREE_VALUE (TREE_VALUE (list))
 	      && TREE_CODE (TREE_VALUE (TREE_VALUE (list))) == INTEGER_CST)
 	    {
-	      int location = TREE_INT_CST_LOW (TREE_VALUE (TREE_VALUE(list)));
+	      int location = tree_to_shwi (TREE_VALUE (TREE_VALUE(list)));
 	      if (location >= 0
 		  && location <= 0x1000000)
 		return 'i';
@@ -4300,7 +4300,7 @@  mep_insert_attributes (tree decl, tree *attributes)
 	      && TREE_VALUE (attr)
 	      && TREE_VALUE (TREE_VALUE(attr)))
 	    {
-	      int location = TREE_INT_CST_LOW (TREE_VALUE (TREE_VALUE(attr)));
+	      int location = tree_to_shwi (TREE_VALUE (TREE_VALUE(attr)));
 	      static tree previous_value = 0;
 	      static int previous_location = 0;
 	      static tree previous_name = 0;
@@ -4716,7 +4716,7 @@  mep_output_aligned_common (FILE *stream, tree decl, const char *name,
       if (attr
 	  && TREE_VALUE (attr)
 	  && TREE_VALUE (TREE_VALUE(attr)))
-	location = TREE_INT_CST_LOW (TREE_VALUE (TREE_VALUE(attr)));
+	location = tree_to_shwi (TREE_VALUE (TREE_VALUE(attr)));
       if (location == -1)
 	return;
       if (global)
diff --git a/gcc/config/mips/mips.c b/gcc/config/mips/mips.c
index d37a2f4..fb22838 100644
--- a/gcc/config/mips/mips.c
+++ b/gcc/config/mips/mips.c
@@ -4972,7 +4972,7 @@  mips_function_arg (cumulative_args_t cum_v, enum machine_mode mode,
       && type != 0
       && TREE_CODE (type) == RECORD_TYPE
       && TYPE_SIZE_UNIT (type)
-      && host_integerp (TYPE_SIZE_UNIT (type), 1))
+      && tree_fits_uhwi_p (TYPE_SIZE_UNIT (type)))
     {
       tree field;
 
@@ -4981,7 +4981,7 @@  mips_function_arg (cumulative_args_t cum_v, enum machine_mode mode,
 	if (TREE_CODE (field) == FIELD_DECL
 	    && SCALAR_FLOAT_TYPE_P (TREE_TYPE (field))
 	    && TYPE_PRECISION (TREE_TYPE (field)) == BITS_PER_WORD
-	    && host_integerp (bit_position (field), 0)
+	    && tree_fits_shwi_p (bit_position (field))
 	    && int_bit_position (field) % BITS_PER_WORD == 0)
 	  break;
 
@@ -14459,7 +14459,7 @@  r10k_safe_mem_expr_p (tree expr, HOST_WIDE_INT offset)
     return false;
 
   offset += bitoffset / BITS_PER_UNIT;
-  return offset >= 0 && offset < tree_low_cst (DECL_SIZE_UNIT (inner), 1);
+  return offset >= 0 && offset < tree_to_uhwi (DECL_SIZE_UNIT (inner));
 }
 
 /* A for_each_rtx callback for which DATA points to the instruction
diff --git a/gcc/config/picochip/picochip.c b/gcc/config/picochip/picochip.c
index 1a6c1ef..5c52c83 100644
--- a/gcc/config/picochip/picochip.c
+++ b/gcc/config/picochip/picochip.c
@@ -809,7 +809,7 @@  picochip_compute_arg_size (const_tree type, enum machine_mode mode)
   int type_size_in_units = 0;
 
   if (type)
-    type_size_in_units = tree_low_cst (TYPE_SIZE_UNIT (type), 1);
+    type_size_in_units = tree_to_uhwi (TYPE_SIZE_UNIT (type));
   else
     type_size_in_units = GET_MODE_SIZE (mode);
 
diff --git a/gcc/config/rs6000/rs6000-c.c b/gcc/config/rs6000/rs6000-c.c
index 58101ab..6724df0 100644
--- a/gcc/config/rs6000/rs6000-c.c
+++ b/gcc/config/rs6000/rs6000-c.c
@@ -3620,8 +3620,8 @@  altivec_resolve_overloaded_builtin (location_t loc, tree fndecl,
       mode = TYPE_MODE (arg1_type);
       if ((mode == V2DFmode || mode == V2DImode) && VECTOR_MEM_VSX_P (mode)
 	  && TREE_CODE (arg2) == INTEGER_CST
-	  && TREE_INT_CST_HIGH (arg2) == 0
-	  && (TREE_INT_CST_LOW (arg2) == 0 || TREE_INT_CST_LOW (arg2) == 1))
+	  && tree_fits_uhwi_p (arg2)
+	  && (tree_to_hwi (arg2) == 0 || tree_to_hwi (arg2) == 1))
 	{
 	  tree call = NULL_TREE;
 
@@ -3697,8 +3697,8 @@  altivec_resolve_overloaded_builtin (location_t loc, tree fndecl,
       mode = TYPE_MODE (arg1_type);
       if ((mode == V2DFmode || mode == V2DImode) && VECTOR_UNIT_VSX_P (mode)
 	  && TREE_CODE (arg2) == INTEGER_CST
-	  && TREE_INT_CST_HIGH (arg2) == 0
-	  && (TREE_INT_CST_LOW (arg2) == 0 || TREE_INT_CST_LOW (arg2) == 1))
+	  && tree_fits_uhwi_p (arg2)
+	  && (tree_to_hwi (arg2) == 0 || tree_to_hwi (arg2) == 1))
 	{
 	  tree call = NULL_TREE;
 
diff --git a/gcc/config/rs6000/rs6000.c b/gcc/config/rs6000/rs6000.c
index f4e4dec..c7d542f 100644
--- a/gcc/config/rs6000/rs6000.c
+++ b/gcc/config/rs6000/rs6000.c
@@ -3300,7 +3300,7 @@  rs6000_builtin_support_vector_misalignment (enum machine_mode mode,
 	     it's word aligned.  */
 	  if (rs6000_vector_alignment_reachable (type, is_packed))
             {
-              int element_size = TREE_INT_CST_LOW (TYPE_SIZE (type));
+              int element_size = tree_to_shwi (TYPE_SIZE (type));
 
               if (element_size == 64 || element_size == 32)
                return true;
@@ -5228,10 +5228,10 @@  offsettable_ok_by_alignment (rtx op, HOST_WIDE_INT offset,
       if (!DECL_SIZE_UNIT (decl))
 	return false;
 
-      if (!host_integerp (DECL_SIZE_UNIT (decl), 1))
+      if (!tree_fits_uhwi_p (DECL_SIZE_UNIT (decl)))
 	return false;
 
-      dsize = tree_low_cst (DECL_SIZE_UNIT (decl), 1);
+      dsize = tree_to_hwi (DECL_SIZE_UNIT (decl));
       if (dsize > 32768)
 	return false;
 
@@ -5244,8 +5244,8 @@  offsettable_ok_by_alignment (rtx op, HOST_WIDE_INT offset,
   if (TREE_CODE (decl) == STRING_CST)
     dsize = TREE_STRING_LENGTH (decl);
   else if (TYPE_SIZE_UNIT (type)
-	   && host_integerp (TYPE_SIZE_UNIT (type), 1))
-    dsize = tree_low_cst (TYPE_SIZE_UNIT (type), 1);
+	   && tree_fits_uhwi_p (TYPE_SIZE_UNIT (type)))
+    dsize = tree_to_hwi (TYPE_SIZE_UNIT (type));
   else
     return false;
   if (dsize > 32768)
@@ -7745,7 +7745,7 @@  rs6000_darwin64_record_arg_advance_recurse (CUMULATIVE_ARGS *cum,
 	mode = TYPE_MODE (ftype);
 
 	if (DECL_SIZE (f) != 0
-	    && host_integerp (bit_position (f), 1))
+	    && tree_fits_uhwi_p (bit_position (f)))
 	  bitpos += int_bit_position (f);
 
 	/* ??? FIXME: else assume zero offset.  */
@@ -8222,7 +8222,7 @@  rs6000_darwin64_record_arg_recurse (CUMULATIVE_ARGS *cum, const_tree type,
 	mode = TYPE_MODE (ftype);
 
 	if (DECL_SIZE (f) != 0
-	    && host_integerp (bit_position (f), 1))
+	    && tree_fits_uhwi_p (bit_position (f)))
 	  bitpos += int_bit_position (f);
 
 	/* ??? FIXME: else assume zero offset.  */
@@ -9913,7 +9913,7 @@  rs6000_expand_binop_builtin (enum insn_code icode, tree exp, rtx target)
       /* Only allow 5-bit unsigned literals.  */
       STRIP_NOPS (arg1);
       if (TREE_CODE (arg1) != INTEGER_CST
-	  || TREE_INT_CST_LOW (arg1) & ~0x1f)
+	  || tree_to_uhwi (arg1) & ~0x1f)
 	{
 	  error ("argument 2 must be a 5-bit unsigned literal");
 	  return const0_rtx;
@@ -9958,7 +9958,7 @@  altivec_expand_predicate_builtin (enum insn_code icode, tree exp, rtx target)
       return const0_rtx;
     }
   else
-    cr6_form_int = TREE_INT_CST_LOW (cr6_form);
+    cr6_form_int = tree_to_shwi (cr6_form);
 
   gcc_assert (mode0 == mode1);
 
@@ -10261,7 +10261,7 @@  rs6000_expand_ternop_builtin (enum insn_code icode, tree exp, rtx target)
       /* Only allow 4-bit unsigned literals.  */
       STRIP_NOPS (arg2);
       if (TREE_CODE (arg2) != INTEGER_CST
-	  || TREE_INT_CST_LOW (arg2) & ~0xf)
+	  || tree_to_uhwi (arg2) & ~0xf)
 	{
 	  error ("argument 3 must be a 4-bit unsigned literal");
 	  return const0_rtx;
@@ -10279,7 +10279,7 @@  rs6000_expand_ternop_builtin (enum insn_code icode, tree exp, rtx target)
       /* Only allow 2-bit unsigned literals.  */
       STRIP_NOPS (arg2);
       if (TREE_CODE (arg2) != INTEGER_CST
-	  || TREE_INT_CST_LOW (arg2) & ~0x3)
+	  || tree_to_uhwi (arg2) & ~0x3)
 	{
 	  error ("argument 3 must be a 2-bit unsigned literal");
 	  return const0_rtx;
@@ -10291,7 +10291,7 @@  rs6000_expand_ternop_builtin (enum insn_code icode, tree exp, rtx target)
       /* Only allow 1-bit unsigned literals.  */
       STRIP_NOPS (arg2);
       if (TREE_CODE (arg2) != INTEGER_CST
-	  || TREE_INT_CST_LOW (arg2) & ~0x1)
+	  || tree_to_uhwi (arg2) & ~0x1)
 	{
 	  error ("argument 3 must be a 1-bit unsigned literal");
 	  return const0_rtx;
@@ -10474,7 +10474,7 @@  altivec_expand_dst_builtin (tree exp, rtx target ATTRIBUTE_UNUSED,
 	*expandedp = true;
 	STRIP_NOPS (arg2);
 	if (TREE_CODE (arg2) != INTEGER_CST
-	    || TREE_INT_CST_LOW (arg2) & ~0x3)
+	    || tree_to_uhwi (arg2) & ~0x3)
 	  {
 	    error ("argument to %qs must be a 2-bit unsigned literal", d->name);
 	    return const0_rtx;
@@ -10528,8 +10528,8 @@  get_element_number (tree vec_type, tree arg)
 {
   unsigned HOST_WIDE_INT elt, max = TYPE_VECTOR_SUBPARTS (vec_type) - 1;
 
-  if (!host_integerp (arg, 1)
-      || (elt = tree_low_cst (arg, 1), elt > max))
+  if (!tree_fits_uhwi_p (arg)
+      || (elt = tree_to_hwi (arg), elt > max))
     {
       error ("selector must be an integer constant in the range 0..%wi", max);
       return 0;
@@ -10721,7 +10721,7 @@  altivec_expand_builtin (tree exp, rtx target, bool *expandedp)
 	return const0_rtx;
 
       if (TREE_CODE (arg0) != INTEGER_CST
-	  || TREE_INT_CST_LOW (arg0) & ~0x3)
+	  || tree_to_uhwi (arg0) & ~0x3)
 	{
 	  error ("argument to dss must be a 2-bit unsigned literal");
 	  return const0_rtx;
@@ -10930,7 +10930,7 @@  spe_expand_builtin (tree exp, rtx target, bool *expandedp)
     case SPE_BUILTIN_EVSTWWO:
       arg1 = CALL_EXPR_ARG (exp, 2);
       if (TREE_CODE (arg1) != INTEGER_CST
-	  || TREE_INT_CST_LOW (arg1) & ~0x1f)
+	  || tree_to_uhwi (arg1) & ~0x1f)
 	{
 	  error ("argument 2 must be a 5-bit unsigned literal");
 	  return const0_rtx;
@@ -11056,7 +11056,7 @@  paired_expand_predicate_builtin (enum insn_code icode, tree exp, rtx target)
       return const0_rtx;
     }
   else
-    form_int = TREE_INT_CST_LOW (form);
+    form_int = tree_to_shwi (form);
 
   gcc_assert (mode0 == mode1);
 
@@ -11128,7 +11128,7 @@  spe_expand_predicate_builtin (enum insn_code icode, tree exp, rtx target)
       return const0_rtx;
     }
   else
-    form_int = TREE_INT_CST_LOW (form);
+    form_int = tree_to_shwi (form);
 
   gcc_assert (mode0 == mode1);
 
diff --git a/gcc/config/s390/s390.c b/gcc/config/s390/s390.c
index bcbb2d4..a3def28 100644
--- a/gcc/config/s390/s390.c
+++ b/gcc/config/s390/s390.c
@@ -9507,9 +9507,9 @@  s390_encode_section_info (tree decl, rtx rtl, int first)
 	SYMBOL_REF_FLAGS (XEXP (rtl, 0)) |= SYMBOL_FLAG_ALIGN1;
       if (!DECL_SIZE (decl)
 	  || !DECL_ALIGN (decl)
-	  || !host_integerp (DECL_SIZE (decl), 0)
+	  || !tree_fits_shwi_p (DECL_SIZE (decl))
 	  || (DECL_ALIGN (decl) <= 64
-	      && DECL_ALIGN (decl) != tree_low_cst (DECL_SIZE (decl), 0)))
+	      && DECL_ALIGN (decl) != tree_to_hwi (DECL_SIZE (decl))))
 	SYMBOL_REF_FLAGS (XEXP (rtl, 0)) |= SYMBOL_FLAG_NOT_NATURALLY_ALIGNED;
     }
 
diff --git a/gcc/config/sh/sh.c b/gcc/config/sh/sh.c
index 3599879..6be440e 100644
--- a/gcc/config/sh/sh.c
+++ b/gcc/config/sh/sh.c
@@ -1131,7 +1131,7 @@  sh_print_operand (FILE *stream, rtx x, int code)
 				      DECL_ATTRIBUTES (current_function_decl));
       if (trapa_attr)
 	fprintf (stream, "trapa #%ld",
-		 (long) TREE_INT_CST_LOW (TREE_VALUE (TREE_VALUE (trapa_attr))));
+		 (long) tree_to_shwi (TREE_VALUE (TREE_VALUE (trapa_attr))));
       else if (sh_cfun_interrupt_handler_p ())
 	{
 	  if (sh_cfun_resbank_handler_p ())
@@ -9505,7 +9505,7 @@  sh2a_handle_function_vector_handler_attribute (tree * node, tree name,
                name);
       *no_add_attrs = true;
     }
-  else if (TREE_INT_CST_LOW (TREE_VALUE (args)) > 255)
+  else if (tree_to_shwi (TREE_VALUE (args)) > 255)
     {
       /* The argument value must be between 0 to 255.  */
       warning (OPT_Wattributes,
@@ -9554,7 +9554,7 @@  sh2a_get_function_vector_number (rtx x)
         {
           if (is_attribute_p ("function_vector", TREE_PURPOSE (list)))
             {
-              num = TREE_INT_CST_LOW (TREE_VALUE (TREE_VALUE (list)));
+              num = tree_to_hwi (TREE_VALUE (TREE_VALUE (list)));
               return num;
             }
 
diff --git a/gcc/config/sol2-c.c b/gcc/config/sol2-c.c
index 0076be7..921b76f 100644
--- a/gcc/config/sol2-c.c
+++ b/gcc/config/sol2-c.c
@@ -94,8 +94,8 @@  solaris_pragma_align (cpp_reader *pfile ATTRIBUTE_UNUSED)
       return;
     }
 
-  low = TREE_INT_CST_LOW (x);
-  if (TREE_INT_CST_HIGH (x) != 0
+  low = tree_to_hwi (x);
+  if (!tree_fits_uhwi_p (x)
       || (low != 1 && low != 2 && low != 4 && low != 8 && low != 16
 	  && low != 32 && low != 64 && low != 128))
     {
diff --git a/gcc/config/sparc/sparc.c b/gcc/config/sparc/sparc.c
index 8849c03..92db31c 100644
--- a/gcc/config/sparc/sparc.c
+++ b/gcc/config/sparc/sparc.c
@@ -5967,7 +5967,7 @@  function_arg_record_value_1 (const_tree type, HOST_WIDE_INT startbitpos,
 	      if (integer_zerop (DECL_SIZE (field)))
 		continue;
 
-	      if (host_integerp (bit_position (field), 1))
+	      if (tree_fits_uhwi_p (bit_position (field)))
 		bitpos += int_bit_position (field);
 	    }
 
@@ -6115,7 +6115,7 @@  function_arg_record_value_2 (const_tree type, HOST_WIDE_INT startbitpos,
 	      if (integer_zerop (DECL_SIZE (field)))
 		continue;
 
-	      if (host_integerp (bit_position (field), 1))
+	      if (tree_fits_uhwi_p (bit_position (field)))
 		bitpos += int_bit_position (field);
 	    }
 
@@ -6782,10 +6782,10 @@  sparc_struct_value_rtx (tree fndecl, int incoming)
 
 	  /* Calculate the return object size */
 	  tree size = TYPE_SIZE_UNIT (TREE_TYPE (fndecl));
-	  rtx size_rtx = GEN_INT (TREE_INT_CST_LOW (size) & 0xfff);
+	  rtx size_rtx = GEN_INT (tree_to_shwi (size) & 0xfff);
 	  /* Construct a temporary return value */
 	  rtx temp_val
-	    = assign_stack_local (Pmode, TREE_INT_CST_LOW (size), 0);
+	    = assign_stack_local (Pmode, tree_to_shwi (size), 0);
 
 	  /* Implement SPARC 32-bit psABI callee return struct checking:
 
@@ -9979,31 +9979,31 @@  sparc_handle_vis_mul8x16 (tree *n_elts, int fncode, tree inner_type,
       for (i = 0; i < num; ++i)
 	{
 	  int val
-	    = sparc_vis_mul8x16 (TREE_INT_CST_LOW (VECTOR_CST_ELT (cst0, i)),
-				 TREE_INT_CST_LOW (VECTOR_CST_ELT (cst1, i)));
+	    = sparc_vis_mul8x16 (tree_to_shwi (VECTOR_CST_ELT (cst0, i)),
+				 tree_to_shwi (VECTOR_CST_ELT (cst1, i)));
 	  n_elts[i] = build_int_cst (inner_type, val);
 	}
       break;
 
     case CODE_FOR_fmul8x16au_vis:
-      scale = TREE_INT_CST_LOW (VECTOR_CST_ELT (cst1, 0));
+      scale = tree_to_shwi (VECTOR_CST_ELT (cst1, 0));
 
       for (i = 0; i < num; ++i)
 	{
 	  int val
-	    = sparc_vis_mul8x16 (TREE_INT_CST_LOW (VECTOR_CST_ELT (cst0, i)),
+	    = sparc_vis_mul8x16 (tree_to_shwi (VECTOR_CST_ELT (cst0, i)),
 				 scale);
 	  n_elts[i] = build_int_cst (inner_type, val);
 	}
       break;
 
     case CODE_FOR_fmul8x16al_vis:
-      scale = TREE_INT_CST_LOW (VECTOR_CST_ELT (cst1, 1));
+      scale = tree_to_shwi (VECTOR_CST_ELT (cst1, 1));
 
       for (i = 0; i < num; ++i)
 	{
 	  int val
-	    = sparc_vis_mul8x16 (TREE_INT_CST_LOW (VECTOR_CST_ELT (cst0, i)),
+	    = sparc_vis_mul8x16 (tree_to_shwi (VECTOR_CST_ELT (cst0, i)),
 				 scale);
 	  n_elts[i] = build_int_cst (inner_type, val);
 	}
@@ -10063,7 +10063,7 @@  sparc_fold_builtin (tree fndecl, int n_args ATTRIBUTE_UNUSED,
 	  n_elts = XALLOCAVEC (tree, VECTOR_CST_NELTS (arg0));
 	  for (i = 0; i < VECTOR_CST_NELTS (arg0); ++i)
 	    n_elts[i] = build_int_cst (inner_type,
-				       TREE_INT_CST_LOW
+				       tree_to_shwi
 				         (VECTOR_CST_ELT (arg0, i)) << 4);
 	  return build_vector (rtype, n_elts);
 	}
diff --git a/gcc/config/vms/vms-c.c b/gcc/config/vms/vms-c.c
index 7339323..294f294 100644
--- a/gcc/config/vms/vms-c.c
+++ b/gcc/config/vms/vms-c.c
@@ -317,7 +317,7 @@  handle_pragma_pointer_size (const char *pragma_name)
       int val;
 
       if (TREE_CODE (x) == INTEGER_CST)
-        val = TREE_INT_CST_LOW (x);
+        val = tree_to_shwi (x);
       else
         val = -1;
 
diff --git a/gcc/coverage.c b/gcc/coverage.c
index f9b12e8..9bb0f89 100644
--- a/gcc/coverage.c
+++ b/gcc/coverage.c
@@ -800,7 +800,7 @@  build_fn_info (const struct coverage_data *data, tree type, tree key)
 
 	if (var)
 	  count
-	    = tree_low_cst (TYPE_MAX_VALUE (TYPE_DOMAIN (TREE_TYPE (var))), 0)
+	    = tree_to_shwi (TYPE_MAX_VALUE (TYPE_DOMAIN (TREE_TYPE (var))))
 	    + 1;
 
 	CONSTRUCTOR_APPEND_ELT (ctr, TYPE_FIELDS (ctr_type),
diff --git a/gcc/cp/call.c b/gcc/cp/call.c
index 006cf41..f888d32 100644
--- a/gcc/cp/call.c
+++ b/gcc/cp/call.c
@@ -944,7 +944,8 @@  build_array_conv (tree type, tree ctor, int flags, tsubst_flags_t complain)
 
   if (TYPE_DOMAIN (type))
     {
-      unsigned HOST_WIDE_INT alen = tree_low_cst (array_type_nelts_top (type), 1);
+      unsigned HOST_WIDE_INT alen
+	= tree_to_uhwi (array_type_nelts_top (type));
       if (alen < len)
 	return NULL;
     }
diff --git a/gcc/cp/class.c b/gcc/cp/class.c
index 8de1423..c62a33e 100644
--- a/gcc/cp/class.c
+++ b/gcc/cp/class.c
@@ -5774,7 +5774,7 @@  layout_class_type (tree t, tree *virtuals_p)
 	{
 	  unsigned HOST_WIDE_INT width;
 	  tree ftype = TREE_TYPE (field);
-	  width = tree_low_cst (DECL_SIZE (field), /*unsignedp=*/1);
+	  width = tree_to_uhwi (DECL_SIZE (field));
 	  if (width != TYPE_PRECISION (ftype))
 	    {
 	      TREE_TYPE (field)
@@ -7585,7 +7585,7 @@  dump_class_hierarchy_r (FILE *stream,
   igo = TREE_CHAIN (binfo);
 
   fprintf (stream, HOST_WIDE_INT_PRINT_DEC,
-	   tree_low_cst (BINFO_OFFSET (binfo), 0));
+	   tree_to_shwi (BINFO_OFFSET (binfo)));
   if (is_empty_class (BINFO_TYPE (binfo)))
     fprintf (stream, " empty");
   else if (CLASSTYPE_NEARLY_EMPTY_P (BINFO_TYPE (binfo)))
@@ -7661,10 +7661,10 @@  dump_class_hierarchy_1 (FILE *stream, int flags, tree t)
 {
   fprintf (stream, "Class %s\n", type_as_string (t, TFF_PLAIN_IDENTIFIER));
   fprintf (stream, "   size=%lu align=%lu\n",
-	   (unsigned long)(tree_low_cst (TYPE_SIZE (t), 0) / BITS_PER_UNIT),
+	   (unsigned long)(tree_to_shwi (TYPE_SIZE (t)) / BITS_PER_UNIT),
 	   (unsigned long)(TYPE_ALIGN (t) / BITS_PER_UNIT));
   fprintf (stream, "   base size=%lu base align=%lu\n",
-	   (unsigned long)(tree_low_cst (TYPE_SIZE (CLASSTYPE_AS_BASE (t)), 0)
+	   (unsigned long)(tree_to_shwi (TYPE_SIZE (CLASSTYPE_AS_BASE (t)))
 			   / BITS_PER_UNIT),
 	   (unsigned long)(TYPE_ALIGN (CLASSTYPE_AS_BASE (t))
 			   / BITS_PER_UNIT));
@@ -7701,7 +7701,7 @@  dump_array (FILE * stream, tree decl)
   HOST_WIDE_INT elt;
   tree size = TYPE_MAX_VALUE (TYPE_DOMAIN (TREE_TYPE (decl)));
 
-  elt = (tree_low_cst (TYPE_SIZE (TREE_TYPE (TREE_TYPE (decl))), 0)
+  elt = (tree_to_shwi (TYPE_SIZE (TREE_TYPE (TREE_TYPE (decl))))
 	 / BITS_PER_UNIT);
   fprintf (stream, "%s:", decl_as_string (decl, TFF_PLAIN_IDENTIFIER));
   fprintf (stream, " %s entries",
@@ -7789,10 +7789,10 @@  dump_thunk (FILE *stream, int indent, tree thunk)
 	/*NOP*/;
       else if (DECL_THIS_THUNK_P (thunk))
 	fprintf (stream, " vcall="  HOST_WIDE_INT_PRINT_DEC,
-		 tree_low_cst (virtual_adjust, 0));
+		 tree_to_shwi (virtual_adjust));
       else
 	fprintf (stream, " vbase=" HOST_WIDE_INT_PRINT_DEC "(%s)",
-		 tree_low_cst (BINFO_VPTR_FIELD (virtual_adjust), 0),
+		 tree_to_shwi (BINFO_VPTR_FIELD (virtual_adjust)),
 		 type_as_string (BINFO_TYPE (virtual_adjust), TFF_SCOPE));
       if (THUNK_ALIAS (thunk))
 	fprintf (stream, " alias to %p", (void *)THUNK_ALIAS (thunk));
diff --git a/gcc/cp/cp-tree.h b/gcc/cp/cp-tree.h
index 034668d..fdbf632 100644
--- a/gcc/cp/cp-tree.h
+++ b/gcc/cp/cp-tree.h
@@ -2744,7 +2744,7 @@  extern void decl_shadowed_for_var_insert (tree, tree);
 
 /* The number of levels of template parameters given by NODE.  */
 #define TMPL_PARMS_DEPTH(NODE) \
-  ((HOST_WIDE_INT) TREE_INT_CST_LOW (TREE_PURPOSE (NODE)))
+  ((HOST_WIDE_INT) tree_to_shwi (TREE_PURPOSE (NODE)))
 
 /* The TEMPLATE_DECL instantiated or specialized by NODE.  This
    TEMPLATE_DECL will be the immediate parent, not the most general
@@ -3619,7 +3619,7 @@  more_aggr_init_expr_args_p (const aggr_init_expr_arg_iterator *iter)
 /* Accessor macros for C++ template decl nodes.  */
 
 /* The DECL_TEMPLATE_PARMS are a list.  The TREE_PURPOSE of each node
-   is a INT_CST whose TREE_INT_CST_LOW indicates the level of the
+   is an INTEGER_CST whose tree_to_uhwi value indicates the level of the
    template parameters, with 1 being the outermost set of template
    parameters.  The TREE_VALUE is a vector, whose elements are the
    template parameters at each level.  Each element in the vector is a
diff --git a/gcc/cp/decl.c b/gcc/cp/decl.c
index 72754a9..368d738 100644
--- a/gcc/cp/decl.c
+++ b/gcc/cp/decl.c
@@ -4698,7 +4698,7 @@  check_array_designated_initializer (const constructor_elt *ce,
       else if (TREE_CODE (ce->index) == INTEGER_CST)
 	{
 	  /* A C99 designator is OK if it matches the current index.  */
-	  if (TREE_INT_CST_LOW (ce->index) == index)
+	  if (tree_to_uhwi (ce->index) == index)
 	    return true;
 	  else
 	    sorry ("non-trivial designated initializers not supported");
@@ -4991,12 +4991,11 @@  reshape_init_array_1 (tree elt_type, tree max_index, reshape_iter *d,
       if (integer_all_onesp (max_index))
 	return new_init;
 
-      if (host_integerp (max_index, 1))
-	max_index_cst = tree_low_cst (max_index, 1);
+      if (tree_fits_uhwi_p (max_index))
+	max_index_cst = tree_to_hwi (max_index);
       /* sizetype is sign extended, not zero extended.  */
       else
-	max_index_cst = tree_low_cst (fold_convert (size_type_node, max_index),
-				      1);
+	max_index_cst = tree_to_uhwi (fold_convert (size_type_node, max_index));
     }
 
   /* Loop until there are no more initializers.  */
@@ -9717,7 +9716,7 @@  grokdeclarator (const cp_declarator *declarator,
     {
       error ("size of array %qs is too large", name);
       /* If we proceed with the array type as it is, we'll eventually
-	 crash in tree_low_cst().  */
+	 crash in tree_to_uhwi ().  */
       type = error_mark_node;
     }
 
diff --git a/gcc/cp/dump.c b/gcc/cp/dump.c
index 3e6ef8f..bec5d83 100644
--- a/gcc/cp/dump.c
+++ b/gcc/cp/dump.c
@@ -347,7 +347,7 @@  cp_dump_tree (void* dump_info, tree t)
 	    }
 	  dump_int (di, "fixd", THUNK_FIXED_OFFSET (t));
 	  if (virt)
-	    dump_int (di, "virt", tree_low_cst (virt, 0));
+	    dump_int (di, "virt", tree_to_shwi (virt));
 	  dump_child ("fn", DECL_INITIAL (t));
 	}
       break;
diff --git a/gcc/cp/error.c b/gcc/cp/error.c
index e1aa938..ea69403 100644
--- a/gcc/cp/error.c
+++ b/gcc/cp/error.c
@@ -848,8 +848,8 @@  dump_type_suffix (tree t, int flags)
 	  tree max = TYPE_MAX_VALUE (dtype);
 	  if (integer_all_onesp (max))
 	    pp_character (cxx_pp, '0');
-	  else if (host_integerp (max, 0))
-	    pp_wide_integer (cxx_pp, tree_low_cst (max, 0) + 1);
+	  else if (tree_fits_shwi_p (max))
+	    pp_wide_integer (cxx_pp, tree_to_hwi (max) + 1);
 	  else if (TREE_CODE (max) == MINUS_EXPR)
 	    dump_expr (TREE_OPERAND (max, 0),
 		       flags & ~TFF_EXPR_IN_PARENS);
@@ -1760,7 +1760,7 @@  static tree
 resolve_virtual_fun_from_obj_type_ref (tree ref)
 {
   tree obj_type = TREE_TYPE (OBJ_TYPE_REF_OBJECT (ref));
-  HOST_WIDE_INT index = tree_low_cst (OBJ_TYPE_REF_TOKEN (ref), 1);
+  HOST_WIDE_INT index = tree_to_uhwi (OBJ_TYPE_REF_TOKEN (ref));
   tree fun = BINFO_VIRTUALS (TYPE_BINFO (TREE_TYPE (obj_type)));
   while (index)
     {
@@ -2171,7 +2171,7 @@  dump_expr (tree t, int flags)
 	      pp_cxx_right_paren (cxx_pp);
 	      break;
 	    }
-	  else if (host_integerp (idx, 0))
+	  else if (tree_fits_shwi_p (idx))
 	    {
 	      tree virtuals;
 	      unsigned HOST_WIDE_INT n;
@@ -2180,7 +2180,7 @@  dump_expr (tree t, int flags)
 	      t = TYPE_METHOD_BASETYPE (t);
 	      virtuals = BINFO_VIRTUALS (TYPE_BINFO (TYPE_MAIN_VARIANT (t)));
 
-	      n = tree_low_cst (idx, 0);
+	      n = tree_to_hwi (idx);
 
 	      /* Map vtable index back one, to allow for the null pointer to
 		 member.  */
diff --git a/gcc/cp/init.c b/gcc/cp/init.c
index 40d0ce3..886eb2a 100644
--- a/gcc/cp/init.c
+++ b/gcc/cp/init.c
@@ -3516,9 +3516,9 @@  build_vec_init (tree base, tree maxindex, tree init,
 
   if (from_array
       || ((type_build_ctor_call (type) || init || explicit_value_init_p)
-	  && ! (host_integerp (maxindex, 0)
+	  && ! (tree_fits_shwi_p (maxindex)
 		&& (num_initialized_elts
-		    == tree_low_cst (maxindex, 0) + 1))))
+		    == tree_to_hwi (maxindex) + 1))))
     {
       /* If the ITERATOR is equal to -1, then we don't have to loop;
 	 we've already initialized all the elements.  */
diff --git a/gcc/cp/mangle.c b/gcc/cp/mangle.c
index eee44a1..71bf868 100644
--- a/gcc/cp/mangle.c
+++ b/gcc/cp/mangle.c
@@ -1476,7 +1476,7 @@  write_integer_cst (const tree cst)
 
 	  done = integer_zerop (d);
 	  tmp = fold_build2_loc (input_location, MINUS_EXPR, type, n, tmp);
-	  c = hwint_to_ascii (TREE_INT_CST_LOW (tmp), 10, ptr,
+	  c = hwint_to_ascii (tree_to_uhwi (tmp), 10, ptr,
 			      done ? 1 : chunk_digits);
 	  ptr -= c;
 	  count += c;
@@ -1488,7 +1488,7 @@  write_integer_cst (const tree cst)
   else
     {
       /* A small num.  */
-      unsigned HOST_WIDE_INT low = TREE_INT_CST_LOW (cst);
+      unsigned HOST_WIDE_INT low = tree_to_hwi (cst);
 
       if (sign < 0)
 	{
diff --git a/gcc/cp/method.c b/gcc/cp/method.c
index a42ed60..7fa86b2 100644
--- a/gcc/cp/method.c
+++ b/gcc/cp/method.c
@@ -97,7 +97,7 @@  make_thunk (tree function, bool this_adjusting,
 		    convert (ssizetype,
 			     TYPE_SIZE_UNIT (vtable_entry_type)));
 
-  d = tree_low_cst (fixed_offset, 0);
+  d = tree_to_shwi (fixed_offset);
 
   /* See if we already have the thunk in question.  For this_adjusting
      thunks VIRTUAL_OFFSET will be an INTEGER_CST, for covariant thunks it
@@ -324,7 +324,7 @@  use_thunk (tree thunk_fndecl, bool emit_p)
     {
       if (!this_adjusting)
 	virtual_offset = BINFO_VPTR_FIELD (virtual_offset);
-      virtual_value = tree_low_cst (virtual_offset, /*pos=*/0);
+      virtual_value = tree_to_shwi (virtual_offset);
       gcc_assert (virtual_value);
     }
   else
diff --git a/gcc/cp/parser.c b/gcc/cp/parser.c
index baaa809..b69e9b8 100644
--- a/gcc/cp/parser.c
+++ b/gcc/cp/parser.c
@@ -787,7 +787,7 @@  cp_lexer_get_preprocessor_token (cp_lexer *lexer, cp_token *token)
     {
       /* We smuggled the cpp_token->u.pragma value in an INTEGER_CST.  */
       token->pragma_kind = ((enum pragma_kind)
-			    TREE_INT_CST_LOW (token->u.value));
+			    tree_to_shwi (token->u.value));
       token->u.value = NULL_TREE;
     }
 }
@@ -3698,7 +3698,7 @@  cp_parser_userdef_string_literal (cp_token *token)
   tree name = cp_literal_operator_id (IDENTIFIER_POINTER (suffix_id));
   tree value = USERDEF_LITERAL_VALUE (literal);
   int len = TREE_STRING_LENGTH (value)
-	/ TREE_INT_CST_LOW (TYPE_SIZE_UNIT (TREE_TYPE (TREE_TYPE (value)))) - 1;
+	/ tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (TREE_TYPE (value)))) - 1;
   tree decl, result;
 
   /* Build up a call to the user-defined operator  */
@@ -25215,8 +25215,8 @@  cp_parser_omp_clause_collapse (cp_parser *parser, tree list, location_t location
     return list;
   num = fold_non_dependent_expr (num);
   if (!INTEGRAL_TYPE_P (TREE_TYPE (num))
-      || !host_integerp (num, 0)
-      || (n = tree_low_cst (num, 0)) <= 0
+      || !tree_fits_shwi_p (num)
+      || (n = tree_to_shwi (num)) <= 0
       || (int) n != n)
     {
       error_at (loc, "collapse argument needs positive constant integer expression");
@@ -26319,7 +26319,7 @@  cp_parser_omp_for_loop (cp_parser *parser, tree clauses, tree *par_clauses)
 
   for (cl = clauses; cl; cl = OMP_CLAUSE_CHAIN (cl))
     if (OMP_CLAUSE_CODE (cl) == OMP_CLAUSE_COLLAPSE)
-      collapse = tree_low_cst (OMP_CLAUSE_COLLAPSE_EXPR (cl), 0);
+      collapse = tree_to_shwi (OMP_CLAUSE_COLLAPSE_EXPR (cl));
 
   gcc_assert (collapse >= 1);
 
diff --git a/gcc/cp/semantics.c b/gcc/cp/semantics.c
index 17ac36f..3aa851f 100644
--- a/gcc/cp/semantics.c
+++ b/gcc/cp/semantics.c
@@ -6867,7 +6867,7 @@  cxx_eval_array_reference (const constexpr_call *call, tree t,
       *non_constant_p = true;
       return t;
     }
-  i = tree_low_cst (index, 0);
+  i = tree_to_shwi (index);
   if (TREE_CODE (ary) == CONSTRUCTOR)
     return VEC_index (constructor_elt, CONSTRUCTOR_ELTS (ary), i).value;
   else if (elem_nchars == 1)
@@ -6976,8 +6976,8 @@  cxx_eval_bit_field_ref (const constexpr_call *call, tree t,
     return t;
 
   start = TREE_OPERAND (t, 2);
-  istart = tree_low_cst (start, 0);
-  isize = tree_low_cst (TREE_OPERAND (t, 1), 0);
+  istart = tree_to_shwi (start);
+  isize = tree_to_shwi (TREE_OPERAND (t, 1));
   utype = TREE_TYPE (t);
   if (!TYPE_UNSIGNED (utype))
     utype = build_nonstandard_integer_type (TYPE_PRECISION (utype), 1);
@@ -6989,11 +6989,11 @@  cxx_eval_bit_field_ref (const constexpr_call *call, tree t,
 	return value;
       if (TREE_CODE (TREE_TYPE (field)) == INTEGER_TYPE
 	  && TREE_CODE (value) == INTEGER_CST
-	  && host_integerp (bitpos, 0)
-	  && host_integerp (DECL_SIZE (field), 0))
+	  && tree_fits_shwi_p (bitpos)
+	  && tree_fits_shwi_p (DECL_SIZE (field)))
 	{
-	  HOST_WIDE_INT bit = tree_low_cst (bitpos, 0);
-	  HOST_WIDE_INT sz = tree_low_cst (DECL_SIZE (field), 0);
+	  HOST_WIDE_INT bit = tree_to_hwi (bitpos);
+	  HOST_WIDE_INT sz = tree_to_hwi (DECL_SIZE (field));
 	  HOST_WIDE_INT shift;
 	  if (bit >= istart && bit + sz <= istart + isize)
 	    {
@@ -7148,7 +7148,7 @@  cxx_eval_vec_init_1 (const constexpr_call *call, tree atype, tree init,
 		     bool *non_constant_p)
 {
   tree elttype = TREE_TYPE (atype);
-  int max = tree_low_cst (array_type_nelts (atype), 0);
+  int max = tree_to_shwi (array_type_nelts (atype));
   VEC(constructor_elt,gc) *n = VEC_alloc (constructor_elt, gc, max + 1);
   bool pre_init = false;
   int i;
@@ -7366,9 +7366,9 @@  cxx_fold_indirect_ref (location_t loc, tree type, tree op0, bool *empty_base)
 	      && (same_type_ignoring_top_level_qualifiers_p
 		  (type, TREE_TYPE (op00type))))
 	    {
-	      HOST_WIDE_INT offset = tree_low_cst (op01, 0);
+	      HOST_WIDE_INT offset = tree_to_shwi (op01);
 	      tree part_width = TYPE_SIZE (type);
-	      unsigned HOST_WIDE_INT part_widthi = tree_low_cst (part_width, 0)/BITS_PER_UNIT;
+	      unsigned HOST_WIDE_INT part_widthi = tree_to_shwi (part_width) / BITS_PER_UNIT;
 	      unsigned HOST_WIDE_INT indexi = offset * BITS_PER_UNIT;
 	      tree index = bitsize_int (indexi);
 
diff --git a/gcc/cp/tree.c b/gcc/cp/tree.c
index 60dc549..6ea7157 100644
--- a/gcc/cp/tree.c
+++ b/gcc/cp/tree.c
@@ -1596,7 +1596,7 @@  debug_binfo (tree elem)
   fprintf (stderr, "type \"%s\", offset = " HOST_WIDE_INT_PRINT_DEC
 	   "\nvtable type:\n",
 	   TYPE_NAME_STRING (BINFO_TYPE (elem)),
-	   TREE_INT_CST_LOW (BINFO_OFFSET (elem)));
+	   tree_to_shwi (BINFO_OFFSET (elem)));
   debug_tree (BINFO_TYPE (elem));
   if (BINFO_VTABLE (elem))
     fprintf (stderr, "vtable decl \"%s\"\n",
@@ -1612,7 +1612,7 @@  debug_binfo (tree elem)
       tree fndecl = TREE_VALUE (virtuals);
       fprintf (stderr, "%s [%ld =? %ld]\n",
 	       IDENTIFIER_POINTER (DECL_ASSEMBLER_NAME (fndecl)),
-	       (long) n, (long) TREE_INT_CST_LOW (DECL_VINDEX (fndecl)));
+	       (long) n, (long) tree_to_shwi (DECL_VINDEX (fndecl)));
       ++n;
       virtuals = TREE_CHAIN (virtuals);
     }
@@ -3073,7 +3073,7 @@  handle_init_priority_attribute (tree* node,
       return NULL_TREE;
     }
 
-  pri = TREE_INT_CST_LOW (initp_expr);
+  pri = tree_to_shwi (initp_expr);
 
   type = strip_array_types (type);
 
diff --git a/gcc/cp/typeck2.c b/gcc/cp/typeck2.c
index 3dbfcb6..b36c184 100644
--- a/gcc/cp/typeck2.c
+++ b/gcc/cp/typeck2.c
@@ -919,7 +919,7 @@  digest_init_r (tree type, tree init, bool nested, int flags,
 	    }
 	  if (TYPE_DOMAIN (type) != 0 && TREE_CONSTANT (TYPE_SIZE (type)))
 	    {
-	      int size = TREE_INT_CST_LOW (TYPE_SIZE (type));
+	      int size = tree_to_shwi (TYPE_SIZE (type));
 	      size = (size + BITS_PER_UNIT - 1) / BITS_PER_UNIT;
 	      /* In C it is ok to subtract 1 from the length of the string
 		 because it's ok to ignore the terminating null char that is
diff --git a/gcc/cppbuiltin.c b/gcc/cppbuiltin.c
index 05d82f5..b6e174c 100644
--- a/gcc/cppbuiltin.c
+++ b/gcc/cppbuiltin.c
@@ -126,7 +126,7 @@  define_builtin_macros_for_type_sizes (cpp_reader *pfile)
 {
 #define define_type_sizeof(NAME, TYPE)                             \
     cpp_define_formatted (pfile, NAME"="HOST_WIDE_INT_PRINT_DEC,   \
-                          tree_low_cst (TYPE_SIZE_UNIT (TYPE), 1))
+                          tree_to_uhwi (TYPE_SIZE_UNIT (TYPE)))
 
   define_type_sizeof ("__SIZEOF_INT__", integer_type_node);
   define_type_sizeof ("__SIZEOF_LONG__", long_integer_type_node);
diff --git a/gcc/dbxout.c b/gcc/dbxout.c
index e8e73bb..15b2024 100644
--- a/gcc/dbxout.c
+++ b/gcc/dbxout.c
@@ -1521,9 +1521,9 @@  dbxout_type_fields (tree type)
 	  /* Omit fields whose position or size are variable or too large to
 	     represent.  */
 	  || (TREE_CODE (tem) == FIELD_DECL
-	      && (! host_integerp (bit_position (tem), 0)
+	      && (! tree_fits_shwi_p (bit_position (tem))
 		  || ! DECL_SIZE (tem)
-		  || ! host_integerp (DECL_SIZE (tem), 1))))
+		  || ! tree_fits_uhwi_p (DECL_SIZE (tem)))))
 	continue;
 
       else if (TREE_CODE (tem) != CONST_DECL)
@@ -1568,7 +1568,7 @@  dbxout_type_fields (tree type)
 	      stabstr_C (',');
 	      stabstr_D (int_bit_position (tem));
 	      stabstr_C (',');
-	      stabstr_D (tree_low_cst (DECL_SIZE (tem), 1));
+	      stabstr_D (tree_to_uhwi (DECL_SIZE (tem)));
 	      stabstr_C (';');
 	    }
 	}
@@ -1612,9 +1612,9 @@  dbxout_type_method_1 (tree decl)
   stabstr_C (c1);
   stabstr_C (c2);
 
-  if (DECL_VINDEX (decl) && host_integerp (DECL_VINDEX (decl), 0))
+  if (DECL_VINDEX (decl) && tree_fits_shwi_p (DECL_VINDEX (decl)))
     {
-      stabstr_D (tree_low_cst (DECL_VINDEX (decl), 0));
+      stabstr_D (tree_to_hwi (DECL_VINDEX (decl)));
       stabstr_C (';');
       dbxout_type (DECL_CONTEXT (decl), 0);
       stabstr_C (';');
@@ -1720,23 +1720,23 @@  dbxout_range_type (tree type, tree low, tree high)
     }
 
   stabstr_C (';');
-  if (low && host_integerp (low, 0))
+  if (low && tree_fits_shwi_p (low))
     {
       if (print_int_cst_bounds_in_octal_p (type, low, high))
         stabstr_O (low);
       else
-        stabstr_D (tree_low_cst (low, 0));
+        stabstr_D (tree_to_hwi (low));
     }
   else
     stabstr_C ('0');
 
   stabstr_C (';');
-  if (high && host_integerp (high, 0))
+  if (high && tree_fits_shwi_p (high))
     {
       if (print_int_cst_bounds_in_octal_p (type, low, high))
         stabstr_O (high);
       else
-        stabstr_D (tree_low_cst (high, 0));
+        stabstr_D (tree_to_hwi (high));
       stabstr_C (';');
     }
   else
@@ -1866,7 +1866,7 @@  dbxout_type (tree type, int full)
 	 Sun dbx crashes if we do.  */
       if (! full || !COMPLETE_TYPE_P (type)
 	  /* No way in DBX fmt to describe a variable size.  */
-	  || ! host_integerp (TYPE_SIZE (type), 1))
+	  || ! tree_fits_uhwi_p (TYPE_SIZE (type)))
 	return;
       break;
     case TYPE_DEFINED:
@@ -1891,7 +1891,7 @@  dbxout_type (tree type, int full)
 	 && !full)
 	|| !COMPLETE_TYPE_P (type)
 	/* No way in DBX fmt to describe a variable size.  */
-	|| ! host_integerp (TYPE_SIZE (type), 1))
+	|| ! tree_fits_uhwi_p (TYPE_SIZE (type)))
       {
 	typevec[TYPE_SYMTAB_ADDRESS (type)].status = TYPE_XREF;
 	return;
@@ -2149,7 +2149,7 @@  dbxout_type (tree type, int full)
 	     && !full)
 	    || !COMPLETE_TYPE_P (type)
 	    /* No way in DBX fmt to describe a variable size.  */
-	    || ! host_integerp (TYPE_SIZE (type), 1))
+	    || ! tree_fits_uhwi_p (TYPE_SIZE (type)))
 	  {
 	    /* If the type is just a cross reference, output one
 	       and mark the type as partially described.
@@ -2213,10 +2213,10 @@  dbxout_type (tree type, int full)
 		     	 offset within the vtable where we must look
 		     	 to find the necessary adjustment.  */
 		      stabstr_D
-			(tree_low_cst (BINFO_VPTR_FIELD (child), 0)
+			(tree_to_shwi (BINFO_VPTR_FIELD (child))
 			 * BITS_PER_UNIT);
 		    else
-		      stabstr_D (tree_low_cst (BINFO_OFFSET (child), 0)
+		      stabstr_D (tree_to_shwi (BINFO_OFFSET (child))
 				       * BITS_PER_UNIT);
 		    stabstr_C (',');
 		    dbxout_type (BINFO_TYPE (child), 0);
@@ -2231,11 +2231,11 @@  dbxout_type (tree type, int full)
 		    stabstr_C (':');
 		    dbxout_type (BINFO_TYPE (child), full);
 		    stabstr_C (',');
-		    stabstr_D (tree_low_cst (BINFO_OFFSET (child), 0)
+		    stabstr_D (tree_to_shwi (BINFO_OFFSET (child))
 				     * BITS_PER_UNIT);
 		    stabstr_C (',');
 		    stabstr_D
-		      (tree_low_cst (TYPE_SIZE (BINFO_TYPE (child)), 0)
+		      (tree_to_shwi (TYPE_SIZE (BINFO_TYPE (child)))
 		       * BITS_PER_UNIT);
 		    stabstr_C (';');
 		  }
@@ -2302,11 +2302,8 @@  dbxout_type (tree type, int full)
           if (TREE_CODE (value) == CONST_DECL)
             value = DECL_INITIAL (value);
 
-	  if (TREE_INT_CST_HIGH (value) == 0)
-	    stabstr_D (TREE_INT_CST_LOW (value));
-	  else if (TREE_INT_CST_HIGH (value) == -1
-		   && (HOST_WIDE_INT) TREE_INT_CST_LOW (value) < 0)
-	    stabstr_D (TREE_INT_CST_LOW (value));
+	  if (tree_fits_shwi_p (value))
+	    stabstr_D (tree_to_hwi (value));
 	  else
 	    stabstr_O (value);
 
@@ -2519,9 +2516,9 @@  dbxout_expand_expr (tree expr)
 	  return NULL;
 	if (offset != NULL)
 	  {
-	    if (!host_integerp (offset, 0))
+	    if (!tree_fits_shwi_p (offset))
 	      return NULL;
-	    x = adjust_address_nv (x, mode, tree_low_cst (offset, 0));
+	    x = adjust_address_nv (x, mode, tree_to_shwi (offset));
 	  }
 	if (bitpos != 0)
 	  x = adjust_address_nv (x, mode, bitpos / BITS_PER_UNIT);
@@ -2799,7 +2796,7 @@  dbxout_symbol (tree decl, int local ATTRIBUTE_UNUSED)
 		/* Do not generate a tag for records of variable size,
 		   since this type can not be properly described in the
 		   DBX format, and it confuses some tools such as objdump.  */
-		&& host_integerp (TYPE_SIZE (type), 1))
+		&& tree_fits_uhwi_p (TYPE_SIZE (type)))
 	      {
 		tree name = TYPE_NAME (type);
 		if (TREE_CODE (name) == TYPE_DECL)
@@ -2915,7 +2912,7 @@  dbxout_symbol (tree decl, int local ATTRIBUTE_UNUSED)
 	 ??? Why do we skip emitting the type and location in this case?  */
       if (TREE_STATIC (decl) && TREE_READONLY (decl)
 	  && DECL_INITIAL (decl) != 0
-	  && host_integerp (DECL_INITIAL (decl), 0)
+	  && tree_fits_shwi_p (DECL_INITIAL (decl))
 	  && ! TREE_ASM_WRITTEN (decl)
 	  && (DECL_FILE_SCOPE_P (decl)
 	      || TREE_CODE (DECL_CONTEXT (decl)) == BLOCK
@@ -2927,7 +2924,7 @@  dbxout_symbol (tree decl, int local ATTRIBUTE_UNUSED)
 	  if (TREE_CODE (TREE_TYPE (decl)) == INTEGER_TYPE
 	      || TREE_CODE (TREE_TYPE (decl)) == ENUMERAL_TYPE)
 	    {
-	      HOST_WIDE_INT ival = TREE_INT_CST_LOW (DECL_INITIAL (decl));
+	      HOST_WIDE_INT ival = tree_to_hwi (DECL_INITIAL (decl));
 
 	      dbxout_begin_complex_stabs ();
 	      dbxout_symbol_name (decl, NULL, 'c');
diff --git a/gcc/dojump.c b/gcc/dojump.c
index a8c75ae..c5d1e67 100644
--- a/gcc/dojump.c
+++ b/gcc/dojump.c
@@ -509,10 +509,10 @@  do_jump (tree exp, rtx if_false_label, rtx if_true_label, int prob)
 		  && compare_tree_int (shift, 0) >= 0
 		  && compare_tree_int (shift, HOST_BITS_PER_WIDE_INT) < 0
 		  && prefer_and_bit_test (TYPE_MODE (argtype),
-					  TREE_INT_CST_LOW (shift)))
+					  tree_to_uhwi (shift)))
 		{
 		  unsigned HOST_WIDE_INT mask
-		    = (unsigned HOST_WIDE_INT) 1 << TREE_INT_CST_LOW (shift);
+		    = (unsigned HOST_WIDE_INT) 1 << tree_to_hwi (shift);
 		  do_jump (build2 (BIT_AND_EXPR, argtype, arg,
 				   build_int_cstu (argtype, mask)),
 			   clr_label, set_label, setclr_prob);
diff --git a/gcc/dwarf2out.c b/gcc/dwarf2out.c
index bcf01e7..88aa976 100644
--- a/gcc/dwarf2out.c
+++ b/gcc/dwarf2out.c
@@ -9443,8 +9443,8 @@  simple_type_size_in_bits (const_tree type)
     return BITS_PER_WORD;
   else if (TYPE_SIZE (type) == NULL_TREE)
     return 0;
-  else if (host_integerp (TYPE_SIZE (type), 1))
-    return tree_low_cst (TYPE_SIZE (type), 1);
+  else if (tree_fits_uhwi_p (TYPE_SIZE (type)))
+    return tree_to_hwi (TYPE_SIZE (type));
   else
     return TYPE_ALIGN (type);
 }
@@ -12730,10 +12730,10 @@  dw_sra_loc_expr (tree decl, rtx loc)
   enum var_init_status initialized;
 
   if (DECL_SIZE (decl) == NULL
-      || !host_integerp (DECL_SIZE (decl), 1))
+      || !tree_fits_uhwi_p (DECL_SIZE (decl)))
     return NULL;
 
-  decl_size = tree_low_cst (DECL_SIZE (decl), 1);
+  decl_size = tree_to_hwi (DECL_SIZE (decl));
   descr = NULL;
   descr_tail = &descr;
 
@@ -13433,17 +13433,17 @@  loc_list_from_tree (tree loc, int want_address)
       }
 
     case INTEGER_CST:
-      if ((want_address || !host_integerp (loc, 0))
+      if ((want_address || !tree_fits_shwi_p (loc))
 	  && (ret = cst_pool_loc_descr (loc)))
 	have_address = 1;
       else if (want_address == 2
-	       && host_integerp (loc, 0)
+	       && tree_fits_shwi_p (loc)
 	       && (ret = address_of_int_loc_descriptor
 	       		   (int_size_in_bytes (TREE_TYPE (loc)),
-	       		    tree_low_cst (loc, 0))))
+	       		    tree_to_hwi (loc))))
 	have_address = 1;
-      else if (host_integerp (loc, 0))
-	ret = int_loc_descriptor (tree_low_cst (loc, 0));
+      else if (tree_fits_shwi_p (loc))
+	ret = int_loc_descriptor (tree_to_hwi (loc));
       else
 	{
 	  expansion_failed (loc, NULL_RTX,
@@ -13532,13 +13532,13 @@  loc_list_from_tree (tree loc, int want_address)
 
     case POINTER_PLUS_EXPR:
     case PLUS_EXPR:
-      if (host_integerp (TREE_OPERAND (loc, 1), 0))
+      if (tree_fits_shwi_p (TREE_OPERAND (loc, 1)))
 	{
 	  list_ret = loc_list_from_tree (TREE_OPERAND (loc, 0), 0);
 	  if (list_ret == 0)
 	    return 0;
 
-	  loc_list_plus_const (list_ret, tree_low_cst (TREE_OPERAND (loc, 1), 0));
+	  loc_list_plus_const (list_ret, tree_to_hwi (TREE_OPERAND (loc, 1)));
 	  break;
 	}
 
@@ -14042,7 +14042,7 @@  add_data_member_location_attribute (dw_die_ref die, tree decl)
 	  add_loc_descr (&loc_descr, tmp);
 
 	  /* Calculate the address of the offset.  */
-	  offset = tree_low_cst (BINFO_VPTR_FIELD (decl), 0);
+	  offset = tree_to_shwi (BINFO_VPTR_FIELD (decl));
 	  gcc_assert (offset < 0);
 
 	  tmp = int_loc_descriptor (-offset);
@@ -14059,7 +14059,7 @@  add_data_member_location_attribute (dw_die_ref die, tree decl)
 	  add_loc_descr (&loc_descr, tmp);
 	}
       else
-	offset = tree_low_cst (BINFO_OFFSET (decl), 0);
+	offset = tree_to_shwi (BINFO_OFFSET (decl));
     }
   else
     offset = field_byte_offset (decl);
@@ -14719,9 +14719,9 @@  fortran_common (tree decl, HOST_WIDE_INT *value)
   *value = 0;
   if (offset != NULL)
     {
-      if (!host_integerp (offset, 0))
+      if (!tree_fits_shwi_p (offset))
 	return NULL_TREE;
-      *value = tree_low_cst (offset, 0);
+      *value = tree_to_hwi (offset);
     }
   if (bitpos != 0)
     *value += bitpos / BITS_PER_UNIT;
@@ -14887,14 +14887,14 @@  native_encode_initializer (tree init, unsigned char *array, int size)
 	  constructor_elt *ce;
 
 	  if (TYPE_DOMAIN (type) == NULL_TREE
-	      || !host_integerp (TYPE_MIN_VALUE (TYPE_DOMAIN (type)), 0))
+	      || !tree_fits_shwi_p (TYPE_MIN_VALUE (TYPE_DOMAIN (type))))
 	    return false;
 
 	  fieldsize = int_size_in_bytes (TREE_TYPE (type));
 	  if (fieldsize <= 0)
 	    return false;
 
-	  min_index = tree_low_cst (TYPE_MIN_VALUE (TYPE_DOMAIN (type)), 0);
+	  min_index = tree_to_shwi (TYPE_MIN_VALUE (TYPE_DOMAIN (type)));
 	  memset (array, '\0', size);
 	  FOR_EACH_VEC_ELT (constructor_elt, CONSTRUCTOR_ELTS (init), cnt, ce)
 	    {
@@ -14902,10 +14902,10 @@  native_encode_initializer (tree init, unsigned char *array, int size)
 	      tree index = ce->index;
 	      int pos = curpos;
 	      if (index && TREE_CODE (index) == RANGE_EXPR)
-		pos = (tree_low_cst (TREE_OPERAND (index, 0), 0) - min_index)
+		pos = (tree_to_shwi (TREE_OPERAND (index, 0)) - min_index)
 		      * fieldsize;
 	      else if (index)
-		pos = (tree_low_cst (index, 0) - min_index) * fieldsize;
+		pos = (tree_to_shwi (index) - min_index) * fieldsize;
 
 	      if (val)
 		{
@@ -14916,8 +14916,8 @@  native_encode_initializer (tree init, unsigned char *array, int size)
 	      curpos = pos + fieldsize;
 	      if (index && TREE_CODE (index) == RANGE_EXPR)
 		{
-		  int count = tree_low_cst (TREE_OPERAND (index, 1), 0)
-			      - tree_low_cst (TREE_OPERAND (index, 0), 0);
+		  int count = tree_to_shwi (TREE_OPERAND (index, 1))
+			      - tree_to_shwi (TREE_OPERAND (index, 0));
 		  while (count-- > 0)
 		    {
 		      if (val)
@@ -14961,9 +14961,9 @@  native_encode_initializer (tree init, unsigned char *array, int size)
 		  && ! TYPE_MAX_VALUE (TYPE_DOMAIN (TREE_TYPE (field))))
 		return false;
 	      else if (DECL_SIZE_UNIT (field) == NULL_TREE
-		       || !host_integerp (DECL_SIZE_UNIT (field), 0))
+		       || !tree_fits_shwi_p (DECL_SIZE_UNIT (field)))
 		return false;
-	      fieldsize = tree_low_cst (DECL_SIZE_UNIT (field), 0);
+	      fieldsize = tree_to_shwi (DECL_SIZE_UNIT (field));
 	      pos = int_byte_position (field);
 	      gcc_assert (pos + fieldsize <= size);
 	      if (val
@@ -15334,9 +15334,9 @@  add_bound_info (dw_die_ref subrange_die, enum dwarf_attribute bound_attr, tree b
 
 	/* Use the default if possible.  */
 	if (bound_attr == DW_AT_lower_bound
-	    && host_integerp (bound, 0)
+	    && tree_fits_shwi_p (bound)
 	    && (dflt = lower_bound_default ()) != -1
-	    && tree_low_cst (bound, 0) == dflt)
+	    && tree_to_hwi (bound) == dflt)
 	  ;
 
 	/* Otherwise represent the bound as an unsigned value with the
@@ -15347,12 +15347,12 @@  add_bound_info (dw_die_ref subrange_die, enum dwarf_attribute bound_attr, tree b
 	    unsigned HOST_WIDE_INT mask
 	      = ((unsigned HOST_WIDE_INT) 1 << prec) - 1;
 	    add_AT_unsigned (subrange_die, bound_attr,
-		  	     TREE_INT_CST_LOW (bound) & mask);
+		  	     tree_to_uhwi (bound) & mask);
 	  }
 	else if (prec == HOST_BITS_PER_WIDE_INT
-		 || TREE_INT_CST_HIGH (bound) == 0)
+		 || tree_fits_uhwi_p (bound))
 	  add_AT_unsigned (subrange_die, bound_attr,
-		  	   TREE_INT_CST_LOW (bound));
+		  	   tree_to_hwi (bound));
 	else
 	  add_AT_double (subrange_die, bound_attr, TREE_INT_CST_HIGH (bound),
 		         TREE_INT_CST_LOW (bound));
@@ -15565,8 +15565,8 @@  add_bit_offset_attribute (dw_die_ref die, tree decl)
   /* We can't yet handle bit-fields whose offsets are variable, so if we
      encounter such things, just return without generating any attribute
      whatsoever.  Likewise for variable or too large size.  */
-  if (! host_integerp (bit_position (decl), 0)
-      || ! host_integerp (DECL_SIZE (decl), 1))
+  if (! tree_fits_shwi_p (bit_position (decl))
+      || ! tree_fits_uhwi_p (DECL_SIZE (decl)))
     return;
 
   bitpos_int = int_bit_position (decl);
@@ -15581,7 +15581,7 @@  add_bit_offset_attribute (dw_die_ref die, tree decl)
 
   if (! BYTES_BIG_ENDIAN)
     {
-      highest_order_field_bit_offset += tree_low_cst (DECL_SIZE (decl), 0);
+      highest_order_field_bit_offset += tree_to_shwi (DECL_SIZE (decl));
       highest_order_object_bit_offset += simple_type_size_in_bits (type);
     }
 
@@ -15606,8 +15606,8 @@  add_bit_size_attribute (dw_die_ref die, tree decl)
   gcc_assert (TREE_CODE (decl) == FIELD_DECL
 	      && DECL_BIT_FIELD_TYPE (decl));
 
-  if (host_integerp (DECL_SIZE (decl), 1))
-    add_AT_unsigned (die, DW_AT_bit_size, tree_low_cst (DECL_SIZE (decl), 1));
+  if (tree_fits_uhwi_p (DECL_SIZE (decl)))
+    add_AT_unsigned (die, DW_AT_bit_size, tree_to_hwi (DECL_SIZE (decl)));
 }
 
 /* If the compiled language is ANSI C, then add a 'prototyped'
@@ -15676,10 +15676,10 @@  add_pure_or_virtual_attribute (dw_die_ref die, tree func_decl)
     {
       add_AT_unsigned (die, DW_AT_virtuality, DW_VIRTUALITY_virtual);
 
-      if (host_integerp (DECL_VINDEX (func_decl), 0))
+      if (tree_fits_shwi_p (DECL_VINDEX (func_decl)))
 	add_AT_loc (die, DW_AT_vtable_elem_location,
 		    new_loc_descr (DW_OP_constu,
-				   tree_low_cst (DECL_VINDEX (func_decl), 0),
+				   tree_to_hwi (DECL_VINDEX (func_decl)),
 				   0));
 
       /* GNU extension: Record what type this method came from originally.  */
@@ -16226,8 +16226,8 @@  descr_info_loc (tree val, tree base_decl)
     case VAR_DECL:
       return loc_descriptor_from_tree (val, 0);
     case INTEGER_CST:
-      if (host_integerp (val, 0))
-	return int_loc_descriptor (tree_low_cst (val, 0));
+      if (tree_fits_shwi_p (val))
+	return int_loc_descriptor (tree_to_shwi (val));
       break;
     case INDIRECT_REF:
       size = int_size_in_bytes (TREE_TYPE (val));
@@ -16243,14 +16243,18 @@  descr_info_loc (tree val, tree base_decl)
       return loc;
     case POINTER_PLUS_EXPR:
     case PLUS_EXPR:
-      if (host_integerp (TREE_OPERAND (val, 1), 1)
-	  && (unsigned HOST_WIDE_INT) tree_low_cst (TREE_OPERAND (val, 1), 1)
+      if (tree_fits_uhwi_p (TREE_OPERAND (val, 1))
+	  && (unsigned HOST_WIDE_INT) tree_to_hwi (TREE_OPERAND (val, 1))
 	     < 16384)
 	{
 	  loc = descr_info_loc (TREE_OPERAND (val, 0), base_decl);
 	  if (!loc)
 	    break;
-	  loc_descr_plus_const (&loc, tree_low_cst (TREE_OPERAND (val, 1), 0));
+
+	  /* THIS CODE LOOKS INCORRECT, THE ABOVE PREDICATE CHECKED IF
+	     THE VALUE FIT IN AN UNSIGNED HWI, BUT NOW WE ARE USING A
+	     SIGNED HWI.  */
+	  loc_descr_plus_const (&loc, tree_to_shwi (TREE_OPERAND (val, 1)));
 	}
       else
 	{
@@ -16290,9 +16294,9 @@  add_descr_info_field (dw_die_ref die, enum dwarf_attribute attr,
 {
   dw_loc_descr_ref loc;
 
-  if (host_integerp (val, 0))
+  if (tree_fits_shwi_p (val))
     {
-      add_AT_unsigned (die, attr, tree_low_cst (val, 0));
+      add_AT_unsigned (die, attr, tree_to_shwi (val));
       return;
     }
 
@@ -16343,9 +16347,9 @@  gen_descr_array_type_die (tree type, struct array_descr_info *info,
 	  /* If it is the default value, omit it.  */
 	  int dflt;
 
-	  if (host_integerp (info->dimen[dim].lower_bound, 0)
+	  if (tree_fits_shwi_p (info->dimen[dim].lower_bound)
 	      && (dflt = lower_bound_default ()) != -1
-	      && tree_low_cst (info->dimen[dim].lower_bound, 0) == dflt)
+	      && tree_to_hwi (info->dimen[dim].lower_bound) == dflt)
 	    ;
 	  else
 	    add_descr_info_field (subrange_die, DW_AT_lower_bound,
@@ -16493,15 +16497,14 @@  gen_enumeration_type_die (tree type, dw_die_ref context_die)
 	  if (TREE_CODE (value) == CONST_DECL)
 	    value = DECL_INITIAL (value);
 
-	  if (host_integerp (value, TYPE_UNSIGNED (TREE_TYPE (value))))
+	  if (tree_fits_hwi_p (value, TYPE_UNSIGNED (TREE_TYPE (value))))
 	    /* DWARF2 does not provide a way of indicating whether or
 	       not enumeration constants are signed or unsigned.  GDB
 	       always assumes the values are signed, so we output all
 	       values as if they were signed.  That means that
 	       enumeration constants with very large unsigned values
 	       will appear to have negative values in the debugger.  */
-	    add_AT_int (enum_die, DW_AT_const_value,
-			tree_low_cst (value, tree_int_cst_sgn (value) > 0));
+	    add_AT_int (enum_die, DW_AT_const_value, tree_to_hwi (value));
 	}
 
       add_gnat_descriptive_type_attribute (type_die, type, context_die);
diff --git a/gcc/emit-rtl.c b/gcc/emit-rtl.c
index 5b809d9..38f79bf 100644
--- a/gcc/emit-rtl.c
+++ b/gcc/emit-rtl.c
@@ -1612,12 +1612,12 @@  get_mem_align_offset (rtx mem, unsigned int align)
 	  tree bit_offset = DECL_FIELD_BIT_OFFSET (field);
 
 	  if (!byte_offset
-	      || !host_integerp (byte_offset, 1)
-	      || !host_integerp (bit_offset, 1))
+	      || !tree_fits_uhwi_p (byte_offset)
+	      || !tree_fits_uhwi_p (bit_offset))
 	    return -1;
 
-	  offset += tree_low_cst (byte_offset, 1);
-	  offset += tree_low_cst (bit_offset, 1) / BITS_PER_UNIT;
+	  offset += tree_to_hwi (byte_offset);
+	  offset += tree_to_hwi (bit_offset) / BITS_PER_UNIT;
 
 	  if (inner == NULL_TREE)
 	    {
@@ -1742,10 +1742,10 @@  set_mem_attributes_minus_bitpos (rtx ref, tree t, int objectp,
 						attrs.align);
 #endif
 	    }
-	  if (TREE_INT_CST_LOW (TREE_OPERAND (t, 1)) != 0)
+	  if (tree_to_hwi (TREE_OPERAND (t, 1)) != 0)
 	    {
 	      unsigned HOST_WIDE_INT ioff
-		= TREE_INT_CST_LOW (TREE_OPERAND (t, 1));
+		= tree_to_hwi (TREE_OPERAND (t, 1));
 	      unsigned HOST_WIDE_INT aoff = (ioff & -ioff) * BITS_PER_UNIT;
 	      attrs.align = MIN (aoff, attrs.align);
 	    }
@@ -1762,10 +1762,10 @@  set_mem_attributes_minus_bitpos (rtx ref, tree t, int objectp,
     attrs.align = MAX (attrs.align, TYPE_ALIGN (type));
 
   /* If the size is known, we can set that.  */
-  if (TYPE_SIZE_UNIT (type) && host_integerp (TYPE_SIZE_UNIT (type), 1))
+  if (TYPE_SIZE_UNIT (type) && tree_fits_uhwi_p (TYPE_SIZE_UNIT (type)))
     {
       attrs.size_known_p = true;
-      attrs.size = tree_low_cst (TYPE_SIZE_UNIT (type), 1);
+      attrs.size = tree_to_hwi (TYPE_SIZE_UNIT (type));
     }
 
   /* If T is not a type, we may be able to deduce some more information about
@@ -1825,10 +1825,10 @@  set_mem_attributes_minus_bitpos (rtx ref, tree t, int objectp,
 	  attrs.offset_known_p = true;
 	  attrs.offset = 0;
 	  apply_bitpos = bitpos;
-	  if (DECL_SIZE_UNIT (t) && host_integerp (DECL_SIZE_UNIT (t), 1))
+	  if (DECL_SIZE_UNIT (t) && tree_fits_uhwi_p (DECL_SIZE_UNIT (t)))
 	    {
 	      attrs.size_known_p = true;
-	      attrs.size = tree_low_cst (DECL_SIZE_UNIT (t), 1);
+	      attrs.size = tree_to_hwi (DECL_SIZE_UNIT (t));
 	    }
 	  else
 	    attrs.size_known_p = false;
@@ -1897,9 +1897,9 @@  set_mem_attributes_minus_bitpos (rtx ref, tree t, int objectp,
 	    {
 	      attrs.expr = t2;
 	      attrs.offset_known_p = false;
-	      if (host_integerp (off_tree, 1))
+	      if (tree_fits_uhwi_p (off_tree))
 		{
-		  HOST_WIDE_INT ioff = tree_low_cst (off_tree, 1);
+		  HOST_WIDE_INT ioff = tree_to_hwi (off_tree);
 		  HOST_WIDE_INT aoff = (ioff & -ioff) * BITS_PER_UNIT;
 		  attrs.align = DECL_ALIGN (t2);
 		  if (aoff && (unsigned HOST_WIDE_INT) aoff < attrs.align)
@@ -1914,10 +1914,10 @@  set_mem_attributes_minus_bitpos (rtx ref, tree t, int objectp,
 	    {
 	      attrs.expr = t2;
 	      attrs.offset_known_p = false;
-	      if (host_integerp (off_tree, 1))
+	      if (tree_fits_uhwi_p (off_tree))
 		{
 		  attrs.offset_known_p = true;
-		  attrs.offset = tree_low_cst (off_tree, 1);
+		  attrs.offset = tree_to_hwi (off_tree);
 		  apply_bitpos = bitpos;
 		}
 	      /* ??? Any reason the field size would be different than
@@ -2385,15 +2385,15 @@  widen_memory_access (rtx memref, enum machine_mode mode, HOST_WIDE_INT offset)
 	      && attrs.offset >= 0)
 	    break;
 
-	  if (! host_integerp (offset, 1))
+	  if (! tree_fits_uhwi_p (offset))
 	    {
 	      attrs.expr = NULL_TREE;
 	      break;
 	    }
 
 	  attrs.expr = TREE_OPERAND (attrs.expr, 0);
-	  attrs.offset += tree_low_cst (offset, 1);
-	  attrs.offset += (tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 1)
+	  attrs.offset += tree_to_uhwi (offset);
+	  attrs.offset += (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
 			   / BITS_PER_UNIT);
 	}
       /* Similarly for the decl.  */
diff --git a/gcc/except.c b/gcc/except.c
index 88cac85..cde20f0 100644
--- a/gcc/except.c
+++ b/gcc/except.c
@@ -281,20 +281,20 @@  init_eh (void)
       /* Cache the interesting field offsets so that we have
 	 easy access from rtl.  */
       sjlj_fc_call_site_ofs
-	= (tree_low_cst (DECL_FIELD_OFFSET (f_cs), 1)
-	   + tree_low_cst (DECL_FIELD_BIT_OFFSET (f_cs), 1) / BITS_PER_UNIT);
+	= (tree_to_uhwi (DECL_FIELD_OFFSET (f_cs))
+	   + tree_to_uhwi (DECL_FIELD_BIT_OFFSET (f_cs)) / BITS_PER_UNIT);
       sjlj_fc_data_ofs
-	= (tree_low_cst (DECL_FIELD_OFFSET (f_data), 1)
-	   + tree_low_cst (DECL_FIELD_BIT_OFFSET (f_data), 1) / BITS_PER_UNIT);
+	= (tree_to_uhwi (DECL_FIELD_OFFSET (f_data))
+	   + tree_to_uhwi (DECL_FIELD_BIT_OFFSET (f_data)) / BITS_PER_UNIT);
       sjlj_fc_personality_ofs
-	= (tree_low_cst (DECL_FIELD_OFFSET (f_per), 1)
-	   + tree_low_cst (DECL_FIELD_BIT_OFFSET (f_per), 1) / BITS_PER_UNIT);
+	= (tree_to_uhwi (DECL_FIELD_OFFSET (f_per))
+	   + tree_to_uhwi (DECL_FIELD_BIT_OFFSET (f_per)) / BITS_PER_UNIT);
       sjlj_fc_lsda_ofs
-	= (tree_low_cst (DECL_FIELD_OFFSET (f_lsda), 1)
-	   + tree_low_cst (DECL_FIELD_BIT_OFFSET (f_lsda), 1) / BITS_PER_UNIT);
+	= (tree_to_uhwi (DECL_FIELD_OFFSET (f_lsda))
+	   + tree_to_uhwi (DECL_FIELD_BIT_OFFSET (f_lsda)) / BITS_PER_UNIT);
       sjlj_fc_jbuf_ofs
-	= (tree_low_cst (DECL_FIELD_OFFSET (f_jbuf), 1)
-	   + tree_low_cst (DECL_FIELD_BIT_OFFSET (f_jbuf), 1) / BITS_PER_UNIT);
+	= (tree_to_uhwi (DECL_FIELD_OFFSET (f_jbuf))
+	   + tree_to_uhwi (DECL_FIELD_BIT_OFFSET (f_jbuf)) / BITS_PER_UNIT);
     }
 }
 
@@ -1952,8 +1952,8 @@  expand_builtin_eh_common (tree region_nr_t)
   HOST_WIDE_INT region_nr;
   eh_region region;
 
-  gcc_assert (host_integerp (region_nr_t, 0));
-  region_nr = tree_low_cst (region_nr_t, 0);
+  gcc_assert (tree_fits_shwi_p (region_nr_t));
+  region_nr = tree_to_hwi (region_nr_t);
 
   region = VEC_index (eh_region, cfun->eh->region_array, region_nr);
 
@@ -2047,7 +2047,7 @@  expand_builtin_eh_return_data_regno (tree exp)
       return constm1_rtx;
     }
 
-  iwhich = tree_low_cst (which, 1);
+  iwhich = tree_to_uhwi (which);
   iwhich = EH_RETURN_DATA_REGNO (iwhich);
   if (iwhich == INVALID_REGNUM)
     return constm1_rtx;
@@ -2316,7 +2316,7 @@  collect_one_action_chain (htab_t ar_hash, eh_region region)
 	      {
 		/* Retrieve the filter from the head of the filter list
 		   where we have stored it (see assign_filter_values).  */
-		int filter = TREE_INT_CST_LOW (TREE_VALUE (c->filter_list));
+		int filter = tree_to_shwi (TREE_VALUE (c->filter_list));
 		next = add_action_record (ar_hash, filter, 0);
 	      }
 	    else
@@ -2343,7 +2343,7 @@  collect_one_action_chain (htab_t ar_hash, eh_region region)
 		flt_node = c->filter_list;
 		for (; flt_node; flt_node = TREE_CHAIN (flt_node))
 		  {
-		    int filter = TREE_INT_CST_LOW (TREE_VALUE (flt_node));
+		    int filter = tree_to_shwi (TREE_VALUE (flt_node));
 		    next = add_action_record (ar_hash, filter, next);
 		  }
 	      }
diff --git a/gcc/explow.c b/gcc/explow.c
index 86f71cd..101f5ae 100644
--- a/gcc/explow.c
+++ b/gcc/explow.c
@@ -242,10 +242,10 @@  int_expr_size (tree exp)
       gcc_assert (size);
     }
 
-  if (size == 0 || !host_integerp (size, 0))
+  if (size == 0 || !tree_fits_shwi_p (size))
     return -1;
 
-  return tree_low_cst (size, 0);
+  return tree_to_hwi (size);
 }
 
 /* Return a copy of X in which all memory references
diff --git a/gcc/expr.c b/gcc/expr.c
index 6677cb0..c2591c2 100644
--- a/gcc/expr.c
+++ b/gcc/expr.c
@@ -4520,14 +4520,14 @@  get_bit_range (unsigned HOST_WIDE_INT *bitstart,
      relative to the representative.  DECL_FIELD_OFFSET of field and
      repr are the same by construction if they are not constants,
      see finish_bitfield_layout.  */
-  if (host_integerp (DECL_FIELD_OFFSET (field), 1)
-      && host_integerp (DECL_FIELD_OFFSET (repr), 1))
-    bitoffset = (tree_low_cst (DECL_FIELD_OFFSET (field), 1)
-		 - tree_low_cst (DECL_FIELD_OFFSET (repr), 1)) * BITS_PER_UNIT;
+  if (tree_fits_uhwi_p (DECL_FIELD_OFFSET (field))
+      && tree_fits_uhwi_p (DECL_FIELD_OFFSET (repr)))
+    bitoffset = (tree_to_hwi (DECL_FIELD_OFFSET (field))
+		 - tree_to_hwi (DECL_FIELD_OFFSET (repr))) * BITS_PER_UNIT;
   else
     bitoffset = 0;
-  bitoffset += (tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 1)
-		- tree_low_cst (DECL_FIELD_BIT_OFFSET (repr), 1));
+  bitoffset += (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
+		- tree_to_uhwi (DECL_FIELD_BIT_OFFSET (repr)));
 
   /* If the adjustment is larger than bitpos, we would have a negative bit
      position for the lower bound and this may wreak havoc later.  This can
@@ -4548,7 +4548,7 @@  get_bit_range (unsigned HOST_WIDE_INT *bitstart,
   else
     *bitstart = *bitpos - bitoffset;
 
-  *bitend = *bitstart + tree_low_cst (DECL_SIZE (repr), 1) - 1;
+  *bitend = *bitstart + tree_to_uhwi (DECL_SIZE (repr)) - 1;
 }
 
 /* Returns true if the MEM_REF REF refers to an object that does not
@@ -5382,11 +5382,11 @@  count_type_elements (const_tree type, bool for_ctor_p)
 	tree nelts;
 
 	nelts = array_type_nelts (type);
-	if (nelts && host_integerp (nelts, 1))
+	if (nelts && tree_fits_uhwi_p (nelts))
 	  {
 	    unsigned HOST_WIDE_INT n;
 
-	    n = tree_low_cst (nelts, 1) + 1;
+	    n = tree_to_uhwi (nelts) + 1;
 	    if (n == 0 || for_ctor_p)
 	      return n;
 	    else
@@ -5501,9 +5501,9 @@  categorize_ctor_elements_1 (const_tree ctor, HOST_WIDE_INT *p_nz_elts,
 	  tree lo_index = TREE_OPERAND (purpose, 0);
 	  tree hi_index = TREE_OPERAND (purpose, 1);
 
-	  if (host_integerp (lo_index, 1) && host_integerp (hi_index, 1))
-	    mult = (tree_low_cst (hi_index, 1)
-		    - tree_low_cst (lo_index, 1) + 1);
+	  if (tree_fits_uhwi_p (lo_index) && tree_fits_uhwi_p (hi_index))
+	    mult = (tree_to_hwi (hi_index)
+		    - tree_to_hwi (lo_index) + 1);
 	}
       num_fields += mult;
       elt_type = TREE_TYPE (value);
@@ -5806,8 +5806,8 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 	    if (cleared && initializer_zerop (value))
 	      continue;
 
-	    if (host_integerp (DECL_SIZE (field), 1))
-	      bitsize = tree_low_cst (DECL_SIZE (field), 1);
+	    if (tree_fits_uhwi_p (DECL_SIZE (field)))
+	      bitsize = tree_to_hwi (DECL_SIZE (field));
 	    else
 	      bitsize = -1;
 
@@ -5816,14 +5816,14 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 	      mode = VOIDmode;
 
 	    offset = DECL_FIELD_OFFSET (field);
-	    if (host_integerp (offset, 0)
-		&& host_integerp (bit_position (field), 0))
+	    if (tree_fits_shwi_p (offset)
+		&& tree_fits_shwi_p (bit_position (field)))
 	      {
 		bitpos = int_bit_position (field);
 		offset = 0;
 	      }
 	    else
-	      bitpos = tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 0);
+	      bitpos = tree_to_shwi (DECL_FIELD_BIT_OFFSET (field));
 
 	    if (offset)
 	      {
@@ -5906,14 +5906,14 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 	domain = TYPE_DOMAIN (type);
 	const_bounds_p = (TYPE_MIN_VALUE (domain)
 			  && TYPE_MAX_VALUE (domain)
-			  && host_integerp (TYPE_MIN_VALUE (domain), 0)
-			  && host_integerp (TYPE_MAX_VALUE (domain), 0));
+			  && tree_fits_shwi_p (TYPE_MIN_VALUE (domain))
+			  && tree_fits_shwi_p (TYPE_MAX_VALUE (domain)));
 
 	/* If we have constant bounds for the range of the type, get them.  */
 	if (const_bounds_p)
 	  {
-	    minelt = tree_low_cst (TYPE_MIN_VALUE (domain), 0);
-	    maxelt = tree_low_cst (TYPE_MAX_VALUE (domain), 0);
+	    minelt = tree_to_hwi (TYPE_MIN_VALUE (domain));
+	    maxelt = tree_to_hwi (TYPE_MAX_VALUE (domain));
 	  }
 
 	/* If the constructor has fewer elements than the array, clear
@@ -5945,15 +5945,15 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 		    tree lo_index = TREE_OPERAND (index, 0);
 		    tree hi_index = TREE_OPERAND (index, 1);
 
-		    if (! host_integerp (lo_index, 1)
-			|| ! host_integerp (hi_index, 1))
+		    if (! tree_fits_uhwi_p (lo_index)
+			|| ! tree_fits_uhwi_p (hi_index))
 		      {
 			need_to_clear = 1;
 			break;
 		      }
 
-		    this_node_count = (tree_low_cst (hi_index, 1)
-				       - tree_low_cst (lo_index, 1) + 1);
+		    this_node_count = (tree_to_hwi (hi_index)
+				       - tree_to_hwi (lo_index) + 1);
 		  }
 		else
 		  this_node_count = 1;
@@ -6000,8 +6000,8 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 
 	    mode = TYPE_MODE (elttype);
 	    if (mode == BLKmode)
-	      bitsize = (host_integerp (TYPE_SIZE (elttype), 1)
-			 ? tree_low_cst (TYPE_SIZE (elttype), 1)
+	      bitsize = (tree_fits_uhwi_p (TYPE_SIZE (elttype))
+			 ? tree_to_hwi (TYPE_SIZE (elttype))
 			 : -1);
 	    else
 	      bitsize = GET_MODE_BITSIZE (mode);
@@ -6016,21 +6016,21 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 
 		/* If the range is constant and "small", unroll the loop.  */
 		if (const_bounds_p
-		    && host_integerp (lo_index, 0)
-		    && host_integerp (hi_index, 0)
-		    && (lo = tree_low_cst (lo_index, 0),
-			hi = tree_low_cst (hi_index, 0),
+		    && tree_fits_shwi_p (lo_index)
+		    && tree_fits_shwi_p (hi_index)
+		    && (lo = tree_to_hwi (lo_index),
+			hi = tree_to_hwi (hi_index),
 			count = hi - lo + 1,
 			(!MEM_P (target)
 			 || count <= 2
-			 || (host_integerp (TYPE_SIZE (elttype), 1)
-			     && (tree_low_cst (TYPE_SIZE (elttype), 1) * count
+			 || (tree_fits_uhwi_p (TYPE_SIZE (elttype))
+			     && (tree_to_hwi (TYPE_SIZE (elttype)) * count
 				 <= 40 * 8)))))
 		  {
 		    lo -= minelt;  hi -= minelt;
 		    for (; lo <= hi; lo++)
 		      {
-			bitpos = lo * tree_low_cst (TYPE_SIZE (elttype), 0);
+			bitpos = lo * tree_to_shwi (TYPE_SIZE (elttype));
 
 			if (MEM_P (target)
 			    && !MEM_KEEP_ALIAS_SET_P (target)
@@ -6105,8 +6105,8 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 		    emit_label (loop_end);
 		  }
 	      }
-	    else if ((index != 0 && ! host_integerp (index, 0))
-		     || ! host_integerp (TYPE_SIZE (elttype), 1))
+	    else if ((index != 0 && ! tree_fits_shwi_p (index))
+		     || ! tree_fits_uhwi_p (TYPE_SIZE (elttype)))
 	      {
 		tree position;
 
@@ -6133,10 +6133,10 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 	    else
 	      {
 		if (index != 0)
-		  bitpos = ((tree_low_cst (index, 0) - minelt)
-			    * tree_low_cst (TYPE_SIZE (elttype), 1));
+		  bitpos = ((tree_to_shwi (index) - minelt)
+			    * tree_to_uhwi (TYPE_SIZE (elttype)));
 		else
-		  bitpos = (i * tree_low_cst (TYPE_SIZE (elttype), 1));
+		  bitpos = (i * tree_to_uhwi (TYPE_SIZE (elttype)));
 
 		if (MEM_P (target) && !MEM_KEEP_ALIAS_SET_P (target)
 		    && TREE_CODE (type) == ARRAY_TYPE
@@ -6160,7 +6160,7 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 	int need_to_clear;
 	int icode = CODE_FOR_nothing;
 	tree elttype = TREE_TYPE (type);
-	int elt_size = tree_low_cst (TYPE_SIZE (elttype), 1);
+	int elt_size = tree_to_uhwi (TYPE_SIZE (elttype));
 	enum machine_mode eltmode = TYPE_MODE (elttype);
 	HOST_WIDE_INT bitsize;
 	HOST_WIDE_INT bitpos;
@@ -6200,10 +6200,10 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 
 	    FOR_EACH_CONSTRUCTOR_VALUE (CONSTRUCTOR_ELTS (exp), idx, value)
 	      {
-		int n_elts_here = tree_low_cst
+		int n_elts_here = tree_to_uhwi
 		  (int_const_binop (TRUNC_DIV_EXPR,
 				    TYPE_SIZE (TREE_TYPE (value)),
-				    TYPE_SIZE (elttype)), 1);
+				    TYPE_SIZE (elttype)));
 
 		count += n_elts_here;
 		if (mostly_zeros_p (value))
@@ -6242,12 +6242,12 @@  store_constructor (tree exp, rtx target, int cleared, HOST_WIDE_INT size)
 	    HOST_WIDE_INT eltpos;
 	    tree value = ce->value;
 
-	    bitsize = tree_low_cst (TYPE_SIZE (TREE_TYPE (value)), 1);
+	    bitsize = tree_to_uhwi (TYPE_SIZE (TREE_TYPE (value)));
 	    if (cleared && initializer_zerop (value))
 	      continue;
 
 	    if (ce->index)
-	      eltpos = tree_low_cst (ce->index, 1);
+	      eltpos = tree_to_uhwi (ce->index);
 	    else
 	      eltpos = i;
 
@@ -6597,10 +6597,10 @@  get_inner_reference (tree exp, HOST_WIDE_INT *pbitsize,
 
   if (size_tree != 0)
     {
-      if (! host_integerp (size_tree, 1))
+      if (! tree_fits_uhwi_p (size_tree))
 	mode = BLKmode, *pbitsize = -1;
       else
-	*pbitsize = tree_low_cst (size_tree, 1);
+	*pbitsize = tree_to_hwi (size_tree);
     }
 
   /* Compute cumulative bit-offset for nested component-refs and array-refs,
@@ -7300,9 +7300,9 @@  highest_pow2_factor (const_tree exp)
 	return BIGGEST_ALIGNMENT;
       else
 	{
-	  /* Note: tree_low_cst is intentionally not used here,
+	  /* Note: tree_fits_*hwi_p is intentionally not checked here,
 	     we don't care about the upper bits.  */
-	  c0 = TREE_INT_CST_LOW (exp);
+	  c0 = tree_to_hwi (exp);
 	  c0 &= -c0;
 	  return c0 ? c0 : BIGGEST_ALIGNMENT;
 	}
@@ -7321,10 +7321,10 @@  highest_pow2_factor (const_tree exp)
     case ROUND_DIV_EXPR:  case TRUNC_DIV_EXPR:  case FLOOR_DIV_EXPR:
     case CEIL_DIV_EXPR:
       if (integer_pow2p (TREE_OPERAND (exp, 1))
-	  && host_integerp (TREE_OPERAND (exp, 1), 1))
+	  && tree_fits_uhwi_p (TREE_OPERAND (exp, 1)))
 	{
 	  c0 = highest_pow2_factor (TREE_OPERAND (exp, 0));
-	  c1 = tree_low_cst (TREE_OPERAND (exp, 1), 1);
+	  c1 = tree_to_hwi (TREE_OPERAND (exp, 1));
 	  return MAX (1, c0 / c1);
 	}
       break;
@@ -7731,9 +7731,9 @@  expand_constructor (tree exp, rtx target, enum expand_modifier modifier,
        && ((mode == BLKmode
 	    && ! (target != 0 && safe_from_p (target, exp, 1)))
 		  || TREE_ADDRESSABLE (exp)
-		  || (host_integerp (TYPE_SIZE_UNIT (type), 1)
+		  || (tree_fits_uhwi_p (TYPE_SIZE_UNIT (type))
 		      && (! MOVE_BY_PIECES_P
-				     (tree_low_cst (TYPE_SIZE_UNIT (type), 1),
+				     (tree_to_hwi (TYPE_SIZE_UNIT (type)),
 				      TYPE_ALIGN (type)))
 		      && ! mostly_zeros_p (exp))))
       || ((modifier == EXPAND_INITIALIZER || modifier == EXPAND_CONST_ADDRESS)
@@ -8509,7 +8509,7 @@  expand_expr_real_2 (sepops ops, rtx target, enum machine_mode tmode,
 	 indexed address, for machines that support that.  */
 
       if (modifier == EXPAND_SUM && mode == ptr_mode
-	  && host_integerp (treeop1, 0))
+	  && tree_fits_shwi_p (treeop1))
 	{
 	  tree exp1 = treeop1;
 
@@ -8522,7 +8522,7 @@  expand_expr_real_2 (sepops ops, rtx target, enum machine_mode tmode,
 	    op0 = copy_to_mode_reg (mode, op0);
 
 	  return REDUCE_BIT_FIELD (gen_rtx_MULT (mode, op0,
-			       gen_int_mode (tree_low_cst (exp1, 0),
+			       gen_int_mode (tree_to_shwi (exp1),
 					     TYPE_MODE (TREE_TYPE (exp1)))));
 	}
 
@@ -9538,9 +9538,9 @@  expand_expr_real_1 (tree exp, rtx target, enum machine_mode tmode,
 	    tree bftype;
 	    base = TREE_OPERAND (base, 0);
 	    if (offset == 0
-		&& host_integerp (TYPE_SIZE (TREE_TYPE (exp)), 1)
+		&& tree_fits_uhwi_p (TYPE_SIZE (TREE_TYPE (exp)))
 		&& (GET_MODE_BITSIZE (DECL_MODE (base))
-		    == TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (exp)))))
+		    == tree_to_hwi (TYPE_SIZE (TREE_TYPE (exp)))))
 	      return expand_expr (build1 (VIEW_CONVERT_EXPR,
 					  TREE_TYPE (exp), base),
 				  target, tmode, modifier);
@@ -9733,7 +9733,7 @@  expand_expr_real_1 (tree exp, rtx target, enum machine_mode tmode,
 			if (GET_MODE_CLASS (mode) == MODE_INT
 			    && GET_MODE_SIZE (mode) == 1)
 			  return gen_int_mode (TREE_STRING_POINTER (init)
-					       [TREE_INT_CST_LOW (index1)],
+					       [tree_to_uhwi (index1)],
 					       mode);
 		      }
 		  }
@@ -9771,7 +9771,7 @@  expand_expr_real_1 (tree exp, rtx target, enum machine_mode tmode,
 		op0 = expand_expr (value, target, tmode, modifier);
 		if (DECL_BIT_FIELD (field))
 		  {
-		    HOST_WIDE_INT bitsize = TREE_INT_CST_LOW (DECL_SIZE (field));
+		    HOST_WIDE_INT bitsize = tree_to_shwi (DECL_SIZE (field));
 		    enum machine_mode imode = TYPE_MODE (TREE_TYPE (field));
 
 		    if (TYPE_UNSIGNED (TREE_TYPE (field)))
@@ -10480,10 +10480,10 @@  is_aligning_offset (const_tree offset, const_tree exp)
   /* We must now have a BIT_AND_EXPR with a constant that is one less than
      power of 2 and which is larger than BIGGEST_ALIGNMENT.  */
   if (TREE_CODE (offset) != BIT_AND_EXPR
-      || !host_integerp (TREE_OPERAND (offset, 1), 1)
+      || !tree_fits_uhwi_p (TREE_OPERAND (offset, 1))
       || compare_tree_int (TREE_OPERAND (offset, 1),
 			   BIGGEST_ALIGNMENT / BITS_PER_UNIT) <= 0
-      || !exact_log2 (tree_low_cst (TREE_OPERAND (offset, 1), 1) + 1) < 0)
+      || !exact_log2 (tree_to_hwi (TREE_OPERAND (offset, 1)) + 1) < 0)
     return 0;
 
   /* Look at the first operand of BIT_AND_EXPR and strip any conversion.
@@ -10617,7 +10617,7 @@  string_constant (tree arg, tree *ptr_offset)
 	 and inside of the bounds of the string literal.  */
       offset = fold_convert (sizetype, offset);
       if (compare_tree_int (DECL_SIZE_UNIT (array), length) > 0
-	  && (! host_integerp (offset, 1)
+	  && (! tree_fits_uhwi_p (offset)
 	      || compare_tree_int (offset, length) >= 0))
 	return 0;
 
diff --git a/gcc/expr.h b/gcc/expr.h
index 154648e..442597f 100644
--- a/gcc/expr.h
+++ b/gcc/expr.h
@@ -28,7 +28,7 @@  along with GCC; see the file COPYING3.  If not see
 #include "rtl.h"
 /* For optimize_size */
 #include "flags.h"
-/* For host_integerp, tree_low_cst, fold_convert, size_binop, ssize_int,
+/* For tree_fits_*hwi_p, tree_to_*hwi, fold_convert, size_binop, ssize_int,
    TREE_CODE, TYPE_SIZE, int_size_in_bytes,    */
 #include "tree.h"
 /* For GET_MODE_BITSIZE, word_mode */
@@ -96,8 +96,8 @@  struct locate_and_pad_arg_data
 #define ADD_PARM_SIZE(TO, INC)					\
 do {								\
   tree inc = (INC);						\
-  if (host_integerp (inc, 0))					\
-    (TO).constant += tree_low_cst (inc, 0);			\
+  if (tree_fits_shwi_p (inc))					\
+    (TO).constant += tree_to_hwi (inc);				\
   else if ((TO).var == 0)					\
     (TO).var = fold_convert (ssizetype, inc);			\
   else								\
@@ -108,8 +108,8 @@  do {								\
 #define SUB_PARM_SIZE(TO, DEC)					\
 do {								\
   tree dec = (DEC);						\
-  if (host_integerp (dec, 0))					\
-    (TO).constant -= tree_low_cst (dec, 0);			\
+  if (tree_fits_shwi_p (dec))					\
+    (TO).constant -= tree_to_hwi (dec);				\
   else if ((TO).var == 0)					\
     (TO).var = size_binop (MINUS_EXPR, ssize_int (0),		\
 			   fold_convert (ssizetype, dec));	\
diff --git a/gcc/fold-const.c b/gcc/fold-const.c
index fd0075c..03eeac2 100644
--- a/gcc/fold-const.c
+++ b/gcc/fold-const.c
@@ -498,9 +498,8 @@  negate_expr_p (tree t)
       if (TREE_CODE (TREE_OPERAND (t, 1)) == INTEGER_CST)
 	{
 	  tree op1 = TREE_OPERAND (t, 1);
-	  if (TREE_INT_CST_HIGH (op1) == 0
-	      && (unsigned HOST_WIDE_INT) (TYPE_PRECISION (type) - 1)
-		 == TREE_INT_CST_LOW (op1))
+	  if (tree_fits_uhwi_p (op1)
+	      && (TYPE_PRECISION (type) - 1) == tree_to_hwi (op1))
 	    return true;
 	}
       break;
@@ -696,9 +695,8 @@  fold_negate_expr (location_t loc, tree t)
       if (TREE_CODE (TREE_OPERAND (t, 1)) == INTEGER_CST)
 	{
 	  tree op1 = TREE_OPERAND (t, 1);
-	  if (TREE_INT_CST_HIGH (op1) == 0
-	      && (unsigned HOST_WIDE_INT) (TYPE_PRECISION (type) - 1)
-		 == TREE_INT_CST_LOW (op1))
+	  if (tree_fits_uhwi_p (op1)
+	      && (TYPE_PRECISION (type) - 1) == tree_to_hwi (op1))
 	    {
 	      tree ntype = TYPE_UNSIGNED (type)
 			   ? signed_type_for (type)
@@ -3275,8 +3273,8 @@  make_bit_field_ref (location_t loc, tree inner, tree type,
       tree size = TYPE_SIZE (TREE_TYPE (inner));
       if ((INTEGRAL_TYPE_P (TREE_TYPE (inner))
 	   || POINTER_TYPE_P (TREE_TYPE (inner)))
-	  && host_integerp (size, 0)
-	  && tree_low_cst (size, 0) == bitsize)
+	  && tree_fits_shwi_p (size)
+	  && tree_to_hwi (size) == bitsize)
 	return fold_convert_loc (loc, type, inner);
     }
 
@@ -5713,8 +5711,8 @@  extract_muldiv_1 (tree t, tree c, enum tree_code code, tree wide_type,
 	  && (tcode == RSHIFT_EXPR || TYPE_UNSIGNED (TREE_TYPE (op0)))
 	  /* const_binop may not detect overflow correctly,
 	     so check for it explicitly here.  */
-	  && TYPE_PRECISION (TREE_TYPE (size_one_node)) > TREE_INT_CST_LOW (op1)
-	  && TREE_INT_CST_HIGH (op1) == 0
+	  && tree_fits_uhwi_p (op1)
+	  && TYPE_PRECISION (TREE_TYPE (size_one_node)) > tree_to_hwi (op1)
 	  && 0 != (t1 = fold_convert (ctype,
 				      const_binop (LSHIFT_EXPR,
 						   size_one_node,
@@ -6482,7 +6480,7 @@  fold_single_bit_test (location_t loc, enum tree_code code,
 	  && 0 > compare_tree_int (TREE_OPERAND (inner, 1),
 				   bitnum - TYPE_PRECISION (type)))
 	{
-	  bitnum += TREE_INT_CST_LOW (TREE_OPERAND (inner, 1));
+	  bitnum += tree_to_uhwi (TREE_OPERAND (inner, 1));
 	  inner = TREE_OPERAND (inner, 0);
 	}
 
@@ -7087,14 +7085,14 @@  fold_plusminus_mult_expr (location_t loc, enum tree_code code, tree type,
   /* No identical multiplicands; see if we can find a common
      power-of-two factor in non-power-of-two multiplies.  This
      can help in multi-dimensional array access.  */
-  else if (host_integerp (arg01, 0)
-	   && host_integerp (arg11, 0))
+  else if (tree_fits_shwi_p (arg01)
+	   && tree_fits_shwi_p (arg11))
     {
       HOST_WIDE_INT int01, int11, tmp;
       bool swap = false;
       tree maybe_same;
-      int01 = TREE_INT_CST_LOW (arg01);
-      int11 = TREE_INT_CST_LOW (arg11);
+      int01 = tree_to_hwi (arg01);
+      int11 = tree_to_hwi (arg11);
 
       /* Move min of absolute values to int11.  */
       if (absu_hwi (int01) < absu_hwi (int11))
@@ -7290,9 +7288,9 @@  native_encode_string (const_tree expr, unsigned char *ptr, int len)
   if (TREE_CODE (type) != ARRAY_TYPE
       || TREE_CODE (TREE_TYPE (type)) != INTEGER_TYPE
       || GET_MODE_BITSIZE (TYPE_MODE (TREE_TYPE (type))) != BITS_PER_UNIT
-      || !host_integerp (TYPE_SIZE_UNIT (type), 0))
+      || !tree_fits_shwi_p (TYPE_SIZE_UNIT (type)))
     return 0;
-  total_bytes = tree_low_cst (TYPE_SIZE_UNIT (type), 0);
+  total_bytes = tree_to_hwi (TYPE_SIZE_UNIT (type));
   if (total_bytes > len)
     return 0;
   if (TREE_STRING_LENGTH (expr) < total_bytes)
@@ -7899,11 +7897,11 @@  fold_unary_loc (location_t loc, enum tree_code code, tree type, tree op0)
 	    change = 1;
 	  else if (TYPE_PRECISION (TREE_TYPE (and1))
 		   <= HOST_BITS_PER_WIDE_INT
-		   && host_integerp (and1, 1))
+		   && tree_fits_uhwi_p (and1))
 	    {
 	      unsigned HOST_WIDE_INT cst;
 
-	      cst = tree_low_cst (and1, 1);
+	      cst = tree_to_hwi (and1);
 	      cst &= (HOST_WIDE_INT) -1
 		     << (TYPE_PRECISION (TREE_TYPE (and1)) - 1);
 	      change = (cst == 0);
@@ -8837,7 +8835,7 @@  fold_comparison (location_t loc, enum tree_code code, tree type,
 	      indirect_base0 = true;
 	    }
 	  offset0 = TREE_OPERAND (arg0, 1);
-	  if (host_integerp (offset0, 0))
+	  if (tree_fits_shwi_p (offset0))
 	    {
 	      HOST_WIDE_INT off = size_low_cst (offset0);
 	      if ((HOST_WIDE_INT) (((unsigned HOST_WIDE_INT) off)
@@ -8871,7 +8869,7 @@  fold_comparison (location_t loc, enum tree_code code, tree type,
 	      indirect_base1 = true;
 	    }
 	  offset1 = TREE_OPERAND (arg1, 1);
-	  if (host_integerp (offset1, 0))
+	  if (tree_fits_shwi_p (offset1))
 	    {
 	      HOST_WIDE_INT off = size_low_cst (offset1);
 	      if ((HOST_WIDE_INT) (((unsigned HOST_WIDE_INT) off)
@@ -9509,7 +9507,7 @@  get_pointer_modulus_and_residue (tree expr, unsigned HOST_WIDE_INT *residue,
       inner_code = TREE_CODE (op1);
       if (inner_code == INTEGER_CST)
 	{
-	  *residue += TREE_INT_CST_LOW (op1);
+	  *residue += tree_to_uhwi (op1);
 	  return modulus;
 	}
       else if (inner_code == MULT_EXPR)
@@ -9520,7 +9518,7 @@  get_pointer_modulus_and_residue (tree expr, unsigned HOST_WIDE_INT *residue,
 	      unsigned HOST_WIDE_INT align;
 
 	      /* Compute the greatest power-of-2 divisor of op1.  */
-	      align = TREE_INT_CST_LOW (op1);
+	      align = tree_to_uhwi (op1);
 	      align &= -align;
 
 	      /* If align is non-zero and less than *modulus, replace
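The `align &= -align` line in the get_pointer_modulus_and_residue hunk above relies on a standard bit trick. A minimal standalone demonstration (plain C, not GCC code):

```c
/* For nonzero unsigned X, X & -X isolates the lowest set bit, which is
   the greatest power-of-2 divisor of X.  In unsigned arithmetic -X is
   ~X + 1, which flips every bit above the lowest set bit and leaves
   that bit itself intact, so the AND keeps exactly one bit.  */
#include <assert.h>
#include <stdint.h>

static uint64_t greatest_pow2_divisor (uint64_t x)
{
  return x & -x;
}
```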
@@ -10223,9 +10221,9 @@  fold_binary_loc (location_t loc,
 	    code11 = TREE_CODE (tree11);
 	    if (code01 == INTEGER_CST
 		&& code11 == INTEGER_CST
-		&& TREE_INT_CST_HIGH (tree01) == 0
-		&& TREE_INT_CST_HIGH (tree11) == 0
-		&& ((TREE_INT_CST_LOW (tree01) + TREE_INT_CST_LOW (tree11))
+		&& tree_fits_uhwi_p (tree01)
+		&& tree_fits_uhwi_p (tree11)
+		&& ((tree_to_hwi (tree01) + tree_to_hwi (tree11))
 		    == TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg0, 0)))))
 	      {
 		tem = build2_loc (loc, LROTATE_EXPR,
@@ -11442,9 +11440,9 @@  fold_binary_loc (location_t loc,
 	 and for - instead of + (or unary - instead of +)
 	 and/or ^ instead of |.
 	 If B is constant and (B & M) == 0, fold into A & M.  */
-      if (host_integerp (arg1, 1))
+      if (tree_fits_uhwi_p (arg1))
 	{
-	  unsigned HOST_WIDE_INT cst1 = tree_low_cst (arg1, 1);
+	  unsigned HOST_WIDE_INT cst1 = tree_to_hwi (arg1);
 	  if (~cst1 && (cst1 & (cst1 + 1)) == 0
 	      && INTEGRAL_TYPE_P (TREE_TYPE (arg0))
 	      && (TREE_CODE (arg0) == PLUS_EXPR
@@ -11468,8 +11466,8 @@  fold_binary_loc (location_t loc,
 		  which = 1;
 		}
 
-	      if (!host_integerp (TYPE_MAX_VALUE (TREE_TYPE (arg0)), 1)
-		  || (tree_low_cst (TYPE_MAX_VALUE (TREE_TYPE (arg0)), 1)
+	      if (!tree_fits_uhwi_p (TYPE_MAX_VALUE (TREE_TYPE (arg0)))
+		  || (tree_to_hwi (TYPE_MAX_VALUE (TREE_TYPE (arg0)))
 		      & cst1) != cst1)
 		which = -1;
 
@@ -11482,9 +11480,8 @@  fold_binary_loc (location_t loc,
 		    if (TREE_CODE (TREE_OPERAND (pmop[which], 1))
 			!= INTEGER_CST)
 		      break;
-		    /* tree_low_cst not used, because we don't care about
-		       the upper bits.  */
-		    cst0 = TREE_INT_CST_LOW (TREE_OPERAND (pmop[which], 1));
+		    /* We don't care about the upper bits.  */
+		    cst0 = tree_to_hwi (TREE_OPERAND (pmop[which], 1));
 		    cst0 &= cst1;
 		    if (TREE_CODE (pmop[which]) == BIT_AND_EXPR)
 		      {
@@ -11503,7 +11500,7 @@  fold_binary_loc (location_t loc,
 		       omitted (assumed 0).  */
 		    if ((TREE_CODE (arg0) == PLUS_EXPR
 			 || (TREE_CODE (arg0) == MINUS_EXPR && which == 0))
-			&& (TREE_INT_CST_LOW (pmop[which]) & cst1) == 0)
+			&& (tree_to_hwi (pmop[which]) & cst1) == 0)
 		      pmop[which] = NULL;
 		    break;
 		  default:
@@ -11566,7 +11563,7 @@  fold_binary_loc (location_t loc,
 	    = TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg0, 0)));
 
 	  if (prec < BITS_PER_WORD && prec < HOST_BITS_PER_WIDE_INT
-	      && (~TREE_INT_CST_LOW (arg1)
+	      && (~tree_to_hwi (arg1)
 		  & (((HOST_WIDE_INT) 1 << prec) - 1)) == 0)
 	    return
 	      fold_convert_loc (loc, type, TREE_OPERAND (arg0, 0));
@@ -11592,10 +11589,10 @@  fold_binary_loc (location_t loc,
       /* If arg0 is derived from the address of an object or function, we may
 	 be able to fold this expression using the object or function's
 	 alignment.  */
-      if (POINTER_TYPE_P (TREE_TYPE (arg0)) && host_integerp (arg1, 1))
+      if (POINTER_TYPE_P (TREE_TYPE (arg0)) && tree_fits_uhwi_p (arg1))
 	{
 	  unsigned HOST_WIDE_INT modulus, residue;
-	  unsigned HOST_WIDE_INT low = TREE_INT_CST_LOW (arg1);
+	  unsigned HOST_WIDE_INT low = tree_to_hwi (arg1);
 
 	  modulus = get_pointer_modulus_and_residue (arg0, &residue,
 						     integer_onep (arg1));
@@ -11612,16 +11609,15 @@  fold_binary_loc (location_t loc,
 	 if the new mask might be further optimized.  */
       if ((TREE_CODE (arg0) == LSHIFT_EXPR
 	   || TREE_CODE (arg0) == RSHIFT_EXPR)
-	  && host_integerp (TREE_OPERAND (arg0, 1), 1)
-	  && host_integerp (arg1, TYPE_UNSIGNED (TREE_TYPE (arg1)))
-	  && tree_low_cst (TREE_OPERAND (arg0, 1), 1)
+	  && tree_fits_uhwi_p (TREE_OPERAND (arg0, 1))
+	  && tree_fits_hwi_p (arg1, TYPE_UNSIGNED (TREE_TYPE (arg1)))
+	  && tree_to_hwi (TREE_OPERAND (arg0, 1))
 	     < TYPE_PRECISION (TREE_TYPE (arg0))
 	  && TYPE_PRECISION (TREE_TYPE (arg0)) <= HOST_BITS_PER_WIDE_INT
-	  && tree_low_cst (TREE_OPERAND (arg0, 1), 1) > 0)
+	  && tree_to_hwi (TREE_OPERAND (arg0, 1)) > 0)
 	{
-	  unsigned int shiftc = tree_low_cst (TREE_OPERAND (arg0, 1), 1);
-	  unsigned HOST_WIDE_INT mask
-	    = tree_low_cst (arg1, TYPE_UNSIGNED (TREE_TYPE (arg1)));
+	  unsigned int shiftc = tree_to_hwi (TREE_OPERAND (arg0, 1));
+	  unsigned HOST_WIDE_INT mask = tree_to_hwi (arg1);
 	  unsigned HOST_WIDE_INT newmask, zerobits = 0;
 	  tree shift_type = TREE_TYPE (arg0);
 
@@ -12220,13 +12216,13 @@  fold_binary_loc (location_t loc,
 	return NULL_TREE;
 
       /* Turn (a OP c1) OP c2 into a OP (c1+c2).  */
-      if (TREE_CODE (op0) == code && host_integerp (arg1, false)
-	  && TREE_INT_CST_LOW (arg1) < TYPE_PRECISION (type)
-	  && host_integerp (TREE_OPERAND (arg0, 1), false)
-	  && TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1)) < TYPE_PRECISION (type))
+      if (TREE_CODE (op0) == code && tree_fits_shwi_p (arg1)
+	  && tree_to_hwi (arg1) < TYPE_PRECISION (type)
+	  && tree_fits_shwi_p (TREE_OPERAND (arg0, 1))
+	  && tree_to_hwi (TREE_OPERAND (arg0, 1)) < TYPE_PRECISION (type))
 	{
-	  HOST_WIDE_INT low = (TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1))
-			       + TREE_INT_CST_LOW (arg1));
+	  HOST_WIDE_INT low = (tree_to_hwi (TREE_OPERAND (arg0, 1))
+			       + tree_to_hwi (arg1));
 
 	  /* Deal with a OP (c1 + c2) being undefined but (a OP c1) OP c2
 	     being well defined.  */
@@ -12250,13 +12246,13 @@  fold_binary_loc (location_t loc,
       if (((code == LSHIFT_EXPR && TREE_CODE (arg0) == RSHIFT_EXPR)
            || (TYPE_UNSIGNED (type)
 	       && code == RSHIFT_EXPR && TREE_CODE (arg0) == LSHIFT_EXPR))
-	  && host_integerp (arg1, false)
-	  && TREE_INT_CST_LOW (arg1) < TYPE_PRECISION (type)
-	  && host_integerp (TREE_OPERAND (arg0, 1), false)
-	  && TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1)) < TYPE_PRECISION (type))
+	  && tree_fits_shwi_p (arg1)
+	  && tree_to_hwi (arg1) < TYPE_PRECISION (type)
+	  && tree_fits_shwi_p (TREE_OPERAND (arg0, 1))
+	  && tree_to_hwi (TREE_OPERAND (arg0, 1)) < TYPE_PRECISION (type))
 	{
-	  HOST_WIDE_INT low0 = TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1));
-	  HOST_WIDE_INT low1 = TREE_INT_CST_LOW (arg1);
+	  HOST_WIDE_INT low0 = tree_to_hwi (TREE_OPERAND (arg0, 1));
+	  HOST_WIDE_INT low1 = tree_to_hwi (arg1);
 	  tree lshift;
 	  tree arg00;
 
@@ -12300,10 +12296,10 @@  fold_binary_loc (location_t loc,
       if (code == RROTATE_EXPR && TREE_CODE (arg1) == INTEGER_CST
 	  && TREE_CODE (arg0) == RROTATE_EXPR
 	  && TREE_CODE (TREE_OPERAND (arg0, 1)) == INTEGER_CST
-	  && TREE_INT_CST_HIGH (arg1) == 0
-	  && TREE_INT_CST_HIGH (TREE_OPERAND (arg0, 1)) == 0
-	  && ((TREE_INT_CST_LOW (arg1)
-	       + TREE_INT_CST_LOW (TREE_OPERAND (arg0, 1)))
+	  && tree_fits_uhwi_p (arg1)
+	  && tree_fits_uhwi_p (TREE_OPERAND (arg0, 1))
+	  && ((tree_to_hwi (arg1)
+	       + tree_to_hwi (TREE_OPERAND (arg0, 1)))
 	      == (unsigned int) TYPE_PRECISION (type)))
 	return TREE_OPERAND (arg0, 0);
 
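The LROTATE_EXPR/RROTATE_EXPR folds above hinge on the identity that a shift pair whose counts sum to the type precision is a rotate. A quick standalone check of that identity for 32-bit unsigned values (illustrative only, not GCC code):

```c
/* (x << c) | (x >> (32 - c)) is a left rotate by c when x is unsigned
   and 0 < c < 32; the c == 0 case is special-cased to avoid the
   undefined 32-bit shift.  */
#include <assert.h>
#include <stdint.h>

static uint32_t rotl32 (uint32_t x, unsigned c)
{
  c &= 31;
  return c ? (x << c) | (x >> (32 - c)) : x;
}
```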
@@ -12630,7 +12626,7 @@  fold_binary_loc (location_t loc,
 	  && operand_equal_p (tree_strip_nop_conversions (TREE_OPERAND (arg0,
 									1)),
 			      arg1, 0)
-	  && (TREE_INT_CST_LOW (TREE_OPERAND (arg0, 0)) & 1) == 1)
+	  && (tree_to_hwi (TREE_OPERAND (arg0, 0)) & 1) == 1)
 	{
 	  return omit_two_operands_loc (loc, type,
 				    code == NE_EXPR
@@ -12721,15 +12717,15 @@  fold_binary_loc (location_t loc,
 	  tree arg001 = TREE_OPERAND (TREE_OPERAND (arg0, 0), 1);
 
 	  /* Check for a valid shift count.  */
-	  if (TREE_INT_CST_HIGH (arg001) == 0
-	      && TREE_INT_CST_LOW (arg001) < prec)
+	  if (tree_fits_uhwi_p (arg001)
+	      && (unsigned HOST_WIDE_INT)tree_to_hwi (arg001) < prec)
 	    {
 	      tree arg01 = TREE_OPERAND (arg0, 1);
 	      tree arg000 = TREE_OPERAND (TREE_OPERAND (arg0, 0), 0);
 	      unsigned HOST_WIDE_INT log2 = tree_log2 (arg01);
 	      /* If (C2 << C1) doesn't overflow, then ((X >> C1) & C2) != 0
 		 can be rewritten as (X & (C2 << C1)) != 0.  */
-	      if ((log2 + TREE_INT_CST_LOW (arg001)) < prec)
+	      if ((log2 + tree_to_hwi (arg001)) < prec)
 		{
 		  tem = fold_build2_loc (loc, LSHIFT_EXPR, itype, arg01, arg001);
 		  tem = fold_build2_loc (loc, BIT_AND_EXPR, itype, arg000, tem);
@@ -12847,9 +12843,8 @@  fold_binary_loc (location_t loc,
 	  tree arg00 = TREE_OPERAND (arg0, 0);
 	  tree arg01 = TREE_OPERAND (arg0, 1);
 	  tree itype = TREE_TYPE (arg00);
-	  if (TREE_INT_CST_HIGH (arg01) == 0
-	      && TREE_INT_CST_LOW (arg01)
-		 == (unsigned HOST_WIDE_INT) (TYPE_PRECISION (itype) - 1))
+	  if (tree_fits_uhwi_p (arg01)
+	      && tree_to_hwi (arg01) == (TYPE_PRECISION (itype) - 1))
 	    {
 	      if (TYPE_UNSIGNED (itype))
 		{
@@ -13922,8 +13917,8 @@  fold_ternary_loc (location_t loc, enum tree_code code, tree type,
 	  STRIP_NOPS (tem);
 	  if (TREE_CODE (tem) == RSHIFT_EXPR
               && TREE_CODE (TREE_OPERAND (tem, 1)) == INTEGER_CST
-              && (unsigned HOST_WIDE_INT) tree_log2 (arg1) ==
-	         TREE_INT_CST_LOW (TREE_OPERAND (tem, 1)))
+              && (unsigned HOST_WIDE_INT) tree_log2 (arg1)
+	      == tree_to_uhwi (TREE_OPERAND (tem, 1)))
 	    return fold_build2_loc (loc, BIT_AND_EXPR, type,
 				TREE_OPERAND (tem, 0), arg1);
 	}
@@ -14003,9 +13998,9 @@  fold_ternary_loc (location_t loc, enum tree_code code, tree type,
 		  && TREE_TYPE (type) == TREE_TYPE (TREE_TYPE (arg0)))))
 	{
 	  tree eltype = TREE_TYPE (TREE_TYPE (arg0));
-	  unsigned HOST_WIDE_INT width = tree_low_cst (TYPE_SIZE (eltype), 1);
-	  unsigned HOST_WIDE_INT n = tree_low_cst (arg1, 1);
-	  unsigned HOST_WIDE_INT idx = tree_low_cst (op2, 1);
+	  unsigned HOST_WIDE_INT width = tree_to_uhwi (TYPE_SIZE (eltype));
+	  unsigned HOST_WIDE_INT n = tree_to_uhwi (arg1);
+	  unsigned HOST_WIDE_INT idx = tree_to_uhwi (op2);
 
 	  if (n != 0
 	      && (idx % width) == 0
@@ -14065,7 +14060,7 @@  fold_ternary_loc (location_t loc, enum tree_code code, tree type,
 
       /* A bit-field-ref that referenced the full argument can be stripped.  */
       if (INTEGRAL_TYPE_P (TREE_TYPE (arg0))
-	  && TYPE_PRECISION (TREE_TYPE (arg0)) == tree_low_cst (arg1, 1)
+	  && TYPE_PRECISION (TREE_TYPE (arg0)) == tree_to_uhwi (arg1)
 	  && integer_zerop (op2))
 	return fold_convert_loc (loc, type, arg0);
 
@@ -14073,17 +14068,17 @@  fold_ternary_loc (location_t loc, enum tree_code code, tree type,
          fold (nearly) all BIT_FIELD_REFs.  */
       if (CONSTANT_CLASS_P (arg0)
 	  && can_native_interpret_type_p (type)
-	  && host_integerp (TYPE_SIZE_UNIT (TREE_TYPE (arg0)), 1)
+	  && tree_fits_uhwi_p (TYPE_SIZE_UNIT (TREE_TYPE (arg0)))
 	  /* This limitation should not be necessary, we just need to
 	     round this up to mode size.  */
-	  && tree_low_cst (op1, 1) % BITS_PER_UNIT == 0
+	  && tree_to_uhwi (op1) % BITS_PER_UNIT == 0
 	  /* Need bit-shifting of the buffer to relax the following.  */
-	  && tree_low_cst (op2, 1) % BITS_PER_UNIT == 0)
+	  && tree_to_uhwi (op2) % BITS_PER_UNIT == 0)
 	{
-	  unsigned HOST_WIDE_INT bitpos = tree_low_cst (op2, 1);
-	  unsigned HOST_WIDE_INT bitsize = tree_low_cst (op1, 1);
+	  unsigned HOST_WIDE_INT bitpos = tree_to_uhwi (op2);
+	  unsigned HOST_WIDE_INT bitsize = tree_to_uhwi (op1);
 	  unsigned HOST_WIDE_INT clen;
-	  clen = tree_low_cst (TYPE_SIZE_UNIT (TREE_TYPE (arg0)), 1);
+	  clen = tree_to_uhwi (TYPE_SIZE_UNIT (TREE_TYPE (arg0)));
 	  /* ???  We cannot tell native_encode_expr to start at
 	     some random byte only.  So limit us to a reasonable amount
 	     of work.  */
@@ -14925,9 +14920,8 @@  multiple_of_p (tree type, const_tree top, const_tree bottom)
 	  op1 = TREE_OPERAND (top, 1);
 	  /* const_binop may not detect overflow correctly,
 	     so check for it explicitly here.  */
-	  if (TYPE_PRECISION (TREE_TYPE (size_one_node))
-	      > TREE_INT_CST_LOW (op1)
-	      && TREE_INT_CST_HIGH (op1) == 0
+	  if (tree_fits_uhwi_p (op1)
+	      && TYPE_PRECISION (TREE_TYPE (size_one_node)) > tree_to_hwi (op1)
 	      && 0 != (t1 = fold_convert (type,
 					  const_binop (LSHIFT_EXPR,
 						       size_one_node,
@@ -15311,7 +15305,7 @@  tree_call_nonnegative_warnv_p (tree type, tree fndecl,
 	/* True if the 1st argument is nonnegative or the second
 	   argument is an even integer.  */
 	if (TREE_CODE (arg1) == INTEGER_CST
-	    && (TREE_INT_CST_LOW (arg1) & 1) == 0)
+	    && (tree_to_hwi (arg1) & 1) == 0)
 	  return true;
 	return tree_expr_nonnegative_warnv_p (arg0,
 					      strict_overflow_p);
@@ -15895,7 +15889,7 @@  fold_read_from_constant_string (tree exp)
 	  && (GET_MODE_SIZE (TYPE_MODE (TREE_TYPE (TREE_TYPE (string)))) == 1))
 	return build_int_cst_type (TREE_TYPE (exp),
 				   (TREE_STRING_POINTER (string)
-				    [TREE_INT_CST_LOW (index)]));
+				    [tree_to_uhwi (index)]));
     }
   return NULL;
 }
@@ -16245,9 +16239,9 @@  fold_indirect_ref_1 (location_t loc, tree type, tree op0)
 	  if (TREE_CODE (op00type) == VECTOR_TYPE
 	      && type == TREE_TYPE (op00type))
 	    {
-	      HOST_WIDE_INT offset = tree_low_cst (op01, 0);
+	      HOST_WIDE_INT offset = tree_to_shwi (op01);
 	      tree part_width = TYPE_SIZE (type);
-	      unsigned HOST_WIDE_INT part_widthi = tree_low_cst (part_width, 0)/BITS_PER_UNIT;
+	      unsigned HOST_WIDE_INT part_widthi = tree_to_shwi (part_width)/BITS_PER_UNIT;
 	      unsigned HOST_WIDE_INT indexi = offset * BITS_PER_UNIT;
 	      tree index = bitsize_int (indexi);
 
diff --git a/gcc/fortran/target-memory.c b/gcc/fortran/target-memory.c
index aec7fa2..4c49fbe 100644
--- a/gcc/fortran/target-memory.c
+++ b/gcc/fortran/target-memory.c
@@ -239,8 +239,8 @@  encode_derived (gfc_expr *source, unsigned char *buffer, size_t buffer_size)
       gcc_assert (cmp);
       if (!c->expr)
 	continue;
-      ptr = TREE_INT_CST_LOW(DECL_FIELD_OFFSET(cmp->backend_decl))
-	    + TREE_INT_CST_LOW(DECL_FIELD_BIT_OFFSET(cmp->backend_decl))/8;
+      ptr = tree_to_shwi (DECL_FIELD_OFFSET(cmp->backend_decl))
+	    + tree_to_shwi (DECL_FIELD_BIT_OFFSET(cmp->backend_decl))/8;
 
       if (c->expr->expr_type == EXPR_NULL)
 	{
@@ -522,9 +522,9 @@  gfc_interpret_derived (unsigned char *buffer, size_t buffer_size, gfc_expr *resu
 	 i.e. there are, e.g., no bit fields.  */
 
       gcc_assert (cmp->backend_decl);
-      ptr = TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (cmp->backend_decl));
+      ptr = tree_to_shwi (DECL_FIELD_BIT_OFFSET (cmp->backend_decl));
       gcc_assert (ptr % 8 == 0);
-      ptr = ptr/8 + TREE_INT_CST_LOW (DECL_FIELD_OFFSET (cmp->backend_decl));
+      ptr = ptr/8 + tree_to_shwi (DECL_FIELD_OFFSET (cmp->backend_decl));
 
       gfc_target_interpret_expr (&buffer[ptr], buffer_size - ptr, e, true);
     }
@@ -633,8 +633,8 @@  expr_to_char (gfc_expr *e, unsigned char *data, unsigned char *chk, size_t len)
 	  gcc_assert (cmp && cmp->backend_decl);
 	  if (!c->expr)
 	    continue;
-	    ptr = TREE_INT_CST_LOW(DECL_FIELD_OFFSET(cmp->backend_decl))
-			+ TREE_INT_CST_LOW(DECL_FIELD_BIT_OFFSET(cmp->backend_decl))/8;
+	    ptr = tree_to_shwi (DECL_FIELD_OFFSET(cmp->backend_decl))
+			+ tree_to_shwi (DECL_FIELD_BIT_OFFSET(cmp->backend_decl))/8;
 	  expr_to_char (c->expr, &data[ptr], &chk[ptr], len);
 	}
       return len;
diff --git a/gcc/fortran/trans-array.c b/gcc/fortran/trans-array.c
index 3e684ee..ba5c5d5 100644
--- a/gcc/fortran/trans-array.c
+++ b/gcc/fortran/trans-array.c
@@ -1682,7 +1682,7 @@  gfc_trans_array_constructor_value (stmtblock_t * pblock, tree type,
 	      tmp = gfc_build_addr_expr (NULL_TREE, tmp);
 	      init = gfc_build_addr_expr (NULL_TREE, init);
 
-	      size = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (type));
+	      size = tree_to_shwi (TYPE_SIZE_UNIT (type));
 	      bound = build_int_cst (size_type_node, n * size);
 	      tmp = build_call_expr_loc (input_location,
 					 builtin_decl_explicit (BUILT_IN_MEMCPY),
diff --git a/gcc/fortran/trans-common.c b/gcc/fortran/trans-common.c
index 86cf007..92bf5b2 100644
--- a/gcc/fortran/trans-common.c
+++ b/gcc/fortran/trans-common.c
@@ -399,8 +399,8 @@  build_common_decl (gfc_common_head *com, tree union_type, bool is_init)
 	gfc_warning ("Named COMMON block '%s' at %L shall be of the "
 		     "same size as elsewhere (%lu vs %lu bytes)", com->name,
 		     &com->where,
-		     (unsigned long) TREE_INT_CST_LOW (size),
-		     (unsigned long) TREE_INT_CST_LOW (DECL_SIZE_UNIT (decl)));
+		     (unsigned long) tree_to_uhwi (size),
+		     (unsigned long) tree_to_uhwi (DECL_SIZE_UNIT (decl)));
 
       if (tree_int_cst_lt (DECL_SIZE_UNIT (decl), size))
 	{
diff --git a/gcc/fortran/trans-const.c b/gcc/fortran/trans-const.c
index fa820ef..85d7679 100644
--- a/gcc/fortran/trans-const.c
+++ b/gcc/fortran/trans-const.c
@@ -148,7 +148,7 @@  gfc_conv_string_init (tree length, gfc_expr * expr)
   gcc_assert (INTEGER_CST_P (length));
   gcc_assert (TREE_INT_CST_HIGH (length) == 0);
 
-  len = TREE_INT_CST_LOW (length);
+  len = tree_to_shwi (length);
   slen = expr->value.character.length;
 
   if (len > slen)
diff --git a/gcc/fortran/trans-decl.c b/gcc/fortran/trans-decl.c
index 910b150..ca5a144 100644
--- a/gcc/fortran/trans-decl.c
+++ b/gcc/fortran/trans-decl.c
@@ -410,7 +410,7 @@  gfc_can_put_var_on_stack (tree size)
   if (TREE_INT_CST_HIGH (size) != 0)
     return 0;
 
-  low = TREE_INT_CST_LOW (size);
+  low = tree_to_uhwi (size);
   if (low > (unsigned HOST_WIDE_INT) gfc_option.flag_max_stack_var_size)
     return 0;
 
diff --git a/gcc/fortran/trans-expr.c b/gcc/fortran/trans-expr.c
index 1178e3d..5caac01 100644
--- a/gcc/fortran/trans-expr.c
+++ b/gcc/fortran/trans-expr.c
@@ -2209,7 +2209,7 @@  gfc_string_to_single_character (tree len, tree str, int kind)
       || !POINTER_TYPE_P (TREE_TYPE (str)))
     return NULL_TREE;
 
-  if (TREE_INT_CST_LOW (len) == 1)
+  if (tree_to_shwi (len) == 1)
     {
       str = fold_convert (gfc_get_pchar_type (kind), str);
       return build_fold_indirect_ref_loc (input_location, str);
@@ -2221,8 +2221,8 @@  gfc_string_to_single_character (tree len, tree str, int kind)
       && TREE_CODE (TREE_OPERAND (TREE_OPERAND (str, 0), 0)) == STRING_CST
       && array_ref_low_bound (TREE_OPERAND (str, 0))
 	 == TREE_OPERAND (TREE_OPERAND (str, 0), 1)
-      && TREE_INT_CST_LOW (len) > 1
-      && TREE_INT_CST_LOW (len)
+      && tree_to_uhwi (len) > 1
+      && tree_to_uhwi (len)
 	 == (unsigned HOST_WIDE_INT)
 	    TREE_STRING_LENGTH (TREE_OPERAND (TREE_OPERAND (str, 0), 0)))
     {
@@ -2319,8 +2319,8 @@  gfc_optimize_len_trim (tree len, tree str, int kind)
       && TREE_CODE (TREE_OPERAND (TREE_OPERAND (str, 0), 0)) == STRING_CST
       && array_ref_low_bound (TREE_OPERAND (str, 0))
 	 == TREE_OPERAND (TREE_OPERAND (str, 0), 1)
-      && TREE_INT_CST_LOW (len) >= 1
-      && TREE_INT_CST_LOW (len)
+      && tree_to_uhwi (len) >= 1
+      && tree_to_uhwi (len)
 	 == (unsigned HOST_WIDE_INT)
 	    TREE_STRING_LENGTH (TREE_OPERAND (TREE_OPERAND (str, 0), 0)))
     {
diff --git a/gcc/fortran/trans-io.c b/gcc/fortran/trans-io.c
index 940129e..b70e2f3 100644
--- a/gcc/fortran/trans-io.c
+++ b/gcc/fortran/trans-io.c
@@ -293,8 +293,8 @@  gfc_build_io_library_fndecls (void)
 			    = build_pointer_type (gfc_intio_type_node);
   types[IOPARM_type_parray] = pchar_type_node;
   types[IOPARM_type_pchar] = pchar_type_node;
-  pad_size = 16 * TREE_INT_CST_LOW (TYPE_SIZE_UNIT (pchar_type_node));
-  pad_size += 32 * TREE_INT_CST_LOW (TYPE_SIZE_UNIT (integer_type_node));
+  pad_size = 16 * tree_to_shwi (TYPE_SIZE_UNIT (pchar_type_node));
+  pad_size += 32 * tree_to_shwi (TYPE_SIZE_UNIT (integer_type_node));
   pad_idx = build_index_type (size_int (pad_size - 1));
   types[IOPARM_type_pad] = build_array_type (char_type_node, pad_idx);
 
diff --git a/gcc/fortran/trans-types.c b/gcc/fortran/trans-types.c
index 81b7fa5..474fe66 100644
--- a/gcc/fortran/trans-types.c
+++ b/gcc/fortran/trans-types.c
@@ -1452,7 +1452,7 @@  gfc_get_dtype (tree type)
       if (tree_int_cst_lt (gfc_max_array_element_size, size))
 	gfc_fatal_error ("Array element size too big at %C");
 
-      i += TREE_INT_CST_LOW (size) << GFC_DTYPE_SIZE_SHIFT;
+      i += tree_to_shwi (size) << GFC_DTYPE_SIZE_SHIFT;
     }
   dtype = build_int_cst (gfc_array_index_type, i);
 
diff --git a/gcc/function.c b/gcc/function.c
index 9efbc3a..80dc4a3 100644
--- a/gcc/function.c
+++ b/gcc/function.c
@@ -3791,8 +3791,8 @@  locate_and_pad_parm (enum machine_mode passed_mode, tree type, int in_regs,
   {
     tree s2 = sizetree;
     if (where_pad != none
-	&& (!host_integerp (sizetree, 1)
-	    || (tree_low_cst (sizetree, 1) * BITS_PER_UNIT) % round_boundary))
+	&& (!tree_fits_uhwi_p (sizetree)
+	    || (tree_to_hwi (sizetree) * BITS_PER_UNIT) % round_boundary))
       s2 = round_up (s2, round_boundary / BITS_PER_UNIT);
     SUB_PARM_SIZE (locate->slot_offset, s2);
   }
@@ -3834,7 +3834,7 @@  locate_and_pad_parm (enum machine_mode passed_mode, tree type, int in_regs,
 
 #ifdef PUSH_ROUNDING
   if (passed_mode != BLKmode)
-    sizetree = size_int (PUSH_ROUNDING (TREE_INT_CST_LOW (sizetree)));
+    sizetree = size_int (PUSH_ROUNDING (tree_to_uhwi (sizetree)));
 #endif
 
   /* Pad_below needs the pre-rounded size to know how much to pad below
@@ -3844,8 +3844,8 @@  locate_and_pad_parm (enum machine_mode passed_mode, tree type, int in_regs,
     pad_below (&locate->offset, passed_mode, sizetree);
 
   if (where_pad != none
-      && (!host_integerp (sizetree, 1)
-	  || (tree_low_cst (sizetree, 1) * BITS_PER_UNIT) % round_boundary))
+      && (!tree_fits_uhwi_p (sizetree)
+	  || (tree_to_hwi (sizetree) * BITS_PER_UNIT) % round_boundary))
     sizetree = round_up (sizetree, round_boundary / BITS_PER_UNIT);
 
   ADD_PARM_SIZE (locate->size, sizetree);
@@ -3936,7 +3936,7 @@  pad_below (struct args_size *offset_ptr, enum machine_mode passed_mode, tree siz
   else
     {
       if (TREE_CODE (sizetree) != INTEGER_CST
-	  || (TREE_INT_CST_LOW (sizetree) * BITS_PER_UNIT) % PARM_BOUNDARY)
+	  || (tree_to_uhwi (sizetree) * BITS_PER_UNIT) % PARM_BOUNDARY)
 	{
 	  /* Round the size up to multiple of PARM_BOUNDARY bits.  */
 	  tree s2 = round_up (sizetree, PARM_BOUNDARY / BITS_PER_UNIT);
diff --git a/gcc/gimple-fold.c b/gcc/gimple-fold.c
index 66d0766..5ff70a2 100644
--- a/gcc/gimple-fold.c
+++ b/gcc/gimple-fold.c
@@ -1049,7 +1049,7 @@  gimple_extract_devirt_binfo_from_cst (tree cst)
 	    continue;
 
 	  pos = int_bit_position (fld);
-	  size = tree_low_cst (DECL_SIZE (fld), 1);
+	  size = tree_to_uhwi (DECL_SIZE (fld));
 	  if (pos <= offset && (pos + size) > offset)
 	    break;
 	}
@@ -1112,7 +1112,7 @@  gimple_fold_call (gimple_stmt_iterator *gsi, bool inplace)
 	  if (binfo)
 	    {
 	      HOST_WIDE_INT token
-		= TREE_INT_CST_LOW (OBJ_TYPE_REF_TOKEN (callee));
+		= tree_to_shwi (OBJ_TYPE_REF_TOKEN (callee));
 	      tree fndecl = gimple_get_virt_method_for_binfo (token, binfo);
 	      if (fndecl)
 		{
@@ -2686,7 +2686,7 @@  get_base_constructor (tree base, HOST_WIDE_INT *bit_offset,
     {
       if (!integer_zerop (TREE_OPERAND (base, 1)))
 	{
-	  if (!host_integerp (TREE_OPERAND (base, 1), 0))
+	  if (!tree_fits_shwi_p (TREE_OPERAND (base, 1)))
 	    return NULL_TREE;
 	  *bit_offset += (mem_ref_offset (base).low
 			  * BITS_PER_UNIT);
@@ -3024,13 +3024,13 @@  fold_const_aggregate_ref_1 (tree t, tree (*valueize) (tree))
 	  if ((low_bound = array_ref_low_bound (t),
 	       TREE_CODE (low_bound) == INTEGER_CST)
 	      && (unit_size = array_ref_element_size (t),
-		  host_integerp (unit_size, 1))
+		  tree_fits_uhwi_p (unit_size))
 	      && (doffset = (TREE_INT_CST (idx) - TREE_INT_CST (low_bound))
 			    .sext (TYPE_PRECISION (TREE_TYPE (idx))),
 		  doffset.fits_shwi ()))
 	    {
 	      offset = doffset.to_shwi ();
-	      offset *= TREE_INT_CST_LOW (unit_size);
+	      offset *= tree_to_shwi (unit_size);
 	      offset *= BITS_PER_UNIT;
 
 	      base = TREE_OPERAND (t, 0);
@@ -3046,7 +3046,7 @@  fold_const_aggregate_ref_1 (tree t, tree (*valueize) (tree))
 	      if (!ctor)
 		return NULL_TREE;
 	      return fold_ctor_reference (TREE_TYPE (t), ctor, offset,
-					  TREE_INT_CST_LOW (unit_size)
+					  tree_to_shwi (unit_size)
 					  * BITS_PER_UNIT,
 					  base);
 	    }
@@ -3118,7 +3118,7 @@  gimple_get_virt_method_for_binfo (HOST_WIDE_INT token, tree known_binfo)
 
   if (TREE_CODE (v) == POINTER_PLUS_EXPR)
     {
-      offset = tree_low_cst (TREE_OPERAND (v, 1), 1) * BITS_PER_UNIT;
+      offset = tree_to_uhwi (TREE_OPERAND (v, 1)) * BITS_PER_UNIT;
       v = TREE_OPERAND (v, 0);
     }
   else
@@ -3134,7 +3134,7 @@  gimple_get_virt_method_for_binfo (HOST_WIDE_INT token, tree known_binfo)
       || DECL_INITIAL (v) == error_mark_node)
     return NULL_TREE;
   gcc_checking_assert (TREE_CODE (TREE_TYPE (v)) == ARRAY_TYPE);
-  size = tree_low_cst (TYPE_SIZE (TREE_TYPE (TREE_TYPE (v))), 1);
+  size = tree_to_uhwi (TYPE_SIZE (TREE_TYPE (TREE_TYPE (v))));
   offset += token * size;
   fn = fold_ctor_reference (TREE_TYPE (TREE_TYPE (v)), DECL_INITIAL (v),
 			    offset, size, vtable);
@@ -3253,7 +3253,7 @@  gimple_val_nonnegative_real_p (tree val)
 	      arg1 = gimple_call_arg (def_stmt, 1);
 
 	      if (TREE_CODE (arg1) == INTEGER_CST
-		  && (TREE_INT_CST_LOW (arg1) & 1) == 0)
+		  && (tree_to_hwi (arg1) & 1) == 0)
 		return true;
 
 	      break;
diff --git a/gcc/gimple-pretty-print.c b/gcc/gimple-pretty-print.c
index 4b3235e..4d34c64 100644
--- a/gcc/gimple-pretty-print.c
+++ b/gcc/gimple-pretty-print.c
@@ -712,7 +712,7 @@  dump_gimple_call (pretty_printer *buffer, gimple gs, int spc, int flags)
       pp_string (buffer, " [ ");
 
       /* Get the transaction code properties.  */
-      props = TREE_INT_CST_LOW (t);
+      props = tree_to_hwi (t);
 
       if (props & PR_INSTRUMENTEDCODE)
 	pp_string (buffer, "instrumentedCode ");
diff --git a/gcc/gimple-ssa-strength-reduction.c b/gcc/gimple-ssa-strength-reduction.c
index 46600a5..6ee5c7a 100644
--- a/gcc/gimple-ssa-strength-reduction.c
+++ b/gcc/gimple-ssa-strength-reduction.c
@@ -461,8 +461,8 @@  stmt_cost (gimple gs, bool speed)
     case MULT_EXPR:
       rhs2 = gimple_assign_rhs2 (gs);
 
-      if (host_integerp (rhs2, 0))
-	return mult_by_coeff_cost (TREE_INT_CST_LOW (rhs2), lhs_mode, speed);
+      if (tree_fits_shwi_p (rhs2))
+	return mult_by_coeff_cost (tree_to_hwi (rhs2), lhs_mode, speed);
 
       gcc_assert (TREE_CODE (rhs1) != INTEGER_CST);
       return mul_cost (speed, lhs_mode);
diff --git a/gcc/gimple.c b/gcc/gimple.c
index 6088682..1d509f6 100644
--- a/gcc/gimple.c
+++ b/gcc/gimple.c
@@ -3038,16 +3038,16 @@  gimple_compare_field_offset (tree f1, tree f2)
   /* Fortran and C do not always agree on what DECL_OFFSET_ALIGN
      should be, so handle differing ones specially by decomposing
      the offset into a byte and bit offset manually.  */
-  if (host_integerp (DECL_FIELD_OFFSET (f1), 0)
-      && host_integerp (DECL_FIELD_OFFSET (f2), 0))
+  if (tree_fits_shwi_p (DECL_FIELD_OFFSET (f1))
+      && tree_fits_shwi_p (DECL_FIELD_OFFSET (f2)))
     {
       unsigned HOST_WIDE_INT byte_offset1, byte_offset2;
       unsigned HOST_WIDE_INT bit_offset1, bit_offset2;
-      bit_offset1 = TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (f1));
-      byte_offset1 = (TREE_INT_CST_LOW (DECL_FIELD_OFFSET (f1))
+      bit_offset1 = tree_to_hwi (DECL_FIELD_BIT_OFFSET (f1));
+      byte_offset1 = (tree_to_hwi (DECL_FIELD_OFFSET (f1))
 		      + bit_offset1 / BITS_PER_UNIT);
-      bit_offset2 = TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (f2));
-      byte_offset2 = (TREE_INT_CST_LOW (DECL_FIELD_OFFSET (f2))
+      bit_offset2 = tree_to_hwi (DECL_FIELD_BIT_OFFSET (f2));
+      byte_offset2 = (tree_to_hwi (DECL_FIELD_OFFSET (f2))
 		      + bit_offset2 / BITS_PER_UNIT);
       if (byte_offset1 != byte_offset2)
 	return false;
diff --git a/gcc/gimplify.c b/gcc/gimplify.c
index a871e7d..88dbe66 100644
--- a/gcc/gimplify.c
+++ b/gcc/gimplify.c
@@ -728,7 +728,7 @@  gimple_add_tmp_var (tree tmp)
   /* Later processing assumes that the object size is constant, which might
      not be true at this point.  Force the use of a constant upper bound in
      this case.  */
-  if (!host_integerp (DECL_SIZE_UNIT (tmp), 1))
+  if (!tree_fits_uhwi_p (DECL_SIZE_UNIT (tmp)))
     force_constant_size (tmp);
 
   DECL_CONTEXT (tmp) = current_function_decl;
@@ -4326,12 +4326,12 @@  gimple_fold_indirect_ref (tree t)
       if (TREE_CODE (addr) == ADDR_EXPR
 	  && TREE_CODE (TREE_TYPE (addrtype)) == VECTOR_TYPE
 	  && useless_type_conversion_p (type, TREE_TYPE (TREE_TYPE (addrtype)))
-	  && host_integerp (off, 1))
+	  && tree_fits_uhwi_p (off))
 	{
-          unsigned HOST_WIDE_INT offset = tree_low_cst (off, 1);
+          unsigned HOST_WIDE_INT offset = tree_to_hwi (off);
           tree part_width = TYPE_SIZE (type);
           unsigned HOST_WIDE_INT part_widthi
-            = tree_low_cst (part_width, 0) / BITS_PER_UNIT;
+            = tree_to_shwi (part_width) / BITS_PER_UNIT;
           unsigned HOST_WIDE_INT indexi = offset * BITS_PER_UNIT;
           tree index = bitsize_int (indexi);
           if (offset / part_widthi
diff --git a/gcc/go/go-gcc.cc b/gcc/go/go-gcc.cc
index 84bc972..4d72b59 100644
--- a/gcc/go/go-gcc.cc
+++ b/gcc/go/go-gcc.cc
@@ -782,7 +782,7 @@  Gcc_backend::type_size(Btype* btype)
   t = TYPE_SIZE_UNIT(t);
   gcc_assert(TREE_CODE(t) == INTEGER_CST);
   gcc_assert(TREE_INT_CST_HIGH(t) == 0);
-  unsigned HOST_WIDE_INT val_wide = TREE_INT_CST_LOW(t);
+  unsigned HOST_WIDE_INT val_wide = tree_to_uhwi(t);
   size_t ret = static_cast<size_t>(val_wide);
   gcc_assert(ret == val_wide);
   return ret;
diff --git a/gcc/go/gofrontend/expressions.cc b/gcc/go/gofrontend/expressions.cc
index eef7ea7..381ba86 100644
--- a/gcc/go/gofrontend/expressions.cc
+++ b/gcc/go/gofrontend/expressions.cc
@@ -3147,9 +3147,9 @@  Type_conversion_expression::do_get_tree(Translate_context* context)
 	   && expr_type->integer_type() != NULL)
     {
       expr_tree = fold_convert(integer_type_node, expr_tree);
-      if (host_integerp(expr_tree, 0))
+      if (tree_fits_shwi_p(expr_tree))
 	{
-	  HOST_WIDE_INT intval = tree_low_cst(expr_tree, 0);
+	  HOST_WIDE_INT intval = tree_to_hwi(expr_tree);
 	  std::string s;
 	  Lex::append_char(intval, true, &s, this->location());
 	  Expression* se = Expression::make_string(s, this->location());
diff --git a/gcc/godump.c b/gcc/godump.c
index ab1edc6..0302b06 100644
--- a/gcc/godump.c
+++ b/gcc/godump.c
@@ -728,12 +728,12 @@  go_format_type (struct godump_container *container, tree type,
 	  && tree_int_cst_sgn (TYPE_MIN_VALUE (TYPE_DOMAIN (type))) == 0
 	  && TYPE_MAX_VALUE (TYPE_DOMAIN (type)) != NULL_TREE
 	  && TREE_CODE (TYPE_MAX_VALUE (TYPE_DOMAIN (type))) == INTEGER_CST
-	  && host_integerp (TYPE_MAX_VALUE (TYPE_DOMAIN (type)), 0))
+	  && tree_fits_shwi_p (TYPE_MAX_VALUE (TYPE_DOMAIN (type))))
 	{
 	  char buf[100];
 
 	  snprintf (buf, sizeof buf, HOST_WIDE_INT_PRINT_DEC "+1",
-		    tree_low_cst (TYPE_MAX_VALUE (TYPE_DOMAIN (type)), 0));
+		    tree_to_shwi (TYPE_MAX_VALUE (TYPE_DOMAIN (type))));
 	  obstack_grow (ob, buf, strlen (buf));
 	}
       obstack_1grow (ob, ']');
@@ -981,13 +981,13 @@  go_output_typedef (struct godump_container *container, tree decl)
 	  if (*slot != NULL)
 	    macro_hash_del (*slot);
 
-	  if (host_integerp (TREE_VALUE (element), 0))
+	  if (tree_fits_shwi_p (TREE_VALUE (element)))
 	    snprintf (buf, sizeof buf, HOST_WIDE_INT_PRINT_DEC,
-		     tree_low_cst (TREE_VALUE (element), 0));
-	  else if (host_integerp (TREE_VALUE (element), 1))
+		     tree_to_hwi (TREE_VALUE (element)));
+	  else if (tree_fits_uhwi_p (TREE_VALUE (element)))
 	    snprintf (buf, sizeof buf, HOST_WIDE_INT_PRINT_UNSIGNED,
 		     ((unsigned HOST_WIDE_INT)
-		      tree_low_cst (TREE_VALUE (element), 1)));
+		      tree_to_hwi (TREE_VALUE (element))));
 	  else
 	    snprintf (buf, sizeof buf, HOST_WIDE_INT_PRINT_DOUBLE_HEX,
 		     ((unsigned HOST_WIDE_INT)
diff --git a/gcc/graphite-scop-detection.c b/gcc/graphite-scop-detection.c
index 0ea9e6a..b9b22ab 100644
--- a/gcc/graphite-scop-detection.c
+++ b/gcc/graphite-scop-detection.c
@@ -162,10 +162,10 @@  graphite_can_represent_init (tree e)
     case MULT_EXPR:
       if (chrec_contains_symbols (TREE_OPERAND (e, 0)))
 	return graphite_can_represent_init (TREE_OPERAND (e, 0))
-	  && host_integerp (TREE_OPERAND (e, 1), 0);
+	  && tree_fits_shwi_p (TREE_OPERAND (e, 1));
       else
 	return graphite_can_represent_init (TREE_OPERAND (e, 1))
-	  && host_integerp (TREE_OPERAND (e, 0), 0);
+	  && tree_fits_shwi_p (TREE_OPERAND (e, 0));
 
     case PLUS_EXPR:
     case POINTER_PLUS_EXPR:
diff --git a/gcc/graphite-sese-to-poly.c b/gcc/graphite-sese-to-poly.c
index 3a7b910..d9b99bb 100644
--- a/gcc/graphite-sese-to-poly.c
+++ b/gcc/graphite-sese-to-poly.c
@@ -1521,9 +1521,9 @@  pdr_add_data_dimensions (isl_set *extent, scop_p scop, data_reference_p dr)
          subscript - low >= 0 and high - subscript >= 0 in case one of
 	 the two bounds isn't known.  Do the same here?  */
 
-      if (host_integerp (low, 0)
+      if (tree_fits_shwi_p (low)
 	  && high
-	  && host_integerp (high, 0)
+	  && tree_fits_shwi_p (high)
 	  /* 1-element arrays at end of structures may extend over
 	     their declared size.  */
 	  && !(array_at_struct_end_p (ref)
diff --git a/gcc/ipa-prop.c b/gcc/ipa-prop.c
index fb2346b..633cbc4 100644
--- a/gcc/ipa-prop.c
+++ b/gcc/ipa-prop.c
@@ -222,7 +222,7 @@  ipa_print_node_jump_functions_for_edge (FILE *f, struct cgraph_edge *cs)
 		       item->offset);
 	      if (TYPE_P (item->value))
 		fprintf (f, "clobber of " HOST_WIDE_INT_PRINT_DEC " bits",
-			 tree_low_cst (TYPE_SIZE (item->value), 1));
+			 tree_to_uhwi (TYPE_SIZE (item->value)));
 	      else
 		{
 		  fprintf (f, "cst: ");
@@ -1124,7 +1124,7 @@  type_like_member_ptr_p (tree type, tree *method_ptr, tree *delta)
   fld = TYPE_FIELDS (type);
   if (!fld || !POINTER_TYPE_P (TREE_TYPE (fld))
       || TREE_CODE (TREE_TYPE (TREE_TYPE (fld))) != METHOD_TYPE
-      || !host_integerp (DECL_FIELD_OFFSET (fld), 1))
+      || !tree_fits_uhwi_p (DECL_FIELD_OFFSET (fld)))
     return false;
 
   if (method_ptr)
@@ -1132,7 +1132,7 @@  type_like_member_ptr_p (tree type, tree *method_ptr, tree *delta)
 
   fld = DECL_CHAIN (fld);
   if (!fld || INTEGRAL_TYPE_P (fld)
-      || !host_integerp (DECL_FIELD_OFFSET (fld), 1))
+      || !tree_fits_uhwi_p (DECL_FIELD_OFFSET (fld)))
     return false;
   if (delta)
     *delta = fld;
@@ -1202,13 +1202,13 @@  determine_known_aggregate_parts (gimple call, tree arg,
       if (TREE_CODE (arg) == SSA_NAME)
 	{
 	  tree type_size;
-          if (!host_integerp (TYPE_SIZE (TREE_TYPE (TREE_TYPE (arg))), 1))
+          if (!tree_fits_uhwi_p (TYPE_SIZE (TREE_TYPE (TREE_TYPE (arg)))))
             return;
 	  check_ref = true;
 	  arg_base = arg;
 	  arg_offset = 0;
 	  type_size = TYPE_SIZE (TREE_TYPE (TREE_TYPE (arg)));
-	  arg_size = tree_low_cst (type_size, 1);
+	  arg_size = tree_to_uhwi (type_size);
 	  ao_ref_init_from_ptr_and_size (&r, arg_base, NULL_TREE);
 	}
       else if (TREE_CODE (arg) == ADDR_EXPR)
@@ -1796,7 +1796,7 @@  ipa_analyze_virtual_call_uses (struct cgraph_node *node,
   cs = ipa_note_param_call (node, index, call);
   ii = cs->indirect_info;
   ii->offset = anc_offset;
-  ii->otr_token = tree_low_cst (OBJ_TYPE_REF_TOKEN (target), 1);
+  ii->otr_token = tree_to_uhwi (OBJ_TYPE_REF_TOKEN (target));
   ii->otr_type = TREE_TYPE (TREE_TYPE (OBJ_TYPE_REF_OBJECT (target)));
   ii->polymorphic = 1;
 }
diff --git a/gcc/java/boehm.c b/gcc/java/boehm.c
index 0fa8964..a612c8d 100644
--- a/gcc/java/boehm.c
+++ b/gcc/java/boehm.c
@@ -234,5 +234,5 @@  uses_jv_markobj_p (tree dtable)
      point in asserting unless we hit the bad case.  */
   gcc_assert (!flag_reduced_reflection || TARGET_VTABLE_USES_DESCRIPTORS == 0);
   v = VEC_index (constructor_elt, CONSTRUCTOR_ELTS (dtable), 3).value;
-  return (PROCEDURE_OBJECT_DESCRIPTOR == TREE_INT_CST_LOW (v));
+  return (PROCEDURE_OBJECT_DESCRIPTOR == tree_to_hwi (v));
 }
diff --git a/gcc/java/class.c b/gcc/java/class.c
index a89b831..daff83c 100644
--- a/gcc/java/class.c
+++ b/gcc/java/class.c
@@ -1578,14 +1578,14 @@  get_dispatch_vector (tree type)
       HOST_WIDE_INT i;
       tree method;
       tree super = CLASSTYPE_SUPER (type);
-      HOST_WIDE_INT nvirtuals = tree_low_cst (TYPE_NVIRTUALS (type), 0);
+      HOST_WIDE_INT nvirtuals = tree_to_shwi (TYPE_NVIRTUALS (type));
       vtable = make_tree_vec (nvirtuals);
       TYPE_VTABLE (type) = vtable;
       if (super != NULL_TREE)
 	{
 	  tree super_vtable = get_dispatch_vector (super);
 
-	  for (i = tree_low_cst (TYPE_NVIRTUALS (super), 0); --i >= 0; )
+	  for (i = tree_to_shwi (TYPE_NVIRTUALS (super)); --i >= 0; )
 	    TREE_VEC_ELT (vtable, i) = TREE_VEC_ELT (super_vtable, i);
 	}
 
@@ -1594,8 +1594,8 @@  get_dispatch_vector (tree type)
 	{
 	  tree method_index = get_method_index (method);
 	  if (method_index != NULL_TREE
-	      && host_integerp (method_index, 0))
-	    TREE_VEC_ELT (vtable, tree_low_cst (method_index, 0)) = method;
+	      && tree_fits_shwi_p (method_index))
+	    TREE_VEC_ELT (vtable, tree_to_hwi (method_index)) = method;
 	}
     }
 
diff --git a/gcc/java/expr.c b/gcc/java/expr.c
index 8041cdd..055d3a6 100644
--- a/gcc/java/expr.c
+++ b/gcc/java/expr.c
@@ -1050,8 +1050,8 @@  build_newarray (int atype_value, tree length)
   tree prim_type = decode_newarray_type (atype_value);
   tree type
     = build_java_array_type (prim_type,
-			     host_integerp (length, 0) == INTEGER_CST
-			     ? tree_low_cst (length, 0) : -1);
+			     tree_fits_shwi_p (length)
+			     ? tree_to_hwi (length) : -1);
 
   /* Pass a reference to the primitive type class and save the runtime
      some work.  */
@@ -1070,8 +1070,8 @@  build_anewarray (tree class_type, tree length)
 {
   tree type
     = build_java_array_type (class_type,
-			     host_integerp (length, 0)
-			     ? tree_low_cst (length, 0) : -1);
+			     tree_fits_shwi_p (length)
+			     ? tree_to_hwi (length) : -1);
 
   return build_call_nary (promote_type (type),
 			  build_address_of (soft_anewarray_node),
@@ -2673,7 +2673,7 @@  build_jni_stub (tree method)
      special way, we would do that here.  */
   for (tem = method_args; tem != NULL_TREE; tem = DECL_CHAIN (tem))
     {
-      int arg_bits = TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (tem)));
+      int arg_bits = tree_to_shwi (TYPE_SIZE (TREE_TYPE (tem)));
 #ifdef PARM_BOUNDARY
       arg_bits = (((arg_bits + PARM_BOUNDARY - 1) / PARM_BOUNDARY)
                   * PARM_BOUNDARY);
diff --git a/gcc/java/typeck.c b/gcc/java/typeck.c
index 6755217..94b56e1 100644
--- a/gcc/java/typeck.c
+++ b/gcc/java/typeck.c
@@ -219,7 +219,7 @@  java_array_type_length (tree array_type)
 	{
 	  tree high = TYPE_MAX_VALUE (index_type);
 	  if (TREE_CODE (high) == INTEGER_CST)
-	    return TREE_INT_CST_LOW (high) + 1;
+	    return tree_to_shwi (high) + 1;
 	}
     }
   return -1;
diff --git a/gcc/lto-streamer-out.c b/gcc/lto-streamer-out.c
index afe4951..a7c23ef 100644
--- a/gcc/lto-streamer-out.c
+++ b/gcc/lto-streamer-out.c
@@ -1256,7 +1256,7 @@  write_symbol (struct streamer_tree_cache_d *cache,
   if (kind == GCCPK_COMMON
       && DECL_SIZE_UNIT (t)
       && TREE_CODE (DECL_SIZE_UNIT (t)) == INTEGER_CST)
-    size = TREE_INT_CST_LOW (DECL_SIZE_UNIT (t));
+    size = tree_to_uhwi (DECL_SIZE_UNIT (t));
   else
     size = 0;
 
diff --git a/gcc/lto/lto-lang.c b/gcc/lto/lto-lang.c
index 280d883..6c703be 100644
--- a/gcc/lto/lto-lang.c
+++ b/gcc/lto/lto-lang.c
@@ -317,7 +317,7 @@  get_nonnull_operand (tree arg_num_expr, unsigned HOST_WIDE_INT *valp)
       || TREE_INT_CST_HIGH (arg_num_expr) != 0)
     return false;
 
-  *valp = TREE_INT_CST_LOW (arg_num_expr);
+  *valp = tree_to_uhwi (arg_num_expr);
   return true;
 }
 
diff --git a/gcc/objc/objc-act.c b/gcc/objc/objc-act.c
index caa16c7..c9e3f35 100644
--- a/gcc/objc/objc-act.c
+++ b/gcc/objc/objc-act.c
@@ -3023,8 +3023,8 @@  check_string_class_template (void)
 
 #define AT_LEAST_AS_LARGE_AS(F, T) \
   (F && TREE_CODE (F) == FIELD_DECL \
-     && (TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (F))) \
-	 >= TREE_INT_CST_LOW (TYPE_SIZE (T))))
+     && (tree_to_uhwi (TYPE_SIZE (TREE_TYPE (F))) \
+	 >= tree_to_uhwi (TYPE_SIZE (T))))
 
   if (!AT_LEAST_AS_LARGE_AS (field_decl, ptr_type_node))
     return 0;
@@ -4880,7 +4880,7 @@  objc_decl_method_attributes (tree *node, tree attributes, int flags)
 		    {
 		      TREE_VALUE (second_argument)
 			= build_int_cst (integer_type_node,
-					 TREE_INT_CST_LOW (number) + 2);
+					 tree_to_uhwi (number) + 2);
 		    }
 
 		  /* This is the third argument, the "first-to-check",
@@ -4891,12 +4891,12 @@  objc_decl_method_attributes (tree *node, tree attributes, int flags)
 		  number = TREE_VALUE (third_argument);
 		  if (number
 		      && TREE_CODE (number) == INTEGER_CST
-		      && TREE_INT_CST_HIGH (number) == 0
-		      && TREE_INT_CST_LOW (number) != 0)
+		      && tree_fits_uhwi_p (number)
+		      && tree_to_hwi (number) != 0)
 		    {
 		      TREE_VALUE (third_argument)
 			= build_int_cst (integer_type_node,
-					 TREE_INT_CST_LOW (number) + 2);
+					 tree_to_hwi (number) + 2);
 		    }
 		}
 	      filtered_attributes = chainon (filtered_attributes,
@@ -4931,12 +4931,12 @@  objc_decl_method_attributes (tree *node, tree attributes, int flags)
 		  tree number = TREE_VALUE (argument);
 		  if (number
 		      && TREE_CODE (number) == INTEGER_CST
-		      && TREE_INT_CST_HIGH (number) == 0
-		      && TREE_INT_CST_LOW (number) != 0)
+		      && tree_fits_uhwi_p (number)
+		      && tree_to_hwi (number) != 0)
 		    {
 		      TREE_VALUE (argument)
 			= build_int_cst (integer_type_node,
-					 TREE_INT_CST_LOW (number) + 2);
+					 tree_to_hwi (number) + 2);
 		    }
 		  argument = TREE_CHAIN (argument);
 		}
@@ -8889,7 +8889,7 @@  gen_declaration (tree decl)
       if (DECL_INITIAL (decl)
 	  && TREE_CODE (DECL_INITIAL (decl)) == INTEGER_CST)
 	sprintf (errbuf + strlen (errbuf), ": " HOST_WIDE_INT_PRINT_DEC,
-		 TREE_INT_CST_LOW (DECL_INITIAL (decl)));
+		 tree_to_shwi (DECL_INITIAL (decl)));
     }
 
   return errbuf;
@@ -8929,7 +8929,7 @@  gen_type_name_0 (tree type)
 		char sz[20];
 
 		sprintf (sz, HOST_WIDE_INT_PRINT_DEC,
-			 (TREE_INT_CST_LOW
+			 (tree_to_shwi
 			  (TYPE_MAX_VALUE (TYPE_DOMAIN (type))) + 1));
 		strcat (errbuf, sz);
 	      }
diff --git a/gcc/objc/objc-encoding.c b/gcc/objc/objc-encoding.c
index 8d4a9c7..a0c5533 100644
--- a/gcc/objc/objc-encoding.c
+++ b/gcc/objc/objc-encoding.c
@@ -395,12 +395,12 @@  encode_array (tree type, int curtype, int format)
 	 array.  */
       sprintf (buffer, "[" HOST_WIDE_INT_PRINT_DEC, (HOST_WIDE_INT)0);
     }
-  else if (TREE_INT_CST_LOW (TYPE_SIZE (array_of)) == 0)
+  else if (tree_to_shwi (TYPE_SIZE (array_of)) == 0)
    sprintf (buffer, "[" HOST_WIDE_INT_PRINT_DEC, (HOST_WIDE_INT)0);
   else
     sprintf (buffer, "[" HOST_WIDE_INT_PRINT_DEC,
-	     TREE_INT_CST_LOW (an_int_cst)
-	      / TREE_INT_CST_LOW (TYPE_SIZE (array_of)));
+	     tree_to_shwi (an_int_cst)
+	      / tree_to_shwi (TYPE_SIZE (array_of)));
 
   obstack_grow (&util_obstack, buffer, strlen (buffer));
   encode_type (array_of, curtype, format);
@@ -427,7 +427,7 @@  encode_vector (tree type, int curtype, int format)
   sprintf (buffer, "![" HOST_WIDE_INT_PRINT_DEC ",%d",
 	   /* We want to compute the equivalent of sizeof (<vector>).
 	      Code inspired by c_sizeof_or_alignof_type.  */
-	   ((TREE_INT_CST_LOW (TYPE_SIZE_UNIT (type))
+	   ((tree_to_shwi (TYPE_SIZE_UNIT (type))
 	     / (TYPE_PRECISION (char_type_node) / BITS_PER_UNIT))),
 	   /* We want to compute the equivalent of __alignof__
 	      (<vector>).  Code inspired by
@@ -822,7 +822,7 @@  encode_field (tree field_decl, int curtype, int format)
      between GNU and NeXT runtimes.  */
   if (DECL_BIT_FIELD_TYPE (field_decl))
     {
-      int size = tree_low_cst (DECL_SIZE (field_decl), 1);
+      int size = tree_to_uhwi (DECL_SIZE (field_decl));
 
       if (flag_next_runtime)
 	encode_next_bitfield (size);
diff --git a/gcc/objc/objc-next-runtime-abi-01.c b/gcc/objc/objc-next-runtime-abi-01.c
index cf24591..3017b80 100644
--- a/gcc/objc/objc-next-runtime-abi-01.c
+++ b/gcc/objc/objc-next-runtime-abi-01.c
@@ -1198,7 +1198,7 @@  generate_v1_objc_protocol_extension (tree proto_interface,
     build_v1_objc_protocol_extension_template ();
 
   /* uint32_t size */
-  size = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (objc_protocol_extension_template));
+  size = tree_to_uhwi (TYPE_SIZE_UNIT (objc_protocol_extension_template));
   CONSTRUCTOR_APPEND_ELT (v, NULL_TREE, build_int_cst (NULL_TREE, size));
 
   /* Try for meaningful diagnostics.  */
@@ -1342,7 +1342,7 @@  generate_v1_property_table (tree context, tree klass_ctxt)
 						  is_proto ? context
 							   : klass_ctxt);
 
-  init_val = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (objc_v1_property_template));
+  init_val = tree_to_shwi (TYPE_SIZE_UNIT (objc_v1_property_template));
   if (is_proto)
     snprintf (buf, BUFSIZE, "_OBJC_ProtocolPropList_%s",
 	      IDENTIFIER_POINTER (PROTOCOL_NAME (context)));
@@ -1722,7 +1722,7 @@  build_v1_category_initializer (tree type, tree cat_name, tree class_name,
 
   if (flag_objc_abi >= 1)
     {
-      int val = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (objc_category_template));
+      int val = tree_to_shwi (TYPE_SIZE_UNIT (objc_category_template));
       expr = build_int_cst (NULL_TREE, val);
       CONSTRUCTOR_APPEND_ELT (v, NULL_TREE, expr);
       ltyp = objc_prop_list_ptr;
@@ -1824,7 +1824,7 @@  generate_objc_class_ext (tree property_list, tree context)
     build_objc_class_ext_template ();
 
   /* uint32_t size */
-  size = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (objc_class_ext_template));
+  size = tree_to_shwi (TYPE_SIZE_UNIT (objc_class_ext_template));
   CONSTRUCTOR_APPEND_ELT (v, NULL_TREE, build_int_cst (NULL_TREE, size));
 
   ltyp = const_string_type_node;
diff --git a/gcc/objc/objc-next-runtime-abi-02.c b/gcc/objc/objc-next-runtime-abi-02.c
index cf899d3..29a28af 100644
--- a/gcc/objc/objc-next-runtime-abi-02.c
+++ b/gcc/objc/objc-next-runtime-abi-02.c
@@ -2325,7 +2325,7 @@  generate_v2_meth_descriptor_table (tree chain, tree protocol,
 
   decl = start_var_decl (method_list_template, buf);
 
-  entsize = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (objc_method_template));
+  entsize = tree_to_shwi (TYPE_SIZE_UNIT (objc_method_template));
   CONSTRUCTOR_APPEND_ELT (v, NULL_TREE, build_int_cst (NULL_TREE, entsize));
   CONSTRUCTOR_APPEND_ELT (v, NULL_TREE, build_int_cst (NULL_TREE, size));
   initlist =
@@ -2439,7 +2439,7 @@  generate_v2_property_table (tree context, tree klass_ctxt)
 						  is_proto ? context
 							   : klass_ctxt);
 
-  init_val = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (objc_v2_property_template));
+  init_val = tree_to_shwi (TYPE_SIZE_UNIT (objc_v2_property_template));
   if (is_proto)
     snprintf (buf, BUFSIZE, "_OBJC_ProtocolPropList_%s",
 	      IDENTIFIER_POINTER (PROTOCOL_NAME (context)));
@@ -2514,7 +2514,7 @@  build_v2_protocol_initializer (tree type, tree protocol_name, tree protocol_list
 
   /* const uint32_t size;  = sizeof(struct protocol_t) */
   expr = build_int_cst (integer_type_node,
-	      TREE_INT_CST_LOW (TYPE_SIZE_UNIT (objc_v2_protocol_template)));
+	      tree_to_shwi (TYPE_SIZE_UNIT (objc_v2_protocol_template)));
   CONSTRUCTOR_APPEND_ELT (inits, NULL_TREE, expr);
   /* const uint32_t flags; = 0 */
   CONSTRUCTOR_APPEND_ELT (inits, NULL_TREE, integer_zero_node);
@@ -2628,7 +2628,7 @@  generate_v2_dispatch_table (tree chain, const char *name, tree attr)
 
   decl = start_var_decl  (method_list_template, name);
 
-  init_val = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (objc_method_template));
+  init_val = tree_to_shwi (TYPE_SIZE_UNIT (objc_method_template));
   CONSTRUCTOR_APPEND_ELT (v, NULL_TREE,
 			  build_int_cst (integer_type_node, init_val));
   CONSTRUCTOR_APPEND_ELT (v, NULL_TREE,
@@ -2857,7 +2857,7 @@  build_v2_ivar_list_initializer (tree class_name, tree type, tree field_decl)
 			      build_int_cst (integer_type_node, val));
 
       /* Set size.  */
-      val = TREE_INT_CST_LOW (DECL_SIZE_UNIT (field_decl));
+      val = tree_to_shwi (DECL_SIZE_UNIT (field_decl));
       CONSTRUCTOR_APPEND_ELT (ivar, NULL_TREE,
 			      build_int_cst (integer_type_node, val));
 
@@ -2926,7 +2926,7 @@  generate_v2_ivars_list (tree chain, const char *name, tree attr, tree templ)
 
   initlist = build_v2_ivar_list_initializer (CLASS_NAME (templ),
 					     objc_v2_ivar_template, chain);
-  ivar_t_size = TREE_INT_CST_LOW  (TYPE_SIZE_UNIT (objc_v2_ivar_template));
+  ivar_t_size = tree_to_shwi (TYPE_SIZE_UNIT (objc_v2_ivar_template));
 
   decl = start_var_decl (ivar_list_template, name);
   CONSTRUCTOR_APPEND_ELT (inits, NULL_TREE,
@@ -3184,7 +3184,7 @@  generate_v2_class_structs (struct imp_entry *impent)
 				    buf, meta_clac_meth);
     }
 
-  instanceStart = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (objc_v2_class_template));
+  instanceStart = tree_to_shwi (TYPE_SIZE_UNIT (objc_v2_class_template));
 
   /* Currently there are no class ivars and generation of class
      variables for the root of the inheritance has been removed.  It
@@ -3194,7 +3194,7 @@  generate_v2_class_structs (struct imp_entry *impent)
 
   class_ivars = NULL_TREE;
   /* TODO: Add total size of class variables when implemented. */
-  instanceSize = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (objc_v2_class_template));
+  instanceSize = tree_to_shwi (TYPE_SIZE_UNIT (objc_v2_class_template));
 
   /* So now build the META CLASS structs.  */
   /* static struct class_ro_t  _OBJC_METACLASS_Foo = { ... }; */
@@ -3276,7 +3276,7 @@  generate_v2_class_structs (struct imp_entry *impent)
 
   if (field && TREE_CODE (field) == FIELD_DECL)
     instanceSize = int_byte_position (field) * BITS_PER_UNIT
-		   + tree_low_cst (DECL_SIZE (field), 0);
+		   + tree_to_shwi (DECL_SIZE (field));
   else
     instanceSize = 0;
   instanceSize /= BITS_PER_UNIT;
diff --git a/gcc/omp-low.c b/gcc/omp-low.c
index 4d32fb6..119d895 100644
--- a/gcc/omp-low.c
+++ b/gcc/omp-low.c
@@ -5483,7 +5483,7 @@  expand_omp_atomic (struct omp_region *region)
   HOST_WIDE_INT index;
 
   /* Make sure the type is one of the supported sizes.  */
-  index = tree_low_cst (TYPE_SIZE_UNIT (type), 1);
+  index = tree_to_uhwi (TYPE_SIZE_UNIT (type));
   index = exact_log2 (index);
   if (index >= 0 && index <= 4)
     {
@@ -6222,9 +6222,9 @@  lower_omp_for_lastprivate (struct omp_for_data *fd, gimple_seq *body_p,
 
   /* When possible, use a strict equality expression.  This can let VRP
      type optimizations deduce the value and remove a copy.  */
-  if (host_integerp (fd->loop.step, 0))
+  if (tree_fits_shwi_p (fd->loop.step))
     {
-      HOST_WIDE_INT step = TREE_INT_CST_LOW (fd->loop.step);
+      HOST_WIDE_INT step = tree_to_hwi (fd->loop.step);
       if (step == 1 || step == -1)
 	cond_code = EQ_EXPR;
     }
@@ -6242,7 +6242,7 @@  lower_omp_for_lastprivate (struct omp_for_data *fd, gimple_seq *body_p,
       /* Optimize: v = 0; is usually cheaper than v = some_other_constant.  */
       vinit = fd->loop.n1;
       if (cond_code == EQ_EXPR
-	  && host_integerp (fd->loop.n2, 0)
+	  && tree_fits_shwi_p (fd->loop.n2)
 	  && ! integer_zerop (fd->loop.n2))
 	vinit = build_int_cst (TREE_TYPE (fd->loop.v), 0);
 
diff --git a/gcc/predict.c b/gcc/predict.c
index f0db9f4..349b7ff 100644
--- a/gcc/predict.c
+++ b/gcc/predict.c
@@ -966,15 +966,15 @@  strips_small_constant (tree t1, tree t2)
     return NULL;
   else if (TREE_CODE (t1) == SSA_NAME)
     ret = t1;
-  else if (host_integerp (t1, 0))
-    value = tree_low_cst (t1, 0);
+  else if (tree_fits_shwi_p (t1))
+    value = tree_to_hwi (t1);
   else
     return NULL;
 
   if (!t2)
     return ret;
-  else if (host_integerp (t2, 0))
-    value = tree_low_cst (t2, 0);
+  else if (tree_fits_shwi_p (t2))
+    value = tree_to_hwi (t2);
   else if (TREE_CODE (t2) == SSA_NAME)
     {
       if (ret)
@@ -1070,8 +1070,8 @@  is_comparison_with_loop_invariant_p (gimple stmt, struct loop *loop,
 	code = invert_tree_comparison (code, false);
       bound = iv0.base;
       base = iv1.base;
-      if (host_integerp (iv1.step, 0))
-	step = tree_low_cst (iv1.step, 0);
+      if (tree_fits_shwi_p (iv1.step))
+	step = tree_to_hwi (iv1.step);
       else
 	return false;
     }
@@ -1079,8 +1079,8 @@  is_comparison_with_loop_invariant_p (gimple stmt, struct loop *loop,
     {
       bound = iv1.base;
       base = iv0.base;
-      if (host_integerp (iv0.step, 0))
-	step = tree_low_cst (iv0.step, 0);  
+      if (tree_fits_shwi_p (iv0.step))
+	step = tree_to_hwi (iv0.step);
       else
 	return false;
     }
@@ -1213,15 +1213,15 @@  predict_iv_comparison (struct loop *loop, basic_block bb,
 
   /* If loop bound, base and compare bound are all constants, we can
      calculate the probability directly.  */
-  if (host_integerp (loop_bound_var, 0)
-      && host_integerp (compare_var, 0)
-      && host_integerp (compare_base, 0))
+  if (tree_fits_shwi_p (loop_bound_var)
+      && tree_fits_shwi_p (compare_var)
+      && tree_fits_shwi_p (compare_base))
     {
       int probability;
       HOST_WIDE_INT compare_count;
-      HOST_WIDE_INT loop_bound = tree_low_cst (loop_bound_var, 0);
-      HOST_WIDE_INT compare_bound = tree_low_cst (compare_var, 0);
-      HOST_WIDE_INT base = tree_low_cst (compare_base, 0);
+      HOST_WIDE_INT loop_bound = tree_to_hwi (loop_bound_var);
+      HOST_WIDE_INT compare_bound = tree_to_hwi (compare_var);
+      HOST_WIDE_INT base = tree_to_hwi (compare_base);
       HOST_WIDE_INT loop_count = (loop_bound - base) / compare_step;
 
       if ((compare_step > 0)
@@ -1339,9 +1339,9 @@  predict_loops (void)
 
 	  if (TREE_CODE (niter) == INTEGER_CST)
 	    {
-	      if (host_integerp (niter, 1)
+	      if (tree_fits_uhwi_p (niter)
 		  && compare_tree_int (niter, max-1) == -1)
-		nitercst = tree_low_cst (niter, 1) + 1;
+		nitercst = tree_to_hwi (niter) + 1;
 	      else
 		nitercst = max;
 	      predictor = PRED_LOOP_ITERATIONS;
diff --git a/gcc/rtlanal.c b/gcc/rtlanal.c
index 852a3d7..0951955 100644
--- a/gcc/rtlanal.c
+++ b/gcc/rtlanal.c
@@ -280,8 +280,8 @@  rtx_addr_can_trap_p_1 (const_rtx x, HOST_WIDE_INT offset, HOST_WIDE_INT size,
 	  if (!decl)
 	    decl_size = -1;
 	  else if (DECL_P (decl) && DECL_SIZE_UNIT (decl))
-	    decl_size = (host_integerp (DECL_SIZE_UNIT (decl), 0)
-			 ? tree_low_cst (DECL_SIZE_UNIT (decl), 0)
+	    decl_size = (tree_fits_shwi_p (DECL_SIZE_UNIT (decl))
+			 ? tree_to_hwi (DECL_SIZE_UNIT (decl))
 			 : -1);
 	  else if (TREE_CODE (decl) == STRING_CST)
 	    decl_size = TREE_STRING_LENGTH (decl);
diff --git a/gcc/sdbout.c b/gcc/sdbout.c
index 59892ba..c45797c 100644
--- a/gcc/sdbout.c
+++ b/gcc/sdbout.c
@@ -536,10 +536,10 @@  plain_type_1 (tree type, int level)
 	    = (TYPE_DOMAIN (type)
 	       && TYPE_MIN_VALUE (TYPE_DOMAIN (type)) != 0
 	       && TYPE_MAX_VALUE (TYPE_DOMAIN (type)) != 0
-	       && host_integerp (TYPE_MAX_VALUE (TYPE_DOMAIN (type)), 0)
-	       && host_integerp (TYPE_MIN_VALUE (TYPE_DOMAIN (type)), 0)
-	       ? (tree_low_cst (TYPE_MAX_VALUE (TYPE_DOMAIN (type)), 0)
-		  - tree_low_cst (TYPE_MIN_VALUE (TYPE_DOMAIN (type)), 0) + 1)
+	       && tree_fits_shwi_p (TYPE_MAX_VALUE (TYPE_DOMAIN (type)))
+	       && tree_fits_shwi_p (TYPE_MIN_VALUE (TYPE_DOMAIN (type)))
+	       ? (tree_to_hwi (TYPE_MAX_VALUE (TYPE_DOMAIN (type)))
+		  - tree_to_hwi (TYPE_MIN_VALUE (TYPE_DOMAIN (type))) + 1)
 	       : 0);
 
 	return PUSH_DERIVED_LEVEL (DT_ARY, m);
@@ -995,8 +995,8 @@  sdbout_field_types (tree type)
     if (TREE_CODE (tail) == FIELD_DECL
 	&& DECL_NAME (tail)
 	&& DECL_SIZE (tail)
-	&& host_integerp (DECL_SIZE (tail), 1)
-	&& host_integerp (bit_position (tail), 0))
+	&& tree_fits_uhwi_p (DECL_SIZE (tail))
+	&& tree_fits_shwi_p (bit_position (tail)))
       {
 	if (POINTER_TYPE_P (TREE_TYPE (tail)))
 	  sdbout_one_type (TREE_TYPE (TREE_TYPE (tail)));
@@ -1135,7 +1135,7 @@  sdbout_one_type (tree type)
 		  continue;
 
 		PUT_SDB_DEF (IDENTIFIER_POINTER (child_type_name));
-		PUT_SDB_INT_VAL (tree_low_cst (BINFO_OFFSET (child), 0));
+		PUT_SDB_INT_VAL (tree_to_shwi (BINFO_OFFSET (child)));
 		PUT_SDB_SCL (member_scl);
 		sdbout_type (BINFO_TYPE (child));
 		PUT_SDB_ENDEF;
@@ -1153,10 +1153,10 @@  sdbout_one_type (tree type)
 	        if (TREE_CODE (value) == CONST_DECL)
 	          value = DECL_INITIAL (value);
 
-	        if (host_integerp (value, 0))
+	        if (tree_fits_shwi_p (value))
 		  {
 		    PUT_SDB_DEF (IDENTIFIER_POINTER (TREE_PURPOSE (tem)));
-		    PUT_SDB_INT_VAL (tree_low_cst (value, 0));
+		    PUT_SDB_INT_VAL (tree_to_hwi (value));
 		    PUT_SDB_SCL (C_MOE);
 		    PUT_SDB_TYPE (T_MOE);
 		    PUT_SDB_ENDEF;
@@ -1174,8 +1174,8 @@  sdbout_one_type (tree type)
 	    if (TREE_CODE (tem) == FIELD_DECL
 		&& DECL_NAME (tem)
 		&& DECL_SIZE (tem)
-		&& host_integerp (DECL_SIZE (tem), 1)
-		&& host_integerp (bit_position (tem), 0))
+		&& tree_fits_uhwi_p (DECL_SIZE (tem))
+		&& tree_fits_shwi_p (bit_position (tem)))
 	      {
 		const char *name;
 
@@ -1186,7 +1186,7 @@  sdbout_one_type (tree type)
 		    PUT_SDB_INT_VAL (int_bit_position (tem));
 		    PUT_SDB_SCL (C_FIELD);
 		    sdbout_type (DECL_BIT_FIELD_TYPE (tem));
-		    PUT_SDB_SIZE (tree_low_cst (DECL_SIZE (tem), 1));
+		    PUT_SDB_SIZE (tree_to_hwi (DECL_SIZE (tem)));
 		  }
 		else
 		  {
diff --git a/gcc/simplify-rtx.c b/gcc/simplify-rtx.c
index 3a0d643..6485ec5 100644
--- a/gcc/simplify-rtx.c
+++ b/gcc/simplify-rtx.c
@@ -316,13 +316,13 @@  delegitimize_mem_from_attrs (rtx x)
 					&mode, &unsignedp, &volatilep, false);
 	    if (bitsize != GET_MODE_BITSIZE (mode)
 		|| (bitpos % BITS_PER_UNIT)
-		|| (toffset && !host_integerp (toffset, 0)))
+		|| (toffset && !tree_fits_shwi_p (toffset)))
 	      decl = NULL;
 	    else
 	      {
 		offset += bitpos / BITS_PER_UNIT;
 		if (toffset)
-		  offset += TREE_INT_CST_LOW (toffset);
+		  offset += tree_to_shwi (toffset);
 	      }
 	    break;
 	  }
diff --git a/gcc/stmt.c b/gcc/stmt.c
index fb3323e..447fde4 100644
--- a/gcc/stmt.c
+++ b/gcc/stmt.c
@@ -1695,8 +1695,8 @@  dump_case_nodes (FILE *f, struct case_node *root,
 
   dump_case_nodes (f, root->left, indent_step, indent_level);
 
-  low = tree_low_cst (root->low, 0);
-  high = tree_low_cst (root->high, 0);
+  low = tree_to_shwi (root->low);
+  high = tree_to_shwi (root->high);
 
   fputs (";; ", f);
   if (high == low)
@@ -1769,7 +1769,7 @@  expand_switch_as_decision_tree_p (tree range,
      who knows...  */
   max_ratio = optimize_insn_for_size_p () ? 3 : 10;
   if (count < case_values_threshold ()
-      || ! host_integerp (range, /*pos=*/1)
+      || ! tree_fits_uhwi_p (range)
       || compare_tree_int (range, max_ratio * count) > 0)
     return true;
 
@@ -1891,7 +1891,7 @@  emit_case_dispatch_table (tree index_expr, tree index_type,
 
   /* Get table of labels to jump to, in order of case index.  */
 
-  ncases = tree_low_cst (range, 0) + 1;
+  ncases = tree_to_shwi (range) + 1;
   labelvec = XALLOCAVEC (rtx, ncases);
   memset (labelvec, 0, ncases * sizeof (rtx));
 
@@ -1901,11 +1901,11 @@  emit_case_dispatch_table (tree index_expr, tree index_type,
 	 value since that should fit in a HOST_WIDE_INT while the
 	 actual values may not.  */
       HOST_WIDE_INT i_low
-	= tree_low_cst (fold_build2 (MINUS_EXPR, index_type,
-				     n->low, minval), 1);
+	= tree_to_uhwi (fold_build2 (MINUS_EXPR, index_type,
+				     n->low, minval));
       HOST_WIDE_INT i_high
-	= tree_low_cst (fold_build2 (MINUS_EXPR, index_type,
-				     n->high, minval), 1);
+	= tree_to_uhwi (fold_build2 (MINUS_EXPR, index_type,
+				     n->high, minval));
       HOST_WIDE_INT i;
 
       for (i = i_low; i <= i_high; i ++)
diff --git a/gcc/stor-layout.c b/gcc/stor-layout.c
index 674f888..626d1fd 100644
--- a/gcc/stor-layout.c
+++ b/gcc/stor-layout.c
@@ -357,9 +357,9 @@  mode_for_size_tree (const_tree size, enum mode_class mclass, int limit)
   unsigned HOST_WIDE_INT uhwi;
   unsigned int ui;
 
-  if (!host_integerp (size, 1))
+  if (!tree_fits_uhwi_p (size))
     return BLKmode;
-  uhwi = tree_low_cst (size, 1);
+  uhwi = tree_to_hwi (size);
   ui = uhwi;
   if (uhwi != ui)
     return BLKmode;
@@ -495,10 +495,10 @@  mode_for_array (tree elem_type, tree size)
     return TYPE_MODE (elem_type);
 
   limit_p = true;
-  if (host_integerp (size, 1) && host_integerp (elem_size, 1))
+  if (tree_fits_uhwi_p (size) && tree_fits_uhwi_p (elem_size))
     {
-      int_size = tree_low_cst (size, 1);
-      int_elem_size = tree_low_cst (elem_size, 1);
+      int_size = tree_to_hwi (size);
+      int_elem_size = tree_to_hwi (elem_size);
       if (int_elem_size > 0
 	  && int_size % int_elem_size == 0
 	  && targetm.array_mode_supported_p (TYPE_MODE (elem_type),
@@ -704,7 +704,7 @@  layout_decl (tree decl, unsigned int known_align)
       if (size != 0 && TREE_CODE (size) == INTEGER_CST
 	  && compare_tree_int (size, larger_than_size) > 0)
 	{
-	  int size_as_int = TREE_INT_CST_LOW (size);
+	  int size_as_int = tree_to_shwi (size);
 
 	  if (compare_tree_int (size, size_as_int) == 0)
 	    warning (OPT_Wlarger_than_, "size of %q+D is %d bytes", decl, size_as_int);
@@ -1069,7 +1069,7 @@  excess_unit_span (HOST_WIDE_INT byte_offset, HOST_WIDE_INT bit_offset,
 
   offset = offset % align;
   return ((offset + size + align - 1) / align
-	  > ((unsigned HOST_WIDE_INT) tree_low_cst (TYPE_SIZE (type), 1)
+	  > ((unsigned HOST_WIDE_INT) tree_to_uhwi (TYPE_SIZE (type))
 	     / align));
 }
 #endif
@@ -1129,14 +1129,14 @@  place_field (record_layout_info rli, tree field)
   /* Work out the known alignment so far.  Note that A & (-A) is the
      value of the least-significant bit in A that is one.  */
   if (! integer_zerop (rli->bitpos))
-    known_align = (tree_low_cst (rli->bitpos, 1)
-		   & - tree_low_cst (rli->bitpos, 1));
+    known_align = (tree_to_uhwi (rli->bitpos)
+		   & - tree_to_uhwi (rli->bitpos));
   else if (integer_zerop (rli->offset))
     known_align = 0;
-  else if (host_integerp (rli->offset, 1))
+  else if (tree_fits_uhwi_p (rli->offset))
     known_align = (BITS_PER_UNIT
-		   * (tree_low_cst (rli->offset, 1)
-		      & - tree_low_cst (rli->offset, 1)));
+		   * (tree_to_hwi (rli->offset)
+		      & - tree_to_hwi (rli->offset)));
   else
     known_align = rli->offset_align;
 
@@ -1210,15 +1210,18 @@  place_field (record_layout_info rli, tree field)
 	  || TYPE_ALIGN (type) <= BITS_PER_UNIT)
       && maximum_field_alignment == 0
       && ! integer_zerop (DECL_SIZE (field))
-      && host_integerp (DECL_SIZE (field), 1)
-      && host_integerp (rli->offset, 1)
-      && host_integerp (TYPE_SIZE (type), 1))
+      && tree_fits_uhwi_p (DECL_SIZE (field))
+      && tree_fits_uhwi_p (rli->offset)
+      && tree_fits_uhwi_p (TYPE_SIZE (type)))
     {
       unsigned int type_align = TYPE_ALIGN (type);
       tree dsize = DECL_SIZE (field);
-      HOST_WIDE_INT field_size = tree_low_cst (dsize, 1);
-      HOST_WIDE_INT offset = tree_low_cst (rli->offset, 0);
-      HOST_WIDE_INT bit_offset = tree_low_cst (rli->bitpos, 0);
+      HOST_WIDE_INT field_size = tree_to_uhwi (dsize);
+      /* THE NEXT LINE IS POSSIBLY WRONG.  IT DOES NOT MATCH THE
+	 PREDICATE ABOVE SINCE THE ABOVE ONE DOES AN UNSIGNED CHECK
+	 AND THIS IS A SIGNED ACCESS.  */
+      HOST_WIDE_INT offset = tree_to_shwi (rli->offset);
+      HOST_WIDE_INT bit_offset = tree_to_shwi (rli->bitpos);
 
 #ifdef ADJUST_FIELD_ALIGN
       if (! TYPE_USER_ALIGN (type))
@@ -1254,15 +1257,18 @@  place_field (record_layout_info rli, tree field)
       && DECL_BIT_FIELD_TYPE (field)
       && ! DECL_PACKED (field)
       && ! integer_zerop (DECL_SIZE (field))
-      && host_integerp (DECL_SIZE (field), 1)
-      && host_integerp (rli->offset, 1)
-      && host_integerp (TYPE_SIZE (type), 1))
+      && tree_fits_uhwi_p (DECL_SIZE (field))
+      && tree_fits_uhwi_p (rli->offset)
+      && tree_fits_uhwi_p (TYPE_SIZE (type)))
     {
       unsigned int type_align = TYPE_ALIGN (type);
       tree dsize = DECL_SIZE (field);
-      HOST_WIDE_INT field_size = tree_low_cst (dsize, 1);
-      HOST_WIDE_INT offset = tree_low_cst (rli->offset, 0);
-      HOST_WIDE_INT bit_offset = tree_low_cst (rli->bitpos, 0);
+      HOST_WIDE_INT field_size = tree_to_hwi (dsize);
+      /* THE NEXT LINE IS POSSIBLY WRONG.  IT DOES NOT MATCH THE
+	 PREDICATE ABOVE SINCE THE ABOVE ONE DOES AN UNSIGNED CHECK
+	 AND THIS IS A SIGNED ACCESS.  */
+      HOST_WIDE_INT offset = tree_to_hwi (rli->offset);
+      HOST_WIDE_INT bit_offset = tree_to_shwi (rli->bitpos);
 
 #ifdef ADJUST_FIELD_ALIGN
       if (! TYPE_USER_ALIGN (type))
@@ -1316,18 +1322,21 @@  place_field (record_layout_info rli, tree field)
 	  if (DECL_BIT_FIELD_TYPE (field)
 	      && !integer_zerop (DECL_SIZE (field))
 	      && !integer_zerop (DECL_SIZE (rli->prev_field))
-	      && host_integerp (DECL_SIZE (rli->prev_field), 0)
-	      && host_integerp (TYPE_SIZE (type), 0)
+	      && tree_fits_shwi_p (DECL_SIZE (rli->prev_field))
+	      && tree_fits_shwi_p (TYPE_SIZE (type))
 	      && simple_cst_equal (TYPE_SIZE (type), TYPE_SIZE (prev_type)))
 	    {
 	      /* We're in the middle of a run of equal type size fields; make
 		 sure we realign if we run out of bits.  (Not decl size,
 		 type size!) */
-	      HOST_WIDE_INT bitsize = tree_low_cst (DECL_SIZE (field), 1);
+	      HOST_WIDE_INT bitsize = tree_to_uhwi (DECL_SIZE (field));
 
 	      if (rli->remaining_in_alignment < bitsize)
 		{
-		  HOST_WIDE_INT typesize = tree_low_cst (TYPE_SIZE (type), 1);
+		  /* THE NEXT LINE IS POSSIBLY WRONG.  IT DOES NOT MATCH THE
+		     PREDICATE ABOVE SINCE THE ABOVE ONE DOES A SIGNED CHECK
+		     AND THIS IS AN UNSIGNED ACCESS.  */
+		  HOST_WIDE_INT typesize = tree_to_uhwi (TYPE_SIZE (type));
 
 		  /* out of bits; bump up to next 'word'.  */
 		  rli->bitpos
@@ -1399,13 +1408,13 @@  place_field (record_layout_info rli, tree field)
 	     until we see a bitfield (and come by here again) we just skip
 	     calculating it.  */
 	  if (DECL_SIZE (field) != NULL
-	      && host_integerp (TYPE_SIZE (TREE_TYPE (field)), 1)
-	      && host_integerp (DECL_SIZE (field), 1))
+	      && tree_fits_uhwi_p (TYPE_SIZE (TREE_TYPE (field)))
+	      && tree_fits_uhwi_p (DECL_SIZE (field)))
 	    {
 	      unsigned HOST_WIDE_INT bitsize
-		= tree_low_cst (DECL_SIZE (field), 1);
+		= tree_to_hwi (DECL_SIZE (field));
 	      unsigned HOST_WIDE_INT typesize
-		= tree_low_cst (TYPE_SIZE (TREE_TYPE (field)), 1);
+		= tree_to_hwi (TYPE_SIZE (TREE_TYPE (field)));
 
 	      if (typesize < bitsize)
 		rli->remaining_in_alignment = 0;
@@ -1437,14 +1446,14 @@  place_field (record_layout_info rli, tree field)
      approximate this by seeing if its position changed), lay out the field
      again; perhaps we can use an integral mode for it now.  */
   if (! integer_zerop (DECL_FIELD_BIT_OFFSET (field)))
-    actual_align = (tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 1)
-		    & - tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 1));
+    actual_align = (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
+		    & - tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field)));
   else if (integer_zerop (DECL_FIELD_OFFSET (field)))
     actual_align = MAX (BIGGEST_ALIGNMENT, rli->record_align);
-  else if (host_integerp (DECL_FIELD_OFFSET (field), 1))
+  else if (tree_fits_uhwi_p (DECL_FIELD_OFFSET (field)))
     actual_align = (BITS_PER_UNIT
-		   * (tree_low_cst (DECL_FIELD_OFFSET (field), 1)
-		      & - tree_low_cst (DECL_FIELD_OFFSET (field), 1)));
+		   * (tree_to_hwi (DECL_FIELD_OFFSET (field))
+		      & - tree_to_hwi (DECL_FIELD_OFFSET (field))));
   else
     actual_align = DECL_OFFSET_ALIGN (field);
   /* ACTUAL_ALIGN is still the actual alignment *within the record* .
@@ -1600,7 +1609,7 @@  compute_record_mode (tree type)
      line.  */
   SET_TYPE_MODE (type, BLKmode);
 
-  if (! host_integerp (TYPE_SIZE (type), 1))
+  if (! tree_fits_uhwi_p (TYPE_SIZE (type)))
     return;
 
   /* A record which has any BLKmode members must itself be
@@ -1616,9 +1625,9 @@  compute_record_mode (tree type)
 	      && ! TYPE_NO_FORCE_BLK (TREE_TYPE (field))
 	      && !(TYPE_SIZE (TREE_TYPE (field)) != 0
 		   && integer_zerop (TYPE_SIZE (TREE_TYPE (field)))))
-	  || ! host_integerp (bit_position (field), 1)
+	  || ! tree_fits_uhwi_p (bit_position (field))
 	  || DECL_SIZE (field) == 0
-	  || ! host_integerp (DECL_SIZE (field), 1))
+	  || ! tree_fits_uhwi_p (DECL_SIZE (field)))
 	return;
 
       /* If this field is the whole struct, remember its mode so
@@ -1637,8 +1646,8 @@  compute_record_mode (tree type)
      matches the type's size.  This only applies to RECORD_TYPE.  This
      does not apply to unions.  */
   if (TREE_CODE (type) == RECORD_TYPE && mode != VOIDmode
-      && host_integerp (TYPE_SIZE (type), 1)
-      && GET_MODE_BITSIZE (mode) == TREE_INT_CST_LOW (TYPE_SIZE (type)))
+      && tree_fits_uhwi_p (TYPE_SIZE (type))
+      && GET_MODE_BITSIZE (mode) == tree_to_hwi (TYPE_SIZE (type)))
     SET_TYPE_MODE (type, mode);
   else
     SET_TYPE_MODE (type, mode_for_size_tree (TYPE_SIZE (type), MODE_INT, 1));
@@ -1779,11 +1788,11 @@  finish_bitfield_representative (tree repr, tree field)
 
   size = size_diffop (DECL_FIELD_OFFSET (field),
 		      DECL_FIELD_OFFSET (repr));
-  gcc_assert (host_integerp (size, 1));
-  bitsize = (tree_low_cst (size, 1) * BITS_PER_UNIT
-	     + tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 1)
-	     - tree_low_cst (DECL_FIELD_BIT_OFFSET (repr), 1)
-	     + tree_low_cst (DECL_SIZE (field), 1));
+  gcc_assert (tree_fits_uhwi_p (size));
+  bitsize = (tree_to_hwi (size) * BITS_PER_UNIT
+	     + tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
+	     - tree_to_uhwi (DECL_FIELD_BIT_OFFSET (repr))
+	     + tree_to_uhwi (DECL_SIZE (field)));
 
   /* Round up bitsize to multiples of BITS_PER_UNIT.  */
   bitsize = (bitsize + BITS_PER_UNIT - 1) & ~(BITS_PER_UNIT - 1);
@@ -1801,11 +1810,11 @@  finish_bitfield_representative (tree repr, tree field)
 	return;
       maxsize = size_diffop (DECL_FIELD_OFFSET (nextf),
 			     DECL_FIELD_OFFSET (repr));
-      if (host_integerp (maxsize, 1))
+      if (tree_fits_uhwi_p (maxsize))
 	{
-	  maxbitsize = (tree_low_cst (maxsize, 1) * BITS_PER_UNIT
-			+ tree_low_cst (DECL_FIELD_BIT_OFFSET (nextf), 1)
-			- tree_low_cst (DECL_FIELD_BIT_OFFSET (repr), 1));
+	  maxbitsize = (tree_to_hwi (maxsize) * BITS_PER_UNIT
+			+ tree_to_uhwi (DECL_FIELD_BIT_OFFSET (nextf))
+			- tree_to_uhwi (DECL_FIELD_BIT_OFFSET (repr)));
 	  /* If the group ends within a bitfield nextf does not need to be
 	     aligned to BITS_PER_UNIT.  Thus round up.  */
 	  maxbitsize = (maxbitsize + BITS_PER_UNIT - 1) & ~(BITS_PER_UNIT - 1);
@@ -1822,9 +1831,9 @@  finish_bitfield_representative (tree repr, tree field)
 	 use bitsize as fallback for this case.  */
       tree maxsize = size_diffop (TYPE_SIZE_UNIT (DECL_CONTEXT (field)),
 				  DECL_FIELD_OFFSET (repr));
-      if (host_integerp (maxsize, 1))
-	maxbitsize = (tree_low_cst (maxsize, 1) * BITS_PER_UNIT
-		      - tree_low_cst (DECL_FIELD_BIT_OFFSET (repr), 1));
+      if (tree_fits_uhwi_p (maxsize))
+	maxbitsize = (tree_to_hwi (maxsize) * BITS_PER_UNIT
+		      - tree_to_uhwi (DECL_FIELD_BIT_OFFSET (repr)));
       else
 	maxbitsize = bitsize;
     }
@@ -1935,8 +1944,8 @@  finish_bitfield_layout (record_layout_info rli)
 	     representative to be generated.  That will at most
 	     generate worse code but still maintain correctness with
 	     respect to the C++ memory model.  */
-	  else if (!((host_integerp (DECL_FIELD_OFFSET (repr), 1)
-		      && host_integerp (DECL_FIELD_OFFSET (field), 1))
+	  else if (!((tree_fits_uhwi_p (DECL_FIELD_OFFSET (repr))
+		      && tree_fits_uhwi_p (DECL_FIELD_OFFSET (field)))
 		     || operand_equal_p (DECL_FIELD_OFFSET (repr),
 					 DECL_FIELD_OFFSET (field), 0)))
 	    {
diff --git a/gcc/targhooks.c b/gcc/targhooks.c
index 265fc98..0d7f3e4 100644
--- a/gcc/targhooks.c
+++ b/gcc/targhooks.c
@@ -949,7 +949,7 @@  tree default_mangle_decl_assembler_name (tree decl ATTRIBUTE_UNUSED,
 HOST_WIDE_INT
 default_vector_alignment (const_tree type)
 {
-  return tree_low_cst (TYPE_SIZE (type), 0);
+  return tree_to_shwi (TYPE_SIZE (type));
 }
 
 bool
diff --git a/gcc/testsuite/gcc.dg/20020219-1.c b/gcc/testsuite/gcc.dg/20020219-1.c
index ffdf19a..1e000e6 100644
--- a/gcc/testsuite/gcc.dg/20020219-1.c
+++ b/gcc/testsuite/gcc.dg/20020219-1.c
@@ -1,5 +1,5 @@ 
 /* PR c/4389
-   This testcase failed because host_integerp (x, 0) was returning
+   This testcase failed because tree_fits_shwi_p (x) was returning
    1 even for constants bigger than 2^31.  It fails under hppa
    hpux without -mdisable-indexing because the pointer x - 1 is used
    as the base address of an indexed load.  Because the struct A is not
diff --git a/gcc/trans-mem.c b/gcc/trans-mem.c
index ef384ac..8aa708a 100644
--- a/gcc/trans-mem.c
+++ b/gcc/trans-mem.c
@@ -1010,8 +1010,8 @@  tm_log_add (basic_block entry_block, tree addr, gimple stmt)
       if (entry_block
 	  && transaction_invariant_address_p (lp->addr, entry_block)
 	  && TYPE_SIZE_UNIT (type) != NULL
-	  && host_integerp (TYPE_SIZE_UNIT (type), 1)
-	  && (tree_low_cst (TYPE_SIZE_UNIT (type), 1)
+	  && tree_fits_uhwi_p (TYPE_SIZE_UNIT (type))
+	  && (tree_to_hwi (TYPE_SIZE_UNIT (type))
 	      < PARAM_VALUE (PARAM_TM_MAX_AGGREGATE_SIZE))
 	  /* We must be able to copy this type normally.  I.e., no
 	     special constructors and the like.  */
@@ -1094,9 +1094,9 @@  tm_log_emit_stmt (tree addr, gimple stmt)
     code = BUILT_IN_TM_LOG_DOUBLE;
   else if (type == long_double_type_node)
     code = BUILT_IN_TM_LOG_LDOUBLE;
-  else if (host_integerp (size, 1))
+  else if (tree_fits_uhwi_p (size))
     {
-      unsigned int n = tree_low_cst (size, 1);
+      unsigned int n = tree_to_hwi (size);
       switch (n)
 	{
 	case 1:
@@ -2027,9 +2027,9 @@  build_tm_load (location_t loc, tree lhs, tree rhs, gimple_stmt_iterator *gsi)
   else if (type == long_double_type_node)
     code = BUILT_IN_TM_LOAD_LDOUBLE;
   else if (TYPE_SIZE_UNIT (type) != NULL
-	   && host_integerp (TYPE_SIZE_UNIT (type), 1))
+	   && tree_fits_uhwi_p (TYPE_SIZE_UNIT (type)))
     {
-      switch (tree_low_cst (TYPE_SIZE_UNIT (type), 1))
+      switch (tree_to_hwi (TYPE_SIZE_UNIT (type)))
 	{
 	case 1:
 	  code = BUILT_IN_TM_LOAD_1;
@@ -2099,9 +2099,9 @@  build_tm_store (location_t loc, tree lhs, tree rhs, gimple_stmt_iterator *gsi)
   else if (type == long_double_type_node)
     code = BUILT_IN_TM_STORE_LDOUBLE;
   else if (TYPE_SIZE_UNIT (type) != NULL
-	   && host_integerp (TYPE_SIZE_UNIT (type), 1))
+	   && tree_fits_uhwi_p (TYPE_SIZE_UNIT (type)))
     {
-      switch (tree_low_cst (TYPE_SIZE_UNIT (type), 1))
+      switch (tree_to_hwi (TYPE_SIZE_UNIT (type)))
 	{
 	case 1:
 	  code = BUILT_IN_TM_STORE_1;
diff --git a/gcc/tree-cfg.c b/gcc/tree-cfg.c
index af277b7..d175405 100644
--- a/gcc/tree-cfg.c
+++ b/gcc/tree-cfg.c
@@ -2775,15 +2775,15 @@  verify_expr (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
 	    }
 	  else if (TREE_CODE (t) == BIT_FIELD_REF)
 	    {
-	      if (!host_integerp (TREE_OPERAND (t, 1), 1)
-		  || !host_integerp (TREE_OPERAND (t, 2), 1))
+	      if (!tree_fits_uhwi_p (TREE_OPERAND (t, 1))
+		  || !tree_fits_uhwi_p (TREE_OPERAND (t, 2)))
 		{
 		  error ("invalid position or size operand to BIT_FIELD_REF");
 		  return t;
 		}
 	      if (INTEGRAL_TYPE_P (TREE_TYPE (t))
 		  && (TYPE_PRECISION (TREE_TYPE (t))
-		      != TREE_INT_CST_LOW (TREE_OPERAND (t, 1))))
+		      != tree_to_hwi (TREE_OPERAND (t, 1))))
 		{
 		  error ("integral result type precision does not match "
 			 "field size of BIT_FIELD_REF");
@@ -2793,7 +2793,7 @@  verify_expr (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
 		       && !AGGREGATE_TYPE_P (TREE_TYPE (t))
 		       && TYPE_MODE (TREE_TYPE (t)) != BLKmode
 		       && (GET_MODE_PRECISION (TYPE_MODE (TREE_TYPE (t)))
-			   != TREE_INT_CST_LOW (TREE_OPERAND (t, 1))))
+			   != tree_to_uhwi (TREE_OPERAND (t, 1))))
 		{
 		  error ("mode precision of non-integral result does not "
 			 "match field size of BIT_FIELD_REF");
@@ -6154,7 +6154,7 @@  move_stmt_eh_region_tree_nr (tree old_t_nr, struct move_stmt_d *p)
 {
   int old_nr, new_nr;
 
-  old_nr = tree_low_cst (old_t_nr, 0);
+  old_nr = tree_to_shwi (old_t_nr);
   new_nr = move_stmt_eh_region_nr (old_nr, p);
 
   return build_int_cst (integer_type_node, new_nr);
diff --git a/gcc/tree-data-ref.c b/gcc/tree-data-ref.c
index 0d647d7..b606210 100644
--- a/gcc/tree-data-ref.c
+++ b/gcc/tree-data-ref.c
@@ -2794,16 +2794,16 @@  gcd_of_steps_may_divide_p (const_tree chrec, const_tree cst)
   HOST_WIDE_INT cd = 0, val;
   tree step;
 
-  if (!host_integerp (cst, 0))
+  if (!tree_fits_shwi_p (cst))
     return true;
-  val = tree_low_cst (cst, 0);
+  val = tree_to_shwi (cst);
 
   while (TREE_CODE (chrec) == POLYNOMIAL_CHREC)
     {
       step = CHREC_RIGHT (chrec);
-      if (!host_integerp (step, 0))
+      if (!tree_fits_shwi_p (step))
 	return true;
-      cd = gcd (cd, tree_low_cst (step, 0));
+      cd = gcd (cd, tree_to_shwi (step));
       chrec = CHREC_LEFT (chrec);
     }
 
diff --git a/gcc/tree-dfa.c b/gcc/tree-dfa.c
index 423923f..9be06aa 100644
--- a/gcc/tree-dfa.c
+++ b/gcc/tree-dfa.c
@@ -404,10 +404,10 @@  get_ref_base_and_extent (tree exp, HOST_WIDE_INT *poffset,
     }
   if (size_tree != NULL_TREE)
     {
-      if (! host_integerp (size_tree, 1))
+      if (! tree_fits_uhwi_p (size_tree))
 	bitsize = -1;
       else
-	bitsize = TREE_INT_CST_LOW (size_tree);
+	bitsize = tree_to_hwi (size_tree);
     }
 
   /* Initially, maxsize is the same as the accessed element size.
@@ -455,11 +455,11 @@  get_ref_base_and_extent (tree exp, HOST_WIDE_INT *poffset,
 		      {
 			tree fsize = DECL_SIZE_UNIT (field);
 			tree ssize = TYPE_SIZE_UNIT (stype);
-			if (host_integerp (fsize, 0)
-			    && host_integerp (ssize, 0)
+			if (tree_fits_shwi_p (fsize)
+			    && tree_fits_shwi_p (ssize)
 			    && doffset.fits_shwi ())
-			  maxsize += ((TREE_INT_CST_LOW (ssize)
-				       - TREE_INT_CST_LOW (fsize))
+			  maxsize += ((tree_to_hwi (ssize)
+				       - tree_to_hwi (fsize))
 				      * BITS_PER_UNIT
 					- doffset.to_shwi ());
 			else
@@ -475,9 +475,9 @@  get_ref_base_and_extent (tree exp, HOST_WIDE_INT *poffset,
 		   because that would get us out of the structure otherwise.  */
 		if (maxsize != -1
 		    && csize
-		    && host_integerp (csize, 1)
+		    && tree_fits_uhwi_p (csize)
 		    && bit_offset.fits_shwi ())
-		  maxsize = TREE_INT_CST_LOW (csize)
+		  maxsize = tree_to_hwi (csize)
 			    - bit_offset.to_shwi ();
 		else
 		  maxsize = -1;
@@ -520,9 +520,9 @@  get_ref_base_and_extent (tree exp, HOST_WIDE_INT *poffset,
 		   because that would get us outside of the array otherwise.  */
 		if (maxsize != -1
 		    && asize
-		    && host_integerp (asize, 1)
+		    && tree_fits_uhwi_p (asize)
 		    && bit_offset.fits_shwi ())
-		  maxsize = TREE_INT_CST_LOW (asize)
+		  maxsize = tree_to_hwi (asize)
 			    - bit_offset.to_shwi ();
 		else
 		  maxsize = -1;
@@ -629,9 +629,9 @@  get_ref_base_and_extent (tree exp, HOST_WIDE_INT *poffset,
 
   if (seen_variable_array_ref
       && maxsize != -1
-      && (!host_integerp (TYPE_SIZE (base_type), 1)
+      && (!tree_fits_uhwi_p (TYPE_SIZE (base_type))
 	  || (hbit_offset + maxsize
-	      == (signed) TREE_INT_CST_LOW (TYPE_SIZE (base_type)))))
+	      == (signed) tree_to_hwi (TYPE_SIZE (base_type)))))
     maxsize = -1;
 
   /* In case of a decl or constant base object we can do better.  */
@@ -641,16 +641,17 @@  get_ref_base_and_extent (tree exp, HOST_WIDE_INT *poffset,
       /* If maxsize is unknown adjust it according to the size of the
          base decl.  */
       if (maxsize == -1
-	  && host_integerp (DECL_SIZE (exp), 1))
-	maxsize = TREE_INT_CST_LOW (DECL_SIZE (exp)) - hbit_offset;
+	  && tree_fits_uhwi_p (DECL_SIZE (exp)))
+	maxsize = tree_to_hwi (DECL_SIZE (exp)) - hbit_offset;
     }
   else if (CONSTANT_CLASS_P (exp))
     {
       /* If maxsize is unknown adjust it according to the size of the
          base type constant.  */
       if (maxsize == -1
-	  && host_integerp (TYPE_SIZE (TREE_TYPE (exp)), 1))
-	maxsize = TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (exp))) - hbit_offset;
+	  && tree_fits_uhwi_p (TYPE_SIZE (TREE_TYPE (exp))))
+	maxsize = tree_to_hwi (TYPE_SIZE (TREE_TYPE (exp))) 
+	  - hbit_offset;
     }
 
   /* ???  Due to negative offsets in ARRAY_REF we can end up with
diff --git a/gcc/tree-flow-inline.h b/gcc/tree-flow-inline.h
index 6c55da6..a433223 100644
--- a/gcc/tree-flow-inline.h
+++ b/gcc/tree-flow-inline.h
@@ -1208,12 +1208,12 @@  get_addr_base_and_unit_offset_1 (tree exp, HOST_WIDE_INT *poffset,
 
 	    if (!this_offset
 		|| TREE_CODE (this_offset) != INTEGER_CST
-		|| (TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (field))
+		|| (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
 		    % BITS_PER_UNIT))
 	      return NULL_TREE;
 
-	    hthis_offset = TREE_INT_CST_LOW (this_offset);
-	    hthis_offset += (TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (field))
+	    hthis_offset = tree_to_uhwi (this_offset);
+	    hthis_offset += (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field))
 			     / BITS_PER_UNIT);
 	    byte_offset += hthis_offset;
 	  }
@@ -1236,10 +1236,10 @@  get_addr_base_and_unit_offset_1 (tree exp, HOST_WIDE_INT *poffset,
 		&& (unit_size = array_ref_element_size (exp),
 		    TREE_CODE (unit_size) == INTEGER_CST))
 	      {
-		HOST_WIDE_INT hindex = TREE_INT_CST_LOW (index);
+		HOST_WIDE_INT hindex = tree_to_hwi (index);
 
-		hindex -= TREE_INT_CST_LOW (low_bound);
-		hindex *= TREE_INT_CST_LOW (unit_size);
+		hindex -= tree_to_uhwi (low_bound);
+		hindex *= tree_to_uhwi (unit_size);
 		byte_offset += hindex;
 	      }
 	    else
@@ -1251,7 +1251,7 @@  get_addr_base_and_unit_offset_1 (tree exp, HOST_WIDE_INT *poffset,
 	  break;
 
 	case IMAGPART_EXPR:
-	  byte_offset += TREE_INT_CST_LOW (TYPE_SIZE_UNIT (TREE_TYPE (exp)));
+	  byte_offset += tree_to_uhwi (TYPE_SIZE_UNIT (TREE_TYPE (exp)));
 	  break;
 
 	case VIEW_CONVERT_EXPR:
diff --git a/gcc/tree-inline.c b/gcc/tree-inline.c
index 2c8071e..40ee079 100644
--- a/gcc/tree-inline.c
+++ b/gcc/tree-inline.c
@@ -1184,7 +1184,7 @@  remap_eh_region_tree_nr (tree old_t_nr, copy_body_data *id)
 {
   int old_nr, new_nr;
 
-  old_nr = tree_low_cst (old_t_nr, 0);
+  old_nr = tree_to_shwi (old_t_nr);
   new_nr = remap_eh_region_nr (old_nr, id);
 
   return build_int_cst (integer_type_node, new_nr);
diff --git a/gcc/tree-object-size.c b/gcc/tree-object-size.c
index 9a537f1..c9e30f7 100644
--- a/gcc/tree-object-size.c
+++ b/gcc/tree-object-size.c
@@ -78,8 +78,8 @@  static unsigned HOST_WIDE_INT offset_limit;
 static void
 init_offset_limit (void)
 {
-  if (host_integerp (TYPE_MAX_VALUE (sizetype), 1))
-    offset_limit = tree_low_cst (TYPE_MAX_VALUE (sizetype), 1);
+  if (tree_fits_uhwi_p (TYPE_MAX_VALUE (sizetype)))
+    offset_limit = tree_to_hwi (TYPE_MAX_VALUE (sizetype));
   else
     offset_limit = -1;
   offset_limit /= 2;
@@ -107,7 +107,7 @@  compute_object_offset (const_tree expr, const_tree var)
 
       t = TREE_OPERAND (expr, 1);
       off = size_binop (PLUS_EXPR, DECL_FIELD_OFFSET (t),
-			size_int (tree_low_cst (DECL_FIELD_BIT_OFFSET (t), 1)
+			size_int (tree_to_uhwi (DECL_FIELD_BIT_OFFSET (t))
 				  / BITS_PER_UNIT));
       break;
 
@@ -206,16 +206,16 @@  addr_object_size (struct object_size_info *osi, const_tree ptr,
     }
   else if (pt_var
 	   && DECL_P (pt_var)
-	   && host_integerp (DECL_SIZE_UNIT (pt_var), 1)
+	   && tree_fits_uhwi_p (DECL_SIZE_UNIT (pt_var))
 	   && (unsigned HOST_WIDE_INT)
-	        tree_low_cst (DECL_SIZE_UNIT (pt_var), 1) < offset_limit)
+	        tree_to_hwi (DECL_SIZE_UNIT (pt_var)) < offset_limit)
     pt_var_size = DECL_SIZE_UNIT (pt_var);
   else if (pt_var
 	   && TREE_CODE (pt_var) == STRING_CST
 	   && TYPE_SIZE_UNIT (TREE_TYPE (pt_var))
-	   && host_integerp (TYPE_SIZE_UNIT (TREE_TYPE (pt_var)), 1)
+	   && tree_fits_uhwi_p (TYPE_SIZE_UNIT (TREE_TYPE (pt_var)))
 	   && (unsigned HOST_WIDE_INT)
-	      tree_low_cst (TYPE_SIZE_UNIT (TREE_TYPE (pt_var)), 1)
+	      tree_to_hwi (TYPE_SIZE_UNIT (TREE_TYPE (pt_var)))
 	      < offset_limit)
     pt_var_size = TYPE_SIZE_UNIT (TREE_TYPE (pt_var));
   else
@@ -240,7 +240,7 @@  addr_object_size (struct object_size_info *osi, const_tree ptr,
 	  if (var != pt_var && TREE_CODE (var) == ARRAY_REF)
 	    var = TREE_OPERAND (var, 0);
 	  if (! TYPE_SIZE_UNIT (TREE_TYPE (var))
-	      || ! host_integerp (TYPE_SIZE_UNIT (TREE_TYPE (var)), 1)
+	      || ! tree_fits_uhwi_p (TYPE_SIZE_UNIT (TREE_TYPE (var)))
 	      || (pt_var_size
 		  && tree_int_cst_lt (pt_var_size,
 				      TYPE_SIZE_UNIT (TREE_TYPE (var)))))
@@ -368,8 +368,8 @@  addr_object_size (struct object_size_info *osi, const_tree ptr,
   else
     bytes = pt_var_size;
 
-  if (host_integerp (bytes, 1))
-    return tree_low_cst (bytes, 1);
+  if (tree_fits_uhwi_p (bytes))
+    return tree_to_hwi (bytes);
 
   return unknown[object_size_type];
 }
@@ -398,9 +398,9 @@  alloc_object_size (const_gimple call, int object_size_type)
     {
       tree p = TREE_VALUE (alloc_size);
 
-      arg1 = TREE_INT_CST_LOW (TREE_VALUE (p))-1;
+      arg1 = tree_to_shwi (TREE_VALUE (p))-1;
       if (TREE_CHAIN (p))
-        arg2 = TREE_INT_CST_LOW (TREE_VALUE (TREE_CHAIN (p)))-1;
+        arg2 = tree_to_shwi (TREE_VALUE (TREE_CHAIN (p)))-1;
     }
 
   if (DECL_BUILT_IN_CLASS (callee) == BUILT_IN_NORMAL)
@@ -431,8 +431,8 @@  alloc_object_size (const_gimple call, int object_size_type)
   else if (arg1 >= 0)
     bytes = fold_convert (sizetype, gimple_call_arg (call, arg1));
 
-  if (bytes && host_integerp (bytes, 1))
-    return tree_low_cst (bytes, 1);
+  if (bytes && tree_fits_uhwi_p (bytes))
+    return tree_to_hwi (bytes);
 
   return unknown[object_size_type];
 }
@@ -792,13 +792,13 @@  plus_stmt_object_size (struct object_size_info *osi, tree var, gimple stmt)
       && (TREE_CODE (op0) == SSA_NAME
 	  || TREE_CODE (op0) == ADDR_EXPR))
     {
-      if (! host_integerp (op1, 1))
+      if (! tree_fits_uhwi_p (op1))
 	bytes = unknown[object_size_type];
       else if (TREE_CODE (op0) == SSA_NAME)
-	return merge_object_sizes (osi, var, op0, tree_low_cst (op1, 1));
+	return merge_object_sizes (osi, var, op0, tree_to_hwi (op1));
       else
 	{
-	  unsigned HOST_WIDE_INT off = tree_low_cst (op1, 1);
+	  unsigned HOST_WIDE_INT off = tree_to_hwi (op1);
 
           /* op0 will be ADDR_EXPR here.  */
 	  bytes = addr_object_size (osi, op0, object_size_type);
@@ -1224,10 +1224,10 @@  compute_object_sizes (void)
 		{
 		  tree ost = gimple_call_arg (call, 1);
 
-		  if (host_integerp (ost, 1))
+		  if (tree_fits_uhwi_p (ost))
 		    {
 		      unsigned HOST_WIDE_INT object_size_type
-			= tree_low_cst (ost, 1);
+			= tree_to_hwi (ost);
 
 		      if (object_size_type < 2)
 			result = fold_convert (size_type_node,
diff --git a/gcc/tree-pretty-print.c b/gcc/tree-pretty-print.c
index a92b6d0..01d4af7 100644
--- a/gcc/tree-pretty-print.c
+++ b/gcc/tree-pretty-print.c
@@ -269,8 +269,8 @@  dump_array_domain (pretty_printer *buffer, tree domain, int spc, int flags)
 
       if (min && max
 	  && integer_zerop (min)
-	  && host_integerp (max, 0))
-	pp_wide_integer (buffer, TREE_INT_CST_LOW (max) + 1);
+	  && tree_fits_shwi_p (max))
+	pp_wide_integer (buffer, tree_to_shwi (max) + 1);
       else
 	{
 	  if (min)
@@ -1028,15 +1028,15 @@  dump_generic_node (pretty_printer *buffer, tree node, int spc, int flags,
              NB: Neither of the following divisors can be trivially
              used to recover the original literal:
 
-             TREE_INT_CST_LOW (TYPE_SIZE_UNIT (TREE_TYPE (node)))
+             tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (node)))
 	     TYPE_PRECISION (TREE_TYPE (TREE_TYPE (node)))  */
-	  pp_wide_integer (buffer, TREE_INT_CST_LOW (node));
+	  pp_wide_integer (buffer, tree_to_hwi (node));
 	  pp_string (buffer, "B"); /* pseudo-unit */
 	}
-      else if (host_integerp (node, 0))
-	pp_wide_integer (buffer, TREE_INT_CST_LOW (node));
-      else if (host_integerp (node, 1))
-	pp_unsigned_wide_integer (buffer, TREE_INT_CST_LOW (node));
+      else if (tree_fits_shwi_p (node))
+	pp_wide_integer (buffer, tree_to_hwi (node));
+      else if (tree_fits_uhwi_p (node))
+	pp_unsigned_wide_integer (buffer, tree_to_hwi (node));
       else
 	{
 	  tree val = node;
diff --git a/gcc/tree-sra.c b/gcc/tree-sra.c
index 5acb612..a172d91 100644
--- a/gcc/tree-sra.c
+++ b/gcc/tree-sra.c
@@ -705,12 +705,12 @@  type_internals_preclude_sra_p (tree type, const char **msg)
 		*msg = "zero structure field size";
 	        return true;
 	      }
-	    if (!host_integerp (DECL_FIELD_OFFSET (fld), 1))
+	    if (!tree_fits_uhwi_p (DECL_FIELD_OFFSET (fld)))
 	      {
 		*msg = "structure field offset not fixed";
 		return true;
 	      }
-	    if (!host_integerp (DECL_SIZE (fld), 1))
+	    if (!tree_fits_uhwi_p (DECL_SIZE (fld)))
 	      {
 	        *msg = "structure field size not fixed";
 		return true;
@@ -947,7 +947,7 @@  completely_scalarize_record (tree base, tree decl, HOST_WIDE_INT offset,
 	    struct access *access;
 	    HOST_WIDE_INT size;
 
-	    size = tree_low_cst (DECL_SIZE (fld), 1);
+	    size = tree_to_uhwi (DECL_SIZE (fld));
 	    access = create_access_1 (base, pos, size);
 	    access->expr = nref;
 	    access->type = ft;
@@ -966,7 +966,7 @@  completely_scalarize_record (tree base, tree decl, HOST_WIDE_INT offset,
 static void
 completely_scalarize_var (tree var)
 {
-  HOST_WIDE_INT size = tree_low_cst (DECL_SIZE (var), 1);
+  HOST_WIDE_INT size = tree_to_uhwi (DECL_SIZE (var));
   struct access *access;
 
   access = create_access_1 (var, 0, size);
@@ -1314,11 +1314,11 @@  compare_access_positions (const void *a, const void *b)
 	return TYPE_PRECISION (f2->type) - TYPE_PRECISION (f1->type);
       /* Put any integral type with non-full precision last.  */
       else if (INTEGRAL_TYPE_P (f1->type)
-	       && (TREE_INT_CST_LOW (TYPE_SIZE (f1->type))
+	       && (tree_to_uhwi (TYPE_SIZE (f1->type))
 		   != TYPE_PRECISION (f1->type)))
 	return 1;
       else if (INTEGRAL_TYPE_P (f2->type)
-	       && (TREE_INT_CST_LOW (TYPE_SIZE (f2->type))
+	       && (tree_to_uhwi (TYPE_SIZE (f2->type))
 		   != TYPE_PRECISION (f2->type)))
 	return -1;
       /* Stabilize the sort.  */
@@ -1380,7 +1380,7 @@  make_fancy_name_1 (tree expr)
       index = TREE_OPERAND (expr, 1);
       if (TREE_CODE (index) != INTEGER_CST)
 	break;
-      sprintf (buffer, HOST_WIDE_INT_PRINT_DEC, TREE_INT_CST_LOW (index));
+      sprintf (buffer, HOST_WIDE_INT_PRINT_DEC, tree_to_shwi (index));
       obstack_grow (&name_obstack, buffer, strlen (buffer));
       break;
 
@@ -1394,7 +1394,7 @@  make_fancy_name_1 (tree expr)
 	{
 	  obstack_1grow (&name_obstack, '$');
 	  sprintf (buffer, HOST_WIDE_INT_PRINT_DEC,
-		   TREE_INT_CST_LOW (TREE_OPERAND (expr, 1)));
+		   tree_to_shwi (TREE_OPERAND (expr, 1)));
 	  obstack_grow (&name_obstack, buffer, strlen (buffer));
 	}
       break;
@@ -1564,14 +1564,14 @@  build_user_friendly_ref_for_offset (tree *res, tree type, HOST_WIDE_INT offset,
 		continue;
 
 	      tr_pos = bit_position (fld);
-	      if (!tr_pos || !host_integerp (tr_pos, 1))
+	      if (!tr_pos || !tree_fits_uhwi_p (tr_pos))
 		continue;
-	      pos = TREE_INT_CST_LOW (tr_pos);
+	      pos = tree_to_hwi (tr_pos);
 	      gcc_assert (TREE_CODE (type) == RECORD_TYPE || pos == 0);
 	      tr_size = DECL_SIZE (fld);
-	      if (!tr_size || !host_integerp (tr_size, 1))
+	      if (!tr_size || !tree_fits_uhwi_p (tr_size))
 		continue;
-	      size = TREE_INT_CST_LOW (tr_size);
+	      size = tree_to_hwi (tr_size);
 	      if (size == 0)
 		{
 		  if (pos != offset)
@@ -1594,9 +1594,9 @@  build_user_friendly_ref_for_offset (tree *res, tree type, HOST_WIDE_INT offset,
 
 	case ARRAY_TYPE:
 	  tr_size = TYPE_SIZE (TREE_TYPE (type));
-	  if (!tr_size || !host_integerp (tr_size, 1))
+	  if (!tr_size || !tree_fits_uhwi_p (tr_size))
 	    return false;
-	  el_size = tree_low_cst (tr_size, 1);
+	  el_size = tree_to_hwi (tr_size);
 
 	  minidx = TYPE_MIN_VALUE (TYPE_DOMAIN (type));
 	  if (TREE_CODE (minidx) != INTEGER_CST || el_size == 0)
@@ -1672,12 +1672,12 @@  maybe_add_sra_candidate (tree var)
       reject (var, "has incomplete type");
       return false;
     }
-  if (!host_integerp (TYPE_SIZE (type), 1))
+  if (!tree_fits_uhwi_p (TYPE_SIZE (type)))
     {
       reject (var, "type size not fixed");
       return false;
     }
-  if (tree_low_cst (TYPE_SIZE (type), 1) == 0)
+  if (tree_to_uhwi (TYPE_SIZE (type)) == 0)
     {
       reject (var, "type size is zero");
       return false;
@@ -2010,7 +2010,7 @@  expr_with_var_bounded_array_refs_p (tree expr)
   while (handled_component_p (expr))
     {
       if (TREE_CODE (expr) == ARRAY_REF
-	  && !host_integerp (array_ref_low_bound (expr), 0))
+	  && !tree_fits_shwi_p (array_ref_low_bound (expr)))
 	return true;
       expr = TREE_OPERAND (expr, 0);
     }
@@ -2373,7 +2373,7 @@  analyze_all_variable_accesses (void)
 	if (TREE_CODE (var) == VAR_DECL
 	    && type_consists_of_records_p (TREE_TYPE (var)))
 	  {
-	    if ((unsigned) tree_low_cst (TYPE_SIZE (TREE_TYPE (var)), 1)
+	    if ((unsigned) tree_to_uhwi (TYPE_SIZE (TREE_TYPE (var)))
 		<= max_total_scalarization_size)
 	      {
 		completely_scalarize_var (var);
@@ -2651,12 +2651,12 @@  sra_modify_expr (tree *expr, gimple_stmt_iterator *gsi, bool write)
     {
       HOST_WIDE_INT start_offset, chunk_size;
       if (bfr
-	  && host_integerp (TREE_OPERAND (bfr, 1), 1)
-	  && host_integerp (TREE_OPERAND (bfr, 2), 1))
+	  && tree_fits_uhwi_p (TREE_OPERAND (bfr, 1))
+	  && tree_fits_uhwi_p (TREE_OPERAND (bfr, 2)))
 	{
-	  chunk_size = tree_low_cst (TREE_OPERAND (bfr, 1), 1);
+	  chunk_size = tree_to_hwi (TREE_OPERAND (bfr, 1));
 	  start_offset = access->offset
-	    + tree_low_cst (TREE_OPERAND (bfr, 2), 1);
+	    + tree_to_hwi (TREE_OPERAND (bfr, 2));
 	}
       else
 	start_offset = chunk_size = 0;
@@ -3478,8 +3478,8 @@  find_param_candidates (void)
 	continue;
 
       if (!COMPLETE_TYPE_P (type)
-	  || !host_integerp (TYPE_SIZE (type), 1)
-          || tree_low_cst (TYPE_SIZE (type), 1) == 0
+	  || !tree_fits_uhwi_p (TYPE_SIZE (type))
+          || tree_to_hwi (TYPE_SIZE (type)) == 0
 	  || (AGGREGATE_TYPE_P (type)
 	      && type_internals_preclude_sra_p (type, &msg)))
 	continue;
@@ -3848,9 +3848,9 @@  splice_param_accesses (tree parm, bool *ro_grp)
     }
 
   if (POINTER_TYPE_P (TREE_TYPE (parm)))
-    agg_size = tree_low_cst (TYPE_SIZE (TREE_TYPE (TREE_TYPE (parm))), 1);
+    agg_size = tree_to_uhwi (TYPE_SIZE (TREE_TYPE (TREE_TYPE (parm))));
   else
-    agg_size = tree_low_cst (TYPE_SIZE (TREE_TYPE (parm)), 1);
+    agg_size = tree_to_uhwi (TYPE_SIZE (TREE_TYPE (parm)));
   if (total_size >= agg_size)
     return NULL;
 
@@ -3869,13 +3869,13 @@  decide_one_param_reduction (struct access *repr)
   tree parm;
 
   parm = repr->base;
-  cur_parm_size = tree_low_cst (TYPE_SIZE (TREE_TYPE (parm)), 1);
+  cur_parm_size = tree_to_uhwi (TYPE_SIZE (TREE_TYPE (parm)));
   gcc_assert (cur_parm_size > 0);
 
   if (POINTER_TYPE_P (TREE_TYPE (parm)))
     {
       by_ref = true;
-      agg_size = tree_low_cst (TYPE_SIZE (TREE_TYPE (TREE_TYPE (parm))), 1);
+      agg_size = tree_to_uhwi (TYPE_SIZE (TREE_TYPE (TREE_TYPE (parm))));
     }
   else
     {
diff --git a/gcc/tree-ssa-address.c b/gcc/tree-ssa-address.c
index ce69e51..6857762 100644
--- a/gcc/tree-ssa-address.c
+++ b/gcc/tree-ssa-address.c
@@ -866,7 +866,7 @@  copy_ref_info (tree new_ref, tree old_ref)
 	      && !(TREE_CODE (new_ref) == TARGET_MEM_REF
 		   && (TMR_INDEX2 (new_ref)
 		       || (TMR_STEP (new_ref)
-			   && (TREE_INT_CST_LOW (TMR_STEP (new_ref))
+			   && (tree_to_shwi (TMR_STEP (new_ref))
 			       < align)))))
 	    {
 	      unsigned int inc = (mem_ref_offset (old_ref)
diff --git a/gcc/tree-ssa-alias.c b/gcc/tree-ssa-alias.c
index b045da2..6eabb70 100644
--- a/gcc/tree-ssa-alias.c
+++ b/gcc/tree-ssa-alias.c
@@ -558,9 +558,9 @@  ao_ref_init_from_ptr_and_size (ao_ref *ref, tree ptr, tree size)
       ref->offset = 0;
     }
   if (size
-      && host_integerp (size, 0)
-      && TREE_INT_CST_LOW (size) * 8 / 8 == TREE_INT_CST_LOW (size))
-    ref->max_size = ref->size = TREE_INT_CST_LOW (size) * 8;
+      && tree_fits_shwi_p (size)
+      && tree_to_hwi (size) * 8 / 8 == tree_to_hwi (size))
+    ref->max_size = ref->size = tree_to_hwi (size) * 8;
   else
     ref->max_size = ref->size = -1;
   ref->ref_alias_set = 0;
@@ -1872,7 +1872,7 @@  stmt_kills_ref_p_1 (gimple stmt, ao_ref *ref)
 	      tree len = gimple_call_arg (stmt, 2);
 	      tree base = NULL_TREE;
 	      HOST_WIDE_INT offset = 0;
-	      if (!host_integerp (len, 0))
+	      if (!tree_fits_shwi_p (len))
 		return false;
 	      if (TREE_CODE (dest) == ADDR_EXPR)
 		base = get_addr_base_and_unit_offset (TREE_OPERAND (dest, 0),
@@ -1882,7 +1882,7 @@  stmt_kills_ref_p_1 (gimple stmt, ao_ref *ref)
 	      if (base
 		  && base == ao_ref_base (ref))
 		{
-		  HOST_WIDE_INT size = TREE_INT_CST_LOW (len);
+		  HOST_WIDE_INT size = tree_to_hwi (len);
 		  if (offset <= ref->offset / BITS_PER_UNIT
 		      && (offset + size
 		          >= ((ref->offset + ref->max_size + BITS_PER_UNIT - 1)
diff --git a/gcc/tree-ssa-ccp.c b/gcc/tree-ssa-ccp.c
index 49a4c6f..76b04c3 100644
--- a/gcc/tree-ssa-ccp.c
+++ b/gcc/tree-ssa-ccp.c
@@ -819,9 +819,10 @@  ccp_finalize (void)
 	 bits the misalignment.  */
       tem = val->mask.low;
       align = (tem & -tem);
+      /* All we care about are the lower bits.  */
       if (align > 1)
 	set_ptr_info_alignment (get_ptr_info (name), align,
-				TREE_INT_CST_LOW (val->value) & (align - 1));
+				tree_to_hwi (val->value) & (align - 1));
     }
 
   /* Perform substitutions based on the known constant values.  */
@@ -1420,18 +1421,18 @@  bit_value_assume_aligned (gimple stmt)
 	       && TREE_CODE (ptrval.value) == INTEGER_CST)
 	      || ptrval.mask.is_minus_one ());
   align = gimple_call_arg (stmt, 1);
-  if (!host_integerp (align, 1))
+  if (!tree_fits_uhwi_p (align))
     return ptrval;
-  aligni = tree_low_cst (align, 1);
+  aligni = tree_to_hwi (align);
   if (aligni <= 1
       || (aligni & (aligni - 1)) != 0)
     return ptrval;
   if (gimple_call_num_args (stmt) > 2)
     {
       misalign = gimple_call_arg (stmt, 2);
-      if (!host_integerp (misalign, 1))
+      if (!tree_fits_uhwi_p (misalign))
 	return ptrval;
-      misaligni = tree_low_cst (misalign, 1);
+      misaligni = tree_to_hwi (misalign);
       if (misaligni >= aligni)
 	return ptrval;
     }
@@ -1612,7 +1613,7 @@  evaluate_stmt (gimple stmt)
 	    case BUILT_IN_ALLOCA:
 	    case BUILT_IN_ALLOCA_WITH_ALIGN:
 	      align = (DECL_FUNCTION_CODE (fndecl) == BUILT_IN_ALLOCA_WITH_ALIGN
-		       ? TREE_INT_CST_LOW (gimple_call_arg (stmt, 1))
+		       ? tree_to_uhwi (gimple_call_arg (stmt, 1))
 		       : BIGGEST_ALIGNMENT);
 	      val.lattice_val = CONSTANT;
 	      val.value = build_int_cst (TREE_TYPE (gimple_get_lhs (stmt)), 0);
@@ -1779,10 +1780,10 @@  fold_builtin_alloca_with_align (gimple stmt)
   arg = get_constant_value (gimple_call_arg (stmt, 0));
   if (arg == NULL_TREE
       || TREE_CODE (arg) != INTEGER_CST
-      || !host_integerp (arg, 1))
+      || !tree_fits_uhwi_p (arg))
     return NULL_TREE;
 
-  size = TREE_INT_CST_LOW (arg);
+  size = tree_to_hwi (arg);
 
   /* Heuristic: don't fold large allocas.  */
   threshold = (unsigned HOST_WIDE_INT)PARAM_VALUE (PARAM_LARGE_STACK_FRAME);
@@ -1800,7 +1801,7 @@  fold_builtin_alloca_with_align (gimple stmt)
   n_elem = size * 8 / BITS_PER_UNIT;
   array_type = build_array_type_nelts (elem_type, n_elem);
   var = create_tmp_var (array_type, NULL);
-  DECL_ALIGN (var) = TREE_INT_CST_LOW (gimple_call_arg (stmt, 1));
+  DECL_ALIGN (var) = tree_to_shwi (gimple_call_arg (stmt, 1));
   {
     struct ptr_info_def *pi = SSA_NAME_PTR_INFO (lhs);
     if (pi != NULL && !pi->pt.anything)
diff --git a/gcc/tree-ssa-forwprop.c b/gcc/tree-ssa-forwprop.c
index eb1af4e..703325a 100644
--- a/gcc/tree-ssa-forwprop.c
+++ b/gcc/tree-ssa-forwprop.c
@@ -1434,8 +1434,8 @@  simplify_builtin_call (gimple_stmt_iterator *gsi_p, tree callee2)
 	  char *src_buf;
 	  use_operand_p use_p;
 
-	  if (!host_integerp (val2, 0)
-	      || !host_integerp (len2, 1))
+	  if (!tree_fits_shwi_p (val2)
+	      || !tree_fits_uhwi_p (len2))
 	    break;
 	  if (is_gimple_call (stmt1))
 	    {
@@ -1454,15 +1454,15 @@  simplify_builtin_call (gimple_stmt_iterator *gsi_p, tree callee2)
 	      src1 = gimple_call_arg (stmt1, 1);
 	      len1 = gimple_call_arg (stmt1, 2);
 	      lhs1 = gimple_call_lhs (stmt1);
-	      if (!host_integerp (len1, 1))
+	      if (!tree_fits_uhwi_p (len1))
 		break;
 	      str1 = string_constant (src1, &off1);
 	      if (str1 == NULL_TREE)
 		break;
-	      if (!host_integerp (off1, 1)
+	      if (!tree_fits_uhwi_p (off1)
 		  || compare_tree_int (off1, TREE_STRING_LENGTH (str1) - 1) > 0
 		  || compare_tree_int (len1, TREE_STRING_LENGTH (str1)
-					     - tree_low_cst (off1, 1)) > 0
+					     - tree_to_uhwi (off1)) > 0
 		  || TREE_CODE (TREE_TYPE (str1)) != ARRAY_TYPE
 		  || TYPE_MODE (TREE_TYPE (TREE_TYPE (str1)))
 		     != TYPE_MODE (char_type_node))
@@ -1476,7 +1476,7 @@  simplify_builtin_call (gimple_stmt_iterator *gsi_p, tree callee2)
 	      src1 = gimple_assign_rhs1 (stmt1);
 	      if (TREE_CODE (ptr1) != MEM_REF
 		  || TYPE_MODE (TREE_TYPE (ptr1)) != TYPE_MODE (char_type_node)
-		  || !host_integerp (src1, 0))
+		  || !tree_fits_shwi_p (src1))
 		break;
 	      ptr1 = build_fold_addr_expr (ptr1);
 	      callee1 = NULL_TREE;
@@ -1500,16 +1500,16 @@  simplify_builtin_call (gimple_stmt_iterator *gsi_p, tree callee2)
 	  /* If the difference between the second and first destination pointer
 	     is not constant, or is bigger than memcpy length, bail out.  */
 	  if (diff == NULL
-	      || !host_integerp (diff, 1)
+	      || !tree_fits_uhwi_p (diff)
 	      || tree_int_cst_lt (len1, diff))
 	    break;
 
 	  /* Use maximum of difference plus memset length and memcpy length
 	     as the new memcpy length, if it is too big, bail out.  */
-	  src_len = tree_low_cst (diff, 1);
-	  src_len += tree_low_cst (len2, 1);
-	  if (src_len < (unsigned HOST_WIDE_INT) tree_low_cst (len1, 1))
-	    src_len = tree_low_cst (len1, 1);
+	  src_len = tree_to_hwi (diff);
+	  src_len += tree_to_uhwi (len2);
+	  if (src_len < (unsigned HOST_WIDE_INT) tree_to_uhwi (len1))
+	    src_len = tree_to_hwi (len1);
 	  if (src_len > 1024)
 	    break;
 
@@ -1535,12 +1535,12 @@  simplify_builtin_call (gimple_stmt_iterator *gsi_p, tree callee2)
 	  src_buf = XALLOCAVEC (char, src_len + 1);
 	  if (callee1)
 	    memcpy (src_buf,
-		    TREE_STRING_POINTER (str1) + tree_low_cst (off1, 1),
-		    tree_low_cst (len1, 1));
+		    TREE_STRING_POINTER (str1) + tree_to_uhwi (off1),
+		    tree_to_uhwi (len1));
 	  else
-	    src_buf[0] = tree_low_cst (src1, 0);
-	  memset (src_buf + tree_low_cst (diff, 1),
-		  tree_low_cst (val2, 0), tree_low_cst (len2, 1));
+	    src_buf[0] = tree_to_shwi (src1);
+	  memset (src_buf + tree_to_uhwi (diff),
+		  tree_to_shwi (val2), tree_to_uhwi (len2));
 	  src_buf[src_len] = '\0';
 	  /* Neither builtin_strncpy_read_str nor builtin_memcpy_read_str
 	     handle embedded '\0's.  */
@@ -2606,11 +2606,11 @@  simplify_bitfield_ref (gimple_stmt_iterator *gsi)
   if (TREE_TYPE (op) != elem_type)
     return false;
 
-  size = TREE_INT_CST_LOW (TYPE_SIZE (elem_type));
-  n = TREE_INT_CST_LOW (op1) / size;
+  size = tree_to_uhwi (TYPE_SIZE (elem_type));
+  n = tree_to_uhwi (op1) / size;
   if (n != 1)
     return false;
-  idx = TREE_INT_CST_LOW (op2) / size;
+  idx = tree_to_uhwi (op2) / size;
 
   if (code == VEC_PERM_EXPR)
     {
@@ -2620,7 +2620,7 @@  simplify_bitfield_ref (gimple_stmt_iterator *gsi)
       if (TREE_CODE (m) != VECTOR_CST)
 	return false;
       nelts = VECTOR_CST_NELTS (m);
-      idx = TREE_INT_CST_LOW (VECTOR_CST_ELT (m, idx));
+      idx = tree_to_uhwi (VECTOR_CST_ELT (m, idx));
       idx %= 2 * nelts;
       if (idx < nelts)
 	{
@@ -2664,7 +2664,7 @@  is_combined_permutation_identity (tree mask1, tree mask2)
     {
       tree val = VECTOR_CST_ELT (mask, i);
       gcc_assert (TREE_CODE (val) == INTEGER_CST);
-      j = TREE_INT_CST_LOW (val) & (2 * nelts - 1);
+      j = tree_to_shwi (val) & (2 * nelts - 1);
       if (j == i)
 	maybe_identity2 = false;
       else if (j == i + nelts)
@@ -2809,7 +2809,7 @@  simplify_vector_constructor (gimple_stmt_iterator *gsi)
 
   nelts = TYPE_VECTOR_SUBPARTS (type);
   elem_type = TREE_TYPE (type);
-  elem_size = TREE_INT_CST_LOW (TYPE_SIZE (elem_type));
+  elem_size = tree_to_uhwi (TYPE_SIZE (elem_type));
 
   sel = XALLOCAVEC (unsigned char, nelts);
   orig = NULL;
@@ -2842,9 +2842,9 @@  simplify_vector_constructor (gimple_stmt_iterator *gsi)
 	    return false;
 	  orig = ref;
 	}
-      if (TREE_INT_CST_LOW (TREE_OPERAND (op1, 1)) != elem_size)
+      if (tree_to_uhwi (TREE_OPERAND (op1, 1)) != elem_size)
 	return false;
-      sel[i] = TREE_INT_CST_LOW (TREE_OPERAND (op1, 2)) / elem_size;
+      sel[i] = tree_to_uhwi (TREE_OPERAND (op1, 2)) / elem_size;
       if (sel[i] != i) maybe_ident = false;
     }
   if (i < nelts)
diff --git a/gcc/tree-ssa-loop-ivcanon.c b/gcc/tree-ssa-loop-ivcanon.c
index b790e1f..2f0d29b 100644
--- a/gcc/tree-ssa-loop-ivcanon.c
+++ b/gcc/tree-ssa-loop-ivcanon.c
@@ -330,9 +330,9 @@  try_unroll_loop_completely (struct loop *loop,
   if (loop->inner)
     return false;
 
-  if (!host_integerp (niter, 1))
+  if (!tree_fits_uhwi_p (niter))
     return false;
-  n_unroll = tree_low_cst (niter, 1);
+  n_unroll = tree_to_hwi (niter);
 
   max_unroll = PARAM_VALUE (PARAM_MAX_COMPLETELY_PEEL_TIMES);
   if (n_unroll > max_unroll)
diff --git a/gcc/tree-ssa-loop-ivopts.c b/gcc/tree-ssa-loop-ivopts.c
index 74097f8..ea74548 100644
--- a/gcc/tree-ssa-loop-ivopts.c
+++ b/gcc/tree-ssa-loop-ivopts.c
@@ -3877,16 +3877,16 @@  get_loop_invariant_expr_id (struct ivopts_data *data, tree ubase,
             {
               tree ind = TREE_OPERAND (usym, 1);
               if (TREE_CODE (ind) == INTEGER_CST
-                  && host_integerp (ind, 0)
-                  && TREE_INT_CST_LOW (ind) == 0)
+                  && tree_fits_shwi_p (ind)
+                  && tree_to_hwi (ind) == 0)
                 usym = TREE_OPERAND (usym, 0);
             }
           if (TREE_CODE (csym) == ARRAY_REF)
             {
               tree ind = TREE_OPERAND (csym, 1);
               if (TREE_CODE (ind) == INTEGER_CST
-                  && host_integerp (ind, 0)
-                  && TREE_INT_CST_LOW (ind) == 0)
+                  && tree_fits_shwi_p (ind)
+                  && tree_to_hwi (ind) == 0)
                 csym = TREE_OPERAND (csym, 0);
             }
           if (operand_equal_p (usym, csym, 0))
@@ -4260,7 +4260,7 @@  iv_period (struct iv *iv)
 
   period = build_low_bits_mask (type,
                                 (TYPE_PRECISION (type)
-                                 - tree_low_cst (pow2div, 1)));
+                                 - tree_to_uhwi (pow2div)));
 
   return period;
 }
diff --git a/gcc/tree-ssa-loop-niter.c b/gcc/tree-ssa-loop-niter.c
index 84ae610..fe4db37 100644
--- a/gcc/tree-ssa-loop-niter.c
+++ b/gcc/tree-ssa-loop-niter.c
@@ -574,7 +574,7 @@  number_of_iterations_ne_max (mpz_t bnd, bool no_overflow, tree c, tree s,
   if (!no_overflow)
     {
       max = double_int::mask (TYPE_PRECISION (TREE_TYPE (c))
-			     - tree_low_cst (num_ending_zeros (s), 1));
+			     - tree_to_uhwi (num_ending_zeros (s)));
       mpz_set_double_int (bnd, max, true);
       return;
     }
@@ -664,7 +664,7 @@  number_of_iterations_ne (tree type, affine_iv *iv, tree final,
   bits = num_ending_zeros (s);
   bound = build_low_bits_mask (niter_type,
 			       (TYPE_PRECISION (niter_type)
-				- tree_low_cst (bits, 1)));
+				- tree_to_uhwi (bits)));
 
   d = fold_binary_to_constant (LSHIFT_EXPR, niter_type,
 			       build_int_cst (niter_type, 1), bits);
diff --git a/gcc/tree-ssa-loop-prefetch.c b/gcc/tree-ssa-loop-prefetch.c
index fe4df9a..5aa8372 100644
--- a/gcc/tree-ssa-loop-prefetch.c
+++ b/gcc/tree-ssa-loop-prefetch.c
@@ -498,7 +498,7 @@  analyze_ref (struct loop *loop, tree *ref_p, tree *base,
   for (; TREE_CODE (ref) == COMPONENT_REF; ref = TREE_OPERAND (ref, 0))
     {
       off = DECL_FIELD_BIT_OFFSET (TREE_OPERAND (ref, 1));
-      bit_offset = TREE_INT_CST_LOW (off);
+      bit_offset = tree_to_shwi (off);
       gcc_assert (bit_offset % BITS_PER_UNIT == 0);
 
       *delta += bit_offset / BITS_PER_UNIT;
@@ -1407,8 +1407,8 @@  add_subscript_strides (tree access_fn, unsigned stride,
       if ((unsigned) loop_depth (aloop) <= min_depth)
 	continue;
 
-      if (host_integerp (step, 0))
-	astep = tree_low_cst (step, 0);
+      if (tree_fits_shwi_p (step))
+	astep = tree_to_shwi (step);
       else
 	astep = L1_CACHE_LINE_SIZE;
 
@@ -1457,8 +1457,8 @@  self_reuse_distance (data_reference_p dr, unsigned *loop_sizes, unsigned n,
       if (TREE_CODE (ref) == ARRAY_REF)
 	{
 	  stride = TYPE_SIZE_UNIT (TREE_TYPE (ref));
-	  if (host_integerp (stride, 1))
-	    astride = tree_low_cst (stride, 1);
+	  if (tree_fits_uhwi_p (stride))
+	    astride = tree_to_hwi (stride);
 	  else
 	    astride = L1_CACHE_LINE_SIZE;
 
diff --git a/gcc/tree-ssa-math-opts.c b/gcc/tree-ssa-math-opts.c
index d9f4e9e..55e20c8 100644
--- a/gcc/tree-ssa-math-opts.c
+++ b/gcc/tree-ssa-math-opts.c
@@ -1439,10 +1439,10 @@  execute_cse_sincos (void)
 		CASE_FLT_FN (BUILT_IN_POWI):
 		  arg0 = gimple_call_arg (stmt, 0);
 		  arg1 = gimple_call_arg (stmt, 1);
-		  if (!host_integerp (arg1, 0))
+		  if (!tree_fits_shwi_p (arg1))
 		    break;
 
-		  n = TREE_INT_CST_LOW (arg1);
+		  n = tree_to_hwi (arg1);
 		  loc = gimple_location (stmt);
 		  result = gimple_expand_builtin_powi (&gsi, loc, arg0, n);
 
@@ -1681,7 +1681,7 @@  find_bswap_1 (gimple stmt, struct symbolic_number *n, int limit)
 	case RSHIFT_EXPR:
 	case LROTATE_EXPR:
 	case RROTATE_EXPR:
-	  if (!do_shift_rotate (code, n, (int)TREE_INT_CST_LOW (rhs2)))
+	  if (!do_shift_rotate (code, n, (int)tree_to_shwi (rhs2)))
 	    return NULL_TREE;
 	  break;
 	CASE_CONVERT:
@@ -1774,7 +1774,7 @@  find_bswap (gimple stmt)
      increase that number by three  here in order to also
      cover signed -> unsigned converions of the src operand as can be seen
      in libgcc, and for initial shift/and operation of the src operand.  */
-  limit = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (gimple_expr_type (stmt)));
+  limit = tree_to_shwi (TYPE_SIZE_UNIT (gimple_expr_type (stmt)));
   limit += 1 + (int) ceil_log2 ((unsigned HOST_WIDE_INT) limit);
   source_expr =  find_bswap_1 (stmt, &n, limit);
 
diff --git a/gcc/tree-ssa-phiopt.c b/gcc/tree-ssa-phiopt.c
index 948620f..e3e90d5 100644
--- a/gcc/tree-ssa-phiopt.c
+++ b/gcc/tree-ssa-phiopt.c
@@ -1280,7 +1280,7 @@  add_or_mark_expr (basic_block bb, tree exp,
 
   if (TREE_CODE (exp) == MEM_REF
       && TREE_CODE (TREE_OPERAND (exp, 0)) == SSA_NAME
-      && host_integerp (TREE_OPERAND (exp, 1), 0)
+      && tree_fits_shwi_p (TREE_OPERAND (exp, 1))
       && (size = int_size_in_bytes (TREE_TYPE (exp))) > 0)
     {
       tree name = TREE_OPERAND (exp, 0);
@@ -1294,7 +1294,7 @@  add_or_mark_expr (basic_block bb, tree exp,
       map.ssa_name_ver = SSA_NAME_VERSION (name);
       map.bb = 0;
       map.store = store;
-      map.offset = tree_low_cst (TREE_OPERAND (exp, 1), 0);
+      map.offset = tree_to_shwi (TREE_OPERAND (exp, 1));
       map.size = size;
 
       slot = htab_find_slot (seen_ssa_names, &map, INSERT);
@@ -1882,14 +1882,14 @@  hoist_adjacent_loads (basic_block bb0, basic_block bb1,
       tree_offset2 = bit_position (field2);
       tree_size2 = DECL_SIZE (field2);
 
-      if (!host_integerp (tree_offset1, 1)
-	  || !host_integerp (tree_offset2, 1)
-	  || !host_integerp (tree_size2, 1))
+      if (!tree_fits_uhwi_p (tree_offset1)
+	  || !tree_fits_uhwi_p (tree_offset2)
+	  || !tree_fits_uhwi_p (tree_size2))
 	continue;
 
-      offset1 = TREE_INT_CST_LOW (tree_offset1);
-      offset2 = TREE_INT_CST_LOW (tree_offset2);
-      size2 = TREE_INT_CST_LOW (tree_size2);
+      offset1 = tree_to_hwi (tree_offset1);
+      offset2 = tree_to_hwi (tree_offset2);
+      size2 = tree_to_hwi (tree_size2);
       align1 = DECL_ALIGN (field1) % param_align_bits;
 
       if (offset1 % BITS_PER_UNIT != 0)
diff --git a/gcc/tree-ssa-reassoc.c b/gcc/tree-ssa-reassoc.c
index 960e2c3..7e95675 100644
--- a/gcc/tree-ssa-reassoc.c
+++ b/gcc/tree-ssa-reassoc.c
@@ -1042,7 +1042,7 @@  decrement_power (gimple stmt)
 
     CASE_FLT_FN (BUILT_IN_POWI):
       arg1 = gimple_call_arg (stmt, 1);
-      power = TREE_INT_CST_LOW (arg1) - 1;
+      power = tree_to_shwi (arg1) - 1;
       gimple_call_set_arg (stmt, 1, build_int_cst (TREE_TYPE (arg1), power));
       return power;
 
@@ -2730,10 +2730,10 @@  acceptable_pow_call (gimple stmt, tree *base, HOST_WIDE_INT *exponent)
       *base = gimple_call_arg (stmt, 0);
       arg1 = gimple_call_arg (stmt, 1);
 
-      if (!host_integerp (arg1, 0))
+      if (!tree_fits_shwi_p (arg1))
 	return false;
 
-      *exponent = TREE_INT_CST_LOW (arg1);
+      *exponent = tree_to_shwi (arg1);
       break;
 
     default:
diff --git a/gcc/tree-ssa-sccvn.c b/gcc/tree-ssa-sccvn.c
index 832328d..837fe6d 100644
--- a/gcc/tree-ssa-sccvn.c
+++ b/gcc/tree-ssa-sccvn.c
@@ -567,11 +567,11 @@  vn_reference_eq (const void *p1, const void *p2)
     }
   else if (INTEGRAL_TYPE_P (vr1->type)
 	   && (TYPE_PRECISION (vr1->type)
-	       != TREE_INT_CST_LOW (TYPE_SIZE (vr1->type))))
+	       != tree_to_uhwi (TYPE_SIZE (vr1->type))))
     return false;
   else if (INTEGRAL_TYPE_P (vr2->type)
 	   && (TYPE_PRECISION (vr2->type)
-	       != TREE_INT_CST_LOW (TYPE_SIZE (vr2->type))))
+	       != tree_to_uhwi (TYPE_SIZE (vr2->type))))
     return false;
 
   i = 0;
@@ -689,8 +689,8 @@  copy_reference_ops_from_ref (tree ref, VEC(vn_reference_op_s, heap) **result)
 	case MEM_REF:
 	  /* The base address gets its own vn_reference_op_s structure.  */
 	  temp.op0 = TREE_OPERAND (ref, 1);
-	  if (host_integerp (TREE_OPERAND (ref, 1), 0))
-	    temp.off = TREE_INT_CST_LOW (TREE_OPERAND (ref, 1));
+	  if (tree_fits_shwi_p (TREE_OPERAND (ref, 1)))
+	    temp.off = tree_to_hwi (TREE_OPERAND (ref, 1));
 	  break;
 	case BIT_FIELD_REF:
 	  /* Record bits and position.  */
@@ -710,7 +710,7 @@  copy_reference_ops_from_ref (tree ref, VEC(vn_reference_op_s, heap) **result)
 		&& TREE_CODE (this_offset) == INTEGER_CST)
 	      {
 		tree bit_offset = DECL_FIELD_BIT_OFFSET (TREE_OPERAND (ref, 1));
-		if (TREE_INT_CST_LOW (bit_offset) % BITS_PER_UNIT == 0)
+		if (tree_to_uhwi (bit_offset) % BITS_PER_UNIT == 0)
 		  {
 		    double_int off
 		      = tree_to_double_int (this_offset)
@@ -791,7 +791,7 @@  copy_reference_ops_from_ref (tree ref, VEC(vn_reference_op_s, heap) **result)
 	  break;
 	case IMAGPART_EXPR:
 	  /* This is only interesting for its constant offset.  */
-	  temp.off = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (TREE_TYPE (ref)));
+	  temp.off = tree_to_uhwi (TYPE_SIZE_UNIT (TREE_TYPE (ref)));
 	  break;
 	default:
 	  gcc_unreachable ();
@@ -844,10 +844,10 @@  ao_ref_init_from_vn_reference (ao_ref *ref,
     }
   if (size_tree != NULL_TREE)
     {
-      if (!host_integerp (size_tree, 1))
+      if (!tree_fits_uhwi_p (size_tree))
 	size = -1;
       else
-	size = TREE_INT_CST_LOW (size_tree);
+	size = tree_to_hwi (size_tree);
     }
 
   /* Initially, maxsize is the same as the accessed element size.
@@ -903,7 +903,7 @@  ao_ref_init_from_vn_reference (ao_ref *ref,
 
 	/* And now the usual component-reference style ops.  */
 	case BIT_FIELD_REF:
-	  offset += tree_low_cst (op->op1, 0);
+	  offset += tree_to_shwi (op->op1);
 	  break;
 
 	case COMPONENT_REF:
@@ -914,13 +914,13 @@  ao_ref_init_from_vn_reference (ao_ref *ref,
 	       parts manually.  */
 
 	    if (op->op1
-		|| !host_integerp (DECL_FIELD_OFFSET (field), 1))
+		|| !tree_fits_uhwi_p (DECL_FIELD_OFFSET (field)))
 	      max_size = -1;
 	    else
 	      {
-		offset += (TREE_INT_CST_LOW (DECL_FIELD_OFFSET (field))
+		offset += (tree_to_hwi (DECL_FIELD_OFFSET (field))
 			   * BITS_PER_UNIT);
-		offset += TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (field));
+		offset += tree_to_hwi (DECL_FIELD_BIT_OFFSET (field));
 	      }
 	    break;
 	  }
@@ -928,15 +928,15 @@  ao_ref_init_from_vn_reference (ao_ref *ref,
 	case ARRAY_RANGE_REF:
 	case ARRAY_REF:
 	  /* We recorded the lower bound and the element size.  */
-	  if (!host_integerp (op->op0, 0)
-	      || !host_integerp (op->op1, 0)
-	      || !host_integerp (op->op2, 0))
+	  if (!tree_fits_shwi_p (op->op0)
+	      || !tree_fits_shwi_p (op->op1)
+	      || !tree_fits_shwi_p (op->op2))
 	    max_size = -1;
 	  else
 	    {
-	      HOST_WIDE_INT hindex = TREE_INT_CST_LOW (op->op0);
-	      hindex -= TREE_INT_CST_LOW (op->op1);
-	      hindex *= TREE_INT_CST_LOW (op->op2);
+	      HOST_WIDE_INT hindex = tree_to_hwi (op->op0);
+	      hindex -= tree_to_hwi (op->op1);
+	      hindex *= tree_to_hwi (op->op2);
 	      hindex *= BITS_PER_UNIT;
 	      offset += hindex;
 	    }
@@ -1076,8 +1076,8 @@  vn_reference_fold_indirect (VEC (vn_reference_op_s, heap) **ops,
       off += double_int::from_shwi (addr_offset);
       mem_op->op0 = double_int_to_tree (TREE_TYPE (mem_op->op0), off);
       op->op0 = build_fold_addr_expr (addr_base);
-      if (host_integerp (mem_op->op0, 0))
-	mem_op->off = TREE_INT_CST_LOW (mem_op->op0);
+      if (tree_fits_shwi_p (mem_op->op0))
+	mem_op->off = tree_to_hwi (mem_op->op0);
       else
 	mem_op->off = -1;
     }
@@ -1141,8 +1141,8 @@  vn_reference_maybe_forwprop_address (VEC (vn_reference_op_s, heap) **ops,
     }
 
   mem_op->op0 = double_int_to_tree (TREE_TYPE (mem_op->op0), off);
-  if (host_integerp (mem_op->op0, 0))
-    mem_op->off = TREE_INT_CST_LOW (mem_op->op0);
+  if (tree_fits_shwi_p (mem_op->op0))
+    mem_op->off = tree_to_hwi (mem_op->op0);
   else
     mem_op->off = -1;
   if (TREE_CODE (op->op0) == SSA_NAME)
@@ -1221,7 +1221,7 @@  fully_constant_vn_reference_p (vn_reference_t ref)
 	  && compare_tree_int (op->op0, TREE_STRING_LENGTH (arg0->op0)) < 0)
 	return build_int_cst_type (op->type,
 				   (TREE_STRING_POINTER (arg0->op0)
-				    [TREE_INT_CST_LOW (op->op0)]));
+				    [tree_to_uhwi (op->op0)]));
     }
 
   return NULL_TREE;
@@ -1511,16 +1511,15 @@  vn_reference_lookup_3 (ao_ref *ref, tree vuse, void *vr_)
   if (is_gimple_reg_type (vr->type)
       && gimple_call_builtin_p (def_stmt, BUILT_IN_MEMSET)
       && integer_zerop (gimple_call_arg (def_stmt, 1))
-      && host_integerp (gimple_call_arg (def_stmt, 2), 1)
+      && tree_fits_uhwi_p (gimple_call_arg (def_stmt, 2))
       && TREE_CODE (gimple_call_arg (def_stmt, 0)) == ADDR_EXPR)
     {
       tree ref2 = TREE_OPERAND (gimple_call_arg (def_stmt, 0), 0);
       tree base2;
       HOST_WIDE_INT offset2, size2, maxsize2;
       base2 = get_ref_base_and_extent (ref2, &offset2, &size2, &maxsize2);
-      size2 = TREE_INT_CST_LOW (gimple_call_arg (def_stmt, 2)) * 8;
-      if ((unsigned HOST_WIDE_INT)size2 / 8
-	  == TREE_INT_CST_LOW (gimple_call_arg (def_stmt, 2))
+      size2 = tree_to_hwi (gimple_call_arg (def_stmt, 2)) * 8;
+      if (size2 / 8 == tree_to_hwi (gimple_call_arg (def_stmt, 2))
 	  && maxsize2 != -1
 	  && operand_equal_p (base, base2, 0)
 	  && offset2 <= offset
@@ -1623,7 +1622,7 @@  vn_reference_lookup_3 (ao_ref *ref, tree vuse, void *vr_)
 	    {
 	      tree val = NULL_TREE;
 	      HOST_WIDE_INT elsz
-		= TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (TREE_TYPE (rhs1))));
+		= tree_to_uhwi (TYPE_SIZE (TREE_TYPE (TREE_TYPE (rhs1))));
 	      if (gimple_assign_rhs_code (def_stmt2) == COMPLEX_EXPR)
 		{
 		  if (off == 0)
@@ -1765,7 +1764,7 @@  vn_reference_lookup_3 (ao_ref *ref, tree vuse, void *vr_)
 	       || TREE_CODE (gimple_call_arg (def_stmt, 0)) == SSA_NAME)
 	   && (TREE_CODE (gimple_call_arg (def_stmt, 1)) == ADDR_EXPR
 	       || TREE_CODE (gimple_call_arg (def_stmt, 1)) == SSA_NAME)
-	   && host_integerp (gimple_call_arg (def_stmt, 2), 1))
+	   && tree_fits_uhwi_p (gimple_call_arg (def_stmt, 2)))
     {
       tree lhs, rhs;
       ao_ref r;
@@ -1792,10 +1791,10 @@  vn_reference_lookup_3 (ao_ref *ref, tree vuse, void *vr_)
 	  if (!tem)
 	    return (void *)-1;
 	  if (TREE_CODE (tem) == MEM_REF
-	      && host_integerp (TREE_OPERAND (tem, 1), 1))
+	      && tree_fits_uhwi_p (TREE_OPERAND (tem, 1)))
 	    {
 	      lhs = TREE_OPERAND (tem, 0);
-	      lhs_offset += TREE_INT_CST_LOW (TREE_OPERAND (tem, 1));
+	      lhs_offset += tree_to_hwi (TREE_OPERAND (tem, 1));
 	    }
 	  else if (DECL_P (tem))
 	    lhs = build_fold_addr_expr (tem);
@@ -1818,10 +1817,10 @@  vn_reference_lookup_3 (ao_ref *ref, tree vuse, void *vr_)
 	  if (!tem)
 	    return (void *)-1;
 	  if (TREE_CODE (tem) == MEM_REF
-	      && host_integerp (TREE_OPERAND (tem, 1), 1))
+	      && tree_fits_uhwi_p (TREE_OPERAND (tem, 1)))
 	    {
 	      rhs = TREE_OPERAND (tem, 0);
-	      rhs_offset += TREE_INT_CST_LOW (TREE_OPERAND (tem, 1));
+	      rhs_offset += tree_to_hwi (TREE_OPERAND (tem, 1));
 	    }
 	  else if (DECL_P (tem))
 	    rhs = build_fold_addr_expr (tem);
@@ -1832,14 +1831,14 @@  vn_reference_lookup_3 (ao_ref *ref, tree vuse, void *vr_)
 	  && TREE_CODE (rhs) != ADDR_EXPR)
 	return (void *)-1;
 
-      copy_size = TREE_INT_CST_LOW (gimple_call_arg (def_stmt, 2));
+      copy_size = tree_to_shwi (gimple_call_arg (def_stmt, 2));
 
       /* The bases of the destination and the references have to agree.  */
       if ((TREE_CODE (base) != MEM_REF
 	   && !DECL_P (base))
 	  || (TREE_CODE (base) == MEM_REF
 	      && (TREE_OPERAND (base, 0) != lhs
-		  || !host_integerp (TREE_OPERAND (base, 1), 1)))
+		  || !tree_fits_uhwi_p (TREE_OPERAND (base, 1))))
 	  || (DECL_P (base)
 	      && (TREE_CODE (lhs) != ADDR_EXPR
 		  || TREE_OPERAND (lhs, 0) != base)))
@@ -1848,7 +1847,7 @@  vn_reference_lookup_3 (ao_ref *ref, tree vuse, void *vr_)
       /* And the access has to be contained within the memcpy destination.  */
       at = offset / BITS_PER_UNIT;
       if (TREE_CODE (base) == MEM_REF)
-	at += TREE_INT_CST_LOW (TREE_OPERAND (base, 1));
+	at += tree_to_shwi (TREE_OPERAND (base, 1));
       if (lhs_offset > at
 	  || lhs_offset + copy_size < at + maxsize / BITS_PER_UNIT)
 	return (void *)-1;
@@ -3149,12 +3148,12 @@  simplify_binary_expression (gimple stmt)
   /* Pointer plus constant can be represented as invariant address.
      Do so to allow further propatation, see also tree forwprop.  */
   if (code == POINTER_PLUS_EXPR
-      && host_integerp (op1, 1)
+      && tree_fits_uhwi_p (op1)
       && TREE_CODE (op0) == ADDR_EXPR
       && is_gimple_min_invariant (op0))
     return build_invariant_address (TREE_TYPE (op0),
 				    TREE_OPERAND (op0, 0),
-				    TREE_INT_CST_LOW (op1));
+				    tree_to_hwi (op1));
 
   /* Avoid folding if nothing changed.  */
   if (op0 == gimple_assign_rhs1 (stmt)
diff --git a/gcc/tree-ssa-strlen.c b/gcc/tree-ssa-strlen.c
index d2b6e25..3b9abf0 100644
--- a/gcc/tree-ssa-strlen.c
+++ b/gcc/tree-ssa-strlen.c
@@ -191,10 +191,10 @@  get_stridx (tree exp)
 
   s = string_constant (exp, &o);
   if (s != NULL_TREE
-      && (o == NULL_TREE || host_integerp (o, 0))
+      && (o == NULL_TREE || tree_fits_shwi_p (o))
       && TREE_STRING_LENGTH (s) > 0)
     {
-      HOST_WIDE_INT offset = o ? tree_low_cst (o, 0) : 0;
+      HOST_WIDE_INT offset = o ? tree_to_hwi (o) : 0;
       const char *p = TREE_STRING_POINTER (s);
       int max = TREE_STRING_LENGTH (s) - 1;
 
@@ -826,16 +826,16 @@  adjust_last_stmt (strinfo si, gimple stmt, bool is_strcat)
     }
 
   len = gimple_call_arg (last.stmt, 2);
-  if (host_integerp (len, 1))
+  if (tree_fits_uhwi_p (len))
     {
-      if (!host_integerp (last.len, 1)
+      if (!tree_fits_uhwi_p (last.len)
 	  || integer_zerop (len)
-	  || (unsigned HOST_WIDE_INT) tree_low_cst (len, 1)
-	     != (unsigned HOST_WIDE_INT) tree_low_cst (last.len, 1) + 1)
+	  || (unsigned HOST_WIDE_INT) tree_to_hwi (len)
+	     != (unsigned HOST_WIDE_INT) tree_to_hwi (last.len) + 1)
 	return;
       /* Don't adjust the length if it is divisible by 4, it is more efficient
 	 to store the extra '\0' in that case.  */
-      if ((((unsigned HOST_WIDE_INT) tree_low_cst (len, 1)) & 3) == 0)
+      if ((((unsigned HOST_WIDE_INT) tree_to_uhwi (len)) & 3) == 0)
 	return;
     }
   else if (TREE_CODE (len) == SSA_NAME)
@@ -1290,7 +1290,7 @@  handle_builtin_memcpy (enum built_in_function bcode, gimple_stmt_iterator *gsi)
     return;
 
   if (olddsi != NULL
-      && host_integerp (len, 1)
+      && tree_fits_uhwi_p (len)
       && !integer_zerop (len))
     adjust_last_stmt (olddsi, stmt, false);
 
@@ -1316,8 +1316,8 @@  handle_builtin_memcpy (enum built_in_function bcode, gimple_stmt_iterator *gsi)
       si = NULL;
       /* Handle memcpy (x, "abcd", 5) or
 	 memcpy (x, "abc\0uvw", 7).  */
-      if (!host_integerp (len, 1)
-	  || (unsigned HOST_WIDE_INT) tree_low_cst (len, 1)
+      if (!tree_fits_uhwi_p (len)
+	  || (unsigned HOST_WIDE_INT) tree_to_hwi (len)
 	     <= (unsigned HOST_WIDE_INT) ~idx)
 	return;
     }
@@ -1606,11 +1606,11 @@  handle_pointer_plus (gimple_stmt_iterator *gsi)
   if (idx < 0)
     {
       tree off = gimple_assign_rhs2 (stmt);
-      if (host_integerp (off, 1)
-	  && (unsigned HOST_WIDE_INT) tree_low_cst (off, 1)
+      if (tree_fits_uhwi_p (off)
+	  && (unsigned HOST_WIDE_INT) tree_to_hwi (off)
 	     <= (unsigned HOST_WIDE_INT) ~idx)
 	VEC_replace (int, ssa_ver_to_stridx, SSA_NAME_VERSION (lhs),
-		     ~(~idx - (int) tree_low_cst (off, 1)));
+		     ~(~idx - (int) tree_to_hwi (off)));
       return;
     }
 
diff --git a/gcc/tree-ssa-structalias.c b/gcc/tree-ssa-structalias.c
index 711fbef..8da8fed 100644
--- a/gcc/tree-ssa-structalias.c
+++ b/gcc/tree-ssa-structalias.c
@@ -2864,12 +2864,12 @@  process_constraint (constraint_t t)
 static HOST_WIDE_INT
 bitpos_of_field (const tree fdecl)
 {
-  if (!host_integerp (DECL_FIELD_OFFSET (fdecl), 0)
-      || !host_integerp (DECL_FIELD_BIT_OFFSET (fdecl), 0))
+  if (!tree_fits_shwi_p (DECL_FIELD_OFFSET (fdecl))
+      || !tree_fits_shwi_p (DECL_FIELD_BIT_OFFSET (fdecl)))
     return -1;
 
-  return (TREE_INT_CST_LOW (DECL_FIELD_OFFSET (fdecl)) * BITS_PER_UNIT
-	  + TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (fdecl)));
+  return (tree_to_hwi (DECL_FIELD_OFFSET (fdecl)) * BITS_PER_UNIT
+	  + tree_to_hwi (DECL_FIELD_BIT_OFFSET (fdecl)));
 }
 
 
@@ -3301,8 +3301,8 @@  get_constraint_for_1 (tree t, VEC (ce_s, heap) **results, bool address_p,
 		  && curr)
 		{
 		  unsigned HOST_WIDE_INT size;
-		  if (host_integerp (TYPE_SIZE (TREE_TYPE (t)), 1))
-		    size = TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (t)));
+		  if (tree_fits_uhwi_p (TYPE_SIZE (TREE_TYPE (t))))
+		    size = tree_to_hwi (TYPE_SIZE (TREE_TYPE (t)));
 		  else
 		    size = -1;
 		  for (; curr; curr = curr->next)
@@ -5196,7 +5196,7 @@  push_fields_onto_fieldstack (tree type, VEC(fieldoff_s,heap) **fieldstack,
 	      }
 
 	    if (!DECL_SIZE (field)
-		|| !host_integerp (DECL_SIZE (field), 1))
+		|| !tree_fits_uhwi_p (DECL_SIZE (field)))
 	      has_unknown_size = true;
 
 	    /* If adjacent fields do not contain pointers merge them.  */
@@ -5208,7 +5208,7 @@  push_fields_onto_fieldstack (tree type, VEC(fieldoff_s,heap) **fieldstack,
 		&& !pair->has_unknown_size
 		&& pair->offset + (HOST_WIDE_INT)pair->size == offset + foff)
 	      {
-		pair->size += TREE_INT_CST_LOW (DECL_SIZE (field));
+		pair->size += tree_to_hwi (DECL_SIZE (field));
 	      }
 	    else
 	      {
@@ -5216,7 +5216,7 @@  push_fields_onto_fieldstack (tree type, VEC(fieldoff_s,heap) **fieldstack,
 		e.offset = offset + foff;
 		e.has_unknown_size = has_unknown_size;
 		if (!has_unknown_size)
-		  e.size = TREE_INT_CST_LOW (DECL_SIZE (field));
+		  e.size = tree_to_uhwi (DECL_SIZE (field));
 		else
 		  e.size = -1;
 		e.must_have_pointers = must_have_pointers_p;
@@ -5472,7 +5472,7 @@  create_variable_info_for_1 (tree decl, const char *name)
   unsigned int i;
 
   if (!declsize
-      || !host_integerp (declsize, 1))
+      || !tree_fits_uhwi_p (declsize))
     {
       vi = new_var_info (decl, name);
       vi->offset = 0;
@@ -5533,7 +5533,7 @@  create_variable_info_for_1 (tree decl, const char *name)
       vi = new_var_info (decl, name);
       vi->offset = 0;
       vi->may_have_pointers = true;
-      vi->fullsize = TREE_INT_CST_LOW (declsize);
+      vi->fullsize = tree_to_hwi (declsize);
       vi->size = vi->fullsize;
       vi->is_full_var = true;
       VEC_free (fieldoff_s, heap, fieldstack);
@@ -5541,7 +5541,7 @@  create_variable_info_for_1 (tree decl, const char *name)
     }
 
   vi = new_var_info (decl, name);
-  vi->fullsize = TREE_INT_CST_LOW (declsize);
+  vi->fullsize = tree_to_uhwi (declsize);
   for (i = 0, newvi = vi;
        VEC_iterate (fieldoff_s, fieldstack, i, fo);
        ++i, newvi = newvi->next)
diff --git a/gcc/tree-stdarg.c b/gcc/tree-stdarg.c
index 8965ca3..b3af8ea 100644
--- a/gcc/tree-stdarg.c
+++ b/gcc/tree-stdarg.c
@@ -165,9 +165,9 @@  va_list_counter_bump (struct stdarg_info *si, tree counter, tree rhs,
       if ((rhs_code == POINTER_PLUS_EXPR
 	   || rhs_code == PLUS_EXPR)
 	  && TREE_CODE (rhs1) == SSA_NAME
-	  && host_integerp (gimple_assign_rhs2 (stmt), 1))
+	  && tree_fits_uhwi_p (gimple_assign_rhs2 (stmt)))
 	{
-	  ret += tree_low_cst (gimple_assign_rhs2 (stmt), 1);
+	  ret += tree_to_hwi (gimple_assign_rhs2 (stmt));
 	  lhs = rhs1;
 	  continue;
 	}
@@ -175,9 +175,9 @@  va_list_counter_bump (struct stdarg_info *si, tree counter, tree rhs,
       if (rhs_code == ADDR_EXPR 
 	  && TREE_CODE (TREE_OPERAND (rhs1, 0)) == MEM_REF
 	  && TREE_CODE (TREE_OPERAND (TREE_OPERAND (rhs1, 0), 0)) == SSA_NAME
-	  && host_integerp (TREE_OPERAND (TREE_OPERAND (rhs1, 0), 1), 1))
+	  && tree_fits_uhwi_p (TREE_OPERAND (TREE_OPERAND (rhs1, 0), 1)))
 	{
-	  ret += tree_low_cst (TREE_OPERAND (TREE_OPERAND (rhs1, 0), 1), 1);
+	  ret += tree_to_hwi (TREE_OPERAND (TREE_OPERAND (rhs1, 0), 1));
 	  lhs = TREE_OPERAND (TREE_OPERAND (rhs1, 0), 0);
 	  continue;
 	}
@@ -232,9 +232,9 @@  va_list_counter_bump (struct stdarg_info *si, tree counter, tree rhs,
       if ((rhs_code == POINTER_PLUS_EXPR
 	   || rhs_code == PLUS_EXPR)
 	  && TREE_CODE (rhs1) == SSA_NAME
-	  && host_integerp (gimple_assign_rhs2 (stmt), 1))
+	  && tree_fits_uhwi_p (gimple_assign_rhs2 (stmt)))
 	{
-	  val -= tree_low_cst (gimple_assign_rhs2 (stmt), 1);
+	  val -= tree_to_hwi (gimple_assign_rhs2 (stmt));
 	  lhs = rhs1;
 	  continue;
 	}
@@ -242,9 +242,9 @@  va_list_counter_bump (struct stdarg_info *si, tree counter, tree rhs,
       if (rhs_code == ADDR_EXPR 
 	  && TREE_CODE (TREE_OPERAND (rhs1, 0)) == MEM_REF
 	  && TREE_CODE (TREE_OPERAND (TREE_OPERAND (rhs1, 0), 0)) == SSA_NAME
-	  && host_integerp (TREE_OPERAND (TREE_OPERAND (rhs1, 0), 1), 1))
+	  && tree_fits_uhwi_p (TREE_OPERAND (TREE_OPERAND (rhs1, 0), 1)))
 	{
-	  val -= tree_low_cst (TREE_OPERAND (TREE_OPERAND (rhs1, 0), 1), 1);
+	  val -= tree_to_hwi (TREE_OPERAND (TREE_OPERAND (rhs1, 0), 1));
 	  lhs = TREE_OPERAND (TREE_OPERAND (rhs1, 0), 0);
 	  continue;
 	}
@@ -551,15 +551,15 @@  check_all_va_list_escapes (struct stdarg_info *si)
 		  if (rhs_code == MEM_REF
 		      && TREE_OPERAND (rhs, 0) == use
 		      && TYPE_SIZE_UNIT (TREE_TYPE (rhs))
-		      && host_integerp (TYPE_SIZE_UNIT (TREE_TYPE (rhs)), 1)
+		      && tree_fits_uhwi_p (TYPE_SIZE_UNIT (TREE_TYPE (rhs)))
 		      && si->offsets[SSA_NAME_VERSION (use)] != -1)
 		    {
 		      unsigned HOST_WIDE_INT gpr_size;
 		      tree access_size = TYPE_SIZE_UNIT (TREE_TYPE (rhs));
 
 		      gpr_size = si->offsets[SSA_NAME_VERSION (use)]
-			  	 + tree_low_cst (TREE_OPERAND (rhs, 1), 0)
-				 + tree_low_cst (access_size, 1);
+			  	 + tree_to_shwi (TREE_OPERAND (rhs, 1))
+				 + tree_to_uhwi (access_size);
 		      if (gpr_size >= VA_LIST_MAX_GPR_SIZE)
 			cfun->va_list_gpr_size = VA_LIST_MAX_GPR_SIZE;
 		      else if (gpr_size > cfun->va_list_gpr_size)
diff --git a/gcc/tree-switch-conversion.c b/gcc/tree-switch-conversion.c
index bbbd3ca..60aadbf 100644
--- a/gcc/tree-switch-conversion.c
+++ b/gcc/tree-switch-conversion.c
@@ -348,15 +348,13 @@  emit_case_bit_tests (gimple swtch, tree index_expr,
       else
         test[k].bits++;
 
-      lo = tree_low_cst (int_const_binop (MINUS_EXPR,
-					  CASE_LOW (cs), minval),
-			 1);
+      lo = tree_to_uhwi (int_const_binop (MINUS_EXPR,
+					  CASE_LOW (cs), minval));
       if (CASE_HIGH (cs) == NULL_TREE)
 	hi = lo;
       else
-	hi = tree_low_cst (int_const_binop (MINUS_EXPR, 
-					    CASE_HIGH (cs), minval),
-			   1);
+	hi = tree_to_uhwi (int_const_binop (MINUS_EXPR, 
+					    CASE_HIGH (cs), minval));
 
       for (j = lo; j <= hi; j++)
         if (j >= HOST_BITS_PER_WIDE_INT)
@@ -691,13 +689,13 @@  static bool
 check_range (struct switch_conv_info *info)
 {
   gcc_assert (info->range_size);
-  if (!host_integerp (info->range_size, 1))
+  if (!tree_fits_uhwi_p (info->range_size))
     {
       info->reason = "index range way too large or otherwise unusable";
       return false;
     }
 
-  if ((unsigned HOST_WIDE_INT) tree_low_cst (info->range_size, 1)
+  if ((unsigned HOST_WIDE_INT) tree_to_uhwi (info->range_size)
       > ((unsigned) info->count * SWITCH_CONVERSION_BRANCH_RATIO))
     {
       info->reason = "the maximum range-branch ratio exceeded";
@@ -797,7 +795,7 @@  create_temp_arrays (struct switch_conv_info *info)
   info->target_outbound_names = info->target_inbound_names + info->phi_count;
   for (i = 0; i < info->phi_count; i++)
     info->constructors[i]
-      = VEC_alloc (constructor_elt, gc, tree_low_cst (info->range_size, 1) + 1);
+      = VEC_alloc (constructor_elt, gc, tree_to_uhwi (info->range_size) + 1);
 }
 
 /* Free the arrays created by create_temp_arrays().  The vectors that are
diff --git a/gcc/tree-vect-data-refs.c b/gcc/tree-vect-data-refs.c
index dc6e1e7..d93c5b7 100644
--- a/gcc/tree-vect-data-refs.c
+++ b/gcc/tree-vect-data-refs.c
@@ -109,7 +109,7 @@  vect_get_smallest_scalar_type (gimple stmt, HOST_WIDE_INT *lhs_size_unit,
   tree scalar_type = gimple_expr_type (stmt);
   HOST_WIDE_INT lhs, rhs;
 
-  lhs = rhs = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (scalar_type));
+  lhs = rhs = tree_to_shwi (TYPE_SIZE_UNIT (scalar_type));
 
   if (is_gimple_assign (stmt)
       && (gimple_assign_cast_p (stmt)
@@ -119,7 +119,7 @@  vect_get_smallest_scalar_type (gimple stmt, HOST_WIDE_INT *lhs_size_unit,
     {
       tree rhs_type = TREE_TYPE (gimple_assign_rhs1 (stmt));
 
-      rhs = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (rhs_type));
+      rhs = tree_to_shwi (TYPE_SIZE_UNIT (rhs_type));
       if (rhs < lhs)
         scalar_type = rhs_type;
     }
@@ -362,16 +362,16 @@  vect_drs_dependent_in_basic_block (struct data_reference *dra,
     return true;
 
   /* Check the types.  */
-  type_size_a = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (TREE_TYPE (DR_REF (dra))));
-  type_size_b = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (TREE_TYPE (DR_REF (drb))));
+  type_size_a = tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (DR_REF (dra))));
+  type_size_b = tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (DR_REF (drb))));
 
   if (type_size_a != type_size_b
       || !types_compatible_p (TREE_TYPE (DR_REF (dra)),
                               TREE_TYPE (DR_REF (drb))))
     return true;
 
-  init_a = TREE_INT_CST_LOW (DR_INIT (dra));
-  init_b = TREE_INT_CST_LOW (DR_INIT (drb));
+  init_a = tree_to_shwi (DR_INIT (dra));
+  init_b = tree_to_shwi (DR_INIT (drb));
 
   /* Two different locations - no dependence.  */
   if (init_a != init_b)
@@ -414,8 +414,8 @@  vect_check_interleaving (struct data_reference *dra,
      2. their steps are equal
      3. the step (if greater than zero) is greater than the difference between
         data-refs' inits.  */
-  type_size_a = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (TREE_TYPE (DR_REF (dra))));
-  type_size_b = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (TREE_TYPE (DR_REF (drb))));
+  type_size_a = tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (DR_REF (dra))));
+  type_size_b = tree_to_shwi (TYPE_SIZE_UNIT (TREE_TYPE (DR_REF (drb))));
 
   if (type_size_a != type_size_b
       || tree_int_cst_compare (DR_STEP (dra), DR_STEP (drb))
@@ -423,9 +423,9 @@  vect_check_interleaving (struct data_reference *dra,
                               TREE_TYPE (DR_REF (drb))))
     return false;
 
-  init_a = TREE_INT_CST_LOW (DR_INIT (dra));
-  init_b = TREE_INT_CST_LOW (DR_INIT (drb));
-  step = TREE_INT_CST_LOW (DR_STEP (dra));
+  init_a = tree_to_shwi (DR_INIT (dra));
+  init_b = tree_to_shwi (DR_INIT (drb));
+  step = tree_to_shwi (DR_STEP (dra));
 
   if (init_a > init_b)
     {
@@ -866,7 +866,7 @@  vect_compute_data_ref_alignment (struct data_reference *dr)
   if (loop && nested_in_vect_loop_p (loop, stmt))
     {
       tree step = DR_STEP (dr);
-      HOST_WIDE_INT dr_step = TREE_INT_CST_LOW (step);
+      HOST_WIDE_INT dr_step = tree_to_shwi (step);
 
       if (dr_step % GET_MODE_SIZE (TYPE_MODE (vectype)) == 0)
         {
@@ -894,7 +894,7 @@  vect_compute_data_ref_alignment (struct data_reference *dr)
   if (!loop)
     {
       tree step = DR_STEP (dr);
-      HOST_WIDE_INT dr_step = TREE_INT_CST_LOW (step);
+      HOST_WIDE_INT dr_step = tree_to_shwi (step);
 
       if (dr_step % GET_MODE_SIZE (TYPE_MODE (vectype)) != 0)
 	{
@@ -984,7 +984,7 @@  vect_compute_data_ref_alignment (struct data_reference *dr)
   /* Modulo alignment.  */
   misalign = size_binop (FLOOR_MOD_EXPR, misalign, alignment);
 
-  if (!host_integerp (misalign, 1))
+  if (!tree_fits_uhwi_p (misalign))
     {
       /* Negative or overflowed misalignment value.  */
       if (dump_kind_p (MSG_MISSED_OPTIMIZATION))
@@ -993,7 +993,7 @@  vect_compute_data_ref_alignment (struct data_reference *dr)
       return false;
     }
 
-  SET_DR_MISALIGNMENT (dr, TREE_INT_CST_LOW (misalign));
+  SET_DR_MISALIGNMENT (dr, tree_to_uhwi (misalign));
 
   if (dump_kind_p (MSG_MISSED_OPTIMIZATION))
     {
@@ -1171,10 +1171,10 @@  vect_verify_datarefs_alignment (loop_vec_info loop_vinfo, bb_vec_info bb_vinfo)
 static bool
 not_size_aligned (tree exp)
 {
-  if (!host_integerp (TYPE_SIZE (TREE_TYPE (exp)), 1))
+  if (!tree_fits_uhwi_p (TYPE_SIZE (TREE_TYPE (exp))))
     return true;
 
-  return (TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (exp)))
+  return (tree_to_hwi (TYPE_SIZE (TREE_TYPE (exp)))
 	  > get_object_alignment (exp));
 }
 
@@ -2222,12 +2222,12 @@  vect_analyze_group_access (struct data_reference *dr)
 {
   tree step = DR_STEP (dr);
   tree scalar_type = TREE_TYPE (DR_REF (dr));
-  HOST_WIDE_INT type_size = TREE_INT_CST_LOW (TYPE_SIZE_UNIT (scalar_type));
+  HOST_WIDE_INT type_size = tree_to_shwi (TYPE_SIZE_UNIT (scalar_type));
   gimple stmt = DR_STMT (dr);
   stmt_vec_info stmt_info = vinfo_for_stmt (stmt);
   loop_vec_info loop_vinfo = STMT_VINFO_LOOP_VINFO (stmt_info);
   bb_vec_info bb_vinfo = STMT_VINFO_BB_VINFO (stmt_info);
-  HOST_WIDE_INT dr_step = TREE_INT_CST_LOW (step);
+  HOST_WIDE_INT dr_step = tree_to_shwi (step);
   HOST_WIDE_INT groupsize, last_accessed_element = 1;
   bool slp_impossible = false;
   struct loop *loop = NULL;
@@ -2364,8 +2364,8 @@  vect_analyze_group_access (struct data_reference *dr)
           data_ref = STMT_VINFO_DATA_REF (vinfo_for_stmt (next));
           /* Check that the distance between two accesses is equal to the type
              size. Otherwise, we have gaps.  */
-          diff = (TREE_INT_CST_LOW (DR_INIT (data_ref))
-                  - TREE_INT_CST_LOW (prev_init)) / type_size;
+          diff = (tree_to_shwi (DR_INIT (data_ref))
+                  - tree_to_shwi (prev_init)) / type_size;
 	  if (diff != 1)
 	    {
 	      /* FORNOW: SLP of accesses with gaps is not supported.  */
@@ -2544,7 +2544,7 @@  vect_analyze_data_ref_access (struct data_reference *dr)
   /* Consecutive?  */
   if (TREE_CODE (step) == INTEGER_CST)
     {
-      HOST_WIDE_INT dr_step = TREE_INT_CST_LOW (step);
+      HOST_WIDE_INT dr_step = tree_to_shwi (step);
       if (!tree_int_cst_compare (step, TYPE_SIZE_UNIT (scalar_type))
 	  || (dr_step < 0
 	      && !compare_tree_int (TYPE_SIZE_UNIT (scalar_type), -dr_step)))
@@ -2827,9 +2827,9 @@  vect_check_gather (gimple stmt, loop_vec_info loop_vinfo, tree *basep,
 	    }
 	  break;
 	case MULT_EXPR:
-	  if (scale == 1 && host_integerp (op1, 0))
+	  if (scale == 1 && tree_fits_shwi_p (op1))
 	    {
-	      scale = tree_low_cst (op1, 0);
+	      scale = tree_to_hwi (op1);
 	      off = op0;
 	      continue;
 	    }
@@ -4967,7 +4967,7 @@  vect_supportable_dr_alignment (struct data_reference *dr,
 	{
 	  tree vectype = STMT_VINFO_VECTYPE (stmt_info);
 	  if ((nested_in_vect_loop
-	       && (TREE_INT_CST_LOW (DR_STEP (dr))
+	       && (tree_to_shwi (DR_STEP (dr))
 	 	   != GET_MODE_SIZE (TYPE_MODE (vectype))))
               || !loop_vinfo)
 	    return dr_explicit_realign;
diff --git a/gcc/tree-vect-generic.c b/gcc/tree-vect-generic.c
index d950d81..3b6bf56 100644
--- a/gcc/tree-vect-generic.c
+++ b/gcc/tree-vect-generic.c
@@ -46,7 +46,7 @@  static void expand_vector_operations_1 (gimple_stmt_iterator *);
 static tree
 build_replicated_const (tree type, tree inner_type, HOST_WIDE_INT value)
 {
-  int width = tree_low_cst (TYPE_SIZE (inner_type), 1);
+  int width = tree_to_uhwi (TYPE_SIZE (inner_type));
   int n = HOST_BITS_PER_WIDE_INT / width;
   unsigned HOST_WIDE_INT low, high, mask;
   tree ret;
@@ -235,8 +235,8 @@  expand_vector_piecewise (gimple_stmt_iterator *gsi, elem_op_func f,
   tree part_width = TYPE_SIZE (inner_type);
   tree index = bitsize_int (0);
   int nunits = TYPE_VECTOR_SUBPARTS (type);
-  int delta = tree_low_cst (part_width, 1)
-	      / tree_low_cst (TYPE_SIZE (TREE_TYPE (type)), 1);
+  int delta = tree_to_uhwi (part_width)
+	      / tree_to_uhwi (TYPE_SIZE (TREE_TYPE (type)));
   int i;
   location_t loc = gimple_location (gsi_stmt (*gsi));
 
@@ -269,7 +269,7 @@  expand_vector_parallel (gimple_stmt_iterator *gsi, elem_op_func f, tree type,
 {
   tree result, compute_type;
   enum machine_mode mode;
-  int n_words = tree_low_cst (TYPE_SIZE_UNIT (type), 1) / UNITS_PER_WORD;
+  int n_words = tree_to_uhwi (TYPE_SIZE_UNIT (type)) / UNITS_PER_WORD;
   location_t loc = gimple_location (gsi_stmt (*gsi));
 
   /* We have three strategies.  If the type is already correct, just do
@@ -292,7 +292,7 @@  expand_vector_parallel (gimple_stmt_iterator *gsi, elem_op_func f, tree type,
   else
     {
       /* Use a single scalar operation with a mode no wider than word_mode.  */
-      mode = mode_for_size (tree_low_cst (TYPE_SIZE (type), 1), MODE_INT, 0);
+      mode = mode_for_size (tree_to_uhwi (TYPE_SIZE (type)), MODE_INT, 0);
       compute_type = lang_hooks.types.type_for_mode (mode, 1);
       result = f (gsi, compute_type, a, b, NULL_TREE, NULL_TREE, code);
       warning_at (loc, OPT_Wvector_operation_performance,
@@ -314,7 +314,7 @@  expand_vector_addition (gimple_stmt_iterator *gsi,
 			tree type, tree a, tree b, enum tree_code code)
 {
   int parts_per_word = UNITS_PER_WORD
-	  	       / tree_low_cst (TYPE_SIZE_UNIT (TREE_TYPE (type)), 1);
+	  	       / tree_to_uhwi (TYPE_SIZE_UNIT (TREE_TYPE (type)));
 
   if (INTEGRAL_TYPE_P (TREE_TYPE (type))
       && parts_per_word >= 4
@@ -475,7 +475,7 @@  expand_vector_divmod (gimple_stmt_iterator *gsi, tree type, tree op0,
       tree cst = VECTOR_CST_ELT (op1, i);
       unsigned HOST_WIDE_INT ml;
 
-      if (!host_integerp (cst, unsignedp) || integer_zerop (cst))
+      if (!tree_fits_hwi_p (cst, unsignedp) || integer_zerop (cst))
 	return NULL_TREE;
       pre_shifts[i] = 0;
       post_shifts[i] = 0;
@@ -496,7 +496,7 @@  expand_vector_divmod (gimple_stmt_iterator *gsi, tree type, tree op0,
       if (unsignedp)
 	{
 	  unsigned HOST_WIDE_INT mh;
-	  unsigned HOST_WIDE_INT d = tree_low_cst (cst, 1) & mask;
+	  unsigned HOST_WIDE_INT d = tree_to_uhwi (cst) & mask;
 
 	  if (d >= ((unsigned HOST_WIDE_INT) 1 << (prec - 1)))
 	    /* FIXME: Can transform this into op0 >= op1 ? 1 : 0.  */
@@ -528,9 +528,9 @@  expand_vector_divmod (gimple_stmt_iterator *gsi, tree type, tree op0,
 		      unsigned HOST_WIDE_INT d2;
 		      int this_pre_shift;
 
-		      if (!host_integerp (cst2, 1))
+		      if (!tree_fits_uhwi_p (cst2))
 			return NULL_TREE;
-		      d2 = tree_low_cst (cst2, 1) & mask;
+		      d2 = tree_to_hwi (cst2) & mask;
 		      if (d2 == 0)
 			return NULL_TREE;
 		      this_pre_shift = floor_log2 (d2 & -d2);
@@ -566,7 +566,7 @@  expand_vector_divmod (gimple_stmt_iterator *gsi, tree type, tree op0,
 	}
       else
 	{
-	  HOST_WIDE_INT d = tree_low_cst (cst, 0);
+	  HOST_WIDE_INT d = tree_to_shwi (cst);
 	  unsigned HOST_WIDE_INT abs_d;
 
 	  if (d == -1)
@@ -1030,8 +1030,8 @@  vector_element (gimple_stmt_iterator *gsi, tree vect, tree idx, tree *ptmpvec)
 
       /* Given that we're about to compute a binary modulus,
 	 we don't care about the high bits of the value.  */
-      index = TREE_INT_CST_LOW (idx);
-      if (!host_integerp (idx, 1) || index >= elements)
+      index = tree_to_hwi (idx);
+      if (!tree_fits_uhwi_p (idx) || index >= elements)
 	{
 	  index &= elements - 1;
 	  idx = build_int_cst (TREE_TYPE (idx), index);
@@ -1136,7 +1136,7 @@  lower_vec_perm (gimple_stmt_iterator *gsi)
       unsigned char *sel_int = XALLOCAVEC (unsigned char, elements);
 
       for (i = 0; i < elements; ++i)
-	sel_int[i] = (TREE_INT_CST_LOW (VECTOR_CST_ELT (mask, i))
+	sel_int[i] = (tree_to_hwi (VECTOR_CST_ELT (mask, i))
 		      & (2 * elements - 1));
 
       if (can_vec_perm_p (TYPE_MODE (vect_type), false, sel_int))
@@ -1162,8 +1162,8 @@  lower_vec_perm (gimple_stmt_iterator *gsi)
         {
 	  unsigned HOST_WIDE_INT index;
 
-	  index = TREE_INT_CST_LOW (i_val);
-	  if (!host_integerp (i_val, 1) || index >= elements)
+	  index = tree_to_hwi (i_val);
+	  if (!tree_fits_uhwi_p (i_val) || index >= elements)
 	    i_val = build_int_cst (mask_elt_type, index & (elements - 1));
 
           if (two_operand_p && (index & elements) != 0)
diff --git a/gcc/tree-vect-loop.c b/gcc/tree-vect-loop.c
index 58edfcb..a8d3960 100644
--- a/gcc/tree-vect-loop.c
+++ b/gcc/tree-vect-loop.c
@@ -1242,7 +1242,7 @@  vect_analyze_loop_form (struct loop *loop)
 	  dump_generic_expr (MSG_NOTE, TDF_DETAILS, number_of_iterations);
         }
     }
-  else if (TREE_INT_CST_LOW (number_of_iterations) == 0)
+  else if (tree_to_uhwi (number_of_iterations) == 0)
     {
       if (dump_kind_p (MSG_MISSED_OPTIMIZATION))
 	dump_printf_loc (MSG_MISSED_OPTIMIZATION, vect_location,
@@ -3048,10 +3048,10 @@  vect_model_reduction_cost (stmt_vec_info stmt_info, enum tree_code reduc_code,
 	}
       else
 	{
-	  int vec_size_in_bits = tree_low_cst (TYPE_SIZE (vectype), 1);
+	  int vec_size_in_bits = tree_to_uhwi (TYPE_SIZE (vectype));
 	  tree bitsize =
 	    TYPE_SIZE (TREE_TYPE (gimple_assign_lhs (orig_stmt)));
-	  int element_bitsize = tree_low_cst (bitsize, 1);
+	  int element_bitsize = tree_to_uhwi (bitsize);
 	  int nelements = vec_size_in_bits / element_bitsize;
 
 	  optab = optab_for_tree_code (code, vectype, optab_default);
@@ -3519,7 +3519,7 @@  get_initial_def_for_reduction (gimple stmt, tree init_val,
       if (SCALAR_FLOAT_TYPE_P (scalar_type))
         init_value = build_real (scalar_type, TREE_REAL_CST (init_val));
       else
-        init_value = build_int_cst (scalar_type, TREE_INT_CST_LOW (init_val));
+        init_value = build_int_cst (scalar_type, tree_to_shwi (init_val));
     }
   else
     init_value = init_val;
@@ -4019,8 +4019,8 @@  vect_create_epilog_for_reduction (VEC (tree, heap) *vect_defs, gimple stmt,
       enum tree_code shift_code = ERROR_MARK;
       bool have_whole_vector_shift = true;
       int bit_offset;
-      int element_bitsize = tree_low_cst (bitsize, 1);
-      int vec_size_in_bits = tree_low_cst (TYPE_SIZE (vectype), 1);
+      int element_bitsize = tree_to_uhwi (bitsize);
+      int vec_size_in_bits = tree_to_uhwi (TYPE_SIZE (vectype));
       tree vec_temp;
 
       if (optab_handler (vec_shr_optab, mode) != CODE_FOR_nothing)
@@ -4097,7 +4097,7 @@  vect_create_epilog_for_reduction (VEC (tree, heap) *vect_defs, gimple stmt,
             dump_printf_loc (MSG_NOTE, vect_location,
 			     "Reduce using scalar code. ");
 
-          vec_size_in_bits = tree_low_cst (TYPE_SIZE (vectype), 1);
+          vec_size_in_bits = tree_to_uhwi (TYPE_SIZE (vectype));
           FOR_EACH_VEC_ELT (gimple, new_phis, i, new_phi)
             {
               if (gimple_code (new_phi) == GIMPLE_PHI)
diff --git a/gcc/tree-vect-patterns.c b/gcc/tree-vect-patterns.c
index b0974ec..a9c421e 100644
--- a/gcc/tree-vect-patterns.c
+++ b/gcc/tree-vect-patterns.c
@@ -772,8 +772,8 @@  vect_recog_pow_pattern (VEC (gimple, heap) **stmts, tree *type_in,
   *type_out = NULL_TREE;
 
   /* Catch squaring.  */
-  if ((host_integerp (exp, 0)
-       && tree_low_cst (exp, 0) == 2)
+  if ((tree_fits_shwi_p (exp)
+       && tree_to_hwi (exp) == 2)
       || (TREE_CODE (exp) == REAL_CST
           && REAL_VALUES_EQUAL (TREE_REAL_CST (exp), dconst2)))
     {
@@ -1800,7 +1800,7 @@  vect_recog_divmod_pattern (VEC (gimple, heap) **stmts,
       return pattern_stmt;
     }
 
-  if (!host_integerp (oprnd1, TYPE_UNSIGNED (itype))
+  if (!tree_fits_hwi_p (oprnd1, TYPE_UNSIGNED (itype))
       || integer_zerop (oprnd1)
       || prec > HOST_BITS_PER_WIDE_INT)
     return NULL;
@@ -1814,7 +1814,7 @@  vect_recog_divmod_pattern (VEC (gimple, heap) **stmts,
     {
       unsigned HOST_WIDE_INT mh, ml;
       int pre_shift, post_shift;
-      unsigned HOST_WIDE_INT d = tree_low_cst (oprnd1, 1)
+      unsigned HOST_WIDE_INT d = tree_to_uhwi (oprnd1)
 				 & GET_MODE_MASK (TYPE_MODE (itype));
       tree t1, t2, t3, t4;
 
@@ -1931,7 +1931,7 @@  vect_recog_divmod_pattern (VEC (gimple, heap) **stmts,
     {
       unsigned HOST_WIDE_INT ml;
       int post_shift;
-      HOST_WIDE_INT d = tree_low_cst (oprnd1, 0);
+      HOST_WIDE_INT d = tree_to_shwi (oprnd1);
       unsigned HOST_WIDE_INT abs_d;
       bool add = false;
       tree t1, t2, t3, t4;
diff --git a/gcc/tree-vect-stmts.c b/gcc/tree-vect-stmts.c
index 92eaac4..576d6e1 100644
--- a/gcc/tree-vect-stmts.c
+++ b/gcc/tree-vect-stmts.c
@@ -4928,7 +4928,7 @@  vectorizable_load (gimple stmt, gimple_stmt_iterator *gsi, gimple *vec_stmt,
      nested within an outer-loop that is being vectorized.  */
 
   if (nested_in_vect_loop
-      && (TREE_INT_CST_LOW (DR_STEP (dr))
+      && (tree_to_shwi (DR_STEP (dr))
 	  % GET_MODE_SIZE (TYPE_MODE (vectype)) != 0))
     {
       gcc_assert (alignment_support_scheme != dr_explicit_realign_optimized);
diff --git a/gcc/tree-vectorizer.h b/gcc/tree-vectorizer.h
index 5762e00..adb994b 100644
--- a/gcc/tree-vectorizer.h
+++ b/gcc/tree-vectorizer.h
@@ -318,7 +318,7 @@  typedef struct _loop_vec_info {
 #define LOOP_VINFO_LOOP_NEST(L)            (L)->loop_nest
 #define LOOP_VINFO_DATAREFS(L)             (L)->datarefs
 #define LOOP_VINFO_DDRS(L)                 (L)->ddrs
-#define LOOP_VINFO_INT_NITERS(L)           (TREE_INT_CST_LOW ((L)->num_iters))
+#define LOOP_VINFO_INT_NITERS(L)           (tree_to_shwi ((L)->num_iters))
 #define LOOP_PEELING_FOR_ALIGNMENT(L)      (L)->peeling_for_alignment
 #define LOOP_VINFO_UNALIGNED_DR(L)         (L)->unaligned_dr
 #define LOOP_VINFO_MAY_MISALIGN_STMTS(L)   (L)->may_misalign_stmts
@@ -340,8 +340,8 @@  VEC_length (gimple, (L)->may_misalign_stmts) > 0
 VEC_length (ddr_p, (L)->may_alias_ddrs) > 0
 
 #define NITERS_KNOWN_P(n)                     \
-(host_integerp ((n),0)                        \
-&& TREE_INT_CST_LOW ((n)) > 0)
+(tree_fits_shwi_p ((n))                         \
+ && tree_to_hwi ((n)) > 0)
 
 #define LOOP_VINFO_NITERS_KNOWN_P(L)          \
 NITERS_KNOWN_P((L)->num_iters)
diff --git a/gcc/tree-vrp.c b/gcc/tree-vrp.c
index de3eb2c..b75d85a 100644
--- a/gcc/tree-vrp.c
+++ b/gcc/tree-vrp.c
@@ -2769,7 +2769,7 @@  extract_range_from_binary_expr_1 (value_range_t *vr,
 	      vr1p.min
 		= double_int_to_tree (expr_type,
 				      double_int_one
-				      .llshift (TREE_INT_CST_LOW (vr1.min),
+				      .llshift (tree_to_shwi (vr1.min),
 					        TYPE_PRECISION (expr_type)));
 	      vr1p.max = vr1p.min;
 	      /* We have to use a wrapping multiply though as signed overflow
@@ -2794,7 +2794,7 @@  extract_range_from_binary_expr_1 (value_range_t *vr,
 	      if (!uns)
 		overflow_pos -= 1;
 
-	      bound_shift = overflow_pos - TREE_INT_CST_LOW (vr1.max);
+	      bound_shift = overflow_pos - tree_to_shwi (vr1.max);
 	      /* If bound_shift == HOST_BITS_PER_DOUBLE_INT, the llshift can
 		 overflow.  However, for that to happen, vr1.max needs to be
 		 zero, which means vr1 is a singleton range of zero, which
@@ -4777,15 +4777,15 @@  register_edge_assert_for_2 (tree name, edge e, gimple_stmt_iterator bsi,
 	  name2 = gimple_assign_rhs1 (def_stmt);
 	  cst2 = gimple_assign_rhs2 (def_stmt);
 	  if (TREE_CODE (name2) == SSA_NAME
-	      && host_integerp (cst2, 1)
+	      && tree_fits_uhwi_p (cst2)
 	      && INTEGRAL_TYPE_P (TREE_TYPE (name2))
-	      && IN_RANGE (tree_low_cst (cst2, 1), 1, prec - 1)
+	      && IN_RANGE (tree_to_uhwi (cst2), 1, prec - 1)
 	      && prec <= HOST_BITS_PER_DOUBLE_INT
 	      && prec == GET_MODE_PRECISION (TYPE_MODE (TREE_TYPE (val)))
 	      && live_on_edge (e, name2)
 	      && !has_single_use (name2))
 	    {
-	      mask = double_int::mask (tree_low_cst (cst2, 1));
+	      mask = double_int::mask (tree_to_uhwi (cst2));
 	      val2 = fold_binary (LSHIFT_EXPR, TREE_TYPE (val), val, cst2);
 	    }
 	}
diff --git a/gcc/tree.c b/gcc/tree.c
index 2af5fca..d85f9a6 100644
--- a/gcc/tree.c
+++ b/gcc/tree.c
@@ -2370,12 +2370,11 @@  int_size_in_bytes (const_tree type)
   t = TYPE_SIZE_UNIT (type);
   if (t == 0
       || TREE_CODE (t) != INTEGER_CST
-      || TREE_INT_CST_HIGH (t) != 0
       /* If the result would appear negative, it's too big to represent.  */
-      || (HOST_WIDE_INT) TREE_INT_CST_LOW (t) < 0)
+      || !tree_fits_uhwi_p (t))
     return -1;
 
-  return TREE_INT_CST_LOW (t);
+  return tree_to_hwi (t);
 }
 
 /* Return the maximum size of TYPE (in bytes) as a wide integer
@@ -2393,8 +2392,8 @@  max_int_size_in_bytes (const_tree type)
     {
       size_tree = TYPE_ARRAY_MAX_SIZE (type);
 
-      if (size_tree && host_integerp (size_tree, 1))
-	size = tree_low_cst (size_tree, 1);
+      if (size_tree && tree_fits_uhwi_p (size_tree))
+	size = tree_to_hwi (size_tree);
     }
 
   /* If we still haven't been able to get a size, see if the language
@@ -2404,8 +2403,8 @@  max_int_size_in_bytes (const_tree type)
     {
       size_tree = lang_hooks.types.max_size (type);
 
-      if (size_tree && host_integerp (size_tree, 1))
-	size = tree_low_cst (size_tree, 1);
+      if (size_tree && tree_fits_uhwi_p (size_tree))
+	size = tree_to_hwi (size_tree);
     }
 
   return size;
@@ -2440,7 +2439,7 @@  bit_position (const_tree field)
 HOST_WIDE_INT
 int_bit_position (const_tree field)
 {
-  return tree_low_cst (bit_position (field), 0);
+  return tree_to_shwi (bit_position (field));
 }
 
 /* Return the byte position of FIELD, in bytes from the start of the record.
@@ -2460,7 +2459,7 @@  byte_position (const_tree field)
 HOST_WIDE_INT
 int_byte_position (const_tree field)
 {
-  return tree_low_cst (byte_position (field), 0);
+  return tree_to_shwi (byte_position (field));
 }
 
 /* Return the strictest alignment, in bits, that T is known to have.  */
@@ -4681,7 +4680,7 @@  free_lang_data_in_decl (tree decl)
          DECL_VINDEX referring to itself into a vtable slot number as it
 	 should.  Happens with functions that are copied and then forgotten
 	 about.  Just clear it, it won't matter anymore.  */
-      if (DECL_VINDEX (decl) && !host_integerp (DECL_VINDEX (decl), 0))
+      if (DECL_VINDEX (decl) && !tree_fits_shwi_p (DECL_VINDEX (decl)))
 	DECL_VINDEX (decl) = NULL_TREE;
     }
   else if (TREE_CODE (decl) == VAR_DECL)
@@ -6532,37 +6531,6 @@  tree_int_cst_compare (const_tree t1, const_tree t2)
     return 0;
 }
 
-/* Return 1 if T is an INTEGER_CST that can be manipulated efficiently on
-   the host.  If POS is zero, the value can be represented in a single
-   HOST_WIDE_INT.  If POS is nonzero, the value must be non-negative and can
-   be represented in a single unsigned HOST_WIDE_INT.  */
-
-int
-host_integerp (const_tree t, int pos)
-{
-  if (t == NULL_TREE)
-    return 0;
-
-  return (TREE_CODE (t) == INTEGER_CST
-	  && ((TREE_INT_CST_HIGH (t) == 0
-	       && (HOST_WIDE_INT) TREE_INT_CST_LOW (t) >= 0)
-	      || (! pos && TREE_INT_CST_HIGH (t) == -1
-		  && (HOST_WIDE_INT) TREE_INT_CST_LOW (t) < 0
-		  && !TYPE_UNSIGNED (TREE_TYPE (t)))
-	      || (pos && TREE_INT_CST_HIGH (t) == 0)));
-}
-
-/* Return the HOST_WIDE_INT least significant bits of T if it is an
-   INTEGER_CST and there is no overflow.  POS is nonzero if the result must
-   be non-negative.  We must be able to satisfy the above conditions.  */
-
-HOST_WIDE_INT
-tree_low_cst (const_tree t, int pos)
-{
-  gcc_assert (host_integerp (t, pos));
-  return TREE_INT_CST_LOW (t);
-}
-
 /* Return the HOST_WIDE_INT least significant bits of T, a sizetype
    kind INTEGER_CST.  This makes sure to properly sign-extend the
    constant.  */
@@ -6850,7 +6818,7 @@  compare_tree_int (const_tree t, unsigned HOST_WIDE_INT u)
 bool
 valid_constant_size_p (const_tree size)
 {
-  if (! host_integerp (size, 1)
+  if (! tree_fits_uhwi_p (size)
       || TREE_OVERFLOW (size)
       || tree_int_cst_sign_bit (size) != 0)
     return false;
@@ -7303,8 +7271,8 @@  build_nonstandard_integer_type (unsigned HOST_WIDE_INT precision,
     fixup_signed_type (itype);
 
   ret = itype;
-  if (host_integerp (TYPE_MAX_VALUE (itype), 1))
-    ret = type_hash_canon (tree_low_cst (TYPE_MAX_VALUE (itype), 1), itype);
+  if (tree_fits_uhwi_p (TYPE_MAX_VALUE (itype)))
+    ret = type_hash_canon (tree_to_hwi (TYPE_MAX_VALUE (itype)), itype);
   if (precision <= MAX_INT_CACHED_PREC)
     nonstandard_integer_type_cache[precision + unsignedp] = ret;
 
@@ -8245,10 +8213,10 @@  get_narrower (tree op, int *unsignedp_ptr)
       && TREE_CODE (TREE_TYPE (op)) != FIXED_POINT_TYPE
       /* Ensure field is laid out already.  */
       && DECL_SIZE (TREE_OPERAND (op, 1)) != 0
-      && host_integerp (DECL_SIZE (TREE_OPERAND (op, 1)), 1))
+      && tree_fits_uhwi_p (DECL_SIZE (TREE_OPERAND (op, 1))))
     {
       unsigned HOST_WIDE_INT innerprec
-	= tree_low_cst (DECL_SIZE (TREE_OPERAND (op, 1)), 1);
+	= tree_to_hwi (DECL_SIZE (TREE_OPERAND (op, 1)));
       int unsignedp = (DECL_UNSIGNED (TREE_OPERAND (op, 1))
 		       || TYPE_UNSIGNED (TREE_TYPE (TREE_OPERAND (op, 1))));
       tree type = lang_hooks.types.type_for_size (innerprec, unsignedp);
@@ -9843,10 +9811,10 @@  build_vector_type_for_mode (tree innertype, enum machine_mode mode)
     case MODE_INT:
       /* Check that there are no leftover bits.  */
       gcc_assert (GET_MODE_BITSIZE (mode)
-		  % TREE_INT_CST_LOW (TYPE_SIZE (innertype)) == 0);
-
+		  % tree_to_shwi (TYPE_SIZE (innertype)) == 0);
+
       nunits = GET_MODE_BITSIZE (mode)
-	       / TREE_INT_CST_LOW (TYPE_SIZE (innertype));
+	       / tree_to_shwi (TYPE_SIZE (innertype));
       break;
 
     default:
@@ -10183,11 +10151,7 @@  HOST_WIDE_INT
 int_cst_value (const_tree x)
 {
   unsigned bits = TYPE_PRECISION (TREE_TYPE (x));
-  unsigned HOST_WIDE_INT val = TREE_INT_CST_LOW (x);
-
-  /* Make sure the sign-extended value will fit in a HOST_WIDE_INT.  */
-  gcc_assert (TREE_INT_CST_HIGH (x) == 0
-	      || TREE_INT_CST_HIGH (x) == -1);
+  unsigned HOST_WIDE_INT val = tree_to_hwi (x);
 
   if (bits < HOST_BITS_PER_WIDE_INT)
     {
@@ -10207,7 +10171,7 @@  HOST_WIDEST_INT
 widest_int_cst_value (const_tree x)
 {
   unsigned bits = TYPE_PRECISION (TREE_TYPE (x));
-  unsigned HOST_WIDEST_INT val = TREE_INT_CST_LOW (x);
+  unsigned HOST_WIDEST_INT val = tree_to_hwi (x);
 
 #if HOST_BITS_PER_WIDEST_INT > HOST_BITS_PER_WIDE_INT
   gcc_assert (HOST_BITS_PER_WIDEST_INT >= HOST_BITS_PER_DOUBLE_INT);
@@ -11397,7 +11361,7 @@  get_binfo_at_offset (tree binfo, HOST_WIDE_INT offset, tree expected_type)
 	    continue;
 
 	  pos = int_bit_position (fld);
-	  size = tree_low_cst (DECL_SIZE (fld), 1);
+	  size = tree_to_uhwi (DECL_SIZE (fld));
 	  if (pos <= offset && (pos + size) > offset)
 	    break;
 	}
diff --git a/gcc/tree.h b/gcc/tree.h
index ff4ae52..c9c5869 100644
--- a/gcc/tree.h
+++ b/gcc/tree.h
@@ -735,6 +735,8 @@  enum tree_node_structure_enum {
 };
 #undef DEFTREESTRUCT
 
+#define NULL_TREE (tree) NULL
+
 /* Define accessors for the fields that all tree nodes have
    (though some fields are not used for all kinds of nodes).  */
 
@@ -1112,7 +1114,7 @@  extern void omp_clause_range_check_failed (const_tree, const char *, int,
 #define SET_PREDICT_EXPR_OUTCOME(NODE, OUTCOME) \
   (PREDICT_EXPR_CHECK(NODE)->base.addressable_flag = (int) OUTCOME)
 #define PREDICT_EXPR_PREDICTOR(NODE) \
-  ((enum br_predictor)tree_low_cst (TREE_OPERAND (PREDICT_EXPR_CHECK (NODE), 0), 0))
+  ((enum br_predictor)tree_to_shwi (TREE_OPERAND (PREDICT_EXPR_CHECK (NODE), 0)))
 
 /* In a VAR_DECL, nonzero means allocate static storage.
    In a FUNCTION_DECL, nonzero if function has been defined.
@@ -1601,7 +1603,7 @@  struct GTY(()) tree_constructor {
    Note that we have to bypass the use of TREE_OPERAND to access
    that field to avoid infinite recursion in expanding the macros.  */
 #define VL_EXP_OPERAND_LENGTH(NODE) \
-  ((int)TREE_INT_CST_LOW (VL_EXP_CHECK (NODE)->exp.operands[0]))
+  ((int)tree_to_uhwi (VL_EXP_CHECK (NODE)->exp.operands[0]))
 
 /* Nonzero if is_gimple_debug() may possibly hold.  */
 #define MAY_HAVE_DEBUG_STMTS    (flag_var_tracking_assignments)
@@ -1700,7 +1702,7 @@  extern void protected_set_expr_location (tree, location_t);
 #define CHREC_VAR(NODE)           TREE_OPERAND (POLYNOMIAL_CHREC_CHECK (NODE), 0)
 #define CHREC_LEFT(NODE)          TREE_OPERAND (POLYNOMIAL_CHREC_CHECK (NODE), 1)
 #define CHREC_RIGHT(NODE)         TREE_OPERAND (POLYNOMIAL_CHREC_CHECK (NODE), 2)
-#define CHREC_VARIABLE(NODE)      TREE_INT_CST_LOW (CHREC_VAR (NODE))
+#define CHREC_VARIABLE(NODE)      tree_to_shwi (CHREC_VAR (NODE))
 
 /* LABEL_EXPR accessor. This gives access to the label associated with
    the given label expression.  */
@@ -4080,6 +4082,114 @@  omp_clause_elt_check (const_tree __t, int __i,
 
 #endif
 
+
+/* Return true if CST is an INTEGER_CST whose value must be non-negative
+   and can be represented in a single unsigned HOST_WIDE_INT.  */
+
+static inline bool
+tree_fits_uhwi_p (const_tree cst)
+{
+  if (cst == NULL_TREE)
+    return false;
+
+  if (TREE_CODE (cst) != INTEGER_CST)
+    return false;
+
+#ifdef NEW_REP_FOR_INT_CST
+  return TREE_INT_CST_NUNITS (cst) == 1
+    || (TREE_INT_CST_NUNITS (cst) == 2 && TREE_INT_CST_ELT (cst, 1) == 0);
+
+#else
+  return (TREE_INT_CST_HIGH (cst) == 0);
+#endif
+}
+
+/* Return true if CST is an INTEGER_CST whose value can be represented
+   in a single HOST_WIDE_INT.  */
+
+static inline bool
+tree_fits_shwi_p (const_tree cst)
+{
+  if (cst == NULL_TREE)
+    return false;
+
+  if (TREE_CODE (cst) != INTEGER_CST)
+    return false;
+
+#ifdef NEW_REP_FOR_INT_CST
+  return TREE_INT_CST_NUNITS (cst) == 1;
+#else
+  return ((TREE_INT_CST_HIGH (cst) == 0
+	   && (HOST_WIDE_INT) TREE_INT_CST_LOW (cst) >= 0)
+	  || (TREE_INT_CST_HIGH (cst) == -1
+	      && (HOST_WIDE_INT) TREE_INT_CST_LOW (cst) < 0
+	      && !TYPE_UNSIGNED (TREE_TYPE (cst))));
+#endif
+}
+
+/* Return true if CST is an INTEGER_CST that can be manipulated
+   efficiently on the host.  If POS is false, the value can be
+   represented in a single HOST_WIDE_INT.  If POS is true, the value
+   must be non-negative and can be represented in a single unsigned
+   HOST_WIDE_INT.  */
+
+static inline bool
+tree_fits_hwi_p (const_tree cst, bool pos)
+{
+  if (cst == NULL_TREE)
+    return false;
+
+  if (TREE_CODE (cst) != INTEGER_CST)
+    return false;
+
+  return pos ? tree_fits_uhwi_p (cst) : tree_fits_shwi_p (cst);
+}
+
+/* Return the unsigned HOST_WIDE_INT least significant bits of CST.
+   If checking is enabled, this ICEs if the value does not fit.  */
+
+static inline unsigned HOST_WIDE_INT
+tree_to_uhwi (const_tree cst)
+{
+  gcc_checking_assert (tree_fits_uhwi_p (cst));
+
+#ifdef NEW_REP_FOR_INT_CST
+  return (unsigned HOST_WIDE_INT)TREE_INT_CST_ELT (cst, 0);
+#else
+  return (unsigned HOST_WIDE_INT)TREE_INT_CST_LOW (cst);
+#endif
+}
+
+/* Return the HOST_WIDE_INT least significant bits of CST.  If
+   checking is enabled, this ICEs if the value does not fit.  */
+
+static inline HOST_WIDE_INT
+tree_to_shwi (const_tree cst)
+{
+  gcc_checking_assert (tree_fits_shwi_p (cst));
+
+#ifdef NEW_REP_FOR_INT_CST
+  return (HOST_WIDE_INT)TREE_INT_CST_ELT (cst, 0);
+#else
+  return (HOST_WIDE_INT)TREE_INT_CST_LOW (cst);
+#endif
+}
+
+/* Return the HOST_WIDE_INT least significant bits of CST.  No
+   checking is done; the caller is expected to have already verified
+   the value with tree_fits_uhwi_p or tree_fits_shwi_p.  */
+
+static inline HOST_WIDE_INT
+tree_to_hwi (const_tree cst)
+{
+#ifdef NEW_REP_FOR_INT_CST
+  return TREE_INT_CST_ELT (cst, 0);
+#else
+  return TREE_INT_CST_LOW (cst);
+#endif
+}
+
+
 /* Compute the number of operands in an expression node NODE.  For
    tcc_vl_exp nodes like CALL_EXPRs, this is stored in the node itself,
    otherwise it is looked up from the node's code.  */
@@ -4547,8 +4657,6 @@  enum ptrmemfunc_vbit_where_t
   ptrmemfunc_vbit_in_delta
 };
 
-#define NULL_TREE (tree) NULL
-
 /* True if NODE is an erroneous expression.  */
 
 #define error_operand_p(NODE)					\
@@ -4821,20 +4929,6 @@  extern int attribute_list_contained (const_tree, const_tree);
 extern int tree_int_cst_equal (const_tree, const_tree);
 extern int tree_int_cst_lt (const_tree, const_tree);
 extern int tree_int_cst_compare (const_tree, const_tree);
-extern int host_integerp (const_tree, int)
-#ifndef ENABLE_TREE_CHECKING
-  ATTRIBUTE_PURE /* host_integerp is pure only when checking is disabled.  */
-#endif
-  ;
-extern HOST_WIDE_INT tree_low_cst (const_tree, int);
-#if !defined ENABLE_TREE_CHECKING && (GCC_VERSION >= 4003)
-extern inline __attribute__ ((__gnu_inline__)) HOST_WIDE_INT
-tree_low_cst (const_tree t, int pos)
-{
-  gcc_assert (host_integerp (t, pos));
-  return TREE_INT_CST_LOW (t);
-}
-#endif
 extern HOST_WIDE_INT size_low_cst (const_tree);
 extern int tree_int_cst_sgn (const_tree);
 extern int tree_int_cst_sign_bit (const_tree);
diff --git a/gcc/var-tracking.c b/gcc/var-tracking.c
index fb6bc1c..0b2d35e 100644
--- a/gcc/var-tracking.c
+++ b/gcc/var-tracking.c
@@ -6147,9 +6147,9 @@  prepare_call_arguments (basic_block bb, rtx insn)
 			  && DECL_INITIAL (SYMBOL_REF_DECL (l->loc)))
 			{
 			  initial = DECL_INITIAL (SYMBOL_REF_DECL (l->loc));
-			  if (host_integerp (initial, 0))
+			  if (tree_fits_shwi_p (initial))
 			    {
-			      item = GEN_INT (tree_low_cst (initial, 0));
+			      item = GEN_INT (tree_to_shwi (initial));
 			      item = gen_rtx_CONCAT (indmode, mem, item);
 			      call_arguments
 				= gen_rtx_EXPR_LIST (VOIDmode, item,
@@ -6232,7 +6232,7 @@  prepare_call_arguments (basic_block bb, rtx insn)
 	= TYPE_MODE (TREE_TYPE (OBJ_TYPE_REF_EXPR (obj_type_ref)));
       rtx clobbered = gen_rtx_MEM (mode, this_arg);
       HOST_WIDE_INT token
-	= tree_low_cst (OBJ_TYPE_REF_TOKEN (obj_type_ref), 0);
+	= tree_to_shwi (OBJ_TYPE_REF_TOKEN (obj_type_ref));
       if (token)
 	clobbered = plus_constant (mode, clobbered,
 				   token * GET_MODE_SIZE (mode));
@@ -8527,7 +8527,7 @@  emit_note_insn_var_location (void **varp, void *data)
       ++n_var_parts;
     }
   type_size_unit = TYPE_SIZE_UNIT (TREE_TYPE (decl));
-  if ((unsigned HOST_WIDE_INT) last_limit < TREE_INT_CST_LOW (type_size_unit))
+  if ((unsigned HOST_WIDE_INT) last_limit < tree_to_uhwi (type_size_unit))
     complete = false;
 
   if (! flag_var_tracking_uninit)
diff --git a/gcc/varasm.c b/gcc/varasm.c
index 2eaaf03..0666fcb 100644
--- a/gcc/varasm.c
+++ b/gcc/varasm.c
@@ -1065,7 +1065,7 @@  get_block_for_decl (tree decl)
      constant size.  */
   if (DECL_SIZE_UNIT (decl) == NULL)
     return NULL;
-  if (!host_integerp (DECL_SIZE_UNIT (decl), 1))
+  if (!tree_fits_uhwi_p (DECL_SIZE_UNIT (decl)))
     return NULL;
 
   /* Find out which section should contain DECL.  We cannot put it into
@@ -1828,7 +1828,7 @@  assemble_noswitch_variable (tree decl, const char *name, section *sect)
 {
   unsigned HOST_WIDE_INT size, rounded;
 
-  size = tree_low_cst (DECL_SIZE_UNIT (decl), 1);
+  size = tree_to_uhwi (DECL_SIZE_UNIT (decl));
   rounded = size;
 
   /* Don't allocate zero bytes of common,
@@ -1872,11 +1872,11 @@  assemble_variable_contents (tree decl, const char *name,
 	  && !initializer_zerop (DECL_INITIAL (decl)))
 	/* Output the actual data.  */
 	output_constant (DECL_INITIAL (decl),
-			 tree_low_cst (DECL_SIZE_UNIT (decl), 1),
+			 tree_to_uhwi (DECL_SIZE_UNIT (decl)),
 			 DECL_ALIGN (decl));
       else
 	/* Leave space for it.  */
-	assemble_zeros (tree_low_cst (DECL_SIZE_UNIT (decl), 1));
+	assemble_zeros (tree_to_uhwi (DECL_SIZE_UNIT (decl)));
     }
 }
 
@@ -2568,7 +2568,7 @@  decode_addr_const (tree exp, struct addr_const *value)
   while (1)
     {
       if (TREE_CODE (target) == COMPONENT_REF
-	  && host_integerp (byte_position (TREE_OPERAND (target, 1)), 0))
+	  && tree_fits_shwi_p (byte_position (TREE_OPERAND (target, 1))))
 	{
 	  offset += int_byte_position (TREE_OPERAND (target, 1));
 	  target = TREE_OPERAND (target, 0);
@@ -2576,8 +2576,8 @@  decode_addr_const (tree exp, struct addr_const *value)
       else if (TREE_CODE (target) == ARRAY_REF
 	       || TREE_CODE (target) == ARRAY_RANGE_REF)
 	{
-	  offset += (tree_low_cst (TYPE_SIZE_UNIT (TREE_TYPE (target)), 1)
-		     * tree_low_cst (TREE_OPERAND (target, 1), 0));
+	  offset += (tree_to_uhwi (TYPE_SIZE_UNIT (TREE_TYPE (target)))
+		     * tree_to_shwi (TREE_OPERAND (target, 1)));
 	  target = TREE_OPERAND (target, 0);
 	}
       else if (TREE_CODE (target) == MEM_REF
@@ -4531,7 +4531,7 @@  output_constant (tree exp, unsigned HOST_WIDE_INT size, unsigned int align)
   if (TREE_CODE (exp) == FDESC_EXPR)
     {
 #ifdef ASM_OUTPUT_FDESC
-      HOST_WIDE_INT part = tree_low_cst (TREE_OPERAND (exp, 1), 0);
+      HOST_WIDE_INT part = tree_to_shwi (TREE_OPERAND (exp, 1));
       tree decl = TREE_OPERAND (exp, 0);
       ASM_OUTPUT_FDESC (asm_out_file, decl, part);
 #else
@@ -4708,9 +4708,9 @@  output_constructor_array_range (oc_local_state *local)
     = int_size_in_bytes (TREE_TYPE (local->type));
 
   HOST_WIDE_INT lo_index
-    = tree_low_cst (TREE_OPERAND (local->index, 0), 0);
+    = tree_to_shwi (TREE_OPERAND (local->index, 0));
   HOST_WIDE_INT hi_index
-    = tree_low_cst (TREE_OPERAND (local->index, 1), 0);
+    = tree_to_shwi (TREE_OPERAND (local->index, 1));
   HOST_WIDE_INT index;
 
   unsigned int align2
@@ -4751,7 +4751,7 @@  output_constructor_regular_field (oc_local_state *local)
       double_int idx = tree_to_double_int (local->index)
 		       - tree_to_double_int (local->min_index);
       idx = idx.sext (prec);
-      fieldpos = (tree_low_cst (TYPE_SIZE_UNIT (TREE_TYPE (local->val)), 1)
+      fieldpos = (tree_to_uhwi (TYPE_SIZE_UNIT (TREE_TYPE (local->val)))
 		  * idx.low);
     }
   else if (local->field != NULL_TREE)
@@ -4801,7 +4801,7 @@  output_constructor_regular_field (oc_local_state *local)
 	  gcc_assert (!fieldsize || !DECL_CHAIN (local->field));
 	}
       else
-	fieldsize = tree_low_cst (DECL_SIZE_UNIT (local->field), 1);
+	fieldsize = tree_to_uhwi (DECL_SIZE_UNIT (local->field));
     }
   else
     fieldsize = int_size_in_bytes (TREE_TYPE (local->type));
@@ -4825,15 +4825,15 @@  output_constructor_bitfield (oc_local_state *local, oc_outer_state *outer)
   /* Bit size of this element.  */
   HOST_WIDE_INT ebitsize
     = (local->field
-       ? tree_low_cst (DECL_SIZE (local->field), 1)
-       : tree_low_cst (TYPE_SIZE (TREE_TYPE (local->type)), 1));
+       ? tree_to_uhwi (DECL_SIZE (local->field))
+       : tree_to_uhwi (TYPE_SIZE (TREE_TYPE (local->type))));
 
   /* Relative index of this element if this is an array component.  */
   HOST_WIDE_INT relative_index
     = (!local->field
        ? (local->index
-	  ? (tree_low_cst (local->index, 0)
-	     - tree_low_cst (local->min_index, 0))
+	  ? (tree_to_shwi (local->index)
+	     - tree_to_shwi (local->min_index))
 	  : local->last_relative_index + 1)
        : 0);
 
@@ -6935,7 +6935,7 @@  place_block_symbol (rtx symbol)
     {
       decl = SYMBOL_REF_DECL (symbol);
       alignment = DECL_ALIGN (decl);
-      size = tree_low_cst (DECL_SIZE_UNIT (decl), 1);
+      size = tree_to_uhwi (DECL_SIZE_UNIT (decl));
     }
 
   /* Calculate the object's offset from the start of the block.  */
@@ -7083,7 +7083,7 @@  output_object_block (struct object_block *block)
 	{
 	  decl = SYMBOL_REF_DECL (symbol);
 	  assemble_variable_contents (decl, XSTR (symbol, 0), false);
-	  offset += tree_low_cst (DECL_SIZE_UNIT (decl), 1);
+	  offset += tree_to_uhwi (DECL_SIZE_UNIT (decl));
 	}
     }
 }
diff --git a/gcc/varpool.c b/gcc/varpool.c
index 314c66e..add09b1 100644
--- a/gcc/varpool.c
+++ b/gcc/varpool.c
@@ -64,7 +64,7 @@  varpool_remove_node (struct varpool_node *node)
       /* Keep vtables for BINFO folding.  */
       && !DECL_VIRTUAL_P (node->symbol.decl)
       /* dbxout output constant initializers for readonly vars.  */
-      && (!host_integerp (DECL_INITIAL (node->symbol.decl), 0)
+      && (!tree_fits_shwi_p (DECL_INITIAL (node->symbol.decl))
 	  || !TREE_READONLY (node->symbol.decl)))
     DECL_INITIAL (node->symbol.decl) = error_mark_node;
   ggc_free (node);