From patchwork Fri Mar 29 23:15:13 2013
X-Patchwork-Submitter: Steven Bosscher
X-Patchwork-Id: 232498
From: Steven Bosscher
Date: Sat, 30 Mar 2013 00:15:13 +0100
Subject: [patch] Stop using JUMP_INSN for jump table data
To: GCC Patches

Hello,

GCC uses fake JUMP_INSNs as placeholders for jump table data. These JUMP_INSNs have an ADDR_VEC or ADDR_DIFF_VEC as their PATTERN, but they are not real instructions and they are not inside basic blocks. This results in special-casing JUMP_P insns in various places throughout the compiler.

The attached patch adds a new object, JUMP_TABLE_DATA, to hold jump table data. All remaining JUMP_P insns will be real insns, which helps simplify things a bit -- or at least makes things more intuitive.

A number of back ends, and probably some code in the middle end (not sure), use the "*active_insn*" family of functions while traversing the insn chain (instead of the CFG) and expect jump table data to be considered an active insn. Therefore, JUMP_TABLE_DATA is an RTX_INSN for the moment, but I intend to make it an RTX_EXTRA before stage1 ends.

Bootstrapped and tested on powerpc64-unknown-linux-gnu and on x86_64-unknown-linux-gnu. OK for trunk?

Ciao!
Steven

	* postreload-gcse.c (bb_has_well_behaved_predecessors): Correct
	test for table jump at the end of a basic block using tablejump_p.
	* targhooks.c (default_invalid_within_doloop): Likewise.
	* config/rs6000/rs6000.c (TARGET_INVALID_WITHIN_DOLOOP): Remove
	target hook implementation that is identical to the default hook.
	(rs6000_invalid_within_doloop): Remove.
	* bb-reorder.c (fix_crossing_unconditional_branches): Remove set
	but unused variable from tablejump_p call.
	* rtl.def (JUMP_TABLE_DATA): New RTX_INSN object.
	* rtl.h (RTX_PREV, RTX_NEXT): Adjust for new JUMP_TABLE_DATA.
	(INSN_DELETED_P): Likewise.
	(emit_jump_table_data): New prototype.
	* gengtype.c (adjust_field_rtx_def): Handle JUMP_TABLE_DATA
	fields after 4th as unused.
	* print-rtl.c (print_rtl): Handle JUMP_TABLE_DATA.
	* sched-vis.c (print_insn): Likewise.
	* emit-rtl.c (active_insn_p): Consider JUMP_TABLE_DATA an active
	insn for compatibility with back ends that use next_active_insn
	to identify jump table data.
	(set_insn_deleted): Remove no longer useful JUMP_TABLE_DATA_P
	check.
	(remove_insn): Likewise.
	(emit_insn): Do not accept JUMP_TABLE_DATA objects in insn
	chains to be emitted.
	(emit_debug_insn, emit_jump_insn, emit_call_insn, emit_label): Idem.
	(emit_jump_table_data): New function.
	* cfgbuild.c (inside_basic_block_p): A JUMP_INSN is always inside
	a basic block, a JUMP_TABLE_DATA never is.
	(control_flow_insn_p): JUMP_TABLE_DATA is not a control flow insn.
	* cfgrtl.c (duplicate_insn_chain): Split handling of
	JUMP_TABLE_DATA off from code handling real insns.
	* final.c (get_attr_length_1): Simplify for JUMP_INSNs.
	* function.c (instantiate_virtual_regs): Remove JUMP_TABLE_DATA_P
	test, now redundant because JUMP_TABLE_DATA is not an INSN_P insn.
	* gcse.c (insert_insn_end_basic_block): Likewise, JUMP_TABLE_DATA_P
	is not a NONDEBUG_INSN_P.
	* ira-costs.c (scan_one_insn): Likewise.
	* jump.c (mark_all_labels): Likewise.
	(mark_jump_label_1): Likewise.
	* lra-eliminations.c (eliminate_regs_in_insn): Likewise.
	* lra.c (get_insn_freq): Expect all insns reaching here to be in
	a basic block.
	(check_rtl): Remove JUMP_TABLE_DATA_P test, not a NONDEBUG_INSN_P
	insn.
	* predict.c (expensive_function_p): Use FOR_BB_INSNS.
	* reload1.c (calculate_needs_all_insns): Call set_label_offsets
	for JUMP_TABLE_DATA_P insns.
	(calculate_elim_costs_all_insns): Likewise.
	(set_label_offsets): Recurse on the PATTERN of JUMP_TABLE_DATA
	insns.
	(elimination_costs_in_insn): Remove redundant JUMP_TABLE_DATA_P
	test.
	(delete_output_reload): Code style fixups.
	* reorg.c (dbr_schedule): Move JUMP_TABLE_DATA_P up to avoid
	setting insn flags on this non-insn.
	* sched-rgn.c (add_branch_dependences): Treat JUMP_TABLE_DATA
	insns as scheduling barriers, for pre-change compatibility.
	* stmt.c (emit_case_dispatch_table): Emit jump table data not as
	JUMP_INSN objects but instead as JUMP_TABLE_DATA objects.
	* config/alpha/alpha.c (alpha_does_function_need_gp): Remove
	redundant JUMP_TABLE_DATA_P test.
	* config/arm/arm.c (thumb_far_jump_used_p): Likewise.
	* config/frv/frv.c (frv_function_contains_far_jump): Likewise.
	(frv_for_each_packet): Likewise.
	* config/i386/i386.c (min_insn_size): Likewise.
	(ix86_avoid_jump_mispredicts): Likewise.
	* config/m32r/m32r.c (m32r_is_insn): Likewise.
	* config/mep/mep.c (mep_reorg_erepeat): Likewise.
	* config/mips/mips.c (USEFUL_INSN_P): Likewise.
	(mips16_insn_length): Robustify.
	(mips_has_long_branch_p): Remove redundant JUMP_TABLE_DATA_P test.
	(mips16_split_long_branches): Likewise.
	* config/pa/pa.c (pa_combine_instructions): Likewise.
	* config/rs6000/rs6000.c (get_next_active_insn): Treat
	JUMP_TABLE_DATA objects as active insns, like in active_insn_p.
	* config/s390/s390.c (s390_chunkify_start): Treat JUMP_TABLE_DATA
	as contributing to pool range lengths.
	* config/sh/sh.c (find_barrier): Restore check for ADDR_DIFF_VEC.
	Remove redundant JUMP_TABLE_DATA_P test.
	(sh_loop_align): Likewise.
	(split_branches): Likewise.
	(sh_insn_length_adjustment): Likewise.
	* config/spu/spu.c (get_branch_target): Likewise.
Index: postreload-gcse.c =================================================================== --- postreload-gcse.c (revision 197233) +++ postreload-gcse.c (working copy) @@ -918,7 +918,7 @@ bb_has_well_behaved_predecessors (basic_block bb) if ((pred->flags & EDGE_ABNORMAL_CALL) && cfun->has_nonlocal_label) return false; - if (JUMP_TABLE_DATA_P (BB_END (pred->src))) + if (tablejump_p (BB_END (pred->src), NULL, NULL)) return false; } return true; Index: targhooks.c =================================================================== --- targhooks.c (revision 197233) +++ targhooks.c (working copy) @@ -474,7 +474,7 @@ default_invalid_within_doloop (const_rtx insn) if (CALL_P (insn)) return "Function call in loop."; - if (JUMP_TABLE_DATA_P (insn)) + if (tablejump_p (insn, NULL, NULL) || computed_jump_p (insn)) return "Computed branch in the loop."; return NULL; Index: config/rs6000/rs6000.c =================================================================== --- config/rs6000/rs6000.c (revision 197234) +++ config/rs6000/rs6000.c (working copy) @@ -1290,9 +1290,6 @@ static const struct attribute_spec rs6000_attribut #undef TARGET_FUNCTION_OK_FOR_SIBCALL #define TARGET_FUNCTION_OK_FOR_SIBCALL rs6000_function_ok_for_sibcall -#undef TARGET_INVALID_WITHIN_DOLOOP -#define TARGET_INVALID_WITHIN_DOLOOP rs6000_invalid_within_doloop - #undef TARGET_REGISTER_MOVE_COST #define TARGET_REGISTER_MOVE_COST rs6000_register_move_cost #undef TARGET_MEMORY_MOVE_COST @@ -18778,22 +18775,6 @@ rs6000_function_ok_for_sibcall (tree decl, tree ex return false; } -/* NULL if INSN insn is valid within a low-overhead loop. - Otherwise return why doloop cannot be applied. - PowerPC uses the COUNT register for branch on table instructions. 
*/ - -static const char * -rs6000_invalid_within_doloop (const_rtx insn) -{ - if (CALL_P (insn)) - return "Function call in the loop."; - - if (JUMP_TABLE_DATA_P (insn)) - return "Computed branch in the loop."; - - return NULL; -} - static int rs6000_ra_ever_killed (void) { Index: bb-reorder.c =================================================================== --- bb-reorder.c (revision 197233) +++ bb-reorder.c (working copy) @@ -1998,14 +1998,14 @@ fix_crossing_unconditional_branches (void) if (JUMP_P (last_insn) && (succ->flags & EDGE_CROSSING)) { - rtx label2, table; + rtx label2; gcc_assert (!any_condjump_p (last_insn)); /* Make sure the jump is not already an indirect or table jump. */ if (!computed_jump_p (last_insn) - && !tablejump_p (last_insn, &label2, &table)) + && !tablejump_p (last_insn, &label2, NULL)) { /* We have found a "crossing" unconditional branch. Now we must convert it to an indirect jump. First create Index: rtl.def =================================================================== --- rtl.def (revision 197233) +++ rtl.def (working copy) @@ -64,7 +64,8 @@ along with GCC; see the file COPYING3. If not see RTX_BITFIELD_OPS an rtx code for a bit-field operation (ZERO_EXTRACT, SIGN_EXTRACT) RTX_INSN - an rtx code for a machine insn (INSN, JUMP_INSN, CALL_INSN) + an rtx code for a machine insn (INSN, JUMP_INSN, CALL_INSN) or + data that will be output as assembly pseudo-ops (DEBUG_INSN) RTX_MATCH an rtx code for something that matches in insns (e.g, MATCH_DUP) RTX_AUTOINC @@ -137,6 +138,13 @@ DEF_RTL_EXPR(JUMP_INSN, "jump_insn", "iuuBeiie0", All other fields ( rtx->u.fld[] ) have exact same meaning as INSN's. */ DEF_RTL_EXPR(CALL_INSN, "call_insn", "iuuBeiiee", RTX_INSN) +/* Placeholder for tablejump JUMP_INSNs. The pattern of this kind + of rtx is always either an ADDR_VEC or an ADDR_DIFF_VEC. 
These + placeholders do not appear as real instructions inside a basic + block, but are considered active_insn_p instructions for historical + reasons, when jump table data was represented with JUMP_INSNs. */ +DEF_RTL_EXPR(JUMP_TABLE_DATA, "jump_table_data", "iuuBe0000", RTX_INSN) + /* A marker that indicates that control will not flow through. */ DEF_RTL_EXPR(BARRIER, "barrier", "iuu00000", RTX_EXTRA) @@ -214,8 +222,12 @@ DEF_RTL_EXPR(UNSPEC, "unspec", "Ei", RTX_EXTRA) /* Similar, but a volatile operation and one which may trap. */ DEF_RTL_EXPR(UNSPEC_VOLATILE, "unspec_volatile", "Ei", RTX_EXTRA) -/* Vector of addresses, stored as full words. */ -/* Each element is a LABEL_REF to a CODE_LABEL whose address we want. */ +/* ---------------------------------------------------------------------- + Table jump addresses. + ---------------------------------------------------------------------- */ + +/* Vector of addresses, stored as full words. + Each element is a LABEL_REF to a CODE_LABEL whose address we want. */ DEF_RTL_EXPR(ADDR_VEC, "addr_vec", "E", RTX_EXTRA) /* Vector of address differences X0 - BASE, X1 - BASE, ... @@ -240,7 +252,6 @@ DEF_RTL_EXPR(ADDR_VEC, "addr_vec", "E", RTX_EXTRA) The third, fourth and fifth operands are only valid when CASE_VECTOR_SHORTEN_MODE is defined, and only in an optimizing compilation. */ - DEF_RTL_EXPR(ADDR_DIFF_VEC, "addr_diff_vec", "eEee0", RTX_EXTRA) /* Memory prefetch, with attributes supported on some targets. Index: rtl.h =================================================================== --- rtl.h (revision 197233) +++ rtl.h (working copy) @@ -363,6 +363,7 @@ struct GTY((chain_next ("RTX_NEXT (&%h)"), */ #define RTX_PREV(X) ((INSN_P (X) \ || NOTE_P (X) \ + || JUMP_TABLE_DATA_P (X) \ || BARRIER_P (X) \ || LABEL_P (X)) \ && PREV_INSN (X) != NULL \ @@ -469,9 +470,7 @@ struct GTY((variable_size)) rtvec_def { #define BARRIER_P(X) (GET_CODE (X) == BARRIER) /* Predicate yielding nonzero iff X is a data for a jump table. 
*/ -#define JUMP_TABLE_DATA_P(INSN) \ - (JUMP_P (INSN) && (GET_CODE (PATTERN (INSN)) == ADDR_VEC || \ - GET_CODE (PATTERN (INSN)) == ADDR_DIFF_VEC)) +#define JUMP_TABLE_DATA_P(INSN) (GET_CODE (INSN) == JUMP_TABLE_DATA) /* Predicate yielding nonzero iff X is a return or simple_return. */ #define ANY_RETURN_P(X) \ @@ -849,8 +848,8 @@ extern void rtl_check_failed_flag (const char *, c /* 1 if RTX is an insn that has been deleted. */ #define INSN_DELETED_P(RTX) \ - (RTL_FLAG_CHECK7("INSN_DELETED_P", (RTX), DEBUG_INSN, INSN, \ - CALL_INSN, JUMP_INSN, \ + (RTL_FLAG_CHECK8("INSN_DELETED_P", (RTX), DEBUG_INSN, INSN, \ + CALL_INSN, JUMP_INSN, JUMP_TABLE_DATA, \ CODE_LABEL, BARRIER, NOTE)->volatil) /* 1 if RTX is a call to a const function. Built from ECF_CONST and @@ -869,7 +868,7 @@ extern void rtl_check_failed_flag (const char *, c /* 1 if RTX is a call to a looping const or pure function. Built from ECF_LOOPING_CONST_OR_PURE and DECL_LOOPING_CONST_OR_PURE_P. */ -#define RTL_LOOPING_CONST_OR_PURE_CALL_P(RTX) \ +#define RTL_LOOPING_CONST_OR_PURE_CALL_P(RTX) \ (RTL_FLAG_CHECK1("CONST_OR_PURE_CALL_P", (RTX), CALL_INSN)->call) /* 1 if RTX is a call_insn for a sibling call. 
*/ @@ -1881,6 +1880,7 @@ extern rtx emit_debug_insn (rtx); extern rtx emit_jump_insn (rtx); extern rtx emit_call_insn (rtx); extern rtx emit_label (rtx); +extern rtx emit_jump_table_data (rtx); extern rtx emit_barrier (void); extern rtx emit_note (enum insn_note); extern rtx emit_note_copy (rtx); Index: gengtype.c =================================================================== --- gengtype.c (revision 197233) +++ gengtype.c (working copy) @@ -1219,6 +1219,8 @@ adjust_field_rtx_def (type_p t, options_p ARG_UNUS t = scalar_tp, subname = "rt_int"; else if (i == SYMBOL_REF && aindex == 2) t = symbol_union_tp, subname = ""; + else if (i == JUMP_TABLE_DATA && aindex >= 5) + t = scalar_tp, subname = "rt_int"; else if (i == BARRIER && aindex >= 3) t = scalar_tp, subname = "rt_int"; else if (i == ENTRY_VALUE && aindex == 0) Index: print-rtl.c =================================================================== --- print-rtl.c (revision 197233) +++ print-rtl.c (working copy) @@ -778,6 +778,7 @@ print_rtl (FILE *outf, const_rtx rtx_first) case CALL_INSN: case NOTE: case CODE_LABEL: + case JUMP_TABLE_DATA: case BARRIER: for (tmp_rtx = rtx_first; tmp_rtx != 0; tmp_rtx = NEXT_INSN (tmp_rtx)) { Index: sched-vis.c =================================================================== --- sched-vis.c (revision 197233) +++ sched-vis.c (working copy) @@ -666,6 +666,11 @@ print_insn (pretty_printer *pp, const_rtx x, int v case CODE_LABEL: pp_printf (pp, "L%d:", INSN_UID (x)); break; + case JUMP_TABLE_DATA: + pp_string (pp, "jump_table_data{\n"); + print_pattern (pp, PATTERN (x), verbose); + pp_string (pp, "}"); + break; case BARRIER: pp_string (pp, "barrier"); break; Index: emit-rtl.c =================================================================== --- emit-rtl.c (revision 197233) +++ emit-rtl.c (working copy) @@ -3268,6 +3268,7 @@ int active_insn_p (const_rtx insn) { return (CALL_P (insn) || JUMP_P (insn) + || JUMP_TABLE_DATA_P (insn) /* FIXME */ || (NONJUMP_INSN_P (insn) && (! 
reload_completed || (GET_CODE (PATTERN (insn)) != USE @@ -3900,7 +3901,7 @@ add_insn_before (rtx insn, rtx before, basic_block void set_insn_deleted (rtx insn) { - if (INSN_P (insn) && !JUMP_TABLE_DATA_P (insn)) + if (INSN_P (insn)) df_insn_delete (insn); PUT_CODE (insn, NOTE); NOTE_KIND (insn) = NOTE_INSN_DELETED; @@ -3968,7 +3969,7 @@ remove_insn (rtx insn) } /* Invalidate data flow information associated with INSN. */ - if (INSN_P (insn) && !JUMP_TABLE_DATA_P (insn)) + if (INSN_P (insn)) df_insn_delete (insn); /* Fix up basic block boundaries, if necessary. */ @@ -4661,6 +4662,7 @@ emit_insn (rtx x) break; #ifdef ENABLE_RTL_CHECKING + case JUMP_TABLE_DATA: case SEQUENCE: gcc_unreachable (); break; @@ -4707,6 +4709,7 @@ emit_debug_insn (rtx x) break; #ifdef ENABLE_RTL_CHECKING + case JUMP_TABLE_DATA: case SEQUENCE: gcc_unreachable (); break; @@ -4749,6 +4752,7 @@ emit_jump_insn (rtx x) break; #ifdef ENABLE_RTL_CHECKING + case JUMP_TABLE_DATA: case SEQUENCE: gcc_unreachable (); break; @@ -4785,6 +4789,7 @@ emit_call_insn (rtx x) #ifdef ENABLE_RTL_CHECKING case SEQUENCE: + case JUMP_TABLE_DATA: gcc_unreachable (); break; #endif @@ -4809,6 +4814,20 @@ emit_label (rtx label) return label; } +/* Make an insn of code JUMP_TABLE_DATA + and add it to the end of the doubly-linked list. */ + +rtx +emit_jump_table_data (rtx table) +{ + rtx jump_table_data = rtx_alloc (JUMP_TABLE_DATA); + INSN_UID (jump_table_data) = cur_insn_uid++; + PATTERN (jump_table_data) = table; + BLOCK_FOR_INSN (jump_table_data) = NULL; + add_insn (jump_table_data); + return jump_table_data; +} + /* Make an insn of code BARRIER and add it to the end of the doubly-linked list. */ Index: cfgbuild.c =================================================================== --- cfgbuild.c (revision 197234) +++ cfgbuild.c (working copy) @@ -54,13 +54,12 @@ inside_basic_block_p (const_rtx insn) || ! JUMP_TABLE_DATA_P (insn)); case JUMP_INSN: - return (! 
JUMP_TABLE_DATA_P (insn)); - case CALL_INSN: case INSN: case DEBUG_INSN: return true; + case JUMP_TABLE_DATA: case BARRIER: case NOTE: return false; @@ -84,8 +83,7 @@ control_flow_insn_p (const_rtx insn) return false; case JUMP_INSN: - /* Jump insn always causes control transfer except for tablejumps. */ - return (! JUMP_TABLE_DATA_P (insn)); + return true; case CALL_INSN: /* Noreturn and sibling call instructions terminate the basic blocks @@ -109,8 +107,9 @@ control_flow_insn_p (const_rtx insn) return false; break; + case JUMP_TABLE_DATA: case BARRIER: - /* It is nonsense to reach barrier when looking for the + /* It is nonsense to reach this when looking for the end of basic block, but before dead code is eliminated this may happen. */ return false; Index: cfgrtl.c =================================================================== --- cfgrtl.c (revision 197234) +++ cfgrtl.c (working copy) @@ -2488,7 +2488,7 @@ rtl_verify_flow_info (void) break; case CODE_LABEL: - /* An addr_vec is placed outside any basic block. */ + /* An ADDR_VEC is placed outside any basic block. */ if (NEXT_INSN (x) && JUMP_TABLE_DATA_P (NEXT_INSN (x))) x = NEXT_INSN (x); @@ -3604,7 +3604,7 @@ cfg_layout_can_duplicate_bb_p (const_basic_block b rtx duplicate_insn_chain (rtx from, rtx to) { - rtx insn, last, copy; + rtx insn, next, last, copy; /* Avoid updating of boundaries of previous basic block. The note will get removed from insn stream in fixup. */ @@ -3624,24 +3624,6 @@ duplicate_insn_chain (rtx from, rtx to) case INSN: case CALL_INSN: case JUMP_INSN: - /* Avoid copying of dispatch tables. We never duplicate - tablejumps, so this can hit only in case the table got - moved far from original jump. */ - if (JUMP_TABLE_DATA_P (insn)) - { - /* Avoid copying following barrier as well if any - (and debug insns in between). 
*/ - rtx next; - - for (next = NEXT_INSN (insn); - next != NEXT_INSN (to); - next = NEXT_INSN (next)) - if (!DEBUG_INSN_P (next)) - break; - if (next != NEXT_INSN (to) && BARRIER_P (next)) - insn = next; - break; - } copy = emit_copy_of_insn_after (insn, get_last_insn ()); if (JUMP_P (insn) && JUMP_LABEL (insn) != NULL_RTX && ANY_RETURN_P (JUMP_LABEL (insn))) @@ -3649,6 +3631,21 @@ duplicate_insn_chain (rtx from, rtx to) maybe_copy_prologue_epilogue_insn (insn, copy); break; + case JUMP_TABLE_DATA: + /* Avoid copying of dispatch tables. We never duplicate + tablejumps, so this can hit only in case the table got + moved far from original jump. + Avoid copying following barrier as well if any + (and debug insns in between). */ + for (next = NEXT_INSN (insn); + next != NEXT_INSN (to); + next = NEXT_INSN (next)) + if (!DEBUG_INSN_P (next)) + break; + if (next != NEXT_INSN (to) && BARRIER_P (next)) + insn = next; + break; + case CODE_LABEL: break; Index: final.c =================================================================== --- final.c (revision 197234) +++ final.c (working copy) @@ -391,20 +391,10 @@ get_attr_length_1 (rtx insn, int (*fallback_fn) (r return 0; case CALL_INSN: + case JUMP_INSN: length = fallback_fn (insn); break; - case JUMP_INSN: - body = PATTERN (insn); - if (JUMP_TABLE_DATA_P (insn)) - { - /* Alignment is machine-dependent and should be handled by - ADDR_VEC_ALIGN. */ - } - else - length = fallback_fn (insn); - break; - case INSN: body = PATTERN (insn); if (GET_CODE (body) == USE || GET_CODE (body) == CLOBBER) Index: function.c =================================================================== --- function.c (revision 197234) +++ function.c (working copy) @@ -1915,8 +1915,7 @@ instantiate_virtual_regs (void) { /* These patterns in the instruction stream can never be recognized. Fortunately, they shouldn't contain virtual registers either. 
*/ - if (JUMP_TABLE_DATA_P (insn) - || GET_CODE (PATTERN (insn)) == USE + if (GET_CODE (PATTERN (insn)) == USE || GET_CODE (PATTERN (insn)) == CLOBBER || GET_CODE (PATTERN (insn)) == ASM_INPUT) continue; Index: gcse.c =================================================================== --- gcse.c (revision 197234) +++ gcse.c (working copy) @@ -2149,19 +2149,9 @@ insert_insn_end_basic_block (struct expr *expr, ba || single_succ_edge (bb)->flags & EDGE_ABNORMAL))) { #ifdef HAVE_cc0 - rtx note; -#endif - - /* If this is a jump table, then we can't insert stuff here. Since - we know the previous real insn must be the tablejump, we insert - the new instruction just before the tablejump. */ - if (JUMP_TABLE_DATA_P (insn)) - insn = prev_active_insn (insn); - -#ifdef HAVE_cc0 /* FIXME: 'twould be nice to call prev_cc0_setter here but it aborts if cc0 isn't set. */ - note = find_reg_note (insn, REG_CC_SETTER, NULL_RTX); + rtx note = find_reg_note (insn, REG_CC_SETTER, NULL_RTX); if (note) insn = XEXP (note, 0); else Index: ira-costs.c =================================================================== --- ira-costs.c (revision 197234) +++ ira-costs.c (working copy) @@ -1269,8 +1269,7 @@ scan_one_insn (rtx insn) int i, k; bool counted_mem; - if (!NONDEBUG_INSN_P (insn) - || JUMP_TABLE_DATA_P (insn)) + if (!NONDEBUG_INSN_P (insn)) return insn; pat_code = GET_CODE (PATTERN (insn)); Index: jump.c =================================================================== --- jump.c (revision 197233) +++ jump.c (working copy) @@ -274,17 +274,11 @@ mark_all_labels (rtx f) basic blocks. If those non-insns represent tablejump data, they contain label references that we must record. 
*/ for (insn = BB_HEADER (bb); insn; insn = NEXT_INSN (insn)) - if (INSN_P (insn)) - { - gcc_assert (JUMP_TABLE_DATA_P (insn)); - mark_jump_label (PATTERN (insn), insn, 0); - } + if (JUMP_TABLE_DATA_P (insn)) + mark_jump_label (PATTERN (insn), insn, 0); for (insn = BB_FOOTER (bb); insn; insn = NEXT_INSN (insn)) - if (INSN_P (insn)) - { - gcc_assert (JUMP_TABLE_DATA_P (insn)); - mark_jump_label (PATTERN (insn), insn, 0); - } + if (JUMP_TABLE_DATA_P (insn)) + mark_jump_label (PATTERN (insn), insn, 0); } } else @@ -296,6 +290,8 @@ mark_all_labels (rtx f) ; else if (LABEL_P (insn)) prev_nonjump_insn = NULL; + else if (JUMP_TABLE_DATA_P (insn)) + mark_jump_label (PATTERN (insn), insn, 0); else if (NONDEBUG_INSN_P (insn)) { mark_jump_label (PATTERN (insn), insn, 0); @@ -1163,8 +1159,8 @@ mark_jump_label_1 (rtx x, rtx insn, bool in_mem, b return; } - /* Do walk the labels in a vector, but not the first operand of an - ADDR_DIFF_VEC. Don't set the JUMP_LABEL of a vector. */ + /* Do walk the labels in a vector, but not the first operand of an + ADDR_DIFF_VEC. Don't set the JUMP_LABEL of a vector. */ case ADDR_VEC: case ADDR_DIFF_VEC: if (! INSN_DELETED_P (insn)) Index: lra-eliminations.c =================================================================== --- lra-eliminations.c (revision 197234) +++ lra-eliminations.c (working copy) @@ -767,8 +767,7 @@ eliminate_regs_in_insn (rtx insn, bool replace_p) if (icode < 0 && asm_noperands (PATTERN (insn)) < 0 && ! 
DEBUG_INSN_P (insn)) { - lra_assert (JUMP_TABLE_DATA_P (insn) - || GET_CODE (PATTERN (insn)) == USE + lra_assert (GET_CODE (PATTERN (insn)) == USE || GET_CODE (PATTERN (insn)) == CLOBBER || GET_CODE (PATTERN (insn)) == ASM_INPUT); return; Index: lra.c =================================================================== --- lra.c (revision 197234) +++ lra.c (working copy) @@ -1619,18 +1619,10 @@ add_regs_to_insn_regno_info (lra_insn_recog_data_t static int get_insn_freq (rtx insn) { - basic_block bb; + basic_block bb = BLOCK_FOR_INSN (insn); - if ((bb = BLOCK_FOR_INSN (insn)) != NULL) - return REG_FREQ_FROM_BB (bb); - else - { - lra_assert (lra_insn_recog_data[INSN_UID (insn)] - ->insn_static_data->n_operands == 0); - /* We don't care about such insn, e.g. it might be jump with - addr_vec. */ - return 1; - } + gcc_checking_assert (bb != NULL); + return REG_FREQ_FROM_BB (bb); } /* Invalidate all reg info of INSN with DATA and execution frequency @@ -1997,7 +1989,6 @@ check_rtl (bool final_p) FOR_EACH_BB (bb) FOR_BB_INSNS (bb, insn) if (NONDEBUG_INSN_P (insn) - && ! JUMP_TABLE_DATA_P (insn) && GET_CODE (PATTERN (insn)) != USE && GET_CODE (PATTERN (insn)) != CLOBBER && GET_CODE (PATTERN (insn)) != ASM_INPUT) Index: predict.c =================================================================== --- predict.c (revision 197233) +++ predict.c (working copy) @@ -2731,8 +2731,7 @@ expensive_function_p (int threshold) { rtx insn; - for (insn = BB_HEAD (bb); insn != NEXT_INSN (BB_END (bb)); - insn = NEXT_INSN (insn)) + FOR_BB_INSNS (bb, insn) if (active_insn_p (insn)) { sum += bb->frequency; Index: reload1.c =================================================================== --- reload1.c (revision 197234) +++ reload1.c (working copy) @@ -1490,7 +1490,7 @@ calculate_needs_all_insns (int global) include REG_LABEL_OPERAND and REG_LABEL_TARGET), we need to see what effects this has on the known offsets at labels. 
*/ - if (LABEL_P (insn) || JUMP_P (insn) + if (LABEL_P (insn) || JUMP_P (insn) || JUMP_TABLE_DATA_P (insn) || (INSN_P (insn) && REG_NOTES (insn) != 0)) set_label_offsets (insn, insn, 0); @@ -1620,7 +1620,7 @@ calculate_elim_costs_all_insns (void) include REG_LABEL_OPERAND and REG_LABEL_TARGET), we need to see what effects this has on the known offsets at labels. */ - if (LABEL_P (insn) || JUMP_P (insn) + if (LABEL_P (insn) || JUMP_P (insn) || JUMP_TABLE_DATA_P (insn) || (INSN_P (insn) && REG_NOTES (insn) != 0)) set_label_offsets (insn, insn, 0); @@ -2404,6 +2404,10 @@ set_label_offsets (rtx x, rtx insn, int initial_p) return; + case JUMP_TABLE_DATA: + set_label_offsets (PATTERN (insn), insn, initial_p); + return; + case JUMP_INSN: set_label_offsets (PATTERN (insn), insn, initial_p); @@ -3234,11 +3238,10 @@ eliminate_regs_in_insn (rtx insn, int replace) if (! insn_is_asm && icode < 0) { - gcc_assert (JUMP_TABLE_DATA_P (insn) + gcc_assert (DEBUG_INSN_P (insn) || GET_CODE (PATTERN (insn)) == USE || GET_CODE (PATTERN (insn)) == CLOBBER - || GET_CODE (PATTERN (insn)) == ASM_INPUT - || DEBUG_INSN_P (insn)); + || GET_CODE (PATTERN (insn)) == ASM_INPUT); if (DEBUG_INSN_P (insn)) INSN_VAR_LOCATION_LOC (insn) = eliminate_regs (INSN_VAR_LOCATION_LOC (insn), VOIDmode, insn); @@ -3644,11 +3647,10 @@ elimination_costs_in_insn (rtx insn) if (! insn_is_asm && icode < 0) { - gcc_assert (JUMP_TABLE_DATA_P (insn) + gcc_assert (DEBUG_INSN_P (insn) || GET_CODE (PATTERN (insn)) == USE || GET_CODE (PATTERN (insn)) == CLOBBER - || GET_CODE (PATTERN (insn)) == ASM_INPUT - || DEBUG_INSN_P (insn)); + || GET_CODE (PATTERN (insn)) == ASM_INPUT); return; } @@ -8866,8 +8868,7 @@ delete_output_reload (rtx insn, int j, int last_re since if they are the only uses, they are dead. 
 	 */
       if (set != 0 && SET_DEST (set) == reg)
 	continue;
-      if (LABEL_P (i2)
-	  || JUMP_P (i2))
+      if (LABEL_P (i2) || JUMP_P (i2))
 	break;
       if ((NONJUMP_INSN_P (i2) || CALL_P (i2))
 	  && reg_mentioned_p (reg, PATTERN (i2)))
@@ -8891,8 +8892,7 @@ delete_output_reload (rtx insn, int j, int last_re
 	      delete_address_reloads (i2, insn);
 	      delete_insn (i2);
 	    }
-	  if (LABEL_P (i2)
-	      || JUMP_P (i2))
+	  if (LABEL_P (i2) || JUMP_P (i2))
 	    break;
 	}
Index: reorg.c
===================================================================
--- reorg.c	(revision 197234)
+++ reorg.c	(working copy)
@@ -3700,14 +3700,14 @@ dbr_schedule (rtx first)
     {
       rtx target;
 
+      /* Skip vector tables.  We can't get attributes for them.  */
+      if (JUMP_TABLE_DATA_P (insn))
+	continue;
+
       if (JUMP_P (insn))
 	INSN_ANNULLED_BRANCH_P (insn) = 0;
       INSN_FROM_TARGET_P (insn) = 0;
 
-      /* Skip vector tables.  We can't get attributes for them.  */
-      if (JUMP_TABLE_DATA_P (insn))
-	continue;
-
       if (num_delay_slots (insn) > 0)
 	obstack_ptr_grow (&unfilled_slots_obstack, insn);
Index: sched-rgn.c
===================================================================
--- sched-rgn.c	(revision 197233)
+++ sched-rgn.c	(working copy)
@@ -2449,7 +2449,7 @@ add_branch_dependences (rtx head, rtx tail)
   insn = tail;
   last = 0;
   while (CALL_P (insn)
-	 || JUMP_P (insn)
+	 || JUMP_P (insn) || JUMP_TABLE_DATA_P (insn)
 	 || (NONJUMP_INSN_P (insn)
 	     && (GET_CODE (PATTERN (insn)) == USE
 		 || GET_CODE (PATTERN (insn)) == CLOBBER
@@ -2536,7 +2536,7 @@ add_branch_dependences (rtx head, rtx tail)
      possible improvement for handling COND_EXECs in this scheduler: it
      could remove always-true predicates.  */
 
-  if (!reload_completed || ! JUMP_P (tail))
+  if (!reload_completed || ! (JUMP_P (tail) || JUMP_TABLE_DATA_P (tail)))
     return;
 
   insn = tail;
Index: stmt.c
===================================================================
--- stmt.c	(revision 197233)
+++ stmt.c	(working copy)
@@ -2025,13 +2025,14 @@ emit_case_dispatch_table (tree index_expr, tree in
   emit_label (table_label);
 
   if (CASE_VECTOR_PC_RELATIVE || flag_pic)
-    emit_jump_insn (gen_rtx_ADDR_DIFF_VEC (CASE_VECTOR_MODE,
-					   gen_rtx_LABEL_REF (Pmode, table_label),
-					   gen_rtvec_v (ncases, labelvec),
-					   const0_rtx, const0_rtx));
+    emit_jump_table_data (gen_rtx_ADDR_DIFF_VEC (CASE_VECTOR_MODE,
+						 gen_rtx_LABEL_REF (Pmode,
+								    table_label),
+						 gen_rtvec_v (ncases, labelvec),
+						 const0_rtx, const0_rtx));
   else
-    emit_jump_insn (gen_rtx_ADDR_VEC (CASE_VECTOR_MODE,
-				      gen_rtvec_v (ncases, labelvec)));
+    emit_jump_table_data (gen_rtx_ADDR_VEC (CASE_VECTOR_MODE,
+					    gen_rtvec_v (ncases, labelvec)));
 
   /* Record no drop-through after the table.  */
   emit_barrier ();
Index: config/alpha/alpha.c
===================================================================
--- config/alpha/alpha.c	(revision 197233)
+++ config/alpha/alpha.c	(working copy)
@@ -7454,7 +7454,6 @@ alpha_does_function_need_gp (void)
 
   for (; insn; insn = NEXT_INSN (insn))
     if (NONDEBUG_INSN_P (insn)
-	&& ! JUMP_TABLE_DATA_P (insn)
 	&& GET_CODE (PATTERN (insn)) != USE
 	&& GET_CODE (PATTERN (insn)) != CLOBBER
 	&& get_attr_usegp (insn))
Index: config/arm/arm.c
===================================================================
--- config/arm/arm.c	(revision 197234)
+++ config/arm/arm.c	(working copy)
@@ -22654,11 +22654,7 @@ thumb_far_jump_used_p (void)
 	 insn with the far jump attribute set.  */
       for (insn = get_insns (); insn; insn = NEXT_INSN (insn))
 	{
-	  if (JUMP_P (insn)
-	      /* Ignore tablejump patterns.  */
-	      && ! JUMP_TABLE_DATA_P (insn)
-	      && get_attr_far_jump (insn) == FAR_JUMP_YES
-	      )
+	  if (JUMP_P (insn) && get_attr_far_jump (insn) == FAR_JUMP_YES)
 	    {
 	      /* Record the fact that we have decided that
 		 the function does use far jumps.  */
Index: config/frv/frv.c
===================================================================
--- config/frv/frv.c	(revision 197234)
+++ config/frv/frv.c	(working copy)
@@ -1409,8 +1409,6 @@ frv_function_contains_far_jump (void)
   rtx insn = get_insns ();
   while (insn != NULL
 	 && !(JUMP_P (insn)
-	      /* Ignore tablejump patterns.  */
-	      && ! JUMP_TABLE_DATA_P (insn)
 	      && get_attr_far_jump (insn) == FAR_JUMP_YES))
     insn = NEXT_INSN (insn);
   return (insn != NULL);
@@ -7480,7 +7478,7 @@ frv_for_each_packet (void (*handle_packet) (void))
 	    frv_start_packet_block ();
 	}
 
-      if (INSN_P (insn) && ! JUMP_TABLE_DATA_P (insn))
+      if (INSN_P (insn))
 	switch (GET_CODE (PATTERN (insn)))
 	  {
 	  case USE:
Index: config/i386/i386.c
===================================================================
--- config/i386/i386.c	(revision 197234)
+++ config/i386/i386.c	(working copy)
@@ -35116,8 +35116,6 @@ min_insn_size (rtx insn)
   if (GET_CODE (PATTERN (insn)) == UNSPEC_VOLATILE
       && XINT (PATTERN (insn), 1) == UNSPECV_ALIGN)
     return 0;
-  if (JUMP_TABLE_DATA_P (insn))
-    return 0;
 
   /* Important case - calls are always 5 bytes.
      It is common to have many calls in the row.  */
@@ -35208,9 +35206,7 @@ ix86_avoid_jump_mispredicts (void)
 	  while (nbytes + max_skip >= 16)
 	    {
 	      start = NEXT_INSN (start);
-	      if ((JUMP_P (start)
-		   && ! JUMP_TABLE_DATA_P (start))
-		  || CALL_P (start))
+	      if (JUMP_P (start) || CALL_P (start))
 		njumps--, isjump = 1;
 	      else
 		isjump = 0;
@@ -35225,9 +35221,7 @@ ix86_avoid_jump_mispredicts (void)
       if (dump_file)
 	fprintf (dump_file, "Insn %i estimated to %i bytes\n",
 		 INSN_UID (insn), min_size);
-      if ((JUMP_P (insn)
-	   && ! JUMP_TABLE_DATA_P (insn))
-	  || CALL_P (insn))
+      if (JUMP_P (insn) || CALL_P (insn))
 	njumps++;
       else
 	continue;
@@ -35235,9 +35229,7 @@ ix86_avoid_jump_mispredicts (void)
       while (njumps > 3)
 	{
 	  start = NEXT_INSN (start);
-	  if ((JUMP_P (start)
-	       && ! JUMP_TABLE_DATA_P (start))
-	      || CALL_P (start))
+	  if (JUMP_P (start) || CALL_P (start))
 	    njumps--, isjump = 1;
 	  else
 	    isjump = 0;
Index: config/m32r/m32r.c
===================================================================
--- config/m32r/m32r.c	(revision 197234)
+++ config/m32r/m32r.c	(working copy)
@@ -1308,7 +1308,6 @@ static int
 m32r_is_insn (rtx insn)
 {
   return (NONDEBUG_INSN_P (insn)
-	  && ! JUMP_TABLE_DATA_P (insn)
 	  && GET_CODE (PATTERN (insn)) != USE
 	  && GET_CODE (PATTERN (insn)) != CLOBBER);
 }
Index: config/mep/mep.c
===================================================================
--- config/mep/mep.c	(revision 197233)
+++ config/mep/mep.c	(working copy)
@@ -5511,7 +5511,6 @@ mep_reorg_erepeat (rtx insns)
 
   for (insn = insns; insn; insn = NEXT_INSN (insn))
     if (JUMP_P (insn)
-	&& ! JUMP_TABLE_DATA_P (insn)
 	&& mep_invertable_branch_p (insn))
       {
 	if (dump_file)
Index: config/mips/mips.c
===================================================================
--- config/mips/mips.c	(revision 197234)
+++ config/mips/mips.c	(working copy)
@@ -99,7 +99,6 @@ along with GCC; see the file COPYING3.  If not see
    moved to rtl.h.  */
 #define USEFUL_INSN_P(INSN)						\
   (NONDEBUG_INSN_P (INSN)						\
-   && ! JUMP_TABLE_DATA_P (INSN)					\
    && GET_CODE (PATTERN (INSN)) != USE					\
    && GET_CODE (PATTERN (INSN)) != CLOBBER)
 
@@ -14654,8 +14653,10 @@ mips16_insn_length (rtx insn)
       rtx body = PATTERN (insn);
       if (GET_CODE (body) == ADDR_VEC)
 	return GET_MODE_SIZE (GET_MODE (body)) * XVECLEN (body, 0);
-      if (GET_CODE (body) == ADDR_DIFF_VEC)
+      else if (GET_CODE (body) == ADDR_DIFF_VEC)
 	return GET_MODE_SIZE (GET_MODE (body)) * XVECLEN (body, 1);
+      else
+	gcc_unreachable ();
     }
   return get_attr_length (insn);
 }
@@ -16184,7 +16185,6 @@ mips_has_long_branch_p (void)
   for (insn = get_insns (); insn; insn = NEXT_INSN (insn))
     FOR_EACH_SUBINSN (subinsn, insn)
       if (JUMP_P (subinsn)
-	  && USEFUL_INSN_P (subinsn)
 	  && get_attr_length (subinsn) > normal_length
 	  && (any_condjump_p (subinsn) || any_uncondjump_p (subinsn)))
 	return true;
@@ -16286,7 +16286,6 @@ mips16_split_long_branches (void)
       something_changed = false;
       for (insn = get_insns (); insn; insn = NEXT_INSN (insn))
 	if (JUMP_P (insn)
-	    && USEFUL_INSN_P (insn)
 	    && get_attr_length (insn) > 8
 	    && (any_condjump_p (insn) || any_uncondjump_p (insn)))
 	  {
Index: config/pa/pa.c
===================================================================
--- config/pa/pa.c	(revision 197234)
+++ config/pa/pa.c	(working copy)
@@ -9134,7 +9134,6 @@ pa_combine_instructions (void)
       /* We only care about INSNs, JUMP_INSNs, and CALL_INSNs.
	 Also ignore any special USE insns.  */
       if ((! NONJUMP_INSN_P (anchor) && ! JUMP_P (anchor) && ! CALL_P (anchor))
-	  || JUMP_TABLE_DATA_P (anchor)
 	  || GET_CODE (PATTERN (anchor)) == USE
 	  || GET_CODE (PATTERN (anchor)) == CLOBBER)
 	continue;
@@ -9159,8 +9158,7 @@ pa_combine_instructions (void)
 		continue;
 
 	      /* Anything except a regular INSN will stop our search.  */
-	      if (! NONJUMP_INSN_P (floater)
-		  || JUMP_TABLE_DATA_P (floater))
+	      if (! NONJUMP_INSN_P (floater))
 		{
 		  floater = NULL_RTX;
 		  break;
@@ -9220,8 +9218,7 @@ pa_combine_instructions (void)
 		    continue;
 
 		  /* Anything except a regular INSN will stop our search.  */
-		  if (! NONJUMP_INSN_P (floater)
-		      || JUMP_TABLE_DATA_P (floater))
+		  if (! NONJUMP_INSN_P (floater))
 		    {
 		      floater = NULL_RTX;
 		      break;
Index: config/rs6000/rs6000.c
===================================================================
--- config/rs6000/rs6000.c	(revision 197234)
+++ config/rs6000/rs6000.c	(working copy)
@@ -23940,7 +23921,7 @@ get_next_active_insn (rtx insn, rtx tail)
 	return NULL_RTX;
 
       if (CALL_P (insn)
-	  || JUMP_P (insn)
+	  || JUMP_P (insn) || JUMP_TABLE_DATA_P (insn)
 	  || (NONJUMP_INSN_P (insn)
 	      && GET_CODE (PATTERN (insn)) != USE
 	      && GET_CODE (PATTERN (insn)) != CLOBBER
Index: config/s390/s390.c
===================================================================
--- config/s390/s390.c	(revision 197234)
+++ config/s390/s390.c	(working copy)
@@ -6867,7 +6867,7 @@ s390_chunkify_start (void)
 	    }
 	}
 
-      if (JUMP_P (insn) || LABEL_P (insn))
+      if (JUMP_P (insn) || JUMP_TABLE_DATA_P (insn) || LABEL_P (insn))
 	{
 	  if (curr_pool)
 	    s390_add_pool_insn (curr_pool, insn);
Index: config/sh/sh.c
===================================================================
--- config/sh/sh.c	(revision 197234)
+++ config/sh/sh.c	(working copy)
@@ -5213,7 +5213,8 @@ find_barrier (int num_mova, rtx mova, rtx from)
 	  if (found_si > count_si)
 	    count_si = found_si;
 	}
-      else if (JUMP_TABLE_DATA_P (from))
+      else if (JUMP_TABLE_DATA_P (from)
+	       && GET_CODE (PATTERN (from)) == ADDR_DIFF_VEC)
 	{
 	  if ((num_mova > 1 && GET_MODE (prev_nonnote_insn (from)) == VOIDmode)
 	      || (num_mova
@@ -5247,7 +5248,7 @@ find_barrier (int num_mova, rtx mova, rtx from)
       /* There is a possibility that a bf is transformed
	 into a bf/s by the delay slot scheduler.  */
-      if (JUMP_P (from) && !JUMP_TABLE_DATA_P (from)
+      if (JUMP_P (from)
 	  && get_attr_type (from) == TYPE_CBRANCH
 	  && ! sequence_insn_p (from))
 	inc += 2;
@@ -5973,7 +5974,6 @@ sh_loop_align (rtx label)
   if (! next
       || ! INSN_P (next)
-      || GET_CODE (PATTERN (next)) == ADDR_DIFF_VEC
       || recog_memoized (next) == CODE_FOR_consttable_2)
     return 0;
@@ -6494,9 +6494,7 @@ split_branches (rtx first)
 	     so transform it into a note.  */
 	  SET_INSN_DELETED (insn);
 	}
-      else if (JUMP_P (insn)
-	       /* Don't mess with ADDR_DIFF_VEC */
-	       && ! JUMP_TABLE_DATA_P (insn))
+      else if (JUMP_P (insn))
 	{
 	  enum attr_type type = get_attr_type (insn);
 	  if (type == TYPE_CBRANCH)
@@ -10122,8 +10120,7 @@ sh_insn_length_adjustment (rtx insn)
   if (((NONJUMP_INSN_P (insn)
 	&& GET_CODE (PATTERN (insn)) != USE
 	&& GET_CODE (PATTERN (insn)) != CLOBBER)
-       || CALL_P (insn)
-       || (JUMP_P (insn) && !JUMP_TABLE_DATA_P (insn)))
+       || CALL_P (insn) || JUMP_P (insn))
       && ! sequence_insn_p (insn)
       && get_attr_needs_delay_slot (insn) == NEEDS_DELAY_SLOT_YES)
     return 2;
@@ -10131,7 +10128,7 @@ sh_insn_length_adjustment (rtx insn)
   /* SH2e has a bug that prevents the use of annulled branches, so if
      the delay slot is not filled, we'll have to put a NOP in it.  */
   if (sh_cpu_attr == CPU_SH2E
-      && JUMP_P (insn) && !JUMP_TABLE_DATA_P (insn)
+      && JUMP_P (insn)
      && get_attr_type (insn) == TYPE_CBRANCH
      && ! sequence_insn_p (insn))
    return 2;
Index: config/spu/spu.c
===================================================================
--- config/spu/spu.c	(revision 197234)
+++ config/spu/spu.c	(working copy)
@@ -2171,10 +2171,6 @@ get_branch_target (rtx branch)
       if (GET_CODE (PATTERN (branch)) == RETURN)
 	return gen_rtx_REG (SImode, LINK_REGISTER_REGNUM);
 
-      /* jump table */
-      if (JUMP_TABLE_DATA_P (branch))
-	return 0;
-
      /* ASM GOTOs.  */
      if (extract_asm_operands (PATTERN (branch)) != NULL)
	return NULL;
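For readers skimming the diff, the recurring simplification can be illustrated with a minimal, self-contained sketch. The types and predicates below are mock stand-ins, not GCC's real rtl.h definitions: once jump table data carries its own RTX code instead of masquerading as a JUMP_INSN, an insn-chain walk can test JUMP_P alone, where it previously needed JUMP_P (insn) && ! JUMP_TABLE_DATA_P (insn).

```c
#include <assert.h>
#include <stddef.h>

/* Mock RTX codes: after the patch, jump table data has its own code
   (JUMP_TABLE_DATA) and is no longer a JUMP_INSN.  These names mirror
   GCC's but the definitions are simplified for illustration only.  */
enum rtx_code { INSN, JUMP_INSN, JUMP_TABLE_DATA, CALL_INSN };

struct rtx_def
{
  enum rtx_code code;
  struct rtx_def *next;
};
typedef struct rtx_def *rtx;

#define JUMP_P(X)            ((X)->code == JUMP_INSN)
#define JUMP_TABLE_DATA_P(X) ((X)->code == JUMP_TABLE_DATA)

/* Count real jump instructions in an insn chain.  Before the patch,
   table data satisfied JUMP_P, so walks like this one had to add
   "&& ! JUMP_TABLE_DATA_P (insn)"; now JUMP_P alone is enough.  */
static int
count_real_jumps (rtx chain)
{
  int n = 0;
  for (rtx insn = chain; insn; insn = insn->next)
    if (JUMP_P (insn))
      n++;
  return n;
}
```

A chain INSN -> JUMP_INSN -> JUMP_TABLE_DATA yields a count of 1 under this model, which is the behavior the per-backend hunks above rely on when they drop their JUMP_TABLE_DATA_P special cases.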