Patchwork [v2] Re: PR62304 (was Re: (Still) ICE for cris-elf at r214710)

Submitter David Malcolm
Date Aug. 29, 2014, 6:07 p.m.
Message ID <1409335624.31600.32.camel@surprise>
Permalink /patch/384345/
State New

Comments

David Malcolm - Aug. 29, 2014, 6:07 p.m.
On Fri, 2014-08-29 at 18:15 +0200, Hans-Peter Nilsson wrote:
> > From: David Malcolm <dmalcolm@redhat.com>
> > Date: Fri, 29 Aug 2014 17:33:54 +0200
> 
> > On Fri, 2014-08-29 at 16:48 +0200, Hans-Peter Nilsson wrote:
> > > Sorry, but that didn't help.  I still get the exact same error.
> > > (Yep, I double-checked that I didn't goof testing...)
> 
> Famous last words...
> 
> > Fully identical, or just the top 2 frames?  The error in the above
> > backtrace is the call to JUMP_LABEL_AS_INSN here:
> > 
> > > > 0x9119c2 find_dead_or_set_registers
> > > > >         /tmp/hpautotest-gcc1/gcc/gcc/resource.c:500
> > 
> > which I believe the patch removes.
> > 
> > That said, PR62304 has *two* backtraces: the one you posted earlier,
> > plus a similar-looking one due to a different overzealous cast by me at:
> > 0xae862f follow_jumps
> >         /vol/gcc/src/hg/trunk/local/gcc/reorg.c:2326
> > 
> > Maybe you're seeing that one?  (or a third...)
> 
> (Oh my, how embarrassing: by "exact same" I must have meant "in
> about the first 80 characters and the first frame".)
> 
> It seems it's a third one.  Yay for reorg.c.  Or rather, nay.
> 
> /tmp/pr62304/gcc/libgcc/libgcc2.c: In function '__absvsi2':
> /tmp/pr62304/gcc/libgcc/libgcc2.c:232:1: internal compiler error: in safe_as_a, at is-a.h:205
>  }
>  ^
> 0x90bb53 safe_as_a<rtx_insn*, rtx_def>
> 	/tmp/pr62304/gcc/gcc/is-a.h:205
> 0x90bb53 NEXT_INSN
> 	/tmp/pr62304/gcc/gcc/rtl.h:1338
> 0x90bb53 follow_jumps
> 	/tmp/pr62304/gcc/gcc/reorg.c:2315
> 0x90f50c relax_delay_slots
> 	/tmp/pr62304/gcc/gcc/reorg.c:3175
> 0x90f50c dbr_schedule
> 	/tmp/pr62304/gcc/gcc/reorg.c:3743
> 0x91088f rest_of_handle_delay_slots
> 	/tmp/pr62304/gcc/gcc/reorg.c:3885
> 0x91088f execute
> 	/tmp/pr62304/gcc/gcc/reorg.c:3916
> 
> For:
> int
> __absvsi2 (int a)
> {
>   int w = a;
>   if (a < 0)
>     w = -(unsigned int) a;
>   if (w < 0)
>     __builtin_abort ();
>    return w;
> }
> With "./cc1 -fpreprocessed -O2 this.i"

Yes: I made various mistakes in reorg.c and resource.c where I assumed
that a JUMP_LABEL(insn) was an insn, whereas the existing code is set up
to handle RETURN nodes.

BTW, in another email in the thread you said:

> Thanks for the heads-up.  BTW, the ChangeLog entries should say
> "what" not "why"; that goes into a comment in the source.

OK.  Where possible I've added comments in the new code, and chosen
variable names that express that we could be dealing with both insns and
RETURN nodes.  Much of the patch is simply reverting changes made
earlier.  Does it make sense to go in and add comments where I'm
reverting a change?  I felt that adding stuff to a ChangeLog makes
sense from a "belt and braces" POV; if we have to review the change in 6
months' time, it's easier to have bonus "why", as it were.

The attached patch fixes both reproducers you posted (tested with
cris-elf), and the case seen in PR62304 (tested with
sparc64-sun-solaris2.10); bootstrap on x86_64 in progress.

It eliminates all uses of JUMP_LABEL_AS_INSN from reorg.c, and indeed
after that there are only 6 uses in the tree (including config subdirs).

2014-08-29  David Malcolm  <dmalcolm@redhat.com>

	PR bootstrap/62304

	* gcc/reorg.c (skip_consecutive_labels): Convert return type and
	param back from rtx_insn * to rtx.  Rename param from "label" to
	"label_or_return", reintroducing "label" as an rtx_insn * after
	we've ensured it's not a RETURN.
	(first_active_target_insn): Likewise for return type and param;
	add a checked cast to rtx_insn * once we've ensured "insn" is not
	a RETURN.
	(steal_delay_list_from_target): Convert param "pnew_thread" back
	from rtx_insn ** to rtx *.  Replace use of JUMP_LABEL_AS_INSN
	with JUMP_LABEL.
	(own_thread_p): Convert param "thread" back from an rtx_insn * to
	an rtx.  Introduce local rtx_insn * "thread_insn" with a checked
	cast once we've established we're not dealing with a RETURN,
	renaming subsequent uses of "thread" to "thread_insn".
	(fill_simple_delay_slots): Convert uses of JUMP_LABEL_AS_INSN back
	to JUMP_LABEL.
	(follow_jumps): Convert return type and param "label" from
	rtx_insn * back to rtx.  Move initialization of "value" to after
	the handling for ANY_RETURN_P, adding a checked cast there to
	rtx_insn *.  Convert local rtx_insn * "this_label" to an rtx and
	rename to "this_label_or_return", reintroducing "this_label" as
	an rtx_insn * once we've handled the case where it could be an
	ANY_RETURN_P.
	(fill_slots_from_thread): Rename param "thread" to
	"thread_or_return", converting from an rtx_insn * back to an rtx.
	Reintroduce name "thread" as an rtx_insn * local with a checked
	cast once we've handled the case of it being an ANY_RETURN_P.
	Convert local "new_thread" from an rtx_insn * back to an rtx.
	Add a checked cast when assigning to "trial" from "new_thread".
	Convert use of JUMP_LABEL_AS_INSN back to JUMP_LABEL.  Add a
	checked cast to rtx_insn * from "new_thread" when invoking
	get_label_before.
	(fill_eager_delay_slots): Convert locals "target_label",
	"insn_at_target" from rtx_insn * back to rtx.
	Convert uses of JUMP_LABEL_AS_INSN back to JUMP_LABEL.
	(relax_delay_slots): Convert locals "trial", "target_label" from
	rtx_insn * back to rtx.  Convert uses of JUMP_LABEL_AS_INSN back
	to JUMP_LABEL.  Add a checked cast to rtx_insn * on "trial" when
	invoking update_block.
	(dbr_schedule): Convert use of JUMP_LABEL_AS_INSN back to
	JUMP_LABEL; this removes all JUMP_LABEL_AS_INSN from reorg.c.

	* resource.h (mark_target_live_regs): Undo erroneous conversion
	of second param of r214693, converting it back from rtx_insn * to
	rtx, since it could be a RETURN.

	* resource.c (find_dead_or_set_registers): Similarly, convert
	param "jump_target" back from an rtx_insn ** to an rtx *, as we
	could be writing back a RETURN.  Rename local rtx_insn * "next" to
	"next_insn", and introduce "lab_or_return" as a local rtx,
	handling the case where JUMP_LABEL (this_jump_insn) is a RETURN.
	(mark_target_live_regs): Undo erroneous conversion
	of second param of r214693, converting it back from rtx_insn * to
	rtx, since it could be a RETURN.  Rename it from "target" to
	"target_maybe_return", reintroducing the name "target" as a local
	rtx_insn * with a checked cast, after we've handled the case of
	ANY_RETURN_P.
Jeff Law - Aug. 30, 2014, 5:41 a.m.
On 08/29/14 12:07, David Malcolm wrote:

>
> Yes: I made various mistakes in reorg.c and resource.c where I assumed
> that a JUMP_LABEL(insn) was an insn, whereas the existing code is set up
> to handle RETURN nodes.
Well, it would seem to me that reorg is being totally braindead in 
mixing and matching these two nodes.   In particular whatever code is 
passing around a RETURN rtx into places that normally accept some kind 
of INSN would appear to be broken.

>
> It eliminates all uses of JUMP_LABEL_AS_INSN from reorg.c, and indeed
> after that there are only 6 uses in the tree (including config subdirs).
Good to some extent as I see JUMP_LABEL_AS_INSN as papering over bugs 
elsewhere, but this patch is also a step backwards as we're papering 
over a mess in reorg.c.


>
> [2014-08-29 ChangeLog entries snipped; quoted in full above]
I'll OK this as a means to restore the trunk to working order, but let's add
a follow-up item to track down places where we're passing things like a
RETURN rtx in places where we really are expecting insns.

jeff
David Malcolm - Aug. 30, 2014, 2:35 p.m.
On Fri, 2014-08-29 at 23:41 -0600, Jeff Law wrote:
> On 08/29/14 12:07, David Malcolm wrote:
> 
> >
> > Yes: I made various mistakes in reorg.c and resource.c where I assumed
> > that a JUMP_LABEL(insn) was an insn, whereas the existing code is set up
> > to handle RETURN nodes.
> Well, it would seem to me that reorg is being totally braindead in 
> mixing and matching these two nodes.   In particular whatever code is 
> passing around a RETURN rtx into places that normally accept some kind 
> of INSN would appear to be broken.
> 
> >
> > It eliminates all uses of JUMP_LABEL_AS_INSN from reorg.c, and indeed
> > after that there are only 6 uses in the tree (including config subdirs).
> Good to some extent as I see JUMP_LABEL_AS_INSN as papering over bugs 
> elsewhere, but this patch is also a step backwards as we're papering 
> over a mess in reorg.c.
> 
> 
> >
> > [2014-08-29 ChangeLog entries snipped; quoted in full above]
> I'll OK this as a means to restore the trunk to working order, but let's add
> a follow-up item to track down places where we're passing things like a
> RETURN rtx in places where we really are expecting insns.

Thanks; bootstrapped on x86_64 and ppc (gcc110); committed to trunk as
r214752.

I plan to have a close look at everywhere that JUMP_LABEL is not an
insn, though I may wait until after the current stage1 to do that; for this
stage1 my primary objective for rtx-classes is to use them to document
the existing status quo, and I hope to context-switch back to trying to
merge the JIT branch in a week or two.  Hope that sounds reasonable.
Dave

Patch

Index: gcc/reorg.c
===================================================================
--- gcc/reorg.c	(revision 214732)
+++ gcc/reorg.c	(working copy)
@@ -142,14 +142,16 @@ 
 /* Return the last label to mark the same position as LABEL.  Return LABEL
    itself if it is null or any return rtx.  */
 
-static rtx_insn *
-skip_consecutive_labels (rtx_insn *label)
+static rtx
+skip_consecutive_labels (rtx label_or_return)
 {
   rtx_insn *insn;
 
-  if (label && ANY_RETURN_P (label))
-    return label;
+  if (label_or_return && ANY_RETURN_P (label_or_return))
+    return label_or_return;
 
+  rtx_insn *label = as_a <rtx_insn *> (label_or_return);
+
   for (insn = label; insn != 0 && !INSN_P (insn); insn = NEXT_INSN (insn))
     if (LABEL_P (insn))
       label = insn;
@@ -229,7 +231,7 @@ 
 						    struct resources *,
 						    struct resources *,
 						    int, int *, int *,
-						    rtx_insn **);
+						    rtx *);
 static rtx_insn_list *steal_delay_list_from_fallthrough (rtx_insn *, rtx,
 							 rtx_sequence *,
 							 rtx_insn_list *,
@@ -239,7 +241,7 @@ 
 							 int, int *, int *);
 static void try_merge_delay_insns (rtx, rtx_insn *);
 static rtx redundant_insn (rtx, rtx_insn *, rtx);
-static int own_thread_p (rtx_insn *, rtx, int);
+static int own_thread_p (rtx, rtx, int);
 static void update_block (rtx_insn *, rtx);
 static int reorg_redirect_jump (rtx_insn *, rtx);
 static void update_reg_dead_notes (rtx, rtx);
@@ -246,8 +248,7 @@ 
 static void fix_reg_dead_note (rtx, rtx);
 static void update_reg_unused_notes (rtx, rtx);
 static void fill_simple_delay_slots (int);
-static rtx_insn_list *fill_slots_from_thread (rtx_insn *, rtx,
-					      rtx_insn *, rtx_insn *,
+static rtx_insn_list *fill_slots_from_thread (rtx_insn *, rtx, rtx, rtx,
 					      int, int, int, int,
 					      int *, rtx_insn_list *);
 static void fill_eager_delay_slots (void);
@@ -257,12 +258,12 @@ 
 /* A wrapper around next_active_insn which takes care to return ret_rtx
    unchanged.  */
 
-static rtx_insn *
-first_active_target_insn (rtx_insn *insn)
+static rtx
+first_active_target_insn (rtx insn)
 {
   if (ANY_RETURN_P (insn))
     return insn;
-  return next_active_insn (insn);
+  return next_active_insn (as_a <rtx_insn *> (insn));
 }
 
 /* Return true iff INSN is a simplejump, or any kind of return insn.  */
@@ -1089,7 +1090,7 @@ 
 			      struct resources *needed,
 			      struct resources *other_needed,
 			      int slots_to_fill, int *pslots_filled,
-			      int *pannul_p, rtx_insn **pnew_thread)
+			      int *pannul_p, rtx *pnew_thread)
 {
   int slots_remaining = slots_to_fill - *pslots_filled;
   int total_slots_filled = *pslots_filled;
@@ -1202,7 +1203,7 @@ 
       update_block (seq->insn (i), insn);
 
   /* Show the place to which we will be branching.  */
-  *pnew_thread = first_active_target_insn (JUMP_LABEL_AS_INSN (seq->insn (0)));
+  *pnew_thread = first_active_target_insn (JUMP_LABEL (seq->insn (0)));
 
   /* Add any new insns to the delay list and update the count of the
      number of slots filled.  */
@@ -1715,7 +1716,7 @@ 
    finding an active insn, we do not own this thread.  */
 
 static int
-own_thread_p (rtx_insn *thread, rtx label, int allow_fallthrough)
+own_thread_p (rtx thread, rtx label, int allow_fallthrough)
 {
   rtx_insn *active_insn;
   rtx_insn *insn;
@@ -1724,10 +1725,13 @@ 
   if (thread == 0 || ANY_RETURN_P (thread))
     return 0;
 
-  /* Get the first active insn, or THREAD, if it is an active insn.  */
-  active_insn = next_active_insn (PREV_INSN (thread));
+  /* We have a non-NULL insn.  */
+  rtx_insn *thread_insn = as_a <rtx_insn *> (thread);
 
-  for (insn = thread; insn != active_insn; insn = NEXT_INSN (insn))
+  /* Get the first active insn, or THREAD_INSN, if it is an active insn.  */
+  active_insn = next_active_insn (PREV_INSN (thread_insn));
+
+  for (insn = thread_insn; insn != active_insn; insn = NEXT_INSN (insn))
     if (LABEL_P (insn)
 	&& (insn != label || LABEL_NUSES (insn) != 1))
       return 0;
@@ -1736,7 +1740,7 @@ 
     return 1;
 
   /* Ensure that we reach a BARRIER before any insn or label.  */
-  for (insn = prev_nonnote_insn (thread);
+  for (insn = prev_nonnote_insn (thread_insn);
        insn == 0 || !BARRIER_P (insn);
        insn = prev_nonnote_insn (insn))
     if (insn == 0
@@ -2275,8 +2279,8 @@ 
 	  = fill_slots_from_thread (insn, const_true_rtx,
 				    next_active_insn (JUMP_LABEL (insn)),
 				    NULL, 1, 1,
-				    own_thread_p (JUMP_LABEL_AS_INSN (insn),
-						  JUMP_LABEL_AS_INSN (insn), 0),
+				    own_thread_p (JUMP_LABEL (insn),
+						  JUMP_LABEL (insn), 0),
 				    slots_to_fill, &slots_filled,
 				    delay_list);
 
@@ -2301,17 +2305,19 @@ 
    If the returned label is obtained by following a crossing jump,
    set *CROSSING to true, otherwise set it to false.  */
 
-static rtx_insn *
-follow_jumps (rtx_insn *label, rtx jump, bool *crossing)
+static rtx
+follow_jumps (rtx label, rtx jump, bool *crossing)
 {
   rtx_insn *insn;
   rtx_insn *next;
-  rtx_insn *value = label;
   int depth;
 
   *crossing = false;
   if (ANY_RETURN_P (label))
     return label;
+
+  rtx_insn *value = as_a <rtx_insn *> (label);
+
   for (depth = 0;
        (depth < 10
 	&& (insn = next_active_insn (value)) != 0
@@ -2323,15 +2329,17 @@ 
 	&& BARRIER_P (next));
        depth++)
     {
-      rtx_insn *this_label = JUMP_LABEL_AS_INSN (insn);
+      rtx this_label_or_return = JUMP_LABEL (insn);
 
       /* If we have found a cycle, make the insn jump to itself.  */
-      if (this_label == label)
+      if (this_label_or_return == label)
 	return label;
 
       /* Cannot follow returns and cannot look through tablejumps.  */
-      if (ANY_RETURN_P (this_label))
-	return this_label;
+      if (ANY_RETURN_P (this_label_or_return))
+	return this_label_or_return;
+
+      rtx_insn *this_label = as_a <rtx_insn *> (this_label_or_return);
       if (NEXT_INSN (this_label)
 	  && JUMP_TABLE_DATA_P (NEXT_INSN (this_label)))
 	break;
@@ -2372,13 +2380,13 @@ 
    slot.  We then adjust the jump to point after the insns we have taken.  */
 
 static rtx_insn_list *
-fill_slots_from_thread (rtx_insn *insn, rtx condition, rtx_insn *thread,
-			rtx_insn *opposite_thread, int likely,
+fill_slots_from_thread (rtx_insn *insn, rtx condition, rtx thread_or_return,
+			rtx opposite_thread, int likely,
 			int thread_if_true,
 			int own_thread, int slots_to_fill,
 			int *pslots_filled, rtx_insn_list *delay_list)
 {
-  rtx_insn *new_thread;
+  rtx new_thread;
   struct resources opposite_needed, set, needed;
   rtx_insn *trial;
   int lose = 0;
@@ -2393,9 +2401,11 @@ 
 
   /* If our thread is the end of subroutine, we can't get any delay
      insns from that.  */
-  if (thread == NULL_RTX || ANY_RETURN_P (thread))
+  if (thread_or_return == NULL_RTX || ANY_RETURN_P (thread_or_return))
     return delay_list;
 
+  rtx_insn *thread = as_a <rtx_insn *> (thread_or_return);
+
   /* If this is an unconditional branch, nothing is needed at the
      opposite thread.  Otherwise, compute what is needed there.  */
   if (condition == const_true_rtx)
@@ -2716,7 +2726,9 @@ 
       rtx dest;
       rtx src;
 
-      trial = new_thread;
+      /* We know "new_thread" is an insn due to NONJUMP_INSN_P (new_thread)
+	 above.  */
+      trial = as_a <rtx_insn *> (new_thread);
       pat = PATTERN (trial);
 
       if (!NONJUMP_INSN_P (trial)
@@ -2797,7 +2809,7 @@ 
 	  && redirect_with_delay_list_safe_p (insn,
 					      JUMP_LABEL (new_thread),
 					      delay_list))
-	new_thread = follow_jumps (JUMP_LABEL_AS_INSN (new_thread), insn,
+	new_thread = follow_jumps (JUMP_LABEL (new_thread), insn,
 				   &crossing);
 
       if (ANY_RETURN_P (new_thread))
@@ -2805,7 +2817,8 @@ 
       else if (LABEL_P (new_thread))
 	label = new_thread;
       else
-	label = get_label_before (new_thread, JUMP_LABEL (insn));
+	label = get_label_before (as_a <rtx_insn *> (new_thread),
+				  JUMP_LABEL (insn));
 
       if (label)
 	{
@@ -2838,7 +2851,8 @@ 
   for (i = 0; i < num_unfilled_slots; i++)
     {
       rtx condition;
-      rtx_insn *target_label, *insn_at_target, *fallthrough_insn;
+      rtx target_label, insn_at_target;
+      rtx_insn *fallthrough_insn;
       rtx_insn_list *delay_list = 0;
       int own_target;
       int own_fallthrough;
@@ -2867,7 +2881,7 @@ 
 	continue;
 
       slots_filled = 0;
-      target_label = JUMP_LABEL_AS_INSN (insn);
+      target_label = JUMP_LABEL (insn);
       condition = get_branch_condition (insn, target_label);
 
       if (condition == 0)
@@ -2911,7 +2925,7 @@ 
 		 we might have found a redundant insn which we deleted
 		 from the thread that was filled.  So we have to recompute
 		 the next insn at the target.  */
-	      target_label = JUMP_LABEL_AS_INSN (insn);
+	      target_label = JUMP_LABEL (insn);
 	      insn_at_target = first_active_target_insn (target_label);
 
 	      delay_list
@@ -3153,7 +3167,9 @@ 
 {
   rtx_insn *insn, *next;
   rtx_sequence *pat;
-  rtx_insn *trial, *delay_insn, *target_label;
+  rtx trial;
+  rtx_insn *delay_insn;
+  rtx target_label;
 
   /* Look at every JUMP_INSN and see if we can improve it.  */
   for (insn = first; insn; insn = next)
@@ -3168,7 +3184,7 @@ 
 	 group of consecutive labels.  */
       if (JUMP_P (insn)
 	  && (condjump_p (insn) || condjump_in_parallel_p (insn))
-	  && !ANY_RETURN_P (target_label = JUMP_LABEL_AS_INSN (insn)))
+	  && !ANY_RETURN_P (target_label = JUMP_LABEL (insn)))
 	{
 	  target_label
 	    = skip_consecutive_labels (follow_jumps (target_label, insn,
@@ -3243,7 +3259,7 @@ 
 	  && 0 > mostly_true_jump (other))
 	{
 	  rtx other_target = JUMP_LABEL (other);
-	  target_label = JUMP_LABEL_AS_INSN (insn);
+	  target_label = JUMP_LABEL (insn);
 
 	  if (invert_jump (other, target_label, 0))
 	    reorg_redirect_jump (insn, other_target);
@@ -3315,7 +3331,7 @@ 
 	  || !(condjump_p (delay_insn) || condjump_in_parallel_p (delay_insn)))
 	continue;
 
-      target_label = JUMP_LABEL_AS_INSN (delay_insn);
+      target_label = JUMP_LABEL (delay_insn);
       if (target_label && ANY_RETURN_P (target_label))
 	continue;
 
@@ -3353,8 +3369,10 @@ 
 
 	  if (tmp)
 	    {
-	      /* Insert the special USE insn and update dataflow info.  */
-	      update_block (trial, tmp);
+	      /* Insert the special USE insn and update dataflow info.
+		 We know "trial" is an insn here as it is the output of
+		 next_real_insn () above.  */
+	      update_block (as_a <rtx_insn *> (trial), tmp);
 	      
 	      /* Now emit a label before the special USE insn, and
 		 redirect our jump to the new label.  */
@@ -3374,7 +3392,7 @@ 
 	  && redundant_insn (XVECEXP (PATTERN (trial), 0, 1), insn, 0))
 	{
 	  rtx_sequence *trial_seq = as_a <rtx_sequence *> (PATTERN (trial));
-	  target_label = JUMP_LABEL_AS_INSN (trial_seq->insn (0));
+	  target_label = JUMP_LABEL (trial_seq->insn (0));
 	  if (ANY_RETURN_P (target_label))
 	    target_label = find_end_label (target_label);
 	  
@@ -3716,7 +3734,7 @@ 
       if (JUMP_P (insn)
 	  && (condjump_p (insn) || condjump_in_parallel_p (insn))
 	  && !ANY_RETURN_P (JUMP_LABEL (insn))
-	  && ((target = skip_consecutive_labels (JUMP_LABEL_AS_INSN (insn)))
+	  && ((target = skip_consecutive_labels (JUMP_LABEL (insn)))
 	      != JUMP_LABEL (insn)))
 	redirect_jump (insn, target, 1);
     }
Index: gcc/resource.c
===================================================================
--- gcc/resource.c	(revision 214732)
+++ gcc/resource.c	(working copy)
@@ -81,7 +81,7 @@ 
 static int find_basic_block (rtx, int);
 static rtx_insn *next_insn_no_annul (rtx_insn *);
 static rtx_insn *find_dead_or_set_registers (rtx_insn *, struct resources*,
-					     rtx_insn **, int, struct resources,
+					     rtx *, int, struct resources,
 					     struct resources);
 
 /* Utility function called from mark_target_live_regs via note_stores.
@@ -422,19 +422,20 @@ 
 
 static rtx_insn *
 find_dead_or_set_registers (rtx_insn *target, struct resources *res,
-			    rtx_insn **jump_target, int jump_count,
+			    rtx *jump_target, int jump_count,
 			    struct resources set, struct resources needed)
 {
   HARD_REG_SET scratch;
-  rtx_insn *insn, *next;
+  rtx_insn *insn;
+  rtx_insn *next_insn;
   rtx_insn *jump_insn = 0;
   int i;
 
-  for (insn = target; insn; insn = next)
+  for (insn = target; insn; insn = next_insn)
     {
       rtx_insn *this_jump_insn = insn;
 
-      next = NEXT_INSN (insn);
+      next_insn = NEXT_INSN (insn);
 
       /* If this instruction can throw an exception, then we don't
 	 know where we might end up next.  That means that we have to
@@ -497,14 +498,16 @@ 
 	      if (any_uncondjump_p (this_jump_insn)
 		  || ANY_RETURN_P (PATTERN (this_jump_insn)))
 		{
-		  next = JUMP_LABEL_AS_INSN (this_jump_insn);
-		  if (ANY_RETURN_P (next))
-		    next = NULL;
+		  rtx lab_or_return = JUMP_LABEL (this_jump_insn);
+		  if (ANY_RETURN_P (lab_or_return))
+		    next_insn = NULL;
+		  else
+		    next_insn = as_a <rtx_insn *> (lab_or_return);
 		  if (jump_insn == 0)
 		    {
 		      jump_insn = insn;
 		      if (jump_target)
-			*jump_target = JUMP_LABEL_AS_INSN (this_jump_insn);
+			*jump_target = JUMP_LABEL (this_jump_insn);
 		    }
 		}
 	      else if (any_condjump_p (this_jump_insn))
@@ -572,7 +575,7 @@ 
 		    find_dead_or_set_registers (JUMP_LABEL_AS_INSN (this_jump_insn),
 						&target_res, 0, jump_count,
 						target_set, needed);
-		  find_dead_or_set_registers (next,
+		  find_dead_or_set_registers (next_insn,
 					      &fallthrough_res, 0, jump_count,
 					      set, needed);
 		  IOR_HARD_REG_SET (fallthrough_res.regs, target_res.regs);
@@ -880,7 +883,7 @@ 
    init_resource_info () was invoked before we are called.  */
 
 void
-mark_target_live_regs (rtx_insn *insns, rtx_insn *target, struct resources *res)
+mark_target_live_regs (rtx_insn *insns, rtx target_maybe_return, struct resources *res)
 {
   int b = -1;
   unsigned int i;
@@ -887,19 +890,23 @@ 
   struct target_info *tinfo = NULL;
   rtx_insn *insn;
   rtx jump_insn = 0;
-  rtx_insn *jump_target;
+  rtx jump_target;
   HARD_REG_SET scratch;
   struct resources set, needed;
 
   /* Handle end of function.  */
-  if (target == 0 || ANY_RETURN_P (target))
+  if (target_maybe_return == 0 || ANY_RETURN_P (target_maybe_return))
     {
       *res = end_of_function_needs;
       return;
     }
 
+  /* We've handled the case of RETURN/SIMPLE_RETURN; we should now have an
+     instruction.  */
+  rtx_insn *target = as_a <rtx_insn *> (target_maybe_return);
+
   /* Handle return insn.  */
-  else if (return_insn_p (target))
+  if (return_insn_p (target))
     {
       *res = end_of_function_needs;
       mark_referenced_resources (target, res, false);
Index: gcc/resource.h
===================================================================
--- gcc/resource.h	(revision 214732)
+++ gcc/resource.h	(working copy)
@@ -44,7 +44,7 @@ 
   MARK_SRC_DEST_CALL = 1
 };
 
-extern void mark_target_live_regs (rtx_insn *, rtx_insn *, struct resources *);
+extern void mark_target_live_regs (rtx_insn *, rtx, struct resources *);
 extern void mark_set_resources (rtx, struct resources *, int,
 				enum mark_resource_type);
 extern void mark_referenced_resources (rtx, struct resources *, bool);