Patchwork [RFC] Re-write LTO type merging again, do tree merging

Submitter Richard Guenther
Date June 13, 2013, 10:08 a.m.
Message ID <alpine.LNX.2.00.1306131022370.26078@zhemvz.fhfr.qr>
Permalink /patch/251030/
State New

Comments

Richard Guenther - June 13, 2013, 10:08 a.m.
On Wed, 12 Jun 2013, Richard Biener wrote:

> 
> The following patch re-writes LTO type merging completely, with the
> goal of moving as much work as possible to the compile step, away
> from WPA time.  At the same time the merging itself becomes very
> conservative but also more general - it now merges arbitrary trees,
> not only types, but only if they are bit-identical and have the
> same outgoing tree pointers.
> 
> Especially the latter means that we now have to merge SCCs of trees
> together and either take the whole SCC as prevailing or throw it
> away.  Moving work to the compile step means that we compute
> SCCs and their hashes there, re-organizing streaming to stream
> tree bodies as SCC blocks together with the computed hash.
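A hash like this has to be a function of the SCC's contents only, not of the order in which its members happen to be numbered by a particular compile.  A toy scheme with that property is sketched below; the node and edge layout are invented for illustration, and GCC's real hash_scc works on the streamed tree headers and bitpacks, not on a struct like this.

```c
/* Toy sketch: hash an SCC of tree-like nodes so that two bit-identical
   SCCs from different object files hash equally, independent of node
   numbering.  Node/edge layout is invented for illustration only.  */
#include <assert.h>
#include <stdint.h>

#define MAX_SUCCS 4

struct node {
  uint32_t bits;        /* local "bitpack" payload */
  int nsuccs;
  int succ[MAX_SUCCS];  /* indices of successors within the SCC */
};

/* boost-style hash combine.  */
static uint32_t mix (uint32_t h, uint32_t v)
{
  return h ^ (v + 0x9e3779b9u + (h << 6) + (h >> 2));
}

/* Hash an SCC of N nodes (N <= 16 in this sketch): start from the
   local payloads, refine each node's hash a fixed number of rounds by
   mixing in its successors' hashes, then combine the per-node hashes
   commutatively so the result is independent of node numbering.  */
uint32_t scc_hash (const struct node *scc, int n)
{
  uint32_t h[16], nh[16];
  for (int i = 0; i < n; i++)
    h[i] = mix (0, scc[i].bits);
  for (int round = 0; round < 2; round++)
    {
      for (int i = 0; i < n; i++)
	{
	  nh[i] = h[i];
	  for (int j = 0; j < scc[i].nsuccs; j++)
	    nh[i] = mix (nh[i], h[scc[i].succ[j]]);
	}
      for (int i = 0; i < n; i++)
	h[i] = nh[i];
    }
  uint32_t res = 0;
  for (int i = 0; i < n; i++)
    res += h[i];  /* commutative combine */
  return res;
}
```

Relabeling the nodes of a two-element cycle, for instance, leaves the hash unchanged, which is exactly what unify_scc needs so that a producer and a consumer agree on the bucket.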
> 
> When we ask the streamer to output a tree T, it now has to
> DFS-walk all tree pointers, collecting SCCs of not-yet-streamed
> trees, and output them as follows:
> 
>  { LTO_tree_scc, N, hash, entry_len,
>    { header1, header2, ... headerN },
>    { bits1, refs1, bits2, refs2, ... bitsN, refsN } }
>  { LTO_tree_scc, 1, hash, header, bits, refs }
>  { LTO_tree_scc, M, hash, entry_len,
>    { header1, header2, ... headerM },
>    { bits1, refs1, bits2, refs2, ... bitsM, refsM } }
>  LTO_tree_pickle_reference to T
> 
> with tree references in refsN always being LTO_tree_pickle_references
> instead of starting a new tree inline.  That results in at most
> N extra LTO_tree_pickle_references for N streamed trees; together
> with the LTO_tree_scc wrapping overhead this causes a slight
> increase in LTO object size (around 10% the last time I measured,
> which was before some additional optimization went in).
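The DFS described above is essentially Tarjan's SCC algorithm: every SCC is completed, and can be emitted as one LTO_tree_scc blob, before anything that merely points into it is emitted.  A minimal sketch over a toy adjacency matrix (invented representation; not GCC's actual DFS_write_tree):

```c
/* Toy Tarjan SCC walk over a small "tree pointer" graph, grouping
   nodes the way the streamer groups LTO_tree_scc blobs: each SCC is
   popped complete before any SCC that references it is finished.  */
#include <assert.h>

#define N 5
static int succ[N][N];  /* adjacency: succ[a][b] != 0 means a -> b */
static int idx[N], low[N], onstack[N], stack[N], sp, next_idx;
static int scc_of[N], nsccs;

static void dfs (int v)
{
  idx[v] = low[v] = next_idx++;
  stack[sp++] = v; onstack[v] = 1;
  for (int w = 0; w < N; w++)
    if (succ[v][w])
      {
	if (idx[w] < 0)
	  {
	    dfs (w);
	    if (low[w] < low[v]) low[v] = low[w];
	  }
	else if (onstack[w] && idx[w] < low[v])
	  low[v] = idx[w];
      }
  if (low[v] == idx[v])
    {
      /* V is an SCC root: pop the whole SCC as one blob.  */
      int w;
      do { w = stack[--sp]; onstack[w] = 0; scc_of[w] = nsccs; }
      while (w != v);
      nsccs++;
    }
}

/* Walk the whole graph and return the number of SCCs found.  */
int count_sccs (void)
{
  for (int i = 0; i < N; i++) idx[i] = -1;
  sp = next_idx = nsccs = 0;
  for (int i = 0; i < N; i++)
    if (idx[i] < 0) dfs (i);
  return nsccs;
}
```

Singleton SCCs (size 1, the common case per the averages below) take the short encoding without an entry_len field.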
> 
> The overhead also happens on the LTRANS file producing side
> which now has to do the DFS walk and stream the extra data.
> It doesn't do the hashing though as on the LTRANS consumer
> side no merging is performed.
> 
> The patch preserves the core of the old merging code to compare
> with the new code and output some statistics.  That means that
> if you build with -flto-report[-wpa] you incur additional
> compile-time and memory overhead.
> 
> For reference here are the stats when LTO bootstrapping for
> stage2 cc1:
> 
> WPA statistics
> [WPA] read 2494507 SCCs of average size 2.380067
> [WPA] 5937095 tree bodies read in total
> [WPA] tree SCC table: size 524287, 286280 elements, collision ratio: 
> 0.806376
> [WPA] tree SCC max chain length 11 (size 1)
> [WPA] Compared 403361 SCCs, 6226 collisions (0.015435)
> [WPA] Merged 399980 SCCs
> [WPA] Merged 2438250 tree bodies
> [WPA] Merged 192475 types
> [WPA] 195422 types prevailed
> [WPA] Old merging code merges an additional 54582 types of which 21083 are 
> in the same SCC with their prevailing variant
> 
> This says that we've streamed in 5937095 tree bodies in
> 2494507 SCCs (so the average SCC size is small), of those
> we were able to immediately ggc_free 399980 SCCs because they
> already existed in identical form (16% of the SCCs, 41% of the trees
> and 49% of the types).  The old merging code forced the merge
> of an additional 54582 types (but 21083 of them it merged with
> a type that is in the same SCC, that is, it changed the shape
> of the SCC and collapsed parts of it - something that is
> suspicious).
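The quoted percentages follow directly from the counters above (a quick sanity check; the 49% is merged types over merged plus prevailed types, truncated):

```c
/* Reproduce the 16% / 41% / 49% figures from the raw [WPA] counters
   quoted above.  */
#include <assert.h>

/* Integer percentage of a/b, truncated as in the report text.  */
static int pct (double a, double b)
{
  return (int) (100.0 * a / b);
}

static int merged_scc_pct (void)  { return pct (399980, 2494507); }
static int merged_tree_pct (void) { return pct (2438250, 5937095); }
static int merged_type_pct (void) { return pct (192475, 192475 + 195422); }
```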
> 
> The patch was LTO bootstrapped (testing currently running) on
> x86_64-unknown-linux-gnu and I've built SPEC2k6 with -Ofast -g -flto
> and did a test run of the binaries which shows that
> currently 471.omnetpp, 483.xalancbmk and 447.dealII fail
> (471.omnetpp segfaults in __cxxabiv1::__dynamic_cast) - these
> failures were introduced quite recently, likely due to the improved
> FUNCTION_DECL and VAR_DECL merging and the cgraph fixup Honza did.

The following incremental patch fixes that.



Here are some updated numbers on cc1 disk, memory and compile-time
usage when built in stage3 with LTO and release checking:

Input object file size unpatched: 324509482 bytes
Input object file size patched:   373225406 bytes (115%)

WPA time/maxrss unpatched:

10.83user 0.67system 0:11.53elapsed 99%CPU (0avgtext+0avgdata 
575108maxresident)k
0inputs+778264outputs (0major+397754minor)pagefaults 0swaps

 ipa lto decl in         :   3.39 (32%) usr
 ipa lto decl out        :   3.88 (36%) usr

WPA time/maxrss patched:

18.35user 2.12system 0:20.56elapsed 99%CPU (0avgtext+0avgdata 
648800maxresident)k
16inputs+1263088outputs (0major+606347minor)pagefaults 0swaps

 ipa lto decl in         :   3.30 (18%) usr
 ipa lto decl out        :  11.46 (63%) usr

LTRANS file size unpatched: 398407935 bytes
LTRANS file size patched:   645106888 bytes (162%)
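For reference, the parenthesized percentages are just patched/unpatched size ratios, rounded to the nearest percent:

```c
/* Check the overhead ratios quoted above: patched vs. unpatched
   input object size and LTRANS file size.  */
#include <assert.h>

/* Percentage of a relative to b, rounded to nearest.  */
static int rel_pct (double a, double b)
{
  return (int) (100.0 * a / b + 0.5);
}

static int input_obj_pct (void) { return rel_pct (373225406.0, 324509482.0); }
static int ltrans_pct (void)    { return rel_pct (645106888.0, 398407935.0); }
```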

Unpatched WPA statistics:

[WPA] GIMPLE type table: size 262139, 134158 elements, 365200 searches, 
427507 collisions (ratio: 1.170611)
[WPA] GIMPLE type hash cache table: size 524287, 200149 elements, 4042887 
searches, 3624678 collisions (ratio: 0.896557)
[WPA] Merged 229106 types
[WPA] GIMPLE canonical type table: size 16381, 6221 elements, 134279 
searches, 4370 collisions (ratio: 0.032544)
[WPA] GIMPLE canonical type hash table: size 262139, 134223 elements, 
574837 searches, 506302 collisions (ratio: 0.880775)
[WPA] # of input files: 422
...
[WPA] Size of mmap'd section decls: 36618988 bytes
[WPA] Size of mmap'd section function_body: 40566108 bytes
[WPA] Size of mmap'd section statics: 0 bytes
[WPA] Size of mmap'd section symtab: 0 bytes
[WPA] Size of mmap'd section refs: 138315 bytes
[WPA] Size of mmap'd section asm: 0 bytes
[WPA] Size of mmap'd section jmpfuncs: 665899 bytes
[WPA] Size of mmap'd section pureconst: 59901 bytes
[WPA] Size of mmap'd section reference: 0 bytes
[WPA] Size of mmap'd section profile: 8845 bytes
[WPA] Size of mmap'd section symbol_nodes: 1144120 bytes
[WPA] Size of mmap'd section opts: 0 bytes
[WPA] Size of mmap'd section cgraphopt: 0 bytes
[WPA] Size of mmap'd section inline: 844004 bytes
[WPA] Size of mmap'd section ipcp_trans: 0 bytes

Patched WPA statistics:

[WPA] read 2386084 SCCs of average size 2.421782
[WPA] 5778576 tree bodies read in total
[WPA] tree SCC table: size 524287, 267346 elements, collision ratio: 
0.855240
[WPA] tree SCC max chain length 11 (size 1)
[WPA] Compared 362074 SCCs, 5417 collisions (0.014961)
[WPA] Merged 358942 SCCs
[WPA] Merged 2384340 tree bodies
[WPA] Merged 175201 types
[WPA] 188264 types prevailed
[WPA] Old merging code merges an additional 53081 types of which 21094 are 
in the same SCC
[WPA] GIMPLE canonical type table: size 16381, 6234 elements, 188399 
searches, 5850 collisions (ratio: 0.031051)
[WPA] GIMPLE canonical type hash table: size 262139, 188343 elements, 
1217265 searches, 817652 collisions (ratio: 0.671712)
[WPA] # of input files: 422
...
[WPA] Size of mmap'd section decls: 68171523 bytes
[WPA] Size of mmap'd section function_body: 56454939 bytes
[WPA] Size of mmap'd section statics: 0 bytes
[WPA] Size of mmap'd section symtab: 0 bytes
[WPA] Size of mmap'd section refs: 138463 bytes
[WPA] Size of mmap'd section asm: 0 bytes
[WPA] Size of mmap'd section jmpfuncs: 1185254 bytes
[WPA] Size of mmap'd section pureconst: 59943 bytes
[WPA] Size of mmap'd section reference: 0 bytes
[WPA] Size of mmap'd section profile: 8845 bytes
[WPA] Size of mmap'd section symbol_nodes: 1145897 bytes
[WPA] Size of mmap'd section opts: 0 bytes
[WPA] Size of mmap'd section cgraphopt: 0 bytes
[WPA] Size of mmap'd section inline: 895848 bytes
[WPA] Size of mmap'd section ipcp_trans: 0 bytes


That leaves me with mixed feelings despite the design being far
superior overall (and correct).  The WPA LTRANS output side has a
hard time both writing more trees (we merged fewer types) and
probably performing the DFS walks.

As you can see from the compile-step object file sizes, the overhead
of streaming in SCCs is manageable, so the near-doubling in LTRANS
file size has to come from extra trees we stream there (I'm going
to add some more counters and try collecting stats per LTRANS file
we output).

Richard.
Richard Guenther - June 13, 2013, 2:20 p.m.
On Thu, 13 Jun 2013, Richard Biener wrote:

> On Wed, 12 Jun 2013, Richard Biener wrote:
> 
> > 
> > [...]
> 
> The following incremental patch fixes that.
> 
> Index: trunk/gcc/lto-symtab.c
> ===================================================================
> --- trunk.orig/gcc/lto-symtab.c 2013-06-12 16:47:38.000000000 +0200
> +++ trunk/gcc/lto-symtab.c      2013-06-12 17:00:12.664126423 +0200
> @@ -96,9 +96,6 @@ lto_varpool_replace_node (struct varpool
>  
>    ipa_clone_referring ((symtab_node)prevailing_node, 
> &vnode->symbol.ref_list);
>  
> -  /* Be sure we can garbage collect the initializer.  */
> -  if (DECL_INITIAL (vnode->symbol.decl))
> -    DECL_INITIAL (vnode->symbol.decl) = error_mark_node;
>    /* Finally remove the replaced node.  */
>    varpool_remove_node (vnode);
>  }
> Index: trunk/gcc/varpool.c
> ===================================================================
> --- trunk.orig/gcc/varpool.c    2013-06-12 13:13:06.000000000 +0200
> +++ trunk/gcc/varpool.c 2013-06-12 17:01:46.088248807 +0200
> @@ -77,15 +77,8 @@ varpool_remove_node (struct varpool_node
>  
>  /* Renove node initializer when it is no longer needed.  */
>  void
> -varpool_remove_initializer (struct varpool_node *node)
> +varpool_remove_initializer (struct varpool_node *)
>  {
> -  if (DECL_INITIAL (node->symbol.decl)
> -      && !DECL_IN_CONSTANT_POOL (node->symbol.decl)
> -      /* Keep vtables for BINFO folding.  */
> -      && !DECL_VIRTUAL_P (node->symbol.decl)
> -      /* FIXME: http://gcc.gnu.org/PR55395 */
> -      && debug_info_level == DINFO_LEVEL_NONE)
> -    DECL_INITIAL (node->symbol.decl) = error_mark_node;
>  }
>  
>  /* Dump given cgraph node.  */
> 
> 
> Here are some updated numbers on cc1 disk, memory and compile-time
> usage when built in stage3 with LTO and release checking:

OK, no longer streaming and comparing TREE_USED improves this to

> Input object file size unpatched: 324509482 bytes
> Input object file size patched:   373225406 bytes (115%)
> 
> WPA time/maxrss unpatched:
> 
> 10.83user 0.67system 0:11.53elapsed 99%CPU (0avgtext+0avgdata 
> 575108maxresident)k
> 0inputs+778264outputs (0major+397754minor)pagefaults 0swaps
> 
>  ipa lto decl in         :   3.39 (32%) usr
>  ipa lto decl out        :   3.88 (36%) usr
> 
> WPA time/maxrss patched:
> 
> 18.35user 2.12system 0:20.56elapsed 99%CPU (0avgtext+0avgdata 
> 648800maxresident)k
> 16inputs+1263088outputs (0major+606347minor)pagefaults 0swaps

13.01user 0.68system 0:13.72elapsed 99%CPU (0avgtext+0avgdata 
520092maxresident)k
0inputs+8outputs (0major+389794minor)pagefaults 0swaps

>  ipa lto decl in         :   3.30 (18%) usr
>  ipa lto decl out        :  11.46 (63%) usr

 ipa lto decl in         :   3.31 (25%) usr
 ipa lto decl out        :   6.27 (48%) usr

> LTRANS file size unpatched: 398407935 bytes
> LTRANS file size patched:   645106888 bytes (162%)

LTRANS file size patched:   451821845 bytes (113%)

> 
> Unpatched WPA statistics:
> 
> [...]
> 
> Patched WPA statistics:
> 
> [...]

[WPA] read 2386075 SCCs of average size 2.421786
[WPA] 5778564 tree bodies read in total
[WPA] tree SCC table: size 524287, 255520 elements, collision ratio: 
0.944901
[WPA] tree SCC max chain length 11 (size 1)
[WPA] Compared 373893 SCCs, 5706 collisions (0.015261)
[WPA] Merged 370688 SCCs
[WPA] Merged 3165669 tree bodies
[WPA] Merged 188791 types
[WPA] 174670 types prevailed (547208 associated trees)
[WPA] Old merging code merges an additional 40013 types of which 21620 are 
in the same SCC with their prevailing variant (411743 and 324175 
associated trees)
[WPA] GIMPLE canonical type table: size 16381, 6233 elements, 174805 
searches, 5484 collisions (ratio: 0.031372)
[WPA] GIMPLE canonical type hash table: size 262139, 174753 elements, 
862715 searches, 755816 collisions (ratio: 0.876090)
[WPA] # of input files: 422
...
[WPA] Size of mmap'd section decls: 67911573 bytes
[WPA] Size of mmap'd section function_body: 0 bytes
[WPA] Size of mmap'd section statics: 0 bytes
[WPA] Size of mmap'd section symtab: 0 bytes
[WPA] Size of mmap'd section refs: 0 bytes
[WPA] Size of mmap'd section asm: 0 bytes
[WPA] Size of mmap'd section jmpfuncs: 0 bytes
[WPA] Size of mmap'd section pureconst: 0 bytes
[WPA] Size of mmap'd section reference: 0 bytes
[WPA] Size of mmap'd section profile: 0 bytes
[WPA] Size of mmap'd section symbol_nodes: 0 bytes
[WPA] Size of mmap'd section opts: 0 bytes
[WPA] Size of mmap'd section cgraphopt: 0 bytes
[WPA] Size of mmap'd section inline: 0 bytes
[WPA] Size of mmap'd section ipcp_trans: 0 bytes


That's now much more reasonable.

The complete fixed patch is below, for reference (without the
adjusted ChangeLog).

Richard.


	* lto-streamer.h (enum LTO_tags): Add LTO_tree_scc.
	(lto_input_scc): Declare.
	(lto_input_tree_1): Likewise.
	* lto-streamer-in.c (lto_read_body): Use streamer_tree_cache_get_tree.
	(lto_read_tree_1): Split out from ...
	(lto_read_tree): ... this.
	(lto_input_scc): New function.
	(lto_input_tree_1): Split out from ...
	(lto_input_tree): ... this.  Handle LTO_tree_scc.
	(lto_data_in_create): Create the streamer cache without hashes.
	* lto-streamer-out.c (create_output_block): Create the streamer
	cache with hashes when not doing WPA.
	(lto_write_tree_1): Split out from ...
	(lto_write_tree): ... this.
	(get_symbol_initial_value): New function.
	(lto_output_tree_1): Split out from ...
	(lto_output_tree): ... this.  Write trees as series of SCCs
	using a DFS walk via DFS_write_tree.
	(struct sccs, struct scc_entry): New types.
	(next_dfs_num, sccstack, sccstate, sccstate_obstack): New globals.
	(DFS_write_tree_body): New function.
	(DFS_write_tree): Likewise.
	(hash_tree): Likewise.
	(scc_entry_compare): Likewise.
	(hash_scc): Likewise.
	* tree-streamer-in.c (lto_input_ts_list_tree_pointers): Stream
	TREE_CHAIN as regular reference.
	(streamer_read_integer_cst): Remove.
	(streamer_get_pickled_tree): Adjust.
	* tree-streamer-out.c (streamer_write_chain): Disable streaming
	of DECL_EXTERNALs in BLOCK_VARS for now.
	(write_ts_list_tree_pointers): Stream TREE_CHAIN as regular
	reference.
	* tree-streamer.c (streamer_tree_cache_add_to_node_array):
	Add hash value argument and record that if hashes are recorded
	in the cache.
	(streamer_tree_cache_insert_1): Adjust.
	(streamer_tree_cache_insert): Likewise.
	(streamer_tree_cache_insert_at): Rename to ...
	(streamer_tree_cache_replace_tree): ... this and adjust.
	(streamer_tree_cache_append): Adjust.
	(record_common_node): Likewise.
	(streamer_tree_cache_create): Add argument whether to
	record hash values together with trees.
	(streamer_tree_cache_delete): Adjust.
	* tree-streamer.h (struct streamer_tree_cache_d): Add
	vector of hashes.
	(streamer_read_integer_cst): Remove.
	(streamer_tree_cache_insert): Adjust.
	(streamer_tree_cache_append): Likewise.
	(streamer_tree_cache_insert_at): Rename to ...
	(streamer_tree_cache_replace_tree): ... this and adjust.
	(streamer_tree_cache_create): Add argument whether to record hashes.
	(streamer_tree_cache_get): Rename to ...
	(streamer_tree_cache_get_tree): ... this.
	(streamer_tree_cache_get_hash): New function.
	* tree.c (cache_integer_cst): New function.
	* tree.h (cache_integer_cst): Declare.

	lto/
	* Make-lang.in (lto.o): Add $(DATA_STREAMER_H) dependency.
	* lto.c: Include data-streamer.h.
	(lto_read_in_decl_state): Use streamer_tree_cache_get_tree.
	(gimple_type_leader_entry_s, gimple_type_leader,
	gimple_lookup_type_leader): Remove.
	(gtc_visit): Simplify.
	(gimple_types_compatible_p): Likewise.
	(gimple_register_type_1): Likewise.  Merge into ...
	(gimple_register_type): ... this.  Keep it as legacy for
	statistics purposes for now.
	(fixup_integer_cst): Remove.
	(LTO_FIXUP_TREE, lto_fixup_types, lto_ft_*): Simplify and
	rename to ...
	(MAYBE_REMEMBER_WITH_VARS, maybe_remember_with_vars,
	maybe_remember_with_vars_*): ... these.
	(uniquify_nodes): Remove.
	(lto_fixup_prevailing_type): New function.
	(struct tree_scc, struct tree_scc_hasher): New type and hasher.
	(tree_scc_hash, tree_scc_hash_obstack): New globals.
	(num_merged_types, num_prevailing_types, num_not_merged_types,
	num_not_merged_types_in_same_scc, total_scc_size, num_sccs_read,
	total_scc_size_merged, num_sccs_merged, num_scc_compares,
	num_scc_compare_collisions): New global counters.
	(compare_tree_sccs_1): New function.
	(compare_tree_sccs): Likewise.
	(unify_scc): Likewise.
	(lto_read_decls): Stream in tree SCCs and unify them on the
	way in.  Finalize prevailing SCC tree members.
	(read_cgraph_and_symbols): Do not initialize or free gimple_type_leader.
	Allocate and free tree_scc_hash_obstack and tree_scc_hash, do not bother
	to ggc-collect during merging.
	(print_lto_report_1): Adjust for new merging code.

Index: trunk/gcc/lto-streamer.h
===================================================================
*** trunk.orig/gcc/lto-streamer.h	2013-06-13 14:35:48.000000000 +0200
--- trunk/gcc/lto-streamer.h	2013-06-13 14:35:53.249107976 +0200
*************** enum LTO_tags
*** 199,204 ****
--- 199,207 ----
    /* EH try/catch node.  */
    LTO_eh_catch,
  
+   /* Special for global streamer.  A blob of unnamed tree nodes.  */
+   LTO_tree_scc,
+ 
    /* References to indexable tree nodes.  These objects are stored in
       tables that are written separately from the function bodies that
       reference them.  This way they can be instantiated even when the
*************** tree lto_input_tree_ref (struct lto_inpu
*** 854,859 ****
--- 857,866 ----
  			 struct function *, enum LTO_tags);
  void lto_tag_check_set (enum LTO_tags, int, ...);
  void lto_init_eh (void);
+ hashval_t lto_input_scc (struct lto_input_block *, struct data_in *,
+ 			 unsigned *, unsigned *);
+ tree lto_input_tree_1 (struct lto_input_block *, struct data_in *,
+ 		       enum LTO_tags, hashval_t hash);
  tree lto_input_tree (struct lto_input_block *, struct data_in *);
  
  
Index: trunk/gcc/lto-streamer-in.c
===================================================================
*** trunk.orig/gcc/lto-streamer-in.c	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/lto-streamer-in.c	2013-06-13 14:35:53.250107988 +0200
*************** lto_read_body (struct lto_file_decl_data
*** 1016,1022 ****
  	  unsigned i;
  	  for (i = len; i-- > from;)
  	    {
! 	      tree t = cache->nodes[i];
  	      if (t == NULL_TREE)
  		continue;
  
--- 1016,1022 ----
  	  unsigned i;
  	  for (i = len; i-- > from;)
  	    {
! 	      tree t = streamer_tree_cache_get_tree (cache, i);
  	      if (t == NULL_TREE)
  		continue;
  
*************** lto_input_function_body (struct lto_file
*** 1056,1067 ****
  }
  
  
  /* Read the physical representation of a tree node with tag TAG from
     input block IB using the per-file context in DATA_IN.  */
  
  static tree
  lto_read_tree (struct lto_input_block *ib, struct data_in *data_in,
! 	       enum LTO_tags tag)
  {
    /* Instantiate a new tree node.  */
    tree result = streamer_alloc_tree (ib, data_in, tag);
--- 1056,1098 ----
  }
  
  
+ /* Read the physical representation of a tree node EXPR from
+    input block IB using the per-file context in DATA_IN.  */
+ 
+ static void
+ lto_read_tree_1 (struct lto_input_block *ib, struct data_in *data_in, tree expr)
+ {
+   /* Read all the bitfield values in EXPR.  Note that for LTO, we
+      only write language-independent bitfields, so no more unpacking is
+      needed.  */
+   streamer_read_tree_bitfields (ib, data_in, expr);
+ 
+   /* Read all the pointer fields in EXPR.  */
+   streamer_read_tree_body (ib, data_in, expr);
+ 
+   /* Read any LTO-specific data not read by the tree streamer.  */
+   if (DECL_P (expr)
+       && TREE_CODE (expr) != FUNCTION_DECL
+       && TREE_CODE (expr) != TRANSLATION_UNIT_DECL)
+     DECL_INITIAL (expr) = stream_read_tree (ib, data_in);
+ 
+   /* We should never try to instantiate an MD or NORMAL builtin here.  */
+   if (TREE_CODE (expr) == FUNCTION_DECL)
+     gcc_assert (!streamer_handle_as_builtin_p (expr));
+ 
+ #ifdef LTO_STREAMER_DEBUG
+   /* Remove the mapping to RESULT's original address set by
+      streamer_alloc_tree.  */
+   lto_orig_address_remove (expr);
+ #endif
+ }
+ 
  /* Read the physical representation of a tree node with tag TAG from
     input block IB using the per-file context in DATA_IN.  */
  
  static tree
  lto_read_tree (struct lto_input_block *ib, struct data_in *data_in,
! 	       enum LTO_tags tag, hashval_t hash)
  {
    /* Instantiate a new tree node.  */
    tree result = streamer_alloc_tree (ib, data_in, tag);
*************** lto_read_tree (struct lto_input_block *i
*** 1069,1103 ****
    /* Enter RESULT in the reader cache.  This will make RESULT
       available so that circular references in the rest of the tree
       structure can be resolved in subsequent calls to stream_read_tree.  */
!   streamer_tree_cache_append (data_in->reader_cache, result);
  
!   /* Read all the bitfield values in RESULT.  Note that for LTO, we
!      only write language-independent bitfields, so no more unpacking is
!      needed.  */
!   streamer_read_tree_bitfields (ib, data_in, result);
  
!   /* Read all the pointer fields in RESULT.  */
!   streamer_read_tree_body (ib, data_in, result);
  
!   /* Read any LTO-specific data not read by the tree streamer.  */
!   if (DECL_P (result)
!       && TREE_CODE (result) != FUNCTION_DECL
!       && TREE_CODE (result) != TRANSLATION_UNIT_DECL)
!     DECL_INITIAL (result) = stream_read_tree (ib, data_in);
  
-   /* We should never try to instantiate an MD or NORMAL builtin here.  */
-   if (TREE_CODE (result) == FUNCTION_DECL)
-     gcc_assert (!streamer_handle_as_builtin_p (result));
  
!   /* end_marker = */ streamer_read_uchar (ib);
  
! #ifdef LTO_STREAMER_DEBUG
!   /* Remove the mapping to RESULT's original address set by
!      streamer_alloc_tree.  */
!   lto_orig_address_remove (result);
! #endif
  
!   return result;
  }
  
  
--- 1100,1169 ----
    /* Enter RESULT in the reader cache.  This will make RESULT
       available so that circular references in the rest of the tree
       structure can be resolved in subsequent calls to stream_read_tree.  */
!   streamer_tree_cache_append (data_in->reader_cache, result, hash);
  
!   lto_read_tree_1 (ib, data_in, result);
  
!   /* end_marker = */ streamer_read_uchar (ib);
  
!   return result;
! }
  
  
! /* Populate the reader cache with trees materialized from the SCC
!    following in the IB, DATA_IN stream.  */
  
! hashval_t
! lto_input_scc (struct lto_input_block *ib, struct data_in *data_in,
! 	       unsigned *len, unsigned *entry_len)
! {
!   /* A blob of unnamed tree nodes, fill the cache from it and
!      recurse.  */
!   unsigned size = streamer_read_uhwi (ib);
!   hashval_t scc_hash = streamer_read_uhwi (ib);
!   unsigned scc_entry_len = 1;
  
!   if (size == 1)
!     {
!       enum LTO_tags tag = streamer_read_record_start (ib);
!       lto_input_tree_1 (ib, data_in, tag, scc_hash);
!     }
!   else
!     {
!       unsigned int first = data_in->reader_cache->nodes.length ();
!       tree result;
! 
!       scc_entry_len = streamer_read_uhwi (ib);
! 
!       /* Materialize size trees by reading their headers.  */
!       for (unsigned i = 0; i < size; ++i)
! 	{
! 	  enum LTO_tags tag = streamer_read_record_start (ib);
! 	  if (tag == LTO_null
! 	      || (tag >= LTO_field_decl_ref && tag <= LTO_global_decl_ref)
! 	      || tag == LTO_tree_pickle_reference
! 	      || tag == LTO_builtin_decl
! 	      || tag == LTO_integer_cst
! 	      || tag == LTO_tree_scc)
! 	    gcc_unreachable ();
! 
! 	  result = streamer_alloc_tree (ib, data_in, tag);
! 	  streamer_tree_cache_append (data_in->reader_cache, result, 0);
! 	}
! 
!       /* Read the tree bitpacks and references.  */
!       for (unsigned i = 0; i < size; ++i)
! 	{
! 	  result = streamer_tree_cache_get_tree (data_in->reader_cache,
! 						 first + i);
! 	  lto_read_tree_1 (ib, data_in, result);
! 	  /* end_marker = */ streamer_read_uchar (ib);
! 	}
!     }
! 
!   *len = size;
!   *entry_len = scc_entry_len;
!   return scc_hash;
  }
  
  
*************** lto_read_tree (struct lto_input_block *i
*** 1106,1117 ****
     to previously read nodes.  */
  
  tree
! lto_input_tree (struct lto_input_block *ib, struct data_in *data_in)
  {
-   enum LTO_tags tag;
    tree result;
  
-   tag = streamer_read_record_start (ib);
    gcc_assert ((unsigned) tag < (unsigned) LTO_NUM_TAGS);
  
    if (tag == LTO_null)
--- 1172,1182 ----
     to previously read nodes.  */
  
  tree
! lto_input_tree_1 (struct lto_input_block *ib, struct data_in *data_in,
! 		  enum LTO_tags tag, hashval_t hash)
  {
    tree result;
  
    gcc_assert ((unsigned) tag < (unsigned) LTO_NUM_TAGS);
  
    if (tag == LTO_null)
*************** lto_input_tree (struct lto_input_block *
*** 1137,1155 ****
      }
    else if (tag == LTO_integer_cst)
      {
!       /* For shared integer constants we only need the type and its hi/low
! 	 words.  */
!       result = streamer_read_integer_cst (ib, data_in);
      }
    else
      {
        /* Otherwise, materialize a new node from IB.  */
!       result = lto_read_tree (ib, data_in, tag);
      }
  
    return result;
  }
  
  
  /* Input toplevel asms.  */
  
--- 1202,1240 ----
      }
    else if (tag == LTO_integer_cst)
      {
!       /* For shared integer constants in singletons we can use the existing
!          tree integer constant merging code.  */
!       tree type = stream_read_tree (ib, data_in);
!       unsigned HOST_WIDE_INT low = streamer_read_uhwi (ib);
!       HOST_WIDE_INT high = streamer_read_hwi (ib);
!       result = build_int_cst_wide (type, low, high);
!       streamer_tree_cache_append (data_in->reader_cache, result, hash);
!     }
!   else if (tag == LTO_tree_scc)
!     {
!       unsigned len, entry_len;
! 
!       /* Input and skip the SCC.  */
!       lto_input_scc (ib, data_in, &len, &entry_len);
! 
!       /* Recurse.  */
!       return lto_input_tree (ib, data_in);
      }
    else
      {
        /* Otherwise, materialize a new node from IB.  */
!       result = lto_read_tree (ib, data_in, tag, hash);
      }
  
    return result;
  }
  
+ tree
+ lto_input_tree (struct lto_input_block *ib, struct data_in *data_in)
+ {
+   return lto_input_tree_1 (ib, data_in, streamer_read_record_start (ib), 0);
+ }
+ 
  
  /* Input toplevel asms.  */
  
*************** lto_data_in_create (struct lto_file_decl
*** 1220,1226 ****
    data_in->strings = strings;
    data_in->strings_len = len;
    data_in->globals_resolution = resolutions;
!   data_in->reader_cache = streamer_tree_cache_create ();
  
    return data_in;
  }
--- 1305,1311 ----
    data_in->strings = strings;
    data_in->strings_len = len;
    data_in->globals_resolution = resolutions;
!   data_in->reader_cache = streamer_tree_cache_create (false);
  
    return data_in;
  }
Index: trunk/gcc/lto-streamer-out.c
===================================================================
*** trunk.orig/gcc/lto-streamer-out.c	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/lto-streamer-out.c	2013-06-13 15:35:29.920956587 +0200
*************** create_output_block (enum lto_section_ty
*** 71,77 ****
    ob->decl_state = lto_get_out_decl_state ();
    ob->main_stream = XCNEW (struct lto_output_stream);
    ob->string_stream = XCNEW (struct lto_output_stream);
!   ob->writer_cache = streamer_tree_cache_create ();
  
    if (section_type == LTO_section_function_body)
      ob->cfg_stream = XCNEW (struct lto_output_stream);
--- 71,77 ----
    ob->decl_state = lto_get_out_decl_state ();
    ob->main_stream = XCNEW (struct lto_output_stream);
    ob->string_stream = XCNEW (struct lto_output_stream);
!   ob->writer_cache = streamer_tree_cache_create (!flag_wpa);
  
    if (section_type == LTO_section_function_body)
      ob->cfg_stream = XCNEW (struct lto_output_stream);
*************** lto_is_streamable (tree expr)
*** 293,319 ****
  }
  
  
  /* Write a physical representation of tree node EXPR to output block
     OB.  If REF_P is true, the leaves of EXPR are emitted as references
     via lto_output_tree_ref.  IX is the index into the streamer cache
     where EXPR is stored.  */
  
  static void
! lto_write_tree (struct output_block *ob, tree expr, bool ref_p)
  {
-   struct bitpack_d bp;
- 
-   if (!lto_is_streamable (expr))
-     internal_error ("tree code %qs is not supported in LTO streams",
- 	            tree_code_name[TREE_CODE (expr)]);
- 
-   /* Write the header, containing everything needed to materialize
-      EXPR on the reading side.  */
-   streamer_write_tree_header (ob, expr);
- 
    /* Pack all the non-pointer fields in EXPR into a bitpack and write
       the resulting bitpack.  */
!   bp = bitpack_create (ob->main_stream);
    streamer_pack_tree_bitfields (ob, &bp, expr);
    streamer_write_bitpack (&bp);
  
--- 293,339 ----
  }
  
  
+ /* For EXPR, look up and return what we want to stream to OB as DECL_INITIAL.  */
+ 
+ static tree
+ get_symbol_initial_value (struct output_block *ob, tree expr)
+ {
+   gcc_checking_assert (DECL_P (expr)
+ 		       && TREE_CODE (expr) != FUNCTION_DECL
+ 		       && TREE_CODE (expr) != TRANSLATION_UNIT_DECL);
+ 
+   /* Handle DECL_INITIAL for symbols.  */
+   tree initial = DECL_INITIAL (expr);
+   if (TREE_CODE (expr) == VAR_DECL
+       && (TREE_STATIC (expr) || DECL_EXTERNAL (expr))
+       && initial)
+     {
+       lto_symtab_encoder_t encoder;
+       struct varpool_node *vnode;
+ 
+       encoder = ob->decl_state->symtab_node_encoder;
+       vnode = varpool_get_node (expr);
+       if (!vnode
+ 	  || !lto_symtab_encoder_encode_initializer_p (encoder,
+ 						       vnode))
+ 	initial = error_mark_node;
+     }
+ 
+   return initial;
+ }
+ 
+ 
  /* Write a physical representation of tree node EXPR to output block
     OB.  If REF_P is true, the leaves of EXPR are emitted as references
     via lto_output_tree_ref.  IX is the index into the streamer cache
     where EXPR is stored.  */
  
  static void
! lto_write_tree_1 (struct output_block *ob, tree expr, bool ref_p)
  {
    /* Pack all the non-pointer fields in EXPR into a bitpack and write
       the resulting bitpack.  */
!   bitpack_d bp = bitpack_create (ob->main_stream);
    streamer_pack_tree_bitfields (ob, &bp, expr);
    streamer_write_bitpack (&bp);
  
*************** lto_write_tree (struct output_block *ob,
*** 345,366 ****
  
        stream_write_tree (ob, initial, ref_p);
      }
  
    /* Mark the end of EXPR.  */
    streamer_write_zero (ob);
  }
  
- 
  /* Emit the physical representation of tree node EXPR to output block
     OB.  If THIS_REF_P is true, the leaves of EXPR are emitted as references
     via lto_output_tree_ref.  REF_P is used for streaming siblings of EXPR.  */
  
! void
! lto_output_tree (struct output_block *ob, tree expr,
! 		 bool ref_p, bool this_ref_p)
  {
    unsigned ix;
-   bool existed_p;
  
    if (expr == NULL_TREE)
      {
--- 365,403 ----
  
        stream_write_tree (ob, initial, ref_p);
      }
+ }
+ 
+ /* Write a physical representation of tree node EXPR to output block
+    OB.  If REF_P is true, the leaves of EXPR are emitted as references
+    via lto_output_tree_ref.  The caller is responsible for recording
+    EXPR in the streamer cache.  */
+ 
+ static void
+ lto_write_tree (struct output_block *ob, tree expr, bool ref_p)
+ {
+   if (!lto_is_streamable (expr))
+     internal_error ("tree code %qs is not supported in LTO streams",
+ 		    tree_code_name[TREE_CODE (expr)]);
+ 
+   /* Write the header, containing everything needed to materialize
+      EXPR on the reading side.  */
+   streamer_write_tree_header (ob, expr);
+ 
+   lto_write_tree_1 (ob, expr, ref_p);
  
    /* Mark the end of EXPR.  */
    streamer_write_zero (ob);
  }
  
  /* Emit the physical representation of tree node EXPR to output block
     OB.  If THIS_REF_P is true, the leaves of EXPR are emitted as references
     via lto_output_tree_ref.  REF_P is used for streaming siblings of EXPR.  */
  
! static void
! lto_output_tree_1 (struct output_block *ob, tree expr, hashval_t hash,
! 		   bool ref_p, bool this_ref_p)
  {
    unsigned ix;
  
    if (expr == NULL_TREE)
      {
*************** lto_output_tree (struct output_block *ob
*** 374,391 ****
        return;
      }
  
!   /* Shared INTEGER_CST nodes are special because they need their original type
!      to be materialized by the reader (to implement TYPE_CACHED_VALUES).  */
!   if (TREE_CODE (expr) == INTEGER_CST
!       && !TREE_OVERFLOW (expr))
      {
        streamer_write_integer_cst (ob, expr, ref_p);
        return;
      }
  
!   existed_p = streamer_tree_cache_insert (ob->writer_cache, expr, &ix);
    if (existed_p)
      {
        /* If a node has already been streamed out, make sure that
  	 we don't write it more than once.  Otherwise, the reader
  	 will instantiate two different nodes for the same object.  */
--- 411,1328 ----
        return;
      }
  
!   gcc_checking_assert (!streamer_tree_cache_lookup (ob->writer_cache, expr, &ix));
! 
!   streamer_tree_cache_insert (ob->writer_cache, expr, hash, &ix);
!   if (streamer_handle_as_builtin_p (expr))
!     {
!       /* MD and NORMAL builtins do not need to be written out
! 	 completely as they are always instantiated by the
! 	 compiler on startup.  The only builtins that need to
! 	 be written out are BUILT_IN_FRONTEND.  For all other
! 	 builtins, we simply write the class and code.  */
!       streamer_write_builtin (ob, expr);
!     }
!   else if (TREE_CODE (expr) == INTEGER_CST
! 	   && !TREE_OVERFLOW (expr))
      {
+       /* Shared INTEGER_CST nodes are special because they need their
+ 	 original type to be materialized by the reader (to implement
+ 	 TYPE_CACHED_VALUES).  */
        streamer_write_integer_cst (ob, expr, ref_p);
+     }
+   else
+     {
+       /* This is the first time we see EXPR; write its fields
+ 	 to OB.  */
+       lto_write_tree (ob, expr, ref_p);
+     }
+ }
+ 
+ struct sccs
+ {
+   unsigned int dfsnum;
+   unsigned int low;
+ };
+ 
+ struct scc_entry
+ {
+   tree t;
+   hashval_t hash;
+ };
+ 
+ static unsigned int next_dfs_num;
+ static vec<scc_entry> sccstack;
+ static struct pointer_map_t *sccstate;
+ static struct obstack sccstate_obstack;
+ 
+ static void
+ DFS_write_tree (struct output_block *ob, sccs *from_state,
+ 		tree expr, bool ref_p, bool this_ref_p);
+ 
+ /* Handle the tree EXPR in the DFS walk with SCC state EXPR_STATE and
+    DFS recurse for all tree edges originating from it.  */
+ 
+ static void
+ DFS_write_tree_body (struct output_block *ob,
+ 		     tree expr, sccs *expr_state, bool ref_p)
+ {
+ #define DFS_follow_tree_edge(DEST) \
+   DFS_write_tree (ob, expr_state, DEST, ref_p, ref_p)
+ 
+   enum tree_code code;
+ 
+   code = TREE_CODE (expr);
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_TYPED))
+     {
+       if (TREE_CODE (expr) != IDENTIFIER_NODE)
+ 	DFS_follow_tree_edge (TREE_TYPE (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_VECTOR))
+     {
+       for (unsigned i = 0; i < VECTOR_CST_NELTS (expr); ++i)
+ 	DFS_follow_tree_edge (VECTOR_CST_ELT (expr, i));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_COMPLEX))
+     {
+       DFS_follow_tree_edge (TREE_REALPART (expr));
+       DFS_follow_tree_edge (TREE_IMAGPART (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_MINIMAL))
+     {
+       DFS_follow_tree_edge (DECL_NAME (expr));
+       DFS_follow_tree_edge (DECL_CONTEXT (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_COMMON))
+     {
+       DFS_follow_tree_edge (DECL_SIZE (expr));
+       DFS_follow_tree_edge (DECL_SIZE_UNIT (expr));
+ 
+       /* Note, DECL_INITIAL is not handled here.  Since DECL_INITIAL needs
+ 	 special handling in LTO, it must be handled by streamer hooks.  */
+ 
+       DFS_follow_tree_edge (DECL_ATTRIBUTES (expr));
+ 
+       /* Do not follow DECL_ABSTRACT_ORIGIN.  We cannot handle debug information
+ 	 for early inlining so drop it on the floor instead of ICEing in
+ 	 dwarf2out.c.  */
+ 
+       if ((TREE_CODE (expr) == VAR_DECL
+ 	   || TREE_CODE (expr) == PARM_DECL)
+ 	  && DECL_HAS_VALUE_EXPR_P (expr))
+ 	DFS_follow_tree_edge (DECL_VALUE_EXPR (expr));
+       if (TREE_CODE (expr) == VAR_DECL)
+ 	DFS_follow_tree_edge (DECL_DEBUG_EXPR (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_NON_COMMON))
+     {
+       if (TREE_CODE (expr) == FUNCTION_DECL)
+ 	{
+ 	  for (tree t = DECL_ARGUMENTS (expr); t; t = TREE_CHAIN (t))
+ 	    DFS_follow_tree_edge (t);
+ 	  DFS_follow_tree_edge (DECL_RESULT (expr));
+ 	}
+       else if (TREE_CODE (expr) == TYPE_DECL)
+ 	DFS_follow_tree_edge (DECL_ORIGINAL_TYPE (expr));
+       DFS_follow_tree_edge (DECL_VINDEX (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_WITH_VIS))
+     {
+       /* Make sure we don't inadvertently set the assembler name.  */
+       if (DECL_ASSEMBLER_NAME_SET_P (expr))
+ 	DFS_follow_tree_edge (DECL_ASSEMBLER_NAME (expr));
+       DFS_follow_tree_edge (DECL_SECTION_NAME (expr));
+       DFS_follow_tree_edge (DECL_COMDAT_GROUP (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_FIELD_DECL))
+     {
+       DFS_follow_tree_edge (DECL_FIELD_OFFSET (expr));
+       DFS_follow_tree_edge (DECL_BIT_FIELD_TYPE (expr));
+       DFS_follow_tree_edge (DECL_BIT_FIELD_REPRESENTATIVE (expr));
+       DFS_follow_tree_edge (DECL_FIELD_BIT_OFFSET (expr));
+       DFS_follow_tree_edge (DECL_FCONTEXT (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_FUNCTION_DECL))
+     {
+       DFS_follow_tree_edge (DECL_FUNCTION_PERSONALITY (expr));
+       DFS_follow_tree_edge (DECL_FUNCTION_SPECIFIC_TARGET (expr));
+       DFS_follow_tree_edge (DECL_FUNCTION_SPECIFIC_OPTIMIZATION (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_TYPE_COMMON))
+     {
+       DFS_follow_tree_edge (TYPE_SIZE (expr));
+       DFS_follow_tree_edge (TYPE_SIZE_UNIT (expr));
+       DFS_follow_tree_edge (TYPE_ATTRIBUTES (expr));
+       DFS_follow_tree_edge (TYPE_NAME (expr));
+       /* Do not follow TYPE_POINTER_TO or TYPE_REFERENCE_TO.  They will be
+ 	 reconstructed during fixup.  */
+       /* Do not follow TYPE_NEXT_VARIANT, we reconstruct the variant lists
+ 	 during fixup.  */
+       DFS_follow_tree_edge (TYPE_MAIN_VARIANT (expr));
+       DFS_follow_tree_edge (TYPE_CONTEXT (expr));
+       /* TYPE_CANONICAL is re-computed during type merging, so no need
+ 	 to follow it here.  */
+       DFS_follow_tree_edge (TYPE_STUB_DECL (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_TYPE_NON_COMMON))
+     {
+       if (TREE_CODE (expr) == ENUMERAL_TYPE)
+ 	DFS_follow_tree_edge (TYPE_VALUES (expr));
+       else if (TREE_CODE (expr) == ARRAY_TYPE)
+ 	DFS_follow_tree_edge (TYPE_DOMAIN (expr));
+       else if (RECORD_OR_UNION_TYPE_P (expr))
+ 	for (tree t = TYPE_FIELDS (expr); t; t = TREE_CHAIN (t))
+ 	  DFS_follow_tree_edge (t);
+       else if (TREE_CODE (expr) == FUNCTION_TYPE
+ 	       || TREE_CODE (expr) == METHOD_TYPE)
+ 	DFS_follow_tree_edge (TYPE_ARG_TYPES (expr));
+ 
+       if (!POINTER_TYPE_P (expr))
+ 	DFS_follow_tree_edge (TYPE_MINVAL (expr));
+       DFS_follow_tree_edge (TYPE_MAXVAL (expr));
+       if (RECORD_OR_UNION_TYPE_P (expr))
+ 	DFS_follow_tree_edge (TYPE_BINFO (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_LIST))
+     {
+       DFS_follow_tree_edge (TREE_PURPOSE (expr));
+       DFS_follow_tree_edge (TREE_VALUE (expr));
+       DFS_follow_tree_edge (TREE_CHAIN (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_VEC))
+     {
+       for (int i = 0; i < TREE_VEC_LENGTH (expr); i++)
+ 	DFS_follow_tree_edge (TREE_VEC_ELT (expr, i));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_EXP))
+     {
+       for (int i = 0; i < TREE_OPERAND_LENGTH (expr); i++)
+ 	DFS_follow_tree_edge (TREE_OPERAND (expr, i));
+       DFS_follow_tree_edge (TREE_BLOCK (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_BLOCK))
+     {
+       for (tree t = BLOCK_VARS (expr); t; t = TREE_CHAIN (t))
+ 	/* ???  FIXME.  See also streamer_write_chain.  */
+ 	if (!(VAR_OR_FUNCTION_DECL_P (t)
+ 	      && DECL_EXTERNAL (t)))
+ 	  DFS_follow_tree_edge (t);
+ 
+       DFS_follow_tree_edge (BLOCK_SUPERCONTEXT (expr));
+ 
+       /* Follow BLOCK_ABSTRACT_ORIGIN for the limited cases we can
+ 	 handle - those that represent inlined function scopes.
+ 	 For the rest, drop them on the floor instead of ICEing
+ 	 in dwarf2out.c.  */
+       if (inlined_function_outer_scope_p (expr))
+ 	{
+ 	  tree ultimate_origin = block_ultimate_origin (expr);
+ 	  DFS_follow_tree_edge (ultimate_origin);
+ 	}
+       /* Do not follow BLOCK_NONLOCALIZED_VARS.  We cannot handle debug
+ 	 information for early inlined BLOCKs so drop it on the floor instead
+ 	 of ICEing in dwarf2out.c.  */
+ 
+       /* BLOCK_FRAGMENT_ORIGIN and BLOCK_FRAGMENT_CHAIN is not live at LTO
+ 	 streaming time.  */
+ 
+       /* Do not output BLOCK_SUBBLOCKS.  Instead on streaming-in this
+ 	 list is re-constructed from BLOCK_SUPERCONTEXT.  */
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_BINFO))
+     {
+       unsigned i;
+       tree t;
+ 
+       /* Note that the number of BINFO slots has already been emitted in
+ 	 EXPR's header (see streamer_write_tree_header) because this length
+ 	 is needed to build the empty BINFO node on the reader side.  */
+       FOR_EACH_VEC_ELT (*BINFO_BASE_BINFOS (expr), i, t)
+ 	DFS_follow_tree_edge (t);
+       DFS_follow_tree_edge (BINFO_OFFSET (expr));
+       DFS_follow_tree_edge (BINFO_VTABLE (expr));
+       DFS_follow_tree_edge (BINFO_VPTR_FIELD (expr));
+ 
+       /* The number of BINFO_BASE_ACCESSES has already been emitted in
+ 	 EXPR's bitfield section.  */
+       FOR_EACH_VEC_SAFE_ELT (BINFO_BASE_ACCESSES (expr), i, t)
+ 	DFS_follow_tree_edge (t);
+ 
+       DFS_follow_tree_edge (BINFO_INHERITANCE_CHAIN (expr));
+       DFS_follow_tree_edge (BINFO_SUBVTT_INDEX (expr));
+       DFS_follow_tree_edge (BINFO_VPTR_INDEX (expr));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_CONSTRUCTOR))
+     {
+       unsigned i;
+       tree index, value;
+ 
+       FOR_EACH_CONSTRUCTOR_ELT (CONSTRUCTOR_ELTS (expr), i, index, value)
+ 	{
+ 	  DFS_follow_tree_edge (index);
+ 	  DFS_follow_tree_edge (value);
+ 	}
+     }
+ 
+ #undef DFS_follow_tree_edge
+ }
+ 
+ /* Return a hash value for the tree T.  */
+ 
+ static hashval_t
+ hash_tree (struct streamer_tree_cache_d *cache, tree t)
+ {
+ #define visit(SIBLING) \
+   do { \
+     unsigned ix; \
+     if (SIBLING && streamer_tree_cache_lookup (cache, SIBLING, &ix)) \
+       v = iterative_hash_hashval_t (streamer_tree_cache_get_hash (cache, ix), v); \
+   } while (0)
+ 
+   /* Hash TS_BASE.  */
+   enum tree_code code = TREE_CODE (t);
+   hashval_t v = iterative_hash_host_wide_int (code, 0);
+   if (!TYPE_P (t))
+     {
+       v = iterative_hash_host_wide_int (TREE_SIDE_EFFECTS (t)
+ 					| (TREE_CONSTANT (t) << 1)
+ 					| (TREE_READONLY (t) << 2)
+ 					| (TREE_PUBLIC (t) << 3), v);
+     }
+   v = iterative_hash_host_wide_int (TREE_ADDRESSABLE (t)
+ 				    | (TREE_THIS_VOLATILE (t) << 1), v);
+   if (DECL_P (t))
+     v = iterative_hash_host_wide_int (DECL_UNSIGNED (t), v);
+   else if (TYPE_P (t))
+     v = iterative_hash_host_wide_int (TYPE_UNSIGNED (t), v);
+   if (TYPE_P (t))
+     v = iterative_hash_host_wide_int (TYPE_ARTIFICIAL (t), v);
+   else
+     v = iterative_hash_host_wide_int (TREE_NO_WARNING (t), v);
+   v = iterative_hash_host_wide_int (TREE_NOTHROW (t)
+ 				    | (TREE_STATIC (t) << 1)
+ 				    | (TREE_PRIVATE (t) << 2)
+ 				    | (TREE_PROTECTED (t) << 3)
+ 				    | (TREE_DEPRECATED (t) << 4), v);
+   if (TYPE_P (t))
+     v = iterative_hash_host_wide_int (TYPE_SATURATING (t)
+ 				      | (TYPE_ADDR_SPACE (t) << 1), v);
+   else if (code == SSA_NAME)
+     v = iterative_hash_host_wide_int (SSA_NAME_IS_DEFAULT_DEF (t), v);
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_INT_CST))
+     {
+       v = iterative_hash_host_wide_int (TREE_INT_CST_LOW (t), v);
+       v = iterative_hash_host_wide_int (TREE_INT_CST_HIGH (t), v);
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_REAL_CST))
+     {
+       REAL_VALUE_TYPE r = TREE_REAL_CST (t);
+       v = iterative_hash_host_wide_int (r.cl, v);
+       v = iterative_hash_host_wide_int (r.decimal
+ 					| (r.sign << 1)
+ 					| (r.signalling << 2)
+ 					| (r.canonical << 3), v);
+       v = iterative_hash_host_wide_int (r.uexp, v);
+       for (unsigned i = 0; i < SIGSZ; ++i)
+ 	v = iterative_hash_host_wide_int (r.sig[i], v);
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_FIXED_CST))
+     {
+       FIXED_VALUE_TYPE f = TREE_FIXED_CST (t);
+       v = iterative_hash_host_wide_int (f.mode, v);
+       v = iterative_hash_host_wide_int (f.data.low, v);
+       v = iterative_hash_host_wide_int (f.data.high, v);
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_COMMON))
+     {
+       v = iterative_hash_host_wide_int (DECL_MODE (t), v);
+       v = iterative_hash_host_wide_int (DECL_NONLOCAL (t)
+ 					| (DECL_VIRTUAL_P (t) << 1)
+ 					| (DECL_IGNORED_P (t) << 2)
+ 					| (DECL_ABSTRACT (t) << 3)
+ 					| (DECL_ARTIFICIAL (t) << 4)
+ 					| (DECL_USER_ALIGN (t) << 5)
+ 					| (DECL_PRESERVE_P (t) << 6)
+ 					| (DECL_EXTERNAL (t) << 7)
+ 					| (DECL_GIMPLE_REG_P (t) << 8), v);
+       v = iterative_hash_host_wide_int (DECL_ALIGN (t), v);
+       if (code == LABEL_DECL)
+ 	{
+ 	  v = iterative_hash_host_wide_int (DECL_ERROR_ISSUED (t), v);
+ 	  v = iterative_hash_host_wide_int (EH_LANDING_PAD_NR (t), v);
+ 	  v = iterative_hash_host_wide_int (LABEL_DECL_UID (t), v);
+ 	}
+       else if (code == FIELD_DECL)
+ 	{
+ 	  v = iterative_hash_host_wide_int (DECL_PACKED (t)
+ 					    | (DECL_NONADDRESSABLE_P (t) << 1),
+ 					    v);
+ 	  v = iterative_hash_host_wide_int (DECL_OFFSET_ALIGN (t), v);
+ 	}
+       else if (code == VAR_DECL)
+ 	{
+ 	  v = iterative_hash_host_wide_int (DECL_HAS_DEBUG_EXPR_P (t)
+ 					    | (DECL_NONLOCAL_FRAME (t) << 1),
+ 					    v);
+ 	}
+       if (code == RESULT_DECL
+ 	  || code == PARM_DECL
+ 	  || code == VAR_DECL)
+ 	{
+ 	  v = iterative_hash_host_wide_int (DECL_BY_REFERENCE (t), v);
+ 	  if (code == VAR_DECL
+ 	      || code == PARM_DECL)
+ 	    v = iterative_hash_host_wide_int (DECL_HAS_VALUE_EXPR_P (t), v);
+ 	}
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_WRTL))
+     v = iterative_hash_host_wide_int (DECL_REGISTER (t), v);
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_WITH_VIS))
+     {
+       v = iterative_hash_host_wide_int (DECL_DEFER_OUTPUT (t)
+ 					| (DECL_COMMON (t) << 1)
+ 					| (DECL_DLLIMPORT_P (t) << 2)
+ 					| (DECL_WEAK (t) << 3)
+ 					| (DECL_SEEN_IN_BIND_EXPR_P (t) << 4)
+ 					| (DECL_COMDAT (t) << 5)
+ 					| (DECL_VISIBILITY_SPECIFIED (t) << 6),
+ 					v);
+       v = iterative_hash_host_wide_int (DECL_VISIBILITY (t), v);
+       if (code == VAR_DECL)
+ 	{
+ 	  v = iterative_hash_host_wide_int (DECL_HARD_REGISTER (t)
+ 					    | (DECL_IN_TEXT_SECTION (t) << 1)
+ 					    | (DECL_IN_CONSTANT_POOL (t) << 2),
+ 					    v);
+ 	  v = iterative_hash_host_wide_int (DECL_TLS_MODEL (t), v);
+ 	}
+       if (VAR_OR_FUNCTION_DECL_P (t))
+ 	v = iterative_hash_host_wide_int (DECL_INIT_PRIORITY (t), v);
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_FUNCTION_DECL))
+     {
+       v = iterative_hash_host_wide_int (DECL_BUILT_IN_CLASS (t), v);
+       v = iterative_hash_host_wide_int (DECL_STATIC_CONSTRUCTOR (t)
+ 					| (DECL_STATIC_DESTRUCTOR (t) << 1)
+ 					| (DECL_UNINLINABLE (t) << 2)
+ 					| (DECL_POSSIBLY_INLINED (t) << 3)
+ 					| (DECL_IS_NOVOPS (t) << 4)
+ 					| (DECL_IS_RETURNS_TWICE (t) << 5)
+ 					| (DECL_IS_MALLOC (t) << 6)
+ 					| (DECL_IS_OPERATOR_NEW (t) << 7)
+ 					| (DECL_DECLARED_INLINE_P (t) << 8)
+ 					| (DECL_STATIC_CHAIN (t) << 9)
+ 					| (DECL_NO_INLINE_WARNING_P (t) << 10)
+ 					| (DECL_NO_INSTRUMENT_FUNCTION_ENTRY_EXIT (t) << 11)
+ 					| (DECL_NO_LIMIT_STACK (t) << 12)
+ 					| (DECL_DISREGARD_INLINE_LIMITS (t) << 13)
+ 					| (DECL_PURE_P (t) << 14)
+ 					| (DECL_LOOPING_CONST_OR_PURE_P (t) << 15), v);
+       if (DECL_BUILT_IN_CLASS (t) != NOT_BUILT_IN)
+ 	v = iterative_hash_host_wide_int (DECL_FUNCTION_CODE (t), v);
+       if (DECL_STATIC_DESTRUCTOR (t))
+ 	v = iterative_hash_host_wide_int (DECL_FINI_PRIORITY (t), v);
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_TYPE_COMMON))
+     {
+       v = iterative_hash_host_wide_int (TYPE_MODE (t), v);
+       v = iterative_hash_host_wide_int (TYPE_STRING_FLAG (t)
+ 					| (TYPE_NO_FORCE_BLK (t) << 1)
+ 					| (TYPE_NEEDS_CONSTRUCTING (t) << 2)
+ 					| (TYPE_PACKED (t) << 3)
+ 					| (TYPE_RESTRICT (t) << 4)
+ 					| (TYPE_CONTAINS_PLACEHOLDER_INTERNAL (t) << 5)
+ 					| (TYPE_USER_ALIGN (t) << 6)
+ 					| (TYPE_READONLY (t) << 7), v);
+       if (RECORD_OR_UNION_TYPE_P (t))
+ 	v = iterative_hash_host_wide_int (TYPE_TRANSPARENT_AGGR (t), v);
+       else if (code == ARRAY_TYPE)
+ 	v = iterative_hash_host_wide_int (TYPE_NONALIASED_COMPONENT (t), v);
+       v = iterative_hash_host_wide_int (TYPE_PRECISION (t), v);
+       v = iterative_hash_host_wide_int (TYPE_ALIGN (t), v);
+       v = iterative_hash_host_wide_int (TYPE_ALIAS_SET (t) == 0 ? 0 : -1, v);
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_TRANSLATION_UNIT_DECL))
+     v = iterative_hash (TRANSLATION_UNIT_LANGUAGE (t),
+ 			strlen (TRANSLATION_UNIT_LANGUAGE (t)), v);
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_TARGET_OPTION))
+     v = iterative_hash (t, sizeof (struct cl_target_option), v);
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_OPTIMIZATION))
+     v = iterative_hash (t, sizeof (struct cl_optimization), v);
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_IDENTIFIER))
+     v = iterative_hash_host_wide_int (IDENTIFIER_HASH_VALUE (t), v);
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_STRING))
+     v = iterative_hash (TREE_STRING_POINTER (t), TREE_STRING_LENGTH (t), v);
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_TYPED))
+     {
+       if (POINTER_TYPE_P (t))
+ 	{
+ 	  /* For pointers factor in the pointed-to type recursively as
+ 	     we cannot recurse through only pointers.
+ 	     ???  We can generalize this by keeping track of the
+ 	     in-SCC edges for each tree (or arbitrarily the first
+ 	     such edge) and hashing that in during a second stage
+ 	     (instead of the quadratic mixing of the SCC we do now).  */
+ 	  hashval_t x;
+ 	  unsigned ix;
+ 	  if (streamer_tree_cache_lookup (cache, TREE_TYPE (t), &ix))
+ 	    x = streamer_tree_cache_get_hash (cache, ix);
+ 	  else
+ 	    x = hash_tree (cache, TREE_TYPE (t));
+ 	  v = iterative_hash_hashval_t (x, v);
+ 	}
+       else if (code != IDENTIFIER_NODE)
+ 	visit (TREE_TYPE (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_VECTOR))
+     for (unsigned i = 0; i < VECTOR_CST_NELTS (t); ++i)
+       visit (VECTOR_CST_ELT (t, i));
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_COMPLEX))
+     {
+       visit (TREE_REALPART (t));
+       visit (TREE_IMAGPART (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_MINIMAL))
+     {
+       visit (DECL_NAME (t));
+       if (DECL_FILE_SCOPE_P (t))
+ 	;
+       else
+ 	visit (DECL_CONTEXT (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_COMMON))
+     {
+       visit (DECL_SIZE (t));
+       visit (DECL_SIZE_UNIT (t));
+       visit (DECL_ATTRIBUTES (t));
+       if ((code == VAR_DECL
+ 	   || code == PARM_DECL)
+ 	  && DECL_HAS_VALUE_EXPR_P (t))
+ 	visit (DECL_VALUE_EXPR (t));
+       if (code == VAR_DECL
+ 	  && DECL_HAS_DEBUG_EXPR_P (t))
+ 	visit (DECL_DEBUG_EXPR (t));
+       /* ???  Hash DECL_INITIAL as streamed.  Needs the output-block to
+          be able to call get_symbol_initial_value.  */
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_NON_COMMON))
+     {
+       if (code == FUNCTION_DECL)
+ 	{
+ 	  for (tree a = DECL_ARGUMENTS (t); a; a = DECL_CHAIN (a))
+ 	    visit (a);
+ 	  visit (DECL_RESULT (t));
+ 	}
+       else if (code == TYPE_DECL)
+ 	visit (DECL_ORIGINAL_TYPE (t));
+       visit (DECL_VINDEX (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_DECL_WITH_VIS))
+     {
+       if (DECL_ASSEMBLER_NAME_SET_P (t))
+ 	visit (DECL_ASSEMBLER_NAME (t));
+       visit (DECL_SECTION_NAME (t));
+       visit (DECL_COMDAT_GROUP (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_FIELD_DECL))
+     {
+       visit (DECL_FIELD_OFFSET (t));
+       visit (DECL_BIT_FIELD_TYPE (t));
+       visit (DECL_BIT_FIELD_REPRESENTATIVE (t));
+       visit (DECL_FIELD_BIT_OFFSET (t));
+       visit (DECL_FCONTEXT (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_FUNCTION_DECL))
+     {
+       visit (DECL_FUNCTION_PERSONALITY (t));
+       visit (DECL_FUNCTION_SPECIFIC_TARGET (t));
+       visit (DECL_FUNCTION_SPECIFIC_OPTIMIZATION (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_TYPE_COMMON))
+     {
+       visit (TYPE_SIZE (t));
+       visit (TYPE_SIZE_UNIT (t));
+       visit (TYPE_ATTRIBUTES (t));
+       visit (TYPE_NAME (t));
+       visit (TYPE_MAIN_VARIANT (t));
+       if (TYPE_FILE_SCOPE_P (t))
+ 	;
+       else
+ 	visit (TYPE_CONTEXT (t));
+       visit (TYPE_STUB_DECL (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_TYPE_NON_COMMON))
+     {
+       if (code == ENUMERAL_TYPE)
+ 	visit (TYPE_VALUES (t));
+       else if (code == ARRAY_TYPE)
+ 	visit (TYPE_DOMAIN (t));
+       else if (RECORD_OR_UNION_TYPE_P (t))
+ 	for (tree f = TYPE_FIELDS (t); f; f = TREE_CHAIN (f))
+ 	  visit (f);
+       else if (code == FUNCTION_TYPE
+ 	       || code == METHOD_TYPE)
+ 	visit (TYPE_ARG_TYPES (t));
+       if (!POINTER_TYPE_P (t))
+ 	visit (TYPE_MINVAL (t));
+       visit (TYPE_MAXVAL (t));
+       if (RECORD_OR_UNION_TYPE_P (t))
+ 	visit (TYPE_BINFO (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_LIST))
+     {
+       visit (TREE_PURPOSE (t));
+       visit (TREE_VALUE (t));
+       visit (TREE_CHAIN (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_VEC))
+     for (int i = 0; i < TREE_VEC_LENGTH (t); ++i)
+       visit (TREE_VEC_ELT (t, i));
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_EXP))
+     {
+       v = iterative_hash_host_wide_int (TREE_OPERAND_LENGTH (t), v);
+       for (int i = 0; i < TREE_OPERAND_LENGTH (t); ++i)
+ 	visit (TREE_OPERAND (t, i));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_BINFO))
+     {
+       unsigned i;
+       tree b;
+       FOR_EACH_VEC_ELT (*BINFO_BASE_BINFOS (t), i, b)
+ 	visit (b);
+       visit (BINFO_OFFSET (t));
+       visit (BINFO_VTABLE (t));
+       visit (BINFO_VPTR_FIELD (t));
+       FOR_EACH_VEC_SAFE_ELT (BINFO_BASE_ACCESSES (t), i, b)
+ 	visit (b);
+       visit (BINFO_INHERITANCE_CHAIN (t));
+       visit (BINFO_SUBVTT_INDEX (t));
+       visit (BINFO_VPTR_INDEX (t));
+     }
+ 
+   if (CODE_CONTAINS_STRUCT (code, TS_CONSTRUCTOR))
+     {
+       unsigned i;
+       tree index, value;
+       v = iterative_hash_host_wide_int (CONSTRUCTOR_NELTS (t), v);
+       FOR_EACH_CONSTRUCTOR_ELT (CONSTRUCTOR_ELTS (t), i, index, value)
+ 	{
+ 	  visit (index);
+ 	  visit (value);
+ 	}
+     }
+ 
+   return v;
+ 
+ #undef visit
+ }
+ 
+ /* Compare two SCC entries by their hash value for qsorting them.  */
+ 
+ static int
+ scc_entry_compare (const void *p1_, const void *p2_)
+ {
+   const scc_entry *p1 = (const scc_entry *) p1_;
+   const scc_entry *p2 = (const scc_entry *) p2_;
+   if (p1->hash < p2->hash)
+     return -1;
+   else if (p1->hash > p2->hash)
+     return 1;
+   return 0;
+ }
+ 
+ /* Return a hash value for the SCC on the SCC stack from FIRST with
+    size SIZE.  */
+ 
+ static hashval_t
+ hash_scc (struct streamer_tree_cache_d *cache, unsigned first, unsigned size)
+ {
+   /* Compute hash values for the SCC members.  */
+   for (unsigned i = 0; i < size; ++i)
+     sccstack[first+i].hash = hash_tree (cache, sccstack[first+i].t);
+ 
+   if (size == 1)
+     return sccstack[first].hash;
+ 
+   /* Sort the SCC's (tree, hash) pairs so that when we mix in
+      all members of the SCC the hash value becomes independent of
+      the order in which we visited the SCC.  Disregard hashes equal
+      to the hash of the tree we mix into because we cannot guarantee
+      a stable sort for those across different TUs.  */
+   qsort (&sccstack[first], size, sizeof (scc_entry), scc_entry_compare);
+   hashval_t *tem = XALLOCAVEC (hashval_t, size);
+   for (unsigned i = 0; i < size; ++i)
+     {
+       hashval_t hash = sccstack[first+i].hash;
+       hashval_t orig_hash = hash;
+       unsigned j;
+       /* Skip same hashes.  */
+       for (j = i + 1;
+ 	   j < size && sccstack[first+j].hash == orig_hash; ++j)
+ 	;
+       for (; j < size; ++j)
+ 	hash = iterative_hash_hashval_t (sccstack[first+j].hash, hash);
+       for (j = 0; sccstack[first+j].hash != orig_hash; ++j)
+ 	hash = iterative_hash_hashval_t (sccstack[first+j].hash, hash);
+       tem[i] = hash;
+     }
+   hashval_t scc_hash = 0;
+   for (unsigned i = 0; i < size; ++i)
+     {
+       sccstack[first+i].hash = tem[i];
+       scc_hash = iterative_hash_hashval_t (tem[i], scc_hash);
+     }
+   return scc_hash;
+ }
+ 
+ /* DFS walk EXPR and stream SCCs of tree bodies if they are not
+    already in the streamer cache.  Main routine called for
+    each visit of EXPR.  */
+ 
+ static void
+ DFS_write_tree (struct output_block *ob, sccs *from_state,
+ 		tree expr, bool ref_p, bool this_ref_p)
+ {
+   unsigned ix;
+   sccs **slot;
+ 
+   /* Handle special cases.  */
+   if (expr == NULL_TREE)
+     return;
+ 
+   /* Do not DFS walk into indexable trees.  */
+   if (this_ref_p && tree_is_indexable (expr))
+     return;
+ 
+   /* Check if we already streamed EXPR.  */
+   if (streamer_tree_cache_lookup (ob->writer_cache, expr, &ix))
+     return;
+ 
+   slot = (sccs **)pointer_map_insert (sccstate, expr);
+   sccs *cstate = *slot;
+   if (!cstate)
+     {
+       scc_entry e = { expr, 0 };
+       /* Not yet visited.  DFS recurse and push it onto the stack.  */
+       *slot = cstate = XOBNEW (&sccstate_obstack, struct sccs);
+       sccstack.safe_push (e);
+       cstate->dfsnum = next_dfs_num++;
+       cstate->low = cstate->dfsnum;
+ 
+       if (streamer_handle_as_builtin_p (expr))
+ 	;
+       else if (TREE_CODE (expr) == INTEGER_CST
+ 	       && !TREE_OVERFLOW (expr))
+ 	DFS_write_tree (ob, cstate, TREE_TYPE (expr), ref_p, ref_p);
+       else
+ 	{
+ 	  DFS_write_tree_body (ob, expr, cstate, ref_p);
+ 
+ 	  /* Walk any LTO-specific edges.  */
+ 	  if (DECL_P (expr)
+ 	      && TREE_CODE (expr) != FUNCTION_DECL
+ 	      && TREE_CODE (expr) != TRANSLATION_UNIT_DECL)
+ 	    {
+ 	      /* Handle DECL_INITIAL for symbols.  */
+ 	      tree initial = get_symbol_initial_value (ob, expr);
+ 	      DFS_write_tree (ob, cstate, initial, ref_p, ref_p);
+ 	    }
+ 	}
+ 
+       /* See if we found an SCC.  */
+       if (cstate->low == cstate->dfsnum)
+ 	{
+ 	  unsigned first, size;
+ 	  tree x;
+ 
+ 	  /* Pop the SCC and compute its size.  */
+ 	  first = sccstack.length ();
+ 	  do
+ 	    {
+ 	      x = sccstack[--first].t;
+ 	    }
+ 	  while (x != expr);
+ 	  size = sccstack.length () - first;
+ 
+ 	  /* No need to compute hashes for LTRANS units; we don't perform
+ 	     any merging there.  */
+ 	  hashval_t scc_hash = 0;
+ 	  unsigned scc_entry_len = 0;
+ 	  if (!flag_wpa)
+ 	    {
+ 	      scc_hash = hash_scc (ob->writer_cache, first, size);
+ 
+ 	      /* Put the entries with the fewest hash collisions first.  */
+ 	      unsigned entry_start = 0;
+ 	      scc_entry_len = size + 1;
+ 	      for (unsigned i = 0; i < size;)
+ 		{
+ 		  unsigned from = i;
+ 		  for (i = i + 1; i < size
+ 		       && (sccstack[first + i].hash
+ 			   == sccstack[first + from].hash); ++i)
+ 		    ;
+ 		  if (i - from < scc_entry_len)
+ 		    {
+ 		      scc_entry_len = i - from;
+ 		      entry_start = from;
+ 		    }
+ 		}
+ 	      for (unsigned i = 0; i < scc_entry_len; ++i)
+ 		{
+ 		  scc_entry tem = sccstack[first + i];
+ 		  sccstack[first + i] = sccstack[first + entry_start + i];
+ 		  sccstack[first + entry_start + i] = tem;
+ 		}
+ 	    }
+ 
+ 	  /* Write LTO_tree_scc.  */
+ 	  streamer_write_record_start (ob, LTO_tree_scc);
+ 	  streamer_write_uhwi (ob, size);
+ 	  streamer_write_uhwi (ob, scc_hash);
+ 
+ 	  /* Write size-1 SCCs without wrapping them inside SCC bundles.
+ 	     All INTEGER_CSTs need to be handled this way as we need
+ 	     their type to materialize them.  Also builtins are handled
+ 	     this way.
+ 	     ???  We still wrap these in LTO_tree_scc so that on the
+ 	     input side we can properly identify the tree we want
+ 	     to ultimately return.  */
+ 	  size_t old_len = ob->writer_cache->nodes.length ();
+ 	  if (size == 1)
+ 	    lto_output_tree_1 (ob, expr, scc_hash, ref_p, this_ref_p);
+ 	  else
+ 	    {
+ 	      /* Write the size of the SCC entry candidates.  */
+ 	      streamer_write_uhwi (ob, scc_entry_len);
+ 
+ 	      /* Write all headers and populate the streamer cache.  */
+ 	      for (unsigned i = 0; i < size; ++i)
+ 		{
+ 		  hashval_t hash = sccstack[first+i].hash;
+ 		  tree t = sccstack[first+i].t;
+ 		  bool exists_p = streamer_tree_cache_insert (ob->writer_cache,
+ 							      t, hash, &ix);
+ 		  gcc_assert (!exists_p);
+ 
+ 		  if (!lto_is_streamable (t))
+ 		    internal_error ("tree code %qs is not supported "
+ 				    "in LTO streams",
+ 				    tree_code_name[TREE_CODE (t)]);
+ 
+ 		  gcc_checking_assert (!streamer_handle_as_builtin_p (t));
+ 
+ 		  /* Write the header, containing everything needed to
+ 		     materialize EXPR on the reading side.  */
+ 		  streamer_write_tree_header (ob, t);
+ 		}
+ 
+ 	      /* Write the bitpacks and tree references.  */
+ 	      for (unsigned i = 0; i < size; ++i)
+ 		{
+ 		  lto_write_tree_1 (ob, sccstack[first+i].t, ref_p);
+ 
+ 		  /* Mark the end of the tree.  */
+ 		  streamer_write_zero (ob);
+ 		}
+ 	    }
+ 	  gcc_assert (old_len + size == ob->writer_cache->nodes.length ());
+ 
+ 	  /* Finally truncate the vector.  */
+ 	  sccstack.truncate (first);
+ 
+ 	  if (from_state)
+ 	    from_state->low = MIN (from_state->low, cstate->low);
+ 	  return;
+ 	}
+ 
+       if (from_state)
+ 	from_state->low = MIN (from_state->low, cstate->low);
+     }
+   gcc_checking_assert (from_state);
+   if (cstate->dfsnum < from_state->dfsnum)
+     from_state->low = MIN (cstate->dfsnum, from_state->low);
+ }
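For readers unfamiliar with the dfsnum/low bookkeeping that DFS_write_tree interleaves with streaming, here is a self-contained Tarjan-style sketch on a made-up five-node graph.  The graph, array sizes, and names are illustrative only; the patch's version additionally streams each SCC as it is popped.

```c
#include <assert.h>

#define N 5
static int adj[N][N];          /* adjacency matrix of a toy graph */
static int dfsnum[N], low[N];  /* per-node DFS state, 0 = unvisited */
static int stack[N], sp;       /* SCC candidate stack */
static int on_stack[N];
static int comp[N];            /* SCC id assigned to each node */
static int next_dfs = 1, next_comp;

static void
dfs (int v)
{
  dfsnum[v] = low[v] = next_dfs++;
  stack[sp++] = v;
  on_stack[v] = 1;
  for (int w = 0; w < N; ++w)
    if (adj[v][w])
      {
	if (!dfsnum[w])			/* Tree edge: recurse.  */
	  {
	    dfs (w);
	    if (low[w] < low[v])
	      low[v] = low[w];
	  }
	else if (on_stack[w] && dfsnum[w] < low[v])
	  low[v] = dfsnum[w];		/* Edge back into the current SCC.  */
      }
  if (low[v] == dfsnum[v])		/* V is the SCC root: pop it.  */
    {
      int w;
      do
	{
	  w = stack[--sp];
	  on_stack[w] = 0;
	  comp[w] = next_comp;
	}
      while (w != v);
      next_comp++;
    }
}
```

Running dfs over the cycle 0 -> 1 -> 2 -> 0 plus the edge 3 -> 4 yields one three-node SCC and two singletons, which is exactly the shape of data the streamer then wraps into LTO_tree_scc records.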
+ 
+ extern unsigned long num_trees_output;
+ extern unsigned long num_grefs_output;
+ extern unsigned long num_lrefs_output;
+ 
+ /* Emit the physical representation of tree node EXPR to output block
+    OB.  If THIS_REF_P is true, the leaves of EXPR are emitted as references
+    via lto_output_tree_ref.  REF_P is used for streaming siblings of EXPR.  */
+ 
+ void
+ lto_output_tree (struct output_block *ob, tree expr,
+ 		 bool ref_p, bool this_ref_p)
+ {
+   unsigned ix;
+   bool existed_p;
+ 
+   num_trees_output++;
+ 
+   if (expr == NULL_TREE)
+     {
+       streamer_write_record_start (ob, LTO_null);
+       return;
+     }
+ 
+   if (this_ref_p && tree_is_indexable (expr))
+     {
+       num_grefs_output++;
+       lto_output_tree_ref (ob, expr);
        return;
      }
  
!   existed_p = streamer_tree_cache_lookup (ob->writer_cache, expr, &ix);
    if (existed_p)
      {
+       num_lrefs_output++;
        /* If a node has already been streamed out, make sure that
  	 we don't write it more than once.  Otherwise, the reader
  	 will instantiate two different nodes for the same object.  */
*************** lto_output_tree (struct output_block *ob
*** 394,413 ****
        streamer_write_enum (ob->main_stream, LTO_tags, LTO_NUM_TAGS,
  			   lto_tree_code_to_tag (TREE_CODE (expr)));
      }
-   else if (streamer_handle_as_builtin_p (expr))
-     {
-       /* MD and NORMAL builtins do not need to be written out
- 	 completely as they are always instantiated by the
- 	 compiler on startup.  The only builtins that need to
- 	 be written out are BUILT_IN_FRONTEND.  For all other
- 	 builtins, we simply write the class and code.  */
-       streamer_write_builtin (ob, expr);
-     }
    else
      {
!       /* This is the first time we see EXPR, write its fields
! 	 to OB.  */
!       lto_write_tree (ob, expr, ref_p);
      }
  }
  
--- 1331,1361 ----
        streamer_write_enum (ob->main_stream, LTO_tags, LTO_NUM_TAGS,
  			   lto_tree_code_to_tag (TREE_CODE (expr)));
      }
    else
      {
!       /* This is the first time we see EXPR; write all trees
! 	 reachable from it to OB.  */
! 
!       /* Start the DFS walk.  */
!       sccstate = pointer_map_create ();
!       gcc_obstack_init (&sccstate_obstack);
!       next_dfs_num = 1;
!       DFS_write_tree (ob, NULL, expr, ref_p, this_ref_p);
!       sccstack.release ();
!       pointer_map_destroy (sccstate);
!       obstack_free (&sccstate_obstack, NULL);
! 
!       /* Finally append a reference to the tree we were writing.
! 	 ???  If expr ended up as a singleton we could have
! 	 inlined it here and avoided outputting a reference.  */
!       existed_p = streamer_tree_cache_lookup (ob->writer_cache, expr, &ix);
!       gcc_assert (existed_p);
!       streamer_write_record_start (ob, LTO_tree_pickle_reference);
!       streamer_write_uhwi (ob, ix);
!       streamer_write_enum (ob->main_stream, LTO_tags, LTO_NUM_TAGS,
! 			   lto_tree_code_to_tag (TREE_CODE (expr)));
      }
  }
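The entry-candidate selection in DFS_write_tree above (the scc_entry_len / entry_start computation) can also be shown in isolation.  This is a hedged sketch, assuming the member hashes arrive sorted ascending as they do after the qsort in hash_scc; least_collisions_run is a made-up name.  The shortest run of equal hashes gives the least ambiguous entry points for matching the SCC against another TU's copy at WPA time.

```c
#include <stddef.h>

/* Given SIZE member hashes sorted ascending, find the shortest run of
   equal hashes and return its position in *START and length in *LEN,
   mirroring the loop that computes entry_start / scc_entry_len.  */
static void
least_collisions_run (const unsigned *hashes, size_t size,
		      size_t *start, size_t *len)
{
  *start = 0;
  *len = size + 1;
  for (size_t i = 0; i < size;)
    {
      size_t from = i;
      for (i = i + 1; i < size && hashes[i] == hashes[from]; ++i)
	;
      if (i - from < *len)
	{
	  *len = i - from;
	  *start = from;
	}
    }
}
```

For {1, 1, 1, 2, 2, 3, 3, 3} the run of 2s wins; the patch then swaps that run to the front of the SCC before streaming.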
  
Index: trunk/gcc/tree-streamer-in.c
===================================================================
*** trunk.orig/gcc/tree-streamer-in.c	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/tree-streamer-in.c	2013-06-13 15:35:55.522262466 +0200
*************** unpack_ts_base_value_fields (struct bitp
*** 122,128 ****
      TYPE_ARTIFICIAL (expr) = (unsigned) bp_unpack_value (bp, 1);
    else
      TREE_NO_WARNING (expr) = (unsigned) bp_unpack_value (bp, 1);
-   TREE_USED (expr) = (unsigned) bp_unpack_value (bp, 1);
    TREE_NOTHROW (expr) = (unsigned) bp_unpack_value (bp, 1);
    TREE_STATIC (expr) = (unsigned) bp_unpack_value (bp, 1);
    TREE_PRIVATE (expr) = (unsigned) bp_unpack_value (bp, 1);
--- 122,127 ----
*************** lto_input_ts_list_tree_pointers (struct
*** 811,817 ****
  {
    TREE_PURPOSE (expr) = stream_read_tree (ib, data_in);
    TREE_VALUE (expr) = stream_read_tree (ib, data_in);
!   TREE_CHAIN (expr) = streamer_read_chain (ib, data_in);
  }
  
  
--- 810,816 ----
  {
    TREE_PURPOSE (expr) = stream_read_tree (ib, data_in);
    TREE_VALUE (expr) = stream_read_tree (ib, data_in);
!   TREE_CHAIN (expr) = stream_read_tree (ib, data_in);
  }
  
  
*************** streamer_read_tree_body (struct lto_inpu
*** 1021,1039 ****
  }
  
  
- /* Read and INTEGER_CST node from input block IB using the per-file
-    context in DATA_IN.  */
- 
- tree
- streamer_read_integer_cst (struct lto_input_block *ib, struct data_in *data_in)
- {
-   tree type = stream_read_tree (ib, data_in);
-   unsigned HOST_WIDE_INT low = streamer_read_uhwi (ib);
-   HOST_WIDE_INT high = streamer_read_hwi (ib);
-   return build_int_cst_wide (type, low, high);
- }
- 
- 
  /* Read an index IX from input block IB and return the tree node at
     DATA_IN->FILE_DATA->GLOBALS_INDEX[IX].  */
  
--- 1020,1025 ----
*************** streamer_get_pickled_tree (struct lto_in
*** 1047,1053 ****
    ix = streamer_read_uhwi (ib);
    expected_tag = streamer_read_enum (ib, LTO_tags, LTO_NUM_TAGS);
  
!   result = streamer_tree_cache_get (data_in->reader_cache, ix);
    gcc_assert (result
                && TREE_CODE (result) == lto_tag_to_tree_code (expected_tag));
  
--- 1033,1039 ----
    ix = streamer_read_uhwi (ib);
    expected_tag = streamer_read_enum (ib, LTO_tags, LTO_NUM_TAGS);
  
!   result = streamer_tree_cache_get_tree (data_in->reader_cache, ix);
    gcc_assert (result
                && TREE_CODE (result) == lto_tag_to_tree_code (expected_tag));
  
*************** streamer_get_builtin_tree (struct lto_in
*** 1091,1097 ****
    if (asmname)
      set_builtin_user_assembler_name (result, asmname);
  
!   streamer_tree_cache_append (data_in->reader_cache, result);
  
    return result;
  }
--- 1077,1083 ----
    if (asmname)
      set_builtin_user_assembler_name (result, asmname);
  
!   streamer_tree_cache_append (data_in->reader_cache, result, 0);
  
    return result;
  }
Index: trunk/gcc/lto/lto.c
===================================================================
*** trunk.orig/gcc/lto/lto.c	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/lto/lto.c	2013-06-13 15:36:51.976937026 +0200
*************** along with GCC; see the file COPYING3.
*** 45,50 ****
--- 45,51 ----
  #include "tree-streamer.h"
  #include "splay-tree.h"
  #include "lto-partition.h"
+ #include "data-streamer.h"
  
  static GTY(()) tree first_personality_decl;
  
*************** lto_read_in_decl_state (struct data_in *
*** 254,260 ****
    uint32_t i, j;
  
    ix = *data++;
!   decl = streamer_tree_cache_get (data_in->reader_cache, ix);
    if (TREE_CODE (decl) != FUNCTION_DECL)
      {
        gcc_assert (decl == void_type_node);
--- 255,261 ----
    uint32_t i, j;
  
    ix = *data++;
!   decl = streamer_tree_cache_get_tree (data_in->reader_cache, ix);
    if (TREE_CODE (decl) != FUNCTION_DECL)
      {
        gcc_assert (decl == void_type_node);
*************** lto_read_in_decl_state (struct data_in *
*** 268,274 ****
        tree *decls = ggc_alloc_vec_tree (size);
  
        for (j = 0; j < size; j++)
! 	decls[j] = streamer_tree_cache_get (data_in->reader_cache, data[j]);
  
        state->streams[i].size = size;
        state->streams[i].trees = decls;
--- 269,275 ----
        tree *decls = ggc_alloc_vec_tree (size);
  
        for (j = 0; j < size; j++)
! 	decls[j] = streamer_tree_cache_get_tree (data_in->reader_cache, data[j]);
  
        state->streams[i].size = size;
        state->streams[i].trees = decls;
*************** lto_read_in_decl_state (struct data_in *
*** 280,285 ****
--- 281,289 ----
  
  
  
+ /* ???  Old hashing and merging code follows; we keep it for statistics
+    purposes for now.  */
+ 
  /* Global type table.  FIXME, it should be possible to re-use some
     of the type hashing routines in tree.c (type_hash_canon, type_hash_lookup,
     etc), but those assume that types were built with the various
*************** struct sccs
*** 366,398 ****
  static unsigned int next_dfs_num;
  static unsigned int gtc_next_dfs_num;
  
- /* GIMPLE type merging cache.  A direct-mapped cache based on TYPE_UID.  */
- 
- typedef struct GTY(()) gimple_type_leader_entry_s {
-   tree type;
-   tree leader;
- } gimple_type_leader_entry;
- 
- #define GIMPLE_TYPE_LEADER_SIZE 16381
- static GTY((length("GIMPLE_TYPE_LEADER_SIZE")))
-   gimple_type_leader_entry *gimple_type_leader;
- 
- /* Lookup an existing leader for T and return it or NULL_TREE, if
-    there is none in the cache.  */
- 
- static inline tree
- gimple_lookup_type_leader (tree t)
- {
-   gimple_type_leader_entry *leader;
- 
-   leader = &gimple_type_leader[TYPE_UID (t) % GIMPLE_TYPE_LEADER_SIZE];
-   if (leader->type != t)
-     return NULL_TREE;
- 
-   return leader->leader;
- }
- 
- 
  /* Return true if T1 and T2 have the same name.  If FOR_COMPLETION_P is
     true then if any type has no name return false, otherwise return
     true if both types have no names.  */
--- 370,375 ----
*************** gtc_visit (tree t1, tree t2,
*** 450,456 ****
    struct sccs *cstate = NULL;
    type_pair_t p;
    void **slot;
-   tree leader1, leader2;
  
    /* Check first for the obvious case of pointer identity.  */
    if (t1 == t2)
--- 427,432 ----
*************** gtc_visit (tree t1, tree t2,
*** 507,521 ****
        /* For other types fall through to more complex checks.  */
      }
  
-   /* If the types have been previously registered and found equal
-      they still are.  */
-   leader1 = gimple_lookup_type_leader (t1);
-   leader2 = gimple_lookup_type_leader (t2);
-   if (leader1 == t2
-       || t1 == leader2
-       || (leader1 && leader1 == leader2))
-     return true;
- 
    /* If the hash values of t1 and t2 are different the types can't
       possibly be the same.  This helps keeping the type-pair hashtable
       small, only tracking comparisons for hash collisions.  */
--- 483,488 ----
*************** gimple_types_compatible_p (tree t1, tree
*** 879,885 ****
    struct obstack sccstate_obstack;
    type_pair_t p = NULL;
    bool res;
-   tree leader1, leader2;
  
    /* Before starting to set up the SCC machinery handle simple cases.  */
  
--- 846,851 ----
*************** gimple_types_compatible_p (tree t1, tree
*** 938,952 ****
        /* For other types fall through to more complex checks.  */
      }
  
-   /* If the types have been previously registered and found equal
-      they still are.  */
-   leader1 = gimple_lookup_type_leader (t1);
-   leader2 = gimple_lookup_type_leader (t2);
-   if (leader1 == t2
-       || t1 == leader2
-       || (leader1 && leader1 == leader2))
-     return true;
- 
    /* If the hash values of t1 and t2 are different the types can't
       possibly be the same.  This helps keeping the type-pair hashtable
       small, only tracking comparisons for hash collisions.  */
--- 904,909 ----
*************** gimple_type_eq (const void *p1, const vo
*** 1335,1402 ****
  				    CONST_CAST_TREE (t2));
  }
  
! 
! /* Worker for gimple_register_type.
!    Register type T in the global type table gimple_types.
!    When REGISTERING_MV is false first recurse for the main variant of T.  */
  
  static tree
! gimple_register_type_1 (tree t, bool registering_mv)
  {
    void **slot;
-   gimple_type_leader_entry *leader;
- 
-   /* If we registered this type before return the cached result.  */
-   leader = &gimple_type_leader[TYPE_UID (t) % GIMPLE_TYPE_LEADER_SIZE];
-   if (leader->type == t)
-     return leader->leader;
- 
-   /* Always register the main variant first.  This is important so we
-      pick up the non-typedef variants as canonical, otherwise we'll end
-      up taking typedef ids for structure tags during comparison.
-      It also makes sure that main variants will be merged to main variants.
-      As we are operating on a possibly partially fixed up type graph
-      do not bother to recurse more than once, otherwise we may end up
-      walking in circles.
-      If we are registering a main variant it will either remain its
-      own main variant or it will be merged to something else in which
-      case we do not care for the main variant leader.  */
-   if (!registering_mv
-       && TYPE_MAIN_VARIANT (t) != t)
-     gimple_register_type_1 (TYPE_MAIN_VARIANT (t), true);
  
    /* See if we already have an equivalent type registered.  */
    slot = htab_find_slot (gimple_types, t, INSERT);
    if (*slot
        && *(tree *)slot != t)
!     {
!       tree new_type = (tree) *((tree *) slot);
!       leader->type = t;
!       leader->leader = new_type;
!       return new_type;
!     }
  
    /* If not, insert it to the cache and the hash.  */
-   leader->type = t;
-   leader->leader = t;
    *slot = (void *) t;
    return t;
  }
  
! /* Register type T in the global type table gimple_types.
!    If another type T', compatible with T, already existed in
!    gimple_types then return T', otherwise return T.  This is used by
!    LTO to merge identical types read from different TUs.  */
! 
! static tree
! gimple_register_type (tree t)
! {
!   gcc_assert (TYPE_P (t));
!   return gimple_register_type_1 (t, false);
! }
! 
! #define GIMPLE_REGISTER_TYPE(tt) \
!    (TREE_VISITED (tt) ? gimple_register_type (tt) : tt)
  
  
  
--- 1292,1316 ----
  				    CONST_CAST_TREE (t2));
  }
  
! /* Register type T in the global type table gimple_types.  */
  
  static tree
! gimple_register_type (tree t)
  {
    void **slot;
  
    /* See if we already have an equivalent type registered.  */
    slot = htab_find_slot (gimple_types, t, INSERT);
    if (*slot
        && *(tree *)slot != t)
!     return (tree) *((tree *) slot);
  
    /* If not, insert it to the cache and the hash.  */
    *slot = (void *) t;
    return t;
  }
  
! /* End of old merging code.  */
  
  
  
*************** remember_with_vars (tree t)
*** 1413,1632 ****
    *(tree *) htab_find_slot (tree_with_vars, t, INSERT) = t;
  }
  
! #define LTO_FIXUP_TREE(tt) \
    do \
      { \
        if (tt) \
  	{ \
- 	  if (TYPE_P (tt)) \
- 	    (tt) = GIMPLE_REGISTER_TYPE (tt); \
  	  if (VAR_OR_FUNCTION_DECL_P (tt) && TREE_PUBLIC (tt)) \
  	    remember_with_vars (t); \
- 	  if (TREE_CODE (tt) == INTEGER_CST) \
- 	    (tt) = fixup_integer_cst (tt); \
  	} \
      } while (0)
  
- static void lto_fixup_types (tree);
- 
- /* Return integer_cst T with updated type.  */
- 
- static tree
- fixup_integer_cst (tree t)
- {
-   tree type = GIMPLE_REGISTER_TYPE (TREE_TYPE (t));
- 
-   if (type == TREE_TYPE (t))
-     return t;
- 
-   /* If overflow was set, streamer_read_integer_cst
-      produced local copy of T. */
-   if (TREE_OVERFLOW (t))
-     {
-       TREE_TYPE (t) = type;
-       return t;
-     }
-   else
-   /* Otherwise produce new shared node for the new type.  */
-     return build_int_cst_wide (type, TREE_INT_CST_LOW (t),
- 			       TREE_INT_CST_HIGH (t));
- }
- 
  /* Fix up fields of a tree_typed T.  */
  
  static void
! lto_ft_typed (tree t)
  {
!   LTO_FIXUP_TREE (TREE_TYPE (t));
  }
  
  /* Fix up fields of a tree_common T.  */
  
  static void
! lto_ft_common (tree t)
  {
!   lto_ft_typed (t);
!   LTO_FIXUP_TREE (TREE_CHAIN (t));
  }
  
  /* Fix up fields of a decl_minimal T.  */
  
  static void
! lto_ft_decl_minimal (tree t)
  {
!   lto_ft_common (t);
!   LTO_FIXUP_TREE (DECL_NAME (t));
!   LTO_FIXUP_TREE (DECL_CONTEXT (t));
  }
  
  /* Fix up fields of a decl_common T.  */
  
  static void
! lto_ft_decl_common (tree t)
  {
!   lto_ft_decl_minimal (t);
!   LTO_FIXUP_TREE (DECL_SIZE (t));
!   LTO_FIXUP_TREE (DECL_SIZE_UNIT (t));
!   LTO_FIXUP_TREE (DECL_INITIAL (t));
!   LTO_FIXUP_TREE (DECL_ATTRIBUTES (t));
!   LTO_FIXUP_TREE (DECL_ABSTRACT_ORIGIN (t));
  }
  
  /* Fix up fields of a decl_with_vis T.  */
  
  static void
! lto_ft_decl_with_vis (tree t)
  {
!   lto_ft_decl_common (t);
  
    /* Accessor macro has side-effects, use field-name here. */
!   LTO_FIXUP_TREE (t->decl_with_vis.assembler_name);
!   LTO_FIXUP_TREE (DECL_SECTION_NAME (t));
  }
  
  /* Fix up fields of a decl_non_common T.  */
  
  static void
! lto_ft_decl_non_common (tree t)
  {
!   lto_ft_decl_with_vis (t);
!   LTO_FIXUP_TREE (DECL_ARGUMENT_FLD (t));
!   LTO_FIXUP_TREE (DECL_RESULT_FLD (t));
!   LTO_FIXUP_TREE (DECL_VINDEX (t));
!   /* The C frontends may create exact duplicates for DECL_ORIGINAL_TYPE
!      like for 'typedef enum foo foo'.  We have no way of avoiding to
!      merge them and dwarf2out.c cannot deal with this,
!      so fix this up by clearing DECL_ORIGINAL_TYPE in this case.  */
!   if (TREE_CODE (t) == TYPE_DECL
!       && DECL_ORIGINAL_TYPE (t) == TREE_TYPE (t))
!     DECL_ORIGINAL_TYPE (t) = NULL_TREE;
  }
  
  /* Fix up fields of a decl_non_common T.  */
  
  static void
! lto_ft_function (tree t)
  {
!   lto_ft_decl_non_common (t);
!   LTO_FIXUP_TREE (DECL_FUNCTION_PERSONALITY (t));
  }
  
  /* Fix up fields of a field_decl T.  */
  
  static void
! lto_ft_field_decl (tree t)
  {
!   lto_ft_decl_common (t);
!   LTO_FIXUP_TREE (DECL_FIELD_OFFSET (t));
!   LTO_FIXUP_TREE (DECL_BIT_FIELD_TYPE (t));
!   LTO_FIXUP_TREE (DECL_QUALIFIER (t));
!   LTO_FIXUP_TREE (DECL_FIELD_BIT_OFFSET (t));
!   LTO_FIXUP_TREE (DECL_FCONTEXT (t));
  }
  
  /* Fix up fields of a type T.  */
  
  static void
! lto_ft_type (tree t)
  {
!   lto_ft_common (t);
!   LTO_FIXUP_TREE (TYPE_CACHED_VALUES (t));
!   LTO_FIXUP_TREE (TYPE_SIZE (t));
!   LTO_FIXUP_TREE (TYPE_SIZE_UNIT (t));
!   LTO_FIXUP_TREE (TYPE_ATTRIBUTES (t));
!   LTO_FIXUP_TREE (TYPE_NAME (t));
  
    /* Accessors are for derived node types only. */
    if (!POINTER_TYPE_P (t))
!     LTO_FIXUP_TREE (TYPE_MINVAL (t));
!   LTO_FIXUP_TREE (TYPE_MAXVAL (t));
  
    /* Accessor is for derived node types only. */
!   LTO_FIXUP_TREE (t->type_non_common.binfo);
  
!   LTO_FIXUP_TREE (TYPE_CONTEXT (t));
  }
  
  /* Fix up fields of a BINFO T.  */
  
  static void
! lto_ft_binfo (tree t)
  {
    unsigned HOST_WIDE_INT i, n;
-   tree base, saved_base;
  
!   lto_ft_common (t);
!   LTO_FIXUP_TREE (BINFO_VTABLE (t));
!   LTO_FIXUP_TREE (BINFO_OFFSET (t));
!   LTO_FIXUP_TREE (BINFO_VIRTUALS (t));
!   LTO_FIXUP_TREE (BINFO_VPTR_FIELD (t));
    n = vec_safe_length (BINFO_BASE_ACCESSES (t));
    for (i = 0; i < n; i++)
!     {
!       saved_base = base = BINFO_BASE_ACCESS (t, i);
!       LTO_FIXUP_TREE (base);
!       if (base != saved_base)
! 	(*BINFO_BASE_ACCESSES (t))[i] = base;
!     }
!   LTO_FIXUP_TREE (BINFO_INHERITANCE_CHAIN (t));
!   LTO_FIXUP_TREE (BINFO_SUBVTT_INDEX (t));
!   LTO_FIXUP_TREE (BINFO_VPTR_INDEX (t));
    n = BINFO_N_BASE_BINFOS (t);
    for (i = 0; i < n; i++)
!     {
!       saved_base = base = BINFO_BASE_BINFO (t, i);
!       LTO_FIXUP_TREE (base);
!       if (base != saved_base)
! 	(*BINFO_BASE_BINFOS (t))[i] = base;
!     }
  }
  
  /* Fix up fields of a CONSTRUCTOR T.  */
  
  static void
! lto_ft_constructor (tree t)
  {
    unsigned HOST_WIDE_INT idx;
    constructor_elt *ce;
  
!   lto_ft_typed (t);
  
    for (idx = 0; vec_safe_iterate (CONSTRUCTOR_ELTS (t), idx, &ce); idx++)
      {
!       LTO_FIXUP_TREE (ce->index);
!       LTO_FIXUP_TREE (ce->value);
      }
  }
  
  /* Fix up fields of an expression tree T.  */
  
  static void
! lto_ft_expr (tree t)
  {
    int i;
!   lto_ft_typed (t);
    for (i = TREE_OPERAND_LENGTH (t) - 1; i >= 0; --i)
!     LTO_FIXUP_TREE (TREE_OPERAND (t, i));
  }
  
  /* Given a tree T fixup fields of T by replacing types with their merged
--- 1327,1499 ----
    *(tree *) htab_find_slot (tree_with_vars, t, INSERT) = t;
  }
  
! #define MAYBE_REMEMBER_WITH_VARS(tt) \
    do \
      { \
        if (tt) \
  	{ \
  	  if (VAR_OR_FUNCTION_DECL_P (tt) && TREE_PUBLIC (tt)) \
  	    remember_with_vars (t); \
  	} \
      } while (0)
  
  /* Fix up fields of a tree_typed T.  */
  
  static void
! maybe_remember_with_vars_typed (tree t)
  {
!   MAYBE_REMEMBER_WITH_VARS (TREE_TYPE (t));
  }
  
  /* Fix up fields of a tree_common T.  */
  
  static void
! maybe_remember_with_vars_common (tree t)
  {
!   maybe_remember_with_vars_typed (t);
!   MAYBE_REMEMBER_WITH_VARS (TREE_CHAIN (t));
  }
  
  /* Fix up fields of a decl_minimal T.  */
  
  static void
! maybe_remember_with_vars_decl_minimal (tree t)
  {
!   maybe_remember_with_vars_common (t);
!   MAYBE_REMEMBER_WITH_VARS (DECL_NAME (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_CONTEXT (t));
  }
  
  /* Fix up fields of a decl_common T.  */
  
  static void
! maybe_remember_with_vars_decl_common (tree t)
  {
!   maybe_remember_with_vars_decl_minimal (t);
!   MAYBE_REMEMBER_WITH_VARS (DECL_SIZE (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_SIZE_UNIT (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_INITIAL (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_ATTRIBUTES (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_ABSTRACT_ORIGIN (t));
  }
  
  /* Fix up fields of a decl_with_vis T.  */
  
  static void
! maybe_remember_with_vars_decl_with_vis (tree t)
  {
!   maybe_remember_with_vars_decl_common (t);
  
    /* Accessor macro has side-effects, use field-name here. */
!   MAYBE_REMEMBER_WITH_VARS (t->decl_with_vis.assembler_name);
!   MAYBE_REMEMBER_WITH_VARS (DECL_SECTION_NAME (t));
  }
  
  /* Fix up fields of a decl_non_common T.  */
  
  static void
! maybe_remember_with_vars_decl_non_common (tree t)
  {
!   maybe_remember_with_vars_decl_with_vis (t);
!   MAYBE_REMEMBER_WITH_VARS (DECL_ARGUMENT_FLD (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_RESULT_FLD (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_VINDEX (t));
  }
  
  /* Fix up fields of a decl_non_common T.  */
  
  static void
! maybe_remember_with_vars_function (tree t)
  {
!   maybe_remember_with_vars_decl_non_common (t);
!   MAYBE_REMEMBER_WITH_VARS (DECL_FUNCTION_PERSONALITY (t));
  }
  
  /* Fix up fields of a field_decl T.  */
  
  static void
! maybe_remember_with_vars_field_decl (tree t)
  {
!   maybe_remember_with_vars_decl_common (t);
!   MAYBE_REMEMBER_WITH_VARS (DECL_FIELD_OFFSET (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_BIT_FIELD_TYPE (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_QUALIFIER (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_FIELD_BIT_OFFSET (t));
!   MAYBE_REMEMBER_WITH_VARS (DECL_FCONTEXT (t));
  }
  
  /* Fix up fields of a type T.  */
  
  static void
! maybe_remember_with_vars_type (tree t)
  {
!   maybe_remember_with_vars_common (t);
!   MAYBE_REMEMBER_WITH_VARS (TYPE_CACHED_VALUES (t));
!   MAYBE_REMEMBER_WITH_VARS (TYPE_SIZE (t));
!   MAYBE_REMEMBER_WITH_VARS (TYPE_SIZE_UNIT (t));
!   MAYBE_REMEMBER_WITH_VARS (TYPE_ATTRIBUTES (t));
!   MAYBE_REMEMBER_WITH_VARS (TYPE_NAME (t));
  
    /* Accessors are for derived node types only. */
    if (!POINTER_TYPE_P (t))
!     MAYBE_REMEMBER_WITH_VARS (TYPE_MINVAL (t));
!   MAYBE_REMEMBER_WITH_VARS (TYPE_MAXVAL (t));
  
    /* Accessor is for derived node types only. */
!   MAYBE_REMEMBER_WITH_VARS (t->type_non_common.binfo);
  
!   MAYBE_REMEMBER_WITH_VARS (TYPE_CONTEXT (t));
  }
  
  /* Fix up fields of a BINFO T.  */
  
  static void
! maybe_remember_with_vars_binfo (tree t)
  {
    unsigned HOST_WIDE_INT i, n;
  
!   maybe_remember_with_vars_common (t);
!   MAYBE_REMEMBER_WITH_VARS (BINFO_VTABLE (t));
!   MAYBE_REMEMBER_WITH_VARS (BINFO_OFFSET (t));
!   MAYBE_REMEMBER_WITH_VARS (BINFO_VIRTUALS (t));
!   MAYBE_REMEMBER_WITH_VARS (BINFO_VPTR_FIELD (t));
    n = vec_safe_length (BINFO_BASE_ACCESSES (t));
    for (i = 0; i < n; i++)
!     MAYBE_REMEMBER_WITH_VARS (BINFO_BASE_ACCESS (t, i));
!   MAYBE_REMEMBER_WITH_VARS (BINFO_INHERITANCE_CHAIN (t));
!   MAYBE_REMEMBER_WITH_VARS (BINFO_SUBVTT_INDEX (t));
!   MAYBE_REMEMBER_WITH_VARS (BINFO_VPTR_INDEX (t));
    n = BINFO_N_BASE_BINFOS (t);
    for (i = 0; i < n; i++)
!     MAYBE_REMEMBER_WITH_VARS (BINFO_BASE_BINFO (t, i));
  }
  
  /* Fix up fields of a CONSTRUCTOR T.  */
  
  static void
! maybe_remember_with_vars_constructor (tree t)
  {
    unsigned HOST_WIDE_INT idx;
    constructor_elt *ce;
  
!   maybe_remember_with_vars_typed (t);
  
    for (idx = 0; vec_safe_iterate (CONSTRUCTOR_ELTS (t), idx, &ce); idx++)
      {
!       MAYBE_REMEMBER_WITH_VARS (ce->index);
!       MAYBE_REMEMBER_WITH_VARS (ce->value);
      }
  }
  
  /* Fix up fields of an expression tree T.  */
  
  static void
! maybe_remember_with_vars_expr (tree t)
  {
    int i;
!   maybe_remember_with_vars_typed (t);
    for (i = TREE_OPERAND_LENGTH (t) - 1; i >= 0; --i)
!     MAYBE_REMEMBER_WITH_VARS (TREE_OPERAND (t, i));
  }
  
  /* Given a tree T fixup fields of T by replacing types with their merged
*************** lto_ft_expr (tree t)
*** 1635,1641 ****
     for instance integer or string constants.  */
  
  static void
! lto_fixup_types (tree t)
  {
    switch (TREE_CODE (t))
      {
--- 1502,1508 ----
     for instance integer or string constants.  */
  
  static void
! maybe_remember_with_vars (tree t)
  {
    switch (TREE_CODE (t))
      {
*************** lto_fixup_types (tree t)
*** 1643,1655 ****
        break;
  
      case TREE_LIST:
!       LTO_FIXUP_TREE (TREE_VALUE (t));
!       LTO_FIXUP_TREE (TREE_PURPOSE (t));
!       LTO_FIXUP_TREE (TREE_CHAIN (t));
        break;
  
      case FIELD_DECL:
!       lto_ft_field_decl (t);
        break;
  
      case LABEL_DECL:
--- 1510,1522 ----
        break;
  
      case TREE_LIST:
!       MAYBE_REMEMBER_WITH_VARS (TREE_VALUE (t));
!       MAYBE_REMEMBER_WITH_VARS (TREE_PURPOSE (t));
!       MAYBE_REMEMBER_WITH_VARS (TREE_CHAIN (t));
        break;
  
      case FIELD_DECL:
!       maybe_remember_with_vars_field_decl (t);
        break;
  
      case LABEL_DECL:
*************** lto_fixup_types (tree t)
*** 1657,1683 ****
      case PARM_DECL:
      case RESULT_DECL:
      case IMPORTED_DECL:
!       lto_ft_decl_common (t);
        break;
  
      case VAR_DECL:
!       lto_ft_decl_with_vis (t);
        break;
  
      case TYPE_DECL:
!       lto_ft_decl_non_common (t);
        break;
  
      case FUNCTION_DECL:
!       lto_ft_function (t);
        break;
  
      case TREE_BINFO:
!       lto_ft_binfo (t);
        break;
  
      case PLACEHOLDER_EXPR:
!       lto_ft_common (t);
        break;
  
      case BLOCK:
--- 1524,1550 ----
      case PARM_DECL:
      case RESULT_DECL:
      case IMPORTED_DECL:
!       maybe_remember_with_vars_decl_common (t);
        break;
  
      case VAR_DECL:
!       maybe_remember_with_vars_decl_with_vis (t);
        break;
  
      case TYPE_DECL:
!       maybe_remember_with_vars_decl_non_common (t);
        break;
  
      case FUNCTION_DECL:
!       maybe_remember_with_vars_function (t);
        break;
  
      case TREE_BINFO:
!       maybe_remember_with_vars_binfo (t);
        break;
  
      case PLACEHOLDER_EXPR:
!       maybe_remember_with_vars_common (t);
        break;
  
      case BLOCK:
*************** lto_fixup_types (tree t)
*** 1686,1706 ****
      case TARGET_OPTION_NODE:
        break;
  
      default:
        if (TYPE_P (t))
! 	lto_ft_type (t);
!       else if (TREE_CODE (t) == CONSTRUCTOR)
! 	lto_ft_constructor (t);
        else if (CONSTANT_CLASS_P (t))
! 	LTO_FIXUP_TREE (TREE_TYPE (t));
        else if (EXPR_P (t))
! 	{
! 	  lto_ft_expr (t);
! 	}
        else
! 	{
! 	  remember_with_vars (t);
! 	}
      }
  }
  
--- 1553,1571 ----
      case TARGET_OPTION_NODE:
        break;
  
+     case CONSTRUCTOR:
+       maybe_remember_with_vars_constructor (t);
+       break;
+ 
      default:
        if (TYPE_P (t))
! 	maybe_remember_with_vars_type (t);
        else if (CONSTANT_CLASS_P (t))
! 	MAYBE_REMEMBER_WITH_VARS (TREE_TYPE (t));
        else if (EXPR_P (t))
! 	maybe_remember_with_vars_expr (t);
        else
! 	remember_with_vars (t);
      }
  }
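As a hedged illustration (toy `struct node` types, not GCC's tree or streamer structures; `maybe_remember`, `remember` and `num_remembered` are made-up names), the per-code dispatch above — switch on a node's code, walk its outgoing edges, and remember anything that may need fixing up later — boils down to a pattern like this:

```c
/* Minimal sketch of a per-code "remember for later fixup" walk.
   All types and names here are hypothetical stand-ins.  */
#include <assert.h>
#include <stddef.h>

enum node_code { N_LIST, N_DECL, N_OTHER };

struct node {
  enum node_code code;
  struct node *value, *chain;   /* stand-ins for TREE_VALUE/TREE_CHAIN */
  int needs_fixup;              /* stand-in for remember_with_vars state */
};

static int num_remembered;

/* Record a node at most once, like remember_with_vars.  */
static void remember (struct node *n)
{
  if (n && !n->needs_fixup)
    {
      n->needs_fixup = 1;
      num_remembered++;
    }
}

/* Dispatch on the node code and remember outgoing edges,
   mirroring the maybe_remember_with_vars switch above.  */
static void maybe_remember (struct node *n)
{
  if (!n)
    return;
  switch (n->code)
    {
    case N_LIST:
      remember (n->value);
      remember (n->chain);
      break;
    case N_DECL:
      remember (n->value);
      break;
    default:
      remember (n);
      break;
    }
}
```

The needs_fixup flag makes the walk idempotent, so a node reachable through several edges is queued only once — the same property the real code gets from its remember_with_vars bookkeeping.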
  
*************** lto_register_function_decl_in_symtab (st
*** 1786,2006 ****
      }
  }
  
- static unsigned long num_merged_types = 0;
  
! /* Given a streamer cache structure DATA_IN (holding a sequence of trees
!    for one compilation unit) go over all trees starting at index FROM until the
!    end of the sequence and replace fields of those trees, and the trees
!    themselves with their canonical variants as per gimple_register_type.  */
  
  static void
! uniquify_nodes (struct data_in *data_in, unsigned from)
  {
!   struct streamer_tree_cache_d *cache = data_in->reader_cache;
!   unsigned len = cache->nodes.length ();
!   unsigned i;
  
!   /* Go backwards because children streamed for the first time come
!      as part of their parents, and hence are created after them.  */
  
!   /* First register all the types in the cache.  This makes sure to
!      have the original structure in the type cycles when registering
!      them and computing hashes.  */
!   for (i = len; i-- > from;)
!     {
!       tree t = cache->nodes[i];
!       if (t && TYPE_P (t))
! 	{
! 	  tree newt = gimple_register_type (t);
! 	  /* Mark non-prevailing types so we fix them up.  No need
! 	     to reset that flag afterwards - nothing that refers
! 	     to those types is left and they are collected.  */
! 	  if (newt != t)
! 	    {
! 	      num_merged_types++;
! 	      TREE_VISITED (t) = 1;
! 	    }
  	}
      }
  
!   /* Second fixup all trees in the new cache entries.  */
!   for (i = len; i-- > from;)
      {
!       tree t = cache->nodes[i];
!       tree oldt = t;
!       if (!t)
! 	continue;
! 
!       /* First fixup the fields of T.  */
!       lto_fixup_types (t);
! 
!       if (!TYPE_P (t))
! 	continue;
! 
!       /* Now try to find a canonical variant of T itself.  */
!       t = GIMPLE_REGISTER_TYPE (t);
! 
!       if (t == oldt)
! 	{
! 	  /* The following re-creates proper variant lists while fixing up
! 	     the variant leaders.  We do not stream TYPE_NEXT_VARIANT so the
! 	     variant list state before fixup is broken.  */
! 	  tree tem, mv;
! 
! #ifdef ENABLE_CHECKING
! 	  /* Remove us from our main variant list if we are not the
! 	     variant leader.  */
! 	  if (TYPE_MAIN_VARIANT (t) != t)
! 	    {
! 	      tem = TYPE_MAIN_VARIANT (t);
! 	      while (tem && TYPE_NEXT_VARIANT (tem) != t)
! 		tem = TYPE_NEXT_VARIANT (tem);
! 	      gcc_assert (!tem && !TYPE_NEXT_VARIANT (t));
! 	    }
! #endif
  
! 	  /* Query our new main variant.  */
! 	  mv = GIMPLE_REGISTER_TYPE (TYPE_MAIN_VARIANT (t));
  
! 	  /* If we were the variant leader and we get replaced ourselves drop
! 	     all variants from our list.  */
! 	  if (TYPE_MAIN_VARIANT (t) == t
! 	      && mv != t)
! 	    {
! 	      tem = t;
! 	      while (tem)
! 		{
! 		  tree tem2 = TYPE_NEXT_VARIANT (tem);
! 		  TYPE_NEXT_VARIANT (tem) = NULL_TREE;
! 		  tem = tem2;
! 		}
! 	    }
  
! 	  /* If we are not our own variant leader link us into our new leaders
! 	     variant list.  */
! 	  if (mv != t)
! 	    {
! 	      TYPE_NEXT_VARIANT (t) = TYPE_NEXT_VARIANT (mv);
! 	      TYPE_NEXT_VARIANT (mv) = t;
! 	      if (RECORD_OR_UNION_TYPE_P (t))
! 		TYPE_BINFO (t) = TYPE_BINFO (mv);
! 	      /* Preserve the invariant that type variants share their
! 		 TYPE_FIELDS.  */
! 	      if (RECORD_OR_UNION_TYPE_P (t)
! 		  && TYPE_FIELDS (mv) != TYPE_FIELDS (t))
! 		{
! 		  tree f1, f2;
! 		  for (f1 = TYPE_FIELDS (mv), f2 = TYPE_FIELDS (t);
! 		       f1 && f2; f1 = TREE_CHAIN (f1), f2 = TREE_CHAIN (f2))
! 		    {
! 		      unsigned ix;
! 		      gcc_assert (f1 != f2
! 				  && DECL_NAME (f1) == DECL_NAME (f2));
! 		      if (!streamer_tree_cache_lookup (cache, f2, &ix))
! 			gcc_unreachable ();
! 		      /* If we're going to replace an element which we'd
! 			 still visit in the next iterations, we wouldn't
! 			 handle it, so do it here.  We do have to handle it
! 			 even though the field_decl itself will be removed,
! 			 as it could refer to e.g. integer_cst which we
! 			 wouldn't reach via any other way, hence they
! 			 (and their type) would stay uncollected.  */
! 		      /* ???  We should rather make sure to replace all
! 			 references to f2 with f1.  That means handling
! 			 COMPONENT_REFs and CONSTRUCTOR elements in
! 			 lto_fixup_types and special-case the field-decl
! 			 operand handling.  */
! 		      /* ???  Not sure the above is all relevant in this
! 		         path canonicalizing TYPE_FIELDS to that of the
! 			 main variant.  */
! 		      if (ix < i)
! 			lto_fixup_types (f2);
! 		      streamer_tree_cache_insert_at (cache, f1, ix);
! 		    }
! 		  TYPE_FIELDS (t) = TYPE_FIELDS (mv);
! 		}
! 	    }
  
! 	  /* Finally adjust our main variant and fix it up.  */
! 	  TYPE_MAIN_VARIANT (t) = mv;
  
! 	  /* The following reconstructs the pointer chains
! 	     of the new pointed-to type if we are a main variant.  We do
! 	     not stream those so they are broken before fixup.  */
! 	  if (TREE_CODE (t) == POINTER_TYPE
! 	      && TYPE_MAIN_VARIANT (t) == t)
! 	    {
! 	      TYPE_NEXT_PTR_TO (t) = TYPE_POINTER_TO (TREE_TYPE (t));
! 	      TYPE_POINTER_TO (TREE_TYPE (t)) = t;
! 	    }
! 	  else if (TREE_CODE (t) == REFERENCE_TYPE
! 		   && TYPE_MAIN_VARIANT (t) == t)
! 	    {
! 	      TYPE_NEXT_REF_TO (t) = TYPE_REFERENCE_TO (TREE_TYPE (t));
! 	      TYPE_REFERENCE_TO (TREE_TYPE (t)) = t;
! 	    }
  	}
  
!       else
  	{
! 	  if (RECORD_OR_UNION_TYPE_P (t))
! 	    {
! 	      tree f1, f2;
! 	      if (TYPE_FIELDS (t) != TYPE_FIELDS (oldt))
! 		for (f1 = TYPE_FIELDS (t), f2 = TYPE_FIELDS (oldt);
! 		     f1 && f2; f1 = TREE_CHAIN (f1), f2 = TREE_CHAIN (f2))
! 		  {
! 		    unsigned ix;
! 		    gcc_assert (f1 != f2 && DECL_NAME (f1) == DECL_NAME (f2));
! 		    if (!streamer_tree_cache_lookup (cache, f2, &ix))
! 		      gcc_unreachable ();
! 		    /* If we're going to replace an element which we'd
! 		       still visit in the next iterations, we wouldn't
! 		       handle it, so do it here.  We do have to handle it
! 		       even though the field_decl itself will be removed,
! 		       as it could refer to e.g. integer_cst which we
! 		       wouldn't reach via any other way, hence they
! 		       (and their type) would stay uncollected.  */
! 		    /* ???  We should rather make sure to replace all
! 		       references to f2 with f1.  That means handling
! 		       COMPONENT_REFs and CONSTRUCTOR elements in
! 		       lto_fixup_types and special-case the field-decl
! 		       operand handling.  */
! 		    if (ix < i)
! 		      lto_fixup_types (f2);
! 		    streamer_tree_cache_insert_at (cache, f1, ix);
! 		  }
! 	    }
  
! 	  /* If we found a tree that is equal to oldt replace it in the
! 	     cache, so that further users (in the various LTO sections)
! 	     make use of it.  */
! 	  streamer_tree_cache_insert_at (cache, t, i);
  	}
      }
  
!   /* Finally compute the canonical type of all TREE_TYPEs and register
!      VAR_DECL and FUNCTION_DECL nodes in the symbol table.
!      From this point there are no longer any types with
!      TYPE_STRUCTURAL_EQUALITY_P and its type-based alias problems.
!      This step requires the TYPE_POINTER_TO lists being present, so
!      make sure it is done last.  */
!   for (i = len; i-- > from;)
!     {
!       tree t = cache->nodes[i];
!       if (t == NULL_TREE)
! 	continue;
! 
!       if (TREE_CODE (t) == VAR_DECL)
! 	lto_register_var_decl_in_symtab (data_in, t);
!       else if (TREE_CODE (t) == FUNCTION_DECL && !DECL_BUILT_IN (t))
! 	lto_register_function_decl_in_symtab (data_in, t);
!       else if (!flag_wpa
! 	       && TREE_CODE (t) == TYPE_DECL)
! 	debug_hooks->type_decl (t, !DECL_FILE_SCOPE_P (t));
!       else if (TYPE_P (t) && !TYPE_CANONICAL (t))
! 	TYPE_CANONICAL (t) = gimple_register_canonical_type (t);
      }
  }
  
  
--- 1651,2378 ----
      }
  }
  
  
! /* For the type T, re-materialize it in the type variant list and
!    the pointer/reference-to chains.  */
  
  static void
! lto_fixup_prevailing_type (tree t)
  {
!   /* The following re-creates proper variant lists while fixing up
!      the variant leaders.  We do not stream TYPE_NEXT_VARIANT so the
!      variant list state before fixup is broken.  */
  
!   /* If we are not our own variant leader, link us into our new leader's
!      variant list.  */
!   if (TYPE_MAIN_VARIANT (t) != t)
!     {
!       tree mv = TYPE_MAIN_VARIANT (t);
!       TYPE_NEXT_VARIANT (t) = TYPE_NEXT_VARIANT (mv);
!       TYPE_NEXT_VARIANT (mv) = t;
!     }
  
!   /* The following reconstructs the pointer chains
!      of the new pointed-to type if we are a main variant.  We do
!      not stream those so they are broken before fixup.  */
!   if (TREE_CODE (t) == POINTER_TYPE
!       && TYPE_MAIN_VARIANT (t) == t)
!     {
!       TYPE_NEXT_PTR_TO (t) = TYPE_POINTER_TO (TREE_TYPE (t));
!       TYPE_POINTER_TO (TREE_TYPE (t)) = t;
!     }
!   else if (TREE_CODE (t) == REFERENCE_TYPE
! 	   && TYPE_MAIN_VARIANT (t) == t)
!     {
!       TYPE_NEXT_REF_TO (t) = TYPE_REFERENCE_TO (TREE_TYPE (t));
!       TYPE_REFERENCE_TO (TREE_TYPE (t)) = t;
!     }
! }
! 
! 
! /* We keep prevailing tree SCCs in a hashtable with manual collision
!    handling (in case all hashes compare the same) and keep the colliding
!    entries in the tree_scc->next chain.  */
! 
! struct tree_scc
! {
!   tree_scc *next;
!   /* Hash of the whole SCC.  */
!   hashval_t hash;
!   /* Number of trees in the SCC.  */
!   unsigned len;
!   /* Number of possible entries into the SCC (tree nodes [0..entry_len-1]
!      which share the same individual tree hash).  */
!   unsigned entry_len;
!   /* The members of the SCC.
!      We only need to remember the first entry node candidate for prevailing
!      SCCs (but of course have access to all entries for SCCs we are
!      processing).
!      ???  For prevailing SCCs we really only need hash and the first
!      entry candidate, but that's too awkward to implement.  */
!   tree entries[1];
! };
! 
! struct tree_scc_hasher : typed_noop_remove <tree_scc>
! {
!   typedef tree_scc value_type;
!   typedef tree_scc compare_type;
!   static inline hashval_t hash (const value_type *);
!   static inline bool equal (const value_type *, const compare_type *);
! };
! 
! hashval_t
! tree_scc_hasher::hash (const value_type *scc)
! {
!   return scc->hash;
! }
! 
! bool
! tree_scc_hasher::equal (const value_type *scc1, const compare_type *scc2)
! {
!   if (scc1->hash != scc2->hash
!       || scc1->len != scc2->len
!       || scc1->entry_len != scc2->entry_len)
!     return false;
!   return true;
! }
! 
! static hash_table <tree_scc_hasher> tree_scc_hash;
! static struct obstack tree_scc_hash_obstack;
! 
! static unsigned long num_merged_types;
! static unsigned long num_prevailing_types;
! static unsigned long num_not_merged_types;
! static unsigned long num_not_merged_types_in_same_scc;
! static unsigned long num_not_merged_types_trees;
! static unsigned long num_not_merged_types_in_same_scc_trees;
! static unsigned long num_type_scc_trees;
! static unsigned long total_scc_size;
! static unsigned long num_sccs_read;
! static unsigned long total_scc_size_merged;
! static unsigned long num_sccs_merged;
! static unsigned long num_scc_compares;
! static unsigned long num_scc_compare_collisions;
! 
! 
! /* Compare the two entries T1 and T2 of two SCCs that are possibly equal,
!    recursing through in-SCC tree edges.  Returns true if the SCCs entered
!    through T1 and T2 are equal and fills in *MAP with the pairs of
!    SCC entries we visited, starting with (*MAP)[0] = T1 and (*MAP)[1] = T2.  */
! 
! static bool
! compare_tree_sccs_1 (tree t1, tree t2, tree **map)
! {
!   enum tree_code code;
! 
!   /* Handle combinations where at least one of t1 or t2 is NULL_TREE.  */
!   if (!t1 && !t2)
!     return true;
!   if (!t1 || !t2)
!     return false;
! 
!   /* Outside of the SCC, trees have to compare pointer equal.  */
!   if (!TREE_VISITED (t2))
!     return t1 == t2;
! 
!   /* Inside of the SCC, trees cannot be pointer equal.  */
!   gcc_assert (t1 != t2);
! 
!   /* If we have already visited t2, we reached a cycle.  Optimistically
!      return equal.  */
!   if (TREE_ASM_WRITTEN (t2))
!     return true;
! 
!   /* Mark already visited nodes.  */
!   TREE_ASM_WRITTEN (t2) = 1;
! 
!   /* Push the pair onto map.  */
!   (*map)[0] = t1;
!   (*map)[1] = t2;
!   *map = *map + 2;
! 
!   /* Compare value-fields.  */
! #define compare_values(X) \
!   do { \
!     if (X(t1) != X(t2)) \
!       return false; \
!   } while (0)
! 
!   compare_values (TREE_CODE);
!   code = TREE_CODE (t1);
! 
!   if (!TYPE_P (t1))
!     {
!       compare_values (TREE_SIDE_EFFECTS);
!       compare_values (TREE_CONSTANT);
!       compare_values (TREE_READONLY);
!       compare_values (TREE_PUBLIC);
!     }
!   compare_values (TREE_ADDRESSABLE);
!   compare_values (TREE_THIS_VOLATILE);
!   if (DECL_P (t1))
!     compare_values (DECL_UNSIGNED);
!   else if (TYPE_P (t1))
!     compare_values (TYPE_UNSIGNED);
!   if (TYPE_P (t1))
!     compare_values (TYPE_ARTIFICIAL);
!   else
!     compare_values (TREE_NO_WARNING);
!   compare_values (TREE_NOTHROW);
!   compare_values (TREE_STATIC);
!   compare_values (TREE_PRIVATE);
!   compare_values (TREE_PROTECTED);
!   compare_values (TREE_DEPRECATED);
!   if (TYPE_P (t1))
!     {
!       compare_values (TYPE_SATURATING);
!       compare_values (TYPE_ADDR_SPACE);
!     }
!   else if (code == SSA_NAME)
!     compare_values (SSA_NAME_IS_DEFAULT_DEF);
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_INT_CST))
!     {
!       compare_values (TREE_INT_CST_LOW);
!       compare_values (TREE_INT_CST_HIGH);
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_REAL_CST))
!     {
!       /* ???  No suitable compare routine available.  */
!       REAL_VALUE_TYPE r1 = TREE_REAL_CST (t1);
!       REAL_VALUE_TYPE r2 = TREE_REAL_CST (t2);
!       if (r1.cl != r2.cl
! 	  || r1.decimal != r2.decimal
! 	  || r1.sign != r2.sign
! 	  || r1.signalling != r2.signalling
! 	  || r1.canonical != r2.canonical
! 	  || r1.uexp != r2.uexp)
! 	return false;
!       for (unsigned i = 0; i < SIGSZ; ++i)
! 	if (r1.sig[i] != r2.sig[i])
! 	  return false;
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_FIXED_CST))
!     if (!fixed_compare (EQ_EXPR,
! 			TREE_FIXED_CST_PTR (t1), TREE_FIXED_CST_PTR (t2)))
!       return false;
! 
! 
!   /* We don't want to compare locations, so there is nothing to compare
!      for TS_DECL_MINIMAL.  */
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_DECL_COMMON))
!     {
!       compare_values (DECL_MODE);
!       compare_values (DECL_NONLOCAL);
!       compare_values (DECL_VIRTUAL_P);
!       compare_values (DECL_IGNORED_P);
!       compare_values (DECL_ABSTRACT);
!       compare_values (DECL_ARTIFICIAL);
!       compare_values (DECL_USER_ALIGN);
!       compare_values (DECL_PRESERVE_P);
!       compare_values (DECL_EXTERNAL);
!       compare_values (DECL_GIMPLE_REG_P);
!       compare_values (DECL_ALIGN);
!       if (code == LABEL_DECL)
! 	{
! 	  compare_values (DECL_ERROR_ISSUED);
! 	  compare_values (EH_LANDING_PAD_NR);
! 	  compare_values (LABEL_DECL_UID);
! 	}
!       else if (code == FIELD_DECL)
! 	{
! 	  compare_values (DECL_PACKED);
! 	  compare_values (DECL_NONADDRESSABLE_P);
! 	  compare_values (DECL_OFFSET_ALIGN);
! 	}
!       else if (code == VAR_DECL)
! 	{
! 	  compare_values (DECL_HAS_DEBUG_EXPR_P);
! 	  compare_values (DECL_NONLOCAL_FRAME);
! 	}
!       if (code == RESULT_DECL
! 	  || code == PARM_DECL
! 	  || code == VAR_DECL)
! 	{
! 	  compare_values (DECL_BY_REFERENCE);
! 	  if (code == VAR_DECL
! 	      || code == PARM_DECL)
! 	    compare_values (DECL_HAS_VALUE_EXPR_P);
! 	}
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_DECL_WRTL))
!     compare_values (DECL_REGISTER);
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_DECL_WITH_VIS))
!     {
!       compare_values (DECL_DEFER_OUTPUT);
!       compare_values (DECL_COMMON);
!       compare_values (DECL_DLLIMPORT_P);
!       compare_values (DECL_WEAK);
!       compare_values (DECL_SEEN_IN_BIND_EXPR_P);
!       compare_values (DECL_COMDAT);
!       compare_values (DECL_VISIBILITY);
!       compare_values (DECL_VISIBILITY_SPECIFIED);
!       if (code == VAR_DECL)
! 	{
! 	  compare_values (DECL_HARD_REGISTER);
! 	  compare_values (DECL_IN_TEXT_SECTION);
! 	  compare_values (DECL_IN_CONSTANT_POOL);
! 	  compare_values (DECL_TLS_MODEL);
! 	}
!       if (VAR_OR_FUNCTION_DECL_P (t1))
! 	compare_values (DECL_INIT_PRIORITY);
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_FUNCTION_DECL))
!     {
!       compare_values (DECL_BUILT_IN_CLASS);
!       compare_values (DECL_STATIC_CONSTRUCTOR);
!       compare_values (DECL_STATIC_DESTRUCTOR);
!       compare_values (DECL_UNINLINABLE);
!       compare_values (DECL_POSSIBLY_INLINED);
!       compare_values (DECL_IS_NOVOPS);
!       compare_values (DECL_IS_RETURNS_TWICE);
!       compare_values (DECL_IS_MALLOC);
!       compare_values (DECL_IS_OPERATOR_NEW);
!       compare_values (DECL_DECLARED_INLINE_P);
!       compare_values (DECL_STATIC_CHAIN);
!       compare_values (DECL_NO_INLINE_WARNING_P);
!       compare_values (DECL_NO_INSTRUMENT_FUNCTION_ENTRY_EXIT);
!       compare_values (DECL_NO_LIMIT_STACK);
!       compare_values (DECL_DISREGARD_INLINE_LIMITS);
!       compare_values (DECL_PURE_P);
!       compare_values (DECL_LOOPING_CONST_OR_PURE_P);
!       if (DECL_BUILT_IN_CLASS (t1) != NOT_BUILT_IN)
! 	compare_values (DECL_FUNCTION_CODE);
!       if (DECL_STATIC_DESTRUCTOR (t1))
! 	compare_values (DECL_FINI_PRIORITY);
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_TYPE_COMMON))
!     {
!       compare_values (TYPE_MODE);
!       compare_values (TYPE_STRING_FLAG);
!       compare_values (TYPE_NO_FORCE_BLK);
!       compare_values (TYPE_NEEDS_CONSTRUCTING);
!       if (RECORD_OR_UNION_TYPE_P (t1))
! 	compare_values (TYPE_TRANSPARENT_AGGR);
!       else if (code == ARRAY_TYPE)
! 	compare_values (TYPE_NONALIASED_COMPONENT);
!       compare_values (TYPE_PACKED);
!       compare_values (TYPE_RESTRICT);
!       compare_values (TYPE_CONTAINS_PLACEHOLDER_INTERNAL);
!       compare_values (TYPE_USER_ALIGN);
!       compare_values (TYPE_READONLY);
!       compare_values (TYPE_PRECISION);
!       compare_values (TYPE_ALIGN);
!       compare_values (TYPE_ALIAS_SET);
!     }
! 
!   /* We don't want to compare locations, so there is nothing to compare
!      for TS_EXP.  */
! 
!   /* BLOCKs are function local and we don't merge anything there, so
!      simply refuse to merge.  */
!   if (CODE_CONTAINS_STRUCT (code, TS_BLOCK))
!     return false;
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_TRANSLATION_UNIT_DECL))
!     if (strcmp (TRANSLATION_UNIT_LANGUAGE (t1),
! 		TRANSLATION_UNIT_LANGUAGE (t2)) != 0)
!       return false;
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_TARGET_OPTION))
!     if (memcmp (TREE_TARGET_OPTION (t1), TREE_TARGET_OPTION (t2),
! 		sizeof (struct cl_target_option)) != 0)
!       return false;
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_OPTIMIZATION))
!     if (memcmp (TREE_OPTIMIZATION (t1), TREE_OPTIMIZATION (t2),
! 		sizeof (struct cl_optimization)) != 0)
!       return false;
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_BINFO))
!     if (vec_safe_length (BINFO_BASE_ACCESSES (t1))
! 	!= vec_safe_length (BINFO_BASE_ACCESSES (t2)))
!       return false;
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_CONSTRUCTOR))
!     compare_values (CONSTRUCTOR_NELTS);
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_IDENTIFIER))
!     if (IDENTIFIER_LENGTH (t1) != IDENTIFIER_LENGTH (t2)
! 	|| memcmp (IDENTIFIER_POINTER (t1), IDENTIFIER_POINTER (t2),
! 		   IDENTIFIER_LENGTH (t1)) != 0)
!       return false;
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_STRING))
!     if (TREE_STRING_LENGTH (t1) != TREE_STRING_LENGTH (t2)
! 	|| memcmp (TREE_STRING_POINTER (t1), TREE_STRING_POINTER (t2),
! 		   TREE_STRING_LENGTH (t1)) != 0)
!       return false;
! 
! #undef compare_values
! 
! 
!   /* Compare pointer fields.  */
! 
!   /* Recurse.  Search & Replaced from DFS_write_tree_body.  */
! #define compare_tree_edges(E1, E2) \
!   do { \
!     if (!compare_tree_sccs_1 ((E1), (E2), map)) \
!       return false; \
!   } while (0)
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_TYPED))
!     {
!       if (code != IDENTIFIER_NODE)
! 	compare_tree_edges (TREE_TYPE (t1), TREE_TYPE (t2));
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_VECTOR))
!     {
!       unsigned i;
!       /* Note that the number of elements for EXPR has already been emitted
! 	 in EXPR's header (see streamer_write_tree_header).  */
!       for (i = 0; i < VECTOR_CST_NELTS (t1); ++i)
! 	compare_tree_edges (VECTOR_CST_ELT (t1, i), VECTOR_CST_ELT (t2, i));
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_COMPLEX))
!     {
!       compare_tree_edges (TREE_REALPART (t1), TREE_REALPART (t2));
!       compare_tree_edges (TREE_IMAGPART (t1), TREE_IMAGPART (t2));
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_DECL_MINIMAL))
!     {
!       compare_tree_edges (DECL_NAME (t1), DECL_NAME (t2));
!       /* ???  Global decls from different TUs have non-matching
! 	 TRANSLATION_UNIT_DECLs.  Only consider a small set of
! 	 decls equivalent, we should not end up merging others.  */
!       if ((code == TYPE_DECL
! 	   || code == NAMESPACE_DECL
! 	   || code == IMPORTED_DECL
! 	   || code == CONST_DECL
! 	   || (VAR_OR_FUNCTION_DECL_P (t1)
! 	       && (TREE_PUBLIC (t1) || DECL_EXTERNAL (t1))))
! 	  && DECL_FILE_SCOPE_P (t1) && DECL_FILE_SCOPE_P (t2))
! 	;
!       else
! 	compare_tree_edges (DECL_CONTEXT (t1), DECL_CONTEXT (t2));
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_DECL_COMMON))
!     {
!       compare_tree_edges (DECL_SIZE (t1), DECL_SIZE (t2));
!       compare_tree_edges (DECL_SIZE_UNIT (t1), DECL_SIZE_UNIT (t2));
!       compare_tree_edges (DECL_ATTRIBUTES (t1), DECL_ATTRIBUTES (t2));
!       if ((code == VAR_DECL
! 	   || code == PARM_DECL)
! 	  && DECL_HAS_VALUE_EXPR_P (t1))
! 	compare_tree_edges (DECL_VALUE_EXPR (t1), DECL_VALUE_EXPR (t2));
!       if (code == VAR_DECL
! 	  && DECL_HAS_DEBUG_EXPR_P (t1))
! 	compare_tree_edges (DECL_DEBUG_EXPR (t1), DECL_DEBUG_EXPR (t2));
!       /* LTO specific edges.  */
!       if (code != FUNCTION_DECL
! 	  && code != TRANSLATION_UNIT_DECL)
! 	compare_tree_edges (DECL_INITIAL (t1), DECL_INITIAL (t2));
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_DECL_NON_COMMON))
!     {
!       if (code == FUNCTION_DECL)
! 	{
! 	  tree a1, a2;
! 	  for (a1 = DECL_ARGUMENTS (t1), a2 = DECL_ARGUMENTS (t2);
! 	       a1 || a2;
! 	       a1 = TREE_CHAIN (a1), a2 = TREE_CHAIN (a2))
! 	    compare_tree_edges (a1, a2);
! 	  compare_tree_edges (DECL_RESULT (t1), DECL_RESULT (t2));
! 	}
!       else if (code == TYPE_DECL)
! 	compare_tree_edges (DECL_ORIGINAL_TYPE (t1), DECL_ORIGINAL_TYPE (t2));
!       compare_tree_edges (DECL_VINDEX (t1), DECL_VINDEX (t2));
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_DECL_WITH_VIS))
!     {
!       /* Make sure we don't inadvertently set the assembler name.  */
!       if (DECL_ASSEMBLER_NAME_SET_P (t1))
! 	compare_tree_edges (DECL_ASSEMBLER_NAME (t1),
! 			    DECL_ASSEMBLER_NAME (t2));
!       compare_tree_edges (DECL_SECTION_NAME (t1), DECL_SECTION_NAME (t2));
!       compare_tree_edges (DECL_COMDAT_GROUP (t1), DECL_COMDAT_GROUP (t2));
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_FIELD_DECL))
!     {
!       compare_tree_edges (DECL_FIELD_OFFSET (t1), DECL_FIELD_OFFSET (t2));
!       compare_tree_edges (DECL_BIT_FIELD_TYPE (t1), DECL_BIT_FIELD_TYPE (t2));
!       compare_tree_edges (DECL_BIT_FIELD_REPRESENTATIVE (t1),
! 			  DECL_BIT_FIELD_REPRESENTATIVE (t2));
!       compare_tree_edges (DECL_FIELD_BIT_OFFSET (t1),
! 			  DECL_FIELD_BIT_OFFSET (t2));
!       compare_tree_edges (DECL_FCONTEXT (t1), DECL_FCONTEXT (t2));
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_FUNCTION_DECL))
!     {
!       compare_tree_edges (DECL_FUNCTION_PERSONALITY (t1),
! 			  DECL_FUNCTION_PERSONALITY (t2));
!       compare_tree_edges (DECL_FUNCTION_SPECIFIC_TARGET (t1),
! 			  DECL_FUNCTION_SPECIFIC_TARGET (t2));
!       compare_tree_edges (DECL_FUNCTION_SPECIFIC_OPTIMIZATION (t1),
! 			  DECL_FUNCTION_SPECIFIC_OPTIMIZATION (t2));
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_TYPE_COMMON))
!     {
!       compare_tree_edges (TYPE_SIZE (t1), TYPE_SIZE (t2));
!       compare_tree_edges (TYPE_SIZE_UNIT (t1), TYPE_SIZE_UNIT (t2));
!       compare_tree_edges (TYPE_ATTRIBUTES (t1), TYPE_ATTRIBUTES (t2));
!       compare_tree_edges (TYPE_NAME (t1), TYPE_NAME (t2));
!       /* Do not compare TYPE_POINTER_TO or TYPE_REFERENCE_TO.  They will be
! 	 reconstructed during fixup.  */
!       /* Do not compare TYPE_NEXT_VARIANT, we reconstruct the variant lists
! 	 during fixup.  */
!       compare_tree_edges (TYPE_MAIN_VARIANT (t1), TYPE_MAIN_VARIANT (t2));
!       /* ???  Global types from different TUs have non-matching
! 	 TRANSLATION_UNIT_DECLs.  Still merge them if they are otherwise
! 	 equal.  */
!       if (TYPE_FILE_SCOPE_P (t1) && TYPE_FILE_SCOPE_P (t2))
! 	;
!       else
! 	compare_tree_edges (TYPE_CONTEXT (t1), TYPE_CONTEXT (t2));
!       /* TYPE_CANONICAL is re-computed during type merging, so do not
! 	 compare it here.  */
!       compare_tree_edges (TYPE_STUB_DECL (t1), TYPE_STUB_DECL (t2));
!     }
! 
!   if (CODE_CONTAINS_STRUCT (code, TS_TYPE_NON_COMMON))
!     {
!       if (code == ENUMERAL_TYPE)
! 	compare_tree_edges (TYPE_VALUES (t1), TYPE_VALUES (t2));
!       else if (code == ARRAY_TYPE)
! 	compare_tree_edges (TYPE_DOMAIN (t1), TYPE_DOMAIN (t2));
!       else if (RECORD_OR_UNION_TYPE_P (t1))
! 	{
! 	  tree f1, f2;
! 	  for (f1 = TYPE_FIELDS (t1), f2 = TYPE_FIELDS (t2);
! 	       f1 || f2;
! 	       f1 = TREE_CHAIN (f1), f2 = TREE_CHAIN (f2))
! 	    compare_tree_edges (f1, f2);
! 	  compare_tree_edges (TYPE_BINFO (t1), TYPE_BINFO (t2));
  	}
+       else if (code == FUNCTION_TYPE
+ 	       || code == METHOD_TYPE)
+ 	compare_tree_edges (TYPE_ARG_TYPES (t1), TYPE_ARG_TYPES (t2));
+       if (!POINTER_TYPE_P (t1))
+ 	compare_tree_edges (TYPE_MINVAL (t1), TYPE_MINVAL (t2));
+       compare_tree_edges (TYPE_MAXVAL (t1), TYPE_MAXVAL (t2));
      }
  
!   if (CODE_CONTAINS_STRUCT (code, TS_LIST))
      {
!       compare_tree_edges (TREE_PURPOSE (t1), TREE_PURPOSE (t2));
!       compare_tree_edges (TREE_VALUE (t1), TREE_VALUE (t2));
!       compare_tree_edges (TREE_CHAIN (t1), TREE_CHAIN (t2));
!     }
  
!   if (CODE_CONTAINS_STRUCT (code, TS_VEC))
!     for (int i = 0; i < TREE_VEC_LENGTH (t1); i++)
!       compare_tree_edges (TREE_VEC_ELT (t1, i), TREE_VEC_ELT (t2, i));
  
!   if (CODE_CONTAINS_STRUCT (code, TS_EXP))
!     {
!       for (int i = 0; i < TREE_OPERAND_LENGTH (t1); i++)
! 	compare_tree_edges (TREE_OPERAND (t1, i),
! 			    TREE_OPERAND (t2, i));
  
!       /* BLOCKs are function local and we don't merge anything there.  */
!       if (TREE_BLOCK (t1) || TREE_BLOCK (t2))
! 	return false;
!     }
  
!   if (CODE_CONTAINS_STRUCT (code, TS_BINFO))
!     {
!       unsigned i;
!       tree t;
!       /* Lengths have already been compared above.  */
!       FOR_EACH_VEC_ELT (*BINFO_BASE_BINFOS (t1), i, t)
! 	compare_tree_edges (t, BINFO_BASE_BINFO (t2, i));
!       FOR_EACH_VEC_SAFE_ELT (BINFO_BASE_ACCESSES (t1), i, t)
! 	compare_tree_edges (t, BINFO_BASE_ACCESS (t2, i));
!       compare_tree_edges (BINFO_OFFSET (t1), BINFO_OFFSET (t2));
!       compare_tree_edges (BINFO_VTABLE (t1), BINFO_VTABLE (t2));
!       compare_tree_edges (BINFO_VPTR_FIELD (t1), BINFO_VPTR_FIELD (t2));
!       compare_tree_edges (BINFO_INHERITANCE_CHAIN (t1),
! 			  BINFO_INHERITANCE_CHAIN (t2));
!       compare_tree_edges (BINFO_SUBVTT_INDEX (t1),
! 			  BINFO_SUBVTT_INDEX (t2));
!       compare_tree_edges (BINFO_VPTR_INDEX (t1),
! 			  BINFO_VPTR_INDEX (t2));
!     }
  
!   if (CODE_CONTAINS_STRUCT (code, TS_CONSTRUCTOR))
!     {
!       unsigned i;
!       tree index, value;
!       /* Lengths have already been compared above.  */
!       FOR_EACH_CONSTRUCTOR_ELT (CONSTRUCTOR_ELTS (t1), i, index, value)
! 	{
! 	  compare_tree_edges (index, CONSTRUCTOR_ELT (t2, i)->index);
! 	  compare_tree_edges (value, CONSTRUCTOR_ELT (t2, i)->value);
  	}
+     }
  
! #undef compare_tree_edges
! 
!   return true;
! }
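The cycle handling in compare_tree_sccs_1 — mark nodes of the second graph as visited and optimistically treat a revisit as equal so recursion over cyclic structures terminates — can be sketched under toy assumptions (a made-up `struct gnode` with a single outgoing edge; `visited` stands in for TREE_ASM_WRITTEN):

```c
/* Minimal sketch of lock-step graph comparison with optimistic
   cycle handling.  Hypothetical types, not GCC trees.  */
#include <assert.h>
#include <stddef.h>

struct gnode {
  int value;
  struct gnode *next;   /* one outgoing edge, may close a cycle */
  int visited;          /* stand-in for TREE_ASM_WRITTEN */
};

static int graphs_equal_1 (struct gnode *a, struct gnode *b)
{
  if (!a && !b)
    return 1;
  if (!a || !b)
    return 0;
  /* Reaching an already-visited node means we closed a cycle;
     optimistically report equality along this edge.  */
  if (b->visited)
    return 1;
  b->visited = 1;
  if (a->value != b->value)
    return 0;
  return graphs_equal_1 (a->next, b->next);
}
```

As in the real code, a caller would have to clear the visited marks afterwards before reusing the second graph in another comparison; that reset step is omitted from this sketch.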
! 
! /* Compare the tree scc SCC to the prevailing candidate PSCC, filling
!    out MAP if they are equal.  */
! 
! static bool
! compare_tree_sccs (tree_scc *pscc, tree_scc *scc,
! 		   tree *map)
! {
!   /* Assume SCC entry hashes are sorted by their cardinality, which
!      means we can simply take the first n-tuple of equal hashes
!      (which is recorded as entry_len) and do n SCC entry candidate
!      comparisons.  */
!   for (unsigned i = 0; i < pscc->entry_len; ++i)
!     {
!       tree *mapp = map;
!       num_scc_compare_collisions++;
!       if (compare_tree_sccs_1 (pscc->entries[0], scc->entries[i], &mapp))
  	{
! 	  /* Equal - no need to reset TREE_VISITED or TREE_ASM_WRITTEN
! 	     on the scc as all trees will be freed.  */
! 	  return true;
! 	}
!       /* Reset TREE_ASM_WRITTEN on scc for the next compare or in case
!          the SCC prevails.  */
!       for (unsigned j = 0; j < scc->len; ++j)
! 	TREE_ASM_WRITTEN (scc->entries[j]) = 0;
!     }
! 
!   return false;
! }
! 
! /* Try to unify the SCC with nodes FROM to FROM + LEN in CACHE and
!    hash value SCC_HASH with an already recorded SCC.  Return true if
!    that was successful, otherwise return false.  */
  
! static bool
! unify_scc (struct streamer_tree_cache_d *cache, unsigned from,
! 	   unsigned len, unsigned scc_entry_len, hashval_t scc_hash)
! {
!   bool unified_p = false;
!   tree_scc *scc
!     = (tree_scc *) alloca (sizeof (tree_scc) + (len - 1) * sizeof (tree));
!   scc->next = NULL;
!   scc->hash = scc_hash;
!   scc->len = len;
!   scc->entry_len = scc_entry_len;
!   for (unsigned i = 0; i < len; ++i)
!     {
!       tree t = streamer_tree_cache_get_tree (cache, from + i);
!       scc->entries[i] = t;
!       /* Do not merge SCCs with local entities inside them.  Also do
! 	 not merge TRANSLATION_UNIT_DECLs.  */
!       if (TREE_CODE (t) == TRANSLATION_UNIT_DECL
! 	  || (VAR_OR_FUNCTION_DECL_P (t)
! 	      && !(TREE_PUBLIC (t) || DECL_EXTERNAL (t)))
! 	  || TREE_CODE (t) == LABEL_DECL)
! 	{
! 	  /* Avoid doing any work for these cases and do not bother to
! 	     record the SCCs for further merging.  */
! 	  return false;
  	}
      }
  
!   /* Look for the list of candidate SCCs to compare against.  */
!   tree_scc **slot;
!   slot = tree_scc_hash.find_slot_with_hash (scc, scc_hash, INSERT);
!   if (*slot)
!     {
!       /* Try unifying against each candidate.  */
!       num_scc_compares++;
! 
!       /* Set TREE_VISITED on the scc so we can easily identify tree nodes
! 	 outside of the scc when following tree edges.  Make sure
! 	 that TREE_ASM_WRITTEN is unset so we can use it as a 2nd bit
! 	 to track whether we visited the SCC member during the compare.
! 	 We cannot use TREE_VISITED on the pscc members as the extended
! 	 scc and pscc can overlap.  */
!       for (unsigned i = 0; i < scc->len; ++i)
! 	{
! 	  TREE_VISITED (scc->entries[i]) = 1;
! 	  gcc_assert (!TREE_ASM_WRITTEN (scc->entries[i]));
! 	}
! 
!       tree *map = XALLOCAVEC (tree, 2 * len);
!       for (tree_scc *pscc = *slot; pscc; pscc = pscc->next)
! 	{
! 	  if (!compare_tree_sccs (pscc, scc, map))
! 	    continue;
! 
! 	  /* Found an equal SCC.  */
! 	  unified_p = true;
! 	  num_scc_compare_collisions--;
! 	  num_sccs_merged++;
! 	  total_scc_size_merged += len;
! 
! 	  /* Fixup the streamer cache with the prevailing nodes according
! 	     to the tree node mapping computed by compare_tree_sccs.  */
! 	  for (unsigned i = 0; i < len; ++i)
! 	    {
! 	      tree t = map[2*i+1];
! 	      enum tree_code code = TREE_CODE (t);
! 	      unsigned ix;
! 	      bool r;
! 	      /* IDENTIFIER_NODEs should be singletons and are merged by the
! 		 streamer.  The others should be singletons, too, and we
! 		 should not merge them in any way.  */
! 	      gcc_assert (code != TRANSLATION_UNIT_DECL
! 			  && code != IDENTIFIER_NODE
! 			  && !streamer_handle_as_builtin_p (t));
! 	      r = streamer_tree_cache_lookup (cache, t, &ix);
! 	      gcc_assert (r && ix >= from);
! 	      streamer_tree_cache_replace_tree (cache, map[2 * i], ix);
! 	      if (TYPE_P (t))
! 		num_merged_types++;
! 	    }
! 	  /* Free the tree nodes from the read SCC.  */
! 	  for (unsigned i = 0; i < len; ++i)
! 	    ggc_free (scc->entries[i]);
! 	  break;
! 	}
! 
!       /* Reset TREE_VISITED if we didn't unify the SCC with another.  */
!       if (!unified_p)
! 	for (unsigned i = 0; i < scc->len; ++i)
! 	  TREE_VISITED (scc->entries[i]) = 0;
!     }
! 
!   /* If we didn't unify it with any candidate, duplicate the relevant
!      pieces to permanent storage and link it into the chain.  */
!   if (!unified_p)
!     {
!       tree_scc *pscc
! 	= XOBNEWVAR (&tree_scc_hash_obstack, tree_scc, sizeof (tree_scc));
!       memcpy (pscc, scc, sizeof (tree_scc));
!       pscc->next = (*slot);
!       *slot = pscc;
      }
+   return unified_p;
  }
  
  
*************** lto_read_decls (struct lto_file_decl_dat
*** 2036,2044 ****
      {
        tree t;
        unsigned from = data_in->reader_cache->nodes.length ();
!       t = stream_read_tree (&ib_main, data_in);
!       gcc_assert (t && ib_main.p <= ib_main.len);
!       uniquify_nodes (data_in, from);
      }
  
    /* Read in lto_in_decl_state objects.  */
--- 2408,2530 ----
      {
        tree t;
        unsigned from = data_in->reader_cache->nodes.length ();
!       /* Read and uniquify SCCs as in the input stream.  */
!       enum LTO_tags tag = streamer_read_record_start (&ib_main);
!       if (tag == LTO_tree_scc)
! 	{
! 	  unsigned len_;
! 	  unsigned scc_entry_len;
! 	  hashval_t scc_hash = lto_input_scc (&ib_main, data_in, &len_,
! 					      &scc_entry_len);
! 	  unsigned len = data_in->reader_cache->nodes.length () - from;
! 	  gcc_assert (len == len_);
! 
! 	  total_scc_size += len;
! 	  num_sccs_read++;
! 
! 	  /* We have the special case of size-1 SCCs that are pre-merged
! 	     by means of identifier and string sharing for example.
! 	     ???  Maybe we should avoid streaming those as SCCs.  */
! 	  tree first = streamer_tree_cache_get_tree (data_in->reader_cache,
! 						     from);
! 	  if (len == 1
! 	      && (TREE_CODE (first) == IDENTIFIER_NODE
! 		  || TREE_CODE (first) == INTEGER_CST
! 		  || TREE_CODE (first) == TRANSLATION_UNIT_DECL
! 		  || streamer_handle_as_builtin_p (first)))
! 	    continue;
! 
! 	  /* Try to unify the SCC with already existing ones.  */
! 	  if (!flag_ltrans
! 	      && unify_scc (data_in->reader_cache, from,
! 			    len, scc_entry_len, scc_hash))
! 	    continue;
! 
! 	  /* Do remaining fixup tasks for prevailing nodes.  */
! 	  bool seen_type = false;
! 	  bool not_merged_type_same_scc = false;
! 	  bool not_merged_type_not_same_scc = false;
! 	  for (unsigned i = 0; i < len; ++i)
! 	    {
! 	      tree t = streamer_tree_cache_get_tree (data_in->reader_cache,
! 						     from + i);
! 	      /* For statistics, see if the old code would have merged
! 		 the type.  */
! 	      if (TYPE_P (t)
! 		  && (flag_lto_report || (flag_wpa && flag_lto_report_wpa)))
! 		{
! 		  tree newt = gimple_register_type (t);
! 		  if (newt != t)
! 		    {
! 		      num_not_merged_types++;
! 		      unsigned j;
! 		      /* Check if we can never merge the types because
! 			 they are in the same SCC and thus the old
! 			 code was broken.  */
! 		      for (j = 0; j < len; ++j)
! 			if (i != j
! 			    && streamer_tree_cache_get_tree
! 			         (data_in->reader_cache, from + j) == newt)
! 			  {
! 			    num_not_merged_types_in_same_scc++;
! 			    not_merged_type_same_scc = true;
! 			    break;
! 			  }
! 		      if (j == len)
! 			not_merged_type_not_same_scc = true;
! 		    }
! 		}
! 	      /* Reconstruct the type variant and pointer-to/reference-to
! 		 chains.  */
! 	      if (TYPE_P (t))
! 		{
! 		  seen_type = true;
! 		  num_prevailing_types++;
! 		  lto_fixup_prevailing_type (t);
! 		}
! 	      /* Compute the canonical type of all types.
! 		 ???  Should be able to assert that !TYPE_CANONICAL.  */
! 	      if (TYPE_P (t) && !TYPE_CANONICAL (t))
! 		TYPE_CANONICAL (t) = gimple_register_canonical_type (t);
! 	      /* Link shared INTEGER_CSTs into the TYPE_CACHED_VALUES of
! 		 their type, which is also a member of this SCC.  */
! 	      if (TREE_CODE (t) == INTEGER_CST
! 		  && !TREE_OVERFLOW (t))
! 		cache_integer_cst (t);
! 	      /* Register TYPE_DECLs with the debuginfo machinery.  */
! 	      if (!flag_wpa
! 		  && TREE_CODE (t) == TYPE_DECL)
! 		debug_hooks->type_decl (t, !DECL_FILE_SCOPE_P (t));
! 	      if (!flag_ltrans)
! 		{
! 		  /* Register variables and functions with the
! 		     symbol table.  */
! 		  if (TREE_CODE (t) == VAR_DECL)
! 		    lto_register_var_decl_in_symtab (data_in, t);
! 		  else if (TREE_CODE (t) == FUNCTION_DECL
! 			   && !DECL_BUILT_IN (t))
! 		    lto_register_function_decl_in_symtab (data_in, t);
! 		  /* Scan the tree for references to global functions or
! 		     variables and record those for later fixup.  */
! 		  maybe_remember_with_vars (t);
! 		}
! 	    }
! 	  if (not_merged_type_same_scc)
! 	    {
! 	      num_not_merged_types_in_same_scc_trees += len;
! 	      num_not_merged_types_trees += len;
! 	    }
! 	  else if (not_merged_type_not_same_scc)
! 	    num_not_merged_types_trees += len;
! 	  if (seen_type)
! 	    num_type_scc_trees += len;
! 	}
!       else
! 	{
! 	  /* Pickle stray references.  */
! 	  t = lto_input_tree_1 (&ib_main, data_in, tag, 0);
! 	  gcc_assert (t && data_in->reader_cache->nodes.length () == from);
! 	}
      }
  
    /* Read in lto_in_decl_state objects.  */
*************** read_cgraph_and_symbols (unsigned nfiles
*** 2898,2906 ****
    type_hash_cache = htab_create_ggc (512, tree_int_map_hash,
  				     tree_int_map_eq, NULL);
    type_pair_cache = XCNEWVEC (struct type_pair_d, GIMPLE_TYPE_PAIR_SIZE);
-   gimple_type_leader = ggc_alloc_cleared_vec_gimple_type_leader_entry_s
- 		        (GIMPLE_TYPE_LEADER_SIZE);
    gimple_types = htab_create_ggc (16381, gimple_type_hash, gimple_type_eq, 0);
  
    if (!quiet_flag)
      fprintf (stderr, "Reading object files:");
--- 3384,3392 ----
    type_hash_cache = htab_create_ggc (512, tree_int_map_hash,
  				     tree_int_map_eq, NULL);
    type_pair_cache = XCNEWVEC (struct type_pair_d, GIMPLE_TYPE_PAIR_SIZE);
    gimple_types = htab_create_ggc (16381, gimple_type_hash, gimple_type_eq, 0);
+   gcc_obstack_init (&tree_scc_hash_obstack);
+   tree_scc_hash.create (4096);
  
    if (!quiet_flag)
      fprintf (stderr, "Reading object files:");
*************** read_cgraph_and_symbols (unsigned nfiles
*** 2933,2939 ****
        lto_obj_file_close (current_lto_file);
        free (current_lto_file);
        current_lto_file = NULL;
-       ggc_collect ();
      }
  
    lto_flatten_files (decl_data, count, last_file_ix);
--- 3419,3424 ----
*************** read_cgraph_and_symbols (unsigned nfiles
*** 2955,2961 ****
    type_hash_cache = NULL;
    free (type_pair_cache);
    type_pair_cache = NULL;
!   gimple_type_leader = NULL;
    free_gimple_type_tables ();
    ggc_collect ();
  
--- 3440,3447 ----
    type_hash_cache = NULL;
    free (type_pair_cache);
    type_pair_cache = NULL;
!   tree_scc_hash.dispose ();
!   obstack_free (&tree_scc_hash_obstack, NULL);
    free_gimple_type_tables ();
    ggc_collect ();
  
*************** print_lto_report_1 (void)
*** 3121,3147 ****
    const char *pfx = (flag_lto) ? "LTO" : (flag_wpa) ? "WPA" : "LTRANS";
    fprintf (stderr, "%s statistics\n", pfx);
  
!   if (gimple_types)
!     fprintf (stderr, "[%s] GIMPLE type table: size %ld, %ld elements, "
! 	     "%ld searches, %ld collisions (ratio: %f)\n", pfx,
! 	     (long) htab_size (gimple_types),
! 	     (long) htab_elements (gimple_types),
! 	     (long) gimple_types->searches,
! 	     (long) gimple_types->collisions,
! 	     htab_collisions (gimple_types));
!   else
!     fprintf (stderr, "[%s] GIMPLE type table is empty\n", pfx);
!   if (type_hash_cache)
!     fprintf (stderr, "[%s] GIMPLE type hash cache table: size %ld, %ld elements, "
! 	     "%ld searches, %ld collisions (ratio: %f)\n", pfx,
! 	     (long) htab_size (type_hash_cache),
! 	     (long) htab_elements (type_hash_cache),
! 	     (long) type_hash_cache->searches,
! 	     (long) type_hash_cache->collisions,
! 	     htab_collisions (type_hash_cache));
!   else
!     fprintf (stderr, "[%s] GIMPLE type hash cache table is empty\n", pfx);
!   fprintf (stderr, "[%s] Merged %lu types\n", pfx, num_merged_types);
  
    print_gimple_types_stats (pfx);
    print_lto_report (pfx);
--- 3607,3655 ----
    const char *pfx = (flag_lto) ? "LTO" : (flag_wpa) ? "WPA" : "LTRANS";
    fprintf (stderr, "%s statistics\n", pfx);
  
!   fprintf (stderr, "[%s] read %lu SCCs of average size %f\n",
! 	   pfx, num_sccs_read, total_scc_size / (double)num_sccs_read);
!   fprintf (stderr, "[%s] %lu tree bodies read in total\n", pfx, total_scc_size);
!   if (flag_wpa && tree_scc_hash.is_created ())
!     {
!       fprintf (stderr, "[%s] tree SCC table: size %ld, %ld elements, "
! 	       "collision ratio: %f\n", pfx,
! 	       (long) tree_scc_hash.size (),
! 	       (long) tree_scc_hash.elements (),
! 	       tree_scc_hash.collisions ());
!       hash_table<tree_scc_hasher>::iterator hiter;
!       tree_scc *scc, *max_scc = NULL;
!       unsigned max_length = 0;
!       FOR_EACH_HASH_TABLE_ELEMENT (tree_scc_hash, scc, tree_scc *, hiter)
! 	{
! 	  unsigned length = 0;
! 	  tree_scc *s = scc;
! 	  for (; s; s = s->next)
! 	    length++;
! 	  if (length > max_length)
! 	    {
! 	      max_length = length;
! 	      max_scc = scc;
! 	    }
! 	}
!       fprintf (stderr, "[%s] tree SCC max chain length %u (size %u)\n",
! 	       pfx, max_length, max_scc->len);
!       fprintf (stderr, "[%s] Compared %lu SCCs, %lu collisions (%f)\n", pfx,
! 	       num_scc_compares, num_scc_compare_collisions,
! 	       num_scc_compare_collisions / (double) num_scc_compares);
!       fprintf (stderr, "[%s] Merged %lu SCCs\n", pfx, num_sccs_merged);
!       fprintf (stderr, "[%s] Merged %lu tree bodies\n", pfx,
! 	       total_scc_size_merged);
!       fprintf (stderr, "[%s] Merged %lu types\n", pfx, num_merged_types);
!       fprintf (stderr, "[%s] %lu types prevailed (%lu associated trees)\n",
! 	       pfx, num_prevailing_types, num_type_scc_trees);
!       fprintf (stderr, "[%s] Old merging code merges an additional %lu types"
! 	       " of which %lu are in the same SCC with their "
! 	       "prevailing variant (%lu and %lu associated trees)\n",
! 	       pfx, num_not_merged_types, num_not_merged_types_in_same_scc,
! 	       num_not_merged_types_trees,
! 	       num_not_merged_types_in_same_scc_trees);
!     }
  
    print_gimple_types_stats (pfx);
    print_lto_report (pfx);
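The core of unify_scc above can be illustrated with a minimal, self-contained sketch (all names and types here are illustrative stand-ins, not GCC's): candidate SCCs with equal hash form a chain, and a newly read SCC either matches an existing candidate bit-for-bit and is merged, or is recorded as the new prevailing copy.

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Illustrative stand-in for an SCC: its hash, its length and a blob
   of "tree" payload (here just ints).  */
struct scc_sketch {
  struct scc_sketch *next;   /* chain of candidates with equal hash */
  unsigned hash;
  unsigned len;
  int entries[8];
};

#define SKETCH_NBUCKETS 64
static struct scc_sketch *sketch_buckets[SKETCH_NBUCKETS];

/* Return 1 if ENTRIES/LEN unified with an already-recorded SCC
   (mirroring unify_scc returning true), 0 if it became the new
   prevailing copy and was linked into the candidate chain.  */
static int
unify_scc_sketch (const int *entries, unsigned len, unsigned hash)
{
  struct scc_sketch **slot = &sketch_buckets[hash % SKETCH_NBUCKETS];
  for (struct scc_sketch *p = *slot; p; p = p->next)
    if (p->hash == hash
        && p->len == len
        && memcmp (p->entries, entries, len * sizeof (int)) == 0)
      return 1;   /* Equal SCC found; the caller would drop its copy.  */

  /* No candidate matched; record this SCC as prevailing.  */
  struct scc_sketch *s = calloc (1, sizeof *s);
  s->next = *slot;
  s->hash = hash;
  s->len = len;
  memcpy (s->entries, entries, len * sizeof (int));
  *slot = s;
  return 0;
}
```

The real comparison walks the SCC members and their outgoing tree edges via compare_tree_sccs rather than doing a flat memcmp, but the control flow around the candidate chain is the same.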
Index: trunk/gcc/lto/Make-lang.in
===================================================================
*** trunk.orig/gcc/lto/Make-lang.in	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/lto/Make-lang.in	2013-06-13 14:35:53.257108072 +0200
*************** lto/lto.o: lto/lto.c $(CONFIG_H) $(SYSTE
*** 85,91 ****
  	langhooks.h $(VEC_H) $(BITMAP_H) pointer-set.h $(IPA_PROP_H) \
  	$(COMMON_H) debug.h $(GIMPLE_H) $(LTO_H) $(LTO_TREE_H) \
  	$(LTO_TAGS_H) $(LTO_STREAMER_H) $(SPLAY_TREE_H) gt-lto-lto.h \
! 	$(TREE_STREAMER_H) lto/lto-partition.h
  lto/lto-partition.o: lto/lto-partition.c $(CONFIG_H) $(SYSTEM_H) coretypes.h \
  	toplev.h $(TREE_H) $(TM_H) \
  	$(CGRAPH_H) $(TIMEVAR_H) \
--- 85,91 ----
  	langhooks.h $(VEC_H) $(BITMAP_H) pointer-set.h $(IPA_PROP_H) \
  	$(COMMON_H) debug.h $(GIMPLE_H) $(LTO_H) $(LTO_TREE_H) \
  	$(LTO_TAGS_H) $(LTO_STREAMER_H) $(SPLAY_TREE_H) gt-lto-lto.h \
! 	$(TREE_STREAMER_H) $(DATA_STREAMER_H) lto/lto-partition.h
  lto/lto-partition.o: lto/lto-partition.c $(CONFIG_H) $(SYSTEM_H) coretypes.h \
  	toplev.h $(TREE_H) $(TM_H) \
  	$(CGRAPH_H) $(TIMEVAR_H) \
Index: trunk/gcc/tree-streamer-out.c
===================================================================
*** trunk.orig/gcc/tree-streamer-out.c	2013-06-13 14:35:48.000000000 +0200
--- trunk/gcc/tree-streamer-out.c	2013-06-13 15:35:37.231043943 +0200
*************** pack_ts_base_value_fields (struct bitpac
*** 95,101 ****
      bp_pack_value (bp, TYPE_ARTIFICIAL (expr), 1);
    else
      bp_pack_value (bp, TREE_NO_WARNING (expr), 1);
-   bp_pack_value (bp, TREE_USED (expr), 1);
    bp_pack_value (bp, TREE_NOTHROW (expr), 1);
    bp_pack_value (bp, TREE_STATIC (expr), 1);
    bp_pack_value (bp, TREE_PRIVATE (expr), 1);
--- 95,100 ----
*************** streamer_write_chain (struct output_bloc
*** 491,499 ****
  	 to the global decls section as we do not want to have them
  	 enter decl merging.  This is, of course, only for the call
  	 for streaming BLOCK_VARS, but other callers are safe.  */
        if (VAR_OR_FUNCTION_DECL_P (t)
  	  && DECL_EXTERNAL (t))
! 	stream_write_tree_shallow_non_ref (ob, t, ref_p);
        else
  	stream_write_tree (ob, t, ref_p);
  
--- 490,499 ----
  	 to the global decls section as we do not want to have them
  	 enter decl merging.  This is, of course, only for the call
  	 for streaming BLOCK_VARS, but other callers are safe.  */
+       /* ???  FIXME wrt SCC streaming.  Drop these for now.  */
        if (VAR_OR_FUNCTION_DECL_P (t)
  	  && DECL_EXTERNAL (t))
! 	; /* stream_write_tree_shallow_non_ref (ob, t, ref_p); */
        else
  	stream_write_tree (ob, t, ref_p);
  
*************** write_ts_list_tree_pointers (struct outp
*** 716,722 ****
  {
    stream_write_tree (ob, TREE_PURPOSE (expr), ref_p);
    stream_write_tree (ob, TREE_VALUE (expr), ref_p);
!   streamer_write_chain (ob, TREE_CHAIN (expr), ref_p);
  }
  
  
--- 716,722 ----
  {
    stream_write_tree (ob, TREE_PURPOSE (expr), ref_p);
    stream_write_tree (ob, TREE_VALUE (expr), ref_p);
!   stream_write_tree (ob, TREE_CHAIN (expr), ref_p);
  }
  
  
*************** write_ts_constructor_tree_pointers (stru
*** 834,839 ****
--- 834,841 ----
      }
  }
  
+ extern unsigned long num_tree_bodies_written;
+ 
  /* Write all pointer fields in EXPR to output block OB.  If REF_P is true,
     the leaves of EXPR are emitted as references.  */
  
*************** streamer_write_tree_body (struct output_
*** 842,847 ****
--- 844,851 ----
  {
    enum tree_code code;
  
+   num_tree_bodies_written++;
+ 
    code = TREE_CODE (expr);
  
    if (CODE_CONTAINS_STRUCT (code, TS_TYPED))
Index: trunk/gcc/tree-streamer.c
===================================================================
*** trunk.orig/gcc/tree-streamer.c	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/tree-streamer.c	2013-06-13 14:35:53.258108084 +0200
*************** streamer_check_handled_ts_structures (vo
*** 92,107 ****
  
  static void
  streamer_tree_cache_add_to_node_array (struct streamer_tree_cache_d *cache,
! 				       unsigned ix, tree t)
  {
    /* Make sure we're either replacing an old element or
       appending consecutively.  */
    gcc_assert (ix <= cache->nodes.length ());
  
    if (ix == cache->nodes.length ())
!     cache->nodes.safe_push (t);
    else
!     cache->nodes[ix] = t;
  }
  
  
--- 92,115 ----
  
  static void
  streamer_tree_cache_add_to_node_array (struct streamer_tree_cache_d *cache,
! 				       unsigned ix, tree t, hashval_t hash)
  {
    /* Make sure we're either replacing an old element or
       appending consecutively.  */
    gcc_assert (ix <= cache->nodes.length ());
  
    if (ix == cache->nodes.length ())
!     {
!       cache->nodes.safe_push (t);
!       if (cache->hashes.exists ())
! 	cache->hashes.safe_push (hash);
!     }
    else
!     {
!       cache->nodes[ix] = t;
!       if (cache->hashes.exists ())
! 	cache->hashes[ix] = hash;
!     }
  }
  
  
*************** streamer_tree_cache_add_to_node_array (s
*** 117,123 ****
  
  static bool
  streamer_tree_cache_insert_1 (struct streamer_tree_cache_d *cache,
! 			      tree t, unsigned *ix_p,
  			      bool insert_at_next_slot_p)
  {
    void **slot;
--- 125,131 ----
  
  static bool
  streamer_tree_cache_insert_1 (struct streamer_tree_cache_d *cache,
! 			      tree t, hashval_t hash, unsigned *ix_p,
  			      bool insert_at_next_slot_p)
  {
    void **slot;
*************** streamer_tree_cache_insert_1 (struct str
*** 136,142 ****
  	ix = *ix_p;
         *slot = (void *)(size_t) (ix + 1);
  
!       streamer_tree_cache_add_to_node_array (cache, ix, t);
  
        /* Indicate that the item was not present in the cache.  */
        existed_p = false;
--- 144,150 ----
  	ix = *ix_p;
         *slot = (void *)(size_t) (ix + 1);
  
!       streamer_tree_cache_add_to_node_array (cache, ix, t, hash);
  
        /* Indicate that the item was not present in the cache.  */
        existed_p = false;
*************** streamer_tree_cache_insert_1 (struct str
*** 151,157 ****
  	     location, and ENTRY->TO does not match *IX_P, add T to
  	     the requested location slot.  */
  	  ix = *ix_p;
! 	  streamer_tree_cache_add_to_node_array (cache, ix, t);
  	  *slot = (void *)(size_t) (ix + 1);
  	}
  
--- 159,165 ----
  	     location, and ENTRY->TO does not match *IX_P, add T to
  	     the requested location slot.  */
  	  ix = *ix_p;
! 	  streamer_tree_cache_add_to_node_array (cache, ix, t, hash);
  	  *slot = (void *)(size_t) (ix + 1);
  	}
  
*************** streamer_tree_cache_insert_1 (struct str
*** 174,203 ****
  
  bool
  streamer_tree_cache_insert (struct streamer_tree_cache_d *cache, tree t,
! 			    unsigned *ix_p)
  {
!   return streamer_tree_cache_insert_1 (cache, t, ix_p, true);
  }
  
  
! /* Insert tree node T in CACHE at slot IX.  If T already
!    existed in the cache return true.  Otherwise, return false.  */
  
! bool
! streamer_tree_cache_insert_at (struct streamer_tree_cache_d *cache,
! 			       tree t, unsigned ix)
  {
!   return streamer_tree_cache_insert_1 (cache, t, &ix, false);
  }
  
  
  /* Appends tree node T to CACHE, even if T already existed in it.  */
  
  void
! streamer_tree_cache_append (struct streamer_tree_cache_d *cache, tree t)
  {
    unsigned ix = cache->nodes.length ();
!   streamer_tree_cache_insert_1 (cache, t, &ix, false);
  }
  
  /* Return true if tree node T exists in CACHE, otherwise false.  If IX_P is
--- 182,214 ----
  
  bool
  streamer_tree_cache_insert (struct streamer_tree_cache_d *cache, tree t,
! 			    hashval_t hash, unsigned *ix_p)
  {
!   return streamer_tree_cache_insert_1 (cache, t, hash, ix_p, true);
  }
  
  
! /* Replace the tree node with T in CACHE at slot IX.  */
  
! void
! streamer_tree_cache_replace_tree (struct streamer_tree_cache_d *cache,
! 				  tree t, unsigned ix)
  {
!   hashval_t hash = 0;
!   if (cache->hashes.exists ())
!     hash = streamer_tree_cache_get_hash (cache, ix);
!   streamer_tree_cache_insert_1 (cache, t, hash, &ix, false);
  }
  
  
  /* Appends tree node T to CACHE, even if T already existed in it.  */
  
  void
! streamer_tree_cache_append (struct streamer_tree_cache_d *cache,
! 			    tree t, hashval_t hash)
  {
    unsigned ix = cache->nodes.length ();
!   streamer_tree_cache_insert_1 (cache, t, hash, &ix, false);
  }
  
  /* Return true if tree node T exists in CACHE, otherwise false.  If IX_P is
*************** record_common_node (struct streamer_tree
*** 257,263 ****
    if (!node)
      node = error_mark_node;
  
!   streamer_tree_cache_append (cache, node);
  
    if (POINTER_TYPE_P (node)
        || TREE_CODE (node) == COMPLEX_TYPE
--- 268,277 ----
    if (!node)
      node = error_mark_node;
  
!   /* ???  FIXME, devise a better hash value.  But the hash needs to be equal
!      for all frontend and lto1 invocations.  So just use the position
!      in the cache as hash value.  */
!   streamer_tree_cache_append (cache, node, cache->nodes.length ());
  
    if (POINTER_TYPE_P (node)
        || TREE_CODE (node) == COMPLEX_TYPE
*************** preload_common_nodes (struct streamer_tr
*** 305,317 ****
  /* Create a cache of pickled nodes.  */
  
  struct streamer_tree_cache_d *
! streamer_tree_cache_create (void)
  {
    struct streamer_tree_cache_d *cache;
  
    cache = XCNEW (struct streamer_tree_cache_d);
  
    cache->node_map = pointer_map_create ();
  
    /* Load all the well-known tree nodes that are always created by
       the compiler on startup.  This prevents writing them out
--- 319,334 ----
  /* Create a cache of pickled nodes.  */
  
  struct streamer_tree_cache_d *
! streamer_tree_cache_create (bool with_hashes)
  {
    struct streamer_tree_cache_d *cache;
  
    cache = XCNEW (struct streamer_tree_cache_d);
  
    cache->node_map = pointer_map_create ();
+   cache->nodes.create (165);
+   if (with_hashes)
+     cache->hashes.create (165);
  
    /* Load all the well-known tree nodes that are always created by
       the compiler on startup.  This prevents writing them out
*************** streamer_tree_cache_delete (struct strea
*** 332,336 ****
--- 349,354 ----
  
    pointer_map_destroy (c->node_map);
    c->nodes.release ();
+   c->hashes.release ();
    free (c);
  }
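The cache change above keeps node hashes in a vector parallel to the nodes themselves, so that replacing a slot with the prevailing node preserves the hash computed at compile time. A minimal sketch of that invariant, with fixed-size arrays standing in for the vec<> types:

```c
#include <assert.h>

#define CACHE_MAX 128

/* Stand-ins for streamer_tree_cache_d's parallel vectors:
   nodes for vec<tree>, hashes for vec<hashval_t>.  */
struct cache_sketch {
  const char *nodes[CACHE_MAX];
  unsigned hashes[CACHE_MAX];
  unsigned len;
};

/* Append NODE with its streamed HASH, as streamer_tree_cache_append
   does when both vectors exist.  */
static void
cache_append (struct cache_sketch *c, const char *node, unsigned hash)
{
  c->nodes[c->len] = node;
  c->hashes[c->len] = hash;
  c->len++;
}

/* Mirror of streamer_tree_cache_replace_tree: install the prevailing
   node at IX but keep the hash recorded when the slot was filled.  */
static void
cache_replace (struct cache_sketch *c, const char *node, unsigned ix)
{
  c->nodes[ix] = node;   /* c->hashes[ix] is deliberately untouched.  */
}
```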
Index: trunk/gcc/tree-streamer.h
===================================================================
*** trunk.orig/gcc/tree-streamer.h	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/tree-streamer.h	2013-06-13 14:35:53.258108084 +0200
*************** along with GCC; see the file COPYING3.
*** 43,48 ****
--- 43,49 ----
       T.  The reconstructed T is inserted in some array so that when
       the reference index for T is found in the input stream, it can be
       used to look up into the array to get the reconstructed T.  */
+ 
  struct streamer_tree_cache_d
  {
    /* The mapping between tree nodes and slots into the nodes array.  */
*************** struct streamer_tree_cache_d
*** 50,55 ****
--- 51,58 ----
  
    /* The nodes pickled so far.  */
    vec<tree> nodes;
+   /* The node hashes (if available).  */
+   vec<hashval_t> hashes;
  };
  
  /* Return true if tree node EXPR should be streamed as a builtin.  For
*************** tree streamer_alloc_tree (struct lto_inp
*** 71,77 ****
  void streamer_read_tree_body (struct lto_input_block *, struct data_in *, tree);
  tree streamer_get_pickled_tree (struct lto_input_block *, struct data_in *);
  tree streamer_get_builtin_tree (struct lto_input_block *, struct data_in *);
- tree streamer_read_integer_cst (struct lto_input_block *, struct data_in *);
  struct bitpack_d streamer_read_tree_bitfields (struct lto_input_block *,
  					       struct data_in *, tree);
  
--- 74,79 ----
*************** void streamer_write_builtin (struct outp
*** 89,110 ****
  /* In tree-streamer.c.  */
  void streamer_check_handled_ts_structures (void);
  bool streamer_tree_cache_insert (struct streamer_tree_cache_d *, tree,
! 				 unsigned *);
! bool streamer_tree_cache_insert_at (struct streamer_tree_cache_d *, tree,
! 				    unsigned);
! void streamer_tree_cache_append (struct streamer_tree_cache_d *, tree);
  bool streamer_tree_cache_lookup (struct streamer_tree_cache_d *, tree,
  				 unsigned *);
! struct streamer_tree_cache_d *streamer_tree_cache_create (void);
  void streamer_tree_cache_delete (struct streamer_tree_cache_d *);
  
  /* Return the tree node at slot IX in CACHE.  */
  
  static inline tree
! streamer_tree_cache_get (struct streamer_tree_cache_d *cache, unsigned ix)
  {
    return cache->nodes[ix];
  }
  
  
  #endif  /* GCC_TREE_STREAMER_H  */
--- 91,121 ----
  /* In tree-streamer.c.  */
  void streamer_check_handled_ts_structures (void);
  bool streamer_tree_cache_insert (struct streamer_tree_cache_d *, tree,
! 				 hashval_t, unsigned *);
! void streamer_tree_cache_replace_tree (struct streamer_tree_cache_d *, tree,
! 				       unsigned);
! void streamer_tree_cache_append (struct streamer_tree_cache_d *, tree,
! 				 hashval_t);
  bool streamer_tree_cache_lookup (struct streamer_tree_cache_d *, tree,
  				 unsigned *);
! struct streamer_tree_cache_d *streamer_tree_cache_create (bool);
  void streamer_tree_cache_delete (struct streamer_tree_cache_d *);
  
  /* Return the tree node at slot IX in CACHE.  */
  
  static inline tree
! streamer_tree_cache_get_tree (struct streamer_tree_cache_d *cache, unsigned ix)
  {
    return cache->nodes[ix];
  }
  
+ /* Return the tree hash value at slot IX in CACHE.  */
+ 
+ static inline hashval_t
+ streamer_tree_cache_get_hash (struct streamer_tree_cache_d *cache, unsigned ix)
+ {
+   return cache->hashes[ix];
+ }
+ 
  
  #endif  /* GCC_TREE_STREAMER_H  */
Index: trunk/gcc/lto-streamer.c
===================================================================
*** trunk.orig/gcc/lto-streamer.c	2013-06-13 14:35:48.000000000 +0200
--- trunk/gcc/lto-streamer.c	2013-06-13 14:35:53.258108084 +0200
*************** lto_get_section_name (int section_type,
*** 176,181 ****
--- 176,185 ----
    return concat (LTO_SECTION_NAME_PREFIX, sep, add, post, NULL);
  }
  
+ unsigned long num_tree_bodies_written = 0;
+ unsigned long num_trees_output = 0;
+ unsigned long num_grefs_output = 0;
+ unsigned long num_lrefs_output = 0;
  
  /* Show various memory usage statistics related to LTO.  */
  
*************** print_lto_report (const char *s)
*** 227,232 ****
--- 231,245 ----
  	       HOST_WIDE_INT_PRINT_UNSIGNED "\n", s,
  	       lto_stats.num_output_symtab_nodes);
  
+       fprintf (stderr, "[%s] # of output trees: %lu\n",
+ 	       s, num_trees_output);
+       fprintf (stderr, "[%s] # of output references to globals: %lu\n",
+ 	       s, num_grefs_output);
+       fprintf (stderr, "[%s] # of output local references: %lu\n",
+ 	       s, num_lrefs_output);
+       fprintf (stderr, "[%s] # of output tree bodies: %lu\n",
+ 	       s, num_tree_bodies_written);
+ 
        fprintf (stderr, "[%s] # callgraph partitions: "
  	       HOST_WIDE_INT_PRINT_UNSIGNED "\n", s,
  	       lto_stats.num_cgraph_partitions);
Index: trunk/gcc/tree.c
===================================================================
*** trunk.orig/gcc/tree.c	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/tree.c	2013-06-13 14:35:53.261108120 +0200
*************** build_int_cst_wide (tree type, unsigned
*** 1266,1271 ****
--- 1266,1364 ----
    return t;
  }
  
+ void
+ cache_integer_cst (tree t)
+ {
+   tree type = TREE_TYPE (t);
+   HOST_WIDE_INT hi = TREE_INT_CST_HIGH (t);
+   unsigned HOST_WIDE_INT low = TREE_INT_CST_LOW (t);
+   int ix = -1;
+   int limit = 0;
+ 
+   gcc_assert (!TREE_OVERFLOW (t));
+ 
+   switch (TREE_CODE (type))
+     {
+     case NULLPTR_TYPE:
+       gcc_assert (hi == 0 && low == 0);
+       /* Fallthru.  */
+ 
+     case POINTER_TYPE:
+     case REFERENCE_TYPE:
+       /* Cache NULL pointer.  */
+       if (!hi && !low)
+ 	{
+ 	  limit = 1;
+ 	  ix = 0;
+ 	}
+       break;
+ 
+     case BOOLEAN_TYPE:
+       /* Cache false or true.  */
+       limit = 2;
+       if (!hi && low < 2)
+ 	ix = low;
+       break;
+ 
+     case INTEGER_TYPE:
+     case OFFSET_TYPE:
+       if (TYPE_UNSIGNED (type))
+ 	{
+ 	  /* Cache 0..N */
+ 	  limit = INTEGER_SHARE_LIMIT;
+ 	  if (!hi && low < (unsigned HOST_WIDE_INT)INTEGER_SHARE_LIMIT)
+ 	    ix = low;
+ 	}
+       else
+ 	{
+ 	  /* Cache -1..N */
+ 	  limit = INTEGER_SHARE_LIMIT + 1;
+ 	  if (!hi && low < (unsigned HOST_WIDE_INT)INTEGER_SHARE_LIMIT)
+ 	    ix = low + 1;
+ 	  else if (hi == -1 && low == -(unsigned HOST_WIDE_INT)1)
+ 	    ix = 0;
+ 	}
+       break;
+ 
+     case ENUMERAL_TYPE:
+       break;
+ 
+     default:
+       gcc_unreachable ();
+     }
+ 
+   if (ix >= 0)
+     {
+       /* Look for it in the type's vector of small shared ints.  */
+       if (!TYPE_CACHED_VALUES_P (type))
+ 	{
+ 	  TYPE_CACHED_VALUES_P (type) = 1;
+ 	  TYPE_CACHED_VALUES (type) = make_tree_vec (limit);
+ 	}
+ 
+       gcc_assert (TREE_VEC_ELT (TYPE_CACHED_VALUES (type), ix) == NULL_TREE);
+       TREE_VEC_ELT (TYPE_CACHED_VALUES (type), ix) = t;
+     }
+   else
+     {
+       /* Use the cache of larger shared ints.  */
+       void **slot;
+ 
+       slot = htab_find_slot (int_cst_hash_table, t, INSERT);
+       /* If there is already an entry for the number, verify it's
+          the same.  */
+       if (*slot)
+ 	{
+ 	  gcc_assert (TREE_INT_CST_LOW ((tree)*slot) == low
+ 		      && TREE_INT_CST_HIGH ((tree)*slot) == hi);
+ 	  return;
+ 	}
+       /* Otherwise insert this one into the hash table.  */
+       *slot = t;
+     }
+ }
+ 
+ 
  /* Builds an integer constant in TYPE such that lowest BITS bits are ones
     and the rest are zeros.  */
  
Index: trunk/gcc/tree.h
===================================================================
*** trunk.orig/gcc/tree.h	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/tree.h	2013-06-13 14:35:53.262108132 +0200
*************** extern const_tree strip_invariant_refs (
*** 5637,5642 ****
--- 5637,5643 ----
  extern tree lhd_gcc_personality (void);
  extern void assign_assembler_name_if_neeeded (tree);
  extern void warn_deprecated_use (tree, tree);
+ extern void cache_integer_cst (tree);
  
  
  /* In cgraph.c */
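The slot computation in cache_integer_cst's INTEGER_TYPE/OFFSET_TYPE case can be sketched in isolation (the HOST_WIDE_INT high/low pair is collapsed into a plain long and INTEGER_SHARE_LIMIT is a made-up constant here): unsigned types cache 0..N at their own index, signed types cache -1 at slot 0 and value v at slot v + 1.

```c
#include <assert.h>

/* Simplified stand-in for INTEGER_SHARE_LIMIT.  */
#define SHARE_LIMIT 251

/* Return the slot in the per-type small shared-int cache for value V,
   or -1 if V is not cached there (and would go to the hash table of
   larger shared ints instead).  */
static int
small_int_cache_ix (long v, int is_unsigned)
{
  if (is_unsigned)
    /* Cache 0..N at ix == value.  */
    return (v >= 0 && v < SHARE_LIMIT) ? (int) v : -1;
  /* Cache -1..N: -1 at ix 0, value v at ix v + 1.  */
  if (v == -1)
    return 0;
  return (v >= 0 && v < SHARE_LIMIT) ? (int) v + 1 : -1;
}
```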
Index: trunk/gcc/lto-symtab.c
===================================================================
*** trunk.orig/gcc/lto-symtab.c	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/lto-symtab.c	2013-06-13 14:35:53.262108132 +0200
*************** lto_varpool_replace_node (struct varpool
*** 96,104 ****
  
    ipa_clone_referring ((symtab_node)prevailing_node, &vnode->symbol.ref_list);
  
-   /* Be sure we can garbage collect the initializer.  */
-   if (DECL_INITIAL (vnode->symbol.decl))
-     DECL_INITIAL (vnode->symbol.decl) = error_mark_node;
    /* Finally remove the replaced node.  */
    varpool_remove_node (vnode);
  }
--- 96,101 ----
Index: trunk/gcc/varpool.c
===================================================================
*** trunk.orig/gcc/varpool.c	2013-06-13 14:35:21.000000000 +0200
--- trunk/gcc/varpool.c	2013-06-13 14:35:53.262108132 +0200
*************** varpool_remove_node (struct varpool_node
*** 77,91 ****
  
  /* Renove node initializer when it is no longer needed.  */
  void
! varpool_remove_initializer (struct varpool_node *node)
  {
-   if (DECL_INITIAL (node->symbol.decl)
-       && !DECL_IN_CONSTANT_POOL (node->symbol.decl)
-       /* Keep vtables for BINFO folding.  */
-       && !DECL_VIRTUAL_P (node->symbol.decl)
-       /* FIXME: http://gcc.gnu.org/PR55395 */
-       && debug_info_level == DINFO_LEVEL_NONE)
-     DECL_INITIAL (node->symbol.decl) = error_mark_node;
  }
  
  /* Dump given cgraph node.  */
--- 77,84 ----
  
  /* Renove node initializer when it is no longer needed.  */
  void
! varpool_remove_initializer (struct varpool_node *)
  {
  }
  
  /* Dump given cgraph node.  */
