{"id":2175316,"url":"http://patchwork.ozlabs.org/api/1.0/patches/2175316/?format=json","project":{"id":17,"url":"http://patchwork.ozlabs.org/api/1.0/projects/17/?format=json","name":"GNU Compiler Collection","link_name":"gcc","list_id":"gcc-patches.gcc.gnu.org","list_email":"gcc-patches@gcc.gnu.org","web_url":null,"scm_url":null,"webscm_url":null},"msgid":"<20251217190018.487429-2-rdapp@ventanamicro.com>","date":"2025-12-17T19:00:15","name":"[v3,1/4] RISC-V: Change gather/scatter iterators.","commit_ref":null,"pull_url":null,"state":"new","archived":false,"hash":"016a6b0f54e4433c37a39cddd458513d4c9e6047","submitter":{"id":86205,"url":"http://patchwork.ozlabs.org/api/1.0/people/86205/?format=json","name":"Robin Dapp","email":"rdapp.gcc@gmail.com"},"delegate":null,"mbox":"http://patchwork.ozlabs.org/project/gcc/patch/20251217190018.487429-2-rdapp@ventanamicro.com/mbox/","series":[{"id":485748,"url":"http://patchwork.ozlabs.org/api/1.0/series/485748/?format=json","date":"2025-12-17T19:00:18","name":"VLS-related stuff.","version":3,"mbox":"http://patchwork.ozlabs.org/series/485748/mbox/"}],"check":"pending","checks":"http://patchwork.ozlabs.org/api/patches/2175316/checks/","tags":{},"headers":{"Return-Path":"<gcc-patches-bounces~incoming=patchwork.ozlabs.org@gcc.gnu.org>","X-Original-To":["incoming@patchwork.ozlabs.org","gcc-patches@gcc.gnu.org"],"Delivered-To":["patchwork-incoming@legolas.ozlabs.org","gcc-patches@gcc.gnu.org"],"Authentication-Results":["legolas.ozlabs.org;\n\tdkim=pass (2048-bit key;\n unprotected) header.d=gmail.com header.i=@gmail.com header.a=rsa-sha256\n header.s=20230601 header.b=m++XPjzE;\n\tdkim-atps=neutral","legolas.ozlabs.org;\n spf=pass (sender SPF authorized) smtp.mailfrom=gcc.gnu.org\n (client-ip=38.145.34.32; helo=vm01.sourceware.org;\n envelope-from=gcc-patches-bounces~incoming=patchwork.ozlabs.org@gcc.gnu.org;\n receiver=patchwork.ozlabs.org)","sourceware.org;\n\tdkim=pass (2048-bit key,\n unprotected) header.d=gmail.com 
header.i=@gmail.com header.a=rsa-sha256\n header.s=20230601 header.b=m++XPjzE","sourceware.org;\n dmarc=pass (p=none dis=none) header.from=gmail.com","sourceware.org; spf=pass smtp.mailfrom=gmail.com","server2.sourceware.org;\n arc=none smtp.remote-ip=209.85.208.44"],"Received":["from vm01.sourceware.org (vm01.sourceware.org [38.145.34.32])\n\t(using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits)\n\t key-exchange x25519 server-signature ECDSA (secp384r1) server-digest SHA384)\n\t(No client certificate requested)\n\tby legolas.ozlabs.org (Postfix) with ESMTPS id 4dWjns5KGNz1xty\n\tfor <incoming@patchwork.ozlabs.org>; Thu, 18 Dec 2025 06:01:37 +1100 (AEDT)","from vm01.sourceware.org (localhost [127.0.0.1])\n\tby sourceware.org (Postfix) with ESMTP id A785C4BA2E20\n\tfor <incoming@patchwork.ozlabs.org>; Wed, 17 Dec 2025 19:01:35 +0000 (GMT)","from mail-ed1-f44.google.com (mail-ed1-f44.google.com\n [209.85.208.44])\n by sourceware.org (Postfix) with ESMTPS id F3F534BA2E20\n for <gcc-patches@gcc.gnu.org>; Wed, 17 Dec 2025 19:00:23 +0000 (GMT)","by mail-ed1-f44.google.com with SMTP id\n 4fb4d7f45d1cf-640e9f5951aso1410538a12.1\n for <gcc-patches@gcc.gnu.org>; Wed, 17 Dec 2025 11:00:23 -0800 (PST)","from x1c10.dc1.ventanamicro.com\n (ip-149-172-150-237.um42.pools.vodafone-ip.de. 
[149.172.150.237])\n by smtp.gmail.com with ESMTPSA id\n 4fb4d7f45d1cf-64b5886d434sm232486a12.21.2025.12.17.11.00.20\n (version=TLS1_3 cipher=TLS_AES_256_GCM_SHA384 bits=256/256);\n Wed, 17 Dec 2025 11:00:21 -0800 (PST)"],"DKIM-Filter":["OpenDKIM Filter v2.11.0 sourceware.org A785C4BA2E20","OpenDKIM Filter v2.11.0 sourceware.org F3F534BA2E20"],"DMARC-Filter":"OpenDMARC Filter v1.4.2 sourceware.org F3F534BA2E20","ARC-Filter":"OpenARC Filter v1.0.0 sourceware.org F3F534BA2E20","ARC-Seal":"i=1; a=rsa-sha256; d=sourceware.org; s=key; t=1765998024; cv=none;\n b=qX/paw7cDh3dx5nZ3BqzP4hntkcStzUZEM2DVqRj60CCwwIlQhB4k7A9MI7wHzy1H4UXxWBVrbrewHAwFemERyKD6m9CAiy3MZu+w0Na1DSduoR5ywcV5b4FXMqdoeua7yJLzhvnEChfvzjDo0g23wl0paDg8NXqH4hd4XqJwmY=","ARC-Message-Signature":"i=1; a=rsa-sha256; d=sourceware.org; s=key;\n t=1765998024; c=relaxed/simple;\n bh=upe1IXRa3NhUIebmZVOxLgTpKiLK6DgARNAOq2fWNzk=;\n h=DKIM-Signature:From:To:Subject:Date:Message-ID:MIME-Version;\n b=CNvZUFuK9dT+1YI43fXfQDTUKl3WY4LHBvEsKFEMIMB6wpHz8cwWuMyDoy8/7gV95GZqdB3ZuKAS/dCSFwlhRL/uo+JPIVJyyQ7aWyVxfbc0Fi/ZY/NLncxyuUys/vC5ZusGYX5cFUKdO3gzZP063XQJnF5x7ZzkJXxy1+Aw6Lw=","ARC-Authentication-Results":"i=1; server2.sourceware.org","DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n d=gmail.com; s=20230601; t=1765998022; x=1766602822; darn=gcc.gnu.org;\n h=content-transfer-encoding:mime-version:references:in-reply-to\n :message-id:date:subject:cc:to:from:from:to:cc:subject:date\n :message-id:reply-to;\n bh=13WXRmZLRiIax3p+w48zwk8sELljRzAvmmt90dG6kAg=;\n b=m++XPjzEHiWcLaHU4ZzeR4nZ0R7T2BS8kcoWEygylkG2Rs/YTQobSavjZ5QAWsbPvS\n 5OhhaoNS0QcUzdPF87nBO7h2Ubv4TUrqDNlqd9vMslnDeawGRls85DP6c0PKB6XBUUdk\n cbq/gIMASyv2xY3mwOHUHRo/9m2WKwOUd+ZH5tgngDKCbOZidmLN1pm+f9euYY8m/NG5\n jQdhg3Cn5Xm1Q1g1WCRPBBNHCrnItcrsdXnEGsP7sQl5PMCLSL58CzlBxgVz5qxMSz0o\n lULwDsVfjDbD4fKYraoEjO1Qf3Q4QbROjy5XKinGXPQqUpL8qWEenG4d54VFyB3fBCi4\n s5+g==","X-Google-DKIM-Signature":"v=1; a=rsa-sha256; c=relaxed/relaxed;\n d=1e100.net; s=20230601; t=1765998022; 
x=1766602822;\n h=content-transfer-encoding:mime-version:references:in-reply-to\n :message-id:date:subject:cc:to:from:x-gm-gg:x-gm-message-state:from\n :to:cc:subject:date:message-id:reply-to;\n bh=13WXRmZLRiIax3p+w48zwk8sELljRzAvmmt90dG6kAg=;\n b=Yj09Tbizz2LfXUWZo2e5TOz1Ipr0Bf+yM80o0zq2UDtARoourp+bQSv1TOaowd1iiJ\n bPaa+Dr++LCzZGxruyWw5BgwzCT+ikjzbraah6Lds5+nn+4CmdkyYeuktWxIHZtyk+9T\n UXaEom3Lh2gtCWkybmgz9dK/mU7cxNOI6hFeyXjTq39QZwNO/Ixg5mc87bX7aPOwpazI\n wLfsJzmr7gNwYrkqyHG4P4OP7/q3a9mo3Cd/I6c7tD7dXN/Pc3UUg6PAQ3oPtwzhMxC8\n jbt3L7PXHav7osJ7sYPtC0pQJMGUnjOvcOdtFb/AU0KcWjN4Mf9TZUsh5N7mP6WlFU0Y\n rfCQ==","X-Gm-Message-State":"AOJu0YwgfbaOeXsC0GtA5+O30kKI211DV0t+IzIdE4hxMvjUFyjOX7eT\n yhjU8+fgiW3Xxk47Tq6CgoGjR1gDjL7ZaoikSadNwsErf0D0LSwRKVI9gNUicw==","X-Gm-Gg":"AY/fxX4//ZlGjOxBS8vjXy6N/SCUV+AHQ/qC22V09B4DHiashS0zZpvtBFzlDLWcJWA\n Q8iOrYwytEKimfob8MsLEgGK+6vC+LU8m88ZBAC0e0Kp6bpXOZ2gjCf3aN/rI/m1aWMrr8haz1/\n y0YhRwieLXqkIuoJT8tgAhv83ebNRY9WEu1PyjCYzPrIqt+cIlpvfBlcaRFgOs5yRDmBcVtJayt\n kGpB16K2wLOo0XUXBf4muvBbg1HAVK8RWncT8L8zjsbu7Yw7lHZ1BCxmHXOIvp99ajkOUezBKke\n +Yw1BB+IJeH7Mlyz5t+zLob81+OjFdZWPsOpHHPApSyRY/hP3nHIYdHN5Qqakk0CJAYYlcS4sR0\n LFhyjbEj973dusZ0dhgzmoXPW900Y/o7Qi3on0Gn+wOJ+idGoT7Z3coES+dreG5odECkNm9/0CX\n 0RDvqtYMqjXEAPA7DtUrE+vffVmIgRYurJIfxvKWfd1rxDpC/1JF+CUtHxX8qbaeUuvxF0Hv+SC\n 6lQiCQ7XLvGMeA=","X-Google-Smtp-Source":"\n AGHT+IHu5+udLC/u+mbFPc1wBk6piCjdgLZkNsB8+GeC7hP4YNbPodKS/bC1w/HFN6z9P3Tql1JkLg==","X-Received":"by 2002:a17:907:c10:b0:b76:52d0:3a0b with SMTP id\n a640c23a62f3a-b80205ea0d6mr35988666b.25.1765998021443;\n Wed, 17 Dec 2025 11:00:21 -0800 (PST)","From":"Robin Dapp <rdapp.gcc@gmail.com>","X-Google-Original-From":"Robin Dapp <rdapp@ventanamicro.com>","To":"gcc-patches@gcc.gnu.org","Cc":"kito.cheng@gmail.com, juzhe.zhong@rivai.ai, jeffreyalaw@gmail.com,\n pan2.li@intel.com, rdapp.gcc@gmail.com","Subject":"[PATCH v3 1/4] RISC-V: Change gather/scatter iterators.","Date":"Wed, 17 Dec 2025 20:00:15 
+0100","Message-ID":"<20251217190018.487429-2-rdapp@ventanamicro.com>","X-Mailer":"git-send-email 2.51.1","In-Reply-To":"<20251217190018.487429-1-rdapp@ventanamicro.com>","References":"<20251217190018.487429-1-rdapp@ventanamicro.com>","MIME-Version":"1.0","Content-Transfer-Encoding":"8bit","X-BeenThere":"gcc-patches@gcc.gnu.org","X-Mailman-Version":"2.1.30","Precedence":"list","List-Id":"Gcc-patches mailing list <gcc-patches.gcc.gnu.org>","List-Unsubscribe":"<https://gcc.gnu.org/mailman/options/gcc-patches>,\n <mailto:gcc-patches-request@gcc.gnu.org?subject=unsubscribe>","List-Archive":"<https://gcc.gnu.org/pipermail/gcc-patches/>","List-Post":"<mailto:gcc-patches@gcc.gnu.org>","List-Help":"<mailto:gcc-patches-request@gcc.gnu.org?subject=help>","List-Subscribe":"<https://gcc.gnu.org/mailman/listinfo/gcc-patches>,\n <mailto:gcc-patches-request@gcc.gnu.org?subject=subscribe>","Errors-To":"gcc-patches-bounces~incoming=patchwork.ozlabs.org@gcc.gnu.org"},"content":"This patch changes the gather/scatter mode iterators from a ratio\nscheme to a more direct one where the index mode size is\n1/2, 1/4, 1/8, 2, 4, 8 times the data mode size.  
It also adds VLS modes\nto the iterators and removes the now unnecessary\ngather_scatter_valid_offset_p.\n\ngcc/ChangeLog:\n\n\t* config/riscv/autovec.md (mask_len_gather_load<RATIO64:mode><RATIO64I:mode>): Change from this...\n\t(mask_len_gather_load<mode><vindex>): ...to this.\n\t(mask_len_gather_load<RATIO32:mode><RATIO32I:mode>): Ditto.\n\t(mask_len_gather_load<mode><vindex_double_trunc>): Ditto.\n\t(mask_len_gather_load<RATIO16:mode><RATIO16I:mode>): Ditto.\n\t(mask_len_gather_load<mode><vindex_quad_trunc>): Ditto.\n\t(mask_len_gather_load<RATIO8:mode><RATIO8I:mode>): Ditto.\n\t(mask_len_gather_load<mode><vindex_oct_trunc>): Ditto.\n\t(mask_len_gather_load<RATIO4:mode><RATIO4I:mode>): Ditto.\n\t(mask_len_gather_load<mode><vindex_double_ext>): Ditto.\n\t(mask_len_gather_load<RATIO2:mode><RATIO2I:mode>): Ditto.\n\t(mask_len_gather_load<mode><vindex_quad_ext>): Ditto.\n\t(mask_len_gather_load<mode><mode>): Ditto.\n\t(mask_len_gather_load<mode><vindex_oct_ext>): Ditto.\n\t(mask_len_scatter_store<RATIO64:mode><RATIO64I:mode>): Ditto.\n\t(mask_len_scatter_store<mode><vindex>): Ditto.\n\t(mask_len_scatter_store<RATIO32:mode><RATIO32I:mode>): Ditto.\n\t(mask_len_scatter_store<mode><vindex_double_trunc>): Ditto.\n\t(mask_len_scatter_store<RATIO16:mode><RATIO16I:mode>): Ditto.\n\t(mask_len_scatter_store<mode><vindex_quad_trunc>): Ditto.\n\t(mask_len_scatter_store<RATIO8:mode><RATIO8I:mode>): Ditto.\n\t(mask_len_scatter_store<mode><vindex_oct_trunc>): Ditto.\n\t(mask_len_scatter_store<RATIO4:mode><RATIO4I:mode>): Ditto.\n\t(mask_len_scatter_store<mode><vindex_double_ext>): Ditto.\n\t(mask_len_scatter_store<RATIO2:mode><RATIO2I:mode>): Ditto.\n\t(mask_len_scatter_store<mode><vindex_quad_ext>): Ditto.\n\t(mask_len_scatter_store<mode><mode>): Ditto.\n\t(mask_len_scatter_store<mode><vindex_oct_ext>): Ditto.\n\t* config/riscv/riscv-v.cc (prepare_gather_scatter): Use new\n\tscheme\n\t(get_gather_scatter_code): Ditto.\n\t(expand_gather_scatter): Ditto.\n\t* 
config/riscv/riscv-vector-builtins-bases.cc: Ditto.\n\t* config/riscv/vector-iterators.md: Ditto.\n\t* config/riscv/vector.md (@pred_indexed_<order>store<RATIO64:mode><RATIO64I:mode>): Go from this...\n\t(@pred_indexed_<order>store<mode>_same_eew): ...to this.\n\t(@pred_indexed_<order>store<RATIO32:mode><RATIO32I:mode>):\n\tDitto.\n\t(@pred_indexed_<order>store<mode>_x2_greater_eew): Ditto.\n\t(@pred_indexed_<order>store<RATIO16:mode><RATIO16I:mode>):\n\tDitto.\n\t(@pred_indexed_<order>store<mode>_x4_greater_eew): Ditto.\n\t(@pred_indexed_<order>store<RATIO8:mode><RATIO8I:mode>): Ditto.\n\t(@pred_indexed_<order>store<mode>_x8_greater_eew): Ditto.\n\t(@pred_indexed_<order>store<RATIO4:mode><RATIO4I:mode>): Ditto.\n\t(@pred_indexed_<order>store<mode>_x2_smaller_eew): Ditto.\n\t(@pred_indexed_<order>store<RATIO2:mode><RATIO2I:mode>): Ditto.\n\t(@pred_indexed_<order>store<mode>_x4_smaller_eew): Ditto.\n\t(@pred_indexed_<order>store<RATIO1:mode><RATIO1:mode>): Ditto.\n\t(@pred_indexed_<order>store<mode>_x8_smaller_eew): Ditto.\n---\n gcc/config/riscv/autovec.md                   |  147 +-\n gcc/config/riscv/riscv-v.cc                   |  114 +-\n .../riscv/riscv-vector-builtins-bases.cc      |   54 +-\n gcc/config/riscv/vector-iterators.md          | 1417 ++++++++++++++++-\n gcc/config/riscv/vector.md                    |   72 +-\n 5 files changed, 1565 insertions(+), 239 deletions(-)","diff":"diff --git a/gcc/config/riscv/autovec.md b/gcc/config/riscv/autovec.md\nindex 1f3ff16ed67..8ff3f55ffc4 100644\n--- a/gcc/config/riscv/autovec.md\n+++ b/gcc/config/riscv/autovec.md\n@@ -51,110 +51,113 @@ (define_expand \"mask_len_store<mode><vm>\"\n ;; == Gather Load\n ;; =========================================================================\n \n-(define_expand \"mask_len_gather_load<RATIO64:mode><RATIO64I:mode>\"\n-  [(match_operand:RATIO64 0 \"register_operand\")\n+;; Same element size for index, extension operand is irrelevant.\n+(define_expand 
\"mask_len_gather_load<mode><vindex>\"\n+  [(match_operand:VINDEXED 0 \"register_operand\")\n    (match_operand 1 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO64I 2 \"register_operand\")\n+   (match_operand:<VINDEX> 2 \"register_operand\")\n    (match_operand 3 \"const_1_operand\")\n    (match_operand 4 \"const_1_operand\")\n-   (match_operand:<RATIO64:VM> 5 \"vector_mask_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"maskload_else_operand\")\n    (match_operand 7 \"autovec_length_operand\")\n    (match_operand 8 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO64I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, true);\n   DONE;\n })\n \n-(define_expand \"mask_len_gather_load<RATIO32:mode><RATIO32I:mode>\"\n-  [(match_operand:RATIO32 0 \"register_operand\")\n+;; e.g. DImode, index SImode\n+(define_expand \"mask_len_gather_load<mode><vindex_double_trunc>\"\n+  [(match_operand:VEEWEXT2 0 \"register_operand\")\n    (match_operand 1 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO32I 2 \"register_operand\")\n+   (match_operand:<VINDEX_DOUBLE_TRUNC> 2 \"register_operand\")\n    (match_operand 3 \"const_1_operand\")\n    (match_operand 4 \"const_1_operand\")\n-   (match_operand:<RATIO32:VM> 5 \"vector_mask_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"maskload_else_operand\")\n    (match_operand 7 \"autovec_length_operand\")\n    (match_operand 8 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO32I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, true);\n   DONE;\n })\n \n-(define_expand \"mask_len_gather_load<RATIO16:mode><RATIO16I:mode>\"\n-  [(match_operand:RATIO16 0 \"register_operand\")\n+;; e.g. 
DImode, index HImode\n+(define_expand \"mask_len_gather_load<mode><vindex_quad_trunc>\"\n+  [(match_operand:VEEWEXT4 0 \"register_operand\")\n    (match_operand 1 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO16I 2 \"register_operand\")\n+   (match_operand:<VINDEX_QUAD_TRUNC> 2 \"register_operand\")\n    (match_operand 3 \"const_1_operand\")\n    (match_operand 4 \"const_1_operand\")\n-   (match_operand:<RATIO16:VM> 5 \"vector_mask_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"maskload_else_operand\")\n    (match_operand 7 \"autovec_length_operand\")\n    (match_operand 8 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO16I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, true);\n   DONE;\n })\n \n-(define_expand \"mask_len_gather_load<RATIO8:mode><RATIO8I:mode>\"\n-  [(match_operand:RATIO8 0 \"register_operand\")\n+;; e.g. DImode, index QImode\n+(define_expand \"mask_len_gather_load<mode><vindex_oct_trunc>\"\n+  [(match_operand:VEEWEXT8 0 \"register_operand\")\n    (match_operand 1 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO8I 2 \"register_operand\")\n+   (match_operand:<VINDEX_OCT_TRUNC> 2 \"register_operand\")\n    (match_operand 3 \"const_1_operand\")\n    (match_operand 4 \"const_1_operand\")\n-   (match_operand:<RATIO8:VM> 5 \"vector_mask_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"maskload_else_operand\")\n    (match_operand 7 \"autovec_length_operand\")\n    (match_operand 8 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO8I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, true);\n   DONE;\n })\n \n-(define_expand \"mask_len_gather_load<RATIO4:mode><RATIO4I:mode>\"\n-  [(match_operand:RATIO4 0 \"register_operand\")\n+;; e.g. 
SImode, index DImode\n+(define_expand \"mask_len_gather_load<mode><vindex_double_ext>\"\n+  [(match_operand:VEEWTRUNC2 0 \"register_operand\")\n    (match_operand 1 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO4I 2 \"register_operand\")\n+   (match_operand:<VINDEX_DOUBLE_EXT> 2 \"register_operand\")\n    (match_operand 3 \"const_1_operand\")\n    (match_operand 4 \"const_1_operand\")\n-   (match_operand:<RATIO4:VM> 5 \"vector_mask_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"maskload_else_operand\")\n    (match_operand 7 \"autovec_length_operand\")\n    (match_operand 8 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO4I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, true);\n   DONE;\n })\n \n-(define_expand \"mask_len_gather_load<RATIO2:mode><RATIO2I:mode>\"\n-  [(match_operand:RATIO2 0 \"register_operand\")\n+;; e.g. HImode, index DImode\n+(define_expand \"mask_len_gather_load<mode><vindex_quad_ext>\"\n+  [(match_operand:VEEWTRUNC4 0 \"register_operand\")\n    (match_operand 1 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO2I 2 \"register_operand\")\n+   (match_operand:<VINDEX_QUAD_EXT> 2 \"register_operand\")\n    (match_operand 3 \"const_1_operand\")\n    (match_operand 4 \"const_1_operand\")\n-   (match_operand:<RATIO2:VM> 5 \"vector_mask_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"maskload_else_operand\")\n    (match_operand 7 \"autovec_length_operand\")\n    (match_operand 8 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO2I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, true);\n   DONE;\n })\n \n-;; When SEW = 8 and LMUL = 8, we can't find any index mode with\n-;; larger SEW. 
Since RVV indexed load/store support zero extend\n-;; implicitly and not support scaling, we should only allow\n-;; operands[3] and operands[4] to be const_1_operand.\n-(define_expand \"mask_len_gather_load<mode><mode>\"\n-  [(match_operand:RATIO1 0 \"register_operand\")\n+;; e.g. QImode, index DImode\n+(define_expand \"mask_len_gather_load<mode><vindex_oct_ext>\"\n+  [(match_operand:VEEWTRUNC8 0 \"register_operand\")\n    (match_operand 1 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO1 2 \"register_operand\")\n+   (match_operand:<VINDEX_OCT_EXT> 2 \"register_operand\")\n    (match_operand 3 \"const_1_operand\")\n    (match_operand 4 \"const_1_operand\")\n    (match_operand:<VM> 5 \"vector_mask_operand\")\n@@ -171,106 +174,102 @@ (define_expand \"mask_len_gather_load<mode><mode>\"\n ;; == Scatter Store\n ;; =========================================================================\n \n-(define_expand \"mask_len_scatter_store<RATIO64:mode><RATIO64I:mode>\"\n+(define_expand \"mask_len_scatter_store<mode><vindex>\"\n   [(match_operand 0 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO64I 1 \"register_operand\")\n+   (match_operand:<VINDEX> 1 \"register_operand\")\n    (match_operand 2 \"const_1_operand\")\n    (match_operand 3 \"const_1_operand\")\n-   (match_operand:RATIO64 4 \"register_operand\")\n-   (match_operand:<RATIO64:VM> 5 \"vector_mask_operand\")\n+   (match_operand:VINDEXED 4 \"register_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"autovec_length_operand\")\n    (match_operand 7 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO64I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, false);\n   DONE;\n })\n \n-(define_expand \"mask_len_scatter_store<RATIO32:mode><RATIO32I:mode>\"\n+(define_expand \"mask_len_scatter_store<mode><vindex_double_trunc>\"\n   [(match_operand 0 \"pmode_reg_or_0_operand\")\n-   
(match_operand:RATIO32I 1 \"register_operand\")\n+   (match_operand:<VINDEX_DOUBLE_TRUNC> 1 \"register_operand\")\n    (match_operand 2 \"const_1_operand\")\n    (match_operand 3 \"const_1_operand\")\n-   (match_operand:RATIO32 4 \"register_operand\")\n-   (match_operand:<RATIO32:VM> 5 \"vector_mask_operand\")\n+   (match_operand:VEEWEXT2 4 \"register_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"autovec_length_operand\")\n    (match_operand 7 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO32I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, false);\n   DONE;\n })\n \n-(define_expand \"mask_len_scatter_store<RATIO16:mode><RATIO16I:mode>\"\n+(define_expand \"mask_len_scatter_store<mode><vindex_quad_trunc>\"\n   [(match_operand 0 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO16I 1 \"register_operand\")\n+   (match_operand:<VINDEX_QUAD_TRUNC> 1 \"register_operand\")\n    (match_operand 2 \"const_1_operand\")\n    (match_operand 3 \"const_1_operand\")\n-   (match_operand:RATIO16 4 \"register_operand\")\n-   (match_operand:<RATIO16:VM> 5 \"vector_mask_operand\")\n+   (match_operand:VEEWEXT4 4 \"register_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"autovec_length_operand\")\n    (match_operand 7 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO16I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, false);\n   DONE;\n })\n \n-(define_expand \"mask_len_scatter_store<RATIO8:mode><RATIO8I:mode>\"\n+(define_expand \"mask_len_scatter_store<mode><vindex_oct_trunc>\"\n   [(match_operand 0 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO8I 1 \"register_operand\")\n+   (match_operand:<VINDEX_OCT_TRUNC> 1 \"register_operand\")\n    (match_operand 2 \"const_1_operand\")\n    (match_operand 3 \"const_1_operand\")\n-   
(match_operand:RATIO8 4 \"register_operand\")\n-   (match_operand:<RATIO8:VM> 5 \"vector_mask_operand\")\n+   (match_operand:VEEWEXT8 4 \"register_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"autovec_length_operand\")\n    (match_operand 7 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO8I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, false);\n   DONE;\n })\n \n-(define_expand \"mask_len_scatter_store<RATIO4:mode><RATIO4I:mode>\"\n+(define_expand \"mask_len_scatter_store<mode><vindex_double_ext>\"\n   [(match_operand 0 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO4I 1 \"register_operand\")\n+   (match_operand:<VINDEX_DOUBLE_EXT> 1 \"register_operand\")\n    (match_operand 2 \"const_1_operand\")\n    (match_operand 3 \"const_1_operand\")\n-   (match_operand:RATIO4 4 \"register_operand\")\n-   (match_operand:<RATIO4:VM> 5 \"vector_mask_operand\")\n+   (match_operand:VEEWTRUNC2 4 \"register_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"autovec_length_operand\")\n    (match_operand 7 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO4I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, false);\n   DONE;\n })\n \n-(define_expand \"mask_len_scatter_store<RATIO2:mode><RATIO2I:mode>\"\n+(define_expand \"mask_len_scatter_store<mode><vindex_quad_ext>\"\n   [(match_operand 0 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO2I 1 \"register_operand\")\n+   (match_operand:<VINDEX_QUAD_EXT> 1 \"register_operand\")\n    (match_operand 2 \"const_1_operand\")\n    (match_operand 3 \"const_1_operand\")\n-   (match_operand:RATIO2 4 \"register_operand\")\n-   (match_operand:<RATIO2:VM> 5 \"vector_mask_operand\")\n+   (match_operand:VEEWTRUNC4 4 \"register_operand\")\n+   (match_operand:<VM> 5 \"vector_mask_operand\")\n 
   (match_operand 6 \"autovec_length_operand\")\n    (match_operand 7 \"const_0_operand\")]\n-  \"TARGET_VECTOR && riscv_vector::gather_scatter_valid_offset_p (<RATIO2I:MODE>mode)\"\n+  \"TARGET_VECTOR\"\n {\n   riscv_vector::expand_gather_scatter (operands, false);\n   DONE;\n })\n \n-;; When SEW = 8 and LMUL = 8, we can't find any index mode with\n-;; larger SEW. Since RVV indexed load/store support zero extend\n-;; implicitly and not support scaling, we should only allow\n-;; operands[3] and operands[4] to be const_1_operand.\n-(define_expand \"mask_len_scatter_store<mode><mode>\"\n+(define_expand \"mask_len_scatter_store<mode><vindex_oct_ext>\"\n   [(match_operand 0 \"pmode_reg_or_0_operand\")\n-   (match_operand:RATIO1 1 \"register_operand\")\n+   (match_operand:<VINDEX_OCT_EXT> 1 \"register_operand\")\n    (match_operand 2 \"const_1_operand\")\n    (match_operand 3 \"const_1_operand\")\n-   (match_operand:RATIO1 4 \"register_operand\")\n+   (match_operand:VEEWTRUNC8 4 \"register_operand\")\n    (match_operand:<VM> 5 \"vector_mask_operand\")\n    (match_operand 6 \"autovec_length_operand\")\n    (match_operand 7 \"const_0_operand\")]\ndiff --git a/gcc/config/riscv/riscv-v.cc b/gcc/config/riscv/riscv-v.cc\nindex f3c44313967..ae8db718b80 100644\n--- a/gcc/config/riscv/riscv-v.cc\n+++ b/gcc/config/riscv/riscv-v.cc\n@@ -4692,53 +4692,81 @@ expand_cond_binop (unsigned icode, rtx *ops)\n \n /* Prepare insn_code for gather_load/scatter_store according to\n    the vector mode and index mode.  
*/\n-static insn_code\n-prepare_gather_scatter (machine_mode vec_mode, machine_mode idx_mode,\n-\t\t\tbool is_load)\n+insn_code\n+get_gather_scatter_code (machine_mode vec_mode, machine_mode idx_mode,\n+\t\t\t bool is_load)\n {\n-  if (!is_load)\n-    return code_for_pred_indexed_store (UNSPEC_UNORDERED, vec_mode, idx_mode);\n-  else\n+  unsigned src_eew_bitsize = GET_MODE_BITSIZE (GET_MODE_INNER (idx_mode));\n+  unsigned dst_eew_bitsize = GET_MODE_BITSIZE (GET_MODE_INNER (vec_mode));\n+  if (dst_eew_bitsize == src_eew_bitsize)\n     {\n-      unsigned src_eew_bitsize = GET_MODE_BITSIZE (GET_MODE_INNER (idx_mode));\n-      unsigned dst_eew_bitsize = GET_MODE_BITSIZE (GET_MODE_INNER (vec_mode));\n-      if (dst_eew_bitsize == src_eew_bitsize)\n-\treturn code_for_pred_indexed_load_same_eew (UNSPEC_UNORDERED, vec_mode);\n-      else if (dst_eew_bitsize > src_eew_bitsize)\n+      if (is_load)\n+\treturn code_for_pred_indexed_load_same_eew\n+\t  (UNSPEC_UNORDERED, vec_mode);\n+      else\n+\treturn code_for_pred_indexed_store_same_eew\n+\t  (UNSPEC_UNORDERED, vec_mode);\n+    }\n+  else if (dst_eew_bitsize > src_eew_bitsize)\n+    {\n+      unsigned factor = dst_eew_bitsize / src_eew_bitsize;\n+      switch (factor)\n \t{\n-\t  unsigned factor = dst_eew_bitsize / src_eew_bitsize;\n-\t  switch (factor)\n-\t    {\n-\t    case 2:\n-\t      return code_for_pred_indexed_load_x2_greater_eew (\n-\t\tUNSPEC_UNORDERED, vec_mode);\n-\t    case 4:\n-\t      return code_for_pred_indexed_load_x4_greater_eew (\n-\t\tUNSPEC_UNORDERED, vec_mode);\n-\t    case 8:\n-\t      return code_for_pred_indexed_load_x8_greater_eew (\n-\t\tUNSPEC_UNORDERED, vec_mode);\n-\t    default:\n-\t      gcc_unreachable ();\n-\t    }\n+\tcase 2:\n+\t  if (is_load)\n+\t    return\n+\t      code_for_pred_indexed_load_x2_greater_eew\n+\t\t(UNSPEC_UNORDERED, vec_mode);\n+\t  else\n+\t    return\n+\t      code_for_pred_indexed_store_x2_greater_eew\n+\t\t(UNSPEC_UNORDERED, vec_mode);\n+\tcase 4:\n+\t  if 
(is_load)\n+\t    return code_for_pred_indexed_load_x4_greater_eew\n+\t\t(UNSPEC_UNORDERED, vec_mode);\n+\t  else\n+\t    return code_for_pred_indexed_store_x4_greater_eew\n+\t\t(UNSPEC_UNORDERED, vec_mode);\n+\tcase 8:\n+\t  if (is_load)\n+\t    return code_for_pred_indexed_load_x8_greater_eew\n+\t      (UNSPEC_UNORDERED, vec_mode);\n+\t  else\n+\t    return code_for_pred_indexed_store_x8_greater_eew\n+\t      (UNSPEC_UNORDERED, vec_mode);\n+\tdefault:\n+\t  gcc_unreachable ();\n \t}\n-      else\n+    }\n+  else\n+    {\n+      unsigned factor = src_eew_bitsize / dst_eew_bitsize;\n+      switch (factor)\n \t{\n-\t  unsigned factor = src_eew_bitsize / dst_eew_bitsize;\n-\t  switch (factor)\n-\t    {\n-\t    case 2:\n-\t      return code_for_pred_indexed_load_x2_smaller_eew (\n-\t\tUNSPEC_UNORDERED, vec_mode);\n-\t    case 4:\n-\t      return code_for_pred_indexed_load_x4_smaller_eew (\n-\t\tUNSPEC_UNORDERED, vec_mode);\n-\t    case 8:\n-\t      return code_for_pred_indexed_load_x8_smaller_eew (\n-\t\tUNSPEC_UNORDERED, vec_mode);\n-\t    default:\n-\t      gcc_unreachable ();\n-\t    }\n+\tcase 2:\n+\t  if (is_load)\n+\t    return code_for_pred_indexed_load_x2_smaller_eew\n+\t      (UNSPEC_UNORDERED, vec_mode);\n+\t  else\n+\t    return code_for_pred_indexed_store_x2_smaller_eew\n+\t      (UNSPEC_UNORDERED, vec_mode);\n+\tcase 4:\n+\t  if (is_load)\n+\t    return code_for_pred_indexed_load_x4_smaller_eew\n+\t      (UNSPEC_UNORDERED, vec_mode);\n+\t  else\n+\t    return code_for_pred_indexed_store_x4_smaller_eew\n+\t      (UNSPEC_UNORDERED, vec_mode);\n+\tcase 8:\n+\t  if (is_load)\n+\t    return code_for_pred_indexed_load_x8_smaller_eew\n+\t      (UNSPEC_UNORDERED, vec_mode);\n+\t  else\n+\t    return code_for_pred_indexed_store_x8_smaller_eew\n+\t      (UNSPEC_UNORDERED, vec_mode);\n+\tdefault:\n+\t  gcc_unreachable ();\n \t}\n     }\n }\n@@ -4769,7 +4797,7 @@ expand_gather_scatter (rtx *ops, bool is_load)\n   machine_mode idx_mode = GET_MODE (vec_offset);\n   
bool is_vlmax = is_vlmax_len_p (vec_mode, len);\n \n-  insn_code icode = prepare_gather_scatter (vec_mode, idx_mode, is_load);\n+  insn_code icode = get_gather_scatter_code (vec_mode, idx_mode, is_load);\n   if (is_vlmax)\n     {\n       if (is_load)\ndiff --git a/gcc/config/riscv/riscv-vector-builtins-bases.cc b/gcc/config/riscv/riscv-vector-builtins-bases.cc\nindex d00403a1fc5..15866d18342 100644\n--- a/gcc/config/riscv/riscv-vector-builtins-bases.cc\n+++ b/gcc/config/riscv/riscv-vector-builtins-bases.cc\n@@ -199,9 +199,57 @@ public:\n       {\n \tint unspec = ORDERED_P ? UNSPEC_ORDERED : UNSPEC_UNORDERED;\n \tif (STORE_P)\n-\t  return e.use_exact_insn (\n-\t    code_for_pred_indexed_store (unspec, e.vector_mode (),\n-\t\t\t\t\t e.index_mode ()));\n+\t  {\n+\t    unsigned src_eew_bitsize\n+\t      = GET_MODE_BITSIZE (GET_MODE_INNER (e.index_mode ()));\n+\t    unsigned dst_eew_bitsize\n+\t      = GET_MODE_BITSIZE (GET_MODE_INNER (e.vector_mode ()));\n+\t    if (dst_eew_bitsize == src_eew_bitsize)\n+\t      return e.use_exact_insn (\n+\t\tcode_for_pred_indexed_store_same_eew (unspec, e.vector_mode ()));\n+\t    else if (dst_eew_bitsize > src_eew_bitsize)\n+\t      {\n+\t\tunsigned factor = dst_eew_bitsize / src_eew_bitsize;\n+\t\tswitch (factor)\n+\t\t  {\n+\t\t  case 2:\n+\t\t    return e.use_exact_insn (\n+\t\t      code_for_pred_indexed_store_x2_greater_eew (\n+\t\t\tunspec, e.vector_mode ()));\n+\t\t  case 4:\n+\t\t    return e.use_exact_insn (\n+\t\t      code_for_pred_indexed_store_x4_greater_eew (\n+\t\t\tunspec, e.vector_mode ()));\n+\t\t  case 8:\n+\t\t    return e.use_exact_insn (\n+\t\t      code_for_pred_indexed_store_x8_greater_eew (\n+\t\t\tunspec, e.vector_mode ()));\n+\t\t  default:\n+\t\t    gcc_unreachable ();\n+\t\t  }\n+\t      }\n+\t    else\n+\t      {\n+\t\tunsigned factor = src_eew_bitsize / dst_eew_bitsize;\n+\t\tswitch (factor)\n+\t\t  {\n+\t\t  case 2:\n+\t\t    return e.use_exact_insn (\n+\t\t      
code_for_pred_indexed_store_x2_smaller_eew (\n+\t\t\tunspec, e.vector_mode ()));\n+\t\t  case 4:\n+\t\t    return e.use_exact_insn (\n+\t\t      code_for_pred_indexed_store_x4_smaller_eew (\n+\t\t\tunspec, e.vector_mode ()));\n+\t\t  case 8:\n+\t\t    return e.use_exact_insn (\n+\t\t      code_for_pred_indexed_store_x8_smaller_eew (\n+\t\t\tunspec, e.vector_mode ()));\n+\t\t  default:\n+\t\t    gcc_unreachable ();\n+\t\t  }\n+\t      }\n+\t  }\n \telse\n \t  {\n \t    unsigned src_eew_bitsize\ndiff --git a/gcc/config/riscv/vector-iterators.md b/gcc/config/riscv/vector-iterators.md\nindex e4f3c449637..b6282607ceb 100644\n--- a/gcc/config/riscv/vector-iterators.md\n+++ b/gcc/config/riscv/vector-iterators.md\n@@ -345,6 +345,85 @@ (define_mode_iterator VEEWEXT2 [\n \n   (RVVM8DF \"TARGET_VECTOR_ELEN_FP_64\") (RVVM4DF \"TARGET_VECTOR_ELEN_FP_64\")\n   (RVVM2DF \"TARGET_VECTOR_ELEN_FP_64\") (RVVM1DF \"TARGET_VECTOR_ELEN_FP_64\")\n+\n+  (V1HI \"riscv_vector::vls_mode_valid_p (V1HImode)\")\n+  (V2HI \"riscv_vector::vls_mode_valid_p (V2HImode)\")\n+  (V4HI \"riscv_vector::vls_mode_valid_p (V4HImode)\")\n+  (V8HI \"riscv_vector::vls_mode_valid_p (V8HImode)\")\n+  (V16HI \"riscv_vector::vls_mode_valid_p (V16HImode)\")\n+  (V32HI \"riscv_vector::vls_mode_valid_p (V32HImode) && TARGET_MIN_VLEN >= 64\")\n+  (V64HI \"riscv_vector::vls_mode_valid_p (V64HImode) && TARGET_MIN_VLEN >= 128\")\n+  (V128HI \"riscv_vector::vls_mode_valid_p (V128HImode) && TARGET_MIN_VLEN >= 256\")\n+  (V256HI \"riscv_vector::vls_mode_valid_p (V256HImode) && TARGET_MIN_VLEN >= 512\")\n+  (V512HI \"riscv_vector::vls_mode_valid_p (V512HImode) && TARGET_MIN_VLEN >= 1024\")\n+  (V1024HI \"riscv_vector::vls_mode_valid_p (V1024HImode) && TARGET_MIN_VLEN >= 2048\")\n+  (V1SI \"riscv_vector::vls_mode_valid_p (V1SImode)\")\n+  (V2SI \"riscv_vector::vls_mode_valid_p (V2SImode)\")\n+  (V4SI \"riscv_vector::vls_mode_valid_p (V4SImode)\")\n+  (V8SI \"riscv_vector::vls_mode_valid_p (V8SImode)\")\n+  (V16SI 
\"riscv_vector::vls_mode_valid_p (V16SImode) && TARGET_MIN_VLEN >= 64\")\n+  (V32SI \"riscv_vector::vls_mode_valid_p (V32SImode) && TARGET_MIN_VLEN >= 128\")\n+  (V64SI \"riscv_vector::vls_mode_valid_p (V64SImode) && TARGET_MIN_VLEN >= 256\")\n+  (V128SI \"riscv_vector::vls_mode_valid_p (V128SImode) && TARGET_MIN_VLEN >= 512\")\n+  (V256SI \"riscv_vector::vls_mode_valid_p (V256SImode) && TARGET_MIN_VLEN >= 1024\")\n+  (V512SI \"riscv_vector::vls_mode_valid_p (V512SImode) && TARGET_MIN_VLEN >= 2048\")\n+  (V1024SI \"riscv_vector::vls_mode_valid_p (V1024SImode) && TARGET_MIN_VLEN >= 4096\")\n+  (V1DI \"riscv_vector::vls_mode_valid_p (V1DImode) && TARGET_VECTOR_ELEN_64\")\n+  (V2DI \"riscv_vector::vls_mode_valid_p (V2DImode) && TARGET_VECTOR_ELEN_64\")\n+  (V4DI \"riscv_vector::vls_mode_valid_p (V4DImode) && TARGET_VECTOR_ELEN_64\")\n+  (V8DI \"riscv_vector::vls_mode_valid_p (V8DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 64\")\n+  (V16DI \"riscv_vector::vls_mode_valid_p (V16DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 128\")\n+  (V32DI \"riscv_vector::vls_mode_valid_p (V32DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 256\")\n+  (V64DI \"riscv_vector::vls_mode_valid_p (V64DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 512\")\n+  (V128DI \"riscv_vector::vls_mode_valid_p (V128DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 1024\")\n+  (V256DI \"riscv_vector::vls_mode_valid_p (V256DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 2048\")\n+  (V512DI \"riscv_vector::vls_mode_valid_p (V512DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 4096\")\n+\n+  (V1HF \"riscv_vector::vls_mode_valid_p (V1HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V2HF \"riscv_vector::vls_mode_valid_p (V2HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V4HF \"riscv_vector::vls_mode_valid_p (V4HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V8HF \"riscv_vector::vls_mode_valid_p (V8HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V16HF 
\"riscv_vector::vls_mode_valid_p (V16HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V32HF \"riscv_vector::vls_mode_valid_p (V32HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 64\")\n+  (V64HF \"riscv_vector::vls_mode_valid_p (V64HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 128\")\n+  (V128HF \"riscv_vector::vls_mode_valid_p (V128HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 256\")\n+  (V256HF \"riscv_vector::vls_mode_valid_p (V256HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 512\")\n+  (V512HF \"riscv_vector::vls_mode_valid_p (V512HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 1024\")\n+  (V1024HF \"riscv_vector::vls_mode_valid_p (V1024HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 2048\")\n+  (V2048HF \"riscv_vector::vls_mode_valid_p (V2048HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 4096\")\n+  (V1BF \"riscv_vector::vls_mode_valid_p (V1BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V2BF \"riscv_vector::vls_mode_valid_p (V2BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V4BF \"riscv_vector::vls_mode_valid_p (V4BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V8BF \"riscv_vector::vls_mode_valid_p (V8BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V16BF \"riscv_vector::vls_mode_valid_p (V16BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V32BF \"riscv_vector::vls_mode_valid_p (V32BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 64\")\n+  (V64BF \"riscv_vector::vls_mode_valid_p (V64BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 128\")\n+  (V128BF \"riscv_vector::vls_mode_valid_p (V128BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 256\")\n+  (V256BF \"riscv_vector::vls_mode_valid_p (V256BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 512\")\n+  (V512BF \"riscv_vector::vls_mode_valid_p (V512BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 1024\")\n+  (V1024BF \"riscv_vector::vls_mode_valid_p (V1024BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 
2048\")\n+  (V2048BF \"riscv_vector::vls_mode_valid_p (V2048BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 4096\")\n+  (V1SF \"riscv_vector::vls_mode_valid_p (V1SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V2SF \"riscv_vector::vls_mode_valid_p (V2SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V4SF \"riscv_vector::vls_mode_valid_p (V4SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V8SF \"riscv_vector::vls_mode_valid_p (V8SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V16SF \"riscv_vector::vls_mode_valid_p (V16SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 64\")\n+  (V32SF \"riscv_vector::vls_mode_valid_p (V32SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 128\")\n+  (V64SF \"riscv_vector::vls_mode_valid_p (V64SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 256\")\n+  (V128SF \"riscv_vector::vls_mode_valid_p (V128SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 512\")\n+  (V256SF \"riscv_vector::vls_mode_valid_p (V256SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 1024\")\n+  (V512SF \"riscv_vector::vls_mode_valid_p (V512SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 2048\")\n+  (V1024SF \"riscv_vector::vls_mode_valid_p (V1024SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 4096\")\n+  (V1DF \"riscv_vector::vls_mode_valid_p (V1DFmode) && TARGET_VECTOR_ELEN_FP_64\")\n+  (V2DF \"riscv_vector::vls_mode_valid_p (V2DFmode) && TARGET_VECTOR_ELEN_FP_64\")\n+  (V4DF \"riscv_vector::vls_mode_valid_p (V4DFmode) && TARGET_VECTOR_ELEN_FP_64\")\n+  (V8DF \"riscv_vector::vls_mode_valid_p (V8DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 64\")\n+  (V16DF \"riscv_vector::vls_mode_valid_p (V16DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 128\")\n+  (V32DF \"riscv_vector::vls_mode_valid_p (V32DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 256\")\n+  (V64DF \"riscv_vector::vls_mode_valid_p (V64DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 512\")\n+  (V128DF 
\"riscv_vector::vls_mode_valid_p (V128DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 1024\")\n+  (V256DF \"riscv_vector::vls_mode_valid_p (V256DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 2048\")\n+  (V512DF \"riscv_vector::vls_mode_valid_p (V512DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 4096\")\n ])\n \n (define_mode_iterator VEEWEXT4 [\n@@ -358,6 +437,50 @@ (define_mode_iterator VEEWEXT4 [\n \n   (RVVM8DF \"TARGET_VECTOR_ELEN_FP_64\") (RVVM4DF \"TARGET_VECTOR_ELEN_FP_64\")\n   (RVVM2DF \"TARGET_VECTOR_ELEN_FP_64\") (RVVM1DF \"TARGET_VECTOR_ELEN_FP_64\")\n+\n+  (V1SI \"riscv_vector::vls_mode_valid_p (V1SImode)\")\n+  (V2SI \"riscv_vector::vls_mode_valid_p (V2SImode)\")\n+  (V4SI \"riscv_vector::vls_mode_valid_p (V4SImode)\")\n+  (V8SI \"riscv_vector::vls_mode_valid_p (V8SImode)\")\n+  (V16SI \"riscv_vector::vls_mode_valid_p (V16SImode) && TARGET_MIN_VLEN >= 64\")\n+  (V32SI \"riscv_vector::vls_mode_valid_p (V32SImode) && TARGET_MIN_VLEN >= 128\")\n+  (V64SI \"riscv_vector::vls_mode_valid_p (V64SImode) && TARGET_MIN_VLEN >= 256\")\n+  (V128SI \"riscv_vector::vls_mode_valid_p (V128SImode) && TARGET_MIN_VLEN >= 512\")\n+  (V256SI \"riscv_vector::vls_mode_valid_p (V256SImode) && TARGET_MIN_VLEN >= 1024\")\n+  (V512SI \"riscv_vector::vls_mode_valid_p (V512SImode) && TARGET_MIN_VLEN >= 2048\")\n+  (V1024SI \"riscv_vector::vls_mode_valid_p (V1024SImode) && TARGET_MIN_VLEN >= 4096\")\n+  (V1DI \"riscv_vector::vls_mode_valid_p (V1DImode) && TARGET_VECTOR_ELEN_64\")\n+  (V2DI \"riscv_vector::vls_mode_valid_p (V2DImode) && TARGET_VECTOR_ELEN_64\")\n+  (V4DI \"riscv_vector::vls_mode_valid_p (V4DImode) && TARGET_VECTOR_ELEN_64\")\n+  (V8DI \"riscv_vector::vls_mode_valid_p (V8DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 64\")\n+  (V16DI \"riscv_vector::vls_mode_valid_p (V16DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 128\")\n+  (V32DI \"riscv_vector::vls_mode_valid_p (V32DImode) && TARGET_VECTOR_ELEN_64 && 
TARGET_MIN_VLEN >= 256\")\n+  (V64DI \"riscv_vector::vls_mode_valid_p (V64DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 512\")\n+  (V128DI \"riscv_vector::vls_mode_valid_p (V128DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 1024\")\n+  (V256DI \"riscv_vector::vls_mode_valid_p (V256DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 2048\")\n+  (V512DI \"riscv_vector::vls_mode_valid_p (V512DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 4096\")\n+\n+  (V1SF \"riscv_vector::vls_mode_valid_p (V1SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V2SF \"riscv_vector::vls_mode_valid_p (V2SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V4SF \"riscv_vector::vls_mode_valid_p (V4SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V8SF \"riscv_vector::vls_mode_valid_p (V8SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V16SF \"riscv_vector::vls_mode_valid_p (V16SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 64\")\n+  (V32SF \"riscv_vector::vls_mode_valid_p (V32SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 128\")\n+  (V64SF \"riscv_vector::vls_mode_valid_p (V64SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 256\")\n+  (V128SF \"riscv_vector::vls_mode_valid_p (V128SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 512\")\n+  (V256SF \"riscv_vector::vls_mode_valid_p (V256SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 1024\")\n+  (V512SF \"riscv_vector::vls_mode_valid_p (V512SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 2048\")\n+  (V1024SF \"riscv_vector::vls_mode_valid_p (V1024SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 4096\")\n+  (V1DF \"riscv_vector::vls_mode_valid_p (V1DFmode) && TARGET_VECTOR_ELEN_FP_64\")\n+  (V2DF \"riscv_vector::vls_mode_valid_p (V2DFmode) && TARGET_VECTOR_ELEN_FP_64\")\n+  (V4DF \"riscv_vector::vls_mode_valid_p (V4DFmode) && TARGET_VECTOR_ELEN_FP_64\")\n+  (V8DF \"riscv_vector::vls_mode_valid_p (V8DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 64\")\n+  (V16DF 
\"riscv_vector::vls_mode_valid_p (V16DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 128\")\n+  (V32DF \"riscv_vector::vls_mode_valid_p (V32DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 256\")\n+  (V64DF \"riscv_vector::vls_mode_valid_p (V64DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 512\")\n+  (V128DF \"riscv_vector::vls_mode_valid_p (V128DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 1024\")\n+  (V256DF \"riscv_vector::vls_mode_valid_p (V256DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 2048\")\n+  (V512DF \"riscv_vector::vls_mode_valid_p (V512DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 4096\")\n ])\n \n (define_mode_iterator VEEWEXT8 [\n@@ -366,6 +489,28 @@ (define_mode_iterator VEEWEXT8 [\n \n   (RVVM8DF \"TARGET_VECTOR_ELEN_FP_64\") (RVVM4DF \"TARGET_VECTOR_ELEN_FP_64\")\n   (RVVM2DF \"TARGET_VECTOR_ELEN_FP_64\") (RVVM1DF \"TARGET_VECTOR_ELEN_FP_64\")\n+\n+  (V1DI \"riscv_vector::vls_mode_valid_p (V1DImode) && TARGET_VECTOR_ELEN_64\")\n+  (V2DI \"riscv_vector::vls_mode_valid_p (V2DImode) && TARGET_VECTOR_ELEN_64\")\n+  (V4DI \"riscv_vector::vls_mode_valid_p (V4DImode) && TARGET_VECTOR_ELEN_64\")\n+  (V8DI \"riscv_vector::vls_mode_valid_p (V8DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 64\")\n+  (V16DI \"riscv_vector::vls_mode_valid_p (V16DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 128\")\n+  (V32DI \"riscv_vector::vls_mode_valid_p (V32DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 256\")\n+  (V64DI \"riscv_vector::vls_mode_valid_p (V64DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 512\")\n+  (V128DI \"riscv_vector::vls_mode_valid_p (V128DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 1024\")\n+  (V256DI \"riscv_vector::vls_mode_valid_p (V256DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 2048\")\n+  (V512DI \"riscv_vector::vls_mode_valid_p (V512DImode) && TARGET_VECTOR_ELEN_64 && TARGET_MIN_VLEN >= 4096\")\n+\n+  (V1DF 
\"riscv_vector::vls_mode_valid_p (V1DFmode) && TARGET_VECTOR_ELEN_FP_64\")\n+  (V2DF \"riscv_vector::vls_mode_valid_p (V2DFmode) && TARGET_VECTOR_ELEN_FP_64\")\n+  (V4DF \"riscv_vector::vls_mode_valid_p (V4DFmode) && TARGET_VECTOR_ELEN_FP_64\")\n+  (V8DF \"riscv_vector::vls_mode_valid_p (V8DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 64\")\n+  (V16DF \"riscv_vector::vls_mode_valid_p (V16DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 128\")\n+  (V32DF \"riscv_vector::vls_mode_valid_p (V32DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 256\")\n+  (V64DF \"riscv_vector::vls_mode_valid_p (V64DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 512\")\n+  (V128DF \"riscv_vector::vls_mode_valid_p (V128DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 1024\")\n+  (V256DF \"riscv_vector::vls_mode_valid_p (V256DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 2048\")\n+  (V512DF \"riscv_vector::vls_mode_valid_p (V512DFmode) && TARGET_VECTOR_ELEN_FP_64 && TARGET_MIN_VLEN >= 4096\")\n ])\n \n (define_mode_iterator VEEWTRUNC2 [\n@@ -390,6 +535,73 @@ (define_mode_iterator VEEWTRUNC2 [\n   (RVVM2SF \"TARGET_VECTOR_ELEN_FP_32 && TARGET_64BIT\")\n   (RVVM1SF \"TARGET_VECTOR_ELEN_FP_32 && TARGET_64BIT\")\n   (RVVMF2SF \"TARGET_VECTOR_ELEN_FP_32 && TARGET_VECTOR_ELEN_64 && TARGET_64BIT\")\n+\n+  (V1QI \"riscv_vector::vls_mode_valid_p (V1QImode)\")\n+  (V2QI \"riscv_vector::vls_mode_valid_p (V2QImode)\")\n+  (V4QI \"riscv_vector::vls_mode_valid_p (V4QImode)\")\n+  (V8QI \"riscv_vector::vls_mode_valid_p (V8QImode)\")\n+  (V16QI \"riscv_vector::vls_mode_valid_p (V16QImode)\")\n+  (V32QI \"riscv_vector::vls_mode_valid_p (V32QImode) && TARGET_MIN_VLEN >= 64\")\n+  (V64QI \"riscv_vector::vls_mode_valid_p (V64QImode) && TARGET_MIN_VLEN >= 128\")\n+  (V128QI \"riscv_vector::vls_mode_valid_p (V128QImode) && TARGET_MIN_VLEN >= 256\")\n+  (V256QI \"riscv_vector::vls_mode_valid_p (V256QImode) && TARGET_MIN_VLEN >= 512\")\n+  (V512QI 
\"riscv_vector::vls_mode_valid_p (V512QImode) && TARGET_MIN_VLEN >= 1024\")\n+  (V1024QI \"riscv_vector::vls_mode_valid_p (V1024QImode) && TARGET_MIN_VLEN >= 2048\")\n+  (V2048QI \"riscv_vector::vls_mode_valid_p (V1024QImode) && TARGET_MIN_VLEN >= 2048\")\n+  (V1HI \"riscv_vector::vls_mode_valid_p (V1HImode)\")\n+  (V2HI \"riscv_vector::vls_mode_valid_p (V2HImode)\")\n+  (V4HI \"riscv_vector::vls_mode_valid_p (V4HImode)\")\n+  (V8HI \"riscv_vector::vls_mode_valid_p (V8HImode)\")\n+  (V16HI \"riscv_vector::vls_mode_valid_p (V16HImode)\")\n+  (V32HI \"riscv_vector::vls_mode_valid_p (V32HImode) && TARGET_MIN_VLEN >= 64\")\n+  (V64HI \"riscv_vector::vls_mode_valid_p (V64HImode) && TARGET_MIN_VLEN >= 128\")\n+  (V128HI \"riscv_vector::vls_mode_valid_p (V128HImode) && TARGET_MIN_VLEN >= 256\")\n+  (V256HI \"riscv_vector::vls_mode_valid_p (V256HImode) && TARGET_MIN_VLEN >= 512\")\n+  (V512HI \"riscv_vector::vls_mode_valid_p (V512HImode) && TARGET_MIN_VLEN >= 1024\")\n+  (V1024HI \"riscv_vector::vls_mode_valid_p (V1024HImode) && TARGET_MIN_VLEN >= 2048\")\n+  (V1SI \"riscv_vector::vls_mode_valid_p (V1SImode)\")\n+  (V2SI \"riscv_vector::vls_mode_valid_p (V2SImode)\")\n+  (V4SI \"riscv_vector::vls_mode_valid_p (V4SImode)\")\n+  (V8SI \"riscv_vector::vls_mode_valid_p (V8SImode)\")\n+  (V16SI \"riscv_vector::vls_mode_valid_p (V16SImode) && TARGET_MIN_VLEN >= 64\")\n+  (V32SI \"riscv_vector::vls_mode_valid_p (V32SImode) && TARGET_MIN_VLEN >= 128\")\n+  (V64SI \"riscv_vector::vls_mode_valid_p (V64SImode) && TARGET_MIN_VLEN >= 256\")\n+  (V128SI \"riscv_vector::vls_mode_valid_p (V128SImode) && TARGET_MIN_VLEN >= 512\")\n+  (V256SI \"riscv_vector::vls_mode_valid_p (V256SImode) && TARGET_MIN_VLEN >= 1024\")\n+  (V512SI \"riscv_vector::vls_mode_valid_p (V512SImode) && TARGET_MIN_VLEN >= 2048\")\n+\n+  (V1HF \"riscv_vector::vls_mode_valid_p (V1HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V2HF \"riscv_vector::vls_mode_valid_p (V2HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V4HF 
\"riscv_vector::vls_mode_valid_p (V4HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V8HF \"riscv_vector::vls_mode_valid_p (V8HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V16HF \"riscv_vector::vls_mode_valid_p (V16HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V32HF \"riscv_vector::vls_mode_valid_p (V32HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 64\")\n+  (V64HF \"riscv_vector::vls_mode_valid_p (V64HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 128\")\n+  (V128HF \"riscv_vector::vls_mode_valid_p (V128HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 256\")\n+  (V256HF \"riscv_vector::vls_mode_valid_p (V256HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 512\")\n+  (V512HF \"riscv_vector::vls_mode_valid_p (V512HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 1024\")\n+  (V1024HF \"riscv_vector::vls_mode_valid_p (V1024HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 2048\")\n+  (V1BF \"riscv_vector::vls_mode_valid_p (V1BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V2BF \"riscv_vector::vls_mode_valid_p (V2BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V4BF \"riscv_vector::vls_mode_valid_p (V4BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V8BF \"riscv_vector::vls_mode_valid_p (V8BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V16BF \"riscv_vector::vls_mode_valid_p (V16BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V32BF \"riscv_vector::vls_mode_valid_p (V32BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 64\")\n+  (V64BF \"riscv_vector::vls_mode_valid_p (V64BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 128\")\n+  (V128BF \"riscv_vector::vls_mode_valid_p (V128BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 256\")\n+  (V256BF \"riscv_vector::vls_mode_valid_p (V256BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 512\")\n+  (V512BF \"riscv_vector::vls_mode_valid_p (V512BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 1024\")\n+  (V1024BF \"riscv_vector::vls_mode_valid_p (V1024BFmode) && 
TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 2048\")\n+  (V1SF \"riscv_vector::vls_mode_valid_p (V1SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V2SF \"riscv_vector::vls_mode_valid_p (V2SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V4SF \"riscv_vector::vls_mode_valid_p (V4SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V8SF \"riscv_vector::vls_mode_valid_p (V8SFmode) && TARGET_VECTOR_ELEN_FP_32\")\n+  (V16SF \"riscv_vector::vls_mode_valid_p (V16SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 64\")\n+  (V32SF \"riscv_vector::vls_mode_valid_p (V32SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 128\")\n+  (V64SF \"riscv_vector::vls_mode_valid_p (V64SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 256\")\n+  (V128SF \"riscv_vector::vls_mode_valid_p (V128SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 512\")\n+  (V256SF \"riscv_vector::vls_mode_valid_p (V256SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 1024\")\n+  (V512SF \"riscv_vector::vls_mode_valid_p (V512SFmode) && TARGET_VECTOR_ELEN_FP_32 && TARGET_MIN_VLEN >= 2048\")\n ])\n \n (define_mode_iterator VEEWTRUNC4 [\n@@ -409,6 +621,49 @@ (define_mode_iterator VEEWTRUNC4 [\n   (RVVM1HF \"TARGET_VECTOR_ELEN_FP_16 && TARGET_64BIT\")\n   (RVVMF2HF \"TARGET_VECTOR_ELEN_FP_16 && TARGET_64BIT\")\n   (RVVMF4HF \"TARGET_VECTOR_ELEN_FP_16 && TARGET_VECTOR_ELEN_64 && TARGET_64BIT\")\n+\n+  (V1QI \"riscv_vector::vls_mode_valid_p (V1QImode)\")\n+  (V2QI \"riscv_vector::vls_mode_valid_p (V2QImode)\")\n+  (V4QI \"riscv_vector::vls_mode_valid_p (V4QImode)\")\n+  (V8QI \"riscv_vector::vls_mode_valid_p (V8QImode)\")\n+  (V16QI \"riscv_vector::vls_mode_valid_p (V16QImode)\")\n+  (V32QI \"riscv_vector::vls_mode_valid_p (V32QImode) && TARGET_MIN_VLEN >= 64\")\n+  (V64QI \"riscv_vector::vls_mode_valid_p (V64QImode) && TARGET_MIN_VLEN >= 128\")\n+  (V128QI \"riscv_vector::vls_mode_valid_p (V128QImode) && TARGET_MIN_VLEN >= 256\")\n+  (V256QI \"riscv_vector::vls_mode_valid_p (V256QImode) && 
TARGET_MIN_VLEN >= 512\")\n+  (V512QI \"riscv_vector::vls_mode_valid_p (V512QImode) && TARGET_MIN_VLEN >= 1024\")\n+  (V1024QI \"riscv_vector::vls_mode_valid_p (V1024QImode) && TARGET_MIN_VLEN >= 2048\")\n+  (V1HI \"riscv_vector::vls_mode_valid_p (V1HImode)\")\n+  (V2HI \"riscv_vector::vls_mode_valid_p (V2HImode)\")\n+  (V4HI \"riscv_vector::vls_mode_valid_p (V4HImode)\")\n+  (V8HI \"riscv_vector::vls_mode_valid_p (V8HImode)\")\n+  (V16HI \"riscv_vector::vls_mode_valid_p (V16HImode)\")\n+  (V32HI \"riscv_vector::vls_mode_valid_p (V32HImode) && TARGET_MIN_VLEN >= 64\")\n+  (V64HI \"riscv_vector::vls_mode_valid_p (V64HImode) && TARGET_MIN_VLEN >= 128\")\n+  (V128HI \"riscv_vector::vls_mode_valid_p (V128HImode) && TARGET_MIN_VLEN >= 256\")\n+  (V256HI \"riscv_vector::vls_mode_valid_p (V256HImode) && TARGET_MIN_VLEN >= 512\")\n+  (V512HI \"riscv_vector::vls_mode_valid_p (V512HImode) && TARGET_MIN_VLEN >= 1024\")\n+\n+  (V1HF \"riscv_vector::vls_mode_valid_p (V1HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V2HF \"riscv_vector::vls_mode_valid_p (V2HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V4HF \"riscv_vector::vls_mode_valid_p (V4HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V8HF \"riscv_vector::vls_mode_valid_p (V8HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V16HF \"riscv_vector::vls_mode_valid_p (V16HFmode) && TARGET_VECTOR_ELEN_FP_16\")\n+  (V32HF \"riscv_vector::vls_mode_valid_p (V32HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 64\")\n+  (V64HF \"riscv_vector::vls_mode_valid_p (V64HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 128\")\n+  (V128HF \"riscv_vector::vls_mode_valid_p (V128HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 256\")\n+  (V256HF \"riscv_vector::vls_mode_valid_p (V256HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 512\")\n+  (V512HF \"riscv_vector::vls_mode_valid_p (V512HFmode) && TARGET_VECTOR_ELEN_FP_16 && TARGET_MIN_VLEN >= 1024\")\n+  (V1BF \"riscv_vector::vls_mode_valid_p (V1BFmode) && 
TARGET_VECTOR_ELEN_BF_16\")\n+  (V2BF \"riscv_vector::vls_mode_valid_p (V2BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V4BF \"riscv_vector::vls_mode_valid_p (V4BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V8BF \"riscv_vector::vls_mode_valid_p (V8BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V16BF \"riscv_vector::vls_mode_valid_p (V16BFmode) && TARGET_VECTOR_ELEN_BF_16\")\n+  (V32BF \"riscv_vector::vls_mode_valid_p (V32BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 64\")\n+  (V64BF \"riscv_vector::vls_mode_valid_p (V64BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 128\")\n+  (V128BF \"riscv_vector::vls_mode_valid_p (V128BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 256\")\n+  (V256BF \"riscv_vector::vls_mode_valid_p (V256BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 512\")\n+  (V512BF \"riscv_vector::vls_mode_valid_p (V512BFmode) && TARGET_VECTOR_ELEN_BF_16 && TARGET_MIN_VLEN >= 1024\")\n ])\n \n (define_mode_iterator VEEWTRUNC8 [\n@@ -416,6 +671,17 @@ (define_mode_iterator VEEWTRUNC8 [\n   (RVVMF2QI \"TARGET_64BIT\")\n   (RVVMF4QI \"TARGET_64BIT\")\n   (RVVMF8QI \"TARGET_VECTOR_ELEN_64 && TARGET_64BIT\")\n+\n+  (V1QI \"riscv_vector::vls_mode_valid_p (V1QImode)\")\n+  (V2QI \"riscv_vector::vls_mode_valid_p (V2QImode)\")\n+  (V4QI \"riscv_vector::vls_mode_valid_p (V4QImode)\")\n+  (V8QI \"riscv_vector::vls_mode_valid_p (V8QImode)\")\n+  (V16QI \"riscv_vector::vls_mode_valid_p (V16QImode)\")\n+  (V32QI \"riscv_vector::vls_mode_valid_p (V32QImode) && TARGET_MIN_VLEN >= 64\")\n+  (V64QI \"riscv_vector::vls_mode_valid_p (V64QImode) && TARGET_MIN_VLEN >= 128\")\n+  (V128QI \"riscv_vector::vls_mode_valid_p (V128QImode) && TARGET_MIN_VLEN >= 256\")\n+  (V256QI \"riscv_vector::vls_mode_valid_p (V256QImode) && TARGET_MIN_VLEN >= 512\")\n+  (V512QI \"riscv_vector::vls_mode_valid_p (V512QImode) && TARGET_MIN_VLEN >= 1024\")\n ])\n \n (define_mode_iterator VEI16 [\n@@ -1924,6 +2190,117 @@ (define_mode_attr VINDEX [\n   (V512DF 
\"V512DI\")\n ])\n \n+(define_mode_attr vindex [\n+  (RVVM8QI \"rvvm8qi\") (RVVM4QI \"rvvm4qi\") (RVVM2QI \"rvvm2qi\") (RVVM1QI \"rvvm1qi\")\n+  (RVVMF2QI \"rvvmf2qi\") (RVVMF4QI \"rvvmf4qi\") (RVVMF8QI \"rvvmf8qi\")\n+\n+  (RVVM8HI \"rvvm8hi\") (RVVM4HI \"rvvm4hi\") (RVVM2HI \"rvvm2hi\") (RVVM1HI \"rvvm1hi\") (RVVMF2HI \"rvvmf2hi\") (RVVMF4HI \"rvvmf4hi\")\n+\n+  (RVVM8BF \"rvvm8hi\") (RVVM4BF \"rvvm4hi\") (RVVM2BF \"rvvm2hi\") (RVVM1BF \"rvvm1hi\") (RVVMF2BF \"rvvmf2hi\") (RVVMF4BF \"rvvmf4hi\")\n+\n+  (RVVM8HF \"rvvm8hi\") (RVVM4HF \"rvvm4hi\") (RVVM2HF \"rvvm2hi\") (RVVM1HF \"rvvm1hi\") (RVVMF2HF \"rvvmf2hi\") (RVVMF4HF \"rvvmf4hi\")\n+\n+  (RVVM8SI \"rvvm8si\") (RVVM4SI \"rvvm4si\") (RVVM2SI \"rvvm2si\") (RVVM1SI \"rvvm1si\") (RVVMF2SI \"rvvmf2si\")\n+\n+  (RVVM8SF \"rvvm8si\") (RVVM4SF \"rvvm4si\") (RVVM2SF \"rvvm2si\") (RVVM1SF \"rvvm1si\") (RVVMF2SF \"rvvmf2si\")\n+\n+  (RVVM8DI \"rvvm8di\") (RVVM4DI \"rvvm4di\") (RVVM2DI \"rvvm2di\") (RVVM1DI \"rvvm1di\")\n+\n+  (RVVM8DF \"rvvm8di\") (RVVM4DF \"rvvm4di\") (RVVM2DF \"rvvm2di\") (RVVM1DF \"rvvm1di\")\n+\n+  (V1QI \"v1qi\")\n+  (V2QI \"v2qi\")\n+  (V4QI \"v4qi\")\n+  (V8QI \"v8qi\")\n+  (V16QI \"v16qi\")\n+  (V32QI \"v32qi\")\n+  (V64QI \"v64qi\")\n+  (V128QI \"v128qi\")\n+  (V256QI \"v256qi\")\n+  (V512QI \"v512qi\")\n+  (V1024QI \"v1024qi\")\n+  (V2048QI \"v2048qi\")\n+  (V4096QI \"v4096qi\")\n+  (V1HI \"v1hi\")\n+  (V2HI \"v2hi\")\n+  (V4HI \"v4hi\")\n+  (V8HI \"v8hi\")\n+  (V16HI \"v16hi\")\n+  (V32HI \"v32hi\")\n+  (V64HI \"v64hi\")\n+  (V128HI \"v128hi\")\n+  (V256HI \"v256hi\")\n+  (V512HI \"v512hi\")\n+  (V1024HI \"v1024hi\")\n+  (V2048HI \"v2048hi\")\n+  (V1SI \"v1si\")\n+  (V2SI \"v2si\")\n+  (V4SI \"v4si\")\n+  (V8SI \"V8SI\")\n+  (V16SI \"v16si\")\n+  (V32SI \"v32si\")\n+  (V64SI \"v64si\")\n+  (V128SI \"v128si\")\n+  (V256SI \"v256si\")\n+  (V512SI \"v512si\")\n+  (V1024SI \"v1024si\")\n+  (V1DI \"v1di\")\n+  (V2DI \"v2di\")\n+  (V4DI \"v4di\")\n+  (V8DI \"v8di\")\n+  (V16DI \"v16di\")\n+  (V32DI 
\"v32di\")\n+  (V64DI \"v64di\")\n+  (V128DI \"v128di\")\n+  (V256DI \"v256di\")\n+  (V512DI \"v512di\")\n+  (V1HF \"v1hi\")\n+  (V2HF \"v2hi\")\n+  (V4HF \"v4hi\")\n+  (V8HF \"v8hi\")\n+  (V16HF \"v16hi\")\n+  (V32HF \"v32hi\")\n+  (V64HF \"v64hi\")\n+  (V128HF \"v128hi\")\n+  (V256HF \"v256hi\")\n+  (V512HF \"v512hi\")\n+  (V1024HF \"v1024hi\")\n+  (V2048HF \"v2048hi\")\n+  (V1BF \"v1hi\")\n+  (V2BF \"v2hi\")\n+  (V4BF \"v4hi\")\n+  (V8BF \"v8hi\")\n+  (V16BF \"v16hi\")\n+  (V32BF \"v32hi\")\n+  (V64BF \"v64hi\")\n+  (V128BF \"v128hi\")\n+  (V256BF \"v256hi\")\n+  (V512BF \"v512hi\")\n+  (V1024BF \"v1024hi\")\n+  (V2048BF \"v2048hi\")\n+  (V1SF \"v1si\")\n+  (V2SF \"v2si\")\n+  (V4SF \"v4si\")\n+  (V8SF \"v8si\")\n+  (V16SF \"v16si\")\n+  (V32SF \"v32si\")\n+  (V64SF \"v64si\")\n+  (V128SF \"v128si\")\n+  (V256SF \"v256si\")\n+  (V512SF \"v512si\")\n+  (V1024SF \"v1024si\")\n+  (V1DF \"v1di\")\n+  (V2DF \"v2di\")\n+  (V4DF \"v4di\")\n+  (V8DF \"v8di\")\n+  (V16DF \"v16di\")\n+  (V32DF \"v32di\")\n+  (V64DF \"v64di\")\n+  (V128DF \"v128di\")\n+  (V256DF \"v256di\")\n+  (V512DF \"v512di\")\n+])\n+\n (define_mode_attr VINDEXEI16 [\n   (RVVM4QI \"RVVM8HI\") (RVVM2QI \"RVVM4HI\") (RVVM1QI \"RVVM2HI\") (RVVMF2QI \"RVVM1HI\") (RVVMF4QI \"RVVMF2HI\") (RVVMF8QI \"RVVMF4HI\")\n \n@@ -2779,6 +3156,85 @@ (define_mode_attr double_trunc_sew [\n   (RVVM8DI \"32\") (RVVM4DI \"32\") (RVVM2DI \"32\") (RVVM1DI \"32\")\n \n   (RVVM8DF \"32\") (RVVM4DF \"32\") (RVVM2DF \"32\") (RVVM1DF \"32\")\n+\n+  (V1HI \"8\")\n+  (V2HI \"8\")\n+  (V4HI \"8\")\n+  (V8HI \"8\")\n+  (V16HI \"8\")\n+  (V32HI \"8\")\n+  (V64HI \"8\")\n+  (V128HI \"8\")\n+  (V256HI \"8\")\n+  (V512HI \"8\")\n+  (V1024HI \"8\")\n+  (V2048HI \"8\")\n+  (V1SI \"16\")\n+  (V2SI \"16\")\n+  (V4SI \"16\")\n+  (V8SI \"16\")\n+  (V16SI \"16\")\n+  (V32SI \"16\")\n+  (V64SI \"16\")\n+  (V128SI \"16\")\n+  (V256SI \"16\")\n+  (V512SI \"16\")\n+  (V1024SI \"16\")\n+  (V1DI \"32\")\n+  (V2DI \"32\")\n+  (V4DI \"32\")\n+  (V8DI 
\"32\")\n+  (V16DI \"32\")\n+  (V32DI \"32\")\n+  (V64DI \"32\")\n+  (V128DI \"32\")\n+  (V256DI \"32\")\n+  (V512DI \"32\")\n+  (V1HF \"8\")\n+  (V2HF \"8\")\n+  (V4HF \"8\")\n+  (V8HF \"8\")\n+  (V16HF \"8\")\n+  (V32HF \"8\")\n+  (V64HF \"8\")\n+  (V128HF \"8\")\n+  (V256HF \"8\")\n+  (V512HF \"8\")\n+  (V1024HF \"8\")\n+  (V2048HF \"8\")\n+  (V1BF \"8\")\n+  (V2BF \"8\")\n+  (V4BF \"8\")\n+  (V8BF \"8\")\n+  (V16BF \"8\")\n+  (V32BF \"8\")\n+  (V64BF \"8\")\n+  (V128BF \"8\")\n+  (V256BF \"8\")\n+  (V512BF \"8\")\n+  (V1024BF \"8\")\n+  (V2048BF \"8\")\n+  (V1SF \"16\")\n+  (V2SF \"16\")\n+  (V4SF \"16\")\n+  (V8SF \"16\")\n+  (V16SF \"16\")\n+  (V32SF \"16\")\n+  (V64SF \"16\")\n+  (V128SF \"16\")\n+  (V256SF \"16\")\n+  (V512SF \"16\")\n+  (V1024SF \"16\")\n+  (V1DF \"32\")\n+  (V2DF \"32\")\n+  (V4DF \"32\")\n+  (V8DF \"32\")\n+  (V16DF \"32\")\n+  (V32DF \"32\")\n+  (V64DF \"32\")\n+  (V128DF \"32\")\n+  (V256DF \"32\")\n+  (V512DF \"32\")\n ])\n \n (define_mode_attr quad_trunc_sew [\n@@ -2789,12 +3245,76 @@ (define_mode_attr quad_trunc_sew [\n   (RVVM8DI \"16\") (RVVM4DI \"16\") (RVVM2DI \"16\") (RVVM1DI \"16\")\n \n   (RVVM8DF \"16\") (RVVM4DF \"16\") (RVVM2DF \"16\") (RVVM1DF \"16\")\n+\n+  (V1SI \"8\")\n+  (V2SI \"8\")\n+  (V4SI \"8\")\n+  (V8SI \"8\")\n+  (V16SI \"8\")\n+  (V32SI \"8\")\n+  (V64SI \"8\")\n+  (V128SI \"8\")\n+  (V256SI \"8\")\n+  (V512SI \"8\")\n+  (V1024SI \"8\")\n+  (V1DI \"16\")\n+  (V2DI \"16\")\n+  (V4DI \"16\")\n+  (V8DI \"16\")\n+  (V16DI \"16\")\n+  (V32DI \"16\")\n+  (V64DI \"16\")\n+  (V128DI \"16\")\n+  (V256DI \"16\")\n+  (V512DI \"16\")\n+  (V1SF \"8\")\n+  (V2SF \"8\")\n+  (V4SF \"8\")\n+  (V8SF \"8\")\n+  (V16SF \"8\")\n+  (V32SF \"8\")\n+  (V64SF \"8\")\n+  (V128SF \"8\")\n+  (V256SF \"8\")\n+  (V512SF \"8\")\n+  (V1024SF \"8\")\n+  (V1DF \"16\")\n+  (V2DF \"16\")\n+  (V4DF \"16\")\n+  (V8DF \"16\")\n+  (V16DF \"16\")\n+  (V32DF \"16\")\n+  (V64DF \"16\")\n+  (V128DF \"16\")\n+  (V256DF \"16\")\n+  (V512DF \"16\")\n ])\n 
\n (define_mode_attr oct_trunc_sew [\n   (RVVM8DI \"8\") (RVVM4DI \"8\") (RVVM2DI \"8\") (RVVM1DI \"8\")\n \n   (RVVM8DF \"8\") (RVVM4DF \"8\") (RVVM2DF \"8\") (RVVM1DF \"8\")\n+\n+  (V1DI \"8\")\n+  (V2DI \"8\")\n+  (V4DI \"8\")\n+  (V8DI \"8\")\n+  (V16DI \"8\")\n+  (V32DI \"8\")\n+  (V64DI \"8\")\n+  (V128DI \"8\")\n+  (V256DI \"8\")\n+  (V512DI \"8\")\n+  (V1DF \"8\")\n+  (V2DF \"8\")\n+  (V4DF \"8\")\n+  (V8DF \"8\")\n+  (V16DF \"8\")\n+  (V32DF \"8\")\n+  (V64DF \"8\")\n+  (V128DF \"8\")\n+  (V256DF \"8\")\n+  (V512DF \"8\")\n ])\n \n (define_mode_attr double_ext_sew [\n@@ -2809,6 +3329,72 @@ (define_mode_attr double_ext_sew [\n   (RVVM4SI \"64\") (RVVM2SI \"64\") (RVVM1SI \"64\") (RVVMF2SI \"64\")\n \n   (RVVM4SF \"64\") (RVVM2SF \"64\") (RVVM1SF \"64\") (RVVMF2SF \"64\")\n+\n+  (V1QI \"16\")\n+  (V2QI \"16\")\n+  (V4QI \"16\")\n+  (V8QI \"16\")\n+  (V16QI \"16\")\n+  (V32QI \"16\")\n+  (V64QI \"16\")\n+  (V128QI \"16\")\n+  (V256QI \"16\")\n+  (V512QI \"16\")\n+  (V1024QI \"16\")\n+  (V2048QI \"16\")\n+  (V1HI \"32\")\n+  (V2HI \"32\")\n+  (V4HI \"32\")\n+  (V8HI \"32\")\n+  (V16HI \"32\")\n+  (V32HI \"32\")\n+  (V64HI \"32\")\n+  (V128HI \"32\")\n+  (V256HI \"32\")\n+  (V512HI \"32\")\n+  (V1024HI \"32\")\n+  (V1SI \"64\")\n+  (V2SI \"64\")\n+  (V4SI \"64\")\n+  (V8SI \"64\")\n+  (V16SI \"64\")\n+  (V32SI \"64\")\n+  (V64SI \"64\")\n+  (V128SI \"64\")\n+  (V256SI \"64\")\n+  (V512SI \"64\")\n+  (V1HF \"32\")\n+  (V2HF \"32\")\n+  (V4HF \"32\")\n+  (V8HF \"32\")\n+  (V16HF \"32\")\n+  (V32HF \"32\")\n+  (V64HF \"32\")\n+  (V128HF \"32\")\n+  (V256HF \"32\")\n+  (V512HF \"32\")\n+  (V1024HF \"32\")\n+  (V1BF \"32\")\n+  (V2BF \"32\")\n+  (V4BF \"32\")\n+  (V8BF \"32\")\n+  (V16BF \"32\")\n+  (V32BF \"32\")\n+  (V64BF \"32\")\n+  (V128BF \"32\")\n+  (V256BF \"32\")\n+  (V512BF \"32\")\n+  (V1024BF \"32\")\n+  (V1SF \"64\")\n+  (V2SF \"64\")\n+  (V4SF \"64\")\n+  (V8SF \"64\")\n+  (V16SF \"64\")\n+  (V32SF \"64\")\n+  (V64SF \"64\")\n+  (V128SF \"64\")\n+  
(V256SF \"64\")\n+  (V512SF \"64\")\n ])\n \n (define_mode_attr quad_ext_sew [\n@@ -2817,10 +3403,63 @@ (define_mode_attr quad_ext_sew [\n   (RVVM2HI \"64\") (RVVM1HI \"64\") (RVVMF2HI \"64\") (RVVMF4HI \"64\")\n \n   (RVVM2HF \"64\") (RVVM1HF \"64\") (RVVMF2HF \"64\") (RVVMF4HF \"64\")\n+\n+  (V1QI \"32\")\n+  (V2QI \"32\")\n+  (V4QI \"32\")\n+  (V8QI \"32\")\n+  (V16QI \"32\")\n+  (V32QI \"32\")\n+  (V64QI \"32\")\n+  (V128QI \"32\")\n+  (V256QI \"32\")\n+  (V512QI \"32\")\n+  (V1024QI \"32\")\n+  (V1HI \"64\")\n+  (V2HI \"64\")\n+  (V4HI \"64\")\n+  (V8HI \"64\")\n+  (V16HI \"64\")\n+  (V32HI \"64\")\n+  (V64HI \"64\")\n+  (V128HI \"64\")\n+  (V256HI \"64\")\n+  (V512HI \"64\")\n+  (V1HF \"64\")\n+  (V2HF \"64\")\n+  (V4HF \"64\")\n+  (V8HF \"64\")\n+  (V16HF \"64\")\n+  (V32HF \"64\")\n+  (V64HF \"64\")\n+  (V128HF \"64\")\n+  (V256HF \"64\")\n+  (V512HF \"64\")\n+  (V1BF \"64\")\n+  (V2BF \"64\")\n+  (V4BF \"64\")\n+  (V8BF \"64\")\n+  (V16BF \"64\")\n+  (V32BF \"64\")\n+  (V64BF \"64\")\n+  (V128BF \"64\")\n+  (V256BF \"64\")\n+  (V512BF \"64\")\n ])\n \n (define_mode_attr oct_ext_sew [\n   (RVVM1QI \"64\") (RVVMF2QI \"64\") (RVVMF4QI \"64\") (RVVMF8QI \"64\")\n+\n+  (V1QI \"64\")\n+  (V2QI \"64\")\n+  (V4QI \"64\")\n+  (V8QI \"64\")\n+  (V16QI \"64\")\n+  (V32QI \"64\")\n+  (V64QI \"64\")\n+  (V128QI \"64\")\n+  (V256QI \"64\")\n+  (V512QI \"64\")\n ])\n \n (define_mode_attr V_DOUBLE_EXTEND [\n@@ -2957,6 +3596,34 @@ (define_mode_attr V_DOUBLE_TRUNC [\n   (V512DF \"V512SF\")\n ])\n \n+(define_mode_attr VF_DOUBLE_TRUNC_INDEX [\n+  (RVVM8SF \"RVVM4HI\") (RVVM4SF \"RVVM2HI\") (RVVM2SF \"RVVM1HI\") (RVVM1SF \"RVVMF2HI\") (RVVMF2SF \"RVVMF4HI\")\n+\n+  (RVVM8DF \"RVVM4SI\") (RVVM4DF \"RVVM2SI\") (RVVM2DF \"RVVM1SI\") (RVVM1DF \"RVVMF2SI\")\n+\n+  (V1SF \"V1HI\")\n+  (V2SF \"V2HI\")\n+  (V4SF \"V4HI\")\n+  (V8SF \"V8HI\")\n+  (V16SF \"V16HI\")\n+  (V32SF \"V32HI\")\n+  (V64SF \"V64HI\")\n+  (V128SF \"V128HI\")\n+  (V256SF \"V256HI\")\n+  (V512SF \"V512HI\")\n+  
(V1024SF \"V1024HI\")\n+  (V1DF \"V1SI\")\n+  (V2DF \"V2SI\")\n+  (V4DF \"V4SI\")\n+  (V8DF \"V8SI\")\n+  (V16DF \"V16SI\")\n+  (V32DF \"V32SI\")\n+  (V64DF \"V64SI\")\n+  (V128DF \"V128SI\")\n+  (V256DF \"V256SI\")\n+  (V512DF \"V512SI\")\n+])\n+\n (define_mode_attr V_QUAD_TRUNC [\n   (RVVM8SI \"RVVM2QI\") (RVVM4SI \"RVVM1QI\") (RVVM2SI \"RVVMF2QI\") (RVVM1SI \"RVVMF4QI\") (RVVMF2SI \"RVVMF8QI\")\n \n@@ -2997,32 +3664,253 @@ (define_mode_attr V_QUAD_TRUNC [\n   (V512DF \"V512HF\")\n ])\n \n-(define_mode_attr V_OCT_TRUNC [\n-  (RVVM8DI \"RVVM1QI\") (RVVM4DI \"RVVMF2QI\") (RVVM2DI \"RVVMF4QI\") (RVVM1DI \"RVVMF8QI\")\n+(define_mode_attr V_OCT_TRUNC [\n+  (RVVM8DI \"RVVM1QI\") (RVVM4DI \"RVVMF2QI\") (RVVM2DI \"RVVMF4QI\") (RVVM1DI \"RVVMF8QI\")\n+\n+  (V1DI \"V1QI\")\n+  (V2DI \"V2QI\")\n+  (V4DI \"V4QI\")\n+  (V8DI \"V8QI\")\n+  (V16DI \"V16QI\")\n+  (V32DI \"V32QI\")\n+  (V64DI \"V64QI\")\n+  (V128DI \"V128QI\")\n+  (V256DI \"V256QI\")\n+  (V512DI \"V512QI\")\n+])\n+\n+; Again in lower case.\n+(define_mode_attr v_double_trunc [\n+  (RVVM8HI \"rvvm4qi\") (RVVM4HI \"rvvm2qi\") (RVVM2HI \"rvvm1qi\") (RVVM1HI \"rvvmf2qi\") (RVVMF2HI \"rvvmf4qi\") (RVVMF4HI \"rvvmf8qi\")\n+\n+  (RVVM8SI \"rvvm4hi\") (RVVM4SI \"rvvm2hi\") (RVVM2SI \"rvvm1hi\") (RVVM1SI \"rvvmf2hi\") (RVVMF2SI \"rvvmf4hi\")\n+\n+  (RVVM8SF \"rvvm4hf\") (RVVM4SF \"rvvm2hf\") (RVVM2SF \"rvvm1hf\") (RVVM1SF \"rvvmf2hf\") (RVVMF2SF \"rvvmf4hf\")\n+\n+  (RVVM8DI \"rvvm4si\") (RVVM4DI \"rvvm2si\") (RVVM2DI \"rvvm1si\") (RVVM1DI \"rvvmf2si\")\n+\n+  (RVVM8DF \"rvvm4sf\") (RVVM4DF \"rvvm2sf\") (RVVM2DF \"rvvm1sf\") (RVVM1DF \"rvvmf2sf\")\n+\n+  (V1HI \"v1qi\")\n+  (V2HI \"v2qi\")\n+  (V4HI \"v4qi\")\n+  (V8HI \"v8qi\")\n+  (V16HI \"v16qi\")\n+  (V32HI \"v32qi\")\n+  (V64HI \"v64qi\")\n+  (V128HI \"v128qi\")\n+  (V256HI \"v256qi\")\n+  (V512HI \"v512qi\")\n+  (V1024HI \"v1024qi\")\n+  (V2048HI \"v2048qi\")\n+  (V1SI \"v1hi\")\n+  (V2SI \"v2hi\")\n+  (V4SI \"v4hi\")\n+  (V8SI \"v8hi\")\n+  (V16SI \"v16hi\")\n+  
(V32SI \"v32hi\")\n+  (V64SI \"v64hi\")\n+  (V128SI \"v128hi\")\n+  (V256SI \"v256hi\")\n+  (V512SI \"v512hi\")\n+  (V1024SI \"v1024hi\")\n+  (V1DI \"v1si\")\n+  (V2DI \"v2si\")\n+  (V4DI \"v4si\")\n+  (V8DI \"v8si\")\n+  (V16DI \"v16si\")\n+  (V32DI \"v32si\")\n+  (V64DI \"v64si\")\n+  (V128DI \"v128si\")\n+  (V256DI \"v256si\")\n+  (V512DI \"v512si\")\n+  (V1SF \"v1hf\")\n+  (V2SF \"v2hf\")\n+  (V4SF \"v4hf\")\n+  (V8SF \"v8hf\")\n+  (V16SF \"v16hf\")\n+  (V32SF \"v32hf\")\n+  (V64SF \"v64hf\")\n+  (V128SF \"v128hf\")\n+  (V256SF \"v256hf\")\n+  (V512SF \"v512hf\")\n+  (V1024SF \"v1024hf\")\n+  (V1DF \"v1sf\")\n+  (V2DF \"v2sf\")\n+  (V4DF \"v4sf\")\n+  (V8DF \"v8sf\")\n+  (V16DF \"v16sf\")\n+  (V32DF \"v32sf\")\n+  (V64DF \"v64sf\")\n+  (V128DF \"v128sf\")\n+  (V256DF \"v256sf\")\n+  (V512DF \"v512sf\")\n+])\n+\n+(define_mode_attr v_quad_trunc [\n+  (RVVM8SI \"rvvm2qi\") (RVVM4SI \"rvvm1qi\") (RVVM2SI \"rvvmf2qi\") (RVVM1SI \"rvvmf4qi\") (RVVMF2SI \"rvvmf8qi\")\n+\n+  (RVVM8DI \"rvvm2hi\") (RVVM4DI \"rvvm1hi\") (RVVM2DI \"rvvmf2hi\") (RVVM1DI \"rvvmf4hi\")\n+\n+  (RVVM8DF \"rvvm2hf\") (RVVM4DF \"rvvm1hf\") (RVVM2DF \"rvvmf2hf\") (RVVM1DF \"rvvmf4hf\")\n+\n+  (V1SI \"v1qi\")\n+  (V2SI \"v2qi\")\n+  (V4SI \"v4qi\")\n+  (V8SI \"v8qi\")\n+  (V16SI \"v16qi\")\n+  (V32SI \"v32qi\")\n+  (V64SI \"v64qi\")\n+  (V128SI \"v128qi\")\n+  (V256SI \"v256qi\")\n+  (V512SI \"v512qi\")\n+  (V1024SI \"v1024qi\")\n+  (V1DI \"v1hi\")\n+  (V2DI \"v2hi\")\n+  (V4DI \"v4hi\")\n+  (V8DI \"v8hi\")\n+  (V16DI \"v16hi\")\n+  (V32DI \"v32hi\")\n+  (V64DI \"v64hi\")\n+  (V128DI \"v128hi\")\n+  (V256DI \"v256hi\")\n+  (V512DI \"v512hi\")\n+  (V1DF \"v1hf\")\n+  (V2DF \"v2hf\")\n+  (V4DF \"v4hf\")\n+  (V8DF \"v8hf\")\n+  (V16DF \"v16hf\")\n+  (V32DF \"v32hf\")\n+  (V64DF \"v64hf\")\n+  (V128DF \"v128hf\")\n+  (V256DF \"v256hf\")\n+  (V512DF \"v512hf\")\n+])\n+\n+(define_mode_attr v_oct_trunc [\n+  (RVVM8DI \"rvvm1qi\") (RVVM4DI \"rvvmf2qi\") (RVVM2DI \"rvvmf4qi\") (RVVM1DI \"rvvmf8qi\")\n+\n+  
(V1DI \"v1qi\")\n+  (V2DI \"v2qi\")\n+  (V4DI \"v4qi\")\n+  (V8DI \"v8qi\")\n+  (V16DI \"v16qi\")\n+  (V32DI \"v32qi\")\n+  (V64DI \"v64qi\")\n+  (V128DI \"v128qi\")\n+  (V256DI \"v256qi\")\n+  (V512DI \"v512qi\")\n+])\n+\n+(define_mode_attr VINDEX_DOUBLE_TRUNC [\n+  (RVVM8HI \"RVVM4QI\") (RVVM4HI \"RVVM2QI\") (RVVM2HI \"RVVM1QI\") (RVVM1HI \"RVVMF2QI\") (RVVMF2HI \"RVVMF4QI\") (RVVMF4HI \"RVVMF8QI\")\n+\n+  (RVVM8BF \"RVVM4QI\") (RVVM4BF \"RVVM2QI\") (RVVM2BF \"RVVM1QI\") (RVVM1BF \"RVVMF2QI\") (RVVMF2BF \"RVVMF4QI\") (RVVMF4BF \"RVVMF8QI\")\n+\n+  (RVVM8HF \"RVVM4QI\") (RVVM4HF \"RVVM2QI\") (RVVM2HF \"RVVM1QI\") (RVVM1HF \"RVVMF2QI\") (RVVMF2HF \"RVVMF4QI\") (RVVMF4HF \"RVVMF8QI\")\n+\n+  (RVVM8SI \"RVVM4HI\") (RVVM4SI \"RVVM2HI\") (RVVM2SI \"RVVM1HI\") (RVVM1SI \"RVVMF2HI\") (RVVMF2SI \"RVVMF4HI\")\n+\n+  (RVVM8SF \"RVVM4HI\") (RVVM4SF \"RVVM2HI\") (RVVM2SF \"RVVM1HI\") (RVVM1SF \"RVVMF2HI\") (RVVMF2SF \"RVVMF4HI\")\n+\n+  (RVVM8DI \"RVVM4SI\") (RVVM4DI \"RVVM2SI\") (RVVM2DI \"RVVM1SI\") (RVVM1DI \"RVVMF2SI\")\n+\n+  (RVVM8DF \"RVVM4SI\") (RVVM4DF \"RVVM2SI\") (RVVM2DF \"RVVM1SI\") (RVVM1DF \"RVVMF2SI\")\n+\n+  (V1HI \"V1QI\")\n+  (V2HI \"V2QI\")\n+  (V4HI \"V4QI\")\n+  (V8HI \"V8QI\")\n+  (V16HI \"V16QI\")\n+  (V32HI \"V32QI\")\n+  (V64HI \"V64QI\")\n+  (V128HI \"V128QI\")\n+  (V256HI \"V256QI\")\n+  (V512HI \"V512QI\")\n+  (V1024HI \"V1024QI\")\n+  (V2048HI \"V2048QI\")\n+  (V1SI \"V1HI\")\n+  (V2SI \"V2HI\")\n+  (V4SI \"V4HI\")\n+  (V8SI \"V8HI\")\n+  (V16SI \"V16HI\")\n+  (V32SI \"V32HI\")\n+  (V64SI \"V64HI\")\n+  (V128SI \"V128HI\")\n+  (V256SI \"V256HI\")\n+  (V512SI \"V512HI\")\n+  (V1024SI \"V1024HI\")\n+  (V1DI \"V1SI\")\n+  (V2DI \"V2SI\")\n+  (V4DI \"V4SI\")\n+  (V8DI \"V8SI\")\n+  (V16DI \"V16SI\")\n+  (V32DI \"V32SI\")\n+  (V64DI \"V64SI\")\n+  (V128DI \"V128SI\")\n+  (V256DI \"V256SI\")\n+  (V512DI \"V512SI\")\n+  (V1HF \"V1QI\")\n+  (V2HF \"V2QI\")\n+  (V4HF \"V4QI\")\n+  (V8HF \"V8QI\")\n+  (V16HF \"V16QI\")\n+  (V32HF \"V32QI\")\n+  (V64HF 
\"V64QI\")\n+  (V128HF \"V128QI\")\n+  (V256HF \"V256QI\")\n+  (V512HF \"V512QI\")\n+  (V1024HF \"V1024QI\")\n+  (V2048HF \"V2048QI\")\n+  (V1BF \"V1QI\")\n+  (V2BF \"V2QI\")\n+  (V4BF \"V4QI\")\n+  (V8BF \"V8QI\")\n+  (V16BF \"V16QI\")\n+  (V32BF \"V32QI\")\n+  (V64BF \"V64QI\")\n+  (V128BF \"V128QI\")\n+  (V256BF \"V256QI\")\n+  (V512BF \"V512QI\")\n+  (V1024BF \"V1024QI\")\n+  (V2048BF \"V2048QI\")\n+  (V1SF \"V1HI\")\n+  (V2SF \"V2HI\")\n+  (V4SF \"V4HI\")\n+  (V8SF \"V8HI\")\n+  (V16SF \"V16HI\")\n+  (V32SF \"V32HI\")\n+  (V64SF \"V64HI\")\n+  (V128SF \"V128HI\")\n+  (V256SF \"V256HI\")\n+  (V512SF \"V512HI\")\n+  (V1024SF \"V1024HI\")\n+  (V1DF \"V1SI\")\n+  (V2DF \"V2SI\")\n+  (V4DF \"V4SI\")\n+  (V8DF \"V8SI\")\n+  (V16DF \"V16SI\")\n+  (V32DF \"V32SI\")\n+  (V64DF \"V64SI\")\n+  (V128DF \"V128SI\")\n+  (V256DF \"V256SI\")\n+  (V512DF \"V512SI\")\n+])\n+\n+(define_mode_attr vindex_double_trunc [\n+  (RVVM8HI \"rvvm4qi\") (RVVM4HI \"rvvm2qi\") (RVVM2HI \"rvvm1qi\") (RVVM1HI \"rvvmf2qi\") (RVVMF2HI \"rvvmf4qi\") (RVVMF4HI \"rvvmf8qi\")\n \n-  (V1DI \"V1QI\")\n-  (V2DI \"V2QI\")\n-  (V4DI \"V4QI\")\n-  (V8DI \"V8QI\")\n-  (V16DI \"V16QI\")\n-  (V32DI \"V32QI\")\n-  (V64DI \"V64QI\")\n-  (V128DI \"V128QI\")\n-  (V256DI \"V256QI\")\n-  (V512DI \"V512QI\")\n-])\n+  (RVVM8BF \"rvvm4qi\") (RVVM4BF \"rvvm2qi\") (RVVM2BF \"rvvm1qi\") (RVVM1BF \"rvvmf2qi\") (RVVMF2BF \"rvvmf4qi\") (RVVMF4BF \"rvvmf8qi\")\n \n-; Again in lower case.\n-(define_mode_attr v_double_trunc [\n-  (RVVM8HI \"rvvm4qi\") (RVVM4HI \"rvvm2qi\") (RVVM2HI \"rvvm1qi\") (RVVM1HI \"rvvmf2qi\") (RVVMF2HI \"rvvmf4qi\") (RVVMF4HI \"rvvmf8qi\")\n+  (RVVM8HF \"rvvm4qi\") (RVVM4HF \"rvvm2qi\") (RVVM2HF \"rvvm1qi\") (RVVM1HF \"rvvmf2qi\") (RVVMF2HF \"rvvmf4qi\") (RVVMF4HF \"rvvmf8qi\")\n \n   (RVVM8SI \"rvvm4hi\") (RVVM4SI \"rvvm2hi\") (RVVM2SI \"rvvm1hi\") (RVVM1SI \"rvvmf2hi\") (RVVMF2SI \"rvvmf4hi\")\n \n-  (RVVM8SF \"rvvm4hf\") (RVVM4SF \"rvvm2hf\") (RVVM2SF \"rvvm1hf\") (RVVM1SF \"rvvmf2hf\") (RVVMF2SF 
\"rvvmf4hf\")\n+  (RVVM8SF \"rvvm4hi\") (RVVM4SF \"rvvm2hi\") (RVVM2SF \"rvvm1hi\") (RVVM1SF \"rvvmf2hi\") (RVVMF2SF \"rvvmf4hi\")\n \n   (RVVM8DI \"rvvm4si\") (RVVM4DI \"rvvm2si\") (RVVM2DI \"rvvm1si\") (RVVM1DI \"rvvmf2si\")\n \n-  (RVVM8DF \"rvvm4sf\") (RVVM4DF \"rvvm2sf\") (RVVM2DF \"rvvm1sf\") (RVVM1DF \"rvvmf2sf\")\n+  (RVVM8DF \"rvvm4si\") (RVVM4DF \"rvvm2si\") (RVVM2DF \"rvvm1si\") (RVVM1DF \"rvvmf2si\")\n \n   (V1HI \"v1qi\")\n   (V2HI \"v2qi\")\n@@ -3057,35 +3945,114 @@ (define_mode_attr v_double_trunc [\n   (V128DI \"v128si\")\n   (V256DI \"v256si\")\n   (V512DI \"v512si\")\n-  (V1SF \"v1hf\")\n-  (V2SF \"v2hf\")\n-  (V4SF \"v4hf\")\n-  (V8SF \"v8hf\")\n-  (V16SF \"v16hf\")\n-  (V32SF \"v32hf\")\n-  (V64SF \"v64hf\")\n-  (V128SF \"v128hf\")\n-  (V256SF \"v256hf\")\n-  (V512SF \"v512hf\")\n-  (V1024SF \"v1024hf\")\n-  (V1DF \"v1sf\")\n-  (V2DF \"v2sf\")\n-  (V4DF \"v4sf\")\n-  (V8DF \"v8sf\")\n-  (V16DF \"v16sf\")\n-  (V32DF \"v32sf\")\n-  (V64DF \"v64sf\")\n-  (V128DF \"v128sf\")\n-  (V256DF \"v256sf\")\n-  (V512DF \"v512sf\")\n+  (V1HF \"v1qi\")\n+  (V2HF \"v2qi\")\n+  (V4HF \"v4qi\")\n+  (V8HF \"v8qi\")\n+  (V16HF \"v16qi\")\n+  (V32HF \"v32qi\")\n+  (V64HF \"v64qi\")\n+  (V128HF \"v128qi\")\n+  (V256HF \"v256qi\")\n+  (V512HF \"v512qi\")\n+  (V1024HF \"v1024qi\")\n+  (V2048HF \"v2048qi\")\n+  (V1BF \"v1qi\")\n+  (V2BF \"v2qi\")\n+  (V4BF \"v4qi\")\n+  (V8BF \"v8qi\")\n+  (V16BF \"v16qi\")\n+  (V32BF \"v32qi\")\n+  (V64BF \"v64qi\")\n+  (V128BF \"v128qi\")\n+  (V256BF \"v256qi\")\n+  (V512BF \"v512qi\")\n+  (V1024BF \"v1024qi\")\n+  (V2048BF \"v2048qi\")\n+  (V1SF \"v1hi\")\n+  (V2SF \"v2hi\")\n+  (V4SF \"v4hi\")\n+  (V8SF \"v8hi\")\n+  (V16SF \"v16hi\")\n+  (V32SF \"v32hi\")\n+  (V64SF \"v64hi\")\n+  (V128SF \"v128hi\")\n+  (V256SF \"v256hi\")\n+  (V512SF \"v512hi\")\n+  (V1024SF \"v1024hi\")\n+  (V1DF \"v1si\")\n+  (V2DF \"v2si\")\n+  (V4DF \"v4si\")\n+  (V8DF \"v8si\")\n+  (V16DF \"v16si\")\n+  (V32DF \"v32si\")\n+  (V64DF \"v64si\")\n+  (V128DF 
\"v128si\")\n+  (V256DF \"v256si\")\n+  (V512DF \"v512si\")\n ])\n \n-(define_mode_attr v_quad_trunc [\n+(define_mode_attr VINDEX_QUAD_TRUNC [\n+  (RVVM8SI \"RVVM2QI\") (RVVM4SI \"RVVM1QI\") (RVVM2SI \"RVVMF2QI\") (RVVM1SI \"RVVMF4QI\") (RVVMF2SI \"RVVMF8QI\")\n+\n+  (RVVM8SF \"RVVM2QI\") (RVVM4SF \"RVVM1QI\") (RVVM2SF \"RVVMF2QI\") (RVVM1SF \"RVVMF4QI\") (RVVMF2SF \"RVVMF8QI\")\n+\n+  (RVVM8DI \"RVVM2HI\") (RVVM4DI \"RVVM1HI\") (RVVM2DI \"RVVMF2HI\") (RVVM1DI \"RVVMF4HI\")\n+\n+  (RVVM8DF \"RVVM2HI\") (RVVM4DF \"RVVM1HI\") (RVVM2DF \"RVVMF2HI\") (RVVM1DF \"RVVMF4HI\")\n+\n+  (V1SI \"V1QI\")\n+  (V2SI \"V2QI\")\n+  (V4SI \"V4QI\")\n+  (V8SI \"V8QI\")\n+  (V16SI \"V16QI\")\n+  (V32SI \"V32QI\")\n+  (V64SI \"V64QI\")\n+  (V128SI \"V128QI\")\n+  (V256SI \"V256QI\")\n+  (V512SI \"V512QI\")\n+  (V1024SI \"V1024QI\")\n+  (V1DI \"V1HI\")\n+  (V2DI \"V2HI\")\n+  (V4DI \"V4HI\")\n+  (V8DI \"V8HI\")\n+  (V16DI \"V16HI\")\n+  (V32DI \"V32HI\")\n+  (V64DI \"V64HI\")\n+  (V128DI \"V128HI\")\n+  (V256DI \"V256HI\")\n+  (V512DI \"V512HI\")\n+  (V1SF \"V1QI\")\n+  (V2SF \"V2QI\")\n+  (V4SF \"V4QI\")\n+  (V8SF \"V8QI\")\n+  (V16SF \"V16QI\")\n+  (V32SF \"V32QI\")\n+  (V64SF \"V64QI\")\n+  (V128SF \"V128QI\")\n+  (V256SF \"V256QI\")\n+  (V512SF \"V512QI\")\n+  (V1024SF \"V1024QI\")\n+  (V1DF \"V1HI\")\n+  (V2DF \"V2HI\")\n+  (V4DF \"V4HI\")\n+  (V8DF \"V8HI\")\n+  (V16DF \"V16HI\")\n+  (V32DF \"V32HI\")\n+  (V64DF \"V64HI\")\n+  (V128DF \"V128HI\")\n+  (V256DF \"V256HI\")\n+  (V512DF \"V512HI\")\n+])\n+\n+(define_mode_attr vindex_quad_trunc [\n   (RVVM8SI \"rvvm2qi\") (RVVM4SI \"rvvm1qi\") (RVVM2SI \"rvvmf2qi\") (RVVM1SI \"rvvmf4qi\") (RVVMF2SI \"rvvmf8qi\")\n \n+  (RVVM8SF \"rvvm2qi\") (RVVM4SF \"rvvm1qi\") (RVVM2SF \"rvvmf2qi\") (RVVM1SF \"rvvmf4qi\") (RVVMF2SF \"rvvmf8qi\")\n+\n   (RVVM8DI \"rvvm2hi\") (RVVM4DI \"rvvm1hi\") (RVVM2DI \"rvvmf2hi\") (RVVM1DI \"rvvmf4hi\")\n \n-  (RVVM8DF 
\"rvvm2hi\") (RVVM4DF \"rvvm1hi\") (RVVM2DF \"rvvmf2hi\") (RVVM1DF \"rvvmf4hi\")\n \n   (V1SI \"v1qi\")\n   (V2SI \"v2qi\")\n@@ -3108,21 +4075,61 @@ (define_mode_attr v_quad_trunc [\n   (V128DI \"v128hi\")\n   (V256DI \"v256hi\")\n   (V512DI \"v512hi\")\n-  (V1DF \"v1hf\")\n-  (V2DF \"v2hf\")\n-  (V4DF \"v4hf\")\n-  (V8DF \"v8hf\")\n-  (V16DF \"v16hf\")\n-  (V32DF \"v32hf\")\n-  (V64DF \"v64hf\")\n-  (V128DF \"v128hf\")\n-  (V256DF \"v256hf\")\n-  (V512DF \"v512hf\")\n+  (V1SF \"v1qi\")\n+  (V2SF \"v2qi\")\n+  (V4SF \"v4qi\")\n+  (V8SF \"v8qi\")\n+  (V16SF \"v16qi\")\n+  (V32SF \"v32qi\")\n+  (V64SF \"v64qi\")\n+  (V128SF \"v128qi\")\n+  (V256SF \"v256qi\")\n+  (V512SF \"v512qi\")\n+  (V1024SF \"v1024qi\")\n+  (V1DF \"v1hi\")\n+  (V2DF \"v2hi\")\n+  (V4DF \"v4hi\")\n+  (V8DF \"v8hi\")\n+  (V16DF \"v16hi\")\n+  (V32DF \"v32hi\")\n+  (V64DF \"v64hi\")\n+  (V128DF \"v128hi\")\n+  (V256DF \"v256hi\")\n+  (V512DF \"v512hi\")\n ])\n \n-(define_mode_attr v_oct_trunc [\n+(define_mode_attr VINDEX_OCT_TRUNC [\n+  (RVVM8DI \"RVVM1QI\") (RVVM4DI \"RVVMF2QI\") (RVVM2DI \"RVVMF4QI\") (RVVM1DI \"RVVMF8QI\")\n+\n+  (RVVM8DF \"RVVM1QI\") (RVVM4DF \"RVVMF2QI\") (RVVM2DF \"RVVMF4QI\") (RVVM1DF \"RVVMF8QI\")\n+\n+  (V1DI \"V1QI\")\n+  (V2DI \"V2QI\")\n+  (V4DI \"V4QI\")\n+  (V8DI \"V8QI\")\n+  (V16DI \"V16QI\")\n+  (V32DI \"V32QI\")\n+  (V64DI \"V64QI\")\n+  (V128DI \"V128QI\")\n+  (V256DI \"V256QI\")\n+  (V512DI \"V512QI\")\n+  (V1DF \"V1QI\")\n+  (V2DF \"V2QI\")\n+  (V4DF \"V4QI\")\n+  (V8DF \"V8QI\")\n+  (V16DF \"V16QI\")\n+  (V32DF \"V32QI\")\n+  (V64DF \"V64QI\")\n+  (V128DF \"V128QI\")\n+  (V256DF \"V256QI\")\n+  (V512DF \"V512QI\")\n+])\n+\n+(define_mode_attr vindex_oct_trunc [\n   (RVVM8DI \"rvvm1qi\") (RVVM4DI \"rvvmf2qi\") (RVVM2DI \"rvvmf4qi\") (RVVM1DI \"rvvmf8qi\")\n \n+  (RVVM8DF \"rvvm1qi\") (RVVM4DF \"rvvmf2qi\") (RVVM2DF \"rvvmf4qi\") (RVVM1DF \"rvvmf8qi\")\n+\n   (V1DI \"v1qi\")\n   (V2DI \"v2qi\")\n   (V4DI \"v4qi\")\n@@ -3133,52 +4140,176 @@ (define_mode_attr 
v_oct_trunc [\n   (V128DI \"v128qi\")\n   (V256DI \"v256qi\")\n   (V512DI \"v512qi\")\n+  (V1DF \"v1qi\")\n+  (V2DF \"v2qi\")\n+  (V4DF \"v4qi\")\n+  (V8DF \"v8qi\")\n+  (V16DF \"v16qi\")\n+  (V32DF \"v32qi\")\n+  (V64DF \"v64qi\")\n+  (V128DF \"v128qi\")\n+  (V256DF \"v256qi\")\n+  (V512DF \"v512qi\")\n ])\n \n-(define_mode_attr VINDEX_DOUBLE_TRUNC [\n-  (RVVM8HI \"RVVM4QI\") (RVVM4HI \"RVVM2QI\") (RVVM2HI \"RVVM1QI\") (RVVM1HI \"RVVMF2QI\") (RVVMF2HI \"RVVMF4QI\") (RVVMF4HI \"RVVMF8QI\")\n-\n-  (RVVM8BF \"RVVM4QI\") (RVVM4BF \"RVVM2QI\") (RVVM2BF \"RVVM1QI\") (RVVM1BF \"RVVMF2QI\") (RVVMF2BF \"RVVMF4QI\") (RVVMF4BF \"RVVMF8QI\")\n-\n-  (RVVM8HF \"RVVM4QI\") (RVVM4HF \"RVVM2QI\") (RVVM2HF \"RVVM1QI\") (RVVM1HF \"RVVMF2QI\") (RVVMF2HF \"RVVMF4QI\") (RVVMF4HF \"RVVMF8QI\")\n-\n-  (RVVM8SI \"RVVM4HI\") (RVVM4SI \"RVVM2HI\") (RVVM2SI \"RVVM1HI\") (RVVM1SI \"RVVMF2HI\") (RVVMF2SI \"RVVMF4HI\")\n-\n-  (RVVM8SF \"RVVM4HI\") (RVVM4SF \"RVVM2HI\") (RVVM2SF \"RVVM1HI\") (RVVM1SF \"RVVMF2HI\") (RVVMF2SF \"RVVMF4HI\")\n-\n-  (RVVM8DI \"RVVM4SI\") (RVVM4DI \"RVVM2SI\") (RVVM2DI \"RVVM1SI\") (RVVM1DI \"RVVMF2SI\")\n-\n-  (RVVM8DF \"RVVM4SI\") (RVVM4DF \"RVVM2SI\") (RVVM2DF \"RVVM1SI\") (RVVM1DF \"RVVMF2SI\")\n-])\n+(define_mode_attr VINDEX_DOUBLE_EXT [\n+  (RVVM4QI \"RVVM8HI\") (RVVM2QI \"RVVM4HI\") (RVVM1QI \"RVVM2HI\") (RVVMF2QI \"RVVM1HI\") (RVVMF4QI \"RVVMF2HI\") (RVVMF8QI \"RVVMF4HI\")\n \n-(define_mode_attr VINDEX_QUAD_TRUNC [\n-  (RVVM8SI \"RVVM2QI\") (RVVM4SI \"RVVM1QI\") (RVVM2SI \"RVVMF2QI\") (RVVM1SI \"RVVMF4QI\") (RVVMF2SI \"RVVMF8QI\")\n+  (RVVM4HI \"RVVM8SI\") (RVVM2HI \"RVVM4SI\") (RVVM1HI \"RVVM2SI\") (RVVMF2HI \"RVVM1SI\") (RVVMF4HI \"RVVMF2SI\")\n \n-  (RVVM8SF \"RVVM2QI\") (RVVM4SF \"RVVM1QI\") (RVVM2SF \"RVVMF2QI\") (RVVM1SF \"RVVMF4QI\") (RVVMF2SF \"RVVMF8QI\")\n+  (RVVM4BF \"RVVM8SI\") (RVVM2BF \"RVVM4SI\") (RVVM1BF \"RVVM2SI\") (RVVMF2BF \"RVVM1SI\") (RVVMF4BF \"RVVMF2SI\")\n \n-  (RVVM8DI \"RVVM2HI\") (RVVM4DI \"RVVM1HI\") (RVVM2DI \"RVVMF2HI\") (RVVM1DI 
\"RVVMF4HI\")\n+  (RVVM4HF \"RVVM8SI\") (RVVM2HF \"RVVM4SI\") (RVVM1HF \"RVVM2SI\") (RVVMF2HF \"RVVM1SI\") (RVVMF4HF \"RVVMF2SI\")\n \n-  (RVVM8DF \"RVVM2HI\") (RVVM4DF \"RVVM1HI\") (RVVM2DF \"RVVMF2HI\") (RVVM1DF \"RVVMF4HI\")\n-])\n+  (RVVM4SI \"RVVM8DI\") (RVVM2SI \"RVVM4DI\") (RVVM1SI \"RVVM2DI\") (RVVMF2SI \"RVVM1DI\")\n \n-(define_mode_attr VINDEX_OCT_TRUNC [\n-  (RVVM8DI \"RVVM1QI\") (RVVM4DI \"RVVMF2QI\") (RVVM2DI \"RVVMF4QI\") (RVVM1DI \"RVVMF8QI\")\n+  (RVVM4SF \"RVVM8DI\") (RVVM2SF \"RVVM4DI\") (RVVM1SF \"RVVM2DI\") (RVVMF2SF \"RVVM1DI\")\n \n-  (RVVM8DF \"RVVM1QI\") (RVVM4DF \"RVVMF2QI\") (RVVM2DF \"RVVMF4QI\") (RVVM1DF \"RVVMF8QI\")\n+  (V1QI \"V1HI\")\n+  (V2QI \"V2HI\")\n+  (V4QI \"V4HI\")\n+  (V8QI \"V8HI\")\n+  (V16QI \"V16HI\")\n+  (V32QI \"V32HI\")\n+  (V64QI \"V64HI\")\n+  (V128QI \"V128HI\")\n+  (V256QI \"V256HI\")\n+  (V512QI \"V512HI\")\n+  (V1024QI \"V1024HI\")\n+  (V2048QI \"V2048HI\")\n+  (V1HI \"V1SI\")\n+  (V2HI \"V2SI\")\n+  (V4HI \"V4SI\")\n+  (V8HI \"V8SI\")\n+  (V16HI \"V16SI\")\n+  (V32HI \"V32SI\")\n+  (V64HI \"V64SI\")\n+  (V128HI \"V128SI\")\n+  (V256HI \"V256SI\")\n+  (V512HI \"V512SI\")\n+  (V1024HI \"V1024SI\")\n+  (V1SI \"V1DI\")\n+  (V2SI \"V2DI\")\n+  (V4SI \"V4DI\")\n+  (V8SI \"V8DI\")\n+  (V16SI \"V16DI\")\n+  (V32SI \"V32DI\")\n+  (V64SI \"V64DI\")\n+  (V128SI \"V128DI\")\n+  (V256SI \"V256DI\")\n+  (V512SI \"V512DI\")\n+  (V1HF \"V1SI\")\n+  (V2HF \"V2SI\")\n+  (V4HF \"V4SI\")\n+  (V8HF \"V8SI\")\n+  (V16HF \"V16SI\")\n+  (V32HF \"V32SI\")\n+  (V64HF \"V64SI\")\n+  (V128HF \"V128SI\")\n+  (V256HF \"V256SI\")\n+  (V512HF \"V512SI\")\n+  (V1024HF \"V1024SI\")\n+  (V1BF \"V1SI\")\n+  (V2BF \"V2SI\")\n+  (V4BF \"V4SI\")\n+  (V8BF \"V8SI\")\n+  (V16BF \"V16SI\")\n+  (V32BF \"V32SI\")\n+  (V64BF \"V64SI\")\n+  (V128BF \"V128SI\")\n+  (V256BF \"V256SI\")\n+  (V512BF \"V512SI\")\n+  (V1024BF \"V1024SI\")\n+  (V1SF \"V1DI\")\n+  (V2SF \"V2DI\")\n+  (V4SF \"V4DI\")\n+  (V8SF \"V8DI\")\n+  (V16SF \"V16DI\")\n+  (V32SF 
\"V32DI\")\n+  (V64SF \"V64DI\")\n+  (V128SF \"V128DI\")\n+  (V256SF \"V256DI\")\n+  (V512SF \"V512DI\")\n ])\n \n-(define_mode_attr VINDEX_DOUBLE_EXT [\n-  (RVVM4QI \"RVVM8HI\") (RVVM2QI \"RVVM4HI\") (RVVM1QI \"RVVM2HI\") (RVVMF2QI \"RVVM1HI\") (RVVMF4QI \"RVVMF2HI\") (RVVMF8QI \"RVVMF4HI\")\n+(define_mode_attr vindex_double_ext [\n+  (RVVM4QI \"rvvm8hi\") (RVVM2QI \"rvvm4hi\") (RVVM1QI \"rvvm2hi\") (RVVMF2QI \"rvvm1hi\") (RVVMF4QI \"rvvmf2hi\") (RVVMF8QI \"rvvmf4hi\")\n \n-  (RVVM4HI \"RVVM8SI\") (RVVM2HI \"RVVM4SI\") (RVVM1HI \"RVVM2SI\") (RVVMF2HI \"RVVM1SI\") (RVVMF4HI \"RVVMF2SI\")\n+  (RVVM4HI \"rvvm8si\") (RVVM2HI \"rvvm4si\") (RVVM1HI \"rvvm2si\") (RVVMF2HI \"rvvm1si\") (RVVMF4HI \"rvvmf2si\")\n \n-  (RVVM4BF \"RVVM8SI\") (RVVM2BF \"RVVM4SI\") (RVVM1BF \"RVVM2SI\") (RVVMF2BF \"RVVM1SI\") (RVVMF4BF \"RVVMF2SI\")\n+  (RVVM4BF \"rvvm8si\") (RVVM2BF \"rvvm4si\") (RVVM1BF \"rvvm2si\") (RVVMF2BF \"rvvm1si\") (RVVMF4BF \"rvvmf2si\")\n \n-  (RVVM4HF \"RVVM8SI\") (RVVM2HF \"RVVM4SI\") (RVVM1HF \"RVVM2SI\") (RVVMF2HF \"RVVM1SI\") (RVVMF4HF \"RVVMF2SI\")\n+  (RVVM4HF \"rvvm8si\") (RVVM2HF \"rvvm4si\") (RVVM1HF \"rvvm2si\") (RVVMF2HF \"rvvm1si\") (RVVMF4HF \"rvvmf2si\")\n \n-  (RVVM4SI \"RVVM8DI\") (RVVM2SI \"RVVM4DI\") (RVVM1SI \"RVVM2DI\") (RVVMF2SI \"RVVM1DI\")\n+  (RVVM4SI \"rvvm8di\") (RVVM2SI \"rvvm4di\") (RVVM1SI \"rvvm2di\") (RVVMF2SI \"rvvm1di\")\n \n-  (RVVM4SF \"RVVM8DI\") (RVVM2SF \"RVVM4DI\") (RVVM1SF \"RVVM2DI\") (RVVMF2SF \"RVVM1DI\")\n+  (RVVM4SF \"rvvm8di\") (RVVM2SF \"rvvm4di\") (RVVM1SF \"rvvm2di\") (RVVMF2SF \"rvvm1di\")\n+\n+  (V1QI \"v1hi\")\n+  (V2QI \"v2hi\")\n+  (V4QI \"v4hi\")\n+  (V8QI \"v8hi\")\n+  (V16QI \"v16hi\")\n+  (V32QI \"v32hi\")\n+  (V64QI \"v64hi\")\n+  (V128QI \"v128hi\")\n+  (V256QI \"v256hi\")\n+  (V512QI \"v512hi\")\n+  (V1024QI \"v1024hi\")\n+  (V2048QI \"v2048hi\")\n+  (V1HI \"v1si\")\n+  (V2HI \"v2si\")\n+  (V4HI \"v4si\")\n+  (V8HI \"v8si\")\n+  (V16HI \"v16si\")\n+  (V32HI \"v32si\")\n+  (V64HI \"v64si\")\n+  (V128HI 
\"v128si\")\n+  (V256HI \"v256si\")\n+  (V512HI \"v512si\")\n+  (V1024HI \"v1024si\")\n+  (V1SI \"v1di\")\n+  (V2SI \"v2di\")\n+  (V4SI \"v4di\")\n+  (V8SI \"v8di\")\n+  (V16SI \"v16di\")\n+  (V32SI \"v32di\")\n+  (V64SI \"v64di\")\n+  (V128SI \"v128di\")\n+  (V256SI \"v256di\")\n+  (V512SI \"v512di\")\n+  (V1HF \"v1si\")\n+  (V2HF \"v2si\")\n+  (V4HF \"v4si\")\n+  (V8HF \"v8si\")\n+  (V16HF \"v16si\")\n+  (V32HF \"v32si\")\n+  (V64HF \"v64si\")\n+  (V128HF \"v128si\")\n+  (V256HF \"v256si\")\n+  (V512HF \"v512si\")\n+  (V1024HF \"v1024si\")\n+  (V1BF \"v1si\")\n+  (V2BF \"v2si\")\n+  (V4BF \"v4si\")\n+  (V8BF \"v8si\")\n+  (V16BF \"v16si\")\n+  (V32BF \"v32si\")\n+  (V64BF \"v64si\")\n+  (V128BF \"v128si\")\n+  (V256BF \"v256si\")\n+  (V512BF \"v512si\")\n+  (V1024BF \"v1024si\")\n+  (V1SF \"v1di\")\n+  (V2SF \"v2di\")\n+  (V4SF \"v4di\")\n+  (V8SF \"v8di\")\n+  (V16SF \"v16di\")\n+  (V32SF \"v32di\")\n+  (V64SF \"v64di\")\n+  (V128SF \"v128di\")\n+  (V256SF \"v256di\")\n+  (V512SF \"v512di\")\n ])\n \n (define_mode_attr VINDEX_QUAD_EXT [\n@@ -3189,10 +4320,130 @@ (define_mode_attr VINDEX_QUAD_EXT [\n   (RVVM2BF \"RVVM8DI\") (RVVM1BF \"RVVM4DI\") (RVVMF2BF \"RVVM2DI\") (RVVMF4BF \"RVVM1DI\")\n \n   (RVVM2HF \"RVVM8DI\") (RVVM1HF \"RVVM4DI\") (RVVMF2HF \"RVVM2DI\") (RVVMF4HF \"RVVM1DI\")\n+\n+  (V1QI \"V1SI\")\n+  (V2QI \"V2SI\")\n+  (V4QI \"V4SI\")\n+  (V8QI \"V8SI\")\n+  (V16QI \"V16SI\")\n+  (V32QI \"V32SI\")\n+  (V64QI \"V64SI\")\n+  (V128QI \"V128SI\")\n+  (V256QI \"V256SI\")\n+  (V512QI \"V512SI\")\n+  (V1024QI \"V1024SI\")\n+  (V1HI \"V1DI\")\n+  (V2HI \"V2DI\")\n+  (V4HI \"V4DI\")\n+  (V8HI \"V8DI\")\n+  (V16HI \"V16DI\")\n+  (V32HI \"V32DI\")\n+  (V64HI \"V64DI\")\n+  (V128HI \"V128DI\")\n+  (V256HI \"V256DI\")\n+  (V512HI \"V512DI\")\n+  (V1HF \"V1DI\")\n+  (V2HF \"V2DI\")\n+  (V4HF \"V4DI\")\n+  (V8HF \"V8DI\")\n+  (V16HF \"V16DI\")\n+  (V32HF \"V32DI\")\n+  (V64HF \"V64DI\")\n+  (V128HF \"V128DI\")\n+  (V256HF \"V256DI\")\n+  (V512HF \"V512DI\")\n+  
(V1BF \"V1DI\")\n+  (V2BF \"V2DI\")\n+  (V4BF \"V4DI\")\n+  (V8BF \"V8DI\")\n+  (V16BF \"V16DI\")\n+  (V32BF \"V32DI\")\n+  (V64BF \"V64DI\")\n+  (V128BF \"V128DI\")\n+  (V256BF \"V256DI\")\n+  (V512BF \"V512DI\")\n+])\n+\n+(define_mode_attr vindex_quad_ext [\n+  (RVVM2QI \"rvvm8si\") (RVVM1QI \"rvvm4si\") (RVVMF2QI \"rvvm2si\") (RVVMF4QI \"rvvm1si\") (RVVMF8QI \"rvvmf2si\")\n+\n+  (RVVM2HI \"rvvm8di\") (RVVM1HI \"rvvm4di\") (RVVMF2HI \"rvvm2di\") (RVVMF4HI \"rvvm1di\")\n+\n+  (RVVM2BF \"rvvm8di\") (RVVM1BF \"rvvm4di\") (RVVMF2BF \"rvvm2di\") (RVVMF4BF \"rvvm1di\")\n+\n+  (RVVM2HF \"rvvm8di\") (RVVM1HF \"rvvm4di\") (RVVMF2HF \"rvvm2di\") (RVVMF4HF \"rvvm1di\")\n+\n+  (V1QI \"v1si\")\n+  (V2QI \"v2si\")\n+  (V4QI \"v4si\")\n+  (V8QI \"v8si\")\n+  (V16QI \"v16si\")\n+  (V32QI \"v32si\")\n+  (V64QI \"v64si\")\n+  (V128QI \"v128si\")\n+  (V256QI \"v256si\")\n+  (V512QI \"v512si\")\n+  (V1024QI \"v1024si\")\n+  (V1HI \"v1di\")\n+  (V2HI \"v2di\")\n+  (V4HI \"v4di\")\n+  (V8HI \"v8di\")\n+  (V16HI \"v16di\")\n+  (V32HI \"v32di\")\n+  (V64HI \"v64di\")\n+  (V128HI \"v128di\")\n+  (V256HI \"v256di\")\n+  (V512HI \"v512di\")\n+  (V1HF \"v1di\")\n+  (V2HF \"v2di\")\n+  (V4HF \"v4di\")\n+  (V8HF \"v8di\")\n+  (V16HF \"v16di\")\n+  (V32HF \"v32di\")\n+  (V64HF \"v64di\")\n+  (V128HF \"v128di\")\n+  (V256HF \"v256di\")\n+  (V512HF \"v512di\")\n+  (V1BF \"v1di\")\n+  (V2BF \"v2di\")\n+  (V4BF \"v4di\")\n+  (V8BF \"v8di\")\n+  (V16BF \"v16di\")\n+  (V32BF \"v32di\")\n+  (V64BF \"v64di\")\n+  (V128BF \"v128di\")\n+  (V256BF \"v256di\")\n+  (V512BF \"v512di\")\n ])\n \n (define_mode_attr VINDEX_OCT_EXT [\n   (RVVM1QI \"RVVM8DI\") (RVVMF2QI \"RVVM4DI\") (RVVMF4QI \"RVVM2DI\") (RVVMF8QI \"RVVM1DI\")\n+\n+  (V1QI \"V1DI\")\n+  (V2QI \"V2DI\")\n+  (V4QI \"V4DI\")\n+  (V8QI \"V8DI\")\n+  (V16QI \"V16DI\")\n+  (V32QI \"V32DI\")\n+  (V64QI \"V64DI\")\n+  (V128QI \"V128DI\")\n+  (V256QI \"V256DI\")\n+  (V512QI \"V512DI\")\n+])\n+\n+(define_mode_attr vindex_oct_ext [\n+  (RVVM1QI 
\"rvvm8di\") (RVVMF2QI \"rvvm4di\") (RVVMF4QI \"rvvm2di\") (RVVMF8QI \"rvvm1di\")\n+\n+  (V1QI \"v1di\")\n+  (V2QI \"v2di\")\n+  (V4QI \"v4di\")\n+  (V8QI \"v8di\")\n+  (V16QI \"v16di\")\n+  (V32QI \"v32di\")\n+  (V64QI \"v64di\")\n+  (V128QI \"v128di\")\n+  (V256QI \"v256di\")\n+  (V512QI \"v512di\")\n ])\n \n (define_mode_attr VCONVERT [\ndiff --git a/gcc/config/riscv/vector.md b/gcc/config/riscv/vector.md\nindex ba4a43b185c..fb23f49c603 100644\n--- a/gcc/config/riscv/vector.md\n+++ b/gcc/config/riscv/vector.md\n@@ -2680,7 +2680,7 @@ (define_insn \"@pred_indexed_<order>load<mode>_x8_smaller_eew\"\n   [(set_attr \"type\" \"vld<order>x\")\n    (set_attr \"mode\" \"<MODE>\")])\n \n-(define_insn \"@pred_indexed_<order>store<RATIO64:mode><RATIO64I:mode>\"\n+(define_insn \"@pred_indexed_<order>store<mode>_same_eew\"\n   [(set (mem:BLK (scratch))\n \t(unspec:BLK\n \t  [(unspec:<VM>\n@@ -2690,14 +2690,14 @@ (define_insn \"@pred_indexed_<order>store<RATIO64:mode><RATIO64I:mode>\"\n \t     (reg:SI VL_REGNUM)\n \t     (reg:SI VTYPE_REGNUM)] UNSPEC_VPREDICATE)\n \t   (match_operand 1 \"pmode_reg_or_0_operand\"      \"  rJ\")\n-\t   (match_operand:RATIO64I 2 \"register_operand\" \"  vr\")\n-\t   (match_operand:RATIO64 3 \"register_operand\"  \"  vr\")] ORDER))]\n+\t   (match_operand:<VINDEX> 2 \"register_operand\" \"  vr\")\n+\t   (match_operand:VINDEXED 3 \"register_operand\"  \"  vr\")] ORDER))]\n   \"TARGET_VECTOR\"\n-  \"vs<order>xei<RATIO64I:sew>.v\\t%3,(%z1),%2%p0\"\n+  \"vs<order>xei<sew>.v\\t%3,(%z1),%2%p0\"\n   [(set_attr \"type\" \"vst<order>x\")\n-   (set_attr \"mode\" \"<RATIO64:MODE>\")])\n+   (set_attr \"mode\" \"<MODE>\")])\n \n-(define_insn \"@pred_indexed_<order>store<RATIO32:mode><RATIO32I:mode>\"\n+(define_insn \"@pred_indexed_<order>store<mode>_x2_greater_eew\"\n   [(set (mem:BLK (scratch))\n \t(unspec:BLK\n \t  [(unspec:<VM>\n@@ -2707,14 +2707,14 @@ (define_insn \"@pred_indexed_<order>store<RATIO32:mode><RATIO32I:mode>\"\n \t     (reg:SI VL_REGNUM)\n \t   
  (reg:SI VTYPE_REGNUM)] UNSPEC_VPREDICATE)\n \t   (match_operand 1 \"pmode_reg_or_0_operand\"      \"  rJ\")\n-\t   (match_operand:RATIO32I 2 \"register_operand\" \"  vr\")\n-\t   (match_operand:RATIO32 3 \"register_operand\"  \"  vr\")] ORDER))]\n+\t   (match_operand:<VINDEX_DOUBLE_TRUNC> 2 \"register_operand\" \"  vr\")\n+\t   (match_operand:VEEWEXT2 3 \"register_operand\"  \"  vr\")] ORDER))]\n   \"TARGET_VECTOR\"\n-  \"vs<order>xei<RATIO32I:sew>.v\\t%3,(%z1),%2%p0\"\n+  \"vs<order>xei<double_trunc_sew>.v\\t%3,(%z1),%2%p0\"\n   [(set_attr \"type\" \"vst<order>x\")\n-   (set_attr \"mode\" \"<RATIO32:MODE>\")])\n+   (set_attr \"mode\" \"<MODE>\")])\n \n-(define_insn \"@pred_indexed_<order>store<RATIO16:mode><RATIO16I:mode>\"\n+(define_insn \"@pred_indexed_<order>store<mode>_x4_greater_eew\"\n   [(set (mem:BLK (scratch))\n \t(unspec:BLK\n \t  [(unspec:<VM>\n@@ -2724,14 +2724,14 @@ (define_insn \"@pred_indexed_<order>store<RATIO16:mode><RATIO16I:mode>\"\n \t     (reg:SI VL_REGNUM)\n \t     (reg:SI VTYPE_REGNUM)] UNSPEC_VPREDICATE)\n \t   (match_operand 1 \"pmode_reg_or_0_operand\"      \"  rJ\")\n-\t   (match_operand:RATIO16I 2 \"register_operand\" \"  vr\")\n-\t   (match_operand:RATIO16 3 \"register_operand\"  \"  vr\")] ORDER))]\n+\t   (match_operand:<VINDEX_QUAD_TRUNC> 2 \"register_operand\" \"  vr\")\n+\t   (match_operand:VEEWEXT4 3 \"register_operand\"  \"  vr\")] ORDER))]\n   \"TARGET_VECTOR\"\n-  \"vs<order>xei<RATIO16I:sew>.v\\t%3,(%z1),%2%p0\"\n+  \"vs<order>xei<quad_trunc_sew>.v\\t%3,(%z1),%2%p0\"\n   [(set_attr \"type\" \"vst<order>x\")\n-   (set_attr \"mode\" \"<RATIO16:MODE>\")])\n+   (set_attr \"mode\" \"<MODE>\")])\n \n-(define_insn \"@pred_indexed_<order>store<RATIO8:mode><RATIO8I:mode>\"\n+(define_insn \"@pred_indexed_<order>store<mode>_x8_greater_eew\"\n   [(set (mem:BLK (scratch))\n \t(unspec:BLK\n \t  [(unspec:<VM>\n@@ -2741,14 +2741,14 @@ (define_insn \"@pred_indexed_<order>store<RATIO8:mode><RATIO8I:mode>\"\n \t     (reg:SI VL_REGNUM)\n \t     
(reg:SI VTYPE_REGNUM)] UNSPEC_VPREDICATE)\n \t   (match_operand 1 \"pmode_reg_or_0_operand\"      \"  rJ\")\n-\t   (match_operand:RATIO8I 2 \"register_operand\" \"  vr\")\n-\t   (match_operand:RATIO8 3 \"register_operand\"  \"  vr\")] ORDER))]\n+\t   (match_operand:<VINDEX_OCT_TRUNC> 2 \"register_operand\" \"  vr\")\n+\t   (match_operand:VEEWEXT8 3 \"register_operand\"  \"  vr\")] ORDER))]\n   \"TARGET_VECTOR\"\n-  \"vs<order>xei<RATIO8I:sew>.v\\t%3,(%z1),%2%p0\"\n+  \"vs<order>xei<oct_trunc_sew>.v\\t%3,(%z1),%2%p0\"\n   [(set_attr \"type\" \"vst<order>x\")\n-   (set_attr \"mode\" \"<RATIO8:MODE>\")])\n+   (set_attr \"mode\" \"<MODE>\")])\n \n-(define_insn \"@pred_indexed_<order>store<RATIO4:mode><RATIO4I:mode>\"\n+(define_insn \"@pred_indexed_<order>store<mode>_x2_smaller_eew\"\n   [(set (mem:BLK (scratch))\n \t(unspec:BLK\n \t  [(unspec:<VM>\n@@ -2758,14 +2758,14 @@ (define_insn \"@pred_indexed_<order>store<RATIO4:mode><RATIO4I:mode>\"\n \t     (reg:SI VL_REGNUM)\n \t     (reg:SI VTYPE_REGNUM)] UNSPEC_VPREDICATE)\n \t   (match_operand 1 \"pmode_reg_or_0_operand\"      \"  rJ\")\n-\t   (match_operand:RATIO4I 2 \"register_operand\" \"  vr\")\n-\t   (match_operand:RATIO4 3 \"register_operand\"  \"  vr\")] ORDER))]\n+\t   (match_operand:<VINDEX_DOUBLE_EXT> 2 \"register_operand\" \"  vr\")\n+\t   (match_operand:VEEWTRUNC2 3 \"register_operand\"  \"  vr\")] ORDER))]\n   \"TARGET_VECTOR\"\n-  \"vs<order>xei<RATIO4I:sew>.v\\t%3,(%z1),%2%p0\"\n+  \"vs<order>xei<double_ext_sew>.v\\t%3,(%z1),%2%p0\"\n   [(set_attr \"type\" \"vst<order>x\")\n-   (set_attr \"mode\" \"<RATIO4:MODE>\")])\n+   (set_attr \"mode\" \"<MODE>\")])\n \n-(define_insn \"@pred_indexed_<order>store<RATIO2:mode><RATIO2I:mode>\"\n+(define_insn \"@pred_indexed_<order>store<mode>_x4_smaller_eew\"\n   [(set (mem:BLK (scratch))\n \t(unspec:BLK\n \t  [(unspec:<VM>\n@@ -2774,15 +2774,15 @@ (define_insn \"@pred_indexed_<order>store<RATIO2:mode><RATIO2I:mode>\"\n \t     (match_operand 5 \"const_int_operand\"       
 \"    i\")\n \t     (reg:SI VL_REGNUM)\n \t     (reg:SI VTYPE_REGNUM)] UNSPEC_VPREDICATE)\n-\t   (match_operand 1 \"pmode_reg_or_0_operand\"       \"  rJ\")\n-\t   (match_operand:RATIO2I 2 \"register_operand\"  \"  vr\")\n-\t   (match_operand:RATIO2 3 \"register_operand\"   \"  vr\")] ORDER))]\n+\t   (match_operand 1 \"pmode_reg_or_0_operand\"      \"  rJ\")\n+\t   (match_operand:<VINDEX_QUAD_EXT> 2 \"register_operand\" \"  vr\")\n+\t   (match_operand:VEEWTRUNC4 3 \"register_operand\"  \"  vr\")] ORDER))]\n   \"TARGET_VECTOR\"\n-  \"vs<order>xei<RATIO2I:sew>.v\\t%3,(%z1),%2%p0\"\n+  \"vs<order>xei<quad_ext_sew>.v\\t%3,(%z1),%2%p0\"\n   [(set_attr \"type\" \"vst<order>x\")\n-   (set_attr \"mode\" \"<RATIO2:MODE>\")])\n+   (set_attr \"mode\" \"<MODE>\")])\n \n-(define_insn \"@pred_indexed_<order>store<RATIO1:mode><RATIO1:mode>\"\n+(define_insn \"@pred_indexed_<order>store<mode>_x8_smaller_eew\"\n   [(set (mem:BLK (scratch))\n \t(unspec:BLK\n \t  [(unspec:<VM>\n@@ -2792,12 +2792,12 @@ (define_insn \"@pred_indexed_<order>store<RATIO1:mode><RATIO1:mode>\"\n \t     (reg:SI VL_REGNUM)\n \t     (reg:SI VTYPE_REGNUM)] UNSPEC_VPREDICATE)\n \t   (match_operand 1 \"pmode_reg_or_0_operand\"       \"  rJ\")\n-\t   (match_operand:RATIO1 2 \"register_operand\"   \"  vr\")\n-\t   (match_operand:RATIO1 3 \"register_operand\"    \"  vr\")] ORDER))]\n+\t   (match_operand:<VINDEX_OCT_EXT> 2 \"register_operand\"  \"  vr\")\n+\t   (match_operand:VEEWTRUNC8 3 \"register_operand\"   \"  vr\")] ORDER))]\n   \"TARGET_VECTOR\"\n-  \"vs<order>xei<RATIO1:sew>.v\\t%3,(%z1),%2%p0\"\n+  \"vs<order>xei<oct_ext_sew>.v\\t%3,(%z1),%2%p0\"\n   [(set_attr \"type\" \"vst<order>x\")\n-   (set_attr \"mode\" \"<RATIO1:MODE>\")])\n+   (set_attr \"mode\" \"<MODE>\")])\n \n ;; -------------------------------------------------------------------------------\n ;; ---- Predicated integer binary operations\n","prefixes":["v3","1/4"]}