From patchwork Thu Aug 31 09:08:17 2023
X-Patchwork-Submitter: Chenghui Pan <panchenghui@loongson.cn>
X-Patchwork-Id: 1828177
From: Chenghui Pan <panchenghui@loongson.cn>
To: gcc-patches@gcc.gnu.org
Subject: [PATCH v6 4/4] LoongArch: Add Loongson ASX directive builtin function support.
Date: Thu, 31 Aug 2023 17:08:17 +0800
Message-Id: <20230831090817.30636-5-panchenghui@loongson.cn>
In-Reply-To: <20230831090817.30636-1-panchenghui@loongson.cn>
References: <20230831090817.30636-1-panchenghui@loongson.cn>
Cc: xuchenghua@loongson.cn, chenglulu@loongson.cn, i@xen0n.name

From: Lulu Cheng <chenglulu@loongson.cn>

gcc/ChangeLog:

	* config.gcc: Export the header file lasxintrin.h.
	* config/loongarch/loongarch-builtins.cc (enum loongarch_builtin_type):
	Add Loongson ASX builtin functions support.
	(AVAIL_ALL): Ditto.
	(LASX_BUILTIN): Ditto.
	(LASX_NO_TARGET_BUILTIN): Ditto.
	(LASX_BUILTIN_TEST_BRANCH): Ditto.
	(CODE_FOR_lasx_xvsadd_b): Ditto.
	(CODE_FOR_lasx_xvsadd_h): Ditto.
	(CODE_FOR_lasx_xvsadd_w): Ditto.
	(CODE_FOR_lasx_xvsadd_d): Ditto.
	(CODE_FOR_lasx_xvsadd_bu): Ditto.
	(CODE_FOR_lasx_xvsadd_hu): Ditto.
	(CODE_FOR_lasx_xvsadd_wu): Ditto.
	(CODE_FOR_lasx_xvsadd_du): Ditto.
	(CODE_FOR_lasx_xvadd_b): Ditto.
	(CODE_FOR_lasx_xvadd_h): Ditto.
	(CODE_FOR_lasx_xvadd_w): Ditto.
	(CODE_FOR_lasx_xvadd_d): Ditto.
	(CODE_FOR_lasx_xvaddi_bu): Ditto.
	(CODE_FOR_lasx_xvaddi_hu): Ditto.
	(CODE_FOR_lasx_xvaddi_wu): Ditto.
	(CODE_FOR_lasx_xvaddi_du): Ditto.
	(CODE_FOR_lasx_xvand_v): Ditto.
	(CODE_FOR_lasx_xvandi_b): Ditto.
	(CODE_FOR_lasx_xvbitsel_v): Ditto.
	(CODE_FOR_lasx_xvseqi_b): Ditto.
	(CODE_FOR_lasx_xvseqi_h): Ditto.
	(CODE_FOR_lasx_xvseqi_w): Ditto.
	(CODE_FOR_lasx_xvseqi_d): Ditto.
	(CODE_FOR_lasx_xvslti_b): Ditto.
	(CODE_FOR_lasx_xvslti_h): Ditto.
	(CODE_FOR_lasx_xvslti_w): Ditto.
	(CODE_FOR_lasx_xvslti_d): Ditto.
	(CODE_FOR_lasx_xvslti_bu): Ditto.
	(CODE_FOR_lasx_xvslti_hu): Ditto.
	(CODE_FOR_lasx_xvslti_wu): Ditto.
	(CODE_FOR_lasx_xvslti_du): Ditto.
	(CODE_FOR_lasx_xvslei_b): Ditto.
	(CODE_FOR_lasx_xvslei_h): Ditto.
	(CODE_FOR_lasx_xvslei_w): Ditto.
	(CODE_FOR_lasx_xvslei_d): Ditto.
	(CODE_FOR_lasx_xvslei_bu): Ditto.
	(CODE_FOR_lasx_xvslei_hu): Ditto.
	(CODE_FOR_lasx_xvslei_wu): Ditto.
	(CODE_FOR_lasx_xvslei_du): Ditto.
	(CODE_FOR_lasx_xvdiv_b): Ditto.
	(CODE_FOR_lasx_xvdiv_h): Ditto.
	(CODE_FOR_lasx_xvdiv_w): Ditto.
	(CODE_FOR_lasx_xvdiv_d): Ditto.
	(CODE_FOR_lasx_xvdiv_bu): Ditto.
	(CODE_FOR_lasx_xvdiv_hu): Ditto.
	(CODE_FOR_lasx_xvdiv_wu): Ditto.
	(CODE_FOR_lasx_xvdiv_du): Ditto.
	(CODE_FOR_lasx_xvfadd_s): Ditto.
	(CODE_FOR_lasx_xvfadd_d): Ditto.
	(CODE_FOR_lasx_xvftintrz_w_s): Ditto.
	(CODE_FOR_lasx_xvftintrz_l_d): Ditto.
	(CODE_FOR_lasx_xvftintrz_wu_s): Ditto.
	(CODE_FOR_lasx_xvftintrz_lu_d): Ditto.
	(CODE_FOR_lasx_xvffint_s_w): Ditto.
	(CODE_FOR_lasx_xvffint_d_l): Ditto.
	(CODE_FOR_lasx_xvffint_s_wu): Ditto.
	(CODE_FOR_lasx_xvffint_d_lu): Ditto.
	(CODE_FOR_lasx_xvfsub_s): Ditto.
	(CODE_FOR_lasx_xvfsub_d): Ditto.
	(CODE_FOR_lasx_xvfmul_s): Ditto.
	(CODE_FOR_lasx_xvfmul_d): Ditto.
	(CODE_FOR_lasx_xvfdiv_s): Ditto.
	(CODE_FOR_lasx_xvfdiv_d): Ditto.
	(CODE_FOR_lasx_xvfmax_s): Ditto.
	(CODE_FOR_lasx_xvfmax_d): Ditto.
	(CODE_FOR_lasx_xvfmin_s): Ditto.
	(CODE_FOR_lasx_xvfmin_d): Ditto.
	(CODE_FOR_lasx_xvfsqrt_s): Ditto.
	(CODE_FOR_lasx_xvfsqrt_d): Ditto.
	(CODE_FOR_lasx_xvflogb_s): Ditto.
	(CODE_FOR_lasx_xvflogb_d): Ditto.
	(CODE_FOR_lasx_xvmax_b): Ditto.
	(CODE_FOR_lasx_xvmax_h): Ditto.
	(CODE_FOR_lasx_xvmax_w): Ditto.
	(CODE_FOR_lasx_xvmax_d): Ditto.
	(CODE_FOR_lasx_xvmaxi_b): Ditto.
	(CODE_FOR_lasx_xvmaxi_h): Ditto.
	(CODE_FOR_lasx_xvmaxi_w): Ditto.
	(CODE_FOR_lasx_xvmaxi_d): Ditto.
	(CODE_FOR_lasx_xvmax_bu): Ditto.
	(CODE_FOR_lasx_xvmax_hu): Ditto.
	(CODE_FOR_lasx_xvmax_wu): Ditto.
	(CODE_FOR_lasx_xvmax_du): Ditto.
	(CODE_FOR_lasx_xvmaxi_bu): Ditto.
	(CODE_FOR_lasx_xvmaxi_hu): Ditto.
	(CODE_FOR_lasx_xvmaxi_wu): Ditto.
	(CODE_FOR_lasx_xvmaxi_du): Ditto.
	(CODE_FOR_lasx_xvmin_b): Ditto.
	(CODE_FOR_lasx_xvmin_h): Ditto.
	(CODE_FOR_lasx_xvmin_w): Ditto.
	(CODE_FOR_lasx_xvmin_d): Ditto.
	(CODE_FOR_lasx_xvmini_b): Ditto.
	(CODE_FOR_lasx_xvmini_h): Ditto.
	(CODE_FOR_lasx_xvmini_w): Ditto.
	(CODE_FOR_lasx_xvmini_d): Ditto.
	(CODE_FOR_lasx_xvmin_bu): Ditto.
	(CODE_FOR_lasx_xvmin_hu): Ditto.
	(CODE_FOR_lasx_xvmin_wu): Ditto.
	(CODE_FOR_lasx_xvmin_du): Ditto.
	(CODE_FOR_lasx_xvmini_bu): Ditto.
	(CODE_FOR_lasx_xvmini_hu): Ditto.
	(CODE_FOR_lasx_xvmini_wu): Ditto.
	(CODE_FOR_lasx_xvmini_du): Ditto.
	(CODE_FOR_lasx_xvmod_b): Ditto.
	(CODE_FOR_lasx_xvmod_h): Ditto.
	(CODE_FOR_lasx_xvmod_w): Ditto.
	(CODE_FOR_lasx_xvmod_d): Ditto.
	(CODE_FOR_lasx_xvmod_bu): Ditto.
	(CODE_FOR_lasx_xvmod_hu): Ditto.
	(CODE_FOR_lasx_xvmod_wu): Ditto.
	(CODE_FOR_lasx_xvmod_du): Ditto.
	(CODE_FOR_lasx_xvmul_b): Ditto.
	(CODE_FOR_lasx_xvmul_h): Ditto.
	(CODE_FOR_lasx_xvmul_w): Ditto.
	(CODE_FOR_lasx_xvmul_d): Ditto.
	(CODE_FOR_lasx_xvclz_b): Ditto.
	(CODE_FOR_lasx_xvclz_h): Ditto.
	(CODE_FOR_lasx_xvclz_w): Ditto.
	(CODE_FOR_lasx_xvclz_d): Ditto.
	(CODE_FOR_lasx_xvnor_v): Ditto.
	(CODE_FOR_lasx_xvor_v): Ditto.
	(CODE_FOR_lasx_xvori_b): Ditto.
	(CODE_FOR_lasx_xvnori_b): Ditto.
	(CODE_FOR_lasx_xvpcnt_b): Ditto.
	(CODE_FOR_lasx_xvpcnt_h): Ditto.
	(CODE_FOR_lasx_xvpcnt_w): Ditto.
	(CODE_FOR_lasx_xvpcnt_d): Ditto.
	(CODE_FOR_lasx_xvxor_v): Ditto.
	(CODE_FOR_lasx_xvxori_b): Ditto.
	(CODE_FOR_lasx_xvsll_b): Ditto.
	(CODE_FOR_lasx_xvsll_h): Ditto.
	(CODE_FOR_lasx_xvsll_w): Ditto.
	(CODE_FOR_lasx_xvsll_d): Ditto.
	(CODE_FOR_lasx_xvslli_b): Ditto.
	(CODE_FOR_lasx_xvslli_h): Ditto.
	(CODE_FOR_lasx_xvslli_w): Ditto.
	(CODE_FOR_lasx_xvslli_d): Ditto.
	(CODE_FOR_lasx_xvsra_b): Ditto.
	(CODE_FOR_lasx_xvsra_h): Ditto.
	(CODE_FOR_lasx_xvsra_w): Ditto.
	(CODE_FOR_lasx_xvsra_d): Ditto.
	(CODE_FOR_lasx_xvsrai_b): Ditto.
	(CODE_FOR_lasx_xvsrai_h): Ditto.
	(CODE_FOR_lasx_xvsrai_w): Ditto.
	(CODE_FOR_lasx_xvsrai_d): Ditto.
	(CODE_FOR_lasx_xvsrl_b): Ditto.
	(CODE_FOR_lasx_xvsrl_h): Ditto.
	(CODE_FOR_lasx_xvsrl_w): Ditto.
	(CODE_FOR_lasx_xvsrl_d): Ditto.
	(CODE_FOR_lasx_xvsrli_b): Ditto.
	(CODE_FOR_lasx_xvsrli_h): Ditto.
	(CODE_FOR_lasx_xvsrli_w): Ditto.
	(CODE_FOR_lasx_xvsrli_d): Ditto.
	(CODE_FOR_lasx_xvsub_b): Ditto.
	(CODE_FOR_lasx_xvsub_h): Ditto.
	(CODE_FOR_lasx_xvsub_w): Ditto.
	(CODE_FOR_lasx_xvsub_d): Ditto.
	(CODE_FOR_lasx_xvsubi_bu): Ditto.
	(CODE_FOR_lasx_xvsubi_hu): Ditto.
	(CODE_FOR_lasx_xvsubi_wu): Ditto.
	(CODE_FOR_lasx_xvsubi_du): Ditto.
	(CODE_FOR_lasx_xvpackod_d): Ditto.
	(CODE_FOR_lasx_xvpackev_d): Ditto.
	(CODE_FOR_lasx_xvpickod_d): Ditto.
	(CODE_FOR_lasx_xvpickev_d): Ditto.
	(CODE_FOR_lasx_xvrepli_b): Ditto.
	(CODE_FOR_lasx_xvrepli_h): Ditto.
	(CODE_FOR_lasx_xvrepli_w): Ditto.
	(CODE_FOR_lasx_xvrepli_d): Ditto.
	(CODE_FOR_lasx_xvandn_v): Ditto.
	(CODE_FOR_lasx_xvorn_v): Ditto.
	(CODE_FOR_lasx_xvneg_b): Ditto.
	(CODE_FOR_lasx_xvneg_h): Ditto.
	(CODE_FOR_lasx_xvneg_w): Ditto.
	(CODE_FOR_lasx_xvneg_d): Ditto.
	(CODE_FOR_lasx_xvbsrl_v): Ditto.
	(CODE_FOR_lasx_xvbsll_v): Ditto.
	(CODE_FOR_lasx_xvfmadd_s): Ditto.
	(CODE_FOR_lasx_xvfmadd_d): Ditto.
	(CODE_FOR_lasx_xvfmsub_s): Ditto.
	(CODE_FOR_lasx_xvfmsub_d): Ditto.
	(CODE_FOR_lasx_xvfnmadd_s): Ditto.
	(CODE_FOR_lasx_xvfnmadd_d): Ditto.
	(CODE_FOR_lasx_xvfnmsub_s): Ditto.
	(CODE_FOR_lasx_xvfnmsub_d): Ditto.
	(CODE_FOR_lasx_xvpermi_q): Ditto.
	(CODE_FOR_lasx_xvpermi_d): Ditto.
	(CODE_FOR_lasx_xbnz_v): Ditto.
	(CODE_FOR_lasx_xbz_v): Ditto.
	(CODE_FOR_lasx_xvssub_b): Ditto.
	(CODE_FOR_lasx_xvssub_h): Ditto.
	(CODE_FOR_lasx_xvssub_w): Ditto.
	(CODE_FOR_lasx_xvssub_d): Ditto.
	(CODE_FOR_lasx_xvssub_bu): Ditto.
	(CODE_FOR_lasx_xvssub_hu): Ditto.
	(CODE_FOR_lasx_xvssub_wu): Ditto.
	(CODE_FOR_lasx_xvssub_du): Ditto.
	(CODE_FOR_lasx_xvabsd_b): Ditto.
	(CODE_FOR_lasx_xvabsd_h): Ditto.
	(CODE_FOR_lasx_xvabsd_w): Ditto.
	(CODE_FOR_lasx_xvabsd_d): Ditto.
	(CODE_FOR_lasx_xvabsd_bu): Ditto.
	(CODE_FOR_lasx_xvabsd_hu): Ditto.
	(CODE_FOR_lasx_xvabsd_wu): Ditto.
	(CODE_FOR_lasx_xvabsd_du): Ditto.
	(CODE_FOR_lasx_xvavg_b): Ditto.
	(CODE_FOR_lasx_xvavg_h): Ditto.
	(CODE_FOR_lasx_xvavg_w): Ditto.
	(CODE_FOR_lasx_xvavg_d): Ditto.
	(CODE_FOR_lasx_xvavg_bu): Ditto.
	(CODE_FOR_lasx_xvavg_hu): Ditto.
	(CODE_FOR_lasx_xvavg_wu): Ditto.
	(CODE_FOR_lasx_xvavg_du): Ditto.
	(CODE_FOR_lasx_xvavgr_b): Ditto.
	(CODE_FOR_lasx_xvavgr_h): Ditto.
	(CODE_FOR_lasx_xvavgr_w): Ditto.
	(CODE_FOR_lasx_xvavgr_d): Ditto.
	(CODE_FOR_lasx_xvavgr_bu): Ditto.
	(CODE_FOR_lasx_xvavgr_hu): Ditto.
	(CODE_FOR_lasx_xvavgr_wu): Ditto.
	(CODE_FOR_lasx_xvavgr_du): Ditto.
	(CODE_FOR_lasx_xvmuh_b): Ditto.
	(CODE_FOR_lasx_xvmuh_h): Ditto.
	(CODE_FOR_lasx_xvmuh_w): Ditto.
	(CODE_FOR_lasx_xvmuh_d): Ditto.
	(CODE_FOR_lasx_xvmuh_bu): Ditto.
	(CODE_FOR_lasx_xvmuh_hu): Ditto.
	(CODE_FOR_lasx_xvmuh_wu): Ditto.
	(CODE_FOR_lasx_xvmuh_du): Ditto.
	(CODE_FOR_lasx_xvssran_b_h): Ditto.
	(CODE_FOR_lasx_xvssran_h_w): Ditto.
	(CODE_FOR_lasx_xvssran_w_d): Ditto.
	(CODE_FOR_lasx_xvssran_bu_h): Ditto.
	(CODE_FOR_lasx_xvssran_hu_w): Ditto.
	(CODE_FOR_lasx_xvssran_wu_d): Ditto.
	(CODE_FOR_lasx_xvssrarn_b_h): Ditto.
	(CODE_FOR_lasx_xvssrarn_h_w): Ditto.
	(CODE_FOR_lasx_xvssrarn_w_d): Ditto.
	(CODE_FOR_lasx_xvssrarn_bu_h): Ditto.
	(CODE_FOR_lasx_xvssrarn_hu_w): Ditto.
	(CODE_FOR_lasx_xvssrarn_wu_d): Ditto.
	(CODE_FOR_lasx_xvssrln_bu_h): Ditto.
	(CODE_FOR_lasx_xvssrln_hu_w): Ditto.
	(CODE_FOR_lasx_xvssrln_wu_d): Ditto.
	(CODE_FOR_lasx_xvssrlrn_bu_h): Ditto.
	(CODE_FOR_lasx_xvssrlrn_hu_w): Ditto.
	(CODE_FOR_lasx_xvssrlrn_wu_d): Ditto.
	(CODE_FOR_lasx_xvftint_w_s): Ditto.
	(CODE_FOR_lasx_xvftint_l_d): Ditto.
	(CODE_FOR_lasx_xvftint_wu_s): Ditto.
	(CODE_FOR_lasx_xvftint_lu_d): Ditto.
	(CODE_FOR_lasx_xvsllwil_h_b): Ditto.
	(CODE_FOR_lasx_xvsllwil_w_h): Ditto.
	(CODE_FOR_lasx_xvsllwil_d_w): Ditto.
	(CODE_FOR_lasx_xvsllwil_hu_bu): Ditto.
	(CODE_FOR_lasx_xvsllwil_wu_hu): Ditto.
	(CODE_FOR_lasx_xvsllwil_du_wu): Ditto.
	(CODE_FOR_lasx_xvsat_b): Ditto.
	(CODE_FOR_lasx_xvsat_h): Ditto.
	(CODE_FOR_lasx_xvsat_w): Ditto.
	(CODE_FOR_lasx_xvsat_d): Ditto.
	(CODE_FOR_lasx_xvsat_bu): Ditto.
	(CODE_FOR_lasx_xvsat_hu): Ditto.
	(CODE_FOR_lasx_xvsat_wu): Ditto.
	(CODE_FOR_lasx_xvsat_du): Ditto.
	(loongarch_builtin_vectorized_function): Ditto.
	(loongarch_expand_builtin_insn): Ditto.
	(loongarch_expand_builtin): Ditto.
	* config/loongarch/loongarch-ftypes.def (1): Ditto.
	(2): Ditto.
	(3): Ditto.
	(4): Ditto.
	* config/loongarch/lasxintrin.h: New file.
---
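For readers who want to try the new header, here is a minimal user-side
sketch (not part of this diff; it assumes a compiler built with this series,
targeting LoongArch with -mlasx so that __loongarch_asx is defined):

/* example.c -- build with: gcc -mlasx -O2 example.c  */
#include <stdio.h>
#include <lasxintrin.h>

int
main (void)
{
  __m256i a = __lasx_xvrepli_b (1);   /* splat immediate 1 into all 32 bytes */
  __m256i b = __lasx_xvrepli_b (2);
  __m256i c = __lasx_xvadd_b (a, b);  /* element-wise byte add: every lane is 3 */

  signed char out[32];
  __builtin_memcpy (out, &c, sizeof out);
  printf ("%d\n", out[0]);            /* prints 3 */
  return 0;
}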
 gcc/config.gcc                             |    2 +-
 gcc/config/loongarch/lasxintrin.h          | 5338 ++++++++++++++++++++
 gcc/config/loongarch/loongarch-builtins.cc | 1180 ++++-
 gcc/config/loongarch/loongarch-ftypes.def  |  271 +-
 4 files changed, 6788 insertions(+), 3 deletions(-)
 create mode 100644 gcc/config/loongarch/lasxintrin.h

diff --git a/gcc/config.gcc b/gcc/config.gcc
index 8fda51e157c..ae6fe50590c 100644
--- a/gcc/config.gcc
+++ b/gcc/config.gcc
@@ -469,7 +469,7 @@ mips*-*-*)
 	;;
 loongarch*-*-*)
 	cpu_type=loongarch
-	extra_headers="larchintrin.h lsxintrin.h"
+	extra_headers="larchintrin.h lsxintrin.h lasxintrin.h"
 	extra_objs="loongarch-c.o loongarch-builtins.o loongarch-cpu.o loongarch-opts.o loongarch-def.o"
 	extra_gcc_objs="loongarch-driver.o loongarch-cpu.o loongarch-opts.o loongarch-def.o"
 	extra_options="${extra_options} g.opt fused-madd.opt"
diff --git a/gcc/config/loongarch/lasxintrin.h b/gcc/config/loongarch/lasxintrin.h
new file mode 100644
index 00000000000..d3937992746
--- /dev/null
+++ b/gcc/config/loongarch/lasxintrin.h
@@ -0,0 +1,5338 @@
+/* LARCH Loongson ASX intrinsics include file.
+
+   Copyright (C) 2018 Free Software Foundation, Inc.
+
+   This file is part of GCC.
+
+   GCC is free software; you can redistribute it and/or modify it
+   under the terms of the GNU General Public License as published
+   by the Free Software Foundation; either version 3, or (at your
+   option) any later version.
+
+   GCC is distributed in the hope that it will be useful, but WITHOUT
+   ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
+   or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public
+   License for more details.
+
+   Under Section 7 of GPL version 3, you are granted additional
+   permissions described in the GCC Runtime Library Exception, version
+   3.1, as published by the Free Software Foundation.
+
+   You should have received a copy of the GNU General Public License and
+   a copy of the GCC Runtime Library Exception along with this program;
+   see the files COPYING3 and COPYING.RUNTIME respectively.  If not, see
+   <http://www.gnu.org/licenses/>.  */
+
+#ifndef _GCC_LOONGSON_ASXINTRIN_H
+#define _GCC_LOONGSON_ASXINTRIN_H 1
+
+#if defined(__loongarch_asx)
+
+typedef signed char v32i8 __attribute__ ((vector_size(32), aligned(32)));
+typedef signed char v32i8_b __attribute__ ((vector_size(32), aligned(1)));
+typedef unsigned char v32u8 __attribute__ ((vector_size(32), aligned(32)));
+typedef unsigned char v32u8_b __attribute__ ((vector_size(32), aligned(1)));
+typedef short v16i16 __attribute__ ((vector_size(32), aligned(32)));
+typedef short v16i16_h __attribute__ ((vector_size(32), aligned(2)));
+typedef unsigned short v16u16 __attribute__ ((vector_size(32), aligned(32)));
+typedef unsigned short v16u16_h __attribute__ ((vector_size(32), aligned(2)));
+typedef int v8i32 __attribute__ ((vector_size(32), aligned(32)));
+typedef int v8i32_w __attribute__ ((vector_size(32), aligned(4)));
+typedef unsigned int v8u32 __attribute__ ((vector_size(32), aligned(32)));
+typedef unsigned int v8u32_w __attribute__ ((vector_size(32), aligned(4)));
+typedef long long v4i64 __attribute__ ((vector_size(32), aligned(32)));
+typedef long long v4i64_d __attribute__ ((vector_size(32), aligned(8)));
+typedef unsigned long long v4u64 __attribute__ ((vector_size(32), aligned(32)));
+typedef unsigned long long v4u64_d __attribute__ ((vector_size(32), aligned(8)));
+typedef float v8f32 __attribute__ ((vector_size(32), aligned(32)));
+typedef float v8f32_w __attribute__ ((vector_size(32), aligned(4)));
+typedef double v4f64 __attribute__ ((vector_size(32), aligned(32)));
+typedef double v4f64_d __attribute__ ((vector_size(32), aligned(8)));
+typedef float __m256 __attribute__ ((__vector_size__ (32), __may_alias__));
+typedef long long __m256i __attribute__ ((__vector_size__ (32), __may_alias__));
+typedef double __m256d __attribute__ ((__vector_size__ (32), __may_alias__));
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsll_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsll_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsll_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsll_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsll_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsll_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsll_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsll_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui3.  */
+/* Data types in instruction templates: V32QI, V32QI, UQI.  */
+#define __lasx_xvslli_b(/*__m256i*/ _1, /*ui3*/ _2) \
+  ((__m256i)__builtin_lasx_xvslli_b ((v32i8)(_1), (_2)))
+
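A note on the two flavours visible above: register-count shifts
(__lasx_xvsll_*) are inline functions, while immediate-count shifts
(__lasx_xvslli_*) are macros so the ui3..ui6 operand reaches the builtin as a
compile-time constant. A user-side sketch, illustrative only:

#include <lasxintrin.h>

__m256i
shift_both_ways (__m256i v, __m256i counts)
{
  __m256i a = __lasx_xvsll_b (v, counts); /* per-byte shift, counts from a register */
  __m256i b = __lasx_xvslli_b (v, 3);     /* per-byte shift by the constant 3 (ui3) */
  return __lasx_xvadd_b (a, b);           /* combine, just to use both results */
}

Passing a non-constant count to the macro form is expected to be rejected
when the builtin is expanded.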
+/* Assembly instruction format: xd, xj, ui4.  */
+/* Data types in instruction templates: V16HI, V16HI, UQI.  */
+#define __lasx_xvslli_h(/*__m256i*/ _1, /*ui4*/ _2) \
+  ((__m256i)__builtin_lasx_xvslli_h ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V8SI, V8SI, UQI.  */
+#define __lasx_xvslli_w(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslli_w ((v8i32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui6.  */
+/* Data types in instruction templates: V4DI, V4DI, UQI.  */
+#define __lasx_xvslli_d(/*__m256i*/ _1, /*ui6*/ _2) \
+  ((__m256i)__builtin_lasx_xvslli_d ((v4i64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsra_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsra_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsra_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsra_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsra_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsra_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsra_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsra_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui3.  */
+/* Data types in instruction templates: V32QI, V32QI, UQI.  */
+#define __lasx_xvsrai_b(/*__m256i*/ _1, /*ui3*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrai_b ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui4.  */
+/* Data types in instruction templates: V16HI, V16HI, UQI.  */
+#define __lasx_xvsrai_h(/*__m256i*/ _1, /*ui4*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrai_h ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V8SI, V8SI, UQI.  */
+#define __lasx_xvsrai_w(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrai_w ((v8i32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui6.  */
+/* Data types in instruction templates: V4DI, V4DI, UQI.  */
+#define __lasx_xvsrai_d(/*__m256i*/ _1, /*ui6*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrai_d ((v4i64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrar_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrar_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrar_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrar_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrar_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrar_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrar_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrar_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui3.  */
+/* Data types in instruction templates: V32QI, V32QI, UQI.  */
+#define __lasx_xvsrari_b(/*__m256i*/ _1, /*ui3*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrari_b ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui4.  */
+/* Data types in instruction templates: V16HI, V16HI, UQI.  */
+#define __lasx_xvsrari_h(/*__m256i*/ _1, /*ui4*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrari_h ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V8SI, V8SI, UQI.  */
+#define __lasx_xvsrari_w(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrari_w ((v8i32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui6.  */
+/* Data types in instruction templates: V4DI, V4DI, UQI.  */
+#define __lasx_xvsrari_d(/*__m256i*/ _1, /*ui6*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrari_d ((v4i64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrl_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrl_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrl_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrl_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrl_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrl_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrl_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrl_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui3.  */
+/* Data types in instruction templates: V32QI, V32QI, UQI.  */
+#define __lasx_xvsrli_b(/*__m256i*/ _1, /*ui3*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrli_b ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui4.  */
+/* Data types in instruction templates: V16HI, V16HI, UQI.  */
+#define __lasx_xvsrli_h(/*__m256i*/ _1, /*ui4*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrli_h ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V8SI, V8SI, UQI.  */
+#define __lasx_xvsrli_w(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrli_w ((v8i32)(_1), (_2)))
+
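The xvsra*/xvsrl* families above differ in the usual arithmetic/logical way,
and the *r* variants (xvsrar/xvsrlr) round instead of truncating. A tiny
user-side sketch with illustrative values:

#include <lasxintrin.h>

__m256i
shift_right_demo (void)
{
  __m256i v = __lasx_xvrepli_w (-2);   /* every 32-bit lane holds -2 */
  __m256i a = __lasx_xvsrai_w (v, 1);  /* arithmetic: each lane becomes -1 */
  __m256i l = __lasx_xvsrli_w (v, 1);  /* logical: each lane becomes 0x7fffffff */
  return __lasx_xvsub_w (a, l);
}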
+/* Assembly instruction format: xd, xj, ui6.  */
+/* Data types in instruction templates: V4DI, V4DI, UQI.  */
+#define __lasx_xvsrli_d(/*__m256i*/ _1, /*ui6*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrli_d ((v4i64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrlr_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrlr_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrlr_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrlr_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrlr_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrlr_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsrlr_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsrlr_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui3.  */
+/* Data types in instruction templates: V32QI, V32QI, UQI.  */
+#define __lasx_xvsrlri_b(/*__m256i*/ _1, /*ui3*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrlri_b ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui4.  */
+/* Data types in instruction templates: V16HI, V16HI, UQI.  */
+#define __lasx_xvsrlri_h(/*__m256i*/ _1, /*ui4*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrlri_h ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V8SI, V8SI, UQI.  */
+#define __lasx_xvsrlri_w(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrlri_w ((v8i32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui6.  */
+/* Data types in instruction templates: V4DI, V4DI, UQI.  */
+#define __lasx_xvsrlri_d(/*__m256i*/ _1, /*ui6*/ _2) \
+  ((__m256i)__builtin_lasx_xvsrlri_d ((v4i64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV32QI, UV32QI, UV32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitclr_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitclr_b ((v32u8)_1, (v32u8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV16HI, UV16HI, UV16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitclr_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitclr_h ((v16u16)_1, (v16u16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV8SI, UV8SI, UV8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitclr_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitclr_w ((v8u32)_1, (v8u32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV4DI, UV4DI, UV4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitclr_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitclr_d ((v4u64)_1, (v4u64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui3.  */
+/* Data types in instruction templates: UV32QI, UV32QI, UQI.  */
+#define __lasx_xvbitclri_b(/*__m256i*/ _1, /*ui3*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitclri_b ((v32u8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui4.  */
+/* Data types in instruction templates: UV16HI, UV16HI, UQI.  */
+#define __lasx_xvbitclri_h(/*__m256i*/ _1, /*ui4*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitclri_h ((v16u16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV8SI, UV8SI, UQI.  */
+#define __lasx_xvbitclri_w(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitclri_w ((v8u32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui6.  */
+/* Data types in instruction templates: UV4DI, UV4DI, UQI.  */
+#define __lasx_xvbitclri_d(/*__m256i*/ _1, /*ui6*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitclri_d ((v4u64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV32QI, UV32QI, UV32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitset_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitset_b ((v32u8)_1, (v32u8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV16HI, UV16HI, UV16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitset_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitset_h ((v16u16)_1, (v16u16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV8SI, UV8SI, UV8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitset_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitset_w ((v8u32)_1, (v8u32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV4DI, UV4DI, UV4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitset_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitset_d ((v4u64)_1, (v4u64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui3.  */
+/* Data types in instruction templates: UV32QI, UV32QI, UQI.  */
+#define __lasx_xvbitseti_b(/*__m256i*/ _1, /*ui3*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitseti_b ((v32u8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui4.  */
+/* Data types in instruction templates: UV16HI, UV16HI, UQI.  */
+#define __lasx_xvbitseti_h(/*__m256i*/ _1, /*ui4*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitseti_h ((v16u16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV8SI, UV8SI, UQI.  */
+#define __lasx_xvbitseti_w(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitseti_w ((v8u32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui6.  */
+/* Data types in instruction templates: UV4DI, UV4DI, UQI.  */
+#define __lasx_xvbitseti_d(/*__m256i*/ _1, /*ui6*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitseti_d ((v4u64)(_1), (_2)))
+
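The bitclr/bitset/bitrev group operates on one bit position per element. A
user-side sketch of the immediate forms (illustrative only):

#include <lasxintrin.h>

__m256i
bit_fiddle (void)
{
  __m256i m = __lasx_xvrepli_b (0);  /* all bits clear */
  m = __lasx_xvbitseti_b (m, 0);     /* set bit 0 of every byte: 0x01 */
  m = __lasx_xvbitrevi_b (m, 1);     /* toggle bit 1 of every byte: 0x03 */
  m = __lasx_xvbitclri_b (m, 0);     /* clear bit 0 again: 0x02 */
  return m;
}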
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV32QI, UV32QI, UV32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitrev_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitrev_b ((v32u8)_1, (v32u8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV16HI, UV16HI, UV16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitrev_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitrev_h ((v16u16)_1, (v16u16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV8SI, UV8SI, UV8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitrev_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitrev_w ((v8u32)_1, (v8u32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV4DI, UV4DI, UV4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvbitrev_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvbitrev_d ((v4u64)_1, (v4u64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui3.  */
+/* Data types in instruction templates: UV32QI, UV32QI, UQI.  */
+#define __lasx_xvbitrevi_b(/*__m256i*/ _1, /*ui3*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitrevi_b ((v32u8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui4.  */
+/* Data types in instruction templates: UV16HI, UV16HI, UQI.  */
+#define __lasx_xvbitrevi_h(/*__m256i*/ _1, /*ui4*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitrevi_h ((v16u16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV8SI, UV8SI, UQI.  */
+#define __lasx_xvbitrevi_w(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitrevi_w ((v8u32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui6.  */
+/* Data types in instruction templates: UV4DI, UV4DI, UQI.  */
+#define __lasx_xvbitrevi_d(/*__m256i*/ _1, /*ui6*/ _2) \
+  ((__m256i)__builtin_lasx_xvbitrevi_d ((v4u64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvadd_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvadd_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvadd_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvadd_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvadd_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvadd_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvadd_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvadd_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V32QI, V32QI, UQI.  */
+#define __lasx_xvaddi_bu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvaddi_bu ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V16HI, V16HI, UQI.  */
+#define __lasx_xvaddi_hu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvaddi_hu ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V8SI, V8SI, UQI.  */
+#define __lasx_xvaddi_wu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvaddi_wu ((v8i32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V4DI, V4DI, UQI.  */
+#define __lasx_xvaddi_du(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvaddi_du ((v4i64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsub_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsub_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsub_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsub_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsub_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsub_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsub_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsub_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V32QI, V32QI, UQI.  */
+#define __lasx_xvsubi_bu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvsubi_bu ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V16HI, V16HI, UQI.  */
+#define __lasx_xvsubi_hu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvsubi_hu ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V8SI, V8SI, UQI.  */
+#define __lasx_xvsubi_wu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvsubi_wu ((v8i32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V4DI, V4DI, UQI.  */
+#define __lasx_xvsubi_du(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvsubi_du ((v4i64)(_1), (_2)))
+
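In the xvaddi_bu/xvsubi_du names above, the trailing "u" describes the
unsigned 5-bit immediate, not saturation; the lane arithmetic itself is
plain modular add/subtract (as I read the templates). A user-side sketch:

#include <lasxintrin.h>

__m256i
inc_dec (__m256i v)
{
  __m256i up   = __lasx_xvaddi_bu (v, 1);  /* add 1 to every byte, wrapping */
  __m256i down = __lasx_xvsubi_bu (up, 1); /* back to the original value */
  return down;
}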
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmax_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmax_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmax_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmax_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmax_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmax_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmax_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmax_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V32QI, V32QI, QI.  */
+#define __lasx_xvmaxi_b(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmaxi_b ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V16HI, V16HI, QI.  */
+#define __lasx_xvmaxi_h(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmaxi_h ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V8SI, V8SI, QI.  */
+#define __lasx_xvmaxi_w(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmaxi_w ((v8i32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V4DI, V4DI, QI.  */
+#define __lasx_xvmaxi_d(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmaxi_d ((v4i64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV32QI, UV32QI, UV32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmax_bu (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmax_bu ((v32u8)_1, (v32u8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV16HI, UV16HI, UV16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmax_hu (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmax_hu ((v16u16)_1, (v16u16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV8SI, UV8SI, UV8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmax_wu (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmax_wu ((v8u32)_1, (v8u32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV4DI, UV4DI, UV4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmax_du (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmax_du ((v4u64)_1, (v4u64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV32QI, UV32QI, UQI.  */
+#define __lasx_xvmaxi_bu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmaxi_bu ((v32u8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV16HI, UV16HI, UQI.  */
+#define __lasx_xvmaxi_hu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmaxi_hu ((v16u16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV8SI, UV8SI, UQI.  */
+#define __lasx_xvmaxi_wu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmaxi_wu ((v8u32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV4DI, UV4DI, UQI.  */
+#define __lasx_xvmaxi_du(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmaxi_du ((v4u64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmin_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmin_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmin_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmin_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmin_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmin_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmin_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmin_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V32QI, V32QI, QI.  */
+#define __lasx_xvmini_b(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmini_b ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V16HI, V16HI, QI.  */
+#define __lasx_xvmini_h(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmini_h ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V8SI, V8SI, QI.  */
+#define __lasx_xvmini_w(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmini_w ((v8i32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V4DI, V4DI, QI.  */
+#define __lasx_xvmini_d(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmini_d ((v4i64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV32QI, UV32QI, UV32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmin_bu (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmin_bu ((v32u8)_1, (v32u8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV16HI, UV16HI, UV16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmin_hu (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmin_hu ((v16u16)_1, (v16u16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV8SI, UV8SI, UV8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmin_wu (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmin_wu ((v8u32)_1, (v8u32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: UV4DI, UV4DI, UV4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvmin_du (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvmin_du ((v4u64)_1, (v4u64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV32QI, UV32QI, UQI.  */
+#define __lasx_xvmini_bu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmini_bu ((v32u8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV16HI, UV16HI, UQI.  */
+#define __lasx_xvmini_hu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmini_hu ((v16u16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV8SI, UV8SI, UQI.  */
+#define __lasx_xvmini_wu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmini_wu ((v8u32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: UV4DI, UV4DI, UQI.  */
+#define __lasx_xvmini_du(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvmini_du ((v4u64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvseq_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvseq_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvseq_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvseq_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvseq_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvseq_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvseq_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvseq_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V32QI, V32QI, QI.  */
+#define __lasx_xvseqi_b(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvseqi_b ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V16HI, V16HI, QI.  */
+#define __lasx_xvseqi_h(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvseqi_h ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V8SI, V8SI, QI.  */
+#define __lasx_xvseqi_w(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvseqi_w ((v8i32)(_1), (_2)))
+
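Like their LSX counterparts, the compare intrinsics here return element
masks rather than booleans: all-ones (-1) in lanes where the predicate
holds, zero elsewhere. A user-side sketch (xvand_v is declared elsewhere in
this header; it appears in the ChangeLog above):

#include <lasxintrin.h>

__m256i
equal_and_zero (__m256i a, __m256i b)
{
  __m256i eq = __lasx_xvseq_b (a, b);  /* 0xff where bytes of a and b match */
  __m256i z  = __lasx_xvseqi_b (a, 0); /* 0xff where a's byte is zero (si5 imm) */
  return __lasx_xvand_v (eq, z);       /* both conditions at once */
}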
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V4DI, V4DI, QI.  */
+#define __lasx_xvseqi_d(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvseqi_d ((v4i64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvslt_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvslt_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvslt_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvslt_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvslt_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvslt_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvslt_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvslt_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V32QI, V32QI, QI.  */
+#define __lasx_xvslti_b(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslti_b ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V16HI, V16HI, QI.  */
+#define __lasx_xvslti_h(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslti_h ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V8SI, V8SI, QI.  */
+#define __lasx_xvslti_w(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslti_w ((v8i32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V4DI, V4DI, QI.  */
+#define __lasx_xvslti_d(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslti_d ((v4i64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, UV32QI, UV32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvslt_bu (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvslt_bu ((v32u8)_1, (v32u8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, UV16HI, UV16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvslt_hu (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvslt_hu ((v16u16)_1, (v16u16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, UV8SI, UV8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvslt_wu (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvslt_wu ((v8u32)_1, (v8u32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, UV4DI, UV4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvslt_du (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvslt_du ((v4u64)_1, (v4u64)_2);
+}
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V32QI, UV32QI, UQI.  */
+#define __lasx_xvslti_bu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslti_bu ((v32u8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V16HI, UV16HI, UQI.  */
+#define __lasx_xvslti_hu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslti_hu ((v16u16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V8SI, UV8SI, UQI.  */
+#define __lasx_xvslti_wu(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslti_wu ((v8u32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, ui5.  */
+/* Data types in instruction templates: V4DI, UV4DI, UQI.  */
+#define __lasx_xvslti_du(/*__m256i*/ _1, /*ui5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslti_du ((v4u64)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V32QI, V32QI, V32QI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsle_b (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsle_b ((v32i8)_1, (v32i8)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V16HI, V16HI, V16HI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsle_h (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsle_h ((v16i16)_1, (v16i16)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V8SI, V8SI, V8SI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsle_w (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsle_w ((v8i32)_1, (v8i32)_2);
+}
+
+/* Assembly instruction format: xd, xj, xk.  */
+/* Data types in instruction templates: V4DI, V4DI, V4DI.  */
+extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__))
+__m256i __lasx_xvsle_d (__m256i _1, __m256i _2)
+{
+  return (__m256i)__builtin_lasx_xvsle_d ((v4i64)_1, (v4i64)_2);
+}
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V32QI, V32QI, QI.  */
+#define __lasx_xvslei_b(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslei_b ((v32i8)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V16HI, V16HI, QI.  */
+#define __lasx_xvslei_h(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslei_h ((v16i16)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V8SI, V8SI, QI.  */
+#define __lasx_xvslei_w(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslei_w ((v8i32)(_1), (_2)))
+
+/* Assembly instruction format: xd, xj, si5.  */
+/* Data types in instruction templates: V4DI, V4DI, QI.  */
+#define __lasx_xvslei_d(/*__m256i*/ _1, /*si5*/ _2) \
+  ((__m256i)__builtin_lasx_xvslei_d ((v4i64)(_1), (_2)))
+
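The _bu/_hu/_wu/_du compares differ from the signed forms only in ordering
the operands as unsigned; the result is still an all-ones/zero mask, which
combines naturally with __lasx_xvbitsel_v (listed in the ChangeLog above and
declared further down in this header). A hedged sketch; I am reading
xvbitsel_v as taking bits from its second operand where the mask is set:

#include <lasxintrin.h>

__m256i
min_bytes (__m256i a, __m256i b)
{
  __m256i lt = __lasx_xvslt_b (a, b);  /* 0xff where a < b (signed) */
  return __lasx_xvbitsel_v (b, a, lt); /* pick a where a < b, else b */
}

(The same result is available directly as __lasx_xvmin_b; the point here is
only the mask plumbing.)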
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsle_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsle_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsle_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsle_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsle_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsle_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsle_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsle_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V32QI, UV32QI, UQI. */ +#define __lasx_xvslei_bu(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvslei_bu ((v32u8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V16HI, UV16HI, UQI. */ +#define __lasx_xvslei_hu(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvslei_hu ((v16u16)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V8SI, UV8SI, UQI. */ +#define __lasx_xvslei_wu(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvslei_wu ((v8u32)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V4DI, UV4DI, UQI. */ +#define __lasx_xvslei_du(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvslei_du ((v4u64)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui3. */ +/* Data types in instruction templates: V32QI, V32QI, UQI. */ +#define __lasx_xvsat_b(/*__m256i*/ _1, /*ui3*/ _2) \ + ((__m256i)__builtin_lasx_xvsat_b ((v32i8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V16HI, V16HI, UQI. */ +#define __lasx_xvsat_h(/*__m256i*/ _1, /*ui4*/ _2) \ + ((__m256i)__builtin_lasx_xvsat_h ((v16i16)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V8SI, V8SI, UQI. */ +#define __lasx_xvsat_w(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvsat_w ((v8i32)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: V4DI, V4DI, UQI. */ +#define __lasx_xvsat_d(/*__m256i*/ _1, /*ui6*/ _2) \ + ((__m256i)__builtin_lasx_xvsat_d ((v4i64)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui3. */ +/* Data types in instruction templates: UV32QI, UV32QI, UQI. */ +#define __lasx_xvsat_bu(/*__m256i*/ _1, /*ui3*/ _2) \ + ((__m256i)__builtin_lasx_xvsat_bu ((v32u8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: UV16HI, UV16HI, UQI. */ +#define __lasx_xvsat_hu(/*__m256i*/ _1, /*ui4*/ _2) \ + ((__m256i)__builtin_lasx_xvsat_hu ((v16u16)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: UV8SI, UV8SI, UQI. 
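[Illustration, not part of the patch: on my reading of the sat semantics, the xvsat family above clamps each element to an (imm+1)-bit range, so ui4 == 7 clamps halfwords to the int8 range. Name invented, -mlasx assumed.]

  #include <lasxintrin.h>

  /* Clamp each signed halfword to [-128, 127], a common step before
     narrowing 16-bit intermediates back down to bytes.  */
  __m256i
  clamp_to_i8_range (__m256i v)
  {
    return __lasx_xvsat_h (v, 7);
  }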
*/ +#define __lasx_xvsat_wu(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvsat_wu ((v8u32)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: UV4DI, UV4DI, UQI. */ +#define __lasx_xvsat_du(/*__m256i*/ _1, /*ui6*/ _2) \ + ((__m256i)__builtin_lasx_xvsat_du ((v4u64)(_1), (_2))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvadda_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvadda_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvadda_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvadda_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvadda_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvadda_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvadda_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvadda_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsadd_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsadd_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsadd_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsadd_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsadd_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsadd_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsadd_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsadd_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsadd_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsadd_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV16HI, UV16HI. 
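[Illustration, not part of the patch: saturating add versus absolute add, sketched under the same assumptions as above (invented names, -mlasx).]

  #include <lasxintrin.h>

  /* Saturating signed byte add: 120 + 20 yields 127 instead of the
     wraparound -116 that a plain xvadd.b would produce.  */
  __m256i
  sadd_bytes (__m256i a, __m256i b)
  {
    return __lasx_xvsadd_b (a, b);
  }

  /* xvadda.b appears to compute |a| + |b| per byte.  */
  __m256i
  absolute_add (__m256i a, __m256i b)
  {
    return __lasx_xvadda_b (a, b);
  }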
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsadd_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsadd_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsadd_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsadd_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsadd_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsadd_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavg_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavg_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavg_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavg_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavg_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavg_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavg_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavg_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavg_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavg_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavg_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavg_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavg_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavg_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavg_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavg_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavgr_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavgr_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavgr_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavgr_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavgr_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavgr_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavgr_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavgr_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavgr_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavgr_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavgr_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavgr_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavgr_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavgr_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvavgr_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvavgr_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssub_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssub_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssub_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssub_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssub_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssub_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. 
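[Illustration, not part of the patch: the usual distinction between the two averaging families is that xvavg truncates, (a + b) >> 1, while xvavgr adds a rounding bit first, (a + b + 1) >> 1. Unsigned byte forms shown; names invented, -mlasx assumed.]

  #include <lasxintrin.h>

  __m256i
  avg_floor (__m256i a, __m256i b)
  {
    return __lasx_xvavg_bu (a, b);    /* truncating average */
  }

  __m256i
  avg_round (__m256i a, __m256i b)
  {
    return __lasx_xvavgr_bu (a, b);   /* average with rounding bit */
  }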
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssub_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssub_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssub_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssub_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssub_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssub_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssub_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssub_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssub_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssub_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvabsd_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvabsd_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvabsd_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvabsd_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvabsd_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvabsd_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvabsd_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvabsd_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvabsd_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvabsd_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvabsd_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvabsd_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV8SI, UV8SI. 
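[Illustration, not part of the patch: unsigned saturating subtract clamps at zero and xvabsd gives |a - b|, the classic building blocks of thresholding and SAD-style kernels. Names invented, -mlasx assumed.]

  #include <lasxintrin.h>

  __m256i
  diff_or_zero (__m256i a, __m256i b)
  {
    return __lasx_xvssub_bu (a, b);   /* max (a - b, 0) per byte */
  }

  __m256i
  abs_diff (__m256i a, __m256i b)
  {
    return __lasx_xvabsd_bu (a, b);   /* |a - b| per byte */
  }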
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvabsd_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvabsd_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvabsd_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvabsd_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmul_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmul_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmul_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmul_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmul_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmul_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmul_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmul_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmadd_b (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmadd_b ((v32i8)_1, (v32i8)_2, (v32i8)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmadd_h (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmadd_h ((v16i16)_1, (v16i16)_2, (v16i16)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmadd_w (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmadd_w ((v8i32)_1, (v8i32)_2, (v8i32)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmadd_d (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmadd_d ((v4i64)_1, (v4i64)_2, (v4i64)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, V32QI. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmsub_b (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmsub_b ((v32i8)_1, (v32i8)_2, (v32i8)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmsub_h (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmsub_h ((v16i16)_1, (v16i16)_2, (v16i16)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmsub_w (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmsub_w ((v8i32)_1, (v8i32)_2, (v8i32)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmsub_d (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmsub_d ((v4i64)_1, (v4i64)_2, (v4i64)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvdiv_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvdiv_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvdiv_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvdiv_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvdiv_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvdiv_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvdiv_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvdiv_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvdiv_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvdiv_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvdiv_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvdiv_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV8SI, UV8SI. 
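[Illustration, not part of the patch: judging by the four-operand templates above, the ternary forms take the accumulator first, i.e. _1 +/- _2 * _3 per element, with the product truncated to the element width. Names invented, -mlasx assumed.]

  #include <lasxintrin.h>

  __m256i
  mac_step (__m256i acc, __m256i a, __m256i b)
  {
    return __lasx_xvmadd_w (acc, a, b);   /* acc + a * b */
  }

  __m256i
  nmac_step (__m256i acc, __m256i a, __m256i b)
  {
    return __lasx_xvmsub_w (acc, a, b);   /* acc - a * b */
  }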
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvdiv_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvdiv_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvdiv_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvdiv_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhaddw_h_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhaddw_h_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhaddw_w_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhaddw_w_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhaddw_d_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhaddw_d_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhaddw_hu_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhaddw_hu_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhaddw_wu_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhaddw_wu_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhaddw_du_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhaddw_du_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhsubw_h_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhsubw_h_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhsubw_w_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhsubw_w_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhsubw_d_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhsubw_d_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. 
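[Illustration, not part of the patch: on my reading of vhaddw, the widening horizontal forms pair an odd-position element of the first source with the even-position element of the second, extending before the add, so passing the same vector twice reduces adjacent pairs. Name invented, -mlasx assumed.]

  #include <lasxintrin.h>

  /* Sum adjacent signed byte pairs of v into halfwords.  */
  __m256i
  pairwise_sums (__m256i v)
  {
    return __lasx_xvhaddw_h_b (v, v);
  }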
*/ +/* Data types in instruction templates: V16HI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhsubw_hu_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhsubw_hu_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhsubw_wu_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhsubw_wu_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhsubw_du_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhsubw_du_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmod_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmod_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmod_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmod_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmod_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmod_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmod_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmod_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmod_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmod_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmod_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmod_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmod_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmod_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI. 
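[Illustration, not part of the patch: element-wise remainder to pair with xvdiv; presumably the sign convention matches C's % operator. Name invented, -mlasx assumed.]

  #include <lasxintrin.h>

  __m256i
  rem_words (__m256i a, __m256i b)
  {
    return __lasx_xvmod_w (a, b);   /* a % b per 32-bit element */
  }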
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmod_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmod_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V32QI, V32QI, UQI. */ +#define __lasx_xvrepl128vei_b(/*__m256i*/ _1, /*ui4*/ _2) \ + ((__m256i)__builtin_lasx_xvrepl128vei_b ((v32i8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui3. */ +/* Data types in instruction templates: V16HI, V16HI, UQI. */ +#define __lasx_xvrepl128vei_h(/*__m256i*/ _1, /*ui3*/ _2) \ + ((__m256i)__builtin_lasx_xvrepl128vei_h ((v16i16)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui2. */ +/* Data types in instruction templates: V8SI, V8SI, UQI. */ +#define __lasx_xvrepl128vei_w(/*__m256i*/ _1, /*ui2*/ _2) \ + ((__m256i)__builtin_lasx_xvrepl128vei_w ((v8i32)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui1. */ +/* Data types in instruction templates: V4DI, V4DI, UQI. */ +#define __lasx_xvrepl128vei_d(/*__m256i*/ _1, /*ui1*/ _2) \ + ((__m256i)__builtin_lasx_xvrepl128vei_d ((v4i64)(_1), (_2))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpickev_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpickev_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpickev_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpickev_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpickev_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpickev_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpickev_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpickev_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpickod_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpickod_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpickod_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpickod_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpickod_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpickod_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpickod_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpickod_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvilvh_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvilvh_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvilvh_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvilvh_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvilvh_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvilvh_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvilvh_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvilvh_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvilvl_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvilvl_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvilvl_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvilvl_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvilvl_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvilvl_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvilvl_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvilvl_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpackev_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpackev_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpackev_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpackev_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. 
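[Illustration, not part of the patch: one caveat worth keeping in mind with the interleave/pick/pack family above is that, like most 256-bit LASX shuffles, they appear to operate on the two 128-bit halves independently, so the result is not a single straight-through 256-bit zip. Names invented, -mlasx assumed.]

  #include <lasxintrin.h>

  __m256i
  zip_low (__m256i a, __m256i b)
  {
    return __lasx_xvilvl_b (a, b);     /* interleave low bytes, per half */
  }

  __m256i
  take_evens (__m256i a, __m256i b)
  {
    return __lasx_xvpickev_b (a, b);   /* even-indexed bytes of both */
  }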
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpackev_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpackev_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpackev_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpackev_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpackod_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpackod_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpackod_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpackod_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpackod_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpackod_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpackod_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvpackod_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk, xa. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvshuf_b (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvshuf_b ((v32i8)_1, (v32i8)_2, (v32i8)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvshuf_h (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvshuf_h ((v16i16)_1, (v16i16)_2, (v16i16)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvshuf_w (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvshuf_w ((v8i32)_1, (v8i32)_2, (v8i32)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvshuf_d (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvshuf_d ((v4i64)_1, (v4i64)_2, (v4i64)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. 
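[Illustration, not part of the patch: xvshuf.b is the variable byte shuffle; on my reading, the third operand supplies per-byte indices selecting from the two sources, again handled per 128-bit half. The sketch builds a uniform index vector, so every result byte copies one source byte. Names invented, -mlasx assumed.]

  #include <lasxintrin.h>

  __m256i
  splat_selected_byte (__m256i hi, __m256i lo, int idx)
  {
    __m256i sel = __lasx_xvreplgr2vr_b (idx);   /* idx in every byte */
    return __lasx_xvshuf_b (hi, lo, sel);
  }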
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvand_v (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvand_v ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: UV32QI, UV32QI, UQI. */ +#define __lasx_xvandi_b(/*__m256i*/ _1, /*ui8*/ _2) \ + ((__m256i)__builtin_lasx_xvandi_b ((v32u8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvor_v (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvor_v ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: UV32QI, UV32QI, UQI. */ +#define __lasx_xvori_b(/*__m256i*/ _1, /*ui8*/ _2) \ + ((__m256i)__builtin_lasx_xvori_b ((v32u8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvnor_v (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvnor_v ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: UV32QI, UV32QI, UQI. */ +#define __lasx_xvnori_b(/*__m256i*/ _1, /*ui8*/ _2) \ + ((__m256i)__builtin_lasx_xvnori_b ((v32u8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvxor_v (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvxor_v ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: UV32QI, UV32QI, UQI. */ +#define __lasx_xvxori_b(/*__m256i*/ _1, /*ui8*/ _2) \ + ((__m256i)__builtin_lasx_xvxori_b ((v32u8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, xk, xa. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvbitsel_v (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvbitsel_v ((v32u8)_1, (v32u8)_2, (v32u8)_3); +} + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI, USI. */ +#define __lasx_xvbitseli_b(/*__m256i*/ _1, /*__m256i*/ _2, /*ui8*/ _3) \ + ((__m256i)__builtin_lasx_xvbitseli_b ((v32u8)(_1), (v32u8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V32QI, V32QI, USI. */ +#define __lasx_xvshuf4i_b(/*__m256i*/ _1, /*ui8*/ _2) \ + ((__m256i)__builtin_lasx_xvshuf4i_b ((v32i8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V16HI, V16HI, USI. */ +#define __lasx_xvshuf4i_h(/*__m256i*/ _1, /*ui8*/ _2) \ + ((__m256i)__builtin_lasx_xvshuf4i_h ((v16i16)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V8SI, V8SI, USI. */ +#define __lasx_xvshuf4i_w(/*__m256i*/ _1, /*ui8*/ _2) \ + ((__m256i)__builtin_lasx_xvshuf4i_w ((v8i32)(_1), (_2))) + +/* Assembly instruction format: xd, rj. */ +/* Data types in instruction templates: V32QI, SI. 
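[Illustration, not part of the patch: a typical compare-and-blend. On my reading of vbitsel, the result takes bits of _1 where the mask bit is 0 and bits of _2 where it is 1, so the sketch below computes an element-wise signed minimum. Name invented, -mlasx assumed.]

  #include <lasxintrin.h>

  __m256i
  min_words (__m256i a, __m256i b)
  {
    __m256i a_lt_b = __lasx_xvslt_w (a, b);    /* all-ones where a < b */
    return __lasx_xvbitsel_v (b, a, a_lt_b);   /* a where a < b, else b */
  }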
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplgr2vr_b (int _1) +{ + return (__m256i)__builtin_lasx_xvreplgr2vr_b ((int)_1); +} + +/* Assembly instruction format: xd, rj. */ +/* Data types in instruction templates: V16HI, SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplgr2vr_h (int _1) +{ + return (__m256i)__builtin_lasx_xvreplgr2vr_h ((int)_1); +} + +/* Assembly instruction format: xd, rj. */ +/* Data types in instruction templates: V8SI, SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplgr2vr_w (int _1) +{ + return (__m256i)__builtin_lasx_xvreplgr2vr_w ((int)_1); +} + +/* Assembly instruction format: xd, rj. */ +/* Data types in instruction templates: V4DI, DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplgr2vr_d (long int _1) +{ + return (__m256i)__builtin_lasx_xvreplgr2vr_d ((long int)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpcnt_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvpcnt_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpcnt_h (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvpcnt_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpcnt_w (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvpcnt_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvpcnt_d (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvpcnt_d ((v4i64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvclo_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvclo_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvclo_h (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvclo_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvclo_w (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvclo_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvclo_d (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvclo_d ((v4i64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V32QI, V32QI. 
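[Illustration, not part of the patch: broadcast a scalar and take per-element population counts; the xvclo/xvclz builtins provide the matching leading-ones/leading-zeros counts. Name invented, -mlasx assumed.]

  #include <lasxintrin.h>

  __m256i
  popcounts_of (int x)
  {
    __m256i v = __lasx_xvreplgr2vr_w (x);   /* x in all 8 elements */
    return __lasx_xvpcnt_w (v);             /* per-element popcount */
  }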
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvclz_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvclz_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvclz_h (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvclz_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvclz_w (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvclz_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvclz_d (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvclz_d ((v4i64)_1); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfadd_s (__m256 _1, __m256 _2) +{ + return (__m256)__builtin_lasx_xvfadd_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfadd_d (__m256d _1, __m256d _2) +{ + return (__m256d)__builtin_lasx_xvfadd_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfsub_s (__m256 _1, __m256 _2) +{ + return (__m256)__builtin_lasx_xvfsub_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfsub_d (__m256d _1, __m256d _2) +{ + return (__m256d)__builtin_lasx_xvfsub_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfmul_s (__m256 _1, __m256 _2) +{ + return (__m256)__builtin_lasx_xvfmul_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfmul_d (__m256d _1, __m256d _2) +{ + return (__m256d)__builtin_lasx_xvfmul_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfdiv_s (__m256 _1, __m256 _2) +{ + return (__m256)__builtin_lasx_xvfdiv_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfdiv_d (__m256d _1, __m256d _2) +{ + return (__m256d)__builtin_lasx_xvfdiv_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcvt_h_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcvt_h_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfcvt_s_d (__m256d _1, __m256d _2) +{ + return (__m256)__builtin_lasx_xvfcvt_s_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfmin_s (__m256 _1, __m256 _2) +{ + return (__m256)__builtin_lasx_xvfmin_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfmin_d (__m256d _1, __m256d _2) +{ + return (__m256d)__builtin_lasx_xvfmin_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfmina_s (__m256 _1, __m256 _2) +{ + return (__m256)__builtin_lasx_xvfmina_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfmina_d (__m256d _1, __m256d _2) +{ + return (__m256d)__builtin_lasx_xvfmina_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfmax_s (__m256 _1, __m256 _2) +{ + return (__m256)__builtin_lasx_xvfmax_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfmax_d (__m256d _1, __m256d _2) +{ + return (__m256d)__builtin_lasx_xvfmax_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfmaxa_s (__m256 _1, __m256 _2) +{ + return (__m256)__builtin_lasx_xvfmaxa_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfmaxa_d (__m256d _1, __m256d _2) +{ + return (__m256d)__builtin_lasx_xvfmaxa_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SF. 
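[Illustration, not part of the patch: straightforward 8 x float arithmetic; fmin/fmax compare by value, while the fmina/fmaxa variants appear to compare by magnitude. Names invented, -mlasx assumed.]

  #include <lasxintrin.h>

  /* Clamp x * x into [lo, hi], element-wise.  */
  __m256
  squared_clamped (__m256 x, __m256 lo, __m256 hi)
  {
    __m256 y = __lasx_xvfmul_s (x, x);
    y = __lasx_xvfmax_s (y, lo);
    return __lasx_xvfmin_s (y, hi);
  }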
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfclass_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvfclass_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfclass_d (__m256d _1) +{ + return (__m256i)__builtin_lasx_xvfclass_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfsqrt_s (__m256 _1) +{ + return (__m256)__builtin_lasx_xvfsqrt_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfsqrt_d (__m256d _1) +{ + return (__m256d)__builtin_lasx_xvfsqrt_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfrecip_s (__m256 _1) +{ + return (__m256)__builtin_lasx_xvfrecip_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfrecip_d (__m256d _1) +{ + return (__m256d)__builtin_lasx_xvfrecip_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfrint_s (__m256 _1) +{ + return (__m256)__builtin_lasx_xvfrint_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfrint_d (__m256d _1) +{ + return (__m256d)__builtin_lasx_xvfrint_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfrsqrt_s (__m256 _1) +{ + return (__m256)__builtin_lasx_xvfrsqrt_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfrsqrt_d (__m256d _1) +{ + return (__m256d)__builtin_lasx_xvfrsqrt_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvflogb_s (__m256 _1) +{ + return (__m256)__builtin_lasx_xvflogb_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvflogb_d (__m256d _1) +{ + return (__m256d)__builtin_lasx_xvflogb_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SF, V16HI. 
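[Illustration, not part of the patch: the unary float helpers compose naturally; xvfrint presumably rounds to an integral value in the current rounding mode, by analogy with the scalar frint instructions. Name invented, -mlasx assumed.]

  #include <lasxintrin.h>

  /* 1 / sqrt(x), composed from the two builtins (xvfrsqrt computes the
     same quantity directly).  */
  __m256d
  inv_sqrt (__m256d x)
  {
    return __lasx_xvfrecip_d (__lasx_xvfsqrt_d (x));
  }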
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfcvth_s_h (__m256i _1) +{ + return (__m256)__builtin_lasx_xvfcvth_s_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfcvth_d_s (__m256 _1) +{ + return (__m256d)__builtin_lasx_xvfcvth_d_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SF, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfcvtl_s_h (__m256i _1) +{ + return (__m256)__builtin_lasx_xvfcvtl_s_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfcvtl_d_s (__m256 _1) +{ + return (__m256d)__builtin_lasx_xvfcvtl_d_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftint_w_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftint_w_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftint_l_d (__m256d _1) +{ + return (__m256i)__builtin_lasx_xvftint_l_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: UV8SI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftint_wu_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftint_wu_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: UV4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftint_lu_d (__m256d _1) +{ + return (__m256i)__builtin_lasx_xvftint_lu_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrz_w_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrz_w_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrz_l_d (__m256d _1) +{ + return (__m256i)__builtin_lasx_xvftintrz_l_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: UV8SI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrz_wu_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrz_wu_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: UV4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrz_lu_d (__m256d _1) +{ + return (__m256i)__builtin_lasx_xvftintrz_lu_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SF, V8SI. 
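+
+   Illustration (editor's sketch, hypothetical variable names): this
+   is the inverse direction of the xvftint* conversions above, so a
+   truncating round trip looks like:
+
+     __m256i i = __lasx_xvftintrz_w_s (f);   /* float -> int32, truncate */
+     __m256  g = __lasx_xvffint_s_w (i);     /* int32 -> float           */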
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvffint_s_w (__m256i _1) +{ + return (__m256)__builtin_lasx_xvffint_s_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvffint_d_l (__m256i _1) +{ + return (__m256d)__builtin_lasx_xvffint_d_l ((v4i64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SF, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvffint_s_wu (__m256i _1) +{ + return (__m256)__builtin_lasx_xvffint_s_wu ((v8u32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvffint_d_lu (__m256i _1) +{ + return (__m256d)__builtin_lasx_xvffint_d_lu ((v4u64)_1); +} + +/* Assembly instruction format: xd, xj, rk. */ +/* Data types in instruction templates: V32QI, V32QI, SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplve_b (__m256i _1, int _2) +{ + return (__m256i)__builtin_lasx_xvreplve_b ((v32i8)_1, (int)_2); +} + +/* Assembly instruction format: xd, xj, rk. */ +/* Data types in instruction templates: V16HI, V16HI, SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplve_h (__m256i _1, int _2) +{ + return (__m256i)__builtin_lasx_xvreplve_h ((v16i16)_1, (int)_2); +} + +/* Assembly instruction format: xd, xj, rk. */ +/* Data types in instruction templates: V8SI, V8SI, SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplve_w (__m256i _1, int _2) +{ + return (__m256i)__builtin_lasx_xvreplve_w ((v8i32)_1, (int)_2); +} + +/* Assembly instruction format: xd, xj, rk. */ +/* Data types in instruction templates: V4DI, V4DI, SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplve_d (__m256i _1, int _2) +{ + return (__m256i)__builtin_lasx_xvreplve_d ((v4i64)_1, (int)_2); +} + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, USI. */ +#define __lasx_xvpermi_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui8*/ _3) \ + ((__m256i)__builtin_lasx_xvpermi_w ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvandn_v (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvandn_v ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvneg_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvneg_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvneg_h (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvneg_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. 
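+   (Editor's example, illustrative only: the xvneg_* family computes
+   the element-wise two's-complement negation, e.g.
+
+     __m256i y = __lasx_xvneg_w (x);   /* each int32 element: 0 - x */
+
+   for a hypothetical vector x.)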
*/ +/* Data types in instruction templates: V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvneg_w (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvneg_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvneg_d (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvneg_d ((v4i64)_1); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmuh_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmuh_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmuh_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmuh_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmuh_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmuh_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmuh_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmuh_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmuh_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmuh_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmuh_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmuh_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmuh_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmuh_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmuh_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmuh_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, ui3. */ +/* Data types in instruction templates: V16HI, V32QI, UQI. */ +#define __lasx_xvsllwil_h_b(/*__m256i*/ _1, /*ui3*/ _2) \ + ((__m256i)__builtin_lasx_xvsllwil_h_b ((v32i8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V8SI, V16HI, UQI. 
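+
+   Editor's note (illustrative): the xvsllwil_* wrappers are macros
+   because the shift amount must be an integer constant expression;
+   each widens the selected int16 elements to int32 and then shifts
+   left by the immediate:
+
+     __m256i w = __lasx_xvsllwil_w_h (h, 3);   /* widen, then << 3 */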
*/ +#define __lasx_xvsllwil_w_h(/*__m256i*/ _1, /*ui4*/ _2) \ + ((__m256i)__builtin_lasx_xvsllwil_w_h ((v16i16)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V4DI, V8SI, UQI. */ +#define __lasx_xvsllwil_d_w(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvsllwil_d_w ((v8i32)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui3. */ +/* Data types in instruction templates: UV16HI, UV32QI, UQI. */ +#define __lasx_xvsllwil_hu_bu(/*__m256i*/ _1, /*ui3*/ _2) \ + ((__m256i)__builtin_lasx_xvsllwil_hu_bu ((v32u8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: UV8SI, UV16HI, UQI. */ +#define __lasx_xvsllwil_wu_hu(/*__m256i*/ _1, /*ui4*/ _2) \ + ((__m256i)__builtin_lasx_xvsllwil_wu_hu ((v16u16)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: UV4DI, UV8SI, UQI. */ +#define __lasx_xvsllwil_du_wu(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvsllwil_du_wu ((v8u32)(_1), (_2))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsran_b_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsran_b_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsran_h_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsran_h_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsran_w_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsran_w_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssran_b_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssran_b_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssran_h_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssran_h_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssran_w_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssran_w_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssran_bu_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssran_bu_h ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV8SI, UV8SI. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssran_hu_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssran_hu_w ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssran_wu_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssran_wu_d ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsrarn_b_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsrarn_b_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsrarn_h_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsrarn_h_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsrarn_w_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsrarn_w_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrarn_b_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrarn_b_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrarn_h_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrarn_h_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrarn_w_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrarn_w_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrarn_bu_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrarn_bu_h ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrarn_hu_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrarn_hu_w ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrarn_wu_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrarn_wu_d ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. 
*/ +/* Data types in instruction templates: V32QI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsrln_b_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsrln_b_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsrln_h_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsrln_h_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsrln_w_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsrln_w_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrln_bu_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrln_bu_h ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrln_hu_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrln_hu_w ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrln_wu_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrln_wu_d ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsrlrn_b_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsrlrn_b_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsrlrn_h_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsrlrn_h_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsrlrn_w_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsrlrn_w_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV32QI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrlrn_bu_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrlrn_bu_h ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV8SI, UV8SI. 
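+
+   Usage sketch (editor's illustration): per the LASX spec, the
+   second operand supplies per-element shift amounts; each 32-bit
+   element of a is shifted right logically, rounded, then
+   saturate-narrowed to an unsigned 16-bit result:
+
+     __m256i n = __lasx_xvssrlrn_hu_w (a, s);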
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrlrn_hu_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrlrn_hu_w ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrlrn_wu_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrlrn_wu_d ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, UQI. */ +#define __lasx_xvfrstpi_b(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvfrstpi_b ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, UQI. */ +#define __lasx_xvfrstpi_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvfrstpi_h ((v16i16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfrstp_b (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvfrstp_b ((v32i8)_1, (v32i8)_2, (v32i8)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfrstp_h (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvfrstp_h ((v16i16)_1, (v16i16)_2, (v16i16)_3); +} + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, USI. */ +#define __lasx_xvshuf4i_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui8*/ _3) \ + ((__m256i)__builtin_lasx_xvshuf4i_d ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V32QI, V32QI, UQI. */ +#define __lasx_xvbsrl_v(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvbsrl_v ((v32i8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V32QI, V32QI, UQI. */ +#define __lasx_xvbsll_v(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvbsll_v ((v32i8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, USI. */ +#define __lasx_xvextrins_b(/*__m256i*/ _1, /*__m256i*/ _2, /*ui8*/ _3) \ + ((__m256i)__builtin_lasx_xvextrins_b ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, USI. */ +#define __lasx_xvextrins_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui8*/ _3) \ + ((__m256i)__builtin_lasx_xvextrins_h ((v16i16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, USI. */ +#define __lasx_xvextrins_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui8*/ _3) \ + ((__m256i)__builtin_lasx_xvextrins_w ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, USI. 
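+
+   Editor's note (illustrative): like the other xvextrins_* wrappers,
+   the third argument encodes source/destination element indices and
+   must be an integer constant expression, e.g.
+
+     __m256i r = __lasx_xvextrins_d (dst, src, 0x01);
+
+   with dst and src as hypothetical vectors; a runtime value for the
+   immediate will not compile.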
*/ +#define __lasx_xvextrins_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui8*/ _3) \ + ((__m256i)__builtin_lasx_xvextrins_d ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmskltz_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvmskltz_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmskltz_h (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvmskltz_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmskltz_w (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvmskltz_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmskltz_d (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvmskltz_d ((v4i64)_1); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsigncov_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsigncov_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsigncov_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsigncov_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsigncov_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsigncov_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsigncov_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsigncov_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk, xa. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfmadd_s (__m256 _1, __m256 _2, __m256 _3) +{ + return (__m256)__builtin_lasx_xvfmadd_s ((v8f32)_1, (v8f32)_2, (v8f32)_3); +} + +/* Assembly instruction format: xd, xj, xk, xa. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfmadd_d (__m256d _1, __m256d _2, __m256d _3) +{ + return (__m256d)__builtin_lasx_xvfmadd_d ((v4f64)_1, (v4f64)_2, (v4f64)_3); +} + +/* Assembly instruction format: xd, xj, xk, xa. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF, V8SF. 
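+
+   Illustration (editor's sketch): the fused-multiply family computes
+   a single-rounding a*b +/- c, so for hypothetical inputs:
+
+     __m256 r1 = __lasx_xvfmadd_s (a, b, c);   /* fused a*b + c */
+     __m256 r2 = __lasx_xvfmsub_s (a, b, c);   /* fused a*b - c */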
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfmsub_s (__m256 _1, __m256 _2, __m256 _3) +{ + return (__m256)__builtin_lasx_xvfmsub_s ((v8f32)_1, (v8f32)_2, (v8f32)_3); +} + +/* Assembly instruction format: xd, xj, xk, xa. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfmsub_d (__m256d _1, __m256d _2, __m256d _3) +{ + return (__m256d)__builtin_lasx_xvfmsub_d ((v4f64)_1, (v4f64)_2, (v4f64)_3); +} + +/* Assembly instruction format: xd, xj, xk, xa. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfnmadd_s (__m256 _1, __m256 _2, __m256 _3) +{ + return (__m256)__builtin_lasx_xvfnmadd_s ((v8f32)_1, (v8f32)_2, (v8f32)_3); +} + +/* Assembly instruction format: xd, xj, xk, xa. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfnmadd_d (__m256d _1, __m256d _2, __m256d _3) +{ + return (__m256d)__builtin_lasx_xvfnmadd_d ((v4f64)_1, (v4f64)_2, (v4f64)_3); +} + +/* Assembly instruction format: xd, xj, xk, xa. */ +/* Data types in instruction templates: V8SF, V8SF, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfnmsub_s (__m256 _1, __m256 _2, __m256 _3) +{ + return (__m256)__builtin_lasx_xvfnmsub_s ((v8f32)_1, (v8f32)_2, (v8f32)_3); +} + +/* Assembly instruction format: xd, xj, xk, xa. */ +/* Data types in instruction templates: V4DF, V4DF, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfnmsub_d (__m256d _1, __m256d _2, __m256d _3) +{ + return (__m256d)__builtin_lasx_xvfnmsub_d ((v4f64)_1, (v4f64)_2, (v4f64)_3); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrne_w_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrne_w_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrne_l_d (__m256d _1) +{ + return (__m256i)__builtin_lasx_xvftintrne_l_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrp_w_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrp_w_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrp_l_d (__m256d _1) +{ + return (__m256i)__builtin_lasx_xvftintrp_l_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrm_w_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrm_w_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. 
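+   (Editor's example: the rne/rp/rm/rz suffixes select the rounding
+   mode, namely to nearest-even, toward +inf, toward -inf, and toward
+   zero.  So
+
+     __m256i i = __lasx_xvftintrm_l_d (d);   /* double -> int64, floor */
+
+   converts each double by rounding toward minus infinity.)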
*/ +/* Data types in instruction templates: V4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrm_l_d (__m256d _1) +{ + return (__m256i)__builtin_lasx_xvftintrm_l_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftint_w_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvftint_w_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SF, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvffint_s_l (__m256i _1, __m256i _2) +{ + return (__m256)__builtin_lasx_xvffint_s_l ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrz_w_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvftintrz_w_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrp_w_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvftintrp_w_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrm_w_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvftintrm_w_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrne_w_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvftintrne_w_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftinth_l_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftinth_l_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintl_l_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintl_l_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvffinth_d_w (__m256i _1) +{ + return (__m256d)__builtin_lasx_xvffinth_d_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DF, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvffintl_d_w (__m256i _1) +{ + return (__m256d)__builtin_lasx_xvffintl_d_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SF. 
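+
+   Usage sketch (editor's illustration): the h/l pairs split one
+   float vector into two int64 vectors; which four floats each half
+   takes follows the LASX spec.  Assuming f is an __m256:
+
+     __m256i a = __lasx_xvftintrzl_l_s (f);   /* "low" floats,  truncated */
+     __m256i b = __lasx_xvftintrzh_l_s (f);   /* "high" floats, truncated */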
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrzh_l_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrzh_l_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrzl_l_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrzl_l_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrph_l_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrph_l_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrpl_l_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrpl_l_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrmh_l_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrmh_l_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrml_l_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrml_l_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrneh_l_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrneh_l_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvftintrnel_l_s (__m256 _1) +{ + return (__m256i)__builtin_lasx_xvftintrnel_l_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfrintrne_s (__m256 _1) +{ + return (__m256)__builtin_lasx_xvfrintrne_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfrintrne_d (__m256d _1) +{ + return (__m256d)__builtin_lasx_xvfrintrne_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfrintrz_s (__m256 _1) +{ + return (__m256)__builtin_lasx_xvfrintrz_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfrintrz_d (__m256d _1) +{ + return (__m256d)__builtin_lasx_xvfrintrz_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SF. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfrintrp_s (__m256 _1) +{ + return (__m256)__builtin_lasx_xvfrintrp_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfrintrp_d (__m256d _1) +{ + return (__m256d)__builtin_lasx_xvfrintrp_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256 __lasx_xvfrintrm_s (__m256 _1) +{ + return (__m256)__builtin_lasx_xvfrintrm_s ((v8f32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256d __lasx_xvfrintrm_d (__m256d _1) +{ + return (__m256d)__builtin_lasx_xvfrintrm_d ((v4f64)_1); +} + +/* Assembly instruction format: xd, rj, si12. */ +/* Data types in instruction templates: V32QI, CVPOINTER, SI. */ +#define __lasx_xvld(/*void **/ _1, /*si12*/ _2) \ + ((__m256i)__builtin_lasx_xvld ((void *)(_1), (_2))) + +/* Assembly instruction format: xd, rj, si12. */ +/* Data types in instruction templates: VOID, V32QI, CVPOINTER, SI. */ +#define __lasx_xvst(/*__m256i*/ _1, /*void **/ _2, /*si12*/ _3) \ + ((void)__builtin_lasx_xvst ((v32i8)(_1), (void *)(_2), (_3))) + +/* Assembly instruction format: xd, rj, si8, idx. */ +/* Data types in instruction templates: VOID, V32QI, CVPOINTER, SI, UQI. */ +#define __lasx_xvstelm_b(/*__m256i*/ _1, /*void **/ _2, /*si8*/ _3, /*idx*/ _4) \ + ((void)__builtin_lasx_xvstelm_b ((v32i8)(_1), (void *)(_2), (_3), (_4))) + +/* Assembly instruction format: xd, rj, si8, idx. */ +/* Data types in instruction templates: VOID, V16HI, CVPOINTER, SI, UQI. */ +#define __lasx_xvstelm_h(/*__m256i*/ _1, /*void **/ _2, /*si8*/ _3, /*idx*/ _4) \ + ((void)__builtin_lasx_xvstelm_h ((v16i16)(_1), (void *)(_2), (_3), (_4))) + +/* Assembly instruction format: xd, rj, si8, idx. */ +/* Data types in instruction templates: VOID, V8SI, CVPOINTER, SI, UQI. */ +#define __lasx_xvstelm_w(/*__m256i*/ _1, /*void **/ _2, /*si8*/ _3, /*idx*/ _4) \ + ((void)__builtin_lasx_xvstelm_w ((v8i32)(_1), (void *)(_2), (_3), (_4))) + +/* Assembly instruction format: xd, rj, si8, idx. */ +/* Data types in instruction templates: VOID, V4DI, CVPOINTER, SI, UQI. */ +#define __lasx_xvstelm_d(/*__m256i*/ _1, /*void **/ _2, /*si8*/ _3, /*idx*/ _4) \ + ((void)__builtin_lasx_xvstelm_d ((v4i64)(_1), (void *)(_2), (_3), (_4))) + +/* Assembly instruction format: xd, xj, ui3. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, UQI. */ +#define __lasx_xvinsve0_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui3*/ _3) \ + ((__m256i)__builtin_lasx_xvinsve0_w ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui2. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, UQI. */ +#define __lasx_xvinsve0_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui2*/ _3) \ + ((__m256i)__builtin_lasx_xvinsve0_d ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui3. */ +/* Data types in instruction templates: V8SI, V8SI, UQI. */ +#define __lasx_xvpickve_w(/*__m256i*/ _1, /*ui3*/ _2) \ + ((__m256i)__builtin_lasx_xvpickve_w ((v8i32)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui2. 
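+   (Editor's example, illustrative: xvpickve_* copies the selected
+   element of _1 into element 0 of the result, with the remaining
+   elements filled as the LASX manual specifies, and the index must
+   be an integer constant, e.g.
+
+     __m256i e = __lasx_xvpickve_d (v, 2);   /* select 64-bit element 2 */
+
+   for a hypothetical vector v.)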
*/ +/* Data types in instruction templates: V4DI, V4DI, UQI. */ +#define __lasx_xvpickve_d(/*__m256i*/ _1, /*ui2*/ _2) \ + ((__m256i)__builtin_lasx_xvpickve_d ((v4i64)(_1), (_2))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrlrn_b_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrlrn_b_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrlrn_h_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrlrn_h_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrlrn_w_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrlrn_w_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrln_b_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrln_b_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrln_h_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrln_h_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvssrln_w_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvssrln_w_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvorn_v (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvorn_v ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, i13. */ +/* Data types in instruction templates: V4DI, HI. */ +#define __lasx_xvldi(/*i13*/ _1) \ + ((__m256i)__builtin_lasx_xvldi ((_1))) + +/* Assembly instruction format: xd, rj, rk. */ +/* Data types in instruction templates: V32QI, CVPOINTER, DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvldx (void * _1, long int _2) +{ + return (__m256i)__builtin_lasx_xvldx ((void *)_1, (long int)_2); +} + +/* Assembly instruction format: xd, rj, rk. */ +/* Data types in instruction templates: VOID, V32QI, CVPOINTER, DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +void __lasx_xvstx (__m256i _1, void * _2, long int _3) +{ + return (void)__builtin_lasx_xvstx ((v32i8)_1, (void *)_2, (long int)_3); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: UV4DI, UV4DI. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvextl_qu_du (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvextl_qu_du ((v4u64)_1); +} + +/* Assembly instruction format: xd, rj, ui3. */ +/* Data types in instruction templates: V8SI, V8SI, SI, UQI. */ +#define __lasx_xvinsgr2vr_w(/*__m256i*/ _1, /*int*/ _2, /*ui3*/ _3) \ + ((__m256i)__builtin_lasx_xvinsgr2vr_w ((v8i32)(_1), (int)(_2), (_3))) + +/* Assembly instruction format: xd, rj, ui2. */ +/* Data types in instruction templates: V4DI, V4DI, DI, UQI. */ +#define __lasx_xvinsgr2vr_d(/*__m256i*/ _1, /*long int*/ _2, /*ui2*/ _3) \ + ((__m256i)__builtin_lasx_xvinsgr2vr_d ((v4i64)(_1), (long int)(_2), (_3))) + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplve0_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvreplve0_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplve0_h (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvreplve0_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplve0_w (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvreplve0_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplve0_d (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvreplve0_d ((v4i64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvreplve0_q (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvreplve0_q ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V16HI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_h_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_h_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_w_h (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_w_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_d_w (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_d_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_w_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_w_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V16HI. 
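+
+   Illustration (editor's sketch): unlike most LASX operations, the
+   vext2xv_* family widens across the whole register; here the four
+   lowest int16 elements are sign-extended to int64:
+
+     __m256i q = __lasx_vext2xv_d_h (h);   /* h is a hypothetical input */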
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_d_h (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_d_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_d_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_d_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V16HI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_hu_bu (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_hu_bu ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_wu_hu (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_wu_hu ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_du_wu (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_du_wu ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_wu_bu (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_wu_bu ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_du_hu (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_du_hu ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_vext2xv_du_bu (__m256i _1) +{ + return (__m256i)__builtin_lasx_vext2xv_du_bu ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, USI. */ +#define __lasx_xvpermi_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui8*/ _3) \ + ((__m256i)__builtin_lasx_xvpermi_q ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui8. */ +/* Data types in instruction templates: V4DI, V4DI, USI. */ +#define __lasx_xvpermi_d(/*__m256i*/ _1, /*ui8*/ _2) \ + ((__m256i)__builtin_lasx_xvpermi_d ((v4i64)(_1), (_2))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvperm_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvperm_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, rj, si12. */ +/* Data types in instruction templates: V32QI, CVPOINTER, SI. */ +#define __lasx_xvldrepl_b(/*void **/ _1, /*si12*/ _2) \ + ((__m256i)__builtin_lasx_xvldrepl_b ((void *)(_1), (_2))) + +/* Assembly instruction format: xd, rj, si11. */ +/* Data types in instruction templates: V16HI, CVPOINTER, SI. 
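+
+   Editor's note (illustrative): the xvldrepl_* wrappers load one
+   scalar and broadcast it to every element; the offset must be an
+   integer constant that fits the instruction's scaled immediate
+   field.  For a hypothetical int16_t pointer p:
+
+     __m256i b = __lasx_xvldrepl_h (p, 0);   /* broadcast *p */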
*/ +#define __lasx_xvldrepl_h(/*void **/ _1, /*si11*/ _2) \ + ((__m256i)__builtin_lasx_xvldrepl_h ((void *)(_1), (_2))) + +/* Assembly instruction format: xd, rj, si10. */ +/* Data types in instruction templates: V8SI, CVPOINTER, SI. */ +#define __lasx_xvldrepl_w(/*void **/ _1, /*si10*/ _2) \ + ((__m256i)__builtin_lasx_xvldrepl_w ((void *)(_1), (_2))) + +/* Assembly instruction format: xd, rj, si9. */ +/* Data types in instruction templates: V4DI, CVPOINTER, SI. */ +#define __lasx_xvldrepl_d(/*void **/ _1, /*si9*/ _2) \ + ((__m256i)__builtin_lasx_xvldrepl_d ((void *)(_1), (_2))) + +/* Assembly instruction format: rd, xj, ui3. */ +/* Data types in instruction templates: SI, V8SI, UQI. */ +#define __lasx_xvpickve2gr_w(/*__m256i*/ _1, /*ui3*/ _2) \ + ((int)__builtin_lasx_xvpickve2gr_w ((v8i32)(_1), (_2))) + +/* Assembly instruction format: rd, xj, ui3. */ +/* Data types in instruction templates: USI, V8SI, UQI. */ +#define __lasx_xvpickve2gr_wu(/*__m256i*/ _1, /*ui3*/ _2) \ + ((unsigned int)__builtin_lasx_xvpickve2gr_wu ((v8i32)(_1), (_2))) + +/* Assembly instruction format: rd, xj, ui2. */ +/* Data types in instruction templates: DI, V4DI, UQI. */ +#define __lasx_xvpickve2gr_d(/*__m256i*/ _1, /*ui2*/ _2) \ + ((long int)__builtin_lasx_xvpickve2gr_d ((v4i64)(_1), (_2))) + +/* Assembly instruction format: rd, xj, ui2. */ +/* Data types in instruction templates: UDI, V4DI, UQI. */ +#define __lasx_xvpickve2gr_du(/*__m256i*/ _1, /*ui2*/ _2) \ + ((unsigned long int)__builtin_lasx_xvpickve2gr_du ((v4i64)(_1), (_2))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_q_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_q_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_d_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_d_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_w_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_w_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_h_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_h_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_q_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_q_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_d_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_d_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. 
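+   (Editor's example, illustrative: the wev/wod pairs widen the even-
+   and odd-indexed elements respectively, so full-width sums of
+   unsigned 16-bit data can be formed without overflow:
+
+     __m256i even = __lasx_xvaddwev_w_hu (a, b);   /* even u16 -> u32 sums */
+     __m256i odd  = __lasx_xvaddwod_w_hu (a, b);   /* odd  u16 -> u32 sums */
+
+   with a and b as hypothetical inputs.)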
*/ +/* Data types in instruction templates: V8SI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_w_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_w_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_h_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_h_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwev_q_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwev_q_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwev_d_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwev_d_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwev_w_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwev_w_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwev_h_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwev_h_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwev_q_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwev_q_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwev_d_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwev_d_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwev_w_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwev_w_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwev_h_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwev_h_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. 
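+
+   Usage sketch (editor's illustration): the q_d variant multiplies
+   the even-indexed signed 64-bit elements and keeps the full 128-bit
+   products; the matching odd-element form, __lasx_xvmulwod_q_d, is
+   defined later in this header.
+
+     __m256i p = __lasx_xvmulwev_q_d (a, b);   /* even i64 lanes, full products */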
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_q_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_q_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_d_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_d_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_w_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_w_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_h_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_h_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_q_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_q_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_d_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_d_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_w_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_w_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_h_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_h_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_q_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_q_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_d_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_d_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V16HI, V16HI. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_w_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_w_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_h_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_h_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_q_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_q_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_d_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_d_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_w_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_w_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_h_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_h_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwod_q_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwod_q_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwod_d_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwod_d_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwod_w_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwod_w_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwod_h_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwod_h_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, UV4DI. 
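*/

/* A usage sketch (illustrative, assumes -mlasx): the _ev/_od pairs split the
   work of a full widening operation.  Widening all eight 32-bit lanes of a
   and b into 64-bit sums takes both forms, one covering lanes 0,2,4,6 and
   the other lanes 1,3,5,7.  */
static inline void
widening_add_w (__m256i a, __m256i b, __m256i *even, __m256i *odd)
{
  *even = __lasx_xvaddwev_d_w (a, b); /* sums of even-indexed lanes */
  *odd  = __lasx_xvaddwod_d_w (a, b); /* sums of odd-indexed lanes */
}

+/* Data types in instruction templates: V4DI, UV4DI, UV4DI.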
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwod_q_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwod_q_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwod_d_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwod_d_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwod_w_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwod_w_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsubwod_h_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsubwod_h_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_q_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_q_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_d_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_d_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_w_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_w_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_h_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_h_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_q_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_q_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_d_wu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_d_wu ((v8u32)_1, (v8u32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV16HI, UV16HI. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_w_hu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_w_hu ((v16u16)_1, (v16u16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_h_bu (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_h_bu ((v32u8)_1, (v32u8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_d_wu_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_d_wu_w ((v8u32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_w_hu_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_w_hu_h ((v16u16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_h_bu_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_h_bu_b ((v32u8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_d_wu_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_d_wu_w ((v8u32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_w_hu_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_w_hu_h ((v16u16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_h_bu_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_h_bu_b ((v32u8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_d_wu_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_d_wu_w ((v8u32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_w_hu_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_w_hu_h ((v16u16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV32QI, V32QI. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_h_bu_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_h_bu_b ((v32u8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_d_wu_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_d_wu_w ((v8u32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, UV16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_w_hu_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_w_hu_h ((v16u16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, UV32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_h_bu_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_h_bu_b ((v32u8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhaddw_q_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhaddw_q_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhaddw_qu_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhaddw_qu_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhsubw_q_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhsubw_q_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvhsubw_qu_du (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvhsubw_qu_du ((v4u64)_1, (v4u64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_q_d (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_q_d ((v4i64)_1, (v4i64)_2, (v4i64)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_d_w (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_d_w ((v4i64)_1, (v8i32)_2, (v8i32)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V16HI, V16HI. 
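*/

/* A sketch of the widening multiply-accumulate forms (illustrative, assumes
   -mlasx): xd serves as both accumulator and destination, which is why these
   builtins take three vector operands.  Here the products of the
   even-indexed 32-bit lanes of a and b are widened to 64 bits and added into
   acc; xvmaddwod_d_w, defined below, covers the odd-indexed lanes.  */
static inline __m256i
mac_even_w (__m256i acc, __m256i a, __m256i b)
{
  return __lasx_xvmaddwev_d_w (acc, a, b);
}

+/* Data types in instruction templates: V8SI, V8SI, V16HI, V16HI.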
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_w_h (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_w_h ((v8i32)_1, (v16i16)_2, (v16i16)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_h_b (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_h_b ((v16i16)_1, (v32i8)_2, (v32i8)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_q_du (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_q_du ((v4u64)_1, (v4u64)_2, (v4u64)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_d_wu (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_d_wu ((v4u64)_1, (v8u32)_2, (v8u32)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV8SI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_w_hu (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_w_hu ((v8u32)_1, (v16u16)_2, (v16u16)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV16HI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_h_bu (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_h_bu ((v16u16)_1, (v32u8)_2, (v32u8)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_q_d (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_q_d ((v4i64)_1, (v4i64)_2, (v4i64)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_d_w (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_d_w ((v4i64)_1, (v8i32)_2, (v8i32)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_w_h (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_w_h ((v8i32)_1, (v16i16)_2, (v16i16)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_h_b (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_h_b ((v16i16)_1, (v32i8)_2, (v32i8)_3); +} + +/* Assembly instruction format: xd, xj, xk. 
*/ +/* Data types in instruction templates: UV4DI, UV4DI, UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_q_du (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_q_du ((v4u64)_1, (v4u64)_2, (v4u64)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV4DI, UV4DI, UV8SI, UV8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_d_wu (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_d_wu ((v4u64)_1, (v8u32)_2, (v8u32)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV8SI, UV8SI, UV16HI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_w_hu (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_w_hu ((v8u32)_1, (v16u16)_2, (v16u16)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: UV16HI, UV16HI, UV32QI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_h_bu (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_h_bu ((v16u16)_1, (v32u8)_2, (v32u8)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, UV4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_q_du_d (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_q_du_d ((v4i64)_1, (v4u64)_2, (v4i64)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, UV8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_d_wu_w (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_d_wu_w ((v4i64)_1, (v8u32)_2, (v8i32)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, UV16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_w_hu_h (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_w_hu_h ((v8i32)_1, (v16u16)_2, (v16i16)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, UV32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwev_h_bu_b (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwev_h_bu_b ((v16i16)_1, (v32u8)_2, (v32i8)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, UV4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_q_du_d (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_q_du_d ((v4i64)_1, (v4u64)_2, (v4i64)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, UV8SI, V8SI. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_d_wu_w (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_d_wu_w ((v4i64)_1, (v8u32)_2, (v8i32)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, UV16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_w_hu_h (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_w_hu_h ((v8i32)_1, (v16u16)_2, (v16i16)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, UV32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmaddwod_h_bu_b (__m256i _1, __m256i _2, __m256i _3) +{ + return (__m256i)__builtin_lasx_xvmaddwod_h_bu_b ((v16i16)_1, (v32u8)_2, (v32i8)_3); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvrotr_b (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvrotr_b ((v32i8)_1, (v32i8)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvrotr_h (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvrotr_h ((v16i16)_1, (v16i16)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvrotr_w (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvrotr_w ((v8i32)_1, (v8i32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvrotr_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvrotr_d ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvadd_q (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvadd_q ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvsub_q (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvsub_q ((v4i64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwev_q_du_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwev_q_du_d ((v4u64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, V4DI. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvaddwod_q_du_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvaddwod_q_du_d ((v4u64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwev_q_du_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwev_q_du_d ((v4u64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, UV4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmulwod_q_du_d (__m256i _1, __m256i _2) +{ + return (__m256i)__builtin_lasx_xvmulwod_q_du_d ((v4u64)_1, (v4i64)_2); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmskgez_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvmskgez_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V32QI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvmsknz_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvmsknz_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V16HI, V32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvexth_h_b (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvexth_h_b ((v32i8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V8SI, V16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvexth_w_h (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvexth_w_h ((v16i16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V8SI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvexth_d_w (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvexth_d_w ((v8i32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvexth_q_d (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvexth_q_d ((v4i64)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: UV16HI, UV32QI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvexth_hu_bu (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvexth_hu_bu ((v32u8)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: UV8SI, UV16HI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvexth_wu_hu (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvexth_wu_hu ((v16u16)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: UV4DI, UV8SI. 
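*/

/* A sketch (illustrative, assumes -mlasx): the xvexth_* family widens the
   upper elements of a vector to the next element width, sign-extending the
   signed forms and zero-extending the _u forms; here 32-bit elements become
   64-bit.  */
static inline __m256i
widen_high_w (__m256i v)
{
  return __lasx_xvexth_d_w (v); /* sign-extend upper words to doublewords */
}

+/* Data types in instruction templates: UV4DI, UV8SI.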
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvexth_du_wu (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvexth_du_wu ((v8u32)_1); +} + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: UV4DI, UV4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvexth_qu_du (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvexth_qu_du ((v4u64)_1); +} + +/* Assembly instruction format: xd, xj, ui3. */ +/* Data types in instruction templates: V32QI, V32QI, UQI. */ +#define __lasx_xvrotri_b(/*__m256i*/ _1, /*ui3*/ _2) \ + ((__m256i)__builtin_lasx_xvrotri_b ((v32i8)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V16HI, V16HI, UQI. */ +#define __lasx_xvrotri_h(/*__m256i*/ _1, /*ui4*/ _2) \ + ((__m256i)__builtin_lasx_xvrotri_h ((v16i16)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V8SI, V8SI, UQI. */ +#define __lasx_xvrotri_w(/*__m256i*/ _1, /*ui5*/ _2) \ + ((__m256i)__builtin_lasx_xvrotri_w ((v8i32)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: V4DI, V4DI, UQI. */ +#define __lasx_xvrotri_d(/*__m256i*/ _1, /*ui6*/ _2) \ + ((__m256i)__builtin_lasx_xvrotri_d ((v4i64)(_1), (_2))) + +/* Assembly instruction format: xd, xj. */ +/* Data types in instruction templates: V4DI, V4DI. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvextl_q_d (__m256i _1) +{ + return (__m256i)__builtin_lasx_xvextl_q_d ((v4i64)_1); +} + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, USI. */ +#define __lasx_xvsrlni_b_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvsrlni_b_h ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, USI. */ +#define __lasx_xvsrlni_h_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvsrlni_h_w ((v16i16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, USI. */ +#define __lasx_xvsrlni_w_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvsrlni_w_d ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, USI. */ +#define __lasx_xvsrlni_d_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvsrlni_d_q ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, USI. */ +#define __lasx_xvsrlrni_b_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvsrlrni_b_h ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, USI. */ +#define __lasx_xvsrlrni_h_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvsrlrni_h_w ((v16i16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, USI. 
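*/

/* A sketch (illustrative, assumes -mlasx): the *srlni* macros shift the
   elements of two source vectors right by an immediate and narrow the
   results into one vector of half-width elements.  Because the shift amount
   expands to a builtin immediate, it must be a compile-time constant.  */
static inline __m256i
take_high_bytes (__m256i a, __m256i b)
{
  return __lasx_xvsrlni_b_h (a, b, 8); /* keep the high byte of each i16 */
}

+/* Data types in instruction templates: V8SI, V8SI, V8SI, USI.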
*/ +#define __lasx_xvsrlrni_w_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvsrlrni_w_d ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, USI. */ +#define __lasx_xvsrlrni_d_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvsrlrni_d_q ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, USI. */ +#define __lasx_xvssrlni_b_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlni_b_h ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, USI. */ +#define __lasx_xvssrlni_h_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlni_h_w ((v16i16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, USI. */ +#define __lasx_xvssrlni_w_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlni_w_d ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, USI. */ +#define __lasx_xvssrlni_d_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlni_d_q ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: UV32QI, UV32QI, V32QI, USI. */ +#define __lasx_xvssrlni_bu_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlni_bu_h ((v32u8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: UV16HI, UV16HI, V16HI, USI. */ +#define __lasx_xvssrlni_hu_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlni_hu_w ((v16u16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: UV8SI, UV8SI, V8SI, USI. */ +#define __lasx_xvssrlni_wu_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlni_wu_d ((v8u32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: UV4DI, UV4DI, V4DI, USI. */ +#define __lasx_xvssrlni_du_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlni_du_q ((v4u64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, USI. */ +#define __lasx_xvssrlrni_b_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlrni_b_h ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, USI. */ +#define __lasx_xvssrlrni_h_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlrni_h_w ((v16i16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, USI. */ +#define __lasx_xvssrlrni_w_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlrni_w_d ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, USI. 
*/ +#define __lasx_xvssrlrni_d_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlrni_d_q ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: UV32QI, UV32QI, V32QI, USI. */ +#define __lasx_xvssrlrni_bu_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlrni_bu_h ((v32u8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: UV16HI, UV16HI, V16HI, USI. */ +#define __lasx_xvssrlrni_hu_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlrni_hu_w ((v16u16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: UV8SI, UV8SI, V8SI, USI. */ +#define __lasx_xvssrlrni_wu_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlrni_wu_d ((v8u32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: UV4DI, UV4DI, V4DI, USI. */ +#define __lasx_xvssrlrni_du_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvssrlrni_du_q ((v4u64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, USI. */ +#define __lasx_xvsrani_b_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvsrani_b_h ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, USI. */ +#define __lasx_xvsrani_h_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvsrani_h_w ((v16i16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, USI. */ +#define __lasx_xvsrani_w_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvsrani_w_d ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, USI. */ +#define __lasx_xvsrani_d_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvsrani_d_q ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, USI. */ +#define __lasx_xvsrarni_b_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvsrarni_b_h ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, USI. */ +#define __lasx_xvsrarni_h_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvsrarni_h_w ((v16i16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, USI. */ +#define __lasx_xvsrarni_w_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvsrarni_w_d ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, USI. */ +#define __lasx_xvsrarni_d_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvsrarni_d_q ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, USI. 
*/ +#define __lasx_xvssrani_b_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvssrani_b_h ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, USI. */ +#define __lasx_xvssrani_h_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvssrani_h_w ((v16i16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, USI. */ +#define __lasx_xvssrani_w_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvssrani_w_d ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, USI. */ +#define __lasx_xvssrani_d_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvssrani_d_q ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: UV32QI, UV32QI, V32QI, USI. */ +#define __lasx_xvssrani_bu_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvssrani_bu_h ((v32u8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: UV16HI, UV16HI, V16HI, USI. */ +#define __lasx_xvssrani_hu_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvssrani_hu_w ((v16u16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: UV8SI, UV8SI, V8SI, USI. */ +#define __lasx_xvssrani_wu_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvssrani_wu_d ((v8u32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: UV4DI, UV4DI, V4DI, USI. */ +#define __lasx_xvssrani_du_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvssrani_du_q ((v4u64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: V32QI, V32QI, V32QI, USI. */ +#define __lasx_xvssrarni_b_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvssrarni_b_h ((v32i8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: V16HI, V16HI, V16HI, USI. */ +#define __lasx_xvssrarni_h_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvssrarni_h_w ((v16i16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: V8SI, V8SI, V8SI, USI. */ +#define __lasx_xvssrarni_w_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvssrarni_w_d ((v8i32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: V4DI, V4DI, V4DI, USI. */ +#define __lasx_xvssrarni_d_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvssrarni_d_q ((v4i64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui4. */ +/* Data types in instruction templates: UV32QI, UV32QI, V32QI, USI. */ +#define __lasx_xvssrarni_bu_h(/*__m256i*/ _1, /*__m256i*/ _2, /*ui4*/ _3) \ + ((__m256i)__builtin_lasx_xvssrarni_bu_h ((v32u8)(_1), (v32i8)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui5. */ +/* Data types in instruction templates: UV16HI, UV16HI, V16HI, USI. 
*/ +#define __lasx_xvssrarni_hu_w(/*__m256i*/ _1, /*__m256i*/ _2, /*ui5*/ _3) \ + ((__m256i)__builtin_lasx_xvssrarni_hu_w ((v16u16)(_1), (v16i16)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui6. */ +/* Data types in instruction templates: UV8SI, UV8SI, V8SI, USI. */ +#define __lasx_xvssrarni_wu_d(/*__m256i*/ _1, /*__m256i*/ _2, /*ui6*/ _3) \ + ((__m256i)__builtin_lasx_xvssrarni_wu_d ((v8u32)(_1), (v8i32)(_2), (_3))) + +/* Assembly instruction format: xd, xj, ui7. */ +/* Data types in instruction templates: UV4DI, UV4DI, V4DI, USI. */ +#define __lasx_xvssrarni_du_q(/*__m256i*/ _1, /*__m256i*/ _2, /*ui7*/ _3) \ + ((__m256i)__builtin_lasx_xvssrarni_du_q ((v4u64)(_1), (v4i64)(_2), (_3))) + +/* Assembly instruction format: cd, xj. */ +/* Data types in instruction templates: SI, UV32QI. */ +#define __lasx_xbnz_b(/*__m256i*/ _1) \ + ((int)__builtin_lasx_xbnz_b ((v32u8)(_1))) + +/* Assembly instruction format: cd, xj. */ +/* Data types in instruction templates: SI, UV4DI. */ +#define __lasx_xbnz_d(/*__m256i*/ _1) \ + ((int)__builtin_lasx_xbnz_d ((v4u64)(_1))) + +/* Assembly instruction format: cd, xj. */ +/* Data types in instruction templates: SI, UV16HI. */ +#define __lasx_xbnz_h(/*__m256i*/ _1) \ + ((int)__builtin_lasx_xbnz_h ((v16u16)(_1))) + +/* Assembly instruction format: cd, xj. */ +/* Data types in instruction templates: SI, UV32QI. */ +#define __lasx_xbnz_v(/*__m256i*/ _1) \ + ((int)__builtin_lasx_xbnz_v ((v32u8)(_1))) + +/* Assembly instruction format: cd, xj. */ +/* Data types in instruction templates: SI, UV8SI. */ +#define __lasx_xbnz_w(/*__m256i*/ _1) \ + ((int)__builtin_lasx_xbnz_w ((v8u32)(_1))) + +/* Assembly instruction format: cd, xj. */ +/* Data types in instruction templates: SI, UV32QI. */ +#define __lasx_xbz_b(/*__m256i*/ _1) \ + ((int)__builtin_lasx_xbz_b ((v32u8)(_1))) + +/* Assembly instruction format: cd, xj. */ +/* Data types in instruction templates: SI, UV4DI. */ +#define __lasx_xbz_d(/*__m256i*/ _1) \ + ((int)__builtin_lasx_xbz_d ((v4u64)(_1))) + +/* Assembly instruction format: cd, xj. */ +/* Data types in instruction templates: SI, UV16HI. */ +#define __lasx_xbz_h(/*__m256i*/ _1) \ + ((int)__builtin_lasx_xbz_h ((v16u16)(_1))) + +/* Assembly instruction format: cd, xj. */ +/* Data types in instruction templates: SI, UV32QI. */ +#define __lasx_xbz_v(/*__m256i*/ _1) \ + ((int)__builtin_lasx_xbz_v ((v32u8)(_1))) + +/* Assembly instruction format: cd, xj. */ +/* Data types in instruction templates: SI, UV8SI. */ +#define __lasx_xbz_w(/*__m256i*/ _1) \ + ((int)__builtin_lasx_xbz_w ((v8u32)(_1))) + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_caf_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_caf_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_caf_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_caf_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. 
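*/

/* A sketch (illustrative, assumes -mlasx): the xbz/xbnz macros wrap the
   vector branch-test instructions and yield an int, so they slot directly
   into scalar control flow; xbnz_v tests whether any bit of the whole
   vector is set.  */
static inline int
any_bit_set (__m256i v)
{
  return __lasx_xbnz_v (v); /* 1 if v is not all-zero, else 0 */
}

+/* Data types in instruction templates: V4DI, V4DF, V4DF.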
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_ceq_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_ceq_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_ceq_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_ceq_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cle_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cle_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cle_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cle_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_clt_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_clt_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_clt_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_clt_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cne_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cne_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cne_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cne_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cor_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cor_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cor_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cor_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cueq_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cueq_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. 
*/ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cueq_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cueq_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cule_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cule_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cule_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cule_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cult_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cult_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cult_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cult_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cun_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cun_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cune_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cune_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cune_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cune_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_cun_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_cun_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_saf_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_saf_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. 
*/ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_saf_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_saf_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_seq_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_seq_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_seq_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_seq_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sle_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sle_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sle_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sle_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_slt_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_slt_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_slt_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_slt_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sne_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sne_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sne_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sne_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sor_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sor_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sor_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sor_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. 
*/ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sueq_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sueq_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sueq_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sueq_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sule_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sule_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sule_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sule_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sult_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sult_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sult_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sult_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sun_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sun_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V4DI, V4DF, V4DF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sune_d (__m256d _1, __m256d _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sune_d ((v4f64)_1, (v4f64)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sune_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sune_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, xk. */ +/* Data types in instruction templates: V8SI, V8SF, V8SF. */ +extern __inline __attribute__((__gnu_inline__, __always_inline__, __artificial__)) +__m256i __lasx_xvfcmp_sun_s (__m256 _1, __m256 _2) +{ + return (__m256i)__builtin_lasx_xvfcmp_sun_s ((v8f32)_1, (v8f32)_2); +} + +/* Assembly instruction format: xd, xj, ui2. */ +/* Data types in instruction templates: V4DF, V4DF, UQI. */ +#define __lasx_xvpickve_d_f(/*__m256d*/ _1, /*ui2*/ _2) \ + ((__m256d)__builtin_lasx_xvpickve_d_f ((v4f64)(_1), (_2))) + +/* Assembly instruction format: xd, xj, ui3. 
*/ +/* Data types in instruction templates: V8SF, V8SF, UQI. */ +#define __lasx_xvpickve_w_f(/*__m256*/ _1, /*ui3*/ _2) \ + ((__m256)__builtin_lasx_xvpickve_w_f ((v8f32)(_1), (_2))) + +/* Assembly instruction format: xd, si10. */ +/* Data types in instruction templates: V32QI, HI. */ +#define __lasx_xvrepli_b(/*si10*/ _1) \ + ((__m256i)__builtin_lasx_xvrepli_b ((_1))) + +/* Assembly instruction format: xd, si10. */ +/* Data types in instruction templates: V4DI, HI. */ +#define __lasx_xvrepli_d(/*si10*/ _1) \ + ((__m256i)__builtin_lasx_xvrepli_d ((_1))) + +/* Assembly instruction format: xd, si10. */ +/* Data types in instruction templates: V16HI, HI. */ +#define __lasx_xvrepli_h(/*si10*/ _1) \ + ((__m256i)__builtin_lasx_xvrepli_h ((_1))) + +/* Assembly instruction format: xd, si10. */ +/* Data types in instruction templates: V8SI, HI. */ +#define __lasx_xvrepli_w(/*si10*/ _1) \ + ((__m256i)__builtin_lasx_xvrepli_w ((_1))) + +#endif /* defined(__loongarch_asx). */ +#endif /* _GCC_LOONGSON_ASXINTRIN_H. */
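
As a quick illustration of how the new intrinsics compose, here is a minimal usage sketch (not part of the patch). It assumes GCC is invoked with -mlasx, and it assumes the __lasx_xvld, __lasx_xvst and __lasx_xvbitsel_v wrappers defined earlier in lasxintrin.h; only their __builtin_lasx_* registrations are visible in this hunk, so treat those exact spellings as assumptions.

#include <lasxintrin.h>

/* c[i] = (a[i] < b[i]) ? a[i] : b[i], eight floats per iteration;
   a scalar tail loop for the remaining n % 8 elements is omitted.  */
void
vec_min_f32 (const float *a, const float *b, float *c, int n)
{
  for (int i = 0; i + 8 <= n; i += 8)
    {
      __m256 va = (__m256) __lasx_xvld (a + i, 0);
      __m256 vb = (__m256) __lasx_xvld (b + i, 0);
      /* Quiet comparison: a lane becomes all-ones where va < vb,
         or where the lane compares unordered (the cult form).  */
      __m256i m = __lasx_xvfcmp_cult_s (va, vb);
      /* Bit select: take va where the mask is set, vb elsewhere.  */
      __m256i r = __lasx_xvbitsel_v ((__m256i) vb, (__m256i) va, m);
      __lasx_xvst (r, c + i, 0);
    }
}

The quiet (c-prefixed) predicates used above do not raise invalid-operation for quiet NaN operands; the signaling (s-prefixed) forms such as __lasx_xvfcmp_sult_s do, which is why the header provides both variants of every comparison.
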
diff --git a/gcc/config/loongarch/loongarch-builtins.cc b/gcc/config/loongarch/loongarch-builtins.cc index 5958f5b7fbe..064fee7dfa2 100644 --- a/gcc/config/loongarch/loongarch-builtins.cc +++ b/gcc/config/loongarch/loongarch-builtins.cc @@ -74,6 +74,13 @@ enum loongarch_builtin_type /* The function corresponds to an LSX conditional branch instruction combined with a compare instruction. */ LARCH_BUILTIN_LSX_TEST_BRANCH, + + /* For generating LoongArch LASX. */ + LARCH_BUILTIN_LASX, + + /* The function corresponds to an LASX conditional branch instruction + combined with a compare instruction. */ + LARCH_BUILTIN_LASX_TEST_BRANCH, }; /* Declare an availability predicate for built-in functions that require @@ -112,6 +119,7 @@ struct loongarch_builtin_description AVAIL_ALL (hard_float, TARGET_HARD_FLOAT_ABI) AVAIL_ALL (lsx, ISA_HAS_LSX) +AVAIL_ALL (lasx, ISA_HAS_LASX) /* Construct a loongarch_builtin_description from the given arguments. @@ -173,6 +181,30 @@ AVAIL_ALL (lsx, ISA_HAS_LSX) "__builtin_lsx_" #INSN, LARCH_BUILTIN_DIRECT_NO_TARGET, \ FUNCTION_TYPE, loongarch_builtin_avail_lsx } +/* Define an LASX LARCH_BUILTIN_LASX function __builtin_lasx_<INSN> + for instruction CODE_FOR_lasx_<INSN>. FUNCTION_TYPE is a builtin_description + field. */ +#define LASX_BUILTIN(INSN, FUNCTION_TYPE) \ + { CODE_FOR_lasx_ ## INSN, \ + "__builtin_lasx_" #INSN, LARCH_BUILTIN_LASX, \ + FUNCTION_TYPE, loongarch_builtin_avail_lasx } + +/* Define an LASX LARCH_BUILTIN_DIRECT_NO_TARGET function __builtin_lasx_<INSN> + for instruction CODE_FOR_lasx_<INSN>. FUNCTION_TYPE is a builtin_description + field. */ +#define LASX_NO_TARGET_BUILTIN(INSN, FUNCTION_TYPE) \ + { CODE_FOR_lasx_ ## INSN, \ + "__builtin_lasx_" #INSN, LARCH_BUILTIN_DIRECT_NO_TARGET, \ + FUNCTION_TYPE, loongarch_builtin_avail_lasx } + +/* Define an LASX LARCH_BUILTIN_LASX_TEST_BRANCH function __builtin_lasx_<INSN> + for instruction CODE_FOR_lasx_<INSN>. FUNCTION_TYPE is a builtin_description + field. */ +#define LASX_BUILTIN_TEST_BRANCH(INSN, FUNCTION_TYPE) \ + { CODE_FOR_lasx_ ## INSN, \ + "__builtin_lasx_" #INSN, LARCH_BUILTIN_LASX_TEST_BRANCH, \ + FUNCTION_TYPE, loongarch_builtin_avail_lasx } + /* LoongArch SX define CODE_FOR_lsx_xxx */ #define CODE_FOR_lsx_vsadd_b CODE_FOR_ssaddv16qi3 #define CODE_FOR_lsx_vsadd_h CODE_FOR_ssaddv8hi3 @@ -442,6 +474,276 @@ AVAIL_ALL (lsx, ISA_HAS_LSX) #define CODE_FOR_lsx_vssrlrn_hu_w CODE_FOR_lsx_vssrlrn_u_hu_w #define CODE_FOR_lsx_vssrlrn_wu_d CODE_FOR_lsx_vssrlrn_u_wu_d +/* LoongArch ASX define CODE_FOR_lasx_xxx */ +#define CODE_FOR_lasx_xvsadd_b CODE_FOR_ssaddv32qi3 +#define CODE_FOR_lasx_xvsadd_h CODE_FOR_ssaddv16hi3 +#define CODE_FOR_lasx_xvsadd_w CODE_FOR_ssaddv8si3 +#define CODE_FOR_lasx_xvsadd_d CODE_FOR_ssaddv4di3 +#define CODE_FOR_lasx_xvsadd_bu CODE_FOR_usaddv32qi3 +#define CODE_FOR_lasx_xvsadd_hu CODE_FOR_usaddv16hi3 +#define CODE_FOR_lasx_xvsadd_wu CODE_FOR_usaddv8si3 +#define CODE_FOR_lasx_xvsadd_du CODE_FOR_usaddv4di3 +#define CODE_FOR_lasx_xvadd_b CODE_FOR_addv32qi3 +#define CODE_FOR_lasx_xvadd_h CODE_FOR_addv16hi3 +#define CODE_FOR_lasx_xvadd_w CODE_FOR_addv8si3 +#define CODE_FOR_lasx_xvadd_d CODE_FOR_addv4di3 +#define CODE_FOR_lasx_xvaddi_bu CODE_FOR_addv32qi3 +#define CODE_FOR_lasx_xvaddi_hu CODE_FOR_addv16hi3 +#define CODE_FOR_lasx_xvaddi_wu CODE_FOR_addv8si3 +#define CODE_FOR_lasx_xvaddi_du CODE_FOR_addv4di3 +#define CODE_FOR_lasx_xvand_v CODE_FOR_andv32qi3 +#define CODE_FOR_lasx_xvandi_b CODE_FOR_andv32qi3 +#define CODE_FOR_lasx_xvbitsel_v CODE_FOR_lasx_xvbitsel_b +#define CODE_FOR_lasx_xvseqi_b CODE_FOR_lasx_xvseq_b +#define CODE_FOR_lasx_xvseqi_h CODE_FOR_lasx_xvseq_h +#define CODE_FOR_lasx_xvseqi_w CODE_FOR_lasx_xvseq_w +#define CODE_FOR_lasx_xvseqi_d CODE_FOR_lasx_xvseq_d +#define CODE_FOR_lasx_xvslti_b CODE_FOR_lasx_xvslt_b +#define CODE_FOR_lasx_xvslti_h CODE_FOR_lasx_xvslt_h +#define CODE_FOR_lasx_xvslti_w CODE_FOR_lasx_xvslt_w +#define CODE_FOR_lasx_xvslti_d CODE_FOR_lasx_xvslt_d +#define CODE_FOR_lasx_xvslti_bu CODE_FOR_lasx_xvslt_bu +#define CODE_FOR_lasx_xvslti_hu CODE_FOR_lasx_xvslt_hu +#define CODE_FOR_lasx_xvslti_wu CODE_FOR_lasx_xvslt_wu +#define CODE_FOR_lasx_xvslti_du CODE_FOR_lasx_xvslt_du +#define CODE_FOR_lasx_xvslei_b CODE_FOR_lasx_xvsle_b +#define CODE_FOR_lasx_xvslei_h CODE_FOR_lasx_xvsle_h +#define CODE_FOR_lasx_xvslei_w CODE_FOR_lasx_xvsle_w +#define CODE_FOR_lasx_xvslei_d CODE_FOR_lasx_xvsle_d +#define CODE_FOR_lasx_xvslei_bu CODE_FOR_lasx_xvsle_bu +#define CODE_FOR_lasx_xvslei_hu CODE_FOR_lasx_xvsle_hu +#define CODE_FOR_lasx_xvslei_wu CODE_FOR_lasx_xvsle_wu +#define CODE_FOR_lasx_xvslei_du CODE_FOR_lasx_xvsle_du +#define CODE_FOR_lasx_xvdiv_b CODE_FOR_divv32qi3 +#define CODE_FOR_lasx_xvdiv_h CODE_FOR_divv16hi3 +#define CODE_FOR_lasx_xvdiv_w CODE_FOR_divv8si3 +#define CODE_FOR_lasx_xvdiv_d CODE_FOR_divv4di3 +#define CODE_FOR_lasx_xvdiv_bu CODE_FOR_udivv32qi3 +#define CODE_FOR_lasx_xvdiv_hu CODE_FOR_udivv16hi3 +#define CODE_FOR_lasx_xvdiv_wu CODE_FOR_udivv8si3 +#define CODE_FOR_lasx_xvdiv_du CODE_FOR_udivv4di3 +#define CODE_FOR_lasx_xvfadd_s CODE_FOR_addv8sf3 +#define CODE_FOR_lasx_xvfadd_d CODE_FOR_addv4df3 +#define CODE_FOR_lasx_xvftintrz_w_s CODE_FOR_fix_truncv8sfv8si2 +#define CODE_FOR_lasx_xvftintrz_l_d CODE_FOR_fix_truncv4dfv4di2 +#define CODE_FOR_lasx_xvftintrz_wu_s CODE_FOR_fixuns_truncv8sfv8si2 +#define CODE_FOR_lasx_xvftintrz_lu_d CODE_FOR_fixuns_truncv4dfv4di2 +#define CODE_FOR_lasx_xvffint_s_w CODE_FOR_floatv8siv8sf2 +#define CODE_FOR_lasx_xvffint_d_l CODE_FOR_floatv4div4df2 +#define
CODE_FOR_lasx_xvffint_s_wu CODE_FOR_floatunsv8siv8sf2 +#define CODE_FOR_lasx_xvffint_d_lu CODE_FOR_floatunsv4div4df2 +#define CODE_FOR_lasx_xvfsub_s CODE_FOR_subv8sf3 +#define CODE_FOR_lasx_xvfsub_d CODE_FOR_subv4df3 +#define CODE_FOR_lasx_xvfmul_s CODE_FOR_mulv8sf3 +#define CODE_FOR_lasx_xvfmul_d CODE_FOR_mulv4df3 +#define CODE_FOR_lasx_xvfdiv_s CODE_FOR_divv8sf3 +#define CODE_FOR_lasx_xvfdiv_d CODE_FOR_divv4df3 +#define CODE_FOR_lasx_xvfmax_s CODE_FOR_smaxv8sf3 +#define CODE_FOR_lasx_xvfmax_d CODE_FOR_smaxv4df3 +#define CODE_FOR_lasx_xvfmin_s CODE_FOR_sminv8sf3 +#define CODE_FOR_lasx_xvfmin_d CODE_FOR_sminv4df3 +#define CODE_FOR_lasx_xvfsqrt_s CODE_FOR_sqrtv8sf2 +#define CODE_FOR_lasx_xvfsqrt_d CODE_FOR_sqrtv4df2 +#define CODE_FOR_lasx_xvflogb_s CODE_FOR_logbv8sf2 +#define CODE_FOR_lasx_xvflogb_d CODE_FOR_logbv4df2 +#define CODE_FOR_lasx_xvmax_b CODE_FOR_smaxv32qi3 +#define CODE_FOR_lasx_xvmax_h CODE_FOR_smaxv16hi3 +#define CODE_FOR_lasx_xvmax_w CODE_FOR_smaxv8si3 +#define CODE_FOR_lasx_xvmax_d CODE_FOR_smaxv4di3 +#define CODE_FOR_lasx_xvmaxi_b CODE_FOR_smaxv32qi3 +#define CODE_FOR_lasx_xvmaxi_h CODE_FOR_smaxv16hi3 +#define CODE_FOR_lasx_xvmaxi_w CODE_FOR_smaxv8si3 +#define CODE_FOR_lasx_xvmaxi_d CODE_FOR_smaxv4di3 +#define CODE_FOR_lasx_xvmax_bu CODE_FOR_umaxv32qi3 +#define CODE_FOR_lasx_xvmax_hu CODE_FOR_umaxv16hi3 +#define CODE_FOR_lasx_xvmax_wu CODE_FOR_umaxv8si3 +#define CODE_FOR_lasx_xvmax_du CODE_FOR_umaxv4di3 +#define CODE_FOR_lasx_xvmaxi_bu CODE_FOR_umaxv32qi3 +#define CODE_FOR_lasx_xvmaxi_hu CODE_FOR_umaxv16hi3 +#define CODE_FOR_lasx_xvmaxi_wu CODE_FOR_umaxv8si3 +#define CODE_FOR_lasx_xvmaxi_du CODE_FOR_umaxv4di3 +#define CODE_FOR_lasx_xvmin_b CODE_FOR_sminv32qi3 +#define CODE_FOR_lasx_xvmin_h CODE_FOR_sminv16hi3 +#define CODE_FOR_lasx_xvmin_w CODE_FOR_sminv8si3 +#define CODE_FOR_lasx_xvmin_d CODE_FOR_sminv4di3 +#define CODE_FOR_lasx_xvmini_b CODE_FOR_sminv32qi3 +#define CODE_FOR_lasx_xvmini_h CODE_FOR_sminv16hi3 +#define CODE_FOR_lasx_xvmini_w CODE_FOR_sminv8si3 +#define CODE_FOR_lasx_xvmini_d CODE_FOR_sminv4di3 +#define CODE_FOR_lasx_xvmin_bu CODE_FOR_uminv32qi3 +#define CODE_FOR_lasx_xvmin_hu CODE_FOR_uminv16hi3 +#define CODE_FOR_lasx_xvmin_wu CODE_FOR_uminv8si3 +#define CODE_FOR_lasx_xvmin_du CODE_FOR_uminv4di3 +#define CODE_FOR_lasx_xvmini_bu CODE_FOR_uminv32qi3 +#define CODE_FOR_lasx_xvmini_hu CODE_FOR_uminv16hi3 +#define CODE_FOR_lasx_xvmini_wu CODE_FOR_uminv8si3 +#define CODE_FOR_lasx_xvmini_du CODE_FOR_uminv4di3 +#define CODE_FOR_lasx_xvmod_b CODE_FOR_modv32qi3 +#define CODE_FOR_lasx_xvmod_h CODE_FOR_modv16hi3 +#define CODE_FOR_lasx_xvmod_w CODE_FOR_modv8si3 +#define CODE_FOR_lasx_xvmod_d CODE_FOR_modv4di3 +#define CODE_FOR_lasx_xvmod_bu CODE_FOR_umodv32qi3 +#define CODE_FOR_lasx_xvmod_hu CODE_FOR_umodv16hi3 +#define CODE_FOR_lasx_xvmod_wu CODE_FOR_umodv8si3 +#define CODE_FOR_lasx_xvmod_du CODE_FOR_umodv4di3 +#define CODE_FOR_lasx_xvmul_b CODE_FOR_mulv32qi3 +#define CODE_FOR_lasx_xvmul_h CODE_FOR_mulv16hi3 +#define CODE_FOR_lasx_xvmul_w CODE_FOR_mulv8si3 +#define CODE_FOR_lasx_xvmul_d CODE_FOR_mulv4di3 +#define CODE_FOR_lasx_xvclz_b CODE_FOR_clzv32qi2 +#define CODE_FOR_lasx_xvclz_h CODE_FOR_clzv16hi2 +#define CODE_FOR_lasx_xvclz_w CODE_FOR_clzv8si2 +#define CODE_FOR_lasx_xvclz_d CODE_FOR_clzv4di2 +#define CODE_FOR_lasx_xvnor_v CODE_FOR_lasx_xvnor_b +#define CODE_FOR_lasx_xvor_v CODE_FOR_iorv32qi3 +#define CODE_FOR_lasx_xvori_b CODE_FOR_iorv32qi3 +#define CODE_FOR_lasx_xvnori_b CODE_FOR_lasx_xvnor_b +#define CODE_FOR_lasx_xvpcnt_b CODE_FOR_popcountv32qi2 +#define 
CODE_FOR_lasx_xvpcnt_h CODE_FOR_popcountv16hi2 +#define CODE_FOR_lasx_xvpcnt_w CODE_FOR_popcountv8si2 +#define CODE_FOR_lasx_xvpcnt_d CODE_FOR_popcountv4di2 +#define CODE_FOR_lasx_xvxor_v CODE_FOR_xorv32qi3 +#define CODE_FOR_lasx_xvxori_b CODE_FOR_xorv32qi3 +#define CODE_FOR_lasx_xvsll_b CODE_FOR_vashlv32qi3 +#define CODE_FOR_lasx_xvsll_h CODE_FOR_vashlv16hi3 +#define CODE_FOR_lasx_xvsll_w CODE_FOR_vashlv8si3 +#define CODE_FOR_lasx_xvsll_d CODE_FOR_vashlv4di3 +#define CODE_FOR_lasx_xvslli_b CODE_FOR_vashlv32qi3 +#define CODE_FOR_lasx_xvslli_h CODE_FOR_vashlv16hi3 +#define CODE_FOR_lasx_xvslli_w CODE_FOR_vashlv8si3 +#define CODE_FOR_lasx_xvslli_d CODE_FOR_vashlv4di3 +#define CODE_FOR_lasx_xvsra_b CODE_FOR_vashrv32qi3 +#define CODE_FOR_lasx_xvsra_h CODE_FOR_vashrv16hi3 +#define CODE_FOR_lasx_xvsra_w CODE_FOR_vashrv8si3 +#define CODE_FOR_lasx_xvsra_d CODE_FOR_vashrv4di3 +#define CODE_FOR_lasx_xvsrai_b CODE_FOR_vashrv32qi3 +#define CODE_FOR_lasx_xvsrai_h CODE_FOR_vashrv16hi3 +#define CODE_FOR_lasx_xvsrai_w CODE_FOR_vashrv8si3 +#define CODE_FOR_lasx_xvsrai_d CODE_FOR_vashrv4di3 +#define CODE_FOR_lasx_xvsrl_b CODE_FOR_vlshrv32qi3 +#define CODE_FOR_lasx_xvsrl_h CODE_FOR_vlshrv16hi3 +#define CODE_FOR_lasx_xvsrl_w CODE_FOR_vlshrv8si3 +#define CODE_FOR_lasx_xvsrl_d CODE_FOR_vlshrv4di3 +#define CODE_FOR_lasx_xvsrli_b CODE_FOR_vlshrv32qi3 +#define CODE_FOR_lasx_xvsrli_h CODE_FOR_vlshrv16hi3 +#define CODE_FOR_lasx_xvsrli_w CODE_FOR_vlshrv8si3 +#define CODE_FOR_lasx_xvsrli_d CODE_FOR_vlshrv4di3 +#define CODE_FOR_lasx_xvsub_b CODE_FOR_subv32qi3 +#define CODE_FOR_lasx_xvsub_h CODE_FOR_subv16hi3 +#define CODE_FOR_lasx_xvsub_w CODE_FOR_subv8si3 +#define CODE_FOR_lasx_xvsub_d CODE_FOR_subv4di3 +#define CODE_FOR_lasx_xvsubi_bu CODE_FOR_subv32qi3 +#define CODE_FOR_lasx_xvsubi_hu CODE_FOR_subv16hi3 +#define CODE_FOR_lasx_xvsubi_wu CODE_FOR_subv8si3 +#define CODE_FOR_lasx_xvsubi_du CODE_FOR_subv4di3 +#define CODE_FOR_lasx_xvpackod_d CODE_FOR_lasx_xvilvh_d +#define CODE_FOR_lasx_xvpackev_d CODE_FOR_lasx_xvilvl_d +#define CODE_FOR_lasx_xvpickod_d CODE_FOR_lasx_xvilvh_d +#define CODE_FOR_lasx_xvpickev_d CODE_FOR_lasx_xvilvl_d +#define CODE_FOR_lasx_xvrepli_b CODE_FOR_lasx_xvrepliv32qi +#define CODE_FOR_lasx_xvrepli_h CODE_FOR_lasx_xvrepliv16hi +#define CODE_FOR_lasx_xvrepli_w CODE_FOR_lasx_xvrepliv8si +#define CODE_FOR_lasx_xvrepli_d CODE_FOR_lasx_xvrepliv4di + +#define CODE_FOR_lasx_xvandn_v CODE_FOR_xvandnv32qi3 +#define CODE_FOR_lasx_xvorn_v CODE_FOR_xvornv32qi3 +#define CODE_FOR_lasx_xvneg_b CODE_FOR_negv32qi2 +#define CODE_FOR_lasx_xvneg_h CODE_FOR_negv16hi2 +#define CODE_FOR_lasx_xvneg_w CODE_FOR_negv8si2 +#define CODE_FOR_lasx_xvneg_d CODE_FOR_negv4di2 +#define CODE_FOR_lasx_xvbsrl_v CODE_FOR_lasx_xvbsrl_b +#define CODE_FOR_lasx_xvbsll_v CODE_FOR_lasx_xvbsll_b +#define CODE_FOR_lasx_xvfmadd_s CODE_FOR_fmav8sf4 +#define CODE_FOR_lasx_xvfmadd_d CODE_FOR_fmav4df4 +#define CODE_FOR_lasx_xvfmsub_s CODE_FOR_fmsv8sf4 +#define CODE_FOR_lasx_xvfmsub_d CODE_FOR_fmsv4df4 +#define CODE_FOR_lasx_xvfnmadd_s CODE_FOR_xvfnmaddv8sf4_nmadd4 +#define CODE_FOR_lasx_xvfnmadd_d CODE_FOR_xvfnmaddv4df4_nmadd4 +#define CODE_FOR_lasx_xvfnmsub_s CODE_FOR_xvfnmsubv8sf4_nmsub4 +#define CODE_FOR_lasx_xvfnmsub_d CODE_FOR_xvfnmsubv4df4_nmsub4 + +#define CODE_FOR_lasx_xvpermi_q CODE_FOR_lasx_xvpermi_q_v32qi +#define CODE_FOR_lasx_xvpermi_d CODE_FOR_lasx_xvpermi_d_v4di +#define CODE_FOR_lasx_xbnz_v CODE_FOR_lasx_xbnz_v_b +#define CODE_FOR_lasx_xbz_v CODE_FOR_lasx_xbz_v_b + +#define CODE_FOR_lasx_xvssub_b CODE_FOR_lasx_xvssub_s_b +#define 
CODE_FOR_lasx_xvssub_h CODE_FOR_lasx_xvssub_s_h +#define CODE_FOR_lasx_xvssub_w CODE_FOR_lasx_xvssub_s_w +#define CODE_FOR_lasx_xvssub_d CODE_FOR_lasx_xvssub_s_d +#define CODE_FOR_lasx_xvssub_bu CODE_FOR_lasx_xvssub_u_bu +#define CODE_FOR_lasx_xvssub_hu CODE_FOR_lasx_xvssub_u_hu +#define CODE_FOR_lasx_xvssub_wu CODE_FOR_lasx_xvssub_u_wu +#define CODE_FOR_lasx_xvssub_du CODE_FOR_lasx_xvssub_u_du +#define CODE_FOR_lasx_xvabsd_b CODE_FOR_lasx_xvabsd_s_b +#define CODE_FOR_lasx_xvabsd_h CODE_FOR_lasx_xvabsd_s_h +#define CODE_FOR_lasx_xvabsd_w CODE_FOR_lasx_xvabsd_s_w +#define CODE_FOR_lasx_xvabsd_d CODE_FOR_lasx_xvabsd_s_d +#define CODE_FOR_lasx_xvabsd_bu CODE_FOR_lasx_xvabsd_u_bu +#define CODE_FOR_lasx_xvabsd_hu CODE_FOR_lasx_xvabsd_u_hu +#define CODE_FOR_lasx_xvabsd_wu CODE_FOR_lasx_xvabsd_u_wu +#define CODE_FOR_lasx_xvabsd_du CODE_FOR_lasx_xvabsd_u_du +#define CODE_FOR_lasx_xvavg_b CODE_FOR_lasx_xvavg_s_b +#define CODE_FOR_lasx_xvavg_h CODE_FOR_lasx_xvavg_s_h +#define CODE_FOR_lasx_xvavg_w CODE_FOR_lasx_xvavg_s_w +#define CODE_FOR_lasx_xvavg_d CODE_FOR_lasx_xvavg_s_d +#define CODE_FOR_lasx_xvavg_bu CODE_FOR_lasx_xvavg_u_bu +#define CODE_FOR_lasx_xvavg_hu CODE_FOR_lasx_xvavg_u_hu +#define CODE_FOR_lasx_xvavg_wu CODE_FOR_lasx_xvavg_u_wu +#define CODE_FOR_lasx_xvavg_du CODE_FOR_lasx_xvavg_u_du +#define CODE_FOR_lasx_xvavgr_b CODE_FOR_lasx_xvavgr_s_b +#define CODE_FOR_lasx_xvavgr_h CODE_FOR_lasx_xvavgr_s_h +#define CODE_FOR_lasx_xvavgr_w CODE_FOR_lasx_xvavgr_s_w +#define CODE_FOR_lasx_xvavgr_d CODE_FOR_lasx_xvavgr_s_d +#define CODE_FOR_lasx_xvavgr_bu CODE_FOR_lasx_xvavgr_u_bu +#define CODE_FOR_lasx_xvavgr_hu CODE_FOR_lasx_xvavgr_u_hu +#define CODE_FOR_lasx_xvavgr_wu CODE_FOR_lasx_xvavgr_u_wu +#define CODE_FOR_lasx_xvavgr_du CODE_FOR_lasx_xvavgr_u_du +#define CODE_FOR_lasx_xvmuh_b CODE_FOR_lasx_xvmuh_s_b +#define CODE_FOR_lasx_xvmuh_h CODE_FOR_lasx_xvmuh_s_h +#define CODE_FOR_lasx_xvmuh_w CODE_FOR_lasx_xvmuh_s_w +#define CODE_FOR_lasx_xvmuh_d CODE_FOR_lasx_xvmuh_s_d +#define CODE_FOR_lasx_xvmuh_bu CODE_FOR_lasx_xvmuh_u_bu +#define CODE_FOR_lasx_xvmuh_hu CODE_FOR_lasx_xvmuh_u_hu +#define CODE_FOR_lasx_xvmuh_wu CODE_FOR_lasx_xvmuh_u_wu +#define CODE_FOR_lasx_xvmuh_du CODE_FOR_lasx_xvmuh_u_du +#define CODE_FOR_lasx_xvssran_b_h CODE_FOR_lasx_xvssran_s_b_h +#define CODE_FOR_lasx_xvssran_h_w CODE_FOR_lasx_xvssran_s_h_w +#define CODE_FOR_lasx_xvssran_w_d CODE_FOR_lasx_xvssran_s_w_d +#define CODE_FOR_lasx_xvssran_bu_h CODE_FOR_lasx_xvssran_u_bu_h +#define CODE_FOR_lasx_xvssran_hu_w CODE_FOR_lasx_xvssran_u_hu_w +#define CODE_FOR_lasx_xvssran_wu_d CODE_FOR_lasx_xvssran_u_wu_d +#define CODE_FOR_lasx_xvssrarn_b_h CODE_FOR_lasx_xvssrarn_s_b_h +#define CODE_FOR_lasx_xvssrarn_h_w CODE_FOR_lasx_xvssrarn_s_h_w +#define CODE_FOR_lasx_xvssrarn_w_d CODE_FOR_lasx_xvssrarn_s_w_d +#define CODE_FOR_lasx_xvssrarn_bu_h CODE_FOR_lasx_xvssrarn_u_bu_h +#define CODE_FOR_lasx_xvssrarn_hu_w CODE_FOR_lasx_xvssrarn_u_hu_w +#define CODE_FOR_lasx_xvssrarn_wu_d CODE_FOR_lasx_xvssrarn_u_wu_d +#define CODE_FOR_lasx_xvssrln_bu_h CODE_FOR_lasx_xvssrln_u_bu_h +#define CODE_FOR_lasx_xvssrln_hu_w CODE_FOR_lasx_xvssrln_u_hu_w +#define CODE_FOR_lasx_xvssrln_wu_d CODE_FOR_lasx_xvssrln_u_wu_d +#define CODE_FOR_lasx_xvssrlrn_bu_h CODE_FOR_lasx_xvssrlrn_u_bu_h +#define CODE_FOR_lasx_xvssrlrn_hu_w CODE_FOR_lasx_xvssrlrn_u_hu_w +#define CODE_FOR_lasx_xvssrlrn_wu_d CODE_FOR_lasx_xvssrlrn_u_wu_d +#define CODE_FOR_lasx_xvftint_w_s CODE_FOR_lasx_xvftint_s_w_s +#define CODE_FOR_lasx_xvftint_l_d CODE_FOR_lasx_xvftint_s_l_d +#define 
CODE_FOR_lasx_xvftint_wu_s CODE_FOR_lasx_xvftint_u_wu_s +#define CODE_FOR_lasx_xvftint_lu_d CODE_FOR_lasx_xvftint_u_lu_d +#define CODE_FOR_lasx_xvsllwil_h_b CODE_FOR_lasx_xvsllwil_s_h_b +#define CODE_FOR_lasx_xvsllwil_w_h CODE_FOR_lasx_xvsllwil_s_w_h +#define CODE_FOR_lasx_xvsllwil_d_w CODE_FOR_lasx_xvsllwil_s_d_w +#define CODE_FOR_lasx_xvsllwil_hu_bu CODE_FOR_lasx_xvsllwil_u_hu_bu +#define CODE_FOR_lasx_xvsllwil_wu_hu CODE_FOR_lasx_xvsllwil_u_wu_hu +#define CODE_FOR_lasx_xvsllwil_du_wu CODE_FOR_lasx_xvsllwil_u_du_wu +#define CODE_FOR_lasx_xvsat_b CODE_FOR_lasx_xvsat_s_b +#define CODE_FOR_lasx_xvsat_h CODE_FOR_lasx_xvsat_s_h +#define CODE_FOR_lasx_xvsat_w CODE_FOR_lasx_xvsat_s_w +#define CODE_FOR_lasx_xvsat_d CODE_FOR_lasx_xvsat_s_d +#define CODE_FOR_lasx_xvsat_bu CODE_FOR_lasx_xvsat_u_bu +#define CODE_FOR_lasx_xvsat_hu CODE_FOR_lasx_xvsat_u_hu +#define CODE_FOR_lasx_xvsat_wu CODE_FOR_lasx_xvsat_u_wu +#define CODE_FOR_lasx_xvsat_du CODE_FOR_lasx_xvsat_u_du + static const struct loongarch_builtin_description loongarch_builtins[] = { #define LARCH_MOVFCSR2GR 0 DIRECT_BUILTIN (movfcsr2gr, LARCH_USI_FTYPE_UQI, hard_float), @@ -1209,7 +1511,761 @@ static const struct loongarch_builtin_description loongarch_builtins[] = { LSX_BUILTIN (vshuf_b, LARCH_V16QI_FTYPE_V16QI_V16QI_V16QI), LSX_BUILTIN (vldx, LARCH_V16QI_FTYPE_CVPOINTER_DI), LSX_NO_TARGET_BUILTIN (vstx, LARCH_VOID_FTYPE_V16QI_CVPOINTER_DI), - LSX_BUILTIN (vextl_qu_du, LARCH_UV2DI_FTYPE_UV2DI) + LSX_BUILTIN (vextl_qu_du, LARCH_UV2DI_FTYPE_UV2DI), + + /* Built-in functions for LASX */ + LASX_BUILTIN (xvsll_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvsll_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsll_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsll_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvslli_b, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvslli_h, LARCH_V16HI_FTYPE_V16HI_UQI), + LASX_BUILTIN (xvslli_w, LARCH_V8SI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvslli_d, LARCH_V4DI_FTYPE_V4DI_UQI), + LASX_BUILTIN (xvsra_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvsra_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsra_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsra_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvsrai_b, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvsrai_h, LARCH_V16HI_FTYPE_V16HI_UQI), + LASX_BUILTIN (xvsrai_w, LARCH_V8SI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvsrai_d, LARCH_V4DI_FTYPE_V4DI_UQI), + LASX_BUILTIN (xvsrar_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvsrar_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsrar_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsrar_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvsrari_b, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvsrari_h, LARCH_V16HI_FTYPE_V16HI_UQI), + LASX_BUILTIN (xvsrari_w, LARCH_V8SI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvsrari_d, LARCH_V4DI_FTYPE_V4DI_UQI), + LASX_BUILTIN (xvsrl_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvsrl_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsrl_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsrl_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvsrli_b, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvsrli_h, LARCH_V16HI_FTYPE_V16HI_UQI), + LASX_BUILTIN (xvsrli_w, LARCH_V8SI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvsrli_d, LARCH_V4DI_FTYPE_V4DI_UQI), + LASX_BUILTIN (xvsrlr_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvsrlr_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsrlr_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsrlr_d, 
LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvsrlri_b, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvsrlri_h, LARCH_V16HI_FTYPE_V16HI_UQI), + LASX_BUILTIN (xvsrlri_w, LARCH_V8SI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvsrlri_d, LARCH_V4DI_FTYPE_V4DI_UQI), + LASX_BUILTIN (xvbitclr_b, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvbitclr_h, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvbitclr_w, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvbitclr_d, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvbitclri_b, LARCH_UV32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvbitclri_h, LARCH_UV16HI_FTYPE_UV16HI_UQI), + LASX_BUILTIN (xvbitclri_w, LARCH_UV8SI_FTYPE_UV8SI_UQI), + LASX_BUILTIN (xvbitclri_d, LARCH_UV4DI_FTYPE_UV4DI_UQI), + LASX_BUILTIN (xvbitset_b, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvbitset_h, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvbitset_w, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvbitset_d, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvbitseti_b, LARCH_UV32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvbitseti_h, LARCH_UV16HI_FTYPE_UV16HI_UQI), + LASX_BUILTIN (xvbitseti_w, LARCH_UV8SI_FTYPE_UV8SI_UQI), + LASX_BUILTIN (xvbitseti_d, LARCH_UV4DI_FTYPE_UV4DI_UQI), + LASX_BUILTIN (xvbitrev_b, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvbitrev_h, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvbitrev_w, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvbitrev_d, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvbitrevi_b, LARCH_UV32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvbitrevi_h, LARCH_UV16HI_FTYPE_UV16HI_UQI), + LASX_BUILTIN (xvbitrevi_w, LARCH_UV8SI_FTYPE_UV8SI_UQI), + LASX_BUILTIN (xvbitrevi_d, LARCH_UV4DI_FTYPE_UV4DI_UQI), + LASX_BUILTIN (xvadd_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvadd_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvadd_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvadd_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvaddi_bu, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvaddi_hu, LARCH_V16HI_FTYPE_V16HI_UQI), + LASX_BUILTIN (xvaddi_wu, LARCH_V8SI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvaddi_du, LARCH_V4DI_FTYPE_V4DI_UQI), + LASX_BUILTIN (xvsub_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvsub_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsub_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsub_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvsubi_bu, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvsubi_hu, LARCH_V16HI_FTYPE_V16HI_UQI), + LASX_BUILTIN (xvsubi_wu, LARCH_V8SI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvsubi_du, LARCH_V4DI_FTYPE_V4DI_UQI), + LASX_BUILTIN (xvmax_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvmax_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvmax_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvmax_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvmaxi_b, LARCH_V32QI_FTYPE_V32QI_QI), + LASX_BUILTIN (xvmaxi_h, LARCH_V16HI_FTYPE_V16HI_QI), + LASX_BUILTIN (xvmaxi_w, LARCH_V8SI_FTYPE_V8SI_QI), + LASX_BUILTIN (xvmaxi_d, LARCH_V4DI_FTYPE_V4DI_QI), + LASX_BUILTIN (xvmax_bu, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvmax_hu, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvmax_wu, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvmax_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvmaxi_bu, LARCH_UV32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvmaxi_hu, LARCH_UV16HI_FTYPE_UV16HI_UQI), + LASX_BUILTIN (xvmaxi_wu, LARCH_UV8SI_FTYPE_UV8SI_UQI), + LASX_BUILTIN (xvmaxi_du, LARCH_UV4DI_FTYPE_UV4DI_UQI), + LASX_BUILTIN (xvmin_b, 
LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvmin_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvmin_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvmin_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvmini_b, LARCH_V32QI_FTYPE_V32QI_QI), + LASX_BUILTIN (xvmini_h, LARCH_V16HI_FTYPE_V16HI_QI), + LASX_BUILTIN (xvmini_w, LARCH_V8SI_FTYPE_V8SI_QI), + LASX_BUILTIN (xvmini_d, LARCH_V4DI_FTYPE_V4DI_QI), + LASX_BUILTIN (xvmin_bu, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvmin_hu, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvmin_wu, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvmin_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvmini_bu, LARCH_UV32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvmini_hu, LARCH_UV16HI_FTYPE_UV16HI_UQI), + LASX_BUILTIN (xvmini_wu, LARCH_UV8SI_FTYPE_UV8SI_UQI), + LASX_BUILTIN (xvmini_du, LARCH_UV4DI_FTYPE_UV4DI_UQI), + LASX_BUILTIN (xvseq_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvseq_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvseq_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvseq_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvseqi_b, LARCH_V32QI_FTYPE_V32QI_QI), + LASX_BUILTIN (xvseqi_h, LARCH_V16HI_FTYPE_V16HI_QI), + LASX_BUILTIN (xvseqi_w, LARCH_V8SI_FTYPE_V8SI_QI), + LASX_BUILTIN (xvseqi_d, LARCH_V4DI_FTYPE_V4DI_QI), + LASX_BUILTIN (xvslt_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvslt_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvslt_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvslt_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvslti_b, LARCH_V32QI_FTYPE_V32QI_QI), + LASX_BUILTIN (xvslti_h, LARCH_V16HI_FTYPE_V16HI_QI), + LASX_BUILTIN (xvslti_w, LARCH_V8SI_FTYPE_V8SI_QI), + LASX_BUILTIN (xvslti_d, LARCH_V4DI_FTYPE_V4DI_QI), + LASX_BUILTIN (xvslt_bu, LARCH_V32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvslt_hu, LARCH_V16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvslt_wu, LARCH_V8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvslt_du, LARCH_V4DI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvslti_bu, LARCH_V32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvslti_hu, LARCH_V16HI_FTYPE_UV16HI_UQI), + LASX_BUILTIN (xvslti_wu, LARCH_V8SI_FTYPE_UV8SI_UQI), + LASX_BUILTIN (xvslti_du, LARCH_V4DI_FTYPE_UV4DI_UQI), + LASX_BUILTIN (xvsle_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvsle_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsle_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsle_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvslei_b, LARCH_V32QI_FTYPE_V32QI_QI), + LASX_BUILTIN (xvslei_h, LARCH_V16HI_FTYPE_V16HI_QI), + LASX_BUILTIN (xvslei_w, LARCH_V8SI_FTYPE_V8SI_QI), + LASX_BUILTIN (xvslei_d, LARCH_V4DI_FTYPE_V4DI_QI), + LASX_BUILTIN (xvsle_bu, LARCH_V32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvsle_hu, LARCH_V16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvsle_wu, LARCH_V8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvsle_du, LARCH_V4DI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvslei_bu, LARCH_V32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvslei_hu, LARCH_V16HI_FTYPE_UV16HI_UQI), + LASX_BUILTIN (xvslei_wu, LARCH_V8SI_FTYPE_UV8SI_UQI), + LASX_BUILTIN (xvslei_du, LARCH_V4DI_FTYPE_UV4DI_UQI), + + LASX_BUILTIN (xvsat_b, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvsat_h, LARCH_V16HI_FTYPE_V16HI_UQI), + LASX_BUILTIN (xvsat_w, LARCH_V8SI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvsat_d, LARCH_V4DI_FTYPE_V4DI_UQI), + LASX_BUILTIN (xvsat_bu, LARCH_UV32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvsat_hu, LARCH_UV16HI_FTYPE_UV16HI_UQI), + LASX_BUILTIN (xvsat_wu, LARCH_UV8SI_FTYPE_UV8SI_UQI), + LASX_BUILTIN (xvsat_du, 
LARCH_UV4DI_FTYPE_UV4DI_UQI), + + LASX_BUILTIN (xvadda_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvadda_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvadda_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvadda_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvsadd_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvsadd_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsadd_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsadd_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvsadd_bu, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvsadd_hu, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvsadd_wu, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvsadd_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + + LASX_BUILTIN (xvavg_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvavg_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvavg_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvavg_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvavg_bu, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvavg_hu, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvavg_wu, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvavg_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + + LASX_BUILTIN (xvavgr_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvavgr_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvavgr_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvavgr_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvavgr_bu, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvavgr_hu, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvavgr_wu, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvavgr_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + + LASX_BUILTIN (xvssub_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvssub_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvssub_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvssub_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvssub_bu, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvssub_hu, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvssub_wu, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvssub_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvabsd_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvabsd_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvabsd_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvabsd_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvabsd_bu, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvabsd_hu, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvabsd_wu, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvabsd_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + + LASX_BUILTIN (xvmul_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvmul_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvmul_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvmul_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvmadd_b, LARCH_V32QI_FTYPE_V32QI_V32QI_V32QI), + LASX_BUILTIN (xvmadd_h, LARCH_V16HI_FTYPE_V16HI_V16HI_V16HI), + LASX_BUILTIN (xvmadd_w, LARCH_V8SI_FTYPE_V8SI_V8SI_V8SI), + LASX_BUILTIN (xvmadd_d, LARCH_V4DI_FTYPE_V4DI_V4DI_V4DI), + LASX_BUILTIN (xvmsub_b, LARCH_V32QI_FTYPE_V32QI_V32QI_V32QI), + LASX_BUILTIN (xvmsub_h, LARCH_V16HI_FTYPE_V16HI_V16HI_V16HI), + LASX_BUILTIN (xvmsub_w, LARCH_V8SI_FTYPE_V8SI_V8SI_V8SI), + LASX_BUILTIN (xvmsub_d, LARCH_V4DI_FTYPE_V4DI_V4DI_V4DI), + LASX_BUILTIN (xvdiv_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvdiv_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvdiv_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvdiv_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN 
(xvdiv_bu, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvdiv_hu, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvdiv_wu, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvdiv_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvhaddw_h_b, LARCH_V16HI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvhaddw_w_h, LARCH_V8SI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvhaddw_d_w, LARCH_V4DI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvhaddw_hu_bu, LARCH_UV16HI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvhaddw_wu_hu, LARCH_UV8SI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvhaddw_du_wu, LARCH_UV4DI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvhsubw_h_b, LARCH_V16HI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvhsubw_w_h, LARCH_V8SI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvhsubw_d_w, LARCH_V4DI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvhsubw_hu_bu, LARCH_V16HI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvhsubw_wu_hu, LARCH_V8SI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvhsubw_du_wu, LARCH_V4DI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvmod_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvmod_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvmod_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvmod_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvmod_bu, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvmod_hu, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvmod_wu, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvmod_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + + LASX_BUILTIN (xvrepl128vei_b, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvrepl128vei_h, LARCH_V16HI_FTYPE_V16HI_UQI), + LASX_BUILTIN (xvrepl128vei_w, LARCH_V8SI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvrepl128vei_d, LARCH_V4DI_FTYPE_V4DI_UQI), + LASX_BUILTIN (xvpickev_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvpickev_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvpickev_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvpickev_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvpickod_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvpickod_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvpickod_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvpickod_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvilvh_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvilvh_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvilvh_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvilvh_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvilvl_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvilvl_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvilvl_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvilvl_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvpackev_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvpackev_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvpackev_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvpackev_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvpackod_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvpackod_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvpackod_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvpackod_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvshuf_b, LARCH_V32QI_FTYPE_V32QI_V32QI_V32QI), + LASX_BUILTIN (xvshuf_h, LARCH_V16HI_FTYPE_V16HI_V16HI_V16HI), + LASX_BUILTIN (xvshuf_w, LARCH_V8SI_FTYPE_V8SI_V8SI_V8SI), + LASX_BUILTIN (xvshuf_d, LARCH_V4DI_FTYPE_V4DI_V4DI_V4DI), + LASX_BUILTIN (xvand_v, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvandi_b, LARCH_UV32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvor_v, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvori_b, LARCH_UV32QI_FTYPE_UV32QI_UQI), + 
LASX_BUILTIN (xvnor_v, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvnori_b, LARCH_UV32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvxor_v, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvxori_b, LARCH_UV32QI_FTYPE_UV32QI_UQI), + LASX_BUILTIN (xvbitsel_v, LARCH_UV32QI_FTYPE_UV32QI_UV32QI_UV32QI), + LASX_BUILTIN (xvbitseli_b, LARCH_UV32QI_FTYPE_UV32QI_UV32QI_USI), + + LASX_BUILTIN (xvshuf4i_b, LARCH_V32QI_FTYPE_V32QI_USI), + LASX_BUILTIN (xvshuf4i_h, LARCH_V16HI_FTYPE_V16HI_USI), + LASX_BUILTIN (xvshuf4i_w, LARCH_V8SI_FTYPE_V8SI_USI), + + LASX_BUILTIN (xvreplgr2vr_b, LARCH_V32QI_FTYPE_SI), + LASX_BUILTIN (xvreplgr2vr_h, LARCH_V16HI_FTYPE_SI), + LASX_BUILTIN (xvreplgr2vr_w, LARCH_V8SI_FTYPE_SI), + LASX_BUILTIN (xvreplgr2vr_d, LARCH_V4DI_FTYPE_DI), + LASX_BUILTIN (xvpcnt_b, LARCH_V32QI_FTYPE_V32QI), + LASX_BUILTIN (xvpcnt_h, LARCH_V16HI_FTYPE_V16HI), + LASX_BUILTIN (xvpcnt_w, LARCH_V8SI_FTYPE_V8SI), + LASX_BUILTIN (xvpcnt_d, LARCH_V4DI_FTYPE_V4DI), + LASX_BUILTIN (xvclo_b, LARCH_V32QI_FTYPE_V32QI), + LASX_BUILTIN (xvclo_h, LARCH_V16HI_FTYPE_V16HI), + LASX_BUILTIN (xvclo_w, LARCH_V8SI_FTYPE_V8SI), + LASX_BUILTIN (xvclo_d, LARCH_V4DI_FTYPE_V4DI), + LASX_BUILTIN (xvclz_b, LARCH_V32QI_FTYPE_V32QI), + LASX_BUILTIN (xvclz_h, LARCH_V16HI_FTYPE_V16HI), + LASX_BUILTIN (xvclz_w, LARCH_V8SI_FTYPE_V8SI), + LASX_BUILTIN (xvclz_d, LARCH_V4DI_FTYPE_V4DI), + + LASX_BUILTIN (xvrepli_b, LARCH_V32QI_FTYPE_HI), + LASX_BUILTIN (xvrepli_h, LARCH_V16HI_FTYPE_HI), + LASX_BUILTIN (xvrepli_w, LARCH_V8SI_FTYPE_HI), + LASX_BUILTIN (xvrepli_d, LARCH_V4DI_FTYPE_HI), + LASX_BUILTIN (xvfcmp_caf_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_caf_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_cor_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_cor_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_cun_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_cun_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_cune_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_cune_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_cueq_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_cueq_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_ceq_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_ceq_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_cne_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_cne_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_clt_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_clt_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_cult_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_cult_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_cle_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_cle_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_cule_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_cule_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_saf_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_saf_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_sor_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_sor_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_sun_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_sun_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_sune_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_sune_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_sueq_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_sueq_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_seq_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_seq_d, 
LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_sne_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_sne_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_slt_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_slt_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_sult_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_sult_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_sle_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_sle_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcmp_sule_s, LARCH_V8SI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcmp_sule_d, LARCH_V4DI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfadd_s, LARCH_V8SF_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfadd_d, LARCH_V4DF_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfsub_s, LARCH_V8SF_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfsub_d, LARCH_V4DF_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfmul_s, LARCH_V8SF_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfmul_d, LARCH_V4DF_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfdiv_s, LARCH_V8SF_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfdiv_d, LARCH_V4DF_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfcvt_h_s, LARCH_V16HI_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfcvt_s_d, LARCH_V8SF_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfmin_s, LARCH_V8SF_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfmin_d, LARCH_V4DF_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfmina_s, LARCH_V8SF_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfmina_d, LARCH_V4DF_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfmax_s, LARCH_V8SF_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfmax_d, LARCH_V4DF_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfmaxa_s, LARCH_V8SF_FTYPE_V8SF_V8SF), + LASX_BUILTIN (xvfmaxa_d, LARCH_V4DF_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvfclass_s, LARCH_V8SI_FTYPE_V8SF), + LASX_BUILTIN (xvfclass_d, LARCH_V4DI_FTYPE_V4DF), + LASX_BUILTIN (xvfsqrt_s, LARCH_V8SF_FTYPE_V8SF), + LASX_BUILTIN (xvfsqrt_d, LARCH_V4DF_FTYPE_V4DF), + LASX_BUILTIN (xvfrecip_s, LARCH_V8SF_FTYPE_V8SF), + LASX_BUILTIN (xvfrecip_d, LARCH_V4DF_FTYPE_V4DF), + LASX_BUILTIN (xvfrint_s, LARCH_V8SF_FTYPE_V8SF), + LASX_BUILTIN (xvfrint_d, LARCH_V4DF_FTYPE_V4DF), + LASX_BUILTIN (xvfrsqrt_s, LARCH_V8SF_FTYPE_V8SF), + LASX_BUILTIN (xvfrsqrt_d, LARCH_V4DF_FTYPE_V4DF), + LASX_BUILTIN (xvflogb_s, LARCH_V8SF_FTYPE_V8SF), + LASX_BUILTIN (xvflogb_d, LARCH_V4DF_FTYPE_V4DF), + LASX_BUILTIN (xvfcvth_s_h, LARCH_V8SF_FTYPE_V16HI), + LASX_BUILTIN (xvfcvth_d_s, LARCH_V4DF_FTYPE_V8SF), + LASX_BUILTIN (xvfcvtl_s_h, LARCH_V8SF_FTYPE_V16HI), + LASX_BUILTIN (xvfcvtl_d_s, LARCH_V4DF_FTYPE_V8SF), + LASX_BUILTIN (xvftint_w_s, LARCH_V8SI_FTYPE_V8SF), + LASX_BUILTIN (xvftint_l_d, LARCH_V4DI_FTYPE_V4DF), + LASX_BUILTIN (xvftint_wu_s, LARCH_UV8SI_FTYPE_V8SF), + LASX_BUILTIN (xvftint_lu_d, LARCH_UV4DI_FTYPE_V4DF), + LASX_BUILTIN (xvftintrz_w_s, LARCH_V8SI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrz_l_d, LARCH_V4DI_FTYPE_V4DF), + LASX_BUILTIN (xvftintrz_wu_s, LARCH_UV8SI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrz_lu_d, LARCH_UV4DI_FTYPE_V4DF), + LASX_BUILTIN (xvffint_s_w, LARCH_V8SF_FTYPE_V8SI), + LASX_BUILTIN (xvffint_d_l, LARCH_V4DF_FTYPE_V4DI), + LASX_BUILTIN (xvffint_s_wu, LARCH_V8SF_FTYPE_UV8SI), + LASX_BUILTIN (xvffint_d_lu, LARCH_V4DF_FTYPE_UV4DI), + + LASX_BUILTIN (xvreplve_b, LARCH_V32QI_FTYPE_V32QI_SI), + LASX_BUILTIN (xvreplve_h, LARCH_V16HI_FTYPE_V16HI_SI), + LASX_BUILTIN (xvreplve_w, LARCH_V8SI_FTYPE_V8SI_SI), + LASX_BUILTIN (xvreplve_d, LARCH_V4DI_FTYPE_V4DI_SI), + LASX_BUILTIN (xvpermi_w, LARCH_V8SI_FTYPE_V8SI_V8SI_USI), + + LASX_BUILTIN (xvandn_v, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvneg_b, LARCH_V32QI_FTYPE_V32QI), + LASX_BUILTIN (xvneg_h, 
LARCH_V16HI_FTYPE_V16HI), + LASX_BUILTIN (xvneg_w, LARCH_V8SI_FTYPE_V8SI), + LASX_BUILTIN (xvneg_d, LARCH_V4DI_FTYPE_V4DI), + LASX_BUILTIN (xvmuh_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvmuh_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvmuh_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvmuh_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvmuh_bu, LARCH_UV32QI_FTYPE_UV32QI_UV32QI), + LASX_BUILTIN (xvmuh_hu, LARCH_UV16HI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvmuh_wu, LARCH_UV8SI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvmuh_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvsllwil_h_b, LARCH_V16HI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvsllwil_w_h, LARCH_V8SI_FTYPE_V16HI_UQI), + LASX_BUILTIN (xvsllwil_d_w, LARCH_V4DI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvsllwil_hu_bu, LARCH_UV16HI_FTYPE_UV32QI_UQI), /* FIXME: U? */ + LASX_BUILTIN (xvsllwil_wu_hu, LARCH_UV8SI_FTYPE_UV16HI_UQI), + LASX_BUILTIN (xvsllwil_du_wu, LARCH_UV4DI_FTYPE_UV8SI_UQI), + LASX_BUILTIN (xvsran_b_h, LARCH_V32QI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsran_h_w, LARCH_V16HI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsran_w_d, LARCH_V8SI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvssran_b_h, LARCH_V32QI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvssran_h_w, LARCH_V16HI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvssran_w_d, LARCH_V8SI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvssran_bu_h, LARCH_UV32QI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvssran_hu_w, LARCH_UV16HI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvssran_wu_d, LARCH_UV8SI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvsrarn_b_h, LARCH_V32QI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsrarn_h_w, LARCH_V16HI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsrarn_w_d, LARCH_V8SI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvssrarn_b_h, LARCH_V32QI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvssrarn_h_w, LARCH_V16HI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvssrarn_w_d, LARCH_V8SI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvssrarn_bu_h, LARCH_UV32QI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvssrarn_hu_w, LARCH_UV16HI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvssrarn_wu_d, LARCH_UV8SI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvsrln_b_h, LARCH_V32QI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsrln_h_w, LARCH_V16HI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsrln_w_d, LARCH_V8SI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvssrln_bu_h, LARCH_UV32QI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvssrln_hu_w, LARCH_UV16HI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvssrln_wu_d, LARCH_UV8SI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvsrlrn_b_h, LARCH_V32QI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsrlrn_h_w, LARCH_V16HI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsrlrn_w_d, LARCH_V8SI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvssrlrn_bu_h, LARCH_UV32QI_FTYPE_UV16HI_UV16HI), + LASX_BUILTIN (xvssrlrn_hu_w, LARCH_UV16HI_FTYPE_UV8SI_UV8SI), + LASX_BUILTIN (xvssrlrn_wu_d, LARCH_UV8SI_FTYPE_UV4DI_UV4DI), + LASX_BUILTIN (xvfrstpi_b, LARCH_V32QI_FTYPE_V32QI_V32QI_UQI), + LASX_BUILTIN (xvfrstpi_h, LARCH_V16HI_FTYPE_V16HI_V16HI_UQI), + LASX_BUILTIN (xvfrstp_b, LARCH_V32QI_FTYPE_V32QI_V32QI_V32QI), + LASX_BUILTIN (xvfrstp_h, LARCH_V16HI_FTYPE_V16HI_V16HI_V16HI), + LASX_BUILTIN (xvshuf4i_d, LARCH_V4DI_FTYPE_V4DI_V4DI_USI), + LASX_BUILTIN (xvbsrl_v, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvbsll_v, LARCH_V32QI_FTYPE_V32QI_UQI), + LASX_BUILTIN (xvextrins_b, LARCH_V32QI_FTYPE_V32QI_V32QI_USI), + LASX_BUILTIN (xvextrins_h, LARCH_V16HI_FTYPE_V16HI_V16HI_USI), + LASX_BUILTIN (xvextrins_w, LARCH_V8SI_FTYPE_V8SI_V8SI_USI), + LASX_BUILTIN (xvextrins_d, LARCH_V4DI_FTYPE_V4DI_V4DI_USI), + LASX_BUILTIN (xvmskltz_b, LARCH_V32QI_FTYPE_V32QI), + LASX_BUILTIN 
(xvmskltz_h, LARCH_V16HI_FTYPE_V16HI), + LASX_BUILTIN (xvmskltz_w, LARCH_V8SI_FTYPE_V8SI), + LASX_BUILTIN (xvmskltz_d, LARCH_V4DI_FTYPE_V4DI), + LASX_BUILTIN (xvsigncov_b, LARCH_V32QI_FTYPE_V32QI_V32QI), + LASX_BUILTIN (xvsigncov_h, LARCH_V16HI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvsigncov_w, LARCH_V8SI_FTYPE_V8SI_V8SI), + LASX_BUILTIN (xvsigncov_d, LARCH_V4DI_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvfmadd_s, LARCH_V8SF_FTYPE_V8SF_V8SF_V8SF), + LASX_BUILTIN (xvfmadd_d, LARCH_V4DF_FTYPE_V4DF_V4DF_V4DF), + LASX_BUILTIN (xvfmsub_s, LARCH_V8SF_FTYPE_V8SF_V8SF_V8SF), + LASX_BUILTIN (xvfmsub_d, LARCH_V4DF_FTYPE_V4DF_V4DF_V4DF), + LASX_BUILTIN (xvfnmadd_s, LARCH_V8SF_FTYPE_V8SF_V8SF_V8SF), + LASX_BUILTIN (xvfnmadd_d, LARCH_V4DF_FTYPE_V4DF_V4DF_V4DF), + LASX_BUILTIN (xvfnmsub_s, LARCH_V8SF_FTYPE_V8SF_V8SF_V8SF), + LASX_BUILTIN (xvfnmsub_d, LARCH_V4DF_FTYPE_V4DF_V4DF_V4DF), + LASX_BUILTIN (xvftintrne_w_s, LARCH_V8SI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrne_l_d, LARCH_V4DI_FTYPE_V4DF), + LASX_BUILTIN (xvftintrp_w_s, LARCH_V8SI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrp_l_d, LARCH_V4DI_FTYPE_V4DF), + LASX_BUILTIN (xvftintrm_w_s, LARCH_V8SI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrm_l_d, LARCH_V4DI_FTYPE_V4DF), + LASX_BUILTIN (xvftint_w_d, LARCH_V8SI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvffint_s_l, LARCH_V8SF_FTYPE_V4DI_V4DI), + LASX_BUILTIN (xvftintrz_w_d, LARCH_V8SI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvftintrp_w_d, LARCH_V8SI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvftintrm_w_d, LARCH_V8SI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvftintrne_w_d, LARCH_V8SI_FTYPE_V4DF_V4DF), + LASX_BUILTIN (xvftinth_l_s, LARCH_V4DI_FTYPE_V8SF), + LASX_BUILTIN (xvftintl_l_s, LARCH_V4DI_FTYPE_V8SF), + LASX_BUILTIN (xvffinth_d_w, LARCH_V4DF_FTYPE_V8SI), + LASX_BUILTIN (xvffintl_d_w, LARCH_V4DF_FTYPE_V8SI), + LASX_BUILTIN (xvftintrzh_l_s, LARCH_V4DI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrzl_l_s, LARCH_V4DI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrph_l_s, LARCH_V4DI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrpl_l_s, LARCH_V4DI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrmh_l_s, LARCH_V4DI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrml_l_s, LARCH_V4DI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrneh_l_s, LARCH_V4DI_FTYPE_V8SF), + LASX_BUILTIN (xvftintrnel_l_s, LARCH_V4DI_FTYPE_V8SF), + LASX_BUILTIN (xvfrintrne_s, LARCH_V8SF_FTYPE_V8SF), + LASX_BUILTIN (xvfrintrne_d, LARCH_V4DF_FTYPE_V4DF), + LASX_BUILTIN (xvfrintrz_s, LARCH_V8SF_FTYPE_V8SF), + LASX_BUILTIN (xvfrintrz_d, LARCH_V4DF_FTYPE_V4DF), + LASX_BUILTIN (xvfrintrp_s, LARCH_V8SF_FTYPE_V8SF), + LASX_BUILTIN (xvfrintrp_d, LARCH_V4DF_FTYPE_V4DF), + LASX_BUILTIN (xvfrintrm_s, LARCH_V8SF_FTYPE_V8SF), + LASX_BUILTIN (xvfrintrm_d, LARCH_V4DF_FTYPE_V4DF), + LASX_BUILTIN (xvld, LARCH_V32QI_FTYPE_CVPOINTER_SI), + LASX_NO_TARGET_BUILTIN (xvst, LARCH_VOID_FTYPE_V32QI_CVPOINTER_SI), + LASX_NO_TARGET_BUILTIN (xvstelm_b, LARCH_VOID_FTYPE_V32QI_CVPOINTER_SI_UQI), + LASX_NO_TARGET_BUILTIN (xvstelm_h, LARCH_VOID_FTYPE_V16HI_CVPOINTER_SI_UQI), + LASX_NO_TARGET_BUILTIN (xvstelm_w, LARCH_VOID_FTYPE_V8SI_CVPOINTER_SI_UQI), + LASX_NO_TARGET_BUILTIN (xvstelm_d, LARCH_VOID_FTYPE_V4DI_CVPOINTER_SI_UQI), + LASX_BUILTIN (xvinsve0_w, LARCH_V8SI_FTYPE_V8SI_V8SI_UQI), + LASX_BUILTIN (xvinsve0_d, LARCH_V4DI_FTYPE_V4DI_V4DI_UQI), + LASX_BUILTIN (xvpickve_w, LARCH_V8SI_FTYPE_V8SI_UQI), + LASX_BUILTIN (xvpickve_d, LARCH_V4DI_FTYPE_V4DI_UQI), + LASX_BUILTIN (xvpickve_w_f, LARCH_V8SF_FTYPE_V8SF_UQI), + LASX_BUILTIN (xvpickve_d_f, LARCH_V4DF_FTYPE_V4DF_UQI), + LASX_BUILTIN (xvssrlrn_b_h, LARCH_V32QI_FTYPE_V16HI_V16HI), + LASX_BUILTIN (xvssrlrn_h_w, 
LARCH_V16HI_FTYPE_V8SI_V8SI),
+  LASX_BUILTIN (xvssrlrn_w_d, LARCH_V8SI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvssrln_b_h, LARCH_V32QI_FTYPE_V16HI_V16HI),
+  LASX_BUILTIN (xvssrln_h_w, LARCH_V16HI_FTYPE_V8SI_V8SI),
+  LASX_BUILTIN (xvssrln_w_d, LARCH_V8SI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvorn_v, LARCH_V32QI_FTYPE_V32QI_V32QI),
+  LASX_BUILTIN (xvldi, LARCH_V4DI_FTYPE_HI),
+  LASX_BUILTIN (xvldx, LARCH_V32QI_FTYPE_CVPOINTER_DI),
+  LASX_NO_TARGET_BUILTIN (xvstx, LARCH_VOID_FTYPE_V32QI_CVPOINTER_DI),
+  LASX_BUILTIN (xvextl_qu_du, LARCH_UV4DI_FTYPE_UV4DI),
+
+  /* LASX */
+  LASX_BUILTIN (xvinsgr2vr_w, LARCH_V8SI_FTYPE_V8SI_SI_UQI),
+  LASX_BUILTIN (xvinsgr2vr_d, LARCH_V4DI_FTYPE_V4DI_DI_UQI),
+
+  LASX_BUILTIN (xvreplve0_b, LARCH_V32QI_FTYPE_V32QI),
+  LASX_BUILTIN (xvreplve0_h, LARCH_V16HI_FTYPE_V16HI),
+  LASX_BUILTIN (xvreplve0_w, LARCH_V8SI_FTYPE_V8SI),
+  LASX_BUILTIN (xvreplve0_d, LARCH_V4DI_FTYPE_V4DI),
+  LASX_BUILTIN (xvreplve0_q, LARCH_V32QI_FTYPE_V32QI),
+  LASX_BUILTIN (vext2xv_h_b, LARCH_V16HI_FTYPE_V32QI),
+  LASX_BUILTIN (vext2xv_w_h, LARCH_V8SI_FTYPE_V16HI),
+  LASX_BUILTIN (vext2xv_d_w, LARCH_V4DI_FTYPE_V8SI),
+  LASX_BUILTIN (vext2xv_w_b, LARCH_V8SI_FTYPE_V32QI),
+  LASX_BUILTIN (vext2xv_d_h, LARCH_V4DI_FTYPE_V16HI),
+  LASX_BUILTIN (vext2xv_d_b, LARCH_V4DI_FTYPE_V32QI),
+  LASX_BUILTIN (vext2xv_hu_bu, LARCH_V16HI_FTYPE_V32QI),
+  LASX_BUILTIN (vext2xv_wu_hu, LARCH_V8SI_FTYPE_V16HI),
+  LASX_BUILTIN (vext2xv_du_wu, LARCH_V4DI_FTYPE_V8SI),
+  LASX_BUILTIN (vext2xv_wu_bu, LARCH_V8SI_FTYPE_V32QI),
+  LASX_BUILTIN (vext2xv_du_hu, LARCH_V4DI_FTYPE_V16HI),
+  LASX_BUILTIN (vext2xv_du_bu, LARCH_V4DI_FTYPE_V32QI),
+  LASX_BUILTIN (xvpermi_q, LARCH_V32QI_FTYPE_V32QI_V32QI_USI),
+  LASX_BUILTIN (xvpermi_d, LARCH_V4DI_FTYPE_V4DI_USI),
+  LASX_BUILTIN (xvperm_w, LARCH_V8SI_FTYPE_V8SI_V8SI),
+  LASX_BUILTIN_TEST_BRANCH (xbz_b, LARCH_SI_FTYPE_UV32QI),
+  LASX_BUILTIN_TEST_BRANCH (xbz_h, LARCH_SI_FTYPE_UV16HI),
+  LASX_BUILTIN_TEST_BRANCH (xbz_w, LARCH_SI_FTYPE_UV8SI),
+  LASX_BUILTIN_TEST_BRANCH (xbz_d, LARCH_SI_FTYPE_UV4DI),
+  LASX_BUILTIN_TEST_BRANCH (xbnz_b, LARCH_SI_FTYPE_UV32QI),
+  LASX_BUILTIN_TEST_BRANCH (xbnz_h, LARCH_SI_FTYPE_UV16HI),
+  LASX_BUILTIN_TEST_BRANCH (xbnz_w, LARCH_SI_FTYPE_UV8SI),
+  LASX_BUILTIN_TEST_BRANCH (xbnz_d, LARCH_SI_FTYPE_UV4DI),
+  LASX_BUILTIN_TEST_BRANCH (xbz_v, LARCH_SI_FTYPE_UV32QI),
+  LASX_BUILTIN_TEST_BRANCH (xbnz_v, LARCH_SI_FTYPE_UV32QI),
+  LASX_BUILTIN (xvldrepl_b, LARCH_V32QI_FTYPE_CVPOINTER_SI),
+  LASX_BUILTIN (xvldrepl_h, LARCH_V16HI_FTYPE_CVPOINTER_SI),
+  LASX_BUILTIN (xvldrepl_w, LARCH_V8SI_FTYPE_CVPOINTER_SI),
+  LASX_BUILTIN (xvldrepl_d, LARCH_V4DI_FTYPE_CVPOINTER_SI),
+  LASX_BUILTIN (xvpickve2gr_w, LARCH_SI_FTYPE_V8SI_UQI),
+  LASX_BUILTIN (xvpickve2gr_wu, LARCH_USI_FTYPE_V8SI_UQI),
+  LASX_BUILTIN (xvpickve2gr_d, LARCH_DI_FTYPE_V4DI_UQI),
+  LASX_BUILTIN (xvpickve2gr_du, LARCH_UDI_FTYPE_V4DI_UQI),
+
+  LASX_BUILTIN (xvaddwev_q_d, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvaddwev_d_w, LARCH_V4DI_FTYPE_V8SI_V8SI),
+  LASX_BUILTIN (xvaddwev_w_h, LARCH_V8SI_FTYPE_V16HI_V16HI),
+  LASX_BUILTIN (xvaddwev_h_b, LARCH_V16HI_FTYPE_V32QI_V32QI),
+  LASX_BUILTIN (xvaddwev_q_du, LARCH_V4DI_FTYPE_UV4DI_UV4DI),
+  LASX_BUILTIN (xvaddwev_d_wu, LARCH_V4DI_FTYPE_UV8SI_UV8SI),
+  LASX_BUILTIN (xvaddwev_w_hu, LARCH_V8SI_FTYPE_UV16HI_UV16HI),
+  LASX_BUILTIN (xvaddwev_h_bu, LARCH_V16HI_FTYPE_UV32QI_UV32QI),
+  LASX_BUILTIN (xvsubwev_q_d, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvsubwev_d_w, LARCH_V4DI_FTYPE_V8SI_V8SI),
+  LASX_BUILTIN (xvsubwev_w_h, LARCH_V8SI_FTYPE_V16HI_V16HI),
+  LASX_BUILTIN (xvsubwev_h_b, LARCH_V16HI_FTYPE_V32QI_V32QI),
+  LASX_BUILTIN (xvsubwev_q_du, LARCH_V4DI_FTYPE_UV4DI_UV4DI),
+  LASX_BUILTIN (xvsubwev_d_wu, LARCH_V4DI_FTYPE_UV8SI_UV8SI),
+  LASX_BUILTIN (xvsubwev_w_hu, LARCH_V8SI_FTYPE_UV16HI_UV16HI),
+  LASX_BUILTIN (xvsubwev_h_bu, LARCH_V16HI_FTYPE_UV32QI_UV32QI),
+  LASX_BUILTIN (xvmulwev_q_d, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvmulwev_d_w, LARCH_V4DI_FTYPE_V8SI_V8SI),
+  LASX_BUILTIN (xvmulwev_w_h, LARCH_V8SI_FTYPE_V16HI_V16HI),
+  LASX_BUILTIN (xvmulwev_h_b, LARCH_V16HI_FTYPE_V32QI_V32QI),
+  LASX_BUILTIN (xvmulwev_q_du, LARCH_V4DI_FTYPE_UV4DI_UV4DI),
+  LASX_BUILTIN (xvmulwev_d_wu, LARCH_V4DI_FTYPE_UV8SI_UV8SI),
+  LASX_BUILTIN (xvmulwev_w_hu, LARCH_V8SI_FTYPE_UV16HI_UV16HI),
+  LASX_BUILTIN (xvmulwev_h_bu, LARCH_V16HI_FTYPE_UV32QI_UV32QI),
+  LASX_BUILTIN (xvaddwod_q_d, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvaddwod_d_w, LARCH_V4DI_FTYPE_V8SI_V8SI),
+  LASX_BUILTIN (xvaddwod_w_h, LARCH_V8SI_FTYPE_V16HI_V16HI),
+  LASX_BUILTIN (xvaddwod_h_b, LARCH_V16HI_FTYPE_V32QI_V32QI),
+  LASX_BUILTIN (xvaddwod_q_du, LARCH_V4DI_FTYPE_UV4DI_UV4DI),
+  LASX_BUILTIN (xvaddwod_d_wu, LARCH_V4DI_FTYPE_UV8SI_UV8SI),
+  LASX_BUILTIN (xvaddwod_w_hu, LARCH_V8SI_FTYPE_UV16HI_UV16HI),
+  LASX_BUILTIN (xvaddwod_h_bu, LARCH_V16HI_FTYPE_UV32QI_UV32QI),
+  LASX_BUILTIN (xvsubwod_q_d, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvsubwod_d_w, LARCH_V4DI_FTYPE_V8SI_V8SI),
+  LASX_BUILTIN (xvsubwod_w_h, LARCH_V8SI_FTYPE_V16HI_V16HI),
+  LASX_BUILTIN (xvsubwod_h_b, LARCH_V16HI_FTYPE_V32QI_V32QI),
+  LASX_BUILTIN (xvsubwod_q_du, LARCH_V4DI_FTYPE_UV4DI_UV4DI),
+  LASX_BUILTIN (xvsubwod_d_wu, LARCH_V4DI_FTYPE_UV8SI_UV8SI),
+  LASX_BUILTIN (xvsubwod_w_hu, LARCH_V8SI_FTYPE_UV16HI_UV16HI),
+  LASX_BUILTIN (xvsubwod_h_bu, LARCH_V16HI_FTYPE_UV32QI_UV32QI),
+  LASX_BUILTIN (xvmulwod_q_d, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvmulwod_d_w, LARCH_V4DI_FTYPE_V8SI_V8SI),
+  LASX_BUILTIN (xvmulwod_w_h, LARCH_V8SI_FTYPE_V16HI_V16HI),
+  LASX_BUILTIN (xvmulwod_h_b, LARCH_V16HI_FTYPE_V32QI_V32QI),
+  LASX_BUILTIN (xvmulwod_q_du, LARCH_V4DI_FTYPE_UV4DI_UV4DI),
+  LASX_BUILTIN (xvmulwod_d_wu, LARCH_V4DI_FTYPE_UV8SI_UV8SI),
+  LASX_BUILTIN (xvmulwod_w_hu, LARCH_V8SI_FTYPE_UV16HI_UV16HI),
+  LASX_BUILTIN (xvmulwod_h_bu, LARCH_V16HI_FTYPE_UV32QI_UV32QI),
+  LASX_BUILTIN (xvaddwev_d_wu_w, LARCH_V4DI_FTYPE_UV8SI_V8SI),
+  LASX_BUILTIN (xvaddwev_w_hu_h, LARCH_V8SI_FTYPE_UV16HI_V16HI),
+  LASX_BUILTIN (xvaddwev_h_bu_b, LARCH_V16HI_FTYPE_UV32QI_V32QI),
+  LASX_BUILTIN (xvmulwev_d_wu_w, LARCH_V4DI_FTYPE_UV8SI_V8SI),
+  LASX_BUILTIN (xvmulwev_w_hu_h, LARCH_V8SI_FTYPE_UV16HI_V16HI),
+  LASX_BUILTIN (xvmulwev_h_bu_b, LARCH_V16HI_FTYPE_UV32QI_V32QI),
+  LASX_BUILTIN (xvaddwod_d_wu_w, LARCH_V4DI_FTYPE_UV8SI_V8SI),
+  LASX_BUILTIN (xvaddwod_w_hu_h, LARCH_V8SI_FTYPE_UV16HI_V16HI),
+  LASX_BUILTIN (xvaddwod_h_bu_b, LARCH_V16HI_FTYPE_UV32QI_V32QI),
+  LASX_BUILTIN (xvmulwod_d_wu_w, LARCH_V4DI_FTYPE_UV8SI_V8SI),
+  LASX_BUILTIN (xvmulwod_w_hu_h, LARCH_V8SI_FTYPE_UV16HI_V16HI),
+  LASX_BUILTIN (xvmulwod_h_bu_b, LARCH_V16HI_FTYPE_UV32QI_V32QI),
+  LASX_BUILTIN (xvhaddw_q_d, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvhaddw_qu_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI),
+  LASX_BUILTIN (xvhsubw_q_d, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvhsubw_qu_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI),
+  LASX_BUILTIN (xvmaddwev_q_d, LARCH_V4DI_FTYPE_V4DI_V4DI_V4DI),
+  LASX_BUILTIN (xvmaddwev_d_w, LARCH_V4DI_FTYPE_V4DI_V8SI_V8SI),
+  LASX_BUILTIN (xvmaddwev_w_h, LARCH_V8SI_FTYPE_V8SI_V16HI_V16HI),
+  LASX_BUILTIN (xvmaddwev_h_b, LARCH_V16HI_FTYPE_V16HI_V32QI_V32QI),
+  LASX_BUILTIN (xvmaddwev_q_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI_UV4DI),
+  LASX_BUILTIN (xvmaddwev_d_wu, LARCH_UV4DI_FTYPE_UV4DI_UV8SI_UV8SI),
+  LASX_BUILTIN (xvmaddwev_w_hu, LARCH_UV8SI_FTYPE_UV8SI_UV16HI_UV16HI),
+  LASX_BUILTIN (xvmaddwev_h_bu, LARCH_UV16HI_FTYPE_UV16HI_UV32QI_UV32QI),
+  LASX_BUILTIN (xvmaddwod_q_d, LARCH_V4DI_FTYPE_V4DI_V4DI_V4DI),
+  LASX_BUILTIN (xvmaddwod_d_w, LARCH_V4DI_FTYPE_V4DI_V8SI_V8SI),
+  LASX_BUILTIN (xvmaddwod_w_h, LARCH_V8SI_FTYPE_V8SI_V16HI_V16HI),
+  LASX_BUILTIN (xvmaddwod_h_b, LARCH_V16HI_FTYPE_V16HI_V32QI_V32QI),
+  LASX_BUILTIN (xvmaddwod_q_du, LARCH_UV4DI_FTYPE_UV4DI_UV4DI_UV4DI),
+  LASX_BUILTIN (xvmaddwod_d_wu, LARCH_UV4DI_FTYPE_UV4DI_UV8SI_UV8SI),
+  LASX_BUILTIN (xvmaddwod_w_hu, LARCH_UV8SI_FTYPE_UV8SI_UV16HI_UV16HI),
+  LASX_BUILTIN (xvmaddwod_h_bu, LARCH_UV16HI_FTYPE_UV16HI_UV32QI_UV32QI),
+  LASX_BUILTIN (xvmaddwev_q_du_d, LARCH_V4DI_FTYPE_V4DI_UV4DI_V4DI),
+  LASX_BUILTIN (xvmaddwev_d_wu_w, LARCH_V4DI_FTYPE_V4DI_UV8SI_V8SI),
+  LASX_BUILTIN (xvmaddwev_w_hu_h, LARCH_V8SI_FTYPE_V8SI_UV16HI_V16HI),
+  LASX_BUILTIN (xvmaddwev_h_bu_b, LARCH_V16HI_FTYPE_V16HI_UV32QI_V32QI),
+  LASX_BUILTIN (xvmaddwod_q_du_d, LARCH_V4DI_FTYPE_V4DI_UV4DI_V4DI),
+  LASX_BUILTIN (xvmaddwod_d_wu_w, LARCH_V4DI_FTYPE_V4DI_UV8SI_V8SI),
+  LASX_BUILTIN (xvmaddwod_w_hu_h, LARCH_V8SI_FTYPE_V8SI_UV16HI_V16HI),
+  LASX_BUILTIN (xvmaddwod_h_bu_b, LARCH_V16HI_FTYPE_V16HI_UV32QI_V32QI),
+  LASX_BUILTIN (xvrotr_b, LARCH_V32QI_FTYPE_V32QI_V32QI),
+  LASX_BUILTIN (xvrotr_h, LARCH_V16HI_FTYPE_V16HI_V16HI),
+  LASX_BUILTIN (xvrotr_w, LARCH_V8SI_FTYPE_V8SI_V8SI),
+  LASX_BUILTIN (xvrotr_d, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvadd_q, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvsub_q, LARCH_V4DI_FTYPE_V4DI_V4DI),
+  LASX_BUILTIN (xvaddwev_q_du_d, LARCH_V4DI_FTYPE_UV4DI_V4DI),
+  LASX_BUILTIN (xvaddwod_q_du_d, LARCH_V4DI_FTYPE_UV4DI_V4DI),
+  LASX_BUILTIN (xvmulwev_q_du_d, LARCH_V4DI_FTYPE_UV4DI_V4DI),
+  LASX_BUILTIN (xvmulwod_q_du_d, LARCH_V4DI_FTYPE_UV4DI_V4DI),
+  LASX_BUILTIN (xvmskgez_b, LARCH_V32QI_FTYPE_V32QI),
+  LASX_BUILTIN (xvmsknz_b, LARCH_V32QI_FTYPE_V32QI),
+  LASX_BUILTIN (xvexth_h_b, LARCH_V16HI_FTYPE_V32QI),
+  LASX_BUILTIN (xvexth_w_h, LARCH_V8SI_FTYPE_V16HI),
+  LASX_BUILTIN (xvexth_d_w, LARCH_V4DI_FTYPE_V8SI),
+  LASX_BUILTIN (xvexth_q_d, LARCH_V4DI_FTYPE_V4DI),
+  LASX_BUILTIN (xvexth_hu_bu, LARCH_UV16HI_FTYPE_UV32QI),
+  LASX_BUILTIN (xvexth_wu_hu, LARCH_UV8SI_FTYPE_UV16HI),
+  LASX_BUILTIN (xvexth_du_wu, LARCH_UV4DI_FTYPE_UV8SI),
+  LASX_BUILTIN (xvexth_qu_du, LARCH_UV4DI_FTYPE_UV4DI),
+  LASX_BUILTIN (xvrotri_b, LARCH_V32QI_FTYPE_V32QI_UQI),
+  LASX_BUILTIN (xvrotri_h, LARCH_V16HI_FTYPE_V16HI_UQI),
+  LASX_BUILTIN (xvrotri_w, LARCH_V8SI_FTYPE_V8SI_UQI),
+  LASX_BUILTIN (xvrotri_d, LARCH_V4DI_FTYPE_V4DI_UQI),
+  LASX_BUILTIN (xvextl_q_d, LARCH_V4DI_FTYPE_V4DI),
+  LASX_BUILTIN (xvsrlni_b_h, LARCH_V32QI_FTYPE_V32QI_V32QI_USI),
+  LASX_BUILTIN (xvsrlni_h_w, LARCH_V16HI_FTYPE_V16HI_V16HI_USI),
+  LASX_BUILTIN (xvsrlni_w_d, LARCH_V8SI_FTYPE_V8SI_V8SI_USI),
+  LASX_BUILTIN (xvsrlni_d_q, LARCH_V4DI_FTYPE_V4DI_V4DI_USI),
+  LASX_BUILTIN (xvsrlrni_b_h, LARCH_V32QI_FTYPE_V32QI_V32QI_USI),
+  LASX_BUILTIN (xvsrlrni_h_w, LARCH_V16HI_FTYPE_V16HI_V16HI_USI),
+  LASX_BUILTIN (xvsrlrni_w_d, LARCH_V8SI_FTYPE_V8SI_V8SI_USI),
+  LASX_BUILTIN (xvsrlrni_d_q, LARCH_V4DI_FTYPE_V4DI_V4DI_USI),
+  LASX_BUILTIN (xvssrlni_b_h, LARCH_V32QI_FTYPE_V32QI_V32QI_USI),
+  LASX_BUILTIN (xvssrlni_h_w, LARCH_V16HI_FTYPE_V16HI_V16HI_USI),
+  LASX_BUILTIN (xvssrlni_w_d, LARCH_V8SI_FTYPE_V8SI_V8SI_USI),
+  LASX_BUILTIN (xvssrlni_d_q, LARCH_V4DI_FTYPE_V4DI_V4DI_USI),
+  LASX_BUILTIN (xvssrlni_bu_h, LARCH_UV32QI_FTYPE_UV32QI_V32QI_USI),
+  LASX_BUILTIN (xvssrlni_hu_w, LARCH_UV16HI_FTYPE_UV16HI_V16HI_USI),
+  LASX_BUILTIN (xvssrlni_wu_d, LARCH_UV8SI_FTYPE_UV8SI_V8SI_USI),
+  LASX_BUILTIN (xvssrlni_du_q, LARCH_UV4DI_FTYPE_UV4DI_V4DI_USI),
+  LASX_BUILTIN (xvssrlrni_b_h, LARCH_V32QI_FTYPE_V32QI_V32QI_USI),
+  LASX_BUILTIN (xvssrlrni_h_w, LARCH_V16HI_FTYPE_V16HI_V16HI_USI),
+  LASX_BUILTIN (xvssrlrni_w_d, LARCH_V8SI_FTYPE_V8SI_V8SI_USI),
+  LASX_BUILTIN (xvssrlrni_d_q, LARCH_V4DI_FTYPE_V4DI_V4DI_USI),
+  LASX_BUILTIN (xvssrlrni_bu_h, LARCH_UV32QI_FTYPE_UV32QI_V32QI_USI),
+  LASX_BUILTIN (xvssrlrni_hu_w, LARCH_UV16HI_FTYPE_UV16HI_V16HI_USI),
+  LASX_BUILTIN (xvssrlrni_wu_d, LARCH_UV8SI_FTYPE_UV8SI_V8SI_USI),
+  LASX_BUILTIN (xvssrlrni_du_q, LARCH_UV4DI_FTYPE_UV4DI_V4DI_USI),
+  LASX_BUILTIN (xvsrani_b_h, LARCH_V32QI_FTYPE_V32QI_V32QI_USI),
+  LASX_BUILTIN (xvsrani_h_w, LARCH_V16HI_FTYPE_V16HI_V16HI_USI),
+  LASX_BUILTIN (xvsrani_w_d, LARCH_V8SI_FTYPE_V8SI_V8SI_USI),
+  LASX_BUILTIN (xvsrani_d_q, LARCH_V4DI_FTYPE_V4DI_V4DI_USI),
+  LASX_BUILTIN (xvsrarni_b_h, LARCH_V32QI_FTYPE_V32QI_V32QI_USI),
+  LASX_BUILTIN (xvsrarni_h_w, LARCH_V16HI_FTYPE_V16HI_V16HI_USI),
+  LASX_BUILTIN (xvsrarni_w_d, LARCH_V8SI_FTYPE_V8SI_V8SI_USI),
+  LASX_BUILTIN (xvsrarni_d_q, LARCH_V4DI_FTYPE_V4DI_V4DI_USI),
+  LASX_BUILTIN (xvssrani_b_h, LARCH_V32QI_FTYPE_V32QI_V32QI_USI),
+  LASX_BUILTIN (xvssrani_h_w, LARCH_V16HI_FTYPE_V16HI_V16HI_USI),
+  LASX_BUILTIN (xvssrani_w_d, LARCH_V8SI_FTYPE_V8SI_V8SI_USI),
+  LASX_BUILTIN (xvssrani_d_q, LARCH_V4DI_FTYPE_V4DI_V4DI_USI),
+  LASX_BUILTIN (xvssrani_bu_h, LARCH_UV32QI_FTYPE_UV32QI_V32QI_USI),
+  LASX_BUILTIN (xvssrani_hu_w, LARCH_UV16HI_FTYPE_UV16HI_V16HI_USI),
+  LASX_BUILTIN (xvssrani_wu_d, LARCH_UV8SI_FTYPE_UV8SI_V8SI_USI),
+  LASX_BUILTIN (xvssrani_du_q, LARCH_UV4DI_FTYPE_UV4DI_V4DI_USI),
+  LASX_BUILTIN (xvssrarni_b_h, LARCH_V32QI_FTYPE_V32QI_V32QI_USI),
+  LASX_BUILTIN (xvssrarni_h_w, LARCH_V16HI_FTYPE_V16HI_V16HI_USI),
+  LASX_BUILTIN (xvssrarni_w_d, LARCH_V8SI_FTYPE_V8SI_V8SI_USI),
+  LASX_BUILTIN (xvssrarni_d_q, LARCH_V4DI_FTYPE_V4DI_V4DI_USI),
+  LASX_BUILTIN (xvssrarni_bu_h, LARCH_UV32QI_FTYPE_UV32QI_V32QI_USI),
+  LASX_BUILTIN (xvssrarni_hu_w, LARCH_UV16HI_FTYPE_UV16HI_V16HI_USI),
+  LASX_BUILTIN (xvssrarni_wu_d, LARCH_UV8SI_FTYPE_UV8SI_V8SI_USI),
+  LASX_BUILTIN (xvssrarni_du_q, LARCH_UV4DI_FTYPE_UV4DI_V4DI_USI)
 };

 /* Index I is the function declaration for loongarch_builtins[I], or null if
@@ -1441,11 +2497,15 @@ loongarch_builtin_vectorized_function (unsigned int fn, tree type_out,
 	{
 	  if (out_n == 2 && in_n == 2)
 	    return LARCH_GET_BUILTIN (lsx_vfrintrp_d);
+	  if (out_n == 4 && in_n == 4)
+	    return LARCH_GET_BUILTIN (lasx_xvfrintrp_d);
 	}
       if (out_mode == SFmode && in_mode == SFmode)
 	{
 	  if (out_n == 4 && in_n == 4)
 	    return LARCH_GET_BUILTIN (lsx_vfrintrp_s);
+	  if (out_n == 8 && in_n == 8)
+	    return LARCH_GET_BUILTIN (lasx_xvfrintrp_s);
 	}
       break;

@@ -1454,11 +2514,15 @@ loongarch_builtin_vectorized_function (unsigned int fn, tree type_out,
 	{
 	  if (out_n == 2 && in_n == 2)
 	    return LARCH_GET_BUILTIN (lsx_vfrintrz_d);
+	  if (out_n == 4 && in_n == 4)
+	    return LARCH_GET_BUILTIN (lasx_xvfrintrz_d);
 	}
      if (out_mode == SFmode && in_mode == SFmode)
 	{
 	  if (out_n == 4 && in_n == 4)
 	    return LARCH_GET_BUILTIN (lsx_vfrintrz_s);
+	  if (out_n == 8 && in_n == 8)
+	    return LARCH_GET_BUILTIN (lasx_xvfrintrz_s);
 	}
       break;

@@ -1468,11 +2532,15 @@ loongarch_builtin_vectorized_function (unsigned int fn, tree type_out,
 	{
 	  if (out_n == 2 && in_n == 2)
 	    return LARCH_GET_BUILTIN (lsx_vfrint_d);
+	  if (out_n == 4 && in_n == 4)
+	    return LARCH_GET_BUILTIN (lasx_xvfrint_d);
 	}
       if (out_mode == SFmode && in_mode == SFmode)
 	{
 	  if (out_n == 4 && in_n == 4)
 	    return LARCH_GET_BUILTIN (lsx_vfrint_s);
+	  if (out_n == 8 && in_n == 8)
+	    return LARCH_GET_BUILTIN (lasx_xvfrint_s);
 	}
       break;

@@ -1481,11 +2549,15 @@ loongarch_builtin_vectorized_function (unsigned int fn, tree type_out,
 	{
 	  if (out_n == 2 && in_n == 2)
 	    return LARCH_GET_BUILTIN (lsx_vfrintrm_d);
+	  if (out_n == 4 && in_n == 4)
+	    return LARCH_GET_BUILTIN (lasx_xvfrintrm_d);
 	}
       if (out_mode == SFmode && in_mode == SFmode)
 	{
 	  if (out_n == 4 && in_n == 4)
 	    return LARCH_GET_BUILTIN (lsx_vfrintrm_s);
+	  if (out_n == 8 && in_n == 8)
+	    return LARCH_GET_BUILTIN (lasx_xvfrintrm_s);
 	}
       break;

@@ -1560,6 +2632,30 @@ loongarch_expand_builtin_insn (enum insn_code icode, unsigned int nops,
     case CODE_FOR_lsx_vsubi_hu:
     case CODE_FOR_lsx_vsubi_wu:
     case CODE_FOR_lsx_vsubi_du:
+    case CODE_FOR_lasx_xvaddi_bu:
+    case CODE_FOR_lasx_xvaddi_hu:
+    case CODE_FOR_lasx_xvaddi_wu:
+    case CODE_FOR_lasx_xvaddi_du:
+    case CODE_FOR_lasx_xvslti_bu:
+    case CODE_FOR_lasx_xvslti_hu:
+    case CODE_FOR_lasx_xvslti_wu:
+    case CODE_FOR_lasx_xvslti_du:
+    case CODE_FOR_lasx_xvslei_bu:
+    case CODE_FOR_lasx_xvslei_hu:
+    case CODE_FOR_lasx_xvslei_wu:
+    case CODE_FOR_lasx_xvslei_du:
+    case CODE_FOR_lasx_xvmaxi_bu:
+    case CODE_FOR_lasx_xvmaxi_hu:
+    case CODE_FOR_lasx_xvmaxi_wu:
+    case CODE_FOR_lasx_xvmaxi_du:
+    case CODE_FOR_lasx_xvmini_bu:
+    case CODE_FOR_lasx_xvmini_hu:
+    case CODE_FOR_lasx_xvmini_wu:
+    case CODE_FOR_lasx_xvmini_du:
+    case CODE_FOR_lasx_xvsubi_bu:
+    case CODE_FOR_lasx_xvsubi_hu:
+    case CODE_FOR_lasx_xvsubi_wu:
+    case CODE_FOR_lasx_xvsubi_du:
       gcc_assert (has_target_p && nops == 3);
       /* We only generate a vector of constants iff the second argument
	 is an immediate.  We also validate the range of the immediate.  */

@@ -1598,6 +2694,26 @@ loongarch_expand_builtin_insn (enum insn_code icode, unsigned int nops,
     case CODE_FOR_lsx_vmini_h:
     case CODE_FOR_lsx_vmini_w:
     case CODE_FOR_lsx_vmini_d:
+    case CODE_FOR_lasx_xvseqi_b:
+    case CODE_FOR_lasx_xvseqi_h:
+    case CODE_FOR_lasx_xvseqi_w:
+    case CODE_FOR_lasx_xvseqi_d:
+    case CODE_FOR_lasx_xvslti_b:
+    case CODE_FOR_lasx_xvslti_h:
+    case CODE_FOR_lasx_xvslti_w:
+    case CODE_FOR_lasx_xvslti_d:
+    case CODE_FOR_lasx_xvslei_b:
+    case CODE_FOR_lasx_xvslei_h:
+    case CODE_FOR_lasx_xvslei_w:
+    case CODE_FOR_lasx_xvslei_d:
+    case CODE_FOR_lasx_xvmaxi_b:
+    case CODE_FOR_lasx_xvmaxi_h:
+    case CODE_FOR_lasx_xvmaxi_w:
+    case CODE_FOR_lasx_xvmaxi_d:
+    case CODE_FOR_lasx_xvmini_b:
+    case CODE_FOR_lasx_xvmini_h:
+    case CODE_FOR_lasx_xvmini_w:
+    case CODE_FOR_lasx_xvmini_d:
       gcc_assert (has_target_p && nops == 3);
      /* We only generate a vector of constants iff the second argument
	 is an immediate.  We also validate the range of the immediate.  */

@@ -1620,6 +2736,10 @@ loongarch_expand_builtin_insn (enum insn_code icode, unsigned int nops,
     case CODE_FOR_lsx_vori_b:
     case CODE_FOR_lsx_vnori_b:
     case CODE_FOR_lsx_vxori_b:
+    case CODE_FOR_lasx_xvandi_b:
+    case CODE_FOR_lasx_xvori_b:
+    case CODE_FOR_lasx_xvnori_b:
+    case CODE_FOR_lasx_xvxori_b:
       gcc_assert (has_target_p && nops == 3);
       if (!CONST_INT_P (ops[2].value))
	break;

@@ -1629,6 +2749,7 @@ loongarch_expand_builtin_insn (enum insn_code icode, unsigned int nops,
       break;

     case CODE_FOR_lsx_vbitseli_b:
+    case CODE_FOR_lasx_xvbitseli_b:
       gcc_assert (has_target_p && nops == 4);
       if (!CONST_INT_P (ops[3].value))
	break;

@@ -1641,6 +2762,10 @@ loongarch_expand_builtin_insn (enum insn_code icode, unsigned int nops,
     case CODE_FOR_lsx_vreplgr2vr_h:
     case CODE_FOR_lsx_vreplgr2vr_w:
     case CODE_FOR_lsx_vreplgr2vr_d:
+    case CODE_FOR_lasx_xvreplgr2vr_b:
+    case CODE_FOR_lasx_xvreplgr2vr_h:
+    case CODE_FOR_lasx_xvreplgr2vr_w:
+    case CODE_FOR_lasx_xvreplgr2vr_d:
       /* Map the built-ins to vector fill operations.  We need fix up the
	 mode for the element being inserted.  */
       gcc_assert (has_target_p && nops == 2);

@@ -1669,6 +2794,26 @@ loongarch_expand_builtin_insn (enum insn_code icode, unsigned int nops,
     case CODE_FOR_lsx_vpickod_b:
     case CODE_FOR_lsx_vpickod_h:
     case CODE_FOR_lsx_vpickod_w:
+    case CODE_FOR_lasx_xvilvh_b:
+    case CODE_FOR_lasx_xvilvh_h:
+    case CODE_FOR_lasx_xvilvh_w:
+    case CODE_FOR_lasx_xvilvh_d:
+    case CODE_FOR_lasx_xvilvl_b:
+    case CODE_FOR_lasx_xvilvl_h:
+    case CODE_FOR_lasx_xvilvl_w:
+    case CODE_FOR_lasx_xvilvl_d:
+    case CODE_FOR_lasx_xvpackev_b:
+    case CODE_FOR_lasx_xvpackev_h:
+    case CODE_FOR_lasx_xvpackev_w:
+    case CODE_FOR_lasx_xvpackod_b:
+    case CODE_FOR_lasx_xvpackod_h:
+    case CODE_FOR_lasx_xvpackod_w:
+    case CODE_FOR_lasx_xvpickev_b:
+    case CODE_FOR_lasx_xvpickev_h:
+    case CODE_FOR_lasx_xvpickev_w:
+    case CODE_FOR_lasx_xvpickod_b:
+    case CODE_FOR_lasx_xvpickod_h:
+    case CODE_FOR_lasx_xvpickod_w:
       /* Swap the operands 1 and 2 for interleave operations.  Built-ins follow
	 convention of ISA, which have op1 as higher component and op2 as lower
	 component.  However, the VEC_PERM op in tree and vec_concat in RTL

@@ -1690,6 +2835,18 @@ loongarch_expand_builtin_insn (enum insn_code icode, unsigned int nops,
     case CODE_FOR_lsx_vsrli_h:
     case CODE_FOR_lsx_vsrli_w:
     case CODE_FOR_lsx_vsrli_d:
+    case CODE_FOR_lasx_xvslli_b:
+    case CODE_FOR_lasx_xvslli_h:
+    case CODE_FOR_lasx_xvslli_w:
+    case CODE_FOR_lasx_xvslli_d:
+    case CODE_FOR_lasx_xvsrai_b:
+    case CODE_FOR_lasx_xvsrai_h:
+    case CODE_FOR_lasx_xvsrai_w:
+    case CODE_FOR_lasx_xvsrai_d:
+    case CODE_FOR_lasx_xvsrli_b:
+    case CODE_FOR_lasx_xvsrli_h:
+    case CODE_FOR_lasx_xvsrli_w:
+    case CODE_FOR_lasx_xvsrli_d:
       gcc_assert (has_target_p && nops == 3);
       if (CONST_INT_P (ops[2].value))
	{

@@ -1750,6 +2907,25 @@ loongarch_expand_builtin_insn (enum insn_code icode, unsigned int nops,
			    INTVAL (ops[2].value));
       break;

+    case CODE_FOR_lasx_xvinsgr2vr_w:
+    case CODE_FOR_lasx_xvinsgr2vr_d:
+      /* Map the built-ins to insert operations.  We need to swap operands,
+	 fix up the mode for the element being inserted, and generate
+	 a bit mask for vec_merge.  */
+      gcc_assert (has_target_p && nops == 4);
+      std::swap (ops[1], ops[2]);
+      imode = GET_MODE_INNER (ops[0].mode);
+      ops[1].value = lowpart_subreg (imode, ops[1].value, ops[1].mode);
+      ops[1].mode = imode;
+      rangelo = 0;
+      rangehi = GET_MODE_NUNITS (ops[0].mode) - 1;
+      if (CONST_INT_P (ops[3].value)
+	  && IN_RANGE (INTVAL (ops[3].value), rangelo, rangehi))
+	ops[3].value = GEN_INT (1 << INTVAL (ops[3].value));
+      else
+	error_opno = 2;
+      break;
+
     default:
       break;
     }

@@ -1859,12 +3035,14 @@ loongarch_expand_builtin (tree exp, rtx target, rtx subtarget ATTRIBUTE_UNUSED,
     {
     case LARCH_BUILTIN_DIRECT:
     case LARCH_BUILTIN_LSX:
+    case LARCH_BUILTIN_LASX:
      return loongarch_expand_builtin_direct (d->icode, target, exp, true);

     case LARCH_BUILTIN_DIRECT_NO_TARGET:
      return loongarch_expand_builtin_direct (d->icode, target, exp, false);

     case LARCH_BUILTIN_LSX_TEST_BRANCH:
+    case LARCH_BUILTIN_LASX_TEST_BRANCH:
      return loongarch_expand_builtin_lsx_test_branch (d->icode, exp);
     }
   gcc_unreachable ();
diff --git a/gcc/config/loongarch/loongarch-ftypes.def b/gcc/config/loongarch/loongarch-ftypes.def
index 1ce9d83ccab..72d96878038 100644
--- a/gcc/config/loongarch/loongarch-ftypes.def
+++ b/gcc/config/loongarch/loongarch-ftypes.def
@@ -67,6 +67,7 @@ DEF_LARCH_FTYPE (3, (UDI, UDI, UDI, USI))
 DEF_LARCH_FTYPE (1, (DF, DF))
 DEF_LARCH_FTYPE (2, (DF, DF, DF))
 DEF_LARCH_FTYPE (1, (DF, V2DF))
+DEF_LARCH_FTYPE (1, (DF, V4DF))

 DEF_LARCH_FTYPE (1, (DI, DI))
 DEF_LARCH_FTYPE (1, (DI, SI))
@@ -83,6 +84,7 @@ DEF_LARCH_FTYPE (2, (DI, SI, SI))
 DEF_LARCH_FTYPE (2, (DI, USI, USI))
 DEF_LARCH_FTYPE (2, (DI, V2DI, UQI))
+DEF_LARCH_FTYPE (2, (DI, V4DI, UQI))

 DEF_LARCH_FTYPE (2, (INT, DF, DF))
 DEF_LARCH_FTYPE (2, (INT, SF, SF))
@@ -104,21 +106,31 @@ DEF_LARCH_FTYPE (3, (SI, SI, SI, SI))
 DEF_LARCH_FTYPE (3, (SI, SI, SI, QI))
 DEF_LARCH_FTYPE (1, (SI, UQI))
 DEF_LARCH_FTYPE (1, (SI, UV16QI))
+DEF_LARCH_FTYPE (1, (SI, UV32QI))
 DEF_LARCH_FTYPE (1, (SI, UV2DI))
+DEF_LARCH_FTYPE (1, (SI, UV4DI))
 DEF_LARCH_FTYPE (1, (SI, UV4SI))
+DEF_LARCH_FTYPE (1, (SI, UV8SI))
 DEF_LARCH_FTYPE (1, (SI, UV8HI))
+DEF_LARCH_FTYPE (1, (SI, UV16HI))
 DEF_LARCH_FTYPE (2, (SI, V16QI, UQI))
+DEF_LARCH_FTYPE (2, (SI, V32QI, UQI))
 DEF_LARCH_FTYPE (1, (SI, V2HI))
 DEF_LARCH_FTYPE (2, (SI, V2HI, V2HI))
 DEF_LARCH_FTYPE (1, (SI, V4QI))
 DEF_LARCH_FTYPE (2, (SI, V4QI, V4QI))
 DEF_LARCH_FTYPE (2, (SI, V4SI, UQI))
+DEF_LARCH_FTYPE (2, (SI, V8SI, UQI))
 DEF_LARCH_FTYPE (2, (SI, V8HI, UQI))
 DEF_LARCH_FTYPE (1, (SI, VOID))

 DEF_LARCH_FTYPE (2, (UDI, UDI, UDI))
+DEF_LARCH_FTYPE (2, (USI, V32QI, UQI))
 DEF_LARCH_FTYPE (2, (UDI, UV2SI, UV2SI))
+DEF_LARCH_FTYPE (2, (USI, V8SI, UQI))
 DEF_LARCH_FTYPE (2, (UDI, V2DI, UQI))
+DEF_LARCH_FTYPE (2, (USI, V16HI, UQI))
+DEF_LARCH_FTYPE (2, (UDI, V4DI, UQI))

 DEF_LARCH_FTYPE (2, (USI, V16QI, UQI))
 DEF_LARCH_FTYPE (2, (USI, V4SI, UQI))
@@ -142,6 +154,23 @@ DEF_LARCH_FTYPE (2, (UV2DI, UV2DI, V2DI))
 DEF_LARCH_FTYPE (2, (UV2DI, UV4SI, UV4SI))
 DEF_LARCH_FTYPE (1, (UV2DI, V2DF))

+DEF_LARCH_FTYPE (2, (UV32QI, UV32QI, UQI))
+DEF_LARCH_FTYPE (2, (UV32QI, UV32QI, USI))
+DEF_LARCH_FTYPE (2, (UV32QI, UV32QI, UV32QI))
+DEF_LARCH_FTYPE (3, (UV32QI, UV32QI, UV32QI, UQI))
+DEF_LARCH_FTYPE (3, (UV32QI, UV32QI, UV32QI, USI))
+DEF_LARCH_FTYPE (3, (UV32QI, UV32QI, UV32QI, UV32QI))
+DEF_LARCH_FTYPE (2, (UV32QI, UV32QI, V32QI))
+
+DEF_LARCH_FTYPE (2, (UV4DI, UV4DI, UQI))
+DEF_LARCH_FTYPE (2, (UV4DI, UV4DI, UV4DI))
+DEF_LARCH_FTYPE (3, (UV4DI, UV4DI, UV4DI, UQI))
+DEF_LARCH_FTYPE (3, (UV4DI, UV4DI, UV4DI, UV4DI))
+DEF_LARCH_FTYPE (3, (UV4DI, UV4DI, UV8SI, UV8SI))
+DEF_LARCH_FTYPE (2, (UV4DI, UV4DI, V4DI))
+DEF_LARCH_FTYPE (2, (UV4DI, UV8SI, UV8SI))
+DEF_LARCH_FTYPE (1, (UV4DI, V4DF))
+
 DEF_LARCH_FTYPE (2, (UV2SI, UV2SI, UQI))
 DEF_LARCH_FTYPE (2, (UV2SI, UV2SI, UV2SI))
@@ -170,7 +199,22 @@ DEF_LARCH_FTYPE (3, (UV8HI, UV8HI, UV8HI, UQI))
 DEF_LARCH_FTYPE (3, (UV8HI, UV8HI, UV8HI, UV8HI))
 DEF_LARCH_FTYPE (2, (UV8HI, UV8HI, V8HI))
-
+DEF_LARCH_FTYPE (2, (UV8SI, UV8SI, UQI))
+DEF_LARCH_FTYPE (2, (UV8SI, UV8SI, UV8SI))
+DEF_LARCH_FTYPE (3, (UV8SI, UV8SI, UV8SI, UQI))
+DEF_LARCH_FTYPE (3, (UV8SI, UV8SI, UV8SI, UV8SI))
+DEF_LARCH_FTYPE (3, (UV8SI, UV8SI, UV16HI, UV16HI))
+DEF_LARCH_FTYPE (2, (UV8SI, UV8SI, V8SI))
+DEF_LARCH_FTYPE (2, (UV8SI, UV16HI, UV16HI))
+DEF_LARCH_FTYPE (1, (UV8SI, V8SF))
+
+DEF_LARCH_FTYPE (2, (UV16HI, UV32QI, UV32QI))
+DEF_LARCH_FTYPE (2, (UV16HI, UV16HI, UQI))
+DEF_LARCH_FTYPE (3, (UV16HI, UV16HI, UV32QI, UV32QI))
+DEF_LARCH_FTYPE (2, (UV16HI, UV16HI, UV16HI))
+DEF_LARCH_FTYPE (3, (UV16HI, UV16HI, UV16HI, UQI))
+DEF_LARCH_FTYPE (3, (UV16HI, UV16HI, UV16HI, UV16HI))
+DEF_LARCH_FTYPE (2, (UV16HI, UV16HI, V16HI))

 DEF_LARCH_FTYPE (2, (UV8QI, UV4HI, UV4HI))
 DEF_LARCH_FTYPE (1, (UV8QI, UV8QI))
@@ -196,6 +240,25 @@ DEF_LARCH_FTYPE (4, (V16QI, V16QI, V16QI, UQI, UQI))
 DEF_LARCH_FTYPE (3, (V16QI, V16QI, V16QI, USI))
 DEF_LARCH_FTYPE (3, (V16QI, V16QI, V16QI, V16QI))

+DEF_LARCH_FTYPE (2, (V32QI, CVPOINTER, SI))
+DEF_LARCH_FTYPE (2, (V32QI, CVPOINTER, DI))
+DEF_LARCH_FTYPE (1, (V32QI, HI))
+DEF_LARCH_FTYPE (1, (V32QI, SI))
+DEF_LARCH_FTYPE (2, (V32QI, UV32QI, UQI))
+DEF_LARCH_FTYPE (2, (V32QI, UV32QI, UV32QI))
+DEF_LARCH_FTYPE (1, (V32QI, V32QI))
+DEF_LARCH_FTYPE (2, (V32QI, V32QI, QI))
+DEF_LARCH_FTYPE (2, (V32QI, V32QI, SI))
+DEF_LARCH_FTYPE (2, (V32QI, V32QI, UQI))
+DEF_LARCH_FTYPE (2, (V32QI, V32QI, USI))
+DEF_LARCH_FTYPE (3, (V32QI, V32QI, SI, UQI))
+DEF_LARCH_FTYPE (3, (V32QI, V32QI, UQI, V32QI))
+DEF_LARCH_FTYPE (2, (V32QI, V32QI, V32QI))
+DEF_LARCH_FTYPE (3, (V32QI, V32QI, V32QI, SI))
+DEF_LARCH_FTYPE (3, (V32QI, V32QI, V32QI, UQI))
+DEF_LARCH_FTYPE (4, (V32QI, V32QI, V32QI, UQI, UQI))
+DEF_LARCH_FTYPE (3, (V32QI, V32QI, V32QI, USI))
+DEF_LARCH_FTYPE (3, (V32QI, V32QI, V32QI, V32QI))

 DEF_LARCH_FTYPE (1, (V2DF, DF))
 DEF_LARCH_FTYPE (1, (V2DF, UV2DI))
@@ -207,6 +270,16 @@ DEF_LARCH_FTYPE (1, (V2DF, V2DI))
 DEF_LARCH_FTYPE (1, (V2DF, V4SF))
 DEF_LARCH_FTYPE (1, (V2DF, V4SI))

+DEF_LARCH_FTYPE (1, (V4DF, DF))
+DEF_LARCH_FTYPE (1, (V4DF, UV4DI))
+DEF_LARCH_FTYPE (1, (V4DF, V4DF))
+DEF_LARCH_FTYPE (2, (V4DF, V4DF, V4DF))
+DEF_LARCH_FTYPE (3, (V4DF, V4DF, V4DF, V4DF))
+DEF_LARCH_FTYPE (2, (V4DF, V4DF, V4DI))
+DEF_LARCH_FTYPE (1, (V4DF, V4DI))
+DEF_LARCH_FTYPE (1, (V4DF, V8SF))
+DEF_LARCH_FTYPE (1, (V4DF, V8SI))
+
 DEF_LARCH_FTYPE (2, (V2DI, CVPOINTER, SI))
 DEF_LARCH_FTYPE (1, (V2DI, DI))
 DEF_LARCH_FTYPE (1, (V2DI, HI))
@@ -233,6 +306,32 @@ DEF_LARCH_FTYPE (3, (V2DI, V2DI, V2DI, V2DI))
 DEF_LARCH_FTYPE (3, (V2DI, V2DI, V4SI, V4SI))
 DEF_LARCH_FTYPE (2, (V2DI, V4SI, V4SI))

+DEF_LARCH_FTYPE (2, (V4DI, CVPOINTER, SI))
+DEF_LARCH_FTYPE (1, (V4DI, DI))
+DEF_LARCH_FTYPE (1, (V4DI, HI))
+DEF_LARCH_FTYPE (2, (V4DI, UV4DI, UQI))
+DEF_LARCH_FTYPE (2, (V4DI, UV4DI, UV4DI))
+DEF_LARCH_FTYPE (2, (V4DI, UV8SI, UV8SI))
+DEF_LARCH_FTYPE (1, (V4DI, V4DF))
+DEF_LARCH_FTYPE (2, (V4DI, V4DF, V4DF))
+DEF_LARCH_FTYPE (1, (V4DI, V4DI))
+DEF_LARCH_FTYPE (1, (UV4DI, UV4DI))
+DEF_LARCH_FTYPE (2, (V4DI, V4DI, QI))
+DEF_LARCH_FTYPE (2, (V4DI, V4DI, SI))
+DEF_LARCH_FTYPE (2, (V4DI, V4DI, UQI))
+DEF_LARCH_FTYPE (2, (V4DI, V4DI, USI))
+DEF_LARCH_FTYPE (3, (V4DI, V4DI, DI, UQI))
+DEF_LARCH_FTYPE (3, (V4DI, V4DI, UQI, V4DI))
+DEF_LARCH_FTYPE (3, (V4DI, V4DI, UV8SI, UV8SI))
+DEF_LARCH_FTYPE (2, (V4DI, V4DI, V4DI))
+DEF_LARCH_FTYPE (3, (V4DI, V4DI, V4DI, SI))
+DEF_LARCH_FTYPE (3, (V4DI, V4DI, V4DI, USI))
+DEF_LARCH_FTYPE (3, (V4DI, V4DI, V4DI, UQI))
+DEF_LARCH_FTYPE (4, (V4DI, V4DI, V4DI, UQI, UQI))
+DEF_LARCH_FTYPE (3, (V4DI, V4DI, V4DI, V4DI))
+DEF_LARCH_FTYPE (3, (V4DI, V4DI, V8SI, V8SI))
+DEF_LARCH_FTYPE (2, (V4DI, V8SI, V8SI))
+
 DEF_LARCH_FTYPE (1, (V2HI, SI))
 DEF_LARCH_FTYPE (2, (V2HI, SI, SI))
 DEF_LARCH_FTYPE (3, (V2HI, SI, SI, SI))
@@ -274,6 +373,17 @@ DEF_LARCH_FTYPE (3, (V4SF, V4SF, V4SF, V4SF))
 DEF_LARCH_FTYPE (2, (V4SF, V4SF, V4SI))
 DEF_LARCH_FTYPE (1, (V4SF, V4SI))
 DEF_LARCH_FTYPE (1, (V4SF, V8HI))
+DEF_LARCH_FTYPE (1, (V8SF, V16HI))
+
+DEF_LARCH_FTYPE (1, (V8SF, SF))
+DEF_LARCH_FTYPE (1, (V8SF, UV8SI))
+DEF_LARCH_FTYPE (2, (V8SF, V4DF, V4DF))
+DEF_LARCH_FTYPE (1, (V8SF, V8SF))
+DEF_LARCH_FTYPE (2, (V8SF, V8SF, V8SF))
+DEF_LARCH_FTYPE (3, (V8SF, V8SF, V8SF, V8SF))
+DEF_LARCH_FTYPE (2, (V8SF, V8SF, V8SI))
+DEF_LARCH_FTYPE (1, (V8SF, V8SI))
+DEF_LARCH_FTYPE (1, (V8SF, V8HI))

 DEF_LARCH_FTYPE (2, (V4SI, CVPOINTER, SI))
 DEF_LARCH_FTYPE (1, (V4SI, HI))
@@ -282,6 +392,7 @@ DEF_LARCH_FTYPE (2, (V4SI, UV4SI, UQI))
 DEF_LARCH_FTYPE (2, (V4SI, UV4SI, UV4SI))
 DEF_LARCH_FTYPE (2, (V4SI, UV8HI, UV8HI))
 DEF_LARCH_FTYPE (2, (V4SI, V2DF, V2DF))
+DEF_LARCH_FTYPE (2, (V8SI, V4DF, V4DF))
 DEF_LARCH_FTYPE (1, (V4SI, V4SF))
 DEF_LARCH_FTYPE (2, (V4SI, V4SF, V4SF))
 DEF_LARCH_FTYPE (1, (V4SI, V4SI))
@@ -301,6 +412,32 @@ DEF_LARCH_FTYPE (3, (V4SI, V4SI, V4SI, V4SI))
 DEF_LARCH_FTYPE (3, (V4SI, V4SI, V8HI, V8HI))
 DEF_LARCH_FTYPE (2, (V4SI, V8HI, V8HI))

+DEF_LARCH_FTYPE (2, (V8SI, CVPOINTER, SI))
+DEF_LARCH_FTYPE (1, (V8SI, HI))
+DEF_LARCH_FTYPE (1, (V8SI, SI))
+DEF_LARCH_FTYPE (2, (V8SI, UV8SI, UQI))
+DEF_LARCH_FTYPE (2, (V8SI, UV8SI, UV8SI))
+DEF_LARCH_FTYPE (2, (V8SI, UV16HI, UV16HI))
+DEF_LARCH_FTYPE (2, (V8SI, V2DF, V2DF))
+DEF_LARCH_FTYPE (1, (V8SI, V8SF))
+DEF_LARCH_FTYPE (2, (V8SI, V8SF, V8SF))
+DEF_LARCH_FTYPE (1, (V8SI, V8SI))
+DEF_LARCH_FTYPE (2, (V8SI, V8SI, QI))
+DEF_LARCH_FTYPE (2, (V8SI, V8SI, SI))
+DEF_LARCH_FTYPE (2, (V8SI, V8SI, UQI))
+DEF_LARCH_FTYPE (2, (V8SI, V8SI, USI))
+DEF_LARCH_FTYPE (3, (V8SI, V8SI, SI, UQI))
+DEF_LARCH_FTYPE (3, (V8SI, V8SI, UQI, V8SI))
+DEF_LARCH_FTYPE (3, (V8SI, V8SI, UV16HI, UV16HI))
+DEF_LARCH_FTYPE (2, (V8SI, V8SI, V8SI))
+DEF_LARCH_FTYPE (3, (V8SI, V8SI, V8SI, SI))
+DEF_LARCH_FTYPE (3, (V8SI, V8SI, V8SI, UQI))
+DEF_LARCH_FTYPE (3, (V8SI, V8SI, V8SI, USI))
+DEF_LARCH_FTYPE (4, (V8SI, V8SI, V8SI, UQI, UQI))
+DEF_LARCH_FTYPE (3, (V8SI, V8SI, V8SI, V8SI))
+DEF_LARCH_FTYPE (3, (V8SI, V8SI, V16HI, V16HI))
+DEF_LARCH_FTYPE (2, (V8SI, V16HI, V16HI))
+
 DEF_LARCH_FTYPE (2, (V8HI, CVPOINTER, SI))
 DEF_LARCH_FTYPE (1, (V8HI, HI))
 DEF_LARCH_FTYPE (1, (V8HI, SI))
@@ -326,6 +463,31 @@ DEF_LARCH_FTYPE (4, (V8HI, V8HI, V8HI, UQI, UQI))
 DEF_LARCH_FTYPE (3, (V8HI, V8HI, V8HI, USI))
 DEF_LARCH_FTYPE (3, (V8HI, V8HI, V8HI, V8HI))

+DEF_LARCH_FTYPE (2, (V16HI, CVPOINTER, SI))
+DEF_LARCH_FTYPE (1, (V16HI, HI))
+DEF_LARCH_FTYPE (1, (V16HI, SI))
+DEF_LARCH_FTYPE (2, (V16HI, UV32QI, UV32QI))
+DEF_LARCH_FTYPE (2, (V16HI, UV16HI, UQI))
+DEF_LARCH_FTYPE (2, (V16HI, UV16HI, UV16HI))
+DEF_LARCH_FTYPE (2, (V16HI, V32QI, V32QI))
+DEF_LARCH_FTYPE (2, (V16HI, V8SF, V8SF))
+DEF_LARCH_FTYPE (1, (V16HI, V16HI))
+DEF_LARCH_FTYPE (2, (V16HI, V16HI, QI))
+DEF_LARCH_FTYPE (2, (V16HI, V16HI, SI))
+DEF_LARCH_FTYPE (3, (V16HI, V16HI, SI, UQI))
+DEF_LARCH_FTYPE (2, (V16HI, V16HI, UQI))
+DEF_LARCH_FTYPE (2, (V16HI, V16HI, USI))
+DEF_LARCH_FTYPE (3, (V16HI, V16HI, UQI, SI))
+DEF_LARCH_FTYPE (3, (V16HI, V16HI, UQI, V16HI))
+DEF_LARCH_FTYPE (3, (V16HI, V16HI, UV32QI, UV32QI))
+DEF_LARCH_FTYPE (3, (V16HI, V16HI, V32QI, V32QI))
+DEF_LARCH_FTYPE (2, (V16HI, V16HI, V16HI))
+DEF_LARCH_FTYPE (3, (V16HI, V16HI, V16HI, SI))
+DEF_LARCH_FTYPE (3, (V16HI, V16HI, V16HI, UQI))
+DEF_LARCH_FTYPE (4, (V16HI, V16HI, V16HI, UQI, UQI))
+DEF_LARCH_FTYPE (3, (V16HI, V16HI, V16HI, USI))
+DEF_LARCH_FTYPE (3, (V16HI, V16HI, V16HI, V16HI))
+
 DEF_LARCH_FTYPE (2, (V8QI, V4HI, V4HI))
 DEF_LARCH_FTYPE (1, (V8QI, V8QI))
 DEF_LARCH_FTYPE (2, (V8QI, V8QI, V8QI))
@@ -337,62 +499,113 @@ DEF_LARCH_FTYPE (2, (VOID, USI, UQI))
 DEF_LARCH_FTYPE (1, (VOID, UHI))
 DEF_LARCH_FTYPE (3, (VOID, V16QI, CVPOINTER, SI))
 DEF_LARCH_FTYPE (3, (VOID, V16QI, CVPOINTER, DI))
+DEF_LARCH_FTYPE (3, (VOID, V32QI, CVPOINTER, SI))
+DEF_LARCH_FTYPE (3, (VOID, V32QI, CVPOINTER, DI))
+DEF_LARCH_FTYPE (3, (VOID, V4DF, POINTER, SI))
 DEF_LARCH_FTYPE (3, (VOID, V2DF, POINTER, SI))
 DEF_LARCH_FTYPE (3, (VOID, V2DI, CVPOINTER, SI))
+DEF_LARCH_FTYPE (3, (VOID, V4DI, CVPOINTER, SI))
 DEF_LARCH_FTYPE (2, (VOID, V2HI, V2HI))
 DEF_LARCH_FTYPE (2, (VOID, V4QI, V4QI))
 DEF_LARCH_FTYPE (3, (VOID, V4SF, POINTER, SI))
+DEF_LARCH_FTYPE (3, (VOID, V8SF, POINTER, SI))
 DEF_LARCH_FTYPE (3, (VOID, V4SI, CVPOINTER, SI))
+DEF_LARCH_FTYPE (3, (VOID, V8SI, CVPOINTER, SI))
 DEF_LARCH_FTYPE (3, (VOID, V8HI, CVPOINTER, SI))
+DEF_LARCH_FTYPE (3, (VOID, V16HI, CVPOINTER, SI))
+DEF_LARCH_FTYPE (1, (V16HI, V32QI))
+DEF_LARCH_FTYPE (1, (UV16HI, UV32QI))
+DEF_LARCH_FTYPE (1, (V8SI, V32QI))
+DEF_LARCH_FTYPE (1, (V4DI, V32QI))
 DEF_LARCH_FTYPE (1, (V8HI, V16QI))
 DEF_LARCH_FTYPE (1, (V4SI, V16QI))
 DEF_LARCH_FTYPE (1, (V2DI, V16QI))
+DEF_LARCH_FTYPE (1, (UV8SI, UV16HI))
+DEF_LARCH_FTYPE (1, (V8SI, V16HI))
+DEF_LARCH_FTYPE (1, (V4DI, V16HI))
 DEF_LARCH_FTYPE (1, (V4SI, V8HI))
 DEF_LARCH_FTYPE (1, (V2DI, V8HI))
 DEF_LARCH_FTYPE (1, (V2DI, V4SI))
+DEF_LARCH_FTYPE (1, (V4DI, V8SI))
+DEF_LARCH_FTYPE (1, (UV4DI, UV8SI))
+DEF_LARCH_FTYPE (1, (UV16HI, V32QI))
+DEF_LARCH_FTYPE (1, (UV8SI, V32QI))
+DEF_LARCH_FTYPE (1, (UV4DI, V32QI))
 DEF_LARCH_FTYPE (1, (UV8HI, V16QI))
 DEF_LARCH_FTYPE (1, (UV4SI, V16QI))
 DEF_LARCH_FTYPE (1, (UV2DI, V16QI))
+DEF_LARCH_FTYPE (1, (UV8SI, V16HI))
+DEF_LARCH_FTYPE (1, (UV4DI, V16HI))
 DEF_LARCH_FTYPE (1, (UV4SI, V8HI))
 DEF_LARCH_FTYPE (1, (UV2DI, V8HI))
 DEF_LARCH_FTYPE (1, (UV2DI, V4SI))
+DEF_LARCH_FTYPE (1, (UV4DI, V8SI))
 DEF_LARCH_FTYPE (1, (UV8HI, UV16QI))
 DEF_LARCH_FTYPE (1, (UV4SI, UV16QI))
 DEF_LARCH_FTYPE (1, (UV2DI, UV16QI))
+DEF_LARCH_FTYPE (1, (UV4DI, UV32QI))
 DEF_LARCH_FTYPE (1, (UV4SI, UV8HI))
 DEF_LARCH_FTYPE (1, (UV2DI, UV8HI))
 DEF_LARCH_FTYPE (1, (UV2DI, UV4SI))
 DEF_LARCH_FTYPE (2, (UV8HI, V16QI, V16QI))
 DEF_LARCH_FTYPE (2, (UV4SI, V8HI, V8HI))
 DEF_LARCH_FTYPE (2, (UV2DI, V4SI, V4SI))
+DEF_LARCH_FTYPE (2, (V16HI, V32QI, UQI))
+DEF_LARCH_FTYPE (2, (V8SI, V16HI, UQI))
+DEF_LARCH_FTYPE (2, (V4DI, V8SI, UQI))
 DEF_LARCH_FTYPE (2, (V8HI, V16QI, UQI))
 DEF_LARCH_FTYPE (2, (V4SI, V8HI, UQI))
 DEF_LARCH_FTYPE (2, (V2DI, V4SI, UQI))
+DEF_LARCH_FTYPE (2, (UV16HI, UV32QI, UQI))
+DEF_LARCH_FTYPE (2, (UV8SI, UV16HI, UQI))
+DEF_LARCH_FTYPE (2, (UV4DI, UV8SI, UQI))
 DEF_LARCH_FTYPE (2, (UV8HI, UV16QI, UQI))
 DEF_LARCH_FTYPE (2, (UV4SI, UV8HI, UQI))
 DEF_LARCH_FTYPE (2, (UV2DI, UV4SI, UQI))
+DEF_LARCH_FTYPE (2, (V32QI, V16HI, V16HI))
+DEF_LARCH_FTYPE (2, (V16HI, V8SI, V8SI))
+DEF_LARCH_FTYPE (2, (V8SI, V4DI, V4DI))
 DEF_LARCH_FTYPE (2, (V16QI, V8HI, V8HI))
 DEF_LARCH_FTYPE (2, (V8HI, V4SI, V4SI))
 DEF_LARCH_FTYPE (2, (V4SI, V2DI, V2DI))
+DEF_LARCH_FTYPE (2, (UV32QI, UV16HI, UV16HI))
+DEF_LARCH_FTYPE (2, (UV16HI, UV8SI, UV8SI))
+DEF_LARCH_FTYPE (2, (UV8SI, UV4DI, UV4DI))
 DEF_LARCH_FTYPE (2, (UV16QI, UV8HI, UV8HI))
 DEF_LARCH_FTYPE (2, (UV8HI, UV4SI, UV4SI))
 DEF_LARCH_FTYPE (2, (UV4SI, UV2DI, UV2DI))
+DEF_LARCH_FTYPE (2, (V32QI, V16HI, UQI))
+DEF_LARCH_FTYPE (2, (V16HI, V8SI, UQI))
+DEF_LARCH_FTYPE (2, (V8SI, V4DI, UQI))
 DEF_LARCH_FTYPE (2, (V16QI, V8HI, UQI))
 DEF_LARCH_FTYPE (2, (V8HI, V4SI, UQI))
 DEF_LARCH_FTYPE (2, (V4SI, V2DI, UQI))
+DEF_LARCH_FTYPE (2, (UV32QI, UV16HI, UQI))
+DEF_LARCH_FTYPE (2, (UV16HI, UV8SI, UQI))
+DEF_LARCH_FTYPE (2, (UV8SI, UV4DI, UQI))
 DEF_LARCH_FTYPE (2, (UV16QI, UV8HI, UQI))
 DEF_LARCH_FTYPE (2, (UV8HI, UV4SI, UQI))
 DEF_LARCH_FTYPE (2, (UV4SI, UV2DI, UQI))
+DEF_LARCH_FTYPE (2, (V32QI, V32QI, DI))
 DEF_LARCH_FTYPE (2, (V16QI, V16QI, DI))
+DEF_LARCH_FTYPE (2, (V32QI, UQI, UQI))
 DEF_LARCH_FTYPE (2, (V16QI, UQI, UQI))
+DEF_LARCH_FTYPE (3, (V32QI, V32QI, UQI, UQI))
+DEF_LARCH_FTYPE (3, (V16HI, V16HI, UQI, UQI))
+DEF_LARCH_FTYPE (3, (V8SI, V8SI, UQI, UQI))
+DEF_LARCH_FTYPE (3, (V4DI, V4DI, UQI, UQI))
 DEF_LARCH_FTYPE (3, (V16QI, V16QI, UQI, UQI))
 DEF_LARCH_FTYPE (3, (V8HI, V8HI, UQI, UQI))
 DEF_LARCH_FTYPE (3, (V4SI, V4SI, UQI, UQI))
 DEF_LARCH_FTYPE (3, (V2DI, V2DI, UQI, UQI))
+DEF_LARCH_FTYPE (2, (V8SF, V4DI, V4DI))
 DEF_LARCH_FTYPE (2, (V4SF, V2DI, V2DI))
+DEF_LARCH_FTYPE (1, (V4DI, V8SF))
 DEF_LARCH_FTYPE (1, (V2DI, V4SF))
+DEF_LARCH_FTYPE (2, (V4DI, UQI, USI))
 DEF_LARCH_FTYPE (2, (V2DI, UQI, USI))
+DEF_LARCH_FTYPE (2, (V4DI, UQI, UQI))
 DEF_LARCH_FTYPE (2, (V2DI, UQI, UQI))
 DEF_LARCH_FTYPE (4, (VOID, SI, UQI, V16QI, CVPOINTER))
 DEF_LARCH_FTYPE (4, (VOID, SI, UQI, V8HI, CVPOINTER))
@@ -402,6 +615,17 @@ DEF_LARCH_FTYPE (2, (V16QI, SI, CVPOINTER))
 DEF_LARCH_FTYPE (2, (V8HI, SI, CVPOINTER))
 DEF_LARCH_FTYPE (2, (V4SI, SI, CVPOINTER))
 DEF_LARCH_FTYPE (2, (V2DI, SI, CVPOINTER))
+DEF_LARCH_FTYPE (4, (VOID, V32QI, UQI, SI, CVPOINTER))
+DEF_LARCH_FTYPE (4, (VOID, V16HI, UQI, SI, CVPOINTER))
+DEF_LARCH_FTYPE (4, (VOID, V8SI, UQI, SI, CVPOINTER))
+DEF_LARCH_FTYPE (4, (VOID, V4DI, UQI, SI, CVPOINTER))
+DEF_LARCH_FTYPE (3, (VOID, V32QI, SI, CVPOINTER))
+DEF_LARCH_FTYPE (2, (V32QI, SI, CVPOINTER))
+DEF_LARCH_FTYPE (2, (V16HI, SI, CVPOINTER))
+DEF_LARCH_FTYPE (2, (V8SI, SI, CVPOINTER))
+DEF_LARCH_FTYPE (2, (V4DI, SI, CVPOINTER))
+DEF_LARCH_FTYPE (1, (V32QI, POINTER))
+DEF_LARCH_FTYPE (2, (VOID, V32QI, POINTER))
 DEF_LARCH_FTYPE (2, (V8HI, UV16QI, V16QI))
 DEF_LARCH_FTYPE (2, (V16QI, V16QI, UV16QI))
 DEF_LARCH_FTYPE (2, (UV16QI, V16QI, UV16QI))
@@ -431,6 +655,33 @@ DEF_LARCH_FTYPE (3, (V4SI, V4SI, V16QI, V16QI))
 DEF_LARCH_FTYPE (3, (V4SI, V4SI, UV16QI, V16QI))
 DEF_LARCH_FTYPE (3, (UV4SI, UV4SI, UV16QI, UV16QI))

+
+DEF_LARCH_FTYPE(2,(V4DI,V16HI,V16HI))
+DEF_LARCH_FTYPE(2,(V4DI,UV4SI,V4SI))
+DEF_LARCH_FTYPE(2,(V8SI,UV16HI,V16HI))
+DEF_LARCH_FTYPE(2,(V16HI,UV32QI,V32QI))
+DEF_LARCH_FTYPE(2,(V4DI,UV8SI,V8SI))
+DEF_LARCH_FTYPE(3,(V4DI,V4DI,V16HI,V16HI))
+DEF_LARCH_FTYPE(2,(UV32QI,V32QI,UV32QI))
+DEF_LARCH_FTYPE(2,(UV16HI,V16HI,UV16HI))
+DEF_LARCH_FTYPE(2,(UV8SI,V8SI,UV8SI))
+DEF_LARCH_FTYPE(2,(UV4DI,V4DI,UV4DI))
+DEF_LARCH_FTYPE(3,(V4DI,V4DI,UV4DI,V4DI))
+DEF_LARCH_FTYPE(3,(V4DI,V4DI,UV8SI,V8SI))
+DEF_LARCH_FTYPE(3,(V8SI,V8SI,UV16HI,V16HI))
+DEF_LARCH_FTYPE(3,(V16HI,V16HI,UV32QI,V32QI))
+DEF_LARCH_FTYPE(2,(V4DI,UV4DI,V4DI))
+DEF_LARCH_FTYPE(2,(V8SI,V32QI,V32QI))
+DEF_LARCH_FTYPE(2,(UV4DI,UV16HI,UV16HI))
+DEF_LARCH_FTYPE(2,(V4DI,UV16HI,V16HI))
+DEF_LARCH_FTYPE(3,(V8SI,V8SI,V32QI,V32QI))
+DEF_LARCH_FTYPE(3,(UV8SI,UV8SI,UV32QI,UV32QI))
+DEF_LARCH_FTYPE(3,(UV4DI,UV4DI,UV16HI,UV16HI))
+DEF_LARCH_FTYPE(3,(V8SI,V8SI,UV32QI,V32QI))
+DEF_LARCH_FTYPE(3,(V4DI,V4DI,UV16HI,V16HI))
+DEF_LARCH_FTYPE(2,(UV8SI,UV32QI,UV32QI))
+DEF_LARCH_FTYPE(2,(V8SI,UV32QI,V32QI))
+
 DEF_LARCH_FTYPE(4,(VOID,V16QI,CVPOINTER,SI,UQI))
 DEF_LARCH_FTYPE(4,(VOID,V8HI,CVPOINTER,SI,UQI))
 DEF_LARCH_FTYPE(4,(VOID,V4SI,CVPOINTER,SI,UQI))
@@ -448,11 +699,29 @@ DEF_LARCH_FTYPE (3, (UV8HI, UV8HI, V8HI, USI))
 DEF_LARCH_FTYPE (3, (UV4SI, UV4SI, V4SI, USI))
 DEF_LARCH_FTYPE (3, (UV2DI, UV2DI, V2DI, USI))

+DEF_LARCH_FTYPE (2, (DI, V8SI, UQI))
+DEF_LARCH_FTYPE (2, (UDI, V8SI, UQI))
+
+DEF_LARCH_FTYPE (3, (UV32QI, UV32QI, V32QI, USI))
+DEF_LARCH_FTYPE (3, (UV16HI, UV16HI, V16HI, USI))
+DEF_LARCH_FTYPE (3, (UV8SI, UV8SI, V8SI, USI))
+DEF_LARCH_FTYPE (3, (UV4DI, UV4DI, V4DI, USI))
+
+DEF_LARCH_FTYPE(4,(VOID,V32QI,CVPOINTER,SI,UQI))
+DEF_LARCH_FTYPE(4,(VOID,V16HI,CVPOINTER,SI,UQI))
+DEF_LARCH_FTYPE(4,(VOID,V8SI,CVPOINTER,SI,UQI))
+DEF_LARCH_FTYPE(4,(VOID,V4DI,CVPOINTER,SI,UQI))
+
 DEF_LARCH_FTYPE (1, (BOOLEAN,V16QI))
 DEF_LARCH_FTYPE(2,(V16QI,CVPOINTER,CVPOINTER))
 DEF_LARCH_FTYPE(3,(VOID,V16QI,CVPOINTER,CVPOINTER))
+DEF_LARCH_FTYPE(2,(V32QI,CVPOINTER,CVPOINTER))
+DEF_LARCH_FTYPE(3,(VOID,V32QI,CVPOINTER,CVPOINTER))

 DEF_LARCH_FTYPE (3, (V16QI, V16QI, SI, UQI))
 DEF_LARCH_FTYPE (3, (V2DI, V2DI, SI, UQI))
 DEF_LARCH_FTYPE (3, (V2DI, V2DI, DI, UQI))
 DEF_LARCH_FTYPE (3, (V4SI, V4SI, SI, UQI))
+
+DEF_LARCH_FTYPE (2, (V8SF, V8SF, UQI))
+DEF_LARCH_FTYPE (2, (V4DF, V4DF, UQI))
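
[Editor's note, not part of the patch: a minimal usage sketch of two of the
builtins registered above, under the following assumptions: a compiler with
this series applied, -mlasx enabled, and the __builtin_lasx_ prefix that
these tables use for the generated builtins (the typed intrinsic wrappers
live in the exported lasxintrin.h).  The vector typedefs and the function
name add_and_test are illustrative only.]

/* Sketch: compile with gcc -mlasx -O2.  Assumes a, b, c are 32-byte
   aligned, since the typedefs request 32-byte alignment.  */
typedef int v8i32 __attribute__ ((vector_size (32), aligned (32)));
typedef unsigned int uv8i32 __attribute__ ((vector_size (32), aligned (32)));

int
add_and_test (const int *a, const int *b, int *c)
{
  v8i32 va = *(const v8i32 *) a;   /* Eight int32 lanes per 256-bit register.  */
  v8i32 vb = *(const v8i32 *) b;
  /* LARCH_V8SI_FTYPE_V8SI_V8SI: a V8SI result from two V8SI operands.  */
  v8i32 vc = __builtin_lasx_xvadd_w (va, vb);
  *(v8i32 *) c = vc;
  /* LARCH_SI_FTYPE_UV8SI: the xbnz_w test-branch builtin yields an SI flag
     (non-zero if any lane is non-zero); the same-size vector cast merely
     reinterprets the lanes as unsigned.  */
  return __builtin_lasx_xbnz_w ((uv8i32) vc);
}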