from typing import (
    List, Tuple, Optional, Union, Any, Sequence, TYPE_CHECKING
)
import operator
import itertools

import torch
from torch._C import _add_docstr
import torch.nn.functional as F
from ._lowrank import svd_lowrank, pca_lowrank
from .overrides import (
    has_torch_function, has_torch_function_unary, has_torch_function_variadic,
    handle_torch_function)
from ._jit_internal import boolean_dispatch
from ._jit_internal import _overload as overload

Tensor = torch.Tensor
from torch import _VF
__all__ = [
    'atleast_1d', 'atleast_2d', 'atleast_3d', 'align_tensors',
    'broadcast_shapes', 'broadcast_tensors', 'cartesian_prod', 'block_diag',
    'cdist', 'chain_matmul', 'einsum', 'istft', 'lu', 'norm', 'meshgrid',
    'pca_lowrank', 'split', 'stft', 'svd_lowrank', 'tensordot', 'unique',
    'unique_consecutive', 'unravel_index',
]


def broadcast_tensors(*tensors):
    r"""broadcast_tensors(*tensors) -> List of Tensors

    Broadcasts the given tensors according to :ref:`broadcasting-semantics`.

    Args:
        *tensors: any number of tensors of the same type

    .. warning::

        More than one element of a broadcasted tensor may refer to a single
        memory location. As a result, in-place operations (especially ones that
        are vectorized) may result in incorrect behavior. If you need to write
        to the tensors, please clone them first.

    Example::

        >>> x = torch.arange(3).view(1, 3)
        >>> y = torch.arange(2).view(2, 1)
        >>> a, b = torch.broadcast_tensors(x, y)
        >>> a.size()
        torch.Size([2, 3])
        >>> a
        tensor([[0, 1, 2],
                [0, 1, 2]])
    """
    # This wrapper exists to support variadic args.
    if has_torch_function(tensors):
        return handle_torch_function(broadcast_tensors, tensors, *tensors)
    return _VF.broadcast_tensors(tensors)  # type: ignore[attr-defined]


def broadcast_shapes(*shapes):
    # This wrapper exists to support variadic args.
    # TODO Move this to C++ once the jit has better support for torch.Size.
    if not torch.jit.is_tracing():
        # First pass: find the length of the longest input shape.
        max_len = 0
        for shape in shapes:
            if isinstance(shape, (int, torch.SymInt)):
                if max_len < 1:
                    max_len = 1
            elif isinstance(shape, (tuple, list)):
                s = len(shape)
                if max_len < s:
                    max_len = s
        result = [1] * max_len
        from torch.fx.experimental.symbolic_shapes import guard_size_oblivious
        # Second pass: merge each shape into `result`, right-aligned.
        for shape in shapes:
            if isinstance(shape, (int, torch.SymInt)):
                shape = (shape,)
            if isinstance(shape, (tuple, list)):
                for i in range(-1, -1 - len(shape), -1):
                    if shape[i] < 0:
                        raise RuntimeError(
                            f"Trying to create tensor with negative dimension ({shape[i]}): ({shape[i]})"
                        )
                    if guard_size_oblivious(shape[i] == 1) or guard_size_oblivious(shape[i] == result[i]):
                        continue
                    if result[i] != 1:
                        raise RuntimeError("Shape mismatch: objects cannot be broadcast to a single shape")
                    result[i] = shape[i]
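                    # Illustrative worked example (comment added for clarity, not an
                    # upstream comment): for broadcast_shapes((3, 1), (1, 1, 1)),
                    # result starts as [1, 1, 1]; merging (3, 1) right-aligned sets
                    # result[-2] = 3 and leaves result[-1] = 1, and (1, 1, 1) changes
                    # nothing, so the final shape is torch.Size([1, 3, 1]).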