import math
from collections import OrderedDict

import torch
from packaging import version
from torch import Tensor, nn

from .utils import logging


logger = logging.get_logger(__name__)


class PytorchGELUTanh(nn.Module):
    """
    A fast C implementation of the tanh approximation of the GeLU activation function. See
    https://arxiv.org/abs/1606.08415.

    This implementation is equivalent to NewGELU and FastGELU but much faster. However, it is not an exact numerical
    match due to rounding errors.
    """

    def __init__(self):
        super().__init__()
        if version.parse(torch.__version__) < version.parse("1.12.0"):
            raise ImportError(
                f"You are using torch=={torch.__version__}, but torch>=1.12.0 is required to use "
                "PytorchGELUTanh. Please upgrade torch."
            )

    def forward(self, input: Tensor) -> Tensor:
        return nn.functional.gelu(input, approximate="tanh")
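
# A minimal usage sketch (illustrative, not part of the original module).
# PytorchGELUTanh dispatches to PyTorch's built-in tanh-approximated GELU,
# roughly 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x ** 3))),
# so its output tracks the exact erf-based GELU closely but not bit-for-bit:
#
#     act = PytorchGELUTanh()
#     x = torch.randn(8)
#     approx = act(x)                      # tanh approximation
#     exact = nn.functional.gelu(x)        # exact, erf-based GELU
#     print((approx - exact).abs().max())  # tiny, but non-zero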
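
# The remainder of this module defines the other activation classes named in the
# docstring above (e.g. NewGELUActivation, FastGELUActivation) and a
# ClassInstantier(OrderedDict) registry. A condensed sketch of that registry
# pattern (entries abridged; the exact set of keys varies across transformers
# releases):
#
#     ACT2CLS = {
#         "gelu": GELUActivation,
#         "gelu_10": (ClippedGELUActivation, {"min": -10, "max": 10}),
#         "gelu_python": (GELUActivation, {"use_gelu_python": True}),
#         "gelu_pytorch_tanh": PytorchGELUTanh,
#         "relu": nn.ReLU,
#         "silu": nn.SiLU,
#         "tanh": nn.Tanh,
#         # ... further entries ...
#     }
#     ACT2FN = ClassInstantier(ACT2CLS)
#
#     def get_activation(activation_string):
#         if activation_string in ACT2FN:
#             return ACT2FN[activation_string]
#         raise KeyError(f"function {activation_string} not found in ACT2FN mapping {list(ACT2FN.keys())}")
#
#     gelu = get_activation("gelu")
#     quick_gelu = get_activation("quick_gelu")
#     silu = get_activation("silu")
#     # ... plus similar module-level shortcuts (gelu_new, gelu_fast, mish, linear_act, ...)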