Comparison with other framework APIs

Warning

MegEngine's API design follows MEP 3 – Tensor API Design Specification, aligning with the Python Array API Standard.

  • When comparing with other frameworks, an identical name does not mean the usage is identical;

  • If you need support for a new API, you can open a corresponding Issue or Pull Request on GitHub.

Note

  • You can use your browser's find-in-page function to look up an API on this page.

  • This page is not auto-generated; if you find missing or outdated content, feel free to edit it.

See also

This page is intended mainly for lookups; users with experience in other frameworks can also refer to the user migration guide.

Data structures

| NumPy   | PyTorch | MegEngine | Comment |
| ------- | ------- | --------- | ------- |
| ndarray | Tensor  | Tensor    | See "Understanding the Tensor data structure in depth" |

Global tensor operations

Creation functions

| NumPy      | PyTorch    | MegEngine  | Comment |
| ---------- | ---------- | ---------- | ------- |
| arange     | arange     | arange     |         |
| linspace   | linspace   | linspace   |         |
| eye        | eye        | eye        |         |
| zeros      | zeros      | zeros      |         |
| zeros_like | zeros_like | zeros_like |         |
| ones       | ones       | ones       |         |
| ones_like  | ones_like  | ones_like  |         |
| full       | full       | full       |         |
| full_like  | full_like  | full_like  |         |
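The NumPy column of the table above can be exercised directly; by the mapping in the table, `megengine.functional` is assumed to expose the same names with Tensor outputs. A minimal NumPy sketch:

```python
import numpy as np

# Evenly spaced values, as in arange / linspace
a = np.arange(0, 6, 2)           # [0, 2, 4]
l = np.linspace(0.0, 1.0, 5)     # 5 points from 0.0 to 1.0 inclusive

# Identity matrix and constant-filled arrays
i = np.eye(3)                    # 3x3 identity
z = np.zeros((2, 3))
f = np.full((2, 2), 7.0)

# *_like variants copy shape (and dtype) from an existing array
zl = np.zeros_like(f)            # 2x2 of zeros
fl = np.full_like(z, 9.0)        # 2x3 filled with 9.0
```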

Manipulation functions

| NumPy        | PyTorch               | MegEngine    | Comment |
| ------------ | --------------------- | ------------ | ------- |
| reshape      | reshape               | reshape      |         |
| flatten      | flatten               | flatten      |         |
| broadcast_to | broadcast_to / expand | broadcast_to |         |
| expand_dims  | unsqueeze             | expand_dims  |         |
| squeeze      | squeeze               | squeeze      |         |
| concatenate  | cat                   | concat       |         |
| stack        | stack                 | stack        |         |
| split        | split                 | split        |         |
| tile         | tile                  | tile         |         |
| repeat       | repeat_interleave     | repeat       |         |
| roll         | roll                  | roll         |         |
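The shape-manipulation mappings above can be sketched with NumPy; the MegEngine names (`expand_dims`, `concat`, etc.) are assumed to behave analogously per the table:

```python
import numpy as np

x = np.arange(6).reshape(2, 3)        # reshape: same data, new shape

# expand_dims (PyTorch: unsqueeze) adds a size-1 axis; squeeze removes it
col = np.expand_dims(x[:, 0], axis=1)     # shape (2, 1)
assert np.squeeze(col, axis=1).shape == (2,)

# broadcast_to stretches size-1 axes without copying data
b = np.broadcast_to(col, (2, 3))

# concatenate (PyTorch: cat, MegEngine: concat) joins along an existing
# axis; stack creates a new one
c = np.concatenate([x, b], axis=1)    # shape (2, 6)
s = np.stack([x, b], axis=0)          # shape (2, 2, 3)
```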

Arithmetic operations

| NumPy        | PyTorch      | MegEngine | Comment        |
| ------------ | ------------ | --------- | -------------- |
| add          | add          | add       | `+` operator   |
| subtract     | sub          | sub       | `-` operator   |
| multiply     | mul          | mul       | `*` operator   |
| divide       | div          | div       | `/` operator   |
| floor_divide | floor_divide | floor_div | `//` operator  |
| negative     | neg          | neg       |                |
| absolute     | abs          | abs       |                |
| power        | pow          | pow       | `**` operator  |
| mod          | remainder    | mod       | `%` operator   |
| sqrt         | sqrt         | sqrt      |                |
| square       | square       | square    |                |
| sign         | sign         | sign      |                |
| maximum      | maximum      | maximum   |                |
| minimum      | minimum      | minimum   |                |
| round        | round        | round     |                |
| ceil         | ceil         | ceil      |                |
| floor        | floor        | floor     |                |
| clip         | clamp        | clip      |                |
| exp          | exp          | exp       |                |
| expm1        | expm1        | expm1     |                |
| log          | log          | log       |                |
| log1p        | log1p        | log1p     |                |
| logaddexp    | logaddexp    | logaddexp |                |
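The operator column maps onto the named functions; expm1/log1p also exist precisely because the naive forms lose precision. A NumPy sketch (the MegEngine functions are assumed equivalent per the table):

```python
import numpy as np

x = np.array([1.0, 4.0, 9.0])
y = np.array([2.0, 2.0, 2.0])

# Operator forms dispatch to the named functions in all three frameworks
assert np.array_equal(x + y, np.add(x, y))
assert np.array_equal(x ** 2, np.power(x, 2))
assert np.array_equal(x // y, np.floor_divide(x, y))

# expm1 / log1p are numerically safer than exp(x) - 1 / log(1 + x)
# when x is tiny: 1.0 + 1e-20 rounds to 1.0 in float64
tiny = 1e-20
assert np.log(1.0 + tiny) == 0.0     # naive form collapses to log(1)
assert np.log1p(tiny) == tiny        # log1p keeps the tiny value
```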

Trigonometric functions

| NumPy  | PyTorch | MegEngine | Comment |
| ------ | ------- | --------- | ------- |
| sin    | sin     | sin       |         |
| cos    | cos     | cos       |         |
| tan    | tan     | tan       |         |
| arcsin | asin    | asin      |         |
| arccos | acos    | acos      |         |
| arctan | atan    | atan      |         |

Hyperbolic functions

| NumPy   | PyTorch | MegEngine | Comment |
| ------- | ------- | --------- | ------- |
| sinh    | sinh    | sinh      |         |
| cosh    | cosh    | cosh      |         |
| tanh    | tanh    | tanh      |         |
| arcsinh | asinh   | asinh     |         |
| arccosh | acosh   | acosh     |         |
| arctanh | atanh   | atanh     |         |

Bit operations

| NumPy       | PyTorch   | MegEngine   | Comment        |
| ----------- | --------- | ----------- | -------------- |
| left_shift  | Not Found | left_shift  | `<<` operator  |
| right_shift | Not Found | right_shift | `>>` operator  |

Logic functions

| NumPy         | PyTorch       | MegEngine     | Comment       |
| ------------- | ------------- | ------------- | ------------- |
| isnan         | isnan         | isnan         |               |
| isinf         | isinf         | isinf         |               |
| logical_and   | Not Found     | logical_and   | `&` operator  |
| logical_not   | Not Found     | logical_not   | `~` operator  |
| logical_or    | Not Found     | logical_or    | `\|` operator |
| logical_xor   | Not Found     | logical_xor   | `^` operator  |
| equal         | equal         | equal         |               |
| not_equal     | not_equal     | not_equal     |               |
| less          | less          | less          |               |
| less_equal    | less_equal    | less_equal    |               |
| greater       | greater       | greater       |               |
| greater_equal | greater_equal | greater_equal |               |
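The logical and comparison functions all return elementwise boolean masks, and the operator column maps to the `logical_*` family. A NumPy sketch (the MegEngine names are assumed equivalent per the table):

```python
import numpy as np

x = np.array([0.0, np.nan, np.inf, 2.0])

# isnan / isinf share names across all three frameworks
assert np.isnan(x).tolist() == [False, True, False, False]
assert np.isinf(x).tolist() == [False, False, True, False]

# On boolean arrays, the operators &, |, ~, ^ match the logical_* functions
a = np.array([True, True, False])
b = np.array([True, False, False])
assert np.array_equal(a & b, np.logical_and(a, b))
assert np.array_equal(~a, np.logical_not(a))

# Comparison functions return elementwise boolean masks, like the
# less / greater family in the table
assert np.less(np.array([1, 5]), np.array([3, 3])).tolist() == [True, False]
```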

Statistical functions

| NumPy  | PyTorch | MegEngine | Comment |
| ------ | ------- | --------- | ------- |
| sum    | sum     | sum       |         |
| prod   | prod    | prod      |         |
| mean   | mean    | mean      |         |
| min    | min     | min       |         |
| max    | max     | max       |         |
| var    | var     | var       |         |
| std    | std     | std       |         |
| cumsum | cumsum  | cumsum    |         |
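All of these reductions accept an axis argument (PyTorch calls it `dim`). A NumPy sketch of the shared behavior, which MegEngine is assumed to mirror per the table:

```python
import numpy as np

x = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# Without an axis, everything is reduced to a scalar
assert x.sum() == 21.0

# With an axis, the reduction runs along that dimension only
assert np.sum(x, axis=0).tolist() == [5.0, 7.0, 9.0]
assert np.mean(x, axis=1).tolist() == [2.0, 5.0]

# cumsum keeps the input shape, accumulating along the chosen axis
assert np.cumsum(x, axis=1).tolist() == [[1.0, 3.0, 6.0],
                                         [4.0, 9.0, 15.0]]
```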

Linear algebra functions

| NumPy     | PyTorch   | MegEngine | Comment |
| --------- | --------- | --------- | ------- |
| transpose | transpose | transpose |         |
| dot       | dot       | dot       |         |
| inv       | inv       | matinv    |         |
| matmul    | matmul    | matmul    |         |
| svd       | svd       | svd       |         |
| norm      | norm      | norm      |         |
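A NumPy sketch of the shared linear-algebra behavior; per the table, MegEngine's `matinv` is assumed to play the role of `inv`:

```python
import numpy as np

a = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.eye(2)

# matmul is the shared name; the @ operator works in all three frameworks
assert np.array_equal(a @ b, np.matmul(a, b))

# NumPy's transpose takes an axis permutation; note PyTorch's transpose
# instead takes the two dims to swap
assert np.transpose(a).tolist() == [[1.0, 3.0], [2.0, 4.0]]

# inv (MegEngine: matinv) gives the matrix inverse: a @ inv(a) == I
inv_a = np.linalg.inv(a)
assert np.allclose(a @ inv_a, np.eye(2))
```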

Indexing Functions

| NumPy           | PyTorch | MegEngine         | Comment                      |
| --------------- | ------- | ----------------- | ---------------------------- |
| take_along_axis | gather  | gather            |                              |
| put_along_axis  | scatter | scatter           |                              |
| where           | where   | where / cond_take | Depends on the arguments passed |
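`take_along_axis` and `gather` share the same per-position indexing semantics; a NumPy sketch (the PyTorch/MegEngine `gather` is assumed equivalent per the table):

```python
import numpy as np

x = np.array([[10, 20, 30],
              [40, 50, 60]])
idx = np.array([[2, 0, 1],
                [0, 0, 2]])

# take_along_axis (PyTorch/MegEngine: gather) picks one element per output
# position, indexed along the given axis
g = np.take_along_axis(x, idx, axis=1)
assert g.tolist() == [[30, 10, 20], [40, 40, 60]]

# where selects elementwise between two arrays based on a boolean mask
mask = x > 25
w = np.where(mask, x, 0)
assert w.tolist() == [[0, 0, 30], [40, 50, 60]]
```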

Searching Functions

| NumPy  | PyTorch | MegEngine | Comment |
| ------ | ------- | --------- | ------- |
| argmin | argmin  | argmin    |         |
| argmax | argmax  | argmax    |         |

Sorting Functions

| NumPy   | PyTorch | MegEngine | Comment |
| ------- | ------- | --------- | ------- |
| argsort | argsort | argsort   |         |
| sort    | sort    | sort      |         |
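The relation between the two sorting functions is the same in all three frameworks; a NumPy sketch:

```python
import numpy as np

x = np.array([3, 1, 2])

# argsort returns the indices that would sort the array; sort returns
# the sorted values. Gathering with the argsort result reproduces sort.
order = np.argsort(x)
assert np.array_equal(x[order], np.sort(x))
```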

NN Functional Operations

Convolution operations

| PyTorch           | MegEngine                | Comment |
| ----------------- | ------------------------ | ------- |
| conv1d            | conv1d                   |         |
| conv2d            | conv2d                   |         |
| conv3d            | conv3d                   |         |
| conv_transpose1d  | Not Implemented          |         |
| conv_transpose2d  | conv_transpose2d         |         |
| conv_transpose3d  | conv_transpose3d         |         |
| local_conv2d      | local_conv2d             |         |
| deformable_conv2d | deformable_conv2d        |         |
| unfold            | sliding_window           |         |
| fold              | sliding_window_transpose |         |

Pooling functions

| PyTorch               | MegEngine           | Comment |
| --------------------- | ------------------- | ------- |
| avg_pool1d            | Not Implemented     |         |
| avg_pool2d            | avg_pool2d          |         |
| avg_pool3d            | Not Implemented     |         |
| max_pool1d            | Not Implemented     |         |
| max_pool2d            | max_pool2d          |         |
| max_pool3d            | Not Implemented     |         |
| max_unpool1d          | Not Implemented     |         |
| max_unpool2d          | Not Implemented     |         |
| max_unpool3d          | Not Implemented     |         |
| lp_pool1d             | Not Implemented     |         |
| lp_pool2d             | Not Implemented     |         |
| adaptive_max_pool1d   | Not Implemented     |         |
| adaptive_max_pool2d   | adaptive_max_pool2d |         |
| adaptive_max_pool3d   | Not Implemented     |         |
| adaptive_avg_pool1d   | Not Implemented     |         |
| adaptive_avg_pool2d   | adaptive_avg_pool2d |         |
| adaptive_avg_pool3d   | Not Implemented     |         |
| fractional_max_pool2d | Not Implemented     |         |
| fractional_max_pool3d | Not Implemented     |         |

Non-linear activation functions

| PyTorch        | MegEngine       | Comment |
| -------------- | --------------- | ------- |
| threshold      | Not Implemented |         |
| relu           | relu            |         |
| hardtanh       | Not Implemented |         |
| hardswish      | hswish          |         |
| relu6          | relu6           |         |
| elu            | Not Implemented |         |
| selu           | Not Implemented |         |
| celu           | Not Implemented |         |
| leaky_relu     | leaky_relu      |         |
| prelu          | prelu           |         |
| rrelu          | Not Implemented |         |
| glu            | Not Implemented |         |
| gelu           | gelu            |         |
| logsigmoid     | logsigmoid      |         |
| hardshrink     | Not Implemented |         |
| tanhshrink     | Not Implemented |         |
| softsign       | Not Implemented |         |
| softplus       | Not Implemented |         |
| softmin        | Not Implemented |         |
| softmax        | softmax         |         |
| softshrink     | Not Implemented |         |
| gumbel_softmax | Not Implemented |         |
| log_softmax    | logsoftmax      |         |
| sigmoid        | sigmoid         |         |
| hardsigmoid    | hsigmoid        |         |
| silu           | silu            |         |

Normalization functions

| PyTorch             | MegEngine       | Comment               |
| ------------------- | --------------- | --------------------- |
| batch_norm          | batch_norm      |                       |
| group_norm          | Not Implemented | See GroupNorm         |
| instance_norm       | Not Implemented | See InstanceNorm      |
| layer_norm          | Not Implemented | See LayerNorm         |
| local_response_norm | Not Implemented | See LocalResponseNorm |
| normalize           | normalize       |                       |

Linear functions

| PyTorch  | MegEngine       | Comment |
| -------- | --------------- | ------- |
| linear   | linear          |         |
| bilinear | Not Implemented |         |

Dropout functions

| PyTorch               | MegEngine       | Comment |
| --------------------- | --------------- | ------- |
| dropout               | dropout         |         |
| alpha_dropout         | Not Implemented |         |
| feature_alpha_dropout | Not Implemented |         |
| dropout2d             | Not Implemented |         |
| dropout3d             | Not Implemented |         |

Sparse functions

| PyTorch       | MegEngine       | Comment |
| ------------- | --------------- | ------- |
| embedding     | embedding       |         |
| embedding_bag | Not Implemented |         |
| one_hot       | one_hot         |         |

Metric functions

| PyTorch           | MegEngine       | Comment |
| ----------------- | --------------- | ------- |
| pairwise_distance | Not Implemented |         |
| cosine_similarity | Not Implemented |         |
| pdist             | Not Implemented |         |

Loss functions

| PyTorch                           | MegEngine            | Comment |
| --------------------------------- | -------------------- | ------- |
| binary_cross_entropy              | binary_cross_entropy |         |
| binary_cross_entropy_with_logits  | binary_cross_entropy |         |
| poisson_nll_loss                  | Not Implemented      |         |
| cosine_embedding_loss             | Not Implemented      |         |
| cross_entropy                     | cross_entropy        |         |
| ctc_loss                          | ctc_loss             |         |
| gaussian_nll_loss                 | Not Implemented      |         |
| hinge_embedding_loss              | Not Implemented      |         |
| kl_div                            | Not Implemented      |         |
| l1_loss                           | l1_loss              |         |
| mse_loss                          | square_loss          |         |
| margin_ranking_loss               | Not Implemented      |         |
| multilabel_margin_loss            | Not Implemented      |         |
| multilabel_soft_margin_loss       | Not Implemented      |         |
| multi_margin_loss                 | Not Implemented      |         |
| nll_loss                          | Not Implemented      |         |
| huber_loss                        | Not Implemented      |         |
| smooth_l1_loss                    | Not Implemented      |         |
| soft_margin_loss                  | Not Implemented      |         |
| triplet_margin_loss               | Not Implemented      |         |
| triplet_margin_with_distance_loss | Not Implemented      |         |

NN Module

| PyTorch   | MegEngine | Comment |
| --------- | --------- | ------- |
| Parameter | Parameter |         |

Containers

| PyTorch       | MegEngine                        | Comment |
| ------------- | -------------------------------- | ------- |
| Module        | Module                           |         |
| Sequential    | Sequential                       |         |
| ModuleList    | Natively supported by MegEngine  |         |
| ModuleDict    | Natively supported by MegEngine  |         |
| ParameterList | Natively supported by MegEngine  |         |
| ParameterDict | Natively supported by MegEngine  |         |

Initialization

| PyTorch                       | MegEngine                    | Comment |
| ----------------------------- | ---------------------------- | ------- |
| calculate_gain                | calculate_gain               |         |
| _calculate_fan_in_and_fan_out | calculate_fan_in_and_fan_out |         |
| _calculate_correct_fan        | calculate_correct_fan        |         |
| uniform_                      | uniform_                     |         |
| normal_                       | normal_                      |         |
| constant_                     | fill_                        |         |
| ones_                         | ones_                        |         |
| zeros_                        | zeros_                       |         |
| eye_                          | Not Implemented              |         |
| dirac_                        | Not Implemented              |         |
| xavier_uniform_               | xavier_uniform_              |         |
| xavier_normal_                | xavier_normal_               |         |
| kaiming_uniform_              | msra_uniform_                |         |
| kaiming_normal_               | msra_normal_                 |         |
| orthogonal_                   | Not Implemented              |         |
| sparse_                       | Not Implemented              |         |

Convolution layers

| PyTorch             | MegEngine               | Comment         |
| ------------------- | ----------------------- | --------------- |
| Conv1d              | Conv1d                  | See differences |
| Conv2d              | Conv2d                  | See differences |
| Conv3d              | Conv3d                  | See differences |
| ConvTranspose1d     | Not Implemented         |                 |
| ConvTranspose2d     | ConvTranspose2d         |                 |
| ConvTranspose3d     | ConvTranspose3d         |                 |
| LazyConv1d          | Not Implemented         |                 |
| LazyConv2d          | Not Implemented         |                 |
| LazyConv3d          | Not Implemented         |                 |
| LazyConvTranspose1d | Not Implemented         |                 |
| LazyConvTranspose2d | Not Implemented         |                 |
| LazyConvTranspose3d | Not Implemented         |                 |
| LocalConv2d         | LocalConv2d             |                 |
| DeformableConv2d    | DeformableConv2d        |                 |
| Unfold              | SlidingWindow           |                 |
| Fold                | SlidingWindowTranspose  |                 |

Pooling layers

| PyTorch             | MegEngine         | Comment |
| ------------------- | ----------------- | ------- |
| MaxPool1d           | Not Implemented   |         |
| MaxPool2d           | MaxPool2d         |         |
| MaxPool3d           | Not Implemented   |         |
| MaxUnpool1d         | Not Implemented   |         |
| MaxUnpool2d         | Not Implemented   |         |
| MaxUnpool3d         | Not Implemented   |         |
| AvgPool1d           | Not Implemented   |         |
| AvgPool2d           | AvgPool2d         |         |
| AvgPool3d           | Not Implemented   |         |
| FractionalMaxPool2d | Not Implemented   |         |
| FractionalMaxPool3d | Not Implemented   |         |
| LPPool1d            | Not Implemented   |         |
| LPPool2d            | Not Implemented   |         |
| AdaptiveMaxPool1d   | Not Implemented   |         |
| AdaptiveMaxPool2d   | AdaptiveMaxPool2d |         |
| AdaptiveMaxPool3d   | Not Implemented   |         |
| AdaptiveAvgPool1d   | Not Implemented   |         |
| AdaptiveAvgPool2d   | AdaptiveAvgPool2d |         |
| AdaptiveAvgPool3d   | Not Implemented   |         |

Padding Layers

| PyTorch          | MegEngine | Comment         |
| ---------------- | --------- | --------------- |
| ReflectionPad1d  | Pad       | mode = REFLECT  |
| ReflectionPad2d  | Pad       | mode = REFLECT  |
| ReflectionPad3d  | Pad       | mode = REFLECT  |
| ReplicationPad1d | Pad       | mode = EDGE     |
| ReplicationPad2d | Pad       | mode = EDGE     |
| ReplicationPad3d | Pad       | mode = EDGE     |
| ZeroPad2d        | Pad       | mode = CONSTANT |
| ConstantPad1d    | Pad       | mode = CONSTANT |
| ConstantPad2d    | Pad       | mode = CONSTANT |
| ConstantPad3d    | Pad       | mode = CONSTANT |
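The three Pad modes in the table correspond to the padding modes NumPy uses in `np.pad`; a quick sketch of how REFLECT, EDGE, and CONSTANT differ (the MegEngine Pad module is assumed to follow the same semantics):

```python
import numpy as np

x = np.array([1, 2, 3])

# mode = CONSTANT (ZeroPad*, ConstantPad*): fill with a constant value
assert np.pad(x, 2, mode="constant").tolist() == [0, 0, 1, 2, 3, 0, 0]

# mode = EDGE (ReplicationPad*): repeat the border element
assert np.pad(x, 2, mode="edge").tolist() == [1, 1, 1, 2, 3, 3, 3]

# mode = REFLECT (ReflectionPad*): mirror around the border element
assert np.pad(x, 2, mode="reflect").tolist() == [3, 2, 1, 2, 3, 2, 1]
```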

Non-linear activation layers

| PyTorch                   | MegEngine       | Comment |
| ------------------------- | --------------- | ------- |
| ELU                       | Not Implemented |         |
| Hardshrink                | Not Implemented |         |
| Hardsigmoid               | Not Implemented |         |
| Hardtanh                  | Not Implemented |         |
| Hardswish                 | Not Implemented |         |
| LeakyReLU                 | LeakyReLU       |         |
| LogSigmoid                | Not Implemented |         |
| MultiheadAttention        | Not Implemented |         |
| PReLU                     | PReLU           |         |
| ReLU                      | ReLU            |         |
| ReLU6                     | Not Implemented |         |
| RReLU                     | Not Implemented |         |
| SELU                      | Not Implemented |         |
| CELU                      | Not Implemented |         |
| GELU                      | GELU            |         |
| Sigmoid                   | Sigmoid         |         |
| SiLU                      | SiLU            |         |
| Softplus                  | Not Implemented |         |
| Softshrink                | Not Implemented |         |
| Softsign                  | Not Implemented |         |
| Tanh                      | Not Implemented |         |
| Tanhshrink                | Not Implemented |         |
| Threshold                 | Not Implemented |         |
| Softmin                   | Not Implemented |         |
| Softmax                   | Softmax         |         |
| Softmax2d                 | Not Implemented |         |
| LogSoftmax                | Not Implemented |         |
| AdaptiveLogSoftmaxWithLoss | Not Implemented |        |

Normalization layers

| PyTorch           | MegEngine       | Comment         |
| ----------------- | --------------- | --------------- |
| BatchNorm1d       | BatchNorm1d     | See differences |
| BatchNorm2d       | BatchNorm2d     | See differences |
| BatchNorm3d       | Not Implemented |                 |
| LazyBatchNorm1d   | Not Implemented |                 |
| LazyBatchNorm2d   | Not Implemented |                 |
| LazyBatchNorm3d   | Not Implemented |                 |
| GroupNorm         | GroupNorm       |                 |
| SyncBatchNorm     | SyncBatchNorm   |                 |
| InstanceNorm1d    | Not Implemented |                 |
| InstanceNorm2d    | InstanceNorm    |                 |
| InstanceNorm3d    | Not Implemented |                 |
| LayerNorm         | LayerNorm       |                 |
| LocalResponseNorm | Not Implemented |                 |

Recurrent Layers

| PyTorch  | MegEngine       | Comment |
| -------- | --------------- | ------- |
| RNNBase  | RNNBase         |         |
| RNN      | RNN             |         |
| LSTM     | LSTM            |         |
| GRU      | Not Implemented |         |
| RNNCell  | RNNCell         |         |
| LSTMCell | LSTMCell        |         |
| GRUCell  | Not Implemented |         |

Transformer Layers

| PyTorch                 | MegEngine       | Comment |
| ----------------------- | --------------- | ------- |
| Transformer             | Not Implemented |         |
| TransformerEncoder      | Not Implemented |         |
| TransformerDecoder      | Not Implemented |         |
| TransformerEncoderLayer | Not Implemented |         |
| TransformerDecoderLayer | Not Implemented |         |

Linear layers

| PyTorch    | MegEngine       | Comment |
| ---------- | --------------- | ------- |
| Identity   | Identity        |         |
| Linear     | Linear          |         |
| Bilinear   | Not Implemented |         |
| LazyLinear | Not Implemented |         |

Dropout layers

| PyTorch             | MegEngine       | Comment |
| ------------------- | --------------- | ------- |
| Dropout             | Dropout         |         |
| Dropout1d           | Not Implemented |         |
| Dropout2d           | Not Implemented |         |
| Dropout3d           | Not Implemented |         |
| AlphaDropout        | Not Implemented |         |
| FeatureAlphaDropout | Not Implemented |         |

Sparse layers

| PyTorch      | MegEngine       | Comment |
| ------------ | --------------- | ------- |
| Embedding    | Embedding       |         |
| EmbeddingBag | Not Implemented |         |

Distance Functions

| PyTorch          | MegEngine       | Comment |
| ---------------- | --------------- | ------- |
| CosineSimilarity | Not Implemented |         |
| PairwiseDistance | Not Implemented |         |

Loss Functions

See also

Please refer to the functional implementations of the loss functions.

| PyTorch                       | MegEngine       | Comment |
| ----------------------------- | --------------- | ------- |
| L1Loss                        | Not Implemented |         |
| MSELoss                       | Not Implemented |         |
| CrossEntropyLoss              | Not Implemented |         |
| CTCLoss                       | Not Implemented |         |
| NLLLoss                       | Not Implemented |         |
| PoissonNLLLoss                | Not Implemented |         |
| KLDivLoss                     | Not Implemented |         |
| BCELoss                       | Not Implemented |         |
| BCEWithLogitsLoss             | Not Implemented |         |
| MarginRankingLoss             | Not Implemented |         |
| HingeEmbeddingLoss            | Not Implemented |         |
| MultiLabelMarginLoss          | Not Implemented |         |
| SmoothL1Loss                  | Not Implemented |         |
| SoftMarginLoss                | Not Implemented |         |
| MultiLabelSoftMarginLoss      | Not Implemented |         |
| CosineEmbeddingLoss           | Not Implemented |         |
| MultiMarginLoss               | Not Implemented |         |
| TripletMarginLoss             | Not Implemented |         |
| TripletMarginWithDistanceLoss | Not Implemented |         |

Vision functions

| PyTorch           | MegEngine       | Comment |
| ----------------- | --------------- | ------- |
| pixel_shuffle     | pixel_shuffle   |         |
| pixel_unshuffle   | Not Implemented |         |
| pad               | pad             |         |
| interpolate       | interpolate     |         |
| upsample          | interpolate     |         |
| upsample_nearest  | interpolate     |         |
| upsample_bilinear | interpolate     |         |
| grid_sample       | remap           |         |
| affine_grid       | warp_affine     |         |
| nms               | nms             |         |
| roi_align         | roi_align       |         |
| roi_pool          | roi_pooling     |         |

OpenCV Python Package

| OpenCV          | MegEngine        | Comment |
| --------------- | ---------------- | ------- |
| cvtColor        | cvt_color        |         |
| resize          | interpolate      |         |
| remap           | remap            |         |
| warpAffine      | warp_affine      |         |
| warpPerspective | warp_perspective |         |

NVIDIA

| PyTorch     | MegEngine   | Comment |
| ----------- | ----------- | ------- |
| correlation | correlation |         |
| nvof        | nvof        |         |

Note

Some APIs may not be implemented in MegEngine yet, but not every API is designed up front. Like building blocks, the existing basic APIs can be combined to compose interfaces that MegEngine does not yet provide.

For example, to answer "how to implement roll", it can be pieced together with split and concat:

import megengine.functional as F

def roll(x, shifts, axis):
    shp = x.shape
    dim = len(shp)
    if isinstance(shifts, int):
        assert isinstance(axis, int)
        shifts = [shifts]
        axis = [axis]
    assert len(shifts) == len(axis)
    y = x
    for i in range(len(shifts)):
        axis_ = axis[i]
        shift_ = shifts[i]
        axis_t_ = axis_ + dim if axis_ < 0 else axis_
        assert (
            dim > axis_t_ >= 0
        ), "axis out of range (expected to be in range of [{}, {}], but got {})".format(
            -dim, dim - 1, axis_
        )
        if shift_ == 0:
            continue
        size = shp[axis_t_]
        if shift_ > 0:
            a, b = F.split(y, [size - shift_], axis=axis_t_)
        else:
            a, b = F.split(y, [-shift_], axis=axis_t_)
        y = F.concat((b, a), axis=axis_t_)
    return y
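The same split-then-concatenate idea can be sanity-checked against NumPy's reference `roll`; a sketch of the logic (a simplified single-axis variant, not the MegEngine code above):

```python
import numpy as np

def roll_np(x, shift, axis):
    # Split the array at the pivot point, then swap the two pieces
    size = x.shape[axis]
    shift = shift % size               # normalize negative / large shifts
    if shift == 0:
        return x
    a, b = np.split(x, [size - shift], axis=axis)
    return np.concatenate((b, a), axis=axis)

x = np.arange(6).reshape(2, 3)
assert np.array_equal(roll_np(x, 1, axis=1), np.roll(x, 1, axis=1))
assert np.array_equal(roll_np(x, -2, axis=1), np.roll(x, -2, axis=1))
```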

Besides that, you can ask for help with API questions in GitHub Issues or on the forum.

We also welcome you to contribute your own API implementations to the MegEngine codebase as Pull Requests!

Note

Most of the missing loss function operators can be designed and implemented on your own.
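For instance, a `smooth_l1_loss` (listed as Not Implemented above) can be composed from the elementwise ops in the tables (abs, where, mean, all of which MegEngine provides per the mappings). A hedged NumPy sketch of the standard definition:

```python
import numpy as np

def smooth_l1_loss(pred, target, beta=1.0):
    # Quadratic near zero, linear for large residuals (Huber-style)
    diff = np.abs(pred - target)
    loss = np.where(diff < beta,
                    0.5 * diff ** 2 / beta,
                    diff - 0.5 * beta)
    return loss.mean()

pred = np.array([0.0, 2.0, 10.0])
target = np.array([0.5, 2.0, 0.0])
# residuals: 0.5 (quadratic branch), 0.0, 10.0 (linear branch)
loss = smooth_l1_loss(pred, target)
```

Translating this to MegEngine should only require swapping `np` for the corresponding `megengine.functional` calls.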