[Image Denoising] Paper Reproduction Challenge, Round 7: SwinIR


Paper reproduction: low-level vision algorithm SwinIR (denoising)

SwinIR: Image Restoration Using Swin Transformer, a strong baseline for image restoration built on the Swin Transformer

Official source code: https://github.com/JingyunLiang/SwinIR

Reproduction code: https://github.com/sldyns/SwinIR_paddle

Cluster script task: https://aistudio.baidu.com/aistudio/clusterprojectdetail/3792518

PaddleGAN version: https://github.com/PaddlePaddle/PaddleGAN/blob/develop/docs/zh_CN/tutorials/swinir.md

1. Introduction

SwinIR's structure is fairly simple; if you have read the Swin Transformer paper there is nothing difficult about it. The authors apply the Swin Transformer (Swin-T) structure to low-level vision tasks, including image super-resolution, image denoising, and compression-artifact removal. The SwinIR network consists of a shallow feature extraction module, a deep feature extraction module, and a reconstruction module. The reconstruction module uses a different structure for each task. Shallow feature extraction is a single 3×3 convolutional layer. Deep feature extraction consists of K RSTB blocks plus one convolutional layer with a residual connection. Each RSTB (Residual Swin Transformer Block) is built from L STLs (Swin Transformer Layers) followed by a convolutional layer and a residual connection.
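To make the layout concrete, here is a minimal structural sketch in Paddle. It is not the repository's implementation: the real RSTB contains L Swin Transformer layers with window attention, which are stubbed out with a convolution here.

import paddle
import paddle.nn as nn

class RSTB(nn.Layer):
    """Stand-in for a Residual Swin Transformer Block: the real block is
    L Swin Transformer Layers (STL) followed by a conv, plus a residual."""
    def __init__(self, dim):
        super().__init__()
        self.body = nn.Conv2D(dim, dim, 3, padding=1)  # placeholder for L STLs + conv

    def forward(self, x):
        return x + self.body(x)  # residual connection inside each RSTB

class SwinIRSkeleton(nn.Layer):
    """High-level SwinIR layout for denoising, matching the description above."""
    def __init__(self, in_chans=3, embed_dim=180, num_rstb=6):
        super().__init__()
        self.conv_first = nn.Conv2D(in_chans, embed_dim, 3, padding=1)   # shallow feature extraction
        self.layers = nn.LayerList([RSTB(embed_dim) for _ in range(num_rstb)])
        self.conv_after_body = nn.Conv2D(embed_dim, embed_dim, 3, padding=1)
        self.conv_last = nn.Conv2D(embed_dim, in_chans, 3, padding=1)    # reconstruction (denoising head)

    def forward(self, x):
        shallow = self.conv_first(x)
        deep = shallow
        for layer in self.layers:                     # deep feature extraction: K RSTBs
            deep = layer(deep)
        deep = self.conv_after_body(deep) + shallow   # conv + long residual connection
        return self.conv_last(deep)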

2. Reproduction Accuracy

Evaluated on the CBSD68 test set, the reproduction reaches the minimum acceptance threshold of 34.32 dB:

PSNR (dB) for SwinIR at noise level 15 on CBSD68:

    PyTorch: 34.42
    Paddle:  34.32

Note: the official code trains for 1,600,000 iterations on 8 GPUs; our 4-GPU run only reached 426,000 iterations before the job hit the time limit.
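For reference, the number above is a PSNR averaged over the 68 test images. The actual evaluation lives in main_test_swinir.py; below is only a minimal sketch of the metric using scikit-image (the array names are illustrative).

import numpy as np
from skimage.metrics import peak_signal_noise_ratio

def average_psnr(outputs, targets):
    """outputs/targets: lists of uint8 HxWx3 images (restored vs. ground truth)."""
    scores = [peak_signal_noise_ratio(gt, out, data_range=255)
              for out, gt in zip(outputs, targets)]
    return float(np.mean(scores))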

3. Datasets, Pretrained Models, and File Structure

3.1 Training Data

DIV2K (800 training images) + Flickr2K (2650 images) + BSD500 (400 training & testing images) + WED (4744 images)

The prepared data has been uploaded to AI Studio.

Training data goes under data/trainsets/trainH.
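Only clean images are stored; noisy training inputs are synthesized on the fly by adding white Gaussian noise with fixed sigma = 15 (the dataset log in section 5.1 reports "Denosing on AWGN with fixed sigma. Only dataroot_H is needed."). A minimal sketch of that pairing, with illustrative names:

import numpy as np

def make_training_pair(clean, sigma=15):
    """clean: float32 array scaled to [0, 1]; returns (noisy input, target)."""
    noise = np.random.normal(0.0, sigma / 255.0, clean.shape).astype(np.float32)
    return clean + noise, clean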

3.2 Test Data

The test set is CBSD68, also uploaded to AI Studio.

Unzip it to data/testsets/CBSD68.

In [2]

# Unzip the datasets
!cd data && unzip -oq -d testsets/ data147756/CBSD68.zip
!cd data && unzip -oq -d trainsets/ data149405/trainH.zip

In [3]

# Create symlinks so work/ sees the data directories
!cd work && ln -s ../data/trainsets trainsets && ln -s ../data/testsets testsets

3.3 Pretrained Models

Both are provided in the folder work/pretrained_models:

The official pretrained model, converted to Paddle format: 005_colorDN_DFWB_s128w8_SwinIR-M_noise15.pdparams
The model reproduced in this project: SwinIR_paddle.pdparams
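Either checkpoint can be restored with Paddle's standard state-dict API. A minimal sketch, assuming `model` is a SwinIR network already built with the same hyperparameters as the options file (see work/models):

import paddle

state_dict = paddle.load("work/pretrained_models/SwinIR_paddle.pdparams")
model.set_state_dict(state_dict)  # `model`: pre-built SwinIR instance
model.eval()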

3.4 File Structure

SwinIR_Paddle
    |-- data                         # data-related files
    |-- models                       # model definitions
    |-- options                      # training configuration files
    |-- trainsets
    |    |-- trainH                  # training data
    |-- testsets
    |    |-- CBSD68                  # test data
    |-- test_tipc                    # TIPC: basic Linux GPU/CPU train/infer tests
    |-- pretrained_models            # pretrained models
    |-- utils                        # utility code
    |-- config.py                    # configuration file
    |-- generate_patches_SIDD.py     # generate data patches
    |-- infer.py                     # model inference code
    |-- LICENSE                      # LICENSE file
    |-- main_test_swinir.py          # model testing code
    |-- main_train_psnr.py           # model training code
    |-- main_train_tipc.py           # TIPC training code
    |-- README.md                    # README.md file
    |-- train.log                    # training log

4. Environment Dependencies

PaddlePaddle >= 2.3.2

scikit-image == 0.19.3

In [1]

!pip install scikit-image

       

Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting scikit-image
  Downloading scikit_image-0.19.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (13.5 MB)
Collecting tifffile>=2019.7.26
Collecting PyWavelets>=1.1.1
(already-satisfied requirements and download progress trimmed)
Successfully installed PyWavelets-1.3.0 scikit-image-0.19.3 tifffile-2021.11.2

5. Quick Start

The configuration files live under work/options; you can adjust the learning rate, batch_size, and other parameters there.
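The options files are plain JSON, so they can be edited directly or programmatically. A sketch, assuming the nesting matches the config dump in the training log below (keys and default values are taken from that log):

import json

path = "work/options/train_swinir_multi_card_32.json"
with open(path) as f:
    opt = json.load(f)

opt["train"]["G_optimizer_lr"] = 1e-4                   # initial learning rate (default 2e-4)
opt["datasets"]["train"]["dataloader_batch_size"] = 4   # per-GPU batch size (default 2)

with open(path, "w") as f:
    json.dump(opt, f, indent=2)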

5.1 Model Training

For the best experience, single-machine multi-GPU training is recommended, e.g. fork and run the cluster script task: https://aistudio.baidu.com/aistudio/clusterprojectdetail/3792518 (a possible launch command is sketched below).
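On a multi-GPU machine the same entry point can presumably be started with Paddle's distributed launcher; the flags below are standard paddle.distributed.launch usage, not taken from the script task itself:

# Multi-GPU launch (sketch)
!cd work && python -m paddle.distributed.launch --gpus "0,1,2,3" main_train_psnr.py --opt options/train_swinir_multi_card_32.json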

In [ ]

# Single machine, single GPU
!cd work && python main_train_psnr.py --opt options/train_swinir_multi_card_32.json

       

(matplotlib DeprecationWarnings about importing ABCs from 'collections' trimmed)
export CUDA_VISIBLE_DEVICES=0,1,2,3
number of GPUs is: 4
LogHandlers setup!
22-10-17 16:47:13.877 : task: swinir_denoising_color_15 | model: plain | gpu_ids: [0, 1, 2, 3]
  datasets.train: dataset_type: dncnn, dataroot_H: trainsets/trainH, H_size: 128, sigma: 15, dataloader_batch_size: 2, dataloader_num_workers: 8
  datasets.test:  dataset_type: dncnn, dataroot_H: testsets/CBSD68, sigma_test: 15
  netG: net_type: swinir, img_size: 128, window_size: 8, depths: [6, 6, 6, 6, 6, 6], embed_dim: 180, num_heads: [6, 6, 6, 6, 6, 6], mlp_ratio: 2, resi_connection: 1conv
  train: G_lossfn_type: charbonnier, G_optimizer_type: adam, G_optimizer_lr: 0.0002, G_scheduler_type: MultiStepLR, G_scheduler_milestones: [800000, 1200000, 1400000, 1500000, 1600000], G_scheduler_gamma: 0.5, E_decay: 0.999, manual_seed: 42
Random seed: 42
Dataset: Denoising on AWGN with fixed sigma. Only dataroot_H is needed.
Dataset [DatasetDnCNN - train_dataset] is created.
22-10-17 16:47:13.913 : Number of train images: 8,694, iters: 4,347
Dataset [DatasetDnCNN - test_dataset] is created.
Training model [ModelPlain] is created.
Copying model for E ...
22-10-17 16:47:15.620 : Networks name: SwinIR
Params number: 11,504,163
Net structure: SwinIR(conv_first: Conv2D 3->180; 6 x RSTB, each a BasicLayer of 6 SwinTransformerBlocks (dim=180, window_size=8, shift_size alternating 0/4, num_heads=6, mlp_ratio=2) followed by a 3x3 Conv2D; LayerNorm; conv_after_body: Conv2D 180->180; conv_last: Conv2D 180->3)
[full per-block structure and per-parameter statistics omitted; the original cell output is truncated at this point]
0.040 |  0.018 | (180, 180) || layers.1.residual_group.blocks.4.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.1.residual_group.blocks.4.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.1.residual_group.blocks.4.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.1.residual_group.blocks.4.norm2.bias | -0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.1.residual_group.blocks.4.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.1.residual_group.blocks.4.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.1.residual_group.blocks.4.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.1.residual_group.blocks.4.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.1.residual_group.blocks.5.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.1.residual_group.blocks.5.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.1.residual_group.blocks.5.norm1.bias |  0.000 | -0.039 |  0.040 |  0.018 | (225, 6) || layers.1.residual_group.blocks.5.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.1.residual_group.blocks.5.attn.relative_position_index |  0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.1.residual_group.blocks.5.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.1.residual_group.blocks.5.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.1.residual_group.blocks.5.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.1.residual_group.blocks.5.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.1.residual_group.blocks.5.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.1.residual_group.blocks.5.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.1.residual_group.blocks.5.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.1.residual_group.blocks.5.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.1.residual_group.blocks.5.mlp.fc2.weight | -0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.1.residual_group.blocks.5.mlp.fc2.bias |  0.000 | -0.155 |  0.147 |  0.035 | (180, 180, 3, 3) || layers.1.conv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.1.conv.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.0.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.0.norm1.bias | -0.000 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.2.residual_group.blocks.0.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.2.residual_group.blocks.0.attn.relative_position_index |  0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.2.residual_group.blocks.0.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.2.residual_group.blocks.0.attn.qkv.bias |  0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.2.residual_group.blocks.0.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.0.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.0.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.0.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.2.residual_group.blocks.0.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || 
layers.2.residual_group.blocks.0.mlp.fc1.bias |  0.001 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.2.residual_group.blocks.0.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.0.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.2.residual_group.blocks.1.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.1.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.1.norm1.bias |  0.000 | -0.039 |  0.039 |  0.017 | (225, 6) || layers.2.residual_group.blocks.1.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.2.residual_group.blocks.1.attn.relative_position_index |  0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.2.residual_group.blocks.1.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.2.residual_group.blocks.1.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.2.residual_group.blocks.1.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.1.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.1.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.1.norm2.bias | -0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.2.residual_group.blocks.1.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.2.residual_group.blocks.1.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.2.residual_group.blocks.1.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.1.mlp.fc2.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.2.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.2.norm1.bias | -0.000 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.2.residual_group.blocks.2.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.2.residual_group.blocks.2.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.2.residual_group.blocks.2.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.2.residual_group.blocks.2.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.2.residual_group.blocks.2.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.2.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.2.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.2.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.2.residual_group.blocks.2.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.2.residual_group.blocks.2.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.2.residual_group.blocks.2.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.2.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.2.residual_group.blocks.3.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.3.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.3.norm1.bias | -0.000 | -0.039 |  0.039 |  0.018 | (225, 6) || layers.2.residual_group.blocks.3.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | 
(64, 64) || layers.2.residual_group.blocks.3.attn.relative_position_index |  0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.2.residual_group.blocks.3.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.2.residual_group.blocks.3.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.2.residual_group.blocks.3.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.3.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.3.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.3.norm2.bias | -0.001 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.2.residual_group.blocks.3.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.2.residual_group.blocks.3.mlp.fc1.bias | -0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.2.residual_group.blocks.3.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.3.mlp.fc2.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.4.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.4.norm1.bias | -0.000 | -0.040 |  0.040 |  0.017 | (225, 6) || layers.2.residual_group.blocks.4.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.2.residual_group.blocks.4.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.2.residual_group.blocks.4.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.2.residual_group.blocks.4.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.2.residual_group.blocks.4.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.4.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.4.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.4.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.2.residual_group.blocks.4.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.2.residual_group.blocks.4.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.2.residual_group.blocks.4.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.4.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.2.residual_group.blocks.5.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.5.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.5.norm1.bias |  0.000 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.2.residual_group.blocks.5.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.2.residual_group.blocks.5.attn.relative_position_index |  0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.2.residual_group.blocks.5.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.2.residual_group.blocks.5.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.2.residual_group.blocks.5.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.5.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.2.residual_group.blocks.5.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.5.norm2.bias |  0.000 | -0.105 |  0.105 |  
0.061 | (180, 360) || layers.2.residual_group.blocks.5.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.2.residual_group.blocks.5.mlp.fc1.bias | -0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.2.residual_group.blocks.5.mlp.fc2.weight | -0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.2.residual_group.blocks.5.mlp.fc2.bias |  0.000 | -0.159 |  0.157 |  0.035 | (180, 180, 3, 3) || layers.2.conv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.2.conv.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.0.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.0.norm1.bias | -0.000 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.3.residual_group.blocks.0.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.3.residual_group.blocks.0.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.3.residual_group.blocks.0.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.3.residual_group.blocks.0.attn.qkv.bias |  0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.3.residual_group.blocks.0.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.0.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.0.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.0.norm2.bias | -0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.3.residual_group.blocks.0.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.3.residual_group.blocks.0.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.3.residual_group.blocks.0.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.0.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.3.residual_group.blocks.1.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.1.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.1.norm1.bias | -0.001 | -0.039 |  0.040 |  0.018 | (225, 6) || layers.3.residual_group.blocks.1.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.3.residual_group.blocks.1.attn.relative_position_index |  0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.3.residual_group.blocks.1.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.3.residual_group.blocks.1.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.3.residual_group.blocks.1.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.1.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.1.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.1.norm2.bias | -0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.3.residual_group.blocks.1.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.3.residual_group.blocks.1.mlp.fc1.bias |  0.001 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.3.residual_group.blocks.1.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.1.mlp.fc2.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.2.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || 
layers.3.residual_group.blocks.2.norm1.bias | -0.001 | -0.040 |  0.040 |  0.017 | (225, 6) || layers.3.residual_group.blocks.2.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.3.residual_group.blocks.2.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.3.residual_group.blocks.2.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.3.residual_group.blocks.2.attn.qkv.bias |  0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.3.residual_group.blocks.2.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.2.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.2.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.2.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.3.residual_group.blocks.2.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.3.residual_group.blocks.2.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.3.residual_group.blocks.2.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.2.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.3.residual_group.blocks.3.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.3.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.3.norm1.bias |  0.000 | -0.040 |  0.039 |  0.018 | (225, 6) || layers.3.residual_group.blocks.3.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.3.residual_group.blocks.3.attn.relative_position_index |  0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.3.residual_group.blocks.3.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.3.residual_group.blocks.3.attn.qkv.bias |  0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.3.residual_group.blocks.3.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.3.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.3.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.3.norm2.bias | -0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.3.residual_group.blocks.3.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.3.residual_group.blocks.3.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.3.residual_group.blocks.3.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.3.mlp.fc2.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.4.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.4.norm1.bias |  0.000 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.3.residual_group.blocks.4.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.3.residual_group.blocks.4.attn.relative_position_index |  0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.3.residual_group.blocks.4.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.3.residual_group.blocks.4.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.3.residual_group.blocks.4.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.4.attn.proj.bias |  1.000 |  
1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.4.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.4.norm2.bias | -0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.3.residual_group.blocks.4.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.3.residual_group.blocks.4.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.3.residual_group.blocks.4.mlp.fc2.weight | -0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.4.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.3.residual_group.blocks.5.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.5.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.5.norm1.bias |  0.000 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.3.residual_group.blocks.5.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.3.residual_group.blocks.5.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.3.residual_group.blocks.5.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.3.residual_group.blocks.5.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.3.residual_group.blocks.5.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.5.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.3.residual_group.blocks.5.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.5.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.3.residual_group.blocks.5.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.3.residual_group.blocks.5.mlp.fc1.bias | -0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.3.residual_group.blocks.5.mlp.fc2.weight | -0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.3.residual_group.blocks.5.mlp.fc2.bias | -0.000 | -0.166 |  0.169 |  0.035 | (180, 180, 3, 3) || layers.3.conv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.3.conv.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.0.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.0.norm1.bias |  0.001 | -0.040 |  0.040 |  0.017 | (225, 6) || layers.4.residual_group.blocks.0.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.4.residual_group.blocks.0.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.4.residual_group.blocks.0.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.4.residual_group.blocks.0.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.4.residual_group.blocks.0.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.0.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.0.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.0.norm2.bias | -0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.4.residual_group.blocks.0.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.4.residual_group.blocks.0.mlp.fc1.bias | -0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.4.residual_group.blocks.0.mlp.fc2.weight | -0.000 | -0.000 |  0.000 |  0.000 | (180,) || 
layers.4.residual_group.blocks.0.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.4.residual_group.blocks.1.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.1.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.1.norm1.bias | -0.001 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.4.residual_group.blocks.1.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.4.residual_group.blocks.1.attn.relative_position_index |  0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.4.residual_group.blocks.1.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.4.residual_group.blocks.1.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.4.residual_group.blocks.1.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.1.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.1.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.1.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.4.residual_group.blocks.1.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.4.residual_group.blocks.1.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.4.residual_group.blocks.1.mlp.fc2.weight | -0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.1.mlp.fc2.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.2.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.2.norm1.bias | -0.000 | -0.040 |  0.039 |  0.018 | (225, 6) || layers.4.residual_group.blocks.2.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.4.residual_group.blocks.2.attn.relative_position_index |  0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.4.residual_group.blocks.2.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.4.residual_group.blocks.2.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.4.residual_group.blocks.2.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.2.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.2.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.2.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.4.residual_group.blocks.2.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.4.residual_group.blocks.2.mlp.fc1.bias | -0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.4.residual_group.blocks.2.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.2.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.4.residual_group.blocks.3.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.3.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.3.norm1.bias | -0.001 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.4.residual_group.blocks.3.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.4.residual_group.blocks.3.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.4.residual_group.blocks.3.attn.qkv.weight |  0.000 |  0.000 |  
0.000 |  0.000 | (540,) || layers.4.residual_group.blocks.3.attn.qkv.bias |  0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.4.residual_group.blocks.3.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.3.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.3.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.3.norm2.bias | -0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.4.residual_group.blocks.3.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.4.residual_group.blocks.3.mlp.fc1.bias | -0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.4.residual_group.blocks.3.mlp.fc2.weight | -0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.3.mlp.fc2.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.4.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.4.norm1.bias |  0.000 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.4.residual_group.blocks.4.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.4.residual_group.blocks.4.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.4.residual_group.blocks.4.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.4.residual_group.blocks.4.attn.qkv.bias |  0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.4.residual_group.blocks.4.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.4.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.4.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.4.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.4.residual_group.blocks.4.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.4.residual_group.blocks.4.mlp.fc1.bias | -0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.4.residual_group.blocks.4.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.4.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.4.residual_group.blocks.5.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.5.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.5.norm1.bias |  0.000 | -0.040 |  0.039 |  0.017 | (225, 6) || layers.4.residual_group.blocks.5.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.4.residual_group.blocks.5.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.4.residual_group.blocks.5.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.4.residual_group.blocks.5.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.4.residual_group.blocks.5.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.5.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.4.residual_group.blocks.5.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.5.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.4.residual_group.blocks.5.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.4.residual_group.blocks.5.mlp.fc1.bias | -0.000 | -0.105 |  0.105 |  
0.061 | (360, 180) || layers.4.residual_group.blocks.5.mlp.fc2.weight | -0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.4.residual_group.blocks.5.mlp.fc2.bias | -0.000 | -0.148 |  0.169 |  0.035 | (180, 180, 3, 3) || layers.4.conv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.4.conv.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.0.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.0.norm1.bias |  0.001 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.5.residual_group.blocks.0.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.5.residual_group.blocks.0.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.5.residual_group.blocks.0.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.5.residual_group.blocks.0.attn.qkv.bias |  0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.5.residual_group.blocks.0.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.0.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.0.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.0.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.5.residual_group.blocks.0.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.5.residual_group.blocks.0.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.5.residual_group.blocks.0.mlp.fc2.weight | -0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.0.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.5.residual_group.blocks.1.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.1.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.1.norm1.bias |  0.001 | -0.039 |  0.040 |  0.018 | (225, 6) || layers.5.residual_group.blocks.1.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.5.residual_group.blocks.1.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.5.residual_group.blocks.1.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.5.residual_group.blocks.1.attn.qkv.bias |  0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.5.residual_group.blocks.1.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.1.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.1.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.1.norm2.bias | -0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.5.residual_group.blocks.1.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.5.residual_group.blocks.1.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.5.residual_group.blocks.1.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.1.mlp.fc2.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.2.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.2.norm1.bias |  0.000 | -0.039 |  0.040 |  0.018 | (225, 6) || layers.5.residual_group.blocks.2.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || 
layers.5.residual_group.blocks.2.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.5.residual_group.blocks.2.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.5.residual_group.blocks.2.attn.qkv.bias |  0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.5.residual_group.blocks.2.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.2.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.2.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.2.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.5.residual_group.blocks.2.mlp.fc1.weight | -0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.5.residual_group.blocks.2.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.5.residual_group.blocks.2.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.2.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.5.residual_group.blocks.3.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.3.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.3.norm1.bias | -0.000 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.5.residual_group.blocks.3.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.5.residual_group.blocks.3.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.5.residual_group.blocks.3.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.5.residual_group.blocks.3.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.5.residual_group.blocks.3.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.3.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.3.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.3.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.5.residual_group.blocks.3.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.5.residual_group.blocks.3.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.5.residual_group.blocks.3.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.3.mlp.fc2.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.4.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.4.norm1.bias |  0.000 | -0.040 |  0.040 |  0.017 | (225, 6) || layers.5.residual_group.blocks.4.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.5.residual_group.blocks.4.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.5.residual_group.blocks.4.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.5.residual_group.blocks.4.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.5.residual_group.blocks.4.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.4.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.4.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.4.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | 
(180, 360) || layers.5.residual_group.blocks.4.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.5.residual_group.blocks.4.mlp.fc1.bias | -0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.5.residual_group.blocks.4.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.4.mlp.fc2.bias | -6.152 | -100.000 | -0.000 | 24.029 | (256, 64, 64) || layers.5.residual_group.blocks.5.attn_mask |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.5.norm1.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.5.norm1.bias |  0.000 | -0.040 |  0.040 |  0.018 | (225, 6) || layers.5.residual_group.blocks.5.attn.relative_position_bias_table | 112.000 |  0.000 | 224.000 | 48.713 | (64, 64) || layers.5.residual_group.blocks.5.attn.relative_position_index | -0.000 | -0.040 |  0.040 |  0.018 | (180, 540) || layers.5.residual_group.blocks.5.attn.qkv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (540,) || layers.5.residual_group.blocks.5.attn.qkv.bias | -0.000 | -0.040 |  0.040 |  0.018 | (180, 180) || layers.5.residual_group.blocks.5.attn.proj.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.5.attn.proj.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || layers.5.residual_group.blocks.5.norm2.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.5.norm2.bias |  0.000 | -0.105 |  0.105 |  0.061 | (180, 360) || layers.5.residual_group.blocks.5.mlp.fc1.weight |  0.000 | -0.000 |  0.000 |  0.000 | (360,) || layers.5.residual_group.blocks.5.mlp.fc1.bias |  0.000 | -0.105 |  0.105 |  0.061 | (360, 180) || layers.5.residual_group.blocks.5.mlp.fc2.weight |  0.000 | -0.000 |  0.000 |  0.000 | (180,) || layers.5.residual_group.blocks.5.mlp.fc2.bias | -0.000 | -0.152 |  0.160 |  0.035 | (180, 180, 3, 3) || layers.5.conv.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || layers.5.conv.bias |  1.000 |  1.000 |  1.000 |  0.000 | (180,) || norm.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || norm.bias | -0.000 | -0.166 |  0.159 |  0.035 | (180, 180, 3, 3) || conv_after_body.weight |  0.000 |  0.000 |  0.000 |  0.000 | (180,) || conv_after_body.bias |  0.000 | -0.120 |  0.133 |  0.035 | (3, 180, 3, 3) || conv_last.weight |  0.000 |  0.000 |  0.000 |  0.000 | (3,) || conv_last.bias
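Statistics like these can be printed for any paddle.nn.Layer with a small helper along the following lines (a sketch; the repo's own network-description utility may format the table differently):

In [ ]

import numpy as np

def describe(model):
    # One line per parameter: name | mean | min | max | std | shape
    for name, p in model.named_parameters():
        v = p.numpy()
        print(f'{name} | {v.mean():7.3f} | {v.min():7.3f} | {v.max():7.3f} | {v.std():7.3f} | {tuple(v.shape)}')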

In [ ]

# Single machine, four GPUs
# !cd work && python -m paddle.distributed.launch main_train_psnr.py --opt options/train_swinir_multi_card_32.json

During training, model parameters are saved under work/denoising/swinir_denoising_color_15/models/.

The training log is written to work/denoising/swinir_denoising_color_15/models/train.log.

My own four-GPU training log is provided at work/train.log.
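To monitor progress, you can list the saved checkpoints and skim the recent log lines (a minimal sketch; the checkpoint naming and the log format are whatever the training code writes, so adjust as needed):

In [ ]

# List the saved checkpoints
!ls -lh work/denoising/swinir_denoising_color_15/models/
# Peek at the most recent training-log entries
!tail -n 20 work/denoising/swinir_denoising_color_15/models/train.log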

5.2 Model evaluation and prediction

Test on the CBSD68 set with noise of level 15 added; the results are stored under work/results/swinir_color_dn_noise15/.

In [14]

!cd work && python main_test_swinir.py --task color_dn --noise 15 --model_path pretrained_models/SwinIR_paddle.pdparams --folder_gt testsets/CBSD68/

       

loading model from pretrained_models/SwinIR_paddle.pdparams
W1016 16:39:45.440834 13414 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 11.2, Runtime API Version: 11.2
W1016 16:39:45.444900 13414 gpu_resources.cc:91] device: 0, cuDNN Version: 8.2.
Testing 0 101085 - PSNR: 31.32 dB; SSIM: 0.9118; PSNR_Y: 32.88 dB; SSIM_Y: 0.9210; PSNR_B: 0.00 dB.
Testing 1 101087 - PSNR: 35.39 dB; SSIM: 0.9517; PSNR_Y: 37.14 dB; SSIM_Y: 0.9603; PSNR_B: 0.00 dB.
Testing 2 102061 - PSNR: 35.21 dB; SSIM: 0.9348; PSNR_Y: 36.97 dB; SSIM_Y: 0.9477; PSNR_B: 0.00 dB.
... (per-image results for the remaining CBSD68 images omitted) ...
Testing 66 33039 - PSNR: 30.98 dB; SSIM: 0.9585; PSNR_Y: 32.50 dB; SSIM_Y: 0.9619; PSNR_B: 0.00 dB.
Testing 67 351093 - PSNR: 32.45 dB; SSIM: 0.9607; PSNR_Y: 34.08 dB; SSIM_Y: 0.9672; PSNR_B: 0.00 dB.
results/swinir_color_dn_noise15 -- Average PSNR/SSIM(RGB): 34.32 dB; 0.9344 -- Average PSNR_Y/SSIM_Y: 36.10 dB; 0.9465

       

The key line of the output:

— Average PSNR/SSIM(RGB): 34.32 dB; 0.9344

This reaches the acceptance accuracy of 34.32 dB.
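For reference, the reported metrics can be computed with scikit-image (installed earlier). A minimal sketch, assuming uint8 RGB arrays; the repo's main_test_swinir.py may differ in details such as border handling or the Y-channel conversion behind PSNR_Y:

In [ ]

import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate_pair(clean: np.ndarray, denoised: np.ndarray):
    """PSNR/SSIM between two uint8 RGB images of identical shape."""
    psnr = peak_signal_noise_ratio(clean, denoised, data_range=255)
    # channel_axis=2 tells scikit-image 0.19 that the last axis holds the RGB channels
    ssim = structural_similarity(clean, denoised, data_range=255, channel_axis=2)
    return psnr, ssim

# Hypothetical usage (file names for illustration only):
# from PIL import Image
# clean = np.array(Image.open('testsets/CBSD68/101085.png'))
# denoised = np.array(Image.open('results/swinir_color_dn_noise15/101085.png'))
# print(evaluate_pair(clean, denoised))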

5.3 Single-image denoising test

To test denoising on a single image, first upload a picture to work/test_images.

In [13]

# Upload an image first
import os.path as osp
from IPython.display import display
from PIL import Image

img_path = 'bird.png'  # change to the name of the image you uploaded
full_img_path = osp.join(osp.abspath('work/test_images/'), img_path)
img = Image.open(full_img_path).convert('RGB')
print('The uploaded image:')
display(img)

       

The uploaded image:

       


               

You need to point the script at a clean image and/or a noisy image: you can supply only a noisy image, only a clean image, or both (see the usage sketch after this list).

Noisy image only: pass the noisy_img argument; the denoised image is output directly.

Clean image only: pass clean_img together with noisyL, the noise level (default 15); both the noised and the denoised images are output.

Noisy and clean images together: the denoised image is output directly.
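A sketch of the three invocation modes (predict_single.py, --clean_img, --noisyL, --save_images and --model_path appear elsewhere in this notebook; the --noisy_img flag spelling and the noisy-image path are assumptions based on the description above):

In [ ]

# Clean image only: noise of level 15 is added, then removed
# !cd work && python predict_single.py --clean_img test_images/bird.png --noisyL 15 --save_images --model_path pretrained_models/SwinIR_paddle.pdparams

# Noisy image only: denoise directly (flag spelling assumed from the prose)
# !cd work && python predict_single.py --noisy_img test_images/bird_noised.png --save_images --model_path pretrained_models/SwinIR_paddle.pdparams

# Both clean and noisy images: denoise directly
# !cd work && python predict_single.py --clean_img test_images/bird.png --noisy_img test_images/bird_noised.png --save_images --model_path pretrained_models/SwinIR_paddle.pdparams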

In [24]

# Clean image only, noise level 15
!cd work && python predict_single.py --clean_img $full_img_path --save_images --noisyL 15 --model_path pretrained_models/SwinIR_paddle.pdparams

       

loading model from pretrained_models/SwinIR_paddle.pdparams
W1016 17:20:03.374689 17743 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 11.2, Runtime API Version: 11.2
W1016 17:20:03.378355 17743 gpu_resources.cc:91] device: 0, cuDNN Version: 8.2.
only clean image provided, noise level is 15
PSNR on test data 34.9994
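Here "noise level 15" means additive white Gaussian noise with sigma = 15 on the 0–255 intensity scale. A minimal sketch of how such a noisy input can be synthesized (an illustration, not necessarily the exact code predict_single.py runs; the output file name is hypothetical):

In [ ]

import numpy as np
from PIL import Image

sigma = 15  # noise level on the 0-255 scale

clean = np.array(Image.open('work/test_images/bird.png').convert('RGB'), dtype=np.float32)
noisy = clean + np.random.normal(0.0, sigma, clean.shape)   # add AWGN
noisy = np.clip(noisy, 0, 255).astype(np.uint8)             # clip back to the valid 8-bit range
Image.fromarray(noisy).save('work/test_images/bird_noised_demo.png')  # hypothetical name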

       In [25]

# Inspect the denoising results
import glob
from IPython.display import display
from PIL import Image

imgs = glob.glob('work/test_images/*')
for path in imgs:
    print(path)
    img = Image.open(path)
    display(img)

       

work/test_images/bird_denoised.png

       


               

work/test_images/bird_noised.png

       


               

work/test_images/bird.png

       


               

This concludes the walkthrough of the SwinIR image-denoising reproduction.
