Reproducing U-Mamba


I'll keep updating this post as I work through the reproduction.


一. Environment Setup


Steps where errors occurred

1. Step: pip install causal-conv1d==1.1.1
Error: ERROR: Could not build wheels for causal-conv1d, which is required to install pyproject.toml-based projects
Fix: download the package first and then install it directly; if you want the latest version, you can skip step 3.
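
Installing from a prebuilt wheel looks roughly like this. This is only a sketch: the release page and the exact wheel filename (CUDA 11.8, PyTorch 2.1, Python 3.10 below) are my assumptions and have to match your own environment.

# hypothetical filename: pick the wheel that matches your CUDA / PyTorch / Python versions
wget https://github.com/Dao-AILab/causal-conv1d/releases/download/v1.1.1/causal_conv1d-1.1.1+cu118torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
pip install causal_conv1d-1.1.1+cu118torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
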
2. Step: pip install mamba-ssm
Training later failed for me with the version this installs! I uninstalled it and installed a specific version instead: mamba-ssm==1.1.1.
Error:
Guessing wheel URL: https://github.com/state-spaces/mamba/releases/download/v1.1.1/mamba_ssm-1.1.1+cu118torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
error:
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for mamba-ssm
Running setup.py clean for mamba-ssm
Failed to build mamba-ssm
ERROR: Could not build wheels for mamba-ssm, which is required to install pyproject.toml-based projects
Fix: open the URL shown in the error message, download the .whl file, and install it with pip install <path to the wheel>.
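
Concretely, using the wheel URL printed in the error output above (the filename encodes CUDA 11.8, PyTorch 2.1 and Python 3.10, so it only fits a matching environment):

wget https://github.com/state-spaces/mamba/releases/download/v1.1.1/mamba_ssm-1.1.1+cu118torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
pip install mamba_ssm-1.1.1+cu118torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl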

3. A while ago I switched to a different server and none of the problems above appeared, so I suspect the old server simply had a bad network connection. Installation on the new server ran into its own problem, though.
Problem: FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/cuda/bin/nvcc'
Fix: I don't know servers well. The new server has no /usr/local/cuda/ directory at all (the old one did). A senior labmate explained that on this machine CUDA is installed inside each user's own environment and only the driver is installed system-wide, so I installed CUDA myself.
Reference: installing CUDA on Linux without root access
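
A common way to get nvcc without root access is to install the CUDA toolkit into the conda environment itself. The following is only a sketch, assuming you use conda and want CUDA 11.8 to match the wheels above; the guide referenced above describes the route actually taken here.

# assumption: install nvcc and the CUDA toolkit into the active conda environment
conda install -c "nvidia/label/cuda-11.8.0" cuda-toolkit
export CUDA_HOME=$CONDA_PREFIX   # so build scripts can find this CUDA installation
export PATH=$CUDA_HOME/bin:$PATH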

二. Running

1. Data preparation
If you don't know how to use nnUNetv2, you can refer to this:
nnunetv2
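
If your dataset still needs preprocessing, the usual nnU-Net v2 workflow (a brief sketch based on the nnU-Net documentation, not covered in detail in this post) is to arrange it as DatasetXXX_Name under nnUNet_raw and then run:

nnUNetv2_plan_and_preprocess -d DATASET_ID --verify_dataset_integrity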

2. Training
My data was already preprocessed (3D), and I used this command:

nnUNetv2_train DATASET_ID 3d_fullres all -tr nnUNetTrainerUMambaEnc

Since the nnU-Net environment variables need to be set, I put everything into a train.sh file:

export nnUNet_results="data/nnUNet_results"   # replace these paths with your own
export nnUNet_preprocessed="data/nnUNet_raw"
export nnUNet_raw="data/nnUNet_raw"
export CUDA_VISIBLE_DEVICES=0
nnUNetv2_train 206 3d_fullres 0 -tr nnUNetTrainerUMambaBot  # 206 is the dataset ID; 0 trains a single fold (no cross-validation, the train/validation split is already fixed)

Run it with:

source U-Mamba/train.sh

3. Errors

TypeError: causal_conv1d_fwd(): incompatible function arguments. The following argument types are supported:
1. (arg0: torch.Tensor, arg1: torch.Tensor, arg2: Optional[torch.Tensor], arg3: Optional[torch.Tensor], arg4: bool) -> torch.Tensor
Fix: https://github.com/bowang-lab/U-Mamba/issues
Switch mamba-ssm to the version used in the U-Mamba authors' environment.
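
In practice that means matching the package versions, for example the 1.1.1 releases used earlier in this post; this TypeError typically comes from a version mismatch between mamba-ssm and causal-conv1d (my reading of the issue thread, so treat the exact versions as an assumption and check the authors' environment):

pip uninstall -y mamba-ssm causal-conv1d
pip install causal-conv1d==1.1.1
pip install mamba-ssm==1.1.1   # or install the matching prebuilt wheels as shown in the setup section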

三. Testing

Create a test.sh file:

nnUNetv2_predict -i INPUT_FOLDER -o OUTPUT_FOLDER -d DATASET_ID -c CONFIGURATION -tr nnUNetTrainerUMambaBot --disable_tta

Error: FileNotFoundError: [Errno 2] No such file or directory: 'U-Mamba/data/nnUNet_results/Dataset208_tubu/nnUNetTrainerUMambaBot__nnUNetPlans__3d_fullres/fold_1/checkpoint_final.pth'
Checking that directory confirms there is no checkpoint_final.pth.

Go to line 759 of umamba/nnunetv2/inference/predict_from_raw_data.py:

    parser.add_argument('-chk', type=str, required=False, default='checkpoint_final.pth',
                        help='Name of the checkpoint you want to use. Default: checkpoint_final.pth')


Change the default to checkpoint_best.pth or checkpoint_latest.pth as needed.

Note: also, my results are under fold_0, so I added -f 0 to test.sh; a sketch of the adjusted command follows below.
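
Since -chk is already a command-line argument (as the snippet above shows), an alternative to editing the source is to pass the checkpoint name on the command line. A sketch of the adjusted test.sh command, with the input/output folders and dataset ID left as placeholders:

nnUNetv2_predict -i INPUT_FOLDER -o OUTPUT_FOLDER -d DATASET_ID -c 3d_fullres -tr nnUNetTrainerUMambaBot -f 0 -chk checkpoint_best.pth --disable_tta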

Error: then came a pile of strange errors. After some fiddling it worked, though I still don't fully understand why.

Error: mkl-service + Intel(R) MKL: MKL_THREADING_LAYER=INTEL is incompatible with libgomp.so.1 library.
Try to import numpy first or set the threading layer accordingly. Set MKL_SERVICE_FORCE_INTEL to force it.
Traceback (most recent call last):
File "umamba/bin/nnUNetv2_predict", line 33, in
sys.exit(load_entry_point('nnunetv2', 'console_scripts', 'nnUNetv2_predict')())
File "U-Mamba/umamba/nnunetv2/inference/predict_from_raw_data.py", line 833, in predict_entry_point
predictor.predict_from_files(args.i, args.o, save_probabilities=args.save_probabilities,
File "U-Mamba/umamba/nnunetv2/inference/predict_from_raw_data.py", line 250, in predict_from_files
return self.predict_from_data_iterator(data_iterator, save_probabilities, num_processes_segmentation_export)
File "U-Mamba/umamba/nnunetv2/inference/predict_from_raw_data.py", line 343, in predict_from_data_iterator
for preprocessed in data_iterator:
File "U-Mamba/umamba/nnunetv2/inference/data_iterators.py", line 109, in preprocessing_iterator_fromfiles
raise RuntimeError('Background workers died. Look for the error message further up! If there is '
RuntimeError: Background workers died. Look for the error message further up! If there is none then your RAM was full and the worker was killed by the OS. Use fewer workers or get more RAM in that case!

I wasn't sure which of the two was the actual problem: the mkl-service error ("MKL_THREADING_LAYER=INTEL is incompatible with libgomp.so.1 library") or the RuntimeError about the background workers dying.

For the mkl-service error, I followed a reference and added:

export MKL_SERVICE_FORCE_INTEL=1
export MKL_THREADING_LAYER=GNU

For the RuntimeError about background workers dying, I followed an issue in the nnU-Net repository and added these options to the predict command:

-npp 1 -nps 1

It then ran successfully, although the mkl-service message ("MKL_THREADING_LAYER=INTEL is incompatible with libgomp.so.1 library") still shows up. So the problem is solved, but I don't really know why. The complete test.sh I ended up with is sketched below.
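
Putting all of the above together, my final test.sh looked roughly like this; the input/output folders and dataset ID are placeholders, and the environment variables mirror train.sh:

export nnUNet_results="data/nnUNet_results"
export nnUNet_preprocessed="data/nnUNet_raw"
export nnUNet_raw="data/nnUNet_raw"
export CUDA_VISIBLE_DEVICES=0
export MKL_SERVICE_FORCE_INTEL=1
export MKL_THREADING_LAYER=GNU
nnUNetv2_predict -i INPUT_FOLDER -o OUTPUT_FOLDER -d DATASET_ID -c 3d_fullres -tr nnUNetTrainerUMambaBot -f 0 -chk checkpoint_best.pth -npp 1 -nps 1 --disable_tta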

Note: reposted from the CSDN article by 嘻嘻嘻. (https://blog.csdn.net/qq_45738122/article/details/135760127). Copyright belongs to the original author.