
DistributedDataParallel non-floating point dtype parameter with requires_grad=False · Issue #32018 · pytorch/pytorch · GitHub


🐛 Bug Using DistributedDataParallel on a model that has at least one non-floating-point dtype parameter with requires_grad=False, with a WORLD_SIZE <= nGPUs/2 on the machine, results in the error "Only Tensors of floating point dtype can require gradients".
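
The snippet below is a minimal sketch of the configuration the report describes: a module carrying an integer-dtype parameter registered with requires_grad=False, wrapped in DistributedDataParallel with more than one GPU per process. The class name IntParamModel, the tensor shapes, and the NCCL backend are assumptions made for illustration, not taken from the issue.

```python
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


class IntParamModel(nn.Module):
    """Toy module; the name and shapes are illustrative, not from the issue."""

    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(4, 4))  # ordinary float parameter
        # Non-floating-point parameter: integer tensors cannot require gradients,
        # so it has to be registered with requires_grad=False.
        self.index_table = nn.Parameter(
            torch.arange(10, dtype=torch.long), requires_grad=False
        )

    def forward(self, x):
        return x @ self.weight


def main():
    # Assumes the usual launcher environment (RANK, WORLD_SIZE, MASTER_ADDR, ...).
    dist.init_process_group(backend="nccl")
    model = IntParamModel().cuda()
    # WORLD_SIZE <= nGPUs/2 means each process drives more than one GPU
    # (single-process multi-device mode); replicating the module across the
    # listed devices is where the
    # "Only Tensors of floating point dtype can require gradients" error surfaced.
    ddp_model = DDP(model, device_ids=[0, 1])
    out = ddp_model(torch.randn(2, 4, device="cuda:0"))
    print(out.shape)


if __name__ == "__main__":
    main()
```

Note that single-process multi-device DDP (device_ids with more than one entry) reflects the PyTorch releases current at the time of the issue; recent releases accept only one device per process.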

[Distributed] `Invalid scalar type` when `dist.scatter()` boolean

Rethinking PyTorch Fully Sharded Data Parallel (FSDP) from First Principles

PyTorch error troubleshooting 2: Only Tensors of floating point and complex dtype

Achieving FP32 Accuracy for INT8 Inference Using Quantization

Issue for DataParallel · Issue #8637 · pytorch/pytorch · GitHub

Introduction to Tensors in Pytorch #1

Cannot update part of the parameters in DistributedDataParallel

Torch 2.1 compile + FSDP (mixed precision) + LlamaForCausalLM

If a module passed to DistributedDataParallel has no parameter

A detailed guide to common Tensor data types and type conversions in PyTorch (torch.int32) - CSDN Blog

Inplace error if DistributedDataParallel module that contains a