Thank you for taking the time to submit an issue!

Background information

What version of Open MPI are you using? (e.g., v3.0.5, v4.0.2, git branch name and hash, etc.)

v4.0.1 and v4.0.2

Describe how Open MPI was installed (e.g., from a s...
We updated the distributed parameter initialization with bcast over https://github.com/laekov/fastmoe/blob/master/fmoe/distributed.py#L100, which is not correct. In PyTorch's distributed module, you are supposed to pass a global rank to the `broadcast` function, and it parses the gl...
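To make the rank semantics concrete, here is a minimal sketch (not fastmoe's actual code) of broadcasting a parameter within a sub-group. The function name `broadcast_param_in_group` and its arguments are hypothetical, and the sketch assumes a PyTorch version that exposes `dist.get_global_rank` (older releases only had the private `_get_global_rank` helper in `torch.distributed.distributed_c10d`):

```python
# Minimal sketch: torch.distributed.broadcast expects `src` to be a *global*
# rank even when a `group` is passed, so a group-local root rank has to be
# translated to its global rank first.
import torch
import torch.distributed as dist

def broadcast_param_in_group(param: torch.Tensor, group, group_root: int = 0):
    """Broadcast `param` from the group-local rank `group_root` to all ranks in `group`."""
    # Hypothetical helper; assumes dist.get_global_rank is available in this PyTorch version.
    global_root = dist.get_global_rank(group, group_root)
    dist.broadcast(param, src=global_root, group=group)
```

Passing the group-local index directly as `src` would broadcast from the wrong process (or hang) whenever the group's ranks do not coincide with global ranks 0..N-1.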
```c
int MPI_Bcast(void *buffer, int count, MPI_Datatype datatype,
              int root, MPI_Comm comm)
{
    int err;

    MEMCHECKER(
        memchecker_datatype(datatype);
        memchecker_comm(comm);
        if (OMPI_COMM_IS_INTRA(comm)) {
            if (ompi_comm_rank(comm) == root) {
                /* check whether root's send buffer is defined. */
                memchecker_...
```