PyTorch DataLoader with parallel workers (`num_workers > 0`) passes batches between worker processes and the main process through shared memory. On this system the kernel's shared-memory segment limit is not the bottleneck: `kernel.shmmax = 18446744073692774399`, the effectively unlimited default on 64-bit Linux.
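A quick way to confirm the limit on Linux is to read it from `/proc`. This is a minimal sketch; the helper name `shmmax_bytes` is my own, not part of any library:

```python
from pathlib import Path

def shmmax_bytes(default=None):
    """Return the kernel's max shared-memory segment size in bytes
    (Linux only); return `default` if the sysctl file is absent."""
    p = Path("/proc/sys/kernel/shmmax")
    return int(p.read_text()) if p.exists() else default

# If this prints a huge value (e.g. 18446744073692774399), shmmax
# itself will not limit DataLoader workers; "unable to open shared
# memory object" errors then usually point at /dev/shm size instead.
print(shmmax_bytes(default=-1))
```

Equivalently, `sysctl kernel.shmmax` from a shell reports the same value.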