```python
...

def positional_encoding(self, x):
    ...

def forward(self, x):
    # Apply positional encoding
    x = self.positional_encoding(x)
    # Apply fully connected layer
    x = self.fc(x)
    # Apply element-wise sine
    x = torch.sin(x)
    return x
```

This PyTorch code snippet shows how to implement SPE, ...
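The body of `positional_encoding` is elided in the snippet above. As a rough illustration only (an assumption, not the snippet's actual code), a standard 1D sinusoidal positional encoding table can be built like this; the function name and shapes are hypothetical:

```python
import math
import torch


def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    """Standard 1D sinusoidal encoding: sin on even channels, cos on odd channels.

    Illustrative sketch; assumes d_model is even.
    """
    position = torch.arange(seq_len).unsqueeze(1)  # (seq_len, 1)
    # Frequencies 1 / 10000^(2i / d_model) for each pair of channels
    div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)   # even channel indices
    pe[:, 1::2] = torch.cos(position * div_term)   # odd channel indices
    return pe                                      # (seq_len, d_model)
```

In a module like the one above, `self.positional_encoding(x)` would typically add such a table to the input, e.g. `x + pe[: x.size(1)]`, before the fully connected layer and the element-wise sine are applied.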
A PyTorch implementation of the 1d and 2d Sinusoidal positional encoding/embedding. - wzlxjtu/PositionalEncoding2D
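For the 2D case, one common construction (sketched here from the general technique, not copied from that repository) splits the channel dimension in half and applies the 1D formula to row and column coordinates separately, reusing the hypothetical `sinusoidal_positional_encoding` helper sketched above:

```python
def sinusoidal_positional_encoding_2d(height: int, width: int, d_model: int) -> torch.Tensor:
    """2D extension: half the channels encode the row index, half the column index.

    Illustrative sketch; assumes d_model is divisible by 4 and reuses the 1D helper above.
    """
    half = d_model // 2
    pe_y = sinusoidal_positional_encoding(height, half)  # (height, half)
    pe_x = sinusoidal_positional_encoding(width, half)   # (width, half)
    pe = torch.zeros(height, width, d_model)
    pe[:, :, :half] = pe_y.unsqueeze(1).expand(height, width, half)  # broadcast over columns
    pe[:, :, half:] = pe_x.unsqueeze(0).expand(height, width, half)  # broadcast over rows
    return pe                                             # (height, width, d_model)
```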