Delete the self-attention class in Proformer, since it directly uses the self-attention layer already implemented in PyTorch and is therefore redundant.
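Once the wrapper class is removed, callers can use PyTorch's built-in layer directly. A minimal sketch, assuming `torch.nn.MultiheadAttention` is the layer in question and `batch_first=True` matches Proformer's tensor layout (both are assumptions; the dimensions below are illustrative):

```python
import torch
import torch.nn as nn

# Illustrative dimensions; adjust to Proformer's actual config.
embed_dim, num_heads = 64, 4
attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

x = torch.randn(2, 10, embed_dim)  # (batch, seq_len, embed_dim)
# Self-attention: query, key, and value are all the same tensor.
out, attn_weights = attn(x, x, x)
print(out.shape)  # torch.Size([2, 10, 64])
```

The built-in layer also handles attention masks and key-padding masks via its `attn_mask` and `key_padding_mask` arguments, so any masking logic in the deleted class can be passed through as well.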