Dear Dr. Han and Dr. Ye,

I have been greatly impressed by your work on the Agent Attention model, as detailed in your recent publication and the associated GitHub repository. The way it integrates Softmax attention with linear attention to improve computational efficiency while preserving strong expressiveness is particularly compelling.
Given that xformers is a library dedicated to optimizing the efficiency of Transformers, I am curious whether there are any plans to integrate the Agent Attention mechanism into it. Such an integration could make your approach more accessible and practical for a broader audience, allowing developers and researchers to apply Agent Attention in real-world applications more readily.
Could you please share any information about plans to port the Agent Attention code to xformers or similar libraries, or about any ongoing projects aimed at such integration?
Thank you for your time and consideration.
Best regards,
xczhou
Hi @XCZhou520, thanks for your interest in our work. We plan to apply xformers or FlashAttention to agent attention in the future, and we also encourage and welcome contributions from the community to explore and achieve this.
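For anyone who wants to experiment before an official integration lands, below is a minimal sketch (not the authors' implementation) of how the two softmax stages of agent attention could each be dispatched to xformers' memory_efficient_attention: the agent tokens first attend to the keys and values, and the queries then attend to the agents. The pooling-based construction of the agent tokens, the tensor names, and the num_agents default are illustrative assumptions; shapes follow xformers' (batch, seq_len, num_heads, head_dim) convention.

```python
import torch
import torch.nn.functional as F
from xformers.ops import memory_efficient_attention


def agent_attention(q, k, v, num_agents=49):
    """q, k, v: (B, N, H, D) tensors; returns a (B, N, H, D) tensor."""
    B, N, H, D = q.shape

    # Illustrative agent-token construction: pool the queries down to num_agents tokens.
    a = F.adaptive_avg_pool1d(
        q.permute(0, 2, 3, 1).reshape(B * H, D, N), num_agents
    ).reshape(B, H, D, num_agents).permute(0, 3, 1, 2).contiguous()  # (B, num_agents, H, D)

    # Stage 1 (agent aggregation): agent tokens attend to the original keys/values.
    agent_v = memory_efficient_attention(a, k, v)   # (B, num_agents, H, D)

    # Stage 2 (agent broadcasting): queries attend to the agent tokens.
    return memory_efficient_attention(q, a, agent_v)  # (B, N, H, D)


if __name__ == "__main__":
    # Quick shape check (assumes a GPU; xformers kernels expect CUDA tensors).
    q = k = v = torch.randn(2, 196, 8, 64, device="cuda", dtype=torch.float16)
    print(agent_attention(q, k, v).shape)  # torch.Size([2, 196, 8, 64])
```

Since both stages are plain scaled-dot-product attention over a short agent sequence, the same structure should also map onto FlashAttention or PyTorch's scaled_dot_product_attention with little change.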