Hi @honglunzhang-mt. I greatly appreciate your interest in our work. Currently, agentsd does not support diffusers, but we plan to include support for it in a few weeks. We also encourage and welcome contributions from the community to help achieve this.
Hello, I greatly appreciate your awesome work.

It seems that `self.attn1` in https://github.com/LeapLabTHU/Agent-Attention/blob/master/agentsd/patch.py#L220 is replaced with `AgentAttention`, whose forward function has the signature `forward(self, x, agent=None, context=None, mask=None)`. However, in https://github.com/LeapLabTHU/Agent-Attention/blob/master/agentsd/patch.py#L220, `encoder_hidden_states` and `attention_mask` are passed to `self.attn1`, which causes the error `forward() got an unexpected keyword argument 'encoder_hidden_states'`. Do you have any solution? Thanks a lot.
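Since the mismatch is purely in keyword names (diffusers passes `encoder_hidden_states`/`attention_mask`, while `AgentAttention.forward` expects `context`/`mask` per the signature quoted above), one possible workaround is a thin adapter that remaps the keywords before calling the patched module. This is an untested sketch, not an official fix; `adapt_forward` is a hypothetical helper name, and whether `context`/`mask` are the semantically correct targets for those arguments would need to be confirmed against the repo.

```python
def adapt_forward(agent_attn_forward):
    """Wrap an AgentAttention-style forward so it also accepts
    diffusers-style keyword names (hypothetical workaround sketch)."""
    def forward(x, encoder_hidden_states=None, attention_mask=None, **kwargs):
        # Remap diffusers kwargs to the names AgentAttention expects.
        return agent_attn_forward(
            x, context=encoder_hidden_states, mask=attention_mask, **kwargs
        )
    return forward
```

Applied after patching, something like `block.attn1.forward = adapt_forward(block.attn1.forward)` (module path assumed) might suppress the `TypeError` until diffusers support lands upstream.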