Hi team,

I hit an error when I sent the following request body to /attention_head_record:
{
  "dst": "logits",
  "layerIndex": 9,
  "activationIndex": 8,
  "datasets": [
    "https://openaipublic.blob.core.windows.net/neuron-explainer/gpt2_small_data/collated-activations/"
  ]
}
The error is as follows (the final exception line was cut off in my paste):
  File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 265, in attention_head_record
    ) = convert_activation_records_to_token_and_attention_activations_lists(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 191, in convert_activation_records_to_token_and_attention_activations_lists
    return normalize_attention_token_scalars(zipped_tokens_and_raw_attention_activations)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 455, in normalize_attention_token_scalars
    ) = compute_scalar_summary_and_scale(compute_max_scalar_out, list_of_sequence_lists)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 427, in compute_scalar_summary_and_scale
    scalar_indexed_by_token_sequence_list: list[list[list[float]]] = [
                                                                     ^
  File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 428, in <listcomp>
    [compute_scalar_summary_function(sequence) for sequence in sequence_list]
  File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 428, in <listcomp>
    [compute_scalar_summary_function(sequence) for sequence in sequence_list]
     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 421, in compute_max_scalar_out
    return compute_summary_of_scalar_out(attn_token_sequence_list, _max_non_none)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 392, in compute_summary_of_scalar_out
    scalar_out = [
                 ^
  File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 393, in <listcomp>
    operation(
  File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 380, in _max_non_none
    return max([a for a in scalars if a is not None])
Any ideas? The same request works fine on /neuron_record. Could this be caused by using the wrong dataset? @WuTheFWasThat