[RFC][Discussion] Hidet Script #331
vadiklyutiy pushed a commit that referenced this issue on Jul 22, 2024:
Previously, an error was encountered during a model compilation attempt:

> torch._dynamo.exc.BackendCompilerFailed: backend='hidet' raised:
> RuntimeError: Can not interpreting max given arguments:
>   max(tensor(...))
> Possible candidates are:
>   torch_max_v3(x: hidet.Tensor, dim: Union[int, hidet.ir.expr.Expr], keepdim: bool = False, *, out: Union[hidet.Tensor, Tuple[hidet.Tensor, ...], List[hidet.Tensor]] = None) -> Tuple[hidet.Tensor, hidet.Tensor]
>   File "/home/bolin/Desktop/hidet/python/hidet/graph/frontend/torch/register_functions.py", line 1067

This happened even though we do have a [function](https://github.com/CentML/hidet/blob/13a806608d40de2de1fcc682adeea8d204189f3c/python/hidet/graph/frontend/torch/register_functions.py#L1056-L1060) that can interpret `torch.Tensor.max` with the described arguments.
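For context, the torch frontend maps each `torch` callable to a hidet implementation through registration decorators in `register_functions.py`, and the interpreter picks an overload by matching the call's arguments. Below is a minimal sketch of what an overload covering the argument-free `max(tensor(...))` case could look like; the decorator names follow the linked file, but the exact signature and the `ops.max` call are illustrative assumptions, not the actual fix.

```python
# Illustrative sketch only: mirrors the registration pattern in
# python/hidet/graph/frontend/torch/register_functions.py.
# The ops.max call and signature here are assumptions, not the real patch.
import torch
from hidet.graph import ops
from hidet.graph.tensor import Tensor
from hidet.graph.frontend.torch.registry import register_function, register_method


@register_function(torch.max)        # free-function form: torch.max(x)
@register_method(torch.Tensor.max)   # method form: x.max()
def torch_max_v1(x: Tensor) -> Tensor:
    # Reduce over all dimensions, which is what max(tensor(...)) with no
    # further arguments asks for.
    return ops.max(x, dims=list(range(len(x.shape))), keep_dim=False)
```

With such an overload registered for both the function and the method form, the interpreter can match the plain `max(tensor(...))` call instead of falling through to the `torch_max_v3` candidate shown in the traceback.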
vadiklyutiy pushed a commit that referenced this issue on Jul 23, 2024, with the same commit message as above.
vadiklyutiy pushed a commit that referenced this issue on Dec 26, 2024, with the same commit message as above.
Labels
rendered draft.
Working in progress.