[BUG] default.qubit cannot handle PhaseShift operations with a batch size of 1 #6874

Open
1 task done
albi3ro opened this issue Jan 22, 2025 · 0 comments
Labels
bug 🐛 Something isn't working


albi3ro commented Jan 22, 2025

Expected behavior

I expect to be able to apply any operation with a batch size of 1.

Actual behavior

PhaseShift raises an error when its parameter has a batch size of 1. The same circuit works without batching, and with batch sizes greater than one.
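As a minimal NumPy sketch (not PennyLane internals, just an illustration of the shape semantics), a batch dimension of size 1 is a corner case because broadcasting silently absorbs it, so the result of a batched multiplication can be indistinguishable in shape from an unbatched one:

```python
import numpy as np

state = np.ones(2, dtype=complex)  # unbatched sub-state, shape (2,)

scalar = np.exp(1.0j * 1.57)               # no batching: shape ()
batched = np.exp(1.0j * np.array([1.57]))  # batch size 1: shape (1,)

print((state * scalar).shape)   # (2,) -- shape preserved, as expected
print((state * batched).shape)  # (2,) -- the batch dim of 1 broadcasts away
```

With a batch size of 2 or more the batch dimension cannot be silently dropped like this, which is consistent with only the size-1 case misbehaving.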

Additional information

No response

Source code

import pennylane as qml
import jax

# A parameter with batch size 1
params = jax.numpy.array([1.57])

@qml.qnode(qml.device("default.qubit"))
def circuit(x):
    qml.PhaseShift(x, wires=1)
    return qml.expval(qml.Z(0))

circuit(params)  # raises ValueError on master

Tracebacks

File pennylane/pennylane/devices/qubit/apply_operation.py:452, in apply_phaseshift(op, state, is_state_batched, debugger, **_)
    450         state1 = math.expand_dims(state1, 0)
    451 state1 = math.multiply(math.cast(state1, dtype=complex), math.exp(1.0j * params))
--> 452 state = math.stack([state0, state1], axis=axis)
    453 if not is_state_batched and op.batch_size == 1:
    454     state = math.stack([state], axis=0)

File pennylane/pennylane/math/multi_dispatch.py:153, in multi_dispatch.<locals>.decorator.<locals>.wrapper(*args, **kwargs)
    150 interface = interface or get_interface(*dispatch_args)
    151 kwargs["like"] = interface
--> 153 return fn(*args, **kwargs)

File pennylane/pennylane/math/multi_dispatch.py:504, in stack(values, axis, like)
    475 """Stack a sequence of tensors along the specified axis.
    476 
    477 .. warning::
   (...)
    501        [5.00e+00, 8.00e+00, 1.01e+02]], dtype=float32)>
    502 """
    503 values = np.coerce(values, like=like)
--> 504 return np.stack(values, axis=axis, like=like)

File autoray/autoray.py:81, in do(fn, like, *args, **kwargs)
     79 backend = _choose_backend(fn, args, kwargs, like=like)
     80 func = get_lib_fn(backend, fn)
---> 81 return func(*args, **kwargs)

File numpy/lax_numpy.py:2235, in stack(arrays, axis, out, dtype)
   2233 for a in arrays:
   2234   if shape(a) != shape0:
-> 2235     raise ValueError("All input arrays must have the same shape.")
   2236   new_arrays.append(expand_dims(a, axis))
   2237 return concatenate(new_arrays, axis=axis, dtype=dtype)

ValueError: All input arrays must have the same shape.
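The final error reproduces with plain NumPy whenever one sub-state has picked up a leading batch dimension and the other has not. The shapes below are a hypothetical pairing chosen to match the error class in the traceback, not the exact arrays inside `apply_phaseshift`:

```python
import numpy as np

state0 = np.zeros(2, dtype=complex)       # unbatched sub-state, shape (2,)
state1 = np.zeros((1, 2), dtype=complex)  # sub-state with a batch dim of 1

try:
    # stacking mismatched shapes fails, as in the traceback above
    np.stack([state0, state1], axis=0)
except ValueError as exc:
    print("stack failed:", exc)
```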

System information

master

Existing GitHub issues

  • I have searched existing GitHub issues to make sure the issue does not already exist.
@albi3ro albi3ro added the bug 🐛 Something isn't working label Jan 22, 2025