Mindful this post is a bit old, but IBM Quantum did put a few systems on the cloud in a pay-as-you-go model, with a current cost of $1.60 USD per "runtime-second" (https://cloud.ibm.com/catalog/services/quantum-computing), where the cost of sitting in queue is free.
So $3M USD currently buys you 1,875,000 seconds, or about 21.7 days, of quantum compute time.
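The arithmetic above can be sketched in a few lines (rate taken from the linked catalog page; it may change):

```python
# Back-of-envelope conversion of a dollar budget into IBM Quantum
# pay-as-you-go runtime, at $1.60 USD per runtime-second.
RATE_USD_PER_RUNTIME_SECOND = 1.60
budget_usd = 3_000_000

seconds = budget_usd / RATE_USD_PER_RUNTIME_SECOND
days = seconds / 86_400  # 86,400 seconds per day

print(f"{seconds:,.0f} runtime-seconds ~= {days:.1f} days")
# 1,875,000 runtime-seconds ~= 21.7 days
```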
Whether that amount of time is enough to unlock >$3M in end-user value, IDK.
From my POV, the interesting tension is the balance between end-users wanting to run more circuits for their work to get more accurate results via error mitigation (which can come with substantial overheads), and those self-same end users wanting to minimize cost.
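That tension is easy to see with a toy calculation: error-mitigation schemes multiply the number of circuit executions (the sampling overhead), and the bill scales with it. The runtime and overhead factors below are made-up numbers for illustration, not measured figures:

```python
# Hypothetical illustration: how a sampling-overhead factor from
# error mitigation scales the pay-as-you-go bill. All workload
# numbers here are invented for the sake of the example.
RATE = 1.60            # USD per runtime-second (current list price)
base_runtime_s = 600   # assumed unmitigated workload: 10 minutes

for overhead in (1, 10, 100):  # assumed sampling-overhead factors
    cost = base_runtime_s * overhead * RATE
    print(f"overhead x{overhead:>3}: ${cost:,.2f}")
# overhead x  1: $960.00
# overhead x 10: $9,600.00
# overhead x100: $96,000.00
```

Even a modest overhead factor dominates the cost, which is why end users pushing for more accuracy are also the ones most sensitive to the per-second price.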
Ah, I didn't know that. Back when I looked into this, it seemed like I had to 'inquire' to discover what the pricing was. I assume runtime seconds are initialization + gate execution + readout + reset + I/O to or from the fridge, etc?
IIRC, this quantity is how much time has elapsed once a job has been pulled from the queue.
Whether the definition will change, I’m not sure.