Talk:Quantum computing

Quantum computing is a former featured article. Please see the links under Article milestones below for its original nomination page (for older articles, check the nomination archive) and why it was removed.
Article milestones
Date | Process | Result
January 19, 2004 | Refreshing brilliant prose | Kept
May 9, 2006 | Featured article review | Kept
May 13, 2007 | Featured article review | Demoted
Current status: Former featured article

Contradictory statements regarding space?

The introduction suggests that simulating an n-qubit quantum computer on a classical TM requires 2^n discrete states. But doesn't this contradict the later statement that BQP is a subset of PSPACE? (Erniecohen (talk) 15:50, 14 October 2012 (UTC))

It's just an example of how a classical computer could simulate a quantum computer. It doesn't mean that it's the only way to do so. --Robin (talk) 15:55, 14 October 2012 (UTC)
The problem is that the way that it is written, it strongly implies that quantum computing provides a space advantage, which is just false, so the 2^500 crap should just be removed. The relevant connection between the models is just that a classical TM can simulate a quantum computer with a polynomial blowup in space, but is strongly believed to require an exponential blowup in time. — Preceding unsigned comment added by Erniecohen (talkcontribs) 16:04, 14 October 2012 (UTC)
I wasn't aware of this. Do you have a reference? Skippydo (talk) 19:27, 14 October 2012 (UTC)

Has this been resolved? It seems odd to me in any case to say that 2^500 complex values is equivalent to 2^501 bits (does this mean that there are only four complex values available)? W. P. Uzer (talk) 12:27, 2 February 2014 (UTC)

Given these two objections, I've commented out the sentence in question for the moment. W. P. Uzer (talk) 08:50, 4 February 2014 (UTC)
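A point worth pinning down for the thread above: the 2^n figure is the storage cost of the naive state-vector simulation, not a claim about the only possible simulation. A minimal numpy sketch of that naive approach (my own illustration; the function and variable names are mine, not from the article):

```python
import numpy as np

def hadamard_on_qubit(state, k, n):
    """Apply a Hadamard gate to qubit k of an n-qubit state vector."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    psi = state.reshape([2] * n)                 # one binary axis per qubit
    psi = np.tensordot(H, psi, axes=([1], [k]))  # contract qubit k with H
    psi = np.moveaxis(psi, 0, k)                 # restore axis order
    return psi.reshape(2 ** n)

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                   # start in |000>
for k in range(n):
    state = hadamard_on_qubit(state, k, n)
# The simulator stores 2**n complex amplitudes -- the exponential *space*
# cost debated above. Space-efficient simulations (the kind behind the
# BQP subset-of-PSPACE argument) recompute amplitudes instead of storing
# the whole vector at once.
```

Here the uniform superposition over 8 basis states shows up explicitly as 8 stored amplitudes, which is exactly the cost the commented-out sentence was (over-)generalizing from.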

How does it work... really!

It seems like this article would benefit from explaining how quantum computers work physically. I assume they manipulate individual atoms, but how?

It seems to me that nobody really knows how it (sort of) works. It seems the researchers in the lab are playing with atomic-scale phenomena that they don't fully understand. Since these researchers were trained to believe in quantum theory as the undisputed truth, they preach their religion as fact. In the end they create mechanisms that can do some sort of awkward computing, and they call these mechanisms "Quantum Computers". — Preceding unsigned comment added by 31.168.26.225 (talk) 15:39, 15 June 2016 (UTC)

Comment so that archiving will work. (Answer already in article.) — Arthur Rubin (talk) 12:28, 31 May 2015 (UTC)
and where is the critique section of this article? — Preceding unsigned comment added by 75.163.218.83 (talk) 05:12, 4 September 2015 (UTC)
I second that nomination. Do you know that so-called supercomputers are idle most of the time? Having more states does not equate to efficiency, and superposition collapse implies even less output: fewer states. Juror1 (talk) 17:59, 30 November 2017 (UTC)
Agreed. In the overview we find the statement: "Quantum computers share theoretical similarities with non-deterministic and probabilistic computers." Wow. Do you seriously suppose that the normal Wikipedia reader has any clue about deterministic or probabilistic systems? Why even write something like that at the top of an article? Anyone who already knows about determinism, etc., also (probably) has some knowledge about quantum systems. The author of that statement could not do any better at pushing casual readers away from this topic. If anything, we need to draw more people into computer science in general, and bleeding-edge research in particular. Bloviation like the statement I referenced does much more harm than good by immediately throwing out arcane, industry-specific terminology. It literally turns readers off. I'm a computer scientist / engineer by education and trade, and it turned me off Big Time. Come on, people. Write something that actual people can read and comprehend. Use your words. And not the 3-dollar ones, either. Plain English will do just fine.
There's an old saying about being able to explain something in simple terms: if you can't, you probably don't understand the subject yourself. Cheers. — Preceding unsigned comment added by 98.194.39.86 (talk) 03:44, 25 October 2016 (UTC)

Announcing new results


"..(-2,334) . . (Sorry, Wikipedia is not the place to announce new results. See WP:OR) .." - and where, WHERE is such a place for new ideas: for announcing and discussing them, developing them further, maybe "brainstorming", or just constructive discussion? If we all were strictly governed by that "WP:OR" document, it would mean that all advancement, all scientific and technical development, would be almost stopped, in fact, because no one would know about any new ideas, new results, new anything. :/(

— Preceding unsigned comment added by Martin Hovorka (talkcontribs) 19:30, 2 September 2013 (UTC)

Deletion of new idea announcement


Hmm... I can't even just announce, a bit, any new idea (IMHO quite promising and relevant: my multi-sieve/sifting Q.C. idea), nor make a little advertisement and spread the word for this new, constructive, fresh concept. :\( (But it is, and always was, my idea: trying to make quantum computing with this sieve/sifter approach.) — Preceding unsigned comment added by Martin Hovorka (talkcontribs) 20:57, 2 September 2013 (UTC)

It's your fault for announcing the idea here. Doesn't everyone know by now that, if you want to patent an invention, you must do so within one year of publication? Under the new US patent law, publication may make it impossible to patent it at all. (And, yes, posting it on Wikipedia constitutes "publication".) — Arthur Rubin (talk) 21:07, 2 September 2013 (UTC)

Other wikis for discussing quantum ideas

There is a common misconception that people who support the WP:OR policy are people who don't want to see any original research on any wiki.

Actually, some of us *do* want to see original research on a wiki -- but on an appropriate wiki where such research is on-topic, not just any random wiki.

There are over 10,000 English-language wikis. Pretty much any topic you can think of is on-topic at at least one of them. In particular, quantum computing seems to be on topic at several wikis, including http://quantiki.org/ , Wikia: 3dids, http://quantum.cs.washington.edu/ , http://wiki.qcraft.org/ , http://c2.com/cgi/wiki?QuantumComputing , http://twoqubits.wikidot.com/ , http://chemwiki.ucdavis.edu/Physical_Chemistry/Quantum_Mechanics , http://www.physics.thetangentbundle.net/wiki/Quantum_mechanics , etc.

--DavidCary (talk) 04:30, 15 January 2014 (UTC)

Quantum supercomputer

Is it really the case that a quantum computer is "also known as a quantum supercomputer"? I've never seen that usage before & suspect it should be deleted. --24.12.205.104 (talk) 00:48, 6 January 2014 (UTC)

misc suggestions / solving chess - programming the universe - more intuitive intro

I think it would be great if the article mentioned solving chess as something that quantum computing would allow.

I also think Seth Lloyd's book "Programming the Universe" should be referenced somewhere.

The first sentence I think is too technical. It tells precisely what the term means in physics, it doesn't give any layman sense as to what quantum computing is. It almost sounds like: "Quantum computing is a process where computers use quantum systems to do computing". A layman will want to know something basic immediately like the fact that quantum computers are extremely fast and will soon be replacing standard computers.

...My 2 cents. Squish7 (talk) 00:40, 26 March 2014 (UTC)

Page title should be "Quantum computing"

To my sensibility, this page title is all wrong, or maybe not entirely; rather, we should create a more general page about "general aspects and applications of quantum computing", covering things like quantum transistors that work partially as "quantum computers" but are cheaper to produce.

Computer science is no more about computers than astronomy is about telescopes. -- Edsger Dijkstra

Additionally, Google

 "quantum computing" -"quantum computer"

gives 2,440,000 results

 -"quantum computing" "quantum computer" 

gives 961,000 results, so one appears without the other significantly more often. — MaxEnt 05:10, 5 May 2014 (UTC)

Done. Ruud 18:01, 21 December 2014 (UTC)

Quantum games

A collection of IPs is adding an announcement of "the first quantum computer game". Even if this were sourced to a reliable source (it's not at all in the reference specified, which is a blog [not even a blog entry], and all I can find is a blog entry pointing to the announcement by the creator), would it be appropriate for the article? — Arthur Rubin (talk) 05:49, 21 May 2014 (UTC)

The announcement by the alleged creator is not a reliable source, either. — Arthur Rubin (talk) 10:39, 24 May 2014 (UTC)

Strongly need to consider changing the title of this wiki-page from Q. "Computer" to "Computing"

Can the title of this whole wiki-page be returned to Quantum COMPUTING, as it was before? "Computing" is a far more fitting, more overall term for this wiki page's topic than just one concrete "computer". — Preceding unsigned comment added by 78.99.236.255 (talk) 17:49, 23 July 2014 (UTC)

Done. Ruud 18:01, 21 December 2014 (UTC)

Invented When?

You should discuss when quantum computing was invented and what its purpose was at the time.

2601:C:5600:27B:C92A:EEBE:6758:C647 (talk) 17:52, 18 January 2015 (UTC) Johnnie DeHuff

The basic idea occurred as early as 1965, in The Feynman Lectures on Physics, vol. 3, when Feynman mentioned that it was frequently faster to do the physical experiment than to do the mathematical computation. In other words, use a physics experiment as an analog computer. --Ancheta Wis (talk | contribs) 17:56, 1 February 2015 (UTC)

Minthreadsleft

Why is it 10? If we're going to archive, 5, or possibly 2, would be better. I already archived one thread which was (potentially) about the subject, rather than about the article. — Arthur Rubin (talk) 22:52, 4 May 2015 (UTC)

Changed to 5. I also "archived" 3 sections about alleged simulators of quantum computers, with no credible source. — Arthur Rubin (talk) 12:30, 31 May 2015 (UTC)

Three-bit Example?

It would be very helpful to see an example of a calculation or problem solved at the three bit level. Is there such a thing? JFistere (talk) 03:11, 15 October 2015 (UTC)

simplify intro!

The technical content in this article is very good, but the intro is way too technical. The intro should be a thesis statement, especially for newbies. I'll read back through the terminology you used and see if I can glean enough to simplify the intro for you. No promises though!

By the way, are the states referred to in the article the spin of the qubit?

I just found a good nuts 'n bolts explanation here: http://www.wired.com/2014/05/quantum-computing/ Pb8bije6a7b6a3w (talk) 18:57, 26 October 2015 (UTC)

misleading lede

  • "Given sufficient computational resources, however, a classical computer could be made to simulate any quantum algorithm, as quantum computation does not violate the Church–Turing thesis.[10]"

The whole article is currently very technical. Which is OK with me, even though I cannot understand it well enough to judge how correct it is... But it needs more content that is at a lower level. And it needs to be careful not to mislead readers who are not so technical. The above sentence, currently concluding the lede, is a prime example of content which is probably correct in theory but seriously misleading, particularly in the lede context. Most ordinary readers would not understand that "sufficient computational resources" includes unbounded quantities thereof, with no regard for feasibility. Yes, we may think quantum computers can only solve problems that a classical computer could solve if it were big enough and had enough time. But there are many problems that would require using the entire universe (organized as a classical computer) to solve, and would still take longer than the expected life of the universe. If we think quantum computers will be able to solve some such problems (within feasible, limited size and time constraints), we can fairly say there is a distinct difference in the problems these two classes of computers can solve, even if both classes of computers are only able to solve (all) Turing problems -- "given enough resources". I don't really know enough to know how the lede should be fixed -- but maybe I will Be Bold anyway, to get the ball rolling... -71.174.188.32 (talk) 19:11, 10 December 2015 (UTC)

I could not agree more. This article reads like an excerpt from a sub-par college textbook for a class called "Hypothetical Quantum Computing Systems" or something. Textbooks are fine when confined to academia. Wikipedia is supposed to be for the masses. And with so much jargon being tossed about, it makes it appear that this subject is well-defined and based on real-world working models, which is simply NOT the case. 99% of what I read is pure theory at this time (2016). 98.194.39.86 (talk) 03:54, 25 October 2016 (UTC)
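To put a rough number on the feasibility point raised above (a back-of-envelope check; the atom count is a commonly quoted order-of-magnitude estimate, not a sourced figure):

```python
# A dense classical description of n qubits needs 2**n complex amplitudes.
# For n = 300, that count already dwarfs ~10**80, a commonly quoted
# order-of-magnitude estimate for the atoms in the observable universe --
# which is the sense in which "sufficient computational resources" is
# unbounded rather than feasible.
n = 300
amplitudes = 2 ** n
atoms_estimate = 10 ** 80
print(amplitudes > atoms_estimate)  # prints: True
```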

Do I have this right?

I've been trying to ground myself in this subject in order to attempt a good thesis statement. The way I understand it, those qubits would explore all possibilities for a given expression - between two clock ticks of the quantum computer. Is that right? Pb8bije6a7b6a3w (talk) 00:28, 11 December 2015 (UTC)

No, that's not right. "Exploring all possibilities" is not an accurate description of quantum computing. For comparison with the double-slit experiment, it is not correct to say that "a photon explores both slits and then chooses the right one". It is equally incorrect to say that "a quantum computer explores all possibilities and then chooses the right one". Think of quantum computing as operations that are meant to have a specific impact on correlated interference patterns. MvH (talk) 04:59, 14 January 2017 (UTC)
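MvH's interference framing can be made concrete with amplitude amplification (Grover's algorithm): the probability of the marked item is steered upward by interference over iterations, not found by trying everything and picking one. A small self-contained numpy sketch (my own illustration, not from the article):

```python
import numpy as np

def grover_probability(marked, n_qubits, iterations):
    """Probability of measuring the marked item after Grover iterations."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))    # uniform superposition over N items
    for _ in range(iterations):
        state[marked] *= -1.0             # oracle: phase-flip the marked item
        state = 2 * state.mean() - state  # diffusion: inversion about the mean
    return float(state[marked] ** 2)

# 3 qubits -> 8 items. With zero iterations the marked item is a 1/8 guess;
# two rounds of interference raise it to about 0.945. The amplitudes were
# steered by the gates -- nothing was "explored and then chosen".
p0 = grover_probability(5, 3, 0)
p2 = grover_probability(5, 3, 2)
```

The design point: each iteration is a small rotation of the state toward the marked basis vector, which is why the success probability grows smoothly rather than appearing by selection.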

Quantum vials / cartridge entanglement

Large multi-mono-atomic groupings cannot maintain cohesion for long. An old idea is to replace each atom with a vial (cartridge), so that each vial plays the role of a single atom, and then we entangle the vials with each other. The problem is that particles inside a vial may entangle with other ones in the same vial (so the vial no longer acts as a single atom); to avoid that, we use few atoms in each vial [seems clever but it isn't, for other reasons: small mistakes become important] and we cool each vial to make it almost a condensate [seems clever but it isn't, for other reasons: each person should have a quantum computer, so superconductivity at room temperature is the only solution, not extreme temperatures]. Some reverberative NON-MEASURED laser radiation (the laser should allow different quantum states of oscillation that will not be revealed to us) will increase cohesion inside the mini vats. Sounds great, but until now the result is randomness. Crucial details are under development. Akio (Akio) 00:28, 13 May (UTC)

classical computer analogues

A classical computer analogue uses a noise generator to introduce noise when needed, plus entangled quantum rotation statistics. Then we run the wavefunction collapse trillions of times in order to average the result (most tasks, though, do not require large averagings, so even a billion or a million runs might work for simple tasks). That quantum analogue is 10^7 times slower than an ideal quantum computer, but it works; also, we don't have large arrayed quantum entangletrons (computers). Human averaging algorithms might not be perfect, but a good noise generator is way better than the natural noise if we add some mistakes of measurement of a future actual large entangletron (quantum computer). Yoritomo (Yoritomo) 04:01, 15 May (UTC)

Each digit should be partially connected (at variable rates that change for each task) with all other digits (at a unique percentage to each other digit), but also to separate non-interconnected randomness (at variable rates that change for each task, at a unique percentage for each digit). This is called "tensor computing". That works fine for many tasks; you might not get the absolute answer, but you will get close after a trillion trials. If you want the absolute answer (well, to get even closer, because even getting the absolute answer is something probabilistic), you should then allow partial variable dynamics to take action: a combinatorics method of "communicating vessels" (well, partially communicating at different rates for each digit, and partially introducing variable separate-noise rates to each digit). These combinatoric communicating vessels use as their grain an algorithmic module that acts as the smallest particle. An actual quantum computer uses an infinitely small grain-module, which we cannot, but we can get very close and minimize our grain-module enough to perform each task with a classical supercomputer. Nowadays a classical pseudo-entangletron is better than an entangletron (quantum computer), simply because we can already build a very complex one! Ieyasu (Ieyasu) 04:32, 15 May (UTC)
Your first digit is totally random; all other digits are probabilistically separately random (not of the same random series) with some statistical probability of introducing the initial random digit. — Preceding unsigned comment added by 2A02:587:4103:2300:C8D5:96C1:1BF2:71A7 (talk)

External links modified

Hello fellow Wikipedians,

I have just modified one external link on Quantum computing. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

As of February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{sourcecheck}} (last update: 15 July 2018).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 12:43, 21 July 2016 (UTC)

No mention of China?

Claims of quantum computing superiority (including the launch of a quantum satellite) in a second half of September 2016 interview at: https://www.rt.com/shows/keiser-report/358371-episode-max-keiser-963/ 72.171.152.192 (talk) 18:17, 17 September 2016 (UTC)

Error in "Basis" section?

"A double qubit can represent a one, a zero, or any quantum superposition of those two qubit states; a pair of qubits can be in any quantum superposition of 4 states, and three qubits..."

Should presumably read "A single qubit..." etc. Northutsire (talk) 19:24, 4 October 2016 (UTC)

Error in usage of the term non-determinism

The term non-deterministic was used twice in the article, the first time properly, in the sentence "Quantum computers share theoretical similarities with non-deterministic and probabilistic computers.", and the second time in a wrong way in the sentence "Quantum algorithms are often non-deterministic, in that they provide the correct solution only with a certain known probability." It is highly misleading as the term non-deterministic computing and the term probabilistic computing are very different in computer science. I corrected the error and clarified that the term non-determinism must not be used in that context. Of course, an alternative is simply to replace non-deterministic by probabilistic in the previous version, yet I find this to be a very good place to clarify to the physicists among the readers that the term non-determinism cannot be used for describing the probabilistic nature of (regular) quantum computers. Tal Mor (talk) 08:10, 29 November 2016 (UTC)

Sorry - I meant to clarify the issue to all the readers that are not computer scientists (not just to the physicists). Tal Mor (talk) 08:13, 29 November 2016 (UTC)
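Tal Mor's distinction can be stated operationally with a toy model (entirely illustrative; the function names are mine): a nondeterministic machine accepts if *some* computation path accepts, while a probabilistic machine accepts with the probability mass of its accepting paths.

```python
def nondeterministic_accepts(path_results):
    """Nondeterministic semantics: accept iff SOME computation path accepts."""
    return any(path_results)

def acceptance_probability(path_results):
    """Probabilistic semantics: accept with the fraction of (equally likely)
    accepting paths."""
    return sum(path_results) / len(path_results)

# Toy machine with 8 equally likely computation paths, exactly one accepting:
paths = [True] + [False] * 7
# Under nondeterministic semantics this input is accepted outright; under
# probabilistic semantics it is accepted only with probability 1/8 --
# the two notions really are different objects.
```

An algorithm that is "correct with a certain known probability" matches the second semantics, which is why "probabilistic" is the safer word in that sentence.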

Make paragraph, but also new page: Quantum computing and deep learning

No human being can generate some complicated questions to ask a quantum computer about molecular statistics of mixed materials, so we define the parameters of our questions, but the final question is a result of computing! You don't need a computer only to get an answer, but even to finalize hard questions. We must be more analytical about how a quantum computer might act as a neural network, or how a neural network might control a quantum computer. Don't write random stuff. Collect some information for 4 years, and then write here. — Preceding unsigned comment added by 2A02:587:4116:2200:A123:2E94:AB7B:DF1 (talk) 11:10, 17 April 2017 (UTC)

D-Wave, Google, 2017

This YouTube video discusses a paper by Google employees,[1] which claims or demonstrates the computational advantage of D-Wave's quantum annealing concept, and I don't yet see it being discussed in the article. JMK (talk) 19:40, 3 September 2017 (UTC)

  1. ^ Denchev, Vasil S.; Boixo, Sergio; Isakov, Sergei V.; Ding, Nan; Babbush, Ryan; Smelyanskiy, Vadim; Martinis, John; Neven, Hartmut (1 August 2016). "What is the Computational Value of Finite-Range Tunneling?". Physical Review X. 6 (3). doi:10.1103/PhysRevX.6.031015. Retrieved 3 September 2017.

description of digital computer incorrect

Who will fix it? Juror1 (talk) 17:53, 30 November 2017 (UTC)

Too complex first sentence

The first two sentences read:

Quantum computing studies computational systems (quantum computers) that make direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from binary digital electronic computers based on transistors.

I changed to:

Quantum computing is computing using quantum-mechanical phenomena, such as superposition and entanglement. Quantum computers are devices that perform quantum computing. They are different from binary digital electronic computers based on transistors.

This was reversed with the motivation that the previous version "reads better". Here is my motivation for why my version is better:

  • It is shorter.
  • It is divided into more sentences. (It is a well established fact that shorter sentences are easier to read.)
  • It says what the article is about. The previous version assumes, but does not say, that QC is a subject area. It says that QC is a phenomenon and not the study of this phenomenon. Isn't it decided that the phenomenon itself is more important than the study of it?
  • Removing "to perform operations on data" makes it shorter, which is good.
  • The first sentence should never use a word that is so difficult that it needs to be explained in the same sentence. (Because the first sentence should be as short as possible.) I think computing is OK. If it is too difficult for the reader, then reading the article about computing first is necessary to understand this article.
--Ettrig (talk) 11:42, 13 December 2017 (UTC)

D-Wave 2000 Qubits - commercial today!?

https://www.dwavesys.com/d-wave-two-system
Isn't this a real quantum computer? With 2000 qubits! It seems to me that they and NASA have once and for all solved the necessary superconducting problem, through vacuum: no particles means no temperature, and so the superconducting problem is solved!?
Although expensive, large companies can afford them - and perhaps make more money? Boeing720 (talk) 00:31, 19 December 2017 (UTC)

The D-Wave device is not a universal quantum computer, see D-Wave Systems#Controversy. Luca (talk) 09:16, 19 December 2017 (UTC)

Explanations for the masses

I'd like to join the chorus asking for a bit more content pitched to the educated layman. In particular, one point that brought me here doesn't seem to be covered; my search for an explanation was unsuccessful.

I second the call for a worked-through 3-qubit problem if that's feasible. That might help me visualize how, when the superposition collapses, you know that you are observing the solution to the problem you were running on the quantum computer. That seems to be saying that there is only one possible position, which doesn't track with the idea of superposition. Grossly simplified, if the problem you are trying to solve is 2+2, how do you manipulate the qubits to collapse the probabilities down to 100% for 4? MvH took a stab at this in #14 on the talk page, but it was too general to make much sense to a layman. 210.178.60.82 (talk) 12:41, 19 January 2018 (UTC)
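One worked 3-qubit example of the requested kind, sketched with numpy (my own illustration, not article content): the Deutsch-Jozsa algorithm in its phase-oracle form. It is not "2+2", but it shows exactly how the gates are arranged so the final measurement yields the answer with probability 100%: interference cancels every wrong outcome.

```python
import numpy as np

def hadamard_all(n):
    """The n-qubit Hadamard transform as a 2**n x 2**n matrix."""
    H1 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    H = np.array([[1.0]])
    for _ in range(n):
        H = np.kron(H, H1)
    return H

def deutsch_jozsa(f, n):
    """Measurement probabilities after the phase-oracle Deutsch-Jozsa circuit.

    Outcome |00...0> has probability 1 iff f is constant, 0 iff balanced."""
    N = 2 ** n
    state = np.zeros(N)
    state[0] = 1.0                                       # start in |000>
    state = hadamard_all(n) @ state                      # uniform superposition
    state = np.array([(-1) ** f(x) for x in range(N)]) * state  # phase oracle
    state = hadamard_all(n) @ state                      # interfere
    return np.abs(state) ** 2

p_const = deutsch_jozsa(lambda x: 0, 3)     # a constant function
p_bal = deutsch_jozsa(lambda x: x & 1, 3)   # a balanced function (lowest bit)
# For constant f ALL probability lands on |000>; for balanced f, NONE does.
```

The collapse isn't magic: the final Hadamard layer makes the amplitudes of every wrong outcome cancel exactly, so observing the register is guaranteed to reveal which class f belongs to.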

quantum supremacy

The initial addition on Quantum Supremacy (16:54, 5 July 2017 by user Schlafly) provided a definition of the term, and seemed to add additional information to the base subject matter of quantum computing. On 13:29, 12 December 2017, user 185.78.8.131 edited the section and added material relating to assertion of controversy about the concept of Quantum Supremacy, and apparently to quantum mechanics in general.

First, controversy is fine, and I feel it should be placed in its own section rather than being appended to an existing one. However, it may be better placed in the article on quantum mechanics, where it may receive a greater level of review.

Second, the sentence starting with "Those such as Roger Schlafly," caught my attention. Wondering to whom "those" referred, I checked the three references given, and found them to all point to a blog authored by Schlafly. To me, this seems to run afoul of original-research and or NPOV and or reliable sources.

So I've flagged this section with POV-section. — Preceding unsigned comment added by Debrown (talkcontribs) 17:27, 1 February 2018 (UTC)

make page: Pseudoquantum computing

Non-fundamental mimicking of the shared noise attractor, which may include a percentage of constant values (some percentage of the time the attractor is activated, so we have pseudorandom data within a range of values, and some percentage of the time the machine runs, a constant value appears according to the entanglement angles chart; a randomizer selects between a. the pseudorandom data of the attractor and b. the constant value - a. and b. have standard percentages of activation, but they appear randomly via a randomizer - we have to run the program many times in order to average the results).


This isn't an actual quantum computer but a Pseudoquantum one.

It's useful for some problems. — Preceding unsigned comment added by 2A02:2149:8430:F500:24AD:8759:82E7:D432 (talk) 16:23, 9 February 2018 (UTC)

and how to find the correct constant number and the correct attractor (at the beginning you can start with known problems and use it for educational reasons, then you can upgrade the system) — Preceding unsigned comment added by 2A02:2149:8430:F500:24AD:8759:82E7:D432 (talk) 16:31, 9 February 2018 (UTC)

noise testing

In some problems, random (false/erroneous) solution testing is helpful.

In bigger problems it doesn't work well. — Preceding unsigned comment added by 2A02:2149:8430:F500:24AD:8759:82E7:D432 (talk) 16:38, 9 February 2018 (UTC)

Analogue percentage of voltage through analogue logic gates and noise sharing through analogue logic gates

Analogue systems add more noise... but it's better than nothing. Analogue logic gates allow interference phenomena to occur. You run it many times, and all percentages of voltage per order of digit must be described by their lower-order significance digits (and some information is lost + we have noise). — Preceding unsigned comment added by 2A02:2149:8888:3A00:A10A:46B7:5026:A29A (talk) 00:30, 17 February 2018 (UTC)

We don't have 0 and 1 but values from 0% to 100%, and when we have less than 100% the digit has to become zero, and the lower-ranked digits should now express the number with absolute zeros and ones.
The initial percentaged analogue digits should communicate with analogue logic gates and a random generator which is activated sometimes, and another random generator shuffles the percentages of entangled configuration vs pure noise.
  • Sometimes we have 40% entanglement and 60% noise; we shuffle, though, between the modes of operation with another randomizer.
  • The entanglement can be the same number, the opposite number, or any known standard logic operation.

Dubious timeline

I find it odd that Yuri Manin is listed as the first person to propose the idea of quantum computation. The reference given is a 1980 book on (classical) computability which makes a brief mention (only in its preface!) of the possibility of using quantum effects for computation. Nowhere in the body of that book is quantum computing discussed. The idea of using quantum effects for computation was already articulated in Feynman's 1959 talk There's Plenty of Room at the Bottom:

Atoms on a small scale behave like nothing on a large scale, for they satisfy the laws of quantum mechanics. So, as we go down and fiddle around with the atoms down there, we are working with different laws, and we can expect to do different things. We can manufacture in different ways. We can use, not just circuits, but some system involving the quantized energy levels, or the interactions of quantized spins, etc.

— Richard P. Feynman, "There's Plenty of Room at the Bottom", American Physical Society (1959)

The current timeline is dubious and needs to be fixed. Feynman's 1959 talk is the earliest reference I know of (and is cited in the Britannica article on quantum computing), but there may be others. Would anyone care to comment/suggest other possible early references on using quantum effects in computing devices? Stablenode (talk) 17:01, 21 April 2018 (UTC)

Following up, an English translation of Manin's entire discussion of quantum computation in his 1980 book is given in Manin's own 1999 paper Classical computing, quantum computing, and Shor's factoring algorithm. It is literally 3 paragraphs long.

The following text is a contribution to the prehistory of quantum computing. It is the translation from Russian of the last three paragraphs of the Introduction to [Ma2] (1980). For this reference I am grateful to A. Kitaev [Ki].

“Perhaps, for better understanding of this phenomenon [DNA replication], we need a mathematical theory of quantum automata. Such a theory would provide us with mathematical models of deterministic processes with quite unusual properties. One reason for this is that the quantum state space has far greater capacity than the classical one: for a classical system with N states, its quantum version allowing superposition accommodates c^N states. When we join two classical systems, their numbers of states N1 and N2 are multiplied, and in the quantum case we get exponential growth c^(N1*N2). These crude estimates show that the quantum behavior of the system might be much more complex than its classical simulation.

In particular, since there is no unique decomposition of a quantum system into its constituent parts, a state of the quantum automaton can be considered in many ways as a state of various virtual classical automata. Cf. the following instructive comment at the end of the article [Po]: ‘The quantum-mechanical computation of one molecule of methane requires 10^42 grid points. Assuming that at each point we have to perform only 10 elementary operations, and that the computation is performed at the extremely low temperature T = 3·10^−3 K, we would still have to use all the energy produced on Earth during the last century.’

The first difficulty we must overcome is the choice of the correct balance between the mathematical and the physical principles. The quantum automaton has to be an abstract one: its mathematical model must appeal only to the general principles of quantum physics, without prescribing a physical implementation. Then the model of evolution is the unitary rotation in a finite dimensional Hilbert space, and the decomposition of the system into its virtual parts corresponds to the tensor product decomposition of the state space. Somewhere in this picture we must accommodate interaction, which is described by density matrices and probabilities.”

— Yuri Manin, "Classical computing, quantum computing, and Shor's factoring algorithm", https://arxiv.org/abs/quant-ph/9903008 (1999)
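To make the "crude estimates" in the quote concrete, here is a small sketch of my own (not from Manin's paper): describing an n-qubit register classically takes 2^n complex amplitudes, so the memory needed for a brute-force classical simulation grows exponentially with the number of qubits.

```python
# Illustration (mine, not Manin's): the full state vector of an n-qubit
# system holds 2**n complex amplitudes, so classical simulation memory
# grows exponentially with system size.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Bytes needed to store all 2**n amplitudes as complex doubles."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (1, 10, 30, 50):
    print(f"{n} qubits: {state_vector_bytes(n)} bytes")
```

At 50 qubits this is already 2^54 bytes (about 16 pebibytes), which is the kind of blow-up the quote's comparison of classical and quantum state spaces points at.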


Peter Shor in his 2000 arXiv article (Introduction to Quantum Algorithms) has a very clear summary of where Manin's contribution lies; it certainly cannot be said that Manin was the first to suggest the idea of quantum computing:

In 1982, Feynman [19] argued that simulating quantum mechanics inherently required an exponential amount of overhead, so that it must take enormous amounts of computer time no matter how clever you are. This realization was come to independently, and somewhat earlier, in 1980, in the Soviet Union by Yuri Manin [30]. It is not true that all quantum mechanical systems are difficult to simulate; some of them have exact solutions and others have very clever computational shortcuts, but it does appear to be true when simulating a generic quantum mechanics system. Another thing Feynman suggested in this paper was the use of quantum computers to get around this. That is, a computer based on fundamentally quantum mechanical phenomena might be used to simulate quantum mechanics much more efficiently. In much the same spirit, you could think of a wind tunnel as a “turbulence computer”. Benioff [5] had already showed how quantum mechanical processes could be used as the basis of a classical Turing machine. Feynman [20] refined these ideas in a later paper.

— Peter W. Shor, "Introduction to Quantum Algorithms", https://arxiv.org/pdf/quant-ph/0005003 (2000)

I propose the timeline be changed to reflect Feynman's earlier observations in the 1950s and make it clear that Manin's short observation in 1980 was articulating a motivation for pursuing this approach to computing. To claim that Manin spawned the field is ridiculous. Stablenode (talk) 19:57, 22 April 2018 (UTC)

in my opinion the 1959 talk by Feynman has at best a very tenuous link to the field: there is no hint of exploiting quantum effects for computation in the talk, and in the quoted section computing is not mentioned at all (it is earlier, but there only the potential of miniaturization is discussed, not quantum effects). So I think it's not correct to write that Feynman "states the possibility of using quantum effects for computation". Maybe one could say he implies it, but the reference to "different laws" that apply at the quantum level could just as well imply (as mentioned before in the talk) that they pose novel problems. Compared to later contributions like those of Wiesner or Holevo that are very explicit about using QM for information processing, and Feynman's 1982 talk, I find it a stretch to include "Plenty of Room" as having started this field. --Qcomp (talk) 18:02, 30 August 2018 (UTC)

Concrete proposals to improve this article

With the recent surge of interest in quantum computing, an increasing number of readers come to this article to learn what quantum computing is. But this article isn't as good as it could be, and I think it can be significantly improved. Maybe we can discuss some concrete proposals to improve the article? If we have reasonable consensus on some proposals, someone can go ahead and make changes. My suggestions are

  1. Merge the long Timeline section with the article Timeline of quantum computing and remove it from this article.
  2. More ambitiously, restructure the sections after the lede to clearly answer the questions a reader may have. I think the answers to many of these questions are already in the article, but some of them are hard to find. Some questions a reader may have:
    1. "What is a quantum computer?" (explain what it is)
    2. "Why do we want to build a quantum computer?" (describe applications and motivation for building one, examples of problems that can be solved exponentially faster on a quantum computer, etc.)
    3. "How are quantum computers made?" (discuss physical implementations, such as ion traps, etc., and the challenges in building a quantum computer)
    4. "Who is building quantum computers?" (Companies, universities, etc., perhaps with some examples. This could be part of the previous section, with examples of who is using which technology. E.g., Company X and Prof. Y at University Z are building quantum computers from ion traps.)
    5. "Where can I learn more?" (a short, curated further reading list with some guidance would be great; e.g., we should distinguish between popular science articles for the lay person, detailed overview articles for more interested readers, textbooks for undergraduates and grad students, etc.)
  3. We could follow the model of having a high-level encyclopedic overview in this article, and a more detailed explanation of what quantum computing is in another article, as done by quantum mechanics and introduction to quantum mechanics, and general relativity and introduction to general relativity (both of which are featured articles!).

That's all I can think of for now, but I'm sure others probably have concrete proposals to improve this article as well. Let's discuss them! — Robin (talk) 18:09, 20 May 2018 (UTC)

I don't understand this article at all

"The calculation usually ends with a measurement, collapsing the system of qubits into one of the {\displaystyle 2^{n}} 2^{n} eigenstates, where each qubit is zero or one, decomposing into a classical state.'

This is way too complicated for me, and I'm not exactly new to algorithms.

Please tell how quantum computing works in layman's terms. Like: "For example, if quantum computers had enough bits you could encode the Travelling Salesman problem for 5 cities like this, and after such and such trick, and measuring qubits 1 to 5, by simply ordering the probabilities you'd have the perfect route."

Joepnl (talk) 00:10, 13 June 2018 (UTC)
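For what it's worth, the quoted sentence can be illustrated in a few lines of plain Python. This is a sketch of my own, not the article's presentation, and the two-qubit equal superposition is just an assumed example: the state is a list of 2^n complex amplitudes, and a measurement samples one of the 2^n basis states (a classical bit string) with probability equal to the squared modulus of its amplitude.

```python
import random

n = 2                                  # number of qubits
dim = 2 ** n                           # 2**n basis states ("eigenstates")
amp = 1 / dim ** 0.5                   # equal-superposition amplitude
state = [complex(amp, 0)] * dim        # state vector: 2**n complex amplitudes

probs = [abs(a) ** 2 for a in state]   # Born rule: P(outcome) = |amplitude|^2
outcome = random.choices(range(dim), weights=probs)[0]  # "collapse" by sampling
print(format(outcome, f"0{n}b"))       # result is a classical bit string
```

Running it prints one of "00", "01", "10", or "11", each with probability 1/4; that sampling step is what "collapsing into one of the 2^n eigenstates" refers to.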

Copied to Quantum Mind page

This section was copied to the Quantum Mind page on 1 Feb 2018 by user: wcrea6:

Quantum computing is computing using quantum-mechanical phenomena, such as superposition and entanglement.[1] A quantum computer is a device that performs quantum computing. They are different from binary digital electronic computers based on transistors. Whereas common digital computing requires that the data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits, which can be in superpositions of states. One of the greatest challenges is controlling or removing quantum decoherence. This usually means isolating the system from its environment as interactions with the external world cause the system to decohere. Currently, some quantum computers require their qubits to be cooled to 20 millikelvins in order to prevent significant decoherence.[2] As a result, time consuming tasks may render some quantum algorithms inoperable, as maintaining the state of qubits for a long enough duration will eventually corrupt the superpositions.[3]

References

  1. ^ Gershenfeld, Neil; Chuang, Isaac L. (June 1998). "Quantum Computing with Molecules" (PDF). Scientific American.
  2. ^ Jones, Nicola (19 June 2013). "Computing: The quantum company". Nature. 498 (7454): 286–288. Bibcode:2013Natur.498..286J. doi:10.1038/498286a. PMID 23783610.
  3. ^ Amy, Matthew; Matteo, Olivia; Gheorghiu, Vlad; Mosca, Michele; Parent, Alex; Schanck, John (November 30, 2016). "Estimating the cost of generic quantum pre-image attacks on SHA-2 and SHA-3". arXiv:1603.09383 [quant-ph].

Suggested article improvements

Hello! I'd like to propose a few improvements to this Wikipedia article. But first, a disclosure, which is also displayed in the connected contributor template above: I am submitting this request on behalf of Intel via Interfuse Communications, and as part of my work at Beutler Ink. Given my conflict of interest, I'm seeking volunteer editors to review and implement the proposed content appropriately, and I will not edit the article directly. This request is related to another one I submitted here for the Timeline of quantum computing article, which has been answered already. I propose updating this article with the following claims to the "Timeline" section:

  • Intel confirms development of a 17-qubit superconducting test chip.[1]
  • QuTech successfully tests silicon-based 2-spin-qubit processor.[2]
  • Intel begins testing silicon-based spin-qubit processor, manufactured in the company's D1D Fab in Oregon.[3]
  • Intel confirms development of a 49-qubit superconducting test chip, called "Tangle Lake".[4]

References

  1. ^ Knight, Will (October 10, 2017). "Quantum Inside: Intel Manufactures an Exotic New Chip". MIT Technology Review. Retrieved July 5, 2018.
  2. ^ Giles, Martin (February 15, 2018). "Old-fashioned silicon might be the key to building ubiquitous quantum computers". MIT Technology Review. Retrieved July 5, 2018.
  3. ^ Forrest, Conner (June 12, 2018). "Why Intel's smallest spin qubit chip could be a turning point in quantum computing". TechRepublic. Retrieved July 12, 2018.
  4. ^ Hsu, Jeremy (January 9, 2018). "CES 2018: Intel's 49-Qubit Chip Shoots for Quantum Supremacy". Institute of Electrical and Electronics Engineers. Retrieved July 5, 2018.

Additional sourcing is available for each of these claims, but these were the sources preferred by the reviewing editor of the related edit request.

@Qcomp: Since you assisted with the other request, I'm curious if you're willing to review this one as well and update the article appropriately.

Thanks again for your consideration. Inkian Jason (talk) 19:44, 5 September 2018 (UTC)

Thanks for your suggestions and contribution. I have a problem with this whole section of the article. Given that the main article is Timeline of Quantum Computing, this here should be, in my opinion, a very restricted subset of that more detailed timeline. It does start out that way but then quickly becomes a potpourri of big, small, and marginal advances full of marketing-speak; I think the list should only contain a few items per "branch" (different implementations, algorithms, complexity results) that represent such big steps that the interested layperson can appreciate them clearly as progress. What's the opinion of other authors of this article? Without a better idea of what the consensus here is, I'm reluctant to add to a list that I'd much rather shorten first. --Qcomp (talk) 13:39, 11 September 2018 (UTC)
Not an author, but I think that the general Wikipedia approach pushes towards keeping things simple. The major advancements in the field of quantum computing should be enough to provide information to the reader, and "minor" ones can be kept in the timeline. #!/bin/DokReggar -talk 14:06, 11 September 2018 (UTC)
@Qcomp and DokReggar: Thank you both for replying. I see what you mean regarding redundancy between this article's "Timeline" section and the Timeline of quantum computing article. The reason I'm attempting to add details about Intel's work is because the company is currently not mentioned at all, yet "IBM" appears in the article's prose approximately 30 times. I am not sure what the criteria are for inclusion, but my goal was to make this article more equitable. Are there ways to address this imbalance? Inkian Jason (talk) 21:33, 11 September 2018 (UTC)
I agree that the timeline is unbalanced; in my opinion, what we should have is maybe one entry saying that beginning in 2012, major technology companies such as IBM, Google, and Intel made significant investments in the construction of quantum computers based on superconducting qubits and announced in rapid succession the availability of devices with several tens of qubits. Maybe it would also make sense to add to the timeline here only with one year delay, picking the few most relevant contributions out of the more extensive main Timeline-article. However, I think a renovation of the section has to be discussed further (and I don't have time right now to attempt it). For the time being, I think it's appropriate to include the announcements on the spin-qubit processor, because it's (to my knowledge) the first contribution of a major commercial actor to this implementation. As a test balloon for a more sparse way of reporting the timeline here, I'd propose an entry:

In late 2017 and early 2018 IBM[1], Intel[2], and Google[3] each reported testing quantum processors containing 50, 49, and 72 qubits, respectively, all realized using superconducting circuits. These circuits are approaching the range in which simulating their quantum dynamics is expected to become prohibitive on classical computers, although it has been argued that further improvements in error rates are needed to put classical simulation out of reach.[4]

References

  1. ^ Will Knight (2017-11-10). "IBM Raises the Bar with a 50-Qubit Quantum Computer". MIT Technology Review. Retrieved 2017-12-13.
  2. ^ Hsu, Jeremy (January 9, 2018). "CES 2018: Intel's 49-Qubit Chip Shoots for Quantum Supremacy". Institute of Electrical and Electronics Engineers. Retrieved July 5, 2018.
  3. ^ Susan Curtis (2018-03-06). "Google aims for quantum supremacy". PhysicsWorld. Retrieved 2018-09-12.
  4. ^ Tom Simonite (2018-05-19). "Google, Alibaba Spar Over Timeline for "Quantum Supremacy"". Wired. Retrieved 2018-09-12.
let me know what you think. --Qcomp (talk) 10:41, 12 September 2018 (UTC)
Seems good and terse enough for me. Thanks for your work! #!/bin/DokReggar -talk 13:19, 12 September 2018 (UTC)
@Qcomp: Thanks for proposing specific wording for a summary. I certainly understand your preference to summarize where appropriate. Do you mind clarifying what exactly is being trimmed in favor of the suggested wording? Thanks in advance. Inkian Jason (talk) 20:40, 12 September 2018 (UTC)
sure; I'm proposing to replace the "November 2017" (IBM), March 2018 (Google), and your proposed January 2018 (49 qubit chip) entries by the one given above. I would argue that these three announcements have some significance due to the approach to the supposed "no-longer-simulable" threshold, though there is of course no hard threshold, and in quoting only the number of qubits and not the achieved qubit lifetimes and gate fidelities (especially compared to the smaller devices before: do we see scalability in that the fidelities stay constant even though the system gets bigger?), it's not clear how big a step they really represent. In a few years' time, neither the small difference in number nor in time of announcement will seem important; that's why I think a single entry is appropriate.
If the change does not meet with resistance I would then (slowly) try to concentrate the timeline here to a few major scientific and technological advances. E.g., I do not see the point of pure business announcements like the one of April 2018 in this timeline, but it would be better if we could develop some set of criteria here for what should be included in the timeline. --Qcomp (talk) 21:54, 12 September 2018 (UTC)
@Qcomp: OK, thanks for clarifying. Your summary seems appropriate. Inkian Jason (talk) 22:44, 12 September 2018 (UTC)
I see the summary was added with this edit. Thanks again for your help. Inkian Jason (talk) 15:18, 17 September 2018 (UTC)
I'm a bit late to this discussion, but I agree that this section is just far too cluttered for this article. Given that we already have a timeline of quantum computing, I'm in favor of drastically cutting this section down to a paragraph or two. --Robin (talk) 05:18, 28 January 2019 (UTC)

Article is unintelligible as to what Quantum Computing actually is

We start with some discussion about binary numbers, that 2^3 = 8. Then it says that the states are complex numbers, with the probability being the modulus. OK, but so what? Then a leap into vagueness.

It reads like it has been written by someone that read a quantum paper without understanding it, and picked out a few key words.

This is the nature of the topic. There is only either journalistic fluff or heavy research papers. There is the book by Nielsen, which I may try to wade through if I have time, but like all this stuff it enjoys heavy mathematical treatment of simple ideas. (As an analogy, one does not need to start with partial differential equations in order to develop the idea that apples fall off trees.)

So what is required for this article is someone that actually understands the topic, yet can write clearly. Sadly, I doubt if that person exists. Tuntable (talk) 02:13, 17 September 2018 (UTC)

I tend to agree. Boeing720 (talk) 09:42, 25 December 2018 (UTC)

Qubit vs logic Qubit? Impact on society (internet mainly)

As I have understood it, a Qubit state cannot be directly read. This can be solved (I only have vague ideas of how), but even then errors may occur. The studied particle behaves according to a certain probability, which is not always correct. Nevertheless an error-corrected Qubit can be obtained (somehow), and would then equal a logic Qubit. If I'm not all wrong here, the logic Qubit is very desirable, and could possibly be given a headline high up (?).
IBM predicted in May 2018 that Quantum Computing will be mainstream in five years, according to this short video [1] (press "play"). Are we in line with the progress of Quantum Computers from a society aspect? Binary technology can still be used for decades ahead, in stand-alone (embedded) systems - but what about the internet, as Quantum Computers come for real? Let's say a 300 logic qubit machine, what would its impact on the internet be? Boeing720 (talk) 09:40, 25 December 2018 (UTC)

In 'Basics' perhaps clarify "... can be in an arbitrary superposition of up to 2^n different states simultaneously"

Does this mean "in EVERY superposition of up to 2^n different states simultaneously"? If not, surely "an arbitrary superposition" is exactly ONE superposition state, and being in one state "simultaneously" does not seem meaningful. JohnjPerth (talk) 00:42, 28 January 2019 (UTC)johnjPerth

I think the word 'simultaneously' is a bit redundant and confusing. I think a 'superposition' is intrinsically being partly in many states simultaneously.

118.209.183.247 (talk) 07:37, 12 March 2019 (UTC)

'Programming The Universe' (2006) by MIT's Dr. Seth Lloyd

In Further Reading, I added... *Lloyd, Seth (2006) Programming The Universe, Alfred A. Knopf. 2601:580:106:5243:34B1:52ED:44D3:4115 (talk) 12:08, 11 May 2019 (UTC)

How small?

The Lede section says: "nor have commercially useful algorithms been published for today's small, noisy quantum computers."

How small? Like 2 qubits? 16? Is that how "size" is measured? This is the critical paragraph that explains if we are talking dreams or reality. More should be said on that — explicitly. See also MOS:LEDE.
--2602:306:CFCE:1EE0:81AE:F280:6717:4833 (talk) 16:05, 8 September 2019 (UTC)JustAsking

Eigenstate

The Basic paragraph mentions a keyword, eigenstate(s). Please give an inline citation for it, because that paragraph is supposed to be read, as a "Basic" section, by a person who probably never learnt quantum mechanics. --Kittyhawk2 (talk) 07:56, 26 September 2019 (UTC)

Ongoing rewrite from quantum computing expert

Hi all, I've created a new account largely to fix up this mess of an article. Please see my user page for information about me.

I've done a fairly thorough rewrite of the introduction and the (now renamed) Basic Concepts section. I will try to find time over the next few weeks to rewrite most of the rest of the article. To be clear, the article as it was written was full of fairly egregious misconceptions. It is clear that no expert has helped out for a long time. I'm fairly knowledgeable about the subject and decided that I'm going to do it if no one else is going to.

I am not an experienced editor of Wikipedia, so I would appreciate it if anyone better connected than me can put me in touch with an experienced editor. I would gladly work with that person to make this article a worthy exemplar of Wikipedia as a whole. As it stands, I think the article is an embarrassment to the quantum computing community. We should never have allowed the article to persist like this for so long.

I regularly use Wikipedia for research as well as personal reasons, and I don't donate as often as I should. Let this effort stand as my contribution to the Wikipedia project.

Yuval Rishu Sanders (talk) 11:01, 27 September 2019 (UTC)

Thank you for your work, it is good to have experts editing technical articles. I watch this page but have never contributed to it other than reverting vandalism. A small tip for editing: as you can see here, [[voltage]]s (voltages) produces the same as [[Voltage|voltages]] (voltages) yet is simpler and cleaner. BernardoSulzbach (talk) 14:12, 27 September 2019 (UTC)
Very nice that you want to clean up the article! Please can you comment on my last edit, in particular why Google removed this article and materials related to it? ZBalling (talk) 21:03, 27 September 2019 (UTC)
Just getting onto this now. The Google article was never intended to be publicised, and was leaked to the media via NASA only by mistake. It wouldn't be appropriate for me to comment further. I appreciate the other comments on how to edit Wikipedia properly, and I'm glad that experienced editors can be checking to make sure I do a good job. Yuval Rishu Sanders (talk) 08:03, 7 October 2019 (UTC)

I've finished up the next stage of this rewrite, which is to replace the confusing and incorrect pair of sections on quantum operations with a more reasonable introduction to quantum logic gates. I am aware that the current presentation could be too technical and I would appreciate any non-expert comments and criticisms.

In the next stage of this project, I hope to write something about the basic idea of how we implement quantum operations. To give away the punchline, I tend to explain this in terms of Schroedinger's equation and pulse shaping. That section is a bit harder to write because very few authors think to say how quantum computers are supposed to work at a purely abstract level. Those discussions are usually focussed on one possible implementation or another, which is a shame because the underlying principle isn't hardware-specific. Yuval Rishu Sanders (talk) 10:07, 7 October 2019 (UTC)

I've reviewed your edit and improved on some minor points. As suggestions, I recommend checking out MOS:WE and MOS:NOTE. As I said in my previous comment, you should try to simplify links when you can, as in [2]. BernardoSulzbach (talk) 14:28, 7 October 2019 (UTC)
Thanks. Apologies for links, I’ve been using the rich text editor. I’m still learning the wiki markup language, whatever it’s called. I’ll take a closer look at your suggested links. Again, thanks for the help. Yuval Rishu Sanders (talk) 23:03, 24 October 2019 (UTC)
"I’ve been using the rich text editor." I honestly thought it would generate better markup. Ultimately, as long as the text is correct, verifiable and good to read, the technical details will get resolved in the long-term. Thank you for your work. BernardoSulzbach (talk) 14:26, 25 October 2019 (UTC)
I've made some edits in the hopes of improving the style. I'm sure it's not yet up to snuff, but I think I've dealt with some of the major concerns. It would be good to get a few people involved with this project, and I'll start asking around. Yuval Rishu Sanders (talk) 05:10, 4 January 2020 (UTC)
MOS:INTRO says that opening paragraphs should avoid "difficult to understand terminology" and that "the average Wikipedia visit is a few minutes." I expect that this article is visited by many non-scientists looking for a very high level explanation and an overview of what's currently possible. Such readers will not get very far past references to Turing Machines and Lambda Calculus. How can the introduction to such a technical concept be rewritten for readers with only introductory (or no) relevant background? I suggest moving the contents of the Basic concepts section to the opening paragraphs, and demoting the more technical definitions and historical content to lower in the intro, or elsewhere in the article. Do editors agree? Bosons (talk) 16:55, 4 January 2020 (UTC)

Remove the 'Timeline' section

https://en.wikipedia.org/wiki/Quantum_computing#Timeline will be a never-ending list that will take up most of the page. Some of the most important information has already been repeated throughout this article. Do you have objections? Vtomole (talk) 16:01, 20 November 2019 (UTC)

I agree: there should be a (regular text) section about the development of quantum computing since the 1980s, but not a newsticker as it is now (except for the time until 1996 - but even there, items like QKD and quantum teleportation do not belong). And having already Timeline of quantum computing is enough - there's no need to have two lists without clearly defined and distinct criteria on what belongs on which. --Qcomp (talk) 18:44, 20 November 2019 (UTC)
No objections. I personally dislike the "thousand small paragraphs" style. Maybe decide what is truly essential and keep that in 3-4 paragraphs and move the rest into Timeline of quantum computing, if it isn't already there. BernardoSulzbach (talk) 17:15, 23 November 2019 (UTC)

Secondary refs

Reuters has an article on disagreements about the Google claims: https://www.kitco.com/news/2019-10-23/Google-claims-quantum-supremacy-others-say-hold-on-a-qubit.html Kdammers (talk) 03:31, 27 October 2019 (UTC)

One of the classical quantomimes

  1. use a classical computer
  2. design a noise generator biasing algorithm
    (For example, white noise is unbiased; usually we don't want that pattern here. A channeling connectome is also required so that many layers of results can evolve in a system. We run the program many times per second and use sieving formulas for the finds.)
  3. mimic a quantum computer approach for that single problem
make page: quantomime: Classical system which mimics a quantum one.

Possible error in "Quantum Supremacy" section

If I understand this correctly, it is proof that there is in fact a category of problems only solvable and testable by quantum machines, which a standard computer would not be able to solve regardless of the time or complexity available[1][2]. David Deutsch's 1985 paper [page 5] states "The fact that classical physics and the classical universal Turing machine do not obey the Church-Turing principle in the strong physical form (1.2) is one motivation for seeking a truly quantum model"[3]

"Any computational problem that can be solved by a classical computer can also, in principle, be solved by a quantum computer. Conversely, quantum computers obey the Church–Turing thesis; that is, any computational problem that can be solved by a quantum computer can also be solved by a classical computer. While this means that quantum computers provide no additional advantages over classical computers in terms of computability, they do in theory enable the design of algorithms for certain problems that have significantly lower time complexities than known classical algorithms. Notably, quantum computers are believed to be able to quickly solve certain problems that no classical computer could solve in any feasible amount of time—a feat known as "quantum supremacy." The study of the computational complexity of problems with respect to quantum computers is known as quantum complexity theory."

--KyleBrochu (talk) 21:10, 30 June 2020 (UTC)

Hmm, the 1985 paper may conflict with what is stated on the page, but it also may not. The introduction states:
"Computing machines resembling the universal quantum computer could, in principle, be built and would have many remarkable properties not reproducible by any Turing machine. These do not include the computation of non-recursive functions, but they do include‘quantum parallelism’, a method by which certain probabilistic tasks can be per-formed faster by a universal quantum computer than by any classical restriction of it." 
I'm not sure that this is a refutation. He doesn't claim that there are undecidable problems that a quantum computer can solve or claim that quantum computers do not satisfy the standard definition of the Church-Turing Thesis (what he calls the "conventional, non-physical view" of the thesis). Rather, at first glance, what he appears to do is define a new, stronger "physical version of the Church-Turing principle" (he explicitly states "I propose to reinterpret Turing’s 'functions which would naturally be regarded as computable'"), which is: "Every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means". He then says that classical computers do not satisfy this stronger principle but conjectures that quantum computers do. I'm not sure now how exactly to judge this and will have to look more into this paper.
The reference provided in the section on computability (Nielsen pg. 126), however, is more straightforward. It states:
"Quantum computers also obey the Church-Turing thesis. That is, quantum computers can compute the same class of functions as is computable by a Turing machine. The difference between quantum computers and Turing machines turns out to lie in the efficiency with which the computation may be performed".
This appears to corroborate the claim made on this page. Online it is actually fairly difficult to find an answer to this problem as research overwhelmingly deals with complexity and very little with computability. Lots of online forums agree with this page, but those are obviously not reliable sources and it is certainly possible that they are agreeing with Wikipedia precisely because they based their answer on what they read on Wikipedia.
And on the other two articles you provided, they do not conflict with the page. They state that there are problems that a quantum computer can solve that classical computers cannot feasibly solve, i.e. it takes too much time; they do not state that a quantum computer can solve a problem a classical one cannot. Look at the important qualification in the title of the first article: "Finally, a Problem That Only Quantum Computers Will Ever Be Able to Solve". The title includes "ever be able to solve" because the article then goes on to describe how the researchers "prove, with a certain caveat, that quantum computers could handle the problem efficiently while traditional computers would bog down forever trying to solve it." In other words, they are merely saying the researchers have identified a problem that would take a classical computer infeasibly long to solve. That's why the article then goes on to talk about the complexity classes BQP and PH—because they're talking about time complexity, not computability. The second article you provide similarly discusses the separation of BQP and PH (it asks in the introduction: "Can polynomial-time quantum algorithms be simulated by classical algorithms in the polynomial-time hierarchy?"); it is not speaking to computability.
--Jaydavidmartin (talk) 23:56, 13 October 2020 (UTC)