Why do we need Q#?


😉 Good Guy 😉

Nov 15, 2018, 9:45:43 PM

<https://blogs.msdn.microsoft.com/visualstudio/2018/11/15/why-do-we-need-q/>

--
With over 950 million devices now running Windows 10, customer satisfaction is higher than any previous version of Windows.

Mr. Man-wai Chang

Nov 15, 2018, 11:17:43 PM
On 11/16/2018 10:45 AM, 😉 Good Guy 😉 wrote:
>
> <https://blogs.msdn.microsoft.com/visualstudio/2018/11/15/why-do-we-need-q/>
>

We don't!

--
@~@ Remain silent! Drink, Blink, Stretch! Live long and prosper!!
/ v \ Simplicity is Beauty!
/( _ )\ May the Force and farces be with you!
^ ^ (x86_64 Ubuntu 9.10) Linux 2.6.39.3
No borrowing! No scams! No gambling! No compensated dating! No fighting! No robbery! No suicide! No praying to gods!
Please consider CSSA (Comprehensive Social Security Assistance):
http://www.swd.gov.hk/tc/index/site_pubsvc/page_socsecu/sub_addressesa

Big Bad Bob

Nov 16, 2018, 3:45:34 AM
On 11/15/18 20:17, Mr. Man-wai Chang wrote:
> On 11/16/2018 10:45 AM, 😉 Good Guy 😉 wrote:
>>
>> <https://blogs.msdn.microsoft.com/visualstudio/2018/11/15/why-do-we-need-q/>
>>
>>
>
> We don't!
>

we didn't/don't need C-pound, either. Nor '.Not'. But none of this
will stop Micro-shaft from cramming it at us and insisting we jump on
"yet another bandwagon" and chase "yet another moving target" as a
development platform, until they pull the rug from under us and decide
to stop supporting it...

Micro-shat has been "getting it wrong" since the ".Not" initiative in
the early noughties. It was wrong then, it's still wrong now, and
win-10-nic is just another example of them NOT having a clue.

now, where did I leave my clue-bat...

--
(aka 'Bombastic Bob' in case you wondered)

'Feeling with my fingers, and thinking with my brain' - me

'your story is so touching, but it sounds just like a lie'
"Straighten up and fly right"

Mayayana

Nov 16, 2018, 8:32:55 AM
> On 11/15/18 20:17, Mr. Man-wai Chang wrote:
>> On 11/16/2018 10:45 AM, 😉 Good Guy 😉 wrote:
"Big Bad Bob" <BigBadBob-at...@testing.local> wrote:
Why don't you block those posts? Do you just like to
get mad? Neither of them posts anything but pro-MS
and anti-everyone else. I'm not convinced either one
is even a human. Though Good Guy's animosity seems
a bit too colorful to be cooked up by software. :)

Do you actually understand the page linked? I don't.
I wasn't aware that so-called quantum computing even
existed yet. Maybe it doesn't and they're just planning.
The fact the author uses "TL;DR" is a good indicator
that he thinks by piling popular, current cliches together.

Which is an interesting thing about programmers in general.
They tend to be people who are very good with math
but virtually incapable of intellectual thinking. By which
I mean that if the meaning of life or advice on dating can't
be rendered in a scientific formula then it doesn't count
for them as a relevant topic for their attention.

The piece is sort of interesting, though. It sounds like
Q is meant to be used like inline assembly. The main
difference I can see is that inline assembly does something,
going direct to the CPU, while it's not clear what, if
anything, Q does at this point. Their sample code looks
like randomization. But I don't think I have the curiosity
to figure out whether they're really talking about something.
(Nor do I want to install the latest VS.)
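[If the Q# sample in the linked post is, as it looks, generating random bits by measuring qubits in superposition, the classical equivalent is trivial. A minimal Python sketch of that idea (not the actual Q# code; the hypothetical function names are mine) would be:]

```python
import random

def measure_plus_state():
    """Measure a qubit in the |+> state (the equal superposition a
    Hadamard gate produces from |0>): outcomes 0 and 1 each occur
    with probability |1/sqrt(2)|^2 = 0.5. Classically, we simply
    sample that distribution."""
    return 0 if random.random() < 0.5 else 1

def random_byte():
    """Eight independent 'measurements' assembled into one byte."""
    value = 0
    for _ in range(8):
        value = (value << 1) | measure_plus_state()
    return value

sample = [random_byte() for _ in range(4)]  # four uniformly random bytes
```

[The point of doing this on quantum hardware rather than with a PRNG is that the measurement outcomes are physically, not algorithmically, random.]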


Mr. Man-wai Chang

Nov 16, 2018, 8:38:59 AM
On 11/16/2018 4:45 PM, Big Bad Bob wrote:
>
> we didn't/don't need C-pound, either. Nor '.Not'. But none of this
> will stop Micro-shaft from cramming it at us and insisting we jump on
> "yet another bandwagon" and chase "yet another moving target" as a
> development platform, until they pull the rug from under us and decide
> to stop supporting it...
>
> Micro-shat has been "getting it wrong" since the ".Not" initiative in
> the early noughties. It was wrong then, it's still wrong now, and
> win-10-nic is just another example of them NOT having a clue.
>
> now, where did I leave my clue-bat...
>

A new/different programming language is just a change of words and
syntax! :)

Mr. Man-wai Chang

Nov 16, 2018, 8:40:33 AM
On 11/16/2018 9:31 PM, Mayayana wrote:
> The piece is sort of interesting, though. It sounds like
> Q is meant to be used like inline assembly. The main
> difference I can see is that inline assembly does something,
> going direct to the CPU, while it's not clear what, if
> anything, Q does at this point. Their sample code looks
> like randomization. But I don't think I have the curiosity
> to figure out whether they're really talking about something.
> (Nor do I want to install the latest VS.)


Anyone wanna bet that Q# compiler would be written in C? ;)

Wolf K

Nov 16, 2018, 9:33:14 AM
On 2018-11-16 08:31, Mayayana wrote:
[...]
> Do you actually understand the page linked? I don't.
> I wasn't aware that so-called quantum computing even
> existed yet. Maybe it doesn't and they're just planning.
[...]

Quantum computing is simulated on ordinary computers. Obviously not the
really big problems that a true q-machine could solve, but manageable
problems to test programming concepts and provide proof-of-concept. A
coding language for q-machines is part of that effort: it's kinda
difficult to think in terms of propositions that are both true and false
until the probability wave collapses (which IMO is a highly misleading
metaphor, aka "interpretation", but that's another issue).

FWIW, a recent report in New Scientist said that one of the players
(IBM?) has managed to isolate four or five q-bits, IIRC. The technical
problem is to maintain q-bits long enough to actually do some real work.
Single-atom q-bits are very unstable. There are some hints that
molecular q-bits could be more stable, i.e., could be made at higher
temperatures. In any case, portable q-machines will not be available,
since the chips will have to be cooled to near absolute zero.
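[The simulation idea above is concrete enough to sketch. A single qubit is just two complex amplitudes; gates are 2x2 matrices; measurement samples the Born probabilities and collapses the state. This Python sketch shows the one-qubit case only; a real simulator tracks 2^n amplitudes for n qubits, which is exactly why classical simulation hits a wall:]

```python
import math
import random

def hadamard(state):
    """Apply the Hadamard gate H = 1/sqrt(2) * [[1, 1], [1, -1]]
    to a one-qubit state vector [alpha, beta]."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def measure(state):
    """Sample an outcome with probabilities |alpha|^2 and |beta|^2,
    then collapse the state to the measured basis vector."""
    p0 = abs(state[0]) ** 2
    outcome = 0 if random.random() < p0 else 1
    collapsed = [1.0, 0.0] if outcome == 0 else [0.0, 1.0]
    return outcome, collapsed

# Start in |0>, apply H: each measurement outcome now has probability 1/2.
state = hadamard([1.0, 0.0])
```

[Note that applying hadamard twice returns the state to |0>, since H is its own inverse: the kind of reversible-gate property a q-language has to express.]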

Best,

--
Wolf K
kirkwood40.blogspot.com
People worry that computers will get too smart
and take over the world, but the real problem is
that they’re too stupid and they’ve already taken over
the world (Pedro Domingos)

Mr. Man-wai Chang

Nov 16, 2018, 9:48:56 AM
On 11/16/2018 10:33 PM, Wolf K wrote:
> On 2018-11-16 08:31, Mayayana wrote:
> [...]
>> Do you actually understand the page linked? I don't.
>> I wasn't aware that so-called quantum computing even
>> existed yet. Maybe it doesn't and they're just planning.
> [...]
>
> Quantum computing is simulated in ordinary computers. Obviously not the
> really big problems that a true q-machine could solve, but manageable
> problems to test programming concepts and provide proof-of-concept. A
> coding langauge for q-machines is part of that effort: it's kinda
> difficult to think in terms of propositions that are both true and false
> until the probability wave collapses (which IMO is a highly misleading
> metaphor, aka "interpretation", but that's another issue).
>
> FWIW, a recent report in New Scientists said that one of the players
> (IBM?) has managed to isolate four or five q-bits IIRC. The technical
> problem is to maintain q-bits long enough to actually do some real work.
> Single-atom q-bits are very unstable. There are some hints that
> molecular q-bits could be more stable, ie, could be made at higher
> temperatures. In any case, portable q-machines will not be available,
> since the chips will ahve to be cooled to near absolute zero.

Quantum is just a different magic wand! There is nothing new there!