
Re: [metatuning] Digest Number 1015

🔗Robert Walker <robertwalker@...>

7/7/2004 5:26:00 PM

Hi Carl,

> Not if it is allowed to update its programming.

That's no good if the way it is allowed to
update its programming is itself programmed.
You can add infinitely many Godel axioms
to a theory in an algorithmic way and
get a Godel sentence still. I can't
remember the details but remember the
result. That's why Roger Penrose says
that there has to be something not computable
going on. His ideas were pretty thoroughly
scrutinised - here in Oxford there is
perhaps the largest logic / philosophy
of mathematics group of researchers in
the UK and he presents his ideas to the
group and gets lots of feedback from it.
I don't think you will find flaws
in the reasoning or the philosophy
- at least not ones that are easy to find, certainly.
I was here studying at the university
when he was writing the book, not that
I was in any way involved in the criticism
of his ideas, but I did go to some of his
early talks that he gave before it was published.
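
To spell out the point above about adding
Godel axioms algorithmically - this is only
the rough shape of it from memory, with G(T)
as my shorthand for the Godel sentence
of a theory T:

T_0 = PA (or any consistent, recursively
          axiomatised theory)
T_(n+1) = T_n + G(T_n)
T_omega = the union of all the T_n,
and so on up through the computable ordinals.

As long as the whole construction is given
by an algorithm, the resulting theory is
still recursively axiomatisable, so Godel's
theorem applies to it all over again and it
has a Godel sentence of its own. Turing and
later Feferman studied these transfinite
progressions of theories in detail.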

> I fail to see how humans magically escape
> Godel incompleteness.

Well the other possibility is that human
beings don't quite understand truth,
so that on occasion when presented with
a Godel sentence they might not recognise
its truth, no matter how long they
consider it, even though it is in fact
true (if the axioms are consistent).
But I take the point of view that
humans can understand what truth is and
can follow that, and can't be mistaken
about it - they can be in individual cases,
but the basic idea of what truth is is
something clear that I think will be shared
by any living being that reaches
a certain level of understanding
of the world - not by all beings,
it isn't a hallmark of consciousness,
but it is a potential in consciousness
and I'm convinced by Roger Penrose's
argument that it isn't a potential
in programmed computers.

> > On the question about whether the
> > world continues after one dies
> > - would it continue if there
> > were no beings at all left in
> > the world do you think?

> I do, because that seems like the best
> explanation of what I observe.

> I highly recommend David Deutsch's
> _The Fabric of Reality_. I think
> you'd love it!

Yes of course, no reason why you
wouldn't. Of course you would
if you think of matter as
primary and mind as secondary.

Indeed only if one has very strong
idealist views might one wonder
about it. If one thinks of
the world as mediating interactions
between minds, one might nevertheless
consider it possible that it could
continue without minds there.
There are other ways it could continue too.

But perhaps you can see how it
is a question one can ask, and
that might even cause some moments
of thought - and that others might
legitimately have other views
about the matter.

I remember a couple of talks by
David Deutsch as he was also
here in Oxford, teaching when I was
studying here - they were on
quantum computation, and interesting
they were too. But I didn't know
he'd written this book - I'm sure
he would have interesting things
to say, and I'll look out for it.

> > If one sees minds as primary,

> I see information processing as
> primary - slightly different.

Yes I understand that. I wasn't
aiming to present you as an
Idealist!! But then I suppose
what I said could be understood
that way by someone who thinks of
minds as by definition
information processing.

I just thought you'd
be interested in the parallels
and the kind of dualist (not perfect
surely) symmetry between the two positions
- that reductionists say the
same kind of things about matter that
idealists say about mind.
That the question of whether
the mind continues after the
body and world are gone from one's
personal experience is a kind of
symmetrical question to the one
about whether the world continues
after minds are gone from it.

Then, same sort of reasoning too,
that one thinks it does because
often other things have departed
from one's awareness during one's
life and the mind continued through it,
so it seems that it would continue
even when the entire world
disappears. Particularly also
that when one wakes up from dreams
everything in the dream vanishes
completely but the mind continues
into another world. That happens
every night, though one doesn't
always remember it. So it seems
pretty likely that it will happen
when one dies too.

That I know won't seem convincing
if you think of matter as primary,
but if you think of mind as primary
then it is - and which you take
as primary is a philosophical
rather than a scientific matter,
as both are valid extensions of
the scientific method - at least of its way
of reasoning carefully and clearly about
things - into reasoning about the mind
and philosophical matters.

It is a kind of interesting parallel,
the way the one argument reasons
for the continuation of matter
in the form of this world
after one has gone from it, and the other
reasons for the continuation of
mind after one's body and personal
world has gone. Though the
arguments aren't exact parallels,
there's an intriguing kind of near
duality between them.

Robert

🔗Carl Lumma <clumma@...>

7/7/2004 6:22:36 PM

> Hi Carl,

Hello there,

> > Not if it is allowed to update its programming.
>
> That's no good if the way it is allowed to
> update its programming is itself programmed.

Sure it is. Discrete programs can produce as
random an output as you like, therefore they
should be capable of modifying themselves in
as complex a way as you like. Through iteration,
all things are possible!
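
Just to give a picture of what I mean by a
program rewriting itself, here's a toy sketch
in Python (the names and the silly mutation
rule are made up purely for illustration):

import random

# A "program" carried around as its own source text: a rule mapping
# a state to the next state. The loop repeatedly rewrites that
# source, so the code that runs next time isn't the code that ran
# this time.
rule_src = "def rule(x): return (x + 1) % 10"

def mutate(src):
    # Toy mutation: swap in a random increment. In principle the
    # edits made here could be arbitrarily complex.
    return "def rule(x): return (x + %d) %% 10" % random.randint(1, 9)

state = 0
for step in range(5):
    namespace = {}
    exec(rule_src, namespace)          # load the current version of itself
    state = namespace["rule"](state)   # run it
    rule_src = mutate(rule_src)        # then rewrite its own code
    print(step, state, rule_src)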

> You can add infinitely many Godel axioms
> to a theory in an algorithmic way and
> get a Godel sentence still.

The thing about Godel sentences is that they're
not terribly significant in thinking and reasoning.
It's like the old joke where you complain to your
Doctor that your elbow hurts when you move it "like
this", and his answer is, "Then don't move it like
that!"

> That's why Roger Penrose says
> that there has to be something not computable
> going on.

Nonsense, as far as I could tell. There's nothing
uncomputable about Godel sentences.

> His ideas were pretty thoroughly
> scrutinised - here in Oxford there is
> perhaps the largest logic / philosophy
> of mathematics group of researchers in
> the UK and he presents his ideas to the
> group and gets lots of feedback from it.
> I don't think you will find flaws
> in the reasoning or the philosophy
> - at least not ones that are easy to find, certainly.

Perhaps I should revisit this book, next time
I visit my parents. Of course by now I'm sure
there are many comments on the web about it.

> I was here studying at the university
> when he was writing the book, not that
> I was in any way involved in the criticism
> of his ideas, but I did go to some of his
> early talks that he gave before it was published.

Penrose was at Penn State, where many of my
close friends studied, when we were in college.
He was constantly giving talks on his
microtubules idea. My guess at the time was
that he had been estranged from Oxford.

> > I fail to see how humans magically escape
> > Godel incompleteness.
>
> Well the other possibility is that human
> beings don't quite understand truth,
> so that on occasion when presented with
> a Godel sentence they might not recognise
> its truth, no matter how long they
> consider it, even though it is in fact
> true (if the axioms are consistent).
> But I take the point of view that
> humans can understand what truth is and
> can follow that, and can't be mistaken
> about it - they can be in individual cases,
> but the basic idea of what truth is is
> something clear that I think will be shared
> by any living being that reaches
> a certain level of understanding
> of the world - not by all beings,
> it isn't a hallmark of consciousness,
> but it is a potential in consciousness
> and I'm convinced by Roger Penrose's
> argument that it isn't a potential
> in programmed computers.

I don't see truth as having anything to do
with consciousness. Truth is a rather
obscure notion that took many centuries to
refine. It doesn't really exist outside of
formal symbolic systems.

> > > On the question about whether the
> > > world continues after one dies
> > > - would it continue if there
> > > were no beings at all left in
> > > the world do you think?
>
> > I do, because that seems like the best
> > explanation of what I observe.
>
> > I highly recommend David Deutsch's
> > _The Fabric of Reality_. I think
> > you'd love it!
>
> Yes of course, no reason why you
> wouldn't. Of course you would
> if you think of matter as
> primary and mind as secondary.

No, I'm the one who thinks of information
as primary. In that, I like Deutsch's
"Virtual Reality" principle. But here I
was referring to it because of his
discussion of induction, and other
philosophical issues that occupy the
bulk of his book.

> I remember a couple of talks by
> David Deutsch as he was also
> here in Oxford, teaching when I was
> studying here - they were on
> quantum computation, and interesting
> they were too. But I didn't know
> he'd written this book - I'm sure
> he would have interesting things
> to say, and I'll look out for it.

Quantum mechanics is given short shrift
in the book, actually. He does harp
repeatedly on the 'if shadow particles
aren't real, then how do quantum computers
work?' point, but doesn't go into much
detail. His philosophical musings are IMHO
more interesting.

> > > If one sees minds as primary,
>
> > I see information processing as
> > primary - slightly different.
>
> Yes I understand that. I wasn't
> aiming to present you as an
> Idealist!! But then I suppose
> what I said could be understood
> that way by someone who thinks of
> minds as by definition
> information processing.
>
> I just thought you'd
> be interested in the parallels
> and the kind of dualist (not perfect
> surely) symmetry between the two positions
> - that reductionists say the
> same kind of things about matter that
> idealists say about mind.

Yes, that is worth pointing out. In the
end, I suspect the ultimate theory of
everything could be phrased either way.
That is, the current differences of
opinion are probably largely semantic.
Natural languages aren't very precise for
hammering out details of a theory of
everything!

> That the question of whether
> the mind continues after the
> body and world are gone from one's
> personal experience is a kind of
> symmetrical question to the one
> about whether the world continues
> after minds are gone from it.

Good point. However, I have never
observed the former, but I have observed
the latter.

Of course, how would you observe the
former -- they're gone from the world!
In fact, if they're gone from the
world, it means it's useless to talk
about it.

> Then, same sort of reasoning too,
> that one thinks it does because
> often other things have departed
> from one's awareness during one's
> life and the mind continued through it,
> so it seems that it would continue
> even when the entire world
> disappears. Particularly also
> that when one wakes up from dreams
> everything in the dream vanishes
> completely but the mind continues
> into another world. That happens
> every night, though one doesn't
> always remember it. So it seems
> pretty likely that it will happen
> when one dies too.

Doesn't seem likely to me. Dreams
are fairly well understood. The brain
activity can be observed. Not so
with death.

-Carl

🔗Gene Ward Smith <gwsmith@...>

7/7/2004 6:26:41 PM

--- In metatuning@yahoogroups.com, "Carl Lumma" <clumma@y...> wrote:
> I don't see truth as having anything to do
> with consciousness. Truth is a rather
> obscure notion that took many centuries to
> refine. It doesn't really exist outside of
> formal symbolic systems.

I presume you realize that whatever other merits it may possess, this
claim cannot possibly be true.

🔗Carl Lumma <clumma@...>

7/7/2004 6:30:34 PM

> > I don't see truth as having anything to do
> > with consciousness. Truth is a rather
> > obscure notion that took many centuries to
> > refine. It doesn't really exist outside of
> > formal symbolic systems.
>
> I presume you realize that whatever other merits it
> may possess, this claim cannot possibly be true.

Natural languages can be used in such a way that
they can be mapped to formal axiomatic systems, so
I'd strike the "cannot possibly".

-Carl

🔗Gene Ward Smith <gwsmith@...>

7/7/2004 7:58:56 PM

--- In metatuning@yahoogroups.com, "Carl Lumma" <clumma@y...> wrote:
> > > I don't see truth as having anything to do
> > > with consciousness. Truth is a rather
> > > obscure notion that took many centuries to
> > > refine. It doesn't really exist outside of
> > > formal symbolic systems.
> >
> > I presume you realize that whatever other merits it
> > may possess, this claim cannot possibly be true.
>
> Natural languages can be used in such a way that
> they can be mapped to formal axiomatic systems, so
> I'd strike the "cannot possibly".

Ho. You think you can translate the English word "truth" into a
formal system with Tarski's definition or some such animal? Nope.

🔗Robert Walker <robertwalker@...>

7/8/2004 5:00:50 AM

Hi Carl,

> Sure it is. Discrete programs can produce as
> random an output as you like, therefore they
> should be capable of modifying themselves in
> as complex a way as you like. Through iteration,
> all things are possible!

Random changes in the axiom system are no good
because it could just as easily randomly
add in the negation of one of the Godel sentences
which is consistent with the original axiom
system but can be seen to be not true.

Indeed if not carefully done it would
most probably produce an inconsistent
axiom system from which one can prove
anything.
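
You can see the danger even in a toy,
decidable setting. Here's a little Python
sketch of my own (real arithmetic is of
course not decidable like this) in which
blindly adding random propositional "axioms"
makes the set inconsistent almost at once:

import itertools, random

VARS = "pqr"

def satisfiable(clauses):
    # Brute force: is there a truth assignment making every clause
    # true? A clause is a set of literals such as {"p", "-q"}.
    for values in itertools.product([True, False], repeat=len(VARS)):
        assign = dict(zip(VARS, values))
        if all(any(assign[lit.strip("-")] != lit.startswith("-")
                   for lit in clause)
               for clause in clauses):
            return True
    return False

axioms = [{"p"}, {"-p", "q"}]   # a small consistent starting theory
random.seed(1)
while satisfiable(axioms):
    axioms.append({random.choice(["p", "-p", "q", "-q", "r", "-r"])})
print("inconsistent after", len(axioms), "axioms:", axioms)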

So changes in the axiom system have to be
based on some programmed notion of truth
to guide the changes. That is the
question that is addressed here.

I'm sure there are subtleties here
if you really follow it up, and
I can't say I have followed them
all up, but I respect
Roger Penrose's integrity
as a researcher in the field
enough to trust that he has followed
them all through correctly.

He wrote a follow-up book,
"Shadows of the Mind".
I haven't read it yet -
I've only just found out about it
(I haven't been doing active research
in logic for years) - but it
should be interesting.

This on-line link has some
discussion of his book
by participants at a
colloquium about it
including his own responses
to their papers.
http://psyche.cs.monash.edu.au/psyche-index-v2.html

> > That's why Roger Penrose says
> > that there has to be something not computable
> > going on.

> Nonsense, as far as I could tell. There's nothing
> uncomputable about Godel sentences.

Ah, you are missing the point there.
The Godel sentences are computable of course
given the axiom system. No problem with
a computer program constructing a
Godel sentence if it is given an
axiom system to construct it from;
one could write such a program.

What is problematical is if you
let it look at its own axiom system,
find its Godel sentence, and then
add that in to its own axiom system.
The problem there is that everything
is programmed. Humans not being
programmed can do that with no restrictions,
using their innate notion of truth.
Computers which are programmed, even with
self modifying programs, can indeed
add in new axioms like that too, but
they will have some restrictions on what they can do.
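
In pseudo-program form the situation is
something like this - a sketch only, where
goedel_sentence is just a stand-in for the
arithmetised construction (which really can
be programmed, given a recursively
axiomatised system):

def goedel_sentence(axioms):
    # Stand-in for Godel's construction: from a recursive list of
    # axioms, build the sentence "this sentence is not provable from
    # these axioms".
    return ("G", tuple(axioms))

axioms = ["PA"]                  # some recursively axiomatised theory
for _ in range(3):
    g = goedel_sentence(axioms)  # the program can certainly compute this...
    axioms.append(g)             # ...and add it to its own axiom list
print(axioms)

# But this loop is itself a fixed program, so the total system - the
# axioms together with the rule above for extending them - is still
# recursively axiomatisable, and Godel's theorem applies to *that*
# system, giving a sentence it never adds but which we can see to
# be true.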

Then the significance there isn't particularly
that the Godel sentences are interesting
truths as indeed usually they are not.
The point is that if the computer can't
be programmed to recognise them all as true
when a human can then it means there is some
limitation in the way the computer
"understands truth" - the program can't be said to
implement it in the way that it is understood
by a human being.

At least that is the basic idea and that
is how the non computability comes into
the argument. Of course it can
be disputed, and people have done so,
I don't know of a philosophical
position that doesn't get disputed by
someone.

> Penrose was at Penn State, where many of my
> close friends studied, when we were in college.
> He was constantly giving talks on his
> microtubules idea. My guess at the time was
> that he had been estranged from Oxford.

Don't know of anything like that.

Academics often have sabbaticals or
go away for longer periods of time.
I don't even know if he is still at
Oxford.

> I don't see truth as having anything to do
> with consciousness. Truth is a rather
> obscure notion that took many centuries to
> refine. It doesn't really exist outside of
> formal symbolic systems.

Well, that is a formalist idea.
It is a respected philosophical point of view
and probably Roger Penrose's argument won't
mean so much to a formalist. It can't
be proved or disproved as far as I know,
just argued for or against philosophically.

Myself, I see the truth in maths as
the same kind of truth we use in
everyday language when we ask whether
it is true or not that such and such
a thing happened in the world,
e.g. that it is raining or that the
sun is shining etc. There's no doubt
we use numbers in daily life reasoning;
a formalist will say that is just
because it is a formal pattern
that has applicability in the world
while others would say that we can
see truths about numbers, either
through understanding how constructions
and mathematical proofs work or
because the truths are valid
in some kind of platonic sense.

I suppose an extreme formalist might
say we have no notion of everyday truth
either and are just following patterns
there that have proved useful,
evolutionarily speaking, or something. Probably that
could be argued for too, maybe someone
has. I think few mathematicians would
think that way though - you tend to get
attracted to maths because you have
a strong notion of truth and validity.
There have been mathematician formalists,
Hilbert for one, but I'm not sure whether any of them actually
extend their formalism to truth generally??

Godel's theorem was a bit of a blow to
formalists as they had the idea that you
could have a single system to describe
everything, and this shows you can't, but
need to have many such systems. So
that makes it a bit more tricky, as then
you have to decide between them - and how
do you do that on formalist grounds?

> No, I'm the one who thinks of information
> as primary. In that, I like Deutsch's
> "Virtual Reality" principle. But here I
> was referring to it because of his
> discussion of induction, and other
> philosophical issues that occupy the
> bulk of his book.

Rightio

> Doesn't seem likely to me. Dreams
> are fairly well understood. The brain
> activity can be observed. Not so
> with death.

Well, I don't expect you to be
convinced. That is just bringing
in a physical-world-primary approach,
and when you see things like that
then mind-based arguments aren't compelling,
of course.

But if mind is primary then you
see things a bit differently.

Whether or not the accompanying
physical activities are understood,
what is relevant then is that
you experience a dream as something
real at the time. Then when you
wake up that entire reality
of the dream has vanished. So what does
that mean about how mind works?

BTW a dream doesn't need a dreamer
outside the dream from the mind
perspective. It just happens that there is
one in the dreams that we know about,
but maybe there are other dreams that don't
need external dreamers.

I was also interested in Fredkin's ideas
about a cellular automaton world.
There I thought it superfluous though
to add in the idea that it was a simulation
in an extremely complex computer in a larger
physical universe - that doesn't explain
anything at all, just like saying the
world is supported on turtles which will be
supported by more turtles indefinitely.
Why not just say that space-time itself is cellular
and that the laws of space-time are the laws
of a cellular automaton? Why embed it in
a continuous other universe? I used to think
that seemed pretty possible.
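
Just as a picture of what "the laws of space-time
are the laws of a cellular automaton" could mean,
here is the standard one-dimensional toy in Python
(nothing to do with Fredkin's actual model - rule
110 is a popular choice because it is known to be
computation-universal):

# One-dimensional cellular automaton: each cell's next state depends
# only on itself and its two neighbours, via one fixed local rule.
RULE = 110   # the rule number encodes the update table for 3-cell patterns

def step(cells):
    n = len(cells)
    return [(RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]

cells = [0] * 31
cells[15] = 1                 # one "live" cell in the middle
for _ in range(15):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)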

Now however, I'm wondering about it because
in such a universe everything would have
to be computable. Once you add in QM
to it probably the cellularity of it
needs to break down in some sense?

Robert


🔗Paul Erlich <PERLICH@...>

7/8/2004 1:47:02 PM

--- In metatuning@yahoogroups.com, "Robert Walker"
<robertwalker@n...> wrote:
> Hi Carl,
>
> > Not if it is allowed to update its programming.
>
> That's no good if the way it is allowed to
> update its programming is itself programmed.
> You can add infinitely many Godel axioms
> to a theory in an algorithmic way and
> get a Godel sentence still. I can't
> remember the details but remember the
> result. That's why Roger Penrose says
> that there has to be something not computable
> going on. His ideas were pretty thoroughly
> scrutinised - here in Oxford there is
> perhaps the largest logic / philosophy
> of mathematics group of researchers in
> the UK and he presents his ideas to the
> group and gets lots of feedback from it.
> I don't think you will find flaws
> in the reasoning or the philosophy

Umm . . . Robert, that's overstating the case a bit.
For example, have you seen Penrose's book _The Large, The Small, and
the Human Mind_? Stephen Hawking and the other thinkers Penrose
invited to contribute have certainly found flaws . . . or so they
think . . . I've purchased other books on this too, wish I could
remember . . . but lots of fine thinkers think they've found
flaws . . .

> > I fail to see how humans magically escape
> > Godel incompleteness.
>
> Well the other possibility is that human
> beings don't quite understand truth,
> so that on occasion when presented with
> a Godel sentence they might not recognise
> its truth, no matter how long they
> consider it, even though it is in fact
> true (if the axioms are consistent).

Aren't you forgetting model theory here? There is a model of
arithmetic in which G is true, and also a model in which ~G is true.
So this "truth" you speak of is, of course, determined independently
of the axioms (assuming they're consistent), since the axioms can't
select between one model and the other.
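
(To spell that out, with T for the axiom system and G its Godel
sentence, as I remember it:

  T proves neither G nor ~G (given consistency in the appropriate
  sense), so by the completeness theorem both T + G and T + ~G
  have models.

The model of T + ~G is a nonstandard one: it contains extra "numbers"
beyond 0, 1, 2, . . . , one of which codes the supposed "proof" of G.
The standard model -- the ordinary natural numbers -- is the one in
which G comes out true.)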

🔗Paul Erlich <PERLICH@...>

7/8/2004 2:58:49 PM

--- In metatuning@yahoogroups.com, "Robert Walker"
<robertwalker@n...> wrote:
> Hi Carl,
>
> > Sure it is. Discrete programs can produce as
> > random an output as you like, therefore they
> > should be capable of modifying themselves in
> > as complex a way as you like. Through iteration,
> > all things are possible!
>
> Random changes in the axiom system are no good
> because it could just as easily randomly
> add in the negation of one of the Godel sentences
> which is consistent with the original axiom
> system but can be seen to be not true.

It's true in an alternative model of arithmetic. You end up with three-
component, transfinite quantities. See _Godel, Escher, Bach_ by
Douglas Hofstadter.

> I'm sure there are subtleties here
> if you really follow it up, and
> I can't say I have followed them
> all up, but I respect
> Roger Penrose's integrity
> as a researcher in the field
> enough to trust that he has followed
> them all through correctly.

You can respect someone's integrity and still allow for the
possibility that they've made a mistake. Otherwise, you'd have to
disrespect the integrity of an awful lot of fine thinkers who've
poked holes in Penrose's argument (of course, Penrose has
counterarguments, and the whole thing goes back and forth, perhaps
the argument will never end . . .).

> Then the significance there isn't particularly
> that the Godel sentences are interesting
> truths as indeed usually they are not.
> The point is that if the computer can't
> be programmed to recognise them all as true
> when a human can then it means there is some
> limitation in the way the computer
> "understands truth" - the program can't be said to
> implement it in the way that it is understood
> by a human being.

This is the best part of the argument. Unfortunately, just as a
computer can't pin down a precise notion of truth, the argument can't
either, so there's an element of faith in there.

> > I don't see truth as having anything to do
> > with consciousness. Truth is a rather
> > obscure notion that took many centuries to
> > refine. It doesn't really exist outside of
> > formal symbolic systems.
>
> Well, that is a formalist idea.
> It is a respected philosophical point of view
> and probably Roger Penrose's argument won't
> mean so much to a formalist. It can't
> be proved or disproved as far as I know,
> just argued for or against philosophically.

Formalism doesn't seem to hold water for me because of our experience
that we "know" which model of arithmetic corresponds to the natural
numbers and which doesn't, even though they both obey the Peano
axioms.

> There have been mathematician formalists,
> Hilbert for one, but I'm not sure whether any of them actually
> extend their formalism to truth generally??
>
> Godel's theorem was a bit of a blow to
> formalists

It was a death blow to the positivists, who represented at least one
school of formalism.

> Now however, I'm wondering about it because
> in such a universe everything would have
> to be computable. Once you add in QM
> to it probably the cellularity of it
> needs to break down in some sense?

Well, the conventional notion, which would include causality and
determinism (or not), certainly breaks down. But Loop Quantum Gravity
provides hope for a different notion of "cellularity", a much more
abstract one, one from which space and time themselves emerge as
epiphenomena. See Lee Smolin, _Three Roads to Quantum Gravity_.

🔗Gene Ward Smith <gwsmith@...>

7/8/2004 5:24:17 PM

--- In metatuning@yahoogroups.com, "Paul Erlich" <PERLICH@A...> wrote:

> Aren't you forgetting model theory here? There is a model of
> arithmetic in which G is true, and also a model in which ~G is
true.

You want to add to the axioms of arithmetic a statement that the
axioms are inconsistent?

🔗Paul Erlich <PERLICH@...>

7/8/2004 5:38:44 PM

--- In metatuning@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> --- In metatuning@yahoogroups.com, "Paul Erlich" <PERLICH@A...>
wrote:
>
> > Aren't you forgetting model theory here? There is a model of
> > arithmetic in which G is true, and also a model in which ~G is
> true.
>
> You want to add to the axioms of arithmetic a statement that the
> axioms are inconsistent?

It's omega-inconsistency, which is far milder than a finite
inconsistency. This is discussed by Hofstadter and in greater detail
by technical accounts of Robinson's Nonstandard Analysis.
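
(The definition, as I recall it: a theory T is omega-inconsistent
when, for some formula P,

  T proves P(0), P(1), P(2), . . . -- each numeral separately -- and
  yet T also proves "there exists an x with not-P(x)".

No finite contradiction ever surfaces. PA + ~G is the standard
example: ~G says "some number codes a proof of G", the theory proves
of each particular numeral that it does not, but it never proves the
universal statement.)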

🔗Gene Ward Smith <gwsmith@...>

7/8/2004 6:27:00 PM

--- In metatuning@yahoogroups.com, "Paul Erlich" <PERLICH@A...> wrote:

> It's omega-inconsistency, which is far milder than a finite
> inconsistency. This is discussed by Hofstadter and in greater
detail
> by technical accounts of Robinson's Nonstandard Analysis.

I don't recall it from Robinson's book, but it's been a while.
Nonstandard analysis can be looked at from an algebraic point of
view, by finding the right kind of maximal ideal in the ring of
sequences of real numbers, which I find more congenial.
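
Roughly: take the ring R^N of all sequences of reals, pick a
nonprincipal ultrafilter U on the index set, and let M be the set of
sequences (a_n) whose zero set {n : a_n = 0} belongs to U. Then M is a
maximal ideal, and R^N / M is an ordered field properly extending R --
the hyperreals -- with the class of (1, 1/2, 1/3, ...) giving a
positive infinitesimal.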

🔗Paul Erlich <PERLICH@...>

7/8/2004 7:08:00 PM

--- In metatuning@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> --- In metatuning@yahoogroups.com, "Paul Erlich" <PERLICH@A...>
wrote:
>
> > It's omega-inconsistency, which is far milder than a finite
> > inconsistency. This is discussed by Hofstadter and in greater
> detail
> > by technical accounts of Robinson's Nonstandard Analysis.
>
> I don't recall it from Robinson's book, but it's been a while.
> Nonstandard analysis can be looked at from an algebraic point of
> view, by finding the right kind of maximal ideal in the ring of
> sequences of real numbers, which I find more congenial.

Well, in _Godel, Escher, Bach_, Hofstadter talks about how, if you
take ~G as an axiom of the system, you end up with these three-
component transfinite numbers . . . ring any bells? Hofstadter makes
it clear that omega-inconsistency is not fatal by any means -- in
fact, it seems to me that it might appeal greatly to an intuitionist
of a certain stripe . . .

🔗Carl Lumma <clumma@...>

7/8/2004 7:52:07 PM

> > Sure it is. Discrete programs can produce as
> > random an output as you like, therefore they
> > should be capable of modifying themselves in
> > as complex a way as you like. Through iteration,
> > all things are possible!
>
> Random changes in the axiom system are no good
> because it could just as easily randomly
> add in the negation of one of the Godel sentences
> which is consistent with the original axiom
> system but can be seen to be not true.

Stop thinking axiomatic systems and start thinking
self-modifying programs. Random mutations are
certainly helpful in GA, but that's not even what
I meant. I meant that in principle there's no limit
to the originality available to computers when it
comes to modifying their own code.
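
A bare-bones mutate-and-select loop, just to
show the shape of the thing (toy Python; the
"genome" here is only a bit string, but nothing
stops the genome from being code):

import random

TARGET = [1] * 20                 # stand-in for "whatever scores well"

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # Flip each bit with small probability -- the random variation
    # mentioned above.
    return [1 - g if random.random() < rate else g for g in genome]

best = [random.randint(0, 1) for _ in range(20)]
for _ in range(200):
    child = mutate(best)
    if fitness(child) >= fitness(best):   # keep the better variant
        best = child
print(fitness(best), best)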

> So changes in the axiom system have to be
> based on some programmed notion of truth
> to guide the changes. That is the
> question that is addressed here.

Goal-driven stuff is certainly possible too.

> He wrote a follow-up book,
> "Shadows of the Mind".
> I haven't read it yet -
> I've only just found out about it
> (I haven't been doing active research
> in logic for years) - but it
> should be interesting.

That one was even weaker than _Emperor_.

> This on-line link has some
> discussion of his book
> by participants at a
> colloquium about it
> including his own responses
> to their papers.
> http://psyche.cs.monash.edu.au/psyche-index-v2.html

Thanks for the link. I don't mean to say,
in any of this, that I don't respect Penrose,
certainly.

> > > That's why Roger Penrose says
> > > that there has to be something not computable
> > > going on.
>
> > Nonsense, as far as I could tell. There's nothing
> > uncomputable about Godel sentences.
>
> Ah, you are missing the point there.
> The Godel sentences are computable of course
> given the axiom system. No problem with
> a computer program constructing a
> Godel sentence if it is given an
> axiom system to construct it from;
> one could write such a program.
>
> What is problematical is if you
> let it look at its own axiom system,
> find its Godel sentence, and then
> add that in to its own axiom system.
> The problem there is that everything
> is programmed. Humans not being
> programmed can do that with no restrictions,
> using their innate notion of truth.
> Computers which are programmed, even with
> self modifying programs, can indeed
> add in new axioms like that too, but
> they will have some restrictions on what
> they can do.

Such as?

> Then the significance there isn't particularly
> that the Godel sentences are interesting
> truths as indeed usually they are not.
> The point is that if the computer can't
> be programmed to recognise them all as true
> when a human can then it means there is some
> limitation in the way the computer
> "understands truth" - the program can't be said to
> implement it in the way that it is understood
> by a human being.

It seems feasible to write an expert system to
recognize Godel sentences, let alone the possibility
that a self-evolving AI would gain the same kind of
"innate notion" that humans have. Of course, most
humans get along fine ignoring Godel sentences, at
least ones that are externally apparent.

To show that computers can NEVER do what humans
do is a very very difficult sort of proof, which
Penrose hasn't even smelled the fumes of, I'm
afraid.

To show that computers can't currently do what
humans do is rather trivial.

> > Penrose was at Penn State, where many of my
> > close friends studied, when we were in college.
> > He was constantly giving talks on his
> > microtubules idea. My guess at the time was
> > that he had been estranged from Oxford.
>
> Don't know of anything like that.

I was probably overstepping my guessing power.

> I suppose an extreme formalist might
> say we have no notion of everyday truth
> either and are just following patterns
> there that have proved useful,
> evolutionarily speaking, or something.

If you can construct a formal system that
explains everyday behavior, that's great.
The best available description is good enough
for me. But so far, such systems have proven
very hard to construct.

> Godel's theorem was a bit of a blow to
> formalists

Yes, well I suppose I'm not a formalist then.

> But if mind is primary then you
> see things a bit differently.
>
> Whether or not the accompanying
> physical activities are understood,
> what is relevant then is that
> you experience a dream as something
> real at the time. Then when you
> wake up that entire reality
> of the dream has vanished. So what does
> that mean about how mind works?

It means the mind shouldn't be taken as
primary. Psychoactive drugs, optical
illusions, etc. should shake any faith
that the mind is primary.

Or should it? Aaron didn't have very
positive things to say about Carlos Castaneda,
but the first book in his Don Juan series
actually had one of the best arguments for
mind being primary that I've ever read.

> I was also interested in Fredkin's ideas
> about a cellular automaton world.
> There I thought it superfluous though
> to add in the idea that it was a simulation
> in an extremely complex computer in a larger
> physical universe - that doesn't explain
> anything at all, just like saying the
> world is supported on turtles which will be
> supported by more turtles indefinitely.

Did Fredkin say that? I always thought
the simulation *was* the universe. Maybe this
is what Kalle was talking about...

-Carl