halfbakery: A riddle wrapped in a mystery inside a rich, flaky crust
Syntho-God
A reasonable facsimile of a divine overseer.
It already exists in its rudimentary form. You can ask
Alexa or Siri if it's going to rain tomorrow and you'll get an
answer. But for the daunting questions in life: "What's my
greater purpose?", "How should I live my life?" etc., we
create a universally accessible database called "Syntho-
God".
That's the easy part. Here's the tricky part. The tenets of
this robot-deity are all created by popular up-vote.
Now, in this divided world we live in, it's hard to imagine
all cultures would agree on anything, but that wouldn't be
necessary; they would agree on some things, and that's
where this universal truth computer would start.
It may be slow at first. The ruler of the universe in my
favorite series of books, "The Hitchhiker's Guide to the
Galaxy", said a lot of "I don't know" and "On the other
hand it could be this way, I'm not sure...", which science (the
other way we get information) does a lot of as well.
It might start off as simple as "Do not be a cannibal", "Do
not slap your grandmother." and build from there.
After decades of the whole planet agreeing on stuff it
might bring us together to some extent. Which may or
may
not be a good thing.
Now here's the good part. You could create an actual
godlike entity specifically tailored to your personality
and your life. It would be programmed to
look out for you, to be your guardian angel and assistant,
just like god. It could give you an education, help you in
times of need with good, generally agreed-to life advice
and just be your all-around digital best buddy.
Syntho-God (tm). "Maybe not as good as, but probably
more real than the real thing."
Project Pope
https://en.wikipedi...g/wiki/Project_Pope Prior Art [8th of 7, Oct 13 2017]
"The Last Question"
http://multivax.com/last_question.html Well-worn territory [RayfordSteele, Oct 13 2017]
From The Gadfly -- one of the most famous books in the atheist Soviet Union
https://www.goodrea..._Voynich_The_Gadfly [theircompetitor, Oct 13 2017]
Emergency Faith Pack
Emergency_20faith_20pack Prior Art [8th of 7, Oct 13 2017]
Apple syntho god
https://youtu.be/QRH8eimU_20 Apple agent of the lord [mylodon, Oct 13 2017]
Microsoft syntho god
https://en.m.wikipe...ki/Office_Assistant I am being unfair here as i am sure bill hoped for more [mylodon, Oct 13 2017]
Space power tools
http://www.popularm...by-nasa-astronauts/ [mylodon, Oct 14 2017]
Creating life.
http://www.telegrap...ut-playing-god.html Too clever for our own good? [doctorremulac3, Oct 16 2017]
Another approach apparently
https://www.dailyst...on-killer-robots-AI Mine is a sort of universal mind, this is just some sort of robot devil for mindless worshipers. [doctorremulac3, Nov 19 2017]
True Love
http://www.angelfir...a/savvy/story7.html An (almost) personal-god Multivac story [Skewed, Nov 20 2017]
All The Troubles Of The World
http://www.mcguirem...f_the_world_(1).pdf Another Multivac story, shades of Minority Report. [Skewed, Nov 20 2017]
Person Of Interest
https://www.youtube...watch?v=WYDWSNMTauQ A bit more recent. [Skewed, Nov 20 2017]
Earth, the TV Show
https://www.youtube...watch?v=wK-IuIbfb-A [doctorremulac3, Nov 23 2017]
Lycurgus
https://en.wikipedi.../Lycurgus_of_Sparta For some reason, there is no Lysergus [pertinax, Nov 26 2017, last modified Oct 08 2021]
A fly's brain
https://www.mpg.de/...ng-motion-detection A fly would destroy humans at any sport if they knew how to play. And didn't get squashed by the ball. [doctorremulac3, Nov 27 2017]
More AI god ideas.
https://www.dailyst...n-way-of-the-future Didn't read the article, think I got the basic idea from the headline. [doctorremulac3, Dec 11 2017]
The Machine Stops
http://archive.ncsa...ajlich/forster.html Always make sure your self-repair mechanism is capable of self-self-repairing. [Wrongfellow, Dec 26 2017]
Was only a matter of time.
https://futurism.co...sJxryVzEVZBnL9qmrFc [doctorremulac3, May 16 2023]
The AI alignment problem
https://en.wikipedi...g/wiki/AI_alignment [Voice, May 30 2023]
|
|
What if your grandmother is a cannibal? |
|
|
I assume the syntho-god is called Roland? |
|
|
//What if your grandmother is a cannibal?// |
|
|
Then Syntho-God starts repeating "DOES NOT
COMPUTE! DOES NOT COMPUTE!", starts smoking
and blows up. |
|
|
We're still working out the kinks. |
|
|
You could, if you wanted, program your specific
Syntho-God portal to be very neurotic or goofy,
caring or thoughtful, strong and sure of itself,
whatever you were comfortable with. I
for instance might be more comfortable having a
SG entity that's very practical. "Synth-God, what's
the meaning of life?" "Oh Jesus, not this
conversation again. Right now your car is due for
an oil change, let's go with that." |
|
|
// "Synth-God, what's the meaning of life?" // |
|
|
THAT... would be a given. |
|
|
A default answer should be: |
|
|
"INSUFFICIENT DATA FOR A MEANINGFUL
ANSWER" |
|
|
Except that it's not. Go back and actually read the idea. |
|
|
Remember, everybody is 99% atheist, not believing in most
of the gods that have been worshiped through history. |
|
|
There may be somebody who believes in all of them, I just
haven't heard of such a person. |
|
|
This one will actually answer when you pray to it. |
|
|
Anybody reading this who gets comfort or solace from their
God, kindly ignore this post. I'm happy for you. (Unless your
god tells you to kill me.) |
|
|
I did actually read the idea. |
|
|
I didn't say that all parts of it were baked, but that the basic
territory of creating a technological answer-machine is. |
|
|
Not trying to pick a fight, so not sure why the attitude. |
|
|
"He was the sort of person who stood on mountaintops during
thunderstorms in wet copper armour shouting 'All the Gods are
bastards.'" (Terry Pratchett) |
|
|
Why not give it a try ? After all, what could possibly go wrong ? |
|
|
// There may be somebody who believes in all of them, I just
haven't heard of such a person. // |
|
|
Inspired my injectable eucharist, which works like an epi-pen for
when you've only got seconds to choose between salvation and
eternal hell-fire. |
|
|
Ha! Would you sell them in packs of two? |
|
|
Would they have a "use-by" date ? |
|
|
I am interested in this territory, and I have read this a few times, and
it is very well worn. I think it encompasses some of the hopes and
desires man had when man created machines. However, it's not well
implemented in the current world today. A popular upvote mechanism as
a means for weighting a neural net may be the most unique part, but it's
already done functionally even in a Google search, if a link is considered
to be a validation or upvote of a sort. But maybe work that idea through
so it works better as a mechanized means of democracy. Anyways.
The devil is in the details. |
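The "tenets by popular up-vote" mechanism discussed above could be sketched as a minimal filter: a tenet only enters the canon when every voting bloc's approval clears a floor. All bloc names, tenets, and approval numbers below are made up purely for illustration:

```python
# Hypothetical sketch of the "popular up-vote" tenet mechanism:
# a tenet joins the canon only if EVERY cultural bloc approves it
# above a floor; its weight is the mean approval across blocs.
from statistics import mean

votes = {  # approval rate per (made-up) voting bloc, per candidate tenet
    "Do not be a cannibal":         {"bloc_a": 0.99, "bloc_b": 0.97, "bloc_c": 0.98},
    "Do not slap your grandmother": {"bloc_a": 0.95, "bloc_b": 0.93, "bloc_c": 0.96},
    "Everyone must wear hats":      {"bloc_a": 0.80, "bloc_b": 0.20, "bloc_c": 0.55},
}

def canon(votes, floor=0.9):
    # keep only tenets with universal (min over blocs) support >= floor
    return {tenet: mean(blocs.values())
            for tenet, blocs in votes.items()
            if min(blocs.values()) >= floor}

print(canon(votes))  # the hats tenet fails the universal-agreement test
```

As the idea predicts, such a canon starts small: only near-unanimous items survive the `min()` test, and everything contentious is simply left out.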
|
|
I recently read a David Brin novel with a variation of this idea as its premise. In the end, a human intelligence is downloaded during death, and all of the various internet chatter and opinion ratings washing over this augmented intelligence act much as random thoughts bubbling up from the subconscious, competing and collaborating over the thoughts of a normal human mind on a day-to-day basis. |
|
|
For AI to have a chance to work it needs a dash of chaos thrown into the mix, or there can be no awareness of right and wrong. |
|
|
Let me put it another way: |
|
|
It would be a democratically created god usefully
implemented. |
|
|
The ultimate hive mind. The universal consciousness
of mankind. The human animal's idea of truth. |
|
|
Or we might find out we have nothing at all in
common except for the frowning on cannibalism
thing. |
|
|
Hey, maybe that's all it would say. "Don't eat each
other. Now leave me alone." |
|
|
Anyway, this idea is about taking already existing
concepts and steering them in a new direction:
Let's build an actual, functional synthetic god
that's an extension of all of us. |
|
|
Just calling it "god" would clarify the vision. |
|
|
Eventually AI will profile every human being at a very young age as to their individual learning bents. It will categorize disease based on visual, auditory, and chemical clues we cannot perceive, make connections and leaps of logic based on memories no human mind could ever hope to compete with, and root out the asswipes currently harshing our multi-cultural buzz. |
|
|
In my estimation AI will never achieve intuition. No amount of free association can mimic or equal intuition without requiring an infinite number of scenarios to do so. That's why we're needed. |
|
|
Only life can create by leaps of illogical intuition. |
|
|
The sum of the multiverse equals consciousness, and the sum of consciousness is what everybody and their dog has saddled with the moniker "God" in various forms. |
|
|
Do you think that the sum of all multi-universal consciousness really cares which idiosyncratic doctrine any of us talkin-monkeys subscribe to? |
|
|
Treat the next synapse as you'd like the next synapse to treat you, and everything is copacetic. |
|
|
All other rules are extraneous. |
|
|
Why did this remind me of the Synod, Earth: Final Conflict? The aliens' ultimate decision tree. They did look like living connected neurons, though. |
|
|
Intuition is exactly what neural nets use. They have nothing but a bunch of data and operate on hunches. |
|
|
There is often no direct logic making decisions; just statistics and connections processed by solving a sequence of equations in the equivalent of the visual center of a computer. The imagination, the GPU. |
|
|
Logic itself is a discrete component of it, but any larger features become blurred, indistinguishable from human intuition. Similar in some ways to how exquisitely complex compression has turned digital transmission into something visually similar to old VHS tapes. |
|
|
The problem will be, we will no longer be able to logically control machines, but will need to cajole, convince, and inspire them to work. |
|
|
Maybe even perform virgin sacrifices. |
|
|
//Only life can create by leaps of illogical
intuition.//
|
|
|
//There is often no direct logic making decisions;
just statistics and connections processed by solving
a sequence of equations in the equivalent of the
visual center of a computer.//
|
|
|
This thread has suddenly become really
interesting.
|
|
|
I've thrown out my take on the life vs. machines
thing before, as to motivation; I think it applies to
intuition or other human or life traits as well. |
|
|
I think the fundamental
difference between life and machines is at
the molecular level.
Forgetting for a second about the fact that we will
create biological life someday, life has motivation,
machine does not.
|
|
|
The third smallest component of life is the cell. (I
know there are parts of cells but bear with me)
Atoms, molecules and cells. These cells have
programming, the big picture of what makes life
life. Survive, divide and expand. Living and
motivation is programmed in at the cellular level.
|
|
|
The third largest component of AI at this point is
an incredibly simple tool, a switch that's either on
or off. When we pile enough of these together, we
can mimic anything we want, even motivation,
caring, a lust for taking over the universe, but it's
an illusion. These switches don't care, these
switches aren't motivated, these switches aren't
alive. They're inert, dead, inanimate objects. No
matter how many of them you pile up, they're just
massive piles of these dead, dumb tools. They can
be programmed to look alive, but they're not.
|
|
|
That's why I'm not as worried as the super geniuses
who are afraid of the Skynet scenario where AI
becomes conscious and gets angry and rebellious
for some reason. We're projecting human traits onto
highly modified rocks. AI doesn't care about the
motivation of "taking over the universe" or "wiping
out the humans" because it doesn't care about
anything, and never will.
|
|
|
Somebody might be able to make it look like it
cares, but it doesn't.
|
|
|
We've survived the threat of nuclear weapons in
the hands of mad men which is a much greater
threat in my estimation. |
|
|
You want to worry about something, worry about
that biological life we're going to make someday.
That MIGHT be programmed at the cellular level to
wipe out all other
life. But don't worry about that either. I think the
biggest challenge we're facing now is de-evolution,
but that's a different thread. |
|
|
Well... it might be related. The real threat of AI
might be that it takes care of us so well we lose
the ability to take care of ourselves; H.G. Wells'
The Time Machine hit on this concept, with
everybody sitting around like chickens in a pen
waiting to be fed, unable to fend for themselves.
We need to remember that evolution never sleeps.
We evolve to fit our circumstance and if our job is
to consume and reproduce, what's the point of
language, intellect, curiosity, aggression or desire?
These traits that have allowed man to be the most
successful animal on the planet might be lost. Remove
their necessity and what purpose do they
serve? And who's to say we'll even retain the desire to
reproduce? |
|
|
So the real threat from AI might not be that it's
aggressive and destructive, it might be just the
opposite. |
|
|
I sent a shortened version of the above post to somebody
who's pretty famous requesting a response that I could
frame and put in my
office because "it would be really cool". |
|
|
Unfortunately I got this back almost immediately. |
|
|
"Thank you for your email to Professor Hawking. |
|
|
As you can imagine, Prof. Hawking receives many such
every day. He very much regrets that due to the severe
limitations he works under, and the enormous number of
requests he receives, he is unable to compose a reply to
every message, and we do not have the resources to deal
with many of the specific scientific enquiries and theories
we receive." |
|
|
Oh well, I tried. If by some stretch of the imagination he
writes back I'll let you know, but don't hold your breath.
(Still, kind of fun.) |
|
|
I would say that I'll ask him next time I see him, but I don't think it would significantly shorten the delay. Plus he may still be mad at me for almost running him over once. |
|
|
Thank you Max, I was going to ask you but thought it
would be pushy. |
|
|
I'd consider a transcript from any conversation between
you guys worthy of framing. I'll put it next to my NASA
invent the future contest award. (I won some power tools
for inventing a new kind of valve.) Also my gold records.
All the things I have on my wall in lieu of any kind of
diplomas. (Not even the high school kind. Yes, that's
probably
why I'm so uptight about wanting to be considered smart.) |
|
|
Actually... that would go on my desk next to the pictures
of the wife and kids. I would treasure it. Even if his
answer was: "Tell the dumb American to piss off and you
go take some driving safety lessons!". |
|
|
Wait - you have gold records? As "the ones you get for selling X number of copies" as opposed to "the ones you make with gold spray paint and an LP"? Seriously? |
|
|
Don't be TOO impressed. Being a recording artist in my
youth I made some pretty good money and put that
money into what became a very
successful recording studio and production company. The
gold and platinum and
multi platinum records are from the bands that venture
produced. I'm still proud of them. It could be argued they
wouldn't exist if it weren't for me though it's hard to
speculate on such things. I certainly helped. |
|
|
But they're from famous bands and you know them.
500,00 units get you a gold record and 1,000,000 a
platinum in the states. Different countries have different
certification numbers. Anyway, I've got a wall full of
them. |
|
|
Want to hear more about the NASA award? Power tools?
No? OK. Nobody ever wants to hear about the power
tools. |
|
|
Wowww. Now that, [doc], is properly awesome. Actual music that actually sold to people who actually bought it - both yours and that of the bands you produced. Damn. |
|
|
And gowonden - tell us about the valve. |
|
|
I put some money into a production company and I think they bought
office chairs and lunch with it. |
|
|
Back to the machines..... |
|
|
Being too well looked after, as a future apocalypse, at the moment
seems unlikely due to the difficulty software has lasting a week without
an update, and hardware requiring constant adjustment, charging,
replacement, etc. How can AI repair itself manually or expurgate
decades of code debt? We are almost impossibly far away from
machines being able to evolve physically by themselves, because they
are all very top-heavy and don't have physically self-repairing and
duplicating cells. |
|
|
This may be something we evolve to become very good at (plugging
USB cables into dead devices, swapping out hard drives, developing
dedicated teams to refactor old bugs so they are new and more agile)
maybe so good computers force us to work in camps, maybe in large
buildings stacked up floor after floor of gridlike compartments,
populated with humans dedicated to keeping machines charged,
updated, fed, repaired. |
|
|
//And gowonden - tell us about the valve.// |
|
|
It's called the "gauge pressure dissipation valve".
Temperature regulation valves have an inherent issue:
they have to re-direct water under back pressure, which
causes friction on the valve, requiring some measure of
force to overcome. |
|
|
My power-tool-winning design did something new in that
it dissipated the back pressure by spraying the water
through a small gap, so it's traveling as a result of velocity
only, making it very easy to re-direct. |
|
|
An example of the concept can be given with a garden
hose. Try to re-direct the flow of water by pressing your
thumb against the opening. It's very hard because you're
fighting that back pressure. Now move your palm a few
inches away from the outlet and simply place it in front
of the stream. You've directed the water just as
effectively, but there's very little force needed. |
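To put rough numbers on the garden-hose analogy: sealing the opening means fighting the static supply pressure over the opening's area, while deflecting the free jet only means turning its momentum flux. The figures below are plausible assumptions for an open, un-nozzled hose, not numbers from the post:

```python
# Rough force comparison: blocking a hose opening vs. deflecting the jet.
# All numbers are illustrative assumptions.
rho = 1000.0       # water density, kg/m^3
P   = 400e3        # assumed static mains pressure, Pa (~4 bar)
A   = 2.0e-4       # open hose-end area, m^2 (~16 mm bore)
Q   = 15 / 60e3    # assumed flow rate, m^3/s (15 L/min)

v = Q / A                  # jet velocity from the open end
F_block   = P * A          # seal the opening: fight static pressure
F_deflect = rho * Q * v    # turn the free jet 90 degrees: momentum flux only

print(f"jet velocity  v ≈ {v:.2f} m/s")
print(f"F_block   ≈ {F_block:.1f} N")     # ≈ 80 N
print(f"F_deflect ≈ {F_deflect:.3f} N")   # ≈ 0.3 N
```

With these assumptions, redirecting the flowing jet takes a few hundred times less force than sealing the outlet, which is consistent with the thumb-versus-palm demonstration and with why a weak bi-metal spring can turn the valve's key wheel.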
|
|
It was designed for a temperature regulating shower
head. The stream of water went through a sort of
rotating "key" that had holes in it that lined up at the
correct temperature for bathing, but turned into the
water path if it became too hot or too cold. Because
there was no back pressure to overcome, a simple bi-
metal spring could react to the water temperature and
turn the wheel, requiring very little torque. The upshot
of it was, it reacted instantaneously, so it was impossible
to be hit with even a small amount of water that was of
an uncomfortable temperature. |
|
|
It was a very simple "anti-scald, anti-freeze" shower head.
Teledyne considered licensing the design, having worked
on the problem for years. They thought it was very clever
how I had overcome a problem their engineers had failed
to solve. |
|
|
Didn't go anywhere till I entered it into the NASA thing. It was
a big deal to me because it proved (to myself) that I
could do something creative besides music. When I built
the prototype, the moment it actually worked it was
similar to the first time I heard a song of mine on the
radio, kind of an out-of-body experience. It made me very
happy, to put it mildly. One of those "Life's
scrapbook moments." |
|
|
Sheesh, no wonder people don't want to hear about the
power tools. Kind of a wordy story. Anyway, it's a special
thing to me. |
|
|
Don't understand. Splashing water off a hand is different than
stopping flow. Redirecting flow would mean... the excess hot or cold
would just go straight down the drain? Or do you redirect back into a
reservoir via a low pressure line? |
|
|
Same questions as [mylo]. |
|
|
But it sounds ingenious - had I but a hat, I would doff it to you. Having no hat, however, I shall dedicate my next G&T to you - it'll be in about 3-4 minutes. |
|
|
I'm genuinely curious as to what album features the doctor. |
|
|
I can also report that the aforementioned G&T worked very well. |
|
|
//The excess hot or cold would just go straight down the
drain? Or do you redirect back into a reservoir via a low
pressure line?// |
|
|
Oh yeah, sorry. Yes, it just turns into a faucet that you
hold your hands under while you adjust the temperature.
Once it's within a range that's comfortable for bathing it
automatically directs the water towards you. |
|
|
I've used it; you can play with the controls, turning the hot
water all the way off, then all the way on while turning
the cold water all the way off, and it's impossible to ever
get hit with a single drop of uncomfortable-temperature
water. There was a buffer chamber for if you were to put
all cold water in, then turn it off and put all hot water in
for some reason: while the key wheel transitioned from
too hot to too cold, as the key passed through the "allow
water" position, that water would be half freezing, half
boiling. Warm. |
|
|
//I'm genuinely curious as to what album features the
doctor.// |
|
|
I'm very happy with my anonymity. I assure you, I'm not
that
interesting anyway. Keep in mind, the highlight of my
weekend was sending a letter to Stephen Hawking that
he'll
never read. |
|
|
Just some guy who used to sing and got his 15 minutes
that's
all. Eh, maybe more like 5. Better yet, let's assume I'm
making it all up. |
|
|
Next week, my NFL career. I'll tell how I was a... who's
the guy who throws the ball? Quarterback, for that team
from San Francisco with the gold helmets. Or maybe they
were red. |
|
|
Well, speaking as someone who has (a) never won anything from NASA and (b) never produced anything musical apart from one particularly memorable fart, I am moved to dedicate my third G&T to you also. (The second one slipped in between the first and the upcoming third.) |
|
|
Admit it, [doc], you're actually Ozzy Osbourne in real life. |
|
|
I'm not sure he can operate a computer. |
|
|
But cheers max. G&Ts are what I drank back before I
swore
off doing anything fun ever again. G&Ts and every other
alcohol in existence that is. |
|
|
Boy, really airing my dirty laundry today. Halfbakery
Confessionals. |
|
|
(And that's not true, I still do fun stuff, just nothing that
carries the threat of imminent or lingering death
associated with my old pastimes.) |
|
|
[doctorremulac3] Field-programmable gate arrays. It just has to care about programming itself to a higher and higher level. Thinking only about cells is anthropocentric, in this universe of wonders. |
|
|
Coming to this thread a bit late, but that is cool,
[DrR3] - the Venn diagram of overlap between
gold-disc holders and
NASA engineering competition winners must be
pretty small |
|
|
Hippo, thank you, I never thought of it that way
but I guess you're right. You made my morning,
you're a nice person. I'm going to go out and say
something nice and positive to somebody today in
turn. |
|
|
I've had way too many friends that
are now multi-millionaires, where I'm firmly middle
class, to be beaming with 24-7 pride, so it's nice to
have something like that to be
sort of proud of. Christ, my ex-wife married the
co-founder of one of the first computer stores; it had
over 800 outlets and he was worth 9 figures by the
time it was sold to some even bigger company.
Private jet, the whole bit. (I was much better
looking than him though, thank God.) Point is, I've
got plenty to keep me humble, so it's nice to get a
pat on the back. We all need it
every once in a while. I've got a story about the
private jet to tell sometime. |
|
|
Wjt, I would respectfully say that the
anthropocentric view is that any entity we create
will mimic the specifics of our life form. These traits,
like the desire to survive, expand, even live, evolved to
fill a role life needed to function. |
|
|
I've also seen the hypothesized knowledge curve
of AI in the future, and the premise that our place in
the universe might be relegated to the precursor
to this superior entity that will look down on us
because we're stupid and blobby. Another
anthropocentric view? |
|
|
My point is, when does this fundamental switch,
from not caring to caring take place? |
|
|
And I don't think we'll ever be dumb enough to give
total autonomous kill ability to our killing
machines. We've had the ability to do this for a
long time and I don't know if this is even on the
drawing board anywhere. |
|
|
I like to rank danger in order of likelihood: |
|
|
1- Car crash
2- Metabolic syndrome based disease
3- Getting caught misspelling words on Halbakery
4- Nuclear war
5- Asteroid hitting Earth
6- AI deciding it cares enough about me to bother
killing me. |
|
|
That being said, we can program AI to do anything
we want. If somebody decides to program AI to
replace biological life, yea. It could give us a good
run for our money. |
|
|
I still propose the real danger is AI replacing our
will to survive by removing all adversity and
challenges from survival turning us into mindless
food processing slugs. |
|
|
I had an idea for a novel. It's the future, everybody
is stupid because of thousands of years of
dependence on this Syntho-God thing, then it
breaks. The hero comes out looking for food after
a big bright light made a big booming sound and
the food stopped coming. (Asteroid hit the Earth
maybe. I'd have to come up with a plausible reason
a mechanism that's worked for thousands of years
would stop working.) Anyway, using rudimentary
language, we follow his adventures re-evolving
into a self-reliant animal again. Title: "Syntho-
God". |
|
|
Anybody done that? Do novels make any money?
Maybe it needs to be a video game scenario,
they're the only entertainment making any money
these days. |
|
|
Last thought on the naughty AI thing, let's not
worry about it, but let's keep our hand on the plug
anyway. Even though I've got some gold records
and won some power tools for designing a valve I
could still be wrong. I know, bit of a stretch but it's
possible. |
|
|
My experience with novel writing is that its
somewhat more likely than winning the lottery, but
somewhat less likely than getting shot, and almost
as pleasant. |
|
|
Regarding the evolution of AI - I think it's inevitable that AI will become smarter than us at some point, but it will happen consecutively in different domains. |
|
|
AI is already much smarter than we are at spell-checking a huge document, multiplying 50-digit numbers or adjusting the hue and saturation in a digital photograph. Of course, because these things can be done by computer, we redefine them as non-intelligent activities. |
|
|
AI is also better than us at playing chess, so we have redefined chess-playing (at least by a computer) as being non-intelligent - AI just analyses more patterns. |
|
|
AI is getting good at face and image recognition. It can already do it much faster than we can, but still makes mistakes. Even so, we are pre-emptively defining image and face recognition as non-intelligent. |
|
|
As AI is developed to include "tricks" that solve more and more problems, so those activities will be progressively defined as non-intelligent. Eventually, this will apply to every activity. |
|
|
The risk, therefore, is not so much that AI will develop intelligence, but that we humans will cease to be intelligent by defining everything that we (and computers) can do as being non-intelligent. |
|
|
One thing we can all agree on: when we put in parameters for a
system to get a job done, we'd better be careful. |
|
|
"Syntho-God, reduce crime by 80%." |
|
|
"Syntho-God computing solution... Program complete." |
|
|
"In other news, all males on Earth were arrested by Syntho-
God controlled police drones responding to the program to
reduce crime by 80%. Syntho-God's programmers, speaking
from their prison cells, released a statement saying: "Hey,
luckily we were able to stop Syntho-God's plan to reduce
crime by 100% immediately."" |
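This joke is the alignment literature's "specification gaming" problem in miniature: an objective that scores only the stated metric is perfectly satisfied by an absurd solution. A toy sketch, with an entirely made-up crime model and population figures:

```python
# Toy objective mis-specification: "reduce crime by 80%" says nothing
# about HOW, so a literal-minded optimizer finds a degenerate solution.
# The 1% crime rate and the population are invented for illustration.
def crimes_per_year(free_population, rate=0.01):
    # naive model: a fixed fraction of free people commit crimes
    return rate * free_population

def literal_optimizer(population, target_reduction=0.80):
    baseline = crimes_per_year(population)
    # Cheapest "solution": leave only 20% of people free.
    still_free = population * (1 - target_reduction)
    achieved = 1 - crimes_per_year(still_free) / baseline
    return still_free, achieved

free, achieved = literal_optimizer(8_000_000_000)
print(f"{free:.0f} people left un-arrested, crime down {achieved:.0%}")
```

The metric is met exactly; nothing in the objective penalizes mass arrest, which is precisely the deal-with-the-devil loophole problem the next annotation describes.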
|
|
There's a trope in literature where people make a deal with
the devil and leave out some detail that the devil uses as a
loophole to totally screw the person; the great movie
"Bedazzled" explores this story-line. We'll have to think of
this cyber entity as having the potential to be pretty evil if
we don't watch our specifics. |
|
|
[MaxwellBuchanan] It's like the argument for free will in a
way. People enjoy the concept they are innately special in
some way and not driven by mechanics. |
|
|
I'm sure the basic concepts of that will be replicated by
machines. So even our desire to be human will not be
intelligent. |
|
|
See link about guys creating life in a test tube. |
|
|
Also important to realize, this distinct line between
biological and digital may be blurred at some point
with some weird breakthrough "dig-cell" or
something. |
|
|
Then who knows where the hell we're going. |
|
|
Don't know. All I know is that I won't want to be running on
any of the current operating system manufacturers' versions. |
|
|
Speaking of biologically programmed impetus, I
was wondering if we're biologically programmed to
go into space. |
|
|
Do most religions say you go into the sky when you
die? Alternately, how many say you go deep into
the Earth (hell) if you screw up? Might that be
indicative of some drive to move to the stars?
Seems odd most religions say you go to the same
place. |
|
|
I need to do some research on "Where heaven is
located" for the various religions. Unless somebody
knows already. |
|
|
I think hell is often under ground though. I'm
almost positive nobody has ever said you go into
the sky when you're damned. |
|
|
An intriguing thought. Well, down is almost always a result
of being prone, depressed, or dead, or such, and up is
almost always authoritative, hopeful, associated with
flying, etc. And it carries into other mammals as well. |
|
|
But are we not descended from troglodytes? |
|
|
Up was where the unattainable Mysteries were, the
birds, the stars, the rain, Robin Lopez's hair... |
|
|
//I'm almost positive nobody has ever said you go into the sky when you're damned.// Have you flown with Ryanair? |
|
|
Ah yes, Ryanair ... the first international corporation to successfully
revive the "African Slave Ship" business model ... |
|
|
// not driven by mechanics// |
|
|
Of course. It is the chauffeur's job to drive the vehicle ; the
mechanics perform maintenance and repair. |
|