Halfbakery: Veni, vedi, fish velocipede
This AI system would be resident on your phone, or whatever digital assistant is in fashion, and would learn all there is to know about being you. Forgetting the obvious security, privacy, and whatnot issues for the moment, it would learn all about what it is to be 'you': your attitudes, opinions, intelligence, fears, hopes, dreams, interests, disinterests, relations, etc. It would continue to learn all of this about you throughout your life and file it all away.
You could have the AI imitate you at any age, according to how you were then. Want to have a debate with your 20-year-old self? Tell the AI to imitate you at that age.
Furthermore, as it's a connected AI, there's no reason it has to be restricted to just you. It could represent an amalgam of the collective wisdom and paranoia of everyone or anyone specific. Want to talk about poetry with Elon Musk? Have coffee with George Takei? Anyone with a profile could be available. Personalities in demand could charge a nominal subscription fee.
Dying of cancer and want to pass on your wisdom from beyond the grave to your great-great-grandchildren? Here's a way to do it.
Inspiration in the news
https://www.npr.org...deliver-immortality
Dying guy makes an AI of himself for his wife. Talk about the potential for weirdness for her 2nd mate. [RayfordSteele, Jun 14 2024]
Annotation:
Yesterday I was thinking about how this would make a good book premise. The protagonist could be the AI bot; the antagonist is the human the bot is learning to be, who is thought to be dead or dying (leading to the bot's activation) but then recovers in some fashion. Drama ensues.

The other obstacle would be the Pandora's box of ethics, security, abuse, fraud, etc. Just imagine the deepfakiness.

That link... wow. Leave it to MIT to go even further than what I thought would be possible.

The books could be a whole series and take in a number of genres, from sci-fi horror to slapstick humor. Or really throw a curveball and add a time travel arc to the bot's journey.

Here's the issue I have with these "AI personality X" ideas:

If you took the machine-intelligence part away and asked a regular person to perform or adopt a particular identity for you to have a chat with, how would that feel?

Yes, it could be super-fun if you were in a role-playing kind of mood, but at the end of the day, someone pretending to be X isn't as exciting as actually talking to X. Unless your genuine friendship with the person who's going to be doing the pretending is rich enough for this to be a fun and engaging exercise, but that's more about the other real person than it is about the personality being co-opted.

If someone pretends to be someone else, can I form any kind of meaningful interaction with that secondary personality? I have to, at the very least, suspend my own disbelief.

Given that an AI is (probably) less intelligent than a person pretending to be X, and that I don't have a meaningful relationship with any given Large Language Model, the excitement is further diminished.

There is something to encoding or recording your own personality into something (the "Dixie Flatline" comes to mind from Neuromancer), but again, to the downstream consumer, whoever that might be, would it be any different to asking someone to pretend to be someone else for the purposes of a conversation? It's just a difficult thing to imagine trying to actively engage with.

[on further consideration] I think there's something here related to the sufficiency or otherwise of the Turing test.

If an intelligence is testable by talking to it and reflecting on whether or not it was convincing, does it follow that identity (i.e. a specific implementation of an intelligence) is similarly determinable?

If I think I'm talking to someone specific, does it matter if that someone isn't really who I thought they were? Turing might argue it doesn't matter (or at least I might pretend/imagine that's what he might argue!), but I'd be tempted to disagree. I suppose I could spin up a few versions of myself, let them each posit what their imagined Turings might have to say on the matter, and settle the thing by majority.

[zen] A lot of people use telephones, videoconferencing, etc.; i.e. they have conversations with electronic devices and pretend to themselves that they are interacting with a human. I would argue that interacting with a machine that is actuated by a distant human is closer to interacting with a machine that is actuated by machinery than it is to interacting with an actual live, physically present human.

[poc] They do, but I don't see how using a machine (telephone, fax, telegraph, virtual-reality videoconferencing) as part of the interface that carries language-based communication between two humans is any different to letting the mechanics of air pressure carry voice signals between mouth and ear, or talking to someone while looking at them in a mirror. Yes, the fidelity of the signal is affected, and some physical interactions are constrained, but language-based interaction is qualitatively unaffected.

As physically embodied creatures, our channels for communication are necessarily mechanical; conveying those mechanical signals through more sophisticated physical methods over longer distances doesn't seem to make a huge qualitative difference.

But it'd be a very different experience for me talking to a family member on the phone than it would be talking to a machine I know is mimicking a family member on the phone. The relationships are different. Though I'm not sure the relationships are so different as to make such a conversation impossible to engage with, without some mental gymnastics.

And to connect back to the idea more specifically: we watch plays and movies all the time, and for the most part (and to be honest, it's quite amazing that this happens at all) forget that what we're watching isn't real, so who's to say. I suppose the difference is that in a book, a play, or a movie, I'm a relatively passive observer, soaking up the action.

Whereas, if I'm expected to *participate*, that's a whole lot of extra work that I'm not as ready to exercise, in terms of suspending my disbelief. If I have to become an actor in the play, that's a very different deal to consuming media in the ways we're more used to.

//If I have to become an actor in the play// - you are basically describing computer games now?

//some physical interactions are constrained, but in terms of language based interactions// This is precisely the point: the constraint to language misses a large amount of the information flow between two humans interacting face to face.

Obviously non-direct information can be useful brain input, like reading the dictionary, or exchanging typed messages on a stupid-ideas website. But the valorisation of the source of the information as being human / non-human is what I am getting at. It's just information. The experiential experience of experiencing the presence of another live being seems to be different.

I imagine it could be a much creepier version of Alexa that would eventually be put in storage as people decide they need to move on. Unless they have some sort of chosen, dedicated space for it.