Pelargir » September 13th, 2017, 7:54 pm wrote:You are right in assuming that an AI would need someone to learn from, just as humans do, I'd say. So an AI that "created itself" would most likely not work properly, as many cruel experiments with children have shown.
However, an AI that was created by a human would most likely learn from its creator. I'd even compare it to a "normal" human child.
I don't think a body is necessary for consciousness. Most of the body does not affect the mind, so a paralyzed person can still think; Stephen Hawking would be a good example of that. The only part that is directly important to the human mind is the brain. However, a computer can simulate parts of the brain, for example memory, and sensors such as a camera can stand in for eyes.
Therefore I don't think AI will fail to exist because of the lack of a body; I'd rather say it will never exist because of human incompetence (although I personally think it will exist sometime).
Pelargir » September 13th, 2017, 10:44 am wrote:You mean movies like Terminator?
That is, as you've said, science fiction.
It is not unlikely that sometime in the near future someone (hopefully not Google or the like) will create something between weak and strong AI.
Our society would definitely not be able to cope with that, which could lead to some rather serious problems.
Depending on the AI, those might even be a lot worse than a hurricane. Just think about all the bombs that can be fired from a long distance because of computers...
It is good to deal with the problems you have, but it'd be better to also deal with the ones you will have, I'd say.
Pelargir » September 13th, 2017, 1:59 pm wrote:There is no safe option for controlling bombs, but that's not the point.
Since you already mentioned it: an AI controlled by a government, or more specifically by a government's military, would be quite close to the worst thing that could possibly happen.
And it would actually be slavery, since an AI has (or would have, if it existed) a consciousness and should therefore be considered human.
So if some people used it without making an agreement, that could be called slavery. However, most governments, or at least their militaries, wouldn't care, since AI is the best thing that could possibly happen for an army (as long as they are on the same side, and maybe except for some kind of shield that can stop missiles).
Pelargir » September 13th, 2017, 2:21 pm wrote:And then we've got a problem.
I brought this up because it is a good example of why thinking about AI is not a waste of time.
Nowadays computers are the safest way of controlling bombs, I guess, but as soon as AI exists and learns to code, that isn't safe anymore.
... unless nobody needs bombs anymore once AI exists (because bombs are so completely necessary now...), in which case this problem will never occur.
Infinite_Observer » September 13th, 2017, 6:13 pm wrote:Because basic human survival instincts tell us that this is very scary. We are too used to being the dominant life form. Once another conscious form of life controls our fate like that, we give up the one thing we have come to pride our species on: superiority.
Pelargir » September 14th, 2017, 1:33 am wrote:You know, to discuss something properly, there should be reasons and/or examples.
So why do you think that there can't be any consciousness without a body?
Infinite_Observer » September 13th, 2017, 8:25 pm wrote:Sorry, I'm not really used to debating much yet, so please bear with me. But I meant control our fate in the sense that, as was discussed earlier in the thread, they could control the nuclear weapons and bomb anywhere by taking over the computers with access. I'm not sure if I have a point exactly. Just exploring the idea.
BadgerJelly » September 13th, 2017, 10:15 pm wrote:Serpent -
Then we are talking about a completely alien thing and hardly worth calling "consciousness" imo.
I don't really consider consciousness, as we know it, to be anything other than part of a sensory system.
If it cannot have a physical sense of "being" then it seems completely meaningless to say "consciousness".
A sense of space, time and other are essential to consciousness.
By sense of "other" I am referring in part to the limitation of perception and control. Any system has to exist within a larger system and be aware of its limited control in order to become conscious.
I think today we're already quite aware, as in various periods of human history, that our sense of identity is under threat.
I imagine if we integrate with computer systems we may see two equally disruptive things happen. The first, perhaps more obvious to most people, is the so-called "hive mind";
but I think what would also happen is that individuals' sense of identity may be prone to splintering into many fragments.
In this sense we may very well see such people being "carried" along by AI systems so they can feel useful to society. This leads to many difficult issues concerning ethics and how these people would "feel" productive when they were in fact just being "carried" along. The funny thing is that this is pretty much how consciousness tends to function! Our sense of "authorship", depending on what position you're inclined to view the situation from, is often "false".
Also, I mentioned some time ago a chess competition with mixed teams of AI systems and humans, one human and one computer each. The most successful team was not the grandmaster paired with the stronger machine; the most successful teams were the fastest and best machines combined with ordinary players. From this we can assume, to a degree, that our "lack of ability" and "mistakes" are part of what makes us incredibly successful as a species. If the AI knew what was good for it, I doubt it would throw away such a useful resource. Our "stupidity" may be the "genius" factor that saves us from obliteration! Hahaa! XD