Hello!


InterestedInAI

Hello!
« on: June 17, 2016, 02:13:39 pm »
Hi,
I've always been interested in AI (I did a Computer Science and Cybernetics degree in Reading, England). Though I'm now an analyst, I have a project I've been working on which has opened up the philosophy of AI to me, and I wanted to find a community to discuss it with.

So just to summarise what I've been working on - there is a measure where I work that looks at how happy people are with the service. Thousands of people every month respond to a text question giving a rating and feedback. I developed procedures to upload the responses into the Data Warehouse, remove special characters, assign ratings for the positivity and negativity of the words used in the feedback, and apply a fuzzy matching function I wrote to group suffixes and misspellings, so we could do text analysis, find trends in the feedback and create a "voice" of the services.
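To give a rough idea of the kind of scoring and fuzzy grouping I mean, here is a minimal sketch (illustrative only; the word list, cutoff and function names are made up for this post, not our actual warehouse code):

```python
from difflib import get_close_matches

# Hypothetical word sentiment ratings (not our production word list).
WORD_SCORES = {"happy": 1.0, "helpful": 0.8, "slow": -0.5, "rude": -1.0}

def canonicalise(word, vocabulary, cutoff=0.8):
    """Fuzzy-match a word onto the vocabulary to group misspellings and suffixed forms."""
    matches = get_close_matches(word.lower(), vocabulary, n=1, cutoff=cutoff)
    return matches[0] if matches else word.lower()

def score_feedback(text):
    """Strip special characters, canonicalise each word, and sum the ratings."""
    cleaned = "".join(ch if ch.isalpha() or ch.isspace() else " " for ch in text)
    words = (canonicalise(w, list(WORD_SCORES)) for w in cleaned.split())
    return sum(WORD_SCORES.get(w, 0.0) for w in words)

print(score_feedback("Staff were helpfull but the queue was slooow!"))  # ~0.3
```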

Now I'm making a very big assumption here, but while I was working on this it occurred to me what might be wrong with the natural language AI approaches I've read up on: developers tend to look at the problem at too high a level. We approach it as a philosopher would, looking at the rules and structures of words and sentences, and we build rules to replicate how the brain sees patterns. The way we learn language as babies, however, is by association to ourselves. For example, as babies we were told we looked happy, so we associate the word "happy" with the emotional and physical feeling (what the muscles in our faces are doing). Language is learnt from how we associate words with ourselves, from the emotional response we get to visual and aural input.

My question is this: is there development in giving AI a sense of identity, a sense of self, and then teaching it through its association with its surroundings? Or am I completely wrong in my assumptions?

Thanks


8pla.net

Re: Hello!
« Reply #1 on: June 17, 2016, 03:37:54 pm »
Hello Int,

Let me start by giving you a nickname... Int. I think that is a cool nickname, because it is a data type in the C language, short for integer. And it is an abbreviation that is much easier to type. Let me know if you approve. You can call me by the nickname they gave me, 8pla.

I would theorize that A.I. may have a stable high level and a stable low level, but may lack a stable mid level. In my view, this is where A.I. research is currently focused. Let's, for example, form a theory that places chatbots on a high level and neural networks on a low level.

Chatbots, though limited, are (to put it politely toward ANNs) superior to neural networks at mimicking natural language. Neural networks, though limited, are in my opinion superior at simulating natural learning and appearing to be alive, sort of like a puppy trying its best to learn from its master. A mid level might be a synergy between the high-level and low-level A.I.

At least this is my informal working theory, based on hands-on experience building chatbots and artificial neural networks. I enjoy trying to discover new connections between these two amazing A.I. success stories. But there are so many other success stories in A.I. to choose from; that's what makes it so exciting.
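Just to make that theory a bit more concrete, here is a toy sketch of what a mid level gluing the two together might look like. Everything in it is invented for this post: the rule table stands in for a chatbot, and the hand-set weights stand in for a trained neural network.

```python
import random

# High level: a tiny hand-written chatbot rule table.
RULES = {"hello": "Hi there!", "bye": "Goodbye!"}

# Low level: hand-set feature weights standing in for a trained network
# that scores how "question-like" an unknown input is.
WEIGHTS = {"?": 2.0, "what": 1.0, "why": 1.0, "how": 1.0}

def question_score(text):
    """Sum the feature weights present in the text."""
    return sum(w for feature, w in WEIGHTS.items() if feature in text.lower())

def reply(text):
    """Mid level: try the chatbot rules first, then fall back to the learned signal."""
    for pattern, response in RULES.items():
        if pattern in text.lower():
            return response                          # high level answers
    if question_score(text) > 1.0:
        return "Good question. What do you think?"   # low level recognized a question
    return random.choice(["I see.", "Go on..."])     # neither level is confident

print(reply("Hello bot"))          # -> Hi there!
print(reply("Why do we dream?"))   # -> Good question. What do you think?
```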

   
My Very Enormous Monster Just Stopped Using Nine


Art

Re: Hello!
« Reply #2 on: June 17, 2016, 04:21:26 pm »
Welcome, InterestedInAI (agreed... you've got to let us shorten that name). ;)

So, to offer my answer to your question, "... is there development in giving AI a sense of identity, a sense of self, and then teaching it through its association with its surroundings? Or am I completely wrong in my assumptions?"

Look at some of the work done by people like Cynthia Breazeal, of MIT robotics fame. Her creation Kismet, and later her furry companion Leonardo, both used emotional teaching along with action/reward situations and training. She, of course, later developed the "home help bot" Jibo, which recognizes faces and responds with many emotional reactions.

Teaching robots / A.I. what is good or acceptable and what is bad or unacceptable is ongoing and certainly good practice. It is always interesting to see the methodology used by various researchers and practitioners in their attempts to "train" their respective creations. Visual, touch and verbal cues all seem to be used, with something in the programming assigning a weighted number to indicate to the program that a certain result was acceptable; that example is then saved for future reference or use. Bad or failed results are likewise categorized. The learning is ongoing and growing.
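Very roughly, that weighting idea might look something like the sketch below. It is a deliberate over-simplification, and the situations, actions and weights in it are entirely made up rather than taken from any researcher's actual code.

```python
# Saved examples of attempts, each tagged with a weight for how acceptable it was.
memory = []

def record_outcome(situation, action, weight):
    """Save the attempt along with its acceptability weight."""
    memory.append({"situation": situation, "action": action, "weight": weight})

def best_action(situation):
    """Reuse the highest-weighted past action for this situation, if any exists."""
    seen = [m for m in memory if m["situation"] == situation]
    return max(seen, key=lambda m: m["weight"])["action"] if seen else None

record_outcome("person smiles", "smile back", 1.0)    # acceptable result
record_outcome("person smiles", "look away", -0.5)    # failed result
record_outcome("person frowns", "back away", 0.7)     # acceptable result

print(best_action("person smiles"))   # -> smile back
```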

No, you are not wrong in your assumptions, in my opinion.

Enjoy your stay and again, welcome!
In the world of AI, it's the thought that counts!

*

Don Patrick

Re: Hello!
« Reply #3 on: June 18, 2016, 10:02:28 am »
There are a number of projects that try to learn from the ground up.
http://icub.org/ for instance is a child-like learning robot. The most notable thing about this approach is that it's very slow. There are also robots that create their own language by assigning randomly generated bleeps to physical actions like raising an arm, and then, by trial and error, getting a mimicking robot to learn to do the same at the appropriate bleep (a toy sketch of the idea follows below). Again, this is a slow process.
And there is http://www.mindconstruct.com/, which is trying to tie language to experiences.
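As a toy illustration of that bleep experiment (the action list, vocabulary and reward scheme here are my own invention, not taken from any of those projects):

```python
import random

ACTIONS = ["raise arm", "nod", "turn left"]

# The teacher robot invents a private vocabulary: one random bleep per action.
bleep_codes = random.sample(range(100, 1000), len(ACTIONS))
bleep_for = {action: f"bleep-{code}" for action, code in zip(ACTIONS, bleep_codes)}

# The mimicking robot's current guesses about what each bleep means.
belief = {}

def trial(bleep, correct_action):
    """One trial-and-error round: act on the current guess, keep it only if rewarded."""
    guess = belief.get(bleep, random.choice(ACTIONS))
    if guess == correct_action:
        belief[bleep] = guess                   # rewarded: the association sticks
    else:
        belief[bleep] = random.choice(ACTIONS)  # no reward: try something else next time

# Many slow rounds later, the learner converges on the teacher's vocabulary.
for _ in range(200):
    action = random.choice(ACTIONS)
    trial(bleep_for[action], action)

print(belief == {bleep: action for action, bleep in bleep_for.items()})  # usually True
```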

I can't say that I have particular faith in any of them, because I think that approach takes at least as much hands-on parenting and schooling as it takes for a human child to learn to read and write perfectly, while no one is prepared to do that (or to fund it). Scientists are looking for a magic bullet and will be sorely disappointed to find that it still takes decades of training. Nevertheless, they are interesting projects that can offer a lot of insights, and I agree that language processing nowadays works far too shallowly at the textual level. But when full human understanding isn't required, textual-level analyses cover 80% of commercial needs with relatively little effort.
CO2 retains heat. More CO2 in the air = hotter climate.

*

8pla.net

Re: Hello!
« Reply #4 on: June 19, 2016, 07:15:50 am »
Int mentioned, "there is a measure where I work that looks at how happy people are with the service. Thousands of people every month respond to a text question giving a rating and feedback." I also have work experience with this type of project, involving thousands of multiple-choice questions. In my case it was a statistical survey called a questionnaire.

In the past, a sense of identity was artificial for humans; it depended on the economic system. For example, slaves, taken as spoils of war in Africa and later shipped globally, did not have legal names. Human beings were valued by gender and age and lost their sense of identity, listed as property alongside the livestock. The economic system of slavery is still widespread in modern times.

So, if we extrapolate from that history to A.I. research and development... Logically, a sense of identity is man-made for naturally intelligent humans. Therefore, it may be simulated for artificially intelligent systems. Voice recognition is possible with training, so why wouldn't recognizing the qualities that, in context, make the A.I. an individual also be possible with training?


My Very Enormous Monster Just Stopped Using Nine

 

