Dan - I've seen that a lot. Some people never really learn to use the computer, though, but rely on others to tell them what to do. They can't actually "think" logically for themselves. Others, however, have an inherent curiosity and ability to figure things out for themselves. I once dated a guy who didn't even know what a Windows desktop was on his computer. Two weeks later he diagnosed a hard drive failure, backed up the data, removed the drive and installed a working one, and got everything running again. By himself.
Some people catch on really quickly, once they get a computer in their hands. Others just never get it.
Tom - That is so true. Sometimes, though, I've come across the phrase "information overload" in terms of people trying to process all that information. Sometimes it leads to a state of mental burnout. However, I think if one evaluates the information at a comfortable level and pace, one can only expand, and one's mental, cognitive, and logical skills will improve.
BTW, as a note, the professor said that multiple choice questions are by nature ambiguous. One usually has to choose the best of the group. He mentioned this is due to the nature of the English language. Something I can easily understand after our conversations about chatbot use of the English language.
I did get to understand the choices better, but it still left a bit of a hole. It was one of those questions where you REALLY had to exercise your brain cells. And then it amounted to either taking a wild guess, or using process of elimination and determining which of the remaining answers was the best match.
I'm wondering from this if this is how some, if not all, chatbots work. Or if not, whether this idea might help chatbots work better? The chatbot would then have to come up with the choices as well, and determine from those choices which is best.
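Just to make the idea concrete: here's a toy sketch of "come up with choices, eliminate the weak ones, pick the best." Everything in it is hypothetical — the `score` function here is just a crude word-overlap measure I made up for illustration; real chatbots use learned models, not anything this simple.

```python
# Toy sketch of answer-by-elimination: score each candidate answer
# against the question, drop the weakest, pick the best survivor.
# The scoring is a made-up word-overlap measure, purely illustrative.

def score(question, candidate):
    """Fraction of the question's words that the candidate shares."""
    q_words = set(question.lower().split())
    c_words = set(candidate.lower().split())
    return len(q_words & c_words) / max(len(q_words), 1)

def answer_by_elimination(question, candidates, keep=2):
    # Rank all the candidates, eliminate the low scorers...
    ranked = sorted(candidates, key=lambda c: score(question, c), reverse=True)
    survivors = ranked[:keep]
    # ...then take the best match of the answers left to choose from.
    return survivors[0]

question = "What color is the sky on a clear day"
choices = [
    "The sky is blue on a clear day",
    "Grass is green",
    "Two plus two is four",
]
print(answer_by_elimination(question, choices))
# prints "The sky is blue on a clear day"
```

It's basically the same move as the exam question: don't try to produce the perfect answer cold; generate options, strike out the bad ones, and commit to the best of what's left.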
Amazing to think about. I bet most of the students might not have even thought that far.
And here I answered an ambiguous question correctly due to logical thinking (or so I thought). Perhaps AIs can use logical thought to answer ambiguous questions or come up with responses?
Now this is a case of the reverse, where things one learns outside the AI arena can be brought INTO the AI arena. Which is what we are trying to do in the first place, I guess.
It certainly can go both ways. And by that, I think one becomes more observant outside the forum for things to learn and discuss in the forum.