AI Dreams Forum

Member's Experiments & Projects => General Project Discussion => Topic started by: frankinstien on November 11, 2020, 12:40:23 am

Title: It's here!
Post by: frankinstien on November 11, 2020, 12:40:23 am
I just got these glasses (https://epson.com/For-Work/Wearables/Smart-Glasses/Moverio-BT-300-Smart-Glasses-(AR-Developer-Edition)-/p/V11H756020).  So what, right?

Well, if you recall my symbolic modeling (http://wraithbots.com/2020/11/06/symbolic-concept-modeling/) approach, where one can model just about anything, I wanted the machine to experience life. Without a body there's no way that will happen, so...wait for it...I decided to use my own body! Yes! How? Well, those glasses are augmented reality glasses with an HD camera, so the intent is to use object detection software to pick out stuff as I walk about or fly a drone (they were actually designed for flying drones). The machine will ask what those objects are if it doesn't recognize them, via a piece of software I'm developing that asks questions about anything using the symbolic modeling tool. But it doesn't stop there: I'll also be using a smartwatch to collect information about my body temperature, heart rate, oxygen levels, rate of motion, etc., which will correlate with the visual and auditory information collected from the glasses. I'll add environmental temperature and wind sensors as well, on a Raspberry Pi Zero that I can carry along.
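To make the object-detection side concrete, here's a minimal sketch of the "flag what you don't recognize" step. I'm using torchvision's pretrained COCO detector as a stand-in for whatever the glasses' feed ends up using, and the triage_frame helper and the 0.6 threshold are just my own placeholders:

Code:
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Pretrained COCO detector stands in for whatever model the glasses' feed uses.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

CONFIDENCE = 0.6  # below this score a detection counts as "unrecognized"

def triage_frame(path):
    """Split one camera frame into known and unknown detections."""
    frame = to_tensor(Image.open(path).convert("RGB"))
    with torch.no_grad():
        out = model([frame])[0]
    known, unknown = [], []
    for label, score, box in zip(out["labels"], out["scores"], out["boxes"]):
        target = known if score >= CONFIDENCE else unknown
        target.append((int(label), float(score), box.tolist()))
    return known, unknown

known, unknown = triage_frame("frame.jpg")
for _, score, box in unknown:
    # Here the question-asking tool would prompt: "What is this?"
    print(f"Unrecognized object at {box} (score {score:.2f})")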

So I plan on just traveling about and letting Amanda (https://aidreams.co.uk/forum/general-hardware-talk/my-hal-rig/), that's the big ole Hal server I showed in a previous post, get a taste of life as a human. Oh, and those smartwatches can tell if you're falling asleep and the quality of the sleep as well, so the machine can have information that makes a lot of terms and ideas that humans use much more relatable.
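The correlation part mostly comes down to aligning the sensor streams on a common clock. A minimal sketch with pandas, where the log formats are made up for illustration (the real ones will depend on the watch and the glasses), pairing each object sighting with the nearest-earlier vitals sample:

Code:
import pandas as pd

# Hypothetical logs: the watch samples vitals on its own clock, the glasses
# log object sightings on theirs.
vitals = pd.DataFrame({
    "time": pd.to_datetime(["2020-11-11 09:00:00", "2020-11-11 09:00:30"]),
    "heart_rate": [72, 95],
    "skin_temp_c": [33.1, 33.4],
})
sightings = pd.DataFrame({
    "time": pd.to_datetime(["2020-11-11 09:00:12", "2020-11-11 09:00:41"]),
    "object": ["dog", "stairs"],
})

# merge_asof gives each sighting the body state it happened in.
context = pd.merge_asof(sightings.sort_values("time"),
                        vitals.sort_values("time"),
                        on="time", direction="backward")
print(context)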

So this gets as close as you can get to a machine being a toddler, where it points at just about everything and a parent has to teach the child what it's looking at and why humans act the way they do.  :D
Title: Re: It's here!
Post by: HS on November 11, 2020, 04:41:46 am
That's quite a neat solution.  O0
Title: Re: It's here!
Post by: Don Patrick on November 11, 2020, 07:31:32 am
Very smart indeed. Much more feasible than training a robot.
Title: Re: It's here!
Post by: frankinstien on November 11, 2020, 06:26:57 pm
There is just one more thing I think should be included with this setup, and that is a motion capture suit; here's a DIY implementation (https://hackaday.com/2020/07/19/building-a-motion-capture-suit-on-the-cheap/). I have always been an advocate for hybrid AI, combining symbolic approaches and NNs. For this case, motion capture using an NN would fit the bill for articulating types of human body motion. The integration with the modeling tool would be a breeze, since it can link to libraries, which could be trained NNs.
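Something along these lines is what I have in mind: a small recurrent net that labels a window of captured joint data with a motion type, which the modeling tool could then link to like any other library. Just a sketch; the channel count and label set are made up:

Code:
import torch
import torch.nn as nn

N_JOINT_CHANNELS = 51   # e.g., 17 joints x 3 rotation values (assumed)
N_MOTION_TYPES = 8      # assumed label set: walk, reach, etc.

class MotionClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(N_JOINT_CHANNELS, 64, batch_first=True)
        self.head = nn.Linear(64, N_MOTION_TYPES)

    def forward(self, seq):            # seq: (batch, frames, channels)
        _, (h, _) = self.lstm(seq)     # final hidden state summarizes the clip
        return self.head(h[-1])        # logits over motion types

model = MotionClassifier()
clip = torch.randn(1, 120, N_JOINT_CHANNELS)   # one 120-frame dummy clip
print(model(clip).softmax(-1))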
Title: Re: It's here!
Post by: frankinstien on November 11, 2020, 06:50:40 pm
Quote
For this case motion capture using a NN would fit the bill to articulate types of human body motion.

After talking with people who work with mocap, they say an NN isn't needed; simply capturing the data works well, and algorithms can even interpolate transitions between motions.
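For example, the standard trick is spherical linear interpolation (slerp) on each joint's rotation, blending the end of one captured clip into the start of the next. A quick sketch with made-up quaternions, no NN involved:

Code:
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions."""
    q0, q1 = q0 / np.linalg.norm(q0), q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:          # take the short way around the 4D sphere
        q1, dot = -q1, -dot
    if dot > 0.9995:       # nearly parallel: fall back to a linear blend
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

# Blend a joint from the end of one captured motion (q_a) to the start of
# the next (q_b) over ten in-between frames.
q_a = np.array([1.0, 0.0, 0.0, 0.0])        # identity
q_b = np.array([0.7071, 0.7071, 0.0, 0.0])  # 90 degrees about x
transition = [slerp(q_a, q_b, t) for t in np.linspace(0, 1, 10)]
print(transition[5])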
Title: Re: It's here!
Post by: frankinstien on December 20, 2020, 10:16:52 pm
Has anyone tried these gloves (https://hi5vrglove.com/store/hi5glove)? They're haptic gloves, and at $1000  :o  the price almost seems too good to be true. Unfortunately, they are out of stock.  :(
Title: Re: It's here!
Post by: frankinstien on December 20, 2020, 10:45:13 pm
Quote
Has anyone tried these gloves (https://hi5vrglove.com/store/hi5glove)? They're haptic gloves, and at $1000  :o  the price almost seems too good to be true. Unfortunately, they are out of stock.  :(

Actually, for what I need from a haptic glove, which is hand positioning and pressure sensing on the fingertips, this less expensive glove (https://www.amazon.com/gp/product/B072L7BM6F?pf_rd_r=FFXWCGA9QM01X84QYDYN&pf_rd_p=9d9090dd-8b99-4ac3-b4a9-90a1db2ef53b) might work.
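Worst case, the pressure side could be done with force-sensitive resistors on the fingertips, read through an ADC on one of the Pis. A rough sketch assuming five FSRs on an MCP3008 over SPI; the glove's actual interface may be completely different:

Code:
import spidev

# Hypothetical wiring: one force-sensitive resistor per fingertip on
# channels 0-4 of an MCP3008 ADC attached to the Pi's SPI bus.
spi = spidev.SpiDev()
spi.open(0, 0)              # bus 0, device 0
spi.max_speed_hz = 1_350_000

def read_channel(ch):
    """Read one 10-bit sample (0-1023) from MCP3008 channel ch."""
    reply = spi.xfer2([1, (8 + ch) << 4, 0])
    return ((reply[1] & 3) << 8) | reply[2]

fingertips = {name: read_channel(ch)
              for ch, name in enumerate(["thumb", "index", "middle",
                                         "ring", "pinky"])}
print(fingertips)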