castable flat folding robot with mechanical neural network


MagnusWootton

castable flat folding robot with mechanical neural network
« on: April 07, 2022, 04:44:37 pm »


I'm just posting this to show you guys and to get the plan clearer in my own mind, so I can finally put this thing together.

This robot casts as a flat extrusion, then folds into a volume for the eye/ear and the motor connection. It runs on a single motor and the rest is purely mechanical, even the perceptron/neural network. It will run at about 100 Hz, but the perceptron is fully "wired" in parallel, so it gets an output from its 3-hidden-layer network in one tap and doesn't need a high frame rate.
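
To make the "one tap" idea concrete, here's a rough Python sketch of what a 3-hidden-layer perceptron computes per pass (the layer sizes are placeholders, not the real design); in the mechanical version every layer settles simultaneously through the linkage instead of being computed step by step:

```python
import numpy as np

# Placeholder layer sizes: camera inputs -> 3 hidden layers -> 16 motors.
sizes = [256, 64, 32, 24, 16]
rng = np.random.default_rng(0)
weights = [rng.normal(size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]

def forward(x):
    """One 'tap' of the network: each layer is a weighted sum followed by a
    rectify, which is roughly what the crank ratios and jams do in hardware."""
    for W in weights:
        x = np.maximum(0.0, W @ x)
    return x

print(forward(rng.random(sizes[0])).shape)   # (16,) motor commands from a single pass
```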

It's like a flat chipboard, its version of a skeleton; I might model some aesthetics onto it and add a silicone coating. Being purely mechanical it requires no electricity to run; it'll run off a spring if you wanted it to.

Because I want it castable in one hit, it's easier if the whole thing has no vias/jumpers and is all one layer. I managed that by having an upper layer that just folds over; the eye is cast as single slits that accordion-fold together and link directly to the neural network inputs.
Because it has an ear, and for other state-keeping reasons, it needs a little temporal feedback from the output back to the input of the network. That's an easy way to gain recurrency, so it has context over time.

The eye is ultrasonic, and it has a mechanical step-down transmission to lower the frequency of the signal (which comes packed with the eye). As I lower the frequency it increases the torque, so the signal can push its way through the network to the output passively, with no amplification or added power beyond the ultrasound hitting the eye membranes itself. (It gets a good 1000x the torque that way, but I may have to keep the stroke small; I have to think a little more.) And it sees like a bat, at 40 kilohertz.
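
For a sense of the numbers, this is just the ideal-transmission relation, using the 40 kHz figure from above and an assumed 1000:1 step-down; a lossless transmission conserves power, so the torque gain equals the speed reduction:

```python
f_in = 40_000.0       # Hz, ultrasonic signal at the eye membrane
ratio = 1000.0        # assumed step-down ratio
f_out = f_in / ratio  # ~40 Hz after the transmission

# Ideal (lossless) transmission conserves power:
#   torque_out * speed_out = torque_in * speed_in
# so torque scales up by the same factor the speed drops by.
torque_gain = ratio
print(f_out, torque_gain)   # 40.0 Hz, 1000x the torque (not 1000x the power)
```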

The same transmission that's in the eye is also in the leg, so I can have a tendon vibrate with a short stroke, conduct that down the leg, and then go into the transmission, which slows the vibration down to a slow turn, and that's what pulls the joints.

The legs will have to go through a short reorientation from the flat cast to occupy their relevant axes of rotation, done in a short assembly step after the moulding, when you add the motor. The machine doesn't have angle sensors on its joints; I actually have a technique to infer the joint angles from the camera instead, but it may need a little recurrency (feedback) to get the temporal context needed to pin the angles down in the network.

The network will have about 300k synapses laid out on a 2D grid with 3 hidden layers, shrinking from a large input layer connected to the camera down to 16 motors plus the feedback state information.
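
A back-of-envelope for the synapse count, assuming fully connected shrinking layers; the widths below are my guesses picked to land near 300k, not the actual layout:

```python
# Guessed layer widths: camera inputs -> 3 hidden layers -> 16 motors + 16 feedback lines.
layers = [500, 400, 200, 100, 16 + 16]

synapses = sum(a * b for a, b in zip(layers[:-1], layers[1:]))
print(synapses)   # 303200, in the ballpark of the ~300k mentioned above
```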

I don't know if that will be enough synapses, but it's good enough for a try. To set the weights of the mechanical network I tap into it with an Arduino at the motor output register position and at the sensor input position, gather the I/O pairs, and then run my AI software, which should give me the best possible weights to set this mechanical machine with.
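
A hedged sketch of the I/O-pair gathering step, assuming the Arduino streams one line per frame over serial with sensor readings and motor register values separated by a "|" (the port name and line format here are hypothetical, just to show the loop):

```python
import serial  # pyserial

# Hypothetical port and line format: "s0,s1,...,sN|m0,m1,...,m15"
port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

pairs = []
while len(pairs) < 10_000:
    line = port.readline().decode("ascii", errors="ignore").strip()
    if "|" not in line:
        continue
    sensors, motors = line.split("|")
    x = [float(v) for v in sensors.split(",")]
    y = [float(v) for v in motors.split(",")]
    pairs.append((x, y))   # one I/O pair for the weight-fitting step later
```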

It's also fully 3D printable in one hit, and doesn't necessarily need electricity to run, though you can use electricity for the main engine.


Here's a picture of the step-down transmission attached to its compound eye.


This is also in its leg.

It's about 70% done. I need a couple of weeks and then I'll print out my NANOBOT on my Anycubic Photon! Heh, it's doubtful it'll work yet, but at least I'll have the plan fully finished soon (a couple of days).

I hope to get it done in 5 prints max. It's mostly all together; it should cast as 2 whole connected pieces, a moving layer which has the mechanical connections and a base bracing frame it sits on. I wonder if I could do it in 1 piece and have it all just fold over, the moving connectors onto the fitting brace/base.

« Last Edit: April 07, 2022, 05:04:43 pm by MagnusWootton »


Zero

Re: castable flat folding robot with mechanical neural network
« Reply #1 on: April 07, 2022, 06:28:31 pm »
Wut? Did I read 'mechanical neural net running off a spring'?  :o

How does a mechanical neuron work?


MagnusWootton

Re: castable flat folding robot with mechanical neural network
« Reply #2 on: April 07, 2022, 06:58:35 pm »
Yep, fully mechanical; no electricity is strictly required for the robot to function.

It uses rotating cranks (instead of cogs, but they both rotate) and rods that connect the logic through. You can get a controlled-NOT (like in quantum computing, but this is normal logic), which is universal logic so it can run any program, just by leaning the crank left or right of a vertical rod connected to it, and that inverts the output (which goes out to the right, perpendicular). So if it's leaning left, the output goes right-left as it pulls and pushes, and if it's leaning right, it goes left-right as it pulls and pushes (the inverse).
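
For reference, a tiny Python sketch of the controlled-NOT truth table the crank-and-rod arrangement reproduces (the code only illustrates the logic, not the mechanism):

```python
def cnot(control, target):
    """Controlled-NOT: the control passes through unchanged and the target is
    inverted only when the control is set. In the mechanism the crank's lean
    plays the role of the control, deciding whether a push on the input rod
    comes back out as a push or a pull."""
    return control, target ^ control

for c in (0, 1):
    for t in (0, 1):
        print((c, t), "->", cnot(c, t))
```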

I can do more things too with a similar principle that can increase frequency by doubling and decrease it by halving (that was that Friterchet diagram): I take one push and turn it into 2 pushes. I use that transmission all through it, in its eye/ear and also in its legs, and it steps up quite a bit to drive its ultrasonic emitter. It can all be powered with any motor you want, so a little one could just use 1 DC motor.

The mechanical neurons work by using leverage on the crank to get the multiply for the synapses (so the weight is the ratio of the radii at which the input and output rods connect), and then the system is allowed to jam out the negative part for the rectify, and that's all you need for a neuron.
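
Here's my reading of that neuron as a sketch, assuming the weight is the output/input radius ratio and the jam acts as a rectifier (i.e. a ReLU); the function below is an illustration, not the actual linkage:

```python
def mechanical_neuron(inputs, radius_in, radius_out):
    """Each synapse weight is the lever ratio radius_out / radius_in on its
    crank, the rods sum the displacements, and jamming out the negative part
    acts like a rectifier (ReLU)."""
    total = 0.0
    for x, r_in, r_out in zip(inputs, radius_in, radius_out):
        total += x * (r_out / r_in)   # leverage = multiply
    return max(0.0, total)            # jam on negative = rectify

# Example: two inputs with 2:1 and 1:2 lever ratios
print(mechanical_neuron([0.5, -0.8], [1.0, 1.0], [2.0, 0.5]))
```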

I have to have these "slack rods" in place so that when a jam happens it only happens on that one line and doesn't seize the rest of the machine.

Then the other cool thing was getting it castable as a single "cookie cutter" extrusion, so the layers of the network are laid out on a grid and there's no big wire mess that you usually get with a neural network.

« Last Edit: April 07, 2022, 07:41:30 pm by MagnusWootton »


infurl

Re: castable flat folding robot with mechanical neural network
« Reply #3 on: April 07, 2022, 09:19:31 pm »
https://en.wikipedia.org/wiki/Fluidics

You're quite right in that you don't need electricity to implement logic. You might find some useful concepts in the field of fluidics which uses the flow of gases and liquids to perform sophisticated control functions.

https://en.wikipedia.org/wiki/Difference_engine

Also remember that the earliest designs for computing machines were purely mechanical, like Charles Babbage's engines, so you're in good company.


Zero

Re: castable flat folding robot with mechanical neural network
« Reply #4 on: April 08, 2022, 08:52:19 am »
It's a very cool project. I guess you can't "train" the net on a computer the usual way, but maybe you can evolve it with natural selection techniques before printing the result.


MagnusWootton

Re: castable flat folding robot with mechanical neural network
« Reply #5 on: April 08, 2022, 09:34:06 am »
I'm glad you guys like it!

So training happens on a GPU that's hooked up to the plastic body, via WiFi to an Arduino tapping into the mechanics (at 2 positions, one at the sensor and one at the motors), with help from capacitors to read values and solenoids (or high-voltage electrostatic actuators would work too) to actuate it.

It seems like it could be tricky to do that, but if I bring the sensor to one position and write to all the motors from one position, it's in the design to be easily tappable by a microcontroller (multiplexed).

Training is just computing what the weights (the ratio of the input and output radii on those little crank levers) are going to be.

I'm running this "external GPU-controlled AI system" and it decides what the robot does; it's going to develop a billion or so I/O pairs, and then I backprop these into the weights. That's basically what's happening.
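
As a sketch of the "fit the I/O pairs, then turn each weight into a lever ratio" step: below is a plain least-squares fit of a single linear layer on fake data (the real network has hidden layers, and the fixed input radius and sign convention are assumptions):

```python
import numpy as np

# Fake I/O pairs standing in for the gathered sensor/motor recordings.
rng = np.random.default_rng(1)
X = rng.random((1000, 8))     # sensor readings
Y = rng.random((1000, 16))    # motor register values

# Least-squares fit of one linear layer: Y ≈ X @ W
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Each learned weight becomes a ratio of radii on a crank lever.
r_in = 1.0                    # fixed input radius (assumption)
r_out = np.abs(W) * r_in      # output radius sets |weight|; the sign would be
                              # handled by which side the rod leans (assumption)
print(W.shape, r_out.min(), r_out.max())
```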

So the "real ai" happens on the computer, and mechanically its just going to repeat back what it did out of the mechanical network.

Then the software is a computer vision system plus a motor search (remember my little video I had here ages ago: youtube.com/watch?v=po3c-Y6sWiM). It also has another cool feature: it's going to work without angle sensors, and it works out its posture using the camera position and orientation over time. (So it needs temporal context to help it work; then there can only be one combination of leg motions over time that matches up to it.)

It's a really cool toy, and if it's a single injection-mould cycle per robot, I can get heaps of them for very little cost!
Pump 'em out like Barbie dolls. =)
« Last Edit: April 08, 2022, 10:17:20 am by MagnusWootton »


Zero

Re: castable flat folding robot with mechanical neural network
« Reply #6 on: April 08, 2022, 11:32:28 am »
Indeed. But why do you need to hook it up during training? Why not just train it all virtually, then print it?


MagnusWootton

Re: castable flat folding robot with mechanical neural network
« Reply #7 on: April 08, 2022, 12:25:55 pm »
Indeed. But why do you need to hook it up during training? Why not just train it all virtually, then print it?

So I implement something like an accelerometer, using computer vision instead.
What I need, every time the robot moves, is the new camera position and orientation, and the position and orientation of the floor plane underneath it; with those two things I can get it to walk along.
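
A minimal sketch of the "accelerometer from computer vision" idea: given the camera position estimated each frame (however the vision system obtains it), finite differences give the body's velocity and acceleration. The numbers below are toy values:

```python
import numpy as np

def visual_accelerometer(positions, dt):
    """Given per-frame camera positions, finite differences recover the
    body's velocity and acceleration, standing in for an accelerometer."""
    p = np.asarray(positions)       # shape (frames, 3)
    v = np.diff(p, axis=0) / dt     # per-frame velocity
    a = np.diff(v, axis=0) / dt     # per-frame acceleration
    return v, a

# Toy example: camera drifting in x while slowly rising in z.
poses = [[0.01 * t, 0.0, 0.001 * t * t] for t in range(10)]
v, a = visual_accelerometer(poses, dt=0.01)
print(v[-1], a[-1])
```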

But I plan on doing more than that. I actually want it to be able to fight another robot, so I can have robot wrestling with it, so the computer vision I'll be implementing is more complete than that.
It's me being tricky, not putting angle sensors in it; you don't have to omit those, and it's easier if the angles of the leg hinges are known.

So, why do I need the actual physical response from the specific body itself?

It's because I have N physical properties (constants, like gravity, friction, and hinge motor force) that I need to brute-force, to match the virtual simulation more closely to reality.
It's probably not exact even at best (the virtual physics compared to the actual physics), and it'll show that by being a little uncoordinated; it won't be able to do really amazing tricks, but it should possibly be able to wrestle and pin down, as long as it isn't being "too tricky a skateboarder" about how it does it. (It just does it simply.)
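
A minimal sketch of that brute-force calibration. The simulate() function here is a toy one-joint stand-in, and the "recorded" trajectory is generated rather than measured; it just shows the grid search over the N constants:

```python
import itertools
import numpy as np

def simulate(gravity, friction, hinge_force, steps=100, dt=0.01):
    """Toy stand-in for the ragdoll simulator: one joint driven by a constant
    hinge force, damped by friction, pulled down by gravity."""
    pos, vel, out = 0.0, 0.0, []
    for _ in range(steps):
        acc = hinge_force - gravity - friction * vel
        vel += acc * dt
        pos += vel * dt
        out.append(pos)
    return np.array(out)

# Pretend this came from the Arduino/camera recording of the real body.
real_trajectory = simulate(9.81, 0.6, 3.0) + np.random.default_rng(0).normal(0, 1e-3, 100)

# Brute-force a coarse grid over the N constants, keeping whichever set
# best reproduces the recorded motion.
best_err, best_params = float("inf"), None
for g, mu, f in itertools.product(np.linspace(9.0, 10.5, 7),
                                  np.linspace(0.1, 1.0, 10),
                                  np.linspace(0.5, 5.0, 10)):
    err = np.mean((simulate(g, mu, f) - real_trajectory) ** 2)
    if err < best_err:
        best_err, best_params = err, (g, mu, f)
print(best_params)
```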

Because the internal workings of the body are a part of the physical properties as well, and those go into the N constants for the physics simulator, I need to actually run it with the body, and the body also has to be supported by the software. So if there is a particular type of gear or motor being used, the physics simulator has to be able to support it. (A DC-motor robot is slightly different from this thing, which runs off cranks and levers.)

You could think the simulator is really hard to write, but something approximate is really easy. If you know the basics of virtual ragdoll systems you can omit a lot of the maths! A lot of it can be discarded, which does make it less realistic, but as long as you keep the angular momentum that helps.
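
As a sketch of the "approximate ragdoll" idea, here's the standard position-Verlet particles-plus-stick-constraints scheme (a generic version, not any specific engine); velocity is carried implicitly in the difference between current and previous positions:

```python
import numpy as np

class Ragdoll:
    """Position-Verlet particles plus stick constraints: the usual cheap
    approximation for ragdoll physics."""
    def __init__(self, points, sticks, gravity=(0.0, -9.81)):
        self.p = np.array(points, dtype=float)   # current positions
        self.prev = self.p.copy()                # previous positions (implicit velocity)
        self.sticks = [(i, j, np.linalg.norm(self.p[i] - self.p[j])) for i, j in sticks]
        self.g = np.array(gravity)

    def step(self, dt=0.016, iterations=4):
        # Verlet integration: new = 2*current - previous + a*dt^2
        new = 2 * self.p - self.prev + self.g * dt * dt
        self.prev, self.p = self.p, new
        # Relax the stick constraints a few times to keep limb lengths fixed.
        for _ in range(iterations):
            for i, j, rest in self.sticks:
                d = self.p[j] - self.p[i]
                length = np.linalg.norm(d)
                if length == 0:
                    continue
                correction = 0.5 * (length - rest) / length * d
                self.p[i] += correction
                self.p[j] -= correction

# A two-segment "leg": hip, knee, foot.
leg = Ragdoll(points=[(0, 1.0), (0, 0.5), (0, 0.0)], sticks=[(0, 1), (1, 2)])
for _ in range(60):
    leg.step()
print(leg.p)
```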

If you want to read about the physics engine I'm going to use, it's similar to these ->






They make it look really tough to learn, but something a little less realistic, still with the basic gist, is really easy if you just think about it on a page by yourself and get the basics down; they don't tend to help you in that area on the internet. (Thinking for yourself is more of a psychological-motivation type of lecture, a slightly different kind of "meta-learning".)

It's like that car and soft-body box thing, except it's a stick insect instead, and I just add internal forces; other than that it's exactly the same code.

I s'pose... if you didn't worry about the exact motors/linkages/gears used, it probably could be fully virtual; the other physics constants would be the same no matter what the body.
But I bet it would sync slightly better if I calibrated it with the single specific body itself, because there are a lot of factors involved, and doing it on the exact body would likely give a more accurate result by taking the extra unknown factors into account.



MagnusWootton

Re: castable flat folding robot with mechanical neural network
« Reply #8 on: April 08, 2022, 12:52:32 pm »
I have to actually get it from the robot itself anyway... I need to sample from the eye in the system itself, or I can't correlate the motors to the exact sensor it's using. I can use a webcam when I'm training it, but I have to record the actual plastic eye as I do it, so I can pair them spatially.
« Last Edit: April 08, 2022, 01:33:56 pm by MagnusWootton »

 

