Recent Posts

21
General Chat / Re: RIP Windows Phone
« Last post by Zero on December 15, 2017, 02:57:05 pm »
Yes. I understand that.
22
General Chat / Re: RIP Windows Phone
« Last post by keghn on December 15, 2017, 02:42:01 pm »
 Big business isn't nimble enough to stay in the game.
 Big big business is about keeping the old gang together to fight off rivals and change. Keep the job simple until retirement.

23
General Chat / Re: RIP Windows Phone
« Last post by ranch vermin on December 15, 2017, 12:41:52 pm »
It was a while back, but I remember being pissed off about Visual Studio supporting Windows Phone over desktop applications.
Now it's dead. What a total stuff-around.
24
AI Programming / Re: new demo on the way
« Last post by ranch vermin on December 15, 2017, 12:34:54 pm »
It's on its way!  If anyone wants to see what GPU code can look like (this drone is GPU accelerated), here's what mine does -> (it's the whole robot update, read and write).

It's finished on my computer now, but things are left out of this one, so it's non-runnable.
But it's just as small; the AI loop is not very large at all!!

Code: [Select]
DRONE_KEY run_drone(DRONE& drone)
{
 DRONE_KEY key;

 memset(&key,0,sizeof(DRONE_KEY));

 SRV views[1000];


 ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
 //
 // SHITTY 3D SENSE   (I don't think you want to use this for now; just pass it a depth map.)
 //
 views[0]=drone.retina_srv;
 RunComputeShader(dc, drone_sobel, 1, views, nullptr, nullptr, 0, drone.sobel_retina_uav, (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );   
 views[0]=drone.sobel_retina_srv;
 RunComputeShader(dc, drone_blur_sobel, 1, views, nullptr, nullptr, 0, drone.blur_sobel_retina_uav, (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );
 views[0]=drone.sobel_retina_srv;
 views[1]=drone.blur_sobel_retina_srv;
 RunComputeShader(dc, drone_tune_blur_sobel, 2, views, nullptr, nullptr, 0, drone.tune_blur_sobel_retina_uav, (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );
 //then just leave the 3d there for now.
 //basic "depth green screen"  so the foreground is mostly separate from the back, but it makes blunders.

 ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
 //
 // EYE    (first nearest-neighbour solution!)
 //
 int i;
 for(i=0;i<drone.lods;i++)
 {
  views[0]=drone.retina_srv;
  RunComputeShader(dc, drone_make_eye_keys, 1, views, nullptr, nullptr, 0, drone.eyekeys_uav[i], (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );
 } 
 for(i=0;i<drone.membanks_eye;i++){views[i]=drone.eyemem_srv[i];} //bind the eye memory banks first.
 for(i=0;i<drone.lods;i++)
 {
  views[i+1]=drone.eyekeys_srv[i]; //eye keys follow the memory bank (assumes membanks_eye==1 here).
 }
 RunComputeShader(dc, drone_jumble_match_eye_keys, drone.lods+1, views, nullptr, nullptr, 0, drone.class_map_uav, (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );
 //suppress the output map.
 views[0]=drone.class_map_srv;
 RunComputeShader(dc, drone_suppress_eye_keys, 1, views, nullptr, nullptr, 0, drone.class_map_suppressed_uav, (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );
 //form the word keys.
 views[0]=drone.class_map_suppressed_srv;
 RunComputeShader(dc, drone_make_phrases, 1, views, nullptr, nullptr, 0, drone.phrase_keys_uav, (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );
 //run the soft body word router. //then you class it,  MAN, WOMAN, PRIMO & PRIMA
 views[0]=drone.phrase_keys_srv;
 RunComputeShader(dc, drone_match_phrases, 1, views, nullptr, nullptr, 0, drone.phrase_map_uav, (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );
 for(i=0;i<drone.lods;i++)
 {
  //form the environment keys.  suppress the words off one at a time.
  views[0]=drone.phrase_map_srv;
  RunComputeShader(dc, drone_suppress_phrases, 1, views, nullptr, nullptr, 0, drone.suppressed_phrase_map_uav[0][i], (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );
 }
 //keep a history chain, at some point back
 for(i=drone.divergences-2;i>=0;i--){memcpy(&drone.motorkey_history[i+1],&drone.motorkey_history[i],sizeof(DRONE_KEY));}


//

 ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
 //
 // MOTOR    (huge population over time!)
 //
 views[0]=drone.phrase_keys_srv;
 uint j;
 for(j=0;j<drone.lods;j++) //TODO: add the j dimension to the output.
 {
  for(i=0;i<drone.membanks_motorsensor;i++){views[1+i]=drone.motorsensormem_srv[0][i][j];}
  for(i=0;i<drone.membanks_sensormotor;i++){views[1+i+drone.membanks_motorsensor]=drone.motorsensormem_srv[1][i][j];}
  //the view count has to cover everything bound above, not just the phrase keys.
  RunComputeShader(dc, drone_run_pop, 1+drone.membanks_motorsensor+drone.membanks_sensormotor, views, nullptr, nullptr, 0, drone.blended_motor_key_uav[0], (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );
 }

 /////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
 //
 // SENSOR MOTOR MEMS.   and output key.
 //
 //write new motor key to environment keys.   //NOTE: you have to truncate it for each LOD memory.
 for(i=0;i<drone.lods;i++){views[i]=drone.suppressed_phrase_map_srv[i];}
 for(i=0;i<drone.membanks_motorsensor;i++){views[drone.lods+i]=drone.motorsensormem_srv[0][i];}
 RunComputeShader(dc, drone_run_pop, drone.lods+drone.membanks_motorsensor, views, nullptr, nullptr, 0, drone.blended_motor_key_uav[0], (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );
 //write old environment keys to new motor key.  //NOTE: you do it for each LOD.
 views[0]=drone.phrase_map_srv;
 for(i=0;i<drone.membanks_motorsensor;i++){views[1+i]=drone.motorsensormem_srv[0][i];}
 RunComputeShader(dc, drone_run_pop, 1+drone.membanks_motorsensor, views, nullptr, nullptr, 0, drone.blended_motor_key_uav[0], (1<<drone.logretina)/32, (1<<drone.logretina)/32, 1 );

//
 //keep history.
 for(i=0;i<drone.lods;i++){dc->CopyResource(drone.suppressed_phrase_map[1][i],drone.suppressed_phrase_map[0][i]);}
 dc->CopyResource(drone.blended_motor_key[1],drone.blended_motor_key[0]);

 //troublesome VIDEO to SYSTEM copy: stage the GPU buffer so the CPU can map and read it back.
 ID3D11Buffer* debugbuf = CreateAndCopyToDebugBuf( g_pDevice, g_pContext, drone.blended_motor_key[0]);
 D3D11_MAPPED_SUBRESOURCE MappedResource;
 BufType *p;
 g_pContext->Map(debugbuf,0,D3D11_MAP_READ,0,&MappedResource );
 p = (BufType*)MappedResource.pData;
 memcpy(&key,p,sizeof(DRONE_KEY)); //not right; it's wrong.
 g_pContext->Unmap(debugbuf, 0);
 SAFE_RELEASE( debugbuf );

 //OUTPUT MOTOR
 return key;
}
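
If the helper calls look unfamiliar: RunComputeShader and CreateAndCopyToDebugBuf appear to be the helpers from Microsoft's BasicCompute11 DirectX SDK sample. Here's a minimal sketch of what they do; the exact signatures and the unbind housekeeping are assumptions for illustration, not code from this demo.

Code: [Select]
#include <d3d11.h>
#include <cstring>

//Sketch of the two helpers used above, modeled on the BasicCompute11 sample.
//Signatures and unbind details are assumptions, not the demo's actual code.
void RunComputeShader(ID3D11DeviceContext* dc, ID3D11ComputeShader* cs,
                      UINT nViews, ID3D11ShaderResourceView** srvs,
                      ID3D11Buffer* cb, void* cbData, DWORD cbBytes,
                      ID3D11UnorderedAccessView* uav, UINT x, UINT y, UINT z)
{
 if(cb && cbData)
 {
  //upload the optional constants (assumes a dynamic constant buffer).
  D3D11_MAPPED_SUBRESOURCE m;
  dc->Map(cb, 0, D3D11_MAP_WRITE_DISCARD, 0, &m);
  memcpy(m.pData, cbData, cbBytes);
  dc->Unmap(cb, 0);
  dc->CSSetConstantBuffers(0, 1, &cb);
 }
 dc->CSSetShader(cs, nullptr, 0);
 dc->CSSetShaderResources(0, nViews, srvs);          //inputs.
 dc->CSSetUnorderedAccessViews(0, 1, &uav, nullptr); //output.
 dc->Dispatch(x, y, z); //above: one 32x32 thread group per retina tile.

 //unbind everything so outputs can be rebound as inputs next pass.
 ID3D11UnorderedAccessView* nullUav = nullptr;
 dc->CSSetUnorderedAccessViews(0, 1, &nullUav, nullptr);
 ID3D11ShaderResourceView* nullSrvs[128] = {}; //128 = D3D11 SRV slot count.
 dc->CSSetShaderResources(0, nViews <= 128 ? nViews : 128, nullSrvs);
 dc->CSSetShader(nullptr, nullptr, 0);
}

//Copy a default-usage GPU buffer into a staging buffer the CPU can Map().
ID3D11Buffer* CreateAndCopyToDebugBuf(ID3D11Device* dev, ID3D11DeviceContext* ctx, ID3D11Buffer* src)
{
 ID3D11Buffer* dst = nullptr;
 D3D11_BUFFER_DESC desc = {};
 src->GetDesc(&desc);
 desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ; //CPU readback.
 desc.Usage = D3D11_USAGE_STAGING;            //staging = CPU-reachable, GPU-copyable.
 desc.BindFlags = 0;                          //staging buffers can't be bound to the pipeline.
 desc.MiscFlags = 0;
 if(SUCCEEDED(dev->CreateBuffer(&desc, nullptr, &dst)))
  ctx->CopyResource(dst, src);
 return dst;
}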
25
General Chat / RIP Windows Phone
« Last post by Zero on December 15, 2017, 12:16:49 pm »
So... Windows Phone is dead. Microsoft is pulling the plug; it's official.

I own a Windows Phone, and I really like it. It's a 535, with "Microsoft" written on it. I already feel like I'm driving an elegant antique car! I'll keep it until I die, and mention it in my will. It will live forever in my family...  ;D

Beyond the "betrayal", it's sad because we now have only two choices for smartphones: Google or Apple. A monopoly is bad news, globally speaking.
26
Robotics News / Unlocking marine mysteries with artificial intelligence
« Last post by Tyler on December 15, 2017, 12:00:07 pm »
Unlocking marine mysteries with artificial intelligence
15 December 2017, 4:59 am

Each year the melting of the Charles River serves as a harbinger for warmer weather. Shortly thereafter is the return of budding trees, longer days, and flip-flops. For students of class 2.680 (Unmanned Marine Vehicle Autonomy, Sensing and Communications), the newly thawed river means it’s time to put months of hard work into practice.

Aquatic environments like the Charles present challenges for robots because of the severely limited communication capabilities. “In underwater marine robotics, there is a unique need for artificial intelligence — it’s crucial,” says MIT Professor Henrik Schmidt, the course’s co-instructor. “And that is what we focus on in this class.”

The class, which is offered during the spring semester, is structured around the presence of ice on the Charles. While the river is covered by a thick sheet of ice in February and into March, students are taught to code and program a remotely piloted marine vehicle for a given mission. Students program with MOOS-IvP, autonomy software widely used in industry and naval applications.

“They’re not working with a toy,” says Schmidt’s co-instructor, Research Scientist Michael Benjamin. “We feel it’s important that they learn how to extend the software — write their own sensor processing models and AI behavior. And then we set them loose on the Charles.”
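
To give a flavor of what extending the helm involves: an IvP Helm mission is assembled from behavior blocks. The following is a minimal sketch assuming the standard BHV_Waypoint behavior, with hypothetical names and coordinates rather than actual course material:

Code: [Select]
// Hypothetical waypoint behavior for a transit leg (illustrative only).
Behavior = BHV_Waypoint
{
  name   = transit_charles           // label for this behavior instance
  pwt    = 100                       // priority weight in the multi-objective solver
  speed  = 1.5                       // transit speed, meters per second
  radius = 5.0                       // capture radius at each waypoint, meters
  points = 20,-30 : 60,-70 : 100,-30 // local x,y waypoints
}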

As the students learn basic programming and software skills, they also develop a deeper understanding of ocean engineering. “The way I look at it, we are trying to clone the oceanographer and put our understanding of how the ocean works into the robot,” Schmidt adds. This means students learn the specifics of ocean environments — studying topics like oceanography or underwater acoustics.

Students develop code for several missions they will conduct on the Charles River by the end of the semester. These missions include finding hazardous objects in the water, receiving simulated temperature and acoustic data along the river, and communicating with other vehicles.

“We learned a lot about the applications of these robots and some of the challenges that are faced in developing for ocean environments,” says Alicia Cabrera-Mino ’17, who took the course last spring.

Augmenting robotic marine vehicles with artificial intelligence is useful in a number of fields. It can help researchers gather data on temperature changes in our ocean, inform strategies to reverse global warming, traverse the 95 percent of our oceans that has yet to be explored, map seabeds, and further our understanding of oceanography.

According to graduate student Gregory Nannig, a former navigator in the U.S. Navy, adding AI capabilities to marine vehicles could also help avoid navigational accidents. “I think that it can really enable better decision making,” Nannig explains. “Just like the advent of radar or going from celestial navigation to GPS, we’ll now have artificial intelligence systems that can monitor things humans can’t.”

Students in 2.680 use their newly acquired coding skills to build such systems. Come spring, armed with the software they’ve spent months working on and a better understanding of ocean environments, they enter the MIT Sailing Pavilion prepared to test their artificial intelligence coding skills on the recently melted Charles River.

As marine vehicles glide along the Charles, executing missions based on the coding students have spent the better part of a semester perfecting, the mood is often one of exhilaration. “I’ve had students have big emotions when they see a bit of AI that they’ve created,” Benjamin recalls. “I’ve seen people call their parents from the dock.”

For this artificial intelligence to be effective in the water, students need to combine software skills with ocean engineering expertise. Schmidt and Benjamin have structured 2.680 to ensure students have a working knowledge of these twin pillars of robotic marine vehicle autonomy.

By combining these two research areas in their own research, Schmidt and Benjamin hope to create underwater robots that can go places humans simply cannot. “There are a lot of applications for better understanding and exploring our ocean if we can do it smartly with robots,” Benjamin adds.

Source: MIT News - CSAIL - Robotics - Computer Science and Artificial Intelligence Laboratory (CSAIL) - Robots - Artificial intelligence

Reprinted with permission of MIT News : MIT News homepage



Use the link at the top of the story to get to the original article.
27
AI Programming / Re: Listing States / Processes / EventActs
« Last post by Zero on December 15, 2017, 11:56:43 am »
Here is the PEGjs syntax for items. You can test it at https://pegjs.org/online

   namespace prefix : "filename"
   item id -> [ meta | data ]
   item id -> relation type { slot1: item1, item2, ... ; slot2: item3; }
   item id -> ( behavior node, ... , ... )

   namespace / namespace / item id


Code: [Select]
sourceCode
= ns:namespace* id:itemDef* _ { return { namespaces:ns, definitions:id }; }


_
= [ \t\r\n]*


namespace
= _ ns:namespaceId _ ":" _ '"' fp:filepath '"' { return { namespace: ns, filepath: fp }; }


itemDef
= _ id:localItemId _ "->" _ iv:itemValue { return { item:id, value:iv }; }


namespaceId
= c:normalChar* { return c.join('').trim(); }


filepath
= c:[^"\t\r\n]* { return c.join('').trim(); }


itemValue
= metadata / relationship / behavior


localItemId
= c:normalChar+ { return c.join('').trim(); }


itemId
= fns:namespaceId nns:nextItemPath* {

  var whole = [fns].concat(nns);
  var id = whole.pop();
 
  return { type:"item", id:id, path:whole };
}


nextItemPath
= "/" ns:namespaceId { return ns; }


metadata
= _ "[" meta:metadataContent+ "|" data:metadataContent+ "]" {

return { type:"meta/data", content: { meta:meta, data:data } };
}


relationship
= _ ri:relationshipId _ "{" _ sd:slotDef* _ "}" {

return { type:"relation", name:ri, content: sd };
}


behavior
= _ "(" fb:behaviorContent nb:nextBehaviorContent* ")" {

return { type:"behavior", content: [fb].concat(nb)};
}


metadataContent
= itemValue
/ textContent


textContent
= c:normalChar+ { return { type:"text", content: c.join('') }; }


relationshipId
= c:normalChar+ { return c.join('').trim(); }


slotDef
= _ si:slotId _ ":" _ fi:slotValue _ ni:nextSlotValue* _ ";" {

    return { slot:si, value:[fi].concat(ni) };
}


behaviorContent
= itemValue / itemId


nextBehaviorContent
= _ "," _ b:behaviorContent { return b; }


slotId
= c:normalChar+ { return c.join('').trim(); }


slotValue
= itemId / itemValue


nextSlotValue
= _ "," _ v:slotValue { return v; }


normalChar
= c:[^\-\<\>\{\}\[\]\(\)\:\/\,\;\|\\] { return c; }
/ "-" c:[^>] { return "-"+c; }
/ "\\" c:. { return c; }

Back in black.
28
Future of AI / Re: The Future of AI Is The Past! Pre-Internet Marketing & Branding
« Last post by Zero on December 14, 2017, 08:57:55 pm »
I don't think it's meant to be more than a concept. But I like the idea; it's unusual and refreshing!
29
Future of AI / Re: The Future of AI Is The Past! Pre-Internet Marketing & Branding
« Last post by Art on December 14, 2017, 02:38:21 pm »
Welcome!

Interesting character concept.

I think Blade Runner had a mechanical, robotic owl that belonged to Dr. Tyrell.

Of course, in a real application, that is, a real robot, those sharp, pointy claws would have to go, or be replaced with a softer rubber facsimile, I would think.
30
Future of AI / Re: The Greater Good - Mind Field
« Last post by Art on December 14, 2017, 02:30:17 pm »
Michael's videos have always been extremely interesting and very thought-provoking, especially this new series compared to Vsauce.

Aside from the humans, what if Track 1 had five escaped murderers and rapists, and Track 2 had a young Einstein before he became famous, or a future president? Would it make a difference?

As Spock once said, "The needs of the many outweigh the needs of the few or the one."

It is far more difficult than one thinks to have to make a decision that will involve a fellow human or humans being killed. A very powerful video, and I especially liked that they employed a psychologist to test the subjects beforehand, and the on-screen message that this was a test....

Good post, Freddy!
