The Athena Project

  • 184 Replies
  • 85459 Views
*

Art

  • At the end of the game, the King and Pawn go into the same box.
  • Trusty Member
  • **********************
  • Colossus
  • *
  • 5865
Re: The Athena Project
« Reply #60 on: July 08, 2014, 10:25:53 pm »
Yeah, Snowman, how IS the Athena Project progressing?

A status update would be nice if you'd be so kind! O0
In the world of AI, it's the thought that counts!

*

Snowman

  • Trusty Member
  • ****
  • Electric Dreamer
  • *
  • 145
    • minervaai.com
Re: The Athena Project
« Reply #61 on: November 15, 2014, 07:55:24 pm »
Here are a couple of videos explaining some chatbot theories of mine.
I already turned these theories into reality a couple of years ago, so I decided to teach the theory in a vid.
I'm not really the best teacher and I don't go too in-depth, so please bear with me.




*

Freddy

  • Administrator
  • **********************
  • Colossus
  • *
  • 6860
  • Mostly Harmless
Re: Elizabot New Project
« Reply #62 on: November 17, 2014, 02:21:33 pm »
Edit: I split 8pla.net's branch into its own topic so that this topic may remain focused. That topic is now here:

http://aidreams.co.uk/forum/index.php?topic=7280.msg3025

There was a question though...

...Athena is more mature than my brand new project, which does not even have a name yet. So here is a question for the Athena project... Matching the input can be done... Even multiple matches on the input are no problem... But what may be the best way to match and then decide on or generate the best single output? I am trying to use the Levenshtein distance more effectively right now. It's working, but I know I am missing something.
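For anyone curious about the mechanics, here is a rough Python sketch of one way to do what 8pla.net describes: compute the Levenshtein distance from the input to every stored pattern and return the response of the closest one. The pattern/response pairs and function names are invented for illustration; they are not taken from either project.

```python
# Rough sketch: pick the stored pattern closest to the user's input by
# Levenshtein distance, then return its response. Patterns are invented.

def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    if len(a) < len(b):
        a, b = b, a
    previous = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        current = [i]
        for j, cb in enumerate(b, start=1):
            insert_cost = current[j - 1] + 1
            delete_cost = previous[j] + 1
            substitute_cost = previous[j - 1] + (ca != cb)
            current.append(min(insert_cost, delete_cost, substitute_cost))
        previous = current
    return previous[-1]

def best_response(user_input, patterns):
    """Score every pattern and return the response of the closest match.
    Distances are normalised by pattern length so longer patterns are not
    unfairly penalised."""
    def score(pattern):
        return levenshtein(user_input.lower(), pattern.lower()) / max(len(pattern), 1)
    return patterns[min(patterns, key=score)]

patterns = {
    "what is your name": "My name is Athena.",
    "how old are you": "Old enough to know better.",
    "tell me a joke": "I would, but my humour module is still in beta.",
}
print(best_response("whats your name?", patterns))  # -> My name is Athena.
```

Normalising by pattern length is just one way to keep long patterns from being penalised; keyword weighting could be layered on top of the same idea.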

*

Snowman

  • Trusty Member
  • ****
  • Electric Dreamer
  • *
  • 145
    • minervaai.com
Re: The Athena Project
« Reply #63 on: December 22, 2014, 06:48:39 am »
Well, it has turned out to be a long year. I've kept pretty quiet for the last few months and only recently broke my silence with a theoretical YouTube video (as posted above). Well, I guess I'd better give some sort of update before Art and Freddy beat me senseless ;). And yes, I know I was pushing for a beta release by the end of this year. This just goes to prove I have no clue how to judge project scheduling, and I duly apologize. Sorry Art :( 

I have been bouncing around between lots of code. As you well know, there are lots of individual parts of this type of AI project that require my attention. There are also plenty of theoretical ideas that need to be explored. I don't generally keep a work log, so it's difficult for me to break it all down.

Obviously, I spend a certain amount of time updating and adding to my programming classes. Now and then I'll spot some room for improvement that will make Athena run a little smoother. An idea will spark in my head (not to worry, my hair hasn't caught fire as of yet) and I will add a new class and lots of cool functions that make things run more smoothly. I actually have fun doing this, not sure why.

I added a class to easily keep track of multithreading. I've added lots of new functions that allow for creative array searches. This comes in really handy in NLP, since a sentence is nothing more than an array of words with space separators. You'd be surprised how much use I'm getting out of them. Still not sure why this is fun... but it is.
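To make the "sentence as an array" point concrete, here is a tiny Python sketch (the helper names are mine, not Athena's): split on spaces, then search the resulting word array.

```python
# Tiny sketch of "a sentence is just an array": split on spaces, then search
# the token array. Helper names are mine, not Athena's.

def tokens(sentence):
    """Turn a sentence into an array of lower-cased words."""
    return sentence.lower().strip(".?!").split()

def contains_phrase(words, phrase):
    """True if the phrase occurs as a contiguous run inside the word array."""
    n = len(phrase)
    return any(words[i:i + n] == phrase for i in range(len(words) - n + 1))

words = tokens("Bob was a smart man but I am smarter")
print(contains_phrase(words, ["i", "am"]))  # True
print(words.index("smart"))                 # 3 -- plain array search
```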

The NLP has proved to be the most complex part of the AI so far. I even had to reacquaint myself with the laws of English grammar so I could get everything working right. So far I think I have an excellent template to work with, and now I'm coding the rules to parse sentences into knowledge diagrams. I'm not saying any of this is easy or simple. I'm deliberately trying to code the ability to learn new words on the fly so Athena can expand its vocabulary over time. My 10-year-old nephew thought it was funny that I was watching a YouTube video on basic English sentence structures... I told him, “I no speaky good.” jk :D 

On the experimental front, I've been learning the basics of proper Neural Networks in all of their mathematical glory. I found an excellent teacher and all this really interests me. I did manage to figure some of this out already, and came up with the conversational-nodes (decision tree) type of AI programming that I mentioned in my most recent video. I would love to tie NN and NLP together to make a top-notch data-miner for Athena. I would love to see where this leads.
   
Lately, I also have been refreshing and strengthening my web coding. I went over HTML, CSS, Python, and JavaScript, and still need to refresh myself on PHP (it's been a while). My favorite language so far seems to be Python. It acts like a scripting language but is as powerful as VB :) . I've never been overly fond of brackets. {}

Another thing I took time working on was the UI itself. I knew I needed to rewrite it. The first attempt was made with Visual Studio's graphic designer, which proved to get in my way. I needed to set certain variables more efficiently and, of course, get everything a bit more organized. I want to utilize my script editor's functionality to its fullest extent. Personally, I think it's worth the effort. The better the code, the easier it will be to do any upgrading or debugging later on (when Freddy and Art find all my bugs ;) ).

Freddy, I worked on my cube idea a bit further. I think it might be more of a novelty, but I was adding skins for the cube which change with time, like a gif for all six sides. I then used another program that allowed me to make the gif-like animations. Like I said, it's a bit of a novelty. I wish I were a good 3D animator. I also made a speech class that made connecting to visemes a lot easier for me. After I worked on that, I created my own text-to-speech reader. I like using it to read large texts to me. It's better than anything I've used so far (not bragging, just happy with it).
 
Everything takes time for us slow people (speaking only of myself). No, Art, you're as fast as greased lightning. The only one that can keep up with you is Freddy :).

Art...               Freddy...
Besides working on all that, and I know I'm probably leaving something out, I've been watching YouTube vids, playing on my PS4 and some PC games, plus general work and life stuff. If there is anything you would like to know, just ask. I really need to stop disappearing like I tend to do. I do come on now and then just to take a peek at what's happening on the AI Dreams front. I can try to get more detailed about anything above if you like.

Merry Christmas,
~Aaron

*

Art

  • At the end of the game, the King and Pawn go into the same box.
  • Trusty Member
  • **********************
  • Colossus
  • *
  • 5865
Re: The Athena Project
« Reply #64 on: December 22, 2014, 02:04:34 pm »
Snowman:
<...Art, you're as fast as greased lightning.>

Uhh...Aaron, if you were to see me, you'd likely rethink that statement! :2funny:
In the world of AI, it's the thought that counts!

*

Freddy

  • Administrator
  • **********************
  • Colossus
  • *
  • 6860
  • Mostly Harmless
Re: The Athena Project
« Reply #65 on: December 22, 2014, 02:57:08 pm »
Welcome back Aaron :)

You have been busy; what you have been working on sounds really interesting, and there's so much of it  :)

You don't seem slow at all, it sounds like you have made a lot of progress.

I like PHP, I've never really tried Python apart from some small long forgotten scripts for Blender.

I'd like to see what you did with the cube in the end. I've been playing with HTML5 graphics lately, and Web Audio. I made an MP3 player, but I have only scratched the surface of it so far and have not gone further with WebGL as yet. Mostly I am learning from and adapting examples and tutorials; I have a lot to learn.

Neural nets might be where I am heading next too. It sounds interesting to me; I have some ideas which I have had for a long, long time but never got around to coding. Are the tutorials you mention freely available? If so I would be interested in them.

When you mention the UI, is this a standalone exe or a web page?

And one final question, how do you make Athena learn things?

Thanks for the update and the entertaining read  O0

*

Snowman

  • Trusty Member
  • ****
  • Electric Dreamer
  • *
  • 145
    • minervaai.com
Re: The Athena Project
« Reply #66 on: December 23, 2014, 06:02:20 am »
As requested, here's a goofy gif of a cube. I have the framework set up. If I spent some more time I could create any skin I wanted, at a much higher frame rate. I haven't taken the time to make viseme animations with it so far. I could have, but I just didn't. It was a proof of concept more than anything else.




There is a man by the name of Jeff Heaton who I am learning from. He has both free and paid lessons, all of them very cheap. He actually ran a Kickstarter campaign to raise money to make these lessons; he wanted to make neural networks understandable for people like me. He couldn't be a better teacher.

Email me and I will give more info.

This is Jeff Heaton's Youtube channel with lots of lessons on Neural Nets.
https://www.youtube.com/channel/UCR1-GEpyOPzT2AO4D_eifdw

This is Jeff Heaton's website where you can buy more comprehensive pdf versions of his lessons.
I bought them.
http://www.heatonresearch.com/


Also, as a Christmas present, here is the text-to-speech reader I coded. I provide a video and a OneDrive link to the download. Feel free to give me some advice for improvements. It was really intended for personal use. I was having difficulties with other text-to-speech programs and decided to create one instead.

This program requires Windows XP or better. As long as it supports the .NET Framework 4.5 it should be fine. If you don't have this version of the .NET Framework then I don't think the Reader will run, although I think most people have it. You can download the Framework from here:

https://www.microsoft.com/en-us/download/confirmation.aspx?id=30653

(my disclaimer)
I'm not liable for any problems you might have with the program (although I doubt you will have any). Nor can you sue me because your computer is affected in a negative way or you have any loss of data. If for some reason you have emotional issues from this I will pray for you, but you download at your own risk and liability.

Download link: Athena Reader 1.0
http://1drv.ms/1zPLD2z


Here's a link to the video:




Freddy, I've been working on a stand-alone application for Athena. It would be nice to have it run on as many devices as possible, but for now I will be content with a Windows app. Once I've got all the bases covered I suppose I will port it to more universal platforms. It would be cool to have it on a tablet, or totally web-based so it could be accessed by anyone with a browser, including any phone. :)


You asked me
Quote
how do you make Athena learn things ?
If that's not a loaded question I don't know what is :P

Where to begin?

I'm working on every form of learning possible with Athena.
#1 There is direct learning. That's where Athena directly asks the user a question and expects a true answer. She records it and retrieves it when needed.

Example 1:
Athena: When were you born?
User: January 11, 1980
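A bare-bones Python sketch of that direct learning loop, just to illustrate the record-and-retrieve idea; the dict-based memory is a stand-in, not Athena's actual storage.

```python
# Bare-bones sketch of direct learning: ask, record the answer, retrieve it
# later. A plain dict stands in for Athena's real memory.

memory = {}

def direct_learn(question, answer):
    memory[question] = answer

def recall(question):
    return memory.get(question, "I don't know yet.")

direct_learn("When were you born?", "January 11, 1980")
print(recall("When were you born?"))  # January 11, 1980
```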

#2 There is NLP learning. That's where you completely digest every bit of info you can from the user's input.

Example 2:
User: Bob was a smart man, but I am smarter.
Athena: Yes, you are smarter than Bob. Bob must be pretty dumb.

#3 There is Neural Network learning. That's when it learns by repetition, rewards, punishment, and trial and error.

Example 3:
User: Tell me I'm smart.
Athena: No.
User: I said, tell me I'm smart!
Athena: I don't think so.
User: If you don't, I will delete you from the hard drive.
Athena: Oh, did you say "you are smart".. Of course you are. Please don't kill me.

:P

Well, I did mention this in my last post:
Quote
I'm deliberately trying to code the ability to learn new words on the fly so Athena can expand its vocabulary over time.

This is done by checking whether a user's word exists within Athena's WordNet library. If it doesn't exist then it is added and, hopefully, Athena will ask for more info about that strange word.

Example 4:
User: Have you ever seen a snufferfluff?
Athena: What is a snufferfluff?
User: It is an animal that lives on Mars.
Athena: Do you have a snufferfluff? 
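Here is a hedged Python sketch of the vocabulary check behind Example 4. The small word set stands in for Athena's WordNet-based library; the real thing would query that instead.

```python
# Sketch of the vocabulary check in Example 4. The set below stands in for
# Athena's WordNet-based lexicon; a real lookup would query that library.

known_words = {"have", "you", "ever", "seen", "a", "what", "is", "it", "an",
               "animal", "that", "lives", "on", "mars"}

def learn_new_words(user_input, lexicon):
    """Add any unseen word to the lexicon and return the newcomers so the
    bot can ask about them."""
    new_words = []
    for word in user_input.lower().strip(".?!").split():
        if word not in lexicon:
            lexicon.add(word)
            new_words.append(word)
    return new_words

for unknown in learn_new_words("Have you ever seen a snufferfluff?", known_words):
    print(f"What is a {unknown}?")  # -> What is a snufferfluff?
```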


There are a lot of different variants of AI learning, but I think this covers the general ideas. I know I'm being a bit silly, but if you can't have fun with it then it's not worth my effort.

There are all kinds of information that a good NLP can get from a user's sentences. Here is a list of the different kinds of sentences that people use to relay information. If Athena knows these, learning will be more natural. I'm only including this to show how far I want to take this project. There are so many possible learning techniques.



Quote
1. Attributive:

Mary has a pink coat.

2. Equivalent:

Wines described as great are fine wines from an especially good village.

3. Specification (of general fact):

Mary is quite heavy. She weighs 200 pounds.

4. Explanation (reasoning behind an inference drawn):

So people form a low self-image of themselves, because their lives can never match the way Americans live on the screen.

5. Evidence (for a given fact):

The audience recognized the difference. They started laughing right from the very first frames of that film.

6. Analogy:

You make it in exactly the same way as red-wine sangria, except that you use any of your inexpensive white wines instead of one of your inexpensive reds.

7. Representative (item representative of a set):

What does a giraffe have that's special?… a long neck.

8. Constituency (presentation of sub-parts or sub-classes):

This is an octopus… There is his eye, these are his legs, and he has these suction cups.

9. Covariance (antecedent, consequent statement):

If John went to the movies, then he can tell us what happened.

10. Alternatives:

We can visit the Empire State Building or call it a day.

11. Cause-effect:

The addition of spirit during the period of fermentation arrests the fermentation development…

12. Adversative:

It was a case of sink or swim.

13. Inference:

I am still in the process of learning how all this is done. I'll probably come up with better explanations as time goes by. Ask anything you want, I'll do my best to answer.

*

Art

  • At the end of the game, the King and Pawn go into the same box.
  • Trusty Member
  • **********************
  • Colossus
  • *
  • 5865
Re: The Athena Project
« Reply #67 on: December 23, 2014, 12:46:28 pm »
That sounds extremely interesting from a machine learning standpoint and with the program not having to rely solely on scripts.

I'd love to help with Beta Testing when the time is right.

It would be cool to see you post a video with you interacting with Athena a bit...sort of like a Demo or Intro segment.

Thanks for your efforts and Merry Christmas! :santasmile:
In the world of AI, it's the thought that counts!

*

ranch vermin

  • Not much time left.
  • Terminator
  • *********
  • 947
  • Its nearly time!
Re: The Athena Project
« Reply #68 on: December 23, 2014, 01:06:14 pm »
I bet your chat bot is going to be excellent.
Your cube graphic was cool :)

[EDIT]

(Excuse me if you know this already, but I just thought of it!!!)
I just came up with an awesome, incomplete idea for an NLP knowledge extractor and a Markov chain predictor (both mentioned in this post).
If you can tell which of your chaining options suits the knowledge in your current extraction, you should be able to steer the Markov chain to suit the assertions in your database.
So it's a crazy prediction chain, but it knows things... I suppose how good it is depends on what kind of information you gave it and how you functionally steered the chain.

Definitely one to add to my belt of crazy ideas. It's cool because if it were some visual Markov chain system, you could keep the continuity of the video whilst trying to maintain its knowledge base, and hopefully get well away from the order of the video you taught it on.
« Last Edit: December 23, 2014, 02:14:20 pm by ranch vermin »

*

Don Patrick

  • Trusty Member
  • ********
  • Replicant
  • *
  • 633
    • AI / robot merchandise
Re: The Athena Project
« Reply #69 on: December 23, 2014, 03:14:56 pm »
I don't do neural nets or Markov chains, but I do NLP, and thought I'd share my experience on what you're getting into:

I've spent the last 2 years trying to program AI to deal with knowledge, but found 3/4 of my time diverted to programming the NLP necessary to understand less-than-textbook use of language. Among other things I also programmed a function to automatically learn new words, as I didn't feel like entering entire dictionaries manually. But I have to say: if you can access an online dictionary or ontology to identify unknown words, that might be an easier solution. What I did instead was compose a hundred complicated grammar rules examining the surrounding words, e.g. "a/that/my are never followed by a verb", to narrow down the types that a new unknown word is likely to be. Unfortunately very few of these grammar rules are found in grammar books, because they are "natural" to humans, and I never seem to be done hammering out inconsistencies and exceptions. However, I believe this method is common enough in grammar parsers, so that would be a useful area of research. There always remain cases where it is impossible to tell the type of a word by language conventions alone, and another effect of auto-learning is that you may find misspelled words added to the vocabulary just as well. That one I haven't fixed yet.
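To illustrate the kind of rule being described (not Don Patrick's actual code), here is a tiny Python sketch: look at the word before an unknown word and strike out the word types it cannot be. A real parser would use on the order of a hundred such rules, each with its exceptions; the rule table here is simplified and partly my own invention.

```python
# Illustrative only: narrow down what an unknown word can be by looking at the
# word in front of it. The rule table is tiny and ignores the many exceptions.

ALL_TYPES = {"noun", "verb", "adjective", "adverb"}

# word before the unknown word -> types the unknown word can NOT be
RULES = {
    "a": {"verb", "adverb"},
    "an": {"verb", "adverb"},
    "that": {"verb"},
    "my": {"verb"},
    "very": {"noun", "verb"},
}

def possible_types(previous_word):
    """Return the word types still possible for an unknown word,
    given the word that precedes it."""
    return ALL_TYPES - RULES.get(previous_word.lower(), set())

print(possible_types("my"))    # noun, adjective or adverb
print(possible_types("very"))  # adjective or adverb
```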

I'm impressed by the scope of your work. It sounds like a very interesting project, good luck!
CO2 retains heat. More CO2 in the air = hotter climate.

*

Art

  • At the end of the game, the King and Pawn go into the same box.
  • Trusty Member
  • **********************
  • Colossus
  • *
  • 5865
Re: The Athena Project
« Reply #70 on: December 23, 2014, 05:19:40 pm »
Some examples that might not fit the bill of "a/that/my are never followed by a verb":

A running watch, A beating heart, A clang of the bell will start the race.

That jumping spider scared me, That gunshot was loud, That jump was farther than the previous one.

My beating of the drum was perfect, My throw was perfect, My dropping of the glass was an accident.

Not picking, just pointing out that there are and always will be inconsistencies and irregularities in the "rules".
Rules are usually "best case" and "past practice". They are often discarded after a lapse of time. The use of "ain't" is now accepted by most dictionary / spelling checkers, whereas several years back it wasn't. On average, somewhere between 12 and over 150 new words have been added to dictionaries over the past 3-5 years alone! The "rules" will adapt to suit the people.

 
In the world of AI, it's the thought that counts!

*

Don Patrick

  • Trusty Member
  • ********
  • Replicant
  • *
  • 633
    • AI / robot merchandise
Re: The Athena Project
« Reply #71 on: December 23, 2014, 06:12:53 pm »
I fully agree that there are always exceptions to the rules, which in practice means I have had to re-categorise the occasional auto-learned word in the vocabulary, and most of my grammar rules are followed by an "unless..." or three. However, I consider "a beating heart" to be -grammatically- an adjective, and "my beating of the drum" a noun or subject in the way they are used. That they are both based on a verb is a second matter that does not thwart the default rules of grammar, though it is certainly important to mention. Along that line lies the road to the great ambiguity of the English language, still a sizable problem for the whole field of NLP.

The most interesting exceptions I find are the ones in older literature. For instance, in school I was taught that subject always precedes verb, but then Sherlock Holmes says "Have you your gun, Watson?". However, I don't wish to scare Snowman too much: I did find basic grammar rules to be a practical and useful starting point from which to evolve an increasingly more robust language interpreter. A "working theory" if you will.
« Last Edit: December 23, 2014, 06:36:19 pm by Don Patrick »
CO2 retains heat. More CO2 in the air = hotter climate.

*

ranch vermin

  • Not much time left.
  • Terminator
  • *********
  • 947
  • Its nearly time!
Re: The Athena Project
« Reply #72 on: December 24, 2014, 05:59:19 am »
It's great to hear you guys talking about NLP. Here's a diagram of the Markov chain with an NLP analyzing and predicting at the same time.

The main aim of this game is that whatever rule you couldn't detect, you can rely upon the predicting chain to fill in what you couldn't write.
But the bad thing is that it needs tonnes of chains to work with, because the program that runs the behaviour is bottlenecked by what chains are available.




*

Snowman

  • Trusty Member
  • ****
  • Electric Dreamer
  • *
  • 145
    • minervaai.com
Re: The Athena Project
« Reply #73 on: December 24, 2014, 07:28:16 am »
Art:
Quote
It would be cool to see you post a video with you interacting with Athena a bit...sort of like a Demo or Intro segment.

Yep, that would be cool. I'm just so busy working on the meat of the software. It wouldn't be practical at this point to try to piece together a makeshift Athena just for entertainment's sake, but of course I would if I felt it wouldn't slow my overall progress down. I feel like it would be a wasted effort on my part at this stage in the game... but... we'll see :). I think I understand the frustration.

ranch vermin:
Combining Neural Networks and NLP is a tricky idea. I have given it some thought already, but not nearly enough. I know there are lots of examples on many levels, e.g. I've heard of someone creating an NLP by using a NN. In fact, I think Jeff Heaton mentioned he was working on that. Basically, you would teach a neural network the English language with lots and lots of examples and eventually it will figure it out on its own. Another way a NN can be used involves, as you were saying, a Markov chain approach. I have heard of people using Markov chains to find parts of speech. You can find more info about that here: https://en.wikipedia.org/wiki/Part-of-speech_tagging .

I was playing around with Markov chains when I was exploring Markov chains and conversational trees in those two videos I posted on this thread. In a way I was, as Art was suggesting, showing how you can script a chatbot's responses by feeding it mass conversations. You could also hand-craft scripts using this process. I'm sorry, Art, that I left the impression I was taking Athena solely in that direction. I do intend to use it, but not exclusively. Overall, I think there are many, many more NN and NLP combinations.
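For readers who want to see the Markov-chain idea in its simplest form, here is a toy Python sketch: record which word follows which in example conversations, then walk the chain to generate a reply. The training lines are invented; steering the chain with a knowledge base, as ranch vermin suggests, would be an extra layer on top of this.

```python
# Toy word-level Markov chain: learn which word follows which in example
# conversations, then walk the chain to produce a reply.
import random
from collections import defaultdict

def train(lines):
    """Map each word to the list of words that have followed it."""
    chain = defaultdict(list)
    for line in lines:
        words = line.lower().split()
        for current_word, next_word in zip(words, words[1:]):
            chain[current_word].append(next_word)
    return chain

def generate(chain, start, max_words=8):
    """Walk the chain from a starting word, picking successors at random."""
    word, output = start, [start]
    while word in chain and len(output) < max_words:
        word = random.choice(chain[word])
        output.append(word)
    return " ".join(output)

corpus = [
    "i like talking about neural networks",
    "i like playing games on my ps4",
    "talking about grammar is fun",
]
print(generate(train(corpus), "i"))
```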

I will look more at your diagram. Details buddy... need lots of details. :D


Don Patrick:
I like your name. I know another man by that name. He's in his late 80s and is a very honorable man; I've known him all my life. I've been wanting to say that for a while now :).

Even though I've been playing around with Neural Networks and Markov chains, I believe the heart of an AI revolves around the NLP. It is the most important part and, therefore, the most difficult to achieve. So all the work you've already done was certainly worthwhile, and a difficult task. I would love to pick your brain but I'm not sure where to start.

The first thing I did in confronting the NLP problem was to understand it. I concluded these facts: (I'll probably forget something obvious :P)

#1 Human language is actually a high-level programming language that the brain utilizes. It is similar to a scripting language, e.g. Python, VBScript, JavaScript. So it has data structures, assignment operators, decision structures, and access to a whole lot of content.

#2 Human language is extremely logical and usually refined and optimized.

#3 All human languages can on the surface be unique, but must be the same underneath, since people all have a similar brain structure and similar experiences. In other words, just because a computer has a variety of scripting languages doesn't mean that they don't all use a similar assembly language. (This is just an analogy. I know there are exceptions.)

With these assumptions I decided to look for a lower-level language to discover how human languages are constructed. The best way to do this was to write my own language. I'm pretty sure people trying to get a degree in linguistics have to do this at some point, if not for their doctorate. I didn't need weird or unusual ways of speaking words, so I could keep English nouns and such. I also didn't need to worry about making it fluent or smoothly worded. All I really needed was to understand how to structure it. So I looked online for examples and found a language called Lojban and another called AllNoun. Lojban is a constructed language (CL) made with AI processing in mind. Unfortunately, it has a learning curve that I didn't want to climb.

To make a long story short, I eventually learned the basics of human language. What was fascinating was finding out that the WordNet project got a lot of things right. You can learn more about WordNet here: https://en.wikipedia.org/wiki/WordNet 
To give a very, very basic overview of what I found, I will write a very basic sentence in my VL language (Visual Logic).

English: Bill gave a ball to Fred.
VL: sentence(agent(bill) action(give(time(past))) patient(ball) recipient(Fred))
Using acronyms: sen(agt(bill) act(give(tim(past))) pat(ball) rec(fred))

In a VL sentence we have an agent, action, patient, and recipient as its grammar. The English equivalent has subject, verb, direct object, and indirect object.

Now I'm able to look at an English sentence (or one in any other language) as nothing more than a data structure. So we should now be able to get any information that the user has encoded back out of this data structure using the basic sentence-structure techniques we just learned, i.e. agent, action, etc.
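As a minimal illustration of treating the sentence as a data structure, here is a Python sketch that stores the VL frame above and reads information straight back out of it. The field names follow the post (agent, action, patient, recipient); the dict layout itself is only an assumption for the example.

```python
# Minimal way to hold the VL-style structure as plain data, so the sentence
# really is "nothing more than a data structure".

sentence = {
    "agent": "Bill",
    "action": {"verb": "give", "time": "past"},
    "patient": "ball",
    "recipient": "Fred",
}

def describe(frame):
    """Read the encoded information straight back out of the structure."""
    if frame["action"]["verb"] == "give":
        return f'{frame["agent"]} gave a {frame["patient"]} to {frame["recipient"]}.'
    return "I don't know what happened."

print(describe(sentence))  # Bill gave a ball to Fred.
```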

So I am refreshing myself on the basics of English grammar. Clauses tend to start with relative pronouns, although they are not required to do so. Phrases do not contain subject-and-verb combos. Anyway, it all makes a lot more sense to me now because I know what I'm trying to data-mine.

An article will only precede a noun phrase.
A running watch =  article(a)  nounPhrase(adjective(running) noun(watch))

The sentence “That is absolutely right.” makes the word “That” a pronoun, which can therefore be followed by a verb. However, the word “that” is an adjective if it precedes a noun, e.g. “That gunshot was loud.” Gunshot is a noun, Art :P.

Anyway, I know what you're driving at, Don. You are probably much further along the grammar road than I am. My first objective was to write the Visual Basic code that allows me to easily write these grammar rules. I think I just got through with that about a week ago; at least I have a template for it. I think the biggest thing I will have to overcome is English idioms. (blah)

I can't worry too much about users' poor spelling. Even auto-correct has a hard time with my creative spelling.

Ultimately, after I've dissected the information from the sentences, the next task will be to use that information logically. If we know that “a cow eats grass” and that “a cow is an animal”, we should then be able to answer a simple question like “what animals eat grass?”. This is a basic way information needs to be utilized in a proper AI.
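Here is a hedged sketch of that inference step using simple (subject, relation, object) facts in Python. The storage and query style are illustrative only, not Athena's actual design.

```python
# Hedged sketch of the inference step: store simple (subject, relation, object)
# facts and combine them to answer "what animals eat grass?".

facts = [
    ("cow", "eats", "grass"),
    ("cow", "is_a", "animal"),
    ("lion", "eats", "meat"),
    ("lion", "is_a", "animal"),
]

def animals_that_eat(food):
    """Intersect the 'eats food' facts with the 'is an animal' facts."""
    eaters = {s for (s, r, o) in facts if r == "eats" and o == food}
    animals = {s for (s, r, o) in facts if r == "is_a" and o == "animal"}
    return eaters & animals

print(animals_that_eat("grass"))  # {'cow'}
```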

Scared? No.. Depressed? Maybe a little :P

Freddy:
I haven't spent tons of time working on the cube. I wish I were half as talented as you. I've seen your YouTube vids showing your work with Unity. It was amazing. A person couldn't ask for a better avatar than a woman standing in beautiful scenery. Even if the AI were poor (not saying it is), people would still buy it just for the quality you've managed to assemble. 

*

Don Patrick

  • Trusty Member
  • ********
  • Replicant
  • *
  • 633
    • AI / robot merchandise
Re: The Athena Project
« Reply #74 on: December 24, 2014, 10:01:47 am »
Sounds like your friend is worthy of the honorary title of "Don" ;)

I wouldn't know where to start explaining all the NLP problems I've tackled myself. I can only address specific issues, and if I leave out details it is because I am more interested in seeing people come up with interesting, different methods than in all of us travelling exactly the same road.
I think it's very wise that you're using your brain to analyse this from the ground up and find the basic building blocks of language first. It is not only a fascinating exercise, it'll come in very handy once you tackle relative clauses (basics first, though). I extract facts as what the NLP field calls "triples": the do-er, the relationship, and the done-onto (and reversible). The indirect object and location are elements I chose to regard as separate triples: the do-er, the relationship, and the done-with or done-at. It fragments the information more, but it made for a simpler database structure. It's never been determined which structure of knowledge representation is preferable :)

I see the trouble with a neural net learning language as lying in the exceptions. For some uses of language, there just are no patterns to detect. I have encountered spelling rules that only apply to 5 words in the whole language, and singular exceptions stemming from old Latin or French origins.

As for idioms, I just mark the verb and the following words and then check them against a list of common expressions to know if they are figurative or literal, and translate them to their basic meaning. At some later date I'd distinguish this through knowledge and logic, such as knowing that one cannot literally "lend a hand". But for now, working theories get pretty far.
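As a rough illustration of that idiom check (not Don Patrick's actual implementation), a tiny Python sketch: match the marked verb phrase against a list of common expressions and substitute the literal meaning. The idiom list is illustrative.

```python
# Rough sketch of the idiom check: compare the marked verb phrase against a
# list of common expressions and swap in the literal meaning.

IDIOMS = {
    ("lend", "a", "hand"): "help",
    ("call", "it", "a", "day"): "stop",
}

def literal_meaning(verb_phrase):
    """Return the plain meaning if the phrase is a known idiom,
    otherwise return the phrase unchanged."""
    key = tuple(word.lower() for word in verb_phrase)
    return IDIOMS.get(key, " ".join(verb_phrase))

print(literal_meaning(["lend", "a", "hand"]))   # help
print(literal_meaning(["wash", "the", "car"]))  # wash the car
```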
CO2 retains heat. More CO2 in the air = hotter climate.

 

