
Snowman

The Athena Project in General Project Discussion

I suppose this thread will be devoted to The Athena Project. I've noticed by reading a few of the other threads that there are currently lots of minds devoted to the task of making a chatbot, ai, or something similar. Athena is intended to be my study into ai architecture, and perhaps I can add something to my own understanding of the human condition in general as well. Basically, I look at ai from every angle and endeavor to make something practical.

     What kind of ai do I intend Athena to be?

I think Athena should be a database of knowledge that mimics as much as possible a real companion. I don't intend for her to be an actual living thing. In order to make an actual living thing, you would first need to encode behavioral knowledge into a physical structural medium that not only processes information but also interacts with its environment. All Athena will be doing is sitting dormant in code upon the undynamic nature of a hard drive. In other words, she can't truly interact with the real world, except for the miniature world of the input window of a console.

     What am I focused primarily on when developing Athena?

I think my focus has mainly been on the actual ai engine. True, I have also worked on a platform of sorts, a user-interface that will make coding Athena fun and exciting. However, what's the point in making a terrific user-interface, or even an avatar, if you don't actually have a clue as to how to make the ai in the first place? Don't get me wrong, I think that graphics and beautiful ai girls have their place in the ai community; it's just that I want to use my particular talents on the engine itself.

     Do I intend to sell Athena when she finally gets finished?

Well, I've almost decided to make it a donation type of service. If people appreciate Athena, then they will donate. If they donate their time in coding and graphics, and they think they've added something to the Athena community, then they shouldn't feel obliged to give money as well. The ai community is a sincere one. Most of the people I've seen are in it for curiosity's sake. They are dreamers (as the forum name Ai Dreams suggests). Also, many people are getting good at cracking software nowadays, so... Anyway, I think you can always use other means of earning money, like selling custom packages or some other related sales. I'm sure I can think of some other way of earning money. (For example, Haptek has a free player, but earns money by selling different types of editors.)

     What about customization?

Like I've already mentioned, I think people will help develop Athena. That is intended. I want to make it as easy as possible for any person to learn coding and edit Athena's behaviors. I want it to be highly customizable. I even made the user-interface very customizable. So yes, I want the community involved.


I hope to add many more ideas and thoughts about Athena soon. I intend to present an overview of the coding that I've created thus far and give some details as to why it matters, covering language processing, search features, database creation, various utilities, and algorithms in this thread. (Now don't fall asleep yet, it will get a lot more boring from here on out.  :P ) I think by making Athena a bit more open source I will probably gain some insight into other people's ideas and perhaps learn a thing or two about the ai community.

I don't want to get so greedy that I rob my support community. (like shooting yourself in the foot.)

153 Comments | Started December 27, 2013, 09:12:38 AM

Tyler

Preparing for the Future of Artificial Intelligence in AI News

Preparing for the Future of Artificial Intelligence
3 May 2016, 9:52 pm

Announcing a new series of workshops and an interagency working group to learn more about the benefits and risks of artificial intelligence

The White House Office of Science and Technology Policy is excited to announce that we will be co-hosting four public workshops over the coming months on topics in AI to spur public dialogue on artificial intelligence and machine learning and identify challenges and opportunities related to this emerging technology.
White House Office of Science and Technology Policy

Source: AI in the News

To visit any links mentioned please view the original article, the link is at the top of this post.

Started Today at 05:00:18 PM

Tyler

Google develops a Chrysler minivan in Robotics News

Google develops a Chrysler minivan
5 May 2016, 1:00 pm

[Image: Never say never? Source: 2011 Dodge Charger, future of driving commercial/YouTube]

If you had asked me recently what big car company was the furthest behind when it came to robocars, one likely answer would be Fiat-Chrysler. In fact, famously, Chrysler ran ads several years ago during the Super Bowl making fun of self-driving cars and Google in particular:

Now Google has announced a minor partnership with Chrysler, under which Chrysler will build 100 custom versions of its hybrid minivans for Google’s experiments. Minivans are a good choice for taxis, with spacious seating and electric sliding doors; if you want a vehicle to pick you up, it probably should have something like this.

This is a minor partnership, closer to a purchase order than a partnership, but it will be touted as a great deal more. My own feeling is it’s unlikely a major automaker will truly partner with a big non-auto player like Google, Uber, Baidu or Apple. Everybody is concerned about who will own the customer and the brand, and who will be the “Foxconn.” The big tech companies have no great reason to yield on that (because they are big), and the big car companies are unlikely to yield, either. Instead, they will acquire or do deals they control with smaller companies (like GM's purchase of Cruise or its partnership with Lyft).

Still, what may change this is an automaker (like FCA) getting desperate. GM got desperate and spent billions. FCA may do the same. Other companies with little underway (like Honda, Peugeot, Mazda, Subaru, Suzuki) may also panic or hope that the Tier 1 suppliers (Bosch, Delphi, Conti) will save them.

Google custom designed a car for their 3rd generation prototype, with 2 seats, no controls, and an electric NEV power train. This has taught them a lot, but I bet it has also taught them that designing a car from scratch is an expensive proposition before you are ready to make thousands of them.

Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.

Started Today at 05:00:18 PM

keghn

HTM School in General AI Discussion

HTM School Episode 0: HTM Overview

HTM School Episode 1: Bit Arrays (video no longer available)

HTM School Episode 2: SDR Capacity & Comparison

8 Comments | Started April 20, 2016, 12:11:18 AM

Tyler

The White House Has Realized Artificial Intelligence Is Very Important in AI News

The White House Has Realized Artificial Intelligence Is Very Important
3 May 2016, 12:00 am

Artificial intelligence promises to fundamentally change the way humans live. By replicating intelligence on any level, we can begin to automate all kinds of jobs and otherwise human tasks, shifting the economy and potentially eliminating the need for a flesh-and-blood workforce.

Popular Science

Source: AI in the News

To visit any links mentioned please view the original article, the link is at the top of this post.

1 Comment | Started Today at 11:00:10 AM

Tyler

SVR Case Studies: OATV investing in early stage service robotics in Robotics News

SVR Case Studies: OATV investing in early stage service robotics
5 May 2016, 9:30 am

OATV, or O’Reilly AlphaTech Ventures, is a seed stage investment firm based in San Francisco with a track record of backing robotics startups in emerging areas. As well as investing in Fetch Robotics, some of their other hardware investments include 3D Robotics, Planet Labs, Misfit Wearables, Littlebits and Sight Machine. OATV typically invests between $250,000 and $2 million in startups at a critical early stage of development. This pre-revenue runway helps startups refine their prototypes, determine product/market fit, and achieve strong follow-on rounds. OATV invests early, typically before market categories are well defined.

Interview with former OATV Principal Roger Chen, edited for clarity.

What’s OATV’s investment thesis and how does that make you look at robotics companies?

[Image: Roger Chen, former OATV Principal]

We like to look on the edge. If you take a look at our portfolio, there’s a wide array of companies in different categories. You have anything from consumer internet companies like Foursquare to satellite companies like Planet Labs to drone companies like 3D Robotics and logistics companies like Fetch Robotics. We made each investment when we sensed the emergence of a new category. When we think a “thing” is going to become a “thing,” we try to find companies and entrepreneurs in those categories very early on and back them just before the categories are really created. Before 3D Robotics, drones weren’t really much of a category. And neither was space before Planet Labs and other pioneering companies like SkyBox.

Our focus has been on that strategy applied at the seed stage. We’re a little bit different than other seed firms in that we invest in fewer companies, about six a year. We concentrate more capital into those companies and take more of an all-in approach. Our personal philosophy and style is to try to work more closely with the companies we invest in, and it becomes hard and unwieldy to do that if you invest in too many companies within one year.

When it comes to robotics, we are seeing a lot of interesting things happen. There has been a confluence of technologies enabling new forms of robotics, from innovations in actuators enabling compliance and collaborative robotics to innovations in sensors and software.

ROS, or Robot Operating System, only emerged in the last few years, and before that, software development for robotics was exceedingly difficult. The advent of open source communities and platforms like ROS has really catalyzed the field. So those are some of the enablers on the technology side.

There’s also a lot happening on the market side, and market pull is just as important as technology. Just to give one example, let’s take a look at e-commerce and what’s happening with consumers. They want things faster, cheaper, and personalized, and this just creates so much pressure on a lot of these manufacturing, supply chain, and logistics companies.

At the same time, there are macro trends within the labor economy, as baby boomers start getting older and labor supply is expected to drop significantly. The confluence of all these factors puts logistics companies in a tough spot. People tend to forget that someone somewhere still needs to make, package, and ship things as part of e-commerce’s backend. Suddenly, automation and robots make a lot of sense and are economical.

That’s how we see the robotics industry, and while that example is specific to the logistics industry, I think there are a lot of industries where automation and robotics are going to come into play in similar ways. It’s going to be a collision between technology enablers on one side and intense emerging market demand on the other.

Could you expand on some of the trends enabling robotics and automation in logistics?

I’ll talk a little more about market pull. Depending on the country, online sales make up somewhere around 10% of overall retail. You can see how massive companies like Amazon and Alibaba are, and that 10% online share of retail will only continue to grow. It speaks to the growing volume of work that has to be fulfilled on the backend of e-commerce.

I think a lot of people, especially consumers, don’t see how much work has to go into fulfilling those online orders. There is a box that has to be moved. There is something that has to be packed. There is something that has to be shipped and transported. That’s kind of shielded away from consumers’ eyes. But it all has to be done, and it’s becoming more and more challenging for logistics companies to fulfill all these operations economically.

I can give you another statistic on the labor side. A lot of people have concerns about how robotics and automation will disrupt labor, which I think is valid and true to a certain extent. But I think you also have to be nuanced about it because if you actually look at the manufacturing and materials handling industry, particularly in the US, there is a huge job gap of 600,000 people because there is not enough sufficiently skilled labor to execute on the work to be done. That then presents an opportunity for robotics and automation to come in and fill that gap.

These are the powerful forces we see driving robotics: the really intense demand for logistics fulfillment, and simultaneously a lack of people to do all the work at an economical cost.

What other areas of the commerce value chain can robotics, smart automation and AI potentially improve?

There’s a ton of room for cost reduction via automation, but it’s not necessarily with just physical automation. There are software solutions as well that can make supply chains a lot more productive. For example, our portfolio company Fetch Robotics is tackling the logistics problem by streamlining operations in factories and warehouses with a mobile robotics platform. However, Fetch will be just as much about its future operations management software and application data as its material handling robots.

[Image: Fetch and Freight system. Source: Fetch Robotics]

At some point, all the goods that a company like Fetch moves around will need to be packaged into containers to be shipped all around the world. Another OATV company called Haven is creating a marketplace for streamlining how container shipping is booked. If you think about it, it’s rather ridiculous that people still have to call one another and use manual paper-based processes to mix and match which containers should go on which ship. It’s just very inefficient, and it hurts business by not maximizing fulfillment of shipping capacity. This is a case where relatively simple automation through purely software and a web application can go a very long way in driving up productivity in the supply chain.

The take home message here is that when I think about automation for improving supply chains, it’s not necessarily just robots with arms that move around and pick things up. It’s as much about the software as it is about the hardware.

What are some examples of OATV portfolio companies, perhaps not robotics companies but where the lessons can be applied to robotics?

I just talked about Haven a little bit – it’s essentially an online marketplace for more efficiently filling capacity on ships for shipping things.

There are a couple other OATV companies that come to mind. One is Sight Machine. It’s a data platform company. They aggregate data streams across the manufacturing floor, perform analytics on them, and offer a frontend dashboard for customers to understand exactly what’s happening along their manufacturing lines. That has a ton of value because that kind of intelligence is what will allow decision makers overseeing operations to keep things up and running efficiently.

Another example is Riffyn. They are similar to Sight Machine in that they are also a data platform that aggregates data streams, but their focus is on the R&D lab for life science companies. Currently 70-90% of R&D results in the life sciences are not reproducible. If you are a pharma company, imagine the egregious amounts of R&D money wasted due to poor process control. Companies like Riffyn perform data automation to give science-driven enterprises control of their processes again. They automate data collection, root cause analysis and continuous deployment of improved process designs to drive up productivity for the R&D pipeline.

Both Sight Machine and Riffyn automate workflows to enable superior operational intelligence, flexibility, and performance to drive up productivity. While neither company is a robotics company in the traditional sense, that exact value proposition very much applies to robotics as well.

How is robotics today different from robotics in the past?

Once upon a time, robotics was about stationary, highly repetitive, high performance, and generally expensive automation. Industrial robot arms would repeat the same action again and again with extreme precision. We’re not as excited about those applications. We think a lot more about flexible robotics. We think about programmable robotics.

It’s no longer just about robotics automating specific human tasks. It’s more about how flexible robotics will enable superior operations overall. I would actually draw a parallel with what’s happening in the software world with the DevOps movement, which I think we’re shortly going to see in the physical world. With DevOps, the idea is to leverage a closed feedback loop to quickly iterate on software development, deployment and operations in a continuous fashion. I think this concept of continuous development and deployment will extend beyond virtual environments to optimize physical operations as well.

[Image: Fetch robot reaching]

Companies like Fetch Robotics have a physical hardware platform that automates some process, which it will do well, but probably not perfectly at first. However, it will continuously improve and learn about the process because of all the operational data and analytics that result from prior deployments. That data will come back to decision makers and engineers and inform them how to redesign and improve the process. Now, imagine a simple firmware or software push that instantly reorganizes and improves operations.

That’s the sort of robotics we’re seeing these days, and I think that’s what makes these companies really exciting. Robots are no longer one-trick ponies that just automate a single task.

Do you think these trends are leading to a change in the business model away from robots as capital expenditure? Will we see robots as a service? Or robots as a delivery mechanism for sensors and analytics?

At the end of the day, the decision to use robotics is still at its core a question of ROI. For some companies the ROI math doesn’t add up, but for an increasing number of companies, I think it does.

I do think though that there will be some interesting and innovative business models for robotics. How will future robotics products be priced? Some companies will make money selling hardware like more traditional industrial robotics companies. A company like Fetch Robotics makes a physical product that it will sell, but at the same time, there are potential SaaS-like revenue streams through the data applications and operations software that the company will offer. I’m definitely excited to see what new business models come out of all this.

And in conclusion?

I’m excited about this new wave of robotics companies evolving from traditional robotics that have primarily been about static, repetitive processes. We are witnessing the emergence of the first platforms to offer flexible and programmable robotics that we haven’t seen before.

I’m also really excited about learning what other verticals robotics will address. Most of the conversation about robotics has been centered on logistics and e-commerce, but there will be several other verticals as well. Applications like exploration, search and rescue, caregiving and more. I can’t wait.

The rest of our free report is available here — or in installments at Silicon Valley Robotics — featuring case studies and analysis from industry experts and investors.



Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.

Started Today at 11:00:10 AM

Tyler

IBM makes a big shift into cognitive computing in AI News

IBM makes a big shift into cognitive computing
3 May 2016, 12:00 am

There aren't any signs to suggest that if you drive up the narrow road that wraps around the hill, you'll find a research facility at the top. No signs that the research center is home to a Fortune 500 company.

Boston Herald

Source: AI in the News

To visit any links mentioned please view the original article, the link is at the top of this post.

Started Today at 05:00:03 AM

Tyler

Farming with robots in Robotics News

Farming with robots
4 May 2016, 6:38 pm

[Image source: Deepfield Robotics]

Farmers are increasingly under pressure to feed more people. The UN predicts that the world population will rise from 7.3 billion today to 9.7 billion in 2050. This growing population has become pickier about the food it eats. In the EU alone, the organic market grew by 7.4% in 2014, with sales valued at €24 billion. Beyond organic food, there is an overall push to make farming greener by using less water and fewer pesticides.

“Agriculture is most vulnerable to the impacts of climate change, but it’s also one of its causes”, says Birgit Schulz from Deepfield Robotics. “Making cultivation sustainable is essential”.

These factors mean farmers need to produce more, at a higher quality, and in a sustainable manner. With youth turning away from the profession, there is also less labour available to drive the vision forward.

Enter the robots — set to improve production yield, while reducing resources required, and making farming an exciting high-tech profession.

“Few people want to get up at 5am; farming is a heavy and dirty job,” says Eldert van Henten, head of the Farm Technology Group at Wageningen University in the Netherlands. He adds that “the high-tech nature of future farming might attract new people into the profession, but also bring back those who left”.

Robots are just part of an overall push towards precision agriculture. Given the potential, Europe has funded at least 6 projects around robotic farming. And there is plenty to do given the large number of tasks on a farm that are ripe for automation.

For crop farming, robots need to autonomously navigate their environment and perform actions at set locations, for example, picking a fruit, spraying a pesticide, planting a seed, imaging a plant, or making a measurement. Glasshouses are slightly simpler to move around in, since the environment is more carefully engineered and is often fitted with tracks which robots follow to reach desired locations.

In the case of outdoor farming, the robots work by receiving a plan with a set of locations to visit on the field. When the robot trajectories are known, the robot can use GPS positioning and closed-loop control to make sure it remains on track. When the task is to follow an unknown trajectory, for example a crop row, vision is often used to allow the robot to find its way. Robots are wirelessly connected to a central operator to both receive updated instructions regarding the mission and report status and data.

Put together, making an autonomous farm robot requires clever controllers, localisation and communication systems. To a certain extent, the technology is similar to that of autonomous cars applied to agtech. Where it differs is that farming robots often need to manipulate their environment, picking vegetables or fruits, applying pesticides in a localised manner, or planting seeds. All these tasks require sensing, manipulation, and processing of their own.
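
As a rough sketch of that receive-plan, measure-position, correct-course loop (my own illustration, not code from any project mentioned in the article), a field robot could visit GPS waypoints with a simple proportional controller. The function names, gains, and the get_gps_pose/send_velocity interfaces below are assumptions for illustration only.

import math

def drive_waypoints(waypoints, get_gps_pose, send_velocity,
                    tolerance=0.10, k_lin=0.5, k_ang=1.5):
    # Visit each (x, y) field location in turn.
    # get_gps_pose() -> (x, y, heading) from the localisation system (assumed interface).
    # send_velocity(v, w) commands forward speed v and turn rate w (assumed interface).
    for tx, ty in waypoints:
        while True:
            x, y, heading = get_gps_pose()
            dx, dy = tx - x, ty - y
            distance = math.hypot(dx, dy)
            if distance < tolerance:      # close enough, move on to the next location
                break
            # Heading error toward the target, wrapped to [-pi, pi]
            error = math.atan2(dy, dx) - heading
            error = math.atan2(math.sin(error), math.cos(error))
            # Proportional (closed-loop) control: turn toward the waypoint and
            # slow the forward speed as the robot approaches it.
            send_velocity(min(k_lin * distance, 1.0), k_ang * error)
    send_velocity(0.0, 0.0)               # stop once the plan is complete

In practice the GPS fix would be fused with other sensors and the gains tuned to the vehicle, but the control loop the article describes has this shape.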

The recently finished project RHEA developed a fleet of tractors and aerial robots with sensor systems to discriminate weeds from crops and apply herbicides where needed. Pablo Gonzalez de Santos, the lead on the project, explains what is achievable today:

“Autonomous robots have already been demonstrated in many agricultural activities. Conventional tasks such as tilling, sowing, harvesting of grains, can be performed using autonomous robots with the accuracy provided by the vehicle itself (currently about ±2 cm when using GNSS technology). For other tasks that demand the use of vision to follow trajectories, the current accuracy is approximately ±7 cm.”

When it comes to using robots with intelligent tools, the achievements are promising.

“Autonomous tractors carrying herbicide sprayers coordinating with drones equipped with weed-detection systems have proven to save up to 75% of the herbicide. Autonomous tractors equipped with on-board weed detection systems are able to kill 90% of weeds on a field.”

Bosch startup ‘Deepfield Robotics’ also develops technologies for sustainable farming. “Our products are sensor networks and robots”, says Birgit Schulz.

Their robots navigate plant rows, sense the plants, and send the data to the farmers to help optimise seed breeding. If equipped with a “weed puncher”, the robot can literally drive weeds into the ground. Deepfield Robotics also provides smart sensors that can be positioned in the fields. Resulting networks are already deployed in farms to monitor soil conditions for asparagus.

[Image source: Deepfield Robotics]

There are, however, many challenges ahead. Pablo Gonzalez de Santos says, “Technical developments are required to identify fruits and analyse their degree of ripeness in harsh conditions (changing light conditions, presence of dust, extreme temperatures, wind variations), as well as to detect weeds. Robot position accuracy also has to be enhanced to help optimise pesticide applications and the precision of manipulation. Although industrial manipulators exhibit very good accuracy and speed in factories, their application in farming is more difficult due to the objects moving, being soft and delicate, and obstacle-rich environments.” Just imagine what it takes to pick a sweet pepper from a plant, compared to grabbing a bolt on an assembly line. As it turns out, sweet pepper harvesting is the subject of the recent Horizon2020 Sweeper project, which follows a previous EU project called Crops.

Eldert van Henten says, “It’s not just vision; tactile sensing would also be helpful to pick peppers out of a busy scene. Farmers definitely use touch.”

[Image source: Sweeper project]

And as is often the case in robotics, the lack of clear regulation is causing a headache for companies entering the field.

“Safety is an issue – robots have to be capable of detecting what is going on in their surroundings and act accordingly to protect humans, wildlife, and themselves from crashes and accidents,” says Pablo Gonzalez de Santos. He adds that “it is completely unclear who carries responsibility for injuries caused by autonomous ground robots. The legislation for drones is also restrictive, requiring special authorisations, even for research purposes.”

Birgit Schulz agrees that safety and regulation considerations are paramount and raises questions regarding how to “define and implement the right degree of autonomy for the robot” as well as a number of logistical considerations including “How does the robot get to the field? How can we make it easy to use the robot? What happens if a robot gets stuck?”

The good news is that farms are already open to new technologies being deployed and the investments that are required. You just need to look at some of their equipment to see the high-tech machinery in place. A little known fact: farms are already amongst the most prominent adopters of robotic technology.

Lely, which is based in the Netherlands, has a fleet of over 20,000 milking robots installed throughout the world. The Lely Astronaut A4 box allows cows to be milked when they choose, instead of when the farmer needs it to be done. The robot attaches incoming cows to the teat cups, reattaches them if required, and detaches them after milking. As an added bonus, data about the cows is collected, which can help the farmer monitor the herd and take action should a problem arise, or simply improve yield.

With Juno and Vector, the company also makes autonomous mobile robots that clean the barn and automatically feed the cows.

And the farm of tomorrow will include many robots working together. The MARS project, which stands for Mobile Agricultural Robot Swarms, demonstrated a cloud-based approach to farming at Hannover Messe last week. By deploying many simpler and smaller robots, they hope to make their farm solutions safer, more reliable, and more productive, while avoiding the soil compaction that comes with larger robots navigating the fields. A swarm could also provide continuous operation, by having robots take turns charging or undergoing maintenance.

[Image source: MARS project]

It’s an all-around exciting time to deploy robots on farms, and there is a clear need to do so given the drive for increased food production and sustainability. Steps are currently being made to develop the technology that will enable the automation of individual tasks before integration into a “digital farm” that will empower farmers to run operations in a fulfilling and efficient way. But there is still a lot of work to do. Eldert van Henten says robots will need to be “quick, precise, 100% successful, and cost-effective” before we can hope to see them on our farms. But, he adds, there is a “trend towards human-robot co-working. With one or more robots doing (part of) the job while being supervised, instructed by a human, or jointly working with a human. This might improve acceptance and feasibility of robotics technology for the more challenging tasks in agriculture”.

Read the previous article in the series here.

The ten-part series on European Robotics will be published every two weeks on the SPARC website and Robohub. Funding for the series was provided by RockEU – a Coordination and Support Action funded under FP7 by the European Commission, Grant Agreement Number 611247.

Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.

Started Today at 05:00:03 AM

yotamarker

mini a.i puzzles in General AI Discussion

This thread will be about mini a.i puzzles: how the brain solves problems and paradoxes.

1st puzzle: is it sacrificeable?
If you have old, very used shoes, you don't care if it is raining when you wear them to work, or if you use them as brakes while biking. BUT if they are new and expensive, you would be careful.
What makes the brain classify an object as high value, and what makes it extra careful with it?

7 Comments | Started April 26, 2016, 06:12:33 PM

Tyler

Google launches an Android keyboard that makes it easier to type with one hand in AI News

Google launches an Android keyboard that makes it easier to type with one hand
3 May 2016, 12:00 am

With phones getting bigger and bigger, many people struggle to type on their screens with only one hand. But an app for Android phones could help address this problem.

Daily Mail - Sciencetech (UK)

Source: AI in the News

To visit any links mentioned please view the original article, the link is at the top of this post.

Started May 04, 2016, 11:00:04 PM

Botwiki.org Monthly Bot Challenge in Websites

Botwiki.org is a site for showcasing friendly, useful, artistic online bots, and our Monthly Bot Challenge is a recurring community event dedicated to making these kinds of bots.

Feb 25, 2016, 19:46:54 pm

From Movies to Reality: How Robots Are Revolutionizing Our World in Articles

Robots were once upon a time just a work of human imagination. Found only in books and movies, not once did we think a time would come when we would be able to interact with robots in the real world. Eventually, in fact rapidly, the innovations we only dreamt of are now becoming a reality. Quoting the great Stephen Hawking: "This is a glorious time to be alive for scientists." It is indeed the best of times, though the technology has become so sophisticated that its growing power might even endanger humanity.

Jan 26, 2016, 10:12:00 am

Uncanny in Robots in Movies

Uncanny is a 2015 American science fiction film directed by Matthew Leutwyler and based on a screenplay by Shahin Chandrasoma. It is about the world's first "perfect" artificial intelligence (David Clayton Rogers) that begins to exhibit startling and unnerving emergent behavior when a reporter (Lucy Griffiths) begins a relationship with the scientist (Mark Webber) who created it.

Jan 20, 2016, 13:09:41 pm

AI Virtual Pets in Other

Artificial life, also called Alife, is simply the simulation of any aspect of life, as through computers, robotics, or biochemistry (taken from The Free Dictionary). This site focuses on the software aspect of it.

Oct 03, 2015, 09:21:09 am

Why did HAL sing ‘Daisy’? in Articles

...a burning question posed by most people who have watched or read “2001: A Space Odyssey”: that is, why does the computer HAL-9000 sing the song ‘Daisy Bell’ as the astronaut Dave Bowman takes him apart?

Sep 04, 2015, 09:28:55 am

Humans in Robots on TV

Humans is a British-American science fiction television series. Written by the British team Sam Vincent and Jonathan Brackley, based on the award-winning Swedish science fiction drama Real Humans, the series explores the emotional impact of the blurring of the lines between humans and machines.

Aug 28, 2015, 09:13:37 am

Virtual Talk in Chatbots - English

[iTunes app] Virtual Talk is an AI chatting app that lets you talk with whomever you want. It remembers what you say and learns new dialogs. This app is one of the smartest chatbots in the world.

Aug 17, 2015, 13:33:09 pm

Robot Overlords in Robots in Movies

Not long after the invasion and occupation of Earth by a race of powerful robots wanting human knowledge and ingenuity, humans are confined to their homes. Leaving without permission would be to risk their lives. Through the electronic implants in their necks, the robot sentries are able to track the movements of humans in order to control them. If any person comes out of their home, they are warned by the robot sentries to get back inside. If they do not comply, they are shot immediately.

Long article on the making of here...

Aug 15, 2015, 14:42:25 pm

Zerfoly in Chatbots - English

Zerfoly is a chatbot platform that makes it possible to create imaginary persons (chatbots) and teach them to talk to each other.

You will be able to let loose your creativity and imagination. Build persons by writing interactive dialogues. The persons you create will gradually become individuals with unique personalities. One of the persons could bear your name and learn to talk like you; your alter ego. Another way of using Zerfoly is as an interactive diary.

Aug 09, 2015, 11:06:42 am

Project Anni (English & German) in Chatbots - English

Anni stands for Artificial Neural Network Intelligence. In its first versions, Anni was based on a three-layer perceptron – a special type of feedforward artificial neural network – to find appropriate responses. These experiments failed and the algorithm was changed.

The current version does not use artificial neural networks anymore, but instead searches for similarities between the user's input and a database consisting of various chat logs to find a reasonable response. Furthermore, this enables Anni to speak several languages, such as English and German, in the current version.
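
As a rough illustration of that log-based approach (my own sketch under stated assumptions, not Anni's actual code), a chatbot can score each logged utterance against the user's input by word overlap and answer with the reply that followed the best match. The tokenizer, the Jaccard score, and the (utterance, reply) log format are all assumptions.

import re

def tokenize(text):
    # Lowercase word tokens; Unicode-aware in Python 3, so German text works too.
    return set(re.findall(r"\w+", text.lower()))

def best_response(user_input, chat_log):
    # chat_log: list of (utterance, reply) pairs taken from past conversations.
    query = tokenize(user_input)
    best_reply, best_score = None, 0.0
    for utterance, reply in chat_log:
        words = tokenize(utterance)
        union = query | words
        # Jaccard similarity between the input and the logged utterance.
        score = len(query & words) / len(union) if union else 0.0
        if score > best_score:
            best_reply, best_score = reply, score
    return best_reply

# Tiny hypothetical log; matching is purely lexical, so any language works.
log = [("hello there", "hi, how are you?"),
       ("wie heisst du", "ich heisse Anni")]
print(best_response("Hello!", log))  # -> hi, how are you?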

Sep 02, 2010, 09:32:12 am