Recent Posts

1
AI News / Re: Most Human Like Android Built to Date
« Last post by 8pla.net on Today at 12:36:10 AM »
Art, I think this may go beyond just being a good video. And you're right, of course! It won IEEE's Best Video of 2017. I just wanted to emphasize the news value of this story for our A.I. community here.

This story is likely a true breakthrough for chatbots. It features an algorithm that addresses, in a new way, the age-old criticism that chatbots do not really understand what is being said to them.

According to the Institute of Electrical and Electronics Engineers, chatbots can now approximate what's being said to them instead. This may, I think, close a number of gaps between narrow A.I. and A.G.I. in some respects.
2
Robotics News / Security for multirobot systems
« Last post by Tyler on March 27, 2017, 10:48:51 PM »
Security for multirobot systems
17 March 2017, 12:30 pm

Researchers including MIT professor Daniela Rus (left) and research scientist Stephanie Gil (right) have developed a technique for preventing malicious hackers from commandeering robot teams’ communication networks. To verify the theoretical predictions, the researchers implemented their system using a battery of distributed Wi-Fi transmitters and an autonomous helicopter. Image: M. Scott Brauer.

Distributed planning, communication, and control algorithms for autonomous robots make up a major area of research in computer science. But in the literature on multirobot systems, security has gotten relatively short shrift.

In the latest issue of the journal Autonomous Robots, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory and their colleagues present a new technique for preventing malicious hackers from commandeering robot teams’ communication networks. The technique could provide an added layer of security in systems that encrypt communications, or an alternative in circumstances in which encryption is impractical.

“The robotics community has focused on making multirobot systems autonomous and increasingly more capable by developing the science of autonomy. In some sense we have not done enough about systems-level issues like cybersecurity and privacy,” says Daniela Rus, an Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT and senior author on the new paper.

“But when we deploy multirobot systems in real applications, we expose them to all the issues that current computer systems are exposed to,” she adds. “If you take over a computer system, you can make it release private data — and you can do a lot of other bad things. A cybersecurity attack on a robot has all the perils of attacks on computer systems, plus the robot could be controlled to take potentially damaging action in the physical world. So in some sense there is even more urgency that we think about this problem.”

Identity theft

Most planning algorithms in multirobot systems rely on some kind of voting procedure to determine a course of action. Each robot makes a recommendation based on its own limited, local observations, and the recommendations are aggregated to yield a final decision.

A natural way for a hacker to infiltrate a multirobot system would be to impersonate a large number of robots on the network and cast enough spurious votes to tip the collective decision, a technique called “spoofing.” The researchers’ new system analyzes the distinctive ways in which robots’ wireless transmissions interact with the environment, to assign each of them its own radio “fingerprint.” If the system identifies multiple votes as coming from the same transmitter, it can discount them as probably fraudulent.

“There are two ways to think of it,” says Stephanie Gil, a research scientist in Rus’ Distributed Robotics Lab and a co-author on the new paper. “In some cases cryptography is too difficult to implement in a decentralized form. Perhaps you just don’t have that central key authority that you can secure, and you have agents continually entering or exiting the network, so that a key-passing scheme becomes much more challenging to implement. In that case, we can still provide protection.

“And in case you can implement a cryptographic scheme, then if one of the agents with the key gets compromised, we can still provide  protection by mitigating and even quantifying the maximum amount of damage that can be done by the adversary.”

Hold your ground

In their paper, the researchers consider a problem known as “coverage,” in which robots position themselves to distribute some service across a geographic area — communication links, monitoring, or the like. In this case, each robot’s “vote” is simply its report of its position, which the other robots use to determine their own.
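As a rough sketch of that kind of coverage step (a generic centroid-seeking rule of the sort used in coverage control, not necessarily the exact algorithm analyzed in the paper; the positions, sample points and step size here are all illustrative), each robot can nudge itself toward the centroid of the sample points that lie closer to its reported position than to anyone else's:

Code: [Select]
import numpy as np

def coverage_step(my_index, reported_positions, area_samples, step=0.1):
    """One iteration of a simple centroid-seeking coverage rule.

    my_index           -- index of this robot in reported_positions
    reported_positions -- (R, 2) array of positions reported ("voted") by all robots
    area_samples       -- (N, 2) array of points sampling the area to cover

    Moves this robot a small step toward the centroid of the sample points
    that are nearer to it than to any other reported position. Purely
    illustrative; real coverage controllers also weight the area by an
    importance density, handle ties, and so on.
    """
    reported = np.asarray(reported_positions, dtype=float)
    samples = np.asarray(area_samples, dtype=float)
    # Distance from every sample point to every reported robot position.
    dists = np.linalg.norm(samples[:, None, :] - reported[None, :, :], axis=2)
    mine = dists.argmin(axis=1) == my_index   # sample points assigned to me
    my_pos = reported[my_index]
    if not mine.any():
        return my_pos                         # nothing assigned to this robot
    centroid = samples[mine].mean(axis=0)
    return my_pos + step * (centroid - my_pos)

# Example: three robots covering the unit square sampled on a 20x20 grid.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20)), -1).reshape(-1, 2)
new_pos = coverage_step(0, [(0.1, 0.1), (0.8, 0.2), (0.5, 0.9)], grid)

A spoofer who injects many fake position reports shifts those assignments, which is exactly why the vote weighting described below matters.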

The paper includes a theoretical analysis that compares the results of a common coverage algorithm under normal circumstances and the results produced when the new system is actively thwarting a spoofing attack. Even when 75 percent of the robots in the system have been infiltrated by such an attack, the robots’ positions are within 3 centimeters of what they should be. To verify the theoretical predictions, the researchers also implemented their system using a battery of distributed Wi-Fi transmitters and an autonomous helicopter.

“This generalizes naturally to other types of algorithms beyond coverage,” Rus says.

The new system grew out of an earlier project involving Rus, Gil, Dina Katabi — who is the other Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT — and Swarun Kumar, who earned master’s and doctoral degrees at MIT before moving to Carnegie Mellon University. That project sought to use Wi-Fi signals to determine transmitters’ locations and to repair ad hoc communication networks. On the new paper, the same quartet of researchers is joined by MIT Lincoln Laboratory’s Mark Mazumder.

Typically, radio-based location determination requires an array of receiving antennas. A radio signal traveling through the air reaches each of the antennas at a slightly different time, a difference that shows up in the phase of the received signals, or the alignment of the crests and troughs of their electromagnetic waves. From this phase information, it’s possible to determine the direction from which the signal arrived.

Space vs. time

A bank of antennas, however, is too bulky for an autonomous helicopter to ferry around. The MIT researchers found a way to make accurate location measurements using only two antennas, spaced about 8 inches apart. Those antennas must move through space in order to simulate measurements from multiple antennas. That’s a requirement that autonomous robots meet easily. In the experiments reported in the new paper, for instance, the autonomous helicopter hovered in place and rotated around its axis in order to make its measurements.
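For intuition about how phase becomes direction (a simplified two-antenna relation, not the authors' full estimator): with antennas a distance d apart, a plane wave arriving at angle θ from broadside reaches the far antenna with an extra path length of d·sin(θ), which shows up as a phase difference Δφ = 2π·d·sin(θ)/λ. A minimal sketch, assuming the phase difference has already been measured and ignoring noise, multipath and phase-wrapping ambiguity:

Code: [Select]
import math

def angle_of_arrival(phase_diff_rad, antenna_spacing_m, wavelength_m):
    """Estimate direction of arrival (radians from broadside) from the phase
    difference measured between two antennas, using the standard relation
    delta_phi = 2*pi*d*sin(theta)/lambda. Illustrative only."""
    sin_theta = phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m)
    sin_theta = max(-1.0, min(1.0, sin_theta))  # clamp numerical error
    return math.asin(sin_theta)

# Example: 2.4 GHz Wi-Fi (wavelength ~0.125 m), antennas about 8 inches (~0.2 m) apart.
theta = angle_of_arrival(phase_diff_rad=1.0, antenna_spacing_m=0.2, wavelength_m=0.125)
print(f"estimated bearing: {math.degrees(theta):.1f} degrees from broadside")

Rotating the helicopter effectively sweeps that two-antenna baseline through many orientations, which is what stands in for a full antenna array.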

When a Wi-Fi transmitter broadcasts a signal, some of it travels in a direct path toward the receiver, but much of it bounces off of obstacles in the environment, arriving at the receiver from different directions. For location determination, that’s a problem, but for radio fingerprinting, it’s an advantage: The different energies of signals arriving from different directions give each transmitter a distinctive profile.

There’s still some room for error in the receiver’s measurements, however, so the researchers’ new system doesn’t completely ignore probably fraudulent transmissions. Instead, it discounts them in proportion to its certainty that they have the same source. The new paper’s theoretical analysis shows that, for a range of reasonable assumptions about measurement ambiguities, the system will thwart spoofing attacks without unduly punishing valid transmissions that happen to have similar fingerprints.
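As a loose illustration of that discounting idea (not the paper's actual estimator; the fingerprint-comparison function here is hypothetical), each received vote can be down-weighted by the estimated probability that it comes from a transmitter the receiver has already heard from:

Code: [Select]
def weight_votes(votes, same_source_prob):
    """Down-weight votes that appear to come from an already-seen transmitter.

    votes            -- list of (fingerprint, value) pairs
    same_source_prob -- function(fp_a, fp_b) -> probability in [0, 1] that the
                        two fingerprints belong to one transmitter (hypothetical;
                        the real system derives this from Wi-Fi signal profiles)
    """
    weighted = []
    seen = []
    for fp, value in votes:
        weight = 1.0
        for prev_fp in seen:
            # Confidence that this vote is from a transmitter not seen before.
            weight *= (1.0 - same_source_prob(fp, prev_fp))
        weighted.append((value, weight))
        seen.append(fp)
    return weighted

def weighted_average(weighted):
    """Aggregate the weighted votes into a single decision value."""
    total = sum(w for _, w in weighted)
    return sum(v * w for v, w in weighted) / total if total else 0.0

With the similarity estimate near 1 for spoofed copies, a flood of duplicate votes ends up contributing roughly the weight of a single robot, while genuinely distinct robots whose fingerprints merely look similar are only mildly penalized.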

“The work has important implications, as many systems of this type are on the horizon — networked autonomous driving cars, Amazon delivery drones, et cetera,” says David Hsu, a professor of computer science at the National University of Singapore. “Security would be a major issue for such systems, even more so than today’s networked computers. This solution is creative and departs completely from traditional defense mechanisms.”

If you enjoyed this article from CSAIL, you might also be interested in:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.

Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.
3
General AI Discussion / Re: A.eye
« Last post by keghn on March 27, 2017, 09:31:23 PM »
Are you testing it by seeing which route around a triangle is the shortest?
4
General AI Discussion / Re: A.eye
« Last post by yotamarker on March 27, 2017, 06:13:27 PM »
3 problems with Dijkstra's algorithm:

big pictures:
stack overflow
too long processing time

colorful pictures:
shapes not consistent
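For the big-picture problems specifically: a stack overflow usually points to a recursive implementation, and an iterative, heap-based Dijkstra avoids it. A rough sketch, treating the image as a 4-connected grid with a hypothetical per-pixel step cost (e.g. derived from colour differences):

Code: [Select]
import heapq

def grid_dijkstra(cost, start, goal):
    """Iterative Dijkstra over a 2D grid -- no recursion, so no stack overflow
    on large images.

    cost  -- 2D list/array of non-negative step costs per pixel (hypothetical)
    start -- (row, col) start pixel
    goal  -- (row, col) goal pixel
    Returns the list of pixels on the cheapest path, or None if unreachable.
    """
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0}
    prev = {}
    heap = [(0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    if goal not in dist:
        return None
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

Switching to A* (adding a straight-line-distance heuristic to the priority) is the usual next step for cutting processing time on big pictures.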
5
Robotics News / Collaborating machines and avoiding soil compression
« Last post by Tyler on March 27, 2017, 04:48:50 PM »
Collaborating machines and avoiding soil compression
17 March 2017, 10:00 am

Image: Swarmfarm

Soil compression can be a serious problem, but it isn’t always, or in all ways, a bad thing. For example, impressions made by hoofed animals, so long as they only cover a minor fraction of the soil surface, create spaces in which water can accumulate and help it percolate into the soil more effectively, avoiding erosion runoff.

The linear depressions made by wheels rolling across the surface are more problematic because they create channels that can accelerate the concentration of what would otherwise be evenly distributed rainfall, turning it into a destructive force. This is far less serious when those wheels follow the contour of the land rather than running up and down slopes.

Taking this one step further, if it is possible for wheeled machines to always follow the same tracks, the compression is localized and the majority of the land area remains unaffected. If those tracks are filled with some material through which water can percolate but which impedes the accumulation of energy in downhill flows, the damage is limited to the sacrifice of the portion of the overall land area dedicated to those tracks and the creation of compression zones beneath them. Those compression zones may result in boggy conditions on the uphill sides of the tracks, which may or may not be a bad thing, depending on what one is trying to grow there.

Source: vinbot.eu

(I should note at this point that such tracks, when they run on the contour, are reminiscent of the ‘swales’ used in permaculture and regenerative agriculture.)

Tractors with GPS guidance are capable of running their wheels over the same tracks with each pass, but the need for traction, so they can apply towing force to implements running through the soil, means that those tracks will constitute a significant percentage of the overall area. Machines, such as dedicated sprayers, with narrower wheels that can be spread more widely apart, create tracks which occupy far less of the total land area, but they are not built for traction, and using them in place of tractors for all field operations would require a very different approach to farming.

It is possible to get away from machine-caused soil compression altogether, using either aerial machines (drones) or machines which are supported by or suspended from fixed structures, like posts or rails.



Small drones are much like hummingbirds in that they create little disturbance, but they are also limited in the types of operations they can perform by their inability to carry much weight or exert significant force. They’re fine for pollination but you wouldn’t be able to use them to uproot weeds with tenacious roots or to harvest watermelons or pumpkins.

On the other hand, fixed structures and the machines that are supported by or suspended from them have a significant up-front cost. In the case of equipment suspended from beams or gantries spanning between rails, carried on wheeled trucks that ride on those rails, there is a tradeoff between the spacing of the rails and the strength/stiffness required in the gantry. Center-pivot arrangements have a similar tradeoff, but they use a central pivot in place of one rail (or wheel track), and it’s common for them to have several points of support spaced along the beam, requiring several concentric rails or wheel tracks.

Strictly speaking, there’s no particular advantage in having rail-based systems follow the contour of the land since they leave no tracks at all. Center-pivot systems using wheels that run directly on the soil rather than rail are best used on nearly flat ground since their round tracks necessarily run downhill over part of their circumference. In any rail-based system, the “rail” might be part of the mobile unit rather than part of the fixed infrastructure, drawing support from posts spaced closely enough that there were always at least two beneath it. However, this would preclude using trough-shaped rails to deliver water for irrigation.

Since the time of expensive machines is precious, it’s best to avoid burdening them with operations that can be handled by small, inexpensive drones, and the ideal arrangement is probably a combination of small drones, a smaller number of larger drones with some carrying capacity, light on-ground devices that put little pressure on the soil, and more substantial machines supported or suspended from fixed infrastructure, whether rail, center-pivot, or something else. Livestock (chickens, for example), outfitted with light wearable devices, might also be part of the mix.

The small drones, being more numerous, will be the best source of raw data, which can be used to optimize the operation of the larger drones, on-ground devices, and the machines mounted on fixed infrastructure, although too much centralized control would not be efficient. Each device should be capable of continuing to do useful work even when it loses network connection, and peer-to-peer connections will be more appropriate than running everything through a central hub in some circumstances.

Bonirob, an agricultural robot. Source: Bosch  

This is essentially a problem in complex swarm engineering, complex because of the variety of devices involved. Solving it in a way that creates a multi-device platform capable of following rules, carrying out plans, and recognizing anomalous conditions is the all-important first step in enabling the kind of robotics that can then go on to enable regenerative practices in farming (and land management in general).

If you enjoyed this article, you may also want to read:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.

 

Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.
6
General AI Discussion / Re: A.eye
« Last post by yotamarker on March 27, 2017, 04:45:51 PM »


Dijkstra's algorithm probably isn't the best solution
7
General AI Discussion / Re: A.eye
« Last post by yotamarker on March 27, 2017, 04:41:27 PM »
8
AI News / Re: Most Human Like Android Built to Date
« Last post by Art on March 27, 2017, 02:01:06 PM »
Nice vid. Yes, funny, cute and also informative.
9
Robotics News / Envisioning the future of robotics
« Last post by Tyler on March 27, 2017, 10:48:32 AM »
Envisioning the future of robotics
16 March 2017, 10:07 am

Image: Ryan Etter

Robotics is said to be the next technological revolution. Many seem to agree that robots will have a tremendous impact over the coming years, and some are betting heavily on it. Companies are investing billions buying other companies, and public authorities are discussing legal frameworks to enable coherent growth of robotics.

Understanding where the field of robotics is heading is more than mere guesswork. While much public concern focuses on the potential societal issues that will arise with the advent of robots, in this article, we present a review of some of the most relevant milestones that happened in robotics over the last decades. We also offer our insights on feasible technologies we might expect in the near future.

Copyright © Acutronic Robotics 2017. All Rights Reserved.

Pre-robots and first manipulators

What’s the origin of robots? To figure it out, we’ll need to go back quite a few decades, to when various conflicts motivated the technological growth that eventually enabled companies to build the first digitally controlled mechanical arms. One of the first and best-documented robots was UNIMATE (considered by many the first industrial robot): a programmable machine funded by General Motors and used to create a production line run only by robots. UNIMATE helped improve industrial production at the time, which motivated other companies and research centers to actively dedicate resources to robotics and boosted growth in the field.



Sensorized robots

Sensors were not typically included in robots until the 1970s. Starting in 1968, a second generation of robots emerged that integrated sensors. These robots were able to react to their environment and offer responses suited to varying scenarios.

Significant investments were made during this period, as industrial players worldwide were attracted by the advantages that robots promised.



Chart: Worldwide industrial robots

Era of the robots

Many consider that the Era of Robots started in 1980. Billions of dollars were invested by companies all around the world to automate basic tasks in their assembly lines. Sales of industrial robots grew 80% compared with previous years.

Key technologies appeared within these years: General internet access was extended in 1980; Ethernet became a standard in 1983 (IEEE 802.3); the Linux kernel was announced in 1991; and soon after that real-time patches started appearing on top of Linux.

The robots created between 1980 and 1999 belong to what we call the third generation of robots: robots that were re-programmable and included dedicated controllers. Robots populated many industrial sectors and were used for a wide variety of activities: painting, welding, moving, assembling, etc.

By the end of the 90s, companies started thinking about robots beyond the industrial sphere. Several companies created promising concepts that would inspire future roboticists. Among the robots created within this period, we highlight two:

  • The first LEGO Mindstorms kit (1998): a set of 717 pieces, including LEGO bricks, motors, gears, various sensors, and an RCX Brick with an embedded microprocessor, for constructing different robots from the exact same parts. The kit let users learn basic robotics principles, and creative projects have appeared over the years showing the potential of interchangeable hardware in robotics. Within a few years, the LEGO Mindstorms kit became the most successful project involving robot part interchangeability.
  • Sony’s AIBO (1999): the world’s first entertainment robot, also widely used for research and development. Sony offered robotics to everyone in the form of a $1,500 robot with a distributed hardware and software architecture. The OPEN-R architecture involved the use of modular hardware components — e.g. appendages that could be easily removed and replaced to customize the shape and function of the robot — and modular software components that could be interchanged to modify its behavior and movement patterns. OPEN-R inspired future robotic frameworks and minimized the need for programming individual movements or responses.
Integration effort was identified as one of the main issues within robotics, particularly for industrial robots. A common infrastructure typically reduces the integration effort by providing an environment in which components can be connected and made to interoperate. Each infrastructure-supported component is designed for such integration from its conception, and the infrastructure handles the integration effort. Components can then come from different manufacturers and, as long as they are supported by the common infrastructure, still interoperate.

Sony’s AIBO and LEGO’s Mindstorms kit were built on this principle, and both represented common infrastructures. Even though they came from the consumer side of robotics, one could argue that their success was strongly related to the fact that both products made use of interchangeable hardware and software modules. The use of a common infrastructure proved to be one of the key advantages of these technologies; however, those concepts were never translated to industrial environments. Instead, each manufacturer, in an attempt to dominate the market, started creating its own “robot programming languages”.

The dawn of smart robots

Starting from the year 2000, we observed a new generation of robot technologies. The so-called fourth generation of robots consisted of more intelligent robots that included advanced computers to reason and learn (to some extent, at least), and more sophisticated sensors that helped controllers adapt more effectively to different circumstances.

Among the technologies that appeared in this period, we highlight the Player Project (2000, formerly the Player/Stage Project), the Gazebo simulator (2004) and the Robot Operating System (2007). Moreover, relevant hardware platforms appeared during these years. Single Board Computers (SBCs), like the Raspberry Pi, enabled millions of users all around the world to create robots easily.



The boost of bio-inspired artificial intelligence

Artificial intelligence, and particularly neural networks, also grew in relevance during this period. While a lot of the important work on neural networks happened in the ’80s and ’90s, computers did not have enough computational power at the time, and datasets weren’t big enough to be useful in practical applications. As a result, neural networks practically disappeared from view in the first decade of the 21st century. However, starting with speech recognition in 2009, neural networks regained popularity and began delivering good results in fields such as computer vision (2012) and machine translation (2014). Over the last few years, we’ve seen these techniques translated to robotics for tasks such as robotic grasping. In the coming years, we expect these AI techniques to have more and more impact on robotics.

What happened to industrial robots?

Relevant key technologies have also emerged from the industrial robotics landscape (e.g., EtherCAT). However, except for the appearance of the first so-called collaborative robots, progress in industrial robotics has slowed significantly compared with previous decades. Several groups have identified this fact and written about it with conflicting opinions. Below, we summarize some of the most relevant points encountered while reviewing previous work:

  • The industrial robot industry: is it only a supplier industry?

    For some, the industrial robot industry is a supplier industry. It supplies components and systems to larger industries, like manufacturing. These groups argue that the manufacturing industry is dominated by the PLC, motion control and communication suppliers which, together with the big customers, set the standards. Industrial robots therefore need to adapt and speak factory languages (PROFINET, EtherCAT, Modbus TCP, EtherNet/IP, CANopen, DeviceNet, etc.), which might be different for each factory.
  • Lack of collaboration and standardized interfaces in industry

    To date, each industrial robot manufacturer’s business model is, to a large degree, about locking you into its system and controllers. Typically, one encounters the following when working with an industrial robot: a) each robot company has its own proprietary programming language, b) programs can’t be ported from one robot company to another, c) communication protocols differ, and d) logical, mechanical and electrical interfaces are not standardized across the industry. As a result, most makers of robotic peripherals suffer from having to support many different protocols, which takes a lot of development time and reduces the functionality of the product.
  • Competing by obscuring vs opening new markets?

    The closed attitude of most industrial robot companies is typically justified by the existing competition. Such an attitude leads to a lack of understanding between different manufacturers. An interesting approach would be to have manufacturers agree on a common infrastructure. Such an infrastructure could define a set of electrical and logical interfaces (leaving the mechanical ones aside due to the variability of robots in different industries) that would allow industrial robot companies to produce robots and components that could interoperate, be exchanged and eventually enter into new markets. This would also lead to a competitive environment where manufacturers would need to demonstrate features, rather than the typical obscured environment where only some are allowed to participate.
The Hardware Robot Operating System (H-ROS)

For robots to enter new and different fields, it seems reasonable that they need to adapt to the environment itself. This was highlighted above for the industrial robotics case, where robots have to be fluent in factory languages. One could argue the same for service robots (e.g. household robots that will need to adapt to dishwashers, washing machines, media servers, etc.), medical robots and many other areas of robotics. Such reasoning led to the creation of the Hardware Robot Operating System (H-ROS), a vendor-agnostic hardware and software infrastructure for the creation of robot components that interoperate and can be exchanged between robots. H-ROS builds on top of ROS, which is used to define a set of standardized logical interfaces that each physical robot component must meet to be H-ROS compliant.

H-ROS facilitates a fast way of building robots, choosing the best component for each use case from a common robot marketplace. It is designed for different environments (industrial, professional, medical, …) where variables such as timing constraints are critical. Building or extending robots is simplified to the point of putting H-ROS-compliant components together. The user simply needs to program the cognition part (i.e. the brain) of the robot and develop their own use cases, all without facing the complexity of integrating different technologies and hardware interfaces.

The future ahead

With the latest AI results being translated to robotics, and with recent investments in the field, anticipation is high for the near future of robotics.

As Melonee Wise nicely put it in a recent interview, there still aren’t that many things you can do with a $1,000–5,000 BOM robot (which is what most people would pay for a robot on an individual basis). Hardware is still a limiting factor, and our team strongly believes that a common infrastructure, such as H-ROS, will facilitate an environment where robot hardware and software can evolve.

The list presented below summarizes, according to our judgement, some of the most technically feasible future robotic technologies to appear.



Acknowledgments

This review was funded and supported by Acutronic Robotics, a firm focused on the development of next-generation robot solutions for a range of clients.

The authors would also like to thank the Erle Robotics and the Acutronic groups for their support and help.


Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.
10
AI Programming / Re: Rivescript not working as expected
« Last post by brty21 on March 27, 2017, 10:22:23 AM »
Here's a short couple of scripts that I came up with which use PHP and javascript to interface with MaryTTS.

Browsers/JS don't like cross-site (cross-origin) requests. There are ways to get around it, but I find them complicated, and it's easily solved with a relay for testing purposes. You can get into all that other gubbins later if you want.

Take these two scripts and put them in some directory that is useful to you. The HTML/JS script calls the PHP. The PHP calls Mary. The PHP then sends what it got from Mary back to the client or browser. The client then plays it.

The main reason you get the cross-site problem is that Mary listens on a different port from the usual one (port 80 in most cases); by default MaryTTS uses port 59125, which you can see in the PHP below.



Here's the HTML/JS :

Code: [Select]
<html>

<head>

<title>Demo Play TTS</title>

</head>

<body>

Just a simple Demo. Refresh page if you change the words in the script.

<script>

// It's a pain using JS for cross site so just use a simple PHP relay to start with.
var relay = "gettts.php";

// The words to render.
var tts = "Just a simple Demo. Refresh page if you change the words in the script.";

// The URL for the relay with the words added on.
var url = relay + "?words=" + tts;

// Set up an audio player and get the audio from the relay URL.
var audio = new Audio(url);

// Play the audio which would have been made by Mary.
audio.play();

</script>

</body>

</html>

And the PHP...

Code: [Select]
<?php

// Your server or host.
$host = "http://localhost";

// A voice you have installed with Mary.
$voice = "cmu-slt-hsmm";

// Get the words sent from the JS - best to do some sanitising here later.
$words = $_GET['words'];

// URL to the Mary port and various settings.
$url = $host . ":59125/process?INPUT_TEXT=" . $words . ".&INPUT_TYPE=TEXT&OUTPUT_TYPE=AUDIO&LOCALE=en_US&AUDIO=WAVE_FILE&VOICE=" . $voice;

// Mary doesn't like spaces.
$url = str_replace(" ", "+", $url);

// Get the WAV file from Mary.
$wav = file_get_contents($url);

// Set a header for the client.
header('Content-Type: audio/wav');

// Echo the audio data.
echo $wav;

// End.
die();

Have a play with that. Save the PHP as 'gettts.php' and the HTML as whatever you see fit. Change the JS variable 'tts' to whatever you like. Just hit the HTML page in your browser to see what it does.

Hopefully it will work; it did for me. You may have to change the voice setting if you don't have that one, but I think that is the default voice when installing.

Note that PHP is server-side, so you need to browse to the HTML via your local host. Just double-clicking on the HTML file will not work.
:)

Working well, thanks a lot! At least I have some idea now.

Since RiveScript is working as expected now, I will post new topics in new threads.