2007.11.27: Computer Vision meets Robotics
I got interested in computer vision and, while mulling over what to do with it, decided to build a robot.

But how on earth would I build one?
So I figured it would be fun to emulate a robot in software instead.

http://www.ibm.com/developerworks/linux/library/l-robotools/index.html

Let's read it; it's good English practice, too.

Open source robotics toolkits

Use virtual arenas to test your robotics algorithms

developerWorks
Level: Intermediate

M. Tim Jones (mtj@mtjones.com), Consultant Engineer, Emulex

05 Sep 2006

Building a robot involves skills from many disciplines, including embedded firmware and hardware design, sensor selection, control systems design, and mechanical design. But simulation environments can provide a virtual arena for testing, measuring, and visualizing robotics algorithms without the high cost (and time) of development. This article introduces you to some of the open source robotics toolkits for Linux®, demonstrates their capabilities, and helps you decide which is best for you.
Software robot?

Researchers at the University of Washington coined the term Softbots -- a combination of software and robot. The term intelligent agent is now more commonly used, especially for Internet-capable entities. In 1996, Franklin and Graesser introduced the first agent taxonomy, under which even computer viruses qualify as autonomous agents.

The spectrum of traditional robots is large and varied, and with the advent of software agents (the virtual counterparts of robots), these variations have expanded further. Many characteristics of physical robots carry over to robots in the virtual domain. For example, mobility in a physical robot implies some form of locomotion, while mobility in a soft robot (or agent) means the ability to migrate between hosts in a network. Figure 1 shows a shallow view of autonomous robotics in the physical and virtual domains. This article focuses on software agents as a mechanism to simulate robots in synthetic environments.


Figure 1. Shallow taxonomy of autonomous robots

Elements of a robot

Whether you're talking about a physical robot or a virtual (soft) robot, the fundamental concepts are the same. The robot has a set of sensors used to perceive its environment, a set of effectors to manipulate its environment, and a control system that allows the robot to act in an intentional and useful way (see Figure 2).


Figure 2. The fundamental elements of all robotic systems

In the physical world, a fire-fighting robot could use temperature sensors, infrared (IR) sensors, and a Global Positioning System (GPS) receiver to perceive its environment, with motors and perhaps a fire extinguisher as effectors to manipulate it. A virtual search agent could use Web servers and HTTP interfaces to both perceive and manipulate its environment (the Internet) and a console as an effector to communicate with a user.

The system shown in Figure 3 is a closed loop: sensors feed the control system, which drives changes in the environment. Another way to think about this is in terms of feedback. If the control system commands an act that changes the environment, the sensors can validate that change, feeding the new state of the environment back to the control system. An open-loop system would have to assume that its acts successfully changed the state of the environment, which is rarely a safe assumption.
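To make the feedback loop concrete, here's a minimal sketch (mine, not from the article) of a closed-loop proportional controller: each cycle, the sensed error between target and position drives the next actuation, and the loop runs until the sensed error is small.

```cpp
#include <cassert>
#include <cmath>

// One step of a proportional (P) controller: the sensed error
// (target - position) drives the next actuation command.
double p_control_step(double position, double target, double gain) {
    double error = target - position;   // sensor: perceive the environment
    return position + gain * error;     // effector: act on the environment
}

// Close the loop: keep sensing and acting until the error is
// within tolerance, counting how many cycles that takes.
int steps_to_converge(double position, double target,
                      double gain, double tol) {
    int steps = 0;
    while (std::fabs(target - position) > tol) {
        position = p_control_step(position, target, gain);
        ++steps;
    }
    return steps;
}
```

With a gain of 0.5, each cycle halves the remaining error, so the controller converges geometrically; an open-loop design would instead issue one command and simply hope the environment ended up where intended.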


Figure 3. Closing the loop with the environment

When building a robot, you must consider the sensors, the effectors, and the control system as a whole. For this article, I focus on the control system and the ways in which you can simulate and validate it before spending time embedding it in a physical robot.





Robotics and simulation

Simulation plays a key role in the field of robotics, because it permits experimentation that would otherwise be expensive and/or time-consuming. Simulation lets you try ideas in dynamic, synthetic environments while collecting stimulus-response data to judge the quality of the control system. Simulation also enables the evolution of robotic control systems, which relies on randomly varying the control system over many generations (as demonstrated by genetic algorithms).
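As a toy illustration of that evolutionary idea (my own sketch, a (1+1) evolution strategy rather than a full genetic algorithm), the code below randomly perturbs a single controller gain over many generations and keeps only the mutations that lower a cost function:

```cpp
#include <cassert>
#include <cstdlib>

// Cost of a candidate controller gain: squared distance from an
// ideal gain of 0.7 (unknown to the optimizer, which only sees cost).
double cost(double gain) {
    double d = gain - 0.7;
    return d * d;
}

// Evolve the gain by random mutation, keeping only improvements --
// the simplest "survival of the fittest" scheme.
double evolve_gain(double gain, int generations, unsigned seed) {
    std::srand(seed);
    for (int g = 0; g < generations; ++g) {
        // Mutation drawn uniformly from [-0.1, +0.1]
        double mutation = (std::rand() / (double)RAND_MAX - 0.5) * 0.2;
        double candidate = gain + mutation;
        if (cost(candidate) < cost(gain))
            gain = candidate;   // selection: the fitter variant survives
    }
    return gain;
}
```

In a real robotics setting, the cost function would itself be a simulation run (time to reach a goal, distance traveled without collision), which is exactly why cheap synthetic environments matter for evolutionary approaches.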

Linux and robotics

Linux is a popular operating system for robotics because, at its roots, it shares a similar history with robotics. Robotics is a field of experimentation: it's about optimizing, trying new things, and evolving toward the future. Linux, at its heart, is about the same things. Early robots were oddities with few practical applications. Similarly, Linux began as a hobbyist operating system but has grown into a powerful and stable one found everywhere from tiny embedded devices to supercomputers (including many robots).

One of the greatest advantages of simulation shows up in multi-robot work. A popular venue for these simulations is robot soccer, where, either in simulation or with physical robots, one team competes against another in the world's most popular sport (making it ideal for international competition). Each robot must cooperate with its teammates (possibly through communication) while competing against the robots on the opposing team, making it a challenging test of robot behavior.

But there are downsides to simulation. The real world tends to be messy and noisy, and synthetic environments are fundamentally difficult to model. Simulating a robot also tends to be difficult, as sensors in the real world can often exhibit different or unexpected characteristics. Despite the disadvantages, you can learn a lot by simulating robots in synthetic environments.
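One way to narrow that gap between simulation and reality is to inject noise into simulated sensors. The sketch below is my own (not from any toolkit discussed here): it corrupts a range reading with zero-mean Gaussian noise, as real sonars and IR sensors do, and shows that averaging repeated readings recovers an estimate near the true distance.

```cpp
#include <cassert>
#include <cmath>
#include <random>

// Simulate a noisy range sensor: the true distance is corrupted
// by zero-mean Gaussian noise with the given standard deviation.
double noisy_range(double true_range, double stddev, std::mt19937 &rng) {
    std::normal_distribution<double> noise(0.0, stddev);
    return true_range + noise(rng);
}

// Averaging many independent readings drives the noise down by
// roughly 1/sqrt(samples), recovering the underlying distance.
double averaged_range(double true_range, double stddev,
                      int samples, unsigned seed) {
    std::mt19937 rng(seed);
    double sum = 0.0;
    for (int i = 0; i < samples; ++i)
        sum += noisy_range(true_range, stddev, rng);
    return sum / samples;
}
```

A control system validated only against perfectly clean simulated readings can fail badly on hardware; adding even this simple noise model makes the simulation a more honest test.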





Open source toolkits for Linux

Several open source toolkits are available for building robotic control systems. This article looks at mobile robot simulators, a physics modeling system, and finally a simulator that supports embedding a simulated control system in a physical robot. The vast majority of these toolkits run on Linux, primarily because of the open source model: open source software is a platform from which you can develop more quickly and with less effort. Linux also permits customization not possible in other operating systems (such as minimizing and extending the kernel). Links to these toolkits and more are in the Resources section at the end of this article.

ODE

Russell Smith's Open Dynamics Engine (ODE) is an open source physics engine with which you can simulate articulated rigid-body dynamics. In this way, you can simulate the physics of real-world objects independent of a graphics library (for which you could use OpenGL). You can use the ODE to model all sorts of objects in synthetic environments, such as characters in three-dimensional game environments or vehicles in driving simulations. In addition to being fast, the ODE supports collision detection for real-time simulation.

What is an articulated rigid body?

An articulated rigid body is a structure that consists of a variety of shapes connected by various kinds of joints. For example, consider the joints that make up a leg, or the elements of a vehicle's chassis, suspension, and wheels. The ODE can model these elements efficiently, including friction models.

The ODE currently supports the ball-and-socket, hinge, slider, fixed, angular motor, and hinge-2 (for vehicle joints) joint types, among others. It also supports a variety of collision primitives (such as sphere and plane) and several collision spaces.

The ODE is written primarily in the C++ programming language, but it exposes clean interfaces in C and C++ for integration with your application. What makes the ODE even better are the licenses under which it's released: the GNU Lesser General Public License (LGPL) and the BSD License. Under either license, you can use the ODE source in commercial products without a fee. As a result, you'll find the ODE in a variety of commercial games, flight simulators, and virtual reality simulations.

The source example in Listing 1 shows a simple world with Mars' gravity and a sphere that starts with some upward velocity. Given the world's gravity, that upward velocity doesn't last long: the sphere eventually reaches its apex and begins its descent. After initialization is complete (that is, after the objects are created in the world and their attributes set), you simulate the physics of the world with calls to dWorldStep. To see what's happening, you make a regular call to dBodyGetPosition, passing in your sphere's identifier, to get its current position.


Listing 1. Simple ODE experiment of a sphere in a world with gravity
#include <iostream>
#include <ode/ode.h>

#define time_step (float)0.1

int main()
{
  dWorldID myWorld_id;
  dBodyID mySphere_id;
  dMass sphereMass;
  const dReal *pos;
  float time = 0.0;

  /* Create a new world */
  myWorld_id = dWorldCreate();

  /* Create a sphere in the world */
  mySphere_id  = dBodyCreate( myWorld_id );

  /* Set the world's global gravity vector (Mars) -- x,y,z */
  dWorldSetGravity( myWorld_id, 0, 0, -3.77 );

  /* Set the Sphere's position in the world -- x,y,z */
  dBodySetPosition( mySphere_id, 0, 0, 100 );

  /* Set the Sphere's mass (density, radius) */
  dMassSetSphere( &sphereMass, 1, 2 );
  dBodySetMass( mySphere_id, &sphereMass );

  /* Give the sphere a small amount of upward (z) velocity */
  dBodySetLinearVel( mySphere_id, 0.0, 0.0, 5.0 );

  /* Run the simulation */
  while (time < 5.0) {

    /* Simulate the world for the defined time-step */
    dWorldStep( myWorld_id, time_step );

    /* Get the current position of the sphere */
    pos = dBodyGetPosition( mySphere_id );

    std::cout << "position (" << pos[0] << ", "
             << pos[1] << ", " << pos[2] << ")\n";

    /* Next time step */
    time += time_step;

  }

  /* Destroy the objects */
  dBodyDestroy( mySphere_id );
  dWorldDestroy( myWorld_id );

  return 0;
}
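If you don't have the ODE installed, you can still predict what Listing 1 prints. Assuming dWorldStep takes one semi-implicit Euler step per call (velocity updated first, then position), which is a simplification of what the ODE actually does, the sphere's arc can be reproduced by hand:

```cpp
#include <cassert>
#include <vector>

// Hand-rolled semi-implicit Euler integration of Listing 1's sphere:
// z starts at 100 with +5 m/s velocity under Mars gravity (-3.77).
std::vector<double> sphere_heights(double dt, int steps) {
    double z = 100.0, vz = 5.0;
    const double g = -3.77;
    std::vector<double> heights;
    for (int i = 0; i < steps; ++i) {
        vz += g * dt;        // gravity updates velocity...
        z  += vz * dt;       // ...then velocity updates position
        heights.push_back(z);
    }
    return heights;
}

// Index of the apex: the step at which the sphere is highest.
int apex_step(const std::vector<double> &h) {
    int best = 0;
    for (int i = 1; i < (int)h.size(); ++i)
        if (h[i] > h[best]) best = i;
    return best;
}
```

Under this model, with the listing's 0.1-second time step the sphere peaks around step 13 at roughly 103 meters and descends to about 77 meters by the end of the 5-second run, which is the shape you should see scrolling past when the real program runs.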

So, if you need an industrial-quality physics engine (that runs on Linux as well as other platforms) to simulate your mobile robot or unmanned aerial vehicle in realistic environments, the ODE is a superb choice. Paired with the OpenGL application program interface (API) for rendering, the ODE lets you combine photo-realistic graphics with realistic physics.

Simbad robot simulator

Simbad is a three-dimensional robot simulator written in the Java® programming language (so it runs on Linux and any other platform with a Java virtual machine, or JVM); the simulator also supports Python scripting (through Jython). Simbad was designed for studying artificial intelligence (AI) algorithms in the context of autonomous robotics, and it includes a rich graphical user interface (GUI) for visualization, not only of the robot's actions but also from the robot's perspective.

What makes Simbad interesting is that it's simple to use and allows you to create new robot behaviors quickly. But while developing for Simbad is simple, it's actually an extensible framework for robotic simulation.

With the simulator, you can create or tailor an environment, and then develop your robot controller using a variety of sensors. Available sensors include a vision sensor (color monoscopic camera), range sensors (sonars and IR detectors), and bumpers for collision detection.

The APIs for the sensors are clean and intuitive to use. The example in Listing 2 demonstrates the use of sonar and how to detect a hit (an object detected).


Listing 2. A snippet of code demonstrating simulated sonar use
int sonar_id, total_sonars;

// If at least one sensor has a hit
if (sonars.oneHasHit()) {

  // Find out how many sonars are on the robot
  total_sonars = sonars.getNumSensors();

  // Iterate through each sonar
  for ( sonar_id = 0 ; sonar_id < total_sonars ; sonar_id++ ) {

    // Does this one have a hit?
    if (sonars.hasHit(sonar_id)) {

      // Emit the details (angle, range)
      System.out.println( "Sonar hit at angle " +
                           sonars.getAngle(sonar_id) +
                           " at range " +
                           sonars.getMeasurement(sonar_id) );

    }

  }

}

Other sensors available in Simbad follow a similar pattern, creating an intuitive set of APIs.

What really makes Simbad so useful is its console for robot simulation and visualization. As Figure 4 shows, the Simbad console gives you a real-time view of the world, an inspector panel that provides robot details (including the camera), and a control panel for managing the simulation.


Figure 4. The Simbad robot simulator and visualizer console

Simbad also provides good documentation and tutorials to get you up and running quickly in both the Java and Python languages. And along with single-robot simulation, you can simulate multiple robots simultaneously. Overall, the Simbad simulator is a great environment for testing ideas in intelligent robotics algorithms. Simbad is available under the GPL open source license.

TeamBots

TeamBots is a portable multi-agent robotic simulator that supports simulation of multi-agent control systems in dynamic environments with visualization. What makes TeamBots unique compared to other simulators such as Simbad is the portability of the control system. You can develop your control system and validate it on the simulator, and then test your control system in a real mobile robot (using the Nomadic Technologies Nomad 150 robot).

The TeamBots API provides an abstraction layer for the control system (see Figure 5). As a result, the control system has no idea whether it's running on a simulator in a synthetic environment (TBSim) or in a mobile robot platform in a real environment (TBHard).


Figure 5. The TeamBots API abstraction layer to the control system

The TeamBots simulation environment is very flexible and easily allows the construction of synthetic environments with objects and other robots. It is easy to add walls, arbitrary objects, roads, and other robots running the same or different control systems. In this way, you can build predator/prey simulations, as one example. In addition, objects need not be static: you can place objects that move around the environment on their own or objects that move when nudged by a robot (such as a ball).

With TeamBots, you can model different types of robot simulations. For example, in 1997, Georgia Tech used TeamBots to win the American Association for Artificial Intelligence (AAAI) mobile robot competition with two simulated Nomad 150 robots foraging in a dynamic environment. The goal was for the two robots to search the environment, and then pick up and return the blue objects to the blue bin and the orange objects to the orange bin (see Figure 6). To add some complexity to the competition, the orange balls were dynamic and constantly moved around the environment.


Figure 6. TeamBots simulation of foraging behavior

In Figure 6, mobile robot 1 has a blue object and is moving toward the blue bin to drop it off. Robot 0 is searching.

You can also use TeamBots in the development of robotic soccer players. As soccer is a sport with international appeal, it's a great platform for competition between international universities and groups. Rules for robot soccer can differ (especially when considering the varieties that exist for mobile platforms, bipedal platforms, or Sony Aibo), but all share the fundamental model of the game.

In Figure 7, robot 1 (yellow/white) is moving toward the ball in a goal attempt. Robot 0 (blue/red) is the opposing goalkeeper, positioning for a block. Robot soccer is actually quite interesting to watch, and the TeamBots distribution provides several teams that you can run as-is or use as starting points for new strategies.


Figure 7. Demonstrating TeamBots in the SoccerBots domain

TeamBots provides a Java API for soccer that allows you to concentrate on the "brain" of the player. The effector API permits turning the robot, moving at a certain speed, kicking the ball, or simply moving the ball. Sensors are exposed at a high level, with APIs for getting the vector to the ball, an array of vectors to the other players (teammates and opponents), the current heading, the vector to the opposing goal, and so on.

To give you an idea of the level of the TeamBots Soccer API, check out Listing 3, which presents a very simple strategy. This strategy (derived from the SoccerBots source by Tucker Balch) simply looks for the ball, heads toward it, and then kicks it (without regard to the direction of the goal). It's a naive strategy, but it demonstrates the simplicity of the API.


Listing 3. Simple soccer player snippet using the TeamBots SoccerBots API
public int TakeStep()
{
  Vec2 ball;
  long T;

  T = abstract_robot.getTime();

  // Get the vector to the ball
  ball = abstract_robot.getBall(T);

  // Point ourselves to it
  abstract_robot.setSteerHeading(T, ball.t);

  // Go to it (maximum speed)
  abstract_robot.setSpeed(T, 1.0);

  // If we can kick it, do so!
  if (abstract_robot.canKick(T)) abstract_robot.kick(T);

  return(CSSTAT_OK);
}

The TeamBots distribution is a great environment for both prototyping and simulating mobile robots and also for executing them within real robots using the TBHard environment. TeamBots is open source (developed by Tucker Balch of Georgia Tech and Carnegie Mellon University) and can be used freely for educational and research purposes. The simulator was developed in the Java language and is distributed with full source code and several examples to help you get up and running quickly.
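As an aside, a common refinement of the chase-and-kick strategy in Listing 3 is to steer for a point just behind the ball on the goal-ball line, so that driving through it sends the ball goalward. That geometry is independent of any simulator; the sketch below is plain C++ of my own, not the TeamBots API:

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { double x, y; };

// Heading (radians) from the robot to a target point.
double heading_to(const Vec2 &robot, const Vec2 &target) {
    return std::atan2(target.y - robot.y, target.x - robot.x);
}

// Approach point: a spot at distance `standoff` behind the ball,
// on the line from the goal through the ball. A robot that drives
// through this point toward the ball pushes it toward the goal.
Vec2 approach_point(const Vec2 &ball, const Vec2 &goal, double standoff) {
    double dx = ball.x - goal.x, dy = ball.y - goal.y;
    double len = std::sqrt(dx * dx + dy * dy);
    return Vec2{ball.x + standoff * dx / len,
                ball.y + standoff * dy / len};
}
```

In a TeamBots player you would feed the equivalent heading into setSteerHeading instead of aiming straight at the ball, turning the "random" kicks of Listing 3 into goal attempts.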





Other toolkits

One of the most well-known mobile robot platforms for which numerous simulators have been written is Khepera. Unfortunately, those simulators evolved into commercial software and are no longer open source. Fortunately, toolkits such as KControl are still available for developing Khepera control systems on Linux.

An interesting three-dimensional robot simulator with dynamics is available in Gazebo. Gazebo models not only standard robot sensors (such as inertial measurement units, GPS receivers, and monocular cameras) but also real-world rigid-body physics for robotic environments. Gazebo supports a plug-in model in which you can load new robot sensor models into environments dynamically.

Finally, a useful robot navigation toolkit is Carmen -- the Carnegie Mellon Robot Navigation Toolkit. Carmen implements a modular architecture that provides fundamental navigation primitives such as obstacle avoidance, path planning, and mapping. As well as providing a two-dimensional simulator, Carmen supports several physical robot platforms running Linux.
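To give a flavor of what a path-planning primitive does, here is a minimal breadth-first search over an occupancy grid. This is my own sketch, not Carmen's API; real planners add costmaps, robot footprints, and continuous replanning on top of this core idea.

```cpp
#include <cassert>
#include <queue>
#include <string>
#include <utility>
#include <vector>

// Shortest path length (in moves) across an occupancy grid, where
// '#' marks an obstacle; returns -1 if the goal is unreachable.
int shortest_path(const std::vector<std::string> &grid,
                  int sr, int sc, int gr, int gc) {
    int rows = (int)grid.size(), cols = (int)grid[0].size();
    std::vector<std::vector<int>> dist(rows, std::vector<int>(cols, -1));
    std::queue<std::pair<int, int>> frontier;
    dist[sr][sc] = 0;
    frontier.push(std::make_pair(sr, sc));
    const int dr[] = {1, -1, 0, 0}, dc[] = {0, 0, 1, -1};
    while (!frontier.empty()) {
        std::pair<int, int> cur = frontier.front();
        frontier.pop();
        int r = cur.first, c = cur.second;
        if (r == gr && c == gc) return dist[r][c];  // goal reached
        for (int k = 0; k < 4; ++k) {               // expand neighbors
            int nr = r + dr[k], nc = c + dc[k];
            if (nr >= 0 && nr < rows && nc >= 0 && nc < cols &&
                grid[nr][nc] != '#' && dist[nr][nc] < 0) {
                dist[nr][nc] = dist[r][c] + 1;
                frontier.push(std::make_pair(nr, nc));
            }
        }
    }
    return -1;  // frontier exhausted: goal is walled off
}
```

Obstacle avoidance and mapping feed this primitive: the map supplies the grid, and a planner like this supplies the route that the low-level controller then follows.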





Building Linux robots

Getting started building Linux-based robots isn't as difficult as you might think. In fact, some high school science curriculums use Linux and readily available hardware as the core of Linux-based robots. For example, you could use an old PC motherboard as the system core (or better yet, an old laptop) and boot Linux from a USB drive (which consumes significantly less power than a CD-ROM, hard drive, or floppy drive). The onboard parallel port is easily transformed into a multitude of devices, such as discrete inputs and outputs or a driver for a set of stepper motors. The serial port can be used to read GPS coordinates or, with an external device, as an A/D (analog-to-digital) or D/A (digital-to-analog) converter. Finally, you can purchase inexpensive USB Web cameras to give your robot the ability to see.
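As a concrete illustration of the parallel-port idea, the classic full-step excitation sequence for a four-coil unipolar stepper is pure bit logic. The sketch below is my own; the port write itself (the `outb` call and the LPT1 base address 0x378) is only hinted at in a comment, since it needs root privileges and real hardware.

```cpp
#include <cassert>

// Full-step excitation pattern for a unipolar stepper driven from
// four output bits (e.g. parallel-port data lines D0-D3): cycling
// 0011 -> 0110 -> 1100 -> 1001 rotates the motor one step at a time.
unsigned char step_pattern(int step) {
    static const unsigned char seq[4] = {0x3, 0x6, 0xC, 0x9};
    int i = step % 4;
    if (i < 0) i += 4;        // negative steps index the cycle backward
    return (unsigned char)seq[i];
}

// On real hardware you would write each pattern to the port, e.g.:
//   outb(step_pattern(n), 0x378);   /* 0x378: legacy LPT1 data port */
```

Stepping `n` forward rotates one way; stepping it backward reverses the motor, which is all a differential-drive robot base needs from its lowest software layer.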

But what really makes Linux shine in this environment is the ability to simplify it so that robot control system design is accessible to anyone through higher-level languages such as Python. Michael Surran of the Greater Houlton Christian Academy in Maine recently offered, for the second year, a high school robotics course built on Linux and readily available hardware. At the core of the curriculum is Python: because Python is interpreted, it's easy to experiment with algorithms without lengthy compile cycles (which is what makes interpreted scripting languages so useful in the first place).

If you're looking beyond the homebrew Linux solution, Carnegie Mellon University recently introduced the "Qwerkbot" platform from its Mobile Robot Programming Lab (MRPL), which runs the 2.6 Linux kernel. The "Qwerk" is an ARM9-based board with 8MB of flash and 32MB of SDRAM; it includes four onboard motor controllers, 16 servo controllers, 16 digital I/Os, eight 12-bit analog inputs, and a whole lot more.





Conclusion

Robot simulators can greatly simplify the job of building physical robots. Through simulators, you can test ideas and strategies before putting them into hardware. Luckily, the Linux and open source communities have several options that are not only easy to use but can even support direct linkage to hardware platforms.



Resources

Learn
  • The Open Dynamics Engine is a physics engine for modeling articulated rigid-body dynamics.

  • The Simbad robot simulator is a great tool for robot simulation and visualization.

  • The TeamBots distribution: a portable multi-agent robot simulator (TBSim) that also supports executing control systems on real robots (TBHard).

  • The RoboCup Soccer competition maintains an extensive set of rules that define what's permitted in play, from the size of the field to the ball used (an orange golf ball).

  • The 2005 RoboCup, held in Osaka, Japan, included a variety of robotic soccer events: traditional mobile robot players, bipedal robots, Sony Aibos, simulated virtual players, and several other robotic demonstrations. The RoboCup Web site provides some great video footage of the event.

  • The American Association for Artificial Intelligence Web site maintains a good (and current) list of AI topics and research. Their robotics page will keep you up-to-date on the latest happenings in the world of soft and hard robotics (as well as their open source projects page).

  • Khepera II remains based on the Motorola 68331 CPU (the miniature Khepera is a bit outdated). In addition to the mobile platform, many options exist to extend the platform, such as with a video camera module, a gripper module, and a radio modem to permit communication with other Khepera mobile robots.

  • The Gazebo multi-robot simulator implements not only a realistic robot simulation but also an accurate simulation of rigid-body physics for the robot's environment.

  • Carmen is a robotic navigation toolkit that provides simulation and support for physical robot platforms. It implements several navigation primitives, such as mapping and path planning.

  • Find interesting recipes for the Qwerkbot, including a pan-tilt unit for a camera, at CMU.

  • "Do-It-Yourself Robots with Linux," a Linux Journal article by Michael Surran, explains why Linux and older PC hardware make a great platform for building mobile robots.

  • To run Java on Linux, you can use the Blackdown Java Linux package.

  • In the developerWorks Linux zone, find more resources for Linux developers.

  • Stay current with developerWorks technical events and Webcasts.

Get products and technologies
  • Order the SEK for Linux, a two-DVD set containing the latest IBM trial software for Linux from DB2®, Lotus®, Rational®, Tivoli®, and WebSphere®.

  • With IBM trial software, available for download directly from developerWorks, build your next development project on Linux.


About the author

M. Tim Jones

M. Tim Jones is an embedded software architect and the author of GNU/Linux Application Programming, AI Application Programming, and BSD Sockets Programming from a Multilanguage Perspective. His engineering background ranges from the development of kernels for geosynchronous spacecraft to embedded systems architecture and networking protocols development. Tim is a Consultant Engineer for Emulex Corp. in Longmont, Colorado.


Posted by 따봉맨