LEARNING ROBOTICS USING PYTHON PDF


If you are an engineer, a researcher, or a hobbyist who is interested in robotics and wants to build your own robot, this book is for you.



Language: English
Genre: Academic & Education
Pages: 361
Published: 20.12.2015
ePub file size: 28.33 MB
PDF file size: 16.76 MB

Learning Robotics Using Python (Packt Publishing) is an essential guide to creating an autonomous mobile robot using popular robotics software frameworks such as ROS. Learning about robotics will become an increasingly essential skill. Robot Operating System (ROS) is one of the most popular robotics software frameworks in research and industry, and it provides a wide range of features for implementing robot applications.

In this tutorial, I will be describing the robot control software architecture that comes with v1.

However, I encourage you to dive into the source and mess around. The simulator has been forked and used to control different mobile robots, including a Roomba from iRobot. Likewise, please feel free to fork the project and improve it.

The supervisor evolves our robot's state machine and triggers the controllers for computing the desired behavior.

In particular, a specific controller is selected depending on the state machine.

The Goal

Robots, like people, need a purpose in life. The goal of the software controlling this robot will be very simple: it will attempt to make its way to a predetermined goal point. This is usually the basic feature that any mobile robot should have, from autonomous cars to robotic vacuum cleaners. The coordinates of the goal are programmed into the control software before the robot is activated, but they could also be generated by an additional Python application that oversees the robot's movements.

For example, think of it driving through multiple waypoints. However, to complicate matters, the environment of the robot may be strewn with obstacles. Therefore, if the robot encounters an obstacle, it will have to find its way around it so that it can continue toward the goal.

The Programmable Robot

Every robot comes with different capabilities and control concerns.

The first thing to note is that, in this guide, our robot will be an autonomous mobile robot. This means that it will move around in space freely and that it will do so under its own control. This is in contrast to, say, a remote-control robot which is not autonomous or a factory robot arm which is not mobile. Our robot must figure out for itself how to achieve its goals and survive in its environment.

This proves to be a surprisingly difficult challenge for novice robotics programmers.

Control Inputs: Sensors

There are many different ways a robot may be equipped to monitor its environment.

These can include anything from proximity sensors, light sensors, bumpers, cameras, and so forth. In addition, robots may communicate with external sensors that give them information that they themselves cannot directly observe. There are more sensors facing the front of the robot than the back because it is usually more important for the robot to know what is in front of it than what is behind it.

In addition to the proximity sensors, the robot has a pair of wheel tickers that track wheel movement. These allow you to track how many rotations each wheel makes: a full forward turn of a wheel produces a fixed number of ticks, and turns in the opposite direction count backward, decreasing the tick count instead of increasing it. Later I will show you how to compute the distance traveled from the ticks with an easy Python function.

Control Outputs: Mobility

Some robots move around on legs.

Some roll like a ball.


Some even slither like a snake. Our robot is a differential drive robot, meaning that it rolls around on two wheels. When both wheels turn at the same speed, the robot moves in a straight line.

When the wheels move at different speeds, the robot turns. Thus, controlling the movement of this robot comes down to properly controlling the rates at which each of these two wheels turn.

If you want to create a different robot, you simply have to provide a different Python robot class that can be used by the same interface, and the rest of the code (controllers, supervisor, and simulator) will work out of the box!
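In that spirit, here is a minimal sketch of what such a robot interface might look like. The class and method names below are assumptions for illustration, not the project's actual API:

```python
class RobotInterface:
    """Hypothetical interface the controllers and supervisor program against."""

    def set_wheel_drive_rates(self, v_l, v_r):
        """Command the left/right wheel angular velocities (rad/s)."""
        raise NotImplementedError

    def read_proximity_sensors(self):
        """Return the list of raw proximity sensor readings."""
        raise NotImplementedError

    def read_wheel_encoders(self):
        """Return cumulative (left_ticks, right_ticks)."""
        raise NotImplementedError


class SimulatedRobot(RobotInterface):
    """A trivial stand-in showing how a new robot plugs into the same interface."""

    def __init__(self):
        self.v_l = self.v_r = 0.0

    def set_wheel_drive_rates(self, v_l, v_r):
        self.v_l, self.v_r = v_l, v_r

    def read_proximity_sensors(self):
        return [0.2] * 9  # pretend nothing is in range

    def read_wheel_encoders(self):
        return (0, 0)
```

Any class implementing these three methods could be dropped in without touching the controllers.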

The Simulator

Just as you would use a real robot in the real world without paying too much attention to the laws of physics involved, you can ignore how the robot is simulated and skip directly to how the controller software is programmed, since it will be almost the same in the real world and in a simulation.

But if you are curious, I will briefly introduce it here. The world is defined in its own file, and the robot's step function is executed in a loop so that the simulated robot advances at each time step. The same concepts apply to the encoders.

A Simple Model

First, our robot will have a very simple model. It will make many assumptions about the world. Some of the important ones include:

- The terrain is always flat and even
- Obstacles are never round
- Nothing is ever going to push the robot around
- The sensors never fail or give false readings
- The wheels always turn when they are told to

Although most of these assumptions are reasonable inside a house-like environment, round obstacles could be present.

Our obstacle avoidance software has a simple implementation: it follows the border of obstacles in order to go around them. We give readers hints on how to improve the control framework of our robot with an additional check to avoid circular obstacles.

The Challenge of the Programmable Robot: Perception vs. Reality, and the Fragility of Control

The Control Loop

We will now enter the core of our control software and explain the behaviors that we want to program into the robot. Additional behaviors can be added to this framework, and you should try your own ideas after you finish reading! A robot is a dynamic system: the state of the robot, the readings of its sensors, and the effects of its control signals are in constant flux.

Controlling the way events play out involves the following three steps:

1. Apply control signals.
2. Measure the results.
3. Generate new control signals calculated to bring us closer to our goal.

These steps are repeated over and over until we have achieved our goal. The more times we can do this per second, the finer control we will have over the system. The Sobot Rimulator robot repeats these steps 20 times per second (20 Hz), but many robots must do this thousands or millions of times per second in order to have adequate control.
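The three-step cycle above can be sketched as a fixed-rate loop. The robot and supervisor objects and their method names here are hypothetical placeholders, not the simulator's actual API:

```python
import time

CONTROL_RATE_HZ = 20  # the simulator's loop frequency

def run_control_loop(robot, supervisor, steps):
    """Measure, compute new controls, apply -- repeated at a fixed rate."""
    dt = 1.0 / CONTROL_RATE_HZ
    for _ in range(steps):
        supervisor.update_state(robot.read_sensors(), dt)  # measure the results
        v_l, v_r = supervisor.compute_wheel_rates()        # generate new signals
        robot.set_wheel_drive_rates(v_l, v_r)              # apply control signals
        time.sleep(dt)                                     # hold the loop rate
```

A real implementation would account for the time spent computing when sleeping, so the loop holds an exact 20 Hz.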

Remember our previous introduction about different robot programming languages for different robotics systems and speed requirements. In general, each time our robot takes measurements with its sensors, it uses these measurements to update its internal estimate of the state of the world, for example, the distance from its goal. It compares this state to a reference value of what it wants the state to be (for the distance, it wants it to be zero), and calculates the error between the desired state and the actual state.

Once this information is known, generating new control signals can be reduced to the problem of minimizing the error, which will eventually move the robot toward the goal.

A Nifty Trick: Simplifying the Model

To control the robot we want to program, we have to send a signal to the left wheel telling it how fast to turn, and a separate signal to the right wheel telling it how fast to turn. Call these signals vL and vR. However, constantly thinking in terms of vL and vR is very cumbersome.

Instead, we can think in terms of a single forward velocity v and an angular (turning) velocity ω, and translate these into individual wheel speeds only at the last moment. This is known as a unicycle model of control, and the final transformation into vL and vR is implemented in the supervisor. The robot's estimates of its own state will never be perfect, but they must be fairly good, because the robot will be basing all of its decisions on them.
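A minimal sketch of that unicycle-to-differential transformation, using standard differential-drive kinematics; the wheel radius R and wheelbase L below are placeholder values, not the simulator's actual parameters:

```python
R = 0.02   # wheel radius in meters (placeholder)
L = 0.09   # distance between the two wheels in meters (placeholder)

def uni_to_diff(v, omega):
    """Convert forward speed v (m/s) and turn rate omega (rad/s)
    into left/right wheel angular velocities (rad/s)."""
    v_l = (2.0 * v - omega * L) / (2.0 * R)
    v_r = (2.0 * v + omega * L) / (2.0 * R)
    return v_l, v_r
```

With omega = 0 both wheels spin at the same rate and the robot drives straight; with v = 0 the wheels spin in opposite directions and the robot turns in place.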

Using its proximity sensors and wheel tickers alone, it must try to guess the following:

- The direction to obstacles
- The distance to obstacles
- The position of the robot
- The heading of the robot

The first two properties are determined by the proximity sensor readings and are fairly straightforward.

We know ahead of time that the seventh reading, for example, corresponds to the sensor that points 75 degrees to the right of the robot. Thus, if that sensor returns a reading below its maximum range, we know an obstacle sits at roughly the corresponding distance along that direction.

If there is no obstacle, the sensor will return a reading at its maximum range, so a maximum-range reading only tells us that no obstacle is within reach. Because of the way the infrared sensors work (by measuring infrared reflection), the numbers they return are a non-linear transformation of the actual distance detected.
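One common way to invert such a non-linear response is piecewise-linear interpolation over a calibration table. The raw values and distances below are made up for illustration; a real table comes from the particular sensor's datasheet or from measurement:

```python
# Hypothetical calibration pairs: (raw IR reading, distance in meters).
# Raw values decrease as distance grows, so the table is sorted by raw value.
CALIB = [(3960, 0.02), (3100, 0.04), (2180, 0.06), (1430, 0.10),
         (760, 0.15), (410, 0.20)]

def raw_to_meters(raw):
    """Interpolate a raw IR reading into a distance in meters."""
    if raw >= CALIB[0][0]:
        return CALIB[0][1]  # closer than the table covers
    for (r1, d1), (r2, d2) in zip(CALIB, CALIB[1:]):
        if raw >= r2:  # raw falls between r1 (larger) and r2
            t = (r1 - raw) / (r1 - r2)
            return d1 + t * (d2 - d1)
    return CALIB[-1][1]  # at or beyond maximum range
```

Readings at or below the last table entry are clamped to the sensor's maximum range, matching the "no obstacle detected" case described above.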

Thus, the Python function for determining the distance indicated must convert these readings into meters. This is done in the supervisor. Determining the position and heading of the robot (together known as the pose in robotics programming) is somewhat more challenging.

Our robot uses odometry to estimate its pose. This is where the wheel tickers come in: by measuring how much each wheel has turned since the last iteration of the control loop, the robot can estimate how its pose has changed. This is one reason it is important to iterate the control loop very frequently in a real-world robot, where the motors moving the wheels may not be perfect.

If we waited too long to measure the wheel tickers, both wheels could have turned quite a lot, and it would be impossible to estimate where we have ended up. Given our current software simulator, we can afford to run the odometry computation at 20 Hz, the same frequency as the controllers. But it could be a good idea to have a separate Python thread running faster to catch smaller movements of the tickers.
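A standard differential-drive odometry update can be sketched as follows; the geometry constants below are placeholders, not the simulator's actual values:

```python
import math

WHEEL_RADIUS = 0.02   # meters (placeholder)
WHEELBASE = 0.09      # distance between the wheels in meters (placeholder)
TICKS_PER_REV = 1000  # encoder ticks per full wheel revolution (placeholder)

def update_odometry(pose, d_ticks_left, d_ticks_right):
    """Advance the pose (x, y, theta) given encoder tick deltas."""
    x, y, theta = pose
    m_per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = m_per_tick * d_ticks_left    # distance rolled by the left wheel
    d_right = m_per_tick * d_ticks_right  # distance rolled by the right wheel
    d_center = (d_left + d_right) / 2.0   # distance moved by the robot's center
    x += d_center * math.cos(theta)
    y += d_center * math.sin(theta)
    theta += (d_right - d_left) / WHEELBASE
    theta = math.atan2(math.sin(theta), math.cos(theta))  # wrap to [-pi, pi]
    return (x, y, theta)
```

Equal tick deltas advance the robot straight along its heading; opposite deltas rotate it in place.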

In our coordinate system, positive x is to the east and positive y is to the north, so a heading of 0 indicates that the robot is facing directly east. The robot always assumes its initial pose is (0, 0), 0.

So, now that our robot can estimate its real-world pose, how do we make the wheels turn to get it to the goal? If we go forward while facing the goal, we will get there, so this becomes a simple task that can be easily programmed in Python. Thanks to our odometry, we know what our current coordinates and heading are.

We also know the coordinates of the goal because they were pre-programmed, so we can compute the vector from our position to the goal. The angle of this vector from the X-axis, minus our current heading, is the error between our current state and what we want our current state to be. We multiply this heading error by a proportional gain, a coefficient that determines how fast we turn in proportion to how far away from the goal heading we are. If the error in our heading is 0, then the turning rate is also 0. A good general rule of thumb is one you probably know instinctively: if we are not making a turn, we can go forward at full speed, and the faster we are turning, the more we should slow down.
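Putting those pieces together, a go-to-goal controller can be sketched as a simple proportional law. The gain and speed values below are tuning assumptions, not values from the book:

```python
import math

K_P = 5.0    # proportional gain for turning (assumed tuning value)
V_MAX = 0.3  # full forward speed in m/s (assumed)

def go_to_goal(pose, goal):
    """Return unicycle commands (v, omega) steering the robot toward goal."""
    x, y, theta = pose
    goal_heading = math.atan2(goal[1] - y, goal[0] - x)
    # Wrap the heading error into [-pi, pi] so we always turn the short way.
    error = math.atan2(math.sin(goal_heading - theta),
                       math.cos(goal_heading - theta))
    omega = K_P * error              # turn faster when the error is larger
    v = V_MAX / (1.0 + abs(omega))  # slow down the more sharply we turn
    return v, omega
```

The damping of v by |omega| implements the rule of thumb above: full speed when driving straight, slower the harder we turn.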

This generally helps keep our system stable and acting within the bounds of our model. With that, we have almost completed a single control loop. The only thing left to do is transform these two unicycle-model parameters into differential wheel speeds and send the signals to the wheels.

When an obstacle is encountered, turn away from it until it is no longer in front of us.
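One hedged sketch of this behavior: sum weighted vectors pointing toward each detected obstacle and head in the opposite direction. The weighting scheme here is an illustrative choice, not the simulator's exact rule:

```python
import math

def avoid_obstacles_heading(sensor_distances, sensor_angles):
    """Compute a heading (radians, robot frame) pointing away from obstacles.

    Each sensor contributes a vector toward what it detects, weighted more
    heavily the closer the obstacle is; we head opposite the weighted sum.
    """
    ox, oy = 0.0, 0.0
    for d, a in zip(sensor_distances, sensor_angles):
        weight = 1.0 / (d + 1e-6)  # closer obstacles matter more
        ox += weight * math.cos(a)
        oy += weight * math.sin(a)
    return math.atan2(-oy, -ox)    # point away from the obstacle mass
```

For instance, a close obstacle dead ahead with nothing on the sides yields a heading of roughly ±pi, i.e. directly away from it.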


Accordingly, when there is no obstacle in front of us, we want our reference vector to simply point forward.

In this chapter, we discuss interfacing the actuators that we are using in our robot.

We see how to interface a motor and an encoder with a controller board called the Tiva C Launchpad, and we discuss the controller code for interfacing them. In the future, if the robot requires higher accuracy and torque, Dynamixel servos can substitute for the current DC motors. The next chapter covers some of the robotic sensors used in this robot; these sensors help in the navigation of the robot. We also see the basic code to interface these sensors with the Tiva C Launchpad.

The main aim of this chapter is to discuss speech recognition and synthesis and how we can implement them on Chefbot. By adding speech functionality to our robot, we can make the robot more interactive than before. We discuss the processes involved in speech recognition and synthesis, look at the block diagram of these processes, and cover the functions of each block.

After discussing these libraries, we work with the Python interfaces of each one. Toward the end of this chapter, we implement ROS packages that perform speech recognition and synthesis. In the next chapter, we see how to add artificial intelligence to Chefbot so that it can interact with people. This function is an add-on to Chefbot that increases the interactivity of the robot.

We use simple AI techniques such as pattern matching and searching in Chefbot. The pattern datasets are stored in a special type of file called AIML (Artificial Intelligence Markup Language), and we use an AIML interpreter module to decode these files.

This method is similar to a stimulus-response system: the user gives a stimulus in the form of text, and the module finds the appropriate reply for the user input from the AIML patterns.
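As a toy illustration of stimulus-response matching (a stand-in for a real AIML engine; the patterns and replies below are made up):

```python
# Toy stimulus-response table standing in for an AIML pattern set.
PATTERNS = {
    "HELLO": "Hi there! I am Chefbot.",
    "WHAT IS YOUR NAME": "My name is Chefbot.",
}

def respond(stimulus):
    """Normalize the input, then look up a canned response (toy example)."""
    key = stimulus.strip().upper().rstrip("?!.")
    return PATTERNS.get(key, "I do not understand.")
```

A real AIML engine generalizes this with wildcards, topic context, and recursive pattern substitution rather than exact-match lookup.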

We will see the entire communication system of the robot and how the robot communicates with people.

It includes speech recognition and synthesis along with AI; we discussed speech processing in the previous chapter. Finally, we implement the entire code in ROS along with the speech recognition and synthesis units.

The next chapter is about assembling the hardware of ChefBot and integrating the embedded and ROS code into the robot to perform autonomous navigation. We see the robot hardware parts that were selected using the design from Chapter 5, Working with Robotic Actuators.

We assembled the individual sections of the robot and connected the prototype PCB we designed for it. This consists of the Launchpad board, motor driver, level shifter, ultrasonic sensor, and IMU. The Launchpad board was flashed with the new embedded code, which can interface with all the sensors in the robot and can send data to and receive data from the PC. After discussing the embedded code, we wrote a ROS Python driver node to interface with the serial data from the Launchpad board, and we connected the robot to the ROS navigation stack.

In this chapter, we discuss creating a GUI for ChefBot that can be used by an ordinary user who doesn't know anything about the internal workings of a robot. The ChefBot GUI can start the robot, select a table number, and command the robot to move to that position.

The position of each table is obtained from the generated map; we hard-coded the positions in this Python script for testing. When a table is selected, we set a goal position on the map, and when we click the Go button, the robot moves to the goal position.

The user can cancel the operation at any time and command the robot to return to the home position. The GUI can also display the real-time status of the robot and its battery level.

We will see some plugins used for debugging the data from the robot. In this chapter, we discuss all the possible errors and the calibration required before working with the robot; calibration is needed to reduce errors from the sensors. We also cover all the steps to take before working with the robot GUI that we designed, and we discuss the pros and cons of this navigation method.

The area we are handling is still an active research topic, so we can't expect high accuracy from the current prototype.

A good blend of both hardware and software, and great for beginners and enthusiasts in the robotics community. The book is crafted with a good level of detail while staying easy to read! I have always been fascinated by the 'Dummies' series for the fact that only a person with in-depth knowledge of the subject can explain otherwise hard-to-comprehend concepts in a crisp and easy-to-understand style, building from the basics and going on to advanced topics.

Learning Robotics Using Python is just that! Every reader who goes through, or I'd say experiences, this book will delve into the artistic world of robots. Lentin's comprehensive yet simple approach guides even a beginner to confidence. A great book to learn robotics, with the right mix of theoretical and practical knowledge. It is a complete, well-detailed, and practical guide to robotics for beginners.

Very easy to follow and a must-have for anyone starting out with real robotics. This is one of the best books for learning robotics practically.

The highlight of this book is that it deals with all the realms of robotics: mechanical CAD design, electronic circuit design, embedded firmware development, high-level image and speech processing, autonomous navigation using AI techniques, and much more.

It also gives a beginner an introduction to using ROS. This is a really nice book that covers everything you need to know to build your robot, from the hardware design to the software and the sensors, in a systematic way.

Learning Robotics Using Python

About Learning Robotics using Python:

First Published: May
Author: Lentin Joseph
Reviewers: Avkash Chauhan, Vladimir Iakovlev
Acquisition Editor: Rebecca Youe
Content Development Editor: Athira Laji
Technical Editor: Harshal Ved
Proofreaders: Stephen Copestake, Safis Editing
Indexer: Priya Sane
Graphics: Sheetal Aute
Production Coordinator: Nitesh Thakur
Cover Work: Nitesh Thakur

Introduction to Robotics. What is a robot? Where do robots come from?

Robot software development should also not be confused with robotic process automation (RPA). Both fields develop software in order to help or replace humans, but RPA targets tasks usually done by a human in front of a computer, such as sending emails, filing receipts, or browsing a website.


Finally, use these estimates to develop smart, complex software routines that create your desired behavior. Assuming you are able to run the Java Virtual Machine on your robot, you can interface your Java code with the motor and sensor drivers using sockets or RPC.