Zigbee based wireless motor speed and directional control - part 2

ABSTRACT

Wireless sensor networks have been a very popular research field for the last couple of years, and ZigBee is a new technology now being deployed for them. ZigBee is a low-data-rate wireless network standard defined by the ZigBee Alliance and based on IEEE 802.15.4. Compared with other wireless networks, ZigBee offers low power consumption, low cost, and secure, reliable operation, so implementing a remote motor control and monitoring system with it gives a good cost-performance ratio.

This paper presents the implementation of a point-to-point wireless sensor-actor network to control the speed of a DC motor from a remote location. Pulse Width Modulation (PWM) provides the actual control. Speed feedback is set up using an infrared (IR) sensor whose output is fed directly to a PIC16F877 microcontroller. The conventional setup uses a frequency-to-voltage converter to find the speed, whereas here a new technique calculates the speed from the IR sensor output without one, as sketched below. The motor is controlled remotely with a ±5 rpm tolerance.
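To illustrate the period-measurement idea, here is a minimal Arduino-style C++ sketch (the actual firmware runs on the PIC16F877; the pin number, pulses-per-revolution value and the pulseIn() helper are illustrative assumptions, not the project's code):

// Hedged sketch: estimate RPM by timing the IR sensor pulse train directly,
// with no frequency-to-voltage converter in the loop.
const int IR_PIN = 2;              // placeholder input pin for the IR sensor
const int PULSES_PER_REV = 1;      // assume one reflective marker on the shaft

unsigned long measureRpm() {
  // pulseIn() returns the duration of one HIGH (or LOW) pulse in microseconds;
  // one full revolution is the HIGH time plus the LOW time.
  unsigned long highTime = pulseIn(IR_PIN, HIGH, 250000UL);
  unsigned long lowTime  = pulseIn(IR_PIN, LOW,  250000UL);
  unsigned long period   = highTime + lowTime;          // microseconds per revolution
  if (period == 0) return 0;                             // motor stopped or timeout
  return 60000000UL / (period * PULSES_PER_REV);         // revolutions per minute
}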
Keywords: Wireless Sensor Networks, ZigBee, Xbee, motor control 




HOST UNIT CIRCUIT DIAGRAM:




NODE UNIT CIRCUIT DIAGRAM:








OBSERVED GRAPH:


Xbee Configuration using X-CTU Software:

ORIGINAL PICTURES:

HOST UNIT:
NODE UNIT:

CONCLUSION:

Different speeds are maintained at the given set speed by varying the duty cycle of the PWM signal, using a PIC16F877A microcontroller and a ZigBee wireless link, with a tolerance of ±5 rpm. The measured speed values are transferred over the serial interface to the PC, logged into a text file, and visualized with the X-CTU front end.
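A minimal sketch of that adjust-until-in-tolerance loop, written in Arduino-style C++ for readability (the real controller is the PIC16F877A; the pin number, step size and function names below are assumptions, not the project's firmware):

const int PWM_PIN   = 9;        // placeholder PWM output to the motor driver
const int TOLERANCE = 5;        // the +/-5 rpm band from the results above
int duty = 128;                 // current duty cycle, 0-255

void regulateSpeed(int setRpm, int measuredRpm) {
  // Nudge the duty cycle until the measured speed sits inside the tolerance band.
  if (measuredRpm < setRpm - TOLERANCE && duty < 255) duty++;
  else if (measuredRpm > setRpm + TOLERANCE && duty > 0) duty--;
  analogWrite(PWM_PIN, duty);   // duty/255 is the PWM duty cycle applied to the motor
}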

The radius of the location under observation can be increased by 150 meters by using an XBee-Pro instead of an ordinary XBee. Alternatively, with a mesh network of XBee modules the distance between the host and the sensor node is no longer a critical parameter. The acquired data can be further analyzed or used for control of the remote process.

REFERENCES:

[1] H. Taub and D. L. Schilling, Principles of Communication Systems, 2nd ed., McGraw-Hill, New York, 1986.
[2] J. A. Gutierrez, E. H. Callaway, Jr., and R. L. Barrett, Jr., Low-Rate Wireless Personal Area Networks: Enabling Wireless Sensors with IEEE 802.15.4, Standards Information Network / IEEE Press, November 2003.
[3] T. S. Rappaport, Wireless Communications: Principles & Practice, Prentice Hall, 1996.
[4] D. Gislason, ZigBee Wireless Networking, Newnes Press, 2007.
[5] http://www.zigbee.org/en/index.asp: ZigBee official website.
[6] http://www.maxstream.net/products/xbee/datasheet_XBee_OEM_RF-Modules.pdf: MaxStream XBee transceiver datasheet.
[7] A. S. Tanenbaum, Computer Networks, 3rd ed., Prentice-Hall International, 1996.




Posted by:
SANTHOSH   BHARADWAJ    REDDY
email: bharadwaj874@gmail.com

ZigBee Based Wireless Motor Control - part 1

ABSTRACT

Wireless sensor networks have been a very popular research field for the last couple of years, and ZigBee is a new technology now being deployed for them. ZigBee is a low-data-rate wireless network standard defined by the ZigBee Alliance and based on IEEE 802.15.4. Compared with other wireless networks, ZigBee offers low power consumption, low cost, and secure, reliable operation, so implementing a remote motor control and monitoring system with it gives a good cost-performance ratio.

This paper presents the implementation of a point-to-point wireless sensor-actor network to control the speed of a DC motor from a remote location. Pulse Width Modulation (PWM) provides the actual control. Speed feedback is set up using an infrared (IR) sensor whose output is fed directly to a PIC16F877 microcontroller. The conventional setup uses a frequency-to-voltage converter to find the speed, whereas here a new technique calculates the speed from the IR sensor output without one. The motor is controlled remotely with a ±5 rpm tolerance.
Keywords: Wireless Sensor Networks, ZigBee, Xbee, motor control  

INTRODUCTION

ZigBee is a new technology now being deployed for wireless sensor networks. A sensor network is an infrastructure of sensing, computing and communication elements that allows an administrator to instrument, observe and react to events and phenomena in a specified environment. In this paper, point-to-point communication is established over a ZigBee link for DC motor speed control.

ZIGBEE

ZigBee is a wireless protocol based on the IEEE 802.15.4 standard for wireless personal area networks (WPANs). It is designed for embedded applications requiring low data rates and low power consumption. ZigBee's key features are: (a) reliable and self-configuring, (b) supports a large number of nodes, (c) easy to deploy, (d) very long battery life, (e) secure, (f) low cost, (g) usable globally.


ZIGBEE RADIO

Many companies, such as Texas Instruments, Microchip, Atmel and Ember, produce ZigBee radios. In this paper the XBee, a ZigBee radio from MaxStream Inc. [6], is chosen. One of the main advantages of the XBee module is its simple UART (Universal Asynchronous Receiver/Transmitter) serial interface, which makes it ideal for communicating with a PC as well as a microcontroller. Any device with a UART interface can be connected directly to the XBee module. The module runs from a supply of between 2.8 and 3.4 V (typically 3.3 V), drawing 45 mA in transmit, 50 mA in receive and less than 10 µA in power-down.
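As a rough illustration of how simple the UART interface is, the following Arduino-style sketch pushes a set-speed message through a locally attached XBee left in its default transparent mode at 9600 baud (the message format is made up for the example, and the module's DIN/DOUT pins use 3.3 V logic):

void setup() {
  Serial.begin(9600);           // UART wired to the XBee DIN/DOUT pins
}

void loop() {
  int setSpeed = 1200;          // example set speed in rpm
  Serial.print("SPEED:");       // bytes go out of the local XBee...
  Serial.println(setSpeed);     // ...and appear on the remote module's UART
  delay(1000);
}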

EXPERIMENTAL SETUP

               NODE UNIT BLOCK DIAGRAM:



HOST UNIT BLOCK DIAGRAM

IR SENSOR UNIT:


FLOWCHART:       

This flowchart shows the operation of the host unit:

The flowchart below shows the operation of the node unit:


Posted by
SANTHOSH BHARADWAJ REDDY (my R&D project)

Zigbee code for Arduino software

Zigbee code using Arduino software


/*
*** CONFIGURATION ***

SENDER: (REMOTE SENSOR RADIO)
ATID3456 (PAN ID)
ATDH -> set to SH of partner radio
ATDL -> set to SL of partner radio
ATJV1 -> rejoin with coordinator on startup
ATD02 pin 0 in analog in mode with a photo resistor (don't forget the voltage divider circuit--resistor to ground is good)
ATD14 pin 1 in digital output (default low) mode with an LED from that pin to ground
ATIR64 sample rate 100 millisecs (hex 64)

* THE LOCAL RADIO _MUST_ BE IN API MODE *

RECEIVER: (LOCAL RADIO)
ATID3456 (PAN ID)
ATDH -> set to SH of partner radio
ATDL -> set to SL of partner radio

*/

#define VERSION "1.01"

int LED = 11;
int analogValue = 0;
int remoteIndicator = false; // keeps track of the desired remote on/off state
int lastRemoteIndicator = false; // record of prior remote state
unsigned long lastSent = 0; // records last time the remote was re-set to keep it in sync

void setup() {
  pinMode(LED, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // make sure everything we need is in the buffer
  if (Serial.available() >= 23) {
    // look for the start byte
    if (Serial.read() == 0x7E) {
      // read the variables that we're not using out of the buffer
      // (includes two more for the digital pin report)
      for (int i = 0; i < 20; i++) {
        byte discard = Serial.read();
      }
      int analogHigh = Serial.read();
      int analogLow = Serial.read();
      analogValue = analogLow + (analogHigh * 256);
    }
  }

  // darkness is too creepy for romance
  if (analogValue > 0 && analogValue <= 350) {
    digitalWrite(LED, LOW);
    remoteIndicator = false;
  }
  // medium light is the perfect mood for romance
  if (analogValue > 350 && analogValue <= 750) {
    digitalWrite(LED, HIGH);
    remoteIndicator = true;
  }
  // bright light kills the romantic mood
  if (analogValue > 750 && analogValue <= 1023) {
    digitalWrite(LED, LOW);
    remoteIndicator = false;
  }

  // set the indicator immediately when there's a state change
  if (remoteIndicator != lastRemoteIndicator) {
    if (remoteIndicator == false) setRemoteState(0x4);
    if (remoteIndicator == true) setRemoteState(0x5);
    lastRemoteIndicator = remoteIndicator;
  }

  // re-set the indicator occasionally in case it's out of sync
  if (millis() - lastSent > 10000) {
    if (remoteIndicator == false) setRemoteState(0x4);
    if (remoteIndicator == true) setRemoteState(0x5);
    lastSent = millis();
  }
}

void setRemoteState(int value) {  // pass either 0x4 or 0x5 to turn the remote pin off or on
  Serial.write(0x7E);             // start byte
  Serial.write((byte)0x0);        // high part of length (always zero)
  Serial.write(0x10);             // low part of length (the number of bytes that follow, not including checksum)
  Serial.write(0x17);             // 0x17 is a remote AT command
  Serial.write((byte)0x0);        // frame id set to zero for no reply
  // 64-bit address of the recipient, or 0x000000000000FFFF for broadcast
  Serial.write((byte)0x0);
  Serial.write((byte)0x0);
  Serial.write((byte)0x0);
  Serial.write((byte)0x0);
  Serial.write((byte)0x0);
  Serial.write((byte)0x0);
  Serial.write(0xFF);             // 0xFF for broadcast
  Serial.write(0xFF);             // 0xFF for broadcast
  // 16-bit address of the recipient, or 0xFFFE if unknown
  Serial.write(0xFF);
  Serial.write(0xFE);
  Serial.write(0x02);             // 0x02 to apply changes immediately on the remote
  // command name in ASCII characters
  Serial.write('D');
  Serial.write('1');
  // command data in as many bytes as needed
  Serial.write(value);
  // checksum is 0xFF minus the low byte of the sum of all bytes after the length bytes
  long sum = 0x17 + 0xFF + 0xFF + 0xFF + 0xFE + 0x02 + 'D' + '1' + value;
  Serial.write(0xFF - (sum & 0xFF));
  delay(10);                      // short pause to avoid overwhelming the serial port
}

Zigbee based Line Following robot


Visual Line Following

The following tutorial is one way to use a vision system to identify and follow a line.
The system uses:
  • single CCD camera
  • USB Digitizer
  • Zigbee module
  • Pentium CPU on board
  • Left, Right differential motors
The BucketBot

Example Images

To get a better idea of what we're trying to accomplish, let's first look at some sample pictures that the BucketBot took. The images are from the CCD camera mounted on the front of the BucketBot, angled towards the ground.

Note that we have not done any calibration or lighting adjustment. The images are straight from the camera.

It is worth mentioning that the lines are created with black electrical tape stuck to moveable floor tiles. This allows us to move the tiles around and experiment with shapes quite easily.


Example images: a diagonal line, two lines crossing, and an extreme curve.

Lighting Issues...

Bad lighting can really cause problems with most image analysis techniques (esp. when thresholding). Imagine if a robot were to suddenly move under a shadow and lose all control! The best vision techniques try to be more robust to lighting changes.

To understand some of these issues, let's look at two straight-line images from the previous section, each shown next to its histogram. The histogram of an image is a graphical representation of the count of the different pixel intensities in the image.


Edge Detection

In order to follow the line we need to extract properties from the image that we can use to steer the robot in the right direction. The next step is to identify or highlight the line with respect to the rest of the image. We can do this by detecting the transition from background tile to the line and then from the line to the background tile. This detection routine is known as edge detection.

The way we perform edge detection is to run the image through a convolution filter that is 'focused' on detecting line edges. A convolution filter is a matrix of numbers that are multiplied with the image pixels, summed, and then divided by a normalization factor to produce the resulting pixel value.
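To make the mechanics concrete, here is a minimal C++ sketch of a 3x3 convolution pass over a grayscale buffer; the Laplacian-style kernel is an assumption for illustration, not necessarily the exact matrix the vision software uses:

#include <vector>
#include <algorithm>
#include <cstdint>

std::vector<uint8_t> convolve3x3(const std::vector<uint8_t>& src, int w, int h) {
  const int k[3][3] = { {-1, -1, -1},
                        {-1,  8, -1},
                        {-1, -1, -1} };                  // edge-highlighting kernel (sums to 0)
  std::vector<uint8_t> dst(src.size(), 0);
  for (int y = 1; y < h - 1; ++y) {
    for (int x = 1; x < w - 1; ++x) {
      int sum = 0;
      for (int j = -1; j <= 1; ++j)
        for (int i = -1; i <= 1; ++i)
          sum += k[j + 1][i + 1] * src[(y + j) * w + (x + i)];
      dst[y * w + x] = (uint8_t)std::min(255, std::max(0, sum));  // clamp to 0-255
    }
  }
  return dst;
}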


Edge Detection

Here are two sample images with the convolution edge detection matrix run after normalization:


Original image with straight-line edges; original image with curve edges.
It is interesting to note that while the convolution filter does identify the line, it also picks up a lot of undesired speckles that add noise to the resulting image.

Also note that the detected lines are quite faint and sometimes even broken. This is largely due to the small (3x3) neighborhood that the convolution filter looks at. It is easy to see a very large difference between the line and the tile from a global point of view (as how you and I look at the images) but from the image pixel point of view it is not as easy. To get a better result we need to perform some modifications to our current operations.

Modified Line Detection

To better highlight the line we are going to:


1. Use a larger filter; instead of a 3x3 neighborhood we will use a 5x5. This will result in larger values for edges that are "thicker". We will use the following matrix:
-1 -1 -1 -1 -1
-1  0  0  0 -1
-1  0 16  0 -1
-1  0  0  0 -1
-1 -1 -1 -1 -1

2. To further reduce the speckling issue, we square the resulting pixel value. This makes large values much larger while small values remain small. The result is then normalized to fit into the 0-255 pixel value range.

3. Threshold the final image by removing any pixel with an intensity below 40.
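A rough C++ sketch of these three steps applied together (the function name and buffer layout are illustrative, not the vision package's API):

#include <vector>
#include <algorithm>
#include <cstdint>

// 1) run the 5x5 edge kernel, 2) square the response, 3) normalize and threshold at 40.
std::vector<uint8_t> detectLine(const std::vector<uint8_t>& src, int w, int h) {
  const int k[5][5] = { {-1,-1,-1,-1,-1},
                        {-1, 0, 0, 0,-1},
                        {-1, 0,16, 0,-1},
                        {-1, 0, 0, 0,-1},
                        {-1,-1,-1,-1,-1} };
  std::vector<long long> sq(src.size(), 0);
  long long maxSq = 1;
  for (int y = 2; y < h - 2; ++y)
    for (int x = 2; x < w - 2; ++x) {
      long long sum = 0;
      for (int j = -2; j <= 2; ++j)
        for (int i = -2; i <= 2; ++i)
          sum += k[j + 2][i + 2] * src[(y + j) * w + (x + i)];
      sq[y * w + x] = sum * sum;               // squaring keeps strong edges, suppresses speckles
      maxSq = std::max(maxSq, sq[y * w + x]);
    }
  std::vector<uint8_t> dst(src.size(), 0);
  for (size_t p = 0; p < sq.size(); ++p) {
    int v = (int)(sq[p] * 255 / maxSq);        // normalize back into the 0-255 range
    dst[p] = (v < 40) ? 0 : (uint8_t)v;        // drop anything below intensity 40
  }
  return dst;
}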


The results of this modified technique:


Old technique vs. new technique (shown for both example images).
We now see the line edges clearly and most of the noise speckles are gone. We can continue to the next step: understanding these images in order to map the results to left and right motor pulses.


Center of Gravity


There are many ways we could translate the resulting image intensities into right and left motor movements. A simple way would be to add up all the pixel values on the left side of the image and compare the result to the right side; whichever side has the larger sum is the side the robot would move towards.
At this point, however, we will use the COG or Center of Gravity of the image to help guide our robot.
The COG of an object is the location where one could balance the object using just one finger. In image terms it is where one would balance all the white pixels at a single spot. The COG is quick and easy to calculate and will change based on the object's shape or position in an image.

To calculate the COG of an image, add up all the x,y locations of the non-black pixels and divide by the number of pixels counted. The resulting two numbers (one for x and the other for y) are the COG location.
Let's do that for a couple images and see what we get!
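A minimal C++ sketch of that calculation over a thresholded grayscale buffer (the struct and names are illustrative):

#include <vector>
#include <cstdint>

struct Cog { int x; int y; bool found; };

// Average the x,y coordinates of every non-black pixel.
Cog centerOfGravity(const std::vector<uint8_t>& img, int w, int h) {
  long sumX = 0, sumY = 0, count = 0;
  for (int y = 0; y < h; ++y)
    for (int x = 0; x < w; ++x)
      if (img[y * w + x] > 0) { sumX += x; sumY += y; ++count; }
  if (count == 0) return { 0, 0, false };              // no line pixels in view
  return { (int)(sumX / count), (int)(sumY / count), true };
}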


Center of Gravity

The COG location is the red square with a green line from the center of the image to the COG location.

Straight line with COG; curve with COG.
Based on the location of the COG we can now apply the following conditions to the robot motors:

  • When the COG is to the right of the center of screen, turn on the left motor for a bit.
  • When the COG is on the left, turn on the right motor.
  • When the COG is below center, apply reverse power to opposite motor to pivot the robot.
You can also try other conditions; for example, when the distance from the COG to the center of the screen is large, increase the motor values so the robot catches up to the line more quickly.
Let's have a look at some more results ...
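Here is a hedged C++ sketch of those steering rules; the motor interface, thresholds and power levels are placeholders, not the BucketBot's actual drive code:

#include <cstdlib>

void steerFromCog(int cogX, int cogY, int imgW, int imgH,
                  int &leftMotor, int &rightMotor) {
  int offset = cogX - imgW / 2;                             // > 0 means the COG is right of center
  int power  = (std::abs(offset) > imgW / 4) ? 255 : 180;   // push harder when far off the line
  leftMotor = rightMotor = 0;
  if (offset > 10) {                                        // COG to the right: turn on the left motor
    leftMotor = power;
    if (cogY > imgH / 2) rightMotor = -power;               // COG below center: reverse the opposite motor to pivot
  } else if (offset < -10) {                                // COG to the left: turn on the right motor
    rightMotor = power;
    if (cogY > imgH / 2) leftMotor = -power;
  } else {
    leftMotor = rightMotor = power;                         // roughly centered: drive straight
  }
}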





PROJECT CODE:


' initialize starting servo values
pan = GetVariable("PAN_SERVO")
tilt = GetVariable("TILT_SERVO")
steering = GetVariable("STR_SERVO")
throttle = GetVariable("THR_SERVO")

' get the size (width or height) of the current bounding box
size = GetVariable("COG_BOX_SIZE")

' if it is equal to "" then no object was detected
if size <> "" then

  ' get the horizontal center of gravity
  cogX = GetVariable("COG_X")

  ' pan left
  if cogX < 140 then
    pan = pan - 2
  end if

  ' pan right
  if cogX > 180 then
    pan = pan + 2
  end if

  ' get the vertical center of gravity
  cogY = GetVariable("COG_Y")

  ' tilt down
  if cogY < 100 then
    tilt = tilt - 2
  end if

  ' tilt up
  if cogY > 140 then
    tilt = tilt + 2
  end if

  ' steer where it looks
  ' adjust 300 accordingly; 300 is due to the steering servo being offset
  steering = 300 - pan

  ' get the distance variable to control the movement
  cogSize = GetVariable("COG_BOX_SIZE")

  ' forward
  if cogSize < 30 then
    throttle = 130
  ' reverse
  elseif cogSize > 45 then
    throttle = 160
  else
    throttle = 150
  end if

  SetVariable "STR_SERVO", steering
  SetVariable "THR_SERVO", throttle
  SetVariable "PAN_SERVO", pan
  SetVariable "TILT_SERVO", tilt

end if






Zigbee based Obstacle Avoidance Robot


Obstacle Avoidance

Obstacle avoidance is one of the most important aspects of mobile robotics. Without it, robot movement would be very restricted and fragile. This tutorial explains several ways to accomplish obstacle avoidance within the home environment. Given your own robot, you can experiment with the provided techniques to see which one works best.
There are many techniques that can be used for obstacle avoidance. The best technique for you will depend on your specific environment and what equipment you have available. We will first start with simpler techniques that are easy to get running and can be experimented on to improve their quality based on your environment.
Let's get started by first looking at an indoor scene that a mobile robot may encounter.
Robot View
Here the robot is placed on the carpet and faced with a couple of obstacles. The following algorithms will refer to aspects of this image and exploit attributes that are common in obstacle avoidance scenarios. For example, the ground plane assumption states that the robot is placed on relatively flat ground (i.e. no offroading for these robots!) and that the camera is looking relatively straight ahead or slightly down (but not up towards the ceiling).
By looking at this image we can see that the carpet is more or less a single color with the obstacles being different in many ways than the ground plane (or carpet).

Edge Based Technique

The first technique that exploits these differences uses an edge detector like Canny to produce an edge-only version of the previous image. Using this module we get an image that looks like:
Edges Detected
You can see that the obstacles are somewhat outlined by the edge detection routine. This helps to identify the objects but still does not give us a correct bearing on what direction to go in order to avoid the obstacles.
The next step is to understand which obstacles would be hit first if the robot moved forward. To start this process we use the Side_Fill module to fill in the empty space at the bottom of the image as long as an edge is not encountered. This works by starting at the bottom of the image and proceeding vertically, pixel by pixel, filling each black pixel until a non-black pixel is seen. The filling then stops in that vertical column and proceeds with the next.
Filled From Below
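A minimal C++ sketch of that column-by-column fill (the buffer layout and names are illustrative, not the actual Side_Fill implementation):

#include <vector>
#include <cstdint>

// Walk up each column from the bottom of the edge image and mark free space (white)
// until the first edge pixel is hit.
std::vector<uint8_t> sideFill(const std::vector<uint8_t>& edges, int w, int h) {
  std::vector<uint8_t> fill(edges.size(), 0);
  for (int x = 0; x < w; ++x) {
    for (int y = h - 1; y >= 0; --y) {          // start at the bottom of the image
      if (edges[y * w + x] != 0) break;         // stop this column at the first edge
      fill[y * w + x] = 255;                    // otherwise the pixel is reachable floor
    }
  }
  return fill;
}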
You will quickly notice the single-width vertical lines that appear in the image. These are caused by holes where the edge detection routine failed. As they specify potential paths that are too thin for almost any robot, we want to remove them as candidates for available robot paths. We do this by using the Erode module, eroding (shrinking) the current image horizontally by an amount large enough that the resulting white areas are wide enough for the robot to pass without hitting any obstacle. We chose a horizontal value of 20.
Horizontal Eroded
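A rough C++ sketch of the horizontal erode (the radius parameter plays the role of the horizontal value of 20 mentioned above; names are illustrative):

#include <vector>
#include <cstdint>

// Keep a white pixel only if the 'radius' pixels to its left and right are also white,
// so gaps narrower than the robot disappear.
std::vector<uint8_t> erodeHorizontal(const std::vector<uint8_t>& img,
                                     int w, int h, int radius) {
  std::vector<uint8_t> out(img.size(), 0);
  for (int y = 0; y < h; ++y) {
    for (int x = radius; x < w - radius; ++x) {
      bool keep = true;
      for (int i = -radius; i <= radius && keep; ++i)
        if (img[y * w + x + i] == 0) keep = false;
      if (keep) out[y * w + x] = 255;
    }
  }
  return out;
}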
Now that we have all potential paths we smooth the entire structure to ensure that any point picked as the goal direction is in the middle of a potential path. This is based on the assumption that it is easier to understand the highest part or peak of a mountain as compared to a flat plateau. Using the Smooth Hull module we can round out flat plateaus to give us better peaks.
Smoothed Outline
Once this is done we need to identify the highest point in this structure, which represents the most distant goal the robot could head towards without hitting an obstacle. Based on the X location of this point with respect to the center of the screen, you would then decide whether your robot should move left, straight, or right to reach that goal point. To identify that location we use the Point Location module and request the Highest point, which is identified by a red square.
Final Goal Point
Finally just for viewing purposes we merge the current point back into the original image to help us gauge if that location appears to be a reasonable result.
Given this point's X location at 193 and the middle of the image at 160 (the camera is set to 320x240) we will probably move the robot straight. If the X value were > 220 or < 100 we would probably steer the robot to the right or left instead.
Some other results using this technique.



This works reasonably well as long as the floor is a single color. But this is not the only way to recognize the floor plane ...

Blob Based Technique

The second technique exploits the fact that the floor is a single large object. Starting with the original image, we segment the image into a smaller number of colors in order to connect pixels into blobs that we can process. This grouping can use either the Flood Fill module or the Segment Colors module. Using the Flood Fill module the image becomes:
Flood Fill
The next step is to isolate the largest blob in the image which is assumed to be the floor. This is done using the Blob Size module which is set to just return the single largest blob in the image.
Largest Blob
We then dilate this image by 2 pixels using the Dilate module to close all the small holes in the floor blob.
Dilated
Then we negate this image and use the same Side Fill module as before to determine the possible vertical routes the robot could take. We need to negate the image prior to this module as the Side_Fill module only fills black pixels. In the above image the object to be filled is white and thus when negated will become black.
Negated & Side Fill
From here on the stages are the same as the previous technique. Namely Erode to remove small pathways, smooth the resulting object and identify the top most point. The final image looks similar to the previous technique.
The results are very similar, but the first technique exploited edges whereas this one exploited connected pixels of similar color. The assumption of a similarly colored floor plane still remains, though. What happens if you do not have a uniformly colored carpet? For example, suppose the carpet has a high-frequency pattern.
The resulting edge and blob based techniques will not work as the blob and edge detection will pick up on the small patterns of the carpet and incorrectly see them as obstacles.
Blob technique; edge technique.
You can see the failure of both techniques in the above images, which, if fully processed, would find non-obstacle space only in the lower 10 pixels of the image. This is clearly incorrect!

The Floor Finder module handles this case: it allows a single technique to be used both for a similarly colored carpet and for one that has a lot of small internal patterns.

Notes:
1. Three techniques were discussed, with the final technique being the most stable given the test images we used. Your experience may differ. It is worth testing each technique on your own images to get a sense of how stable they are and which one will work best for you.
2. Keep in mind when moving the code into a robotic control that the horizontal erode is used as a gauge to the robot's width. You will have to experiment with your setup to determine which erosion width is best for your robot.
3. The final variable displayed holds the X value of the goal point. That X value should then be used by your application, or within the VBScript module, to create values that will control your servos. It could be as simple as:
GetVariable("HIGHEST_MIDDLE_X")
midx GetVariable("IMAGE_WIDTH") / 2
leftThreshold midx - 50
rightThreshold midx + 50

if x <leftThreshold then
  SetVariable "left_motor"0
  SetVariable "right_motor"255
else
  if x >rightThreshold then
    SetVariable "left_motor"255
    SetVariable "right_motor"0
  else
    SetVariable "left_motor"255
    SetVariable "right_motor"255
  end if
end if
which will rotate the left and right motors towards the goal, or go straight if the goal is in the middle. Keep in mind that the above only creates two variables (left_motor and right_motor) that need to be selected in the appropriate hardware control module. You will have to adjust the values for your robot based on what servos, motors, etc. you are using.