Ever since I posted this back in January I’ve been collecting ideas and information on how to make something like the craft pictured in the video and related ones.
The tentative parts list can be found here; so far it includes an analog accelerometer, an ESC + brushless motor combo, and a battery array.
As for wireless, that’s the one area of this project I’ve done no research on at all. I’m probably going to use a long-range Bluetooth serial connection or an XBee serial connection. No matter what, it’s going to be serial, as that’s what I’m most familiar with.
So far I’ve prototyped the controller and written the framework for the Visual Basic program, along with some of the Arduino side of the controller code.
Here’s a video of what I’ve done so far. As you can see, the trackbar visualizes the readings really nicely, and I’m using the Split function and arrays to separate the X and Y resistance values coming from the joystick:
http://youtu.be/qTN9DqKTL9M
As you can probably also tell, there’s no name for the project yet; if you think of something, let me know!
I should preface this by saying that it is nowhere near as polished as it could be. But it works, and the objects captured by the camera are recognizable. Eventually, if I have the funding, I will upgrade the camera.
There are many elements to this project and no obvious place to start, so we’ll start with the Fritzing schematic and a list of the materials needed.
You will need:
2 ATtiny85s
16 wires
a breadboard
a 100k ohm resistor
a 330 ohm resistor
a Raspberry Pi
a PIR motion detector
the cables for interfacing with the Raspberry Pi
an ISP programmer for the ATtinys
So let’s look at this schematic:
When motion is detected, the signal pin on the motion detector gets pulled low. At this point the 3.3V ATtiny pulls pin 1 high, sending a signal into the RPi (it’s 3.3V, so it’s safe for the Broadcom chip) and illuminating an LED so the user can see that motion is being detected.
The Raspberry Pi then sends a signal to the 5V ATtiny indicating that motion has indeed been detected. Once this occurs, the 5V ATtiny pulls the 5V line going to the webcam high. I had to have the ATtiny switch the webcam on like this because a hardware misconfiguration causes the software I’m using to take the picture to hang. Turning the camera off and on each time a picture needs to be taken gets rid of this problem, because the camera always takes the first picture after being turned on.
At this point a picture is taken and moved to a flash drive. The pictures could be moved wherever, but a flash drive works best for now, as I will be deploying this system in a place where there isn’t internet (my garage).
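For anyone curious what the Raspberry Pi side of that loop could look like, here is a rough Python sketch of the flow described above. The pin numbers, file paths, and the camera warm-up delay are placeholders for my wiring, so treat it as an outline rather than the exact script running on my Pi:

# Rough outline of the RPi side of the PiScanner loop.
# Pin numbers and paths are placeholders; adjust them for your own wiring.
import time
import shutil
import subprocess

import RPi.GPIO as GPIO

MOTION_PIN = 11   # input from the 3.3V ATtiny (goes high on motion)
CAMERA_PIN = 13   # output to the 5V ATtiny that powers the webcam
CONF_PATH = "/home/pi/picture.conf"   # fswebcam config file
IMAGE_PATH = "/home/pi/picture.jpg"   # must match the save path in picture.conf
FLASH_DRIVE = "/media/usb"            # mount point of the flash drive

GPIO.setmode(GPIO.BOARD)
GPIO.setup(MOTION_PIN, GPIO.IN)
GPIO.setup(CAMERA_PIN, GPIO.OUT, initial=GPIO.LOW)

try:
    while True:
        if GPIO.input(MOTION_PIN):                # motion reported by the 3.3V ATtiny
            GPIO.output(CAMERA_PIN, GPIO.HIGH)    # ask the 5V ATtiny to power the webcam
            time.sleep(2)                         # give the camera a moment to come up
            subprocess.call(["fswebcam", "-c", CONF_PATH])
            GPIO.output(CAMERA_PIN, GPIO.LOW)     # power the webcam back down
            dest = "%s/capture-%d.jpg" % (FLASH_DRIVE, int(time.time()))
            shutil.move(IMAGE_PATH, dest)         # timestamp the filename so shots don't overwrite
        time.sleep(0.1)
finally:
    GPIO.cleanup()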
I will need to illuminate the “subjects” that I will be capturing. In order to do this, I will eventually need to set some pin high. Whether it sets off a camera flash or turns on some lights for a second, it will need to happen down the line.
Like all of my “research” I mostly googled around / plugged in code until something worked. I came back with these links:
4. The devs included a great install script with this package; run it to install with:
$ sudo python setup.py install
Now you should see a bunch of text in the command line. I have no idea why, but my first run of this command didn’t “take”; I ran it again and now it works great.
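If you want to double-check that the install actually took, a quick test from the Python interpreter is enough; if the import succeeds and prints a version number, the library is in place:

# Sanity check: no ImportError here means RPi.GPIO installed correctly.
import RPi.GPIO as GPIO
print(GPIO.VERSION)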
To use this, you need to know which header pins correspond to which GPIO pins on the RPi. You can Google a pinout diagram yourself.
Now we get to writing code. I’m using a graphical Python editor called Geany, which comes pre-loaded on the Squeeze image.
To blink pins 11 and 13, use and run this python script.
You can see the plaintext version of that script here.
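If that link ever goes dead, a minimal version of that kind of blink script looks like this (BOARD pin numbering, half-second blink; tweak the timing to taste):

# Minimal blink script using the RPi.GPIO library.
# Pins 11 and 13 are physical header pins (BOARD numbering).
import time
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BOARD)
GPIO.setup(11, GPIO.OUT)
GPIO.setup(13, GPIO.OUT)

try:
    while True:
        GPIO.output(11, GPIO.HIGH)
        GPIO.output(13, GPIO.LOW)
        time.sleep(0.5)
        GPIO.output(11, GPIO.LOW)
        GPIO.output(13, GPIO.HIGH)
        time.sleep(0.5)
finally:
    GPIO.cleanup()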
Essentially I’m trying to mimic things I’ve been doing with an Arduino for some time.
Maybe later today (if I can somehow find a 100 ohm resistor) I’ll work on using inputs. I have a PIR motion sensor with me, but I neglected to bring the proper resistor to use it. I do have tact switches and resistors for those, which I could use to mimic the motion detector, but I don’t think that would be as cool.
that the program can execute. You can run the program with the above configuration by typing:
fswebcam -c /home/user/Pictures/picture.conf
(The .conf file lives wherever you put it, so the path after the -c will be different for you.)
This will produce an image from your webcam in the location you specified in the picture.conf file.
I installed all of this on my server and got the below image. This is what the laptop running this website sees!
2. Write a Python script that can sense whether or not this sensor’s alarm pin is pulled low. (I’ve decided to go with Python because I’ve never used it before, so it will be good to learn something new, and because there is already a library for controlling the GPIO pins on the RPi.) At that point, the script should execute the above bash command and take a picture. After that, the photo should be moved somewhere to do something, but I don’t know what that will be yet. It will probably be uploaded to a secure location on my server, and then I’ll have it email me if something shows up at the end of the night.
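The skeleton of that script will probably look something like this (polling rather than interrupts, with a placeholder pin number, since nothing is wired up yet):

# Planned skeleton: poll the PIR alarm pin and run fswebcam when it drops low.
# The pin number and .conf path are placeholders.
import time
import subprocess

import RPi.GPIO as GPIO

PIR_PIN = 11   # physical pin wired to the PIR's alarm output

GPIO.setmode(GPIO.BOARD)
GPIO.setup(PIR_PIN, GPIO.IN)

try:
    while True:
        if GPIO.input(PIR_PIN) == GPIO.LOW:   # alarm pin pulled low means motion
            subprocess.call(["fswebcam", "-c", "/home/user/Pictures/picture.conf"])
            time.sleep(5)                      # crude debounce so it doesn't spam pictures
        time.sleep(0.1)
finally:
    GPIO.cleanup()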
//If you just want to see the PiScanner Progress, skip the next 2 blocks
So, you may have noticed an absence of recent posts on this website. This is due to a few factors, but the main one is a lack of focus. Not a lack of focus in a multitasking sense, but in a more literal sense: I’ve got foggy fragments of ideas, but no concrete, feasible ones. I think that the introduction of the “Project Execution” prefix in the title of this post (and hopefully all of my other project-based posts) will keep me “responsible” in regard to the things I try to undertake.
There will be 5 steps for each project.
Declaration
Research
Documentation
Report: feasibility (this is where we will learn if the project is possible or not, and if not, why), all of the code, and lots of pictures
Demonstrate / Distribute
Now to PiScanner
After receiving my Raspberry Pi in the mail recently, I’ve got a whole new area of more powerful computer science to enter into, and I think the PiScanner is the best way to do that.
The PiScanner will be a program that identifies motion using a PIR detector in conjunction with the GPIO pins, captures an image using a webcam or another camera compatible with the ports on the board, and then uploads that image to a server to be viewed via RSS.
This will be great for my backyard at night (it’s like Animal Planet out there this time of year), and it will teach me a bunch of new things along the way.
Hello! Sorry about my absence; I’ve been very busy with school and baseball over the past few weeks, but summer is soon, and I’ll be updating more frequently once that happens.
So, you’ll need a few things in order to make sure this whole process works.
1. Multiple ethernet connections.
2. An Arduino and a compatible WIZnet device (an Ethernet shield)
Here I’m using an Arduino knockoff I got two years ago when I first got into hobby electronics, and an “ETHERNET 4 NANO” by Gravitech. I didn’t want to permanently tie up my Arduino Nano, but I still wanted to be able to do the project. It’s powered by 5V from my computer (well, from a powered hub), the brown striped wire goes to the + on the PowerSwitch Tail II, and the blue cable is Ethernet.
3. A web server with PHP installed on it. (Mine’s just LAMP’d.)
This is the important part. The Arduino is accessing a .php document on the server to tell whether or not the lamp should be on. The UI is also hosted from the server. That laptop is my old HP laptop that I’ve had for almost six years; it’s now running Ubuntu 10.04 LTS and has been LAMP’d, among other things.
4. Something that can SAFELY switch line voltage. (I’m using a PowerSwitch Tail II because I really don’t want to get killed, and it’s also very simple. If you’re using bare relays, please, for the love of god, be careful with line voltage.)
This thing is a beast. I can drive it easily with 5V, and it switches line voltage like a champ. It’s been in constant use since my last post about home automation.
START OF GUIDE
1. So first, we need to hook up the shield, the Arduino, and the PSTII. It’s really easy.
2. Get Linux onto the computer you want to run the server from. I made a video a long time ago back when I was first getting into Linux; the installation process is still the same, and you can find that video here.
3. Install various programs on your server. First things first, you have to install OpenSSH by running the command:
sudo apt-get install openssh-server
This lets you run the server headless and access it from the command line, specifically through PuTTY on Windows. Being able to SSH into your server is key. You then need to install the LAMP stack by running the command:
sudo apt-get install lamp-server^
(The caret at the end is part of the command.) This will install PHP on your server, as well as Apache and MySQL. MySQL isn’t strictly essential for this project (Apache is what actually serves the PHP pages), but if you are going to use the server for other things, I strongly recommend installing the full stack. The install process is very easy; just make sure you WRITE EVERY USERNAME AND PASSWORD YOU USE DOWN. I’ve had to reinstall several times before I learned my lesson on this.
After you are done with that, install vim by running the command:
sudo apt-get install vim
Vim is a text editor for Linux. You also need Pure-FTPd to upload and download files from your server. Get it by running the command:
sudo apt-get install pure-ftpd
Congrats! You are now the owner of a very powerful Linux server. Wasn’t that easy?
4. Upload the call.php and res.php documents to your server.
This is the call.php text; you just need to save it as a .php document and upload it to the server. This document creates the form the user accesses to turn the lamp on and off.
This is res.php. It is the page the user is taken to once they have submitted the form in call.php.
If you walk through the code, you will see that once the correct password is entered, res.php writes out a working stat.php document. This is the same document that the Arduino reads and compares against.
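If you want to sanity-check the server side before the Arduino gets involved, a short Python snippet can fetch stat.php and show you exactly the string the Arduino will see. The URL and the “on”/“off” values below are placeholders; match them to whatever your copy of res.php actually writes out:

# Server-side sanity check: fetch stat.php and print what the Arduino would read.
# The URL and the expected strings are placeholders for your own setup.
from urllib.request import urlopen

STATUS_URL = "http://192.168.1.10/stat.php"

status = urlopen(STATUS_URL).read().decode().strip()
print("stat.php says: %r" % status)

if status == "on":
    print("The Arduino should switch the lamp ON.")
else:
    print("The Arduino should leave the lamp OFF.")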
Here is the Arduino sketch. Upload it to your board with the shield attached, and everything should work. Access call.php on your server and go through the process.
At this point everything should work; please let me know in the comments if it doesn’t.
This is a roundabout way of using a servo to move an analog sensor to three points to triangulate the servo angle where the photocell sees the most light.
The servo sweeps back and forth and then another servo points to where the brightest point was.
The wiring is very simple, but the code is a little fancy. It’s all commented up here, but as always, deduction is your friend.
Anyway! I’ve set aside my home automation project and picked up a few of these shift registers: the 74HC595 by TI, seen here:
As seen in the video above, I’m controlling 14 LEDs using two of the 74HC595s and 3 pins on an Arduino Uno.
The program uses a for loop to cycle through the 14 pins on the registers. That code can be found here. Thanks for reading; check back soon for updates on the home automation project.
Watch the video above for context. In summary, though, what we have is basically an Arduino that fetches a string from a PHP file on my server, interprets it, and compares it to a predetermined char array.
In the future, this WILL have a GUI that can be controlled from a website, so the lamp can be switched from any source, including mobile (maybe even Twitter!).
Source-wise, the code isn’t very well commented, but the Arduino sketch is here and the PHP is here. A word of warning: unless you want me to be able to control a legion of lamps across the world, I would suggest changing the IP addresses and file paths on the Arduino side.
On another side note, all of my sources for the stuff I write about on this blog are here.