Thursday, August 9, 2018

Electronic Musical Toy

There is an article on the makezine.com website here written by Gareth Branwyn. It tells of a project built by Martin Hertig. It is essentially a one-octave electronic musical instrument that Mr. Hertig built to be installed backstage at the concert venue Jugendkulturhaus Dynamo. I have a young grandson and I thought he might enjoy playing with something like that, so I decided to build one. Here is my version, standing on the shoulders of Martin Hertig.

As in the original, rather than buttons, I wanted it to be touch sensitive. Here, as has happened many times in the past, Adafruit came to my rescue with the "Adafruit 12-key Capacitive Touch Sensor Breakout - MPR121" that can be found here. It is a nifty little device that can sense touch on twelve different lines, with a multitude of adjustments (of course that can be good and bad) to control how it acts. The device communicates with little computers via I2C (Inter-Integrated Circuit), a protocol supported by lots of devices, including the Arduino, the Raspberry Pi, etc.

My thinking was this. I wanted a full chromatic octave, twelve notes, plus some non-chromatic percussion: drums and so forth. I also wanted to be able to shift the twelve notes up or down by an octave. Finally, I wanted to be able to change the instrument whose sound the device emulated. All of that meant I would need two of the 12-key touch sensors.

I also needed a device that could generate the sounds of the notes and different instruments. Yet again, Adafruit bailed me out, this time with the "Adafruit "Music Maker" MP3 Shield for Arduino w/3W Stereo Amp - v1.0" found here. This item, among lots of other functionality, has a MIDI synthesizer (and I know MIDI from my pipe organ project) as well as a 3-watt amplifier that can be connected directly to speakers. Further, it is an Arduino shield, so it plugs right into an Arduino Uno, which of course speaks I2C. Sometimes things just work out.
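Just to make the layout concrete, here is a minimal sketch of the touch-to-note mapping, written in Python purely for illustration (the toy itself runs as an Arduino sketch). The pad numbering, base note, and drum assignments here are my assumptions, not necessarily what ended up in the finished toy.

    # Sketch of the touch-to-MIDI mapping (illustrative assumptions only).
    BASE_NOTE = 60                              # MIDI note number for middle C
    GM_DRUMS = {0: 36, 1: 38, 2: 42, 3: 49}     # kick, snare, closed hi-hat, crash
    DRUM_CHANNEL = 9                            # "channel 10" in MIDI's 1-based numbering

    def note_for_pad(pad, octave_shift):
        """Pads 0-11 on the first sensor form one chromatic octave, C through B."""
        return BASE_NOTE + pad + 12 * octave_shift

    def drum_for_pad(pad):
        """Pads on the second sensor trigger General MIDI percussion notes."""
        return GM_DRUMS.get(pad)

    print(note_for_pad(4, 1))                   # 76: E, one octave above the base octave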

And here it is. The knobs at the top are the percussion. The twelve handles are one chromatic octave, arranged like a piano keyboard. The two knobs on the left shift the notes up or down one octave. And the knob on the right changes the instrument that is being emulated.

This is what it looks like with the back removed. In the lower right is the Arduino Uno with the Music Maker shield on top. Next to it are the capacitive touch sensors. Beyond that, it is just speakers and wire.

I posted a video on YouTube here in case you want to see it play.

Saturday, May 26, 2018

Indirect Ophthalmoscopy Using a Raspberry Pi -- Part 2

As I discussed in my earlier post on this topic, my brother and I are attempting to build an inexpensive, easy-to-operate fundus camera that can be used to screen for diabetic retinopathy. Also as I said earlier, we didn't have much luck with a device I built by stacking up a Raspberry Pi camera, a Raspberry Pi, and a 7" touchscreen. Therefore we took a new approach.

Optical bench showing various adjustments
I built an optical bench that allowed many adjustments of the distances and alignments of the components. It held the camera, the LEDs to illuminate the eye, the 20-diopter condensing lens, and the patient's head. The Raspberry Pi and some other components were attached to one or another of these devices via cables. Let me talk about these things in turn.

Camera with adjustable focus lens and LED board ahead of it
Camera -- We did not actually use the official Raspberry Pi camera. The official camera has a fixed lens. That is, it is not meant to have its focus changed from its factory setting, which is more or less at infinity. We wanted to be able to focus more effectively on nearby objects. The lens can actually be turned using a pair of tweezers, and thus the focus can be brought nearer, but this is a fiddly process and not very convenient. Instead we used a camera sold by UCTronics. It uses the same sensor as version 2 of the Pi Camera but has a lens that can easily be focused manually. I purchased an extra-long camera cable from Adafruit, allowing me to mount the camera on the bench and still reach the Pi.

Board containing prototype bi-color LEDs
Board containing conventional LEDs
LEDs -- I built two versions of the board containing the LEDs. One contained two of the bi-color prototype LEDs (white and infrared) that I mentioned last time, and one contained two each of conventional through-hole white and infrared LEDs. Through experimentation we found that a single LED of each type was able to provide sufficient light. The reason we decided to have two, at right angles to each other, was an attempt to overcome specular reflection from the various optical components. The highlights from these LEDs can potentially hide important information in the image, so we felt it was important to deal with them. More about that later. A ribbon cable runs from the LED board to a breadboard that contains the circuitry that drives the LEDs based on control signals from the Pi. There are also two potentiometers that can be used to control the brightness of the LEDs.
Infrared and White LED brightness controls
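Since the brightness itself is set in hardware by the two potentiometers, switching the LEDs from the Pi only takes a couple of GPIO output lines. Here is a minimal sketch; the BCM pin numbers are assumptions and would have to match the driver circuit on the breadboard.

    # Minimal sketch of the LED control lines; pin numbers are assumptions.
    import RPi.GPIO as GPIO

    IR_PIN, WHITE_PIN = 17, 27        # assumed BCM pins wired to the driver circuit

    GPIO.setmode(GPIO.BCM)
    GPIO.setup([IR_PIN, WHITE_PIN], GPIO.OUT, initial=GPIO.LOW)

    GPIO.output(IR_PIN, GPIO.HIGH)    # IR on so the camera can see in the dark
    GPIO.output(WHITE_PIN, GPIO.LOW)  # white LED stays off until the exposure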

Condensing Lens -- This lens provides the primary magnification for the image of the retina. Lenses of other focal lengths are sometimes used, but 20 diopters seems to be the most common for this type of screening test.

Chin and head rest as well as the condensing lens
Patient Fixture -- This is simply an adjustable chin rest and head rest meant to hold the patient's head in a fixed position.

Raspberry Pi --

Hardware -- We used a Pi 3 Model B Version 2. Using GPIO pins, it is connected to the circuit board that drives the LEDs mentioned above. It is also connected via USB cable to an Arduino. We wanted the ability to control a number of Pi Camera parameters on the fly (see Software below). We did this by using four potentiometers as voltage dividers, reading the voltages with the analog-to-digital converters on the Arduino, and then passing that information to the Pi over the USB connection.
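On the Pi side, reading those values is just a matter of parsing whatever the Arduino prints over the USB serial link. Here is a rough sketch, assuming the Arduino sends one comma-separated line per reading (for example "55,0,0,400" for brightness, contrast, sharpness, and ISO); the port name and message format are assumptions, not the exact protocol we used.

    # Sketch of the Pi-side serial reader; port name and format are assumptions.
    import serial

    ser = serial.Serial('/dev/ttyACM0', 9600, timeout=1)

    def read_settings():
        """Return (brightness, contrast, sharpness, iso) or None if no complete line."""
        line = ser.readline().decode('ascii', errors='ignore').strip()
        parts = line.split(',')
        if len(parts) != 4:
            return None
        try:
            return tuple(int(p) for p in parts)
        except ValueError:
            return None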

Brightness, Contrast, Sharpness and ISO camera controls
Software -- The Pi is programmed in Python. The program starts by setting up a camera preview (using the Python camera interface) and then enters a tight loop, reading the desired settings for brightness, contrast, sharpness, and ISO from the Arduino based on the positions of the potentiometers. It sets the given parameters in the camera software and then loops back to read the information from the Arduino again. There is a normally open button connected to a GPIO pin on the Pi that works as a shutter button. When the button is pushed, an interrupt is triggered on the Pi. The Pi takes a picture, turns off the preview, and displays the picture in a browser window on the Pi. When the shutter button is pushed again, the camera preview is re-enabled and the process starts over. The pictures are numbered consecutively so they are not overwritten. Using hardcoded variables in the Python program, the behavior of the LEDs can be controlled. For instance, the eye can be illuminated in IR during the preview but have the white LED flash as the picture is taken. Or the eye can be illuminated in white light during both the preview and the exposure. Or it can be illuminated in IR continuously. Or no illumination can be provided at all.
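To give a feel for the structure, here is a stripped-down sketch of that loop using the picamera and RPi.GPIO libraries. The pin numbers and file naming are assumptions, read_settings() is a stand-in for the serial reader sketched above, and only one LED mode (IR during the preview, white flash at the exposure) is shown; the real program also stops the preview and displays the captured image.

    # Stripped-down sketch of the preview / settings / shutter loop.
    # Pin numbers and paths are assumptions made for illustration.
    import time
    import RPi.GPIO as GPIO
    from picamera import PiCamera

    SHUTTER_PIN, IR_PIN, WHITE_PIN = 22, 17, 27     # assumed BCM pin numbers
    state = {'shutter': False, 'count': 0}

    def on_shutter(channel):
        state['shutter'] = True                     # set from the GPIO interrupt

    def read_settings():
        return None                                 # stand-in for the serial reader above

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(SHUTTER_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
    GPIO.setup([IR_PIN, WHITE_PIN], GPIO.OUT, initial=GPIO.LOW)
    GPIO.add_event_detect(SHUTTER_PIN, GPIO.FALLING,
                          callback=on_shutter, bouncetime=300)

    camera = PiCamera()
    camera.start_preview()
    GPIO.output(IR_PIN, GPIO.HIGH)                  # focus under IR illumination

    try:
        while True:
            settings = read_settings()              # pot positions from the Arduino
            if settings:
                (camera.brightness, camera.contrast,
                 camera.sharpness, camera.iso) = settings
            if state['shutter']:
                state['shutter'] = False
                state['count'] += 1
                GPIO.output(WHITE_PIN, GPIO.HIGH)   # brief white flash for the exposure
                camera.capture('retina_%03d.jpg' % state['count'])
                GPIO.output(WHITE_PIN, GPIO.LOW)
            time.sleep(0.05)
    finally:
        GPIO.cleanup()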

Here is an example of a picture we took using our device. The optic nerve is the yellow circular area in the upper left of the picture. This picture is not perfect, of course. You can see evidence of the specular reflection I mentioned earlier. We are, however, pleased with the resolution and the field of view. Also, we still haven't been able to achieve sharp focus using the IR LEDs, so it should be noted that this photo was taken with a chemically dilated pupil.

Next time I'll talk about what we're doing about that.

Oh, and one other thing. There is a group that has made excellent progress on this type of device. If you're interested in the topic, be sure to look here.

Sunday, April 29, 2018

Indirect Ophthalmoscopy using a Raspberry Pi -- Part 1

Periodically your ophthalmologist will want to perform a dilated retinal exam on you. He or she will put a drop in each eye to dilate the pupils, wait about twenty minutes and then use a lens and light to examine your retina. The retina is the light sensitive tissue on the inside of the eye. Generally this is a screening exam for, among other things, diabetic retinopathy. Retinopathy is a disease of the retina, and thus diabetic retinopathy is such a disease caused by diabetes.

These are important examinations but are unpopular among patients for a few reasons. They require extra time for the eye drops to take effect, and then that effect lasts for several hours. During that time the eyes are particularly sensitive to light, and vision is often blurry, making it difficult to drive and to read.

My brother is an ophthalmologist and he pointed out to me an article written by a couple of doctors at the University of Illinois (here) in which they described building a non-mydriatic fundus camera. Mydriasis is the dilation of the pupil and in this case fundus refers to the inside back of the eye. These instruments are available but they are extremely expensive. The interesting thing about this article is that it describes building such a camera using a Raspberry Pi with a Pi Camera.

The key idea in the article is that instead of using a chemical to dilate the patient's pupil, the patient is placed in a darkened room and the pupil is allowed to dilate naturally. The problem with this approach is that in the dark the examiner cannot see to focus the camera, and because the dimensions involved are so small, the focus is critical. However, the article describes using a combination of infrared and white LEDs for illumination. Generally, electronic cameras are sensitive to infrared light but human eyes are not. Thus, the examiner illuminates the eye with infrared light, focuses while viewing the image from the camera, and then flashes the white LED to take the picture. The white light produces an image with good color rendition, an important factor in performing the exam, but the flash is so fast that the patient's pupil doesn't react until after the picture is taken.

A significant advantage of an inexpensive fundus camera is that the retina could be imaged in settings other than a doctor's office. Clinics, schools and so forth could capture the images and they could then be reviewed by a retina specialist at a later time. My brother suggested that we try to build one of these camera systems, and therein lies a tale.

One of the trickiest parts of taking a picture of the retina is the fact that it must be taken through the pupil. Even when dilated, that is an opening of only a few millimeters. Light must be shone through this tiny opening to illuminate the retina, and the picture must be taken through it as well. That means that the light must be very close to the main axis of the camera lens. The University of Illinois group used a prototype of a tiny LED made by a Japanese company that can emit both IR light and white light. With the help of a Japanese friend of mine, we undertook to obtain a few of these prototypes. Pending their arrival we did some experiments using a group of conventional LEDs and a partially silvered mirror (left). The idea was that the camera would take a picture of the eye as reflected in the front of the mirror while the eye was illuminated by LEDs behind the mirror. Thus the LEDs could be made precisely collinear with the camera.

Now, the way the retina is usually examined is that the ophthalmologist uses a 20D hand lens that he or she holds close to the patient's eye. At the same time, the doctor observes the image in the hand lens using a light source and another magnifying lens that is often worn as a headlamp. Getting a good view is tricky business because it involves moving the two lenses and the light so that the image is appropriately magnified, while maintaining an adequate field of view and keeping the image in focus. This is complicated by the fact that if the patient is nearsighted or farsighted, the correct position of the lenses changes. With practice, doctors develop a good facility for this. As you might imagine, however, doing this with a camera, screen, and light as well as the 20D condensing lens can be a challenge.

We built such a device and experimented with it. We were completely unsuccessful at getting a clear view of the retina in IR light and thus could not get a good picture. Try as we might, there were just too many variables, including the number of LEDs, the focus of the lens on the Pi Camera, the distances, the size of the device, and so on.

Around this time the Japanese LEDs arrived, and we replaced the conventional LEDs that we had been using. These new LEDs were SMT (surface-mount technology) parts, so they had tiny solder pads and presented their own challenges, but we were able to get them nearly collinear with the camera and so eliminate the partially silvered mirror. Nothing else changed, including our results.

We thought it might make sense to go back to first principles so we started from scratch. This time we built an optical bench that we could use to do more precise experimentation. I'll show you that in my next post.