Thursday, July 26, 2007

Ideas and Technology: More Intelligent Alarms

Ever since I started traveling to the office by bus, I have been losing a minimum of two hours every day. Minimum, because the time the journey takes depends on various factors like the state of the traffic and the driver. Some drivers make it in just 50 minutes, while others take me on a long 90-minute ride, and with more traffic it only gets worse. Reading is not something I can do, thanks to the low light in the evenings, so the best option would be to take a nap. I have tried this plenty of times, but my mind always tries to be over-careful so that I don't miss my stop. The result: I only end up resting my eyes. I could probably set an alarm for 50 minutes from the departure time, but on quite a few occasions it has woken me up midway, whenever a slow driver combines with bad traffic.

Why can't alarms be more intelligent? How do I make sure my alarm goes off at almost the same time that my bus reaches a specific PLACE? There you are: my alarm should track location rather than time! A lot of people think of GPS whenever it comes to tracking location, but that would require a GPS receiver to be integrated into your most commonly used, all-in-one device: your mobile. If that is too complex, forget it; I don't need it to be very accurate for this application. I am OK with my alarm waking me up when I reach the tower nearest to my house, which can be done with conventional mobiles since they already know the identifiers of the towers they are connected to. If it were this simple, I think mobile companies would have implemented it long ago, or perhaps someone already has and I am just unaware of it? I found quite a bit of literature on this topic on the net, but most of it jumps straight to GPS when it comes to location tracking. I hope this is a much simpler way out. Well, for my problem at least!
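Just to make the idea concrete, here is a rough sketch of such a cell-ID alarm. Everything in it is assumed for illustration: the cell identifiers, the polling interval and the way the serving cell is read (a real handset would get it from its telephony stack), so treat it as the shape of the idea rather than working phone software.

```python
import time

# All illustrative: the cell IDs are made up, and on a real phone the serving
# cell identifier would come from the handset's telephony stack.
HOME_CELLS = {"404-45-1021-7733", "404-45-1021-7734"}   # towers near my stop
POLL_SECONDS = 30

def location_alarm(current_cell_id, ring, poll_seconds=POLL_SECONDS):
    """Sleep-friendly alarm: ring only when the phone camps on a home cell."""
    while current_cell_id() not in HOME_CELLS:
        time.sleep(poll_seconds)
    ring()

# Tiny simulation of a bus ride past four towers:
ride = iter(["404-45-1020-1001", "404-45-1020-1002",
             "404-45-1021-7733", "404-45-1021-7734"])
location_alarm(lambda: next(ride),
               lambda: print("Wake up, your stop is near!"),
               poll_seconds=0)
```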

Sunday, July 22, 2007

Computer Vision (31): "Seeing" through ears

Till a few days back, even I wasn't aware that such creatures existed in Nature. I had not even thought of trying out something like this, even though it has been years since I got into research in this field. Nature has once again outwitted us in its design and complexity. I am actually talking about creatures that have their ears at a vertical offset in order to extract yet another dimension: depth, which our ears/brain fail to resolve through hearing. The Great Horned Owl (Bubo virginianus), the Barn Owl (Tyto alba) and the Barred Owl are some of Nature's selected, gifted creatures. This offset helps them home in on a target with greater sensitivity and lets them hunt down prey even in complete darkness. With this ability they don't even spare creatures like mice, which usually hide under the snow and manage to escape their sight. Evolution has created wonders in Nature. These predators usually live in regions with long, dark winters and hence have developed the ability to "see" through their ears.

But how does it all work? With just a horizontal offset, our ears manage to tell us the direction of a sound in 3D space. Imagine it as an arrow shot in that particular direction: you don't know the distance of the target, you just fire in that direction. The arrow actually leaves from the midpoint between your two ears. Applying the same concept to a vertical offset, there will be another arrow leaving from the midpoint of the vertically offset pair (in the case of these specially gifted creatures). From primary school mathematics we all know that two non-parallel straight lines can meet at only one point, which in this case happens to be the target.
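To put the two-arrow picture into numbers, here is a small sketch with made-up coordinates. It takes two direction lines (origin plus unit direction) and returns the point where they come closest to meeting; with perfectly measured directions that point is the target itself.

```python
import numpy as np

def locate_target(p1, d1, p2, d2):
    """Given two direction 'arrows' (origin + unit direction), return the point
    where they (nearly) meet: the midpoint of the shortest segment joining them."""
    # Solve p1 + t*d1 ~= p2 + s*d2 for t and s in the least-squares sense.
    A = np.column_stack((d1, -d2))
    b = p2 - p1
    (t, s), *_ = np.linalg.lstsq(A, b, rcond=None)
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))

# Toy example: the two ear-pair midpoints 5 cm apart vertically, both arrows
# pointing at a mouse about 3 m away.
target = np.array([3.0, 0.5, -0.2])
p_h, p_v = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 0.05])
d_h = (target - p_h) / np.linalg.norm(target - p_h)
d_v = (target - p_v) / np.linalg.norm(target - p_v)
print(locate_target(p_h, d_h, p_v, d_v))   # ~ [3.0, 0.5, -0.2]
```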
Even Nature can only produce the best designs, not perfect ones, and the owls would certainly starve if their prey managed to stay silent. To make its design more reliable and worthy, Nature has never allowed a prey to have that very thought in its mind.

Saturday, July 21, 2007

Computer Vision (30): Why wasn't our face designed like this?

I have already touched upon the reasons behind having two sensors and their advantages (1). We now know why we were born with two ears, two nostrils and two eyes but only one mouth. What I failed to discuss at that point was their placement. I will concentrate on hearing and vision (the placement of the ears and eyes), which are better developed in electronics than smell is.
Our eyes, as any layman will be aware, are 2D sensors similar to the sensor in a camera (not design-wise, of course!). They capture a 2D projection of the 3D environment we live in. In order to perceive depth (the third dimension), our designers came up with the concept of two eyes (to determine depth through triangulation). For triangulation (2) to work, the eyes only needed to be at an offset; horizontal, vertical, crossed, anything would do. So probably the so-called creator of humans (GOD for some and Nature for others) decided to place them horizontally to give a more symmetric look with respect to our body. But why weren't they placed at the sides of our head, in place of our ears? (Hmmm, then where would our ears sit?)
Light, as we know from our high school physics, does not bend around corners (I mean, a bulb in one room cannot light the room beside it), and from my earlier posts we know that to perceive depth our eyes need to capture some common region (the common region is where we perceive depth). So the two eyes have to sit on the same plane and face the same way. That plane happened to be the plane of our frontal face, and so our eyes ended up at the front, where they are today.
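For anyone who wants the triangulation in numbers, in the standard rectified two-camera model it boils down to one line of arithmetic; the focal length, baseline and disparity below are illustrative figures, not measurements of real eyes.

```python
# Two eyes (cameras) a baseline B apart, each with focal length f, see the same
# point shifted by a disparity d between the two images. Depth follows directly.
def depth_from_disparity(f_pixels, baseline_m, disparity_pixels):
    """Z = f * B / d (with all the usual rectified-stereo assumptions)."""
    return f_pixels * baseline_m / disparity_pixels

# e.g. f = 800 px, eyes ~6.5 cm apart, 10 px disparity -> about 5.2 m away
print(depth_from_disparity(800, 0.065, 10))
```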
Let's move on to hearing now! When we hear a sound, our brain can determine the direction it came from (listen to some stereo sound), but it cannot pinpoint the exact location of the source (in terms of depth, or distance from us). That is because our ears, unlike our eyes, are one-dimensional sensors, able to detect only the intensity of sound at any point in time (of course they can separate out the frequencies, but that has nothing to do with direction). To derive the direction a sound came from, our creators/designers probably thought of reusing the same concept they had come up with for our sight, and so gave us two ears (to derive the direction of sound from the difference in the times at which it arrives at each ear). For this, the ears only needed to be at an offset; horizontal, vertical or crossed, so to give a symmetric look they were probably placed at a horizontal offset. But why were they placed at the sides of our face and not on top, like on a rabbit, a dog or any other creature?
Again from high school physics we know that sound can bend around corners, pass through tiny gaps and spread out. So you can enjoy your music irrespective of where you are and where it is playing in your house (well, if you can put up with the varying loudness). So our ears never demanded to be on the same plane, facing the same way! The common signals required to perceive direction reach them regardless of placement, since sound can bend. Secondly, our ears were required to resolve direction over the full 360 degrees around us, unlike our eyes, which project only the frontal 180 degrees. Probably the best place to keep them was at the sides of our face.
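Just to make the timing-difference idea concrete, here is the simplest far-field model of how two horizontally offset ears turn an arrival-time difference into a direction. The head width is an assumed round figure, and real hearing is of course far more involved.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
EAR_SPACING = 0.18       # m, rough head width -- an assumed figure

def azimuth_from_itd(delta_t):
    """Direction of a sound source from the interaural time difference,
    using the simple far-field model delta_t = d * sin(theta) / c."""
    s = max(-1.0, min(1.0, delta_t * SPEED_OF_SOUND / EAR_SPACING))
    return math.degrees(math.asin(s))

# A sound arriving 0.3 ms earlier at one ear comes from roughly 35 deg to that side:
print(azimuth_from_itd(0.0003))
```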
Our 2D vision was now capable of perceiving depth, and our 1D hearing could locate direction. Since our visual processing is one of the most complex design elements in our body and is very well developed for perceiving depth, the designers never thought of giving a depth-finding extra to any of our other sensors. But Nature has in fact produced creatures with a design different from ours, with ears at a vertical offset, on top of the head, and so on, which I will discuss in my next post.

References:
1. http://puneethbc.blogspot.com/2007/03/computer-vision-4.html
2. http://puneethbc.blogspot.com/2007/04/computer-vision-13.html

Wednesday, July 18, 2007

Photography: Effects of Aperture variation on Clarity

The image above is not related to the title; I will come to it at the end.
One thing I wanted to observe in the photos of the previous post (minimum and maximum aperture) is that, apart from the increase in sharpness, the minimum aperture should give colors and contrast that are truer to what our eyes see. WHY? Applying the same "light cone" concept: the smaller the opening/aperture, the narrower the light cone, which means less mixing of light from adjacent cones when they are collected at the sensor. The increase in DOF when the aperture is closed down is in fact the result of the light cone getting narrower. Ideally, if the aperture were a single point, there would be no cone at all, since the base of the cone is now a point! This means only one ray (assuming light to be a ray, for simplicity) from every point in space would be able to pass through this "point aperture" and be captured on the sensor. If the sensor were infinite in resolution, every point in space would be represented by a pixel on the sensor, which would give you a truly faithful image/projection of the surrounding space. Practically, since the pixels are of finite dimension, the aperture need not be a single point, and keeping the aperture at its minimum should give you a more faithful picture. We can clearly observe this difference in the images below, at the central bottom portion of the clouds, where the light and dark regions are more clearly separated in the snap taken with the minimum aperture than in the one with the maximum aperture. But then, the factors behind the failure of my experiment mentioned in my earlier post might be at play here too! So I will not conclude anything until some authentic and satisfactory experiments are made :(

A good article on DOF: http://en.wikipedia.org/wiki/Depth_of_field
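To see how the narrowing cone shows up in numbers, here is a quick sketch of the standard thin-lens blur-circle (circle of confusion) formula. The focal length and distances are made-up values; only the two f-numbers match the ones used in the aperture experiment.

```python
# Simplified thin-lens "light cone" model: the blur-circle diameter for a point
# at distance d when the lens is focused at distance s. A = f/N is the aperture
# diameter; a smaller A means a narrower cone and a smaller blur circle.
def blur_circle(f, N, s, d):
    """f: focal length, N: f-number, s: focus distance, d: subject distance
    (all in the same units, e.g. mm). Returns the blur-circle diameter."""
    A = f / N
    return A * f * abs(d - s) / (d * (s - f))

# 55 mm lens focused at 3 m; a point 10 m away (numbers only for illustration):
for N in (5.6, 36):
    print(N, round(blur_circle(55, N, 3000, 10000), 3), "mm")
```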

Also I have a few sample images from my multishot image merging software here: http://www.flickr.com/photos/57078108@N00/
I will be launching it as soon as the GUI gets ready!

Sunday, July 8, 2007

Photography: Effects of aperture variation on sharpness

Taken with f/5.6

Taken with f/36

Either my concept is junk or my experiment is. My understanding says that a wider aperture should give less sharpness in the image regions where planes other than the focus plane are projected onto the sensor, due to the wider light cones coming from those planes. When these cones meet the sensor they are more spread out than the cones from the same region would be with a smaller aperture. But my experiments to prove this tell a different story altogether.

To make the difference more apparent I selected the maximum and minimum apertures available on my camera at the telephoto end and kept the focus plane fixed; still, I see the wider aperture giving better sharpness in all regions. Even so, at this point I lean towards my concept being true and my experiment being the junk part, because to verify the concept the setup should have had no variables other than the aperture setting between the two shots. Here are two things that changed between the shots without my intending it:
  1. I shot the clouds with the minimum and maximum apertures, and the shutter speeds I got were 1/80 and 1/4000 respectively (a quick check of these numbers follows at the end of this post). Even though the shots were taken only a few seconds apart, we can easily see that the clouds changed their pattern significantly in that time. This makes it quite likely that there was more cloud movement during the smaller-aperture shot than the wider one, which might have caused more blurring in the former.
  2. Both shots were taken with the camera hand-held from the top of a pretty tall building in windy conditions, which might have caused more handshake, and in turn more blurring, in the smaller-aperture shot than in the wider one.
I am not making excuses to back my concept; I did not retake the shots with a tripod because I wanted to see the practical, day-to-day effect of these settings. So even though conceptually a smaller aperture should give more sharpness (proof pending), in practical day-to-day photography it is better to go for a shutter speed just fast enough to keep any unwanted movement off the sensor.
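As a quick sanity check on point 1 above: stopping down from f/5.6 to f/36 costs roughly 5.4 stops of light, and slowing the shutter from 1/4000 s to 1/80 s gives back roughly 5.6 stops, which is why the small-aperture shot needed an exposure about 50 times longer and had far more time to collect cloud movement and handshake.

```python
import math

# Rough exposure check for the two shots (f/5.6 at 1/4000 s vs f/36 at 1/80 s).
aperture_stops = 2 * math.log2(36 / 5.6)          # ~5.4 stops less light at f/36
shutter_stops = math.log2((1 / 80) / (1 / 4000))  # ~5.6 stops more light at 1/80 s
print(f"aperture: -{aperture_stops:.1f} stops, shutter: +{shutter_stops:.1f} stops")
```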