Is this possible? (math)

Brilliant answers, thanks. I couldn't figure it out, and I was starting to feel amazingly stupid. :boxedin:
 
Just shooting from the hip on intuition here, but wouldn't the distance of the lens from either the object or the screen needed to produce a focused image depend on the size of the object and the intended size of its projected image on the screen?
I think you're picturing a person looking through a magnifying glass, in which case one doesn't need to produce a focused image, because one's eye will focus the image (as long as the image isn't too far from being focused). Or you might be imagining a projector, which can be moved a bit and still produce a readable image; however, that only works over a small distance. If you focus a projector at one distance and then move it a significant distance away, it won't be focused anymore.

"To visualize the story, you may try to do the following experiment. Get the magnifying glass (it is a converging lens), [light] the candle, and set up a screen (piece of paper). By moving the converging lens between the screen and the candle, you will find that there [is] one specific distance between the candle and the lens that produces the larger image of the candle light."
If they can't get the English right, that supports the supposition that they got the math wrong.
 
A converging lens has a focal length of 10 cm. A screen is placed 30 cm from an object. Where should the lens be placed, in relation to the object, to produce a focused image?

You're supposed to be able to get the answer using the thin lens equation:

F = 10cm
di + do = 30cm

1/di + 1/do = 1/F

But nothing I've tried clicks. Everything seems to indicate that when the focal length is 10cm, the object and the image always end up at least 40cm apart. Sorry to post such a crappy question, but I'm stumped.
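For what it's worth, one quick algebraic way to see the dead end (just an added check, using the numbers above): since di + do = 30cm, substituting do = 30 - di into 1/di + 1/do = 1/10 and multiplying through by 10*di*(30 - di) gives

di^2 - 30di + 300 = 0

whose discriminant is 30^2 - 4*300 = -300. That's negative, so there is no real solution: no lens position between the object and the screen will produce a focused image.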
OK, the first thing you need to do is consider precisely how the lens works.

The focal length of a lens is the image distance for an object at infinity. Convex lenses form real images of objects farther away than the focal length, and virtual images of objects closer than the focal length.
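(As a quick illustration with the same 10cm focal length, worked from the formula below: an object 15cm from the lens forms a real image 30cm behind it, while an object only 5cm from the lens gives a virtual image 10cm in front of it, which no screen can catch.)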

The equation to use is the thin lens formula:

1/F = 1/s1 + 1/s2
where,
F is the focal length of the lens;
s1 is the distance from the lens to the object; and
s2 is the distance from the lens to the real image.

Note that this formula only produces a real image if the distance to the object is greater than the focal length. In this case the focal length is 10cm and there is 30cm between the object and the screen, so there might be enough room to form a real image on the screen if we place the lens between the object and the screen. To find out, we have to determine the smallest possible sum of the object distance and the image distance, and check whether it is no more than 30cm.
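If you'd rather let a computer grind through the fractions, here is a minimal Python sketch (my own illustration, not part of the original answer) of the formula solved for the image distance:

def image_distance(F, s1):
    # Thin lens formula rearranged: 1/s2 = 1/F - 1/s1, so s2 = F*s1/(s1 - F).
    # Only a real image when s1 > F; for s1 < F the result is negative,
    # i.e. a virtual image that cannot be caught on a screen.
    return F * s1 / (s1 - F)

print(image_distance(10, 11))   # 110.0
print(image_distance(10, 30))   # 15.0

Those two outputs are exactly the cases worked through below.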

Now, we also know that
s1 + s2 = 30cm
and
F = 10cm

If s1 is 10cm, then

1/10 = 1/10 + 1/s2
which would force 1/s2 = 0, putting the image at infinity. Let's try s1 = 11cm.

1/10 = 1/11 + 1/s2
so,
1/s2 = 1/10 - 1/11 = 1/110
Nope, that won't work: s2 = 110cm, so s1 + s2 = 121cm. In fact, as s1 drops from 11cm toward 10cm, s2 grows from 110cm toward infinity.

How about if we make s1 = 30cm?
1/10 = 1/30 + 1/s2
so,
1/s2 = 1/10 - 1/30 = 1/15
And that won't work either: s2 = 15cm, so s1 + s2 = 45cm. A brief look at this shows that as s1 varies from 10cm up to 30cm, s2 varies from infinity down to 15cm. You cannot bring the image into focus with only 30cm of total distance to work with. The minimum total object-to-image distance is 40cm, and you get it with s1 = s2 = 20cm. I'll leave it to you to prove that.
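If you don't want to prove it by hand just yet, a quick numerical sweep (again my own sanity check in Python, not part of the original answer) at least shows where the 40cm comes from:

F = 10.0                               # focal length in cm
best_total, best_s1 = float("inf"), None
for tenths in range(101, 1001):        # object distances from 10.1cm to 100.0cm
    s1 = tenths / 10.0
    s2 = F * s1 / (s1 - F)             # thin lens formula solved for the image distance
    if s1 + s2 < best_total:
        best_total, best_s1 = s1 + s2, s1
print(best_s1, best_total)             # prints: 20.0 40.0

The sweep isn't a proof, but the minimum sits right at s1 = s2 = 2F = 20cm, with a total separation of 4F = 40cm.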
 