I have constructed a small circuit on breadboard, with a series resistor
(330 ohms) & LED in parallel with a 100uF capacitor. When I apply my 5
volts, the LED lights and when I remove the source voltage, the LED slowly
fades out. Great, just as it should.
My question is, how do I do this in reverse? That is, when I apply my 5
volts, how do I get the LED to slowly fade in?
Thanks in Advance
--
David Giblin
Lancashire
England
Put the cap in series, I think.
You are DISCHARGING the capacitor sort of slowly. Say the LED has a
voltage drop that is almost constant at, um, 1.8 volts. So when you
start letting it discharge, the voltage across the resistor is
5.0 - 1.8 = 3.2 volts just at that time. So, just at that time, the
CURRENT is I = E/R = 3.2/330 = 0.0097 amps, or 9.7 milliamps, which is
enough to light the LED up well. BUT the capacitor starts discharging,
and this takes "a while". "A while" is about equal to R (ohms) times
C (FARADS). Like: T = R*C = 330 * 0.000100 = 0.033 seconds. Hmm, that
seems a little short. How long does the LED take to fade to low brightness?
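Terry's arithmetic can be checked with a few lines of Python (the 1.8 V
forward drop is his rough assumption, not a measured value):

```python
# Rough numbers from the post above (assumed, not measured):
V_SUPPLY = 5.0   # supply voltage, volts
V_LED = 1.8      # approximate LED forward drop, volts
R = 330.0        # series resistor, ohms
C = 100e-6       # capacitor, farads

# Current through the resistor at the start of the discharge
i_initial = (V_SUPPLY - V_LED) / R   # amps

# RC time constant: how long "a while" is
tau = R * C                          # seconds

print(f"initial current: {i_initial * 1000:.1f} mA")  # ~9.7 mA
print(f"time constant:   {tau * 1000:.0f} ms")        # ~33 ms
```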
Your Question:
You are CHARGING the capacitor sort of Quickly, from some power
supply. To slow down the charging, add a series resistor. Try
something like 330 ohms...
The above is inexact, because things are complicated by the
fact that the LED is a diode that has an almost constant voltage drop.
IF you had an oscilloscope you could connect just an R and a C and
watch the waveform as it charges and discharges. The curve is
exponential with an R and a C.
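For anyone without a scope, a short Python sketch of that ideal RC
exponential (ignoring the LED's diode drop, as noted above) gives a feel
for the curve:

```python
import math

R = 330.0        # ohms
C = 100e-6       # farads
tau = R * C      # time constant, seconds
V0 = 5.0         # source voltage, volts

def v_charging(t):
    """Capacitor voltage while charging toward V0 (ideal RC, no LED)."""
    return V0 * (1.0 - math.exp(-t / tau))

def v_discharging(t):
    """Capacitor voltage while discharging from V0 (ideal RC, no LED)."""
    return V0 * math.exp(-t / tau)

# After one time constant the cap is ~63% charged / ~37% discharged
print(round(v_charging(tau) / V0, 2))     # 0.63
print(round(v_discharging(tau) / V0, 2))  # 0.37
```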
Hope this makes some sense.
Main thing is: Keep On experimenting!!
--
Regards, Terry King ...In The Woods In Vermont
te...@fredking.us
The one who Dies With The Most Parts LOSES!! What do you need?
<snip>
> So, please forgive my naivety.
-> Hint: There is nothing to forgive.
> I have constructed a small circuit on breadboard, with a series resistor
> (330 ohms) & LED in parallel with a 100uF capacitor. When I apply my 5
> volts, the LED lights and when I remove the source voltage, the LED slowly
> fades out. Great, just as it should.
>
> My question is, how do I do this in reverse? That is, when I apply my 5
> volts, how do I get the LED to slowly fade in?
<snip>
The only way to make the LEDs slowly fade in is to slowly raise the current
through them, but there are thousands of ways to do so. The one I'm trying
to explain is rather simple and inexact, but should work for LEDs. However,
if you need an (almost) exact solution for some complicated optoelectronic
devices, consider using op-amps or even a DAC-controlled circuit. One easy
way would be to raise the voltage across a (voltage-independent) resistor,
which in turn will make the current increase proportionally to the voltage.
The following circuit scheme (view with a fixed font) allows a slow but
non-linear fade-in and fade-out. It is possible to make it linear, but this
would require a constant current for charging and discharging a capacitor,
which most likely means a complicated device, so I took the "quick and
dirty" approach and did not make it linear.
 ---------x--------------------x-------------- + Source Voltage
          |                    |
          / switch             |
         /                    ---  LED (or more than one LED
          |                   \|/  in series depending on the
 charge  ---                   |   source voltage and your needs)
 re-     | |                  ---
 sis-    | |                   |
 tor     ---                  C|
          |                   /
          |              B   /   NPN transistor
          x-------x---------|    (should be able to
          |       |          \   sustain the current)
          |       |           >
 dis-     |       |          E|
 charge  ---      |           |
 re-     | |      |          ---
 sis-    | |    ----- capa-  | |  resistor
 tor     ---    ----- citor  | |  (current control)
          |       |          ---
          |       |           |
 ---------x-------x------------x--------------- + 0V (GND)
The current control resistor should be rated somewhere around 200 ohms, and
the source voltage should be chosen to allow a drop of approximately one
third of it across the transistor and the resistor and two thirds of it
across the LEDs (if more than one LED is used), or half of it on each part
if only one LED is used. Too high a voltage across the transistor is likely
to make it overheat. The delay, rise, and fall times will vary depending on
the charge and discharge resistors, which should be of approximately equal
(+- 30 percent) resistance, and on the capacitor. When switched on using
the switch (assuming power is already connected), the circuitry should
initiate a "slow start" of the LEDs, and when switched off, a "slow
shutdown"; however, when powered off, the LEDs will go out immediately.
You can address this problem if needed by connecting a sufficiently large
capacitor in parallel with the entire circuitry (which will no doubt
increase its size and weight).
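A rough Python sanity check of that voltage-split rule of thumb (the 2.0 V
per-LED drop and the three-LED string here are assumed values for
illustration, not from any datasheet):

```python
# Hedged sanity check of the voltage-split rule of thumb above.
# All values are illustrative assumptions, not datasheet figures.
V_LED = 2.0        # assumed forward drop per LED, volts
N_LEDS = 3         # assumed number of LEDs in series
R_CURRENT = 200.0  # suggested current-control resistor, ohms

v_leds = N_LEDS * V_LED            # drop across the LED string
v_supply = v_leds * 3.0 / 2.0      # two thirds across LEDs -> supply is 1.5x
v_trans_and_r = v_supply - v_leds  # remaining third: transistor + resistor

# Very rough current estimate: assume half of that remaining third
# ends up across the current-control resistor when fully on.
i_est = (v_trans_and_r / 2.0) / R_CURRENT

print(f"supply ~ {v_supply:.1f} V, LED current ~ {i_est * 1000:.1f} mA")
```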
If anything should go wrong or if there are still questions,
feel free to ask me here or by e-mail.
Dimitrij
>I have constructed a small circuit on breadboard, with a series resistor
>(330 ohms) & LED in parallel with a 100uF capacitor. When I apply my 5
>volts, the LED lights and when I remove the source voltage, the LED slowly
>fades out. Great, just as it should.
>
>My question is, how do I do this in reverse? That is, when I apply my 5
>volts, how do I get the LED to slowly fade in?
>
---
Increase the capacitance of the 100µF cap to increase the fade-in time.
+5V>----> |
          |
          O
          |
          +--------+
          |        |
          |      [10K]
          C        |
    NPN B----------+
          E        |
          |        |+
       [330R]  [100µF]
          |        |
        [LED]      |
          |        |
GND>------+--------+
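Since the fade-in time constant here is roughly the 10K base resistor times
the capacitance, a quick Python check shows how larger caps stretch the
fade (values taken from the schematic; the substitute cap sizes are just
examples):

```python
# Fade-in time scales linearly with C: the 10K base resistor
# dominates the charging path in the circuit above.
R_BASE = 10_000.0  # base resistor from the schematic, ohms

for c in (100e-6, 470e-6, 1000e-6):
    tau = R_BASE * c  # RC time constant, seconds
    print(f"C = {c * 1e6:4.0f} uF -> time constant ~ {tau:.1f} s")
```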
There is an example of a linear LED fading circuit at the
link below. It uses one dual op-amp, 6 resistors,
one cap, and a transistor.
http://ourworld.compuserve.com/homepages/Bill_Bowden/page5.htm#eyes.gif
-Bill