
FPGA I/O Selection in UCF


jti...@gmail.com

Nov 5, 2007, 3:26:40 PM
I have an application with a Xilinx Spartan-3 FPGA where I would like
to use a single FPGA binary to support two I/O voltage levels: 2.8 V
and 1.8 V. My question is as follows:

Why does the UCF file include the selected IO standard for each pin?
I understand that the drive strength and slew rate may change based on
the I/O standard. Are there any other functional hardware changes made
based on the selected standard? What are the potential consequences
of telling the compiler that I am using 2.8 V but then running the
application at 1.8 V?
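
For reference, the kind of per-pin entry I am talking about looks
like this (the net name and pin location here are placeholders, not
from my actual design):

# Placeholder example of a per-pin constraint in the UCF.
NET "my_input" LOC = "P10" | IOSTANDARD = LVCMOS25 ;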

Thanks

Gabor

Nov 5, 2007, 4:01:17 PM
On Nov 5, 4:26 pm, jtin...@gmail.com wrote:
> I have an application with a Xilinx Spartan-3 FPGA where I would like
> to use a single FPGA binary to support two I/O voltage levels: 2.8 V
> and 1.8 V. My question is as follows:
>
> Why does the UCF file include the selected IO standard for each pin?

It doesn't have to, but if you don't include the IO standard you
get the global default (which depends on the device family).

> I understand that the drive strength and slew rate may change based on
> the I/O standard. Are there any other functional hardware changes made
> based on the selected standard? What are the potential consequences
> of telling the compiler that I am using 2.8 V but then running the
> application at 1.8 V?
>
> Thanks

The obvious changes in hardware are the Vcco (bank IO power supply)
and Vref (reference voltage for single-ended standards requiring a
reference). The placement tools make sure that the standards are
compatible for all IOs in each bank, i.e. you can't mix IOs
that require Vcco at 1.8V with those that require Vcco at 2.5V in
the same bank. The same applies to inputs requiring different Vref
in a bank.
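
As a made-up example (assuming the two pins below sit in the same
bank), the tools should reject something like this, because the two
standards need different Vcco:

# Hypothetical constraints; P20 and P21 are assumed to share a bank.
# LVCMOS25 wants Vcco = 2.5V, LVCMOS18 wants Vcco = 1.8V.
NET "sig_a" LOC = "P20" | IOSTANDARD = LVCMOS25 ;
NET "sig_b" LOC = "P21" | IOSTANDARD = LVCMOS18 ;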

Some input standards don't use a reference voltage, but may need
a particular Vcco voltage. Output standards all need a specific
Vcco. Telling the tools you are running at 2.5V (did you mean 2.5
and not 2.8?) but powering Vcco with 1.8V will result in hardware
that does not match simulation in timing or drive capability, and
in some cases it just won't work.

morphiend

Nov 5, 2007, 4:12:32 PM

Different IO standards have different voltage thresholds for
determining a 1 vs. a 0. They also have different termination methods
for signal integrity and different drive strengths. If you use a lower
voltage in a setting configured for a higher voltage, it's possible
that you'll never 'see' a transition inside the FPGA, since the input
thresholds are set for the higher levels.

Also, FPGAs usually have voltage requirements for their IO banks (an
IO bank being a group of IO pins). A bank generally requires that all
of its pins use compatible IO standards; that can mean just the same
Vcco for the bank, or even the same type of signalling within the
bank.

austin

Nov 5, 2007, 4:25:00 PM
jt,

The strength of the driver depends on the number of "legs" enabled
and on the slew attribute (FAST or SLOW), for any particular Vcco
voltage.

IOs that require a Vref will need the appropriate external Vref
voltage to operate properly.

LVDS only works at 2.5 volts (in the latest parts).

But an LVCMOS IO that is set for 12 mA at the highest Vcco will still
be fairly strong at a lower Vcco. Timing and rise/fall times will be
slower at the lower Vcco, but the LVCMOS IO will still function just fine.
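
In the UCF that setting would look something like this (the net name
and pin are just placeholders):

# Placeholder net and pin: a 12 mA, fast-slew LVCMOS 2.5V output.
NET "my_output" LOC = "P30" | IOSTANDARD = LVCMOS25 | DRIVE = 12 | SLEW = FAST ;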

To really make sure this will work, I would download the SPICE models,
run them at the different Vcco voltages, and make sure the signal
integrity meets all of your needs. The reason for simulating with SPICE
is that you can change the Vcco and see the effect of fast and slow
process-corner material, as well as how the IO behaves over temperature.

Austin

jti...@gmail.com

Nov 5, 2007, 4:45:11 PM

Thanks for the response. I understood that the compiler verifies that
you don't mix/match different standards on the same I/O bank.

In my particular case all of the pins are inputs and I am adjusting
the Vcco supply with LVCMOS (VREF not used). At some stage of the I/O
blocks, digital input signals must transition from the I/O voltage
levels to the internal voltage levels. This is the area where I am
most curious.

I am assuming that the input register has transistors supplied with
VCCO, whose VIH and VIL thresholds will roughly track VCCO
(i.e. VIH ~= VCCO - X volts). At this stage I am not concerned about
the input register detecting the signal level properly, but I am
concerned about the connecting blocks (CLK, Reset, and the output
voltage threshold). Ignoring timing considerations, do we know
whether the compiler makes any hardware configuration change to
interconnect the two voltage domains (Vcore, VCCO) at the input
register?

jti...@gmail.com

Nov 5, 2007, 4:56:13 PM

Hi Austin, thanks for the response. It was my suspicion that LVCMOS
should work with the two voltages. I should have included that detail
in my original post.

On a side note--I'm very happy to see Xilinx support here. It's much
faster than waiting a day "to receive security clearance" to access
the Xilinx WebCase =)

Jeff


austin

Nov 5, 2007, 5:28:33 PM
jt,

Level shifters exist between all IO circuitry and all logic circuitry;
they take care of the differences between the IO and core voltages.

As you have guessed, timing will be different.

Austin

Hal Murray

Nov 5, 2007, 10:40:26 PM

>Thanks for the response. I understood that the compiler verifies that
>you don't mix/match different standards on the same I/O bank.

I assume it only checks for incompatible standards.


>In my particular case all of the pins are inputs and I am adjusting
>the Vcco supply with LVCMOS (VREF not used). At some stage of the I/O
>blocks, digital input signals must transition from the I/O voltage
>levels to the internal voltage levels. This is the area where I am
>most curious.

>I am assuming that the input register has transistors supplied with
>VCCO, whose VIH and VIL thresholds will roughly track VCCO
>(i.e. VIH ~= VCCO - X volts). At this stage I am not concerned about
>the input register detecting the signal level properly, but I am
>concerned about the connecting blocks (CLK, Reset, and the output
>voltage threshold). Ignoring timing considerations, do we know
>whether the compiler makes any hardware configuration change to
>interconnect the two voltage domains (Vcore, VCCO) at the input
>register?

How about making two versions of the same design that differ only
in the input standards? Get the output into some sort of ASCII
text and run them through diff.
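
For example, if the only intended difference between the two UCFs is
one attribute (placeholder net and pin below), then anything else
that shows up when you diff the ASCII outputs is something the tools
changed for you:

# Build A (placeholder net and pin):
NET "my_input" LOC = "P10" | IOSTANDARD = LVCMOS25 ;
# Build B -- the only intended change is the standard:
NET "my_input" LOC = "P10" | IOSTANDARD = LVCMOS18 ;

If I remember right, the xdl utility in ISE can turn the routed .ncd
into ASCII text (something like "xdl -ncd2xdl design.ncd"), but treat
that as an assumption and check the tool documentation.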

--
These are my opinions, not necessarily my employer's. I hate spam.
