Michael Kellett <m...@mkesc.co.uk> wrote:
> On 09/08/2020 17:05, Dimiter_Popoff wrote:
>
> snipped
>
> If you don't need the RAM to be that fast or that big then you could use
> SDRAM, it'll save you a lot of trouble, both in board layout and FPGA work.
>
> I wrote my own SDRAM controller a long time ago (when quite new to
> FPGAs) - not too big a deal at all.
>
> DDRAM is a different thing altogether (reading and writing data in a
> 300ps window - it's a miracle that it ever works!).
>
> Using a pair of 16 bit wide SDRAMs at 100MHz gives you 400 Mbytes/s -
> chicken feed by DDRAM standards but maybe enough.

Just about - 1080p at 60Hz and 24bpp needs 373 MB/s. If you cut the colour
depth down you can reduce that bandwidth. However, you also need bandwidth
to write into that RAM - you'll need to measure how much you need for that.
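A quick sanity check on those numbers (a Python back-of-envelope; note the
400 MB/s peak for the SDRAM pair ignores refresh and row-activation
overhead, so the real write headroom is less than this shows):

    # Rough framebuffer bandwidth check: read side is the scan-out,
    # write side is whatever is left of the SDRAM's best-case peak.
    H, V, FPS, BPP = 1920, 1080, 60, 3        # 1080p, 24 bits per pixel
    readout = H * V * BPP * FPS               # bytes/s for scan-out
    sdram_peak = 2 * 2 * 100_000_000          # two 16-bit parts at 100 MHz
    print(f"scan-out:       {readout / 1e6:.0f} MB/s")     # ~373 MB/s
    print(f"SDRAM peak:     {sdram_peak / 1e6:.0f} MB/s")  # 400 MB/s
    print(f"write headroom: {(sdram_peak - readout) / 1e6:.0f} MB/s")  # ~27
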
On PCIe, that means you're in mid-level FPGA territory, as it needs
high-speed transceivers. So not the bottom-end Cyclone V E but the fancier
Cyclone V GX. It's similar across all the vendors' lines - on the Lattice
side it means you're heading towards the top end of their offerings.

One thing to be wary of is that Intel/Altera at least offer time-limited
demos of IP cores that you don't have licences for. The device will work
while plugged into the programmer, but not standalone. There are warnings
about this, but you may well be able to synthesise successfully despite not
having a licence (which might actually be fine for proof-of-concept/eval
purposes).
On the system-on-module front there are lots of vendors but I've personally
worked with
https://www.enclustra.com/ who have a decent range, although
possibly higher-end than you want.

For LVDS I'd think about buffering - you might want a proper line driver to
drive a long cable, rather than FPGA I/Os, which aren't designed to drive a
long capacitive load. We used a third-party HDMI module once and had a lot
of problems because the registers in the driver chip (made by ITE) weren't
documented - in the end we logged what the demo code wrote to the registers
and replayed the same writes, in the same order, in our own code. A hack,
but it just about worked.
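The replay itself was trivial once we had the log - in spirit something
like this Python sketch using the smbus2 library, where the I2C address
and the register/value pairs are made-up placeholders, not the real ITE
ones:

    from smbus2 import SMBus

    # Placeholder address and (reg, value) log - the real values came from
    # logging the vendor demo. Order matters: some writes only take effect
    # after earlier ones, so replay exactly as captured.
    ENCODER_ADDR = 0x4C       # hypothetical I2C address, not the real chip's
    REG_LOG = [(0x05, 0x00), (0x61, 0x10), (0x09, 0x30)]  # made-up pairs

    with SMBus(1) as bus:     # I2C bus number depends on your board
        for reg, val in REG_LOG:
            bus.write_byte_data(ENCODER_ADDR, reg, val)
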
Instead of proper HDMI (audio etc.) you might think about DVI-D: HDMI
monitors can accept it, but it's a simpler protocol. You can
also get chips which take in parallel RGB (6-8 bits per channel), Hsync,
Vsync, pixel clock and output DVI or HDMI. That means you just generate
something that looks like pre-DAC VGA and the chip handles the rest.
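"Something that looks like pre-DAC VGA" is really just two counters. Here's
a behavioural Python model of the timing (the real thing would be a few
lines of HDL; the numbers are the standard 640x480@60 set with a 25.175 MHz
pixel clock - swap in the CEA-861 numbers for 1080p):

    # Behavioural model of VGA-style timing: two counters, a few compares.
    # Standard 640x480@60 parameters; sync is active-low at the connector,
    # these are just logical flags.
    H_VIS, H_FP, H_SYNC, H_BP = 640, 16, 96, 48
    V_VIS, V_FP, V_SYNC, V_BP = 480, 10, 2, 33
    H_TOTAL = H_VIS + H_FP + H_SYNC + H_BP      # 800 clocks per line
    V_TOTAL = V_VIS + V_FP + V_SYNC + V_BP      # 525 lines per frame

    def timing(x: int, y: int) -> tuple[bool, bool, bool]:
        """Return (hsync, vsync, data_enable) at pixel position (x, y)."""
        hsync = H_VIS + H_FP <= x < H_VIS + H_FP + H_SYNC
        vsync = V_VIS + V_FP <= y < V_VIS + V_FP + V_SYNC
        de = x < H_VIS and y < V_VIS            # pixel data valid here
        return hsync, vsync, de

    # 800 * 525 clocks / 25.175 MHz ~= 16.68 ms, i.e. ~59.94 frames/s.
    print(H_TOTAL * V_TOTAL, "clocks per frame")
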
You might also consider EDID monitor detection, depending on the
application.
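EDID itself is only a 128-byte block read over I2C (address 0x50 on the DDC
pins), and pulling the monitor's preferred mode out of it is easy. A sketch,
assuming you've already read the raw bytes by some means:

    def parse_edid_preferred_mode(edid: bytes) -> tuple[int, int, float]:
        """Extract (width, height, pixel_clock_MHz) from the first detailed
        timing descriptor of a 128-byte EDID base block."""
        header = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
        assert edid[:8] == header                  # fixed EDID signature
        assert sum(edid[:128]) % 256 == 0          # block checksum
        d = edid[54:72]                            # first descriptor
        pixclk = (d[0] | (d[1] << 8)) / 100.0      # stored in 10 kHz units
        h_active = d[2] | ((d[4] & 0xF0) << 4)     # low 8 bits + high nibble
        v_active = d[5] | ((d[7] & 0xF0) << 4)
        return h_active, v_active, pixclk
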
What I'd do is look for a PCIe-card-shaped dev board, which should come with
examples for driving that. For the video side, worst case you can make a
resistive-DAC VGA port off some GPIO pins (see the resistor sketch after
this paragraph), but you could look at other options - for example wiring a
parallel-RGB-to-DVI chip off GPIOs. Maybe the board will come with HDMI or
DisplayPort already (and some demo code), but that's rare on PCIe-card
boards. I think at that point you have to live with whatever RAM type the
dev board gives you - if SDRAM works for you it'll save some trouble, but
I've seen a few where there's only one 16 bit SDRAM, which is low in the
bandwidth stakes.
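On the resistive-DAC VGA option above: it's a binary-weighted divider into
the monitor's 75 ohm termination, aiming for 0.7 V full scale from 3.3 V
I/O. The arithmetic, assuming ideal rail-to-rail outputs and 3 bits per
colour channel (hobby designs usually round to 510R/1k/2k):

    # Binary-weighted resistor DAC per colour channel into VGA's 75-ohm
    # termination: R (MSB), 2R, 4R into a common node.
    VDD, VFS, RLOAD, BITS = 3.3, 0.7, 75.0, 3

    # Parallel combination of R, 2R, 4R ... is R/k, k = 1 + 1/2 + 1/4 ...
    k = sum(0.5**i for i in range(BITS))           # 1.75 for 3 bits
    # All-ones divider: VFS = VDD * RLOAD / (RLOAD + R/k); solve for R.
    r_msb = k * RLOAD * (VDD / VFS - 1)            # ~487 ohms
    for i in range(BITS):
        print(f"bit {BITS - 1 - i}: {r_msb * 2**i:.0f} ohms")  # 487/975/1950
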
Dev boards sometimes come with a one-seat licence for the tools too.
You might also be able to get access by speaking to a sales rep.
Personally I have a dislike of Xilinx tools and I'd prefer Intel or Lattice,
but I have more experience of Intel in particular, so I may be biased.

On the Intel front, 'Quartus Lite' is the free edition of the tools and it
supports the Cyclone family, so I'd be looking at a Cyclone IV or V in a GX
version. For example:
https://www.terasic.com.tw/cgi-bin/page/archive.pl?Language=English&CategoryNo=167&No=1159

Meanwhile, in Lattice land (the free tools licence covers only this board):
http://www.latticesemi.com/en/Products/DevelopmentBoardsAndKits/LatticeECP3VersaDevelopmentKit.aspx
Theo