
Using Modelsim with VERY VERY large designs / netlists / sdo's


Asher C. Martin

Apr 7, 2003, 7:37:09 PM
Hi,

I'm having issues with VERY VERY large designs and EDA tools such as
Modelsim. It seems that there are some fundamental limitations to EDA
tools such as Modelsim and others when you get very very large. I
have a design with the following specs.

DESIGN SPECS:
- netlist file (.vo) = 134 MB file
- timing file (.sdo) = 59 MB file
- atoms = 300 KB file

Here are some quick questions.

QUESTIONS:
1. What is the ratio between required/allocated memory needed to
simulate a design and the size of the netlist (see above)? For
example, can we say that there needs to be about 5 MB for every 1 MB
of ASCII netlist? Just some kind of rule of thumb people use would be
nice.
2. What exactly does a "** Fatal: (vsim-4) ****** Memory allocation
failure***** Please check your system for available memory and swap
space." mean from Modelsim 5.x? Is this simply a statement that the
system is out of memory and that it can't be simulated on my system
with 1 GB of RAM - woooh unbelievable!
3. If it doesn't simulate in modelsim are there any other
recommendations of tools I might try that use less memory?
NCVERILOG/VCS/Verilog-XL etc?
4. How do I look for modules in my design that are consuming large
amounts of memory, so I can hack them down to size? I would like to
break memory usage up by sub-module.

Thanks for your help!

>Asher<
e-mail A (aatz) AsherM (dootz) com
http://www.asherm.com/

Larry Doolittle

Apr 7, 2003, 8:06:53 PM
On 7 Apr 2003 16:37:09 -0700, Asher C. Martin <ash...@asherm.com> wrote:
>I'm having issues with VERY VERY large designs and EDA tools such as
>Modelsim. It seems that there are some fundamental limitations to EDA
>tools such as Modelsim and others when you get very very large.

I have heard people sometimes wind up with Icarus Verilog on a
64-bit machine like an Alpha in these cases. If you run out of
virtual memory there, you just buy a bigger disk.

- Larry

Stephen Williams

Apr 7, 2003, 11:50:34 PM

Icarus Verilog is also known to work on AMD64 systems in 64-bit mode.

Unfortunately, the original poster quoted design specs in terms of
sizes of Modelsim intermediate files, so no-one but other Modelsim
users can make judgments about the size of the design.

> 2. What exactly does a "** Fatal: (vsim-4) ****** Memory allocation
> failure***** Please check your system for available memory and swap
> space." mean from Modelsim 5.x? Is this simply a statement that the
> system is out of memory and that it can't be simulated on my system
> with 1 GB of RAM - woooh unbelievable!

Need more information. How big are your Verilog files, and what
operating system are you using? Only Windows 98/95/ME give up without
resorting to paging files, so if you are using DOS based Windows,
then give it up and upgrade to Windows 2000 or Linux. (Modelsim
runs on both, if you wish to stick with it.)


--
Steve Williams "The woods are lovely, dark and deep.
steve at icarus.com But I have promises to keep,
steve at picturel.com and lines to code before I sleep,
http://www.picturel.com And lines to code before I sleep."

FMF

Apr 8, 2003, 12:17:23 AM
Asher,

I am not surprised your design does not run in 1 GB. I wouldn't bet that
it would even run in 4 GB. You need a machine with a large memory OR a
big swap space and *lots* of time. In either case you will need to
download the 64-bit version of Modelsim.

Rick Munden
munden(at)acuson o com

Petter Gustad

Apr 8, 2003, 3:38:35 AM
ash...@asherm.com (Asher C. Martin) writes:

> I'm having issues with VERY VERY large designs and EDA tools such as

I don't have any experience with Modelsim, but my experience with
other simulators has indicated that VCS has a somewhat small memory
footprint. Have you tried VCS on your design? Are you running on a
64-bit architecture?

Petter
--
________________________________________________________________________
Petter Gustad 8'h2B | ~8'h2B http://www.gustad.com/petter

Petter Gustad

Apr 8, 2003, 3:41:04 AM
Petter Gustad <newsma...@gustad.com> writes:

> ash...@asherm.com (Asher C. Martin) writes:
>
> > I'm having issues with VERY VERY large designs and EDA tools such as
>
> I don't have any experience with Modelsim, but my experience with
> other simulators has indicated that VCS has a somewhat small memory
> footprint. Have your tried VCS on your design? Are you running on a
> 64-bit architecture?

I forgot to include this URL:

http://www.synopsys.com/products/simulation/cross_compile_wp.html

Kim Enkovaara

Apr 8, 2003, 3:44:52 AM
> I'm having issues with VERY VERY large designs and EDA tools such as
> Modelsim. It seems that there are some fundamental limitations to EDA
> tools such as Modelsim and others when you get very very large. I
> have a design with the following specs.
>
> DESIGN SPECS:
> - netlist file (.vo) = 134 MB file
> - timing file (.sdo) = 59 MB file
> - atoms = 300 KB file

Actually, I wouldn't call that a very big design; it's medium-sized nowadays :) Of course, memory size limitations etc. depend very much on the vendor cell coding style. The simulation language is the most important thing in gate simulations; no sane person uses VHDL for netlist simulations. It is just too slow and memory-consuming. But I assume from the filenames that you use Verilog.

> QUESTIONS:
> 1. What is the ratio between required/allocated memory needed to
> simulate a design and the size of the netlist (see above)? For
> example, can we say that there needs to be about 5 MB for every 1 MB
> of ASCII netlist? Just some kind of rule of thumb people use would be
> nice.

There is no real rule. I have used a rough estimate of 1 Mgate per 1 GB of memory (usually this is pessimistic in today's tools). This is very dependent on the vendor libraries and how Modelsim can optimize them (+opt, +nocheckALL, -O5 etc. switches).
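
A back-of-the-envelope sketch of these rules of thumb (the 1 Mgate per 1 GB figure above, approximated as ~1 KB/gate, and the ~250 bytes/gate figure quoted elsewhere in this thread); the 2-Mgate design size is a made-up example:

```python
def estimate_sim_memory(gate_count, bytes_per_gate):
    """Rough simulator memory estimate in MB for a gate-level netlist.

    bytes_per_gate is a tool- and library-dependent rule of thumb,
    not a guarantee.
    """
    return gate_count * bytes_per_gate / 2**20

# A hypothetical 2-Mgate design under the two rules of thumb:
print(estimate_sim_memory(2_000_000, 250))   # optimistic: ~477 MB
print(estimate_sim_memory(2_000_000, 1024))  # pessimistic: ~1953 MB
```

Real footprints can fall well outside this range, as the rest of the thread shows.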

> 2. What exactly does a "** Fatal: (vsim-4) ****** Memory allocation
> failure***** Please check your system for available memory and swap
> space." mean from Modelsim 5.x? Is this simply a statement that the
> system is out of memory and that it can't be simulated on my system
> with 1 GB of RAM - woooh unbelievable!

It says that the tool can't reserve more memory. If you are using Linux as a platform, the maximum is usually 3 GB for memory allocations; with Sun the maximum is very near the 4 GB limit. If you need more space, just use Sun + Modelsim64. Gate-level simulations are not an easy thing to do; they usually require a lot of small tweaking and tricks (and sometimes help from the FAEs). Also, in older Linux glibc versions (2.0.x) there were some tricks to get Modelsim to reserve more than 1 GB of memory.

> 3. If it doesn't simulate in modelsim are there any other
> recommendations of tools I might try that use less memory?
> NCVERILOG/VCS/Verilog-XL etc?

VCS is more memory efficient, but the advantages are again design dependent.

> 4. How do I look for modules in my design that are consuming large
> amount of memory - so I can hack them down to size? I would like to
> break mem usage up by each sub module.

I hope you are using Modelsim 5.6 or 5.7; they are the first versions that really optimize Verilog netlists well. Also, please read the Modelsim performance optimization appnote (http://www.model.com/products/documentation/performance_appnote_57.pdf).

--Kim

Egbert Molenkamp

Apr 8, 2003, 7:30:17 AM

> Need more information. How big are your Verilog files, and what
> operating system are you using? Only Windows 98/95/ME give up without
> resorting to paging files, so if you are using DOS based Windows,
> then give it up and upgrade to Windows 2000 or Linux. (Modelsim
> runs on both, if you wish to stick with it.)

Recently we had a problem in which the synthesis tooling did not have
enough memory. We used another machine with more memory,
but synthesis still stopped at the same upper bound for virtual memory.
After some time we found that Windows defaults to an upper limit of 2 GB
per process. This can be extended to 3 GB with an option in boot.ini.
See
http://www.microsoft.com/hwdev/platform/server/PAE/PAEmem.asp
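
For reference, the relevant boot.ini entry looks roughly like this (a sketch: the /3GB switch is real and documented by Microsoft, but the boot path shown is a placeholder, and the application itself must be linked as large-address-aware to benefit):

```
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINNT="Windows 2000 Advanced Server" /3GB
```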

Egbert Molenkamp

Anthony J Bybell

Apr 8, 2003, 11:07:30 AM
Petter Gustad <newsma...@gustad.com> wrote in message news:<m3k7e5p...@scimul.dolphinics.no>...

> ash...@asherm.com (Asher C. Martin) writes:
>
> > I'm having issues with VERY VERY large designs and EDA tools such as
>
> I don't have any experience with Modelsim, but my experience with
> other simulators has indicated that VCS has a somewhat small memory
> footprint. Have your tried VCS on your design? Are you running on a
> 64-bit architecture?
>
> Petter

He can probably get away with that for running actual sim jobs but
it's possible that he still might need a monster box for compiling the
sim executable.

-t

John Williamson

Apr 8, 2003, 11:12:19 AM
ash...@asherm.com (Asher C. Martin) wrote in message news:<3e30b793.03040...@posting.google.com>...


Hi Asher,

What O/S are you using? Although Windows is 32 bits, I have found
the realistic memory limit to be 1.5 gig on Windows 2000 and Windows
XP. It could be that you are running into this limit with ModelSim,
and adding more virtual memory will not fix the problem.

You may want to try Silos, as Simucad has a version that allows memory
usage up to 3 GB by using the Windows XP Pro addressing on 32-bit
PCs. For far larger memory usage, the 64-bit version of Silos runs on
64-bit Itanium computers using the Windows XP 64-bit O/S.

John Williamson
Technical Marketing Manager, Simucad
www.simucad.com

Petter Gustad

Apr 8, 2003, 12:51:56 PM

Exactly. This is specified in the Synopsys URL I listed.

Mike Treseler

Apr 8, 2003, 1:49:20 PM
Asher C. Martin wrote:
>
> I'm having issues with VERY VERY large designs and EDA tools such as
> Modelsim. It seems that there are some fundamental limitations to EDA
> tools such as Modelsim and others when you get very very large. I
> have a design with the following specs.
>
> DESIGN SPECS:
> - netlist file (.vo) = 134 MB file
> - timing file (.sdo) = 59 MB file
> - atoms = 300 KB file
>


Consider a change in strategy.
100% synchronous processes.
Simulate source code, not the netlist
to verify function.
Use the static timer in your place+route
instead of modelsim to verify timing.

-- Mike Treseler

Steven Sharp

Apr 8, 2003, 1:56:51 PM
ash...@asherm.com (Asher C. Martin) wrote in message news:<3e30b793.03040...@posting.google.com>...

> QUESTIONS:


> 1. What is the ratio between required/allocated memory needed to
> simulate a design and the size of the netlist (see above)?

There can't be any simple ratio to the size of the source code. Note
that a single line of code that declares a memory can require many
megabytes of space, as can a line that declares a large array of
instances. Hierarchical designs can create a number of instances that
is exponentially larger than the source code. For flat gate-level
netlists using a particular library, you might be able to come up with
an approximate ratio. You should be able to do this yourself with
your particular toolset.

As I recall, 250 bytes per gate in a gate-level simulation should be
in the ballpark. I wouldn't be surprised if ModelSim used several
times that much.

> 2. What exactly does a "** Fatal: (vsim-4) ****** Memory allocation
> failure***** Please check your system for available memory and swap
> space." mean from Modelsim 5.x? Is this simply a statement that the
> system is out of memory and that it can't be simulated on my system
> with 1 GB of RAM - woooh unbelievable!

You might want to do as the message says and check your swap space
(or if you have a system expert available, ask them to do it).
Increasing your swap space may allow the simulator to continue with
virtual memory, but swapping to disk will slow simulation to a crawl.

Unix tools such as ps and top will allow you to see your process size
and how much memory it is using before it crashes.
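
For example (shown here against the current shell so it is self-contained; substitute the simulator's PID, which you can find with top or pgrep):

```shell
# RSS = resident set size, VSZ = virtual size (both in KB on Linux).
# $$ is this shell's own PID; replace it with the vsim process ID.
ps -o pid,rss,vsz,comm -p $$
```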

> 3. If it doesn't simulate in modelsim are there any other
> recommendations of tools I might try that use less memory?
> NCVERILOG/VCS/Verilog-XL etc?

I'm not sure about XL, but the other two apparently use much less
memory than ModelSim.

> 4. How do I look for modules in my design that are consuming large
> amount of memory - so I can hack them down to size? I would like to
> break mem usage up by each sub module.

I don't know what information ModelSim makes available.

Clearly any module that declares large memories is a possible culprit.
Small modules that are instantiated a large number of times will give
a large payback for minor improvements. Timing and SDF annotation can
use a lot of memory, and turning them off should save that.

Asher C. Martin

Apr 8, 2003, 5:09:41 PM
FMF <fmfo...@sbcglobal.net> wrote in message news:<3E924D29...@sbcglobal.net>...

> Asher,
>
> I am not surprised your design does not run in 1GB. I wouldn't bet that
> it would even run in 4GB. You need a machine with a large memory OR a
> big swap space and *lots* of time. In either case you will need to down
> load the 64-bit version of modelsim.
>
> Rick Munden
> munden(at)acuson o com

Hi Rick,

I did try adding a lot of swap space... doubled and tripled it... it
still didn't work. Physical memory seems to be the key?

>Asher<

Asher C. Martin

Apr 8, 2003, 5:10:53 PM
Hi Everyone,

Thanks for the quick follow-ups. I've attached some graphs and plots
showing the memory usage for both compilation and simulation over
time.

PLOT #1
Here is a plot of the memory usage over the period of the compilation
of the 134 MB design file. The design file is successfully compiled
and then the memory and CPU usage drops back to normal as expected.
http://www.asherm.com/research/hdl/problem/compile_design_mem_plot.jpg

DATA #1
Here are the actual amounts of memory / virtual memory used during
the compilation (memory usage gets up to about 670+ MB for the 134 MB
file, a ratio, coincidentally, of exactly 5 to 1; perhaps this isn't a
coincidence?). I believe that a good simulator should have a ratio
closer to 1 to 1, although this is idealistic.
http://www.asherm.com/research/hdl/problem/compile_memory_usage.jpg
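
The 5-to-1 figure is easy to check from the numbers above:

```python
netlist_mb = 134  # size of the .vo netlist file
peak_mb = 670     # observed peak memory during compilation
print(peak_mb / netlist_mb)  # 5.0
```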

PLOT #2
This plot shows the memory usage AFTER compilation, during simulation
of the 134 MB design. The peak memory usage gets up to about 520 MB,
but the VM size is, oddly, much larger at about 810 MB. The crash
happens at the end of the simulation, and then the memory drops
abruptly, as you can see in the following plot.
http://www.asherm.com/research/hdl/problem/simulate_design_mem_plot.jpg

DATA #2
Here is the raw data from the simulation showing the peak memory usage
statistics and VM size.
http://www.asherm.com/research/hdl/problem/simulate_memory_usage.jpg

In general, this is just data that others may find interesting. The
key result I would like to know is the ratio of memory usage to design
file size for the various simulators. My suspicion is that a better
simulator will have a lower ratio.

Modelsim has a ratio of about 5 to 1; any ideas on Verilog-XL, VCS,
NCVERILOG or the others?

Please only post hard experimental numbers - no marketing or
propaganda thanks!

Take care,
>Asher<
San Francisco, CA
http://www.asherm.com/

Kim Enkovaara

Apr 9, 2003, 2:08:15 AM
> In general, this is just interesting data that others may be
> interested in. The key result that I would like to know is what are
> the ratio of memory usage to design file size for the various
> simulators. It is my belief or suspicion that a better simulator will
> have a lower ratio.

Also, different simulators use memory in different places. For example, in VCS the bottleneck can be the linking stage of the design: even if the binary after compilation is ~1-2 GB, the dynamic linker can use 5-7 GB of memory. Also, during compilation the compiler can easily use 1-2 GB of memory. The Modelsim compiler, on the other hand, uses less memory, but runtime memory consumption can be higher. Then again, the Modelsim compiler is faster than VCS, but the simulation speed is slower (this also depends on the case).

So there is no single answer to the question; it depends on what you are looking for :) At 1 Mgate and at 10 Mgate the problems are different.

--Kim

Kim Enkovaara

Apr 9, 2003, 2:02:43 AM
> Consider a change in strategy.
> 100% synchronous processes.
> Simulate source code, not the netlist
> to verify function.
> Use the static timer in your place+route
> instead of modelsim to verify timing.

That is a very risky strategy with an ASIC. Although STA is the main tool for timing checks, there is always a place for some gate-level simulations. You have to somehow verify all the clocking systems (PLL reset sequences, locking, dividers), asynchronous interfaces, IO-pad connections, scan chains, bscan, BIST logic, etc. Usually IO logic and test logic are not available as normal RTL; the tools just insert that logic into the netlist.

Formal tools can reduce the amount of gate-level simulation needed, but I don't yet see a way to drop gate-level simulations from the flow entirely. Functional gate-level simulation is quite important; it's so easy to get some part of the test logic hooked up incorrectly.

In FPGA designs I don't see a reason for gate-level simulations, unless there is a suspicion of a synthesizer bug and you don't own formal tools. In an FPGA the basic structures are tested by the FPGA vendor, and the timing can be quite easily verified with STA (although asynchronous interfaces can be problematic). The original question might have been about an FPGA design; at least the netlist size compared to the SDF size was quite odd. Usually the SDF is ~5-10x larger than the netlist in ASIC flows.

--Kim

nemgreen

Apr 9, 2003, 9:12:52 AM
ash...@asherm.com (Asher C. Martin) wrote in message news:<3e30b793.03040...@posting.google.com>...
>
> DATA #2
> Here is the raw data from the simulation showing the peak memory usage
> statistics and VM size.
> http://www.asherm.com/research/hdl/problem/simulate_memory_usage.jpg
>
I notice that you have both vsim (the simulator kernel) and vish (the
GUI) running, and that vish is consuming 400 MB+, so you must be
logging a load of signals!

Try running in batch mode without logging signals and see what you get
(vsim -c).
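
A typical batch run looks something like this (a sketch: -c and -do are standard vsim switches, but the top-level name work.tb_top and the Tcl snippet are placeholders for your own):

```
vsim -c work.tb_top -do "run -all; quit -f"
```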

Using the latest release (5.7) should give you the best memory
efficiency.

Remember that ModelSim is by default in debug mode, so you should
use the Verilog optimization options (-fast) to get the best
performance and memory footprint.

- Norm

nemgreen

Apr 9, 2003, 9:18:25 AM
sh...@cadence.com (Steven Sharp) wrote in message news:<3a8e124e.03040...@posting.google.com>...

> ash...@asherm.com (Asher C. Martin) wrote in message news:<3e30b793.03040...@posting.google.com>...
>
> > QUESTIONS:
> > 1. What is the ratio between required/allocated memory needed to
> > simulate a design and the size of the netlist (see above)?
>
> There can't be any simple ratio to the size of the source code. Note
> that a single line of code that declares a memory can require many
> megabytes of space, as can a line that declares a large array of
> instances. Hierarchical designs can create a number of instances that
> is exponentially larger than the source code. For flat gate-level
> netlists using a particular library, you might be able to come up with
> an approximate ratio. You should be able to do this yourself with
> your particular toolset.
>
> As I recall, 250 bytes per gate in a gate-level simulation should be
> in the ballpark. I wouldn't be surprised if ModelSim used several
> times that much.
>

Well, be surprised: it doesn't!

We took a gate-level design of 10M+ gates into 32-bit ModelSim that NCsim couldn't handle!

Steven Sharp

Apr 10, 2003, 5:44:09 PM
nemg...@yahoo.co.uk (nemgreen) wrote in message news:<b8497b97.03040...@posting.google.com>...

>
> Well be surprised - it doesn't!
>
> We took a 10M+ gatelevel design into 32-bit ModelSim that NCsim couldn't handle!

I am indeed surprised by that. I had heard that ModelSim generally took
significantly more memory than NCSim. Perhaps there is something unusual
about your design (flat vs. hierarchical, amount of timing, Verilog vs.
VHDL, etc.). If you haven't reported this problem to Cadence yet, I would
suggest doing so.

Asher C. Martin

Apr 12, 2003, 5:35:51 PM
ldoo...@recycle.lbl.gov (Larry Doolittle) wrote in message news:<slrnb944lt....@recycle.lbl.gov>...

Hi Larry,

Thanks for the suggestion about Icarus Verilog; it seems like an
interesting gEDA project. I'm not sure how stable it is for very
large designs, but I downloaded the source and tried compiling a few
projects with it, and it seems to work pretty well.

For anyone else who might be interested, this free compiler is
located at

http://www.icarus.com/eda/verilog/

Perhaps in the future we will not have to pay thousands of dollars for
EDA tools that will not even load and simulate our code anyway.
My best wishes go out to free and open-source projects such as Icarus.

Asher C. Martin

Apr 12, 2003, 5:40:05 PM
Petter Gustad <newsma...@gustad.com> wrote in message news:<m3k7e5p...@scimul.dolphinics.no>...
> ash...@asherm.com (Asher C. Martin) writes:
>
> > I'm having issues with VERY VERY large designs and EDA tools such as
>
> I don't have any experience with Modelsim, but my experience with
> other simulators has indicated that VCS has a somewhat small memory
> footprint. Have your tried VCS on your design? Are you running on a
> 64-bit architecture?
>
> Petter

Hi Petter,

My system has dual-processor PIIIs running at 900 MHz each, so it's a
standard 32-bit x86 architecture. I also have about 1 GB of RAM.

Asher C. Martin

Apr 12, 2003, 5:48:36 PM
Petter Gustad <newsma...@gustad.com> wrote in message news:<m3fzotp...@scimul.dolphinics.no>...

> Petter Gustad <newsma...@gustad.com> writes:
>
> > ash...@asherm.com (Asher C. Martin) writes:
> >
> > > I'm having issues with VERY VERY large designs and EDA tools such as
> >
> > I don't have any experience with Modelsim, but my experience with
> > other simulators has indicated that VCS has a somewhat small memory
> > footprint. Have your tried VCS on your design? Are you running on a
> > 64-bit architecture?
>
> I forgot to include this URL:
>
> http://www.synopsys.com/products/simulation/cross_compile_wp.html
>
> Petter

Hi Petter,

This article mentioned that for 32-bit CPU architectures there is a
4 GB memory capacity limit on Verilog or VHDL design size. So, going
to a 64-bit CPU may solve the problem, since the total addressable
memory is theoretically approximately 16,000,000,000 GB, and no design
ever created by humans is that big, at least not yet. I hope that
before our Verilog/VHDL designs get that big we will all be able to
retire and have our computers write the code for us.
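
The quoted figure is simple to verify: 2^64 bytes works out to about 17 billion GiB.

```python
# Byte-addressable space of a 64-bit machine, expressed in GiB.
gib = 2**64 // 2**30
print(f"{gib:,} GiB")  # 17,179,869,184 GiB, roughly 16 billion GB
```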

cfk

Apr 12, 2003, 6:28:19 PM
Dear Asher:
I have also found that Icarus Verilog can simulate the somewhat largish
OpenCores PCI Bridge project, and the waveforms can be viewed with the
GTKWave viewer. It works quite well. Now the next step is to add place
and route and a software connection to Magic.


"Asher C. Martin" <ash...@asherm.com> wrote in message
news:3e30b793.03041...@posting.google.com...


> Petter Gustad <newsma...@gustad.com> wrote in message
news:<m3fzotp...@scimul.dolphinics.no>...
> > Petter Gustad <newsma...@gustad.com> writes:
> >
> > > ash...@asherm.com (Asher C. Martin) writes:
> > >
> > > > I'm having issues with VERY VERY large designs and EDA tools such as
> > >
> > > I don't have any experience with Modelsim, but my experience with
> > > other simulators has indicated that VCS has a somewhat small memory
> > > footprint. Have your tried VCS on your design? Are you running on a
> > > 64-bit architecture?
> >
> > I forgot to include this URL:
> >
> > http://www.synopsys.com/products/simulation/cross_compile_wp.html
> >
> > Petter
>
> Hi Petter,
>
> This article mentioned that for 32-bit CPU's architectures there is a
> 4 GB memory capacity limit on Verilog or VHDL design size. So, in a
> way going to a 64 bit CPU may solve the problem since the total
> addressable memory at least theoretically is approximately
> 16,000,000,000 GB of memory and no design ever created by humans is

> this big - at least yet anyway. I hope before our Verilog/VHDL
