Hello,
as part of a project, I am comparing the simulation results of FDS 6.6.0 and CFAST 7.2.4 for the upper layer temperature during a full-room fire. The basis for determining the heat release rate is the European standard DIN EN 1991 (Actions on Structures).
The simulation setups for FDS and CFAST are identical. The room dimensions are 15 m x 15 m x 3 m. There is a total of 50.7 m^2 of ventilation area to ensure a sufficient oxygen supply. The fire has the following characteristics: the chemical composition is C58H93O11N1 with a heat of combustion of 30,000 kJ/kg. It is assumed that the entire room floor is burning (this assumption is given by DIN EN 1991).
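For illustration, such a fuel can be specified in FDS roughly like this (only a sketch; the FUEL label is arbitrary and other parameters such as yields are left out here):

&REAC FUEL='DIN_FUEL', C=58., H=93., O=11., N=1., HEAT_OF_COMBUSTION=30000. /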
The upper layer temperature (ULT) is measured with four DEVC lines distributed in the room, using the quantity 'UPPER LAYER TEMPERATURE'. For the comparison with CFAST, the average of the four DEVC lines is used. CFAST uses the normal two-zone model with adiabatic compartment surfaces (the latter also applies to FDS).
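Each of the four devices has roughly this form (the coordinates here are placeholders, not the actual positions from my input file; the device spans a vertical line from floor to ceiling):

&DEVC ID='ULT_1', QUANTITY='UPPER LAYER TEMPERATURE', XB=3.75,3.75,3.75,3.75,0.0,3.0 /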
Those characteristics are identical for each simulation.
A total of eight simulations were calculated, each with a different HRR (5, 10, 20, 30, 40, 50, 60, and 70 MW; these values are also given by DIN EN 1991).
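Since the whole floor area of 15 m x 15 m = 225 m^2 is assumed to burn, each HRR corresponds to an HRR per unit area of HRR/225 m^2, i.e. about 22.2 kW/m^2 for 5 MW up to about 311 kW/m^2 for 70 MW. As an illustration, the 5 MW case can be set up in FDS roughly like this (the SURF name is a placeholder, and the floor vent is only a sketch):

&SURF ID='FIRE_5MW', HRRPUA=22.2 /
&VENT XB=0.0,15.0,0.0,15.0,0.0,0.0, SURF_ID='FIRE_5MW' /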
FDS and CFAST show increasingly large temperature differences with higher HRRs. In my opinion, the CFAST results are unrealistically high, as shown in the following graphic:

Are these results to be expected (because of the nature of CFAST), or are they calculated incorrectly by CFAST?
I have attached the CFAST and FDS input files as well as the HRR graphs for each simulation.
Thanks in advance for your help.
Max Boehler.